CN114047763B - System, method and device for distinguishing collision barriers by unmanned vehicles and electronic equipment - Google Patents

System, method and device for distinguishing collision barriers by unmanned vehicles and electronic equipment

Info

Publication number
CN114047763B
Authority
CN
China
Prior art keywords
obstacle
information
unmanned vehicle
real
time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111356947.2A
Other languages
Chinese (zh)
Other versions
CN114047763A (en)
Inventor
唐香珺
蔡云飞
任国全
范红波
王子航
吴定海
王怀光
李晓磊
李志宁
曹凤利
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Science and Technology
Army Engineering University of PLA
Original Assignee
Nanjing University of Science and Technology
Army Engineering University of PLA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Science and Technology and Army Engineering University of PLA
Priority to CN202111356947.2A
Publication of CN114047763A
Application granted
Publication of CN114047763B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0221 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0223 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0253 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0257 Control of position or course in two dimensions specially adapted to land vehicles using a radar
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0259 Control of position or course in two dimensions specially adapted to land vehicles using magnetic or electromagnetic means
    • G05D1/0263 Control of position or course in two dimensions specially adapted to land vehicles using magnetic or electromagnetic means using magnetic strips
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0276 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides a system, a method, a device and electronic equipment for an unmanned vehicle to distinguish collidable obstacles. The system comprises: a radar for acquiring real-time obstacle information around the unmanned vehicle; a storage module comprising a database for storing original obstacle information; a force transducer for acquiring the impact force generated when the unmanned vehicle collides with an obstacle; and a control module, connected with the radar and the storage module, for judging whether the real-time obstacle information differs from the original obstacle information and, if so, outputting a control instruction to drive the unmanned vehicle toward the obstacle, judging from the impact force information whether the obstacle is collidable, generating real-time obstacle-related information, and storing that information in the database. This solves the prior-art problem that an unmanned vehicle cannot identify a collidable obstacle.

Description

System, method and device for distinguishing collision barriers by unmanned vehicles and electronic equipment
Technical Field
The present document relates to the field of unmanned vehicles, and in particular to a system, a method, a device and electronic equipment for an unmanned vehicle to distinguish collidable obstacles.
Background
With the rapid development of autonomous driving technology, unmanned vehicles can now identify road obstacles effectively and avoid them reasonably. However, they cannot effectively identify collidable obstacles such as grass, foam and brush. Current autonomous driving technology can only recognize and avoid these obstacles, whereas an unmanned off-road vehicle should ideally drive straight over a collidable obstacle instead of avoiding it.
Disclosure of Invention
The invention aims to provide a system, a method, a device and electronic equipment for an unmanned vehicle to distinguish collidable obstacles, thereby solving the prior-art problem that an unmanned vehicle cannot identify a collidable obstacle.
In order to achieve the above object, the present invention provides the following technical solutions:
a system for an unmanned vehicle to distinguish between collision obstacles, comprising:
a radar for acquiring real-time obstacle information around the unmanned vehicle;
the storage module comprises a database and is used for storing original obstacle information in the database;
the force transducer is used for acquiring impact force information generated when the unmanned vehicle collides with the obstacle;
and a control module, connected with the radar and the storage module, for judging whether the real-time obstacle information differs from the original obstacle information and, if so, outputting a control instruction to drive the unmanned vehicle toward the obstacle, judging from the impact force information whether the obstacle is collidable, generating real-time obstacle-related information, and storing that information in the database.
Based on the technical scheme, the invention can also be improved as follows:
further, the system also comprises an image acquisition module for acquiring real-time image information of the surrounding environment of the unmanned vehicle.
Further, the storage module is connected with the image acquisition module, and the storage module is further used for:
and storing the real-time image information.
Further, the system also comprises an inertial navigation unit, wherein the inertial navigation unit is connected with the control module and is used for controlling the unmanned vehicle to run after receiving the control instruction.
A method for an unmanned vehicle to distinguish between collision obstacles, the method comprising:
s101, acquiring real-time obstacle information around an unmanned vehicle through a radar;
s102, storing original obstacle information through a storage module;
s103, acquiring impact force information generated when the unmanned vehicle collides with an obstacle through a force sensor;
s104, judging whether the real-time obstacle information is different from the original obstacle information or not through the control module, if so, outputting a control instruction to control the unmanned vehicle to travel towards the obstacle, judging whether the obstacle can collide or not according to the impact force information, generating real-time obstacle related information, and storing the real-time obstacle related information into a database.
Further, the method further comprises:
s105, acquiring real-time image information of the surrounding environment of the unmanned vehicle through an image acquisition module.
Further, the step S102 further includes:
and S1021, storing the real-time image information.
Further, the method further comprises:
s106, controlling the unmanned vehicle to run after receiving the control instruction through the inertial navigation unit.
An apparatus for an unmanned vehicle to distinguish collidable obstacles, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the above method for an unmanned vehicle to distinguish collision obstacles.
An electronic device having stored thereon a program for information transfer which, when executed by a processor, implements the steps of the above method for an unmanned vehicle to distinguish collision obstacles.
The invention has the following advantages:
according to the system for distinguishing collision barriers by the unmanned vehicle, the deep learning method is applied to the automatic driving technology of the unmanned vehicle, so that the unmanned vehicle can autonomously learn the collidability of the barriers. By using the deep learning method, the collidability of the obstacle can be effectively identified. The deep learning technology is utilized, the model training trolley judges the collidability of the obstacle by utilizing the impact force acquired by the force transducer, continuously adjusts and learns, and finally can effectively identify which are collidable obstacles and which are non-collidable obstacles, does not need to carry out obstacle avoidance processing on the collidable obstacles, and generates a finally trained model for detecting the collidability of the obstacle. And then the unmanned vehicle uses the trained model to detect the collidability. The problem that an unmanned vehicle cannot identify a collidable obstacle in the prior art is solved.
Drawings
For a clearer description of one or more embodiments of the present specification or of prior-art solutions, the drawings needed for describing the embodiments or the prior art are briefly introduced below. The drawings described below are only some of the embodiments in this specification; a person skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a schematic diagram of a control principle of a system for distinguishing collision barriers by an unmanned vehicle in an embodiment of the invention;
FIG. 2 is a schematic diagram of a control principle of a system for distinguishing collision barriers by an unmanned vehicle according to an embodiment of the invention;
FIG. 3 is a flow chart of a method for an unmanned vehicle to distinguish between collision obstacles in an embodiment of the invention;
FIG. 4 is a flow chart of a method for an unmanned vehicle to distinguish between collision obstacles in an embodiment of the invention;
FIG. 5 is a schematic illustration of a model training drone in an embodiment of the present invention;
fig. 6 is a schematic diagram of an unmanned vehicle in an embodiment of the invention.
Radar 10, storage module 20, load cell 30, control module 40, image acquisition module 50, inertial navigation unit 60, unmanned vehicle 70.
Detailed Description
To enable those skilled in the art to better understand the technical solutions in one or more embodiments of this specification, these solutions are described below clearly and completely with reference to the accompanying drawings. The described embodiments are only some, not all, of the possible embodiments; all other embodiments obtained by a person of ordinary skill without inventive effort fall within the protection scope of this specification.
Example 1
As shown in fig. 1 and 5, a system for distinguishing collision barriers for an unmanned vehicle includes:
a radar 10 for acquiring real-time obstacle information around the unmanned vehicle 70; the radar 10 is a laser radar, and is disposed above the body of the unmanned vehicle 70, and is configured to emit a laser beam to detect the position, volume, etc. of an obstacle.
A storage module 20 comprising a database, wherein original obstacle information is stored in the database;
a load cell 30 for collecting impact force information generated when the unmanned vehicle 70 collides with an obstacle; the load cell 30 is disposed at the forefront of the vehicle body to detect the impact force at the time of collision of the unmanned vehicle 70 with an obstacle.
And a control module 40, connected to the radar 10 and the storage module 20, for judging whether the real-time obstacle information differs from the original obstacle information and, if so, outputting a control command to drive the unmanned vehicle 70 toward the obstacle, judging from the impact force information whether the obstacle is collidable, generating real-time obstacle-related information, and storing that information in the database.
First, a test cart must learn the model. The cart judges the collidability of obstacles from the impact force acquired by the load cell 30 and continuously adjusts and learns until it can accurately identify collidable obstacles, so that no avoidance behaviour is needed for collidable obstacles such as grass and shrubs; training of the model can stop once recognition accuracy meets the requirement. The trained model is then deployed on the actual unmanned vehicle 70, which can thus detect obstacles and judge their collidability. The invention can be used for selective obstacle avoidance of off-road vehicles under complex obstacle conditions, for example autonomous driving on hillsides.
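A compressed sketch of this probe-label-update training loop follows. The tiny threshold-unit classifier, the simulated probe and the 0.95 accuracy target are illustrative assumptions standing in for the deep model, the real sensors and the actual stopping requirement.

    import random

    TARGET_ACCURACY = 0.95    # assumed stopping requirement
    FORCE_THRESHOLD_N = 50.0  # assumed force threshold used to label probes

    class TinyClassifier:
        """Placeholder for the deep model: a single perceptron-style threshold unit."""
        def __init__(self) -> None:
            self.w, self.b = [0.0, 0.0], 0.0
        def predict(self, x: list) -> int:
            return int(sum(wi * xi for wi, xi in zip(self.w, x)) + self.b > 0)
        def update(self, x: list, y: int, lr: float = 0.1) -> None:
            err = y - self.predict(x)  # one adjust-and-learn step
            self.w = [wi + lr * err * xi for wi, xi in zip(self.w, x)]
            self.b += lr * err

    def probe_obstacle() -> tuple:
        """One slow approach-and-touch: simulated features plus a force-derived
        label (1 = collidable, 0 = non-collidable)."""
        force = random.uniform(0.0, 200.0)  # simulated load-cell reading, newtons
        return [force / 200.0, 1.0], int(force <= FORCE_THRESHOLD_N)

    def accuracy(model: TinyClassifier, trials: int = 200) -> float:
        hits = sum(model.predict(x) == y
                   for x, y in (probe_obstacle() for _ in range(trials)))
        return hits / trials

    model = TinyClassifier()
    while accuracy(model) < TARGET_ACCURACY:  # stop once recognition suffices
        x, y = probe_obstacle()
        model.update(x, y)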
Based on the technical scheme, the invention can also be improved as follows:
as shown in fig. 2, the system further includes an image acquisition module 50 for acquiring real-time image information of the environment surrounding the drone 70. The image acquisition module 50 is arranged at a position of the vehicle body behind the vehicle body, a bracket with adjustable height, pitch angle and horizontal deflection angle is arranged between the image acquisition module 50 and the platform, the angle of the camera can be automatically adjusted according to different requirements and tasks, the pitch angle adjusting range is-45-45 degrees, the horizontal deflection angle adjusting range is-60-60 degrees, and the height adjusting range is 50cm-100cm.
Further, the storage module 20 is connected to the image acquisition module 50, and the storage module 20 is further configured to:
and storing the real-time image information.
Further, the system includes an inertial navigation unit 60, connected to the control module 40 and configured to control the unmanned vehicle 70 to travel after receiving the control command. The inertial navigation unit 60 provides information such as speed, yaw angle and position.
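For illustration, the state reported by the inertial navigation unit can be modelled as a simple record; the field names and units below are assumptions, not part of this disclosure.

    from dataclasses import dataclass

    @dataclass
    class InsState:
        speed_mps: float  # forward speed, metres per second
        yaw_deg: float    # heading relative to an assumed reference, degrees
        x_m: float        # position in a local frame, metres
        y_m: float

    state = InsState(speed_mps=1.2, yaw_deg=15.0, x_m=3.4, y_m=-0.8)
    print(state)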
The model-training cart first collects environmental information around the unmanned vehicle 70 using the radar 10 and the image acquisition module 50, then evaluates, from the collected point-cloud and image data, the current model's success rate in identifying the obstacle's collidability. If the success rate meets the requirement, the next action can be decided directly: a collidable obstacle needs no avoidance; otherwise avoidance processing is performed.
If, from the obstacle information acquired by the radar 10 and the image acquisition module 50, the current model's recognition success rate for this type of obstacle does not meet the requirement, the cart slowly drives into the obstacle and the force sensor 30 at the front of the vehicle body measures the collision force. If the collision force exceeds a threshold, the object is judged non-collidable; if the obstacle can be pressed down successfully, it is judged collidable. The newly learned model is modified accordingly, improving its success rate in identifying obstacle collidability.
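A hedged sketch of this fallback probing rule follows: when the current model is not confident enough about an obstacle, the cart touches it and labels it from the measured force. CONFIDENCE_MIN, FORCE_THRESHOLD_N and the interface are illustrative values, not taken from this disclosure.

    CONFIDENCE_MIN = 0.9      # assumed minimum recognition success rate
    FORCE_THRESHOLD_N = 50.0  # assumed collision-force threshold

    def decide(confidence: float, predicted_collidable: bool,
               measure_impact_force) -> tuple:
        """Return (collidable, is_new_training_sample)."""
        if confidence >= CONFIDENCE_MIN:
            return predicted_collidable, False   # trust the model, no probe
        force = measure_impact_force()           # slow approach and touch
        return force <= FORCE_THRESHOLD_N, True  # above threshold: avoid it

    collidable, needs_update = decide(0.6, True, lambda: 130.0)
    print(collidable, needs_update)  # False True -> hard obstacle, new training sample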
As shown in fig. 3, a method for distinguishing collision barriers by using an unmanned vehicle specifically includes:
s101, collecting real-time obstacle information around an unmanned vehicle;
in this step, real-time obstacle information around the unmanned vehicle 70 is acquired by the radar 10;
s102, storing original obstacle information;
in this step, the original obstacle information is stored by the storage module 20;
s103, acquiring impact force information;
in this step, the force sensor 30 collects the impact force information generated when the unmanned vehicle 70 collides with an obstacle;
and S104, storing the real-time obstacle related information into a database.
In this step, the control module 40 judges whether the real-time obstacle information differs from the original obstacle information; if so, it outputs a control command to drive the unmanned vehicle 70 toward the obstacle, judges from the impact force information whether the obstacle is collidable, generates real-time obstacle-related information, and stores that information in a database.
As shown in fig. 4, further, the method further includes:
s105, acquiring real-time image information of the surrounding environment of the unmanned vehicle.
In this step, real-time image information of the surroundings of the unmanned vehicle 70 is acquired by the image acquisition module 50.
Further, the step S102 further includes:
s1021, real-time image information is stored.
In this step, the real-time image information is stored.
Further, the method further comprises:
s106, the inertial navigation unit controls the unmanned vehicle to run.
In this step, the inertial navigation unit 60 receives the control command and then controls the unmanned vehicle 70 to travel.
An apparatus for an unmanned vehicle to distinguish collidable obstacles, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the above method for an unmanned vehicle to distinguish collision obstacles.
An electronic device having stored thereon a program for information transfer which, when executed by a processor, implements the steps of the above method for an unmanned vehicle to distinguish collision obstacles.
The system for distinguishing collision barriers by the unmanned vehicle is used as follows:
in use, real-time obstacle information around the drone 70 is collected by the radar 10; storing the original obstacle information by the storage module 20; acquiring impact force information generated when the unmanned vehicle 70 collides with an obstacle through the load cell 30; whether the real-time obstacle information is different from the original obstacle information or not is judged by the control module 40, if yes, a control instruction is output to control the unmanned vehicle 70 to drive towards the obstacle, whether the obstacle can collide or not is judged according to the collision force information, real-time obstacle related information is generated, and the real-time obstacle related information is stored in a database.
Example 2
In embodiment 2, components identical to those in embodiment 1 keep the same reference numerals and are not described again; embodiment 2 is an improvement on embodiment 1. As shown in fig. 6, compared with the model-training cart, the overall design of the unmanned vehicle 70 omits the load cell 30 at the front of the vehicle body: in actual use the already-trained model identifies the collidability of obstacles, so the collision force no longer needs to be measured. The remaining configuration is essentially the same as the model-training cart, with the radar 10, inertial navigation unit 60, control module 40 and image acquisition module 50 arranged on the vehicle body.
Since the model-training cart has completed the model for identifying obstacle collidability, and the model has reached a relatively high recognition success rate, the model can be used directly on the unmanned vehicle 70 to distinguish obstacles, identifying which are collidable and which are not.
The unmanned vehicle 70 recognizes each obstacle at each moment using conventional target-recognition techniques, then classifies it as collidable or non-collidable with the trained model. A collidable obstacle needs no avoidance processing; otherwise the unmanned vehicle 70 performs the corresponding avoidance.
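A minimal sketch of this deployed pipeline follows: detect obstacles with a conventional detector, classify collidability with the trained model, and plan avoidance only for non-collidable obstacles. detect_obstacles, TrainedModel and the soft-obstacle classes are hypothetical stand-ins, not this patent's implementation.

    class TrainedModel:
        """Stand-in for the model trained on the probing cart."""
        def is_collidable(self, obstacle: dict) -> bool:
            # assumed soft classes learned during training
            return obstacle["class"] in {"grass", "foam", "brush"}

    def detect_obstacles(frame_id: int) -> list:
        """Stand-in for the conventional target-recognition step."""
        return [{"class": "grass", "x": 2.0}, {"class": "rock", "x": 5.0}]

    def avoidance_targets(frame_id: int, model: TrainedModel) -> list:
        """Keep only the obstacles that actually require avoidance."""
        return [ob for ob in detect_obstacles(frame_id)
                if not model.is_collidable(ob)]

    print(avoidance_targets(0, TrainedModel()))  # only the rock remains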
It should be noted that the storage-medium embodiments in this specification are based on the same inventive concept as the corresponding method embodiments, so their specific implementation may refer to the implementation of the corresponding method; repeated description is omitted.
The foregoing describes specific embodiments of the present disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
In the past, an improvement to a technology could clearly be distinguished as an improvement in hardware (e.g., to circuit structures such as diodes, transistors or switches) or in software (an improvement to a method flow). With the development of technology, however, many improvements to method flows can now be regarded as direct improvements to hardware circuit structures: designers almost always obtain a corresponding hardware circuit structure by programming the improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement of a method flow cannot be realized by a hardware entity module. For example, a programmable logic device (PLD), such as a field programmable gate array (FPGA), is an integrated circuit whose logic function is determined by the user's programming of the device. A designer "integrates" a digital system onto a PLD by programming, without asking a chip manufacturer to design and fabricate an application-specific integrated circuit chip. Moreover, such programming is today mostly implemented with "logic compiler" software rather than by manually manufacturing the integrated-circuit chip. This software is similar to the compilers used in program development, and the source code before compilation is written in a particular programming language called a hardware description language (HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM and RHDL (Ruby Hardware Description Language); VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are currently the most commonly used. It will also be apparent to those skilled in the art that a hardware circuit implementing a logic method flow can readily be obtained merely by slightly programming the method flow into an integrated circuit using one of the above hardware description languages.
The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (e.g., software or firmware) executable by the (micro)processor, logic gates, switches, an application-specific integrated circuit (ASIC), a programmable logic controller, or an embedded microcontroller. Examples of such controllers include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20 and Silicon Labs C8051F320; a memory controller may also be implemented as part of the control logic of a memory. Those skilled in the art also know that, in addition to implementing the controller purely as computer-readable program code, the method steps can be logically programmed so that the controller takes the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers and the like to achieve the same functionality. Such a controller can therefore be regarded as a hardware component, and the means included in it for performing various functions can also be regarded as structures within the hardware component, or even as both software modules implementing the methods and structures within the hardware component.
The system, apparatus, module or unit set forth in the above embodiments may be implemented in particular by a computer chip or entity, or by a product having a certain function. One typical implementation is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being functionally divided into various units, respectively. Of course, the functions of each unit may be implemented in the same piece or pieces of software and/or hardware when implementing the embodiments of the present specification.
One skilled in the relevant art will recognize that one or more embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, one or more embodiments of the present description may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present description can take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
The present description is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the specification. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations thereof, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, special-purpose computer, embedded processor or other programmable data processing apparatus to produce a machine, such that the instructions, executed via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, the computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, random access memory (RAM) and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transitory media) such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises", "comprising" or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article or apparatus. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article or apparatus that comprises the element.
One or more embodiments of the present specification may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. One or more embodiments of the specification may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
In this specification, the embodiments are described progressively; identical or similar parts of the embodiments may be referred to one another, and each embodiment focuses on its differences from the others. In particular, the system embodiments are described relatively simply because they are substantially similar to the method embodiments; for relevant details, see the corresponding parts of the description of the method embodiments.
The foregoing description is by way of example only and is not intended to limit the present disclosure. Various modifications and changes may occur to those skilled in the art. Any modifications, equivalent substitutions, improvements, etc. that fall within the spirit and principles of the present document are intended to be included within the scope of the claims of the present document.

Claims (10)

1. A system for distinguishing between collision obstacles for an unmanned vehicle, comprising:
radar for acquiring real-time obstacle information around an unmanned vehicle, including the position and volume of the obstacle;
the storage module comprises a database and is used for storing original obstacle information in the database;
the force transducer is used for acquiring impact force information generated when the unmanned vehicle collides with the obstacle;
the control module is connected with the radar and the storage module and is used for judging whether the real-time obstacle information is different from the original obstacle information, if so, outputting a control instruction to control the unmanned vehicle to travel to the obstacle, judging whether the obstacle can collide according to the impact force information and generating real-time obstacle related information, and storing the real-time obstacle related information into the database;
the deep learning technology is utilized, the model training trolley judges the collidability of the obstacle by utilizing the impact force acquired by the force transducer, continuously adjusts and learns, and finally effectively identifies which are collidable obstacles and which are non-collidable obstacles, does not need to carry out obstacle avoidance treatment on the collidable obstacles, and generates a finally trained model for detecting the collidability of the obstacle.
2. The system for distinguishing collision barriers of an unmanned vehicle according to claim 1, further comprising an image acquisition module for acquiring real-time image information of the surroundings of the unmanned vehicle.
3. The unmanned vehicle collision obstacle distinguishing system of claim 2, wherein the memory module is coupled to the image acquisition module, the memory module further configured to:
and storing the real-time image information.
4. The system for distinguishing collision barriers for an unmanned vehicle according to claim 1, further comprising an inertial navigation unit, wherein the inertial navigation unit is connected with the control module and is used for controlling the unmanned vehicle to run after receiving the control command.
5. A method for distinguishing collision barriers for an unmanned vehicle, the method comprising:
s101, acquiring real-time obstacle information around an unmanned vehicle through a radar;
s102, storing original obstacle information through a storage module;
s103, acquiring impact force information generated when the unmanned vehicle collides with an obstacle through a force sensor;
s104, judging whether the real-time obstacle information is different from the original obstacle information or not through a control module, if so, outputting a control instruction to control the unmanned vehicle to travel towards the obstacle, judging whether the obstacle can collide or not according to the collision force information, generating real-time obstacle related information, and storing the real-time obstacle related information into a database;
the deep learning technology is utilized, the model training trolley judges the collidability of the obstacle by utilizing the impact force acquired by the force transducer, continuously adjusts and learns, and finally effectively identifies which are collidable obstacles and which are non-collidable obstacles, does not need to carry out obstacle avoidance treatment on the collidable obstacles, and generates a finally trained model for detecting the collidability of the obstacle.
6. The method of claim 5, further comprising:
s105, acquiring real-time image information of the surrounding environment of the unmanned vehicle through an image acquisition module.
7. The method of claim 5, wherein S102 further comprises:
and S1021, storing the real-time image information.
8. The method of claim 6, further comprising:
s106, controlling the unmanned vehicle to run after receiving the control instruction through the inertial navigation unit.
9. An apparatus for distinguishing collision barriers for an unmanned vehicle, comprising: a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the method of distinguishing between collision obstacles for an unmanned vehicle as claimed in any one of claims 5 to 8.
10. An electronic device, characterized in that it has stored thereon an implementation program of information transfer, which when executed by a processor implements the steps of the method of distinguishing collision obstacles for an unmanned vehicle according to any one of claims 5 to 8.
CN202111356947.2A 2021-11-16 2021-11-16 System, method and device for distinguishing collision barriers by unmanned vehicles and electronic equipment Active CN114047763B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111356947.2A CN114047763B (en) 2021-11-16 2021-11-16 System, method and device for distinguishing collision barriers by unmanned vehicles and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111356947.2A CN114047763B (en) 2021-11-16 2021-11-16 System, method and device for distinguishing collision barriers by unmanned vehicles and electronic equipment

Publications (2)

Publication Number Publication Date
CN114047763A CN114047763A (en) 2022-02-15
CN114047763B true CN114047763B (en) 2024-04-05

Family

ID=80209516

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111356947.2A Active CN114047763B (en) 2021-11-16 2021-11-16 System, method and device for distinguishing collision barriers by unmanned vehicles and electronic equipment

Country Status (1)

Country Link
CN (1) CN114047763B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106737653A (en) * 2015-11-20 2017-05-31 哈尔滨工大天才智能科技有限公司 The method of discrimination of barrier hard and soft in a kind of robot vision
CN110758387A (en) * 2019-10-29 2020-02-07 桂林电子科技大学 Unmanned vehicle-based anti-collision device and method
CN214215575U (en) * 2020-11-27 2021-09-17 北京三快在线科技有限公司 Unmanned vehicle

Also Published As

Publication number Publication date
CN114047763A (en) 2022-02-15

Similar Documents

Publication Publication Date Title
CN110929431B (en) Training method and device for vehicle driving decision model
CN110262486B (en) Unmanned equipment motion control method and device
CN111238523B (en) Method and device for predicting motion trail
CN112987760B (en) Trajectory planning method and device, storage medium and electronic equipment
CN111126362B (en) Method and device for predicting obstacle track
CN112068553A (en) Robot obstacle avoidance processing method and device and robot
CN114077252B (en) Robot collision obstacle distinguishing device and method
CN113296541B (en) Future collision risk based unmanned equipment control method and device
CN112306059B (en) Training method, control method and device for control model
CN111331595B (en) Method and apparatus for controlling operation of service robot
CN112799411A (en) Control method and device of unmanned equipment
CN111127551A (en) Target detection method and device
CN111186437A (en) Vehicle track risk determination method and device
CN112947495B (en) Model training method, unmanned equipment control method and device
CN114047763B (en) System, method and device for distinguishing collision barriers by unmanned vehicles and electronic equipment
CN113033527A (en) Scene recognition method and device, storage medium and unmanned equipment
CN112818968A (en) Target object classification method and device
CN112949756A (en) Method and device for model training and trajectory planning
CN117195974A (en) Training method and device for reserve pool calculation model based on pulse signals
CN110895406A (en) Method and device for testing unmanned equipment based on interferent track planning
CN114019971B (en) Unmanned equipment control method and device, storage medium and electronic equipment
CN114167857B (en) Control method and device of unmanned equipment
CN112987762B (en) Trajectory planning method and device, storage medium and electronic equipment
CN112731447B (en) Obstacle tracking method and device, storage medium and electronic equipment
CN114815825A (en) Method and device for determining optimal driving track of vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant