CN114489112A - Cooperative sensing system and method for intelligent vehicle-unmanned aerial vehicle

Cooperative sensing system and method for intelligent vehicle-unmanned aerial vehicle

Info

Publication number
CN114489112A
CN114489112A
Authority
CN
China
Prior art keywords
unmanned aerial vehicle
vehicle
intelligent vehicle
module
Prior art date
Legal status
Pending
Application number
CN202111519197.6A
Other languages
Chinese (zh)
Inventor
徐坤
李慧云
潘仲鸣
Current Assignee
Shenzhen Institute of Advanced Technology of CAS
Original Assignee
Shenzhen Institute of Advanced Technology of CAS
Priority date
Filing date
Publication date
Application filed by Shenzhen Institute of Advanced Technology of CAS filed Critical Shenzhen Institute of Advanced Technology of CAS
Priority to CN202111519197.6A priority Critical patent/CN114489112A/en
Publication of CN114489112A publication Critical patent/CN114489112A/en
Priority to PCT/CN2022/136955 priority patent/WO2023109589A1/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10 - Simultaneous control of position or course in three dimensions
    • G05D1/101 - Simultaneous control of position or course in three dimensions specially adapted for aircraft

Abstract

The invention discloses a cooperative sensing system and method for an intelligent vehicle and an unmanned aerial vehicle. In the system, the intelligent vehicle comprises a first perception module, a first calculation module, an unmanned cabin module and an active guiding module, and the unmanned aerial vehicle comprises a second perception module and a second calculation module. The first perception module detects environmental information around the intelligent vehicle; the unmanned cabin module provides a space for placing one or more unmanned aerial vehicles; the active guiding module transmits guiding information that enables the unmanned aerial vehicle to acquire its position relative to the intelligent vehicle; the first calculation module evaluates the complexity of the surrounding environment and determines from the evaluation result whether to transmit the guiding information; the second perception module detects environmental information and recognizes the guiding information transmitted by the intelligent vehicle; the second calculation module controls the motion of the unmanned aerial vehicle and processes the detected information to obtain semantic information about the intelligent vehicle's surroundings. The system actively intervenes to guide the unmanned aerial vehicle in correcting its attitude and trajectory, improving the reliability of cooperative sensing.

Description

Cooperative sensing system and method for intelligent vehicle-unmanned aerial vehicle
Technical Field
The invention relates to the technical field of intelligent equipment, and in particular to a cooperative sensing system and method for an intelligent vehicle and an unmanned aerial vehicle.
Background
When an intelligent vehicle operates autonomously in a complex environment (including structured road environments, unstructured field environments and the like) or in an unknown environment, it must first perceive the surrounding scene and determine the drivable area before reasonable path planning and motion control can be carried out. Accurate sensing and identification of the drivable area are therefore the key to safe navigation of the intelligent vehicle, with broad application prospects in fields such as reconnaissance, search and surveillance.
Existing technical schemes for perceiving the drivable area fall mainly into two categories. The first is single-vehicle sensing: information about the surrounding environment is collected by the vehicle itself, processed and computed on on-board computing devices, and interpreted by corresponding algorithms and software to perceive the surroundings, for example identifying objects, obstacles and roads in the environment, tracking dynamic obstacles, and segmenting the drivable space. The second is cooperative sensing: in addition to the sensing devices the vehicle itself carries, sensing devices arranged in the surrounding environment, such as roadside sensing modules, collect environmental information from different viewing angles; the information is then fused and processed in a computing module to obtain perception and understanding of the environment. The computation can be performed at the vehicle end or in a cloud computing module.
For example, for a cooperative sensing scheme, patent application CN201510889283.4 discloses a vehicle environment sensing system and method based on an unmanned aerial vehicle, comprising an unmanned aerial vehicle and a ground station mounted on the vehicle and connected by a tether: the unmanned aerial vehicle captures video images and transmits them to the ground station, and the ground station on the vehicle computes flight parameters and sends them to the unmanned aerial vehicle to control it to advance together with the vehicle. Patent application CN201910181147.8 discloses a vehicle driving assistance system based on an unmanned aerial vehicle, with a landing cabin on the vehicle roof containing a wireless charging panel; the unmanned aerial vehicle can take off from the vehicle and, relying on its own high-definition camera, provide the driver with an extended field of view. However, neither provides a method for ensuring the reliability of vehicle-drone cooperation once the system becomes complex, nor an autonomous cooperation capability. Patent application CN202110535374.3 discloses a vehicle-borne tethered unmanned aerial vehicle and obstacle avoidance system, with a tether between the unmanned aerial vehicle and the vehicle and a release device designed so that, if the tether becomes entangled with an obstacle, the unmanned aerial vehicle is released to ensure a safe landing. This application likewise does not address a method for guaranteeing the reliability of cooperative sensing and lacks autonomous cooperation capability.
The prior art mainly has the following defects:
1) The environmental information acquired by single-vehicle sensing is limited: only information detected from the intelligent vehicle's own first-person viewpoint is available, so comprehensive and accurate detection of the surrounding environment cannot be achieved.
2) Cooperative sensing extends the range of environmental detection, but requires sensing equipment to be installed in advance at suitable positions in the environment; it is inflexible and unsuitable for an unknown environment entered for the first time (in such an environment, sensors cannot be arranged in the driving space beforehand).
3) Existing methods that achieve sensing with an airborne unmanned aerial vehicle connect it to a ground station on the vehicle by a tether, which is unsuitable for complex environments (such as bridges, overhead wires and tall trees); a ground-station computing module is needed to control the flight of the unmanned aerial vehicle, which increases equipment cost (the ground station itself); and the unmanned aerial vehicle can neither track the vehicle autonomously nor flexibly plan its flight path, so the sensing detection range cannot be adjusted flexibly.
4) In existing schemes of drone-vehicle cooperative sensing, adding the unmanned aerial vehicle makes the system more complex; a mechanism and method for reliable cooperation are lacking, and reliable cooperative sensing of the system is difficult to guarantee.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a cooperative sensing system and method for an intelligent vehicle and an unmanned aerial vehicle.
According to a first aspect of the invention, a cooperative sensing system of an intelligent vehicle and an unmanned aerial vehicle is provided. The system comprises an intelligent vehicle and one or more unmanned aerial vehicles. The intelligent vehicle comprises a first perception module, a first calculation module, an unmanned cabin module and an active guiding module, and the unmanned aerial vehicle comprises a second perception module and a second calculation module, wherein: the first perception module is used for detecting environmental information around the intelligent vehicle; the unmanned cabin module provides a space for placing the one or more unmanned aerial vehicles; the active guiding module is used for transmitting guiding information that instructs the unmanned aerial vehicle to acquire its position relative to the intelligent vehicle; the first calculation module is used for evaluating the complexity of the surrounding environment according to the detection information transmitted back by the unmanned aerial vehicle and the detection information of the intelligent vehicle, and for determining whether to transmit the guiding information according to the evaluation result; the second perception module is used for detecting environmental information within the field of view of the unmanned aerial vehicle and for recognizing the intelligent vehicle or the guiding information it transmits; the second calculation module is used for controlling the motion of the unmanned aerial vehicle and for processing the detection information of the second perception module to acquire semantic information of the environment around the intelligent vehicle.
According to a second aspect of the invention, a cooperative sensing method for an intelligent vehicle and an unmanned aerial vehicle is provided. The method comprises the following steps:
detecting environmental information around the intelligent vehicle;
receiving detection information transmitted back by the unmanned aerial vehicle;
determining, according to the environmental information around the intelligent vehicle and the detection information transmitted back by the unmanned aerial vehicle, whether the intelligent vehicle transmits the guiding information, and instructing the unmanned aerial vehicle to respond to the guiding information so as to acquire its position relative to the intelligent vehicle;
and controlling the driving trajectory of the intelligent vehicle based on the semantic information of the environment around the intelligent vehicle acquired by the unmanned aerial vehicle.
Compared with the prior art, the invention has the advantage of providing a novel technical scheme for reliable cooperative perception in complex environments: multi-view perception of the environment is realized through a reliable cooperation mode between the unmanned aerial vehicle and the intelligent vehicle (also called an unmanned vehicle), which solves the problem that the reliability of existing cooperative sensing systems is difficult to guarantee owing to the lack of a safety cross-validation link.
Other features of the present invention and advantages thereof will become apparent from the following detailed description of exemplary embodiments thereof, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description, serve to explain the principles of the invention.
Fig. 1 is a schematic diagram of a cooperative sensing system of a smart car-drone according to one embodiment of the present invention;
fig. 2 is a flowchart of a cooperative sensing method of an intelligent vehicle-unmanned aerial vehicle according to an embodiment of the present invention.
Detailed Description
Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, the numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present invention unless specifically stated otherwise.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the invention, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
Referring to fig. 1, the provided cooperative sensing system comprises an intelligent vehicle and an unmanned aerial vehicle. In one embodiment, the intelligent vehicle comprises a first perception module, a first calculation module, a first communication module, an unmanned cabin module and an active guiding module.
The first perception module is provided with environment-detection sensing equipment and air-detection sensing equipment for the intelligent vehicle. The environment-detection sensing equipment detects environmental information around the intelligent vehicle and may comprise one or more types of device, including but not limited to lidar, cameras, a satellite positioning system and an inertial measurement unit. The air-detection sensing equipment detects and identifies the unmanned aerial vehicle overhead and may comprise one type of device or a combination of several, such as lidar and cameras.
The first communication module provides information interaction between the unmanned aerial vehicle and the intelligent vehicle; selectable communication modes include Wi-Fi and 4G or 5G networks. Preferably, a 5G network is employed to provide real-time, high-bandwidth communication between the intelligent vehicle and the unmanned aerial vehicle.
The unmanned cabin module provides a space for placing one or more unmanned aerial vehicles; once the cabin door is opened, an unmanned aerial vehicle can take off and land vertically through it; and the cabin has a wired and/or wireless charging system that can use the vehicle power supply and a power conversion system to charge the unmanned aerial vehicle.
The active guiding module provides guiding information that the unmanned aerial vehicle can recognize, including but not limited to image markers and pattern markers formed by light-emitting LEDs. After the unmanned aerial vehicle detects the guiding information, its onboard computing module can obtain the position of the unmanned aerial vehicle relative to the intelligent vehicle, and this relative positioning information can then be used for the unmanned aerial vehicle's motion trajectory control (such as hovering, tracking, autonomous landing, or detection along a trajectory centred on the intelligent vehicle).
The first calculation module processes the detection information transmitted back by the unmanned aerial vehicle and the detection information of the intelligent vehicle itself, fuses the two to obtain an evaluation of the complexity around the intelligent vehicle, guides the vehicle's subsequent navigation planning and control, and performs safety cross-validation. Safety cross-validation is described in more detail below.
It should be noted that in the provided cooperative sensing system the number of unmanned aerial vehicles may be one or more. For clarity, the following description of the modules involved and their functions takes one unmanned aerial vehicle as an example.
In one embodiment, the drone includes a second awareness module, a second computing module, and a second communication module.
The second perception module detects environmental information within the field of view of the unmanned aerial vehicle and recognizes the intelligent vehicle or the active guiding module on it. One or more types of detection sensing equipment may be installed on it, including but not limited to lidar, cameras, a satellite positioning system and an inertial measurement unit.
The second communication module provides information interaction between the unmanned aerial vehicle and the intelligent vehicle. For example, the environment detection results of the unmanned aerial vehicle's second calculation module are transmitted through the second communication module to the first communication module of the intelligent vehicle.
The second calculation module handles the motion control of the unmanned aerial vehicle; processes the detection information of the second perception module, performing data preprocessing to acquire semantic information of the environment around the intelligent vehicle; and performs safety cross-validation, among other tasks.
With reference to fig. 2, based on the above system, the reliable cooperative sensing method provided by the invention comprises the following steps.
Step S210: when the intelligent vehicle navigates in a structured, simple environment, it relies only on the first perception module and the first calculation module to perceive the environment and guide itself to navigate safely.
Step S220: when the intelligent vehicle encounters a complex environment, it sends a cooperative sensing request instruction to the unmanned aerial vehicle and the unmanned cabin.
The intelligent vehicle may determine whether it is in a complex environment from its sensing of the surroundings, or may conclude that it is currently in a complex environment when it cannot achieve safe navigation with its own navigation system alone.
Step S230: after receiving the cooperative sensing request instruction, the unmanned cabin opens its door to clear the area above it; the unmanned aerial vehicle takes off from the intelligent vehicle after receiving the request instruction and the signal indicating that the cabin door has opened successfully.
Step S240: the intelligent vehicle obtains the position of the unmanned aerial vehicle relative to itself, referred to as the first relative position.
For example, the intelligent vehicle may acquire the first relative position in either of two ways (illustrative sketches follow this list):
1) From the position information given by the satellite positioning systems of the unmanned aerial vehicle and the intelligent vehicle, the first relative position (dx1, dy1, dz1) is computed as dx1 = x2 - x1; dy1 = y2 - y1; dz1 = z2 - z1, where (x1, y1, z1) is the position of the intelligent vehicle from its satellite positioning system and (x2, y2, z2) is the position of the unmanned aerial vehicle from its satellite positioning system. The satellite positioning system may be GPS, BeiDou or another system.
2) The position of the unmanned aerial vehicle relative to the intelligent vehicle (the first relative position) is identified by the air-detection sensing equipment of the first perception module. For example, image or lidar air-detection data are fed into a recognition network model, which outputs a regression of the first relative position. The parameters of the recognition network model are obtained by training on a sample data set in which each sample pairs an input image or detection data with a known relative-position label. The recognition network model may be of various types, preferably a deep neural network.
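As a minimal illustration of way 1), the Python sketch below differences two satellite fixes; it assumes both fixes have already been converted into a common local Cartesian frame (for example ENU), and the function name is illustrative rather than taken from the patent:

```python
def first_relative_position(car_fix, drone_fix):
    """Difference two satellite fixes given in a common local frame.

    car_fix   -- (x1, y1, z1) from the intelligent vehicle's receiver
    drone_fix -- (x2, y2, z2) from the unmanned aerial vehicle's receiver
    Returns the first relative position (dx1, dy1, dz1).
    """
    x1, y1, z1 = car_fix
    x2, y2, z2 = drone_fix
    return (x2 - x1, y2 - y1, z2 - z1)

# Example: the drone hovers 1.5 m east, 2.0 m north and 10.0 m above the car.
print(first_relative_position((0.0, 0.0, 0.0), (1.5, 2.0, 10.0)))
# -> (1.5, 2.0, 10.0)
```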
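For way 2), the patent only specifies a trained recognition network, preferably a deep neural network; the PyTorch sketch below is one hedged interpretation with an illustrative architecture, regressing the first relative position from an air-detection image:

```python
import torch
import torch.nn as nn

class RelativePositionNet(nn.Module):
    """Illustrative CNN regressing (dx1, dy1, dz1) from a camera image."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 3)  # regression output: dx1, dy1, dz1

    def forward(self, image):  # image: (batch, 3, H, W)
        return self.head(self.features(image).flatten(1))

# Training would minimize, e.g., nn.MSELoss() between the network output and
# the known relative-position label attached to each sample image.
model = RelativePositionNet()
print(model(torch.zeros(1, 3, 128, 128)).shape)  # torch.Size([1, 3])
```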
The unmanned aerial vehicle can acquire its position relative to the intelligent vehicle, called the second relative position, in either of two ways:
1) From the position information given by the satellite positioning systems of the unmanned aerial vehicle and the intelligent vehicle, the second relative position (dx2, dy2, dz2) is computed as dx2 = x2 - x1; dy2 = y2 - y1; dz2 = z2 - z1, where (x1, y1, z1) is the position of the intelligent vehicle from its satellite positioning system and (x2, y2, z2) is the position of the unmanned aerial vehicle from its satellite positioning system.
2) The unmanned aerial vehicle recognizes the active guiding module of the intelligent vehicle and computes from it the second relative position. For example, the unmanned aerial vehicle collects downward-looking visual information containing the active guiding module, which may be an image marker (such as an ArUco or AprilTag two-dimensional code) or a pattern formed by actively light-emitting LEDs (usable at night); after the onboard camera captures an image containing the active guiding module, the second relative position is identified using a general image recognition algorithm (such as edge detection) or a purpose-built neural network recognition model (preferably a deep neural network).
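As a hedged sketch of the marker-based branch, the snippet below uses the classic OpenCV contrib aruco module API (pre-4.7 function style) to detect an ArUco marker on the vehicle roof and read off a relative position; the marker size, camera intrinsics and dictionary are illustrative assumptions, not values from the patent:

```python
import cv2
import numpy as np

MARKER_SIZE_M = 0.5                          # assumed marker side length on the roof
CAMERA_MATRIX = np.array([[800., 0., 320.],
                          [0., 800., 240.],
                          [0.,   0.,   1.]])  # assumed camera intrinsics
DIST_COEFFS = np.zeros(5)

def second_relative_position(gray_image):
    """Return the drone-sensed relative position, or None if no marker is seen."""
    aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    corners, ids, _ = cv2.aruco.detectMarkers(gray_image, aruco_dict)
    if ids is None:
        return None  # marker not visible: treated as an abnormal reading
    rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
        corners, MARKER_SIZE_M, CAMERA_MATRIX, DIST_COEFFS)
    # tvecs[0][0] is the marker position in the drone camera frame; its
    # negation approximates the drone position relative to the vehicle
    # (camera attitude compensation omitted for brevity).
    return -tvecs[0][0]
```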
Step S250: the unmanned aerial vehicle and the intelligent vehicle perform bidirectional cross safety validation.
To ensure the reliability of the system during cooperative sensing, a bidirectional cross safety validation method between the unmanned aerial vehicle and the intelligent vehicle is designed, comprising the following two validation branches:
1) The intelligent vehicle actively verifies the safety of the unmanned aerial vehicle.
a) The intelligent vehicle sets, in advance, a safe cooperation space within which the unmanned aerial vehicle executes the cooperative sensing task.
In one embodiment, the safe cooperation space is a cone-shaped volume whose size can be calibrated and adjusted for the specific intelligent vehicle and unmanned aerial vehicle system. For example, it should satisfy at least three conditions: the intelligent vehicle and the unmanned aerial vehicle enjoy good communication quality within the space; each can clearly detect the other; and the space covers the environment-detection range required for the intelligent vehicle's navigation.
b) The intelligent vehicle judges from the first relative position whether the unmanned aerial vehicle lies within the safe cooperation space. If so, the subsequent motion and detection tasks continue; if the unmanned aerial vehicle leaves the safe cooperation space, the intelligent vehicle issues an over-range instruction, and the unmanned aerial vehicle adjusts its trajectory, returns to the safe cooperation space and then resumes its tasks (a sketch of this check follows below).
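A minimal sketch of the over-range check, assuming the safe cooperation space is an upward-opening cone with its apex at the vehicle roof; the half-angle and height limit are illustrative calibration parameters, not values from the patent:

```python
import math

CONE_HALF_ANGLE_RAD = math.radians(35.0)  # assumed calibrated half-angle
MAX_HEIGHT_M = 30.0                       # assumed calibrated height limit

def in_safe_cooperation_space(dx1, dy1, dz1):
    """True if the first relative position lies inside the cone."""
    if dz1 <= 0.0 or dz1 > MAX_HEIGHT_M:
        return False
    return math.hypot(dx1, dy1) <= dz1 * math.tan(CONE_HALF_ANGLE_RAD)

# If this returns False, the vehicle issues the over-range instruction and
# the drone steers back into the cone before resuming its task.
print(in_safe_cooperation_space(1.5, 2.0, 10.0))  # True for this geometry
```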
2) The unmanned aerial vehicle actively verifies the safety of its own position.
In one embodiment, the unmanned aerial vehicle's active verification of its position comprises:
a) the intelligent vehicle sends the first relative position to the unmanned aerial vehicle;
b) the unmanned aerial vehicle evaluates the separation parameter f between the first relative position and the second relative position, expressed as:
f = cx*|dx1 - dx2| + cy*|dy1 - dy2| + cz*|dz1 - dz2|,
where cx, cy and cz are the weighting coefficients of the deviations between the first and second relative positions in the x, y and z directions; all are positive and cx + cy + cz = 1. For example, if the deviation in the z direction is to be emphasized, cz can be set larger than cx and cy.
If the separation parameter is smaller than a preset threshold fmin, the cooperative sensing system of the intelligent vehicle and the unmanned aerial vehicle is considered to be operating well, and subsequent work continues. If the parameter is greater than the threshold, or either the first or the second relative position is abnormal (for example, no numerical value can be obtained, or the value exceeds a set limit), the cooperative sensing system is abnormal, and the trajectory or attitude must be corrected so that the unmanned aerial vehicle moves back toward the safe cooperation space until the separation parameter falls below the threshold (see the sketch below).
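The drone-side check can be summarized in a few lines; the weights and threshold below are illustrative calibration values (cz emphasized, as in the example above), not values from the patent:

```python
CX, CY, CZ = 0.25, 0.25, 0.50  # deviation weights, cx + cy + cz = 1
F_MIN = 1.0                    # assumed threshold fmin, in metres

def separation_parameter(first, second):
    """f = cx*|dx1-dx2| + cy*|dy1-dy2| + cz*|dz1-dz2|."""
    dx1, dy1, dz1 = first
    dx2, dy2, dz2 = second
    return CX * abs(dx1 - dx2) + CY * abs(dy1 - dy2) + CZ * abs(dz1 - dz2)

def position_is_safe(first, second):
    if first is None or second is None:  # abnormal relative position
        return False
    return separation_parameter(first, second) < F_MIN

print(position_is_safe((1.5, 2.0, 10.0), (1.6, 2.1, 10.4)))  # True: f = 0.25
```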
The bidirectional cross-validation may be triggered periodically or by events. For example, a fixed-period mode may be set, such as cross-validating once per second, with the period shortened, for example to 500 ms, when an anomaly occurs. Alternatively, when the intelligent vehicle or the unmanned aerial vehicle detects an abnormal event, such as the inability to navigate safely or an anomaly inferred from the complexity of the surrounding environment, it actively requests a round of bidirectional cross-validation.
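One hedged way to realize this trigger policy in code, with the periods taken from the example above and the check supplied as a callable (such as position_is_safe from the previous sketch); the structure is an assumption, not the patent's implementation:

```python
import time

def cross_validation_loop(run_check, anomaly_pending,
                          nominal_period_s=1.0, degraded_period_s=0.5):
    """Fixed-period cross-validation that speeds up while anomalies persist.

    run_check       -- callable performing one round of bidirectional
                       cross-validation, returning True when it passes
    anomaly_pending -- callable returning True when either side actively
                       requests an immediate re-validation
    """
    while True:
        ok = run_check()
        if anomaly_pending():  # event-triggered: re-check immediately
            continue
        time.sleep(nominal_period_s if ok else degraded_period_s)
```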
Step S260: provided the bidirectional cross safety validation passes, the unmanned aerial vehicle moves and detects according to a predetermined trajectory pattern.
In one embodiment, the predetermined trajectory pattern includes the following three options: hovering above the intelligent vehicle at a specified relative position; moving along with the intelligent vehicle; and moving within the safe cooperation space along a planned trajectory centred on the intelligent vehicle (a waypoint sketch for this option follows below).
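For the third pattern, a sketch that generates waypoints on a circle centred on the intelligent vehicle, expressed in the same vehicle-relative frame as the relative positions above; the radius, height and waypoint count are illustrative assumptions:

```python
import math

def orbit_waypoints(radius_m=8.0, height_m=12.0, n_points=12):
    """Vehicle-relative waypoints on a circle centred above the vehicle."""
    return [(radius_m * math.cos(2.0 * math.pi * k / n_points),
             radius_m * math.sin(2.0 * math.pi * k / n_points),
             height_m)
            for k in range(n_points)]

# Each waypoint should also satisfy the safe-cooperation-space check before
# being sent to the drone's motion controller.
print(orbit_waypoints(n_points=4))
```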
Step S270: the unmanned aerial vehicle detects environmental information with the second perception module, feeds the detected information into the second calculation module for preprocessing and recognition to obtain second perception information, and transmits the second perception information through the second communication module to the first communication module of the intelligent vehicle.
Step S280: the first calculation module of the intelligent vehicle fuses the detection information of the intelligent vehicle with the detection information of the unmanned aerial vehicle.
For example, the environmental information collected from the several different viewpoints of the unmanned aerial vehicle and the intelligent vehicle is gathered, and objects, roads, obstacles and the like in the surrounding environment are identified; the identification may use existing general image recognition techniques (such as object classification based on deep learning). Because the information comes not only from the intelligent vehicle's own detection but also from the unmanned aerial vehicle's overhead, forward-looking and other viewpoints, more comprehensive environmental information can be detected.
As another example, the surrounding environment is understood and processed to output high-level abstract results: a quantitative evaluation of the complexity of the surroundings is obtained, a risk space-time situation map is established, and the map is transmitted to the vehicle's planning module to guide safe navigation.
In one embodiment, the method for quantitative complexity evaluation is as follows (a sketch follows this description).
First, objects are identified with a semantic extraction method and semantic segmentation is performed; an attribute value is defined for each segment, reflecting how passable it is for the intelligent vehicle, with larger values representing poorer trafficability. For example, the attribute value of an impassable obstacle or a ravine area is defined as infinity, and a graded terrain area has a larger attribute value than a flat one.
Then, a risk space-time situation map is generated from the trafficability attribute map thus defined and the positioning position. Specifically, each area around the intelligent vehicle carries a quantitative risk space-time value equal to the trafficability attribute value defined above. The surroundings can be divided with a discrete grid map, so that each grid cell on the map carries a quantitative risk value: the larger the value, the higher the risk of the vehicle passing over that cell in the future.
On this basis, during navigation the intelligent vehicle plans its driving path over the lower-risk cells according to the target position and the risk space-time situation map, ensuring that it passes safely over complex terrain.
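A hedged sketch of these last two steps, assuming the risk space-time situation map is a 2-D grid of the trafficability attribute values described above (math.inf for impassable cells) and that selecting lower-risk cells is realized as a Dijkstra-style search minimizing accumulated risk; the grid contents and costs are illustrative:

```python
import heapq
import math

def lowest_risk_path(risk, start, goal):
    """Path over 4-connected grid cells minimizing accumulated risk."""
    rows, cols = len(risk), len(risk[0])
    dist = {start: risk[start[0]][start[1]]}
    prev, pq = {}, [(dist[start], start)]
    while pq:
        d, cell = heapq.heappop(pq)
        if cell == goal:
            break
        if d > dist.get(cell, math.inf):
            continue  # stale queue entry
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + risk[nr][nc]
                if nd < dist.get((nr, nc), math.inf):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = cell
                    heapq.heappush(pq, (nd, (nr, nc)))
    path, cell = [], goal
    while cell in prev:
        path.append(cell)
        cell = prev[cell]
    return [start] + path[::-1]

# 0 = flat ground, 3 = graded terrain, math.inf = obstacle/ravine
grid = [[0, 0, 3],
        [0, math.inf, 3],
        [0, 0, 0]]
print(lowest_risk_path(grid, (0, 0), (2, 2)))
# -> [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2)]: the path skirts the obstacle
```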
Step S290: after completing environment detection, the unmanned aerial vehicle lands autonomously in the cabin.
It should be noted that in the system provided by the invention the number of unmanned aerial vehicles may be greater than one: with n unmanned aerial vehicles there are, in addition to the first relative position, a second relative position, …, and an (n+1)-th relative position, where, for example, the (n+1)-th relative position represents the position of the n-th unmanned aerial vehicle relative to the intelligent vehicle. In this case the bidirectional cross safety validation is extended to cover the intelligent vehicle and each of the unmanned aerial vehicles, analogously to the process above, and is not repeated here.
To further verify the effect of the invention, a prototype system was tested. The experiments show that when the relative position between the unmanned aerial vehicle and the intelligent vehicle deviates excessively, or the unmanned aerial vehicle leaves the safe space set by the intelligent vehicle, the unmanned aerial vehicle is actively guided to correct its attitude and trajectory, improving the reliability of cooperative sensing.
In conclusion, by arranging the active guiding module and designing a bidirectional cross safety validation mechanism between the intelligent vehicle and the unmanned aerial vehicle, the invention improves the reliability with which the unmanned aerial vehicle and the intelligent vehicle execute cooperative sensing tasks, and is applicable to cooperative sensing between one intelligent vehicle and multiple unmanned aerial vehicles.
The present invention may be a system, method and/or computer program product. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied therewith for causing a processor to implement various aspects of the present invention.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present invention may be assembler instructions, instruction-set-architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk, C++ or Python, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, aspects of the present invention are implemented by personalizing an electronic circuit, such as a programmable logic circuit, a field-programmable gate array (FPGA), or a programmable logic array (PLA), with state information of the computer-readable program instructions, which can execute the computer-readable program instructions.
Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, implementation by software, and implementation by a combination of software and hardware are equivalent.
Having described embodiments of the present invention, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the invention is defined by the appended claims.

Claims (10)

1. An intelligent vehicle-unmanned aerial vehicle cooperative sensing system, comprising an intelligent vehicle and one or more unmanned aerial vehicles, characterized in that the intelligent vehicle comprises a first perception module, a first calculation module, an unmanned cabin module and an active guiding module, and the unmanned aerial vehicle comprises a second perception module and a second calculation module, wherein:
the first perception module is used for detecting environmental information around the intelligent vehicle;
the unmanned cabin module provides a space for placing the one or more unmanned aerial vehicles;
the active guiding module is used for transmitting guiding information that instructs the unmanned aerial vehicle to acquire its position relative to the intelligent vehicle;
the first calculation module is used for evaluating the complexity of the surrounding environment according to the detection information transmitted back by the unmanned aerial vehicle and the detection information of the intelligent vehicle, and for determining whether to transmit the guiding information according to the evaluation result;
the second perception module is used for detecting environmental information within the field of view of the unmanned aerial vehicle and for recognizing the intelligent vehicle or the guiding information transmitted by the intelligent vehicle;
the second calculation module is used for controlling the motion of the unmanned aerial vehicle and for processing the detection information of the second perception module to acquire semantic information of the environment around the intelligent vehicle.
2. The system of claim 1, wherein, in response to the guiding information transmitted by the intelligent vehicle, the unmanned aerial vehicle acquires its position relative to the intelligent vehicle and performs bidirectional cross safety validation in cooperation with the intelligent vehicle.
3. The system of claim 2, wherein the bidirectional cross safety validation comprises:
the intelligent vehicle actively verifies, according to the first relative position, whether the unmanned aerial vehicle lies within a preset safe cooperation space, and if not, sends an over-range instruction to the unmanned aerial vehicle so that the unmanned aerial vehicle adjusts its motion trajectory, wherein the first relative position is the position of the unmanned aerial vehicle relative to the intelligent vehicle as acquired by the intelligent vehicle;
and the unmanned aerial vehicle actively verifies the safety of its own position according to the first relative position and a second relative position, and if the verification fails, performs trajectory correction so as to move back toward the safe cooperation space, wherein the second relative position is the position of the unmanned aerial vehicle relative to the intelligent vehicle as sensed by the unmanned aerial vehicle.
4. The system of claim 3, wherein the unmanned aerial vehicle actively verifying the safety of its position according to the first relative position and the second relative position comprises:
the unmanned aerial vehicle receives the first relative position from the intelligent vehicle;
the unmanned aerial vehicle evaluates the separation parameter f between the first relative position and the second relative position, and if f is greater than a set threshold fmin, or either the first relative position or the second relative position is abnormal, the position of the unmanned aerial vehicle is judged to be in an unsafe state.
5. The system of claim 4, wherein the separation parameter f is expressed as:
f = cx*|dx1 - dx2| + cy*|dy1 - dy2| + cz*|dz1 - dz2|
wherein (dx1, dy1, dz1) is the first relative position, (dx2, dy2, dz2) is the second relative position, and cx, cy and cz are the weighting coefficients of the deviations in the x, y and z directions, all positive, with cx + cy + cz = 1.
6. The system of claim 1, wherein the guiding information comprises image markers or pattern markers formed by light-emitting LEDs.
7. The system of claim 1, wherein the first perception module comprises environment-detection sensing equipment for detecting environmental information around the intelligent vehicle and air-detection sensing equipment for detecting and identifying the unmanned aerial vehicle overhead.
8. The system of claim 1, wherein the intelligent vehicle performs the complexity evaluation of the surrounding environment to control its driving trajectory based on the following steps:
identifying objects with a semantic extraction method, performing semantic segmentation, and defining an attribute value for each part, the attribute value reflecting the degree to which the intelligent vehicle can pass, so as to obtain a trafficability attribute map;
and generating a risk space-time situation map according to the trafficability attribute map and the positioning position, wherein each area around the intelligent vehicle carries a quantitative risk space-time value equal to the defined trafficability attribute value, the areas being divided with a discrete grid map so that each grid cell on the map carries a quantitative risk space-time value.
9. A cooperative sensing method for an intelligent vehicle and an unmanned aerial vehicle, for use in the system of any one of claims 1 to 8, comprising the steps of:
detecting environmental information around the intelligent vehicle;
receiving detection information transmitted back by the unmanned aerial vehicle;
determining, according to the environmental information around the intelligent vehicle and the detection information transmitted back by the unmanned aerial vehicle, whether the intelligent vehicle transmits the guiding information, and instructing the unmanned aerial vehicle to respond to the guiding information so as to acquire its position relative to the intelligent vehicle;
and controlling the driving trajectory of the intelligent vehicle based on the semantic information of the environment around the intelligent vehicle acquired by the unmanned aerial vehicle.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method as claimed in claim 9.
CN202111519197.6A 2021-12-13 2021-12-13 Cooperative sensing system and method for intelligent vehicle-unmanned aerial vehicle Pending CN114489112A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111519197.6A CN114489112A (en) 2021-12-13 2021-12-13 Cooperative sensing system and method for intelligent vehicle-unmanned aerial vehicle
PCT/CN2022/136955 WO2023109589A1 (en) 2021-12-13 2022-12-06 Smart car-unmanned aerial vehicle cooperative sensing system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111519197.6A CN114489112A (en) 2021-12-13 2021-12-13 Cooperative sensing system and method for intelligent vehicle-unmanned aerial vehicle

Publications (1)

Publication Number Publication Date
CN114489112A true CN114489112A (en) 2022-05-13

Family

ID=81493025

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111519197.6A Pending CN114489112A (en) 2021-12-13 2021-12-13 Cooperative sensing system and method for intelligent vehicle-unmanned aerial vehicle

Country Status (2)

Country Link
CN (1) CN114489112A (en)
WO (1) WO2023109589A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117111178B (en) * 2023-10-18 2024-02-06 中国电建集团贵阳勘测设计研究院有限公司 Dam hidden danger and dangerous situation air-ground water collaborative detection system and method

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105512628B (en) * 2015-12-07 2018-10-23 北京航空航天大学 Vehicle environmental sensory perceptual system based on unmanned plane and method
CN107479554B (en) * 2017-09-07 2020-12-11 海南飞行者科技有限公司 Robot system and outdoor map building navigation method thereof
DE102018205578B4 (en) * 2018-04-12 2020-12-03 Audi Ag Method for forming a convoy comprising at least one unmanned, autonomously movable object, as well as a correspondingly designed movable object
CN110221625B (en) * 2019-05-27 2021-08-03 北京交通大学 Autonomous landing guiding method for precise position of unmanned aerial vehicle
CN110221623A (en) * 2019-06-17 2019-09-10 酷黑科技(北京)有限公司 A kind of air-ground coordination operating system and its localization method
CN111300372A (en) * 2020-04-02 2020-06-19 同济人工智能研究院(苏州)有限公司 Air-ground cooperative intelligent inspection robot and inspection method
CN111707988B (en) * 2020-05-29 2023-06-20 江苏科技大学 Unmanned vehicle system positioning method based on unmanned vehicle-mounted UWB base station
CN112731922A (en) * 2020-12-14 2021-04-30 南京大学 Unmanned aerial vehicle auxiliary intelligent vehicle driving method and system based on indoor positioning
CN114489112A (en) * 2021-12-13 2022-05-13 深圳先进技术研究院 Cooperative sensing system and method for intelligent vehicle-unmanned aerial vehicle

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023109589A1 (en) * 2021-12-13 2023-06-22 深圳先进技术研究院 Smart car-unmanned aerial vehicle cooperative sensing system and method
CN116540784A (en) * 2023-06-28 2023-08-04 西北工业大学 Unmanned system air-ground collaborative navigation and obstacle avoidance method based on vision
CN116540784B (en) * 2023-06-28 2023-09-19 西北工业大学 Unmanned system air-ground collaborative navigation and obstacle avoidance method based on vision

Also Published As

Publication number Publication date
WO2023109589A1 (en) 2023-06-22


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination