CN110427035B - Target tracking method for vehicle - Google Patents

Publication number: CN110427035B
Application number: CN201910744516.XA
Authority: CN (China)
Prior art keywords: target, sensor, degree, tracking, calculating
Legal status: Active (an assumption, not a legal conclusion; Google has not performed a legal analysis)
Other languages: Chinese (zh)
Other versions: CN110427035A
Inventors: 邓堃 (Deng Kun), 卢红喜 (Lu Hongxi), 黄宇 (Huang Yu), 金晨 (Jin Chen), 陈文琳 (Chen Wenlin)
Current and original assignees (as listed): Zhejiang Geely Holding Group Co Ltd; Zhejiang Geely Automobile Research Institute Co Ltd
Application filed by Zhejiang Geely Holding Group Co Ltd and Zhejiang Geely Automobile Research Institute Co Ltd
Priority to CN201910744516.XA
Published as CN110427035A; application granted and published as CN110427035B

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/0221 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • G05D1/0223 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
    • G05D1/0276 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle

Abstract

The invention provides a target tracking method for a vehicle, belonging to the field of automatic driving. The method comprises the following steps: obtaining targets to be tracked around the vehicle; selecting a controllable sensor for tracking the target to be tracked according to each controllable sensor's capability of detecting and identifying it; calculating the operation normality from the system's current operation state data; calculating the resource calling feasibility from data on the system's capability to call resources; calculating the function normal maintenance degree from data on how well the system's normal functions are maintained; calculating the target detection tracking easiness from data on how difficult the system finds detecting and tracking the target; calculating the overall controllability of the system with a preset algorithm; judging from the overall controllability whether the system meets the preset standard for controlling the selected sensor; and, if so, controlling the selected sensor to track the target to be tracked. The method can comprehensively estimate the resources of a vehicle-road cooperative automatic driving system.

Description

Target tracking method for vehicle
Technical Field
The invention relates to the field of automatic driving, in particular to a target tracking method for a vehicle.
Background
Automatic driving is a technical hotspot of the current automobile industry. According to the SAE levels of driving automation, it is currently divided into six levels, L0-L5: L0 refers to a vehicle without any automatic driving function; L1-L2 automation is still, in essence, an Advanced Driver Assistance System (ADAS); L3 can be called a quasi-automatic driving system; and L4-L5 can be considered truly autonomous driving systems.
A conventional L1-L2 vehicle mainly uses on-board sensors (GPS, IMU, wheel speed sensors, etc.) and perception sensors (forward radar, forward-looking camera, ultrasonic radar, etc.) to implement driver assistance functions in simple scenes, such as ACC (Adaptive Cruise Control), AEB (Automatic Emergency Braking), TJA (Traffic Jam Assist) and HWA (Highway Assist). As the automatic driving functions and safety level improve, the vehicle needs more accurate perception and positioning, more reliable and stable decision and control, and the ability to handle more complex scenes. This places higher requirements on ego-vehicle and surrounding-environment perception: for example, an L3-L5 autonomous vehicle adds a forward lidar, several corner and side radars, a high-resolution forward-looking camera, side-looking and rear-looking cameras, a high-precision map server, and the like, to realize high-precision mapping/positioning, dynamic and static target detection and tracking, lane and road edge detection, traffic sign recognition, and other environment perception capabilities.
To realize automatic driving, a target to be tracked can be followed by controlling a controllable sensor on the vehicle whose detection position can be adjusted, steering the sensor's primary perception direction so as to synchronously track and detect the target.
However, the vehicle's resources are limited. They include both the system's software and hardware resources, such as perception, computing, storage and energy consumption, and its constraint resources, such as functional requirements, functional safety, information security and realizability. Therefore, when a controllable sensor is called to track a target to be tracked, the system must have enough resources to support the proposed system and method in completing the sensor adjustment function.
Therefore, it is necessary to estimate whether the resources of the vehicle-road cooperative automatic driving system are sufficient.
Disclosure of Invention
One objective of the present invention is to provide a target tracking method for a vehicle, which can estimate the resources of a vehicle-road cooperative automatic driving system more comprehensively.
In particular, the invention provides a target tracking method for a vehicle provided with a vehicle-road cooperative automatic driving system comprising a plurality of controllable sensors whose detection positions can be adjusted, the method comprising the following steps:
obtaining targets to be tracked around the vehicle;
selecting a controllable sensor for tracking the target to be tracked as a selected sensor according to the detection and identification capacity of each controllable sensor on the target to be tracked;
acquiring current operation state data of the system, and calculating the operation normality of the system based on the current operation state data of the system;
acquiring the capability data of the resource calling of the system, and calculating the resource calling feasibility of the system based on the capability data of the resource calling of the system;
acquiring maintenance condition data of the normal function of the system, and calculating the normal maintenance degree of the function of the system based on the maintenance condition data of the normal function of the system;
acquiring difficulty data of target detection and tracking of the system, and calculating the target detection and tracking easiness of the system based on the difficulty data of the target detection and tracking of the system;
calculating the overall controllability of the system according to a preset algorithm by using the system operation normality, the resource calling feasibility, the function normality maintenance and the target detection tracking easiness;
judging whether the system reaches a preset standard for controlling the selected sensor or not according to the overall controllability of the system;
and when the system reaches a preset standard for controlling the selected sensor, the system controls the selected sensor to track the target to be tracked.
Optionally, obtaining current operation state data of the system, and calculating the operation normality of the system based on the current operation state data of the system, includes:
acquiring the normal operation degree of different functional modules of the system and the normal coordination operation degree of the system;
and calculating the system operation normality according to the operation normality of different functional modules of the system and the coordinated operation normality of the system.
Optionally, acquiring capability data of resource invocation of the system, and calculating resource invocation feasibility of the system based on the capability data of resource invocation of the system, includes:
acquiring the calling condition information of the controllable sensor and the equipment related to the controllable sensor in the system, and calculating the hardware resource calling feasibility degree according to the calling condition information of the controllable sensor and the equipment related to the controllable sensor;
acquiring the calling condition information of the software in the system, and calculating the software resource calling feasibility according to the calling condition information of the software in the system;
and calculating the resource calling feasibility of the system according to the hardware resource calling feasibility and the software resource calling feasibility.
Optionally, obtaining maintenance condition data of the normal function of the system, and calculating a normal maintenance degree of the function of the system based on the maintenance condition data of the normal function of the system, includes:
acquiring the normal operation degree of the environment monitoring function and the normal operation degree of the system redundancy backup function of the remaining sensors after the selected sensor is removed;
and calculating the normal function maintenance degree of the system according to the normal operation degree of the environment monitoring function and the normal operation degree of the system redundancy backup function of the rest sensors without the selected sensor.
Optionally, the obtaining difficulty level data of target detection and tracking of the system, and calculating the target detection and tracking easiness of the system based on the difficulty level data of target detection and tracking of the system, includes:
acquiring detection and identification characteristic information, environmental factor information and target attribute information of the controllable sensor, and calculating the detection and identification easiness of the target according to the detection and identification characteristic information, the environmental factor information and the target attribute information of the controllable sensor;
acquiring electromechanical control characteristic information and target motion characteristic information of the controllable sensor, and calculating to obtain the locking tracking easiness of the target according to the electromechanical control characteristic information and the target motion characteristic information of the controllable sensor;
and calculating the target detection tracking easiness according to the detection identification easiness and the locking tracking easiness.
Optionally, calculating the overall controllability of the system from the system operation normality, the resource calling feasibility, the function normal maintenance degree and the target detection tracking easiness according to a preset algorithm includes:
weighting the resource calling feasibility, the function normal maintenance degree and the target detection tracking easiness with a first, a second and a third coefficient respectively, summing the weighted values, and taking the smaller of this sum and the system operation normality as the overall controllability, wherein the sum of the first, second and third coefficients equals 1.
Optionally, determining whether the system meets a preset standard for controlling the selected sensor according to the overall controllability of the system includes:
and when the overall controllability is larger than a first threshold value, judging that the system reaches a preset standard for controlling the controllable sensor.
Optionally, selecting the controllable sensor for tracking the target to be tracked as the selected sensor according to each controllable sensor's capability of detecting and identifying the target to be tracked includes:
acquiring a possible motion range of the target to be tracked;
calculating the sensing detection range of all the controllable sensors;
calculating a detectable degree of identification of each of the controllable sensors based on a degree of coincidence of the range of possible motion and the perceived detection range of each of the controllable sensors;
selecting as the selected sensor the controllable sensor having the detectable identification greater than a second threshold.
Optionally, calculating a detectable degree of recognition of each of the controllable sensors based on a degree of coincidence of the range of possible motion and the perceived range of detection of each of the controllable sensors comprises:
the ratio of the overlap of the range of possible motion and the range of perceptual detection to the range of possible motion is taken as the detectable degree of recognition.
Optionally, selecting the controllable sensor with the detectable identification greater than a second threshold as the selected sensor further comprises:
and comparing the detectable recognition degree with a second threshold value in sequence according to the height of the detectable recognition degree.
The invention evaluates the overall controllability of the system along four dimensions, the system operation normality, the resource calling feasibility, the function normal maintenance degree and the target detection tracking easiness, and judges from the overall controllability whether the system's resources are sufficient to control the selected sensor. The system is thus comprehensively evaluated before the selected sensor is controlled, providing a basis for subsequently controlling it.
Furthermore, the invention compares the detectable recognition degrees of the controllable sensors for the target to be tracked and screens out a controllable sensor with a larger detectable recognition degree to track it, so that the target to be tracked falls within the sensor's field of view to the greatest possible extent.
The above and other objects, advantages and features of the present invention will become more apparent to those skilled in the art from the following detailed description of specific embodiments thereof, taken in conjunction with the accompanying drawings.
Drawings
Some specific embodiments of the invention will be described in detail hereinafter, by way of illustration and not limitation, with reference to the accompanying drawings. The same reference numbers in the drawings identify the same or similar elements or components. Those skilled in the art will appreciate that the drawings are not necessarily drawn to scale. In the drawings:
FIG. 1 is a block flow diagram of a target tracking method according to one embodiment of the invention;
FIG. 2 is a block diagram of a flow of selecting selected sensors for a target tracking method according to another embodiment of the invention;
FIG. 3 is a schematic diagram of a target tracking method according to one embodiment of the invention.
Detailed Description
FIG. 1 is a block flow diagram of a target tracking method according to one embodiment of the invention. As shown in fig. 1, the present invention provides a target tracking method for a vehicle provided with a vehicle-road cooperative automatic driving system including a plurality of controllable sensors that can adjust a detection position. In one embodiment, the method comprises the steps of:
s10: and acquiring targets to be tracked around the vehicle.
S20: and selecting the controllable sensor for tracking the target to be tracked as a selected sensor according to the detection and identification capability of each controllable sensor on the target to be tracked.
S30: and acquiring the current running state data of the system, and calculating the running normality of the system based on the current running state data of the system.
S40: and acquiring the capability data of the resource calling of the system, and calculating the resource calling feasibility of the system based on the capability data of the resource calling of the system.
S50: and acquiring maintenance condition data of the normal function of the system, and calculating the normal maintenance degree of the function of the system based on the maintenance condition data of the normal function of the system.
S60: and acquiring difficulty data of target detection and tracking of the system, and calculating the target detection and tracking easiness of the system based on the difficulty data of the target detection and tracking of the system.
S70: and calculating the integral controllability of the system according to the system operation normality, the resource calling feasibility, the function normal maintenance degree and the target detection tracking easiness by a preset algorithm.
S80: and judging whether the system reaches a preset standard for controlling the selected sensor or not according to the overall controllability of the system.
S90: and when the system reaches the preset standard of controlling the selected sensor, the system controls the selected sensor to track the target to be tracked.
In this embodiment, the overall controllability of the system is evaluated along four dimensions, the system operation normality, the resource calling feasibility, the function normal maintenance degree and the target detection tracking easiness, and whether the system's resources are sufficient to control the selected sensor is judged from the overall controllability. The system is thus comprehensively evaluated before the selected sensor is controlled, providing a basis for subsequently controlling it.
Optionally, S30 includes:
and acquiring the normal operation degree of different functional modules of the system and the normal coordination operation degree of the system.
And calculating the operation normality of the system according to the operation normality of different functional modules of the system and the coordinated operation normality of the system.
Whether the system operates normally is a precondition for the adjustable control of the sensor, and if the system cannot operate normally, the adjustment of the sensor cannot be controlled to complete corresponding functions. The system refers to a vehicle-road cooperative automatic driving system, and comprises software and hardware of different functional modules such as sensing, calculating, storing, controlling, communicating, power supply and the like. The software and hardware of each module are provided with independent operation condition monitoring submodules for monitoring whether the module operates normally. The normal operation means that the hardware and software of the module work normally and can complete the given function, and the abnormal condition of the module does not occur. Meanwhile, the whole system is also provided with an integral operation condition monitoring module which is used for summarizing the information of the monitoring submodules of the modules on one hand and monitoring whether the whole system operates normally on the other hand. The normal operation means that the hardware and software of the whole system work normally and can complete the given function, and the whole system has no abnormal condition.
In one embodiment, the system operation normality σ_opr may be calculated by the following formula (1):

σ_opr = min(σ_sysm, σ_sens, σ_comp, σ_stog, σ_cntr, σ_comm, σ_powr)  (1)

where σ_sysm is the coordinated operation normality of the system, σ_sens is the operation normality of the sensing module, σ_comp of the computing module, σ_stog of the storage module, σ_cntr of the control module, σ_comm of the communication module, and σ_powr of the power supply module. Each operation normality varies between 0 and 1 and can be obtained from the operation health monitoring function of the corresponding module.
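A minimal sketch of formula (1), assuming each module's health monitor reports a score in [0, 1] under the hypothetical keys used here:

```python
def operation_normality(module_scores: dict) -> float:
    """Formula (1): the system operation normality is the minimum of the
    coordination score and every functional module's score (all in [0, 1])."""
    required = ("sysm", "sens", "comp", "stog", "cntr", "comm", "powr")
    return min(module_scores[k] for k in required)
```

Taking the minimum means a single degraded module (say, the control module at 0.8) caps the whole system's operation normality at 0.8.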
In one embodiment, S40 includes:
acquiring the calling condition information of a controllable sensor and equipment related to the controllable sensor in the system, and calculating the hardware resource calling feasibility degree according to the calling condition information of the controllable sensor and the equipment related to the controllable sensor;
acquiring the calling condition information of the software in the system, and calculating the software resource calling feasibility according to the calling condition information of the software in the system;
and calculating the resource calling feasibility of the system according to the hardware resource calling feasibility and the software resource calling feasibility.
That the system operates normally is only the most basic condition; the second question is whether the system has enough resources to call for sensor adjustment and control. These resources fall into two categories: hardware resources and software resources. Hardware resources include the adjustable sensors equipped on the ego vehicle and the associated machinery, motors, drives and power supply equipment. Software resources include the computing resources (e.g. CPU usage), storage resources (e.g. memory usage) and communication resources (e.g. network bandwidth usage) occupied by the software control algorithms that implement sensor adjustment. The basic principle is that the system's hardware resources must contain at least one callable sensor, with associated devices, available for control adjustment, and the system's software resources must have enough idle computing, storage and communication capacity to carry out the adjustment computation for the sensor concerned.
According to the hardware and software resource conditions of the system, the resource calling feasibility σ_reso can be calculated. In one embodiment, the smaller of the hardware resource calling feasibility and the software resource calling feasibility is taken as the resource calling feasibility, as shown in formula (2):

σ_reso = min(σ_hwr, σ_swr)  (2)

where σ_hwr is the hardware resource calling feasibility of the system and σ_swr is the software resource calling feasibility; each feasibility varies between 0 and 1. σ_hwr may be computed by determining whether at least one sensor available for control adjustment, together with its associated devices, can be called: if so, σ_hwr = 1, otherwise σ_hwr = 0. σ_swr may be obtained from formula (3):

σ_swr = min(σ_cpu, σ_mem, σ_bwt)  (3)

where σ_cpu is the computing resource feasibility, σ_mem is the memory resource feasibility, and σ_bwt is the network bandwidth resource feasibility of the system. Each of σ_cpu, σ_mem and σ_bwt may be computed by determining whether the corresponding utilization (CPU, memory, network bandwidth) is below a set threshold (e.g. 60%): if so, the resource feasibility is 1, otherwise 0. The rationale is that the system should keep an operating margin at all times to cope with emergencies.
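A minimal sketch of formulas (2) and (3), assuming utilizations are reported as fractions in [0, 1] and using the 60% threshold the text gives as an example:

```python
def hardware_feasibility(has_callable_sensor: bool) -> float:
    """sigma_hwr: 1 if at least one adjustable sensor and its associated
    devices can be called, otherwise 0."""
    return 1.0 if has_callable_sensor else 0.0

def software_feasibility(cpu_usage: float, mem_usage: float,
                         bandwidth_usage: float, limit: float = 0.60) -> float:
    """Formula (3): each utilization must stay strictly below the threshold
    so the system keeps an operating margin; each term is 1 or 0."""
    scores = (1.0 if u < limit else 0.0
              for u in (cpu_usage, mem_usage, bandwidth_usage))
    return min(scores)

def resource_feasibility(has_callable_sensor: bool,
                         cpu: float, mem: float, bwt: float) -> float:
    """Formula (2): sigma_reso = min(sigma_hwr, sigma_swr)."""
    return min(hardware_feasibility(has_callable_sensor),
               software_feasibility(cpu, mem, bwt))
```

Because every term is binary, resource calling is feasible only when a sensor is callable and all three utilizations are under the limit.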
In one embodiment, S50 includes:
and acquiring the normal operation degree of the environment monitoring function and the normal operation degree of the system redundancy backup function of the rest sensors after the selected sensor is removed.
And calculating the normal function maintenance degree of the system according to the normal operation degree of the environment monitoring function and the normal operation degree of the system redundancy backup function of the rest sensors without the selected sensor.
The sensor adjusting and controlling function is only one function in an automatic driving system based on vehicle-road coordination, and in order to realize the function, sensor equipment for adjusting and controlling needs to be called. It should be noted that, in addition to the function of adjusting and controlling to achieve synchronous tracking and locking of the target to be tracked, the sensor devices also need to perform other functions of the autopilot system, including an environment monitoring function for ensuring normal operation of the autopilot system, and a redundant backup function for achieving a high functional safety level of the autopilot system. The environment monitoring function for ensuring the normal operation of the automatic driving system refers to a common target perception detection function, a lane line detection function, a traffic sign detection function, an environment map construction function and the like. The function of realizing the redundant backup of the high functional safety level of the automatic driving system refers to that a plurality of perception sensors are arranged in the same visual angle range to realize the environment monitoring function so as to realize the redundant backup of the function and prevent one sensor from being damaged to cause the failure of the whole system. It is therefore desirable to consider whether the remaining sensing sensors can meet the minimum overall environment monitoring function requirements and redundant backup function requirements of the autopilot system if the selected sensor(s) available for control adjustments are called.
According to the perception sensor configuration of the ego vehicle, the minimum environment monitoring function requirement and the minimum redundant backup function requirement, the function normal maintenance degree σ_func of the system can be calculated by formula (4):

σ_func = min(σ_envr, σ_redn)  (4)

where σ_envr is the operation normality of the environment monitoring function after the selected sensor is removed, and σ_redn is the operation normality of the redundant backup function after the selected sensor is removed; each degree varies between 0 and 1.
In one embodiment, S60 includes:
and acquiring the detection and identification characteristic information, the environmental factor information and the target attribute information of the controllable sensor, and calculating the detection and identification easiness of the target according to the detection and identification characteristic information, the environmental factor information and the target attribute information of the controllable sensor.
And acquiring the electromechanical control characteristic information and the target motion characteristic information of the controllable sensor, and calculating to obtain the locking tracking easiness of the target according to the electromechanical control characteristic information and the target motion characteristic information of the controllable sensor.
And calculating the target detection and tracking easiness according to the detection and identification easiness and the locking and tracking easiness.
Since the detection and recognition characteristics of the ego vehicle's sensors are fixed, the sensors' target detection capability is essentially determined; likewise, since their electromechanical control characteristics are fixed, their target tracking capability is essentially determined. How hard detection and recognition are varies with environmental factors and target attributes. For example, most sensors detect and recognize targets well under good illumination but poorly in rainy and snowy weather, and have a high recognition probability for the nominal reference target set (e.g. regular vehicles, pedestrians, bicycles) but a lower detection and recognition probability for non-nominal targets (e.g. small carts, hand tractors, tires). How hard it is to synchronously lock and track a target by adjusting the sensor varies with the target's motion characteristics. For example, for a target in slow regular motion (e.g. uniform or uniformly accelerated straight-line or steering motion), it is relatively easy to adjust the sensor to track and lock it; for a target moving fast or irregularly (e.g. uniform high-speed motion, rapid acceleration and deceleration, S-shaped motion), it is relatively difficult.
According to the detection and recognition characteristic information of the sensor, the environmental factor information, and the target attribute information, the detection and recognition easiness σ_detc of the target can be calculated. According to the electromechanical control characteristic information of the sensor and the motion characteristic information of the target, the locking and tracking easiness σ_trak of the target can be calculated. The target detection and tracking easiness σ_targ is then given by equation (5):
σ_targ = min(σ_detc, σ_trak)    (5)
where σ_detc is the detection and recognition easiness of the target and σ_trak is its locking and tracking easiness; both are variables between 0 and 1.
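Equation (5) is a minimum over the two easiness values; a minimal sketch (function name is our own):

```python
def target_ease(sigma_detc: float, sigma_trak: float) -> float:
    """Equation (5): sigma_targ = min(sigma_detc, sigma_trak).

    Both inputs are variables in [0, 1]; the overall easiness is limited
    by whichever of detection or locking is harder.
    """
    assert 0.0 <= sigma_detc <= 1.0 and 0.0 <= sigma_trak <= 1.0
    return min(sigma_detc, sigma_trak)
```

Taking the minimum encodes that a target which is easy to detect but hard to lock onto (or vice versa) is still hard to track overall.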
In another embodiment, S70 includes:
The resource calling feasibility, the function normality maintenance degree, and the target detection and tracking easiness are weighted by a first, second, and third coefficient respectively and summed; the smaller of this weighted sum and the system operation normality is taken as the overall controllability, where the first, second, and third coefficients sum to 1.
The overall controllability of the system is calculated from the system operation normality, the resource calling feasibility, the function normality maintenance degree, and the target detection and tracking easiness. The controllability reflects the system's ability and readiness to control and adjust the sensor in every respect. We present here a systematic way to decide whether the sensor can be controlled and adjusted to synchronously track the target. The core of the method is to calculate the controllability of the system: when the controllability is higher than an empirically set threshold, the sensor may be controlled and adjusted; otherwise, it may not. The controllability of the system can be calculated by the following formula:
σ_cntr = min(σ_opr, α_reso·σ_reso + α_func·σ_func + α_targ·σ_targ)    (6)
Here the system operation normality σ_opr is the most important term: if it is small, the system is not operating normally and the overall controllability is low. The resource calling feasibility, the function normality maintenance degree, and the target detection and tracking easiness are of comparable importance, with their relative importance expressed through the weights, where 0 ≤ α_reso ≤ 1, 0 ≤ α_func ≤ 1, 0 ≤ α_targ ≤ 1, and α_reso + α_func + α_targ = 1. The larger all of these values are, the larger the controllability of the system.
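Equation (6) can be sketched as follows; the default weight values are hypothetical (the patent only requires that they lie in [0, 1] and sum to 1):

```python
def controllability(sigma_opr: float, sigma_reso: float, sigma_func: float,
                    sigma_targ: float,
                    a_reso: float = 0.4, a_func: float = 0.3,
                    a_targ: float = 0.3) -> float:
    """Equation (6): sigma_cntr = min(sigma_opr, weighted sum of the other terms).

    The weights a_reso, a_func, a_targ must sum to 1; the defaults here are
    illustrative, not values from the patent.
    """
    assert abs(a_reso + a_func + a_targ - 1.0) < 1e-9
    weighted = a_reso * sigma_reso + a_func * sigma_func + a_targ * sigma_targ
    # sigma_opr caps the result: an abnormally running system is never controllable
    return min(sigma_opr, weighted)
```

The result would then be compared against the empirically chosen first threshold (whose concrete value is not reproduced here) to decide whether the sensor may be adjusted.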
In one embodiment, S80 includes:
and when the overall controllability is greater than the first threshold value, judging that the system reaches a preset standard for controlling the controllable sensor. I.e. when the overall degree of controllability is greater than the first threshold value, the system is determined to be capable of controlling the controllable sensor.
A first threshold for the overall controllability may be selected empirically; it determines whether the overall controllability of the system meets the requirement, i.e., whether the system's resources suffice to complete the work of controlling and adjusting the sensor. If the calculated controllability σ_cntr is greater than the selected threshold, the system has sufficient resources to control and adjust the sensor to track the target to be tracked; otherwise, the system does not currently have enough resources and cannot complete the sensor control and adjustment task.
FIG. 2 is a block diagram of a process for selecting selected sensors in a target tracking method according to another embodiment of the invention. As shown in fig. 2, in another embodiment, S20 includes:
s21: and acquiring a possible motion range of the target to be tracked.
S22: the sensing detection range of all controllable sensors is calculated.
S23: the detectable degree of recognition of each controllable sensor is calculated based on the degree of coincidence of the range of possible motion with the sensed detection range of each controllable sensor.
S24: the controllable sensor with the detectable recognition degree larger than the second threshold value is selected as the selected sensor.
Controlling and adjusting the selected sensor makes its dominant sensing direction synchronously track and lock onto the target to be tracked, improving the sensing, detection, recognition, and tracking performance for that target. After the target to be tracked is determined, the controllable sensors with a high detectable recognition degree of the target are screened out, by comparing the detectable recognition degrees of all controllable sensors, to track the target, so that the target falls within the sensor's sight range and field of view to the greatest possible extent.
In one embodiment, S23 includes:
the ratio of the overlap of the possible motion range and the perceived detection range to the possible motion range is taken as the detectable degree of recognition.
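The overlap ratio of S23 can be estimated numerically. The sketch below assumes the sector/ellipse parameterization of fig. 3; the Monte Carlo approach and all function names are our own illustration, not the patent's prescribed method.

```python
import math
import random

def in_sector(px, py, xs, ys, th_l, th_r, r):
    """Point test for a sector with apex (xs, ys), left/right boundary
    direction angles th_l >= th_r (radians, vehicle frame) and sight range r."""
    dx, dy = px - xs, py - ys
    if math.hypot(dx, dy) > r:
        return False
    return th_r <= math.atan2(dy, dx) <= th_l

def overlap_ratio(ellipse, sector, n=20000, seed=0):
    """Monte Carlo estimate of |ellipse ∩ sector| / |ellipse|:
    sample uniformly inside the target's motion-range ellipse and
    count the fraction of samples falling in the sensing sector."""
    xt, yt, la, ld, tilt = ellipse  # center, major/minor radii, inclination
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        # uniform point in the unit disc, then scale/rotate/translate
        r = math.sqrt(rng.random())
        phi = rng.uniform(0.0, 2.0 * math.pi)
        ex, ey = la * r * math.cos(phi), ld * r * math.sin(phi)
        px = xt + ex * math.cos(tilt) - ey * math.sin(tilt)
        py = yt + ex * math.sin(tilt) + ey * math.cos(tilt)
        hits += in_sector(px, py, *sector)
    return hits / n
```

A sensor would then be selected when this ratio (the detectable recognition degree) exceeds the second threshold, e.g. 0.85 as in the example below.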
In another embodiment, S24 is preceded by:
the detectable recognition degree is compared with a second threshold value according to the height of the detectable recognition degree.
FIG. 3 is a schematic diagram of a target tracking method according to one embodiment of the invention. As shown in fig. 3, a vehicle coordinate system (right-hand coordinate system) is established with the center of the vehicle's front bumper (the box in fig. 3) as the origin, the longitudinal direction of travel as the y-axis (forward positive), and the lateral direction of motion as the x-axis (rightward positive). The sensing detection range of a vehicle sensor can generally be represented by a sector, which represents the sight range of the sensor's sensing field of view. Through electromechanical control adjustment, a number of sensing detection ranges (several sector areas) can be obtained; the area formed by the union of all these sectors is the overall sensing detection range. As shown in fig. 3, a vehicle sensing sensor may be described in the vehicle coordinate system by: the sensor mounting position (x_s, y_s); the left boundary direction angle θ_ls of the sensor's horizontal sensing field of view; the right boundary direction angle θ_rs; and the sensing sight range r_s. These range parameters depend mainly on the sensing characteristics of the sensor. Note that the axis direction angle of the sensor's horizontal sensing field of view is the bisector of the two boundary angles, (θ_ls + θ_rs)/2.
Through control adjustment we can adjust x_s, y_s, θ_ls, θ_rs, r_s, etc. within certain ranges; the adjustment ranges of these parameter variables can be assumed to be:
x_s ∈ [x_s-min, x_s-max]
y_s ∈ [y_s-min, y_s-max]
θ_ls ∈ [θ_ls-min, θ_ls-max]
θ_rs ∈ [θ_rs-min, θ_rs-max]
r_s ∈ [r_s-min, r_s-max]
|θ_ls − θ_rs| ≤ θ_FOV
These upper and lower bound parameters x_s-min/max, y_s-min/max, θ_ls-min/max, θ_rs-min/max, r_s-min/max and the field-of-view angle range parameter θ_FOV depend on factors such as the electromechanical control characteristics of the sensor and vehicle layout constraints. Through electromechanical control adjustment, the sensing detection range of the sensor is extended from a single sector to a set of sectors; the range covered by this set depends on the upper and lower bound parameters, giving the sensor a larger sensing detection range and greater freedom, as shown by the sector areas of fig. 3.
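A requested adjustment must be kept inside these bounds. The sketch below clamps the field-of-view parameters to hypothetical limits; all bound values, names, and the symmetric-shrink policy for the θ_FOV constraint are illustrative assumptions, not values from the patent.

```python
from dataclasses import dataclass

@dataclass
class SensorLimits:
    th_ls: tuple   # (min, max) left boundary direction angle, rad
    th_rs: tuple   # (min, max) right boundary direction angle, rad
    r_s: tuple     # (min, max) sensing sight range, m
    th_fov: float  # maximum field-of-view opening |th_ls - th_rs|, rad

def clamp(value, lo, hi):
    return max(lo, min(hi, value))

def apply_adjustment(limits: SensorLimits, th_ls: float, th_rs: float,
                     r_s: float):
    """Clamp each requested parameter into its adjustment range and enforce
    the field-of-view constraint |th_ls - th_rs| <= th_fov."""
    th_ls = clamp(th_ls, *limits.th_ls)
    th_rs = clamp(th_rs, *limits.th_rs)
    r_s = clamp(r_s, *limits.r_s)
    if abs(th_ls - th_rs) > limits.th_fov:
        # shrink the opening symmetrically about the axis direction angle
        axis = (th_ls + th_rs) / 2.0
        th_ls = axis + limits.th_fov / 2.0
        th_rs = axis - limits.th_fov / 2.0
    return th_ls, th_rs, r_s
```

The same clamping applies to the mounting-position parameters x_s and y_s where they are adjustable.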
The possible motion range of the target to be tracked can generally be represented by an elliptical area, which represents the uncertainty range of the target's motion, as shown by the elliptical area in fig. 3. The target to be tracked may be described in the vehicle coordinate system by: the center position (x_t, y_t) of the target, and the ellipse of its movable range with major-axis radius la_t, minor-axis radius ld_t, and inclination angle lθ_t. These target parameters depend mainly on the performance of the sensing detection sensor (accuracy, resolution, uncertainty, etc.) and on the target's motion characteristics (speed, acceleration, heading angle, etc.).
The smaller the detectable recognition degree of the target (the closer to 0), the lower the probability that the target can be sensed and detected by the sensor, i.e., the sensor's sensing detection range hardly covers the target's movable range. Conversely, the larger the detectable recognition degree (the closer to 1), the higher the probability that the target will be sensed and detected. A second threshold may be selected empirically to determine whether a sensor's detectable recognition degree of the target meets the requirement. For example, with a second threshold of 0.85, if the sensor's detectable recognition degree of the target is greater than 0.85, that sensor may be selected for sensing detection and tracking of the target; otherwise, the sensor cannot fulfil the sensing detection and tracking tasks for the target.
It should be noted that the above embodiment describes adjustment of the selected sensor in the horizontal direction only; to simplify the description, adjustment in the vertical direction is ignored. In practical applications, vertical adjustment may be performed by the same method, on the same principle.
It should also be noted that controlling and adjusting the (horizontal) axis direction of the sensor's sensing field of view is only one possible adjustment; other control adjustments are conceivable, such as adjusting the sensor's mounting position, pitch angle, sensing distance, or sensing field-of-view angle.
Thus, it should be appreciated by those skilled in the art that while a number of exemplary embodiments of the invention have been illustrated and described in detail herein, many other variations or modifications consistent with the principles of the invention may be directly determined or derived from the disclosure of the present invention without departing from the spirit and scope of the invention. Accordingly, the scope of the invention should be understood and interpreted to cover all such other variations or modifications.

Claims (7)

1. A method for object tracking for a vehicle provided with a road-coordinated autonomous driving system comprising a plurality of controllable sensors of adjustable detection position, characterized in that it comprises the following steps:
obtaining targets to be tracked around the vehicle;
selecting a controllable sensor for tracking the target to be tracked as a selected sensor according to the detection and identification capacity of each controllable sensor on the target to be tracked;
acquiring the running normality of different functional modules of the system and the coordinated running normality of the system, wherein the running normality of the different functional modules of the system and the coordinated running normality of the system are changed between 0 and 1;
taking the minimum value of the normal degree of operation of different functional modules of the system and the normal degree of coordinated operation of the system as the normal degree of operation of the system;
acquiring the capability data of the resource calling of the system, and calculating the resource calling feasibility of the system based on the capability data of the resource calling of the system;
the step of calculating the resource calling feasibility of the system based on the capability data of the resource calling of the system comprises the following steps:
taking the smaller value of the hardware resource calling feasibility degree σ_hwr and the software resource calling feasibility degree σ_swr as the resource calling feasibility degree, wherein σ_hwr and σ_swr are both quantities varying between 0 and 1;
acquiring maintenance condition data of the normal function of the system, and calculating the normal maintenance degree of the function of the system based on the maintenance condition data of the normal function of the system;
the step of calculating the normal function maintenance degree of the system based on the maintenance condition data of the normal function of the system comprises the following steps:
taking the smaller value of the normal operation degree σ_envr of the environment monitoring function of the system after removing the selected sensor and the normal operation degree σ_redn of the redundant backup function of the system after removing the selected sensor as the function normality maintenance degree, wherein σ_envr and σ_redn are both quantities varying between 0 and 1;
acquiring difficulty data of target detection and tracking of the system, and calculating the target detection and tracking easiness of the system based on the difficulty data of the target detection and tracking of the system;
the step of calculating the target detection and tracking easiness of the system based on the difficulty data of the target detection and tracking of the system comprises the following steps:
taking the smaller value of the target detection and identification easiness degree σ_detc and the target locking and tracking easiness degree σ_trak as the target detection and tracking easiness degree, wherein σ_detc and σ_trak are both quantities varying between 0 and 1;
weighting and adding the resource calling feasibility degree, the function normal maintenance degree and the target detection tracking easiness degree by using a first coefficient, a second coefficient and a third coefficient respectively, and taking the smaller value of the added value and the system operation normality degree as the overall controllability degree, wherein the sum of the first coefficient, the second coefficient and the third coefficient is equal to 1;
judging whether the system reaches a preset standard for controlling the selected sensor or not according to the overall controllability of the system;
and when the system reaches a preset standard for controlling the selected sensor, the system controls the selected sensor to track the target to be tracked.
2. The method according to claim 1, wherein obtaining capability data of the resource invocation of the system and calculating the resource invocation feasibility of the system based on the capability data of the resource invocation of the system comprises:
acquiring the calling condition information of the controllable sensor and the equipment related to the controllable sensor in the system, and calculating the hardware resource calling feasibility according to the calling condition information of the controllable sensor and the equipment related to the controllable sensor;
acquiring the calling condition information of the software in the system, and calculating the software resource calling feasibility according to the calling condition information of the software in the system;
and calculating the resource calling feasibility of the system according to the hardware resource calling feasibility and the software resource calling feasibility.
3. The target tracking method according to claim 2, wherein acquiring difficulty data of target detection and tracking of the system and calculating target detection and tracking easiness of the system based on the difficulty data of target detection and tracking of the system comprises:
acquiring detection and identification characteristic information, environmental factor information and target attribute information of the controllable sensor, and calculating the detection and identification easiness of the target according to the detection and identification characteristic information, the environmental factor information and the target attribute information of the controllable sensor;
acquiring electromechanical control characteristic information and target motion characteristic information of the controllable sensor, and calculating to obtain the locking tracking easiness of the target according to the electromechanical control characteristic information and the target motion characteristic information of the controllable sensor;
and calculating the target detection and tracking easiness according to the detection and identification easiness and the locking and tracking easiness.
4. The method of claim 3, wherein determining whether the system meets a predetermined criteria for controlling the selected sensor based on the overall controllability of the system comprises:
and when the overall controllability is larger than a first threshold value, judging that the system reaches a preset standard for controlling the controllable sensor.
5. The target tracking method according to any one of claims 1 to 4, wherein selecting the controllable sensor for tracking the target to be tracked as the selected sensor according to the detection and identification capability of each controllable sensor on the target to be tracked comprises:
acquiring a possible motion range of the target to be tracked;
calculating the sensing detection range of all the controllable sensors;
calculating a detectable degree of identification of each of the controllable sensors based on a degree of coincidence of the range of possible motion and the perceived detection range of each of the controllable sensors;
selecting as the selected sensor the controllable sensor having the detectable identification greater than a second threshold.
6. The method of claim 5, wherein calculating the detectable degree of identification of each of the controllable sensors based on the degree of coincidence of the range of possible motion and the range of perceived detection of each of the controllable sensors comprises:
the ratio of the overlap of the range of possible motion and the range of perceptual detection to the range of possible motion is taken as the detectable degree of recognition.
7. The method of claim 5, wherein selecting the controllable sensor having the detectable identification greater than a second threshold as the selected sensor further comprises:
and comparing the detectable identification degrees with the second threshold value in descending order of magnitude.
CN201910744516.XA 2019-08-13 2019-08-13 Target tracking method for vehicle Active CN110427035B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910744516.XA CN110427035B (en) 2019-08-13 2019-08-13 Target tracking method for vehicle


Publications (2)

Publication Number Publication Date
CN110427035A CN110427035A (en) 2019-11-08
CN110427035B true CN110427035B (en) 2022-09-16

Family

ID=68414309

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910744516.XA Active CN110427035B (en) 2019-08-13 2019-08-13 Target tracking method for vehicle

Country Status (1)

Country Link
CN (1) CN110427035B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102881053A (en) * 2012-09-26 2013-01-16 江门市侍卫长卫星应用安全有限公司 Social stability maintenance supervision system on basis of vehicle identification
DE102013001079A1 (en) * 2013-01-23 2014-08-07 Sew-Eurodrive Gmbh & Co Kg System with vehicles and method for operating a system with multiple vehicles
CN108284836A (en) * 2018-01-25 2018-07-17 吉林大学 A kind of longitudinal direction of car follow-up control method
CN108334087A (en) * 2018-01-25 2018-07-27 广州大学 A kind of advanced driving assistance system of hardware and software platform based on software definition
CN109739232A (en) * 2018-12-29 2019-05-10 百度在线网络技术(北京)有限公司 Barrier method for tracing, device, car-mounted terminal and storage medium
CN109885099A (en) * 2017-12-06 2019-06-14 智飞智能装备科技东台有限公司 A kind of visual identifying system for unmanned plane tracking lock target


Also Published As

Publication number Publication date
CN110427035A (en) 2019-11-08

Similar Documents

Publication Publication Date Title
CN108983768B (en) Automatic driving system
WO2019178548A1 (en) Determining drivable free-space for autonomous vehicles
US11625038B2 (en) Autonomous driving device
CN106004860A (en) Vehicle traveling control device
US9956958B2 (en) Vehicle driving control device and control device
US11845471B2 (en) Travel assistance method and travel assistance device
US11585669B2 (en) Vehicle routing using connected data analytics platform
JP2020052607A (en) Information processing system
RU2678416C2 (en) Cruise control system of vehicle and method for operation thereof
CN111457933B (en) Method and device for determining static and dynamic information of lane class
US11458992B2 (en) Safe trajectory tracking in uncertain environments
CN110347166B (en) Sensor control method for automatic driving system
CN110427035B (en) Target tracking method for vehicle
US11884296B2 (en) Allocating processing resources to concurrently-executing neural networks
CN112285717A (en) Method for controlling vehicle-mounted radar signal and electronic equipment
CN115440025B (en) Information processing server, processing method of information processing server, and non-transitory storage medium
US20220404506A1 (en) Online validation of lidar-to-lidar alignment and lidar-to-vehicle alignment
US20230271628A1 (en) Distributed processing of vehicle sensor data
US20230131124A1 (en) Connected vehicle road-safety infrastructure insights
CN114355868A (en) Dynamic speed planning method and system for self-driving
CN116968493A (en) Suspension control method and device for automatic driving vehicle, vehicle and medium
JP2022142860A (en) Drive assist device, drive assist method and program
CN113785340A (en) Information processing method, information processing system, and information processing apparatus
CN114248783A (en) Vehicle auxiliary control method and device, map acquisition method and server
CN117125095A (en) Vehicle control method, electronic equipment and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant