CN115410376A - Vehicle identity determination method, device and equipment applied to vehicle-road cooperation - Google Patents


Info

Publication number
CN115410376A
Authority
CN
China
Prior art keywords
detection data
vehicle
data
determining
detection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110593501.5A
Other languages
Chinese (zh)
Inventor
程浩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Chenggu Technology Co ltd
Original Assignee
Shenzhen Chenggu Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Chenggu Technology Co ltd filed Critical Shenzhen Chenggu Technology Co ltd
Priority to CN202110593501.5A
Publication of CN115410376A
Legal status: Pending


Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 1/00: Traffic control systems for road vehicles
    • G08G 1/01: Detecting movement of traffic to be counted or controlled
    • G08G 1/017: Detecting movement of traffic to be counted or controlled; identifying vehicles
    • G08G 1/0175: Detecting movement of traffic to be counted or controlled; identifying vehicles by photographing vehicles, e.g. when violating traffic rules
    • G: PHYSICS
    • G07: CHECKING-DEVICES
    • G07B: TICKET-ISSUING APPARATUS; FARE-REGISTERING APPARATUS; FRANKING APPARATUS
    • G07B 15/00: Arrangements or apparatus for collecting fares, tolls or entrance fees at one or more control points
    • G07B 15/06: Arrangements for road pricing or congestion charging of vehicles or vehicle users, e.g. automatic toll systems

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Finance (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application belongs to the technical field of vehicle identity matching and provides a vehicle identity determination method, device and equipment applied to vehicle-road cooperation. The method includes: acquiring first detection data and second detection data, where the first detection data is data uploaded by a radar, the second detection data is data uploaded by a Road Side Unit (RSU), and the second detection data includes identity information of a vehicle; determining a difference between the first detection data and the second detection data; determining a matching degree between the first detection data and the second detection data according to a preset weight and the difference; and determining, according to the matching degree, the identity information corresponding to the vehicle to which the first detection data corresponds. This method can improve the accuracy of vehicle identity matching.

Description

Vehicle identity determination method, device and equipment applied to vehicle-road cooperation
Technical Field
The present application relates to the field of vehicle identity matching technologies, and in particular, to a method, an apparatus, a device, and a computer-readable storage medium for vehicle identity determination applied to vehicle-road coordination.
Background
In an Electronic Toll Collection (ETC) system, when a vehicle passes a Road Side Unit (RSU) at high speed, the RSU communicates with the On Board Unit (OBU) on the vehicle by microwave. During this communication, the RSU identifies the type of the vehicle carrying the OBU by acquiring the OBU's information, calculates the corresponding rate, and deducts the vehicle's toll accordingly. In an ETC system, once the OBU and the RSU establish a microwave communication link, vehicle identification and electronic fee deduction can be completed while the vehicle is moving, without stopping.
However, since there are usually many vehicles on the highway, if the vehicle identification is wrong, it may cause a fee deduction error.
To address this problem, existing methods identify the vehicle jointly with a radar, an RSU, and a camera. Specifically, vehicle image coordinates are obtained through the RSU and through the radar, and the two sets of coordinates are matched. If they do not match, an image of the vehicle is captured and recognized to obtain a vehicle identification result. The identification result is then compared with the OBU information of the vehicle, and a corresponding output is determined from the comparison. However, a camera cannot work in all weather conditions and is strongly affected by rain, snow, fog, sand and dust, so this method can still fail to identify the vehicle accurately.
Disclosure of Invention
The embodiments of the present application provide a vehicle identity determination method applied to vehicle-road cooperation, which can address the difficulty existing methods have in accurately identifying vehicle identity.
In a first aspect, an embodiment of the present application provides a vehicle identity determining method applied to vehicle-road coordination, including:
acquiring first detection data and second detection data, wherein the first detection data are data uploaded by a radar, the second detection data are data uploaded by a Road Side Unit (RSU), and the second detection data comprise identity information of a vehicle;
determining a difference between the first detected data and the second detected data;
determining the matching degree of the first detection data and the second detection data according to a preset weight and the difference;
and determining the identity information corresponding to the vehicle corresponding to the first detection data according to the matching degree.
In a second aspect, an embodiment of the present application provides a vehicle identity determining apparatus applied to vehicle-road coordination, including:
the system comprises a detection data acquisition module, a data processing module and a data processing module, wherein the detection data acquisition module is used for acquiring first detection data and second detection data, the first detection data is data uploaded by a radar, the second detection data is data uploaded by a Road Side Unit (RSU), and the second detection data comprises identity information of a vehicle;
a difference determination module for determining a difference between the first detection data and the second detection data;
the matching degree determining module is used for determining the matching degree of the first detection data and the second detection data according to preset weight and the difference;
and the identity information determining module of the vehicle is used for determining the identity information corresponding to the vehicle corresponding to the first detection data according to the matching degree.
In a third aspect, an apparatus is provided in an embodiment of the present application, and includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and the processor implements the method according to the first aspect when executing the computer program.
In a fourth aspect, an embodiment of the present application provides a vehicle identity determining system applied to vehicle-road coordination, where the vehicle identity determining system includes: at least one radar and at least one RSU, further comprising an apparatus as described in the third aspect.
In a fifth aspect, the present application provides a computer-readable storage medium, which stores a computer program, and when executed by a processor, the computer program implements the method according to the first aspect.
In a sixth aspect, embodiments of the present application provide a computer program product, which, when run on a device, causes the device to perform the method of the first aspect.
Compared with the prior art, the embodiment of the application has the advantages that:
In the embodiments of the present application, the matching degree is determined according to the preset weight and the difference, and the difference is determined from the first detection data and the second detection data. In other words, when the matching degree is determined, the data uploaded by the radar (the first detection data) and the data uploaded by the RSU (the second detection data) are fused to obtain their difference, and the difference is adjusted by the weight. This improves the accuracy of the resulting matching degree, and in turn the accuracy of the identity information determined from it.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings used in the embodiments or the description of the prior art will be briefly described below.
Fig. 1 is a schematic flowchart of a vehicle identity determining method applied to vehicle-road coordination according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a radar and an RSU mounted on a gantry according to an embodiment of the present application;
fig. 3 is a schematic diagram of an information interaction interval corresponding to an interaction position according to an embodiment of the present application;
fig. 4 is a schematic diagram of the information interaction area corresponding to information interaction between an RSU and an aftermarket OBU according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a hierarchical structure model provided in an embodiment of the present application;
fig. 6 is a schematic structural block diagram of a vehicle identity determining apparatus applied to vehicle-road coordination according to a second embodiment of the present application;
fig. 7 is a schematic structural diagram of an apparatus provided in the third embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise.
The first embodiment is as follows:
a vehicle-road cooperation system (vehicle-road cooperation) is a latest development direction of an intelligent transportation system, and in the vehicle-road cooperation system, identity information corresponding to a track of a vehicle generally needs to be recognized. In order to identify the vehicle identity, the existing method combines a radar, an RSU and a camera to identify the vehicle identity together. However, in this method, only the radar and the vehicle image coordinates acquired by the RSU are simply matched, but the range of information interaction between the RSU and the OBU is unstable, that is, the accuracy of the vehicle image coordinates acquired by the RSU is low.
To solve the above technical problem, an embodiment of the present application provides a vehicle identity determination method applied to vehicle-road cooperation. In this method, after a radar detects a target in its detection area, it obtains first detection data and uploads it to a device. The RSU interacts with the OBU of each passing vehicle to obtain second detection data, which it also uploads to the device. The device determines the difference between the first detection data and the second detection data, and then determines the matching degree of the two according to a preset weight and the difference; that is, it determines how well the vehicle corresponding to the first detection data matches the identity information contained in the second detection data, and can thereby identify which identity information that vehicle matches. Because the data uploaded by the radar and by the RSU are fused when the matching degree is determined, the relation between the first and second detection data is taken into account, which improves the accuracy of the matching degree and, in turn, the accuracy of the identity information determined from it.
The following describes a vehicle identity determination method applied to vehicle-road coordination according to an embodiment of the present application with reference to the accompanying drawings.
Fig. 1 shows a flowchart of the vehicle identity determination method applied to vehicle-road cooperation provided in an embodiment of the present application. The method (hereinafter simply the vehicle identity determination method) is applied to a device and is detailed as follows:
step S11, first detection data and second detection data are obtained, the first detection data are data uploaded by a radar, the second detection data are data uploaded by a Road Side Unit (RSU), and the second detection data comprise identity information of a vehicle.
In this embodiment, a radar and an RSU are installed on a road to obtain information about vehicles. Specifically, the radar detects targets in its detection area and obtains information about each target. For example, if the target is a vehicle, information such as the distance between the vehicle and the radar, the vehicle's speed, and the vehicle's length, width and height is acquired; this information constitutes the first detection data that the radar uploads to the device. The RSU reads the information of the OBU by interacting with the OBU of each passing vehicle, for example the identity information of the corresponding vehicle (such as its license plate number) and the vehicle's length, width and height; this information constitutes the second detection data that the RSU uploads to the device.
In some embodiments, if a plurality of radars are installed on the road, the device acquires a plurality of first detection data uploaded by the plurality of radars. Similarly, if a plurality of RSUs are installed on the road, the device will acquire a plurality of second detection data uploaded by the plurality of RSUs.
In some embodiments, to improve the accuracy of subsequent identity matching, the radar's detection area and the RSU's detection area are arranged to overlap. For example, both the radar and the RSU are installed on the same highway gantry or the same side pole, or both are installed at the exit of the same parking lot, so that their detection areas overlap. Referring to Fig. 2: the black circle is the radar, the black square is the RSU, the box is the device, and the marked area on the lane is the region where the radar's and the RSU's detection areas overlap. Note that only one radar is shown in Fig. 2; in practice there may be more than one.
In some embodiments, considering that vehicles travel fast on a highway, the coverage of an RSU on a gantry is set greater than or equal to a coverage threshold so that vehicle information can be obtained in time. For example, with a coverage threshold of 500 meters, the RSU's coverage is set greater than 500 meters; with a coverage threshold of 1000 meters, it may be set equal to 1000 meters, and so on. Near an ETC toll station at a highway entrance or exit, vehicles are relatively slow, so the coverage of an RSU there can be smaller than that of an RSU on a gantry.
In some embodiments, to ensure the matching accuracy, the time of the radar is set to be synchronized with the time of the RSU, so as to ensure that the times corresponding to the first detection data and the second detection data are synchronized.
Step S12, a difference between the first detection data and the second detection data is determined.
In this embodiment, the difference may be determined as the arithmetic difference between the first detection data and the second detection data, or as their quotient. Of course, it may be determined in other ways, as long as it reflects the gap between the two; this is not limited here. Note that the first and second detection data participating in this step were not necessarily uploaded at the same time point; however, to speed up subsequent matching, the gap between their upload times is kept within a preset time-difference range.
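A minimal sketch of step S12 in Python, under assumed field names ("speed", "length", etc. are illustrative; the patent does not fix the dimensions). The difference here is the absolute arithmetic difference per shared dimension; the text equally allows a quotient or any other measure of the gap.

```python
def detection_difference(first: dict, second: dict) -> dict:
    # Compare only the dimensions present in both detection records.
    return {k: abs(first[k] - second[k]) for k in first.keys() & second.keys()}

# Hypothetical radar upload (first detection data) and RSU upload
# (second detection data) for one vehicle.
radar_data = {"speed": 22.5, "length": 4.6, "width": 1.8, "height": 1.5}
rsu_data = {"speed": 21.9, "length": 4.5, "width": 1.8, "height": 1.5}
diff = detection_difference(radar_data, rsu_data)
```

Dimensions that only one side reports (e.g. a license plate known only to the RSU) simply drop out of the comparison here; the patent instead handles such dimensions through the data-type screening described later.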
And S13, determining the matching degree of the first detection data and the second detection data according to the preset weight and the difference.
For example, for dimensions whose data are not important to determining the vehicle's identity, a smaller weight is set to shrink the corresponding difference; conversely, a larger weight is set to amplify it.
In this embodiment, the higher the matching degree, the higher the probability that the identity information of the vehicle corresponding to the first detection data is the identity information contained in the second detection data; conversely, the lower the matching degree, the lower that probability.
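The patent gives no closed-form expression for step S13. One hedged way to realize it is to fold the weighted per-dimension differences into a single score mapped to (0, 1], so that smaller weighted differences yield a higher matching degree; the weight values below are invented for illustration.

```python
def matching_degree(diffs: dict, weights: dict) -> float:
    # Weighted sum of per-dimension differences, mapped so that a smaller
    # gap between radar and RSU data gives a score closer to 1.
    weighted = sum(weights.get(k, 0.0) * d for k, d in diffs.items())
    return 1.0 / (1.0 + weighted)  # in (0, 1]; equals 1 for identical data

weights = {"speed": 0.2, "length": 0.4, "width": 0.2, "height": 0.2}
close = matching_degree({"speed": 0.5, "length": 0.1}, weights)  # small gap
far = matching_degree({"speed": 5.0, "length": 1.2}, weights)    # large gap
```

Any monotone mapping would serve equally; the essential property, matching the paragraph above, is that the score decreases as the weighted difference grows.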
And S14, determining the identity information corresponding to the vehicle corresponding to the first detection data according to the matching degree.
In this embodiment, the matching degree represents how well the first detection data matches the second detection data; the first detection data corresponds to a vehicle, and the second detection data contains identity information indicating a vehicle's identity. The probability that the vehicle corresponding to the first detection data bears the identity information in the second detection data can therefore be determined from the matching degree.
In the embodiments of the present application, the matching degree is determined according to the preset weight and the difference, and the difference is determined from the first detection data and the second detection data. In other words, when the matching degree is determined, the data uploaded by the radar (the first detection data) and the data uploaded by the RSU (the second detection data) are fused to obtain their difference, and the difference is adjusted by the weight. This improves the accuracy of the resulting matching degree, and in turn the accuracy of the identity information determined from it.
In some embodiments, the preset weight is determined according to the following:
and a1, screening out data types influencing vehicle identity determination from the historical first detection data and the historical second detection data.
Historical first detection data refers to first detection data other than that currently uploaded by the radar and awaiting matching. Similarly, historical second detection data refers to second detection data other than that currently uploaded by the RSU and awaiting matching.
For example, if the length of the vehicle in the second detection data uploaded by the RSU is "3.6 meters", then the data type of "3.6 meters" is "length".
In this embodiment, considering that the first and second detection data may contain data irrelevant to determining the vehicle's identity information, the desired data types can be screened out from the data types corresponding to the first and second detection data. After this screening, only the data types that can affect the subsequent determination of the vehicle's identity information remain.
and a2, determining a preset weight according to the screened data type.
In a1 and a2 above, since the screened data types are those that can affect vehicle identity determination, the weights can be determined using the screened data types alone. This not only improves the accuracy of the determined weights but also, with fewer data types, speeds up their determination.
In some embodiments, the historical second detection data comprises historical front-mounted second detection data and historical aftermarket second detection data. Historical front-mounted second detection data is second detection data obtained after the RSU interacts with a front-mounted OBU; historical aftermarket second detection data is second detection data obtained after the RSU interacts with an aftermarket OBU. A front-mounted OBU is an OBU installed on the vehicle before the vehicle leaves production, and an aftermarket OBU is one installed afterwards. That is, when screening data types, it is necessary to distinguish whether the historical second detection data was obtained from interaction with a front-mounted OBU or with an aftermarket OBU; accordingly, step a1 includes:
a11, screening out, from the historical first detection data and the historical front-mounted second detection data, the data types affecting vehicle identity determination, to obtain the front-mounted data types.
a12, screening out, from the historical first detection data and the historical aftermarket second detection data, the data types affecting vehicle identity determination, to obtain the aftermarket data types.
The step a2 comprises the following steps:
a21, determining the weights according to the screened front-mounted data types, to obtain the preset front-mounted weights.
a22, determining the weights according to the screened aftermarket data types, to obtain the preset aftermarket weights.
Step S13 includes:
b1, if the second detection data is obtained from interaction between the RSU and a front-mounted OBU, determining the matching degree of the first detection data and the second detection data according to the preset front-mounted weight and the difference.
b2, if the second detection data is obtained from interaction between the RSU and an aftermarket OBU, determining the matching degree of the first detection data and the second detection data according to the preset aftermarket weight and the difference.
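Steps b1 and b2 amount to selecting a weight set keyed by OBU type before computing the matching degree. A small sketch; the weight values, dimension names, and the `obu_type` field are assumptions for illustration, not taken from the patent:

```python
# Preset weight sets, one per OBU type (values are illustrative only).
FRONT_MOUNTED_WEIGHTS = {"position": 0.3, "speed": 0.3, "size": 0.4}
AFTERMARKET_WEIGHTS = {"direction": 0.5, "size": 0.5}

def select_weights(second_detection: dict) -> dict:
    # Dispatch on how the second detection data was obtained (b1 vs b2).
    if second_detection.get("obu_type") == "front_mounted":
        return FRONT_MOUNTED_WEIGHTS
    return AFTERMARKET_WEIGHTS

w_front = select_weights({"obu_type": "front_mounted"})
w_after = select_weights({"obu_type": "aftermarket"})
```

The chosen weight set is then combined with the difference exactly as in step S13; only the weights change with the OBU type.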
In this embodiment, the OBUs that interact with the RSU include front-mounted OBUs and aftermarket OBUs. A front-mounted OBU is installed before the vehicle leaves production; an aftermarket OBU is installed afterwards. Because a front-mounted OBU can draw on the vehicle's power supply (which is ample, so power is not a problem), it can exchange information with the vehicle's systems to acquire vehicle data, for example the vehicle's position (determined, say, by the BeiDou satellite navigation system, from which the front-mounted OBU obtains it), its velocity (both direction and speed), and its length, width and height. An aftermarket OBU, by contrast, is powered by an internal battery or by solar energy; because of this power constraint it has both a wake mode and a sleep mode, and in wake mode it only uploads the information stored in a built-in file together with the vehicle's driving direction as provided by its internal compass.
In order to more clearly describe the data types corresponding to the first detection data and the second detection data, the following description is made in conjunction with table 1.
Table 1:
(Table 1 appears as an image in the original publication.)
as can be seen from table 1, when the RSU interacts with the aftermarket OBU, detection data (i.e., second detection data) corresponding to the following data types will be obtained: license plate number (i.e. identity information), information interaction position, driving direction, length x width x height, time of OBU wake-up and time of OBU sleep. The data type obtained by the radar detecting the vehicle can also be directly obtained from table 1, and is not described herein again.
When the second detection data is obtained from interaction between the RSU and a front-mounted OBU, the front-mounted OBU can interact with the RSU in real time, so the RSU can upload its second detection data to the device in real time; likewise, the radar can upload its first detection data to the device in real time. As Table 1 shows, the RSU obtains different second detection data from different OBU types (front-mounted versus aftermarket), and the upload manner also differs. Moreover, the position information provided by BeiDou, the vehicle-size information detected by the radar (length × width × height), and the signal characteristics (the times when a signal appears and disappears) are usually not fully accurate or controllable, and the aftermarket OBU's information interaction position (Fig. 3 shows the information interaction interval corresponding to the interaction position) is a matching position derived from the information interaction time, so it too carries uncertainty. Therefore, the corresponding second detection data (the historical front-mounted and historical aftermarket second detection data) is determined according to the OBU type, and the corresponding data types are then screened from the historical first detection data together with the historical front-mounted (or aftermarket) second detection data. This improves the accuracy of the screened data types, and in turn the accuracy of the resulting weights and of the matching degree later determined from them.
In some embodiments, as Table 1 shows, the second detection data contains different data types depending on the OBU type. If the second detection data comes from interaction with an aftermarket OBU, only the information inherent in the OBU's built-in file and the driving direction from its internal compass can be obtained, because that is all an aftermarket OBU uploads. In other words, second detection data obtained from an aftermarket OBU does not contain the position information indicating the lane the vehicle currently occupies (denote this position information C1). If, however, the second detection data comes from a front-mounted OBU, the front-mounted OBU can acquire the vehicle's position, so the RSU can derive C1 (the lane of the vehicle carrying that OBU) from the position the OBU uploads; for example, the RSU combines its own coordinates with the uploaded vehicle position to establish their relative position and thereby obtains C1. Since adding this data-type dimension makes the determined weights more accurate, in this embodiment of the application C1 is also determined for vehicles whose OBU is aftermarket.
That is, if at least two RSUs are installed on the same gantry or the same side bar, the vehicle identity determination method comprises the following steps:
c1, determining second detection data belonging to the same vehicle from the historical rear-mounted second detection data.
c2, for each vehicle, determining C1, which indicates the position information of the lane the vehicle is currently in, according to the second detection data belonging to that vehicle.
Correspondingly, step a12 comprises:
and screening out the data type influencing the vehicle identity determination from the historical first detection data, the C1 and the historical afterloading second detection data to obtain the afterloading data type.
In steps c1 and c2 above, if the second detection data includes identity information, the at least two pieces of second detection data acquired by the RSUs for the rear-mounted OBU of the same vehicle can be determined according to that identity information, and C1 is then determined from those pieces of data. Specifically, when different RSUs interact with the same vehicle, their interaction times differ somewhat, so the lane the vehicle is in can be determined from this difference. Referring to fig. 4, two RSUs are mounted on a gantry: RSU1 and RSU2; the shaded portion in fig. 4 is the information interaction area in which RSU1 exchanges information with a rear-mounted OBU.
In this embodiment, the dimension corresponding to data type C1 is added when screening the data types, which improves the accuracy of the obtained rear-mounted data types.
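The lane inference just described (two RSUs on one gantry, lane derived from the difference in their interaction times) can be sketched as follows. This is a hypothetical illustration: the threshold value and the simple two-lane geometry are assumptions, since the embodiment only states that the lane follows from the timing difference.

```python
# Hypothetical sketch: infer the lane of a vehicle with a rear-mounted OBU
# from the first-interaction timestamps recorded by two RSUs (RSU1, RSU2)
# mounted on the same gantry. The threshold and the two-lane layout are
# assumptions for illustration only.

def infer_lane(t_rsu1: float, t_rsu2: float, threshold: float = 0.05) -> int:
    """Return 1 if the vehicle is in the lane covered mainly by RSU1,
    2 if covered mainly by RSU2, 0 if the timing difference is ambiguous."""
    dt = t_rsu1 - t_rsu2
    if dt < -threshold:      # RSU1 interacted noticeably earlier
        return 1
    if dt > threshold:       # RSU2 interacted noticeably earlier
        return 2
    return 0                 # interaction times too close to decide
```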
In some embodiments, step a21 comprises:
and determining the ratio of the influence of any two front-mounted data types on the vehicle identity determination in the screened front-mounted data types, wherein the ratio of the influence is taken as a preset front-mounted weight.
In this embodiment, the ratio of the influence of any two front-mounted data types on vehicle identity determination is determined by analyzing how the detection data of the different front-mounted data types affect the determination. For example, for a vehicle with known identity information, its first detection data and second detection data are acquired, and the influence of each piece of detection data on determining the identity information is analyzed in order to obtain the corresponding ratios. For instance, if vehicle heights of 2.1 meters and 2.8 meters both contribute little to determining the identity information, the vehicle-height data type is assigned a small influence on vehicle identity determination.
It should be noted that the process of determining the preset afterloading weight in step a22 is similar to the process of determining the preset frontloading weight in step a21, and will not be described herein again.
In some embodiments, before step S13, the method comprises:
and carrying out consistency detection on the paired comparison matrixes, wherein the paired comparison matrixes are determined according to preset weights.
Correspondingly, step S13 includes:
and determining the matching degree of the first detection data and the second detection data according to the differentiation matrix and the pairwise comparison matrix that has passed consistency detection, wherein the differentiation matrix is determined according to the difference.
In this embodiment, consistency detection is performed on the pairwise comparison matrix, and the matching degree is determined from the differentiation matrix and the pairwise comparison matrix only after the consistency detection has passed. Once the pairwise comparison matrix passes consistency detection, the weights it contains are mutually consistent, so determining the matching degree from the differentiation matrix and a consistency-checked pairwise comparison matrix improves the accuracy of the obtained matching degree.
To describe more clearly how the pairwise comparison matrix is determined from the preset weights, the data type corresponding to each piece of second detection data and each piece of first detection data is abstracted, and the abstracted data types form a hierarchical model as shown in fig. 5. In fig. 5, A_1 is a vehicle target identified by the radar, and the RSU detects m license plates together with information such as the position coordinates (C1, C2) and the speed C3 corresponding to the m license plates (C1, C2, C3, and so on are each a data type). Here "C2" indicates the distance between the vehicle and the gantry.
The following lists the data types obtained by the device after screening the second detection data uploaded after the RSU interacts with the OBUs of different types.
If the second detection data is uploaded after interaction between the RSU and a rear-mounted OBU, the data types screened by the device in combination with table 1 include: C2, indicating the distance of the vehicle from the gantry; C3, indicating the speed of the vehicle; C4, the length of the vehicle; C5, the width of the vehicle; and C6, the height of the vehicle. As can be seen from table 1, after the RSU interacts with a rear-mounted OBU, only the direction of the vehicle speed is acquired, not its magnitude; that is, C3 in this embodiment is the direction of the vehicle speed. It should be noted that, because the times at which the OBU wakes up and sleeps do not correspond to the times at which the signal appears and disappears at the radar, the data types determined after interaction with a rear-mounted OBU include neither the wake-up time nor the sleep time of the OBU.
If the second detection data is uploaded after interaction between the RSU and a front-mounted OBU, the data types screened by the device in combination with table 1 include: C1; C2, indicating the distance of the vehicle from the gantry; C3, the speed of the vehicle; C4, the length of the vehicle; C5, the width of the vehicle; C6, the height of the vehicle; C7, the time of reception of the signal; and C8, the time of interruption of the signal.
As can be seen from table 1, after the RSU interacts with the front OBU, the acquired vehicle speed includes the direction of the vehicle speed and the magnitude of the vehicle speed.
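The screened data-type sets listed above for the different OBU cases can be collected into a small lookup structure. The dictionary keys are illustrative names, not terms from the document; the C1–C8 labels are taken directly from the text (for the front-mounted case, the later pairwise comparison matrix uses C7 or C8, not both at once).

```python
# Transcription of the screened data types per OBU case, as listed above.
# Key names are illustrative; labels C1..C8 follow the document:
#   C1 lane position, C2 distance from gantry, C3 speed, C4 length,
#   C5 width, C6 height, C7 signal reception time, C8 signal interruption time.

SCREENED_TYPES = {
    # Rear-mounted OBU, single RSU on the gantry: no C1, no signal times.
    "rear_mounted_single_rsu": ["C2", "C3", "C4", "C5", "C6"],
    # Rear-mounted OBU, >= 2 RSUs on the gantry: C1 can be derived.
    "rear_mounted_multi_rsu": ["C1", "C2", "C3", "C4", "C5", "C6"],
    # Front-mounted OBU: C1..C8 are screened (the pairwise matrix later
    # uses either C7 or C8, giving n = 7).
    "front_mounted": ["C1", "C2", "C3", "C4", "C5", "C6", "C7", "C8"],
}
```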
After the second detection data is determined, the corresponding pairwise comparison matrix is determined according to the number of RSUs mounted on the gantry, as detailed in (i) to (iii) below:
(i) Let A denote the pairwise comparison matrix. If the second detection data is obtained after the RSU interacts with a rear-mounted OBU and the number of RSUs mounted on the same gantry or the same side bar equals 1, A is determined as follows:
A = (a_ij)_{n×n}, a_ij > 0, a_ji = 1/a_ij, a_ii = 1,
where a_ij represents the ratio of the influence of C_i on vehicle identity determination to the influence of C_j on vehicle identity determination, n denotes the number of data types C_i, and C_i, C_j ∈ {C2, C3, C4, C5, C6}.
In this embodiment, n = 5, and the influence values are obtained by analyzing existing detection data. Taking the front-mounted OBU as an example, the radar measures the vehicle speed very accurately, and the RSU obtains an accurate instantaneous speed of the vehicle from the front-mounted OBU; therefore, when the speeds detected on the RSU side and the radar side are equal or close, the probability that they belong to the same vehicle is very high, i.e., C3 contributes strongly to determining the identity information of the vehicle. By contrast, since the position provided by the RSU can deviate from the actual position by up to 10 m, target positions detected on the RSU side and the radar side that differ by 10 m may still belong to the same vehicle, i.e., the position coordinates C1 and C2 provide less certainty for determining the identity information of the vehicle.
In this embodiment, the influence magnitude relationships are compared pairwise to obtain the corresponding weights, and each weight is then used to construct the comparison matrix, with a_ji = 1/a_ij. The values taken by a_ij are shown in Table 2, where Z+ denotes a positive integer. (Table 2 is given as an image in the original and is not reproduced in this text.)
(ii) If the second detection data is obtained after the RSU interacts with a rear-mounted OBU and the number of RSUs mounted on the same gantry or the same side bar is greater than 1, A is determined as follows:
A = (a_ij)_{n×n}, a_ij > 0, a_ji = 1/a_ij, a_ii = 1,
where a_ij represents the ratio of the influence of C_i on vehicle identity determination to that of C_j, n denotes the number of data types C_i, and C_i, C_j ∈ {C1, C2, C3, C4, C5, C6}.
In this embodiment, n =6.
(iii) If the second detection data is obtained after the RSU interacts with a front-mounted OBU, A is determined as follows:
A = (a_ij)_{n×n}, a_ij > 0, a_ji = 1/a_ij, a_ii = 1,
where a_ij represents the ratio of the influence of C_i on vehicle identity determination to that of C_j, n denotes the number of data types C_i, and C_i, C_j ∈ {C1, C2, C3, C4, C5, C6, C7} or C_i, C_j ∈ {C1, C2, C3, C4, C5, C6, C8}.
In this embodiment, n =7.
In (i) to (iii) above, the pairwise comparison matrix is determined according to both the specific detection data contained in the second detection data and the number of RSUs mounted on the same gantry or the same side bar, which makes the determined pairwise comparison matrix more accurate.
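The construction shared by (i)–(iii) — a positive reciprocal matrix with a_ii = 1 and a_ji = 1/a_ij, differing only in which data types (and hence which order n) are used — can be sketched as follows. The function name and the example ratios are illustrative, not from the document.

```python
# Minimal sketch: build the positive reciprocal pairwise comparison matrix
# A = (a_ij) from upper-triangular influence ratios, enforcing a_ii = 1 and
# a_ji = 1 / a_ij as required in cases (i)-(iii).

def build_pairwise_matrix(ratios: dict[tuple[int, int], float],
                          n: int) -> list[list[float]]:
    """ratios maps (i, j) with i < j to a_ij, the ratio of the influence of
    data type C_i over that of C_j on vehicle identity determination."""
    A = [[1.0] * n for _ in range(n)]       # a_ii = 1 on the diagonal
    for (i, j), r in ratios.items():
        A[i][j] = r                         # a_ij = given ratio
        A[j][i] = 1.0 / r                   # a_ji = 1 / a_ij (reciprocity)
    return A
```

For example, with three data types and made-up ratios, `build_pairwise_matrix({(0, 1): 2.0, (0, 2): 4.0, (1, 2): 2.0}, 3)` yields a 3×3 reciprocal matrix.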
In some embodiments, the pair-wise comparison matrix is consistency checked, comprising:
d1, determining the feature vector of the paired comparison matrix.
And d2, determining the eigenvalue of the paired comparison matrix according to the eigenvector.
d3, determining a consistency ratio according to the characteristic value, the row number of the paired comparison matrix and a preset random consistency index corresponding to the row number.
And d4, if the consistency ratio is smaller than a preset ratio threshold, judging that the paired comparison matrixes pass consistency detection, otherwise, judging that the paired comparison matrixes do not pass consistency detection.
Among the above d1 to d4, the preset ratio threshold may be set to 0.1. Assume the pairwise comparison matrix is A = (a_ij)_{n×n} (the concrete example matrix is given as an image in the original). The eigenvector and eigenvalue of the pairwise comparison matrix A are approximated as follows:
(1) Normalize each column vector of the matrix A:
ã_ij = a_ij / Σ_{k=1..n} a_kj.
(2) Sum the normalized entries by row:
w̃_i = Σ_{j=1..n} ã_ij.
(3) Normalize the row sums:
ω_i = w̃_i / Σ_{k=1..n} w̃_k,
so that ω = (ω_1, ω_2, …, ω_n)^T is the (approximate) eigenvector of the matrix.
(4) Compute
λ = (1/n) Σ_{i=1..n} (Aω)_i / ω_i
as an approximation of the maximum eigenvalue, i.e., the (approximate) eigenvalue.
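Steps (1)–(4) above can be sketched directly. The function name is illustrative; the computation is the column-normalization approximation described in the text.

```python
# Sketch of steps (1)-(4): approximate the principal eigenvector and the
# maximum eigenvalue of a pairwise comparison matrix by column
# normalization and row averaging.

def approx_eigen(A: list[list[float]]) -> tuple[list[float], float]:
    n = len(A)
    # (1) normalize each column of A
    col_sums = [sum(A[i][j] for i in range(n)) for j in range(n)]
    norm = [[A[i][j] / col_sums[j] for j in range(n)] for i in range(n)]
    # (2) sum the normalized entries by row
    row_sums = [sum(norm[i]) for i in range(n)]
    # (3) normalize the row sums -> approximate eigenvector omega
    total = sum(row_sums)
    w = [s / total for s in row_sums]
    # (4) lambda = (1/n) * sum_i (A w)_i / w_i -> approximate max eigenvalue
    Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(Aw[i] / w[i] for i in range(n)) / n
    return w, lam
```

For a perfectly consistent matrix such as [[1, 2], [0.5, 1]], the approximation is exact: ω = (2/3, 1/3)^T and λ = n = 2.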
The Consistency Index (CI) is determined according to the following formula:
CI = (λ − n) / (n − 1),
where n is the number of rows of the pairwise comparison matrix.
In this embodiment, the consistency index describes how close an n-order positive reciprocal matrix A = (a_ij) is to being consistent; a smaller CI indicates greater consistency. The eigenvector corresponding to the maximum eigenvalue is used as the weight vector of the influence of the compared factors on a factor of the upper layer, and the greater the inconsistency, the larger the resulting judgment error. In conclusion, the value of (λ − n) can be used to measure the degree of inconsistency of A.
In this embodiment, in order to check the consistency of the matrix values, the consistency ratio must be determined according to the order of the matrix; a Random Consistency Index (RI) is therefore introduced, whose values are shown in Table 3.
Table 3:
n   1    2    3     4     5     6     7     8     9     10    11
RI  0    0    0.58  0.90  1.12  1.24  1.32  1.41  1.45  1.49  1.51
That is, the random consistency index RI corresponding to the order n of the pairwise comparison matrix is looked up in Table 3; for example, when n = 6, RI = 1.24.
The Consistency Ratio (CR) is then obtained from the above CI and RI:
CR = CI / RI.
If the preset ratio threshold is 0.1 and CR = CI / RI < 0.1, the pairwise comparison matrix is determined to pass the consistency detection. CR represents the ratio of the consistency index to the random consistency index, and when CR is smaller than the threshold, the pairwise comparison matrix is judged to pass the consistency test.
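Combining steps d1–d4 with Table 3 gives a complete consistency check. A sketch, treating matrices of order n ≤ 2 as always consistent (RI = 0 in Table 3, so CR would otherwise be undefined); the function name is illustrative.

```python
# Sketch of steps d1-d4: approximate the maximum eigenvalue, compute
# CI = (lam - n) / (n - 1), look up RI in Table 3, and pass the check
# when CR = CI / RI is below the preset threshold (0.1 in the text).

RI_TABLE = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24,
            7: 1.32, 8: 1.41, 9: 1.45, 10: 1.49, 11: 1.51}

def consistency_check(A: list[list[float]],
                      threshold: float = 0.1) -> tuple[bool, float]:
    n = len(A)
    if n <= 2:
        return True, 0.0                      # orders 1-2 are always consistent
    # approximate max eigenvalue via column normalization (steps (1)-(4))
    col = [sum(A[i][j] for i in range(n)) for j in range(n)]
    w = [sum(A[i][j] / col[j] for j in range(n)) for i in range(n)]
    total = sum(w)
    w = [x / total for x in w]
    lam = sum(sum(A[i][j] * w[j] for j in range(n)) / w[i]
              for i in range(n)) / n
    ci = (lam - n) / (n - 1)                  # consistency index
    cr = ci / RI_TABLE[n]                     # consistency ratio
    return cr < threshold, cr
```

For a perfectly consistent 3×3 matrix, λ = 3, so CI = 0 and CR = 0, and the check passes.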
In some embodiments, if the pair of comparison matrices fails the consistency detection, the pair of comparison matrices is subjected to consistency processing to obtain a new pair of comparison matrices, the new pair of comparison matrices is used as the pair of comparison matrices, and the step of performing the consistency detection on the pair of comparison matrices and the subsequent steps are returned.
In this embodiment, when the pairwise comparison matrix fails consistency detection, consistency processing is performed on it, that is, the existing pairwise comparison matrix is optimized until the optimized matrix can pass consistency detection. Since the optimized pairwise comparison matrix passes consistency detection, the accuracy of the matching degree determined from it is improved.
In some embodiments, if a pair of comparison matrices fails to pass consistency detection, the pair of comparison matrices needs to be subjected to consistency processing, that is, the pair of comparison matrices is subjected to consistency processing to obtain a new pair of comparison matrices, including:
e1, traversing each element in the paired comparison matrix, and comparing the weight corresponding to the traversed element with the weight corresponding to the element at the specified position in the paired comparison matrix.
In some embodiments, the number of elements at the specified position is greater than 1; for example, if it is set to 2, comparing the weight corresponding to the traversed element with the weights corresponding to the elements at the specified positions specifically means comparing the traversed element with the product of the weights of those 2 elements. For example, traverse each element a_ij in A and compare a_ij with a_ik × a_kj.
And e2, determining a new weight according to the comparison result and a preset adjusting value.
In some embodiments, the preset adjustment value is a fixed value. In other embodiments, it is related to the number of consistency-processing iterations, i.e., a dynamic value; for example, the preset adjustment value is ε = ε_η, where η is the number of iterations. A new weight is then computed for each of the three cases a_ij > a_ik × a_kj, a_ij < a_ik × a_kj, and a_ij = a_ik × a_kj (the corresponding formulas are given as images in the original and are not reproduced here).
And e3, replacing the new weight with the weight corresponding to the traversed element to obtain a new paired comparison matrix.
As can be seen from the example given in step e2, the adjusted value of a_ij is taken as the new weight; replacing the weight of the traversed element with it yields a new pairwise comparison matrix (the intermediate expressions and the resulting matrix are given as images in the original).
The method then returns to the step of performing consistency detection on the pairwise comparison matrix and the subsequent steps, so as to continue consistency detection on the new pairwise comparison matrix.
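Steps e1–e3 can be sketched as below. Note that the actual adjustment formulas are images in the original, so the additive ±ε adjustment used here is an assumption; only the comparison of a_ij against a_ik × a_kj, the unchanged equal case, and the replacement of the traversed weight follow the text.

```python
# Hedged sketch of steps e1-e3. ASSUMPTION: elements larger than the
# transitive product a_ik * a_kj are decreased by epsilon and smaller ones
# increased (the document's exact formulas are images). The reciprocal
# entry a_ji is kept as 1 / a_ij so the matrix stays positive reciprocal.

def adjust_once(A: list[list[float]], k: int, eps: float) -> list[list[float]]:
    """One consistency-processing pass using pivot index k."""
    n = len(A)
    B = [row[:] for row in A]
    for i in range(n):
        for j in range(i + 1, n):             # adjust the upper triangle only
            if i == k or j == k:
                continue
            target = A[i][k] * A[k][j]
            if A[i][j] > target:
                B[i][j] = A[i][j] - eps       # assumed additive decrease
            elif A[i][j] < target:
                B[i][j] = A[i][j] + eps       # assumed additive increase
            # equal case: left unchanged
            B[j][i] = 1.0 / B[i][j]           # restore reciprocity
    return B
```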
In some embodiments, determining a degree of matching of the first detected data and the second detected data from the difference matrix and the pair-wise comparison matrix through consistency detection includes:
and determining the matching degree of the first detection data and the second detection data according to the feature vectors of the differentiation matrix and the paired comparison matrix detected through consistency.
In this embodiment, the fusion information may be calculated from the differentiation matrix and the eigenvector of the pairwise comparison matrix (the formula is given as an image in the original), where Δ denotes the differentiation matrix, m denotes the number of license plate numbers detected by the RSU, and N denotes the number of targets detected by the radar. The fusion information is then normalized (again by a formula given as an image in the original) to obtain the matching degree of the first detection data and the second detection data.
Because the matching degree is determined from the eigenvector of the pairwise comparison matrix and the differentiation matrix, the required amount of calculation is smaller than when the matching degree is determined directly from the pairwise comparison matrix and the differentiation matrix, so the speed of determining the matching degree can be improved.
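Since the fusion and normalization formulas appear only as images, the following is merely one plausible sketch of the described idea: combine the per-data-type differences with the eigenvector weights and normalize the resulting scores into probability-like matching degrees. The specific scoring rule (one minus the weighted difference, clamped at zero) is an assumption, not the patent's formula.

```python
# Hedged sketch: fuse one radar target's differences against m candidate
# license plates. delta_rows[p] holds the per-data-type differences
# (assumed scaled to [0, 1]) for candidate plate p; w is the eigenvector
# of the pairwise comparison matrix. The scoring rule is an assumption.

def matching_degrees(delta_rows: list[list[float]],
                     w: list[float]) -> list[float]:
    # smaller weighted difference -> larger raw score
    raw = [1.0 - sum(wi * d for wi, d in zip(w, row)) for row in delta_rows]
    raw = [max(r, 0.0) for r in raw]          # clamp negative scores
    total = sum(raw)
    if total == 0.0:
        return [0.0] * len(raw)
    return [r / total for r in raw]           # normalize to sum to 1
```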
In some embodiments, step S15 (or step S36) includes:
f1, determining the highest matching degree from the matching degrees corresponding to the vehicles for each vehicle.
In this embodiment, if the RSU detects m license plates, there are m fusion probabilities corresponding to the vehicles.
And f2, determining identity information from the second detection data corresponding to the highest matching degree, and determining a vehicle identifier from the first detection data corresponding to the highest matching degree.
And f3, taking the identity information as the identity information of the vehicle corresponding to the vehicle identification.
In the above f1 to f3, the vehicle and the identity information corresponding to the highest matching degree are determined as matched. Since the highest matching degree indicates the strongest correspondence between the vehicle and the identity information, an accurate matching result is obtained through the above operations, i.e., accurate identification of the vehicle identity is achieved.
Specifically, assuming the radar detects N targets, the information of the N targets is fused to obtain N column vectors of matching degrees, the n-th of which is (p_1n, p_2n, …, p_mn)^T. The matching degree here is expressed as a probability; for example, p_1N indicates how well the first license plate number (i.e., the identity information of a vehicle) matches the N-th target (i.e., a vehicle). Since different radar targets correspond to different candidate sets, i.e., the license plate numbers entering information fusion are not all the same, the probability of any non-candidate pairing is set to 0; the N column vectors are completed accordingly, and the matching degree matrix P = (p_ij)_{m×N} is output.
and determining identity information matched with each vehicle according to the matching degree matrix.
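Steps f1–f3 (for each radar target, pick the license plate with the highest matching degree) can be sketched as follows; the function name is illustrative.

```python
# Sketch of steps f1-f3: for each radar target (column of the matching
# degree matrix), select the license plate (row) with the highest
# matching degree and take its identity information as the target's.

def assign_identities(P: list[list[float]]) -> dict[int, int]:
    """P[i][j]: matching degree of license plate i against radar target j.
    Returns {target index: best license-plate index}."""
    m = len(P)
    N = len(P[0]) if m else 0
    result = {}
    for j in range(N):
        result[j] = max(range(m), key=lambda i: P[i][j])
    return result
```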
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by functions and internal logic of the process, and should not constitute any limitation to the implementation process of the embodiments of the present application.
The second embodiment:
fig. 6 shows a block diagram of a vehicle identity determining apparatus applied to vehicle-road coordination provided in the embodiment of the present application, where the vehicle identity determining apparatus applied to vehicle-road coordination is applied to a device, and for convenience of description, only the relevant parts of the embodiment of the present application are shown.
Referring to fig. 6, the vehicle identification determination device 6 applied to the vehicle-road coordination includes: the system comprises a detection data acquisition module 61, a difference determination module 62, a matching degree determination module 63 and a vehicle identity information determination module 64. Wherein:
the detection data acquisition module 61 is configured to acquire first detection data and second detection data, where the first detection data is data uploaded by a radar, the second detection data is data uploaded by a road side unit RSU, and the second detection data includes identity information of a vehicle.
A difference determining module 62, configured to determine a difference between the first detection data and the second detection data.
A matching degree determining module 63, configured to determine a matching degree between the first detection data and the second detection data according to a preset weight and the difference.
And the vehicle identity information determining module 64 is configured to determine, according to the matching degree, the identity information corresponding to the vehicle corresponding to the first detection data. Wherein:
in the embodiment of the application, the matching degree is determined according to the preset weight and the difference, and the difference is determined according to the first detection data and the second detection data, namely, when the matching degree is determined, the data uploaded by the radar (namely, the first detection data) and the data uploaded by the RSU (namely, the second detection data) are fused to obtain the difference between the first detection data and the second detection data, and the difference is correspondingly adjusted through the weight, so that the accuracy of the obtained matching degree is improved, and the accuracy of identity information determination according to the matching degree is further improved.
In some embodiments, the preset weight is determined according to the following manner:
and screening out data types influencing vehicle identity determination from the historical first detection data and the historical second detection data.
And determining a preset weight according to the screened data type.
In some embodiments, the historical second detection data includes historical front-mounted second detection data and historical rear-mounted second detection data. The historical front-mounted second detection data is second detection data obtained after interaction of the RSU with a front-mounted OBU, and the historical rear-mounted second detection data is second detection data obtained after interaction of the RSU with a rear-mounted OBU; a front-mounted OBU is an OBU installed on the vehicle during its production, and a rear-mounted OBU is an OBU installed on the vehicle after its production.
The above-mentioned screening out the data type that influences vehicle identity and confirm from historical first detected data and historical second detected data includes:
screening out data types influencing vehicle identity determination from the historical first detection data and the historical pre-mounted second detection data to obtain pre-mounted data types; and screening out the data type influencing the vehicle identity determination from the historical first detection data and the historical afterloading second detection data to obtain the afterloading data type.
Correspondingly, the determining the preset weight according to the screened data type includes:
determining the weight according to the screened front-loading data type to obtain a preset front-loading weight; and determining the weight according to the screened afterloading data types to obtain a preset afterloading weight.
Correspondingly, the matching degree determining module 63 is specifically configured to:
if the second detection data is obtained by interaction of the RSU and a front-mounted OBU, determining the matching degree of the first detection data and the second detection data according to the preset front-mounted weight and the difference; and if the second sensed data is obtained by interaction of the RSU and the aftermarket OBU, determining a degree of matching between the first sensed data and the second sensed data based on the predetermined aftermarket weight and the difference.
In some embodiments, if at least two RSUs are installed on the same door frame or the same side bar, the vehicle identification apparatus 6 includes:
and the second detection data determining module of the same vehicle is used for determining second detection data belonging to the same vehicle from the second detection data after the history.
And the position information determining module is used for determining C1 used for indicating the position information of the lane where the vehicle is currently located according to the second detection data belonging to the same vehicle for each vehicle.
Correspondingly, the step of screening out the data type influencing the vehicle identity determination from the historical first detection data and the historical afterloading second detection data to obtain the afterloading data type comprises the following steps:
and screening out the data types influencing the vehicle identity determination from the historical first detection data, the C1 and the historical afterloading second detection data to obtain the afterloading data types.
In some embodiments, the determining the weight according to the screened front-loading data type to obtain a preset front-loading weight includes:
and determining the ratio of the influence of any two front-mounted data types on the vehicle identity determination in the screened front-mounted data types, wherein the ratio of the influence is taken as the preset front-mounted weight.
In some embodiments, the vehicle identity determining device 6 applied to vehicle-road coordination includes:
and the consistency detection module is used for carrying out consistency detection on paired comparison matrixes, and the paired comparison matrixes are determined according to the preset weight.
Correspondingly, the matching degree determining module 63 is specifically configured to:
and determining the matching degree of the first detection data and the second detection data according to a differentiation matrix and the pair of comparison matrixes passing consistency detection, wherein the differentiation matrix is determined according to the difference.
In some embodiments, the performing consistency check on the pair of comparison matrices includes:
determining the feature vector of the pair of comparison matrixes; determining the eigenvalue of the pair of comparison matrixes according to the eigenvector; determining a consistency ratio according to the characteristic value, the line number of the paired comparison matrix and a preset random consistency index corresponding to the line number; and if the consistency ratio is smaller than a preset ratio threshold value, judging that the paired comparison matrixes pass consistency detection, otherwise, judging that the paired comparison matrixes do not pass consistency detection.
In some embodiments, if the pair of comparison matrices fails to pass the consistency check, the pair of comparison matrices is subjected to consistency processing to obtain a new pair of comparison matrices, the new pair of comparison matrices is used as the pair of comparison matrices, and the step of performing the consistency check on the pair of comparison matrices and the subsequent steps are returned.
In some embodiments, the matching degree determining module 63, in determining the matching degree of the first detection data and the second detection data according to the difference matrix and the pair of comparison matrices detected by consistency, includes:
and determining the matching degree of the first detection data and the second detection data according to the feature vectors of the differentiation matrix and the paired comparison matrix detected through consistency.
It should be noted that, for the information interaction, execution process, and other contents between the above devices/units, the specific functions and technical effects thereof based on the same concept as those of the method embodiment of the present application can be specifically referred to the method embodiment portion, and are not described herein again.
Example three:
fig. 7 is a schematic structural diagram of an apparatus according to an embodiment of the present application. As shown in fig. 7, the apparatus 7 of this embodiment includes: at least one processor 70 (only one processor is shown in fig. 7), a memory 71, and a computer program 72 stored in the memory 71 and executable on the at least one processor 70, the processor 70 implementing the steps of any of the various method embodiments described above when executing the computer program 72:
acquiring first detection data and second detection data, wherein the first detection data are data uploaded by a radar, the second detection data are data uploaded by a Road Side Unit (RSU), and the second detection data comprise identity information of a vehicle;
determining a difference between the first detected data and the second detected data;
determining the matching degree of the first detection data and the second detection data according to a preset weight and the difference;
and determining the identity information corresponding to the vehicle corresponding to the first detection data according to the matching degree.
Optionally, the preset weight is determined according to the following manner:
screening out data types influencing vehicle identity determination from the historical first detection data and the historical second detection data;
and determining a preset weight according to the screened data type.
Optionally, the historical second detection data includes historical front-mounted second detection data and historical rear-mounted second detection data. The historical front-mounted second detection data is second detection data obtained after interaction between the RSU and a front-mounted OBU, and the historical rear-mounted second detection data is second detection data obtained after interaction between the RSU and a rear-mounted OBU; a front-mounted OBU refers to an OBU installed on the vehicle during its production, and a rear-mounted OBU refers to an OBU installed on the vehicle after its production;
the method for screening out the data types influencing the vehicle identity determination from the historical first detection data and the historical second detection data comprises the following steps:
screening out data types influencing vehicle identity determination from the historical first detection data and the historical pre-mounted second detection data to obtain pre-mounted data types;
screening out data types influencing vehicle identity determination from the historical first detection data and the historical afterloading second detection data to obtain afterloading data types;
the determining the preset weight according to the screened data type includes:
determining the weight according to the screened front loading data type to obtain a preset front loading weight;
determining the weight according to the screened afterloading data types to obtain a preset afterloading weight;
the determining the matching degree of the first detection data and the second detection data according to the preset weight and the difference comprises:
if the second detection data is obtained by interaction of the RSU and a front-mounted OBU, determining the matching degree of the first detection data and the second detection data according to the preset front-mounted weight and the difference;
and if the second detection data is obtained by interaction of the RSU and the afterloading OBU, determining the matching degree of the first detection data and the second detection data according to the preset afterloading weight and the difference.
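The branch between the two weight sets (pre-installed, i.e. factory-fitted, versus aftermarket OBU) can be sketched as below. The dictionary keys, the field names, and the particular weight values are illustrative assumptions, not taken from the patent.

```python
# Hypothetical preset weights per OBU type. "pre_installed" means the OBU
# was fitted before the vehicle left the factory; "aftermarket" means it
# was fitted afterwards. Values are illustrative.
PRESET_WEIGHTS = {
    "pre_installed": {"x": 0.5, "y": 0.3, "speed": 0.2},
    "aftermarket": {"x": 0.3, "y": 0.2, "speed": 0.5},
}

def pick_weights(second_detection):
    """Select the weight set by the kind of OBU the RSU interacted with."""
    if second_detection["obu_type"] == "pre_installed":
        return PRESET_WEIGHTS["pre_installed"]
    return PRESET_WEIGHTS["aftermarket"]

print(pick_weights({"obu_type": "aftermarket"})["speed"])  # 0.5
```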
Optionally, if at least two RSUs are installed on the same gantry or the same roadside pole, the vehicle identity determination method includes:
determining, from the historical aftermarket second detection data, second detection data belonging to the same vehicle;
for each vehicle, determining, from the second detection data belonging to that vehicle, C1, which indicates the position information of the lane in which the vehicle is currently located;
the screening out of data types influencing vehicle identity determination from the historical first detection data and the historical aftermarket second detection data to obtain aftermarket data types then includes:
screening out data types influencing vehicle identity determination from the historical first detection data, C1, and the historical aftermarket second detection data to obtain the aftermarket data types.
Optionally, the determining of a weight according to the screened pre-installed data types to obtain a preset pre-installed weight includes:
determining, for any two of the screened pre-installed data types, the ratio of their influence on vehicle identity determination, the ratios of influence serving as the preset pre-installed weight.
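Taking the ratio of influence of every pair of data types is the pairwise-comparison step of the Analytic Hierarchy Process (AHP), which the consistency detection described later also suggests. A sketch with illustrative ratios follows; the three data types and all numeric values are assumptions, not the patent's:

```python
import math

# Hypothetical example: three data types (say position x, position y, speed).
# ratio[i][j] is the assumed ratio of the influence of type i over type j
# on vehicle identity determination; ratio[j][i] = 1 / ratio[i][j].
ratio = [
    [1.0, 2.0, 4.0],
    [0.5, 1.0, 2.0],
    [0.25, 0.5, 1.0],
]

# Approximate the principal eigenvector by the row geometric means
# (the standard AHP shortcut), then normalise to obtain the preset weights.
geo = [math.prod(row) ** (1 / len(row)) for row in ratio]
weights = [g / sum(geo) for g in geo]
print(weights)  # [0.5714..., 0.2857..., 0.1428...]
```

For a perfectly consistent matrix, as here, the row geometric means recover the exact principal eigenvector; for mildly inconsistent judgments they give the usual AHP approximation.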
Optionally, before the determining of the matching degree of the first detection data and the second detection data according to the preset weight and the difference, the method further includes:
performing consistency detection on a pairwise comparison matrix, wherein the pairwise comparison matrix is determined according to the preset weight;
the determining of the matching degree of the first detection data and the second detection data according to the preset weight and the difference then includes:
determining the matching degree of the first detection data and the second detection data according to a differentiation matrix and the pairwise comparison matrix that has passed consistency detection, wherein the differentiation matrix is determined according to the difference.
Optionally, the performing of consistency detection on the pairwise comparison matrix includes:
determining an eigenvector of the pairwise comparison matrix;
determining an eigenvalue of the pairwise comparison matrix according to the eigenvector;
determining a consistency ratio according to the eigenvalue, the order (number of rows) of the pairwise comparison matrix, and a preset random consistency index corresponding to that order;
and if the consistency ratio is smaller than a preset ratio threshold, determining that the pairwise comparison matrix passes consistency detection; otherwise, determining that the pairwise comparison matrix fails consistency detection.
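The eigenvalue and consistency-ratio procedure described above matches the standard AHP consistency check: the largest eigenvalue λmax is estimated from the eigenvector, CI = (λmax − n)/(n − 1), and CR = CI/RI(n) must fall below a threshold, conventionally 0.1. The sketch below uses those conventional values; the patent's preset random consistency index and ratio threshold may differ.

```python
# Standard AHP consistency check (conventional values; the patent's preset
# random consistency index and ratio threshold may differ).

RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}  # random consistency indices

def lambda_max(matrix, weights):
    """Estimate the largest eigenvalue as the mean of (A @ w)_i / w_i."""
    n = len(matrix)
    aw = [sum(matrix[i][j] * weights[j] for j in range(n)) for i in range(n)]
    return sum(aw[i] / weights[i] for i in range(n)) / n

def passes_consistency(matrix, weights, threshold=0.1):
    n = len(matrix)
    if n <= 2:
        return True  # 1x1 and 2x2 pairwise comparison matrices are always consistent
    ci = (lambda_max(matrix, weights) - n) / (n - 1)  # consistency index
    cr = ci / RI[n]                                   # consistency ratio
    return cr < threshold

A = [[1.0, 2.0, 4.0], [0.5, 1.0, 2.0], [0.25, 0.5, 1.0]]
w = [4 / 7, 2 / 7, 1 / 7]  # normalised principal eigenvector of A
print(passes_consistency(A, w))  # True: A is perfectly consistent (CR = 0)
```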
Optionally, if the pairwise comparison matrix fails consistency detection, consistency processing is performed on it to obtain a new pairwise comparison matrix; the new matrix is taken as the pairwise comparison matrix, and the method returns to the step of performing consistency detection on the pairwise comparison matrix and the subsequent steps.
Optionally, the determining of the matching degree of the first detection data and the second detection data according to the differentiation matrix and the pairwise comparison matrix that has passed consistency detection includes:
determining the matching degree of the first detection data and the second detection data according to the differentiation matrix and the eigenvector of the pairwise comparison matrix that has passed consistency detection.
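One plausible reading of this step, sketched below: the differentiation matrix scores each radar/RSU candidate pair on every data type, and the eigenvector of the consistency-checked pairwise comparison matrix supplies the per-type weights. The weighted combination shown is an illustrative assumption, not necessarily the patent's exact formula.

```python
# Hypothetical sketch: combine the differentiation matrix with the
# comparison-matrix eigenvector to score candidate pairs.

def matching_scores(differentiation, weights):
    """differentiation[p][t]: difference of pair p on data type t (lower = closer).
    weights[t]: eigenvector weight of data type t.
    Returns one matching score per pair (higher = better match)."""
    return [
        1.0 / (1.0 + sum(w * d for w, d in zip(weights, row)))
        for row in differentiation
    ]

weights = [4 / 7, 2 / 7, 1 / 7]  # eigenvector of the pairwise comparison matrix
differentiation = [
    [0.1, 0.2, 0.3],  # pair 0: small differences, likely the same vehicle
    [2.0, 1.5, 3.0],  # pair 1: large differences, likely a different vehicle
]
scores = matching_scores(differentiation, weights)
print(scores.index(max(scores)))  # 0: the first pair matches best
```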
The device 7 may be a computing device such as a desktop computer, a notebook computer, a palmtop computer, or a cloud server. The device may include, but is not limited to, a processor 70 and a memory 71. Those skilled in the art will appreciate that fig. 7 is merely an example of the device 7 and does not constitute a limitation of it; the device may include more or fewer components than those shown, may combine some components, or may include different components, such as input and output devices or network access devices.
The processor 70 may be a Central Processing Unit (CPU); it may also be another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 71 may, in some embodiments, be an internal storage unit of the device 7, such as a hard disk or memory of the device 7. In other embodiments, the memory 71 may be an external storage device of the device 7, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card provided on the device 7. Further, the memory 71 may include both an internal storage unit and an external storage device of the device 7. The memory 71 is used to store an operating system, application programs, a boot loader (BootLoader), data, and other programs, such as the program code of the computer program. The memory 71 may also be used to temporarily store data that has been output or is to be output.
It should be clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division into functional units and modules is only illustrative; in practical applications, the above functions may be allocated to different functional units and modules as needed, that is, the internal structure of the apparatus may be divided into different functional units or modules to implement all or part of the functions described above. The functional units and modules in the embodiments may be integrated into one processing unit, each unit may exist physically alone, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware or of a software functional unit. In addition, the specific names of the functional units and modules are only used to distinguish them from one another and do not limit the protection scope of the present application. For the specific working processes of the units and modules in the system, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.
The embodiment of the present application further provides a vehicle identity determination system applied to vehicle-road cooperation, the system comprising at least one radar, at least one RSU, and the apparatus of the third embodiment above.
An embodiment of the present application further provides a network device, where the network device includes: at least one processor, a memory, and a computer program stored in the memory and executable on the at least one processor, the processor implementing the steps of any of the various method embodiments described above when executing the computer program.
The embodiments of the present application further provide a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the above method embodiments.
The embodiments of the present application further provide a computer program product which, when run on a mobile terminal, causes the mobile terminal to implement the steps in the above method embodiments.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the processes in the methods of the above embodiments may be implemented by a computer program instructing related hardware; the computer program may be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the above method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include at least: any entity or device capable of carrying the computer program code to the photographing apparatus/communication device, a recording medium, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, for example a USB flash drive, a removable hard drive, or a magnetic or optical disk. In some jurisdictions, in accordance with legislation and patent practice, the computer-readable medium may not be an electrical carrier signal or a telecommunications signal.
In the above embodiments, the description of each embodiment has its own emphasis, and reference may be made to the related description of other embodiments for parts that are not described or recited in any embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other ways. For example, the apparatus/network device embodiments described above are merely illustrative: the division into modules or units is only a division by logical function, and other divisions are possible in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be implemented through some interfaces, or through indirect coupling or communication connection between devices or units, and may be electrical, mechanical, or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and replacements do not cause the corresponding technical solutions to depart from the spirit and scope of the embodiments of the present application, and are intended to be included within the protection scope of the present application.

Claims (13)

1. A vehicle identity determination method applied to vehicle-road cooperation, comprising:
acquiring first detection data and second detection data, wherein the first detection data is data uploaded by a radar, the second detection data is data uploaded by a road side unit (RSU), and the second detection data includes identity information of a vehicle;
determining a difference between the first detection data and the second detection data;
determining a matching degree between the first detection data and the second detection data according to a preset weight and the difference;
and determining, according to the matching degree, the identity information of the vehicle corresponding to the first detection data.
2. The vehicle identity determination method according to claim 1, wherein the preset weight is determined by:
screening out, from historical first detection data and historical second detection data, data types that influence vehicle identity determination;
and determining the preset weight according to the screened data types.
3. The vehicle identity determination method according to claim 2, wherein the historical second detection data includes historical pre-installed second detection data and historical aftermarket second detection data; the historical pre-installed second detection data is second detection data obtained after the RSU interacts with a pre-installed OBU, and the historical aftermarket second detection data is second detection data obtained after the RSU interacts with an aftermarket OBU, the pre-installed OBU being an OBU fitted to the vehicle before the vehicle leaves the factory and the aftermarket OBU being an OBU fitted to the vehicle after it leaves the factory;
the screening out of data types influencing vehicle identity determination from the historical first detection data and the historical second detection data comprises:
screening out data types influencing vehicle identity determination from the historical first detection data and the historical pre-installed second detection data to obtain pre-installed data types;
and screening out data types influencing vehicle identity determination from the historical first detection data and the historical aftermarket second detection data to obtain aftermarket data types;
the determining of the preset weight according to the screened data types comprises:
determining a weight according to the screened pre-installed data types to obtain a preset pre-installed weight;
and determining a weight according to the screened aftermarket data types to obtain a preset aftermarket weight;
the determining of the matching degree of the first detection data and the second detection data according to the preset weight and the difference comprises:
if the second detection data is obtained through interaction between the RSU and a pre-installed OBU, determining the matching degree of the first detection data and the second detection data according to the preset pre-installed weight and the difference;
and if the second detection data is obtained through interaction between the RSU and an aftermarket OBU, determining the matching degree of the first detection data and the second detection data according to the preset aftermarket weight and the difference.
4. The vehicle identity determination method according to claim 3, wherein, if at least two RSUs are installed on the same gantry or the same roadside pole, the method comprises:
determining, from the historical aftermarket second detection data, second detection data belonging to the same vehicle;
for each vehicle, determining, from the second detection data belonging to that vehicle, C1, which indicates the position information of the lane in which the vehicle is currently located;
the screening out of data types influencing vehicle identity determination from the historical first detection data and the historical aftermarket second detection data to obtain aftermarket data types comprises:
screening out data types influencing vehicle identity determination from the historical first detection data, C1, and the historical aftermarket second detection data to obtain the aftermarket data types.
5. The vehicle identity determination method according to claim 4, wherein the determining of a weight according to the screened pre-installed data types to obtain a preset pre-installed weight comprises:
determining, for any two of the screened pre-installed data types, the ratio of their influence on vehicle identity determination, the ratios of influence serving as the preset pre-installed weight.
6. The vehicle identity determination method according to any one of claims 1 to 5, further comprising, before the determining of the matching degree of the first detection data and the second detection data according to the preset weight and the difference:
performing consistency detection on a pairwise comparison matrix, wherein the pairwise comparison matrix is determined according to the preset weight;
wherein the determining of the matching degree of the first detection data and the second detection data according to the preset weight and the difference comprises:
determining the matching degree of the first detection data and the second detection data according to a differentiation matrix and the pairwise comparison matrix that has passed consistency detection, wherein the differentiation matrix is determined according to the difference.
7. The vehicle identity determination method according to claim 6, wherein the performing of consistency detection on the pairwise comparison matrix comprises:
determining an eigenvector of the pairwise comparison matrix;
determining an eigenvalue of the pairwise comparison matrix according to the eigenvector;
determining a consistency ratio according to the eigenvalue, the order of the pairwise comparison matrix, and a preset random consistency index corresponding to that order;
and if the consistency ratio is smaller than a preset ratio threshold, determining that the pairwise comparison matrix passes consistency detection; otherwise, determining that the pairwise comparison matrix fails consistency detection.
8. The vehicle identity determination method according to claim 6, wherein, if the pairwise comparison matrix fails consistency detection, consistency processing is performed on the pairwise comparison matrix to obtain a new pairwise comparison matrix; the new matrix is taken as the pairwise comparison matrix, and the method returns to the step of performing consistency detection on the pairwise comparison matrix and the subsequent steps.
9. The vehicle identity determination method according to claim 7, wherein the determining of the matching degree of the first detection data and the second detection data according to the differentiation matrix and the pairwise comparison matrix that has passed consistency detection comprises:
determining the matching degree of the first detection data and the second detection data according to the differentiation matrix and the eigenvector of the pairwise comparison matrix that has passed consistency detection.
10. A vehicle identity determination device applied to vehicle-road cooperation, comprising:
a detection data acquisition module, configured to acquire first detection data and second detection data, wherein the first detection data is data uploaded by a radar, the second detection data is data uploaded by a road side unit (RSU), and the second detection data includes identity information of a vehicle;
a difference determination module, configured to determine a difference between the first detection data and the second detection data;
a matching degree determination module, configured to determine a matching degree between the first detection data and the second detection data according to a preset weight and the difference;
and a vehicle identity information determination module, configured to determine, according to the matching degree, the identity information of the vehicle corresponding to the first detection data.
11. An apparatus comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the method of any of claims 1 to 9 when executing the computer program.
12. A vehicle identity determination system for vehicle-road cooperation, the system comprising: at least one radar, at least one RSU, and the apparatus of claim 11.
13. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 9.
CN202110593501.5A 2021-05-28 2021-05-28 Vehicle identity determination method, device and equipment applied to vehicle-road cooperation Pending CN115410376A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110593501.5A CN115410376A (en) 2021-05-28 2021-05-28 Vehicle identity determination method, device and equipment applied to vehicle-road cooperation

Publications (1)

Publication Number Publication Date
CN115410376A true CN115410376A (en) 2022-11-29

Family

ID=84155416

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110593501.5A Pending CN115410376A (en) 2021-05-28 2021-05-28 Vehicle identity determination method, device and equipment applied to vehicle-road cooperation

Country Status (1)

Country Link
CN (1) CN115410376A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101350109A (en) * 2008-09-05 2009-01-21 交通部公路科学研究所 Method for locating and controlling multilane free flow video vehicle
CN204314951U (en) * 2014-11-06 2015-05-06 北京万集科技股份有限公司 A kind of access device of vehicle carried electronic label
CN104658264A (en) * 2015-01-27 2015-05-27 武汉烽火众智数字技术有限责任公司 Vehicle verification method and system based on ETC and video
CN110189425A (en) * 2019-05-27 2019-08-30 武汉万集信息技术有限公司 Multilane free-flow vehicle detection method and system based on binocular vision
CN110189424A (en) * 2019-05-27 2019-08-30 武汉万集信息技术有限公司 Multilane free-flow vehicle detection method and system based on multiple target radar
CN110619256A (en) * 2018-06-19 2019-12-27 杭州海康威视数字技术股份有限公司 Road monitoring detection method and device
WO2020211658A1 (en) * 2019-04-17 2020-10-22 阿里巴巴集团控股有限公司 Trigger detection method, apparatus and system
CN112258830A (en) * 2020-10-23 2021-01-22 上海博泰悦臻电子设备制造有限公司 Method for evaluating reliability of vehicle formation driving and application thereof


Similar Documents

Publication Publication Date Title
CN109087510B (en) Traffic monitoring method and device
US20200342430A1 (en) Information Processing Method and Apparatus
US20130132166A1 (en) Smart toll network for improving performance of vehicle identification systems
CN111429592B (en) ETC settlement method and system based on intelligent vehicle machine and intelligent vehicle machine
CN108765937B (en) Vehicle identification device, roadside unit and method for ETC system
CN109508731A (en) A kind of vehicle based on fusion feature recognition methods, system and device again
CN109977776A (en) A kind of method for detecting lane lines, device and mobile unit
CN112133085B (en) Vehicle information matching method, device and system, storage medium and electronic device
CN113093129A (en) Automatic calibration method and device for vehicle-mounted radar and terminal equipment
CN114463372A (en) Vehicle identification method and device, terminal equipment and computer readable storage medium
CN114495520B (en) Counting method and device for vehicles, terminal and storage medium
CN113192217B (en) Fee evasion detection method, fee evasion detection device, computer equipment and medium
CN108693517B (en) Vehicle positioning method and device and radar
CN110910519A (en) Information acquisition method and device, free flow charging system and storage medium
CN110596639A (en) Vehicle tracking and positioning method, information marking method, system and control terminal
CN112215887A (en) Pose determination method and device, storage medium and mobile robot
CN111862208B (en) Vehicle positioning method, device and server based on screen optical communication
CN115410376A (en) Vehicle identity determination method, device and equipment applied to vehicle-road cooperation
CN112212851B (en) Pose determination method and device, storage medium and mobile robot
CN109448145B (en) Road side unit system for reducing ETC lane construction cost
CN112884924B (en) High-speed lane no-stop charging method, system and storage medium
CN113923405B (en) Mobile communication system based on safety monitoring
CN116721396A (en) Lane line detection method, device and storage medium
WO2023098320A1 (en) Method, apparatus, device, system, and storage medium for vehicle fee charging and program product
CN114089272A (en) Calibration method, device, terminal and storage medium of positioning module

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination