CN112950947A - Vehicle information processing method, device and system - Google Patents


Info

Publication number
CN112950947A
Authority
CN
China
Prior art keywords
vehicle
information
board unit
unit
specific area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911174197.XA
Other languages
Chinese (zh)
Other versions
CN112950947B (en)
Inventor
张志强
吴栋磊
闵洪波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Banma Zhixing Network Hongkong Co Ltd
Original Assignee
Banma Zhixing Network Hongkong Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Banma Zhixing Network Hongkong Co Ltd filed Critical Banma Zhixing Network Hongkong Co Ltd
Priority to CN201911174197.XA priority Critical patent/CN112950947B/en
Publication of CN112950947A publication Critical patent/CN112950947A/en
Application granted granted Critical
Publication of CN112950947B publication Critical patent/CN112950947B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/01 Detecting movement of traffic to be counted or controlled
    • G08G 1/017 Detecting movement of traffic to be counted or controlled identifying vehicles
    • G08G 1/0175 Detecting movement of traffic to be counted or controlled identifying vehicles by photographing vehicles, e.g. when violating traffic rules
    • G08G 1/0104 Measuring and analyzing of parameters relative to traffic conditions
    • G08G 1/0108 Measuring and analyzing of parameters relative to traffic conditions based on the source of data
    • G08G 1/0112 Measuring and analyzing of parameters relative to traffic conditions based on the source of data from the vehicle, e.g. floating car data [FCD]
    • G08G 1/0116 Measuring and analyzing of parameters relative to traffic conditions based on the source of data from roadside infrastructure, e.g. beacons

Abstract

A vehicle information processing method, device and system are provided. The method comprises the following steps: communicating with an on-board unit on a vehicle to acquire first information sent by the on-board unit; photographing the vehicle to acquire second information of the vehicle from the captured image; and processing the acquired first information and second information. The on-board unit may be a dedicated OBU or another stand-alone device capable of communicating vehicle information with the roadside unit. By combining information interaction with the on-board unit and visual recognition, the invention enables automatic binding and matching of the on-board unit with vehicle features. Matching verification can be performed multiple times to supervise abnormal conditions.

Description

Vehicle information processing method, device and system
Technical Field
The present invention relates to the field of information processing, and in particular, to a method, an apparatus, and a system for processing vehicle information.
Background
In daily life, there are various scenarios that require vehicle identification, such as parking lot and highway tolling, as well as general road monitoring. In these scenarios, the on-board unit usually serves as the credential for recognizing the identity of the vehicle, so that operations based on that identity can be completed. For example, when a vehicle equipped with an on-board unit travels on a highway, a fee is charged according to the positions of the on-board unit when the vehicle enters and leaves the highway. However, identifying the vehicle by this credential alone leads to problems: for example, different vehicles may privately swap on-board units before leaving the highway to evade part of the toll, or the on-board unit may be deliberately shielded so that it fails.
Therefore, a solution is needed that can ensure that the identity of the vehicle is correctly identified.
Disclosure of Invention
In view of the above, the present invention provides a vehicle information processing scheme that achieves automatic binding and matching between an on-board unit and vehicle features by combining information interaction with the on-board unit and visual recognition. Matching verification can be performed multiple times to supervise abnormal conditions.
According to a first aspect of the present invention, there is provided a vehicle information processing method including: communicating with an on-board unit on a vehicle to acquire first information sent by the on-board unit; shooting the vehicle to acquire second information of the vehicle according to the shot image; and performing processing for the acquired first information and second information. The on-board unit may be a dedicated on-board OBU or other stand-alone device capable of communicating vehicle information with the roadside unit.
According to a second aspect of the present invention, there is provided a vehicle information processing method including: when a first vehicle enters a specific area, communicating with a first vehicle-mounted unit on the first vehicle to acquire first information sent by the first vehicle-mounted unit; shooting a first vehicle to acquire second information of the first vehicle according to the shot image; communicating with a second onboard unit on a second vehicle to acquire third information transmitted by the second onboard unit when the second vehicle is located within or leaves the specific area; determining that the second on-board unit is the same as the first on-board unit if the first information and the third information correspond; shooting a second vehicle to acquire fourth information of the second vehicle according to the shot image; and determining that the second vehicle is the same vehicle as the first vehicle based on the second information and fourth information.
According to a third aspect of the present invention, there is provided a vehicle information processing system including: a first device for communicating with an on-board unit on a vehicle to acquire first information sent by the on-board unit; a second device for photographing the vehicle to acquire second information of the vehicle from the captured image; and a third device configured to process the acquired first information and second information.
According to a fourth aspect of the present invention, there is provided a vehicle information processing apparatus comprising: a communication unit for communicating with an on-board unit on a vehicle to acquire first information transmitted by the on-board unit and for notifying an adjacent photographing apparatus to photograph the vehicle so as to acquire second information of the vehicle from the captured image; and a processing unit for processing the acquired first information and second information.
According to a fifth aspect of the invention, there is provided a computing device comprising: a processor; and
a memory having executable code stored thereon, which when executed by the processor, causes the processor to perform the method as described above.
According to a sixth aspect of the invention, a non-transitory machine-readable storage medium is proposed, having stored thereon executable code which, when executed by a processor of an electronic device, causes the processor to perform the method as described above.
Therefore, by combining information interaction with the on-board unit and visual recognition, the vehicle information processing scheme of the invention enables automatic binding and matching of the on-board unit with vehicle features. Matching verification can be performed multiple times to supervise abnormal conditions.
Drawings
The above and other objects, features and advantages of the present disclosure will become more apparent by describing in greater detail exemplary embodiments thereof with reference to the attached drawings, in which like reference numerals generally represent like parts throughout.
Fig. 1 shows a flowchart illustrating a vehicle information processing method according to an embodiment of the present invention.
Fig. 2 shows a flowchart of a vehicle information processing method according to another embodiment of the invention.
Fig. 3 shows an example of the composition of the highway toll system based on the principle of the present invention.
Fig. 4 shows an example of the interaction flow of the entrance tollgate with the vehicle.
Fig. 5 shows an example of the interaction flow of the exit tollgate with the vehicle.
Fig. 6 shows an example of a plurality of roadside units arranged on the way of the vehicle.
Fig. 7 shows an example in which a queued vehicle is present at the time of charging.
Fig. 8 shows an example where there is a trailing vehicle at the time of charging.
Fig. 9 shows an example of a process flow when an illegal vehicle exists.
Fig. 10 shows another example of the presence of a trailing vehicle at the time of charging.
Fig. 11 shows an example of a process flow when an illegal vehicle exists.
Fig. 12 shows an example of the composition of a vehicle information processing system according to an embodiment of the invention.
Fig. 13 shows a configuration example of a vehicle information processing apparatus according to an embodiment of the present invention.
FIG. 14 shows a schematic structural diagram of a computing device according to one embodiment of the invention.
Detailed Description
Preferred embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While the preferred embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
In an automated processing scenario for vehicles, such as an automatic toll collection scenario on a highway, confirmation of the vehicle identity and subsequent operations based on the confirmation, such as toll deduction on an associated account, may be accomplished by equipping the vehicle with an on-board unit having a communication function (at least a short-range communication function) and capable of identifying the vehicle identity.
In the above automation scenario, cooperation of the on-board unit and the roadside unit is usually required to complete the relevant operations. In this context, an on-board unit refers broadly to a unit "on board", i.e. a unit equipped on the vehicle that can communicate with the roadside unit to assist in determining the identity of the vehicle, e.g. a specific chip or hardware unit mounted on the vehicle, or a more general-purpose stand-alone device owned by the driver or a passenger that can communicate with the roadside unit, such as a smart terminal or a wearable device. The roadside unit is a unit installed near the vehicle travel path that can communicate with the on-board unit. In one particular scenario, an on-board unit (OBU) may refer to a hardware unit installed in a vehicle that enables V2X communication while supporting V2X applications. The roadside unit (RSU) may be a hardware unit installed at the roadside that implements V2X communication while supporting V2X applications. Here, V2X (Vehicle to Everything) refers to communication between the on-board unit and other devices, including but not limited to Vehicle-to-Vehicle (V2V), Vehicle-to-Infrastructure (V2I), Vehicle-to-Pedestrian (V2P), Vehicle-to-Cloud (V2C), and so on.
It should be understood that although embodiments of the present invention will be described below primarily in connection with V2X communication, the vehicle information processing scheme of the present invention is also applicable to scenarios where the on-board unit and the roadside unit communicate based on other communication protocols. For example, the smartphone, GPS navigator, smart watch, or wristband of the driver, a passenger, or the vehicle owner, or another independent device, may be bound to the specific vehicle, communicate with the roadside unit, and perform the vehicle authentication function together with the image captured at the roadside.
In a conventional scenario in which an on-board unit is used to automatically recognize a vehicle's identity, since recognition depends only on the on-board unit, cheating can be performed by operating on the on-board unit. For example, after a vehicle loaded with a V2X OBU device drives onto a highway, the device may fail due to human factors (such as being switched off or shielded); the vehicle then cannot be identified by roadside V2X RSU equipment, the exit toll station system cannot automatically reconstruct the travel path, and it can only charge according to the shortest path between the entrance and the exit. Or a long-distance truck and a short-distance car may swap OBU devices and each pay automatically at the exit, so that part of the toll is evaded overall. In addition, when a vehicle carrying a V2X OBU device exits through the automated exit toll station, other vehicles may tailgate it to evade fees, or cut in front of it so that the normally paying vehicle cannot pass, and so on.
In order to solve at least one of the problems described above, the present invention proposes a vehicle information processing scheme that improves the reliability of an on-board unit identifying a specific vehicle by automatically matching on-board unit information with vehicle exterior information, and that is capable of flexibly coping with the problem of tailgating fee evasion of other vehicles.
Fig. 1 shows a flowchart illustrating a vehicle information processing method according to an embodiment of the present invention. The method may be implemented in particular by a road side unit in combination with a camera unit. In different embodiments, the shooting unit may be a separate device or may be integrated with the road side unit.
In step S110, a communication is made with an on-board unit on a vehicle to acquire first information transmitted by the on-board unit. In step S120, the vehicle is photographed to acquire second information of the vehicle from the photographed image. Subsequently, in step S130, processing is performed with respect to the acquired first information and the second information, for example, association processing of the first and second information is performed.
Since the first and second information need to be associated (e.g., to generate or verify a correlation), the operations of steps S110 and S120 need to be performed simultaneously, or sequentially within a limited spatial range. In other words, the first and second information are not acquired at widely separated places or times; both may be acquired "in the field".
Typically, the first information acquired from the on-board unit includes identification information of the on-board unit. In different embodiments, the information of the associated vehicle can be looked up remotely via the identification code of the on-board unit, or it can be included directly in the first information. The "associated vehicle" of the on-board unit refers to the vehicle that the on-board unit is supposed to represent. As previously mentioned, in one embodiment, the on-board unit may be a narrowly defined OBU, in which case the vehicle that the on-board unit represents may be the vehicle on which the on-board unit is installed, as recorded by the issuing institution or traffic control authority. The information of the vehicle associated with the on-board unit may include, for example, the license plate number of the vehicle.
Alternatively or additionally, the on-board unit may be a unit which is equipped on board the vehicle in a broad sense. In this case, the on-board unit may include the OBU in the narrow sense, or may include other independent devices capable of communicating with the road side unit and indicating the identity of the vehicle, for example, a smart terminal such as a mobile phone or a GPS navigator, or a wearable device such as a watch or a bracelet. These devices may be bound to a specific vehicle, for example, respective on-board unit functions APP may be downloaded on a smartphone and binding to one or more vehicles may be achieved by respective binding and/or authentication steps (e.g., uploading a vehicle ownership certificate or a driver's license, etc.). These stand-alone devices may be paired with a vehicle (e.g., bluetooth pairing) by themselves or manually when initially "on-board" the device, e.g., when the device holder is boarding as a driver or a rider, and used as an on-board unit for the paired vehicle.
Accordingly, the second information may include appearance feature information of the vehicle visually recognized by photographing the vehicle. Here, "appearance characteristic information" refers to visually recognizable characteristics of the vehicle itself, such as a vehicle type, color, size, and the like. In a preferred embodiment, the "appearance characteristic information" may further include more detailed appearance characteristics, such as, for example, a vehicle body pattern, a window sticker, an external damage, a refit, or the like, which are visually recognizable.
In the case where the first information and the second information contain different kinds of information as described above, the processing of step S130 may include performing a binding operation on the first information and the second information. For example, the on-board unit identification number and the license plate number provided by the on-board unit are bound with the appearance feature information identified from the captured image. For example, consider a vehicle having an on-board unit identification number A and a license plate number B, with an appearance characteristic C (e.g., a red XX model vehicle).
In the case where the first information and the second information already contain the same kind of information, step S130 may include performing a matching operation on the first information and the second information. For example, in one embodiment, the first information provided by the on-board unit already includes vehicle type information of the associated vehicle (e.g., model information entered at registration). In that case, the vehicle type in the first information may be compared with the vehicle type identified in the second information to confirm whether they match. For example, if the vehicle type of the associated vehicle included in the first information is also C (e.g., a red XX model vehicle), the first information and the second information match successfully. Otherwise, the matching is regarded as unsuccessful, and a corresponding alarm or notification operation is performed.
In other embodiments, the association operation on the first and second information may include both matching and binding, e.g., verifying that items of the same kind agree, and binding items of different kinds if they do. In one embodiment, the second information may further include the license plate number of the vehicle visually recognized from the captured image, which can be verified against the license plate number in the first information in the same way.
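As a concrete illustration of the binding and matching operations described above, the following sketch (not part of the patent; all field and function names are hypothetical) matches items of the same kind across the two information sources and binds items of different kinds into a single record:

```python
# Illustrative sketch only: associating "first information" (from the
# on-board unit) with "second information" (from visual recognition).
# Field names and the associate() helper are hypothetical.

from dataclasses import dataclass
from typing import Optional

@dataclass
class ObuInfo:                       # "first information"
    obu_id: str
    plate: Optional[str] = None      # plate registered to the OBU, if known
    vehicle_type: Optional[str] = None

@dataclass
class VisualInfo:                    # "second information"
    plate: str
    vehicle_type: str
    color: str

def associate(first: ObuInfo, second: VisualInfo) -> dict:
    """Match items of the same kind; bind items of different kinds."""
    # Matching: compare fields that appear in both sources.
    if first.plate is not None and first.plate != second.plate:
        raise ValueError("license plate mismatch: alarm/notify")
    if first.vehicle_type is not None and first.vehicle_type != second.vehicle_type:
        raise ValueError("vehicle type mismatch: alarm/notify")
    # Binding: combine fields unique to each source into one record.
    return {
        "obu_id": first.obu_id,
        "plate": second.plate,
        "vehicle_type": second.vehicle_type,
        "color": second.color,
    }

# OBU id A, plate B, appearance C ("red XX model"), as in the example above.
record = associate(ObuInfo("A", plate="B"), VisualInfo("B", "XX model", "red"))
```

A mismatch in any shared field raises an error, standing in for the "alarm or notification operation" mentioned above.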
As described above, the operations of steps S110 and S120 are preferably performed in the field. For example, the first information is acquired by short-range communication by a roadside unit installed on site, and the vehicle is photographed by a photographing apparatus near or integrated with the roadside unit to acquire the second information. To this end, the communication with the on-board unit in step S110 may be broadcast communication. For example, the RSU may receive the first information broadcast by the on-board unit, or the RSU may broadcast a communication request and receive the first information in the on-board unit's response. Likewise, in step S120, the vehicle may be photographed using a field-mounted visual recognition apparatus. Field operation thereby ensures accurate acquisition of the first and second information.
In addition, although the operations of steps S110 and S120 may be performed in the field, there is no limitation on which entity issues the instructions to perform them. That is, in some embodiments, the operations may be performed under control of a roadside unit, and in other embodiments, the operations may involve remote communication with a server (e.g., a cloud platform). For example, step S130, processing the acquired first information and second information, may include uploading the processing result to a server. In other embodiments, step S130 may include obtaining prior processing results from the server for subsequent verification.
When the vehicle is photographed using a field-mounted visual recognition device, that device can also identify offending vehicles, such as trailing and cut-in vehicles. When the visual recognition device photographs a trailing vehicle behind the vehicle, the relevant party may be notified to perform a fee-evasion prevention operation against the trailing vehicle. In other embodiments, the vehicle itself may also be leveraged for offending-vehicle identification; e.g., the vehicle may be notified to capture an image using an on-board camera to determine whether there is a trailing vehicle behind it or a cut-in vehicle ahead.
The vehicle information processing method is particularly suitable for multiple matching authentication of the vehicle and the vehicle-mounted unit in a specific time period. In other words, within a certain period of time, it is confirmed via a plurality of matching operations that a certain on-board unit is always mounted on its associated vehicle. Fig. 2 shows a flowchart of a vehicle information processing method according to another embodiment of the invention. The method can be regarded as an implementation of the method shown in fig. 1 in a specific scenario.
When a first vehicle enters a specific area, in step S210, a first vehicle-mounted unit on the first vehicle is communicated to acquire first information sent by the first vehicle-mounted unit. In step S220, the first vehicle is photographed to acquire second information of the first vehicle from the photographed image. Thereby, information processing is performed when the vehicle enters the specific area.
When the second vehicle is located in the specific area or leaves the specific area, in step S230, communication is made with the second on-board unit on the second vehicle to acquire the third information transmitted by the second on-board unit. In step S240, in the case where the first information and the third information correspond, it is determined that the second on-board unit is the same on-board unit as the first on-board unit. In step S250, the second vehicle is photographed to acquire fourth information of the second vehicle from the captured image. In step S260, it is determined based on the second and fourth information that the second vehicle is the same vehicle as the first vehicle. Thus, information processing is also performed while the vehicle is within the specific area or when it leaves the specific area.
The specific area may include a highway, an enclosed road, or a parking lot. For example, when a vehicle enters a highway, first information (e.g., an identification number A and a license plate number B) may be acquired from the on-board unit of the first vehicle at the highway entrance, and appearance characteristic information (e.g., a vehicle type C) of the first vehicle may be visually recognized from a captured image. While the vehicle is travelling on the highway or leaving it, the first information (e.g., identification number A and license plate number B) can again be acquired from the on-board unit, and the appearance characteristic information can again be visually recognized from a captured image. If the recognized appearance information is unchanged (e.g., the vehicle type is still C), it can be assumed that the on-board unit has remained on its registered associated vehicle and has not been swapped. Subsequent operations, such as automatic payment of highway tolls, are thereby facilitated.
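The entrance/exit verification just described can be sketched as follows. This is an illustrative assumption about one possible realization, with a hypothetical in-memory store standing in for the server-side record:

```python
# Hedged sketch (hypothetical data model): verify at the exit that the
# on-board unit and the vehicle appearance observed at the entrance are
# unchanged, before charging by entrance/exit position.

entry_records = {}  # server-side store keyed by OBU identification

def on_entry(obu_id, plate, appearance, position):
    # Steps S210/S220: bind the first information (obu_id, plate) with the
    # second information (appearance) and record the entry position.
    entry_records[obu_id] = {"plate": plate, "appearance": appearance,
                             "position": position}

def on_exit(obu_id, plate, appearance, position):
    # Steps S230-S260: same OBU (third info corresponds to first info) and
    # same appearance (fourth info corresponds to second info) -> same vehicle.
    rec = entry_records.get(obu_id)
    if rec is None:
        return None  # OBU never seen at an entrance: handle as abnormal
    if rec["plate"] != plate or rec["appearance"] != appearance:
        return None  # possible OBU swap: supervise as abnormal
    # Charge according to entrance and exit positions (toll table assumed).
    return {"from": rec["position"], "to": position}

on_entry("A", "B", "red XX model", "entrance-1")
trip = on_exit("A", "B", "red XX model", "exit-3")
```

If the appearance observed at the exit differs from the bound entry record, the sketch refuses the automatic transaction, mirroring the supervision of on-board unit swapping described above.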
In some embodiments, the matching operation performed within the specific area or at the exit may be performed directly using the first and second information obtained at the entrance. In other embodiments, the binding and/or matching operation on the acquired first and second information may be performed at the entrance. Similar to the above, the binding operation binds information items of different kinds in the first and second information, while the matching operation compares and verifies items of the same kind.
In a preferred embodiment, the results of the above-described binding and/or matching operations may also be uploaded to a server. Then, step S260 may include: obtaining the result of the binding and/or matching operation from the server; and determining that the second vehicle is the same vehicle as the first vehicle according to the fourth information and a result of the binding and/or matching operation.
Similar to the foregoing, the communication with the on-board unit may be a broadcast-based communication. Thus, when the first vehicle enters the particular area, communicating with the first on-board unit on the first vehicle may include: broadcasting a communication request; and receiving the first information responded by the first vehicle-mounted unit. Communicating with a second on-board unit on a second vehicle when the second vehicle is located within or exits the particular area may include: receiving the third information broadcast by the second vehicle-mounted unit; or broadcasting a communication request and receiving the third information responded by the second vehicle-mounted unit.
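The two broadcast-based patterns above can be modeled as in this sketch; the RSU actively broadcasts a communication request and collects each on-board unit's response (the passive pattern simply omits the request). The message format and class names are invented for illustration:

```python
# Illustrative sketch of the broadcast request/response pattern: the RSU
# broadcasts a communication request and collects the first information
# from each on-board unit's reply. The message format is hypothetical.

from typing import Optional

class OnBoardUnit:
    def __init__(self, obu_id: str, plate: str):
        self.obu_id, self.plate = obu_id, plate

    def on_broadcast(self, request: dict) -> Optional[dict]:
        # Respond only to a communication request addressed to all units.
        if request.get("type") == "communication_request":
            return {"type": "first_info", "obu_id": self.obu_id,
                    "plate": self.plate}
        return None

class RoadsideUnit:
    def broadcast_request(self, reachable_units) -> list:
        request = {"type": "communication_request"}
        replies = (u.on_broadcast(request) for u in reachable_units)
        return [r for r in replies if r is not None]

rsu = RoadsideUnit()
first_infos = rsu.broadcast_request([OnBoardUnit("A", "B")])
```

The same shape covers both the entrance exchange (first information) and the in-area or exit exchange (third information).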
After determining that the second vehicle is the same vehicle as the first vehicle, a transaction may be conducted with the second on-board unit of the second vehicle. Further, a transaction may be conducted with the second on-board unit of the second vehicle based on location information when the first vehicle enters a particular area and location information when the second vehicle is located within or exits the particular area, e.g., a charge may be made based on ingress and egress locations.
In addition, when a second vehicle is located in the specific area, at least one communication to a second vehicle-mounted unit and shooting of the second vehicle can be performed for the second vehicle to obtain at least one set of second vehicle information, wherein the set of second vehicle information comprises fifth information sent by the second vehicle-mounted unit, sixth information obtained according to a shot image and second vehicle position information during shooting; uploading the acquired at least one group of second vehicle information to a server; confirming spatiotemporal information of the second vehicle within the specific area using the at least one set of second vehicle information. This enables, for example, acquisition of information while the vehicle is traveling. Based on the spatiotemporal information, it may be determined that the second vehicle loading the first onboard unit is always the same vehicle as the first vehicle. Additionally, the transaction with the second on-board unit of the second vehicle is also based on the temporal-spatial information, e.g., for enabling a travel path based charging.
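One way to use the uploaded groups of second-vehicle information as spatiotemporal evidence is to check that consecutive roadside observations of the same on-board unit are physically plausible. The sketch below is illustrative only; the speed bound is an assumed parameter, not from the patent:

```python
# Illustrative sketch: accumulate (time, position) observations from the
# roadside units along the way and check that consecutive observations of
# the same OBU are physically plausible. MAX_SPEED_KMH is a hypothetical
# plausibility bound chosen for this example.

MAX_SPEED_KMH = 200

def track_is_consistent(observations):
    """observations: time-ordered list of (hours, km_marker) tuples."""
    for (t0, x0), (t1, x1) in zip(observations, observations[1:]):
        if t1 <= t0:
            return False  # out-of-order timestamps: reject the track
        speed = abs(x1 - x0) / (t1 - t0)
        if speed > MAX_SPEED_KMH:
            return False  # space-time jump: possible unit swap, supervise
    return True

ok = track_is_consistent([(0.0, 0), (0.5, 50), (1.0, 95)])
bad = track_is_consistent([(0.0, 0), (0.1, 80)])  # 800 km/h: implausible
```

A consistent track supports both the conclusion that the second vehicle carrying the first on-board unit is the same vehicle throughout, and travel-path-based charging at the exit.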
After the transaction succeeds, the second vehicle may be allowed to exit the specific area, e.g., an automatic barrier is raised to release the vehicle. In a preferred embodiment, matching can be performed once more before release. In that case, after the transaction succeeds, the second vehicle may be photographed to acquire seventh information of the second vehicle from the captured image, and the second vehicle is allowed to exit the specific area after the seventh information is determined to correspond to the second or fourth information.
In addition, the method may further include: when a trailing vehicle is photographed after the vehicle, an operation is performed that does not allow the trailing vehicle to leave the specific area. The determination of the trailing vehicle may be based on a roadside mounted camera or may be based on a camera of the vehicle. Then, the method may include notifying the second vehicle to turn on a rear-view camera; and shooting by using the rearview camera to identify whether a trailing vehicle exists or not.
The invention combines on-board unit communication (e.g., V2X communication) with visual-sensor vehicle recognition to realize: (1) at the entrance of a specific area, automatic binding, with fault tolerance, of vehicle features (such as license plate, model, color, and size) to features contained in the on-board unit (ID, MAC address, etc.); (2) recognition, recording, and early warning at each RSU point the vehicle passes within the specific area, and supervision of abnormal behavior such as stopping to swap on-board units, based on geographic position, motion track, and full RSU coverage with video recognition; and (3) transactions, such as charging, at the exit of the specific area, by comparing the visually recognized travel path and vehicle features with the travel track of the on-board unit (e.g., a V2X OBU device).
The principles of the present invention will be explained in detail below with reference to an example in which the particular area is a toll highway. Fig. 3 shows an example of the composition of a highway toll system based on the principles of the present invention. As shown in fig. 3, the toll collection system includes a roadside unit, a visual recognition unit, and a release device. The roadside unit is a roadside V2X RSU with V2X communication capability; it can perform short-range communication, such as broadcast communication, with an on-board unit (e.g., an on-board V2X OBU or an independent in-vehicle intelligent terminal device), and can also perform remote communication with a cloud server. The above toll collection system may be disposed at the entrance and exit of the expressway.
In fig. 3, the visual recognition unit is shown as visual sensing devices arranged in front of and behind the vehicle. It should be understood that in other embodiments, the visual recognition unit may have other implementations: for example, it may be disposed in only one direction, may comprise a single device or multiple devices, or may be integrated with the roadside unit.
Fig. 4 shows an example of the interaction flow of the entrance tollgate with the vehicle.
As shown in fig. 4, when a vehicle loaded with an on-board unit (e.g., a V2X OBU device or a separate intelligent terminal device) enters a highway entrance lane,
roadside unit RSU broadcasts toll station clearance capability;
the OBU receives the broadcast clearance capability of the toll station, completes a P2P communication session with the RSU, and responds with the vehicle's information, including the identification of the OBU, the vehicle identification code, and the type, color, size, and position of the vehicle;
at the same time, the vision recognition system captures and recognizes vehicle features such as license plate, type, color, etc.;
the RSU compares the vehicle features identified by the visual recognition system with the vehicle information transmitted by the OBU; after verification is completed, the V2X device is automatically matched with the vehicle, and the binding result is reported to the cloud control platform;
the vehicle is released and the V2X communication ends.
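The entrance-lane binding step above can be sketched as follows. This is a minimal illustration, not the patented implementation; all field names (`obu_id`, `plate`, etc.) and the matching criteria are assumptions.

```python
# Hypothetical sketch of the entrance-lane binding step (Fig. 4): the RSU
# compares the features reported by the OBU with those recognized visually,
# and on success produces a binding record for the cloud control platform.
# All field names are illustrative assumptions.

def bind_at_entrance(obu_msg, vision_result):
    """Return a binding record if the OBU-reported features match the
    visually recognized features, otherwise None."""
    matched = (
        obu_msg["plate"] == vision_result["plate"]
        and obu_msg["vehicle_type"] == vision_result["vehicle_type"]
        and obu_msg["color"] == vision_result["color"]
    )
    if not matched:
        return None  # verification failed: do not bind, flag for manual handling
    return {
        "obu_id": obu_msg["obu_id"],   # identity of the on-board unit
        "plate": obu_msg["plate"],     # vehicle identification code
        "entrance": "entry-gate-01",   # where the binding was made (assumed)
    }

obu_msg = {"obu_id": "O1", "plate": "A12345", "vehicle_type": "car", "color": "white"}
vision = {"plate": "A12345", "vehicle_type": "car", "color": "white"}
print(bind_at_entrance(obu_msg, vision))
```

In a real deployment, the binding record would be uploaded to the cloud control platform rather than printed, and a failed match would trigger the fault-tolerant handling described above.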
Fig. 5 shows an example of the interaction flow of the exit tollgate with the vehicle.
The vehicle carrying the V2X OBU device, when exiting the highway exit lane,
the road side unit RSU broadcasts the toll station charging capabilities;
the OBU receives the broadcast charging capability of the toll station, completes a P2P communication session with the RSU, and responds with the vehicle's information, including the identification of the OBU, the vehicle identification code, and the type, color, size, and position of the vehicle;
at the same time, the vision recognition system takes a picture and recognizes the vehicle features;
the RSU determines the vehicle's driving path and calculates the toll according to the OBU identifier and the binding information recorded by the cloud control platform, and verifies the vehicle information using the visual recognition system;
the roadside unit RSU requests payment information from the on-board unit OBU, including the payment amount, the RSU identification, the position, and the like;
when the on-board unit OBU receives the RSU payment request, it verifies the RSU device and then responds with the payment information, including the payment account, the payment amount, the payment key, and the like;
after the RSU completes payment deduction, the OBU is notified of the deduction result.
At the same time, the visual recognition system again photographs and recognizes the passing vehicle; the RSU verifies that the vehicle has paid and releases it.
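The exit-lane charging exchange above can be sketched as follows. This is an illustrative simplification under stated assumptions: the toll rate, message fields, and the `obu_pay` callback are all hypothetical, and real payment would involve the verification and signing steps described in the flow.

```python
# Hypothetical sketch of the exit-lane charging step (Fig. 5): the RSU looks up
# the entrance binding by OBU identifier, computes the toll from the driving
# path, and exchanges payment messages with the OBU. The rate and all field
# names are illustrative assumptions.

RATE_PER_KM = 0.5  # assumed per-kilometer toll rate

def compute_toll(path_km):
    return round(path_km * RATE_PER_KM, 2)

def exit_flow(obu_id, bindings, path_km, obu_pay):
    binding = bindings[obu_id]                       # entrance binding record
    amount = compute_toll(path_km)
    request = {"rsu_id": "exit-gate-01", "amount": amount, "position": "exit"}
    response = obu_pay(request)                      # OBU verifies the RSU, then pays
    paid = response["amount"] == amount              # deduction result check
    return {"plate": binding["plate"], "amount": amount, "paid": paid}

bindings = {"O1": {"plate": "A12345"}}
def obu_pay(req):
    # stand-in for the OBU's payment response (account, amount, payment key)
    return {"account": "acct-1", "amount": req["amount"], "key": "sig"}

print(exit_flow("O1", bindings, 120, obu_pay))  # {'plate': 'A12345', 'amount': 60.0, 'paid': True}
```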
In order to prevent the on-board unit or the license plate from being replaced en route, roadside units can also be arranged at positions other than the entrance and exit of the expressway. Fig. 6 shows an example of a plurality of roadside units arranged along the vehicle's route.
As shown in fig. 6, when a vehicle carrying a V2X OBU device passes an expressway roadside V2X RSU,
the on-board OBU broadcasts vehicle information including, but not limited to, vehicle type, size, location, speed, etc.;
the roadside visual recognition system photographs and recognizes the passing vehicle;
and the roadside RSU receives the vehicle information broadcast by the OBU, compares and fuses it with the recognition result of the visual recognition system, uploads the vehicle information to the cloud control platform, and records the vehicle's driving path in combination with the RSU position information.
If the roadside RSU does not receive any OBU broadcast from a passing vehicle, the vehicle is marked as suspicious when the visual recognition result is uploaded, and an early warning of suspected fee evasion is issued.
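The mid-route fusion and suspicion-marking logic above can be sketched as follows; this is a hedged illustration, with the record fields assumed rather than taken from the patent.

```python
# Hypothetical sketch of a mid-route roadside check (Fig. 6): fuse the visual
# recognition result with any received OBU broadcasts; vehicles seen by the
# camera but silent over V2X are marked suspicious (possible fee evasion).
# All field names are illustrative assumptions.

def midroute_record(rsu_pos, vision_plates, obu_broadcasts):
    records = []
    plates_with_obu = {b["plate"] for b in obu_broadcasts}
    for plate in vision_plates:
        records.append({
            "plate": plate,
            "rsu_position": rsu_pos,                     # for path reconstruction
            "suspicious": plate not in plates_with_obu,  # no OBU broadcast heard
        })
    return records

recs = midroute_record("km-42", ["A12345", "B99999"],
                       [{"plate": "A12345", "speed": 95}])
print(recs)  # B99999 is flagged suspicious: seen by camera, silent over V2X
```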
The roadside unit shown in fig. 6 may be used to record spatiotemporal information of the vehicle during travel. The spatiotemporal information may restore the driving path of the vehicle in the expressway as described above to realize the charging according to the driving path.
The expressway toll system described with reference to figs. 3-6 is also capable of identifying and handling queue-jumping and trailing behavior. Fig. 7 shows an example in which a queue-jumping vehicle is present at the time of charging: while a truck carrying a V2X OBU device is paying, another unpaid car cuts in front of it. Fig. 8 shows an example in which a trailing vehicle is present at the time of charging: while the truck carrying the V2X OBU device is paying, another unpaid car trails behind it.
Fig. 9 shows an example of the processing flow when an offending vehicle is present. As shown, a forward-facing (for trailing vehicles) or backward-facing (for queue-jumping vehicles) visual recognition system photographs and checks for offending vehicles. The information on the offending vehicle can then be sent to the cloud via the roadside V2X RSU, and station staff can be notified. Meanwhile, the automatic railing can be controlled so that the offending vehicle is not released.
Fig. 10 shows another example in which a trailing vehicle is present at the time of charging. As shown, another unpaid car trails while the truck carrying the V2X OBU device is paying. In this case, since the visual recognition system lacks a forward recognition function (it has only one backward-facing camera), the vehicle's own rear-view camera can be used to recognize the trailing vehicle.
Fig. 11 shows an example of the processing flow when an offending vehicle is present. When the truck carrying the V2X OBU device is within the communication coverage of the automatic toll collection system's cloud platform, the on-board rear-view camera photographs the situation at the tail of the vehicle and uploads the captured images or videos to the cloud platform (for example, via V2X communication).
The cloud platform performs vehicle identification and fee evasion behavior analysis based on the images or videos, and sends the result to a toll station:
if the subsequent vehicle is at a position from which, even if it accelerates, it cannot "trail" the vehicle past the automatic railing, the current truck is quickly released;
if the subsequent vehicle may "trail" the vehicle past the automatic railing, an operation to prevent it from escaping may be performed, such as making the automatic railing fall faster, warning the subsequent vehicle by screen or sound to slow down and pay, or notifying field personnel for emergency handling;
if a subsequent vehicle forces its way through by "trailing", a video or image of the vehicle and its fee-evasion behavior can be recorded on the cloud platform for subsequent processing (e.g., a warning, or recovery of the fee plus a penalty the next time the vehicle enters the expressway).
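The three-way decision above can be sketched as a simple time-gap estimate. This is only an illustration: the railing close time, the distance/speed inputs, and the threshold logic are all assumptions, not values from the patent.

```python
# Hypothetical sketch of the trailing-vehicle decision: given the follower's
# distance and speed (from the rear-camera image), estimate whether it could
# slip past the automatic railing behind the paying truck. All thresholds are
# illustrative assumptions.

RAILING_CLOSE_TIME_S = 3.0   # assumed time for the railing to drop after release

def trailing_action(gap_m, follower_speed_mps):
    # time for the follower to reach the railing at its current speed
    time_to_railing = gap_m / max(follower_speed_mps, 0.1)
    if time_to_railing > RAILING_CLOSE_TIME_S * 2:
        return "release"                 # cannot trail through even if it accelerates
    if time_to_railing > RAILING_CLOSE_TIME_S:
        return "warn_and_fast_close"     # drop the railing faster, warn via screen/sound
    return "record_and_report"           # forced trail: record video for later recovery

print(trailing_action(60, 5))    # 12 s to the railing -> "release"
print(trailing_action(40, 10))   # 4 s -> "warn_and_fast_close"
print(trailing_action(25, 10))   # 2.5 s -> "record_and_report"
```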
It should be understood that in other embodiments, for queue-jumping vehicles, the current vehicle may also be notified to turn on a front camera. In addition, the image recognition, analysis, and warning processing can also be implemented by the roadside unit.
When two or more vehicles with the same appearance features pass through the same entrance lane in sequence, even if an on-board V2X OBU device is bound to the wrong vehicle (generally identified by a vehicle identification code such as a license plate), accurate charging can still be guaranteed as long as the verification system supports automatic correction.
For example, suppose three vehicles V1, V2, V3 with the same appearance features carry V2X OBU devices O1, O2, O3, respectively, and that when they pass through the same entrance lane, they are automatically (but erroneously) bound as V1O2, V2O3, and V3O1.
1) If they exit through the same exit at the same time, then regardless of whether the system detects the binding error, it can be concluded that their driving routes are the same and their fees are the same;
2) if a certain roadside unit RSU receives information from on-board unit O1 while the visual recognition system identifies the vehicle as V1, and the cloud control platform holds the bindings V1O2 and V3O1, the binding relationships between vehicles and V2X devices may be revised to V1O1 and V3O2, with other vehicle features unchanged. The bindings recorded by the cloud control platform are then V1O1, V2O3, and V3O2;
3) for V2O3 and V3O2, if they exit via the same exit, they are processed as in 1); otherwise, as in 2), the result is V2O2 and V3O3.
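The correction in step 2) above can be sketched as a swap over the binding table. This is a minimal illustration under assumed data structures; the patent does not prescribe this representation.

```python
# Hypothetical sketch of the automatic binding correction in step 2): when a
# mid-route RSU hears OBU O1 while the camera identifies vehicle V1, but the
# platform has V1 bound to O2 and V3 bound to O1, the two OBUs are swapped so
# that the observed pairing V1-O1 becomes the recorded one.

def correct_binding(bindings, seen_vehicle, heard_obu):
    """bindings: dict mapping vehicle -> OBU. Swap entries so that
    seen_vehicle is bound to heard_obu; other features stay unchanged."""
    if bindings.get(seen_vehicle) == heard_obu:
        return bindings                       # already consistent, nothing to do
    # find the vehicle currently holding heard_obu, and swap the two OBUs
    other = next(v for v, o in bindings.items() if o == heard_obu)
    bindings[other] = bindings[seen_vehicle]
    bindings[seen_vehicle] = heard_obu
    return bindings

b = {"V1": "O2", "V2": "O3", "V3": "O1"}
print(correct_binding(b, "V1", "O1"))  # {'V1': 'O1', 'V2': 'O3', 'V3': 'O2'}
```

Applying the same correction again at a later RSU (e.g., observing V3 with O3) would converge the table to the fully correct bindings, matching step 3).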
Similarly, if the verification system supports fuzzy matching, the binding relationships can be maintained and accurate charging can still be achieved. Even when the appearance features of a leading or trailing vehicle are the same as those of the current vehicle, the visual verification system can distinguish and identify them by strictly comparing vehicle identification codes (e.g., license plate numbers).
Examples of applications of the present invention in highway toll scenarios have been described above with reference to figs. 3-11. It should be understood that the present invention is also applicable to other scenarios where matching and binding are based on on-board-unit communication and visual identification, such as parking-lot toll verification and monitoring of urban expressways and general roads. In different embodiments, one or more sets of roadside units and visual recognition units may be arranged for a specific area. For example, in a small parking lot whose exit and entrance coincide, a single set of roadside and visual recognition units may be installed at the entrance for matching when vehicles enter and exit (or even during parking). In addition, the cloud control platform is not essential: the system can connect directly to a traffic management system, or the processing can be performed locally.
Fig. 12 shows an example of the composition of a vehicle information processing system according to an embodiment of the invention. The system may be used as a highway toll station as shown in the above figures, for example, a toll collection system arranged at an entrance and an exit, and a highway roadside component as shown in fig. 6.
As shown in fig. 12, the vehicle information processing system 1200 of the present invention may include a first device 1210, a second device 1220, and a third device 1230.
The first device 1210 may communicate with an on-board unit on a vehicle to acquire first information transmitted by the on-board unit. The second device 1220 may photograph the vehicle to acquire second information of the vehicle from the photographed image. The third device 1230 may perform processing on the acquired first and second information. Here, in a highway toll station scenario, the first device 1210 may be implemented as a roadside V2X RSU. The second device 1220 may be implemented as a vehicle identification and verification device, such as two cameras positioned one in front and one behind. The third device 1230, which performs the processing, may be implemented as a separate device or integrated into the first or second device; for example, the first device 1210 and the third device 1230 are preferably both implemented by an RSU. In a scenario such as that of fig. 6, the roadside unit may further integrate the second device 1220, thereby fully including the first through third devices.
The second device 1220 may perform a visual recognition operation on the photographed image of the vehicle to acquire appearance feature information of the vehicle as the second information. While the capture operation itself is performed by the second device 1220, in various embodiments the visual recognition operation may be performed by any of the first, second, or third devices. The first information may include identification information of the on-board unit and information of the vehicle associated with the on-board unit. In some embodiments, the first information may also include associated vehicle appearance feature information.
The system 1200 may be implemented at least in part at an entrance to a particular area, such as a toll collection system at an entrance to a highway. Then, the first device may include a first device arranged at an entrance of the specific area, and receive the first information transmitted by the on-board unit on the vehicle based on the broadcast communication; the second device may include a second device disposed at an entrance of a specific area, and photographs the vehicle when the vehicle reaches the entrance to acquire appearance feature information of the vehicle from the photographed image; the third device may perform a binding and/or matching process with respect to the acquired first information and the second information. The system 1200 may further include: a release means arranged at an entrance of a specific area, said release means releasing said vehicle into said specific area after said third device has performed said binding and/or matching process.
System 1200 may be implemented at least in part at an exit of a particular area, for example, a toll collection system at an exit of a highway; it may also be implemented at least partially in this area, for example, as a highway roadside mounted unit as shown in fig. 6. Then, the first device may include a first device disposed at an exit of the specific area or within the specific area, and receive the first information transmitted by the on-board unit on the vehicle based on the broadcast communication; the second device includes a second device disposed at an exit of a specific area or within a specific area, and photographs the vehicle when the vehicle reaches the vicinity of the second device to acquire appearance feature information of the vehicle from the photographed image.
In different application scenarios, the system 1200 may have one or more sets of devices. For example, in a compact parking lot whose entrance and exit coincide, the first and second devices of the entrance and exit may be implemented by the same set of devices; the system 1200 then includes only one set. In a large parking lot with multiple entrances and exits, each gateway may be equipped with a set of first and second devices for communicating with on-board units and photographing vehicles. In that case, the third device may be implemented by a single networked device or separately at each gateway, for example by the roadside unit of each gateway, and the system 1200 may include multiple sets of devices, one per gateway. On an expressway, a set of first and second devices needs to be provided not only at each gateway but also, as shown in fig. 6, inside the expressway, to communicate with and photograph vehicles in transit; the system 1200 may then include still more sets of devices. In the scenarios above, the system 1200 may further include a server (or cloud platform), depending on the specific implementation, to provide centralized storage, processing, and delivery of information. For example, the system 1200 may upload information obtained via the first, second, and third devices to the server and download required data from it. The entrance toll system may upload the vehicle's primary binding/matching data to the server, and the exit toll system may download that data from the server based on the on-board unit identification code and use it together with the data obtained at the exit for subsequent processing, e.g., matching and toll calculation.
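The server-mediated hand-off between entrance and exit described above can be sketched as follows. The `BindingServer` class and its methods are illustrative assumptions; a dict stands in for the cloud platform's storage.

```python
# Hypothetical sketch of the entrance-to-exit hand-off: the entrance system
# uploads the primary binding/matching record, and the exit system later
# downloads it by on-board-unit identifier. A dict stands in for the
# server / cloud control platform; all names are assumptions.

class BindingServer:
    def __init__(self):
        self._store = {}

    def upload(self, record):        # called by the entrance toll system
        self._store[record["obu_id"]] = record

    def download(self, obu_id):      # called by the exit toll system
        return self._store.get(obu_id)

server = BindingServer()
server.upload({"obu_id": "O1", "plate": "A12345", "entrance": "gate-3"})
print(server.download("O1")["entrance"])  # gate-3
```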
The third device may determine, based on the processing result of the binding and/or matching process at the entrance of the specific area, whether the appearance information of the vehicle acquired at the exit of or within the specific area matches the appearance information acquired at the entrance. If so, the third device communicates with the on-board unit of the vehicle to conduct the transaction. If not, an alarm can be issued on site or reported to the cloud.
The system 1200 may also include a clearance device, such as an automatic railing. When the transaction is completed, the third device notifies the clearance device to release the vehicle from the specific area.
The system 1200 of the present invention may also have automatic handling capabilities for offending vehicles. For example, when the second device (e.g., a camera at an exit) captures a trailing vehicle behind the vehicle, the system 1200 may notify the clearance device not to allow the trailing vehicle to exit the particular area. In other embodiments, the system 1200 may also instruct the vehicle located at the exit of the specific area to turn on its rear camera, so as to determine from the image captured by the rear camera whether a trailing vehicle exists, or to turn on its front camera, so as to determine from the image captured by the front camera whether a queue-jumping vehicle exists.
In the case where multiple sets of communication and photographing devices are also deployed inside the specific area as shown in fig. 6, the third device may acquire spatiotemporal information whenever the vehicle is photographed or communicates with the on-board unit within the specific area. The spatiotemporal information may be used both for transactions and for identifying abnormal behavior. For example, the third device may conduct a transaction with the on-board unit of the vehicle based on the spatiotemporal information. Based on multiple sets of spatiotemporal information, the third device or the cloud platform can also judge whether a given vehicle remained physically close to its associated on-board unit throughout its time in the specific area.
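The proximity judgment described above can be sketched as a check over paired spatiotemporal records. The 50 m threshold and the one-dimensional road-position representation are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch of the proximity check: verify from multiple
# spatiotemporal records that a vehicle and its bound on-board unit were
# always physically close inside the specific area. Positions are distances
# along the road (meters); the 50 m threshold is an illustrative assumption.

def always_proximate(records, max_gap_m=50):
    """records: list of (vehicle_position_m, obu_position_m) pairs captured at
    the same timestamps by the camera and the RSU respectively."""
    return all(abs(v - o) <= max_gap_m for v, o in records)

print(always_proximate([(100, 110), (5000, 4980), (12000, 12030)]))  # True
print(always_proximate([(100, 110), (5000, 4000)]))                  # False: likely swap
```

A `False` result would indicate that the on-board unit and the vehicle separated at some point, e.g., a mid-route swap of the OBU or license plate, and would trigger the abnormal-behavior handling described above.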
In order to comprehensively acquire the appearance feature information of the vehicle, the second device 1220 may include an image pickup device provided on the road side and adapted to pick up an image of the vehicle from at least two directions. For example, the second device 1220 may include two separate cameras as shown in fig. 3. In other embodiments, the second device 1220 may also be a single camera that may be rotated, for example. In some embodiments, the second device 1220 may also be a camera with depth information measurement capabilities.
Fig. 13 shows a configuration example of a vehicle information processing apparatus according to an embodiment of the present invention. The apparatus may be a roadside unit used in an expressway toll station as shown in the above figures, for example the roadside V2X RSUs disposed at the entrance and exit, or the roadside component located at the expressway roadside and incorporating a visual recognition function as shown in fig. 6.
As shown in fig. 13, the vehicle information processing apparatus 1300 may include: a communication unit 1310 and a processing unit 1320.
The communication unit 1310 may be configured to communicate with an on-board unit on a vehicle to acquire first information sent by the on-board unit, and to notify an adjacent photographing device to photograph the vehicle so as to acquire second information of the vehicle from the captured image. Here, the communication unit 1310 may be a short-range communication unit, for example, a unit with broadcast communication capability. The processing unit 1320 may perform processing on the acquired first and second information.
The first information may include identification information (e.g., ID or MAC address) of the on-board unit and information (e.g., license plate number) of the vehicle associated with the on-board unit, while the second information includes the visually recognized appearance feature information of the vehicle; the processing unit then performs a binding process that binds the first information and the second information.
The first information may further include appearance feature information of the vehicle associated with the on-board unit, and the processing unit may execute matching processing of matching appearance feature information included in the first information and appearance feature information included in the second information.
Where a cloud platform is involved, apparatus 1300 may further comprise: a second communication unit for communicating with the server. The second communication unit has a remote communication capability, and uploads the acquired information and/or the processed result to the server, and acquires the required information from the server.
The apparatus 1300 may be disposed in different locations of a particular area to perform different functions, individually or in combination. For example, apparatus 1300 may be: an entrance device arranged at an entrance of the specific area to perform vehicle information processing on an entrance vehicle; an in-zone device arranged within a specific zone to perform vehicle information processing on vehicles within the specific zone; and an exit device arranged at an exit of the specific area to perform vehicle information processing on the exit vehicle.
The exit device and/or the in-zone device may be adapted to: based on the first information acquired from the on-board unit of a vehicle at the exit and/or in the zone, search for the vehicle information recorded by the entrance device for that on-board unit, so as to match the vehicle information at the entrance with the vehicle information at the exit and/or in the zone. Upon a match, the exit device and/or the in-zone device may notify its communication unit to conduct a transaction with the on-board unit of the vehicle at the exit and/or in the zone. Subsequently, when it is determined that the transaction is completed, the processing unit may notify the communication unit to release the vehicle from the specific area.
Likewise, the apparatus 1300 may be provided with the capability of automatically identifying and processing the offending vehicle. For example, in certain embodiments, the exit device may be used to: in a case where it is determined from the captured image that there is a trailing vehicle, an operation of not allowing the trailing vehicle to leave the specific area is performed. In other embodiments, the exit device may be used to: and informing the exit of the vehicle to start a rear camera so as to determine whether a trailing vehicle exists according to the image shot by the rear camera.
In order to provide more comprehensive information for matching the on-board unit to vehicle features, the processing unit may record the time and location at which the first and/or second information was acquired to generate spatiotemporal information, and the exit device and/or the in-zone device may conduct a transaction with the on-board unit of a vehicle at the exit and/or in the zone based on that vehicle's spatiotemporal information.
FIG. 14 shows a schematic structural diagram of a computing device according to one embodiment of the invention.
Referring to fig. 14, computing device 1400 includes memory 1410 and processor 1420.
The processor 1420 may be a multi-core processor or may include a plurality of processors. In some embodiments, processor 1420 may include a general-purpose host processor and one or more special purpose coprocessors such as a Graphics Processor (GPU), Digital Signal Processor (DSP), or the like. In some embodiments, processor 1420 may be implemented using custom circuitry, such as an Application Specific Integrated Circuit (ASIC) or a Field Programmable Gate Array (FPGA).
The memory 1410 may include various types of storage units, such as system memory, read-only memory (ROM), and permanent storage. The ROM may store static data or instructions needed by the processor 1420 or other modules of the computer. The permanent storage device may be a readable and writable storage device, and may be a non-volatile storage device that does not lose stored instructions and data even after the computer is powered off. In some embodiments, a mass storage device (e.g., a magnetic or optical disk, or flash memory) is employed as the permanent storage device; in other embodiments, the permanent storage may be a removable storage device (e.g., a floppy disk or an optical drive). The system memory may be a readable and writable memory device or a volatile readable and writable memory device, such as dynamic random access memory, and may store instructions and data that some or all of the processors require at runtime. In addition, the memory 1410 may include any combination of computer-readable storage media, including various types of semiconductor memory chips (DRAM, SRAM, SDRAM, flash memory, programmable read-only memory), magnetic disks, and/or optical disks. In some embodiments, the memory 1410 may include a readable and/or writable removable storage device, such as a compact disc (CD), a read-only digital versatile disc (e.g., DVD-ROM, dual-layer DVD-ROM), a read-only Blu-ray disc, an ultra-density optical disc, a flash memory card (e.g., SD card, mini SD card, Micro-SD card, etc.), or a magnetic floppy disk. Computer-readable storage media do not contain carrier waves or transitory electronic signals transmitted by wireless or wired means.
The memory 1410 has stored thereon executable code, which, when processed by the processor 1420, may cause the processor 1420 to perform the vehicle information processing methods described above.
The vehicle information processing method, apparatus, and system according to the present invention have been described in detail above with reference to the accompanying drawings. According to the vehicle information processing scheme of the invention, by combining on-board-unit information interaction with visual recognition, automatic binding and matching of the on-board unit and vehicle features can be realized. Matching verification can be carried out multiple times so as to supervise abnormal situations.
Furthermore, the method according to the invention may also be implemented as a computer program or computer program product comprising computer program code instructions for carrying out the above-mentioned steps defined in the above-mentioned method of the invention.
Alternatively, the invention may also be embodied as a non-transitory machine-readable storage medium (or computer-readable storage medium, or machine-readable storage medium) having stored thereon executable code (or a computer program, or computer instruction code) which, when executed by a processor of an electronic device (or computing device, server, etc.), causes the processor to perform the steps of the above-described method according to the invention.
Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the disclosure herein may be implemented as electronic hardware, computer software, or combinations of both.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems and methods according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Having described embodiments of the present invention, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (59)

1. A vehicle information processing method comprising:
communicating with an on-board unit on a vehicle to acquire first information sent by the on-board unit;
shooting the vehicle to acquire second information of the vehicle according to the shot image; and
processing is performed with respect to the acquired first information and the second information.
2. The method of claim 1, wherein performing processing for the acquired first information and second information comprises:
and under the condition that the first information and the second information contain the same kind of information, performing matching operation of the first information and the second information.
3. The method of claim 1, wherein performing processing for the acquired first information and second information comprises:
and under the condition that the first information and the second information contain different types of information, performing binding operation of the first information and the second information.
4. The method of claim 1, wherein the first information includes identification information of the on-board unit.
5. The method of claim 1, wherein the first information includes information of the vehicle associated with the on-board unit, the information including license plate number information of the vehicle.
6. The method of claim 5, wherein the on-board unit associated vehicle information includes on-board unit associated vehicle appearance characteristic information.
7. The method of claim 1, wherein the second information includes appearance feature information of the vehicle visually recognized by photographing the vehicle.
8. The method of claim 7, wherein the second information further includes license plate information of the vehicle visually recognized by photographing the vehicle.
9. The method of claim 1, wherein communicating with an on-board unit on a vehicle comprises one of:
receiving broadcast information from the on-board unit; or
broadcasting a communication request and receiving the first information returned by the on-board unit in response.
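The two communication patterns of claim 9 (passively receiving the on-board unit's broadcast, or actively broadcasting a request and awaiting a reply) can be sketched as follows. The JSON message format and the queue-based channels are assumptions standing in for the actual radio link, which the patent does not specify:

```python
# Illustrative sketch only: simulated short-range communication with an
# on-board unit. Queues stand in for the radio channel; message fields
# such as "obu_id" are hypothetical.
import json
import queue

def listen_for_broadcast(channel: queue.Queue) -> dict:
    """Pattern 1 of claim 9: receive broadcast information from the on-board unit."""
    return json.loads(channel.get(timeout=1.0))

def request_response(tx: queue.Queue, rx: queue.Queue) -> dict:
    """Pattern 2: broadcast a communication request, then receive the response."""
    tx.put(json.dumps({"type": "request", "want": "first_information"}))
    return json.loads(rx.get(timeout=1.0))

# Simulate an on-board unit that has already broadcast its information.
broadcast = queue.Queue()
broadcast.put(json.dumps({"obu_id": "OBU-001", "plate": "A12345"}))
first_info = listen_for_broadcast(broadcast)
```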
10. The method of claim 1, wherein the on-board unit comprises one of:
a dedicated hardware unit placed or installed on the vehicle;
a smart terminal; or
a mobile wearable device.
11. The method of claim 1, wherein photographing the vehicle comprises:
photographing the vehicle using a field-mounted visual recognition device.
12. The method of claim 11, wherein photographing the vehicle using a field-mounted visual recognition device comprises:
notifying a relevant party to perform a fee-evasion prevention operation for a trailing vehicle when the visual recognition device photographs the trailing vehicle behind the vehicle.
13. The method of claim 1, further comprising:
the vehicle is notified to take an image using an onboard camera to determine whether a trailing vehicle is present.
14. The method of claim 1, wherein performing processing for the acquired first information and second information comprises:
uploading a result of the processing to a server.
15. A vehicle information processing method comprising:
when a first vehicle enters a specific area,
communicating with a first vehicle-mounted unit on a first vehicle to acquire first information sent by the first vehicle-mounted unit;
shooting a first vehicle to acquire second information of the first vehicle according to the shot image;
when a second vehicle is located within or leaves the specific area,
communicating with a second on-board unit on a second vehicle to obtain third information sent by the second on-board unit;
determining that the second on-board unit is the same as the first on-board unit if the first information and the third information correspond;
shooting a second vehicle to acquire fourth information of the second vehicle according to the shot image; and
determining that the second vehicle is the same vehicle as the first vehicle based on the second information and fourth information.
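The entry/exit flow of claim 15 can be sketched as a two-step check: the on-board-unit identities (first and third information) must correspond, and the visually recognized appearance at the exit (fourth information) must agree with the appearance recorded at entry (second information). The feature-set representation and the overlap threshold below are assumptions for illustration:

```python
# Illustrative sketch only: deciding that the vehicle leaving a specific
# area is the same vehicle that entered it (claim 15). The appearance is
# modeled as a set of recognized feature tags, which is an assumption.
from dataclasses import dataclass

@dataclass
class EntryRecord:
    obu_id: str        # first information: identity sent by the on-board unit
    appearance: set    # second information: features recognized from the image

def same_vehicle(entry: EntryRecord, obu_id_at_exit: str,
                 appearance_at_exit: set, min_overlap: float = 0.8) -> bool:
    # Step 1: the second on-board unit is the same as the first when the
    # first information and the third information correspond.
    if entry.obu_id != obu_id_at_exit:
        return False
    # Step 2: the vehicles are the same when enough appearance features
    # from the entry image are recognized again in the exit image.
    overlap = len(entry.appearance & appearance_at_exit) / len(entry.appearance)
    return overlap >= min_overlap

entry = EntryRecord("OBU-001", {"white", "suv", "roof-rack", "plate:A12345"})
```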
16. The method of claim 15, wherein the second information of the first vehicle comprises:
appearance feature information of the first vehicle visually recognized from a photographed image of the first vehicle, and
the fourth information of the second vehicle comprises:
appearance feature information of the second vehicle visually recognized from a photographed image of the second vehicle.
17. The method of claim 15, further comprising:
performing a binding and/or matching operation on the acquired first information and second information.
18. The method of claim 17, further comprising:
uploading a result of the binding and/or matching operation to a server.
19. The method of claim 18, wherein determining, from the second information and fourth information, that the second vehicle is the same vehicle as the first vehicle comprises:
obtaining the result of the binding and/or matching operation from the server; and
determining that the second vehicle is the same vehicle as the first vehicle according to the fourth information and a result of the binding and/or matching operation.
20. The method of claim 15, wherein communicating with a first on-board unit on a first vehicle when the first vehicle enters a specific area comprises:
broadcasting a communication request; and
receiving the first information returned by the first on-board unit in response.
21. The method of claim 15, wherein communicating with a second on-board unit on a second vehicle when the second vehicle is located within or leaves the specific area comprises:
receiving the third information broadcast by the second on-board unit; or
broadcasting a communication request and receiving the third information returned by the second on-board unit in response.
22. The method of claim 15, further comprising:
after determining that the second vehicle is the same vehicle as the first vehicle, conducting a transaction with the second on-board unit of the second vehicle.
23. The method of claim 22, wherein, after determining that the second vehicle is the same vehicle as the first vehicle, conducting a transaction with the second on-board unit of the second vehicle comprises:
conducting a transaction with the second on-board unit of the second vehicle based on position information of when the first vehicle entered the specific area and position information of when the second vehicle was located within or left the specific area.
24. The method of claim 23, further comprising:
when the second vehicle is located within the specific area,
performing, for the second vehicle, at least one of communicating with the second on-board unit and photographing the second vehicle, to acquire at least one set of second vehicle information, wherein a set of second vehicle information comprises fifth information sent by the second on-board unit, sixth information obtained from a photographed image, and position information of the second vehicle at the time of photographing;
uploading the acquired at least one set of second vehicle information to a server; and
confirming spatiotemporal information of the second vehicle within the specific area using the at least one set of second vehicle information.
25. The method of claim 24, further comprising:
determining, based on the spatiotemporal information, that the second vehicle carrying the first on-board unit has remained the same vehicle as the first vehicle.
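Claims 24 and 25 accumulate in-area sightings, each pairing an on-board-unit read with a photographed appearance and a position, into spatiotemporal information, and then judge whether the on-board unit stayed on the same vehicle throughout. A sketch under assumed data shapes (the patent does not specify the record format):

```python
# Illustrative sketch only: spatiotemporal tracking within the specific
# area (claims 24-25). The Sighting record and the appearance hash are
# hypothetical representations.
from typing import NamedTuple

class Sighting(NamedTuple):
    timestamp: float   # seconds since the vehicle entered the area
    position: tuple    # (x, y) position within the specific area
    obu_id: str        # fifth information: identity sent by the on-board unit
    appearance: str    # sixth information: e.g. a hash of recognized features

def obu_stayed_on_same_vehicle(sightings: list, obu_id: str) -> bool:
    """True if every sighting of this on-board unit shows one appearance."""
    seen = {s.appearance for s in sightings if s.obu_id == obu_id}
    return len(seen) == 1   # one consistent appearance across the whole track

track = [
    Sighting(0.0,  (0, 0),   "OBU-001", "hash-A"),
    Sighting(30.0, (50, 10), "OBU-001", "hash-A"),
]
```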
26. The method of claim 24, wherein after determining that the second vehicle is the same vehicle as the first vehicle, conducting a transaction with the second on-board unit of the second vehicle comprises:
conducting a transaction with the second on-board unit of the second vehicle based on the spatiotemporal information.
27. The method of claim 23, further comprising:
allowing the second vehicle to leave the specific area after the transaction succeeds.
28. The method of claim 27, wherein allowing the second vehicle to leave the specific area after the transaction succeeds further comprises:
photographing the second vehicle after the transaction succeeds to acquire seventh information of the second vehicle from the photographed image; and
allowing the second vehicle to leave the specific area after determining that the seventh information corresponds to the second or fourth information.
29. The method of claim 27, further comprising:
performing an operation of not allowing a trailing vehicle to leave the specific area when the trailing vehicle is photographed behind the second vehicle.
30. The method of claim 29, further comprising:
notifying the second vehicle to turn on a rearview camera; and
photographing with the rearview camera to identify whether a trailing vehicle exists.
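Claims 27 through 29 together describe the release decision at the exit: the vehicle is released only after its transaction succeeded and the exit photograph (seventh information) matches the stored record, while a trailing vehicle photographed behind it is not allowed to leave. A sketch under an assumed boolean interface:

```python
# Illustrative sketch only: exit barrier decision combining claims 27-29.
# The boolean inputs abstract away the transaction and image-matching
# subsystems, which the patent describes separately.

def gate_decision(transaction_ok: bool, exit_photo_matches: bool,
                  trailing_vehicle_detected: bool) -> dict:
    # Claims 27-28: release only after a successful transaction and a
    # confirmed match between the exit photograph and the stored record.
    allow_vehicle = transaction_ok and exit_photo_matches
    return {
        "release_vehicle": allow_vehicle,
        # Claim 29: a detected trailing vehicle is never released; None
        # means no trailing vehicle was present, so no decision is needed.
        "release_trailing_vehicle": False if trailing_vehicle_detected else None,
    }
```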
31. A vehicle information processing system comprising:
a first device for communicating with an on-board unit on a vehicle to acquire first information sent by the on-board unit;
a second device for photographing the vehicle to acquire second information of the vehicle from the photographed image; and
third means for performing processing for the acquired first information and the second information.
32. The system of claim 31, wherein obtaining second information of the vehicle from the captured image comprises:
performing a visual recognition operation on the photographed vehicle image to acquire appearance feature information of the vehicle,
wherein the visual recognition operation is implemented by one of the first device, the second device, or the third device.
33. The system of claim 32, wherein the first information includes identification information of the on-board unit and information of the on-board unit associated vehicle.
34. The system of claim 33, wherein the first device is arranged at an entrance of a specific area and receives, via broadcast communication, first information transmitted by an on-board unit on a vehicle;
the second device is arranged at the entrance of the specific area and photographs the vehicle when the vehicle reaches the entrance, to acquire appearance feature information of the vehicle from the photographed image; and
the third device performs binding and/or matching processing on the acquired first information and second information.
35. The system of claim 34, further comprising:
a release device arranged at the entrance of the specific area, the release device releasing the vehicle into the specific area after the third device has performed the binding and/or matching processing.
36. The system of claim 34, wherein
the first device is arranged at an exit of the specific area or within the specific area, and receives, via broadcast communication, first information transmitted by an on-board unit on a vehicle; and
the second device is arranged at the exit of the specific area or within the specific area, and photographs the vehicle when the vehicle reaches the vicinity of the second device, to acquire appearance feature information of the vehicle from the photographed image.
37. The system of claim 36, wherein the third device determines, based on a result of the binding and/or matching processing at the entrance of the specific area, whether the appearance feature information of the vehicle acquired at the exit of the specific area or within the specific area matches the appearance feature information of the vehicle acquired at the entrance.
38. The system of claim 36, wherein the third device communicates with an on-board unit of the vehicle to conduct a transaction in a case where the third device determines that the appearance feature information of the vehicle acquired at the exit of the specific area or within the specific area matches that acquired at the entrance.
39. The system of claim 38, further comprising:
a release device, wherein the system notifies the release device to release the vehicle from the specific area in a case where the transaction is completed.
40. The system of claim 39, wherein the system notifies the release device not to allow a trailing vehicle to leave the specific area when the second device captures the trailing vehicle behind the vehicle.
41. The system of claim 38, wherein the system commands a vehicle located at the exit of the specific area to turn on a rear camera, so as to determine whether there is a trailing vehicle from an image taken by the rear camera.
42. The system of claim 36, wherein the third device obtains spatiotemporal information of the vehicle as the vehicle is photographed within the specific area or as its on-board unit is communicated with.
43. The system of claim 42, wherein the third device conducts a transaction with the on-board unit of the vehicle based on the spatiotemporal information.
44. The system of claim 32, wherein the second device comprises a roadside camera for photographing the vehicle from at least two directions.
45. The system of claim 31, wherein the first device performs at least part of the functions of the third device.
46. The system of claim 31, wherein the system uploads information obtained via the first device, the second device, and the third device to a server.
47. A vehicle information processing apparatus comprising:
a communication unit configured to:
communicate with an on-board unit on a vehicle to acquire first information transmitted by the on-board unit, and
notify an adjacent photographing apparatus to photograph the vehicle to acquire second information of the vehicle from the photographed image; and
a processing unit configured to execute processing on the acquired first information and second information.
48. The apparatus of claim 47, wherein the first information comprises identification information of the on-board unit and information of a vehicle associated with the on-board unit, the second information comprises visually identified appearance feature information of the vehicle, and
the processing unit executes a binding process of binding the first information and the second information.
49. The apparatus of claim 48, wherein the first information further comprises appearance feature information of the vehicle associated with the on-board unit, and
the processing unit executes matching processing for matching the appearance feature information included in the first information with the appearance feature information included in the second information.
50. The apparatus of claim 47, further comprising:
a second communication unit for communicating with a server and uploading the acquired information and/or processing results to the server.
51. The apparatus of claim 47, wherein the apparatus comprises at least one of:
an entrance device arranged at an entrance of a specific area to perform vehicle information processing on entering vehicles;
an in-area device arranged within the specific area to perform vehicle information processing on vehicles within the specific area; and
an exit device arranged at an exit of the specific area to perform vehicle information processing on exiting vehicles.
52. The apparatus of claim 51, wherein the exit device and/or the in-area device is configured to:
search, based on first information acquired from an on-board unit of a vehicle at the exit and/or within the area, the vehicle information recorded by the entrance device for that on-board unit, so as to match the vehicle information at the entrance with the vehicle information at the exit and/or within the area.
53. The apparatus of claim 52, wherein the exit device and/or the in-area device is configured to:
notify the communication unit to conduct a transaction with the on-board unit of the vehicle at the exit and/or within the area in a case where it is determined that the vehicle information at the entrance matches the vehicle information at the exit and/or within the area.
54. The apparatus of claim 53, wherein the processing unit is to:
in a case where it is determined that the transaction is completed, notify the communication unit to release the vehicle whose transaction is completed from the specific area.
55. The apparatus of claim 51, wherein the exit device is configured to:
perform an operation of not allowing a trailing vehicle to leave the specific area in a case where it is determined from the photographed image that the trailing vehicle exists.
56. The apparatus of claim 55, wherein the exit device is configured to:
notify the vehicle at the exit to turn on a rear camera, so as to determine whether a trailing vehicle exists from an image captured by the rear camera.
57. The apparatus of claim 53, wherein the processing unit is configured to:
record time and position information of acquiring the first information and/or the second information to generate spatiotemporal information, and
the exit device and/or the in-area device is configured to:
conduct a transaction with the on-board unit of a vehicle at the exit and/or within the area based on the spatiotemporal information of that vehicle.
58. A computing device, comprising:
a processor; and
a memory having executable code stored thereon, which when executed by the processor, causes the processor to perform the method of any one of claims 1-30.
59. A non-transitory machine-readable storage medium having stored thereon executable code, which when executed by a processor of an electronic device, causes the processor to perform the method of any of claims 1-30.
CN201911174197.XA 2019-11-26 2019-11-26 Vehicle information processing method, device and system Active CN112950947B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911174197.XA CN112950947B (en) 2019-11-26 2019-11-26 Vehicle information processing method, device and system


Publications (2)

Publication Number Publication Date
CN112950947A true CN112950947A (en) 2021-06-11
CN112950947B CN112950947B (en) 2023-05-30

Family

ID=76225040

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911174197.XA Active CN112950947B (en) 2019-11-26 2019-11-26 Vehicle information processing method, device and system

Country Status (1)

Country Link
CN (1) CN112950947B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113888900A (en) * 2021-09-10 2022-01-04 海信集团控股股份有限公司 Vehicle early warning method and device
CN113947892A (en) * 2021-08-26 2022-01-18 北京万集科技股份有限公司 Abnormal parking monitoring method and device, server and readable storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003216989A (en) * 2002-01-21 2003-07-31 Toshiba Corp Charge collection system, on-vehicle equipment, and charge collection method
CN105447918A (en) * 2015-12-22 2016-03-30 深圳市金溢科技股份有限公司 Vehicle-mounted unit, highway vehicle fee-collecting method and system
CN106504353A (en) * 2015-09-07 2017-03-15 腾讯科技(深圳)有限公司 Vehicle toll method and apparatus
CN108510734A (en) * 2018-03-30 2018-09-07 深圳市金溢科技股份有限公司 A kind of information of vehicles matching process of roadside unit and a kind of roadside unit
US10121289B1 (en) * 2014-04-11 2018-11-06 Amtech Systems, LLC Vehicle-based electronic toll system with interface to vehicle display
US20190088038A1 (en) * 2016-03-31 2019-03-21 Mitsubishi Heavy Industries Machinery Systems, Ltd. Same vehicle detection device, toll collection facility, same vehicle detection method, and program
CN110211250A (en) * 2019-06-20 2019-09-06 深圳成谷科技有限公司 The anti-escape charging method of vehicle of the radio frequency in conjunction with video and system





Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant