CN114999198A - Mixed traffic flow fusion control method and system based on high-precision map relative position


Info

Publication number
CN114999198A
Authority
CN
China
Prior art keywords: vehicle, information, image, traffic flow, mixed traffic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210390390.2A
Other languages
Chinese (zh)
Inventor
黄鸿
邓晓光
廖兴国
涂辉招
李浩
陆淼嘉
郭静秋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Ocn Network Technology Co ltd
Tongji University
Original Assignee
Guangzhou Ocn Network Technology Co ltd
Tongji University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Ocn Network Technology Co ltd and Tongji University
Priority to CN202210390390.2A
Publication of CN114999198A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/09 - Arrangements for giving variable traffic instructions
    • G08G1/0962 - Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967 - Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096708 - Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control
    • G08G1/096725 - Systems involving transmission of highway information, e.g. weather, speed limits where the received information generates an automatic action on the vehicle control
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00 - Drive control systems specially adapted for autonomous road vehicles
    • B60W60/001 - Planning or execution of driving tasks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/20 - Analysis of motion
    • G06T7/292 - Multi-camera tracking
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/70 - Determining position or orientation of objects or cameras
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/09 - Arrangements for giving variable traffic instructions
    • G08G1/0962 - Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967 - Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096766 - Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
    • G08G1/096783 - Systems involving transmission of highway information, e.g. weather, speed limits where the origin of the information is a roadside individual element
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2556/00 - Input parameters relating to data
    • B60W2556/45 - External transmission of data to or from the vehicle
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10016 - Video; Image sequence
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30232 - Surveillance
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30236 - Traffic on road, railway or crossing

Abstract

The application discloses a mixed traffic flow fusion control method and system based on high-precision map relative positions, relating to automatic driving technology. The method is applied to a roadside unit and comprises the following steps: acquiring images of the area, and determining the position and vehicle information of each vehicle in the images; and sending, according to the vehicle information, the position information of a vehicle to the vehicle controller corresponding to that vehicle information, so that the vehicle can perform automatic driving according to the vehicle information and the positioning information acquired by the vehicle itself. By implementing the method, the accurate position of a vehicle on the road can be determined using the roadside unit's facilities, overcoming the limitations of the vehicle's own sensors.

Description

Mixed traffic flow fusion control method and system based on high-precision map relative position
Technical Field
The application relates to automatic driving technology, and in particular to a mixed traffic flow fusion control method and system based on the relative position of a high-precision map.
Background
With the development of unmanned driving technology, the sensing capability of autonomous vehicle sensors has improved, and vehicle-road cooperative systems now provide powerful support for the technology. At present, automatic driving mainly relies on the sensors mounted on the unmanned vehicle to sense road conditions. Limited by the vehicle's own position, these sensors cannot obtain information over a large surrounding area.
Disclosure of Invention
The present invention is directed to solving at least one of the problems of the prior art. Therefore, the invention provides a mixed traffic flow fusion control method and system based on the relative position of a high-precision map.
In one aspect, an embodiment of the present application provides a mixed traffic flow fusion control method based on the relative position of a high-precision map, applied to a roadside unit and comprising the following steps:
acquiring an image of the area, and determining the position and vehicle information of each vehicle in the image;
and sending, according to the vehicle information, the position information of the vehicle to the vehicle controller corresponding to the vehicle information, so that the vehicle can perform automatic driving according to the vehicle information and the positioning information acquired by the vehicle itself.
In some embodiments, the vehicle information in the image is determined specifically by:
acquiring the vehicle information from the image, wherein the vehicle information is at least one of a license plate and vehicle appearance characteristics.
In some embodiments, the position of the vehicle in the image is determined by:
identifying a vehicle from the image;
acquiring calibration parameters of a camera;
and determining the position information of each vehicle in the road as the vehicle information according to the calibration parameters of the camera and the position of the vehicle in the image.
In some embodiments, the vehicle information includes head position information and parking space position information of each vehicle.
In some embodiments, sending the position information of the vehicle to the vehicle controller corresponding to the vehicle information specifically comprises:
broadcasting the position information of each vehicle to the area where the vehicles are located, so that each vehicle determines, according to the vehicle information associated with the position information, whether the information is its own.
In another aspect, an embodiment of the present application discloses a mixed traffic flow fusion control system based on the relative position of a high-precision map, comprising:
the road side unit is used for acquiring the image in the region, determining the position of the vehicle and the vehicle information in the image, and sending the position information of the vehicle to a vehicle controller corresponding to the vehicle information according to the vehicle information;
and the vehicle is used for executing automatic driving according to the vehicle information and the positioning information acquired by the vehicle.
In some embodiments, the vehicle and the road side unit are configured with the same map, the vehicle determines the coarse position of the vehicle on the map based on the positioning module of the vehicle, and the vehicle determines the accurate position of the vehicle on the map according to the coarse position and the position information sent by the road side unit.
In some embodiments, the position of the vehicle in the image is determined by:
identifying a vehicle from the image;
acquiring calibration parameters of a camera;
and determining the position information of each vehicle in the road as the vehicle information according to the calibration parameters of the camera and the position of the vehicle in the image.
In some embodiments, sending the position information of the vehicle to the vehicle controller corresponding to the vehicle information specifically comprises:
broadcasting the position information of each vehicle to the area where the vehicles are located, so that each vehicle determines, according to the vehicle information associated with the position information, whether the information is its own.
In some embodiments, the vehicle information in the image is determined, in particular:
and acquiring vehicle information from the image, wherein the vehicle information is at least one of a license plate and vehicle appearance characteristics.
According to the embodiments of the present application, the roadside unit acquires images of the area, determines the position and vehicle information of each vehicle in the images, and then sends the position information of a vehicle, according to its vehicle information, to the corresponding vehicle controller, so that the vehicle can perform automatic driving according to the vehicle information and its own positioning information. Using the present application, the obstacle-avoidance and route-planning capabilities of an autonomous vehicle can be improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed for describing the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flow chart of the mixed traffic flow fusion control method based on the relative position of a high-precision map;
Fig. 2 is a schematic diagram of image capture by a roadside unit;
Fig. 3 is a block diagram of the mixed traffic flow fusion control system based on the relative position of a high-precision map.
Detailed Description
In order to make the purpose, technical solutions and advantages of the present application clearer, the technical solutions of the present application will be clearly and completely described below through embodiments with reference to the accompanying drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In the description of the present invention, 'several' means one or more and 'a plurality' means two or more; terms such as 'greater than', 'less than' and 'exceeding' are understood as excluding the stated number, while terms such as 'above', 'below' and 'within' are understood as including it. Where 'first' and 'second' are used, they serve only to distinguish technical features and are not to be understood as indicating or implying relative importance, the number of the technical features indicated, or their precedence.
In the description of the present invention, unless otherwise specifically limited, terms such as 'arranged' should be construed broadly, and those skilled in the art can reasonably determine their specific meanings in combination with the specific content of the technical solution.
In the description of the present invention, reference to the description of the terms "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example," or "some examples," etc., means that a particular feature or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Referring to fig. 1 and 2, an embodiment of the present application provides a mixed traffic flow fusion control method based on the relative position of a high-precision map. In this configuration, the system consists of components installed on vehicles and a number of roadside units, where a roadside unit may be existing road infrastructure, such as a road gantry, upgraded with specific sensors for implementing the present application. The method is applied to the roadside unit and comprises the following steps:
and S1, acquiring the images in the area, and determining the position of the vehicle and the vehicle information in the images.
It will be appreciated that one or more cameras may be used to capture the images, and they may face different directions to cooperatively cover the entire area. The coverage of the cameras may overlap; by cropping each camera's picture, the shooting ranges can be separated so that the images captured by the cameras together form one complete, continuous area.
In this embodiment, the camera is fixedly mounted and its shooting height and angle are fixed, so that by calibrating the camera, the position on the road of a vehicle seen in the image can be obtained from the picture. These positions may include which lane the vehicle is in and where its head, sides, and tail are.
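The pixel-to-road conversion for a fixed, calibrated camera can be sketched as a planar homography. The matrix values and function name below are illustrative assumptions, not calibration data from the application:

```python
# Hypothetical sketch: converting an image pixel (u, v) to road-plane
# coordinates (x, y) with a planar homography H obtained during camera
# calibration. The matrix values below are made-up placeholders.

def apply_homography(H, u, v):
    """Map image pixel (u, v) to road coordinates via a 3x3 homography H."""
    x = H[0][0] * u + H[0][1] * v + H[0][2]
    y = H[1][0] * u + H[1][1] * v + H[1][2]
    w = H[2][0] * u + H[2][1] * v + H[2][2]
    return x / w, y / w  # perspective divide

# Assumed calibration result for a fixed roadside camera (illustrative):
H = [[0.05, 0.00, -10.0],
     [0.00, 0.12, -40.0],
     [0.00, 0.00,   1.0]]

x, y = apply_homography(H, 640, 360)  # a pixel near the image centre
```

In practice the homography would come from a calibration routine run when the camera is installed; here it is hard-coded only to keep the sketch self-contained.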
The vehicle information, at least one of a license plate and vehicle appearance characteristics, is chiefly information used to distinguish different vehicles, and the simplest distinguishing feature is the license plate. Through license plate recognition, the plate of each vehicle in the current image can be determined. In some embodiments, a vehicle may also be represented by its characteristics, such as its color, its signals, or its state (windows open or closed, lights on or off, etc.). In some implementations, the vehicle information may also be confirmed through technologies such as ETC.
Specifically, the roadside unit may acquire the license plate and appearance characteristics of a vehicle using a camera at a road gantry, and then track the vehicle along the road by combining multiple cameras. This makes it possible to determine the positional relationships between vehicles.
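The cross-camera tracking described above can be sketched as a track store keyed by license plate; the class and identifiers are hypothetical, and the detections are assumed sample data:

```python
# Hypothetical sketch of cross-camera vehicle tracking keyed by license
# plate: each camera reports (plate, road position) detections, and the
# roadside unit keeps a single track per plate across all cameras.

class PlateTracker:
    def __init__(self):
        self.tracks = {}  # plate -> list of (camera_id, position)

    def update(self, camera_id, plate, position):
        self.tracks.setdefault(plate, []).append((camera_id, position))

    def last_position(self, plate):
        history = self.tracks.get(plate)
        return history[-1][1] if history else None

tracker = PlateTracker()
tracker.update("cam_gate", "A12345", (0.0, 0.0))  # first seen at the gantry
tracker.update("cam_2", "A12345", (80.0, 3.5))    # handed off downstream
```

A real deployment would also expire stale tracks and resolve conflicting detections; the sketch only shows the plate-keyed handoff itself.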
S2, sending, according to the vehicle information, the position information of the vehicle to the vehicle controller corresponding to the vehicle information, so that the vehicle can perform automatic driving according to the vehicle information and the positioning information acquired by the vehicle itself.
In this embodiment, the position information may be transmitted to the vehicle in a directed manner; however, this requires a long path through a server or similar relay, so the real-time performance of the transmission is relatively poor. In other embodiments, the information for the road segment at the current time may instead be sent by regional broadcast, so that a vehicle can learn what the roadside unit is currently measuring. The vehicle then fuses this with the positioning information it obtains from GPS or the BeiDou system to arrive at a more precise vehicle position, providing better decision data for automatic driving. Specifically, this embodiment gives an example of such fusion: the vehicle's positioning system determines the road segment where the vehicle is located, the vehicle receives the vehicle position information from the roadside unit of that segment, and the vehicle's specific location is determined from that position information.
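The fusion example just described can be sketched as follows; all field names, segment geometry, and values are assumptions made for illustration:

```python
# Illustrative sketch of the fusion example above: the vehicle's coarse
# GNSS fix selects the road segment, and the precise position is then
# looked up in that segment's roadside-unit broadcast.

def fuse_position(gnss_fix, segments, rsu_broadcasts, my_plate):
    # 1. Pick the segment whose centre is closest to the coarse fix.
    segment = min(segments, key=lambda s: abs(s["center"] - gnss_fix))
    # 2. Find this vehicle's entry in that segment's broadcast, if any.
    for entry in rsu_broadcasts.get(segment["id"], []):
        if entry["plate"] == my_plate:
            return entry["position"]  # precise roadside measurement
    return gnss_fix  # fall back to the coarse GNSS position

segments = [{"id": "seg1", "center": 100.0}, {"id": "seg2", "center": 300.0}]
broadcasts = {"seg1": [{"plate": "A12345", "position": 112.7}]}
precise = fuse_position(120.0, segments, broadcasts, "A12345")
```

The fallback to the coarse fix when no broadcast entry matches mirrors the document's point that the roadside data augments, rather than replaces, the vehicle's own positioning.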
According to the embodiment of the application, the image in the area is acquired by the road side unit, the position of a vehicle and the vehicle information in the image are determined, and then the position information of the vehicle is sent to the vehicle controller corresponding to the vehicle information according to the vehicle information, so that the vehicle can perform automatic driving according to the vehicle information and the positioning information acquired by the vehicle; by utilizing the method and the device, the obstacle avoidance capability and the route planning capability of the automatic driving vehicle can be improved.
In some embodiments, determining the position of the vehicle in the image specifically comprises the following steps:
S11, identifying vehicles from the image. Vehicle identification in the picture can be performed by a trained convolutional neural network. This step mainly identifies where each vehicle is in the picture.
S12, acquiring calibration parameters of the camera. These parameters are calibrated when the camera is installed and commissioned; they describe the relationship between the camera view and actual positions on the road. Through the calibrated parameters, the position of an object in the image can be converted to its position on the actual road.
S13, determining the position information of each vehicle on the road, as the vehicle information, according to the calibration parameters of the camera and the position of the vehicle in the image. In this embodiment, the roadside unit determines the condition of the road once per period and broadcasts it to each vehicle; the vehicles then make a comprehensive judgment based on their own positioning data and the measurement data provided by the roadside unit.
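The per-period pipeline of S11 through S13 followed by a broadcast can be sketched as one cycle; the detect, locate, and broadcast callables are placeholders standing in for the steps above, and the stub data is assumed:

```python
# Illustrative per-cycle pipeline for the roadside unit: S11 detect
# vehicles in each camera frame, S12/S13 convert their pixel positions
# to road positions, then broadcast the result once per period.

def run_cycle(frames, detect, locate, broadcast):
    detections = []
    for camera_id, frame in frames.items():
        for plate, pixel in detect(frame):       # S11: find vehicles
            road_pos = locate(camera_id, pixel)  # S12 + S13: pixel -> road
            detections.append({"plate": plate, "pos": road_pos})
    broadcast(detections)  # one broadcast per measurement period
    return detections

# Stub usage with assumed data:
sent = []
result = run_cycle({"cam1": "frame-bytes"},
                   detect=lambda frame: [("A12345", (640, 360))],
                   locate=lambda cam, px: (12.5, 3.5),
                   broadcast=sent.append)
```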
In some embodiments, in order to facilitate the calculations of the autonomous vehicle, the head position information and parking space position information are provided directly, so that the autonomous vehicle can make decisions quickly. The vehicle information therefore includes the head position information and parking space position information of each vehicle.
In some embodiments, sending the position information of the vehicle to the vehicle controller corresponding to the vehicle information specifically comprises: broadcasting the position information of each vehicle to the area where the vehicles are located, so that each vehicle determines, according to the vehicle information associated with the position information, whether the information is its own.
In this embodiment, the information is distributed by broadcasting. In one example, as shown in fig. 2, the roadside unit locks onto the position of each license-plated vehicle by video tracking, and when broadcasting the road condition at a certain time, sends information such as: "vehicle A is located at position X of lane B; vehicle B is located at position Y of lane A; vehicle C is located at position Z of lane C".
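A minimal sketch of that broadcast message follows. The JSON field names and the timestamp are assumptions; the application does not specify a wire format:

```python
# Hypothetical encoding of the fig. 2 broadcast ("vehicle A is located
# at position X of lane B", etc.) as a JSON message. Field names are
# assumed, not specified by the application.
import json

def encode_road_broadcast(timestamp, vehicles):
    return json.dumps({"t": timestamp, "vehicles": vehicles})

msg = encode_road_broadcast(1650000000, [
    {"plate": "A", "lane": "B", "pos": "X"},
    {"plate": "B", "lane": "A", "pos": "Y"},
    {"plate": "C", "lane": "C", "pos": "Z"},
])
decoded = json.loads(msg)  # what a receiving vehicle would parse
```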
Referring to fig. 3, the embodiment discloses a mixed traffic flow fusion control system based on relative positions of high-precision maps, which includes:
the road side unit is used for acquiring the image in the area, determining the position of the vehicle and the vehicle information in the image, and sending the position information of the vehicle to a vehicle controller corresponding to the vehicle information according to the vehicle information; the vehicle information is at least one of a license plate and vehicle appearance characteristics.
And the vehicle is used for executing automatic driving according to the vehicle information and the positioning information acquired by the vehicle.
In this embodiment, the vehicle carries an automatic driving system and a positioning system that can communicate with positioning satellites (e.g., GPS or BeiDou). The roadside unit is provided with a camera and a communication component; the communication component mainly sends the road information measured by the roadside unit to the automatic driving system of the vehicle.
In this embodiment, the vehicle and the roadside unit are provided with the same map. The vehicle determines its rough position on the map using its own positioning module, and then determines its accurate position on the map from that rough position and the position information sent by the roadside unit. Because the roadside unit and the vehicle use the same map information, the positioning data can be processed with a fusion algorithm so that the vehicle's position on the map is obtained accurately; combined with the high-precision map, the actual environment of the current road can be reconstructed, providing the automatic driving system with accurate decision information that surpasses what the vehicle's own sensors can provide.
In some embodiments, determining the position of the vehicle in the image specifically comprises the following steps:
S11, identifying vehicles from the image. Vehicle identification in the picture can be performed by a trained convolutional neural network. This step mainly identifies where each vehicle is in the picture.
S12, acquiring calibration parameters of the camera. These parameters are calibrated when the camera is installed and commissioned; they describe the relationship between the camera view and actual positions on the road. Through the calibrated parameters, the position of an object in the image can be converted to its position on the actual road.
S13, determining the position information of each vehicle on the road, as the vehicle information, according to the calibration parameters of the camera and the position of the vehicle in the image. In this embodiment, the roadside unit determines the condition of the road once per period and broadcasts it to each vehicle; the vehicles then make a comprehensive judgment based on their own positioning data and the measurement data provided by the roadside unit.
In some embodiments, sending the position information of the vehicle to the vehicle controller corresponding to the vehicle information specifically comprises: broadcasting the position information of each vehicle to the area, so that each vehicle determines, according to the vehicle information associated with the position information, whether the information is its own. During broadcasting, the roadside unit binds the vehicle information and the position information of each vehicle together in a defined format; after receiving the broadcast data, a vehicle can determine the position information corresponding to itself, and it obtains the complete condition of the current road from the position information of the other vehicles.
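The receiving side of that binding can be sketched as follows: the vehicle splits the broadcast entries into its own position and those of the other vehicles. The entry format and plate values are assumed for illustration:

```python
# Hypothetical sketch of how a receiving vehicle splits the bound
# (vehicle info, position) entries into its own position and the
# positions of the other vehicles, per the description above.

def split_broadcast(entries, my_plate):
    mine, others = None, []
    for entry in entries:
        if entry["plate"] == my_plate:
            mine = entry  # the position information corresponding to self
        else:
            others.append(entry)  # the rest of the current road condition
    return mine, others

entries = [{"plate": "A12345", "pos": 112.7},
           {"plate": "B67890", "pos": 98.4}]
mine, others = split_broadcast(entries, "A12345")
```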
The integrated units described in this application may be stored in a computer-readable storage medium if they are implemented in the form of software functional units and sold or used as separate products. Based on this understanding, the technical solution of the present application, or the part of it that contributes over the prior art, may be embodied in whole or in part as a software product stored in a storage medium, including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present application and the technical principles employed. It will be understood by those skilled in the art that the present application is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the application. Therefore, although the present application has been described in more detail with reference to the above embodiments, the present application is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present application, and the scope of the present application is determined by the scope of the appended claims.

Claims (10)

1. A mixed traffic flow fusion control method based on the relative position of a high-precision map is applied to a road side unit and is characterized by comprising the following steps:
acquiring an image in the area, and determining the position and vehicle information of a vehicle in the image;
and according to the vehicle information, sending the position information of the vehicle to a vehicle controller corresponding to the vehicle information so that the vehicle can execute automatic driving according to the vehicle information and the positioning information acquired by the vehicle.
2. The mixed traffic flow fusion control method based on the relative position of the high-precision map according to claim 1, characterized in that the vehicle information in the image is determined, specifically:
and acquiring vehicle information from the image, wherein the vehicle information is at least one of a license plate and vehicle appearance characteristics.
3. The mixed traffic flow fusion control method based on the relative position of the high-precision map according to claim 1, characterized in that the position of the vehicle in the image is determined, specifically:
identifying a vehicle from the image;
acquiring calibration parameters of a camera;
and determining the position information of each vehicle in the road as the vehicle information according to the calibration parameters of the camera and the position of the vehicle in the image.
4. The mixed traffic flow fusion control method based on the high-precision map relative position according to claim 1, wherein the vehicle information includes head position information and parking space position information of each vehicle.
5. The mixed traffic flow fusion control method based on the relative position of the high-precision map according to claim 1, wherein the sending of the position information of the vehicle to the vehicle controller corresponding to the vehicle information is specifically:
and broadcasting the position information of each vehicle to the located area so that the vehicle determines whether the vehicle is the information of the vehicle according to the vehicle information related to the position information.
6. A mixed traffic flow fusion control system based on relative positions of high-precision maps is characterized by comprising:
the road side unit is used for acquiring the image in the area, determining the position of the vehicle and the vehicle information in the image, and sending the position information of the vehicle to a vehicle controller corresponding to the vehicle information according to the vehicle information;
and the vehicle is used for executing automatic driving according to the vehicle information and the positioning information acquired by the vehicle.
7. The mixed traffic flow fusion control system based on the relative position of the high-precision map as claimed in claim 6, wherein the vehicles and the road side unit are provided with the same map, the vehicles determine the coarse position of the vehicles on the map based on the positioning modules of the vehicles, and the vehicles determine the accurate position of the vehicles on the map according to the coarse position and the position information sent by the road side unit.
8. The mixed traffic flow fusion control system based on the relative position of the high-precision map according to claim 6, characterized in that the position of the vehicle in the image is determined, specifically:
identifying a vehicle from the image;
acquiring calibration parameters of a camera;
and determining the position information of each vehicle in the road as the vehicle information according to the calibration parameters of the camera and the position of the vehicle in the image.
9. The mixed traffic flow fusion control system based on the relative position of a high-precision map according to claim 6, wherein sending the position information of the vehicle to the vehicle controller corresponding to the vehicle information specifically comprises:
broadcasting the position information of each vehicle within the area, so that each vehicle determines, from the vehicle information associated with the position information, whether the information is its own.
10. The mixed traffic flow fusion control system based on the relative position of a high-precision map according to claim 6, wherein determining the vehicle information in the image specifically comprises:
acquiring the vehicle information from the image, the vehicle information being at least one of a license plate and a vehicle appearance feature.
CN202210390390.2A 2022-04-14 2022-04-14 Mixed traffic flow fusion control method and system based on high-precision map relative position Pending CN114999198A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210390390.2A CN114999198A (en) 2022-04-14 2022-04-14 Mixed traffic flow fusion control method and system based on high-precision map relative position

Publications (1)

Publication Number Publication Date
CN114999198A true CN114999198A (en) 2022-09-02

Family

ID=83023554

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210390390.2A Pending CN114999198A (en) 2022-04-14 2022-04-14 Mixed traffic flow fusion control method and system based on high-precision map relative position

Country Status (1)

Country Link
CN (1) CN114999198A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107063275A (en) * 2017-03-24 2017-08-18 重庆邮电大学 Intelligent vehicle map emerging system and method based on roadside device
CN108010360A (en) * 2017-12-27 2018-05-08 中电海康集团有限公司 A kind of automatic Pilot context aware systems based on bus or train route collaboration
CN111787481A (en) * 2020-06-17 2020-10-16 北京航空航天大学 Road-vehicle coordination high-precision sensing method based on 5G
CN112729316A (en) * 2019-10-14 2021-04-30 北京图森智途科技有限公司 Positioning method and device of automatic driving vehicle, vehicle-mounted equipment, system and vehicle
US20210195365A1 (en) * 2020-05-11 2021-06-24 Beijing Baidu Netcom Science Technology Co., Ltd. Positioning method and device, on-board equipment, vehicle, and positioning system
CN113034952A (en) * 2021-03-01 2021-06-25 长沙理工大学 Road traffic safety real-time early warning system based on vehicle-road cooperation
CN113503880A (en) * 2021-06-15 2021-10-15 银隆新能源股份有限公司 Vehicle positioning method and device
CN113959457A (en) * 2021-10-20 2022-01-21 中国第一汽车股份有限公司 Positioning method and device for automatic driving vehicle, vehicle and medium
CN114279453A (en) * 2022-03-04 2022-04-05 智道网联科技(北京)有限公司 Automatic driving vehicle positioning method and device based on vehicle-road cooperation and electronic equipment

Similar Documents

Publication Publication Date Title
US20180288320A1 (en) Camera Fields of View for Object Detection
CN111508260A (en) Vehicle parking space detection method, device and system
CN111025308B (en) Vehicle positioning method, device, system and storage medium
WO2020057407A1 (en) Vehicle navigation assistance method and system
JP2021099793A (en) Intelligent traffic control system and control method for the same
JP6973351B2 (en) Sensor calibration method and sensor calibration device
CN111354214B (en) Auxiliary parking method and system
JP2007010335A (en) Vehicle position detecting device and system
CN114764876A (en) Evaluation method and evaluation device for perception fusion algorithm
EP3994043A1 (en) Sourced lateral offset for adas or ad features
CN116572995B (en) Automatic driving method and device of vehicle and vehicle
CN113743709A (en) Online perceptual performance assessment for autonomous and semi-autonomous vehicles
JP7136138B2 (en) Map generation data collection device, map generation data collection method, and vehicle
CN112150576B (en) High-precision vector map acquisition system and method
CN114999198A (en) Mixed traffic flow fusion control method and system based on high-precision map relative position
CN113852925A (en) Vehicle command method and system
CN115909795B (en) Autonomous parking system and method based on parking lot cooperation
JP2022056153A (en) Temporary stop detection device, temporary stop detection system, and temporary stop detection program
CN113771845A (en) Method, device, vehicle and storage medium for predicting vehicle track
JP6933069B2 (en) Pathfinding device
JP7241582B2 (en) MOBILE POSITION DETECTION METHOD AND MOBILE POSITION DETECTION SYSTEM
RU121950U1 (en) MOBILE VEHICLE CONTROL POST
CN114216469B (en) Method for updating high-precision map, intelligent base station and storage medium
US20220017095A1 (en) Vehicle-based data acquisition
US11393222B2 (en) Vehicle management system, vehicle-mounted device, vehicle management method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination