CN110602849B - Radar-based smart street lamp dynamic object capture system - Google Patents

Radar-based smart street lamp dynamic object capture system

Info

Publication number
CN110602849B
CN110602849B
Authority
CN
China
Prior art keywords
radar
vehicle
road
positioning
under test
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910796594.4A
Other languages
Chinese (zh)
Other versions
CN110602849A (en)
Inventor
李彦星
曾卫华
张远东
田峰
蔡健
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanxi Coal Geological Exploration And Painting Institute
China University of Geosciences Beijing
Original Assignee
Shanxi Coal Geological Exploration And Painting Institute
China University of Geosciences Beijing
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanxi Coal Geological Exploration And Painting Institute, China University of Geosciences Beijing filed Critical Shanxi Coal Geological Exploration And Painting Institute
Priority to CN201910796594.4A priority Critical patent/CN110602849B/en
Publication of CN110602849A publication Critical patent/CN110602849A/en
Application granted granted Critical
Publication of CN110602849B publication Critical patent/CN110602849B/en

Links

Images

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02 - Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/50 - Systems of measurement based on relative movement of target
    • G01S13/58 - Velocity or trajectory determination systems; Sense-of-movement determination systems
    • G01S13/588 - Velocity or trajectory determination systems; Sense-of-movement determination systems deriving the velocity value from the range measurement
    • G01S15/00 - Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/02 - Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
    • G01S15/06 - Systems determining the position data of a target
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B - CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00 - Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40 - Control techniques providing energy savings, e.g. smart controller or presence detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Traffic Control Systems (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention provides a radar-based smart street lamp dynamic object capture system. The system includes a plurality of street lamps and a radar control subsystem. Each street lamp is provided with a radar detector and a communicator. The radar detector is disposed facing a road and is used to detect an object moving on the road to generate a detection signal. The radar control subsystem comprises a modeling module, a real-time positioning module and a dynamic capture module. The invention enables dynamic capture of a vehicle under test and generation of its motion trajectory, making it possible to collect road traffic data and providing a basis for recording road condition information in greater detail.

Description

Radar-based smart street lamp dynamic object capture system
Technical Field
The invention relates to a municipal street lamp system, in particular to a radar-based smart street lamp dynamic object capture system.
Background
A vehicle speed detector is an instrument for measuring the speed of a moving vehicle. The most common type is the hand-held radar Doppler detector, which is inexpensive and practical; it is shaped like a pistol and is commonly known as a "radar gun". Its principle is based on the Doppler effect: the vehicle speed is proportional to the change in microwave frequency. The detector emits microwaves, and the Doppler shift of the reflected wave indicates the position and speed of the vehicle. Traffic police may operate it at the roadside or from a patrol vehicle, and penalize any driver found speeding. Road camera systems record vehicle condition information through image acquisition to provide a basis for recording traffic information, but at present no system exists that can replace the camera system for detecting vehicle condition information.
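For concreteness, the standard radar Doppler relation (a textbook formula, not quoted from the patent) linking the frequency shift to the vehicle speed is:

$$f_d = \frac{2 v f_0}{c}, \qquad v = \frac{c\, f_d}{2 f_0},$$

where $f_0$ is the carrier frequency and $c$ the propagation speed. For example, at $f_0 = 24\ \mathrm{GHz}$ a vehicle approaching at $v = 20\ \mathrm{m/s}$ (72 km/h) produces a shift of $f_d = 2 \cdot 20 \cdot 24\times 10^9 / (3\times 10^8) \approx 3.2\ \mathrm{kHz}$.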
Disclosure of Invention
The invention enables dynamic capture of a vehicle under test and generation of its motion trajectory, making it possible to collect road traffic data and providing a basis for recording road condition information in greater detail.
In one aspect of the invention, a radar-based smart street light dynamic object capture system is provided. The system may include a plurality of street lamps, and a radar control subsystem.
In some examples, each street lamp is provided with a radar detector and a communicator; the radar detector is disposed facing the road and is configured to detect a vehicle under test on the road to generate a detection signal.
In some examples, the radar control subsystem includes a modeling module, a real-time positioning module, and a dynamic capture module.
In some examples, the modeling module is configured with a modeling strategy, which is configured to build a road coordinate model and to mark the location of each radar detector in the road coordinate model.
In some examples, the real-time positioning module is configured with a positioning calculation unit and a positioning strategy; wherein the positioning calculation unit is configured to process the detection signal of each radar detector to generate a corresponding vehicle-radar positional relationship in real time, the vehicle-radar positional relationship reflecting, within the detection area of the radar detector, the positional relationship between a vehicle under test on the road and the radar detector.
In some examples, the positioning strategy includes a positioning algorithm and a shape detection algorithm. The positioning algorithm is configured to determine an independent position measurement of the vehicle under test in the road coordinate model according to each of the vehicle-radar positional relationships; the shape detection algorithm is configured to generate an independent shape measurement of the vehicle under test in the road coordinate model based on the detection signal of the radar detector, the independent shape measurement reflecting the shape of the vehicle under test.
In some examples, the dynamic capture module is configured with a capture strategy and a follow-up strategy. The capture strategy is configured to generate, each time an independent position measurement and an independent shape measurement of the vehicle under test are produced, position mark information in the road coordinate model corresponding to that independent position measurement. The follow-up strategy is configured to receive and update the position mark information so as to generate vehicle track information of the vehicle under test in the road coordinate model, the vehicle track information reflecting the motion track of the vehicle under test on the road.
Drawings
The novel features believed characteristic of the invention are set forth with particularity in the appended claims. A better understanding of the features and advantages of the present invention will be obtained by reference to the following detailed description that sets forth illustrative embodiments, in which the principles of the invention are utilized, and the accompanying drawings of which:
FIG. 1 is a schematic view of an arrangement of street lamps according to an exemplary embodiment of the present invention;
FIG. 2 is a system architecture schematic of a radar-based smart street lamp dynamic object capture system according to an exemplary embodiment of the present invention; and
FIG. 3 is a policy flow diagram according to an exemplary embodiment of the present invention.
Reference numerals: 1. street lamp; 11. radar detector; 12. communicator; 2. radar control subsystem; 21. modeling module; 22. real-time positioning module; 23. dynamic capture module; S1, modeling strategy; S2, positioning strategy; S3, capture strategy; S4, follow-up strategy.
Detailed Description
While preferred embodiments of the present invention have been shown and described herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. Numerous variations, changes, and substitutions will now occur to those skilled in the art without departing from the invention. It should be understood that various alternatives to the embodiments of the invention described herein may be employed in practicing the invention.
Referring to FIG. 1, the radar-based smart street lamp dynamic object capture system of the present invention includes a plurality of street lamps 1. Each street lamp 1 may be provided with a radar detector 11 and a communicator 12. The radar detector 11 is disposed facing the road and is used to detect an object moving on the road, generating a detection signal for the vehicle under test. The radar detector can detect the vehicle using the Doppler principle, and the bearing of the vehicle can be determined from the direction of the reflected signal (the direction of the road serves as an auxiliary basis for judging the position). The communicator 12 enables data interaction among the street lamps 1 and between the street lamps and a control system (for example, the radar control subsystem described below). For example, the communicator may include a cellular communication module, such as a 4G module. In some examples, one radar control subsystem 2 may communicate with at least 16 street lamps 1 via their communicators 12.
The radar-based smart street lamp dynamic object capture system further comprises a radar control subsystem 2. The radar control subsystem 2 may include a modeling module 21, a real-time positioning module 22, and a dynamic capture module 23.
In some examples, the modeling module 21 is configured with a modeling strategy S1. The modeling strategy S1 is configured to create a road coordinate model and to mark the position of each radar detector 11 in the road coordinate model. In addition to the basic road coordinate system, the road coordinate model stores the specific position coordinates of each radar detector, so the relative positional relationships between detectors can be determined. The detection range of each radar detector can also be determined from its parameters, establishing a basis for judging the position of a vehicle.
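As an illustrative sketch only (the patent does not specify data structures; the class and field names below are hypothetical), the road coordinate model with registered detector positions and detection ranges might look like this in Python:

```python
from dataclasses import dataclass, field

@dataclass
class RadarDetector:
    detector_id: int
    x: float          # position along the road axis, meters
    y: float          # lateral offset from the road centerline, meters
    max_range: float  # detection range derived from detector parameters, meters

    def covers(self, px: float, py: float) -> bool:
        """Whether a road-coordinate point lies inside this detector's area."""
        return (px - self.x) ** 2 + (py - self.y) ** 2 <= self.max_range ** 2

@dataclass
class RoadCoordinateModel:
    road_length: float
    road_width: float
    detectors: dict[int, RadarDetector] = field(default_factory=dict)

    def register(self, det: RadarDetector) -> None:
        """Modeling strategy S1: mark the detector's position in the model."""
        self.detectors[det.detector_id] = det

    def detectors_covering(self, px: float, py: float) -> list[RadarDetector]:
        """Detectors whose areas contain the point; points covered by more
        than one detector form the overlap regions used later for shape."""
        return [d for d in self.detectors.values() if d.covers(px, py)]
```

Registering detectors every 30 m with, say, a 25 m range would yield overlap regions near the midpoints between adjacent lamps.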
In some examples, the real-time positioning module 22 is configured with a positioning strategy S2 and a positioning calculation unit. The positioning calculation unit is configured to process the detection signal of each radar detector 11 so as to generate a corresponding vehicle-radar positional relationship in real time. The vehicle-radar positional relationship reflects, within the detection area of the radar detector 11, the positional relationship between the radar detector and the vehicle under test on the road. The positioning strategy S2 may include a positioning algorithm for determining an independent position measurement of the vehicle under test in the road coordinate model (i.e., one radar detector's measurement of the vehicle position) from each vehicle-radar positional relationship. In some examples, the positioning strategy S2 may further include a shape detection algorithm for generating an independent shape measurement of the vehicle under test in the road coordinate model (i.e., one radar detector's measurement of the vehicle shape) from the detection signal of the radar detector 11, the independent shape measurement reflecting the shape of the vehicle under test. The shape of the vehicle under test is an intermediate result used in generating the trajectory: it facilitates generating vehicle track information of the vehicle under test in the road coordinate model, the vehicle track information reflecting the motion track of the vehicle on the road.
There are two algorithms for locating the position of a vehicle using a radar detector. In the first algorithm, each radar detector comprises a plurality of radar detection units arrayed at different angles. The bearing of the vehicle is determined from the angle at which the reflected signal is detected, and its distance from the round-trip time of the signal, thereby positioning the vehicle; at the same time, the corresponding vehicle shape can be determined from the number and positions of the detection units that receive reflected signals. In the second algorithm, the radar detection unit transmits a signal that exactly covers the detection area; since the driving direction of the vehicle is known, the vehicle is judged to be at the entry boundary of the detection area when the radar first detects a reflected signal, thereby achieving position detection. Used alone, both methods suffer from large measurement error.
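A minimal sketch of the first algorithm, assuming each detection unit reports the bearing of the reflected signal and the echo round-trip time (the function name and numbers are ours, not the patent's):

```python
import math

C = 3.0e8  # propagation speed of the radar signal, m/s

def locate_vehicle(detector_x: float, detector_y: float,
                   bearing_rad: float, round_trip_s: float) -> tuple[float, float]:
    """Bearing from the reflected-signal angle, distance from the
    round-trip time; returns the vehicle position in road coordinates."""
    distance = C * round_trip_s / 2.0  # one-way range
    return (detector_x + distance * math.cos(bearing_rad),
            detector_y + distance * math.sin(bearing_rad))

# A detector at (100 m, 4 m) sees an echo at 30 degrees after a
# 0.4 microsecond round trip: the target is about 60 m away.
print(locate_vehicle(100.0, 4.0, math.radians(30), 0.4e-6))
```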
In some examples of the invention, the dynamic capture module 23 is configured with a capture strategy S3 and a follow-up strategy S4. The capture strategy S3 is configured to: generate, in the road coordinate model, position mark information corresponding to an independent position measurement result of the vehicle under test each time the independent position measurement result and the independent shape measurement result of the vehicle under test are generated. The follow-up strategy S4 is configured to: receive and update the position mark information so as to generate vehicle track information of the vehicle under test in the road coordinate model, the vehicle track information reflecting the motion track of the vehicle under test on the road. Because the vehicle under test moves quickly, fully real-time dynamic capture would consume a large amount of the system's computing resources. The present approach instead acquires and updates the position and shape information of the vehicle under test one measurement at a time, at intervals, so the position and shape of the vehicle can still be monitored continuously. Compared with real-time vehicle monitoring based on video streams, the system of the invention processes far less data while still ensuring reliable vehicle monitoring and trajectory generation.
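A sketch of how the capture strategy S3 and follow-up strategy S4 could cooperate, using an in-memory store and assumed names; the timestamp-based speed derivation of the next paragraph is included as well:

```python
import math
import time
from collections import defaultdict

# vehicle_id -> list of (timestamp, x, y, shape) position markers
tracks: dict[int, list[tuple[float, float, float, object]]] = defaultdict(list)

def capture(vehicle_id: int, x: float, y: float, shape: object) -> None:
    """Capture strategy S3: each independent position/shape measurement
    becomes timestamped position mark information in the coordinate model."""
    tracks[vehicle_id].append((time.time(), x, y, shape))

def trajectory(vehicle_id: int) -> list[tuple[float, float]]:
    """Follow-up strategy S4: the updated marks, in time order, form the
    vehicle track information."""
    marks = sorted(tracks[vehicle_id], key=lambda m: m[0])
    return [(x, y) for _, x, y, _ in marks]

def speed(vehicle_id: int) -> float:
    """Vehicle speed from the two most recent timestamped marks."""
    (t0, x0, y0, _), (t1, x1, y1, _) = sorted(tracks[vehicle_id],
                                              key=lambda m: m[0])[-2:]
    return math.hypot(x1 - x0, y1 - y0) / (t1 - t0)
```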
In some examples of the invention, the capture strategy S3 is further configured to establish a timestamp for each piece of position mark information, and the follow-up strategy S4 is further configured to generate the vehicle speed information of the vehicle under test according to the vehicle track information and the timestamps.
In some examples of the present invention, the detection areas of the radar detectors 11 have overlapping areas, so that the monitoring range seamlessly covers the entire road. The shape detection algorithm is further configured to: determine the independent shape measurement result of the vehicle under test in the road coordinate model each time the vehicle under test passes through an overlap region.
In some examples of the invention, the shape detection algorithm is further configured to: when the vehicle under test passes through the overlap region, the corresponding radar detectors detect obstruction regions and correlate the detected obstruction regions in the road coordinate model to generate the independent shape measurement of the vehicle under test. Several obstruction regions may be determined to be related if they are contiguous in the road coordinate model. Because the radars differ in number and orientation, each obtains its own measurement of the obstructed region; the detected obstruction regions serve as an intermediate result for generating shape information, so the position and shape of the vehicle under test can be monitored continuously.
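A sketch of correlating obstruction regions, simplified to intervals along the road axis (the patent fixes no representation; the gap tolerance is an assumed parameter):

```python
def correlate_obstructions(regions: list[tuple[float, float]],
                           gap_tol: float = 0.5) -> list[tuple[float, float]]:
    """Merge obstruction regions reported by differently placed radars when
    they are contiguous in the road coordinate model; each merged interval
    is one independent shape measurement of a vehicle under test."""
    merged: list[tuple[float, float]] = []
    for start, end in sorted(regions):
        if merged and start - merged[-1][1] <= gap_tol:
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))  # contiguous
        else:
            merged.append((start, end))
    return merged

# Two detectors in an overlap region report [10.0, 13.9] and [14.1, 16.0];
# the intervals are contiguous within tolerance: one vehicle about 6 m long.
print(correlate_obstructions([(10.0, 13.9), (14.1, 16.0)]))
```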
In some examples of the invention, the modeling strategy S1 is configured to: acquire road surface structure information of the road through the radar detectors, and establish the road coordinate model according to the road surface structure information.
In some examples, the positioning strategy S2 is further configured with a context data table. The context data table stores context information and, for each piece of context information, a corresponding real-time positioning algorithm. The positioning strategy S2 determines the context information from current environmental factors and thereby selects the corresponding positioning algorithm. In some examples, the context information includes a vehicle speed factor reflecting the speed of the vehicle under test and a distance factor reflecting the distance between the vehicle under test and the radar detector 11. In some examples, the context information includes a temperature factor reflecting the temperature of the current environment and a humidity factor reflecting the humidity of the current environment.
The reason for providing the context data table is that the accuracy of radar ranging is affected by environmental factors. By establishing the corresponding context table, different positioning algorithms can be configured for different contexts, so detection precision can be ensured under a variety of external environments. For example, the context data table may store 10 environmental factors and 10 corresponding positioning algorithms; at a humidity of 40% and a temperature of 37 degrees Celsius, the matching positioning algorithm is used to measure the vehicle position, ensuring measurement accuracy in that particular environment.
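A sketch of such a table lookup; the bin boundaries and the per-context algorithms are placeholders (the patent only states that each context maps to its own positioning algorithm):

```python
from typing import Callable

def algo_cold_dry(distance_raw: float) -> float:
    return distance_raw * 1.000  # hypothetical correction for cold, dry air

def algo_warm_dry(distance_raw: float) -> float:
    return distance_raw * 0.999  # hypothetical correction for warm, dry air

def algo_warm_humid(distance_raw: float) -> float:
    return distance_raw * 0.997  # hypothetical correction for warm, humid air

# Context bin ((temp_lo, temp_hi), (hum_lo, hum_hi)) -> positioning algorithm.
context_table: dict[tuple[tuple[float, float], tuple[float, float]],
                    Callable[[float], float]] = {
    ((-20.0, 10.0), (0.0, 60.0)):   algo_cold_dry,
    ((10.0, 45.0),  (0.0, 60.0)):   algo_warm_dry,
    ((10.0, 45.0),  (60.0, 100.0)): algo_warm_humid,
}

def select_algorithm(temp_c: float, humidity_pct: float) -> Callable[[float], float]:
    """Positioning strategy S2: pick the algorithm whose context bin matches
    the current environmental factors."""
    for ((t_lo, t_hi), (h_lo, h_hi)), algo in context_table.items():
        if t_lo <= temp_c < t_hi and h_lo <= humidity_pct < h_hi:
            return algo
    raise LookupError("no positioning algorithm configured for this context")

# 37 degrees Celsius and 40% humidity select the warm/dry algorithm.
print(select_algorithm(37.0, 40.0)(60.0))
```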
In an exemplary embodiment of the invention, for a given vehicle under test, the set of position coordinates measured by the radar detectors and the radar control subsystem is

$$\{(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)\}.$$

Assume the position coordinate of a radar detector is $(x_0, y_0)$. The distance factor is

$$d = \sqrt{(x - x_0)^2 + (y - y_0)^2},$$

and the set of calculation formulas is

$$d_i = \sqrt{(x_i - x_0)^2 + (y_i - y_0)^2}, \quad i = 1, 2, \ldots, n.$$

Because the vehicle speed factor is to be calculated, the driving trajectory of the vehicle under test must first be simulated; simulating the trajectory from a finite number of positioning points introduces error, which can be reduced by least-squares fitting. In the polynomial function class

$$\varphi(x) = a_0 + a_1 x + a_2 x^2 + \cdots + a_m x^m,$$

find a function

$$f(x) = \sum_{j=0}^{m} a_j x^j$$

such that the sum of squared errors is minimized, i.e.,

$$\min_{a_0, \ldots, a_m} \sum_{i=1}^{n} \left[ f(x_i) - y_i \right]^2,$$

from which the coefficients $a_0, a_1, \ldots, a_m$ are calculated. The vehicle speed factor at a specified time $t$ can then be obtained by differentiating the fitted trajectory at the time $t$.
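A sketch of the least-squares fit and the speed derivation with NumPy, fitting each coordinate against the marker timestamps (the polynomial degree and the per-coordinate parameterization are our assumptions, since the formula images in the source are not recoverable):

```python
import numpy as np

def speed_at(t_query: float, t: np.ndarray,
             x: np.ndarray, y: np.ndarray, degree: int = 3) -> float:
    """Least-squares polynomial fit of the positioning points, then the
    speed at t_query from the derivatives of the fitted trajectory."""
    px = np.polynomial.Polynomial.fit(t, x, degree)  # x(t)
    py = np.polynomial.Polynomial.fit(t, y, degree)  # y(t)
    return float(np.hypot(px.deriv()(t_query), py.deriv()(t_query)))

# Five timestamped position marks of a vehicle moving at roughly 15 m/s.
t = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
x = np.array([0.0, 7.4, 15.1, 22.4, 30.2])
y = np.array([3.5, 3.5, 3.6, 3.5, 3.5])
print(speed_at(1.0, t, x, y))  # approximately 15 m/s
```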
While preferred embodiments of the present invention have been shown and described herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. Numerous modifications, changes, and substitutions will now occur to those skilled in the art without departing from the invention. It should be understood that various alternatives to the embodiments of the invention described herein may be employed in practicing the invention. It is intended that the following claims define the scope of the invention and that methods and structures within the scope of these claims and their equivalents be covered thereby.

Claims (9)

1. A radar-based smart street lamp dynamic object capturing system, comprising a plurality of street lamps and a radar control subsystem, wherein each street lamp is provided with a radar detector and a communicator, the radar detector being arranged facing a road and used for detecting a vehicle under test on the road to generate a detection signal;
the radar control subsystem comprises a modeling module, a real-time positioning module and a dynamic capturing module;
the modeling module is configured with a modeling strategy, and the modeling strategy is configured to establish a road coordinate model and to mark the position of each radar detector in the road coordinate model;
the real-time positioning module is provided with a positioning calculation unit and a positioning strategy;
the method is characterized in that:
the positioning calculation unit is configured to process the detection signal of each radar detector to generate a corresponding vehicle-radar positional relationship in real time, wherein the vehicle-radar positional relationship reflects, within the detection area of the radar detector, the positional relationship between a vehicle under test on the road and the radar detector;
the positioning strategy comprises a positioning algorithm and a shape detection algorithm, the positioning algorithm being configured to determine an independent position measurement result of the vehicle under test in the road coordinate model according to each vehicle-radar positional relationship;
the shape detection algorithm is configured to generate an independent shape measurement result of the vehicle under test in the road coordinate model according to the detection signal of the radar detector, the independent shape measurement result reflecting the shape of the vehicle under test;
the dynamic capture module is configured with a capture strategy and a follow-up strategy; wherein the capture strategy is configured to: generate, in the road coordinate model, position mark information corresponding to the independent position measurement result of the vehicle under test each time the independent position measurement result and the independent shape measurement result of the vehicle under test are generated;
the follow-up strategy is configured to: receive and update the position mark information so as to generate vehicle track information of the vehicle under test in the road coordinate model, wherein the vehicle track information reflects the motion track of the vehicle under test on the road;
wherein the detection areas of the plurality of radar detectors have overlapping areas; the shape detection algorithm is further configured to: determine the independent shape measurement result of the vehicle under test in the road coordinate model each time the vehicle under test passes through an overlap region.
2. The radar-based smart street lamp dynamic object capturing system of claim 1, wherein:
the capture policy is further configured to: establishing a timestamp for each of the location marker information; the follow-up policy is further configured to: and generating the vehicle speed information of the detected vehicle according to the vehicle track information and the time stamp.
3. The radar-based smart street lamp dynamic object capturing system of claim 1, wherein:
the shape detection algorithm is further configured to: when the vehicle under test passes through the overlap region, the corresponding radar detector detects an obstruction region and correlates the detected obstruction region in the road coordinate model to generate the independent shape measurement of the vehicle under test.
4. The radar-based smart street lamp dynamic object capturing system of claim 1, wherein:
the modeling strategy is configured to: acquire road surface structure information of the road through the radar detector, and establish the road coordinate model according to the road surface structure information.
5. The radar-based smart street lamp dynamic object capturing system of claim 1, wherein:
the positioning strategy is also provided with a situation data table, and situation information and the positioning algorithm corresponding to each situation information are stored in the situation data table; the positioning policy is further configured to determine the context information according to current environmental factors, thereby determining the corresponding positioning algorithm.
6. The radar-based smart street lamp dynamic object capturing system of claim 5, wherein:
the situation information comprises a vehicle speed factor and a distance factor; the vehicle speed factor reflects the vehicle speed of the detected vehicle, and the distance factor reflects the distance between the detected vehicle and the radar detector.
7. The radar-based smart street lamp dynamic object capturing system of claim 5, wherein:
the context information comprises a temperature factor and a humidity factor; the temperature factor reflects a temperature of a current environment and the humidity factor reflects a humidity of the current environment.
8. The radar-based smart street lamp dynamic object capturing system of claim 1, wherein:
the communicator is configured with a cellular communication module.
9. The radar-based smart street lamp dynamic object capturing system of claim 1, wherein:
one said radar control subsystem is in communication with at least 16 said street lamps.
CN201910796594.4A 2019-08-27 2019-08-27 Radar-based smart street lamp dynamic object capture system Active CN110602849B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910796594.4A CN110602849B (en) 2019-08-27 2019-08-27 Radar-based smart street lamp dynamic object capture system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910796594.4A CN110602849B (en) 2019-08-27 2019-08-27 Radar-based smart street lamp dynamic object capture system

Publications (2)

Publication Number Publication Date
CN110602849A CN110602849A (en) 2019-12-20
CN110602849B true CN110602849B (en) 2020-11-06

Family

ID=68855810

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910796594.4A Active CN110602849B (en) 2019-08-27 2019-08-27 Radar-based smart street lamp dynamic object capture system

Country Status (1)

Country Link
CN (1) CN110602849B (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000241542A (en) * 1999-02-25 2000-09-08 Mitsubishi Electric Corp Movable body-tracking device
CN108922188B (en) * 2018-07-24 2020-12-29 河北德冠隆电子科技有限公司 Radar tracking and positioning four-dimensional live-action traffic road condition perception early warning monitoring management system
CN108986450B (en) * 2018-07-25 2024-01-16 北京万集科技股份有限公司 Vehicle environment sensing method, terminal and system
CN109285373B (en) * 2018-08-31 2020-08-14 南京锦和佳鑫信息科技有限公司 Intelligent network traffic system for whole road network
CN109584568B (en) * 2018-12-29 2022-05-03 浙江方大智控科技有限公司 Intelligent traffic radar monitoring system

Also Published As

Publication number Publication date
CN110602849A (en) 2019-12-20

Similar Documents

Publication Publication Date Title
US11113966B2 (en) Vehicular information systems and methods
KR102193950B1 (en) Vehicle and sensing device of utilizing spatial information acquired using sensor, and server for the same
CN103176185B (en) Method and system for detecting road barrier
CN111025308B (en) Vehicle positioning method, device, system and storage medium
US20190369212A1 (en) Determining specular reflectivity characteristics using lidar
CN110356339B (en) Lane change blind area monitoring method and system and vehicle
EP3830604B1 (en) Lidar system design to mitigate lidar crosstalk
JP2020535410A (en) Methods and systems for mapping and locating vehicles based on radar measurements
CN110602848B (en) Wisdom street lamp control system based on developments are caught
CN110853356A (en) Vehicle lane change detection method based on radar and video linkage
US20210398425A1 (en) Vehicular information systems and methods
CN110602849B (en) Wisdom street lamp developments thing capture system based on radar
CN110596656B (en) Intelligent street lamp feedback compensation system based on big data
CN110531352B (en) Driving state feedback system based on intelligent street lamp
CN110596690B (en) Speed of a motor vehicle detects linked system based on street lamp
CN110599779B (en) Intelligent street lamp self-checking system based on vehicle speed analysis
WO2021115273A1 (en) Communication method and apparatus
JP2011164071A (en) Device and method for managing object
WO2024013839A1 (en) Object recognition device and object recognition method
US11995986B2 (en) Vehicular information systems and methods
CN115148018B (en) Traffic event detection device and method
WO2021095269A1 (en) Information generation device, information generation method, and computer program
Kälin et al. Highly Accurate Pose Estimation as a Reference for Autonomous Vehicles in Near-Range Scenarios. Remote Sens. 2022, 14, 90
CN115930976A (en) Data fusion method and perception fusion system
CN115421136A (en) Vehicle detection system and detection method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant