CN219574949U - Intelligent traffic perception integrated device based on binocular depth fusion of radar and sound - Google Patents

Intelligent traffic perception integrated device based on binocular depth fusion of radar and sound

Info

Publication number
CN219574949U
CN219574949U (application CN202320140382.2U)
Authority
CN
China
Prior art keywords
module
radar
sound
processing unit
integrated device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202320140382.2U
Other languages
Chinese (zh)
Inventor
刘建
陈杰
丁勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huanuo Xingkong Technology Co ltd
Original Assignee
Hunan Huanuo Xingkong Electronic Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan Huanuo Xingkong Electronic Technology Co ltd filed Critical Hunan Huanuo Xingkong Electronic Technology Co ltd
Priority to CN202320140382.2U priority Critical patent/CN219574949U/en
Application granted granted Critical
Publication of CN219574949U publication Critical patent/CN219574949U/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • Y — GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 — TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A — TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 90/00 — Technologies having an indirect contribution to adaptation to climate change
    • Y02A 90/10 — Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Abstract

The utility model relates to the field of intelligent traffic detection and discloses an intelligent traffic perception integrated device based on radar-sound binocular depth fusion, intended to address the currently low accuracy of detecting traffic events such as motor vehicle overspeed and vehicle noise pollution. The device is a traffic sensor integrating a dual-lens camera, a pickup (microphone) array, a millimeter-wave radar and a high-performance processor. The dual-lens camera module, the pickup array module and the radar module transmit the collected video stream, sound source data stream and radar data stream to a central processing unit in an edge computing processing module for signal-level processing, realising depth fusion and mutual verification of video targets, sound source targets and radar targets, and achieving high-resolution, highly penetrating real-time monitoring of motor vehicles, non-motor vehicles, pedestrians and other road users within the detection range.

Description

Intelligent traffic perception integrated device based on binocular depth fusion of radar and sound
Technical Field
The utility model relates to the field of intelligent traffic detection, and in particular to an intelligent traffic perception integrated device based on radar-sound binocular depth fusion.
Background
With the rapid growth in motor vehicle ownership and travel demand, road traffic has become increasingly complex, and urban traffic noise pollution, motor vehicle overspeed, traffic accidents and the like have become major problems for road traffic management, obstructing the normal operation of roads. Current traffic detection devices rely mainly on video (for example video event detection systems and video structuring), and in recent years millimeter-wave radars have also been widely trialled and partially deployed on traffic roads. Each technology, however, has shortcomings. Video suffers from a short detection distance, poor night-time performance, and an inability to localise targets or hand over tracking. Radar and lidar are strongly affected by the scene, tracking is degraded by occlusion, and they provide no visible image. Devices combining radar and video also exist, and in theory the two modalities complement each other and can improve detection accuracy. In practice, however, the video and radar sensors used in the security field differ greatly in frequency and detection distance, front-end fusion of the two is technically demanding, and most current road traffic detection products simply fuse an older security-video or video-structuring system with radar through back-end algorithms, which falls short of the ideal and can even perform worse than video structuring alone. Meanwhile, the existing whistle-detection snapshot systems, overspeed-detection snapshot systems and various traffic event detection systems operate independently and lack linkage, leading to a proliferation of separate roadside poles. The low degree of fusion between radar, video and other technologies, and the resulting low accuracy in detecting traffic events such as motor vehicle overspeed and noise pollution, are therefore the main technical problems at present.
In view of the foregoing, there is a need in the art for a device that can accurately identify and efficiently detect urban traffic events such as vehicle noise pollution and motor vehicle overspeed.
Disclosure of Invention
The utility model provides an intelligent traffic perception integrated device based on radar-sound binocular depth fusion, to address the technical problems that, in the prior art, video and radar technologies cannot be well fused and the accuracy of detecting urban traffic conditions such as noise pollution and motor vehicle overspeed is low.
To solve the above technical problems, the technical solution provided by the utility model is as follows: an intelligent traffic perception integrated device based on radar-sound binocular depth fusion comprises a housing with an accommodating space formed inside, an edge computing processing module and a control module; a dual-lens camera module, a radar module and a pickup array module are each connected to the edge computing processing module by circuitry.
The radar module adopts a dual radio-frequency antenna design and is used to detect target vehicles and form a radar data stream.
The pickup array module is used to discriminate the honking behaviour of a whistling vehicle, locate the whistling vehicle and form a sound source data stream.
The dual-lens camera module comprises a long-focus lens, a short-focus lens and an ISP sub-module; the long-focus and short-focus lenses are mounted in parallel, with the long-focus lens mounted above the short-focus lens, and the ISP sub-module performs video image processing to form a video stream.
The edge computing processing module comprises a central processing unit that performs signal-level processing on the video stream, the sound source data stream and the radar data stream.
Further, the intelligent traffic perception integrated device based on radar-sound binocular depth fusion further comprises a wireless transmission module and an alarm module, each connected to the edge computing processing module.
The wireless transmission module is connected to an antenna outside the housing for network communication; the alarm module is used to transmit an alarm signal to alarm equipment outside the housing.
Further, the edge computing processing module further includes:
a 16 TOPS high computing-power chip connected to the central processing unit; a level conversion section connecting the central processing unit and the radar module; a first network transformer connecting the dual-lens camera module with the central processing unit; a second network transformer connecting the pickup array module with the central processing unit; and a third network transformer connecting the wireless transmission module with the central processing unit.
Further, the level conversion section is connected to the central processing unit through a UART interface and to the radar module through an SPI interface; the first network transformer is connected to the dual-lens camera module through an MIPI interface; the second network transformer is connected to the pickup array module through a network port and to the central processing unit through a USB 2.0 interface; and the third network transformer is connected to the wireless transmission module through a network port.
Further, the edge computing processing module further comprises a heating and heat dissipation control section; the heating and heat dissipation control section comprises a heating control circuit, a heat dissipation control circuit and a temperature sensor; the heat dissipation control circuit is connected to a fan arranged in the housing; and the heating control circuit is connected to a heating plate arranged in the housing.
Further, the intelligent traffic perception integrated device based on radar-sound binocular depth fusion further comprises a gyroscope arranged in the housing and a level gauge embedded on the outside of the housing.
The utility model has the following beneficial effects:
The utility model provides an intelligent traffic perception integrated device based on radar-sound binocular depth fusion, which combines the active detection capability and high sensitivity of radar technology with the data-based judgement and visibility of intelligent video analysis; deep learning greatly improves the detection rate and recognition rate, giving comprehensive, real-time and accurate perception. The dual-lens camera module and the radar module adopt a dual-lens design and a dual radio-frequency antenna design respectively, with a detection distance of up to 350 m and no obvious loss of accuracy. On road sections with mixed pedestrian and vehicle traffic, dense traffic flow and frequent congestion, the device can simultaneously and accurately identify and distinguish targets such as pedestrians, non-motor vehicles and motor vehicles. It integrates the overspeed-detection snapshot and whistle-detection snapshot functions, achieving complementary advantages and deep fusion of millimeter-wave radar, pickup array and video camera technologies, and realising high-resolution, highly penetrating real-time monitoring of motor vehicles, non-motor vehicles, pedestrians and other road users within the detection range.
In addition to the objects, features and advantages described above, the present utility model has other objects, features and advantages. The utility model will be described in further detail with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are included to provide a further understanding of the utility model and are incorporated in and constitute a part of this specification, illustrate embodiments of the utility model and together with the description serve to explain the utility model. In the drawings:
FIG. 1 is a schematic diagram of the operation of the apparatus of the preferred embodiment of the present utility model;
FIG. 2 is one of the schematic structural views of the apparatus according to the preferred embodiment of the present utility model;
FIG. 3 is a second schematic view of the structure of the device according to the preferred embodiment of the present utility model;
FIG. 4 is a general block diagram of the central processing unit processing board hardware of the preferred embodiment of the present utility model.
Reference numerals in the drawings: 1. edge computing processing module; 2. radar module; 3. pickup array module; 4. dual-lens camera module; 5. gyroscope; 6. wireless transmission module; 7. central processing unit; 8. alarm module; 9. heating and heat dissipation control section; 10. first network transformer; 11. second network transformer; 12. third network transformer; 13. level conversion section; 14. level gauge.
Detailed Description
Embodiments of the utility model are described in detail below with reference to the attached drawings, but the utility model can be implemented in a number of different ways, which are defined and covered by the claims.
The utility model discloses an intelligent traffic perception integrated device based on radar-sound binocular depth fusion, which combines overspeed-detection snapshot and whistle-detection snapshot functions. Fig. 1 is a schematic diagram of the operation of the device in this embodiment; its main functions according to fig. 1 are described as follows. The device comprises a dual-lens camera module 4, a radar module 2, a pickup array module 3 and an edge computing processing module 1, integrated into one radar-sound fusion unit. The radar module 2, the dual-lens camera module 4 and the pickup array module 3 transmit the acquired radar data stream, video stream and sound source data stream to the edge computing processing module 1. After edge-computing fusion processing, the device automatically identifies traffic events such as whistling vehicles or the license plates of overspeed vehicles, saves snapshot images and evidence videos of illegally whistling or overspeed vehicles, can output corresponding early-warning signals, and transmits the data to the management platform by wired or wireless transmission.
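By way of illustration only, the flow described above can be pictured as the short Python sketch below. Every name in it (the target fields, thresholds and printed actions) is a hypothetical placeholder chosen for the example, not the device's actual firmware interface or configuration.

```python
# Illustrative sketch of the event-decision flow, with all module interfaces
# reduced to plain Python data. Names, thresholds and fields are hypothetical.
SPEED_LIMIT_KMH = 60          # assumed lane speed limit
HORN_SPL_THRESHOLD_DB = 85    # assumed whistle-detection threshold

def decide_events(fused_targets):
    """Return the list of actions the edge module would trigger."""
    actions = []
    for t in fused_targets:
        if t["speed_kmh"] > SPEED_LIMIT_KMH:
            actions.append(("overspeed_snapshot", t["plate"]))
        if t.get("horn_spl_db", 0) > HORN_SPL_THRESHOLD_DB:
            actions.append(("illegal_horn_snapshot", t["plate"]))
    return actions

# Synthetic fused targets standing in for the radar/video/sound fusion output.
targets = [
    {"plate": "湘A12345", "speed_kmh": 72, "horn_spl_db": 0},
    {"plate": "湘B67890", "speed_kmh": 48, "horn_spl_db": 92},
]

for action, plate in decide_events(targets):
    # In the real device these actions would drive the alarm module and the
    # wired/wireless upload of snapshots and evidence video to the platform.
    print(action, plate)
```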
The intelligent traffic perception integrated device based on radar-sound binocular depth fusion of this embodiment is a new-generation intelligent sensor designed specifically for road traffic: a traffic sensor integrating a dual-lens camera, a pickup array, a millimeter-wave radar and a high-performance processor. The radar-sound fusion integrated device simultaneously feeds the video stream formed by the dual-lens camera module 4, the radar data stream formed by the radar module 2 and the sound source data stream formed by the pickup array module 3 into the edge computing processing module 1 of the all-in-one unit through MIPI and SPI interfaces, and the built-in central processing unit 7 of the edge computing processing module 1 carries out the data processing. The device can be deployed in urban road traffic to curb whistling, and can continuously and automatically monitor and record evidence of motor vehicle horn use and overspeed behaviour in any lane of the monitored area at overspeed-detection zones, intersections or campus road sections. Fig. 2 is a schematic diagram of the structure of the device.
Referring to fig. 3, a second structural schematic diagram of the device of the utility model is shown. The device comprises seven parts: an edge computing processing module 1, a radar module 2, a dual-lens camera module 4, a pickup array module 3, a wireless transmission module 6, an alarm module 8 and a gyroscope 5, connected by circuitry arranged inside the device housing. The wireless transmission module 6 and the alarm module 8 are located at the position marked by the edge computing processing module 1.
The radar module 2 adopts a dual radio-frequency antenna design, giving a longer detection distance and more accurate target detection. It uses detection in the 80 GHz band together with low-power FMCW modulation, meeting the requirements of use in harsh environments such as rain and fog. The medium-range beam has a long detection distance and enables long-range target recognition, while the short-range beam has wide coverage and can detect targets across 1 to 6 lanes, meeting the needs of most urban road and expressway traffic scenarios.
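The basic FMCW relationships behind such a module can be illustrated with a small worked example. The carrier frequency, sweep bandwidth and chirp duration below are assumed sample values, not the published parameters of the radar module 2.

```python
# Back-of-envelope FMCW relationships for a millimeter-wave traffic radar.
# Carrier, bandwidth and chirp duration are assumed example values only.
C = 3.0e8            # speed of light, m/s
FC = 79e9            # assumed carrier in the 77-81 GHz band
BANDWIDTH = 300e6    # assumed sweep bandwidth, Hz
T_CHIRP = 50e-6      # assumed chirp duration, s

def range_from_beat(f_beat_hz):
    """Target range from the beat frequency of one chirp: R = c*f_b*T/(2*B)."""
    return C * f_beat_hz * T_CHIRP / (2 * BANDWIDTH)

def speed_from_doppler(f_doppler_hz):
    """Radial speed from the Doppler shift: v = f_d * lambda / 2."""
    wavelength = C / FC
    return f_doppler_hz * wavelength / 2

print(range_from_beat(10e6))             # 10 MHz beat frequency -> 250 m
print(speed_from_doppler(5.0e3) * 3.6)   # 5 kHz Doppler -> about 34 km/h
```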
The pickup array module 3 accurately locates an illegally whistling vehicle by means of a military-grade sonar-array positioning technique, marks the motor vehicle's horn sound in the form of an acoustic cloud image, and effectively distinguishes the honking behaviour of vehicles travelling side by side or one behind another. Fused with the video image data, the edge computing processing module 1 automatically identifies the number plate of the whistling vehicle and simultaneously saves a snapshot image of the illegally whistling vehicle and an evidence video of the violation.
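As a toy illustration of the localisation principle such a pickup array relies on, the following sketch estimates the bearing of a sound source from the time difference of arrival between two microphones. The microphone spacing, sample rate and test signal are assumptions made for the example only; the actual array geometry and algorithm of the module are not stated here.

```python
# Two-microphone time-difference-of-arrival (TDOA) bearing estimate.
import numpy as np

FS = 48_000          # sample rate, Hz (assumed)
MIC_SPACING = 0.20   # metres between the two microphones (assumed)
C_SOUND = 343.0      # speed of sound, m/s

rng = np.random.default_rng(0)
signal = rng.standard_normal(FS // 10)   # 0.1 s broadband burst standing in for a horn

true_angle_deg = 30.0
delay_samples = int(round(MIC_SPACING * np.sin(np.radians(true_angle_deg)) / C_SOUND * FS))

mic1 = signal
mic2 = np.concatenate([np.zeros(delay_samples), signal[:-delay_samples]])  # delayed copy

# Cross-correlate to recover the inter-microphone delay, then invert the geometry.
corr = np.correlate(mic2, mic1, mode="full")
lag = corr.argmax() - (len(mic1) - 1)
tau = lag / FS
estimated_angle = np.degrees(np.arcsin(np.clip(C_SOUND * tau / MIC_SPACING, -1.0, 1.0)))
print(round(estimated_angle, 1))   # approximately 30 degrees
```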
The dual-lens camera module 4 is fitted with two ultra-low-light CMOS sensors carrying 12 mm and 35 mm (short- and long-focus) lenses; the long-focus lens, a 9-megapixel high-definition lens, is mounted directly above the short-focus lens. The ISP sub-module of the dual-lens camera module 4 performs image processing and vertically stitches the two views into a single full-road-scene video with high definition, low noise and excellent night vision, so that the detection range of the radar is covered in full.
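A minimal sketch of the "one stitched output" idea follows: the long-focus view is resized to the width of the short-focus view and stacked above it. The resolutions are placeholders, and the real ISP pipeline (synchronisation, de-noising, tone mapping) is far more involved.

```python
# Vertical stitching of a long-focus frame above a short-focus frame.
import numpy as np
import cv2

tele_frame = np.zeros((1080, 2560, 3), dtype=np.uint8)   # placeholder long-focus frame
wide_frame = np.zeros((1080, 1920, 3), dtype=np.uint8)   # placeholder short-focus frame

tele_resized = cv2.resize(tele_frame, (wide_frame.shape[1], tele_frame.shape[0]))
stitched = np.vstack([tele_resized, wide_frame])          # long-focus view on top

print(stitched.shape)   # (2160, 1920, 3): one composite full-road-scene frame
```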
The wireless transmission module 6 supports data transmission over 4G/5G and Wi-Fi, is connected to the edge computing processing module 1, and is connected to an antenna outside the device housing for network communication.
The alarm module 8 receives the alarm signal output by the edge computing processing module 1 and transmits it to external alarm equipment, which may take the form of an audible-and-visual alarm, an early-warning horn, LED warning messages and the like.
The gyroscope 5 provides automatic error correction: if the equipment is displaced by severe weather or ground vibration, its parameters can be corrected automatically, reducing maintenance cost. As shown in fig. 2, a level gauge 14 is embedded on the outside of the device housing, which greatly simplifies the installation work.
The edge computing processing module 1 comprises a central processing unit 7, a 16 TOPS high computing-power chip connected to the central processing unit 7, a level conversion section 13, a first network transformer 10, a second network transformer 11, a third network transformer 12, a heating and heat dissipation control section 9 and several hardware interfaces.
The level conversion section is connected to the radar module 2 and the central processing unit 7, the first network transformer 10 connects the dual-lens camera module 4 with the central processing unit 7, the second network transformer 11 connects the pickup array module 3 with the central processing unit 7, and the third network transformer 12 connects the wireless transmission module 6 with the central processing unit 7.
The 16 TOPS high computing-power chip provides ample computing power for running a rich set of algorithms simultaneously and analysing accurately: it supports vehicle attribute recognition functions such as license plate, body colour, license plate colour, vehicle logo and vehicle type, as well as checkpoint capture and the detection of more than 30 event types, meeting the application requirements of target detection and traffic behaviour detection. The central processing unit 7 adopts a quad-core ARM processor and can perform signal-level processing on the raw video stream, radar data and sound source data, carrying out depth fusion and mutual verification of video targets, radar targets and sound source targets, thereby realising high-resolution, highly penetrating real-time monitoring of motor vehicles, non-motor vehicles, pedestrians and other road users within the detection range.
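A much simplified picture of "mutual verification" is sketched below: a radar target is treated as confirmed when a video detection and/or a sound source falls inside an angular gate around it. The gate width and all target lists are fabricated example data; the actual fusion performed by the central processing unit 7 is not limited to this scheme.

```python
# Simplified radar/video/sound target association by angular gating.
AZIMUTH_GATE_DEG = 3.0   # assumed association gate

def verify(radar_targets, video_targets, sound_sources):
    """Attach per-modality confirmation flags to each radar target."""
    results = []
    for r in radar_targets:
        video_hit = any(abs(v["azimuth_deg"] - r["azimuth_deg"]) < AZIMUTH_GATE_DEG
                        for v in video_targets)
        sound_hit = any(abs(s["azimuth_deg"] - r["azimuth_deg"]) < AZIMUTH_GATE_DEG
                        for s in sound_sources)
        results.append({**r, "video_confirmed": video_hit, "sound_confirmed": sound_hit})
    return results

# Fabricated example targets from the three data streams.
radar = [{"id": 1, "azimuth_deg": -4.2, "range_m": 180.0, "speed_kmh": 71.0},
         {"id": 2, "azimuth_deg": 5.6, "range_m": 95.0, "speed_kmh": 42.0}]
video = [{"azimuth_deg": -4.0, "plate": "湘A12345"}]
sound = [{"azimuth_deg": 5.9, "spl_db": 93.0}]

for t in verify(radar, video, sound):
    print(t)
```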
The heating and heat dissipation control section 9 heats and cools the board during operation and comprises a heating control circuit, a heat dissipation control circuit and a temperature sensor. The temperature sensor monitors the ambient temperature and outputs temperature information to the central processing unit 7. If the central processing unit 7 judges that the temperature is too high, it outputs a control signal to the heat dissipation control circuit to drive the fan, and the heat is conducted away through the heat-sink fins of the device base. If the central processing unit 7 judges that the temperature is too low, it outputs a control signal to the heating control circuit and the heating plate is energised to heat the unit. The detailed heating flow is as follows: the temperature sensor is already active before the central processing unit 7 is powered up, and when the temperature is too low the heating control circuit applies its voltage to the heating plate, pre-warming the board before the central processing unit 7 starts. After the central processing unit 7 is powered up, the temperature sensor monitors the ambient temperature in real time and passes the temperature information to the central processing unit 7, which judges whether the ambient temperature is suitable for its own operation; if not, it outputs a control signal to the heating control circuit so that the heating voltage is applied and the unit is warmed.
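Reduced to its simplest form, the decision logic above amounts to two temperature thresholds, as in the sketch below. The threshold values are assumptions for the example; the device's actual set points are not given in this description.

```python
# Two-threshold thermal decision logic (threshold values are assumptions).
T_HEAT_ON = -10.0   # degrees C: at or below this the heating plate is energised
T_FAN_ON = 55.0     # degrees C: at or above this the fan is driven

def thermal_action(temperature_c):
    if temperature_c <= T_HEAT_ON:
        return "heater_on"    # heating control circuit -> heating plate
    if temperature_c >= T_FAN_ON:
        return "fan_on"       # heat-dissipation control circuit -> fan
    return "idle"

for t in (-20.0, 25.0, 60.0):
    print(t, thermal_action(t))
```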
Referring to fig. 4, a general block diagram of the central processing unit processing board hardware is provided. In the figure, the central processing unit 7 is connected to a physical-layer conversion section through a UART/CAN/I2C interface, which in turn leads to a pan-tilt (PTZ) control port via an RS-485/CAN/I2C interface. The central processing unit 7 is connected to the level conversion section through a UART interface, and the level conversion section is connected to the radar module 2 through an SPI interface. The central processing unit 7 is connected to the alarm module 8 through externally reserved interfaces, which include but are not limited to PCIE, I2C, I2S, UART and SDIO interfaces. The central processing unit 7 is connected to the first network transformer 10, and the first network transformer 10 is connected to the dual-lens camera module 4 through an MIPI interface, at a transmission rate of 100BASE-T. The central processing unit 7 is connected to the third network transformer 12, and the third network transformer 12 is connected to the wireless transmission module 6 through a network port, at a transmission rate of 1000/100/10BASE-T. The central processing unit 7 is connected to the heating and heat dissipation control section 9, which is connected to the heating plate and fan interfaces. The central processing unit 7 is connected to the power management circuit, which is connected to the power input. The central processing unit 7 is connected to the second network transformer 11 through USB 2.0 and then to the pickup array module 3 through a network port.
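For illustration, the sketch below parses length-prefixed frames of the kind that might arrive from the radar module over the UART behind the level conversion section. The 0xAA 0x55 header, the length/checksum layout and the sample bytes are entirely hypothetical; the radar module's real protocol is not disclosed here.

```python
# Parsing of hypothetical length-prefixed radar frames from a byte buffer.
HEADER = b"\xAA\x55"

def parse_frames(buffer: bytes):
    """Yield payloads of well-formed frames: header | length(1) | payload | checksum(1)."""
    i = 0
    while i + 4 <= len(buffer):
        if buffer[i:i + 2] != HEADER:
            i += 1
            continue
        length = buffer[i + 2]
        end = i + 3 + length + 1
        if end > len(buffer):
            break                                    # incomplete frame at the tail
        payload = buffer[i + 3:i + 3 + length]
        checksum = buffer[end - 1]
        if sum(payload) & 0xFF == checksum:
            yield payload
        i = end

# Two fabricated frames separated by a stray byte.
stream = b"\xAA\x55\x02\x01\x02\x03" + b"\x00" + b"\xAA\x55\x01\x10\x10"
print([p.hex() for p in parse_frames(stream)])       # ['0102', '10']
```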
In addition, the intelligent traffic perception integrated device based on radar-sound binocular depth fusion has the following main characteristics:
1) 7×24 h operation, day and night: suitable for all-weather, real-time and stable protection, it works in rain, snow, fog, haze, dust and other adverse weather and is unaffected by illumination at night, so missed reports are avoided to the greatest extent and false reports are eliminated.
2) Radar and video are fused at signal level, combining the active detection capability and high sensitivity of radar technology with the data-based judgement and visibility of intelligent video analysis; deep learning greatly improves the detection rate and recognition rate, giving comprehensive, real-time and accurate perception.
3) The dual-lens camera module 4 and the radar module 2 adopt a dual-lens design and a dual radio-frequency antenna design respectively, with a detection distance of up to 350 m and no obvious loss of accuracy. Suited to road sections with mixed pedestrian and vehicle traffic, dense traffic flow and frequent congestion, the device can accurately identify and distinguish pedestrians, non-motor vehicles, motor vehicles and other targets, offers wide coverage and complete detection elements, and can markedly reduce deployment cost.
4) The device combines the overspeed-detection snapshot and whistle-detection snapshot functions in one unit, greatly reducing construction cost. All data stream information (video stream, radar data stream, sound source data stream) is lane-level and includes target state information such as target position, speed, direction, category (motor vehicle, non-motor vehicle, pedestrian, etc.) and license plate number; the video stream contains image data, the radar data stream contains target position, speed, direction, distance and the like, and the sound source data stream contains the sound level in decibels, the sound type, the sound source direction and the like (a sketch of such a lane-level record is given after this list).
5) Installation and commissioning are simple: deep learning and a self-calibration system are built in, no additional roadside computing unit is required, the device can be mounted as a single unit on a highway gantry or roadside column, and no manual calibration is needed after it is put into use; it offers low energy consumption, calibration-free operation and low maintenance.
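The lane-level record referred to in item 4 above could be represented as in the following sketch. The field names and JSON layout are illustrative assumptions, not a documented output format of the device.

```python
# Sketch of a lane-level fused target record carrying the listed state fields.
from dataclasses import dataclass, asdict
import json

@dataclass
class FusedTargetRecord:
    lane: int
    position_m: float          # along-road position from the radar data stream
    speed_kmh: float
    direction_deg: float
    category: str              # "motor_vehicle", "non_motor_vehicle", "pedestrian"
    plate: str                 # from the video stream
    sound_level_db: float      # from the sound source data stream
    sound_type: str            # e.g. "horn"
    sound_direction_deg: float

record = FusedTargetRecord(
    lane=2, position_m=135.4, speed_kmh=68.0, direction_deg=1.5,
    category="motor_vehicle", plate="湘A12345",
    sound_level_db=91.0, sound_type="horn", sound_direction_deg=1.2,
)
print(json.dumps(asdict(record), ensure_ascii=False))
```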
In addition, the reliability of the device is designed to meet the requirement of an MTBF of not less than 3000 hours, with the following specific reliability measures:
1) The miniaturised design is strengthened to reduce power consumption, the number of components and the overall size; industrial-grade or certified devices are preferred.
2) All connectors are high-reliability industrial connectors, with strict control over indicators such as materials, process and service life.
3) Electronic components are derated in the design to reduce the effect of electrical stress on component life.
4) The motherboard circuitry and interfaces are designed with anti-static protection.
5) The circuit boards use a three-proof (conformal coating) process design.
The above description is only of the preferred embodiments of the present utility model and is not intended to limit the present utility model, but various modifications and variations can be made to the present utility model by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present utility model should be included in the protection scope of the present utility model.

Claims (6)

1. An intelligent traffic perception integrated device based on radar-sound binocular depth fusion, characterised by comprising a housing, an edge computing processing module (1) and a control module, wherein an accommodating space is formed inside the housing; a dual-lens camera module (4), a radar module (2) and a pickup array module (3) are each connected to the edge computing processing module by circuitry;
the radar module (2) adopts a dual radio-frequency antenna design and is used to detect a target vehicle and form a radar data stream;
the pickup array module (3) is used to discriminate the honking behaviour of a whistling vehicle, locate the whistling vehicle and form a sound source data stream;
the dual-lens camera module (4) comprises a long-focus lens, a short-focus lens and an ISP sub-module, wherein the long-focus lens and the short-focus lens are mounted in parallel, the long-focus lens is mounted above the short-focus lens, and the ISP sub-module is used to perform video image processing to form a video stream;
the edge computing processing module (1) comprises a central processing unit (7) for signal-level processing of the video stream, the sound source data stream and the radar data stream.
2. The intelligent traffic perception integrated device based on radar-sound binocular depth fusion according to claim 1, characterised by further comprising a wireless transmission module (6) and an alarm module (8) which are each connected to the edge computing processing module (1);
the wireless transmission module (6) is connected to an antenna outside the housing for network communication; the alarm module (8) is used to transmit an alarm signal to alarm equipment outside the housing.
3. The intelligent traffic perception integrated device based on radar-sound binocular depth fusion according to claim 1, wherein the edge computing processing module (1) further comprises:
a 16 TOPS high computing-power chip connected to the central processing unit (7); a level conversion section connecting the central processing unit (7) and the radar module (2); a first network transformer (10) connecting the dual-lens camera module (4) with the central processing unit (7); a second network transformer (11) connecting the pickup array module (3) with the central processing unit (7); and a third network transformer (12) connecting the wireless transmission module (6) and the central processing unit (7).
4. The intelligent traffic perception integrated device based on radar-sound binocular depth fusion according to claim 3, wherein the level conversion section (13) is connected to the central processing unit (7) through a UART interface and to the radar module (2) through an SPI interface; the first network transformer (10) is connected to the dual-lens camera module (4) through an MIPI interface; the second network transformer (11) is connected to the pickup array module (3) through a network port and to the central processing unit (7) through a USB 2.0 interface; and the third network transformer (12) is connected to the wireless transmission module (6) through a network port.
5. The intelligent traffic perception integrated device based on radar-sound binocular depth fusion according to claim 1, wherein the edge computing processing module further comprises a heating and heat dissipation control section (9);
the heating and heat dissipation control section (9) comprises a heating control circuit, a heat dissipation control circuit and a temperature sensor; the heat dissipation control circuit is connected to a fan arranged in the housing; and the heating control circuit is connected to a heating plate arranged in the housing.
6. The intelligent traffic perception integrated device based on radar-sound binocular depth fusion according to claim 1, further comprising a gyroscope (5) arranged in the housing and a level gauge (14) embedded on the outside of the housing.
CN202320140382.2U 2023-02-06 2023-02-06 Intelligent traffic perception integrated device based on binocular depth fusion of radar and sound Active CN219574949U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202320140382.2U CN219574949U (en) 2023-02-06 2023-02-06 Intelligent traffic perception integrated device based on binocular depth fusion of radar and sound

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202320140382.2U CN219574949U (en) 2023-02-06 2023-02-06 Intelligent traffic perception integrated device based on binocular depth fusion of radar and sound

Publications (1)

Publication Number Publication Date
CN219574949U true CN219574949U (en) 2023-08-22

Family

ID=87670358

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202320140382.2U Active CN219574949U (en) 2023-02-06 2023-02-06 Intelligent traffic perception integrated device based on binocular depth fusion of radar and sound

Country Status (1)

Country Link
CN (1) CN219574949U (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117289260A (en) * 2023-11-27 2023-12-26 陕西欧卡电子智能科技有限公司 Millimeter wave radar device for ship

Legal Events

Date Code Title Description
GR01 Patent grant
CP03 Change of name, title or address
Address after: Building B7, Lugu Enterprise Plaza, No. 27 Wenxuan Road, High-tech Development Zone, Changsha City, Hunan Province, 410221
Patentee after: Huanuo Xingkong Technology Co.,Ltd.
Address before: Building B7, Lugu Enterprise Plaza, No. 27 Wenxuan Road, High-tech Development Zone, Changsha City, Hunan Province, 410100
Patentee before: Hunan Huanuo Xingkong Electronic Technology Co.,Ltd.