CN114937367A - Intelligent camera system for cooperative monitoring of vehicle and road and control method - Google Patents

Intelligent camera system for cooperative monitoring of vehicle and road and control method

Info

Publication number
CN114937367A
Authority
CN
China
Prior art keywords
camera
sensor
vehicle
road
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210550244.1A
Other languages
Chinese (zh)
Inventor
朱洪留
郭长江
范林林
任超
曹葵康
刘军传
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tztek Technology Co Ltd
Original Assignee
Tztek Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tztek Technology Co Ltd filed Critical Tztek Technology Co Ltd
Priority to CN202210550244.1A
Publication of CN114937367A
Legal status: Pending (current)

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/01 Detecting movement of traffic to be counted or controlled
    • G08G 1/04 Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/01 Detecting movement of traffic to be counted or controlled
    • G08G 1/0104 Measuring and analyzing of parameters relative to traffic conditions
    • G08G 1/0125 Traffic data processing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/90 Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Traffic Control Systems (AREA)
  • Studio Devices (AREA)

Abstract

The invention relates to the technical field of road traffic control and discloses an intelligent camera system for vehicle-road cooperative monitoring and a control method thereof. The intelligent camera system comprises a front-view camera, a rear-view camera, a downward-view camera, a sensor and a control device, all integrated inside the intelligent camera system. The control device comprises a processor module, an operating system module, a positioning module and a network transmission module, which respectively receive and process data, display the target perception effect and position track in real time, synchronously trigger the camera and the sensor through time service, and push perception data to the client. The control device performs a fusion operation on the data obtained by the cameras and the sensor and, together with the closed loop formed by the network transmission module, screens and displays the road-end target perception effect of the vehicle-road cooperation scheme in real time, refreshes the positions of perceived targets, and rapidly verifies the deployment effect of the scheme.

Description

Intelligent camera system for cooperative monitoring of vehicle and road and control method
Technical Field
The invention belongs to the technical field of road traffic control, and particularly relates to an intelligent camera system for vehicle-road cooperative monitoring and a control method.
Background
An Intelligent Traffic System (ITS) comprehensively applies advanced information technology, computer technology, data communication technology, sensor technology, electronic control technology, artificial intelligence and the like to transportation, vehicle-road cooperation and service control, thereby strengthening the relations among vehicles, roads and users and forming an integrated traffic system that ensures safety, improves efficiency, improves the environment and saves energy. Intelligent traffic systems are the development direction of future traffic systems: they make full use of technologies such as the Internet of Things, cloud computing, artificial intelligence, automatic control and the mobile internet; manage and control all aspects of traffic management, transportation and public travel as well as the whole process of traffic construction management; give the traffic system the capabilities of perception, interconnection, analysis, prediction and control at the scale of regions, cities and even larger space-time ranges; fully ensure traffic safety; exploit the efficiency of traffic infrastructure; improve the operating efficiency and management level of the traffic system; and serve smooth public travel and sustainable economic development.
For example, patent application publication No. CN107123303A proposes a system and method for managing roadside parking spaces by linking radar and a smart camera, which monitors and manages vehicles entering and leaving a road and enhances the detection results to improve detection accuracy. It follows that how to further optimize the design of the smart camera so that it integrates and cooperatively processes richer data such as images and pictures, avoids missed captures, false captures and inaccurate detection data, reduces the workload of manual checking, improves the road-end perception effect of the vehicle-road cooperation scheme and raises monitoring efficiency, remains a technical problem to be solved.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention aims to provide an intelligent camera system and a control method for vehicle-road cooperative monitoring, which solve or partially solve the above problems.
The invention provides an intelligent camera system for vehicle-road cooperative monitoring, which comprises a front-view camera, a rear-view camera, a downward-view camera, a sensor and a control device; wherein:
the front-view camera, the rear-view camera, the downward-view camera and the sensor are integrated inside the intelligent camera system;
the control device comprises a processor module, an operating system module, a positioning module and a network transmission module; wherein:
the processor module is configured to receive data from the camera and the sensor and process the data through a perception fusion algorithm;
the operating system module is configured to display the road-end target perception effect in the vehicle-road cooperation scheme in real time, refresh the position track of the road-end target and verify the deployment effect of the vehicle-road system scheme;
the positioning module is configured to synchronously trigger the camera and the sensor based on the time service synchronization function of the GPS positioning system or the Beidou positioning system;
the network transmission module is configured such that the server receives a Websocket request from the client and pushes perception data in JSON format to the client; the client initiates the Websocket request, receives the JSON-format data, and projects the target model into the map software in real time according to the coordinate information of the perceived target;
the control device performs a fusion operation on the data obtained by the front-view camera, the rear-view camera, the downward-view camera and the sensor, and outputs images, videos and positioning information that meet the monitoring requirements.
Optionally, one or two front-view cameras and one or two rear-view cameras are respectively arranged, and a telephoto lens or a medium-focus lens is used.
Optionally, there is one downward-view camera, which is installed at the center of the bottom of the intelligent camera.
Optionally, the sensor includes a millimeter wave radar sensor and a laser radar sensor, and is electrically connected to the control device.
The invention further provides an intelligent camera control method for vehicle-road cooperative monitoring, which adopts the intelligent camera system described in the first aspect, performs a fusion operation on the data obtained by the front-view camera, the rear-view camera, the downward-view camera and the sensor through the processor module, the operating system module, the positioning module and the network transmission module, and outputs images, videos and positioning information meeting the monitoring requirements; wherein:
the processor module is provided with a host processor and a slave processor, and is configured to receive data from the camera and the sensor and process the data through a perception fusion algorithm;
the operating system module comprises vehicle-road cooperative perception projection software and map software, and is configured to display the road-end target perception effect in the vehicle-road cooperation scheme in real time, refresh the position track of the road-end target and verify the deployment effect of the vehicle-road system scheme;
the positioning module can use a GPS (global positioning system) or a Beidou positioning system, and is configured to synchronously trigger the camera and the sensor based on the time service synchronization function of the GPS or the Beidou positioning system;
the network transmission module comprises a server, a client, a browser and a local area network, and is configured to receive a Websocket request of the client and push sensed data in a JSON format to the client; the client side initiates a Websocket request, receives data in a JSON format at the same time, and projects the target model into map software in real time according to coordinate information of the perception target.
Furthermore, the perception fusion algorithms that can be used by the operating system module comprise a roadside sensor fusion perception algorithm, a target positioning and tracking algorithm, a license plate recognition algorithm and a traffic event detection algorithm.
Optionally, the step of configuring data transmission for the smart camera system in the operating system module includes:
s101: accessing the smart camera system using the IP address;
s102: performing basic configuration, advanced configuration and system maintenance on the intelligent camera system;
s103: configuring a trigger mode and a trigger strategy for a front-view camera, a rear-view camera, a top-view camera and a sensor respectively;
s104: respectively configuring channels for a front-view camera, a rear-view camera, a top-view camera and a sensor;
s105: carrying out stream-pushing configuration for the front-view camera, the rear-view camera and the downward-view camera.
Further, in step S103, the trigger mode includes: distance trigger, time trigger and soft trigger, and the trigger strategy comprises: image priority, real-time priority, and synchronization priority.
Further, in step S104, the channel configuration is to establish a connection between the operating system module and the sensor and the camera, and associate the camera name and the sensor name for sensing fusion calculation.
Further, in step S105, the stream types of the stream-pushing configuration include a main code stream, a sub code stream and a third code stream, which respectively correspond to the three code streams of the camera.
The intelligent camera system and the control method for vehicle-road cooperative monitoring provided by the invention can achieve the following beneficial effects:
(1) seamless full coverage of the front view, the rear view and the view under the pole is realized by arranging a front-view camera, a rear-view camera and a downward-view camera; meanwhile, the front and rear cameras can be fitted with telephoto and medium-focus lenses to perceive and track targets within front and rear viewing ranges of 800 meters respectively;
(2) the algorithm entry in the operating system and the processor supports secondary development, and a user can implant an intelligent algorithm set by the user;
(3) the camera, the sensor and the control device are integrated in an intelligent camera system, so that the system delay is extremely low, the delay from the start of triggering exposure of the camera to the acquisition of complete image data by an operating system is less than 50ms, and the requirement of most intelligent traffic and vehicle-road cooperative scenes on the system delay can be met;
(4) the road-end target perception effect in the vehicle-road cooperation scheme is screened and displayed in real time in a closed-loop system through the operating system module, the positions of perceived targets are projected and refreshed in real time, and the deployment effect of the scheme can be quickly verified.
Drawings
FIG. 1 is a schematic diagram of a smart camera system according to an embodiment of the present invention;
FIG. 2 is an internal assembly perspective view of a smart camera system in accordance with an embodiment of the present invention;
FIG. 3 is a schematic diagram showing the components of the control device of the intelligent camera system according to the embodiment of the present invention;
fig. 4 is a flowchart of a method for controlling an intelligent camera system according to an embodiment of the invention.
Wherein:
100-smart camera system;
101-assembly housing; 1021-first front-view camera; 1022-second front-view camera; 1031-first rear-view camera; 1032-second rear-view camera; 104-downward-view camera; 105-sensor; 1061-slave processor; 1062-host processor;
200-a control device;
201-a processor module; 202-operating system module; 203-a positioning module; 204-network transmission module.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some embodiments of the present invention, but not all embodiments. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, are within the scope of the present invention.
In a first aspect of the present invention, a smart camera system 100 for vehicle-road cooperative monitoring is proposed; with reference to fig. 1 and fig. 2, the smart camera system includes: front-view cameras 1021/1022, rear-view cameras 1031/1032, a downward-view camera 104, a sensor 105, and a control device 200; wherein:
the front view camera 1021/1022, rear view camera 1031/1032, downward view camera 104, and sensor 105 are integrated inside the smart camera system 100;
as shown in fig. 3, the control device 200 includes a processor module 201, an operating system module 202, a positioning module 203, and a network transmission module 204; wherein:
the processor module 201 is configured to receive data from the cameras and the sensor, and process the data through a perception fusion algorithm;
the operating system module 202 is configured to display the road-end target perception effect in the vehicle-road cooperation scheme in real time, refresh the position track of the road-end target and verify the deployment effect of the vehicle-road system scheme;
the positioning module 203 is configured to synchronously trigger the camera and the sensor based on a time service synchronization function of a GPS (global positioning system) or a Beidou positioning system;
the network transmission module 204 is configured such that the server receives a Websocket request from the client and pushes perception data in JSON format to the client; the client initiates the Websocket request, receives the JSON-format data, and projects the target model into the map software in real time according to the coordinate information of the perceived target;
the control device 200 performs a fusion operation on the data obtained by the front-view cameras 1021/1022, the rear-view cameras 1031/1032, the downward-view camera 104 and the sensor 105, and outputs images, videos and positioning information that meet the monitoring requirements.
Preferably, this embodiment shows a smart camera system 100 using two front-view cameras (a first front-view camera 1021 and a second front-view camera 1022), two rear-view cameras (a first rear-view camera 1031 and a second rear-view camera 1032) and one downward-view camera 104; the components of the smart camera system 100 are schematically illustrated in fig. 1, and its internal assembly structure is illustrated in fig. 2.
As shown in fig. 2, mounted inside the assembly housing 101 are:
two front-view cameras, a first front-view camera 1021 and a second front-view camera 1022, mounted at one end of the assembly housing 101, with the lens faces of the front-view cameras extending outwards from that end of the assembly housing;
two rear-view cameras, a first rear-view camera 1031 and a second rear-view camera 1032, mounted at the other end of the assembly housing 101, with the lens faces of the rear-view cameras extending outwards from the other end of the assembly housing;
a downward-view camera 104, mounted on the bottom of the assembly housing 101, with its lens projecting downward through a lens opening provided in the bottom of the assembly housing. The present embodiment preferably selects the following camera parameters:
Imaging device: Sony IMX490;
Target surface size: 1/1.55 inch; pixel size: 3 μm; maximum resolution: 2880 × 1860;
Matching lenses:
Telephoto: 75 mm lens, field of view: 8.3°/6.6°/4.9°;
Medium focus: 10-50 mm lens, field of view: (wide) 47.1°/37.2°/27.5°; (tele) 11.1°/8.9°/6.7°;
Wide angle: field of view: 181°/151°/81°;
Output image and video parameters:
Image frame rate: 30 frames/second; image data format: YUV422; encoding protocol of the external output video stream: H.265 or H.264 coding, GB28281 protocol; external output video streams: three code streams, 2880 × 1860 @ 30 fps.
The 5 cameras can realize seamless full coverage of the front view, the rear view and the view under the pole; the front and rear cameras can be fitted with telephoto and medium-focus lenses to perceive and track targets within front and rear viewing ranges of 800 meters.
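As a minimal illustration (not part of the original disclosure) of how a downstream application might consume one of the camera's output code streams, the sketch below pulls frames from an RTSP source with OpenCV; the stream URL, channel naming and frame handling are assumptions made only for this example.

```python
# Minimal sketch (assumption): pull one of the camera's code streams over RTSP
# with OpenCV. The URL and credentials are hypothetical placeholders.
import cv2

STREAM_URL = "rtsp://192.168.1.64:554/main_stream"  # hypothetical main code stream

def consume_stream(url: str, max_frames: int = 300) -> None:
    cap = cv2.VideoCapture(url)            # open the RTSP stream
    if not cap.isOpened():
        raise RuntimeError(f"cannot open stream: {url}")
    frames = 0
    while frames < max_frames:
        ok, frame = cap.read()             # one BGR frame, e.g. 2880x1860 for the main stream
        if not ok:
            break                          # stream ended or network hiccup
        frames += 1
        # hand the frame to the perception pipeline here
    cap.release()

if __name__ == "__main__":
    consume_stream(STREAM_URL)
```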
Preferably, the sensor 105 mounted inside the assembly housing 101 includes a millimeter-wave radar sensor and a laser radar sensor, both electrically connected to the control device.
Millimeter-wave radar sensors use millimeter waves, i.e. frequencies of 30 to 300 GHz and wavelengths of 1 to 10 mm. Because its wavelength lies between the centimeter wave and the light wave, the millimeter wave combines the advantages of microwave guidance and photoelectric guidance. Compared with centimeter-wave radar, millimeter-wave radar is small, easy to integrate and offers high spatial resolution. Compared with optical sensors such as cameras, infrared sensors and laser sensors, millimeter-wave radar penetrates fog, smoke and dust well, has strong anti-interference capability, and works in all weather, around the clock.
The laser radar (lidar) sensor is an active optical detection device used for environmental perception; it provides a new technical means for acquiring spatial information with a higher degree of automation and markedly higher efficiency. Combined with satellite positioning, inertial navigation, photogrammetry and remote sensing technologies, the lidar sensor can acquire large-range digital surface model data; a vehicle-mounted system can acquire three-dimensional data of the surfaces of roads, bridges, tunnels and large buildings; and fixed lidar systems are often used for accurate scanning measurement of small areas and for acquiring three-dimensional model data.
In order to cooperate with the camera and the sensor installed in the smart camera system in the preferred embodiment to perform data interaction, analysis, storage and screening on the acquired data, pictures, images and the like, a control device needs to be further configured for the smart camera system.
Preferably, the processor module 201 in the control device 200 includes a host processor 1062 and a slave processor 1061, and the processor module is configured to receive data from a plurality of cameras and a plurality of sensors, and process the obtained data through a perception fusion algorithm.
Furthermore, the perception fusion algorithm comprises a roadside sensor fusion perception algorithm, a target positioning and tracking algorithm, a license plate recognition algorithm and a traffic event detection algorithm.
The preferred processor module configuration parameters of this embodiment are as follows:
Built-in NVIDIA Jetson Xavier processor with 32 TOPS of AI computing power (up to 64 TOPS); secondary development is supported, so a user can implant a self-designed intelligent algorithm to access and process the real-time data of the multiple sensors. The host processor and the slave processor are both provided with USB and HDMI extension cables to facilitate system debugging.
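The patent does not give the internals of the perception fusion algorithm; the following is a minimal sketch, assuming that camera and radar/lidar detections have already been transformed into a common road coordinate frame and timestamped by the shared time service, of how detections could be associated by nearest-neighbour matching. The class name, field names and thresholds are illustrative assumptions, not the device's actual algorithm.

```python
# Minimal fusion sketch (assumption): associate camera detections with radar/lidar
# detections by nearest-neighbour matching in a shared road coordinate frame.
from dataclasses import dataclass
from math import hypot
from typing import List, Optional, Tuple

@dataclass
class Detection:
    x: float          # metres, road frame
    y: float
    timestamp: float  # seconds, from the shared GPS/Beidou time service
    label: str        # e.g. "car", "pedestrian"

def associate(cam: List[Detection], radar: List[Detection],
              max_dist: float = 2.0, max_dt: float = 0.05
              ) -> List[Tuple[Detection, Optional[Detection]]]:
    """Pair each camera detection with the closest radar detection, if any."""
    pairs = []
    used = set()
    for c in cam:
        best, best_d = None, max_dist
        for i, r in enumerate(radar):
            if i in used or abs(c.timestamp - r.timestamp) > max_dt:
                continue
            d = hypot(c.x - r.x, c.y - r.y)
            if d < best_d:
                best, best_d = i, d
        if best is not None:
            used.add(best)
            pairs.append((c, radar[best]))
        else:
            pairs.append((c, None))   # camera-only target
    return pairs
```

A real deployment would replace the greedy matching with tracking and filtering, but the sketch shows why the shared time service matters: the association gate depends on both distance and timestamp difference.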
Preferably, the operating system module 202 in the control device 200 includes vehicle-road cooperative perception projection software and map software, and is configured to display the road-end target perception effect in the vehicle-road cooperation scheme in real time, refresh the position track of the road-end target, and verify the deployment effect of the vehicle-road system scheme.
Preferably, the positioning module 203 in the control device 200 uses a GPS positioning system or a beidou positioning system, and the positioning module is configured to synchronously trigger the camera and the sensor based on a time service synchronization function of the GPS positioning system or the beidou positioning system.
The system delay is extremely low: the delay from the camera starting to trigger exposure to the operating system acquiring complete image data is less than 50 ms, which meets the system-delay requirements of most intelligent traffic and vehicle-road cooperation scenarios. The system also supports perception data in Protobuf format and video stream pushing in GB28281 format. Time synchronization supports NTP or PTP time service.
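As an illustration of the time-service-based synchronous triggering described above, the sketch below fires camera and sensor trigger callbacks together at epochs aligned to a shared clock. The callback names, the 10 Hz trigger rate, and the use of the host system clock are assumptions for the example; a real deployment would discipline the clock with GPS/Beidou plus NTP or PTP rather than rely on `time.time()`.

```python
# Minimal sketch (assumption): fire camera and sensor triggers together at epochs
# aligned to a shared, time-service-disciplined clock (approximated here by time.time()).
import time
from typing import Callable, List

def run_synchronized_triggers(triggers: List[Callable[[float], None]],
                              period_s: float = 0.1) -> None:
    """Call every trigger with the same aligned timestamp, once per period."""
    while True:
        now = time.time()
        next_edge = (int(now / period_s) + 1) * period_s   # next aligned epoch
        time.sleep(max(0.0, next_edge - time.time()))
        for fire in triggers:
            fire(next_edge)   # all devices receive the identical trigger timestamp

def trigger_camera(ts: float) -> None:     # hypothetical placeholder
    print(f"camera exposure started at {ts:.3f}")

def trigger_radar(ts: float) -> None:      # hypothetical placeholder
    print(f"radar frame captured at {ts:.3f}")

# Example usage (runs indefinitely):
# run_synchronized_triggers([trigger_camera, trigger_radar], period_s=0.1)
```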
Preferably, the network transmission module 204 in the control device 200 includes a server, a client, a browser, and a local area network, and is configured to receive a Websocket request from the client and push sensed JSON-formatted data to the client; the client side initiates a Websocket request, receives data in a JSON format at the same time, and projects the target model into map software in real time according to coordinate information of the perception target.
In the integrated closed-loop system formed by the camera, the sensor and the control device, deployment does not need to rely on other software. The road-end target perception effect in the vehicle-road cooperation scheme is displayed in real time, the position track of each road-end perceived target (person/vehicle/non-motor vehicle) is refreshed at a frequency of 3 Hz, the perception accuracy can be observed in real time, and the deployment effect can be rapidly verified at the road end.
The client can filter the road-end perception target data to a certain degree and extract it at a fixed rate, which greatly reduces browser performance consumption while preserving the real-time projection effect of the road-end perceived targets.
In the map software, clicking the adaptive point marker corresponding to the currently projected data on the map locates the corresponding projected target, and clicking the point marker displays the data information of that target in an information window. In this way the road-end perception effect of the vehicle-road cooperation scheme can be checked in real time, software debugging efficiency is improved, and the difficulty of scheme verification is reduced.
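A minimal sketch of the client side of the Websocket push described above, assuming the Python `websockets` package, a hypothetical endpoint URL, and a JSON payload carrying an `id`, a `type` and `lon`/`lat` coordinates per perceived target; the real field names follow the device's actual JSON schema, and the map-update call is a placeholder.

```python
# Minimal client sketch (assumption): receive JSON perception data over WebSocket
# and hand each target's coordinates to the map layer. Endpoint and field names are
# hypothetical; the real schema is defined by the device.
import asyncio
import json
import websockets  # pip install websockets

ENDPOINT = "ws://192.168.1.64:9000/perception"   # hypothetical endpoint

def update_map_marker(target_id: str, lon: float, lat: float, kind: str) -> None:
    # placeholder for projecting the target model into the map software
    print(f"{kind} {target_id} -> ({lon:.6f}, {lat:.6f})")

async def consume() -> None:
    async with websockets.connect(ENDPOINT) as ws:
        async for message in ws:                 # pushed at roughly 3 Hz
            frame = json.loads(message)
            for t in frame.get("targets", []):
                update_map_marker(t["id"], t["lon"], t["lat"], t.get("type", "unknown"))

if __name__ == "__main__":
    asyncio.run(consume())
```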
In a second aspect of the present invention, referring to fig. 3, an intelligent camera control method for vehicle-road cooperative monitoring is provided, which adopts the intelligent camera system described in the first aspect; the processor module 201, the operating system module 202, the positioning module 203 and the network transmission module 204 perform a fusion operation on the data obtained by the front-view camera, the rear-view camera, the downward-view camera and the sensor, and output images, videos and positioning information that meet the monitoring requirements; wherein:
the processor module 201 is provided with a host processor and a slave processor, and the processor module is configured to receive data from the camera and the sensor and process the data through a perception fusion algorithm;
the operating system module 202 comprises vehicle-road cooperative perception projection software and map software, and is configured to display the road-end target perception effect in the vehicle-road cooperation scheme in real time, refresh the position track of the road-end target, and verify the deployment effect of the vehicle-road system scheme;
the positioning module 203 can use a GPS (global positioning system) or a Beidou positioning system, and is configured to synchronously trigger the camera and the sensor based on the time service synchronization function of the GPS or the Beidou positioning system;
the network transmission module 204 comprises a server, a client, a browser and a local area network, and is configured to receive a Websocket request of the client and push sensed data in a JSON format to the client; the client side initiates a Websocket request, receives data in a JSON format at the same time, and projects the target model into map software in real time according to coordinate information of the perception target.
Furthermore, the perception fusion algorithms that can be used by the operating system module comprise a roadside sensor fusion perception algorithm, a target positioning and tracking algorithm, a license plate recognition algorithm and a traffic event detection algorithm.
Referring to fig. 4, the steps of configuring data transmission for the smart camera system within the operating system module 202 include:
s101: accessing the smart camera system using the IP address;
On the one hand, local login can be performed by entering the loopback IP or the device's own IP in a browser; on the other hand, if access to other smart camera systems within the same LAN is required, the IP of the corresponding target device must be entered. It should be noted that the ID of the smart camera system is its device identifier and includes a non-mandatory ID (which may be chosen at random) and a mandatory ID; the mandatory ID is used for detecting the corresponding configuration algorithm, fusing multiple devices, and the like.
S102: performing basic configuration, advanced configuration and system maintenance on the intelligent camera system;
s103: configuring a trigger mode and a trigger strategy for a front-view camera, a rear-view camera, a top-view camera and a sensor respectively; wherein the trigger mode includes: distance trigger, time trigger and soft trigger, and the trigger strategy comprises: image priority, real-time priority, and synchronization priority.
The time trigger requires the time service function of the linked system, i.e., the time service status of the GPS or Beidou positioning system is fed back, and the status is displayed as successfully timed when GPS or Beidou time service succeeds.
Synchronization priority is enabled when more than one camera is configured on a device and the timestamps of all cameras are required to be consistent; real-time priority is selected when the real-time effect is not ideal.
S104: respectively configuring channels for a front-view camera, a rear-view camera, a top-view camera and a sensor; the channel configuration is to establish connection between the operating system module and the sensor and between the operating system module and the camera, and associate the camera name and the sensor name for perception fusion calculation.
Whether the channel configuration is performed for a camera or for a sensor, the matching settings include the enable configuration, the channel number, the camera/sensor name, the camera/sensor type, the camera/sensor subtype and the camera/sensor SN (serial number).
S105: and carrying out plug flow configuration on the front-view camera, the rear-view camera and the overlook camera. The method comprises the following steps:
(1) enabling configuration, controlling a switch for pushing flow, and pushing data to relevant servers only when the switch is opened.
(2) The camera channel number is configured, the push stream is also the streaming media of the push camera, and channel 1 is the camera with the corresponding channel number of 1.
(3) The IP address and port number of the server are set.
The stream types of the plug-flow configuration comprise a main code stream, a sub code stream and a third code stream which respectively correspond to three paths of code streams of the camera.
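To make steps S103 to S105 concrete, the sketch below groups the trigger, channel and stream-pushing settings described above into a single configuration structure; every field name and value here is an illustrative assumption rather than the device's actual configuration schema.

```python
# Illustrative configuration sketch (assumption): trigger, channel and stream-pushing
# settings for one front-view camera, mirroring steps S103-S105. Field names are
# hypothetical and do not reflect the device's real configuration format.
camera_config = {
    "trigger": {
        "mode": "time",              # distance / time / soft trigger
        "strategy": "sync_priority", # image / real-time / synchronization priority
        "rate_hz": 30,
    },
    "channel": {
        "enabled": True,
        "channel_no": 1,
        "name": "front_cam_1",
        "type": "camera",
        "subtype": "front_view",
        "sn": "SN-000001",           # placeholder serial number
    },
    "stream_push": {
        "enabled": True,             # data are pushed only when enabled
        "channel_no": 1,             # push the media of camera channel 1
        "server_ip": "192.168.1.100",
        "server_port": 8554,
        "stream_type": "main",       # main / sub / third code stream
    },
}

if __name__ == "__main__":
    import json
    print(json.dumps(camera_config, indent=2))
```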
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. An intelligent camera system for cooperative monitoring of a vehicle and a road, characterized by comprising a front-view camera, a rear-view camera, a top-view camera, a sensor and a control device; wherein:
the front-view camera, the rear-view camera, the downward-view camera and the sensor are integrated inside the intelligent camera system;
the control device comprises a processor module, an operating system module, a positioning module and a network transmission module; wherein:
the processor module is configured to receive data from the camera and the sensor and process the data through a perception fusion algorithm;
the operating system module is configured to display the road-end target perception effect in the vehicle-road cooperation scheme in real time, refresh the position track of the road-end target and verify the deployment effect of the vehicle-road system scheme;
the positioning module is configured to synchronously trigger a camera and a sensor based on a time service synchronization function of the GPS or Beidou positioning system;
the network transmission module is configured such that the server receives a Websocket request of the client and pushes perception data in JSON format to the client; the client initiates the Websocket request, receives the JSON-format data, and projects the target model into the map software in real time according to the coordinate information of the perceived target;
and the control device is used for carrying out a fusion operation on the data obtained by the front-view camera, the rear-view camera, the top-view camera and the sensor and outputting images, videos and positioning information which meet the monitoring requirement.
2. A smart camera system for vehicle-road cooperative monitoring as claimed in claim 1, wherein one or two front-view cameras and one or two rear-view cameras are respectively provided, and a telephoto lens or a medium-focus lens is used.
3. A smart camera system for vehicle-road cooperative monitoring as recited in claim 2, wherein the number of the downward-looking cameras is one, and the downward-looking cameras are installed at the center of the bottom of the smart camera.
4. A smart camera system for collaborative monitoring of vehicle and road according to claim 1, wherein the sensor includes a millimeter wave radar sensor and a laser radar sensor, and is electrically connected to the control device.
5. An intelligent camera control method for cooperative monitoring of a vehicle and a road, characterized in that the intelligent camera system according to any one of claims 1 to 4 is adopted, and a fusion operation is carried out on data obtained by a front-view camera, a rear-view camera, a top-view camera and a sensor through a processor module, an operating system module, a positioning module and a network transmission module, so as to output images, videos and positioning information which meet monitoring requirements; wherein:
the processor module is configured to receive data from the camera and the sensor and process the data through a perception fusion algorithm;
the operating system module is configured to display the road-end target perception effect in the vehicle-road cooperation scheme in real time, refresh the position track of the road-end target and verify the deployment effect of the vehicle-road system scheme;
the positioning module is configured to synchronously trigger a camera and a sensor based on a time service synchronization function of the GPS or Beidou positioning system;
the network transmission module is configured such that the server receives a Websocket request of the client and pushes perception data in JSON format to the client; the client initiates the Websocket request, receives the JSON-format data, and projects the target model into the map software in real time according to the coordinate information of the perceived target.
6. The intelligent camera control method for vehicle-road cooperative monitoring according to claim 5, wherein the perception fusion algorithm comprises roadside sensor fusion perception, target positioning and tracking, license plate recognition and traffic event detection algorithms.
7. A smart camera control method for vehicle-road cooperative monitoring according to claim 6, wherein the step of configuring the smart camera system for data transmission within the operating system module comprises:
s101: accessing the smart camera system using the IP address;
s102: performing basic configuration, advanced configuration and system maintenance on the intelligent camera system;
s103: configuring a trigger mode and a trigger strategy for a front-view camera, a rear-view camera, a top-view camera and a sensor respectively;
s104: respectively configuring channels for a front-view camera, a rear-view camera, a top-view camera and a sensor;
s105: carrying out stream-pushing configuration on the front-view camera, the rear-view camera and the top-view camera.
8. A smart camera control method for vehicle-road cooperative monitoring according to claim 7, wherein the trigger mode in step S103 includes: distance trigger, time trigger and soft trigger, and the trigger strategy comprises: image priority, real-time priority, and synchronization priority.
9. An intelligent camera control method for vehicle-road cooperative monitoring according to claim 7, wherein the channel in step S104 is configured to connect the operating system module with the sensor and the camera, and associate the camera name with the sensor name for perception fusion calculation.
10. The intelligent camera control method for cooperative vehicle and road monitoring according to claim 7, wherein the stream types of the stream pushing configuration in step S105 include a main stream, a sub stream, and a third stream, which respectively correspond to three streams of the camera.
CN202210550244.1A 2022-05-20 2022-05-20 Intelligent camera system for cooperative monitoring of vehicle and road and control method Pending CN114937367A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210550244.1A CN114937367A (en) 2022-05-20 2022-05-20 Intelligent camera system for cooperative monitoring of vehicle and road and control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210550244.1A CN114937367A (en) 2022-05-20 2022-05-20 Intelligent camera system for cooperative monitoring of vehicle and road and control method

Publications (1)

Publication Number Publication Date
CN114937367A true CN114937367A (en) 2022-08-23

Family

ID=82863792

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210550244.1A Pending CN114937367A (en) 2022-05-20 2022-05-20 Intelligent camera system for cooperative monitoring of vehicle and road and control method

Country Status (1)

Country Link
CN (1) CN114937367A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116913096A (en) * 2023-09-13 2023-10-20 北京华录高诚科技有限公司 Traffic situation investigation equipment and method based on Beidou short message communication technology

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102737511A (en) * 2012-07-04 2012-10-17 武汉大学 Intelligent road side system
CN107229690A (en) * 2017-05-19 2017-10-03 广州中国科学院软件应用技术研究所 Dynamic High-accuracy map datum processing system and method based on trackside sensor
CN108833833A (en) * 2018-06-20 2018-11-16 长安大学 Towards intelligent network connection automobile scene image data perception and coprocessing system
US20190244521A1 (en) * 2018-02-06 2019-08-08 Cavh Llc Intelligent road infrastructure system (iris): systems and methods
CN111554088A (en) * 2020-04-13 2020-08-18 重庆邮电大学 Multifunctional V2X intelligent roadside base station system
WO2020211658A1 (en) * 2019-04-17 2020-10-22 阿里巴巴集团控股有限公司 Trigger detection method, apparatus and system
CN112530173A (en) * 2020-12-03 2021-03-19 北京百度网讯科技有限公司 Roadside sensing method and device, electronic equipment, storage medium and roadside equipment
CN114002669A (en) * 2021-10-21 2022-02-01 北京理工大学重庆创新中心 Road target detection system based on radar and video fusion perception

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102737511A (en) * 2012-07-04 2012-10-17 武汉大学 Intelligent road side system
CN107229690A (en) * 2017-05-19 2017-10-03 广州中国科学院软件应用技术研究所 Dynamic High-accuracy map datum processing system and method based on trackside sensor
US20190244521A1 (en) * 2018-02-06 2019-08-08 Cavh Llc Intelligent road infrastructure system (iris): systems and methods
CN108833833A (en) * 2018-06-20 2018-11-16 长安大学 Towards intelligent network connection automobile scene image data perception and coprocessing system
WO2020211658A1 (en) * 2019-04-17 2020-10-22 阿里巴巴集团控股有限公司 Trigger detection method, apparatus and system
CN111554088A (en) * 2020-04-13 2020-08-18 重庆邮电大学 Multifunctional V2X intelligent roadside base station system
CN112530173A (en) * 2020-12-03 2021-03-19 北京百度网讯科技有限公司 Roadside sensing method and device, electronic equipment, storage medium and roadside equipment
CN114002669A (en) * 2021-10-21 2022-02-01 北京理工大学重庆创新中心 Road target detection system based on radar and video fusion perception

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116913096A (en) * 2023-09-13 2023-10-20 北京华录高诚科技有限公司 Traffic situation investigation equipment and method based on Beidou short message communication technology
CN116913096B (en) * 2023-09-13 2023-12-15 北京华录高诚科技有限公司 Traffic situation investigation equipment and method based on Beidou short message communication technology

Similar Documents

Publication Publication Date Title
Chen et al. Cooper: Cooperative perception for connected autonomous vehicles based on 3d point clouds
CN103795976B (en) A kind of full-time empty 3 d visualization method
CN109471128B (en) Positive sample manufacturing method and device
CN106204595B (en) A kind of airdrome scene three-dimensional panorama monitoring method based on binocular camera
US20200356108A1 (en) Information transmission method and client device
CN103295396B (en) Speedy ex-situ evidence collection of traffic accident method and system
CN110164135B (en) Positioning method, positioning device and positioning system
CN115187742B (en) Method, system and related device for generating automatic driving simulation test scene
US20230121051A1 (en) Roadside sensing system and traffic control method
CN113074714B (en) Multi-state potential sensing sensor based on multi-data fusion and processing method thereof
CN210526874U (en) Airborne three-light photoelectric pod system
CN103416050A (en) Information provision system, information provision device, photographing device, and computer program
CN114937367A (en) Intelligent camera system for cooperative monitoring of vehicle and road and control method
CN113141442B (en) Camera and light supplementing method thereof
CN116013016A (en) Fire monitoring method, system and device
WO2021232826A1 (en) Wireless-positioning-technology-based method and device for controlling camera to dynamically track road target
Carmichael et al. Dataset and benchmark: Novel sensors for autonomous vehicle perception
CN114944066A (en) Intelligent camera system for vehicle and road cooperative monitoring
CN217273162U (en) A support device and intelligent camera for installing intelligent camera
CN116959262A (en) Road traffic control method, device, equipment and storage medium
CN114415489B (en) Time synchronization method, device, equipment and medium for vehicle-mounted sensor
CN108206940B (en) Video streaming connection and transmission method, gateway device and viewing device
CN111683220A (en) Unmanned vehicle monitoring and taking-over scheme based on 4-way fixed-focus camera and 1-way pan-tilt zoom camera
Ko et al. On scaling distributed low-power wireless image sensors
CN113112815A (en) Radar checkpoint vehicle management system

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination