CN116486334A - High-altitude parabolic monitoring method, system and device based on vehicle and storage medium

Info
- Publication number: CN116486334A
- Application number: CN202310426904.XA
- Authority: CN (China)
- Prior art keywords: high-altitude parabolic; target vehicle; image information
- Prior art date: 2023-04-19
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications

- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
- G06N3/0464—Convolutional networks [CNN, ConvNet]
- G06N3/08—Learning methods
- G06V10/774—Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/64—Three-dimensional objects
- Y02T10/40—Engine management systems
Abstract
The invention discloses a vehicle-based high-altitude parabolic monitoring method, system, device and storage medium, comprising the following steps: acquiring depth image information of the area above a target vehicle and inputting it into a pre-trained high-altitude parabolic identification model to determine whether a high-altitude parabolic object exists; when a high-altitude parabolic object exists, determining its first motion track from consecutive frames of depth image information, and predicting its throwing position and landing position from the first motion track; judging from the landing position and the object type whether the high-altitude parabolic object threatens the target vehicle, and if so, controlling the target vehicle to take vehicle body protection measures; and generating high-altitude parabolic evidence information from the throwing position and the depth image information, and sending it to an alarm platform or the owner of the target vehicle. The method can monitor high-altitude parabolic behavior in real time, determine the throwing position and collect evidence in real time, reduce the risks caused by high-altitude parabolic objects, and can be applied in the technical field of vehicle safety detection.
Description
Technical Field
The invention relates to the technical field of vehicle safety monitoring, and in particular to a vehicle-based high-altitude parabolic monitoring method, system, device and storage medium.
Background
In recent years, incidents of objects thrown or falling from height in residential communities have occurred frequently, posing great safety hazards to pedestrians and vehicles. In the prior art, high-mounted cameras are installed around buildings to photograph and monitor the area above; this approach cannot inform pedestrians and vehicles of the high-altitude parabolic risk accurately and in real time, can only be used for retrospective tracing, and therefore cannot guarantee the safety of pedestrians and vehicles. In the related art, a camera is mounted on the vehicle roof to monitor and warn of falling objects above the vehicle. However, when the owner is not in the vehicle, this method cannot take countermeasures in time even if a falling object is detected; furthermore, it cannot quickly locate the throwing position or collect evidence, and it does not remind pedestrians, so the safety of pedestrians and vehicles still cannot be guaranteed.
Disclosure of Invention
The present invention aims to solve at least one of the technical problems existing in the prior art to a certain extent.
Therefore, an object of the embodiments of the present invention is to provide a vehicle-based high-altitude parabolic monitoring method that reduces the risks caused by high-altitude parabolic objects, ensures the safety of pedestrians and vehicles to a certain extent, and collects evidence of high-altitude parabolic behavior in real time.
It is another object of an embodiment of the present invention to provide a vehicle-based high altitude parabolic monitoring system.
To achieve the above objects, the technical solution adopted by the embodiments of the present invention includes the following aspects:
in a first aspect, an embodiment of the present invention provides a vehicle-based high altitude parabolic monitoring method, including the steps of:
acquiring depth image information of the area above a target vehicle in real time, inputting the depth image information into a pre-trained high-altitude parabolic identification model, and determining whether a high-altitude parabolic object exists according to the high-altitude parabolic identification result;
when a high-altitude parabolic object exists, determining a first motion track of the high-altitude parabolic object according to continuous multi-frame depth image information, and predicting a throwing position and a landing position of the high-altitude parabolic object according to the first motion track;
judging whether the high-altitude parabolic object threatens the target vehicle according to the landing position and the object type of the high-altitude parabolic object, and if so, controlling the target vehicle to take vehicle body protection measures;
and generating high-altitude parabolic evidence information according to the throwing position and the depth image information, and sending the high-altitude parabolic evidence information to a preset alarm platform or a vehicle owner of the target vehicle.
Further, in one embodiment of the present invention, the step of acquiring depth image information of an area above the target vehicle in real time specifically includes:
and when the target vehicle is in a parking state, starting a binocular depth camera arranged on the roof of the target vehicle, and continuously acquiring the depth image information of the area above the target vehicle through the binocular depth camera.
Further, in one embodiment of the present invention, the high-altitude parabolic monitoring method further includes a step of pre-training the high-altitude parabolic identification model, which specifically includes:
acquiring a plurality of preset high-altitude parabolic depth images, and determining high-altitude parabolic type labels corresponding to the high-altitude parabolic depth images;
converting the high altitude parabolic depth image into three-dimensional point cloud sample data, and constructing a training data set according to the three-dimensional point cloud sample data and the corresponding high altitude parabolic type label;
and inputting the training data set into a pre-constructed convolutional neural network for training to obtain the trained high-altitude parabolic recognition model.
Further, in an embodiment of the present invention, the step of inputting the training data set into a pre-constructed convolutional neural network for training to obtain the trained high altitude parabolic recognition model specifically includes:
inputting the training data set into the convolutional neural network to obtain a high-altitude parabolic prediction result;
determining a loss value of the convolutional neural network according to the high altitude parabolic prediction result and the high altitude parabolic type label;
updating model parameters of the convolutional neural network through a back propagation algorithm according to the loss value, and returning to the step of inputting the training data set into the convolutional neural network;
and stopping training when the loss value reaches a preset first threshold value or the iteration number reaches a preset second threshold value, and obtaining a trained high-altitude parabolic identification model.
Further, in one embodiment of the present invention, the step of determining a first motion trajectory of the high-altitude parabolic object according to the depth image information of consecutive frames, and predicting a throwing position and a landing position of the high-altitude parabolic object according to the first motion trajectory specifically includes:
performing differential processing on the depth image information of the current frame and the depth image information of the previous frame to obtain foreground image information of the high-altitude parabolic object corresponding to the depth image information of the current frame, and determining a first position coordinate of the high-altitude parabolic object according to the foreground image information;
determining a first motion track of the high-altitude parabolic object according to the first position coordinates corresponding to the continuous multi-frame depth image information;
determining three-dimensional space position information of a plurality of high-altitude buildings according to the depth image information;
and determining the throwing position according to the first motion track and the three-dimensional space position information, and determining the landing position according to the first motion track and the horizontal plane where the binocular depth camera is positioned.
Further, in an embodiment of the present invention, the step of determining whether the high altitude parabolic object threatens the target vehicle according to the landing position and the object type of the high altitude parabolic object, if yes, controlling the target vehicle to take a vehicle body protection measure specifically includes:
judging whether the high-altitude parabolic object falls at the current position of the target vehicle according to the falling position;
determining the object type of the high-altitude parabolic object according to the high-altitude parabolic object identification result, and determining the danger level of the high-altitude parabolic object according to the object type;
when the high-altitude parabolic object falls at the current position of the target vehicle and the danger level is greater than or equal to a preset third threshold value, determining that the high-altitude parabolic object threatens the target vehicle;
and when the high-altitude parabolic object is determined to have threat to the target vehicle, controlling the target vehicle to start an airbag protection device arranged on the roof of the target vehicle.
Further, in one embodiment of the present invention, the high altitude parabolic monitoring method further comprises the steps of:
when the high-altitude parabolic object exists, reminding people around the target vehicle of the high-altitude falling object risk through a voice broadcasting device arranged on the target vehicle.
In a second aspect, an embodiment of the present invention provides a vehicle-based high altitude parabolic monitoring system, including:
the high-altitude parabolic recognition module is used for acquiring depth image information of the area above a target vehicle in real time, inputting the depth image information into a pre-trained high-altitude parabolic recognition model, and determining whether a high-altitude parabolic object exists according to the high-altitude parabolic recognition result;
the motion trail determining module is used for determining a first motion trail of the high-altitude parabolic object according to the continuous multi-frame depth image information when the high-altitude parabolic object exists, and predicting a throwing position and a landing position of the high-altitude parabolic object according to the first motion trail;
the vehicle body protection control module is used for judging whether the high-altitude parabolic object threatens the target vehicle according to the landing position and the object type of the high-altitude parabolic object, and if so, controlling the target vehicle to take vehicle body protection measures;
the high-altitude parabolic evidence obtaining module is used for generating high-altitude parabolic evidence obtaining information according to the throwing position and the depth image information and sending the high-altitude parabolic evidence obtaining information to a preset alarm platform or a vehicle owner of the target vehicle.
In a third aspect, an embodiment of the present invention provides a vehicle-based high altitude parabolic monitoring apparatus, including:
at least one processor;
at least one memory for storing at least one program;
the at least one program, when executed by the at least one processor, causes the at least one processor to implement a vehicle-based overhead parabolic monitoring method as described above.
In a fourth aspect, embodiments of the present invention also provide a computer readable storage medium having stored therein a processor executable program which when executed by a processor is configured to perform a vehicle-based overhead parabolic monitoring method as described above.
The advantages and benefits of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
According to the embodiments of the invention, depth image information of the area above a target vehicle is acquired in real time and input into a pre-trained high-altitude parabolic identification model, and whether a high-altitude parabolic object exists is determined from the identification result. When a high-altitude parabolic object exists, its first motion track is determined from consecutive frames of depth image information, and its throwing position and landing position are predicted from that track. Whether the object threatens the target vehicle is then judged from the landing position and the object type; if so, the target vehicle is controlled to take vehicle body protection measures. In addition, high-altitude parabolic evidence information is generated from the throwing position and the depth image information and sent to a preset alarm platform or the owner of the target vehicle. Through depth image acquisition and model identification, the embodiments can monitor high-altitude parabolic behavior in real time, determine the throwing position and collect evidence in real time, help find the person responsible promptly, reduce the risks caused by falling objects, and ensure the safety of pedestrians and vehicles to a certain extent.
Drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings needed in the embodiments are described below. It should be understood that the following drawings only illustrate some embodiments of the present invention, and that those skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a flow chart of steps of a vehicle-based high altitude parabolic monitoring method according to an embodiment of the present invention;
FIG. 2 is a block diagram of a vehicle-based overhead parabolic monitoring system according to an embodiment of the present invention;
fig. 3 is a block diagram of a high-altitude parabolic monitoring device based on a vehicle according to an embodiment of the present invention.
Detailed Description
Embodiments of the present invention are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are illustrative only and are not to be construed as limiting the invention. The step numbers in the following embodiments are set for convenience of illustration only, and the order between the steps is not limited in any way, and the execution order of the steps in the embodiments may be adaptively adjusted according to the understanding of those skilled in the art.
In the description of the present invention, "plurality" means two or more. Where "first" and "second" are used to distinguish technical features, this should not be construed as indicating or implying relative importance, the number of the indicated technical features, or their precedence. Furthermore, unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art.
Referring to fig. 1, an embodiment of the present invention provides a vehicle-based high altitude parabolic monitoring method, which specifically includes the following steps:
s101, acquiring depth image information of an area above a target vehicle in real time, inputting the depth image information into a pre-trained high-altitude parabolic recognition model, and determining whether high-altitude parabolic exists according to a high-altitude parabolic recognition result.
In particular, in computer vision systems, three-dimensional scene information enables applications such as image segmentation, object detection and object tracking, and depth images (depth maps) are a widely used representation of such information. The gray value of each pixel of a depth image characterizes how far the corresponding scene point is from the camera. By acquiring depth image information of the area above the target vehicle, the embodiments of the invention can accurately perceive the distance between a high-altitude parabolic object and the imaging device, and therefore accurately determine the object's actual position.
Further as an optional embodiment, the step of acquiring depth image information of the area above the target vehicle in real time specifically includes:
when the target vehicle is in a parking state, a binocular depth camera arranged on the roof of the target vehicle is started, and depth image information of an area above the target vehicle is continuously acquired through the binocular depth camera.
Specifically, the embodiment of the invention acquires the depth image of the area above the target vehicle through the binocular depth camera. In some alternative embodiments, lidar may also be used for depth imaging.
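As a concrete sketch of how depth frames might be obtained from a roof-mounted binocular camera, the snippet below computes a disparity map with OpenCV's semi-global block matching and converts it to metric depth. This is only one possible pipeline under assumed parameters; the baseline and focal length below are hypothetical placeholders, not values from the patent.

```python
import cv2
import numpy as np

# Hypothetical, illustrative camera parameters (not specified in the patent)
BASELINE_M = 0.12   # distance between the two lenses, in metres
FOCAL_PX = 700.0    # focal length in pixels after rectification

def depth_from_stereo(left_gray: np.ndarray, right_gray: np.ndarray) -> np.ndarray:
    """Return a per-pixel depth map (in metres) of the area above the vehicle."""
    matcher = cv2.StereoSGBM_create(
        minDisparity=0,
        numDisparities=96,   # search range; must be divisible by 16
        blockSize=7,
        P1=8 * 7 * 7,
        P2=32 * 7 * 7,
    )
    # OpenCV returns fixed-point disparities scaled by 16
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan          # mark invalid matches
    return FOCAL_PX * BASELINE_M / disparity    # depth = f * B / d for a rectified pair
```

In practice the left and right frames would first be rectified with the camera's calibration data, after which the depth formula applies per pixel.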
Further as an optional embodiment, the high altitude parabolic monitoring method further includes a step of training a high altitude parabolic identification model in advance, which specifically includes:
a1, acquiring a plurality of preset high-altitude parabolic depth images, and determining high-altitude parabolic type labels corresponding to the high-altitude parabolic depth images;
a2, converting the high-altitude parabolic depth image into three-dimensional point cloud sample data, and constructing a training data set according to the three-dimensional point cloud sample data and the corresponding high-altitude parabolic type label;
and A3, inputting the training data set into a pre-constructed convolutional neural network for training to obtain a trained high-altitude parabolic recognition model.
Specifically, a sufficiently large number of high-altitude parabolic depth images covering different article types are acquired, and the label of each image is determined by its article type, e.g. paper is labeled 1, metal products are labeled 2, and so on. In addition, a number of sample images containing no high-altitude parabolic object can be acquired and labeled 0.
Converting the high-altitude parabolic depth image into three-dimensional point cloud sample data so as to extract image features; and forming a training data set according to the obtained three-dimensional point cloud sample data and the corresponding labels.
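A minimal sketch of the depth-image-to-point-cloud conversion described above, assuming a pinhole camera model; the intrinsic parameters (fx, fy, cx, cy) are hypothetical and would come from the binocular camera's calibration in a real system.

```python
import numpy as np

def depth_to_point_cloud(depth_m: np.ndarray,
                         fx: float, fy: float,
                         cx: float, cy: float) -> np.ndarray:
    """Back-project a depth map (metres) into an (N, 3) point cloud in camera coordinates."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[~np.isnan(points).any(axis=1)]    # drop pixels with no valid depth

# Example of building one (point cloud, label) training sample:
# cloud = depth_to_point_cloud(depth_map, fx=700.0, fy=700.0, cx=320.0, cy=240.0)
# sample = (cloud.astype(np.float32), 2)            # e.g. label 2 = metal product
```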
Further as an optional implementation manner, the step of inputting the training data set into a pre-constructed convolutional neural network to perform training to obtain a trained high-altitude parabolic recognition model specifically includes:
a31, inputting the training data set into a convolutional neural network to obtain a high-altitude parabolic prediction result;
a32, determining a loss value of the convolutional neural network according to the high altitude parabolic prediction result and the high altitude parabolic type label;
a33, updating model parameters of the convolutional neural network through a back propagation algorithm according to the loss value, and returning to the step of inputting the training data set into the convolutional neural network;
and A34, stopping training when the loss value reaches a preset first threshold value or the iteration number reaches a preset second threshold value, and obtaining a trained high-altitude parabolic identification model.
Specifically, after the data in the training data set are fed into the initialized convolutional neural network, the model outputs a recognition result, i.e. a high-altitude parabolic prediction, and the accuracy of this prediction can be evaluated against the label information so that the model parameters can be updated. For the high-altitude parabolic recognition model, prediction accuracy is measured by a loss function (Loss Function), which is defined on a single training sample and measures the prediction error of that sample, i.e. the loss value computed from the sample's label and the model's prediction for it. Because a training data set contains many samples, a cost function (Cost Function) is generally used to measure the overall error of the training data set; it is defined on the whole training data set and averages the prediction errors of all samples, which better reflects the prediction performance of the model. For a general machine learning model, the training objective function can be the cost function plus a regularization term that measures model complexity, and the loss value over the whole training data set is obtained from this objective function. Many common loss functions are available, such as the 0-1 loss, squared loss, absolute loss, logarithmic loss and cross-entropy loss, any of which can serve as the loss function of a machine learning model; they are not described in detail here. In the embodiment of the invention, one of these loss functions is selected to compute the training loss value. Based on the training loss value, the model parameters are updated with a back-propagation algorithm, and after several rounds of iteration the trained high-altitude parabolic recognition model is obtained. The number of iteration rounds may be preset, or training may be considered complete when the test set meets the accuracy requirements.
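As one concrete illustration of this training procedure (forward pass, loss computation, back-propagation, and the two stopping criteria), the PyTorch-style sketch below trains a point-cloud classifier. The model and data loader are hypothetical placeholders; the patent does not specify a particular architecture, framework or loss function, and cross-entropy is chosen here only as an example.

```python
import torch
import torch.nn as nn

def train_recognition_model(model: nn.Module,
                            train_loader,                  # yields (point_clouds, type_labels)
                            loss_threshold: float = 0.05,  # preset first threshold
                            max_epochs: int = 100):        # preset second threshold
    criterion = nn.CrossEntropyLoss()   # one possible choice of loss function
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

    for epoch in range(max_epochs):
        epoch_loss = 0.0
        for clouds, labels in train_loader:
            optimizer.zero_grad()
            logits = model(clouds)               # high-altitude parabolic prediction
            loss = criterion(logits, labels)     # compare prediction with the type labels
            loss.backward()                      # back-propagation
            optimizer.step()                     # update model parameters
            epoch_loss += loss.item()
        epoch_loss /= max(len(train_loader), 1)
        # stop when the loss reaches the first threshold
        # (the epoch cap plays the role of the second, iteration-count threshold)
        if epoch_loss <= loss_threshold:
            break
    return model
```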
When identifying the depth image information of the area above the target vehicle, the depth image information is likewise converted into a corresponding three-dimensional point cloud and then input into the high-altitude parabolic identification model to obtain the recognition result output by the model, such as a high-altitude parabolic object of the paper type, a high-altitude parabolic object of the metal type, or no high-altitude parabolic object.
S102, when the high-altitude parabolic object exists, determining a first motion track of the high-altitude parabolic object according to continuous multi-frame depth image information, and predicting a throwing position and a landing position of the high-altitude parabolic object according to the first motion track.
Further as an optional implementation manner, the step of determining a first motion track of the high-altitude parabolic object according to the continuous multi-frame depth image information and predicting a throwing position and a landing position of the high-altitude parabolic object according to the first motion track specifically includes:
s1021, carrying out differential processing on the current frame depth image information and the previous frame depth image information to obtain foreground image information of the high-altitude parabolic object corresponding to the current frame depth image information, and determining a first position coordinate of the high-altitude parabolic object according to the foreground image information;
s1022, determining a first motion track of the high-altitude parabolic object according to a first position coordinate corresponding to the continuous multi-frame depth image information;
s1023, determining three-dimensional space position information of a plurality of high-altitude buildings according to the depth image information;
s1024, determining a throwing position according to the first motion track and the three-dimensional space position information, and determining a landing position according to the first motion track and the horizontal plane where the binocular depth camera is located.
Specifically, when the high-altitude parabolic object is detected for the first time, it appears in the current depth frame for the first time, so the previous depth frame can serve as the background image of the current frame; differencing the two frames quickly and accurately yields the foreground image of the object in the current frame, from which its position coordinates are determined. When the object is detected again in subsequent frames, its foreground image and position in the previous frame are already known, so differencing again quickly yields its foreground image in the current frame and hence its position coordinates.
After a certain number of first position coordinates have been accumulated, the first motion track of the high-altitude parabolic object can be determined with an appropriate kinematic model. Meanwhile, since the depth image information also records the depth of the surrounding high-rise buildings, their three-dimensional spatial positions can be obtained through three-dimensional reconstruction. The intersection of the first motion track with the three-dimensional spatial position information of the buildings is taken as the throwing position, and the intersection of the first motion track with the horizontal plane of the binocular depth camera is taken as the landing position.
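As an illustrative sketch of this trajectory step, the snippet below fits a simple ballistic model (constant horizontal velocity, constant gravitational acceleration, no air resistance) to the observed 3D positions and extrapolates it down to the camera's horizontal plane to predict the landing point. This is only one plausible kinematic model under stated assumptions; the patent does not prescribe a specific fitting method.

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def fit_and_predict_landing(times: np.ndarray, positions: np.ndarray,
                            camera_plane_z: float = 0.0):
    """times: (N,); positions: (N, 3) columns (x, y, z), z = height above the camera plane.
    Returns the predicted landing point (x, y) and the fitted initial velocity."""
    # Horizontal motion assumed linear in time -> least-squares line fits
    vx, x0 = np.polyfit(times, positions[:, 0], 1)
    vy, y0 = np.polyfit(times, positions[:, 1], 1)
    # Vertical motion z(t) = z0 + vz*t - 0.5*G*t^2 -> fit z + 0.5*G*t^2 as a line in t
    vz, z0 = np.polyfit(times, positions[:, 2] + 0.5 * G * times ** 2, 1)

    # Solve z(t) = camera_plane_z and take the later (physical) root
    a, b, c = -0.5 * G, vz, z0 - camera_plane_z
    t_land = (-b - np.sqrt(b * b - 4 * a * c)) / (2 * a)   # later root since a < 0
    landing_xy = np.array([x0 + vx * t_land, y0 + vy * t_land])
    return landing_xy, np.array([vx, vy, vz])
```

The throwing position can be estimated analogously by extrapolating the same track backwards until it meets the reconstructed building facade.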
S103, judging whether the high-altitude parabolic object threatens the target vehicle according to the landing position and the object type of the high-altitude parabolic object, and if so, controlling the target vehicle to take vehicle body protection measures.
Specifically, when the landing position is located at the position of the target vehicle and the danger level of the object type of the high-altitude parabolic object is high, it is determined that there is a threat to the target vehicle and vehicle body protection measures are taken. Step S103 specifically includes the following steps:
s1031, judging whether the high-altitude parabolic object falls on the current position of the target vehicle according to the falling position;
s1032, determining the object type of the high-altitude parabolic object according to the high-altitude parabolic object identification result, and determining the danger level of the high-altitude parabolic object according to the object type;
s1033, determining that the high-altitude parabolic object threatens the target vehicle when the high-altitude parabolic object falls at the current position of the target vehicle and the danger level is greater than or equal to a preset third threshold value;
s1034, when it is determined that the high-altitude parabolic object threatens the target vehicle, controlling the target vehicle to start an air bag protection device arranged on the roof of the target vehicle.
Specifically, different article types can be preset with different danger levels; for example, the danger level of paper may be set to 0, that of metal products to 5, that of plastic products to 3, and so on. When the high-altitude parabolic object is determined to land at the current position of the target vehicle and its danger level is greater than or equal to the third threshold (e.g. 2), it is determined that the object threatens the target vehicle; the airbag protection device on the roof of the target vehicle is then deployed to cover and protect the vehicle body, preventing the falling object from damaging the target vehicle.
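A minimal sketch of this decision logic, assuming the illustrative danger-level table and the threshold of 2 from the example above; the actual levels, threshold, vehicle footprint and airbag control interface are implementation choices, not fixed by the patent.

```python
# Hypothetical danger levels per recognized object type (illustrative values only)
DANGER_LEVELS = {"paper": 0, "plastic": 3, "metal": 5}
THIRD_THRESHOLD = 2
VEHICLE_RADIUS_M = 1.5   # assumed footprint radius around the vehicle's position

def is_threat(landing_xy, vehicle_xy, object_type: str) -> bool:
    """True when the predicted landing point hits the vehicle and the object's
    danger level meets or exceeds the preset third threshold."""
    dx = landing_xy[0] - vehicle_xy[0]
    dy = landing_xy[1] - vehicle_xy[1]
    lands_on_vehicle = (dx * dx + dy * dy) ** 0.5 <= VEHICLE_RADIUS_M
    return lands_on_vehicle and DANGER_LEVELS.get(object_type, 0) >= THIRD_THRESHOLD

# if is_threat(landing_xy, vehicle_xy, "metal"):
#     deploy_roof_airbag()   # hypothetical call into the vehicle's body-control unit
```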
S104, generating high-altitude parabolic evidence information according to the throwing position and the depth image information, and sending the high-altitude parabolic evidence information to a preset alarm platform or a vehicle owner of the target vehicle.
Specifically, once the throwing position of the high-altitude parabolic object has been identified, and regardless of whether the object is judged to threaten the target vehicle, the throwing position and the corresponding frames of depth images are sent as evidence information to the alarm platform or the owner's user terminal, making it easy to find the person responsible quickly. For example, the throwing position can be annotated in the depth image to form the evidence information.
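As a simple illustration of this evidence step, the sketch below draws a marker at the pixel where the estimated throwing position projects into the image and bundles the annotated frames with metadata. The projection to pixel coordinates and the delivery channel are assumptions for illustration; the patent only requires that the throwing position be noted in the depth images and sent out.

```python
import json
import cv2
import numpy as np

def annotate_throw_position(frame_bgr: np.ndarray, pixel_uv: tuple) -> np.ndarray:
    """Draw a circle and a label at the pixel where the throwing position appears."""
    annotated = frame_bgr.copy()
    cv2.circle(annotated, pixel_uv, 12, (0, 0, 255), 2)
    cv2.putText(annotated, "throw position", (pixel_uv[0] + 15, pixel_uv[1]),
                cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 0, 255), 2)
    return annotated

def build_evidence(frames, pixel_uv, throw_xyz, timestamp: str) -> dict:
    """Package annotated frames plus metadata for the alarm platform or the owner's terminal."""
    images = [annotate_throw_position(f, pixel_uv) for f in frames]
    meta = {"throw_position_m": [float(v) for v in throw_xyz], "timestamp": timestamp}
    return {"images": images, "metadata": json.dumps(meta)}
```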
Further as an alternative embodiment, the high altitude parabolic monitoring method further comprises the steps of:
when the high-altitude parabolic object exists, people around the target vehicle are reminded of the high-altitude falling object risk through the voice broadcasting device arranged on the target vehicle.
Specifically, once a high-altitude parabolic object is identified, and regardless of whether it threatens the target vehicle, a voice prompt is issued through the voice broadcasting device arranged on the target vehicle, so that harm to surrounding people is avoided and the safety of pedestrians is further guaranteed.
The method steps of the embodiments of the present invention have been described above. Through depth image acquisition and model identification, the embodiments can monitor high-altitude parabolic behavior in real time and determine the throwing position for real-time evidence collection, which makes it easier to find the person responsible promptly, reduces the risks caused by high-altitude parabolic objects, and ensures the safety of pedestrians and vehicles to a certain extent.
Referring to fig. 2, an embodiment of the present invention provides a vehicle-based high altitude parabolic monitoring system, comprising:
the high-altitude parabolic recognition module is used for acquiring depth image information of the area above the target vehicle in real time, inputting the depth image information into a pre-trained high-altitude parabolic recognition model, and determining whether a high-altitude parabolic object exists according to the high-altitude parabolic recognition result;
the motion trail determining module is used for determining a first motion trail of the high-altitude parabolic object according to continuous multi-frame depth image information when the high-altitude parabolic object exists, and predicting a throwing position and a landing position of the high-altitude parabolic object according to the first motion trail;
the vehicle body protection control module is used for judging whether the high-altitude parabolic object threatens the target vehicle according to the landing position and the object type of the high-altitude parabolic object, and if so, controlling the target vehicle to take vehicle body protection measures;
the high-altitude parabolic evidence collection module is used for generating high-altitude parabolic evidence collection information according to the throwing position and the depth image information and sending the high-altitude parabolic evidence collection information to a preset alarm platform or a vehicle owner of a target vehicle.
The content in the method embodiment is applicable to the system embodiment, the functions specifically realized by the system embodiment are the same as those of the method embodiment, and the achieved beneficial effects are the same as those of the method embodiment.
Referring to fig. 3, an embodiment of the present invention provides a vehicle-based high altitude parabolic monitoring apparatus, including:
at least one processor;
at least one memory for storing at least one program;
the at least one program, when executed by the at least one processor, causes the at least one processor to implement a vehicle-based overhead parabolic monitoring method as described above.
The content in the method embodiment is applicable to the embodiment of the device, and the functions specifically realized by the embodiment of the device are the same as those of the method embodiment, and the obtained beneficial effects are the same as those of the method embodiment.
The embodiment of the invention also provides a computer readable storage medium, in which a processor executable program is stored, which when executed by a processor is used for executing the above-mentioned high altitude parabolic monitoring method based on the vehicle.
The computer-readable storage medium of the embodiment of the invention can execute the vehicle-based high-altitude parabolic monitoring method provided by the method embodiments of the invention, can execute any combination of the implementation steps of those method embodiments, and has the corresponding functions and beneficial effects of the method.
Embodiments of the present invention also disclose a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The computer instructions may be read from a computer-readable storage medium by a processor of a computer device, and executed by the processor, to cause the computer device to perform the method shown in fig. 1.
In some alternative embodiments, the functions/acts noted in the block diagrams may occur out of the order noted in the operational illustrations. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Furthermore, the embodiments presented and described in the flowcharts of the present invention are provided by way of example in order to provide a more thorough understanding of the technology. The disclosed methods are not limited to the operations and logic flows presented herein. Alternative embodiments are contemplated in which the order of various operations is changed, and in which sub-operations described as part of a larger operation are performed independently.
Furthermore, while the present invention has been described in the context of functional modules, it should be appreciated that, unless otherwise indicated, one or more of the functions and/or features described above may be integrated in a single physical device and/or software module or one or more of the functions and/or features may be implemented in separate physical devices or software modules. It will also be appreciated that a detailed discussion of the actual implementation of each module is not necessary to an understanding of the present invention. Rather, the actual implementation of the various functional modules in the apparatus disclosed herein will be apparent to those skilled in the art from consideration of their attributes, functions and internal relationships. Accordingly, one of ordinary skill in the art can implement the invention as set forth in the claims without undue experimentation. It is also to be understood that the specific concepts disclosed are merely illustrative and are not intended to be limiting upon the scope of the invention, which is to be defined in the appended claims and their full scope of equivalents.
The above functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the part of the technical solution of the present invention that in essence contributes to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods of the various embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
Logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, a processor-containing system, or another system that can fetch the instructions from the instruction execution system, apparatus, or device and execute them. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). In addition, the computer-readable medium may even be paper or other suitable medium upon which the program described above is printed, as the program described above may be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory.
It is to be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above-described embodiments, the various steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, may be implemented using any one or combination of the following techniques, as is well known in the art: discrete logic circuits having logic gates for implementing logic functions on data signals, application specific integrated circuits having suitable combinational logic gates, programmable Gate Arrays (PGAs), field Programmable Gate Arrays (FPGAs), and the like.
In the foregoing description of the present specification, reference to the terms "one embodiment/example", "another embodiment/example", "certain embodiments/examples", and the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, schematic representations of these terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
While embodiments of the present invention have been shown and described, it will be understood by those of ordinary skill in the art that: many changes, modifications, substitutions and variations may be made to the embodiments without departing from the spirit and principles of the invention, the scope of which is defined by the claims and their equivalents.
While the preferred embodiment of the present invention has been described in detail, the present invention is not limited to the above embodiments, and various equivalent modifications and substitutions can be made by those skilled in the art without departing from the spirit of the present invention, and these equivalent modifications and substitutions are intended to be included in the scope of the present invention as defined in the appended claims.
Claims (10)
1. A vehicle-based high altitude parabolic monitoring method, comprising the steps of:
acquiring depth image information of the area above a target vehicle in real time, inputting the depth image information into a pre-trained high-altitude parabolic identification model, and determining whether a high-altitude parabolic object exists according to the high-altitude parabolic identification result;
when a high-altitude parabolic object exists, determining a first motion track of the high-altitude parabolic object according to continuous multi-frame depth image information, and predicting a throwing position and a landing position of the high-altitude parabolic object according to the first motion track;
judging whether the high-altitude parabolic object threatens the target vehicle according to the landing position and the object type of the high-altitude parabolic object, and if so, controlling the target vehicle to take vehicle body protection measures;
and generating high-altitude parabolic evidence information according to the throwing position and the depth image information, and sending the high-altitude parabolic evidence information to a preset alarm platform or a vehicle owner of the target vehicle.
2. The vehicle-based high-altitude parabolic monitoring method according to claim 1, wherein the step of acquiring depth image information of the area above the target vehicle in real time specifically comprises:
and when the target vehicle is in a parking state, starting a binocular depth camera arranged on the roof of the target vehicle, and continuously acquiring the depth image information of the area above the target vehicle through the binocular depth camera.
3. The vehicle-based high altitude parabolic monitoring method according to claim 1, further comprising a step of pre-training the high altitude parabolic recognition model, which specifically comprises:
acquiring a plurality of preset high-altitude parabolic depth images, and determining high-altitude parabolic type labels corresponding to the high-altitude parabolic depth images;
converting the high altitude parabolic depth image into three-dimensional point cloud sample data, and constructing a training data set according to the three-dimensional point cloud sample data and the corresponding high altitude parabolic type label;
and inputting the training data set into a pre-constructed convolutional neural network for training to obtain the trained high-altitude parabolic recognition model.
4. The vehicle-based high-altitude parabolic monitoring method according to claim 3, wherein the step of inputting the training data set into a pre-constructed convolutional neural network for training to obtain the trained high-altitude parabolic recognition model specifically comprises:
inputting the training data set into the convolutional neural network to obtain a high-altitude parabolic prediction result;
determining a loss value of the convolutional neural network according to the high altitude parabolic prediction result and the high altitude parabolic type label;
updating model parameters of the convolutional neural network through a back propagation algorithm according to the loss value, and returning to the step of inputting the training data set into the convolutional neural network;
and stopping training when the loss value reaches a preset first threshold value or the iteration number reaches a preset second threshold value, and obtaining a trained high-altitude parabolic identification model.
5. The vehicle-based high-altitude parabolic monitoring method according to claim 2, wherein the step of determining a first motion track of the high-altitude parabolic object according to the depth image information of a plurality of consecutive frames and predicting the throwing position and the landing position of the high-altitude parabolic object according to the first motion track specifically comprises:
performing differential processing on the depth image information of the current frame and the depth image information of the previous frame to obtain foreground image information of the high-altitude parabolic object corresponding to the depth image information of the current frame, and determining a first position coordinate of the high-altitude parabolic object according to the foreground image information;
determining a first motion track of the high-altitude parabolic object according to the first position coordinates corresponding to the continuous multi-frame depth image information;
determining three-dimensional space position information of a plurality of high-altitude buildings according to the depth image information;
and determining the throwing position according to the first motion track and the three-dimensional space position information, and determining the landing position according to the first motion track and the horizontal plane where the binocular depth camera is positioned.
6. The vehicle-based high-altitude parabolic monitoring method according to claim 1, wherein the step of judging whether the high-altitude parabolic object threatens the target vehicle according to the landing position and the object type of the high-altitude parabolic object, and if so, controlling the target vehicle to take vehicle body protection measures, specifically comprises:
judging whether the high-altitude parabolic object falls at the current position of the target vehicle according to the falling position;
determining the object type of the high-altitude parabolic object according to the high-altitude parabolic object identification result, and determining the danger level of the high-altitude parabolic object according to the object type;
when the high-altitude parabolic object falls at the current position of the target vehicle and the danger level is greater than or equal to a preset third threshold value, determining that the high-altitude parabolic object threatens the target vehicle;
and when the high-altitude parabolic object is determined to have threat to the target vehicle, controlling the target vehicle to start an airbag protection device arranged on the roof of the target vehicle.
7. The vehicle-based high-altitude parabolic monitoring method according to any one of claims 1 to 6, further comprising the step of:
when the high-altitude parabolic object exists, reminding people around the target vehicle of the high-altitude falling object risk through a voice broadcasting device arranged on the target vehicle.
8. A vehicle-based high altitude parabolic monitoring system, comprising:
the high-altitude parabolic recognition module is used for acquiring depth image information of the area above a target vehicle in real time, inputting the depth image information into a pre-trained high-altitude parabolic recognition model, and determining whether a high-altitude parabolic object exists according to the high-altitude parabolic recognition result;
the motion trail determining module is used for determining a first motion trail of the high-altitude parabolic object according to the continuous multi-frame depth image information when the high-altitude parabolic object exists, and predicting a throwing position and a landing position of the high-altitude parabolic object according to the first motion trail;
the vehicle body protection control module is used for judging whether the high-altitude parabolic object threatens the target vehicle according to the landing position and the object type of the high-altitude parabolic object, and if so, controlling the target vehicle to take vehicle body protection measures;
the high-altitude parabolic evidence obtaining module is used for generating high-altitude parabolic evidence obtaining information according to the throwing position and the depth image information and sending the high-altitude parabolic evidence obtaining information to a preset alarm platform or a vehicle owner of the target vehicle.
9. A vehicle-based high altitude parabolic monitoring apparatus, comprising:
at least one processor;
at least one memory for storing at least one program;
when the at least one program is executed by the at least one processor, the at least one processor is caused to implement a vehicle-based high altitude parabolic monitoring method as claimed in any one of claims 1 to 7.
10. A computer readable storage medium, in which a processor executable program is stored, characterized in that the processor executable program, when being executed by a processor, is for performing a vehicle-based high altitude parabolic monitoring method according to any one of claims 1 to 7.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| CN202310426904.XA | 2023-04-19 | 2023-04-19 | High-altitude parabolic monitoring method, system and device based on vehicle and storage medium (CN116486334A) |

Applications Claiming Priority (1)

| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| CN202310426904.XA | 2023-04-19 | 2023-04-19 | High-altitude parabolic monitoring method, system and device based on vehicle and storage medium (CN116486334A) |
Publications (1)

| Publication Number | Publication Date |
| --- | --- |
| CN116486334A | 2023-07-25 |

Family

ID=87224588

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
| --- | --- | --- | --- |
| CN202310426904.XA (Pending) | CN116486334A (en) | 2023-04-19 | 2023-04-19 |

Country Status (1)

| Country | Link |
| --- | --- |
| CN (1) | CN116486334A (en) |
Cited By (1)

| Publication number | Priority date | Publication date | Assignee | Title |
| --- | --- | --- | --- | --- |
| CN118134962A | 2024-05-08 | 2024-06-04 | National University of Defense Technology | High-altitude parabolic detection method, electronic equipment and storage medium |
Legal Events

| Date | Code | Title | Description |
| --- | --- | --- | --- |
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |