CN114882661A - Outdoor early warning method, device, system and computer readable storage medium - Google Patents

Outdoor early warning method, device, system and computer readable storage medium

Info

Publication number
CN114882661A
Authority
CN
China
Prior art keywords
moving target
vehicle
radar
moving
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210465111.4A
Other languages
Chinese (zh)
Inventor
周建鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Xiaopeng Motors Technology Co Ltd
Original Assignee
Guangzhou Xiaopeng Motors Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Xiaopeng Motors Technology Co Ltd filed Critical Guangzhou Xiaopeng Motors Technology Co Ltd
Priority to CN202210465111.4A priority Critical patent/CN114882661A/en
Publication of CN114882661A publication Critical patent/CN114882661A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 - Burglar, theft or intruder alarms
    • G08B13/22 - Electrical actuation
    • G08B13/24 - Electrical actuation by interference with electromagnetic field distribution
    • G08B13/2491 - Intrusion detection systems, i.e. where the body of an intruder causes the interference with the electromagnetic field
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 - Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 - Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867 - Combination of radar systems with cameras
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 - Radar or analogous systems specially adapted for specific applications
    • G01S13/886 - Radar or analogous systems specially adapted for specific applications for alarm systems
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 - Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 - Alarms for ensuring the safety of persons
    • G08B21/10 - Alarms for ensuring the safety of persons responsive to calamitous events, e.g. tornados or earthquakes
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A - TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00 - Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10 - Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Abstract

The application relates to an outdoor early warning method, device, system and computer-readable storage medium. The method comprises the following steps: acquiring vehicle surrounding environment information through vehicle-mounted equipment, wherein the vehicle-mounted equipment comprises a vision sensor, a vibration sensor and an induction radar arranged on a vehicle; identifying whether an illegal intrusion behavior occurs based on data collected by the vision sensor and/or the induction radar; predicting whether a disaster threat event occurs based on data collected by at least one of the vehicle-mounted devices; and sending alarm information if an illegal intrusion behavior and/or a disaster threat event occurs. The scheme provided by the application can give early warning of outdoor dangers in a variety of scenes by relying only on equipment already configured on the vehicle.

Description

Outdoor early warning method, device, system and computer readable storage medium
Technical Field
The application relates to the field of intelligent driving, and in particular to an outdoor early warning method, device, system and computer-readable storage medium.
Background
Whether people camp in the field for work or travel to mountainous areas, seasides and other outdoor camping places to relax, their personal safety is of the utmost importance. In the related art, outdoor early warning relies on professional early warning equipment purchased by the outdoor personnel themselves; for example, some portable video monitoring systems undertake the outdoor early warning work. When such professional early warning equipment detects an illegal intrusion, it sends an audible and visual alarm to the outdoor personnel, which does play a role in preventing illegal intrusion to a certain extent. However, the related art requires the additional purchase of professional early warning equipment and suffers from the defects of a single early warning scene, a small coverage range and/or poor accuracy. In some scenes where there is no illegal intrusion but a threat to the party concerned still exists, the failure of the related art to give a warning may cause great harm to that party.
Disclosure of Invention
In order to solve or at least partially solve the problems in the related art, the application provides an outdoor early warning method, device, system and computer-readable storage medium that can give early warning of outdoor dangers in a variety of scenes using only the equipment configured on a vehicle.
The first aspect of the present application provides an outdoor early warning method, including:
acquiring vehicle surrounding environment information through vehicle-mounted equipment, wherein the vehicle-mounted equipment comprises a vision sensor, a vibration sensor and an induction radar which are arranged on a vehicle;
identifying whether illegal intrusion behaviors occur or not based on data collected by the vision sensor and/or the induction radar;
predicting whether a disaster threat event occurs or not based on data collected by at least one of the vehicle-mounted devices;
and if the illegal intrusion behavior and/or the disaster threat event occur, sending alarm information.
A second aspect of the present application provides an outdoor early warning device, comprising:
an acquisition module, used for acquiring vehicle surrounding environment information through vehicle-mounted equipment, wherein the vehicle-mounted equipment comprises a vision sensor, a vibration sensor and an induction radar which are arranged on a vehicle;
the identification module is used for identifying whether illegal intrusion behaviors occur or not based on data collected by the vision sensor and/or the induction radar;
the prediction module is used for predicting whether a disaster threat event occurs or not based on data collected by at least one piece of vehicle-mounted equipment;
and the alarm module is used for sending alarm information if the illegal intrusion behavior and/or the disaster threat event occur.
A third aspect of the present application provides a vehicle comprising:
a processor; and
a memory having executable code stored thereon, which when executed by the processor, causes the processor to perform the method as described above.
A fourth aspect of the present application provides a computer-readable storage medium having stored thereon executable code, which, when executed by a processor of an electronic device, causes the processor to perform the method as described above.
The technical scheme provided by the application can have the following beneficial effects: on the one hand, the devices that collect environmental information, such as the vision sensor, the vibration sensor and the induction radar, are all vehicle-mounted devices, which means that outdoor personnel neither need to purchase additional professional early warning equipment nor need to operate it, so the requirement on the professional skill of outdoor personnel is reduced during outdoor early warning; on the other hand, based on these vehicle-mounted devices, not only can illegal intrusion behaviors be identified, but the occurrence of disaster threat events can also be predicted, so that outdoor dangers in a variety of scenes can be warned of in advance and the safety of outdoor activity personnel is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The foregoing and other objects, features and advantages of the application will be apparent from the following more particular descriptions of exemplary embodiments of the application as illustrated in the accompanying drawings wherein like reference numbers generally represent like parts throughout the application.
Fig. 1 is a schematic flow chart of an outdoor early warning method according to an embodiment of the present disclosure;
fig. 2 is a schematic structural diagram of an outdoor early warning device shown in an embodiment of the present application;
fig. 3 is a schematic structural diagram of a vehicle according to an embodiment of the present application.
Detailed Description
Embodiments of the present application will be described in more detail below with reference to the accompanying drawings. While embodiments of the present application are illustrated in the accompanying drawings, it should be understood that the present application may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It should be understood that although the terms "first," "second," "third," etc. may be used herein to describe various information, these information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present application. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present application, "a plurality" means two or more unless specifically limited otherwise.
Whether people camp in the field for work or travel to mountainous areas, seasides and other outdoor camping places to relax, their personal safety is of the utmost importance. In the related art, outdoor early warning relies on professional early warning equipment purchased by the outdoor personnel themselves; for example, some portable video monitoring systems undertake the outdoor early warning work. When such professional early warning equipment detects an illegal intrusion, it sends an audible and visual alarm to the outdoor personnel, which does play a role in preventing illegal intrusion to a certain extent. However, the related art not only requires the additional purchase of professional early warning equipment and a certain or even fairly high level of professional skill from the outdoor personnel, but also suffers from the defects of a single early warning scene, a small coverage range and/or poor accuracy. In some scenes where there is no illegal intrusion but a threat to the party concerned still exists, the failure of the related art to give a warning may cause great harm to that party.
In order to solve the above problems, embodiments of the present application provide an outdoor early warning method that can give early warning of outdoor dangers in various scenes using only the equipment configured on a vehicle.
The technical solutions of the embodiments of the present application are described in detail below with reference to the accompanying drawings.
Referring to fig. 1, which shows a schematic flow chart of an outdoor early warning method according to an embodiment of the present application, the method mainly includes steps S101 to S104, described as follows:
step S101: the method comprises the steps of obtaining vehicle surrounding environment information through vehicle-mounted equipment, wherein the vehicle-mounted equipment comprises a vision sensor, a vibration sensor and an induction radar which are arranged on a vehicle.
In the embodiment of the present application, the vision sensor may be an image capturing device such as a monocular camera, a binocular camera or a depth camera, and the sensing radar may be a laser radar, an ultrasonic radar and/or a millimeter-wave radar. These vehicle-mounted devices may be deployed at the front, rear and both sides of the vehicle; for example, the vision sensor may include a forward-looking camera deployed at the front end of the vehicle and surround-view cameras deployed on both sides of the vehicle. Generally, the distance covered by images captured by a vision sensor such as the forward-looking camera is longer than the detection distance of the induction radar.
Step S102: and identifying whether illegal intrusion behaviors occur or not based on data collected by the vision sensor and/or the induction radar.
In the embodiment of the application, the data collected by the vision sensor is image data, and the data collected by the induction radar is point cloud data. Point cloud data is the information returned from the points at which a signal emitted by an induction radar such as a laser radar, an ultrasonic radar or a millimeter-wave radar hits a target: because the number of such points is generally large (usually on the order of tens or hundreds of thousands) and the points cluster in a cloud-like shape in a two-dimensional plane or three-dimensional space, the returned information is called point cloud data, and each point (which may be called a cloud point) carries information such as two-dimensional or three-dimensional coordinates, target texture, reflection intensity and echo frequency. The illegal intrusion behavior in the embodiments of the present application refers to behavior performed by animals, including humans, that may cause personal injury to an intruded object, where the intruded object may be the vehicle and/or outdoor activity personnel near the vehicle, for example people who camp in mountain areas for work or camp outdoors at the seaside to relax.
The sensing radar and the vision sensor each have strengths and weaknesses in data collection. For example, in harsh environments such as low light and/or rainy weather, the image data collected by the vision sensor, whether a monocular camera, a binocular camera or a depth camera, suffers from poor clarity, whereas the sensing radar does not have this problem and also has a great advantage over the vision sensor in locating targets. However, it is usually difficult to identify the type of a target, for example whether it is a person, a ferocious animal or another animal, based on sensing radar data alone, while in an environment with good light the type of the target can easily be identified based on the image data collected by the vision sensor. Therefore, in the embodiment of the application, different sensors or the sensing radar can be selected according to the current environment of the vehicle, which can be judged from data collected by a raindrop sensor and/or a photosensitive sensor. If the vehicle is currently in an environment where it is raining and/or the light is poor, only the sensing radar may be enabled to collect data; otherwise, both the vision sensor and the sensing radar are enabled, as described in detail below.
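For illustration only, the following minimal Python sketch shows one way the environment-based sensor selection just described could be wired up; the sensor names, readings and thresholds are assumptions, not values taken from this application.

```python
# Illustrative sketch (assumed thresholds): pick which sensors to rely on
# from raindrop-sensor and photosensitive-sensor readings.
from dataclasses import dataclass

@dataclass
class EnvironmentReading:
    rain_level: float       # raindrop-sensor output, 0.0 means dry
    illuminance_lux: float  # photosensitive-sensor output

def select_sensors(env: EnvironmentReading,
                   rain_threshold: float = 0.2,
                   lux_threshold: float = 50.0) -> set:
    """Return the set of sensors to use for intrusion detection."""
    harsh = env.rain_level > rain_threshold or env.illuminance_lux < lux_threshold
    if harsh:
        # Rain or poor light degrades camera images, so rely on the radar alone.
        return {"sensing_radar"}
    # Otherwise fuse camera and radar data.
    return {"vision_sensor", "sensing_radar"}

print(select_sensors(EnvironmentReading(rain_level=0.0, illuminance_lux=800.0)))
```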
As an embodiment of the present application, identifying whether an illegal intrusion behavior occurs based on data collected by the sensing radar may be performed as follows: calculating kinematic data of a moving target and/or the distance between the moving target and the intruded object based on the data collected by the sensing radar; and if the kinematic data of the moving target exceeds a preset kinematic data threshold and/or the distance between the moving target and the intruded object exceeds a preset first distance threshold, determining that the moving target is carrying out an illegal intrusion behavior, where the kinematic data includes information such as the moving speed, acceleration and orientation of the moving target. As mentioned above, the sensing radar has a great advantage in locating targets because it can generally range accurately; based on this accurate ranging, the kinematic data of the moving target and/or the distance between the moving target and the intruded object can be calculated from the data collected by the sensing radar. Whether the attacker is a human or a ferocious animal (for example a jackal, wolf, tiger or leopard), it generally has a specific speed, acceleration or orientation when attacking a target, and a moving target that is close to the intruded object also constitutes a threat to it; therefore, if the calculated kinematic data of the moving target exceeds the preset kinematic data threshold and/or the distance between the moving target and the intruded object exceeds the preset first distance threshold, it is determined that the moving target is carrying out an illegal intrusion behavior. Furthermore, in view of the complex outdoor environment, there may be objects whose speed changes suddenly without any actual attack, for example vegetation blown by a strong wind. In order to prevent the system from frequently giving false alarms and disturbing the normal activities of outdoor personnel, this embodiment may further detect a first position and a second position of the moving target within a preset time period after the sensing radar detects it, and determine that the moving target is a false target, for example flowers, grass or trees occasionally blown by a strong wind, if the distance between the first position and the second position within the preset time period is smaller than a preset second distance threshold. After a moving target is determined to be a false target, its data can be culled without further processing.
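A minimal sketch of the radar-only check just described is given below; all thresholds are illustrative assumptions, and the distance test is interpreted as "closer than the threshold", consistent with the later explanation of step S1024.

```python
# Illustrative sketch (assumed thresholds) of the kinematic/distance test and
# the false-target test based on displacement over a short observation window.
import math

def is_intrusion(speed, acceleration, heading_to_object_deg, distance_m,
                 speed_thr=2.0, accel_thr=1.5, heading_thr_deg=30.0, dist_thr=15.0):
    """Flag an intrusion when kinematics exceed thresholds while the target
    heads roughly toward the intruded object, or when it is already close."""
    aggressive = (speed > speed_thr or acceleration > accel_thr) \
        and abs(heading_to_object_deg) < heading_thr_deg
    too_close = distance_m < dist_thr
    return aggressive or too_close

def is_false_target(first_pos, second_pos, min_displacement_m=0.5):
    """Targets that barely move within the preset period (e.g. wind-blown
    vegetation) are treated as false targets and culled."""
    dx, dy = second_pos[0] - first_pos[0], second_pos[1] - first_pos[1]
    return math.hypot(dx, dy) < min_displacement_m

# Example: a fast target, clearly displaced, closing in on the campsite.
if not is_false_target((10.0, 5.0), (6.0, 2.0)):
    print(is_intrusion(speed=3.2, acceleration=0.4,
                       heading_to_object_deg=10.0, distance_m=12.0))  # True
```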
As an embodiment of the present application, the identification of whether an illegal intrusion behavior occurs based on data collected by the vision sensor and the sensing radar may be further implemented through steps S1021 to S1024, which are described in detail as follows:
step S1021: based on image data acquired by the vision sensor, the type and the spatial activity information of the first moving object in the vision coordinate system are determined.
After the vision sensor collects image data of the vehicle's surroundings, the image data can be input into a trained neural network, which extracts features from the images to obtain the type of the first moving target and its spatial activity information in the visual coordinate system. Here, the type of the first moving target refers to what the first moving target specifically is, for example whether it is a human or a ferocious animal such as a wolf or jackal. The spatial activity information of the first moving target in the visual coordinate system may include the position, posture, movement track and the like of the first moving target in the visual coordinate system.
Step S1022: and determining the spatial activity information of the second moving target under the radar coordinate system based on the point cloud data acquired by the induction radar.
Similar to the spatial activity information of the first moving target in the visual coordinate system, the spatial activity information of the second moving target in the radar coordinate system may include the position, posture, movement track and the like of the second moving target in the radar coordinate system.
Step S1023: and matching the first moving target and the second moving target based on the spatial activity information of the first moving target and the spatial activity information of the second moving target in the same coordinate system.
As mentioned previously, the vision sensor has the advantage of determining the type of a target, while the induction radar has the advantage of locating moving targets. Matching the first moving target and the second moving target based on their spatial activity information therefore serves two purposes: on the one hand, it confirms whether the first moving target and the second moving target are the same moving target; on the other hand, once they are confirmed to be the same, the type of the moving target detected by the induction radar can be determined by means of the type of the first moving target determined from the image data collected by the vision sensor. Specifically, as an embodiment of the present application, matching the first moving target and the second moving target based on their spatial activity information in the same coordinate system may be performed as follows: convert the visual coordinate system and the radar coordinate system into the same coordinate system; in that coordinate system, calculate the degree of overlap between the region of interest corresponding to the spatial activity information of the first moving target and the region of interest corresponding to the spatial activity information of the second moving target; and if that degree of overlap is greater than a preset threshold, conclude that the first moving target and the second moving target are successfully matched. Converting the visual coordinate system and the radar coordinate system into the same coordinate system may mean converting the visual coordinate system into the radar coordinate system, or converting the radar coordinate system into the visual coordinate system. During the conversion, a transformation matrix between the two coordinate systems can be obtained from the calibration relationship between the vision sensor and the induction radar. If, in the same coordinate system, the degree of overlap between the two regions of interest is greater than the preset threshold, this indicates that the first moving target captured by the vision sensor and the second moving target detected by the induction radar are actually the same moving target, and the first moving target and the second moving target are successfully matched.
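A minimal sketch of this region-of-interest matching is shown below, assuming a known 4x4 radar-to-camera calibration transform and a 0.5 overlap threshold; both are illustrative assumptions rather than values from the application.

```python
# Illustrative sketch: project radar detections into the camera frame with an
# assumed calibration transform, then match the two regions of interest by
# intersection-over-union (one common measure of overlap).
import numpy as np

def to_camera_frame(points_radar: np.ndarray, T_radar_to_cam: np.ndarray) -> np.ndarray:
    """Apply a 4x4 homogeneous transform to Nx3 radar points."""
    homo = np.hstack([points_radar, np.ones((points_radar.shape[0], 1))])
    return (T_radar_to_cam @ homo.T).T[:, :3]

def roi_overlap(box_a, box_b) -> float:
    """Intersection-over-union of two axis-aligned boxes (x_min, y_min, x_max, y_max)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter) if inter > 0 else 0.0

def targets_match(roi_vision, roi_radar_in_camera_frame, overlap_threshold=0.5) -> bool:
    return roi_overlap(roi_vision, roi_radar_in_camera_frame) > overlap_threshold

# ROIs of the first (vision) and second (radar, already projected) moving targets.
print(targets_match((100, 80, 220, 260), (110, 90, 230, 250)))  # True
```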
As another embodiment of the present application, matching the first moving target and the second moving target in the same coordinate system based on their spatial activity information may also be performed as follows: convert the track of the first moving target into a bird's-eye-view track of the first moving target under the sensing radar's viewing angle; calculate the relative displacement similarity and the track line-type similarity of the first moving target and the second moving target over the same time period from the bird's-eye-view track of the first moving target and the track of the second moving target; and if the relative displacement similarity and the track line-type similarity over the same time period exceed a first preset threshold and a second preset threshold respectively, determine that the bird's-eye-view track of the first moving target is successfully matched with the track of the second moving target. In this embodiment, converting the track of the first moving target into the bird's-eye-view track under the sensing radar's viewing angle may be done by multiplying the track of the first moving target by the inverse of a transformation matrix H, where H can be obtained through the following steps 1) to 3) (a minimal sketch of this estimation is given after the steps):
1) take the four corners of a relatively salient rectangular object (such as a lane marking or an automobile) in an original image as reference points and record their positions in the original image, where the original image is any image acquired at the same angle the vision sensor used when capturing the image to which the first moving target belongs;
2) estimate the relative positions of the four reference points in the bird's-eye view (that is, an image as if shot from directly above) according to the real distances between the reference points, and estimate their positions in the bird's-eye view according to the ratio of the pixel distances to the size of the original image;
3) substitute the positions of the four reference points in the original image and in the bird's-eye view into the formula P0 = H × P1 to solve for the transformation matrix H, where H is a 4 × 4 transformation matrix, P0 is the position of any point in the original image, and P1 is the position of that same point in the bird's-eye view.
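The sketch below illustrates steps 1) to 3) and the trajectory mapping. It is an assumption-laden example: it uses OpenCV's standard 3 × 3 planar homography between the four point pairs (rather than the 4 × 4 matrix mentioned in the text), and the reference-point coordinates are made up.

```python
# Illustrative sketch (assumed reference points, 3x3 planar homography):
# estimate H with P0 = H * P1 (P0 in the original image, P1 in the bird's-eye
# view), then map an image-plane trajectory into the bird's-eye view via H^-1.
import numpy as np
import cv2

# Step 1): pixel positions of the four corners of a salient rectangle in the original image.
pts_image = np.float32([[420, 620], [860, 620], [980, 900], [300, 900]])
# Step 2): their estimated positions in the bird's-eye view, scaled from real distances.
pts_birdseye = np.float32([[400, 200], [880, 200], [880, 900], [400, 900]])

# Step 3): H maps bird's-eye -> image, so it satisfies P0 = H * P1.
H = cv2.getPerspectiveTransform(pts_birdseye, pts_image)
H_inv = np.linalg.inv(H)  # image -> bird's-eye, the inverse applied to the trajectory

def image_track_to_birdseye(track_xy: np.ndarray) -> np.ndarray:
    """Map an Nx2 image-plane track of the first moving target into the bird's-eye view."""
    pts = track_xy.reshape(-1, 1, 2).astype(np.float64)
    return cv2.perspectiveTransform(pts, H_inv).reshape(-1, 2)

track = np.array([[500.0, 700.0], [520.0, 690.0], [540.0, 680.0]])
print(image_track_to_birdseye(track))
```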
In the above embodiment, the track line-type similarity of the first moving target and the second moving target over the same time period may be obtained by sampling the tracks of the first moving target and the second moving target, computing derivatives at the sampling points and summing them to obtain a curvature measure for the track of the first moving target and for the track of the second moving target, and then combining the ratio of the two curvature measures and the ratio of the two track lengths with corresponding weights; the weighted sum may be used as the track line-type similarity of the first moving target and the second moving target over the same time period. The relative displacement similarity of the first moving target and the second moving target over the same time period may be obtained as follows: extract the distance moved by the first moving target on the two-dimensional coordinate plane within its bird's-eye-view track over that time period; divide this distance by the size of the image to which the bird's-eye-view track of the first moving target belongs to obtain a first relative displacement; and calculate the ratio of the first relative displacement to a second relative displacement to obtain the relative displacement similarity of the first moving target and the second moving target, where the second relative displacement is the distance moved by the second moving target on the two-dimensional plane within its track over the preset time period divided by the size of the image to which the track of the second moving target belongs. The distance moved by the first moving target or the second moving target on the two-dimensional coordinate plane over the same time period may be the distance moved along the x-axis or the y-axis of that plane. It should be noted that, because the sensing radar acquires spatial position information such as the track of a moving target with relatively high reliability, the second relative displacement is also relatively reliable, so the ratio of the first relative displacement to the second relative displacement may be used as the relative displacement similarity of the first moving target and the second moving target.
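The following sketch gives one plausible reading of the two similarity measures; the normalisation sizes, the curvature proxy and the example tracks are assumptions for illustration only.

```python
# Illustrative sketch (assumed normalisation and curvature proxy) of the
# relative-displacement similarity and the track line-type similarity.
import numpy as np

def relative_displacement(track: np.ndarray, image_size: float, axis: int = 0) -> float:
    """Displacement along one axis over the window, normalised by the image size."""
    return abs(track[-1, axis] - track[0, axis]) / image_size

def displacement_similarity(bev_track, bev_size, radar_track, radar_size) -> float:
    # The radar track is trusted, so it provides the denominator.
    return relative_displacement(bev_track, bev_size) / \
        max(relative_displacement(radar_track, radar_size), 1e-9)

def line_type_similarity(track_a: np.ndarray, track_b: np.ndarray) -> float:
    """Compare summed |dy/dx| at sampled points as a crude curvature measure."""
    def bend(track):
        dx = np.gradient(track[:, 0])
        dy = np.gradient(track[:, 1])
        return np.sum(np.abs(dy / np.where(np.abs(dx) < 1e-9, 1e-9, dx)))
    a, b = bend(track_a), bend(track_b)
    return min(a, b) / max(a, b, 1e-9)

bev_track = np.array([[0.0, 0.0], [1.0, 0.4], [2.0, 0.9], [3.0, 1.5]])
radar_track = np.array([[0.0, 0.0], [1.1, 0.5], [2.1, 1.0], [3.2, 1.6]])
print(displacement_similarity(bev_track, 100.0, radar_track, 100.0),
      line_type_similarity(bev_track, radar_track))
```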
Step S1024: and if the matching is successful and the second moving target approaches to an infringement object in the monitoring range of the induction radar, determining that the target belonging to the type is carrying out illegal intrusion behavior.
If the first moving target and the second moving target are successfully matched, the target captured by the vision sensor and the target detected by the induction radar are the same object. When the second moving target (in fact also the first moving target captured by the vision sensor) approaches an intruded object within the monitoring range of the induction radar, it gets closer and closer to that object; once the distance between the second moving target and the intruded object shortens to a preset threshold, it is determined that a target of the type determined from the vision sensor is carrying out an illegal intrusion behavior.
The above embodiments mainly describe in detail the process of identifying whether an illegal intrusion behavior occurs based on data collected by the sensing radar, and based on data collected by the vision sensor and the sensing radar together. In fact, as can be seen from the above description, the technical solution of the present application may also identify whether an illegal intrusion behavior occurs based on data collected by the vision sensor alone; identification based on vision-sensor data alone is generally better suited to scenes with good illumination, such as sunny days. In addition, considering that the number of cloud points corresponding to the data (that is, the point cloud data) acquired by the induction radar is commonly in the hundreds of thousands, in order to reduce the consumption of computing resources and thus improve the real-time performance of the technical solution, in the embodiment of the present application the data collected by the induction radar may be screened before matching the first moving target and the second moving target based on their spatial activity information, so as to filter out null target signals and/or invalid target signals. A null target signal may be a return whose corresponding cloud-point density is very sparse, even close to 0, and an invalid target signal may be a return whose corresponding cloud points obviously belong to distorted points.
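A minimal sketch of such pre-matching screening is given below; the range limit and density criterion are assumptions used only to illustrate filtering out null (near-empty) and invalid (implausible) returns.

```python
# Illustrative sketch (assumed range and density criteria): drop radar points
# that are out of a plausible range or isolated (near-zero local density).
import numpy as np

def filter_radar_points(points: np.ndarray,
                        max_range_m: float = 150.0,
                        neighbor_radius: float = 1.0,
                        min_neighbors: int = 3) -> np.ndarray:
    """Return only points that are within range and not isolated."""
    ranges = np.linalg.norm(points, axis=1)
    in_range = points[ranges < max_range_m]
    keep = []
    for p in in_range:
        neighbors = np.sum(np.linalg.norm(in_range - p, axis=1) < neighbor_radius) - 1
        if neighbors >= min_neighbors:
            keep.append(p)
    return np.array(keep) if keep else np.empty((0, points.shape[1]))

# A dense cluster of returns (e.g. one target) survives the screening.
cluster = np.random.default_rng(0).normal(loc=[10.0, 5.0, 0.0], scale=0.3, size=(50, 3))
print(len(filter_radar_points(cluster)))
```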
Step S103: and predicting whether a disaster threat event occurs or not based on data collected by at least one vehicle-mounted device in the vehicle-mounted devices.
It should be noted that, unlike the illegal intrusion behavior of the foregoing embodiments, which is mainly caused by humans or ferocious animals, a disaster threat event here mainly refers to a natural event threatening the personal safety of outdoor personnel, such as an earthquake, a torrential flood, a debris flow, a landslide or falling rocks. As an embodiment of the present application, the at least one vehicle-mounted device includes the vision sensor and the induction radar, and predicting whether a disaster threat event occurs based on the data it collects may be performed as follows: extract the position and geometric attributes of a moving target on the image based on the image data acquired by the vision sensor, where the geometric attributes include the shape and size of the moving target; determine whether the moving target is a natural-disaster derivative according to its geometric attributes; if the moving target is a natural-disaster derivative, for example a rigid body such as a falling rock or a tree, perform curve fitting on the position of the moving target on the image and the positioning data of the moving target acquired by the induction radar to obtain the motion track of the moving target; and if the motion track of the moving target covers the point corresponding to the intruded object and/or the distance between the moving target and the intruded object is less than a preset distance threshold, determine that a disaster threat event occurs. When performing the curve fitting, the moving target captured by the vision sensor and the moving target detected by the induction radar may first be matched to confirm that they are the same object; the specific matching method may refer to the description of the related embodiments above. As described above, the induction radar has a great advantage in positioning, so after the two moving targets are confirmed to be the same object, the motion track can be obtained by fitting the position, speed, orientation and other data of the moving target at different times acquired by the induction radar. If the motion track of the moving target covers the point corresponding to the intruded object and/or the distance between the moving target and the intruded object is smaller than the preset distance threshold, the moving target may harm the intruded object, such as outdoor personnel, and it is therefore determined that a disaster threat event has occurred.
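For illustration, the sketch below fits a simple polynomial to the tracked positions of a rigid natural-disaster derivative and applies the coverage and distance tests; the polynomial degree, tolerance, threshold and example data are assumptions.

```python
# Illustrative sketch (assumed degree, tolerance and threshold): fit y = f(x)
# to the target's observed positions, then test coverage of the protected point
# and the current distance to it.
import numpy as np

def fit_trajectory(xs, ys, degree: int = 2):
    """Least-squares fit y = f(x) through the observed target positions."""
    return np.poly1d(np.polyfit(np.asarray(xs), np.asarray(ys), degree))

def disaster_threat(xs, ys, protected_xy, coverage_tol=0.5, dist_thr=10.0) -> bool:
    f = fit_trajectory(xs, ys)
    px, py = protected_xy
    covers_point = abs(f(px) - py) < coverage_tol               # track passes through the point
    current_dist = float(np.hypot(xs[-1] - px, ys[-1] - py))    # latest position vs the point
    return covers_point or current_dist < dist_thr

# A rock sliding down a slope toward a campsite located at (12.0, 1.2).
print(disaster_threat(xs=[0, 2, 4, 6], ys=[9, 6.5, 4.5, 3.0], protected_xy=(12.0, 1.2)))  # True
```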
It should be noted that the motion track of a moving target can be regarded as a curve equation on a two-dimensional plane (the size of the moving target is not considered here, because a moving target of any size can harm the intruded object, so the three-dimensional case is not considered when modelling the motion track). In the above embodiment, judging whether the motion track of the moving target covers the point corresponding to the intruded object, that is, whether the motion track passes through that point, amounts to checking whether the point corresponding to the intruded object is a solution of the curve equation of the motion track. For example, suppose the motion track of the moving target corresponds to the curve equation y = f(x) and the coordinates of the point corresponding to the intruded object on the two-dimensional plane are (x0, y0); if x0 and y0 are a solution of the curve equation y = f(x), it is determined that the motion track of the moving target covers the point corresponding to the intruded object, that is, the motion track passes through that point. If the intruded object is itself in motion, curve fitting can likewise be performed on its position on the image and the positioning data collected by the induction radar to obtain its motion track, whose curve equation on the two-dimensional plane is y = g(x). One way to judge whether the motion track of the moving target, with curve equation y = f(x), covers the point corresponding to the intruded object is then to combine the curve equations y = g(x) and y = f(x) into a system and examine whether the system has a solution in the rational-number domain. If the system of curve equations has such a solution, it is determined that the motion track of the moving target covers the point corresponding to the intruded object.
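When the intruded object is itself moving, the coverage test becomes an intersection test between two fitted curves; the sketch below solves the combined system symbolically, with made-up example polynomials standing in for the fitted tracks.

```python
# Illustrative sketch (assumed example curves): combine y = f(x) and y = g(x)
# and check whether the system has a real solution, i.e. the tracks cross.
import sympy as sp

x = sp.symbols("x", real=True)
f = -0.1 * x**2 + 2 * x   # fitted track of the moving target (e.g. a falling rock)
g = 0.5 * x + 3           # fitted track of the moving intruded object

intersections = sp.solve(sp.Eq(f, g), x)
real_hits = [s for s in intersections if s.is_real]
print(real_hits, bool(real_hits))  # a non-empty list means the tracks cross: threat
```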
Benefiting from the advantages of the induction radar in positioning, in one embodiment of the present application, predicting whether a disaster threat event occurs based on the data collected by the induction radar may also be performed as follows: cluster the cloud points corresponding to the point cloud data acquired by the induction radar to obtain a plurality of point cloud clusters; calculate the average distance of each of the point cloud clusters, or the distance between each point cloud cluster and the intruded object; and if the average distance of a point cloud cluster C among the plurality of point cloud clusters changes within a preset time period and/or the distance between the point cloud cluster C and the intruded object exceeds a preset distance threshold, determine that a disaster threat event occurs. In this embodiment, the point cloud clusters obtained from the data acquired by the induction radar may correspond to natural-disaster derivatives that deform easily or are fluid-like, such as floods, debris flows and landslides. The average distance of a point cloud cluster can be defined as the average of the distances of all cloud points in the cluster within one detection period of the induction radar. Since the average distance of a point cloud cluster is relatively constant when the form of the natural object does not change, once its form changes, for example when still water becomes a flood, a debris flow or a landslide, the average distance of the corresponding point cloud cluster changes rapidly; therefore, once the average distance of a point cloud cluster C changes within the preset time period and/or the distance between the cluster C and the intruded object exceeds the preset distance threshold, it is determined that a disaster threat event has occurred. It should be noted that the distance between the point cloud cluster C and the intruded object may be defined as the distance between the centroid of the cluster C and the intruded object.
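A minimal sketch of this clustering-based check follows; the DBSCAN parameters, change ratio and distance threshold are illustrative assumptions, and the distance criterion is read here as "the cluster gets too close to the intruded object".

```python
# Illustrative sketch (assumed parameters): cluster the radar point cloud,
# track each cluster's mean point distance across detection periods, and flag
# a threat when that mean changes sharply or the cluster centroid is too close.
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_mean_distances(points: np.ndarray, eps: float = 1.5, min_samples: int = 5) -> dict:
    """Return {cluster_label: mean range of its cloud points} for one detection period."""
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points)
    means = {}
    for label in set(labels) - {-1}:          # -1 marks DBSCAN noise points
        cluster = points[labels == label]
        means[label] = float(np.linalg.norm(cluster, axis=1).mean())
    return means

def disaster_from_cluster(mean_prev: float, mean_curr: float, centroid_dist_m: float,
                          change_ratio_thr: float = 0.3, dist_thr: float = 20.0) -> bool:
    relative_change = abs(mean_curr - mean_prev) / max(mean_prev, 1e-9)
    return relative_change > change_ratio_thr or centroid_dist_m < dist_thr

print(disaster_from_cluster(mean_prev=40.0, mean_curr=24.0, centroid_dist_m=35.0))  # True
```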
As for the data collected by the vibration sensor, in the embodiment of the present application whether a disaster threat event occurs is predicted from the abnormal ground vibration it collects: when the abnormal vibration exceeds a preset vibration threshold, a disaster threat event, for example an earthquake, is predicted to occur.
Step S104: and if illegal intrusion behaviors and/or disaster threat events occur, sending alarm information.
When an illegal intrusion behavior and/or a disaster threat event occurs, alarm information can be sent to the intruded object and an audible and visual alarm can be given. For example, when a ferocious animal such as a jackal or wolf intrudes, strong light and a sharp sound can be used both to alert the outdoor activity personnel near the vehicle and to deter the intruding animal.
As can be seen from the outdoor early warning method illustrated in fig. 1, on the one hand, the devices that collect environmental information, such as the vision sensor, the vibration sensor and the induction radar, are all vehicle-mounted devices, which means that outdoor personnel neither need to purchase additional professional early warning equipment nor need to operate it, so the requirement on the professional skill of outdoor personnel is reduced during outdoor early warning; on the other hand, based on these vehicle-mounted devices, not only can illegal intrusion behaviors be identified, but the occurrence of disaster threat events can also be predicted, so that outdoor dangers in a variety of scenes can be warned of in advance and the safety of outdoor activity personnel is improved.
Corresponding to the embodiments of the function implementation method described above, the present application also provides an outdoor early warning device, a vehicle and corresponding embodiments.
Fig. 2 is a schematic structural diagram of an outdoor early warning device shown in an embodiment of the present application. For convenience of explanation, only the portions related to the embodiments of the present application are shown. The apparatus of fig. 2 mainly includes an obtaining module 201, an identifying module 202, a predicting module 203, and an alarming module 204, wherein:
the acquisition module 201 is used for acquiring vehicle surrounding environment information through the vehicle-mounted equipment, wherein the vehicle-mounted equipment comprises a vision sensor, a vibration sensor and an induction radar which are arranged on a vehicle;
the identification module 202 is used for identifying whether illegal intrusion behaviors occur or not based on data collected by the vision sensor and/or the induction radar;
the prediction module 203 is used for predicting whether a disaster threat event occurs or not based on data acquired by at least one vehicle-mounted device in the vehicle-mounted devices;
the alarm module 204 is configured to send alarm information if an illegal intrusion behavior and/or a disaster threat event occurs.
As can be seen from the outdoor early warning device illustrated in fig. 2, on the one hand, the devices that collect environmental information, such as the vision sensor, the vibration sensor and the induction radar, are all vehicle-mounted devices, which means that outdoor personnel neither need to purchase additional professional early warning equipment nor need to operate it, so the requirement on the professional skill of outdoor personnel is reduced during outdoor early warning; on the other hand, based on these vehicle-mounted devices, not only can illegal intrusion behaviors be identified, but the occurrence of disaster threat events can also be predicted, so that outdoor dangers in a variety of scenes can be warned of in advance and the safety of outdoor activity personnel is improved.
Fig. 3 is a schematic structural diagram of a vehicle according to an embodiment of the present application. The vehicle 300 includes a memory 310 and a processor 320.
The Processor 320 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 310 may include various types of storage units, such as system memory, read-only memory (ROM) and a permanent storage device. The ROM may store static data or instructions required by the processor 320 or other modules of the computer. The permanent storage device may be a readable and writable storage device, i.e. a non-volatile storage device that does not lose the stored instructions and data even after the computer is powered off. In some embodiments, a mass storage device (e.g. a magnetic or optical disk, or flash memory) is employed as the permanent storage device. In other embodiments, the permanent storage device may be a removable storage device (e.g. a floppy disk or an optical drive). The system memory may be a readable and writable memory device or a volatile readable and writable memory device, such as dynamic random access memory, and may store the instructions and data that some or all of the processors require at runtime. Furthermore, the memory 310 may comprise any combination of computer-readable storage media, including various types of semiconductor memory chips (e.g. DRAM, SRAM, SDRAM, flash memory, programmable read-only memory) as well as magnetic and/or optical disks. In some embodiments, the memory 310 may include a readable and/or writable removable storage device, such as a compact disc (CD), a read-only digital versatile disc (e.g. DVD-ROM, dual-layer DVD-ROM), a read-only Blu-ray disc, an ultra-density disc, a flash memory card (e.g. SD card, mini SD card, Micro-SD card) or a magnetic floppy disk. Computer-readable storage media do not contain carrier waves or transitory electronic signals transmitted by wireless or wired means.
The memory 310 has stored thereon executable code that, when processed by the processor 320, may cause the processor 320 to perform some or all of the methods described above.
Furthermore, the method according to the present application may also be implemented as a computer program or computer program product comprising computer program code instructions for performing some or all of the steps of the above-described method of the present application.
Alternatively, the present application may also be embodied as a computer-readable storage medium (or non-transitory machine-readable storage medium or machine-readable storage medium) having executable code (or a computer program or computer instruction code) stored thereon, which, when executed by a processor of a vehicle (or server, etc.), causes the processor to perform some or all of the various steps of the above-described methods according to the present application.
Having described embodiments of the present application, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (12)

1. An outdoor early warning method, characterized in that the method comprises:
the method comprises the steps that vehicle surrounding environment information is obtained through vehicle-mounted equipment, and the vehicle-mounted equipment comprises a vision sensor, a vibration sensor and an induction radar which are arranged on a vehicle;
identifying whether illegal intrusion behaviors occur or not based on data collected by the vision sensor and/or the induction radar;
predicting whether a disaster threat event occurs or not based on data collected by at least one of the vehicle-mounted devices;
and if the illegal intrusion behavior and/or the disaster threat event occur, sending alarm information.
2. The outdoor early warning method of claim 1, wherein the identifying whether illegal intrusion behavior occurs based on the data collected by the vision sensor and the sensing radar comprises:
determining the type of a first moving target and spatial activity information under a visual coordinate system based on image data acquired by the visual sensor;
determining spatial activity information of a second moving target under a radar coordinate system based on the point cloud data acquired by the induction radar;
matching the first moving target and the second moving target based on the space activity information of the first moving target and the space activity information of the second moving target in the same coordinate system;
and if the matching is successful and the second moving target approaches to an infringing object in the monitoring range of the induction radar, determining that the target belonging to the type is carrying out illegal intrusion behavior.
3. The outdoor early warning method of claim 2, wherein the matching of the first moving target and the second moving target based on the spatial activity information of the first moving target and the spatial activity information of the second moving target in the same coordinate system comprises:
converting the visual coordinate system and the radar coordinate system to the same coordinate system;
under the same coordinate system, calculating the overlapping degree of the region of interest corresponding to the spatial activity information of the first moving target and the region of interest corresponding to the spatial activity information of the second moving target;
and if the overlapping degree of the region of interest corresponding to the spatial activity information of the first moving target and the region of interest corresponding to the spatial activity information of the second moving target is greater than a preset threshold value, the first moving target and the second moving target are successfully matched.
4. The outdoor early warning method of claim 2, wherein the matching of the first moving target and the second moving target based on the spatial activity information of the first moving target and the spatial activity information of the second moving target in the same coordinate system comprises:
converting the track of the first moving target into a bird's-eye view track of the first moving target under the visual angle of the induction radar;
calculating the relative displacement similarity and the track line type similarity of the first moving target and the second moving target in the same time period according to the aerial view track of the first moving target and the track of the second moving target;
and if the relative displacement similarity and the track line type similarity in the same time period respectively exceed a first preset threshold and a second preset threshold, determining that the aerial view track of the first moving target is successfully matched with the track of the second moving target.
5. The outdoor early warning method of claim 1, wherein the identifying whether illegal intrusion behavior occurs based on the data collected by the inductive radar comprises:
calculating kinematic data of a moving target and/or a distance between the moving target and an invasive object based on data acquired by the inductive radar, the kinematic data including a moving speed, an acceleration and an orientation of the moving target;
and if the kinematic data of the moving target exceeds a preset kinematic data threshold and/or the distance between the moving target and the invasive object exceeds a preset first distance threshold, determining that the moving target is carrying out an illegal intrusion behavior.
6. The outdoor early warning method of claim 5, further comprising:
after the induction radar detects the moving target, detecting a first position and a second position of the moving target in a preset time period;
and if the distance between the first position and the second position is smaller than a preset second distance threshold, determining that the moving target is a false target.
7. An outdoor warning method according to any one of claims 2 to 6, characterised in that the method further comprises:
and screening the data collected by the induction radar before matching the first moving target and the second moving target based on the space activity information of the first moving target and the space activity information of the second moving target in the same coordinate system so as to filter out a null target signal and/or an invalid target signal in the data collected by the induction radar.
8. The outdoor early warning method of claim 1, wherein at least one of the vehicle-mounted devices comprises the vision sensor and an inductive radar, and the predicting whether a disaster threat event occurs based on data collected by the at least one of the vehicle-mounted devices comprises:
extracting the position and geometric attributes of a moving object on an image based on image data acquired by the vision sensor, wherein the geometric attributes comprise the shape and the size of the moving object;
determining whether the moving target is a natural disaster derivative or not according to the geometric attributes of the moving target;
if the moving target is a natural disaster derivative, performing curve fitting according to the position of the moving target on an image and positioning data of the moving target acquired by the induction radar to obtain a moving track of the moving target;
and if the moving track of the moving target covers the point corresponding to the invading object and/or the distance between the moving target and the invading object is less than a preset distance threshold value, determining that a disaster threat event occurs.
9. The outdoor early warning method of claim 1, wherein at least one of the vehicle-mounted devices comprises the inductive radar, and the predicting whether a disaster threat event occurs based on data collected by at least one of the vehicle-mounted devices comprises:
clustering the points of the point cloud data acquired by the inductive radar to obtain a plurality of point cloud clusters;
calculating, for each of the point cloud clusters, the average distance of the point cloud cluster or the distance between the point cloud cluster and the protected object;
and if the average distance of a point cloud cluster C among the point cloud clusters changes within a preset time period and/or the distance between the point cloud cluster C and the protected object exceeds a preset distance threshold, determining that a disaster threat event occurs.
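Claim 9 does not name a clustering algorithm; the sketch below uses DBSCAN from scikit-learn as one plausible choice and checks each cluster's change in average range and its distance to the protected object across two frames within the preset period. Cluster association between frames is assumed to be handled elsewhere, and all thresholds are placeholders.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_point_cloud(points, eps=0.5, min_samples=5):
    """points: (N, 2) radar point cloud in ground-plane coordinates.
    Returns a list of clusters, each an (M_i, 2) array; noise points are discarded."""
    points = np.asarray(points, dtype=float)
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points)
    return [points[labels == k] for k in set(labels) if k != -1]

def disaster_threat_from_clusters(clusters_t0, clusters_t1, protected_point,
                                  change_thresh=0.5, distance_thresh=2.0):
    """Compare same-indexed clusters observed at two times within the preset period."""
    protected = np.asarray(protected_point, dtype=float)
    for c0, c1 in zip(clusters_t0, clusters_t1):
        # Change of the cluster's average distance (mean per-point range from the radar).
        range_change = abs(np.linalg.norm(c1, axis=1).mean() -
                           np.linalg.norm(c0, axis=1).mean())
        # Distance between the cluster centre and the protected object.
        dist_to_protected = np.linalg.norm(c1.mean(axis=0) - protected)
        if range_change > change_thresh or dist_to_protected > distance_thresh:
            return True
    return False
```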
10. An outdoor early warning device, characterized in that the device comprises:
an acquisition module, used for acquiring vehicle surrounding environment information through vehicle-mounted devices, the vehicle-mounted devices comprising a vision sensor, a vibration sensor and an inductive radar arranged on a vehicle;
an identification module, used for identifying whether an illegal intrusion behavior occurs based on the data collected by the vision sensor and/or the inductive radar;
a prediction module, used for predicting whether a disaster threat event occurs based on data collected by at least one of the vehicle-mounted devices;
and an alarm module, used for sending alarm information if the illegal intrusion behavior and/or the disaster threat event occurs.
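A structural sketch of how the four modules of claim 10 might be composed; the class and method names are hypothetical and the individual modules are assumed to be supplied by the caller.

```python
class OutdoorWarningDevice:
    """Illustrative composition of the acquisition, identification,
    prediction and alarm modules of claim 10 (all names are assumptions)."""

    def __init__(self, acquisition, identification, prediction, alarm):
        self.acquisition = acquisition        # collects camera / vibration / radar data
        self.identification = identification  # intrusion identification
        self.prediction = prediction          # disaster-threat prediction
        self.alarm = alarm                    # pushes alarm information

    def step(self):
        env = self.acquisition.collect()              # vehicle surrounding environment
        intrusion = self.identification.detect(env)   # vision and/or radar based
        threat = self.prediction.predict(env)         # at least one vehicle-mounted device
        if intrusion or threat:
            self.alarm.send(intrusion=intrusion, threat=threat)
```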
11. A vehicle, characterized by comprising:
a processor; and
a memory having executable code stored thereon which, when executed by the processor, causes the processor to perform the method of any one of claims 1 to 9.
12. A computer readable storage medium having stored thereon executable code which, when executed by a processor of an electronic device, causes the processor to perform the method of any of claims 1 to 9.
CN202210465111.4A 2022-04-29 2022-04-29 Outdoor early warning method, device, system and computer readable storage medium Pending CN114882661A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210465111.4A CN114882661A (en) 2022-04-29 2022-04-29 Outdoor early warning method, device, system and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210465111.4A CN114882661A (en) 2022-04-29 2022-04-29 Outdoor early warning method, device, system and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN114882661A (en) 2022-08-09

Family

ID=82674012

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210465111.4A Pending CN114882661A (en) 2022-04-29 2022-04-29 Outdoor early warning method, device, system and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN114882661A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105128795A (en) * 2015-10-09 2015-12-09 阳运崎 Sensing-technique-based intelligent window-breaking escape warning and rescuing auxiliary device
CN111862653A (en) * 2020-07-31 2020-10-30 广州小鹏汽车科技有限公司 Warning method and device and vehicle
CN112550307A (en) * 2020-11-16 2021-03-26 东风汽车集团有限公司 Outdoor early warning system and vehicle that vehicle was used
CN112614290A (en) * 2020-12-10 2021-04-06 中科蓝卓(北京)信息科技有限公司 Radar video cooperative target detection device and method
CN113570622A (en) * 2021-07-26 2021-10-29 北京全路通信信号研究设计院集团有限公司 Obstacle determination method and device, electronic equipment and storage medium
CN113997855A (en) * 2021-10-12 2022-02-01 上海洛轲智能科技有限公司 Outdoor early warning method and device, electronic equipment and computer storage medium
CN114419825A (en) * 2022-03-29 2022-04-29 中国铁路设计集团有限公司 High-speed rail perimeter intrusion monitoring device and method based on millimeter wave radar and camera

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ZHENG, SHAOWU; LI, WEIHUA; HU, JIANYAO: "Vehicle detection in traffic environments based on the fusion of laser point cloud and image information", Chinese Journal of Scientific Instrument, no. 12, 31 December 2019 (2019-12-31), pages 143-151 *
ZHENG, SHAOWU et al.: "Vehicle detection in traffic environments based on the fusion of laser point cloud and image information", Chinese Journal of Scientific Instrument, vol. 40, no. 12, pages 143-151 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116148862A (en) * 2023-01-16 2023-05-23 无锡市雷华科技有限公司 Comprehensive early warning and evaluating method for bird detection radar flying birds
CN116148862B (en) * 2023-01-16 2024-04-02 无锡市雷华科技有限公司 Comprehensive early warning and evaluating method for bird detection radar flying birds

Similar Documents

Publication Title
US9696409B2 (en) Sensor suite and signal processing for border surveillance
US9075143B2 (en) Sparse array RF imaging for surveillance applications
AU2017359142B2 (en) Systems and methods for detecting flying animals
JP4946228B2 (en) In-vehicle pedestrian detection device
CN111753609A (en) Target identification method and device and camera
JP2010145318A (en) Intruding object identifying method, intruding object identifying device, and intruding object identifying sensor device
Silva et al. Computer-based identification and tracking of Antarctic icebergs in SAR images
CN115034324B (en) Multi-sensor fusion perception efficiency enhancement method
CN114882661A (en) Outdoor early warning method, device, system and computer readable storage medium
CN110544271B (en) Parabolic motion detection method and related device
US9721154B2 (en) Object detection apparatus, object detection method, and object detection system
CN110287957B (en) Low-slow small target positioning method and positioning device
CN116935551A (en) Perimeter intrusion detection method, system, equipment and storage medium
CN111931657A (en) Object recognition system, method and storage medium
Bloisi et al. Integrated visual information for maritime surveillance
KR102440169B1 (en) Smart guard system for improving the accuracy of effective detection through multi-sensor signal fusion and AI image analysis
US20090297049A1 (en) Detection of partially occluded targets in ladar images
CN113673569A (en) Target detection method, target detection device, electronic equipment and storage medium
Hożyń et al. Detection of unmanned aerial vehicles using computer vision methods: a comparative analysis
CN116224280B (en) Radar target detection method, radar target detection device, radar equipment and storage medium
CN113570547A (en) Object detection method, object detection apparatus, and computer-readable storage medium
CN113687348A (en) Pedestrian identification method and device based on tracking micro-Doppler image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination