CN114290991A - Blind area monitoring method and device, storage medium and edge cloud platform - Google Patents

Blind area monitoring method and device, storage medium and edge cloud platform

Info

Publication number
CN114290991A
CN114290991A
Authority
CN
China
Prior art keywords
vehicle
sheltered
dimensional information
obstacle
blind area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111625945.9A
Other languages
Chinese (zh)
Other versions
CN114290991B (en)
Inventor
卢浩
周光涛
郭鹏伟
程军峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Unicom Smart Connection Technology Ltd
Original Assignee
China Unicom Smart Connection Technology Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Unicom Smart Connection Technology Ltd filed Critical China Unicom Smart Connection Technology Ltd
Priority to CN202111625945.9A priority Critical patent/CN114290991B/en
Publication of CN114290991A publication Critical patent/CN114290991A/en
Application granted granted Critical
Publication of CN114290991B publication Critical patent/CN114290991B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00: Road transport of goods or passengers
    • Y02T 10/10: Internal combustion engine [ICE] based vehicles
    • Y02T 10/40: Engine management systems

Landscapes

  • Traffic Control Systems (AREA)

Abstract

The embodiment of the invention provides a blind area monitoring method and device, a storage medium and an edge cloud platform. The method comprises the following steps: receiving three-dimensional information of a sheltering vehicle, three-dimensional information of a sheltered vehicle and a position of an obstacle; establishing a blind area of the sheltered vehicle according to the three-dimensional information of the sheltering vehicle and the three-dimensional information of the sheltered vehicle; judging whether the sheltered vehicle is in danger according to the blind area of the sheltered vehicle and the position of the obstacle; and if the sheltered vehicle is judged to be in danger, sending early warning information to the terminal equipment. According to the technical scheme provided by the embodiment of the invention, whether the sheltered vehicle is in danger can be judged based on the established blind area of the sheltered vehicle and the position of the obstacle, which improves the accuracy of blind area monitoring.

Description

Blind area monitoring method and device, storage medium and edge cloud platform
[ technical field ]
The invention relates to the technical field of communication, in particular to a blind area monitoring method and device, a storage medium and an edge cloud platform.
[ background of the invention ]
Several vehicle-road cooperation application scenarios are defined in CSAE 53-2017, "Cooperative intelligent transportation system - vehicular communication system application layer and application data interaction standard". One of these applications is collision early warning for vulnerable road users, which requires the vulnerable road users to have short-range wireless communication capability. In the development of intelligent transportation technology, traditional automobiles rely on single-vehicle intelligence led by vehicle manufacturers, which suffers from the bottlenecks of perception blind areas and limited computing capacity; obstacles in blind areas cannot be accurately monitored, so the accuracy of blind area monitoring is reduced.
[ summary of the invention ]
In view of this, embodiments of the present invention provide a method and an apparatus for monitoring a blind area, a storage medium, and an edge cloud platform, so as to improve accuracy of monitoring the blind area.
In one aspect, an embodiment of the present invention provides a blind area monitoring method, including:
receiving three-dimensional information of a sheltering vehicle, three-dimensional information of a sheltered vehicle and a position of an obstacle;
establishing a blind area of the sheltered vehicle according to the three-dimensional information of the sheltering vehicle and the three-dimensional information of the sheltered vehicle;
judging whether the sheltered vehicle is in danger according to the blind area of the sheltered vehicle and the position of the obstacle;
and if the sheltered vehicle is judged to be in danger, sending early warning information to the terminal equipment.
Optionally, the establishing a blind area of the sheltered vehicle according to the three-dimensional information of the sheltering vehicle and the three-dimensional information of the sheltered vehicle includes:
establishing a three-dimensional bounding box of the sheltering vehicle based on the three-dimensional information of the sheltering vehicle;
establishing a three-dimensional bounding box of the sheltered vehicle based on the three-dimensional information of the sheltered vehicle;
and taking, as the blind area of the sheltered vehicle, the area enclosed by the lines connecting a front vertex of the three-dimensional bounding box of the sheltered vehicle to a plurality of vertexes of the three-dimensional bounding box of the sheltering vehicle, and the ground.
Optionally, the judging whether the sheltered vehicle is in danger according to the blind area of the sheltered vehicle and the position of the obstacle includes:
acquiring the speed of the sheltered vehicle, the speed of the obstacle and the distance from the obstacle;
calculating the time for the sheltered vehicle to reach the obstacle from the speed of the sheltered vehicle, the speed of the obstacle and the distance;
judging whether the time for the sheltered vehicle to reach the obstacle is less than a set threshold value;
and if the time for the sheltered vehicle to reach the obstacle is judged to be less than the set threshold value, judging that the sheltered vehicle is in danger.
Optionally, the method further comprises:
and if the time that the sheltered vehicle reaches the obstacle is judged to be greater than or equal to the set threshold value, sending reminding information to the terminal equipment.
Optionally, the method further comprises:
acquiring the three-dimensional information of the sheltered vehicle and the three-dimensional information of the sheltering vehicle through two sensors, the sheltering vehicle and the sheltered vehicle being vehicles passing through a curve,
wherein one sensor is located at the intersection point of two external tangents of the curve, and the other sensor is located at the intersection point of the perpendiculars drawn at the tangent points of the two external tangents.
Optionally, the received three-dimensional information of the sheltering vehicle and the three-dimensional information of the sheltered vehicle are used for establishing the blind area of the sheltered vehicle in real time.
Optionally, the method further comprises:
receiving the speed and the running direction of the sheltering vehicle and the speed and the running direction of the sheltered vehicle;
generating a predicted track and a predicted direction according to the speed and the running direction of the sheltering vehicle and the speed and the running direction of the sheltered vehicle;
the establishing a blind area of the sheltered vehicle according to the three-dimensional information of the sheltering vehicle and the three-dimensional information of the sheltered vehicle comprises: predicting the three-dimensional information of the sheltering vehicle and the three-dimensional information of the sheltered vehicle after a preset time by using the predicted track and the predicted direction, and then establishing the blind area of the sheltered vehicle by using the prediction result.
On the other hand, an embodiment of the present invention provides a blind area monitoring device, including:
the receiving module is used for receiving the three-dimensional information of the sheltering vehicle, the three-dimensional information of the sheltered vehicle and the position of the obstacle;
the establishing module is used for establishing a blind area of the sheltered vehicle according to the three-dimensional information of the sheltering vehicle, the three-dimensional information of the sheltered vehicle and the position of the obstacle;
the first judging module is used for judging whether the sheltered vehicle is in danger according to the blind area of the sheltered vehicle and the position of the obstacle;
and the sending module is used for sending early warning information to the terminal equipment if the first judging module judges that the sheltered vehicle is in danger.
In another aspect, an embodiment of the present invention provides a storage medium comprising a stored program, wherein when the program runs, the device where the storage medium is located is controlled to execute the above blind area monitoring method.
In another aspect, an embodiment of the present invention provides an edge cloud platform, including a memory and a processor, where the memory is used to store information including program instructions, and the processor is used to control execution of the program instructions, where the program instructions are loaded and executed by the processor to implement the steps of the above-mentioned blind area monitoring method.
In the technical scheme of the blind area monitoring method provided by the embodiment of the invention, three-dimensional information of a sheltering vehicle, three-dimensional information of a sheltered vehicle and a position of an obstacle are received; a blind area of the sheltered vehicle is established according to the three-dimensional information of the sheltering vehicle and the three-dimensional information of the sheltered vehicle; whether the sheltered vehicle is in danger is judged according to the blind area of the sheltered vehicle and the position of the obstacle; and if the sheltered vehicle is judged to be in danger, early warning information is sent to the terminal equipment. According to the technical scheme provided by the embodiment of the invention, whether the sheltered vehicle is in danger can be judged based on the established blind area of the sheltered vehicle and the position of the obstacle, which improves the accuracy of blind area monitoring.
[ description of the drawings ]
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive labor.
Fig. 1 is a flowchart of a blind area monitoring method according to an embodiment of the present invention;
FIG. 2 is a flow chart of another blind area monitoring method according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of sensor placement locations;
FIG. 4a is a schematic view of a three-dimensional bounding box of an obstructing vehicle;
FIG. 4b is a schematic view of a three-dimensional bounding box of an occluded vehicle;
FIG. 5 is a schematic view of a blind spot and an obstacle of a sheltered vehicle;
fig. 6 is a schematic structural diagram of a blind area monitoring device according to an embodiment of the present invention;
fig. 7 is a schematic diagram of an edge cloud platform according to an embodiment of the present invention.
[ detailed description of the embodiments ]
For better understanding of the technical solutions of the present invention, the following detailed descriptions of the embodiments of the present invention are provided with reference to the accompanying drawings.
It should be understood that the described embodiments are only some embodiments of the invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terminology used in the embodiments of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the examples of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be understood that the term "and/or" used herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein generally indicates that the former and latter associated objects are in an "or" relationship.
An embodiment of the present invention provides a blind area monitoring method, and fig. 1 is a flowchart of the blind area monitoring method provided in the embodiment of the present invention, as shown in fig. 1, the method includes:
and 102, receiving the three-dimensional information of the shielded vehicle, the three-dimensional information of the shielded vehicle and the position of the obstacle.
In the embodiment of the invention, each step is executed by the edge cloud platform.
In the embodiment of the present invention, the three-dimensional information of the sheltering vehicle, the three-dimensional information of the sheltered vehicle and the position of the obstacle may be acquired by sensors, where the sensors include a laser radar, a camera, a millimeter wave radar and the like; alternatively, the information may be actively reported by the vehicles.
In the embodiment of the invention, because a single sensor has certain limitations, a plurality of sensors can be arranged, and the plurality of sensors acquire the three-dimensional information of the sheltering vehicle, the three-dimensional information of the sheltered vehicle and the position of the obstacle.
In the embodiment of the invention, the sensors send the three-dimensional information of the sheltering vehicle, the three-dimensional information of the sheltered vehicle and the position of the obstacle to the edge cloud platform through long-range or short-range communication. For example, the long-range communication method includes the fifth generation mobile communication technology (5G).
And step 104, establishing a blind area of the sheltered vehicle according to the three-dimensional information of the sheltering vehicle and the three-dimensional information of the sheltered vehicle.
The blind area of the sheltered vehicle may be established either by directly using the received three-dimensional information, or by first predicting, from the predicted track and the predicted direction, the three-dimensional information of the sheltering vehicle and the sheltered vehicle after a predetermined time and then using the prediction result.
In the embodiment of the invention, the blind area of the sheltered vehicle is established in real time by utilizing the received three-dimensional information of the sheltering vehicle and the three-dimensional information of the sheltered vehicle.
In this step, as an alternative, step 104 specifically includes:
step S1 receives the vehicle speed and the traveling direction of the shielded vehicle and the vehicle speed and the traveling direction of the shielded vehicle.
Step S2 is a step of generating a predicted trajectory and a predicted direction from the vehicle speed and the traveling direction of the shielded vehicle and the vehicle speed and the traveling direction of the shielded vehicle.
And step S3, predicting the three-dimensional information of the shielded vehicle and the three-dimensional information of the shielded vehicle after a preset time by using the predicted track and the predicted direction, and then establishing a blind area of the shielded vehicle by using the prediction result.
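For illustration only (this sketch is not part of the original disclosure), the prediction of steps S1 to S3 can be approximated by simple dead reckoning under a constant-speed, constant-heading assumption; the function name, the coordinates, the speeds and the one-second horizon below are all assumptions:

```python
import math

def predict_position(x, y, speed_mps, heading_rad, dt_s):
    """Dead-reckon a vehicle position dt_s seconds ahead, assuming the vehicle
    keeps its current speed and heading (straight-line motion)."""
    return (x + speed_mps * math.cos(heading_rad) * dt_s,
            y + speed_mps * math.sin(heading_rad) * dt_s)

# Predict both vehicles one second ahead, then rebuild the blind area from the
# predicted positions instead of the currently observed ones (step S3).
sheltering_future = predict_position(20.0, 3.5, speed_mps=12.0, heading_rad=0.0, dt_s=1.0)
sheltered_future = predict_position(5.0, 3.5, speed_mps=22.0, heading_rad=0.0, dt_s=1.0)
```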
And step 106, judging whether the sheltered vehicle has danger or not according to the blind area of the sheltered vehicle and the position of the obstacle.
In the embodiment of the invention, if the sheltered vehicle is judged to be in danger, it indicates that the sheltered vehicle may collide with the obstacle; if the sheltered vehicle is judged not to be in danger, the sheltered vehicle will not collide with the obstacle.
And step 108, if the shielded vehicle is judged to have danger, sending early warning information to the terminal equipment.
In the embodiment of the invention, the terminal equipment includes a mobile phone, a tablet computer, a wearable device or an in-vehicle head unit.
In this step, the warning information is sent to the terminal device of the sheltered vehicle. For example, the warning message is "Please slow down to avoid danger!".
According to the technical scheme provided by the embodiment of the invention, three-dimensional information of a sheltering vehicle, three-dimensional information of a sheltered vehicle and a position of an obstacle are received; a blind area of the sheltered vehicle is established according to the three-dimensional information of the sheltering vehicle and the three-dimensional information of the sheltered vehicle; whether the sheltered vehicle is in danger is judged according to the blind area of the sheltered vehicle and the position of the obstacle; and if the sheltered vehicle is judged to be in danger, early warning information is sent to the terminal equipment. In this technical scheme, whether the sheltered vehicle is in danger can be judged based on the established blind area of the sheltered vehicle and the position of the obstacle, which improves the accuracy of blind area monitoring.
An embodiment of the present invention provides another blind area monitoring method, and fig. 2 is a flowchart of another blind area monitoring method provided in an embodiment of the present invention, and as shown in fig. 2, the method includes:
step 202, receiving three-dimensional information of an occluded vehicle, three-dimensional information of an occluding vehicle and an obstacle position.
In the embodiment of the invention, each step is executed by the edge cloud platform.
In the embodiment of the present invention, the three-dimensional information of the sheltering vehicle, the three-dimensional information of the sheltered vehicle and the position of the obstacle may be acquired by sensors, where the sensors include a laser radar, a camera, a millimeter wave radar and the like; alternatively, the information may be actively reported by the vehicles.
In the embodiment of the invention, because a single sensor has certain limitations, a plurality of sensors are arranged, and the plurality of sensors acquire the three-dimensional information of the sheltering vehicle, the three-dimensional information of the sheltered vehicle and the position of the obstacle.
In the embodiment of the invention, the three-dimensional information of the sheltering vehicle and the three-dimensional information of the sheltered vehicle are obtained through two sensors, the sheltering vehicle and the sheltered vehicle being vehicles passing through a curve; one sensor is positioned at the intersection point of the two external tangents of the curve, and the other sensor is positioned at the intersection point of the perpendiculars drawn at the tangent points of the two external tangents.
Specifically, two sensor arrangement positions are provided. Fig. 3 is a schematic view of the sensor arrangement positions. As shown in fig. 3, sensor arrangement position 1 is provided on the outer side of the turning road and sensor arrangement position 2 is provided on the inner side of the turning road; sensor arrangement position 1 and sensor arrangement position 2 monitor the turning road in the directions indicated by the arrows in fig. 3, and the closed figure in fig. 3 is the monitoring range of the sensors.
As an alternative, the method of setting the sensor arrangement positions is as follows: taking the center line of the turning road as a reference, external tangent lines are drawn according to the curvature of the curve, and the intersection point of the two external tangent lines is taken as sensor arrangement position 1; perpendicular lines are then drawn at the tangent points of the two external tangent lines, and the intersection point of the two perpendicular lines is taken as sensor arrangement position 2. Through this arrangement, the length, width and height of the sheltering vehicle, the sheltered vehicle and the obstacle can be obtained by the sensors more simply and with smaller errors, so that more accurate three-dimensional information is obtained; with this more accurate three-dimensional information, a more accurate blind area can be established in the manner shown in fig. 5.
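As an illustrative sketch of the arrangement rule described above (not part of the patent text): sensor arrangement position 1 is the intersection of the two external tangent lines of the curve center line, and sensor arrangement position 2 is the intersection of the perpendiculars erected at the two tangent points. The tangent points and tangent directions used below are invented for the example:

```python
import numpy as np

def line_intersection(p1, d1, p2, d2):
    """Intersect two lines, each given by a point and a direction vector."""
    # Solve p1 + t * d1 = p2 + s * d2 for t.
    A = np.array([[d1[0], -d2[0]], [d1[1], -d2[1]]])
    b = np.array([p2[0] - p1[0], p2[1] - p1[1]])
    t, _ = np.linalg.solve(A, b)
    return np.asarray(p1) + t * np.asarray(d1)

def perpendicular(d):
    """Rotate a direction vector by 90 degrees."""
    return np.array([-d[1], d[0]])

# Hypothetical tangent points and tangent directions of the curve center line.
t1, d1 = np.array([0.0, 0.0]), np.array([1.0, 0.0])    # tangent at the curve entry
t2, d2 = np.array([30.0, 30.0]), np.array([0.0, 1.0])  # tangent at the curve exit

sensor_position_1 = line_intersection(t1, d1, t2, d2)                                # outer side of the curve
sensor_position_2 = line_intersection(t1, perpendicular(d1), t2, perpendicular(d2))  # inner side of the curve
```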
As an alternative, in fig. 3, if the vehicle moves forward in the direction of the arrow shown in the figure, direction 1 can be used to identify the speed, position, driving direction, three-dimensional information and the like of the sheltered vehicle. In order to avoid a large vehicle sheltering a small vehicle behind it and causing missed identification, a point location is added in direction 3 for blind-spot compensation; meanwhile, the data acquired at sensor arrangement position 2 and sensor arrangement position 1 are matched and fused to jointly identify the corresponding vehicle information. Direction 2 identifies the obstacle, and obstacles and the like are identified and judged in combination with direction 4, so that the real-time situation of the road is obtained and sent to the vehicles about to enter the turning road. The calculation period of the whole system is about 50 ms, so an early warning message can be sent to a vehicle entering the network every 50 ms; the period can be adjusted slightly.
And step 204, establishing a three-dimensional bounding box of the sheltering vehicle based on the three-dimensional information of the sheltering vehicle.
FIG. 4a is a schematic view of a three-dimensional bounding box of the sheltering vehicle. The sheltering vehicle is a large vehicle; for example, a large vehicle includes a bus, a truck or a large special-purpose vehicle. As shown in fig. 4a, the sheltering vehicle is a truck, and its three-dimensional information can be used to generate the three-dimensional bounding box of the sheltering vehicle, where the three-dimensional bounding box includes two corners at the front of the sheltering vehicle and two corners at the rear, the two front corners being point P and point Q, and the two rear corners being point M and point N. The three-dimensional information may include the position, length, width and height of the sheltering vehicle.
And step 206, establishing a three-dimensional bounding box of the sheltered vehicle based on the three-dimensional information of the sheltered vehicle.
In an embodiment of the present invention, fig. 4b is a schematic diagram of a three-dimensional bounding box of the sheltered vehicle, where the sheltered vehicle is a small vehicle; for example, a small vehicle includes a car, a jeep, a mini-car or a light bus. As shown in fig. 4b, the sheltered vehicle is a car, and its three-dimensional information can be used to generate the three-dimensional bounding box of the sheltered vehicle, wherein the three-dimensional bounding box includes four corners at the front of the sheltered vehicle, namely point D, point C, point E and point F. The three-dimensional information may include the position, length, width and height of the sheltered vehicle.
And step 208, taking, as the blind area of the sheltered vehicle, the area enclosed by the lines connecting a front vertex of the three-dimensional bounding box of the sheltered vehicle to a plurality of vertexes of the three-dimensional bounding box of the sheltering vehicle, and the ground.
In the embodiment of the invention, taking as an example that the sheltered vehicle is a car, the sheltering vehicle is a truck, and both vehicles turn to the left as in fig. 3, point C of the sheltered vehicle is selected as the starting point (i.e. the vertex of the three-dimensional bounding box closest to the driver's eyes, or the vertex closest to the front detection sensor of an autonomous vehicle), with the requirement that | MN | > | CF |; the lines CP, CQ, CN and CM are then drawn and extended. The plane CPQ, the plane CMN and the ground plane enclose a three-dimensional blind area in the left-forward direction of the sheltered vehicle; that is, the intersection of the plane CPQ and the plane CMN with the road ahead is the road blind area range, i.e. the blind area of the sheltered vehicle. If an obstacle such as a traffic accident exists within this road blind area and the speed of the sheltered vehicle is too high, dangers such as a collision are likely to occur, causing personal injury and property loss.
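As a purely illustrative sketch (not part of the patent text): because all vehicles are on the same road surface, the blind area construction above can work with the ground-plane footprint of each three-dimensional bounding box. The helper below derives footprint corners from a vehicle's position, heading, length and width; every name and dimension is an assumption:

```python
import numpy as np

def footprint_corners(cx, cy, heading_rad, length, width):
    """Ground-plane corners of a vehicle bounding box:
    returns front-left, front-right, rear-right, rear-left."""
    forward = np.array([np.cos(heading_rad), np.sin(heading_rad)])
    right = np.array([forward[1], -forward[0]])
    center = np.array([cx, cy])
    half_l, half_w = length / 2.0, width / 2.0
    return [center + half_l * forward - half_w * right,   # front-left
            center + half_l * forward + half_w * right,   # front-right
            center - half_l * forward + half_w * right,   # rear-right
            center - half_l * forward - half_w * right]   # rear-left

# Hypothetical example (meters): a truck as the sheltering vehicle and a car as
# the sheltered vehicle, both heading along the x axis.
truck_fl, truck_fr, truck_rr, truck_rl = footprint_corners(20.0, 3.5, 0.0, length=12.0, width=2.5)
car_fl, car_fr, _, _ = footprint_corners(5.0, 3.5, 0.0, length=4.5, width=1.8)
# truck_fl / truck_fr correspond to points P and Q in fig. 4a and truck_rl / truck_rr to M and N;
# one front corner of the car (e.g. car_fl) plays the role of point C or F in fig. 4b and fig. 5.
```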
In the embodiment of the present invention, fig. 5 is a schematic view of the blind area of the sheltered vehicle and an obstacle. As shown in fig. 5, an absolute position, such as sensor arrangement position 1, is selected, and a world coordinate system is established based on sensor arrangement position 1; the real position of the other sensor, such as sensor arrangement position 2, can then be obtained. During sensor recognition, the three-dimensional information of the sheltering vehicle and the three-dimensional information of the sheltered vehicle are acquired, and the coordinates (x1, y1) of point F, (x2, y2) of point M and (x3, y3) of point Q are located. Although the region under consideration is three-dimensional, all of the vehicles are on the same road surface, so two-dimensional coordinates are sufficient. The equation of the straight line FM can therefore be written in two-point form as
(y - y1) / (y2 - y1) = (x - x1) / (x2 - x1),
and similarly the equation of the straight line FQ is
(y - y1) / (y3 - y1) = (x - x1) / (x3 - x1).
As can be seen from fig. 5, an obstacle, if present, lies at a position (x0, y0) where the recognition area of the sensor overlaps the blind area of the sheltered vehicle. If the point U(x0, y0) lies between the straight lines FM and FQ, that is, on the same side of line FM as point Q and on the same side of line FQ as point M, an obstacle exists in the blind area of the sheltered vehicle.
If the obstacle U(x0, y0) lies outside the region bounded by the straight lines FM and FQ, no obstacle is present in the blind area of the sheltered vehicle.
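The test described above, whether the obstacle point U lies inside the region bounded by the straight lines FM and FQ, can be sketched with a signed-area side test. This is an illustrative reconstruction rather than the patent's exact inequalities; the coordinates are invented, and the refinement that the blind area only starts beyond the sheltering vehicle is ignored here:

```python
def side(a, b, p):
    """Signed area: positive if p is left of the directed line a -> b, negative if right."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def obstacle_in_blind_area(F, M, Q, U):
    """U is inside the wedge with apex F bounded by lines FM and FQ when it lies
    on the same side of line FM as Q and on the same side of line FQ as M."""
    same_side_of_fm = side(F, M, U) * side(F, M, Q) > 0
    same_side_of_fq = side(F, Q, U) * side(F, Q, M) > 0
    return same_side_of_fm and same_side_of_fq

# Hypothetical coordinates (meters, in the world frame of sensor arrangement position 1):
F, M, Q = (0.0, 0.0), (10.0, 2.0), (14.0, -2.0)
print(obstacle_in_blind_area(F, M, Q, (30.0, 1.0)))   # True: hidden behind the sheltering vehicle
print(obstacle_in_blind_area(F, M, Q, (5.0, 10.0)))   # False: outside the blind area
```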
The line equations in fig. 5 and the blind area of the sheltered vehicle change dynamically and are updated in real time according to the time at which the sensors acquire data.
And step 210, acquiring the speed of the sheltered vehicle, the speed of the obstacle and the distance from the obstacle.
In this step, the speed of the sheltered vehicle, the speed of the obstacle and the distance from the obstacle can be acquired by the sensors.
And step 212, calculating the time for the sheltered vehicle to reach the obstacle from the speed of the sheltered vehicle, the speed of the obstacle and the distance.
In this step, the formula of the blind area danger early warning model,
t = X / (Va - Vu),
is used to calculate the time for the sheltered vehicle to reach the obstacle from the speed of the sheltered vehicle, the speed of the obstacle and the distance, wherein t is the time for the sheltered vehicle to reach the obstacle, X is the distance to the obstacle, Va is the speed of the sheltered vehicle, and Vu is the speed of the obstacle. When the obstacle is stationary, Vu = 0 and t = X / Va.
Step 214, judging whether the time for the sheltered vehicle to reach the obstacle is less than a set threshold value; if so, executing step 216; if not, executing step 218.
In the embodiment of the present invention, the set threshold can be set according to actual conditions, for example, the set threshold is 15 seconds.
In the embodiment of the invention, if the time for the sheltered vehicle to reach the obstacle is judged to be less than the set threshold, it indicates that the sheltered vehicle is about to collide with the obstacle, and step 216 is executed; if the time for the sheltered vehicle to reach the obstacle is judged to be greater than or equal to the set threshold, it indicates that the sheltered vehicle will not immediately collide with the obstacle, and step 218 is executed.
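A minimal sketch of the blind area danger early warning model and the threshold check of step 214, under the reconstruction that the time to reach the obstacle is the distance divided by the closing speed, t = X / (Va - Vu); the guard for a non-positive closing speed and the example numbers are additions, not part of the patent:

```python
def time_to_obstacle(distance_m, sheltered_speed_mps, obstacle_speed_mps=0.0):
    """t = X / (Va - Vu), meaningful only while the sheltered vehicle is closing in."""
    closing_speed = sheltered_speed_mps - obstacle_speed_mps
    if closing_speed <= 0.0:
        return float("inf")   # not closing in, so no collision is expected
    return distance_m / closing_speed

THRESHOLD_S = 15.0   # the set threshold; 15 s is only the example value given in the text

t = time_to_obstacle(distance_m=120.0, sheltered_speed_mps=16.7)   # roughly 60 km/h, static obstacle
if t < THRESHOLD_S:
    print("send early warning information to the sheltered vehicle's terminal device")   # step 216
else:
    print("send reminding information to the sheltered vehicle's terminal device")        # step 218
```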
And step 216, sending early warning information to the terminal equipment, and ending the process.
In the embodiment of the invention, the terminal equipment includes a mobile phone, a tablet computer, a wearable device or an in-vehicle head unit.
In this step, the warning information is sent to the terminal device of the sheltered vehicle. For example, the warning message is "Please slow down to avoid danger!".
As an alternative, step 216 may be followed by: if collision information sent by the sensors is received, sending rescue information to the emergency rescue platform through the regional cloud platform.
In the embodiment of the invention, when the three-dimensional coordinate points of the two vehicles overlap or the distance between them approaches 0, it can be judged that the two vehicles have collided, and the sensors send the collision information to the edge cloud platform. For example, the collision information includes the collision location, the collision time and the degree of collision.
In the embodiment of the invention, the edge cloud platform sends the rescue information to the emergency rescue platform through the regional cloud platform, so that the emergency rescue platform can carry out accident rescue and congestion clearing in time.
Step 218, sending the reminding information to the terminal equipment.
In this step, a reminding message is sent to the terminal device of the sheltered vehicle to remind the driver to avoid danger, and to provide a basis for decision-making by an autonomous vehicle. For example, the reminding message is "Please pay attention to the turning road and decelerate.".
According to the technical scheme provided by the embodiment of the invention, three-dimensional information of a sheltering vehicle, three-dimensional information of a sheltered vehicle and a position of an obstacle are received; a blind area of the sheltered vehicle is established according to the three-dimensional information of the sheltering vehicle and the three-dimensional information of the sheltered vehicle; whether the sheltered vehicle is in danger is judged according to the blind area of the sheltered vehicle and the position of the obstacle; and if the sheltered vehicle is judged to be in danger, early warning information is sent to the terminal equipment. In this technical scheme, whether the sheltered vehicle is in danger can be judged based on the established blind area of the sheltered vehicle and the position of the obstacle, which improves the accuracy of blind area monitoring.
According to the technical scheme provided by the embodiment of the invention, whether the sheltered vehicle is in danger can be judged; the occlusion between a monitored vehicle and other vehicles can be constructed and evaluated in real time, and the potential collision risk can be evaluated by analyzing the relative motion states of the vehicles together with the blind area danger early warning model. Meanwhile, the target traffic participants can be reminded in various ways, so that they notice potential risks in time and take corresponding actions to avoid them. This solves the problem of blind area monitoring on turning roads and promotes the rapid development of automated driving.
Turning roads have always been key areas where accidents occur frequently; on expressways in particular, accidents caused by overtaking or sideslipping on a curve account for a large proportion, so potential collision risks on turning roads need to be warned about, while the related art cannot accurately combine the situation of each vehicle to give an early warning of a potential collision. Considering potential view occlusion and the fact that the perception capability of a vehicle may not detect a target object in time, the embodiment of the invention provides a blind area danger early warning model, which calculates the time for a vehicle to reach an obstacle, quickly locks onto the dangerous vehicle and accurately analyzes when the vehicle may be in danger. An accurate early warning instruction can then be quickly issued to the dangerous vehicle through the 5G network.
The embodiment of the invention provides a blind area monitoring device. Fig. 6 is a schematic structural diagram of a blind area monitoring device according to an embodiment of the present invention, as shown in fig. 6, the device includes: the device comprises a receiving module 11, a building module 12, a first judging module 13 and a sending module 14.
The receiving module 11 is configured to receive the three-dimensional information of the sheltering vehicle, the three-dimensional information of the sheltered vehicle, the speed and the driving direction of the sheltered vehicle, and the position of the obstacle, which are acquired by the sensors.
The establishing module 12 is configured to establish a blind area of the sheltered vehicle according to the three-dimensional information of the sheltering vehicle, the three-dimensional information of the sheltered vehicle and the position of the obstacle.
The first judging module 13 is configured to judge whether the sheltered vehicle is in danger according to the blind area of the sheltered vehicle and the position of the obstacle.
The sending module 14 is configured to send the early warning information to the terminal device if the first judging module 13 judges that the sheltered vehicle is in danger.
In the embodiment of the present invention, the establishing module 12 is specifically configured to establish a three-dimensional bounding box of the sheltering vehicle based on the three-dimensional information of the sheltering vehicle; establish a three-dimensional bounding box of the sheltered vehicle based on the three-dimensional information of the sheltered vehicle; and take, as the blind area of the sheltered vehicle, the area enclosed by the lines connecting a front vertex of the three-dimensional bounding box of the sheltered vehicle to a plurality of vertexes of the three-dimensional bounding box of the sheltering vehicle, and the ground.
In the embodiment of the present invention, the first judging module 13 is specifically configured to acquire the speed of the sheltered vehicle, the speed of the obstacle and the distance from the obstacle; calculate the time for the sheltered vehicle to reach the obstacle from the speed of the sheltered vehicle, the speed of the obstacle and the distance; judge whether the time for the sheltered vehicle to reach the obstacle is less than a set threshold; and if the time for the sheltered vehicle to reach the obstacle is judged to be less than the set threshold, judge that the sheltered vehicle is in danger.
In the embodiment of the present invention, the sending module 14 is further configured to send a reminding message to the terminal device if the first judging module 13 judges that the time for the sheltered vehicle to reach the obstacle is greater than or equal to the set threshold.
In the embodiment of the invention, the three-dimensional information of the sheltering vehicle and the three-dimensional information of the sheltered vehicle are obtained through two sensors, the sheltering vehicle and the sheltered vehicle being vehicles passing through a curve; one sensor is positioned at the intersection point of the two external tangents of the curve, and the other sensor is positioned at the intersection point of the perpendiculars drawn at the tangent points of the two external tangents.
In the embodiment of the present invention, the establishing module 12 is specifically configured to establish the blind area of the sheltered vehicle in real time by using the received three-dimensional information of the sheltering vehicle and the three-dimensional information of the sheltered vehicle.
According to the technical scheme provided by the embodiment of the invention, three-dimensional information of a sheltering vehicle, three-dimensional information of a sheltered vehicle and a position of an obstacle are received; a blind area of the sheltered vehicle is established according to the three-dimensional information of the sheltering vehicle and the three-dimensional information of the sheltered vehicle; whether the sheltered vehicle is in danger is judged according to the blind area of the sheltered vehicle and the position of the obstacle; and if the sheltered vehicle is judged to be in danger, early warning information is sent to the terminal equipment. In this technical scheme, whether the sheltered vehicle is in danger can be judged based on the established blind area of the sheltered vehicle and the position of the obstacle, which improves the accuracy of blind area monitoring.
The blind area monitoring device provided in this embodiment may be used to implement the above-mentioned blind area monitoring method in fig. 1 and fig. 2, and for specific description, reference may be made to the above-mentioned embodiment of the blind area monitoring method, and a description thereof is not repeated here.
Embodiments of the present invention provide a storage medium, where the storage medium includes a stored program, where, when the program runs, a device in which the storage medium is located is controlled to execute each step of the above-mentioned embodiment of the blind area monitoring method, and for a specific description, reference may be made to the above-mentioned embodiment of the blind area monitoring method.
An embodiment of the present invention provides an edge cloud platform, which includes a memory and a processor, where the memory is configured to store information including program instructions, and the processor is configured to control execution of the program instructions, and the program instructions are loaded and executed by the processor to implement each step of the embodiment of the blind area monitoring method, and for specific description, reference may be made to the embodiment of the blind area monitoring method.
Fig. 7 is a schematic diagram of an edge cloud platform according to an embodiment of the present invention. As shown in fig. 7, the edge cloud platform 20 of this embodiment includes: the processor 21, the memory 22, and the computer program 23 stored in the memory 22 and capable of running on the processor 21, where the computer program 23 is executed by the processor 21 to implement the method for blind area monitoring in the embodiment, and in order to avoid repetition, the details are not repeated herein. Alternatively, the computer program is executed by the processor 21 to implement the functions of the models/units applied to the blind area monitoring apparatus in the embodiment, which are not described herein again to avoid redundancy.
The edge cloud platform 20 includes, but is not limited to, a processor 21 and a memory 22. Those skilled in the art will appreciate that fig. 7 is merely an example of an edge cloud platform 20, and does not constitute a limitation of the edge cloud platform 20, and may include more or fewer components than shown, or combine certain components, or different components, e.g., the edge cloud platform may also include input-output devices, network access devices, buses, etc.
The Processor 21 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware component, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The storage 22 may be an internal storage unit of the edge cloud platform 20, such as a hard disk or a memory of the edge cloud platform 20. The memory 22 may also be an external storage device of the edge cloud platform 20, such as a plug-in hard disk provided on the edge cloud platform 20, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like. Further, the memory 22 may also include both internal and external storage units of the edge cloud platform 20. The memory 22 is used to store computer programs and other programs and data required by the edge cloud platform. The memory 22 may also be used to temporarily store data that has been output or is to be output.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments provided in the present invention, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions in actual implementation, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a Processor (Processor) to execute some steps of the methods according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (10)

1. A blind area monitoring method, comprising:
receiving three-dimensional information of a sheltering vehicle, three-dimensional information of a sheltered vehicle and a position of an obstacle;
establishing a blind area of the sheltered vehicle according to the three-dimensional information of the sheltering vehicle and the three-dimensional information of the sheltered vehicle;
judging whether the sheltered vehicle is in danger according to the blind area of the sheltered vehicle and the position of the obstacle;
and if the sheltered vehicle is judged to be in danger, sending early warning information to the terminal equipment.
2. The method according to claim 1, wherein the establishing a blind area of the sheltered vehicle according to the three-dimensional information of the sheltering vehicle and the three-dimensional information of the sheltered vehicle comprises:
establishing a three-dimensional bounding box of the sheltering vehicle based on the three-dimensional information of the sheltering vehicle;
establishing a three-dimensional bounding box of the sheltered vehicle based on the three-dimensional information of the sheltered vehicle;
and taking, as the blind area of the sheltered vehicle, an area enclosed by lines connecting a front vertex of the three-dimensional bounding box of the sheltered vehicle to a plurality of vertexes of the three-dimensional bounding box of the sheltering vehicle, and the ground.
3. The method according to claim 1, wherein the judging whether the sheltered vehicle is in danger according to the blind area of the sheltered vehicle and the position of the obstacle comprises:
acquiring the speed of the sheltered vehicle, the speed of the obstacle and the distance from the obstacle;
calculating the time for the sheltered vehicle to reach the obstacle from the speed of the sheltered vehicle, the speed of the obstacle and the distance;
judging whether the time for the sheltered vehicle to reach the obstacle is less than a set threshold value;
and if the time for the sheltered vehicle to reach the obstacle is judged to be less than the set threshold value, judging that the sheltered vehicle is in danger.
4. The method of claim 3, further comprising:
and if the time that the sheltered vehicle reaches the obstacle is judged to be greater than or equal to the set threshold value, sending reminding information to the terminal equipment.
5. The method of claim 1, further comprising:
acquiring the three-dimensional information of the sheltered vehicle and the three-dimensional information of the sheltering vehicle through two sensors, the sheltering vehicle and the sheltered vehicle being vehicles passing through a curve,
wherein one sensor is located at the intersection point of two external tangents of the curve, and the other sensor is located at the intersection point of the perpendiculars drawn at the tangent points of the two external tangents.
6. The method of claim 1, wherein the blind area of the sheltered vehicle is established in real time using the received three-dimensional information of the sheltering vehicle and the three-dimensional information of the sheltered vehicle.
7. The method of claim 1, further comprising:
receiving the speed and the running direction of the sheltering vehicle and the speed and the running direction of the sheltered vehicle;
generating a predicted track and a predicted direction according to the speed and the running direction of the sheltering vehicle and the speed and the running direction of the sheltered vehicle;
wherein the establishing a blind area of the sheltered vehicle according to the three-dimensional information of the sheltering vehicle and the three-dimensional information of the sheltered vehicle comprises: predicting the three-dimensional information of the sheltering vehicle and the three-dimensional information of the sheltered vehicle after a preset time by using the predicted track and the predicted direction, and then establishing the blind area of the sheltered vehicle by using the prediction result.
8. A blind area monitoring device, comprising:
the receiving module is used for receiving the three-dimensional information of the sheltering vehicle, the three-dimensional information of the sheltered vehicle and the position of the obstacle;
the establishing module is used for establishing a blind area of the sheltered vehicle according to the three-dimensional information of the sheltering vehicle, the three-dimensional information of the sheltered vehicle and the position of the obstacle;
the first judging module is used for judging whether the sheltered vehicle is in danger according to the blind area of the sheltered vehicle and the position of the obstacle;
and the sending module is used for sending early warning information to the terminal equipment if the first judging module judges that the sheltered vehicle is in danger.
9. A storage medium, comprising: the storage medium includes a stored program, wherein the program controls an apparatus in which the storage medium is located to perform the blind area monitoring method according to any one of claims 1 to 7 when the program is executed.
10. An edge cloud platform comprising a memory for storing information comprising program instructions and a processor for controlling the execution of the program instructions, wherein the program instructions when loaded and executed by the processor implement the steps of the blind spot monitoring method of any one of claims 1 to 7.
CN202111625945.9A 2021-12-28 2021-12-28 Blind area monitoring method and device, storage medium and edge cloud platform Active CN114290991B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111625945.9A CN114290991B (en) 2021-12-28 2021-12-28 Blind area monitoring method and device, storage medium and edge cloud platform

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111625945.9A CN114290991B (en) 2021-12-28 2021-12-28 Blind area monitoring method and device, storage medium and edge cloud platform

Publications (2)

Publication Number Publication Date
CN114290991A (en) 2022-04-08
CN114290991B CN114290991B (en) 2023-07-04

Family

ID=80971197

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111625945.9A Active CN114290991B (en) 2021-12-28 2021-12-28 Blind area monitoring method and device, storage medium and edge cloud platform

Country Status (1)

Country Link
CN (1) CN114290991B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114582132A (en) * 2022-05-05 2022-06-03 四川九通智路科技有限公司 Vehicle collision detection early warning system and method based on machine vision

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000105894A (en) * 1998-09-29 2000-04-11 Mitsubishi Electric Corp Vehicle drive support system
JP2001148098A (en) * 1999-11-24 2001-05-29 Oki Electric Ind Co Ltd Running assisting device for vehicle
JP2007156568A (en) * 2005-11-30 2007-06-21 Toshiba Corp Road travelling support system
US20110102195A1 (en) * 2009-10-29 2011-05-05 Fuji Jukogyo Kabushiki Kaisha Intersection driving support apparatus
US20160207534A1 (en) * 2015-01-20 2016-07-21 Toyota Jidosha Kabushiki Kaisha Collision avoidance control system and control method
CN108447304A (en) * 2018-04-18 2018-08-24 北京交通大学 Construction road Pedestrians and vehicles intelligent collision warning system and method based on bus or train route collaboration
CN109979238A (en) * 2017-12-28 2019-07-05 北京百度网讯科技有限公司 Barrier based reminding method, device and equipment in lane
CN110962744A (en) * 2018-09-28 2020-04-07 中国电信股份有限公司 Vehicle blind area detection method and vehicle blind area detection system
WO2021075454A1 (en) * 2019-10-14 2021-04-22 株式会社Soken In-vehicle device and driving assistance method
CN112829762A (en) * 2019-11-25 2021-05-25 华为技术有限公司 Vehicle running speed generation method and related equipment
CN113060157A (en) * 2021-03-30 2021-07-02 恒大新能源汽车投资控股集团有限公司 Blind zone road condition broadcasting device, road condition information sharing device, system and vehicle
CN113348119A (en) * 2020-04-02 2021-09-03 华为技术有限公司 Vehicle blind area identification method, automatic driving assistance system and intelligent driving vehicle comprising system



Also Published As

Publication number Publication date
CN114290991B (en) 2023-07-04

Similar Documents

Publication Publication Date Title
CN108932869B (en) Vehicle system, vehicle information processing method, recording medium, traffic system, infrastructure system, and information processing method
CN106891888B (en) Vehicle turn signal detection
CN106873580B (en) Autonomous driving at intersections based on perception data
US20210276589A1 (en) Method, apparatus, device and computer storage medium for vehicle control
US20190362168A1 (en) Predicting Vehicle Movements Based on Driver Body Language
CN112389466B (en) Method, device and equipment for automatically avoiding vehicle and storage medium
RU2735340C1 (en) Parking control method and parking control device
US9524643B2 (en) Orientation sensitive traffic collision warning system
RU2564268C1 (en) Driving aid
US20180056998A1 (en) System and Method for Multi-Vehicle Path Planning Technical Field
US11945435B2 (en) Devices and methods for predicting collisions and/or intersection violations
CN112172663A (en) Danger alarm method based on door opening and related equipment
EP3886076A1 (en) Warning system for a host automotive vehicle
US11926299B2 (en) System and method for predicting road collisions with a host vehicle
CN113853640B (en) electronic control device
JP2014232508A (en) Avoidance track prediction device
US20210312193A1 (en) Devices and methods for predicting intersection violations and/or collisions
US20220032906A1 (en) Device and method for reducing collision risk
CN113538917A (en) Collision early warning method and collision early warning device
CN113022441A (en) Vehicle blind area detection method and device, electronic equipment and storage medium
CN114290991B (en) Blind area monitoring method and device, storage medium and edge cloud platform
CN114537447A (en) Safe passing method and device, electronic equipment and storage medium
CN113085848B (en) Control method and device for unmanned road roller, electronic equipment and storage medium
CN112249007A (en) Vehicle danger alarm method and related equipment
CN114170846B (en) Vehicle lane change early warning method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant