CN115862341A - Unmanned aerial vehicle traffic accident processing system based on edge calculation - Google Patents


Info

Publication number
CN115862341A
Authority
CN
China
Prior art keywords: unmanned aerial vehicle, accident, traffic, subsystem
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211545034.XA
Other languages
Chinese (zh)
Inventor
谢晓兰
徐克顺
高荣
杨哲兴
刘亚荣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guilin University of Technology
Original Assignee
Guilin University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guilin University of Technology filed Critical Guilin University of Technology
Priority to CN202211545034.XA
Publication of CN115862341A
Legal status: Pending

Landscapes

  • Traffic Control Systems (AREA)

Abstract

The invention discloses an unmanned aerial vehicle traffic accident handling system based on edge computing. The system comprises an unmanned aerial vehicle subsystem, a 5G transmission subsystem and a traffic accident management cloud subsystem. The unmanned aerial vehicle subsystem provides a plurality of drones carrying different devices to execute specific tasks. Because the computing and storage resources available at the 5G edge are limited, the 5G transmission subsystem uses a hyper-converged server to deploy a cloud platform with simplified functions. The traffic accident management cloud subsystem schedules the drones cooperatively, performs three-dimensional reconstruction with the structure-from-motion (SFM) method, and detects the vital signs of accident victims using the photoplethysmography principle and transdermal optical imaging. By combining the drone's rapid-monitoring advantage with computer vision, the system efficiently assists relevant personnel in handling traffic accidents, reducing congestion, saving time and saving lives.

Description

Unmanned aerial vehicle traffic accident processing system based on edge calculation
Technical Field
The invention belongs to the technical fields of unmanned aerial vehicles, computer vision and edge computing, and particularly relates to an unmanned aerial vehicle traffic accident handling system based on edge computing.
Background
Due to human factors, such as failure to settle responsibility on the spot, even simple traffic accidents, such as slight scrapes and collisions, can cause severe congestion; in serious accidents, response time becomes a critical life-saving factor. A traffic accident handling system assists relevant personnel in handling road accidents using existing intelligent devices and advanced technologies; it combines human judgment with intelligent technology to achieve rapid response to sudden traffic accident events.
Although drone-assisted traffic accident handling systems already exist, they still have several drawbacks:
(1) The degree of intelligent cooperation between drones and relevant personnel is low.
(2) Accidents cannot be handled quickly enough to clear the road.
(3) In major accidents, injured persons cannot receive emergency treatment in time.
In view of this, conventional systems cannot meet present-day requirements for low-latency, safety-sensitive systems. Facing the variety of situations that may occur and the shortcomings of existing systems, the invention provides an unmanned aerial vehicle traffic accident handling system based on edge computing, which combines drones, computer vision and edge computing to handle traffic accidents, or to help relevant personnel handle them, in the shortest possible time, saving lives and reducing traffic congestion.
Disclosure of Invention
The invention aims to provide an unmanned aerial vehicle traffic accident handling system based on edge computing that reduces accident handling time, assists relevant personnel in saving lives, speeds up accident clearance, reduces traffic congestion, and remedies the defects of conventional drone-assisted traffic systems.
The invention is realized by the following steps: an unmanned aerial vehicle traffic accident processing system based on edge computing comprises an unmanned aerial vehicle subsystem, a 5G transmission subsystem and a traffic accident management cloud subsystem. Wherein the unmanned aerial vehicle subsystem is connected with the 5G transmission subsystem; the 5G transmission subsystem is connected with the traffic accident management cloud subsystem.
The handling process of the unmanned aerial vehicle traffic accident handling system based on edge computing can be divided into two accident scenarios:
(1) The process of the light accident scene is as follows:
step S11: a traffic accident occurs; the accident is minor, with no casualties or only light injuries, but responsibility cannot be settled on the spot, so relevant personnel are called for help.
Step S12: the traffic management command center receives the call for help and determines the relevant accident information (such as location).
Step S13: and the command center dispatches the unmanned aerial vehicle with the shorter distance to the site through the 5G base station.
Step S14: the information-acquisition-and-voice drone begins photographing the scene with its camera while its speaker continually reminds the people involved of safety precautions, and uploads the information to the edge device over the 5G transmission subsystem. Meanwhile, the warning drone flies behind the accident vehicles and continually warns approaching vehicles of the accident ahead through flashing lights and its speaker. After scene photography is complete, the speaker reminds the parties to drive the accident vehicles off the roadway and wait at the roadside for relevant personnel to handle the matter, avoiding traffic congestion.
Step S15: after the edge device receives the data, it performs three-dimensional reconstruction with the SFM algorithm, reproduces a three-dimensional model of the accident scene, and transmits the result to the relevant equipment over 5G.
Step S16: after relevant personnel receive the results, the control center performs a preliminary analysis of responsibility, and the final determination is made after relevant personnel arrive on site.
Step S17: the unmanned aerial vehicle returns to the warehouse.
(2) The process of the heavy accident scene is as follows:
step S21: a serious traffic accident occurs, with severe injuries or fatalities, so the traffic police are called for help.
Step S22: the traffic management command center receives the call for help and determines the relevant accident information (such as location).
Step S23: and the command center dispatches the unmanned aerial vehicle with the shorter distance to the site through the 5G base station.
Step S24: the information-acquisition-and-voice drone begins photographing the scene with its camera and transmits the information to the edge device over the 5G transmission subsystem; meanwhile, the warning drone flies behind the accident vehicles and continually warns approaching vehicles of the accident ahead through flashing lights and its speaker, and the medical drone releases the first-aid kit. After scene photography is complete, the data are uploaded to the edge device and handed to the traffic accident management cloud subsystem for processing.
Step S25: after the traffic accident management cloud subsystem receives the data, the vital sign detection module extracts vital sign data from the video uploaded by the information-acquisition-and-voice drone, places a call to the 120 emergency medical service, briefly explains the situation and sends the detected vital sign data, so that medical staff can understand the situation and prepare. Meanwhile, the SFM algorithm in the three-dimensional reconstruction module performs three-dimensional reconstruction from the received data, reproduces the accident-scene three-dimensional model, and transmits the result to the relevant equipment.
Step S26: after relevant personnel receive the results, the control center makes a preliminary analysis of the scene conditions and deploys personnel, who carry out rescue promptly upon arrival.
Step S27: the unmanned aerial vehicle returns to the warehouse.
The unmanned aerial vehicle subsystem comprises drones, vision cameras, warning lights and speakers, divided into three types of drone: the information-acquisition-and-voice drone, the warning drone and the medical drone. Two deployment scenarios are distinguished:
(1) Two drones: the speaker of the information-acquisition-and-voice drone reminds the people involved of safety precautions, while its vision camera photographs the same scene from multiple angles, stores the pictures by serial number, and uploads them to the edge cloud for processing; the warning drone warns vehicles and people behind to avoid secondary accidents.
(2) Three drones: the speaker of the information-acquisition-and-voice drone reminds the people involved of safety precautions, and its vision camera photographs the same scene from multiple angles and uploads the pictures to the edge cloud for processing; the medical drone drops the first-aid kit for timely emergency treatment; the warning drone warns vehicles and people behind to avoid secondary accidents.
Because the computing and storage resources available at the 5G edge are limited, the 5G transmission subsystem uses a hyper-converged server to deploy a cloud platform with simplified functions, transmitting data quickly and effectively reducing delay.
The traffic accident management cloud subsystem comprises an unmanned aerial vehicle scheduling module, a three-dimensional reconstruction module and a vital sign detection module. The unmanned aerial vehicle scheduling module is connected with the three-dimensional reconstruction module and the vital sign detection module; the three-dimensional reconstruction module and the vital sign detection module are arranged in parallel. The traffic accident management cloud subsystem is the control center managing the whole system; it supports human-machine cooperation, and issues commands through autonomous or human-confirmed instructions based on analysis of the data provided by the unmanned aerial vehicle scheduling module, the three-dimensional reconstruction module and the vital sign detection module, so as to handle traffic accidents comprehensively.
The unmanned aerial vehicle scheduling module in the traffic accident management cloud subsystem schedules the drones deployed across the whole traffic area: relevant personnel control takeoff, and the drone flies to the accident scene, handles the accident and returns autonomously.
The three-dimensional reconstruction module in the traffic accident management cloud subsystem reconstructs a 3D model of the accident scene and its surroundings using the SFM method. The reconstruction includes an accident-vehicle model containing details such as the specific impact and friction points of the accident vehicles, and a traffic guidance or restriction model comprising road markings, trees, railings and other objects within a certain distance around the accident scene.
The vital sign detection module in the traffic accident management cloud subsystem searches a certain range around the accident scene to detect the drivers and passengers of the accident vehicles: heart rate is detected from facial changes based on the photoplethysmography principle combined with independent component analysis; blood pressure is detected based on transdermal optical imaging; and surface analysis provides a preliminary judgment of the casualties' blood loss, possible shock, and injured body parts.
Compared with the traditional traffic accident handling system based on unmanned aerial vehicle assistance, the unmanned aerial vehicle traffic accident handling system based on edge calculation provided by the invention has the following advantages:
(1) Intelligence. The whole edge-computing-based handling process requires little human participation; compared with manual management, accidents are handled in an almost point-to-point manner, and once an accident occurs, relevant personnel only need to review the results delivered to their devices.
(2) Fast response. In the invention, 5G data transmission together with the edge cloud greatly reduces transmission delay, so tasks are processed more quickly.
(3) A novel processing system. The invention combines drones with cutting-edge technologies such as three-dimensional reconstruction and vital sign detection, together with the advantages of 5G and edge computing, to form a novel traffic accident handling system.
Drawings
Fig. 1 is a block diagram of an unmanned aerial vehicle traffic accident handling system based on edge calculation.
Fig. 2 is a process diagram of light traffic accident handling in the edge-computing-based system.
Fig. 3 is a process diagram of serious traffic accident handling in the edge-computing-based system.
Fig. 4 is a schematic view of the appearance of the drone with different functions in the drone subsystem of the present invention.
Fig. 5 is a process diagram of the 5G transmission subsystem of the present invention.
FIG. 6 is a view of the multi-drone cooperation scene on an ordinary road in daytime in the unmanned aerial vehicle subsystem of the present invention.
Fig. 7 is a schematic diagram illustrating steps of an SFM method in a traffic accident management cloud subsystem three-dimensional reconstruction module according to the present invention and a result of a traffic accident scene.
Fig. 8 is a processing step diagram of PPG in the traffic accident management cloud subsystem vital sign detection module according to the present invention.
The reference numerals in the figures are: unmanned aerial vehicle subsystem 1, 5G transmission subsystem 2, traffic accident management cloud subsystem 3, unmanned aerial vehicle scheduling module 3-1, three-dimensional reconstruction module 3-2, vital sign detection module 3-3, information-acquisition-and-voice drone camera 1-1, information-acquisition-and-voice drone speaker 1-2, information-acquisition-and-voice drone speaker 1-3, information-acquisition-and-voice drone camera 1-4, warning drone warning light 1-5, warning drone speaker 1-6, medical drone first-aid kit 1-7, unmanned aerial vehicle 2-1, 5G base station 2-2, edge cloud 2-3, terminal equipment 2-4.
Detailed Description
The technical solution of the present patent will be further described in detail with reference to the following embodiments.
The embodiment is as follows: as shown in fig. 1, an unmanned aerial vehicle traffic accident handling system based on edge computing includes an unmanned aerial vehicle subsystem 1, a 5G transmission subsystem 2 and a traffic accident management cloud subsystem 3. The traffic accident management cloud subsystem 3 comprises an unmanned aerial vehicle scheduling module 3-1, a three-dimensional reconstruction module 3-2 and a vital sign detection module 3-3.
The unmanned aerial vehicle subsystem 1 is connected with the 5G transmission subsystem 2; the 5G transmission subsystem 2 is connected with the traffic accident management cloud subsystem 3. The unmanned aerial vehicle dispatching module 3-1 in the traffic accident management cloud subsystem 3 is connected with the three-dimensional reconstruction module 3-2 and the vital sign detection module 3-3; the three-dimensional reconstruction module 3-2 is parallel to the vital sign detection module 3-3.
The handling process of the unmanned aerial vehicle traffic accident handling system based on edge computing can be divided into two accident scenarios:
(1) As shown in fig. 2, the process of the light accident scenario is as follows:
step S11: a traffic accident (1) occurs; the accident is minor, with no casualties or only light injuries, but responsibility cannot be settled on the spot, so relevant personnel are called for help.
Step S12: the traffic management command center (2) receives the call for help and determines the relevant accident information (such as location).
Step S13: and the command center dispatches the unmanned aerial vehicle with a shorter distance to the site through the 5G base station (3).
Step S14: the information-acquisition-and-voice drone (4) begins photographing the scene with camera 1-1, while speaker 1-2 continually reminds the people involved of safety precautions, and the information is transmitted to the edge device (6) over the 5G transmission subsystem 2. Meanwhile, the warning drone (5) flies behind the accident vehicles, warning approaching vehicles of the accident ahead through flashing lights 1-5 and speaker 1-6. After scene photography is complete, speaker 1-2 reminds the parties to drive the accident vehicles off the roadway and wait at the roadside for relevant personnel, avoiding traffic congestion.
Step S15: after receiving the data, the edge device (6) performs three-dimensional reconstruction with the SFM algorithm, reproduces the three-dimensional model of the accident scene, and transmits the result to the relevant device (7) over 5G.
Step S16: after relevant personnel receive the results, the control center (2) performs a preliminary analysis of responsibility, and the final determination is made after relevant personnel arrive on site.
Step S17: the unmanned aerial vehicle returns to the warehouse.
(2) Referring to fig. 3, the process of the heavy accident scenario is as follows:
step S21: a serious traffic accident (1) occurs, with severe injuries or fatalities, so the traffic police are called for help.
Step S22: the traffic management command center (2) receives the call for help and determines the relevant accident information (such as location).
Step S23: and the command center dispatches the unmanned aerial vehicle with a shorter distance to the site through the 5G base station (3).
Step S24: the information-acquisition-and-voice drone (4) begins photographing the scene with camera 1-1 and transmits the information to the edge device (7) over the 5G transmission subsystem 2; meanwhile, the warning drone (5) flies behind the accident vehicles and continually warns approaching vehicles through flashing lights 1-5 and speaker 1-6, and the medical drone (6) releases the first-aid kit. After the information-acquisition-and-voice drone (4) finishes photographing the scene, the data are uploaded to the edge device (7) and handed to the traffic accident management cloud subsystem 3 for processing.
Step S25: after the traffic accident management cloud subsystem 3 receives the data, the vital sign detection module 3-3 extracts vital sign data from the video uploaded by the information-acquisition-and-voice drone (4), places a call to the 120 emergency medical service, briefly explains the situation and sends the detected vital sign data, so that medical staff can understand the situation and prepare. Meanwhile, after receiving the data of the information-acquisition-and-voice drone (4), the SFM algorithm in the three-dimensional reconstruction module 3-2 performs three-dimensional reconstruction, reproduces the accident-scene three-dimensional model, and transmits the result to the relevant equipment (8).
Step S26: after relevant personnel receive the results, the control center makes a preliminary analysis of the scene conditions and deploys personnel, who carry out rescue promptly upon arrival.
Step S27: and the unmanned aerial vehicle returns to the warehouse.
The unmanned aerial vehicle subsystem 1 is mainly used for scene photography, reminding personnel, and traffic dispersion. Fig. 4 shows the appearance of the different drones in the subsystem: Fig. 4(a) is a front view of the information-acquisition-and-voice drone, whose camera 1-1 photographs and films the scene and whose speaker 1-2 plays reminder messages; Fig. 4(b) is a side view of the same drone, with speaker 1-3 playing reminder messages; Fig. 4(c) is another side view, with camera 1-4 photographing the scene and detecting vital signs; Fig. 4(d) is a top view of the drone, which has four propellers; Fig. 4(e) is a front view of the warning drone, whose three warning lights 1-5 flash continually to warn vehicles behind and whose speaker 1-6 plays warning messages; Fig. 4(f) is a front view of the medical drone, which can lower first-aid kit 1-7.
The 5G transmission subsystem 2 is mainly used for fast data transmission. Fig. 5 is a process diagram of the 5G transmission subsystem of the present invention; the specific transmission sequence is as follows:
Step S201: the unmanned aerial vehicle 2-1 uploads data to the 5G base station 2-2 over 5G.
Step S202: the 5G base station 2-2 forwards the data to the edge cloud 2-3; the edge cloud 2-3 can likewise send instructions or data back to the unmanned aerial vehicle 2-1 through the base station 2-2.
Step S203: the edge cloud sends instructions or data to the terminal 2-4 through the 5G base station 2-2.
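The three-hop relay above (drone 2-1 → base station 2-2 → edge cloud 2-3, and edge cloud → base station → terminal 2-4) can be sketched as a minimal message-passing simulation; the `Node` class, its logging, and the function names are illustrative and not part of the patent:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """One hop in the transmission chain: drone 2-1, base station 2-2,
    edge cloud 2-3, or terminal 2-4. Logging is for illustration only."""
    name: str
    log: list = field(default_factory=list)

    def receive(self, payload, sender):
        # Record who sent what, then pass the payload on unchanged.
        self.log.append((sender.name, payload))
        return payload

def uplink(drone, base_station, edge_cloud, payload):
    """Steps S201-S202: drone -> 5G base station -> edge cloud."""
    base_station.receive(payload, drone)
    return edge_cloud.receive(payload, base_station)

def downlink(edge_cloud, base_station, terminal, command):
    """Step S203 (and the reverse path of S202): edge cloud -> 5G base
    station -> terminal or drone."""
    base_station.receive(command, edge_cloud)
    return terminal.receive(command, base_station)
```

In this sketch the base station never processes data, matching the description: it only forwards between the drone, the edge cloud and the terminal.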
The traffic accident management cloud subsystem 3 comprises an unmanned aerial vehicle scheduling module 3-1, a three-dimensional reconstruction module 3-2 and a vital sign detection module 3-3.
The unmanned aerial vehicle scheduling module 3-1 regulates and controls multiple drones. Its operation divides into three phases: takeoff, cooperation, and return. Takeoff and return can be controlled by relevant personnel, and the drones can also return autonomously; multi-drone cooperation divides into path planning and drone coordination.
The unmanned plane path planning steps are as follows:
step S311: the control center obtains the accident location from the call for help and transmits the positioning information to the drone over 5G.
Step S312: the drone receives the data, determines the end point, and plans its trajectory. Trajectory planning involves the three dimensions x(t), y(t) and z(t), and the displacement function r(t) can be expressed as:

r(t) = (x(t), y(t), z(t))    formula (1)

where x(t), y(t) and z(t) are the coordinates along the three spatial axes. The path from the start point to the end point is divided into n + 1 waypoints (including the start and end points); at time T(k), the aircraft must reach the k-th predefined waypoint R_k. R_1 (k = 1) is the start of the flight, and R_{n+1} (k = n + 1) is its end. The invention records the waypoints as:

R_k(T_k) = (X_k, Y_k, Z_k),  k = 1, ..., n + 1    formula (2)

where R_k(T_k) is the waypoint vector, and X_k, Y_k and Z_k are the coordinates of the k-th waypoint along the three axes.
Step S313: the planned path is divided into n segments, and the parametric equation of each segment is solved by jerk minimization.
Step S314: solve for the parameters and determine the trajectory.
During flight, an autonomous obstacle-avoidance system operates while the drone travels from one waypoint to the next.
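Steps S312-S314 can be sketched per axis: a minimum-jerk trajectory between two waypoints is a quintic polynomial whose six coefficients are fixed by position, velocity and acceleration boundary conditions at the segment's ends. The sketch below (NumPy; the zero default boundary velocities/accelerations are an illustrative assumption, not from the patent) solves one such segment:

```python
import numpy as np

def min_jerk_segment(p0, pf, T, v0=0.0, vf=0.0, a0=0.0, af=0.0):
    """Solve a quintic (minimum-jerk) polynomial for one path segment.

    Returns coefficients c of p(t) = sum(c[i] * t**i) satisfying the
    position/velocity/acceleration boundary conditions at t = 0 and t = T.
    Run once per axis (x, y, z) between consecutive waypoints R_k, R_{k+1}.
    """
    A = np.array([
        [1, 0, 0,    0,      0,       0],         # p(0)  = p0
        [0, 1, 0,    0,      0,       0],         # p'(0) = v0
        [0, 0, 2,    0,      0,       0],         # p''(0) = a0
        [1, T, T**2, T**3,   T**4,    T**5],      # p(T)  = pf
        [0, 1, 2*T,  3*T**2, 4*T**3,  5*T**4],    # p'(T) = vf
        [0, 0, 2,    6*T,    12*T**2, 20*T**3],   # p''(T) = af
    ], dtype=float)
    b = np.array([p0, v0, a0, pf, vf, af], dtype=float)
    return np.linalg.solve(A, b)

def evaluate(c, t):
    """Evaluate the polynomial p(t) = sum(c[i] * t**i)."""
    return sum(ci * t**i for i, ci in enumerate(c))
```

Applying this independently to x(t), y(t) and z(t) over each of the n segments yields a smooth trajectory through all the waypoints of formula (2).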
FIG. 6 shows the daytime ordinary-road multi-drone cooperation scene in the unmanned aerial vehicle subsystem of the present invention. Task allocation depends on each drone's function; drones U_1, U_2 and U_3 are set as follows: U_1 is the information-acquisition-and-voice drone (1), U_2 the warning drone (2), and U_3 the medical drone (3).
For safety between drones, the distance D_12 between U_1 and U_2 must satisfy D_12 ≥ 1, in meters.
To warn vehicles behind effectively and avoid secondary accidents, the distance D_3 between U_3 and the accident point is specified as follows:
If the accident occurs on an ordinary road, D_3 ≥ 50 in daytime and D_3 ≥ 150 at night, in meters.
If the accident occurs on an expressway, D_3 ≥ 150 in daytime and D_3 ≥ 250 at night, in meters.
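The distance rules above are a small lookup, which can be sketched as follows (function names are illustrative; the thresholds are the ones stated in the text):

```python
def min_warning_distance(road_type: str, is_daytime: bool) -> float:
    """Minimum distance (meters) between drone U_3 and the accident point:
    ordinary road: 50 m day / 150 m night; expressway: 150 m day / 250 m night."""
    table = {
        ("ordinary", True): 50.0,
        ("ordinary", False): 150.0,
        ("expressway", True): 150.0,
        ("expressway", False): 250.0,
    }
    return table[(road_type, is_daytime)]

def placement_ok(d12: float, d3: float, road_type: str, is_daytime: bool) -> bool:
    """Check both constraints: inter-drone spacing D_12 >= 1 m and
    warning distance D_3 >= the road- and time-dependent minimum."""
    return d12 >= 1.0 and d3 >= min_warning_distance(road_type, is_daytime)
```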
The three-dimensional reconstruction module 3-2 processes the data and builds the three-dimensional scene model. Fig. 7 shows the steps of the SFM (structure-from-motion) method in the traffic accident management cloud subsystem three-dimensional reconstruction module; SFM is an offline algorithm that performs three-dimensional reconstruction from a collection of unordered images. The specific steps are as follows:
step S321: detect feature points in each scene picture taken by the drone and extract feature descriptors for them.
Step S322: match the feature points between each pair of pictures, establish the tracks of feature points across the scene pictures, and discard matches that violate geometric constraints. Matching uses Euclidean distance; the nearest-neighbor feature vector is computed as:

f_nn = argmin_{f' ∈ F(J)} ||f_d − f'_d||_2    formula (3)

where f_nn is the nearest-neighbor feature vector, F(I) denotes the feature points of image I, f_d is the descriptor of feature f, and f'_d is the descriptor of candidate feature f'. For each image pair I and J, every feature f ∈ F(I) is considered and its nearest neighbor f_nn ∈ F(J) is found.
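Formula (3)'s nearest-neighbor search can be sketched with NumPy; the ratio test added below is a common practice for rejecting ambiguous matches, not something stated in the text:

```python
import numpy as np

def match_features(desc_i, desc_j, ratio=0.8):
    """Nearest-neighbour matching of descriptors F(I) against F(J) by
    Euclidean distance, per formula (3). Returns (index in I, index in J,
    distance) triples; Lowe's ratio test (an added assumption) drops
    matches whose nearest and second-nearest distances are too close."""
    matches = []
    for a, f in enumerate(desc_i):
        d = np.linalg.norm(desc_j - f, axis=1)  # ||f_d - f'_d||_2 for all f'
        order = np.argsort(d)
        nn, second = order[0], order[1]
        if d[nn] < ratio * d[second]:
            matches.append((a, int(nn), float(d[nn])))
    return matches
```

Real SFM pipelines run this on high-dimensional descriptors (e.g. SIFT) and then filter the surviving matches with geometric constraints such as the fundamental matrix, as step S322 describes.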
Step S323: obtain the intrinsic parameters of the drone's camera, such as focal length and pixel size.
Step S324: obtain the extrinsic parameters of each image, such as the camera's rotation and translation.
Step S325: using the parameters obtained in steps S323 and S324, compute the three-dimensional coordinates of the points by triangulation, yielding a sparse reconstruction.
Step S326: optimize the result. The error of back-projecting the three-dimensional points onto the images is minimized by adjusting the estimated camera parameters and three-dimensional point coordinates; the optimization can be written as:

g(C, X) = Σ_i Σ_j w_ij · ||q_ij − P(C_i, X_j)||²    formula (4)

where g(C, X) is the optimization objective; w_ij indicates whether camera i observes track j (w_ij = 1 if observed, otherwise w_ij = 0); q_ij is the observed image coordinate of track j in camera i; P(C_i, X_j) is the projection of the three-dimensional point X_j by camera C_i; and ||q_ij − P(C_i, X_j)||² is the squared projection error of track j in camera i.
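Steps S325 and S326 can be illustrated together: linear (DLT) triangulation recovers a 3D point from two views, and the bundle-adjustment objective g(C, X) of formula (4) sums the squared reprojection errors. A minimal NumPy sketch, with cameras simplified to bare 3×4 projection matrices (an illustrative assumption):

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Step S325: linear (DLT) triangulation of one point from two 3x4
    camera projection matrices P1, P2 and its image observations x1, x2."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)          # null-space vector of A
    X = Vt[-1]
    return X[:3] / X[3]                  # de-homogenise

def reprojection_cost(cameras, points, observations):
    """Step S326: evaluate g(C, X) from formula (4). `observations` maps
    (i, j) -> observed pixel q_ij, so w_ij = 1 exactly for its keys."""
    cost = 0.0
    for (i, j), q in observations.items():
        Xh = np.append(points[j], 1.0)   # homogeneous 3D point
        p = cameras[i] @ Xh              # P(C_i, X_j), homogeneous
        r = np.asarray(q) - p[:2] / p[2] # q_ij - P(C_i, X_j)
        cost += float(r @ r)
    return cost
```

Bundle adjustment then minimizes `reprojection_cost` over both the camera parameters and the point coordinates, typically with a nonlinear least-squares solver such as Levenberg-Marquardt.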
Fig. 7 also shows the result for a traffic accident scene in the three-dimensional reconstruction module 3-2 of the traffic accident management cloud subsystem 3: photographs taken from different angles are used to reconstruct the 3D model with the SFM method.
The vital sign detection module 3-3 detects the vital signs of people and assists rescue. Based on the photoplethysmography principle combined with independent component analysis, the module detects heart rate from facial changes; blood pressure is detected based on transdermal optical imaging; and surface analysis provides a preliminary judgment of the casualties' blood loss, possible shock, and injured body parts.
Fig. 8 shows a diagram of the processing steps of PPG in the vital sign detection module 3-3 of the traffic accident management cloud subsystem 3 according to the present invention. Photoplethysmography (PPG) is based on an LED light source and a detector: it measures the attenuated light reflected and absorbed by the blood vessels and tissues of the human body, traces the pulse state of the blood vessels, and measures the pulse wave. The specific steps are as follows:
step S331: and (5) video acquisition. Videos are collected through the unmanned aerial vehicle camera 1-1, and data are uploaded to the traffic accident management cloud subsystem 3.
Step S332: and (4) preprocessing data. And in the traffic accident management cloud subsystem 3, primary processing is carried out on the uploaded video data.
Step S333: and (4) multithreading operation. In the traffic accident management cloud subsystem 3, parallel processing is adopted, and a multithreading technology is used for operation, so that the calculation speed is increased.
Step S334: and (4) detecting the heart rate. And obtaining a detection result.
For blood pressure detection based on the transdermal optical imaging technique, the change of facial blood flow is monitored with transdermal optical imaging in the video shot by the unmanned aerial vehicle, and the blood pressure is finally estimated on the basis of a machine learning algorithm and a blood pressure calculation model.
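A minimal sketch of this final stage, estimating blood pressure from video-derived features: the features, training data and linear model below are all hypothetical stand-ins for the patent's unspecified machine learning algorithm and blood pressure calculation model.

```python
import numpy as np

# Hypothetical training data: per-subject features extracted from facial
# blood-flow video (e.g. pulse amplitude, a pulse-transit proxy, heart
# rate) against cuff-measured systolic pressure. Values are synthetic.
rng = np.random.default_rng(1)
features = rng.uniform(0.0, 1.0, size=(50, 3))
true_w = np.array([30.0, -10.0, 15.0])
systolic = 100.0 + features @ true_w + rng.normal(0.0, 1.0, 50)

# Fit a linear blood-pressure model by least squares (a stand-in for
# whatever regressor the real system would use).
A = np.hstack([features, np.ones((50, 1))])        # add intercept column
w, *_ = np.linalg.lstsq(A, systolic, rcond=None)

new_subject = np.array([0.5, 0.5, 0.5, 1.0])       # features + intercept
print(new_subject @ w)                             # predicted systolic (mmHg)
```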
In summary, the invention relates to an unmanned aerial vehicle traffic accident processing system based on edge computing. It provides an unmanned aerial vehicle subsystem based on various types of unmanned aerial vehicles, which realizes camera shooting, voice playing, warning and similar functions by means of a camera, a loudspeaker, a warning light and the like; the 5G transmission subsystem transmits data over 5G and builds a cloud platform on the computing and storage resources of 5G edge computing to realize low-delay transmission and processing of tasks; the traffic accident management cloud subsystem provides an unmanned aerial vehicle scheduling module for intelligent scheduling and deployment of the unmanned aerial vehicles, a three-dimensional reconstruction module using the SFM method to realize 3D scene reconstruction, and a vital sign detection module based on the photoplethysmography principle and the transdermal optical imaging technique to detect the vital signs of accident personnel. By combining the rapid monitoring advantages of unmanned aerial vehicles with computer vision technology, the system efficiently assists the relevant personnel in handling traffic accidents, can be applied to various traffic roads, and realizes efficient handling of traffic accidents.
The foregoing is only a preferred embodiment of the present invention. The above embodiment is used only to illustrate the technical solutions of the present invention and not to limit them; any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present invention fall within the scope of the present invention.

Claims (1)

1. An unmanned aerial vehicle traffic accident processing system based on edge computing, characterized by comprising an unmanned aerial vehicle subsystem (1), a 5G transmission subsystem (2) and a traffic accident management cloud subsystem (3); the unmanned aerial vehicle subsystem (1) is connected with the 5G transmission subsystem (2); the 5G transmission subsystem (2) is connected with the traffic accident management cloud subsystem (3); the traffic accident management cloud subsystem (3) comprises an unmanned aerial vehicle scheduling module (3-1), a three-dimensional reconstruction module (3-2) and a vital sign detection module (3-3); the unmanned aerial vehicle scheduling module (3-1) is connected with the three-dimensional reconstruction module (3-2) and the vital sign detection module (3-3); the three-dimensional reconstruction module (3-2) operates in parallel with the vital sign detection module (3-3);
the unmanned aerial vehicle traffic accident processing system based on edge computing comprises the following processes:
1) The process for a light accident scene is as follows:
step S11: a traffic accident occurs; the accident is light, with no casualties or only light injuries, but the question of accident responsibility cannot be settled on the spot, so the traffic police are called for help;
step S12: the traffic management command center receives the call for help and determines the related accident information (such as position information);
step S13: the command center dispatches the unmanned aerial vehicles closest to the site through the 5G transmission subsystem (2);
step S14: the information acquisition and voice unmanned aerial vehicle starts shooting the scene with the camera (1-1) and transmits the information to the traffic accident management cloud subsystem (3); meanwhile, the warning unmanned aerial vehicle flies to the rear of the accident vehicle and, through the flashing of the warning lamp (1-5) and the loudspeaker (1-6), continuously reminds vehicles behind that an accident has occurred ahead; after the information acquisition and voice unmanned aerial vehicle has finished shooting the scene, it uses the loudspeaker (1-2) to continuously remind the personnel involved to drive the accident vehicle off the traffic road and wait at the roadside for processing, so as to avoid traffic jams;
step S15: after the traffic accident management cloud subsystem (3) receives the data, the three-dimensional reconstruction module (3-2) carries out three-dimensional reconstruction using the SFM algorithm, reproduces a three-dimensional model of the accident scene, and transmits the result to the related terminal equipment (2-4);
step S16: after the relevant personnel receive the result, the control center carries out a preliminary analysis of the responsibility judgment, and the relevant personnel make the final responsibility judgment after arriving at the site;
step S17: the unmanned aerial vehicles return;
2) The process for a heavy accident scene is as follows:
step S21: a traffic accident occurs with serious injuries or deaths, and the traffic police are called for help;
step S22: the traffic management command center receives the call for help and determines the related accident information (such as position information);
step S23: the command center dispatches the unmanned aerial vehicles closest to the site through the 5G transmission subsystem (2);
step S24: the information acquisition and voice unmanned aerial vehicle starts shooting the scene with the camera (1-1) while recording, and transmits the information to the traffic accident management cloud subsystem (3); meanwhile, the warning unmanned aerial vehicle flies to the rear of the accident vehicle and, through the flashing of the warning lamp (1-5) and the loudspeaker (1-6), continuously reminds vehicles behind that an accident has occurred ahead; at the same time, the medical unmanned aerial vehicle releases the first-aid kit (1-7); after the information acquisition and voice unmanned aerial vehicle has finished shooting the scene, the data are uploaded to the traffic accident management cloud subsystem (3);
step S25: after the traffic accident management cloud subsystem (3) receives the data, the vital sign detection module (3-3) acquires vital sign data from the video uploaded by the unmanned aerial vehicle, calls the 120 emergency medical service, briefly explains the situation, and sends the detected vital sign data so that the medical staff know the situation and can make preparations; after receiving the information acquisition and voice unmanned aerial vehicle data, the three-dimensional reconstruction module (3-2) performs three-dimensional reconstruction using the SFM algorithm, reproduces the three-dimensional model of the accident scene, and transmits the result to the related terminal equipment (2-4);
step S26: after the relevant personnel receive the result, the control center carries out a preliminary analysis of the accident scene situation and deploys the relevant personnel, who carry out the rescue in time after arriving at the scene;
step S27: the unmanned aerial vehicles return;
the unmanned aerial vehicle subsystem (1) is composed of three unmanned aerial vehicles: the system comprises an information acquisition and voice unmanned aerial vehicle, a warning unmanned aerial vehicle and a medical unmanned aerial vehicle, wherein a loudspeaker (1-2) in the information acquisition and voice unmanned aerial vehicle is responsible for reminding various cautions of traffic accident personnel, and a visual camera is responsible for taking pictures of the same scene from multiple angles, storing the pictures according to serial numbers and uploading edge clouds (2-3) for processing; the warning unmanned aerial vehicle is responsible for warning rear vehicles and personnel and avoiding secondary occurrence of accidents; if the accident is large and casualties exist, the medical unmanned aerial vehicle can throw in a first-aid kit;
the 5G transmission subsystem (2) utilizes the advantage that computing and storage resources of 5G edge computing are small, and deploys a cloud platform with a simplified function by using a super fusion server;
the traffic accident management cloud subsystem (3) manages the operation of the whole system, performs man-machine cooperation, and issues commands by using an autonomous command or a man-machine combined command through data analysis sent by the unmanned aerial vehicle scheduling module (3-1), the three-dimensional reconstruction module (3-2) and the vital sign detection module (3-3) so as to comprehensively process traffic accidents;
an unmanned aerial vehicle scheduling module (3-1) in the traffic accident management cloud subsystem (3) schedules the unmanned aerial vehicle in the whole traffic area based on the deployment, and related personnel control take-off, arrive at an accident site, deal with the accident and return to the air autonomously;
a three-dimensional reconstruction module (3-2) in the traffic accident management cloud subsystem (3) establishes a 3D model according to an accident scene and a related surrounding environment; the specific model comprises an accident vehicle model which comprises details such as specific impact points, friction points and the like of an accident vehicle; the traffic guidance or restriction mark model comprises traffic marked lines, trees, railings and other field objects at a certain distance around an accident field;
vital sign detection module (3-3) among traffic accident management cloud subsystem (3) are through searching in the certain limit to the scene of an accident, detect accident vehicle driver and personnel that carry, include based on photoplethysmography principle, combine independent component analysis, utilize the change of people's face to carry out heart rate and detect, carry out blood pressure detection based on transdermal optical imaging technique, carry out preliminary judgement wounded's hemorrhage volume and whether shock condition according to surface analysis to preliminary judgement wounded injured part.
CN202211545034.XA 2022-12-04 2022-12-04 Unmanned aerial vehicle traffic accident processing system based on edge calculation Pending CN115862341A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211545034.XA CN115862341A (en) 2022-12-04 2022-12-04 Unmanned aerial vehicle traffic accident processing system based on edge calculation

Publications (1)

Publication Number Publication Date
CN115862341A true CN115862341A (en) 2023-03-28

Family

ID=85669709

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211545034.XA Pending CN115862341A (en) 2022-12-04 2022-12-04 Unmanned aerial vehicle traffic accident processing system based on edge calculation

Country Status (1)

Country Link
CN (1) CN115862341A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107194989A (en) * 2017-05-16 2017-09-22 交通运输部公路科学研究所 The scene of a traffic accident three-dimensional reconstruction system and method taken photo by plane based on unmanned plane aircraft
CN108711273A (en) * 2018-03-30 2018-10-26 榛硕(武汉)智能科技有限公司 A kind of quick processing system of traffic accident and its processing method
CN112150803A (en) * 2020-08-27 2020-12-29 东风汽车集团有限公司 Method and server for processing traffic accidents
CN112435458A (en) * 2019-06-21 2021-03-02 北京航空航天大学 Emergency simulation method for unmanned aerial vehicle on highway under traffic accident
CN113643520A (en) * 2021-08-04 2021-11-12 南京及物智能技术有限公司 Intelligent traffic accident processing system and method
CN114840017A (en) * 2022-04-01 2022-08-02 合众新能源汽车有限公司 Control method, control device, control system, traffic management system, and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Zheng Yuanmao; He Yuanrong; Leng Peng; Yao Chengxin: "Construction of a Rapid Mapping and Archiving System for Traffic Accident Scenes Based on Unmanned Aerial Vehicles", Journal of Chongqing University of Technology (Natural Science), no. 11, 15 November 2017 (2017-11-15) *

Similar Documents

Publication Publication Date Title
US9783320B2 (en) Airplane collision avoidance
CN207367052U (en) A kind of life detection car, wearable device and virtual reality detection system
US10322804B2 (en) Device that controls flight altitude of unmanned aerial vehicle
CN107585222B (en) Unmanned reconnaissance vehicle
CN112660157B (en) Multifunctional remote monitoring and auxiliary driving system for barrier-free vehicle
US11574504B2 (en) Information processing apparatus, information processing method, and program
US20150268338A1 (en) Tracking from a vehicle
US20170193308A1 (en) Systems and methods for personal security using autonomous drones
US11150659B2 (en) Information collection system and server apparatus
CN109259948B (en) Wheelchair for assisting driving
GB2601275A (en) Detection and classification of siren signals and localization of siren signal sources
CN106527426A (en) Indoor multi-target track planning system and method
CN106353886A (en) Intelligent head up display of automobile
WO2014080388A2 (en) Police drone
WO2022246852A1 (en) Automatic driving system testing method based on aerial survey data, testing system, and storage medium
CN104802710B (en) A kind of intelligent automobile reversing aid system and householder method
CN110271545A (en) Controller of vehicle, control method for vehicle and storage medium
JP2020142652A (en) Vehicle driving control system
CN109272755A (en) A kind of additional transport Command Management System based on unmanned plane
KR102184598B1 (en) Driving Prediction and Safety Driving System Based on Judgment of Driver Emergency Situation of Autonomous Driving Vehicle
US20180342153A1 (en) Autonomous traffic managing system
CN111035543A (en) Intelligent blind guiding robot
CN111216718B (en) Collision avoidance method, device and equipment
CN107226024B (en) A kind of high-precision vehicle sighting distance acquisition and processing system and method
KR102441077B1 (en) Apparatus for controlling taking off and landing of a dron in a vehicle and method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination