CN113406640A - System and method for tracking collider and collecting evidence by unmanned aerial vehicle after collision accident - Google Patents


Info

Publication number
CN113406640A
Authority
CN
China
Prior art keywords
target object
current vehicle
vehicle
drone
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202010186534.3A
Other languages
Chinese (zh)
Inventor
唐帅
孙铎
曲彤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Audi AG
Original Assignee
Audi AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Audi AG filed Critical Audi AG
Priority to CN202010186534.3A
Publication of CN113406640A
Legal status: Withdrawn

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/931: Radar or analogous systems specially adapted for anti-collision purposes of land vehicles
    • G01S13/862: Combination of radar systems with sonar systems
    • G01S13/865: Combination of radar systems with lidar systems
    • G01S13/867: Combination of radar systems with cameras
    • G01S15/00: Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/931: Sonar systems specially adapted for anti-collision purposes of land vehicles
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/931: Lidar systems specially adapted for anti-collision purposes of land vehicles

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Acoustics & Sound (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present disclosure relates to systems and methods for collecting evidence after a collision accident. The system comprises: a collision detection unit configured to detect whether another target object collides with the current vehicle and to acquire an image of that target object; a target object tracking unit configured to launch a drone above the current vehicle, track the target object based on its image, and acquire information about the target object; and a communication unit configured to send the information about the target object to a mobile device or to the current vehicle. With this system and method, the drone is launched to track and photograph the target object once a collision accident is detected, so that evidence against the collider in a hit-and-run accident is collected in time.

Description

System and method for tracking collider and collecting evidence by unmanned aerial vehicle after collision accident
Technical Field
The present disclosure relates to the field of vehicle safety technology and, more particularly, to a system and method in which a drone tracks a collider and collects evidence after a collision accident.
Background
Most vehicle owners are not at the scene when a hit-and-run occurs after a collision. In such a case, the owner may be unable to identify the party against whom to assert a claim, and even when that party can be identified, it may be difficult to provide sufficient evidence, for example an image of the collider's face.
Accordingly, there is a need for a system and method for post-crash evidence collection.
Disclosure of Invention
One aim of the present disclosure is to detect that another target object has collided with the current vehicle, launch a drone above the current vehicle, and track and photograph the collider, thereby collecting timely evidence against the collider in a hit-and-run accident.
Thus, according to a first aspect of the present disclosure, there is provided a system for collecting evidence following a collision incident, the system comprising:
a collision detection unit configured to detect, while the vehicle is stopped, whether another target object collides with the current vehicle, and to acquire an image of that target object in response to detecting such a collision;
a target object tracking unit configured to, in response to another target object colliding with the current vehicle, launch a drone above the current vehicle, track the target object based on its image, and acquire information about the target object;
a communication unit configured to send the information about the target object to a mobile device or to the current vehicle.
According to an embodiment of the present disclosure, the stopped state of the vehicle includes any one, any two, or all three of the following: the vehicle is parked, the vehicle is locked, and the vehicle speed is zero.
According to an embodiment of the present disclosure, the collision detection unit includes a sensor (e.g., a camera, an ultrasonic sensor, a lidar, or a millimeter-wave radar) mounted on the current vehicle and configured to detect objects around the current vehicle.
According to an embodiment of the present disclosure, whether another target object has collided with the current vehicle is determined by (i) and/or (ii) as follows:
(i) detecting that the acceleration of the vehicle body exceeds a threshold;
(ii) detecting that the distance between another object and the current vehicle is less than a threshold.
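As an illustration, criteria (i) and (ii) can be sketched as a small decision function. This is a minimal sketch, not the patent's implementation; the default thresholds (0.1 m/s² and 3 cm) are the example values from the detailed description, and the function and parameter names are assumptions.

```python
def collision_detected(body_accel_ms2, nearest_object_distance_m,
                       accel_threshold=0.1, distance_threshold=0.03,
                       require_both=True):
    """Decide whether a parked vehicle was struck by another object.

    Criterion (i): body acceleration (e.g. from a gyro/IMU) exceeds a
    threshold. Criterion (ii): another object is closer than a threshold
    distance. The detailed description combines both simultaneously to
    rule out non-collision body movement (require_both=True); the
    disclosure also allows either criterion alone (require_both=False).
    """
    hit = body_accel_ms2 > accel_threshold
    close = nearest_object_distance_m < distance_threshold
    return (hit and close) if require_both else (hit or close)
```

For example, a 0.5 m/s² jolt with an object 1 cm away would be judged a collision, while a 0.05 m/s² sway with nothing nearby would not.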
According to an embodiment of the present disclosure, the drone is detachably connected to the current vehicle and detaches from the current vehicle when launched.
According to an embodiment of the present disclosure, the drone is attached outside the current vehicle, for example on the roof, or inside the current vehicle, from which it may fly out through a window of the current vehicle, e.g., the sunroof.
According to an embodiment of the present disclosure, the target object tracking unit includes a camera device provided on the unmanned aerial vehicle.
According to an embodiment of the present disclosure, tracking the target object comprises matching images of surrounding objects captured by the camera on the drone against the image of the target object, thereby locking onto the target object.
According to an embodiment of the present disclosure, the information about the target object is sent to the mobile device or the current vehicle via wireless communication such as Wi-Fi, Bluetooth, or a mobile network.
According to a second aspect of the present disclosure, a vehicle is provided comprising a system for collecting evidence following a collision event according to the present disclosure.
According to a third aspect of the present disclosure, there is provided a method of collecting evidence following a collision incident, the method comprising:
(1) while the vehicle is stopped, detecting whether another target object collides with the current vehicle, and acquiring an image of that target object in response to detecting such a collision;
(2) in response to another target object colliding with the current vehicle, launching a drone above the current vehicle, tracking the target object based on its image, and acquiring information about the target object;
(3) sending the information about the target object to a mobile device or the current vehicle.
According to an embodiment of the present disclosure, the stopped state of the vehicle includes any one, any two, or all three of the following: the vehicle is parked, the vehicle is locked, and the vehicle speed is zero.
According to an embodiment of the present disclosure, in (1), whether another target object collides with the current vehicle is detected by a sensor (e.g., a camera, an ultrasonic sensor, a lidar, or a millimeter-wave radar) mounted on the current vehicle.
According to an embodiment of the present disclosure, in (2), detecting that another target object has collided with the current vehicle includes determining by (i) and/or (ii) as follows:
(i) detecting that the acceleration of the vehicle body exceeds a threshold;
(ii) detecting that the distance between another object and the current vehicle is less than a threshold.
According to an embodiment of the present disclosure, in (2), the drone is detachably connected to the current vehicle and detaches from the current vehicle when launched.
According to an embodiment of the present disclosure, in (2), the drone is attached outside the current vehicle, for example on the roof, or inside the current vehicle, from which it may fly out through a window of the current vehicle, e.g., the sunroof.
According to the embodiment of the present disclosure, in (2), a camera device is provided on the unmanned aerial vehicle.
According to an embodiment of the present disclosure, in (2), the camera on the drone captures images of surrounding objects, matches them against the image of the target object received from the current vehicle, and locks onto the target object.
According to the embodiment of the present disclosure, in (2), after tracking the target object and acquiring the information of the target object, the unmanned aerial vehicle returns to the current vehicle.
According to an embodiment of the present disclosure, in (3), the information about the target object is sent to the mobile device or the current vehicle via wireless communication such as Wi-Fi, Bluetooth, or a mobile network.
According to a fourth aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of any of the above.
With the system and method for collecting evidence after a collision accident according to the present disclosure, the drone is launched above the current vehicle after a collision, promptly tracks and photographs the target object, and thereby automatically collects evidence against the collider in a hit-and-run accident.
Drawings
The present disclosure may be better understood from the following description of specific embodiments thereof taken in conjunction with the accompanying drawings, in which like reference numerals identify identical or functionally similar elements.
Fig. 1 shows a schematic view of a system according to one embodiment of the present disclosure.
Fig. 2 shows a schematic view of one application scenario of a system according to one embodiment of the present disclosure.
Fig. 3 shows a block flow diagram of a method according to an embodiment of the present disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure are described with reference to the drawings. The detailed description and drawings illustrate the principles of the disclosure, which is not limited to the preferred embodiments described but is defined by the claims. The following description refers to the accompanying drawings, in which, unless otherwise indicated, like reference numerals refer to the same or similar elements in different drawings. The aspects described in the following exemplary embodiments do not represent all aspects of the present disclosure; rather, they are merely examples of systems and methods according to the various aspects of the present disclosure recited in the appended claims.
The system for collecting evidence after a collision accident according to the present disclosure may be mounted on or applied to a vehicle. The vehicle may be an internal combustion engine vehicle using an internal combustion engine as a drive source, an electric vehicle or a fuel cell vehicle using an electric motor as a drive source, a hybrid vehicle using both of the above as drive sources, or a vehicle having another drive source. The system for collecting evidence after a collision accident according to the present disclosure is preferably applied to an autonomous vehicle. The autonomous vehicles referred to herein include fully autonomous vehicles, as well as vehicles having autonomous driving modes.
Fig. 1 shows a schematic view of a system 100 for collecting evidence after a collision accident according to an embodiment of the present disclosure. As shown in FIG. 1, the post-crash evidence collection system 100 includes a crash detection unit 110, a target object tracking unit 120, and a communication unit 130.
In fig. 1, the collision detection unit 110 is configured to detect whether another target object collides with the current vehicle while the vehicle is stopped, and to acquire an image of that target object in response to detecting such a collision. A hit-and-run after a collision typically happens while the vehicle owner is absent and the vehicle is stopped; the stopped state includes any one, any two, or all three of the following: the vehicle is parked, the vehicle is locked, and the vehicle speed is zero. The collision detection unit may include a sensor mounted on the current vehicle, such as a camera, an ultrasonic sensor, a lidar, or a millimeter-wave radar, for detecting objects around the current vehicle. Whether another target object has collided with the current vehicle is determined by (i) and/or (ii) as follows: (i) detecting that the acceleration of the vehicle body exceeds a threshold (e.g., 0.1 m/s²); (ii) detecting that the distance between another object and the current vehicle is less than a threshold (e.g., 3 cm). If the acceleration of the vehicle body exceeds its threshold and, at the same time, the distance between the other object and the current vehicle is below its threshold, the movement of the vehicle body is judged to be the result of a collision with that object. The acceleration of the vehicle body may be detected by a gyro sensor mounted on the current vehicle; the distance between another object and the current vehicle may be detected by a sensor (e.g., a camera, an ultrasonic sensor, a lidar, or a millimeter-wave radar). One or more such sensors may be provided and mounted at various locations around the current vehicle.
In fig. 1, the target object tracking unit 120 is configured to, in response to another target object colliding with the current vehicle, launch a drone above the current vehicle, track the target object based on its image, and acquire information about the target object. The target object tracking unit 120 may therefore comprise the drone. The drone is detachably connected to the current vehicle: it detaches from the vehicle when launched and can be charged by the vehicle while docked. According to an embodiment of the present disclosure, the drone is attached outside the current vehicle, for example on the roof, or inside the current vehicle, from which it may fly out through a window, e.g., the sunroof. When the vehicle owner is absent and the current vehicle suffers a collision followed by an escape, the drone is released, for example by disengaging the docking element on the current vehicle. Preferably, the collision detection unit 110 sends information or instructions to the drone by wireless communication: the current vehicle's position, the relative position of the target, and a takeoff command (e.g., ascend, descend, or a 360-degree horizontal sweep). The release of the drone may be finally confirmed by the user, who may send the confirmation through a mobile device connected to the current vehicle. The user may be the owner, the driver, or a manager of the vehicle.
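The hand-off described above (vehicle position, relative position of the target, takeoff command) might be serialized as a small message like the following. The field names, the JSON encoding, and the command vocabulary are illustrative assumptions; the patent only lists the kinds of information sent.

```python
import json

# "orbit" stands in for the 360-degree horizontal sweep mentioned in the text.
VALID_COMMANDS = ("ascend", "descend", "orbit")

def build_launch_message(vehicle_lat, vehicle_lon,
                         target_bearing_deg, target_distance_m,
                         command="ascend"):
    """Assemble the launch instruction the vehicle sends to the drone."""
    if command not in VALID_COMMANDS:
        raise ValueError("unknown takeoff command: %r" % (command,))
    return json.dumps({
        "vehicle_position": {"lat": vehicle_lat, "lon": vehicle_lon},
        "target_relative": {"bearing_deg": target_bearing_deg,
                            "distance_m": target_distance_m},
        "command": command,
    })
```

The drone side would parse the same JSON to recover the vehicle position and the commanded maneuver.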
According to an embodiment of the present disclosure, the target object tracking unit 120 includes a camera provided on the drone. Tracking the target object may thus include matching images of surrounding objects captured by the drone's camera against the image of the target object captured by the collision detection unit 110, thereby locking onto the target object. Specifically, at the moment of the collision, a sensor of the collision detection unit 110 (e.g., a camera, an ultrasonic sensor, a lidar, or a millimeter-wave radar) detects the target object closest to the current vehicle and captures an image of it. The current vehicle sends this image to the drone by wireless communication; the camera on the drone then captures images of surrounding objects and matches them against the image received from the vehicle in order to lock onto the target object. The wireless communication between the current vehicle and the drone may include Wi-Fi, Bluetooth, or a mobile network.
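The locking step can be illustrated with a deliberately simple matcher: each candidate seen by the drone's camera is scored against the reference image from the vehicle, and the best match above a similarity threshold is locked. This toy uses coarse intensity histograms over flat pixel lists purely for illustration; a real system would use robust feature matching or re-identification, and every name and threshold here is an assumption.

```python
def histogram(image, bins=4, max_val=256):
    """Coarse intensity histogram of an image given as a flat list of pixels."""
    counts = [0] * bins
    step = max_val // bins
    for px in image:
        counts[min(px // step, bins - 1)] += 1
    return [c / len(image) for c in counts]

def match_score(img_a, img_b):
    """Histogram intersection in [0, 1]; higher means more similar."""
    return sum(min(a, b) for a, b in zip(histogram(img_a), histogram(img_b)))

def lock_target(reference_image, candidate_images, threshold=0.7):
    """Index of the candidate best matching the reference image from the
    vehicle, or None if no candidate clears the similarity threshold."""
    scores = [match_score(reference_image, c) for c in candidate_images]
    best = max(range(len(scores)), key=scores.__getitem__)
    return best if scores[best] >= threshold else None
```

The threshold guards against locking onto an unrelated object when the collider has already left the camera's field of view.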
According to an embodiment of the present disclosure, the drone tracks and photographs the target object to collect evidence. The drone is automatically controlled to fly above the target object and to track it continuously in real time. More preferably, the drone flies to a position from which key information (e.g., the license plate or a face) can be photographed for evidence. Images and/or video of the target object are captured by the camera on the drone. Optionally, if the lighting conditions fall below the detection threshold of the drone's optical sensor, a light on the drone is turned on.
In fig. 1, the communication unit 130 is configured to send the information about the target object to a mobile device or to the current vehicle. The drone sends the captured images and/or videos to the user, for example by wireless communication to the user's connected mobile device or to the current vehicle. The wireless communication may include Wi-Fi, Bluetooth, or a mobile network. The images/videos may be browsed or played on the mobile device; alternatively, they may be viewed or played on a display device of the current vehicle. For example, when the current vehicle is first started after the accident, a notification is shown to the user on the vehicle to alert the user that the vehicle has been involved in a collision.
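The delivery choice in this paragraph (send to the mobile device when reachable, otherwise keep the report on the vehicle for the first start-up after the accident) can be sketched as below. The report fields, the JSON payload, and the callback-style transports are assumptions for illustration.

```python
import json

def build_evidence_report(media_files, target_description, collision_time):
    """Bundle the drone's captured images/videos into one report."""
    return {
        "collision_time": collision_time,
        "target": target_description,
        "media": list(media_files),
    }

def deliver(report, mobile_reachable, send_to_mobile, store_on_vehicle):
    """Prefer the user's mobile device; fall back to the vehicle so the
    report can be shown on the display at the next start-up."""
    payload = json.dumps(report)
    if mobile_reachable:
        send_to_mobile(payload)
        return "mobile"
    store_on_vehicle(payload)
    return "vehicle"
```

The transports are injected as callables so the same logic works over Wi-Fi, Bluetooth, or a mobile network link.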
According to an embodiment of the present disclosure, the drone returns to the current vehicle after it finishes photographing and collecting evidence. For example, after tracking for a certain time (e.g., 10 seconds) or out to a certain distance from the current vehicle's location (e.g., 30 meters), the drone plans a path back to the current vehicle and reattaches to the docking element on the vehicle, where it is charged for the next launch.
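The return condition above (a time limit or a distance limit, whichever is hit first) reduces to a one-line predicate. The ~10 s and ~30 m defaults are the example values from the text; the names are assumptions.

```python
def should_return(elapsed_s, distance_from_vehicle_m,
                  max_track_s=10.0, max_distance_m=30.0):
    """True once tracking should stop and the drone should plan a path
    back to the docking element: either the tracking-time limit or the
    distance-from-vehicle limit has been reached."""
    return elapsed_s >= max_track_s or distance_from_vehicle_m >= max_distance_m
```

A flight controller would poll this predicate each control cycle and switch to the return-to-dock behavior when it first becomes true.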
Referring to fig. 2, an example application of a system and method for collecting evidence after a collision accident according to an embodiment of the present disclosure is shown. In fig. 2, the current vehicle 1 is stopped and the drone 2 is in or on the current vehicle 1. After another traffic participant 3 collides with the current vehicle, the collision detection unit 110 determines that a collision has occurred by (i) and/or (ii) as follows: (i) detecting that the acceleration of the vehicle body exceeds a threshold; (ii) detecting that the distance between the other object and the current vehicle is less than a threshold. The target object tracking unit 120 then launches the drone 2 above the current vehicle 1, tracks the other traffic participant 3 based on its image, and acquires information about the other traffic participant 3.
The system 100 for collecting evidence after a collision accident described above may be installed on a vehicle. Accordingly, the present disclosure also relates to a vehicle including the system 100 described above. Those skilled in the art will appreciate, however, that the system 100 may also be installed on or applied to a mobile device, for example in the form of an application program or application software (app).
A method of collecting evidence after a collision accident according to an embodiment of the present disclosure will now be described with reference to the accompanying drawings. Fig. 3 is a flowchart illustrating a method S100 of collecting evidence after a collision accident according to an embodiment of the present disclosure. The method S100 is performed by the system 100 described above.
As shown in fig. 3, in step S110, while the vehicle is stopped, it is detected whether another target object collides with the current vehicle (S110a), and, in response to detecting such a collision, an image of the target object is acquired (S110b). A hit-and-run after a collision typically happens while the vehicle owner is absent and the vehicle is stopped; the stopped state includes any one, any two, or all three of the following: the vehicle is parked, the vehicle is locked, and the vehicle speed is zero. A sensor, such as a camera, an ultrasonic sensor, a lidar, or a millimeter-wave radar, may be mounted on the current vehicle to detect objects around it. Whether another target object collides with the current vehicle is determined by (i) and/or (ii) as follows: (i) detecting that the acceleration of the vehicle body exceeds a threshold (e.g., 0.1 m/s²); (ii) detecting that the distance between another object and the current vehicle is less than a threshold (e.g., 3 cm). If the acceleration of the vehicle body exceeds its threshold and, at the same time, the distance between the other object and the current vehicle is below its threshold, the movement of the vehicle body is judged to be the result of a collision with that object. The acceleration of the vehicle body may be detected by a gyro sensor mounted on the current vehicle; the distance between another object and the current vehicle may be detected by a sensor (e.g., a camera, an ultrasonic sensor, a lidar, or a millimeter-wave radar). One or more such sensors may be provided and mounted at various locations around the current vehicle.
In step S120, in response to another target object colliding with the current vehicle, a drone is launched above the current vehicle, the target object is tracked based on its image, and information about the target object is acquired. The drone is detachably connected to the current vehicle: it detaches from the vehicle when launched and can be charged by the vehicle while docked. According to an embodiment of the present disclosure, the drone is attached outside the current vehicle, for example on the roof, or inside the current vehicle, from which it may fly out through a window, e.g., the sunroof. When the vehicle owner is absent and the current vehicle suffers a collision followed by an escape, the drone is released, for example by disengaging the docking element on the current vehicle. Preferably, the current vehicle may send information or instructions to the drone via wireless communication: the current vehicle's position, the relative position of the target, and a takeoff command (e.g., ascend, descend, or a 360-degree horizontal sweep). The release of the drone may be finally confirmed by the user, who may send the confirmation through a mobile device connected to the current vehicle. The user may be the owner, the driver, or a manager of the vehicle.
According to an embodiment of the present disclosure, the information about the target object is acquired by a camera on the drone. Tracking the target object may thus include matching images of surrounding objects captured by the drone's camera against the image of the target object captured in S110b, thereby locking onto the target object. Specifically, in S110b a sensor (e.g., a camera, an ultrasonic sensor, a lidar, or a millimeter-wave radar) detects the target object closest to the current vehicle at the moment of the collision and captures an image of it. The current vehicle sends this image to the drone by wireless communication; the camera on the drone then captures images of surrounding objects and matches them against the image received from the vehicle in order to lock onto the target object. The wireless communication between the current vehicle and the drone may include Wi-Fi, Bluetooth, or a mobile network.
According to an embodiment of the present disclosure, the drone tracks and photographs the target object to collect evidence. The drone is automatically controlled to fly above the target object and to track it continuously in real time. More preferably, the drone flies to a position from which key information (e.g., the license plate, the appearance of a vehicle, or the appearance of a person) can be photographed for evidence. Images and/or video of the target object are captured by the camera on the drone. Optionally, if the lighting conditions fall below the detection threshold of the drone's optical sensor, a light on the drone is turned on.
In step S130, the information about the target object is sent to a mobile device or to the current vehicle. The drone sends the captured images and/or videos to the user, for example by wireless communication to the user's connected mobile device or to the current vehicle. The wireless communication may include Wi-Fi, Bluetooth, or a mobile network. The images/videos may be browsed or played on the mobile device; alternatively, they may be viewed or played on a display device of the current vehicle. For example, when the current vehicle is first started after the accident, a notification is shown to the user on the vehicle to alert the user that the vehicle has been involved in a collision.
According to an embodiment of the present disclosure, the drone preferably returns to the current vehicle after it finishes photographing and collecting evidence. For example, after tracking for a certain time (e.g., 10 seconds) or out to a certain distance from the current vehicle's location (e.g., 30 meters), the drone plans a path back to the current vehicle and reattaches to the docking element on the vehicle, where it is charged for the next launch.
The method of the present disclosure may be implemented as a computer program that is triggered when it is detected in step S110 that another target object has collided with the current vehicle (S110a), matches images of surrounding objects captured by the camera on the drone against the image of the target object from the current vehicle (S110b), acquires information about the target object (S120), and sends that information to the user's mobile device or to the current vehicle (S130). Accordingly, the present disclosure may also include a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method described in the embodiments of the present disclosure.
It will be understood by those skilled in the art that the division and order of the steps in the method of the present disclosure for collecting evidence after a collision accident are merely illustrative and not restrictive, and that various omissions, additions, substitutions, modifications and changes may be made without departing from the spirit and scope of the present disclosure as set forth in the appended claims and their equivalents.
The technical features of the above embodiments may be combined arbitrarily. For the sake of brevity, not all possible combinations of these technical features are described; nevertheless, any such combination should be considered within the scope of this specification as long as it contains no contradiction.
While the present disclosure has been described in connection with embodiments, it is to be understood by those skilled in the art that the foregoing description and drawings are merely illustrative and not restrictive of the disclosed embodiments. Various modifications and variations are possible without departing from the spirit of the disclosure.

Claims (15)

1. A system for collecting evidence following a collision event, the system comprising:
a collision detection unit configured to detect whether another target object collides with a current vehicle in a state where the vehicle is stopped, and to acquire an image of the other target object in response to detecting that the other target object has collided with the current vehicle;
a target object tracking unit configured to, in response to the other target object colliding with the current vehicle, launch a drone above the current vehicle, track the target object based on the image of the other target object, and acquire information of the target object; and
a communication unit configured to transmit information of the target object to a mobile device or the current vehicle.
2. The system according to claim 1, characterized in that the collision detection unit comprises a sensor (e.g. a camera, an ultrasonic sensor, a lidar or a millimeter wave radar) mounted on the current vehicle, configured for detecting objects around the current vehicle.
3. The system of claim 1 or 2, wherein whether another target object has collided with the current vehicle is determined by (i) and/or (ii) as follows:
(i) detecting that the acceleration of the vehicle body exceeds a threshold value;
(ii) detecting that the distance between another object and the current vehicle is less than a threshold value.
4. The system of claim 1 or 2, wherein the drone is detachably connected to the current vehicle, and is detached from the current vehicle when launched.
5. The system of claim 1 or 2, wherein the target object tracking unit comprises a camera disposed on the drone.
6. The system of claim 5, wherein tracking the target object comprises matching an image of surrounding objects captured by the camera on the drone with the image of the target object, thereby locking onto the target object.
7. A vehicle, characterized in that the vehicle comprises a system according to any of claims 1-6.
8. A method of collecting evidence following a collision event, the method comprising:
(1) detecting, in a state where the vehicle is stopped, whether another target object collides with the current vehicle, and acquiring an image of the other target object in response to detecting that the other target object has collided with the current vehicle;
(2) in response to the other target object colliding with the current vehicle, launching a drone above the current vehicle, tracking the target object based on the image of the other target object, and acquiring information of the target object; and
(3) transmitting the information of the target object to a mobile device or the current vehicle.
9. The method according to claim 8, wherein in (1), whether another target object collides with the current vehicle is detected by a sensor (e.g., a camera device, an ultrasonic sensor, a lidar or a millimeter-wave radar) mounted on the current vehicle.
10. The method according to claim 8 or 9, wherein in (2), whether another target object has collided with the current vehicle is determined by (i) and/or (ii) as follows:
(i) detecting that the acceleration of the vehicle body exceeds a threshold value;
(ii) detecting that the distance between another object and the current vehicle is less than a threshold value.
11. The method according to claim 8 or 9, wherein in (2), the drone is detachably connected to the current vehicle and is detached from the current vehicle when launched.
12. The method according to claim 8 or 9, characterized in that in (2) a camera device is provided on the drone.
13. The method of claim 12, wherein in (2), the camera device on the drone captures an image of surrounding objects, matches it against the image of the other target object from the current vehicle, and locks onto the target object.
14. The method according to claim 8 or 9, wherein in (2), after tracking the target object and acquiring information of the target object, the drone returns to the current vehicle.
15. A computer-readable storage medium, characterized in that a computer program is stored thereon, which computer program, when being executed by a processor, carries out the method according to any one of the claims 8-14.
CN202010186534.3A 2020-03-17 2020-03-17 System and method for tracking collider and collecting evidence by unmanned aerial vehicle after collision accident Withdrawn CN113406640A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010186534.3A CN113406640A (en) 2020-03-17 2020-03-17 System and method for tracking collider and collecting evidence by unmanned aerial vehicle after collision accident


Publications (1)

Publication Number Publication Date
CN113406640A true CN113406640A (en) 2021-09-17

Family

ID=77677245

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010186534.3A Withdrawn CN113406640A (en) 2020-03-17 2020-03-17 System and method for tracking collider and collecting evidence by unmanned aerial vehicle after collision accident

Country Status (1)

Country Link
CN (1) CN113406640A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20210917
