CN111599216A - Auxiliary driving method, device and system based on image recognition and UWB (ultra-wideband) tag - Google Patents
- Publication number: CN111599216A (application CN202010420471.3A)
- Authority
- CN
- China
- Prior art keywords
- vehicle
- reading
- image recognition
- distance
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q9/00—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
- B60Q9/008—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for anti-collision purposes
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/161—Decentralised systems, e.g. inter-vehicle communication
Abstract
The invention relates to the technical field of vehicle assistance and discloses a driving assistance method, device, and system based on image recognition and UWB (ultra-wideband) tags. The method comprises the following steps: configuring two cameras and measuring object distance using binocular stereo vision; writing vehicle information into a UWB tag; reading the vehicle information in real time and performing comparison processing; granting attention levels; judging whether values fall within a normal range; and sending a warning. The invention can directly measure all obstacles, quickly determine the information of surrounding vehicles, accurately and efficiently detect abnormal conditions, and provide timely warning information for drivers to reference; it has high practical value and wide application prospects.
Description
Technical Field
The invention relates to the technical field of vehicle assistance, in particular to a driving assistance method, device and system based on image recognition and UWB (ultra-wideband) tags.
Background
With the growing number of automobiles in use worldwide, traffic accidents have risen year after year, and traffic safety has become a serious public concern in modern society. Statistically, among all traffic accidents, collision accidents (including vehicle-to-vehicle collisions and collisions with fixed objects) are the predominant form. Automobile collisions are mostly caused by factors such as excessive speed, insufficient following distance, and untimely braking.
To further improve road traffic safety and help drivers reduce erroneous operations, intelligent automobile safety technologies represented by Advanced Driver Assistance Systems (ADAS) have attracted attention in recent years. An automobile emergency collision-avoidance system assists the driver by actively intervening through actuators to adjust the vehicle's trajectory and thereby avoid collisions. Such systems can save drivers' lives at critical moments and have good market prospects.
In the prior art, information about surrounding vehicles is mostly grasped as distance data produced by radar or laser ranging, which cannot accurately and quickly acquire the surrounding vehicles' real-time motion information or important parameters such as engine power and steering, and therefore cannot provide effective reminders.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides a driving assistance method based on image recognition and UWB (ultra-wideband) tags to solve the problems described in the background art.
The technical scheme adopted by the invention for solving the technical problems is as follows:
the invention provides a driving assistance method based on image recognition and a UWB (ultra-wideband) tag, comprising the following steps:
two cameras are arranged above the vehicle in parallel, and the distance between the measured object and the vehicle is obtained through an optical ranging principle;
writing static vehicle information and dynamic vehicle information into a UWB tag in real time through a UWB tag reader-writer to wait for reading, wherein the static vehicle information comprises a license plate number, a vehicle model and a vehicle overall dimension, and the dynamic vehicle information comprises vehicle running direction, real-time speed, engine rotating speed, acceleration, steering angle, load weight and position information;
reading, in real time, the vehicle information stored in the UWB tags of all vehicles within a preset range, comparing it with the host vehicle's information, and assigning each compared vehicle an attention level that is either stored in a memory or used to update that vehicle's existing attention-level entry in the memory;
and for the vehicle with the highest attention degree, if the data in the dynamic vehicle information exceeds a preset normal value range, a warning signal is sent to the driver of the vehicle.
Preferably, the step of obtaining the distance between the measured object and the vehicle by the optical ranging principle comprises the following steps:
configuring the internal and external parameters of the two cameras to be the same;
shooting pictures in real time;
determining the three-dimensional coordinates of objects in the actual scene through a binocular stereo vision technology;
and obtaining a distance parameter according to the vehicle outline size and the camera installation position information.
Preferably, the dynamic vehicle information is acquired by:
acquiring parameters including vehicle speed per hour, engine speed, steering angle and acceleration through a CAN bus of the vehicle;
determining vehicle position information and a driving direction through the GPS/INS in cooperation with a gyroscope;
the weight of the load of the vehicle is collected through the arranged pressure sensor.
Preferably, the vehicle information stored in the vehicle UWB tag within the preset range is compared with the vehicle information of the vehicle by the following method:
reading real-time position information of the vehicle and the comparison vehicle, calculating to obtain a linear distance between the two vehicles, and performing orthogonal decomposition on the linear distance along the driving direction of the vehicle to obtain a vertical distance along the driving direction of the vehicle and a transverse distance perpendicular to the driving direction of the vehicle;
reading the running direction data of the vehicle and the comparison vehicle, and calculating to obtain the running direction included angle of the two vehicles;
and reading the compared vehicle's engine output torque and engine speed, and calculating the variations in output torque and engine speed.
Preferably, the granting of the attention level comprises the steps of:
reading the compared vehicle's vertical distance S_v, transverse distance S_h, and driving-direction included angle θ relative to the host vehicle;
reading the compared vehicle's real-time speed V and acceleration a;
reading the engine-speed variation ΔR;
substituting these into the formula P = λ1·S_v + λ2·S_h + λ3·θ + λ4·V + λ5·a + λ6·ΔR to obtain an attention-level parameter P for the compared vehicle, where the weight coefficients λ1, λ2, λ3, λ4, λ5, and λ6 are read from prestored values corresponding to the driving-environment mode selected by the user;
after the comparison of all vehicles within the preset range is finished, ranking the attention-level parameters and awarding attention levels from high to low;
and simultaneously sending warning signals to the driver of the host vehicle and to the vehicle with the highest attention level.
Preferably, when the straight-line distance between the vehicle and the comparison vehicle is lower than a preset threshold, the high-precision distance measuring service is triggered, and the method comprises the following steps:
establishing a spatial coordinate system with the center of the host vehicle's body as the origin of coordinates, the direction parallel to the ground toward the front of the vehicle as the positive Y axis, the direction perpendicular to the ground pointing downward as the positive Z axis, and the direction parallel to the ground toward the driver's right side as the positive X axis;
reading the host vehicle's outline dimension data to obtain the coordinates of the outermost corner points of each face of the host vehicle's contour solid;
reading the compared vehicle's outline dimension data and combining it with distance data measured by lasers arranged around the vehicle's periphery to obtain the coordinates of the outermost corner points of each face of the compared vehicle's contour solid;
and calculating the point-to-point distances between the corner-point coordinates of the two contour solids, and feeding back the position corresponding to the minimum-distance pair to the drivers of both vehicles in real time.
Preferably, the high precision distance measurement service further comprises a modeling step of:
and reading the coordinate data of the vehicle in the space coordinate system and the coordinate data of the vehicle with the highest attention degree, and dynamically displaying the outline models of the two vehicles in proportion on a vehicle internal display device.
The invention also provides a driving assistance device based on image recognition and UWB tags, comprising:
the image ranging module is used for obtaining the distance between the measured object and the vehicle through an optical ranging principle;
the UWB tag module is used for storing static vehicle information and dynamic vehicle information, wherein the static vehicle information comprises a license plate number, a vehicle model and a vehicle overall dimension, and the dynamic vehicle information comprises a vehicle running direction, a real-time speed, an engine rotating speed, an acceleration, a steering angle, a load weight and position information;
the data reading and writing module is used for writing the static vehicle information and the dynamic vehicle information into the UWB tag in real time to wait for reading;
the data processing module is used for reading, in real time, the vehicle information stored in the UWB tags of all vehicles within a preset range, comparing it with the host vehicle's information, and assigning each compared vehicle an attention level that is either stored in the memory or used to update that vehicle's existing attention-level entry in the memory;
and the reminding display module is used for sending a warning signal to the driver of the vehicle when the data in the dynamic vehicle information corresponding to the vehicle with the highest attention degree exceeds a preset normal value range.
The invention also provides an auxiliary driving system based on image recognition and UWB tags, comprising:
one or more processors;
storage means for storing one or more programs;
a driving assistance device based on image recognition and UWB tags;
the one or more programs, when executed by the one or more processors, cause an image recognition and UWB tag based assisted driving apparatus to implement a driving assistance method as described previously in cooperation with the one or more processors.
The invention also provides a vehicle control system, which comprises the driving assistance system based on the image recognition and the UWB tag.
Compared with the prior art, the invention has the following beneficial effects:
the invention adopts double cameras, utilizes binocular stereoscopic vision technology to determine the position in the barrier, and has lower cost compared with the schemes such as laser radar and the like; secondly, the identification rate is not limited, because in principle, identification is not needed first and then measurement is not needed, but all obstacles are directly measured; thirdly, the precision is higher than that of a monocular, and the distance is calculated by directly utilizing parallax; fourthly, the sample database does not need to be maintained, because the concept of the sample does not exist for the binocular;
By collecting important static information and real-time dynamic information during vehicle operation, and by using the vehicle's own control system and data collectors, the information is summarized quickly and accurately and made available for reading. This avoids the data delay and errors that arise in the prior art when data must be collected by external measurement. It also provides, in real time, vehicle data that cannot be measured externally but that drivers of surrounding vehicles would find important and useful to know, so that further processing yields accurate and effective driving-assistance information, based on image recognition and UWB tags, for drivers to reference.
according to the invention, the surrounding vehicle information in the preset range is compared with the vehicle information, different attention levels are granted according to the comparison result, the important vehicle information is mainly monitored for the highest level, and once an abnormal condition occurs, the important vehicle information is timely reported, so that the surrounding vehicles are dynamically monitored and prompted according to the actual driving environment.
Further salient features and significant advances with respect to the present invention over the prior art are described in further detail in the examples section.
Drawings
Other features, objects and advantages of the invention will become more apparent upon reading of the detailed description of non-limiting embodiments with reference to the following drawings:
FIG. 1 is a flow chart of an assisted driving method based on image recognition and UWB tag according to the invention;
FIG. 2 is a schematic structural diagram of a driving assistance device based on image recognition and UWB tags according to the invention;
FIG. 3 is a schematic diagram of a driving assistance system based on image recognition and UWB tag according to the invention;
fig. 4 is a schematic structural diagram of a vehicle control system according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that certain names are used throughout the specification and claims to refer to particular components. It will be understood that one of ordinary skill in the art may refer to the same component by different names. The present specification and claims do not intend to distinguish between components that differ in name but not function. As used in the specification and claims of this application, the terms "comprises" and "comprising" are open-ended and should be interpreted as "including, but not limited to." The embodiments described in the detailed description are preferred embodiments of the present invention and are not intended to limit the scope of the present invention.
Moreover, those skilled in the art will appreciate that aspects of the present invention may be embodied as a system, method, or computer program product. Accordingly, various aspects of the present invention may be embodied in a combination of hardware and software, which may be referred to herein generally as a "circuit," "module," or "system." Furthermore, in some embodiments, various aspects of the invention may also be embodied as a computer program product in one or more microcontroller-readable media having microcontroller-readable program code embodied therein.
As shown in fig. 1 to 4, the driving assistance method based on image recognition and UWB tag of the present embodiment includes the following steps:
the method comprises the following steps that double cameras are arranged above a vehicle in parallel, and the distance between a measured object and the vehicle is obtained through an optical ranging principle;
writing static vehicle information and dynamic vehicle information into a UWB tag in real time through a UWB tag reader-writer to wait for reading, wherein the static vehicle information comprises a license plate number, a vehicle model and a vehicle overall dimension, and the dynamic vehicle information comprises vehicle running direction, real-time speed, engine rotating speed, acceleration, steering angle, load weight and position information;
reading, in real time, the vehicle information stored in the UWB tags of all vehicles within a preset range, comparing it with the host vehicle's information, and assigning each compared vehicle an attention level that is either stored in a memory or used to update that vehicle's existing attention-level entry in the memory;
and for the vehicle with the highest attention degree, if the data in the dynamic vehicle information exceeds a preset normal value range, a warning signal is sent to the driver of the vehicle.
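As a minimal illustration, the four steps above could be tied together in a single monitoring cycle. The function, field names, and the toy scoring rule below are assumptions for illustration only, not the patent's actual implementation:

```python
def assist_cycle(nearby_infos, normal_speed_range=(0.0, 120.0)):
    """Toy sketch: rank nearby vehicles read from UWB tags by a simple
    attention score, then warn if the top-ranked vehicle's speed leaves
    the preset normal range. Field names are illustrative assumptions."""
    def score(info):
        # closer and faster vehicles receive more attention (toy weighting)
        return 100.0 / max(info["distance_m"], 0.1) + info["speed_kmh"]

    ranked = sorted(nearby_infos, key=score, reverse=True)
    focus = ranked[0]                       # vehicle with the highest attention level
    lo, hi = normal_speed_range
    warning = not (lo <= focus["speed_kmh"] <= hi)
    return focus["plate"], warning
```

In practice the score would be the weighted formula described later in this document, and the warning test would cover every field of the dynamic vehicle information rather than speed alone.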
The specific steps of obtaining the distance between the measured object and the vehicle by the optical ranging principle in the embodiment are as follows:
configuring the internal and external parameters of the two cameras to be the same;
shooting pictures in real time;
determining the three-dimensional coordinates of objects in the actual scene through a binocular stereo vision technology;
and obtaining a distance parameter according to the vehicle outline size and the camera installation position information.
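The binocular ranging step above can be sketched with the standard rectified-stereo relation Z = f·B/d. The function below is an illustrative assumption, not code from the patent:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """For a rectified stereo pair with identical intrinsic and extrinsic
    parameters, the depth of a matched point is Z = f * B / d, where f is
    the focal length in pixels, B the baseline between the two cameras in
    meters, and d the disparity in pixels."""
    if disparity_px <= 0:
        raise ValueError("non-positive disparity: no valid match")
    return focal_px * baseline_m / disparity_px
```

For example, with a 0.5 m baseline, a 1000 px focal length, and a 50 px disparity, the matched object lies 10 m away.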
The dynamic vehicle information in the present embodiment is acquired by the following method:
acquiring parameters including vehicle speed per hour, engine speed, steering angle and acceleration through a CAN bus of the vehicle;
determining vehicle position information and a driving direction through the GPS/INS in cooperation with a gyroscope;
the weight of the load of the vehicle is collected through the arranged pressure sensor.
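The dynamic vehicle information gathered from the CAN bus, GPS/INS, and pressure sensor could be held in a record like the following before being written to the UWB tag. The field names and example values are illustrative assumptions:

```python
from dataclasses import dataclass, asdict

@dataclass
class DynamicVehicleInfo:
    heading_deg: float         # driving direction from GPS/INS plus gyroscope
    speed_kmh: float           # real-time speed from the CAN bus
    engine_rpm: float          # engine speed from the CAN bus
    acceleration_ms2: float    # acceleration from the CAN bus
    steering_angle_deg: float  # steering angle from the CAN bus
    load_kg: float             # load weight from the pressure sensor
    position: tuple            # (latitude, longitude) from GPS/INS

info = DynamicVehicleInfo(90.0, 60.0, 2200.0, 0.5, -3.0, 450.0, (31.23, 121.47))
payload = asdict(info)  # a plain dict, ready to serialize into the tag
```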
The host vehicle's real-time parameter information is obtained directly from the vehicle's existing CAN bus, the GPS/INS working with a gyroscope, and the pressure sensor; no new equipment needs to be added, which makes the invention practical and economical to apply.
In the embodiment, vehicle information stored in a vehicle UWB tag within a preset range is compared with vehicle information of a vehicle by the following method:
reading real-time position information of the vehicle and the comparison vehicle, calculating to obtain a linear distance between the two vehicles, and performing orthogonal decomposition on the linear distance along the driving direction of the vehicle to obtain a vertical distance along the driving direction of the vehicle and a transverse distance perpendicular to the driving direction of the vehicle;
reading the running direction data of the vehicle and the comparison vehicle, and calculating to obtain the running direction included angle of the two vehicles;
and reading the compared vehicle's engine output torque and engine speed, and calculating the variations in output torque and engine speed. Comparing the host vehicle with the compared vehicle in this way provides detailed reference data for the subsequent attention levels.
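The orthogonal decomposition of the straight-line separation into components along and across the host vehicle's direction of travel can be sketched as follows. This is a geometric illustration assuming a flat-plane coordinate frame with headings measured counterclockwise from the +x axis:

```python
import math

def decompose_distance(own_pos, other_pos, own_heading_deg):
    """Split the straight-line separation between two vehicles into a
    vertical (along-heading) and a transverse (across-heading) component."""
    dx = other_pos[0] - own_pos[0]
    dy = other_pos[1] - own_pos[1]
    h = math.radians(own_heading_deg)
    fx, fy = math.cos(h), math.sin(h)   # unit vector along the heading
    vertical = dx * fx + dy * fy        # projection onto the heading
    transverse = -dx * fy + dy * fx     # projection onto the perpendicular
    return vertical, transverse
```

For a host vehicle heading along +y (90°), a compared vehicle at (3, 4) is 4 units ahead and 3 units to the side.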
The granting of the attention level in this embodiment includes the steps of:
reading the compared vehicle's vertical distance S_v, transverse distance S_h, and driving-direction included angle θ relative to the host vehicle;
reading the compared vehicle's real-time speed V and acceleration a;
reading the engine-speed variation ΔR;
substituting these into the formula P = λ1·S_v + λ2·S_h + λ3·θ + λ4·V + λ5·a + λ6·ΔR to obtain an attention-level parameter P for the compared vehicle, where the weight coefficients λ1, λ2, λ3, λ4, λ5, and λ6 are read from prestored values corresponding to the driving-environment mode selected by the user.
After the comparison of all vehicles within the preset range is finished, the attention-level parameters are ranked and attention levels are awarded from high to low. By assigning weights to the comparison results through the selection of different driving environments, attention levels for the compared vehicles are obtained, so that vehicles on the road are monitored in a differentiated, layered manner.
In the present embodiment, warning signals are issued simultaneously to the driver of the host vehicle and to the vehicle with the highest attention level.
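The weighted attention-score formula P = λ1·S_v + λ2·S_h + λ3·θ + λ4·V + λ5·a + λ6·ΔR and the subsequent ranking can be sketched as below. The mode names and weight values are illustrative assumptions, not values from the patent:

```python
# Hypothetical per-driving-environment weight tables (λ1..λ6)
MODE_WEIGHTS = {
    "highway": (0.5, 0.2, 0.1, 0.8, 1.0, 0.3),
    "urban":   (0.8, 0.6, 0.3, 0.4, 0.7, 0.2),
}

def attention_score(s_v, s_h, theta, v, a, delta_r, mode="urban"):
    """P = λ1·S_v + λ2·S_h + λ3·θ + λ4·V + λ5·a + λ6·ΔR,
    with weights looked up by the user-selected driving-environment mode."""
    l1, l2, l3, l4, l5, l6 = MODE_WEIGHTS[mode]
    return l1 * s_v + l2 * s_h + l3 * theta + l4 * v + l5 * a + l6 * delta_r

def rank_by_attention(scores):
    """Order compared vehicles from highest to lowest attention level."""
    return sorted(scores, key=scores.get, reverse=True)
```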
In this embodiment, when the straight-line distance between the host vehicle and the compared vehicle falls below a preset threshold, a high-precision distance-measurement service is triggered, comprising:
establishing a spatial coordinate system with the center of the host vehicle's body as the origin of coordinates, the direction parallel to the ground toward the front of the vehicle as the positive Y axis, the direction perpendicular to the ground pointing downward as the positive Z axis, and the direction parallel to the ground toward the driver's right side as the positive X axis;
reading the host vehicle's outline dimension data to obtain the coordinates of the outermost corner points of each face of the host vehicle's contour solid;
reading the compared vehicle's outline dimension data and combining it with distance data measured by lasers arranged around the vehicle's periphery to obtain the coordinates of the outermost corner points of each face of the compared vehicle's contour solid;
and calculating the point-to-point distances between the corner-point coordinates of the two contour solids, and feeding back the position corresponding to the minimum-distance pair to the drivers of both vehicles in real time.
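The minimum-distance search over the corner points of the two contour solids is a straightforward nearest-pair scan. The sketch below assumes corner points are given as (x, y, z) tuples in the host vehicle's coordinate system:

```python
import math

def closest_corner_pair(own_corners, other_corners):
    """Return the smallest distance between any corner point of the host
    vehicle's contour solid and any corner point of the compared vehicle's,
    together with the pair of points that realizes it."""
    best_d, best_pair = float("inf"), None
    for p in own_corners:
        for q in other_corners:
            d = math.dist(p, q)  # Euclidean distance (Python 3.8+)
            if d < best_d:
                best_d, best_pair = d, (p, q)
    return best_d, best_pair
```

For real-time use between two nearby vehicles the corner lists are small (a handful of points per contour face), so this brute-force scan is inexpensive.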
In this embodiment, when the distance between vehicles is small, the actual contours of the vehicles are compared through spatial modeling, which yields the most accurate and effective comparison model and thus provides reasonable, effective, and precise reference data for drivers from a practical standpoint.
The high-precision distance measurement service in this embodiment further includes a modeling step:
reading the coordinate data of the host vehicle and of the vehicle with the highest attention level in the spatial coordinate system, and dynamically displaying scale contour models of the two vehicles on a display device inside the vehicle. In this embodiment, the contour models can be shown to the driver in three dimensions, visually presenting the road situation and providing a warning effect, particularly for certain blind spots.
The present embodiment also provides a driving assistance apparatus based on image recognition and UWB tag, including:
the image ranging module is used for obtaining the distance between the measured object and the vehicle through an optical ranging principle;
the UWB tag module is used for storing static vehicle information and dynamic vehicle information, wherein the static vehicle information comprises a license plate number, a vehicle model and a vehicle overall dimension, and the dynamic vehicle information comprises a vehicle running direction, a real-time speed, an engine rotating speed, an acceleration, a steering angle, a load weight and position information;
the data reading and writing module is used for writing the static vehicle information and the dynamic vehicle information into the UWB tag in real time to wait for reading;
the data processing module is used for reading, in real time, the vehicle information stored in the UWB tags of all vehicles within a preset range, comparing it with the host vehicle's information, and assigning each compared vehicle an attention level that is either stored in the memory or used to update that vehicle's existing attention-level entry in the memory;
and the reminding display module is used for sending a warning signal to the driver of the vehicle when the data in the dynamic vehicle information corresponding to the vehicle with the highest attention degree exceeds a preset normal value range.
The present embodiment further provides a driving assistance system based on image recognition and UWB tag, including:
one or more processors;
storage means for storing one or more programs;
a driving assistance device based on image recognition and UWB tags;
when the one or more programs are executed by the one or more processors, the image recognition and UWB tag-based assisted driving apparatus is caused to implement the image recognition and UWB tag-based assisted driving method as described above in cooperation with the one or more processors.
The embodiment also provides a vehicle control system, which comprises the driving assistance system based on the image recognition and the UWB tag.
The auxiliary driving method based on image recognition and UWB tags can quickly determine surrounding vehicle information, accurately and efficiently detect abnormal conditions, provide warning information for drivers in time for reference, and has high practical value and wide application prospect.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned.
Furthermore, it should be understood that although the present description refers to embodiments, not every embodiment may contain only a single embodiment, and such description is for clarity only, and those skilled in the art should integrate the description, and the embodiments may be combined as appropriate to form other embodiments understood by those skilled in the art.
Claims (10)
1. An auxiliary driving method based on image recognition and UWB tags is characterized by comprising the following steps:
the method comprises the following steps that double cameras are arranged above a vehicle in parallel, and the distance between a measured object and the vehicle is obtained through an optical ranging principle;
writing static vehicle information and dynamic vehicle information into a UWB tag in real time through a UWB tag reader-writer to wait for reading, wherein the static vehicle information comprises a license plate number, a vehicle model and a vehicle overall dimension, and the dynamic vehicle information comprises vehicle running direction, real-time speed, engine rotating speed, acceleration, steering angle, load weight and position information;
reading, in real time, the vehicle information stored in the UWB tags of all vehicles within a preset range, comparing it with the host vehicle's information, and assigning each compared vehicle an attention level that is either stored in a memory or used to update that vehicle's existing attention-level entry in the memory;
and for the vehicle with the highest attention degree, if the data in the dynamic vehicle information exceeds a preset normal value range, a warning signal is sent to the driver of the vehicle.
2. The driving assistance method based on image recognition and UWB tag according to claim 1, wherein the specific steps of obtaining the distance between the measured object and the vehicle by the optical ranging principle are as follows:
configuring the internal and external parameters of the two cameras to be the same;
shooting pictures in real time;
determining the three-dimensional coordinates of objects in the actual scene through a binocular stereo vision technology;
and obtaining a distance parameter according to the vehicle outline size and the camera installation position information.
3. The driving assistance method based on image recognition and UWB tag according to claim 1, wherein the dynamic vehicle information is obtained by:
acquiring parameters including vehicle speed per hour, engine speed, steering angle and acceleration through a CAN bus of the vehicle;
determining vehicle position information and a driving direction through the GPS/INS in cooperation with a gyroscope;
the weight of the load of the vehicle is collected through the arranged pressure sensor.
4. The driving assistance method based on image recognition and UWB tag according to claim 1, wherein the vehicle information stored in the UWB tag of the vehicle within the preset range is compared with the vehicle information by the following method:
reading the real-time position information of the host vehicle and the comparison vehicle, calculating the straight-line distance between the two vehicles, and orthogonally decomposing that distance along the host vehicle's driving direction into a vertical distance along the driving direction and a transverse distance perpendicular to it;
reading the driving-direction data of the host vehicle and the comparison vehicle, and calculating the included angle between the two vehicles' driving directions;
and reading the engine output torque and engine speed of the host vehicle and the comparison vehicle, and calculating the output-torque and engine-speed variations.
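The decomposition step above can be sketched as follows (a minimal illustration; the 2-D positions and the heading angle in radians are assumed conventions, not specified by the patent):

```python
import math

# Sketch of the comparison step in claim 4: decompose the straight-line
# distance between the host vehicle and a comparison vehicle into a
# component along the host vehicle's driving direction (the "vertical"
# distance) and one perpendicular to it (the "transverse" distance).

def decompose_distance(host_pos, other_pos, host_heading_rad):
    dx = other_pos[0] - host_pos[0]
    dy = other_pos[1] - host_pos[1]
    # Unit vector of the host vehicle's driving direction.
    ux, uy = math.cos(host_heading_rad), math.sin(host_heading_rad)
    s_vertical = dx * ux + dy * uy      # projection onto the driving direction
    s_transverse = -dx * uy + dy * ux   # perpendicular component
    straight_line = math.hypot(dx, dy)
    return straight_line, s_vertical, s_transverse

# Host at the origin heading along +x; comparison vehicle 30 m ahead, 4 m aside.
d, sv, st = decompose_distance((0.0, 0.0), (30.0, 4.0), 0.0)
print(sv, st)  # 30.0 4.0
```

The included angle between the two driving directions is then simply the difference of the two heading angles.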
5. The image recognition and UWB tag-based assisted driving method according to claim 4, wherein the granting of the attention level comprises the steps of:
reading the vertical distance S_vertical, the transverse distance S_transverse, and the driving-direction included angle θ of the comparison vehicle relative to the host vehicle;
reading the real-time speed V and the acceleration a of the comparison vehicle;
reading the engine-speed variation ΔR;
and substituting into the formula P = λ1·S_vertical + λ2·S_transverse + λ3·θ + λ4·V + λ5·a + λ6·ΔR to obtain the attention-grade parameter of the comparison vehicle, where the weight coefficients λ1, λ2, λ3, λ4, λ5, λ6 are read from prestored values corresponding to the driving-environment mode selected by the user;
after all vehicles within the preset range have been compared, ranking the grade parameters and awarding attention grades from high to low;
and simultaneously sending a warning signal to the driver of the host vehicle and to the vehicle with the highest attention degree.
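For illustration only, the grading formula of this claim can be sketched as follows; the mode names, weight values, sign conventions, and vehicle records are all assumptions, not values from the patent:

```python
# Sketch of the attention-grading formula in claim 5:
#   P = λ1*S_vertical + λ2*S_transverse + λ3*θ + λ4*V + λ5*a + λ6*ΔR,
# with a per-mode weight table read for the user-selected driving-environment
# mode. All weights below are hypothetical; negative distance weights encode
# the assumption that nearer vehicles deserve more attention.

WEIGHTS_BY_MODE = {
    # mode: (λ1, λ2, λ3, λ4, λ5, λ6) -- illustrative values only
    "urban":   (-0.05, -0.10, 0.30, 0.20, 0.50, 0.40),
    "highway": (-0.02, -0.20, 0.10, 0.30, 0.60, 0.50),
}

def attention_score(mode, s_vertical, s_transverse, angle, speed, accel, d_rpm):
    l1, l2, l3, l4, l5, l6 = WEIGHTS_BY_MODE[mode]
    return (l1 * s_vertical + l2 * s_transverse + l3 * angle
            + l4 * speed + l5 * accel + l6 * d_rpm)

def rank_vehicles(mode, vehicles):
    """vehicles: dict of id -> (S_vertical, S_transverse, θ, V, a, ΔR).
    Returns ids sorted from highest to lowest attention."""
    scored = {vid: attention_score(mode, *feat) for vid, feat in vehicles.items()}
    return sorted(scored, key=scored.get, reverse=True)

ranking = rank_vehicles("urban", {
    "A": (50.0, 10.0, 0.1, 60.0, 0.5, 100.0),  # far away, steady
    "B": (10.0, 2.0, 0.5, 80.0, 2.0, 800.0),   # close, accelerating hard
})
print(ranking)  # ['B', 'A'] under these illustrative weights
```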
6. The driving assistance method based on image recognition and UWB tag according to claim 4, wherein when the straight-line distance between the host vehicle and the comparison vehicle falls below a preset threshold, a high-precision ranging service is triggered, comprising:
establishing a space coordinate system with the center of the host vehicle's body as the coordinate origin, the direction parallel to the ground toward the front of the vehicle as the positive Y-axis, the direction perpendicular to the ground pointing downward as the positive Z-axis, and the direction parallel to the ground toward the driver's right as the positive X-axis;
reading the host vehicle's outline size data to obtain the farthest end-point coordinates of each face of the host vehicle's outline solid;
reading the comparison vehicle's outline size data and combining it with distance data measured by lasers arranged around the vehicle to obtain the farthest end-point coordinates of each face of the comparison vehicle's outline solid;
and calculating the point-to-point distances between the farthest end points of the host vehicle's outline solid and those of the comparison vehicle's outline solid, and feeding back the position of the closest pair of points to the drivers of both vehicles in real time.
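The nearest-point search above can be sketched as follows, simplifying each outline solid to the eight corners of a hypothetical bounding box expressed in the host vehicle's coordinate system (X right, Y forward, Z down, per the claim); all box dimensions and offsets are assumed:

```python
import math
from itertools import product

# Sketch of the high-precision ranging step in claim 6: given the farthest
# end-point coordinates of each face of the two vehicles' outline solids
# (here simplified to the 8 corners of each bounding box), find the pair of
# points with the minimum distance between the two vehicles.

def box_corners(center, width_x, length_y, height_z):
    cx, cy, cz = center
    return [(cx + sx * width_x / 2, cy + sy * length_y / 2, cz + sz * height_z / 2)
            for sx, sy, sz in product((-1, 1), repeat=3)]

def closest_points(corners_a, corners_b):
    # Brute-force over all corner pairs; 8 x 8 = 64 distance evaluations.
    return min(((math.dist(p, q), p, q)
                for p in corners_a for q in corners_b), key=lambda t: t[0])

host = box_corners((0.0, 0.0, 0.0), 1.8, 4.6, 1.5)   # host vehicle box
other = box_corners((0.0, 6.0, 0.0), 1.8, 4.6, 1.5)  # vehicle 6 m ahead
d_min, p_host, p_other = closest_points(host, other)
print(round(d_min, 2))  # 1.4 (gap between host front and other vehicle's rear)
```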
7. The driving assistance method based on image recognition and UWB tag according to claim 6, wherein the high-precision ranging service further comprises a modeling step:
reading the coordinate data of the host vehicle and of the vehicle with the highest attention degree in the space coordinate system, and dynamically displaying scaled outline models of the two vehicles on an in-vehicle display device.
8. A driving assistance device based on image recognition and UWB tags, characterized by comprising:
the image ranging module is used for obtaining the distance between the measured object and the vehicle through an optical ranging principle;
the UWB tag module is used for storing static vehicle information and dynamic vehicle information, wherein the static vehicle information comprises a license plate number, a vehicle model and a vehicle overall dimension, and the dynamic vehicle information comprises a vehicle running direction, a real-time speed, an engine rotating speed, an acceleration, a steering angle, a load weight and position information;
the data reading and writing module is used for writing the static vehicle information and the dynamic vehicle information into the UWB tag in real time, ready to be read;
the data processing module is used for reading, in real time, the vehicle information stored in the UWB tags of all vehicles within a preset range, comparing it with the host vehicle's information, granting the compared vehicles different attention degrees, and sending these to the memory for storage or updating the attention degree grade corresponding to each vehicle in the memory;
and the reminding display module is used for sending a warning signal to the driver of the vehicle when the data in the dynamic vehicle information corresponding to the vehicle with the highest attention degree exceeds a preset normal value range.
9. An image recognition and UWB tag based driver assistance system, comprising:
one or more processors;
storage means for storing one or more programs;
a driving assistance device based on image recognition and UWB tags;
the one or more programs, when executed by the one or more processors, cause an image recognition and UWB tag-based driver assistance device to implement, in cooperation with the one or more processors, the method of any one of claims 1-7.
10. A vehicle control system comprising the image recognition and UWB tag based assisted driving system of claim 9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010420471.3A CN111599216A (en) | 2020-05-18 | 2020-05-18 | Auxiliary driving method, device and system based on image recognition and UWB (ultra-wideband) tag |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111599216A true CN111599216A (en) | 2020-08-28 |
Family
ID=72192437
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111599216A (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100114418A1 (en) * | 2008-11-06 | 2010-05-06 | Stephen Varghese Samuel | System and method for determining a side-impact collision status of a vehicle |
CN102542843A (en) * | 2010-12-07 | 2012-07-04 | 比亚迪股份有限公司 | Early warning method for preventing vehicle collision and device |
DE102016216523A1 (en) * | 2016-09-01 | 2018-03-01 | Robert Bosch Gmbh | Method and driver assistance system for assisting a driver of a vehicle |
CN107993485A (en) * | 2017-10-30 | 2018-05-04 | 惠州市德赛西威汽车电子股份有限公司 | A kind of adaptive method for early warning and device based on car networking |
CN108877296A (en) * | 2018-08-01 | 2018-11-23 | 江苏省送变电有限公司 | A kind of collision avoidance system based on Internet of Things |
KR20190055898A (en) * | 2017-11-16 | 2019-05-24 | 르노삼성자동차 주식회사 | The lane change assistant system |
CN110349425A (en) * | 2019-08-13 | 2019-10-18 | 浙江吉利汽车研究院有限公司 | A kind of important goal generation method for bus or train route collaboration automated driving system |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112256032A (en) * | 2020-11-02 | 2021-01-22 | 中国计量大学 | AGV positioning system, control method, equipment and storage medium |
CN114274966A (en) * | 2021-11-25 | 2022-04-05 | 惠州市德赛西威汽车电子股份有限公司 | Driving behavior monitoring method and system based on UWB technology |
CN116980571A (en) * | 2023-09-25 | 2023-10-31 | 苏州市瑞思特智能制造有限公司 | Anti-collision bucket-wheel stacker-reclaimer, anti-collision system and anti-collision method thereof |
CN116980571B (en) * | 2023-09-25 | 2023-12-19 | 苏州市瑞思特智能制造有限公司 | Anti-collision bucket-wheel stacker-reclaimer, anti-collision system and anti-collision method thereof |
CN117152666A (en) * | 2023-10-18 | 2023-12-01 | 北京精英智通科技股份有限公司 | Analysis correction recognition method and system for motor vehicle characteristics |
CN117152666B (en) * | 2023-10-18 | 2024-02-09 | 北京精英智通科技股份有限公司 | Analysis correction recognition method and system for motor vehicle characteristics |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111599216A (en) | Auxiliary driving method, device and system based on image recognition and UWB (ultra-wideband) tag | |
CN106240458B (en) | A kind of vehicular frontal impact method for early warning based on vehicle-mounted binocular camera | |
CN110077399B (en) | Vehicle anti-collision method based on road marking and wheel detection fusion | |
CN110745140B (en) | Vehicle lane change early warning method based on continuous image constraint pose estimation | |
Dagan et al. | Forward collision warning with a single camera | |
CN110065494B (en) | Vehicle anti-collision method based on wheel detection | |
CN111775940B (en) | Automatic channel changing method, device, equipment and storage medium | |
EP2347400B1 (en) | Method and system for combining sensor data | |
CN111516682A (en) | Motor vehicle management and control method, device and system based on intelligent driving environment measurement and control | |
CN109815832A (en) | Driving method for early warning and Related product | |
CN112562405A (en) | Radar video intelligent fusion and early warning method and system | |
CN104181534A (en) | Probabilistic target selection and threat assessment method and application to intersection collision alert system | |
CN104085396A (en) | Panoramic lane departure warning method and system | |
CN102442248A (en) | System and method for alarming front impact danger coupled with driver viewing direction and vehicle using same | |
CN108021899A (en) | Vehicle intelligent front truck anti-collision early warning method based on binocular camera | |
CN113844444A (en) | Vehicle forward collision early warning method and device, electronic equipment and vehicle | |
CN102778223A (en) | License number cooperation target and monocular camera based automobile anti-collision early warning method | |
CN112526521A (en) | Multi-target tracking method for automobile millimeter wave anti-collision radar | |
CN111881245B (en) | Method, device, equipment and storage medium for generating visibility dynamic map | |
CN111516706A (en) | Vehicle automatic detection danger avoidance auxiliary method, device and system | |
CN114387821B (en) | Vehicle collision early warning method, device, electronic equipment and storage medium | |
CN115635977A (en) | Vehicle collision early warning method and device, electronic equipment and storage medium | |
CN113085854A (en) | System and method for identifying obstacle above vehicle through radar camera | |
CN114396958B (en) | Lane positioning method and system based on multiple lanes and multiple sensors and vehicle | |
CN103568990A (en) | Method and system for achieving vehicle safety warning |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| WW01 | Invention patent application withdrawn after publication | Application publication date: 20200828 |