CN110473414B - Vehicle driving path determining method, device and system - Google Patents

Vehicle driving path determining method, device and system

Info

Publication number
CN110473414B
CN110473414B CN201910768131.7A
Authority
CN
China
Prior art keywords
traffic light
determining
angle
target
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910768131.7A
Other languages
Chinese (zh)
Other versions
CN110473414A (en)
Inventor
张海强
李世明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jingwei Hirain Tech Co Ltd
Original Assignee
Beijing Jingwei Hirain Tech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jingwei Hirain Tech Co Ltd filed Critical Beijing Jingwei Hirain Tech Co Ltd
Priority to CN201910768131.7A priority Critical patent/CN110473414B/en
Publication of CN110473414A publication Critical patent/CN110473414A/en
Application granted granted Critical
Publication of CN110473414B publication Critical patent/CN110473414B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/01: Detecting movement of traffic to be counted or controlled
    • G08G1/04: Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/09: Arrangements for giving variable traffic instructions
    • G08G1/0962: Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/09623: Systems involving the acquisition of information from passive traffic signs by means mounted on the vehicle
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/09: Arrangements for giving variable traffic instructions
    • G08G1/0962: Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0967: Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G1/096708: Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control
    • G08G1/096725: Systems involving transmission of highway information, e.g. weather, speed limits where the received information generates an automatic action on the vehicle control

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Atmospheric Sciences (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides a vehicle driving path determining method, device and system. The method first acquires an image to be recognized of a target vehicle and, based on that image, determines attribute category information, position information and angle category information of each object to be recognized. A target traffic light is then determined based on the attribute category information, the angle category information and the position information, and a target driving path of the vehicle is determined based on the target traffic light. Because the target traffic light is determined from the attribute category information, the angle category information and the position information together, the accuracy of traffic light recognition is improved, and determining the target driving path from that traffic light enhances the safety of the determined path.

Description

Vehicle driving path determining method, device and system
Technical Field
The invention relates to the technical field of automatic driving, in particular to a method, a device and a system for determining a vehicle driving path.
Background
An autonomous vehicle acquires surrounding environment information and control instructions to drive automatically. However, when an autonomous vehicle travels through an intersection such as the one shown in fig. 1, vehicle A needs to recognize all traffic light information within its field of view, such as traffic light C, traffic light D, traffic light E and traffic light F in the figure, determine the traffic light it must obey, such as traffic light C, based on the recognized information, and then determine a target driving path.
However, if the determined target driving path does not conform to the actual traffic rules, a serious safety hazard may result. For example, vehicle A in fig. 1 must drive according to traffic light C; if the target driving path is instead determined according to traffic light D and the vehicle follows that path, it may cross the paths of other vehicles that are driving normally, or even collide with them.
Therefore, how to provide a vehicle driving path determining method that accurately identifies traffic lights and enhances the safety of the determined target path is a technical problem that urgently needs to be solved by those skilled in the art.
Disclosure of Invention
The invention provides a vehicle driving path determining method which can accurately identify traffic lights and enhance the safety of the determined target path.
To achieve this purpose, the technical solution provided by the present application is as follows:
a vehicle travel path determination method, comprising:
acquiring an image to be identified of a target vehicle;
determining attribute category information, position information and angle category information of an object to be identified in the image to be identified based on the image to be identified;
determining a target traffic light based on the attribute category information, the angle category information and the position information;
and determining a target running path of the vehicle based on the target traffic light.
Optionally, the acquiring an image to be identified of the target vehicle includes:
acquiring an image of a preset visual angle output by a visual sensor of the target vehicle;
and determining the image to be recognized based on the image with the preset visual angle, wherein the image to be recognized has a preset data size.
Optionally, the determining, based on the image to be recognized, attribute category information, position information, and angle category information of the object to be recognized in the image to be recognized includes:
inputting the image to be identified into a preset neural network, and determining the object to be identified with the attribute category information of the traffic light as the traffic light to be identified;
acquiring an actual angle value of the traffic light to be identified and an angle category value output by a preset neural network;
determining a loss value of a preset neural network based on the actual angle value of the traffic light to be identified and the angle class value output by the preset neural network;
and determining the target weight of the preset neural network based on the loss value so that the preset neural network identifies the image to be identified according to the target weight.
Optionally, the determining a target traffic light based on the attribute category information, the angle category information, and the position information includes:
determining an orientation of each traffic light based on the angle category;
calculating the acute angle absolute value of an included angle between the orientation of each traffic light and the driving orientation of the vehicle;
and determining the traffic light corresponding to the minimum value in the acute angle absolute values as the target traffic light.
Optionally, the determining a target traffic light based on the attribute category information, the angle category information, and the position information further includes:
acquiring map information of the vehicle;
determining an orientation of each intersection based on the location information and the map information;
calculating an angle difference between the orientation of the crossroad and the orientation of the traffic lights;
and determining the traffic light corresponding to the minimum value in the angle difference as the target traffic light.
A vehicle travel path determination device comprising:
the acquisition module is used for acquiring an image to be identified of the target vehicle;
the object information determining module is used for determining attribute category information, position information and angle category information of the object to be recognized in the image to be recognized based on the image to be recognized;
the target traffic light determining module is used for determining a target traffic light based on the attribute category information, the angle category information and the position information;
and the target running path determining module is used for determining a target running path of the vehicle based on the target traffic light.
Optionally, the obtaining module includes:
a first acquisition unit, configured to acquire an image of a preset viewing angle output by a vision sensor of the target vehicle;
and the first determining unit is used for determining the image to be recognized based on the image with the preset visual angle, and the image to be recognized has a preset data size.
Optionally, the object information determining module includes:
the second determining unit is used for inputting the image to be identified into a preset neural network and determining the object to be identified with the attribute class information of the traffic light as the traffic light to be identified;
the second acquisition unit is used for acquiring the actual angle value of the traffic light to be identified and the angle category value output by the preset neural network;
a third determining unit, configured to determine a loss value of a preset neural network based on the actual angle value of the traffic light to be identified and the angle category value output by the preset neural network;
and the fourth determining unit is used for determining the target weight of the preset neural network based on the loss value so that the preset neural network can identify the image to be identified according to the target weight.
Optionally, the target traffic light determining module includes:
a fifth determining unit for determining the orientation of each traffic light based on the angle category;
the first calculation unit is used for calculating the acute angle absolute value of an included angle between the orientation of each traffic light and the driving orientation of the vehicle;
a sixth determining unit, configured to determine a traffic light corresponding to a minimum value in the acute angle absolute values as the target traffic light;
or, alternatively,
a third acquisition unit configured to acquire map information of the vehicle;
a seventh determining unit configured to determine an orientation of each intersection based on the position information and the map information;
a second calculation unit for calculating an angle difference between the direction of the crossroad and the direction of the traffic light;
and the eighth determining unit is used for determining the traffic light corresponding to the minimum value in the angle difference as the target traffic light.
A vehicle travel path determination system includes any one of the vehicle travel path determination devices described above.
The invention provides a vehicle driving path determining method, device and system. The method first acquires an image to be recognized of a target vehicle and, based on that image, determines attribute category information, position information and angle category information of each object to be recognized. A target traffic light is then determined based on the attribute category information, the angle category information and the position information, and a target driving path of the vehicle is determined based on the target traffic light. Because the target traffic light is determined from the attribute category information, the angle category information and the position information together, the accuracy of traffic light recognition is improved, and determining the target driving path from that traffic light enhances the safety of the determined path.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings required for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic view of a traffic route;
FIG. 2 is a schematic flow chart of a vehicle driving path determining method according to an embodiment of the present invention;
FIG. 3 is a schematic flow chart of a vehicle driving path determining method according to an embodiment of the present invention;
FIG. 4 is a schematic view of yet another traffic path;
FIG. 5 is a schematic flow chart of a vehicle driving path determining method according to an embodiment of the present invention;
FIG. 6 is a schematic flow chart of a vehicle driving path determining method according to an embodiment of the present invention;
FIG. 7 is a schematic flow chart of a vehicle driving path determining method according to an embodiment of the present invention;
FIG. 8 is a schematic structural diagram of a vehicle driving path determining device according to an embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in detail below.
Referring to fig. 2, fig. 2 is a schematic flow chart of a method for determining a driving route of a vehicle according to an embodiment of the present invention, including:
s21, acquiring an image to be recognized of the target vehicle;
in this embodiment, the target vehicle is a vehicle to be subjected to vehicle travel path determination, and may be an autonomous vehicle or a non-autonomous vehicle. In the present embodiment, the parameters such as the model and the power type of the target vehicle are not limited.
The image to be recognized is the picture that is input into the neural network, which performs the configured target recognition on it; for example, the network recognizes the traffic lights in the image to be recognized, their position information within the image, and their orientation relative to the target vehicle.
Specifically, as shown in fig. 3, the embodiment provides a specific implementation method for acquiring an image to be recognized of a target vehicle, including:
s31, acquiring an image of a preset visual angle output by a visual sensor of the target vehicle;
Generally, a vision sensor capable of capturing a preset viewing angle, such as a driving recorder or a mobile electronic device, is installed on the target vehicle, so an image of the target vehicle's current viewing angle can be captured by the vision sensor installed on the vehicle.
S32, determining the image to be recognized based on the image with the preset view angle, wherein the image to be recognized has a preset data size.
Since the neural network requires input data of a fixed size, for example an image height and width of 224, the purpose of this step is to resize the preset-viewing-angle image output by the vision sensor so that the adjusted size meets the input size requirement of the neural network.
Specifically, the resize function of the OpenCV library may be called to adjust the preset-viewing-angle image output by the vision sensor to the input size required by the neural network, thereby generating the image to be recognized.
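As a minimal sketch of this preprocessing step (assuming Python with OpenCV and a 224x224 input size; the function name and the normalization step are illustrative assumptions, not part of the patent):

    import cv2
    import numpy as np

    def preprocess_frame(frame: np.ndarray, input_size: int = 224) -> np.ndarray:
        """Resize a frame from the vision sensor to the network's expected input size."""
        resized = cv2.resize(frame, (input_size, input_size), interpolation=cv2.INTER_LINEAR)
        # Normalizing to [0, 1] is an assumption; the description only requires that the
        # image match the preset data size expected by the neural network.
        return resized.astype(np.float32) / 255.0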
S22, determining attribute category information, position information and angle category information of the object to be recognized in the image to be recognized based on the image to be recognized;
after the image to be recognized is obtained, the step can determine the category information, the position information and the angle category information of the object to be recognized in the image to be recognized through the neural network. The category information of the object to be recognized comprises two categories: traffic lights and non-traffic lights. In this embodiment, the traffic lights in the image to be recognized need to be recognized, and therefore, the neural network only needs to recognize the target of which the category is the traffic light. Namely, objects except traffic lights in the image to be identified, such as vehicles and trees, are not identified.
It should be noted that, in addition to the intersection of fig. 1, the inventors also considered the road configuration shown in fig. 4: when the vehicle is on road 1 it must drive according to traffic light A1, and when it is on road 2 it must drive according to traffic light A2, yet traffic light A1 and traffic light A2 may be located at the same position or at very similar positions.
Therefore, in the present embodiment, in addition to identifying the category information of the objects to be recognized, it is necessary to identify the position information of each traffic light and its angle category information relative to the target vehicle. The position information comprises four values: the X-axis coordinate and the Y-axis coordinate of the traffic light's position in the image to be recognized, and the width W and the height H of the traffic light.
For example, referring to fig. 4, when the positions of the traffic lights to be obeyed on the two roads are close to each other, the target traffic light can be further determined according to the traffic light's height and its angle category information relative to the target vehicle.
In the present embodiment, the angle category information includes 60 categories, each representing an angle interval of 3°. For example, angle category 1 indicates that the relative position (actual angle) of the traffic light and the target vehicle lies within the interval [0°, 3°], angle category 2 indicates the interval [4°, 6°], and so on, until angle category 60, which indicates the interval [178°, 180°].
Specifically, in this embodiment the real angle of an identified traffic light is converted into an angle interval, and each angle interval is represented by one angle category. In other words, the angle output of the neural network is turned into a classification problem, which the neural network can predict well.
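The following sketch illustrates this angle-to-category conversion, assuming contiguous 3° bins over [0°, 180°] so that it reproduces the examples in Table 1 below (2.2° gives category 1, 179.7° gives category 60); the helper name is hypothetical:

    def angle_to_category(angle_deg: float, num_classes: int = 60, span_deg: float = 180.0) -> int:
        """Map an actual angle in [0, 180] degrees to an angle category in 1..60."""
        width = span_deg / num_classes               # 3 degrees per category
        category = int(angle_deg // width) + 1       # 1-based class index
        return min(max(category, 1), num_classes)    # clamp the boundary value 180 degrees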
Further, as shown in fig. 5, this embodiment further provides a specific implementation method for determining attribute category information, position information, and angle category information of an object to be recognized in the image to be recognized based on the image to be recognized, including:
s51, inputting the image to be identified into a preset neural network, and determining the object to be identified with the attribute class information of the traffic light as the traffic light to be identified;
s52, acquiring the actual angle value of the traffic light to be identified and the angle category value output by a preset neural network;
s53, determining a loss value of a preset neural network based on the actual angle value of the traffic light to be identified and the angle class value output by the preset neural network;
s54, determining the target weight of the preset neural network based on the loss value, so that the preset neural network can identify the image to be identified according to the target weight.
As described above, the neural network identifies the attribute category of each target in the image to be recognized. In this embodiment the attribute category only needs to be divided into two categories, traffic light or non-traffic light, and each object to be recognized whose attribute category information is traffic light is determined to be a traffic light to be identified.
Since the accuracy of the neural network's output is improved through repeated training, in this embodiment the actual angle value of the traffic light to be recognized is obtained first, the target angle category corresponding to that actual angle value is then looked up according to Table 1, and this target angle category is used as the training target of the network's angle category prediction.
TABLE 1

Actual angle value | Angle interval | Angle category
e.g. 2.2°          | [0°, 3°]       | 1
e.g. 5.1°          | [4°, 6°]       | 2
e.g. 8°            | [7°, 9°]       | 3
…                  | …              | …
e.g. 176.2°        | [175°, 177°]   | 59
e.g. 179.7°        | [178°, 180°)   | 60
Then the angle category value output by the neural network is obtained and compared with the target angle category, and a loss function (such as a cross-entropy loss function) is used to calculate, for each traffic light, a loss value between the angle category value output by the network and the target angle category; this is the loss value of the preset neural network.
Specifically, the loss function measures the loss between the network output and the target value; for a target distribution p and a predicted distribution q, the cross-entropy loss can be written as L = -Σ p·log(q). After the loss value is calculated, it is back-propagated and the weights are updated, so that the target weight is determined and the preset neural network can recognize the image to be recognized according to the target weight.
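A minimal PyTorch-style sketch of this training step, under the assumption that the preset neural network exposes an angle-classification head with 60 logits; the model, optimizer, function name and tensor shapes are placeholders rather than the patent's actual implementation:

    import torch
    import torch.nn as nn

    criterion = nn.CrossEntropyLoss()  # cross-entropy between predicted logits and the target class

    def training_step(model: nn.Module, optimizer: torch.optim.Optimizer,
                      image: torch.Tensor, target_angle_category: torch.Tensor) -> float:
        """One update of the angle-classification head.

        target_angle_category holds 0-based class indices derived from the actual
        angle values via Table 1 (assumed shape: [batch]).
        """
        logits = model(image)                        # assumed shape: [batch, 60]
        loss = criterion(logits, target_angle_category)
        optimizer.zero_grad()
        loss.backward()                              # back-propagate the loss value
        optimizer.step()                             # update the weights toward the target weight
        return loss.item()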
S23, determining a target traffic light based on the attribute category information, the angle category information and the position information;
and S24, determining a target driving path of the vehicle based on the target traffic light.
After the neural network outputs the angle category information and the position information of the traffic lights, a target traffic light can be determined from this information: for example, the traffic light closest to the target vehicle is taken as the target traffic light, or the traffic light with the smallest angle category output by the neural network is taken as the target traffic light. Alternatively, when several traffic lights to be identified have the same angle category information, the target traffic light is further determined according to their position information.
After the target traffic light is determined, the target driving path of the vehicle is further determined based on the indication state of the target traffic light, for example whether it is red or green, and hence whether the vehicle may drive straight ahead, turn left, and so on.
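As a purely illustrative sketch of this last step, a placeholder mapping from the indication state of the target traffic light to permitted maneuvers might look as follows (the state names and maneuver lists are assumptions; the actual behaviour depends on the intersection layout and local traffic rules):

    def allowed_maneuvers(target_light_state: str) -> list:
        """Map the indication state of the target traffic light to permitted maneuvers."""
        if target_light_state == "green":
            return ["go_straight", "turn_left"]
        if target_light_state == "red":
            return ["stop"]
        return ["prepare_to_stop"]                   # e.g. yellow or unknown state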
Therefore, in this solution the target traffic light is determined based on the attribute category information, the angle category information and the position information, which improves the accuracy of traffic light recognition; determining the target driving path of the vehicle based on that traffic light then enhances the safety of the determined driving path.
On the basis of the above embodiments, this embodiment further provides several specific implementation manners for determining the target traffic light based on the attribute category information, the angle category information, and the position information, for example:
the first method, as shown in fig. 6, includes the steps of:
s61, determining the orientation of each traffic light based on the angle category;
s62, calculating an acute angle absolute value of an included angle between the orientation of each traffic light and the driving orientation of the vehicle;
and S63, determining the traffic light corresponding to the minimum value in the acute angle absolute values as the target traffic light.
This embodiment is illustrated schematically with vehicle B in fig. 1. The image to be recognized from fig. 1 is input into the trained neural network, which outputs the position information of traffic light C, traffic light D and traffic light E in fig. 1 and their angle categories relative to vehicle B.
Assume that the position information of the traffic light C in the figure is (5, 4), the height is 3m, the width is 1m, and the angle category of the traffic light C output by the neural network with respect to the vehicle B is 15.
The position information of the traffic light D in the figure is (4, 6), the height is 3m, the width is 1m, and the angle class of the traffic light D output by the neural network relative to the vehicle B is 1.
The position information of the traffic light E in the figure is (3, 5), the height is 3m, the width is 1m, and the angle class of the traffic light E output by the neural network relative to the vehicle B is 15.
Then, based on the angle categories, this embodiment determines the angular bisector of the angle interval corresponding to each angle category as the orientation of that traffic light. For example, since the angle category of traffic light C and traffic light E is 15, the corresponding angle interval is [43°, 45°]; the bisector L1 of the interval [43°, 45°] is therefore determined as the orientation of traffic light C, and the bisector L2 as the orientation of traffic light E. Likewise, the bisector L3 of the angle interval [0°, 3°] is determined as the orientation of traffic light D.
It should be noted that this embodiment does not require the orientation of a traffic light to be the bisector of its angle interval; other preset directions may be used, for example the line at 1/3 of the angle interval corresponding to the angle category. The direction can be set according to actual design requirements, and the method is not limited to using the bisector.
After the orientation of each traffic light is determined, this embodiment further calculates the included angle between the orientation of each traffic light and the driving orientation of the target vehicle. The driving orientation of the target vehicle is the shooting direction of the host vehicle, i.e. the orientation of vehicle B is L′ in fig. 1. Step S62 then calculates the acute angle absolute value α1 between the bisector L1 and the orientation L′ of vehicle B, the acute angle absolute value α2 between the bisector L2 and L′, and the acute angle absolute value α3 between the bisector L3 and L′, where α1 = 45°, α2 = 45° and α3 = 0°.
The minimum of α1, α2 and α3, namely α3, is then determined, and the traffic light D corresponding to α3 is taken as the target traffic light. Vehicle B then performs path planning and drives according to traffic light D.
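The selection rule of this first method can be sketched as follows, assuming contiguous 3° angle bins and taking the orientation of each light as the bisector of its interval; the data structure and helper names are illustrative assumptions:

    from dataclasses import dataclass

    @dataclass
    class DetectedLight:
        name: str
        angle_category: int          # 1..60, each category spanning 3 degrees

    def light_orientation_deg(angle_category: int, span_deg: float = 3.0) -> float:
        """Orientation of a light: bisector of its angle interval (assuming contiguous bins)."""
        return (angle_category - 1) * span_deg + span_deg / 2.0

    def acute_angle_deg(a_deg: float, b_deg: float) -> float:
        """Absolute acute angle between two orientations, in degrees."""
        diff = abs(a_deg - b_deg) % 180.0
        return min(diff, 180.0 - diff)

    def select_target_light(lights, vehicle_heading_deg: float) -> DetectedLight:
        """Pick the light whose orientation makes the smallest acute angle with the vehicle heading."""
        return min(lights, key=lambda l: acute_angle_deg(
            light_orientation_deg(l.angle_category), vehicle_heading_deg))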
The second method, as shown in fig. 7, includes the steps of:
s71, acquiring map information of the vehicle;
s72, determining the direction of each crossroad based on the position information and the map information;
s73, calculating the angle difference between the orientation of the crossroad and the orientation of the traffic light;
s74, determining the traffic light corresponding to the minimum value in the angle difference as the target traffic light.
The first method does not need to consider map information; the second method further combines it.
Illustratively, referring to fig. 4, map information for the vehicle's position is first obtained, where the map information at least includes intersection information near the vehicle's geographic position. For example, the nearest intersection is found in the map based on the vehicle position, the road directions around that intersection are looked up with the intersection as the center, and the direction of road A is obtained as a and the direction of road B as b.
Then, using the vehicle's position and orientation in the map, the orientation of each traffic light identified from the image is converted from the camera coordinate system into the map coordinate system, giving t1 and t2. The absolute acute angles between t1, t2 and a, b are then calculated, and the pairings with the minimum angles are selected; as shown in Table 2, the combinations a-t2 and b-t1 are chosen, that is, road direction a is associated with light t2 and road direction b with light t1.
TABLE 2

Angular difference (°) | t1 | t2
a                      | 35 | 1
b                      | 3  | 32
Finally, according to the road on which the vehicle is located, the traffic light associated with that road is selected as the traffic light the vehicle should obey.
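A hedged sketch of this second method: each detected light orientation is transformed from the camera frame into the map frame using a simplified planar vehicle pose, and each road direction is associated with the light whose orientation differs from it by the smallest acute angle, reproducing the a-t2 / b-t1 pairing of Table 2; all names, the pose representation and the example values are assumptions:

    def to_map_frame(light_orientation_cam_deg: float, vehicle_yaw_map_deg: float) -> float:
        """Transform a light orientation from the camera frame into the map frame using the
        vehicle yaw only (a full implementation would use the complete camera-to-map transform)."""
        return (light_orientation_cam_deg + vehicle_yaw_map_deg) % 360.0

    def acute_angle_deg(a_deg: float, b_deg: float) -> float:
        diff = abs(a_deg - b_deg) % 180.0
        return min(diff, 180.0 - diff)

    def associate_lights_with_roads(road_orientations: dict, light_orientations_map: dict) -> dict:
        """For each road direction near the intersection, keep the light with the smallest
        angle difference."""
        return {road: min(light_orientations_map,
                          key=lambda light: acute_angle_deg(road_dir, light_orientations_map[light]))
                for road, road_dir in road_orientations.items()}

    # Illustrative values consistent with the angle differences of Table 2:
    # associate_lights_with_roads({"a": 90.0, "b": 0.0}, {"t1": 3.0, "t2": 89.0})
    # -> {"a": "t2", "b": "t1"}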
It should be noted that this matching manner, which combines the map with the vehicle pose information, can prevent association errors caused by vehicle skew and thus further ensures the accuracy of the target traffic light.
On the basis of the above-described embodiment, the present embodiment also provides a vehicle travel path determination device, as shown in fig. 8, including:
an obtaining module 81, configured to obtain an image to be identified of a target vehicle;
an object information determining module 82, configured to determine attribute category information, position information, and angle category information of an object to be recognized in the image to be recognized, based on the image to be recognized;
a target traffic light determination module 83, configured to determine a target traffic light based on the attribute category information, the angle category information, and the position information;
and a target driving path determining module 84, configured to determine a target driving path of the vehicle based on the target traffic light.
The obtaining module provided by this embodiment may include:
a first acquisition unit, configured to acquire an image of a preset viewing angle output by a vision sensor of the target vehicle;
and the first determining unit is used for determining the image to be recognized based on the image with the preset visual angle, and the image to be recognized has a preset data size.
In addition, in the vehicle travel path determination device according to the present embodiment, the object information determination module may include:
the second determining unit is used for inputting the image to be identified into a preset neural network and determining the object to be identified with the attribute class information of the traffic light as the traffic light to be identified;
the second acquisition unit is used for acquiring the actual angle value of the traffic light to be identified and the angle category value output by the preset neural network;
a third determining unit, configured to determine a loss value of a preset neural network based on the actual angle value of the traffic light to be identified and the angle category value output by the preset neural network;
and the fourth determining unit is used for determining the target weight of the preset neural network based on the loss value so that the preset neural network can identify the image to be identified according to the target weight.
In the device for determining a driving route of a vehicle according to this embodiment, the target traffic light determining module may include:
a fifth determining unit for determining the orientation of each traffic light based on the angle category;
the first calculation unit is used for calculating the acute angle absolute value of an included angle between the orientation of each traffic light and the driving orientation of the vehicle;
a sixth determining unit, configured to determine a traffic light corresponding to a minimum value in the acute angle absolute values as the target traffic light;
or, alternatively,
a third acquisition unit configured to acquire map information of the vehicle;
a seventh determining unit configured to determine an orientation of each intersection based on the position information and the map information;
a second calculation unit for calculating an angle difference between the direction of the crossroad and the direction of the traffic light;
and the eighth determining unit is used for determining the traffic light corresponding to the minimum value in the angle difference as the target traffic light.
For the working principle of the vehicle driving path determining device, please refer to the method embodiment above. The target traffic light is determined based on the attribute category information, the angle category information and the position information, which improves the accuracy of traffic light recognition; the target driving path of the vehicle is then determined based on the target traffic light, which enhances the safety of the determined driving path.
On the basis of the above embodiments, this embodiment further provides a vehicle driving path determining system, which includes any one of the vehicle driving path determining devices described above; its working principle is described in the device embodiments above and is not repeated here.
In summary, the present invention provides a vehicle driving path determining method, device and system. The method first obtains an image to be recognized of a target vehicle and then, based on that image, determines attribute category information, position information and angle category information of the objects to be recognized. The target traffic light is then determined based on the attribute category information, the angle category information and the position information, and the target driving path of the vehicle is determined based on the target traffic light. Because the target traffic light is determined from the attribute category information, the angle category information and the position information together, the accuracy of traffic light recognition is improved, and determining the target driving path from that traffic light enhances the safety of the determined path.
The foregoing is merely a preferred embodiment of the invention and is not intended to limit the invention in any way. Although the invention has been described with reference to preferred embodiments, it is not limited to them. Those skilled in the art can make many possible variations and modifications to the technical solution of the invention, or modify it into equivalent embodiments, using the methods and technical content disclosed above, without departing from the scope of the technical solution of the invention. Therefore, any simple modification, equivalent change or modification made to the above embodiments according to the technical essence of the invention, without departing from the content of the technical solution of the invention, still falls within the protection scope of the technical solution of the invention.

Claims (7)

1. A vehicle travel path determination method, characterized by comprising:
acquiring an image to be identified of a target vehicle;
determining attribute category information, position information and angle category information of an object to be identified in the image to be identified based on the image to be identified;
determining a target traffic light based on the attribute category information, the angle category information and the position information;
determining a target driving path of the vehicle based on the target traffic light;
the determining attribute category information, position information and angle category information of the object to be recognized in the image to be recognized based on the image to be recognized includes:
inputting the image to be identified into a preset neural network, and determining the object to be identified with the attribute category information of the traffic light as the traffic light to be identified;
acquiring an actual angle value of the traffic light to be identified and an angle category value output by a preset neural network;
determining a loss value of a preset neural network based on the actual angle value of the traffic light to be identified and the angle class value output by the preset neural network;
determining target weight of the preset neural network based on the loss value so that the preset neural network can identify the image to be identified according to the target weight;
the determining a target traffic light based on the attribute category information, the angle category information, and the position information includes:
determining an orientation of each traffic light based on the angle category;
calculating the acute angle absolute value of an included angle between the orientation of each traffic light and the driving orientation of the vehicle;
and determining the traffic light corresponding to the minimum value in the acute angle absolute values as the target traffic light.
2. The vehicle travel path determination method according to claim 1, wherein the acquiring an image to be recognized of a target vehicle includes:
acquiring an image of a preset visual angle output by a visual sensor of the target vehicle;
and determining the image to be recognized based on the image with the preset visual angle, wherein the image to be recognized has a preset data size.
3. The vehicle travel path determining method according to claim 1, wherein the determining a target traffic light based on the attribute category information, the angle category information, and the position information further includes:
acquiring map information of the vehicle;
determining an orientation of each intersection based on the location information and the map information;
calculating an angle difference between the orientation of the crossroad and the orientation of the traffic lights;
and determining the traffic light corresponding to the minimum value in the angle difference as the target traffic light.
4. A vehicle travel path determination device characterized by comprising:
the acquisition module is used for acquiring an image to be identified of the target vehicle;
the object information determining module is used for determining attribute category information, position information and angle category information of the object to be recognized in the image to be recognized based on the image to be recognized;
the target traffic light determining module is used for determining a target traffic light based on the attribute category information, the angle category information and the position information;
the target running path determining module is used for determining a target running path of the vehicle based on the target traffic light;
the object information determination module includes:
the second determining unit is used for inputting the image to be identified into a preset neural network and determining the object to be identified with the attribute class information of the traffic light as the traffic light to be identified;
the second acquisition unit is used for acquiring the actual angle value of the traffic light to be identified and the angle category value output by the preset neural network;
a third determining unit, configured to determine a loss value of a preset neural network based on the actual angle value of the traffic light to be identified and the angle category value output by the preset neural network;
a fourth determining unit, configured to determine, based on the loss value, a target weight of the preset neural network, so that the preset neural network identifies the image to be identified according to the target weight;
the target traffic light determination module includes:
a fifth determining unit for determining the orientation of each traffic light based on the angle category;
the first calculation unit is used for calculating the acute angle absolute value of an included angle between the orientation of each traffic light and the driving orientation of the vehicle;
and the sixth determining unit is used for determining the traffic light corresponding to the minimum value in the acute angle absolute values as the target traffic light.
5. The vehicle travel path determination device according to claim 4, characterized in that the acquisition module includes:
a first acquisition unit, configured to acquire an image of a preset viewing angle output by a vision sensor of the target vehicle;
and the first determining unit is used for determining the image to be recognized based on the image with the preset visual angle, and the image to be recognized has a preset data size.
6. The vehicle travel path determination device according to claim 4, wherein the target traffic light determination module further includes:
a third acquisition unit configured to acquire map information of the vehicle;
a seventh determining unit configured to determine an orientation of each intersection based on the position information and the map information;
a second calculation unit for calculating an angle difference between the direction of the crossroad and the direction of the traffic light;
and the eighth determining unit is used for determining the traffic light corresponding to the minimum value in the angle difference as the target traffic light.
7. A vehicle travel path determination system characterized by comprising the vehicle travel path determination apparatus according to any one of claims 4 to 6.
CN201910768131.7A 2019-08-20 2019-08-20 Vehicle driving path determining method, device and system Active CN110473414B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910768131.7A CN110473414B (en) 2019-08-20 2019-08-20 Vehicle driving path determining method, device and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910768131.7A CN110473414B (en) 2019-08-20 2019-08-20 Vehicle driving path determining method, device and system

Publications (2)

Publication Number Publication Date
CN110473414A CN110473414A (en) 2019-11-19
CN110473414B true CN110473414B (en) 2021-03-23

Family

ID=68512011

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910768131.7A Active CN110473414B (en) 2019-08-20 2019-08-20 Vehicle driving path determining method, device and system

Country Status (1)

Country Link
CN (1) CN110473414B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111444810A (en) * 2020-03-23 2020-07-24 东软睿驰汽车技术(沈阳)有限公司 Traffic light information identification method, device, equipment and storage medium
CN111867211A (en) * 2020-07-14 2020-10-30 深圳市千百辉照明工程有限公司 Automatic adjusting method, device and system of intelligent lamp
JP2022128712A (en) * 2021-02-24 2022-09-05 本田技研工業株式会社 Road information generation device
CN114373321B (en) * 2021-12-01 2023-08-25 北京天兵科技有限公司 Path optimization method, system, device and medium for individual single trip

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103473946A (en) * 2013-06-25 2013-12-25 中国计量学院 Coordinate-based crossing signal lamp state instant prompting method and coordinate-based crossing signal lamp state instant prompting
CN106023619A (en) * 2016-06-17 2016-10-12 乐视控股(北京)有限公司 Traffic signal lamp information obtaining method, device and vehicle
CN106909937A (en) * 2017-02-09 2017-06-30 北京汽车集团有限公司 Traffic lights recognition methods, control method for vehicle, device and vehicle
KR20180068262A (en) * 2016-12-13 2018-06-21 현대자동차주식회사 Method and system for realizing a traffic signal
CN108305475A (en) * 2017-03-06 2018-07-20 腾讯科技(深圳)有限公司 A kind of traffic lights recognition methods and device
CN108932861A (en) * 2018-09-05 2018-12-04 广州小鹏汽车科技有限公司 The based reminding method and system of traffic lights variation

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102509466A (en) * 2011-10-28 2012-06-20 南京邮电大学 Traffic signal light auxiliary recognition system based on mobile telephone and method
MX352401B (en) * 2014-05-20 2017-11-23 Nissan Motor Traffic-light recognition device and traffic-light recognition method.
CN106251664A (en) * 2016-09-19 2016-12-21 重庆邮电大学 A kind of traffic lights based on DSRC coding and state recognition system and method
CN109508580B (en) * 2017-09-15 2022-02-25 阿波罗智能技术(北京)有限公司 Traffic signal lamp identification method and device
CN108847038A (en) * 2018-06-29 2018-11-20 奇瑞汽车股份有限公司 A kind of speed bootstrap technique and device
CN109145798B (en) * 2018-08-13 2021-10-22 浙江零跑科技股份有限公司 Driving scene target identification and travelable region segmentation integration method
CN109740526B (en) * 2018-12-29 2023-06-20 清华大学苏州汽车研究院(吴江) Signal lamp identification method, device, equipment and medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103473946A (en) * 2013-06-25 2013-12-25 中国计量学院 Coordinate-based crossing signal lamp state instant prompting method and coordinate-based crossing signal lamp state instant prompting
CN106023619A (en) * 2016-06-17 2016-10-12 乐视控股(北京)有限公司 Traffic signal lamp information obtaining method, device and vehicle
KR20180068262A (en) * 2016-12-13 2018-06-21 현대자동차주식회사 Method and system for realizing a traffic signal
CN106909937A (en) * 2017-02-09 2017-06-30 北京汽车集团有限公司 Traffic lights recognition methods, control method for vehicle, device and vehicle
CN108305475A (en) * 2017-03-06 2018-07-20 腾讯科技(深圳)有限公司 A kind of traffic lights recognition methods and device
CN108932861A (en) * 2018-09-05 2018-12-04 广州小鹏汽车科技有限公司 The based reminding method and system of traffic lights variation

Also Published As

Publication number Publication date
CN110473414A (en) 2019-11-19

Similar Documents

Publication Publication Date Title
CN110473414B (en) Vehicle driving path determining method, device and system
CN106767853B (en) Unmanned vehicle high-precision positioning method based on multi-information fusion
EP3842754A1 (en) System and method of detecting change in object for updating high-definition map
CN111874006B (en) Route planning processing method and device
US20220230449A1 (en) Automatically perceiving travel signals
JP4409035B2 (en) Image processing apparatus, singular part detection method, and recording medium recording singular part detection program
US10650256B2 (en) Automatically perceiving travel signals
RU2661963C1 (en) Device for calculating route of motion
US20180299893A1 (en) Automatically perceiving travel signals
CN111279154B (en) Navigation area identification and topology matching and associated systems and methods
CN102208013A (en) Scene matching reference data generation system and position measurement system
JP2020135874A (en) Local sensing-based autonomous navigation, associated system and method
KR101665599B1 (en) Augmented reality navigation apparatus for providing route guide service and method thereof
KR102604298B1 (en) Apparatus and method for estimating location of landmark and computer recordable medium storing computer program thereof
KR102565573B1 (en) Metric back-propagation for subsystem performance evaluation
CN111856963A (en) Parking simulation method and device based on vehicle-mounted looking-around system
KR102331000B1 (en) Method and computing device for specifying traffic light of interest in autonomous driving system
US11892300B2 (en) Method and system for determining a model of the environment of a vehicle
JP2020060369A (en) Map information system
WO2018195150A1 (en) Automatically perceiving travel signals
JP3857698B2 (en) Driving environment recognition device
KR20100086589A (en) A moving object tracking control system for a mobile robot using zigbee's rssi(received signal strength indication)
US20220245831A1 (en) Speed estimation systems and methods without camera calibration
KR102367138B1 (en) Method of detection crosswalk using lidar sensor and crosswalk detection device performing method
US20180300566A1 (en) Automatically perceiving travel signals

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 4 / F, building 1, No.14 Jiuxianqiao Road, Chaoyang District, Beijing 100020

Applicant after: Beijing Jingwei Hirain Technologies Co.,Inc.

Address before: 8 / F, block B, No. 11, Anxiang Beili, Chaoyang District, Beijing 100101

Applicant before: Beijing Jingwei HiRain Technologies Co.,Ltd.

GR01 Patent grant