CN114137980A - Control method and device, vehicle and readable storage medium - Google Patents

Control method and device, vehicle and readable storage medium

Info

Publication number
CN114137980A
CN114137980A (application CN202111434048.XA; granted as CN114137980B)
Authority
CN
China
Prior art keywords
target object
current vehicle
vehicle
moving direction
moving speed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111434048.XA
Other languages
Chinese (zh)
Other versions
CN114137980B (en)
Inventor
Zhang Mingda (张明达)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Xiaopeng Motors Technology Co Ltd
Original Assignee
Guangzhou Xiaopeng Autopilot Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Xiaopeng Autopilot Technology Co Ltd filed Critical Guangzhou Xiaopeng Autopilot Technology Co Ltd
Priority to CN202111434048.XA priority Critical patent/CN114137980B/en
Publication of CN114137980A publication Critical patent/CN114137980A/en
Application granted granted Critical
Publication of CN114137980B publication Critical patent/CN114137980B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02: Control of position or course in two dimensions
    • G05D1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0223: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G01S17/93: Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931: Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02: Control of position or course in two dimensions
    • G05D1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0257: Control of position or course in two dimensions specially adapted to land vehicles using a radar

Abstract

The application provides a control method. The control method comprises: obtaining a first position, a first moving direction and a first moving speed of a target object; determining whether the current vehicle will collide with the target object according to the first position, the first moving direction and the first moving speed together with a second position, a second moving direction and a second moving speed of the current vehicle; and, if so, adjusting the second moving speed so that the current vehicle and the target object do not collide. By detecting the positions, moving directions and moving speeds of both the target object and the current vehicle, the control method, control device, vehicle and non-volatile computer-readable storage medium can judge whether the two will collide and thereby prevent a collision with the target object in advance. This addresses the situation in which a user inside the vehicle, whose field of view is limited by the windows, fails to notice a distant, fast-approaching vehicle in time, or notices it but cannot react quickly enough to slow down.

Description

Control method and device, vehicle and readable storage medium
Technical Field
The present application relates to the field of vehicle control technologies, and in particular, to a control method, a control device, a vehicle, and a non-volatile computer-readable storage medium.
Background
With rising living standards, nearly every family can afford a car as a means of transport, so vehicle safety has attracted much attention. However, when a user is inside a car, the field of view is limited by the windows, making it difficult to notice a distant, fast-approaching vehicle in time; even when the user does notice it, there may be too little time to slow down, leading to safety accidents.
Disclosure of Invention
Embodiments of the present application provide a control method, a control apparatus, a vehicle, and a non-volatile computer-readable storage medium.
The control method of the embodiment of the application comprises the steps of obtaining a first position, a first moving direction and a first moving speed of a target object; determining whether the current vehicle and the target object collide according to the first position, the first moving direction and the first moving speed, and a second position, a second moving direction and a second moving speed of the current vehicle; and if so, adjusting the second moving speed so as to prevent the current vehicle and the target object from colliding.
The control device of the embodiment of the application comprises an obtaining module, a determining module and an adjusting module. The acquisition module is used for acquiring a first position, a first moving direction and a first moving speed of a target object; a determination module, configured to determine whether the current vehicle and the target object collide with each other according to the first position, the first moving direction, and the first moving speed, and a second position, a second moving direction, and a second moving speed of the current vehicle; and the adjusting module is used for adjusting the second moving speed when the current vehicle collides with the target object so as to prevent the current vehicle from colliding with the target object.
The vehicle of the embodiment of the present application includes a laser radar for acquiring a first position, a first moving direction, and a first moving speed of a target object, and a processor 30; the processor 30 is configured to determine whether the current vehicle and the target object collide according to the first position, the first moving direction and the first moving speed, and the second position, the second moving direction and the second moving speed of the current vehicle; and when the current vehicle collides with the target object, adjusting the second moving speed so that the current vehicle does not collide with the target object.
A non-transitory computer readable storage medium embodying a computer program which, when executed by one or more processors 30, causes the processors 30 to perform a control method. The control method comprises the steps of obtaining a first position, a first moving direction and a first moving speed of a target object; determining whether the current vehicle and the target object collide according to the first position, the first moving direction and the first moving speed, and a second position, a second moving direction and a second moving speed of the current vehicle; and if so, adjusting the second moving speed so as to prevent the current vehicle and the target object from colliding.
With the control method, control device, vehicle and non-volatile computer-readable storage medium of the embodiments of the present application, whether the target object and the current vehicle will collide can be determined by detecting the positions, moving directions and moving speeds of both. For example, the collision point of the two can be determined from their positions and moving directions, and the time at which each reaches the collision point can then be determined from their moving speeds; if the two reach the collision point at the same time, they will collide. The moving speed of the current vehicle is adjusted when it is judged that a collision will occur, so that a collision with the target object is prevented in advance. This avoids the situation in which a user inside the vehicle, whose field of view is limited by the windows, fails to notice a distant, fast-approaching vehicle in time, or notices it but cannot react quickly enough, which may lead to a safety accident.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a schematic flow chart of a control method according to certain embodiments of the present application;
FIG. 2 is a block schematic diagram of a control device according to certain embodiments of the present application;
FIG. 3 is a schematic plan view of a vehicle according to certain embodiments of the present application;
FIGS. 4 and 5 are schematic diagrams of a scenario of a control method according to some embodiments of the present application;
FIGS. 6 and 7 are schematic flow charts of control methods according to certain embodiments of the present application;
FIG. 8 is a schematic diagram of a scenario of a control method according to some embodiments of the present application;
FIG. 9 is a schematic flow chart diagram of a control method according to certain embodiments of the present application;
FIG. 10 is a schematic diagram of a connection between a processor 30 and a computer-readable storage medium according to some embodiments of the present application.
Detailed Description
Embodiments of the present application will be further described below with reference to the accompanying drawings. The same or similar reference numbers in the drawings identify the same or similar elements or elements having the same or similar functionality throughout. In addition, the embodiments of the present application described below in conjunction with the accompanying drawings are exemplary and are only for the purpose of explaining the embodiments of the present application, and are not to be construed as limiting the present application.
Referring to fig. 1 to 3, a control method according to an embodiment of the present disclosure includes the following steps:
011: acquiring a first position, a first moving direction and a first moving speed of a target object;
012: determining whether the current vehicle collides with the target object according to the first position, the first moving direction and the first moving speed, and the second position, the second moving direction and the second moving speed of the current vehicle; and
013: if so, adjusting the second moving speed so that the current vehicle and the target object do not collide with each other.
The control device 10 of the embodiment of the present application includes an acquisition module 11, a determination module 12, and an adjustment module 13. The obtaining module 11, the determining module 12 and the adjusting module 13 are configured to perform step 011, step 012 and step 013, respectively. Namely, the obtaining module 11 is configured to obtain a first position, a first moving direction and a first moving speed of the target object; the determining module 12 is configured to determine whether the current vehicle and the target object collide with each other according to the first position, the first moving direction and the first moving speed, and the second position, the second moving direction and the second moving speed of the current vehicle; the adjusting module 13 is configured to adjust the second moving speed when the current vehicle collides with the target object, so that the current vehicle does not collide with the target object.
The vehicle 100 of the embodiment of the present application includes a laser radar 20 and a processor 30, the laser radar 20 is configured to acquire a first position, a first moving direction, and a first moving speed of a target object; the processor 30 is configured to determine whether the current vehicle 100 and the target object collide with each other according to the first position, the first moving direction and the first moving speed, and the second position, the second moving direction and the second moving speed of the current vehicle 100; and adjusting the second moving speed so that the current vehicle 100 and the target object do not collide when the current vehicle 100 and the target object collide. That is, laser radar 20 is configured to perform step 011, and processor 30 is configured to perform step 012 and step 013.
Specifically, the laser radar 20 of the vehicle 100 is disposed at the vehicle head 40. When the vehicle 100 travels, the laser radar 20 at the vehicle head 40 can scan the scene in front of the vehicle 100 in real time to identify movable objects in the scene (such as pedestrians, electric bicycles, automobiles and the like). For example, the laser radar 20 can emit laser light within its field of view so as to acquire the position, moving speed, moving direction and the like of each object in the field of view. In one embodiment, the laser radar 20 emits laser light at a preset frequency to obtain multiple frames of point cloud information (or depth images); the position, moving direction and moving speed of the target object can then be determined from the change in the target object's position across the point cloud information (or depth images) and the time interval between frames (the time interval being determined from the preset frequency and the number of frames between the two samples).
The laser radar 20 may have a large field angle, generally greater than or equal to 120 degrees (°). When the laser radar 20 is disposed at the vehicle head 40 (e.g., at the middle of the vehicle head 40), it obtains a larger field of view than a user inside the vehicle, so that a target object that may collide is detected in advance and the probability of a safety accident is reduced.
Referring to fig. 4, in the present embodiment, the two laser radars 20 are respectively installed at two ends of the vehicle head 40, so as to cover a larger field of view in front of the vehicle 100, improve the overall field of view of the laser radars 20, further improve the probability of detecting a target object in advance, and reduce the probability of safety accidents.
In one example, the field angle of each of the two laser radars 20 is 120 degrees and their fields of view overlap, so that together they cover the entire field range in front of the vehicle 100; for example, the combined field angle of the two laser radars 20 is 180 degrees, i.e., their fields of view together cover a 180-degree range in front of the vehicle 100. It will be understood that the closer a direction is to the central field angle of the laser radar 20, the higher the detection accuracy. For example, the field of view of each laser radar 20 is divided into a region of interest (ROI in fig. 4) and two regions of non-interest (non-ROI in fig. 4); generally, the region of interest lies within ±40 degrees of the central field angle, the other two regions are regions of non-interest, and the detection accuracy in the region of interest is higher than in the regions of non-interest. When the vehicle 100 is provided with only a single laser radar 20, the region of interest covers the area directly in front of the vehicle 100, so as to preferentially ensure the detection accuracy for objects directly ahead. In the present application, the vehicle head 40 is provided with two laser radars 20, and one region of non-interest of each laser radar 20 is covered by the region of interest of the other; as a result, only two small regions of non-interest, from 0 to 20 degrees and from 160 to 180 degrees, remain in the combined field range, while the span from 20 to 160 degrees is entirely region of interest, thereby improving the detection accuracy of the two laser radars 20.
Because the two laser radars 20 are disposed at the two ends of the vehicle head 40, their fields of view leave a certain blind area A in front of the vehicle head 40, and the offset angle of the two laser radars 20 cannot be set too large: the larger the offset angle, the larger the range of the blind area A, so that the vehicle 100 cannot detect a nearby target object inside the blind area A. Here, the offset angle of a laser radar 20 is the angle between the direction of its central field angle and the straight-ahead direction of the vehicle 100 (such as 30° in fig. 4). In the embodiment of the present application, the offset angles of the two laser radars 20 are both 30 degrees, so that their combined field angle is exactly 180 degrees. Generally speaking, target objects likely to collide with the vehicle lie within this 180-degree field range of the vehicle 100; objects outside it usually travel in the same direction as the vehicle 100, so the probability of collision is low. Therefore, while keeping the combined field angle large, the small offset angle lets one region of non-interest of each laser radar 20 be covered by the region of interest of the other, maximizing the proportion of region of interest in the combined field range and improving the detection accuracy of the two laser radars 20.
In other embodiments, if the detection accuracy of the region of non-interest of the laser radar 20 can also meet the requirement, a larger offset angle may be set, for example, the offset angle is set to 45 degrees, so as to achieve coverage of a 215-degree field of view in front of the vehicle 100, and maximally improve the field of view that the laser radar 20 can detect. In another embodiment, the offset angle of the lidar 20 may be adjusted according to different scenarios, for example, when the vehicles 100 are traveling on a highway, the distance between the vehicles 100 is generally large, and there is generally no pedestrian, and at this time, the influence of the blind area a is small, and the offset angle may be set to be large, so as to improve the field range that the lidar 20 can detect as much as possible; when the vehicle 100 runs on an urban road, the distance between the vehicles 100 is generally small, and pedestrians generally exist, at this time, the influence of the blind area a is large, and the offset angle can be set to be small, so that the blind area a is reduced, and the vehicle 100 is prevented from colliding with a target object due to the fact that the target object in the blind area a cannot be detected.
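The field-of-view arithmetic above can be checked with a small interval-union sketch. The helper names below are illustrative; the ±30° offsets, 120° field angle and ±40° region of interest are the values given in the text:

```python
def lidar_intervals(offset_deg, fov_deg=120.0, roi_half_deg=40.0):
    """Field-of-view and region-of-interest intervals for one lidar,
    in degrees, with 0 = straight ahead of the vehicle."""
    half = fov_deg / 2.0
    fov = (offset_deg - half, offset_deg + half)
    roi = (offset_deg - roi_half_deg, offset_deg + roi_half_deg)
    return fov, roi

def union_length(intervals):
    """Total angular span covered by a list of (lo, hi) intervals."""
    merged = []
    for lo, hi in sorted(intervals):
        if merged and lo <= merged[-1][1]:
            merged[-1][1] = max(merged[-1][1], hi)
        else:
            merged.append([lo, hi])
    return sum(hi - lo for lo, hi in merged)

# Two lidars offset by -30° and +30° as in the embodiment.
fov_l, roi_l = lidar_intervals(-30.0)
fov_r, roi_r = lidar_intervals(+30.0)
combined_fov = union_length([fov_l, fov_r])  # 180°, as stated in the text
combined_roi = union_length([roi_l, roi_r])  # 140°: two 20° non-ROI slivers remain
```

With a 45° offset instead, the same helpers give a 210° combined field angle, consistent with the wider-coverage variant described next.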
Referring to fig. 5, after the laser radar 20 detects the first position W1, the first moving direction S1 and the first moving speed V1 of the target object, the processor 30 determines whether the current vehicle 100 collides with the target object according to the first position W1, the first moving direction S1 and the first moving speed V1, and the second position W2, the second moving direction S2 and the second moving speed V2 of the current vehicle 100; for example, the processor 30 can determine the position W3 of the collision point between the current vehicle 100 and the target object from the first position W1, the second position W2, the first moving direction S1 and the second moving direction S2, and then determine the time when the two reach the collision point from the second moving speed V2 of the current vehicle 100 and the first moving speed V1 of the target object, and if the two reach the collision point at the same time, determine that the two will collide.
When it is determined that a collision may occur, the moving speed of the current vehicle 100 is adjusted. For example, the second moving speed of the current vehicle 100 may be increased so that the current vehicle 100 accelerates through the collision point before the target object arrives, or reduced so that the target object passes through the collision point first; either way, a collision with the target object is prevented in advance.
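As an illustration only, the accelerate-or-decelerate choice described above might look like the following sketch. The function name and the `margin` parameter (a hypothetical minimum separation, in seconds, between the two arrival times) are assumptions; the patent does not specify concrete values:

```python
def adjust_speed(d_vehicle, v_vehicle, d_target, v_target, margin=1.0):
    """Return an adjusted vehicle speed so that vehicle and target object
    do not reach the collision point at (nearly) the same time.
    Distances in meters, speeds in m/s; `margin` is hypothetical."""
    t_target = d_target / v_target      # target's time to the collision point
    t_vehicle = d_vehicle / v_vehicle   # vehicle's time to the collision point
    if abs(t_vehicle - t_target) >= margin:
        return v_vehicle                # arrival times already separated: keep speed
    if t_target > margin:
        # Accelerate so the vehicle clears the point `margin` seconds early.
        return d_vehicle / (t_target - margin)
    # Otherwise decelerate so the target object passes the point first.
    return d_vehicle / (t_target + margin)
```

For example, with both parties 5 s from the collision point (vehicle 50 m away at 10 m/s, target 40 m away at 8 m/s), the sketch raises the vehicle speed to 12.5 m/s so it passes 1 s ahead of the target.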
With the control method, control device, vehicle 100 and non-volatile computer-readable storage medium of the embodiments of the present application, whether the target object and the current vehicle 100 will collide can be determined by detecting the positions, moving directions and moving speeds of both. For example, the collision point of the two can be determined from their positions and moving directions, and the time at which each reaches the collision point can then be determined from their moving speeds; if the two reach the collision point at the same time, they will collide. The moving speed of the current vehicle 100 is adjusted when it is judged that a collision may occur, so that a collision with the target object is prevented in advance. This avoids the situation in which a user inside the vehicle 100, whose field of view is limited by the windows, fails to notice a distant, fast-approaching vehicle 100 in time, or notices it but cannot react quickly enough, which may lead to a safety accident.
In addition, the vehicle 100 is provided with two laser radars 20, which cover a larger field range in front of the vehicle 100, improving the overall field range of the laser radars 20, further increasing the probability of detecting a potentially colliding target object in advance, and reducing the probability of safety accidents. Moreover, because the offset angles of the two laser radars 20 are both 30 degrees, their combined field angle is exactly 180 degrees and one region of non-interest of each laser radar 20 is covered by the region of interest of the other; thus, while the combined field angle remains large, the proportion of region of interest in the combined field range is maximized and the detection accuracy of the two laser radars 20 is improved.
Referring to fig. 2, 3 and 6, in some embodiments, step 011 includes the steps of:
0111: controlling the current vehicle 100 to emit laser; and
0112: and receiving laser reflected by the target object to acquire a first position, a first moving direction and a first moving speed.
In some embodiments, the obtaining module 11 is further configured to perform step 0111 and step 0112. Namely, the obtaining module 11 is further configured to control the current vehicle 100 to emit laser light; and receiving the laser reflected by the target object to acquire a first position, a first moving direction and a first moving speed.
In some embodiments, the processor 30 is also used to control the current vehicle 100 to emit laser light; and receiving the laser reflected by the target object to acquire a first position, a first moving direction and a first moving speed. That is, step 0111 and step 0112 may be implemented by processor 30.
Specifically, when acquiring the position, moving direction and moving speed of the target object, the vehicle 100 can be controlled to project laser light into the field of view of the laser radar 20; objects within the field of view then reflect the laser light back to the laser radar 20, which uses it to acquire the first position, first moving direction and first moving speed. For example, the laser radar 20 may emit a laser point cloud; after the point cloud is reflected by the target object, the laser radar 20 determines the target object's point cloud information (i.e., the first position, such as a three-dimensional position coordinate) from the received returns, and obtains the change of that position over time from the point cloud information at different moments, thereby determining the first moving direction and first moving speed of the target object. Alternatively, the laser radar 20 determines the distance between the target object and the laser radar 20 from the emission time and the time at which the reflected laser is received, and obtains a complete depth image of the scene within the field of view from the reflected light; it then determines the first position of the target object in the current scene, acquires depth images over consecutive frames, and determines the first moving direction and first moving speed from the change in the target object's position across the consecutive depth images and the time interval between frames. In this way the position, moving speed and moving direction of the target object can be acquired rapidly, improving the efficiency of collision judgment.
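The frame-to-frame estimate described above reduces to a difference quotient. A minimal sketch on 2-D ground-plane centroids (the helper name is an assumption, not the patent's implementation, and a real system would track the object across many frames and filter noise):

```python
import math

def estimate_motion(pos_prev, pos_curr, frame_interval_s):
    """Estimate speed (m/s) and heading (degrees, counterclockwise from
    the x-axis) of a tracked object from its centroid position in two
    consecutive lidar frames."""
    dx = pos_curr[0] - pos_prev[0]
    dy = pos_curr[1] - pos_prev[1]
    speed = math.hypot(dx, dy) / frame_interval_s   # displacement over time
    heading = math.degrees(math.atan2(dy, dx))       # direction of travel
    return speed, heading
```

For instance, a centroid moving from (0, 0) to (3, 4) meters in 0.5 s yields a speed of 10 m/s; the frame interval follows from the lidar's preset emission frequency, as the text notes.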
Referring to fig. 2, 3 and 7, in some embodiments, the control method further includes the following steps:
014: when the distance between the target object and the current vehicle 100 is equal to the blind area distance, controlling the current vehicle 100 to stop, wherein the blind area distance is determined according to the blind area A; and
015: and when the distance is greater than the safe distance, controlling the current vehicle 100 to start, wherein the safe distance is greater than or equal to the blind area distance.
In certain embodiments, the control apparatus 10 further includes a first control module 14 and a second control module 15, the first control module 14 configured to perform step 014 and the second control module 15 configured to perform step 015. That is, the first control module 14 is configured to control the current vehicle 100 to stop when the distance between the target object and the current vehicle 100 is equal to a blind area distance, which is determined according to the blind area a; and the second control module 15 is used for controlling the current vehicle 100 to start when the distance is greater than a safe distance, wherein the safe distance is greater than or equal to the blind area distance.
In some embodiments, the processor 30 is further configured to control the current vehicle 100 to stop when the distance between the target object and the current vehicle 100 is equal to a blind zone distance, the blind zone distance being determined according to the blind zone a; and when the distance is greater than the safe distance, controlling the current vehicle 100 to start, wherein the safe distance is greater than or equal to the blind area distance. That is, steps 014 and 015 may be implemented by the processor 30.
Specifically, referring to fig. 8, because the two laser radars 20 of the present embodiment are disposed at the two ends of the vehicle head 40, the separation between them leaves a blind area A of a predetermined range in front of the vehicle head 40; the larger the offset angle of the laser radars 20, the larger this range, and the laser radars 20 cannot detect a target object B inside the blind area A. When the target object B is partially inside the blind area A, the measured distance between the target object B and the current vehicle 100 may be inaccurate. For example, a detection distance D1 between the target object B and the current vehicle 100 can be calculated from the first position of the target object B detected by the laser radar 20 and the second position of the current vehicle 100; when the target object B is partially inside the blind area A, the actual distance D2 between the target object B and the current vehicle 100 is less than or equal to the detection distance D1. Therefore, when the detection distance D1 equals the blind area distance D3, the target object B may be partially inside the blind area A, and the vehicle 100 is stopped to prevent a collision with the target object B. The blind area distance D3 is determined from the blind area A, for example as the maximum distance from the center of the vehicle head 40 to the edge of the blind area A. When the detection distance D1 is greater than the blind area distance D3, it can be determined that the target object B is not inside the blind area A, so the vehicle 100 can start normally without risk of colliding with a target object B hidden in the blind area A.
Of course, to further ensure safety, a safety distance greater than or equal to the blind area distance D3 may be set; the vehicle 100 is allowed to start only when the detection distance is greater than the safety distance, thereby ensuring driving safety. As one example, the blind area distance D3 is 1 m and the safety distance may be set to 2 m.
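The stop/start logic of steps 014 and 015 forms a hysteresis band: stop at the blind-area distance, resume only beyond the larger safety distance. A sketch using the example values of 1 m and 2 m; the function name and the intermediate `"hold"` state are illustrative assumptions:

```python
def drive_command(detected_dist_m, blind_dist_m=1.0, safe_dist_m=2.0):
    """Stop/start decision with hysteresis (step 014 / step 015 sketch).
    `detected_dist_m` is the lidar-measured distance D1 to the target object."""
    if detected_dist_m <= blind_dist_m:
        return "stop"    # target may be partially inside the blind area A
    if detected_dist_m > safe_dist_m:
        return "start"   # safely outside both thresholds: vehicle may start
    return "hold"        # between thresholds: keep the current state
```

The gap between the two thresholds prevents the vehicle from oscillating between stopping and starting when the target object hovers near the blind-area boundary.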
Referring again to fig. 2, 3 and 9, in some embodiments, step 012 includes:
0121: when the first moving direction and the second moving direction intersect, determining the position of the collision point according to the first position, the second position, the first moving direction and the second moving direction;
0122: calculating a first distance between the collision point and the target object and a second distance between the collision point and the current vehicle 100 according to the first position, the second position and the position of the collision point;
0123: calculating a first time length according to the first distance and the first moving speed, and calculating a second time length according to the second distance and the second moving speed;
0124: when the time length difference between the first time length and the second time length is smaller than a preset threshold, determining that the current vehicle 100 collides with the target object;
0125: when the time length difference between the first time length and the second time length is greater than the preset threshold, determining that the current vehicle 100 and the target object do not collide.
In certain embodiments, the determining module 12 is further configured to perform step 0121, step 0122, step 0123, step 0124, and step 0125. That is, the determining module 12 is further configured to: determine the position of the collision point according to the first position, the second position, the first moving direction and the second moving direction when the first moving direction and the second moving direction intersect; calculate a first distance between the collision point and the target object and a second distance between the collision point and the current vehicle 100 according to the first position, the second position and the position of the collision point; calculate a first time length according to the first distance and the first moving speed, and a second time length according to the second distance and the second moving speed; determine that the current vehicle 100 collides with the target object when the time length difference between the first time length and the second time length is smaller than a preset threshold; and determine that the current vehicle 100 and the target object do not collide when the time length difference is greater than the preset threshold.
In some embodiments, the processor 30 is further configured to: determine the position of the collision point according to the first position, the second position, the first moving direction and the second moving direction when the first moving direction and the second moving direction intersect; calculate a first distance between the collision point and the target object and a second distance between the collision point and the current vehicle 100 according to the first position, the second position and the position of the collision point; calculate a first time length according to the first distance and the first moving speed, and a second time length according to the second distance and the second moving speed; determine that the current vehicle 100 collides with the target object when the time length difference between the first time length and the second time length is smaller than a preset threshold; and determine that the current vehicle 100 and the target object do not collide when the time length difference is greater than the preset threshold. That is, step 0121, step 0122, step 0123, step 0124, and step 0125 may be implemented by the processor 30.
Referring to fig. 5 again, specifically, when determining whether the target object and the current vehicle 100 will collide, it is first determined whether their moving directions intersect. For example, the first moving direction S1 of the target object starts from the first position W1, and the second moving direction S2 of the current vehicle 100 starts from the second position W2; it is then determined whether the first moving direction S1 and the second moving direction S2 converge or diverge, i.e., whether the two moving directions intersect. As shown in fig. 9, the first moving direction S1 and the second moving direction S2 converge, i.e., the two moving directions intersect. More specifically, the processor 30 may determine whether the first moving direction S1 and the second moving direction S2 intersect according to the trend of the distance between the first position W1 and the second position W2 over a plurality of consecutive times: when this distance gradually increases, the first moving direction S1 and the second moving direction S2 do not intersect; when it gradually decreases, they intersect.
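The convergence test described above can be sketched as follows. This is a minimal illustration under the assumption that positions are 2D points sampled at consecutive times; the function and variable names are hypothetical:

```python
# Minimal sketch of the distance-trend test: over several consecutive
# time steps, if the gap between the target's positions (W1) and the
# vehicle's positions (W2) shrinks monotonically, the two moving
# directions are treated as intersecting. Names are assumptions.
import math

def distances_converge(w1_samples, w2_samples):
    """w1_samples, w2_samples: lists of (x, y) positions at consecutive times."""
    dists = [math.dist(w1, w2) for w1, w2 in zip(w1_samples, w2_samples)]
    # Directions are considered to intersect when the distance decreases
    # at every sampled instant.
    return all(b < a for a, b in zip(dists, dists[1:]))
```

For example, a target moving from (0, 0) toward a vehicle approaching from (10, 0) yields a strictly decreasing distance sequence, so the directions are treated as intersecting.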
When it is determined that the first moving direction S1 and the second moving direction S2 do not intersect with each other, it may be determined that the target object and the current vehicle 100 do not collide with each other, and thus, the second moving speed of the current vehicle 100 may not need to be adjusted.
When it is determined that the first moving direction and the second moving direction intersect, the target object and the current vehicle 100 may collide. At this time, the processor 30 may accurately determine the position W3 of the collision point according to the first position W1, the second position W2, the first moving direction S1, and the second moving direction S2. The processor 30 then calculates a first distance d1 between the collision point and the target object (i.e., the distance between the position W3 of the collision point and the first position W1) and a second distance d2 between the collision point and the current vehicle 100 (i.e., the distance between the position W3 of the collision point and the second position W2). Next, the processor 30 calculates the first time length required for the target object to move to the collision point based on the first distance d1 and the first moving speed, and the second time length required for the current vehicle 100 to move to the collision point based on the second distance d2 and the second moving speed. It can be understood that when the first time length and the second time length are equal, the target object and the current vehicle 100 arrive at the collision point at the same time and therefore collide; when the two time lengths are not equal, they do not arrive at the collision point at the same time and do not collide.
Of course, since the target object and the current vehicle 100 each have a certain length, they may still collide even if the first time length and the second time length are not exactly equal. Therefore, a preset threshold may be set: the processor 30 determines whether the time length difference between the first time length and the second time length is greater than the preset threshold; if so, it determines that the current vehicle 100 and the target object do not collide, and when the difference is less than or equal to the preset threshold, it determines that the current vehicle 100 and the target object collide. The preset threshold may be determined according to the first moving speed, the second moving speed, and the length of the current vehicle 100, for example according to the greater of the first moving speed and the second moving speed together with the length of the current vehicle 100, so that an accurate preset threshold is determined in real time and the accuracy of collision detection is ensured.
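Steps 0121 to 0125 can be sketched as follows. This is a sketch under stated assumptions, not the claimed implementation: motion is modeled as 2D rays, all function names are hypothetical, and the preset threshold is assumed to take the example form of the vehicle length divided by the greater of the two moving speeds:

```python
# Sketch of steps 0121-0125: intersect the two motion rays to find the
# collision point W3, compute each party's distance and travel time to
# it, and compare the time difference with a preset threshold. The
# threshold formula (vehicle length / faster speed) is an assumption
# based on the example in the text.
import math

def cross(a, b):
    """2D cross product of vectors a and b."""
    return a[0] * b[1] - a[1] * b[0]

def collision_point(w1, s1, w2, s2):
    """Intersect rays w + t*s (t >= 0); return the point, or None if the
    rays are parallel or the intersection lies behind either party."""
    denom = cross(s1, s2)
    if abs(denom) < 1e-9:            # parallel: directions never intersect
        return None
    r = (w2[0] - w1[0], w2[1] - w1[1])
    t1 = cross(r, s2) / denom        # parameter along the target's ray
    t2 = cross(r, s1) / denom        # parameter along the vehicle's ray
    if t1 < 0 or t2 < 0:
        return None
    return (w1[0] + t1 * s1[0], w1[1] + t1 * s1[1])

def will_collide(w1, s1, v1, w2, s2, v2, vehicle_length):
    """Compare the two travel times to the collision point against the
    assumed preset threshold."""
    w3 = collision_point(w1, s1, w2, s2)
    if w3 is None:
        return False
    d1 = math.dist(w3, w1)           # first distance: collision point to target
    d2 = math.dist(w3, w2)           # second distance: collision point to vehicle
    t_target, t_vehicle = d1 / v1, d2 / v2
    threshold = vehicle_length / max(v1, v2)
    return abs(t_target - t_vehicle) <= threshold
```

For example, a target at (0, 10) moving at 1 m/s toward the origin and a 5 m vehicle at (-10, 0) also moving at 1 m/s toward the origin reach the collision point (0, 0) simultaneously, so a collision is reported; doubling the vehicle speed makes the time difference exceed the threshold and no collision is reported.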
Referring to fig. 10, an embodiment of the present application further provides a non-volatile computer-readable storage medium 300 storing a computer program 302. When the computer program 302 is executed by one or more processors 30, the processors 30 execute the control method of any of the above embodiments.
For example, referring to fig. 1, the computer program 302, when executed by the one or more processors 30, causes the processors 30 to perform the steps of:
011: acquiring a first position, a first moving direction and a first moving speed of a target object;
012: determining whether the current vehicle collides with the target object according to the first position, the first moving direction and the first moving speed, and the second position, the second moving direction and the second moving speed of the current vehicle; and
013: if so, adjusting the second moving speed so that the current vehicle and the target object do not collide with each other.
For another example, referring to fig. 6, when the computer program 302 is executed by the one or more processors 30, the processors 30 may further perform the following steps:
0111: controlling the current vehicle 100 to emit laser; and
0112: receiving the laser reflected by the target object to acquire the first position, the first moving direction and the first moving speed.
In the description herein, references to the description of the terms "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example" or "some examples" or the like mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Any process or method description in a flow chart or otherwise described herein may be understood as representing a module, segment, or portion of code that includes one or more program modules for implementing specific logical functions or steps of the process. The scope of the preferred embodiments of the present application includes additional implementations in which functions may be executed out of the order shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those skilled in the art of the present application.
Although embodiments of the present application have been shown and described above, it is to be understood that the above embodiments are exemplary and not to be construed as limiting the present application, and that changes, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (13)

1. A control method, comprising:
acquiring a first position, a first moving direction and a first moving speed of a target object;
determining whether the current vehicle and the target object collide according to the first position, the first moving direction and the first moving speed, and a second position, a second moving direction and a second moving speed of the current vehicle; and
if so, adjusting the second moving speed so that the current vehicle and the target object do not collide with each other.
2. The control method according to claim 1, wherein the acquiring the first position, the first moving direction, and the first moving speed of the target object includes:
controlling the current vehicle to emit laser; and
receiving laser light reflected by the target object to acquire the first position, the first moving direction and the first moving speed.
3. The control method according to claim 2, wherein one or more lidars for emitting the laser light are mounted on a head of the current vehicle.
4. The control method according to claim 3, wherein the number of the laser radars is two, the two laser radars are respectively arranged at two ends of the vehicle head, and the two laser radars form a blind area in a predetermined range in front of the vehicle head, and the control method further comprises:
when the distance between the target object and the current vehicle is equal to a blind area distance, controlling the current vehicle to stop, wherein the blind area distance is determined according to the blind area; and
when the distance is greater than a safe distance, controlling the current vehicle to start, wherein the safe distance is greater than or equal to the blind area distance.
5. The control method according to claim 1, wherein the determining whether the current vehicle and the target object collide based on the first position, the first moving direction, and the first moving speed, and the second position, the second moving direction, and the second moving speed of the current vehicle includes:
when the first moving direction and the second moving direction are intersected, determining the position of a collision point according to the first position, the second position, the first moving direction and the second moving direction;
calculating a first distance between the collision point and the target object and a second distance between the collision point and the current vehicle according to the first position, the second position and the position of the collision point;
calculating a first time length according to the first distance and the first moving speed, and calculating a second time length according to the second distance and the second moving speed;
when the time length difference value of the first time length and the second time length is smaller than a preset threshold value, determining that the current vehicle collides with the target object;
when the time length difference value between the first time length and the second time length is greater than the preset threshold value, determining that the current vehicle and the target object do not collide.
6. The control method according to claim 1, characterized by further comprising:
determining that the current vehicle and the target object do not collide when the first moving direction and the second moving direction do not intersect.
7. The control method of claim 5, wherein said adjusting said second moving speed comprises:
increasing or decreasing the second moving speed so that the time length difference value is greater than the preset threshold value.
8. A control device, comprising:
the acquisition module is used for acquiring a first position, a first moving direction and a first moving speed of a target object;
a determination module, configured to determine whether the current vehicle and the target object collide with each other according to the first position, the first moving direction, and the first moving speed, and a second position, a second moving direction, and a second moving speed of the current vehicle; and
an adjusting module, configured to adjust the second moving speed when the current vehicle collides with the target object, so that the current vehicle and the target object do not collide.
9. A vehicle, characterized in that the vehicle comprises: a lidar for acquiring a first position, a first moving direction and a first moving speed of a target object; and a processor for determining whether the current vehicle and the target object collide according to the first position, the first moving direction and the first moving speed, and a second position, a second moving direction and a second moving speed of the current vehicle, and for adjusting the second moving speed when the current vehicle collides with the target object, so that the current vehicle does not collide with the target object.
10. The vehicle of claim 9, characterized in that the head of the current vehicle is fitted with one or more lidars for emitting laser light.
11. The vehicle of claim 10, wherein one lidar is mounted at each end of the head, and the fields of view of the two lidars have an overlapping portion.
12. The vehicle of claim 11, wherein the field of view of each of the two lidars is 120 degrees, and the combined field of view of the two lidars is 180 degrees.
13. A non-transitory computer-readable storage medium containing a computer program which, when executed by a processor, causes the processor to execute the control method of any one of claims 1 to 7.
CN202111434048.XA 2021-11-29 2021-11-29 Control method and device, vehicle and readable storage medium Active CN114137980B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111434048.XA CN114137980B (en) 2021-11-29 2021-11-29 Control method and device, vehicle and readable storage medium


Publications (2)

Publication Number Publication Date
CN114137980A true CN114137980A (en) 2022-03-04
CN114137980B CN114137980B (en) 2022-12-13

Family

ID=80388951

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111434048.XA Active CN114137980B (en) 2021-11-29 2021-11-29 Control method and device, vehicle and readable storage medium

Country Status (1)

Country Link
CN (1) CN114137980B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109817021A (en) * 2019-01-15 2019-05-28 北京百度网讯科技有限公司 A kind of laser radar trackside blind area traffic participant preventing collision method and device
CN110059574A (en) * 2019-03-23 2019-07-26 浙江交通职业技术学院 A kind of vehicle blind zone detection method
CN110871793A (en) * 2018-08-31 2020-03-10 现代自动车株式会社 Collision avoidance control system and method
CN111210660A (en) * 2018-11-22 2020-05-29 奥迪股份公司 Collision early warning method and device, computer equipment, storage medium and vehicle
CN111833648A (en) * 2019-04-18 2020-10-27 上海汽车集团股份有限公司 Vehicle collision early warning method and device
CN111942282A (en) * 2019-05-17 2020-11-17 比亚迪股份有限公司 Vehicle and driving blind area early warning method, device and system thereof and storage medium
CN112009479A (en) * 2019-05-31 2020-12-01 通用汽车环球科技运作有限责任公司 Method and apparatus for adjusting field of view of sensor
CN212766009U (en) * 2020-04-15 2021-03-23 丰疆智能科技股份有限公司 Vehicle radar monitoring system


Also Published As

Publication number Publication date
CN114137980B (en) 2022-12-13

Similar Documents

Publication Publication Date Title
US10829120B2 (en) Proactive safe driving for an automated vehicle
US10710580B2 (en) Tailgating situation handling by an automated driving vehicle
US10977944B2 (en) Apparatus and method for supporting collision avoidance of vehicle
US9223311B2 (en) Vehicle driving support control apparatus
US10576973B2 (en) Driving assistance device and driving assistance method
JP6158523B2 (en) Inter-vehicle distance control device
US9623869B2 (en) Vehicle driving support control apparatus
US9499171B2 (en) Driving support apparatus for vehicle
KR20200102004A (en) Apparatus, system and method for preventing collision
US9020747B2 (en) Method for recognizing a turn-off maneuver
US10793096B2 (en) Vehicle control device with object detection
CN109204311B (en) Automobile speed control method and device
US20030088361A1 (en) Monitor system of vehicle outside and the method thereof
CN108528433B (en) Automatic control method and device for vehicle running
CN110045736B (en) Bend obstacle avoiding method based on unmanned aerial vehicle
US20030097237A1 (en) Monitor system of vehicle outside and the method thereof
US8214087B2 (en) Object specifying device for a vehicle
US11897458B2 (en) Collision avoidance apparatus for vehicle
KR20080022748A (en) Collision avoidance method using stereo camera
WO2018070335A1 (en) Movement detection device, movement detection method
CN109689459B (en) Vehicle travel control method and travel control device
US11433888B2 (en) Driving support system
CN114137980B (en) Control method and device, vehicle and readable storage medium
US10777077B2 (en) Vehicle control device, vehicle control method, and storage medium
CN108528449B (en) Automatic control method and device for vehicle running

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20240304

Address after: 510000 No.8 Songgang street, Cencun, Tianhe District, Guangzhou City, Guangdong Province

Patentee after: GUANGZHOU XIAOPENG MOTORS TECHNOLOGY Co.,Ltd.

Country or region after: China

Address before: Room 46, room 406, No.1, Yichuang street, Zhongxin knowledge city, Huangpu District, Guangzhou City, Guangdong Province 510000

Patentee before: Guangzhou Xiaopeng Automatic Driving Technology Co.,Ltd.

Country or region before: China
