CN115534991A - Vehicle warning method, device, equipment and storage medium - Google Patents


Info

Publication number
CN115534991A
CN115534991A (application CN202211360990.0A)
Authority
CN
China
Prior art keywords
angle
vehicle
relative
horizontal plane
plane
Prior art date
Legal status
Pending
Application number
CN202211360990.0A
Other languages
Chinese (zh)
Inventor
汪莹 (Wang Ying)
Current Assignee
Beijing Wutong Chelian Technology Co Ltd
Original Assignee
Beijing Wutong Chelian Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Wutong Chelian Technology Co Ltd
Priority claimed from CN202211360990.0A
Publication of CN115534991A
Legal status: Pending

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02 Estimation or calculation of such parameters related to ambient conditions
    • B60W40/06 Road conditions
    • B60W40/076 Slope angle of the road
    • B60W2050/143 Alarm means
    • B60W2552/00 Input parameters relating to infrastructure
    • B60W2552/15 Road slope, i.e. the inclination of a road segment in the longitudinal direction
    • B60W2552/50 Barriers

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Human Computer Interaction (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application discloses a vehicle warning method, device, equipment, and storage medium, belonging to the technical field of safe driving of vehicles. The method comprises the following steps: determining the angle, relative to a reference surface, of the ray emitted by a ray sensor mounted on the vehicle; if, based on that angle, neither an obstacle nor the reference surface is detected in a reference area in the driving direction of the vehicle, acquiring the angle of the reference surface relative to the horizontal plane as detected by a gravity sensor mounted on the vehicle; and performing a warning operation if the angle of the reference surface relative to the horizontal plane is greater than zero. When no obstacle and no reference surface exist in the reference area in the driving direction of the vehicle, and the angle of the reference surface relative to the horizontal plane detected by the gravity sensor is greater than zero, it can be determined that the vehicle is about to reach the top of a slope, and the terminal gives a warning. If an obstacle is approaching the hilltop from the opposite direction, the warning operation reduces the probability of a collision between the vehicle and the obstacle.

Description

Vehicle warning method, device, equipment and storage medium
Technical Field
Embodiments of the present application relate to the technical field of safe driving of vehicles, and in particular to a vehicle warning method, device, equipment, and storage medium.
Background
With the continuous development of safe-driving technology, the warning function of vehicles is receiving more and more attention. For example, if the ambient light is weak while the vehicle is travelling, or the vehicle is in a blind spot and an obstacle ahead cannot be detected in time, the probability of an accident increases. Therefore, how to use the warning function of the vehicle to reduce the probability of accidents is a problem to be solved.
Disclosure of Invention
The embodiment of the application provides a vehicle warning method, device, equipment and storage medium, which can utilize the warning function of a vehicle to warn and reduce the accident probability. The technical scheme is as follows:
in one aspect, an embodiment of the present application provides a vehicle warning method, where the method includes:
determining an angle of a ray emitted by a ray sensor relative to a reference surface, wherein the ray sensor is installed on a vehicle and is positioned in front of the vehicle in the driving direction, and the reference surface is a plane where the vehicle is located;
detecting, according to the angle of the ray relative to the reference plane, whether an obstacle exists in a reference area in the driving direction of the vehicle and whether the reference plane is detected;
if no obstacle exists in a reference area in the driving direction of the vehicle and the reference surface is not detected, acquiring an angle of the reference surface relative to a horizontal plane, which is detected by a gravity sensor installed on the vehicle;
and performing warning operation based on the fact that the angle of the reference surface relative to the horizontal plane is larger than zero.
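The decision flow of the four steps above can be sketched as follows. This is an illustrative reading of the claim, not code from the patent; the function and parameter names are hypothetical, and the behavior in the non-warning cases follows the implementations described later in the description.

```python
def vehicle_warning_step(obstacle_in_area: bool,
                         surface_detected: bool,
                         slope_angle_deg: float) -> str:
    """One decision cycle of the claimed warning method.

    obstacle_in_area: whether the ray sensor found an obstacle in the
                      reference area in the driving direction.
    surface_detected: whether the ray sensor detected the reference plane.
    slope_angle_deg:  angle of the reference plane relative to the
                      horizontal plane, from the gravity sensor.
    """
    if obstacle_in_area:
        return "keep_distance"      # control speed, keep distance from obstacle
    if surface_detected:
        return "continue_driving"   # road ahead visible, keep current state
    # Neither obstacle nor reference plane detected: consult the gravity sensor.
    if slope_angle_deg > 0:
        return "warn"               # vehicle is about to reach the hilltop
    return "continue_driving"
```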
In one possible implementation, the determining an angle of the radiation emitted by the radiation sensor with respect to a reference plane includes:
detecting the angle of the reference surface relative to the horizontal plane according to the gravity sensor;
determining an angle of the ray relative to the reference plane based on a difference between the angle of the horizontal plane and the angle of the reference plane relative to the horizontal plane.
In one possible implementation, the determining the angle of the ray relative to the reference plane based on the difference between the angle of the horizontal plane and the angle of the reference plane relative to the horizontal plane includes:
determining the angle of the ray relative to the reference plane according to the following formula based on the difference between the angle of the horizontal plane and the angle of the reference plane relative to the horizontal plane;
β = (p − α) / k
wherein β represents the angle of the ray relative to the reference plane; p represents the angle of the horizontal plane; α represents the angle of the reference plane relative to the horizontal plane, with α ≥ 0°; and k is a proportionality coefficient.
In one possible implementation, the determining the angle of the ray relative to the reference plane based on the difference between the angle of the horizontal plane and the angle of the reference plane relative to the horizontal plane includes:
determining an angle of the ray relative to the reference plane based on a difference between the angle of the horizontal plane and the angle of the reference plane relative to the horizontal plane according to the following formula;
β = (p − α′) / k
wherein α′ represents the angle of the reference plane relative to the horizontal plane, and α′ < 0°.
In one possible implementation, after detecting whether an obstacle exists in a reference area in the driving direction of the vehicle and detecting the reference plane according to an angle of the ray relative to the reference plane, the method further includes:
if an obstacle exists in a reference area in the driving direction of the vehicle, controlling the speed of the vehicle so that the vehicle keeps a distance from the obstacle.
In one possible implementation, after detecting whether an obstacle exists in a reference area in the driving direction of the vehicle and detecting the reference plane according to an angle of the ray relative to the reference plane, the method further includes:
and if no obstacle exists in the reference area in the driving direction of the vehicle and the reference surface is detected, controlling the vehicle to continue driving according to the current driving state.
In one possible implementation, the warning operation includes at least one of a whistle warning and a voice broadcast.
In another aspect, a vehicle warning device is provided, the device including:
the device comprises a determining module, a judging module and a judging module, wherein the determining module is used for determining the angle of a ray emitted by a ray sensor relative to a reference surface, the ray sensor is installed on a vehicle and is positioned in front of the vehicle in the driving direction, and the reference surface is a plane where the vehicle is located;
the detection module is used for detecting whether an obstacle exists in a reference area in the driving direction of the vehicle and detecting the reference surface according to the angle of the ray relative to the reference surface;
the acquisition module is used for acquiring an angle of a reference surface relative to a horizontal plane, which is detected by a gravity sensor installed on the vehicle, if no obstacle exists in a reference area in the driving direction of the vehicle and the reference surface is not detected;
and the warning module is used for performing warning operation based on the fact that the angle of the reference surface relative to the horizontal plane is larger than zero.
In a possible implementation manner, the determining module is configured to detect an angle of the reference plane with respect to the horizontal plane according to the gravity sensor;
determining an angle of the ray relative to the reference plane based on a difference between the angle of the horizontal plane and the angle of the reference plane relative to the horizontal plane.
In a possible implementation manner, the determining module is configured to determine the angle of the ray relative to the reference plane according to the following formula based on a difference between the angle of the horizontal plane and the angle of the reference plane relative to the horizontal plane;
β = (p − α) / k
wherein β represents the angle of the ray relative to the reference plane; p represents the angle of the horizontal plane; α represents the angle of the reference plane relative to the horizontal plane, with α ≥ 0°; and k is a proportionality coefficient.
In a possible implementation manner, the determining module is configured to determine the angle of the ray relative to the reference plane according to the following formula based on a difference between the angle of the horizontal plane and the angle of the reference plane relative to the horizontal plane;
β = (p − α′) / k
wherein α′ represents the angle of the reference plane relative to the horizontal plane, and α′ < 0°.
In one possible implementation, the apparatus further includes:
a control module for controlling a speed of the vehicle to keep a distance between the vehicle and an obstacle if the obstacle exists in a reference area in the driving direction of the vehicle.
In a possible implementation manner, the control module is further configured to control the vehicle to continue to travel according to a current traveling state if no obstacle exists in a reference area in the traveling direction of the vehicle and the reference plane is detected.
In one possible implementation, the warning operation includes at least one of a whistle warning and a voice broadcast.
In another aspect, a computer device is provided, which includes a processor and a memory, wherein at least one computer program is stored in the memory, and the at least one computer program is loaded by the processor and executed to cause the computer device to implement any one of the above vehicle warning methods.
In another aspect, a computer-readable storage medium is provided, in which at least one computer program is stored, the at least one computer program being loaded and executed by a processor to make a computer implement any of the above-mentioned vehicle warning methods.
In another aspect, a computer program product or computer program is also provided, comprising computer instructions stored in a computer readable storage medium. The computer instructions are read from the computer readable storage medium by a processor of a computer device, and the computer instructions are executed by the processor to cause the computer device to execute any one of the vehicle warning methods.
The technical scheme provided by the embodiment of the application at least has the following beneficial effects:
in the embodiment of the application, if the fact that no obstacle or reference surface exists in the reference area in the driving direction of the vehicle is detected according to the angle of the ray relative to the reference surface, and the angle of the reference surface relative to the horizontal plane is larger than zero, it can be determined that the vehicle is about to reach the top of a slope, and the terminal warns at the moment. If an obstacle is about to reach the top of the slope in the opposite direction, the warning operation can reduce the probability of collision between the vehicle and the obstacle due to the fact that the vehicle and the obstacle are in the blind sight area.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below are only some embodiments of the present application; other drawings can be obtained from them by those skilled in the art without creative effort.
FIG. 1 is a schematic illustration of an implementation environment provided by an embodiment of the present application;
FIG. 2 is a flow chart of a method for warning a vehicle according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of a case where an angle of a reference plane with respect to a horizontal plane is equal to zero according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a case where an angle of a reference plane with respect to a horizontal plane is greater than zero according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a case where an angle of a reference plane with respect to a horizontal plane is less than zero according to an embodiment of the present application;
FIG. 6 is a diagram of logical decisions on ray angles and directions provided by an embodiment of the present application;
fig. 7 is a schematic diagram of a background control model according to an embodiment of the present application;
FIG. 8 is a logic diagram illustrating a method for warning a vehicle according to an embodiment of the present disclosure;
FIG. 9 is a schematic view of a vehicle warning device according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a server provided in an embodiment of the present application;
fig. 11 is a schematic structural diagram of a terminal according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
An embodiment of the present application provides a vehicle warning method, please refer to fig. 1, which shows a schematic diagram of an implementation environment of the method provided in the embodiment of the present application. Taking the method applied to a vehicle driving scenario as an example, the implementation environment may include: a terminal 11 and a server 12.
The terminal 11 is installed with an application program or a web page capable of performing a vehicle warning operation, and when the application program or the web page needs to perform the vehicle warning operation, the method provided by the embodiment of the present application may be applied to perform the vehicle warning operation. The server 12 may store the information of the vehicle warning, and the terminal 11 may obtain the information of the vehicle warning from the server 12. Of course, the acquired information may be stored in the terminal 11.
Alternatively, the terminal 11 may be a smart device such as a vehicle-mounted terminal, a smart car machine, or the like. The server 12 may be a server, a server cluster composed of a plurality of servers, or a cloud computing service center. The terminal 11 establishes a communication connection with the server 12 through a wired or wireless network.
It should be understood by those skilled in the art that the above-mentioned terminal 11 and server 12 are only examples; other existing or future terminals or servers, if suitable for the present application, also fall within the scope of protection of the present application.
The embodiment of the application provides a vehicle warning method, which can be applied to the implementation environment shown in fig. 1. As shown in fig. 2, taking the application of the method to the vehicle-mounted terminal as an example, the method includes steps 201 to 204.
In step 201, an angle of a ray emitted by a ray sensor relative to a reference plane is determined, the ray sensor is mounted on a vehicle and located in front of the vehicle in a driving direction, and the reference plane is a plane where the vehicle is located.
The number of rays emitted by the ray sensor is not limited in the embodiments of the present application; according to the number of rays emitted, a ray sensor can be classified as a single-line sensor or a multi-line sensor. A multi-line sensor may have 4, 8, 16, or more lines. Whichever type is used, the ray sensor in the embodiments of the present application is mounted on the vehicle, facing forward in the driving direction. The vehicle is located on the reference plane; for example, if the vehicle travels from west to east, the driving direction is east, and the ray sensor should be mounted at the east-facing front of the vehicle so that it can emit rays from west to east.
In one possible implementation, determining an angle of a radiation emitted by the radiation sensor with respect to a reference plane includes: detecting the angle of the reference surface relative to the horizontal plane according to the gravity sensor; the angle of the ray relative to the reference plane is determined based on the difference between the angle of the horizontal plane and the angle of the reference plane relative to the horizontal plane.
In the embodiment of the application, the gravity sensor is installed on the vehicle, and the angle of the reference plane where the vehicle is located relative to the horizontal plane is further calculated by measuring the acceleration caused by the gravity of the vehicle. After the angle of the reference plane relative to the horizontal plane is calculated by the gravity sensor, the angle is uploaded to the terminal, and the terminal can further determine the angle of the ray relative to the reference plane.
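One common way to recover the pitch of the plane from an accelerometer-type gravity sensor is sketched below. This is standard accelerometer geometry, not code from the patent; the axis convention (longitudinal and vertical components of gravity) is an assumption.

```python
import math

def slope_from_gravity(ax: float, az: float) -> float:
    """Pitch of the surface relative to the horizontal plane, in degrees.

    ax: gravity component along the vehicle's longitudinal axis (m/s^2)
    az: gravity component along the vehicle's vertical axis (m/s^2)
    On a horizontal surface ax is about 0 and az about 9.81, giving 0 degrees.
    """
    return math.degrees(math.atan2(ax, az))
```

On a 30° uphill slope, gravity splits into ax = 9.81·sin 30° and az = 9.81·cos 30°, and the function returns 30°.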
In one possible implementation, determining the angle of the ray relative to the reference plane based on the difference between the angle of the horizontal plane and the angle of the reference plane relative to the horizontal plane includes:
determining the angle of the ray relative to the reference plane according to the following formula (1) based on the difference between the angle of the horizontal plane and the angle of the reference plane relative to the horizontal plane;
β = (p − α) / k        (1)
wherein β represents the angle of the ray relative to the reference plane; p represents the angle of the horizontal plane; α represents the angle of the reference plane relative to the horizontal plane, with α ≥ 0°; and k is a proportionality coefficient.
The embodiment of the present application does not limit the size of the angle p of the horizontal plane measured by the gravity sensor; for example, the angle of the horizontal plane may be 0°. In formula (1), the proportionality coefficient satisfies k ≥ 1, where k can be set empirically or flexibly according to the application scenario, which is not limited by the embodiments of the present application. For example, the proportionality coefficient may be 2, 3, etc. α denotes the angle of the reference plane relative to the horizontal plane; α = 0° means that the reference plane on which the vehicle is located is parallel to the horizontal plane. This situation is shown in fig. 3, a schematic view of the case where the angle of the reference plane relative to the horizontal plane is equal to zero. In fig. 3, 301 is the vehicle and 302 is the horizontal plane, which is also the reference plane on which the vehicle is located. The vehicle in fig. 3 travels from west to east, and the ray sensor mounted on it emits rays from west to east.
Similarly, α > 0° means that the angle of the reference plane on which the vehicle is located relative to the horizontal plane is greater than zero, i.e., the vehicle is on an uphill slope. This situation is shown in fig. 4, a schematic view of the case where the angle of the reference plane relative to the horizontal plane is greater than zero. In fig. 4, 401 denotes the vehicle, 402 the reference plane on which the vehicle is located, and 403 the horizontal plane. In fig. 4, α > 0° and the vehicle is in an uphill state; β is the angle of the ray relative to the reference plane, and β < 0 means the ray is tilted downward relative to the reference plane.
Taking p as 0 ° and k as 3 as an example, if the gravity sensor located on the vehicle detects that the angle α of the reference plane where the vehicle is located with respect to the horizontal plane is 30 °, and after the gravity sensor uploads the angle α to the terminal, the terminal may calculate the angle β = -10 ° of the ray with respect to the reference plane according to the formula (1), where the angle represents that the ray emitted by the ray sensor should be shifted downward by 10 ° on the basis of being parallel to the reference plane.
In one possible implementation, determining the angle of the ray relative to the reference plane based on the difference between the angle of the horizontal plane and the angle of the reference plane relative to the horizontal plane includes:
determining the angle of the ray relative to the reference plane according to the following formula (2) based on the difference between the angle of the horizontal plane and the angle of the reference plane relative to the horizontal plane;
β = (p − α′) / k        (2)
wherein α′ denotes the angle of the reference plane relative to the horizontal plane, and α′ < 0°. α′ < 0° means that the angle of the reference plane on which the vehicle is located relative to the horizontal plane is less than zero, i.e., the vehicle is on a downhill slope. This situation is shown in fig. 5, a schematic view of the case where the angle of the reference plane relative to the horizontal plane is less than zero. In fig. 5, 501 denotes the vehicle, 502 the reference plane on which the vehicle is located, and 503 the horizontal plane. In fig. 5, α′ < 0°, so the vehicle is in a downhill state; β is the angle of the ray relative to the reference plane, and β > 0 means the ray is tilted upward relative to the reference plane.
Taking p as 0 ° and k as 3 as an example, if the gravity sensor located on the vehicle detects that the angle α of the reference plane on which the vehicle is located with respect to the horizontal plane is-30 °, the angle β =10 ° of the ray with respect to the reference plane can be calculated according to the formula (2), which indicates that the ray emitted by the ray sensor should be shifted 10 ° upward on the basis of being parallel to the reference plane.
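Formulas (1) and (2) share the same form, β = (p − α)/k, which is consistent with both worked examples in the text (α = 30° giving β = −10°, and α′ = −30° giving β = 10°, with p = 0° and k = 3). A minimal sketch, with illustrative names:

```python
def ray_offset_angle(alpha_deg: float, p_deg: float = 0.0, k: float = 3.0) -> float:
    """Angle beta of the emitted ray relative to the reference plane.

    alpha_deg: slope of the reference plane relative to the horizontal
               plane (positive uphill, negative downhill), as reported
               by the gravity sensor.
    p_deg:     angle of the horizontal plane (typically 0 degrees).
    k:         proportionality coefficient, k >= 1 per the text.
    """
    if k < 1:
        raise ValueError("proportionality coefficient k must be >= 1")
    return (p_deg - alpha_deg) / k
```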
FIG. 6 is a logic decision diagram for the ray angle and direction. In fig. 6, step 601 inputs the angle of the reference plane relative to the horizontal plane, i.e., the slope α. After the gravity sensor detects α, it sends α to the terminal. The terminal then executes step 602, calculating the offset angle β of the ray emitted by the ray sensor relative to the reference plane. In step 603 the terminal determines whether β is less than 0: if β < 0, step 604 is executed and the terminal controls the ray sensor to shift the ray down by |β|; otherwise, step 605 is executed and the terminal controls the ray sensor to shift the ray up by |β|.
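The branch in steps 603 to 605 only chooses a direction from the sign of β. A minimal rendering of that decision (the function name is hypothetical):

```python
def apply_ray_offset(beta_deg: float) -> tuple[str, float]:
    """Steps 603-605 of fig. 6: choose the offset direction from the sign
    of beta and return (direction, magnitude in degrees)."""
    direction = "down" if beta_deg < 0 else "up"
    return direction, abs(beta_deg)
```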
In step 202, based on the angle of the ray relative to the reference plane, it is detected whether an obstacle exists in the reference area in the driving direction of the vehicle and whether the reference plane is detected.
After calculating the angle of the radiation with respect to the reference plane according to the method in step 201, the terminal may control the radiation sensor to emit the radiation according to the angle to detect whether there are an obstacle and a reference plane in the reference area in the driving direction of the vehicle. For example, when the radiation detects an obstacle, reflection is generated, the reflected radiation is received again by the radiation sensor, and then the radiation sensor uploads the detection result to the terminal, so that the terminal can determine whether the obstacle exists in the reference area in the driving direction of the vehicle and detect the reference surface. The present embodiment does not limit the type of the obstacle, and the obstacle may be a traveling vehicle or a pedestrian, for example.
For example, the range of the reference region in the vehicle traveling direction may be set empirically or determined from input information. For example, when the angle of the reference plane with respect to the horizontal plane is zero, the range of the reference area may be set empirically. According to the content in step 201, if the angle of the reference plane with respect to the horizontal plane is zero, the included angle between the ray emitted by the ray sensor and the reference plane where the vehicle is located is zero, as shown in fig. 3, and at this time, the ray can detect the obstacle in the reference area in the driving direction of the vehicle. In this case, the range of the reference area may be an area within 30 meters of the front of the vehicle, or may be a semicircular area with a radius of 30 meters in front of the vehicle, where the center of the circle is the vehicle.
For example, when the angle of the reference plane where the vehicle is located with respect to the horizontal plane is greater than zero, the size of the range of the reference area in the traveling direction of the vehicle may be determined according to the input information, which in this embodiment may be the installation height of the radiation sensor and the angle of the reference plane with respect to the horizontal plane. If the vehicle is in an uphill state, the range size of the reference region, that is, the farthest distance that the ray can irradiate, can be calculated by the following formula (3):
y = x · |tan(90° − |β|)|        (3)
wherein y represents the farthest distance the ray can reach; x represents the installation height of the ray sensor. The value of this height is not limited in the embodiment of the present application; for example, x may be 1 meter.
According to the content of step 201, when β = −10° and, for example, x = 1, it can be calculated from formula (3) that the farthest distance the ray can reach when emitted at 10° below the reference plane is about 5.67 meters. The reference area in the driving direction of the vehicle is then the area within 5.67 meters of the vehicle in that direction.
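Formula (3) can be checked numerically; with x = 1 m and β = −10° it reproduces the 5.67 m figure given in the text. A minimal sketch, with illustrative names:

```python
import math

def max_ray_distance(x_m: float, beta_deg: float) -> float:
    """Farthest distance the ray can reach, formula (3):
    y = x * |tan(90 deg - |beta|)|, where x is the sensor mounting height."""
    return x_m * abs(math.tan(math.radians(90.0 - abs(beta_deg))))
```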
In step 203, if no obstacle exists in the reference area in the driving direction of the vehicle and the reference plane is not detected, the angle of the reference plane relative to the horizontal plane, which is detected by the gravity sensor mounted on the vehicle, is obtained.
The manner in which the gravity sensor mounted on the vehicle detects the angle of the reference plane with respect to the horizontal plane has been described in detail in the foregoing step 201, and is not described in detail herein. After the gravity sensor detects the angle of the reference surface relative to the horizontal plane, the terminal acquires the detected angle information from the gravity sensor.
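One common way to derive a slope angle from a gravity sensor is to take the arctangent of the gravity components along the vehicle's longitudinal and vertical axes. The sketch below is an assumption-laden illustration of this general technique, not the specific method of step 201; the axis and sign conventions are hypothetical:

```python
import math

def reference_plane_angle(g_forward, g_down):
    """Estimate the angle (degrees) of the reference plane relative to the
    horizontal plane from the measured gravity components.

    g_forward -- gravity component along the vehicle's driving direction
    g_down    -- gravity component along the vehicle's vertical axis
    A positive result indicates uphill, negative downhill (assumed convention).
    """
    return math.degrees(math.atan2(g_forward, g_down))

# On a 10-degree uphill slope, gravity projects as g*sin(10 deg) forward and
# g*cos(10 deg) downward, so the estimate recovers 10 degrees.
g = 9.81
print(round(reference_plane_angle(g * math.sin(math.radians(10)),
                                  g * math.cos(math.radians(10))), 1))  # 10.0
```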
In the embodiments of the present application, the following two situations may cause the ray sensor to detect neither an obstacle in the reference area in the driving direction of the vehicle nor the reference plane where the vehicle is located:
Case one: the reference plane where the vehicle is located is the horizontal plane, that is, the included angle between the reference plane and the horizontal plane is zero. The ray sensor then emits rays at an angle of zero relative to the reference plane, and if it receives no reflected rays within the reference area, this indicates that no obstacle exists in the reference area in the driving direction of the vehicle and that the ray sensor has not detected the reference plane.
Case two: the vehicle is in an uphill state and the actual distance between the vehicle and the top of the slope is less than the farthest distance the ray can irradiate. This situation also results in the ray sensor detecting neither an obstacle in the reference area in the driving direction of the vehicle nor the reference plane.
In view of these two situations, the terminal also needs to control the gravity sensor to acquire the angle of the reference plane relative to the horizontal plane in order to further determine the current driving condition of the vehicle.
In step 204, an alarm operation is performed based on the angle of the reference plane relative to the horizontal plane being greater than zero.
According to the content of step 203, if the angle of the reference plane relative to the horizontal plane is greater than zero, the vehicle is in an uphill state and case two of step 203 is satisfied, that is, the vehicle is about to reach the top of the slope, and the terminal therefore needs to perform the warning operation. In one possible implementation, the warning operation includes at least one of a whistle warning and a voice broadcast. The content of the voice broadcast is not limited in the embodiments of the present application; for example, it may be "approaching the slope top, beware of oncoming vehicles".
In one possible implementation, after detecting whether an obstacle exists in a reference area in a driving direction of the vehicle and detecting the reference plane according to an angle of the ray relative to the reference plane, the method further includes: if an obstacle exists in the reference area in the traveling direction of the vehicle, the speed of the vehicle is controlled so that the vehicle maintains a distance from the obstacle.
If an obstacle exists in the reference area in the driving direction of the vehicle, the terminal directly controls the driving speed of the vehicle, without considering whether the reference surface can be detected, so that the distance between the vehicle and the obstacle is kept within the distance threshold. For example, the actual distance between the vehicle and the obstacle may be detected by rays emitted from the ray sensor mounted on the vehicle. The size of the threshold is not limited in the embodiments of the present application; for example, it may be set empirically or flexibly according to the application scenario.
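The distance-keeping control above can be sketched as follows. The proportional reduction step and the threshold are illustrative assumptions; the embodiment does not specify a particular control law:

```python
def control_speed(current_speed, distance, threshold, step=1.0):
    """Sketch of the distance-keeping control: if the detected distance to the
    obstacle falls below the threshold, reduce the vehicle speed (never below
    zero); otherwise keep the current speed.

    current_speed -- current vehicle speed
    distance      -- distance to the obstacle measured by the ray sensor
    threshold     -- distance threshold to maintain
    step          -- speed reduction applied per control cycle (illustrative)
    """
    if distance < threshold:
        return max(current_speed - step, 0.0)
    return current_speed

# If the obstacle is closer than the 30-meter threshold, the speed is reduced.
print(control_speed(10.0, 5.0, 30.0))   # 9.0
print(control_speed(10.0, 50.0, 30.0))  # 10.0
```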
In one possible implementation, after detecting whether an obstacle exists in a reference area in a driving direction of the vehicle and detecting the reference plane according to an angle of the ray relative to the reference plane, the method further includes: and if no obstacle exists in the reference area in the driving direction of the vehicle and the reference surface is detected, controlling the vehicle to continue driving according to the current driving state.
In the embodiments of the present application, there are two cases in which no obstacle exists in the reference area in the traveling direction of the vehicle and the reference surface is detected:
Case three: the vehicle is in an uphill state and its distance from the top of the slope is greater than the farthest distance the ray can irradiate. If the vehicle is in an uphill state and the ray emitted by the ray sensor detects the reference surface, this indicates that the actual distance between the vehicle and the top of the slope is greater than the farthest distance the ray can irradiate; no warning is needed, and the vehicle only needs to be controlled to continue traveling in its current driving state.
Case four: the vehicle is in a downhill state. If the vehicle is in a downhill state and no obstacle exists in the reference area in the driving direction of the vehicle, the ray emitted by the ray sensor can detect the reference surface; the vehicle is then controlled to continue traveling in its current driving state without warning.
Therefore, in either case three or case four, as long as no obstacle exists in the reference area in the traveling direction of the vehicle and the reference surface is detected, the vehicle can be controlled to continue traveling in accordance with its current driving state.
FIG. 7 is a schematic diagram of a background control model. In fig. 7, the terminal's microprocessor centrally receives and processes the signals from the gravity sensor and the ray sensor, and controls the angle of the rays emitted by the ray sensor, the vehicle speed, and the warning operation through an algorithm. The microprocessor issues warnings through the loudspeaker controller and the voice server.
FIG. 8 is a logic decision diagram of the vehicle warning method. In fig. 8, in step 801 the terminal determines whether the ray sensor detects an obstacle. If so, step 802 is executed: the terminal controls the vehicle speed to keep the vehicle at a distance from the obstacle. If not, step 803 is executed: the terminal determines whether the ray sensor detects the reference plane. If the reference surface is detected, step 804 is executed: the terminal controls the vehicle to run normally. If the reference surface is not detected, step 805 is executed: the terminal determines through the gravity sensor whether the vehicle is in an uphill state, that is, whether the angle between the reference surface and the horizontal plane is greater than zero. If so, step 806 is executed: the terminal controls the loudspeaker controller and the voice server to sound a whistle warning and a voice broadcast. If not, step 804 is executed, that is, the vehicle is controlled to run normally, where running normally means continuing at the current speed.
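The decision flow of fig. 8 can be summarized as the following sketch; the function name and the returned action labels are illustrative:

```python
def decide(obstacle_detected, reference_plane_detected, slope_angle_deg):
    """Decision logic of fig. 8, steps 801-806.

    obstacle_detected        -- step 801: ray sensor detects an obstacle
    reference_plane_detected -- step 803: ray sensor detects the reference plane
    slope_angle_deg          -- step 805: angle of the reference plane relative
                                to the horizontal plane, from the gravity sensor
    """
    if obstacle_detected:
        return "keep distance"          # step 802: control vehicle speed
    if reference_plane_detected:
        return "drive normally"         # step 804: continue at current speed
    if slope_angle_deg > 0:
        return "whistle and broadcast"  # step 806: vehicle is nearing the slope top
    return "drive normally"             # step 804

print(decide(False, False, 10.0))  # whistle and broadcast
```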
In the embodiments of the application, if it is detected according to the angle of the ray relative to the reference surface that neither an obstacle nor the reference surface exists in the reference area in the driving direction of the vehicle, and the angle of the reference surface relative to the horizontal plane is greater than zero, it can be determined that the vehicle is about to reach the top of a slope, and the terminal then warns. If an obstacle is approaching the slope top from the opposite direction, the warning operation can reduce the probability of a collision caused by the vehicle and the obstacle being in each other's blind spots.
Referring to fig. 9, an embodiment of the present application provides a vehicle warning device, including:
the determining module 901 is configured to determine an angle of a ray emitted by a ray sensor relative to a reference plane, where the ray sensor is installed on a vehicle and located in front of the vehicle in a driving direction, and the reference plane is a plane where the vehicle is located;
a detection module 902, configured to detect whether an obstacle exists in a reference area in a driving direction of the vehicle and detect a reference plane according to an angle of the ray relative to the reference plane;
an obtaining module 903, configured to obtain, if no obstacle exists in the reference area in the driving direction of the vehicle and the reference surface is not detected, the angle of the reference surface relative to the horizontal plane detected by a gravity sensor mounted on the vehicle;
and the warning module 904 is used for performing warning operation based on the angle of the reference plane relative to the horizontal plane being greater than zero.
In a possible implementation manner, the determining module 901 is configured to detect an angle of a reference plane with respect to a horizontal plane according to a gravity sensor;
and to determine the angle of the ray relative to the reference plane based on the difference between the angle of the horizontal plane and the angle of the reference plane relative to the horizontal plane.
In a possible implementation manner, the determining module 901 is configured to determine an angle of the ray relative to the reference plane according to the following formula based on a difference between the angle of the horizontal plane and the angle of the reference plane relative to the horizontal plane;
β = k(p - α)
wherein β represents the angle of the ray relative to the reference plane; p represents the angle of the horizontal plane; α represents the angle of the reference plane relative to the horizontal plane, and α ≥ 0°; k is a proportionality coefficient.
In a possible implementation manner, the determining module 901 is configured to determine an angle of the ray relative to the reference plane according to the following formula based on a difference between the angle of the horizontal plane and the angle of the reference plane relative to the horizontal plane;
β = k(p - α′)
wherein α′ represents the angle of the reference plane relative to the horizontal plane, and α′ < 0°.
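Assuming the linear form implied by the variable definitions above (the exact formula appears only as an image in the original publication), the angle determination for both the uphill (α ≥ 0°) and downhill (α′ < 0°) cases can be sketched as:

```python
def ray_angle(p, slope_angle, k=1.0):
    """Angle of the ray relative to the reference plane, assuming the linear
    form beta = k * (p - alpha) implied by the variable definitions.

    p           -- angle of the horizontal plane
    slope_angle -- angle of the reference plane relative to the horizontal
                   plane (alpha >= 0 uphill, alpha' < 0 downhill)
    k           -- proportionality coefficient
    """
    return k * (p - slope_angle)

# A 10-degree uphill slope with p = 0 and k = 1 gives beta = -10 degrees,
# consistent with the example in the description of formula (3).
print(ray_angle(0.0, 10.0))  # -10.0
```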
In one possible implementation, the apparatus further includes:
and the control module is used for controlling the speed of the vehicle to keep the distance between the vehicle and the obstacle if the obstacle exists in the reference area in the driving direction of the vehicle.
In a possible implementation manner, the control module is further configured to control the vehicle to continue to travel according to the current travel state if no obstacle exists in the reference area in the travel direction of the vehicle and the reference surface is detected.
In one possible implementation, the warning operation includes at least one of a whistle warning and a voice broadcast.
In the embodiments of the application, if it is detected according to the angle of the ray relative to the reference plane that neither an obstacle nor the reference plane exists in the reference area in the driving direction of the vehicle, and the angle of the reference plane relative to the horizontal plane is greater than zero, it can be determined that the vehicle is about to reach the top of a slope, and the terminal then gives a warning. If an obstacle is approaching the slope top from the opposite direction, the warning operation can reduce the probability of a collision caused by the vehicle and the obstacle being in each other's blind spots.
It should be noted that, when the apparatus provided in the foregoing embodiment implements the functions thereof, only the division of the functional modules is illustrated, and in practical applications, the above functions may be distributed by different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to implement all or part of the functions described above. In addition, the apparatus and method embodiments provided by the above embodiments belong to the same concept, and specific implementation processes thereof are described in the method embodiments for details, which are not described herein again.
Fig. 10 is a schematic structural diagram of a server according to an embodiment of the present application. The server may vary considerably in configuration or performance, and may include one or more processors 1001 and one or more memories 1002, where the processors 1001 may be Central Processing Units (CPUs), and the one or more memories 1002 store at least one computer program that is loaded and executed by the one or more processors to enable the server to implement the vehicle warning method of the above-described method embodiments. Of course, the server may also have components such as a wired or wireless network interface, a keyboard, and an input/output interface for performing input and output, and may include other components for implementing the functions of the device, which are not described herein again.
Fig. 11 is a schematic structural diagram of a terminal according to an embodiment of the present application. The terminal may be: a smartphone, a tablet, a laptop, or a desktop computer. A terminal may also be referred to by other names such as user equipment, portable terminal, laptop terminal, desktop terminal, etc.
Generally, a terminal includes: a processor 1501 and memory 1502.
Processor 1501 may include one or more processing cores, such as a 4-core processor, an 8-core processor, or the like. The processor 1501 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). Processor 1501 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also referred to as a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1501 may be integrated with a GPU (Graphics Processing Unit) that is responsible for rendering and drawing content that the display screen needs to display. In some embodiments, processor 1501 may also include an AI (Artificial Intelligence) processor for processing computational operations related to machine learning.
The memory 1502 may include one or more computer-readable storage media, which may be non-transitory. The memory 1502 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in the memory 1502 is configured to store at least one instruction for execution by the processor 1501 to cause the terminal to implement the vehicle alert method provided by the method embodiments herein.
In some embodiments, the terminal may further optionally include: a peripheral interface 1503 and at least one peripheral. The processor 1501, memory 1502, and peripheral interface 1503 may be connected by buses or signal lines. Various peripheral devices may be connected to peripheral interface 1503 via buses, signal lines, or circuit boards. Specifically, the peripheral device includes: at least one of a radio frequency circuit 1504, a display 1505, a camera assembly 1506, audio circuitry 1507, a positioning assembly 1508, and a power supply 1509.
The peripheral interface 1503 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 1501 and the memory 1502. In some embodiments, the processor 1501, memory 1502, and peripheral interface 1503 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1501, the memory 1502, and the peripheral device interface 1503 may be implemented on separate chips or circuit boards, which is not limited by the present embodiment.
The Radio Frequency circuit 1504 is used to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuitry 1504 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1504 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1504 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1504 can communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: metropolitan area networks, various generation mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1504 may also include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 1505 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1505 is a touch display screen, the display screen 1505 also has the ability to capture touch signals on or over the surface of the display screen 1505. The touch signal may be input to the processor 1501 as a control signal for processing. At this point, the display 1505 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the display 1505 may be one, provided on the front panel of the terminal; in other embodiments, the display 1505 may be at least two, each disposed on a different surface of the terminal or in a folded design; in other embodiments, the display 1505 may be a flexible display, provided on a curved surface or a folded surface of the terminal. Even more, the display 1505 may be configured in a non-rectangular irregular pattern, i.e., a shaped screen. The Display 1505 can be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), etc.
The camera assembly 1506 is used to capture images or video. Optionally, the camera assembly 1506 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 1506 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuitry 1507 may include a microphone and speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1501 for processing or inputting the electric signals to the radio-frequency circuit 1504 to achieve voice communication. For the purpose of stereo sound collection or noise reduction, a plurality of microphones can be arranged at different parts of the terminal respectively. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is then used to convert electrical signals from the processor 1501 or the radio frequency circuitry 1504 into sound waves. The loudspeaker can be a traditional film loudspeaker and can also be a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuitry 1507 may also include a headphone jack.
The positioning component 1508 is used to locate the current geographic Location of the terminal to implement navigation or LBS (Location Based Service).
A power supply 1509 is used to supply power to the various components in the terminal. The power supply 1509 may be alternating current, direct current, disposable or rechargeable. When the power supply 1509 includes a rechargeable battery, the rechargeable battery may support wired charging or wireless charging. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, the terminal also includes one or more sensors 1510. The one or more sensors 1510 include, but are not limited to: acceleration sensor 1511, gyro sensor 1512, pressure sensor 1513, fingerprint sensor 1514, optical sensor 1515, and proximity sensor 1516.
The acceleration sensor 1511 can detect the magnitude of acceleration on three coordinate axes of a coordinate system established with the terminal. For example, the acceleration sensor 1511 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 1501 may control the display screen 1505 to display the user interface in a landscape view or a portrait view based on the gravitational acceleration signal collected by the acceleration sensor 1511. The acceleration sensor 1511 may also be used for acquisition of motion data of a game or a user.
The gyroscope sensor 1512 may detect a body direction and a rotation angle of the terminal, and the gyroscope sensor 1512 may cooperate with the acceleration sensor 1511 to acquire a 3D motion of the user on the terminal. The processor 1501 may implement the following functions according to the data collected by the gyro sensor 1512: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
The pressure sensor 1513 may be disposed on a side frame of the terminal and/or under the display 1505. When the pressure sensor 1513 is disposed on the side frame of the terminal, the holding signal of the user to the terminal can be detected, and the processor 1501 performs left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 1513. When the pressure sensor 1513 is disposed at a lower layer of the display screen 1505, the processor 1501 controls the operability control on the UI interface in accordance with the pressure operation of the user on the display screen 1505. The operability control comprises at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 1514 is configured to collect a fingerprint of the user, and the processor 1501 identifies the user based on the fingerprint collected by the fingerprint sensor 1514, or the fingerprint sensor 1514 identifies the user based on the collected fingerprint. Upon recognizing that the user's identity is a trusted identity, the processor 1501 authorizes the user to perform relevant sensitive operations including unlocking the screen, viewing encrypted information, downloading software, paying, and changing settings, etc. The fingerprint sensor 1514 may be disposed on the front, back, or side of the terminal. When a physical key or vendor Logo (trademark) is provided on the terminal, the fingerprint sensor 1514 may be integrated with the physical key or vendor Logo.
The optical sensor 1515 is used to collect ambient light intensity. In one embodiment, processor 1501 may control the brightness of display screen 1505 based on the intensity of ambient light collected by optical sensor 1515. Specifically, when the ambient light intensity is high, the display brightness of the display screen 1505 is adjusted up; when the ambient light intensity is low, the display brightness of the display screen 1505 is adjusted down. In another embodiment, the processor 1501 may also dynamically adjust the shooting parameters of the camera assembly 1506 based on the ambient light intensity collected by the optical sensor 1515.
A proximity sensor 1516, also called a distance sensor, is usually provided on the front panel of the terminal. The proximity sensor 1516 is used to collect the distance between the user and the front surface of the terminal. In one embodiment, when the proximity sensor 1516 detects that the distance between the user and the front face of the terminal gradually decreases, the processor 1501 controls the display 1505 to switch from the bright screen state to the dark screen state; when the proximity sensor 1516 detects that the distance between the user and the front face of the terminal gradually increases, the processor 1501 controls the display 1505 to switch from the dark screen state to the bright screen state.
Those skilled in the art will appreciate that the configuration shown in fig. 11 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be used.
In an exemplary embodiment, a computer device is also provided, the computer device comprising a processor and a memory, the memory having at least one computer program stored therein. The at least one computer program is loaded and executed by one or more processors to cause the computer apparatus to implement any of the vehicle alert methods described above.
In an exemplary embodiment, there is also provided a computer-readable storage medium having at least one computer program stored therein, the at least one computer program being loaded and executed by a processor of a computer device to cause the computer device to implement any one of the above-mentioned vehicle warning methods.
In one possible implementation, the computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a Compact Disc Read-Only Memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, and the like.
In an exemplary embodiment, a computer program product or computer program is also provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer readable storage medium, and the processor executes the computer instructions to cause the computer device to perform any one of the vehicle alert methods described above.
It should be noted that information (including but not limited to user equipment information, user personal information, etc.), data (including but not limited to data for analysis, stored data, presented data, etc.), and signals referred to in this application are authorized by the user or sufficiently authorized by various parties, and the collection, use, and processing of the relevant data is required to comply with relevant laws and regulations and standards in relevant countries and regions. For example, reference to angles of reference planes relative to a horizontal plane, etc., are all obtained with sufficient authority.
It should be understood that reference herein to "a plurality" means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
It should be noted that the terms "first," "second," and the like (if any) in the description and claims of this application are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in sequences other than those illustrated or described herein. The implementations described in the above exemplary embodiments do not represent all implementations consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
The above description is only exemplary of the present application and should not be taken as limiting the present application, and any modifications, equivalents, improvements and the like that are made within the principles of the present application should be included in the protection scope of the present application.

Claims (10)

1. A method of vehicle warning, the method comprising:
determining an angle of a ray emitted by a ray sensor relative to a reference surface, wherein the ray sensor is installed on a vehicle and is positioned in front of the vehicle in the driving direction, and the reference surface is a plane where the vehicle is located;
detecting whether an obstacle exists in a reference area in the driving direction of the vehicle and detecting the reference surface according to the angle of the ray relative to the reference surface;
if no obstacle exists in the reference area in the driving direction of the vehicle and the reference surface is not detected, acquiring the angle of the reference surface, relative to the horizontal plane, detected by a gravity sensor installed on the vehicle;
and performing warning operation based on the fact that the angle of the reference surface relative to the horizontal plane is larger than zero.
2. The method of claim 1, wherein determining an angle of radiation emitted by the radiation sensor relative to a reference plane comprises:
detecting the angle of the reference surface relative to the horizontal plane according to the gravity sensor;
determining an angle of the ray relative to the reference plane based on a difference between the angle of the horizontal plane and the angle of the reference plane relative to the horizontal plane.
3. The method of claim 2, wherein the determining the angle of the ray relative to the reference plane based on the difference between the angle of the horizontal plane and the angle of the reference plane relative to the horizontal plane comprises:
determining the angle of the ray relative to the reference plane according to the following formula based on the difference between the angle of the horizontal plane and the angle of the reference plane relative to the horizontal plane;
(formula not reproduced in text; published as image FDA0003922055550000011)
wherein β represents the angle of the ray relative to the reference plane; p represents the angle of the horizontal plane; α represents the angle of the reference surface relative to the horizontal plane, with α ≥ 0°; and k is a proportionality coefficient.
4. The method of claim 2, wherein determining the angle of the ray relative to the reference plane based on the difference between the angle of the horizontal plane and the angle of the reference plane relative to the horizontal plane comprises:
determining an angle of the ray relative to the reference plane based on a difference between the angle of the horizontal plane and the angle of the reference plane relative to the horizontal plane according to the following formula;
(formula not reproduced in text; published as image FDA0003922055550000021)
wherein β represents the angle of the ray relative to the reference plane; p represents the angle of the horizontal plane; α' represents the angle of the reference surface relative to the horizontal plane, with α' < 0°; and k is a proportionality coefficient.
5. The method of claim 1, wherein after detecting whether an obstacle is present within a reference area in the direction of travel of the vehicle and detecting the reference plane based on an angle of the ray relative to the reference plane, further comprising:
if an obstacle exists in the reference area in the driving direction of the vehicle, controlling the speed of the vehicle so as to maintain a distance between the vehicle and the obstacle.
6. The method of claim 1, wherein after detecting whether an obstacle is present within a reference area in the direction of travel of the vehicle and detecting the reference plane based on an angle of the ray relative to the reference plane, further comprising:
if no obstacle exists in the reference area in the driving direction of the vehicle and the reference surface is detected, controlling the vehicle to continue driving in its current driving state.
7. The method of any of claims 1-6, wherein the warning operation comprises at least one of a siren alert and a voice announcement.
8. A vehicle warning device, said device comprising:
a determining module configured to determine an angle of a ray emitted by a ray sensor relative to a reference surface, wherein the ray sensor is installed on the vehicle and faces forward in the driving direction, and the reference surface is the plane on which the vehicle is located;
a detection module configured to detect, according to the angle of the ray relative to the reference surface, whether an obstacle exists in a reference area in the driving direction of the vehicle and whether the reference surface is detected;
an acquisition module configured to acquire, if no obstacle exists in the reference area in the driving direction of the vehicle and the reference surface is not detected, an angle of the reference surface relative to the horizontal plane detected by a gravity sensor installed on the vehicle; and
a warning module configured to perform a warning operation based on the angle of the reference surface relative to the horizontal plane being greater than zero.
9. A computer device, characterized in that the computer device comprises a processor and a memory, in which at least one computer program is stored, which is loaded and executed by the processor, to cause the computer device to carry out the vehicle alert method according to any one of claims 1 to 7.
10. A computer-readable storage medium, in which at least one computer program is stored, which is loaded and executed by a processor to cause a computer to implement a vehicle warning method according to any one of claims 1 to 7.
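The per-cycle decision described in claims 1, 5, and 6 can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation: the function name, parameter names, and returned action labels are assumptions for illustration, since the claims do not specify an API.

```python
def vehicle_warning_step(ray_hit_obstacle,
                         ray_hit_reference_surface,
                         reference_surface_angle_deg):
    """One detection cycle of the claimed method (hypothetical sketch).

    ray_hit_obstacle: True if the ray detects an obstacle in the
        reference area in the driving direction (claim 1).
    ray_hit_reference_surface: True if the ray detects the plane the
        vehicle is located on (the reference surface).
    reference_surface_angle_deg: angle of the reference surface relative
        to the horizontal plane, as reported by the gravity sensor.
    """
    if ray_hit_obstacle:
        # Claim 5: an obstacle lies ahead, so control the vehicle's
        # speed to maintain a distance from it.
        return "control_speed_keep_distance"
    if ray_hit_reference_surface:
        # Claim 6: the road surface ahead is continuous; keep the
        # current driving state.
        return "continue_driving"
    # Neither an obstacle nor the reference surface was detected,
    # suggesting a possible drop-off ahead.
    if reference_surface_angle_deg > 0:
        # Claim 1: warn, e.g. by siren alert or voice announcement
        # (claim 7).
        return "warn"
    return "continue_driving"
```

The ordering mirrors the claims: obstacle handling (claim 5) and continuous-surface handling (claim 6) are checked first, and the warning of claim 1 fires only when both ray detections fail while the gravity sensor reports an uphill reference-surface angle greater than zero.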
CN202211360990.0A 2022-11-02 2022-11-02 Vehicle warning method, device, equipment and storage medium Pending CN115534991A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211360990.0A CN115534991A (en) 2022-11-02 2022-11-02 Vehicle warning method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211360990.0A CN115534991A (en) 2022-11-02 2022-11-02 Vehicle warning method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115534991A true CN115534991A (en) 2022-12-30

Family

ID=84720027

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211360990.0A Pending CN115534991A (en) 2022-11-02 2022-11-02 Vehicle warning method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115534991A (en)

Similar Documents

Publication Publication Date Title
CN112406707B (en) Vehicle early warning method, vehicle, device, terminal and storage medium
CN111010537B (en) Vehicle control method, device, terminal and storage medium
CN112991439B (en) Method, device, electronic equipment and medium for positioning target object
CN110775056B (en) Vehicle driving method, device, terminal and medium based on radar detection
CN116853240A (en) Barrier early warning method, device, equipment and storage medium
CN111147738A (en) Police vehicle-mounted panoramic and coma system, device, electronic equipment and medium
CN115904593A (en) Screen management method, device, equipment and storage medium
CN112885095B (en) Road surface information detection method, device, equipment and computer readable storage medium
CN114789734A (en) Perception information compensation method, device, vehicle, storage medium, and program
CN111717205B (en) Vehicle control method, device, electronic equipment and computer readable storage medium
CN115534991A (en) Vehicle warning method, device, equipment and storage medium
CN111984755A (en) Method and device for determining target parking point, electronic equipment and storage medium
CN113034822A (en) Method, device, electronic equipment and medium for prompting user
CN113734167B (en) Vehicle control method, device, terminal and storage medium
CN113734199B (en) Vehicle control method, device, terminal and storage medium
CN112991790B (en) Method, device, electronic equipment and medium for prompting user
CN115959157A (en) Vehicle control method and apparatus
CN112241662B (en) Method and device for detecting drivable area
CN111294513B (en) Photographing method and device, electronic equipment and storage medium
CN116452653A (en) Method, device, equipment and computer readable storage medium for determining traffic information
CN117944668A (en) Obstacle avoidance method and device for automatic driving vehicle
CN114419913A (en) In-vehicle reminding method and device, vehicle and storage medium
CN116311413A (en) Face recognition method, device, equipment and storage medium
CN117302248A (en) Vehicle control method, vehicle and device
CN117351757A (en) Signal lamp early warning method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination