Detailed Description
The embodiments of the present application are applicable to a vehicle-mounted system, which can be deployed in a vehicle. It should be understood that the embodiments are mainly applied to a vehicle in a key-off state (i.e., a parked state in which the engine has been shut off and the vehicle is stationary), but they may also be applied to a vehicle in other states, for example, a vehicle driving slowly, or a vehicle that has stopped but has not yet been keyed off; the present application is not limited in this respect.
Referring to fig. 1, which is a schematic diagram of a possible architecture of an on-board system provided in an embodiment of the present application, the architecture includes at least a sensor system and an Electronic Control Unit (ECU) system. The sensor system collects data about the vehicle's surrounding environment and feeds the collected data into the ECU system for processing.
The sensor system includes a variety of sensors, for example and without limitation: an ultrasonic radar (ultrasonic sensor system, USS), a camera, an Inertial Navigation System (INS), and a Global Positioning System (GPS).
1) An ultrasonic radar is a radar that detects objects using ultrasonic waves. Its working principle is to measure distance from the time difference between the moment an ultrasonic pulse is emitted by the transmitter and the moment its echo is received by the receiver. Ultrasonic waves are sound waves whose vibration frequency exceeds 20,000 Hz, above the general upper limit of human hearing (20,000 Hz), which is why they cannot be heard.
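The time-of-flight principle above can be sketched as follows. This is a minimal sketch; the function name and the fixed speed-of-sound constant are illustrative assumptions, not part of the described system:

```python
def ultrasonic_distance_m(echo_delay_s, speed_of_sound_m_s=343.0):
    """Estimate the distance to an obstacle from the round-trip echo delay.

    The pulse travels to the obstacle and back, so the one-way distance is
    half of speed * delay. 343 m/s is the speed of sound in air at roughly
    20 degrees C; a real sensor would compensate for temperature.
    """
    if echo_delay_s < 0:
        raise ValueError("echo delay must be non-negative")
    return speed_of_sound_m_s * echo_delay_s / 2.0
```

For example, an echo delay of 10 ms corresponds to an obstacle about 1.7 m away, well inside the UPA detection range described below.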
Ultrasonic radars include, but are not limited to, the following two types. The first is the reversing radar mounted on the front and rear bumpers of a vehicle to measure obstacles in front of and behind the vehicle, known in the industry as UPA. The second is an ultrasonic radar mounted on the side of the vehicle to measure the distance to side obstacles, known in the industry as APA. The UPA is a short-range ultrasonic sensor, mainly installed at the front and rear of the vehicle body, with a detection range of 25 cm to 2.5 m; it is little affected by the Doppler effect and temperature drift, so its measurements are relatively accurate. The APA is a long-range ultrasonic sensor, mainly used on the sides of the vehicle body, with a detection range of 35 cm to 5 m, enough to cover a parking space; it has strong directivity, better transmission performance than the UPA, and is less susceptible to interference from other APAs and UPAs.
For example, fig. 2 shows a layout diagram of a plurality of sensors on a vehicle. In the example shown in fig. 2, ultrasonic radars a, b, g, h, i, and j are short-range ultrasonic radars arranged at the front and rear of the vehicle, and ultrasonic radars c, d, e, and f are long-range ultrasonic radars arranged on the left and right sides of the vehicle.
2) A camera, or referred to as a camera sensor. The cameras in embodiments of the present application may include any camera for acquiring images of the environment in which the vehicle is located, including, for example and without limitation: infrared cameras, visible light cameras, and the like.
For example, in the example shown in fig. 2, the camera 1 is disposed at the front side of the vehicle, and can capture an image in front of the vehicle; the camera 2 is arranged at the rear side of the vehicle and can acquire images behind the vehicle; the cameras 3 and 4 are respectively arranged on the left side and the right side of the vehicle and can collect images of the left side and the right side of the vehicle.
3) The inertial navigation system is a navigation parameter calculation system that uses a gyroscope and an accelerometer as its sensitive devices: it establishes a navigation coordinate system from the gyroscope output, and calculates the speed and position of the carrier (e.g., a vehicle) in that coordinate system from the accelerometer output.
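The velocity and position calculation just described can be sketched as a simple one-axis dead-reckoning integration. This is an illustrative sketch only: a real INS also uses the gyroscope output to rotate body-frame accelerations into the navigation frame, which is omitted here:

```python
def dead_reckon(accels_m_s2, dt_s, v0=0.0, p0=0.0):
    """Integrate accelerometer samples into velocity and position along one
    axis of the navigation frame, using simple Euler integration.

    accels_m_s2: sequence of acceleration samples (m/s^2)
    dt_s:        sampling interval (s)
    v0, p0:      initial velocity (m/s) and position (m)
    """
    v, p = v0, p0
    for a in accels_m_s2:
        v += a * dt_s   # velocity update from acceleration
        p += v * dt_s   # position update from velocity
    return v, p
```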
4) The Global Positioning System, also known as the global satellite positioning system, is a medium-Earth-orbit satellite navigation system that combines satellite and communication technology, using navigation satellites for timing and ranging.
It should be understood that fig. 2 is only an example, and the arrangement position of various sensors in practical application may be different from fig. 2, and may also include more or fewer sensors, and may also include other types of sensors, which is not limited in this application.
The ECU system may process the data collected by each sensor in the sensor system. For example, the ECU system processes image data collected by the camera to identify objects (e.g., obstacles) in the image. The ECU system can also make decisions based on the processing result and drive controlled elements to operate. The controlled elements include, but are not limited to: sensors, speakers, vehicle lights, the central control screen, and the like.
In the embodiment of the present application, the ECU system is composed of a plurality of ECUs, and the ECUs may communicate with each other to exchange data, for example, each ECU is connected to a Controller Area Network (CAN) bus, and the ECUs exchange data based on the CAN bus.
The specific implementation of an ECU may be any device or module having a processing function. For example, an ECU may be a Central Processing Unit (CPU), or another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, and the like. A general-purpose processor may be a microprocessor or any conventional processor.
Referring to fig. 3, the ECUs in the embodiment of the present application include, but are not limited to, the following types according to the functional division of each ECU: a vehicle mounted Mobile Data Center (MDC), a Body Control Manager (BCM), a smart Cabin Domain Controller (CDC), and a Telematics Box (TBOX).
1) The MDC is the core ECU of the vehicle. The MDC has computation and control functions: it can process the data collected by each sensor, convert the processing result into a control command, and use that command to control the operation of a controlled element. For example, the MDC sends the control command to the ECU corresponding to the controlled element (e.g., the BCM, CDC, or TBOX), and that ECU drives the controlled element to operate according to the command.
The MDC may also control memory (ROM/FLASH/EEPROM, RAM), input/output interfaces (I/O), and other external circuitry; the memory may store programs.
The vehicle monitoring method provided by the embodiments of the present application may be performed under the control of the MDC, for example by invoking a processing program stored in the memory, to operate on the data collected by each sensor and to drive the controlled elements.
2) BCM, also known as body computer (body computer), is an ECU for controlling the electrical systems of the vehicle body. BCM controlled elements include, but are not limited to: power windows, power mirrors, air conditioners, vehicle lights (such as headlights, turn lights, etc.), anti-theft locking systems, central locks, defrosting devices, etc. The BCM CAN be connected with other vehicle-mounted ECUs through a CAN bus.
3) The CDC is an ECU for controlling the elements in the smart cabin. Elements in the smart cabin include, but are not limited to: the instrument screen, the central control panel screen (central control screen for short), the head-up display, microphones, cameras, speakers, a Bluetooth module, and the like. Through human-computer interaction, the smart cabin can control the driving state and trajectory of an autonomous vehicle according to the passengers' requirements; in other words, human-computer interaction in the smart cabin or remote control can issue the same commands to control the driving of the vehicle.
4) The TBOX is mainly used to communicate with a back-end system or an application (APP) on a user device, to implement APP-based display and control of vehicle information. The TBOX may use 3G cellular communication such as Code Division Multiple Access (CDMA), EVDO, or Global System for Mobile communications (GSM)/General Packet Radio Service (GPRS); 4G cellular communication such as Long Term Evolution (LTE); or 5G cellular communication. The TBOX may communicate with a Wireless Local Area Network (WLAN) using WiFi. In some embodiments, the TBOX may communicate directly with a device using an infrared link, Bluetooth, or ZigBee, or based on other wireless protocols, for example communicating directly with other vehicles and/or roadside stations based on the Dedicated Short Range Communications (DSRC) protocol.
It should be noted that fig. 3 is only an example, and the number and layout of ECUs may have other implementation manners in practical applications, and the present application is not limited in detail herein. In addition, each ECU in fig. 3 may be separately deployed or may be mutually integrated and deployed, and the embodiment of the present application is not limited.
Based on the above description, the embodiment of the present application provides a vehicle monitoring method, taking the application of the method to the vehicle-mounted system shown in fig. 1 as an example, and referring to fig. 4, the method includes the following processes:
S401, when the vehicle is in the key-off state, the vehicle selects at least one sensor from the plurality of sensors installed on the vehicle according to at least one of: the scene type of the surrounding environment, moving objects in the surrounding environment, and the barrier condition of the surrounding environment.
Specifically, the ECU system in the vehicle determines the type of the event to be monitored according to at least one of the scene type of the surrounding environment, the moving object in the surrounding environment and the barrier condition of the surrounding environment, and then selects at least one sensor corresponding to the type of the event to be monitored from a plurality of sensors mounted on the vehicle, namely the selected at least one sensor can effectively monitor the type of the event.
Alternatively, a specific implementation of selecting at least one sensor from the plurality of sensors mounted on the vehicle may be: selecting at least one sensor from the plurality of sensors and activating it. It should be understood that if some of the at least one sensor have already been selected and activated, only the sensors not yet activated need to be activated. Optionally, if sensors other than the selected at least one have been selected or turned on, those other sensors are deselected or turned off.
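The activation bookkeeping described above can be sketched with simple set arithmetic. This is a minimal sketch; the function name and sensor labels are illustrative:

```python
def update_active_sensors(currently_on, wanted):
    """Given the set of sensors already on and the set selected for the
    current situation, return which sensors to switch on and which to
    switch off. Sensors in both sets are left running, so sensors that
    are already active are not restarted.
    """
    to_activate = set(wanted) - set(currently_on)
    to_deactivate = set(currently_on) - set(wanted)
    return to_activate, to_deactivate
```

For example, if only the camera is on and the camera plus ultrasonic radar are wanted, only the ultrasonic radar needs to be switched on.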
In a specific implementation process, a plurality of elements (i.e., scene types of the surrounding environment, moving objects in the surrounding environment, and barriers of the surrounding environment) for selecting the at least one sensor from the plurality of sensors by the vehicle may be implemented individually or in combination, and the present application is not limited thereto.
First, a case where a scene type of the surrounding environment, a moving object in the surrounding environment, and a barrier situation of the surrounding environment are individually implemented will be described.
1. The vehicle selects at least one sensor according to the scene type of the surrounding environment.
The scene type of the surrounding environment may characterize a classification of the surrounding environment, for example by formation mode, functional use, geographic location, time period, facilities, elements of the natural environment, characteristics of human activity, building type, or privacy.
The present application does not limit the specific way in which scene types are divided. For example: by formation, the surrounding environment may be divided into natural environments, artificial environments, and so on; by function, into living environments, ecological environments, and so on; by constituent elements, into atmospheric, water, soil, biological, and geological environments, and so on; by the way humans gather, into rural environments, urban environments, and so on; by privacy, into private environments, public environments, and so on; and by building type, into residential-area environments, open/underground parking lot environments, street roadside environments, highway roadside environments, open-field environments, and so on.
In the embodiments of the present application, the vehicle may preset a correspondence between scene types and sensors, so that before the sensors are used to monitor the vehicle's surroundings, the scene type of the surrounding environment is first identified, and the sensors corresponding to that scene type are then selected from the various sensors installed on the vehicle according to the correspondence. In this way the vehicle does not need to keep all sensors on at all times: selecting only the sensors the scene requires improves monitoring accuracy while reducing wear on the other sensors, extending sensor service life, and lowering the overall energy consumption of the vehicle's sensors.
For example, fig. 5 shows a flowchart of a sensor selection method based on scene recognition, which may be applied to the in-vehicle system shown in fig. 1, and may be specifically executed by an ECU system in the in-vehicle system. The method comprises the following steps:
S501, in the key-off state, the vehicle identifies the scene type of the surrounding environment.
Specifically, the MDC in the vehicle's ECU system identifies the scene type of the vehicle's surrounding environment. The identification mode can be various, and the application is not limited. For example, the MDC obtains a history of the vehicle during driving (e.g., image data captured by a camera during driving, position data in a navigation system, etc.), and then determines a scene type of the surrounding environment of the vehicle based on the history. Or, for example, the MDC first gathers data for the ambient environment based on one or more sensors (e.g., cameras) on the vehicle and then determines the scene type of the ambient environment of the vehicle based on the data.
And S502, selecting a sensor corresponding to the scene type of the surrounding environment from a plurality of sensors installed on the vehicle by the vehicle according to the corresponding relation between the scene type and the sensor.
Specifically, the MDC may preset a first corresponding relationship between the scene type and the sensor, for example, store the first corresponding relationship between the scene type and the sensor in a memory. After the MDC determines the scene type of the surrounding environment, a sensor corresponding to the scene type is selected from a plurality of sensors installed on the vehicle according to the first corresponding relation.
The sensor corresponding to each scene type may be determined according to the event type to be monitored in the scene type, and the specific correspondence between the scene type and the sensor is not specifically limited in the present application, and the following lists several possible examples:
example 1, inside a residential cell: in the scene, pedestrians are rare, the road condition is simple, and only slight scratch of the vehicles is possible, so that the safety of the vehicles can be ensured only by selecting the camera and the ultrasonic radar.
Example 2, street roadside: in this scene, the flow of people and vehicles is heavy and the situation is complex and changeable; safety threats such as scratches, towing, and theft may occur, so the camera, ultrasonic radar, inertial navigation system, and global positioning system may all be selected to ensure the vehicle's safety.
Example 3, open/underground parking lot: the situation is simple, but the vehicle still risks being scratched or stolen, so the inertial navigation system, the camera, and the ultrasonic radar may be selected to sense objects approaching the vehicle.
Example 4, unfamiliar outdoor environment: a vehicle parked in an unfamiliar outdoor location faces a high risk of theft, so the camera and the INS may be selected to sense changes in the vehicle's vibration amplitude and orientation.
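A minimal sketch of the first correspondence between scene types and sensors, mirroring examples 1 to 4 above. The scene labels, sensor names, and fallback behavior are illustrative assumptions, not a normative mapping:

```python
# Hypothetical scene-type -> sensor correspondence table (first correspondence).
SCENE_SENSORS = {
    "residential_community": {"camera", "ultrasonic_radar"},
    "street_roadside":       {"camera", "ultrasonic_radar", "ins", "gps"},
    "parking_lot":           {"camera", "ultrasonic_radar", "ins"},
    "unfamiliar_outdoor":    {"camera", "ins"},
}

def select_sensors_for_scene(scene_type):
    """Look up the sensors for a recognized scene type; fall back to the
    camera alone when the scene type is unknown."""
    return SCENE_SENSORS.get(scene_type, {"camera"})
```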
It should be understood that before step S502, some sensors may already have been selected for monitoring the vehicle's surroundings. Therefore, when executing step S502, a sensor corresponding to the scene type of the surrounding environment is newly selected only if it has not been selected yet; if it is already monitoring the surroundings, it simply continues to do so. Optionally, if sensors that do not correspond to the scene type of the surrounding environment are currently monitoring the surroundings, their monitoring may be cancelled.
According to the embodiment of the application, different monitoring mechanisms are formulated according to different scene types of the surrounding environment of the vehicle, namely, the sensor corresponding to the scene type of the surrounding environment is selected from a plurality of sensors installed on the vehicle to be used for monitoring the surrounding environment. The monitoring precision can be improved, meanwhile, the loss of other sensors is reduced, the service life of the sensors is prolonged, and the overall energy consumption of the vehicle sensors is reduced.
2. The vehicle selects at least one sensor from a plurality of sensors mounted on the vehicle according to moving objects in the surrounding environment.
A moving object in the embodiments of the present application refers to any object that can move, including living objects (e.g., people, cats, dogs, rabbits, snakes, butterflies, wolves, birds) and non-living objects (e.g., vehicles, drones, falling rocks, falling leaves). It should be understood that the movement may be autonomous (e.g., a person walking, a surrounding vehicle driving, a bird flying, an animal running) or passive (e.g., a leaf blown by the wind, a landslide); the present application is not limited in this respect.
In the embodiment of the application, the vehicle can select the sensor from the various sensors mounted on the vehicle by sensing the moving object existing around the vehicle body, so that the loss of other sensors can be reduced while the monitoring precision is improved, the service life of the sensor is prolonged, and the overall energy consumption of the vehicle sensor is reduced.
The following are several typical application scenarios. Scene 1, a community or street roadside late at night: pedestrians or vehicles pass by only occasionally, so few moving objects appear. Scene 2, an open-field scene: there is little foot traffic, so few moving objects appear around the vehicle body. Scene 3, an underground parking lot: outside the entrance and exit areas, few moving objects appear around the vehicle body.
In these scenarios, before the vehicle uses the sensors to monitor the surrounding environment of the vehicle, a small number of sensors (e.g., a first sensor) may be selected from a plurality of sensors installed on the vehicle, and then the first sensor may be used to detect whether a moving object is present in the surrounding environment. After the vehicle determines that a moving object is present in the surrounding environment, the first sensor and other types of sensors, such as ultrasonic radar, inertial navigation systems, etc., are selected from a variety of sensors mounted on the vehicle for monitoring the surrounding environment of the vehicle.
Optionally, in order to improve the monitoring accuracy, the vehicle may select another type of sensor besides the first sensor after determining that a moving object is present in the surrounding environment and the moving object satisfies the preset condition. Further optionally, the preset condition includes, but is not limited to, any one or more of the following: moving the moving object to the direction close to the vehicle; the occurrence frequency of the moving object exceeds a preset frequency; the appearance time of the moving object exceeds the preset time; the moving object is within a preset range of the vehicle.
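The "any one or more" preset conditions listed above could be checked as follows. This is a hedged sketch; the threshold values and parameter names are illustrative assumptions:

```python
def meets_preset_condition(approaching, freq_per_min, duration_s, distance_m,
                           freq_limit=10, time_limit_s=30, range_m=5.0):
    """Return True when any of the preset conditions from the text holds:
    the object is moving toward the vehicle, its appearance frequency
    exceeds a preset frequency, its appearance time exceeds a preset time,
    or it is within a preset range of the vehicle.
    """
    return (approaching
            or freq_per_min > freq_limit
            or duration_s > time_limit_s
            or distance_m <= range_m)
```

Only when this check passes would the vehicle go on to select sensors beyond the first sensor.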
It should be understood that the present application is not limited to the type of first sensor, for example the first sensor may be a camera or an ultrasonic radar, etc. Taking the first sensor as a camera for example, fig. 6 shows a flowchart of a moving object-based sensor selection method, which can be applied to the vehicle-mounted system shown in fig. 1, and the method includes:
S601, in the key-off state, the vehicle turns on the camera and enters the monitoring state, and the camera captures images of the surrounding environment;
s602, monitoring whether a moving object exists around the vehicle body by the vehicle based on the image shot by the camera;
Specifically, the MDC performs recognition on the images captured by the camera to detect whether a moving object is present in the surrounding environment.
S603, if a moving object appears, the vehicle continuously monitors the object's appearance frequency based on the images captured by the camera; if no moving object appears, the camera is simply kept on.
S604, when the appearance frequency of the moving object is very high (e.g., exceeds a set frequency threshold), the safety of the vehicle body may be threatened, and the vehicle selects other types of sensors (e.g., ultrasonic radar, inertial navigation system) from the plurality of sensors installed on the vehicle to monitor the surrounding environment together with the camera; if the frequency is low, it may only be a pedestrian or animal passing by with no intention of approaching the vehicle, and it is sufficient to keep only the camera on, i.e., to rely on camera-based detection.
Alternatively, the vehicle may determine other types of sensors according to at least one of the moving direction of the moving object, the appearance frequency of the moving object, the appearance duration of the moving object, or the distance between the moving object and the vehicle.
Example 1, an MDC of a vehicle presets a second correspondence between a moving direction of a moving object and a sensor, and after obtaining the moving direction of the moving object, the MDC determines to select a second sensor from a plurality of sensors mounted on the vehicle according to the second correspondence.
Still taking the camera as the first sensor: if the moving object moves toward the vehicle along a curved path, it is not approaching the vehicle quickly (it may only be a passing pedestrian), and an ultrasonic radar can be selected to monitor together with the camera; if the moving object moves straight toward the vehicle, it is approaching quickly, and the ultrasonic radar, inertial navigation system, global positioning system, and so on can be selected to monitor together with the camera, quickly raising the vehicle's monitoring capability.
Example 2, the MDC of the vehicle previously sets a third correspondence relationship between the occurrence frequency of the moving object and the sensor, and determines a third sensor that needs to be selected from the plurality of sensors mounted on the vehicle according to the third correspondence relationship after obtaining the occurrence frequency of the moving object.
Taking the first sensor as a camera as an example, if the occurrence frequency of the moving object is lower than a first frequency threshold, selecting an ultrasonic radar to cooperate with the camera for monitoring; if the appearance frequency of the moving object is higher than a first frequency threshold and lower than a second frequency threshold, selecting an ultrasonic radar and a global positioning system to cooperate with a camera for monitoring; and if the appearance frequency of the moving object is higher than the second frequency threshold value, selecting an ultrasonic radar, a global positioning system and an inertial navigation system to be matched with the camera for monitoring. The first frequency threshold is lower than the second frequency threshold, and both the first frequency threshold and the second frequency threshold are greater than 0.
The appearance frequency of a moving object may be the number of times the object appears within a preset time range, for example within one minute: the MDC may sample once per second whether a moving object is present, and increment a counter each time one appears. Of course, this is merely an example; the present application does not limit the specific way in which the MDC counts the appearance frequency of moving objects.
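Example 2's frequency-based selection, together with the per-second sampling just described, could be sketched as follows. The threshold values f1 and f2 and the sensor names are illustrative assumptions; the camera (the first sensor) stays on throughout:

```python
def count_appearances(samples):
    """Count how many one-second samples contain a moving object, giving
    the appearance frequency over the sampling window (e.g. one minute)."""
    return sum(1 for present in samples if present)

def sensors_for_frequency(freq, f1=10, f2=30):
    """Map an appearance frequency to the extra sensors of Example 2:
    below f1, ultrasonic radar only; between f1 and f2, add GPS; above
    f2, add the INS as well. Requires 0 < f1 < f2."""
    extra = {"ultrasonic_radar"}
    if freq > f1:
        extra.add("gps")
    if freq > f2:
        extra.add("ins")
    return extra
```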
Example 3, the MDC of the vehicle sets in advance a fourth correspondence between the occurrence duration of the moving object and the sensor, and determines, after obtaining the occurrence duration of the moving object, a fourth sensor that needs to be selected from the plurality of sensors mounted on the vehicle according to the fourth correspondence.
Taking the camera as the first sensor: if the moving object has been present for 5 s, the ultrasonic radar is selected to monitor together with the camera; if it has been present for 30 s, the ultrasonic radar and the global positioning system are selected to monitor together with the camera; and if it has been present for 1 min, the ultrasonic radar, the global positioning system, and the inertial navigation system are selected to monitor together with the camera.
Example 4, the MDC of the vehicle previously sets a fifth correspondence between the distance from the moving object to the vehicle and the sensor, and determines a fifth sensor that needs to be selected from a plurality of sensors mounted on the vehicle according to the fifth correspondence after obtaining the distance from the moving object to the vehicle.
Taking the camera as the first sensor: if the distance between the moving object and the vehicle is 5-10 m, the ultrasonic radar is selected to monitor together with the camera; if the distance is 2-5 m, the ultrasonic radar and the global positioning system are selected to monitor together with the camera; and if the distance is within 2 m, the ultrasonic radar, the global positioning system, and the inertial navigation system are selected to monitor together with the camera.
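The distance bands of Example 4 could be encoded as follows. The 2 m, 5 m, and 10 m band edges follow the text; the sensor names and the behavior beyond 10 m are illustrative assumptions, and the camera itself stays on in every band:

```python
def sensors_for_distance(distance_m):
    """Map the object-to-vehicle distance to the extra sensors of Example 4:
    the closer the object, the more sensors join the camera."""
    if distance_m <= 2.0:
        return {"ultrasonic_radar", "gps", "ins"}
    if distance_m <= 5.0:
        return {"ultrasonic_radar", "gps"}
    if distance_m <= 10.0:
        return {"ultrasonic_radar"}
    return set()   # beyond 10 m: camera-only monitoring (assumed fallback)
```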
Optionally, after the vehicle selects another type of sensor to cooperate with the camera to jointly monitor the surrounding environment, if it is monitored that the appearance frequency of the moving object becomes low or the moving object disappears, the other type of sensor may be reduced or turned off.
According to the embodiments of the present application, different monitoring mechanisms are formulated by sensing whether moving objects exist around the vehicle. If no moving object exists around the vehicle body, the camera alone is sufficient for monitoring. If moving objects exist around the vehicle body, other sensors are selected to work together with the camera, increasing the monitoring strength. This improves monitoring accuracy while reducing wear on the other sensors, extending sensor service life, and lowering the overall energy consumption of the vehicle's sensors.
3. The vehicle selects at least one sensor from a plurality of sensors mounted on the vehicle according to the barrier condition of the surrounding environment.
The barrier condition of the surrounding environment includes whether a barrier exists in the surrounding environment, the type of barrier, the degree of barrier, and the like. A barrier is an object that can protect the safety of the vehicle, for example by blocking other moving objects from approaching it. The present application does not limit the specific type of barrier; barriers include, but are not limited to, walls, other vehicles, trees, and fences. The degree of barrier of the surrounding environment may refer to the degree to which, or the ability with which, the barriers in the environment block other moving objects from approaching or damaging the vehicle.
Alternatively, the vehicle may determine the degree of barrier of the surrounding environment from its openness: the lower the openness, the heavier the barrier; conversely, the higher the openness, the lighter the barrier. Optionally, the degree of barrier is related to the scene type of the surrounding environment, the degree of enclosure of its space, or the size of its space. For example, at a street roadside the space is poorly enclosed and the mobility of pedestrians and other vehicles is high, so the openness is high and the degree of barrier is light; in a residential area, access control generally keeps the space somewhat enclosed and the mobility of pedestrians and other vehicles low, so the openness and the degree of barrier are both moderate; in a private garage, the space is small and strongly enclosed and the mobility of pedestrians and other vehicles is very low, so the openness is low and the degree of barrier is heavy.
In the embodiment of the present application, the vehicle may monitor the barrier condition of the surrounding environment in multiple implementations. For example, the barrier condition may be determined based on the scene type of the surrounding environment, based on the size of the space of the surrounding environment, or based on the degree of enclosure of the space of the surrounding environment.
Taking the case of judging the barrier condition of the surrounding environment based on the degree of enclosure of the space as an example: in one possible implementation, the vehicle may monitor whether safety barriers are present in the various orientations around the vehicle. If no safety barrier exists in a certain orientation, that orientation has high openness, a light barrier degree, and poor safety, and a sensor capable of monitoring that orientation may be selected from the plurality of sensors mounted on the vehicle. If a safety barrier exists in another orientation, that orientation has low openness, a heavy barrier degree, and high safety, and the sensors monitoring that orientation may be appropriately reduced in number or turned off.
Optionally, the barrier is a safety barrier when the distance between the area in which the barrier is located and the vehicle is less than a first threshold.
For example, FIG. 7 shows a flowchart of a method for selecting sensors based on the barrier condition of the environment. The method may be applied to the on-board system shown in FIG. 1 and includes:
S701, in the key-off state, the monitoring function of the vehicle is started, and the vehicle enters a monitoring state.
Optionally, when the monitoring function of the vehicle is turned on, all the sensors may be turned on, or only some of the sensors may be turned on (for example, only the camera is turned on), which is not limited in the present application.
S702, the vehicle monitors whether a safety barrier exists around the vehicle body.
For example, the vehicle determines whether there is a wall or another vehicle around the vehicle body from the image captured by the camera.
Optionally, a barrier is a safety barrier when the distance between the barrier and the vehicle is less than a preset distance. The preset distance is, for example, 1 meter, 1.5 meters, or 2 meters, and the application is not limited thereto. In addition, the value of the preset distance may differ for different types of barriers: for example, 1.5 meters for a wall and 1 meter for another vehicle.
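As a rough illustration, the per-type distance check described above can be sketched as a simple lookup. This is a non-authoritative sketch: the function name, the type keys, and the default threshold are assumptions for illustration; the numeric values follow the examples in the text.

```python
# Illustrative per-type preset distances in meters (wall: 1.5 m, other
# vehicle: 1 m, as in the text); the default for unlisted types is assumed.
PRESET_DISTANCE = {"wall": 1.5, "vehicle": 1.0}
DEFAULT_PRESET_DISTANCE = 2.0

def is_safety_barrier(barrier_type: str, distance_m: float) -> bool:
    """A barrier counts as a safety barrier when it is closer to the
    vehicle than the preset distance configured for its type."""
    threshold = PRESET_DISTANCE.get(barrier_type, DEFAULT_PRESET_DISTANCE)
    return distance_m < threshold
```

For example, a wall 1 m away would qualify as a safety barrier, while another vehicle 1.2 m away would not.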
S703A, if a safety barrier exists on one side, the sensors on that side are kept off or turned off.
Specifically, if the sensors on the barrier side of the vehicle have not been turned on, they are kept off; if they are already on, they are turned off.
Optionally, the vehicle may shut down all sensors on the barrier side of the vehicle to save power consumption of the sensors to the maximum extent and extend the life of the sensors.
Alternatively, the vehicle may turn off only a portion of the sensors on the barrier side to further improve safety while properly conserving sensor power consumption.
S703B, for the other side, where no safety barrier exists, the sensors on that side are turned on normally.
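Steps S703A and S703B amount to gating each side's sensors on the presence of a safety barrier. The following is a minimal sketch of that gating; the function name and the side/sensor identifiers are assumptions for illustration, not part of the application.

```python
def select_sensors_by_barrier(sides_with_barrier, sensors_by_side):
    """Per-side sensor gating (S703A/S703B): on a side with a safety
    barrier, sensors are kept off or turned off; on a side without one,
    the side's sensors are turned on normally."""
    active = {}
    for side, sensors in sensors_by_side.items():
        active[side] = [] if side in sides_with_barrier else list(sensors)
    return active
```

For instance, with a wall on the left side, the left camera and ultrasonic radar would stay off while the right side remains fully monitored.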
The following further exemplifies several typical application scenarios:
Scene 1, the vehicle is parked in a parking space with other vehicles parked on one or both sides: on the side(s) where other vehicles are parked, the gap left between the vehicles is narrow and it is inconvenient for pedestrians or other vehicles to pass through, so the threat of damage to that side of the vehicle body is almost negligible. Turning on the sensors on that side for monitoring therefore has little value, and the sensors on that side (for example, the camera, the ultrasonic radar, etc.) may be kept off or turned off.
Scene 2, the vehicle is parked alongside an obstacle such as a wall on one side: because the gap between the vehicle and the wall is narrow, risks such as scraping, abnormal movement, and theft on that side are avoided, and the obstacle protects the safety of that side of the vehicle body, so the sensors on that side (for example, the camera, the ultrasonic radar, etc.) may be kept off or turned off.
In the embodiment of the application, the vehicle senses whether safety barriers exist around the vehicle body and reasonably utilizes the protective value of the barriers. This rules out potential threat factors on the barrier side of the vehicle body, avoids unnecessary wear on the sensors, prolongs the service life of the sensors, and improves the effectiveness of monitoring.
The above description has covered the cases in which the three elements used for sensor selection in the present application (i.e., the scene type of the surrounding environment, the moving objects in the surrounding environment, and the barrier condition of the surrounding environment) are each implemented individually. In a specific implementation, the three sensor selection schemes may also be combined with each other.
Several possible combined schemes are exemplified below.
4. The vehicle selects at least one sensor from a plurality of sensors mounted on the vehicle according to the scene type of the surrounding environment and the moving object in the surrounding environment.
For example, after the vehicle switches from the driving state to the key-off state, the vehicle first determines the scene type of the surrounding environment according to the history recorded during driving and selects at least one sensor corresponding to that scene type; it then detects, based on the at least one sensor, whether a moving object appears in the surrounding environment, and if a moving object appears or the frequency of moving objects exceeds a preset frequency, further selects sensors of other types or in greater numbers.
5. The vehicle selects at least one sensor from a plurality of sensors installed on the vehicle according to the scene type of the surrounding environment and the barrier condition of the surrounding environment.
For example, after the vehicle switches from the driving state to the key-off state, the vehicle first determines the scene type of the surrounding environment according to the history recorded during driving and selects at least one sensor corresponding to that scene type; it then detects, based on the selected sensors, whether safety barriers exist around the vehicle. For a side where no safety barrier exists, some or all of the sensors on that side continue to be selected; for a side where a safety barrier exists, all the sensors on that side are deselected or turned off.
6. The vehicle selects at least one sensor from a plurality of sensors mounted on the vehicle according to the moving object in the surrounding environment and the barrier condition of the surrounding environment.
For example, after the vehicle switches from the driving state to the key-off state, the cameras on the front, rear, left, and right sides are selected first, and whether safety barriers exist around the vehicle is detected based on these cameras. For a side where no safety barrier exists, some or all of the sensors on that side are selected; for a side where a safety barrier exists, all the sensors on that side are turned off. Thereafter, the vehicle continues to detect, based on the selected sensors, whether a moving object appears in the surrounding environment, and if a moving object appears or the frequency of moving objects exceeds a preset frequency, sensors of other types or in greater numbers are further selected.
7. The vehicle selects at least one sensor from a plurality of sensors mounted on the vehicle according to the scene type of the surrounding environment, the moving objects in the surrounding environment, and the barrier condition of the surrounding environment.
For example, after the vehicle switches from the driving state to the key-off state, the cameras on the front, rear, left, and right sides are all selected first, and whether safety barriers exist around the vehicle is detected based on these cameras. For a side where no safety barrier exists, some or all of the sensors on that side are selected; for a side where a safety barrier exists, all the sensors on that side are turned off. The vehicle then detects the scene type of the surrounding environment based on the selected cameras and selects the sensors corresponding to the current scene type. Thereafter, the vehicle continues to detect, based on all the selected sensors, whether a moving object appears in the surrounding environment, and if a moving object appears or the frequency of moving objects exceeds a preset frequency, further selects sensors of other types or in greater numbers.
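The combined scheme of part 7 can be condensed into a four-step pipeline: start with cameras, drop barrier-protected sides, add scene-specific sensors, then escalate on frequent moving objects. The sketch below is illustrative only; the callables and sensor names stand in for perception results and mappings that the application does not specify.

```python
def select_sensors(detect_barrier_sides, detect_scene, scene_sensor_map,
                   moving_object_freq, preset_freq, escalation_sensors):
    """Combined selection (part 7). All arguments are hypothetical
    callables/maps standing in for the vehicle's perception results."""
    sides = ("front", "rear", "left", "right")
    selected = {s: {"camera"} for s in sides}            # step 1: cameras on all sides
    for s in detect_barrier_sides():                      # step 2: barrier sides -> all off
        selected[s] = set()
    extra = scene_sensor_map.get(detect_scene(), set())   # step 3: scene-specific sensors
    for s in sides:
        if selected[s]:
            selected[s] |= extra
    if moving_object_freq() > preset_freq:                # step 4: escalate on activity
        for s in sides:
            if selected[s]:
                selected[s] |= escalation_sensors
    return selected
```

For example, a barrier on the left leaves that side unmonitored, while the remaining sides gain the scene's sensors and, when moving objects are frequent, an additional sensor such as the INS.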
It should be understood that the above-mentioned fourth to seventh parts are only examples of the combined embodiments, and other combined embodiments are possible in specific implementation.
S402, the vehicle monitors the surrounding environment based on the at least one sensor.
Specifically, the vehicle monitors, based on the at least one sensor, at least one factor by which the surrounding environment threatens the safety of the vehicle. For example, the MDC in the ECU system of the vehicle controls the selected sensors to collect data of the surrounding environment of the vehicle; after collecting the data, each sensor transmits it to the MDC; after receiving the data collected by each sensor, the MDC analyzes the data to obtain the factors by which the surrounding environment threatens the safety of the vehicle.
The following is illustrated by several specific examples:
Example 1, taking the camera as an example: the MDC may monitor, based on the data collected by the camera, whether obstacles are present in the surrounding environment, the type of each obstacle (e.g., pedestrian, bicycle, electric vehicle, or vehicle), the distance between an obstacle and the vehicle, the movement tendency of an obstacle relative to the vehicle (e.g., approaching, moving away, or stationary), and the like.
Example 2, taking the ultrasonic radar as an example: the MDC may monitor, based on the data collected by the ultrasonic radar, whether obstacles are present in the surrounding environment, the distance of each obstacle from the vehicle, and the like.
Example 3, taking the inertial navigation system as an example: the MDC may monitor, based on the data collected by the inertial navigation system, the vehicle's vibration value, movement value (or position change value), the duration of vehicle vibration, the duration of vehicle movement, and the like.
Example 4, taking the global positioning system as an example: the MDC may perform location tracking, vehicle condition monitoring, vehicle trace recording, and the like on the vehicle based on the data collected by the global positioning system.
It should be appreciated that the MDC has the ability to monitor, based on the at least one sensor, at least one factor by which the surrounding environment threatens the safety of the vehicle; whether the MDC actually obtains a given factor after analyzing the data collected by the at least one sensor depends on whether that factor is actually present in the surrounding environment. If the factor exists in the surrounding environment, the MDC obtains it by analyzing the data collected by the at least one sensor; if the factor does not exist, the MDC does not obtain it.
Further, after the vehicle obtains the factors that threaten the safety of the vehicle by the surrounding environment, the vehicle can determine the threat level of the surrounding environment to the vehicle according to the factors.
The MDC may determine the threat level of the surrounding environment to the vehicle based on one or more of the following: 1) the types of the factors monitored by the plurality of sensors; 2) the values of the factors monitored by the plurality of sensors; 3) the duration of each factor; 4) whether the surrounding environment has changed; 5) the number of changes in the surrounding environment; 6) the speed of the vehicle; and the like. It should be understood that some of the above items may be derived from other items; for example, the speed of the vehicle may be obtained by statistically analyzing the two factors "duration of vehicle movement" and "distance of vehicle movement".
The application does not limit the specific manner in which threat levels are divided. Two possible manners are listed below. Manner 1: the MDC classifies threat levels according to the number of types of factors monitored by the sensors, where fewer types of factors are monitored at a low threat level than at a high threat level. Manner 2: the MDC classifies threat levels according to the values of the factors monitored by the sensors, where the value of any given factor is lower at a low threat level than at a high threat level. It should be understood that manners 1 and 2 may be implemented separately or in combination, which is not limited herein.
The application does not limit the total number of threat levels. For example, there may be 1 threat level, i.e., "threat present"; there may be 2 levels, with level 1 being "low threat" and level 2 being "high threat"; or there may be 3 levels, with level 1 being "low threat", level 2 being "high threat", and level 3 being "dangerous". Optionally, "no threat" may also be treated as a level of its own, e.g., level 0 when no threat is present.
In the following, 4 threat levels are taken as an example: level 0 is "no threat", level 1 is "low threat", level 2 is "high threat", and level 3 is "dangerous".
The factors that the MDC can monitor based on the various sensors include: obstacle monitored, obstacle distance value (m), vehicle vibration value (g), vehicle position change (m), duration (ms), number of parking environment changes (N), and vehicle speed (m/s). Here, "obstacle monitored" means that the MDC monitors, based on a sensor, that an obstacle exists around the vehicle body; "obstacle distance value" is the distance from an obstacle to the vehicle body monitored by the MDC based on a sensor; "vehicle vibration value" is the vibration value of the vehicle monitored by the MDC based on a sensor; "vehicle position change" means that the MDC monitors, based on a sensor, a change in the vehicle's position (which may indicate vehicle theft or the effect of bad weather); "duration" is the duration of a factor monitored by the MDC based on a sensor, such as the duration of vehicle movement or vibration; "number of parking environment changes" is the number of times the parking environment of the vehicle changes, monitored by the MDC based on a sensor; and "vehicle speed" is the speed at which the vehicle moves, monitored by the MDC based on a sensor.
Example 1, when animals and/or people walk around the vehicle, the factor monitored by the MDC is: obstacle monitored. The MDC may determine that the surrounding environment poses no threat to the vehicle, with a threat level of 0.
Example 2, when animals and/or people approach the vehicle, the factors monitored by the MDC are: obstacle monitored and obstacle distance value, where the obstacle distance value is small (e.g., 0.5 m). The MDC may determine that the surrounding environment poses a low threat to the vehicle, with a threat level of 1.
Example 3, when animals and/or people touch the vehicle, the factors monitored by the MDC are: obstacle monitored, vehicle vibration value, and obstacle distance value, where the vehicle vibration value is small (e.g., 0.1 g) and the obstacle distance value is small (e.g., 0.01 m). The MDC may determine that the surrounding environment poses a high threat to the vehicle, with a threat level of 2.
Example 4, when animals and/or people attempt to force the door open, the factors monitored by the MDC are: obstacle monitored, vehicle vibration value, obstacle distance value, and duration, where the vehicle vibration value is large (e.g., 0.5 g), the obstacle distance value is small (e.g., 0.01 m), and the duration is long (e.g., 3 s). The MDC may determine that the surrounding environment is dangerous to the vehicle, with a threat level of 3.
Further, after determining the threat level, the vehicle may execute a response event corresponding to the threat level.
Specifically, a correspondence between threat levels and response events may be preset in the MDC, for example, stored in the memory. After the MDC determines the threat level of the surrounding environment to the vehicle, it executes the response event corresponding to that threat level according to the correspondence.
Referring to fig. 8, still taking 4 levels of threat levels (level 0 is no threat, level 1 is low threat, level 2 is high threat, and level 3 is dangerous) as an example, the response events corresponding to each level are respectively as follows:
After the vehicle is keyed off, the sensors are turned on (for which sensors are selected to be turned on, refer to the specific implementation of S401) and the vehicle enters the monitoring state, that is, the sensors collect data of the surrounding environment and the MDC analyzes the collected data to monitor whether a threat exists:
1) When the threat level is "no threat", the vehicle may execute no response event, or the response event is that the MDC controls the vehicle to remain in the "monitoring state", i.e., the sensors continue to monitor the surrounding environment of the vehicle;
2) When the threat level is "low threat", the MDC controls the vehicle to enter a "warning state", and the vehicle outputs warning information, such as flashing the vehicle lights, honking, and flashing the central control screen;
3) When the threat level is "high threat", the MDC controls the vehicle to enter an "event recording state" and records the events occurring in the surrounding environment, for example, by storing the video images collected by the camera;
4) When the threat level is "dangerous", the MDC controls the vehicle to enter an "alarm state", and the vehicle sends alarm information to a user device associated with the vehicle (such as a mobile phone or a smart watch), for example, by pushing a short message to a mobile phone APP and uploading the video recorded by the camera to the cloud for the user device to download.
Optionally, after the "alarm state" has lasted for a period of time, such as the 30 s shown in FIG. 8, the MDC may control the vehicle to return to the initial "monitoring state", i.e., stop sending alarm information to the user device associated with the vehicle and continue using the sensors to collect data of the surrounding environment. In this way, power consumption can be saved.
Optionally, while executing the response event corresponding to any threat level, the vehicle may maintain the "monitoring state" throughout, that is, the sensors continuously monitor the surrounding environment of the vehicle so that the threat level can be updated in real time.
It should be understood that the threat level of the surrounding environment monitored by the MDC may traverse the levels sequentially from low to high, that is, switch from "no threat" to "low threat", then from "low threat" to "high threat", and then to "dangerous"; it may also jump directly to one of the higher levels, for example directly to "high threat" or "dangerous", which is not limited by the present application.
Optionally, the response event corresponding to a high threat level may include the response events corresponding to the lower threat levels, to further improve the responsiveness of the vehicle. For example, when the threat level is "high threat", the MDC controls the vehicle to enter the "event recording state", and the vehicle records the events occurring in the surrounding environment while also outputting warning information. For another example, when the threat level is "dangerous", the MDC controls the vehicle to enter the "alarm state", and the vehicle outputs warning information and records the events occurring in the surrounding environment while also sending alarm information to the user device associated with the vehicle.
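The preset correspondence between threat levels and response events, including the optional rule that a higher level subsumes the lower levels' actions, can be sketched as a lookup table. The event names are hypothetical labels; in the text each action is carried out by the MDC instructing the corresponding ECU (BCM, CDC, TBOX, etc.).

```python
# Hypothetical level -> response-event table; higher levels include the
# lower levels' actions, per the optional rule described in the text.
RESPONSES = {
    0: ["keep_monitoring"],
    1: ["warn"],                                      # flash lights, honk, flash screen
    2: ["warn", "record_event"],                      # also store camera video
    3: ["warn", "record_event", "alert_user_device"], # also notify phone/cloud
}

def respond(level: int) -> list:
    """Look up the response events for a threat level; unknown levels
    fall back to plain monitoring."""
    return RESPONSES.get(level, ["keep_monitoring"])
```

Storing the table in memory and doing a lookup per level mirrors the "preset correspondence" the text describes, and keeps adding or reordering levels a data change rather than a code change.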
Optionally, the MDC executes the response event corresponding to the threat level specifically as follows: the MDC sends a control instruction to the ECU corresponding to the controlled element, so that this ECU drives the controlled element to execute the corresponding response event.
For example, referring to fig. 9, an example is given in which each ECU is controlled to drive its corresponding controlled element to execute the response event for each threat level.
1) In the key-off state, the vehicle automatically enters the monitoring state: the MDC selects the ultrasonic radar and the camera and monitors the environment around the vehicle based on the data they collect.
2) When the MDC detects, from the data collected by the ultrasonic radar and the camera, that an object is approaching the vehicle, the MDC determines that the surrounding environment poses a low threat to the vehicle and automatically switches to the warning state: the MDC wakes up the BCM, which controls the vehicle lights to flash and the horn to sound according to the MDC's instruction; the MDC also wakes up the CDC, which controls the central control screen to flash according to the MDC's instruction, so as to warn the approaching object that the camera is recording and monitoring.
3) When the MDC detects, from the data collected by the ultrasonic radar and the camera, that an object is touching the vehicle, the MDC determines that the surrounding environment poses a high threat to the vehicle and automatically switches to the event recording state: the MDC additionally selects the inertial navigation system (i.e., the inertial navigation system, the camera, and the ultrasonic radar monitor simultaneously), wakes up the CDC, and records video through the camera and stores it to the CDC; the video can also be stored on an external USB flash drive and imported into a personal computer (PC) for the user to view.
4) When the MDC detects, from the data collected by the ultrasonic radar, the camera, the inertial navigation system, and the like, that a more serious threat has occurred (for example, the BCM is triggered in a scenario of unauthorized entry into the vehicle or abnormal tire pressure, or the INS is triggered in a scenario of collision, door prying, window smashing, abnormal vibration, or movement), the MDC determines that the surrounding environment is "dangerous" to the vehicle and automatically switches to the alarm state: the MDC wakes up the CDC; the CDC increases the display brightness of the central control screen, turns the speaker volume up to maximum to support shouting a warning, uploads the previously recorded video to the cloud through the TBOX, and pushes an SMS or APP reminder to the user's mobile phone, which can download the video from the cloud.
Based on the above, the vehicle in the embodiment of the application can monitor the threat factors existing in the surrounding environment with different combinations of sensors according to at least one of the scene type of the surrounding environment, the moving objects in the surrounding environment, and the barrier condition of the surrounding environment. This optimizes the traditional monitoring mechanism, enables multiple types of events to be perceived and identified, improves monitoring precision, reduces wear on the sensors, and prolongs the service life of the sensors. In addition, the vehicle can identify the threat level from the monitored factors and execute the response event corresponding to that threat level, thereby eliminating the threat in time and improving the safety of the vehicle in the key-off state.
Based on the same technical concept, the embodiment of the present application further provides a vehicle monitoring device 1000. The device 1000 has the functions of implementing the method steps shown in fig. 4 to 9; for example, the device 1000 includes functions, modules, units, or means for executing the method steps shown in fig. 4 to 9, and these may be implemented by software or by hardware executing corresponding software.
For example, referring to fig. 10, the apparatus 1000 may comprise:
a processing unit 1001 configured to select at least one sensor from a plurality of sensors mounted on a vehicle, according to at least one of a scene type of a surrounding environment of the vehicle, a moving object in the surrounding environment, and a barrier situation of the surrounding environment;
a monitoring unit 1002 for monitoring the surroundings of the vehicle based on at least one sensor.
The specific implementation manner of the method steps executed by each unit may refer to the specific implementation manner when the vehicle executes the corresponding method steps in the embodiments shown in fig. 4 to fig. 9, and is not described herein again.
Based on the same technical concept, the embodiment of the application further provides a vehicle-mounted device 1100. Referring to fig. 11, the vehicle-mounted device includes at least one processor 1101 for performing the method steps shown in fig. 4 to 9.
Optionally, the vehicle-mounted device 1100 may further include a memory 1102, and the memory 1102 is represented by a dashed box in fig. 11 and is optional for the vehicle-mounted device 1100.
Optionally, the memory 1102 and the processor 1101 are communicatively connected by a bus, which is represented by a thick black line in fig. 11.
It should be understood that the processors mentioned in the embodiments of the present application may be implemented by hardware or may be implemented by software. When implemented in hardware, the processor may be a logic circuit, an integrated circuit, or the like. When implemented in software, the processor may be a general-purpose processor implemented by reading software code stored in a memory.
The processor may be, for example, a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
It will be appreciated that the memory referred to in the embodiments of the application may be volatile memory or nonvolatile memory, or may include both. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), which serves as an external cache. By way of example and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM).
It should be noted that when the processor is a general-purpose processor, a DSP, an ASIC, an FPGA or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, the memory (memory module) may be integrated into the processor.
It should be noted that the memory described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
Based on the same technical concept, embodiments of the present application also provide a computer-readable storage medium for storing instructions, which when executed, cause the method shown in fig. 4 to 9 to be implemented.
Based on the same technical concept, the embodiment of the present application further provides a computer program product, in which instructions are stored, and when the computer program product runs on a computer, the computer is caused to execute the methods shown in fig. 4 to 9.
It is to be understood that the above embodiments may be combined with each other.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.