WO2022206336A1 - Vehicle monitoring method and apparatus, and vehicle - Google Patents
- Publication number
- WO2022206336A1 (PCT/CN2022/080204)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- surrounding environment
- sensor
- sensors
- level
Classifications
- B60Q1/46 — Optical signalling or lighting devices primarily intended to give flashing caution signals during drive, other than signalling change of direction, e.g. flashing the headlights or hazard lights
- B60Q5/00 — Arrangement or adaptation of acoustic signal devices
- B60R25/10 — Fittings or systems for preventing or indicating unauthorised use or theft of vehicles, actuating a signalling device
- B60R25/30 — Detection related to theft or to other events relevant to anti-theft systems
- B60W30/08 — Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W40/02 — Estimation of non-directly measurable driving parameters related to ambient conditions
- G07C5/08 — Registering or indicating performance data other than driving, working, idle, or waiting time
- G07C5/0841 — Registering performance data
- H04W4/14 — Short messaging services, e.g. SMS or USSD
- B60W2552/50 — Input parameters relating to infrastructure: barriers
Definitions
- The present application relates to the field of smart/intelligent vehicles, and in particular to a vehicle monitoring method, apparatus, and vehicle.
- In existing approaches, sensors such as cameras, lidar, or millimeter-wave radar monitor the surrounding environment of a vehicle in the engine-off state.
- A single sensor can monitor only a few threat factors and cannot handle complex, changeable real-world scenarios; moreover, long-term monitoring wears the sensor heavily, shortens its service life, and consumes a large amount of energy.
- Embodiments of the present application provide a vehicle monitoring method, device, and vehicle, which balance the safety of a vehicle in the engine-off state against sensor service life and energy consumption.
- A vehicle monitoring method is applied to a vehicle in the engine-off state: the vehicle selects at least one sensor from the multiple sensors installed on it, according to at least one of the scene type of the surrounding environment, moving objects in the surrounding environment, and the barrier conditions of the surrounding environment; the vehicle then monitors its surroundings based on the selected sensor(s).
- The vehicle in the embodiment of the present application thus combines different sensors to monitor threat factors in the surrounding environment and improves on the traditional monitoring mechanism: it can perceive and identify multiple event types, which improves monitoring accuracy, reduces sensor wear, and prolongs sensor service life.
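The flow just described — choose sensors from environmental factors, then monitor with them — can be sketched as follows. This is an illustrative sketch only: the sensor names, the per-orientation inventory, and the selection rules are assumptions, not the patent's actual tables.

```python
# Hypothetical per-orientation sensor inventory (names are illustrative).
INSTALLED = {
    "front": ["camera_front", "uss_front"],
    "rear":  ["camera_rear", "uss_rear"],
    "left":  ["camera_left", "apa_left"],
    "right": ["camera_right", "apa_right"],
}

def select_sensors(scene_type, has_moving_object, barrier_sides):
    """Select sensors per the three factors; the rules are illustrative."""
    selected = []
    for side, sensors in INSTALLED.items():
        if side in barrier_sides:
            continue  # a close barrier shields this side; skip its sensors
        for s in sensors:
            if s.startswith("camera"):
                selected.append(s)  # camera is the default low-cost monitor
            elif has_moving_object:
                selected.append(s)  # escalate: enable non-camera sensors too
    return selected
```

With no moving object and a barrier on the left, only the front, rear, and right cameras would be enabled; a moving object escalates to the full non-shielded sensor set.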
- the following describes a specific implementation method for the vehicle to select at least one sensor from a variety of sensors installed on the vehicle according to at least one of scene types of the surrounding environment of the vehicle, moving objects in the surrounding environment, and barrier conditions of the surrounding environment.
- The vehicle can first identify the scene type of the surrounding environment and then, according to a correspondence between scene types and sensors, select from the installed sensors the sensor(s) corresponding to that scene type as the at least one sensor.
- In this way, the selected sensors are better suited to the scene type of the surrounding environment, which can improve the vehicle's monitoring accuracy.
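Such a correspondence between scene types and sensors could be stored as a simple lookup table. The scene names and sensor choices below are illustrative assumptions, not values given in the patent:

```python
# Hypothetical scene-type -> sensor correspondence table.
SCENE_SENSORS = {
    "underground_parking": ["camera", "ultrasonic"],    # dim, close quarters
    "street_roadside":     ["camera", "ultrasonic", "mmwave_radar"],
    "highway_roadside":    ["camera", "mmwave_radar"],  # fast-moving traffic
    "open_field":          ["camera"],                  # few threats expected
}

def sensors_for_scene(scene_type):
    # Fall back to the camera alone for unknown scene types.
    return SCENE_SENSORS.get(scene_type, ["camera"])
```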
- The vehicle can first use an installed camera to capture images of its surrounding environment; if the captured images show a moving object nearby, the vehicle then selects the camera together with other types of sensors as the at least one sensor. In other words, if there are no moving objects around the vehicle body, the camera alone can meet the monitoring needs; if there are moving objects, other sensors are enabled alongside the camera to increase monitoring intensity.
- In this way, the vehicle turns on as few sensors as possible while maintaining monitoring accuracy, thereby reducing sensor wear and prolonging sensor life.
- Further, the vehicle may require that the moving object satisfy a preset condition before selecting the camera plus other sensor types as the at least one sensor.
- The preset condition may be any one or more of the following: the moving object moves toward the vehicle; the moving object appears more often than a preset frequency; the moving object remains present longer than a preset duration; or the moving object is within a preset range of the vehicle.
- In this way, the vehicle further reduces sensor wear and prolongs sensor life while maintaining monitoring accuracy.
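The preset conditions listed above might be checked like this; the observation format and the threshold values are assumptions for illustration only:

```python
def meets_preset_condition(obs, *, max_distance_m=3.0,
                           min_count=3, min_duration_s=5.0):
    """Return True if a moving object warrants enabling extra sensors.

    `obs` is a hypothetical observation dict; per the description,
    any one of the conditions suffices.
    """
    return (
        obs.get("approaching", False)                    # moves toward the vehicle
        or obs.get("count", 0) > min_count               # appears too frequently
        or obs.get("duration_s", 0.0) > min_duration_s   # lingers too long
        or obs.get("distance_m", float("inf")) < max_distance_m  # too close
    )
```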
- The vehicle can first determine the barrier conditions around it. If there is a barrier in a first area, where the first area lies in a first orientation of the vehicle and the distance between the first area and the vehicle is smaller than a first threshold, then the first area is very unlikely to pose a safety threat to the vehicle, and the at least one sensor selected may exclude sensors that monitor in that first orientation.
- In this way, by sensing whether barriers exist around the body, the vehicle can make use of them: it avoids monitoring threat factors shielded by the barrier, improves safety, reduces sensor wear, prolongs sensor service time, and improves monitoring effectiveness.
- the barrier includes a wall or other vehicle.
- In this way, the vehicle can exploit the presence of a wall or other vehicle to rule out potential threat factors from that side, reduce sensor wear, prolong sensor service time, and improve monitoring effectiveness.
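The barrier-based exclusion can be sketched as follows, assuming a per-orientation distance reading; the data format and the 0.5 m threshold are illustrative assumptions:

```python
def shielded_orientations(barrier_distances_m, threshold_m=0.5):
    """Orientations whose monitoring sensors can stay off.

    `barrier_distances_m` maps an orientation ("left", "front", ...) to the
    distance of the nearest barrier on that side.
    """
    return {side for side, d in barrier_distances_m.items() if d < threshold_m}

def filter_sensors(sensors_by_side, barrier_distances_m, threshold_m=0.5):
    # Keep only sensors on sides that are not shielded by a close barrier.
    skip = shielded_orientations(barrier_distances_m, threshold_m)
    return [s for side, ss in sensors_by_side.items() if side not in skip
            for s in ss]
```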
- the following introduces a specific implementation method for the vehicle to monitor the surrounding environment of the vehicle based on at least one sensor.
- Based on the at least one sensor, the vehicle monitors at least one factor in the surrounding environment that threatens vehicle safety; it then determines, from the at least one factor, the threat level of the surrounding environment to the vehicle, and executes the response event corresponding to that level.
- In this way, the vehicle can eliminate threats in time, improving its safety in the engine-off state.
- The threat level of the surrounding environment to the vehicle is determined according to at least one of: the type of the at least one factor, the value of the at least one factor, the duration of the at least one factor, the number of changes in the surrounding environment, or the speed of the vehicle.
- In this way, the vehicle can subdivide the threat level of the surrounding environment and further improve its safety in the engine-off state.
- The threat level includes, from low to high, a first level, a second level, and a third level.
- The response events corresponding to the first level include any one or more of the following: flashing the lights, sounding the horn, and the like.
- The response events corresponding to the second level include the first level's response events plus using the camera to record and save video.
- The response events corresponding to the third level include the second level's response events plus sending a reminder to the user's device and uploading the video to the cloud for the user's device to download.
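The three-level escalation above is cumulative, each level inheriting the lower level's response events. It can be modelled as follows; the event names are illustrative placeholders:

```python
# New response events introduced at each threat level; per the description,
# each level also inherits all events of the levels below it.
LEVEL_EVENTS = {
    1: ["flash_lights", "sound_horn"],
    2: ["record_and_save_video"],
    3: ["notify_user_device", "upload_video_to_cloud"],
}

def response_events(threat_level):
    """Cumulative response events for a threat level (1 = lowest)."""
    events = []
    for level in range(1, threat_level + 1):
        events.extend(LEVEL_EVENTS[level])
    return events
```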
- an embodiment of the present application provides a vehicle monitoring device.
- The device is, for example, a vehicle in the engine-off state, or a component or processing chip located in such a vehicle.
- The device includes functions, modules, units, or means for executing the method described in the first aspect or any possible design of the first aspect.
- For example, the apparatus may include: a processing unit configured to select at least one sensor from the multiple sensors installed on the vehicle according to at least one of the scene type of the surrounding environment, the moving objects in it, and its barrier conditions; and a monitoring unit configured to monitor the surrounding environment of the vehicle based on the at least one sensor.
- An embodiment of the present application provides a vehicle monitoring device applied to a vehicle in the engine-off state.
- The vehicle monitoring device includes a processor and a memory; the memory stores computer program instructions, and the processor executes them to implement the method described in the first aspect or any possible design of the first aspect.
- For example, the processor may be configured to: select at least one sensor from the multiple sensors installed on the vehicle according to at least one of the scene type of the surrounding environment, moving objects in the surrounding environment, and barrier conditions of the surrounding environment; and monitor the surrounding environment of the vehicle based on the at least one sensor.
- an embodiment of the present application provides a vehicle, including: multiple sensors; and the vehicle monitoring device according to the second aspect or the vehicle monitoring device according to the third aspect.
- An embodiment of the present application provides a computer-readable storage medium for storing instructions that, when executed, implement the method described in the first aspect or any possible design of the first aspect.
- An embodiment of the present application provides a computer program product storing instructions that, when run on a processor, implement the method described in the first aspect or any possible design of the first aspect.
- FIG. 1 is a schematic diagram of the architecture of a possible vehicle-mounted system provided by an embodiment of the present application
- FIG. 2 is a schematic diagram of a possible sensor layout provided by an embodiment of the present application.
- FIG. 3 is a schematic diagram of the architecture of a possible ECU system provided by an embodiment of the present application.
- FIG. 4 is a flowchart of a vehicle monitoring method provided by an embodiment of the present application.
- FIG. 5 is a flowchart of a sensor selection method based on scene recognition provided by an embodiment of the present application
- FIG. 6 is a flowchart of a method for selecting a sensor based on a moving object according to an embodiment of the present application
- FIG. 7 is a flowchart of a sensor selection method based on an environmental barrier condition provided by an embodiment of the present application.
- FIG. 8 is a schematic diagram of a possible threat level and a corresponding response event
- FIG. 9 is a schematic diagram of a possible method of performing a response event
- FIG. 10 is a schematic structural diagram of a vehicle monitoring device 1000 provided by an embodiment of the application.
- FIG. 11 is a schematic structural diagram of an in-vehicle device 1100 provided by an embodiment of the present application.
- The embodiments of the present application apply to an in-vehicle system deployed in a vehicle. It should be understood that the embodiments are mainly described for a vehicle in the engine-off (parked) state, but they can also apply to vehicles in other states, for example vehicles running slowly, or vehicles that have stopped but not been turned off; this application does not limit this.
- The engine-off state means that the vehicle's engine is turned off and the vehicle is stopped.
- the architecture of the vehicle-mounted system at least includes a sensor system and an electronic control unit (Electronic Control Unit, ECU) system.
- the sensor system can collect the data of the surrounding environment of the vehicle, and input the collected data into the ECU system for processing by the ECU system.
- The sensor system includes a variety of sensors, such as but not limited to: ultrasonic sensors (USS), cameras, an inertial navigation system (INS), and a global positioning system (GPS).
- Ultrasonic radar is radar that uses ultrasonic waves for detection.
- Its working principle is to measure distance from the time difference between the transmitter emitting an ultrasonic pulse and the receiver receiving its echo.
- Ultrasound refers to sound with a vibration frequency above 20,000 Hz, which exceeds the general upper limit of human hearing; such inaudible sound waves are called ultrasonic.
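The time-of-flight principle reduces to a one-line calculation: the pulse travels to the obstacle and back, so the distance is half the round-trip time multiplied by the speed of sound (about 343 m/s in air at roughly 20 °C):

```python
SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 degrees C

def ultrasonic_distance_m(round_trip_s):
    """Distance to an obstacle from the echo's round-trip time in seconds."""
    # Halve because the pulse covers the distance twice (out and back).
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0
```

For example, a 10 ms round trip corresponds to an obstacle about 1.7 m away.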
- Ultrasonic radar includes, but is not limited to, two types. The first is installed on the front and rear bumpers, i.e., the reversing radar used to measure obstacles in front of and behind the vehicle; the industry calls it UPA. The second is installed on the sides of the vehicle and measures the distance to side obstacles; the industry calls it APA.
- UPA is a short-range ultrasonic radar, mainly installed on the front and rear of the vehicle body, with a detection range of 25 cm to 2.5 m. Because the detection distance is short, Doppler-effect and temperature interference on the echo are small, so detection is relatively accurate.
- APA is a long-range ultrasonic sensor, mainly used on the sides of the vehicle body, with a detection range of 35 cm to 5 m, enough to cover a parking space. It has strong directivity and better propagation than UPA, and is not easily interfered with by other APAs or UPAs.
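With the nominal UPA and APA ranges given above, one can check whether a distance reading falls inside a sensor's usable window. The ranges come from the text; the helper itself is an illustrative sketch:

```python
# Nominal detection ranges from the description (metres).
DETECTION_RANGE_M = {
    "UPA": (0.25, 2.5),  # short-range, front/rear bumpers
    "APA": (0.35, 5.0),  # long-range, vehicle sides
}

def reading_in_range(sensor_type, distance_m):
    """True if `distance_m` lies within the sensor's nominal window."""
    lo, hi = DETECTION_RANGE_M[sensor_type]
    return lo <= distance_m <= hi
```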
- Fig. 2 shows a schematic diagram of the layout of multiple sensors on a vehicle.
- the ultrasonic radars a, b, g, h, i, and j are short-range ultrasonic radars, which are arranged on the front and rear of the vehicle.
- the ultrasonic radars c, d, e, and f are long-range ultrasonic radars, which are arranged on the left and right sides of the vehicle.
- the camera in the embodiment of the present application may include any camera for acquiring an image of the environment where the vehicle is located, for example, including but not limited to: an infrared camera, a visible light camera, and the like.
- camera 1 is arranged on the front side of the vehicle and can capture images in front of the vehicle;
- camera 2 is arranged on the rear side of the vehicle and can capture images behind the vehicle;
- Cameras 3 and 4 are arranged on the left and right sides of the vehicle, respectively, and can capture images of the left and right sides of the vehicle.
- The inertial navigation system is a navigation-parameter computation system that uses gyroscopes and accelerometers as its sensitive devices.
- The system establishes a navigation coordinate system from the gyroscope output and computes the carrier's (e.g., the vehicle's) speed and position in that coordinate system from the accelerometer output.
- The Global Positioning System (GPS), also known as the global satellite positioning system, is a medium-Earth-orbit satellite navigation system that combines satellite and communication technology and uses navigation satellites for timing and ranging.
- FIG. 2 is only an example; in practical applications, the sensors may be positioned differently from FIG. 2, there may be more or fewer sensors, and other types of sensors may be included, none of which this application limits.
- The ECU system can process the data collected by each sensor in the sensor system; for example, it processes image data collected by the camera to identify objects (such as obstacles) in the image. The ECU system can also make decisions based on the processing results and drive controlled components to work. The controlled components include but are not limited to: sensors, speakers, lights, the central control screen, etc.
- The ECU system consists of multiple ECUs that can communicate with each other to exchange data.
- For example, each ECU is connected to a Controller Area Network (CAN) bus and exchanges data over that bus.
- the specific implementation of the ECU can be any device or module with processing functions.
- The ECU may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc.
- the general-purpose processor may be a microprocessor or any conventional processor.
- The ECUs in the embodiments of the present application include but are not limited to: the vehicle-mounted mobile data center (MDC), the body control manager (BCM), the intelligent cockpit domain controller (CDC), and the telematics box (TBOX).
- MDC is the core ECU of the vehicle.
- The MDC provides computation and control: it computes over the data collected by each sensor, converts the results into control instructions, and controls the controlled components through those instructions.
- MDC sends the control instruction to the ECU corresponding to the controlled element.
- the ECU corresponding to the controlled element drives the controlled element to work according to the control instruction.
- The MDC is also connected to memory (ROM/Flash/EEPROM, RAM), input/output interfaces (I/O), and other external circuits; the memory can store programs.
- The vehicle monitoring method provided by the embodiments of the present application can be completed by the MDC directly or by invoking other components, for example by calling a processing program of the embodiments stored in memory to operate on the data collected by each sensor and control the controlled components.
- The BCM, also known as the body computer, is an ECU used to control the electrical system of the vehicle body.
- Components controlled by BCM include but are not limited to: power windows, power mirrors, air conditioners, lights (such as headlights, turn signals, etc.), anti-theft locking system, central locking, defrosting device, etc.
- BCM can be connected with other vehicle ECUs through CAN bus.
- CDC is an ECU used to control various components in the smart cockpit.
- Components in the smart cockpit include but are not limited to: the instrument screen, the central control panel screen (central control screen), the head-up display, microphones, cameras, loudspeakers, the Bluetooth module, etc.
- The intelligent cockpit can control the running state and trajectory of an autonomous vehicle through human-computer interaction according to passengers' needs, so that in-cockpit interaction or remote control can issue the same commands to control the vehicle's operation.
- The TBOX is mainly used to communicate with a back-end system or with an application (APP) on a user device, enabling display and control of vehicle information associated with the APP.
- The TBOX can use 3G cellular communication, such as code division multiple access (CDMA), EVDO, or global system for mobile communications (GSM)/general packet radio service (GPRS); 4G cellular communication, such as long term evolution (LTE); or 5G cellular communication.
- The TBOX can communicate with a wireless local area network (WLAN) using WiFi.
- the TBOX may communicate directly with the device using an infrared link, Bluetooth, or ZigBee.
- the TBOX may also communicate based on other wireless protocols, for example, the TBOX communicates directly with other vehicles and/or roadside stations based on the vehicle's
- FIG. 3 is only an example; in practical applications, the number and layout of ECUs may differ, which this application does not specifically limit. In addition, the ECUs in FIG. 3 may be deployed independently or integrated with one another, which the embodiments of the present application do not limit.
- an embodiment of the present application provides a vehicle monitoring method. Taking the method applied to the vehicle-mounted system shown in FIG. 1 as an example, referring to FIG. 4 , the method includes the following processes:
- the vehicle selects at least one sensor from a variety of sensors installed on the vehicle according to at least one of scene types of the surrounding environment, moving objects in the surrounding environment, and barrier conditions of the surrounding environment.
- Specifically, the ECU system in the vehicle determines the types of events that need to be monitored according to at least one of the scene type of the surrounding environment, the moving objects in it, and its barrier conditions, and then selects from the sensors installed on the vehicle at least one sensor corresponding to those event types, i.e., the selected sensor(s) can effectively monitor those events.
- A specific implementation of selecting at least one sensor from the installed sensors may be: selecting and turning on that sensor. It should be understood that if some of the selected sensors are already on, only the ones not yet on need to be turned on. Optionally, any other sensors that were previously selected or turned on are deselected or turned off.
- the three elements according to which the vehicle selects the at least one sensor from the multiple sensors (i.e., the scene type of the surrounding environment, the moving objects in the surrounding environment, and the barrier conditions of the surrounding environment) can be used independently or in combination with each other, which is not limited in this application.
- the scene type of the surrounding environment, the moving objects in the surrounding environment, and the barrier conditions of the surrounding environment are introduced separately below.
- the vehicle selects at least one sensor according to the scene type of the surrounding environment.
- the scene type of the surrounding environment characterizes the classification of the surrounding environment according to formation method, functional use, geographical location, time period, facilities, natural environment elements, human activity characteristics, building type, or privacy.
- This application does not limit the specific division method of scene types. For example: according to how the surrounding environment is formed, the scene types can be divided into natural environment, artificial environment, etc.; according to the functions of the surrounding environment, into living environment, ecological environment, etc.; according to different elements in the surrounding environment, into atmospheric environment, water environment, soil environment, biological environment, geological environment, etc.; according to the way humans gather in the surrounding environment, into rural environment, urban environment, etc.; according to the privacy of the surrounding environment, into private environment, public environment, etc.; according to the types of buildings in the surrounding environment, into underground parking lot environment, street roadside environment, highway roadside environment, field environment, etc.
- the vehicle can preset the correspondence between scene types and sensors; before using sensors to monitor the surrounding environment, the vehicle first identifies the scene type of the surrounding environment and then, according to that correspondence, selects the sensor corresponding to the scene type from the various sensors installed on the vehicle. In this way, the vehicle does not have to start all sensors to monitor the surrounding environment, but selects the corresponding sensors as needed, which can improve monitoring accuracy, reduce wear on other sensors, prolong sensor life, and reduce the overall energy consumption of the vehicle's sensors.
- FIG. 5 shows a flowchart of a method for selecting a sensor based on scene recognition.
- the method can be applied to the in-vehicle system shown in FIG. 1 , and can be specifically executed by the ECU system in the in-vehicle system.
- the method includes:
- the vehicle recognizes the scene type of the surrounding environment.
- the MDC in the ECU system of the vehicle identifies the scene type of the surrounding environment of the vehicle.
- the MDC obtains the historical records of the vehicle during driving (such as image data collected by the camera during driving, location data in the navigation system, etc.), and then determines the scene type of the surrounding environment of the vehicle according to the historical records.
- the MDC first collects data of the surrounding environment based on one or more sensors (eg, cameras) on the vehicle, and then determines the scene type of the surrounding environment of the vehicle based on the data.
- the vehicle selects a sensor corresponding to the scene type of the surrounding environment from a variety of sensors installed on the vehicle according to the corresponding relationship between the scene type and the sensor.
- the MDC may preset the first correspondence between the scene type and the sensor, for example, save the first correspondence between the scene type and the sensor in the memory. After the MDC determines the scene type of the surrounding environment, according to the first correspondence, a sensor corresponding to the scene type is selected from a variety of sensors installed on the vehicle.
- the sensor corresponding to each scene type can be determined according to the type of event that needs to be monitored under the scene type. This application does not specifically limit the specific correspondence between the scene type and the sensor. Several possible examples are listed below:
- Example 1 Inside a residential area: In this scenario, there are few pedestrians and the road conditions are simple; generally, only slight vehicle scratches may occur. Therefore, only the camera and ultrasonic radar need to be selected to ensure the safety of the vehicle.
- Example 2 Street roadside: In this scenario, the flow of people and vehicles is large, and the situation is complex and changeable. Various security threats such as towing, vehicle scratches, and theft may occur, so the camera, ultrasonic radar, inertial navigation system, and global positioning system can be selected to ensure the safety of the vehicle.
- Example 3 Open/underground parking lot: The situation is simple, but there is a risk of vehicle scratching and theft, so the inertial navigation system, camera, and ultrasonic radar can be selected to sense approaching objects.
- Example 4 Unfamiliar outdoor environment: The vehicle is parked in an unfamiliar outdoor location where the risk of theft is extremely high, so the camera and the inertial navigation system (INS) can be selected to sense changes in the vehicle's vibration amplitude and orientation.
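- The "first correspondence" between scene types and sensors described in the examples above can be sketched as a simple lookup. This is a minimal illustration only: the scene keys and sensor names are assumptions for this sketch, not identifiers taken from the patent.

```python
# Hypothetical sketch of the "first correspondence" between scene types and
# sensors, roughly following Examples 1-4 above. All names are illustrative.
SCENE_TO_SENSORS = {
    "residential_area":   {"camera", "ultrasonic_radar"},
    "street_roadside":    {"camera", "ultrasonic_radar", "ins", "gps"},
    "parking_lot":        {"camera", "ultrasonic_radar", "ins"},
    "unfamiliar_outdoor": {"camera", "ins"},
}

def select_sensors_for_scene(scene_type, installed_sensors):
    """Pick, from the sensors installed on the vehicle, those that the
    correspondence maps to the recognized scene type."""
    wanted = SCENE_TO_SENSORS.get(scene_type, {"camera"})  # assumed fallback
    return wanted & set(installed_sensors)
```

Intersecting with the installed-sensor set ensures the correspondence never selects hardware the vehicle does not actually have.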
- before step S502, some sensors may already have been selected for monitoring the surrounding environment of the vehicle. Therefore, when step S502 is performed, if the sensor corresponding to the scene type of the surrounding environment has not been selected, it is selected; if it has already been selected for monitoring the surrounding environment of the vehicle, it simply continues monitoring. Optionally, if other sensors that do not correspond to the scene type of the surrounding environment have been selected for monitoring, their monitoring of the surrounding environment may be canceled.
- the embodiments of the present application formulate different monitoring mechanisms for different scene types of the surrounding environment of the vehicle, that is, a sensor corresponding to the scene type of the surrounding environment is selected from a variety of sensors installed on the vehicle for monitoring the surrounding environment. It can improve the monitoring accuracy, reduce the loss of other sensors, prolong the life of the sensor, and reduce the overall energy consumption of the vehicle sensor.
- the vehicle selects at least one sensor from a variety of sensors installed on the vehicle according to the moving objects in the surrounding environment.
- the moving objects in the embodiments of the present application refer to all objects that can move, including living objects (such as people, cats, dogs, rabbits, snakes, butterflies, wolves, birds, etc.) and non-living objects (such as vehicles, drones, falling rocks, fallen leaves, etc.). It should be understood that the movement of a moving object can be autonomous (such as people walking, surrounding vehicles driving, birds flying, animals running, etc.) or passive (such as leaves blown by the wind, landslides, etc.), which is not restricted in this application.
- the vehicle can select sensors from the various sensors installed on it by sensing whether moving objects are present around the vehicle body, which can improve monitoring accuracy, reduce wear on other sensors, prolong sensor life, and reduce the overall energy consumption of the vehicle's sensors.
- Scenario 1 Late night community/street roadside: Pedestrians or vehicles pass by occasionally, and almost no moving objects appear.
- a small number of sensors can be selected from the various sensors installed on the vehicle as the first sensor, and the first sensor can then be used to detect whether there are moving objects in the surrounding environment.
- the first sensor and other types of sensors (such as ultrasonic radar, the inertial navigation system, etc.) are selected from the various sensors installed on the vehicle for monitoring the surrounding environment of the vehicle.
- the vehicle may select other types of sensors other than the first sensor after determining that a moving object appears in the surrounding environment and the moving object satisfies a preset condition.
- the preset conditions include but are not limited to any one or more of the following: the moving object moves in a direction approaching the vehicle; the occurrence frequency of the moving object exceeds a preset frequency; the appearance duration of the moving object exceeds a preset duration; the moving object is within a preset range of the vehicle.
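- The preset conditions above can be checked with a simple predicate. This is a hedged sketch: the object fields and every threshold value (frequency, duration, range) are illustrative assumptions, not values specified by the patent.

```python
from dataclasses import dataclass

@dataclass
class MovingObject:
    approaching: bool          # moving in a direction approaching the vehicle
    appearances_per_min: int   # occurrence frequency
    seen_for_s: float          # appearance duration, seconds
    distance_m: float          # distance from the vehicle, meters

def meets_preset_condition(obj,
                           preset_freq=5,          # assumed threshold
                           preset_duration_s=30.0, # assumed threshold
                           preset_range_m=5.0):    # assumed threshold
    """True if the object satisfies any one of the preset conditions
    listed above (an "any one or more" check, per the text)."""
    return (obj.approaching
            or obj.appearances_per_min > preset_freq
            or obj.seen_for_s > preset_duration_s
            or obj.distance_m < preset_range_m)
```

When the predicate returns True, the vehicle would go on to select additional sensor types as described in the surrounding text.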
- FIG. 6 shows a flowchart of a method for selecting a sensor based on a moving object. The method can be applied to the in-vehicle system shown in FIG. 1 , and the method includes:
- the vehicle starts the camera, enters the monitoring state, and the camera captures images of the surrounding environment;
- the vehicle monitors whether there is a moving object around the vehicle body based on the image captured by the camera;
- the MDC identifies the captured image and detects whether there is a moving object in the surrounding environment.
- if there is a moving object, the vehicle continues to monitor the frequency of the object's appearance based on the images captured by the camera; if there is no moving object, the camera simply remains on.
- the vehicle may determine other types of sensors according to at least one of the moving direction of the moving object, the appearance frequency of the moving object, the appearance duration of the moving object, or the distance between the moving object and the vehicle.
- Example 1 The MDC of the vehicle pre-sets the second correspondence between the moving direction of the moving object and the sensor. After the MDC obtains the moving direction of the moving object, it selects from a variety of sensors installed on the vehicle according to the second correspondence. second sensor.
- taking the first sensor as a camera as an example: if the moving object moves along a curve in the direction approaching the vehicle, the object will not approach the vehicle soon (it may just be a pedestrian passing by), so ultrasonic radar can be selected to monitor together with the camera; if the moving object moves in a straight line toward the vehicle, it will soon approach the vehicle, so ultrasonic radar, the inertial navigation system, the global positioning system, etc. can be selected at the same time to cooperate with the camera and quickly improve the vehicle's monitoring capability.
- Example 2 The MDC of the vehicle presets a third correspondence between the occurrence frequency of moving objects and sensors. After the MDC obtains the occurrence frequency of a moving object, it determines, according to the third correspondence, the third sensor that needs to be selected from the various sensors installed on the vehicle.
- taking the first sensor as a camera as an example: if the occurrence frequency of moving objects is lower than a first frequency threshold, ultrasonic radar is selected to monitor together with the camera; if the occurrence frequency is higher than the first frequency threshold and lower than a second frequency threshold, ultrasonic radar and the global positioning system are selected to monitor together with the camera; if the occurrence frequency is higher than the second frequency threshold, ultrasonic radar, the global positioning system, and the inertial navigation system are selected to monitor together with the camera.
- the first frequency threshold is lower than the second frequency threshold, and both the first frequency threshold and the second frequency threshold are greater than 0.
- the occurrence frequency of moving objects can be the number of times moving objects appear within a preset time range, such as the number of appearances within one minute.
- for example, the MDC can sample every second whether a moving object appears, and if so, increment the count value by 1.
- this is only an example, and the present application does not limit the specific implementation of the MDC for counting the occurrence frequency of moving objects.
- Example 3 The MDC of the vehicle presets a fourth correspondence between the appearance duration of moving objects and sensors. After the MDC obtains the appearance duration of a moving object, it determines, according to the fourth correspondence, the fourth sensor that needs to be selected from the various sensors installed on the vehicle.
- ultrasonic radar is selected to cooperate with the camera for monitoring; if the moving object's appearance duration reaches 30 s, ultrasonic radar and GPS are selected to cooperate with the camera; if the appearance duration reaches 1 min, ultrasonic radar, the global positioning system, and the inertial navigation system are selected to cooperate with the camera for monitoring.
- Example 4 The MDC of the vehicle presets a fifth correspondence between the distance from a moving object to the vehicle and sensors. After the MDC obtains the distance between a moving object and the vehicle, it determines, according to the fifth correspondence, the fifth sensor that needs to be selected from the various sensors installed on the vehicle.
- for example, if the distance between the moving object and the vehicle exceeds 5 meters, ultrasonic radar is selected for monitoring together with the camera; if the distance is 2 to 5 meters, ultrasonic radar and the global positioning system are selected to cooperate with the camera; if the distance is within 2 meters, ultrasonic radar, the global positioning system, and the inertial navigation system are selected to cooperate with the camera.
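- The distance-based fifth correspondence of Example 4 can be sketched as a tiered lookup. Note the outermost tier (beyond 5 meters) is an inference from the pattern of the example, and the sensor names are illustrative.

```python
def sensors_for_distance(distance_m):
    """Sketch of the fifth correspondence from Example 4: the closer a
    moving object is, the more sensors join the camera. The >5 m tier
    is an assumption inferred from the 2-5 m and <2 m tiers."""
    if distance_m > 5.0:
        return {"camera", "ultrasonic_radar"}
    if distance_m > 2.0:                       # 2 to 5 meters
        return {"camera", "ultrasonic_radar", "gps"}
    return {"camera", "ultrasonic_radar", "gps", "ins"}  # within 2 meters
```

The third (frequency) and fourth (duration) correspondences would follow the same tiered-escalation shape with different thresholds.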
- after the vehicle selects other types of sensors to cooperate with the camera in monitoring the surrounding environment, if it detects that moving objects appear less frequently or have disappeared, the other types of sensors can be reduced or turned off.
- the embodiments of the present application formulate different monitoring mechanisms by sensing whether there are moving objects around the vehicle: if there are no moving objects around the vehicle body, the camera alone can be selected to meet the monitoring requirement; if there are moving objects around the vehicle body, other sensors are selected to work with the camera to increase the monitoring intensity. This can improve monitoring accuracy, reduce wear on other sensors, prolong sensor life, and reduce the overall energy consumption of the vehicle's sensors.
- the vehicle selects at least one sensor from a variety of sensors installed on the vehicle according to the barrier conditions of the surrounding environment.
- the barrier condition of the surrounding environment includes whether there is a barrier in the surrounding environment, the type of barrier in the surrounding environment, or the degree of barrier in the surrounding environment, etc.
- a barrier refers to another object that can protect the safety of the vehicle, for example, by blocking other moving objects from approaching the vehicle. This application does not limit the specific type of barrier; for example, barriers include, but are not limited to, walls, other vehicles, trees, fences, and the like.
- the degree of barrier of the surrounding environment may refer to the degree or ability of the barriers in the surrounding environment to prevent other moving objects from approaching or damaging the vehicle.
- the vehicle may determine the degree of barrier of the surrounding environment based on the openness of the surrounding environment. For example, the lower the openness, the heavier the barrier degree; conversely, the higher the openness, the lighter the barrier degree.
- the barrier degree of the surrounding environment is related to the scene type of the surrounding environment, the closed degree of the space of the surrounding environment, or the size of the space of the surrounding environment, and the like.
- for example, where the space is poorly enclosed and the mobility of pedestrians and other vehicles is high, the openness of the surrounding environment is relatively high, so the degree of barrier is relatively light; in a residential area with access control, the space is generally enclosed, the mobility of pedestrians and other vehicles is low, and the openness is moderate, so the degree of barrier is moderate; in a private garage, the space is small and enclosed, the mobility of pedestrians and other vehicles is very low, and the openness is low, so the degree of barrier is high.
- the vehicle monitoring the barrier condition of the surrounding environment can be implemented in various ways, such as judging the barrier condition of the surrounding environment based on the scene type of the surrounding environment, or judging the barrier condition of the surrounding environment based on the size of the space of the surrounding environment, or The barrier condition of the surrounding environment is judged based on the degree of enclosure of the surrounding environment.
- the vehicle can monitor whether there are safety barriers in each direction around it. If there is no safety barrier in a certain direction, that direction is relatively open, its degree of barrier is light, and its safety is poor, so a sensor capable of monitoring that direction can be selected from the various sensors installed on the vehicle. If there is a safety barrier in another direction, the openness in that direction is low, the degree of barrier is high, and the safety is high, so the sensors used to monitor that direction can be appropriately reduced or turned off.
- when the distance between the area where a barrier is located and the vehicle is less than a first threshold, the barrier is a safety barrier.
- FIG. 7 shows a flowchart of a sensor selection method based on environmental barrier conditions, which can be applied to the vehicle-mounted system shown in FIG. 1 , and the method includes:
- all the sensors may be turned on, or only some of the sensors may be turned on (for example, only the camera is turned on), which is not limited in this application.
- the vehicle monitors whether there is a safety barrier around the vehicle body.
- the vehicle determines whether there are walls or other vehicles around the vehicle body based on the images captured by the camera.
- when the distance between a barrier and the vehicle is less than a preset distance, the barrier is a safety barrier.
- the preset distance is, for example, 1 meter, 1.5 meters, or 2 meters, which is not limited in this application.
- for different types of barriers, the value of the preset distance may be different. For example, for a wall, the preset distance is 1.5 meters; for other vehicles, the preset distance is 1 meter.
- if a safety barrier exists on one side of the vehicle, the sensors on that side are kept off; if the sensors on that side have already been turned on, they are turned off.
- the vehicle can turn off all sensors on the side of the barrier to minimize the power consumption of the sensors and prolong their life.
- the vehicle may only turn off part of the sensors on the side of the barrier, so as to further improve the safety performance while appropriately saving the power consumption of the sensors.
- Scenario 1 The vehicle is parked in a parking space with vehicles parked on one or both sides: the gap between the vehicles is narrow, and it is inconvenient for pedestrians or other vehicles to pass through the area, so the threat of damage to that side of the body is almost negligible. In this case, turning on the sensors on that side for monitoring is of little value, so the sensors on that side (such as the camera and ultrasonic radar) need not be turned on, or may be turned off.
- Scenario 2 The vehicle is parked with an obstacle such as a wall on one side: the gap between the car and the wall is narrow, which avoids the risk of vehicle scratching, abnormal movement, theft, etc. The obstacle protects that side of the body, so the sensors on that side (such as the camera and ultrasonic radar) need not be turned on, or may be turned off.
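- The per-side logic of the two scenarios above can be sketched as follows. The side names and sensor names are illustrative assumptions for this sketch.

```python
def active_side_sensors(barrier_by_side, sensors_by_side):
    """Keep sensors on open sides; drop every sensor on a side that a
    safety barrier already protects (per Scenarios 1 and 2 above)."""
    active = set()
    for side, sensors in sensors_by_side.items():
        if not barrier_by_side.get(side, False):  # no barrier -> keep monitoring
            active |= sensors
    return active
```

A variant that keeps only part of the sensors on a barriered side, as the text also allows, would filter each side's set instead of dropping it entirely.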
- the vehicle in the embodiments of the present application senses whether there are safety barriers around the vehicle body, makes reasonable use of the protection the barriers provide, avoids potential threat factors at the sides of the body, avoids unnecessary sensor wear, prolongs sensor service time, and improves monitoring effectiveness.
- the above describes the three elements used in the present application for selecting sensors (i.e., the scene type of the surrounding environment, the moving objects in the surrounding environment, and the barrier conditions of the surrounding environment).
- the above three solutions for selecting sensors can also be implemented in combination with each other.
- the vehicle selects at least one sensor from a variety of sensors installed on the vehicle according to the scene type of the surrounding environment and the moving objects in the surrounding environment.
- for example, after the vehicle switches from the driving state to the engine-off state, it first determines the scene type of the surrounding environment according to the historical records from driving and selects at least one sensor corresponding to that scene type; the at least one sensor then detects whether a moving object appears in the surrounding environment, and if a moving object appears or the frequency of moving objects exceeds a preset frequency, other types of sensors, or more sensors, are further selected.
- the vehicle selects at least one sensor from a variety of sensors installed on the vehicle according to the scene type of the surrounding environment and the barrier conditions of the surrounding environment.
- for example, after the vehicle switches from the driving state to the engine-off state, it first determines the scene type of the surrounding environment according to the historical records from driving and selects at least one sensor corresponding to that scene type; the selected sensor then detects whether there is a safety barrier around the vehicle. For a side without a safety barrier, some or all of the sensors on that side continue to be selected; for a side with a safety barrier, all the sensors on that side are deselected or turned off.
- the vehicle selects at least one sensor from a variety of sensors installed on the vehicle according to the moving objects in the surrounding environment and the barrier conditions of the surrounding environment.
- for example, the cameras on the front, rear, left, and right sides are selected first, and whether there is a safety barrier around the vehicle is detected based on the cameras in these four directions. For a side without a safety barrier, some or all of the sensors on that side continue to be selected; for a side with a safety barrier, all the sensors on that side are turned off. After that, the vehicle continuously detects, based on the selected sensors, whether moving objects appear in the surrounding environment, and if a moving object appears or the frequency of moving objects exceeds the preset frequency, other types of sensors, or more sensors, are further selected.
- the vehicle selects at least one sensor from a variety of sensors installed on the vehicle according to the scene type of the surrounding environment, the moving objects in the surrounding environment, and the barrier conditions of the surrounding environment;
- for example, the cameras on the front, rear, left, and right sides are selected first, and whether there is a safety barrier around the vehicle is detected based on the cameras in these four directions. For a side without a safety barrier, some or all of the sensors on that side continue to be selected; for a side with a safety barrier, all the sensors on that side are turned off.
- the vehicle then detects the scene type of the surrounding environment based on the selected cameras and selects the sensors corresponding to the current scene type. After that, the vehicle continues to detect, based on all the selected sensors, whether moving objects appear in the surrounding environment; if a moving object appears or the frequency of moving objects exceeds the preset frequency, other types of sensors, or more sensors, are further selected.
- the vehicle monitors the surrounding environment based on the at least one sensor.
- the vehicle monitors, based on the at least one sensor, at least one factor that threatens the safety of the vehicle.
- the MDC in the ECU system of the vehicle controls the selected sensors on the vehicle to collect data about the vehicle's surrounding environment; after collecting data, each sensor transmits it to the MDC; after the MDC receives the data collected by each sensor, it analyzes the data to obtain the factors in the surrounding environment that threaten the safety of the vehicle.
- Example 1 The MDC can monitor, based on the data collected by the camera, whether there are obstacles in the surrounding environment, the types of obstacles (such as pedestrians, bicycles, electric vehicles, vehicles), the distance between obstacles and the vehicle, and the relative movement trend between obstacles and the vehicle (such as approaching, moving away, or stationary).
- Example 2 The MDC can monitor, based on the data collected by the ultrasonic radar, whether there are obstacles in the surrounding environment, the distance between obstacles and the vehicle, and so on.
- Example 3 Taking the inertial navigation system as an example: the MDC can monitor, based on the data collected by the inertial navigation system, the vehicle's vibration value, movement value (or position change value), the duration of vehicle vibration, the duration of vehicle movement, and so on.
- Example 4 Taking the GPS as an example: the MDC can perform vehicle positioning and tracking, vehicle condition monitoring, and track recording based on the data collected by the GPS.
- the MDC has the ability to monitor, based on the at least one sensor, at least one factor that threatens the safety of the vehicle; whether the MDC actually obtains a corresponding factor after analyzing the data collected by the at least one sensor depends on whether that factor actually exists in the surrounding environment. If a corresponding factor exists in the surrounding environment, the MDC obtains it by analyzing the collected data; if no corresponding factor exists, the analysis yields no such factor.
- the vehicle may determine the threat level of the surrounding environment to the vehicle according to the factors.
- the MDC can determine the threat level of the surrounding environment to the vehicle according to one or more of the following: 1) the types of factors monitored by the various sensors; 2) the values of the factors monitored by the various sensors; 3) the duration of each factor; 4) whether the surrounding environment changes; 5) the number of changes in the surrounding environment; 6) the speed of the vehicle; etc. It should be understood that some of the above items can be derived from others. For example, the speed of the vehicle can be obtained by statistically analyzing the two factors "the time the vehicle moves" and "the distance the vehicle moves".
- Method 1 The MDC divides threat levels according to the number of types of factors monitored by the sensors: at a low threat level, the MDC detects fewer types of factors based on the multiple sensors than it does at a high threat level.
- Method 2 The MDC divides threat levels according to the values of the factors monitored by the sensors: at a low threat level, the value of any factor monitored by the MDC based on the multiple sensors is smaller than the value of that factor at a high threat level. It should be understood that Methods 1 and 2 may be implemented independently or in combination, which is not limited here.
- This application does not limit the total number of threat levels. For example, there is a total of 1 threat level, that is, “there is a threat”; for example, there are 2 threat levels in total, where level 1 is “low threat” and level 2 is “high threat”; for example, there are 3 threat levels, level 1 is “low threat”, level 2 is “high threat”, and level 3 is “dangerous”.
- “no threat exists” can also be grouped into a separate level, eg the level is 0 when there is no threat.
- for example: level 0 is "no threat", level 1 is "low threat", level 2 is "high threat", and level 3 is "dangerous".
- the factors that MDC can monitor based on the above-mentioned various sensors include: detected obstacles, obstacle distance value (*m), vehicle vibration value (*g), vehicle position change (*m), duration (*ms), Number of parking environment changes (N), vehicle speed (m/s).
- Obstacle detected: the MDC detects an obstacle around the body based on the sensors; Obstacle distance value: the distance from an obstacle detected by the sensors to the body; Vehicle vibration value: the vibration value of the vehicle detected by the MDC based on the sensors; Vehicle position change: the change in vehicle position monitored by the MDC based on the sensors (possibly due to vehicle theft or inclement weather); Duration: the duration of a factor monitored by the MDC based on the sensors, such as the duration of vehicle movement or vibration; Number of parking environment changes: the number of changes in the vehicle's parking environment monitored by the MDC based on the sensors; Vehicle speed: the speed at which the vehicle is moving, monitored by the MDC based on the sensors.
- Example 1: When animals and/or people pass near the vehicle, the factor measured by the MDC is: obstacle detected. The MDC can determine that the surrounding environment poses no threat to the vehicle; the threat level is 0.
- Example 2: When animals and/or people approach the vehicle, the factors measured by the MDC are: obstacle detected and obstacle distance value, where the obstacle distance value is small (for example, 0.5m). The MDC can determine that the surrounding environment poses a low threat to the vehicle; the threat level is 1.
- Example 3: When animals and/or people touch the vehicle, the factors measured by the MDC are: obstacle detected, vehicle vibration value, and obstacle distance value, where the vehicle vibration value is small (for example, 0.1g) and the obstacle distance value is very small (for example, 0.01m). The MDC can determine that the surrounding environment poses a high threat to the vehicle; the threat level is 2.
- Example 4: When animals and/or people try to forcibly open the door, the factors measured by the MDC are: obstacle detected, vehicle vibration value, obstacle distance value, and duration, where the vehicle vibration value is large (for example, 0.5g), the obstacle distance value is very small (for example, 0.01m), and the duration is long (for example, 3s). The MDC can determine that the surrounding environment is dangerous to the vehicle; the threat level is 3.
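The four worked examples above can be summarized as a rough classification rule. The sketch below is a minimal, hypothetical illustration: the factor names, thresholds, and branch order are assumptions for illustration only, not the patent's actual decision logic.

```python
# Hypothetical sketch of how an MDC might map monitored factors to a
# threat level; all keys and thresholds are illustrative assumptions.
def threat_level(factors: dict) -> int:
    """factors: values derived from sensor data, e.g.
    {"obstacle": True, "distance_m": 0.5, "vibration_g": 0.1, "duration_s": 3.0}"""
    obstacle = factors.get("obstacle", False)
    distance = factors.get("distance_m")
    vibration = factors.get("vibration_g")
    duration = factors.get("duration_s", 0.0)

    if not obstacle:
        return 0  # nothing detected around the body: no threat
    # Example 4: strong vibration very close to the body, persisting -> dangerous
    if vibration is not None and vibration >= 0.5 and duration >= 3.0:
        return 3
    # Example 3: any vibration while an obstacle touches the body -> high threat
    if vibration is not None and vibration > 0.0:
        return 2
    # Example 2: obstacle detected at a small distance -> low threat
    if distance is not None and distance <= 1.0:
        return 1
    # Example 1: an obstacle merely passing by -> no threat
    return 0
```

Note how more factor types and larger factor values both push the level up, matching modes 1 and 2 of the level-division description.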
- the vehicle may perform a response event corresponding to the threat level.
- the MDC may preset the corresponding relationship between the threat level and the response event, for example, save the corresponding relationship between the threat level and the response event in the memory. After the MDC determines the threat level of the surrounding environment to the vehicle, according to the corresponding relationship, a response event corresponding to the threat level is executed.
- The response events corresponding to each level are as follows:
- After the vehicle is turned off, the sensors are turned on (refer to the specific implementation of S401 for selecting which sensors to turn on), and the vehicle enters the "monitoring state", that is, the sensors collect data about the surrounding environment and the MDC analyzes the collected data to monitor whether a threat exists;
- 1) When the threat level is "no threat", the vehicle may not perform any response event, or the response event is that the MDC controls the vehicle to remain in the "monitoring state", that is, to keep using the sensors to monitor the surrounding environment;
- 2) When the threat level is "low threat", the MDC controls the vehicle to enter the "warning state", and the vehicle outputs warning information, such as flashing the lights, honking, and flashing the central control screen;
- 3) When the threat level is "high threat", the MDC controls the vehicle to enter the "event recording state" to record events in the surrounding environment, for example saving the video images collected by the camera;
- 4) When the threat level is "dangerous", the MDC controls the vehicle to enter the "alarm state": the vehicle sends alarm information to the user equipment associated with the vehicle (such as a mobile phone or smart watch), for example sending an SMS to the mobile phone APP, and uploads the video recorded by the camera to the cloud to support download by that user equipment.
- Optionally, after the "alarm state" has ended for a period of time (for example, 30 s as shown in FIG. 8), the MDC can control the vehicle to return to the original "monitoring state", that is, stop sending alarm information to the associated user equipment and continue using the sensors to collect data about the surrounding environment. In this way, power consumption can be saved.
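The level-to-response correspondence that the MDC is described as storing in memory can be sketched as a lookup table plus a dispatch helper. The event names are illustrative assumptions, not actual ECU commands:

```python
# Hypothetical threat-level -> response-event table, mirroring the
# correspondence the MDC stores in memory. Names are assumptions.
RESPONSES = {
    0: [],                                               # no threat: stay monitoring
    1: ["flash_lights", "honk", "flash_center_screen"],  # warning state
    2: ["record_video", "save_video"],                   # event recording state
    3: ["notify_user_device", "upload_video_to_cloud"],  # alarm state
}

def respond(level: int, cumulative: bool = True) -> list:
    """Return the response events for `level`. With cumulative=True, a high
    level also includes the events of all lower levels, i.e. the optional
    design where high-threat responses include low-threat responses."""
    if cumulative:
        events = []
        for lvl in range(1, level + 1):
            events.extend(RESPONSES[lvl])
        return events
    return list(RESPONSES[level])
```

For example, `respond(3)` yields the warning, recording, and alarm events together, matching the description that the "alarm state" also outputs warnings and records events.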
- In the process of executing the response event corresponding to any threat level, the vehicle can maintain the "monitoring state" throughout, that is, the sensors are always used to monitor the surrounding environment, so that the threat level can be updated in real time.
- The threat level of the surrounding environment monitored by the MDC may traverse each level in order from low to high, that is, first switch from "no threat" to "low threat", then from "low threat" to "high threat", and then to "dangerous"; alternatively, the monitored threat level may jump directly to a higher level, such as directly entering "high threat" or "dangerous". This application does not limit this.
- the response events corresponding to the high threat level may include response events corresponding to the low threat level, so as to further improve the response capability of the vehicle.
- the MDC controls the vehicle to enter the "event recording state”, and the vehicle records events that occur in the surrounding environment while outputting warning information.
- the MDC controls the vehicle to enter the "alarm state”, and the vehicle outputs warning information and records events in the surrounding environment, and at the same time sends alarm information to the user equipment associated with the vehicle.
- the MDC executes the response event corresponding to the threat level, which may specifically be: the MDC sends a control instruction to the ECU corresponding to the controlled element, so that the ECU corresponding to the controlled element drives the controlled element to execute the corresponding response event.
- FIG. 9 shows an example in which, at each threat level, the MDC controls each ECU to drive the corresponding controlled element to execute a response event.
- 1) In the flameout state, the vehicle automatically enters the monitoring state: the MDC selects the ultrasonic radar and camera and monitors the environment around the vehicle based on the data they collect.
- 2) When the MDC detects an object approaching the vehicle from the data collected by the ultrasonic radar and camera, the MDC determines that the surrounding environment poses a "low threat" to the vehicle, so it automatically switches to the warning state: the MDC wakes up the BCM, which controls the lights to flash and the horn to sound per the MDC's instructions; at the same time the MDC wakes up the CDC, which controls the central control screen to flash per the MDC's instructions, to warn the approaching object that the camera is recording and monitoring.
- 3) When the MDC detects an object touching the vehicle from the data collected by the ultrasonic radar and camera, the MDC determines that the surrounding environment poses a "high threat" to the vehicle, so it automatically switches to the event recording state: the MDC selects the inertial navigation system (that is, the inertial navigation system, camera, and ultrasonic radar monitor simultaneously), the MDC wakes up the CDC to flash the central control screen, the camera records video and stores it to the CDC, an external USB flash disk saves the video, and the video can be imported to a personal computer (PC) for the user to view.
- 4) When the MDC detects that a more serious threat has occurred based on the data collected by the ultrasonic radar, camera, inertial navigation system, etc. (for example, unauthorized entry into the vehicle or an abnormal tire-pressure scenario triggers the BCM, or a collision, door prying, window smashing, abnormal vibration, or movement scenario triggers the INS), the MDC determines that the surrounding environment is "dangerous" to the vehicle, so it automatically switches to the alarm state: the MDC wakes up the CDC, which increases the display brightness of the central control screen and turns the speaker volume to maximum to support voice warnings; the CDC uploads the previously recorded video to the cloud through the TBOX, pushes SMS or APP reminders to the user's mobile phone, and supports the user's phone downloading the video from the cloud.
- From the above, the vehicle in the embodiments of the present application can monitor the threat factors in the surrounding environment using different sensors in combination, according to at least one of the scene type of the surrounding environment, moving objects in the surrounding environment, and the barrier conditions of the surrounding environment. This optimizes the traditional monitoring mechanism, enables perception and identification of multiple types of events, improves monitoring accuracy, reduces sensor wear, and prolongs sensor service life.
- the vehicle in the embodiment of the present application can also identify the threat level according to the monitored factors, and execute a response event corresponding to the threat level, thereby eliminating the threat in time, and improving the safety of the vehicle in a flameout state.
- An embodiment of the present application further provides a vehicle monitoring device 1000. The device 1000 has the function of implementing the method steps shown in FIG. 4 to FIG. 9.
- The functions or modules or units or means for the method steps shown may be implemented by software, by hardware, or by hardware executing corresponding software.
- the apparatus 1000 may include:
- a processing unit 1001 configured to select at least one sensor from a variety of sensors installed on the vehicle according to at least one of a scene type of the surrounding environment of the vehicle, moving objects in the surrounding environment, and barrier conditions of the surrounding environment;
- the monitoring unit 1002 is configured to monitor the surrounding environment of the vehicle based on at least one sensor.
- the embodiments of the present application also provide an in-vehicle device 1100 .
- the in-vehicle device includes at least one processor 1101 for executing the method steps shown in FIGS. 4 to 9 .
- the in-vehicle device 1100 may further include a memory 1102 , which is indicated by a dashed box in FIG. 11 to indicate that the memory 1102 is optional for the in-vehicle device 1100 .
- the memory 1102 and the processor 1101 are communicatively connected through a bus, and the bus is represented by a thick black line in FIG. 11 .
- The processor mentioned in the embodiments of the present application may be implemented by hardware or software. When implemented in hardware, the processor may be a logic circuit, an integrated circuit, or the like. When implemented in software, the processor may be a general-purpose processor implemented by reading software code stored in a memory.
- The processor may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
- a general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
- the memory mentioned in the embodiments of the present application may be volatile memory or non-volatile memory, or may include both volatile and non-volatile memory.
- The non-volatile memory may be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or flash memory.
- Volatile memory may be Random Access Memory (RAM), which acts as an external cache.
- By way of example but not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct Rambus RAM (DR RAM).
- When the processor is a general-purpose processor, DSP, ASIC, FPGA or other programmable logic device, discrete gate or transistor logic device, or discrete hardware component, the memory (storage module) may be integrated in the processor.
- The memory described herein is intended to include, but not be limited to, these and any other suitable types of memory.
- An embodiment of the present application further provides a computer-readable storage medium for storing instructions which, when executed, implement the methods shown in FIG. 4 to FIG. 9.
- an embodiment of the present application also provides a computer program product, where instructions are stored in the computer program product, and when the computer program product runs on a computer, the computer executes the methods shown in FIG. 4 to FIG. 9 .
- the embodiments of the present application may be provided as a method, a system, or a computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
- These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Abstract
Embodiments of the present application provide a vehicle monitoring method, apparatus, and vehicle. When the vehicle is in a flameout state, at least one sensor is selected from among multiple sensors installed on the vehicle according to at least one of the scene type of the surrounding environment, moving objects in the surrounding environment, and barrier conditions of the surrounding environment, and the surrounding environment is then monitored based on the at least one sensor. With this method the vehicle can dynamically select sensors according to the surrounding environment, achieving perception and identification of multiple types of events, which can improve monitoring accuracy; at the same time it is not necessary to use all sensors to monitor the surrounding environment, so sensor service life can be extended and sensor energy consumption reduced while still ensuring the safety of the vehicle in the flameout state.
Description
Cross-Reference to Related Applications
This application claims priority to the Chinese patent application with application number 202110333138.3, filed with the Chinese Patent Office on March 29, 2021 and entitled "Vehicle Monitoring Method, Apparatus and Vehicle", the entire contents of which are incorporated herein by reference.
This application relates to the field of smart/intelligent cars, and in particular to a vehicle monitoring method, apparatus, and vehicle.
With the rapid development of the economy and of science and technology, people's living standards are rising day by day, and vehicles have become a necessity in daily life. While providing owners with convenient travel, vehicles also bring some problems and troubles. For example, an owner leaves after turning the vehicle off, at which point the vehicle is left unattended.
To ensure the safety of a vehicle in the flameout state, the prior art uses sensors such as cameras, lidar, or millimeter-wave radar to monitor the surrounding environment of the vehicle in the flameout state. However, a single sensor can monitor only a limited set of threat factors and cannot cope with complex and changeable real-world scenarios; in addition, long-term monitoring wears the sensor considerably, shortening its service life, and its energy consumption is also large.
Therefore, how to balance the safety of the vehicle in the flameout state with the service life and energy consumption of the sensors has become an urgent technical problem to be solved.
Summary of the Invention
Embodiments of the present application provide a vehicle monitoring method, apparatus, and vehicle, which can balance the safety of the vehicle in the flameout state with the service life and energy consumption of the sensors.
According to a first aspect, a vehicle monitoring method is provided, applied to a vehicle in a flameout state. The method includes: the vehicle selects at least one sensor from among multiple sensors installed on the vehicle according to at least one of the scene type of the vehicle's surrounding environment, moving objects in the surrounding environment, and barrier conditions of the surrounding environment; then the vehicle monitors its surrounding environment based on the at least one sensor.
The vehicle in the embodiments of the present application monitors the threat factors present in the surrounding environment using different sensors in combination, according to at least one of the scene type of the surrounding environment, the moving objects in the surrounding environment, and the barrier conditions of the surrounding environment. This optimizes the traditional monitoring mechanism, enables perception and identification of multiple types of events, and can improve monitoring accuracy while reducing sensor wear and extending sensor service life.
The following describes specific implementations in which the vehicle selects at least one sensor from the multiple sensors installed on the vehicle according to at least one of the scene type of the surrounding environment, the moving objects in the surrounding environment, and the barrier conditions of the surrounding environment.
In one possible design, the vehicle may first identify the scene type of the surrounding environment, and then, according to a correspondence between scene types and sensors, select from the multiple sensors installed on the vehicle the sensor(s) corresponding to the scene type of the surrounding environment as the at least one sensor.
In this design, the selected sensors are better matched to the scene type of the surrounding environment, which can improve the vehicle's monitoring accuracy.
In one possible design, the vehicle may first use a camera installed on the vehicle to capture images of its surrounding environment, and when it determines from the captured images that a moving object has appeared in the surrounding environment, it then selects, from the multiple sensors installed on the vehicle, the camera and other types of sensors besides the camera as the at least one sensor. In other words, if there is no moving object around the vehicle body, selecting the camera alone meets the monitoring requirement; if a moving object is present, other sensors are additionally selected to work with the camera, increasing the monitoring intensity.
In this design, the vehicle can minimize the number of enabled sensors while maintaining monitoring accuracy, thereby reducing sensor wear and extending sensor life.
In one possible design, the vehicle may select the camera and other types of sensors besides the camera as the at least one sensor only after determining that the moving object meets a preset condition. For example, the preset condition is any one or more of the following: the moving object moves toward the vehicle; the frequency at which the moving object appears exceeds a preset frequency; the length of time the moving object is present exceeds a preset duration; or the moving object is within a preset range of the vehicle.
In this design, the vehicle can further reduce sensor wear and extend sensor life while maintaining monitoring accuracy.
In one possible design, the vehicle may first obtain the barrier conditions around it. If a barrier exists in a first area, where the first area lies in a first direction of the vehicle and the distance between the first area and the vehicle is smaller than a first threshold, the probability that the first area poses a safety threat to the vehicle is very low, so the at least one sensor selected by the vehicle may exclude the sensors used for monitoring in the first direction.
In this design, by perceiving whether barriers exist around the vehicle body, the vehicle can make reasonable use of the barrier, shield itself from potential threat factors on the barrier side, and improve its safety, while also reducing sensor wear, extending sensor usage time, and improving monitoring effectiveness.
In one possible design, the barrier includes a wall or another vehicle.
In this design, the vehicle can exploit the presence of a wall or another vehicle to avoid potential threat factors, reduce sensor wear, extend sensor usage time, and improve monitoring effectiveness.
It should be understood that the above sensor-selection schemes can be implemented in combination with one another.
The following describes specific implementations in which the vehicle monitors its surrounding environment based on the at least one sensor.
In one possible design, the vehicle may monitor, based on the at least one sensor, at least one factor in the surrounding environment that threatens the safety of the vehicle; then determine, according to the at least one factor, the threat level of the surrounding environment to the vehicle; and then execute a response event corresponding to the threat level.
In this design, the vehicle can eliminate threats in a timely manner, improving its safety in the flameout state.
In one possible design, the threat level of the surrounding environment to the vehicle is determined according to at least one of: the types of the at least one factor, the values of the at least one factor, the duration of the at least one factor, the number of changes in the surrounding environment, or the speed of the vehicle. For example, the more factor types involved, the higher the corresponding threat level; or, for example, the larger a factor's value, the higher the threat level.
In this design, the vehicle can subdivide the threat level that the surrounding environment poses to the vehicle, further improving its safety in the flameout state.
In one possible design, the threat levels include, from low to high, a first level, a second level, and a third level. The response event corresponding to the first level includes any one or more of the following: flashing the lights, honking, or flashing the central control screen. The response event corresponding to the second level includes: the response event corresponding to the first level, plus recording video with the camera and saving the video. The response event corresponding to the third level includes: the response event corresponding to the second level, plus sending a reminder to the user equipment and uploading the video to the cloud to support download by the user equipment.
In this design, different threat levels correspond to different response events, so threats can be eliminated promptly in a targeted manner, further improving the safety of the vehicle in the flameout state.
According to a second aspect, an embodiment of the present application provides a vehicle monitoring apparatus, which is for example a vehicle in a flameout state, or a component or processing chip located in a vehicle in a flameout state. The apparatus includes functions or modules or units or means for executing the method described in the first aspect or any possible design of the first aspect.
Exemplarily, the apparatus may include: a processing unit configured to select at least one sensor from among multiple sensors installed on the vehicle according to at least one of the scene type of the vehicle's surrounding environment, moving objects in the surrounding environment, and barrier conditions of the surrounding environment; and a monitoring unit configured to monitor the vehicle's surrounding environment based on the at least one sensor.
For specific implementations of the method steps executed by each unit, refer to the specific implementations of the corresponding method steps in the first aspect or any possible design of the first aspect, which are not repeated here.
According to a third aspect, an embodiment of the present application provides a vehicle monitoring apparatus applied to a vehicle in a flameout state. The vehicle monitoring apparatus includes a processor and a memory, the memory stores computer program instructions, and the processor runs the computer program instructions to implement the method described in the first aspect or any possible design of the first aspect.
Exemplarily, the processor may be configured to: select at least one sensor from among multiple sensors installed on the vehicle according to at least one of the scene type of the vehicle's surrounding environment, moving objects in the surrounding environment, and barrier conditions of the surrounding environment; and monitor the vehicle's surrounding environment based on the at least one sensor.
For specific implementations of the method steps executed by the processor, refer to the specific implementations of the corresponding method steps in the first aspect or any possible design of the first aspect, which are not repeated here.
According to a fourth aspect, an embodiment of the present application provides a vehicle, including: multiple sensors; and the vehicle monitoring apparatus described in the second aspect or the vehicle monitoring apparatus described in the third aspect.
According to a fifth aspect, an embodiment of the present application provides a computer-readable storage medium for storing instructions which, when executed, cause the method described in the first aspect or any possible design of the first aspect to be implemented.
According to a sixth aspect, an embodiment of the present application provides a computer program product storing instructions which, when run on a processor, cause the method described in the first aspect or any possible design of the first aspect to be implemented.
For the beneficial effects of the designs in the second through sixth aspects above, refer to the beneficial effects of the corresponding designs in the first aspect, which are not repeated here.
FIG. 1 is a schematic architecture diagram of a possible in-vehicle system provided by an embodiment of the present application;
FIG. 2 is a schematic diagram of a possible sensor layout provided by an embodiment of the present application;
FIG. 3 is a schematic architecture diagram of a possible ECU system provided by an embodiment of the present application;
FIG. 4 is a flowchart of a vehicle monitoring method provided by an embodiment of the present application;
FIG. 5 is a flowchart of a sensor selection method based on scene recognition provided by an embodiment of the present application;
FIG. 6 is a flowchart of a sensor selection method based on moving objects provided by an embodiment of the present application;
FIG. 7 is a flowchart of a sensor selection method based on environmental barrier conditions provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of possible threat levels and corresponding response events;
FIG. 9 is a schematic diagram of a possible method of executing response events;
FIG. 10 is a schematic structural diagram of a vehicle monitoring apparatus 1000 provided by an embodiment of the present application;
FIG. 11 is a schematic structural diagram of an in-vehicle device 1100 provided by an embodiment of the present application.
The embodiments of the present application apply to an in-vehicle system, which may be deployed in a vehicle. It should be understood that although the embodiments mainly take application to a vehicle in a flameout state (or parked state) as an example, they may also be applied to vehicles in other states, for example a vehicle traveling slowly, or a vehicle that has stopped but has not been turned off; this application imposes no limitation. The flameout state means that the vehicle's engine is off and the vehicle is stationary.
Referring to FIG. 1, which is a schematic architecture diagram of a possible in-vehicle system provided by an embodiment of the present application, the architecture includes at least a sensor system and an electronic control unit (ECU) system. The sensor system can collect data about the vehicle's surrounding environment and feed the collected data into the ECU system for processing.
The sensor system includes multiple sensors, for example including but not limited to: ultrasonic radar (Ultrasonic Sensor, USS), camera, inertial navigation system (Inertial Navigation System, INS), and global positioning system (Global Positioning System, GPS).
1) Ultrasonic radar is radar that uses ultrasonic detection. Its working principle is to emit ultrasonic waves outward through a transmitter and calculate distance from the time difference until the receiver receives the returned waves. Ultrasound refers to sound with a vibration frequency above 20,000 Hz; its vibrations per second (frequency) exceed the general upper limit of human hearing (20,000 Hz), so such inaudible sound waves are called ultrasound.
Ultrasonic radars include but are not limited to the following two kinds. The first is mounted on the front and rear bumpers, i.e. the reversing radar used to measure obstacles ahead of and behind the vehicle, known in the industry as UPA. The second is mounted on the sides of the vehicle to measure the distance to lateral obstacles, known in the industry as APA. UPA is a short-range ultrasonic sensor, mainly installed at the front and rear of the body, with a detection range of 25 cm to 2.5 m; with its detection distance and small Doppler-effect and temperature interference, its detection is more accurate. APA is a long-range ultrasonic sensor, mainly used on the sides of the body, with a detection range of 35 cm to 5 m, enough to cover a parking space. It is highly directional, propagates better than UPA, and is less susceptible to interference from other APAs and UPAs.
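The time-of-flight principle described above (emit a pulse, measure the round-trip delay of the echo) can be sketched as follows. The speed-of-sound constant is an assumption (roughly 340 m/s near room temperature); the helper names are illustrative, not part of any real sensor API:

```python
SPEED_OF_SOUND_M_S = 340.0  # approximate; varies with air temperature

def echo_distance_m(round_trip_s: float) -> float:
    """Distance to an obstacle from an ultrasonic round-trip time.
    The pulse travels to the obstacle and back, hence the division by 2."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

def in_range(distance_m: float, sensor: str) -> bool:
    """Check a distance against the detection ranges quoted above:
    UPA (short-range) 0.25-2.5 m, APA (long-range) 0.35-5 m."""
    ranges = {"UPA": (0.25, 2.5), "APA": (0.35, 5.0)}
    lo, hi = ranges[sensor]
    return lo <= distance_m <= hi
```

A 10 ms round trip therefore corresponds to an obstacle about 1.7 m away, within UPA range but too close to the boundary behavior of APA coverage on the sides.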
For example, FIG. 2 shows a layout of multiple sensors on a vehicle. In the example shown in FIG. 2, ultrasonic radars a, b, g, h, i, and j are short-range ultrasonic radars arranged at the front and rear of the vehicle, and ultrasonic radars c, d, e, and f are long-range ultrasonic radars arranged on the left and right sides of the vehicle.
2) Camera, also called a camera sensor. The camera in the embodiments of this application may include any camera used to obtain images of the environment the vehicle is in, including but not limited to infrared cameras, visible-light cameras, and so on.
For example, in the example shown in FIG. 2, camera 1 is arranged at the front of the vehicle and can capture images ahead of the vehicle; camera 2 is arranged at the rear and can capture images behind the vehicle; cameras 3 and 4 are arranged on the left and right sides respectively and can capture images on both sides of the vehicle.
3) The inertial navigation system is a navigation-parameter computation system that uses gyroscopes and accelerometers as its sensitive devices. The system establishes a navigation coordinate frame from the gyroscope outputs and computes the velocity and position of the carrier (such as a vehicle) in that frame from the accelerometer outputs.
4) The global positioning system, also called the global satellite positioning system, is a medium-Earth-orbit satellite navigation system that combines satellite and communications technologies and uses navigation satellites for timing and ranging.
It should be understood that FIG. 2 is only an example. In practice, the placement of the sensors may differ from FIG. 2; there may be more or fewer sensors, and other types of sensors may also be included, which is not limited by this application.
The ECU system can process the data collected by each sensor in the sensor system. For example, the ECU system processes image data collected by a camera to recognize objects (such as obstacles) in the images. The ECU system can also make decisions based on the processing results and drive controlled elements to operate. The controlled elements include but are not limited to: sensors, speakers, vehicle lights, the central control screen, and so on.
In the embodiments of this application, the ECU system consists of multiple ECUs that can communicate with one another to exchange data; for example, each ECU is connected to a Controller Area Network (CAN) bus, and the ECUs exchange data over the CAN bus.
A specific ECU may be any device or module with processing capability. For example, an ECU may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor or any conventional processor.
Referring to FIG. 3, according to their functions, the ECUs in the embodiments of this application include but are not limited to the following types: the mobile data center (MDC), the body control module (Body Control Management, BCM), the cockpit domain controller (CDC), and the telematics box (TBOX).
1) The MDC is the vehicle's core ECU. The MDC performs computation and control: it can compute over the data collected by the sensors, convert the results into control instructions, and control the controlled elements through these instructions. For example, the MDC sends a control instruction to the ECU corresponding to a controlled element (e.g. the BCM, CDC, or TBOX), and that ECU drives the controlled element according to the instruction.
The MDC can also control memory (ROM/FLASH/EEPROM, RAM), input/output interfaces (I/O), and other external circuits; the memory can store programs.
The vehicle monitoring method provided by the embodiments of this application may be controlled by the MDC, or completed by invoking other components, for example invoking the processing program of the embodiments stored in the memory to compute over the data collected by the sensors and to control the controlled elements.
2) The BCM, also called the body computer, is the ECU used to control the body electrical system. Elements controlled by the BCM include but are not limited to: power windows, power mirrors, air conditioning, vehicle lights (such as headlights and turn signals), the anti-theft locking system, central locking, defrosting devices, and so on. The BCM can be connected to other in-vehicle ECUs via the CAN bus.
3) The CDC is the ECU used to control the elements in the smart cockpit. Elements in the smart cockpit include but are not limited to: the instrument screen, the central control panel screen (central control screen for short), the head-up display, microphones, cameras, speakers (i.e. horns), and Bluetooth modules. The smart cockpit can control the running state and trajectory of an autonomous vehicle through human-machine interaction according to the occupants' needs, so that human-machine interaction in the cockpit or remote control can send the same commands to control the vehicle's operation.
4) The TBOX is mainly used to communicate with the back-end system or applications (APPs) on user equipment, enabling APP-linked display and control of vehicle information. The TBOX may use 3G cellular communication, such as code division multiple access (CDMA), EVDO, or global system for mobile communications (GSM)/general packet radio service (GPRS); 4G cellular communication, such as long term evolution (LTE); or 5G cellular communication. The TBOX may use WiFi to communicate with a wireless local area network (WLAN). In some embodiments, the TBOX may communicate directly with devices using an infrared link, Bluetooth, or ZigBee. The TBOX may also communicate based on other wireless protocols; for example, it may communicate directly with other vehicles and/or roadside stations based on the vehicular dedicated short range communications (DSRC) protocol.
It should be noted that FIG. 3 is only an example; in practice the number and layout of ECUs may be implemented in other ways, which is not specifically limited here. In addition, the ECUs in FIG. 3 may be deployed independently or integrated with one another, which is not limited by the embodiments of this application.
Based on the above description, an embodiment of this application provides a vehicle monitoring method. Taking application of this method to the in-vehicle system shown in FIG. 1 as an example, referring to FIG. 4, the method includes the following flow:
S401: When the vehicle is in the flameout state, the vehicle selects at least one sensor from among the multiple sensors installed on the vehicle according to at least one of the scene type of the surrounding environment, moving objects in the surrounding environment, and barrier conditions of the surrounding environment.
Specifically, the ECU system in the vehicle determines, according to at least one of the scene type of the surrounding environment, moving objects in the surrounding environment, and barrier conditions of the surrounding environment, the types of events that need to be monitored, and then selects from the multiple sensors installed on the vehicle at least one sensor corresponding to those event types, i.e. the selected sensors can effectively monitor events of those types.
Optionally, selecting at least one sensor from the multiple sensors installed on the vehicle may specifically be: selecting and turning on at least one sensor from among them. It should be understood that if some of the at least one sensor have already been selected and turned on, it is only necessary to turn on those not yet turned on. Optionally, if sensors other than the selected at least one sensor have been selected or turned on, those other sensors are deselected or turned off.
It should be noted that, in a specific implementation, the multiple elements used by the vehicle to select the at least one sensor (namely the scene type of the surrounding environment, moving objects in the surrounding environment, and barrier conditions of the surrounding environment) may be implemented separately or in combination, which is not limited by this application.
The following first introduces the cases in which the scene type of the surrounding environment, moving objects in the surrounding environment, and barrier conditions of the surrounding environment are each implemented separately.
I. The vehicle selects at least one sensor according to the scene type of the surrounding environment.
The scene type of the surrounding environment may characterize a classification of the surrounding environment according to how it was formed, its functional use, geographic location, time period, facilities, natural environmental elements, characteristics of human activity, building types, privacy, and so on.
This application does not limit how scene types are divided. For example: by formation, scene types may be divided into natural environments, man-made environments, etc.; by function, into living environments, ecological environments, etc.; by environmental elements, into atmospheric, water, soil, biological, and geological environments, etc.; by how people gather, into rural environments, urban environments, etc.; by privacy, into private environments, public environments, etc.; by building type, into residential-community environments, open-air/underground parking-lot environments, street-roadside environments, highway-roadside environments, wilderness environments, and so on.
In the embodiments of this application, the vehicle may preset the correspondence between scene types and sensors. Before monitoring the surrounding environment with the sensors, the vehicle may first identify the scene type of the surrounding environment and then, according to the correspondence, select from the multiple sensors installed on the vehicle the sensors corresponding to that scene type. In this way the vehicle does not need to keep all sensors running for monitoring; instead it selects the appropriate sensors on demand, which can improve monitoring accuracy while reducing wear on the other sensors, extending sensor life, and reducing the overall energy consumption of the vehicle's sensors.
For example, FIG. 5 shows a flowchart of a sensor selection method based on scene recognition. The method can be applied to the in-vehicle system shown in FIG. 1 and can specifically be executed by the ECU system in that system. The method includes:
S501: In the flameout state, the vehicle identifies the scene type of the surrounding environment.
Specifically, the MDC in the vehicle's ECU system identifies the scene type of the surrounding environment. There are many possible identification methods, which this application does not restrict. For example, the MDC obtains the vehicle's driving history (such as image data collected by the camera while driving, and position data from the navigation system) and then determines the scene type of the surrounding environment from that history. Or, for example, the MDC first collects data about the surrounding environment via one or more sensors on the vehicle (such as the camera) and then determines the scene type from that data.
S502: According to the correspondence between scene types and sensors, the vehicle selects from the multiple sensors installed on the vehicle the sensors corresponding to the scene type of the surrounding environment.
Specifically, the MDC may preset a first correspondence between scene types and sensors, for example by saving it in memory. After determining the scene type of the surrounding environment, the MDC selects, according to the first correspondence, the sensors corresponding to that scene type from the multiple sensors installed on the vehicle.
The sensors corresponding to each scene type may be determined by the event types that need to be monitored in that scene; this application does not specifically limit the correspondence between scene types and sensors. A few possible examples follow:
Example 1, inside a residential community: in this scene pedestrians are scarce and road conditions are simple; generally only minor vehicle scrapes may occur, so selecting only the camera and ultrasonic radar is enough to keep the vehicle safe.
Example 2, street roadside: in this scene pedestrian and vehicle traffic is heavy and conditions are complex and changeable; towing, scrapes, theft, and other security threats may occur, so the camera, ultrasonic radar, inertial navigation system, and global positioning system can be selected to keep the vehicle safe.
Example 3, open-air/underground parking lot: conditions are simple, but there is a risk of scrapes and theft, so the inertial navigation system, camera, and ultrasonic radar can be selected to perceive approaching objects.
Example 4, unfamiliar outdoor environment: when the vehicle is parked in an unfamiliar outdoor place, the risk of theft is extremely high, so the camera and INS can be selected to perceive changes in the vehicle's vibration amplitude and orientation.
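The scene-type-to-sensor correspondence in examples 1 to 4 can be expressed as a simple lookup table, as sketched below. The scene and sensor identifiers are illustrative assumptions, not names used by the system itself:

```python
# Illustrative version of the first correspondence (scene type -> sensors)
# that the MDC saves in memory; all names are assumptions.
SCENE_SENSORS = {
    "residential_area": {"camera", "ultrasonic_radar"},          # example 1
    "street_roadside": {"camera", "ultrasonic_radar", "ins", "gps"},  # example 2
    "parking_lot": {"camera", "ultrasonic_radar", "ins"},        # example 3
    "unfamiliar_outdoor": {"camera", "ins"},                     # example 4
}

def select_sensors(scene: str) -> set:
    """Look up the sensors matching a recognized scene type (step S502)."""
    return SCENE_SENSORS[scene]
```

A higher-risk scene simply maps to a larger sensor set, which is the essence of the on-demand selection described above.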
It should be understood that, before step S502, some sensors may already have been selected to monitor the vehicle's surrounding environment. Therefore, when executing step S502, if a sensor corresponding to the scene type of the surrounding environment has not been selected, select it; if it has already been selected for monitoring, simply keep it monitoring. Optionally, if other sensors not corresponding to the scene type have been selected to monitor the surrounding environment, their monitoring may also be canceled.
The embodiments of this application formulate different monitoring mechanisms for different scene types of the surrounding environment, i.e. selecting from the multiple sensors installed on the vehicle the sensors corresponding to the scene type for monitoring. This can improve monitoring accuracy while reducing wear on the other sensors, extending sensor life, and reducing the overall energy consumption of the vehicle's sensors.
II. The vehicle selects at least one sensor from the multiple sensors installed on the vehicle according to moving objects in the surrounding environment.
Moving objects in the embodiments of this application refer to anything that can move, including any living thing that can move (such as people, cats, dogs, rabbits, snakes, butterflies, wolves, birds, etc.) and non-living objects (such as vehicles, drones, rocks sliding off a hillside, falling leaves, etc.). It should be understood that the movement may be autonomous (e.g. a person walking, surrounding vehicles driving, birds flying, animals running) or passive (e.g. leaves blown by the wind, landslides); this application imposes no limitation.
In the embodiments of this application, the vehicle can select sensors from the multiple sensors installed on the vehicle by perceiving the presence of moving objects around the body, improving monitoring accuracy while reducing wear on the other sensors, extending sensor life, and reducing overall sensor energy consumption.
Several typical application scenarios follow. Scenario 1, a residential community or street roadside late at night: pedestrians or vehicles pass only occasionally, and almost no moving objects appear. Scenario 2, outdoor scenes: few people, and few moving objects appear around the body. Scenario 3, underground parking lot: in areas vehicles do not pass through, few moving objects appear around the body.
In these scenarios, before using sensors to monitor the vehicle's surrounding environment, the vehicle may first select a small number of sensors (such as a first sensor) from the multiple sensors installed on the vehicle, and then use the first sensor to detect whether a moving object appears in the surrounding environment. After determining that a moving object has appeared, the vehicle then selects the first sensor together with other types of sensors, such as the ultrasonic radar and inertial navigation system, to monitor the surrounding environment.
Optionally, to improve monitoring accuracy, the vehicle may select other types of sensors besides the first sensor only after determining that a moving object has appeared in the surrounding environment and that it meets a preset condition. Further optionally, the preset condition includes but is not limited to any one or more of the following: the moving object moves toward the vehicle; the frequency at which the moving object appears exceeds a preset frequency; the length of time the moving object is present exceeds a preset duration; the moving object is within a preset range of the vehicle.
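The preset conditions listed above can be expressed as a single predicate, any clause of which triggers the selection of additional sensors. The parameter names and default thresholds below are illustrative assumptions only:

```python
def meets_preset_condition(moving_toward_vehicle: bool,
                           appearances_per_min: float,
                           present_s: float,
                           distance_m: float,
                           freq_threshold: float = 10.0,
                           duration_threshold_s: float = 30.0,
                           range_m: float = 5.0) -> bool:
    """True if the moving object satisfies any one of the preset conditions,
    i.e. sensors beyond the first sensor should be enabled. Thresholds are
    hypothetical defaults standing in for the preset frequency, preset
    duration, and preset range."""
    return (moving_toward_vehicle
            or appearances_per_min > freq_threshold
            or present_s > duration_threshold_s
            or distance_m <= range_m)
```

Because the clauses are OR-ed, a single strong signal (e.g. the object heading straight for the vehicle) is enough to escalate sensor selection.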
It should be understood that this application does not restrict the type of the first sensor; for example, the first sensor may be a camera or ultrasonic radar. Taking the camera as the first sensor as an example, FIG. 6 shows a flowchart of a sensor selection method based on moving objects. The method can be applied to the in-vehicle system shown in FIG. 1 and includes:
S601: In the flameout state, the vehicle starts the camera and enters the monitoring state; the camera captures images of the surrounding environment.
S602: The vehicle monitors whether a moving object exists around the body based on the images captured by the camera.
Specifically, the MDC analyzes the images captured by the camera to detect whether a moving object appears in the surrounding environment.
S603: If a moving object appears, the vehicle continues to monitor the frequency of its appearance based on the captured images; if there is no moving object, it simply keeps the camera on.
S604: When the moving object appears very frequently (e.g. exceeding a set frequency threshold), the safety of the body is threatened, and the vehicle selects other types of sensors (such as the ultrasonic radar and inertial navigation system) from the multiple sensors installed on the vehicle to monitor the surrounding environment together with the camera. If the frequency is low, it may just be a pedestrian or animal passing with no intention of approaching the vehicle, so it is sufficient to keep the camera on, i.e. to rely on camera-based detection.
Optionally, the vehicle may determine the other types of sensors according to at least one of: the moving object's direction of movement, its frequency of appearance, the length of time it is present, or its distance from the vehicle.
Example 1: The vehicle's MDC presets a second correspondence between the moving object's direction of movement and sensors. After obtaining the direction of movement, the MDC selects a second sensor from the multiple sensors installed on the vehicle according to the second correspondence.
Still taking the camera as the first sensor: if the moving object moves toward the vehicle along a curve, it will not approach the vehicle quickly (it may just be a passing pedestrian), so the ultrasonic radar can be selected to monitor together with the camera; if it moves toward the vehicle in a straight line, it will approach quickly, so the ultrasonic radar, inertial navigation system, and global positioning system can be selected simultaneously to monitor with the camera, rapidly increasing the vehicle's monitoring capability.
Example 2: The vehicle's MDC presets a third correspondence between the moving object's frequency of appearance and sensors. After obtaining the frequency, the MDC determines, according to the third correspondence, a third sensor to select from the multiple sensors installed on the vehicle.
Still taking the camera as the first sensor: if the moving object's frequency of appearance is below a first frequency threshold, select the ultrasonic radar to monitor with the camera; if it is above the first threshold and below a second frequency threshold, select the ultrasonic radar and global positioning system to monitor with the camera; if it is above the second threshold, select the ultrasonic radar, global positioning system, and inertial navigation system to monitor with the camera. The first frequency threshold is lower than the second, and both are greater than 0.
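The frequency-threshold rule above can be sketched as follows; the concrete threshold values are assumptions for illustration, with only the constraint 0 < f1 < f2 taken from the description:

```python
# Illustrative third correspondence: appearance frequency -> extra sensors
# to enable alongside the camera. f1 and f2 are hypothetical thresholds.
def extra_sensors_by_frequency(freq: float, f1: float = 5.0, f2: float = 20.0) -> set:
    assert 0 < f1 < f2  # constraint stated in the description
    if freq < f1:
        return {"ultrasonic_radar"}
    if freq < f2:
        return {"ultrasonic_radar", "gps"}
    return {"ultrasonic_radar", "gps", "ins"}
```

Each band simply adds one more sensor type, so monitoring intensity rises monotonically with how often the object is seen.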
The frequency of appearance may be the number of times the moving object appears within a preset time window, for example within one minute; the MDC may sample once per second whether the object appears and increment a counter if it does. Of course, this is only an example; this application does not restrict the specific way the MDC counts the frequency of appearance of the moving object.
Example 3: The vehicle's MDC presets a fourth correspondence between the length of time the moving object is present and sensors. After obtaining this duration, the MDC determines, according to the fourth correspondence, a fourth sensor to select from the multiple sensors installed on the vehicle.
Still taking the camera as the first sensor: if the moving object has been present for 5 s, select the ultrasonic radar to monitor with the camera; for 30 s, select the ultrasonic radar and global positioning system to monitor with the camera; for 1 min, select the ultrasonic radar, global positioning system, and inertial navigation system to monitor with the camera.
Example 4: The vehicle's MDC presets a fifth correspondence between the moving object's distance from the vehicle and sensors. After obtaining the distance, the MDC determines, according to the fifth correspondence, a fifth sensor to select from the multiple sensors installed on the vehicle.
Still taking the camera as the first sensor: if the moving object is 5-10 meters from the vehicle, select the ultrasonic radar to monitor with the camera; at 2-5 meters, select the ultrasonic radar and global positioning system to monitor with the camera; within 2 meters, select the ultrasonic radar, global positioning system, and inertial navigation system to monitor with the camera.
Optionally, after the vehicle selects other types of sensors to monitor the surrounding environment together with the camera, if it then detects that the moving object's frequency of appearance drops or the object disappears, it may reduce or turn off the other types of sensors.
The embodiments of this application formulate different monitoring mechanisms by perceiving whether moving objects exist around the vehicle. If no moving object exists around the body, selecting the camera suffices to meet the monitoring conditions. If a moving object exists, other sensors are additionally selected to work with the camera, increasing monitoring intensity. This can improve monitoring accuracy while reducing wear on the other sensors, extending sensor life, and reducing overall sensor energy consumption.
III. The vehicle selects at least one sensor from the multiple sensors installed on the vehicle according to the barrier conditions of the surrounding environment.
The barrier conditions of the surrounding environment include whether barriers exist in the surrounding environment, the barrier types, the degree of barriering, and so on. A barrier is another object that can protect the safety of the vehicle, for example by blocking other moving objects from approaching it. This application does not restrict the specific barrier types; for example, barriers include but are not limited to walls, other vehicles, trees, and fences. The degree of barriering may refer to the degree or capability with which barriers in the surrounding environment block other moving objects from approaching or damaging the vehicle.
Optionally, the vehicle may judge the degree of barriering from the openness of the surrounding environment. For example, the lower the openness, the heavier the barriering; conversely, the higher the openness, the lighter the barriering. Optionally, the degree of barriering is related to the scene type of the surrounding environment, the degree of enclosure of its space, or the size of its space. For example, at a street roadside, the space is poorly enclosed and pedestrians and other vehicles are highly mobile, so openness is high and barriering is light; in a residential community, access control makes the space moderately enclosed, pedestrians and other vehicles are less mobile, and openness is moderate, so barriering is moderate; in a private garage, the space is small and strongly enclosed, the mobility of pedestrians and other vehicles is very small, and openness is low, so barriering is high.
In the embodiments of this application, the vehicle may monitor the barrier conditions of the surrounding environment in many ways, for example judging them from the scene type of the surrounding environment, from the size of the surrounding space, or from its degree of enclosure.
Taking judgment from the degree of enclosure of the surrounding space as an example: in one possible implementation, the vehicle may monitor whether a safety barrier exists in each direction around it. If no safety barrier exists in a certain direction, that direction is relatively open, with light barriering and poor safety, so sensors capable of monitoring that direction can be selected from the multiple sensors installed on the vehicle. If a safety barrier exists in another direction, that direction has low openness, heavy barriering, and high safety, so the sensors monitoring that direction can be appropriately reduced or turned off.
Optionally, when the distance between the area where the barrier is located and the vehicle is smaller than a first threshold, the barrier is a safety barrier.
For example, FIG. 7 shows a flowchart of a sensor selection method based on environmental barrier conditions. The method can be applied to the in-vehicle system shown in FIG. 1 and includes:
S701: In the flameout state, the vehicle's monitoring function is enabled and the vehicle enters the monitoring state.
Optionally, when the monitoring function is enabled, all sensors may be turned on, or only some of them (e.g. only the camera); this application imposes no limitation.
S702: The vehicle monitors whether safety barriers exist around the body.
For example, the vehicle judges from images captured by the camera whether there are walls, other vehicles, or the like around the body.
Optionally, when the distance between a barrier and the vehicle is smaller than a preset distance, the barrier is a safety barrier. The preset distance is, for example, 1 meter, 1.5 meters, or 2 meters; this application imposes no limitation. In addition, the preset distance may differ for different barrier types: for example, 1.5 meters for a wall and 1 meter for another vehicle.
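The per-type preset-distance check above can be sketched as a small helper. The type names and the fallback distance for unlisted types are illustrative assumptions:

```python
# Per-type preset distances from the example above: 1.5 m for a wall,
# 1 m for another vehicle. Other types fall back to an assumed 1 m.
PRESET_DISTANCE_M = {"wall": 1.5, "vehicle": 1.0}

def is_safe_barrier(barrier_type: str, distance_m: float) -> bool:
    """A barrier counts as a safety barrier when it is closer to the car
    body than the preset distance for its type, so the sensors on that
    side may be kept off or turned off (step S703A)."""
    return distance_m < PRESET_DISTANCE_M.get(barrier_type, 1.0)
```

A wall 1.2 m away therefore qualifies, while another vehicle at the same 1.2 m does not, reflecting the tighter preset distance for vehicles.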
S703A: If there is a safety barrier on one side, do not turn on (or turn off) the sensors on that barrier side.
Specifically, if the vehicle's sensors on the barrier side have not been turned on, keep them off; if they have already been turned on, turn them off.
Optionally, the vehicle may turn off all sensors on the barrier side to save sensor power consumption to the greatest extent and extend sensor life.
Optionally, the vehicle may turn off only some of the sensors on the barrier side, to further improve safety performance while saving some sensor power.
S703B: For the other sides without a safety barrier, turn on the sensors on those sides normally.
Several more typical application scenarios follow:
Scenario 1: The vehicle is parked in a parking space with vehicles parked on one or both sides. On a side where a vehicle is parked, the space left between the vehicles is narrow and inconvenient for pedestrians or other vehicles to pass through, so threats or damage to that side of the body are almost negligible. Turning on that side's sensors for monitoring is then of little value, so the sensors on that side (e.g. the camera and ultrasonic radar) need not be turned on, or may be turned off.
Scenario 2: The vehicle is parked at the roadside with a wall or similar obstacle on one side. The gap between the vehicle and the wall is narrow, which avoids the risks of scrapes, abnormal movement, or theft; the obstacle can protect that side of the body, so the sensors on that side (e.g. the camera and ultrasonic radar) need not be turned on, or may be turned off.
By perceiving whether safety barriers exist around the body, the vehicle in the embodiments of this application makes reasonable use of the barrier, avoids potential threat factors on that side of the body, avoids unnecessary sensor wear, extends sensor usage time, and improves monitoring effectiveness.
The above introduced the cases in which the three elements used for sensor selection in this application (namely the scene type of the surrounding environment, moving objects in the surrounding environment, and barrier conditions of the surrounding environment) are each implemented separately. In specific implementations, the three sensor-selection schemes above may also be implemented in combination.
Several possible combinations follow.
IV. The vehicle selects at least one sensor from the multiple sensors installed on the vehicle according to the scene type of the surrounding environment and moving objects in the surrounding environment.
Exemplarily, after the vehicle switches from the driving state to the flameout state, it first determines the scene type of the surrounding environment from its driving history and selects at least one sensor corresponding to that scene type; then, based on the at least one sensor, it detects whether a moving object appears in the surrounding environment, and if one appears or the frequency of moving objects exceeds a preset frequency, it further selects other types or a larger number of sensors.
V. The vehicle selects at least one sensor from the multiple sensors installed on the vehicle according to the scene type of the surrounding environment and the barrier conditions of the surrounding environment.
Exemplarily, after the vehicle switches from the driving state to the flameout state, it first determines the scene type of the surrounding environment from its driving history and selects at least one sensor corresponding to that scene type; then, based on the selected sensors, it detects whether safety barriers exist around the vehicle. For a side without a safety barrier, it keeps some or all of that side's sensors selected; for a side with a safety barrier, it deselects or turns off all of that side's sensors.
VI. The vehicle selects at least one sensor from the multiple sensors installed on the vehicle according to moving objects in the surrounding environment and the barrier conditions of the surrounding environment.
Exemplarily, after the vehicle switches from the driving state to the flameout state, it first selects the cameras on all four sides (front, rear, left, right) and, based on them, detects whether safety barriers exist around the vehicle. For a side without a safety barrier, it selects some or all of that side's sensors; for a side with a safety barrier, it turns off all of that side's sensors. The vehicle then continues, based on the selected sensors, to detect whether moving objects appear in the surrounding environment; if one appears or the frequency of moving objects exceeds a preset frequency, it further selects other types or a larger number of sensors.
VII. The vehicle selects at least one sensor from the multiple sensors installed on the vehicle according to the scene type of the surrounding environment, moving objects in the surrounding environment, and the barrier conditions of the surrounding environment.
Exemplarily, after the vehicle switches from the driving state to the flameout state, it first selects the cameras on all four sides and, based on them, detects whether safety barriers exist around the vehicle. For a side without a safety barrier, it selects some or all of that side's sensors; for a side with a safety barrier, it turns off all of that side's sensors. The vehicle then detects the scene type of the surrounding environment based on the selected cameras and selects the sensors corresponding to the current scene type. After that, the vehicle continues, based on all the selected sensors, to detect whether moving objects appear in the surrounding environment; if one appears or the frequency of moving objects exceeds a preset frequency, it further selects other types or a larger number of sensors.
It should be understood that the fourth through seventh parts above are only some examples of combined implementations; other combinations are also possible in specific implementations.
S402: The vehicle monitors the surrounding environment based on the at least one sensor.
Specifically, the vehicle monitors, based on the at least one sensor, at least one factor that threatens the safety of the vehicle. In a specific implementation, for example, the MDC in the vehicle's ECU system controls the selected sensors to collect data about the vehicle's surrounding environment; after collecting the data, each sensor transmits it to the MDC; after receiving the data, the MDC analyzes it to obtain the factors in the surrounding environment that threaten the safety of the vehicle.
This is illustrated by several specific examples:
Example 1, the camera: based on the data collected by the camera, the MDC can monitor whether obstacles exist in the surrounding environment, the obstacle type (such as a pedestrian, bicycle, electric bike, or vehicle), the distance between the obstacle and the vehicle, and the obstacle's movement trend relative to the vehicle (approaching, receding, stationary, etc.).
Example 2, the ultrasonic radar: based on the data it collects, the MDC can monitor whether obstacles exist in the surrounding environment, the distance between an obstacle and the vehicle, and so on.
Example 3, the inertial navigation system: based on the data it collects, the MDC can monitor the vehicle's vibration value and movement value (i.e. position-change value), the duration of the vibration, the duration of the movement, and so on.
Example 4, the global positioning system: based on the data it collects, the MDC can perform position tracking of the vehicle, vehicle-condition monitoring, track recording, and so on.
It should be understood that the MDC has the capability, based on the at least one sensor, of monitoring at least one factor in the surrounding environment that threatens the safety of the vehicle; whether analysis of the collected data actually yields a given factor depends on whether that factor truly exists in the surrounding environment. If the corresponding factor exists, the MDC's analysis of the data collected by the at least one sensor will obtain it; if not, the analysis will not obtain it.
Further, after obtaining the factors that threaten the safety of the vehicle, the vehicle can determine the threat level of the surrounding environment to the vehicle according to these factors.
The MDC may determine the threat level of the surrounding environment to the vehicle according to one or more of the following: 1) the types of factors monitored by the multiple sensors; 2) the values of the factors monitored by the multiple sensors; 3) the duration of each factor; 4) whether the surrounding environment has changed; 5) the number of changes in the surrounding environment; 6) the vehicle's speed; and so on. It should be understood that some of these items may be derived from others; for example, the vehicle's speed can be obtained by statistical analysis of the two factors "time the vehicle has moved" and "distance the vehicle has moved".
This application does not limit the specific way the vehicle divides threat levels. Two possible ways are listed below. Way 1: the MDC divides threat levels according to the number of factor types monitored by the sensors, where the factor types monitored by the MDC based on the multiple sensors at a low threat level are fewer than those monitored at a high threat level. Way 2: the MDC divides threat levels according to the values of the factors monitored by the sensors, where, for any given factor, its value monitored by the MDC at a low threat level is smaller than its value monitored at a high threat level. It should be understood that ways 1 and 2 may be implemented separately or in combination, which is not limited here.
This application does not limit the total number of threat levels. For example, there may be 1 level in total, namely "a threat exists"; or 2 levels in total, where level 1 is "low threat" and level 2 is "high threat"; or 3 levels in total, where level 1 is "low threat", level 2 is "high threat", and level 3 is "dangerous". Optionally, "no threat" can also be treated as a separate level, e.g. level 0 when no threat exists.
The following takes a total of 4 threat levels as an example, with level 0 being "no threat", level 1 "low threat", level 2 "high threat", and level 3 "dangerous":
Suppose the factors the MDC can monitor based on the above sensors include: obstacle detected, obstacle distance value (*m), vehicle vibration value (*g), vehicle position change (*m), duration (*ms), number of parking environment changes (N), and vehicle speed (m/s). Here, obstacle detected means the MDC detects an obstacle around the body based on the sensors; obstacle distance value is the distance from a detected obstacle to the body; vehicle vibration value is the vibration value of the vehicle detected based on the sensors; vehicle position change means the MDC monitors, based on the sensors, that the vehicle's position has changed (possibly due to theft or severe weather); duration is the length of time a factor persists, such as the duration of vehicle movement or vibration; number of parking environment changes is the number of changes in the vehicle's parking environment monitored based on the sensors; vehicle speed is the speed at which the vehicle moves, monitored based on the sensors.
Example 1: When an animal and/or person walks past near the vehicle, the factor measured by the MDC is: obstacle detected. The MDC can determine that the surrounding environment poses no threat to the vehicle; the threat level is 0.
Example 2: When an animal and/or person approaches the vehicle, the factors measured by the MDC are: obstacle detected and obstacle distance value, where the obstacle distance value is small (e.g. 0.5 m). The MDC can determine that the surrounding environment poses a low threat to the vehicle; the threat level is 1.
Example 3: When an animal and/or person touches the vehicle, the factors measured by the MDC are: obstacle detected, vehicle vibration value, and obstacle distance value, where the vehicle vibration value is small (e.g. 0.1 g) and the obstacle distance value is very small (e.g. 0.01 m). The MDC can determine that the surrounding environment poses a high threat to the vehicle; the threat level is 2.
Example 4: When an animal and/or person tries to force the door open, the factors measured by the MDC are: obstacle detected, vehicle vibration value, obstacle distance value, and duration, where the vehicle vibration value is large (e.g. 0.5 g), the obstacle distance value is very small (e.g. 0.01 m), and the duration is long (e.g. 3 s). The MDC can determine that the surrounding environment is very dangerous to the vehicle; the threat level is 3.
Further, after determining the threat level, the vehicle can execute a response event corresponding to that threat level.
Specifically, the MDC may preset the correspondence between threat levels and response events, for example by saving it in memory. After the MDC determines the threat level of the surrounding environment to the vehicle, it executes, according to the correspondence, the response event corresponding to that threat level.
Referring to FIG. 8, still taking a total of 4 threat levels as an example (level 0 is no threat, level 1 is low threat, level 2 is high threat, level 3 is dangerous), the response events corresponding to each level are as follows:
After the vehicle is turned off, the sensors are turned on (for selecting which sensors to turn on, see the specific implementation of S401) and the vehicle enters the "monitoring state", i.e. the sensors collect data about the surrounding environment and the MDC analyzes the collected data to monitor whether a threat exists;
1) When the threat level is "no threat", the vehicle may execute no response event, or the response event is that the MDC controls the vehicle to remain in the "monitoring state", i.e. to keep monitoring the surrounding environment with the sensors;
2) When the threat level is "low threat", the MDC controls the vehicle to enter the "warning state", and the vehicle outputs warning information, such as flashing the lights, honking, and flashing the central control screen;
3) When the threat level is "high threat", the MDC controls the vehicle to enter the "event recording state" to record events occurring in the surrounding environment, for example saving the video images collected by the camera;
4) When the threat level is "dangerous", the MDC controls the vehicle to enter the "alarm state", and the vehicle sends alarm information to the user equipment associated with the vehicle (such as a mobile phone or smart watch), for example sending an SMS to the mobile phone APP, and uploads the video recorded by the camera to the cloud to support download by that user equipment.
Optionally, after the "alarm state" has ended for a period of time, for example the 30 s shown in FIG. 8, the MDC can control the vehicle to return to the original "monitoring state", i.e. stop sending alarm information to the associated user equipment and continue collecting data about the surrounding environment with the sensors. In this way, power consumption can be saved.
Optionally, while executing the response event corresponding to any threat level, the vehicle can remain in the "monitoring state" throughout, i.e. keep monitoring the surrounding environment with the sensors, so that the threat level can be updated in real time.
It should be understood that the threat level of the surrounding environment monitored by the MDC may traverse the levels in order from low to high, i.e. first switching from "no threat" to "low threat", then from "low threat" to "high threat", and then to "dangerous"; it may also jump directly to a higher level, such as directly entering "high threat" or "dangerous". This application imposes no limitation on this.
Optionally, the response events corresponding to a high threat level may include those corresponding to a low threat level, to further improve the vehicle's response capability. For example, when the threat level is "high threat", the MDC controls the vehicle to enter the "event recording state", and the vehicle records events occurring in the surrounding environment while outputting warning information. For example, when the threat level is "dangerous", the MDC controls the vehicle to enter the "alarm state", and the vehicle outputs warning information, records events in the surrounding environment, and at the same time sends alarm information to the associated user equipment.
Optionally, the MDC executing the response event corresponding to the threat level may specifically be: the MDC sends a control instruction to the ECU corresponding to the controlled element, so that that ECU drives the controlled element to execute the corresponding response event.
For example, FIG. 9 shows how, at each threat level, the MDC directs the ECUs to drive their controlled components to execute the response events.
1) In the switched-off state, the vehicle automatically enters the monitoring state: the MDC selects the ultrasonic radar and the cameras and monitors the vehicle's surroundings based on the data they collect.
2) When the MDC detects from the ultrasonic-radar and camera data that an object is approaching the vehicle, it determines that the surrounding environment poses a "low threat" and automatically switches to the warning state: the MDC wakes the BCM, which flashes the lights and sounds the horn according to the MDC's commands, and also wakes the CDC, which flashes the center-console screen according to the MDC's commands, warning the approaching object that the cameras are recording and monitoring.
3) When the MDC detects from the ultrasonic-radar and camera data that an object has touched the vehicle, it determines a "high threat" and automatically switches to the event recording state: the MDC additionally selects the inertial navigation system (so that the inertial navigation system, the cameras, and the ultrasonic radar monitor simultaneously) and wakes the CDC to flash the center console; the cameras record video, which is stored in the CDC and saved to an external USB flash disk, and the video can be imported into a personal computer (PC) for the user to view.
4) When the MDC detects from the data collected by the ultrasonic radar, cameras, inertial navigation system, and so on that a more serious threat has occurred (for example, unauthorized vehicle entry or an abnormal tire-pressure scenario triggering the BCM, or a collision, door prying, window smashing, abnormal vibration, or vehicle movement triggering the INS), it determines "danger" and automatically switches to the alarm state: the MDC wakes the CDC, which raises the center-console screen brightness and turns the speaker volume to maximum to support a spoken warning, uploads the previously recorded video to the cloud through the TBOX, and pushes an SMS or app notification to the user's phone, allowing the phone to download the video from the cloud.
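Taken together, FIGS. 8 and 9 describe a small state machine: the state tracks the current threat level, the level may escalate step by step or jump directly to a high level, and after the alarm state has run for a period (30 s in FIG. 8) the vehicle reverts to monitoring. Below is a sketch under simplifying assumptions: the state names come from the text, but the one-tick-per-second timeout counter and all class and method names are invented for illustration.

```python
# Hypothetical state machine over the four states in FIG. 8. The 30 s alarm
# window is modeled as a tick counter (one tick per second); the escalation
# rule (step-wise or direct jump) follows the text, the rest is illustrative.

STATES = {0: "monitoring", 1: "warning", 2: "event_recording", 3: "alarm"}
ALARM_TIMEOUT_TICKS = 30  # 30 s, per FIG. 8

class VehicleMonitor:
    def __init__(self):
        self.level = 0
        self.alarm_ticks = 0

    def on_tick(self, observed_level: int) -> str:
        # The level may rise one step at a time or jump straight to 2 or 3.
        self.level = max(self.level, observed_level)
        if self.level == 3:
            self.alarm_ticks += 1
            if self.alarm_ticks > ALARM_TIMEOUT_TICKS:
                self.level, self.alarm_ticks = 0, 0   # revert to monitoring state
        return STATES[self.level]

vm = VehicleMonitor()
print(vm.on_tick(0))   # monitoring
print(vm.on_tick(3))   # alarm (direct jump, no intermediate levels needed)
for _ in range(30):    # threat gone, but the alarm state holds until timeout
    vm.on_tick(0)
print(vm.on_tick(0))   # monitoring again after the 30 s window
```

The `max` in `on_tick` encodes the text's point that levels only escalate while a response is in progress, while the timeout encodes the optional power-saving reversion to the monitoring state.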
From the above, the vehicle in the embodiments of this application can select and combine different sensors, according to at least one of the scene type of the surrounding environment, moving objects in the surrounding environment, and the barrier situation of the surrounding environment, to monitor threat factors present in the surrounding environment. This optimizes the conventional monitoring mechanism and enables perception and recognition of multiple event types, improving monitoring accuracy while reducing sensor wear and extending sensor lifetime. In addition, the vehicle can identify the threat level from the monitored factors and execute the response event corresponding to that level, thereby eliminating threats in time and improving the vehicle's safety in the switched-off state.
Based on the same technical concept, an embodiment of this application further provides a vehicle monitoring apparatus 1000 having the functionality to carry out the method steps shown in FIGS. 4-9. For example, the apparatus 1000 includes functions, modules, units, or means for executing the method steps shown in FIGS. 4-9, which may be implemented in software, in hardware, or by hardware executing corresponding software.
Illustratively, referring to FIG. 10, the apparatus 1000 may include:
a processing unit 1001, configured to select at least one sensor from multiple sensors installed on the vehicle according to at least one of the scene type of the vehicle's surrounding environment, moving objects in the surrounding environment, and the barrier situation of the surrounding environment; and
a monitoring unit 1002, configured to monitor the vehicle's surrounding environment based on the at least one sensor.
For the specific implementation of the method steps performed by the above units, see the implementations of the corresponding method steps performed by the vehicle in the embodiments of FIGS. 4-9; details are not repeated here.
Based on the same technical concept, an embodiment of this application further provides an in-vehicle device 1100. Referring to FIG. 11, the in-vehicle device includes at least one processor 1101 configured to execute the method steps shown in FIGS. 4-9.
Optionally, the in-vehicle device 1100 may further include a memory 1102; the dashed box in FIG. 11 indicates that the memory 1102 is optional for the in-vehicle device 1100.
Optionally, the memory 1102 and the processor 1101 are communicatively connected via a bus, shown as a thick black line in FIG. 11.
It should be understood that the processor mentioned in the embodiments of this application may be implemented in hardware or in software. When implemented in hardware, the processor may be a logic circuit, an integrated circuit, or the like. When implemented in software, the processor may be a general-purpose processor realized by reading software code stored in a memory.
Illustratively, the processor may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
It should be understood that the memory mentioned in the embodiments of this application may be volatile memory or non-volatile memory, or may include both. Non-volatile memory may be read-only memory (ROM), programmable ROM (PROM), erasable PROM (EPROM), electrically erasable PROM (EEPROM), or flash memory. Volatile memory may be random access memory (RAM), used as an external cache. By way of illustration and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct Rambus RAM (DR RAM).
It should be noted that when the processor is a general-purpose processor, DSP, ASIC, FPGA or other programmable logic device, discrete gate or transistor logic device, or discrete hardware component, the memory (storage module) may be integrated into the processor.
It should be noted that the memory described herein is intended to include, without limitation, these and any other suitable types of memory.
Based on the same technical concept, an embodiment of this application further provides a computer-readable storage medium storing instructions which, when executed, cause the methods shown in FIGS. 4-9 to be implemented.
Based on the same technical concept, an embodiment of this application further provides a computer program product storing instructions which, when run on a computer, cause the computer to execute the methods shown in FIGS. 4-9.
It should be understood that the above embodiments may be combined with one another.
Those skilled in the art will appreciate that the embodiments of this application may be provided as a method, a system, or a computer program product. Accordingly, this application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Moreover, this application may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, and optical storage) containing computer-usable program code.
This application is described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to this application. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations thereof, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, special-purpose computer, embedded processor, or other programmable data processing device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to operate in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, such that a series of operational steps is performed on the computer or other programmable device to produce a computer-implemented process, whereby the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Obviously, those skilled in the art can make various changes and variations to this application without departing from its spirit and scope. If such modifications and variations fall within the scope of the claims of this application and their technical equivalents, this application is also intended to encompass them.
Claims (22)
- A vehicle monitoring method, applied to a vehicle in a switched-off state, the method comprising: selecting at least one sensor from multiple sensors installed on the vehicle according to at least one of a scene type of the vehicle's surrounding environment, a moving object in the surrounding environment, and a barrier situation of the surrounding environment; and monitoring the vehicle's surrounding environment based on the at least one sensor.
- The method according to claim 1, wherein before the selecting of at least one sensor from the multiple sensors installed on the vehicle according to at least one of the scene type of the vehicle's surrounding environment, the moving object in the surrounding environment, and the barrier situation of the surrounding environment, the method further comprises: identifying the scene type of the surrounding environment; and the selecting of at least one sensor from the multiple sensors installed on the vehicle according to at least one of the scene type of the vehicle's surrounding environment, the moving object in the surrounding environment, and the barrier situation of the surrounding environment comprises: selecting, according to a correspondence between scene types and sensors, a sensor corresponding to the scene type of the surrounding environment from the multiple sensors installed on the vehicle.
- The method according to claim 1 or 2, wherein before the selecting of at least one sensor from the multiple sensors installed on the vehicle according to at least one of the scene type of the vehicle's surrounding environment, the moving object in the surrounding environment, and the barrier situation of the surrounding environment, the method further comprises: capturing an image of the vehicle's surrounding environment using a camera installed on the vehicle; and the selecting comprises: determining, from the image captured by the camera, that a moving object appears in the surrounding environment; and selecting, from the multiple sensors installed on the vehicle, the camera and another type of sensor other than the camera as the at least one sensor.
- The method according to claim 3, wherein before the selecting, from the multiple sensors installed on the vehicle, of the camera and the other type of sensor other than the camera as the at least one sensor, the method further comprises: determining that the moving object satisfies any one or more of the following: the moving object moves toward the vehicle, the moving object's appearance frequency exceeds a preset frequency, the moving object's appearance duration exceeds a preset duration, or the moving object is within a preset range of the vehicle.
- The method according to any one of claims 1-4, further comprising: obtaining the barrier situation, the barrier situation comprising the presence of a barrier in a first area, the first area being in a first direction of the vehicle and at a distance from the vehicle smaller than a first threshold; wherein the at least one sensor does not include a sensor used for monitoring in the first direction.
- The method according to claim 5, wherein the barrier comprises a wall or another vehicle.
- The method according to any one of claims 1-6, wherein the monitoring of the vehicle's surrounding environment based on the at least one sensor comprises: monitoring, based on the at least one sensor, at least one factor by which the surrounding environment threatens the safety of the vehicle; determining, according to the at least one factor, a threat level of the surrounding environment to the vehicle; and executing a response event corresponding to the threat level.
- The method according to claim 7, wherein the determining of the threat level of the surrounding environment to the vehicle according to the at least one factor comprises: determining the threat level according to at least one of a type of the at least one factor, a value of the at least one factor, a duration of the at least one factor, a number of changes of the surrounding environment, or a speed of the vehicle.
- The method according to claim 7 or 8, wherein the threat levels comprise, from low to high, a first level, a second level, and a third level; a response event corresponding to the first level comprises any one or more of: flashing vehicle lights, sounding the horn, or flashing a center-console screen; a response event corresponding to the second level comprises: the response event corresponding to the first level, plus recording a video with a camera and saving the video; and a response event corresponding to the third level comprises: the response event corresponding to the second level, sending a reminder to a user device, and uploading the video to a cloud for download by the user device.
- A vehicle monitoring apparatus, applied to a vehicle in a switched-off state, the apparatus comprising: a processing unit, configured to select at least one sensor from multiple sensors installed on the vehicle according to at least one of a scene type of the vehicle's surrounding environment, a moving object in the surrounding environment, and a barrier situation of the surrounding environment; and a monitoring unit, configured to monitor the vehicle's surrounding environment based on the at least one sensor.
- The apparatus according to claim 10, wherein the processing unit is configured to: identify the scene type of the surrounding environment; and select, according to a correspondence between scene types and sensors, a sensor corresponding to the scene type of the surrounding environment from the multiple sensors installed on the vehicle.
- The apparatus according to claim 10 or 11, wherein the processing unit is configured to: capture an image of the vehicle's surrounding environment using a camera installed on the vehicle; determine, from the image captured by the camera, that a moving object appears in the surrounding environment; and select, from the multiple sensors installed on the vehicle, the camera and another type of sensor other than the camera as the at least one sensor.
- The apparatus according to claim 12, wherein the processing unit is further configured to: before selecting, from the multiple sensors installed on the vehicle, the camera and the other type of sensor other than the camera as the at least one sensor, determine that the moving object satisfies any one or more of the following: the moving object moves toward the vehicle, the moving object's appearance frequency exceeds a preset frequency, the moving object's appearance duration exceeds a preset duration, or the moving object is within a preset range of the vehicle.
- The apparatus according to any one of claims 10-13, wherein the processing unit is further configured to: obtain the barrier situation, the barrier situation comprising the presence of a barrier in a first area, the first area being in a first direction of the vehicle and at a distance from the vehicle smaller than a first threshold; wherein the at least one sensor does not include a sensor used for monitoring in the first direction.
- The apparatus according to claim 14, wherein the barrier comprises a wall or another vehicle.
- The apparatus according to any one of claims 10-15, wherein the monitoring unit is specifically configured to: monitor, based on the at least one sensor, at least one factor by which the surrounding environment threatens the safety of the vehicle; determine, according to the at least one factor, a threat level of the surrounding environment to the vehicle; and execute a response event corresponding to the threat level.
- The apparatus according to claim 16, wherein, when determining the threat level of the surrounding environment to the vehicle according to the at least one factor, the monitoring unit is specifically configured to: determine the threat level according to at least one of a type of the at least one factor, a value of the at least one factor, a duration of the at least one factor, a number of changes of the surrounding environment, or a speed of the vehicle.
- The apparatus according to claim 16 or 17, wherein the threat levels comprise, from low to high, a first level, a second level, and a third level; a response event corresponding to the first level comprises any one or more of: flashing vehicle lights, sounding the horn, or flashing a center-console screen; a response event corresponding to the second level comprises: the response event corresponding to the first level, plus recording a video with a camera and saving the video; and a response event corresponding to the third level comprises: the response event corresponding to the second level, sending a reminder to a user device, and uploading the video to a cloud for download by the user device.
- A vehicle monitoring apparatus, applied to a vehicle in a switched-off state, the apparatus comprising a memory and a processor, the memory storing computer program instructions, and the processor running the computer program instructions to perform the method according to any one of claims 1-9.
- A vehicle, comprising: multiple sensors; and the vehicle monitoring apparatus according to any one of claims 10-19.
- A computer-readable storage medium for storing instructions which, when executed, cause the method according to any one of claims 1-9 to be implemented.
- A computer program product storing instructions which, when run on a processor, cause the method according to any one of claims 1-9 to be implemented.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110333138.3 | 2021-03-29 | ||
CN202110333138.3A CN115214631A (zh) | 2021-03-29 | 2021-03-29 | 一种车辆监测方法、装置和车辆 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022206336A1 true WO2022206336A1 (zh) | 2022-10-06 |
Family
ID=83456854
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2022/080204 WO2022206336A1 (zh) | 2021-03-29 | 2022-03-10 | 一种车辆监测方法、装置和车辆 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN115214631A (zh) |
WO (1) | WO2022206336A1 (zh) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP4389472A1 (en) * | 2022-12-23 | 2024-06-26 | Nio Technology (Anhui) Co., Ltd | Functional safety for an electrical vehicle in stationary mode |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2003047923A1 (de) * | 2001-11-30 | 2003-06-12 | Robert Bosch Gmbh | Verfahren zum manipulieren |
JP2005078221A (ja) * | 2003-08-28 | 2005-03-24 | Nippon Soken Inc | 車載機器の駆動制御装置 |
JP2005161996A (ja) * | 2003-12-02 | 2005-06-23 | Mitsubishi Electric Corp | 車両周辺監視システム |
CN107323377A (zh) * | 2017-05-08 | 2017-11-07 | 苏州统购信息科技有限公司 | 一种车载预警系统以及预警方法 |
CN109204232A (zh) * | 2017-06-29 | 2019-01-15 | 宝沃汽车(中国)有限公司 | 车辆周边异常监测方法、装置及车辆 |
WO2020025614A1 (de) * | 2018-08-02 | 2020-02-06 | Trw Automotive Gmbh | Überwachungssystem für ein fahrzeug |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20220234604A1 (en) * | 2021-01-26 | 2022-07-28 | Ford Global Technologies, Llc | Hazard condition warning for package delivery operation |
US11724641B2 (en) * | 2021-01-26 | 2023-08-15 | Ford Global Technologies, Llc | Hazard condition warning for package delivery operation |
CN115713842A (zh) * | 2022-10-10 | 2023-02-24 | 重庆长安新能源汽车科技有限公司 | 车辆驻车时主动避险方法、系统、车辆及存储介质 |
CN115713842B (zh) * | 2022-10-10 | 2024-09-13 | 重庆长安新能源汽车科技有限公司 | 车辆驻车时主动避险方法、系统、车辆及存储介质 |
CN116279454A (zh) * | 2023-01-16 | 2023-06-23 | 禾多科技(北京)有限公司 | 车身装置控制方法、装置、电子设备和计算机可读介质 |
CN116279454B (zh) * | 2023-01-16 | 2023-12-19 | 禾多科技(北京)有限公司 | 车身装置控制方法、装置、电子设备和计算机可读介质 |
Also Published As
Publication number | Publication date |
---|---|
CN115214631A (zh) | 2022-10-21 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22778520; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 22778520; Country of ref document: EP; Kind code of ref document: A1 |