CN115214631A - Vehicle monitoring method and device and vehicle

Vehicle monitoring method and device and vehicle

Info

Publication number
CN115214631A
Authority
CN
China
Prior art keywords
vehicle
surrounding environment
sensor
moving object
sensors
Legal status
Pending
Application number
CN202110333138.3A
Other languages
Chinese (zh)
Inventor
李添泽
郑益红
Current Assignee
Shenzhen Yinwang Intelligent Technology Co., Ltd.
Original Assignee
Huawei Technologies Co., Ltd.
Application filed by Huawei Technologies Co., Ltd.
Priority to CN202110333138.3A
Priority to PCT/CN2022/080204 (published as WO2022206336A1)
Publication of CN115214631A

Classifications

    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60Q1/46 Arrangement of optical signalling or lighting devices intended to give flashing caution signals during drive, other than signalling change of direction, e.g. flashing the headlights or hazard lights
    • B60Q5/00 Arrangement or adaptation of acoustic signal devices
    • B60R25/10 Fittings or systems for preventing or indicating unauthorised use or theft of vehicles, actuating a signalling device
    • B60R25/30 Detection related to theft or to other events relevant to anti-theft systems
    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters related to ambient conditions
    • G07C5/08 Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C5/0841 Registering performance data
    • H04W4/14 Short messaging services, e.g. short message services [SMS] or unstructured supplementary service data [USSD]
    • B60W2552/50 Input parameters relating to infrastructure; barriers

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Mathematical Physics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The embodiments of this application provide a vehicle monitoring method and device, and a vehicle. When the vehicle is in a flameout state, at least one sensor is selected from a plurality of sensors installed on the vehicle according to at least one of the scene type of the surrounding environment, moving objects in the surrounding environment, and the barrier condition of the surrounding environment; the surrounding environment is then monitored based on the at least one sensor. Because the vehicle dynamically selects sensors according to the surrounding environment, it can sense and identify multiple types of events, which improves monitoring accuracy; and because not all sensors need to be used to monitor the surrounding environment, sensor service life is extended and sensor energy consumption is reduced while the safety of the vehicle in the flameout state is still ensured.

Description

Vehicle monitoring method and device and vehicle
Technical Field
This application relates to the field of intelligent vehicles (smart/intelligent cars), and in particular to a vehicle monitoring method and device and a vehicle.
Background
With rapid economic and technological development, living standards keep improving, and vehicles have become a necessity of daily life. While providing travel convenience for the owner, a vehicle also brings problems and troubles: for example, the owner may leave the vehicle after turning it off, at which point the vehicle is unattended.
To ensure the safety of a vehicle in the flameout state, the prior art uses a single sensor, such as a camera, a laser radar, or a millimeter-wave radar, to monitor the surrounding environment of the vehicle in the flameout state. However, a single sensor can monitor only a few threat factors and cannot cope with complex and changeable real-world scenes; in addition, long-duration monitoring wears the sensor heavily, shortening its service life, and consumes considerable energy.
Therefore, how to balance the safety of the vehicle in the flameout state against sensor service life and energy consumption has become a technical problem that urgently needs to be solved.
Disclosure of Invention
The embodiments of this application provide a vehicle monitoring method and device, and a vehicle, which can balance the safety of the vehicle in a flameout state against the service life and energy consumption of the sensors.
In a first aspect, a vehicle monitoring method is provided, applied to a vehicle in a flameout state. The method includes: the vehicle selects at least one sensor from a plurality of sensors mounted on the vehicle according to at least one of the scene type of the surrounding environment of the vehicle, moving objects in the surrounding environment, and the barrier condition of the surrounding environment; the surroundings of the vehicle are then monitored based on the at least one sensor.
In this method, the vehicle combines different sensors to monitor the threat factors present in the surrounding environment according to at least one of the scene type of the surrounding environment, moving objects in the surrounding environment, and the barrier condition of the surrounding environment. This optimizes the traditional monitoring mechanism: multiple types of events can be sensed and identified, monitoring accuracy can be improved, and at the same time sensor wear can be reduced and sensor service life extended.
A specific implementation method in which the vehicle selects at least one sensor from a plurality of sensors mounted on the vehicle according to at least one of a scene type of a surrounding environment of the vehicle, a moving object in the surrounding environment, and a barrier situation of the surrounding environment is described below.
In one possible design, the vehicle may first identify a scene type of the ambient environment, and then select a sensor corresponding to the scene type of the ambient environment from a plurality of sensors installed on the vehicle as the at least one sensor according to a correspondence between the scene type and the sensor.
In this design, the selected sensors are better adapted to the scene type of the surrounding environment, which can improve the accuracy with which the vehicle monitors the surrounding environment.
In one possible design, the vehicle may first capture images of its surroundings using a camera mounted on the vehicle and, when it determines from the captured images that a moving object is present in the surroundings, select the camera plus at least one other type of sensor from the plurality of sensors mounted on the vehicle as the at least one sensor. In other words, if no moving object is present around the vehicle body, the camera alone meets the monitoring requirement; if a moving object is present, other sensors are selected to work together with the camera, increasing the monitoring strength.
In the design, the vehicle can reduce the started sensors as much as possible while considering the monitoring precision, thereby reducing the loss of the sensors and prolonging the service life of the sensors.
In one possible design, the vehicle selects the camera and another type of sensor from the plurality of sensors mounted on the vehicle only after it determines that the moving object satisfies a preset condition. For example, the preset condition is any one or more of the following: the moving object moves toward the vehicle; the frequency with which the moving object appears exceeds a preset frequency; the duration for which the moving object is present exceeds a preset duration; or the moving object is within a preset range of the vehicle.
In the design, the vehicle can further reduce the loss of the sensor and prolong the service life of the sensor while considering the monitoring precision.
In one possible design, the vehicle may first obtain the barrier condition around the vehicle. If a barrier is present in a first area, the first area is in a first orientation of the vehicle, and the distance between the first area and the vehicle is less than a first threshold, this indicates that a security threat to the vehicle from the first orientation is very unlikely, so the at least one sensor selected by the vehicle may exclude the sensors used for monitoring in the first orientation.
In this design, by sensing whether barriers exist around the vehicle body, the vehicle can make reasonable use of their protective value: potential threat factors on the barrier side are already blocked, so the safety of the vehicle is improved while sensor wear is reduced, sensor service life is extended, and monitoring effectiveness is improved.
In one possible design, the barrier comprises a wall or other vehicle.
In this design, the vehicle can exploit the protective value of a wall or of other vehicles, avoiding potential threat factors while reducing sensor wear, extending sensor service life, and improving monitoring effectiveness.
It will be appreciated that several of the above described sensor selection schemes may be implemented in combination with each other.
The following describes a specific implementation method for monitoring the surroundings of a vehicle based on at least one sensor.
In one possible design, the vehicle may monitor, based on the at least one sensor, at least one factor by which the surrounding environment threatens the safety of the vehicle; determine, according to the at least one factor, the threat level of the surrounding environment to the vehicle; and then execute a response event corresponding to that threat level.
In the design, the vehicle can timely eliminate the threat and improve the safety of the vehicle in a flameout state.
In one possible design, the threat level of the surrounding environment to the vehicle is determined based on at least one of the type of the at least one factor, the value of the at least one factor, the duration of the at least one factor, the number of changes in the surrounding environment, or the speed of the vehicle. For example, the more factor types there are, the higher the corresponding threat level; or, the higher the value of a factor, the higher the threat level.
In this design, the vehicle can subdivide the threat level of the surrounding environment to the vehicle, further improving the safety of the vehicle in the flameout state.
In one possible design, the threat levels include, from low to high, a first level, a second level, and a third level. The response event corresponding to the first level includes any one or more of: flashing the vehicle lights, sounding the horn, or flashing the central control screen. The response events corresponding to the second level include the response events of the first level, plus recording and storing video with the camera. The response events corresponding to the third level include the response events of the second level, plus sending a prompt to the user equipment and uploading the video to the cloud so that the user equipment can download it.
In this design, different threat levels correspond to different response events, so threats can be eliminated in a targeted and timely manner, further improving the safety of the vehicle in the flameout state.
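As a minimal illustration of this design (all function names, weights, and thresholds below are hypothetical, not taken from this application), the following Python sketch grades monitored factors into the three levels and executes the cumulative response events described above:

```python
# Hypothetical sketch of the three-level threat model described above.
# Factor weights, thresholds, and actuator functions are illustrative only.

def flash_lights():        print("flashing vehicle lights")
def sound_horn():          print("sounding the horn")
def flash_center_screen(): print("flashing the central control screen")
def record_video():        print("camera records and stores video")
def notify_user():         print("sending a prompt to the user equipment")
def upload_to_cloud():     print("uploading video to the cloud")

# Cumulative responses: each level also executes the lower levels' events.
RESPONSES = {
    1: [flash_lights, sound_horn, flash_center_screen],
    2: [record_video],
    3: [notify_user, upload_to_cloud],
}

def threat_level(factors):
    """Grade threat from the number of factor types and their values.

    `factors` maps a factor type to a normalized value in [0, 1]; the
    grading rule is a stand-in for the level logic of this design.
    """
    if not factors:
        return 0
    score = len(factors) + sum(factors.values())  # more types / higher values raise the level
    if score < 2:
        return 1
    if score < 4:
        return 2
    return 3

def respond(level):
    for lvl in range(1, level + 1):   # execute the cumulative responses
        for action in RESPONSES[lvl]:
            action()

respond(threat_level({"moving_object_close": 0.9, "vehicle_shake": 0.7}))
```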
In a second aspect, an embodiment of this application provides a vehicle monitoring device, for example a vehicle in a flameout state, or a component or processing chip located within such a vehicle, the device comprising functions, modules, units, or means for performing the method described in the first aspect or any one of its possible designs.
Illustratively, the apparatus may include: a processing unit for selecting at least one sensor from a plurality of sensors mounted on the vehicle according to at least one of a scene type of a surrounding environment of the vehicle, a moving object in the surrounding environment, and a barrier situation of the surrounding environment; and the monitoring unit is used for monitoring the surrounding environment of the vehicle based on at least one sensor.
For a specific implementation manner of the method steps executed by each unit, reference may be made to the first aspect or a specific implementation manner of corresponding method steps in any possible design of the first aspect, which is not described herein again.
In a third aspect, embodiments of the present application provide a vehicle monitoring device applied to a vehicle in a flameout state, where the vehicle monitoring device includes a processor and a memory, where the memory stores computer program instructions, and the processor executes the computer program instructions to implement the method according to the first aspect or any one of the possible designs of the first aspect.
Illustratively, the processor may be configured to: selecting at least one sensor from a plurality of sensors mounted on the vehicle according to at least one of a scene type of a surrounding environment of the vehicle, a moving object in the surrounding environment, and a barrier situation of the surrounding environment; the surroundings of the vehicle are monitored on the basis of at least one sensor.
For a specific implementation of the method steps executed by the processor, reference may be made to the specific implementation of the method steps in the first aspect or any one of the possible designs of the first aspect, which is not described herein again.
In a fourth aspect, an embodiment of the present application provides a vehicle, including: a plurality of sensors; and a vehicle monitoring device as described in the second aspect or a vehicle monitoring device as described in the third aspect.
In a fifth aspect, embodiments of the present application provide a computer-readable storage medium for storing instructions that, when executed, cause a method as described in the first aspect or any one of the possible designs of the first aspect to be implemented.
In a sixth aspect, an embodiment of the present application provides a computer program product having instructions stored thereon, which when executed on a processor, cause a method as described in the first aspect or any one of the possible designs of the first aspect to be implemented.
The beneficial effects of the designs in the second aspect to the sixth aspect described above refer to the beneficial effects of the corresponding designs in the first aspect, which are not described herein again.
Drawings
FIG. 1 is a schematic diagram of an architecture of a possible on-board system according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a possible sensor layout provided by an embodiment of the present application;
FIG. 3 is a schematic diagram of a possible ECU system according to an embodiment of the present application;
FIG. 4 is a flow chart of a vehicle monitoring method provided by an embodiment of the present application;
fig. 5 is a flowchart of a sensor selection method based on scene recognition according to an embodiment of the present application;
FIG. 6 is a flowchart illustrating a method for selecting a sensor based on a moving object according to an embodiment of the present disclosure;
FIG. 7 is a flow chart of a method for sensor selection based on environmental barrier conditions according to an embodiment of the present application;
FIG. 8 is a schematic illustration of one possible threat level and corresponding response event;
FIG. 9 is a schematic diagram of one possible method of performing a response event;
fig. 10 is a schematic structural diagram of a vehicle monitoring device 1000 according to an embodiment of the present disclosure;
fig. 11 is a schematic structural diagram of an in-vehicle device 1100 according to an embodiment of the present application.
Detailed Description
The embodiments of this application apply to an on-board system, which may be deployed in a vehicle. It should be understood that the embodiments are mainly applied to a vehicle in a flameout state (or parked state), but they may also be applied to a vehicle in other states, for example a vehicle driving slowly, or a vehicle that has stopped but has not yet been turned off; this application is not limited in this respect. The flameout state is a state in which the engine of the vehicle is off and the vehicle is stationary.
Referring to fig. 1, a schematic diagram of a possible architecture of an on-board system provided in an embodiment of the present application is provided, where the architecture of the on-board system at least includes a sensor system and an Electronic Control Unit (ECU) system. The sensor system can collect data of the surrounding environment of the vehicle, and the collected data is input into the ECU system and processed by the ECU system.
The sensor system includes a variety of sensors, including, for example and without limitation, the following: ultrasonic radar (USS), a Camera (Camera), an Inertial Navigation System (INS), and a Global Positioning System (GPS).
1) Ultrasonic radar is radar that detects using ultrasonic waves. Its working principle is to measure distance from the time difference between the moment the transmitter emits an ultrasonic pulse and the moment the receiver picks up the echo. Ultrasonic waves are sound waves with a vibration frequency above 20,000 Hz, beyond the general upper limit of human hearing; sound at such frequencies is inaudible and is therefore called ultrasonic.
Ultrasonic radars include, but are not limited to, the following two types. The first is the reversing radar mounted on the front and rear bumpers of a vehicle to measure obstacles in front of and behind it, known in the industry as UPA. The second is the ultrasonic radar mounted on the side of the vehicle to measure the distance to side obstacles, known in the industry as APA. The UPA is a short-range ultrasonic radar, mainly installed at the front and rear of the vehicle body, with a detection range of 25 cm to 2.5 m; it detects accurately, with little interference from the Doppler effect or from temperature. The APA is a long-range ultrasonic radar, mainly used on the sides of the vehicle body, with a detection range of 35 cm to 5 m, enough to cover a parking space; it has strong directivity, transmission performance superior to the UPA's, and is not easily interfered with by other APAs or UPAs.
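The time-of-flight principle described in 1) reduces to a one-line computation. The sketch below is a minimal illustration, assuming sound propagates at roughly 343 m/s in air; the echo delay value is made up for the example:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees Celsius

def ultrasonic_distance(echo_delay_s: float) -> float:
    """Distance to an obstacle from the transmit-to-receive time difference.

    The pulse travels to the obstacle and back, so the one-way distance
    is half of speed * delay.
    """
    return SPEED_OF_SOUND * echo_delay_s / 2.0

# A round trip of about 14.6 ms corresponds to roughly 2.5 m, the upper
# end of the UPA detection range quoted above.
print(f"{ultrasonic_distance(0.0146):.2f} m")
```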
For example, fig. 2 shows a layout diagram of a plurality of sensors on a vehicle, and in the example shown in fig. 2, the ultrasonic radars a, b, g, h, i, and j are short-range ultrasonic radars and are arranged at the head and tail of the vehicle, and the ultrasonic radars c, d, e, and f are long-range ultrasonic radars and are arranged at the left and right sides of the vehicle.
2) A camera, or referred to as a camera sensor. The cameras in embodiments of the present application may include any camera for acquiring images of the environment in which the vehicle is located, including, for example and without limitation: infrared cameras, visible light cameras, and the like.
For example, in the example shown in fig. 2, the camera 1 is disposed at the front side of the vehicle, and can capture an image in front of the vehicle; the camera 2 is arranged at the rear side of the vehicle and can acquire images behind the vehicle; the cameras 3 and 4 are respectively arranged on the left side and the right side of the vehicle and can collect images of the left side and the right side of the vehicle.
3) An inertial navigation system is a navigation-parameter calculation system that uses gyroscopes and accelerometers as its sensitive devices: it establishes a navigation coordinate system from the gyroscope output, and calculates the speed and position of the carrier (such as a vehicle) in that coordinate system from the accelerometer output.
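As a rough illustration of the integration the INS performs, the following one-dimensional dead-reckoning sketch integrates accelerometer samples once for velocity and again for position. A real system would first use the gyroscope output to rotate the measurements into the navigation coordinate system; that step is omitted here:

```python
def dead_reckon(accels, dt, v0=0.0, p0=0.0):
    """Integrate acceleration samples into velocity and position (1-D, Euler)."""
    v, p = v0, p0
    for a in accels:
        v += a * dt      # first integration: velocity
        p += v * dt      # second integration: position
    return v, p

# 1 s of constant 0.5 m/s^2 acceleration sampled at 100 Hz.
v, p = dead_reckon([0.5] * 100, dt=0.01)
print(f"v = {v:.2f} m/s, p = {p:.3f} m")
```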
4) The Global Positioning System, also called the global satellite positioning system, is a medium-Earth-orbit satellite navigation system that combines satellite and communication technologies, using navigation satellites for timing and ranging.
It should be understood that fig. 2 is only an example, and the arrangement position of various sensors in practical application may be different from fig. 2, and may also include more or fewer sensors, and may also include other types of sensors, which is not limited in this application.
The ECU system may process data collected by each sensor in the sensor system. For example, the ECU processes image data collected by the camera to identify objects (e.g., obstacles) in the image. And the ECU system can also make a decision to drive the controlled element to work based on the processing result. Wherein the controlled element includes but is not limited to: sensors, speakers, car lights, central control screens, etc.
In the embodiment of the present application, the ECU system is composed of a plurality of ECUs, and the ECUs may communicate with each other to exchange data, for example, each ECU is connected to a Controller Area Network (CAN) bus, and the ECUs exchange data based on the CAN bus.
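The following toy dispatcher models, in-process, the kind of frame exchange between ECUs sketched above; the arbitration ID and payload are hypothetical, and a real vehicle would use a CAN driver rather than this simplified bus object:

```python
# In-process model of ECUs exchanging frames over a shared bus.

class CanBus:
    def __init__(self):
        self.nodes = []

    def attach(self, ecu):
        self.nodes.append(ecu)

    def send(self, arbitration_id, data, sender=None):
        # broadcast the frame to every node except the sender
        for ecu in self.nodes:
            if ecu is not sender:
                ecu.on_frame(arbitration_id, data)

class Ecu:
    def __init__(self, name, bus):
        self.name, self.bus = name, bus
        bus.attach(self)

    def on_frame(self, arbitration_id, data):
        print(f"{self.name} received id=0x{arbitration_id:X} data={data!r}")

bus = CanBus()
mdc = Ecu("MDC", bus)
bcm = Ecu("BCM", bus)
# The MDC broadcasts a hypothetical command frame (e.g., "flash lights").
bus.send(0x321, b"\x01", sender=mdc)
```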
The ECU may be implemented by any device or module having a processing function. For example, the ECU may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, and so on. A general-purpose processor may be a microprocessor or any conventional processor.
Referring to fig. 3, the ECUs in the embodiment of the present application include, but are not limited to, the following types according to the functional division of each ECU: a vehicle mounted Mobile Data Center (MDC), a Body Control Manager (BCM), a smart Cabin Domain Controller (CDC), and a Telematics Box (TBOX).
1) The MDC is the core ECU of the vehicle. It has computing and control functions: it processes the data collected by each sensor, converts the processing result into a control command, and uses the control command to control the operation of a controlled element. For example, the MDC sends the control command to the ECU corresponding to the controlled element (e.g., the BCM, CDC, or TBOX), and that ECU drives the controlled element according to the command.
The MDC may also control memory (ROM/FLASH/EEPROM, RAM), input/output interfaces (I/O), and other external circuitry; the memory may store programs.
The vehicle monitoring method provided by the embodiments of this application may be executed under the control of the MDC, or completed by the MDC calling other components, for example by running a processing program stored in the memory that operates on the data collected by each sensor and controls the controlled elements.
2) BCM, also known as body computer (body computer), is an ECU for controlling the electrical systems of the vehicle body. BCM controlled elements include, but are not limited to: power windows, power mirrors, air conditioners, vehicle lights (such as headlights, turn lights, etc.), anti-theft locking systems, central locks, defrosting devices, etc. The BCM CAN be connected with other vehicle-mounted ECUs through a CAN bus.
3) The CDC is an ECU for controlling the elements in the smart cabin. Elements in the smart cabin include, but are not limited to: the instrument screen, the central control panel screen (central control screen for short), the head-up display, the microphone, the camera, the speaker (loudspeaker), the Bluetooth module, and so on. Through human-computer interaction, the smart cabin can control the running state and trajectory of an autonomous vehicle according to passenger needs, so human-computer interaction inside the cabin and remote control can issue the same commands to control the running of the vehicle.
4) The TBOX is mainly used to communicate with a back-end system or with an application (app) on user equipment, enabling app-based display and control of vehicle information. The TBOX may use 3G cellular communication such as Code Division Multiple Access (CDMA), EV-DO, or Global System for Mobile Communications (GSM)/General Packet Radio Service (GPRS), 4G cellular communication such as Long Term Evolution (LTE), or 5G cellular communication. The TBOX may communicate with a Wireless Local Area Network (WLAN) using WiFi. In some embodiments, the TBOX may communicate directly with a device using an infrared link, Bluetooth, or ZigBee. The TBOX may also communicate based on other wireless protocols, for example communicating directly with other vehicles and/or roadside units using the Dedicated Short-Range Communications (DSRC) protocol.
It should be noted that fig. 3 is only an example, and the number and layout of ECUs may have other implementation manners in practical applications, and the present application is not limited in detail herein. In addition, each ECU in fig. 3 may be separately deployed or may be mutually integrated and deployed, and the embodiment of the present application is not limited.
Based on the above description, the embodiment of the present application provides a vehicle monitoring method, taking the application of the method to the vehicle-mounted system shown in fig. 1 as an example, and referring to fig. 4, the method includes the following processes:
s401, when the vehicle is in a flameout state, the vehicle selects at least one sensor from a plurality of sensors installed on the vehicle according to at least one of scene types of the surrounding environment, moving objects in the surrounding environment and barrier conditions of the surrounding environment.
Specifically, the ECU system in the vehicle determines the type of event to be monitored according to at least one of the scene type of the surrounding environment, moving objects in the surrounding environment, and the barrier condition of the surrounding environment, and then selects, from the plurality of sensors mounted on the vehicle, at least one sensor corresponding to that event type, i.e., at least one sensor that can effectively monitor events of that type.
Optionally, selecting at least one sensor from the plurality of sensors mounted on the vehicle may be implemented as selecting and activating the at least one sensor. It should be understood that if some of the at least one sensor have already been selected and activated, only the sensors that are not yet active need to be activated. Optionally, if sensors other than the selected at least one sensor have already been selected or turned on, those other sensors are deselected or turned off.
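The activation bookkeeping just described is plain set arithmetic. A minimal sketch, with illustrative sensor names:

```python
def reconcile(active: set, selected: set):
    """Return which sensors to turn on and which to turn off.

    Only the selected sensors that are not yet active are activated,
    and any active sensor that is no longer selected is deactivated.
    """
    return selected - active, active - selected

to_on, to_off = reconcile(active={"camera", "gps"},
                          selected={"camera", "ultrasonic_radar"})
print(f"activate: {to_on}, deactivate: {to_off}")
```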
In a specific implementation process, a plurality of elements (i.e., scene types of the surrounding environment, moving objects in the surrounding environment, and barriers of the surrounding environment) for selecting the at least one sensor from the plurality of sensors by the vehicle may be implemented individually or in combination, and the present application is not limited thereto.
First, a case where a scene type of the surrounding environment, a moving object in the surrounding environment, and a barrier situation of the surrounding environment are individually implemented will be described.
1. The vehicle selects at least one sensor according to the scene type of the surrounding environment.
The scene type of the surrounding environment may characterize the classification of the surrounding environment, based on its manner of formation, functional use, geographic location, time zone, facilities, natural-environment elements, human activity characteristics, building type, privacy, or the like.
The application does not limit the specific dividing mode of the scene types, for example: according to the formation of the surrounding environment, the scene type of the surrounding environment can be divided into a natural environment, an artificial environment and the like; according to the functions of the surrounding environment, the scene types of the surrounding environment can be divided into living environment, ecological environment and the like; according to different elements in the surrounding environment, the scene type of the surrounding environment can be divided into an atmospheric environment, a water environment, a soil environment, a biological environment, a geological environment and the like; according to the gathering mode of human beings in the surrounding environment, the environment can be divided into rural environment, urban environment and the like; according to the privacy of the surrounding environment, the scene types of the surrounding environment can be divided into a private environment, a public environment and the like; the scene types of the surrounding environment may be classified into residential area environment, open/underground parking lot environment, street roadside environment, high-speed roadside environment, field environment, and the like, according to the types of buildings in the surrounding environment.
In this embodiment of the application, the vehicle may preset a correspondence between scene types and sensors. Before using sensors to monitor the surrounding environment, it can therefore first identify the scene type of the surrounding environment and then, according to that correspondence, select the sensors corresponding to this scene type from the plurality of sensors installed on the vehicle. The vehicle thus does not need to keep all sensors on at all times to monitor the surrounding environment; selecting the appropriate sensors on demand improves monitoring accuracy while reducing wear on the other sensors, extending sensor service life, and lowering the overall energy consumption of the vehicle's sensors.
For example, fig. 5 shows a flowchart of a sensor selection method based on scene recognition, which may be applied to the in-vehicle system shown in fig. 1, and may be specifically executed by an ECU system in the in-vehicle system. The method comprises the following steps:
s501, in a flameout state, the vehicle identifies the scene type of the surrounding environment.
Specifically, the MDC in the vehicle's ECU system identifies the scene type of the vehicle's surrounding environment. The identification mode can be various, and the application is not limited. For example, the MDC obtains a history of the vehicle during driving (e.g., image data captured by a camera during driving, position data in a navigation system, etc.), and then determines a scene type of the surrounding environment of the vehicle based on the history. Or, for example, the MDC first gathers data for the ambient environment based on one or more sensors (e.g., cameras) on the vehicle and then determines the scene type of the ambient environment of the vehicle based on the data.
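A hedged sketch of S501 follows: it tries a scene type remembered from the driving history first and falls back to classifying a freshly captured camera image. Both classifier functions are hypothetical placeholders, not components defined by this application:

```python
def scene_from_history(history: dict):
    # e.g., a scene type recorded from the last navigation fix
    return history.get("last_scene")          # None if nothing usable

def scene_from_camera(image):
    # stand-in for an image classifier running on the MDC
    return "street_roadside"

def identify_scene(history: dict, capture_image):
    scene = scene_from_history(history)
    if scene is None:                         # no usable history: use the camera
        scene = scene_from_camera(capture_image())
    return scene

print(identify_scene({}, lambda: "camera frame"))
```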
And S502, selecting a sensor corresponding to the scene type of the surrounding environment from a plurality of sensors installed on the vehicle by the vehicle according to the corresponding relation between the scene type and the sensor.
Specifically, the MDC may preset a first corresponding relationship between the scene type and the sensor, for example, store the first corresponding relationship between the scene type and the sensor in a memory. After the MDC determines the scene type of the surrounding environment, a sensor corresponding to the scene type is selected from a plurality of sensors installed on the vehicle according to the first corresponding relation.
The sensors corresponding to each scene type may be determined according to the types of events to be monitored in that scene; this application does not specifically limit the correspondence between scene types and sensors. Several possible examples are listed below, and a code sketch of the correspondence follows the examples:
example 1, inside a residential cell: in the scene, pedestrians are rare, the road condition is simple, and only slight scratch of the vehicles is possible, so that the safety of the vehicles can be ensured only by selecting the camera and the ultrasonic radar.
Example 2, street curb: in the scene, the flow of people and vehicles is large, the situation is complex and changeable, and various safety threats such as scratch and grazing of trailers and vehicles and theft can occur, so that a camera, an ultrasonic radar, an inertial navigation system and a global positioning system can be selected, and the safety of the vehicles is ensured.
Example 3, open/underground parking lot: the situation is simple, but the risk that the vehicle is scratched and stolen exists, so that the inertial navigation system, the camera and the ultrasonic radar can be selected to sense that the object is close to the vehicle.
Example 4, outdoor strange environment: the vehicle is parked in strange outdoor, the risk of theft is very high, so the change of the vehicle in the amplitude and the direction of the earthquake degree can be sensed by selecting the camera and the INS.
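A minimal sketch of the first correspondence, mirroring examples 1 to 4 above; the dictionary keys and sensor names are illustrative, and the default for an unknown scene is an assumption rather than something specified by this application:

```python
SCENE_TO_SENSORS = {
    "residential_community": {"camera", "ultrasonic_radar"},
    "street_roadside":       {"camera", "ultrasonic_radar", "ins", "gps"},
    "parking_lot":           {"camera", "ultrasonic_radar", "ins"},
    "unfamiliar_outdoor":    {"camera", "ins"},
}

def select_for_scene(scene_type: str) -> set:
    # fall back to the camera alone for an unrecognized scene (assumption)
    return SCENE_TO_SENSORS.get(scene_type, {"camera"})

print(select_for_scene("residential_community"))
```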
It should be understood that before step S502 some sensors may already have been selected for monitoring the surrounding environment of the vehicle. Therefore, when executing step S502, if a sensor corresponding to the scene type of the surrounding environment has not yet been selected, it is selected now; if it has already been selected, it simply continues to monitor the surrounding environment. Optionally, if sensors that do not correspond to the scene type of the surrounding environment have been selected for monitoring, their monitoring may be cancelled.
According to the embodiment of the application, different monitoring mechanisms are formulated according to different scene types of the surrounding environment of the vehicle, namely, the sensor corresponding to the scene type of the surrounding environment is selected from a plurality of sensors installed on the vehicle to be used for monitoring the surrounding environment. The monitoring precision can be improved, meanwhile, the loss of other sensors is reduced, the service life of the sensors is prolonged, and the overall energy consumption of the vehicle sensors is reduced.
2. The vehicle selects at least one sensor from a plurality of sensors mounted on the vehicle according to moving objects in the surrounding environment.
A moving object in the embodiments of this application is any object that can move, including living objects (e.g., people, cats, dogs, rabbits, snakes, butterflies, wolves, or birds) and non-living objects (e.g., vehicles, drones, falling rocks, or falling leaves). It should be understood that the movement may be autonomous (e.g., a person walking, a surrounding vehicle driving, a bird flying, an animal running) or passive (e.g., leaves blown off by the wind, a landslide); this application is not limited in this respect.
In the embodiment of the application, the vehicle can select the sensor from the various sensors mounted on the vehicle by sensing the moving object existing around the vehicle body, so that the loss of other sensors can be reduced while the monitoring precision is improved, the service life of the sensor is prolonged, and the overall energy consumption of the vehicle sensor is reduced.
The following lists several typical application scenarios. Scene 1, late-night residential area or street roadside: a pedestrian or vehicle passes by only occasionally, and few moving objects appear. Scene 2, outdoors: foot traffic is light, and few moving objects appear around the vehicle body. Scene 3, underground parking lot: outside the entrance and exit areas, few moving objects appear around the vehicle body.
In these scenarios, before the vehicle uses the sensors to monitor the surrounding environment of the vehicle, a small number of sensors (e.g., a first sensor) may be selected from a plurality of sensors installed on the vehicle, and then the first sensor may be used to detect whether a moving object is present in the surrounding environment. After the vehicle determines that a moving object is present in the surrounding environment, the first sensor and other types of sensors, such as ultrasonic radar, inertial navigation systems, etc., are selected from a variety of sensors mounted on the vehicle for monitoring the surrounding environment of the vehicle.
Optionally, in order to improve the monitoring accuracy, the vehicle may select another type of sensor besides the first sensor after determining that a moving object is present in the surrounding environment and the moving object satisfies the preset condition. Further optionally, the preset condition includes, but is not limited to, any one or more of the following: moving the moving object to the direction close to the vehicle; the occurrence frequency of the moving object exceeds a preset frequency; the appearance time of the moving object exceeds the preset time; the moving object is within a preset range of the vehicle.
It should be understood that the present application is not limited to the type of first sensor, for example the first sensor may be a camera or an ultrasonic radar, etc. Taking the first sensor as a camera for example, fig. 6 shows a flowchart of a moving object-based sensor selection method, which can be applied to the vehicle-mounted system shown in fig. 1, and the method includes:
s601, in a flameout state, starting a camera by the vehicle, entering a monitoring state, and shooting images of the surrounding environment by the camera;
s602, monitoring whether a moving object exists around the vehicle body by the vehicle based on the image shot by the camera;
specifically, the MDC recognizes that there is a shot image captured, and detects whether there is a moving object present in the surrounding environment.
S603, if a moving object appears, the vehicle continuously monitors the frequency with which the object appears, based on the images captured by the camera; if no moving object appears, the camera is simply kept on.
S604, when the moving object appears very frequently (e.g., the frequency exceeds a set frequency threshold), the safety of the vehicle body is threatened, and the vehicle selects other types of sensors (such as the ultrasonic radar and the inertial navigation system) from the plurality of sensors mounted on the vehicle to monitor the surrounding environment together with the camera; if the frequency is low, it may just be a passing pedestrian or animal with no intention of approaching the vehicle, and it suffices to keep only the camera on, i.e., to rely on camera-based detection.
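A compact sketch of the S601 to S604 flow: only the camera runs until motion becomes frequent, after which further sensors are enlisted. The motion detector, the threshold, and the sensor names are hypothetical stand-ins:

```python
FREQ_THRESHOLD = 5          # detections per observation window (illustrative)

def monitor_window(frames, detect_motion):
    """Return the sensor set after one observation window of camera frames."""
    active = {"camera"}
    detections = sum(1 for frame in frames if detect_motion(frame))
    if detections > FREQ_THRESHOLD:
        # S604: the moving object appears frequently, so escalate
        active |= {"ultrasonic_radar", "ins"}
    return active

frames = range(60)                                   # e.g., one frame per second
print(monitor_window(frames, lambda f: f % 8 == 0))  # 8 detections -> escalate
```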
Alternatively, the vehicle may determine other types of sensors according to at least one of the moving direction of the moving object, the appearance frequency of the moving object, the appearance duration of the moving object, or the distance between the moving object and the vehicle.
Example 1, an MDC of a vehicle presets a second correspondence between a moving direction of a moving object and a sensor, and after obtaining the moving direction of the moving object, the MDC determines to select a second sensor from a plurality of sensors mounted on the vehicle according to the second correspondence.
Still taking a camera as the first sensor: if the moving object moves toward the vehicle along a curved path, it is not approaching quickly (it may just be a passing pedestrian), and the ultrasonic radar can be selected to monitor together with the camera; if the moving object moves straight toward the vehicle, it is approaching quickly, and the ultrasonic radar, inertial navigation system, global positioning system, and the like can be selected to monitor together with the camera, quickly strengthening the vehicle's monitoring capability.
Example 2, the MDC of the vehicle previously sets a third correspondence relationship between the occurrence frequency of the moving object and the sensor, and determines a third sensor that needs to be selected from the plurality of sensors mounted on the vehicle according to the third correspondence relationship after obtaining the occurrence frequency of the moving object.
Taking the first sensor as a camera as an example, if the occurrence frequency of the moving object is lower than a first frequency threshold, selecting an ultrasonic radar to cooperate with the camera for monitoring; if the appearance frequency of the moving object is higher than a first frequency threshold and lower than a second frequency threshold, selecting an ultrasonic radar and a global positioning system to cooperate with a camera for monitoring; and if the appearance frequency of the moving object is higher than the second frequency threshold value, selecting an ultrasonic radar, a global positioning system and an inertial navigation system to be matched with the camera for monitoring. The first frequency threshold is lower than the second frequency threshold, and both the first frequency threshold and the second frequency threshold are greater than 0.
The occurrence frequency of the moving object may be the number of occurrences of the moving object within a preset time range, for example, the number of occurrences of the moving object within one minute, the MDC may sample whether the moving object occurs once per second, and if the moving object occurs, the count value is +1. Of course, this is merely an example, and the present application does not limit the specific implementation of MDC for counting the occurrence frequency of moving objects.
Example 3, the MDC of the vehicle sets in advance a fourth correspondence between the occurrence duration of the moving object and the sensor, and determines, after obtaining the occurrence duration of the moving object, a fourth sensor that needs to be selected from the plurality of sensors mounted on the vehicle according to the fourth correspondence.
Taking the first sensor as a camera as an example: if the moving object has been present for 5 s, the ultrasonic radar is selected to monitor together with the camera; if it has been present for 30 s, the ultrasonic radar and the global positioning system are selected to monitor together with the camera; and if it has been present for 1 min, the ultrasonic radar, the global positioning system, and the inertial navigation system are selected to monitor together with the camera.
Example 4, the MDC of the vehicle previously sets a fifth correspondence between the distance from the moving object to the vehicle and the sensor, and determines a fifth sensor that needs to be selected from a plurality of sensors mounted on the vehicle according to the fifth correspondence after obtaining the distance from the moving object to the vehicle.
Taking the first sensor as a camera as an example, if the distance between the moving object and the vehicle is 5-10 meters, selecting an ultrasonic radar to cooperate with the camera for monitoring; if the distance between the moving object and the vehicle is 2-5 m, selecting an ultrasonic radar and a global positioning system to cooperate with a camera for monitoring; and if the distance between the moving object and the vehicle is within 2 meters, selecting an ultrasonic radar, a global positioning system and an inertial navigation system to cooperate with the camera for monitoring.
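Examples 2 to 4 all reduce to banded thresholds. The sketch below shows two of the correspondences; the distance bands follow example 4 above, while the frequency band edges are invented for illustration since the application leaves the first and second frequency thresholds unspecified:

```python
def sensors_for_frequency(freq, first_threshold=3, second_threshold=10):
    # band edges are hypothetical; the text only requires 0 < first < second
    if freq <= first_threshold:
        return {"ultrasonic_radar"}
    if freq <= second_threshold:
        return {"ultrasonic_radar", "gps"}
    return {"ultrasonic_radar", "gps", "ins"}

def sensors_for_distance(meters):
    if meters > 5:                      # 5-10 m band of example 4
        return {"ultrasonic_radar"}
    if meters > 2:                      # 2-5 m band
        return {"ultrasonic_radar", "gps"}
    return {"ultrasonic_radar", "gps", "ins"}   # within 2 m

# The selected sensors always work together with the camera.
print({"camera"} | sensors_for_distance(3.5))
```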
Optionally, after the vehicle selects another type of sensor to cooperate with the camera to jointly monitor the surrounding environment, if it is monitored that the appearance frequency of the moving object becomes low or the moving object disappears, the other type of sensor may be reduced or turned off.
According to the embodiment of the application, different monitoring mechanisms are formulated by sensing whether moving objects exist around the vehicle. If no moving object exists around the vehicle body, the camera is selected to meet the monitoring condition. If moving objects exist around the vehicle body, other sensors are selected to be matched with the camera for use, and monitoring strength is improved. The monitoring precision can be improved, meanwhile, the loss of other sensors is reduced, the service life of the sensors is prolonged, and the overall energy consumption of the vehicle sensors is reduced.
3. The vehicle selects at least one sensor from a plurality of sensors mounted on the vehicle according to the barrier condition of the surrounding environment.
The barrier condition of the surrounding environment includes whether a barrier exists in the surrounding environment, the barrier type, the barrier degree, and so on. A barrier is another object capable of protecting the safety of the vehicle, for example by blocking other moving objects from approaching it. This application does not limit the specific type of barrier; barriers include, but are not limited to, walls, other vehicles, trees, or fences. The barrier degree of the surrounding environment may refer to the degree to which, or the ability with which, barriers in the surrounding environment block other moving objects from approaching or damaging the vehicle.
Optionally, the vehicle may determine the barrier degree of the surrounding environment based on its openness. For example, the lower the openness, the heavier the barrier degree; conversely, the higher the openness, the lighter the barrier degree. Optionally, the barrier degree of the surrounding environment is related to the scene type of the surrounding environment, the degree of enclosure of its space, or the size of its space. For example, at a street roadside the space is poorly enclosed and the mobility of pedestrians and other vehicles is high, so openness is high and the barrier degree is light. In a residential area, access control usually closes off the space and the mobility of pedestrians and other vehicles is low, so openness is moderate and the barrier degree is moderate. In a private garage, the space is small and strongly enclosed and the mobility of pedestrians and other vehicles is very low, so openness is low and the barrier degree is heavy.
In the embodiment of the present application, the vehicle may monitor the barrier condition of the peripheral environment in multiple implementations, for example, the barrier condition of the peripheral environment is determined based on the scene type of the peripheral environment, or the barrier condition of the peripheral environment is determined based on the size of the space of the peripheral environment, or the barrier condition of the peripheral environment is determined based on the degree of closure of the space of the peripheral environment.
Taking as an example the case of judging the barrier condition from the degree of enclosure of the surrounding space: in one possible implementation, the vehicle may monitor whether a safety barrier exists in each orientation around it. If no safety barrier exists in a given orientation, that orientation has high openness, a light barrier degree, and poor safety, and a sensor capable of monitoring that orientation can be selected from the plurality of sensors mounted on the vehicle. If a safety barrier exists in another orientation, that orientation has low openness, a heavy barrier degree, and good safety, and the number of sensors monitoring that orientation can be appropriately reduced, or they can be turned off.
Optionally, the barrier is a safety barrier when the distance between the area in which the barrier is located and the vehicle is less than a first threshold.
For example, FIG. 7 shows a flow diagram of a method for sensor selection based on environmental barrier conditions, which may be applied to the on-board system shown in FIG. 1, the method comprising:
and S701, under a flameout state, starting a monitoring function of the vehicle, and enabling the vehicle to enter a monitoring state.
Optionally, when the monitoring function of the vehicle is turned on, all the sensors may be turned on, or only some of the sensors may be turned on (for example, only the camera is turned on), which is not limited in the present application.
S702, the vehicle monitors whether a safety barrier exists around the vehicle body.
For example, the vehicle determines whether there is a wall or another vehicle around the vehicle body from the image captured by the camera.
Optionally, when the distance between the barrier and the vehicle is less than the preset distance, the barrier is a safety barrier. The preset distance is, for example, 1 meter, 1.5 meters, 2 meters, etc., and the application is not limited thereto. In addition, the value of the preset distance may be different for different types of barriers. For example, for a wall, the preset distance is 1.5 meters; for other vehicles, the preset distance is 1 meter.
S703A, if there is a safety barrier on one side, the sensors on the barrier side are not turned on, or are turned off.
Specifically, if the sensors on the barrier side of the vehicle have not yet been turned on, they are kept off; if the sensors on that side have already been turned on, they are turned off.
Optionally, the vehicle may shut down all sensors on the barrier side of the vehicle to save power consumption of the sensors to the maximum extent and extend the life of the sensors.
Alternatively, the vehicle may turn off only a portion of the sensors on the barrier side to further improve safety while properly conserving sensor power consumption.
S703B, for a side on which no safety barrier exists, the sensors on that side are turned on normally.
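A minimal sketch of the per-side logic of S701 to S703B might read as follows (the function and the example data are assumptions made for illustration):

# Illustrative sketch of S701-S703B (all names are hypothetical):
# sensors stay off on a side protected by a safety barrier (S703A)
# and are turned on normally elsewhere (S703B).
SIDES = ("front", "rear", "left", "right")

def select_sensors_by_barrier(barrier_on_side: dict) -> dict:
    """Return, per side, whether that side's sensors should be on."""
    return {side: not barrier_on_side.get(side, False) for side in SIDES}

# Example: a wall on the left side (scene 2 below), nothing elsewhere.
print(select_sensors_by_barrier({"left": True}))
# {'front': True, 'rear': True, 'left': False, 'right': True}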
The following further exemplifies several typical application scenarios:
Scene 1, the vehicle is parked in a parking space with other vehicles parked on one or both sides: on the side(s) where other vehicles are parked, the gap left between the vehicles is narrow and it is inconvenient for pedestrians or other vehicles to pass through, so the threat of damage to that side of the vehicle body is almost negligible. Turning on the sensors on that side for monitoring therefore has little value, and those sensors (for example, the camera, the ultrasonic radar, and the like) may be left off or turned off.
Scene 2, the vehicle is parked alongside an obstacle such as a wall on one side: the gap between the vehicle and the wall is narrow, so risks such as scratching, abnormal movement, or theft from that side are essentially excluded, and the obstacle itself protects that side of the vehicle body; the sensors on that side (for example, the camera, the ultrasonic radar, and the like) may therefore be left off or turned off.
In the embodiment of the application, the vehicle senses whether a safety barrier exists around the vehicle body and makes reasonable use of the protection the barrier provides: since potential threats to the barrier side of the vehicle body are already excluded, unnecessary wear on the sensors is avoided, the service life of the sensors is extended, and the effectiveness of monitoring is improved.
The above describes the cases where the three elements used for sensor selection in the present application (i.e., the scene type of the surrounding environment, the moving object in the surrounding environment, and the barrier condition of the surrounding environment) are each implemented individually. In a specific implementation, the three sensor selection schemes may also be combined with one another.
Several possible combined schemes are exemplified below.
4. The vehicle selects at least one sensor from a plurality of sensors mounted on the vehicle according to the scene type of the surrounding environment and the moving object in the surrounding environment.
For example, after the vehicle switches from the driving state to the flameout state, the vehicle first determines the scene type of its surrounding environment from the history recorded during driving and selects at least one sensor corresponding to that scene type; it then detects, based on the at least one sensor, whether a moving object appears in the surrounding environment, and if a moving object appears or the frequency of the moving object exceeds a preset frequency, further selects other types of sensors or a greater number of sensors.
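A hedged sketch of this two-stage selection (the correspondence table, sensor names, and threshold are assumptions, not the application's actual tables):

# Illustrative sketch of combined scheme 4: select sensors for the
# scene type first, then escalate when moving objects appear often.
SCENE_SENSORS = {                      # hypothetical correspondence table
    "street_roadside": ["camera", "ultrasonic_radar"],
    "residential_area": ["camera"],
    "private_garage": ["camera"],
}

def select_sensors(scene_type: str, moving_object_freq: float,
                   preset_freq: float = 1.0) -> set:
    selected = set(SCENE_SENSORS.get(scene_type, ["camera"]))
    if moving_object_freq > preset_freq:
        # A moving object appears too often: add other/more sensor types.
        selected |= {"ultrasonic_radar", "inertial_navigation_system"}
    return selected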
5. The vehicle selects at least one sensor from a plurality of sensors installed on the vehicle according to the scene type of the surrounding environment and the barrier condition of the surrounding environment.
For example, after the vehicle switches from the driving state to the flameout state, the vehicle first determines the scene type of its surrounding environment from the history recorded during driving and selects at least one sensor corresponding to that scene type; it then detects, based on the selected sensors, whether a safety barrier exists around the vehicle; for a side on which no safety barrier exists, some or all of the sensors on that side remain selected, and for a side on which a safety barrier exists, all the sensors on that side are deselected or turned off.
6. The vehicle selects at least one sensor from a plurality of sensors mounted on the vehicle according to the moving object in the surrounding environment and the barrier condition of the surrounding environment.
For example, after the vehicle switches from the driving state to the flameout state, the cameras on the front, rear, left, and right sides are selected first, and whether a safety barrier exists around the vehicle is detected based on these cameras; for a side on which no safety barrier exists, some or all of the sensors on that side are selected, and for a side on which a safety barrier exists, all the sensors on that side are turned off. Thereafter, the vehicle continues to detect, based on the selected sensors, whether a moving object appears in the surrounding environment; if a moving object appears or the frequency of the moving object exceeds a preset frequency, other types of sensors or a greater number of sensors are further selected.
7. The vehicle selects at least one sensor from a plurality of sensors installed on the vehicle according to the scene type of the surrounding environment, the moving objects in the surrounding environment, and the barrier condition of the surrounding environment.
For example, after the vehicle switches from the driving state to the flameout state, the cameras on all four sides (front, rear, left, and right) are selected first, and whether a safety barrier exists around the vehicle is detected based on these cameras; for a side on which no safety barrier exists, some or all of the sensors on that side are selected, and for a side on which a safety barrier exists, all the sensors on that side are turned off. The vehicle then detects the scene type of the surrounding environment based on the selected cameras and selects the sensors corresponding to the current scene type. Thereafter, the vehicle continues to detect, based on all the selected sensors, whether a moving object appears in the surrounding environment, and further selects other types of sensors or a greater number of sensors if a moving object appears or the frequency of the moving object exceeds a preset frequency.
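The full three-element pipeline of this example could be sketched as follows (every name and value below is an assumption used only for illustration):

# Illustrative sketch of combined scheme 7 (all names hypothetical):
# barrier check -> scene-type selection -> moving-object escalation.
def select_sensors_combined(barrier_sides: set, scene_type: str,
                            moving_freq: float, preset_freq: float = 1.0) -> set:
    sides = {"front", "rear", "left", "right"}
    # Step 1: start from the four cameras, drop barrier-protected sides.
    selected = {f"camera_{s}" for s in sides - barrier_sides}
    # Step 2: add the sensors corresponding to the detected scene type.
    scene_sensors = {"street_roadside": {"ultrasonic_radar"},
                     "private_garage": set()}          # assumed table
    selected |= scene_sensors.get(scene_type, set())
    # Step 3: escalate when a moving object appears too frequently.
    if moving_freq > preset_freq:
        selected |= {"inertial_navigation_system"}
    return selected

# Example: wall on the left, street scene, frequent movement nearby.
print(select_sensors_combined({"left"}, "street_roadside", 2.0))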
It should be understood that the above-mentioned fourth to seventh parts are only examples of the combined embodiments, and other combined embodiments are possible in specific implementation.
S402, the vehicle monitors the surrounding environment based on the at least one sensor.
Specifically, the vehicle monitors, based on the at least one sensor, at least one factor by which the surrounding environment threatens the safety of the vehicle. For example, the MDC in the ECU system of the vehicle controls the selected sensors to collect data of the surrounding environment of the vehicle; after collecting the data, each sensor transmits it to the MDC; after receiving the data collected by each sensor, the MDC analyzes it to obtain the factors by which the surrounding environment threatens the safety of the vehicle.
The following is illustrated by several specific examples:
example 1, taking a camera as an example: the MDC may monitor the surrounding environment for the presence of obstacles, the type of obstacles (e.g., pedestrians, bicycles, electric vehicles, vehicles), the distance between an obstacle and a vehicle, the movement tendency of an obstacle relative to a vehicle (e.g., approaching, moving away, or stationary, etc.), and the like based on the data collected by the camera.
Example 2, taking an ultrasonic radar as an example: the MDC may monitor the surrounding environment for the presence of obstacles, the distance of obstacles from the vehicle, etc. based on the data collected by the ultrasonic radar.
Example 3, taking the inertial navigation system as an example: the MDC may monitor a vibration value of the vehicle, a movement value (or position change value), the duration of vehicle vibration, the duration of vehicle movement, and the like, based on the data collected by the inertial navigation system.
Example 4, taking the global positioning system as an example: the MDC may perform location tracking, vehicle condition monitoring, vehicle track recording, and the like, based on the data collected by the global positioning system.
It should be appreciated that the MDC has the capability of monitoring, based on the at least one sensor, at least one factor by which the surrounding environment threatens the safety of the vehicle; whether the MDC actually obtains a given factor after analyzing the data collected by the at least one sensor depends on whether that factor is present in the surrounding environment. If the factor is present in the surrounding environment, the MDC obtains it by analyzing the data collected by the at least one sensor; if it is not present, the analysis does not yield it.
Further, after the vehicle obtains the factors by which the surrounding environment threatens the safety of the vehicle, the vehicle can determine the threat level of the surrounding environment to the vehicle according to these factors.
The MDC may determine the threat level of the surrounding environment to the vehicle based on one or more of: 1) the types of the factors monitored by the plurality of sensors; 2) the values of the factors monitored by the plurality of sensors; 3) the duration of each factor; 4) changes in the surrounding environment; 5) the number of changes in the surrounding environment; 6) the speed of the vehicle; and the like. It should be understood that some of the above items may be derived from other items; for example, the speed of the vehicle may be obtained by statistical analysis of the two factors "duration of vehicle movement" and "distance of vehicle movement".
The present application does not limit the specific manner in which the threat levels are divided. Two possible manners are listed below. Manner 1: the MDC divides threat levels according to the number of types of factors monitored by the sensors, where fewer types of factors are monitored at a low threat level than at a high threat level. Manner 2: the MDC divides threat levels according to the values of the factors monitored by the sensors, where the value of any given factor is lower at a low threat level than at a high threat level. It should be understood that manners 1 and 2 may be implemented separately or in combination, which is not limited here.
The present application does not limit the total number of threat levels. For example, there may be a single threat level, i.e., "threat present"; or 2 levels, where level 1 is "low threat" and level 2 is "high threat"; or 3 levels, where level 1 is "low threat", level 2 is "high threat", and level 3 is "dangerous". Optionally, "no threat present" may also be treated as its own level, for example level 0 when no threat is present.
In the following, 4 threat levels are taken as an example, where level 0 is "no threat", level 1 is "low threat", level 2 is "high threat", and level 3 is "dangerous":
The factors that the MDC can monitor based on the various sensors include: obstacle detection, obstacle distance value (m), vehicle vibration value (g), vehicle position change (m), duration (ms), number of parking environment changes (N), and vehicle speed (m/s). Here, obstacle detection means that the MDC monitors, based on a sensor, that an obstacle exists around the vehicle body; the obstacle distance value is the distance from an obstacle to the vehicle body monitored by the MDC based on a sensor; the vehicle vibration value is the vibration value of the vehicle monitored by the MDC based on a sensor; the vehicle position change is a change in the position of the vehicle monitored by the MDC based on a sensor (which may result from vehicle theft or severe weather); the duration is the duration of a factor monitored by the MDC, such as the duration of vehicle movement or vibration; the number of parking environment changes is the number of times the parking environment of the vehicle changes, monitored by the MDC based on a sensor; and the vehicle speed is the speed at which the vehicle moves, monitored by the MDC based on a sensor.
Example 1, when an animal or a person merely walks near the vehicle, the factor measured by the MDC is: obstacle detected. The MDC may determine that the surrounding environment poses no threat to the vehicle, with a threat level of 0.
Example 2, when an animal or a person approaches the vehicle, the factors measured by the MDC are: obstacle detected and obstacle distance value, where the obstacle distance value is small (e.g., 0.5 m). The MDC may determine that the surrounding environment poses a low threat to the vehicle, with a threat level of 1.
Example 3, when an animal or a person touches the vehicle, the factors measured by the MDC are: obstacle detected, vehicle vibration value, and obstacle distance value, where the vehicle vibration value is small (e.g., 0.1 g) and the obstacle distance value is small (e.g., 0.01 m). The MDC may determine that the surrounding environment poses a high threat to the vehicle, with a threat level of 2.
Example 4, when an animal or a person attempts to force a door open, the factors measured by the MDC are: obstacle detected, vehicle vibration value, obstacle distance value, and duration, where the vehicle vibration value is large (e.g., 0.5 g), the obstacle distance value is small (e.g., 0.01 m), and the duration is long (e.g., 3 s). The MDC may determine that the surrounding environment is dangerous to the vehicle, with a threat level of 3.
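Read together, examples 1 to 4 and manners 1 and 2 above suggest a grading rule like the following sketch (the function and the threshold are assumptions, not the application's definitive algorithm):

# Illustrative sketch: derive the 0-3 threat level from the monitored
# factors, consistent with examples 1-4 above. All names hypothetical.
def threat_level(factors: dict) -> int:
    # Manner 1: the number of factor types sets the base level
    # (1 factor -> level 0, 2 -> 1, 3 -> 2, 4 or more -> 3).
    level = min(max(len(factors) - 1, 0), 3)
    # Manner 2: a large factor value can raise the level on its own;
    # 0.5 g is the vibration value from example 4, used here as a threshold.
    if factors.get("vibration_g", 0.0) >= 0.5:
        level = 3
    return level

# Example 2 from the text: obstacle detected at 0.5 m -> low threat.
print(threat_level({"obstacle": 1.0, "distance_m": 0.5}))  # 1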
Further, after determining the threat level, the vehicle may execute a response event corresponding to the threat level.
Specifically, the MDC may preset a correspondence between threat levels and response events, for example by storing the correspondence in the memory. After determining the threat level of the surrounding environment to the vehicle, the MDC executes the response event corresponding to that threat level according to the correspondence.
Referring to fig. 8, still taking the 4 threat levels as an example (level 0 is "no threat", level 1 is "low threat", level 2 is "high threat", and level 3 is "dangerous"), the response events corresponding to each level are as follows:
after the vehicle is shut down, the sensors are turned on (for the selection of which sensors to turn on, refer to the specific implementation of S401) and the vehicle enters the monitoring state, i.e., the sensors collect data of the surrounding environment and the MDC analyzes the collected data to monitor whether a threat exists;
1) When the threat level is "no threat", the vehicle may execute no response event, or the response event is that the MDC controls the vehicle to remain in the "monitoring state", i.e., the sensors keep monitoring the surrounding environment of the vehicle;
2) When the threat level is "low threat", the MDC controls the vehicle to enter the "warning state", and the vehicle outputs warning information, such as flashing the vehicle lamps, whistling, or flashing the central control screen;
3) When the threat level is "high threat", the MDC controls the vehicle to enter the "event recording state" and records events occurring in the surrounding environment, for example by storing the video images collected by the camera;
4) When the threat level is "dangerous", the MDC controls the vehicle to enter the "alarm state", and the vehicle sends alarm information to the user equipment associated with the vehicle (such as a mobile phone or a smart watch), for example pushing a message to a mobile phone APP and uploading the video recorded by the camera to the cloud for the user equipment to download.
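A minimal sketch of this level-to-response correspondence (the handler names below are assumptions; in the application the correspondence would be stored in the memory as described above):

# Illustrative sketch: the threat-level to response-event table of fig. 8.
# The vehicle object and its methods are hypothetical stand-ins for the
# controlled elements (lamps, horn, camera storage, user notification).
def keep_monitoring(v): pass                     # level 0: stay in monitoring state
def warn(v): v.flash_lights(); v.honk()          # level 1: warning state
def record(v): v.store_camera_video()            # level 2: event recording state
def alarm(v): v.notify_user_device(); v.upload_video_to_cloud()  # level 3: alarm state

RESPONSE_EVENTS = {0: keep_monitoring, 1: warn, 2: record, 3: alarm}

def respond(vehicle, level: int) -> None:
    RESPONSE_EVENTS[level](vehicle)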
Alternatively, after the "alarm state" has lasted for a period of time, such as the 30 s shown in fig. 8, the MDC may control the vehicle to return to the initial "monitoring state", i.e., stop sending alarm information to the user equipment associated with the vehicle and continue to use the sensors to collect data of the surrounding environment. In this way, power consumption can be saved.
Optionally, the vehicle may maintain the "monitoring state" throughout the execution of the response event corresponding to any threat level, i.e., the sensors keep monitoring the surrounding environment of the vehicle at all times, so that the threat level can be updated in real time.
It should be understood that the threat level of the surrounding environment to the vehicle, as monitored by the MDC, may traverse the levels in order from low to high, i.e., switch from "no threat" to "low threat", then from "low threat" to "high threat", and then to "dangerous"; alternatively, it may jump directly to a higher level, such as directly to "high threat" or "dangerous", which is not limited by the present application.
Optionally, the response event corresponding to a high threat level may include the response events corresponding to the lower threat levels, to further improve the responsiveness of the vehicle. For example, when the threat level is "high threat", the MDC controls the vehicle to enter the "event recording state", and the vehicle records events occurring in the surrounding environment while also outputting warning information. When the threat level is "dangerous", the MDC controls the vehicle to enter the "alarm state", and the vehicle outputs warning information, records events occurring in the surrounding environment, and simultaneously sends alarm information to the user equipment associated with the vehicle.
Optionally, that the MDC executes the response event corresponding to the threat level may specifically be: the MDC sends a control instruction to the ECU corresponding to the controlled element, so that this ECU drives the controlled element to execute the corresponding response event.
For example, fig. 9 shows an example in which each ECU is controlled to drive its corresponding controlled element to execute the response event for each threat level.
1) In the flameout state, the vehicle automatically enters the monitoring state: the MDC selects the ultrasonic radar and the camera, and monitors the environment around the vehicle based on the data they collect.
2) When the MDC detects, from the data collected by the ultrasonic radar and the camera, that an object is approaching the vehicle, the MDC determines that the surrounding environment poses a low threat to the vehicle and automatically switches to the warning state: the MDC wakes up the BCM, which flashes the vehicle lamps and whistles according to the MDC's instruction; the MDC simultaneously wakes up the CDC, which flashes the central control screen according to the MDC's instruction to warn the approaching object that the camera is recording and monitoring.
3) When the MDC detects, from the data collected by the ultrasonic radar and the camera, that an object is touching the vehicle, the MDC determines that the surrounding environment poses a high threat to the vehicle and automatically switches to the event recording state: the MDC additionally selects the inertial navigation system (i.e., the inertial navigation system, the camera, and the ultrasonic radar monitor simultaneously), wakes up the CDC to keep the central control screen flashing, records video with the camera and stores it to the CDC, and also supports storing the video to an external USB flash disk so that it can be imported into a personal computer (PC) for the user to view.
4) When the MDC detects a more serious threat from the data collected by the ultrasonic radar, the camera, the inertial navigation system, and the like (for example, the BCM is triggered by unauthorized entry into the vehicle or abnormal tire pressure, or the INS is triggered by a collision, door prying, window smashing, abnormal vibration, or movement of the vehicle), the MDC determines that the surrounding environment is "dangerous" to the vehicle and automatically switches to the alarm state: the MDC wakes up the CDC; the CDC increases the display brightness of the central control screen, turns the speaker volume to maximum to support a shouted warning, uploads the previously recorded video to the cloud through the TBOX, and pushes an SMS or APP reminder to the user's mobile phone, which can then download the video from the cloud.
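A hedged sketch of this MDC-to-ECU dispatch (the class and method names are assumptions; the real interaction runs over the in-vehicle bus):

# Illustrative sketch: the MDC wakes the relevant ECUs and issues the
# instructions of fig. 9. All class and method names are hypothetical.
class MDC:
    def __init__(self, bcm, cdc):
        self.bcm, self.cdc = bcm, cdc

    def enter_warning_state(self):        # low threat (level 1)
        self.bcm.wake(); self.bcm.flash_lights(); self.bcm.honk()
        self.cdc.wake(); self.cdc.flash_screen()

    def enter_recording_state(self):      # high threat (level 2)
        self.cdc.wake(); self.cdc.store_camera_video()

    def enter_alarm_state(self):          # dangerous (level 3)
        self.cdc.wake(); self.cdc.raise_screen_brightness()
        self.cdc.set_speaker_volume_max()
        self.cdc.upload_video_via_tbox(); self.cdc.push_reminder_to_phone()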
Based on the above, the vehicle in the embodiment of the application can monitor the threat factors present in the surrounding environment with different combinations of sensors, according to at least one of the scene type of the surrounding environment, the moving objects in the surrounding environment, and the barrier condition of the surrounding environment. This optimizes the traditional monitoring mechanism: multiple types of events can be perceived and identified, the monitoring precision can be improved, wear on the sensors is reduced, and the service life of the sensors is extended. In addition, the vehicle in the embodiment of the application can identify the threat level from the monitored factors and execute the response event corresponding to that level, thereby eliminating threats in time and improving the safety of the vehicle in the flameout state.
Based on the same technical concept, the embodiment of the present application further provides a vehicle monitoring device 1000. The device 1000 has functions implementing the method steps shown in fig. 4 to 9; for example, the device 1000 includes functions, modules, units, or means for executing the method steps shown in fig. 4 to 9, which may be implemented by software, or by hardware executing corresponding software.
For example, referring to fig. 10, the apparatus 1000 may comprise:
a processing unit 1001 configured to select at least one sensor from a plurality of sensors mounted on a vehicle, according to at least one of a scene type of a surrounding environment of the vehicle, a moving object in the surrounding environment, and a barrier situation of the surrounding environment;
a monitoring unit 1002 for monitoring the surroundings of the vehicle based on at least one sensor.
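A minimal sketch of the device 1000 (the unit names follow fig. 10; the internals are assumptions):

# Illustrative sketch of device 1000 with the two units of fig. 10.
# The selection predicate and collect() method are hypothetical.
class VehicleMonitoringDevice:
    def __init__(self, sensors):
        self.sensors = sensors

    def processing_unit(self, scene_type, moving_object, barrier):
        """Unit 1001: select at least one sensor from the mounted sensors."""
        return [s for s in self.sensors
                if s.suits(scene_type, moving_object, barrier)]

    def monitoring_unit(self, selected):
        """Unit 1002: monitor the vehicle's surroundings with the selection."""
        for s in selected:
            s.collect()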
The specific implementation manner of the method steps executed by each unit may refer to the specific implementation manner when the vehicle executes the corresponding method steps in the embodiments shown in fig. 4 to fig. 9, and is not described herein again.
Based on the same technical concept, the embodiment of the application further provides the vehicle-mounted device 1100. Referring to fig. 11, the onboard apparatus includes at least one processor 1101 for performing the method steps shown in fig. 4-9.
Optionally, the vehicle-mounted device 1100 may further include a memory 1102, and the memory 1102 is represented by a dashed box in fig. 11 and is optional for the vehicle-mounted device 1100.
Optionally, the memory 1102 and the processor 1101 are communicatively connected by a bus, which is represented by a thick black line in fig. 11.
It should be understood that the processors mentioned in the embodiments of the present application may be implemented by hardware or may be implemented by software. When implemented in hardware, the processor may be a logic circuit, an integrated circuit, or the like. When implemented in software, the processor may be a general-purpose processor implemented by reading software code stored in a memory.
The processor may be, for example, a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
It will be appreciated that the memory referred to in the embodiments of the application may be volatile memory or non-volatile memory, or may include both volatile and non-volatile memory. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be a Random Access Memory (RAM), which serves as an external cache. By way of example, and not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and Direct Rambus RAM (DR RAM).
It should be noted that when the processor is a general-purpose processor, a DSP, an ASIC, an FPGA or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, the memory (memory module) may be integrated into the processor.
It should be noted that the memory described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
Based on the same technical concept, embodiments of the present application also provide a computer-readable storage medium for storing instructions which, when executed, cause the methods shown in fig. 4 to 9 to be implemented.
Based on the same technical concept, the embodiment of the present application further provides a computer program product, in which instructions are stored, and when the computer program product runs on a computer, the computer is caused to execute the methods shown in fig. 4 to 9.
It is to be understood that the above embodiments may be combined with each other.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (22)

1. A vehicle monitoring method for use with a vehicle in a flameout condition, the method comprising:
selecting at least one sensor from a plurality of sensors mounted on the vehicle according to at least one of a scene type of a surrounding environment of the vehicle, a moving object in the surrounding environment, and a barrier situation of the surrounding environment;
the surroundings of the vehicle are monitored on the basis of the at least one sensor.
2. The method of claim 1, wherein before selecting at least one sensor from a plurality of sensors mounted on the vehicle based on at least one of a scene type of a surrounding environment of the vehicle, a moving object in the surrounding environment, a barrier condition of the surrounding environment, the method further comprises:
identifying a scene type of the surrounding environment;
selecting at least one sensor from a plurality of sensors mounted on the vehicle according to at least one of a scene type of a surrounding environment of the vehicle, a moving object in the surrounding environment, and a barrier condition of the surrounding environment, including:
and selecting a sensor corresponding to the scene type of the surrounding environment from a plurality of sensors installed on the vehicle according to the corresponding relation between the scene type and the sensor.
3. The method of claim 1 or 2, wherein before selecting at least one sensor from a plurality of sensors mounted on the vehicle according to at least one of a scene type of a surrounding environment of the vehicle, a moving object in the surrounding environment, a barrier condition of the surrounding environment, the method further comprises:
shooting an image of the surroundings of the vehicle using a camera mounted on the vehicle;
selecting at least one sensor from a plurality of sensors mounted on the vehicle according to at least one of a scene type of a surrounding environment of the vehicle, a moving object in the surrounding environment, and a barrier condition of the surrounding environment, including:
judging that a moving object appears in the surrounding environment according to the image shot by the camera;
the camera and other types of sensors other than the camera are selected as the at least one sensor from a plurality of types of sensors mounted on the vehicle.
4. The method of claim 3, wherein prior to selecting the camera and other types of sensors other than the camera as the at least one sensor from a plurality of types of sensors mounted on the vehicle, the method further comprises:
determining that the moving object satisfies any one or more of: the moving object moves towards the direction close to the vehicle, the appearance frequency of the moving object exceeds a preset frequency, the appearance duration of the moving object exceeds a preset duration, or the moving object is within a preset range of the vehicle.
5. The method according to any one of claims 1-4, further comprising:
obtaining the barrier condition, the barrier condition comprising a presence of a barrier on a first area, the first area being in a first orientation of the vehicle and a distance from the vehicle being less than a first threshold;
the at least one sensor does not include a sensor for monitoring in the first orientation.
6. The method of claim 5, wherein the barrier comprises a wall or other vehicle.
7. The method of any one of claims 1-6, wherein monitoring the surrounding environment of the vehicle based on the at least one sensor comprises:
monitoring at least one factor threatening the security of the vehicle from the surrounding environment based on the at least one sensor;
determining a threat level of the surrounding environment to the vehicle based on the at least one factor;
executing a response event corresponding to the threat level.
8. The method of claim 7, wherein determining the threat level of the ambient environment to the vehicle based on the at least one factor comprises:
determining a threat level of the ambient environment to the vehicle based on at least one of a type of the at least one factor, a value of the at least one factor, a duration of the at least one factor, a number of changes in the ambient environment, or a speed of the vehicle.
9. The method according to claim 7 or 8, wherein the threat levels comprise a first level, a second level and a third level in order from low to high;
the response event corresponding to the first level includes any one or more of: the vehicle lamp flickers, whistles or the central control screen flickers;
the response events corresponding to the second level include: responding to the event corresponding to the first level, recording a video by adopting a camera and storing the video;
response events corresponding to the third level include: and sending a prompt to user equipment according to a response event corresponding to the second level, and uploading the video to a cloud so as to support the user equipment to download.
10. A vehicle monitoring device for use with a vehicle in an off state, the device comprising:
a processing unit for selecting at least one sensor from a plurality of sensors mounted on the vehicle according to at least one of a scene type of a surrounding environment of the vehicle, a moving object in the surrounding environment, and a barrier situation of the surrounding environment;
a monitoring unit for monitoring the surroundings of the vehicle on the basis of the at least one sensor.
11. The apparatus of claim 10, wherein the processing unit is configured to:
identifying a scene type of the ambient environment;
and selecting a sensor corresponding to the scene type of the surrounding environment from a plurality of sensors installed on the vehicle according to the corresponding relation between the scene type and the sensor.
12. The apparatus according to claim 10 or 11, wherein the processing unit is configured to:
capturing an image of a surrounding environment of the vehicle using a camera mounted on the vehicle;
judging that a moving object appears in the surrounding environment according to the image shot by the camera;
the camera and other types of sensors than the camera are selected as the at least one sensor from a plurality of types of sensors mounted on the vehicle.
13. The apparatus of claim 12, wherein the processing unit is further configured to:
determining that the moving object satisfies any one or more of the following, before selecting the camera and another type of sensor other than the camera as the at least one sensor from among a plurality of types of sensors mounted on the vehicle: the moving object moves towards the direction close to the vehicle, the appearance frequency of the moving object exceeds a preset frequency, the appearance duration of the moving object exceeds a preset duration, or the moving object is within a preset range of the vehicle.
14. The apparatus according to any of claims 10-13, wherein the processing unit is further configured to:
obtaining the barrier condition, the barrier condition comprising a presence of a barrier on a first area, the first area being in a first orientation of the vehicle and a distance from the vehicle being less than a first threshold;
the at least one sensor does not include a sensor for monitoring in the first orientation.
15. The apparatus of claim 14, wherein the barrier comprises a wall or other vehicle.
16. The device according to any one of claims 10 to 15, wherein the monitoring unit is specifically configured to:
monitoring at least one factor threatening the security of the vehicle from the surrounding environment based on the at least one sensor;
determining a threat level of the surrounding environment to the vehicle based on the at least one factor;
executing a response event corresponding to the threat level.
17. The apparatus according to claim 16, wherein the monitoring unit, when determining the threat level of the surrounding environment to the vehicle based on the at least one factor, is specifically configured to:
determining a threat level of the ambient environment to the vehicle based on at least one of a type of the at least one factor, a value of the at least one factor, a duration of the at least one factor, a number of changes in the ambient environment, or a speed of the vehicle.
18. The apparatus according to claim 16 or 17, wherein the threat level comprises a first level, a second level and a third level in order from low to high;
the response event corresponding to the first level includes any one or more of: the vehicle lamp flickers, whistles or the central control screen flickers;
the response events corresponding to the second level include: responding to the event corresponding to the first level, recording a video by adopting a camera and storing the video;
response events corresponding to the third level include: and sending a prompt to user equipment according to the response event corresponding to the second level, and uploading the video to a cloud to support the user equipment to download.
19. A vehicle monitoring apparatus, for use with a vehicle in a flameout state, the apparatus comprising a memory storing computer program instructions and a processor executing the computer program instructions to perform operations according to any one of claims 1 to 9.
20. A vehicle, characterized by comprising:
a plurality of sensors; and
a vehicle monitoring apparatus as claimed in any of claims 10 to 19.
21. A computer-readable storage medium for storing instructions that, when executed, cause the method of any one of claims 1-9 to be implemented.
22. A computer program product having stored therein instructions which, when run on a processor, cause the method according to any one of claims 1-9 to be implemented.
CN202110333138.3A 2021-03-29 2021-03-29 Vehicle monitoring method and device and vehicle Pending CN115214631A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110333138.3A CN115214631A (en) 2021-03-29 2021-03-29 Vehicle monitoring method and device and vehicle
PCT/CN2022/080204 WO2022206336A1 (en) 2021-03-29 2022-03-10 Vehicle monitoring method and apparatus, and vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110333138.3A CN115214631A (en) 2021-03-29 2021-03-29 Vehicle monitoring method and device and vehicle

Publications (1)

Publication Number Publication Date
CN115214631A true CN115214631A (en) 2022-10-21

Family

ID=83456854

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110333138.3A Pending CN115214631A (en) 2021-03-29 2021-03-29 Vehicle monitoring method and device and vehicle

Country Status (2)

Country Link
CN (1) CN115214631A (en)
WO (1) WO2022206336A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024131934A1 (en) * 2022-12-23 2024-06-27 Nio Technology (Anhui) Co., Ltd Functional safety for an electrical vehicle in stationary mode

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11724641B2 (en) * 2021-01-26 2023-08-15 Ford Global Technologies, Llc Hazard condition warning for package delivery operation
CN115713842B (en) * 2022-10-10 2024-09-13 重庆长安新能源汽车科技有限公司 Active risk avoiding method and system during parking of vehicle, vehicle and storage medium
CN116279454B (en) * 2023-01-16 2023-12-19 禾多科技(北京)有限公司 Vehicle body device control method, device, electronic apparatus, and computer-readable medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10159006B4 (en) * 2001-11-30 2006-09-14 Robert Bosch Gmbh Device for monitoring
JP4149879B2 (en) * 2003-08-28 2008-09-17 株式会社日本自動車部品総合研究所 Drive control device for in-vehicle equipment
JP3900357B2 (en) * 2003-12-02 2007-04-04 三菱電機株式会社 Vehicle perimeter monitoring system
CN107323377A (en) * 2017-05-08 2017-11-07 苏州统购信息科技有限公司 A kind of vehicle-mounted early warning system and method for early warning
CN109204232B (en) * 2017-06-29 2020-11-13 宝沃汽车(中国)有限公司 Vehicle periphery abnormity monitoring method and device and vehicle
DE102018128888A1 (en) * 2018-08-02 2020-02-06 Trw Automotive Gmbh Monitoring system for a vehicle

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024131934A1 (en) * 2022-12-23 2024-06-27 Nio Technology (Anhui) Co., Ltd Functional safety for an electrical vehicle in stationary mode

Also Published As

Publication number Publication date
WO2022206336A1 (en) 2022-10-06

Similar Documents

Publication Publication Date Title
WO2022206336A1 (en) Vehicle monitoring method and apparatus, and vehicle
US9352683B2 (en) Traffic density sensitivity selector
US10997430B1 (en) Dangerous driver detection and response system
US10421436B2 (en) Systems and methods for surveillance of a vehicle using camera images
CN106004784B (en) Load reduction for vehicle ignition switch off via off-board sensors
US20220089181A1 (en) Systems and Methods for Remote Monitoring of a Vehicle, Robot or Drone
US10061013B2 (en) Mobile gunshot detection
CN113160508B (en) Car as a house night security system and method based on artificial intelligence
US20180357484A1 (en) Video processing device and video processing method
CN203854604U (en) Intelligent blind spot monitoring device for automobile
CN107826069A (en) geocode information auxiliary vehicle warning
US9165454B2 (en) Security system, program product therefor, and surveillance method
KR100364121B1 (en) A multimedia system for protection against auto accidents and theft
KR20060008967A (en) Event detection system
SE541541C2 (en) Method and system for theft detection in a vehicle
CN111252066A (en) Emergency braking control method and device, vehicle and storage medium
CN113056775A (en) Traffic monitoring and evidence collection system
CN115703431A (en) System and method for vehicle safety monitoring
CN103448670A (en) Vehicle monitoring system
CN113808418A (en) Road condition information display system, method, vehicle, computer device and storage medium
CN115427268A (en) Artificial intelligence enabled alerts for detecting passengers locked in a vehicle
CN111098864B (en) Prompt method, device, automatic driving vehicle and storage medium
CN203465850U (en) Artificial intelligence driving safety warning system
JP3900357B2 (en) Vehicle perimeter monitoring system
US20110055133A1 (en) Systems and Methods for Analyzing Communication Options

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20241030

Address after: 518129 Huawei Headquarters Office Building 101, Wankecheng Community, Bantian Street, Longgang District, Shenzhen, Guangdong

Applicant after: Shenzhen Yinwang Intelligent Technology Co.,Ltd.

Country or region after: China

Address before: 518129 Bantian HUAWEI headquarters office building, Longgang District, Guangdong, Shenzhen

Applicant before: HUAWEI TECHNOLOGIES Co.,Ltd.

Country or region before: China