WO2021036363A1 - Occupant protection method and device - Google Patents

Occupant protection method and device

Info

Publication number
WO2021036363A1
WO2021036363A1 PCT/CN2020/091724 CN2020091724W
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
car
information
occupant
occupants
Prior art date
Application number
PCT/CN2020/091724
Other languages
English (en)
French (fr)
Inventor
席志鹏
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司
Priority to EP20859614.8A priority Critical patent/EP4019343A4/en
Publication of WO2021036363A1 publication Critical patent/WO2021036363A1/zh
Priority to US17/680,885 priority patent/US20220305988A1/en

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/01 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
    • G08B25/016 Personal emergency signalling and security systems
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q9/00 Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00 Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R21/01 Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
    • B60R21/015 Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting the presence or position of passengers, passenger seats or child seats, and the related safety parameters therefor, e.g. speed or timing of airbag inflation in relation to occupant position or seat belt use
    • B60R21/01512 Passenger detection systems
    • B60R21/0153 Passenger detection systems using field detection presence sensors
    • B60R21/01538 Passenger detection systems using field detection presence sensors for image processing, e.g. cameras or sensor arrays
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B29/00 Checking or monitoring of signalling or alarm systems; Prevention or correction of operating errors, e.g. preventing unauthorised operation
    • G08B29/18 Prevention or correction of operating errors
    • G08B29/185 Signal analysis techniques for reducing or preventing false alarms or for enhancing the reliability of the system
    • G08B29/188 Data fusion; cooperative systems, e.g. voting among different detectors
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/26 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • B60Q1/50 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
    • B60Q1/544 Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking for indicating other states or conditions of the vehicle occupants, e.g. for indicating disabled occupants
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q5/00 Arrangement or adaptation of acoustic signal devices
    • B60Q5/005 Arrangement or adaptation of acoustic signal devices automatically actuated
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • B60R16/023 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for transmission of signals between vehicle parts or subsystems
    • B60R16/0231 Circuits relating to the driving or the functioning of the vehicle
    • B60R16/0232 Circuits relating to the driving or the functioning of the vehicle for measuring vehicle parameters and indicating critical, abnormal or dangerous conditions
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00 Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R2021/0027 Post collision measures, e.g. notifying emergency services
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/18 Status alarms
    • G08B21/22 Status alarms responsive to presence or absence of persons
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/18 Status alarms
    • G08B21/24 Reminder alarms, e.g. anti-loss alarms
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/006 Alarm destination chosen according to type of event, e.g. in case of fire phone the fire service, in case of medical emergency phone the ambulance

Definitions

  • This application relates to the field of communication technology, and in particular to an occupant protection method and device.
  • the present application provides an occupant protection method and device to reduce or avoid the impact of abnormal conditions in the vehicle on the occupants and protect the safety of the occupants in the vehicle.
  • the embodiments of the present application provide an occupant protection method, which is applied to a vehicle, to a device with a function of controlling the vehicle (such as a cloud server or a mobile phone terminal), or to a chip system in a vehicle.
  • the method includes: obtaining information of occupants in the vehicle and information of the environment in the vehicle.
  • the occupant information in the car includes one or more of the behavior information of the occupants in the car and the sound information of the occupants in the car.
  • the in-vehicle environment information includes one or more of in-vehicle environment image information, in-vehicle environment sound information, in-vehicle air quality information, and in-vehicle temperature information. According to the information of the occupants in the vehicle and the environment information in the vehicle, the abnormal state type and the degree of abnormality are determined.
  • the abnormal state types include abnormal occupants in the car and/or abnormal environment in the car.
  • the abnormal occupant in the car includes one or more of abnormal behavior of the occupant in the car and abnormal sound of the occupant in the car.
  • the abnormal in-vehicle environment includes one or more of an abnormal in-vehicle environment image, abnormal in-vehicle environment sound, abnormal in-vehicle air quality, and abnormal in-vehicle temperature.
  • according to the abnormal state type and the degree of abnormality, an emergency measure is determined, where the emergency measure is an operation to reduce the degree of abnormality.
  • with the above occupant protection method, the information of the occupants in the vehicle and the environment information in the vehicle are obtained, the abnormal state type and the degree of abnormality are determined from that information, and emergency measures are then determined according to the abnormal state type and the degree of abnormality.
  • the corresponding emergency measures are taken to reduce or avoid the impact of abnormal conditions in the vehicle on the occupants, thereby protecting the safety of the occupants in the vehicle.
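  • the three-step flow just described (acquire in-vehicle information, determine the abnormal state type and degree, then determine emergency measures) can be sketched as follows. The patent contains no code, so every field name, threshold, and label below is an assumption, and the degree of abnormality is simplified to two tiers for brevity:

```python
from dataclasses import dataclass

# Illustrative sketch only: all names and thresholds are invented for this example.

@dataclass
class CabinState:
    posture: str             # e.g. "upright", "curled_up", "lying_down"
    expression: str          # e.g. "normal", "uncomfortable"
    temperature_c: float     # in-vehicle temperature reading
    air_quality_index: float # in-vehicle air quality reading

TEMP_LIMIT_C = 40.0   # assumed preset temperature threshold
AQI_LIMIT = 150.0     # assumed preset air quality threshold

def classify(state: CabinState) -> list:
    """Determine abnormal state types from occupant and environment info."""
    types = []
    if state.posture in ("curled_up", "lying_down") and state.expression == "uncomfortable":
        types.append("occupant_behavior")
    if state.temperature_c > TEMP_LIMIT_C:
        types.append("cabin_temperature")
    if state.air_quality_index > AQI_LIMIT:
        types.append("cabin_air_quality")
    return types

def grade(types: list) -> str:
    """More simultaneous abnormal types imply a higher degree of abnormality."""
    if not types:
        return "none"
    return "low" if len(types) == 1 else "high"

def choose_measures(types: list, degree: str) -> list:
    """Emergency measures are operations intended to reduce the degree of abnormality."""
    measures = []
    if degree != "none":
        measures.append("notify_first_emergency_contact")
    if degree == "high":
        measures.append("notify_second_emergency_contact")
    if "cabin_temperature" in types:
        measures.append("open_windows")
    if "cabin_air_quality" in types:
        measures.append("start_air_purifier")
    return measures
```

  • here a curled-up, uncomfortable occupant in an overheated cabin yields two abnormal state types, a high degree, and measures that combine emergency communication with emergency control.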
  • acquiring the information of the occupants and the environment in the vehicle includes: acquiring first data and second data monitored by sensors, where the sensors include one or more of seat pressure sensors, cameras, sound sensors, air quality sensors, and temperature sensors.
  • according to the first data, the occupant information in the vehicle is determined, where the first data includes at least one of seat pressure data in the vehicle, image data of the occupants in the vehicle, and voice data of the occupants in the vehicle.
  • according to the second data, the in-vehicle environment information is determined.
  • the second data includes at least one of image data of the environment in the vehicle, environmental sound data in the vehicle, air quality data in the vehicle, and temperature data in the vehicle.
  • since many sensors of multiple types are involved, the obtained first data and second data are richer, and the in-vehicle occupant information and in-vehicle environment information are determined with higher accuracy.
  • the occupant protection mode is triggered by the wake-up signal.
  • the wake-up signal includes at least one of in-vehicle seat pressure data that exceeds a preset pressure threshold and lasts longer than a preset duration, and voice data of in-vehicle occupants that exceed a preset decibel threshold.
  • in this way, the occupants in the vehicle can be protected more systematically; through the above process, the safety of the occupants in the vehicle can be better guaranteed and unnecessary casualties can be reduced.
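  • the wake-up condition described above (seat pressure above a preset threshold for longer than a preset duration, or occupant voice above a preset decibel threshold) might be checked as in this sketch; the threshold values and the timestamped sample format are assumptions:

```python
# Hedged sketch of the wake-up check; all values are invented for illustration.

PRESSURE_THRESHOLD = 20.0     # assumed seat pressure threshold (arbitrary units)
MIN_PRESSURE_SECONDS = 30.0   # assumed minimum duration above the threshold
VOICE_DB_THRESHOLD = 70.0     # assumed occupant voice threshold in dB

def should_wake(pressure_samples, voice_db):
    """pressure_samples: time-ordered list of (timestamp_s, pressure) pairs."""
    run_start = None
    for t, p in pressure_samples:
        if p > PRESSURE_THRESHOLD:
            if run_start is None:
                run_start = t             # start of a run above the threshold
            if t - run_start >= MIN_PRESSURE_SECONDS:
                return True               # pressure condition satisfied
        else:
            run_start = None              # run interrupted; reset
    return voice_db > VOICE_DB_THRESHOLD  # fall back to the voice condition
```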
  • in the occupant protection mode, the vehicle driving system operates at a preset low frequency, determines the occupant information in the vehicle according to the first data, and determines the environment information in the vehicle according to the second data. And/or, at least one of the driving control function and the entertainment function is turned off.
  • the embodiment of the present application can realize low power consumption operation of the vehicle driving system in the occupant protection mode, and save resources.
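  • as a toy illustration of the low-power idea, the monitoring loop's polling interval could simply be derived from a reduced sampling rate in the occupant protection mode; both rates below are invented for illustration:

```python
# Toy sketch of low-power operation: sensors are polled at a much lower rate in
# occupant protection mode than during normal driving. Both rates are assumptions.

NORMAL_HZ = 30.0           # assumed normal sensing rate
PROTECTION_MODE_HZ = 1.0   # assumed reduced rate in occupant protection mode

def poll_interval_s(protection_mode: bool) -> float:
    """Seconds to wait between sensor reads in the monitoring loop."""
    return 1.0 / (PROTECTION_MODE_HZ if protection_mode else NORMAL_HZ)
```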
  • the behavior information of the occupants in the car includes the location of the occupants in the car, the posture of the occupants in the car, and the facial expressions of the occupants in the car.
  • the position of the occupants in the car includes the front seat and the rear seat.
  • the postures of the occupants in the car include sitting upright, curled up, and lying down.
  • the facial expressions of the occupants in the car include normal, happy, sad, angry, anxious, or uncomfortable.
  • the voice information of the occupants in the car includes the volume of the occupants in the car, the voiceprint of the occupants in the car, and the semantic information of the voice of the occupants in the car.
  • the semantic information of the voice of the occupants in the car includes asking for help, singing, and making phone calls.
  • the image information of the environment in the car includes normal, fire, car accident and so on.
  • the abnormal state type is determined as follows: if the posture of an occupant in the vehicle is curled up or lying down and the facial expression of the occupant is uncomfortable, it is determined that the abnormal state type is abnormal occupant behavior. If the volume of an occupant in the vehicle exceeds the first preset decibel threshold corresponding to that occupant's voiceprint, and/or the semantic information of the occupant's voice includes distress information, it is determined that the abnormal state type is abnormal occupant sound.
  • if the image information of the in-vehicle environment shows a fire or a car accident, it is determined that the abnormal state type is an abnormal in-vehicle environment image. If the in-vehicle environmental sound exceeds the second preset decibel threshold, it is determined that the abnormal state type is abnormal in-vehicle environmental sound. If the air quality in the vehicle exceeds the preset air quality threshold, it is determined that the abnormal state type is abnormal in-vehicle air quality. If the temperature inside the vehicle exceeds the preset temperature threshold, it is determined that the abnormal state type is abnormal in-vehicle temperature.
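  • one detail worth noting is that the volume check uses a decibel threshold that corresponds to the occupant's voiceprint, so the limit can differ per occupant (an infant's cry and an adult's shout are judged differently). A minimal sketch; the voiceprint labels, thresholds, and distress keywords are all assumptions:

```python
# Per-voiceprint volume thresholds and a simple distress-keyword check; all
# labels and values are invented for this example.

VOLUME_THRESHOLDS_DB = {"adult": 75.0, "child": 70.0, "infant": 65.0}
DISTRESS_KEYWORDS = ("help", "save me")

def occupant_sound_abnormal(voiceprint: str, volume_db: float, transcript: str) -> bool:
    over_threshold = volume_db > VOLUME_THRESHOLDS_DB.get(voiceprint, 75.0)
    distress = any(kw in transcript.lower() for kw in DISTRESS_KEYWORDS)
    return over_threshold or distress
```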
  • the degree of abnormality is determined according to the occupant information and the environment information in the vehicle, which specifically includes: fusing the occupant information and the environment information from the same time period to obtain fusion information describing the current in-vehicle scene, and analyzing the fusion information to determine the degree of abnormality, which indicates the impact of the current in-vehicle scene on the occupants.
  • the fusion information is analyzed to determine the degree of abnormality, which specifically includes: if the current scene in the car described by the fusion information is a baby crying, the degree of abnormality is low. If the current scene in the car described by the fusion information is a baby crying, and the temperature in the car is abnormal, the degree of abnormality is relatively high. If the current scene in the car described by the fusion information is a baby crying and there is a fire in the car, the degree of abnormality is high.
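  • the graded example above (a crying baby alone is a low degree; crying plus abnormal temperature is relatively high; crying plus a fire is high) maps naturally onto a small rule function over the fused scene. Representing the fused occupant/environment information as a set of event labels is an assumption:

```python
# Rule-based sketch of the degree grading in the example above; the event labels
# and the set representation of the fused scene are assumptions.

def degree_from_scene(events: set) -> str:
    if "fire" in events:
        return "high"
    if "baby_crying" in events and "temperature_abnormal" in events:
        return "relatively_high"
    if "baby_crying" in events:
        return "low"
    return "none"
```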
  • emergency measures include emergency communications and/or emergency control measures.
  • the communication content of the emergency communication includes one or more of vehicle location information, vehicle appearance information, license plate number information, in-vehicle status information, in-vehicle image data, in-vehicle sound information, status information of the sensor with abnormal data, and the emergency control measures.
  • the communication method of emergency communication is one or more of SMS, MMS, and voice call.
  • the emergency control measures include one or more of giving the occupants a voice reminder, opening the windows, opening the doors, turning on the in-vehicle air purification equipment, turning on the temperature adjustment equipment, turning on the fire extinguishing device, unlocking/opening the doors, honking the horn, and flashing the lights.
  • the emergency measures are determined as follows: if the degree of abnormality is low, the emergency measure is emergency communication, and the contact for the emergency communication is the first preset emergency contact. If the degree of abnormality is relatively high, the emergency measures are emergency communication and emergency control measures, the contact for the emergency communication is the first preset emergency contact, and the emergency control measures are determined according to the abnormal state type. If the degree of abnormality is high, the emergency measures are emergency communication and emergency control measures, the contacts for the emergency communication include the first preset emergency contact and a second preset emergency contact, and the emergency control measures are determined according to the abnormal state type.
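  • the escalation ladder above can be expressed as a small dispatch on the degree of abnormality; the contact names, degree labels, and measure lists are illustrative assumptions:

```python
# Sketch of the escalation logic: a low degree triggers emergency communication
# with the first preset contact only; a relatively high degree adds emergency
# control measures; a high degree also notifies the second preset contact.

def plan_measures(degree: str, controls_for_type: list):
    """Return (contacts_to_notify, control_measures) for a graded abnormality."""
    if degree == "low":
        return ["first_preset_contact"], []
    if degree == "relatively_high":
        return ["first_preset_contact"], controls_for_type
    if degree == "high":
        return ["first_preset_contact", "second_preset_contact"], controls_for_type
    return [], []
```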
  • the emergency control measures are determined according to the abnormal state type, specifically including: if the abnormal state type is abnormal in-vehicle air quality, the emergency control measures are at least one of opening the windows, opening the doors, and turning on the in-vehicle air purification equipment. If the abnormal state type is abnormal in-vehicle temperature, the emergency control measures are at least one of opening the windows, opening the doors, and turning on the temperature adjustment device. If the abnormal state type is abnormal occupant behavior and/or abnormal occupant sound, the emergency control measures are at least one of giving the occupants a voice reminder, opening the windows, opening the doors, turning on the temperature adjustment device, turning on the fire extinguishing device, unlocking/opening the doors, honking the horn, and flashing the lights.
  • if the abnormal state type is an abnormal in-vehicle environment image and/or abnormal in-vehicle environment sound, the emergency control measures are at least one of giving the occupants a voice reminder, opening the windows, opening the doors, turning on the temperature adjustment device, turning on the fire extinguishing device, unlocking/opening the doors, honking the horn, and flashing the lights.
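  • the per-type choices above amount to a lookup table from abnormal state type to candidate emergency control measures. A sketch with assumed keys and measure names, deduplicating measures when several abnormal types are present at once:

```python
# Lookup-table sketch of the mapping from abnormal state type to candidate
# emergency control measures; all keys and measure names are assumptions.

CONTROLS_BY_TYPE = {
    "air_quality": ["open_windows", "open_doors", "start_air_purifier"],
    "temperature": ["open_windows", "open_doors", "start_climate_control"],
    "occupant_behavior_or_sound": [
        "voice_reminder", "open_windows", "open_doors", "start_climate_control",
        "start_fire_extinguisher", "unlock_doors", "sound_horn", "flash_lights",
    ],
}

def controls_for(abnormal_types: list) -> list:
    """Collect measures for every present type, keeping each measure once."""
    seen, out = set(), []
    for t in abnormal_types:
        for m in CONTROLS_BY_TYPE.get(t, []):
            if m not in seen:
                seen.add(m)
                out.append(m)
    return out
```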
  • the embodiments of the present application provide an occupant protection device, which is applied to a vehicle, to other equipment with the function of controlling the vehicle (such as a cloud server or a mobile phone terminal), or to a chip system in the vehicle.
  • the device includes an acquisition unit and a processing unit.
  • the acquisition unit is used to acquire information about the occupants in the vehicle and the environment information in the vehicle.
  • the occupant information in the car includes one or more of the behavior information of the occupants in the car and the sound information of the occupants in the car.
  • the in-vehicle environment information includes one or more of in-vehicle environment image information, in-vehicle environment sound information, in-vehicle air quality information, and in-vehicle temperature information.
  • the processing unit is used to determine the type of abnormal state and the degree of abnormality according to the information of the occupants in the vehicle and the information of the environment in the vehicle.
  • the abnormal state types include abnormal occupants in the car and/or abnormal environment in the car.
  • the abnormal occupant in the car includes one or more of abnormal behavior of the occupant in the car and abnormal sound of the occupant in the car.
  • the abnormal in-vehicle environment includes one or more of an abnormal in-vehicle environment image, abnormal in-vehicle environment sound, abnormal in-vehicle air quality, and abnormal in-vehicle temperature.
  • according to the abnormal state type and the degree of abnormality, an emergency measure is determined, where the emergency measure is an operation to reduce the degree of abnormality.
  • the acquisition unit is specifically used to acquire the first data and the second data monitored by the sensor, where the sensor includes one or more of a seat pressure sensor, a camera, a sound sensor, an air quality sensor, and a temperature sensor.
  • according to the first data, the occupant information in the vehicle is determined, where the first data includes at least one of seat pressure data in the vehicle, image data of the occupants in the vehicle, and voice data of the occupants in the vehicle.
  • according to the second data, the in-vehicle environment information is determined.
  • the second data includes at least one of image data of the environment in the vehicle, environmental sound data in the vehicle, air quality data in the vehicle, and temperature data in the vehicle.
  • the processing unit is also used to trigger the occupant protection mode through the wake-up signal.
  • the wake-up signal includes at least one of in-vehicle seat pressure data that exceeds a preset pressure threshold and lasts longer than a preset duration, and voice data of in-vehicle occupants that exceed a preset decibel threshold.
  • the processing unit is also configured to operate at a preset low frequency, determine the occupant information in the vehicle according to the first data, and determine the environment information in the vehicle according to the second data. And/or, at least one of the driving control function and the entertainment function is turned off.
  • the behavior information of the occupants in the car includes the location of the occupants in the car, the posture of the occupants in the car, and the facial expressions of the occupants in the car.
  • the position of the occupants in the car includes the front seat and the rear seat.
  • the postures of the occupants in the car include sitting upright, curled up, and lying down.
  • the facial expressions of the occupants in the car include normal, happy, sad, angry, anxious, or uncomfortable.
  • the voice information of the occupants in the car includes the volume of the occupants in the car, the voiceprint of the occupants in the car, and the semantic information of the voice of the occupants in the car.
  • the semantic information of the voice of the occupants in the car includes asking for help, singing, and making phone calls.
  • the image information of the environment in the car includes normal, fire, car accident and so on.
  • the processing unit is specifically configured to determine that the abnormal state type is abnormal occupant behavior if the posture of an occupant in the vehicle is curled up or lying down and the facial expression of the occupant is uncomfortable; to determine that the abnormal state type is abnormal occupant sound if the volume of an occupant exceeds the first preset decibel threshold corresponding to that occupant's voiceprint and/or the semantic information of the occupant's voice contains distress information; and to determine that the abnormal state type is an abnormal in-vehicle environment image if the image information of the in-vehicle environment shows a fire or a car accident.
  • if the in-vehicle environmental sound information exceeds the second preset decibel threshold, it is determined that the abnormal state type is abnormal in-vehicle environmental sound. If the air quality in the vehicle exceeds the preset air quality threshold, it is determined that the abnormal state type is abnormal in-vehicle air quality. If the temperature inside the vehicle exceeds the preset temperature threshold, it is determined that the abnormal state type is abnormal in-vehicle temperature.
  • the processing unit is specifically also used to fuse the occupant information in the car and the environment information in the car in the same time period to obtain the fusion information used to describe the current scene in the car. Analyze the fusion information to determine the degree of abnormality used to indicate the impact of the current in-vehicle scene on the occupants.
  • the processing unit is specifically also used for if the current in-car scene described by the fusion information is a baby crying, the degree of abnormality is low. If the current scene in the car described by the fusion information is a baby crying, and the temperature in the car is abnormal, the degree of abnormality is relatively high. If the current scene in the car described by the fusion information is a baby crying and there is a fire in the car, the degree of abnormality is high.
  • emergency measures include emergency communications and/or emergency control measures.
  • the communication content of the emergency communication includes one or more of vehicle location information, vehicle appearance information, license plate number information, in-vehicle status information, in-vehicle image data, in-vehicle sound information, status information of the sensor with abnormal data, and the emergency control measures.
  • the communication method of emergency communication is one or more of SMS, MMS, and voice call.
  • the emergency control measures include one or more of giving the occupants a voice reminder, opening the windows, opening the doors, turning on the in-vehicle air purification equipment, turning on the temperature adjustment equipment, turning on the fire extinguishing device, unlocking/opening the doors, honking the horn, and flashing the lights.
  • the processing unit is specifically used to determine that the emergency measure is emergency communication if the degree of abnormality is low, with the first preset emergency contact as the contact. If the degree of abnormality is relatively high, the emergency measures are emergency communication and emergency control measures, the contact for the emergency communication is the first preset emergency contact, and the emergency control measures are determined according to the abnormal state type. If the degree of abnormality is high, the emergency measures are emergency communication and emergency control measures, the contacts for the emergency communication include the first preset emergency contact and a second preset emergency contact, and the emergency control measures are determined according to the abnormal state type.
  • the processing unit is specifically also used to determine that, if the abnormal state type is abnormal in-vehicle air quality, the emergency control measures are at least one of opening the windows, opening the doors, and turning on the in-vehicle air purification equipment. If the abnormal state type is abnormal in-vehicle temperature, the emergency control measures are at least one of opening the windows, opening the doors, and turning on the temperature adjustment device. If the abnormal state type is abnormal occupant behavior and/or abnormal occupant sound, the emergency control measures are at least one of giving the occupants a voice reminder, opening the windows, opening the doors, turning on the temperature adjustment device, turning on the fire extinguishing device, unlocking/opening the doors, honking the horn, and flashing the lights.
  • if the abnormal state type is an abnormal in-vehicle environment image and/or abnormal in-vehicle environment sound, the emergency control measures are at least one of giving the occupants a voice reminder, opening the windows, opening the doors, turning on the temperature adjustment device, turning on the fire extinguishing device, unlocking/opening the doors, honking the horn, and flashing the lights.
  • an occupant protection device including: a processor, a memory, and a communication interface; wherein the communication interface is used to communicate with other equipment or a communication network, and the memory is used to store computer execution instructions.
  • the processor executes the computer-executable instructions stored in the memory, so that the occupant protection device executes the occupant protection method of the first aspect or any implementation of the first aspect.
  • an embodiment of the present application also provides a computer-readable storage medium, including instructions, which, when run on a computer, cause the computer to execute the occupant protection method of the first aspect or any implementation of the first aspect.
  • the embodiments of the present application also provide a computer program product, including instructions, which, when run on a computer, cause the computer to execute the occupant protection method of the first aspect or any implementation of the first aspect.
  • FIG. 1 is a first structural diagram of an autonomous vehicle provided by an embodiment of the application;
  • FIG. 2 is a second structural diagram of an automatic driving vehicle provided by an embodiment of the application.
  • FIG. 3 is a schematic structural diagram of a computer system provided by an embodiment of this application.
  • FIG. 4 is a schematic structural diagram of a chip system provided by an embodiment of the application.
  • FIG. 5 is a schematic diagram 1 of the application of a cloud-side command automatic driving vehicle provided by an embodiment of this application;
  • FIG. 6 is a second schematic diagram of the application of a cloud-side command automatic driving vehicle provided by an embodiment of this application.
  • FIG. 7 is a schematic structural diagram of a computer program product provided by an embodiment of the application.
  • FIG. 8 is a schematic flowchart of an occupant protection method provided by an embodiment of the application.
  • FIG. 9 is a first structural diagram of an occupant protection device provided by an embodiment of the application.
  • FIG. 10 is a second structural diagram of the occupant protection device provided by the embodiment of the application.
  • the occupant protection method provided in the embodiments of the present application is applied to a vehicle, or applied to other devices (such as a cloud server, a mobile phone terminal, etc.) having a function of controlling a vehicle.
  • the vehicle may be an autonomous vehicle, which may be a vehicle with partial autonomous driving functions or a vehicle with full autonomous driving functions; that is, the autonomous driving level of the vehicle may follow the classification of the Society of Automotive Engineers (SAE) of the United States.
  • the vehicle or other device can implement the occupant protection method provided in the embodiments of this application through its components (including hardware and software): obtain the in-vehicle occupant information and in-vehicle environment information, determine the abnormal state type and the degree of abnormality from the obtained information, and then determine corresponding emergency measures to reduce the degree of abnormality.
  • FIG. 1 is a functional block diagram of a vehicle 100 provided in an embodiment of the present application.
  • the vehicle 100 may include various subsystems, such as a travel system 110, a sensor system 120, a control system 130, one or more peripheral devices 140, a power supply 150, a computer system 160, and a user interface 170.
  • the vehicle 100 may include more or fewer subsystems, and each subsystem may include multiple elements.
  • each subsystem and element of the vehicle 100 may be interconnected by wire or wirelessly.
  • the travel system 110 may include components that provide power movement for the vehicle 100.
  • the travel system 110 may include an engine, an energy source, a transmission, and wheels.
  • the engine can be an internal combustion engine, an electric motor, an air compression engine, or a combination of these with other types of engines.
  • the engine can convert the energy source into mechanical energy.
  • energy sources include gasoline, diesel, other petroleum-based fuels, propane, other compressed-gas-based fuels, ethanol, solar panels, batteries, and other power sources.
  • the energy source can also provide energy for other subsystems of the vehicle 100.
  • the transmission device can transmit the mechanical power from the engine to the wheels to change the rotation speed of the wheels and so on.
  • the transmission device may include at least one of a gearbox, a differential, and a drive shaft, where the drive shaft may be coupled to one or more shafts of one or more wheels.
  • the transmission device may also include other devices, such as a clutch.
  • the sensor system 120 may include several sensors that sense information about the surrounding environment of the vehicle 100.
  • the sensor system 120 includes at least one of a positioning system 121 (such as a GPS system, a Beidou system or other positioning systems), an inertial measurement unit (IMU) 122, a radar 123, a laser rangefinder 124, and a camera 125.
  • the positioning system 121 can be used to estimate the geographic location of the vehicle 100.
  • the IMU 122 is used to sense the position and orientation change of the vehicle 100 based on inertial acceleration.
  • the IMU 122 may be a combination of an accelerometer and a gyroscope.
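  • As an illustrative aside (not part of the embodiment), sensing an orientation change from the inertial data described above reduces to integrating the measured angular rates over time; the sketch below assumes ideal, noise-free gyroscope samples, and the function name is hypothetical:

```python
def integrate_yaw(gyro_yaw_rates, dt):
    """Estimate the total yaw (orientation) change by integrating gyroscope
    angular-rate samples (degrees/second) over a fixed time step dt (seconds)."""
    return sum(rate * dt for rate in gyro_yaw_rates)

# Four samples at 10 deg/s over 0.1 s each -> 4 degrees of yaw change.
print(integrate_yaw([10.0, 10.0, 10.0, 10.0], 0.1))  # 4.0
```

  • A real IMU fusion would also correct gyroscope drift with the accelerometer, which is why the IMU 122 combines both sensors.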
  • the radar 123 can use radio signals to sense objects in the surrounding environment of the vehicle 100. In addition to sensing objects, the radar 123 can also be used to sense the speed and/or the forward direction of the objects.
  • the laser rangefinder 124 may use lasers to sense objects in the environment where the vehicle 100 is located. In some embodiments, the laser rangefinder 124 may include one or more laser sources, laser scanners, one or more detectors, and other system components.
  • the camera 125 may be used to capture multiple images of the surrounding environment of the vehicle 100, and the camera 125 may be a static camera or a video camera.
  • the sensor system 120 also includes sensors of the internal system of the vehicle 100.
  • the sensors of the internal system of the vehicle 100 include an in-vehicle camera 1215, a seat pressure sensor 126, a sound sensor 127, an air quality sensor 128, a temperature sensor 129, a vibration sensor 1210, a touch sensor 1211, a humidity sensor 1212, a smoke sensor 1213, a vehicle speed sensor 1214, and other sensors.
  • the in-vehicle camera 1215 may be used to capture multiple images of the occupants in the vehicle and multiple images of the environment in the vehicle.
  • the seat pressure sensor 126 may be used to monitor pressure data on each seat in the vehicle.
  • the sound sensor 127 can be used to monitor the sound data of the occupants in the car and the sound data of the environment in the car.
  • the air quality sensor 128 can be used to monitor the air quality in the vehicle and obtain related air quality data.
  • the temperature sensor 129 is used to monitor the temperature in the vehicle.
  • the vibration sensor 1210 is used to capture vibration data occurring in the vehicle.
  • the touch sensor 1211 is used to monitor the touch data on the display screen of the central control unit in the vehicle.
  • the humidity sensor 1212 is used to monitor humidity data in the vehicle.
  • the smoke sensor 1213 is used to monitor the smoke concentration data in the vehicle.
  • the vehicle speed sensor 1214 is used to monitor the speed data of the vehicle to determine whether the vehicle is at a standstill.
  • the sensors of the vehicle's internal system may also include air quality sensors, fuel gauges, oil temperature gauges, and so on.
  • One or more kinds of sensor data collected by these sensors can be used to detect objects and their corresponding characteristics (position, shape, temperature, speed, etc.). Such detection and identification are essential for the safe operation of the vehicle 100 and for ensuring the safety of the occupants in the vehicle.
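  • As a minimal sketch (not part of the embodiment), a detected object and its characteristics could be represented as a simple record; all names and the distance threshold below are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    """One object detected from fused sensor data (hypothetical structure)."""
    position: tuple      # (x, y, z) relative to the vehicle, in meters
    shape: str           # e.g. "pedestrian", "vehicle", "obstacle"
    temperature: float   # degrees Celsius, if a thermal reading is available
    speed: float         # meters per second

def is_hazard(obj: DetectedObject, min_distance: float = 2.0) -> bool:
    """Flag objects closer than `min_distance` meters (illustrative threshold)."""
    x, y, z = obj.position
    return (x ** 2 + y ** 2 + z ** 2) ** 0.5 < min_distance

obstacle = DetectedObject(position=(1.0, 1.0, 0.0), shape="obstacle",
                          temperature=20.0, speed=0.0)
print(is_hazard(obstacle))  # True (distance ~1.41 m < 2.0 m)
```

  • A production system would of course derive such records from the sensor fusion described above rather than construct them by hand.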
  • the control system 130 may control the operation of the vehicle 100 and its components.
  • the control system 130 may include various elements, such as at least one of a steering system 131, a throttle 132, a braking unit 133, a computer vision system 134, a route control system 135, and an obstacle avoidance system 136.
  • the steering system 131 is used to adjust the forward direction of the vehicle 100.
  • the steering system 131 may be a steering wheel system.
  • the throttle 132 further controls the speed of the vehicle 100 by controlling the operating speed of the engine.
  • the braking unit 133 is used to control the deceleration of the vehicle 100, and the braking unit 133 can use friction to reduce the rotation speed of the wheels.
  • the braking unit 133 can reduce the rotation speed of the wheels by converting the kinetic energy of the wheels into electric current, and the braking unit 133 can also take other forms to reduce the rotation speed of the wheels, thereby controlling the speed of the vehicle 100.
  • the computer vision system 134 can process and analyze the images captured by the camera 125 and the in-vehicle camera 1215 to identify objects and/or features of objects in the surrounding environment of the vehicle 100, as well as behavior information of the occupants in the vehicle 100 and image information of the in-vehicle environment.
  • the objects and/or object features in the surroundings of the vehicle 100 include traffic signals, road boundaries, obstacles, etc.
  • the occupant information in the vehicle 100 includes information such as facial expressions of the occupants in the vehicle, and postures of the occupants in the vehicle.
  • the computer vision system 134 can use at least one of an object recognition algorithm, a structure from motion (SFM) algorithm, video tracking, or other computer vision technologies to draw environmental maps, track objects, estimate object speeds, determine the current in-vehicle situation, and so on.
  • the route control system 135 is used to determine the travel route of the vehicle 100.
  • the route control system 135 may combine sensor data from the sensor system 120 and one or more predetermined map data to determine a driving route for the vehicle 100.
  • the obstacle avoidance system 136 is used to identify and evaluate obstacles, and plan a way to cross potential obstacles in the surrounding environment of the vehicle 100, such as avoiding and bypassing.
  • the control system 130 may include additional components other than those described above, may omit some of the above components, or may replace some of the above components with other components.
  • the vehicle 100 interacts with peripheral devices 140 such as external sensors, other vehicles, and other computer systems through the user interface 170.
  • the peripheral device 140 may include at least one of a wireless communication system 141, an onboard computer 142, a microphone 143, a speaker 144, and/or other peripheral devices.
  • the onboard computer 142 may provide information to the vehicle 100 or the user of the vehicle 100 through the user interface 170 and receive information from the vehicle 100 or the user of the vehicle 100.
  • the on-board computer 142 can be operated through a display screen.
  • the user interface 170 may also provide a means for the vehicle 100 to communicate with other devices located in the vehicle 100.
  • the microphone 143 may receive audio (eg, voice commands or other audio input) from a user of the vehicle 100 through the user interface 170.
  • the speaker 144 may output audio to the user of the vehicle 100 through the user interface 170.
  • the wireless communication system 141 may be used for wireless communication with one or more devices via a communication network or directly.
  • the wireless communication system 141 may use 3G cellular communication, such as CDMA, EVDO, GSM/GPRS, or 4G cellular communication, such as LTE, or 5G cellular communication.
  • the wireless communication system 141 may also use WiFi or other wireless protocols to communicate with a wireless local area network (WLAN).
  • the wireless communication system 141 can directly communicate with devices, such as various vehicle communication systems, using infrared links, Bluetooth, or ZigBee.
  • the wireless communication system 141 may include one or more dedicated short range communications (DSRC) devices.
  • the power supply 150 may provide power to various components of the vehicle 100.
  • the power source 150 may be a rechargeable lithium-ion or lead-acid battery; one or more battery packs of such batteries may be configured as a power source to provide power for the various components of the vehicle 100.
  • the power supply 150 and the energy source may be implemented together.
  • the computer system 160 may include at least one processor 161 that executes instructions 1621 stored in a non-transitory computer readable medium such as a data storage device 162.
  • the computer system 160 can also control multiple computer devices in individual components or subsystems of the vehicle 100 in a distributed manner.
  • the processor 161 may be any conventional processor, such as a commercially available central processing unit (CPU).
  • the processor may be a dedicated device such as an application specific integrated circuit (ASIC) or other hardware-based processor.
  • the processor, computer system, or memory may actually comprise multiple processors, computer systems, or memories that may or may not be housed in the same physical enclosure.
  • the memory may be a hard drive or another storage medium located in a different physical enclosure. A reference to a processor or computer system is therefore understood to include a reference to a collection of processors, computer systems, or memories that may or may not operate in parallel.
  • some components such as the steering component and the deceleration component may each have its own processor and only perform calculations related to the function of a specific component.
  • the processor may be remote from the vehicle and wirelessly communicate with the vehicle.
  • some of the processes described herein are performed by a processor disposed in the vehicle, while others are performed by a remote processor, including taking the necessary steps to perform a single manipulation.
  • the data storage device 162 may include instructions 1621 (eg, program logic), which may be executed by the processor 161 to perform various functions of the vehicle 100, including those described above.
  • the data storage device 162 may also contain additional instructions, including instructions to send data to, receive data from, interact with, and/or control one or more of the traveling system 110, the sensor system 120, the control system 130, and the peripheral devices 140.
  • the data storage device 162 can also store data, such as road maps, route information, the location and/or direction and/or speed of the vehicle 100, data of other vehicles, and other information.
  • when the vehicle 100 is operating in the autonomous, semi-autonomous, and/or manual driving mode, the above-mentioned data and related information can be used by the vehicle 100 and the computer system 160.
  • the data storage device 162 obtains in-vehicle environment information and in-vehicle occupant information from the sensor system 120 or other components of the vehicle 100.
  • the in-vehicle environment information may be, for example, image information of the environment inside the vehicle (such as a fire), or air quality information in the vehicle, and so on.
  • the information of the occupants in the vehicle may be behavior information of the occupants in the vehicle and/or voice information of the occupants in the vehicle.
  • the data storage device 162 may also store state information of the vehicle itself and state information of other vehicles interacting with the vehicle.
  • the state information of the vehicle includes, but is not limited to, the speed of the vehicle, the scene inside the vehicle, and the like.
  • the data storage device 162 can also obtain information such as the distance between obstacles in the surrounding environment and the vehicle based on the speed and distance measurement functions of the radar 123.
  • the processor 161 can obtain this information from the data storage device 162, and obtain a final emergency strategy based on the environmental information of the environment in which the vehicle is located, the state information of the vehicle itself, and a conventional emergency strategy, so as to control the vehicle 100 to take emergency measures to alleviate the abnormal state in the vehicle.
  • the user interface 170 is used to provide information to or receive information from a user of the vehicle 100.
  • the user interface 170 may interact with one or more input/output devices in the set of peripheral devices 140, such as one or more of the wireless communication system 141, the onboard computer 142, the microphone 143, and the speaker 144.
  • the computer system 160 may control the vehicle 100 based on information acquired from various subsystems (for example, the traveling system 110, the sensor system 120, and the control system 130) and the information received from the user interface 170. For example, the computer system 160 may control the steering system 131 to change the forward direction of the vehicle according to the information from the control system 130, so as to avoid obstacles detected by the sensor system 120 and the obstacle avoidance system 136. In some embodiments, the computer system 160 can control many aspects of the vehicle 100 and its subsystems.
  • one or more of these components described above may be installed or associated with the vehicle 100 separately.
  • the data storage device 162 may exist partially or completely separately from the vehicle 100.
  • the above-mentioned components may be coupled together for communication in a wired and/or wireless manner.
  • FIG. 1 should not be construed as a limitation to the embodiments of the present application.
  • an autonomous driving vehicle, such as the above vehicle 100, can determine the adjustment instruction for the current state of the equipment in the vehicle according to the environmental information and occupant information in the vehicle.
  • the occupants in the vehicle 100 may be babies, teenagers, adults, old people, and so on.
  • each occupant in the vehicle may be considered independently, and the emergency communication and emergency control measures of the vehicle 100 may be determined based on the respective characteristics of the occupants, such as voiceprint information, and the volume of the occupants in the vehicle.
  • the vehicle 100, as an autonomous vehicle, or the computer device associated with it, may determine the type and degree of the abnormality in the vehicle (for example, the vehicle rolling away, catching fire, etc.) based on the identified occupant information and the in-vehicle environment information, and determine corresponding emergency communication measures.
  • the identified occupant information in the vehicle and the environmental information in the vehicle can also be considered as a whole to predict the degree of abnormality of the conditions in the vehicle.
  • the vehicle 100 can determine the communication content, the communication object, and the communication method of the emergency communication based on the recognized abnormality of the in-vehicle condition. In this process, other factors may also be considered to determine the emergency communication command of the vehicle 100, such as the location of the vehicle 100, the environment outside the vehicle, the speed of the vehicle, and so on. In other words, the self-driving car can determine what kind of emergency communication it needs based on the identified occupant information and in-vehicle environment information (for example, whether the emergency communication method is SMS or telephone, and whether the emergency contact is the driver or an emergency center).
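  • The selection of communication method and contact described above can be sketched as a minimal rule table; the abnormality types, degree scale, and contact names below are hypothetical illustrations, not part of the embodiment:

```python
def choose_emergency_communication(abnormal_type: str, degree: int) -> dict:
    """Map an abnormal state type and degree (1 = mild, 3 = severe) to a
    hypothetical emergency communication method and contact."""
    if degree >= 3 or abnormal_type in ("fire", "vehicle_rolling"):
        # Severe abnormality: telephone a social emergency agency directly.
        return {"method": "telephone", "contact": "emergency_center"}
    if degree == 2:
        # Moderate abnormality: telephone the driver.
        return {"method": "telephone", "contact": "driver"}
    # Mild abnormality: a text message to the driver is enough.
    return {"method": "SMS", "contact": "driver"}

print(choose_emergency_communication("high_temperature", 1))
# {'method': 'SMS', 'contact': 'driver'}
```

  • An actual decision module would also weigh the other factors mentioned above (vehicle location, outside environment, speed) before issuing the communication command.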
  • the computer device can also provide instructions to adjust the state of various devices in the vehicle 100, so that the autonomous vehicle follows the given emergency control measures and adjusts the state of the devices in the vehicle to ensure the safety of the occupants.
  • the above-mentioned vehicle 100 may be a car, truck, motorcycle, bus, boat, airplane, helicopter, lawn mower, recreational vehicle, amusement park vehicle, construction equipment, tram, golf cart, train, trolley, or the like; this is not particularly limited in the embodiments of the present application.
  • the vehicle may include the following modules:
  • the perception module 201 is used to acquire data monitored by sensors on the vehicle, and process the acquired data to determine the information of the occupants in the vehicle and the environment information in the vehicle.
  • the perception module 201 can be divided into an occupant perception module 2011 and an environment perception module 2012.
  • the interaction between the occupant perception module 2011 and the environment perception module 2012 and the on-board equipment (each sensor) can be referred to as shown in Figure 2.
  • the arrows in the figure indicate that the sensors transmit data to the occupant perception module 2011 or the environment perception module 2012; the relationship between sensors connected on the same dashed line is "and/or".
  • the occupant perception module 2011 obtains the first data through the in-car camera, seat pressure sensor, sound sensor, vibration sensor, and touch sensor.
  • the first data includes image data of the occupants in the car, pressure data of the seats in the car, sound data of the occupants in the car, seat vibration data in the car, and touch data on the display screen in the car.
  • the environmental perception module 2012 obtains the second data through the in-car camera, sound sensor, air quality sensor, temperature sensor, humidity sensor, smoke sensor, and vehicle speed sensor.
  • the second data includes image data of the in-car environment, sound data of the in-car environment, in-car air quality data, in-car temperature data, in-car humidity data, in-car smoke density data, and vehicle speed data.
  • the perception module 201 can analyze the acquired first data and second data to obtain the occupant information and the in-vehicle environment information, and transmit this information to the decision-making module 202, which then issues emergency communication instructions and/or emergency control instructions.
  • the decision-making module 202 is used to receive the occupant information and the environment information in the vehicle sent by the perception module 201, and analyze the received information, and determine the abnormal state type of the in-vehicle condition when the in-vehicle condition is abnormal.
  • the occupant information and the in-vehicle environment information for the same time period are fused to obtain fusion information describing the current in-vehicle scene; the fusion information is analyzed to determine the degree of abnormality in the vehicle.
  • emergency measures are determined according to the type of abnormal state and the degree of abnormality in the vehicle.
  • the emergency measures include emergency communication and/or emergency control measures; the corresponding emergency control instructions are issued to the execution module 203, and the emergency communication instructions are issued to the communication module 204.
  • the execution module 203 is configured to receive emergency control instructions issued by the decision module 202 and execute corresponding emergency control measures.
  • the communication module 204 is used to receive the emergency communication instructions issued by the decision-making module 202, establish an information transmission channel between the vehicle driving system and the driver, other preset emergency contacts, and/or social emergency agencies, and transmit information that helps the communication object evaluate the abnormal state in the vehicle and/or deploy a rescue.
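  • The cooperation of the modules above can be sketched as follows; the class names, thresholds, and measure strings are hypothetical stand-ins for the perception module 201, the decision-making module 202, and the instructions handed to the execution module 203 and communication module 204:

```python
class PerceptionModule:
    """Stands in for module 201: turns raw first/second data into structured info."""
    def perceive(self, first_data: dict, second_data: dict) -> dict:
        return {"occupants": first_data, "environment": second_data}

class DecisionModule:
    """Stands in for module 202: classifies the abnormality and picks measures."""
    def decide(self, info: dict) -> dict:
        smoke = info["environment"].get("smoke_density", 0.0)
        abnormal_type = "fire" if smoke > 0.5 else "none"
        degree = 3 if smoke > 0.8 else (1 if smoke > 0.5 else 0)
        measures = {}
        if abnormal_type != "none":
            measures["control"] = "open_windows"        # for execution module 203
            measures["communication"] = "notify_driver" # for communication module 204
        return {"type": abnormal_type, "degree": degree, "measures": measures}

perception = PerceptionModule()
decision = DecisionModule().decide(
    perception.perceive({"seat_pressure": 12.0}, {"smoke_density": 0.9}))
print(decision["type"], decision["degree"])  # fire 3
```

  • In the embodiment, the decision module would of course fuse far richer occupant and environment information than this single smoke reading.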
  • the computer system 160 shown in FIG. 1 includes a processor 301, and the processor 301 is coupled to a system bus 302.
  • the processor 301 may be one or more processors, where each processor may include one or more processor cores.
  • a display adapter (video adapter) 303, the display adapter 303 can drive the display 324, and the display 324 is coupled to the system bus 302.
  • the system bus 302 is coupled with an input/output (I/O) bus (BUS) 305 through a bus bridge 304.
  • the I/O interface 306 and the I/O bus 305 are coupled.
  • the I/O interface 306 communicates with various I/O devices, such as an input device 307 (such as a keyboard, mouse, display screen, etc.) and a media tray 308 (such as a CD-ROM, a multimedia interface, etc.).
  • the transceiver 309 can send and/or receive radio communication signals
  • the camera 310 can capture static and dynamic digital video images
  • the interface connected to the I/O interface 306 may be a universal serial bus (USB) interface.
  • the processor 301 may be any traditional processor, including a reduced instruction set computer (RISC) processor, a complex instruction set computer (CISC) processor, or a combination of the foregoing.
  • the processor may be a dedicated device such as an application specific integrated circuit (ASIC).
  • the processor 301 may be a neural network processor or a combination of a neural network processor and the foregoing traditional processors.
  • the computer system 160 may be located far away from the autonomous vehicle, and may communicate with the autonomous vehicle 100 wirelessly.
  • some of the processes described herein can be configured to be executed on a processor in an autonomous vehicle, and some other processes are executed by a remote processor, including taking actions required to perform a single maneuver.
  • the computer system 160 may communicate with a software deployment server (deploying server) 313 through a network interface 312.
  • the network interface 312 is a hardware network interface, such as a network card.
  • the network (network) 314 may be an external network, such as the Internet, or an internal network, such as an Ethernet or a virtual private network (VPN).
  • the network 314 may also be a wireless network, such as a WiFi network, a cellular network, and so on.
  • the hard disk drive interface 315 and the system bus 302 are coupled.
  • the hard disk drive interface 315 and the hard disk drive 316 are connected.
  • the system memory 317 and the system bus 302 are coupled.
  • the data running in the system memory 317 may include an operating system (OS) 318 and application programs 319 of the computer system 160.
  • OS operating system
  • the operating system includes but is not limited to Shell320 and kernel 321.
  • Shell 320 is an interface between the user and the kernel of the operating system.
  • the shell is the outermost layer of the operating system. The shell manages the interaction between the user and the operating system: waiting for the user's input, interpreting the user's input to the operating system, and processing the output of various operating systems.
  • the kernel 321 is composed of those parts of the operating system that manage memory, files, peripherals, and system resources. Interacting directly with the hardware, the operating system kernel usually runs processes, provides inter-process communication, and provides functions such as CPU time slice management, interrupt handling, memory management, I/O management, and so on.
  • Application programs 319 include programs related to controlling the auto-driving car, such as programs that manage the interaction between the autonomous vehicle and road obstacles, programs that control the route or speed of the autonomous vehicle, and programs that control the interaction between the autonomous vehicle and other autonomous vehicles on the road.
  • the application 319 also exists on the deploying server 313 system. In one embodiment, when the application program 319 needs to be executed, the computer system 160 may download the application program 319 from the deploying server 313.
  • the application program 319 may be an application program that controls the vehicle to calculate a final driving strategy based on the foregoing environmental information, state information, and a traditional rule-based driving strategy.
  • the environmental information is the information of the environment where the vehicle is currently located (green belts, lanes, traffic lights, etc.)
  • the state information is the information of the target object that interacts with the vehicle (speed, acceleration, etc.).
  • the processor 301 of the computer system 160 calls the application 319 to obtain the final driving strategy.
  • the sensor 322 is associated with the computer system 160.
  • the sensor 322 is used to detect the environment around the computer system 160.
  • the sensor 322 can detect animals, cars, obstacles, and pedestrian crossings.
  • the sensor can also detect the surrounding environment of the above-mentioned animals, cars, obstacles and crosswalks.
  • the environment around the animal includes, for example, other animals that appear around it, weather conditions, and the brightness of the surrounding environment.
  • the sensor may be a camera, an infrared sensor, a chemical detector, a microphone, etc.
  • the occupant protection method of the embodiments of the present application may also be executed by a chip system.
  • FIG. 4 is a structural diagram of a chip system provided by an embodiment of the present application.
  • a neural network processor (neural processing unit, NPU) 40 is mounted on a main CPU (Host CPU) as a coprocessor, and the Host CPU allocates tasks to the NPU.
  • the core part of the NPU is the arithmetic circuit 403.
  • the arithmetic circuit 403 is controlled by the controller 404, so that the arithmetic circuit 403 can extract matrix data in the memory and perform multiplication operations.
  • the arithmetic circuit 403 includes multiple processing units (process engines, PE). In some implementations, the arithmetic circuit 403 is a two-dimensional systolic array. The arithmetic circuit 403 may also be a one-dimensional systolic array, or other electronic circuits capable of performing mathematical operations such as multiplication and addition. In some implementations, the arithmetic circuit 403 is a general-purpose matrix processor.
  • the arithmetic circuit 403 obtains the data corresponding to the weight matrix B from the weight memory 402 and caches it on each PE in the arithmetic circuit 403.
  • the arithmetic circuit 403 fetches the corresponding data of the input matrix A from the input memory 401, and performs matrix operations according to the input matrix A and the weight matrix B, and the partial or final result of the matrix operation can be stored in an accumulator 408.
  • the arithmetic circuit 403 can be used to implement a feature extraction model (such as a convolutional neural network model), and input image data into the convolutional neural network model, and the features of the image can be obtained through the operation of the model. Furthermore, the image features are output to the classifier, and the classifier outputs the classification probability of the object in the image.
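  • Functionally, the matrix operation performed by the arithmetic circuit 403 (fetching input matrix A and weight matrix B, and collecting partial results in the accumulator 408) amounts to an ordinary matrix multiplication. The plain-Python sketch below illustrates that computation only, not the systolic-array hardware:

```python
def matmul(A, B):
    """C = A x B, with each output element built up in an accumulator the way
    the accumulator described above collects partial results of the operation."""
    rows, inner, cols = len(A), len(B), len(B[0])
    C = [[0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            acc = 0  # accumulator for one output element
            for k in range(inner):
                acc += A[i][k] * B[k][j]
            C[i][j] = acc
    return C

A = [[1, 2], [3, 4]]  # stands in for the input matrix A
B = [[5, 6], [7, 8]]  # stands in for the weight matrix B
print(matmul(A, B))  # [[19, 22], [43, 50]]
```

  • A systolic array performs the same multiply-accumulate steps, but streams the operands through a grid of processing units so that many accumulations proceed in parallel.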
  • the unified memory 406 is used to store input data and output data.
  • the weight data in the external memory is directly sent to the weight memory 402 through a direct memory access controller (DMAC) 405.
  • the input data in the external memory can be transferred to the unified memory 406 through the DMAC, or transferred to the input memory 401.
  • the bus interface unit (BIU) 410 is used for interaction between the AXI bus and both the DMAC and the instruction fetch buffer 409. It is also used by the instruction fetch buffer 409 to obtain instructions from an external memory, and by the storage unit access controller 405 to obtain the original data of the input matrix A or the weight matrix B from the external memory.
  • the DMAC is mainly used to transfer the input data in the external memory (DDR) to the unified memory 406, or to transfer the weight data to the weight memory 402, or to transfer the input data to the input memory 401.
  • the vector calculation unit 407 may include a plurality of operation processing units, used to perform further processing on the output of the arithmetic circuit 403 when needed, such as vector multiplication, vector addition, exponential operations, logarithmic operations, magnitude comparison, and so on. It is mainly used for non-convolution/fully-connected (FC) layer computations in neural networks, such as pooling, batch normalization, and local response normalization.
  • the vector calculation unit 407 stores the processed output vector to the unified memory 406.
  • the vector calculation unit 407 may apply a nonlinear function to the output of the arithmetic circuit 403, such as a vector of accumulated values, to generate the activation value.
  • the vector calculation unit 407 generates a normalized value, a combined value, or both.
  • the processed output vector can also be used as an activation input of the arithmetic circuit 403, for example, for use in a subsequent layer in a neural network.
  • the controller 404 is connected to an instruction fetch buffer 409, and the instructions used by the controller 404 can be stored in the instruction fetch buffer 409.
  • the unified memory 406, the input memory 401, the weight memory 402, and the fetch memory 409 are all On-Chip memories.
  • the external memory is private to the NPU hardware architecture.
  • the main CPU and NPU cooperate to realize the corresponding algorithm for the functions required by the vehicle 100 in Figure 1, and the corresponding algorithm for the functions required by the vehicle shown in Figure 2, or the algorithm shown in Figure 3.
  • the computer system 160 may also receive information from other computer systems or transfer information to other computer systems.
  • the sensor data collected from the sensor system 120 of the vehicle 100 can be transferred to another computer, and the data can be processed by the other computer.
  • the data from the computer system 160 can be transmitted to the computer system 510 on the cloud side via the network for further processing.
  • the network and intermediate nodes can include various configurations and protocols, including the Internet, the World Wide Web, intranets, virtual private networks, wide area networks, local area networks, private networks using one or more companies' proprietary communication protocols, Ethernet, WiFi, HTTP, and various combinations of the foregoing. Such communication can be performed by any device capable of transferring data to and from other computers, such as modems and wireless interfaces.
  • another computer system 510 may include a server with multiple computers, such as a load balancing server group.
  • another computer system 510 exchanges information with different nodes of the network.
  • the server 520 may have a configuration similar to the computer system 160 and have a processor 530, a memory 540, instructions 550, and data 560.
  • the data 560 of the server 520 may include weather-related information.
  • the server 520 may receive, monitor, store, update, and transmit various information related to weather.
  • the information may include, for example, precipitation, cloud, and/or temperature information and/or humidity information in the form of reports, radar information, forecasts, etc.
  • the cloud service center may receive information (such as data collected by vehicle sensors or other information) from the autonomous vehicles 613 and 612 in its environment 600 via a network 611 such as a wireless communication network.
  • the cloud service center 620 runs the stored programs related to the occupant information and the in-vehicle environment information based on the data monitored by the sensors, determines the abnormal state type and degree of abnormality, determines the emergency measures according to the abnormal state type and degree of abnormality, and then reminds the relevant personnel of the autonomous vehicles 613 and 612 and/or takes corresponding emergency control measures.
  • the cloud service center 620 may provide a part of the map to the vehicles 613 and 612 through the network 611.
  • operations can be divided between different locations.
  • multiple cloud service centers can receive, confirm, combine, and/or send information reports.
  • information reports and/or sensor data can also be sent between vehicles.
  • Other configurations are also possible.
  • the cloud service center 620 sends the autonomous vehicle a response indicating whether to remind the driver and other relevant personnel of the abnormal situation in the vehicle. For example, the cloud service center 620 determines the communication object, communication content, and communication method of the emergency communication, as well as the specific emergency control measures, based on the collected sensor data and emergency response information.
  • the cloud service center 620 observes the video stream of its operating environment 600 or changes in the state of the vehicles 613 and 612, such as changes in speed, changes in the state of occupants in the vehicle, and changes in the in-vehicle environment, to confirm the effect of the emergency measures. Before issuing further emergency control, it evaluates the emergency measures according to their observed effects and controls the vehicles to take more accurate emergency measures.
  • the disclosed methods may be implemented as computer program instructions in a machine-readable format, encoded on a computer-readable storage medium, or encoded on other non-transitory media or articles.
  • Figure 7 schematically illustrates a conceptual partial view of an example computer program product arranged in accordance with at least some of the embodiments shown herein, the example computer program product including a computer program for executing a computer process on a computing device.
  • the example computer program product 700 is provided using a signal bearing medium 701.
  • the signal-bearing medium 701 may include one or more program instructions 702, which, when run by one or more processors, may provide all or part of the functions described above with respect to FIGS. 1 to 6, or may provide descriptions in subsequent embodiments. All or part of the function.
  • program instructions 702 in FIG. 7 also describe example instructions.
  • the signal-bearing medium 701 may include a computer-readable medium 703, such as, but not limited to, a hard disk drive, a compact disc (CD), a digital video disc (DVD), digital tape, memory, read-only memory (ROM), or random access memory (RAM), etc.
  • the signal bearing medium 701 may include a computer recordable medium 704, such as, but not limited to, memory, read/write (R/W) CD, R/W DVD, and so on.
  • the signal-bearing medium 701 may include a communication medium 705, such as, but not limited to, digital and/or analog communication media (eg, fiber optic cables, waveguides, wired communication links, wireless communication links, etc.).
  • the signal bearing medium 701 may be communicated by a wireless communication medium 705 (for example, a wireless communication medium that complies with the IEEE 802.11 standard or other transmission protocols).
  • the one or more program instructions 702 may be, for example, computer-executable instructions or logic-implemented instructions.
  • a computing device such as that described with respect to FIGS. 1 to 6 may be configured to provide various operations, functions, or actions in response to program instructions 702 conveyed to the computing device by one or more of the computer-readable medium 703, the computer recordable medium 704, and/or the communication medium 705. It should be understood that the arrangement described here is for illustrative purposes only.
  • this application provides an occupant protection method.
  • the method may be executed by the processor 161, the processor 301, the NPU 40, or the processor 530. As shown in FIG. 8, the method includes steps S801-S803. S801: Determine the occupant information in the vehicle and the in-vehicle environment information.
  • the occupant information in the car includes at least one of behavior information of the occupant in the car and voice information of the occupant in the car
  • the behavior information of the occupant in the car includes at least one of the position of the occupant in the car, the posture of the occupant in the car, and the facial expression of the occupant in the car.
  • the sound information of the occupants in the car includes at least one of the volume of the occupants in the car, the voiceprint of the occupants in the car, and the semantic information of the sound of the occupants in the car.
  • the in-vehicle environment information includes at least one of image information of the in-vehicle environment, sound information of the in-vehicle environment, air quality information in the vehicle, and temperature information in the vehicle.
  • the position of the occupants in the car includes the front seat and the rear seat.
  • the front seat can be divided into the driver's seat and the passenger seat; the rear seat can be divided into the driver's side rear seat and the passenger side rear seat.
  • the postures of the occupants in the car include sitting upright, curled up, and lying down.
  • the facial expressions of the occupants in the car include normal, happy, sad, angry, anxious or uncomfortable.
  • the voiceprint information of the occupants in the car includes babies, teenagers, adults, the elderly, etc., and the semantic information of the occupants in the car includes calling for help, singing, and making phone calls.
  • the image information of the environment in the car includes normal, fire, car accident, etc.
  • the process of determining the information about the occupants in the vehicle and the environment information in the vehicle is the process of converting unstructured data collected from sensors into structured data or semi-structured data.
  • At least one of the seat pressure sensor, the camera, and the sound sensor is used to acquire at least one of the seat pressure data in the vehicle, the image data of the occupant in the vehicle, and the voice data of the occupant in the vehicle, that is, the first data.
  • At least one of the camera, the sound sensor, the air quality sensor, and the temperature sensor is used to obtain at least one of the image data of the environment in the car, the sound data of the environment in the car, the air quality data in the car, and the temperature data in the car, that is, the second data.
  • the seat pressure data in the car and the image data of the occupants in the car in the first data are analyzed to determine the location of the occupants in the car. The voice data of the occupants in the first data is analyzed to determine the volume of the occupants in the vehicle, the voiceprints of the occupants in the vehicle, and the semantic information of the voices of the occupants in the vehicle. In this way, information about the occupants in the vehicle can be obtained.
  • For example, four seats in the car, namely the driver's seat, the passenger seat, the driver's side rear seat, and the passenger side rear seat, are respectively provided with seat pressure sensors A, B, C, and D. If the seat pressure data collected by A and B exceeds the preset seat pressure threshold while the seat pressure data collected by C and D does not, it can be determined that occupants are present in the driver's seat and the passenger seat.
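The seat-pressure example above can be sketched as a simple threshold check. The sensor labels A-D follow the example; the threshold value and pressure units are illustrative assumptions, not taken from the text.

```python
# Hypothetical sketch: inferring occupied seats from seat pressure sensors
# A-D, as in the four-seat example. The threshold value is illustrative.

SEAT_PRESSURE_THRESHOLD = 50.0  # illustrative units; not specified in the text

SEATS = {
    "A": "driver's seat",
    "B": "passenger seat",
    "C": "driver's side rear seat",
    "D": "passenger side rear seat",
}

def occupied_seats(pressure_readings):
    """Return the names of seats whose pressure exceeds the preset threshold."""
    return [SEATS[s] for s, p in pressure_readings.items()
            if p > SEAT_PRESSURE_THRESHOLD]

# As in the text: A and B exceed the threshold, C and D do not.
readings = {"A": 62.0, "B": 55.3, "C": 3.1, "D": 0.0}
print(occupied_seats(readings))  # ["driver's seat", 'passenger seat']
```

A production system would of course calibrate the threshold per seat and debounce transient readings; this only illustrates the decision rule described above.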
  • the image information of the environment in the car includes normal, fire, car accident, etc. Take the fire in the car as an example.
  • the combustion of combustibles will produce noise, heat, smoke, carbon dioxide, etc.
  • the in-car environmental sound data, in-car temperature data, and in-car air quality data collected by the sound sensor, temperature sensor, and air quality sensor may change accordingly. If none of these values increases, the in-vehicle environment image information is determined to be normal; if one or more of the values of the in-vehicle environmental sound data, in-vehicle temperature data, and in-vehicle air quality data increase and the in-vehicle environment data is abnormal, it is determined that the in-vehicle environment image information is abnormal, for example a fire.
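A minimal sketch of this cross-check, in which abnormal image information is only confirmed as a fire when at least one auxiliary sensor reading also rises; all threshold values are hypothetical assumptions.

```python
# Illustrative sketch: the camera image alone is not trusted; a fire is only
# reported when one or more auxiliary readings (sound, temperature, air
# quality) also exceed their thresholds. All thresholds are hypothetical.

THRESHOLDS = {"sound_db": 70.0, "temp_c": 45.0, "co2_ppm": 2000.0}

def classify_environment(image_abnormal, sound_db, temp_c, co2_ppm):
    """Return 'fire' only when the image change is corroborated by sensors."""
    exceeded = [
        sound_db > THRESHOLDS["sound_db"],
        temp_c > THRESHOLDS["temp_c"],
        co2_ppm > THRESHOLDS["co2_ppm"],
    ]
    if image_abnormal and any(exceeded):
        return "fire"
    return "normal"

print(classify_environment(True, 85.0, 60.0, 3500.0))  # fire
print(classify_environment(False, 40.0, 22.0, 600.0))  # normal
```

Note that an abnormal image with no corroborating sensor change is still classified as normal here, mirroring the comprehensive analysis the text calls for.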
  • the first data and the second data may be analyzed according to the neural network model to determine the occupant information in the vehicle and the environment information in the vehicle, respectively.
  • the neural network model is obtained by model training with historical data and deep learning algorithms.
  • If at least one of a vibration sensor and a touch sensor is also installed in the vehicle, it can be further used to obtain at least one of the seat vibration data in the vehicle and the touch data of the vehicle's display screen.
  • the seat pressure data in the car will also change as the occupant moves, and the occupant's position may change accordingly.
  • For example, four seats in the car, namely the driver's seat, the passenger seat, the driver's side rear seat, and the passenger side rear seat, are respectively provided with seat pressure sensors A, B, C, and D and seat vibration sensors E, F, G, and H. If the seat pressure data collected by A and B exceeds the preset seat pressure threshold and the seat vibration data collected by E and F exceeds the preset vibration threshold, it can be determined that occupants are present in the driver's seat and the passenger seat.
  • At least one of a humidity sensor, a smoke sensor, and a vehicle speed sensor may be further used to obtain at least one of the in-car humidity data, the in-car smoke concentration data, and the vehicle speed data. The vehicle speed data is analyzed to determine the vehicle speed information. The image information of the environment in the car includes normal, fire, car accident, etc.; take a fire in the car as an example. When the image collected by the camera changes, the combustion of combustible materials will produce noise, heat, smoke, carbon dioxide, etc., so the in-car environmental sound data, in-car temperature data, in-car smoke concentration data, in-car air quality data, and in-car humidity data collected by the sound sensor, temperature sensor, smoke sensor, air quality sensor, and humidity sensor may change accordingly. In addition to determining the in-car environmental sound information, in-car air quality information, in-car temperature information, in-car smoke information, and in-car humidity information, it is also necessary to comprehensively analyze the image data of the in-car environment together with the in-car environmental sound data, in-car air quality data, in-car temperature data, in-car smoke concentration data, and in-car humidity data to determine the image information of the in-car environment. In this way, the in-vehicle environment information can be obtained.
  • If the values of the in-car environmental sound data, in-car temperature data, in-car air quality data, in-car smoke concentration data, and in-car humidity data have not increased, and analysis of the in-car environmental image data shows that it is normal, the image information of the in-car environment is determined to be normal. If those values have increased and the in-car environmental image data is not normal, the image information of the in-car environment is determined to be a fire.
  • the vehicle speed data exceeds the preset speed threshold and the image information of the environment inside the vehicle is normal, it is determined that the vehicle is rolling.
  • In the above process, both the in-vehicle camera and the sound sensor are reused, which realizes multiplexing of the in-vehicle sensors so that the in-vehicle sensor information can be more fully utilized.
  • the occupant protection mode is turned on before the occupant information in the vehicle and the environment information in the vehicle are determined.
  • the occupant protection mode can be automatically turned on by a wake-up signal after the driver leaves the vehicle or after the vehicle driving system is turned off.
  • the wake-up signal is one or more items of sensor data that can reflect the presence of occupants in the vehicle. For example, if the in-vehicle seat pressure data exceeds the preset pressure threshold and lasts longer than a preset duration, it may be used as a wake-up signal to trigger the occupant protection mode.
  • If the volume of the in-vehicle occupant in the voice data of the in-vehicle occupant exceeds the preset decibel threshold, the voice data of the in-vehicle occupant may be used as a wake-up signal to trigger the occupant protection mode.
  • If the seat vibration data in the vehicle exceeds the preset vibration threshold, the seat vibration data in the vehicle can be used as a wake-up signal to trigger the occupant protection mode.
  • The vehicle driving system can also receive external information, such as the touch data of the in-car display screen, and automatically power on to start after sleep or shutdown; the data obtained by such sensors or specific devices can likewise be used as a wake-up signal to wake the occupant protection mode.
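The wake-up conditions above can be illustrated as a single predicate over the relevant sensor data. The thresholds and the minimum pressure duration are assumed values, not taken from the text.

```python
# Sketch of the wake-up logic: any one of several sensor signals that can
# indicate an occupant is present triggers the occupant protection mode.
# All thresholds and the required pressure duration are illustrative.

PRESSURE_THRESHOLD = 50.0      # illustrative units
PRESSURE_MIN_DURATION_S = 5.0  # sustained pressure, to ignore brief loads
DECIBEL_THRESHOLD = 60.0
VIBRATION_THRESHOLD = 0.2

def should_wake(pressure, pressure_duration_s, volume_db, vibration, screen_touched):
    """Return True if any wake-up condition described in the text is met."""
    if pressure > PRESSURE_THRESHOLD and pressure_duration_s > PRESSURE_MIN_DURATION_S:
        return True          # sustained seat pressure
    if volume_db > DECIBEL_THRESHOLD:
        return True          # occupant voice above the decibel threshold
    if vibration > VIBRATION_THRESHOLD:
        return True          # seat vibration above the vibration threshold
    return screen_touched    # touch on the in-car display screen

print(should_wake(70.0, 12.0, 30.0, 0.0, False))  # True (seat pressure)
print(should_wake(0.0, 0.0, 30.0, 0.0, False))    # False
```

Because the conditions are OR-ed, any single signal suffices, which matches the text's point that several different signals can each trigger the mode.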
  • the occupant protection mode can also be manually triggered by the driver, that is, when the driver leaves the vehicle or after leaving the vehicle, the driver manually triggers the occupant protection mode through a specific interaction between himself and the vehicle.
  • the driver can control the activation of the occupant protection mode by operating an application (APP) installed on his mobile phone or other electronic device terminal, or by operating the screen of the vehicle central control unit.
  • the occupant protection mode thus has a variety of triggering methods, which provides users with a robust fool-proof design and helps ensure that the occupant protection mode is turned on when necessary.
  • the occupants in the car can be protected more systematically. Therefore, through the above process, the safety of the occupants in the car can be better guaranteed and unnecessary casualties can be reduced.
  • the first data and the second data are processed at a preset low frequency to determine the occupant information and the environment information in the vehicle, respectively.
  • at least one of the other functions in the vehicle that are not related to the occupant protection mode, such as the driving function, the entertainment function, and the navigation function, is turned off. The calculation units in the vehicle that are not related to the occupant protection mode, such as the driving strategy calculation unit, are also turned off.
  • the system can work at a lower frequency, and/or control the function modules not related to the occupant protection mode to be turned off, thereby reducing the power consumption of the system.
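A minimal sketch of this low-power behavior, assuming hypothetical module names and an illustrative sampling frequency; neither is specified in the text.

```python
# Sketch of low-power operation in occupant protection mode: sensor data is
# sampled at a preset low frequency while unrelated function modules are
# switched off. Module names and the 0.5 Hz default rate are assumptions.

import time

LOW_FREQ_HZ = 0.5  # illustrative preset low sampling frequency

def run_protection_cycle(modules, sample_fn, cycles, freq_hz=LOW_FREQ_HZ):
    """Disable unrelated modules, then sample sensor data at a low rate."""
    for name in ("driving", "entertainment", "navigation", "driving_strategy"):
        modules[name] = False          # functions unrelated to protection
    samples = []
    for _ in range(cycles):
        samples.append(sample_fn())    # process the first and second data
        time.sleep(1.0 / freq_hz)      # wait for the next low-frequency tick
    return samples

modules = {"driving": True, "entertainment": True, "navigation": True,
           "driving_strategy": True, "occupant_protection": True}
run_protection_cycle(modules, lambda: ("occupant_info", "env_info"),
                     cycles=1, freq_hz=50.0)
print(modules["driving"], modules["occupant_protection"])  # False True
```

The occupant-protection module itself stays enabled; only the unrelated modules are turned off, which is what reduces system power consumption.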
  • S802 Determine the type of abnormal state and the degree of abnormality according to the information of the occupants in the vehicle and the information of the environment in the vehicle.
  • the abnormal state types include abnormal occupants in the vehicle and/or abnormal environment in the vehicle.
  • the abnormality of the occupant in the car includes at least one of abnormal behavior of the occupant in the car and abnormal sound of the occupant in the car; the abnormality of the car environment includes at least one of an abnormal image of the car environment, abnormal sound of the car environment, abnormal air quality in the car, and abnormal temperature in the car.
  • the preset threshold values and the pre-trained evaluation model are combined to determine the abnormal state type in the vehicle.
  • the pre-trained evaluation model may be a neural network model, which may be obtained through model training based on historical in-vehicle occupant information, historical in-vehicle environment information, and deep learning algorithms.
  • If the behavior information of the occupant in the car is abnormal, the abnormal state type is determined to be abnormal behavior of the occupant in the car. If the volume of the occupant in the car exceeds the first preset decibel threshold corresponding to the occupant's voiceprint, the abnormal state type is abnormal voice of the occupant in the car; if the semantic information of the occupant's voice is a call for help, the abnormal state type is abnormal voice of the occupant in the car; if the volume of the occupant exceeds the first preset decibel threshold corresponding to the occupant's voiceprint and the semantic information of the occupant's voice is a call for help, the abnormal state type is abnormal voice of the occupant in the car. If the image information of the environment in the car is a fire or a car accident, the abnormal state type is an abnormal image of the environment in the car. If the air quality in the car exceeds the preset air quality threshold, the environment in the car is abnormal and the abnormal state type is abnormal air quality in the car. If the temperature inside the car exceeds the preset temperature threshold, the environment inside the car is abnormal and the abnormal state type is abnormal temperature inside the car.
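The threshold rules of S802 can be sketched as follows. The threshold values and the per-voiceprint decibel table are illustrative assumptions; a real system would use the pre-trained evaluation model described above.

```python
# Sketch of the S802 rules: mapping occupant and environment information onto
# abnormal state types via preset thresholds. All values are illustrative.

FIRST_DECIBEL_THRESHOLD = {"baby": 75.0, "adult": 85.0}  # per voiceprint
AIR_QUALITY_THRESHOLD = 1500.0   # e.g. CO2 ppm, hypothetical
TEMPERATURE_THRESHOLD = 40.0     # degrees Celsius, hypothetical

def abnormal_state_types(occupant, environment):
    """Return the set of abnormal state types for the given information."""
    types = set()
    voiceprint = occupant.get("voiceprint", "adult")
    if occupant.get("volume_db", 0) > FIRST_DECIBEL_THRESHOLD.get(voiceprint, 85.0):
        types.add("occupant voice abnormal")     # volume over first threshold
    if occupant.get("semantics") == "call for help":
        types.add("occupant voice abnormal")     # distress semantics
    if environment.get("image") in ("fire", "car accident"):
        types.add("environment image abnormal")
    if environment.get("air_quality", 0) > AIR_QUALITY_THRESHOLD:
        types.add("air quality abnormal")
    if environment.get("temperature_c", 0) > TEMPERATURE_THRESHOLD:
        types.add("temperature abnormal")
    return types

print(sorted(abnormal_state_types(
    {"voiceprint": "baby", "volume_db": 80.0},
    {"image": "fire", "temperature_c": 55.0})))
# ['environment image abnormal', 'occupant voice abnormal', 'temperature abnormal']
```

Several types can hold at once, which is why the result is a set rather than a single label.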
  • the occupant information in the vehicle and the environment information in the vehicle during the same time period are fused to obtain the fusion information used to describe the current in-vehicle scene.
  • the fusion information can be a simple listing of the occupant information and the in-car environment information, or an overall description of the current in-car scene. The obtained fusion information is then analyzed to determine, according to the influence of the current in-car scene described by the fusion information on the occupants, the degree of abnormality of the current in-car conditions. If the current in-car scene does not affect the safety of the occupants in the car, the degree of abnormality is low; if it affects the occupants but the impact can be resolved by emergency measures, that is, the risk is controllable, the degree of abnormality is relatively high; if it affects the occupants and emergency measures can only alleviate the impact, so that the safety of the occupants in the car is still affected, the degree of abnormality is high.
  • the analysis method used when analyzing the fusion information may be semantic analysis.
  • the occupant information in the vehicle and the environmental information in the vehicle are fused to obtain the fused information.
  • If the current in-car scene described by the fusion information is "baby crying", it can be determined that there is no abnormality in the current scene other than the baby crying; the scene can be considered not to affect the safety of the baby in the car, so the degree of abnormality is low. If the current in-car scene described by the fusion information is "baby crying and the temperature in the car is abnormal", the abnormal temperature affects the safety of the baby in the car, but it can be adjusted by the in-car temperature adjustment device to relieve its influence on the baby, so the degree of abnormality is relatively high. If the current in-car scene described by the fusion information is "baby crying and the car is on fire", the fire has a greater impact on the baby and the situation can only be alleviated by the in-car fire extinguishing device, so the degree of abnormality is high.
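The three-level grading illustrated by the "baby crying" examples can be sketched as below. The split of factors into "resolvable" versus "alleviate-only" is an illustrative assumption based on the examples, not an exhaustive rule from the text.

```python
# Sketch of the three-level grading: low when the scene does not affect
# occupant safety, relatively high when emergency measures can resolve the
# impact, high when they can only alleviate it. The factor sets are
# illustrative, mirroring the "baby crying" examples.

RESOLVABLE = {"temperature abnormal", "air quality abnormal"}  # controllable risk
ALLEVIATE_ONLY = {"fire", "car accident"}                      # high risk

def degree_of_abnormality(scene_factors):
    """scene_factors: abnormal factors extracted from the fusion information."""
    if scene_factors & ALLEVIATE_ONLY:
        return "high"
    if scene_factors & RESOLVABLE:
        return "relatively high"
    return "low"

print(degree_of_abnormality({"baby crying"}))                          # low
print(degree_of_abnormality({"baby crying", "temperature abnormal"}))  # relatively high
print(degree_of_abnormality({"baby crying", "fire"}))                  # high
```

The worst factor present dominates, so a fire always grades the scene as high even if a resolvable factor is also present.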
  • S803 Determine emergency measures according to the type of abnormal state and the degree of abnormality.
  • the emergency measures are actions to reduce the degree of abnormality, including emergency communication and/or emergency control measures.
  • the communication content of emergency communication includes one or more of: vehicle location information, vehicle appearance information, license plate number information, in-car status information, in-car image data, in-car sound information, status information of sensors with abnormal data, emergency control measures, other information that helps the communication object evaluate the abnormal state in the vehicle, other information that helps the communication object quickly locate the vehicle, and other information that can improve the efficiency of rescue.
  • the communication methods of emergency communication include one or more of: SMS, MMS, voice call, using the communication link between the vehicle driving system and the driver's handheld terminal, sending alarm information to a cloud server through the network, and other methods that can deliver the communication content and establish a communication link between the vehicle driving system and the driver.
  • the emergency measures are determined to include emergency communication and/or emergency control measures. Specifically, if the degree of abnormality is low, the emergency measure is emergency communication, and the contact for the emergency communication is the first preset emergency contact, such as the driver. If the degree of abnormality is relatively high, the emergency measures are emergency communication and emergency control measures; the contact for the emergency communication is the first preset emergency contact, and the emergency control measures are determined according to the abnormal state type. If the degree of abnormality is high, the emergency measures are emergency communication and emergency control measures; the contacts for the emergency communication are the first preset emergency contact and the second preset emergency contact, and the emergency control measures are determined according to the abnormal state type.
  • the second preset emergency contact person is a person or organization that can provide rescue to the occupants in the vehicle, such as a driver or an emergency center.
  • the first preset emergency contact person and/or the second preset emergency contact person of the emergency communication may be preset by the system, or may be set by the user according to his specific needs.
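A minimal sketch of this degree-based selection of emergency measures, using placeholder names for the first and second preset emergency contacts; the per-type control measures are passed in rather than hard-coded, since the text determines them separately from the abnormal state type.

```python
# Sketch of the S803 decision rule: the degree of abnormality selects the
# emergency communication contacts and whether control measures are taken.
# "first contact"/"second contact" are placeholders for the preset contacts.

def plan_emergency(degree, control_for_type):
    """control_for_type: emergency control measures chosen per abnormal type."""
    if degree == "low":
        return {"communication": ["first contact"], "control": []}
    if degree == "relatively high":
        return {"communication": ["first contact"], "control": control_for_type}
    # high: notify both the driver and a rescuer such as an emergency center
    return {"communication": ["first contact", "second contact"],
            "control": control_for_type}

plan = plan_emergency("high", ["open fire extinguisher", "unlock doors"])
print(plan["communication"])  # ['first contact', 'second contact']
```

Either or both preset contacts may instead be configured by the user, as the text notes; the rule above only captures the default escalation by degree.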
  • Emergency control measures include measures to remind the occupants and other personnel in the vehicle, measures to adjust temperature, ventilation measures, and fire-fighting measures.
  • emergency control measures include one or more of: voice-reminding the occupants in the car, opening the windows, opening the doors, turning on air purification equipment, turning on temperature regulation equipment, turning on the fire extinguishing device, unlocking/opening the doors, honking, flashing the lights, braking, opening/closing the sunroof, and other measures the vehicle driving system can take by obtaining vehicle control authority.
  • If the abnormal state type is abnormal temperature in the car, the emergency control measure is at least one of opening the windows, opening the doors, and turning on the temperature adjustment device. If the abnormal state type is abnormal air quality in the car, the emergency control measure is at least one of opening the windows, opening the doors, and turning on the air purification device.
  • If the abnormal state type is abnormal behavior of the occupants in the car and/or abnormal voice of the occupants in the car, the emergency control measures are at least one of voice-reminding the occupants in the car, opening the windows, opening the doors, turning on the temperature adjustment device, turning on the fire extinguishing device, unlocking/opening the doors, honking, and flashing the car lights.
  • In some abnormal conditions, the emergency control measures include at least one of honking and flashing the lights. If the vehicle moves abnormally, such as rolling, or in other abnormal conditions that require braking, the emergency control measures at least include braking. If the abnormal conditions in the car can be resolved through the occupants' own intervention, or in other abnormal conditions where the occupants need to be reminded, the emergency control measures at least include a voice reminder to the occupants in the car.
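The pairings above between abnormal state types and emergency control measures can be sketched as a lookup table; the entries paraphrase the text and are illustrative rather than exhaustive or authoritative.

```python
# Illustrative lookup from abnormal state type to emergency control measures,
# paraphrasing the pairings in the text. Unknown types map to no measures.

CONTROL_MEASURES = {
    "temperature abnormal": ["open windows", "open doors",
                             "turn on temperature regulation"],
    "air quality abnormal": ["open windows", "open doors",
                             "turn on air purification"],
    "environment image abnormal": ["honk", "flash lights",
                                   "turn on fire extinguisher"],
    "vehicle moving abnormally": ["brake"],
    "occupant abnormal": ["voice reminder", "honk", "flash lights"],
}

def control_measures(abnormal_types):
    """Union of measures for all detected types, preserving order, no repeats."""
    measures = []
    for t in abnormal_types:
        for m in CONTROL_MEASURES.get(t, []):
            if m not in measures:
                measures.append(m)
    return measures

print(control_measures(["vehicle moving abnormally", "temperature abnormal"]))
# ['brake', 'open windows', 'open doors', 'turn on temperature regulation']
```

Deduplicating while preserving order matters because several abnormal types can share measures such as opening the windows.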
  • the emergency control measure may also be driving the vehicle to the nearest hospital.
  • the information flow of emergency communication can be one-way transmission, that is, the communication content of emergency communication is transmitted from the vehicle driving system to a specific communication object.
  • the information flow of emergency communication can also be multi-directional transmission, that is, the communication content of emergency communication is transmitted from the vehicle driving system to multiple communication objects.
  • the information flow of emergency communication can also be transmitted to each other, that is, the vehicle driving system can transmit information to the communication object, and can also receive the information transmitted by the communication object.
  • the occupants in the vehicle are first reminded by voice, so that they can take emergency control measures in time to alleviate the abnormal conditions in the vehicle. If the abnormal situation in the car is not alleviated, the driver can be notified through emergency communication; the driver can then determine, based on the abnormal state type and degree of abnormality in the car, whether emergency control measures are required, what emergency measures to take, and whether to further notify a person or organization that can provide first aid to the occupants in the vehicle. Alternatively, the driver or other relevant personnel can be notified directly through emergency communication, and they can determine, according to the abnormal state type and degree of abnormality in the vehicle, whether emergency control measures need to be taken, what kind of emergency measures to take, and whether to further notify a person or organization that can provide first aid to the occupants in the vehicle.
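The staged escalation just described can be sketched with placeholder callbacks standing in for the actual reminder and communication channels.

```python
# Sketch of the staged escalation: remind the occupants first, notify the
# driver if the abnormality persists, and escalate to a rescue organization
# if the driver decides it is needed. Callbacks are placeholders.

def escalate(abnormality_relieved_after_reminder, driver_wants_rescue,
             remind, notify_driver, notify_rescuer):
    remind()  # step 1: voice reminder so occupants can act themselves
    if abnormality_relieved_after_reminder:
        return "resolved by occupants"
    notify_driver()  # step 2: emergency communication to the driver
    if driver_wants_rescue:
        notify_rescuer()  # step 3: person/organization providing first aid
        return "rescue requested"
    return "driver handling"

log = []
result = escalate(False, True,
                  lambda: log.append("remind"),
                  lambda: log.append("driver"),
                  lambda: log.append("rescuer"))
print(result, log)  # rescue requested ['remind', 'driver', 'rescuer']
```

The direct-notification alternative in the text simply skips step 1 and begins at the driver notification.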
  • the content of emergency communication includes vehicle information, occupant information, and environment information in the vehicle, as well as emergency control measures recommended for users. For specific descriptions of emergency communication and emergency control measures, please refer to the above content, which will not be repeated here.
  • the embodiment of the present application provides an occupant protection method that relies on the vehicle driving system, which can maximize the use of in-vehicle sensor information for in-vehicle occupant perception and in-vehicle environment perception.
  • the sensor information in the car is processed and used to obtain information about the occupants and the environment in the car, and determine the type and degree of abnormality of the conditions in the car, so as to identify the abnormal conditions in the car in time.
  • the embodiments of the present application can also control various modules and on-board equipment in the vehicle, and take corresponding emergency measures for the type and degree of abnormal conditions in the vehicle, thereby reducing or avoiding the impact of abnormal conditions in the vehicle on the occupants, and protecting the occupants in the vehicle. Security.
  • FIG. 9 shows a schematic diagram of a possible structure of the occupant protection device involved in the above embodiments.
  • the occupant protection device includes an acquisition unit 901 and a processing unit 902.
  • the occupant protection device may also include other modules, or the occupant protection device may include fewer modules.
  • the acquiring unit 901 is configured to acquire information about occupants in the vehicle and information about the environment in the vehicle.
  • the occupant information in the car includes one or more of the behavior information of the occupants in the car and the sound information of the occupants in the car.
  • the acquiring unit 901 is configured to acquire the first data and the second data monitored by the sensor, where the sensor includes one or more of a seat pressure sensor, a camera, a sound sensor, an air quality sensor, and a temperature sensor.
  • According to the first data, the occupant information in the vehicle is determined, where the first data includes at least one of seat pressure data in the vehicle, image data of the occupant in the vehicle, and voice data of the occupant in the vehicle. According to the second data, the in-vehicle environment information is determined.
  • the second data includes at least one of image data of the environment in the vehicle, environmental sound data in the vehicle, air quality data in the vehicle, and temperature data in the vehicle.
  • the processing unit 902 may also be used to trigger the occupant protection mode through a wake-up signal.
  • the wake-up signal includes at least one of in-vehicle seat pressure data that exceeds a preset pressure threshold and lasts longer than a preset duration, and voice data of in-vehicle occupants that exceed a preset decibel threshold.
  • the processing unit 902 is configured to determine the type of abnormal state and the degree of abnormality according to the information of the occupants in the vehicle and the information of the environment in the vehicle.
  • the abnormal state types include abnormal occupants in the car and/or abnormal environment in the car.
  • the abnormal occupant in the car includes one or more of abnormal behavior of the occupant in the car and abnormal sound of the occupant in the car.
  • the abnormal environment in the car includes one or more of an abnormal image of the in-car environment, abnormal sound of the in-car environment, abnormal air quality in the car, and abnormal temperature in the car.
  • the behavior information of the occupants in the vehicle includes the location of the occupants in the vehicle, the posture of the occupants in the vehicle, and the facial expressions of the occupants in the vehicle.
  • the position of the occupants in the car includes the front seat and the rear seat.
  • the postures of the occupants in the car include sitting upright, curled up, and lying down.
  • the facial expressions of the occupants in the car include normal, happy, sad, angry, anxious, or uncomfortable.
  • the voice information of the occupants in the car includes the volume of the occupants in the car, the voiceprint of the occupants in the car, and the semantic information of the occupants in the car.
  • the semantic information of the voice of the occupants in the car includes asking for help, singing, and making phone calls.
  • the image information of the environment in the car includes normal, fire, car accident and so on.
  • if the posture of an occupant in the car is curled up or lying down, and the occupant's facial expression is uncomfortable, it is determined that the abnormal state type is abnormal behavior of the occupant in the car. If the volume of the occupant in the vehicle exceeds the first preset decibel threshold corresponding to the voiceprint of the occupant in the vehicle, and/or the semantic information of the voice of the occupant in the vehicle includes distress information, it is determined that the abnormal state type is an abnormal voice of the occupant in the vehicle. If the image information of the environment in the vehicle is a fire or a car accident, it is determined that the abnormal state type is an abnormal image of the environment in the vehicle.
  • if the in-vehicle environmental sound information exceeds the second preset decibel threshold, it is determined that the abnormal state type is abnormal in-vehicle environmental sound. If the air quality in the vehicle exceeds the preset air quality threshold, it is determined that the abnormal state type is abnormal air quality in the vehicle. If the temperature inside the vehicle exceeds the preset temperature threshold, it is determined that the abnormal state type is abnormal temperature inside the vehicle.
  • the processing unit 902 is configured to fuse the occupant information and the in-vehicle environment information in the same time period to obtain the fusion information used to describe the current in-vehicle scene. Analyze the fusion information to determine the degree of abnormality used to indicate the impact of the current in-vehicle scene on the occupants.
  • if the current scene in the car described by the fusion information is a crying baby, the degree of abnormality is low. If the current scene described by the fusion information is a crying baby and the temperature in the car is abnormal, the degree of abnormality is relatively high. If the current scene described by the fusion information is a crying baby and there is a fire in the car, the degree of abnormality is high.
  • the processing unit 902 is also used to determine emergency measures according to the type of abnormal state and the degree of abnormality, where the emergency measure is an operation to reduce the degree of abnormality.
  • emergency measures include emergency communication and/or emergency control measures.
  • the communication content of emergency communication includes one or more of vehicle location information, vehicle appearance information, license plate number information, in-vehicle status information, in-vehicle image data, in-vehicle sound information, status information of sensors with abnormal data, and emergency control measures.
  • the communication method of emergency communication is one or more of SMS, MMS, and voice call.
  • Emergency control measures include one or more of: giving a voice reminder to the occupants in the car, opening the windows, opening the doors, turning on the in-car air purification equipment, turning on the temperature regulation equipment, activating the fire extinguishing device, unlocking/opening the doors, sounding the horn, and flashing the lights.
  • the processing unit 902 is configured to determine that the emergency measure is emergency communication if the degree of abnormality is low, with the first preset emergency contact as the contact. If the degree of abnormality is relatively high, the emergency measures are determined to be emergency communication and emergency control measures, the contact for emergency communication is the first preset emergency contact, and the emergency control measures are determined according to the abnormal state type. If the degree of abnormality is high, the emergency measures are emergency communication and emergency control measures, the contacts for emergency communication include the first preset emergency contact and a second preset emergency contact, and the emergency control measures are determined according to the abnormal state type.
  • if the abnormal state type is abnormal air quality in the vehicle, the emergency control measure is at least one of opening the windows, opening the doors, and turning on the air purification device in the vehicle.
  • if the abnormal state type is abnormal temperature in the vehicle, the emergency control measure is at least one of opening the windows, opening the doors, and turning on the temperature regulating device.
  • if the abnormal state type is abnormal behavior of the occupants in the car and/or an abnormal voice of the occupants in the car, the emergency control measures are at least one of: giving a voice reminder to the occupants, opening the windows, opening the doors, turning on the temperature regulation device, activating the fire extinguishing device, unlocking/opening the doors, sounding the horn, and flashing the car lights.
  • if the abnormal state type is an abnormal image of the in-car environment and/or an abnormal sound of the in-car environment, the emergency control measures are at least one of: giving a voice reminder to the occupants, opening the windows, opening the doors, turning on the temperature regulation device, activating the fire extinguishing device, unlocking/opening the doors, sounding the horn, and flashing the car lights.
  • the present application also provides an occupant protection device, including a processor 1001 and a memory 1002.
  • the processor 1001 and the memory 1002 are connected (for example, connected to each other through a bus 1004).
  • the occupant protection device may further include a transceiver 1003, which is connected to the processor 1001 and the memory 1002, and the transceiver 1003 is used to receive/send data.
  • the processor 1001 may perform operations of the implementation solution corresponding to FIG. 8 and various feasible implementation manners thereof. For example, it is used to perform operations of the obtaining unit 901 and the processing unit 902, and/or other operations described in the embodiment of the present application.
  • the present application also provides an occupant protection device, including a non-volatile storage medium and a central processing unit.
  • the non-volatile storage medium stores an executable program.
  • the central processing unit is connected to the non-volatile storage medium and executes the executable program to implement the occupant protection method shown in FIG. 8 in the embodiment of the present application.
  • Another embodiment of the present application further provides a computer-readable storage medium.
  • the computer-readable storage medium includes one or more program codes, and the one or more programs include instructions.
  • when a processor executes the program codes, the occupant protection device implements the occupant protection method shown in FIG. 8.
  • Another embodiment of the present application provides a computer program product, which includes computer-executable instructions stored in a computer-readable storage medium.
  • the at least one processor of the occupant protection device can read the computer-executable instruction from the computer-readable storage medium, and the at least one processor executes the computer-executable instruction to make the occupant protection device execute the corresponding steps in the occupant protection method shown in FIG. 8.
  • the foregoing embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof.
  • the above-mentioned embodiments may appear in the form of a computer program product in whole or in part, and the computer program product includes one or more computer instructions.
  • when the computer program instructions are loaded and executed on the computer, the processes or functions according to the embodiments of the present application are produced in whole or in part.
  • the computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable devices.
  • Computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium.
  • Computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center by wire (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or wirelessly (such as infrared, radio, or microwave).
  • the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server or data center integrated with one or more available media.
  • the usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, and a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (SSD)).
  • the disclosed device and method can be implemented in other ways.
  • the device embodiments described above are merely illustrative.
  • the division of the modules or units is only a logical function division; in actual implementation, there may be other division methods. For example, multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
  • the units described as separate parts may or may not be physically separate, and the parts displayed as a unit may be one physical unit or multiple physical units; that is, they may be located in one place or distributed across many different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solutions of the embodiments.
  • the functional units in the various embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated unit can be implemented in the form of hardware or software functional unit.
  • the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer readable storage medium.
  • the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions to cause a device (which may be a personal computer, a server, a network device, a single-chip microcomputer, a chip, etc.) or a processor to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage media include: USB flash drives, removable hard disks, read-only memory (ROM), random access memory (RAM), magnetic disks, optical discs, and other media that can store program code.


Abstract

An occupant protection method and apparatus for the field of communication technology, which can determine an abnormal in-vehicle state and the abnormality type from in-vehicle occupant information and in-vehicle environment information, and further determine emergency measures, so as to reduce or avoid the impact of abnormal in-vehicle conditions on occupants and thereby protect the safety of the occupants. The method includes: acquiring in-vehicle occupant information and in-vehicle environment information; determining an abnormal state type and a degree of abnormality from the occupant information and the environment information; and determining, from the abnormal state type and degree of abnormality, emergency measures that reduce the degree of abnormality.

Description

Occupant Protection Method and Apparatus
This application claims priority to the Chinese patent application No. 201910996188.2, filed with the China National Intellectual Property Administration on October 18, 2019, which in turn claims priority to the Chinese patent application No. 201910815590.6, entitled "Method and device for occupant protection in a smart car", filed with the China National Intellectual Property Administration on August 30, 2019. The entire contents of the above applications are incorporated herein by reference.
Technical Field
This application relates to the field of communication technology, and in particular to an occupant protection method and apparatus.
Background
In the prior art, a driver leaving the vehicle may, through oversight or for other reasons, leave other occupants trapped inside. When an abnormal situation occurs in the vehicle, trapped occupants (especially children) may be unable to leave the vehicle on their own or take effective protective measures, for example because they are unfamiliar with the vehicle, resulting in avoidable casualties.
Summary
This application provides an occupant protection method and apparatus to reduce or avoid the impact of abnormal in-vehicle conditions on occupants and to protect the safety of the occupants in the vehicle.
To achieve the above objective, this application adopts the following technical solutions:
In a first aspect, an embodiment of this application provides an occupant protection method, applied in a vehicle, in an apparatus of another device capable of controlling the vehicle (such as a cloud server or a mobile terminal), in a chip system in the vehicle, or in an operating system and driver running on a processor. The method includes: acquiring in-vehicle occupant information and in-vehicle environment information, where the occupant information includes one or more of occupant behavior information and occupant sound information, and the environment information includes one or more of in-vehicle environment image information, in-vehicle environment sound information, in-vehicle air quality information, and in-vehicle temperature information; determining an abnormal state type and a degree of abnormality from the occupant information and the environment information, where the abnormal state type includes an occupant abnormality and/or an in-vehicle environment abnormality, the occupant abnormality includes one or more of abnormal occupant behavior and abnormal occupant sound, and the environment abnormality includes one or more of an abnormal environment image, abnormal environment sound, abnormal air quality, and abnormal temperature in the vehicle; and determining emergency measures from the abnormal state type and degree of abnormality, where an emergency measure is an operation that reduces the degree of abnormality.
With the above occupant protection method, the in-vehicle occupant information and environment information are acquired, the abnormal state type and degree of abnormality are determined from them, and emergency measures are then determined accordingly. Because corresponding emergency measures are taken after the abnormal state type and degree are determined, the impact of abnormal in-vehicle conditions on the occupants can be reduced or avoided, protecting the safety of the occupants.
In a possible design, acquiring the occupant information and the environment information includes: acquiring first data and second data monitored by sensors, where the sensors include one or more of a seat pressure sensor, a camera, a sound sensor, an air quality sensor, and a temperature sensor; determining the occupant information from the first data, where the first data includes at least one of in-vehicle seat pressure data, occupant image data, and occupant sound data; and determining the environment information from the second data, where the second data includes at least one of environment image data, environment sound data, air quality data, and temperature data in the vehicle.
When the first data and second data monitored by the sensors are analyzed to determine the occupant information and the environment information, the more numerous and varied the sensors involved, the richer the resulting first and second data, and the more accurately the occupant information and environment information can be determined.
In a possible design, an occupant protection mode is triggered by a wake-up signal, where the wake-up signal includes at least one of in-vehicle seat pressure data that exceeds a preset pressure threshold for longer than a preset duration, and occupant sound data that exceeds a preset decibel threshold.
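The wake-up trigger described in this design can be sketched as a simple predicate. This is an illustrative sketch only: the function name, argument layout, and the numeric threshold values are assumptions, not part of the patent.

```python
# Hedged sketch of the wake-up signal check: either seat pressure stays above
# a preset threshold longer than a preset duration, or occupant sound exceeds
# a preset decibel threshold. All threshold values below are assumed.

PRESSURE_THRESHOLD_KG = 20.0  # assumed preset pressure threshold
DURATION_THRESHOLD_S = 30.0   # assumed preset duration
DECIBEL_THRESHOLD_DB = 60.0   # assumed preset decibel threshold


def should_wake(seat_pressure_kg, pressure_duration_s, occupant_sound_db):
    """Return True if either wake-up condition of the design holds."""
    pressure_trigger = (seat_pressure_kg > PRESSURE_THRESHOLD_KG
                        and pressure_duration_s > DURATION_THRESHOLD_S)
    sound_trigger = occupant_sound_db > DECIBEL_THRESHOLD_DB
    return pressure_trigger or sound_trigger
```

Either condition alone suffices, matching the "at least one of" wording of the design.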
In the occupant protection mode, the occupants can be protected more systematically; the above process therefore better guarantees occupant safety and reduces avoidable casualties.
In a possible design, after the occupant protection mode is triggered, the vehicle driving system operates at a preset low frequency to determine the occupant information from the first data and the environment information from the second data; and/or at least one of the driving function and the entertainment function is switched off.
Through the above process, the embodiments of this application allow the vehicle driving system to operate at low power in the occupant protection mode, saving resources.
In a possible design, the occupant behavior information includes the occupants' locations, postures, and facial expressions. The occupant locations include the front seats and the rear seats. The occupant postures include sitting upright, curled up, and lying down. The occupant facial expressions include normal, happy, sad, angry, anxious, or uncomfortable. The occupant sound information includes occupant volume, occupant voiceprint, and semantic information of the occupants' speech. The semantic information includes asking for help, singing, and making phone calls. The environment image information includes normal, fire, car accident, and so on.
In a possible design, determining the abnormal state type from the occupant information and the environment information specifically includes: if an occupant's posture is curled up or lying down and the occupant's facial expression is uncomfortable, determining that the abnormal state type is abnormal occupant behavior; if the occupant's volume exceeds a first preset decibel threshold corresponding to the occupant's voiceprint, and/or the semantic information of the occupant's speech contains a request for help, determining that the abnormal state type is abnormal occupant sound; if the environment image information indicates a fire or a car accident, determining that the abnormal state type is an abnormal environment image; if the environment sound information exceeds a second preset decibel threshold, determining that the abnormal state type is abnormal environment sound; if the in-vehicle air quality exceeds a preset air quality threshold, determining that the abnormal state type is abnormal air quality; and if the in-vehicle temperature exceeds a preset temperature threshold, determining that the abnormal state type is abnormal temperature.
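The threshold rules of this design can be sketched as a small rule-based classifier. The field names and all numeric thresholds below are illustrative assumptions; the patent only specifies that preset thresholds exist.

```python
# Hedged sketch: map structured in-car information to a list of abnormal
# state types, one rule per sentence of the design above.

def classify_abnormal_types(info):
    """info is an assumed dict of already-structured perception results."""
    types = []
    if (info["posture"] in ("curled_up", "lying")
            and info["expression"] == "uncomfortable"):
        types.append("occupant_behavior")
    if (info["volume_db"] > info["voiceprint_db_threshold"]  # per-voiceprint threshold
            or "help" in info["semantics"]):
        types.append("occupant_sound")
    if info["env_image"] in ("fire", "accident"):
        types.append("environment_image")
    if info["env_sound_db"] > 85.0:        # assumed second preset decibel threshold
        types.append("environment_sound")
    if info["air_quality_index"] > 150.0:  # assumed preset air quality threshold
        types.append("air_quality")
    if info["temperature_c"] > 35.0:       # assumed preset temperature threshold
        types.append("temperature")
    return types
```

Several types can hold at once, consistent with the "and/or" combinations in the design.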
In a possible design, determining the degree of abnormality from the occupant information and the environment information specifically includes: fusing the occupant information and the environment information from the same time period to obtain fusion information describing the current in-vehicle scene, and analyzing the fusion information to determine a degree of abnormality representing the impact of the current scene on the occupants.
In a possible design, analyzing the fusion information to determine the degree of abnormality specifically includes: if the current scene described by the fusion information is a crying baby, the degree of abnormality is low; if it is a crying baby together with an abnormal in-vehicle temperature, the degree of abnormality is relatively high; and if it is a crying baby together with a fire in the vehicle, the degree of abnormality is high.
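The crying-baby examples above can be sketched as a small grading function. The scene string, keyword arguments, and the three-level scale labels are assumptions for illustration only.

```python
# Hedged sketch of the degree-of-abnormality rule for the fused scene:
# baby crying alone -> low; plus abnormal temperature -> relatively high;
# plus fire -> high. Other scenes are left undecided in this sketch.

def degree_of_abnormality(scene, temperature_abnormal=False, fire=False):
    if scene != "baby_crying":
        return "unknown"   # sketch only covers the example scene
    if fire:
        return "high"
    if temperature_abnormal:
        return "relatively_high"
    return "low"
```

The more dangerous factor (fire) dominates when several factors are present.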
In a possible design, the emergency measures include emergency communication and/or emergency control measures. The communication content of the emergency communication includes one or more of vehicle location information, vehicle appearance information, license plate number information, in-vehicle state information, in-vehicle image data, in-vehicle sound information, status information of the sensors whose data is abnormal, and the emergency control measures. The communication method of the emergency communication is one or more of SMS, MMS, and voice call. The emergency control measures include one or more of: giving a voice reminder to the occupants, opening the windows, opening the doors, turning on the in-vehicle air purification equipment, turning on the temperature regulation equipment, activating the fire extinguishing device, unlocking/opening the doors, sounding the horn, and flashing the lights.
In a possible design, determining the emergency measures from the abnormal state type and degree of abnormality specifically includes: if the degree of abnormality is low, determining that the emergency measure is emergency communication, with the first preset emergency contact as the contact; if the degree of abnormality is relatively high, determining that the emergency measures are emergency communication and emergency control measures, with the first preset emergency contact as the contact and the control measures determined by the abnormal state type; and if the degree of abnormality is high, the emergency measures are emergency communication and emergency control measures, the contacts include the first and second preset emergency contacts, and the control measures are determined by the abnormal state type.
In a possible design, determining the emergency control measures from the abnormal state type specifically includes: if the abnormal state type is abnormal air quality, the control measures are at least one of opening the windows, opening the doors, and turning on the air purification equipment; if it is abnormal temperature, at least one of opening the windows, opening the doors, and turning on the temperature regulation equipment; if it is abnormal occupant behavior and/or abnormal occupant sound, at least one of giving a voice reminder to the occupants, opening the windows, opening the doors, turning on the temperature regulation equipment, activating the fire extinguishing device, unlocking/opening the doors, sounding the horn, and flashing the lights; and if it is an abnormal environment image and/or abnormal environment sound, at least one of giving a voice reminder to the occupants, opening the windows, opening the doors, turning on the temperature regulation equipment, activating the fire extinguishing device, unlocking/opening the doors, sounding the horn, and flashing the lights.
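The two selection rules above, degree selecting the contacts and whether control measures apply, and abnormal state type selecting which control measures, can be combined into one sketch. All names, labels, and the candidate measure lists are assumptions standing in for "at least one of" choices in the design.

```python
# Hedged sketch of emergency-measure planning. Degree drives contacts and
# whether control measures are issued; abnormal state type drives which ones.

FULL_MEASURES = ["voice_reminder", "open_windows", "open_doors",
                 "temperature_regulation_on", "fire_extinguisher_on",
                 "unlock_doors", "sound_horn", "flash_lights"]

CONTROL_BY_TYPE = {
    "air_quality": ["open_windows", "open_doors", "air_purifier_on"],
    "temperature": ["open_windows", "open_doors", "temperature_regulation_on"],
    "occupant_abnormal": FULL_MEASURES,
    "environment_abnormal": FULL_MEASURES,
}


def plan_emergency(degree, abnormal_type):
    contacts = ["first_preset_contact"]
    if degree == "high":
        contacts.append("second_preset_contact")
    # low degree: communication only, no control measures
    controls = [] if degree == "low" else CONTROL_BY_TYPE[abnormal_type]
    return {"contacts": contacts, "controls": controls}
```

In practice each list would be narrowed to the measures actually applicable; this sketch returns the full candidate set.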
In a second aspect, an embodiment of this application provides an occupant protection apparatus, applied in a vehicle, in another device capable of controlling the vehicle (such as a cloud server or a mobile terminal), in a chip system in the vehicle, or in an operating system and driver running on a processor. The apparatus includes an acquiring unit and a processing unit. The acquiring unit is configured to acquire in-vehicle occupant information and in-vehicle environment information, where the occupant information includes one or more of occupant behavior information and occupant sound information, and the environment information includes one or more of environment image information, environment sound information, air quality information, and temperature information in the vehicle. The processing unit is configured to determine an abnormal state type and a degree of abnormality from the occupant information and the environment information, where the abnormal state type includes an occupant abnormality and/or an environment abnormality, the occupant abnormality includes one or more of abnormal occupant behavior and abnormal occupant sound, and the environment abnormality includes one or more of an abnormal environment image, abnormal environment sound, abnormal air quality, and abnormal temperature; and to determine, from the abnormal state type and degree of abnormality, emergency measures that reduce the degree of abnormality.
In a possible design, the acquiring unit is specifically configured to acquire first data and second data monitored by sensors, where the sensors include one or more of a seat pressure sensor, a camera, a sound sensor, an air quality sensor, and a temperature sensor; to determine the occupant information from the first data, which includes at least one of seat pressure data, occupant image data, and occupant sound data; and to determine the environment information from the second data, which includes at least one of environment image data, environment sound data, air quality data, and temperature data.
In a possible design, the processing unit is further configured to trigger an occupant protection mode through a wake-up signal, where the wake-up signal includes at least one of seat pressure data that exceeds a preset pressure threshold for longer than a preset duration and occupant sound data that exceeds a preset decibel threshold.
In a possible design, the processing unit is further configured to operate at a preset low frequency, determining the occupant information from the first data and the environment information from the second data; and/or to switch off at least one of the driving function and the entertainment function.
In a possible design, the occupant behavior information includes the occupants' locations, postures, and facial expressions; the locations include the front and rear seats; the postures include sitting upright, curled up, and lying down; the facial expressions include normal, happy, sad, angry, anxious, or uncomfortable; the occupant sound information includes occupant volume, voiceprint, and semantic information of the occupants' speech; the semantic information includes asking for help, singing, and making phone calls; and the environment image information includes normal, fire, car accident, and so on.
In a possible design, the processing unit is specifically configured to: determine that the abnormal state type is abnormal occupant behavior if an occupant's posture is curled up or lying down and the facial expression is uncomfortable; determine abnormal occupant sound if the occupant's volume exceeds a first preset decibel threshold corresponding to the occupant's voiceprint and/or the semantic information contains a request for help; determine an abnormal environment image if the environment image information indicates a fire or a car accident; determine abnormal environment sound if the environment sound information exceeds a second preset decibel threshold; determine abnormal air quality if the air quality exceeds a preset air quality threshold; and determine abnormal temperature if the temperature exceeds a preset temperature threshold.
In a possible design, the processing unit is further configured to fuse the occupant information and environment information from the same time period into fusion information describing the current in-vehicle scene, and to analyze the fusion information to determine the degree of abnormality representing the scene's impact on the occupants.
In a possible design, the processing unit is further configured to determine that the degree of abnormality is low if the scene described by the fusion information is a crying baby; relatively high if it is a crying baby with an abnormal in-vehicle temperature; and high if it is a crying baby with a fire in the vehicle.
In a possible design, the emergency measures include emergency communication and/or emergency control measures. The communication content of the emergency communication includes one or more of vehicle location information, vehicle appearance information, license plate number information, in-vehicle state information, in-vehicle image data, in-vehicle sound information, status information of the sensors whose data is abnormal, and the emergency control measures. The communication method is one or more of SMS, MMS, and voice call. The emergency control measures include one or more of: giving a voice reminder to the occupants, opening the windows, opening the doors, turning on the air purification equipment, turning on the temperature regulation equipment, activating the fire extinguishing device, unlocking/opening the doors, sounding the horn, and flashing the lights.
In a possible design, the processing unit is further configured to: if the degree of abnormality is low, determine that the emergency measure is emergency communication with the first preset emergency contact; if relatively high, emergency communication with the first preset emergency contact plus emergency control measures determined by the abnormal state type; and if high, emergency communication whose contacts include the first and second preset emergency contacts, plus emergency control measures determined by the abnormal state type.
In a possible design, the processing unit is further configured to determine the emergency control measures from the abnormal state type: at least one of opening the windows, opening the doors, and turning on the air purification equipment for abnormal air quality; at least one of opening the windows, opening the doors, and turning on the temperature regulation equipment for abnormal temperature; and at least one of giving a voice reminder to the occupants, opening the windows, opening the doors, turning on the temperature regulation equipment, activating the fire extinguishing device, unlocking/opening the doors, sounding the horn, and flashing the lights for abnormal occupant behavior and/or abnormal occupant sound, as well as for an abnormal environment image and/or abnormal environment sound.
In a third aspect, an occupant protection apparatus is provided, including a processor, a memory, and a communication interface. The communication interface is used to communicate with other devices or a communication network; the memory is used to store computer-executable instructions; and when the occupant protection apparatus runs, the processor executes the computer-executable instructions stored in the memory, so that the apparatus performs the occupant protection method of the first aspect or any design thereof.
In a fourth aspect, an embodiment of this application further provides a computer-readable storage medium including instructions that, when run on a computer, cause the computer to perform the occupant protection method of the first aspect or any design thereof.
In a fifth aspect, an embodiment of this application further provides a computer program product including instructions that, when run on a computer, cause the computer to perform the occupant protection method of the first aspect or any design thereof.
Brief Description of the Drawings
FIG. 1 is a first schematic structural diagram of an autonomous vehicle according to an embodiment of this application;
FIG. 2 is a second schematic structural diagram of an autonomous vehicle according to an embodiment of this application;
FIG. 3 is a schematic structural diagram of a computer system according to an embodiment of this application;
FIG. 4 is a schematic structural diagram of a chip system according to an embodiment of this application;
FIG. 5 is a first schematic diagram of cloud-side instruction of an autonomous vehicle according to an embodiment of this application;
FIG. 6 is a second schematic diagram of cloud-side instruction of an autonomous vehicle according to an embodiment of this application;
FIG. 7 is a schematic structural diagram of a computer program product according to an embodiment of this application;
FIG. 8 is a schematic flowchart of an occupant protection method according to an embodiment of this application;
FIG. 9 is a first schematic structural diagram of an occupant protection apparatus according to an embodiment of this application;
FIG. 10 is a second schematic structural diagram of an occupant protection apparatus according to an embodiment of this application.
Detailed Description
The occupant protection method provided by the embodiments of this application is applied in a vehicle or in another device capable of controlling the vehicle (such as a cloud server or a mobile terminal). The vehicle may be an autonomous vehicle with partial or full self-driving capability; that is, with reference to the classification standard of the Society of Automotive Engineers (SAE), its level of driving automation may be no automation (L0), driver assistance (L1), partial automation (L2), conditional automation (L3), high automation (L4), or full automation (L5). The vehicle or other device can implement the occupant protection method of the embodiments through its components (hardware and software): acquiring the in-vehicle occupant information and environment information, determining the abnormal state type and degree of abnormality in the vehicle from the acquired information, and then determining corresponding emergency measures to reduce the degree of abnormality.
Taking an autonomous vehicle as an example, FIG. 1 is a functional block diagram of a vehicle 100 according to an embodiment of this application. The vehicle 100 may include various subsystems, such as a travel system 110, a sensor system 120, a control system 130, one or more peripheral devices 140, a power supply 150, a computer system 160, and a user interface 170. Optionally, the vehicle 100 may include more or fewer subsystems, and each subsystem may include multiple elements. In addition, the subsystems and elements of the vehicle 100 may be interconnected by wire or wirelessly.
The travel system 110 may include components that power the vehicle 100. In one embodiment, the travel system 110 may include an engine, an energy source, a transmission, and wheels. The engine may be an internal combustion engine, an electric motor, an air compression engine, or a combination of engine types, and converts the energy source into mechanical energy. There are many kinds of energy sources, such as gasoline, diesel, other petroleum-based fuels, propane, other compressed-gas-based fuels, ethanol, solar panels, batteries, and other sources of electricity; the energy source may also supply energy to other subsystems of the vehicle 100. The transmission may transmit mechanical power from the engine to the wheels to change the wheel speed, and may include at least one of a gearbox, a differential, and a drive shaft, where the drive shaft may be coupled to one or more axles of one or more wheels. In one embodiment, the transmission may also include other components, such as a clutch.
The sensor system 120 may include several sensors that sense information about the environment around the vehicle 100, for example at least one of a positioning system 121 (such as a GPS system, a BeiDou system, or another positioning system), an inertial measurement unit (IMU) 122, a radar 123, a laser rangefinder 124, and a camera 125.
The positioning system 121 may be used to estimate the geographic location of the vehicle 100. The IMU 122 senses changes in the position and orientation of the vehicle 100 based on inertial acceleration; in one embodiment, the IMU 122 may be a combination of an accelerometer and a gyroscope. The radar 123 may use radio signals to sense objects in the surrounding environment of the vehicle 100 and, besides sensing objects, may also sense their speed and/or heading. The laser rangefinder 124 may use lasers to sense objects in the environment of the vehicle 100 and, in some embodiments, may include one or more laser sources, a laser scanner, one or more detectors, and other system components. The camera 125 may be used to capture multiple images of the surroundings of the vehicle 100, and may be a still camera or a video camera.
The sensor system 120 also includes sensors of the internal systems of the vehicle 100. In the embodiments of this application, these include an in-car camera 1215, a seat pressure sensor 126, a sound sensor 127, an air quality sensor 128, a temperature sensor 129, a vibration sensor 1210, a touch sensor 1211, a humidity sensor 1212, a smoke sensor 1213, and a vehicle speed sensor 1214. The in-car camera 1215 may capture multiple images of the occupants and of the in-car environment. The seat pressure sensor 126 monitors the pressure data on each seat. The sound sensor 127 monitors occupant sound data and environment sound data. The air quality sensor 128 monitors the in-car air quality and acquires the related air quality data. The temperature sensor 129 monitors the in-car temperature. The vibration sensor 1210 captures vibration data occurring in the car. The touch sensor 1211 monitors touch data on the display of the in-car central control unit. The humidity sensor 1212 monitors in-car humidity data. The smoke sensor 1213 monitors in-car smoke concentration data. The vehicle speed sensor 1214 monitors vehicle speed data, to determine whether the vehicle is stationary. In addition, the internal sensors may include an air quality sensor, a fuel gauge, an oil temperature gauge, and the like. The data collected by one or more of these sensors can be used to detect objects and their corresponding characteristics (position, shape, temperature, speed, etc.); such detection and recognition is key to the safe operation of the vehicle 100 and to the safety of its occupants.
The control system 130 may control the operation of the vehicle 100 and its components, and may include various elements, for example at least one of a steering system 131, a throttle 132, a braking unit 133, a computer vision system 134, a route control system 135, and an obstacle avoidance system 136.
The steering system 131 is used to adjust the heading of the vehicle 100; for example, it may be a steering wheel system.
The throttle 132 controls the speed of the vehicle 100 by controlling the operating speed of the engine.
The braking unit 133 is used to decelerate the vehicle 100, and may use friction to slow the wheels. Optionally, the braking unit 133 may slow the wheels by converting their kinetic energy into electric current, or may take other forms to slow the wheels and thus control the speed of the vehicle 100.
In the embodiments of this application, the computer vision system 134 may process and analyze the images captured by the camera 125 and the in-car camera 1215 to identify objects and/or object features in the surroundings of the vehicle 100, as well as occupant behavior information and environment image information inside the vehicle 100. The objects and/or features in the surroundings include traffic signals, road boundaries, and obstacles; the occupant information inside the vehicle includes occupant facial expressions, occupant postures, and the like. In some embodiments, the computer vision system 134 may use at least one of an object recognition algorithm, a structure from motion (SFM) algorithm, video tracking, or other computer vision techniques to map the environment, track objects, estimate object speeds, determine the current in-car condition, and so on.
The route control system 135 determines the driving route of the vehicle 100. In some embodiments, it may determine the route for the vehicle 100 by combining sensor data from the sensor system 120 with one or more predetermined maps.
The obstacle avoidance system 136 identifies and evaluates obstacles and plans ways past potential obstacles in the surroundings of the vehicle 100, such as evasive detours.
Of course, optionally, the control system 130 may add components other than those described above, remove some of them, or replace them with others.
The vehicle 100 interacts with peripheral devices 140 such as external sensors, other vehicles, and other computer systems through the user interface 170. The peripheral devices 140 may include at least one of a wireless communication system 141, an on-board computer 142, a microphone 143, a speaker 144, and/or other peripherals. Illustratively, the on-board computer 142 may provide information to the vehicle 100 or its user and receive information from them through the user interface 170; in some embodiments it may be operated through a display. In other cases, the user interface 170 may also provide a means for the vehicle 100 to communicate with other devices located inside it. For example, the microphone 143 may receive audio (such as voice commands or other audio input) from the user of the vehicle 100 through the user interface 170; similarly, the speaker 144 may output audio to the user through the user interface 170.
The wireless communication system 141 may communicate wirelessly with one or more devices, either directly or via a communication network. For example, it may use 3G cellular communication such as CDMA, EVDO, or GSM/GPRS, 4G cellular communication such as LTE, or 5G cellular communication; it may also communicate with a wireless local area network (WLAN) using WiFi or other wireless protocols. In some embodiments, the wireless communication system 141 may communicate directly with devices, such as various vehicle communication systems, using an infrared link, Bluetooth, or ZigBee. Optionally, it may include one or more dedicated short range communications (DSRC) devices.
The power supply 150 may provide power to the various components of the vehicle 100. In one embodiment, the power supply 150 may be a rechargeable lithium-ion or lead-acid battery, with one or more battery packs configured as the power supply for the various components of the vehicle 100. In some embodiments, such as some all-electric vehicles, the power supply 150 and the energy source may be implemented together.
Some or all of the functions of the vehicle 100 are controlled by the computer system 160. The computer system 160 may include at least one processor 161 that executes instructions 1621 stored in a non-transitory computer-readable medium such as a data storage device 162. The computer system 160 may also control individual components or subsystems of the vehicle 100 in a distributed manner across multiple computer devices. The processor 161 may be any conventional processor, such as a commercially available central processing unit (CPU); optionally, it may be a dedicated device such as an application specific integrated circuit (ASIC) or another hardware-based processor. Although FIG. 1 functionally illustrates the processor, memory, and other elements in the same physical housing, those of ordinary skill in the art will understand that the processor, computer system, or memory may actually comprise multiple processors, computer systems, or memories that may or may not be stored within the same physical housing. For example, the memory may be a hard disk drive or another storage medium located in a different housing. Thus, a reference to a processor or computer system will be understood to include a reference to a collection of processors, computer systems, or memories that may or may not operate in parallel. Rather than using a single processor to perform the steps described here, some components, such as the steering and deceleration components, may each have their own processor that performs only the computations related to that component's function.
In the various aspects described here, the processor may be remote from the vehicle and communicate with it wirelessly. In other aspects, some of the processes described here are executed on a processor arranged within the vehicle while others are executed by a remote processor, including taking the steps necessary to execute a single maneuver.
In some embodiments, the data storage device 162 may contain instructions 1621 (e.g., program logic) that can be executed by the processor 161 to perform various functions of the vehicle 100, including those described above. The data storage device 162 may also contain additional instructions, including instructions to send data to, receive data from, interact with, and/or control one or more of the travel system 110, the sensor system 120, the control system 130, and the peripheral devices 140.
Besides the instructions 1621, the data storage device 162 may also store data, such as road maps, route information, the position and/or heading and/or speed of the vehicle 100, data of other vehicles, and other information; such data and related information may be used by the vehicle 100 and the computer system 160 when the vehicle 100 is in autonomous, semi-autonomous, and/or manual driving modes.
For example, in the embodiments of this application, the data storage device 162 obtains in-car environment information and occupant information from the sensor system 120 or other components of the vehicle 100. The environment information may be, for example, in-car environment image information such as a fire, or in-car air quality information; the occupant information may be occupant behavior information and/or occupant sound information. The data storage device 162 may also store state information of the vehicle itself and of other vehicles interacting with it, where the vehicle state information includes but is not limited to vehicle speed and the in-car scene. In addition, the data storage device 162 may obtain information such as the distances between surrounding obstacles and the vehicle, derived from the speed and range measurement functions of the radar 123. In this way, the processor 161 can obtain this information from the data storage device 162 and, based on the environment information of the vehicle's surroundings, the vehicle's own state information, and conventional emergency strategies, derive a final emergency strategy to control the vehicle 100 to take emergency measures and mitigate the abnormal in-car state.
The user interface 170 is used to provide information to or receive information from the user of the vehicle 100. Optionally, the user interface 170 may interact with one or more input/output devices in the set of peripheral devices 140, such as one or more of the wireless communication system 141, the on-board computer 142, the microphone 143, and the speaker 144.
The computer system 160 may control the vehicle 100 based on information obtained from the various subsystems (e.g., the travel system 110, the sensor system 120, and the control system 130) and from the user interface 170. For example, the computer system 160 may, according to information from the control system 130, control the steering system 131 to change the vehicle's heading so as to avoid obstacles detected by the sensor system 120 and the obstacle avoidance system 136. In some embodiments, the computer system 160 may control many aspects of the vehicle 100 and its subsystems.
Optionally, one or more of the above components may be installed separately from or associated with the vehicle 100. For example, the data storage device 162 may exist partially or completely separate from the vehicle 100. The above components may be communicatively coupled by wire and/or wirelessly.
Optionally, the above components are merely an example; in practical applications, components in the above modules may be added or deleted according to actual needs, and FIG. 1 should not be understood as limiting the embodiments of this application.
In the embodiments of this application, an autonomous vehicle such as the vehicle 100 above may determine adjustment instructions for the current in-car device states according to its in-car environment information and occupant information. The occupants of the vehicle 100 may be babies, youths, adults, the elderly, and so on. In some examples, each occupant in the car may be considered independently, and the emergency communication and emergency control measures of the vehicle 100 may be determined based on each occupant's own characteristics, such as voiceprint information and occupant volume.
Optionally, the vehicle 100 as an autonomous vehicle, or a computer device associated with it (such as the computer system 160, the computer vision system 134, or the data storage device 162 of FIG. 1), may judge the abnormal state type and degree of abnormality in the car based on the identified occupant information and environment information (e.g., a rolling-away vehicle, a fire, etc.) and determine corresponding emergency communication measures. Optionally, because a certain correlation exists between the occupant information and the environment information, the identified occupant information and environment information may also be considered as a whole to predict the degree of abnormality of the in-car condition. The vehicle 100 can determine the communication content, contacts, and communication method of the emergency communication based on the identified degree of abnormality of the in-car condition. Other factors may also be considered in determining the emergency communication instructions of the vehicle 100, such as the location of the vehicle 100, the environment outside the car, and the vehicle's speed. In other words, the autonomous vehicle can determine, from the identified occupant information and environment information, what kind of emergency communication the vehicle needs to undertake (e.g., whether the communication method is SMS or a phone call, and whether the contact is the driver or an emergency center).
In addition to providing instructions directing the autonomous vehicle to carry out emergency communication, the computer device may also provide instructions to adjust the states of various devices inside the vehicle 100, so that the autonomous vehicle follows the given emergency control measures and adjusts the states of the in-car devices to ensure occupant safety.
The vehicle 100 may be a car, truck, motorcycle, bus, boat, airplane, helicopter, lawn mower, recreational vehicle, amusement park vehicle, construction equipment, tram, golf cart, train, trolley, and the like; the embodiments of this application impose no particular limitation.
Referring to FIG. 2, a vehicle may illustratively include the following modules:
A perception module 201, configured to obtain the data monitored by the sensors on the vehicle and process the obtained data to determine the occupant information and the environment information. In more detail, the perception module 201 can be divided into an occupant perception module 2011 and an environment perception module 2012. The interaction between these modules and the on-board devices (the various sensors) is shown in FIG. 2, where the arrows indicate sensors transmitting data to the occupant perception module 2011 or the environment perception module 2012, and sensors connected on the same dashed line stand in an "and/or" relationship. The occupant perception module 2011 obtains the first data through the in-car camera, seat pressure sensors, sound sensor, vibration sensors, and touch sensor; the first data includes occupant image data, seat pressure data, occupant sound data, seat vibration data, and display touch data. The environment perception module 2012 obtains the second data through the in-car camera, sound sensor, air quality sensor, temperature sensor, humidity sensor, smoke sensor, and vehicle speed sensor; the second data includes environment image data, environment sound data, air quality data, temperature data, humidity data, smoke concentration data, and vehicle speed data. In addition, the perception module 201 can analyze the obtained first and second data to derive the occupant information and environment information, and pass this information to the decision module 202 so that emergency communication instructions and/or emergency control instructions can be issued.
A decision module 202, configured to receive the occupant information and environment information sent by the perception module 201, analyze the received information, and, when the in-car condition is abnormal, determine the abnormal state type of the in-car condition. It fuses the occupant information and environment information from the same time period into fusion information describing the current in-car scene, analyzes the fusion information to determine the degree of abnormality of the in-car condition, and then determines emergency measures (including emergency communication and/or emergency control measures) from the abnormal state type and degree of abnormality, issuing the corresponding emergency control instructions to the execution module 203 and the emergency communication instructions to the communication module 204.
An execution module 203, configured to receive the emergency control instructions issued by the decision module 202 and execute the corresponding emergency control measures.
A communication module 204, configured to receive the emergency communication instructions issued by the decision module 202, establish an information transmission channel between the vehicle driving system and the driver or other preset emergency contacts and/or public emergency agencies, and transmit information that helps the recipient assess the vehicle's abnormal state and/or helps rescue to proceed.
In a possible implementation of the embodiments of this application, as shown in FIG. 3, the computer system 160 of FIG. 1 includes a processor 301 coupled to a system bus 302. The processor 301 may be one or more processors, each of which may include one or more processor cores. A video adapter 303 can drive a display 324, which is coupled to the system bus 302. The system bus 302 is coupled to an input/output (I/O) bus 305 through a bus bridge 304. An I/O interface 306 is coupled to the I/O bus 305 and communicates with a variety of I/O devices, such as an input device 307 (e.g., keyboard, mouse, display), a media tray 308 (e.g., CD-ROM, multimedia interface), a transceiver 309 (which can send and/or receive radio communication signals), a camera 310 (which can capture still and moving digital video images), and an external universal serial bus (USB) interface 311. Optionally, the interface connected to the I/O interface 306 may be a USB interface.
The processor 301 may be any conventional processor, including a reduced instruction set computing (RISC) processor, a complex instruction set computing (CISC) processor, or a combination thereof. Optionally, the processor may be a dedicated device such as an application specific integrated circuit (ASIC). Optionally, the processor 301 may be a neural network processor or a combination of a neural network processor and a conventional processor.
Optionally, in the various embodiments described herein, the computer system 160 may be located remotely from the autonomous vehicle and may communicate wirelessly with the autonomous vehicle 100. In other respects, some of the processes described herein may be set to execute on a processor within the autonomous vehicle while others are executed by a remote processor, including taking the actions required to execute a single maneuver.
The computer system 160 may communicate with a software deploying server 313 through a network interface 312. The network interface 312 is a hardware network interface, such as a network card. The network 314 may be an external network, such as the Internet, or an internal network, such as Ethernet or a virtual private network (VPN). Optionally, the network 314 may also be a wireless network, such as a WiFi network or a cellular network.
A hard disk drive interface 315 is coupled to the system bus 302 and connected to a hard disk drive 316. A system memory 317 is coupled to the system bus 302. Data running in the system memory 317 may include the operating system (OS) 318 and application programs 319 of the computer system 160.
The operating system includes, but is not limited to, a shell 320 and a kernel 321. The shell 320 is an interface between the user and the kernel of the operating system. The shell is the outermost layer of the operating system and manages the interaction between the user and the operating system: it waits for user input, interprets the input to the operating system, and handles the varied output of the operating system.
The kernel 321 consists of those parts of the operating system that manage memory, files, peripherals, and system resources. Interacting directly with the hardware, the operating system kernel typically runs processes, provides inter-process communication, and provides CPU time-slice management, interrupts, memory management, I/O management, and other functions.
The application programs 319 include programs related to controlling the automatic driving of the car, such as programs managing the interaction between the autonomous vehicle and road obstacles, programs controlling the route or speed of the autonomous vehicle, and programs controlling the interaction between the autonomous vehicle and other autonomous vehicles on the road. The application programs 319 also exist on the system of the deploying server 313; in one embodiment, the computer system 160 may download the application programs 319 from the deploying server 313 when they need to be executed.
As another example, an application program 319 may be one that controls the vehicle to compute a final driving strategy from the above environment information, state information, and a conventional rule-based driving strategy, where the environment information describes the vehicle's current environment (green belts, lanes, traffic lights, etc.) and the state information describes target objects interacting with the vehicle (speed, acceleration, etc.). The processor 301 of the computer system 160 invokes this application program 319 to obtain the final driving strategy.
A sensor 322 is associated with the computer system 160 and is used to detect the environment around the computer system 160. For example, the sensor 322 can detect animals, cars, obstacles, crosswalks, and so on, and can further detect the environment around such objects: for instance, the environment around an animal, such as other animals appearing nearby, weather conditions, and the brightness of the surroundings. Optionally, if the computer system 160 is located on an autonomous vehicle, the sensor may be a camera, an infrared sensor, a chemical detector, a microphone, or the like.
In other embodiments of this application, the occupant protection method of the embodiments may also be executed by a chip system. FIG. 4 is a structural diagram of a chip system according to an embodiment of this application.
A neural processing unit (NPU) 40 is mounted as a coprocessor on the host CPU, which allocates tasks to the NPU. The core of the NPU is an operation circuit 403. Illustratively, a controller 404 controls the operation circuit 403 so that it can extract matrix data from memory and perform multiplication.
In some implementations, the operation circuit 403 internally includes multiple processing engines (PEs). In some implementations, the operation circuit 403 is a two-dimensional systolic array; it may also be a one-dimensional systolic array or other electronic circuitry capable of performing mathematical operations such as multiplication and addition. In some implementations, the operation circuit 403 is a general-purpose matrix processor.
For example, suppose there is an input matrix A, a weight matrix B, and an output matrix C. The operation circuit 403 fetches the data corresponding to the weight matrix B from the weight memory 402 and caches it on each PE in the operation circuit 403. It fetches the data of the input matrix A from the input memory 401 and performs matrix operations on A and B; partial or final results of the matrix operation can be stored in an accumulator 408.
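The accumulate-as-you-go matrix multiplication described for the operation circuit can be illustrated in plain Python. This is only a functional toy model of C = A·B with an explicit accumulator, not the NPU's actual dataflow or any real hardware API.

```python
# Toy model: weights B are "cached" once, input A is streamed one inner
# step k at a time, and partial products are summed into an accumulator
# (playing the role of accumulator 408) until the final result is ready.

def matmul_accumulate(A, B):
    rows, inner, cols = len(A), len(B), len(B[0])
    acc = [[0] * cols for _ in range(rows)]   # accumulator, starts at zero
    for k in range(inner):                    # stream one inner step at a time
        for i in range(rows):
            for j in range(cols):
                acc[i][j] += A[i][k] * B[k][j]   # add one partial product
    return acc
```

After the last inner step, the accumulator holds the final matrix product.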
As another example, the operation circuit 403 can be used to implement a feature extraction model (such as a convolutional neural network model): image data is input into the model, and the model's operations yield the image features, which are then output to a classifier that outputs the classification probabilities of the objects in the image.
A unified memory 406 stores input data and output data. Weight data in external memory is moved into the weight memory 402 directly through a direct memory access controller (DMAC) 405; input data in external memory may be moved into the unified memory 406 or into the input memory 401 through the DMAC.
A bus interface unit (BIU) 410 is used for the interaction between the AXI bus and the DMAC and an instruction fetch buffer 409; it is also used by the instruction fetch buffer 409 to fetch instructions from external memory, and by the memory access controller 405 to fetch the original data of the input matrix A or the weight matrix B from external memory.
The DMAC is mainly used to move input data from external memory (DDR) to the unified memory 406, to move weight data into the weight memory 402, or to move input data into the input memory 401.
A vector computation unit 407 may include multiple operation processing units and, where needed, performs further processing on the output of the operation circuit 403, such as vector multiplication, vector addition, exponentiation, logarithms, and magnitude comparison. It is mainly used for non-convolution/FC-layer computations in the neural network, such as pooling, batch normalization, and local response normalization.
In some implementations, the vector computation unit 407 stores the processed output vector in the unified memory 406. For example, the vector computation unit 407 may apply a nonlinear function to the output of the operation circuit 403, such as a vector of accumulated values, to generate activation values. In some implementations, the vector computation unit 407 generates normalized values, merged values, or both. In some implementations, the processed output vector can also serve as the activation input to the operation circuit 403, for example for use in subsequent layers of the neural network.
The controller 404 is connected to the instruction fetch buffer 409, and the instructions used by the controller 404 can be stored in the instruction fetch buffer 409.
As a possible implementation, the unified memory 406, the input memory 401, the weight memory 402, and the instruction fetch buffer 409 are all on-chip memories, while the external memory is private to this NPU hardware architecture.
With reference to FIGS. 1 to 3, the host CPU and the NPU cooperate to implement the corresponding algorithms for the functions required by the vehicle 100 in FIG. 1, by the vehicle shown in FIG. 2, and by the computer system 160 shown in FIG. 3.
In other embodiments of this application, the computer system 160 may also receive information from, or transfer information to, other computer systems. Alternatively, sensor data collected by the sensor system 120 of the vehicle 100 may be transferred to another computer, which processes the data. As shown in FIG. 5, data from the computer system 160 may be transmitted via a network to a cloud-side computer system 510 for further processing. The network and intermediate nodes may comprise various configurations and protocols, including the Internet, the World Wide Web, intranets, virtual private networks, wide area networks, local area networks, private networks using one or more companies' proprietary communication protocols, Ethernet, WiFi, HTTP, and various combinations of the foregoing. Such communication may be performed by any device capable of transmitting data to and from other computers, such as modems and wireless interfaces.
In one example, the other computer system 510 (located at a server) may include a server with multiple computers, such as a load-balanced server farm. To receive, process, and transmit data from the computer system 160, the other computer system 510 exchanges information with different nodes of the network. The server 520 may have a configuration similar to the computer system 160, with a processor 530, memory 540, instructions 550, and data 560.
In one example, the data 560 of the server 520 may include weather-related information. For example, the server 520 may receive, monitor, store, update, and transmit various weather-related information, such as precipitation, cloud, and/or temperature information and/or humidity information in the form of reports, radar information, forecasts, and so on.
Referring to FIG. 6, an example of interaction between an autonomous vehicle and a cloud service center (cloud server) is shown. The cloud service center may receive information (such as data collected by vehicle sensors or other information) from autonomous vehicles 613 and 612 within its environment 600 via a network 611 such as a wireless communication network.
Based on the received data, the cloud service center 620 runs its stored programs for determining the occupant information and environment information from the data monitored by the sensors, then determining the abnormal state type and degree of abnormality, and determining emergency measures according to the abnormal state type and degree of abnormality, so as to alert the relevant persons of the autonomous vehicles 613 and 612 and/or take corresponding emergency control measures.
Illustratively, the cloud service center 620 may provide portions of a map to the vehicles 613 and 612 through the network 611. In other examples, operations may be divided between different locations; for example, multiple cloud service centers may receive, validate, combine, and/or send information reports. In some examples, information reports and/or sensor data may also be sent between vehicles. Other configurations are also possible.
In some examples, the cloud service center 620 sends the autonomous vehicle a response indicating whether or not to alert the driver and other relevant persons to the abnormal in-car condition. For example, based on the collected sensor data and emergency measure information, the cloud service center 620 determines the contacts, content, and method of the emergency communication and the specific measures of the emergency control. The cloud service center 620 observes the video stream within its operating environment 600 or changes in the state of the vehicles 613 and 612, such as changes in speed, in the state of the occupants inside the vehicle, and in the in-car environment, and verifies the effect of the emergency measures, so that before the next emergency control it can evaluate this round of emergency measures by their effect and control the vehicle to take more accurate emergency measures.
In some embodiments, the disclosed methods may be implemented as computer program instructions encoded in a machine-readable format on a computer-readable storage medium or encoded on other non-transitory media or articles. FIG. 7 schematically shows a conceptual partial view of an example computer program product arranged according to at least some of the embodiments presented here, which includes a computer program for executing a computer process on a computing device. In one embodiment, the example computer program product 700 is provided using a signal bearing medium 701. The signal bearing medium 701 may include one or more program instructions 702 that, when run by one or more processors, may provide all or part of the functions described above for FIGS. 1 to 6, or all or part of the functions described in the subsequent embodiments. For example, referring to the embodiment shown in FIG. 8, one or more features of S801 to S803 may be undertaken by one or more instructions associated with the signal bearing medium 701. The program instructions 702 in FIG. 7 also describe example instructions.
In some examples, the signal bearing medium 701 may comprise a computer-readable medium 703, such as but not limited to a hard disk drive, a compact disc (CD), a digital video disc (DVD), digital tape, memory, read-only memory (ROM), or random access memory (RAM). In some implementations, the signal bearing medium 701 may comprise a computer-recordable medium 704, such as but not limited to memory, a read/write (R/W) CD, an R/W DVD, and so on. In some implementations, the signal bearing medium 701 may comprise a communication medium 705, such as but not limited to a digital and/or analog communication medium (e.g., fiber-optic cable, waveguide, wired communication link, wireless communication link, etc.). Thus, for example, the signal bearing medium 701 may be conveyed by a wireless form of the communication medium 705 (e.g., a wireless communication medium conforming to the IEEE 802.11 standard or another transmission protocol). The one or more program instructions 702 may be, for example, computer-executable instructions or logic-implementing instructions. In some examples, a computing device such as those described for FIGS. 1 to 6 may be configured to provide various operations, functions, or actions in response to the program instructions 702 conveyed to the computing device through one or more of the computer-readable medium 703, the computer-recordable medium 704, and/or the communication medium 705. It should be understood that the arrangements described here are for example purposes only. Accordingly, those skilled in the art will appreciate that other arrangements and other elements (e.g., machines, interfaces, functions, orders, and groups of functions) can be used instead, and that some elements may be omitted altogether depending on the desired result. In addition, many of the described elements are functional entities that may be implemented as discrete or distributed components, or in conjunction with other components in any suitable combination and location.
To identify abnormal in-car conditions in time and reduce or avoid their impact on the occupants, particularly protecting occupant safety in scenarios where the driver has left the vehicle, this application provides an occupant protection method, which may be executed by the processor 161, the processor 301, the NPU 40, or the processor 530 of the above embodiments. As shown in FIG. 8, the method includes steps S801 to S803:
S801. Acquire in-vehicle occupant information and in-vehicle environment information.
The occupant information includes at least one of occupant behavior information and occupant sound information; the behavior information includes at least one of the occupants' locations, postures, and facial expressions, and the sound information includes at least one of occupant volume, occupant voiceprint, and semantic information of the occupants' speech. The environment information includes at least one of environment image information, environment sound information, air quality information, and temperature information in the vehicle.
Illustratively, the occupant locations include the front seats and the rear seats; further, the front seats may be divided into the driver's seat and the front passenger seat, and the rear seats into the driver-side rear seat and the passenger-side rear seat. The occupant postures include sitting upright, curled up, lying down, and so on; the facial expressions include normal, happy, sad, angry, anxious, uncomfortable, and so on. The voiceprint information includes baby, youth, adult, elderly person, and so on; the semantic information includes asking for help, singing, making phone calls, and so on. The environment image information includes normal, fire, car accident, and so on.
Optionally, determining the occupant information and the environment information is the process of converting the unstructured data collected from the sensors into structured or semi-structured data. At least one of in-car seat pressure data, occupant image data, and occupant sound data, i.e. the first data, is acquired using at least one of the seat pressure sensors, the camera, and the sound sensor. At least one of environment image data, environment sound data, air quality data, and temperature data, i.e. the second data, is acquired using at least one of the camera, the sound sensor, the air quality sensor, and the temperature sensor. Then, because both the seat pressure data and the occupant image data reflect the occupants' locations to some extent, both are analyzed to determine the occupants' locations; the occupant image data is analyzed to determine the occupants' postures and facial expressions; and the occupant sound data in the first data is analyzed to determine occupant volume, voiceprint, and speech semantics. In this way, the occupant information is obtained.
Illustratively, seat pressure sensors A, B, C, and D are respectively installed at the four seats of the car, namely the driver's seat, the front passenger seat, the driver-side rear seat, and the passenger-side rear seat. Among these four sensors, suppose the seat pressure data collected by A and B exceeds the preset seat pressure threshold, while that collected by C and D does not. Combined with analysis of the occupant image data captured by the camera, it is determined that there is a passenger at the driver's seat but a suitcase placed on the front passenger seat. In sum, it can further be determined that there is only one occupant in the car and that this occupant is located at the driver's seat.
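The driver's-seat/suitcase example above, fusing seat pressure with camera detections, can be sketched as follows. The seat labels, detection labels, and the threshold value are illustrative assumptions.

```python
# Hedged sketch: a seat counts as occupied only when its pressure exceeds the
# preset threshold AND the camera sees a person (not luggage) there, so a
# suitcase on a pressurized seat is correctly ruled out.

PRESSURE_THRESHOLD = 20.0  # assumed preset seat pressure threshold


def locate_occupants(pressure_by_seat, camera_detections):
    occupied = []
    for seat, pressure in pressure_by_seat.items():
        if pressure > PRESSURE_THRESHOLD and camera_detections.get(seat) == "person":
            occupied.append(seat)
    return occupied
```

In the example, two seats exceed the pressure threshold, but only the driver's seat also has a person in the image, so a single occupant at the driver's seat is reported.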
车内环境图像信息包括正常、起火、车祸等,以车内起火为例,在摄像头所采集到的图像发生变化的同时,由于可燃物的燃烧会产生噪音、热量、烟雾、二氧化碳等,车内的声音传感器、温度传感器、空气质量传感器所收集到的车内环境声音数据、车内温度数据、车内空气质量数据等均可能会发生变化。因此,除分别对第二数据中的车内环境声音数据、车内空气质量数据、车内温度数据进行分析,确定车内环境声音信息、车内空气质量信息、车内温度信息外,还需要对车内环境图像数据、车内环境声音数据、车内空气质量数据、车内温度数据综合进行分析,确定车内环境图像信息。这样,可得到车内环境信息。
示例性的，若车内环境声音数据、车内温度数据、车内空气质量数据的数值均未升高，且车内环境图像数据正常，则确定车内环境图像信息为正常。若车内环境声音数据、车内温度数据、车内空气质量数据的数值中的一项或多项升高，且车内环境图像数据不正常，则确定车内环境图像信息为异常，例如起火。
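上述对多路传感数据进行综合、进而确定车内环境图像信息的判断方式，可示意如下（Python）。函数名与分支规则均为示例假设，仅用于说明“图像异常且有传感数值升高时判定为异常”的思路：

```python
def assess_environment_image(sound_up, temp_up, air_up, image_normal):
    """综合判断车内环境图像信息。
    sound_up/temp_up/air_up: 声音、温度、空气质量数据是否升高；
    image_normal: 车内环境图像数据是否正常。"""
    any_signal_up = sound_up or temp_up or air_up
    if not image_normal and any_signal_up:
        return "异常"           # 图像异常且有传感数值升高，例如起火
    if image_normal and not any_signal_up:
        return "正常"
    return "待进一步分析"        # 各路信号不一致时保守处理（示例假设）

print(assess_environment_image(False, False, False, True))
```

使用时可将各传感器的原始数据先与各自阈值比较，得到布尔量后再送入该函数。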
可选地,在获取第一数据和第二数据之后,可以根据神经网络模型对第一数据和第二数据进行分析,分别确定车内乘员信息和车内环境信息。其中,神经网络模型由历史数据以及深度学习算法进行模型训练得到。
可选的,若车辆中还安装有振动传感器、触摸传感器中的至少一种,则可以进一步利用振动传感器和触摸传感器中的至少一种,获取车内座椅振动数据、以及车内显示屏触摸数据中的至少一种。之后,若乘员在车内有所动作的话,在摄像头所采集到的车内乘员图像数据发生变化的同时,车内座椅压力数据也会随乘员的动作发生改变,则在确定车内乘员的运动信息时,需要对车内乘员图像数据、车内座椅压力数据以及车内座椅振动数据进行分析。另外,在车内乘员有所动作时,其位置可能发生改变,则在确定车内乘员位置时,需对车内乘员图像数据、车内座椅压力数据、车内座椅振动数据进行分析。对车内乘员图像数据进行分析确定车内乘员姿态以及车内乘员面部表情。对第一数据中的车内乘员声音数据进行分析,确定车内乘员音量、车内乘员声纹以及车内乘员声音语义信息。对第一数据中的车内显示屏触摸数据进行分析,确定车内乘员触摸显示屏信息。这样,可得到车内乘员信息。
示例性的，在车内的四个座椅处，即驾驶员座位、副驾驶座位、驾驶员侧后排座位和副驾驶员侧后排座位处，分别设置有座椅压力传感器A、B、C和D以及座椅振动传感器E、F、G、H。首先，在这4个座椅压力传感器中，A、B所收集到的座椅压力数据超过预设座椅压力阈值，其次，在座椅振动传感器中，E、F所接收到的座椅振动数据超过预设振动阈值。最后，结合对摄像头所采集的车内乘员图像数据的分析，确定仅驾驶员座位处有乘客。综上，可进一步确定车内仅有一位乘员，且该车内乘员从副驾驶座位转移到驾驶员座位，当前车内乘员所处位置为驾驶员座位。
可选的，若车辆中还安装有湿度传感器、烟雾传感器、车速传感器中的至少一种，则可以进一步利用湿度传感器、烟雾传感器、车速传感器中的至少一种，获取车内湿度数据、车内烟雾浓度数据、车辆速度数据中的至少一种。之后，对车辆速度数据进行分析确定车辆速度信息。车内环境图像信息包括正常、起火、车祸等，以车内起火为例，在摄像头所采集到的图像发生变化的同时，由于可燃物的燃烧会产生噪音、热量、烟雾、二氧化碳等，车内的声音传感器、温度传感器、烟雾传感器、空气质量传感器、湿度传感器所收集到的车内环境声音数据、车内温度数据、车内烟雾浓度数据、车内空气质量数据、车内湿度数据等均可能会发生变化。因此，除分别对第二数据中的车内环境声音数据、车内空气质量数据、车内温度数据、车内烟雾浓度数据、车内湿度数据进行分析，确定车内环境声音信息、车内空气质量信息、车内温度信息、车内烟雾信息、车内湿度信息外，还需要对车内环境图像数据、车内环境声音数据、车内空气质量数据、车内温度数据、车内烟雾浓度数据、车内湿度数据综合进行分析，确定车内环境图像信息。这样，可得到车内环境信息。
示例性的,若车内环境声音数据、车内温度数据、车内空气质量数据、车内烟雾浓度数据、车内湿度数据的数值均未升高,结合对车内环境图像数据的分析,若车内环境图像数据正常,则车内环境图像信息为正常。若车内环境声音数据、车内温度数据、车内空气质量数据、车内烟雾浓度数据、车内湿度数据的数值升高,结合对车内环境图像数据的分析,若车内环境图像数据不正常,则确定车内环境图像信息为起火。
示例性的,对第一数据进行分析后,确定车内乘员仅有一位,且该乘员位于副驾驶员侧后排座位。此时,若车辆速度数据超过预设速度阈值,车内环境图像信息为正常,则确定车辆发生溜车情况。
需要说明的是,在车内乘员感知和车内环境感知的过程中,均涉及到了车内摄像头以及声音传感器,实现了车内传感器的复用,车内传感器信息可以得到更加充分的利用。另外,在对传感器监测到的第一数据和第二数据进行分析,进而确定车内乘员信息和车内环境信息时,所涉及到的传感器的数量和种类越多,则得到的第一数据和第二数据越丰富,确定车内乘员信息和车内环境信息时的准确度越高。
在另一种可能的实现方式中,在确定车内乘员信息和车内环境信息之前,开启乘员保护模式。
可选的,乘员保护模式可以在驾驶员离开车辆后,或者在车辆驾驶系统关闭后,通过唤醒信号自动开启。其中,唤醒信号为可以反映车内有乘员存在的一项或多项传感器数据。
示例性的，若车内座椅压力数据超过预设压力阈值且持续时长超过预设时长，则所述车内座椅压力数据可作为唤醒信号，触发乘员保护模式。或者，若车内乘员声音数据中的车内乘员音量超过预设分贝阈值，则所述车内乘员声音数据可作为唤醒信号，触发乘员保护模式。或者，若车内座椅振动数据超过预设振动阈值，则所述车内座椅振动数据可作为唤醒信号，触发乘员保护模式。除上述特定的车内座椅压力数据、车内乘员声音数据和车内座椅振动数据外，车内显示屏触摸数据，以及车辆驾驶系统休眠或关闭后能够接收外部信息并自动上电、以启动车辆驾驶系统中的传感器或特定装置所获取的数据等，也可作为唤醒信号来唤醒乘员保护模式。
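上述唤醒信号的判断逻辑可示意如下（Python）。各阈值取值均为假设，仅用于说明“任一条件满足即可触发乘员保护模式”的判断方式：

```python
def should_wake(pressure, pressure_duration_s, volume_db, vibration=0.0,
                pressure_thr=20.0, duration_thr=30.0, db_thr=60.0, vib_thr=1.0):
    """判断是否存在唤醒信号以触发乘员保护模式。
    座椅压力持续超阈值、乘员音量超预设分贝阈值、座椅振动超预设振动阈值，
    三者满足其一即可；所有阈值数值均为示意性假设。"""
    pressure_wake = pressure > pressure_thr and pressure_duration_s > duration_thr
    sound_wake = volume_db > db_thr
    vibration_wake = vibration > vib_thr
    return pressure_wake or sound_wake or vibration_wake

# 座椅压力 25（超阈值）且持续 60 秒（超预设时长），触发唤醒
print(should_wake(25.0, 60.0, 0.0))
```

实际系统中该判断可运行在低功耗常供电单元上，由其在车辆驾驶系统休眠后监听各唤醒源。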
可选地,乘员保护模式也可以由驾驶员手动触发,即驾驶员在离开车辆时或者离开车辆之后,驾驶员通过其自身与车辆之间的特定交互方式,来手动触发乘员保护模式。例如,驾驶员可以通过操作其手机或其他电子设备终端上安装的应用程序(application,APP),或者通过操作车辆中央控制单元屏幕来控制乘员保护模式开启。
需要说明的是，乘员保护模式的触发方式多样，为用户提供了很好的触发乘员保护模式的防呆设计，可以确保乘员保护模式在必要情况下开启。另外，在乘员保护模式下，车内乘员可以得到更系统的保护，因此，通过上述过程，可以更好地保证车内乘员的安全，减少不必要的伤亡事故。
可选的,乘员保护模式开启后,以预设低频率来对第一数据和第二数据进行处理,分别确定车内乘员信息和车内环境信息。和/或,在乘员保护模式下,车辆内与乘员保护模式不相关的其他功能中的至少一项关闭,例如行驶功能、娱乐功能以及导航功能等。和/或,在乘员保护模式下,车辆内与乘员保护模式不相关的计算单元关闭,例如驾驶策略计算单元等。
在上述过程中,乘员保护模式开启后,系统可以以较低频率工作,和/或控制与乘员保护模式不相关的功能模块关闭,从而降低系统功耗。
需要说明的是,在乘员保护模式下,与该模式不相关的功能模块或者计算单元保持关闭,但车辆驾驶系统具有对这些功能模块或计算单元的控制权限。也就是说,在乘员保护模式下,车辆驾驶系统可以根据其需要唤醒或者打开原本保持关闭的功能模块或者计算单元,以调用某些功能或者完成某项计算任务。另外,驾驶员可以对车辆驾驶系统中与乘员保护模式相关的组件进行云端升级,也可以将乘员保护模式禁用。
S802、根据车内乘员信息以及车内环境信息,确定异常状态类型以及异常程度。
其中,异常状态类型包括车内乘员异常和/或车内环境异常。具体的,车内乘员异常包括车内乘员行为异常、车内乘员声音异常中的至少一种,所述车内环境异常包括车内环境图像异常、车内环境声音异常、车内空气质量异常、车内温度异常中的至少一种。
可选的,针对步骤S801中得到的车内乘员信息和车内环境信息,结合预设阈值以及预训练评估模型,来确定车内异常的异常状态类型。其中,预训练评估模型可以为神经网络模型,该神经网络模型可以根据历史车内乘员信息、历史车内环境信息以及深度学习算法进行模型训练确定。
示例性的,若车内乘员姿态为蜷缩或者躺,且车内乘员面部表情为不舒服,则车内乘员行为异常;若车内乘员面部表情为不舒服,则确定异常状态类型为车内乘员行为异常;若车内乘员音量超过车内乘员声纹对应的第一预设分贝阈值,则异常状态类型为车内乘员声音异常;若车内乘员声音语义信息为求救,则异常状态类型为车内乘员声音异常;若车内乘员音量超过车内乘员声纹对应的第一预设分贝阈值,且车内乘员声音语义信息为求救,则异常状态类型为车内乘员声音异常;若车内环境图像信息为起火或者车祸,则异常状态类型为车内环境图像异常;若车内环境声音信息超过第二预设分贝阈值,则车内环境异常,异常状态类型为车内环境声音异常。若车内空气质量超过预设空气质量阈值,则车内环境异常,异常状态类型为车内空气质量异常。若车内温度超过预设温度阈值,则车内环境异常,异常状态类型为车内温度异常。
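上述异常状态类型的判定规则可归纳为如下示例（Python）。其中字段名与各默认阈值均为示意性假设，实际实现中阈值可由预训练评估模型或标定数据给出：

```python
def classify_abnormal(info):
    """根据车内乘员信息与车内环境信息枚举异常状态类型。
    info 为包含各项信息的字典，键名与默认阈值均为示例假设。"""
    types = []
    if info.get("姿态") in ("蜷缩", "躺") and info.get("表情") == "不舒服":
        types.append("车内乘员行为异常")
    if (info.get("音量", 0) > info.get("第一预设分贝阈值", 70)
            or info.get("语义") == "求救"):
        types.append("车内乘员声音异常")
    if info.get("环境图像") in ("起火", "车祸"):
        types.append("车内环境图像异常")
    if info.get("环境声音", 0) > info.get("第二预设分贝阈值", 80):
        types.append("车内环境声音异常")
    if info.get("空气质量", 0) > info.get("预设空气质量阈值", 100):
        types.append("车内空气质量异常")
    if info.get("温度", 25) > info.get("预设温度阈值", 40):
        types.append("车内温度异常")
    return types

print(classify_abnormal({"环境图像": "起火", "温度": 55}))
```

同一时刻可能同时命中多条规则，因此返回的是异常状态类型的列表，供后续异常程度判定使用。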
可选的，在确定异常状态类型之后，对同一时间段内的车内乘员信息和车内环境信息进行融合，得到用于描述当前车内场景的融合信息。其中，融合信息可以是对车内乘员信息和车内环境信息的简单罗列，也可以是对于当前车内场景的总体描述。然后对得到的融合信息进行分析，根据融合信息所描述的当前车内场景对乘员的影响大小，确定当前车内状况的异常程度。具体的，若融合信息所描述的当前车内场景不影响车内乘员安全，或者说，车内乘员异常但车内环境无异常，则异常程度低；若融合信息所描述的车内场景对车内乘员的影响可通过应急措施解决，即风险可控，则异常程度较高；若融合信息所描述的当前车内场景对车内乘员的影响通过应急措施只能缓解，但车内乘员的安全依旧受到影响，则异常程度高。
可选的,若融合信息为文本信息,则对融合信息进行分析时所采用的分析方法可以为语义分析。
示例性的，对车内乘员信息以及车内环境信息进行融合，得到融合信息。若该融合信息所描述的当前车内场景为“婴儿啼哭”，则可以确定除婴儿啼哭外，当前车内场景中并无其他异常，则可认为当前车内场景并不能影响车内婴儿的安全，异常程度低；若该融合信息所描述的当前车内场景为“婴儿啼哭，且车内温度异常”，则可以确定在当前车内场景中，温度异常对车内婴儿安全造成了影响，但温度异常可通过车内温度调节设备来调节，从而解除温度异常对车内婴儿的影响，此时，异常程度较高；若该融合信息所描述的当前车内场景为“婴儿啼哭，且车内起火”，则在当前车内场景中，车内起火对于婴儿造成的影响较大，且该状况只能通过车内灭火装置缓解，则异常程度高。
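上述三级异常程度的判定逻辑可示意如下（Python）。函数的输入形式与分级规则均为示例假设，用以说明“乘员异常但环境无异常→低；环境异常但风险可控→较高；风险不可控→高”的判定思路：

```python
def assess_severity(occupant_abnormal, env_abnormal, env_controllable):
    """示例假设的异常程度判定。
    occupant_abnormal: 是否存在车内乘员异常；
    env_abnormal: 是否存在车内环境异常；
    env_controllable: 环境异常是否可通过应急措施解决（风险可控）。"""
    if occupant_abnormal and not env_abnormal:
        return "低"      # 例如仅婴儿啼哭
    if env_abnormal and env_controllable:
        return "较高"    # 例如婴儿啼哭且车内温度异常，可通过温度调节设备解除
    return "高"          # 例如婴儿啼哭且车内起火，应急措施只能缓解

print(assess_severity(True, True, False))
```

实际实现中，env_controllable 可由异常状态类型与车载设备能力（是否具备温度调节、灭火装置等）共同确定。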
S803、根据异常状态类型以及异常程度,确定应急措施。
其中,所述应急措施为减小异常程度的动作,包括应急通讯和/或应急控制措施。应急通讯的通讯内容包括车辆位置信息、车辆外观信息、车牌号信息、车内状态信息、车内图像数据、车内声音信息、数据异常的传感器状态信息、应急控制措施、其他有助于通讯对象评估车内异常状态的信息、其他有助于通讯对象快速定位车辆位置的信息,以及其他能够提高救援效率的信息中的一项或多项。应急通讯的通讯方式为短信、彩信、语音电话、借助车辆驾驶系统与驾驶员手持终端之间的通信链路的方式、通过网络向云端服务器发送报警信息的方式、以及其他能够发送通讯内容并建立车辆驾驶系统和驾驶员之间的通信链路的方式中的一种或多种。
根据异常程度的不同，确定应急措施中包括应急通讯和/或应急控制措施。具体的，若异常程度低，则确定应急措施为应急通讯，应急通讯的联系人为第一预设紧急联系人，例如驾驶员；若异常程度较高，则确定应急措施为应急通讯和应急控制措施，应急通讯的联系人为第一预设紧急联系人，应急控制措施根据异常状态类型来确定。若异常程度高，则确定应急措施为应急通讯和应急控制措施，应急通讯的联系人为第一预设紧急联系人和第二预设紧急联系人，应急控制措施根据异常状态类型确定。其中，第二预设紧急联系人为可以为车内乘员提供救援的人员或者机构，例如驾驶员或急救中心等。应急通讯的第一预设紧急联系人和/或第二预设紧急联系人可以是系统预先设定的，也可以是用户根据其具体需求来设定的。应急控制措施包括提醒车内乘员以及其他人员的措施、调节温度的措施、通风措施、灭火措施等。示例性的，应急控制措施包括语音提醒车内乘员、打开车窗、打开车门、打开空气净化设备、打开温度调节设备、打开灭火装置、解锁/打开车门、鸣笛、闪烁车灯、制动、打开/关闭天窗以及其他车辆驾驶系统通过获取车辆控制权限所能够采取的措施中的一项或多项。
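按异常程度确定应急措施的规则可归纳为如下示例（Python）。其中返回结构与联系人命名均为示意性假设，应急控制措施列表由异常状态类型另行确定后传入：

```python
def choose_measures(severity, control_measures=None):
    """按异常程度确定应急措施（规则与字段名均为示例假设）。
    低：仅应急通讯，联系第一预设紧急联系人；
    较高：应急通讯 + 应急控制措施；
    高：应急通讯（第一、第二预设紧急联系人）+ 应急控制措施。"""
    if severity == "低":
        return {"通讯联系人": ["第一预设紧急联系人"], "控制措施": []}
    if severity == "较高":
        return {"通讯联系人": ["第一预设紧急联系人"],
                "控制措施": control_measures or []}
    return {"通讯联系人": ["第一预设紧急联系人", "第二预设紧急联系人"],
            "控制措施": control_measures or []}

# 异常程度高且车内起火时，通知两级联系人并采取灭火等控制措施
print(choose_measures("高", ["打开灭火装置", "解锁/打开车门", "鸣笛"]))
```

该示例与后文“根据异常状态类型确定应急控制措施”的各分支配合使用：先由异常状态类型得到控制措施列表，再按异常程度决定通讯对象与是否执行控制。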
示例性的,若异常状态类型为车内温度异常,则应急控制措施为打开车窗、打开车门、打开温度调节设备中的至少一项。若异常状态类型为车内空气质量异常,则应急控制措施为打开车窗、打开车门、打开空气净化设备中的至少一种。若异常状态类型为车内乘员行为异常和/或车内乘员声音异常,则应急控制措施为语音提醒车内乘员、打开车窗、打开车门、打开温度调节设备、打开灭火装置、解锁/打开车门、鸣笛、闪烁车灯中的至少一项。若异常状态类型为车内环境图像异常和/或车内环境声音异常,则应急控制措施为语音提醒车内乘员、打开车窗、打开车门、打开温度调节设备、打开灭火装置、解锁/打开车门、鸣笛、闪烁车灯中的至少一项。
示例性的，若车内出现需要或者有必要提示车辆外交通参与者的异常状态，则应急控制措施至少包括鸣笛、闪烁车灯中的至少一项；若车辆发生异常移动，例如溜车，或者其他需要制动的异常状态时，则应急控制措施至少包括制动；若车内异常状态可由车内乘员介入并解决，或者其他需要对车内乘员进行提醒的异常状态，则应急控制措施至少包括语音提醒车内乘员。
示例性的,若车辆为完全自动驾驶车辆,且车内状况的异常程度高,则应急控制措施还可以为驾驶车辆到最近的医院等。
需要说明的是,应急通讯的信息流向可以是单向传递,也就是说,应急通讯的通信内容从车辆驾驶系统传送到某一特定通讯对象。可选的,应急通讯的信息流向也可以是多向传递,也就是说,应急通讯的通讯内容从车辆驾驶系统传送到多个通讯对象。可选的,应急通讯的信息流向也可以是相向传递,也就是说,车辆驾驶系统可以向通讯对象传送信息,也可以接收通讯对象传送的信息。
可选的,在确定车内的异常状态类型以及异常程度后,先语音提醒车内乘员,使得车内乘员及时采取应急控制措施缓解车内异常情况。若车内异常情况未得到缓解,可通过应急通讯通知驾驶员,由驾驶员根据车内的异常状态类型以及异常程度,确定是否需要采取应急控制措施、采取什么样的应急措施以及是否进一步通知可为车内乘员提供急救措施的人员或者机构。
可选的,在另一种可能的实现方式中,在确定车内的异常状态类型以及异常程度后,也可以直接通过应急通讯通知驾驶员或其他相关人员,由驾驶员或其他相关人员根据车内的异常状态类型以及异常程度,确定是否需要采取应急控制措施、采取什么样的应急措施以及是否进一步通知可为车内乘员提供急救措施的人员或者机构。应急通讯的内容除包括车辆信息、车内乘员信息、车内环境信息外,也可以包括为用户推荐的应急控制措施。具体关于应急通讯和应急控制措施的描述可以参见上述内容,在此不再赘述。
通过上述过程,本申请实施例提供了一种依托于车辆驾驶系统的乘员保护方法,可以最大限度的利用车内传感器信息进行车内乘员感知和车内环境感知,通过系统中丰富的计算单元对车内传感器信息进行处理和利用,得到车内乘员信息和车内环境信息,并确定车内状况的异常状态类型以及异常程度,从而及时识别车内异常状况。另外,本申请实施例还可以控制车辆内各个模块和车载设备,针对车内的异常状态类型和异常程度采取相应的应急措施,从而减少或避免车内异常状况对于乘员的影响,保护车内乘员的安全。
本申请实施例可以根据上述方法示例，对乘员保护装置进行功能模块的划分，在采用对应各个功能划分各个功能模块的情况下，图9示出上述实施例中所涉及的乘员保护装置的一种可能的结构的示意图。如图9所示，乘员保护装置包括获取单元901、处理单元902。当然，乘员保护装置还可以包括其他模块，或者乘员保护装置可以包括更少的模块。
获取单元901,用于获取车内乘员信息以及车内环境信息。其中,车内乘员信息包括车内乘员行为信息、车内乘员声音信息中的一种或多种,车内环境信息包括车内环境图像信息、车内环境声音信息、车内空气质量信息、车内温度信息中的一种或多种。
具体的,获取单元901,用于获取传感器监测到的第一数据以及第二数据,其中,传感器包括座椅压力传感器、摄像头、声音传感器、空气质量传感器、温度传感器中的一种或多种。根据第一数据,确定车内乘员信息,其中,第一数据包括车内座椅压力数据、车内乘员图像数据以及车内乘员声音数据中的至少一项。根据第二数据,确定车内环境信息,第二数据包括车内环境图像数据、车内环境声音数据、车内空气质量数据以及车内温度数据中的至少一项。
可选的,处理单元902,还可以用于通过唤醒信号触发乘员保护模式。其中,唤醒信号包括超过预设压力阈值且持续时长超过预设时长的车内座椅压力数据、超过预设分贝阈值的车内乘员声音数据中的至少一种。
处理单元902,用于根据车内乘员信息以及车内环境信息,确定异常状态类型以及异常程度。其中,异常状态类型包括车内乘员异常和/或车内环境异常,车内乘员异常包括车内乘员行为异常以及车内乘员声音异常中的一种或多种,车内环境异常包括车内环境图像异常、车内环境声音异常、车内空气质量异常以及车内温度异常中的一种或多种。
示例性的,车内乘员行为信息包括车内乘员所在位置,车内乘员姿态,车内乘员面部表情。车内乘员所在位置包括前排座位、后排座位。车内乘员姿态包括端坐、蜷缩、躺。车内乘员面部表情包括正常、开心、悲伤、生气、焦急或者不舒服。车内乘员声音信息包括车内乘员音量,车内乘员声纹,车内乘员声音语义信息。车内乘员声音语义信息包括求救、唱歌、打电话。车内环境图像信息包括正常、起火、车祸等等。
示例性的,若车内乘员姿态为蜷缩或者躺,且车内乘员面部表情为不舒服,则确定异常状态类型为车内乘员行为异常。若车内乘员音量超过车内乘员声纹对应的第一预设分贝阈值,和/或车内乘员声音语义信息包含求救信息,则确定异常状态类型为车内乘员声音异常。若车内环境图像信息为起火或车祸,则确定异常状态类型为车内环境图像异常。若车内环境声音信息超过第二预设分贝阈值,则确定异常状态类型为车内环境声音异常。若车内空气质量超过预设空气质量阈值,则确定异常状态类型为车内空气质量异常。若车内温度超过预设温度阈值,则确定异常状态类型为车内温度异常。
具体的,处理单元902,用于对同一时间段内的车内乘员信息和车内环境信息进行融合,得到用于描述当前车内场景的融合信息。对融合信息进行分析,确定用于表示当前车内场景对于乘员的影响的异常程度。
示例性的,若融合信息描述的当前车内场景为婴儿啼哭,则异常程度低。若融合信息描述的当前车内场景为婴儿啼哭,且车内温度异常,则异常程度较高。若融合信息描述的当前车内场景为婴儿啼哭,且车内起火,则异常程度高。
处理单元902,还用于根据异常状态类型以及异常程度,确定应急措施,其中,应急措施为减小异常程度的操作。
示例性的,应急措施包括应急通讯和/或应急控制措施。其中,应急通讯的通讯内容包括车辆位置信息、车辆外观信息、车牌号信息、车内状态信息、车内图像数据、车内声音信息、数据异常的传感器状态信息、应急控制措施中的一项或多项。应急通讯的通讯方式为短信、彩信、语音电话中的一种或多种。应急控制措施包括语音提醒车内乘员、打开车窗、打开车门、打开车内空气净化设备、打开温度调节设备、打开灭火装置、解锁/打开车门、鸣笛、闪烁车灯中的一项或多项。
具体的,处理单元902,用于若异常程度低,则确定应急措施为应急通讯,应急通讯的联系人为第一预设紧急联系人。若异常程度较高,则确定应急措施为应急通讯和应急控制措施,应急通讯的联系人为第一预设紧急联系人,应急控制措施根据异常状态类型确定。若异常程度高,则应急措施为应急通讯和应急控制措施,应急通讯的联系人包括第一预设紧急联系人和第二预设紧急联系人,应急控制措施根据所述异常状态类型确定。
示例性的,若异常状态类型为车内空气质量异常,则应急控制措施为打开车窗、打开车门、打开车内空气净化设备中的至少一种。若异常状态类型为车内温度异常,则应急控制措 施为打开车窗、打开车门、打开温度调节设备中的至少一种。若异常状态类型为车内乘员行为异常和/或车内乘员声音异常,则应急控制措施为语音提醒车内乘员、打开车窗、打开车门、打开温度调节设备、打开灭火装置、解锁/打开车门、鸣笛、闪烁车灯中的至少一项。若异常状态类型为车内环境图像异常和/或车内环境声音异常,则应急控制措施为语音提醒车内乘员、打开车窗、打开车门、打开温度调节设备、打开灭火装置、解锁/打开车门、鸣笛、闪烁车灯中的至少一项。
参见图10,本申请还提供一种乘员保护装置,包括处理器1001以及存储器1002。
处理器1001与存储器1002相连接(如通过总线1004相互连接)。
可选的,乘员保护装置还可包括收发器1003,收发器1003连接处理器1001和存储器1002,收发器1003用于接收/发送数据。
处理器1001,可以执行图8所对应的实施方案及其各种可行的实施方式的操作。比如,用于执行获取单元901和处理单元902的操作,和/或本申请实施例中所描述的其他操作。
关于处理器、存储器、总线和收发器的具体介绍,可参见上文,这里不再赘述。
本申请还提供一种乘员保护装置,包括非易失性存储介质,以及中央处理器,非易失性存储介质存储有可执行程序,中央处理器与非易失性存储介质连接,并执行可执行程序以实现本申请实施例如图8所示的乘员保护方法。
本申请另一实施例还提供一种计算机可读存储介质，该计算机可读存储介质包括一个或多个程序代码，该一个或多个程序代码包括指令，当处理器在执行该程序代码时，该乘员保护装置执行如图8所示的乘员保护方法。
在本申请的另一实施例中,还提供一种计算机程序产品,该计算机程序产品包括计算机执行指令,该计算机执行指令存储在计算机可读存储介质中。乘员保护装置的至少一个处理器可以从计算机可读存储介质读取该计算机执行指令,至少一个处理器执行该计算机执行指令使得乘员保护装置执行如图8所示的乘员保护方法中的相应步骤。
所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,上述描述的系统、装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
上述实施例可以全部或部分通过软件,硬件,固件或者其任意组合实现。当使用软件程序实现时,上述实施例可以全部或部分地以计算机程序产品的形式出现,计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行计算机程序指令时,全部或部分地产生按照本申请实施例的流程或功能。
其中，所述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。计算机指令可以存储在计算机可读存储介质中，或者从一个计算机可读存储介质向另一个计算机可读存储介质传输，例如，计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线(digital subscriber line,DSL))或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心传输。计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。该可用介质可以是磁性介质(例如，软盘、硬盘、磁带)、光介质(例如，DVD)或者半导体介质(例如固态硬盘(solid state disk,SSD))等。
通过以上的实施方式的描述，所属领域的技术人员可以清楚地了解到，为描述的方便和简洁，仅以上述各功能单元的划分进行举例说明，实际应用中，可以根据需要而将上述功能分配由不同的功能单元完成，即将装置的内部结构划分成不同的功能单元，以完成以上描述的全部或者部分功能。
在本申请所提供的几个实施例中,应该理解到,所揭露的装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述模块或单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个装置,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是物理上分开的,或者也可以不是物理上分开的,作为单元显示的部件可以是一个物理单元或多个物理单元,即可以位于一个地方,或者也可以分布到多个不同地方。在应用过程中,可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请实施例的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一个设备(可以是个人计算机,服务器,网络设备,单片机或者芯片等)或处理器(processor)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(read-only memory,ROM)、随机存取存储器(random access memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何在本申请揭露的技术范围内的变化或替换,都应涵盖在本申请的保护范围之内。

Claims (25)

  1. 一种乘员保护方法,其特征在于,包括:
    获取车内乘员信息以及车内环境信息;所述车内乘员信息包括车内乘员行为信息、车内乘员声音信息中的一种或多种;所述车内环境信息包括车内环境图像信息、车内环境声音信息、车内空气质量信息、车内温度信息中的一种或多种;
    根据所述车内乘员信息以及所述车内环境信息,确定异常状态类型以及异常程度;所述异常状态类型包括车内乘员异常和/或车内环境异常,所述车内乘员异常包括车内乘员行为异常以及车内乘员声音异常中的一种或多种,所述车内环境异常包括车内环境图像异常、车内环境声音异常、车内空气质量异常以及车内温度异常中的一种或多种;
    根据所述异常状态类型以及所述异常程度,确定应急措施,所述应急措施为减小所述异常程度的操作。
  2. 根据权利要求1所述的乘员保护方法,其特征在于,所述获取车内乘员信息以及车内环境信息,包括:
    获取传感器监测到的第一数据以及第二数据,所述传感器包括座椅压力传感器、摄像头、声音传感器、空气质量传感器、温度传感器中的一种或多种;
    根据所述第一数据,确定所述车内乘员信息,所述第一数据包括车内座椅压力数据、车内乘员图像数据以及车内乘员声音数据中的至少一项;
    根据所述第二数据,确定所述车内环境信息,所述第二数据包括车内环境图像数据、车内环境声音数据、车内空气质量数据以及车内温度数据中的至少一项。
  3. 根据权利要求1或2所述的乘员保护方法,其特征在于,所述方法还包括:
    通过唤醒信号触发乘员保护模式,所述唤醒信号包括超过预设压力阈值且持续时长超过预设时长的车内座椅压力数据、超过预设分贝阈值的车内乘员声音数据中的至少一种。
  4. 根据权利要求3所述的乘员保护方法,其特征在于,在所述触发乘员保护模式后,所述方法还包括:
    工作在预设低频率下,根据第一数据确定车内乘员信息;以及根据第二数据确定车内环境信息;
    和/或,控制行驶功能和娱乐功能中的至少一项关闭。
  5. 根据权利要求1-4任一项所述的乘员保护方法,其特征在于,所述车内乘员行为信息包括车内乘员所在位置,车内乘员姿态,车内乘员面部表情;所述车内乘员所在位置包括前排座位、后排座位;所述车内乘员姿态包括端坐、蜷缩、躺;所述车内乘员面部表情包括正常、开心、悲伤、生气、焦急或者不舒服;
    所述车内乘员声音信息包括车内乘员音量、车内乘员声纹、车内乘员声音语义信息;所述车内乘员声音语义信息包括求救、唱歌、打电话;
    所述车内环境图像信息包括正常、起火、车祸。
  6. 根据权利要求5所述的乘员保护方法,其特征在于,根据所述车内乘员信息、所述车内环境信息,确定异常状态类型,具体包括:
    若所述车内乘员姿态为蜷缩或者躺,且所述车内乘员面部表情为不舒服,则确定所述异常状态类型为车内乘员行为异常;
    若所述车内乘员音量超过所述车内乘员声纹对应的第一预设分贝阈值,和/或所述车内 乘员声音语义信息包含求救信息,则确定所述异常状态类型为车内乘员声音异常;
    若所述车内环境图像信息为起火或车祸,则确定所述异常状态类型为车内环境图像异常;
    若所述车内环境声音信息超过第二预设分贝阈值,则确定所述异常状态类型为车内环境声音异常;
    若所述车内空气质量超过预设空气质量阈值,则确定所述异常状态类型为车内空气质量异常;
    若所述车内温度超过预设温度阈值,则确定所述异常状态类型为车内温度异常。
  7. 根据权利要求5或6所述的乘员保护方法,其特征在于,根据所述车内乘员信息、所述车内环境信息,确定异常程度,具体包括:
    对同一时间段内的所述车内乘员信息和所述车内环境信息进行融合,得到融合信息,所述融合信息用于描述当前车内场景;
    对所述融合信息进行分析,确定异常程度,所述异常程度用于表示当前车内场景对于乘员的影响。
  8. 根据权利要求7所述的乘员保护方法,其特征在于,对所述融合信息进行分析,确定异常程度,具体包括:
    若所述融合信息描述的当前车内场景为婴儿啼哭,则所述异常程度低;
    若所述融合信息描述的当前车内场景为婴儿啼哭,且车内温度异常,则所述异常程度较高;
    若所述融合信息描述的当前车内场景为婴儿啼哭,且车内起火,则所述异常程度高。
  9. 根据权利要求1-8任一项所述的乘员保护方法,其特征在于,所述应急措施包括应急通讯和/或应急控制措施;其中,所述应急通讯的通讯内容包括车辆位置信息、车辆外观信息、车牌号信息、车内状态信息、车内图像数据、车内声音信息、数据异常的传感器状态信息、所述应急控制措施中的一项或多项;所述应急通讯的通讯方式为短信、彩信、语音电话中的一种或多种;所述应急控制措施包括语音提醒车内乘员、打开车窗、打开车门、打开车内空气净化设备、打开温度调节设备、打开灭火装置、解锁/打开车门、鸣笛、闪烁车灯中的一项或多项。
  10. 根据权利要求9所述的乘员保护方法,其特征在于,所述根据所述异常状态类型以及所述异常程度,确定应急措施,具体包括:
    若所述异常程度低,则确定所述应急措施为应急通讯,所述应急通讯的联系人为第一预设紧急联系人;
    若所述异常程度较高,则确定所述应急措施为应急通讯和应急控制措施,所述应急通讯的联系人为第一预设紧急联系人,所述应急控制措施根据所述异常状态类型确定;
    若所述异常程度高,则所述应急措施为应急通讯和应急控制措施,所述应急通讯的联系人包括第一预设紧急联系人和第二预设紧急联系人,所述应急控制措施根据所述异常状态类型确定。
  11. 根据权利要求10所述的乘员保护方法,其特征在于,所述应急控制措施根据所述异常状态类型确定,具体包括:
    若所述异常状态类型为车内空气质量异常,则所述应急控制措施为打开车窗、打开车门、打开车内空气净化设备中的至少一种;
    若所述异常状态类型为车内温度异常,则所述应急控制措施为打开车窗、打开车门、打开温度调节设备中的至少一种;
    若所述异常状态类型为车内乘员行为异常和/或车内乘员声音异常,则所述应急控制措施为语音提醒车内乘员、打开车窗、打开车门、打开温度调节设备、打开灭火装置、解锁/打开车门、鸣笛、闪烁车灯中的至少一项;
    若所述异常状态类型为车内环境图像异常和/或车内环境声音异常,则所述应急控制措施为语音提醒车内乘员、打开车窗、打开车门、打开温度调节设备、打开灭火装置、解锁/打开车门、鸣笛、闪烁车灯中的至少一项。
  12. 一种乘员保护装置,其特征在于,所述装置包括:
    获取单元,用于获取车内乘员信息以及车内环境信息;所述车内乘员信息包括车内乘员行为信息、车内乘员声音信息中的一种或多种;所述车内环境信息包括车内环境图像信息、车内环境声音信息、车内空气质量信息、车内温度信息中的一种或多种;
    处理单元,用于根据所述车内乘员信息以及所述车内环境信息,确定异常状态类型以及异常程度;所述异常状态类型包括车内乘员异常和/或车内环境异常,所述车内乘员异常包括车内乘员行为异常以及车内乘员声音异常中的一种或多种,所述车内环境异常包括车内环境图像异常、车内环境声音异常、车内空气质量异常以及车内温度异常中的一种或多种;
    所述处理单元,还用于根据所述异常状态类型以及所述异常程度,确定应急措施,所述应急措施为减小所述异常程度的操作。
  13. 根据权利要求12所述的乘员保护装置,其特征在于,
    所述获取单元,具体用于获取传感器监测到的第一数据以及第二数据,所述传感器包括座椅压力传感器、摄像头、声音传感器、空气质量传感器、温度传感器中的一种或多种;
    根据所述第一数据,确定所述车内乘员信息,所述第一数据包括车内座椅压力数据、车内乘员图像数据以及车内乘员声音数据中的至少一项;
    根据所述第二数据,确定所述车内环境信息,所述第二数据包括车内环境图像数据、车内环境声音数据、车内空气质量数据以及车内温度数据中的至少一项。
  14. 根据权利要求12或13所述的乘员保护装置,其特征在于,
    所述处理单元,还用于通过唤醒信号触发乘员保护模式,所述唤醒信号包括超过预设压力阈值且持续时长超过预设时长的车内座椅压力数据、超过预设分贝阈值的车内乘员声音数据中的至少一种。
  15. 根据权利要求14所述的乘员保护装置,其特征在于,
    所述处理单元,用于工作在预设低频率下,根据第一数据确定车内乘员信息;以及根据第二数据确定车内环境信息;
    和/或,控制行驶功能和娱乐功能中的至少一项关闭。
  16. 根据权利要求12-15任一项所述的乘员保护装置,其特征在于,所述车内乘员行为信息包括车内乘员所在位置,车内乘员姿态,车内乘员面部表情;所述车内乘员所在位置包括前排座位、后排座位;所述车内乘员姿态包括端坐、蜷缩、躺;所述车内乘员面部表情包括正常、开心、悲伤、生气、焦急或者不舒服;
    所述车内乘员声音信息包括车内乘员音量、车内乘员声纹、车内乘员声音语义信息;所述车内乘员声音语义信息包括求救、唱歌、打电话;
    所述车内环境图像信息包括正常、起火、车祸。
  17. 根据权利要求16所述的乘员保护装置,其特征在于,
    所述处理单元,具体用于若所述车内乘员姿态为蜷缩或者躺,且所述车内乘员面部表情为不舒服,则确定所述异常状态类型为车内乘员行为异常;
    若所述车内乘员音量超过所述车内乘员声纹对应的第一预设分贝阈值,和/或所述车内乘员声音语义信息包含求救信息,则确定所述异常状态类型为车内乘员声音异常;
    若所述车内环境图像信息为起火或车祸,则确定所述异常状态类型为车内环境图像异常;
    若所述车内环境声音信息超过第二预设分贝阈值,则确定所述异常状态类型为车内环境声音异常;
    若所述车内空气质量超过预设空气质量阈值,则确定所述异常状态类型为车内空气质量异常;
    若所述车内温度超过预设温度阈值,则确定所述异常状态类型为车内温度异常。
  18. 根据权利要求16或17所述的乘员保护装置,其特征在于,
    所述处理单元,具体用于对同一时间段内的所述车内乘员信息和所述车内环境信息进行融合,得到融合信息,所述融合信息用于描述当前车内场景;
    对所述融合信息进行分析,确定异常程度,所述异常程度用于表示当前车内场景对于乘员的影响。
  19. 根据权利要求18所述的乘员保护装置,其特征在于,
    所述处理单元,具体还用于若所述融合信息描述的当前车内场景为婴儿啼哭,则所述异常程度低;
    若所述融合信息描述的当前车内场景为婴儿啼哭,且车内温度异常,则所述异常程度较高;
    若所述融合信息描述的当前车内场景为婴儿啼哭,且车内起火,则所述异常程度高。
  20. 根据权利要求12-19任一项所述的乘员保护装置,其特征在于,所述应急措施包括应急通讯和/或应急控制措施;其中,所述应急通讯的通讯内容包括车辆位置信息、车辆外观信息、车牌号信息、车内状态信息、车内图像数据、车内声音信息、数据异常的传感器状态信息、所述应急控制措施中的一项或多项;所述应急通讯的通讯方式为短信、彩信、语音电话中的一种或多种;所述应急控制措施包括语音提醒车内乘员、打开车窗、打开车门、打开车内空气净化设备、打开温度调节设备、打开灭火装置、解锁/打开车门、鸣笛、闪烁车灯中的一项或多项。
  21. 根据权利要求20所述的乘员保护装置,其特征在于,
    所述处理单元,具体用于若所述异常程度低,则确定所述应急措施为应急通讯,所述应急通讯的联系人为第一预设紧急联系人;
    若所述异常程度较高,则确定所述应急措施为应急通讯和应急控制措施,所述应急通讯的联系人为第一预设紧急联系人,所述应急控制措施根据所述异常状态类型确定;
    若所述异常程度高,则所述应急措施为应急通讯和应急控制措施,所述应急通讯的联系人包括第一预设紧急联系人和第二预设紧急联系人,所述应急控制措施根据所述异常状态类型确定。
  22. 根据权利要求21所述的乘员保护装置,其特征在于,
    所述处理单元,具体还用于若所述异常状态类型为车内空气质量异常,则所述应急控制 措施为打开车窗、打开车门、打开车内空气净化设备中的至少一种;
    若所述异常状态类型为车内温度异常,则所述应急控制措施为打开车窗、打开车门、打开温度调节设备中的至少一种;
    若所述异常状态类型为车内乘员行为异常和/或车内乘员声音异常,则所述应急控制措施为语音提醒车内乘员、打开车窗、打开车门、打开温度调节设备、打开灭火装置、解锁/打开车门、鸣笛、闪烁车灯中的至少一项;
    若所述异常状态类型为车内环境图像异常和/或车内环境声音异常,则所述应急控制措施为语音提醒车内乘员、打开车窗、打开车门、打开温度调节设备、打开灭火装置、解锁/打开车门、鸣笛、闪烁车灯中的至少一项。
  23. 一种乘员保护装置,其特征在于,包括:处理器、存储器和通信接口;其中,通信接口用于与其他设备或通信网络通信,存储器用于存储计算机执行指令,当该装置运行时,处理器执行存储器存储的所述计算机执行指令以使该装置执行如权利要求1-11任一项所述的乘员保护方法。
  24. 一种计算机可读存储介质,其特征在于,包括程序和指令,当所述程序或指令在计算机上运行时,使得计算机执行如权利要求1-11任一项所述的乘员保护方法。
  25. 一种包含指令的计算机程序产品,其特征在于,当所述计算机程序产品在计算机上运行时,使得所述计算机执行如权利要求1-11任一项所述的乘员保护方法。
PCT/CN2020/091724 2019-08-30 2020-05-22 乘员保护方法及装置 WO2021036363A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP20859614.8A EP4019343A4 (en) 2019-08-30 2020-05-22 METHOD AND DEVICE FOR OCCUPANT PROTECTION
US17/680,885 US20220305988A1 (en) 2019-08-30 2022-02-25 Passenger Protection Method and Apparatus

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN201910815590 2019-08-30
CN201910815590.6 2019-08-30
CN201910996188.2 2019-10-18
CN201910996188.2A CN110758241B (zh) 2019-08-30 2019-10-18 乘员保护方法及装置

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/680,885 Continuation US20220305988A1 (en) 2019-08-30 2022-02-25 Passenger Protection Method and Apparatus

Publications (1)

Publication Number Publication Date
WO2021036363A1 true WO2021036363A1 (zh) 2021-03-04

Family

ID=69332439

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/091724 WO2021036363A1 (zh) 2019-08-30 2020-05-22 乘员保护方法及装置

Country Status (4)

Country Link
US (1) US20220305988A1 (zh)
EP (1) EP4019343A4 (zh)
CN (1) CN110758241B (zh)
WO (1) WO2021036363A1 (zh)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110758241B (zh) * 2019-08-30 2022-03-11 华为技术有限公司 乘员保护方法及装置
CN112992136A (zh) * 2020-12-16 2021-06-18 呼唤(上海)云计算股份有限公司 智能婴儿监护系统及方法
US11142211B1 (en) * 2021-03-15 2021-10-12 Samsara Networks Inc. Vehicle rider behavioral monitoring
CN114954338A (zh) * 2022-05-07 2022-08-30 浙江吉利控股集团有限公司 一种车内救生方法
CN115782835B (zh) * 2023-02-09 2023-04-28 江苏天一航空工业股份有限公司 一种旅客登机车自动驻车远程驾驶控制方法

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9014920B1 (en) * 2012-07-02 2015-04-21 Ricardo Torres Vehicle occupants alert and alarm system
CN105253085A (zh) * 2015-11-14 2016-01-20 大连理工大学 车内滞留乘员状态识别及险态控制系统
CN105741485A (zh) * 2016-01-29 2016-07-06 大连楼兰科技股份有限公司 儿童遗留车内自动安全处理的车载系统及处理方法
CN108394374A (zh) * 2018-04-26 2018-08-14 奇瑞汽车股份有限公司 乘车保护方法、装置及计算机可读存储介质
CN108791156A (zh) * 2017-04-27 2018-11-13 赫拉胡克两合公司 用于保护车辆乘员的方法
CN108944672A (zh) * 2018-07-19 2018-12-07 安顺职业技术学院 一种车内环境监测自动应急安全保护系统
CN208544243U (zh) * 2018-07-04 2019-02-26 上汽通用汽车有限公司 儿童被锁提示系统及汽车车窗控制系统
CN110001566A (zh) * 2019-03-25 2019-07-12 奇瑞汽车股份有限公司 车内生命体保护方法、装置及计算机可读存储介质
CN110758241A (zh) * 2019-08-30 2020-02-07 华为技术有限公司 乘员保护方法及装置

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3834463B2 (ja) * 2000-10-13 2006-10-18 株式会社日立製作所 車載故障警報通報システム
CN111524332A (zh) * 2013-10-25 2020-08-11 英特尔公司 对车载环境条件做出响应
US9227484B1 (en) * 2014-03-20 2016-01-05 Wayne P. Justice Unattended vehicle passenger detection system
CN105313819A (zh) * 2015-11-10 2016-02-10 奇瑞汽车股份有限公司 一种汽车遗漏生命安全保护系统及其保护方法
CN107010073A (zh) * 2015-11-30 2017-08-04 法拉第未来公司 基于数据聚合自动检测车辆中的占用条件的系统和方法
CN107878396A (zh) * 2016-09-30 2018-04-06 法乐第(北京)网络科技有限公司 车辆控制系统及其控制方法
CN110637331A (zh) * 2017-01-29 2019-12-31 勿忘有限公司 用于检测车辆中乘员的存在的系统及其装置
CN107933425B (zh) * 2017-11-27 2021-06-25 中原工学院 一种基于物联网技术的车内生命探测系统
CN108062847A (zh) * 2017-12-01 2018-05-22 江苏海宏信息科技有限公司 一种车内乘员和宠物综合安全防护及监控系统及方法
CN110119684A (zh) * 2019-04-11 2019-08-13 华为技术有限公司 图像识别方法和电子设备
KR20190100109A (ko) * 2019-08-09 2019-08-28 엘지전자 주식회사 자율주행 차량의 오탑승 방지 방법 및 그 장치



Also Published As

Publication number Publication date
CN110758241A (zh) 2020-02-07
CN110758241B (zh) 2022-03-11
EP4019343A1 (en) 2022-06-29
EP4019343A4 (en) 2022-10-19
US20220305988A1 (en) 2022-09-29


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20859614

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2020859614

Country of ref document: EP

Effective date: 20220323