US20230108895A1 - Data processing method and apparatus, and device - Google Patents

Data processing method and apparatus, and device

Info

Publication number
US20230108895A1
Authority
US
United States
Prior art keywords
data
driving
vehicle
historical
historical driving
Prior art date
Legal status
Pending
Application number
US17/993,609
Other languages
English (en)
Inventor
Narisong Bao
Guicheng ZHANG
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Assigned to HUAWEI TECHNOLOGIES CO., LTD. Assignors: BAO, Narisong; ZHANG, Guicheng
Publication of US20230108895A1

Classifications

    • G06V 10/761: Proximity, similarity or dissimilarity measures (image or video pattern matching)
    • G06V 10/774: Generating sets of training patterns; bootstrap methods, e.g. bagging or boosting
    • G06V 20/41: Higher-level, semantic clustering, classification or understanding of video scenes
    • G06V 20/56: Context or environment of the image exterior to a vehicle, using sensors mounted on the vehicle
    • G06V 20/597: Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • G06F 18/214: Generating training patterns; bootstrap methods, e.g. bagging or boosting
    • G06F 18/22: Matching criteria, e.g. proximity measures
    • G06N 20/20: Ensemble learning
    • G07C 5/04: Registering or indicating driving, working, idle, or waiting time using counting means or digital clocks
    • G07C 5/0808: Diagnosing performance data
    • G07C 5/0841: Registering performance data
    • G11B 27/031: Electronic editing of digitised analogue information signals, e.g. audio or video signals

Definitions

  • This application relates to the field of intelligent driving and intelligent vehicle technologies, and in particular, to a data processing method and apparatus, and a device.
  • driving recorders are disposed on many vehicles, and can shoot and store videos of the vehicles during driving processes.
  • the video shot by the driving recorder and a shooting time of the video may be stored in preset storage space.
  • the user needs to search for a corresponding video clip in the preset storage space.
  • the user usually estimates an approximate time and searches for the corresponding video clip in the preset storage space based on the estimated time.
  • searching for the video clip in this way is not efficient.
  • Embodiments of this application provide a data processing method and apparatus, and a device. This improves efficiency of searching for a video clip.
  • an embodiment of this application provides a data processing method.
  • the method includes: obtaining first data and a first historical driving feature; determining, based on the first data and the first historical driving feature, that a vehicle is abnormal in a first time period; and annotating a driving record video of the vehicle based on the first time period.
  • the first data includes first driving data of the vehicle in the first time period, and the first historical driving feature is determined based on historical driving data of the vehicle.
  • the first data may reflect a driving status of the vehicle in the first time period.
  • the first historical driving feature may reflect a driving habit of a driver of the vehicle. Because the driving habit of the user is usually fixed, whether the vehicle is abnormal in the first time period can be accurately determined based on the first driving data and the first historical driving feature.
  • the driving record video shot by an image shooting device may be annotated to record that the vehicle is abnormal in the first time period. In this way, the driving record video can be accurately annotated.
  • the user may accurately and quickly search for a video clip based on an annotation result of the driving record video, to improve video clip searching efficiency.
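The first-aspect flow described above (obtain first data and a historical feature, decide whether the first time period is abnormal, annotate the video for that period) can be sketched as follows. All names, the data layout, and the similarity threshold are illustrative assumptions, not details from the application:

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    start_s: float   # start of the first time period, in seconds
    end_s: float     # end of the first time period, in seconds
    note: str        # abnormality description to attach to the video

def detect_and_annotate(first_data, historical_feature, similarity_fn,
                        threshold=0.6):
    """Return an Annotation if the vehicle looks abnormal, else None.

    first_data: driving data of the first time period, e.g.
        {"t0": 120.0, "t1": 130.0, "values": [...]}  (hypothetical layout)
    historical_feature: a feature derived from historical driving data.
    similarity_fn: compares the two; low similarity suggests an abnormality.
    """
    sim = similarity_fn(first_data["values"], historical_feature)
    if sim < threshold:
        return Annotation(first_data["t0"], first_data["t1"],
                          f"abnormal driving (similarity={sim:.2f})")
    return None
```

A caller would plug in whatever similarity measure and feature representation the system actually uses.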
  • the first driving data includes at least one of the following: status data of a component in the vehicle, running data of a component in the vehicle, or sensor data collected by a component in the vehicle.
  • the first driving data includes one or more of status data, running data, or sensor data of a component.
  • the data is generated in a running process of the vehicle, and can accurately reflect the driving status of the vehicle. Therefore, whether the vehicle is abnormal in the first time period can be accurately determined based on the first driving data.
  • the first historical driving feature includes a plurality of historical data curves corresponding to a plurality of types of historical driving data, the historical data curves are used to indicate a distribution rule of the historical driving data, and the historical driving data includes at least one of the following: historical status data of the component in the vehicle; historical running data of the component in the vehicle; or historical sensor data collected by the component in the vehicle.
  • the historical driving data includes one or more of historical status data, historical running data, or historical sensor data of a component.
  • the data is generated in a historical running process of the vehicle, and can accurately reflect a historical driving status of the vehicle.
  • the historical driving status of the vehicle reflects the driving habit of the user. Therefore, whether the vehicle is abnormal in the first time period can be accurately determined based on the first historical driving feature.
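One minimal way to turn one type of historical driving data into a "historical data curve" of the kind described above is to bucket samples by an index (for example, the road condition index that appears in FIG. 4) and take the mean per bucket. The application does not prescribe an aggregation, so this is only an illustrative sketch:

```python
from collections import defaultdict

def historical_curve(samples):
    """samples: iterable of (road_condition_index, value) pairs taken from
    the vehicle's historical driving data for one data type.

    Returns a sorted list of (index, mean value) points, i.e. one
    'historical data curve' indicating the distribution of that data."""
    buckets = defaultdict(list)
    for idx, value in samples:
        buckets[idx].append(value)
    return sorted((idx, sum(v) / len(v)) for idx, v in buckets.items())
```

A plurality of such curves, one per data type (status, running, sensor), would together form the first historical driving feature.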
  • the first historical driving feature may be obtained in the following manner: processing the historical driving data by using a first model to obtain the first historical driving feature.
  • the first model is obtained by learning a plurality of groups of first samples, and each of the plurality of groups of first samples includes sample driving data and a sample historical driving feature.
  • the plurality of groups of first samples are learned, so that the first model has a function of obtaining a driving feature of the driving data. Therefore, the first historical driving feature can be accurately obtained by processing the historical driving data by using the first model.
  • that the vehicle is abnormal in the first time period may be determined based on the first data and the first historical driving feature in the following manner: determining a similarity between the first data and the first historical driving feature; and determining, based on the similarity, that the vehicle is abnormal in the first time period.
  • the similarity between the first data and the first historical driving feature reflects a similarity between a running status of the vehicle in the first time period and the driving habit of the user. Because the driving habit of the user is usually fixed, whether the vehicle is abnormal in the first time period may be accurately determined based on the similarity.
  • the similarity between the first data and the first historical driving feature may be determined in the following manner: processing the first data and the first historical driving feature by using a second model to obtain the similarity.
  • the second model is obtained by learning a plurality of groups of second samples, each of the plurality of groups of second samples includes sample data, a sample driving feature, and a sample similarity, and the sample data includes sample driving data.
  • the plurality of groups of second samples are learned, so that the second model has a function of determining a similarity between data and a driving feature. Therefore, the similarity between the first data and the first historical driving feature may be accurately determined by processing the first data and the first historical driving feature by using the second model.
  • the processing the first data and the first historical driving feature by using a second model to obtain the similarity includes: determining, by using the second model, a first data curve corresponding to the first data; and comparing the first data curve with the historical data curve in the first historical driving feature by using the second model to obtain the similarity.
  • the first data curve can accurately reflect the first data. Therefore, the similarity can be accurately obtained by comparing the first data curve with the historical data curve.
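Assuming both curves are sampled at the same points, the curve comparison can be sketched as a cosine similarity over the sampled values. The application does not specify the measure, so this choice is illustrative:

```python
import math

def curve_similarity(curve_a, curve_b):
    """Cosine similarity between two curves sampled at the same points.

    Returns a value in [-1, 1]: 1.0 means identical shape, while values
    near 0 suggest the first data curve deviates from the historical
    data curve (a possible abnormality)."""
    dot = sum(a * b for a, b in zip(curve_a, curve_b))
    norm_a = math.sqrt(sum(a * a for a in curve_a))
    norm_b = math.sqrt(sum(b * b for b in curve_b))
    if norm_a == 0 or norm_b == 0:
        return 0.0
    return dot / (norm_a * norm_b)
```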
  • the first data further includes scenario data.
  • the scenario data includes at least one of the following information: time information, location information, road condition information, or weather information.
  • driving habits of the user in different scenarios may be different. Therefore, when the first data further includes the scenario data, a scenario in which the user is driving in the first time period may be determined based on the scenario data. In this way, whether the vehicle is abnormal in the first time period can be more accurately determined based on the first data.
  • the first historical driving feature may be obtained in the following manner: determining the first historical driving feature from a plurality of historical driving features based on the scenario data.
  • the obtained first historical driving feature is related to the scenario data, so that a scenario corresponding to the first data is the same as a scenario corresponding to the first historical driving feature, and accuracy of determining whether the vehicle is abnormal in the first time period is high.
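Determining the first historical driving feature from a plurality of features based on scenario data can be as simple as a keyed lookup. The key fields below ("weather", "road") are hypothetical examples of scenario data:

```python
def select_feature(features, scenario, default=None):
    """features: mapping from a scenario key to a historical driving
    feature, e.g. {("rainy", "highway"): curve, ...}.
    scenario: scenario data of the first time period; the key fields
    used here are illustrative assumptions."""
    key = (scenario.get("weather"), scenario.get("road"))
    return features.get(key, default)
```

Keeping one feature per scenario ensures the compared feature reflects the driving habit in the same scenario as the first data.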
  • the driving record video of the vehicle may be annotated based on the first time period in the following manner: generating abnormality information; and annotating the abnormality information in the driving record video based on the first time period.
  • the abnormality information includes at least one of the following information: an abnormality level, an abnormality type, or abnormality description information.
  • the abnormality information is annotated in the driving record video, so that the user can accurately and quickly search for the video clip based on the annotation result of the driving record video. This improves video clip search efficiency.
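If the abnormality information is kept as lightweight records alongside the video, the search step becomes a simple filter. The record fields mirror the ones named above (level, type, description); the storage layout is an assumption:

```python
def find_clips(annotations, abnormality_type):
    """annotations: list of dicts such as
        {"start_s": 120, "end_s": 130, "type": "hard_braking",
         "level": 2, "description": "..."}
    stored alongside the driving record video.

    Returns (start, end) ranges whose type matches, so a player can jump
    straight to the matching video clips instead of scrubbing by time."""
    return [(a["start_s"], a["end_s"]) for a in annotations
            if a["type"] == abnormality_type]
```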
  • an embodiment of this application provides a data processing apparatus, including an obtaining module, a determining module, and an annotation module.
  • the obtaining module is configured to obtain first data and a first historical driving feature, the first data includes first driving data of a vehicle in a first time period, and the first historical driving feature is determined based on historical driving data of the vehicle.
  • the determining module is configured to determine, based on the first data and the first historical driving feature, that the vehicle is abnormal in the first time period.
  • the annotation module is configured to annotate a driving record video of the vehicle based on the first time period.
  • the first driving data includes at least one of the following:
  • the first historical driving feature includes a plurality of historical data curves corresponding to a plurality of types of historical driving data, the historical data curves are used to indicate a distribution rule of the historical driving data, and the historical driving data includes at least one of the following:
  • the obtaining module is specifically configured to:
  • the first model is obtained by learning a plurality of groups of first samples, and each of the plurality of groups of first samples includes sample driving data and a sample historical driving feature.
  • the determining module is specifically configured to:
  • the determining module is specifically configured to:
  • the second model is obtained by learning a plurality of groups of second samples, each of the plurality of groups of second samples includes sample data, a sample driving feature, and a sample similarity, and the sample data includes sample driving data.
  • the determining module is specifically configured to:
  • the first data further includes scenario data.
  • the scenario data includes at least one of the following information:
  • time information, location information, road condition information, or weather information.
  • the determining module is specifically configured to:
  • the annotation module is specifically configured to:
  • the abnormality information includes at least one of the following information:
  • an abnormality level, an abnormality type, or abnormality description information.
  • an embodiment of this application provides a data processing apparatus.
  • the data processing apparatus includes a memory and a processor.
  • the memory stores computer program instructions, and the processor runs the computer program instructions to perform the operation according to any implementation of the first aspect.
  • an embodiment of this application provides a computer storage medium, including computer instructions.
  • when the computer instructions are run by a processor, the method according to any implementation of the first aspect is implemented.
  • an embodiment of this application provides a computer program product.
  • when the computer program product runs on a processor, the method according to any implementation of the first aspect is implemented.
  • first data and a first historical driving feature of a vehicle in a first time period may be obtained, that the vehicle is abnormal in the first time period is determined based on the first data and the first historical driving feature, and a driving record video of the vehicle is annotated based on the first time period.
  • the first data can reflect a driving status of the vehicle in the first time period.
  • the first historical driving feature can reflect a driving habit of a driver of the vehicle. Because the driving habit of the user is usually fixed, whether the vehicle is abnormal in the first time period can be determined based on the first driving data and the first historical driving feature.
  • the driving record video shot by an image shooting device may be annotated to record that the vehicle is abnormal in the first time period.
  • the driving record video can be accurately annotated.
  • the user may accurately and quickly search for a video clip based on an annotation result of the driving record video, to improve video clip searching efficiency.
  • FIG. 1 is a schematic diagram of an application scenario according to an embodiment of this application.
  • FIG. 2 A is a schematic diagram of a video obtaining manner according to an embodiment of this application.
  • FIG. 2 B is a schematic diagram of another video obtaining manner according to an embodiment of this application.
  • FIG. 2 C is a schematic diagram of still another video obtaining manner according to an embodiment of this application.
  • FIG. 3 is a schematic flowchart of a data processing method according to an embodiment of this application.
  • FIG. 4 is a schematic diagram of driving data and a driving feature according to an embodiment of this application.
  • FIG. 5 A is a schematic diagram of a manner of obtaining historical driving data according to an embodiment of this application.
  • FIG. 5 B is a schematic diagram of another manner of obtaining historical driving data according to an embodiment of this application.
  • FIG. 6 is a schematic diagram of a historical driving feature according to an embodiment of this application.
  • FIG. 7 A is a schematic diagram of determining a similarity according to an embodiment of this application.
  • FIG. 7 B is a schematic diagram of determining a similarity according to an embodiment of this application.
  • FIG. 8 A to FIG. 8 C are schematic diagrams of a video viewing manner according to an embodiment of this application.
  • FIG. 9 is a schematic diagram of a video annotation manner according to an embodiment of this application.
  • FIG. 10 A to FIG. 10 F are schematic diagrams of a video viewing manner according to an embodiment of this application.
  • FIG. 11 is a schematic flowchart of another data processing method according to an embodiment of this application.
  • FIG. 12 A is a schematic diagram of a structure of a driving recorder according to an embodiment of this application.
  • FIG. 12 B is a schematic diagram of a structure of another driving recorder according to an embodiment of this application.
  • FIG. 13 A is a schematic diagram of a device connection according to an embodiment of this application.
  • FIG. 13 B is a schematic diagram of another device connection according to an embodiment of this application.
  • FIG. 13 C is a schematic diagram of still another device connection according to an embodiment of this application.
  • FIG. 14 A is a schematic diagram of a structure of a vehicle according to an embodiment of this application.
  • FIG. 14 B is a schematic diagram of another structure of the vehicle according to an embodiment of this application.
  • FIG. 15 is a schematic diagram of a structure of a data processing apparatus according to an embodiment of this application.
  • FIG. 16 is a schematic diagram of a hardware structure of a data processing apparatus according to an embodiment of this application.
  • a vehicle in this application may include a private car, a bus, a cargo van, a passenger vehicle, a motorcycle, and the like.
  • a component in a vehicle may be a part or a sensor in the vehicle, and the part may be a component that maintains running of the vehicle.
  • the part may include an engine, a generator, a shock absorber, a vehicle light, or the like.
  • the sensor may include a temperature sensor, a speed sensor, an acceleration sensor, or the like.
  • Driving data: data generated in a driving process of a vehicle.
  • the driving data may include at least one of the following: status data of a component, running data of a component, or sensor data collected by a component.
  • Status data is used to indicate a status of a component.
  • status data of a vehicle light is used to indicate that a status of the vehicle light is on or off.
  • status data of an engine is used to indicate that a status of the engine is started or paused.
  • status data of an electromagnetic valve is used to indicate that a status of the electromagnetic valve is on or off.
  • Running data is used to indicate data generated in a running process of a component.
  • running data of an engine may include a rotational speed of the engine.
  • running data of an electronic gear shifter may include a gear of the electronic gear shifter.
  • running data of a battery may include remaining electricity of the battery.
  • Sensor data refers to data collected by a sensor.
  • the sensor data may include: a temperature collected by a temperature sensor, a speed collected by a speed sensor, and an air-fuel ratio collected by an air-fuel ratio sensor.
  • Different components may generate different data, and a component may generate one or more of status data, running data, and sensor data.
  • a component may generate status data
  • some components may generate status data and running data
  • some components may generate sensor data.
  • a headliner switch may generate status data
  • an engine may generate status data and running data
  • a temperature sensor may generate sensor data.
  • Driving record video: a video obtained by shooting a vehicle during a driving process.
  • the video may include a video obtained by shooting a surrounding of a vehicle body and a video obtained by shooting an interior of the vehicle.
  • the video is shot by an image shooting device.
  • the image shooting device may be an apparatus built in the vehicle, or may be an apparatus installed on the vehicle.
  • FIG. 1 is a schematic diagram of an application scenario according to an embodiment of this application.
  • a vehicle includes an engine electronic system, a chassis electronic system, vehicle body electronics, a safety and comfort system, an entertainment and communications system, and an image shooting device.
  • the engine electronic system may include: an engine management electronic control unit (ECU), a battery, a generator, a starter, a temperature sensor, a knock sensor, an air-fuel ratio sensor, an oxygen sensor, an engine harness, a cooling system, an ignition system, an intake and exhaust system, a transmission system, an electric feed pump, and the like.
  • the chassis electronic system may include: a steering system, a suspension system, a braking system, and the like.
  • the vehicle body electronics may include: a body control module (BCM), a relay/fuse, a floor harness, a door harness, a headliner harness, a dashboard harness, a universal serial bus (USB)/high-definition multimedia interface (HDMI) cable, an electric rearview mirror, a window lifting motor, an electric tailgate strut, a door and window switch, a wiper motor, a sunroof motor, on-board diagnostics (OBD), a lighting system, a switch, and the like.
  • the safety and comfort system may include: a safety system, a seat adjustment motor, an active noise cancellation unit, a horn, an air conditioning system, and the like.
  • the entertainment and communications system may include a human-machine interaction (HMI) system, a communications system, and the like.
  • the image shooting device may be disposed outside the vehicle and/or inside the vehicle, and there may be one or more image shooting devices.
  • an image shooting device may be disposed at a point A, a point B, a point C, or a point D of the vehicle.
  • the image shooting device may shoot a video.
  • the image shooting device may shoot a video during a running (or starting) process of the vehicle.
  • storage space may be set in the vehicle, and a video shot by the image shooting device may be stored in the storage space.
  • a wireless communications module may be disposed in the vehicle, and the wireless communications module may send a video shot by the image shooting device to a cloud server or send a video shot by the image shooting device to a terminal device (for example, a device such as a mobile phone or a computer) of a user.
  • driving data of the vehicle may be obtained.
  • a historical driving feature of the vehicle may be determined based on historical driving data of the vehicle, and the historical driving feature of the vehicle may represent a driving habit of a driver of the vehicle.
  • First driving data of the vehicle in a first time period (for example, a time period before a current moment) may indicate a driving status of the vehicle in the first time period. Because the driving habit of the user is usually fixed, whether the vehicle is abnormal in the first time period can be determined by comparing the first driving data and the historical driving feature of the vehicle.
  • a driving record video shot by an image shooting device may be annotated, to record that the vehicle is abnormal in the first time period.
  • a case in which the vehicle is abnormal may include: a vehicle fault, a car accident, or abnormal driving of the user.
  • the abnormal driving of the user may include: the user drives extremely fast, the user drives extremely slowly, the user drives the vehicle to loiter in an area, and the like.
  • the user may view the driving record video (a video shot by the image shooting device, or a video obtained after the video shot by the image shooting device is annotated) by using a terminal device.
  • the terminal device may obtain the driving record video in a plurality of manners. The following describes several manners of obtaining the driving record video by the terminal device with reference to FIG. 2 A to FIG. 2 C .
  • FIG. 2 A is a schematic diagram of a video obtaining manner according to an embodiment of this application. As shown in FIG. 2 A , there are a vehicle A, a removable hard disk B, and a terminal device C.
  • the driving record video may be stored in preset storage space in the vehicle A.
  • the user may copy the driving record video in the storage space to the removable hard disk B, and then copy the driving record video to the terminal device C.
  • the driving record video may be stored in the removable hard disk B.
  • the removable hard disk B is inserted into the vehicle, and a video shot by the image shooting device is stored in the removable hard disk B.
  • the user may remove the removable hard disk B from the vehicle, and then copy the driving record video in the removable hard disk B to the terminal device C.
  • FIG. 2 B is a schematic diagram of another video obtaining manner according to an embodiment of this application.
  • the driving record video may be stored in preset storage space in the vehicle A.
  • a communications module is disposed in the vehicle A, and the vehicle A may communicate with the terminal device B by using the communications module.
  • the vehicle A may communicate with the terminal device B via Bluetooth or Wi-Fi.
  • An application may be installed in the terminal device B, and a user may control, by using the application, transmission of the video in the preset storage space of the vehicle A to the terminal device B.
  • the user may select a driving record video of a specific time period by using the application, so that the vehicle transmits the driving record video of the specific time period to the terminal device.
  • FIG. 2 C is a schematic diagram of still another video obtaining manner according to an embodiment of this application.
  • there are a vehicle A, a terminal device B, and a cloud server C.
  • a communications module is disposed in the vehicle A.
  • the vehicle A may upload a driving record video to the cloud server C by using the communications module, and the cloud server C stores the driving record video.
  • An application may be installed in the terminal device B, and a user may control, by using the application, the terminal device B to download the driving record video from the cloud server C.
  • the user may select a driving record video of a specific time period by using the application, so that the terminal device B requests to download the driving record video of the specific time period from the cloud server C.
  • FIG. 2 A to FIG. 2 C merely show examples of manners in which a terminal device obtains a driving record video, and do not constitute a limitation on the manners.
  • FIG. 3 is a schematic flowchart of a data processing method according to an embodiment of this application. Refer to FIG. 3 . The method may include the following steps.
  • An execution body for this embodiment of this application may be a vehicle, a driving recorder, or a data processing apparatus disposed in the vehicle or the driving recorder.
  • the data processing apparatus may be implemented by using software, or may be implemented by using a combination of software and hardware.
  • the data processing apparatus may be a processing chip.
  • the first model is used to process driving data to obtain a corresponding driving feature.
  • the driving feature may be represented by using a curve.
  • the driving feature may represent a distribution rule of the driving data.
  • the following describes the driving data and the driving feature by using an example in which the driving data is an acceleration.
  • FIG. 4 is a schematic diagram of driving data and a driving feature according to an embodiment of this application.
  • a horizontal axis of a coordinate system represents a road condition index, and the road condition index may be determined based on parameters such as a congestion degree of a road and weather.
  • a vertical axis of the coordinate system represents a normalized acceleration. Points in the coordinate system represent discrete accelerations (driving data), and the accelerations may be accelerations that are in different road conditions and that are collected in a preset time period.
  • a curve in the coordinate system represents the driving feature (which may also be referred to as an acceleration feature), and the driving feature may indicate a distribution rule of accelerations.
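As a rough illustration of the relationship in FIG. 4 (this application contains no code, so the function name, bin count, and normalization below are assumptions for the sketch only), discrete accelerations collected under different road conditions can be condensed into a coarse per-road-condition feature "curve" by averaging within road-condition-index bins:

```python
from collections import defaultdict

def driving_feature(samples, num_bins=5):
    # Summarize discrete (road_condition_index, acceleration) points as a
    # coarse feature curve: the mean normalized acceleration per index bin.
    bins = defaultdict(list)
    for index, accel in samples:
        # Road condition index assumed normalized to [0, 1).
        bins[min(int(index * num_bins), num_bins - 1)].append(accel)
    return {b: sum(v) / len(v) for b, v in sorted(bins.items())}

# Discrete accelerations collected in a preset time period:
points = [(0.10, 0.80), (0.15, 0.70), (0.60, 0.30), (0.65, 0.25), (0.90, 0.10)]
feature = driving_feature(points)
```

A trained first model would learn a far richer mapping than this binned average, but the input/output shape is the same: driving data in, a distribution-rule curve out.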
  • the first model may be obtained in the following manners.
  • a device other than a driving recorder generates a first model, and the driving recorder obtains the first model from the device.
  • the device may be a server, a terminal device, or the like.
  • the device may generate the first model in the following manner: obtaining a plurality of groups of first samples, and learning the plurality of groups of first samples to obtain the first model.
  • Each group of first samples includes sample driving data and a sample driving feature.
  • the sample driving data in one group of first samples may be one type of driving data.
  • the type of the driving data may be a speed, an acceleration, an engine rotation speed, an engine temperature, or the like.
  • the sample driving data in one group of first samples may be a speed in a preset time period.
  • the sample driving feature corresponds to the sample driving data, and the sample driving feature is used to reflect a feature of the sample driving data.
  • the sample driving feature may be represented by a sample curve, and the sample curve is used to indicate a distribution rule of the sample driving data.
  • the sample driving feature may be obtained by annotating the sample driving data manually.
  • Algorithms for learning the first model may include: a logistic regression algorithm, a decision tree algorithm, a vector machine algorithm, a naive Bayes algorithm, and the like.
  • After the first model is obtained by learning the plurality of groups of first samples, the first model has the following function: after the driving data is input to the first model, the first model may process the driving data, to output a driving feature corresponding to the driving data.
  • the driving recorder may obtain the first model from the device.
  • the first model may be an offline model.
  • the historical driving data may be data generated in a historical driving process of the vehicle.
  • the historical driving process may be a driving process of the vehicle in a historical time period.
  • the historical time period may be one week, one month, half a year, one year, or the like before a current moment.
  • the historical time period may be set based on an actual situation.
  • the historical driving data includes at least one of the following: historical status data of a component in the vehicle, historical running data of a component in the vehicle, or historical sensor data collected by a component in the vehicle.
  • the driving recorder can obtain the historical driving data generated by the vehicle.
  • the following describes, with reference to FIG. 5 A and FIG. 5 B , a manner in which the driving recorder obtains the historical driving data.
  • FIG. 5 A is a schematic diagram of obtaining historical driving data according to an embodiment of this application.
  • a vehicle includes a plurality of components and a vehicle body status system, and the plurality of components include a plurality of parts and a plurality of sensors.
  • the vehicle body status system may obtain and store the driving data generated by the components.
  • the driving recorder and the vehicle body status system may communicate with each other.
  • the driving recorder and the vehicle body status system may be connected in a wired manner or in a wireless manner, so that the driving recorder may obtain historical driving data from the vehicle body status system.
  • FIG. 5 B is a schematic diagram of another manner of obtaining historical driving data according to an embodiment of this application.
  • a vehicle includes a plurality of components and a vehicle body status system, and the plurality of components include a plurality of parts and a plurality of sensors.
  • the vehicle body status system may obtain the driving data generated by the components, and transmit the driving data generated by the components to a cloud server.
  • the cloud server stores the driving data.
  • the driving recorder may request to obtain the historical driving data from the cloud server.
  • FIG. 5 A and FIG. 5 B merely show examples of manners in which the driving recorder obtains the historical driving data, and do not constitute a limitation.
  • the historical driving data includes a plurality of types of historical driving data.
  • types of the historical driving data include a speed, an acceleration, an engine rotation speed, or an engine temperature.
  • the first historical driving feature includes driving features corresponding to the plurality of types of historical driving data.
  • the first historical driving feature includes a driving feature corresponding to a historical speed, a driving feature corresponding to a historical acceleration, or a driving feature corresponding to a historical engine rotation speed.
  • the driving feature corresponding to the historical speed is used to indicate a distribution rule of the historical speed
  • the driving feature corresponding to the historical acceleration is used to indicate a distribution rule of the historical acceleration.
  • the vehicle is usually driven by one or more fixed users.
  • a driving habit of the user is usually fixed, so that driving data of the vehicle is usually distributed as a fixed curve.
  • speeds at which the user drives the vehicle in different road conditions are usually distributed as a fixed speed curve, and accelerations at which the user drives the vehicle in different road conditions are usually distributed as a fixed acceleration curve. Therefore, the first historical driving feature determined based on the historical driving data may represent the driving habit of the user; the first historical driving feature may also be referred to as a user profile of the user who drives the vehicle.
  • the historical driving data may be input to the first model, and the first model outputs the first historical driving feature corresponding to the historical driving data.
  • the first historical driving feature includes a plurality of historical data curves corresponding to various types of historical driving data.
  • the historical data curves each are used to indicate a distribution rule of the historical driving data. That is, the historical data curves each are used to represent the driving feature corresponding to the historical driving data.
  • the following describes the first historical driving feature with reference to FIG. 6 .
  • FIG. 6 is a schematic diagram of a historical driving feature according to an embodiment of this application.
  • each curve represents one historical driving feature
  • one historical driving feature corresponds to one type of historical driving data.
  • meanings represented by a horizontal axis and a vertical axis are different.
  • the horizontal axis may represent a road condition index
  • the vertical axis may represent an acceleration.
  • the horizontal axis may represent a road condition index
  • the vertical axis may represent a speed.
  • FIG. 6 merely shows an example of the first historical driving feature, and does not constitute a limitation on the first historical driving feature.
  • the second model is used to obtain a similarity between data and a driving feature.
  • the data may include driving data or the data may include driving data and scenario data.
  • the second model may be obtained in the following manners.
  • a device other than a driving recorder generates a second model, and the driving recorder obtains the second model from the device.
  • the device may be a server, a terminal device, or the like.
  • the device may generate the second model in the following manner: obtaining a plurality of groups of second samples, and learning the plurality of groups of second samples to obtain the second model.
  • Each group of second samples includes sample data, a sample driving feature, and a sample similarity.
  • sample data may include sample driving data
  • sample data may include sample driving data and sample scenario data.
  • sample scenario data includes at least one of the following information: sample time information, sample location information, sample road condition information, or sample weather information.
  • the sample similarity may be a similarity between the sample data and the sample driving feature, and the sample similarity may be a manually annotated similarity.
  • the sample driving data in one group of second samples may be one type of driving data.
  • the type of the driving data may be a speed, an acceleration, an engine rotation speed, an engine temperature, or the like.
  • the sample driving data in one group of second samples may be a speed in a preset time period.
  • the sample driving feature corresponds to the sample driving data.
  • the sample driving feature may be represented by using a sample curve.
  • Algorithms for learning the second model may include: a logistic regression algorithm, a decision tree algorithm, a vector machine algorithm, a naive Bayes algorithm, and the like.
  • After the second model is obtained by learning the plurality of groups of second samples, the second model has the following function: after data and a driving feature are input to the second model, the second model may process the input data and the driving feature, to obtain a similarity between the data and the driving feature.
  • the driving recorder may obtain the second model from the device.
  • the second model may be an offline model.
  • the second model and the first model may be integrated into one model.
  • a model obtained through integration may simultaneously have functions of the first model and functions of the second model.
  • the second model and the first model may be two independent models.
  • the process from S 301 to S 304 may be a one-time execution process or a periodic execution process. That is, before S 305 , the process from S 301 to S 304 only needs to be performed once, but does not need to be performed in real time.
  • the first data includes first driving data of the vehicle in a first time period.
  • the first data may include the first driving data and the scenario data, and the scenario data includes at least one of the following information: time information, location information, road condition information, or weather information.
  • the first time period may be a time period before a current moment and closest to the current moment.
  • the first time period may be 5 minutes, 10 minutes, 30 minutes, 1 hour, or the like before the current moment.
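The first data therefore covers a sliding window immediately before the current moment. A minimal sketch of selecting such a window (the function name and record layout are assumptions, not part of this application):

```python
from datetime import datetime, timedelta

def first_period_records(records, now, minutes=10):
    # Keep only records whose timestamp falls in the first time period:
    # the `minutes` immediately before the current moment.
    start = now - timedelta(minutes=minutes)
    return [value for ts, value in records if start <= ts <= now]

now = datetime(2022, 1, 1, 12, 0)
records = [
    (datetime(2022, 1, 1, 11, 40), "too old"),
    (datetime(2022, 1, 1, 11, 55), "speed sample"),
    (datetime(2022, 1, 1, 11, 58), "acceleration sample"),
]
recent = first_period_records(records, now)
```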
  • the first data and the first historical driving feature may be input to the second model, so that the second model outputs the similarity between the first data and the first historical driving feature.
  • the second model may first determine a first data curve corresponding to the first data, and then compare the first data curve with a historical data curve in the first historical driving feature, to obtain the similarity. If the first data further includes the scenario data, a historical data curve corresponding to the scenario data may be further obtained from the first historical driving feature, and then the first data curve and the historical data curve corresponding to the scenario data may be compared to obtain the similarity.
  • the first data may include the first driving data, or the first data may include the first driving data and the scenario data.
  • a process of determining the similarity between the first data and the first historical driving feature is accordingly different. The following two cases may be included.
  • the first driving data may include a plurality of types of driving data, and the first historical driving feature includes a historical driving feature corresponding to each type of driving data.
  • a similarity between each type of driving data and a corresponding historical driving feature may be separately obtained, and then the similarity between the first data and the first historical driving feature is determined based on the similarity between each type of driving data and the corresponding historical driving feature.
  • a first data curve corresponding to the type of driving data may be obtained, a historical data curve corresponding to the type of driving data is obtained from the first historical driving feature, and then a similarity between the first data curve and the corresponding historical data curve is determined as the similarity between this type of driving data and the corresponding historical driving feature.
  • the following describes a process of determining the similarity between the first data and the first historical driving feature with reference to FIG. 7 A .
  • FIG. 7 A is a schematic diagram of determining a similarity according to an embodiment of this application.
  • the first data includes the first driving data.
  • the first driving data may include a plurality of types of driving data.
  • Driving data 1, driving data 2, driving data 3, ..., and driving data N (N is an integer greater than 1) are N different types of driving data.
  • the driving data 1 may be a speed
  • the driving data 2 may be an acceleration
  • the driving data 3 may be an engine rotation speed.
  • a data curve corresponding to each type of driving data may be obtained.
  • a data curve corresponding to each type of driving data may be obtained by using the second model.
  • the driving data 1 may be input to the second model, so that the second model outputs a data curve 1 corresponding to the driving data 1.
  • the driving data 2 may be input to the second model, so that the second model outputs a data curve 2 corresponding to the driving data 2.
  • N data curves corresponding to the N types of driving data may be obtained.
  • a first historical driving feature includes a plurality of historical data curves.
  • the first historical driving feature includes a historical data curve 1 corresponding to the driving data 1 and a historical data curve 2 corresponding to the driving data 2.
  • the data curve 1 and the historical data curve 1 may be compared to obtain a similarity 1
  • the data curve 2 and the historical data curve 2 may be compared to obtain a similarity 2, and by analogy, N similarities may be obtained.
  • An average value or a weighted average value of the N similarities may be determined as the similarity between the first data and the first historical driving feature.
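The comparison and averaging steps above can be sketched as follows (a hedged illustration only: the mean-absolute-difference similarity measure and the function names are assumptions; this application does not prescribe a specific similarity formula):

```python
def curve_similarity(curve_a, curve_b):
    # Curves sampled at the same road-condition points, values in [0, 1]:
    # similarity taken here as 1 minus the mean absolute difference.
    diffs = [abs(a - b) for a, b in zip(curve_a, curve_b)]
    return 1.0 - sum(diffs) / len(diffs)

def overall_similarity(per_type, weights=None):
    # Average (optionally weighted) of the N per-type similarities.
    if weights is None:
        weights = [1.0] * len(per_type)
    return sum(w * s for w, s in zip(weights, per_type)) / sum(weights)

s1 = curve_similarity([0.5, 0.6, 0.7], [0.5, 0.6, 0.7])  # identical curves
s2 = curve_similarity([0.5, 0.6, 0.7], [0.7, 0.8, 0.9])  # constant offset
overall = overall_similarity([s1, s2])
```

A weighted average lets types that better reflect the driver's habit (for example, acceleration) count for more than noisier types.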
  • the first driving data may include a plurality of types of driving data.
  • the first driving data may include a speed, an acceleration, and an engine rotation speed.
  • the scenario data includes at least one of the following information: time information, location information, road condition information, or weather information.
  • a corresponding scenario type may be determined and obtained based on the scenario data.
  • the scenario type may include rainy weather, a morning rush hour, an evening rush hour, a holiday, a workday, and an unblocked road condition.
  • the first historical driving feature includes a plurality of historical driving features corresponding to each type of driving data, and the plurality of historical driving features correspond to different scenario types.
  • the driving data is a speed
  • the first historical driving feature includes a plurality of historical speed driving features corresponding to the speed
  • the plurality of historical speed driving features may include: a historical speed driving feature corresponding to rainy weather, a historical speed driving feature corresponding to a morning rush hour, a historical speed driving feature corresponding to a holiday, and the like.
  • a similarity between each type of driving data and a corresponding historical driving feature may be separately obtained based on the scenario data, and then the similarity between the first data and the first historical driving feature is determined based on the similarity between each type of driving data and the corresponding historical driving feature.
  • a first data curve corresponding to the type of driving data may be obtained.
  • a plurality of historical data curves corresponding to the type of driving data are obtained from the first historical driving feature.
  • a scenario type is determined based on the scenario data.
  • a first historical data curve corresponding to the scenario type is determined from the plurality of historical data curves.
  • a similarity between the first data curve and the first historical data curve is determined as the similarity between the type of driving data and the historical driving feature.
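The scenario-based selection steps above can be sketched as follows (an illustrative assumption: historical curves are keyed by scenario type, and the same mean-absolute-difference similarity measure as before is used; neither detail is mandated by this application):

```python
def scenario_similarity(data_curve, historical_curves_by_scenario, scenario_type):
    # Select, from the plurality of historical data curves for this
    # driving-data type, the one recorded for the current scenario type.
    reference = historical_curves_by_scenario[scenario_type]
    # Compare the first data curve with the selected historical curve.
    diffs = [abs(a - b) for a, b in zip(data_curve, reference)]
    return 1.0 - sum(diffs) / len(diffs)

historical_speed_curves = {
    "rainy": [0.3, 0.4, 0.5],
    "morning_rush": [0.2, 0.3, 0.4],
}
sim = scenario_similarity([0.3, 0.4, 0.5], historical_speed_curves, "rainy")
```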
  • the following describes a process of determining the similarity between the first data and the first historical driving feature with reference to FIG. 7 B .
  • FIG. 7 B is a schematic diagram of determining a similarity according to an embodiment of this application.
  • the first data includes the first driving data and the scenario data.
  • the first driving data may include a plurality of types of driving data.
  • Driving data 1, driving data 2, driving data 3, ..., and driving data N (N is an integer greater than 1) are N different types of driving data.
  • the driving data 1 may be a speed
  • the driving data 2 may be an acceleration
  • the driving data 3 may be an engine rotation speed.
  • a data curve corresponding to each type of driving data may be obtained.
  • a data curve corresponding to each type of driving data may be obtained by using the second model.
  • the driving data 1 may be input to the second model, so that the second model outputs a data curve 1 corresponding to the driving data 1.
  • the driving data 2 may be input to the second model, so that the second model outputs a data curve 2 corresponding to the driving data 2.
  • N data curves corresponding to the N types of driving data may be obtained.
  • a first historical driving feature includes a plurality of historical data curves corresponding to each type of driving data.
  • the first historical driving feature includes a plurality of historical data curves (for example, a historical data curve 11, a historical data curve 12, ..., and a historical data curve 1X) corresponding to the driving data 1, a plurality of historical data curves (for example, a historical data curve 21, a historical data curve 22, ..., and a historical data curve 2K) corresponding to the driving data 2, and the like.
  • a first historical data curve (assumed to be a historical data curve 11) corresponding to the scenario data may be selected, based on the scenario data, from the plurality of historical data curves corresponding to the driving data 1
  • a first historical data curve (assumed to be a historical data curve 21) corresponding to the scenario data may be selected, based on the scenario data, from the plurality of historical data curves corresponding to the driving data 2.
  • a first historical data curve corresponding to each type of driving data is selected based on the scenario data.
  • the data curve 1 and the selected historical data curve 11 may be compared to obtain a similarity 1
  • the data curve 2 and the selected historical data curve 21 may be compared to obtain a similarity 2
  • by analogy, N similarities may be obtained.
  • An average value or a weighted average value of the N similarities may be determined as the similarity between the first data and the first historical driving feature.
  • FIG. 7 A and FIG. 7 B merely show an example of determining the similarity between the first data and the first historical driving feature, and do not constitute a limitation thereto.
  • the similarity may be determined in another manner.
  • the similarity between the first data and the first historical driving feature may be further obtained according to an algorithm.
  • a process of obtaining the similarity according to an algorithm is similar to the process of obtaining the similarity by using the second model. Details are not described herein again.
  • a case in which the vehicle is abnormal in the first time period may include: the vehicle is faulty in the first time period, the vehicle is involved in an accident in the first time period, or the user drives the vehicle abnormally in the first time period. That the user drives the vehicle abnormally may include: the user drives extremely fast, the user drives extremely slowly, the user drives the vehicle to loiter in an area, or the like.
  • If the similarity is less than or equal to a similarity threshold, it is determined that the vehicle is abnormal in the first time period.
  • the driving record video of the vehicle may be annotated in the following manner: generating abnormality information; and annotating the abnormality information in the driving record video based on the first time period.
  • the abnormality information includes at least one of the following information: an abnormality indication, an abnormality level, an abnormality type, or abnormality description information.
  • the abnormality indication is used to indicate that the vehicle is abnormal.
  • the abnormality level is used to indicate an abnormality degree of the vehicle. For example, a lower similarity determined in S 306 indicates a higher abnormality degree of the vehicle and a higher abnormality level.
  • the abnormality type may include an abnormal brake, an abnormal turn, an abnormal acceleration, an abnormal engine, an abnormal driving track, and the like.
  • the abnormality description information is used to describe a specific condition of a vehicle abnormality.
  • the abnormality description information may include: 5 abrupt brakes within one minute, an acceleration greater than a first threshold, 10 round-trip drives in a same road section within one hour, or 10 lane changes within three minutes.
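The threshold check and the fields of the abnormality information described above can be sketched together like this (the threshold value, the level mapping, and all names are illustrative assumptions; this application does not fix concrete values):

```python
from dataclasses import dataclass

SIMILARITY_THRESHOLD = 0.6  # illustrative value

@dataclass
class AbnormalityInfo:
    indication: str        # e.g. "abnormal"
    level: int             # lower similarity -> higher abnormality level
    abnormality_type: str  # e.g. "abnormal brake"
    description: str       # e.g. "5 abrupt brakes within one minute"

def check_vehicle(similarity):
    # The vehicle is flagged abnormal when first-period behaviour drifts
    # too far from the driver's historical profile.
    if similarity > SIMILARITY_THRESHOLD:
        return None
    level = 3 if similarity < 0.2 else 2 if similarity < 0.4 else 1
    return AbnormalityInfo("abnormal", level, "abnormal brake",
                           "5 abrupt brakes within one minute")

info = check_vehicle(0.15)
```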
  • the user may view the annotated driving record video by using a terminal device (like a computer or a mobile phone).
  • An image shooting device may shoot a plurality of driving record videos of the vehicle.
  • one trip of the vehicle (from the time when the vehicle is started to the time when the vehicle is turned off) may correspond to one driving record video, or one piece of preset driving duration (for example, one hour) of the vehicle may correspond to one driving record video.
  • a plurality of driving record videos of the vehicle for a plurality of trips may be stored in a list.
  • When the driving record videos are annotated, abnormality information is generated and used as attribute information of the driving record videos. In this way, when the driving record videos are displayed in a list, the abnormality information may be displayed.
  • Some information (for example, the abnormality description information or some content in the abnormality description information) in the abnormality information may be further added to corresponding locations in the driving record videos as subtitles or pop-ups.
  • FIG. 8 A to FIG. 8 C are schematic diagrams of a video viewing manner according to an embodiment of this application. As shown in FIG. 8 A to FIG. 8 C , there are an interface 801 to an interface 803 .
  • the terminal device may display a folder corresponding to the driving record videos.
  • the user may tap the folder corresponding to the driving record videos to display a driving record video list on the terminal device.
  • the list includes five driving record videos and attribute information of the five driving record videos.
  • the attribute information includes a shooting time, an abnormality indication, an abnormality level, an abnormality type, and abnormality description information. Certainly, the attribute information may further include other information. This is not specifically limited in this embodiment of this application.
  • the five driving record videos may be videos corresponding to five trips of the vehicle. It can be learned from the driving record video list that, in time periods corresponding to a driving record video 1, a driving record video 4, and a driving record video 5, the vehicle is normal, and abnormality indications corresponding to the three driving record videos are “normal”.
  • the vehicle In time periods corresponding to a driving record video 2 and a driving record video 3, the vehicle is abnormal, and abnormality indications corresponding to the two driving record videos are “abnormal”.
  • the two driving record videos further have attributes: abnormality levels, abnormality types, and abnormality description information.
  • the abnormality description information may be brief description information, and the user may tap the brief description information to display detailed description information on the terminal device. For example, after the user taps the brief description information of “frequent brakes”, the terminal device may display detailed description information of “8 sudden brakes between the 20th minute and the 22nd minute”.
  • identifiers and attribute information of “normal” and “abnormal” driving record videos may be displayed in different display manners, where the display manners include a font color, a font type, a font size, and the like.
  • an identifier and attribute information of a “normal” driving record video may be displayed in red font
  • an identifier and attribute information of an “abnormal” driving record video may be displayed in black font.
  • the user may tap an identifier of a driving record video in the driving record video list to play the driving record video on the terminal device.
  • the user may tap an identifier of the driving record video 2 to play the driving record video 2 on the terminal device.
  • the terminal device plays the driving record video 2.
  • the user may determine, based on abnormality description information of the driving record video 2, that the vehicle is abnormal between the 20th minute and the 22nd minute of the driving record video 2, and the user may drag a video play progress bar based on the abnormality description information.
  • partial content (an abrupt brake) of the abnormality description information is displayed as subtitles in the video.
  • the user may conveniently obtain the driving record video corresponding to an abnormality in the vehicle based on the driving record video list, learn the time period when the vehicle is abnormal in the driving record video based on the abnormality description information, and accurately locate the time period in the driving record video.
  • the driving record video may alternatively be a video of preset duration.
  • duration of each driving record video is one hour.
  • the driving record video whose abnormality indication is “abnormal” may alternatively be a video shot in a time period in which the vehicle encounters an abnormality.
  • duration of the driving record video 2 may be two minutes, and the vehicle encounters an abnormality within the two minutes.
  • One driving record video shot by an image shooting device may correspond to one video file. If a vehicle becomes abnormal in a shooting time period of the driving record video, an abnormality information file is generated.
  • the abnormality information file may include one or more pieces of abnormality information, and the abnormality information file may be used as a configuration file of the video file.
  • One piece of abnormality information in the abnormality information file may correspond to one abnormality time period (or abnormality moment) in the driving record video, and there is an association between the abnormality information and the abnormality time period (or abnormality moment) in the driving record video. For example, the corresponding abnormality time period (or abnormality moment) may be located based on the abnormality information.
  • FIG. 9 is a schematic diagram of a video annotation manner according to an embodiment of this application.
  • a video file and an abnormality information file may be generated.
  • the video file may be a video of 80 minutes
  • the abnormality information file may include three pieces of abnormality information, which are respectively abnormality information 1, abnormality information 2, and abnormality information 3.
  • the abnormality information 1 corresponds to a moment t1 in the driving record video, that is, the moment t1 in the driving record video may be located based on the abnormality information 1.
  • the abnormality information 2 corresponds to a moment t2 in the driving record video, that is, the moment t2 in the driving record video may be located based on the abnormality information 2.
  • the abnormality information 3 corresponds to a moment t3 in the driving record video, that is, the moment t3 in the driving record video may be located based on the abnormality information 3.
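The abnormality-information-file layout of FIG. 9 can be sketched as a small sidecar structure serialized next to the video file (a hedged illustration: the JSON format, field names, and moments are assumptions; this application does not specify a file format):

```python
import json

def build_annotation_sidecar(video_file, events):
    # Pair each piece of abnormality information with its moment in the
    # driving record video, so a player can jump straight to t1, t2, t3.
    return {
        "video_file": video_file,
        "abnormalities": [
            {"moment_s": moment, "description": desc}
            for moment, desc in events
        ],
    }

sidecar = build_annotation_sidecar(
    "trip_0001.mp4",  # hypothetical file name
    [(1200, "abrupt brake"), (2640, "abrupt turn"), (4100, "frequent brakes")],
)
serialized = json.dumps(sidecar)
```

Keeping the abnormality information in a separate configuration file, as the description suggests, leaves the video file itself untouched.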
  • FIG. 10 A to FIG. 10 F are schematic diagrams of a video viewing manner according to an embodiment of this application. As shown in FIG. 10 A to FIG. 10 F, interfaces 1001 to 1006 are included.
  • On the interface 1001, it is assumed that a manner in which the terminal device obtains a driving record video is shown in FIG. 2 B or FIG. 2 C.
  • An application is installed in the terminal device. It is assumed that the driving record video is transmitted to the terminal device by using the application.
  • the user may start the application on the terminal device.
  • the application includes a “Video Playback” icon. The user may tap the icon to display the interface 1002 on the terminal device.
  • the interface 1001 may further include another icon. The icon included in the interface 1001 is not specifically limited in this embodiment of this application.
  • a page corresponding to the “All Videos” icon includes identifiers of all videos, and a page corresponding to the “Abnormality Video” icon includes an identifier of an abnormality video.
  • the user may tap the “Abnormality Video” icon to display the interface 1003 on the terminal device.
  • the abnormality video is a video whose shooting time period includes a time period in which the vehicle is abnormal.
  • an abnormality video list is included.
  • the user may tap any abnormality video to play the abnormality video on the terminal device. For example, if the user needs to view a video 2 , the user taps the video 2 to display the interface 1004 on the terminal device.
  • a video playing window M and at least one “Abnormality Point” icon are included.
  • a quantity of the “Abnormality Point” icons is consistent with a quantity of abnormalities in the abnormality video. For example, it is assumed that three abnormalities are included in the video 2 , and three “Abnormality Point” icons are accordingly included in the interface: an “Abnormality Point 1” icon, an “Abnormality Point 2” icon, and an “Abnormality Point 3” icon.
  • the three “Abnormality Point” icons may be generated based on the three pieces of abnormality information in FIG. 9 . If the user needs to view a video clip corresponding to the abnormality point 1 , the user may tap the “Abnormality Point 1” icon to display the interface 1005 on the terminal device.
  • a play progress bar of the video in the video playing window M jumps to the moment t1, and abnormality description information (an abrupt brake) is displayed in the video. If the user needs to view a video clip corresponding to the abnormality point 2 , the user may tap the “Abnormality Point 2” icon to display the interface 1006 on the terminal device.
  • the play progress bar of the video in the video playing window M jumps to the moment t2, and abnormality description information (an abrupt turn) is displayed in the video.
  • the user can conveniently view an abnormality video by using an application in the terminal device, and can quickly locate an abnormality time period based on an “Abnormality Point” icon.
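  • The “Abnormality Point” behaviour described for the interfaces 1004 to 1006 can be sketched as follows. The `Player` class and its fields are illustrative assumptions, not a real media-player API: tapping an icon merely jumps the play progress bar of the video playing window M to the annotated moment and shows the abnormality description.

```python
# Illustrative sketch of the FIG. 10 interaction: each "Abnormality Point"
# icon maps to one annotated moment and its description information.

class Player:
    def __init__(self, abnormality_points):
        # maps icon label -> (moment in seconds, description)
        self.abnormality_points = abnormality_points
        self.position_s = 0.0
        self.overlay = ""

    def tap_abnormality_point(self, label):
        moment, description = self.abnormality_points[label]
        self.position_s = moment    # progress bar jumps to t1, t2, ...
        self.overlay = description  # e.g. "abrupt brake" shown in the video

player = Player({
    "Abnormality Point 1": (125.0, "abrupt brake"),
    "Abnormality Point 2": (1810.0, "abrupt turn"),
})
player.tap_abnormality_point("Abnormality Point 2")
print(player.position_s, player.overlay)  # 1810.0 abrupt turn
```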
  • first data and a first historical driving feature of a vehicle in a first time period may be obtained, that the vehicle is abnormal in the first time period may be determined based on the first data and the first historical driving feature, and a driving record video of the vehicle may be annotated based on the first time period.
  • the first data may reflect a driving status of the vehicle in the first time period
  • the first historical driving feature may reflect a driving habit of a driver of the vehicle. Because the driving habit of the driver is usually fixed, whether the vehicle is abnormal in the first time period can be determined by comparing the first driving data with the first historical driving feature of the vehicle.
  • the driving record video shot by an image shooting device may be annotated to record that the vehicle is abnormal in the first time period.
  • the driving record video can be accurately annotated.
  • the user may accurately and quickly search for a video clip based on an annotation result of the driving record video to improve video clip searching efficiency.
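  • The obtain–determine–annotate flow summarised above can be sketched as a minimal pipeline. This is an assumption-laden illustration: the patent does not fix a detection rule, so the deviation threshold `k`, the statistics used, and all field names below are invented for illustration only.

```python
# A minimal sketch of the method: driving data in the first time period is
# compared against the driver's historical habit; if it deviates strongly,
# the driving record video is annotated for that time period.

def is_abnormal(first_data, historical_mean, historical_std, k=3.0):
    """Compare first driving data with the driver's historical habit."""
    return any(abs(x - historical_mean) > k * historical_std for x in first_data)

def annotate(video_annotations, time_period, description):
    """Record that the vehicle was abnormal in the given time period."""
    video_annotations.append({"period": time_period, "description": description})
    return video_annotations

# Historical braking decelerations (m/s^2) suggest a habit of about 2.0 +/- 0.5;
# a deceleration of 6.5 in the first time period then reads as an abrupt brake.
annotations = []
if is_abnormal([2.1, 1.9, 6.5], historical_mean=2.0, historical_std=0.5):
    annotate(annotations, (120.0, 130.0), "abrupt brake")
print(annotations)
```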
  • processing steps (S 301 to S 308 ) shown in the embodiment in FIG. 3 do not constitute a specific limitation on the data processing process.
  • the data processing process may include more or fewer steps than those in the embodiment of FIG. 3 .
  • the data processing process may include some steps in the embodiment of FIG. 3 , or some steps in the embodiment of FIG. 3 may be replaced by steps having a same function, or some steps in the embodiment of FIG. 3 may be split into a plurality of steps, or the like.
  • FIG. 11 is a schematic flowchart of another data processing method according to an embodiment of this application. Refer to FIG. 11 . The method may include the following steps.
  • An execution body for this embodiment of this application may be a vehicle, a driving recorder, or a data processing apparatus disposed in the vehicle or the driving recorder.
  • the data processing apparatus may be implemented by using software, or may be implemented by using a combination of software and hardware.
  • the data processing apparatus may be a processing chip.
  • the first data includes first driving data of a vehicle in a first time period, and the first historical driving feature is determined based on historical driving data of the vehicle.
  • S 1102 : Determine, based on the first data and the first historical driving feature, that the vehicle is abnormal in the first time period.
  • the first data and the first historical driving feature of the vehicle in the first time period may be obtained, that the vehicle is abnormal in the first time period is determined based on the first data and the first historical driving feature, and the driving record video of the vehicle is annotated based on the first time period.
  • the first data may reflect a driving status of the vehicle in the first time period
  • the first historical driving feature may reflect a driving habit of a driver of the vehicle. Because the driving habit of the driver is usually fixed, whether the vehicle is abnormal in the first time period can be determined by comparing the first driving data with the first historical driving feature of the vehicle.
  • the driving record video shot by an image shooting device may be annotated to record that the vehicle is abnormal in the first time period.
  • the driving record video can be accurately annotated.
  • the user may accurately and quickly search for a video clip based on an annotation result of the driving record video to improve video clip searching efficiency.
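  • As a companion to the steps above, the derivation of a first historical driving feature from historical driving data can be sketched per data type. The patent describes the feature as historical data curves indicating a distribution rule; summarising each type with a mean and standard deviation, as below, is one assumed concretisation of such a rule, not the patent's method.

```python
import statistics

# Sketch: summarise the distribution of each type of historical driving
# data so that new driving data can later be compared against the habit.

def historical_driving_feature(historical_driving_data):
    """historical_driving_data: {data type -> list of historical samples}."""
    feature = {}
    for data_type, samples in historical_driving_data.items():
        feature[data_type] = {
            "mean": statistics.fmean(samples),
            "std": statistics.pstdev(samples),
        }
    return feature

history = {
    "speed_kmh": [42, 48, 45, 50, 47, 44],
    "brake_decel_ms2": [1.8, 2.1, 2.0, 1.9, 2.2],
}
feature = historical_driving_feature(history)
print(round(feature["speed_kmh"]["mean"], 1))  # 46.0
```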
  • An embodiment of this application further provides a driving recorder.
  • FIG. 12 A is a schematic diagram of a structure of a driving recorder according to an embodiment of this application.
  • a driving recorder 120 includes a processing chip 121 and an image shooting device 122 .
  • the image shooting device 122 is configured to shoot a driving record video.
  • the processing chip 121 is configured to: obtain first data and a first historical driving feature; determine, based on the first data and the first historical driving feature, that the vehicle is abnormal in a first time period; and annotate a driving record video based on the first time period.
  • the first data includes first driving data of the vehicle in the first time period, and the first historical driving feature is determined based on historical driving data of the vehicle.
  • the processing chip 121 may further perform the data processing method shown in the foregoing method embodiments. A process and beneficial effects thereof are similar, and details are not described herein again.
  • FIG. 12 B is a schematic diagram of a structure of another driving recorder according to an embodiment of this application.
  • a driving recorder 120 may further include a memory 123 , and the memory 123 is configured to store a driving record video.
  • the driving record video may be sent to the memory 123 , to store the driving record video in the memory 123 .
  • the memory 123 may further store program instructions, and the processing chip 121 may perform, based on the program instructions, the technical solutions shown in the foregoing method embodiments.
  • the driving recorder 120 may further include a communications module 124 , and the communications module 124 may communicate with another device (for example, a vehicle or a cloud server).
  • the driving recorder 120 may further include a data collection device.
  • the data collection device may include a camera, a millimeter-wave radar, a laser radar, an inertial sensor, or the like.
  • the data collection device may obtain first data.
  • the driving recorder 120 is usually disposed on a vehicle.
  • the following describes a connection relationship between the driving recorder and the vehicle with reference to FIG. 13 A to FIG. 13 C .
  • the driving recorder in FIG. 12 B is used for description.
  • FIG. 13 A is a schematic diagram of device connection according to an embodiment of this application. As shown in FIG. 13 A , an interface may be disposed on the communications module 124 of the driving recorder 120 , and an interface may be disposed on a vehicle.
  • the interface on the driving recorder 120 may be connected to the interface on the vehicle, so that the driving recorder and the vehicle can communicate with each other by using the interfaces.
  • the interface may alternatively be disposed at another location. This is not specifically limited in this embodiment of this application.
  • FIG. 13 B is a schematic diagram of another device connection according to an embodiment of this application.
  • a communications module is disposed on the vehicle, and the communications module 124 in the driving recorder 120 may communicate (for example, through wireless communication) with the communications module on the vehicle.
  • the vehicle may send driving data to the driving recorder by using the communications modules.
  • FIG. 13 C is a schematic diagram of still another device connection according to an embodiment of this application.
  • a communications module is disposed on the vehicle, and a communications module in the vehicle may communicate with a server.
  • the vehicle may send driving data to a cloud server by using the communications module.
  • the driving recorder 120 may communicate with the cloud server by using the communications module 124 .
  • the driving recorder 120 may obtain the driving data from the cloud server by using the communications module 124 .
  • FIG. 13 A to FIG. 13 C merely show an example of a structure of a vehicle, and do not constitute a limitation on the structure of the vehicle.
  • An embodiment of this application further provides a vehicle.
  • the following describes structures of the vehicle with reference to FIG. 14 A and FIG. 14 B .
  • FIG. 14 A is a schematic diagram of a structure of a vehicle according to an embodiment of this application.
  • a vehicle 140 may include a processing chip 141 and an image shooting device 142 .
  • the image shooting device 142 is configured to shoot a driving record video.
  • the processing chip 141 is configured to: obtain first data and a first historical driving feature; determine, based on the first data and the first historical driving feature, that the vehicle is abnormal in a first time period; and annotate a driving record video based on the first time period.
  • the first data includes first driving data of the vehicle in the first time period, and the first historical driving feature is determined based on historical driving data of the vehicle.
  • the processing chip 141 may further perform the data processing method shown in the foregoing method embodiments. A process and beneficial effects thereof are similar, and details are not described herein again.
  • FIG. 14 B is a schematic diagram of another structure of the vehicle according to an embodiment of this application. Based on FIG. 14 A , as shown in FIG. 14 B , the vehicle may further include a memory 143 , and the memory 143 is configured to store a driving record video.
  • the driving record video may be sent to the memory 143 , to store the driving record video in the memory 143 .
  • the memory 143 may further store program instructions, and the processing chip 141 may perform, based on the program instructions, the technical solutions shown in the foregoing method embodiments.
  • the vehicle further includes a vehicle body status system 144 .
  • the vehicle body status system 144 may obtain first driving data, and send the first driving data to the processing chip 141 .
  • the vehicle further includes components.
  • the components include a plurality of parts and a plurality of sensors.
  • the vehicle body status system 144 may obtain the first driving data from the components of the vehicle.
  • the vehicle 140 may further include a communications module (not shown in the figure), and the communications module may communicate with another device (for example, a cloud server).
  • the vehicle 140 may further include a data collection device.
  • the data collection device may include a camera, a millimeter-wave radar, a laser radar, an inertial sensor, or the like.
  • the data collection device may obtain the first data
  • FIG. 15 is a schematic diagram of a structure of a data processing apparatus according to an embodiment of this application.
  • a data processing apparatus 150 may include an obtaining module 151 , a determining module 152 , and an annotation module 153 .
  • the obtaining module 151 is configured to obtain first data and a first historical driving feature.
  • the first data includes first driving data of a vehicle in a first time period, and the first historical driving feature is determined based on historical driving data of the vehicle.
  • the determining module 152 is configured to determine, based on the first data and the first historical driving feature, that the vehicle is abnormal in the first time period.
  • the annotation module 153 is configured to annotate a driving record video of the vehicle based on the first time period.
  • the obtaining module 151 may perform S 303 and S 305 in the embodiment shown in FIG. 3 and S 1101 in the embodiment shown in FIG. 11 .
  • the determining module 152 may perform S 306 and S 307 in the embodiment shown in FIG. 3 and S 1102 in the embodiment shown in FIG. 11 .
  • the annotation module 153 may perform S 308 in the embodiment shown in FIG. 3 and S 1103 in the embodiment shown in FIG. 11 .
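  • The module structure in FIG. 15 can be sketched as three callables composed into one apparatus, mirroring S 1101 to S 1103. The internals below (the lambda bodies, threshold, and return shapes) are placeholders invented for illustration; only the obtain/determine/annotate split comes from the patent.

```python
# Sketch of FIG. 15: obtaining module 151 (S1101), determining module 152
# (S1102), and annotation module 153 (S1103) composed into one apparatus.

class DataProcessingApparatus:
    def __init__(self, obtain, determine, annotate):
        self.obtain = obtain        # obtaining module 151
        self.determine = determine  # determining module 152
        self.annotate = annotate    # annotation module 153

    def process(self, first_time_period):
        first_data, feature = self.obtain(first_time_period)  # S1101
        if self.determine(first_data, feature):               # S1102
            return self.annotate(first_time_period)           # S1103
        return None

apparatus = DataProcessingApparatus(
    obtain=lambda p: ([6.5], {"mean": 2.0, "std": 0.5}),
    determine=lambda d, f: abs(d[0] - f["mean"]) > 3 * f["std"],
    annotate=lambda p: {"period": p, "abnormal": True},
)
print(apparatus.process((120.0, 130.0)))
```

  • Keeping the three modules behind a single `process` entry point matches the patent's mapping of each module to one step of the FIG. 11 method.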
  • the first driving data includes at least one of the following:
  • the first historical driving feature includes a plurality of historical data curves corresponding to a plurality of types of historical driving data, the historical data curves are used to indicate a distribution rule of the historical driving data, and the historical driving data includes at least one of the following:
  • the obtaining module is specifically configured to:
  • the first model is obtained by learning a plurality of groups of first samples, and each of the plurality of groups of first samples includes sample driving data and a sample historical driving feature.
  • the determining module 152 is specifically configured to:
  • the determining module 152 is specifically configured to:
  • the second model is obtained by learning a plurality of groups of second samples, each of the plurality of groups of second samples includes sample data, a sample driving feature, and a sample similarity, and the sample data includes sample driving data.
  • the determining module 152 is specifically configured to:
  • the first data further includes scenario data
  • the scenario data includes at least one of the following information:
  • time information, location information, road condition information, or weather information.
  • the determining module 152 is specifically configured to:
  • annotation module 153 is specifically configured to:
  • the abnormality information includes at least one of the following information:
  • an abnormality level, an abnormality type, or abnormality description information.
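  • One piece of abnormality information carrying the fields just listed can be sketched as below. The level rule, the type keys, and the description strings are illustrative assumptions; the patent only names the fields (abnormality level, abnormality type, abnormality description information), not how they are derived.

```python
# Illustrative construction of one piece of abnormality information with an
# abnormality level, an abnormality type, and abnormality description
# information, as in the annotation module's output.

def build_abnormality_info(abnormality_type, severity_score):
    level = "high" if severity_score >= 0.8 else "low"  # assumed rule
    descriptions = {
        "brake": "an abrupt brake",
        "turn": "an abrupt turn",
    }
    return {
        "abnormality_level": level,
        "abnormality_type": abnormality_type,
        "description": descriptions.get(abnormality_type, "unknown abnormality"),
    }

info = build_abnormality_info("brake", 0.9)
print(info["abnormality_level"], info["description"])  # high an abrupt brake
```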
  • FIG. 16 is a schematic diagram of a hardware structure of a data processing apparatus according to an embodiment of this application.
  • a data processing apparatus 160 may include a processor 161 and a memory 162 .
  • the processor 161 and the memory 162 may communicate with each other.
  • the processor 161 and the memory 162 communicate with each other by using a communications bus 163 .
  • the memory 162 is configured to store program instructions, and the processor 161 is configured to run the program instructions in the memory to perform the data processing method shown in any one of the foregoing method embodiments.
  • the processor 161 may perform the steps in the embodiments shown in FIG. 3 or FIG. 11 .
  • the data processing apparatus 160 may further include a transmitter and/or a receiver.
  • the processor may be a central processing unit (CPU), or may be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), or the like.
  • the general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. Steps of the methods disclosed with reference to this application may be directly implemented by a hardware processor, or may be implemented by a combination of hardware and software modules in the processor.
  • the data processing apparatus 160 may be a chip, a driving recorder, a vehicle, a component in a driving recorder, a component in a vehicle, or the like.
  • the data processing apparatus 160 may further include at least one of an image shooting device or a data collection device.
  • the data collection device may exist in a form of a sensor.
  • the data collection device may include: a camera, a millimeter-wave radar, a laser radar, an inertial sensor, or the like.
  • the image shooting device is configured to shoot the driving record video
  • the data collection device is configured to obtain the first data
  • the memory may be further configured to store the driving record video.
  • a structure of the data processing apparatus 160 may be shown as FIG. 12 A or FIG. 12 B .
  • the data processing apparatus 160 may further include at least one of an image shooting device or a data collection device.
  • the image shooting device is configured to shoot the driving record video
  • the data collection device is configured to obtain the first data
  • the memory may be further configured to store the driving record video.
  • the data collection device may exist in a form of a sensor.
  • the data collection device may include a camera, a millimeter-wave radar, a laser radar, an inertial sensor, or the like.
  • the data collection device may further be a vehicle body status system configured to obtain the first driving data.
  • a structure of the data processing apparatus may be shown as FIG. 14 A or FIG. 14 B .
  • This application provides a readable storage medium, where the readable storage medium stores a computer program.
  • the computer program is used to implement the data processing method according to any one of the foregoing embodiments.
  • An embodiment of this application provides a computer program product.
  • the computer program product includes instructions, and when the instructions are executed, a computer is enabled to perform the foregoing data processing method.
  • the memory includes: a read-only memory (ROM), a RAM, a flash memory, a hard disk drive, a solid-state drive, a magnetic tape, a floppy disk, an optical disc, and any combination thereof.
  • Embodiments of this application are described with reference to the flowcharts and/or block diagrams of the method, the device (system), and the computer program product according to embodiments of this application. It should be understood that computer program instructions may be used to implement each process and/or each block in the flowcharts and/or the block diagrams and a combination of a process and/or a block in the flowcharts and/or the block diagrams.
  • the computer program instructions may be provided for a general-purpose computer, a dedicated computer, an embedded processor, or a processing unit of another programmable data processing unit to generate a machine, so that instructions executed by a computer or the processing unit of the another programmable data processing unit generate an apparatus for implementing a specific function in one or more processes in the flowcharts and/or in one or more blocks in the block diagrams.
  • the computer program instructions may alternatively be stored in a computer-readable memory that can indicate a computer or another programmable data processing device to work in a specific manner, so that the instructions stored in the computer-readable memory generate an artifact that includes an instruction apparatus.
  • the instruction apparatus implements a specific function in one or more procedures in the flowcharts and/or in one or more blocks in the block diagrams.
  • the computer program instructions may alternatively be loaded onto a computer or another programmable data processing device, so that a series of operations and steps are performed on the computer or the another programmable device, to generate computer-implemented processing. Therefore, the instructions executed on the computer or the another programmable device provide steps for implementing a specific function in one or more procedures in the flowcharts and/or in one or more blocks in the block diagrams.
  • the term “including” and a variant thereof may refer to non-limitative inclusion; the term “or” and a variant thereof may refer to “and/or”.
  • the terms “first”, “second”, and the like are intended to distinguish between similar objects but do not necessarily indicate a specific order or sequence.
  • “A plurality of” in this application refers to two or more than two.
  • the term “and/or” describes an association relationship for associated objects and represents that three relationships may exist. For example, A and/or B may represent the following three cases: Only A exists, both A and B exist, and only B exists. The character “/” usually indicates an “or” relationship between the associated objects.

US17/993,609 2020-05-26 2022-11-23 Data processing method and apparatus, and device Pending US20230108895A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/092381 WO2021237465A1 (fr) 2020-05-26 2020-05-26 Procédé, appareil et dispositif de traitement de données

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/092381 Continuation WO2021237465A1 (fr) 2020-05-26 2020-05-26 Procédé, appareil et dispositif de traitement de données

Publications (1)

Publication Number Publication Date
US20230108895A1 true US20230108895A1 (en) 2023-04-06

Family

ID=75017350

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/993,609 Pending US20230108895A1 (en) 2020-05-26 2022-11-23 Data processing method and apparatus, and device

Country Status (6)

Country Link
US (1) US20230108895A1 (fr)
EP (1) EP4148700A4 (fr)
JP (1) JP2023527387A (fr)
KR (1) KR20230014749A (fr)
CN (1) CN112543937A (fr)
WO (1) WO2021237465A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114120633A (zh) * 2021-11-02 2022-03-01 吉旗(成都)科技有限公司 车辆行程回放方法、装置、服务器及存储介质
CN114281446A (zh) * 2021-11-26 2022-04-05 上海闪马智能科技有限公司 异常事件的显示方法及装置、存储介质、电子装置

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19939468B4 (de) * 1999-08-20 2014-09-25 Robert Bosch Gmbh Verfahren zur Steuerung einer Aufzeichnung eines Unfalldatenrekorders in Kraftfahrzeugen
JP5664605B2 (ja) * 2012-07-26 2015-02-04 株式会社デンソー ドライブ映像記録装置及びドライブ映像記録システム
JP6486640B2 (ja) * 2014-10-09 2019-03-20 株式会社日立製作所 運転特性診断装置、運転特性診断システム、運転特性診断方法、情報出力装置、情報出力方法
US9925987B1 (en) * 2015-12-11 2018-03-27 Lytx, Inc. Driving abnormality detection
US10397516B2 (en) * 2016-04-29 2019-08-27 Ford Global Technologies, Llc Systems, methods, and devices for synchronization of vehicle data with recorded audio
CN106504367B (zh) * 2016-11-02 2019-03-22 中车青岛四方机车车辆股份有限公司 车辆故障预警方法和装置
CN110910636B (zh) * 2017-11-27 2021-08-10 银江股份有限公司 一种基于海量交通数据分析的车辆安全驾驶行为分析方法
DE102018205203A1 (de) * 2018-04-06 2019-10-10 Robert Bosch Gmbh Datenrekorderanordnung für ein Fahrzeug
CN108846997A (zh) * 2018-06-23 2018-11-20 张譯丹 无车承运人辅助安全监控系统
CN110956800B (zh) * 2018-09-27 2021-07-23 杭州海康威视系统技术有限公司 一种路段交通数据预处理方法、装置及电子设备
CN110505423A (zh) * 2019-08-15 2019-11-26 杭州鸿晶自动化科技有限公司 一种行车记录系统
CN110807436B (zh) * 2019-11-07 2022-10-18 深圳鼎然信息科技有限公司 危险驾驶行为识别与危险事件预测方法、装置及存储介质
CN110942671B (zh) * 2019-12-04 2022-06-07 北京京东乾石科技有限公司 车辆危险驾驶检测方法、装置以及存储介质

Also Published As

Publication number Publication date
JP2023527387A (ja) 2023-06-28
EP4148700A1 (fr) 2023-03-15
CN112543937A (zh) 2021-03-23
EP4148700A4 (fr) 2023-03-29
KR20230014749A (ko) 2023-01-30
WO2021237465A1 (fr) 2021-12-02


Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: HUAWEI TECHNOLOGIES CO., LTD., CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BAO, NARISONG;ZHANG, GUICHENG;SIGNING DATES FROM 20230207 TO 20230221;REEL/FRAME:062845/0146