CN115200696A - Vibration behavior monitoring method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN115200696A
Authority
CN
China
Prior art keywords
vibration
data
current user
vehicle
recognition result
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110321602.7A
Other languages
Chinese (zh)
Inventor
陈鹏 (Chen Peng)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nine Intelligent Changzhou Tech Co Ltd
Original Assignee
Nine Intelligent Changzhou Tech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nine Intelligent Changzhou Tech Co Ltd filed Critical Nine Intelligent Changzhou Tech Co Ltd
Priority to CN202110321602.7A priority Critical patent/CN115200696A/en
Publication of CN115200696A publication Critical patent/CN115200696A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01H: MEASUREMENT OF MECHANICAL VIBRATIONS OR ULTRASONIC, SONIC OR INFRASONIC WAVES
    • G01H17/00: Measuring mechanical vibrations or ultrasonic, sonic or infrasonic waves, not provided for in the preceding groups
    • G01M: TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M17/00: Testing of vehicles
    • G01M17/007: Wheeled or endless-tracked vehicles
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/08: Learning methods

Abstract

The application provides a vibration behavior monitoring method and device, an electronic device, and a storage medium. The method includes: acquiring vehicle vibration data to be identified; generating features of the vibration behavior corresponding to the vehicle vibration data from the vibration data and a trained first deep learning model; and identifying the current user from the features of the vibration behavior to obtain a first recognition result. With the vibration behavior monitoring method and device, electronic device, and storage medium of the application, vibration behavior can be recognized at a finer granularity, improving the user experience.

Description

Vibration behavior monitoring method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of vehicle technologies, and in particular, to a method and an apparatus for monitoring a vibration behavior, an electronic device, and a storage medium.
Background
With economic development, travel tools such as automobiles have gradually become daily necessities. To enhance the user experience and ensure the safety of people and vehicles, recognizing the behaviors and states of a vehicle is particularly important. Among these, vehicle vibration behavior recognition is often applied to vehicle safety monitoring, for example issuing an alarm prompt for abnormal vehicle vibration.
In the related art, the vibration behavior of a vehicle is recognized mainly from the raw vibration data of the vehicle, so finer-grained recognition cannot be achieved and the user experience is poor.
Disclosure of Invention
The present application is directed to solving, at least to some extent, one of the technical problems in the related art.
Therefore, a first objective of the present application is to provide a vibration behavior monitoring method, so as to realize finer-grained recognition of vibration behavior and improve the user experience.
A second object of the present application is to provide a vibration behavior monitoring device.
A third object of the present application is to propose an electronic device.
A fourth object of the present application is to propose a computer-readable storage medium.
In order to achieve the above object, an embodiment of a first aspect of the present application provides a vibration behavior monitoring method, including: acquiring vehicle vibration data to be identified; generating the characteristics of the vibration behaviors corresponding to the vehicle vibration data according to the vehicle vibration data and the trained first deep learning model; and identifying the current user according to the characteristics of the vibration behaviors to obtain a first identification result.
The vibration behavior monitoring method provided by the embodiment of the application acquires vehicle vibration data to be recognized, generates features of the vibration behavior corresponding to the vibration data from the data and a trained first deep learning model, and recognizes the current user from those features to obtain a first recognition result. Because the current user is identified from the features of the vibration behavior, the vibration behavior can be recognized at a finer granularity, improving the user experience.
According to one embodiment of the application, acquiring the vehicle vibration data to be identified includes: acquiring the vehicle vibration data through an inertial measurement unit (IMU).
According to an embodiment of the application, the identifying the current user according to the characteristics of the vibration behavior includes: and identifying the current user according to the characteristics of the vibration behaviors, the pre-stored characteristics of the vibration behaviors and a corresponding relation table of registered users.
According to an embodiment of the present application, the vibration behavior monitoring method of the embodiment of the present application further includes: acquiring the identification data of the current user; identifying the current user according to the identity identification data to obtain a second identification result; verifying the first recognition result according to the second recognition result; and determining a target identification result according to the verification result.
According to one embodiment of the application, the identification data includes at least one of: a face image, vehicle locking habit data, vehicle unlocking habit data, vehicle driving path habit data, and vehicle destination habit data.
According to an embodiment of the present application, the identification data includes a face image, and the identifying the current user according to the identification data to obtain a second identification result includes: generating the characteristics of the face image according to the face image and a trained second deep learning model; and identifying the current user according to the characteristics of the face image to obtain a second identification result.
According to an embodiment of the present application, the identifying the current user according to the features of the face image includes: and identifying the current user according to the characteristics of the face image, the pre-stored corresponding relation table of the characteristics of the face image and the registered user.
According to an embodiment of the application, the determining the target recognition result according to the verification result includes: if the second recognition result is consistent with the first recognition result, determining that the target recognition result is the first recognition result; and if the second recognition result is inconsistent with the first recognition result, determining that the target recognition result is the second recognition result.
According to an embodiment of the present application, the method for monitoring vibration behavior of an embodiment of the present application further includes: and outputting corresponding alarm information according to the target identification result.
According to an embodiment of the present application, the outputting corresponding alarm information according to the target recognition result includes: if the current user is the registered user, the alarm information is not output; and if the current user is not the registered user, outputting the alarm information.
According to an embodiment of the application, the first deep learning model and the second deep learning model include any one of: a convolutional neural network CNN model, a recurrent neural network RNN model and a long-short term memory neural network LSTM model.
To achieve the above object, a second aspect of the present application provides a vibration behavior monitoring device, including: the acquisition module is used for acquiring vehicle vibration data to be identified; the generating module is used for generating the characteristics of the vibration behaviors corresponding to the vehicle vibration data according to the vehicle vibration data and the trained first deep learning model; and the identification module is used for identifying the current user according to the characteristics of the vibration behaviors to obtain a first identification result.
The vibration behavior monitoring device provided by the embodiment of the application acquires vehicle vibration data to be recognized, generates features of the vibration behavior corresponding to the vibration data from the data and a trained first deep learning model, and recognizes the current user from those features to obtain a first recognition result. Because the current user is identified from the features of the vibration behavior, the vibration behavior can be recognized at a finer granularity, improving the user experience.
To achieve the above object, an embodiment of a third aspect of the present application provides an electronic device, including: a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the program, implements the vibration behavior monitoring method according to the embodiment of the first aspect of the present application.
To achieve the above object, a fourth aspect of the present application provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements a vibration behavior monitoring method according to an embodiment of the first aspect of the present application.
Drawings
FIG. 1 is a schematic flow diagram of a method of vibration behavior monitoring according to one embodiment of the present application;
FIG. 2 is a schematic flow diagram of a method of vibration behavior monitoring according to another embodiment of the present application;
FIG. 3 is a schematic flow diagram of a vibration behavior monitoring method according to another embodiment of the present application;
FIG. 4 is a schematic view of a scenario of a vibration behavior monitoring method according to another embodiment of the present application;
FIG. 5 is a schematic diagram of a vibration behavior monitoring device according to one embodiment of the present application;
FIG. 6 is a schematic view of an electronic device according to one embodiment of the present application.
Detailed Description
Reference will now be made in detail to the embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar functions throughout. The embodiments described below with reference to the drawings are exemplary and intended to be used for explaining the present application and should not be construed as limiting the present application.
The following describes a vibration behavior monitoring method, apparatus, electronic device, and storage medium according to embodiments of the present application with reference to the drawings.
Fig. 1 is a schematic flow chart of a vibration behavior monitoring method according to an embodiment of the present application. The vibration behavior monitoring method can be executed by the vibration behavior monitoring device, and the vibration behavior monitoring device can be arranged in a vehicle and/or a cloud server. As shown in fig. 1, the vibration behavior monitoring method according to the embodiment of the present application includes the following steps:
and S101, acquiring vehicle vibration data to be identified.
In the embodiment of the application, the vehicle vibration data is data related to the vibration behavior of the vehicle. Specifically, the current vibration data of the vehicle to be identified can be acquired through an inertial measurement unit (IMU) or similar sensor.
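The IMU acquisition in S101 can be pictured as a small buffering step. The sketch below is purely illustrative: the window length, the 6-axis sample layout, and names such as `IMUBuffer` are assumptions, since the patent does not specify how raw samples are segmented.

```python
# Hypothetical buffering of raw IMU samples into fixed-length windows before
# they are handed to the feature model (window size is an assumption).
from collections import deque

class IMUBuffer:
    def __init__(self, window=128):
        self.window = window
        self.samples = deque(maxlen=window)  # keeps only the newest readings

    def push(self, accel_xyz, gyro_xyz):
        """Append one 6-axis IMU reading (3 accelerometer + 3 gyroscope values)."""
        self.samples.append(tuple(accel_xyz) + tuple(gyro_xyz))

    def ready(self):
        return len(self.samples) == self.window

    def segment(self):
        """Return the current window as a list of 6-tuples, oldest first."""
        return list(self.samples)

buf = IMUBuffer(window=4)
for i in range(6):  # push 6 readings; the deque retains the newest 4
    buf.push((0.0, 0.0, 9.8 + 0.01 * i), (0.0, 0.0, 0.0))
```

Once `ready()` is true, the segment would be passed on as the "vehicle vibration data to be identified".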
And S102, generating the characteristics of the vibration behaviors corresponding to the vehicle vibration data according to the vehicle vibration data and the trained first deep learning model.
In the embodiment of the application, the vehicle vibration data acquired in step S101 is input into a first deep learning model trained in advance, and the first deep learning model outputs the characteristics of the vibration behavior corresponding to the vehicle vibration data. In actual operation, the vehicle vibration data acquired in step S101 generally needs to be preprocessed, and the preprocessed vehicle vibration data is input into the first deep learning model trained in advance. The preprocessing may specifically include data screening, etc., which is not limited in this embodiment. The training process of the first deep learning model is not described herein.
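As a rough illustration of step S102, the sketch below screens out invalid samples and computes a simple per-axis feature vector. The spike threshold and the RMS statistics are stand-ins for the unspecified preprocessing and for the learned features of the trained first deep learning model; they are not the patent's actual pipeline.

```python
# Toy preprocessing (screening) + feature extraction for one vibration segment.
import math

def preprocess(samples, spike_limit=100.0):
    """Drop obviously invalid readings (simple data screening) and mean-center."""
    kept = [s for s in samples if all(abs(v) < spike_limit for v in s)]
    if not kept:
        return []
    means = [sum(col) / len(kept) for col in zip(*kept)]
    return [tuple(v - m for v, m in zip(s, means)) for s in kept]

def feature_vector(samples):
    """Per-axis RMS energy as a toy substitute for the learned embedding."""
    cols = list(zip(*samples))
    return [math.sqrt(sum(v * v for v in col) / len(col)) for col in cols]

# Third sample is an impossible spike and is screened out.
seg = preprocess([(0.1, -0.2, 9.8), (0.3, 0.1, 9.7), (500.0, 0.0, 0.0)])
feat = feature_vector(seg)
```

In the patent's scheme, `feat` would instead be the output of the trained first deep learning model.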
S103, identifying the current user according to the characteristics of the vibration behaviors to obtain a first identification result.
In the embodiment of the application, the current user is the user who is currently causing the vehicle to generate the vehicle vibration data. The features of the vibration behavior may differ from user to user, since each user exhibits certain patterns in specific behaviors; for example, driving habits may differ between users. The current user is identified from the features of the vibration behavior generated in step S102 to obtain a first recognition result, for example, whether the current user is a registered user, where a registered user may be the vehicle owner or a friend or family member of the owner.
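A minimal sketch of the look-up in S103 might match the new feature vector against enrolled feature vectors per registered user. Cosine similarity and the acceptance threshold are assumptions; the patent only describes a correspondence-table look-up.

```python
# Hypothetical matching of a feature vector against a registered-user table.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def identify(feature, user_table, threshold=0.9):
    """Return the best-matching registered user, or None (unregistered)."""
    best_user, best_score = None, threshold
    for user, enrolled_feats in user_table.items():
        for f in enrolled_feats:
            score = cosine(feature, f)
            if score > best_score:
                best_user, best_score = user, score
    return best_user

table = {"owner": [[0.1, 0.15, 0.05]], "friend": [[0.5, 0.1, 0.9]]}
result = identify([0.11, 0.14, 0.05], table)  # close to the owner's enrollment
```

`result` here is the first recognition result of the example; a vector matching no enrolled user would yield `None`.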
According to the vibration behavior monitoring method, vehicle vibration data to be recognized is acquired, features of the vibration behavior corresponding to the vibration data are generated from the data and a trained first deep learning model, and the current user is recognized from those features to obtain a first recognition result. Because the current user is identified from the features of the vibration behavior, the vibration behavior is recognized at a finer granularity and the user experience is improved.
Fig. 2 is a flow chart of a vibration behavior monitoring method according to another embodiment of the present application. As shown in fig. 2, based on the embodiment shown in fig. 1, the vibration behavior monitoring method of the embodiment of the present application includes the following steps:
s201, vehicle vibration data to be identified are acquired.
S202, generating the characteristics of the vibration behaviors corresponding to the vehicle vibration data according to the vehicle vibration data and the trained first deep learning model.
And S203, identifying the current user according to the characteristics of the vibration behaviors to obtain a first identification result.
In the embodiment of the present application, steps S201 to S203 are the same as steps S101 to S103 in the above embodiment, and are not described herein again.
And S204, acquiring the identification data of the current user.
In the embodiment of the application, because the features of the vibration behaviors of different users may be similar, the current user may not be accurately identified from the features of the vibration behavior alone; the current user can therefore be identified again using identification data. The identification data may include, but is not limited to, at least one of: a face image, vehicle locking habit data, vehicle unlocking habit data, vehicle driving path habit data, and vehicle destination habit data. Vehicle locking habit data reflects, for example, whether the user habitually locks the vehicle with a remote key or a physical key, and vehicle unlocking habit data likewise reflects the habitual unlocking method. Vehicle driving path habit data reflects, for example, the user's habitual route between work and home, and vehicle destination habit data reflects the user's habitual destinations, such as the workplace or home.
As for the face image, the face image of the current user may be obtained by an image acquisition device installed on the vehicle, for example, a camera.
S205, the current user is identified according to the identity identification data, and a second identification result is obtained.
In this embodiment of the application, since the corresponding identification data of different users may be different, the current user is identified according to the identification data of the current user obtained in step S204, and a second identification result is obtained. For example, whether the current user is a registered user is identified, and the registered user may be a car owner or a friend, family, etc. of the car owner.
This step S205 will be described in detail below by taking the case where the identification data includes a face image as an example.
As shown in fig. 3, the step S205 of "identifying the current user according to the identification data to obtain the second identification result" may specifically include the following steps:
and S301, generating the characteristics of the face image according to the face image and the trained second deep learning model.
In the embodiment of the present application, the face image of the current user acquired in step S204 is input into a second deep learning model trained in advance, and the second deep learning model outputs features of the face image. In actual operation, the face image of the current user obtained in step S204 generally needs to be preprocessed, and the preprocessed face image of the current user is input into the second deep learning model trained in advance. The preprocessing may specifically include data screening, etc., which is not limited in this embodiment. The training process of the second deep learning model is not described herein.
And S302, identifying the current user according to the characteristics of the face image to obtain a second identification result.
In the embodiment of the application, because the features of the face images corresponding to different users may be different, the current user is identified according to the features of the face image generated in step S301, and a second identification result is obtained. For example, whether the current user is a registered user is identified, and the registered user may be a car owner or a friend, family, etc. of the car owner.
After step S205, the vibration behavior monitoring method of the embodiment of the present application may further include the following steps S206 to S208.
S206, verifying the first recognition result according to the second recognition result.
In the embodiment of the present application, the second identification result obtained in step S205 and the first identification result obtained in step S203 are compared in consistency, so as to implement cross validation.
And S207, determining a target identification result according to the verification result.
In the embodiment of the present application, for example, if the second recognition result is consistent with the first recognition result, the target recognition result is determined as the first recognition result. And if the second recognition result is not consistent with the first recognition result, determining that the target recognition result is the second recognition result, or determining that the target recognition result is that the current user is not the registered user.
And S208, outputting corresponding alarm information according to the target recognition result.
In the embodiment of the application, if the current user is a registered user, no alarm information is output; if the current user is not a registered user, alarm information is output. The alarm output may be, for example, a text message sent to the mobile terminal of the vehicle owner or a siren sounded by the vehicle, and is not limited here.
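Steps S206 to S208 condense into two small decision rules. The sketch below follows the text (agreeing results keep the first result, disagreeing results fall back to the second, and an alarm fires only for an unregistered user); the function names are illustrative.

```python
# Cross-validation of the two recognition results and the alarm decision.

def target_result(first, second):
    """If the two results agree, keep the first; otherwise trust the second."""
    return first if first == second else second

def alarm_needed(target, registered_users):
    """Alarm only when the finally identified user is not a registered user."""
    return target not in registered_users

registered = {"owner", "family"}
target = target_result("owner", "owner")  # consistent results keep the first
# Mismatch: fall back to the second result (None = unrecognized), so alarm.
needs_alarm = alarm_needed(target_result("owner", None), registered)
```

The patent also allows the mismatch case to be resolved directly as "current user is not a registered user", which yields the same alarm decision.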
Further, the step S203 may specifically include: and identifying the current user according to the characteristics of the vibration behaviors, the pre-stored characteristics of the vibration behaviors and the corresponding relation table of the registered user.
In the embodiment of the application, the corresponding registered user can be looked up, according to the features of the vibration behavior, in the pre-stored correspondence table between vibration behavior features and registered users, thereby identifying the current user. The correspondence table can be built as follows: a registered user performs a given vibration behavior multiple times, for example repeatedly pushing the vehicle; the vehicle vibration data generated each time is input into the trained first deep learning model to obtain the corresponding vibration behavior features, and a correspondence between the registered user and the resulting groups of features is recorded. Correspondence entries for other registered users are established in the same way.
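The enrollment procedure described above, repeating a vibration behavior and storing one feature vector per repetition, might be sketched as follows; `extract_features` is a toy stand-in for the trained first deep learning model, and all names are assumptions.

```python
# Hypothetical construction of the feature/registered-user correspondence table.

def extract_features(segment):
    """Placeholder for the learned embedding: per-axis mean absolute value."""
    cols = list(zip(*segment))
    return [sum(abs(v) for v in col) / len(col) for col in cols]

def enroll(user_table, user, segments):
    """Add one feature vector per recorded vibration segment for `user`."""
    feats = user_table.setdefault(user, [])
    for seg in segments:
        feats.append(extract_features(seg))
    return user_table

table = {}
# Two recorded repetitions of the same behavior, each a list of 2-axis samples.
enroll(table, "owner", [[(0.1, 0.2), (0.3, 0.4)], [(0.2, 0.2), (0.2, 0.2)]])
```

At recognition time, a new feature vector would be compared against every enrolled vector in `table` to find the closest registered user.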
Further, the step S302 may specifically include: and identifying the current user according to the characteristics of the face image, the pre-stored characteristics of the face image and the corresponding relation table of the registered user.
In the embodiment of the application, the corresponding registered user can be searched in the corresponding relation table of the pre-stored facial image characteristics and the registered users according to the facial image characteristics, so that the identification of the current user is realized. The corresponding relation table of the features of the face image and the registered user can be obtained in the following way: and acquiring a plurality of face images of the registered user, inputting the face images into the trained second deep learning model to obtain the features of the face images, establishing a corresponding relation table between the registered user and the features of the plurality of groups of face images, and establishing a corresponding relation table between other registered users and the features of the plurality of groups of corresponding face images in the same way.
Those skilled in the art will appreciate that the first deep learning model and the second deep learning model in the embodiment of the present application may include, but are not limited to, any one of the following: a convolutional neural network (CNN) model, a recurrent neural network (RNN) model, a long short-term memory (LSTM) network model, and the like.
According to the vibration behavior monitoring method, vehicle vibration data to be recognized is acquired, features of the vibration behavior corresponding to the vibration data are generated from the data and a trained first deep learning model, and the current user is recognized from those features to obtain a first recognition result. Because the current user is identified from the features of the vibration behavior, the vibration behavior can be recognized at a finer granularity, improving the user experience. Identifying the current user from the identity identification data to obtain a second recognition result allows the first recognition result, obtained from the features of the vibration behavior, to be verified, avoiding the situation in which the current user cannot be accurately identified from the features of the vibration behavior alone.
For clearly explaining the vibration behavior monitoring method according to the embodiment of the present application, the vibration behavior monitoring method according to the embodiment of the present application is described below with reference to fig. 4. As shown in fig. 4, the vehicle end generates a correspondence table between the characteristics of the vibration behavior and the registered user through user vibration behavior registration, and uploads the correspondence table to the cloud server for storage. And the vehicle side generates a corresponding relation table of the characteristics of the face image and the registered user through the face registration of the user, and uploads the face image and the corresponding relation table to the cloud server for storage. The vehicle end obtains vehicle vibration data, preprocesses the vehicle vibration data and uploads the vehicle vibration data to the cloud server, the preprocessed vehicle vibration data generate corresponding vibration behavior characteristics after passing through the first deep learning model, and a first identification result of a current user is obtained according to the stored vibration behavior characteristics and the corresponding relation table of the registered user. The vehicle side acquires a face image of a current user, the face image is preprocessed and then uploaded to the cloud server, the preprocessed face image generates corresponding features of the face image after passing through the second deep learning model, a second recognition result of the current user is obtained according to the stored features of the face image and a corresponding relation table of a registered user, the first recognition result and the second recognition result are subjected to cross verification to obtain a target recognition result, and the vehicle side is controlled to output corresponding alarm information according to the target recognition result.
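The vehicle/cloud scenario of FIG. 4 can be wired together as a single pipeline. Which callables run on the vehicle versus the cloud server, and every name below, are assumptions for illustration only.

```python
# Hypothetical end-to-end flow: two feature models, two table look-ups,
# cross-validation, and the alarm decision.

def monitor(vibration_segment, face_image, vib_model, face_model,
            vib_table, face_table, match, registered):
    feat_v = vib_model(vibration_segment)          # first deep learning model
    first = match(feat_v, vib_table)               # first recognition result
    feat_f = face_model(face_image)                # second deep learning model
    second = match(feat_f, face_table)             # second recognition result
    target = first if first == second else second  # cross-validation (S206/S207)
    return target, (target not in registered)      # (result, alarm needed?)

# Toy stand-ins just to exercise the wiring:
target, alarm = monitor(
    "seg", "img",
    vib_model=lambda s: "v-feat", face_model=lambda i: "f-feat",
    vib_table={"v-feat": "owner"}, face_table={"f-feat": "owner"},
    match=lambda f, t: t.get(f), registered={"owner"},
)
```

In the scenario described above, the correspondence tables and both models would live on the cloud server, with the vehicle end uploading preprocessed data and receiving the alarm decision.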
In order to implement the embodiments, the embodiments of the present application further provide a vibration behavior monitoring device, which can implement the vibration behavior monitoring method of any of the embodiments. Fig. 5 is a schematic structural diagram of a vibration behavior monitoring device according to an embodiment of the present application. As shown in fig. 5, the vibration behavior monitoring device 40 according to the embodiment of the present application may specifically include: an acquisition module 41, a generation module 42, and an identification module 43.
The obtaining module 41 is configured to obtain vehicle vibration data to be identified.
And the generating module 42 is configured to generate a feature of a vibration behavior corresponding to the vehicle vibration data according to the vehicle vibration data and the trained first deep learning model.
And the identification module 43 is configured to identify the current user according to the characteristics of the vibration behavior to obtain a first identification result.
Further, in a possible implementation manner of the embodiment of the present application, the obtaining module 41 is specifically configured to: vehicle vibration data is acquired through an inertial sensor IMU.
Further, in a possible implementation manner of the embodiment of the present application, the identifying module 43 is specifically configured to: and identifying the current user according to the characteristics of the vibration behaviors, the pre-stored characteristics of the vibration behaviors and the corresponding relation table of the registered user.
Further, in a possible implementation manner of the embodiment of the present application, the obtaining module 41 is further configured to obtain a face image of a current user; the generating module 42 is further configured to generate features of the face image according to the face image and the trained second deep learning model; the recognition module 43 is further configured to recognize the current user according to the features of the face image, so as to obtain a second recognition result.
Further, in a possible implementation manner of the embodiment of the present application, the identifying module 43 is specifically configured to: and identifying the current user according to the characteristics of the facial image, the pre-stored characteristics of the facial image and the corresponding relation table of the registered user.
Further, in a possible implementation manner of the embodiment of the present application, the identifying module 43 is further configured to: verifying the first recognition result according to the second recognition result; and determining a target identification result according to the verification result.
Further, in a possible implementation manner of the embodiment of the present application, the identifying module 43 is specifically configured to: if the second recognition result is consistent with the first recognition result, determining the target recognition result as the first recognition result; and if the second recognition result is not consistent with the first recognition result, determining that the target recognition result is the second recognition result.
Further, in a possible implementation manner of the embodiment of the present application, the identifying module 43 is further configured to: and outputting corresponding alarm information according to the target identification result.
Further, in a possible implementation manner of the embodiment of the present application, the identifying module 43 is specifically configured to: if the current user is a registered user, not outputting alarm information; and if the current user is not the registered user, outputting alarm information.
Further, in a possible implementation manner of the embodiment of the present application, the first deep learning model and the second deep learning model may specifically include, but are not limited to, any one of the following: a convolutional neural network CNN model, a recurrent neural network RNN model and a long-short term memory neural network LSTM model.
It should be noted that the above explanation of the embodiment of the vibration behavior monitoring method also applies to the vibration behavior monitoring apparatus of this embodiment and is not repeated here.
The vibration behavior monitoring device provided by the embodiments of the present application acquires vehicle vibration data to be recognized, generates the characteristics of the vibration behavior corresponding to that data using a trained first deep learning model, and identifies the current user from those characteristics to obtain a first recognition result. Identifying the current user from the characteristics of the vibration behavior allows vibration behaviors to be recognized at a finer granularity and improves the user experience. Identifying the current user from the characteristics of a face image yields a second recognition result, which can verify the first recognition result and avoids the situation in which the current user cannot be accurately identified from the vibration-behavior characteristics alone.
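The overall flow just summarized can be sketched end to end (all names are hypothetical, and the two deep-learning feature extractors and the table lookup are passed in as stubs):

```python
def monitor(vibration_data, face_image,
            extract_vibration_features, extract_face_features, identify):
    """End-to-end sketch: two recognition passes, then verification.

    `identify` stands in for the correspondence-table lookup; the two
    extractors stand in for the first and second deep learning models."""
    first = identify(extract_vibration_features(vibration_data))
    second = identify(extract_face_features(face_image))
    # Keep the vibration-based result only when the face-based result confirms it.
    return first if second == first else second
```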
To implement the foregoing embodiments, an embodiment of the present application further provides an electronic device 50. As shown in Fig. 6, the electronic device 50 may include a memory 51, a processor 52, and a computer program stored in the memory 51 and executable on the processor 52; when the processor 52 executes the program, the vibration behavior monitoring method of the foregoing embodiments is implemented.
To implement the foregoing embodiments, the present application further proposes a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements the vibration behavior monitoring method of the foregoing embodiments.
In the description herein, reference to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Moreover, various embodiments or examples and features of various embodiments or examples described in this specification can be combined and combined by one skilled in the art without being mutually inconsistent.
Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (14)

1. A method of vibration behavior monitoring, comprising:
acquiring vehicle vibration data to be identified;
generating the characteristics of the vibration behaviors corresponding to the vehicle vibration data according to the vehicle vibration data and a trained first deep learning model;
and identifying the current user according to the characteristics of the vibration behaviors to obtain a first identification result.
2. The method according to claim 1, wherein the acquiring vehicle vibration data to be identified comprises:
acquiring the vehicle vibration data through an inertial measurement unit (IMU).
3. The method of claim 1, wherein identifying a current user based on the vibration behavior characteristics comprises:
identifying the current user according to the characteristics of the vibration behaviors and a pre-stored correspondence table between vibration-behavior characteristics and registered users.
4. The method of claim 1, further comprising:
acquiring the identity identification data of the current user;
identifying the current user according to the identity identification data to obtain a second identification result;
verifying the first recognition result according to the second recognition result;
and determining a target identification result according to the verification result.
5. The method of claim 4, wherein the identification data comprises at least one of:
the system comprises face images, vehicle locking habit data, vehicle unlocking habit data, vehicle driving path habit data and vehicle destination habit data.
6. The method for monitoring vibration behavior according to claim 4, wherein the identification data includes a face image, and the identifying the current user according to the identification data to obtain a second identification result includes:
generating the characteristics of the face image according to the face image and the trained second deep learning model;
and identifying the current user according to the characteristics of the face image to obtain a second identification result.
7. The method according to claim 6, wherein the recognizing the current user according to the features of the facial image comprises:
identifying the current user according to the characteristics of the face image and the pre-stored correspondence table between face-image characteristics and registered users.
8. The method according to claim 4, wherein the determining a target identification result according to the verification result comprises:
if the second recognition result is consistent with the first recognition result, determining that the target recognition result is the first recognition result;
and if the second recognition result is inconsistent with the first recognition result, determining that the target recognition result is the second recognition result.
9. The method of claim 4, further comprising:
outputting corresponding alarm information according to the target identification result.
10. The vibration behavior monitoring method according to claim 9, wherein the outputting corresponding alarm information according to the target recognition result comprises:
if the current user is the registered user, the alarm information is not output;
and if the current user is not the registered user, outputting the alarm information.
11. The method according to claim 1 or 6, wherein the first and second deep learning models comprise any one of:
a convolutional neural network (CNN) model, a recurrent neural network (RNN) model, or a long short-term memory (LSTM) model.
12. A vibration behavior monitoring device, comprising:
the acquisition module is used for acquiring vehicle vibration data to be identified;
the generating module is used for generating the characteristics of the vibration behaviors corresponding to the vehicle vibration data according to the vehicle vibration data and the trained first deep learning model;
and the identification module is used for identifying the current user according to the characteristics of the vibration behaviors to obtain a first identification result.
13. An electronic device, comprising: memory, processor and computer program stored on said memory and executable on said processor, said processor implementing the method of vibration behavior monitoring according to any of claims 1-11 when executing said program.
14. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method of monitoring vibrational behavior according to any one of claims 1 to 11.
CN202110321602.7A 2021-03-25 2021-03-25 Vibration behavior monitoring method and device, electronic equipment and storage medium Pending CN115200696A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110321602.7A CN115200696A (en) 2021-03-25 2021-03-25 Vibration behavior monitoring method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110321602.7A CN115200696A (en) 2021-03-25 2021-03-25 Vibration behavior monitoring method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115200696A 2022-10-18

Family

ID=83570440

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110321602.7A Pending CN115200696A (en) 2021-03-25 2021-03-25 Vibration behavior monitoring method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115200696A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116839843A (en) * 2023-07-07 2023-10-03 广东度班科技有限公司 Remote vibration monitoring method and system for vehicle
CN116839843B (en) * 2023-07-07 2024-01-26 广东度班科技有限公司 Remote vibration monitoring method and system for vehicle

Similar Documents

Publication Publication Date Title
US11565721B2 (en) Testing a neural network
US10713874B2 (en) Machine learning-based platform for user identification
CN109711557B (en) Driving track prediction method, computer equipment and storage medium
JP2015501459A (en) Computing platform for the development and deployment of sensor-driven vehicle telemetry applications and services
JP2017215898A (en) Machine learning system
CN109523652A (en) Processing method, device, equipment and the storage medium of insurance based on driving behavior
US20210397683A1 (en) System and Method for Continuous User Authentication
CN110209281B (en) Method, electronic device, and medium for processing motion signal
CN117079299B (en) Data processing method, device, electronic equipment and storage medium
JP2021096854A (en) System and method for detecting adversarial attack
CN110009780A (en) A kind of car door unlocking method, server and storage medium based on car networking
CN110718217B (en) Control method, terminal and computer readable storage medium
JP2019074849A (en) Drive data analyzer
CN115200696A (en) Vibration behavior monitoring method and device, electronic equipment and storage medium
EP3926498A1 (en) System and method for continuous user authentication
CN109313645B (en) Artificial intelligence terminal system, server and behavior control method thereof
CN113085872B (en) Driving behavior evaluation method, device, equipment and storage medium
JP2019214249A (en) Detection device, computer program, detection method, and learning model
JP7207227B2 (en) DRIVING ACTION EVALUATION DEVICE, DRIVING ACTION EVALUATION METHOD, AND DRIVING ACTION EVALUATION PROGRAM
WO2016006021A1 (en) Data analysis device, control method for data analysis device, and control program for data analysis device
DE102018125990A1 (en) Method and system for activating a security function
JP2020042490A (en) Information management apparatus and information management method
WO2022101996A1 (en) Model use control system, model use control method, model use control device, and model use control program
US20230298350A1 (en) Vehicle identification system
EP3710964B1 (en) A method for improving user authentication performed by a communication device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination