CN113313087A - Passenger behavior supervision method and device for unmanned automobile - Google Patents


Info

Publication number
CN113313087A
CN113313087A (application number CN202110854320.3A; granted as CN113313087B)
Authority
CN
China
Prior art keywords
current
passenger
behavior
data
abnormal behavior
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110854320.3A
Other languages
Chinese (zh)
Other versions
CN113313087B (en)
Inventor
穆振东 (Mu Zhendong)
王平 (Wang Ping)
徐军莉 (Xu Junli)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangxi University of Technology
Original Assignee
Jiangxi University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangxi University of Technology
Priority to CN202110854320.3A
Publication of CN113313087A
Application granted
Publication of CN113313087B
Legal status: Active
Anticipated expiration

Classifications

    • G06V 20/59: Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06F 18/214: Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G06F 18/22: Matching criteria, e.g. proximity measures
    • G06F 18/24: Classification techniques
    • G06F 18/253: Fusion techniques of extracted features
    • G06N 3/044: Recurrent networks, e.g. Hopfield networks
    • G06N 3/045: Combinations of networks
    • G06N 3/08: Learning methods
    • G06V 20/46: Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames
    • G06V 40/12: Fingerprints or palmprints
    • G06V 40/16: Human faces, e.g. facial parts, sketches or expressions

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Evolutionary Biology (AREA)
  • Molecular Biology (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Biomedical Technology (AREA)
  • Human Computer Interaction (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Alarm Systems (AREA)

Abstract

The invention discloses a passenger behavior supervision method and device for an unmanned vehicle. The method comprises: when the unmanned vehicle meets a starting condition, acquiring current video data of the current passenger, captured in real time by a video acquisition device arranged inside the vehicle; inputting the current video data into a preset behavior supervision model for behavior recognition, to judge whether the current video data contains feature data matching the abnormal behavior data stored in the supervision model; and, if such matching feature data exists, controlling an in-vehicle alarm device to issue alarm information while storing the correspondence between the current passenger's information and the corresponding abnormal behavior data. The invention can supervise the behavior of passengers in an unmanned vehicle and improve driving safety.

Description

Passenger behavior supervision method and device for unmanned automobile
Technical Field
The invention relates to the technical field of unmanned driving, and in particular to a passenger behavior supervision method and device for an unmanned vehicle.
Background
With the development of science and technology, unmanned vehicles have advanced rapidly. An unmanned vehicle senses the road environment through an on-board sensing system, automatically plans a driving route, and controls the vehicle to reach a preset destination. At present, many automobile manufacturers already offer unmanned vehicles.
When a passenger uses an unmanned vehicle, there is usually no other person (such as a driver) nearby. If the passenger engages in dangerous behavior during the journey, such as attempting to open a door or damaging in-vehicle facilities, the driving safety of the vehicle is affected. However, the prior art lacks solutions for supervising passenger behavior in unmanned vehicles.
Disclosure of Invention
Therefore, the invention provides a passenger behavior supervision method for an unmanned vehicle, which supervises passenger behavior and improves driving safety.
According to an embodiment of the invention, the passenger behavior supervision method for the unmanned vehicle is applied to an on-board system arranged in the corresponding unmanned vehicle, and comprises the following steps:
acquiring, when the unmanned vehicle meets a starting condition, current video data of the current passenger captured in real time by a video acquisition device arranged inside the vehicle;
inputting the current video data into a preset behavior supervision model for behavior recognition, to judge whether the current video data contains feature data matching the abnormal behavior data stored in the supervision model, wherein the behavior supervision model consists of a convolutional network, a recurrent network, a fusion network and a classification network; the convolutional network extracts features from each frame of the video, the recurrent network models the temporal relation between frames, the fusion network fuses the training feature data, and the classification network outputs the probability that each frame belongs to each abnormal behavior class;
and, if such matching feature data exists in the current video data, controlling an in-vehicle alarm device to issue alarm information, while storing the correspondence between the current passenger's information and the corresponding abnormal behavior data.
According to the passenger behavior supervision method for the unmanned vehicle, the current video data of the current passenger, captured in real time by the video acquisition device, is acquired and input into the preset behavior supervision model for behavior recognition, so as to judge whether the current video data contains feature data matching the abnormal behavior data stored in the supervision model. Because the behavior supervision model consists of a convolutional network, a recurrent network, a fusion network and a classification network, abnormal behavior present in the current video data can be identified effectively. If matching feature data is found, the in-vehicle alarm device is controlled to issue alarm information reminding the passenger to stop the dangerous behavior, thereby improving driving safety; meanwhile, the correspondence between the current passenger's information and the corresponding abnormal behavior data is stored, facilitating subsequent statistical analysis.
In addition, the passenger behavior supervision method for the unmanned vehicle according to the above embodiment of the present invention may further have the following additional technical features:
further, after the step of inputting the current video data into a preset behavior surveillance model for behavior recognition to determine whether feature data matched with abnormal behavior data stored in the surveillance model exists in the current video data, the method further includes:
if the current video data contains feature data matching the abnormal behavior data stored in the supervision model, recording the category and number of occurrences of the abnormal behaviors produced by the current passenger;
and calculating the current score of the current passenger's ride according to the category and number of occurrences of the abnormal behaviors produced by the current passenger.
Further, in the step of calculating the current score of the current passenger's ride according to the category and number of occurrences of the abnormal behaviors produced by the current passenger, the current score is calculated using the following formula:
G = a1*b1*c1 + a2*b2*c2 + … + ai*bi*ci
where G represents the current score; a1 represents the number of times class-1 abnormal behavior occurred, b1 the weight corresponding to class-1 abnormal behavior, and c1 the base deduction for one occurrence of class-1 abnormal behavior;
a2, b2 and c2 represent the corresponding quantities for class-2 abnormal behavior; and ai, bi and ci represent the corresponding quantities for class-i abnormal behavior, where i is a positive integer.
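The deduction formula above can be sketched in a few lines of Python; the behavior categories, weights and base deductions used in the example are hypothetical illustrations, not values specified by the patent.

```python
def current_score(occurrences, weights, base_deductions):
    """Compute the current ride score G = sum(a_i * b_i * c_i).

    occurrences[i]      -- a_i, times class-i abnormal behavior occurred
    weights[i]          -- b_i, weight of class-i abnormal behavior
    base_deductions[i]  -- c_i, base deduction per occurrence of class-i behavior
    """
    return sum(a * b * c for a, b, c in zip(occurrences, weights, base_deductions))

# Hypothetical example: 2 door-pulling events (heavily weighted) and
# 3 instances of tapping in-car electronics (lightly weighted).
score = current_score([2, 3], [1.5, 0.5], [10, 2])  # 2*1.5*10 + 3*0.5*2 = 33.0
```

A higher score therefore reflects both more frequent and more serious abnormal behavior, which is what the later blacklist grading relies on.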
Further, after the step of calculating the current score of the current passenger for the ride according to the category and the number of times of the abnormal behavior generated by the current passenger, the method further includes:
when the current score is greater than or equal to a score threshold, listing the current passenger in a blacklist of the corresponding grade according to the magnitude of the current score, wherein blacklists of different grades correspond to different score thresholds and different ride-restriction rules.
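The graded-blacklist rule can be sketched as a small threshold lookup; the concrete thresholds and grade numbers below are hypothetical, since the patent does not specify values.

```python
def blacklist_level(current_score, tiers):
    """Return the blacklist grade for a score, or None if below every threshold.

    tiers -- list of (threshold, grade) pairs; a higher threshold maps to a
             stricter grade with harsher ride-restriction rules.
    """
    level = None
    for threshold, grade in sorted(tiers):  # ascending thresholds
        if current_score >= threshold:
            level = grade
    return level

# Hypothetical grading: grade 1 at 30 points, grade 2 at 60, grade 3 at 90.
tiers = [(30, 1), (60, 2), (90, 3)]
```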
Further, before the step of acquiring, when the unmanned vehicle meets the starting condition, the current video data of the current passenger captured in real time by the video acquisition device, the method further comprises:
acquiring current face recognition data of the current passenger captured by the video acquisition device and current fingerprint data of the current passenger captured by a fingerprint acquisition device arranged in the unmanned vehicle;
judging whether the current face recognition data matches the face recognition data, stored on the server, of the user who initiated the ride request, and whether the current fingerprint data matches that user's fingerprint data stored on the server;
if both the current face recognition data and the current fingerprint data match the server-stored data of the user who initiated the ride request, judging that the unmanned vehicle meets the starting condition;
and if the current face recognition data and/or the current fingerprint data do not match the server-stored data of that user, judging that the unmanned vehicle does not meet the starting condition.
Further, after each current passenger finishes a ride, the server stores the passenger's face recognition data and the abnormal behavior information produced during the ride, the abnormal behavior information at least comprising the abnormal behavior category;
the method further comprises:
when each current passenger enters the unmanned vehicle, acquiring the passenger's face recognition data, from which at least the passenger's gender and age are obtained;
determining the type of the current passenger according to the passenger's gender and age, and searching the server for the most frequent abnormal behavior category corresponding to that passenger type;
and generating a corresponding behavior prompt voice according to the most frequent abnormal behavior category, so as to guide the current passenger's behavior.
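The server-side lookup of the most frequent abnormal behavior category per passenger type could be sketched as follows; the age bands, behavior labels and record layout are hypothetical illustrations, not details from the patent.

```python
from collections import Counter

def passenger_type(gender, age):
    """Map gender and age to a coarse passenger type (hypothetical banding)."""
    band = "minor" if age < 18 else "adult" if age < 60 else "senior"
    return (gender, band)

def most_frequent_behavior(records, gender, age):
    """records -- list of ((gender, age), behavior_category) tuples on the server."""
    target = passenger_type(gender, age)
    counts = Counter(cat for (g, a), cat in records if passenger_type(g, a) == target)
    return counts.most_common(1)[0][0] if counts else None

# Hypothetical server-side history of (passenger attributes, abnormal behavior).
records = [(("M", 25), "door_pull"), (("M", 30), "door_pull"),
           (("M", 35), "tap_screen"), (("F", 70), "tap_screen")]
```

The returned category would then drive the selection of the behavior prompt voice played to the boarding passenger.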
Further, the unmanned vehicle is equipped with a remote communication device, a wireless health-monitoring bracelet and a first-aid kit, and the method further comprises:
monitoring the current physiological state of the current passenger through the wireless health-monitoring bracelet to obtain monitoring data, the monitoring data at least comprising the passenger's heart rate and blood pressure values;
if the current passenger's heart rate or blood pressure exceeds the corresponding preset threshold while the unmanned vehicle is driving, generating a voice rescue prompt and automatically opening the first-aid kit;
and sending a distress signal, containing the current passenger's identity information and monitoring data, to the nearest emergency center through the remote communication device, and immediately planning navigation to drive to that emergency center.
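The vital-sign check and the rescue workflow can be sketched as below; the heart-rate and blood-pressure limits and the step names are hypothetical placeholders for the preset thresholds and devices described above.

```python
def check_vitals(heart_rate, blood_pressure, hr_limits=(50, 120), bp_limits=(90, 160)):
    """Return True when a vital sign falls outside its (hypothetical) preset
    range, i.e. the vehicle should trigger the rescue workflow."""
    hr_lo, hr_hi = hr_limits
    bp_lo, bp_hi = bp_limits
    return not (hr_lo <= heart_rate <= hr_hi) or not (bp_lo <= blood_pressure <= bp_hi)

def on_vitals_alarm(driving):
    """Sketch of the rescue steps; they only trigger while the vehicle is driving."""
    if not driving:
        return []
    return ["play_voice_rescue_prompt", "open_first_aid_kit",
            "send_distress_signal", "navigate_to_nearest_emergency_center"]
```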
Further, the method further comprises:
counting the current score of each ride taken by each passenger within a preset time period, and calculating an average score from the plurality of current scores;
judging whether the average score is smaller than the corresponding score threshold;
and if so, calculating a coupon amount according to the average score and pushing the corresponding coupon link to the passenger;
wherein the coupon amount is calculated, by a preset formula, from a correction coefficient, the average score, and the cumulative sum of money spent by the current passenger in the unmanned vehicle; the average score is the mean of the plurality of said current scores.
Furthermore, the convolutional network adopts a residual network to extract the image features of each frame of the video, and the recurrent network adopts a long short-term memory (LSTM) network as the recurrent neural network to model the temporal relation between frames of the video.
Another embodiment of the present invention provides a passenger behavior supervision device for an unmanned vehicle, which supervises passenger behavior to improve driving safety.
According to an embodiment of the present invention, the passenger behavior monitoring apparatus for an unmanned vehicle is applied to an on-vehicle system provided in a corresponding unmanned vehicle, and includes:
the system comprises a first acquisition module, a second acquisition module and a third acquisition module, wherein the first acquisition module is used for acquiring current video data of a current passenger in the unmanned automobile, which is acquired by a video acquisition device in real time when the unmanned automobile meets a starting condition, and the video acquisition device is arranged in the unmanned automobile;
the system comprises a first judgment module, a second judgment module and a third judgment module, wherein the first judgment module is used for inputting the current video data into a preset behavior supervision model for behavior identification so as to judge whether characteristic data matched with abnormal behavior data stored in the supervision model exist in the current video data, the behavior supervision model consists of a convolution network, a recursion network, a fusion network and a classification network, the convolution network is used for extracting the characteristics of each frame in the video, the recursion network is used for modeling the time sequence relation between the frames in the video, the fusion network is used for fusing training characteristic data, and the classification network is used for outputting the probability that each frame belongs to each abnormal behavior;
and the output module is used for controlling an in-vehicle alarm device to send alarm information and storing the corresponding relation between the passenger information of the current passenger and the corresponding abnormal behavior data if the current video data has the characteristic data matched with the abnormal behavior data stored in the supervision model.
According to the passenger behavior supervision device for the unmanned vehicle, the current video data of the current passenger, captured in real time by the video acquisition device, is acquired and input into the preset behavior supervision model for behavior recognition, so as to judge whether the current video data contains feature data matching the abnormal behavior data stored in the supervision model. Because the behavior supervision model consists of a convolutional network, a recurrent network, a fusion network and a classification network, abnormal behavior present in the current video data can be identified effectively. If matching feature data is found, the in-vehicle alarm device is controlled to issue alarm information reminding the passenger to stop the dangerous behavior, thereby improving driving safety; meanwhile, the correspondence between the current passenger's information and the corresponding abnormal behavior data is stored, facilitating subsequent statistical analysis.
In addition, the passenger behavior monitoring apparatus of the unmanned vehicle according to the above embodiment of the present invention may further have the following additional technical features:
further, the apparatus further comprises:
a recording module, configured to record, if the current video data contains feature data matching the abnormal behavior data stored in the supervision model, the category and number of occurrences of the abnormal behaviors produced by the current passenger;
a calculation module, configured to calculate the current score of the current passenger's ride according to the category and number of occurrences of the abnormal behaviors;
and a listing module, configured to list the current passenger in a blacklist when the current score is greater than or equal to a score threshold, wherein passengers on the blacklist are prohibited from using the unmanned vehicle within a preset period.
Further, the calculation module is configured to calculate the current score of the current passenger's ride using the following formula:
G = a1*b1*c1 + a2*b2*c2 + … + ai*bi*ci
where G represents the current score; a1 represents the number of times class-1 abnormal behavior occurred, b1 the weight corresponding to class-1 abnormal behavior, and c1 the base deduction for one occurrence of class-1 abnormal behavior;
a2, b2 and c2 represent the corresponding quantities for class-2 abnormal behavior; and ai, bi and ci represent the corresponding quantities for class-i abnormal behavior, where i is a positive integer.
Further, the listing module is specifically configured to:
list the current passenger, when the current score is greater than or equal to a score threshold, in a blacklist of the corresponding grade according to the magnitude of the current score, wherein blacklists of different grades correspond to different score thresholds and different ride-restriction rules.
Further, the apparatus further comprises:
a second acquisition module, configured to acquire the current face recognition data of the current passenger captured by the video acquisition device and the current fingerprint data of the current passenger captured by a fingerprint acquisition device arranged in the unmanned vehicle;
and a second judgment module, configured to judge whether the current face recognition data matches the face recognition data, stored on the server, of the user who initiated the ride request, and whether the current fingerprint data matches that user's fingerprint data stored on the server;
the first judgment module being further configured to judge that the unmanned vehicle meets the starting condition if both the current face recognition data and the current fingerprint data match the server-stored data of the user who initiated the ride request;
the first judgment module being further configured to judge that the unmanned vehicle does not meet the starting condition if the current face recognition data and/or the current fingerprint data do not match the server-stored data of that user.
Furthermore, the convolutional network adopts a residual network to extract the image features of each frame of the video, and the recurrent network adopts a long short-term memory (LSTM) network as the recurrent neural network to model the temporal relation between frames of the video.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The above and/or additional aspects and advantages of embodiments of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a flow chart of a passenger behavior supervision method for an unmanned vehicle according to a first embodiment of the present invention;
FIG. 2 is a flowchart of a first embodiment of the present invention for determining whether an unmanned vehicle meets start-up conditions;
FIG. 3 is a flow chart of a passenger behavior supervision method for an unmanned vehicle according to a second embodiment of the present invention;
FIG. 4 is a flow chart of a passenger behavior supervision method for an unmanned vehicle in a third embodiment of the present invention;
FIG. 5 is a flow chart of a method for passenger behavior supervision of an unmanned vehicle according to a fourth embodiment of the present invention;
FIG. 6 is a flow chart of a passenger behavior supervision method for an unmanned vehicle according to a fifth embodiment of the present invention;
fig. 7 is a schematic structural diagram of a passenger behavior monitoring apparatus of an unmanned vehicle according to a sixth embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The first embodiment is as follows:
referring to fig. 1, a first embodiment of the present invention provides a method for supervising passenger behavior of an unmanned vehicle, the method being applied to a vehicle-mounted system disposed in the corresponding unmanned vehicle, the method including steps S101 to S103:
S101, when the unmanned vehicle meets a starting condition, acquiring current video data of the current passenger captured in real time by a video acquisition device arranged inside the vehicle.
The unmanned vehicle to which the embodiment of the present invention applies is provided with one or more video acquisition devices for capturing current video data of in-vehicle passengers in real time; the video acquisition device may be, for example, a camera, which is not limited in the embodiment of the present invention. The devices can be arranged at suitable positions in the vehicle according to actual needs, and one or more devices can be provided. In practice, a plurality of video acquisition devices are usually arranged to capture video from multiple angles inside the vehicle.
Referring to fig. 2, before the step of acquiring, when the unmanned vehicle meets the starting condition, the current video data of the current passenger captured in real time by the video acquisition device, the method further includes a step of determining whether the unmanned vehicle meets the starting condition, comprising steps S1011 to S1014.
S1011, acquiring the current face recognition data of the current passenger captured by the video acquisition device and the current fingerprint data of the current passenger captured by a fingerprint acquisition device arranged in the unmanned vehicle.
S1012, judging whether the current face recognition data matches the face recognition data, stored on the server, of the user who initiated the ride request, and whether the current fingerprint data matches that user's fingerprint data stored on the server.
S1013, if both the current face recognition data and the current fingerprint data match the server-stored data of the user who initiated the ride request, judging that the unmanned vehicle meets the starting condition.
S1014, if the current face recognition data and/or the current fingerprint data do not match the server-stored data of that user, judging that the unmanned vehicle does not meet the starting condition.
By taking the face identification data and the fingerprint identification data as the judgment conditions, the unmanned vehicle can be started only by judging that the unmanned vehicle meets the starting conditions on the premise that the current face identification data is matched with the face identification data of the user initiating the taking bus request stored in the server and the current fingerprint identification data is matched with the fingerprint identification data of the user initiating the taking bus request stored in the server, so that the benefit and the taking safety of passengers can be protected to the maximum extent, and the taking behavior of other users interfering the legal user is avoided.
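As a minimal sketch of steps S1011 to S1014, the start-condition check is a conjunction of the two biometric matches. The match predicates below are hypothetical placeholders for the server-side face and fingerprint comparison, which the text does not specify:

```python
# Sketch of the dual-factor start check in steps S1011-S1014. The
# match predicates are hypothetical placeholders for the server-side
# face and fingerprint comparison, which the text does not specify.

def meets_start_condition(current_face, current_fp, stored_face, stored_fp,
                          match_face, match_fp):
    """Both biometric factors must match the ride requester (S1013);
    if either fails, the vehicle must not start (S1014)."""
    return match_face(current_face, stored_face) and \
           match_fp(current_fp, stored_fp)
```

Requiring both factors means a failure of either one alone keeps the vehicle locked, as in step S1014.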
S102, inputting the current video data into a preset behavior supervision model for behavior recognition to judge whether characteristic data matched with abnormal behavior data stored in the supervision model exist in the current video data, wherein the behavior supervision model comprises a convolution network, a recursion network, a fusion network and a classification network, the convolution network is used for extracting the characteristics of each frame in the video, the recursion network is used for modeling the time sequence relation between the frames in the video, the fusion network is used for fusing training characteristic data, and the classification network is used for outputting the probability that each frame belongs to each abnormal behavior.
The supervision model is trained in advance, and is specifically a convolutional neural network supervision model, wherein characteristic data of various abnormal behaviors are stored, and the abnormal behaviors comprise pushing and pulling vehicle doors, frequently clicking electronic equipment in the vehicle, violently damaging facilities in the vehicle and the like.
Specifically, the convolutional network adopts a residual network to extract the image features of each frame in the video, and this decomposition reduces the complexity of the function to be fitted. The residual network introduces shortcut connections between input and output, which effectively alleviates the vanishing-gradient problem that appears when the network becomes too deep.
In the recursive network, a long short-term memory (LSTM) network is used as the recurrent neural network to model the temporal relationship between frames in the video. The LSTM replaces the hidden layer of a traditional recurrent neural network with long short-term structural units, so that both short-range dynamic dependencies and long-term dependencies in time-series data are captured effectively.
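To illustrate how the long short-term memory unit carries information across video frames, the following is a scalar (hidden size 1) sketch of one LSTM step. Real models apply the same gate equations with weight matrices; the weight layout here is purely illustrative:

```python
# Scalar LSTM step (hidden size 1) over a per-frame feature x. The gate
# equations are the standard LSTM ones; w/u/b dicts are an illustrative
# parameter layout, not the patent's specification.
from math import exp, tanh

def sigmoid(z):
    return 1.0 / (1.0 + exp(-z))

def lstm_step(x, h_prev, c_prev, w, u, b):
    """w, u, b map gate names ('i', 'f', 'g', 'o') to input weight,
    recurrent weight and bias respectively."""
    i = sigmoid(w['i'] * x + u['i'] * h_prev + b['i'])  # input gate
    f = sigmoid(w['f'] * x + u['f'] * h_prev + b['f'])  # forget gate
    g = tanh(w['g'] * x + u['g'] * h_prev + b['g'])     # candidate state
    o = sigmoid(w['o'] * x + u['o'] * h_prev + b['o'])  # output gate
    c = f * c_prev + i * g        # long-term memory update
    h = o * tanh(c)               # short-term (hidden) output
    return h, c
```

The additive cell-state update `c = f*c_prev + i*g` is what lets gradients flow across many frames, which is why the LSTM captures long-term dependencies better than a plain recurrent layer.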
S103, if the current video data contains characteristic data matched with the abnormal behavior data stored in the supervision model, controlling an in-vehicle alarm device to send alarm information, and simultaneously storing the corresponding relation between the passenger information of the current passenger and the corresponding abnormal behavior data.
The in-vehicle alarm device is, for example, a sound alarm, which is controlled to emit a preset alarm voice reminding the passenger to stop the abnormal behavior. In a specific implementation, a unique ID may be assigned to each passenger; if the current video data contains feature data matching the abnormal behavior data stored in the supervision model, the correspondence between the passenger information of the current passenger and the corresponding abnormal behavior data is stored, which facilitates subsequent statistical analysis.
Example two:
referring to fig. 3, as a specific example, after the step of inputting the current video data into a preset behavior surveillance model for behavior recognition to determine whether there is feature data in the current video data that matches abnormal behavior data stored in the surveillance model in step S102, the method further includes:
s201, if the current video data contains characteristic data matched with the abnormal behavior data stored in the supervision model, recording the type and the frequency of the abnormal behavior generated by the current passenger.
S202, calculating the current score of the current passenger for taking according to the type and the frequency of the abnormal behaviors generated by the current passenger.
Wherein the current score of the current passenger's ride is calculated using the following formula:
G = a1*b1*c1 + a2*b2*c2 + … + ai*bi*ci
wherein G represents the current score; a1 represents the number of times the class 1 abnormal behavior occurred, b1 is the weight value corresponding to the class 1 abnormal behavior, and c1 is the base deduction for one occurrence of the class 1 abnormal behavior; a2, b2 and c2 are defined likewise for the class 2 abnormal behavior; ai represents the number of times the class i abnormal behavior occurred, bi is the corresponding weight value, and ci is the base deduction for one occurrence of the class i abnormal behavior, where i is a positive integer. Corresponding weight values can be assigned to the different abnormal behaviors.
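The deduction formula amounts to a weighted sum over behavior categories; a minimal sketch, in which the category names and the (count, weight, deduction) values are hypothetical examples rather than figures from the patent:

```python
# Illustrative implementation of G = a1*b1*c1 + a2*b2*c2 + ... + ai*bi*ci.
# Category names and (count, weight, deduction) values are hypothetical.

def current_score(events):
    """events maps an abnormal-behavior category to a tuple
    (a, b, c): occurrence count, weight value, base deduction."""
    return sum(a * b * c for a, b, c in events.values())

ride = {
    "pushing the door":    (2, 1.5, 10.0),  # a=2, b=1.5, c=10
    "damaging facilities": (1, 2.0, 20.0),  # a=1, b=2.0, c=20
}
# current_score(ride) -> 2*1.5*10 + 1*2*20 = 70.0
```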
S203, when the current score is larger than or equal to a score threshold value, listing the current passenger in a blacklist, and prohibiting the passenger in the blacklist from using the unmanned automobile within a preset time.
If the current score is greater than or equal to the score threshold, the passenger's historical abnormal behaviors are excessive, so the passenger is barred from using the unmanned vehicle for a preset period (for example, three months). This protects the operator's interests and urges the passenger to correct his or her riding behavior as soon as possible.
In addition, in a specific implementation, when the current score is greater than or equal to the score threshold, the current passenger is further placed on a blacklist of the corresponding level according to the magnitude of the current score. Blacklists of different levels correspond to different score thresholds and different ride-restriction rules, for example as in the following table.
| Blacklist level | Score threshold | Ride-restriction rule |
| --- | --- | --- |
| Level 1 | [50, 60] | Barred from the unmanned vehicle for one month |
| Level 2 | (60, 70] | Barred from the unmanned vehicle for two months |
| Level 3 | (70, 80] | Barred from the unmanned vehicle for three months |
| Level 4 | (80, 90] | Barred from the unmanned vehicle for six months |
| Level 5 | >90 | Barred from the unmanned vehicle for one year |
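The tier scheme can be encoded as a simple threshold lookup. The interval boundaries follow the table above, while the return convention (level, ban length in months) is an illustrative choice:

```python
# Threshold lookup for the blacklist tiers: [50,60] -> level 1 (one month),
# (60,70] -> level 2 (two months), ..., >90 -> level 5 (one year).

def blacklist_level(score):
    """Return (blacklist level, ride ban in months), or None if the
    score is below the Level 1 threshold of 50."""
    if score < 50:
        return None
    tiers = [(90, 5, 12), (80, 4, 6), (70, 3, 3), (60, 2, 2)]
    for lower, level, months in tiers:  # open lower bound: score > lower
        if score > lower:
            return level, months
    return 1, 1                         # closed interval [50, 60]
```

Note the half-open intervals: a score of exactly 60 still falls in Level 1, while 60.1 falls in Level 2, matching the `[50,60]` / `(60,70]` notation of the table.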
According to the passenger behavior supervision method for an unmanned vehicle provided by this embodiment, the current video data of the current passenger in the unmanned vehicle is acquired in real time from the video acquisition device and input into a preset behavior supervision model for behavior recognition, so as to judge whether the current video data contains feature data matching the abnormal behavior data stored in the supervision model. Because the behavior supervision model consists of a convolutional network, a recursive network, a fusion network and a classification network, abnormal behavior in the current video data can be identified effectively. If such matching feature data exists, an in-vehicle alarm device is controlled to issue alarm information to remind the passenger to stop the dangerous behavior, thereby improving riding safety; at the same time, the correspondence between the passenger information of the current passenger and the corresponding abnormal behavior data is stored, which facilitates subsequent statistical analysis.
Example three:
in this embodiment, in order to avoid abnormal passenger behavior to the greatest extent, voice prompts are given during the ride to preempt the abnormal behaviors a passenger is most likely to exhibit. Based on this, please refer to fig. 4; a third embodiment of the present invention provides a passenger behavior supervision method for an unmanned vehicle.
Note that after each current passenger completes a ride, the server stores the face recognition data of that passenger together with any abnormal behavior information generated during the ride. The abnormal behavior information comprises the abnormal behavior category; for example, the categories include smoking, not wearing a seat belt, and extending the head out of the window.
Specifically, in this embodiment, the method includes:
s301, when each current passenger enters the unmanned automobile, the face recognition data of the current passenger is obtained, and the face recognition data at least comprises the gender and the age of the passenger.
It can be understood that through the acquired face recognition data, the gender and the corresponding age of the passenger can be determined through analysis.
S302, confirming the type of the current passenger according to the gender of the passenger and the age of the passenger, and searching the highest-frequency abnormal behavior type corresponding to the type of the current passenger in a server according to the type of the current passenger.
As described above, each passenger may inevitably exhibit some abnormal behavior while riding the unmanned vehicle. The type of the current passenger can be confirmed according to the passenger's gender and age, and the high-frequency abnormal behaviors corresponding to each type of passenger have been collected and counted during previous use. For example, the high-frequency abnormal behavior of a middle-aged man is smoking, while those of a minor are not wearing a seat belt and extending the head out of the window.
And S303, generating corresponding behavior prompt voice according to the highest-frequency abnormal behavior type so as to standardize the behavior of the current passenger.
For example, if the current passenger is a middle-aged man, the corresponding high-frequency abnormal behavior is smoking. The control system of the unmanned vehicle then generates the corresponding behavior prompt voice, for example a reminder that smoking is not allowed in the vehicle, so as to regulate the behavior of the current passenger.
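Steps S301 to S303 reduce to a demographic lookup followed by prompt generation. The group labels and the `stats` mapping below are hypothetical stand-ins for the server-side records of high-frequency abnormal behaviors per passenger type:

```python
# Demographic lookup for steps S301-S303. Group labels and the `stats`
# mapping are hypothetical stand-ins for the server-side statistics.

def behavior_prompt(gender, age, stats):
    """Classify the passenger (S302), look up the highest-frequency
    abnormal behavior for that group, and build the prompt (S303)."""
    if age < 18:
        group = "minor"
    elif 40 <= age < 60:
        group = f"middle-aged {gender}"
    else:
        group = gender
    behavior = stats.get(group)
    if behavior is None:
        return None  # no recorded high-frequency behavior for this group
    return f"Please note: {behavior} is not allowed in the vehicle."
```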
Example four:
in real life, some passengers suffer from heart disease, and if a sudden attack during the ride cannot be treated in time, irreparable consequences may follow. To solve this problem, referring to fig. 5, a fourth embodiment of the present invention provides a passenger behavior supervision method for an unmanned vehicle in which a remote communication device, a wireless health monitoring bracelet and a first-aid kit are arranged, used for emergency treatment when a passenger suffers a sudden illness. The method specifically comprises the following steps:
s401, monitoring the current physiological state of the current passenger through the wireless health monitoring bracelet to obtain monitoring data, wherein the monitoring data at least comprises a heart rate value and a blood pressure value of the passenger.
And S402, judging whether the heart rate value or the blood pressure value of the current passenger exceeds a corresponding preset threshold value.
It can be understood that if the heart rate value or the blood pressure value of the passenger exceeds the corresponding preset threshold, it indicates that the health of the passenger is in a critical state at the moment, and the passenger needs to be rescued immediately.
And S403, judging whether the unmanned automobile is in a running state or not.
S404, generating voice rescue prompt information and automatically opening the first-aid kit.
As described in step S403, if the unmanned vehicle is in a driving state, the vehicle driving system automatically generates the voice rescue prompt information to prompt the passenger how to perform self rescue, and automatically opens the first-aid kit installed in the vehicle, so that the passenger can take corresponding medicine from the first-aid kit to perform self rescue.
S405, sending a distress signal to the closest emergency center through the remote communication device, and immediately planning a navigation route to drive to that emergency center.
Meanwhile, as the unmanned automobile is in a running state, the distress signal can be sent to the nearest emergency center through the remote communication equipment, and the navigation path is immediately planned to run to the nearest emergency center. The distress signal comprises identity information of a current passenger and monitoring data.
As a supplement, if the heart rate value or blood pressure value of the passenger is judged to exceed the corresponding preset threshold while the unmanned vehicle is stationary (not started), the vehicle driving system likewise automatically generates the voice rescue prompt information to tell the passenger how to perform self-rescue, and automatically opens the first-aid kit arranged in the vehicle so that the passenger can take the corresponding medicine from it.
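The branching of steps S402 to S405 can be sketched as follows. The heart-rate and blood-pressure thresholds are illustrative defaults, since the text leaves the preset thresholds unspecified:

```python
# Decision logic of steps S402-S405. The preset thresholds are not given
# in the text, so hr_limit/bp_limit are illustrative defaults only.

def handle_health_event(heart_rate, blood_pressure, driving,
                        hr_limit=120, bp_limit=140):
    """Return the ordered list of emergency actions to trigger."""
    actions = []
    if heart_rate > hr_limit or blood_pressure > bp_limit:        # S402
        actions += ["voice_rescue_prompt", "open_first_aid_kit"]  # S404
        if driving:                                               # S403
            actions += ["send_distress_signal",                   # S405
                        "navigate_to_emergency_center"]
    return actions
```

The voice prompt and first-aid kit are triggered in both the driving and stationary cases, while the distress signal and emergency-center navigation apply only while driving, matching the supplement above.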
Example five:
in order to give passengers a better experience, coupons are pushed to them from time to time according to their actual usage. Referring to fig. 6, a fifth embodiment of the present invention includes the following steps:
s501, counting the current scores of each current passenger in each riding in a preset time period, and calculating to obtain an average score according to the current scores.
It will be appreciated that a passenger has a current score after each use of the unmanned vehicle. In this step, the plurality of current scores is summed and averaged to obtain the average score.
S502, judging whether the average score is smaller than the corresponding score threshold.
It should be noted that if the average score is greater than or equal to the corresponding score threshold, the passenger exhibited excessive abnormal behavior during previous rides and has been placed on the blacklist, so such a passenger does not qualify for the coupon. If the average score is smaller than the corresponding score threshold, step S503 is executed. In this embodiment, the corresponding score threshold is 50.
And S503, calculating and pushing the coupon link with the corresponding money amount to the corresponding passenger according to the average score.
Wherein, the calculation formula of the amount corresponding to the coupon is represented as:
M = k * S / G_avg
wherein M is the amount corresponding to the coupon, k is a correction coefficient, G_avg is the average score (the average of the plurality of said current scores), and S is the cumulative sum of money spent by the current passenger on the unmanned vehicle.
It will be appreciated that, under this formula, the greater the cumulative amount spent and the lower the average score (i.e., the less abnormal behavior), the greater the corresponding coupon amount. This mechanism rewards passengers and thereby improves their overall experience.
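Under the stated relationship (coupon amount growing with cumulative spend and shrinking with the average score), the amount can be computed as below. The exact functional form and the correction coefficient k are assumptions consistent with the description rather than values given by the patent:

```python
# Coupon amount M = k * S / G_avg: grows with cumulative spending S and
# shrinks as the average score G_avg rises. The functional form and the
# correction coefficient k are assumptions, not figures from the patent.

def coupon_amount(k, cumulative_spend, average_score):
    if average_score <= 0:
        raise ValueError("average score must be positive")
    return k * cumulative_spend / average_score
```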
Example six:
referring to fig. 7, a sixth embodiment of the present invention provides a passenger behavior monitoring device for an unmanned vehicle, the device being applied to a vehicle-mounted system, the vehicle-mounted system being disposed in a corresponding unmanned vehicle, the device comprising:
the first obtaining module 11 is configured to obtain, in real time, current video data of a current passenger in the unmanned vehicle, which is collected by a video collecting device, when the unmanned vehicle meets a starting condition, where the video collecting device is disposed in the unmanned vehicle;
the first judging module 12 is configured to input the current video data into a preset behavior supervision model for behavior recognition, so as to judge whether feature data matched with abnormal behavior data stored in the supervision model exists in the current video data, where the behavior supervision model is composed of a convolution network, a recursion network, a fusion network and a classification network, the convolution network is configured to extract features of each frame in a video, the recursion network is configured to model a time sequence relationship between frames in the video, the fusion network is configured to fuse training feature data, and the classification network is configured to output a probability that each frame belongs to each abnormal behavior;
and the output module 13 is configured to control an in-vehicle alarm device to send alarm information if the current video data includes feature data matched with the abnormal behavior data stored in the monitoring model, and store a corresponding relationship between the passenger information of the current passenger and the corresponding abnormal behavior data.
In this embodiment, the apparatus further includes:
a recording module 14, configured to record a category and a number of times of an abnormal behavior generated by the current passenger if feature data matching the abnormal behavior data stored in the surveillance model exists in the current video data;
the calculating module 15 is configured to calculate a current score of the current passenger for taking the passenger this time according to the category and the number of times of the abnormal behavior generated by the current passenger;
a listing module 16, configured to list the current passenger in a blacklist when the current score is greater than or equal to a score threshold, where the passenger in the blacklist is prohibited from using the unmanned vehicle for a preset time.
In this embodiment, the calculating module 15 is configured to calculate the current score of the current passenger riding according to the following formula:
G = a1*b1*c1 + a2*b2*c2 + … + ai*bi*ci
wherein G represents the current score; a1 represents the number of times the class 1 abnormal behavior occurred, b1 is the weight value corresponding to the class 1 abnormal behavior, and c1 is the base deduction for one occurrence of the class 1 abnormal behavior; a2, b2 and c2 are defined likewise for the class 2 abnormal behavior; ai represents the number of times the class i abnormal behavior occurred, bi is the corresponding weight value, and ci is the base deduction for one occurrence of the class i abnormal behavior, where i is a positive integer.
In this embodiment, the listing module 16 is specifically configured to:
and when the current score is greater than or equal to a score threshold, place the current passenger on a blacklist of the corresponding level according to the magnitude of the current score, wherein blacklists of different levels correspond to different score thresholds and different ride-restriction rules.
In this embodiment, the apparatus further includes:
a second obtaining module 17, configured to obtain current face identification data of the current passenger collected by the video acquisition device and current fingerprint identification data of the current passenger collected by a fingerprint acquisition device, where the fingerprint acquisition device is arranged in the unmanned vehicle;
a second judging module 18, configured to judge whether the current face identification data matches face identification data of a user initiating a riding request stored in the server, and whether the current fingerprint identification data matches fingerprint identification data of the user initiating the riding request stored in the server;
a first determination module 19, configured to determine that the unmanned vehicle satisfies the start condition if the current face identification data matches the face identification data of the user who initiated the ride request stored in the server, and the current fingerprint identification data matches the fingerprint identification data of that user;
the first determination module 19 is further configured to determine that the unmanned vehicle does not satisfy the start condition if the current face identification data does not match the face identification data of the user who initiated the ride request stored in the server, and/or the current fingerprint identification data does not match the fingerprint identification data of that user.
In this embodiment, a residual error network is used in the convolutional network to extract image features of each frame in the video, and a long-and-short-term memory unit network is used as a recurrent neural network in the recurrent network to model a time sequence relationship between frames in the video.
According to the passenger behavior supervision device for an unmanned vehicle provided by this embodiment, the current video data of the current passenger in the unmanned vehicle is acquired in real time from the video acquisition device and input into a preset behavior supervision model for behavior recognition, so as to judge whether the current video data contains feature data matching the abnormal behavior data stored in the supervision model. Because the behavior supervision model consists of a convolutional network, a recursive network, a fusion network and a classification network, abnormal behavior in the current video data can be identified effectively. If such matching feature data exists, the in-vehicle alarm device is controlled to issue alarm information to remind the passenger to stop the dangerous behavior, thereby improving riding safety; at the same time, the correspondence between the passenger information of the current passenger and the corresponding abnormal behavior data is stored, which facilitates subsequent statistical analysis.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
While embodiments of the invention have been shown and described, it will be understood by those of ordinary skill in the art that: various changes, modifications, substitutions and alterations can be made to the embodiments without departing from the principles and spirit of the invention, the scope of which is defined by the claims and their equivalents.

Claims (10)

1. A passenger behavior supervision method for an unmanned vehicle is applied to an on-board system which is arranged in the corresponding unmanned vehicle, and comprises the following steps:
when the unmanned automobile meets the starting condition, acquiring the current video data of the current passenger in the unmanned automobile, which is acquired by a video acquisition device in real time, wherein the video acquisition device is arranged in the unmanned automobile;
inputting the current video data into a preset behavior supervision model for behavior recognition to judge whether characteristic data matched with abnormal behavior data stored in the supervision model exist in the current video data, wherein the behavior supervision model consists of a convolution network, a recursion network, a fusion network and a classification network, the convolution network is used for extracting the characteristics of each frame in the video, the recursion network is used for modeling the time sequence relation between the frames in the video, the fusion network is used for fusing training characteristic data, and the classification network is used for outputting the probability that each frame belongs to each abnormal behavior;
and if the current video data has characteristic data matched with the abnormal behavior data stored in the supervision model, controlling an in-vehicle alarm device to send alarm information, and simultaneously storing the corresponding relation between the passenger information of the current passenger and the corresponding abnormal behavior data.
2. The passenger behavior supervision method of an unmanned vehicle according to claim 1, wherein after the step of inputting the current video data into a preset behavior supervision model for behavior recognition to determine whether there is characteristic data matching with abnormal behavior data stored in the supervision model in the current video data, the method further comprises:
if the current video data contains characteristic data matched with the abnormal behavior data stored in the supervision model, recording the type and the frequency of the abnormal behavior generated by the current passenger;
and calculating the current score of the current passenger for taking according to the category and the times of the abnormal behaviors generated by the current passenger.
3. The passenger behavior supervision method of an unmanned vehicle according to claim 2, wherein in the step of calculating the current score of the present ride of the present passenger according to the category and the number of times of the abnormal behavior generated by the present passenger, the current score of the present ride of the present passenger is calculated using the following equation:
G = a1*b1*c1 + a2*b2*c2 + … + ai*bi*ci
wherein G represents the current score; a1 represents the number of times the class 1 abnormal behavior occurred, b1 is the weight value corresponding to the class 1 abnormal behavior, and c1 is the base deduction for one occurrence of the class 1 abnormal behavior; a2, b2 and c2 are defined likewise for the class 2 abnormal behavior; ai represents the number of times the class i abnormal behavior occurred, bi is the corresponding weight value, and ci is the base deduction for one occurrence of the class i abnormal behavior, where i is a positive integer.
4. The passenger behavior supervision method of an unmanned vehicle according to claim 3, wherein after the step of calculating the current score of the present ride of the present passenger according to the category and the number of the abnormal behavior generated by the present passenger, the method further comprises:
when the current score is greater than or equal to a score threshold, placing the current passenger on a blacklist of the corresponding level according to the magnitude of the current score; blacklists of different levels correspond to different score thresholds and different ride-restriction rules.
5. The passenger behavior supervision method of an unmanned vehicle as claimed in claim 1, wherein, before the step of acquiring in real time the current video data of the current passenger in the unmanned vehicle acquired by the video acquisition device when the unmanned vehicle satisfies a start condition, the method further comprises:
acquiring current face identification data of the current passenger collected by the video acquisition device and current fingerprint identification data of the current passenger collected by a fingerprint acquisition device, wherein the fingerprint acquisition device is arranged in the unmanned vehicle;
judging whether the current face identification data is matched with face identification data of a user initiating a bus taking request stored in a server or not and whether the current fingerprint identification data is matched with fingerprint identification data of the user initiating the bus taking request stored in the server or not;
if the current face identification data is matched with the face identification data of the user initiating the taking bus request stored in the server, and the current fingerprint identification data is matched with the fingerprint identification data of the user initiating the taking bus request stored in the server, judging that the unmanned automobile meets the starting condition;
and if the current face identification data are not matched with the face identification data of the user initiating the taking bus request stored in the server, and/or the current fingerprint identification data are not matched with the fingerprint identification data of the user initiating the taking bus request stored in the server, judging that the unmanned automobile does not meet the starting condition.
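The start condition of claim 5, in which both biometric checks must pass before the vehicle may start, could be sketched as follows; the `match` comparison is a purely illustrative stand-in for a real biometric matching routine:

```python
def start_condition_met(face_live, fingerprint_live, face_stored, fingerprint_stored,
                        match=lambda a, b: a == b):
    """Both the face data and the fingerprint data must match the records
    stored on the server for the user who initiated the ride request.
    `match` is a hypothetical placeholder for biometric comparison."""
    return (match(face_live, face_stored)
            and match(fingerprint_live, fingerprint_stored))

# Vehicle may start only when both checks succeed.
assert start_condition_met("face-A", "fp-A", "face-A", "fp-A")
assert not start_condition_met("face-A", "fp-B", "face-A", "fp-A")
```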
6. The passenger behavior supervision method of an unmanned vehicle according to claim 1, wherein after each current passenger completes a ride, the face recognition data of each current passenger and the abnormal behavior information generated during the ride are stored by a server, the abnormal behavior information including at least an abnormal behavior category;
the method further comprises the following steps:
when each current passenger enters the unmanned automobile, acquiring face recognition data of the current passenger, wherein the face recognition data at least comprises passenger gender and passenger age;
confirming the type of the current passenger according to the passenger gender and the passenger age, and searching, in a server, for the highest-frequency abnormal behavior category corresponding to the type of the current passenger;
and generating a corresponding behavior prompt voice according to the highest-frequency abnormal behavior category, so as to standardize the behavior of the current passenger.
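One way to sketch the lookup of claim 6 (derive a passenger type from gender and age, then find the highest-frequency abnormal behavior category for that type) is shown below; the history records, age buckets, and behavior names are all hypothetical:

```python
from collections import Counter

# Hypothetical server-side history of (passenger_type, abnormal_behavior) records.
HISTORY = [
    ("male-18-30", "smoking"), ("male-18-30", "smoking"),
    ("male-18-30", "standing"), ("female-31-45", "eating"),
]

def passenger_type(gender, age):
    """Illustrative bucketing of gender and age into a passenger type."""
    band = "18-30" if age <= 30 else "31-45" if age <= 45 else "46+"
    return f"{gender}-{band}"

def most_frequent_abnormal_behavior(ptype):
    """Highest-frequency abnormal behavior category for this passenger type,
    or None when the server has no records for the type."""
    counts = Counter(behavior for t, behavior in HISTORY if t == ptype)
    return counts.most_common(1)[0][0] if counts else None

# A behavior prompt voice would then be generated from the result,
# e.g. "Please do not smoke during the ride."
assert most_frequent_abnormal_behavior(passenger_type("male", 25)) == "smoking"
```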
7. The method of claim 1, wherein a remote communication device, a wireless health monitoring bracelet, and a first-aid kit are disposed in the unmanned vehicle, the method further comprising:
monitoring a current physiological state of the current passenger through the wireless health monitoring bracelet to obtain monitoring data, wherein the monitoring data comprises at least a heart rate value and a blood pressure value of the passenger;
if the heart rate value or the blood pressure value of the current passenger exceeds the corresponding preset threshold and the unmanned vehicle is in a driving state, generating voice rescue prompt information and automatically opening the first-aid kit;
and sending a distress signal to the nearest emergency center through the remote communication device and immediately planning navigation to drive to the nearest emergency center, wherein the distress signal comprises the identity information and the monitoring data of the current passenger.
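The threshold logic of claim 7 might be sketched as below; the heart rate and blood pressure limits are illustrative placeholders, not values from the patent:

```python
# Hypothetical physiological limits for the bracelet readings.
HEART_RATE_MAX = 120      # beats per minute
BLOOD_PRESSURE_MAX = 180  # systolic, mmHg

def needs_rescue(heart_rate, blood_pressure, vehicle_moving):
    """True when either reading exceeds its limit while the vehicle is driving.
    On True, the claimed system would play a voice rescue prompt, open the
    first-aid kit, send a distress signal (identity + monitoring data), and
    navigate to the nearest emergency center."""
    exceeded = heart_rate > HEART_RATE_MAX or blood_pressure > BLOOD_PRESSURE_MAX
    return exceeded and vehicle_moving

assert needs_rescue(heart_rate=135, blood_pressure=120, vehicle_moving=True)
assert not needs_rescue(heart_rate=80, blood_pressure=110, vehicle_moving=True)
```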
8. The passenger behavior supervision method of an unmanned vehicle according to claim 3, characterized in that the method further comprises:
counting, within a preset time period, the current score corresponding to each ride taken by each passenger, and calculating an average score from the plurality of current scores;
judging whether the average score is smaller than a corresponding score threshold value;
if yes, calculating the amount of a corresponding coupon according to the average score, and pushing a link to the coupon to the corresponding passenger;
wherein the calculation formula of the amount corresponding to the coupon is given as an image in the original publication and is expressed in terms of the following quantities:
the amount of money corresponding to the coupon;
a correction coefficient;
the average score, i.e. the average of the plurality of current scores;
and the cumulative sum of money spent by the current passenger in the unmanned vehicle.
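Because the coupon formula itself appears only as an image in the source, the sketch below covers only the surrounding logic of claim 8: averaging the per-ride scores and checking eligibility against a score threshold (the threshold value is hypothetical):

```python
def average_score(scores):
    """Average of the current scores from each ride in the preset time period."""
    return sum(scores) / len(scores)

def coupon_eligible(scores, threshold=30):
    """Eligible when the average per-ride score is below the threshold
    (a lower score means less abnormal behavior). The coupon amount would
    then be computed from the correction coefficient, the average score,
    and the passenger's cumulative spend; that formula is not reproduced
    here because it appears only as an image in the source."""
    return average_score(scores) < threshold

assert coupon_eligible([10, 20, 15])   # average 15 is below the threshold 30
assert not coupon_eligible([50, 60])   # average 55 is not
```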
9. A passenger behavior supervision device for an unmanned vehicle, applied to an on-board system provided in a corresponding unmanned vehicle, the device comprising:
an acquisition module, configured to acquire, in real time when the unmanned vehicle satisfies a start condition, current video data of a current passenger in the unmanned vehicle acquired by a video acquisition device, wherein the video acquisition device is arranged in the unmanned vehicle;
a judgment module, configured to input the current video data into a preset behavior supervision model for behavior recognition, so as to judge whether feature data matching the abnormal behavior data stored in the supervision model exists in the current video data, wherein the behavior supervision model consists of a convolutional network, a recurrent network, a fusion network and a classification network; the convolutional network extracts the features of each frame in the video, the recurrent network models the temporal relation between frames in the video, the fusion network fuses the training feature data, and the classification network outputs the probability that each frame belongs to each abnormal behavior;
and an output module, configured to, if feature data matching the abnormal behavior data stored in the supervision model exists in the current video data, control an in-vehicle alarm device to send alarm information and store the correspondence between the passenger information of the current passenger and the corresponding abnormal behavior data.
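A toy sketch of the four-part behavior supervision model (convolutional feature extraction, recurrent temporal modeling, fusion, classification) is shown below; each stage is a deliberately simplified stand-in with random illustrative weights and shapes, not the patent's trained networks:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv_features(frame):
    """Stand-in for the convolutional network: reduce one frame to a feature vector."""
    return frame.mean(axis=(0, 1))            # (H, W, C) -> (C,)

def rnn_states(features, W_h, W_x):
    """Stand-in for the recurrent network: model the temporal relation between frames."""
    h = np.zeros(W_h.shape[0])
    states = []
    for x in features:
        h = np.tanh(W_h @ h + W_x @ x)        # simple recurrent update
        states.append(h)
    return np.stack(states)

def fuse(states):
    """Stand-in for the fusion network: pool the per-frame states."""
    return states.mean(axis=0)

def classify(fused, W_c):
    """Stand-in for the classification network: softmax over behavior categories."""
    logits = W_c @ fused
    e = np.exp(logits - logits.max())
    return e / e.sum()

frames = rng.random((8, 32, 32, 3))           # 8 frames of 32x32 RGB video
feats = np.stack([conv_features(f) for f in frames])
states = rnn_states(feats, rng.random((4, 4)) * 0.1, rng.random((4, 3)) * 0.1)
probs = classify(fuse(states), rng.random((5, 4)))  # 5 abnormal-behavior categories
assert probs.shape == (5,) and np.isclose(probs.sum(), 1.0)
```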
10. The passenger behavior supervision device of an unmanned vehicle according to claim 9, characterized in that the device further comprises:
a recording module, configured to record the category and the number of the abnormal behaviors generated by the current passenger if feature data matching the abnormal behavior data stored in the supervision model exists in the current video data;
a calculation module, configured to calculate the current score of the current passenger's ride according to the categories and numbers of the abnormal behaviors generated by the current passenger;
and a listing module, configured to list the current passenger in a blacklist when the current score is greater than or equal to a score threshold, wherein passengers in the blacklist are prohibited from using the unmanned vehicle within a preset time;
the calculation module is further configured to calculate a current score of the current passenger's ride by using the following formula:
G = a1*b1*c1 + a2*b2*c2 + … + ai*bi*ci
wherein G represents the current score; a1 represents the number of occurrences of the class-1 abnormal behavior, b1 is the weight value corresponding to the class-1 abnormal behavior, and c1 represents the base deduction for one occurrence of the class-1 abnormal behavior;
a2 represents the number of occurrences of the class-2 abnormal behavior, b2 is the weight value corresponding to the class-2 abnormal behavior, and c2 represents the base deduction for one occurrence of the class-2 abnormal behavior; ai represents the number of occurrences of the class-i abnormal behavior, bi is the weight value corresponding to the class-i abnormal behavior, and ci represents the base deduction for one occurrence of the class-i abnormal behavior, where i is a positive integer.
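The scoring formula of claim 10 can be sketched directly; the counts, weights, and base deductions in the example are hypothetical:

```python
# Sketch of G = a1*b1*c1 + a2*b2*c2 + ... + ai*bi*ci.
# For each abnormal-behavior class i: a_i = occurrence count,
# b_i = class weight, c_i = base deduction per occurrence.

def ride_score(events):
    """events: list of (count, weight, base_deduction) tuples, one per class."""
    return sum(a * b * c for a, b, c in events)

# Example with hypothetical values: 2 occurrences of class 1
# (weight 1.5, deduction 10) and 1 occurrence of class 2
# (weight 2.0, deduction 20): 2*1.5*10 + 1*2.0*20 = 70.0
score = ride_score([(2, 1.5, 10), (1, 2.0, 20)])
assert score == 70.0
```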
CN202110854320.3A 2021-07-28 2021-07-28 Passenger behavior supervision method and device for unmanned automobile Active CN113313087B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110854320.3A CN113313087B (en) 2021-07-28 2021-07-28 Passenger behavior supervision method and device for unmanned automobile


Publications (2)

Publication Number Publication Date
CN113313087A true CN113313087A (en) 2021-08-27
CN113313087B CN113313087B (en) 2021-11-02

Family

ID=77381836

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110854320.3A Active CN113313087B (en) 2021-07-28 2021-07-28 Passenger behavior supervision method and device for unmanned automobile

Country Status (1)

Country Link
CN (1) CN113313087B (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107103310A (en) * 2017-06-01 2017-08-29 鄂尔多斯市普渡科技有限公司 The passenger behavior monitor system and method for a kind of unmanned taxi
CN108074396A (en) * 2016-11-10 2018-05-25 关晓芙 The evaluation method that drives safely and system
CN108648377A (en) * 2018-04-10 2018-10-12 合肥美的智能科技有限公司 Automatically vending system based on unmanned retail units and method
CN109410519A (en) * 2018-10-25 2019-03-01 衢州学院 A kind of vehicle crew's safety monitoring method and system based on big data
CN109515315A (en) * 2018-09-14 2019-03-26 纵目科技(上海)股份有限公司 Object identification method, system, terminal and storage medium in a kind of automatic driving vehicle
CN109902575A (en) * 2019-01-24 2019-06-18 平安科技(深圳)有限公司 Anti-abduction method, apparatus and related device based on automatic driving vehicle
CN109934607A (en) * 2017-12-17 2019-06-25 北京嘀嘀无限科技发展有限公司 A kind of cycling behavioral guidance method and device
CN110166741A (en) * 2019-04-15 2019-08-23 深圳壹账通智能科技有限公司 Environment control method, device, equipment and storage medium based on artificial intelligence
CN110602248A (en) * 2019-09-27 2019-12-20 腾讯科技(深圳)有限公司 Abnormal behavior information identification method, system, device, equipment and medium
CN111274881A (en) * 2020-01-10 2020-06-12 中国平安财产保险股份有限公司 Driving safety monitoring method and device, computer equipment and storage medium
CN111738044A (en) * 2020-01-06 2020-10-02 西北大学 Campus violence assessment method based on deep learning behavior recognition
CN112289031A (en) * 2020-11-03 2021-01-29 蚌埠学院 Method and device for detecting and alarming abnormal conditions in bus driving process
CN112422531A (en) * 2020-11-05 2021-02-26 博智安全科技股份有限公司 CNN and XGboost-based network traffic abnormal behavior detection method
CN112491779A (en) * 2019-09-12 2021-03-12 中移(苏州)软件技术有限公司 Abnormal behavior detection method and device and electronic equipment

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
WEI HUANG等: "Video-Based Abnormal Driving Behavior Detection via Deep Learning Fusions", 《IEEE ACCESS》 *
毛志强: "视频序列中人体异常行为分析技术研究", 《中国优秀硕士学位论文全文数据库 信息科技辑》 *

Also Published As

Publication number Publication date
CN113313087B (en) 2021-11-02

Similar Documents

Publication Publication Date Title
US9676395B2 (en) Incapacitated driving detection and prevention
CN107161153B (en) A kind of driving behavior methods of marking and device
CN107423869B (en) It is a kind of that the system for driving permission is limited based on traffic historical record
CN112052776B (en) Unmanned vehicle autonomous driving behavior optimization method and device and computer equipment
CN109146217A (en) Safety travel appraisal procedure, device, server, computer readable storage medium
CN111598368B (en) Risk identification method, system and device based on stop abnormality after stroke end
KR102143211B1 (en) A method and system for preventing drowsiness driving and keeping vehicle safe
CN110533880B (en) Abnormal driving state detection and early warning control method based on electrocardiosignals
CN110143202A (en) A kind of dangerous driving identification and method for early warning and system
CN111105110A (en) Driving risk determination method, device, medium and computing equipment
CN112071309A (en) Network appointment car safety monitoring device and system
CN113460062A (en) Driving behavior analysis system
EP3853681A1 (en) Method for classifying a non-driving activity of a driver in respect of an interruptibility of the non-driving activity in the event of a prompt to take over the driving function, and method for re-releasing a non-driving activity following an interruption of said non-driving activity as a result of a prompt to takeover the driving function
CN111429329A (en) Method and device for monitoring network car booking behavior
CN113313087B (en) Passenger behavior supervision method and device for unmanned automobile
CN116453345B (en) Bus driving safety early warning method and system based on driving risk feedback
Qiu et al. Use of triplet-loss function to improve driving anomaly detection using conditional generative adversarial network
CN110415710A (en) Parameter regulation means, device, equipment and the medium of interactive system for vehicle-mounted voice
CN112137630A (en) Method and system for relieving negative emotion of driver
JP2010271794A (en) Driving behavior guiding system
CN110083858A (en) A kind of driving preference pre-judging method
CN113312958B (en) Method and device for adjusting dispatch priority based on driver state
CN115909651A (en) Method, device, equipment and storage medium for protecting personal safety in vehicle
CN112712010A (en) Method, device and equipment for evaluating safe driving of common vehicle driver
CN113479212B (en) Method for monitoring behaviors of taxi appointment drivers

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant