WO2020054518A1 - Information management device and information management method - Google Patents


Info

Publication number
WO2020054518A1
WO2020054518A1 (PCT/JP2019/034629)
Authority
WO
WIPO (PCT)
Prior art keywords
user
vehicle
data
information management
learning
Application number
PCT/JP2019/034629
Other languages
French (fr)
Japanese (ja)
Inventor
慎 江上
一希 笠井
晴香 谷口
知柔 今林
佐久間 淳
栄造 北村
加藤 重之
泰秀 與茂
Original Assignee
オムロン株式会社
Application filed by オムロン株式会社
Publication of WO2020054518A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services

Definitions

  • the present invention relates to an information management device and an information management method for lending a vehicle to be shared.
  • Patent Literature 1 discloses an in-vehicle monitoring system and the like in a car sharing system capable of monitoring the inside of a vehicle.
  • The in-vehicle monitoring system manages the vehicle according to a reservation from a user, performs authentication processing, lends and returns the vehicle to the user, and monitors the inside of the vehicle by photographing it.
  • The in-vehicle monitoring system includes an authentication processing monitoring unit that monitors whether the user has completed the authentication processing after using the vehicle, an imaging unit that photographs the inside of the vehicle upon receiving a monitoring result indicating that the authentication processing for vehicle use has been completed, and a notification unit that notifies an administrator of the image information captured by the imaging unit.
  • Patent Literature 2 discloses an information management device that restricts use of a vehicle to a user who has poor behavior and allows a good user to use the vehicle without restriction.
  • This information management device is used in a vehicle rental system for renting a vehicle.
  • the information management device includes a storage unit, an identification information acquisition unit, an extraction unit, and an output unit.
  • The storage unit stores in advance, in association with identification information identifying each user, information on whether the vehicle may be rented to that user.
  • The identification information acquisition unit acquires the identification information of the user.
  • The extraction unit uses the identification information acquired by the identification information acquisition unit to extract the availability information corresponding to that identification information from the storage unit.
  • The output unit outputs the availability information extracted by the extraction unit.
  • Patent Literature 3 discloses a learning support device that improves recognition time, learning time, and recognition performance by automatically selecting learning data for each recognition category and selecting optimal feature values in neural-network pattern recognition, and that prevents degradation of recognition performance caused by environmental changes.
  • This learning support device includes a feature value calculation unit, a statistical analysis unit, and a neural network unit. The feature value calculation unit calculates a feature value for each item of image data.
  • The statistical analysis unit performs cluster analysis on the sets of feature values corresponding to the image data, classifying them into a plurality of clusters corresponding to the categories of image data, and selects a set of feature values representative of each cluster as learning data. After being normalized by a normalization unit, these sets of feature values are used to train the neural network unit.
  • Patent Literature 4 discloses a specific operation detecting device that detects a characteristic specific operation that frequently appears in a video at high speed and accurately by analyzing the video.
  • In this device, a specific-operation template creation unit stores the feature vector of the specific operation in advance as a template.
  • A feature calculation unit computes a feature vector from the video, and an identification calculation unit compares the computed feature vector with the feature vector stored by the specific-operation template creation unit to identify the specific operation.
  • The feature vector has elements indexed by frame number, motion-direction number, and block number; each element is the number of pixels moving in a given direction within a given block of a given frame. Because this motion-direction-oriented matching is performed between the stored feature vector of the specific operation and the computed feature vector, the specific operation can be detected in the video.
  • Patent Literature 5 discloses an action recognition system that can recognize complicated and diverse actions of a person and improve the accuracy of action recognition.
  • This action recognition system includes a recognition target data generation unit that generates recognition target data from recognition target image data containing the person whose actions are to be recognized, a likelihood calculation unit that compares a plurality of action model data items, which model human actions, with the recognition target data generated by the recognition target data generation unit and calculates the likelihood of the recognition target data for each action model, an image matching unit that compares pre-generated template data of an object with the recognition target image data, and a recognition result determination unit that identifies the person's action as the recognition result based on the calculation result of the likelihood calculation unit and the matching result of the image matching unit.
  • Patent Literature 1: JP 2012-043042 A; Patent Literature 2: JP 2009-271632 A; Patent Literature 3: JP 09-330406 A; Patent Literature 4: JP 2010-267029 A; Patent Literature 5: JP 2007-213528 A
  • The present invention has been devised in view of these circumstances, and provides an information management device and an information management method capable of restricting the lending of a vehicle to a user based on fraudulent acts, such as damage or smoking, that the user has committed in a vehicle provided for car sharing or car rental.
  • To solve the above problem, an information management device used in a car sharing system that lends out a vehicle shared by a plurality of users is provided. The device includes: a receiving device that receives sensor data from a sensor installed in the vehicle; a usage record recording device that stores the sensor data received by the receiving device in association with the user as data on the user's behavior; a classification device that classifies the user's fraudulent acts from the behavior data recorded by the usage record recording device; and a control device that assigns the user a grade based on the types of fraudulent acts classified by the classification device and determines, based on the grade and a predetermined lending standard, whether the user may use the vehicle.
  • According to this, it is possible to provide an information management device that restricts the lending of the vehicle to the user, in a vehicle provided for car sharing or the like, based on the fraudulent acts performed by the user and on the lending standard.
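  • The following is a minimal Python sketch of the component structure described above, assuming in-memory storage and a pluggable classifier; every name here (for example InformationManagementDevice) is illustrative and not taken from the disclosure.
```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class UsageRecord:
    """Usage record recording device: sensor data stored per user."""
    records: Dict[str, List[dict]] = field(default_factory=dict)

    def store(self, user_id: str, sensor_data: dict) -> None:
        self.records.setdefault(user_id, []).append(sensor_data)


@dataclass
class InformationManagementDevice:
    """Receives sensor data, classifies fraudulent acts, grades users, decides lending."""
    classify: Callable[[dict], str]                      # classification device: data -> act label
    lending_standard: Callable[[Dict[str, int]], bool]   # grade -> may the user borrow?
    usage: UsageRecord = field(default_factory=UsageRecord)
    grades: Dict[str, Dict[str, int]] = field(default_factory=dict)   # user -> {act: count}

    def receive(self, user_id: str, sensor_data: dict) -> None:
        """Receiving device: store incoming sensor data in the usage record."""
        self.usage.store(user_id, sensor_data)

    def grade_user(self, user_id: str) -> None:
        """Control device: build a grade from the classified fraudulent acts."""
        counts: Dict[str, int] = {}
        for data in self.usage.records.get(user_id, []):
            label = self.classify(data)
            if label != "no_violation":
                counts[label] = counts.get(label, 0) + 1
        self.grades[user_id] = counts

    def may_use_vehicle(self, user_id: str) -> bool:
        """Control device: compare the grade against the predetermined lending standard."""
        return self.lending_standard(self.grades.get(user_id, {}))
```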
  • The predetermined lending standard may be set for each vehicle owner. If the lending standard were uniform, some owners would be made uncomfortable, because the range of behavior each owner tolerates differs; allowing each owner to set a lending standard that satisfies him or her prevents this discomfort.
  • The device may further include a terminal communication device that communicates with a provider terminal operated by the provider of the vehicle, and the control device may permit transmission of the data on the user's behavior and changes to the grade in accordance with information received from the provider terminal by the terminal communication device. According to this, the provider can directly inspect the data and judge whether a fraudulent act occurred, and even if the machine's classification is wrong, the grade can be corrected manually, so fraudulent acts can be classified accurately.
  • The control device may transmit to the provider terminal the behavior data for which the reliability of the classification performed by the classification device is low. Notifying the provider of such data encourages a review of the results and further training of the classification device.
  • The device may further include a learning device that trains the classification device.
  • The learning device receives from the provider terminal the fraudulent-act label that the provider of the vehicle has assigned to the data on the user's behavior, and trains the classification device using that label as teacher data. According to this, having the provider contribute labels improves the performance of the classification device.
  • To solve the above problem, an information management method used in a car sharing system that lends out a vehicle shared by a plurality of users is also provided. The method comprises: receiving sensor data from a sensor installed in the vehicle; storing the sensor data in association with the user as data on the user's behavior; classifying the user's fraudulent acts from the behavior data; assigning the user a grade based on the types of fraudulent acts; and determining, based on the grade and a predetermined lending standard, whether the user may use the vehicle.
  • According to this, it is possible to provide an information management method that restricts the lending of the vehicle to the user based on the fraudulent acts performed by the user in a vehicle provided for car sharing or the like.
  • As described above, an information management device and an information management method can be provided that restrict the lending of a vehicle to a user based on fraudulent acts, such as smoking, that the user has committed in a vehicle provided for car sharing or car rental.
  • FIG. 1 is a block diagram of a car sharing system according to a first embodiment of the present invention.
  • FIG. 2 is a block configuration diagram of an information management device according to the first embodiment of the present invention.
  • FIG. 3 is a block diagram of a classification device and a learning device according to the first embodiment of the present invention.
  • FIG. 4 is a block diagram of the classification device and the learning device according to the first embodiment of the present invention when a multilayer neural network is used as the classifier.
  • FIG. 5 is a flowchart at the start of use of a vehicle in the car sharing system according to the first embodiment of the present invention.
  • FIG. 6 is a flowchart at the end of use of a vehicle in the car sharing system according to the first embodiment of the present invention.
  • FIG. 7 is an example of fraudulent acts and their occurrence rates according to the present invention.
  • FIG. 8 is an example of output values and label data when a multilayer neural network is used as the classifier of the classification device according to the first embodiment of the present invention.
  • the information management apparatus 100 is used in a car sharing system 1 for renting a vehicle CR shared by a plurality of users.
  • The car sharing system 1 includes the information management device 100, the vehicle CR, a user terminal TM2 that a user of the vehicle CR operates when applying for a reservation, a vehicle lending management device RTS that communicates with the user terminal TM2 when the user of the vehicle CR applies for a reservation, an electronic key EK that the user of the vehicle CR carries when using the vehicle, and a provider terminal TM1 that the provider of the vehicle CR operates to communicate with the information management device 100.
  • Car sharing in this specification refers to a scheme in which the owner of a vehicle CR provides the vehicle CR for shared use within a predetermined group or community, and the lending of the vehicle to a plurality of users within that group or community is managed.
  • The provider terminal TM1 and the user terminal TM2 each have a communication function for communicating with the information management device 100 and the vehicle lending management device RTS via the Internet or a public line, a display function for displaying information received from the information management device 100 and the vehicle lending management device RTS, an input function with which the provider or user enters information, and control and storage functions for realizing the display and input functions.
  • The provider terminal TM1 and the user terminal TM2 are, for example, a personal computer or a smartphone: the display function is a display, the input function is a keyboard, the control function is a CPU (Central Processing Unit), and the storage function is a memory or a hard disk.
  • The vehicle lending management device RTS is accessed when the user operates the user terminal TM2 to apply to use the vehicle CR.
  • At the time of a use application, the vehicle lending management device RTS receives information such as the desired vehicle CR, the period of use, and the return location, and inquires of the information management device 100, connected via a network, whether the use application may be permitted.
  • The network connecting the vehicle lending management device RTS and the information management device 100 may be the Internet or a dedicated line.
  • When permission is received, the vehicle lending management device RTS creates a lending schedule for the available vehicle CR, performs billing processing for the user, and transmits the details to the user terminal TM2.
  • When the vehicle lending management device RTS receives information indicating that use is not permitted, it notifies the user terminal TM2 that the desired vehicle CR cannot be lent. Since the vehicle lending management device RTS is accessed from many user terminals TM2, it is preferably a server whose control, storage, and communication functions are more powerful than those of a personal computer.
  • the vehicle CR includes a vehicle management system CC.
  • The vehicle management system CC includes a vehicle-side communication unit CR1 that communicates with the information management device 100 via a network, an image sensor CR2 that detects the user's behavior, actions, and operations and the states resulting from them, a vehicle state monitoring unit CR3 that monitors the driving state and usage state of the vehicle CR, a vehicle-side control unit CR4 that performs these controls, a vehicle-side authentication unit CR5 that communicates with the electronic key EK for authentication, an engine ECU CR6 that controls the engine, a power supply ECU CR7 that controls the power supply, and a door lock ECU CR8 that controls door locking and unlocking.
  • the vehicle-side communication unit CR1 communicates with the information management device 100 to receive an identification code of the electronic key EK corresponding to the vehicle CR at the start of use, or to transmit information detected by the image sensor CR2 at the end of use.
  • the network between the information management device 100 and the vehicle CR is not particularly limited as long as it is a network capable of wireless communication.
  • The image sensor CR2 is a sensor that detects the behavior, actions, and operations of the users of the vehicle CR, including passengers, and the states resulting from them; typically it is a sensor that captures moving images, such as a monitoring camera, for example a camera capable of imaging visible or infrared light.
  • the image sensor CR2 is installed in the vehicle CR so that a driver or a passenger can be imaged in the cabin of the vehicle CR.
  • The image sensor CR2 detects, for the vehicle CR, states resulting from user behavior such as smoking, eating and drinking, placing items on the dashboard or in compartments, bad manners while driving, and leaving trash behind.
  • The image sensor CR2 may also have a function of detecting the driver's biological signals (vital signs, reflexes, voluntary movements, and the like) and may be able to acquire the driver's tension or emotion.
  • the image sensor CR2 can also detect a state that the provider of the vehicle CR considers undesirable, such as driving in an unnecessarily nervous state.
  • the vehicle state monitoring unit CR3 monitors the state of the vehicle CR using various sensors.
  • Sensors used in the vehicle state monitoring unit CR3 include, in addition to sensors related to the main control systems for acceleration, steering, and braking, acceleration sensors, vehicle speed sensors, steering angle sensors, vibration sensors, yaw rate sensors, gyroscopes, inter-vehicle distance sensors, front and rear cameras, LIDAR, position sensors (GPS: Global Positioning System), and the like.
  • In this specification, the information detected by the image sensor CR2 and the vehicle state monitoring unit CR3 about the user's behavior, actions, and operations, the states resulting from them, and the state of the vehicle is collectively referred to as data on the user's behavior.
  • In addition to the illustrated engine ECU CR6 for controlling the engine, the power supply ECU CR7 for controlling the power supply, and the door lock ECU CR8 for controlling door locking and unlocking, the vehicle management system CC is equipped with a number of ECUs (Electronic Control Units) that detect and control the vehicle state as described above, for example a steering control ECU and a brake control ECU.
  • The vehicle-side authentication unit CR5 communicates with the electronic key EK to verify that the person holds an electronic key EK authorized to use the vehicle CR. If authentication succeeds, it instructs the door lock ECU CR8 to unlock the doors, instructs the power supply ECU CR7 to turn on the main power, instructs the engine ECU CR6 to turn the starter motor of the engine, and instructs the vehicle-side control unit CR4 to start controlling the vehicle-side communication unit CR1 and the like.
  • The electronic key EK may be the vehicle key (FOB) unique to the vehicle CR if it can be handed over directly, or it may be a smartphone or an IC card registered by the user in advance. Instead of the electronic key EK, a fingerprint or a face that can identify the user may be used; in that case, the vehicle-side authentication unit CR5 has a fingerprint authentication or face authentication function.
  • The information management device 100 includes: a receiving device 10 that receives sensor data from the sensors installed in the vehicle CR; a data storage unit 20 (usage record recording device) that stores the sensor data received by the receiving device 10 in association with the user as data on the user's behavior; a classification device 30 that classifies the user's fraudulent acts from the behavior data recorded by the data storage unit 20; a learning device 70 that trains the classification device 30; a user database 40 that records the users' grades; a vehicle database 50 that records information on the vehicle CR and on the provider of the vehicle CR; a transmission device 80 that transmits the identification code of a usable electronic key EK to the vehicle CR; and a control device 60 that controls the entire device and its databases, assigns the user a grade based on the types of fraudulent acts classified by the classification device 30, and determines whether the user may use the vehicle based on the grade and a predetermined lending standard.
  • The receiving device 10 has a network interface function for communicating with the vehicle-side communication unit CR1 of the vehicle CR via a wirelessly communicable network (for example, a wireless telephone line or a wireless LAN). The receiving device 10 may also have the receiving function of a terminal communication device that communicates with the provider terminal TM1 via these networks; alternatively, a dedicated terminal communication device may be provided separately, for example in the learning device 70. In addition, the receiving device 10 receives, via these networks, the sensor data detected by the various sensors installed in the vehicle CR at the end of use or at other times. The receiving device 10 also communicates with the vehicle lending management device RTS and receives information related to an application to reserve the vehicle CR (for example, information on the user and the period of use).
  • the data storage unit 20 stores the sensor data received by the receiving device 10 in association with the user as data relating to the action of the user of the vehicle CR.
  • The data storage unit 20 stores the image data detected by the image sensor CR2 and the driving data, such as sudden starts and sudden braking, detected by the acceleration sensor and the steering angle sensor of the vehicle state monitoring unit CR3, in association with a user ID (membership number, driver's license number, name, or the like).
  • The image data captures the various acts performed while the user (driver and passengers) is in the vehicle, for example not only smoking or eating and drinking inside the cabin captured by the in-vehicle camera, but also situations outside the vehicle, such as a pedestrian approaching in front of the vehicle, captured by the exterior camera.
  • The data storage unit 20 records, as a usage record, the information detected by the image sensor CR2 and the vehicle state monitoring unit CR3 about the user's behavior, actions, and operations, the states resulting from them, and the state of the vehicle.
  • the classifying device 30 classifies a user's misconduct based on data on the user's behavior recorded by the data storage unit 20.
  • The classification device 30 comprises a data input unit 31 that reads the user's usage record from the data storage unit 20, a feature value calculation unit 32 that calculates feature values from the usage record read by the data input unit 31, and a classifier 33 that classifies fraudulent acts based on the feature values calculated by the feature value calculation unit 32.
  • The feature value calculation unit 32 computes, for example, the luminance distribution (wavelet) of the whole object in the image, or local feature values of the object such as Haar-like features, HOG features, and EOH features; a plurality of feature values may be used together. Feature points of the eyes, nose, and mouth, feature points of the fingers, and the time series of their changes may also be extracted. If necessary, noise removal or normalization may be performed as preprocessing. Alternatively, the value of each pixel of the image data (brightness or color information) may be input directly to the classifier 33 without using the feature value calculation unit 32.
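  • The following is a minimal sketch of one of the feature values named above (a HOG descriptor), assuming greyscale camera frames and the scikit-image library; the actual features and parameters used by the system are not specified in the disclosure.
```python
import numpy as np
from skimage.feature import hog


def frame_features(frame: np.ndarray) -> np.ndarray:
    """Compute a HOG feature vector for one greyscale cabin-camera frame.

    Noise removal or normalization could be added here as preprocessing,
    as the description allows.
    """
    return hog(
        frame,
        orientations=9,            # number of gradient-orientation bins
        pixels_per_cell=(8, 8),
        cells_per_block=(2, 2),
        feature_vector=True,
    )


if __name__ == "__main__":
    dummy_frame = np.random.rand(128, 64)   # stand-in for a camera frame
    print(frame_features(dummy_frame).shape)
```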
  • the classifier 33 classifies which of the previously labeled fraudulent acts the input feature value corresponds to.
  • the classifier 33 may be any classifier that uses a classification method that can divide input data (images, moving images, audio data, and the like) into predetermined categories (classes).
  • the classifier 33 is preferably a classifier capable of machine learning.
  • the classifier 33 may be a multilayer neural network 33 as shown in FIG.
  • The multilayer neural network 33 includes an input layer that receives the calculated feature values, a plurality of intermediate layers, and an output layer, with a node configuration and inter-node connection weights that have been learned in advance by deep learning or the like.
  • The multilayer neural network 33 has at least as many output nodes as there are labeled types of fraudulent act; that is, the output layer has an output node corresponding to each of these fraudulent acts.
  • For example, an image in which the user's hand is near the mouth and is holding something is labeled "eating and drinking" for learning; similarly, an image in which the hand is near the mouth and a flickering light is visible is labeled "smoking" for learning.
  • When such an image is input after learning, the output value for "eating and drinking" or for "smoking" becomes large.
  • When a new type of fraudulent act is to be handled, an output node is added to the output layer of the multilayer neural network 33, and the node configuration of the intermediate layers and the connection weights between nodes are relearned accordingly.
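  • The following is a minimal PyTorch sketch of such a multilayer neural network with one output node per labeled fraudulent act; the seven-class layout mirrors the seven-element label data shown later, and the label names, layer sizes, and feature dimension are illustrative assumptions.
```python
import torch
import torch.nn as nn

FRAUD_LABELS = ["smoking", "eating_drinking", "littering", "damage",
                "dangerous_driving", "other_violation", "no_violation"]  # illustrative labels


class FraudClassifier(nn.Module):
    """Input layer -> intermediate layers -> one output node per labeled act."""

    def __init__(self, feature_dim: int, hidden_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feature_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, len(FRAUD_LABELS)),   # output layer
        )

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        return self.net(features)   # logits; the largest value indicates the classified act


if __name__ == "__main__":
    model = FraudClassifier(feature_dim=3780)        # e.g. the length of a HOG vector
    logits = model(torch.randn(1, 3780))
    print(FRAUD_LABELS[int(logits.argmax(dim=1))])
```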
  • The classification device 30 is not limited to a classification method using the multilayer neural network 33; classifiers that can be trained with learning data, such as a support vector machine, logistic regression, a random forest, or boosting, may also be used.
  • a plurality of classifiers 33 may be used in combination.
  • a convolutional neural network may be used as the multilayer neural network 33.
  • When a convolutional neural network is used, the pixel data of the image is input directly to the input layer of the convolutional neural network without using the feature value calculation unit 32.
  • The layers following the input layer of the convolutional neural network are a convolutional layer and a pooling layer, and these layers perform the feature extraction.
  • Alternatively, a template may be created for each fraudulent act, and the classification target may be matched against the average of the templates sharing a label, or matched sequentially against all templates. When the provider labels data for learning, the number of templates increases.
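  • The following is a minimal scikit-learn sketch of one of the alternative classifiers named above (a support vector machine); the feature vectors, labels, and parameters are placeholders, not values from the disclosure.
```python
import numpy as np
from sklearn.svm import SVC

# Placeholder training set: rows are feature vectors computed from usage data,
# labels are fraudulent-act classes assigned for learning.
X_train = np.random.rand(20, 3780)
y_train = np.random.choice(["smoking", "eating_drinking", "no_violation"], size=20)

clf = SVC(kernel="rbf", C=1.0)      # C controls the soft margin; the kernel is the "kernel trick"
clf.fit(X_train, y_train)

x_new = np.random.rand(1, 3780)
print(clf.predict(x_new))           # classified act for new usage data
```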
  • the classifying device 30 classifies data such as image data input from the data storage unit 20 and outputs the data to the control device 60.
  • the control device 60 controls the entire information management device 100 including the database.
  • The control device 60 communicates with the receiving device 10, the transmission device 80, the data storage unit 20, the classification device 30, the learning device 70, the user database 40, and the vehicle database 50 so that they cooperate, and has the classification device classify the user's fraudulent acts from the data obtained from the vehicle CR.
  • The control device 60 (grade assigning unit 62) assigns the user's grade based on the classified fraudulent acts.
  • Based on information from the provider terminal TM1, the control device 60 also issues instructions to the receiving device 10, the transmission device 80, the data storage unit 20, the classification device 30, the learning device 70, the user database 40, and the vehicle database 50.
  • The terminal communication device that communicates with the provider terminal TM1 may be included in the functions of the receiving device 10 and the transmission device 80, or may be provided in the learning data input unit 71 described later.
  • the control device 60 creates a grade of the user based on the classification result output by the classification device 30 and stores the grade in the user database 40.
  • the user's performance is the type of fraudulent activity performed by the user, the total score given for each type of fraudulent activity, the incidence rate described later, and the like.
  • The user database 40 records the users' grades (it serves as a grade recording device). As shown in FIG. 7, the user database 40 accumulates, for each user (user ID), the occurrence rate of each fraudulent act.
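  • The following is a minimal sketch of the kind of per-user occurrence-rate record the user database 40 is described as accumulating; the class and field names are illustrative assumptions.
```python
from collections import defaultdict
from typing import Dict


class UserGradeDB:
    """Grade recording device: per user, per fraudulent act, an occurrence rate."""

    def __init__(self):
        self.rentals: Dict[str, int] = defaultdict(int)   # user -> number of uses
        self.acts: Dict[str, Dict[str, int]] = defaultdict(lambda: defaultdict(int))  # user -> act -> count

    def record_use(self, user_id: str, classified_acts: list) -> None:
        self.rentals[user_id] += 1
        for act in classified_acts:
            self.acts[user_id][act] += 1

    def occurrence_rate(self, user_id: str, act: str) -> float:
        """Fraction of this user's rentals in which the act was detected."""
        uses = self.rentals[user_id]
        return self.acts[user_id][act] / uses if uses else 0.0


db = UserGradeDB()
db.record_use("user-001", ["littering"])
db.record_use("user-001", [])
print(db.occurrence_rate("user-001", "littering"))   # 0.5
```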
  • the control device 60 also performs control such as writing information about the vehicle CR and information about its provider into the vehicle database 50 and reading the information from the vehicle database 50 on the contrary.
  • the vehicle database 50 records, as information on the provider of the vehicle CR, a lending standard for each provider.
  • The lending standard for each provider is a tolerance level for predetermined fraudulent acts: for example, lending is permitted if the occurrence rate of littering or of bad manners while driving is less than 50%, and prohibited if it is 50% or more; or lending may be prohibited once a particular act has occurred. Instead of occurrence rates, the types of fraudulent acts the user has committed may be listed and recorded in the vehicle database 50, and lending may be prohibited if predetermined fraudulent acts are on record.
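  • The following is a minimal sketch of such a per-provider lending standard; the 50% threshold follows the example in the text, while the act names and thresholds are illustrative.
```python
def may_lend(occurrence_rates: dict, provider_thresholds: dict) -> bool:
    """Permit lending only if every monitored act stays under the provider's
    tolerated occurrence rate (e.g. littering below 50%)."""
    return all(
        occurrence_rates.get(act, 0.0) < limit
        for act, limit in provider_thresholds.items()
    )


provider_standard = {"littering": 0.5, "dangerous_driving": 0.5}
user_record = {"littering": 0.25}
print(may_lend(user_record, provider_standard))   # True: all rates below the tolerances
```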
  • The control device 60 performs overall control of the information management device 100 and includes a determination unit 61 and a grade assigning unit 62.
  • the determination unit 61 determines whether or not the user can use the vehicle CR based on the user's performance recorded in the user database 40 and the lending standard of the vehicle database 50. According to this, it is possible to restrict the lending of the vehicle CR to the user based on the fraudulent activity performed by the user in the vehicle CR provided to the car sharing and the lending standard determined by the provider.
  • The lending standard may be uniform across the company that operates the car sharing system 1, or it may be set for each owner who provides a vehicle. Because the acceptable range differs from owner to owner, behavior that is acceptable to one owner may be unpleasant for another; allowing each owner to set a lending standard he or she is satisfied with prevents the owner from being made uncomfortable.
  • When use is permitted, the control device 60 controls the transmission device 80 to notify the vehicle lending management device RTS that the reservation is permitted; when it is not permitted, a message indicating that use is not permitted is transmitted. Further, when the user has been permitted to use the vehicle CR and actually uses it, the control device 60 controls the transmission device 80 to transmit the identification code of the usable electronic key EK to the vehicle CR. As a result, the user can start using the vehicle CR as requested.
  • The control device 60 communicates with the provider terminal TM1 operated by the provider of the vehicle CR via the receiving device 10 and the transmission device 80, which serve as the terminal communication devices, and in accordance with commands from the provider terminal TM1 it permits transmission of the data stored in the data storage unit 20 and changes to the user's grades stored in the user database 40.
  • The terminal communication device may instead be provided as a dedicated device.
  • The data may be transmitted when requested by the provider or the like, or transmitted automatically. Because the amount of information in a moving image is large, transmission may be limited to a digest and to the data used for the determination.
  • the transmission of the data stored in the data storage unit 20 to the provider terminal TM1 is performed via the transmission device 80.
  • a dedicated terminal communication device for performing communication with the provider terminal TM1 may be separately provided.
  • It is preferable that the provider can check the data from the user's use and, if the judgment of the classification device 30 is wrong, correct the user's grade from the provider terminal TM1. In this way, the provider of the vehicle CR can directly inspect the data and judge whether a fraudulent act occurred, so even if a grade was assigned incorrectly because the machine misclassified the act, the grade can be corrected manually.
  • the control device 60 may automatically transmit the data to the terminal.
  • The control device 60 receives the output of the classification device 30 and transmits to the provider terminal TM1, for example, data for which the reliability of the classification performed by the classification device 30 is low.
  • By notifying the provider of the classification output and of the image data that was input at that time, a review of the results and further training of the classification device 30 can be encouraged.
  • the control device 60 receives the command from the provider terminal TM1, and controls the learning device 70 in accordance with the command to cause the classifying device 30 to learn.
  • The learning device 70 includes a learning data input unit 71 that receives the data transmitted from the provider terminal TM1, a learning data storage unit 72 that stores the data received by the learning data input unit 71, a parameter calculation unit 73 that calculates parameters of the classifier 33 based on the data stored in the learning data storage unit 72, the outputs of the classifier 33, and the like, a parameter value storage unit 74 that stores the calculated parameter values, and a parameter update unit 75 that updates the parameters of the classifier 33 to the stored values.
  • the learning data input unit 71 may receive data directly from the provider terminal TM1, or the data received by the receiving device 10 of the information management device 100 may be transferred and received by the control device 60.
  • the learning data storage unit 72 previously stores a large number of learning input data for learning and label data as teacher data of each learning input data.
  • Here, the learning input data is image data, and the label data indicates what the image data shows.
  • When the multilayer neural network 33 is used, the learning device 70 includes the learning data input unit 71 that receives data from the provider terminal TM1, the learning data storage unit 72 that stores the data received by the learning data input unit 71, a weight calculation unit 73 that calculates the connection weights between nodes based on the data stored in the learning data storage unit 72, a weight data storage unit 74 that stores the calculated connection weights, and a weight update unit 75 that updates the connection weights between the nodes of the multilayer neural network 33 to the stored connection weights.
  • the learning data storage unit 72 previously stores a large number of learning image data for learning and data of a label assigned to each of the learning image data.
  • the learning data storage unit 72 may store the feature amount output from the feature amount calculation unit 32 in association with each data. Note that a large amount of learning data is stored in the learning data storage unit 72 in advance, and the multilayer neural network 33 has been trained using this learning data.
  • After receiving the data stored in the data storage unit 20 on the provider terminal TM1, the provider directly inspects it to determine whether a fraudulent act occurred and, if so, labels it. For example, the provider looks at low-reliability data that could not be clearly identified as smoking or as eating and drinking, recognizes that it shows smoking, and labels it accordingly.
  • the learning data input unit 71 receives the image data A and the label data A given by the provider from the provider terminal TM1 as data for learning.
  • the label data is data corresponding to the node of the output layer of the multilayer neural network 33.
  • In this case, the label data A is (1, 0, 0, 0, 0, 0, 0), as shown in FIG. 8.
  • the conversion from the label (smoking) to the label data may be executed by the provider terminal TM1 in which dedicated application software is installed, or may be executed by the learning data input unit 71.
  • the image data A received by the learning data input unit 71 and the label data A corresponding to the image data are stored in the learning data storage unit 72 as learning data.
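  • The following is a minimal sketch of converting a provider-assigned label such as "smoking" into label data aligned with the output nodes, matching the (1, 0, 0, 0, 0, 0, 0) form of the example; the class order is an illustrative assumption.
```python
FRAUD_LABELS = ["smoking", "eating_drinking", "littering", "damage",
                "dangerous_driving", "other_violation", "no_violation"]


def to_label_data(label: str) -> list:
    """One element per node of the output layer; 1 for the assigned label."""
    return [1 if act == label else 0 for act in FRAUD_LABELS]


print(to_label_data("smoking"))   # [1, 0, 0, 0, 0, 0, 0]
```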
  • The learning device 70 inputs the learning image data A to the data input unit 31 of the classification device 30; the feature value calculation unit 32 calculates the feature values of the learning image data A and inputs them to the multilayer neural network 33.
  • The output values of the output layer of the multilayer neural network 33 and the output values of its internal nodes at this time are input to the weight calculation unit 73 of the learning device 70.
  • The label data A is also input from the learning data storage unit 72.
  • In addition, the current connection weights of the multilayer neural network 33 are input from the weight data storage unit 74.
  • The weight calculation unit 73 corrects the connection weights between nodes using a method such as error backpropagation so as to reduce the error between the output values of the output layer and the label data, for example the squared error.
  • The weight calculation unit 73 similarly corrects the connection weights between nodes, using a method such as backpropagation, for the other pieces of learning data stored in the learning data storage unit 72 besides the image data A.
  • When the learning device 70 confirms that the error between the output values of the output layer and the label data has converged below the predetermined threshold for all the learning data, the training of the classification device 30 ends.
  • the weight data storage unit 74 stores the connection weight at the end of learning, and outputs the connection weight to the weight update unit 75.
  • the weight updating unit 75 updates the value of the connection weight between the nodes of the multilayer neural network 33 to the value of the connection weight at the end of learning.
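  • The following is a minimal PyTorch sketch of the weight-update step described above: the squared error between the output layer and the label data is reduced by backpropagation until it converges below a threshold; the model, data, and threshold are placeholders.
```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(3780, 128), nn.ReLU(), nn.Linear(128, 7))  # stand-in classifier 33
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()                       # squared error between output values and label data

features = torch.randn(32, 3780)             # learning feature values (placeholder)
labels = torch.zeros(32, 7)                  # label data, one row per learning sample
labels[:, 0] = 1.0                           # e.g. all labeled "smoking" here

threshold = 1e-3
for epoch in range(1000):
    optimizer.zero_grad()
    loss = loss_fn(model(features), labels)
    loss.backward()                          # error backpropagation corrects the connection weights
    optimizer.step()
    if loss.item() < threshold:              # learning ends once the error has converged
        break
print(f"final squared error: {loss.item():.5f}")
```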
  • When a support vector machine is used as the classifier 33, the parameter calculation unit 73 first calculates parameter values using the plurality of pieces of learning data stored in the learning data storage unit 72; the parameter value storage unit 74 stores these values, and the current parameter values of the support vector machine are updated to them via the parameter update unit 75. Thereafter, when the provider reviews the judgments of the support vector machine, the provider assigns correct label data to image data A for which the support vector machine made a wrong judgment, and the image data A and label data A are added through the learning data input unit 71. The parameter calculation unit 73 then recalculates the parameter values of the decision function so that the added learning data is classified correctly, and the parameter update unit 75 updates the parameter values of the support vector machine to the newly calculated values.
  • In addition to margin maximization, the parameters of the decision function may be adjusted using methods such as the soft margin or the kernel trick.
  • AdaBoost, one form of boosting, is also known as a classifier.
  • the Adaboost classifier is composed of a plurality of weak classifiers, and calculates the product of the output value from the weak classifier and the reliability assigned to each weak classifier for each weak classifier, The final classification is performed based on the sum of these products.
  • weights are assigned to the learning input data, and the false detection rate of each weak classifier is calculated based on the weight.
  • the reliability of the weak classifier is calculated based on the false detection rate.
  • A larger weight is assigned to learning input data that a weak classifier misclassifies.
  • When the image data A and the label data A are added, the parameter calculation unit 73 increases the weight of the image data A, re-weights the other image data stored in the learning data storage unit 72, and recalculates the reliability of each weak classifier. The reliability calculated for each weak classifier is stored in the parameter value storage unit 74, and the reliability of each weak classifier is updated via the parameter update unit 75.
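  • The following is a minimal sketch of the AdaBoost bookkeeping described above: each weak classifier receives a reliability computed from its weighted error rate, and misclassified learning data receive larger weights; the weak classifier here is a placeholder function and the data are random.
```python
import numpy as np


def adaboost_round(weak_predict, X, y, sample_weights):
    """One AdaBoost step: reliability from the weighted error, then re-weighting.

    weak_predict: callable mapping X -> predicted labels (+1 / -1)
    """
    pred = weak_predict(X)
    err = np.sum(sample_weights * (pred != y)) / np.sum(sample_weights)  # false-detection rate
    err = np.clip(err, 1e-10, 1 - 1e-10)
    reliability = 0.5 * np.log((1 - err) / err)          # weight given to this weak classifier
    # Misclassified samples (e.g. newly added, relabeled image data) get larger weights.
    new_weights = sample_weights * np.exp(-reliability * y * pred)
    return reliability, new_weights / new_weights.sum()


X = np.random.rand(10, 4)
y = np.where(np.random.rand(10) > 0.5, 1, -1)
w = np.full(10, 1 / 10)
alpha, w = adaboost_round(lambda X: np.sign(X[:, 0] - 0.5), X, y, w)
print(alpha, w.round(3))
```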
  • the classification device 30 and the learning device 70 are realized using a computer and software.
  • the classification device 30 and the learning device 70 may be realized by the same computer. Part of the classification device 30 and the learning device 70 may use dedicated hardware.
  • When the classifier 33 of the classification device 30 is a multilayer neural network, it may be realized using a plurality of neurochips or the like.
  • the same configuration as the classifier 33 may be configured in the parameter calculation unit 73 of the learning device 70 to calculate parameters without receiving the output from the classifier 33.
  • the user accesses the vehicle lending management device RTS from the user terminal TM2, inputs a use period and the like, and applies for a use reservation of the vehicle CR.
  • Upon receiving the application from the user, the vehicle lending management device RTS transmits the user ID to the information management device 100 in S102 and requests the information management device 100 to determine whether use is possible.
  • the user ID may be, for example, a license number or an ID registered in the information management device 100 in advance.
  • Upon receiving the user ID, the information management device 100 authenticates the ID in S104. If authentication succeeds, the information management device 100 accesses the user database 40 in S106 and checks the user's grade regarding fraudulent acts committed when the user previously rented a vehicle CR. If, in S108, the use does not meet the lending conditions or the vehicle has already been reserved, the information management device 100 transmits a non-permission response to the vehicle lending management device RTS in S110; upon receiving it, the vehicle lending management device RTS transmits an unavailable response to the user in S112. When the vehicle CR is not available, the information management device 100 may recommend another vehicle that can be reserved and that meets the lending conditions given the user's grade.
  • the information management device 100 replies to the vehicle lending management device RTS in S114 that the use is permitted.
  • the vehicle lending management device RTS transmits the key ID of the permitted vehicle CR from the transmission device 80 in S116.
  • the user downloads the key ID to his / her smartphone or the like, and in S118, transmits a signal to approach and unlock the door when using the vehicle CR.
  • vehicle CR authenticates the key ID in S120, and if the authentication is successful, unlocks the door of vehicle CR in S122.
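  • The following is a minimal sketch of the key-ID flow in S116 to S122: the issued key ID is delivered to the user's smartphone, and the vehicle unlocks only when the presented key ID matches the one registered for it; this is an illustration, not the actual protocol of the system.
```python
import secrets


class Vehicle:
    """Vehicle-side authentication unit CR5 plus door lock ECU CR8, highly simplified."""

    def __init__(self):
        self.registered_key_id = None
        self.locked = True

    def register_key_id(self, key_id: str) -> None:
        # Key ID received from the management side at the start of use (S116).
        self.registered_key_id = key_id

    def request_unlock(self, presented_key_id: str) -> bool:
        # User approaches and presents the key ID (S118); authenticate (S120); unlock (S122).
        if presented_key_id == self.registered_key_id:
            self.locked = False
        return not self.locked


# The management side issues a key ID for the permitted vehicle and sends it to both sides.
key_id = secrets.token_hex(16)
vehicle = Vehicle()
vehicle.register_key_id(key_id)
print(vehicle.request_unlock(key_id))   # True: door unlocked
```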
  • the vehicle CR provider can provide the vehicle CR to the car sharing system 1 with ease.
  • While the vehicle CR is in use, it continues to detect the user's behavior, actions, and operations and the resulting states with the image sensor CR2 in S200, and continues to monitor the driving state and usage state of the vehicle CR with the vehicle state monitoring unit CR3 in S202.
  • When the user finishes using the vehicle CR, the user gets out of the vehicle in S204 and transmits the key ID to the vehicle CR in S206 to lock it. Upon receiving the key ID, the vehicle CR locks the doors in S208.
  • the vehicle CR transmits data detected by sensors such as the image sensor CR2 and the vehicle state monitoring unit CR3 to the information management device 100 in S210. In this embodiment, the data detected by the sensor is transmitted at the end of use. However, the present invention is not limited to this. Data may be transmitted in real time during use.
  • The classification device 30 classifies any fraudulent acts from this data, and the user's grade is determined based on the usage record.
  • the information management device 100 notifies the determination result to the provider and the user of the vehicle CR.
  • the information management apparatus 100 updates the record of the user in the user database 40 including the latest data.
  • In S218, the provider may access the user's grades and usage results from the provider terminal TM1 at any time he or she chooses.
  • Access is not limited to this timing; the information management device 100 may instead notify the provider when it makes the availability determination, and the provider may review the results at that time.
  • In S220, the information management device 100 transmits from the transmission device 80 a digest, such as image data, of the user's grades and usage results.
  • Upon receiving the digest, the provider, in S222, corrects the user's grade or relabels image data of a fraudulent act that the classification device 30 classified incorrectly, and transmits the result from the provider terminal TM1 to the information management device 100.
  • This is the case, for example, when image data that the classification device 30 classified as eating and drinking actually shows smoking.
  • the information management device 100 updates the user's performance in the user database 40 in S224.
  • the learning device 70 of the information management device 100 trains the multilayer neural network 33 of the classification device 30 using the new learning data in S226.
  • In this way, the provider checks the user's grade data, and when the classification device 30 has made a classification error, the performance of the classification device 30 can be improved by retraining it on that data.
  • the above is also an information management method used in the car sharing system 1 for renting a vehicle CR shared by a plurality of users. That is, the above-described information management method receives sensor data from a sensor installed in the vehicle CR, stores data relating to the user's behavior from the received sensor data, and classifies the user's fraudulent behavior from the data.
  • This is an information management method that records the performance of a user based on the type of misconduct and determines whether the user can use the vehicle based on the performance and a predetermined lending standard. According to this, it is possible to provide an information management method for restricting the lending of the vehicle to the user based on the improper activity performed by the user in the vehicle provided for car sharing or the like.
  • Reference Signs List: 10 Receiving device; 20 Data storage unit / image data storage unit (usage record recording device); 30 Classification device; 31 Data input unit / image data input unit; 32 Feature value calculation unit; 33 Classifier (multilayer neural network); 40 User database (grade recording device); 50 Vehicle database; 60 Control device; 61 Determination unit; 62 Grade assigning unit; 70 Learning device; 71 Learning data input unit; 72 Learning data storage unit; 73 Parameter calculation unit (weight calculation unit); 74 Parameter value storage unit (weight data storage unit); 75 Parameter update unit (weight update unit); 80 Transmission device; RTS Vehicle lending management device; TM1 Provider terminal (operation terminal); TM2 User terminal; CR Vehicle; CR1 Vehicle-side communication unit; CR2 Image sensor; CR3 Vehicle state monitoring unit; CR4 Vehicle-side control unit; CR5 Vehicle-side authentication unit; CR6 Engine ECU; CR7 Power supply ECU; CR8 Door lock ECU; CC Vehicle management system; EK Electronic key

Landscapes

  • Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Health & Medical Sciences (AREA)
  • Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present invention limits the renting of a vehicle to a user on the basis of wrongful acts that the user has committed in a vehicle provided for car sharing or similar purposes. Provided is an information management device 100 comprising: a receiving device 10 for receiving sensor data from a sensor installed in a vehicle CR; a usage record recording device (data storage unit 20) for storing the sensor data received by the receiving device in association with a user as data relating to the acts of the user; a classification device 30 for classifying the wrongful acts of the user from the data relating to the acts of the user recorded by the usage record recording device; and a control device 60 for grading the user on the basis of the types of wrongful acts classified by the classification device and determining whether the user may use a vehicle on the basis of the grade and a prescribed renting standard.

Description

Information management device and information management method
 The present invention relates to an information management device and an information management method for the lending of a shared vehicle.
 Conventionally, technologies for managing shared property, such as car sharing systems, have been proposed. For example, Patent Literature 1 discloses an in-vehicle monitoring system in a car sharing system that can monitor the inside of a vehicle. The in-vehicle monitoring system manages the vehicle according to a reservation from a user, performs authentication processing, lends and returns the vehicle to the user, and monitors the inside of the vehicle by photographing it. The system includes an authentication processing monitoring means that monitors whether the user has completed the authentication processing after using the vehicle, a photographing means that photographs the inside of the vehicle upon receiving a monitoring result indicating that the authentication processing for vehicle use has been completed, and a notifying means that notifies an administrator of the image information captured by the photographing means.
 Techniques have also been proposed that restrict use of a vehicle based on the user's conduct in a rented vehicle. For example, Patent Literature 2 discloses an information management device that restricts vehicle use by users with poor behavior while allowing good users to use the vehicle without restriction. This information management device is used in a vehicle rental system. It includes a storage unit, an identification information acquisition unit, an extraction unit, and an output unit. The storage unit stores in advance, in association with identification information identifying the user, information on whether the vehicle may be rented. The identification information acquisition unit acquires the user's identification information. The extraction unit uses the acquired identification information to extract the corresponding availability information stored in the storage unit. The output unit outputs the extracted availability information.
 Techniques have also been proposed that improve performance through deep learning, which uses a multilayer neural network as a discriminator for detecting objects in data such as images, and through training with teacher data. For example, Patent Literature 3 discloses a learning support device that improves recognition time, learning time, and recognition performance by automatically selecting learning data for each recognition category and selecting optimal feature values in neural-network pattern recognition, and that prevents degradation of recognition performance caused by environmental changes. The learning support device includes a feature value calculation unit, a statistical analysis unit, and a neural network unit. The feature value calculation unit calculates a feature value for each item of image data. The statistical analysis unit performs cluster analysis on the sets of feature values corresponding to the image data, classifies them into a plurality of clusters corresponding to the image categories, and selects a set of feature values representative of each cluster as learning data. After being normalized by a normalization unit, these sets of feature values are used to train the neural network unit.
 A technique has also been proposed that detects motion by obtaining a motion feature vector from video. For example, Patent Literature 4 discloses a specific-operation detection device that detects characteristic specific operations appearing frequently in a video quickly and accurately by analyzing the video. In this device, a specific-operation template creation unit stores the feature vector of the specific operation in advance as a template. A feature calculation unit computes a feature vector from the video, and an identification calculation unit compares it with the stored feature vector to identify the specific operation. The feature vector has elements indexed by frame number, motion-direction number, and block number; each element is the number of pixels moving in a given direction within a given block of a given frame. Because matching of feature vectors that focus on motion direction is performed between the stored feature vector of the specific operation and the computed feature vector, the specific operation can be detected in the video.
 A technique has also been proposed for recognizing actions using feature amounts of learning image data. For example, Patent Literature 5 discloses an action recognition system that can recognize complex and diverse human actions and improve the accuracy of action recognition. This system includes a recognition target data generation unit that generates recognition target data from recognition target image data containing the person whose action is to be recognized; a likelihood calculation unit that compares a plurality of action model data, which model human actions, with the generated recognition target data and calculates the likelihood of the recognition target data for each of the action model data; an image matching unit that compares template data of objects generated in advance with the recognition target image data; and a recognition result determination unit that identifies the person's action as the recognition result based on the calculation result of the likelihood calculation unit and the matching result of the image matching unit.
Patent Literature 1: JP 2012-043042 A
Patent Literature 2: JP 2009-271632 A
Patent Literature 3: JP 09-330406 A
Patent Literature 4: JP 2010-267029 A
Patent Literature 5: JP 2007-213528 A
 In recent years, the need for car sharing has been increasing. On the other hand, for vehicle providers who offer vehicles for car sharing (including vehicle owners and people working for companies or organizations that operate car sharing systems), there is a problem: because they do not know how their vehicles will be used, they cannot comfortably share a car with strangers. This is an obstacle to providers offering their vehicles as shared cars. In addition, the tolerance for fraudulent acts by users of a lent vehicle (prohibited acts that must not be performed under the lending conditions) may differ from provider to provider.
 The present invention has been devised in view of such circumstances, and provides an information management device and an information management method capable of restricting the lending of a vehicle to a user based on fraudulent acts, such as damage or smoking, performed by that user in a vehicle provided for car sharing or as a rental car.
 In order to solve the above problems, there is provided an information management device used in a car sharing system that lends out vehicles shared by a plurality of users, the device comprising: a receiving device that receives sensor data from sensors installed in a vehicle; a usage record device that stores the sensor data received by the receiving device in association with a user as data on the user's acts; a classification device that classifies the user's fraudulent acts from the data on the user's acts recorded by the usage record device; and a control device that gives the user a grade based on the types of fraudulent acts classified by the classification device and determines, based on the grade and a predetermined lending standard, whether the user may use the vehicle.
 According to this configuration, it is possible to provide an information management device that restricts the lending of a vehicle to a user based on the fraudulent acts performed by that user in a vehicle provided for car sharing or the like and on the lending standard.
 Further, the predetermined lending standard may be set for each vehicle owner.
 If the lending standard were uniform, some owners would be uncomfortable because the acceptable range differs from owner to owner. By allowing each owner to set a lending standard that satisfies that owner, such discomfort can be avoided.
 Further, the device may comprise a terminal communication device that communicates with a provider terminal operated by the vehicle provider, and the control device may permit transmission of the data on the user's acts and changes to the grade in accordance with information received from the provider terminal via the terminal communication device.
 According to this, the provider can inspect the data directly and judge whether a fraudulent act occurred, and the grade can be corrected manually even when the machine's classification of the fraudulent act is wrong, so fraudulent acts can be classified accurately.
 Further, the control device may transmit to the provider terminal data on the user's acts for which the reliability of the classification performed by the classification device is low.
 According to this, by notifying the provider of data whose classification is doubtful, the provider can be prompted to review the grade and to train the classification device.
 Further, the device may comprise a learning device that trains the classification device, and the learning device may receive, from the provider terminal, labels of fraudulent acts that the vehicle provider assigns to the data on the user's acts, and may train the classification device using the fraudulent-act labels as teacher data.
 According to this, the performance of the classification device can be improved by having the provider train it.
 In order to solve the above problems, there is also provided an information management method used in a car sharing system that lends out vehicles shared by a plurality of users, the method comprising: receiving sensor data from sensors installed in a vehicle; storing the received sensor data in association with a user as data on the user's acts; classifying the user's fraudulent acts from the data on the user's acts; giving the user a grade based on the types of fraudulent acts; and determining, based on the grade and a predetermined lending standard, whether the user may use the vehicle.
 According to this, it is possible to provide an information management method that restricts the lending of a vehicle to a user based on fraudulent acts or the like performed by that user in a vehicle provided for car sharing or the like.
 As described above, according to the present invention, it is possible to provide an information management device and an information management method capable of restricting the lending of a vehicle to a user based on fraudulent acts, such as damage or smoking, performed by that user in a vehicle provided for car sharing or as a rental car.
FIG. 1 is a block diagram of a car sharing system according to a first embodiment of the present invention.
FIG. 2 is a block diagram of an information management device according to the first embodiment of the present invention.
FIG. 3 is a block diagram of a classification device and a learning device according to the first embodiment of the present invention.
FIG. 4 is a block diagram of the classification device and the learning device according to the first embodiment of the present invention, when a multilayer neural network is used as the classifier.
FIG. 5 is a flowchart at the start of use of a vehicle in the car sharing system according to the first embodiment of the present invention.
FIG. 6 is a flowchart at the end of use of a vehicle in the car sharing system according to the first embodiment of the present invention.
FIG. 7 shows an example of fraudulent acts and occurrence rates according to the present invention.
FIG. 8 shows an example of output values and label data when a multilayer neural network is used as the classifier of the classification device according to the first embodiment of the present invention.
 Hereinafter, embodiments of the present invention will be described with reference to the drawings.
<First embodiment>
 The information management device 100 according to the present embodiment will be described with reference to FIGS. 1 to 4. The information management device 100 is used in a car sharing system 1 that lends out a vehicle CR shared by a plurality of users. The car sharing system 1 includes the information management device 100; the vehicle CR; a user terminal TM2 operated by a user of the vehicle CR when applying for a use reservation; a vehicle lending management device RTS that communicates with the user terminal TM2 when the user applies for a use reservation; an electronic key EK carried by the user when using the vehicle CR; and a provider terminal TM1 operated by the provider of the vehicle CR to communicate with the information management device 100. In this specification, car sharing refers to a system in which the owner of a vehicle CR provides the vehicle CR for shared use within a given group or community and the lending of the vehicle to a plurality of users within that group or community is managed.
 The provider terminal TM1 and the user terminal TM2 have a communication function for communicating with the information management device 100 and the vehicle lending management device RTS via the Internet or a public line, a display function for displaying information from the information management device 100 and the vehicle lending management device RTS, an input function for the provider or user to enter information, and control and storage functions for realizing the display and input functions. The provider terminal TM1 and the user terminal TM2 are, for example, personal computers or smartphones. In the case of a personal computer, for example, the display function is a display, the input function is a keyboard, the control function is a CPU (Central Processing Unit), and the storage function is a memory or a hard disk.
 The vehicle lending management device RTS is accessed when a user operates the user terminal TM2 to apply for use of a vehicle CR. At the time of application, the vehicle lending management device RTS accepts information such as the desired vehicle CR, the period of use, and the return location, and inquires of the information management device 100, connected via a network, whether the application can be permitted. The network connecting the vehicle lending management device RTS and the information management device 100 may be the Internet or a dedicated line. When the vehicle lending management device RTS receives information indicating permission from the information management device 100, it creates a schedule for lending the available vehicle CR, performs billing processing for the user, and transmits the details to the user terminal TM2. When it receives information indicating refusal, it notifies the user terminal TM2 that the desired vehicle CR cannot be lent. Since the vehicle lending management device RTS is accessed by many user terminals TM2, it is preferably a server having control, storage, and communication functions with higher performance than a personal computer.
 The vehicle CR includes a vehicle management system CC. The vehicle management system CC includes a vehicle-side communication unit CR1 that communicates with the information management device 100 via a network; an image sensor CR2, which is a sensor that detects the user's acts, behaviors, and motions and the states resulting from them; a vehicle state monitoring unit CR3 that monitors the driving and usage state of the vehicle CR; a vehicle-side control unit CR4 that controls these components; a vehicle-side authentication unit CR5 that communicates with the electronic key EK to perform authentication; an engine ECU CR6 that controls the engine; a power supply ECU CR7 that controls the power supply; and a door lock ECU CR8 that controls the locking and unlocking of the doors.
 The vehicle-side communication unit CR1 communicates with the information management device 100, for example receiving the identification code of the electronic key EK corresponding to the vehicle CR at the start of use and transmitting information detected by the image sensor CR2 at the end of use. The network between the information management device 100 and the vehicle CR is not particularly limited as long as it allows wireless communication.
 The image sensor CR2 is a sensor that detects the acts, behaviors, and motions of the users of the vehicle CR, including passengers, and the states resulting from them. It is typically a sensor that captures moving images like a surveillance camera, for example a camera capable of imaging visible light or infrared light. The image sensor CR2 is installed in the vehicle CR so that it can image the driver and passengers inside the cabin. It is sufficient that the image sensor CR2 can detect and record information from which acts or states that the provider of the vehicle CR considers undesirable can be recognized, such as smoking, eating and drinking, putting feet on the dashboard, leaning out of a window while the vehicle is moving, poor manners while driving, and states resulting from such acts, such as litter left in the vehicle; preferably, it can also capture audio data. The image sensor CR2 may have a function of detecting the driver's biological signals (vital signs, reflexes, voluntary movements, and the like) so that the driver's tension or emotions can be acquired. The image sensor CR2 can then also detect states that the provider of the vehicle CR considers undesirable, such as driving in an excessively tense state.
 The vehicle state monitoring unit CR3 monitors the state of the vehicle CR using various sensors. For example, the sensors used by the vehicle state monitoring unit CR3 include sensors related to the main control systems for acceleration, steering, and braking, such as an acceleration sensor, a vehicle speed sensor, a steering angle sensor, a vibration sensor, a yaw rate sensor, a gyroscope, an inter-vehicle distance sensor, front and rear cameras, LIDAR, and a position sensor (GPS: Global Positioning System). Based on the data obtained from these sensors, the vehicle state monitoring unit CR3 detects and monitors information from which acts or vehicle states that the provider of the vehicle CR considers undesirable can be recognized, such as sudden starts and sudden braking, or states such as an extremely short inter-vehicle distance. In this specification, the information on the user's acts, behaviors, and motions, the states resulting from them, and the state of the vehicle detected by the image sensor CR2 and the vehicle state monitoring unit CR3 is referred to as data on the user's acts.
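 As one illustration of the kind of monitoring described above, the following is a minimal sketch in Python of flagging sudden starts and sudden braking from longitudinal acceleration samples. The data layout, threshold values, and function names are assumptions made for this example only; they are not specified in the present disclosure.

```python
# Hypothetical sketch: flagging sudden starts / sudden braking from
# longitudinal acceleration samples, as one way the vehicle state
# monitoring unit CR3 could mark undesirable driving events.
# The thresholds and sampling assumptions are illustrative only.

from dataclasses import dataclass
from typing import List


@dataclass
class AccelSample:
    t: float       # time in seconds
    ax: float      # longitudinal acceleration in m/s^2 (positive = forward)


def flag_harsh_events(samples: List[AccelSample],
                      accel_limit: float = 3.0,
                      brake_limit: float = -4.0) -> List[dict]:
    """Return events exceeding illustrative comfort thresholds."""
    events = []
    for s in samples:
        if s.ax >= accel_limit:
            events.append({"t": s.t, "type": "sudden_start", "ax": s.ax})
        elif s.ax <= brake_limit:
            events.append({"t": s.t, "type": "sudden_braking", "ax": s.ax})
    return events


if __name__ == "__main__":
    demo = [AccelSample(0.0, 0.5), AccelSample(1.0, 3.4), AccelSample(2.0, -4.8)]
    print(flag_harsh_events(demo))
    # one "sudden_start" event at t=1.0 and one "sudden_braking" event at t=2.0
```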
 In addition to the illustrated engine ECU CR6 that controls the engine, power supply ECU CR7 that controls the power supply, and door lock ECU CR8 that controls the locking and unlocking of the doors, the vehicle management system CC includes many other ECUs (Electronic Control Units) that detect and control the vehicle states described above, for example a steering control ECU and a brake control ECU.
 The vehicle-side authentication unit CR5 communicates with the electronic key EK and authenticates that the holder legitimately possesses an electronic key EK that can be used with the vehicle CR. When authentication succeeds, the vehicle-side authentication unit CR5 instructs the door lock ECU CR8 to unlock the doors, the power supply ECU CR7 to turn on the main power, and the engine ECU CR6 to turn the engine starter motor, and also instructs the vehicle-side control unit CR4 to start controlling the vehicle-side communication unit CR1 and the other components.
 The electronic key EK may be a vehicle key (FOB) unique to the vehicle CR when it can be handed over directly, or it may be a smartphone or IC card registered by the user in advance. Instead of the electronic key EK, a fingerprint or face that can identify the user may be used; in that case, the vehicle-side authentication unit CR5 has a fingerprint authentication or face authentication function.
 As shown in FIG. 2, the information management device 100 includes a receiving device 10 that receives sensor data from the sensors installed in the vehicle CR; a data storage unit 20 (usage record device) that stores the sensor data received by the receiving device 10 in association with the user as data on the user's acts; a classification device 30 that classifies the user's fraudulent acts from the data on the user's acts recorded by the data storage unit 20; a learning device 70 that trains the classification device 30; a vehicle database 50 that records information on the vehicle CR and information on its provider; a transmission device 80 that transmits the identification code of an available electronic key EK to the vehicle CR; and a control device 60 that controls these devices and databases as a whole, gives the user a grade based on the types of fraudulent acts classified by the classification device 30, and determines, based on the grade and a predetermined lending standard, whether the user may use the vehicle.
 The receiving device 10 has a network interface function for communicating with the vehicle-side communication unit CR1 of the vehicle CR via a network connected to a wireless communication network (for example, a wireless telephone line or a wireless LAN). The receiving device 10 may also provide the receiving function of a terminal communication device that communicates with the provider terminal TM1 via these networks; the terminal communication device that communicates with the provider terminal TM1 may instead be provided separately as a dedicated device, for example in the learning device 70. Via these networks, the receiving device 10 receives the sensor data detected by the various sensors installed in the vehicle CR, for example at the end of use. The receiving device 10 also communicates with the vehicle lending management device RTS and receives information relating to applications for reservations to use the vehicle CR (for example, information on the user and the period of use).
 The data storage unit 20 stores the sensor data received by the receiving device 10 in association with the user as data on the acts of the user of the vehicle CR. The data storage unit 20 records image data detected by the image sensor CR2 and driving data, such as sudden starts and sudden braking detected by the acceleration sensor and steering angle sensor of the vehicle state monitoring unit CR3, linked to a user ID (membership number, driver's license number, name, and so on). The image data preferably covers the various things users (the driver and passengers) do during the ride, for example not only smoking, eating and drinking, and leaving litter captured by the in-vehicle camera, but also images from exterior cameras from which driving behavior such as approaching pedestrians or tailgating the vehicle ahead can be identified. The data storage unit 20 records this information detected by the image sensor CR2 and the vehicle state monitoring unit CR3 on the user's acts, behaviors, and motions, the states resulting from them, and the state of the vehicle as the user's usage record.
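 As a rough illustration of linking sensor data to a user ID, the following Python sketch shows one possible shape of such a usage record. The field and class names are hypothetical and chosen only for this example; the disclosure does not prescribe a data format.

```python
# Hypothetical sketch of how the data storage unit 20 might associate
# received sensor data with a user ID as a usage record.

from dataclasses import dataclass, field
from typing import List


@dataclass
class SensorRecord:
    timestamp: float          # seconds since the start of the rental
    source: str               # e.g. "image_sensor_CR2" or "vehicle_state_CR3"
    payload: dict             # raw detection (frame reference, accel values, ...)


@dataclass
class UsageRecord:
    user_id: str              # membership number, license number, etc.
    rental_id: str            # identifies one lending of the vehicle CR
    records: List[SensorRecord] = field(default_factory=list)

    def add(self, record: SensorRecord) -> None:
        self.records.append(record)


# Example: one rental's record for one user
usage = UsageRecord(user_id="U-0001", rental_id="R-2019-001")
usage.add(SensorRecord(12.5, "vehicle_state_CR3", {"ax": -4.8}))
usage.add(SensorRecord(305.0, "image_sensor_CR2", {"frame": "f_000912"}))
print(len(usage.records))  # 2
```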
 The classification device 30 classifies the user's fraudulent acts based on the data on the user's acts recorded by the data storage unit 20. As shown in FIG. 3, the classification device 30 includes a data input unit 31 that reads the user's usage record from the data storage unit 20, a feature amount calculation unit 32 that calculates feature amounts from the usage record read by the data input unit 31, and a classifier 33 that classifies fraudulent acts based on the feature amounts calculated by the feature amount calculation unit 32. When the data is image data, for example, the feature amount calculation unit 32 calculates the brightness distribution of the whole object in the image (Wavelet) or local feature amounts of the object such as Haar-like, HOG, and EOH features. A plurality of feature amounts may be used. Feature points of the eyes, nose, and mouth, feature points of the fingers, and feature points of their changes over time may also be extracted. Preprocessing such as noise removal and normalization may be performed as necessary. Alternatively, the values of the pixels of the image data (brightness and color information) may be input directly to the classifier 33 without using the feature amount calculation unit 32.
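 The following Python sketch illustrates the data-input, feature-calculation, and classification pipeline just described, assuming HOG features via scikit-image and an SVM-style classifier via scikit-learn. These libraries, the label set, and the class names are illustrative choices, not part of the present disclosure.

```python
# Minimal sketch of the pipeline: data input -> feature amount
# calculation (HOG) -> classifier. Frames are assumed to be
# equally sized grayscale arrays.

import numpy as np
from skimage.feature import hog
from sklearn.svm import SVC

MISCONDUCT_LABELS = ["smoking", "littering", "eating", "feet_on_dash",
                     "phone_call", "tailgating"]


def compute_features(gray_image: np.ndarray) -> np.ndarray:
    """Stand-in for feature amount calculation unit 32: HOG features of one frame."""
    return hog(gray_image, orientations=9,
               pixels_per_cell=(8, 8), cells_per_block=(2, 2))


class ClassificationDevice:
    """Rough stand-in for classification device 30 (classifier 33 = SVM)."""

    def __init__(self) -> None:
        self.classifier = SVC(probability=True)

    def fit(self, images, labels) -> None:
        X = np.stack([compute_features(img) for img in images])
        self.classifier.fit(X, labels)

    def classify(self, image) -> str:
        x = compute_features(image).reshape(1, -1)
        return self.classifier.predict(x)[0]
```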
 The classifier 33 classifies which of the fraudulent acts labeled in advance the input feature amounts correspond to. The classifier 33 may be any classifier that uses a classification method capable of sorting input data (images, video, audio data, and so on) into predetermined categories (classes), and is preferably a classifier that can be trained by machine learning. As shown in FIG. 4, the classifier 33 may be a multilayer neural network 33. The multilayer neural network 33 consists of an input layer, a plurality of intermediate layers, and an output layer whose node configuration and inter-node connection weights have been learned in advance from the calculated feature amounts, for example by deep learning.
 More specifically, the multilayer neural network 33 has at least as many output nodes as there are labeled types of fraudulent acts. For example, if there are six fraudulent acts to be classified, as shown in FIG. 7, namely smoking, leaving litter, eating and drinking, putting feet up, phone calls, and tailgating, the output layer has output nodes corresponding to these fraudulent acts. If the input feature amounts are identified as possibly corresponding to both smoking and eating and drinking, as in FIG. 8, but smoking is more likely, the output is, for example, (smoking : litter : eating : feet up : phone call : tailgating) = (0.8 : 0 : 0.2 : 0 : 0 : 0).
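 A minimal sketch of reading such an output vector, pairing each output node with its misconduct label and picking the most likely class, is shown below. The label order and the helper function are assumptions made for illustration.

```python
import numpy as np

LABELS = ["smoking", "littering", "eating", "feet_on_dash",
          "phone_call", "tailgating"]


def interpret_output(output: np.ndarray) -> dict:
    """Pair each output node with its misconduct label and pick the best."""
    scores = dict(zip(LABELS, output.tolist()))
    best = LABELS[int(np.argmax(output))]
    return {"scores": scores, "predicted": best}


print(interpret_output(np.array([0.8, 0.0, 0.2, 0.0, 0.0, 0.0])))
# predicted: 'smoking'
```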
 For example, an image in which the user's hand is near the mouth and holding something is labeled "eating and drinking" and used for training, and similarly an image in which the hand is near the mouth and a light flickers at the mouth is labeled "smoking" and used for training. When an image similar to these is input, a large output value is produced for "eating and drinking" or "smoking". When the provider adds a new fraudulent act, an output node is added to the output layer of the multilayer neural network 33, and the node configuration of the intermediate layers and the inter-node connection weights are relearned and reconstructed accordingly.
 The classification device 30 is not limited to classification by the multilayer neural network 33. Classifiers that can be trained from learning data are desirable, such as support vector machines, logistic regression, random forests, and boosting, and a plurality of classifiers 33 may be used in combination. A convolutional neural network may be used as the multilayer neural network 33; in that case, the pixel data of the image is input directly to the input layer of the convolutional neural network without using the feature amount calculation unit 32, and the convolution and pooling layers following the input layer perform the feature extraction. Alternatively, when classification by matching against templates is used, a template may be created for each fraudulent act and the target may be matched against the average of the templates with the same label, or matched sequentially against all templates; the number of templates increases as the provider labels data for training. As described above, the classification device 30 classifies data such as image data input from the data storage unit 20 and outputs the result to the control device 60.
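 The template-matching alternative mentioned above can be sketched as a nearest-mean scheme in Python: each fraudulent act keeps the mean of its labeled feature vectors as a template, and new data is assigned to the nearest template. This is an illustrative interpretation, not the exact procedure of the disclosure.

```python
import numpy as np
from collections import defaultdict


class TemplateMatcher:
    def __init__(self) -> None:
        self._vectors = defaultdict(list)   # label -> list of feature vectors

    def add_labeled_example(self, label: str, features: np.ndarray) -> None:
        # Templates grow as the provider labels more data.
        self._vectors[label].append(features)

    def classify(self, features: np.ndarray) -> str:
        templates = {lbl: np.mean(v, axis=0) for lbl, v in self._vectors.items()}
        return min(templates,
                   key=lambda lbl: np.linalg.norm(features - templates[lbl]))


matcher = TemplateMatcher()
matcher.add_labeled_example("smoking", np.array([1.0, 0.2]))
matcher.add_labeled_example("eating", np.array([0.1, 0.9]))
print(matcher.classify(np.array([0.9, 0.3])))  # "smoking"
```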
 The control device 60 performs overall control of the information management device 100 including its databases. The control device 60 communicates with and coordinates the receiving device 10, the transmission device 80, the data storage unit 20, the classification device 30, the learning device 70, the user database 40, and the vehicle database 50, and has the user's fraudulent acts classified based on the data acquired from the vehicle CR. Furthermore, the control device 60 (grade assignment unit 62) determines the user's grade based on the classified fraudulent acts. The control device 60 also controls the receiving device 10, the transmission device 80, the data storage unit 20, the classification device 30, the learning device 70, the user database 40, and the vehicle database 50 in various ways based on information from the provider terminal TM1. The terminal communication device may include the functions of the receiving device 10 and the transmission device 80, or may be provided in the learning data input unit 71 described later.
 The control device 60 (grade assignment unit 62) creates the user's grade based on the classification result output by the classification device 30 and stores it in the user database 40. The user's grade may be the types of fraudulent acts themselves, the total of points assigned to each type of fraudulent act, the occurrence rate described below, or the like. The user database 40 records the user's grade (grade recording device). As shown in FIG. 7, the user database 40 accumulates the occurrence rate of each fraudulent act for each user (user ID). The occurrence rate is calculated as: occurrence rate = number of occurrences of the fraudulent act / number of vehicle lendings. For example, the figure shows that this user smokes (a fraudulent act) on every lending and puts their feet up on one out of every two lendings. The user database 40 may also contain links to the corresponding data in the data storage unit 20.
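 The occurrence-rate calculation above can be sketched directly in Python. The data layout (one list of detected acts per lending) is an assumption for illustration; each act is counted at most once per lending so that the rate compares counts of lendings.

```python
from collections import Counter
from typing import Dict, List


def occurrence_rates(misconduct_per_rental: List[List[str]]) -> Dict[str, float]:
    """occurrence rate = lendings with the act / total number of lendings."""
    lendings = len(misconduct_per_rental)
    counts = Counter(act for rental in misconduct_per_rental for act in set(rental))
    return {act: counts[act] / lendings for act in counts}


# A user with 4 lendings: smoking on every one, feet up on two of them.
history = [["smoking", "feet_on_dash"], ["smoking"],
           ["smoking", "feet_on_dash"], ["smoking"]]
print(occurrence_rates(history))
# {'smoking': 1.0, 'feet_on_dash': 0.5}
```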
 The control device 60 also performs control such as writing information about the vehicle CR and its provider into the vehicle database 50 and, conversely, reading it out. The vehicle database 50 records, as information about the provider of the vehicle CR, a lending standard for each provider. The per-provider lending standard expresses the tolerance for given fraudulent acts: for example, lending is permitted while the occurrence rate of leaving litter or phone calls while driving is below 50 %, but prohibited at 50 % or above, and a single occurrence of smoking or tailgating prohibits lending. Instead of using occurrence rates, the types of fraudulent acts the user has committed may simply be listed and recorded in the vehicle database 50, and lending may be prohibited if a specified fraudulent act is on record.
 The control device 60 performs overall control of the information management device 100 and internally includes a determination unit 61 and a grade assignment unit 62. The determination unit 61 determines whether a user may use the vehicle CR based on the user's grade recorded in the user database 40 and the lending standard held in the vehicle database 50. This makes it possible to restrict the lending of the vehicle CR to a user based on the fraudulent acts the user has committed in vehicles provided for car sharing and on the lending standard set by the provider. The lending standard may be uniform across the company operating the car sharing system 1, but it may also be set for each owner providing a vehicle. Because the acceptable range can differ from owner to owner, an act that poses no problem for one owner may be unpleasant for another; allowing each owner to set a lending standard that satisfies that owner avoids such discomfort.
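 A sketch of how the determination unit 61 might compare a user's occurrence rates against a provider's lending standard follows, using the example thresholds given above. The threshold structure (a maximum allowed occurrence rate per act, with 0.0 meaning a single occurrence already prohibits lending) is an illustrative assumption.

```python
from typing import Dict

# Example per-provider lending standard, following the thresholds above.
PROVIDER_STANDARD: Dict[str, float] = {
    "littering": 0.5,       # allowed while the occurrence rate stays below 50 %
    "phone_call": 0.5,
    "smoking": 0.0,         # one occurrence is enough to refuse lending
    "tailgating": 0.0,
}


def may_lend(user_rates: Dict[str, float],
             standard: Dict[str, float] = PROVIDER_STANDARD) -> bool:
    for act, limit in standard.items():
        rate = user_rates.get(act, 0.0)
        if limit == 0.0 and rate > 0.0:
            return False
        if limit > 0.0 and rate >= limit:
            return False
    return True


print(may_lend({"smoking": 1.0, "feet_on_dash": 0.5}))   # False
print(may_lend({"littering": 0.25}))                     # True
```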
 When the determination unit 61 finds that the applying user satisfies the lending standard for the vehicle CR and permits the lending, the control device 60 controls the transmission device 80 to notify the vehicle lending management device RTS that the use reservation is permitted; when it is not permitted, it notifies the RTS of the refusal. When lending has been permitted and the user actually uses the vehicle CR, the control device 60 also controls the transmission device 80 to transmit the identification code of the usable electronic key EK to the vehicle CR. The user can then start using the vehicle CR as applied for.
 The control device 60 communicates with the provider terminal TM1 operated by the provider of the vehicle CR via the receiving device 10 and the transmission device 80, which serve as a terminal communication device, and in response to commands from the provider terminal TM1 it permits transmission of the data stored in the data storage unit 20 and changes to the user's grade stored in the user database 40. The terminal communication device may also be provided as a dedicated device. The data may be transmitted on request from the provider or the like, or automatically. Because a full video contains too much information, the transmission may be limited to a digest or to the data used for the judgment. Transmission of the data stored in the data storage unit 20 to the provider terminal TM1 is performed via the transmission device 80; a dedicated terminal communication device for communicating with the provider terminal TM1 may also be installed separately. In this way, the provider can check the data from the user's usage and, if the judgment of the classification device 30 is wrong, preferably correct that user's grade using the provider terminal TM1. Since the provider of the vehicle CR can inspect the data directly and judge whether a fraudulent act occurred, the grade can be corrected manually even when it was assigned incorrectly because the machine's fraud classification was wrong.
 In addition to transmitting the data stored in the data storage unit 20 in response to commands from the terminal as described above, the control device 60 may transmit data to the terminal automatically. In that case, the control device 60 receives the output of the classification device 30 and transmits to the provider terminal TM1, for example, data for which the reliability of the classification performed by the classification device 30 is low. Low-reliability data means data for which the output of the classification device 30, for example (smoking : litter : eating : feet up : phone call : tailgating) = (0.6 : 0 : 0.4 : 0 : 0 : 0), does not clearly identify whether the input feature amounts correspond to smoking or to eating and drinking as the fraudulent act. When the reliability is low and the classification is doubtful in this way, notifying the provider of the output of the classification device 30 and the image data input at that time can prompt a review of the grade and further training of the classification device 30.
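 One way to flag such low-reliability classifications for the provider is to check how close the two largest output values are, as in the Python sketch below. The margin threshold is an assumption chosen for illustration, not a value specified in the text.

```python
import numpy as np


def is_low_reliability(output: np.ndarray, margin: float = 0.3) -> bool:
    """Doubtful if the two largest class scores are within the margin."""
    top_two = np.sort(output)[-2:]
    return (top_two[1] - top_two[0]) < margin


print(is_low_reliability(np.array([0.8, 0.0, 0.2, 0.0, 0.0, 0.0])))  # False
print(is_low_reliability(np.array([0.6, 0.0, 0.4, 0.0, 0.0, 0.0])))  # True -> send to TM1
```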
 The control device 60 receives commands from the provider terminal TM1 and, in response, controls the learning device 70 so as to train the classification device 30. As shown in FIG. 3, the learning device 70 includes a learning data input unit 71 that receives data transmitted from the provider terminal TM1; a learning data storage unit 72 that stores the data received by the learning data input unit 71; a parameter calculation unit 73 that calculates the parameters of the classifier 33 based on the data stored in the learning data storage unit 72 and the output of the classifier 33; a parameter value storage unit 74 that stores the calculated parameter values; and a parameter update unit 75 that updates the parameters of the classifier 33 to the stored values. The learning data input unit 71 may receive data directly from the provider terminal TM1, or the data received by the receiving device 10 of the information management device 100 may be forwarded by the control device 60. The learning data storage unit 72 holds in advance a large number of learning input data for training and, for each learning input, label data serving as teacher data. When images are classified, the learning input data are image data and the label data indicate what each image shows.
 The learning device 70 will now be described for the case where a multilayer neural network is used as the classifier 33. As shown in FIG. 4, the learning device 70 includes the learning data input unit 71 that receives data from the provider terminal TM1; the learning data storage unit 72 that stores the data received by the learning data input unit 71; a weight calculation unit 73 that calculates the inter-node connection weights based on the data stored in the learning data storage unit 72; a weight data storage unit 74 that stores the calculated connection weights; and a weight update unit 75 that updates the inter-node connection weights of the multilayer neural network 33 to the stored connection weights. The learning data storage unit 72 holds in advance a large number of learning image data and the labels assigned to them, and may also store the feature amounts output by the feature amount calculation unit 32 linked to each data item. A large amount of learning data is accumulated in the learning data storage unit 72 in advance, and the multilayer neural network 33 has already been trained with it.
 After receiving the data stored in the data storage unit 20 on the provider terminal TM1, the provider inspects the data directly, judges whether a fraudulent act occurred, and if so labels it with the corresponding fraudulent act. For example, looking at low-reliability data that cannot be clearly identified as smoking or eating and drinking, the provider may decide that it shows smoking and label the data accordingly.
 The learning data input unit 71 receives image data A and the label data A assigned by the provider from the provider terminal TM1 as data for training. The label data corresponds to the nodes of the output layer of the multilayer neural network 33. For example, when the label is smoking, the label data A is (1 : 0 : 0 : 0 : 0 : 0), as shown in FIG. 8. The conversion from the label (smoking) to label data may be performed by the provider terminal TM1 running dedicated application software, or by the learning data input unit 71.
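 The conversion from a provider-assigned label such as "smoking" into label data aligned with the output nodes is a one-hot encoding, matching the (1:0:0:0:0:0) example above. A minimal Python sketch, using the same illustrative label order as earlier, is:

```python
import numpy as np

LABELS = ["smoking", "littering", "eating", "feet_on_dash",
          "phone_call", "tailgating"]


def label_to_label_data(label: str) -> np.ndarray:
    """One-hot vector aligned with the output-layer nodes."""
    vec = np.zeros(len(LABELS))
    vec[LABELS.index(label)] = 1.0
    return vec


print(label_to_label_data("smoking"))  # [1. 0. 0. 0. 0. 0.]
```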
 The image data A received by the learning data input unit 71 and the corresponding label data A are stored in the learning data storage unit 72 as training data. The learning device 70 inputs the training image data A to the data input unit 31 of the classification device 30, and the feature amount calculation unit 32 calculates the feature amounts of the image data A and inputs them to the multilayer neural network 33. The output values from the output layer of the multilayer neural network 33 and the output values of its internal nodes at this time are input to the weight calculation unit 73 of the learning device 70, and the label data A is also input from the learning data storage unit 72.
 The current connection weights of the multilayer neural network 33 are also input from the weight data storage unit 74. The weight calculation unit 73 corrects the inter-node connection weights using a method such as error backpropagation so as to reduce the error between the output values of the output layer and the label data, for example the squared error. The weight calculation unit 73 likewise corrects the connection weights for the training data other than image data A stored in the learning data storage unit 72. When the learning device 70 confirms that, for all the training data, the error between the output values of the output layer and the label data has converged to within a predetermined threshold, it ends the training of the classification device 30. The weight data storage unit 74 then stores the connection weights at the end of training and outputs them to the weight update unit 75, and the weight update unit 75 updates the inter-node connection weights of the multilayer neural network 33 to the values at the end of training.
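 The training loop described above can be illustrated with a compact sketch: feed the feature vectors forward, measure the squared error against the label data, and correct the connection weights by backpropagation until the error falls below a threshold. The tiny one-hidden-layer NumPy network below is purely illustrative; the network size, learning rate, and threshold are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def train(features, labels, hidden=8, lr=0.5, tol=1e-3, max_epochs=5000):
    n_in, n_out = features.shape[1], labels.shape[1]
    W1 = rng.normal(scale=0.5, size=(n_in, hidden))
    W2 = rng.normal(scale=0.5, size=(hidden, n_out))
    for _ in range(max_epochs):
        h = sigmoid(features @ W1)          # hidden layer
        y = sigmoid(h @ W2)                 # output layer
        err = y - labels                    # gradient of 0.5 * squared error w.r.t. y
        if np.mean(err ** 2) < tol:         # convergence check over all training data
            break
        # Backpropagation of the squared-error gradient
        d_out = err * y * (1 - y)
        d_hid = (d_out @ W2.T) * h * (1 - h)
        W2 -= lr * h.T @ d_out
        W1 -= lr * features.T @ d_hid
    return W1, W2


# Toy example: two feature vectors with one-hot labels ("smoking", "eating")
X = np.array([[1.0, 0.1], [0.2, 0.9]])
Y = np.array([[1.0, 0.0], [0.0, 1.0]])
W1, W2 = train(X, Y)
print(np.round(sigmoid(sigmoid(X @ W1) @ W2), 2))
```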
 When a support vector machine is used as the classifier 33, the learning device 70 trains the support vector machine using the training data accumulated in the learning data storage unit 72. Specifically, it adjusts the parameters of a discriminant function that can separate the image data into the classes assigned to them. As a simple example, when the feature amounts are two-dimensional (x1, x2), the values of a1, a2, and b, which are the parameters of the discriminant function y = a1x1 + a2x2 + b, are calculated, for example by a method called margin maximization.
 In this way, the parameter calculation unit 73 first calculates the parameter values using the training data accumulated in the learning data storage unit 72. The parameter value storage unit 74 stores those parameter values and, via the parameter update unit 75, updates the current parameters of the support vector machine to them. Afterwards, when the provider checks the judgments of the support vector machine, the provider assigns the correct label data to image data A that the support vector machine misjudged, and the image data A and label data A are added through the learning data input unit 71. The parameter calculation unit 73 then recalculates the parameter values of the discriminant function so that the added training data is classified correctly, and the parameter update unit 75 updates the parameters of the support vector machine to the newly calculated values. Besides margin maximization, methods such as soft margins and the kernel trick may be used to adjust the parameters of the discriminant function.
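 An illustrative sketch of this SVM retraining step, assuming scikit-learn's LinearSVC as the two-dimensional discriminant y = a1x1 + a2x2 + b, is given below. Retraining is shown as simply refitting on the stored training data plus the newly labeled example; the library choice and data are assumptions, not part of the disclosure.

```python
import numpy as np
from sklearn.svm import LinearSVC

# Learning data storage unit 72: 2-D feature vectors and their labels.
X = [[1.0, 0.2], [0.9, 0.1], [0.1, 0.9], [0.2, 1.0]]
y = ["smoking", "smoking", "eating", "eating"]

svm = LinearSVC()
svm.fit(np.array(X), y)                      # initial parameter calculation

# Provider corrects a misjudged sample (image data A -> label data A)
X.append([0.6, 0.5])
y.append("smoking")
svm.fit(np.array(X), y)                      # recalculated parameters

print(svm.coef_, svm.intercept_)             # (a1, a2) and b of the discriminant
```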
 Adaboost, a form of boosting, is also known as a classifier. An Adaboost classifier is composed of a plurality of weak classifiers; for each weak classifier, the product of its output value and the confidence assigned to it is computed, and the final classification is based on the sum of these products over all weak classifiers. When the classifier is trained, a weight is assigned to each training input datum, and the misclassification rate of each weak classifier is calculated from these weights. The confidence of each weak classifier is calculated from its misclassification rate, and the weights of training inputs that a weak classifier misclassifies are set larger.
 Since the Adaboost classifier is retrained with the image data A that it misclassified, the parameter calculation unit 73 increases the weight of image data A, reweights all of the other training input data stored in the learning data storage unit 72, and recalculates the confidence of each weak classifier. The confidence calculated for each weak classifier is stored in the parameter value storage unit 74, and the confidences of the weak classifiers are updated via the parameter update unit 75. The classification device 30 and the learning device 70 are realized using a computer and software; they may be realized on the same computer, and parts of them may use dedicated hardware. For example, when the classifier 33 of the classification device 30 is a multilayer neural network, the classifier 33 may be realized using a plurality of neurochips or the like. The parameter calculation unit 73 of the learning device 70 may contain the same structure as the classifier 33 so that the parameters can be calculated without receiving the output of the classifier 33. By implementing, for each classifier used, software corresponding to that classifier in the parameter calculation unit 73 of the learning device 70, various classifiers can be supported. As described above, training the classification device 30 improves its performance.
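 A small sketch of one Adaboost round along these lines follows: the weak classifier's confidence is derived from its weighted error rate, and the weights of misclassified training inputs are increased. This follows the standard discrete AdaBoost update and is offered only as an illustration of the reweighting described above.

```python
import numpy as np


def adaboost_round(sample_weights, true_labels, weak_predictions):
    """Labels/predictions are +1 / -1. Returns (confidence, new weights)."""
    miss = (weak_predictions != true_labels).astype(float)
    err = np.sum(sample_weights * miss) / np.sum(sample_weights)
    alpha = 0.5 * np.log((1.0 - err) / err)          # weak classifier confidence
    new_w = sample_weights * np.exp(alpha * np.where(miss == 1, 1.0, -1.0))
    return alpha, new_w / new_w.sum()                # misclassified samples gain weight


w = np.full(4, 0.25)
y_true = np.array([1, 1, -1, -1])
y_weak = np.array([1, -1, -1, -1])                   # one misclassification
alpha, w_new = adaboost_round(w, y_true, y_weak)
print(round(alpha, 3), np.round(w_new, 3))
# the misclassified sample ends up with weight 0.5, the others with about 0.167
```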
 The flow at the start of use of the vehicle CR in the car sharing system 1 will be described with reference to FIG. 5. In S100, the user accesses the vehicle lending management device RTS from the user terminal TM2, enters the period of use and other details, and applies for a use reservation for the vehicle CR. Upon receiving the application, the vehicle lending management device RTS transmits the user ID to the information management device 100 in S102 and requests a judgment on whether use can be permitted. The user ID may be, for example, a driver's license number or an ID registered with the information management device 100 in advance.
 Upon receiving the user ID, the information management device 100 authenticates the ID in S104. If the authentication succeeds, the information management device 100 accesses the user database 40 in S106 and checks the user's grade concerning misconduct during past rentals of vehicles CR. If, in S108, the use does not satisfy the lending conditions or the vehicle has already been reserved, the information management device 100 transmits a refusal to the vehicle lending management device RTS in S110. Upon receiving this response, the vehicle lending management device RTS notifies the user in S112 that the vehicle cannot be used. When the vehicle CR is not available, the information management device 100 may also recommend another vehicle that can be reserved and whose lending conditions are satisfied by the user's grade.
 If the reservation is possible in S110 and the user satisfies the lending conditions, the information management device 100 replies to the vehicle lending management device RTS in S114 that use is permitted. Upon receiving this reply, the vehicle lending management device RTS transmits the key ID of the permitted vehicle CR from the transmission device 80 in S116. The user downloads the key ID to his or her smartphone or the like and, in S118, transmits a signal to unlock the door when approaching the vehicle CR for use. Upon receiving the signal, the vehicle CR authenticates the key ID in S120 and, if the authentication succeeds, unlocks the door of the vehicle CR in S122. In this way, lending of the vehicle CR is permitted only to users who, in light of their past misconduct, satisfy the provider's lending criteria, and users who do not satisfy them are automatically refused, so the provider of the vehicle CR can provide the vehicle CR to the car sharing system 1 with peace of mind.
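 The reservation-time decision in S104 through S114 can be summarized as the following sketch. The database access functions and the numeric form of the lending criterion are hypothetical placeholders introduced for illustration, since the embodiment does not specify them.

```python
def decide_use(user_id, vehicle_id, user_db, vehicle_db):
    """Sketch of the reservation-time check: authenticate the user ID, look up the
    user's grade, and compare it with the lending criterion set for the vehicle."""
    if not user_db.authenticate(user_id):                        # S104: ID authentication
        return {"permitted": False, "reason": "authentication failed"}

    grade = user_db.get_grade(user_id)                           # S106: grade from user database 40
    criterion = vehicle_db.get_lending_criterion(vehicle_id)     # per-owner criterion (claim 2)

    if vehicle_db.is_reserved(vehicle_id) or grade < criterion:  # S108: conditions not met
        # S110/S112: refusal; optionally recommend other vehicles the user qualifies for
        alternatives = [v for v in vehicle_db.available_vehicles()
                        if grade >= vehicle_db.get_lending_criterion(v)]
        return {"permitted": False, "alternatives": alternatives}

    return {"permitted": True}                                   # S114: permission; key ID follows in S116
```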
 With reference to FIG. 6, the flow at the end of use of the vehicle CR in the car sharing system 1 will be described. While the vehicle CR is in use, the image sensor CR2 continues, in S200, to detect the user's acts, behavior, and movements and the states resulting from them, and the vehicle state monitoring unit CR3 continues, in S202, to monitor the driving and usage state of the vehicle CR.
 When the user finishes using the vehicle CR, the user gets out of the vehicle CR in S204 and, in S206, transmits the key ID to the vehicle CR to lock it. Upon receiving the key ID, the vehicle CR locks the doors in S208. When the vehicle CR recognizes that use has ended, it transmits, in S210, the data detected by sensors such as the image sensor CR2 and the vehicle state monitoring unit CR3 to the information management device 100. In this embodiment the sensor data are transmitted at the end of use, but the invention is not limited to this, and the data may be transmitted in real time during use.
 Upon receiving the data from the vehicle CR, the information management device 100, in S212, has the classification device 30 classify from the data the acts that constitute misconduct, and determines the user's grade based on the usage record. In S214, the information management device 100 notifies the provider and the user of the vehicle CR of the result of this determination. In S216, the information management device 100 updates the user's grade record in the user database 40 to include the latest data.
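 One possible form of the grading performed in S212 is sketched below. The penalty table and the 0 to 100 scale are assumptions made for illustration, since the embodiment leaves the concrete grading rule open.

```python
# Hypothetical penalty table: points deducted per classified type of misconduct.
PENALTY = {"smoking": 30, "eating_and_drinking": 10, "sudden_braking": 5}

def update_grade(current_grade, classified_events):
    """Deduct penalty points for each misconduct event classified from the sensor data
    (S212) and clamp the grade to a 0-100 scale."""
    for event in classified_events:          # e.g. ["smoking", "eating_and_drinking"]
        current_grade -= PENALTY.get(event, 0)
    return max(0, min(100, current_grade))

# Example: a grade of 90 with one smoking event and one eating/drinking event becomes 50.
print(update_grade(90, ["smoking", "eating_and_drinking"]))  # -> 50
```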
 The provider may, at a timing of his or her choosing, request access to the user's grades and usage record from the provider terminal TM1 in S218. The provider is not limited to access at this timing; the information management device 100 may instead notify the provider at the timing at which it determines whether use is permitted, and the provider may review the record at that time. Upon receiving such an access request, the information management device 100, in S220, transmits from the transmission device 80 a digest such as image data of the user's grades and usage record.
 Upon receiving the digest, the provider, in S222, corrects the user's grade or re-labels image data of misconduct that the classification device 30 has misclassified, and transmits the result from the provider terminal TM1 to the information management device 100. This covers, for example, the case where image data that the classification device 30 classified as eating and drinking was in fact smoking. Upon receiving this content from the provider terminal TM1, the information management device 100 updates the user's grade in the user database 40 in S224. Then, in S226, the learning device 70 of the information management device 100 retrains the multilayer neural network 33 of the classification device 30 using the new learning data. In this way, the provider checks the user's record data and, when the classification device 30 has made a classification error, the classification device 30 is retrained on the corrected record data, which improves the performance of the classification device 30.
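 The correction-and-retraining loop of S222 through S226 could look like the sketch below. The update and training interfaces (a generic fit call and the update_grade_for helper) are assumptions introduced for illustration; the embodiment only states that the relabeled data are fed back to the learning device 70 as teacher data.

```python
def apply_provider_corrections(corrections, user_db, learning_data, classifier):
    """Sketch of S224/S226: update the user's grade with the provider's corrections and
    retrain the classifier on the relabeled images used as teacher data.

    corrections : list of dicts such as
                  {"user_id": "U001", "image": img_array,
                   "old_label": "eating_and_drinking", "new_label": "smoking"}
    """
    for c in corrections:
        # S224: hypothetical helper that adjusts the stored grade for the corrected label
        user_db.update_grade_for(c["user_id"], c["old_label"], c["new_label"])
        learning_data.append((c["image"], c["new_label"]))       # new teacher data

    images = [x for x, _ in learning_data]
    labels = [y for _, y in learning_data]
    classifier.fit(images, labels)                               # S226: retraining
    return classifier
```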
 The above also describes an information management method used in the car sharing system 1 for lending a vehicle CR shared by a plurality of users. That is, the information management method described above receives sensor data from sensors installed in the vehicle CR, stores from the received sensor data the data relating to the user's acts, classifies the user's misconduct from those data, records the user's grade based on the type of misconduct, and determines, based on the grade and a predetermined lending criterion, whether the user may use the vehicle. This provides an information management method that restricts the lending of a vehicle to a user based on misconduct the user has committed in a vehicle provided for car sharing or the like.
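 Taken together, the steps of the method can be read as the short pipeline below. Every class and method name is an illustrative assumption that simply mirrors the sequence receive, store, classify, grade, and judge; it is not a definitive implementation of the embodiment.

```python
class InformationManagementMethod:
    """Minimal sketch of the method: the steps are kept as separate calls so that each
    corresponds to one limitation of the method described above."""

    def __init__(self, store, classifier, grader, criterion):
        self.store = store            # usage record storage (assumed key-value store)
        self.classifier = classifier  # misconduct classifier (e.g. a trained model)
        self.grader = grader          # grading rule
        self.criterion = criterion    # predetermined lending criterion

    def run(self, user_id, sensor_data):
        self.store.save(user_id, sensor_data)               # receive and store sensor data
        misconduct = self.classifier.classify(sensor_data)  # classify misconduct
        grade = self.grader.grade(user_id, misconduct)      # grade by misconduct type
        return grade >= self.criterion                      # judge whether use is permitted
```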
 The present invention is not limited to the illustrated embodiments and can be implemented with configurations that do not depart from the matter recited in the claims. That is, although the present invention has been particularly shown and described mainly with reference to specific embodiments, those skilled in the art can make various modifications to the embodiments described above in quantity and other details of configuration without departing from the technical idea and purpose of the invention.
 DESCRIPTION OF SYMBOLS
 1    Car sharing system
 100  Information management device
 10   Receiving device
 20   Data storage unit / image data storage unit (usage record recording device)
 30   Classification device
 31   Data input unit / image data input unit
 32   Feature calculation unit
 33   Classifier (multilayer neural network)
 40   User database (grade recording device)
 50   Vehicle database
 60   Control device
 61   Judgment unit
 62   Grading unit
 70   Learning device
 71   Learning data input unit
 72   Learning data storage unit
 73   Parameter calculation unit (weight calculation unit)
 74   Parameter value storage unit (weight data storage unit)
 75   Parameter update unit (weight update unit)
 80   Transmission device
 RTS  Vehicle lending management device
 TM1  Provider terminal (operation terminal)
 TM2  User terminal
 CR   Vehicle
 CR1  Vehicle-side communication unit
 CR2  Image sensor
 CR3  Vehicle state monitoring unit
 CR4  Vehicle-side control unit
 CR5  Vehicle-side authentication unit
 CR6  Engine ECU
 CR7  Power supply ECU
 CR8  Door lock ECU
 CC   Vehicle management system
 EK   Electronic key

Claims (6)

  1.  An information management device used in a car sharing system for lending a vehicle shared by a plurality of users, comprising:
     a receiving device that receives sensor data from a sensor installed in the vehicle;
     a usage record recording device that stores the sensor data received by the receiving device, in association with a user, as data relating to the user's acts;
     a classification device that classifies misconduct of the user from the data relating to the user's acts recorded by the usage record recording device; and
     a control device that assigns the user a grade based on the type of misconduct classified by the classification device and determines, based on the grade and a predetermined lending criterion, whether the user may use the vehicle.
  2.  The information management device according to claim 1, wherein the predetermined lending criterion is set for each vehicle owner.
  3.  The information management device according to claim 1 or 2, further comprising a terminal communication device that communicates with a provider terminal operated by a provider of the vehicle,
     wherein the control device permits transmission of the data relating to the user's acts and a change of the grade in accordance with information received from the provider terminal by the terminal communication device.
  4.  The information management device according to claim 3, wherein the control device transmits, to the provider terminal, the data relating to the user's acts for which the reliability of the classification performed by the classification device is low.
  5.  The information management device according to claim 3 or 4, further comprising a learning device that trains the classification device,
     wherein the learning device receives a misconduct label, transmitted from the provider terminal, that the provider of the vehicle assigns to the data relating to the user's acts, and trains the classification device using the misconduct label as teacher data.
  6.  An information management method used in a car sharing system for lending a vehicle shared by a plurality of users, comprising:
     receiving sensor data from a sensor installed in the vehicle;
     storing the received sensor data, in association with a user, as data relating to the user's acts;
     classifying misconduct of the user from the data relating to the user's acts;
     assigning the user a grade based on the type of misconduct; and
     determining, based on the grade and a predetermined lending criterion, whether the user may use the vehicle.
PCT/JP2019/034629 2018-09-10 2019-09-03 Information management device and information management method WO2020054518A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018168836A JP2020042490A (en) 2018-09-10 2018-09-10 Information management apparatus and information management method
JP2018-168836 2018-09-10

Publications (1)

Publication Number Publication Date
WO2020054518A1 true WO2020054518A1 (en) 2020-03-19

Family

ID=69777615

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/034629 WO2020054518A1 (en) 2018-09-10 2019-09-03 Information management device and information management method

Country Status (2)

Country Link
JP (1) JP2020042490A (en)
WO (1) WO2020054518A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7392624B2 (en) * 2020-10-12 2023-12-06 トヨタ自動車株式会社 Control device, control method, and program

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009271632A (en) * 2008-05-01 2009-11-19 Pioneer Electronic Corp Information management device, information management method, information management program, and recording medium
JP2012084136A (en) * 2010-09-13 2012-04-26 Shinano Kenshi Co Ltd Vehicle operation management system
JP2015152953A (en) * 2014-02-10 2015-08-24 Jx日鉱日石エネルギー株式会社 Information processing apparatus and information processing method


Also Published As

Publication number Publication date
JP2020042490A (en) 2020-03-19


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19860355

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19860355

Country of ref document: EP

Kind code of ref document: A1