WO2020054518A1 - Information management device and method - Google Patents
Information management device and method
- Publication number
- WO2020054518A1 (PCT/JP2019/034629)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- vehicle
- data
- information management
- learning
- Prior art date
Links
- 238000007726 management method Methods 0.000 title claims description 91
- 230000009471 action Effects 0.000 claims description 36
- 238000004891 communication Methods 0.000 claims description 20
- 230000000694 effects Effects 0.000 claims description 17
- 230000005540 biological transmission Effects 0.000 claims description 12
- 230000008859 change Effects 0.000 claims description 4
- 238000013500 data storage Methods 0.000 abstract description 37
- 230000010365 information processing Effects 0.000 abstract 1
- 238000004364 calculation method Methods 0.000 description 23
- 238000013528 artificial neural network Methods 0.000 description 22
- 230000006870 function Effects 0.000 description 20
- 238000000034 method Methods 0.000 description 20
- 238000012544 monitoring process Methods 0.000 description 20
- 230000000391 smoking effect Effects 0.000 description 18
- 230000006399 behavior Effects 0.000 description 13
- 230000007937 eating Effects 0.000 description 9
- 230000035622 drinking Effects 0.000 description 8
- 230000008569 process Effects 0.000 description 7
- 238000012706 support-vector machine Methods 0.000 description 7
- 230000033001 locomotion Effects 0.000 description 5
- 230000004044 response Effects 0.000 description 5
- 238000013527 convolutional neural network Methods 0.000 description 4
- 238000010586 diagram Methods 0.000 description 4
- 238000000605 extraction Methods 0.000 description 4
- 230000001133 acceleration Effects 0.000 description 3
- 238000013135 deep learning Methods 0.000 description 3
- 238000001514 detection method Methods 0.000 description 3
- 238000012905 input function Methods 0.000 description 3
- 238000013459 approach Methods 0.000 description 2
- 238000010606 normalization Methods 0.000 description 2
- 238000007619 statistical method Methods 0.000 description 2
- 238000006243 chemical reaction Methods 0.000 description 1
- 238000007621 cluster analysis Methods 0.000 description 1
- 230000008451 emotion Effects 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 238000003384 imaging method Methods 0.000 description 1
- 230000006872 improvement Effects 0.000 description 1
- 238000007477 logistic regression Methods 0.000 description 1
- 238000010801 machine learning Methods 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 238000003909 pattern recognition Methods 0.000 description 1
- 238000011176 pooling Methods 0.000 description 1
- 238000007781 pre-processing Methods 0.000 description 1
- 238000012545 processing Methods 0.000 description 1
- 238000007637 random forest analysis Methods 0.000 description 1
- 238000012552 review Methods 0.000 description 1
- 239000007858 starting material Substances 0.000 description 1
- 230000021542 voluntary musculoskeletal movement Effects 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
Definitions
- the present invention relates to an information management device and an information management method for lending a vehicle to be shared.
- Patent Literature 1 discloses an in-vehicle monitoring system and the like in a car sharing system capable of monitoring the inside of a vehicle.
- The in-vehicle monitoring system manages the vehicle according to a reservation from a user, performs an authentication process, lends and returns the vehicle to the user, and monitors the inside of the vehicle by photographing it.
- This in-vehicle monitoring system in a car sharing system includes an authentication process monitoring unit that monitors whether or not the user has completed the authentication process after using the vehicle, and a unit that receives the monitoring result indicating that the authentication process at the end of vehicle use has been completed.
- Patent Literature 2 discloses an information management device that restricts use of a vehicle to a user who has poor behavior and allows a good user to use the vehicle without restriction.
- This information management device is used in a vehicle rental system for renting a vehicle.
- the information management device includes a storage unit, an identification information acquisition unit, an extraction unit, and an output unit.
- the storage unit stores information on whether or not the vehicle can be rented in advance, corresponding to the identification information for identifying the user.
- the identification information acquisition unit acquires the identification information of the user.
- the extraction unit uses the identification information acquired by the identification information acquisition unit to extract the availability information corresponding to the identification information stored in the storage unit.
- The output unit outputs the availability information extracted by the extraction unit.
- Patent Literature 3 discloses a learning support device and the like that improve recognition time, learning time, and recognition performance by automatically selecting optimal learning data for each recognition category and an optimal feature amount in pattern recognition using a neural network, and that prevent a decrease in recognition performance due to changes in the environment.
- This learning support device and the like include a feature value calculation unit, a statistical analysis unit, and a neural network unit. The feature value calculation unit calculates a feature value for each image data.
- the statistical analysis unit performs a cluster analysis on each set of feature amounts corresponding to each image data, thereby classifying each set of feature amounts into a plurality of clusters corresponding to each category of image data. Then, a set of feature amounts representing each of the classified clusters is selected as learning data. After these sets of feature amounts are normalized by the normalization unit, they are used for learning of the neural network unit.
- Patent Literature 4 discloses a specific operation detecting device that detects a characteristic specific operation that frequently appears in a video at high speed and accurately by analyzing the video.
- In this device, a specific action template creating unit stores the feature vector of the specific action in advance as a template.
- A feature calculating unit calculates a feature vector from the video, and an identification calculating unit identifies the specific action by comparing the feature vector calculated by the feature calculating unit with the feature vector stored by the specific action template creating unit.
- This feature vector has elements indexed by frame number, motion direction number, and block number; that is, each element is the number of pixels with a predetermined motion direction present in a predetermined block of a predetermined frame. Since matching of feature vectors focusing on the motion direction is performed between the previously stored feature vector of the specific action and the calculated feature vector, the specific action can be detected from the video.
- Patent Literature 5 discloses an action recognition system that can recognize complicated and diverse actions of a person and improve the accuracy of action recognition.
- This behavior recognition system is composed of a recognition target data generation unit that generates recognition target data using recognition target image data including the person whose behavior is to be recognized, a likelihood calculation unit that compares a plurality of pieces of behavior model data, which model human behavior, with the recognition target data generated by the recognition target data generation unit and calculates the likelihood of the recognition target data for each piece of behavior model data, an image matching unit that compares a previously generated template of an object with the image data to be recognized, and a recognition result determination unit that identifies the person's behavior as the recognition result based on the calculation result of the likelihood calculation unit and the matching result of the image matching unit.
- JP 2012-043042 A, JP 2009-271632 A, JP 09-330406 A, JP 2010-267029 A, JP 2007-213528 A
- The present invention has been devised in view of such circumstances, and an object thereof is to provide an information management apparatus and an information management method capable of restricting the lending of a vehicle to a user based on fraudulent acts, such as damage or smoking, performed by the user in a vehicle provided for car sharing or as a rental car.
- To solve the above problem, an information management device used in a car sharing system that lends a vehicle shared by a plurality of users is provided, comprising: a receiving device that receives sensor data from a sensor installed in the vehicle; a usage record recording device that stores the sensor data received by the receiving device in association with the user as data on the user's actions; a classification device that classifies the user's fraudulent acts from the data on the user's actions recorded by the usage record recording device; and a control device that assigns a grade to the user based on the type of fraudulent act classified by the classification device and determines whether the user can use the vehicle based on the grade and a predetermined lending standard.
- According to this, it is possible to provide an information management device that, for a vehicle provided for car sharing or the like, restricts the lending of the vehicle to a user based on the fraudulent acts performed by the user and the lending standard.
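- The overall flow just described (receive sensor data, record it per user, classify fraudulent acts, grade the user, and check the lending standard) is sketched below in Python as an illustration only; every name in it (UsageRecord, process_end_of_use, the dictionary-based grade store) is a hypothetical stand-in, not terminology from the claims.

```python
from dataclasses import dataclass, field

@dataclass
class UsageRecord:
    """Sensor data from one rental, stored in association with the user."""
    user_id: str
    sensor_data: list                      # e.g. image frames, driving data
    classified_acts: list = field(default_factory=list)

def process_end_of_use(record, classify, user_db, lending_standard):
    """Classify fraudulent acts, update the user's grade, and re-check availability."""
    record.classified_acts = classify(record.sensor_data)   # classification device
    grade = user_db.setdefault(record.user_id, {})           # count per fraud type
    for act in record.classified_acts:
        grade[act] = grade.get(act, 0) + 1
    # Lending is refused if any count exceeds the provider's tolerance.
    return all(count <= lending_standard.get(act, 0) for act, count in grade.items())
```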
- The predetermined lending standard may be set for each vehicle owner. If the lending standard were uniform, it might not suit every owner, because the allowable range differs from owner to owner; by allowing each owner to set a lending standard that satisfies them, owner dissatisfaction can be prevented.
- The device may further include a terminal communication device that communicates with a provider terminal operated by the vehicle provider, and the control device may permit transmission of the data relating to the user's actions and changes to the grade in accordance with information received from the provider terminal by the terminal communication device. According to this, the provider can directly inspect the data and judge whether fraud has occurred, and even when the machine classification is incorrect, the grade can be corrected manually, so that fraudulent acts can be classified accurately.
- The control device may transmit to the provider terminal data relating to the user's behavior for which the reliability of the classification performed by the classification device is low. By notifying the provider of such classified data, review of the results and further learning of the classification device can be encouraged.
- The device may further include a learning device that trains the classification device. The learning device may receive, from the provider terminal, a label of the fraudulent act that the vehicle provider has assigned to the data on the user's behavior, and may train the classification device using the label as teacher data. According to this, the performance of the classification device can be improved by having the provider contribute to its learning.
- To solve the above problem, an information management method used in a car sharing system for renting out a vehicle shared by a plurality of users is also provided, comprising: receiving sensor data from a sensor installed in the vehicle; storing the sensor data in association with the user as data on the user's actions; classifying the user's fraudulent acts from the data on the user's actions; assigning a grade to the user based on the type of fraudulent act; and determining whether the user can use the vehicle based on the grade and a predetermined lending standard.
- According to this, it is possible to provide an information management method that restricts the lending of a vehicle to a user based on fraudulent acts performed by the user in a vehicle provided for car sharing or the like.
- According to the present invention, the lending of a vehicle to a user can be restricted based on fraudulent acts, such as smoking, performed by the user in a vehicle provided for car sharing or as a rental car.
- FIG. 1 is a block diagram of a car sharing system according to a first embodiment of the present invention.
- FIG. 2 is a block configuration diagram of an information management device according to the first embodiment of the present invention.
- FIG. 3 is a block diagram of a classification device and a learning device according to the first embodiment of the present invention.
- FIG. 4 is a block diagram of the classification device and the learning device according to the first embodiment of the present invention when a multilayer neural network is used as the classifier.
- FIG. 5 is a flowchart at the start of use of a vehicle in the car sharing system according to the first embodiment of the present invention.
- FIG. 6 is a flowchart at the end of use of a vehicle in the car sharing system according to the first embodiment of the present invention.
- FIG. 7 is an example of fraudulent acts and their occurrence rates according to the present invention.
- FIG. 8 is an example of output values and label data when a multilayer neural network is used as the classifier of the classification device according to the first embodiment of the present invention.
- the information management apparatus 100 is used in a car sharing system 1 for renting a vehicle CR shared by a plurality of users.
- The car sharing system 1 includes the information management device 100, the vehicle CR, a user terminal TM2 operated when a user of the vehicle CR applies for a reservation, a vehicle lending management device RTS that communicates with the user terminal TM2 when the user applies for a reservation, an electronic key EK that the user of the vehicle CR carries when using the vehicle, and a provider terminal TM1 that the provider of the vehicle CR operates to communicate with the information management device 100.
- Car sharing in this specification refers to a system in which the owner of the vehicle CR provides the vehicle CR for shared use within a predetermined group or community and in which the lending of the vehicle to a plurality of users within the group or community is managed.
- The provider terminal TM1 and the user terminal TM2 each have a communication function for communicating with the information management device 100 and the vehicle lending management device RTS via the Internet or a public line, a display function for displaying information from the information management device 100 and the vehicle lending management device RTS, an input function for the provider or the user to input information, and a control function and a storage function for realizing the display function and the input function.
- the provider terminal TM1 and the user terminal TM2 are, for example, a personal computer or a smartphone.
- For example, the display function is realized by a display, the input function by a keyboard, the control function by a CPU (Central Processing Unit), and the storage function by a memory or a hard disk.
- the vehicle rental management device RTS is accessed when the user operates the user terminal TM2 to apply for the use of the vehicle CR.
- The vehicle lending management device RTS receives information such as the desired vehicle CR, the period of use, and the return location at the time of the use application, and inquires of the information management device 100, which is connected via a network, whether or not the use application can be permitted.
- the network connecting the vehicle lending management device RTS and the information management device 100 may be the Internet or a dedicated line.
- When the use application is permitted, the vehicle lending management device RTS creates a lending schedule for the available vehicle CR, performs billing processing for the user, and transmits the details to the user terminal TM2.
- When the vehicle lending management device RTS receives information indicating that use is not permitted, it notifies the user terminal TM2 that the desired vehicle CR cannot be lent. Since the vehicle lending management device RTS is accessed by many user terminals TM2, it is preferably a server having control, storage, and communication functions more powerful than those of a personal computer.
- the vehicle CR includes a vehicle management system CC.
- The vehicle management system CC includes a vehicle-side communication unit CR1 that communicates with the information management device 100 via a network, an image sensor CR2 that detects the user's actions, behavior, and operations and the resulting states, a vehicle state monitoring unit CR3 that monitors the driving state and use state of the vehicle CR, a vehicle-side control unit CR4 that performs these controls, a vehicle-side authentication unit CR5 that communicates with the electronic key EK for authentication, an engine ECU CR6 that controls the engine, a power supply ECU CR7 that controls the power supply, and a door lock ECU CR8 that controls door locking and unlocking.
- the vehicle-side communication unit CR1 communicates with the information management device 100 to receive an identification code of the electronic key EK corresponding to the vehicle CR at the start of use, or to transmit information detected by the image sensor CR2 at the end of use.
- the network between the information management device 100 and the vehicle CR is not particularly limited as long as it is a network capable of wireless communication.
- The image sensor CR2 is a sensor that detects the actions, behavior, and operations of the users of the vehicle CR, including passengers, and the resulting states; it is typically a sensor that captures moving images, such as a monitoring camera, for example a camera capable of imaging visible light or infrared light.
- the image sensor CR2 is installed in the vehicle CR so that a driver or a passenger can be imaged in the cabin of the vehicle CR.
- The image sensor CR2 detects states of the vehicle CR resulting from the user's behavior, such as smoking, eating and drinking, putting items on the dashboard, bad manners while driving, and leaving garbage behind.
- the image sensor CR2 may have a function of detecting a biological signal (vital sign, reflection, voluntary movement, or the like) of the driver, and may be capable of acquiring the driver's tension or emotion.
- the image sensor CR2 can also detect a state that the provider of the vehicle CR considers undesirable, such as driving in an unnecessarily nervous state.
- the vehicle state monitoring unit CR3 monitors the state of the vehicle CR using various sensors.
- Sensors used in the vehicle state monitoring unit CR3 include, in addition to sensors related to the main control systems of acceleration, steering, and braking, acceleration sensors, vehicle speed sensors, steering angle sensors, vibration sensors, yaw rate sensors, gyroscopes, inter-vehicle distance sensors, front/rear cameras, LIDAR, position sensors (GPS: Global Positioning System), and the like.
- The information on the user's actions, behavior, and operations detected by the image sensor CR2 and the vehicle state monitoring unit CR3, the resulting states, and the state of the vehicle are collectively referred to below as data on the user's actions.
- In addition to the illustrated engine ECU CR6 for controlling the engine, the power supply ECU CR7 for controlling the power supply, and the door lock ECU CR8 for controlling door locking and unlocking, the vehicle management system CC is equipped with a number of ECUs (Electronic Control Units) that detect and control the vehicle state as described above, for example a steering control ECU and a brake control ECU.
- The vehicle-side authentication unit CR5 communicates with the electronic key EK to authenticate that the holder possesses an electronic key EK authorized to use the vehicle CR. If authentication succeeds, the vehicle-side authentication unit CR5 instructs the door lock ECU CR8 to unlock the doors, instructs the power supply ECU CR7 to turn on the main power supply, instructs the engine ECU CR6 to rotate the starter motor of the engine, and instructs the vehicle-side control unit CR4 to start controlling the vehicle-side communication unit CR1 and the like.
- the electronic key EK may be a vehicle key (FOB) unique to the vehicle CR if it can be directly delivered, or may be a smartphone or an IC card registered in advance by the user. Instead of using the electronic key EK, a fingerprint or face that can identify the user may be used. In this case, the vehicle-side authentication unit CR5 has a function of performing fingerprint authentication / face authentication.
- The information management device 100 includes a receiving device 10 that receives the sensor data from the sensors installed in the vehicle CR, a data storage unit 20 (usage record recording device) that stores the sensor data received by the receiving device 10 in association with the user as data relating to the user's actions, a classification device 30 that classifies the user's fraudulent acts from the data relating to the user's actions recorded by the data storage unit 20, a learning device 70 that trains the classification device 30, a user database 40 that records the users' grades, a vehicle database 50 that records information about the vehicle CR and about the provider of the vehicle CR, a transmission device 80 that transmits the identification code of an available electronic key EK to the vehicle CR, and a control device 60 that controls the entire device and the databases, assigns a grade to the user based on the type of fraudulent act classified by the classification device 30, and determines whether the user can use the vehicle based on the grade and a predetermined lending standard.
- The receiving device 10 has a network interface function for communicating with the vehicle-side communication unit CR1 of the vehicle CR via a wirelessly communicable network (for example, a wireless telephone line or a wireless LAN). The receiving device 10 may also have the receiving function of a terminal communication device that communicates with the provider terminal TM1 via these networks; alternatively, the terminal communication device that communicates with the provider terminal TM1 may be provided separately as a dedicated device, for example in the learning device 70. The receiving device 10 receives, via these networks, the sensor data detected by the various sensors installed in the vehicle CR at the end of use or at other times. The receiving device 10 also communicates with the vehicle lending management device RTS and receives information related to an application for a reservation to use the vehicle CR (for example, information on the user and the period of use).
- the data storage unit 20 stores the sensor data received by the receiving device 10 in association with the user as data relating to the action of the user of the vehicle CR.
- The data storage unit 20 stores the image data detected by the image sensor CR2 and the driving data, such as sudden starts and sudden braking detected by the acceleration sensor and the steering angle sensor of the vehicle state monitoring unit CR3, in association with the user ID (member number, driver's license number, name, etc.).
- The image data captures various actions of the user (driver and passengers) during the ride, for example not only smoking and eating or drinking inside the vehicle captured by the in-vehicle camera, but also behavior toward pedestrians approaching in front of the vehicle captured by the camera outside the vehicle.
- In this way, the data storage unit 20 records, as a usage record, the information on the user's actions, behavior, and operations detected by the image sensor CR2 and the vehicle state monitoring unit CR3, the resulting states, and the state of the vehicle.
- the classifying device 30 classifies a user's misconduct based on data on the user's behavior recorded by the data storage unit 20.
- The classification device 30 comprises a data input unit 31 that reads the user's usage record information from the data storage unit 20, a feature amount calculation unit 32 that calculates feature amounts from the usage record information read by the data input unit 31, and a classifier 33 that classifies fraudulent acts based on the feature amounts calculated by the feature amount calculation unit 32.
- The feature amount calculation unit 32 calculates feature amounts such as the luminance distribution (wavelet) of the entire object in the image, or local feature amounts of the object such as Haar-like features, HOG features, and EOH features; a plurality of feature amounts may be used. Feature points of the eyes, nose, and mouth, feature points of the fingers, and feature points of their changes over time may also be extracted. If necessary, noise removal or normalization may be performed as preprocessing. Alternatively, the value of each pixel of the image data (lightness or color information) may be input directly to the classifier 33 without using the feature amount calculation unit 32.
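- As an illustration of one way the feature amount calculation unit 32 could compute a HOG feature from a camera frame, the following Python sketch uses scikit-image; the parameter values are illustrative assumptions, not values taken from this publication.

```python
import numpy as np
from skimage.color import rgb2gray
from skimage.feature import hog

def frame_to_hog_feature(frame_rgb: np.ndarray) -> np.ndarray:
    """Compute a HOG feature vector for one in-vehicle camera frame."""
    gray = rgb2gray(frame_rgb)                 # preprocessing: drop color information
    return hog(gray,
               orientations=9,
               pixels_per_cell=(8, 8),
               cells_per_block=(2, 2),
               block_norm="L2-Hys")            # built-in normalization of the blocks
```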
- the classifier 33 classifies which of the previously labeled fraudulent acts the input feature value corresponds to.
- the classifier 33 may be any classifier that uses a classification method that can divide input data (images, moving images, audio data, and the like) into predetermined categories (classes).
- the classifier 33 is preferably a classifier capable of machine learning.
- the classifier 33 may be a multilayer neural network 33 as shown in FIG.
- The multilayer neural network 33 includes an input layer into which the calculated feature amounts are input, a plurality of intermediate layers, and an output layer, with a node configuration and inter-node connection weights that have been learned in advance by deep learning or the like.
- The multilayer neural network 33 has at least as many output nodes as there are labeled types of fraudulent acts, and the output layer includes output nodes corresponding to those fraudulent acts.
- For example, an image in which the user's hand is close to the mouth and is holding something is labeled "eating and drinking" for learning; similarly, an image in which the hand is close to the user's mouth and a light flickers is labeled "smoking" and learned. When such an image is input after learning, the output value for "eating and drinking" or "smoking" becomes large.
- When a new type of fraudulent act is to be handled, an output node is added to the output layer of the multilayer neural network 33, and the node configuration of the intermediate layers and the inter-node connection weights are re-learned accordingly.
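- A minimal sketch of such a multilayer neural network with one output node per labeled fraudulent act is shown below, assuming PyTorch; the layer sizes and the list of fraud labels are hypothetical examples, not taken from this publication.

```python
import torch
import torch.nn as nn

FRAUD_LABELS = ["smoking", "eating_drinking", "littering",
                "damage", "dangerous_driving", "pet", "other"]   # 7 example output nodes

classifier_33 = nn.Sequential(
    nn.Linear(1764, 256),                  # input layer sized to the feature vector (example)
    nn.ReLU(),
    nn.Linear(256, 64),                    # intermediate layers
    nn.ReLU(),
    nn.Linear(64, len(FRAUD_LABELS)),      # output layer: one node per fraud type
)

def classify(features: torch.Tensor) -> str:
    """Return the fraud label whose output node has the largest value."""
    with torch.no_grad():
        scores = classifier_33(features)
    return FRAUD_LABELS[int(scores.argmax())]
```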
- The classification device 30 is not limited to a classification method using the multilayer neural network 33; classifiers that learn from training data, such as a support vector machine, logistic regression, a random forest, or boosting, may be used, and a plurality of classifiers 33 may be used in combination.
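- As a sketch of these alternative classifiers and of combining several of them, the following uses scikit-learn; X (feature vectors) and y (fraud labels) are assumed to exist, and the hyperparameters are illustrative only.

```python
from sklearn.ensemble import (AdaBoostClassifier, RandomForestClassifier,
                              VotingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

combined = VotingClassifier(estimators=[
    ("svm", SVC(kernel="rbf", C=1.0, probability=True)),  # soft margin via C, kernel trick via rbf
    ("logreg", LogisticRegression(max_iter=1000)),
    ("forest", RandomForestClassifier(n_estimators=100)),
    ("boost", AdaBoostClassifier(n_estimators=50)),
], voting="soft")

# combined.fit(X, y)               # learn from labeled usage data
# prediction = combined.predict(X_new)
```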
- a convolutional neural network may be used as the multilayer neural network 33.
- When a convolutional neural network is used, the pixel data of the image is input directly to the input layer of the convolutional neural network without using the feature amount calculation unit 32. A convolutional layer and a pooling layer are provided as the layers following the input layer, and these layers perform the feature extraction.
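- A minimal sketch of such a convolutional neural network fed directly with pixel data is shown below, assuming PyTorch; the input size (3x64x64 images), channel counts, and the seven output nodes are illustrative assumptions.

```python
import torch.nn as nn

cnn_classifier = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),   # convolutional layer: feature extraction
    nn.ReLU(),
    nn.MaxPool2d(2),                              # pooling layer
    nn.Conv2d(16, 32, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, 7),                   # one output node per fraud type
)
```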
- Alternatively, a template may be created for each fraudulent act, and matching with the classification target may be performed using the average of the templates having the same label, or matching may be performed sequentially against all templates. When the provider labels data for learning, the number of templates increases.
- the classifying device 30 classifies data such as image data input from the data storage unit 20 and outputs the data to the control device 60.
- the control device 60 controls the entire information management device 100 including the database.
- The control device 60 communicates with and coordinates the receiving device 10, the transmitting device 80, the data storage unit 20, the classification device 30, the learning device 70, the user database 40, and the vehicle database 50, and has the user's fraudulent acts classified from the data obtained from the vehicle CR.
- The control device 60 (grade assigning unit 62) determines the user's grade based on the classified fraudulent acts.
- The control device 60 also controls the receiving device 10, the transmitting device 80, the data storage unit 20, the classification device 30, the learning device 70, the user database 40, and the vehicle database 50 based on information from the provider terminal TM1. The terminal communication device that communicates with the provider terminal TM1 may be included in the functions of the receiving device 10 and the transmitting device 80, or may be provided in a learning data input unit 71 described later.
- the control device 60 creates a grade of the user based on the classification result output by the classification device 30 and stores the grade in the user database 40.
- the user's performance is the type of fraudulent activity performed by the user, the total score given for each type of fraudulent activity, the incidence rate described later, and the like.
- the user database 40 records the performance of the user (a performance recording device). As shown in FIG. 7, the user database 40 accumulates the occurrence rate of each fraudulent act for each user (user ID).
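- As a sketch of how such occurrence rates could be accumulated per user, the following Python function assumes a hypothetical record format of (user ID, set of fraud labels detected in one rental); it is not a prescribed data format.

```python
from collections import defaultdict

def occurrence_rates(usage_records):
    """Return, per user, the fraction of rentals in which each fraudulent act occurred."""
    rentals = defaultdict(int)
    fraud_counts = defaultdict(lambda: defaultdict(int))
    for user_id, acts in usage_records:
        rentals[user_id] += 1
        for act in acts:
            fraud_counts[user_id][act] += 1
    return {user: {act: n / rentals[user] for act, n in acts.items()}
            for user, acts in fraud_counts.items()}
```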
- the control device 60 also performs control such as writing information about the vehicle CR and information about its provider into the vehicle database 50 and reading the information from the vehicle database 50 on the contrary.
- the vehicle database 50 records, as information on the provider of the vehicle CR, a lending standard for each provider.
- The lending criterion for each provider is a tolerance for predetermined fraudulent acts: for example, lending is permitted if the occurrence rate of littering or improper behavior while driving is less than 50% and prohibited if it is 50% or more, or lending is prohibited as soon as a specific fraudulent act occurs. Instead of using occurrence rates, the types of fraudulent acts performed by the user may simply be listed and recorded in the vehicle database 50, and lending may be prohibited if predetermined fraudulent acts are recorded.
- The control device 60 performs overall control of the information management device 100 and includes therein a determination unit 61 and a grade assigning unit 62.
- the determination unit 61 determines whether or not the user can use the vehicle CR based on the user's performance recorded in the user database 40 and the lending standard of the vehicle database 50. According to this, it is possible to restrict the lending of the vehicle CR to the user based on the fraudulent activity performed by the user in the vehicle CR provided to the car sharing and the lending standard determined by the provider.
- The lending standard may be uniform for the company that provides the car sharing system 1, or it may be set for each owner who provides a vehicle. Since the acceptable range may differ from owner to owner, behavior acceptable to one owner may be objectionable to another; by allowing each owner to set a satisfactory lending standard, owner dissatisfaction can be prevented.
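- A sketch of how the determination unit 61 could compare a user's occurrence rates with an owner-specific lending standard is shown below; the owner names, fraud labels, and thresholds are hypothetical.

```python
OWNER_STANDARDS = {
    "owner_A": {"smoking": 0.2, "littering": 0.5},   # owner A is stricter about smoking
    "owner_B": {"smoking": 0.5, "littering": 0.5},   # lending refused at 50% or more
}

def can_lend(user_rates: dict, owner_id: str) -> bool:
    """Lending is refused if any occurrence rate reaches the owner's tolerance."""
    return all(user_rates.get(act, 0.0) < limit
               for act, limit in OWNER_STANDARDS[owner_id].items())
```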
- When the reservation is permitted, the control device 60 controls the transmission device 80 to transmit a permission response to the vehicle lending management device RTS; when it is not permitted, a non-permission response is transmitted. Further, when the user has been permitted to use the vehicle CR and actually uses it, the control device 60 controls the transmission device 80 to transmit the identification code of the available electronic key EK to the vehicle CR. As a result, the user can start using the vehicle CR as requested.
- The control device 60 communicates with the provider terminal TM1 operated by the provider of the vehicle CR via the receiving device 10 and the transmitting device 80, which serve as terminal communication devices, and, in accordance with commands from the provider terminal TM1, permits transmission of the data stored in the data storage unit 20 and changes to the user's grade stored in the user database 40.
- the terminal communication device may be provided as a dedicated device.
- The data may be transmitted when requested by the provider or the like, or automatically. Since the amount of information in a moving image is large, transmission may be limited to a digest and to the data used for the determination.
- the transmission of the data stored in the data storage unit 20 to the provider terminal TM1 is performed via the transmission device 80.
- a dedicated terminal communication device for performing communication with the provider terminal TM1 may be separately provided.
- It is preferable that the provider can check the data from the user's period of use and, if the judgment of the classification device 30 is wrong, correct the user's grade using the provider terminal TM1. In this way, the provider of the vehicle CR can directly inspect the data and judge whether fraud occurred, so even if a grade has been assigned incorrectly because the machine misclassified a fraudulent act, the grade can be corrected manually.
- the control device 60 may automatically transmit the data to the terminal.
- the control device 60 receives the output from the classification device 30 and transmits, for example, data with low reliability of the classification performed by the classification device 30 to the provider terminal TM1.
- By notifying the provider or the like of the output from the classification device 30 and of the image data that was input at that time, review of the results and further learning of the classification device 30 can be encouraged.
- the control device 60 receives the command from the provider terminal TM1, and controls the learning device 70 in accordance with the command to cause the classifying device 30 to learn.
- The learning device 70 includes a learning data input unit 71 that receives data transmitted from the provider terminal TM1, a learning data storage unit 72 that stores the data received by the learning data input unit 71, a parameter calculation unit 73 that calculates parameters of the classifier 33 based on the data stored in the learning data storage unit 72 and on the output from the classifier 33, a parameter value storage unit 74 that stores the calculated parameter values, and a parameter updating unit 75 that updates the parameters of the classifier 33 to the stored values.
- the learning data input unit 71 may receive data directly from the provider terminal TM1, or the data received by the receiving device 10 of the information management device 100 may be transferred and received by the control device 60.
- the learning data storage unit 72 previously stores a large number of learning input data for learning and label data as teacher data of each learning input data.
- the learning input data is image data
- the label data is data indicating what image the image data is.
- When the multilayer neural network 33 is used as the classifier, the learning device 70 includes a learning data input unit 71 that receives data from the provider terminal TM1, a learning data storage unit 72 that stores the data received by the learning data input unit 71, a weight calculation unit 73 that calculates the inter-node connection weights based on the data stored in the learning data storage unit 72, a weight data storage unit 74 that stores the calculated connection weights, and a weight updating unit 75 that updates the inter-node connection weights of the multilayer neural network 33 to the stored values.
- the learning data storage unit 72 previously stores a large number of learning image data for learning and data of a label assigned to each of the learning image data.
- the learning data storage unit 72 may store the feature amount output from the feature amount calculation unit 32 in association with each data. Note that a large amount of learning data is stored in the learning data storage unit 72 in advance, and the multilayer neural network 33 has been trained using this learning data.
- After receiving the data stored in the data storage unit 20 on the provider terminal TM1, the provider directly inspects the data, determines whether any fraudulent act occurred, and labels the data if there is one. For example, the provider views low-reliability data that cannot be clearly identified as smoking or as eating and drinking, recognizes that it shows smoking, and labels the data accordingly.
- the learning data input unit 71 receives the image data A and the label data A given by the provider from the provider terminal TM1 as data for learning.
- the label data is data corresponding to the node of the output layer of the multilayer neural network 33.
- The label data A is (1:0:0:0:0:0:0), as shown in FIG. 8.
- the conversion from the label (smoking) to the label data may be executed by the provider terminal TM1 in which dedicated application software is installed, or may be executed by the learning data input unit 71.
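- The conversion from a label to label data can be sketched as a simple one-hot encoding; the label order below is an assumption (the same illustrative order as in the earlier sketch) and must match the order of the output nodes.

```python
FRAUD_LABELS = ["smoking", "eating_drinking", "littering",
                "damage", "dangerous_driving", "pet", "other"]

def to_label_data(label: str) -> list:
    """Convert a provider-assigned label into label data matching the output nodes."""
    return [1 if label == name else 0 for name in FRAUD_LABELS]

# to_label_data("smoking") -> [1, 0, 0, 0, 0, 0, 0]
```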
- the image data A received by the learning data input unit 71 and the label data A corresponding to the image data are stored in the learning data storage unit 72 as learning data.
- The learning device 70 inputs the learning image data A to the data input unit 31 of the classification device 30, the feature amount calculation unit 32 calculates the feature amounts of the learning image data A, and these feature amounts are input to the multilayer neural network 33.
- the output value from the output layer of the multilayer neural network 33 and each output value of an internal node at this time are input to the weight calculation unit 73 of the learning device 70.
- the label data A is also input from the learning data storage unit 72.
- data as the current connection weight of the multilayer neural network 33 is input from the weight data storage unit 74.
- the weight calculation unit 73 corrects the connection weight between nodes using a method such as an error back propagation method in order to reduce an error between the output value from the output layer and the label data, for example, a square error.
- The weight calculation unit 73 similarly corrects the inter-node connection weights using a method such as the error back propagation method for the other pieces of learning data stored in the learning data storage unit 72.
- When the learning device 70 confirms that the error between the output value from the output layer and the label data has converged below the predetermined threshold for all the learning data, the learning of the classification device 30 ends.
- the weight data storage unit 74 stores the connection weight at the end of learning, and outputs the connection weight to the weight update unit 75.
- the weight updating unit 75 updates the value of the connection weight between the nodes of the multilayer neural network 33 to the value of the connection weight at the end of learning.
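- The learning loop described above (reduce the squared error between the output and the label data by back propagation until it converges for all learning data) could be sketched as follows, assuming PyTorch; the threshold, learning rate, and epoch limit are illustrative.

```python
import torch

def train_classifier(network, learning_data, threshold=1e-3, lr=0.01, max_epochs=1000):
    """learning_data: list of (feature tensor, label-data tensor) pairs."""
    optimizer = torch.optim.SGD(network.parameters(), lr=lr)
    loss_fn = torch.nn.MSELoss()                     # squared error against the label data
    for _ in range(max_epochs):
        worst = 0.0
        for features, label_data in learning_data:
            optimizer.zero_grad()
            loss = loss_fn(network(features), label_data)
            loss.backward()                          # error back propagation
            optimizer.step()                         # correct the connection weights
            worst = max(worst, loss.item())
        if worst < threshold:                        # converged for all learning data
            break
    return network
```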
- When a support vector machine is used as the classifier 33, the parameter calculation unit 73 first calculates parameter values using the plurality of pieces of learning data stored in the learning data storage unit 72; the parameter value storage unit 74 stores these values, and the current parameter values of the support vector machine are updated via the parameter updating unit 75. Thereafter, when the provider checks the determinations of the support vector machine, the provider assigns the correct label data to image data A for which the support vector machine made an incorrect determination, and the image data A and label data A are added through the learning data input unit 71. The parameter calculation unit 73 recalculates the parameter values of the identification function so that the added learning data is classified correctly, and the parameter updating unit 75 updates the support vector machine to the new parameter values.
- the parameters of the identification function may be adjusted using a method such as “soft margin” or “kernel trick” other than “maximization of margin”.
- AdaBoost, one type of boosting, is also known as a classifier.
- An AdaBoost classifier is composed of a plurality of weak classifiers; for each weak classifier, the product of its output value and the reliability assigned to it is calculated, and the final classification is performed based on the sum of these products.
- weights are assigned to the learning input data, and the false detection rate of each weak classifier is calculated based on the weight.
- the reliability of the weak classifier is calculated based on the false detection rate.
- Each weight of the input data for learning is set to a larger weight for input data whose classification is incorrect by the weak classifier.
- When new learning data such as the image data A is added, the parameter calculation unit 73 increases the weight of the image data A, re-weights the other image data stored in the learning data storage unit 72, and calculates the reliability of each weak classifier again. The reliability calculated for each weak classifier is stored in the parameter value storage unit 74, and the reliability of each weak classifier is updated via the parameter updating unit 75.
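- The AdaBoost bookkeeping described above can be sketched as follows: each weak classifier receives a reliability computed from its weighted error rate, and the weights of misclassified learning data are increased before the next round. Labels are assumed to be +1/-1; the formulas follow the standard AdaBoost algorithm rather than any formulation given in this publication.

```python
import math

def adaboost_round(weak_clf, samples, labels, weights):
    """One boosting round: return the weak classifier's reliability and the new weights."""
    preds = [weak_clf(x) for x in samples]
    err = sum(w for w, p, t in zip(weights, preds, labels) if p != t) / sum(weights)
    alpha = 0.5 * math.log((1 - err) / max(err, 1e-10))     # reliability of this weak classifier
    new_w = [w * math.exp(alpha if p != t else -alpha)      # raise weights of misclassified data
             for w, p, t in zip(weights, preds, labels)]
    total = sum(new_w)
    return alpha, [w / total for w in new_w]
```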
- the classification device 30 and the learning device 70 are realized using a computer and software.
- the classification device 30 and the learning device 70 may be realized by the same computer. Part of the classification device 30 and the learning device 70 may use dedicated hardware.
- When the classifier 33 of the classification device 30 is a multilayer neural network, it may be realized using a plurality of neurochips or the like.
- the same configuration as the classifier 33 may be configured in the parameter calculation unit 73 of the learning device 70 to calculate parameters without receiving the output from the classifier 33.
- the user accesses the vehicle lending management device RTS from the user terminal TM2, inputs a use period and the like, and applies for a use reservation of the vehicle CR.
- Upon receiving the application from the user, the vehicle lending management device RTS transmits the user ID to the information management device 100 in S102 and requests a determination of whether use is possible.
- the user ID may be, for example, a license number or an ID registered in the information management device 100 in advance.
- Upon receiving the user ID, the information management device 100 authenticates the ID in S104. If the authentication is successful, the information management device 100 accesses the user database 40 in S106 and checks the user's grade regarding fraudulent acts from previous rentals of the vehicle CR. If the lending conditions are not met or the vehicle has already been reserved (S108), the information management device 100 transmits a non-permission response to the vehicle lending management device RTS in S110. Upon receiving the response, the vehicle lending management device RTS transmits an unusable response to the user in S112. Further, when the vehicle CR is not available, the information management device 100 may recommend another reservable vehicle that meets the lending conditions based on the user's grade.
- the information management device 100 replies to the vehicle lending management device RTS in S114 that the use is permitted.
- the vehicle lending management device RTS transmits the key ID of the permitted vehicle CR from the transmission device 80 in S116.
- the user downloads the key ID to his / her smartphone or the like, and in S118, transmits a signal to approach and unlock the door when using the vehicle CR.
- vehicle CR authenticates the key ID in S120, and if the authentication is successful, unlocks the door of vehicle CR in S122.
- the vehicle CR provider can provide the vehicle CR to the car sharing system 1 with ease.
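- The exchange in S100 to S122 can be summarized in the following sketch; can_lend is a callable such as the one sketched earlier, and the key ID format and return values are illustrative only.

```python
def handle_reservation(user_id, registered_users, user_rates, can_lend, owner_id):
    """Return a key ID when use is permitted, or None for a non-permission response."""
    if user_id not in registered_users:        # S104: ID authentication fails
        return None                            # S110/S112: non-permission response
    rates = user_rates.get(user_id, {})        # S106: look up the user's grade
    if not can_lend(rates, owner_id):          # S108: lending standard not met
        return None
    return f"KEY-{user_id}"                    # S114/S116: permit and send a usable key ID
```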
- While the vehicle CR is in use, the image sensor CR2 continues to detect the user's actions, behavior, and operations and the resulting states in S200, and the vehicle state monitoring unit CR3 continues to monitor the driving state and use state of the vehicle CR in S202.
- When the user finishes using the vehicle CR, the user gets off in S204 and transmits the key ID to the vehicle CR in S206 to lock it. Upon receiving the key ID, the vehicle CR locks the doors in S208.
- the vehicle CR transmits data detected by sensors such as the image sensor CR2 and the vehicle state monitoring unit CR3 to the information management device 100 in S210. In this embodiment, the data detected by the sensor is transmitted at the end of use. However, the present invention is not limited to this. Data may be transmitted in real time during use.
- The classification device 30 classifies any fraudulent acts from the data, and the user's grade is determined based on the usage record.
- the information management device 100 notifies the determination result to the provider and the user of the vehicle CR.
- the information management apparatus 100 updates the record of the user in the user database 40 including the latest data.
- The provider may access the user's grade and usage results from the provider terminal TM1 in S218 at any time.
- Access is not limited to this timing; the information management device 100 may notify the provider when it determines availability, and the provider may check the results at that point.
- In response, the information management device 100 transmits a digest, such as image data, of the user's grade and usage results from the transmission device 80 in S220.
- Upon receiving the digest, in S222 the provider corrects the user's grade or relabels image data of fraudulent acts that the classification device 30 has misclassified, and transmits the corrections from the provider terminal TM1 to the information management device 100.
- An example is a case where image data that the classification device 30 classified as eating and drinking actually shows smoking.
- the information management device 100 updates the user's performance in the user database 40 in S224.
- the learning device 70 of the information management device 100 trains the multilayer neural network 33 of the classification device 30 using the new learning data in S226.
- In this way, the provider checks the user's grade data, and when the classification device 30 has made a classification error, the performance of the classification device 30 can be improved by retraining it on that data.
- The above also describes an information management method used in the car sharing system 1 for renting a vehicle CR shared by a plurality of users. That is, the information management method receives sensor data from a sensor installed in the vehicle CR, stores data relating to the user's actions from the received sensor data, classifies the user's fraudulent acts from that data, records the user's grade based on the type of fraudulent act, and determines whether the user can use the vehicle based on the grade and a predetermined lending standard.
- According to this, it is possible to provide an information management method that restricts the lending of a vehicle to a user based on fraudulent acts performed by the user in a vehicle provided for car sharing or the like.
- Reference Signs List: 10 Receiving device, 20 Data storage unit / image data storage unit (usage record recording device), 30 Classification device, 31 Data input unit / image data input unit, 32 Feature amount calculation unit, 33 Classifier (multilayer neural network), 40 User database (grade recording device), 50 Vehicle database, 60 Control device, 61 Determination unit, 62 Grade assigning unit, 70 Learning device, 71 Learning data input unit, 72 Learning data storage unit, 73 Parameter calculation unit (weight calculation unit), 74 Parameter value storage unit (weight data storage unit), 75 Parameter updating unit (weight updating unit), 80 Transmission device, RTS Vehicle lending management device, TM1 Provider terminal (operation terminal), TM2 User terminal, CR Vehicle, CR1 Vehicle-side communication unit, CR2 Image sensor, CR3 Vehicle state monitoring unit, CR4 Vehicle-side control unit, CR5 Vehicle-side authentication unit, CR6 Engine ECU, CR7 Power supply ECU, CR8 Door lock ECU, CC Vehicle management system, EK Electronic key
Landscapes
- Business, Economics & Management (AREA)
- Tourism & Hospitality (AREA)
- Health & Medical Sciences (AREA)
- Economics (AREA)
- General Health & Medical Sciences (AREA)
- Human Resources & Organizations (AREA)
- Marketing (AREA)
- Primary Health Care (AREA)
- Strategic Management (AREA)
- Physics & Mathematics (AREA)
- General Business, Economics & Management (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
- Traffic Control Systems (AREA)
Abstract
The present invention restricts the lending of a vehicle to a user on the basis of a fraudulent act or the like that the user has committed in a vehicle provided for car sharing or for other similar purposes. An information processing device 100 is provided, comprising: a receiving device 10 for receiving sensor data from a sensor installed in a vehicle CR; a usage record recording device (data storage unit 20) for storing the sensor data received by the receiving device in association with a user as data relating to the user's actions; a classification device 30 for classifying the user's fraudulent acts from the data relating to the user's actions recorded by the usage record recording device; and a control device 60 for grading the user on the basis of the type of fraudulent act classified by the classification device and for determining whether or not use of a vehicle by the user is permissible on the basis of the user's record and a prescribed lending standard.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2018-168836 | 2018-09-10 | ||
JP2018168836A JP2020042490A (ja) | 2018-09-10 | 2018-09-10 | 情報管理装置および情報管理方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020054518A1 true WO2020054518A1 (fr) | 2020-03-19 |
Family
ID=69777615
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2019/034629 WO2020054518A1 (fr) | 2018-09-10 | 2019-09-03 | Dispositif et procédé de gestion d'informations |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP2020042490A (fr) |
WO (1) | WO2020054518A1 (fr) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7392624B2 (ja) * | 2020-10-12 | 2023-12-06 | トヨタ自動車株式会社 | 制御装置、制御方法、及びプログラム |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009271632A (ja) * | 2008-05-01 | 2009-11-19 | Pioneer Electronic Corp | 情報管理装置、情報管理方法、情報管理プログラム、および記録媒体 |
JP2012084136A (ja) * | 2010-09-13 | 2012-04-26 | Shinano Kenshi Co Ltd | 車両運行管理システム |
JP2015152953A (ja) * | 2014-02-10 | 2015-08-24 | Jx日鉱日石エネルギー株式会社 | 情報処理装置および情報処理方法 |
- 2018-09-10: JP JP2018168836A patent/JP2020042490A/ja, active, pending
- 2019-09-03: WO PCT/JP2019/034629 patent/WO2020054518A1/fr, active, application filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009271632A (ja) * | 2008-05-01 | 2009-11-19 | Pioneer Electronic Corp | 情報管理装置、情報管理方法、情報管理プログラム、および記録媒体 |
JP2012084136A (ja) * | 2010-09-13 | 2012-04-26 | Shinano Kenshi Co Ltd | 車両運行管理システム |
JP2015152953A (ja) * | 2014-02-10 | 2015-08-24 | Jx日鉱日石エネルギー株式会社 | 情報処理装置および情報処理方法 |
Also Published As
Publication number | Publication date |
---|---|
JP2020042490A (ja) | 2020-03-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR102297162B1 (ko) | 차량 제어 방법 및 시스템, 차량 탑재 지능형 시스템, 전자 기기, 매체 | |
US10503990B2 (en) | System and method for determining probability that a vehicle driver is associated with a driver identifier | |
JP6145210B1 (ja) | 乗客管理装置、及び乗客管理方法 | |
CN113825689A (zh) | 自主交通工具系统 | |
CN101296821B (zh) | 用于执行驾驶员身份验证的方法 | |
CN106056839A (zh) | 网约车安全监测系统及方法 | |
CN103434484A (zh) | 车载识别认证装置、移动终端、智能车钥控制系统及方法 | |
CN112446761A (zh) | 用于使用区块链进行共乘的系统和方法 | |
US20210397683A1 (en) | System and Method for Continuous User Authentication | |
EP3926498A1 (fr) | Systèmes et procédés d'authentification d'utilisateur continue | |
CN110909597A (zh) | 车辆儿童锁落锁控制方法、装置及车载设备 | |
KR102269367B1 (ko) | 딥러닝 기반의 차량 특징점을 이용한 주차 정산 시스템 | |
US20240311614A1 (en) | Image forgery detection via headpose estimation | |
WO2020054518A1 (fr) | Dispositif et procédé de gestion d'informations | |
CN114926824A (zh) | 一种不良驾驶行为判别方法 | |
EP4246460A2 (fr) | Système d'identification de véhicule | |
CN109770922B (zh) | 嵌入式疲劳检测系统及方法 | |
JP2004164626A (ja) | 車両貸出返却機、車両貸出返却システム及び車両貸出返却方法 | |
CN112686076A (zh) | 一种图像处理方法、系统及计算机可读存储介质 | |
CN114863532A (zh) | 在终端设备处执行的模型训练方法、装置、设备及介质 | |
US8902043B1 (en) | Mitigating conformational bias in authentication systems | |
US20240217523A1 (en) | System and Method for Controlling Access Based on Preconditioned Activities | |
US20240054795A1 (en) | Automatic Vehicle Verification | |
US20240087389A1 (en) | Method of managing parking access into or exiting from a multi-residential building | |
Selavaraj et al. | Image extraction for vehicle theft detection using Neural Network |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19860355 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 19860355 Country of ref document: EP Kind code of ref document: A1 |