WO2017183476A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program Download PDF

Info

Publication number
WO2017183476A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
hardware
unit
vehicle
learning data
Prior art date
Application number
PCT/JP2017/014451
Other languages
French (fr)
Japanese (ja)
Inventor
延浩 小川
勝吉 金本
淳史 野田
拓也 藤田
Original Assignee
ソニー株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニー株式会社 (Sony Corporation)
Priority to JP2018513108A (patent JP7017143B2)
Priority to US16/094,032 (patent US20190114558A1)
Publication of WO2017183476A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/40 Business processes related to the transportation industry
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31 User authentication
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31 User authentication
    • G06F21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60 Protecting data
    • G06F21/62 Protecting access to data via a platform, e.g. using keys or access control rules
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N5/00 Computing arrangements using knowledge-based models
    • G06N5/04 Inference or reasoning models
    • G06N5/043 Distributed expert systems; Blackboards
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/161 Decentralised systems, e.g. inter-vehicle communication
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0221 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2221/00 Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/21 Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F2221/2149 Restricted operating environment
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus

Definitions

  • The present disclosure relates to an information processing apparatus, an information processing method, and a program, and more particularly to an information processing apparatus, an information processing method, and a program that allow learning data accumulated on hardware to be easily inherited.
  • An "intelligent agent" is known as a device that recognizes an environmental situation based on detection results from various sensors, infers the next operation, and performs an operation according to the inference result (see Non-Patent Document 1).
  • The intelligent agent learns knowledge of an unknown environment through experience, accumulates the learning results, repeats inference, and operates according to the inference results.
  • When hardware carrying an intelligent agent is replaced or its user changes, accumulated learning results such as personalized history data are deleted for privacy reasons. As a result, the user of the new hardware cannot make use of the learning results accumulated so far with the new hardware's intelligent agent.
  • The present disclosure has been made in view of such a situation, and in particular makes it possible to take over, in an appropriate state, the learning results obtained by an intelligent agent or the like mounted on hardware, even when the hardware's user changes or the user starts using new hardware.
  • An information processing apparatus includes a storage unit that stores learning data in association with a user who uses hardware, an operation determination unit that determines an operation of the hardware based on the learning data, and a presentation unit that presents, from the learning data stored in the storage unit, options of learning data that the user can use, wherein the operation determination unit determines the operation based on the learning data selected from the options presented by the presentation unit.
  • The learning data can be composed of private data that depends on the user and public data other than the private data, and the presentation unit can present, from the learning data stored in the storage unit, the public data as options of learning data that the operation determination unit can use.
  • The learning data may be composed of individual-dependent data that depends on the individual hardware, model-dependent data that depends on the hardware model, and general-purpose data that does not depend on the hardware.
  • When the user uses other hardware different from the hardware and causes the operation determination unit of the other hardware to determine an operation, the presentation unit can present options that include both the private data and the public data of the general-purpose data among the learning data stored in association with the hardware.
  • The learning data here refers to various data acquired from the hardware while the user is using the hardware.
  • The various data acquired from the hardware includes data acquired based on the operation of each device and piece of software included in the hardware.
  • When the user uses other hardware different from the hardware and causes the operation determination unit of the other hardware to determine an operation, the presentation unit can present options consisting of the model-dependent data and the general-purpose data, as private data and public data respectively, among the learning data stored in association with the other hardware.
  • The presentation unit can present options for discarding, in units of private data and public data, the individual-dependent data, the model-dependent data, and the general-purpose data in the learning data.
  • The learning data can be configured for each piece of hardware used by the user, and the presentation unit can present options for selecting the other hardware to which the learning data is to be transferred and, from the learning data stored in the storage unit, can present the public data of the learning data learned on the hardware used by the user as options of learning data that the operation determination unit of the hardware can use.
  • The learning data can be configured for each piece of hardware used by the user, and can be composed of individual-dependent data that depends on the individual hardware, model-dependent data that depends on the hardware model, and general-purpose data that does not depend on the hardware; the presentation unit can present options for selecting the other hardware to which the learning data is to be transferred and, from the learning data stored in the storage unit, can present the general-purpose data, both public data and private data, of the learning data learned on the hardware used by the user as options of learning data that the operation determination unit of other hardware different from the hardware can use.
  • The hardware may be a vehicle, and the learning data may include individual-dependent data that depends on the individual vehicle, vehicle-type-dependent data that depends on the vehicle type, and general-purpose data that does not depend on the vehicle.
  • The individual-dependent data can include the repair history, mileage, modification information, collision history, and remaining fuel amount.
  • The vehicle-type-dependent data can include the data defined for each vehicle type.
  • The general-purpose data can include route information, nearby store information, visited-location history, conversation history with the agent, driving style, sudden braking, the number of times the horn is sounded, the presence or absence of smoking, data on weather, buildings, roads, and the like, and external information.
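  • As a rough illustration of this taxonomy, the following Python sketch models learning data items tagged with one of the three categories and a private/public flag. The class names, field names, and sample values are illustrative assumptions, not terms defined by the present disclosure.
```python
from dataclasses import dataclass, field
from enum import Enum, auto

class Category(Enum):
    INDIVIDUAL_DEPENDENT = auto()    # depends on the specific vehicle
    VEHICLE_TYPE_DEPENDENT = auto()  # depends on the vehicle type (model)
    GENERAL_PURPOSE = auto()         # does not depend on the vehicle

class Visibility(Enum):
    PRIVATE = auto()  # contains user-dependent information
    PUBLIC = auto()   # contains no user-dependent information

@dataclass
class LearningDataItem:
    category: Category
    visibility: Visibility
    payload: dict = field(default_factory=dict)

# Example items corresponding to the kinds of data listed above.
repair_history = LearningDataItem(Category.INDIVIDUAL_DEPENDENT, Visibility.PUBLIC,
                                  {"repairs": ["brake pads"], "mileage_km": 48210})
visit_history = LearningDataItem(Category.GENERAL_PURPOSE, Visibility.PRIVATE,
                                 {"visited": ["store X", "station Y"]})
```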
  • An anonymization unit that anonymizes the data of the determined operation can further be included, and the storage unit can store the data of the operation determined by the operation determination unit based on the learning data in a state anonymized by the anonymization unit.
  • An information processing method is an information processing method of an information processing apparatus that includes a storage unit storing learning data in association with a user who uses hardware, an operation determination unit determining an operation of the hardware based on the learning data, and a presentation unit presenting, from the learning data stored in the storage unit, options of learning data that the user can use, and the method includes a step in which the operation determination unit determines the operation based on the learning data selected from the options presented by the presentation unit.
  • A program includes a storage unit that stores learning data in association with a user who uses hardware, an operation determination unit that determines an operation of the hardware based on the learning data, and a presentation unit that presents, from the learning data stored in the storage unit, options of learning data that the user can use, and the operation determination unit determines the operation based on the learning data selected from the options presented by the presentation unit.
  • Learning data is stored in association with a user who uses hardware, the operation of the hardware is determined based on the learning data, options of learning data that the user can use are presented from the stored learning data, and the operation is determined based on the learning data selected from the presented options.
  • FIG. 1 is a diagram showing a configuration example of an agent system, consisting of vehicles and a server, to which the present disclosure is applied. The other drawings are a flowchart explaining the agent process performed by the vehicle of FIG. 1, a flowchart and several diagrams explaining the authentication process, a diagram explaining the image that displays the actions available with usable agent data, a diagram explaining the rules concerning the transfer of agent data, a flowchart and a diagram explaining the use setting process, and a flowchart explaining the transfer setting process.
  • FIG. 11 is a diagram illustrating a configuration example of a general-purpose personal computer.
  • FIG. 1 illustrates a configuration example of an agent system including vehicles on which an agent to which the information processing apparatus of the present disclosure is applied is mounted, and a server that manages agent data.
  • The vehicles 11-1 to 11-n in FIG. 1 are equipped with an agent that executes processing according to requests from the user, who is the driver.
  • The agent provides various kinds of driving support to the user of the vehicles 11-1 to 11-n, for example a navigation function that guides the user to a destination and a driving support function that guides accelerator and brake operation so as to improve fuel efficiency.
  • The agent functions mainly as a navigation device that searches for and guides a route to a desired destination when the user, who is the driver, uses the vehicle 11, but other functions may also be realized.
  • In the following, the vehicles 11-1 to 11-n are simply referred to as the vehicle 11 unless they need to be distinguished, and the other components are referred to in the same manner.
  • For example, the agent uses vehicle size information to search for routes that exclude difficult roads narrower than the vehicle width.
  • When there is a history of waypoints (stop points) frequently set together with a destination at a specific time, the agent searches for routes that include those frequently set stops.
  • In this way, the agent repeats learning based on the driver's habits and history related to route searches and the condition of the vehicle 11, and by performing route searches according to the learning results, executes a route search that is optimal for the driver.
  • The agent stores the learning results as agent data in association with the driver or the vehicle, and stores them in the server 12, such as a cloud server, via the network 13, typified by the Internet.
  • The agent authenticates the driver, accesses the server 12 based on the authentication result, reads out the corresponding agent data, and uses it for route searches and the like.
  • For example, when the driver changes from a vehicle A to a vehicle B, the agent data used for that driver can be migrated from the vehicle A to the vehicle B and used there.
  • At this time, the route search results, the learned history, and the like included in the agent data can be transferred as they are, while the vehicle size taken into account becomes that of the new vehicle B.
  • In the agent data associated with the vehicle A, a special history indicating travel in special areas, for example a history of driving to the top of Mt. Fuji or across the North American continent, can be kept in a state in which it is not identified as personal information but can be recognized when a new driver Q takes over the vehicle A and drives it. Further, the repair history, accident history, and the like of the vehicle A can also be left in the agent data associated with the vehicle A as a special history. The special history included in the agent data can be used as value-added information for the vehicle A.
  • A history that adds a sense of premium value can serve as an index that increases the value of the vehicle A.
  • The vehicle 11 includes a control unit 31, a vehicle drive unit 32, an operation input unit 33, a vehicle operation detection unit 34, a communication unit 35, a display unit 36, an audio output unit 37, an audio input unit 38, an imaging unit 39, a storage unit 40, and an agent processing unit 41.
  • The control unit 31 is a computer composed of a so-called ECU (Engine Control Unit) or the like, and controls the overall operation of the vehicle 11.
  • The vehicle drive unit 32 is a general term for the drive components of the vehicle 11, such as the engine, accelerator, brake, air conditioner, and lighting, and its operation is controlled by the control unit 31.
  • The control unit 31 may control automatic driving of the vehicle 11; in this case, automatic driving is realized by controlling the vehicle drive unit 32.
  • The operation input unit 33 consists of buttons or a touch panel for inputting various kinds of information for the agent controlled by the agent processing unit 41, and is operated by the driver, who is the user, to output an operation signal corresponding to the operation content.
  • The vehicle motion detection unit 34 includes various sensors in the vehicle 11 and detects, for example, the presence or absence of automatic brake operation and the presence or absence of a collision, in addition to yaw, pitch, and roll measured by a three-dimensional acceleration sensor.
  • The communication unit 35 includes an Ethernet (registered trademark) board or the like, and communicates with the server 12 via the network 13 to transmit and receive various data.
  • The display unit 36 is a display composed of an LCD (Liquid Crystal Display), an organic EL (Electro Luminescence) display, or the like, and displays various kinds of information.
  • The audio output unit 37 includes a speaker and outputs various kinds of information as audio; for example, it outputs as voice the information related to route guidance in the car navigation function of the agent realized by the agent processing unit 41.
  • The voice input unit 38 is composed of a microphone or the like and, like the operation input unit 33, accepts the driver's instructions by voice.
  • The imaging unit 39 is configured with an imaging element such as a CMOS (Complementary Metal Oxide Semiconductor) image sensor, captures images of the traveling direction of the vehicle 11, the rear, the side rear, the area below, the vehicle interior, the entire surroundings of the vehicle, and the driver's facial expression, and outputs the image signals.
  • The storage unit 40 includes a flash memory or the like, reads and stores the agent data stored in the server 12, and is rewritten as appropriate according to the processing of the agent processing unit 41.
  • The agent data stored in the storage unit 40 is stored in the server 12 as necessary.
  • The agent processing unit 41 realizes the function as an agent described above.
  • The agent processing unit 41 includes an authentication unit 61, an account management unit 62, an agent data synchronization management unit 63, an agent data use management unit 64, an agent data migration management unit 65, an agent data discard management unit 66, an analysis unit 67, a case search unit 68, a case confirmation unit 69, a case correction unit 70, a case determination unit 71, a case anonymization unit 72, and a case verification unit 73.
  • When the operation input unit 33 is operated, the authentication unit 61 authenticates, using the input ID and password, whether the user is a registered user.
  • The authentication unit 61 may instead authenticate whether the user is a registered driver based on the driver's face image captured by the imaging unit 39, for example.
  • The authentication unit 61 may also perform authentication using the user's retina pattern or fingerprint captured by the imaging unit 39, or using the user's voiceprint input via the voice input unit 38.
  • The account management unit 62 manages the account of the user, who is the driver, under which the agent data related to the vehicle 11 is managed. More specifically, when authentication is not permitted by the authentication unit 61, that is, for a new driver, the account management unit 62 registers the driver as an account. Normally, an account is set for the vehicle 11 in association with the driver who is the individual owner.
  • The account management unit 62 may also set a temporary account and manage the individual-dependent data on the server 12 in association with the temporary account even while no driver is registered, until a driver who is the new owner is registered. In this case, when the new driver is registered, the temporary account is deleted, and the data is managed on the server 12 in association with the account set for that driver.
  • The agent data synchronization management unit 63 accesses the server 12 based on the authentication result and information identifying the vehicle 11, reads out the agent data of the authenticated user that is stored in association with the vehicle 11, synchronizes it, and stores it in the storage unit 40.
  • The agent data usage management unit 64 executes the usage setting process for the agent data stored in the storage unit 40 and makes the usage settings according to the operation content of the operation input unit 33.
  • The agent data migration management unit 65 executes the migration setting process for transferring the agent data to a new vehicle 11 and makes the appropriate migration settings according to the operation content of the operation input unit 33.
  • The agent data discard management unit 66 executes the discard setting process, accepts operations from the operation input unit 33, and makes the discard settings for the data whose discard is instructed among the agent data.
  • The analysis unit 67 analyzes the situation of the vehicle 11 based on the various detection results supplied from the operation input unit 33, the vehicle motion detection unit 34, and the imaging unit 39, and identifies a case that poses a problem.
  • The case search unit 68 searches the cases stored as agent data for the case analyzed by the analysis unit 67, and outputs the retrieved case as the target case.
  • The case confirmation unit 69 confirms whether the problem identified by the analysis unit 67 can be solved based on the retrieved target case.
  • The case correction unit 70 corrects the retrieved target case so that the problem can be solved when the case checked by the case confirmation unit 69 cannot solve the problem.
  • The case determination unit 71 determines whether the problem has been solved by the agent processing according to the target case, and determines whether anonymization is necessary so that the driver's personal information is not identifiable when the solved target case is managed as agent data.
  • The case anonymization unit 72 anonymizes the information of the case that solved the problem, for example by k-anonymization, when the case determination unit 71 determines that anonymization is required.
  • The case verification unit 73 verifies whether the case that solved the problem is a falsified case.
  • The agent data use management unit 64 stores, in the agent data, case data that has solved a problem, anonymized as necessary and confirmed not to have been tampered with. The agent data use management unit 64 then performs the agent function in subsequent processing using the agent data in which these cases have been accumulated. As a result, by accumulating cases, the learning needed to perform the agent function is repeated, and the processing accuracy improves.
  • The term "learning" used here refers to repeating a process in which the agent data, which is the learning data necessary for performing the agent function, is used to solve a problem, the agent data is learned and corrected accordingly, the process for solving the problem is executed, and the executed content is stored as agent data.
  • The agent data, which is learning data, includes not only the processing content (cases) executed to solve problems, such as the search results and correction results used for solving a problem, but also various data acquired from the hardware while the user is using the hardware.
  • The various data include, for example, the location, time, and driver state (awake, asleep, heart rate, and so on) at the time of sudden braking, meandering driving, slipping, lane departure, running a signal, overtaking, and accidents.
  • In other words, the agent data, which is learning data, can also be said to include, in addition to the content executed to solve problems, data indicating the operating state of the vehicle, which is the hardware, and data indicating the driver's actions, that is, an operation history.
  • The data indicating the operating state of the hardware includes data obtained by executing the software that controls the operation of the hardware and data obtained from outside in order to control the operation of the hardware, for example data obtained from a cloud server.
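  • The accumulate-and-reuse cycle described above can be sketched as follows; this is a minimal illustration assuming an in-memory store of cases, and the class and function names are not taken from the present disclosure.
```python
from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class Case:
    problem: str    # e.g. "route from X to Y"
    solution: str   # the executed content, e.g. the chosen route
    success: bool

class AgentDataStore:
    """Holds the accumulated cases (learning data) for one user/vehicle pair."""
    def __init__(self) -> None:
        self.cases: List[Case] = []

    def search(self, problem: str) -> Optional[Case]:
        # Return the most recent successful case for the same problem, if any.
        for case in reversed(self.cases):
            if case.problem == problem and case.success:
                return case
        return None

    def record(self, case: Case) -> None:
        self.cases.append(case)

def handle_request(store: AgentDataStore, problem: str,
                   solve: Callable[[str], str]) -> str:
    """One learning cycle: reuse a stored case if possible, then record what was executed."""
    hit = store.search(problem)
    solution = hit.solution if hit else solve(problem)
    store.record(Case(problem, solution, success=True))
    return solution

store = AgentDataStore()
print(handle_request(store, "route from X to Y", lambda p: "R1-R2-R3"))  # solved from scratch
print(handle_request(store, "route from X to Y", lambda p: "unused"))    # reuses the stored case
```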
  • The server 12 is constructed from at least one server on the network 13, such as a cloud server, and stores the agent data of the vehicles 11-1 to 11-n. More specifically, the server 12 includes a control unit 91, a communication unit 92, and a storage unit 93.
  • The control unit 91 controls the overall operation of the server 12.
  • The communication unit 92 includes, for example, an Ethernet (registered trademark) board, communicates with the vehicle 11 via the network 13, and transmits and receives various data.
  • The storage unit 93 stores the agent data registered in association with at least one of the vehicle 11 and the user.
  • agent processing by the agent system of FIG. 1 will be described with reference to the flowchart of FIG.
  • In step S11, the authentication unit 61 determines whether the driver has boarded, based on the information supplied from the operation input unit 33, the vehicle motion detection unit 34, and the imaging unit 39. That is, for example, when the operation input unit 33 is operated and information indicating boarding is input, when the vehicle operation detection unit 34 detects that the doors have been unlocked and opened and the vehicle has been put into an operable state, or when a face image is detected near the driver's seat in the in-vehicle image captured by the imaging unit 39, the authentication unit 61 regards the driver as having boarded, and the processing proceeds to step S12. The process of step S11 is repeated until boarding is detected.
  • In step S12, the authentication unit 61 executes the authentication process to authenticate the driver.
  • In step S41, the authentication unit 61 displays an authentication start image, for example as shown in FIG. 4.
  • In FIG. 4, a "login" image is displayed, and a button B1 labeled "ID input", which is operated when entering an ID, is displayed at the upper center.
  • An ID input field C1 labeled "ID", a password input field C2 labeled "Password", and a keyboard C3 are also displayed.
  • The ID input field C1 and the password input field C2 are selected with a pointer or the like, and the keyboard C3 is operated via the operation input unit 33 so that the ID and password can be input.
  • A button C4 labeled "Completed", which indicates that input has been completed, is also displayed.
  • In step S42, the authentication unit 61 determines whether an ID and a password have been input, and the processes of steps S41 and S42 are repeated until it is determined that they have been input.
  • In step S42, when, for example, the ID input field C1 and the password input field C2 are selected, the keyboard C3 is operated via the operation input unit 33 to enter the ID and password, and the button C4 is operated, the ID and password are considered to have been input, and the process proceeds to step S43.
  • In step S43, the authentication unit 61 controls the communication unit 35 to collate with the server 12 whether the ID and password match those registered in advance. At this time, an image displaying "logged in", as shown for example in FIG. 6, is displayed.
  • The control unit 91 of the server 12 controls the communication unit 92 to acquire the transmitted ID and password and checks whether they match a user ID and password registered in advance in the storage unit 93. The control unit 91 then controls the communication unit 92 to transmit the collation result to the vehicle 11 that transmitted the ID and password.
  • In step S44, the authentication unit 61 determines, based on the collation result, whether the ID and password match and authentication is permitted.
  • If authentication is permitted in step S44 (authentication OK), the authentication unit 61 recognizes in step S45 that authentication has been permitted.
  • If authentication is not permitted in step S44 (authentication NG), the authentication unit 61 recognizes in step S46 that authentication has not been permitted.
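  • The exchange in steps S43 to S46 can be sketched as follows. The present disclosure only states that the ID and password are collated with credentials registered on the server 12; the hashing, the lookup table, and the function names here are assumptions used for illustration.
```python
import hashlib

# Hypothetical registered-user table standing in for the storage unit 93 of the server 12.
REGISTERED = {"driver_a": hashlib.sha256(b"secret").hexdigest()}

def collate(user_id: str, password: str) -> bool:
    """Server-side check (steps S43/S44): compare against the stored credentials."""
    stored = REGISTERED.get(user_id)
    return stored is not None and stored == hashlib.sha256(password.encode()).hexdigest()

def authenticate(user_id: str, password: str) -> str:
    """Vehicle-side outcome: 'OK' (step S45) or 'NG' (step S46)."""
    return "OK" if collate(user_id, password) else "NG"

print(authenticate("driver_a", "secret"))  # OK
print(authenticate("driver_a", "wrong"))   # NG
```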
  • When the authentication process of step S12 is completed, the process proceeds to step S13.
  • In step S13, the authentication unit 61 determines from the authentication result whether authentication is permitted (authentication OK) or not.
  • If authentication is not permitted in step S13, the process proceeds to step S24.
  • In step S24, the authentication unit 61 determines whether the operation input unit 33 has been operated to request the setting of a new account. If it is determined in step S24 that the setting of a new account is requested, the process proceeds to step S25.
  • In step S25, the account management unit 62 controls the display unit 36 to display an image for setting a new account, and accepts an operation input for setting the new account via the operation input unit 33.
  • In step S26, the account management unit 62 sets the new account based on the information input by operating the operation input unit 33, controls the communication unit 35 to register the new account with the server 12, and the process proceeds to step S23.
  • In step S27, the authentication unit 61 controls the display unit 36 to display an indication that authentication is not permitted (authentication NG), for example an image in which "Login failed" is displayed to show that authentication has not been accepted, and the process proceeds to step S23.
  • In step S23, the authentication unit 61 determines whether the driver has gotten out of the vehicle, based on the information supplied from the operation input unit 33, the vehicle motion detection unit 34, and the imaging unit 39. If it is determined in step S23 that the driver has gotten out, the process returns to step S12.
  • If authentication is permitted in step S13, the process proceeds to step S14.
  • In step S14, the agent data synchronization management unit 63 controls the communication unit 35 to read out from the server 12 the agent data of the driver (user) whose authentication was permitted, and stores it in its own storage unit 40.
  • The agent data is thereby updated (synchronized with the agent data of the server 12).
  • In step S15, the agent data synchronization management unit 63 generates an image showing the usable data and the corresponding actions, for example as shown in FIG. 8, based on the agent data stored in the storage unit 40, and controls the display unit 36 to display it.
  • In FIG. 8, from the top, the data of the authenticated user, the data already applied to the vehicle A, and the data associated with the user A managed by the server 12 are displayed.
  • FIG. 8 shows, as the authenticated user data, the agent data of the user A: individual-dependent data (Public), vehicle-type-dependent data (Public), and general-purpose data (Public).
  • The data already applied to the vehicle A, which is the vehicle 11 executing this series of agent processes, is likewise Public data; that is, the data already applied to the vehicle A is shown as individual-dependent data (Public), vehicle-type-dependent data (Public), and general-purpose data (Public).
  • The data associated with the user A includes data from driving the vehicle B and data from driving the vehicle C, each shown as individual-dependent data, vehicle-type-dependent data (Public/Private), and general-purpose data (Public/Private).
  • In addition, a button 111 labeled "Use", a button 112 labeled "Transfer", and a button 113 labeled "Discard" are provided, which are operated when selecting the use setting, transfer setting, and discard setting of the agent data, respectively.
  • Here, the agent data consists of three types of data: individual-dependent data, vehicle-type-dependent data, and general-purpose data. Furthermore, each of these has a Private type, which contains user-dependent information, and a Public type, which does not.
  • That is, the Private type and the Public type are set for each of the three types of individual-dependent data, vehicle-type-dependent data, and general-purpose data, giving a total of six kinds of data.
  • Individual-dependent data is data specific to the individual piece of hardware (in this case, the particular vehicle) on which the agent is installed. More specifically, since the hardware is a vehicle, the individual-dependent data includes, for example, the repair history, travel distance, modification information, collision history, and remaining fuel amount.
  • Vehicle-type-dependent data is data that is common not to the individual vehicle alone but to all vehicles of the same type. More specifically, since the hardware is a vehicle, the vehicle-type-dependent data includes, for example, fuel efficiency, maximum speed, information on whether roads on a route to be navigated are passable, sensing data (for example, automatic brake activation conditions), information on whether automatic driving is available, and the travel route used during automatic driving.
  • General-purpose data is data that does not depend on the vehicle, which is the hardware.
  • The general-purpose data includes, for example, route information, nearby store information, visited-location history, conversation history with the agent, driving style, sudden braking, the number of times the horn is sounded, the presence or absence of smoking, 3D data such as weather, buildings, and roads, and external information.
  • Private-type data is information that depends on the driver, who is the user of the vehicle (the hardware), although it is not necessarily information from which an individual can be identified; if, for example, a place or time is included, it is Private-type data.
  • Public-type data is data other than Private-type data, for example simple route information or a driving history whose place and time cannot be identified, that is, a driving history that cannot be attributed to a particular user.
  • In FIG. 9, the upper-left part of each cell indicates whether the data can be migrated to a different agent, and the lower-right part indicates whether it can be migrated to a different user; a circle mark indicates that transfer is possible, a cross mark indicates that transfer is not possible, and a triangle mark indicates that transfer is possible only if the vehicle type is the same.
  • Since the agent is different for each vehicle 11, transferring agent data to a different agent means, for example, that the agent data used when a user drives the vehicle A is transferred to, and used by, the agent of a different vehicle B.
  • Transferring agent data to a different user means that, when a different user B drives, the agent data of the user B is transferred to the agent of the vehicle 11 used by the user A, and the agent is used with it.
  • For vehicle-type-dependent data, both the Public type and the Private type can be transferred to a different agent if the vehicle type is the same, but not if the vehicle type is different.
  • General-purpose data, both Public and Private, can be transferred to different agents.
  • All Private-type data, whether individual-dependent, vehicle-type-dependent, or general-purpose, may contain personal information and is specific to the user, and therefore cannot be transferred to a different user.
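  • These transfer rules can be summarized in code as follows. The general-purpose and vehicle-type-dependent rules follow directly from the description above; treating individual-dependent data as non-transferable to other agents is inferred from the FIG. 8 discussion, and all names are illustrative assumptions.
```python
from enum import Enum, auto

class Category(Enum):
    INDIVIDUAL_DEPENDENT = auto()
    VEHICLE_TYPE_DEPENDENT = auto()
    GENERAL_PURPOSE = auto()

def can_transfer_to_other_agent(category: Category, same_vehicle_type: bool) -> bool:
    """Transfer to a different agent (a different vehicle), Public or Private alike."""
    if category is Category.GENERAL_PURPOSE:
        return True               # circle: always transferable
    if category is Category.VEHICLE_TYPE_DEPENDENT:
        return same_vehicle_type  # triangle: only between vehicles of the same type
    return False                  # inferred: tied to the individual vehicle

def can_transfer_to_other_user(is_private: bool) -> bool:
    """Private data may contain personal information, so it never moves to another user."""
    return not is_private

print(can_transfer_to_other_agent(Category.VEHICLE_TYPE_DEPENDENT, same_vehicle_type=False))  # False
print(can_transfer_to_other_user(is_private=True))                                            # False
```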
  • Returning to the display example of FIG. 8: since the agent data is configured as described above, the individual-dependent data (Public), vehicle-type-dependent data (Public), and general-purpose data (Public) listed as data already applied to the vehicle A are data associated with the vehicle A, and because the authenticated user A has never driven the vehicle A before, all of this data can be transferred to a different user and is therefore shown as already applied.
  • Among the data associated with the user A, the data from driving the vehicle C is shown as individual-dependent data, vehicle-type-dependent data, and general-purpose data (Public/Private); the "(Public/Private)" label attached to the general-purpose data indicates that the data can be migrated, as shown in FIG. 9, whereas the individual-dependent data and the vehicle-type-dependent data, which carry no "(Public/Private)" label, cannot be transferred.
  • The vehicle-type-dependent data cannot be transferred because the vehicle A is not of the same vehicle type as the vehicle C.
  • In step S16, the agent data usage management unit 64 determines whether the operation input unit 33 has been operated and the button 111 labeled "Use" in FIG. 8 has been pressed to instruct the usage setting of the agent data.
  • If so, in step S17 the agent data usage management unit 64 executes the usage setting process for the agent data, and the process proceeds to step S23. Details of the usage setting process will be described later with reference to its flowchart.
  • If it is determined in step S16 that the usage setting is not instructed, the process proceeds to step S18.
  • In step S18, the agent data migration management unit 65 determines whether the operation input unit 33 has been operated and the button 112 labeled "Transfer" has been pressed to instruct the migration setting of the agent data.
  • If so, in step S19 the agent data migration management unit 65 executes the migration setting process for setting the migration of the agent data, and the process proceeds to step S23. Details of the migration setting process will be described later with reference to its flowchart.
  • If it is determined in step S18 that the migration setting is not instructed, the process proceeds to step S20.
  • In step S20, the agent data discard management unit 66 determines whether the operation input unit 33 has been operated and the button 113 labeled "Discard" has been pressed to instruct the discard setting of the agent data.
  • If so, in step S21 the agent data discard management unit 66 executes the discard setting process for setting the discard of the agent data, and the process proceeds to step S23. Details of the discard setting process will be described later with reference to its flowchart.
  • If it is determined in step S20 that the discard setting is not instructed, the process proceeds to step S22.
  • In step S22, the analysis unit 67 executes the agent function execution process to carry out the agent function, and the process proceeds to step S23.
  • The agent function execution process will be described in detail later with reference to its flowchart.
  • Through the series of processes described above, the agent process is executed, and the vehicle 11 can be operated using the agent function.
  • In addition, the agent data required for using the agent function can be transferred to another user or made usable in another vehicle.
  • In step S71, the agent data usage management unit 64 displays an image for setting the use of the agent data, for example as shown in FIG. 11.
  • To the right of each of the individual-dependent data (Public), vehicle-type-dependent data (Public), and general-purpose data (Public) listed as data already applied to the vehicle A, buttons 131 to 133 are displayed. Each time a button is pressed, its label toggles between "Apply", indicating that the data is applicable but not yet applied, and "Applied", indicating that it is already applied, and the data is set to the state corresponding to the displayed label. In FIG. 11, since the data is set to be applied by default, the buttons 131 to 133 read "Applied".
  • Among the data associated with the user A on the server 12 that can be transferred to the vehicle A, the individual-dependent data of the vehicle B is displayed in gray to indicate that it cannot be used, while the vehicle-type-dependent data (Public/Private) and the general-purpose data (Public/Private) have buttons 134 and 135 labeled "Apply", as described above.
  • Likewise, for the data from driving the vehicle C, the individual-dependent data and the vehicle-type-dependent data are displayed in gray to indicate that they cannot be used, and the general-purpose data (Public/Private) has a button 136 labeled "Apply", as described above.
  • In step S72, the agent data usage management unit 64 determines whether the operation input unit 33 has been operated and any of the buttons 131 to 135 has been operated. If it is determined that an operation has been performed, the process proceeds to step S73.
  • In step S73, the agent data usage management unit 64 updates the agent data stored in the storage unit 40 in accordance with the operation performed on the buttons 131 to 135 via the operation input unit 33.
  • In step S74, the agent data usage management unit 64 determines whether the operation input unit 33 has been operated to instruct the end of the usage setting. If the end is not instructed, the process returns to step S71, and the subsequent processes are repeated.
  • If the operation input unit 33 is not operated in step S72, the process of step S73 is skipped.
  • When the end is instructed in step S74, the usage setting process ends.
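  • The toggle behaviour of the usage setting screen can be sketched as follows; the default-applied state mirrors the description of FIG. 11, while the class name, method names, and labels are assumptions.
```python
class UsageSetting:
    """Tracks the Apply/Applied state behind the per-data buttons of FIG. 11 (sketch)."""
    def __init__(self, labels):
        # True means "Applied"; data is applied by default, as in FIG. 11.
        self.applied = {label: True for label in labels}

    def press(self, label: str) -> str:
        """Toggle one entry, as pressing its button does, and return the new button text."""
        self.applied[label] = not self.applied[label]
        return "Applied" if self.applied[label] else "Apply"

settings = UsageSetting(["individual (Public)", "vehicle-type (Public)", "general (Public)"])
print(settings.press("general (Public)"))  # Apply   (now deselected)
print(settings.press("general (Public)"))  # Applied (selected again)
```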
  • In step S101, the agent data migration management unit 65 displays an image for setting the migration of the agent data, for example as shown in FIG. 13.
  • In FIG. 13, buttons 151 to 153 for selecting the vehicle 11 to be used as the migration destination are listed from the top, labeled "Vehicle D (scheduled to be used from 16:00 on February 9, 2016; AA taxi)", "Vehicle E (owned by friend B)", and "Vehicle F (owned by user A)". Since the items below them are the same as in FIG. 8, their description is omitted.
  • That is, the buttons 151 to 153 allow selection of the vehicles D, E, and F, which the authenticated user can use and to which the agent data can be transferred.
  • In step S102, the agent data migration management unit 65 determines whether the operation input unit 33 has been operated and any of the buttons 151 to 153 has been operated.
  • In step S102, for example, when the button 151 is pressed, the button 151 is considered to have been operated, and the process proceeds to step S103.
  • In step S103, the agent data migration management unit 65 updates the agent data stored in the storage unit 40 in accordance with the operation performed on the buttons 151 to 153 via the operation input unit 33.
  • At this time, the agent data migration management unit 65 switches the display to indicate that the button 151 has been selected, as shown in FIG. 14, and buttons 171 to 173 labeled "Transfer" are displayed for selecting whether to migrate the general-purpose data (Public) and the general-purpose data (Public/Private) of the vehicles B and C.
  • Since the vehicle D is not of the same vehicle type as any of the vehicles A, B, and C, it is indicated that transfer of the vehicle-type-dependent data (Public/Private) is not permitted.
  • In step S104, the agent data migration management unit 65 determines whether the operation input unit 33 has been operated to instruct the end of the migration setting. If the end is not instructed, the process returns to step S101, and the subsequent processes are repeated.
  • In step S102, it is then determined again whether there is an operation input in the state displayed as shown in FIG. 14; for example, if any of the buttons 171 to 173 is operated, it is determined that there is an operation input, and the process proceeds to step S103 again.
  • In step S103, the agent data migration management unit 65 sets the migration of the data corresponding to whichever of the buttons 171 to 173 was operated, and the process proceeds to step S104.
  • When the end of the migration setting is instructed in step S104, the migration setting ends.
  • As a result, the agent data is set so that, of the data corresponding to the selected buttons 171 to 173, the data applied to the vehicle A and the general-purpose data (Public/Private) of the vehicles B and C are transferred to "Vehicle D (scheduled to be used from 16:00 on February 9, 2016; AA taxi)", the vehicle 11 selected as the destination.
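  • The filtering of transferable data for a chosen destination, as in FIG. 14, can be sketched as follows; the dictionaries and category strings are illustrative assumptions, and the rule applied is the one summarized earlier.
```python
from typing import Dict, List

def migration_options(destination_type: str, candidates: List[Dict]) -> List[Dict]:
    """Return only the data sets that may be offered for transfer to the destination vehicle."""
    allowed = []
    for data in candidates:
        if data["category"] == "general":
            allowed.append(data)                      # always transferable
        elif data["category"] == "vehicle_type" and data["vehicle_type"] == destination_type:
            allowed.append(data)                      # same vehicle type only
        # individual-dependent data is never offered
    return allowed

candidates = [
    {"name": "vehicle A general data (Public)", "category": "general", "vehicle_type": "type-A"},
    {"name": "vehicle B general data (Public/Private)", "category": "general", "vehicle_type": "type-B"},
    {"name": "vehicle B vehicle-type data", "category": "vehicle_type", "vehicle_type": "type-B"},
]
print([d["name"] for d in migration_options("type-D", candidates)])
# The vehicle-type data is excluded because vehicle D is of a different type.
```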
  • As one change pattern, the user may be allowed to change the circle marks, triangle marks, and cross marks in FIG. 9.
  • Change pattern 2 allows the user to change the classification among individual-dependent data, vehicle-type-dependent data, and general-purpose data in the figure; however, changes are allowed only from general-purpose data to vehicle-type-dependent data and from vehicle-type-dependent data to individual-dependent data, not in the opposite direction.
  • In change pattern 3, the user can change the classification between the Private type and the Public type in the figure; however, a change from the Public type to the Private type is allowed, but not the reverse.
  • In other words, data that can be migrated may be made non-migratable, but data that cannot be migrated is never made migratable, so privacy is preserved while the rules used in the migration process remain flexible.
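  • The one-way nature of change patterns 2 and 3 can be sketched as follows: classifications may only move toward the more restricted side, so data never becomes easier to migrate. The rank values and function names are assumptions used for illustration.
```python
CATEGORY_RANK = {"general": 0, "vehicle_type": 1, "individual": 2}   # higher = more restricted
VISIBILITY_RANK = {"public": 0, "private": 1}

def change_category(current: str, new: str) -> str:
    """Change pattern 2: general -> vehicle_type -> individual is allowed, never the reverse."""
    if CATEGORY_RANK[new] < CATEGORY_RANK[current]:
        raise ValueError("cannot relax a category (e.g. individual -> general)")
    return new

def change_visibility(current: str, new: str) -> str:
    """Change pattern 3: Public -> Private is allowed, Private -> Public is not."""
    if VISIBILITY_RANK[new] < VISIBILITY_RANK[current]:
        raise ValueError("Private data cannot be made Public")
    return new

print(change_category("general", "vehicle_type"))  # allowed
print(change_visibility("public", "private"))      # allowed
# change_visibility("private", "public") would raise ValueError
```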
  • In step S121, the agent data discard management unit 66 displays an image for setting the discard of the agent data.
  • In this image, buttons 191 to 199 labeled "Discard", which are operated to instruct discarding, are provided for the individual-dependent data, vehicle-type-dependent data, and general-purpose data of the data already applied to the vehicle A (Private) and of the data of the vehicles B and C (Public/Private).
  • Note that the data already applied to the vehicle A consists of individual-dependent data (Private), vehicle-type-dependent data (Private), and general-purpose data (Private), and the user has no authority to discard any of the (Public) data.
  • In step S122, the agent data discard management unit 66 determines whether the operation input unit 33 has been operated and any of the buttons 191 to 199 has been operated. If it is determined that an operation has been performed, the process proceeds to step S123.
  • In step S123, the agent data discard management unit 66 updates the agent data stored in the storage unit 40 in accordance with the operation performed on the buttons 191 to 199 via the operation input unit 33.
  • In step S124, the agent data discard management unit 66 determines whether the operation input unit 33 has been operated to instruct the end of the discard setting. If the end is not instructed, the process returns to step S121, and the subsequent processes are repeated.
  • If the operation input unit 33 is not operated in step S122, the process of step S123 is skipped.
  • The processing of steps S121 to S124 is repeated until the end of the discard setting is instructed; when the end is instructed in step S124, the discard setting ends.
  • In step S141, the analysis unit 67 accepts the supply of various detection results in the vehicle 11, such as the vehicle motion detection result, the input voice, and the captured image supplied from the vehicle motion detection unit 34, the voice input unit 38, and the imaging unit 39.
  • In step S142, the analysis unit 67 analyzes the various detection results it has received, sets the case that constitutes a problem according to the analysis result as the target case, and supplies it to the case search unit 68.
  • For example, when the driver, as the user, utters speech that includes a destination to instruct navigation, the analysis unit 67 performs language analysis, semantic analysis, and the like on the voice input to the voice input unit 38; it then recognizes that searching for a route is the problem, sets the search for the route to the destination as the target case, and supplies it to the case search unit 68.
  • The case search unit 68 searches for a case that serves as a process for solving the target case, based on the agent data stored in the storage unit 40. For example, the case search unit 68 searches the past route search history included in the agent data for route candidates R1, R2, ..., Rn to be selected, and outputs them as the case search result.
  • In step S144, the case confirmation unit 69 confirms whether the target case, that is, the problem, can be solved from the data constituting the case search result.
  • For example, suppose that, as a route from the departure point X to the destination Y, past cases yield as the search result a route R1 from the departure point X to a waypoint A, a route R2 from the waypoint A to a waypoint B, and a route R3 from the waypoint B to the destination Y. The routes R1 and R3 pose no problem, but the route R2 is a route from when the user A used a vehicle H in the past, and the vehicle A currently driven by the user A is wider than the vehicle H, so its vehicle width is too large for the road on the route R2.
  • When it is determined in step S144 that the target case cannot be solved, as described above, the process proceeds to step S145.
  • In step S145, the case correction unit 70 corrects the case search result so that the target case can be solved. That is, in the case search result described above, the route R2 is corrected to a route R2' by using the agent data to search for a route that can accommodate the vehicle width of the vehicle A; as a result, by using the routes R1, R2', and R3 as the route from the departure point X to the destination Y, the target case can be solved. However, even with such a correction, the target case may not necessarily be resolved.
  • In step S146, the case determination unit 71 executes the case search result for solving the target case set in this way. That is, in this example, the case determination unit 71 controls the display unit 36 to display a navigation image up to the destination Y, and controls the audio output unit 37 to output voice guidance to the destination Y.
  • If the target case can be solved in step S144, the process of step S145 is skipped; that is, in this case, the initial case search result is used as it is to solve the target case.
  • In step S147, the case determination unit 71 determines whether the problem of the target case has been solved by the series of operations based on the case search result. That is, in this example, the case determination unit 71 determines whether the vehicle 11 was able to move from the departure point X to the destination Y by navigation based on the case search result.
  • When the target case has been solved in step S147, the case determination unit 71 recognizes the case search result as a successful case in step S148.
  • Conversely, when the target case has not been solved, the case determination unit 71 recognizes the case search result as a failure case in step S149.
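  • The case search and the confirm/correct/execute/judge flow of steps S144 to S149 can be condensed into the following sketch. The callables stand in for the case confirmation unit 69, the case correction unit 70, and the actual execution; their names, and the route-width example data, are assumptions.
```python
from typing import Callable, Dict, List, Optional

def resolve_case(problem: Dict,
                 stored_cases: List[Dict],
                 fits: Callable[[Dict, Dict], bool],
                 correct: Callable[[Dict, Dict], Optional[Dict]],
                 execute: Callable[[Dict], bool]) -> Dict:
    """Search stored cases, confirm/correct the hit, execute it, and label success or failure."""
    hit = next((c for c in stored_cases if c["problem"] == problem["kind"]), None)  # case search
    if hit is not None and not fits(hit, problem):                                  # confirmation
        hit = correct(hit, problem)                                                 # correction
    solved = hit is not None and execute(hit)                                       # execution/judgement
    return {"case": hit, "result": "success" if solved else "failure"}

# Example: a stored route is too narrow for the current vehicle and gets corrected.
cases = [{"problem": "route X->Y", "route": ["R1", "R2", "R3"], "min_width_m": 1.6}]
result = resolve_case(
    {"kind": "route X->Y", "vehicle_width_m": 1.8},
    cases,
    fits=lambda c, p: c["min_width_m"] >= p["vehicle_width_m"],
    correct=lambda c, p: {**c, "route": ["R1", "R2'", "R3"], "min_width_m": p["vehicle_width_m"]},
    execute=lambda c: True,
)
print(result["case"]["route"], result["result"])  # ['R1', "R2'", 'R3'] success
```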
  • In step S150, the case anonymization unit 72 determines whether anonymization is necessary for the case search result. For example, the case search result may contain information including personal information of the user A, such as Private-type general-purpose data indicating that the user A moved from the departure point X to the destination Y at a specific date and time. When such personal information remains, the case anonymization unit 72 determines that anonymization is necessary in order to use the case search result as Public-type general-purpose data, and the process proceeds to step S151.
  • In step S151, the case anonymization unit 72 anonymizes the case search result.
  • The anonymization is performed, for example, by k-anonymization processing.
  • The k-anonymization processing uses information from a plurality of similar case search results to convert the record into information that does not identify a specific person, for example information indicating merely that at least k people moved from the departure point X to the destination Y.
  • In other words, the k-anonymization processing converts the data attributes attached to the information that the vehicle moved from the departure point X to the destination Y so that the user A cannot be identified.
  • the case anonymization unit 72 controls the communication unit 35, accesses the server 12, searches for information that has moved from the departure place X to the destination Y, and obtains a plurality of case search results of k or more people.
  • the case anonymization unit 72 includes information that the user A has moved from the departure point X to the destination Y on a predetermined date D, and a plurality of k or more persons have determined a predetermined year (including the predetermined date D). The information is converted into information of k or more people who have moved from the starting point X to the destination Y.
  • the information that the user A is not specified can be determined only by “information that the vehicle has moved from the departure place X to the destination Y”.
  • By doing so, the case anonymization unit 72 changes the attribute of the private-type general-purpose data to public-type general-purpose data.
  • By accumulating general-purpose data whose attribute has been converted to public in this way, the data that can be referred to when other agents search for routes increases, which makes it possible to search for cases with higher accuracy.
  • In step S150, when the case search result can already be regarded as public-type general-purpose data, for example, anonymization is regarded as unnecessary, and the process in step S151 is skipped.
  • In step S152, the case verification unit 73 determines whether the case search result consisting of public-type general-purpose data is falsified data. If it is determined that the data has not been falsified, in step S153 the data is stored as part of the agent data held in the storage unit 40 and is also stored in the storage unit 93 of the server 12. At this time, the case verification unit 73 stores, in the agent data, information indicating whether the target case was a successful case or a failure case together with the case search result consisting of public-type general-purpose data.
  • If it is determined in step S152 that the case search result consisting of public-type general-purpose data has been falsified, the process in step S153 is skipped, and the case search result is not registered as public-type general-purpose data.
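How the case verification unit 73 detects falsification is not specified here. One common approach, shown below purely as a hedged sketch, is to attach a keyed hash (HMAC) when a case is stored and to recompute it before registration; the shared key and field layout are hypothetical.

```python
import hashlib
import hmac
import json

SECRET_KEY = b"agent-server-shared-secret"  # hypothetical key shared with the server 12

def sign_case(case: dict) -> str:
    """Compute an HMAC over a canonical JSON form of the case search result."""
    payload = json.dumps(case, sort_keys=True).encode("utf-8")
    return hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()

def is_falsified(case: dict, stored_signature: str) -> bool:
    """Return True if the stored case no longer matches its original signature."""
    return not hmac.compare_digest(sign_case(case), stored_signature)
```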
  • In step S154, the analysis unit 67 determines whether the operation input unit 33 has been operated to instruct the end of the agent function execution process. If the end has not been instructed, the process returns to step S141, and the subsequent processing is repeated. If the end has been instructed in step S154, the process ends.
  • With the above processing, the agent data accumulated by the agent function can be taken over. That is, when user A changes from vehicle A to vehicle B, in other words from the agent of vehicle A to the agent of vehicle B, the individual-dependent data of vehicle A among the agent data accumulated in vehicle A is unrelated to vehicle B and cannot be taken over, but the general-purpose data of vehicle A can be taken over by vehicle B, so that the operations and settings customized so far carry over. Furthermore, the vehicle type-dependent data can be taken over as long as vehicles A and B are of the same vehicle type.
  • The private-type data described above may be stored in the server 12 immediately before the user gets off the vehicle 11 and then deleted from the vehicle, and downloaded from the server 12 and used every time the user gets on the vehicle 11. By managing the data in this way, even if the vehicle 11 is stolen, the private-type data can be prevented from being stolen with it.
  • Likewise, by downloading the vehicle type-dependent data and the general-purpose data from the server 12 and taking them over each time, the agent can use that agent data at any time.
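The takeover rules just described can be summarized in a small sketch. The dictionary keys for the three data categories are hypothetical names chosen for illustration, not identifiers from the disclosure.

```python
def transferable_agent_data(agent_data: dict, old_vehicle_type: str,
                            new_vehicle_type: str) -> dict:
    """Select the agent data that may be taken over when changing vehicles.

    agent_data is assumed to be keyed by the three categories discussed above:
    "individual_dependent", "vehicle_type_dependent", and "general_purpose".
    """
    carried = {"general_purpose": agent_data.get("general_purpose", {})}
    if old_vehicle_type == new_vehicle_type:
        # Vehicle type-dependent data carries over only between the same type.
        carried["vehicle_type_dependent"] = agent_data.get("vehicle_type_dependent", {})
    # Individual-dependent data (repair history, mileage, ...) never transfers.
    return carried
```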
  • In the above, an example in which the agent realizes a navigation function has been described, but the agent may realize other functions.
  • For example, the agent may realize a function of outputting guidance voice that assists with the speed, accelerator, and brake operations according to the road conditions, based on correlation information between the speed, road inclination, accelerator, brake, and fuel consumption.
  • An agent that manages, as residual-value information of the vehicle, data such as the sudden start and braking history, accident history, battery charge count history, travel history (contact history with deterioration factors such as seaside, desert, or volcanic ash), and dust and odor sensor history may also be used.
  • Other examples of such information include the destination history (anecdotal information such as the vehicle having been driven in Antarctica), the rarity of the vehicle (the number of vehicles traveling in a specific area), and the driver history (user attributes such as gender).
  • Case search results become added value of the vehicle as they accumulate. The accumulated amount of case search results may therefore be visualized as information serving as a quantitative index, such as the amount of data retained (for example, "... GB retained").
  • Furthermore, the residual value and added value described above may be compared with used vehicles distributed in the market and displayed as information in monetary value, such as yen.
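As one possible way of turning the accumulated case search results into the quantitative index mentioned above, the sketch below simply measures the retained data volume; the conversion rate to a monetary value is a placeholder parameter, since the disclosure only says that the comparison would be made against used vehicles on the market.

```python
import json

def added_value_index(case_results: list, yen_per_gb: float) -> dict:
    """Summarize accumulated case search results as '... GB retained' plus a
    rough monetary figure; yen_per_gb is an illustrative parameter only."""
    size_bytes = sum(len(json.dumps(c).encode("utf-8")) for c in case_results)
    size_gb = size_bytes / (1024 ** 3)
    return {"retained_gb": round(size_gb, 3),
            "estimated_added_value_yen": round(size_gb * yen_per_gb)}
```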
  • <Example of execution by software> The series of processes described above can be executed by hardware, but can also be executed by software.
  • When the series of processes is executed by software, a program constituting the software is installed from a recording medium into a computer incorporated in dedicated hardware, or into, for example, a general-purpose personal computer capable of executing various functions by installing various programs.
  • FIG. 18 shows a configuration example of a general-purpose personal computer.
  • This personal computer incorporates a CPU (Central Processing Unit) 1001.
  • An input / output interface 1005 is connected to the CPU 1001 via a bus 1004.
  • a ROM (Read Only Memory) 1002 and a RAM (Random Access Memory) 1003 are connected to the bus 1004.
  • The input/output interface 1005 is connected to an input unit 1006 including input devices such as a keyboard and a mouse with which the user inputs operation commands, an output unit 1007 that outputs a processing operation screen and images of processing results to a display device, a storage unit 1008 including a hard disk drive that stores programs and various data, and a communication unit 1009 including a LAN (Local Area Network) adapter or the like that executes communication processing via a network represented by the Internet.
  • The input/output interface 1005 is also connected to a drive 1010 that reads and writes data from and to a removable medium 1011 such as a magnetic disk (including a flexible disk), an optical disc (including a CD-ROM (Compact Disc-Read Only Memory) and a DVD (Digital Versatile Disc)), a magneto-optical disc (including an MD (Mini Disc)), or a semiconductor memory.
  • The CPU 1001 executes various processes according to a program stored in the ROM 1002, or according to a program that is read from the removable medium 1011 such as a magnetic disk, an optical disc, a magneto-optical disc, or a semiconductor memory, installed in the storage unit 1008, and loaded from the storage unit 1008 into the RAM 1003.
  • the RAM 1003 also appropriately stores data necessary for the CPU 1001 to execute various processes.
  • In the computer configured as described above, the CPU 1001 performs the series of processes described above by, for example, loading a program stored in the storage unit 1008 into the RAM 1003 via the input/output interface 1005 and the bus 1004 and executing it.
  • the program executed by the computer (CPU 1001) can be provided by being recorded on the removable medium 1011 as a package medium, for example.
  • the program can be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
  • the program can be installed in the storage unit 1008 via the input / output interface 1005 by attaching the removable medium 1011 to the drive 1010. Further, the program can be received by the communication unit 1009 via a wired or wireless transmission medium and installed in the storage unit 1008. In addition, the program can be installed in advance in the ROM 1002 or the storage unit 1008.
  • The program executed by the computer may be a program in which processing is performed in time series in the order described in this specification, or a program in which processing is performed in parallel or at necessary timing such as when a call is made.
  • In this specification, a system means a set of a plurality of components (devices, modules (parts), and the like), and it does not matter whether all the components are in the same housing. Accordingly, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
  • the present disclosure can take a cloud computing configuration in which one function is shared by a plurality of devices via a network and is processed jointly.
  • each step described in the above flowchart can be executed by one device or can be shared by a plurality of devices.
  • In addition, when a plurality of processes are included in one step, the plurality of processes included in that one step can be executed by one apparatus or shared and executed by a plurality of apparatuses.
  • Note that the present disclosure can also adopt the following configurations.
  • ⟨1> An information processing apparatus including: a storage unit that stores learning data in association with a user who uses hardware; an operation determination unit that determines an operation of the hardware on the basis of the learning data; and a presentation unit that presents options of the learning data, stored in the storage unit, that can be used by the user, in which the operation determination unit determines the operation on the basis of learning data selected from the options presented by the presentation unit.
  • ⟨2> The information processing apparatus according to ⟨1>, in which the learning data includes private data that depends on the user and public data other than the private data, and the presentation unit presents the public data, among the learning data stored in the storage unit, as the options of learning data that can be used by the operation determination unit.
  • ⟨3> The information processing apparatus according to ⟨2>, in which the learning data includes individual-dependent data that depends on the individual piece of hardware, model-dependent data that depends on the model of the hardware, and general-purpose data that does not depend on the hardware, and in which, when the user uses other hardware different from the hardware and causes the operation determination unit of the other hardware to determine an operation, the presentation unit presents options consisting of the private data and the public data of the general-purpose data among the learning data stored in association with the hardware.
  • ⟨4> The information processing apparatus according to ⟨3>, in which, when the hardware and the other hardware are of the same model and the user uses the other hardware and causes the operation determination unit of the other hardware to determine an operation, the presentation unit presents options consisting of the private data and the public data of the model-dependent data and the general-purpose data among the learning data stored in association with the other hardware.
  • ⟨5> The information processing apparatus according to ⟨4>, in which the presentation unit presents options for discarding, in units of the respective private data and public data, the individual-dependent data, the model-dependent data, and the general-purpose data in the learning data.
  • ⟨6> The information processing apparatus in which the learning data is configured for each piece of hardware used by the user, and the presentation unit presents another option for selecting the hardware to which the learning data is to be transferred, and presents, among the learning data stored in the storage unit, the public data of the learning data learned by the hardware used by the user as the options of learning data that can be used by the operation determination unit of that hardware.
  • ⟨7> The information processing apparatus according to ⟨2>, in which the learning data is configured for each piece of hardware used by the user and consists of individual-dependent data that depends on the individual piece of hardware, model-dependent data that depends on the model of the hardware, and general-purpose data that does not depend on the hardware, and in which the presentation unit presents another option for selecting the hardware to which the learning data is to be transferred, and presents, among the learning data stored in the storage unit, the general-purpose data of the learning data learned by the hardware used by the user, consisting of its public data and private data, as the options of learning data that can be used by the operation determination unit of other hardware different from the hardware.
  • ⟨8> The information processing apparatus according to any one of ⟨1> to ⟨7>, in which the hardware is a vehicle, and the learning data consists of individual-dependent data that depends on the individual vehicle, vehicle type-dependent data that depends on the vehicle type of the vehicle, and general-purpose data that does not depend on the vehicle, in which the individual-dependent data includes the repair history, mileage, modification information, collision history, and remaining amount of gasoline, the vehicle type-dependent data includes, specified commonly for each vehicle type, the fuel efficiency, the maximum speed, information on whether the vehicle can pass along a route to be navigated, sensing data for detecting vehicle operation, information on whether automatic driving is possible, and travel routes used during automatic driving, and the general-purpose data includes route information, nearby store information, the visit location history, the conversation history with the agent, the driving style, sudden braking, the number of horn uses, the presence or absence of smoking, three-dimensional data of weather, buildings, roads, and the like, and external information.
  • ⟨9> The information processing apparatus further including an anonymization unit that anonymizes data of operations determined on the basis of the learning data, in which the storage unit stores the data of operations determined by the operation determination unit on the basis of the learning data in a state anonymized by the anonymization unit.
  • ⟨10> An information processing method of an information processing apparatus including a storage unit that stores learning data in association with a user who uses hardware, an operation determination unit that determines an operation of the hardware on the basis of the learning data, and a presentation unit that presents options of the learning data, stored in the storage unit, that can be used by the user, the information processing method including a step in which the operation determination unit determines the operation on the basis of learning data selected from the options presented by the presentation unit.
  • ⟨11> A program for causing a computer to execute processing of an information processing apparatus including: a storage unit that stores learning data in association with a user who uses hardware; an operation determination unit that determines an operation of the hardware on the basis of the learning data; and a presentation unit that presents options of the learning data, stored in the storage unit, that can be used by the user, in which the operation determination unit determines the operation on the basis of learning data selected from the options presented by the presentation unit.
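To make the relationship between the units in configuration ⟨1> concrete, here is a minimal sketch. The class and field names, and the rule that only public items are offered to another user, are illustrative assumptions rather than the apparatus itself.

```python
class StorageUnit:
    """Stores learning data in association with (user, hardware)."""
    def __init__(self):
        self._data = {}

    def store(self, user: str, hardware: str, learning_data: dict) -> None:
        self._data[(user, hardware)] = learning_data

    def load(self, user: str, hardware: str) -> dict:
        return self._data.get((user, hardware), {})


class PresentationUnit:
    """Presents the options of learning data that the user may use."""
    def options(self, learning_data: dict) -> list:
        # Hypothetical rule: only items marked public are offered as options.
        return [name for name, item in learning_data.items()
                if item.get("visibility") == "public"]


class OperationDeterminationUnit:
    """Determines the hardware operation from the selected learning data."""
    def determine(self, selected_items: dict) -> str:
        # Placeholder decision: use a learned route if one was selected.
        return selected_items.get("preferred_route", "default-route")
```

In use, the presentation unit would filter what the storage unit returns for a given user and hardware, and the operation determination unit would act only on the items the user actually selected from those options.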

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Security & Cryptography (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Business, Economics & Management (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Remote Sensing (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Evolutionary Computation (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Strategic Management (AREA)
  • Primary Health Care (AREA)
  • Economics (AREA)
  • Human Resources & Organizations (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Medical Informatics (AREA)
  • Computational Linguistics (AREA)
  • Bioethics (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The present disclosure relates to an information processing device, an information processing method, and a program whereby learning data can be transferred to another agent. According to the present disclosure, when learning data is to be transferred from a user to another user, the learning data is divided into individual-dependent private-type learning data and the other learning data, namely public-type learning data, and the other user is allowed to take over only the public-type learning data. Further, when learning data is to be transferred to an agent for a new vehicle, the learning data is classified into individual-vehicle-dependent data, vehicle-type-dependent data, and general-purpose data, and only the general-purpose data is transferred, except that when the new vehicle is of the same type as the preceding vehicle, the vehicle-type-dependent data is also transferred. The present disclosure can be applied to intelligent agents.

Description

Information processing apparatus, information processing method, and program
 The present disclosure relates to an information processing apparatus, an information processing method, and a program, and more particularly to an information processing apparatus, an information processing method, and a program that make it possible to easily take over learning data accumulated in hardware.
 An "intelligent agent" is known as something that recognizes the situation of the environment on the basis of detection results from various sensors, infers the next operation, and operates according to the inference result (see Non-Patent Document 1).
 There are various kinds of intelligent agents. As a concrete example, in an electronic device such as a personal computer or a smartphone, when characters are input by the user, highly likely candidates are displayed on the basis of learning from past input history and the user is allowed to select from them.
 That is, since a complete description of an unknown environment cannot be obtained, an intelligent agent learns knowledge of the unknown environment through experience, accumulates the learning results, repeats inference, and operates according to the inference results.
 However, accumulated learning results such as personalized history data are erased from the viewpoint of privacy when the user changes, for example when the hardware on which the intelligent agent is mounted is replaced. The user of the new hardware therefore cannot make use of the learning results accumulated so far with the intelligent agent of the new hardware.
 Also, when a new user starts using the old hardware, the learning results accumulated by the previous user are erased from the viewpoint of privacy, so that there are no learning results and learning has to start again from the beginning.
 As a result, even if the learning results acquired by the intelligent agent through past learning contain information useful to a new user, those learning results cannot be used, and the benefits that should be obtained from them are generally not received.
 The present disclosure has been made in view of such a situation, and in particular makes it possible to take over, in an appropriate state, the learning results acquired through learning by an intelligent agent or the like mounted on hardware, even when the user of the hardware changes or new hardware comes into use.
 An information processing apparatus according to one aspect of the present disclosure includes a storage unit that stores learning data in association with a user who uses hardware, an operation determination unit that determines an operation of the hardware on the basis of the learning data, and a presentation unit that presents options of the learning data, stored in the storage unit, that can be used by the user, in which the operation determination unit determines the operation on the basis of learning data selected from the options presented by the presentation unit.
 The learning data may be composed of private data that depends on the user and public data other than the private data, and the presentation unit may present the public data, among the learning data stored in the storage unit, as the options of learning data that can be used by the operation determination unit.
 The learning data may be composed of individual-dependent data that depends on the individual piece of hardware, model-dependent data that depends on the model of the hardware, and general-purpose data that does not depend on the hardware, and when the user uses other hardware different from the hardware and causes the operation determination unit of the other hardware to determine an operation, the presentation unit may present options consisting of the private data and the public data of the general-purpose data among the learning data stored in association with the hardware. The learning data here refers to various data acquired from the hardware while the user is using the hardware. The various data acquired from the hardware include data acquired on the basis of the operation of each device and piece of software included in the hardware.
 When the hardware and the other hardware are of the same model, the presentation unit may, when the user uses the other hardware and causes the operation determination unit of the other hardware to determine an operation, present options consisting of the private data and the public data of the model-dependent data and the general-purpose data among the learning data stored in association with the other hardware.
 The presentation unit may present options for discarding, in units of the respective private data and public data, the individual-dependent data, the model-dependent data, and the general-purpose data in the learning data.
 The learning data may be configured for each piece of hardware used by the user, and the presentation unit may present another option for selecting the hardware to which the learning data is to be transferred, and may present, among the learning data stored in the storage unit, the public data of the learning data learned by the hardware used by the user as the options of learning data that can be used by the operation determination unit of that hardware.
 The learning data may be configured for each piece of hardware used by the user and may be composed of individual-dependent data that depends on the individual piece of hardware, model-dependent data that depends on the model of the hardware, and general-purpose data that does not depend on the hardware, and the presentation unit may present another option for selecting the hardware to which the learning data is to be transferred, and may present, among the learning data stored in the storage unit, the general-purpose data of the learning data learned by the hardware used by the user, consisting of its public data and private data, as the options of learning data that can be used by the operation determination unit of other hardware different from the hardware.
 The hardware may be a vehicle, and the learning data may be composed of individual-dependent data that depends on the individual vehicle, vehicle type-dependent data that depends on the vehicle type of the vehicle, and general-purpose data that does not depend on the vehicle. The individual-dependent data may include the repair history, mileage, modification information, collision history, and remaining amount of gasoline. The vehicle type-dependent data may include, specified commonly for each vehicle type, the fuel efficiency, the maximum speed, information on whether the vehicle can pass along a route to be navigated, sensing data for detecting vehicle operation, information on whether automatic driving is possible, and travel routes used during automatic driving. The general-purpose data may include route information, nearby store information, the visit location history, the conversation history with the agent, the driving style, sudden braking, the number of horn uses, the presence or absence of smoking, three-dimensional data of weather, buildings, roads, and the like, and external information.
 An anonymization unit that anonymizes data of operations determined on the basis of the learning data may be included, and the storage unit may store the data of operations determined by the operation determination unit on the basis of the learning data in a state anonymized by the anonymization unit.
 An information processing method according to one aspect of the present disclosure is an information processing method of an information processing apparatus that includes a storage unit that stores learning data in association with a user who uses hardware, an operation determination unit that determines an operation of the hardware on the basis of the learning data, and a presentation unit that presents options of the learning data, stored in the storage unit, that can be used by the user, the information processing method including a step in which the operation determination unit determines the operation on the basis of learning data selected from the options presented by the presentation unit.
 A program according to one aspect of the present disclosure is a program that causes a computer to function as a storage unit that stores learning data in association with a user who uses hardware, an operation determination unit that determines an operation of the hardware on the basis of the learning data, and a presentation unit that presents options of the learning data, stored in the storage unit, that can be used by the user, in which the operation determination unit executes processing of determining the operation on the basis of learning data selected from the options presented by the presentation unit.
 In one aspect of the present disclosure, learning data is stored in association with a user who uses hardware, the operation of the hardware is determined on the basis of the learning data, options of the learning data, stored in the storage unit, that can be used by the user are presented, and the operation is determined on the basis of learning data selected from the presented options.
 According to one aspect of the present disclosure, the learning results acquired through learning by an intelligent agent or the like mounted on hardware can be taken over in an appropriate state even when the user of the hardware changes or new hardware comes into use.
FIG. 1 is a diagram illustrating a configuration example of an agent system including a vehicle to which the present disclosure is applied and a server.
FIG. 2 is a flowchart illustrating agent processing by the vehicle of FIG. 1.
FIG. 3 is a flowchart illustrating the authentication processing of FIG. 2.
FIGS. 4 to 7 are diagrams illustrating the authentication processing of FIG. 3.
FIG. 8 is a diagram illustrating an image that displays usable agent data and available actions.
FIG. 9 is a diagram illustrating rules concerning the transfer of agent data.
FIG. 10 is a flowchart illustrating the use setting processing of FIG. 2.
FIG. 11 is a diagram illustrating the use setting processing of FIG. 10.
FIG. 12 is a flowchart illustrating the transfer setting processing of FIG. 2.
FIGS. 13 and 14 are diagrams illustrating the transfer setting processing of FIG. 12.
FIG. 15 is a flowchart illustrating the discard setting processing of FIG. 2.
FIG. 16 is a diagram illustrating the discard setting processing of FIG. 15.
FIG. 17 is a flowchart illustrating the agent function execution processing of FIG. 2.
FIG. 18 is a diagram illustrating a configuration example of a general-purpose personal computer.
 Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. Note that, in this specification and the drawings, components having substantially the same functional configuration are denoted by the same reference numerals, and redundant description thereof is omitted.
<Example configuration of an agent system consisting of a vehicle equipped with an agent and a server that manages agent data>
FIG. 1 illustrates a configuration example of an agent system including a vehicle on which an agent to which the information processing apparatus of the present disclosure is applied is mounted, and a server that manages agent data.
 The vehicles 11-1 to 11-n in FIG. 1 are each equipped with an agent that executes processing according to requests from the user who is the driver. The agent provides various kinds of driving support to the user who uses the vehicles 11-1 to 11-n, and realizes, for example, a navigation function that guides the user to a destination and driving support functions such as guiding accelerator work and brake work so as to improve fuel efficiency.
 In the present embodiment, the agent functions as a navigation device that searches for and guides a route to a desired destination when the user who is the driver uses the vehicle 11, but it may realize other functions.
 In the following, the vehicles 11-1 to 11-n are simply referred to as the vehicle 11 unless they particularly need to be distinguished, and other configurations are referred to in the same manner.
 In searching for a route, the agent performs not only a general route search using map information but also, for example, a search that excludes routes that are narrow relative to the vehicle width and difficult to maneuver, on the basis of information such as the size of the vehicle. In addition, for example, when there is a history of frequently set waypoints (stop-off points) for a predetermined destination set in a specific time zone, the agent searches for a route that includes the frequently set waypoints simply when the destination is set at that specific time. That is, when searching for a route to the destination desired by the driver, the agent repeats learning according to the driver's habits and history related to route searching and the conditions of the vehicle 11, and executes route searches according to the learning results, thereby performing route searches that are optimal for the driver.
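As a hedged illustration of the kind of learned route filtering described above, the sketch below removes candidate routes whose narrowest segment cannot accommodate the vehicle width and extracts habitual waypoints for a given time band; the route and history structures, and the simple frequency threshold standing in for learning, are assumptions made for the example.

```python
from typing import Dict, List

def filter_routes_by_width(routes: List[Dict], vehicle_width_m: float,
                           margin_m: float = 0.3) -> List[Dict]:
    """Drop routes whose narrowest road segment is too tight for the vehicle."""
    return [r for r in routes
            if r["min_road_width_m"] >= vehicle_width_m + margin_m]

def habitual_waypoints(history: List[Dict], destination: str, hour: int) -> List[str]:
    """Return waypoints the driver frequently sets for this destination
    in the same time band (frequency threshold stands in for learning)."""
    counts: Dict[str, int] = {}
    for trip in history:
        if trip["destination"] == destination and abs(trip["hour"] - hour) <= 1:
            for wp in trip.get("waypoints", []):
                counts[wp] = counts.get(wp, 0) + 1
    return [wp for wp, n in counts.items() if n >= 3]
```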
 More specifically, the agent stores the learning results as agent data in association with the driver and the vehicle in the server 12, which consists of a cloud server or the like, via the network 13 represented by the Internet. The agent also authenticates the driver, accesses the server 12 on the basis of the authentication result, reads out the corresponding agent data, and uses it for route searching and the like.
 Since this agent data is associated with the driver and the vehicle, even when, for example, driver P changes from vehicle A to vehicle B, the agent data used according to the driver can be transferred from vehicle A to vehicle B. At this time, the search results of route searches and the learning results of histories included in the agent data can be transferred as they are, while, for example, the vehicle size of the new vehicle B is adopted.
 Furthermore, as agent data associated with vehicle A, special histories indicating that the vehicle has traveled in special areas, such as a history of having driven to the summit of Mt. Fuji or having driven across the North American continent, can be kept in a state in which no individual can be identified, so that they can be recognized when a new driver Q takes over vehicle A and drives it. The agent data associated with vehicle A can also keep the repair history, accident history, and the like of vehicle A as special histories. The special histories included in this agent data can be used as added-value information of vehicle A.
 That is, among these special histories, a history of having driven to the summit of Mt. Fuji or a special history indicating travel in a special area such as having driven across the North American continent can be used as information that enhances the premium feel of vehicle A (increases its added value). On the other hand, among these special histories, the repair history, accident history, and the like can be used as information that lowers the added value of vehicle A.
 As a result, whereas vehicle histories such as the mileage, years of use, repair history, and accident history, which are also included in the special histories, have conventionally been used as indices for deciding how much to lower the added value of vehicle A when it is sold, the special histories that enhance the premium feel (increase the added value) can be used as indices that increase the value of vehicle A.
 Here, a configuration example of the vehicle 11 will be described.
 The vehicle 11 includes a control unit 31, a vehicle drive unit 32, an operation input unit 33, a vehicle motion detection unit 34, a communication unit 35, a display unit 36, an audio output unit 37, an audio input unit 38, an imaging unit 39, a storage unit 40, and an agent processing unit 41.
 The control unit 31 is a computer consisting of a so-called ECU (Engine Control Unit) or the like, and controls the overall operation of the vehicle 11.
 The vehicle drive unit 32 is a general term for the driven parts included in the vehicle 11, such as the engine, accelerator, brake, air conditioner, and lighting, and its operation is controlled by the control unit 31. The control unit 31 may control automatic driving of the vehicle 11; in this case, automatic driving is realized by controlling the vehicle drive unit 32.
 The operation input unit 33 consists of buttons, a touch panel, and the like for inputting various kinds of information to the agent controlled by the agent processing unit 41, and is operated by the driver, who is the user, to output operation signals corresponding to the operation content.
 The vehicle motion detection unit 34 consists of various sensors in the vehicle 11, and detects, in addition to yaw, pitch, and roll by a three-dimensional acceleration sensor, the presence or absence of automatic brake operation, the presence or absence of collisions, and the like.
 The communication unit 35 consists of an Ethernet (registered trademark) board or the like, communicates with the server 12 via the network 13, and transmits and receives various data.
 The display unit 36 is a display consisting of an LCD (Liquid Crystal Display), an organic EL (Electro Luminescence) display, or the like, and displays various kinds of information.
 The audio output unit 37 consists of a speaker or the like and outputs various kinds of information as audio. For example, the audio output unit 37 outputs, as audio, information related to route guidance in the car navigation function of the agent realized by the agent processing unit 41.
 The audio input unit 38 consists of a microphone or the like and, like the operation input unit 33, accepts the driver's instructions by voice.
 The imaging unit 39 consists of an imaging element such as a CMOS (Complementary Metal Oxide Semiconductor) sensor, captures images of the traveling direction of the vehicle 11, the rear, the rear sides, the area below, the interior of the vehicle, the entire surroundings of the vehicle, the driver's facial expression, and the like, and outputs the captured image signals.
 The storage unit 40 consists of a flash memory or the like, reads and stores the agent data stored in the server 12, and is rewritten as appropriate according to the processing of the agent processing unit 41. The agent data stored in the storage unit 40 is stored in the server 12 as necessary.
 The agent processing unit 41 realizes the agent functions described above. The agent processing unit 41 includes an authentication unit 61, an account management unit 62, an agent data synchronization management unit 63, an agent data use management unit 64, an agent data migration management unit 65, an agent data discard management unit 66, an analysis unit 67, a case search unit 68, a case confirmation unit 69, a case correction unit 70, a case determination unit 71, a case anonymization unit 72, and a case verification unit 73.
 When a user who is a driver gets into the vehicle 11, the authentication unit 61 authenticates whether the person is a registered user, for example by having the operation input unit 33 operated and using the input ID and password. The authentication unit 61 may also authenticate whether the person is a registered driver on the basis of, for example, the driver's face image captured by the imaging unit 39. In addition, the authentication unit 61 may authenticate using the user's retina pattern or fingerprint captured by the imaging unit 39, or using the user's voiceprint input via the audio input unit 38.
 The account management unit 62 manages the account of the user who is the driver managing the agent data related to the vehicle 11. More specifically, when authentication is not granted by the authentication unit 61, that is, when the person is a new driver, the account management unit 62 registers the user who is the driver as an account. An account is normally set for each vehicle 11 in association with the driver who is the individual owner.
 When there is no next owner, for example because the vehicle 11 has been sold, no account exists. However, data that depends on the individual vehicle 11, such as its mileage and repair information, needs to be managed in association with the vehicle 11 even when there is no driver account. In such a case, the account management unit 62 may set a temporary account and manage the individual-dependent data in the server 12 in association with the temporary account, even while no driver is registered, until a driver who is the new owner is registered. In this case, when a new driver is registered, the temporary account is deleted, and the data is managed in the server 12 in association with the account for which the driver has been set.
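A minimal sketch of the temporary-account handling described above follows; the account and vehicle-record fields are hypothetical names used only for illustration.

```python
import uuid

def detach_owner(vehicle_record: dict) -> dict:
    """When a vehicle loses its owner, keep only the individual-dependent data
    under a temporary account until a new owner registers."""
    vehicle_record["account"] = {"type": "temporary", "id": uuid.uuid4().hex}
    vehicle_record["agent_data"] = {
        "individual_dependent": vehicle_record["agent_data"].get("individual_dependent", {})
    }
    return vehicle_record

def register_new_owner(vehicle_record: dict, driver_id: str) -> dict:
    """Replace the temporary account with the newly registered driver's account."""
    vehicle_record["account"] = {"type": "driver", "id": driver_id}
    return vehicle_record
```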
 The agent data synchronization management unit 63 accesses the server 12 on the basis of the authentication result and information identifying the vehicle 11, synchronizes the agent data of the authenticated user stored in association with the vehicle 11, and reads it into the storage unit 40.
 The agent data use management unit 64 executes use setting processing for the agent data stored in the storage unit 40, and performs use settings according to the operation content of the operation input unit 33.
 When the user switches to a new vehicle 11, for example by replacing the vehicle 11, the agent data migration management unit 65 executes migration setting processing for migrating the agent data to the new vehicle 11, and performs migration settings according to the operation content of the operation input unit 33.
 The agent data discard management unit 66 executes discard setting processing, accepts operations from the operation input unit 33, and performs discard settings for the data, among the agent data, whose discarding has been instructed.
 When the agent data stored in the storage unit 40 is used by the agent data use management unit 64 and the agent functions are executed, the analysis unit 67 analyzes the situation of the vehicle 11 on the basis of the various detection results supplied from the operation input unit 33, the vehicle motion detection unit 34, and the imaging unit 39, and identifies a case that constitutes a problem.
 The case search unit 68 searches the cases accumulated as agent data for the case analyzed by the analysis unit 67, and outputs the retrieved case as a target case.
 The case confirmation unit 69 confirms whether the problem identified by the analysis unit 67 is solved by the retrieved target case.
 When the problem cannot be solved by the case retrieved by the case confirmation unit 69, the case correction unit 70 corrects the retrieved target case so that the problem can be solved.
 The case determination unit 71 determines whether the problem has been solved by the agent processing corresponding to the target case, and determines whether the solved target case needs to be anonymized so that the driver's personal information cannot be identified when the case is managed as agent data.
 When the case determination unit 71 determines that anonymization is necessary, the case anonymization unit 72 anonymizes the information of the case whose problem was solved, for example by k-anonymization.
 The case verification unit 73 verifies whether a case whose problem was solved is a falsified case.
 The agent data use management unit 64 accumulates in the agent data, as a case database, the data of cases whose problems were solved, anonymized as necessary and free of falsification. In subsequent processing, the agent data use management unit 64 then executes the agent functions using the agent data in which the cases have been accumulated. As a result, by accumulating cases, the learning needed to demonstrate the agent functions is repeated and the processing accuracy improves.
 The learning referred to here means repeatedly performing, using the agent data that is the learning data required for exhibiting the agent functions, processing that searches for and corrects agent data for solving a problem, executes the processing that solves the problem, and accumulates the content of the executed processing as agent data.
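The retrieve-confirm-correct-execute-accumulate cycle described here resembles classic case-based reasoning. The sketch below shows that loop in outline; the function names and the way cases are matched are assumptions, not components named in the disclosure.

```python
def agent_learning_step(problem: dict, case_base: list,
                        retrieve, confirm, correct, execute) -> list:
    """One cycle of the learning described above: search for a similar case,
    check it, correct it if needed, execute it, and accumulate the outcome."""
    case = retrieve(problem, case_base)          # case search unit
    if case is not None and not confirm(problem, case):
        case = correct(problem, case)            # case correction unit
    succeeded = execute(case) if case is not None else False
    case_base.append({"problem": problem,        # accumulate as agent data
                      "solution": case,
                      "outcome": "success" if succeeded else "failure"})
    return case_base
```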
 The agent data that is the learning data includes not only the content of the processing (cases) executed to solve problems by using the search results and correction results for solving those problems, but also various data acquired from the hardware while the user is using the hardware.
 For example, when the hardware is a vehicle, the various data include the location, time, and driver state (awake, asleep, heart rate, and so on) at which sudden braking occurred, as well as the locations, times, and driver states associated with unsteady driving, slipping, lane departure, ignoring signals, overtaking, and accidents. That is, when the hardware is a vehicle, the agent data that is the learning data can be said to include, in addition to the content executed to solve problems, data indicating the operating state of the vehicle that is the hardware and data indicating the driver's behavior, in other words a driving history.
 Furthermore, the data indicating the operating state of the hardware includes data obtained by executing the software that controls the operation of the hardware, and data acquired from outside in order to control the operation of the hardware, for example data obtained from a cloud server.
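The kinds of driving-history records listed above could be represented, for illustration only, as simple tagged events; the field names below are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class DrivingEvent:
    """One entry of the driving history included in the agent (learning) data."""
    kind: str                      # e.g. "sudden_braking", "lane_departure", "accident"
    location: str                  # place where the event occurred
    time: str                      # ISO 8601 timestamp
    driver_state: Dict[str, object] = field(default_factory=dict)  # awake, heart rate, ...

# Example record corresponding to a sudden-braking event.
event = DrivingEvent(kind="sudden_braking",
                     location="35.6586,139.7454",
                     time="2017-04-07T08:15:00+09:00",
                     driver_state={"awake": True, "heart_rate": 92})
```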
 The server 12 is built from at least one server on the network 13, such as a cloud server, stores the agent data of the vehicles 11-1 to 11-n, and includes a control unit 91, a communication unit 92, and a storage unit 93.
 The control unit 91 controls the overall operation of the server 12. The communication unit 92 consists of, for example, an Ethernet (registered trademark) board, communicates with the vehicle 11 via the network 13, and transmits and receives various data. The storage unit 93 stores agent data registered in association with at least one of the vehicle 11 and the user.
<Agent processing>
 Next, agent processing by the agent system of FIG. 1 will be described with reference to the flowchart of FIG. 2.
 In step S11, the authentication unit 61 determines, on the basis of the information supplied from the operation input unit 33, the vehicle motion detection unit 34, and the imaging unit 39, whether a driver has gotten into the vehicle. That is, for example, when the operation input unit 33 is operated and information indicating boarding is input, when the vehicle motion detection unit 34 detects an operation such as the vehicle being unlocked, a door being opened, and the vehicle being put into an operating state, or when a face image is detected near the driver's seat in an in-vehicle image captured by the imaging unit 39, the authentication unit 61 regards the driver as having boarded, and the process proceeds to step S12. The process in step S11 is repeated until boarding is detected.
 In step S12, the authentication unit 61 executes authentication processing to authenticate the driver.
<Authentication processing>
 Here, the authentication processing will be described with reference to the flowchart of FIG. 3.
 In step S41, the authentication unit 61 displays an authentication start image, for example as shown in FIG. 4.
 In FIG. 4, a "Login" image is displayed, and a button B1 labeled "ID input", which is operated when entering an ID, is displayed in the upper center. Below it, the message "If you use fingerprint authentication or device authentication, please perform the prescribed operation" is displayed, instructing the user to perform the prescribed operation when using methods other than ID and password entry, such as fingerprint authentication or device authentication using a face image.
 For example, when the button B1 is operated, as shown in FIG. 5, an ID input field C1 labeled "ID", a password input field C2 labeled "Password", and a keyboard C3 are displayed from the top, and the ID and password can be input by selecting the ID input field C1 and the password input field C2 with a pointer or the like and operating the keyboard C3 with the operation input unit 33.
 Furthermore, a button C4 labeled "Done", which indicates completion after the ID and password have been entered, is provided.
 ステップS42において、認証部61は、IDとパスワードが入力されたか否かを判定し、入力されたと判定されるまで、ステップS41,S42の処理が繰り返される。そして、ステップS42において、例えば、ID入力欄C1、およびパスワード入力欄C2が選択されて、キーボードC3が、操作入力部33で操作されて、IDとパスワードが入力され、ボタンC4が操作されることで、IDとパスワードが入力されたとみなされ、処理は、ステップS43に進む。 In step S42, the authentication unit 61 determines whether an ID and a password have been input, and the processes in steps S41 and S42 are repeated until it is determined that the ID and password have been input. In step S42, for example, the ID input field C1 and the password input field C2 are selected, the keyboard C3 is operated by the operation input unit 33, the ID and password are input, and the button C4 is operated. Thus, it is considered that the ID and password have been input, and the process proceeds to step S43.
 In step S43, the authentication unit 61 controls the communication unit 35 to have the server 12 check whether the entered ID and password match those registered in advance. At this time, for example, an image displaying "Logging in" such as that shown in FIG. 6 is displayed.
 In response, the control unit 91 of the server 12 controls the communication unit 92 to acquire the transmitted ID and password, and checks whether they match a user ID and password registered in advance in the storage unit 93. The control unit 91 then controls the communication unit 92 to transmit the check result to the vehicle 11 that transmitted the ID and password.
 In step S44, the authentication unit 61 determines, on the basis of the check result, whether the ID and password match and authentication has been granted.
 If authentication is granted (authentication is OK) in step S44, the authentication unit 61 recognizes in step S45 that authentication has been granted.
 On the other hand, if authentication is not granted (authentication is NG) in step S44, the authentication unit 61 recognizes in step S46 that authentication has not been granted.
 The authentication process is completed by the above processing.
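 The following is a minimal sketch of the ID/password check in steps S41 to S46, assuming a hypothetical in-memory registry in place of the server 12. Names such as `ServerRegistry` and `authenticate_driver` are illustrative and are not defined in this disclosure.

```python
from dataclasses import dataclass, field


@dataclass
class ServerRegistry:
    """Stands in for the storage unit 93 of the server 12 (assumed structure)."""
    accounts: dict = field(default_factory=dict)  # user_id -> password

    def verify(self, user_id: str, password: str) -> bool:
        # Corresponds to the collation performed by the control unit 91.
        return self.accounts.get(user_id) == password


def authenticate_driver(registry: ServerRegistry, user_id: str, password: str) -> bool:
    """Returns True when authentication is granted (S44/S45), False otherwise (S46)."""
    if not user_id or not password:
        return False  # steps S41/S42 repeat until both values are entered
    return registry.verify(user_id, password)


if __name__ == "__main__":
    registry = ServerRegistry(accounts={"userA": "secret"})
    print(authenticate_driver(registry, "userA", "secret"))  # True  -> authentication OK
    print(authenticate_driver(registry, "userA", "wrong"))   # False -> authentication NG
```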
 Here, the description returns to the flowchart of FIG. 2.
 When the authentication process in step S12 is completed, the processing proceeds to step S13.
 In step S13, the authentication unit 61 determines from the authentication result whether authentication has been granted (authentication is OK).
 If authentication is not granted in step S13, the processing proceeds to step S24.
 In step S24, the authentication unit 61 determines whether the operation input unit 33 has been operated to request the setting of a new account. If it is determined in step S24 that the setting of a new account is requested, the processing proceeds to step S25.
 In step S25, the account management unit 62 controls the display unit 36 to display an image for setting a new account, and accepts an operation input for setting the new account via the operation input unit 33.
 In step S26, the account management unit 62 sets a new account on the basis of the information input by operating the operation input unit 33, controls the communication unit 35 to register the new account with the server 12, and the processing proceeds to step S23.
 If the setting of a new account is not requested in step S24, then in step S27 the authentication unit 61 controls the display unit 36 to indicate that authentication has not been granted (authentication is NG), for example as shown in FIG. 7, and the processing proceeds to step S23. In FIG. 7, the message "Login failed" is displayed, indicating that authentication was not granted.
 In step S23, the authentication unit 61 determines whether the driver has exited the vehicle on the basis of information supplied from the operation input unit 33, the vehicle motion detection unit 34, and the imaging unit 39. If it is determined in step S23 that the driver has exited, the processing returns to step S12.
 That is, when boarding is detected but authentication is not granted, the processing in steps S12, S13, and S23 to S26 is repeated until a new account is set up and authentication is granted, and a state in which the agent processing cannot proceed continues until the user exits the vehicle 11.
 If authentication is granted in step S13, the processing proceeds to step S14.
 In step S14, the agent data synchronization management unit 63 controls the communication unit 35 to read, from the server 12, the agent data of the driver (user) whose authentication has been granted, and updates the agent data stored in its own storage unit 40 (synchronizes it with the agent data on the server 12).
 In step S15, the agent data synchronization management unit 63 generates, on the basis of the agent data stored in the storage unit 40, an image for selecting available data and applicable actions, such as that shown in FIG. 8, and controls the display unit 36 to display it.
 In FIG. 8, the data of the authenticated user, the data already applied to the vehicle A, and the data associated with the user A managed by the server 12 are displayed from the top.
 In FIG. 8, the agent data of the user A is shown as the data of the authenticated user. In addition, individual-dependent data (Public), vehicle-type-dependent data (Public), and general-purpose data (Public) are shown as the data already applied to the vehicle A, which is the vehicle 11 executing this series of agent processing. That is, it is shown that the data currently applied to the vehicle A consists of individual-dependent data (Public), vehicle-type-dependent data (Public), and general-purpose data (Public).
 Furthermore, it is shown that the data associated with the user A includes data from driving the vehicle B and data from driving the vehicle C. As the data from driving the vehicle B, individual-dependent data, vehicle-type-dependent data (Public/Private), and general-purpose data (Public/Private) are shown; that is, the data from driving the vehicle B consists of individual-dependent data, vehicle-type-dependent data (Public/Private), and general-purpose data (Public/Private).
 As the data from driving the vehicle C, individual-dependent data, vehicle-type-dependent data, and general-purpose data (Public/Private) are shown; that is, the data from driving the vehicle C consists of individual-dependent data, vehicle-type-dependent data, and general-purpose data (Public/Private).
 Furthermore, at the bottom of FIG. 8, a button 111 labeled "Use", a button 112 labeled "Transfer", and a button 113 labeled "Discard" are provided from the left; these are operated to select the use setting, the transfer setting, and the discard setting of the agent data, respectively.
 <Configuration of agent data>
 Here, the configuration of the agent data will be described with reference to FIG. 9.
 The agent data consists of three kinds of data: individual-dependent data, vehicle-type-dependent data, and general-purpose data. Furthermore, each kind of data has a Private type, which includes information depending on the user, and a Public type, which does not.
 Accordingly, a Private type and a Public type are set for each of the three kinds of data (individual-dependent data, vehicle-type-dependent data, and general-purpose data), so that a total of six kinds of data exist.
 The individual-dependent data is data specific to the individual piece of hardware on which the agent is mounted (here, the vehicle itself). More specifically, since the hardware is a vehicle, the individual-dependent data includes, for example, the repair history, mileage, modification information, collision history, and remaining amount of gasoline.
 The vehicle-type-dependent data is data that is common not only to that individual vehicle but to all vehicles of the same model. More specifically, since the hardware is a vehicle, the vehicle-type-dependent data includes, for example, the fuel efficiency and maximum speed specified in common for each vehicle model, information on whether navigated routes are passable, sensing data of the vehicle motion detection unit 34 (for example, automatic braking activation conditions), information on whether automated driving is possible, and travel routes used during automated driving.
 The general-purpose data is data that is applicable in general without depending on the vehicle, which is the hardware. More specifically, since the hardware is a vehicle, the general-purpose data includes, for example, route information, information on nearby stores, the history of visited places, the history of conversations with the agent, the driving style, sudden braking, the number of horn uses, the presence or absence of smoking, three-dimensional data on weather, buildings, roads, and the like, and external information.
 Private type data is information that depends on the driver who uses the vehicle, which is the hardware, although it does not necessarily identify an individual; for example, a travel history that includes places and times is Private type data.
 Public type data is data that is not Private type data, for example, mere route information or a travel history from which places and times, that is, which user the data belongs to, cannot be identified.
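 The following is a minimal sketch of the six kinds of agent data described above (three data kinds, each with a Private and a Public variant). The enum values and the `AgentDataItem` container are illustrative assumptions, not structures defined in this disclosure.

```python
from dataclasses import dataclass
from enum import Enum


class DataKind(Enum):
    INDIVIDUAL_DEPENDENT = "individual"       # e.g. repair history, mileage
    VEHICLE_TYPE_DEPENDENT = "vehicle_type"   # e.g. fuel efficiency, maximum speed
    GENERAL_PURPOSE = "general"               # e.g. route information, visit history


class Visibility(Enum):
    PRIVATE = "Private"  # depends on the user (e.g. a travel history with place and time)
    PUBLIC = "Public"    # not tied to an identifiable user


@dataclass
class AgentDataItem:
    kind: DataKind
    visibility: Visibility
    payload: dict


# Example: a Private general-purpose item (a travel history entry with place and time).
item = AgentDataItem(
    kind=DataKind.GENERAL_PURPOSE,
    visibility=Visibility.PRIVATE,
    payload={"from": "X", "to": "Y", "date": "2016-02-09"},
)
```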
 In FIG. 9, for each kind of agent data, the upper left portion indicates whether the data can be transferred to a different agent, and the lower right portion indicates whether the data can be transferred to a different user. Here, a circle indicates that the transfer is possible, a cross indicates that the transfer is not possible, and a triangle indicates that the transfer is possible only when the vehicle model is the same.
 Transferring agent data to a different agent means that, since each vehicle 11 has its own agent, the agent data that a given user used when driving a vehicle A is transferred to the agent of a different vehicle B so that the different agent can be used.
 Transferring agent data to a different user means that, when a different user B drives the vehicle 11 used by a user A, the agent data of the user B is transferred to the agent of that vehicle 11 so that the agent can be used.
 That is, individual-dependent data, whether of the Public type or the Private type, cannot be transferred to a different agent.
 Vehicle-type-dependent data, whether of the Public type or the Private type, can be transferred to a different agent if the vehicle model is the same, but cannot be transferred to a different agent if the vehicle model is different.
 Furthermore, general-purpose data, whether of the Public type or the Private type, can be transferred to a different agent.
 Private type data of any kind (individual-dependent data, vehicle-type-dependent data, or general-purpose data) may contain personal information and is specific to the user, and therefore cannot be transferred to a different user.
 On the other hand, Public type data of any kind (individual-dependent data, vehicle-type-dependent data, or general-purpose data) is not specific to the user and can therefore be transferred to a different user.
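 The transfer rules of FIG. 9 can be summarized as a simple permission check. The following is a minimal sketch under the assumption that the data kind and visibility are represented as plain strings; the function names are illustrative.

```python
def can_transfer_to_other_agent(kind: str, same_vehicle_model: bool) -> bool:
    """Whether data of the given kind may move to the agent of another vehicle."""
    if kind == "individual_dependent":
        return False                    # cross mark: never transferable to a different agent
    if kind == "vehicle_type_dependent":
        return same_vehicle_model       # triangle: only between vehicles of the same model
    return True                         # circle: general-purpose data is always transferable


def can_transfer_to_other_user(visibility: str) -> bool:
    """Whether data of the given visibility may be handed over to a different user."""
    return visibility == "Public"       # Private data stays with its user


if __name__ == "__main__":
    # Vehicles A and B share a model; vehicle C does not.
    print(can_transfer_to_other_agent("vehicle_type_dependent", same_vehicle_model=True))   # True
    print(can_transfer_to_other_agent("vehicle_type_dependent", same_vehicle_model=False))  # False
    print(can_transfer_to_other_user("Private"))                                            # False
```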
 Now, the description returns to the display example of FIG. 8. Since the agent data is configured as described above, the individual-dependent data (Public), vehicle-type-dependent data (Public), and general-purpose data (Public) listed in FIG. 8 as data already applied to the vehicle A are data associated with the vehicle A; and because the authenticated user A has never driven the vehicle A before, all of them are data that can be transferred to a different user, and so they are shown as already applied.
 In addition, as data associated with the user A from driving the vehicle B, individual-dependent data, vehicle-type-dependent data (Public/Private), and general-purpose data (Public/Private) are shown. The items marked "(Public/Private)", namely the vehicle-type-dependent data (Public/Private) and the general-purpose data (Public/Private), are data that can be transferred, as shown in FIG. 9. Accordingly, as shown in FIG. 9, the individual-dependent data, which is not marked "(Public/Private)", cannot be transferred.
 That is, here, since the vehicle A is of the same model as the vehicle B, it is indicated that the vehicle-type-dependent data (Public/Private) can be transferred.
 Furthermore, as data associated with the user A from driving the vehicle C, individual-dependent data, vehicle-type-dependent data, and general-purpose data (Public/Private) are shown. The item marked "(Public/Private)", namely the general-purpose data (Public/Private), is data that can be transferred, as shown in FIG. 9. Accordingly, as shown in FIG. 9, the individual-dependent data and the vehicle-type-dependent data, which are not marked "(Public/Private)", cannot be transferred.
 Here, since the vehicle A is not of the same model as the vehicle C, it is indicated that the vehicle-type-dependent data cannot be transferred.
 Here, the description returns to the flowchart of FIG. 2.
 In step S16, the agent use management unit 64 determines whether the operation input unit 33 has been operated to press the button 111 labeled "Use" in FIG. 8 and thereby instruct the use setting of the agent data.
 If it is determined in step S16 that the use setting has been instructed, then in step S17 the agent use management unit 64 executes the use setting process for the agent data, and the processing proceeds to step S23. The use setting process will be described later in detail with reference to the flowchart of FIG. 10.
 On the other hand, if it is determined in step S16 that the use setting has not been instructed, the processing proceeds to step S18.
 In step S18, the agent transfer management unit 65 determines whether the operation input unit 33 has been operated to press the button 112 labeled "Transfer" in FIG. 8 and thereby instruct the transfer setting of the agent data.
 If it is determined in step S18 that the transfer setting has been instructed, then in step S19 the agent transfer management unit 65 executes the transfer setting process for setting the transfer of the agent data, and the processing proceeds to step S23. The transfer setting process will be described later in detail with reference to the flowchart of FIG. 12.
 On the other hand, if it is determined in step S18 that the transfer setting has not been instructed, the processing proceeds to step S20.
 In step S20, the agent discard management unit 66 determines whether the operation input unit 33 has been operated to press the button 113 labeled "Discard" in FIG. 8 and thereby instruct the discard setting of the agent data.
 If it is determined in step S20 that the discard setting has been instructed, then in step S21 the agent discard management unit 66 executes the discard setting process for setting the discard of the agent data, and the processing proceeds to step S23. The discard setting process will be described later in detail with reference to the flowchart of FIG. 15.
 On the other hand, if it is determined in step S20 that the discard setting has not been instructed, the processing proceeds to step S22.
 In step S22, the analysis unit 67 executes the agent function execution process to run the agent function, and the processing proceeds to step S23. The agent function execution process will be described later in detail with reference to the flowchart of FIG. 17.
 By the series of processes described above, the agent processing is executed, and it becomes possible to drive the vehicle 11 using the agent function. In addition, the agent data required for using the agent function can be handed over to another user or handed over for use in another vehicle.
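 As a rough illustration, the branching in steps S16 to S22 amounts to dispatching on which button was pressed. The following sketch uses hypothetical handler labels; the mapping itself is an assumption drawn only from the step descriptions above.

```python
def dispatch(selection: str) -> str:
    """Return which process the agent runs for the given selection (S16-S22)."""
    handlers = {
        "use": "use setting process (S17)",
        "transfer": "transfer setting process (S19)",
        "discard": "discard setting process (S21)",
    }
    # Anything else falls through to the agent function execution process (S22).
    return handlers.get(selection, "agent function execution process (S22)")


if __name__ == "__main__":
    for choice in ("use", "transfer", "discard", "no selection"):
        print(choice, "->", dispatch(choice))
```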
 <Use setting process>
 Next, the use setting process will be described with reference to the flowchart of FIG. 10.
 In step S71, the agent data use management unit 64 displays an image for setting the use of the agent data, such as that shown in FIG. 11. In FIG. 11, buttons 131 to 133 labeled "Applied" are displayed to the right of the individual-dependent data (Public), vehicle-type-dependent data (Public), and general-purpose data (Public) shown as data already applied to the vehicle A. Each time one of these buttons is pressed, the display toggles between "Apply", indicating that the data is applicable but not applied, and "Applied", indicating that the data is already applied, and the setting is switched to the state corresponding to the displayed label. In FIG. 11, applying the data is set by default, so the buttons 131 to 133 are labeled "Applied".
 Among the data associated with the user A that can be transferred from the server 12 to the vehicle A, for the data from driving the vehicle B, the individual-dependent data is grayed out to indicate that it cannot be used, and buttons 134 and 135 labeled "Apply" are displayed for the vehicle-type-dependent data (Public/Private) and the general-purpose data (Public/Private), in the same manner as described above.
 Furthermore, among the data associated with the user A that can be transferred from the server 12 to the vehicle A, for the data from driving the vehicle C, the individual-dependent data and the vehicle-type-dependent data are grayed out to indicate that they cannot be used, and a button 136 labeled "Apply" is displayed for the general-purpose data (Public/Private), in the same manner as described above.
 In step S72, the agent data use management unit 64 determines whether the operation input unit 33 has been operated and the buttons 131 to 136 have been operated; if it is determined that they have been operated, the processing proceeds to step S73.
 In step S73, the agent data use management unit 64 updates the agent data stored in the storage unit 40 in accordance with the operations performed on the buttons 131 to 136 via the operation input unit 33.
 In step S74, the agent data use management unit 64 determines whether the operation input unit 33 has been operated to instruct the end of the use setting. If the end has not been instructed, the processing returns to step S71, and the subsequent processing is repeated.
 If the operation input unit 33 is not operated in step S72, the processing in step S73 is skipped.
 That is, the processing in steps S71 to S74 is repeated until the end of the use setting is instructed. Then, when the end is instructed in step S74, the use setting ends.
 Through the above processing, it becomes possible to set, for the applicable agent data, whether or not to apply it.
 <Transfer setting process>
 Next, the transfer setting process will be described with reference to the flowchart of FIG. 12.
 In step S101, the agent data transfer management unit 65 displays an image for setting the transfer of the agent data, such as that shown in FIG. 13. In FIG. 13, from the top, the message "Please select a transfer destination" is shown, together with buttons 151 to 153 for selecting the vehicle 11 to be the transfer destination, labeled "Vehicle D (scheduled to be used from 16:00 on February 9, 2016; AA taxi)", "Vehicle E (owned by friend B)", and "Vehicle F (owned by user A)". The items below these are the same as those in FIG. 8, and their description is therefore omitted.
 That is, buttons 151 to 153 are displayed for selecting the vehicles D, E, and F that can be used by the authenticated user and to which the agent data can be transferred.
 In step S102, the agent data transfer management unit 65 determines whether the operation input unit 33 has been operated and the buttons 151 to 153 have been operated.
 In step S102, for example, when the button 151 is pressed, it is regarded as having been operated, and the processing proceeds to step S103.
 In step S103, the agent data transfer management unit 65 updates the agent data stored in the storage unit 40 in accordance with the operations performed on the buttons 151 to 153 via the operation input unit 33.
 That is, when the button 151 is pressed, the agent data transfer management unit 65 switches the display to indicate that the button 151 has been selected, as shown in FIG. 14, and displays buttons 171 to 173 labeled "Transfer", which allow the user to select whether to transfer the data already applied to the vehicle A and the general-purpose data (Public/Private) of each of the vehicles B and C. Here, since the vehicle D is not of the same model as any of the vehicles A, B, and C, it is indicated that the transfer of the vehicle-type-dependent data (Public/Private) is not permitted.
 In step S104, the agent data transfer management unit 65 determines whether the operation input unit 33 has been operated to instruct the end of the transfer setting. If the end has not been instructed, the processing returns to step S101, and the subsequent processing is repeated.
 In this case, in step S102, with the screen displayed as shown in FIG. 14, the presence or absence of an operation input is determined again; for example, when any of the buttons 171 to 173 is operated, it is regarded as an operation input, and the processing proceeds to step S103 again.
 In step S103, the agent data transfer management unit 65 sets the transfer of the data corresponding to whichever of the buttons 171 to 173 was operated, and the processing proceeds to step S104.
 Then, when the end of the transfer setting is instructed in step S104, the transfer setting ends.
 Through the above processing, the agent data is set so that at least the data corresponding to the selected buttons among the buttons 171 to 173, that is, the data already applied to the vehicle A and the general-purpose data (Public/Private) of each of the vehicles B and C, is transferred to the vehicle 11 identified as "Vehicle D (scheduled to be used from 16:00 on February 9, 2016; AA taxi)".
 Note that the rules applied in the transfer process to the individual-dependent data, vehicle-type-dependent data, and general-purpose data, and to the Private type and Public type data, in the agent data may be other than those described with reference to FIG. 9.
 For example, instead of the rules in FIG. 9, as a modification pattern 1, the user may be allowed to change the circles, triangles, and crosses in FIG. 9. However, changes are permitted only in the direction from a circle to a triangle and from a triangle to a cross, and not in the reverse direction.
 As a modification pattern 2, the user may be allowed to change the classification of the individual-dependent data, vehicle-type-dependent data, and general-purpose data in the figure. However, changes are permitted only from general-purpose data to vehicle-type-dependent data and from vehicle-type-dependent data to individual-dependent data, and not in the reverse direction.
 Furthermore, as a modification pattern 3, the user may be allowed to change the Private type and Public type classifications in the figure. However, changes are permitted only from the Public type to the Private type, and not in the reverse direction.
 That is, data that can be transferred may be made non-transferable, but data that cannot be transferred must not be made transferable; in this way, privacy can be protected while the rules of the transfer process are given flexibility. In other words, when the Public type is changed to the Private type, the data was transferable in the first place, so making it non-transferable does not affect privacy; the reverse change, however, would expose private information and lack consideration for privacy, and such a change is therefore not permitted.
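 A minimal sketch of the one-directional rule changes in modification patterns 1 to 3 follows: a change is accepted only if it moves in the direction that never makes previously non-transferable data transferable. The orderings and the function name are illustrative assumptions based only on the description above.

```python
ALLOWED_ORDER = {
    # pattern 1: circle -> triangle -> cross (never back)
    "transfer_mark": ["circle", "triangle", "cross"],
    # pattern 2: general-purpose -> vehicle-type-dependent -> individual-dependent
    "data_kind": ["general", "vehicle_type", "individual"],
    # pattern 3: Public -> Private
    "visibility": ["Public", "Private"],
}


def change_allowed(category: str, current: str, requested: str) -> bool:
    """A change is valid only if it moves forward (or stays) in the allowed order."""
    order = ALLOWED_ORDER[category]
    return order.index(requested) >= order.index(current)


if __name__ == "__main__":
    print(change_allowed("visibility", "Public", "Private"))     # True: more restrictive
    print(change_allowed("visibility", "Private", "Public"))     # False: would expose private data
    print(change_allowed("transfer_mark", "circle", "cross"))    # True
    print(change_allowed("transfer_mark", "cross", "triangle"))  # False
```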
 <Discard setting process>
 Next, the discard setting process will be described with reference to the flowchart of FIG. 15.
 In step S121, the agent data discard management unit 66 displays an image for setting the discard of agent data, such as that shown in FIG. 16. In FIG. 16, buttons 191 to 199 labeled "Discard", which are operated to instruct a discard, are provided for the individual-dependent data (Private), vehicle-type-dependent data (Private), and general-purpose data (Private) as data already applied to the vehicle A; for the individual-dependent data (Public/Private), vehicle-type-dependent data (Public/Private), and general-purpose data (Public/Private) from driving the vehicle B; and for the individual-dependent data (Public/Private), vehicle-type-dependent data (Public/Private), and general-purpose data (Public/Private) from driving the vehicle C.
 That is, here, since the vehicle A and the user A are not registered as being associated with each other, only the individual-dependent data (Private), vehicle-type-dependent data (Private), and general-purpose data (Private) among the data already applied to the vehicle A are shown, which indicates that the user has no authority to discard any of the (Public) data.
 In step S122, the agent data discard management unit 66 determines whether the operation input unit 33 has been operated and the buttons 191 to 199 have been operated; if it is determined that they have been operated, the processing proceeds to step S123.
 In step S123, the agent data discard management unit 66 updates the agent data stored in the storage unit 40 in accordance with the operations performed on the buttons 191 to 199 via the operation input unit 33.
 In step S124, the agent data discard management unit 66 determines whether the operation input unit 33 has been operated to instruct the end of the discard setting. If the end has not been instructed, the processing returns to step S121, and the subsequent processing is repeated.
 If the operation input unit 33 is not operated in step S122, the processing in step S123 is skipped.
 That is, the processing in steps S121 to S124 is repeated until the end of the discard setting is instructed. Then, when the end is instructed in step S124, the discard setting ends.
 Through the above processing, it becomes possible to set, for the discardable agent data, whether or not to discard it.
 <Agent function execution process>
 Next, the agent function execution process using the agent data set by the above-described processing will be described with reference to the flowchart of FIG. 17.
 In step S141, the analysis unit 67 accepts the supply of various detection results in the vehicle 11, such as the vehicle motion detection results, input speech, and captured images supplied from the vehicle motion detection unit 34, the voice input unit 38, and the imaging unit 39.
 In step S142, the analysis unit 67 analyzes the various received detection results, sets a case corresponding to the analysis result as the target case to be solved, and supplies it to the case search unit 68. For example, when the driver, who is the user, utters speech including a destination in order to instruct navigation, the analysis unit 67 applies language analysis, semantic analysis, and the like to the speech input to the voice input unit 38, recognizes that the task is to search for a route, sets the search for a route to the destination as the target case, and supplies it to the case search unit 68.
 In step S143, the case search unit 68 searches, on the basis of the agent data stored in the storage unit 40, for cases that can serve as processing for solving the target case. For example, the case search unit 68 searches for route candidates R1, R2, ..., Rn to be selected from the past route search history included in the agent data, and outputs them as a case search result.
 In step S144, the case confirmation unit 69 confirms whether the target case to be solved can actually be solved using the data of the case search result.
 For example, suppose that, in the route search result, partial routes from the departure point X to the destination Y are found from past cases: a route R1 from the departure point X to a waypoint A, a route R2 from the waypoint A to a waypoint B, and a route R3 from the waypoint B to the destination Y. If, for example, there is no problem with the routes R1 and R3, but the route R2 is a route that the user A used in the past when driving a vehicle H, and the vehicle A that the user A is currently driving is wider than the vehicle H and difficult to maneuver on the road width of the route R2, it is determined that the target case cannot be solved with this case search result. If it is determined in step S144 that the target case cannot be solved in this way, the processing proceeds to step S145.
 In step S145, the case correction unit 70 corrects the case search result so that the target case can be solved. That is, in the case search result described above, the route R2 is corrected to a route R2' by, for example, using the agent data to search for a route that can accommodate the vehicle width of the vehicle A; as a result, by using the routes R1, R2', and R3 as the route from the departure point X to the destination Y, the target case can be solved. However, even with such a correction, the target case cannot necessarily always be solved.
 In step S146, the case determination unit 71 executes the case search result set in this way for solving the target case. That is, in this example, the case determination unit 71 controls the display unit 36 to display a navigation image to the destination Y, and controls the audio output unit 37 to output guidance speech to the destination Y.
 If the target case can be solved in step S144, the processing in step S145 is skipped. That is, in this case, the initial case search result is used as it is to solve the target case.
 In step S147, the case determination unit 71 determines whether the problem of the target case has been solved by the case search result through the series of operations. That is, in this example, the case determination unit 71 determines whether the vehicle 11 has been able to move from the departure point X to the destination Y by navigation based on the case search result.
 If the target case has been solved in step S147, then in step S148 the case determination unit 71 registers the case search result as a successful case.
 If the target case has not been solved in step S147, then in step S149 the case determination unit 71 registers the case search result as a failed case.
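 The following is a minimal sketch of the cycle in steps S143 to S149: search for a candidate case, check whether it solves the target case, correct it if needed, and record the outcome as a successful or failed case. All names here are illustrative assumptions, not components defined in this disclosure.

```python
from typing import Callable, List, Optional


def solve_target_case(
    candidates: List[dict],
    is_solvable: Callable[[dict], bool],
    correct: Callable[[dict], Optional[dict]],
) -> dict:
    for case in candidates:                       # S143: case search result
        if not is_solvable(case):                 # S144: confirmation
            case = correct(case) or case          # S145: correction (may still fail)
        executed = is_solvable(case)              # S146/S147: execute and judge the outcome
        return {"case": case, "outcome": "success" if executed else "failure"}  # S148/S149
    return {"case": None, "outcome": "failure"}


if __name__ == "__main__":
    # Route R2 is too narrow for the current vehicle; correction swaps in R2'.
    candidates = [{"route": ["R1", "R2", "R3"]}]
    ok = lambda c: "R2" not in c["route"]
    fix = lambda c: {"route": ["R1", "R2'", "R3"]}
    print(solve_target_case(candidates, ok, fix))
```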
 In step S150, the case anonymization unit 72 determines whether this case search result needs to be anonymized. For example, this case search result may remain as Private type general-purpose data containing personal information of the user A, such as the fact that the user A moved from the departure point X to the destination Y at a specific date and time. When such personal information would remain, the case anonymization unit 72 determines that anonymization is necessary in order to use the case search result as Public type general-purpose data, and the processing proceeds to step S151.
 In step S151, the case anonymization unit 72 anonymizes the case search result, for example by k-anonymization. Here, k-anonymization is processing that uses the information of a plurality of similar case search results to convert the data into information that does not identify the user A, for example information that at least k people moved from the departure point X to the destination Y. In other words, k-anonymization converts the data attributes attached to the information that someone moved from the departure point X to the destination Y so that the user A cannot be identified.
 For example, the case anonymization unit 72 controls the communication unit 35 to access the server 12, searches for information indicating movement from the departure point X to the destination Y, and obtains a plurality of case search results covering k or more people. The case anonymization unit 72 then converts the information that the user A moved from the departure point X to the destination Y on a specific date D into information that k or more people moved from the departure point X to the destination Y within a given one-year period (which includes the date D).
 In this way, because the date and time at which this data was generated becomes ambiguous, the data can be turned into information that conveys only the movement from the departure point X to the destination Y and does not identify the user A.
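 A minimal sketch of this k-anonymity idea follows: a movement record is generalized (the exact date is widened to a year) and kept only when at least k records share the same generalized attributes, so that no single user such as the user A can be singled out. The record fields and the value of k are illustrative assumptions.

```python
from collections import Counter
from typing import List


def k_anonymize(records: List[dict], k: int) -> List[dict]:
    # Generalize the quasi-identifiers: keep only origin, destination, and year.
    generalized = [
        {"from": r["from"], "to": r["to"], "year": r["date"][:4]} for r in records
    ]
    counts = Counter((g["from"], g["to"], g["year"]) for g in generalized)
    # Keep only groups observed for at least k people; user identities are dropped entirely.
    return [g for g in generalized if counts[(g["from"], g["to"], g["year"])] >= k]


if __name__ == "__main__":
    records = [
        {"user": "A", "from": "X", "to": "Y", "date": "2016-02-09"},
        {"user": "B", "from": "X", "to": "Y", "date": "2016-05-01"},
        {"user": "C", "from": "X", "to": "Y", "date": "2016-11-20"},
        {"user": "D", "from": "X", "to": "Z", "date": "2016-03-03"},
    ]
    print(k_anonymize(records, k=3))  # only the X -> Y group (3 people) survives
```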
 For the case search result anonymized in this way, the case anonymization unit 72 changes its attribute from Private type general-purpose data to Public type general-purpose data. By accumulating Public type general-purpose data whose attributes have been converted in this way, the amount of data that other agents can refer to when searching for routes increases, making it possible to perform case searches with higher accuracy.
 If, in step S150, the case search result can be regarded as Public type general-purpose data, for example, anonymization is regarded as unnecessary and the processing in step S151 is skipped.
 In step S152, the case verification unit 73 determines whether the case search result consisting of Public type general-purpose data has been tampered with; if it is determined not to have been tampered with, then in step S153 the result is stored as agent data in the storage unit 40 and is also stored in the storage unit 93 of the server 12. At this time, the case verification unit 73 also stores, in the agent data, information on whether the target case was a successful case or a failed case for the case search result consisting of Public type general-purpose data.
 If it is determined in step S152 that the case search result consisting of Public type general-purpose data has been tampered with, the processing in step S153 is skipped, and the case search result is not registered as Public type general-purpose data.
 In step S154, the analysis unit 67 determines whether the operation input unit 33 has been operated to instruct the end of the agent function execution process; if the end has not been instructed, the processing returns to step S141 and the subsequent processing is repeated. Then, when the end is instructed in step S154, the processing ends.
 Through the above processing, even when the user switches to another vehicle equipped with an agent function, the agent data accumulated by the agent function can be taken over. That is, when the user A switches from the vehicle A to the vehicle B, that is, from the agent of the vehicle A to the agent of the vehicle B, the individual-dependent data of the vehicle A among the agent data accumulated in the vehicle A is unrelated to the vehicle B and cannot be taken over, but the general-purpose data of the vehicle A can be taken over by the vehicle B, so that the operations and settings customized so far can be carried over. As for the vehicle-type-dependent data, it can be taken over as it is if the vehicles A and B are of the same model.
 Furthermore, when a new user B uses the vehicle A as a used car, none of the Private type data (individual-dependent data, vehicle-type-dependent data, or general-purpose data) can be taken over, but all of the Public type data can be taken over.
 Such processing makes it possible to appropriately take over agent data while protecting privacy.
 The Private type data described above may also be stored in the server 12 and then deleted from the vehicle at the timing immediately before the user exits the vehicle 11, and downloaded from the server 12 and used each time the user boards the vehicle 11. By managing the data in this way, even if the vehicle 11 is stolen, the Private type data can be prevented from being stolen with it.
 Furthermore, even when the vehicle 11 is rented as a rental car and the user rents the same vehicle model as his or her own vehicle 11, or when the user rents a vehicle 11 of the same model each time through car sharing or the like, by downloading the agent data from the server 12 and taking it over each time, the user can always use his or her own vehicle-type-dependent data and general-purpose data.
 In the above description, an example in which the navigation function of the vehicle 11 is realized using the agent has been described, but the target made to function as an agent may be something else.
 For example, the agent may realize a function of outputting guidance speech that assists with speed, accelerator, and braking in accordance with road conditions, based on, for example, the correlation between speed, road gradient, accelerator and brake operation, and fuel consumption.
 Alternatively, the agent may manage, as information for setting the residual value of the vehicle, the history of sudden starts and sudden braking, the accident history, the battery charge count history, the travel history (the history of contact with deterioration factors such as seaside areas, deserts, and volcanic ash), and dust and odor sensor histories.
 Furthermore, the agent may manage information that adds value to the vehicle, such as the destination history (anecdotal information possessed by the vehicle, for example that it has been driven in Antarctica), the rarity of the vehicle (the number of such vehicles running in a specific region), and the driver history (user history, such as gender, or having been driven only by women).
 Also, the more case search results are accumulated, the greater the added value they give to the vehicle. Therefore, the accumulated amount of case search results may be presented as visualized information serving as a quantitative index, such as the amount of data (... GB held).
 In addition, by comparing the residual value and added value described above with used vehicles distributed on the market, information expressing them as a monetary value, such as ... yen, may be displayed.
 <Example of execution by software>
 The series of processes described above can be executed by hardware, but can also be executed by software. When the series of processes is executed by software, a program constituting the software is installed from a recording medium into a computer incorporated in dedicated hardware, or into, for example, a general-purpose personal computer that can execute various functions by installing various programs.
 FIG. 18 shows a configuration example of a general-purpose personal computer. This personal computer incorporates a CPU (Central Processing Unit) 1001. An input/output interface 1005 is connected to the CPU 1001 via a bus 1004. A ROM (Read Only Memory) 1002 and a RAM (Random Access Memory) 1003 are connected to the bus 1004.
 Connected to the input/output interface 1005 are an input unit 1006 consisting of input devices such as a keyboard and a mouse with which the user inputs operation commands, an output unit 1007 that outputs processing operation screens and images of processing results to a display device, a storage unit 1008 consisting of a hard disk drive or the like that stores programs and various data, and a communication unit 1009 consisting of a LAN (Local Area Network) adapter or the like that executes communication processing via a network typified by the Internet. Also connected is a drive 1010 that reads and writes data to and from a removable medium 1011 such as a magnetic disk (including a flexible disk), an optical disc (including a CD-ROM (Compact Disc-Read Only Memory) and a DVD (Digital Versatile Disc)), a magneto-optical disc (including an MD (Mini Disc)), or a semiconductor memory.
 The CPU 1001 executes various processes in accordance with a program stored in the ROM 1002, or a program read from the removable medium 1011, such as a magnetic disk, optical disc, magneto-optical disc, or semiconductor memory, installed in the storage unit 1008, and loaded from the storage unit 1008 into the RAM 1003. The RAM 1003 also stores, as appropriate, data necessary for the CPU 1001 to execute the various processes.
 In the computer configured as described above, the series of processes described above is performed by the CPU 1001 loading, for example, a program stored in the storage unit 1008 into the RAM 1003 via the input/output interface 1005 and the bus 1004 and executing it.
 The program executed by the computer (CPU 1001) can be provided by being recorded on the removable medium 1011 as a packaged medium or the like, for example. The program can also be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
 In the computer, the program can be installed in the storage unit 1008 via the input/output interface 1005 by mounting the removable medium 1011 on the drive 1010. The program can also be received by the communication unit 1009 via a wired or wireless transmission medium and installed in the storage unit 1008. Alternatively, the program can be installed in advance in the ROM 1002 or the storage unit 1008.
 The program executed by the computer may be a program in which the processes are performed in time series in the order described in this specification, or a program in which the processes are performed in parallel or at necessary timings, such as when a call is made.
 In this specification, a system means a set of a plurality of constituent elements (devices, modules (parts), and the like), and it does not matter whether all the constituent elements are in the same housing. Accordingly, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
 Note that the embodiments of the present disclosure are not limited to the above-described embodiments, and various modifications can be made without departing from the gist of the present disclosure.
 For example, the present disclosure can adopt a cloud computing configuration in which one function is shared and processed jointly by a plurality of devices via a network.
 Each step described in the above flowcharts can be executed by one device, or can be shared and executed by a plurality of devices.
 Furthermore, when a plurality of processes are included in one step, the plurality of processes included in that one step can be executed by one device, or can be shared and executed by a plurality of devices.
 Note that the present disclosure can also be configured as follows.
<1> An information processing apparatus including:
 a storage unit that stores learning data in association with a user who uses hardware;
 an operation determination unit that determines an operation of the hardware on the basis of the learning data; and
 a presentation unit that presents options of learning data usable by the user from among the learning data stored in the storage unit,
 in which the operation determination unit determines the operation on the basis of learning data selected from among the options presented by the presentation unit.
<2> The information processing apparatus according to <1>, in which
 the learning data includes private data that depends on the user and public data other than the private data, and
 the presentation unit presents, as options, the public data from among the learning data stored in the storage unit as learning data usable by the operation determination unit.
<3> The information processing apparatus according to <2>, in which
 the learning data includes individual-dependent data that depends on the individual piece of the hardware, model-dependent data that depends on the model of the hardware, and general-purpose data that does not depend on the hardware, and
 when the user uses other hardware different from the hardware and causes the operation determination unit of the other hardware to determine an operation, the presentation unit presents options including the private data and the public data of the general-purpose data from among the learning data stored in association with the hardware.
<4> The information processing apparatus according to <3>, in which, when the hardware and the other hardware are of the same model and the user uses the other hardware different from the hardware and causes the operation determination unit of the other hardware to determine an operation, the presentation unit presents options including the private data and the public data of each of the model-dependent data and the general-purpose data from among the learning data stored in association with the other hardware.
<5> The information processing apparatus according to <4>, in which the presentation unit presents options for discarding the learning data in units of the private data and the public data of each of the individual-dependent data, the model-dependent data, and the general-purpose data.
<6> The information processing apparatus according to <2>, in which
 the learning data is configured for each piece of hardware used by the user, and
 the presentation unit presents another option for selecting the hardware to which the learning data is to be transferred, and presents, as options of learning data usable by the operation determination unit of the hardware, the public data of the learning data learned by the hardware used by the user from among the learning data stored in the storage unit.
<7> The information processing apparatus according to <2>, in which
 the learning data is configured for each piece of hardware used by the user, and includes individual-dependent data that depends on the individual piece of the hardware, model-dependent data that depends on the model of the hardware, and general-purpose data that does not depend on the hardware, and
 the presentation unit presents another option for selecting the hardware to which the learning data is to be transferred, and presents, as options of learning data usable by the operation determination unit of other hardware different from the hardware, the public data and the private data of the general-purpose data of the learning data learned by the hardware used by the user from among the learning data stored in the storage unit.
<8> The information processing apparatus according to any one of <1> to <7>, in which
 the hardware is a vehicle,
 the learning data includes individual-dependent data that depends on the individual vehicle, vehicle-model-dependent data that depends on the model of the vehicle, and general-purpose data that does not depend on the vehicle,
 the individual-dependent data includes a repair history, a mileage, modification information, a collision history, and a remaining amount of gasoline,
 the vehicle-model-dependent data includes fuel efficiency, a maximum speed, information on whether travel is possible on routes to be navigated, sensing data for detecting vehicle operation, information on whether automated driving is possible, and travel routes during automated driving, each specified in common for each model of the vehicle, and
 the general-purpose data includes route information, information on nearby stores, a history of visited places, a history of conversations with an agent, a manner of driving, sudden braking, the number of times the horn is sounded, presence or absence of smoking, weather, three-dimensional data of buildings, roads, and the like, and external information.
<9> The information processing apparatus according to any one of <1> to <8>, further including
 an anonymization unit that anonymizes data of an operation determined on the basis of the learning data,
 in which the storage unit stores the data of the operation determined by the operation determination unit on the basis of the learning data in a state anonymized by the anonymization unit.
<10> An information processing method of an information processing apparatus including:
 a storage unit that stores learning data in association with a user who uses hardware;
 an operation determination unit that determines an operation of the hardware on the basis of the learning data; and
 a presentation unit that presents options of learning data usable by the user from among the learning data stored in the storage unit,
 the information processing method including a step in which the operation determination unit determines the operation on the basis of learning data selected from among the options presented by the presentation unit.
<11> A program for causing a computer to function as:
 a storage unit that stores learning data in association with a user who uses hardware;
 an operation determination unit that determines an operation of the hardware on the basis of the learning data; and
 a presentation unit that presents options of learning data usable by the user from among the learning data stored in the storage unit,
 the program causing the computer to execute processing in which the operation determination unit determines the operation on the basis of learning data selected from among the options presented by the presentation unit.
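 For readers who find a concrete illustration helpful, the following is a minimal sketch, not part of the embodiment above, of how the learning data of configurations <1> to <4> might be categorized and filtered into presentable options; every name in it (Scope, LearningDataItem, present_options, decide_operation) is a hypothetical label chosen for this sketch.

from dataclasses import dataclass
from enum import Enum, auto
from typing import List


class Scope(Enum):
    INDIVIDUAL = auto()   # depends on the individual piece of hardware
    MODEL = auto()        # depends on the hardware model
    GENERIC = auto()      # does not depend on the hardware


@dataclass
class LearningDataItem:
    user_id: str          # user the data is associated with
    hardware_id: str      # hardware that produced the data
    model_id: str         # model of that hardware
    scope: Scope
    private: bool         # True: private data, False: public data
    payload: dict         # learned parameters, history entries, etc.


def present_options(store: List[LearningDataItem], user_id: str,
                    target_hw_id: str, target_model_id: str) -> List[LearningDataItem]:
    """Return the learning data that may be offered as options on the target hardware."""
    options = []
    for item in store:
        if item.user_id != user_id:
            continue
        if item.hardware_id == target_hw_id:
            options.append(item)                      # own hardware: all of the user's data
        elif item.model_id == target_model_id:
            if item.scope in (Scope.MODEL, Scope.GENERIC):
                options.append(item)                  # same model: model-dependent and generic data
        elif item.scope is Scope.GENERIC:
            options.append(item)                      # different hardware: generic data only
    return options


def decide_operation(selected: List[LearningDataItem]) -> str:
    """Placeholder for the operation determination unit: decide based on the chosen data."""
    return "default" if not selected else f"operation learned from {len(selected)} item(s)"

 In this sketch each item carries the private/public split of configuration <2>, so a presentation layer could further restrict the returned options to public data where only public data may be offered.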
 11, 11-1 to 11-n vehicle, 12 server, 13 network, 31 control unit, 32 vehicle drive unit, 33 operation input unit, 34 vehicle operation detection unit, 35 communication unit, 36 display unit, 37 audio output unit, 38 audio input unit, 39 imaging unit, 40 storage unit, 41 agent processing unit, 61 authentication unit, 62 account management unit, 63 agent data synchronization management unit, 64 agent data usage management unit, 65 agent data migration management unit, 66 agent data discard management unit, 67 analysis unit, 68 case search unit, 69 case confirmation unit, 70 case correction unit, 71 case determination unit, 72 case anonymization unit, 73 case verification unit, 91 control unit, 92 communication unit, 93 storage unit

Claims (11)

  1.  An information processing apparatus comprising:
     a storage unit that stores learning data in association with a user who uses hardware;
     an operation determination unit that determines an operation of the hardware on the basis of the learning data; and
     a presentation unit that presents options of learning data usable by the user from among the learning data stored in the storage unit,
     wherein the operation determination unit determines the operation on the basis of learning data selected from among the options presented by the presentation unit.
  2.  The information processing apparatus according to claim 1, wherein
     the learning data includes private data that depends on the user and public data other than the private data, and
     the presentation unit presents, as options, the public data from among the learning data stored in the storage unit as learning data usable by the operation determination unit.
  3.  The information processing apparatus according to claim 2, wherein
     the learning data includes individual-dependent data that depends on the individual piece of the hardware, model-dependent data that depends on the model of the hardware, and general-purpose data that does not depend on the hardware, and
     when the user uses other hardware different from the hardware and causes the operation determination unit of the other hardware to determine an operation, the presentation unit presents options including the private data and the public data of the general-purpose data from among the learning data stored in association with the hardware.
  4.  The information processing apparatus according to claim 3, wherein, when the hardware and the other hardware are of the same model and the user uses the other hardware different from the hardware and causes the operation determination unit of the other hardware to determine an operation, the presentation unit presents options including the private data and the public data of each of the model-dependent data and the general-purpose data from among the learning data stored in association with the other hardware.
  5.  The information processing apparatus according to claim 4, wherein the presentation unit presents options for discarding the learning data in units of the private data and the public data of each of the individual-dependent data, the model-dependent data, and the general-purpose data.
  6.  The information processing apparatus according to claim 2, wherein
     the learning data is configured for each piece of hardware used by the user, and
     the presentation unit presents another option for selecting the hardware to which the learning data is to be transferred, and presents, as options of learning data usable by the operation determination unit of the hardware, the public data of the learning data learned by the hardware used by the user from among the learning data stored in the storage unit.
  7.  The information processing apparatus according to claim 2, wherein
     the learning data is configured for each piece of hardware used by the user, and includes individual-dependent data that depends on the individual piece of the hardware, model-dependent data that depends on the model of the hardware, and general-purpose data that does not depend on the hardware, and
     the presentation unit presents another option for selecting the hardware to which the learning data is to be transferred, and presents, as options of learning data usable by the operation determination unit of other hardware different from the hardware, the public data and the private data of the general-purpose data of the learning data learned by the hardware used by the user from among the learning data stored in the storage unit.
  8.  The information processing apparatus according to claim 1, wherein
     the hardware is a vehicle,
     the learning data includes individual-dependent data that depends on the individual vehicle, vehicle-model-dependent data that depends on the model of the vehicle, and general-purpose data that does not depend on the vehicle,
     the individual-dependent data includes a repair history, a mileage, modification information, a collision history, and a remaining amount of gasoline,
     the vehicle-model-dependent data includes fuel efficiency, a maximum speed, information on whether travel is possible on routes to be navigated, sensing data for detecting vehicle operation, information on whether automated driving is possible, and travel routes during automated driving, each specified in common for each model of the vehicle, and
     the general-purpose data includes route information, information on nearby stores, a history of visited places, a history of conversations with an agent, a manner of driving, sudden braking, the number of times the horn is sounded, presence or absence of smoking, weather, three-dimensional data of buildings, roads, and the like, and external information.
  9.  The information processing apparatus according to claim 1, further comprising
     an anonymization unit that anonymizes data of an operation determined on the basis of the learning data,
     wherein the storage unit stores the data of the operation determined by the operation determination unit on the basis of the learning data in a state anonymized by the anonymization unit.
  10.  An information processing method of an information processing apparatus including:
     a storage unit that stores learning data in association with a user who uses hardware;
     an operation determination unit that determines an operation of the hardware on the basis of the learning data; and
     a presentation unit that presents options of learning data usable by the user from among the learning data stored in the storage unit,
     the information processing method comprising a step in which the operation determination unit determines the operation on the basis of learning data selected from among the options presented by the presentation unit.
  11.  A program for causing a computer to function as:
     a storage unit that stores learning data in association with a user who uses hardware;
     an operation determination unit that determines an operation of the hardware on the basis of the learning data; and
     a presentation unit that presents options of learning data usable by the user from among the learning data stored in the storage unit,
     the program causing the computer to execute processing in which the operation determination unit determines the operation on the basis of learning data selected from among the options presented by the presentation unit.
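     As a purely illustrative sketch of the anonymization described in claim 9, applied to the kind of vehicle operation data listed in claim 8, the function below removes or hashes user-identifying fields of a decided-operation record before it is stored; the field names and the function anonymize_operation_record are assumptions made for this sketch and are not recited in the claims.

import copy
import hashlib


def anonymize_operation_record(record: dict) -> dict:
    """Return a copy of a decided-operation record with user-identifying fields anonymized."""
    anonymized = copy.deepcopy(record)
    # Drop directly identifying fields outright.
    for key in ("user_id", "conversation_history", "visited_places"):
        anonymized.pop(key, None)
    # Replace the vehicle identifier with a one-way hash so records can still be grouped.
    if "vehicle_id" in anonymized:
        anonymized["vehicle_id"] = hashlib.sha256(
            anonymized["vehicle_id"].encode("utf-8")).hexdigest()[:16]
    return anonymized


# Example of a decided-operation record before storage (field names are hypothetical).
record = {
    "user_id": "user-1",
    "vehicle_id": "car-A",
    "decided_operation": "reduce speed before curve",
    "sudden_braking_count": 2,
    "route": "home -> office",
}
stored = anonymize_operation_record(record)

     Hashing rather than deleting the vehicle identifier is one possible design choice in such a sketch: records remain groupable per vehicle for later learning while the raw identifier is no longer exposed.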
PCT/JP2017/014451 2016-04-22 2017-04-07 Information processing device, information processing method, and program WO2017183476A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2018513108A JP7017143B2 (en) 2016-04-22 2017-04-07 Information processing equipment, information processing methods, and programs
US16/094,032 US20190114558A1 (en) 2016-04-22 2017-04-07 Information processing apparatus, information processing method, and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-086116 2016-04-22
JP2016086116 2016-04-22

Publications (1)

Publication Number Publication Date
WO2017183476A1 true WO2017183476A1 (en) 2017-10-26

Family

ID=60116886

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/014451 WO2017183476A1 (en) 2016-04-22 2017-04-07 Information processing device, information processing method, and program

Country Status (3)

Country Link
US (1) US20190114558A1 (en)
JP (1) JP7017143B2 (en)
WO (1) WO2017183476A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110222093A (en) * 2019-06-12 2019-09-10 中国神华能源股份有限公司 Handle the method, apparatus and storage medium of train data

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06199158A (en) * 1993-01-07 1994-07-19 Mazda Motor Corp Control gain changing device for automobile control device
JPH11325234A (en) * 1998-05-19 1999-11-26 Toyota Motor Corp Control device for vehicle
JP2003049702A (en) * 2001-08-07 2003-02-21 Mazda Motor Corp On-vehicle automobile control-gain changing device, automobile control-gain changing method and automobile control-gain changing program
JP2009244204A (en) * 2008-03-31 2009-10-22 Toyota Motor Corp On-vehicle information terminal, and information providing system for vehicle
JP2011073565A (en) * 2009-09-30 2011-04-14 Nec Software Chubu Ltd Driving assistance system, server device, driving assistance device, and information processing method

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111204298A (en) * 2018-11-21 2020-05-29 丰田自动车北美公司 Vehicle motion adaptive system and method
CN111204298B (en) * 2018-11-21 2023-04-04 丰田自动车北美公司 Vehicle motion adaptive system and method
WO2020225918A1 (en) * 2019-05-09 2020-11-12 本田技研工業株式会社 Agent system, agent server, control method for agent server, and program
JPWO2020225918A1 (en) * 2019-05-09 2020-11-12
CN113748049A (en) * 2019-05-09 2021-12-03 本田技研工业株式会社 Agent system, agent server control method, and program
JP7177922B2 (en) 2019-05-09 2022-11-24 本田技研工業株式会社 Agent system, agent server, agent server control method, and program
CN113748049B (en) * 2019-05-09 2024-03-22 本田技研工业株式会社 Intelligent body system, intelligent body server and control method of intelligent body server
JP2020197959A (en) * 2019-06-04 2020-12-10 富士ゼロックス株式会社 Information processor and program
JP7238610B2 (en) 2019-06-04 2023-03-14 富士フイルムビジネスイノベーション株式会社 Information processing device and program
JP7057904B1 (en) 2020-12-23 2022-04-21 住友ゴム工業株式会社 Tire management system, tire management method
JP2022099879A (en) * 2020-12-23 2022-07-05 住友ゴム工業株式会社 Tire management system and tire management method

Also Published As

Publication number Publication date
US20190114558A1 (en) 2019-04-18
JPWO2017183476A1 (en) 2019-02-28
JP7017143B2 (en) 2022-02-08

Similar Documents

Publication Publication Date Title
WO2017183476A1 (en) Information processing device, information processing method, and program
JP6615840B2 (en) Method and system for recognizing personal driving preferences of autonomous vehicles
JP6606532B2 (en) Method and system for managing vehicle groups for autonomous vehicles
JP6674019B2 (en) Control error correction planning method for operating autonomous vehicles
JP6310531B2 (en) System and method for providing augmented virtual reality content in an autonomous vehicle
EP3620336A1 (en) Method and apparatus for using a passenger-based driving profile
WO2017168883A1 (en) Information processing device, information processing method, program, and system
KR102106875B1 (en) System and method for avoiding accidents during autonomous driving based on vehicle learning
US11358605B2 (en) Method and apparatus for generating a passenger-based driving profile
US11162806B2 (en) Learning and predictive navigation system
US8060297B2 (en) Route transfer between devices
EP3620972A1 (en) Method and apparatus for providing a user reaction user interface for generating a passenger-based driving profile
US11346672B2 (en) Multi-mode route selection
US20240027206A1 (en) Transportation route error detection and adjustment
US20200164897A1 (en) Method and apparatus for presenting a feedforward cue in a user interface before an upcoming vehicle event occurs
CN110853240A (en) Information processing device, riding vehicle adjusting method, and storage medium
US20240003694A1 (en) Determining ridership errors by analyzing provider-requestor consistency signals across ride stages
JP7350814B2 (en) Wireless energy transfer to transportation vehicles based on route data
EP3040682B1 (en) Learning and predictive navigation system
US11137261B2 (en) Method and apparatus for determining and presenting a spatial-temporal mobility pattern of a vehicle with respect to a user based on user appointments
US20210407031A1 (en) Utilizing digital signals to intelligently monitor client device transit progress and generate dynamic public transit interfaces
JP6757230B2 (en) In-vehicle device and awakening system
JP6946456B2 (en) Corner negotiation method for self-driving vehicles that do not require maps and positioning
JP2021086316A (en) Information processing device, information processing system, and program
JP2020194279A (en) Information processing device, information processing method, and program

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 2018513108

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17785818

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17785818

Country of ref document: EP

Kind code of ref document: A1