WO2016121174A1 - Information processing system and control method - Google Patents
Information processing system and control method
- Publication number
- WO2016121174A1 (PCT/JP2015/079708)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- users
- processing system
- relationship
- information processing
- Prior art date
Classifications
- G01C21/26—Navigation; Navigational instruments specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/343—Calculating itineraries, i.e. routes leading from a starting point to a series of categorical destinations using a global route restraint, round trips, touristic trips
- G01C21/3602—Input other than that of destination using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles using a camera
- G01C21/3617—Destination input or retrieval using user history, behaviour, conditions or preferences, e.g. predicted or inferred from previous use or current movement
- G01C21/3679—Retrieval, searching and output of POI information, e.g. hotels, restaurants, shops, filling stations, parking facilities
- G05D1/0088—Control of position, course, altitude or attitude of land, water, air or space vehicles characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
- G06Q10/101—Collaborative creation, e.g. joint development of products or services
- G06Q50/10—Services
- G09B29/007—Representation of non-cartographic information on maps using computer methods
Definitions
- This disclosure relates to an information processing system and a control method.
- Navigation systems that automatically search for and guide a user along a route to a specified destination have been proposed.
- Such a navigation system searches for the route with the shortest travel distance or time from the current location to the designated destination, presents the search result to the user, and provides the user with route guidance based on that result.
- Patent Document 1 discloses a navigation device that proposes detour destinations based on the user's preferences and the recommendation level of facility information.
- The present disclosure proposes an information processing system and a control method capable of generating more optimal guidance information in consideration of the relationship between a plurality of users.
- According to the present disclosure, an information processing system is proposed that includes an estimation unit that estimates the relationship between a plurality of identified users, and a generation unit that generates guidance information for the plurality of users according to the estimated relationship.
- The information processing system estimates the relationship between a plurality of users from their atmosphere, attributes, conversation, face recognition, emotions, or sensitivity values, based on information acquired from various sensors, and presents guidance information suited to the estimated relationship.
- The presented guidance information is route information to the destination, and may be route information via detour destinations along the way (for example, a route via detour destinations 1, 2, and 3, as shown on the right in FIG. 1).
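As an illustration only (not part of the patent), the guidance information above could be modeled as a minimal data structure holding a destination and an ordered list of detour destinations. All names and coordinates below are hypothetical:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Spot:
    name: str
    latitude: float
    longitude: float

@dataclass
class GuidanceInfo:
    destination: Spot
    detours: List[Spot] = field(default_factory=list)

    def waypoints(self) -> List[Spot]:
        # The route passes through each detour destination in order,
        # ending at the destination.
        return self.detours + [self.destination]

# Example: a route with three detour destinations, as in FIG. 1.
dest = Spot("destination", 35.0, 139.0)
info = GuidanceInfo(dest, [Spot("detour 1", 35.1, 139.0),
                           Spot("detour 2", 35.2, 139.1),
                           Spot("detour 3", 35.1, 139.2)])
print([s.name for s in info.waypoints()])
```

A real system would of course carry road-network geometry rather than bare waypoints; this sketch only fixes the ordering of detours before the destination.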
- The guidance information may be displayed and/or output as audio on an information processing terminal carried by the user (for example, a smartphone, mobile phone terminal, tablet terminal, or wearable terminal), or displayed and/or output as audio in the vehicle in which the user is riding. Further, when that vehicle is capable of automatic driving, the vehicle may travel automatically according to the route information given as guidance information.
- In a conventional navigation system, only the route that is shortest in time or distance to the destination entered by the user is automatically searched, and navigation is performed along that route.
- In some cases, however, the user wants to stop at a sightseeing spot, a restaurant, a souvenir shop, or the like on the way to the destination.
- In that case, the user had to search for a detour destination along the route and set the detour destination as a new destination in the navigation system. Searching for a detour destination is time-consuming work for the user, and depending on the user's search skill, the optimal location could not always be found.
- Moreover, the optimal navigation differs depending on the relationship between the multiple users. For example, a route that detours to a family-friendly spot is optimal for a family, while a route that detours to a spot for couples is optimal for lovers. For a work relationship, the route that reaches the destination fastest without any detour is optimal.
- In view of this, the information processing system according to the present embodiment makes it possible to generate more optimal guidance information in consideration of the relationship between a plurality of users.
- Specifically, the information processing system according to the present embodiment automatically searches for detour destinations that suit the relationship between the users near the route to the specified destination, generates guidance information including the detour destinations, and presents it to the users.
- The relationship between the plurality of users is assumed to be, for example, lovers (as shown in FIG. 1), friends, family, a married couple, parent and child, siblings, a work relationship, and the like.
- The relationship between the plurality of users can be estimated from the users' atmosphere, attributes, conversation, face recognition, emotions, or sensitivity values, based on information acquired from various sensors.
- For example, based on captured images of the users' faces taken by a camera of an information processing terminal or an in-vehicle camera, voice data of the users collected by a microphone, or biometric information of the users detected by a biometric sensor, the information processing system estimates the gender and approximate age of the users and estimates the relationship from that combination.
- The information processing system can also estimate the atmosphere based on analysis of conversation content from voice data, voice tone, facial expression recognition from captured images, biometric information, and the like, and estimate the relationship from the atmosphere.
- The information processing system can also perform personal identification by referring to personal information registered in advance, using face recognition based on captured images, speaker recognition based on voice data, biometric authentication based on biometric information, and so on. In this case, pre-registered attributes (age, gender, occupation, etc.) linked to the identified person can be acquired, and the relationship between the users can be estimated according to those attributes.
- Furthermore, the sensitivity values of the users can be acquired based on pre-registered object IDs associated with the identified persons, and the relationship between the users can be estimated according to the sensitivity values.
- The sensitivity value is an index calculated from evaluation values of interactions (actions occurring between objects, such as handling, care, service provision, and conversation) that take place between multiple objects, including persons and things; it quantifies the nature and character of an object or person.
- For example, a person with a low sensitivity value can be judged to be of low credibility, someone who handles things roughly, or a coarse person, while a person with a high sensitivity value can be judged to be reliable, careful with things, or good-natured.
- When calculating a sensitivity value, the evaluation values of interactions stored in association with the target object ID are used. Which range of the interaction history is used depends on the purpose for which the sensitivity value is used.
- In the present embodiment, the sensitivity value is calculated using the evaluation values of past interactions that occurred between the plurality of users, using the object IDs corresponding to the identified users. This makes it possible to estimate a finer-grained relationship between the users (for example, lovers who are currently in a fight).
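The patent does not specify how evaluation values are aggregated into a sensitivity value; as a minimal Python sketch under that assumption, one could average the evaluation values of past interactions between two object IDs. The tuple-based history format and the IDs below are illustrative only:

```python
# Each interaction history entry: (object_id_a, object_id_b, evaluation_value)
def sensitivity_value(history, id_a, id_b):
    """Average the evaluation values of past interactions between two object IDs.

    Returns None when the two objects have no recorded interactions.
    """
    scores = [ev for a, b, ev in history if {a, b} == {id_a, id_b}]
    if not scores:
        return None
    return sum(scores) / len(scores)

history = [
    (1001, 1002, 0.8),   # pleasant conversation between the two users
    (1001, 1002, -0.4),  # quarrel between the same two users
    (1001, 3000, 0.9),   # interaction with a third object: excluded
]
# A low average between two users might, per the text, hint at
# "lovers who are currently in a fight".
print(sensitivity_value(history, 1001, 1002))
```

Restricting the history to interactions between the two identified users mirrors the embodiment's use of a per-pair interaction range.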
- FIG. 2 is a diagram showing the overall configuration of the information processing system according to the present embodiment.
- As shown in FIG. 2, the information processing system according to the present embodiment includes a guidance server 1 that generates guidance information, and an information processing device (for example, a vehicle 2a or a smartphone 2b) that presents the guidance information received from the guidance server 1 to the user.
- The guidance server 1 is connected to the vehicle 2a or the smartphone 2b via the Internet 7, for example.
- In the following, the vehicle 2a is described as the information processing device that presents guidance information.
- The guidance server 1 estimates the relationship between a plurality of users riding in the vehicle 2a and generates guidance information according to the estimated relationship.
- The relationship between the users is estimated based on information about the users detected by sensors (captured images, voice data, biometric information, etc.) and transmitted from the vehicle 2a.
- The guidance information is, for example, route information to a designated destination, and may include detour destination information corresponding to the estimated relationship. Specifically, when the users are lovers, spots for couples are included as detour destination information; when the users are a family, family-friendly spots are included.
- The guidance server 1 transmits the generated guidance information to the vehicle 2a via the Internet 7.
- The vehicle 2a presents the received guidance information to the users by displaying it on a display device provided in the vehicle 2a (for example, a car navigation device) or by outputting audio from a speaker.
- Alternatively, the vehicle 2a may be controlled to travel automatically according to the route information included in the received guidance information.
- Thereby, the passengers of the vehicle 2a can move to the destination via a detour appropriate to their relationship, and can enjoy the trip or drive more.
- Note that the guidance information generated by the guidance server 1 is not limited to car navigation information for the vehicle 2a, and may be navigation information for travel on foot or by bicycle, train, bus, or the like.
- FIG. 3 is a block diagram illustrating an example of the configuration of the guidance server 1 according to the present embodiment.
- The guidance server 1 according to the present embodiment includes a control unit 10, a communication unit 11, and a storage unit 12.
- The control unit 10 controls each component of the guidance server 1.
- The control unit 10 is realized by a microcomputer including a central processing unit (CPU), a read only memory (ROM), a random access memory (RAM), and a nonvolatile memory. Further, as shown in FIG. 3, the control unit 10 according to the present embodiment also functions as a user specifying unit 10a, a relationship estimation unit 10b, a guidance information generation unit 10c, an emotion heat map generation unit 10d, and an environment information map generation unit 10e.
- The user specifying unit 10a specifies a plurality of users riding in the vehicle 2a (acting together) based on the information about the users detected by sensors and received from the vehicle 2a via the communication unit 11.
- The information about the users detected by sensors is, for example, captured images of the users taken by an in-vehicle camera provided in the vehicle 2a, voice data of the users collected by a microphone, or biometric information of the users detected by a biosensor (for example, heart rate, sweat rate, brain waves, body movement, fingerprints, etc.).
- The information about the users may also include inter-terminal communication data, that is, user identification information (name, age, gender, user ID, etc.) that the vehicle 2a can acquire through communication (infrared communication, Wi-Fi (registered trademark), Bluetooth (registered trademark), etc.) with information processing terminals (smartphones, mobile phone terminals, tablet terminals, wearable terminals, etc.) carried by the users.
- The user specifying unit 10a analyzes the passengers' face images, voice data, biometric information, and the like, estimates the gender and approximate age of the passengers (the plurality of users), and performs personal identification where possible.
- Personal identification is performed by comparing the analysis result of a face image with person information registered in advance in the person DB server 6.
- Personal identification may also be performed with reference to personal information acquired from an SNS (Social Networking Service) server 5.
- The user specifying unit 10a may also newly register, in the person DB server 6, personal information (gender, approximate age, name, hobbies/preferences, occupation, dislikes, etc.) obtained through face image analysis, voice recognition, conversation recognition, biometric recognition, and the like.
- The user specifying unit 10a may also register animal information in the person DB, with the animal type (dog, cat, etc.) as an attribute.
- The user specifying unit 10a may also specify the plurality of users based on manually input information, when a user who has boarded the vehicle 2a enters information about the passengers and the vehicle 2a transmits it.
- The user specifying unit 10a outputs information about the specified plurality of users to the relationship estimation unit 10b.
- The relationship estimation unit 10b estimates the relationship of the plurality of users specified by the user specifying unit 10a. For example, the relationship estimation unit 10b estimates a parent-child, sibling, friend, or lover relationship based on the combination of the users' genders and approximate ages. It can also list multiple relationship candidates from the gender and age combination, and then analyze and recognize the conversation content from voice data to narrow down the relationship between the users.
- The relationship estimation unit 10b can also acquire the atmosphere in the vehicle through facial expression recognition based on face images, analysis and recognition of conversation content from voice data, voice tone, or analysis of biometric information, and estimate the relationship of the users on board according to that atmosphere.
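The combination-based candidate listing described above could be sketched as a simple heuristic. This is not the patent's actual logic; the 20-year age-gap threshold and the candidate labels are assumptions chosen purely for illustration:

```python
def relationship_candidates(passengers):
    """List rough relationship candidates from gender / approximate-age combinations.

    passengers: list of (gender, approximate_age) tuples.
    """
    genders = [g for g, _ in passengers]
    ages = [a for _, a in passengers]
    candidates = []
    if len(passengers) == 2:
        if max(ages) - min(ages) >= 20:
            # A large age gap between two passengers suggests parent and child.
            candidates.append("parent and child")
        else:
            if genders[0] != genders[1]:
                candidates += ["lovers", "married couple"]
            candidates += ["friends", "siblings"]
    elif len(passengers) > 2 and max(ages) - min(ages) >= 20:
        # A larger mixed-age group suggests a family.
        candidates.append("family")
    return candidates

print(relationship_candidates([("F", 28), ("M", 30)]))
```

As the text notes, such candidates would then be narrowed down using conversation analysis, atmosphere, or sensitivity values.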
- In addition, the relationship estimation unit 10b can estimate the relationship between the plurality of users based on information related to their emotions.
- The information related to emotions can be acquired through, for example, facial expression recognition based on face images, analysis and recognition of conversation content from voice data, voice tone, or biometric information.
- Further, the relationship estimation unit 10b may acquire each user's sensitivity value from the sensitivity server 3 as information related to the users' emotions, and estimate the relationship between the users based on those sensitivity values.
- Specifically, the relationship estimation unit 10b acquires the object IDs of the identified plurality of users from the person DB server 6, and requests sensitivity values from the sensitivity server 3 using those object IDs.
- In response, the sensitivity server 3 returns the sensitivity value of each object, calculated using the evaluation values of past interactions between the specified objects.
- The relationship estimation unit 10b can also estimate the relationship between the plurality of users according to their attributes (gender, age, name, occupation, etc.) included in the person information acquired from the person DB server 6.
- The relationship estimation unit 10b outputs the estimated relationship between the plurality of users to the guidance information generation unit 10c.
- When multiple relationships are estimated, the relationship estimation unit 10b may select the relationship shared by the largest number of people. For example, suppose five users board the vehicle 2a (a total of five people: four family members and one friend of a child); two are friends, two are siblings, two are a married couple, and four are a family.
- In this case, the relationship estimation unit 10b selects the family relationship, which covers the largest number of people, as the relationship of the plurality of users boarding the vehicle 2a.
- Alternatively, the relationship estimation unit 10b may select the relationship of a person who has strong influence among the plurality of users, such as the owner of the vehicle 2a or an elderly person.
- When information about the passengers is entered manually, the relationship estimation unit 10b may estimate the relationship between the plurality of users based on that manually input information.
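The "largest number of people" selection rule can be sketched in a few lines; this is an illustrative reading of the five-passenger example, with passenger labels invented for the sketch:

```python
def select_dominant_relationship(estimated):
    """Pick the estimated relationship that covers the most passengers.

    estimated: list of (relationship_label, set_of_passenger_ids).
    """
    return max(estimated, key=lambda rel: len(rel[1]))[0]

# The five-passenger example from the text: four family members plus
# one friend of a child. "family" covers four people, the most.
estimated = [
    ("friends",        {"child A", "friend"}),
    ("siblings",       {"child A", "child B"}),
    ("married couple", {"father", "mother"}),
    ("family",         {"father", "mother", "child A", "child B"}),
]
print(select_dominant_relationship(estimated))
```

A tie-breaking policy (e.g. favoring the vehicle owner's relationship, as the text also suggests) would need to be layered on top.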
- The guidance information generation unit 10c generates guidance information for the plurality of users according to the relationship estimated by the relationship estimation unit 10b. For example, it extracts one or more detour destinations that suit the users' relationship near the route to the designated destination, and generates route information via those detour destinations as the guidance information. Specifically, when the users are a family, the guidance information generation unit 10c extracts family-friendly spots as detour destinations; when the users are lovers, it extracts spots for couples.
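To illustrate the detour extraction just described, the following sketch maps a relationship to a spot category and keeps only matching spots near the route. Everything here is an assumption for illustration: the category labels, the grid coordinates, and the Manhattan-distance "near the route" test are not from the patent:

```python
# Hypothetical relationship -> spot-category mapping.
SPOT_CATEGORY_FOR = {
    "family": "for families",
    "lovers": "for lovers",
    "friends": "for friends",
}

def extract_detours(spots, relationship, route, max_dist):
    """Return names of spots of the matching category lying near the route.

    route: list of (x, y) grid points; nearness uses Manhattan distance.
    """
    category = SPOT_CATEGORY_FOR.get(relationship)

    def near(spot):
        return any(abs(spot["x"] - x) + abs(spot["y"] - y) <= max_dist
                   for x, y in route)

    return [s["name"] for s in spots
            if s["category"] == category and near(s)]

spots = [
    {"name": "amusement park", "category": "for families", "x": 2, "y": 1},
    {"name": "night-view spot", "category": "for lovers",  "x": 2, "y": 2},
    {"name": "zoo",            "category": "for families", "x": 9, "y": 9},
]
route = [(0, 0), (1, 1), (2, 2), (3, 3)]
print(extract_detours(spots, "family", route, max_dist=1))
```

The zoo is excluded despite matching the family category because it lies far from the route, which is the core of the "near the route" condition in the text.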
- The guidance information generation unit 10c may also generate guidance information optimal for the users' relationship by using the emotion heat map generated by the emotion heat map generation unit 10d and the environment information map generated by the environment information map generation unit 10e. The generation of guidance information using the emotion heat map and the environment information map will be described later with reference to FIGS.
- The emotion heat map generation unit 10d generates an emotion heat map in which information related to emotion is mapped in association with position.
- The information related to emotion includes sensor values detected by a biosensor or the like, or sensitivity values acquired from the sensitivity server 3.
- The interaction information stored in the sensitivity server 3 is associated with location information in addition to the date and time when each interaction occurred (see FIGS. 10 and 11), so a sensitivity value can be calculated based on the history of interactions performed in areas near the route to the destination, as well as a sensitivity value for each spot (destination). Details of emotion heat map generation will be described later with reference to FIGS.
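The location-indexed aggregation behind such a heat map can be sketched by binning interaction evaluation values into grid cells and averaging per cell. The cell size, the tuple record format, and the coordinates are illustrative assumptions, not the patent's method:

```python
from collections import defaultdict

def emotion_heat_map(interactions, cell=0.01):
    """Average interaction evaluation values per lat/lon grid cell.

    interactions: iterable of (latitude, longitude, evaluation_value).
    Returns {(cell_row, cell_col): mean_evaluation}.
    """
    sums = defaultdict(float)
    counts = defaultdict(int)
    for lat, lon, ev in interactions:
        key = (round(lat / cell), round(lon / cell))  # snap to grid cell
        sums[key] += ev
        counts[key] += 1
    return {key: sums[key] / counts[key] for key in sums}

interactions = [
    (35.681, 139.767, 0.9),   # pleasant interactions near one spot
    (35.681, 139.767, 0.7),
    (35.690, 139.700, -0.5),  # a negative interaction elsewhere
]
heat = emotion_heat_map(interactions)
for key, value in sorted(heat.items()):
    print(key, value)
```

Cells with high averages would mark areas where past interactions were positive, which the guidance generation could then favor when choosing detour destinations.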
- The environment information map generation unit 10e generates an environment information map in which information related to the environment is mapped in association with position.
- The information related to the environment includes spot information acquired from the spot information server 4 (information on sightseeing spots, restaurants, shops, rest areas, parks, etc.).
- The spot information stored in the spot information server 4 includes not only basic information such as each spot's location, business hours, and admission fee, but also target-user information (for families, for lovers, for friends) and spot feature information (beautiful night view, suitable for children, terrace seats, pets allowed, places where children can have fun, etc.).
- In accordance with an instruction from the guidance information generation unit 10c, the environment information map generation unit 10e extracts spots matching filter conditions such as relationship, attributes (age, gender, hobbies/preferences, etc.), time, weather, and spot category, and generates the environment information map. Details of environment information map generation will be described later with reference to FIG.
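The multi-condition spot filtering above can be sketched as a generic predicate over spot records. The record fields (`target`, `features`, `open`) and the condition semantics are assumptions for this sketch only:

```python
def build_environment_map(spots, **conditions):
    """Filter spots by arbitrary conditions (relationship, time, category, ...).

    A spot matches when, for every given condition, the spot's record lists
    that value under the corresponding key.
    """
    return [spot for spot in spots
            if all(value in spot.get(key, ())
                   for key, value in conditions.items())]

spots = [
    {"name": "terrace cafe",
     "target": ("for lovers", "for friends"),
     "features": ("terrace seats", "pets allowed"),
     "open": ("day", "night")},
    {"name": "playground",
     "target": ("for families",),
     "features": ("children can have fun",),
     "open": ("day",)},
]
selected = build_environment_map(spots, target="for lovers", open="night")
print([s["name"] for s in selected])
```

Keyword arguments keep the filter open-ended, matching the text's list of conditions (relationship, attributes, time, weather, spot category) without hard-coding any particular one.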
- The communication unit 11 transmits and receives data to and from external devices.
- For example, the communication unit 11 connects to the sensitivity server 3, the spot information server 4, the SNS server 5, and the person DB server 6 to transmit and receive various data.
- The communication unit 11 also connects to the vehicle 2a to receive information about the users detected by sensors and navigation setting information (destination selection, desired arrival time, preferred roads, etc.), and transmits the guidance information generated by the guidance information generation unit 10c to the vehicle 2a.
- The storage unit 12 stores programs for the various processes performed by the control unit 10.
- The storage unit 12 may also store the emotion heat map generated by the emotion heat map generation unit 10d and the environment information map generated by the environment information map generation unit 10e.
- The configuration of the guidance server 1 has been specifically described above.
- Note that the configuration of the guidance server 1 shown in FIG. 3 is an example, and the configuration of the guidance server 1 according to the present embodiment is not limited to this.
- For example, the spot information stored in the spot information server 4 and the person information stored in the person DB server 6 may instead be stored in a storage area inside the guidance server 1, that is, in the storage unit 12.
- FIG. 4 is a block diagram showing an example of the configuration of the vehicle 2a according to the present embodiment.
- The vehicle 2a includes a control unit 20, a communication unit 21, a host vehicle position acquisition unit 22, an in-vehicle camera 23, a microphone 24, an operation input unit 25, a display unit 26, a biosensor 27, an automatic travel control unit 28, and a storage unit 29.
- The control unit 20 is constituted by, for example, a microcomputer including a CPU, a ROM, a RAM, a nonvolatile memory, and an interface unit, and controls each component of the vehicle 2a.
- The control unit 20 controls the transmission to the guidance server 1, via the communication unit 21, of information about the passengers detected by sensors, such as captured images taken by the in-vehicle camera 23, voice data collected by the microphone 24, and biometric information detected by the biosensor 27.
- The control unit 20 may also control the transmission to the guidance server 1 of passenger identification information (gender, age, name, hobbies/preferences, user ID, object ID, etc.) input from the operation input unit 25.
- The control unit 20 may also control the transmission to the guidance server 1 of the navigation setting information input via the operation input unit 25, specifically information about the destination, desired arrival time, preferred roads, and so on.
- The control unit 20 may control the display unit 26 to display the guidance information received from the guidance server 1 via the communication unit 21, or may control audio output from a speaker (not shown).
- The control unit 20 may also instruct the automatic travel control unit 28 to travel automatically according to the route included in the guidance information received from the guidance server 1.
- the communication unit 21 transmits / receives data to / from an external device.
- the communication unit 21 is connected to the guidance server 1 and transmits information on the passenger detected by the sensor, navigation setting information, and the like, and receives guidance information generated by the guidance server 1.
- the own vehicle position acquisition unit 22 has a function of detecting the current position of the vehicle 2a based on an external acquisition signal.
- the host vehicle position acquisition unit 22 is realized by, for example, a GPS (Global Positioning System) positioning unit that receives radio waves from GPS satellites, detects the position where the vehicle 2a exists, and outputs the detected position information to the control unit 20.
- besides GPS, the host vehicle position acquisition unit 22 may detect the position by transmission / reception over, for example, Wi-Fi (registered trademark), Bluetooth (registered trademark), or other short-range communication.
- the in-vehicle camera 23 is a camera that images the inside of the vehicle 2a and, for example, images the face of a passenger sitting in each seat.
- the installation position and the number of the in-vehicle camera 23 are not particularly limited.
- the microphone 24 has a function of collecting voice inside the vehicle 2a, and for example, picks up a passenger's conversation.
- the installation position and the number of the microphones 24 are not particularly limited.
- the operation input unit 25 receives an input of a user operation and outputs it to the control unit 20 as input information.
- the operation input unit 25 may be a touch panel integrated with a display unit 26 provided near the driver's seat of the vehicle 2a.
- the operation input unit 25 may analyze the user's captured image captured by the in-vehicle camera 23 to enable gesture input, or may analyze the user's voice collected by the microphone 24 to enable speech input.
- the display unit 26 displays a menu screen and a navigation screen, and is realized by, for example, a liquid crystal display.
- the display unit 26 is provided near the driver's seat.
- the display unit 26 displays the guidance information transmitted from the guidance server 1.
- the display unit 26 may also be implemented by other display means.
- the biometric sensor 27 detects biometric information of a user who gets on the vehicle 2a.
- one or a plurality of biosensors 27 are provided on the steering wheel of the vehicle 2a, the door handles, the window opening / closing controls, the seats, or the headrests, and detect body temperature, sweating amount, heart rate, brain waves, fingerprints, and the like.
- the automatic traveling control unit 28 has a function of controlling the traveling of the vehicle 2a to realize automatic driving that does not depend on the driver's operation. Specifically, the automatic travel control unit 28 controls the vehicle 2a to travel according to the guidance information received from the guidance server 1. When the guidance information is route guidance to the destination via a detour, the automatic travel control unit 28 controls the vehicle 2a to travel along the route via the detour indicated by the guidance information. In addition, when performing automatic traveling, the automatic traveling control unit 28 performs accelerator control, brake control, steering control, and the like of the vehicle 2a according to the acquired situation outside the vehicle (for example, surrounding captured images, object detection information, etc.).
- the storage unit 29 stores a program for the control unit 20 to execute each process.
- the storage unit 29 may store information on passengers of the vehicle 2a and guidance information transmitted from the guidance server 1.
- the specific configuration of the vehicle 2a according to this embodiment has been described above.
- the configuration of the vehicle 2a shown in FIG. 4 is an example, and the present embodiment is not limited to this.
- the configuration may be such that the automatic travel control unit 28 is not provided, and another sensor (for example, an infrared camera, a depth camera, or the like) that acquires information about the passenger may be provided.
- FIG. 5 is a block diagram illustrating an example of the configuration of the sensitivity server 3 according to the present embodiment.
- the sensitivity server 3 includes a control unit 30, a communication unit 31, an object DB 32, and a sensitivity information DB 34.
- the communication unit 31 is connected to the guidance server 1 via the network, and returns the sensitivity value of the object requested from the guidance server 1. Further, the communication unit 31 receives interaction information from a sensing device (not shown) attached / mounted on each object (including all persons and objects).
- the control unit 30 controls each component of the sensitivity server 3.
- the control unit 30 is realized by a microcomputer including a CPU, a ROM, a RAM, and a nonvolatile memory. As shown in FIG. 5, the control unit 30 according to the present embodiment functions as an interaction storage control unit 30a, an evaluation unit 30b, an object management unit 30c, a related object search unit 30d, and a sensitivity value calculation unit 30e.
- the interaction storage control unit 30a performs control so that the interaction information received from the sensing device mounted / mounted on the object is stored in the sensitivity information DB 34.
- the sensing device is realized by a humidity sensor, a temperature sensor, a vibration sensor, an infrared sensor, a camera, a tactile sensor, a gyro sensor, or the like, and detects an interaction from another object with respect to the object.
- An interaction is an action between objects; examples between people include conversation, telephone calls, e-mail, and gift giving, while examples between a person and an object include care, storage, cleaning, appreciation, and wearing.
- the evaluation unit 30b evaluates the interaction stored in the sensitivity information DB 34.
- the interaction evaluation method is not particularly limited; for example, the evaluation unit 30b gives a higher evaluation the more preferable the interaction is for the object that received it, specifically assigning a score from -1.0 to 1.0.
- the evaluation result is stored in the sensitivity information DB 34 in association with the interaction.
- the object management unit 30c performs management such as registration, change, and deletion of information related to objects stored in the object DB 32.
- the related object search unit 30d searches the object DB 32 and the sensitivity information DB 34 for other objects that have interacted with the object of the requested object ID, and returns them as related objects.
- the sensitivity value calculation unit 30e calculates the sensitivity value of the target user based on the interaction evaluation associated with the object ID of the target user. For example, the sensitivity value calculation unit 30e may calculate the total sensitivity value of the target user based on the sum of the interaction evaluation values, or may calculate the total sensitivity value of the target user based on the average value of the interaction evaluation values.
- the sensitivity value calculation unit 30e may calculate the sensitivity value using only predetermined interactions, or by weighting predetermined interactions, according to how the requester will use the sensitivity value. For example, when the sensitivity value is used by the guidance server 1 to estimate the relationship between a plurality of users, the sensitivity value calculation unit 30e calculates the sensitivity value using the past interaction history between the designated objects (that is, between the plurality of users).
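- the calculation just described can be illustrated with the following sketch (Python is used purely for illustration; the function and field names are assumptions and are not part of the embodiment):

```python
def total_sensitivity(evaluations, method="average"):
    """Combine interaction evaluation scores (-1.0 to 1.0) into one value,
    by sum or by average, as described for the sensitivity value calculation
    unit 30e."""
    if not evaluations:
        return 0.0
    if method == "sum":
        return sum(evaluations)
    return sum(evaluations) / len(evaluations)

def sensitivity_between(interactions, object_ids):
    """Restrict to past interactions between the designated objects
    (e.g. between a plurality of users), then average their evaluations."""
    scores = [i["evaluation"] for i in interactions
              if {i["object_id"], i["related_object_id"]} <= set(object_ids)]
    return total_sensitivity(scores)
```

As a usage example, interactions between users 1 and 2 would be averaged while an interaction between user 1 and an unrelated object 3 is ignored.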
- the object DB 32 is a storage unit that stores an object ID of each object.
- the object DB 32 stores information such as the name, age, sex, service type, service company, product name, product type, manufacturer ID, model number, and manufacturing date and time of the object in association with the object ID.
- the sensitivity information DB 34 is a storage unit that stores interaction information between objects and evaluation values. Specifically, the sensitivity information DB 34 stores, in association with the object ID, the date / time and place where the interaction was performed, the related object ID indicating the counterpart of the interaction, the interaction type, the details of the interaction, and the interaction evaluation.
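- the record layout described for the sensitivity information DB 34 could be sketched as a simple data structure; the field names below are illustrative assumptions, not the actual schema:

```python
from dataclasses import dataclass

@dataclass
class InteractionRecord:
    object_id: int          # object that performed (or received) the interaction
    related_object_id: int  # counterpart of the interaction
    date_time: str          # date / time when the interaction was performed
    place: str              # place where the interaction was performed
    interaction_type: str   # e.g. "conversation", "gift", "cleaning"
    detail: str             # details of the interaction
    evaluation: float       # score from -1.0 to 1.0 assigned by the evaluation unit 30b

# hypothetical record: a person (ID 384) eating and drinking at a spot (ID 405)
rec = InteractionRecord(384, 405, "2015-10-01 19:30", "35.66,139.70",
                        "eating and drinking", "dinner at restaurant XXX", 0.7)
```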
- the configuration of the sensitivity server 3 according to the present embodiment has been specifically described above.
- the configuration of the sensitivity server 3 is not limited to the example illustrated in FIG. 5, and for example, the object DB 32 and the sensitivity information DB 34 may be stored in an external storage device on the network.
- the sensitivity value calculation unit 30e of the sensitivity server 3 may instead be provided in the guidance server 1, and the sensitivity value may be calculated by the guidance server 1 based on the interaction evaluations between a plurality of users acquired from the sensitivity server 3.
- FIG. 6 is a flowchart showing guidance information generation processing in the information processing system according to the present embodiment.
- in step S103, the user specifying unit 10a of the guidance server 1 identifies the plurality of users boarding the vehicle 2a based on the sensor-detected information related to the passengers transmitted from the vehicle 2a (specifically, for example, captured images, audio data, biometric information, etc.).
- in step S106, the relationship estimation unit 10b estimates the relationship between the plurality of users specified by the user specifying unit 10a. Details of the relationship estimation between multiple users will be described later with reference to FIG. 13.
- in step S109, the guidance information generation unit 10c determines whether or not the purpose of movement of the plurality of users is suitable for taking a detour.
- the purpose of movement of a plurality of users can be estimated from the relationship, attributes, designated destination, etc. of the plurality of users. For example, the guidance information generation unit 10c estimates that the purpose of movement is a family trip when the relationship between a plurality of users is a parent-child relationship or a sibling / sister relationship and the destination is a tourist destination that is somewhat distant. Further, the guidance information generation unit 10c estimates the purpose of movement as a date when the relationship between a plurality of users is a lover relationship and the destination is a sightseeing spot that is somewhat distant.
- the guidance information generation part 10c presumes the purpose of movement to be a friend trip when the relationship between a plurality of users is a friendship and the destination is a sightseeing spot far away to some extent.
- the guidance information generation unit 10c estimates the purpose of movement as work when the relationship between a plurality of users is work relation and the destination is a company or a store.
- the guidance information generation unit 10c estimates the purpose of movement as shopping when the destination is a nearby store, and estimates the purpose of movement as a meal when the destination is a nearby eating place such as a restaurant or a pub.
- the guidance information generation unit 10c estimates the purpose of movement as a ceremonial occasion when the destination is, for example, a ceremonial hall.
- the guidance information generation unit 10c judges that it is not appropriate to take a detour when the purpose of movement requires arriving at the destination early, such as "work", "shopping", or "ceremonial occasion".
- the guidance information generation unit 10c judges that it is appropriate to take a detour when the purpose of movement is one in which the way to the destination can also be enjoyed, such as "family trip", "friend trip", or "date".
- when the purpose of movement is input by the user, the guidance information generation unit 10c determines whether or not it is suitable to take a detour based on the input purpose of movement.
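- the estimation rules above can be sketched as a simple lookup; the relationship and destination-category labels below are illustrative assumptions chosen to mirror the examples in the text, not an exhaustive rule set:

```python
def estimate_purpose(relationship, destination_category):
    """Estimate the purpose of movement from the users' relationship and the
    designated destination, following the rules described above."""
    if destination_category == "distant tourist spot":
        return {"parent-child": "family trip", "siblings": "family trip",
                "lovers": "date", "friends": "friend trip"}.get(relationship, "leisure")
    if destination_category in ("company", "store"):
        return "work" if relationship == "work" else "shopping"
    if destination_category == "restaurant":
        return "meal"
    if destination_category == "ceremonial hall":
        return "ceremonial occasion"
    return "unknown"

def detour_suitable(purpose):
    """A detour suits purposes where the way itself can be enjoyed."""
    return purpose in ("family trip", "friend trip", "date")
```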
- when it is determined that a detour is not suitable, the guidance information generation unit 10c may search only the base route that arrives at the destination in the shortest time / distance and provide it to the vehicle 2a.
- in step S112, the guidance information generation unit 10c judges whether or not there is time to spare before reaching the destination specified by the user.
- the time margin to the destination can be calculated based on, for example, the desired arrival time of the destination input by the user and the time required for movement to the destination.
- the time margin here means enough spare time to stop at a detour, for example one hour or more, and the threshold can be specified by the user.
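- the time-margin check of step S112 might be sketched as follows; the one-hour threshold comes from the text, while the function name and arguments are assumptions:

```python
from datetime import datetime, timedelta

def has_time_margin(desired_arrival, travel_time, now, threshold=timedelta(hours=1)):
    """True when (desired arrival - now - required travel time) leaves at
    least `threshold` of spare time to stop at a detour."""
    margin = desired_arrival - now - travel_time
    return margin >= threshold

# hypothetical situation: departure at 9:00, desired arrival at 12:00
now = datetime(2015, 10, 1, 9, 0)
arrival = datetime(2015, 10, 1, 12, 0)
```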
- when there is no time to spare, the guidance information generation unit 10c may likewise search only the base route that arrives at the destination in the shortest time / distance and provide it to the vehicle 2a, and the generation process of the guidance information including the detour destination ends.
- in step S115, the guidance information generation unit 10c searches for a base route from the current location to the destination. For example, the guidance information generation unit 10c searches for the base route that arrives at the destination in the shortest time / distance, taking into account the conditions input by the user (toll-road priority, general-road priority, distance priority, traffic jam information, etc.). Note that the processing in step S115 may be performed before S109 and S112, that is, immediately after step S106.
- next, the guidance information generation unit 10c acquires from the environment information map generation unit 10e an environment information map of a certain area along the searched base route. Specifically, the guidance information generation unit 10c instructs the environment information map generation unit 10e to generate the map, designating an area within a certain range along the base route and a filter condition according to the relationship among the plurality of users.
- the guidance information generation unit 10c may designate a certain range of area along the base route that is scheduled to be passed after a predetermined time has elapsed from the departure point. For example, when a child rides in the vehicle 2a, it is assumed that the child will get tired about one hour after departure, so by targeting that area, the optimum guidance information can be presented to the user.
- when the relationship between the plurality of users is a "parent-child relationship", the guidance information generation unit 10c sets "for family" as the filter condition, and when the relationship between the plurality of users is a "lover relationship", it sets "for lover" as the filter condition.
- the filter conditions specified to the environment information map generation unit 10e are not limited to conditions according to the relationship among the plurality of users; for example, conditions based on the attributes of the plurality of users (age, gender, hobbies / preferences, occupation, etc.), the time, or the weather may be added.
- for example, the expected time at which the users will pass through the designated area and the weather in the designated area may be used as conditions.
- further, when a pet is on board, a filter condition such as "pets allowed" may be added, and when a conversation such as "I am hungry" is recognized from the speech recognition result, or when the meal time specified by the user is near, filter conditions such as "restaurant" may be added.
- the environment information map 40 shown in FIG. 7 is generated by filtering with, for example, the filter condition "for lovers" according to the relationship of the plurality of users and the date / time / weather filter condition "rain, temperature 18 degrees, 9:00 pm". Specifically, the environment information map generation unit 10e acquires information on spots existing in the designated area (a certain range along the base route) from the spot information server 4, and extracts spot information matching the condition "for lovers" with reference to the basic information, feature information, and the like of each spot included in the spot information. In addition, the environment information map generation unit 10e extracts spots that are open even in rainy weather according to the condition "weather: rain".
- further, the environment information map generation unit 10e extracts indoor spots, since it is chilly outside, according to the condition "temperature: 18 degrees". Furthermore, the environment information map generation unit 10e extracts restaurants and bars that are open after 9:00 pm according to the condition "time: 9:00 pm". As a result, as shown in FIG. 7, spots 400, 401, and 402 that match the filter condition "for lovers, rain, temperature 18 degrees, 9:00 pm" are extracted.
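- the filtering illustrated for FIG. 7 ("for lovers, rain, temperature 18 degrees, 9:00 pm") might look like the following sketch; the spot attribute names, the "below 20 degrees means indoor" rule, and the sample data are all illustrative assumptions:

```python
def filter_spots(spots, audience, weather, temperature_c, hour):
    """Keep spots matching the audience tag, open in the given weather,
    indoors when it is chilly outside, and open at the given hour."""
    result = []
    for s in spots:
        if audience not in s["tags"]:
            continue
        if weather == "rain" and not s["open_in_rain"]:
            continue
        if temperature_c < 20 and not s["indoor"]:
            continue
        if not (s["open_hour"] <= hour < s["close_hour"]):
            continue
        result.append(s)
    return result

# hypothetical spot records
spots = [
    {"name": "bar A", "tags": ["for lovers"], "open_in_rain": True,
     "indoor": True, "open_hour": 18, "close_hour": 24},
    {"name": "open-air zoo", "tags": ["for family"], "open_in_rain": False,
     "indoor": False, "open_hour": 9, "close_hour": 17},
]
```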
- the environment information map 40 may be generated by the environment information map generation unit 10e of the guidance server 1 as shown in FIG. 3, or may be acquired from an external information providing server (not shown) via the network.
- the guidance information generation unit 10c acquires an emotion heat map of a certain area along the searched base route from the emotion heat map generation unit 10d.
- by referring to the emotion heat map, it is possible to grasp the sensitivity values of each spot and of the persons who are active in the surrounding area.
- the guidance information generation unit 10c instructs the emotion heat map generation unit 10d to generate an emotion heat map, designating each spot within a certain range along the base route (each spot extracted when the environment information map was generated) or an area of a certain range.
- FIG. 8 shows an example of the emotion heat map according to the present embodiment.
- the emotion heat map 42 shown on the left in FIG. 8 is generated using the sensitivity values of the spots in a certain range area along the base route.
- the guidance information generation unit 10c can select an appropriate detour for multiple users according to the sensitivity value of each spot.
- the sensitivity value of each spot used when generating the emotion heat map 42 may be a sensitivity value calculated based on the interactions between that spot and persons having the same attributes as the plurality of users (age, sex, occupation, hobbies / preferences, etc.).
- for example, when the shared attribute of the plurality of users is "adult (age: 20 years old or older)", sensitivity values calculated based on the interactions generated between persons having that attribute (that is, adults) and each spot (the spots 400, 401, and 402 extracted when the environment information map was generated) are used.
- in this way, it is possible to grasp the spots where persons having the same attributes as the users to whom guidance information is provided can spend time pleasantly and comfortably.
- the sensitivity value of each spot can be acquired from the sensitivity server 3.
- specifically, the emotion heat map generation unit 10d acquires the object ID of each spot (for example, the restaurant "XXX", the bar "○○", etc.) from the spot information server 4, and transmits the acquired object IDs and the attribute information of the plurality of users to the sensitivity server 3 to request sensitivity values.
- on the sensitivity server 3 side, the sensitivity value calculation unit 30e calculates the sensitivity value using the past interaction history generated between the object IDs specified in the request from the guidance server 1 and persons having the specified attribute, and returns the calculated sensitivity value to the guidance server 1.
- the sensitivity value calculation unit 30e first refers to the detailed object information stored in the object DB 32, and extracts the object ID of a person having a specified attribute.
- the detailed object information includes the object ID, the object name (name, store name, product name, etc.), the object type (person, restaurant, bar, etc.), and attribute information (age, gender, occupation, hobbies / preferences, store category, store opening hours, store location, etc.).
- for example, when the attribute specified by the guidance server 1 is "adult (age: 20 years old or more)", the sensitivity value calculation unit 30e extracts, as objects having that attribute, a person AAA (object ID: 384), a person BBB (object ID), and so on.
- next, the sensitivity value calculation unit 30e extracts from the sensitivity information DB 34 only the past interaction information generated between the extracted object IDs of those persons and the object IDs requested by the guidance server 1 (the spot object IDs).
- alternatively, the sensitivity value calculation unit 30e may first extract the persons whose interactions with the object ID of a spot have been detected, and then extract the persons having the specified attribute from among them, thereby extracting the past interaction information generated between persons having the predetermined attribute and the spot.
- further, the function of searching for persons having a predetermined attribute who have interacted with a predetermined spot may be provided on the guidance server 1 side, while the sensitivity server 3 side performs the process of extracting the interaction information between the persons specified by the guidance server 1 and the predetermined spot.
- an example of the interaction information extracted from the sensitivity information DB 34 is shown in FIG. In this case, the evaluation values of interactions generated between each spot and objects that are persons having the predetermined attribute (here, "adult (age: 20 years old or more)" as an example), such as providing meals / drinks or eating and drinking, are extracted.
- the sensitivity value calculation unit 30e calculates the sensitivity value of each spot based on the interaction information extracted as described above.
- the method for calculating the sensitivity value is not particularly limited; for example, the average of the interaction evaluation values may be calculated, or weighting may be performed according to the date and time of each interaction so that more recent interactions have greater influence.
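- one hedged sketch of the recency weighting mentioned above, using an exponential decay; the half-life value and function name are assumptions, not part of the embodiment:

```python
from datetime import datetime

def weighted_sensitivity(interactions, now, half_life_days=180.0):
    """Average interaction evaluations with exponential decay so that
    recent interactions have greater influence than older ones.
    `interactions` is a list of (datetime, evaluation) pairs."""
    total = weight_sum = 0.0
    for when, score in interactions:
        age_days = (now - when).days
        w = 0.5 ** (age_days / half_life_days)  # halves every half_life_days
        total += w * score
        weight_sum += w
    return total / weight_sum if weight_sum else 0.0
```

For example, a year-old negative evaluation is largely outweighed by a recent positive one.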
- the emotion heat map generation unit 10d generates the emotion heat map 42 as shown on the left side of FIG. 8 using the sensitivity values of the spots described above.
- in addition, the emotion heat map generation unit 10d can also generate an emotion heat map 43 that uses the sensitivity values of the persons who are active in the surrounding area, as shown on the right in FIG. 8.
- thereby, the guidance information generation unit 10c can grasp the areas where people with high sensitivity values, that is, people of good character, gather, and the areas where people with low sensitivity values, that is, people of poor character, gather, and can select an appropriate detour for the plurality of users.
- the emotion heat map 43 shown on the right side of FIG. 8 is generated using a sensitivity value calculated based on the interaction evaluation associated with the area around the base route.
- the emotion heat map generation unit 10d requests the sensitivity server 3 for a sensitivity value by designating a base route peripheral area.
- in response, the sensitivity value calculation unit 30e of the sensitivity server 3 extracts from the sensitivity information DB 34 only the interaction evaluations performed in the designated area, and calculates the sensitivity value of the persons who are active in that area.
- an example of the interaction information extracted from the sensitivity information DB 34 is shown in FIG. As shown, the interaction information includes the object ID indicating the object that performed (or received) the interaction, the date / time and place where the interaction was performed, the related object ID indicating the object that received (or performed) the interaction, the interaction type, the details, and the evaluation value.
- since the interaction information includes the place, the sensitivity value calculation unit 30e can extract the interaction information performed in the designated area. The sensitivity value calculation unit 30e then calculates the sensitivity value of the predetermined area (which may be the sensitivity value of the persons active in the predetermined area) based on the evaluation values of the extracted interactions.
- the method for calculating this sensitivity value is also not particularly limited; for example, the average of the interaction evaluation values may be calculated, and weighting may be performed according to the date and time of each interaction so that more recent interactions have greater influence.
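- extracting and averaging only the interactions performed inside a designated area, as described above, might look like this sketch; the rectangular latitude / longitude bounds and the pair layout are simplifying assumptions:

```python
def area_sensitivity(interactions, lat_range, lon_range):
    """Average the evaluations of interactions whose place falls inside the
    designated rectangular area around the base route.
    `interactions` is a list of ((lat, lon), evaluation) pairs."""
    scores = [ev for (lat, lon), ev in interactions
              if lat_range[0] <= lat <= lat_range[1]
              and lon_range[0] <= lon <= lon_range[1]]
    return sum(scores) / len(scores) if scores else 0.0
```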
- the emotion heat map generation unit 10d generates an emotion heat map 43 as shown on the right in FIG. 8 using the sensitivity values of the person who is active in the predetermined area calculated by the sensitivity value calculation unit 30e of the sensitivity server 3.
- in the emotion heat map 43 shown on the right side of FIG. 8, areas with higher sensitivity values are indicated by lighter colors; that is, the sensitivity value of area 410 is the highest, that of area 411 is the next highest, and that of area 412 is the lowest.
- the generation of the emotion heat map by the emotion heat map generation unit 10d has been specifically described above.
- note that the emotion heat map is not limited to being generated by the emotion heat map generation unit 10d of the guidance server 1; it may be acquired from an external information providing server (not shown) via a network.
- in step S124, the guidance information generation unit 10c acquires an integrated heat map obtained by integrating the environment information map and the emotion heat map.
- the integrated heat map according to the present embodiment will be described with reference to FIG.
- FIG. 12 is a diagram illustrating an example of an integrated heat map according to the present embodiment.
- the integrated heat map 45 shown in FIG. 12 is generated by synthesizing the environment information map 40 shown in FIG. 7, the emotion heat map 42 shown on the left in FIG. 8, and the emotion heat map 43 shown on the right in FIG.
- by referring to the integrated heat map 45, the guidance information generation unit 10c can grasp, within a certain range along the base route, the location of each spot that is a candidate detour destination suited to the relationship among the plurality of users, the sensitivity value of each spot, and the sensitivity value of the surrounding area.
- note that the integrated heat map according to the present embodiment may be generated by combining the emotion heat map 42 and the emotion heat map 43, or by combining the environment information map 40 with either the emotion heat map 42 or the emotion heat map 43.
- next, in step S127, the guidance information generation unit 10c extracts the spots whose total points are greater than or equal to a predetermined threshold in the integrated heat map.
- the total points of a spot are calculated based on, for example, the degree of match with the filter conditions used when generating the environment information map 40, the sensitivity value of the spot indicated by the emotion heat map 42, and the sensitivity value of the spot's surrounding area indicated by the emotion heat map 43. Specifically, for example, the total points of the spot 401 in the integrated heat map 45 shown in FIG. 12 are calculated taking into account the low sensitivity value of the surrounding area 412 in addition to the sensitivity value of the spot itself.
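- the total-point calculation just described could be sketched as a weighted sum of the three factors; the weights and the function name are assumptions, since the embodiment does not specify them:

```python
def spot_total_points(filter_match, spot_sensitivity, area_sensitivity,
                      weights=(0.4, 0.4, 0.2)):
    """Combine the degree of match with the filter conditions, the spot's own
    sensitivity value, and the surrounding area's sensitivity value.
    A low surrounding-area value drags the total down, as described for
    spot 401 and area 412."""
    w1, w2, w3 = weights
    return w1 * filter_match + w2 * spot_sensitivity + w3 * area_sensitivity
```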
- in step S130, the guidance information generation unit 10c sorts the spots in descending order of total points and generates a detour destination candidate list.
- in step S133, the guidance information generation unit 10c acquires the spot with the highest total points from the generated detour destination candidate list.
- in step S136, the guidance information generation unit 10c determines whether or not the category of the acquired spot (restaurant, bar, zoo, etc.) is the same as a category to which the plurality of users have already made a detour within a certain period.
- the spot category can be acquired from the spot information server 4.
- when the categories are the same, in step S139 the guidance information generation unit 10c deletes the acquired spot from the detour destination candidate list.
- in step S142, the guidance information generation unit 10c determines whether or not other spots remain in the detour destination candidate list.
- when no spots remain, in step S145 the guidance information generation unit 10c lowers the predetermined threshold for extracting spots by a predetermined value, returns to step S127, and performs spot extraction using the new threshold.
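- the candidate-selection loop of steps S127 to S145 might be sketched as follows; the spot field names and the threshold decrement are assumptions, not values from the embodiment:

```python
def pick_detour(spots, recent_categories, threshold, step=0.1):
    """Pick the highest-scoring spot whose category the users have not
    already visited recently; if none remain, lower the threshold and retry."""
    while threshold > 0:
        # S127 / S130: extract spots at or above the threshold, sort descending
        candidates = sorted((s for s in spots if s["points"] >= threshold),
                            key=lambda s: s["points"], reverse=True)
        # S133 - S139: skip candidates in a recently visited category
        for spot in candidates:
            if spot["category"] not in recent_categories:
                return spot
        # S142 / S145: nothing usable remains -> lower the threshold and retry
        threshold -= step
    return None
```

As a usage example, if the users recently visited a bar, a lower-scoring zoo is chosen over a higher-scoring bar.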
- when a spot can be acquired from the candidate list, the guidance information generation unit 10c generates guidance information with the acquired spot as the detour destination. Specifically, the guidance information generation unit 10c generates guidance information for guiding a route that arrives at the destination via the detour destination. Alternatively, the guidance information generation unit 10c may present detailed information on the detour destination and generate guidance information that lets the user select whether or not to change the route, or may present a plurality of spots as detour destination candidates and generate guidance information that lets the user choose among them. Further, when a desired arrival time at the destination is specified by the user, the guidance information generation unit 10c may generate guidance information including the maximum stayable time at the detour, based on the remaining time and the travel time.
- in step S151, the guidance server 1 transmits the generated guidance information to the vehicle 2a.
- in the vehicle 2a, the guidance information is displayed on the display unit 26 or output as voice from a speaker (not shown).
- in addition, the vehicle 2a may present information about a spot by voice or video when traveling near a tourist attraction or another noteworthy spot.
- further, when the vehicle 2a is a vehicle capable of automatic traveling, automatic traveling according to the guidance information (route information to the destination via the detour destination) can be performed.
- FIG. 13 is a flowchart showing details of the process of estimating the relationship between a plurality of users according to this embodiment.
- first, the user specifying unit 10a of the guidance server 1 determines whether or not a person has boarded the vehicle 2a. Specifically, the face of a person boarding the vehicle 2a is imaged by the in-vehicle camera 23, and the captured image is transmitted from the vehicle 2a, so the user specifying unit 10a determines boarding by recognizing the person through face recognition based on the captured image. In addition to the captured image transmitted from the vehicle 2a, the user specifying unit 10a can also determine the boarding of a person based on analysis of the passenger's voice data and biological information transmitted from the vehicle 2a.
- next, in step S208, the user specifying unit 10a performs personal identification of the person.
- the user specifying unit 10a refers to the analysis result of the captured face image and the person information registered in advance in the person DB server 6 to identify the passenger.
- the user specifying unit 10a can also perform personal identification with reference to the passenger's voice recognition result or biometric information analysis result and the person information registered in the person DB server 6.
- In step S209, the relationship estimation unit 10b acquires the attributes of the identified person from the person DB server 6.
- The attributes of a person registered in the person DB server 6 are, for example, age, gender, occupation, hobbies/preferences, object ID, and the like.
- In step S212, the relationship estimation unit 10b acquires, from the sensitivity server 3, a sensitivity value based on the interaction evaluations between the identified passengers (between the plurality of users).
- In step S215, the relationship estimation unit 10b determines whether or not a destination has been input (or changed) by a passenger.
- In step S218, the relationship estimation unit 10b estimates the passengers' purpose of movement. For example, the purpose is estimated to be "leisure" if the destination is a tourist spot, "work" if it is a place related to a company or work, "shopping" if it is a supermarket or a store, and "play or errand" if it is a private residence such as an apartment or condominium.
- the relationship estimation unit 10b can further increase the estimation accuracy of the purpose of movement by referring to the day of the week and the time.
- In this case, the user specifying unit 10a acquires information about the users detected by the in-vehicle camera 23, the microphone 24, the biological sensor 27, and the like of the vehicle 2a (captured images, audio data, biological information, etc.).
- Even when personal identification by face recognition, voice analysis, and biological information analysis is not possible, the user specifying unit 10a can estimate the gender and age of a passenger using the acquired information about the users. Moreover, when the vehicle 2a is provided with a depth sensor (not shown), the occupant's body shape data can be obtained from the sensor values detected by the depth sensor, improving the estimation accuracy of gender and age.
- In step S227, the relationship estimation unit 10b estimates the relationship between the plurality of users boarding the vehicle 2a, using the attributes of the persons, the sensitivity value based on the interaction evaluations between the plurality of users, and the estimated purpose of movement when individual identification was possible, or using the passengers' gender and age when individual identification was not possible.
- The relationship estimation unit 10b narrows the relationship between these multiple users down to candidates such as lovers, friends, a company boss and subordinate, or family members. Further, taking the date and time, the destination, the purpose of travel, and the like into account, the relationship estimation unit 10b can estimate, for example, a lover relationship if the users are heading to a sightseeing spot on a holiday, or a work relationship if they are heading to a work-related place on a weekday. Moreover, the relationship estimation unit 10b may estimate the relationship among the multiple users in consideration of the conversation content recognized from audio data.
- The estimation of the relationship between multiple users is not limited to the above-described attributes, sensitivity values, purpose of movement, and gender/age; the relationship may also be estimated from, for example, conversation content based on data detected by a camera, microphone, or biometric sensor, the tone of voice, the atmosphere in the vehicle, the degree of intimacy, and the like.
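The narrowing-down described above (attributes combined with the date and time, destination, and purpose of movement) can be pictured as a simple rule table. The rules below are a toy sketch mirroring the examples in the text (lovers on a holiday leisure trip, a work relationship on a weekday work trip); the `same_household` input and the fallback are illustrative assumptions, and the actual estimation logic is not specified in the disclosure:

```python
def estimate_relationship(purpose, is_holiday, same_household):
    """Toy rule-based narrowing of relationship candidates.
    purpose: estimated purpose of movement, e.g. "leisure" or "work".
    is_holiday: whether the trip takes place on a holiday.
    same_household: assumed attribute flag (family candidate)."""
    if same_household:
        return "family"
    if purpose == "work" and not is_holiday:
        return "work"
    if purpose == "leisure" and is_holiday:
        return "lovers"
    return "friends"

print(estimate_relationship("leisure", True, False))  # → lovers
print(estimate_relationship("work", False, False))    # → work
```

A real implementation would weigh many more signals (sensitivity values, conversation content, atmosphere) rather than hard rules.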
- In step S230, the guidance information generation unit 10c determines whether or not the relationship has been updated.
- The estimation of the relationship between the plurality of users boarding the vehicle 2a is performed continuously, and the relationship may be updated when, for example, a new person boards, the destination is changed, or the atmosphere in the vehicle changes.
- the guidance information generation unit 10c determines that the relationship has been updated when the relationship estimation unit 10b newly estimates the relationship of a plurality of users.
- In step S233, the guidance information generation unit 10c performs the generation (update) processing of guidance information including a detour destination.
- the processes shown in S109 to S151 in FIG. 6 are mainly performed.
- In step S236, the guidance server 1 determines whether or not the vehicle 2a has arrived at the destination; the processes of S203 to S233 are repeated until the vehicle 2a arrives at the destination.
- the present embodiment is not limited to the above-described processing.
- Since a user may not want to take a detour when traveling for work, the control unit 10 of the guidance server 1 may provide a "detour permission mode" in advance, and the guidance information generation processing according to the present embodiment may be performed in accordance with that mode.
- The control unit 10 of the guidance server 1 determines whether the estimated relationship and the selection of the guided detour destination were correct, based on, for example, whether the users actually departed for the guided detour destination, the staying time at the detour, and changes in the degree of smiling and the amount of conversation after returning to the vehicle 2a. Further, when re-estimation of the relationship is found to be necessary from the result, the relationship estimation unit 10b re-estimates the relationship between the plurality of users.
- The guidance information generation unit 10c can grasp personal hobbies/preferences and personality based on SNS history and the contents of e-mails, and can thereby propose a more optimal detour destination. Specifically, when instructing the environment information map generation unit 10e to generate the environment information map, the guidance information generation unit 10c gives a filter condition that reflects the personal tastes/preferences and personality.
- A sensitivity value based on past interactions between the passengers can be acquired from the sensitivity server 3 to estimate a more detailed relationship (such as "lovers who are fighting"), making it possible to respond to such a subtle relationship. For example, when the relationship "lovers who are fighting" is estimated, a place offering an extraordinary experience, such as an amusement park, is suitable for reconciliation, so when instructing the environment information map generation unit 10e to generate the environment information map, the guidance information generation unit 10c gives a filter condition such as "extraordinary experience".
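One way to picture how an estimated relationship turns into a filter condition for the environment information map is a lookup table from relationship to condition. In the sketch below, only the "lovers are fighting" → "extraordinary experience" pairing comes from the text; every other entry, and the fallback value, is invented for illustration:

```python
# Hypothetical mapping; only "lovers are fighting" -> "extraordinary
# experience" (e.g. an amusement park) is an example from the text.
FILTER_CONDITIONS = {
    "lovers": "romantic",
    "lovers are fighting": "extraordinary experience",
    "work": "quiet",
    "family": "family friendly",
}

def filter_condition_for(relationship):
    # Fall back to a generic condition for relationships not in the table.
    return FILTER_CONDITIONS.get(relationship, "popular")

print(filter_condition_for("lovers are fighting"))  # → extraordinary experience
```

The resulting condition would be passed to the environment information map generation unit when spots are filtered.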
- According to the information processing system described above, it is possible to generate more optimal guidance information in consideration of the relationship between a plurality of users. Specifically, the information processing system according to the present embodiment extracts spots (detour destinations) near the base route to the destination according to the relationship between the plurality of users, and generates guidance information including the detour destination.
- It is also possible to create a computer program for causing hardware such as the CPU, ROM, and RAM incorporated in the guidance server 1, the vehicle 2a, the smartphone 2b, or the sensitivity server 3 described above to exhibit the functions of the guidance server 1, the vehicle 2a, the smartphone 2b, or the sensitivity server 3. A computer-readable storage medium storing the computer program is also provided.
- The present technology may also be configured as follows.
- a user specifying unit that identifies a plurality of users acting together;
- An estimation unit for estimating the relationship between the identified plurality of users;
- a generation unit that generates guidance information for the plurality of users according to the estimated relationship between the plurality of users;
- An information processing system comprising the above. (2) The information processing system according to (1), wherein the user specifying unit identifies a plurality of users acting together based on information about the users detected by a sensor.
- the information related to the user's emotions is generated using at least one of facial expression analysis based on a face image capturing the user's face, analysis of conversation content based on audio data, the tone of voice, and the heart rate, sweating volume, brain waves, or body movements based on biological information.
- The information processing system according to (7).
- the estimation unit acquires, as information related to the users' emotions, a sensitivity value calculated based on interaction evaluation values between objects corresponding to the plurality of users from a sensitivity value database constructed by accumulating evaluation values of interactions between objects.
- The information processing system according to (6), wherein the relationship between the plurality of users is estimated based on the sensitivity value.
- the information processing system further includes a transmission unit that transmits the generated guidance information to a vehicle capable of automatic driving, and the vehicle capable of automatic driving performs automatic driving control according to route information to a destination serving as the guidance information; the information processing system according to any one of (1) to (14).
- the route information includes a detour destination on the way to the destination.
- the information processing system further includes a transmission unit that transmits the generated guidance information to an information processing terminal, and the information processing terminal presents the guidance information to the user; the information processing system according to any one of (1) to (16).
Landscapes
- Engineering & Computer Science (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Business, Economics & Management (AREA)
- Entrepreneurship & Innovation (AREA)
- Human Resources & Organizations (AREA)
- Theoretical Computer Science (AREA)
- Strategic Management (AREA)
- Health & Medical Sciences (AREA)
- Social Psychology (AREA)
- General Health & Medical Sciences (AREA)
- Tourism & Hospitality (AREA)
- Mathematical Physics (AREA)
- Operations Research (AREA)
- Quality & Reliability (AREA)
- Economics (AREA)
- General Business, Economics & Management (AREA)
- Data Mining & Analysis (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Ecology (AREA)
- Marketing (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Computation (AREA)
- Game Theory and Decision Science (AREA)
- Medical Informatics (AREA)
- Aviation & Aerospace Engineering (AREA)
- Navigation (AREA)
- Instructional Devices (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
Description
1. Overview of an information processing system according to an embodiment of the present disclosure
2. Basic configuration
2-1. Configuration of the guidance server
2-2. Configuration of the vehicle
2-3. Configuration of the sensitivity server
3. Operation processing
4. Summary
First, an overview of an information processing system according to an embodiment of the present disclosure will be described with reference to FIG. 1. The information processing system according to the present embodiment estimates the relationship between a plurality of users from the users' atmosphere, attributes, conversation, face recognition, emotions, sensitivity values, or the like based on information acquired from various sensors, and presents guidance information suited to the estimated relationship. The presented guidance information is route information to a destination, or route information that passes through detour destinations along the way (for example, route information heading to the destination via detour destination 1, detour destination 2, and detour destination 3 as shown on the right of FIG. 1). The guidance information may be presented by display output/audio output on an information processing terminal carried by the user (for example, a smartphone, mobile phone terminal, tablet terminal, or wearable terminal), or by display output/audio output in a vehicle the user is riding in. Further, when the vehicle the user is riding in is capable of automatic driving, automatic driving may be performed according to the route information serving as the guidance information.
Here, a conventional navigation system merely searches automatically for the shortest route, in time or distance, to a destination input by the user, and performs navigation along that route. On the other hand, when there is ample time to travel to the destination, as is common on a trip, users can be expected to stop at sightseeing spots, restaurants, souvenir shops, and the like on the way. However, when using a navigation system that searches for and presents only the route to the destination as described above, the user had to search for detour spots along the route by themselves and set each detour spot in the navigation system again as a new destination. Searching for detour spots is a laborious task for the user, and depending on the user's search skills, the optimal place could not always be found.
<2-1. Guidance server>
FIG. 3 is a block diagram showing an example of the configuration of the guidance server 1 according to the present embodiment. As shown in FIG. 3, the guidance server 1 according to the present embodiment includes a control unit 10, a communication unit 11, and a storage unit 12.
The control unit 10 controls each component of the guidance server 1. The control unit 10 is realized by a microcomputer including a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and a nonvolatile memory. As shown in FIG. 3, the control unit 10 according to the present embodiment also functions as a user specifying unit 10a, a relationship estimation unit 10b, a guidance information generation unit 10c, an emotion heat map generation unit 10d, and an environment information map generation unit 10e.
The communication unit 11 transmits and receives data to and from external devices. For example, the communication unit 11 connects to the sensitivity server 3, the spot information server 4, the SNS server 5, and the person DB server 6, and transmits and receives various data. The communication unit 11 also connects to the vehicle 2a to receive information about the users detected by sensors in the vehicle 2a and navigation setting information (destination, desired arrival time, preferred road selection, etc.), and to transmit the guidance information generated by the guidance information generation unit 10c to the vehicle 2a.
The storage unit 12 stores programs for the control unit 10 to perform various processes. The storage unit 12 may also store the emotion heat map generated by the emotion heat map generation unit 10d and the environment information map generated by the environment information map generation unit 10e.
Next, the configuration of the vehicle 2a, which is an example of an information processing device that presents guidance information to the user, will be described. FIG. 4 is a block diagram showing an example of the configuration of the vehicle 2a according to the present embodiment. As shown in FIG. 4, the vehicle 2a includes a control unit 20, a communication unit 21, an own-vehicle position acquisition unit 22, an in-vehicle camera 23, a microphone 24, an operation input unit 25, a display unit 26, a biological sensor 27, an automatic driving control unit 28, and a storage unit 29.
The control unit 20 is configured by, for example, a microcomputer including a CPU, ROM, RAM, nonvolatile memory, and an interface unit, and controls each component of the vehicle 2a. The control unit 20 also controls transmission, to the guidance server 1 via the communication unit 21, of captured images taken by the in-vehicle camera 23, audio data picked up by the microphone, or biological information detected by the biological sensor 27, as information about the passengers detected by the sensors. The control unit 20 may also control transmission to the guidance server 1 of passenger identification information (gender, age, name, hobbies/preferences, user ID, object ID, etc.) input from the operation input unit 25. The control unit 20 may further control transmission to the guidance server 1 of navigation setting information input via the operation input unit 25, specifically, information on the destination, desired arrival time, preferred roads, and the like.
The communication unit 21 transmits and receives data to and from external devices. For example, the communication unit 21 connects to the guidance server 1, transmits information about the passengers detected by the sensors, navigation setting information, and the like, and receives guidance information generated by the guidance server 1.
The own-vehicle position acquisition unit 22 has a function of detecting the current position of the vehicle 2a based on signals acquired from the outside. Specifically, the own-vehicle position acquisition unit 22 is realized by, for example, a GPS (Global Positioning System) positioning unit, receives radio waves from GPS satellites, detects the position of the vehicle 2a, and outputs the detected position information to the control unit 20. Besides GPS, the own-vehicle position acquisition unit 22 may detect the position by, for example, transmission and reception with Wi-Fi (registered trademark), Bluetooth (registered trademark), or the like, or by short-range communication.
The in-vehicle camera 23 is a camera that images the interior of the vehicle 2a and, for example, captures the faces of passengers sitting in each seat. The installation positions and number of in-vehicle cameras 23 are not particularly limited.
The microphone 24 has a function of picking up sound inside the vehicle 2a, for example, the passengers' conversation. The installation positions and number of microphones 24 are not particularly limited.
The operation input unit 25 accepts user operation input and outputs it to the control unit 20 as input information. For example, the operation input unit 25 may be a touch panel integrated with the display unit 26 provided near the driver's seat of the vehicle 2a. The operation input unit 25 may also enable gesture input by analyzing captured images of the user taken by the in-vehicle camera 23, or enable voice input by analyzing the user's voice picked up by the microphone 24.
The display unit 26 displays menu screens and navigation screens, and is realized by, for example, a liquid crystal display. The display unit 26 is provided near the driver's seat. The display unit 26 also displays the guidance information transmitted from the guidance server 1. The display unit 26 may also be realized by a projection unit that projects images onto the windshield of the vehicle 2a.
The biological sensor 27 detects biological information of the users riding in the vehicle 2a. For example, one or more biological sensors 27 are provided in the steering wheel, door handles, window operation parts, seats, headrests, or the like of the vehicle 2a, and detect the passengers' body temperature, sweating volume, heart rate, brain waves, fingerprints, and the like.
The automatic driving control unit 28 has a function of controlling the driving of the vehicle 2a and realizing automatic driving that does not depend on driver operation. Specifically, the automatic driving control unit 28 controls the vehicle 2a to drive according to the guidance information received from the guidance server 1. When the guidance information is route guidance to a destination via a detour destination, the automatic driving control unit 28 controls the vehicle 2a to drive the route via the detour destination indicated in the guidance information. When driving automatically, the automatic driving control unit 28 performs accelerator control, brake control, steering control, and the like of the vehicle 2a according to the acquired conditions outside the vehicle (for example, captured images of the surroundings, object detection information, etc.).
The storage unit 29 stores programs for the control unit 20 to execute each process. The storage unit 29 may also store information about the passengers of the vehicle 2a and guidance information transmitted from the guidance server 1.
Next, the configuration of the sensitivity server 3 according to the present embodiment will be described. FIG. 5 is a block diagram showing an example of the configuration of the sensitivity server 3 according to the present embodiment. As shown in FIG. 5, the sensitivity server 3 includes a control unit 30, a communication unit 31, an object DB 32, and a sensitivity information DB 34.
The communication unit 31 connects to the guidance server 1 via the network and returns the sensitivity value of an object requested by the guidance server 1. The communication unit 31 also receives interaction information from sensing devices (not shown) attached to or mounted on each object (including all people and things).
The control unit 30 controls each component of the sensitivity server 3. The control unit 30 is realized by a microcomputer including a CPU, ROM, RAM, and nonvolatile memory. As shown in FIG. 5, the control unit 30 according to the present embodiment functions as an interaction storage control unit 30a, an evaluation unit 30b, an object management unit 30c, a related object search unit 30d, and a sensitivity value calculation unit 30e.
The object DB 32 is a storage unit that stores the object ID of each object. The object DB 32 also stores, in association with each object ID, information such as the object's name, age, gender, service type, service company, product name, product type, manufacturer ID, model number, and manufacturing date and time.
The sensitivity information DB 34 is a storage unit that stores interaction information between objects and evaluation values. Specifically, the sensitivity information DB 34 stores, in association with an object ID, the date/time and place where an interaction occurred, the related object ID indicating the counterpart of the interaction, the interaction type, the details of the interaction, and the interaction evaluation.
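To make the stored items concrete, the sensitivity information DB record described above and a toy sensitivity value calculation could be sketched as follows. The field names and the simple averaging are assumptions; the disclosure specifies only which items are stored, not how the sensitivity value is actually calculated:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class InteractionRecord:
    # Fields mirror the items stored in the sensitivity information DB;
    # the names are assumed for illustration.
    object_id: str           # object that performed the interaction
    occurred_at: datetime    # date/time the interaction occurred
    place: str               # where it occurred
    related_object_id: str   # counterpart object of the interaction
    interaction_type: str    # e.g. "conversation"
    details: str
    evaluation: float        # interaction evaluation, e.g. in [-1.0, 1.0]

def sensitivity_value(records, a, b):
    """Toy sensitivity value between objects a and b: the mean evaluation
    over their past interactions (zero when there is no history)."""
    evals = [r.evaluation for r in records
             if {r.object_id, r.related_object_id} == {a, b}]
    return sum(evals) / len(evals) if evals else 0.0
```

The sensitivity value calculation unit 30e would answer the guidance server's request with such a value computed over the accumulated records.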
Next, the operation processing of the information processing system according to the present embodiment will be described with reference to FIG. 6. FIG. 6 is a flowchart showing the guidance information generation process in the information processing system according to the present embodiment.
As described above, the information processing system according to the embodiment of the present disclosure makes it possible to generate more optimal guidance information in consideration of the relationship between a plurality of users. Specifically, the information processing system according to the present embodiment extracts spots (detour destinations) near the base route to the destination according to the relationship between the plurality of users, and generates guidance information including the detour destinations.
(1)
An information processing system including:
a user specifying unit that identifies a plurality of users acting together;
an estimation unit that estimates the relationship between the identified plurality of users; and
a generation unit that generates guidance information for the plurality of users according to the estimated relationship between the plurality of users.
(2)
The information processing system according to (1), wherein the user specifying unit identifies a plurality of users acting together based on information about the users detected by a sensor.
(3)
The information processing system according to (1) or (2), wherein the estimation unit estimates the relationship between the plurality of users according to the gender and age group of each user, acquired based on information about the users detected by a sensor.
(4)
The information processing system according to any one of (1) to (3), wherein the estimation unit estimates the relationship between the plurality of users according to an atmosphere acquired based on information about the users detected by a sensor.
(5)
The information processing system according to any one of (2) to (4), wherein the information about the users detected by the sensor is at least one of captured images, audio data, biological information, and inter-terminal communication data.
(6)
The information processing system according to any one of (1) to (5), wherein the estimation unit estimates the relationship between the plurality of users according to information related to the users' emotions.
(7)
The information processing system according to (6), wherein the information related to the users' emotions is generated based on information about the users detected by a sensor.
(8)
The information processing system according to (7), wherein the information related to the users' emotions is generated using at least one of facial expression analysis based on a face image capturing the user's face, analysis of conversation content based on audio data, the tone of voice, and the heart rate, sweating volume, brain waves, or body movements based on biological information.
(9)
The information processing system according to (6), wherein the estimation unit acquires, as information related to the users' emotions, a sensitivity value calculated based on interaction evaluation values between objects corresponding to the plurality of users from a sensitivity value database constructed by accumulating evaluation values of interactions between objects, and estimates the relationship between the plurality of users based on the sensitivity value.
(10)
The information processing system according to any one of (1) to (9), wherein the estimation unit estimates the relationship between the plurality of users according to the attributes of the plurality of users.
(11)
The information processing system according to any one of (1) to (10), wherein the generation unit determines a detour destination to be included in the guidance information according to the relationship between the plurality of users estimated by the estimation unit.
(12)
The information processing system according to any one of (1) to (9), wherein the generation unit generates the guidance information using an emotion heat map in which information related to emotions is mapped in association with positions.
(13)
The information processing system according to (12), wherein the generation unit generates the guidance information using the emotion heat map and an environment information map in which information related to the environment is mapped in association with positions.
(14)
The information processing system according to (13), wherein the information related to the environment is spot information.
(15)
The information processing system according to any one of (1) to (14), further including:
a transmission unit that transmits the generated guidance information to a vehicle capable of automatic driving,
wherein the vehicle capable of automatic driving performs automatic driving control according to route information to a destination serving as the guidance information.
(16)
The information processing system according to (15), wherein the route information includes a detour destination on the way to the destination.
(17)
The information processing system according to any one of (1) to (16), further including:
a transmission unit that transmits the generated guidance information to an information processing terminal,
wherein the information processing terminal presents the guidance information to the user.
(18)
A control method including:
identifying, by a specifying unit, a plurality of users acting together;
estimating, by an estimation unit, the relationship between the identified plurality of users; and
generating, by a generation unit, guidance information for the plurality of users according to the estimated relationship between the plurality of users.
10 Control unit
10a User specifying unit
10b Relationship estimation unit
10c Guidance information generation unit
10d Emotion heat map generation unit
10e Environment information map generation unit
11 Communication unit
12 Storage unit
2a Vehicle
20 Control unit
21 Communication unit
22 Own-vehicle position acquisition unit
23 In-vehicle camera
24 Microphone
25 Operation input unit
26 Display unit
27 Biological sensor
28 Automatic driving control unit
29 Storage unit
2b Smartphone
3 Sensitivity server
4 Spot information server
5 SNS server
6 Person DB server
7 Internet
40 Environment information map
42, 43 Emotion heat map
45 Integrated heat map
400 to 402 Spots
Claims (18)
- An information processing system comprising:
a user specifying unit that identifies a plurality of users acting together;
an estimation unit that estimates the relationship between the identified plurality of users; and
a generation unit that generates guidance information for the plurality of users according to the estimated relationship between the plurality of users. - The information processing system according to claim 1, wherein the user specifying unit identifies a plurality of users acting together based on information about the users detected by a sensor.
- The information processing system according to claim 1, wherein the estimation unit estimates the relationship between the plurality of users according to the gender and age group of each user, acquired based on information about the users detected by a sensor.
- The information processing system according to claim 1, wherein the estimation unit estimates the relationship between the plurality of users according to an atmosphere acquired based on information about the users detected by a sensor.
- The information processing system according to claim 2, wherein the information about the users detected by the sensor is at least one of captured images, audio data, biological information, and inter-terminal communication data.
- The information processing system according to claim 1, wherein the estimation unit estimates the relationship between the plurality of users according to information related to the users' emotions.
- The information processing system according to claim 6, wherein the information related to the users' emotions is generated based on information about the users detected by a sensor.
- The information processing system according to claim 7, wherein the information related to the users' emotions is generated using at least one of facial expression analysis based on a face image capturing the user's face, analysis of conversation content based on audio data, the tone of voice, and the heart rate, sweating volume, brain waves, or body movements based on biological information.
- The information processing system according to claim 6, wherein the estimation unit acquires, as information related to the users' emotions, a sensitivity value calculated based on interaction evaluation values between objects corresponding to the plurality of users from a sensitivity value database constructed by accumulating evaluation values of interactions between objects, and estimates the relationship between the plurality of users based on the sensitivity value.
- The information processing system according to claim 1, wherein the estimation unit estimates the relationship between the plurality of users according to the attributes of the plurality of users.
- The information processing system according to claim 1, wherein the generation unit determines a detour destination to be included in the guidance information according to the relationship between the plurality of users estimated by the estimation unit.
- The information processing system according to claim 1, wherein the generation unit generates the guidance information using an emotion heat map in which information related to emotions is mapped in association with positions.
- The information processing system according to claim 12, wherein the generation unit generates the guidance information using the emotion heat map and an environment information map in which information related to the environment is mapped in association with positions.
- The information processing system according to claim 13, wherein the information related to the environment is spot information.
- The information processing system according to claim 1, further comprising:
a transmission unit that transmits the generated guidance information to a vehicle capable of automatic driving,
wherein the vehicle capable of automatic driving performs automatic driving control according to route information to a destination serving as the guidance information. - The information processing system according to claim 15, wherein the route information includes a detour destination on the way to the destination.
- The information processing system according to claim 1, further comprising:
a transmission unit that transmits the generated guidance information to an information processing terminal,
wherein the information processing terminal presents the guidance information to the user. - A control method comprising:
identifying, by a specifying unit, a plurality of users acting together;
estimating, by an estimation unit, the relationship between the identified plurality of users; and
generating, by a generation unit, guidance information for the plurality of users according to the estimated relationship between the plurality of users.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201580074142.5A CN107209019B (zh) | 2015-01-30 | 2015-10-21 | 信息处理系统和控制方法 |
US15/542,720 US10302444B2 (en) | 2015-01-30 | 2015-10-21 | Information processing system and control method |
JP2016571673A JP6607198B2 (ja) | 2015-01-30 | 2015-10-21 | 情報処理システムおよび制御方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015016666 | 2015-01-30 | ||
JP2015-016666 | 2015-01-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016121174A1 true WO2016121174A1 (ja) | 2016-08-04 |
Family
ID=56542812
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/079708 WO2016121174A1 (ja) | 2015-01-30 | 2015-10-21 | 情報処理システムおよび制御方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US10302444B2 (ja) |
JP (1) | JP6607198B2 (ja) |
CN (2) | CN107209019B (ja) |
WO (1) | WO2016121174A1 (ja) |
Cited By (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018097737A (ja) * | 2016-12-15 | 2018-06-21 | 株式会社東芝 | 運転状態制御装置および運転状態制御方法 |
JP2018100890A (ja) * | 2016-12-20 | 2018-06-28 | ヤフー株式会社 | 提供装置、提供方法および提供プログラム |
JP2018100936A (ja) * | 2016-12-21 | 2018-06-28 | トヨタ自動車株式会社 | 車載装置及び経路情報提示システム |
WO2018123041A1 (ja) | 2016-12-28 | 2018-07-05 | 本田技研工業株式会社 | 情報処理システム、及び情報処理装置 |
WO2018123055A1 (ja) * | 2016-12-28 | 2018-07-05 | 本田技研工業株式会社 | 情報提供システム |
JP2018112443A (ja) * | 2017-01-11 | 2018-07-19 | トヨタ自動車株式会社 | 経路情報提供装置 |
WO2018179331A1 (ja) * | 2017-03-31 | 2018-10-04 | 本田技研工業株式会社 | 行動支援システム、行動支援装置、行動支援方法およびプログラム |
JP2018155606A (ja) * | 2017-03-17 | 2018-10-04 | 本田技研工業株式会社 | ナビゲーションシステム、ナビゲーション方法、および、プログラム |
JP2018165968A (ja) * | 2017-03-28 | 2018-10-25 | テイ・エス テック株式会社 | 組み合わせ選定システム |
JP2018181006A (ja) * | 2017-04-14 | 2018-11-15 | 富士通株式会社 | ユーザ関係抽出装置、ユーザ関係抽出方法及びプログラム |
WO2018230461A1 (ja) * | 2017-06-16 | 2018-12-20 | 本田技研工業株式会社 | 車両制御システム、車両制御方法、及びプログラム |
WO2018235379A1 (ja) * | 2017-06-23 | 2018-12-27 | ソニー株式会社 | サービス情報提供システムおよび制御方法 |
WO2019049491A1 (ja) * | 2017-09-08 | 2019-03-14 | ソニー株式会社 | 情報処理装置および情報処理方法 |
JP2019070530A (ja) * | 2017-10-05 | 2019-05-09 | トヨタ自動車株式会社 | 情報処理装置、情報処理方法、及びプログラム |
CN109760692A (zh) * | 2017-11-07 | 2019-05-17 | 丰田自动车株式会社 | 信息处理设备和信息处理方法 |
WO2019097674A1 (ja) * | 2017-11-17 | 2019-05-23 | 日産自動車株式会社 | 車両用操作支援装置 |
JP2019104354A (ja) * | 2017-12-12 | 2019-06-27 | 日産自動車株式会社 | 情報処理方法及び情報処理装置 |
JP2019109739A (ja) * | 2017-12-19 | 2019-07-04 | 富士ゼロックス株式会社 | 情報処理装置及びプログラム |
JP2019163984A (ja) * | 2018-03-19 | 2019-09-26 | 本田技研工業株式会社 | 情報提供装置およびその制御方法 |
JP2019164475A (ja) * | 2018-03-19 | 2019-09-26 | 本田技研工業株式会社 | 情報提供装置およびその制御方法 |
JP2019164474A (ja) * | 2018-03-19 | 2019-09-26 | 本田技研工業株式会社 | 情報提供システム、情報提供方法、及びプログラム |
JP2019192056A (ja) * | 2018-04-27 | 2019-10-31 | 株式会社ピーケア | 飲食店舗システムおよび飲食店舗のサービス管理システム |
JP2020030469A (ja) * | 2018-08-20 | 2020-02-27 | Zホールディングス株式会社 | 情報処理装置、情報処理方法、及び情報処理プログラム |
JP2020061062A (ja) * | 2018-10-12 | 2020-04-16 | トヨタ自動車株式会社 | 運転支援装置、車両、運転支援システム、運転支援方法、及び運転支援用コンピュータプログラム |
CN111199334A (zh) * | 2018-11-19 | 2020-05-26 | 丰田自动车株式会社 | 信息处理系统、记录介质以及信息处理方法 |
JP2020083211A (ja) * | 2018-11-29 | 2020-06-04 | テイ・エス テック株式会社 | シートシステム |
WO2020111220A1 (ja) * | 2018-11-29 | 2020-06-04 | テイ・エス テック株式会社 | シートシステム |
JP2021018792A (ja) * | 2020-01-04 | 2021-02-15 | 株式会社MaaS Tech Japan | プログラム及び情報処理装置 |
JP2021018513A (ja) * | 2019-07-18 | 2021-02-15 | 株式会社MaaS Tech Japan | プログラム及び情報処理装置 |
EP3706095A4 (en) * | 2017-11-02 | 2021-07-28 | Omron Corporation | EVALUATION DEVICE AND SYSTEM, VEHICLE, AND PROGRAM |
JP2022519791A (ja) * | 2018-09-14 | 2022-03-25 | ライク,フィリップ | 交流作成装置 |
US11302106B2 (en) | 2016-12-28 | 2022-04-12 | Honda Motor Co., Ltd. | Information provision system |
JP2022528021A (ja) * | 2018-09-14 | 2022-06-08 | ライク,フィリップ | 交流推薦システム |
JP2022144325A (ja) * | 2021-03-18 | 2022-10-03 | ヤフー株式会社 | 提供装置、提供方法及び提供プログラム |
WO2023095318A1 (ja) * | 2021-11-29 | 2023-06-01 | 日本電気株式会社 | 案内装置、システム及び方法、並びに、コンピュータ可読媒体 |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107111314B (zh) * | 2014-11-07 | 2021-10-08 | 索尼公司 | 控制系统、控制方法以及存储介质 |
US10388282B2 (en) * | 2017-01-25 | 2019-08-20 | CliniCloud Inc. | Medical voice command device |
JP6702217B2 (ja) * | 2017-02-06 | 2020-05-27 | 株式会社デンソー | 自動運転装置 |
US10974619B2 (en) * | 2017-03-28 | 2021-04-13 | Ts Tech Co., Ltd. | Vehicle seat and passenger selection system |
US10303961B1 (en) | 2017-04-13 | 2019-05-28 | Zoox, Inc. | Object detection and passenger notification |
US10795356B2 (en) * | 2017-08-31 | 2020-10-06 | Uatc, Llc | Systems and methods for determining when to release control of an autonomous vehicle |
EP3466761B1 (en) * | 2017-10-05 | 2020-09-09 | Ningbo Geely Automobile Research & Development Co. Ltd. | A display system and method for a vehicle |
US11273836B2 (en) * | 2017-12-18 | 2022-03-15 | Plusai, Inc. | Method and system for human-like driving lane planning in autonomous driving vehicles |
JP7095431B2 (ja) * | 2018-06-19 | 2022-07-05 | トヨタ自動車株式会社 | 情報処理装置、情報処理方法、情報処理システム |
KR20210057624A (ko) * | 2019-11-12 | 2021-05-21 | 현대자동차주식회사 | 단말기, 서버, 이를 포함하는 멀티 모빌리티 서비스 시스템 및 그의 방법 |
JP2022054820A (ja) * | 2020-09-28 | 2022-04-07 | マツダ株式会社 | 走行経路設定装置 |
US20220357172A1 (en) * | 2021-05-04 | 2022-11-10 | At&T Intellectual Property I, L.P. | Sentiment-based navigation |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009036653A (ja) * | 2007-08-02 | 2009-02-19 | Pioneer Electronic Corp | ドライブプラン作成装置、ドライブプラン作成方法、ドライブプラン作成プログラムおよび記録媒体 |
JP2010237134A (ja) * | 2009-03-31 | 2010-10-21 | Equos Research Co Ltd | 目的地提示システム及びナビゲーションシステム |
JP2011117905A (ja) * | 2009-12-07 | 2011-06-16 | Pioneer Electronic Corp | ルート選択支援装置及びルート選択支援方法 |
JP2013088409A (ja) * | 2011-10-24 | 2013-05-13 | Nissan Motor Co Ltd | 車両用走行支援装置 |
JP2013164664A (ja) * | 2012-02-09 | 2013-08-22 | Denso Corp | 情報出力装置 |
WO2014006688A1 (ja) * | 2012-07-03 | 2014-01-09 | 三菱電機株式会社 | ナビゲーション装置 |
WO2014054152A1 (ja) * | 2012-10-04 | 2014-04-10 | 三菱電機株式会社 | 車載情報処理装置 |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1871499B (zh) * | 2003-09-30 | 2011-08-17 | 株式会社建伍 | 引导路线搜索装置和引导路线搜索方法 |
JP4566844B2 (ja) * | 2005-07-01 | 2010-10-20 | 株式会社デンソー | ナビゲーションシステム、および、そのナビゲーションシステムに用いる記憶装置 |
JP2008003027A (ja) | 2006-06-26 | 2008-01-10 | Fujitsu Ten Ltd | ナビゲーション装置 |
US20090031877A1 (en) * | 2007-08-03 | 2009-02-05 | Les Produits Gilbert Inc. | Saw blade assembly for a cutting saw |
JP4554653B2 (ja) * | 2007-08-08 | 2010-09-29 | クラリオン株式会社 | 経路探索方法、経路探索システムおよびナビゲーション装置 |
JP4609527B2 (ja) * | 2008-06-03 | 2011-01-12 | 株式会社デンソー | 自動車用情報提供システム |
DE112009005414B4 (de) * | 2009-12-02 | 2019-03-21 | Mitsubishi Electric Corporation | Navigationssystem |
US8509982B2 (en) * | 2010-10-05 | 2013-08-13 | Google Inc. | Zone driving |
KR101495674B1 (ko) * | 2010-10-15 | 2015-02-26 | 한국전자통신연구원 | 다중 사용자 관계 기반 내비게이션 장치 및 이를 이용한 내비게이션 관리 방법 |
CN103189817B (zh) * | 2010-11-02 | 2016-08-24 | 日本电气株式会社 | 信息处理系统和信息处理方法 |
US8364395B2 (en) * | 2010-12-14 | 2013-01-29 | International Business Machines Corporation | Human emotion metrics for navigation plans and maps |
US9202465B2 (en) * | 2011-03-25 | 2015-12-01 | General Motors Llc | Speech recognition dependent on text message content |
US8775059B2 (en) * | 2011-10-26 | 2014-07-08 | Right There Ware LLC | Method and system for fleet navigation, dispatching and multi-vehicle, multi-destination routing |
US8642241B2 (en) * | 2011-12-21 | 2014-02-04 | Xerox Corporation | Mixer apparatus and method of making developer |
WO2013101045A1 (en) * | 2011-12-29 | 2013-07-04 | Intel Corporation | Navigation systems and associated methods |
CN104115180B (zh) * | 2013-02-21 | 2017-06-09 | 索尼公司 | 信息处理设备、信息处理方法和程序 |
KR102043637B1 (ko) * | 2013-04-12 | 2019-11-12 | 한국전자통신연구원 | 감성 기반 경로 안내 장치 및 방법 |
-
2015
- 2015-10-21 WO PCT/JP2015/079708 patent/WO2016121174A1/ja active Application Filing
- 2015-10-21 CN CN201580074142.5A patent/CN107209019B/zh active Active
- 2015-10-21 CN CN202011563140.1A patent/CN112762956A/zh active Pending
- 2015-10-21 US US15/542,720 patent/US10302444B2/en active Active
- 2015-10-21 JP JP2016571673A patent/JP6607198B2/ja active Active
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009036653A (ja) * | 2007-08-02 | 2009-02-19 | Pioneer Electronic Corp | ドライブプラン作成装置、ドライブプラン作成方法、ドライブプラン作成プログラムおよび記録媒体 |
JP2010237134A (ja) * | 2009-03-31 | 2010-10-21 | Equos Research Co Ltd | 目的地提示システム及びナビゲーションシステム |
JP2011117905A (ja) * | 2009-12-07 | 2011-06-16 | Pioneer Electronic Corp | ルート選択支援装置及びルート選択支援方法 |
JP2013088409A (ja) * | 2011-10-24 | 2013-05-13 | Nissan Motor Co Ltd | 車両用走行支援装置 |
JP2013164664A (ja) * | 2012-02-09 | 2013-08-22 | Denso Corp | 情報出力装置 |
WO2014006688A1 (ja) * | 2012-07-03 | 2014-01-09 | 三菱電機株式会社 | ナビゲーション装置 |
WO2014054152A1 (ja) * | 2012-10-04 | 2014-04-10 | 三菱電機株式会社 | 車載情報処理装置 |
Cited By (78)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018097737A (ja) * | 2016-12-15 | 2018-06-21 | 株式会社東芝 | 運転状態制御装置および運転状態制御方法 |
JP2018100890A (ja) * | 2016-12-20 | 2018-06-28 | ヤフー株式会社 | 提供装置、提供方法および提供プログラム |
RU2681429C1 (ru) * | 2016-12-21 | 2019-03-06 | Тойота Дзидося Кабусики Кайся | Бортовое устройство и система предоставления информации маршрутов |
JP2018100936A (ja) * | 2016-12-21 | 2018-06-28 | トヨタ自動車株式会社 | 車載装置及び経路情報提示システム |
CN108225366A (zh) * | 2016-12-21 | 2018-06-29 | 丰田自动车株式会社 | 车载装置与路线信息提示系统 |
EP3343175A1 (en) * | 2016-12-21 | 2018-07-04 | Toyota Jidosha Kabushiki Kaisha | In-vehicle device and route information presentation system |
WO2018123055A1 (ja) * | 2016-12-28 | 2018-07-05 | 本田技研工業株式会社 | 情報提供システム |
US11237009B2 (en) | 2016-12-28 | 2022-02-01 | Honda Motor Co., Ltd. | Information provision system for route proposition based on emotion information |
JPWO2018123055A1 (ja) * | 2016-12-28 | 2019-10-31 | 本田技研工業株式会社 | 情報提供システム |
US11302106B2 (en) | 2016-12-28 | 2022-04-12 | Honda Motor Co., Ltd. | Information provision system |
US11435201B2 (en) | 2016-12-28 | 2022-09-06 | Honda Motor Co., Ltd. | Information processing system and information processing device |
WO2018123041A1 (ja) | 2016-12-28 | 2018-07-05 | 本田技研工業株式会社 | 情報処理システム、及び情報処理装置 |
JP2018112443A (ja) * | 2017-01-11 | 2018-07-19 | トヨタ自動車株式会社 | 経路情報提供装置 |
JP2018155606A (ja) * | 2017-03-17 | 2018-10-04 | 本田技研工業株式会社 | ナビゲーションシステム、ナビゲーション方法、および、プログラム |
CN108627169A (zh) * | 2017-03-17 | 2018-10-09 | 本田技研工业株式会社 | 导航系统、导航方法和存储介质 |
US10775795B2 (en) | 2017-03-17 | 2020-09-15 | Honda Motor Co., Ltd. | Navigation system, navigation method, and recording medium |
JP7372562B2 (ja) | 2017-03-28 | 2023-11-01 | テイ・エス テック株式会社 | 組み合わせ選定システム |
JP2018165968A (ja) * | 2017-03-28 | 2018-10-25 | テイ・エス テック株式会社 | 組み合わせ選定システム |
JP2022093394A (ja) * | 2017-03-28 | 2022-06-23 | テイ・エス テック株式会社 | 組み合わせ選定システム |
WO2018179331A1 (ja) * | 2017-03-31 | 2018-10-04 | 本田技研工業株式会社 | 行動支援システム、行動支援装置、行動支援方法およびプログラム |
JPWO2018179331A1 (ja) * | 2017-03-31 | 2020-02-20 | 本田技研工業株式会社 | 行動支援システム、行動支援装置、行動支援方法およびプログラム |
CN110637334A (zh) * | 2017-03-31 | 2019-12-31 | 本田技研工业株式会社 | 行动辅助系统、行动辅助装置、行动辅助方法以及程序 |
US11250875B2 (en) | 2017-03-31 | 2022-02-15 | Honda Motor Co., Ltd. | Behavior support system, behavior support apparatus, behavior support method, and storage medium storing program thereof |
JP2018181006A (ja) * | 2017-04-14 | 2018-11-15 | 富士通株式会社 | ユーザ関係抽出装置、ユーザ関係抽出方法及びプログラム |
JP7055136B2 (ja) | 2017-06-16 | 2022-04-15 | 本田技研工業株式会社 | 車両制御システム、車両制御方法、及びプログラム |
CN110753946B (zh) * | 2017-06-16 | 2023-12-05 | 本田技研工业株式会社 | 车辆控制系统、车辆控制方法及存储介质 |
CN110753946A (zh) * | 2017-06-16 | 2020-02-04 | 本田技研工业株式会社 | 车辆控制系统、车辆控制方法及程序 |
US11017666B2 (en) | 2017-06-16 | 2021-05-25 | Honda Motor Co., Ltd. | Vehicle control system, vehicle control method, and program |
JPWO2018230461A1 (ja) * | 2017-06-16 | 2020-05-28 | 本田技研工業株式会社 | 車両制御システム、車両制御方法、及びプログラム |
WO2018230461A1 (ja) * | 2017-06-16 | 2018-12-20 | 本田技研工業株式会社 | 車両制御システム、車両制御方法、及びプログラム |
WO2018235379A1 (ja) * | 2017-06-23 | 2018-12-27 | ソニー株式会社 | サービス情報提供システムおよび制御方法 |
JP7188852B2 (ja) | 2017-09-08 | 2022-12-13 | ソニーグループ株式会社 | 情報処理装置および情報処理方法 |
EP3680838A4 (en) * | 2017-09-08 | 2020-08-12 | Sony Corporation | INFORMATION PROCESSING DEVICE AND INFORMATION PROCESSING METHOD |
WO2019049491A1 (ja) * | 2017-09-08 | 2019-03-14 | ソニー株式会社 | 情報処理装置および情報処理方法 |
JPWO2019049491A1 (ja) * | 2017-09-08 | 2020-10-15 | ソニー株式会社 | 情報処理装置および情報処理方法 |
JP2019070530A (ja) * | 2017-10-05 | 2019-05-09 | トヨタ自動車株式会社 | 情報処理装置、情報処理方法、及びプログラム |
EP3706095A4 (en) * | 2017-11-02 | 2021-07-28 | Omron Corporation | EVALUATION DEVICE AND SYSTEM, VEHICLE, AND PROGRAM |
JP2019084981A (ja) * | 2017-11-07 | 2019-06-06 | トヨタ自動車株式会社 | 情報処理装置及び情報処理方法 |
US11117586B2 (en) | 2017-11-07 | 2021-09-14 | Toyota Jidosha Kabushiki Kaisha | Information processing apparatus and information processing method |
CN109760692A (zh) * | 2017-11-07 | 2019-05-17 | Toyota Motor Corporation | Information processing device and information processing method |
US11654916B2 (en) | 2017-11-07 | 2023-05-23 | Toyota Jidosha Kabushiki Kaisha | Information processing apparatus and information processing method |
CN109760692B (zh) * | 2017-11-07 | 2022-03-01 | Toyota Motor Corporation | Information processing device and information processing method |
WO2019097674A1 (ja) * | 2017-11-17 | 2019-05-23 | Nissan Motor Co., Ltd. | Vehicle operation support device |
JPWO2019097674A1 (ja) * | 2017-11-17 | 2020-12-03 | Nissan Motor Co., Ltd. | Vehicle operation support device |
JP7024799B2 (ja) | 2017-11-17 | 2022-02-24 | Nissan Motor Co., Ltd. | Vehicle operation support device |
JP2019104354A (ja) * | 2017-12-12 | 2019-06-27 | Nissan Motor Co., Ltd. | Information processing method and information processing device |
JP7081132B2 (ja) | 2017-12-12 | 2022-06-07 | Nissan Motor Co., Ltd. | Information processing method and information processing device |
JP2019109739A (ja) * | 2017-12-19 | 2019-07-04 | Fuji Xerox Co., Ltd. | Information processing device and program |
JP2019164475A (ja) * | 2018-03-19 | 2019-09-26 | Honda Motor Co., Ltd. | Information providing device and control method therefor |
JP2019163984A (ja) * | 2018-03-19 | 2019-09-26 | Honda Motor Co., Ltd. | Information providing device and control method therefor |
JP7080078B2 (ja) | 2018-03-19 | 2022-06-03 | Honda Motor Co., Ltd. | Information providing system, information providing method, and program |
JP7080079B2 (ja) | 2018-03-19 | 2022-06-03 | Honda Motor Co., Ltd. | Information providing device and control method therefor |
JP7053325B2 (ja) | 2018-03-19 | 2022-04-12 | Honda Motor Co., Ltd. | Information providing device and control method therefor |
JP2019164474A (ja) * | 2018-03-19 | 2019-09-26 | Honda Motor Co., Ltd. | Information providing system, information providing method, and program |
JP7127804B2 (ja) | 2018-04-27 | 2022-08-30 | P-Care Co., Ltd. | Restaurant system |
JP2019192056A (ja) * | 2018-04-27 | 2019-10-31 | P-Care Co., Ltd. | Restaurant system and restaurant service management system |
JP7082008B2 (ja) | 2018-08-20 | 2022-06-07 | Yahoo Japan Corporation | Information processing device, information processing method, and information processing program |
JP2020030469A (ja) * | 2018-08-20 | 2020-02-27 | Z Holdings Corporation | Information processing device, information processing method, and information processing program |
JP2022519791A (ja) * | 2018-09-14 | 2022-03-25 | Like, Phillip | Interaction creation device |
JP7278214B2 (ja) | 2018-09-14 | 2023-05-19 | Like, Phillip | Interaction creation device |
JP7278213B2 (ja) | 2018-09-14 | 2023-05-19 | Like, Phillip | Interaction recommendation system |
JP2022528021A (ja) * | 2018-09-14 | 2022-06-08 | Like, Phillip | Interaction recommendation system |
US11360487B2 (en) | 2018-10-12 | 2022-06-14 | Toyota Jidosha Kabushiki Kaisha | Driving support apparatus, vehicle, driving support system, and driving support method |
JP2020061062A (ja) * | 2018-10-12 | 2020-04-16 | Toyota Motor Corporation | Driving support device, vehicle, driving support system, driving support method, and computer program for driving support |
JP7192380B2 (ja) | 2018-10-12 | 2022-12-20 | Toyota Motor Corporation | Driving support device, vehicle, driving support system, driving support method, and computer program for driving support |
JP7155927B2 (ja) | 2018-11-19 | 2022-10-19 | Toyota Motor Corporation | Information processing system, program, and information processing method |
CN111199334A (zh) * | 2018-11-19 | 2020-05-26 | Toyota Motor Corporation | Information processing system, recording medium, and information processing method |
JP2020086656A (ja) * | 2018-11-19 | 2020-06-04 | Toyota Motor Corporation | Information processing system, program, and information processing method |
WO2020111220A1 (ja) * | 2018-11-29 | 2020-06-04 | TS Tech Co., Ltd. | Seat system |
JP7239807B2 (ja) | 2018-11-29 | 2023-03-15 | TS Tech Co., Ltd. | Seat system |
JP2020083211A (ja) * | 2018-11-29 | 2020-06-04 | TS Tech Co., Ltd. | Seat system |
US12005811B2 (en) | 2018-11-29 | 2024-06-11 | Ts Tech Co., Ltd. | Seat system |
JP2021018513A (ja) * | 2019-07-18 | 2021-02-15 | MaaS Tech Japan Co., Ltd. | Program and information processing device |
JP7161782B2 (ja) | 2020-01-04 | 2022-10-27 | MaaS Tech Japan Co., Ltd. | Program and information processing device |
JP2022171986A (ja) * | 2020-01-04 | 2022-11-11 | MaaS Tech Japan Co., Ltd. | Program and information processing device |
JP2021018792A (ja) * | 2020-01-04 | 2021-02-15 | MaaS Tech Japan Co., Ltd. | Program and information processing device |
JP2022144325A (ja) * | 2021-03-18 | 2022-10-03 | Yahoo Japan Corporation | Providing device, providing method, and providing program |
WO2023095318A1 (ja) * | 2021-11-29 | 2023-06-01 | NEC Corporation | Guidance device, system, method, and computer-readable medium |
Also Published As
Publication number | Publication date |
---|---|
CN107209019B (zh) | 2021-01-15 |
CN112762956A (zh) | 2021-05-07 |
CN107209019A (zh) | 2017-09-26 |
US10302444B2 (en) | 2019-05-28 |
US20170370744A1 (en) | 2017-12-28 |
JP6607198B2 (ja) | 2019-11-20 |
JPWO2016121174A1 (ja) | 2017-11-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6607198B2 (ja) | Information processing system and control method | |
US10853650B2 (en) | Information processing apparatus, information processing method, and program | |
US8655740B2 (en) | Information providing apparatus and system | |
US20180172464A1 (en) | In-vehicle device and route information presentation system | |
JP4258585B2 (ja) | Destination setting device | |
US20180025283A1 (en) | Information processing apparatus, information processing method, and program | |
WO2016113967A1 (ja) | Information processing system and control method | |
US20200309549A1 (en) | Control apparatus, control method, and storage medium storing program | |
JP7139904B2 (ja) | Information processing device and information processing program | |
JP2022517052A (ja) | Personal experience itinerary | |
JP6642401B2 (ja) | Information providing system | |
JP2020060987A (ja) | Mobile device and program for mobile device | |
JP7053325B2 (ja) | Information providing device and control method therefor | |
JP2020169956A (ja) | Vehicle destination proposal system | |
US20220357172A1 (en) | Sentiment-based navigation | |
CN110514214A (zh) | Vehicle navigation system | |
JP7226233B2 (ja) | Vehicle, information processing system, program, and control method | |
JP2019040457A (ja) | Relationship estimation method, relationship estimation device, and information providing method | |
CN115131054A (zh) | Information providing device | |
JP2023057804A (ja) | Information processing device, information processing method, and information processing program | |
JP2020160541A (ja) | Information display control device | |
JP2023170927A (ja) | Terminal device, information processing method, and information processing program | |
JP2023170926A (ja) | Terminal device, information processing method, and information processing program | |
CN114981831A (zh) | Information processing device and information processing method | |
JP2022078573A (ja) | Information providing device, information providing method, information providing program, and recording medium | |
Legal Events
Code | Title | Description |
---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15880067; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 2016571673; Country of ref document: JP; Kind code of ref document: A |
WWE | Wipo information: entry into national phase | Ref document number: 15542720; Country of ref document: US |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 15880067; Country of ref document: EP; Kind code of ref document: A1 |