WO2011007386A1 - Navigation device - Google Patents

Navigation device

Info

Publication number
WO2011007386A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
information
destination
index value
preference
Prior art date
Application number
PCT/JP2009/003299
Other languages
English (en)
Japanese (ja)
Inventor
福原英樹
Original Assignee
三菱電機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社
Priority to PCT/JP2009/003299
Publication of WO2011007386A1


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3605 Destination input or retrieval
    • G01C21/3617 Destination input or retrieval using user history, behaviour, conditions or preferences, e.g. predicted or inferred from previous use or current movement

Definitions

  • This invention relates to a navigation device that presents destination candidates and changes routes according to the biological information and preferences of vehicle occupants.
  • In Patent Document 1, since the in-vehicle situation is used only when generating the search condition for narrowing down destinations, user operations such as starting the destination search still remain. Consequently, when the driver is the only occupant, the destination search cannot be started during driving, and a destination cannot be searched for at the timing the user desires.
  • Moreover, the user preference described in Patent Document 1 relates only to smoking. Preferences regarding, for example, the eating and drinking habits of the driver and passengers are not considered, so the destination the user actually desires cannot be presented accurately.
  • The present invention has been made to solve the above problems, and an object of the present invention is to obtain a navigation device capable of presenting destination candidates and routes desired by the user at an appropriate timing according to the user's current state, based on the biological information and preferences of the vehicle occupants.
  • A navigation device according to the present invention includes a biological information detection unit that detects biological information of a user aboard a moving body, and an index value calculation unit that calculates an index value indicating the state of the user from the biological information detected by the biological information detection unit.
  • According to the present invention, destination candidates and routes desired by the user can be presented at an appropriate timing according to the user's current state, based on the biological information and preferences of the occupants.
  • FIG. 1 is a diagram schematically showing the configuration of a car navigation system using a navigation device according to Embodiment 1 of the present invention.
  • FIG. 2 is a block diagram showing the configuration of the navigation device according to Embodiment 1.
  • FIG. 3 is a block diagram showing the configuration of the various sensors and the database according to Embodiment 1.
  • FIG. 4 is a diagram showing the flow of information until a presentation route is determined in the car navigation system according to Embodiment 1.
  • FIG. 5 is a diagram schematically showing the configuration of a car navigation system using a navigation device according to Embodiment 2 of the present invention.
  • FIG. 6 is a block diagram showing the configuration of the navigation device according to Embodiment 2.
  • FIG. 1 is a diagram schematically showing a configuration of a car navigation system using a navigation apparatus according to Embodiment 1 of the present invention.
  • the car navigation system according to the first embodiment includes a navigation device 10, various sensors 20, and a storage device that stores a database 30, and these are connected by an in-vehicle network 90 and can communicate with each other.
  • the navigation device 10 is a device that guides the route from the current position to the destination.
  • the various sensors 20 are sensor groups that acquire biological information such as a user's blood pressure and body temperature.
  • In the database 30, preference information such as the user's preferences regarding eating and drinking, and destination information such as destination addresses, are registered.
  • The in-vehicle network 90 is a single wireless or wired communication line that connects the navigation device 10, the various sensors 20, and the storage device that stores the database 30.
  • FIG. 2 is a block diagram showing a configuration of the navigation device according to the first embodiment.
  • The navigation device 10 includes a preference information acquisition unit 11, a biological information storage unit 12, a user state determination unit 13, a user state storage unit 13a, a destination information acquisition unit 14, a time information acquisition unit 15, a time information storage unit 15a, a current location information acquisition unit 16, a device control unit 17, a display device control unit 18, and a display unit 19.
  • the preference information acquisition unit 11 is a component that acquires the user preference information 100.
  • For example, it is realized as an HMI (Human Machine Interface) for inputting the user preference information 100, in which an input screen is displayed on the display unit 19 and the user preference information 100 is entered based on that screen.
  • As the input device, a key operation unit, a voice input device, or another input device mounted on the navigation device 10 can be used.
  • the biological information storage unit 12 is a storage unit that stores the biological information 101 detected by the various sensors 20.
  • When the biological information 101 is detected by the various sensors 20, the biological information storage unit 12 acquires and stores the detected biological information 101 via the in-vehicle network 90.
  • the biometric information 101 stored in the biometric information storage unit 12 is read by the user state determination unit 13 at a predetermined cycle.
  • The user state determination unit 13 is a component that determines the user state information 102, such as the user's hunger degree and fatigue degree, based on the user preference information 100 and the biological information 101. The method for determining the user state information 102 will be described later.
  • the user state storage unit 13 a is a storage unit that stores the user state information 102 obtained by the user state determination unit 13.
  • the destination information acquisition unit 14 is a component that acquires the destination information 103.
  • For example, it is realized as an HMI for setting the destination information 103, in which a destination setting screen is displayed on the display unit 19 and the destination information 103 is set, based on that screen, with an input device normally mounted on the navigation device 10.
  • the destination information 103 acquired by the destination information acquisition unit 14 is registered in the database 30 via the in-vehicle network 90. Further, the destination information acquisition unit 14 may acquire the destination information 103 registered in the past from the database 30.
  • The time information acquisition unit 15 is a component that acquires time information 104 such as the current time and the driving time. For example, it is realized as time measuring means using a timer mounted on the computer functioning as the navigation device 10.
  • the time information storage unit 15 a is a storage unit that stores the time information 104 obtained by the time information acquisition unit 15.
  • the current location information acquisition unit 16 is a component that acquires current location information 105 such as the current travel point of the host vehicle. For example, position information is acquired from GPS (Global Positioning System) or an acceleration sensor, and the vehicle position is measured.
  • The device control unit 17 is a component that determines the presentation destination information 110, such as the destination to be presented to the user and the route to that destination, using the user preference information 100, the user state information 102, the destination information 103, the time information 104, and the current location information 105.
  • the display device control unit 18 is a component that controls the presentation operation of the presentation destination information 110 in accordance with a control signal from the device control unit 17.
  • the display unit 19 is a component that presents the presentation destination information 110 to the user side.
  • the display unit 19 includes an audio output device such as a speaker in addition to a display device such as an LCD (Liquid Crystal Display) mounted on the navigation device.
  • the map information and the guidance voice information indicating the destination set in the presentation destination information 110 and the route to the destination are displayed on the display screen of the display unit 19 and are output as voice from a speaker.
  • FIG. 3 is a block diagram showing configurations of various sensors and a database used in the car navigation system according to the first embodiment.
  • FIG. 3A shows the configuration of various sensors
  • FIG. 3B shows the configuration of the database 30.
  • the various sensors 20 are provided with a biological information detection unit 21 that detects biological information 101.
  • The biological information detection unit 21 includes, for example, a camera that captures the user, a blood pressure sensor that measures the user's blood pressure, a thermometer sensor that detects body temperature, and the like, as well as processing means that converts their output signals into a form that can be processed on the navigation device 10 side, thereby generating the biological information 101.
  • the database 30 is provided with a preference information storage unit 31 that stores user preference information 100 and a destination information storage unit 32 that stores destination information 103.
  • The storage device that stores the database 30 may be an external storage device connected via the in-vehicle network 90, or may be constructed in a storage area of a hard disk device built into the navigation device 10.
  • FIG. 4 is a diagram showing a flow of information until a presentation route is determined in the car navigation system according to the first embodiment.
  • the main operations of each part of the car navigation system will be described with reference to the configuration shown in FIGS.
  • a route including a restaurant is presented as a destination candidate and a route associated in advance with the user state.
  • the preference information acquisition unit 11 acquires user preference information 100 of a driver and a passenger.
  • The user preference information 100 includes, for example, the user's gender, age, favorite food genres, usual meal times (morning, noon, evening), and the budget per meal when eating out.
  • the preference information acquisition unit 11 displays an input screen having these pieces of information as setting items on the display unit 19. The user sets information corresponding to the setting item by terminal key input or voice input.
  • the user preference information 100 input in this way is held in the preference information storage unit 31 of the database 30 from the preference information acquisition unit 11 via the in-vehicle network 90.
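As an illustration of what a user preference information 100 record might hold, it could be kept as a simple mapping. All field names and values below are hypothetical, not taken from the patent:

```python
# Hypothetical sketch of a user preference information 100 record.
# Field names, values, and the currency unit are illustrative assumptions.
user_preference = {
    "gender": "female",
    "age": 34,
    "favorite_food_genres": ["italian", "japanese"],
    "usual_meal_times": {"morning": "07:00", "noon": "12:00", "evening": "19:00"},
    "eating_out_budget": 1500,  # budget per meal when eating out
}

# The preference information acquisition unit 11 would collect these values
# from the input screen and register them in the database 30.
print(user_preference["age"])  # 34
```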
  • the destination information 103 is also registered in the destination information storage unit 32 of the database 30 from the destination information acquisition unit 14 via the in-vehicle network 90.
  • the biological information detection unit 21 detects the biological information 101 of the driver and passengers. Based on the biometric information 101 acquired from the user in the vehicle by the biometric information detection unit 21, an appropriate presentation timing in accordance with the current state of the driving user is determined.
  • For example, the biological information detection unit 21 is provided with a camera that captures the interior of the vehicle, a sound-collecting microphone that picks up sounds inside the vehicle, a blood pressure sensor, an acceleration sensor, a temperature sensor, a thermography device, and the like, as well as an information processing unit that processes the information acquired by these to generate the biological information 101.
  • For example, the biological information detection unit 21 measures the gaze time as the biological information 101 based on the result of detecting, from an image captured by the camera, the number of passengers or the direction in which each passenger's eyes are facing. It may also extract hunger-related sounds (stomach growling) from the in-vehicle sounds (speech and stomach sounds) detected by the sound-collecting microphone and generate biological information 101 indicating the hunger state of the driver or a passenger. Further, the biological information detection unit 21 may have a blood pressure sensor embedded in a seat belt or the like to measure the blood pressure of the driver or a passenger as the biological information 101.
  • If an acceleration sensor is embedded in a seat or the like, its vibration can be detected when the seated person taps out a rhythm. That is, the vibration detection information acquired by the acceleration sensor can be used as biological information 101 for estimating the psychological state of the driver or passengers.
  • the body temperature near the head of the passenger is measured as the biological information 101 by the temperature sensor and the thermography device.
  • body temperature information is utilized as the biometric information 101 for estimating the psychological state of a driver and a passenger.
  • The biological information 101 is acquired from the driver and passengers by the biological information detection unit 21, constantly or at a fixed cycle while the host vehicle is traveling, and is sent from the biological information detection unit 21 via the in-vehicle network 90 to the biological information storage unit 12 of the navigation device 10, where it is held. Further, the time information acquisition unit 15 acquires the current time from a clock or the like installed in the vehicle, acquires the boarding time from a timer or the like, and holds the time information 104 in the time information storage unit 15a.
  • The user state determination unit 13 reads the various biological information 101 from the biological information storage unit 12, reads the time information 104 from the time information storage unit 15a, and receives the user preference information 100 acquired by the preference information acquisition unit 11. It then calculates the user state information 102 of the driver and passengers by performing predetermined calculations on the information 100, 101, and 104.
  • By scoring the biological information 101, the time information 104, and the user preference information 100 according to predetermined rules, index values indicating the hunger or fatigue of the driver or passengers can be obtained.
  • For example, the fatigue degree B of the driver or a passenger is calculated from the gaze time of the driver's or passenger's line of sight, acquired as the biological information 101, and the boarding time indicated by the time information 104, using the following equation (1). Here, α is a coefficient that differs for each driver or passenger.
  • Fatigue degree B = α × {(gaze time) / (boarding time)} … (1)
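As a minimal sketch of equation (1) — the coefficient symbol is garbled in this text, so `alpha` is used as a placeholder for the per-occupant coefficient, with an arbitrary default of 1.0:

```python
def fatigue_degree(gaze_time_s: float, boarding_time_s: float,
                   alpha: float = 1.0) -> float:
    """Equation (1): fatigue degree B = alpha * (gaze time / boarding time).

    alpha is the per-occupant coefficient; its default here is an
    arbitrary placeholder, not a value from the patent.
    """
    if boarding_time_s <= 0:
        raise ValueError("boarding time must be positive")
    return alpha * (gaze_time_s / boarding_time_s)

# 30 minutes of sustained gaze over a 60-minute ride
print(fatigue_degree(1800, 3600))  # 0.5
```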
  • Similarly, the hunger degree H of the driver or a passenger can be calculated.
  • Here, index 1 is an index value indicating the degree of irritability due to hunger, and index 2 is an index value indicating how appropriate the current timing is for eating.
  • a1 is a coefficient for the blood pressure value
  • a2 is a coefficient for the above-described vibration value
  • a3 is a coefficient for the body temperature near the head.
  • b1 is a coefficient for the volume level of the hungry sound.
  • b2 is a coefficient for the index value indicating the result of matching hunger-related keywords extracted from the collected speech.
  • The index value indicating the matching result is “1” if a match is obtained and “0” if not.
  • b3 is a coefficient for the degree of correlation between the current time and the user's normal meal time. This degree of correlation has a maximum value of “1” and a minimum value of “0”. β1, β2, a1 to a3, and b1 to b3 are coefficients that differ for each driver or passenger.
  • Hunger degree H = (β1 × index 1) + (β2 × index 2) … (2)
  • Index 1 = {a1 × (blood pressure)} + {a2 × (vibration)} + {a3 × (body temperature near the head)} … (3)
  • Index 2 = {b1 × (volume of the hungry sound)} + {b2 × (keyword matching index value)} + {b3 × (correlation between the current time and the normal meal time)} … (4)
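A minimal sketch of the hunger computation follows. The Greek coefficient symbols are illegible in this text and equation (4) does not survive in it, so the names below (`beta`) and the index 2 formula are reconstructed from the coefficient descriptions above; all coefficient values of 1.0 are placeholders:

```python
def index_1(blood_pressure, vibration, head_temperature, a=(1.0, 1.0, 1.0)):
    # Equation (3): physiological irritability due to hunger
    return a[0] * blood_pressure + a[1] * vibration + a[2] * head_temperature

def index_2(stomach_sound_volume, keyword_match, meal_time_correlation,
            b=(1.0, 1.0, 1.0)):
    # Reconstructed equation (4): timing appropriateness for eating;
    # keyword_match and meal_time_correlation are both in [0, 1]
    return (b[0] * stomach_sound_volume + b[1] * keyword_match
            + b[2] * meal_time_correlation)

def hunger_degree(i1, i2, beta=(1.0, 1.0)):
    # Equation (2): weighted sum of the two indices
    return beta[0] * i1 + beta[1] * i2

h = hunger_degree(index_1(1.0, 0.5, 0.2), index_2(0.3, 1, 1.0))
print(h)
```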
  • a rule for scoring is set for each of the biological information 101, the time information 104, and the user preference information 100, and the calculated value becomes the user status information 102.
  • the user status information 102 for each passenger obtained in this way is held in the user status storage unit 13a.
  • the current location information acquisition unit 16 acquires the current location information 105 of the vehicle based on GPS and acceleration sensor data.
  • The device control unit 17 determines the presentation destination information 110 at fixed intervals using the user preference information 100, the user state information 102, the destination information 103, the time information 104, and the current location information 105, and controls the display device control unit 18 based on the determination result. In this way, the presentation destination information 110 is determined, based on the biological information 101, at an appropriate presentation timing according to the user's current state. Note that the device control unit 17 also controls the operation of each component in the navigation device 10 besides this operation.
  • The device control unit 17 reads the user state information 102 from the user state storage unit 13a, compares the index values constituting the user state information 102 with thresholds set in advance for those index values, and determines the contents of the presentation destination information 110 accordingly.
  • a route including a restaurant is presented as a destination candidate and a route previously associated with the user state.
  • For example, the number of occupants whose hunger degree H or fatigue degree B in the user state information 102 exceeds its threshold is obtained. If more occupants have a high hunger degree H, the destination genre of the presentation candidate 106 is set to an eating establishment such as a restaurant; if more occupants have a high fatigue degree B, it is set to a resting place such as a coffee shop; and if the counts are equal, it is set to an eating establishment such as a restaurant. Further, the user with the highest hunger degree H in the user state information 102 is set as the destination target user of the presentation candidate 106.
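The genre-selection step above can be sketched as follows; the threshold values and the dictionary layout are illustrative assumptions:

```python
def choose_presentation_candidate(states, hunger_th=0.7, fatigue_th=0.7):
    """Choose the destination genre and target user of presentation candidate 106.

    states maps an occupant name to {"H": hunger degree, "B": fatigue degree}.
    The thresholds are illustrative placeholders.
    """
    hungry = sum(1 for s in states.values() if s["H"] > hunger_th)
    tired = sum(1 for s in states.values() if s["B"] > fatigue_th)
    # More hungry occupants -> eating establishment; more tired -> coffee
    # shop; equal counts -> eating establishment, as described above.
    genre = "restaurant" if hungry >= tired else "coffee shop"
    # The occupant with the highest hunger degree becomes the target user.
    target = max(states, key=lambda name: states[name]["H"])
    return genre, target

states = {"driver": {"H": 0.9, "B": 0.2}, "passenger": {"H": 0.3, "B": 0.8}}
print(choose_presentation_candidate(states))  # ('restaurant', 'driver')
```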
  • The device control unit 17 acquires destination information 103 matching the destination genre of the presentation candidate 106 from the destination information 103 stored in the destination information storage unit 32, and treats this destination information 103 as the destination content analysis result 107.
  • the device control unit 17 acquires user preference information 100 that matches the destination target user of the presentation candidate 106, and uses the user preference information 100 as the user preference content analysis result 108.
  • The device control unit 17 performs a route search from the current location to the destination, based on the current location information 105 and the address in the destination content analysis result 107, using map information acquired from a map database (not shown), and obtains a route search result 109. Thereafter, the device control unit 17 determines the presentation destination information 110 from the current time of the time information 104, the destination content analysis result 107, the user preference content analysis result 108, and the route search result 109, by scoring and threshold determination.
  • the presentation destination information 110 with the restaurant as the destination is determined.
  • Specifically, a restaurant matching the user's preferences is determined from the current time; the cooking genre, usage age range, target genders, business hours, and average fee specified in the destination content analysis result 107; the age, gender, favorite food genres, and eating-out budget set in the user preference content analysis result 108; and the required time to the destination in the route search result 109.
  • For example, the presentation determination value A is calculated using the following equations (5) to (7).
  • Presentation determination value A = (γ1 × index 3) + (γ2 × index 4) … (5)
  • Index 3 = {c1 × (correlation between cooking genre and favorite food genre)} + {c2 × (fee index value)} … (6)
  • Index 4 = {d1 × (age index value)} + {d2 × (gender index value)} + {d3 × (business-hours index value)} + {d4 × (required-time index value)} … (7)
  • Here, index 3 is an index value indicating how well the store matches the user's preferences, and index 4 is an index value indicating the general suitability of the store. γ1 is a coefficient for index 3, and γ2 is a coefficient for index 4.
  • c1 is a coefficient for the degree of correlation between the cooking genre and the favorite food genre.
  • c2 is a coefficient for the fee index value, which is “1” when the user's eating-out budget covers the average fee and “0” otherwise.
  • d1 is a coefficient for the age index value, which is “1” when the user's age is within the store's usage age range and “0” otherwise.
  • d2 is a coefficient for the gender index value, which is “1” when the user's gender is among the store's target genders and “0” otherwise.
  • d3 is a coefficient for the business-hours index value, which is “1” when the sum of the current time and the required time falls within business hours and “0” otherwise.
  • d4 is a coefficient for the required-time index value, which is “1” when the required time is within a threshold time and “0” otherwise. Note that γ1, γ2, c1, c2, and d1 to d4 are coefficients that differ for each driver or passenger.
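A sketch of scoring one candidate store against equations (5) to (7) follows. The Greek coefficient symbols are illegible in this text, so `gamma` is a placeholder; the dictionary layouts, the time threshold, and all coefficient values are assumptions:

```python
def presentation_value_a(store, user, current_min, required_min,
                         gamma=(1.0, 1.0), c=(1.0, 1.0),
                         d=(1.0, 1.0, 1.0, 1.0), time_threshold_min=60):
    """Score one candidate store. All coefficients and dict layouts are
    illustrative placeholders, not values from the patent."""
    # Index 3 (eq. (6)): preference fit of the store
    genre_corr = 1.0 if store["genre"] == user["favorite_genre"] else 0.0
    fee_ok = 1 if user["budget"] >= store["avg_fee"] else 0
    index3 = c[0] * genre_corr + c[1] * fee_ok

    # Index 4 (eq. (7)): general suitability of the store
    age_ok = 1 if store["age_min"] <= user["age"] <= store["age_max"] else 0
    gender_ok = 1 if user["gender"] in store["genders"] else 0
    arrival = current_min + required_min  # minutes since midnight
    hours_ok = 1 if store["open_min"] <= arrival <= store["close_min"] else 0
    time_ok = 1 if required_min <= time_threshold_min else 0
    index4 = d[0] * age_ok + d[1] * gender_ok + d[2] * hours_ok + d[3] * time_ok

    # Eq. (5): weighted combination
    return gamma[0] * index3 + gamma[1] * index4

store = {"genre": "italian", "avg_fee": 1200, "age_min": 18, "age_max": 99,
         "genders": {"male", "female"}, "open_min": 600, "close_min": 1320}
user = {"favorite_genre": "italian", "budget": 1500, "age": 34, "gender": "female"}
print(presentation_value_a(store, user, current_min=700, required_min=20))  # 6.0
```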
  • the device control unit 17 calculates a presentation determination value A for each destination that matches the target genre of the presentation candidate 106 in the above-described procedure, and determines a store to be presented to the user.
  • the maximum number of shops to be presented is arbitrarily set.
  • From the presentation destination information 110 determined by the device control unit 17, the display device control unit 18 controls the display unit 19 so as to present the images set in the presentation destination information 110, together with voice explaining the reason for each presentation, in descending order of the presentation determination value A of each destination. For example, a store is a presentation target if its presentation determination value A is equal to or greater than a predetermined threshold.
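The threshold filter and descending-order presentation can be sketched as follows; the threshold value and the cap on the number of shops are placeholders:

```python
def stores_to_present(scores, threshold=3.0, max_count=3):
    """scores maps a store name to its presentation determination value A.
    Returns stores at or above the threshold, best first, capped at
    max_count (the maximum number of shops to present is set arbitrarily)."""
    eligible = [(name, a) for name, a in scores.items() if a >= threshold]
    eligible.sort(key=lambda pair: pair[1], reverse=True)
    return eligible[:max_count]

print(stores_to_present({"A": 5.0, "B": 2.0, "C": 4.0}))  # [('A', 5.0), ('C', 4.0)]
```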
  • The user decides on (or cancels) a destination from among the presented destinations by terminal key input or voice input, and route guidance is started (or not performed).
  • the navigation apparatus 10 ends the route guidance to the destination that has already been started.
  • As described above, the index values of the user state information 102, calculated from the biological information 101 of the users on board, are compared with predetermined thresholds, and when it is determined that a change has occurred in the user's current state, the destination candidates and routes associated in advance with that state are presented to the user. With this configuration, the navigation device 10 actively presents destination candidates and route changes without any user operation, which saves the user effort and enables guidance to a destination that matches the user's preferences. In addition, since the user preference information 100 and the biological information 101 are used, destination candidates matching the user's preferences can be presented, and the route changed, at an appropriate presentation timing according to the user's current state.
  • the present invention is not limited to this configuration.
  • For example, the preference information acquisition unit 11 may be omitted, and the device control unit 17 may determine the presentation timing based on the biological information 101 sequentially detected from the user during driving.
  • In this case, the device control unit 17 determines the presentation destination information 110 from the current time of the time information 104, the destination content analysis result 107, and the route search result 109, and controls the display device control unit 18 to notify the user.
  • FIG. 5 is a diagram schematically showing the configuration of a car navigation system using a navigation device according to Embodiment 2 of the present invention.
  • the car navigation system according to the second embodiment includes an out-of-vehicle server 70 and another navigation device 40 in addition to the configuration shown in the first embodiment.
  • the navigation device 10, the out-of-vehicle server 70, and the other navigation device 40 are connected by an out-of-vehicle network 91 and can communicate with each other.
  • the off-vehicle server 70 is a server device that manages the destination additional information 111.
  • the destination addition information 111 is information indicating, for example, an in-store image related to the destination information 103, a service of a store currently being implemented, and the like.
  • the outside-vehicle network 91 is a single wireless communication line that connects the navigation device 10, the outside-server 70, and the other navigation device 40.
  • the other navigation device 40 is a navigation device that is installed in a vehicle different from the navigation device 10 and is managed by another user.
  • the other navigation device 40 notifies the other user evaluation information 112 regarding the destination information 103 to the navigation device 10 in the wireless communication area via the outside-vehicle network 91.
  • the other user evaluation information 112 is evaluation information of other users indicating, for example, a degree of satisfaction with the store with respect to the destination information 103.
  • FIG. 6 is a block diagram showing the configuration of the navigation device according to the second embodiment.
  • the navigation device 10 according to the second embodiment includes a communication control unit 20 in addition to the configuration described with reference to FIG. 2 in the first embodiment.
  • the communication control unit 20 is a component that communicates with the outside server 70 and the other navigation device 40 via the outside network 91 and exchanges information with them.
  • the route including the restaurant is presented as the presentation destination information 110 at the timing when the driver or passenger is determined to be hungry or tired based on the biometric information 101.
  • At this timing, the communication control unit 20 acquires the destination additional information 111, such as in-store images related to the destination information 103 and services currently offered by the store, from the out-of-vehicle server 70 by wireless communication.
  • the communication control unit 20 similarly acquires other user evaluation information 112 such as satisfaction with the store with respect to the destination information 103 from the other navigation device 40.
  • the other navigation device 40 to be communicated with needs to be communicable (within the communication area) with the navigation device 10 via the outside-vehicle network 91.
  • the target other user must manage the other navigation device 40 described above, and the user of the navigation device 10 needs to permit communication. That is, the target other user is a user highly correlated with the user preference content analysis result 108.
  • the destination additional information 111 and the other user evaluation information 112 acquired by the communication control unit 20 are stored in a memory (not shown) in the device control unit 17.
  • the destination additional information 111 is acquired by the communication control unit 20 from the out-of-vehicle server 70 as information on the determined store when the store to be presented to the user is determined by the device control unit 17. This destination additional information 111 is added to an image, sound, or the like presented on the display unit 19.
  • the out-of-vehicle server 70 creates and manages the destination additional information 111 from information provided from the outside.
  • the information content that can be provided to the outside server 70 may be set according to, for example, an advertisement fee paid to the administrator of the outside server 70.
  • When the device control unit 17 determines the store to be presented to the user, the other user evaluation information 112 is acquired from the other navigation device 40 as evaluation information of other users who are of the same age (within an age-difference threshold) and gender, and share a favorite food genre, with respect to the user preference content analysis result 108. The other user evaluation information 112 is used in the calculation of the presentation determination value by the device control unit 17.
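The similar-user filter described above (age within a threshold, same gender, shared favorite food genre) might be sketched as follows; the field names and the age threshold are assumptions:

```python
def similar_users(me, others, age_threshold=5):
    """Select other users whose profiles match the user preference content
    analysis result 108: age within a threshold, same gender, and the
    same favorite food genre."""
    return [u for u in others
            if abs(u["age"] - me["age"]) <= age_threshold
            and u["gender"] == me["gender"]
            and u["favorite_genre"] == me["favorite_genre"]]

me = {"age": 34, "gender": "female", "favorite_genre": "italian"}
others = [
    {"age": 36, "gender": "female", "favorite_genre": "italian"},  # matches
    {"age": 52, "gender": "female", "favorite_genre": "italian"},  # age gap too large
    {"age": 33, "gender": "male", "favorite_genre": "italian"},    # different gender
]
print(len(similar_users(me, others)))  # 1
```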
  • In this way, information provided by the destination (candidate location), acquired from the out-of-vehicle server 70 through external communication, is included in the information presented when a destination candidate is presented or a route change is made.
  • In addition, evaluations of the destination by other users with similar tastes, acquired from the other navigation device 40 through external communication, are used.
  • The information 111 and 112 acquired from the out-of-vehicle server 70 and the other navigation device 40 may be provided through a billing system, configured so that the content that can be presented differs according to the amount paid.
  • the presentation destination information 110 with a restaurant as the destination is determined as follows. A restaurant matching the user's preference is selected from: the cooking genre, usage age range, usage gender, business hours, and average fee specified in the destination content analysis result 107; the age, gender, favorite food genre, and restaurant budget set in the user preference content analysis result 108; the current time and the required time to the destination in the route search result 109; and the degree of satisfaction with the store in the other user evaluation information 112.
  • the presentation determination value B is calculated using the following formulas (8) to (10).
  • index 5 is an index value indicating how well the store matches the user's preference.
  • index 6 is an index value indicating the generality of the store.
  • β1 is a coefficient for the index value indicating the preference of the store.
  • β2 is a coefficient for the index value indicating the generality of the store.
  • e1 is a coefficient for the degree of correlation between the cooking genre and the favorite food genre.
  • e2 is a coefficient for an index value that is “1” when the user's eating-out budget falls within the store's average fee, and “0” otherwise.
  • f1 is a coefficient for an index value that is “1” when the user's age falls within the store's usage age range, and “0” otherwise.
  • f2 is a coefficient for an index value that is “1” when the user's gender is included in the store's usage gender, and “0” otherwise.
  • f3 is a coefficient for an index value that is “1” when the sum of the current time and the required time falls within business hours, and “0” otherwise.
  • f4 is a coefficient for an index value that is “1” when the required time is within the threshold time, and “0” otherwise.
  • β1 and β2 are coefficients whose values are proportional to the degree of satisfaction with the store indicated in the other user evaluation information 112.
  • e1, e2, and f1 to f4 are different coefficients depending on the driver or the passenger.
  • Presentation determination value B = (β1 × index 5) + (β2 × index 6) (8)
  • Index 5 = {e1 × (degree of correlation between cooking genre and favorite food genre)} + {e2 × (index value of whether the eating-out budget falls within the average fee)} (9)
  • Index 6 = {f1 × (index value of whether the age falls within the usage age range)} + {f2 × (index value of whether the gender is included in the usage gender)} + {f3 × (index value of whether the arrival time falls within business hours)} + {f4 × (index value of whether the required time is within the threshold time)} (10)
  • the device control unit 17 calculates a presentation determination value B for each destination that matches the target genre of the presentation candidate 106 in the above-described procedure, and determines a store to be presented to the user.
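As an illustrative sketch of how equations (8) to (10) could be evaluated in software (all names, data structures, and coefficient values here are hypothetical, not taken from the patent):

```python
def presentation_value_b(candidate, user, now_min, beta1, beta2,
                         e1, e2, f1, f2, f3, f4, time_threshold_min=60):
    """Presentation determination value B = beta1 * index5 + beta2 * index6 (eq. 8)."""
    # Equation (9): how well the store matches the user's preference.
    fee_ok = 1 if candidate["avg_fee_low"] <= user["budget"] <= candidate["avg_fee_high"] else 0
    index5 = e1 * candidate["genre_correlation"] + e2 * fee_ok

    # Equation (10): how generally suitable the store is.
    age_ok = 1 if candidate["age_low"] <= user["age"] <= candidate["age_high"] else 0
    gender_ok = 1 if user["gender"] in candidate["genders"] else 0
    arrival_min = now_min + candidate["required_min"]   # current time + required time
    hours_ok = 1 if candidate["open_min"] <= arrival_min <= candidate["close_min"] else 0
    time_ok = 1 if candidate["required_min"] <= time_threshold_min else 0
    index6 = f1 * age_ok + f2 * gender_ok + f3 * hours_ok + f4 * time_ok

    return beta1 * index5 + beta2 * index6              # Equation (8)


user = {"budget": 2000, "age": 34, "gender": "F"}
cand = {"genre_correlation": 1.0, "avg_fee_low": 1000, "avg_fee_high": 3000,
        "age_low": 20, "age_high": 60, "genders": {"F", "M"},
        "required_min": 20, "open_min": 600, "close_min": 1320}  # minutes since midnight
b = presentation_value_b(cand, user, now_min=720, beta1=1.0, beta2=1.0,
                         e1=0.5, e2=0.5, f1=0.25, f2=0.25, f3=0.25, f4=0.25)
```

Candidates would then be sorted in descending order of B before being presented, as the text describes.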
  • the display device control unit 18 controls the display unit 19 so as to present, in descending order of the presentation determination value B, the images set in the presentation destination information 110 determined by the device control unit 17, together with voice explaining the reason each destination is presented.
  • the user inputs the user evaluation information 113, in which the degree of satisfaction with the destination is graded, to the destination information acquisition unit 14 by terminal key input or voice input.
  • This user evaluation information 113 is stored in the destination information storage unit 32 of the database 30 from the destination information acquisition unit 14 via the in-vehicle network 90.
  • This user evaluation information 113 is used in the other navigation device 40.
  • the other navigation device 40 is a device having the same configuration and functions as the navigation device 10, and can use the user evaluation information 113 acquired from the navigation device 10 as its own other user evaluation information 112.
  • the navigation device communicates with the out-of-vehicle server 70, which manages the destination additional information 111 related to destinations, and with the other navigation device 40, which is managed by another user and transmits the other user evaluation information 112 for destinations. When the device control unit 17 determines that a change has occurred in the user state information 102, it selects, from the destination candidates and routes previously associated with that state, those that match the preference indicated by the user preference information and that are further screened by the other user evaluation information 112 received by the communication control unit 20, and presents them to the user of the host vehicle together with the destination additional information 111 received by the communication control unit 20 from the out-of-vehicle server 70.
  • since the other user evaluation information 112, which is the evaluation of destinations by other users with the same tastes, can be used to determine the store to be presented, a destination that more closely matches the user's taste can be presented.
  • as described above, the navigation device according to the present invention can present destination candidates and routes desired by the user at an appropriate timing in accordance with the user's current state, based on the biological information and preferences of the vehicle's occupants, and is therefore useful as a vehicle-mounted navigation device.
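The overall behavior summarized above (detect a change in the user state, filter candidates by preference, re-rank them by other users' evaluations, and present) could be sketched as follows; every name here is a hypothetical placeholder, not from the patent:

```python
def on_user_state_update(index_value, threshold, prev_exceeded,
                         candidates, user_pref, other_evals, score_fn):
    """If the user-state index crosses the threshold (i.e. the user state
    changed), return preference-matched candidates ranked with other users'
    evaluations; otherwise return None (nothing to present)."""
    exceeded = index_value > threshold
    if exceeded == prev_exceeded:        # no change in the user state
        return None
    # Keep only candidates whose genre matches the user's favorite genres.
    matched = [c for c in candidates if c["genre"] in user_pref["favorite_genres"]]
    # Rank by a score that reflects other users' satisfaction with each store.
    matched.sort(key=lambda c: score_fn(c, other_evals.get(c["name"], 0)),
                 reverse=True)
    return matched


candidates = [{"name": "A", "genre": "sushi"},
              {"name": "B", "genre": "curry"},
              {"name": "C", "genre": "sushi"}]
user_pref = {"favorite_genres": {"sushi"}}
other_evals = {"A": 3, "C": 5}           # hypothetical satisfaction grades
ranked = on_user_state_update(1.2, 1.0, False, candidates, user_pref,
                              other_evals, lambda c, sat: sat)
```

In this sketch the score function simply returns the other users' satisfaction grade; in the patent's scheme it would be the presentation determination value B.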


Abstract

The invention relates to a navigation device that compares an index value with a predetermined threshold value, the index value expressing a user state computed from biometric information of a user riding in a moving object. If the navigation device determines that a change has occurred in the user state, it presents to the user a destination candidate and a route established in advance for that user state.
PCT/JP2009/003299 2009-07-14 2009-07-14 Dispositif de navigation WO2011007386A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2009/003299 WO2011007386A1 (fr) 2009-07-14 2009-07-14 Dispositif de navigation

Publications (1)

Publication Number Publication Date
WO2011007386A1 true WO2011007386A1 (fr) 2011-01-20

Family

ID=43449010

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2009/003299 WO2011007386A1 (fr) 2009-07-14 2009-07-14 Dispositif de navigation

Country Status (1)

Country Link
WO (1) WO2011007386A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005293587A (ja) * 1997-07-22 2005-10-20 Equos Research Co Ltd エージェント装置
JP2005345325A (ja) * 2004-06-04 2005-12-15 Kenwood Corp カーナビゲーション装置、カーナビゲーション方法及びプログラム
JP2007212421A (ja) * 2006-02-13 2007-08-23 Denso Corp 自動車用もてなし情報提供システム
JP2009042891A (ja) * 2007-08-07 2009-02-26 Denso Corp 施設検索装置

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013165310A (ja) * 2012-02-09 2013-08-22 Nikon Corp 電子機器
JP2015212125A (ja) * 2014-05-07 2015-11-26 株式会社小糸製作所 車両用室内灯装置
EP3246870A4 (fr) * 2015-01-14 2018-07-11 Sony Corporation Système de navigation, dispositif terminal client, procédé de commande et support de stockage
US10408629B2 (en) 2015-01-14 2019-09-10 Sony Corporation Navigation system, client terminal device, control method, and storage medium
DE102018100373A1 (de) 2018-01-09 2019-07-11 Motherson Innovations Company Limited Verfahren zur vertikalen Keystone-Korrektur in Projektionssystemen für Head-Up-Displays
DE102018100373A9 (de) 2018-01-09 2020-01-02 Motherson Innovations Company Limited Verfahren zur vertikalen Keystone-Korrektur in Projektionssystemen für Head-Up-Displays
JP2020165694A (ja) * 2019-03-28 2020-10-08 本田技研工業株式会社 制御装置、制御方法およびプログラム
CN111762147A (zh) * 2019-03-28 2020-10-13 本田技研工业株式会社 控制装置、控制方法以及存储程序的存储介质
JP7190952B2 (ja) 2019-03-28 2022-12-16 本田技研工業株式会社 制御装置、制御方法およびプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 09847286; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 09847286; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)