JP2016057154A - Information processing system, program, and on-vehicle device - Google Patents

Information processing system, program, and on-vehicle device

Info

Publication number
JP2016057154A
Authority
JP
Japan
Prior art keywords
vehicle
image data
identifier
unit
position
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2014183297A
Other languages
Japanese (ja)
Inventor
正寛 牧野
Masahiro Makino
幸文 青山
Yukifumi Aoyama
清 都丸
Kiyoshi Tomaru
進一 吉田
Shinichi Yoshida
修 稲垣
Osamu Inagaki
Original Assignee
株式会社東海理化電機製作所
Tokai Rika Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社東海理化電機製作所 (Tokai Rika Co Ltd)
Priority to JP2014183297A
Publication of JP2016057154A
Application status: Pending

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in preceding groups G01C1/00-G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in preceding groups G01C1/00-G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00 Registering or indicating the working of vehicles
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0968 Systems involving transmission of navigation instructions to the vehicle
    • G08G1/0969 Systems involving transmission of navigation instructions to the vehicle having a display in the form of a map
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00 Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B29/10 Map spot or coordinate position indicators; Map reading aids

Abstract

PROBLEM TO BE SOLVED: To suppress the influence of communication delay and improve the accuracy of display corresponding to the vehicle position.

SOLUTION: An information processing system comprises an on-vehicle device mounted on a vehicle and an information processor. The information processor comprises: a first communication unit for transmitting image data with identifiers attached to the on-vehicle device; and a communication control unit for controlling the first communication unit so as to transmit, to the on-vehicle device, the identifier corresponding to the position of the vehicle among the identifiers attached to already-transmitted image data. The on-vehicle device comprises: a second communication unit for receiving, from the information processor, the image data with the identifiers attached and the identifiers; a storage unit for storing the image data with the identifiers attached received by the second communication unit; and a display control unit that, when the second communication unit receives an identifier corresponding to the position of the vehicle from the information processor, extracts the image data with that identifier attached from the storage unit and performs display control on the basis of the extracted image data.

SELECTED DRAWING: Figure 4

Description

  The present invention relates to an information processing system, a program, and an in-vehicle device.

  In recent years, navigation systems that present to vehicle users the current position of the vehicle and turn directions, such as right and left turns, have become widespread. For example, a navigation system that includes an in-vehicle device having a display function and a mobile terminal having a navigation function is known. In this navigation system, the mobile terminal generates image data for navigation and wirelessly transmits the image data to the vehicle-mounted device, and the vehicle-mounted device performs display processing based on the image data.

  Furthermore, Patent Document 1 discloses a navigation system including a mobile terminal and a center connected to the mobile terminal via a network. The center in this navigation system generates and transmits image data indicating the position of the mobile terminal based on information received from the mobile terminal, and the mobile terminal performs display processing based on the image data received from the center. Here, the center predicts, based on the communication delay between the mobile terminal and the center, the position at which the mobile terminal will be located at the timing when the image data is displayed on the mobile terminal, and generates image data indicating that predicted position.

JP-A-2005-25037

  However, in the navigation system described in Patent Document 1, since display is performed on the basis of the predicted position, when a sudden change in the vehicle speed or the direction of the vehicle occurs, display based on a position different from the actual position, that is, erroneous display, occurs.

  Therefore, the present invention has been made in view of the above problems, and an object of the present invention is to provide a novel and improved information processing system, program, and vehicle-mounted device capable of suppressing the influence of communication delay and improving display accuracy in a vehicle.

In order to solve the above problems, according to one aspect of the present invention,
there is provided an information processing system comprising an on-vehicle device mounted on a vehicle and an information processing device. The information processing device includes: a first communication unit that transmits image data to which identifiers are attached to the on-vehicle device; and a communication control unit that controls the first communication unit so as to transmit, to the on-vehicle device, an identifier corresponding to the position of the vehicle among the identifiers attached to the already-transmitted image data. The on-vehicle device includes: a second communication unit that receives, from the information processing device, the image data to which the identifiers are attached and the identifiers; a storage unit that stores the image data with the identifiers attached received by the second communication unit; and a display control unit that, when the second communication unit receives an identifier corresponding to the position of the vehicle from the information processing device, extracts the image data to which that identifier is attached from the storage unit and performs display control based on the extracted image data.

  The information processing device may include: an image generation unit that generates the image data; a storage unit that stores the identifier attached to each piece of image data in association with a specific position; and an arrival determination unit that extracts the identifier associated with the specific position reached by the vehicle. The communication control unit may control the first communication unit to transmit the identifier extracted by the arrival determination unit to the vehicle-mounted device.

  The storage unit may store one or more pieces of image data generated by the image generation unit, and the communication control unit may control the transmission of the one or more pieces of image data stored in the storage unit according to the driving state of the vehicle.

  The image generation unit may generate image data for a specific position on the route between the current position of the vehicle and the destination, and the storage unit may store the identifier of the image data generated for the specific position in association with that specific position.

  The first communication unit may transmit the image data and the identifier to the in-vehicle device by wireless communication.

  In order to solve the above-described problem, according to another aspect of the present invention, there is provided a program for causing a computer to execute: a process of transmitting image data to which identifiers are attached to an on-vehicle device mounted on a vehicle; and a process of transmitting, to the on-vehicle device, an identifier corresponding to the position of the vehicle among the identifiers attached to the transmitted image data.

  Moreover, in order to solve the above-described problem, according to another aspect of the present invention, there is provided an on-vehicle device mounted on a vehicle, including: a communication unit that communicates with an information processing apparatus; a storage unit that stores image data to which identifiers received by the communication unit are attached; and a display control unit that, when an identifier corresponding to the position of the vehicle is received from the information processing apparatus by the communication unit, extracts the image data to which that identifier is attached from the storage unit and controls display of a display device based on the extracted image data.

  As described above, according to the present invention, it is possible to suppress the influence of communication delay and improve the display accuracy in the vehicle.
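  As a purely illustrative aid (not part of the patent disclosure), the exchange described in the aspects above can be sketched in Python. Every class, method, and data value here is hypothetical; the point is that the bulky image data crosses the link ahead of time, so only a short identifier must be delivered at the moment the vehicle reaches a turn position:

```python
# Sketch of the claimed two-phase exchange: image data with identifiers is
# pre-transmitted; at arrival, only the identifier is sent and the on-vehicle
# device looks up the pre-stored image data for display.

class OnVehicleDevice:
    def __init__(self):
        self.storage = {}      # identifier -> pre-received image data
        self.displayed = None

    def receive_image(self, identifier, image_data):
        self.storage[identifier] = image_data

    def receive_identifier(self, identifier):
        # Extract the pre-stored image data and perform display control.
        self.displayed = self.storage.get(identifier)

class InformationProcessor:
    def __init__(self, device):
        self.device = device
        self.sent = {}         # identifier -> turn instruction position

    def pretransmit(self, identifier, image_data, position):
        self.device.receive_image(identifier, image_data)
        self.sent[identifier] = position

    def on_vehicle_position(self, position):
        for identifier, pos in self.sent.items():
            if pos == position:
                self.device.receive_identifier(identifier)

device = OnVehicleDevice()
proc = InformationProcessor(device)
proc.pretransmit("001", "turn-left-at-X", position="P1")  # ahead of time
proc.on_vehicle_position("P1")                            # at arrival
print(device.displayed)   # -> turn-left-at-X
```

Because only the identifier is exchanged at arrival time, a communication delay affects a few bytes rather than a whole image, and no position prediction is required.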

FIG. 1 is an explanatory diagram showing a navigation system according to a first embodiment of the present invention.
FIG. 2 is an explanatory diagram showing the appearance of the front of the passenger compartment of a vehicle.
FIG. 3 is an explanatory diagram showing the operation of a navigation system according to a comparative example.
FIG. 4 is an explanatory diagram showing the configuration of a mobile terminal according to the first embodiment of the present invention.
FIG. 5 is an explanatory diagram showing a configuration example of an image DB.
FIG. 6 is an explanatory diagram showing the configuration of an on-vehicle device according to the first embodiment of the present invention.
FIG. 7 is an explanatory diagram showing data stored in the storage unit of the on-vehicle device.
FIG. 8 is an explanatory diagram showing the operation of the navigation system according to the first embodiment.
FIG. 9 is an explanatory diagram showing a first modification.
FIG. 10 is an explanatory diagram showing a second modification.
FIG. 11 is an explanatory diagram showing a navigation system according to a second embodiment of the present invention.
FIG. 12 is an explanatory diagram showing the operation of the navigation system according to the second embodiment.
FIG. 13 is an explanatory diagram showing the hardware configuration of the mobile terminal.

  Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings. In this specification and the drawings, components having substantially the same functional configuration are denoted by the same reference numerals, and redundant description thereof is omitted.

  In this specification and the drawings, a plurality of components having substantially the same functional configuration may be distinguished from one another by appending different letters to the same reference numeral. However, when there is no particular need to distinguish such components, only the same reference numeral is given.

<< 1. First Embodiment >>
<1-1. Overview of the navigation system>
First, an overview of a navigation system according to the first embodiment of the present invention will be described with reference to FIGS. 1 and 2.

  FIG. 1 is an explanatory diagram showing a navigation system according to the first embodiment of the present invention. As shown in FIG. 1, the navigation system according to the present embodiment is an information processing system having a mobile terminal 20 and an in-vehicle device 30 mounted on the vehicle 1.

  The mobile terminal 20 is an example of an information processing apparatus having a navigation function. When a destination of the vehicle 1 is set, the mobile terminal 20 displays a route guidance image indicating a route from the current position of the vehicle 1 to the destination. In addition, the mobile terminal 20 generates image data for each turn instruction position on the route (a position at which the user is instructed to make a turn, such as a right or left turn) and wirelessly transmits the image data to the vehicle-mounted device 30. The vehicle-mounted device 30 displays a turn instruction image at each turn instruction position based on the image data received from the mobile terminal 20. Hereinafter, the mobile terminal 20 and the vehicle-mounted device 30 will be described more specifically with reference to FIG. 2.

  FIG. 2 is an explanatory diagram showing the appearance of the front of the passenger compartment of the vehicle 1. As shown in FIG. 2, a dashboard 14 including a meter panel 16 is provided at the front of the passenger compartment of the vehicle 1. As an example, the mobile terminal 20 is installed on the dashboard 14. The mobile terminal 20 includes a display unit 232, and as shown in FIG. 2, a route guidance image 50 including a vehicle mark 52 indicating the current position on a map and a route display 54 indicating the route to the destination is displayed on the display unit 232 of the mobile terminal 20.

  The meter panel 16 has instruments such as a speedometer, a tachometer, a fuel gauge, and a water temperature gauge. The meter panel 16 also includes a display unit 332 that constitutes a part of the vehicle-mounted device 30. Based on the image data received from the mobile terminal 20, a turn instruction image 60 indicating the turn position and the turn direction is displayed on the display unit 332. For example, the turn instruction image 60 shown in FIG. 2 indicates that the turn position is the X intersection 100 m ahead and that the turn direction at the X intersection is left. Here, the meter panel 16 is generally arranged in front of the driver's seat. For this reason, the user driving the vehicle 1 can check the display unit 332 in the meter panel 16 and thereby recognize the turn position and the turn direction while keeping his or her line of sight toward the front.

  The display unit 332 may be a liquid crystal display or an OLED (Organic Light-Emitting Diode) display. Although FIG. 2 shows an example in which the meter panel 16 has instruments such as a speedometer, a tachometer, a fuel gauge, and a water temperature gauge, the meter panel 16 may instead be a display that shows images of a speedometer, a tachometer, a fuel gauge, a water temperature gauge, and the like, and a part of that display may be used as the display unit 332 for displaying the turn instruction image 60.

  Although FIG. 2 shows an example in which the display unit 332 is provided in the meter panel 16, the present embodiment is not limited to this example. For example, the display unit 332 may be a HUD (Head-Up Display) provided on the dashboard 14. In this case, the display unit 332 projects the turn instruction image 60 onto the windshield of the vehicle 1 so that the driver of the vehicle 1 can view it.

  Although a smartphone is shown in FIG. 2 as an example of the mobile terminal 20, the mobile terminal 20 is not limited to a smartphone. For example, the mobile terminal 20 may be an information processing device such as a PND (Portable/Personal Navigation Device), a mobile phone, a PHS (Personal Handy-phone System), a portable music player, a portable video processing device, or a portable game device.

  Although FIGS. 1 and 2 show a four-wheeled vehicle as the vehicle 1, the vehicle to which the present embodiment can be applied is not limited to a four-wheeled vehicle. For example, the present embodiment can also be applied to vehicles such as motorcycles, bicycles, military vehicles, and trains.

<1-2. Background>
The overview of the navigation system according to the first embodiment of the present invention has been described above. Next, the background that led to the creation of the navigation system according to the present embodiment will be described.

  As described above, the vehicle-mounted device 30 displays the turn instruction image 60 based on the image data received from the mobile terminal 20. For this reason, if the mobile terminal 20 transmits image data to the vehicle-mounted device 30 upon reaching a turn instruction position and the vehicle-mounted device 30 displays the turn instruction image based on that image data, the communication delay of the image data causes the turn instruction image to be displayed with a delay after arrival at the turn instruction position. In particular, when the communication between the mobile terminal 20 and the vehicle-mounted device 30 is performed by relatively low-speed wireless communication such as Bluetooth (registered trademark), the above delay appears more noticeably. For example, the turn instruction image 60 may be displayed only after the vehicle 1 has passed through the turn position.

  In order to alleviate the above problem, it is conceivable to communicate image data in consideration of the occurrence of communication delay. A navigation system that communicates image data in consideration of the occurrence of communication delay will be described below as a comparative example of the embodiment of the present invention with reference to FIG.

  FIG. 3 is an explanatory diagram showing the operation of the navigation system according to the comparative example. As shown in FIG. 3, in the comparative example, first, the mobile terminal transmits a delay measurement message to the vehicle-mounted device (S82). When receiving the delay measurement message, the vehicle-mounted device transmits a delay measurement message response (S84).

  Subsequently, the mobile terminal estimates the delay time of communication between the mobile terminal and the vehicle-mounted device based on the difference between the transmission timing of the delay measurement message and the reception timing of the delay measurement message response (S86).

  Thereafter, the mobile terminal predicts the position of the vehicle after the lapse of the delay time estimated in S86 from the present time (S88), generates image data corresponding to the predicted vehicle position (S90), and transmits the generated image data to the vehicle-mounted device (S92). Then, the vehicle-mounted device performs display processing based on the image data received from the mobile terminal (S94).
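  For illustration only, steps S82-S88 of the comparative example amount to estimating the one-way delay as half the measured round-trip time and extrapolating the vehicle position over that delay. The function names and figures below are hypothetical, and the constant-speed assumption is exactly what fails under sudden speed or direction changes:

```python
import time

def estimate_one_way_delay(send_message, receive_response):
    """S82-S86: estimate one-way delay as half the round-trip time."""
    t_send = time.monotonic()
    send_message()       # S82: delay measurement message
    receive_response()   # S84: delay measurement message response
    t_recv = time.monotonic()
    return (t_recv - t_send) / 2.0

def predict_position(current_pos_m, speed_mps, delay_s):
    """S88: extrapolate the vehicle position over the estimated delay,
    assuming the speed stays constant during the delay interval."""
    return current_pos_m + speed_mps * delay_s

# e.g. a vehicle at 100 m travelling at 15 m/s with a 0.2 s estimated delay
print(predict_position(100.0, 15.0, 0.2))   # -> 103.0
```

If the vehicle brakes or turns during those 0.2 s, the prediction is wrong, which is the erroneous-display problem discussed next.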

  However, in the navigation system according to the comparative example described above, image data corresponding to the predicted vehicle position is generated. Therefore, when a sudden change in vehicle speed or vehicle direction occurs, display based on a position different from the actual vehicle position, that is, erroneous display, occurs.

  Focusing on the above circumstances, the present inventors arrived at the embodiments of the present invention. The navigation system according to the embodiments of the present invention can improve display accuracy in a vehicle. Hereinafter, the configurations of the mobile terminal 20 and the vehicle-mounted device 30 that realize this effect will be described in detail in order.

<1-3. Configuration of mobile terminal>
FIG. 4 is an explanatory diagram showing the configuration of the mobile terminal 20 according to the first embodiment of the present invention. As shown in FIG. 4, the mobile terminal 20 according to the present embodiment includes a position detection unit 220, an operation unit 224, a route setting unit 228, a display unit 232, an image generation unit 236, a storage unit 240, a vehicle state determination unit 244, an arrival determination unit 248, a communication control unit 252, and a communication unit 256.

(Position detection unit)
The position detection unit 220 detects the position of the vehicle 1. In the navigation system according to the present embodiment, since the mobile terminal 20 is used in the vehicle 1, the position of the vehicle 1 is equivalent to the position of the mobile terminal 20. For this reason, the position detection unit 220 according to the present embodiment may detect the position of the mobile terminal 20 as the position of the vehicle 1.

  Note that the method by which the position detection unit 220 detects the position of the vehicle 1 is not particularly limited. For example, the position detection unit 220 may detect the position of the vehicle 1 by GPS (Global Positioning System). Alternatively, the position detection unit 220 may detect the position of the vehicle 1 based on information from various sensors provided in the mobile terminal 20, such as an acceleration sensor, a gyro sensor, and an atmospheric pressure sensor. Alternatively, the position detection unit 220 may detect the position of the vehicle 1 based on information from various sensors provided in the vehicle 1, such as an accelerator opening sensor, a speed sensor, and an acceleration sensor. Further, the position detection unit 220 may detect the position of the vehicle 1 by GPS while GPS signals are received from satellites, and based on information from the various sensors described above while GPS signals are not received.
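  The GPS-with-sensor-fallback selection described above can be sketched as follows. This is an assumption-laden simplification (real dead reckoning fuses sensor data continuously); the coordinates are arbitrary examples:

```python
def detect_position(gps_fix, sensor_estimate):
    """Return the GPS fix when available; otherwise fall back to the
    position estimated from on-board sensors (e.g. in a tunnel)."""
    return gps_fix if gps_fix is not None else sensor_estimate

print(detect_position((35.17, 136.90), (35.16, 136.89)))  # GPS available
print(detect_position(None, (35.16, 136.89)))             # no GPS: sensors
```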

(Operation unit)
The operation unit 224 detects a user operation and outputs the detected operation content. For example, the operation unit 224 detects a destination input operation and outputs information indicating the destination to the route setting unit 228 as the detected operation content.

  Note that the operation unit 224 may be a touch panel or a touch screen provided integrally with the display unit 232, or may be a physical component provided separately from the display unit 232, such as a button, a switch, a lever, or a dial. The operation unit 224 may also be a signal reception unit that detects a signal indicating a user operation transmitted from a remote controller.

(Route setting unit)
The route setting unit 228 sets the route to be guided to the user of the vehicle 1. Specifically, the route setting unit 228 sets the destination of the vehicle 1 according to the information indicating the destination input from the operation unit 224. Then, the route setting unit 228 searches for a route from the current position of the vehicle 1 input from the position detection unit 220 to the destination using the map data stored in the storage unit 240, and sets the route obtained by the search as the route to be guided to the user of the vehicle 1.

  Note that the method by which the route setting unit 228 searches for a route is not particularly limited. For example, the route setting unit 228 may preferentially search for a route that satisfies a condition set by the user or an initially set condition, such as a short travel distance, wide roads, or no congestion.

(Display unit)
The display unit 232 displays various images. In particular, the display unit 232 according to the present embodiment displays the route guidance image 50 described with reference to FIG. 2 based on image data generated by an image generation unit 236 described later. The display unit 232 may be a liquid crystal display or an OLED display.

(Image generation unit)
The image generation unit 236 generates image data for the display unit 232 of the mobile terminal 20 to display the route guidance image 50 and image data for the display unit 332 of the vehicle-mounted device 30 to display the turn instruction image 60.

  Specifically, the image generation unit 236 reads the map data stored in the storage unit 240, superimposes the vehicle mark 52 on the point of the map data corresponding to the current position of the vehicle 1 detected by the position detection unit 220, and superimposes the route display 54 indicating the route set by the route setting unit 228 on the map data, thereby generating image data for displaying the route guidance image 50. This image data is output to the display unit 232, and the route guidance image 50 is displayed on the display unit 232.

  In addition, the image generation unit 236 identifies each turn instruction position on the route set by the route setting unit 228. A turn instruction position is, for example, a position a predetermined distance before a position on the route where a turn such as a right or left turn is to be made. The image generation unit 236 then generates, for each identified turn instruction position, image data indicating the turn position (the position where the turn is made) and the turn direction. After attaching an identifier to the image data, the image generation unit 236 supplies the image data with the identifier to the storage unit 240 in association with the turn instruction position. As will be described later in detail, the turn instruction image 60 is displayed when the vehicle 1 reaches the turn instruction position; therefore, the turn instruction position can also be regarded as a specific position of the vehicle 1 for displaying an image.

(Storage unit)
The storage unit 240 stores programs and data used for the operation of the mobile terminal 20. In particular, the storage unit 240 according to the present embodiment stores map data and an image DB (DataBase).

  The map data is data representing at least a part of the earth's surface and indicates terrain, rivers, road layouts, and the like. Position information is associated with each point on the map data. Based on the position information associated with each point, the image generation unit 236 can superimpose the vehicle mark 52 on the point corresponding to the current position of the vehicle 1 and superimpose the route display 54 indicating the route set by the route setting unit 228. The map data may also include additional data associated with corresponding positions on the map, such as banks, post offices, convenience stores, gas stations, stations, bus stops, railroad crossings, and parking lots.

  The image DB stores image data for turn instructions generated by the image generation unit 236. Hereinafter, the configuration of the image DB will be described more specifically with reference to FIG.

  FIG. 5 is an explanatory diagram illustrating a configuration example of the image DB. As shown in FIG. 5, each entry of the image DB associates a management flag with the image data supplied from the image generation unit 236, the identifier attached to the image data, and the turn instruction position associated with the image data. For example, the identifier "001" illustrated in FIG. 5 is associated with "image data #1" as the image data, "1" as the management flag, and "P1" as the turn instruction position.

  The management flag indicates whether or not the image data associated with it has been transmitted to the vehicle-mounted device 30. In the example illustrated in FIG. 5, "1" indicates that the associated image data has been transmitted to the vehicle-mounted device 30, and "0" indicates that it has not.
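  A minimal sketch of the image DB of FIG. 5 as a Python mapping may help; the field names are hypothetical, since the patent does not prescribe any particular data layout:

```python
# identifier -> {image data, management flag, turn instruction position},
# mirroring the columns of FIG. 5. Flag 1 = already transmitted to the
# vehicle-mounted device, flag 0 = not yet transmitted.
image_db = {
    "001": {"image": "image data #1", "flag": 1, "position": "P1"},
    "002": {"image": "image data #2", "flag": 0, "position": "P2"},
}

def untransmitted(db):
    """Identifiers whose image data is still to be sent (flag == 0)."""
    return [ident for ident, row in db.items() if row["flag"] == 0]

print(untransmitted(image_db))   # -> ['002']
```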

  The storage unit 240 may be a storage medium such as a nonvolatile memory, a magnetic disk, an optical disc, or an MO (Magneto-Optical) disc. Examples of the nonvolatile memory include a flash memory, an SD card, a micro SD card, a USB memory, an EEPROM (Electrically Erasable Programmable Read-Only Memory), and an EPROM (Erasable Programmable ROM). Examples of the magnetic disk include a hard disk and a disc-shaped magnetic disk. Examples of the optical disc include a CD (Compact Disc), a DVD (Digital Versatile Disc), and a BD (Blu-ray Disc (registered trademark)).

(Vehicle state determination unit)
The vehicle state determination unit 244 determines whether or not the vehicle 1 is in a predetermined driving state. For example, the vehicle state determination unit 244 determines whether or not the driving state of the vehicle 1 is a stopped state based on the presence or absence of a change in the position detected by the position detection unit 220. When the vehicle state determination unit 244 determines that the driving state of the vehicle 1 is the stopped state, it extracts the image data and the identifiers associated with the management flag "0" in the image DB of the storage unit 240 and supplies the extracted image data and identifiers to the communication control unit 252. In addition, the vehicle state determination unit 244 updates the management flags associated with the supplied image data and identifiers to "1".

  The vehicle state determination unit 244 may also determine whether or not the vehicle 1 is in a predetermined driving state based on information about the vehicle 1 received from the vehicle-mounted device 30. For example, when a speed sensor provided in the vehicle 1 outputs a vehicle speed pulse signal corresponding to the rotational speed of the drive wheels and the vehicle 1 calculates its speed from the vehicle speed pulse signal, the vehicle state determination unit 244 can determine whether or not the driving state of the vehicle 1 is the stopped state based on the speed calculated by the vehicle 1.
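  The transmit-while-stopped behavior described above can be sketched as follows; all names are hypothetical and this is a minimal model, not the claimed implementation:

```python
def on_vehicle_stopped(image_db, transmit):
    """When the vehicle is judged to be stopped, send every entry whose
    management flag is 0, then mark that entry as transmitted (flag 1)."""
    for identifier, row in image_db.items():
        if row["flag"] == 0:
            transmit(identifier, row["image"])
            row["flag"] = 1

sent = []
db = {"001": {"image": "img1", "flag": 1, "position": "P1"},
      "002": {"image": "img2", "flag": 0, "position": "P2"}}
on_vehicle_stopped(db, lambda ident, img: sent.append(ident))
print(sent)               # -> ['002']
print(db["002"]["flag"])  # -> 1
```

Sending while stopped means the slow image transfer happens when display timeliness does not matter.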

(Arrival determination unit)
The arrival determination unit 248 determines, based on the position of the vehicle 1 input from the position detection unit 220, the turn instruction position reached by the vehicle 1 among the turn instruction positions included in the image DB of the storage unit 240. For example, the arrival determination unit 248 may determine a turn instruction position that matches the position of the vehicle 1 or a turn instruction position within a predetermined range from the position of the vehicle 1 as the turn instruction position that the vehicle 1 has reached. Then, arrival determination unit 248 extracts an identifier associated with the turn instruction position where vehicle 1 has arrived, and supplies the extracted identifier to communication control unit 252.

  The identifier is transmitted to the vehicle-mounted device 30 as described later, and is used to extract the image data stored in association with it in the vehicle-mounted device 30. For this reason, when the image data associated with an identifier has not yet been transmitted to the vehicle-mounted device 30, that image data cannot be extracted in the vehicle-mounted device 30 even if the identifier is transmitted. Therefore, when the management flag "0" is associated with the turn instruction position reached by the vehicle 1, that is, when the image data associated with that turn instruction position has not been transmitted to the vehicle-mounted device 30, the arrival determination unit 248 may refrain from supplying the identifier associated with that turn instruction position to the communication control unit 252.
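The arrival test and the flag guard above can be combined into one sketch. The planar coordinates, the 10 m range, and the field names are illustrative assumptions; the source only specifies "matches or is within a predetermined range".

```python
import math

# Hypothetical image DB entries: identifier, turn instruction position, flag.
TURN_DB = [
    {"identifier": "001", "position": (100.0, 200.0), "flag": 1},  # already sent
    {"identifier": "002", "position": (500.0, 800.0), "flag": 0},  # not yet sent
]

def identifier_to_send(vehicle_pos, turn_db, range_m=10.0):
    """Return the identifier of a reached turn instruction position, but only
    if its image data was already transmitted (flag 1); otherwise None."""
    for entry in turn_db:
        px, py = entry["position"]
        reached = math.hypot(px - vehicle_pos[0], py - vehicle_pos[1]) <= range_m
        if reached and entry["flag"] == 1:
            return entry["identifier"]
    return None
```

A position 5 m from entry "001" yields its identifier; reaching entry "002" yields nothing because its image data is still untransmitted.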

(Communication control unit, communication unit)
The communication control unit 252 controls the communication performed between the communication unit 256 and the vehicle-mounted device 30. For example, the communication control unit 252 causes the communication unit 256 to transmit the identifier-attached image data supplied from the vehicle state determination unit 244. The communication control unit 252 also causes the communication unit 256 to transmit the identifier supplied from the arrival determination unit 248.

  The communication unit 256 is an example of a first communication unit that communicates with the vehicle-mounted device 30. For example, the communication unit 256 wirelessly transmits the identifier-attached image data and the identifier to the vehicle-mounted device 30 under the control of the communication control unit 252. The communication unit 256 may also receive, from the vehicle-mounted device 30, information obtained from various sensors provided in the vehicle 1. The communication method used between the communication unit 256 and the vehicle-mounted device 30 may be Bluetooth (registered trademark), another wireless communication method, or a wired communication method. The communication unit 256 may be a functional block corresponding to the physical layer, and the communication control unit 252 may be a functional block corresponding to a layer above the physical layer.

  Through the functions of the communication control unit 252 and the communication unit 256 described above, the identifier-attached image data is transmitted to the vehicle-mounted device 30 while the driving state of the vehicle 1 is the stopped state, and when the vehicle 1 reaches a turn instruction position, the identifier associated with that turn instruction position is transmitted to the vehicle-mounted device 30. Because the vehicle-mounted device 30 receives both the identifier-attached image data and the identifiers from the mobile terminal 20, it can give a turn instruction to the user at an appropriate position. Hereinafter, the configuration of the vehicle-mounted device 30 will be described with reference to FIG. 6.

<1-4. Configuration of OBE>
FIG. 6 is an explanatory diagram showing the configuration of the vehicle-mounted device 30 according to the first embodiment of the present invention. As shown in FIG. 6, the vehicle-mounted device 30 according to the present embodiment includes a communication unit 320, a storage unit 324, a display control unit 328, and a display unit 332.

(Communication unit)
The communication unit 320 is an example of a second communication unit that performs communication with the communication unit 256 of the mobile terminal 20. For example, the communication unit 320 receives image data to which an identifier is attached from the communication unit 256 of the mobile terminal 20. Further, the communication unit 320 receives the same identifier as the identifier attached to the received image data from the communication unit 256 of the mobile terminal 20. Note that the communication unit 320 may transmit information obtained by various sensors provided in the vehicle 1 to the communication unit 256 of the mobile terminal 20.

(Storage unit)
The storage unit 324 stores programs and data used for the operation of the vehicle-mounted device 30. In particular, the storage unit 324 according to the present embodiment stores the identifier-attached image data received from the mobile terminal 20, as shown in FIG. 7.

  FIG. 7 is an explanatory diagram showing data stored in the storage unit 324 of the vehicle-mounted device 30. As illustrated in FIG. 7, the storage unit 324 stores image data in association with each identifier. For example, the identifier "001" shown in FIG. 7 is associated with "image data #1" as image data.

  The storage unit 324 may be a storage medium such as a non-volatile memory, a magnetic disk, an optical disk, or an MO (Magneto Optical) disk, like the storage unit 240 of the mobile terminal 20.

(Display control unit, display unit)
The display control unit 328 controls display on the display unit 332. In particular, when the communication unit 320 receives an identifier from the mobile terminal 20, the display control unit 328 according to the present embodiment extracts the image data to which that identifier is attached from the storage unit 324 and causes the display unit 332 to display the turn instruction image 60 based on the extracted image data. For example, when the communication unit 320 receives the identifier "002", the display control unit 328 extracts "image data #2", to which the identifier "002" is attached, from the storage unit 324 and causes the display unit 332 to display the turn instruction image 60 based on "image data #2".

  Here, when the vehicle 1 reaches a turn instruction position, the mobile terminal 20 transmits the identifier of the image data generated for that turn instruction position to the communication unit 320 of the vehicle-mounted device 30. For this reason, the display unit 332 of the vehicle-mounted device 30 can display the turn instruction image 60 based on the image data generated for the turn instruction position at the moment the vehicle 1 actually reaches it. Further, in the present embodiment, when the vehicle 1 reaches the turn instruction position, an identifier whose data amount is smaller than that of the image data is transmitted, and the image data itself is not transmitted. Therefore, compared with an example in which image data is transmitted when the vehicle 1 reaches the turn instruction position, the present embodiment can display the turn instruction image 60 with a small delay after the vehicle 1 arrives at the turn instruction position.
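The display-side lookup can be sketched as follows. The in-memory dict stands in for the storage unit 324, the `displayed` list stands in for the display unit 332, and all identifiers and payloads are illustrative.

```python
# Hypothetical contents of the on-vehicle storage (identifier -> image data),
# pre-loaded while the vehicle was stopped.
stored_images = {"001": b"image#1", "002": b"image#2"}
displayed = []  # stands in for what the display unit 332 shows

def on_identifier_received(identifier):
    """On receiving an identifier, look up the pre-stored image data and
    'display' it; an unknown identifier displays nothing."""
    image_data = stored_images.get(identifier)
    if image_data is not None:
        displayed.append(image_data)
    return image_data

on_identifier_received("002")  # vehicle reached the turn instruction position
```

Because only the short identifier crosses the link at arrival time, the lookup and display happen with minimal communication delay, as the paragraph above explains.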

<1-5. Operation>
The configuration of the navigation system according to the first embodiment of the present invention has been described above. Next, the operation of the navigation system according to the present embodiment will be described with reference to FIG. 8. Of the operations of the navigation system, the operation for displaying the turn instruction image 60 on the vehicle-mounted device 30 is described below.

  FIG. 8 is an explanatory diagram showing the operation of the navigation system according to the first embodiment. First, when the route setting unit 228 of the mobile terminal 20 sets a destination and a route according to a user operation (S404), the image generation unit 236 identifies each turn instruction position on the route and generates, for each identified turn instruction position, image data indicating the turn position and turn direction (S408). Then, the storage unit 240 stores an image DB including the image data supplied from the image generation unit 236, the identifier attached to each piece of image data, the turn instruction position associated with each piece of image data, and the management flag (S412).
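Steps S404 to S412 amount to building the image DB from the route's turn instruction positions. A minimal sketch, with the image "generation" faked by a formatted string and identifiers assumed to be zero-padded serial numbers as in FIG. 7:

```python
def build_image_db(turn_positions):
    """Build an image DB entry (identifier, image data, position, flag 0)
    for each turn instruction position on the route."""
    db = []
    for i, pos in enumerate(turn_positions, start=1):
        db.append({
            "identifier": f"{i:03d}",              # e.g. "001", "002", ...
            "image_data": f"turn-image-at-{pos}",  # stands in for generated image data
            "position": pos,
            "flag": 0,                             # not yet transmitted
        })
    return db

db = build_image_db([(10, 20), (30, 40)])  # two illustrative turn instruction positions
```

Every entry starts with the management flag "0", which is what later allows S420 to pick out only the untransmitted image data.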

  Thereafter, the navigation system according to the first embodiment repeats the image display control shown in S416 to S448 until the vehicle 1 arrives at the destination.

  Specifically, when the vehicle state determination unit 244 determines that the driving state of the vehicle 1 is the stopped state (S416/yes), it extracts the image data and identifiers associated with the management flag "0" in the image DB of the storage unit 240 and supplies them to the communication control unit 252 (S420). The communication control unit 252 causes the communication unit 256 to transmit the image data and identifiers supplied from the vehicle state determination unit 244 to the vehicle-mounted device 30 (S424). The vehicle state determination unit 244 then updates the management flags associated with the image data and identifiers supplied to the communication control unit 252 to "1" (S428). Meanwhile, the storage unit 324 of the vehicle-mounted device 30 stores the image data and identifiers transmitted from the mobile terminal 20 (S432). The processes of S416 to S432 are repeated until the vehicle 1 starts traveling (S434).

  On the other hand, while the vehicle 1 is in the traveling state, the arrival determination unit 248 determines, based on the position of the vehicle 1 input from the position detection unit 220, whether there is a turn instruction position among those included in the image DB of the storage unit 240 that the vehicle 1 has reached (S436). When the vehicle 1 reaches a turn instruction position (S436/yes), the arrival determination unit 248 extracts the identifier associated with the turn instruction position reached by the vehicle 1 and supplies the extracted identifier to the communication control unit 252 (S440). The communication control unit 252 causes the communication unit 256 to transmit the identifier supplied from the arrival determination unit 248 to the vehicle-mounted device 30 (S444).

  When the communication unit 320 of the vehicle-mounted device 30 receives the identifier from the mobile terminal 20, the display control unit 328 of the vehicle-mounted device 30 extracts the image data to which the identifier is attached from the storage unit 324 and causes the display unit 332 to display the turn instruction image 60 based on the extracted image data (S448).

<1-6. Effect>
As described above, in the first embodiment of the present invention, the mobile terminal 20 transmits, in advance, the image data for the turn instruction positions that the vehicle 1 may reach in the future to the vehicle-mounted device 30. When the vehicle 1 actually reaches a turn instruction position, the mobile terminal 20 transmits the identifier of the image data for that turn instruction position.

  Here, the data amount of the identifier is smaller than that of the image data. For this reason, compared with an example in which image data is transmitted when the vehicle 1 reaches the turn instruction position, the present embodiment displays the turn instruction image 60 with a small delay after the vehicle 1 reaches the turn instruction position. That is, according to the present embodiment, the turn instruction image 60 intended for the turn instruction position can be displayed at a position closer to that turn instruction position, so that the accuracy of the display of the turn instruction image 60 can be improved. Because the navigation system according to the present embodiment displays the turn instruction image 60 based on the actual vehicle position, its display accuracy can be improved compared with the navigation system according to the comparative example described with reference to FIG.

  In the present embodiment, the identifier-attached image data is transmitted while the vehicle 1 is stopped. The vehicle 1 does not reach a turn instruction position while it is stopped; it reaches a turn instruction position while it is traveling. For this reason, the transmission of the identifier from the mobile terminal 20 to the vehicle-mounted device 30 is performed while the vehicle 1 is in the traveling state. Accordingly, since overlap between the transmission timing of the identifier-attached image data and the transmission timing of the identifier is avoided, the communication delay that would otherwise occur when the identifier is transmitted can be suppressed.

<1-7. Modification>
The first embodiment of the present invention has been described above. Hereinafter, some modifications of the first embodiment of the present invention are described. Each of the modifications described below may be applied to the first embodiment alone or in combination with the others. Each modification may also be applied in place of, or in addition to, the configuration described in the first embodiment.

(First modification)
In the above description, one turn instruction position is specified for one turn position and one piece of image data is generated, but the present embodiment is not limited to this example. As a first modification, the image generation unit 236 may specify a plurality of turn instruction positions for one turn position and generate a plurality of pieces of image data.

  For example, for the X intersection serving as the turn position, the image generation unit 236 may specify the points 100 m, 50 m, 30 m, and 10 m before the X intersection as turn instruction positions, and generate turn instruction image data for each turn instruction position. In this case, as shown in FIG. 9, the turn instruction image 60-1 is displayed at the point 100 m before the X intersection, the turn instruction image 60-2 at the point 50 m before, the turn instruction image 60-3 at the point 30 m before, and the turn instruction image 60-4 at the point 10 m before. According to this configuration, the user can grasp, as needed, the sense of distance between the vehicle 1 and the turn position, which changes as the vehicle 1 travels.
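Deriving the four instruction positions from one turn position can be sketched in one dimension (distance along the route), which is a simplifying assumption; the 100/50/30/10 m offsets come from the example above.

```python
def instruction_positions(turn_position_m, offsets=(100, 50, 30, 10)):
    """Return the turn instruction positions, as distances along the route,
    located the given offsets before a single turn position."""
    return [turn_position_m - d for d in offsets]

positions = instruction_positions(1000)  # turn position at 1000 m along the route
```

Each derived position would then get its own image data entry in the image DB, so four turn instruction images are shown in succession as the vehicle approaches.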

(Second modification)
Conversely to the first modification, the image generation unit 236 may, as a second modification, generate common image data for a plurality of turn instruction positions. For example, the image generation unit 236 may generate left-turn image data that does not include an individual name such as "X intersection"; in this case, the image data can be associated with every turn instruction position at which a left turn should be guided. According to this configuration, the number of pieces of image data transmitted from the mobile terminal 20 to the vehicle-mounted device 30 is reduced, so that both the communication traffic between the mobile terminal 20 and the vehicle-mounted device 30 and the storage capacity used in the storage unit 324 of the vehicle-mounted device 30 can be suppressed.
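The many-to-one association of the second modification can be sketched as follows; the shared identifier "L" and the positions are illustrative assumptions.

```python
def build_shared_db(left_turn_positions):
    """Associate one generic left-turn image with every left-turn instruction
    position, so only a single image payload needs to be transmitted."""
    shared_id = "L"  # one identifier for the generic left-turn image
    image_store = {shared_id: "generic-left-turn-image"}
    position_to_id = {pos: shared_id for pos in left_turn_positions}
    return image_store, position_to_id

store, mapping = build_shared_db([(1, 2), (5, 6), (9, 9)])
```

Three instruction positions map to a single stored image, so the transmitted image count (and the storage used on the vehicle side) stays at one regardless of how many left turns the route contains.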

(Third Modification)
Furthermore, in the above description, the turn instruction image 60 is described as an example of a driving support image for supporting the driving of the user of the vehicle 1, and the turn instruction position as an example of a specific position of the vehicle 1; however, the driving support image and the specific position are not limited to these examples. The image generation unit 236 according to a third modification may specify, as the specific position, a position a predetermined distance before a position on the route where a guidance target such as a convenience store, a gas station, or a railroad crossing exists, and may generate image data indicating the presence of the guidance target for the specified specific position. For example, the image generation unit 236 specifies a position 100 m before a railroad crossing on the route as the specific position and generates image data indicating that there is a railroad crossing 100 m ahead. In this case, as shown in FIG. 10, the display unit 332 of the vehicle-mounted device 30 displays a warning image 62 indicating the presence of a crossing as the driving assistance image.

  The driving assistance image may also be the route guidance image 50. That is, the mobile terminal 20 may generate image data for displaying the route guidance image 50 at each point on the route, attach an identifier to each piece of image data, and transmit the image data to the vehicle-mounted device 30; by then transmitting the identifier of the image data corresponding to the vehicle position, the route guidance image 50 can be displayed as a driving assistance image on the vehicle-mounted device 30. In this case, since the amount of image data transmitted from the mobile terminal 20 to the vehicle-mounted device 30 increases, the effect of transmitting the image data while the vehicle 1 is stopped, that is, the suppression of the communication delay that occurs when the identifier is transmitted, appears more prominently.

(Fourth modification)
In the above, the example in which the image data is transmitted from the mobile terminal 20 to the vehicle-mounted device 30 while the vehicle 1 is stopped has been described, but the transmission timing of the image data is not limited to this example. As a fourth modification, the communication control unit 252 may monitor the communication traffic of the mobile terminal 20 and cause the image data to be transmitted when the communication traffic of the mobile terminal 20 is below a threshold value. According to this configuration, communication other than the image data performed by the mobile terminal 20, for example, communication of audio data for a hands-free call, can be prevented from being hindered by the transmission of the image data.
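The threshold check of the fourth modification is simple to sketch; the 100 kbit/s threshold is an arbitrary illustrative value, since the source does not specify one.

```python
def may_send_image(current_traffic_bps, threshold_bps=100_000):
    """Permit image-data transmission only while the terminal's current
    communication traffic is below the threshold, so that other traffic
    (e.g. hands-free call audio) is not hindered."""
    return current_traffic_bps < threshold_bps
```

In practice the communication control unit would sample the traffic periodically and defer pending image data whenever this check fails.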

(Fifth modification)
As described above, the identifier-attached image data transmitted from the mobile terminal 20 is stored in the storage unit 324 of the vehicle-mounted device 30. However, since the storage capacity of the storage unit 324 is finite, it may be difficult for the storage unit 324 to simultaneously store the image data for all the turn instruction positions on the route.

  Therefore, the communication unit 320 of the vehicle-mounted device 30 may notify the mobile terminal 20 of the free capacity of the storage unit 324, and the communication control unit 252 of the mobile terminal 20 may control the transmission of image data taking that free capacity into account. For example, the communication control unit 252 of the mobile terminal 20 may cause the communication unit 256 to transmit only as much image data as fits in the free capacity of the storage unit 324, and, after image data already used to display the turn instruction image 60 has been deleted from the storage unit 324 of the vehicle-mounted device 30, may cause the communication unit 256 to transmit new image data that fits in the free capacity increased by the deletion. Note that the storage unit 324 of the vehicle-mounted device 30 may delete the image data based on an instruction from the mobile terminal 20 side. In addition, the notification of the free capacity from the vehicle-mounted device 30 to the mobile terminal 20 may be performed when the free capacity of the storage unit 324 falls below a predetermined capacity.
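Selecting pending image data to fit the reported free capacity can be sketched as a greedy, route-order choice. The byte sizes, identifiers, and the greedy strategy itself are assumptions for illustration; the source only requires that the transmitted data fit the free capacity.

```python
def select_for_free_capacity(pending, free_capacity):
    """Pick pending (identifier, size) pairs, in route order, that fit
    within the free capacity reported by the vehicle-mounted device."""
    chosen, used = [], 0
    for identifier, size in pending:
        if used + size <= free_capacity:
            chosen.append(identifier)
            used += size
    return chosen

# 800-byte free capacity: entries "001" (400 B) and "002" (300 B) fit,
# "003" (500 B) would overflow and waits for the next notification.
chosen = select_for_free_capacity([("001", 400), ("002", 300), ("003", 500)], 800)
```

After the vehicle-mounted device deletes already-displayed images and reports the enlarged free capacity, the same selection runs again over the remaining entries.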

<< 2. Second Embodiment >>
The first embodiment and the modification examples of the present invention have been described above. Next, a second embodiment of the present invention will be described. The second embodiment of the present invention can realize a navigation system by cloud computing.

<2-1. Configuration of navigation system>
FIG. 11 is an explanatory diagram showing a navigation system according to the second embodiment of the present invention. As shown in FIG. 11, the navigation system according to the present embodiment includes the server 10, the communication network 12, the mobile terminal 22, and the vehicle-mounted device 30 mounted on the vehicle 1. Since the configuration of the vehicle-mounted device 30 according to the second embodiment is substantially the same as that described in the first embodiment, a detailed description of the vehicle-mounted device 30 is omitted here.

  The communication network 12 is a wired or wireless transmission path for information transmitted from a device connected to the communication network 12. For example, the communication network 12 may include a public line network such as the Internet, a telephone line network, a satellite communication network, various local area networks (LANs) including the Ethernet (registered trademark), a wide area network (WAN), and the like. . Further, the communication network 12 may include a dedicated line network such as an IP-VPN (Internet Protocol-Virtual Private Network). The server 10 and the mobile terminal 22 can communicate via the communication network 12.

  The server 10 is an example of an information processing apparatus having a navigation function. That is, the server 10 has functions corresponding to the route setting unit 228, the image generation unit 236, the storage unit 240, the vehicle state determination unit 244, the arrival determination unit 248, the communication control unit 252, and the communication unit 256 of the mobile terminal 20 described in the first embodiment. For example, the server 10 generates image data for displaying the route guidance image 50 on the mobile terminal 22 according to the position information received from the mobile terminal 22. Further, the server 10 generates turn instruction image data for each turn instruction position on the set route and stores the image DB, described with reference to FIG. 5, that includes the image data.

  The mobile terminal 22 is an example of an information processing apparatus that transmits the turn instruction image data to the vehicle-mounted device 30 in cooperation with the server 10. For example, the mobile terminal 22 has functions corresponding to the position detection unit 220, the operation unit 224, the display unit 232, the vehicle state determination unit 244, and the communication unit 256 of the mobile terminal 20 described in the first embodiment, and displays the route guidance image 50 based on the image data received from the server 10. In addition, the mobile terminal 22 receives the identifier-attached turn instruction image data from the server 10 and transmits it to the vehicle-mounted device 30. Thereafter, the mobile terminal 22 transmits the identifier received from the server 10 to the vehicle-mounted device 30.

<2-2. Operation of the navigation system>
The configuration of the navigation system according to the second embodiment of the present invention has been described above. Next, the operation of the navigation system according to the present embodiment will be described with reference to FIG. 12. Of the operations of the navigation system, the operation for displaying the turn instruction image 60 on the vehicle-mounted device 30 is described below.

  FIG. 12 is an explanatory diagram showing the operation of the navigation system according to the second embodiment. As shown in FIG. 12, the mobile terminal 22 continuously transmits the position information of the vehicle 1 detected by the position detection unit 220 to the server 10 (S504). In addition, the mobile terminal 22 transmits the destination information input by the user operation to the server 10 (S508).

  The server 10 sets the destination of the vehicle 1 according to the destination information received from the mobile terminal 22, and sets a route connecting the destination and the current position of the vehicle 1 (S512). Then, the server 10 identifies each turn instruction position on the route, and generates image data indicating a turn position and a turn direction for each identified turn instruction position (S516). Subsequently, the server 10 stores an image DB including the generated image data, an identifier of the image data, a turn instruction position, and a management flag (S520).

  Thereafter, the navigation system according to the second embodiment repeats the image display control shown in S524 to S568 until the vehicle 1 arrives at the destination.

  Specifically, the server 10 determines whether or not the vehicle 1 is stopped based on, for example, the position information received from the mobile terminal 22 (S524). If the server 10 determines that the vehicle 1 is stopped (S524/yes), it extracts the image data and identifiers associated with the management flag "0" in the image DB (S528), transmits the extracted image data and identifiers to the mobile terminal 22 (S532), and updates the management flags associated with the transmitted image data and identifiers to "1" (S536). The mobile terminal 22 then transmits the image data and identifiers received from the server 10 to the vehicle-mounted device 30 (S540), and the vehicle-mounted device 30 stores the image data and identifiers received from the mobile terminal 22 in the storage unit 324 (S544). These processes of S524 to S544 are repeated until the vehicle 1 starts traveling (S548).

  Thereafter, the server 10 determines, based on the position information received from the mobile terminal 22, whether there is a turn instruction position among those included in the image DB that the vehicle 1 has reached (S552). When the vehicle 1 reaches a turn instruction position (S552/yes), the server 10 extracts the identifier associated with the turn instruction position reached by the vehicle 1 and transmits the extracted identifier to the mobile terminal 22 (S560). The mobile terminal 22 transmits the identifier received from the server 10 to the vehicle-mounted device 30 (S564), and the vehicle-mounted device 30 extracts the image data to which the identifier received from the mobile terminal 22 is attached from the storage unit 324 and causes the display unit 332 to display the turn instruction image 60 based on the extracted image data (S568).

  Thereafter, the processes of S524 to S568 are repeated until the vehicle 1 arrives at the destination (S572/no). When the vehicle 1 arrives at the destination, the operation of the navigation system according to the present embodiment ends (S572/yes).

<2-3. Effect>
As described above, the second embodiment of the present invention realizes, by cloud computing, the same effect as the first embodiment: the turn instruction image 60 can be displayed with a small delay after the vehicle 1 reaches a turn instruction position.

<2-4. Supplement>
Although the example in which the mobile terminal 22 transmits a vehicle stop notification to the server 10 has been described above, the server 10 can determine whether or not the vehicle 1 is stopped based on the position information received from the mobile terminal 22. For this reason, the mobile terminal 22 need not transmit a vehicle stop notification to the server 10.

  In the above description, the server 10 communicates with the mobile terminal 22 and the mobile terminal 22 communicates with the vehicle-mounted device 30, but the present embodiment is not limited to this example. For example, when the vehicle-mounted device 30 has a means of communicating directly with the communication network 12, the vehicle-mounted device 30 and the server 10 may communicate without going through the mobile terminal 22. In this case, the vehicle-mounted device 30 may detect the position information of the vehicle 1 and continuously transmit it to the server 10.

  In addition, each modification described in the first embodiment can be applied to the second embodiment.

<< 3. Hardware configuration >>
The embodiments of the present invention have been described above. Information processing such as the image generation and communication control described above is realized by the cooperation of software and the hardware of the mobile terminal 20 (the mobile terminal 22 in the second embodiment) described below.

  FIG. 13 is an explanatory diagram showing the hardware configuration of the mobile terminal 20. As shown in FIG. 13, the mobile terminal 20 includes a CPU (Central Processing Unit) 201, a ROM (Read Only Memory) 202, a RAM (Random Access Memory) 203, an input device 208, an output device 210, a storage device 211, and a communication device 215.

  The CPU 201 functions as an arithmetic processing device and a control device, and controls the overall operation in the mobile terminal 20 according to various programs. Further, the CPU 201 may be a microprocessor. The ROM 202 stores programs used by the CPU 201, calculation parameters, and the like. The RAM 203 temporarily stores programs used in the execution of the CPU 201, parameters that change as appropriate during the execution, and the like. These are connected to each other by a host bus including a CPU bus. Mainly, the functions of the route setting unit 228, the image generation unit 236, the vehicle state determination unit 244, the arrival determination unit 248, and the communication control unit 252 are realized by the cooperation of the CPU 201, the ROM 202, the RAM 203, and the software.

  The input device 208 includes input means for the user to input information, such as a mouse, a keyboard, a touch panel, buttons, a microphone, switches, and levers, and an input control circuit that generates an input signal based on the user's input and outputs it to the CPU 201. By operating the input device 208, the user of the mobile terminal 20 can input various data and instruct the mobile terminal 20 to perform processing operations. The input device 208 corresponds to the operation unit 224 described with reference to FIG. 4.

  The output device 210 includes, for example, a display device such as a liquid crystal display (LCD) device, an OLED device, and a lamp. Furthermore, the output device 210 includes an audio output device such as a speaker and headphones. For example, the display device displays a captured image or a generated image. On the other hand, the audio output device converts audio data or the like into audio and outputs it.

  The storage device 211 is a device for storing data. The storage device 211 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like. The storage device 211 corresponds to the storage unit 240 described with reference to FIG. 4 and stores programs executed by the CPU 201 and various data.

  The communication device 215 is a communication interface configured with, for example, a communication device for connecting to the communication network 12. The communication device 215 may include a wireless LAN (Local Area Network) compatible communication device, an LTE (Long Term Evolution) compatible communication device, a wired communication device that performs wired communication, or a Bluetooth communication device.

  Although the hardware configuration of the mobile terminal 20 has been described above, the vehicle-mounted device 30 and the server 10 according to the second embodiment also have hardware equivalent to the CPU 201, the ROM 202, the RAM 203, and the like, similar to the mobile terminal 20. For example, the function of the display control unit 328 is realized by the cooperation of the hardware and software of the vehicle-mounted device 30, and functions equivalent to the route setting unit 228, the image generation unit 236, the storage unit 240, the vehicle state determination unit 244, the arrival determination unit 248, the communication control unit 252, and the communication unit 256 are realized by the cooperation of the hardware and software of the server 10 according to the second embodiment.

<< 4. Conclusion >>
As described above, according to the embodiment of the present invention, the mobile terminal 20 (mobile terminal 22) transmits, to the vehicle-mounted device 30 in advance, image data for a turn instruction position that the vehicle 1 may reach in the future. Then, when the vehicle 1 actually reaches the turn instruction position, the mobile terminal 20 (mobile terminal 22) transmits the identifier of the image data for that turn instruction position.

  Here, the data amount of the identifier is smaller than the data amount of the image data. For this reason, compared with an example in which the image data is transmitted when the vehicle 1 reaches the turn instruction position, the present embodiment displays the turn instruction image 60 with a smaller delay after the vehicle 1 reaches the turn instruction position. That is, according to the present embodiment, the turn instruction image 60 can be displayed at a position closer to the turn instruction position at which it should be displayed, so the display accuracy of the turn instruction image 60 can be improved. Furthermore, since the navigation system according to the present embodiment displays the turn instruction image 60 based on the actual position of the vehicle, the display accuracy can be improved in comparison with the navigation system according to the comparative example described with reference to FIG. 3.
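The advance-transmission scheme summarized above can be illustrated with a short sketch. All class and method names here are hypothetical illustrations, not taken from the patent: the terminal pushes identifier-tagged image data ahead of time, and later triggers display by sending only the small identifier.

```python
# Minimal sketch of the identifier-based display protocol described above.
# Class and method names are illustrative, not taken from the patent.

class VehicleDevice:
    """Plays the role of the vehicle-mounted device: caches identifier-tagged
    image data sent in advance, then serves it when only the identifier arrives."""

    def __init__(self):
        self.image_db = {}  # identifier -> image data (the "storage unit")

    def receive_image(self, identifier, image_data):
        # Large payload, transmitted well before the turn is reached.
        self.image_db[identifier] = image_data

    def receive_identifier(self, identifier):
        # Tiny payload, sent when the vehicle reaches the turn instruction
        # position; the lookup is local, so the display lag stays small.
        return self.image_db.get(identifier)


device = VehicleDevice()
device.receive_image("turn-42", b"<rendered turn-arrow image>")  # in advance
image = device.receive_identifier("turn-42")                     # at the turn
```

Because only the short identifier crosses the wireless link at the critical moment, the display delay is bounded by a small message rather than a full image transfer.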

  Although the preferred embodiments of the present invention have been described in detail with reference to the accompanying drawings, the present invention is not limited to these examples. It is obvious that a person having ordinary knowledge in the technical field to which the present invention pertains can conceive of various changes or modifications within the scope of the technical idea described in the claims, and it is understood that these also naturally belong to the technical scope of the present invention.

  For example, the technical idea of the navigation system according to the comparative example described with reference to FIG. 3 may be combined with the above-described embodiment of the present invention. Specifically, the mobile terminal 20 according to the first embodiment may estimate the delay time that can occur in the communication of the identifier between the mobile terminal 20 and the vehicle-mounted device 30, and may predict the position of the vehicle 1 at the time when that delay time has elapsed from the current time. The arrival determination unit 248 may then extract the identifier from the image DB when the predicted position of the vehicle 1 reaches the turn instruction position.

  According to such a configuration, the turn instruction image 60 to be displayed at the turn instruction position can be displayed at a position closer to the turn instruction position, so the display accuracy of the turn instruction image 60 can be improved.
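The combination sketched above (estimate the identifier's communication delay, predict the vehicle position after that delay, and transmit the identifier when the predicted position reaches the turn instruction position) can be expressed as a small sketch. The function names and the constant-speed model are assumptions made here for illustration, not taken from the patent:

```python
# Illustrative delay compensation for the identifier transmission: predict where
# the vehicle will be once the communication delay has elapsed, and send the
# identifier as soon as that predicted position reaches the turn instruction
# position. Function names and the constant-speed assumption are illustrative.

def predict_position(current_pos_m, speed_mps, delay_s):
    """Predicted distance along the route (in meters) after delay_s seconds."""
    return current_pos_m + speed_mps * delay_s

def should_send_identifier(current_pos_m, speed_mps, delay_s, turn_pos_m):
    """True when the predicted position has reached the turn instruction position."""
    return predict_position(current_pos_m, speed_mps, delay_s) >= turn_pos_m

# Vehicle 20 m short of the turn at 15 m/s with a 100 ms identifier delay:
# the predicted position is only 1.5 m further on, so the identifier waits.
print(should_send_identifier(980.0, 15.0, 0.1, 1000.0))  # False
print(should_send_identifier(999.0, 15.0, 0.1, 1000.0))  # True
```

Because the identifier's delay is small, the prediction horizon is short and the prediction error stays correspondingly small, which is the point made in the following paragraph.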

  Note that the delay time that can occur in the communication of the above-described identifier is sufficiently smaller than the delay time that can occur in the communication of image data, so the error between the predicted position of the vehicle 1 and the actual position of the vehicle 1 at the time when the delay time has elapsed is also small. For this reason, even when the embodiment of the present invention is combined with the technical idea related to the prediction of the vehicle position, erroneous display such as that in the comparative example can be suppressed, and the display accuracy is improved.

  Further, the steps in the processing of the navigation system described in the present specification do not necessarily have to be processed in time series in the order described in the sequence diagrams. For example, the steps in the processing of the navigation system may be processed in an order different from that described in the sequence diagrams, or may be processed in parallel.

  It is also possible to create a computer program for causing hardware such as the CPU 201, the ROM 202, and the RAM 203 built into the mobile terminal 20, the vehicle-mounted device 30, and the server 10 to perform functions equivalent to those of the mobile terminal 20, the vehicle-mounted device 30, and the server 10. In particular, functions such as the image generation unit 236, the vehicle state determination unit 244, and the arrival determination unit 248 may be implemented in the mobile terminal 20 by the mobile terminal 20 downloading and installing such a computer program. A storage medium storing the computer program is also provided.

DESCRIPTION OF SYMBOLS 1 Vehicle; 10 Server; 12 Communication network; 14 Dashboard; 16 Meter panel; 20, 22 Mobile terminal; 30 Vehicle-mounted device; 220 Position detection unit; 224 Operation unit; 228 Route setting unit; 232 Display unit; 236 Image generation unit; 240 Storage unit; 244 Vehicle state determination unit; 248 Arrival determination unit; 252 Communication control unit; 256 Communication unit; 320 Communication unit; 324 Storage unit; 328 Display control unit; 332 Display unit

Claims (7)

  1. An information processing system comprising an in-vehicle device mounted on a vehicle and an information processing device,
    The information processing apparatus includes:
    A first communication unit that transmits image data with an identifier to the on-vehicle device, and transmits at least a part of the identifier to the on-vehicle device;
    A communication control unit that controls the first communication unit so as to transmit an identifier according to the position of the vehicle among the identifiers attached to the transmitted image data to the vehicle-mounted device;
    The in-vehicle device is
    a second communication unit that receives, from the information processing apparatus, the image data to which the identifier is attached and the identifier;
    A storage unit for storing the image data with the identifier received by the second communication unit;
    a display control unit that, when an identifier corresponding to the position of the vehicle is received from the information processing apparatus by the second communication unit, extracts the image data with that identifier from the storage unit and performs display control based on the extracted image data;
    An information processing system comprising:
  2. The information processing apparatus includes:
    An image generator for generating the image data;
    A storage unit for storing an identifier attached to the image data in association with a specific position;
    An arrival determination unit that extracts an identifier associated with the specific position reached by the vehicle;
    With
    The information processing system according to claim 1, wherein the communication control unit controls the first communication unit to transmit the identifier extracted by the arrival determination unit to the vehicle-mounted device.
  3. The storage unit stores one or more image data generated by the image generation unit,
    The information processing system according to claim 2, wherein the communication control unit controls transmission of the one or more image data stored in the storage unit according to a driving state of the vehicle.
  4. The image generation unit generates image data for a specific position on a route between the current position of the vehicle and a destination,
    The information processing system according to claim 3, wherein the storage unit stores the specific position in association with an identifier of the image data generated for the specific position.
  5.   The information processing system according to any one of claims 1 to 4, wherein the first communication unit transmits the image data and the identifier to the vehicle-mounted device by wireless communication.
  6. On the computer,
    A process of transmitting the image data with the identifier to the vehicle-mounted device mounted on the vehicle;
    A process of transmitting an identifier corresponding to the position of the vehicle among the identifiers attached to the transmitted image data to the vehicle-mounted device;
    A program to let you do.
  7. An in-vehicle device mounted on a vehicle,
    A communication unit that communicates with the information processing device;
    A storage unit for storing image data with an identifier received by the communication unit;
    a display control unit that, when an identifier corresponding to the position of the vehicle is received from the information processing device by the communication unit, extracts the image data with that identifier from the storage unit and controls the display device to perform display based on the extracted image data,
    A vehicle-mounted device.
JP2014183297A 2014-09-09 2014-09-09 Information processing system, program, and on-vehicle device Pending JP2016057154A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2014183297A JP2016057154A (en) 2014-09-09 2014-09-09 Information processing system, program, and on-vehicle device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014183297A JP2016057154A (en) 2014-09-09 2014-09-09 Information processing system, program, and on-vehicle device
PCT/JP2015/066135 WO2016038949A1 (en) 2014-09-09 2015-06-04 Information processing system, program, and onboard device

Publications (1)

Publication Number Publication Date
JP2016057154A true JP2016057154A (en) 2016-04-21

Family

ID=55458712

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2014183297A Pending JP2016057154A (en) 2014-09-09 2014-09-09 Information processing system, program, and on-vehicle device

Country Status (2)

Country Link
JP (1) JP2016057154A (en)
WO (1) WO2016038949A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010197158A (en) * 2009-02-24 2010-09-09 Toshiba Corp Electronic apparatus and navigation image display method
JP2013117604A (en) * 2011-12-02 2013-06-13 Denso Corp Information display and information service system
JP2013200249A (en) * 2012-03-26 2013-10-03 Fujitsu Ten Ltd Vehicle mounted system, vehicle mounted unit, information processing method, and program


Also Published As

Publication number Publication date
WO2016038949A1 (en) 2016-03-17


Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20170217

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20171205

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20180703