WO2005098365A1 - Système de navigation - Google Patents

Système de navigation (Navigation system)

Info

Publication number
WO2005098365A1
WO2005098365A1 PCT/JP2005/005049 JP2005005049W
Authority
WO
WIPO (PCT)
Prior art keywords
sound
navigation system
guide
control unit
vehicle
Prior art date
Application number
PCT/JP2005/005049
Other languages
English (en)
Japanese (ja)
Inventor
Shinichi Gayama
Original Assignee
Pioneer Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pioneer Corporation filed Critical Pioneer Corporation
Publication of WO2005098365A1

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34Route searching; Route guidance
    • G01C21/36Input/output arrangements for on-board computers
    • G01C21/3626Details of the output of route guidance instructions
    • G01C21/3629Guidance using speech or audio output, e.g. text-to-speech

Definitions

  • the present invention relates to a navigation system which is mounted on a mobile body such as a vehicle, searches for a route to a predetermined point, and guides along the route by voice or image, and to related technology.
  • an in-vehicle navigation system has a GPS (Global Positioning System) receiver that measures the position of the vehicle and a map database, and calculates the route from the position of the vehicle to the destination. In addition, it has a function of displaying the route to the destination on the vehicle-mounted display, emitting sound from a speaker, and guiding to the destination.
  • in-vehicle navigation systems in recent years also have a function of displaying traffic congestion information and regulation information on the in-vehicle display and notifying the driver by voice, based on road traffic information supplied from VICS (Vehicle Information and Communication System) via FM multiplex broadcasting, radio beacons, or optical beacons.
  • Patent Document 1 (Japanese Patent Application Laid-Open No. 2002-10756) discloses a navigation system that prevents the vehicle driver from overlooking voice guidance by providing traffic congestion/regulation information or route guidance according to how frequently and how long the vehicle stops. For example, if the vehicle stops temporarily at an intersection with a traffic light, route guidance is provided; if the vehicle stops for a long time at a place other than an intersection, guidance on traffic congestion/regulation information is provided; and if the vehicle stops on a highway, traffic congestion/regulation information is likewise provided.
  • however, the system of Patent Document 1 does not provide voice route guidance unless the vehicle stops, so voice guidance is interrupted each time the vehicle starts running.
  • as a result, route guidance cannot be provided at an appropriate timing, which may place a mental burden on the vehicle driver.
  • the in-vehicle navigation system in recent years has a function of reproducing an audio signal from an optical disk such as a DVD (Digital Versatile Disk) or a CD-ROM.
  • such an in-vehicle navigation system can provide voice guidance while playing music, but the vehicle driver must listen to the guidance sound and the music at the same time, which is a problem.
  • the present invention relates to a navigation system mounted on a mobile body, comprising: a route search unit that searches for a guide route to at least one target point by referring to map data; a plurality of sound source speakers arranged in the mobile body so as to surround at least the operator's seat; and a control unit that controls the plurality of sound source speakers so as to emit a guide sound that guides the traveling direction of the mobile body in accordance with the guide route.
  • the control unit controls the plurality of sound source speakers such that the distribution of the sound field of the guide sound in the space inside the mobile body is biased according to the content of the guide sound.
  • FIG. 1 is a block diagram schematically showing a configuration of a navigation system according to a first embodiment of the present invention.
  • FIG. 2 is a diagram schematically showing a progress vector
  • FIG. 3 is a diagram schematically showing a progress vector
  • FIG. 4 is a diagram schematically showing a progress vector
  • FIG. 5 is a diagram schematically showing a progress vector
  • FIG. 6 is a diagram schematically showing a part of the configuration of the output control unit of the navigation device.
  • FIG. 7 is a flowchart schematically showing the procedure of the navigation process.
  • FIG. 9 is a flowchart schematically showing the procedure of the navigation processing, and
  • FIG. 10 is a flowchart schematically showing the procedure of the navigation processing.
  • FIG. 11 is a diagram schematically showing a configuration of a navigation system according to a second embodiment of the present invention.
  • FIG. 12 is a diagram for explaining the tourist guide mode.
  • FIG. 13 is a diagram showing an example of a genre selection screen displayed on the display device
  • FIG. 14 is a diagram schematically showing an example of a database used in the tourist guide mode.
  • FIG. 15 is a flowchart schematically showing the procedure of the navigation processing.
  • FIG. 16 is a flowchart schematically showing the procedure of the navigation processing.
  • FIG. 17 is a flowchart schematically showing the procedure of the navigation processing, and
  • FIG. 18 is a flowchart schematically showing the procedure of the navigation process.
  • FIG. 1 is a block diagram schematically showing a configuration of a navigation system according to a first embodiment of the present invention.
  • This navigation system includes a navigation device 1A, a display device 30, and four sound source speakers 35R, 35L, 36R, and 36L.
  • in FIG. 1, the navigation device 1A and the display device 30 are drawn outside the vehicle 31, but in fact these components 1A and 30 are mounted near the front panel of the vehicle 31.
  • the navigation system is installed in the vehicle 31.
  • the navigation system is not limited to the vehicle 31 and can also be installed in a moving body having an operation seat inside, such as a ship or an airplane.
  • the navigation device 1A includes a playback device 24 that plays back audio signals recorded on an optical disc such as a CD-ROM or a DVD; the user can load an optical disc into the playback device 24 and listen to the music reproduced from the disc.
  • the reproducing device 24 supplies the audio signal AS reproduced from the optical disk to the output control unit 21 via the input / output interface 23.
  • the output control unit 21 modulates the audio signal AS and outputs it to the sound source speakers 35L, 35R, 36L, and 36R.
  • the sound source speakers (front speakers) 35R and 35L are arranged at two places on the front right side and the front left side of the cabin space 32, and the sound source speakers (rear speakers) 36R and 36L are arranged at two places on the rear right side and the rear left side. The sound source speakers 35R, 35L, 36R, and 36L are therefore arranged in the vehicle cabin so as to surround the driver's seat 33 and the adjacent and rear seats, and the driver hears sound propagating from four diagonal directions: the two front speakers and the two rear speakers.
  • the navigation device 1A has a function of supplying four-channel acoustic signals MS1, MS2, MS3, and MS4 to the four sound source speakers 35R, 35L, 36R, and 36L, respectively.
  • the navigation device 1A includes a route search block (route search unit) that calculates the current position (own vehicle position), refers to map data, and searches for a guide route (travel route) to the target point, and a control block (control unit) that controls the sound source speakers 35R, 35L, 36R, and 36L so that a guide sound that guides the traveling direction of the vehicle 31 is emitted in accordance with the guide route.
  • the control block individually controls the sound source speakers 35R, 35L, 36R, and 36L so that the sound field distribution of the guide sound in the vehicle interior space 32 is biased in an angular direction according to the content of the guide sound.
  • specifically, as shown in FIG. 2, the speakers are controlled so that the guide sound reaches the driver's seat 33 mainly from the direction of the target point 42, that is, so that the sound field distribution in the cabin space 32 is biased toward the target point 42.
  • as shown in FIG. 3, even when the target point 42 is located obliquely rearward with respect to the vehicle center 31c, the speakers are controlled so that the guide sound reaches the driver's seat 33 mainly from the direction of the target point 42, and the sound field distribution in the cabin space 32 is biased toward the target point 42.
  • in the present embodiment, a three-dimensional sound field is created in the cabin space 32 using the four sound source speakers 35R, 35L, 36R, and 36L, taking into consideration the sound source distribution and the sound absorption distribution in the cabin space 32.
  • a three-dimensional sound field may also be created using only some of the sound source speakers.
  • the route search block includes a receiving antenna 10, a GPS signal receiving unit 11, a vehicle information calculating unit 12, and a traveling route calculating unit 13.
  • the GPS signal receiving unit 11 converts a GPS signal received from a GPS satellite (not shown) via the receiving antenna 10 into an intermediate frequency band signal, amplifies the signal, and outputs it to the own vehicle information calculation unit 12.
  • the own vehicle information calculation unit 12 calculates the current position (latitude/longitude/altitude), that is, the own vehicle position, and the traveling direction of the vehicle 31 based on the input signal, and outputs the results to the map generation unit 22 and the traveling route calculation unit 13.
  • the traveling route calculation unit 13 refers to map data read from a large-capacity recording medium such as a hard disk drive (HDD) or an optical disk to search for a guide route from the vehicle position to the target point, and supplies the search result to the map generation unit 22 and the traveling event detection unit 14.
  • the traveling route calculation unit 13 also updates the guidance route in real time based on the road traffic information and outputs it to the traveling event detection unit 14 and the map generation unit 22.
  • although the own vehicle information calculation unit 12 of the present embodiment calculates the own vehicle position based only on the GPS signal, from the viewpoint of improving the positioning accuracy it is preferable to calculate the position of the vehicle by further using detection signals from a traveling distance sensor, an angular velocity sensor, an acceleration sensor, or the like.
  • the map generation unit 22 generates an image signal for displaying on a map the own vehicle position supplied from the own vehicle information calculation unit 12 and the guidance route supplied from the traveling route calculation unit 13, and supplies the image signal to the display device 30 via the output interface 25.
  • the display device 30 is configured by a thin display such as a liquid crystal display or an organic EL display, and the screen of the display device 30 displays a route along with a three-dimensional map or a planar map around the position of the vehicle.
  • the control block includes a traveling event detection unit 14, a progress vector calculation unit 15, a guidance signal generation unit 16, a guidance voice storage unit 17, a sound effect storage unit 18, a control signal generation unit 19, an output speaker number control unit 20, and an output control unit 21.
  • these processing units 14 to 21 may be implemented in hardware, or may be implemented as a program (a series of instructions) executed by a microprocessor.
  • the control block may also be an integrated circuit having a nonvolatile memory (recording medium) for storing the program, a microprocessor, a RAM, an internal bus, and an input/output interface.
  • the traveling event detection unit 14 generates traveling event information in real time based on the guide route information and the map data supplied from the traveling route calculation unit 13, and supplies the traveling event information to the progress vector calculation unit 15 and the guidance signal generation unit 16.
  • the traveling event information includes one or more target points on the guide route, for example, the start point and the destination point (end point) of the guide route, an intersection on the guide route, a construction site, an amusement arcade, a scenic spot, a tourist facility, and a restaurant.
  • the traveling event detecting unit 14 refers to the map data, extracts those points existing on the guide route and scattered around the guide route, and sets them as target points.
  • the traveling event detection unit 14 detects an event (traveling state) of the vehicle 31 at each target point and uses this as traveling event information. For example, when one of the target points is an intersection where a plurality of traveling roads intersect, the traveling event detection unit 14 detects a left turn, a right turn, straight travel, or a stop at the intersection and sets the information as traveling event information. Such traveling event information is used, for example, when the sound source speakers 35L, 35R, 36L, and 36R emit a sound such as "Please turn left at the next intersection" as the vehicle 31 approaches the intersection; a possible data representation is sketched below.
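  • The sketch below is not part of the patent; the class and field names are assumptions. It illustrates one way such traveling event information could be represented in software, pairing a target point with the driving action detected for it.

```python
from dataclasses import dataclass
from enum import Enum, auto

class EventType(Enum):
    """Driving actions the traveling event detection unit distinguishes at a target point."""
    LEFT_TURN = auto()
    RIGHT_TURN = auto()
    STRAIGHT = auto()
    STOP = auto()

@dataclass
class TravelingEvent:
    """One target point on the guide route plus the action expected there."""
    kind: str                        # e.g. "intersection", "scenic spot"
    position: tuple[float, float]    # (latitude, longitude)
    event: EventType

# Example: the event produced as the vehicle approaches an intersection
# where the guide route turns left.
event = TravelingEvent("intersection", (35.6581, 139.7414), EventType.LEFT_TURN)
```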
  • the traveling event detection unit 14 may update the traveling event information in real time based on the road traffic information.
  • the guidance signal generation unit 16 obtains a sound pattern corresponding to the traveling event information from the guidance voice storage unit 17 or the sound effect storage unit 18, generates a guidance signal AG from the obtained sound pattern, and supplies it to the sound source speakers 35R, 35L, 36R, and 36L via the output control unit 21.
  • the guidance voice storage unit 17 holds sound patterns such as "100 m to the intersection. Turn right at the next intersection", and the sound effect storage unit 18 holds various sound effect patterns such as pulse sounds and warning sounds. A method for generating the guide signal AG will be described later.
  • the progress vector calculation unit 15 has a function of calculating, in real time, a vector amount from the vehicle position to the target point based on the traveling event information, and further has a function of setting a virtual point near the target point based on the traveling event information and calculating, in real time, the vector amount from the vehicle position to the virtual point.
  • the progress vector calculation unit 15 supplies the calculated vector amount, that is, the progress vector, to the guide signal generation unit 16 and the control signal generation unit 19. Specifically, as shown in FIG. 2 and FIG. 3, the progress vector calculation unit 15 calculates the progress vector P from the center position 31c of the vehicle to the target point 42 with reference to the map data. In the figures, θ is the angle formed between the progress vector P and the lateral direction orthogonal to the front direction of the vehicle 31.
  • the progress vector calculation unit 15 also sets a virtual point on the guidance route near the target point, and calculates the vector quantity P from the own vehicle position to the virtual point.
  • in FIG. 4, the target point 40 is an intersection, and the vehicle 31 turns right according to the guidance route.
  • in this case, a virtual point 41R is set on the traveling road beyond the right turn.
  • the progress vector P in such a case is calculated as the sum of the vector amount from the center position 31c of the vehicle 31 to the intersection 40 and the vector amount F from the intersection 40 to the virtual point 41R.
  • similarly, as shown in FIG. 5, when the vehicle turns left at the intersection, a virtual point 41L is set on the traveling road beyond the left turn.
  • the progress vector P in that case is calculated as the sum of the vector amount from the own vehicle position to the target point and the vector amount F from the intersection 40 to the virtual point 41L.
  • the direction and length (norm) of the vector amount F from the target point to the virtual point are set in advance according to the target point.
  • the length of the vector amount F may be set to about several meters.
  • the traveling route calculation unit 13 calculates in real time the linear distance from the center position 31c of the vehicle 31 to the target point and the distance along the guide route from the center position 31c to the target point, and supplies the linear distance and the road distance as measured distance information to the traveling event detection unit 14, the guidance signal generation unit 16, and the control signal generation unit 19; a sketch of the progress vector computation follows.
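  • The following sketch is not the patent's own code; the planar coordinate frame, the heading convention, and the function name are assumptions. It shows how a progress vector P could be formed as the sum of the vehicle-to-intersection and intersection-to-virtual-point vectors, together with its norm H and the angle θ measured from the vehicle's lateral axis, as described for FIGS. 2 to 5.

```python
import math

def progress_vector(vehicle_xy, heading_deg, intersection_xy, virtual_xy):
    """Compute the progress vector P, its norm H, and the angle theta.

    All positions are (x, y) in a local planar frame in metres.  P is the sum
    of (vehicle centre -> intersection) and (intersection -> virtual point F),
    and theta is measured from the vehicle's lateral axis (the axis orthogonal
    to its front direction), in the range (-180, 180] degrees.
    """
    px = (intersection_xy[0] - vehicle_xy[0]) + (virtual_xy[0] - intersection_xy[0])
    py = (intersection_xy[1] - vehicle_xy[1]) + (virtual_xy[1] - intersection_xy[1])

    # Straight-line distance H is the norm of P.
    h = math.hypot(px, py)

    # Angle of P in the world frame, re-expressed relative to the vehicle's
    # rightward lateral axis (assumed to lie at heading - 90 degrees).
    angle_world = math.degrees(math.atan2(py, px))
    theta = (angle_world - (heading_deg - 90.0) + 180.0) % 360.0 - 180.0
    return (px, py), h, theta
```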
  • the control signal generator 19 has a function of generating a control signal CS for generating a sound field in the vehicle interior space 32 based on the traveling vector P. In other words, the control signal generation section 19 generates a control signal CS for modulating the guide signal AG to create a sound field.
  • FIG. 6 is a diagram schematically showing a part of the configuration of the output control unit 21.
  • the output control unit 21 includes a first modulation unit 60 for modulating the guide signal AG based on the control signal CS, a second modulation unit 61 for modulating the audio signal AS based on the control signal CS, and adders 62, 63, 64, and 65 for adding the modulated signal groups.
  • the first modulation unit 60 is composed of four multipliers 50, 51, 52, and 53, and the second modulation unit 61 is composed of four multipliers 54, 55, 56, and 57. The control signal CS is composed of data of coefficients α1, α2, α3, and α4 and data of coefficients U0 − α1, U0 − α2, U0 − α3, and U0 − α4.
  • the fixed value U0 is the maximum value that the coefficients α1, α2, α3, and α4 can take, and is set appropriately according to the digital processing system.
  • the user can operate a user interface (not shown) such as an input key to designate an arbitrary sound source speaker and set it in the output speaker number control unit 20.
  • the guide signal AG is composed of first to fourth guide signals AG1, AG2, AG3, and AG4 of four channels, and the audio signal AS is composed of first to fourth audio signals AS1, AS2, AS3, and AS4 of four channels.
  • the multiplier 50 modulates the first guide signal AG1 with the data of the coefficient α1 and outputs it to the adder 62, the multiplier 51 modulates the second guide signal AG2 with the data of the coefficient α2 and outputs it to the adder 63, the multiplier 52 modulates the third guide signal AG3 with the data of the coefficient α3 and outputs it to the adder 64, and the multiplier 53 modulates the fourth guide signal AG4 with the data of the coefficient α4 and outputs it to the adder 65.
  • the multiplier 54 modulates the first audio signal AS1 with the data of the coefficient U0 − α1 and outputs the result to the adder 62, the multiplier 55 modulates the second audio signal AS2 with the data of the coefficient U0 − α2 and outputs it to the adder 63, the multiplier 56 modulates the third audio signal AS3 with the data of the coefficient U0 − α3 and outputs it to the adder 64, and the multiplier 57 modulates the fourth audio signal AS4 with the data of the coefficient U0 − α4 and outputs it to the adder 65.
  • the adder 62 adds the modulated signals output from the multipliers 50 and 54 to generate an acoustic signal MS1, the adder 63 adds the modulated signals output from the multipliers 51 and 55 to generate an acoustic signal MS2, the adder 64 adds the modulated signals output from the multipliers 52 and 56 to generate an acoustic signal MS3, and the adder 65 adds the modulated signals output from the multipliers 53 and 57 to generate an acoustic signal MS4.
  • the acoustic signals MS1 to MS4 are output to the sound source speakers 35R, 35L, 36R, and 36L, respectively.
  • the output control unit 21 thus has a mixer function of mixing the guide signal AG and the audio signal AS, and a function of variably controlling the mixing ratio of the audio signal AS to the guide signal AG.
  • the volume of the audio signal and the volume of the guide signal are in an inverse relationship to each other: the volume of the audio signal automatically decreases as the volume of the guidance signal increases, and the volume of the guidance signal automatically decreases as the volume of the audio signal increases, so the driver can easily hear the guidance sound while listening to music (a per-channel sketch of this mixing follows).
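  • The sketch below is an illustration only, not the patent's fixed-point implementation; the function name and sample values are assumptions. It reproduces the FIG. 6 structure in which each output channel MSi is the guide sample scaled by αi plus the audio sample scaled by U0 − αi, so raising the guide volume on a speaker automatically lowers the music volume on that speaker.

```python
def mix_channels(guide, audio, alphas, u0=1.0):
    """Per-channel mixing of guide signal AG and audio signal AS.

    guide  : four guide-signal samples AG1..AG4
    audio  : four audio-signal samples AS1..AS4
    alphas : four coefficients alpha1..alpha4 carried by the control signal CS
    u0     : the fixed maximum value U0 of the coefficients

    Each output MSi = alpha_i * AGi          (multipliers 50-53)
                    + (U0 - alpha_i) * ASi   (multipliers 54-57),
    summed by the adders 62-65.
    """
    return [a * g + (u0 - a) * s for g, s, a in zip(guide, audio, alphas)]

# Example: bias the guide sound toward the front-right speaker (channel 1)
# while the music keeps playing on the remaining channels.
ms = mix_channels(guide=[0.8, 0.8, 0.8, 0.8],
                  audio=[0.5, 0.5, 0.5, 0.5],
                  alphas=[0.9, 0.3, 0.2, 0.2])
```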
  • in equations (1) to (4) exemplified for the coefficients, L denotes the road distance, H the norm of the progress vector P, θ the angle of the progress vector P (−180° < θ < +180°), and β a scale factor (β > 1.0).
  • the control signal generation unit 19 can calculate the coefficients α1, α2, α3, α4 and U0 − α1, U0 − α2, U0 − α3, U0 − α4 so that the sound field distribution in the vehicle interior space 32 is biased toward the target point or the virtual point, that is, so that the guidance voice reaches the driver 34 as if it came from the direction of the target point or the virtual point; a sketch of such a calculation is given below.
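  • Because equations (1) to (4) themselves are not reproduced in this text, the following is only a plausible sketch: it combines the volume adjustment term (βL − H)/(βL) mentioned later with a simple angular weighting toward the progress vector P. The speaker bearings, the weighting function, and all names are assumptions, not the patent's equations.

```python
import math

# Assumed speaker bearings relative to the driver, measured like theta from the
# vehicle's lateral axis: front-right 45, front-left 135, rear-left -135,
# rear-right -45 degrees.
SPEAKER_BEARINGS = {"35R": 45.0, "35L": 135.0, "36L": -135.0, "36R": -45.0}

def guide_coefficients(h, road_distance, theta, beta=2.0, u0=1.0):
    """Sketch of a control signal CS: one coefficient per sound source speaker.

    h             : straight-line distance H to the target/virtual point
    road_distance : road distance L along the guide route
    theta         : angle of the progress vector P, in degrees
    beta          : scale factor (beta > 1.0)
    u0            : maximum coefficient value U0
    """
    if road_distance <= 0.0:
        return {name: 0.0 for name in SPEAKER_BEARINGS}
    # Volume adjustment term: shrinks as the straight-line distance H grows.
    volume = max(0.0, (beta * road_distance - h) / (beta * road_distance))
    coeffs = {}
    for name, bearing in SPEAKER_BEARINGS.items():
        # Direction adjustment: 1.0 for a speaker lying toward P, 0.0 opposite it.
        diff = abs((theta - bearing + 180.0) % 360.0 - 180.0)   # 0..180 degrees
        direction = 1.0 - diff / 180.0
        coeffs[name] = u0 * volume * direction
    return coeffs
```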
  • the guide signal generation unit 16 has a volume adjustment function for individually adjusting the volume of the sound source speakers 35L, 35R, 36L, and 36R according to the linear distance or the road distance (hereinafter, the linear distance and the road distance are collectively referred to as the measured distance). Specifically, the guide signal generation unit 16 can perform control so as to emit the guide sound stepwise or continuously at a volume that is inversely proportional to the measured distance.
  • the guidance sound can be obtained from the guidance voice storage unit 17 or the sound effect storage unit 18. With this control, for example, as the vehicle 31 approaches the target point, the measured distance to the target point or the virtual point decreases, and the volume of the guide signal AG increases. For this reason, the driver 34 can intuitively grasp the distance to the target point based on the change in the volume of the guide sound.
  • the guide signal generation unit 16 can also generate the guide signal AG so that the sound effect is reproduced stepwise or continuously, the number of times per unit time being inversely proportional to the measured distance.
  • the sound effect pattern can be obtained from the sound effect storage unit 18. With this control, for example, as the vehicle 31 approaches the target point, the number of times the guidance sound is emitted increases: when the measured distance is 3 km or more (stage 1), the guidance sound (sound effect) sounds once; when the measured distance is less than 3 km and 2 km or more (stage 2), the guidance sound is emitted twice.
  • the driver 34 can intuitively and easily grasp the distance to the target point only by hearing the number of times the guidance sound is heard.
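  • A minimal sketch of this stage-to-repetition mapping is given below; only the first two stages (3 km or more, and 2 km to 3 km) are stated in the text, so the stages below 2 km are assumed continuations of the same pattern.

```python
def sound_effect_repeats(measured_km):
    """Number of times the guidance sound effect is emitted, by distance stage."""
    if measured_km >= 3.0:
        return 1        # stage 1: 3 km or more -> one beep
    if measured_km >= 2.0:
        return 2        # stage 2: 2 km to less than 3 km -> two beeps
    if measured_km >= 1.0:
        return 3        # assumed continuation of the pattern
    return 4            # assumed continuation of the pattern
```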
  • the guide signal generation unit 16 can simultaneously control the number of times the sound effect sounds per unit time and the volume in accordance with the measured distance.
  • the guide signal generation unit 16 has a function of generating a guide signal AG by selecting any of only voice, effect sound, or a synthesized sound of voice and effect sound according to the setting mode.
  • the setting mode can be appropriately changed by the user.
  • the traveling event detection unit 14 detects an event (traveling state) of the vehicle 31 at each target point, such as a left turn or a right turn at an intersection, as traveling event information, and sends this to the guidance signal generation unit 16.
  • the guide signal generation unit 16 has a function of changing the sound effect according to each event. Table 2 below shows an example of a look-up table for selecting a sound effect according to events such as "right or left turn at an intersection", "emergency guidance", "destination guidance", and "other".
  • the guide signal generation unit 16 can select any one of the sound effects A to D with reference to the look-up table, and acquire it from the sound effect storage unit 18.
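  • A dictionary-based stand-in for such a look-up table is sketched below; the concrete pairing of events to the sound effects A to D is not given in this text, so the mapping shown is illustrative only.

```python
# Illustrative stand-in for the look-up table of Table 2.
EFFECT_TABLE = {
    "right_or_left_turn_at_intersection": "A",
    "emergency_guidance": "B",
    "destination_guidance": "C",
    "other": "D",
}

def select_effect(event_kind):
    """Pick a sound-effect pattern for an event, falling back to the 'other'
    entry, the way the guide signal generation unit 16 consults the table."""
    return EFFECT_TABLE.get(event_kind, EFFECT_TABLE["other"])
```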
  • the traveling event detection unit 14 holds a look-up table (voice guidance table) indicating a threshold distance corresponding to each target point and each event (traveling state).
  • the traveling event detection unit 14 obtains a threshold distance corresponding to each target point or each event with reference to the voice guidance table, and supplies information on the magnitude relationship between the threshold distance and the current measured distance to the guidance signal generation unit 16.
  • the guide signal generator 16 generates a guide signal AG according to the magnitude relationship. Table 3 below shows an example of the voice guidance table.
  • FIG. 7 to 10 are flowcharts schematically showing the procedure of the navigation processing by the navigation device 1A.
  • FIG. 7 and FIG. 8 are flowcharts showing a procedure of a traveling event detection process which constitutes one of the navigation processes, and these flowcharts are interconnected via connectors C1 and C2.
  • first, the GPS signal receiving unit 11 converts the received GPS signal into a signal in the intermediate frequency band, amplifies the signal, and outputs the signal to the own vehicle information calculation unit 12.
  • the own vehicle information calculation unit 12 calculates the current position, that is, the own vehicle position, based on the signal input from the GPS signal receiving unit 11 (step S2), then calculates the traveling direction of the vehicle 31 (step S3), and supplies the own vehicle position and the traveling direction to the traveling route calculation unit 13 and the map generation unit 22.
  • the traveling route calculation unit 13 determines whether or not a destination (target point) has been input and set in advance by the driver 34 (step S4).
  • if the destination has been set, an optimal travel route (guide route) from the vehicle position to the destination is calculated (step S6).
  • the map generation unit 22 generates an image signal for displaying the own vehicle position supplied from the own vehicle information calculation unit 12 and the guidance route supplied from the traveling route calculation unit 13 on a map. (Step S8), this is supplied to the display device 30 via the output interface 25 and displayed (Step S9).
  • when the traveling event detection unit 14 determines that there is no congestion point (step S17), the processing after step S1 is repeatedly executed.
  • here, H denotes the straight-line distance (the norm of the progress vector P).
  • the range of the linear distance H in the first stage is Y2 or more and less than Y1
  • the range of the linear distance H in the second stage is Y3 or more and less than Y2
  • the range of the linear distance H in the third stage is Y4 or more and less than Y3
  • the range of the straight-line distance H in the fourth stage is specified to be less than Y4.
  • the variable FLAG is a flag value for preventing repeated voice guidance at the same stage.
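  • One way to realize this stage classification and the FLAG check is sketched below (variable and function names are assumptions; the flowchart of FIG. 8 is not reproduced here): guidance is triggered only the first time the straight-line distance H enters a new stage.

```python
def update_stage(h, thresholds, flag):
    """Classify the straight-line distance H into a stage and decide whether to speak.

    thresholds : (Y1, Y2, Y3, Y4) with Y1 > Y2 > Y3 > Y4
    flag       : stage for which guidance has already been given (0 if none)
    Returns (stage, speak): stage is 1..4, or 0 when H is Y1 or more, and speak
    is True only when a new stage has just been entered.
    """
    y1, y2, y3, y4 = thresholds
    if y2 <= h < y1:
        stage = 1
    elif y3 <= h < y2:
        stage = 2
    elif y4 <= h < y3:
        stage = 3
    elif h < y4:
        stage = 4
    else:
        stage = 0
    return stage, (stage != 0 and stage != flag)
```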
  • in step S28, the guide signal generation unit 16 executes the voice generation process (FIG. 9).
  • in step S40, the guide signal generation unit 16 determines, in accordance with the setting mode, whether or not the type of the guide signal AG is voice only; if not, the process proceeds to step S43.
  • when the type of the guide signal AG is voice only, the guide signal generation unit 16 obtains a voice pattern at the stage corresponding to the set value of the variable STEP and generates the guide signal AG from the voice pattern (step S42). The process then returns to the main routine (FIG. 8).
  • in step S43, the guide signal generation unit 16 determines, in accordance with the setting mode, whether or not the type of the guide signal AG is sound effect only; if not, the process proceeds to step S44. When the type of the guide signal AG is sound effect only, the guide signal generation unit 16 obtains a sound effect pattern corresponding to the set value of the variable COND from the sound effect storage unit 18 and generates the guide signal AG from the sound effect pattern (step S46). The process then returns to the main routine (FIG. 8).
  • in step S44, the guidance signal generation unit 16 acquires a sound effect pattern corresponding to the set value of the variable COND from the sound effect storage unit 18, acquires a voice pattern at the stage corresponding to the set value of the variable STEP from the guidance voice storage unit 17, and generates a guidance signal AG in which the voice and the sound effect are combined (step S45). The process then returns to the main routine (FIG. 8); a compact sketch of this selection follows.
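  • The branch structure of steps S40 to S46 can be summarised by the sketch below (the function name, the mode strings, and the use of dictionaries for the two storage units are assumptions; combining voice and effect by concatenation merely stands in for the actual synthesis).

```python
def build_guide_signal(mode, step, cond, voice_store, effect_store):
    """Select the guide signal AG according to the setting mode.

    mode         : "voice", "effect", or "both"  (the setting mode)
    step         : value of the variable STEP selecting a voice-pattern stage
    cond         : value of the variable COND selecting a sound-effect pattern
    voice_store  : stand-in for the guidance voice storage unit 17
    effect_store : stand-in for the sound effect storage unit 18
    """
    if mode == "voice":
        return voice_store[step]                      # step S42
    if mode == "effect":
        return effect_store[cond]                     # step S46
    return effect_store[cond] + voice_store[step]     # steps S44-S45
```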
  • the progress vector calculation unit 15 calculates the progress vector P and supplies it to the control signal generation unit 19 (step S29). Subsequently, the control signal generation unit 19 obtains the number of output speakers from the output speaker number control unit 20 (step S30), and then executes a control process (FIG. 10) (step S31).
  • the control signal generation unit 19 determines whether or not the type of the guide signal AG is voice only (step S50); if not, the process proceeds to step S53. When the type of the guide signal AG is voice only, the control signal generation unit 19 calculates the coefficients α1 to α4 according to the above equations (1) to (4) to generate the control signal CS (step S51). Thereafter, the output control unit 21 modulates the guide signal AG with the control signal CS to generate the output signals MS1 to MS4 (step S52). As a result, the sound source speakers 35L, 35R, 36L, and 36R output the guide sound at a volume inversely proportional to the linear distance H so as to bias the sound field distribution toward the direction of the progress vector P.
  • in the above equations, the term (βL − H)/(βL) is a volume adjustment term for generating a volume inversely proportional to the linear distance H.
  • the terms involving (135 + θ) and (45 − θ) are direction adjustment terms for determining the bias direction (directivity) of the sound field.
  • in step S53, the control signal generation unit 19 determines whether or not to adjust the volume of the sound effect included in the guide signal AG; if not, the process proceeds to step S57. If it is determined that the volume of the sound effect is to be adjusted, the control signal generation unit 19 calculates the coefficients α1 to α4 according to the above equations (1) to (4) and generates the control signal CS (step S54). Thereafter, the output control unit 21 modulates the guide signal AG with the control signal CS to generate the output signals MS1 to MS4 (step S55). As a result, the sound source speakers 35L, 35R, 36L, and 36R output the sound effect at a volume inversely proportional to the linear distance H so as to bias the sound field distribution toward the direction of the progress vector P.
  • control signal generator 19 determines whether or not the type of the guide signal AG is only the effect sound (step S56), and if not, executes the processing after step S51. On the other hand, when it is determined that the type of the guide signal AG is only the sound effect, the process returns to the main routine (FIG. 8).
  • if it is determined in step S53 that the volume of the sound effect is not to be adjusted, the control signal generation unit 19 generates the control signal CS by setting the value of the volume adjustment term to "1" in the above equations (1) to (4) and calculating the coefficients α1 to α4 using only the direction adjustment term (step S57). Thereafter, the output control unit 21 modulates the guide signal AG with the control signal CS to generate the output signals MS1 to MS4 (step S58). As a result, the sound source speakers 35L, 35R, 36L, and 36R output the sound effect. Subsequently, the control signal generation unit 19 determines whether or not the number of times the sound effect has been reproduced has reached a number inversely proportional to the linear distance H (step S59); if not, step S58 is repeatedly executed.
  • once that number has been reached, the process proceeds to step S56 described above.
  • in this way, the sound source speakers 35R, 35L, 36L, and 36R output the sound effect a number of times inversely proportional to the linear distance H so as to bias the sound field distribution toward the direction of the progress vector P.
  • as described above, the driver 34 can intuitively grasp the distance and direction to the target point without gazing at the navigation screen or listening closely to a guidance voice while driving. It is therefore possible to provide route guidance that does not hinder the driving operation of the driver 34, to greatly reduce the mental burden on the driver 34, and to improve safety. The driver can also intuitively grasp the distance and direction to the target point while music or the like is playing. Furthermore, since the output speaker number control unit 20 can select which sound source speakers output the acoustic signals, when there is an occupant in the rear seat, only the front left and right sound source speakers 35L and 35R can be driven, so that the guidance sound is not bothersome to the occupant of the rear seat.
  • FIG. 11 is a diagram schematically showing the configuration of the navigation system of the second embodiment.
  • components denoted by the same reference numerals as the components of the navigation system of the first embodiment have the same functions as in the first embodiment, and their detailed description is omitted.
  • the navigation system of the present embodiment includes a navigation device 1B, a display device 30, and four sound source speakers 35R, 35L, 36R, and 36L.
  • the navigation device 1B has a mode control unit 70, a genre designation unit 71, and a spot information storage unit 72, in addition to the components 11 to 25 of the first embodiment.
  • the mode control unit 70 has a function of designating, to the traveling route calculation unit 13, the operation mode of the navigation device 1B as one of the "normal guidance mode", the "tourist guide mode", and the "rough navigation mode". The user can operate the input device to select one of the guidance modes.
  • the "tourist guide mode" is a mode in which target points such as sightseeing spots and restaurants are searched for and voice guidance is provided for each target point.
  • the driver 34 of the vehicle 31 can hear the guide voices of the scenic spot, the aquarium, the amusement arcade and the restaurant one after another.
  • the "rough navigation mode" is a mode in which the user is guided only roughly by voice, without guidance along the optimal travel route calculated by the travel route calculation unit 13.
  • generally, an in-vehicle navigation system sequentially calculates an optimal, shortest travel route and notifies the driver 34 of that route.
  • in the rough navigation mode, if the user sets only the final destination, the navigation system does not guide along the travel route but provides sound guidance on the distance and direction to the final destination, as in the first embodiment.
  • events such as turning right or left at an intersection are not sound-guided; sound guidance is performed stepwise according to the distance to the final destination.
  • the driver 34 can concentrate on the driving operation, and can expect safer driving.
  • the genre specifying section 71 has a function of specifying the genre of the target point, for example, a scenic spot or an aquarium, to the traveling event detecting section 14.
  • FIG. 13 is a diagram showing an example of a genre selection screen displayed on the display device 30. The user can select at least one of "None", "Scenic spot", "Restaurant", "Playground", and "Convenience store".
  • the spot information storage unit 72 records positional information and explanation information of scenic spots, facilities or recreational facilities located around the traveling route in a database.
  • FIG. 14 is a diagram illustrating an example of the database.
  • FIG. 15 and FIG. 16 are diagrams schematically showing the procedure of the sightseeing guidance process in the “tourism guide mode”.
  • the flowcharts shown in FIGS. 15 and 16 are interconnected via connectors C3 and C4.
  • the GPS signal receiving unit 11 converts the received GPS signal into a signal in the intermediate frequency band, amplifies the signal, and outputs it to the own vehicle information calculation unit 12.
  • the own vehicle information calculation unit 12 calculates the current position, that is, the own vehicle position, based on the signal input from the GPS signal receiving unit 11 (step S51), then calculates the traveling direction of the vehicle 31 (step S52), and supplies the own vehicle position and the traveling direction to the traveling route calculation unit 13 and the map generation unit 22.
  • the map generation unit 22 generates an image signal for displaying the own vehicle position supplied from the own vehicle information calculation unit 12 and the guidance route supplied from the traveling route calculation unit 13 on a map. (Step S53), this is supplied to the display device 30 via the output interface 25 and displayed. (Step S54).
  • when the traveling event detection unit 14 determines that there is no congestion point (step S62), the processing after step S50 is repeatedly executed.
  • the range of the linear distance H in the first step is Y2 or more and less than Y1
  • the range of the linear distance H in the second step is Y3 or more and less than Y2
  • the range of the linear distance H in the third stage is Y4 or more and less than Y3
  • the range of the linear distance H in the fourth stage is specified as less than Y4.
  • the variable FLAG is a flag value for preventing repeated voice guidance at the same stage.
  • in step S70, the guide signal generation unit 16 determines whether the straight-line distance H corresponds to the first stage and the variable FLAG is "0"; if not, the process of step S72 is executed.
  • in step S71, the value of the variable FLAG is set to "1".
  • in step S72, the guide signal generation unit 16 determines whether the straight-line distance H corresponds to the second stage and the variable FLAG is "1"; if not, the process of step S74 is executed.
  • the guide signal generation unit 16 then executes the voice generation process (FIG. 9) (step S8).
  • the progress vector calculation unit 15 calculates the progress vector P and supplies it to the control signal generation unit 19 (step S79A).
  • the control signal generation unit 19 then acquires the number of output speakers from the output speaker number control unit 20 (step S79B), and thereafter the above-described control process (FIG. 10) is executed (step S79C).
  • thereafter, the process returns to step S50 (FIG. 15).
  • according to the sightseeing guide processing, the driver can intuitively understand the direction and the distance from the own vehicle position to a sightseeing spot or the like based on the direction and the volume of the in-car sound. When driving within a tourist area, there are often spots around the vehicle position that may be of interest, regardless of the final destination setting. In practice, however, the driver is often not traveling toward such spots, and with a conventional navigation method that only presents information sequentially along the travel route, it is difficult to notice them.
  • FIG. 17 and FIG. 18 are diagrams schematically showing the procedure of the rough navigation process.
  • the flowcharts shown in FIGS. 17 and 18 are connected to each other via connectors C5 and C6.
  • the GPS signal receiving unit 11 converts the received GPS signal into a signal in the intermediate frequency band, amplifies the signal, and outputs it to the own vehicle information calculation unit 12.
  • the own vehicle information calculation unit 12 calculates the current position, that is, the own vehicle position, based on the signal input from the GPS signal receiving unit 11 (step S81), then calculates the traveling direction of the vehicle 31 (step S82), and supplies the own vehicle position and the traveling direction to the traveling route calculation unit 13 and the map generation unit 22.
  • the traveling route calculation unit 13 determines whether or not a destination (target point) has been input and set in advance by the driver 34 (step S83).
  • if the destination has been set, the optimum travel route (guide route) from the vehicle position to the destination is calculated (step S86). Subsequently, the map generation unit 22 generates an image signal for displaying on the map the vehicle position supplied from the vehicle information calculation unit 12 and the guidance route supplied from the travel route calculation unit 13 (step S87), and supplies it to the display device 30 via the output interface 25 for display (step S88).
  • if no destination has been set, the map generation unit 22 generates an image signal for displaying on the map the own vehicle position supplied from the own vehicle information calculation unit 12 (step S84) and supplies it to the display device 30 via the output interface 25 for display (step S85).
  • in step S79, it is determined whether or not there is a guidance facility within a range of distance L2 from the vehicle position; if it is determined that there is none, the process returns to step S80. If a guidance facility is located within the distance L2 from the vehicle position, the process proceeds to step S90 (FIG. 18).
  • in step S90, the guidance signal generation unit 16 determines whether the measured distance corresponds to the n-th stage; if it is determined that it does, the value of the variable STEP is set to "n + 1" (step S91). Thereafter, the guide signal generation unit 16 determines whether or not the value of the variable FLAG is "n + 1" (step S92), and if not, executes the voice generation processing (FIG. 9) (step S93). After that, the value of the variable FLAG is set to "0" (step S94).
  • otherwise, the guide signal generation unit 16 sets the value of the variable FLAG to "n + 1" (step S95) and thereafter executes the voice generation processing (FIG. 9) (step S96).
  • the progress vector calculation unit 15 calculates the progress vector P and supplies it to the control signal generation unit 19 (step S97). Subsequently, the control signal generator 19 obtains the number of output speakers from the output speaker number controller 20 (step S98), and executes a control process (FIG. 10) (step S99). Thereafter, the processing after step S80 is repeatedly executed.
  • the driver 34 can enjoy the drive while concentrating on the driving operation.
  • since the driver 34 can continuously and intuitively understand where the final destination is, the mental load during driving can be greatly reduced.
  • paying attention to the navigation system guidance can be a burden on driving and may impair safety.
  • the guidance provided in the rough navigation process can improve the safety for a driver unfamiliar with driving.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)

Abstract

A navigation system that provides safe voice guidance without hindering the driver's maneuvers and without imposing a psychological burden on the driver. The navigation system comprises a section for searching for a guide route to at least one target point by referring to map data, sound source speakers arranged in a mobile body so as to surround at least the driver's seat, and a section for controlling the sound source speakers so that they emit the sound used to indicate the traveling direction of the mobile body along the guide route. The control section drives the speakers so that the distribution, within the space of the mobile body, of the sound field of the guide sound is biased according to the content of the guide sound.
PCT/JP2005/005049 2004-03-31 2005-03-15 Système de navigation WO2005098365A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2004106918 2004-03-31
JP2004-106918 2004-03-31

Publications (1)

Publication Number Publication Date
WO2005098365A1 true WO2005098365A1 (fr) 2005-10-20

Family

ID=35125183

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2005/005049 WO2005098365A1 (fr) 2004-03-31 2005-03-15 Système de navigation

Country Status (1)

Country Link
WO (1) WO2005098365A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07114690A (ja) * 1993-10-19 1995-05-02 Aqueous Res:Kk Guidance device
JPH07301540A (ja) * 1994-05-02 1995-11-14 Aqueous Res:Kk Guide system
JPH09167297A (ja) * 1995-12-14 1997-06-24 Sumitomo Electric Ind Ltd Intersection guidance device
JP2000213951A (ja) * 1999-01-28 2000-08-04 Kenwood Corp Car navigation system
JP2002116045A (ja) * 2000-10-11 2002-04-19 Clarion Co Ltd Volume control device

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2196975A1 (fr) * 2007-07-23 2010-06-16 Clarion Co., Ltd. Navigation device
EP2196975A4 (fr) * 2007-07-23 2012-12-19 Clarion Co Ltd Navigation device
EP3392619A1 (fr) * 2017-04-17 2018-10-24 Harman International Industries, Incorporated Audible prompts in a vehicle navigation system
US10440493B2 (en) 2017-04-17 2019-10-08 Harman International Industries, Incorporated Audible prompts in a vehicle navigation system

Similar Documents

Publication Publication Date Title
JP3263286B2 (ja) In-vehicle navigation device
US5406492A (en) Directional voice-type navigation apparatus
US7970539B2 (en) Method of direction-guidance using 3D sound and navigation system using the method
US7228229B2 (en) Information processing device and travel information voice guidance method
US6172641B1 (en) Navigation system with audible route guidance instructions
US20070174006A1 (en) Navigation device, navigation method, navigation program, and computer-readable recording medium
US20110144901A1 (en) Method for Playing Voice Guidance and Navigation Device Using the Same
JP3981040B2 (ja) Navigation device and map data access method in the device
WO2007114086A1 (fr) On-board vehicle device, voice information providing system, and speech speed adjustment method
JP2002233001A (ja) Simulated engine sound control device
JP2001289660A (ja) Navigation device
JP4030064B2 (ja) Navigation system having acoustic route information
JP2007232573A (ja) In-vehicle navigation device, guidance information providing method, and program
JPH11201770A (ja) Navigation device
JP2000065585A (ja) In-vehicle navigation device
JP3703981B2 (ja) Audio device
JP3815270B2 (ja) Navigation device and control program
US7512482B2 (en) Navigation apparatus
WO2005098365A1 (fr) Navigation system
JPH0757190A (ja) Voice navigation device
JPH10116086A (ja) In-vehicle karaoke system
JP2004226189A (ja) Car navigation device
JP2006115364A (ja) Voice output control device
JP2004108908A (ja) Audio-linked navigation device
JP2003156352A (ja) Navigation device

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Country of ref document: DE

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP