WO2013145242A1 - Acoustic device, output sound management device, terminal device, and output sound control method - Google Patents

Acoustic device, output sound management device, terminal device, and output sound control method

Info

Publication number
WO2013145242A1
WO2013145242A1 · PCT/JP2012/058466 · JP2012058466W
Authority
WO
WIPO (PCT)
Prior art keywords
unit
sound
directional sound
vehicle
output
Prior art date
Application number
PCT/JP2012/058466
Other languages
English (en)
Japanese (ja)
Inventor
Toshiyuki Horie (堀江 俊行)
Original Assignee
Pioneer Corporation (パイオニア株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pioneer Corporation (パイオニア株式会社)
Priority to PCT/JP2012/058466 priority Critical patent/WO2013145242A1/fr
Publication of WO2013145242A1 publication Critical patent/WO2013145242A1/fr

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q5/00Arrangement or adaptation of acoustic signal devices
    • B60Q5/005Arrangement or adaptation of acoustic signal devices automatically actuated
    • B60Q5/008Arrangement or adaptation of acoustic signal devices automatically actuated for signaling silent vehicles, e.g. for warning that a hybrid or electric vehicle is approaching

Definitions

  • The present invention relates to an acoustic device, an output sound management device, a terminal device, an output sound control method and an output sound control program, and a recording medium on which the output sound control program is recorded.
  • A technique of this type is proposed in Patent Document 1 (hereinafter referred to as "Conventional Example 1").
  • In the technique of Conventional Example 1, a pseudo sound signal is generated based on detection results such as the vehicle speed, the rotational speed of the motor serving as the power source, and the accelerator opening, and the pseudo sound is output from a speaker toward the area in front of and outside the vehicle, or the like.
  • In Conventional Example 1, execution or non-execution of the pseudo sound output is controlled in accordance with the type of area in which the vehicle is traveling.
  • Patent Document 2 (hereinafter referred to as "Conventional Example 2") proposes a technique for generating a warning sound in an area including the position of a detected warning target person.
  • In the technique of Conventional Example 2, the presence or absence of a warning target person is detected based on a surrounding image of the vehicle and, when a warning target person exists, the position of that person and the relative distance to that person are also detected.
  • When a warning target person exists and no horn sound is being output, a warning sound audible only within a limited area around the warning target person, including that person's position, is generated using a parametric speaker or the like.
  • In Conventional Example 2, a warning sound is thus generated in response to local circumstances, such as the presence of a warning target person such as a pedestrian around the traveling position of the vehicle.
  • However, in Conventional Example 2, a warning sound is generated only when the presence of the pedestrian or the like is detected. Therefore, when a shield is interposed between the pedestrian or the like and the vehicle, no warning sound is generated to alert the pedestrian.
  • The present invention has been made in view of the above circumstances, and an object thereof is to provide an acoustic device, an output sound management device, a terminal device, and an output sound control method capable of outputting an external output sound that appropriately alerts people in accordance with the presence or absence of a shield around the vehicle.
  • The present invention is an acoustic device that supplies an external output sound signal to an external sound output unit that outputs a directional sound having directivity to the outside of a vehicle, the acoustic device comprising: a detection unit that detects a reflector that exists around the vehicle and reflects the directional sound; and a control unit that controls the output direction of the directional sound so that the directional sound is reflected on the detected reflector and reaches an alert target person.
  • The present invention is also an output sound management device comprising: a receiving unit that receives at least one of position information and surrounding information of a vehicle in which an external sound output unit that outputs a directional sound having directivity is installed; a detection unit that detects, based on the reception result of the receiving unit, a reflector that exists around the vehicle and reflects the directional sound; a generation unit that generates control information on the output direction of the directional sound for reflecting the directional sound on the detected reflector and causing the directional sound to reach an alert target person; and a transmission unit that transmits the generation result of the generation unit.
  • The present invention is also a terminal device arranged in a vehicle on the outside of which an external sound output unit that outputs a directional sound having directivity is installed, the terminal device comprising: an acquisition unit that acquires at least one of position information and surrounding information of the vehicle; a transmission unit that transmits the acquisition result of the acquisition unit; a receiving unit that receives control information on the output direction of the directional sound for reflecting the directional sound on a reflector, detected based on the acquisition result, that reflects the directional sound, and causing the directional sound to reach an alert target person; and a control unit that controls the output direction of the directional sound based on the reception result of the receiving unit.
  • The present invention is also an output sound control method comprising: a detection step of detecting a reflector that exists around a vehicle in which an external sound output unit that outputs a directional sound having directivity is installed and that reflects the directional sound; and a control step of controlling the output direction of the directional sound so that the directional sound is reflected on the detected reflector and reaches an alert target person.
  • The present invention is also an output sound control program that causes a calculation unit to execute the output sound control method of the present invention.
  • The present invention is also a recording medium on which the output sound control program of the present invention is recorded so as to be readable by a calculation unit.
  • Diagram (1) for explaining the propagation path of a directional sound.
  • Diagram (2) for explaining the propagation path of a directional sound.
  • Diagram for explaining the arrangement of the terminal device and the server device according to the second example of the present invention.
  • Block diagram for explaining the configuration of the terminal device of FIG. 7.
  • FIG. 1 shows a schematic configuration of an acoustic device 700 according to the first embodiment. As shown in FIG. 1, the acoustic device 700 is mounted on a vehicle CR that uses electrical energy as part or all of driving energy.
  • On the vehicle CR, a directional sound output unit 910H and a low directional sound output unit 910L, each of which is a part of the external sound output unit, are installed.
  • An imaging unit 920, a position detection unit 930, and a vehicle speed detection unit 940 are installed in the vehicle CR.
  • the directional sound output unit 910H includes a speaker SPH that outputs a directional sound toward the outside of the vehicle in accordance with the first external output sound signal sent from the acoustic device 700.
  • a speaker SPH for example, a flat speaker can be employed.
  • the directional sound output unit 910H further includes a rotation drive unit (not shown) that rotates the speaker SPH.
  • This rotation drive unit rotates the speaker SPH around a rotation axis parallel to the vertical direction in accordance with the rotation control sent from the acoustic device 700.
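  • As an illustration, the sketch below shows how a rotation command about the vertical axis might be derived from a desired output direction; the function name and the vehicle-fixed coordinate convention are assumptions and are not specified in the text above.

```python
import math

def yaw_for_output_direction(dx: float, dy: float) -> float:
    """Yaw angle (degrees) for the speaker SPH about the vertical rotation axis.

    dx, dy: components of the desired output direction in an assumed
    vehicle-fixed frame (x = traveling direction, y = left of travel).
    A positive result means rotating the speaker toward the left.
    """
    return math.degrees(math.atan2(dy, dx))

# Example: aim the directional sound 30 degrees to the right of the traveling direction.
target = yaw_for_output_direction(math.cos(math.radians(-30.0)), math.sin(math.radians(-30.0)))
print(round(target, 1))  # -30.0
```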
  • The low directional sound output unit 910L includes a speaker SPL that outputs a low directional sound, whose directivity is lower than that of the directional sound, toward the outside of the vehicle in accordance with the second external output sound signal sent from the acoustic device 700.
  • a cone speaker can be adopted as the speaker SPL.
  • the imaging unit 920 captures a peripheral image in the traveling direction of the vehicle CR. Then, the imaging unit 920 sends the imaging result to the audio device 700 as imaging data.
  • the position detector 930 described above sequentially detects the current position of the vehicle CR. Then, the position detection unit 930 sends the detected current position to the acoustic device 700.
  • the vehicle speed detection unit 940 described above sequentially detects the traveling speed (vehicle speed) of the vehicle CR. Then, the vehicle speed detection unit 940 sends the detected vehicle speed to the acoustic device 700.
  • the acoustic device 700 includes a storage unit 710, an acquisition unit 720, and a detection unit 730.
  • the acoustic device 700 includes a control unit 740A, a sound source unit 750, and a sound signal generation unit 760.
  • the storage unit 710 stores various information used in the audio device 700. Such information includes map information.
  • the storage unit 710 can be accessed by the detection unit 730 and the control unit 740A.
  • the acquisition unit 720 acquires the shooting data sent from the shooting unit 920, the current position sent from the position detection unit 930, and the vehicle speed sent from the vehicle speed detection unit 940. Then, the acquisition unit 720 sends the acquired shooting data and the current position to the detection unit 730. In addition, the acquisition unit 720 sends the acquired current position and vehicle speed to the control unit 740A.
  • The detection unit 730 receives the shooting data and the current position sent from the acquisition unit 720. Based on the shooting data and the current position, and while appropriately referring to the map information in the storage unit 710, the detection unit 730 detects shields that exist on the traveling direction side of the vehicle CR and obstruct the progress of the low directional sound output from the low directional sound output unit 910L. In addition, based on the shooting data and the current position, and while appropriately referring to the map information in the storage unit 710, the detection unit 730 detects reflectors that exist on the traveling direction side of the vehicle CR and can reflect the directional sound output from the directional sound output unit 910H. The detection result of the detection unit 730 is sent to the control unit 740A.
  • When at least one shield is detected, the detection unit 730 further detects, for each detected shield, the relative distance to the vehicle CR and the relative angle between the traveling direction of the vehicle CR and the position where the shield exists (hereinafter referred to as the "relative angle of the shield with respect to the traveling direction"). Likewise, when at least one reflector is detected, the detection unit 730 further detects, for each detected reflector, the relative distance to the vehicle CR and the relative angle between the traveling direction of the vehicle CR and the position where the reflector exists (hereinafter referred to as the "relative angle of the reflector with respect to the traveling direction"). These relative distances and relative angles are sent to the control unit 740A as part of the detection result.
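  • The detection result handed to the control unit 740A can be pictured as a small data structure that records, for each detected shield and reflector, the relative distance and the relative angle described above; the names and example values below are hypothetical.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class DetectedObject:
    """One shield or reflector found by the detection unit 730."""
    kind: str            # "shield" or "reflector"
    distance_m: float    # relative distance to the vehicle CR
    angle_deg: float     # relative angle with respect to the traveling direction

@dataclass
class DetectionResult:
    shields: List[DetectedObject]
    reflectors: List[DetectedObject]

# Example of what the detection unit might send to the control unit 740A:
result = DetectionResult(
    shields=[DetectedObject("shield", distance_m=25.0, angle_deg=12.0)],
    reflectors=[DetectedObject("reflector", distance_m=18.0, angle_deg=-20.0),
                DetectedObject("reflector", distance_m=30.0, angle_deg=35.0)],
)
```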
  • the control unit 740A receives the current position and vehicle speed sent from the acquisition unit 720 and the detection result sent from the detection unit 730. Then, the control unit 740A controls the generation of the external output sound signal in the sound signal generation unit 760 and the direction of the speaker SPH in the directional sound output unit 910H.
  • the external output sound signal generation control process by the control unit 740A will be described later.
  • the sound source unit 750 stores sound data corresponding to the external output sound.
  • the sound signal generation unit 760 can access the sound source unit 750.
  • the sound signal generation unit 760 generates an external output sound signal under the control of the control unit 740A.
  • When generating an external output sound signal, the sound signal generation unit 760 generates it based on the sound data read from the sound source unit 750.
  • The external output sound signals include a first external output sound signal supplied to the directional sound output unit 910H and a second external output sound signal supplied to the low directional sound output unit 910L.
  • the sound signal generation unit 760 may generate only the second external output sound signal, or may generate the first and second external output sound signals.
  • shooting data of peripheral images in the traveling direction of the vehicle CR is sequentially sent from the shooting unit 920 to the acquisition unit 720.
  • the current position of the vehicle CR is sequentially sent from the position detection unit 930 to the acquisition unit 720.
  • the vehicle speed detection unit 940 is assumed to send the current vehicle speed to the acquisition unit 720 sequentially.
  • In the detection process for shields and reflectors, the detection unit 730 first performs image analysis based on the shooting data to detect the presence or absence of shields and reflectors. When at least one shield or at least one reflector is detected, the detection unit 730 performs further image analysis and detects, for each detected shield and reflector, the relative distance from the vehicle CR and the relative angle with respect to the traveling direction of the vehicle CR. The detection unit 730 then sends the detection result to the control unit 740A.
  • In addition, the detection unit 730 refers to the map information in the storage unit 710 based on the current position and the traveling direction, and collates the features around the traveling direction side of the vehicle CR with the shields and reflectors obtained as the image analysis result.
  • When some of the shields or reflectors obtained as the image analysis result are not registered as features in the map information, or when a feature registered in the map information as a possible shield or reflector is not detected by the image analysis, the shields and reflectors are specified by giving priority to the image analysis result.
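  • A minimal sketch of this collation rule is shown below: detections from the image analysis are matched against map features, and the image analysis result is given priority when the two sources disagree. The function and the match predicate are assumptions introduced for illustration.

```python
def collate_with_map(image_detections, map_features, match):
    """Collate shields/reflectors found by image analysis with map features.

    image_detections: objects obtained from the camera data.
    map_features:     features read from the map information around the
                      traveling direction side of the vehicle.
    match(det, feat): assumed predicate deciding whether a detection and a
                      map feature describe the same physical object.

    The image analysis result takes priority: every image detection is kept,
    and a matching map feature, when one exists, only supplements it
    (for example with type or shape information).
    """
    collated = []
    for det in image_detections:
        feat = next((f for f in map_features if match(det, f)), None)
        collated.append((det, feat))  # feat is None when the map has no counterpart
    return collated
```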
  • The control unit 740A specifies a shielding notification area with reference to the map information in the storage unit 710, based on the current position.
  • Here, the "shielding notification area" is an area that the low directional sound output from the low directional sound output unit 910L does not reach because of the shield, and in which, from the viewpoint of traffic safety, the approach of the vehicle CR traveling at the current vehicle speed should be notified.
  • Subsequently, the control unit 740A uses the reflectors included in the detection result sent from the detection unit 730 to extract a route along which the directional sound output from the directional sound output unit 910H reaches the shielding notification area. When such a route is extracted, the control unit 740A performs, on the directional sound output unit 910H, rotation control that directs the speaker SPH of the directional sound output unit 910H in the direction that causes the directional sound to travel along the extracted route.
  • When a plurality of routes reaching the shielding notification area are extracted, the route actually adopted is selected from among the routes with the smallest number of reflections.
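  • The route selection rule can be sketched as follows, assuming that the candidate routes reaching the shielding notification area have already been enumerated from the detected reflectors (the enumeration itself is not shown here).

```python
def choose_route(candidate_routes):
    """Pick the route actually used for the directional sound.

    candidate_routes: list of routes, each route being the ordered list of
    reflectors off which the directional sound bounces before reaching the
    shielding notification area.

    When several routes reach the area, the one with the smallest number of
    reflections is adopted, which limits the drop in volume of the sound
    arriving at the area.
    """
    if not candidate_routes:
        return None
    return min(candidate_routes, key=len)
```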
  • the control unit 740A controls the sound signal generation unit 760 to generate the first external output sound signal in addition to the second external output sound signal.
  • As a result, the sound signal generation unit 760 generates the first external output sound signal and sends it to the directional sound output unit 910H.
  • Upon receiving the first external output sound signal, the directional sound output unit 910H outputs the directional sound in the direction set by the rotation control of the control unit 740A.
  • The directional sound output from the directional sound output unit 910H thus travels along the extracted route and reaches the shielding notification area.
  • As described above, in the first embodiment, the detection unit 730 detects, based on the peripheral image on the traveling direction side of the vehicle CR, the current position of the vehicle CR, and the map information in the storage unit 710, shields that obstruct the progress of the low directional sound output from the low directional sound output unit 910L.
  • The detection unit 730 also detects reflectors that exist on the traveling direction side of the vehicle CR and can reflect the directional sound output from the directional sound output unit 910H.
  • Based on the current position and the vehicle speed of the vehicle CR, the control unit 740A specifies an area that the low directional sound does not reach because of the detected shield and in which the approach of the vehicle CR traveling at the current vehicle speed should be notified.
  • The control unit 740A then extracts, using the detected reflectors, a route along which the directional sound reaches the specified area.
  • When such a route is extracted, the control unit 740A performs, on the directional sound output unit 910H, rotation control that directs the speaker SPH of the directional sound output unit 910H in the direction that causes the directional sound to travel along the extracted route.
  • When a plurality of routes reaching the area are extracted, the route actually adopted is determined from among the routes with the smallest number of reflections, which suppresses a decrease in the volume of the directional sound that reaches the area.
  • FIG. 2 shows a schematic configuration of the terminal device 810 and the output sound management device 820 according to the second embodiment.
  • the terminal device 810 is arranged in the vehicle CR and operates.
  • the output sound management device 820 is disposed outside the vehicle CR.
  • the terminal device 810 and the output sound management device 820 can communicate with each other via the network 850.
  • the output sound management device 820 can communicate with other terminal devices configured in the same manner as the terminal device 810, but only the terminal device 810 is representatively shown in FIG.
  • The terminal device 810 differs from the acoustic device 700 of the first embodiment described above in that a control unit 740B is provided instead of the control unit 740A, and in that a transmission unit 811 and a reception unit 812 are further provided. In the following, the description focuses mainly on these differences.
  • the control unit 740B receives the shooting data of the surrounding video in the traveling direction of the vehicle CR, the current position of the vehicle CR, and the vehicle speed sent from the acquisition unit 720. Then, the control unit 740B sends the photographing data, the current position of the vehicle CR, and the vehicle speed to the transmission unit 811 as terminal transmission data.
  • control unit 740B receives control information sent from the receiving unit 812.
  • control information includes information on rotation control of the speaker SPH of the directional sound output unit 910H and information on generation control of the first external output sound signal.
  • the control unit 740B that has received the control information performs rotation control of the speaker SPH with respect to the directional sound output unit 910H and generation control of the first external output sound signal with respect to the sound signal generation unit 760 in accordance with the control information.
  • the transmission unit 811 receives the terminal transmission data transmitted from the control unit 740B. Then, the transmission unit 811 transmits the terminal transmission data to the output sound management apparatus 820 via the network 850.
  • the receiving unit 812 receives control information sent from the output sound management device 820 via the network 850. Then, the receiving unit 812 sends the control information to the control unit 740B.
  • the output sound management device 820 includes a storage unit 710, a detection unit 730, a reception unit 821, a generation unit 822, and a transmission unit 823.
  • the receiving unit 821 receives terminal transmission data transmitted from the terminal device 810 via the network 850. Then, the reception unit 821 sends the shooting data and the current position in the received terminal transmission data to the detection unit 730. In addition, the reception unit 821 sends the current position and vehicle speed in the received terminal transmission data to the generation unit 822.
  • the generation unit 822 receives the current position and vehicle speed sent as the terminal transmission data from the reception unit 821 and the detection result sent from the detection unit 730.
  • the generation unit 822 generates the control information described above based on the current position, the vehicle speed, the detection result, and the map information in the storage unit 710.
  • the control information generated in this way is sent to the transmission unit 823 as server transmission data.
  • the transmission unit 823 receives the server transmission data transmitted from the generation unit 822. Then, the transmission unit 823 transmits the server transmission data to the terminal device 810 via the network 850.
  • With the configuration described above, the shooting data of the peripheral video in the traveling direction of the vehicle CR and the current position of the vehicle CR collected by the control unit 740B are sent to the detection unit 730 via the transmission unit 811, the network 850, and the reception unit 821. The current position and the vehicle speed of the vehicle CR collected by the control unit 740B are sent to the generation unit 822 via the transmission unit 811, the network 850, and the reception unit 821. Furthermore, the control information generated by the generation unit 822 is sent to the control unit 740B via the transmission unit 823, the network 850, and the reception unit 812.
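  • The data exchanged over the network 850 can thus be pictured as two messages: the terminal transmission data going to the output sound management device 820, and the control information coming back. The field names below are assumptions made for illustration.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class TerminalTransmissionData:
    """Sent from the terminal device 810 to the output sound management device 820."""
    shooting_data: bytes                   # peripheral video in the traveling direction
    current_position: Tuple[float, float]  # (latitude, longitude) of the vehicle CR
    vehicle_speed_kmh: float

@dataclass
class ControlInformation:
    """Returned from the output sound management device 820 to the terminal device 810."""
    speaker_yaw_deg: float       # rotation control for the speaker SPH
    generate_first_signal: bool  # whether the first external output sound signal is generated
```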
  • shooting data of peripheral images in the traveling direction of the vehicle CR is sequentially sent from the shooting unit 920 to the acquisition unit 720.
  • the current position of the vehicle CR is sequentially sent from the position detection unit 930 to the acquisition unit 720.
  • the vehicle speed detection unit 940 is assumed to send the current vehicle speed to the acquisition unit 720 sequentially. Then, it is assumed that the acquisition result is sequentially sent from the acquisition unit 720 to the control unit 740B.
  • In this case, the control unit 740B of the terminal device 810 sends the shooting data of the peripheral video in the traveling direction of the vehicle CR and the current position of the vehicle CR to the detection unit 730 of the output sound management device 820. Subsequently, the detection unit 730 executes the detection process for shields and reflectors in the same manner as in the first embodiment described above, while appropriately referring to the map information in the storage unit 710 based on the shooting data and the current position. Then, the detection unit 730 sends the detection result to the generation unit 822.
  • Subsequently, the generation unit 822 of the output sound management device 820 generates the control information in the same manner as the control unit 740A, based on the detection result and on the current position and vehicle speed of the vehicle sent from the control unit 740B of the terminal device 810, while appropriately referring to the map information in the storage unit 710. Then, the generation unit 822 sends the generated control information to the control unit 740B.
  • Upon receiving the control information, the control unit 740B performs rotation control of the speaker SPH of the directional sound output unit 910H and controls generation of the first external output sound signal by the sound signal generation unit 760, in accordance with the control information. As a result, the sound signal generation unit 760 generates the first external output sound signal and sends it to the directional sound output unit 910H. Upon receiving the first external output sound signal, the directional sound output unit 910H outputs the directional sound in the direction set by the rotation control of the control unit 740B.
  • As in the first embodiment, the directional sound output from the directional sound output unit 910H travels along the extracted route and reaches the shielding notification area.
  • As described above, in the second embodiment, the control unit 740B of the terminal device 810 and the detection unit 730 of the output sound management device 820 cooperate to detect, based on the peripheral image on the traveling direction side of the vehicle CR, the current position of the vehicle CR, and the map information in the storage unit 710, shields that obstruct the progress of the low directional sound output from the low directional sound output unit 910L. The control unit 740B and the detection unit 730 also cooperate to detect reflectors that exist on the traveling direction side of the vehicle CR and can reflect the directional sound output from the directional sound output unit 910H.
  • Based on the current position and the vehicle speed of the vehicle CR, the generation unit 822 of the output sound management device 820 specifies an area that the low directional sound does not reach because of the detected shield and in which the approach of the vehicle CR traveling at the current vehicle speed should be notified.
  • The generation unit 822 then extracts, using the detected reflectors, a route along which the directional sound reaches the specified area.
  • When such a route is extracted, the generation unit 822 and the control unit 740B cooperate to perform, on the directional sound output unit 910H, rotation control that directs the speaker SPH of the directional sound output unit 910H in the direction that causes the directional sound to travel along the extracted route.
  • In the embodiments described above, the output direction of the directional sound propagating outside the vehicle is controlled by controlling the direction of the speaker SPH.
  • However, the present invention can also be applied to a case where a directional sound output unit that controls the output direction of the directional sound propagating outside the vehicle by another method is connected.
  • In the embodiments described above, the shields and reflectors are detected based on the shooting data, the current position, and the map information.
  • However, the shields and reflectors may be detected based only on the shooting data, or based only on the current position and the map information.
  • In the first embodiment, the acoustic device includes a storage unit, a sound source unit, and a sound signal generation unit. However, when elements that can be shared are available elsewhere, those sharable elements may be used, in which case the sharable elements can be omitted from the components of the acoustic device.
  • Likewise, in the second embodiment, the terminal device includes the sound source unit and the sound signal generation unit. However, when elements that can be shared are available elsewhere, those sharable elements may be used, in which case the sharable elements can be omitted from the components of the terminal device.
  • The acquisition unit, the detection unit, and the control unit of the acoustic device of the first embodiment may be configured as a computer serving as a calculation unit including a central processing unit (CPU: Central Processing Unit), and part or all of the processing of these elements may be performed by executing a program prepared in advance on the computer.
  • This program is recorded on a computer-readable recording medium such as a hard disk, CD-ROM, or DVD, and is loaded from the recording medium and executed by the computer.
  • The program may be acquired in a form recorded on a portable recording medium such as a CD-ROM or DVD, or may be acquired in a form distributed via a network such as the Internet.
  • Similarly, the acquisition unit and the control unit of the terminal device of the second embodiment, and the detection unit and the generation unit of the output sound management device, may each be configured as a computer serving as a calculation unit including a central processing unit (CPU), and part or all of the processing of these elements may be performed by executing a program prepared in advance on the computer.
  • This program is recorded on a computer-readable recording medium such as a hard disk, CD-ROM, or DVD, and is loaded from the recording medium and executed by the computer.
  • The program may be acquired in a form recorded on a portable recording medium such as a CD-ROM or DVD, or may be acquired in a form distributed via a network such as the Internet.
  • FIG. 3 shows a schematic configuration of the navigation device 100 as the acoustic device according to the first example. The navigation device 100 is an aspect of the acoustic device 700 (see FIG. 1) of the first embodiment described above.
  • the navigation device 100 is mounted on a vehicle CR that uses electric energy as part or all of the driving energy.
  • On the vehicle CR, a directional sound output unit 210H serving as the directional sound output unit 910H and a low directional sound output unit 210L serving as the low directional sound output unit 910L are installed.
  • a photographing unit 220 as the photographing unit 920 is installed in the vehicle CR.
  • the directional sound output unit 210H includes a speaker SPH that outputs a directional sound toward the outside of the vehicle in accordance with the first external output sound signal sent from the navigation device 100.
  • a flat speaker is adopted as the speaker SPH.
  • the directional sound output unit 210H further includes a rotation driving unit (not shown) that rotates the speaker SPH.
  • the rotation driving unit rotates the speaker SPH around a rotation axis parallel to the vertical direction according to the rotation control sent from the navigation device 100.
  • the low directivity sound output unit 210L includes a speaker SPL that outputs a low directivity sound having a directivity lower than the directivity sound toward the outside of the vehicle in accordance with the second external output sound signal sent from the navigation device 100.
  • a cone speaker is employed as the speaker SPL.
  • the above photographing unit 220 is configured with a camera.
  • the photographing unit 220 photographs a peripheral image of the vehicle CR. Then, the photographing unit 220 sends photographing data that is photographing result data to the navigation device 100.
  • The navigation device 100 includes a control unit 110A, and a storage unit 120 serving as the storage unit 710 and the sound source unit 750.
  • the navigation device 100 includes a sound output unit 130, a display unit 140, and an input unit 150.
  • the navigation apparatus 100 includes a sensor unit 160 that also functions as a vehicle speed detection unit 940 and a GPS (Global Positioning System) reception unit 170 as a part of the position detection unit 930.
  • the control unit 110A controls the entire navigation device 100.
  • the control unit 110A will be described later.
  • the storage unit 120 includes a nonvolatile storage device such as a hard disk device.
  • the storage unit 120 can be accessed by the control unit 110A.
  • The information data stored in the storage unit 120 includes map information MPD and sound source data SSD.
  • The map information MPD includes road network information and the position information, type information, and shape information of features.
  • the sound source data SSD includes sound data corresponding to the external output sound.
  • the above-described sound output unit 130 includes a speaker and outputs sound corresponding to the sound data received from the control unit 110A.
  • This sound output unit 130 outputs guidance voices such as the traveling direction of the vehicle CR, the traveling situation, and the traffic situation regarding the navigation processing under the control of the control unit 110A.
  • the display unit 140 includes a display device such as a liquid crystal panel, and displays an image corresponding to the display data received from the control unit 110A.
  • This display unit 140 displays images such as map information and route information, guidance information, and the like during navigation processing under the control of the control unit 110A.
  • the input unit 150 includes a key unit provided in the main body of the navigation device 100 and / or a remote input device including the key unit.
  • As the key unit provided in the main body, a touch panel provided on the display device of the display unit 140 can be used.
  • A configuration based on another input method can also be adopted instead of, or in combination with, the configuration having the key unit.
  • By using the input unit 150, the user sets the operation content of the navigation device 100 and issues operation commands.
  • the user uses the input unit 150 to set a destination or the like related to route search in the navigation process.
  • Such input contents are sent as input data from the input unit 150 to the control unit 110A.
  • the sensor unit 160 includes a vehicle speed sensor, an acceleration sensor, an angular velocity sensor, a tilt sensor, and the like. Detection results from various sensors included in the sensor unit 160 are sent as sensor data to the control unit 110A.
  • the GPS receiving unit 170 described above calculates the current position of the vehicle CR based on reception results of radio waves from a plurality of GPS satellites. Further, the GPS receiving unit 170 measures the current time based on the date / time information transmitted from the GPS satellite. Information regarding these current position and current time is sent to the control unit 110A as GPS data.
  • the control unit 110A includes a central processing unit (CPU) and its peripheral circuits.
  • Various functions as the navigation device 100 are realized by the control unit 110A executing various programs. These functions include a part of the position detection unit 930, the acquisition unit 720, the detection unit 730, the control unit 740A, and the sound signal generation unit 760 in the first embodiment described above.
  • The programs executed by the control unit 110A are recorded on a computer-readable recording medium such as a hard disk, CD-ROM, or DVD, and are loaded from the recording medium and executed.
  • The programs may be acquired in a form recorded on a portable recording medium such as a CD-ROM or DVD, or may be acquired in a form distributed via a network such as the Internet.
  • the control unit 110A appropriately refers to the map information MPD in the storage unit 120 based on the sensor data received from the sensor unit 160 and the GPS data received from the GPS receiving unit 170, and provides navigation information to the user. I do.
  • The navigation information providing processing includes: (a) map display for displaying a map of an area designated by the user on the display device of the display unit 140; (b) map matching for calculating where on the map the vehicle CR is located and in which direction it is heading; (c) route search from the position where the vehicle currently exists to a destination, which is an arbitrary position designated by the user; (d) calculation of the predicted arrival time at the destination when traveling to the destination along the set route; and (e) guidance based on the map matching result, the calculated predicted arrival time, and advice on the direction to proceed, which involves processing such as control for displaying the guidance on the display device of the display unit 140 and control for outputting voice guidance from the speaker of the sound output unit 130.
  • control unit 110A generates a first external output sound signal supplied to the directional sound output unit 210H and a second external output sound signal supplied to the low directional sound output unit 210L.
  • control unit 110A controls the direction of the speaker SPH in the directional sound output unit 210H.
  • shooting data of peripheral images in the traveling direction of the vehicle CR is sequentially sent from the shooting unit 220 to the control unit 110A.
  • GPS data is sequentially sent from the GPS receiving unit 170 to the control unit 110A.
  • sensor data including vehicle speed data is sequentially sent from the sensor unit 160 to the control unit 110A.
  • the control unit 110A sequentially calculates the current position and the traveling direction based on the map matching result described above.
  • In step S11, the control unit 110A performs a process of detecting shields that exist on the traveling direction side of the vehicle CR and obstruct the progress of the low directional sound output from the low directional sound output unit 210L.
  • the control unit 110A first performs image analysis based on the imaging data sent from the imaging unit 220 to detect the presence or absence of the shielding object.
  • When at least one shield is detected, the control unit 110A performs further image analysis and detects, for each detected shield, the relative distance to the vehicle CR and the relative angle with respect to the traveling direction of the vehicle CR.
  • Subsequently, the control unit 110A refers to the map information MPD in the storage unit 120 based on the current position and the traveling direction, and collates the features around the traveling direction side of the vehicle CR with the shields obtained as the image analysis result.
  • In this manner, the control unit 110A detects, for each detected shield, the relative distance to the vehicle CR and the relative angle of the shield with respect to the traveling direction.
  • In step S12, the control unit 110A determines whether or not a shield has been detected by the process of step S11. If the result of this determination is negative (step S12: N), the process proceeds to step S13.
  • In step S13, the control unit 110A determines whether or not the directional sound is being output from the directional sound output unit 210H. If the result of this determination is negative (step S13: N), the process returns to step S11.
  • If the result of the determination in step S13 is affirmative (step S13: Y), the process proceeds to step S14.
  • In step S14, the control unit 110A stops the output of the directional sound from the directional sound output unit 210H by stopping the supply of the first external output sound signal. Then, the process returns to step S11.
  • If the result of the determination in step S12 is affirmative (step S12: Y), the process proceeds to step S15. In step S15, the control unit 110A refers to the map information MPD in the storage unit 120 based on the current position, and performs a process of specifying a shielding notification area.
  • In the first example, the "shielding notification area" is, for example, an area that the low directional sound does not reach because of a shield, and in which a pedestrian or the like passing through the area may be present, at the time the vehicle CR arrives, at the intersection (including a T-junction) between the road that the pedestrian crosses and the road on which the vehicle CR travels.
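  • As a rough illustration of this definition, the check below flags a hidden area when a pedestrian walking at an assumed speed could reach the intersection within the time the vehicle CR, traveling at the current vehicle speed, needs to arrive there; the parameter names and the walking speed are assumptions, not values taken from the text.

```python
def may_need_notification(distance_to_intersection_m: float,
                          vehicle_speed_kmh: float,
                          pedestrian_range_m: float,
                          walking_speed_mps: float = 1.4) -> bool:
    """Rough check of whether an area hidden by a shield should be treated as a
    shielding notification area.

    Returns True when a pedestrian currently up to pedestrian_range_m away from
    the intersection could reach it within the time the vehicle CR needs to
    arrive there at the current vehicle speed.
    """
    if vehicle_speed_kmh <= 0:
        return False
    arrival_time_s = distance_to_intersection_m / (vehicle_speed_kmh / 3.6)
    return pedestrian_range_m <= walking_speed_mps * arrival_time_s
```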
  • In step S16, it is determined whether or not a shielding notification area has been specified by the process of step S15. If the result of this determination is negative (step S16: N), the process proceeds to step S13. After the process of step S13 described above, and step S14 as needed, is performed, the process returns to step S11.
  • If the result of the determination in step S16 is affirmative (step S16: Y), the process proceeds to step S17.
  • In step S17, the control unit 110A performs a process of detecting reflectors that exist on the traveling direction side of the vehicle CR and can reflect the directional sound output from the directional sound output unit 210H.
  • In this detection process, the control unit 110A first performs image analysis based on the imaging data sent from the imaging unit 220 to detect the presence or absence of reflectors. When at least one reflector is detected, the control unit 110A performs further image analysis and detects, for each detected reflector, the relative distance to the vehicle CR and the relative angle with respect to the traveling direction of the vehicle CR.
  • Subsequently, the control unit 110A refers to the map information MPD in the storage unit 120 based on the current position and the traveling direction, and collates the features on the traveling direction side of the vehicle CR with the reflectors obtained as the image analysis result.
  • When some of the reflectors obtained as the image analysis result are not registered as features in the map information, or when a feature registered in the map information as a possible reflector is not detected by the image analysis, the reflectors are specified by giving priority to the image analysis result.
  • In step S18, the control unit 110A determines whether or not a reflector has been detected by the process of step S17. If the result of this determination is negative (step S18: N), the process proceeds to step S13. After the process of step S13 described above, and step S14 as necessary, is performed, the process returns to step S11.
  • If the result of the determination in step S18 is affirmative (step S18: Y), the process proceeds to step S19.
  • In step S19, the control unit 110A uses the detected reflectors to extract a route along which the directional sound output from the directional sound output unit 210H reaches the shielding notification area. In the first example, when a plurality of routes reaching the shielding notification area are extracted, the route actually adopted is selected from among the routes with the smallest number of reflections.
  • In step S20, the control unit 110A determines whether or not a route has been extracted by the process of step S19. If the result of this determination is negative (step S20: N), the process proceeds to step S13. After the process of step S13 described above, and step S14 as necessary, is performed, the process returns to step S11.
  • If the result of the determination in step S20 is affirmative (step S20: Y), the process proceeds to step S21.
  • In step S21, the control unit 110A performs, on the directional sound output unit 210H, rotation control that directs the speaker SPH of the directional sound output unit 210H in the direction that causes the directional sound to travel along the extracted route.
  • In step S22, the control unit 110A determines whether or not the directional sound is being output from the directional sound output unit 210H. If the result of this determination is affirmative (step S22: Y), the process returns to step S11.
  • If the result of the determination in step S22 is negative (step S22: N), the process proceeds to step S23.
  • In step S23, the control unit 110A starts the output of the directional sound from the directional sound output unit 210H by starting the supply of the first external output sound signal. Then, the process returns to step S11.
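  • Condensed into a single sketch, steps S11 to S23 form the control loop below; the unit object and its helper methods are hypothetical stand-ins for the image analysis, map reference, and speaker control described above.

```python
def external_sound_control_loop(unit):
    """Control loop over steps S11 to S23 (hypothetical helper methods)."""
    while True:
        shields = unit.detect_shields()                   # S11
        if not shields:                                   # S12: N
            if unit.directional_sound_on():               # S13
                unit.stop_directional_sound()             # S14
            continue                                      # back to S11
        area = unit.specify_notification_area(shields)    # S15
        if area is None:                                  # S16: N
            if unit.directional_sound_on():
                unit.stop_directional_sound()
            continue
        reflectors = unit.detect_reflectors()             # S17
        if not reflectors:                                # S18: N
            if unit.directional_sound_on():
                unit.stop_directional_sound()
            continue
        route = unit.extract_route(reflectors, area)      # S19
        if route is None:                                 # S20: N
            if unit.directional_sound_on():
                unit.stop_directional_sound()
            continue
        unit.rotate_speaker_toward(route)                 # S21
        if not unit.directional_sound_on():               # S22: N
            unit.start_directional_sound()                # S23
```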
  • FIG. 5 shows an example in which, when the shield SO and the reflectors RO1 and RO2 are detected and the shielding notification area RA1 is specified, one route RT1 along which the directional sound can reach the shielding notification area RA1 is extracted. In this example, rotation control is performed to direct the speaker SPH of the directional sound output unit 210H in the direction that causes the directional sound to travel along the route RT1.
  • FIG. 6 shows an example in which, when the shield SO and the reflectors RO1 and RO2 are detected and the shielding notification area RA2 is specified, two routes along which the directional sound can reach the shielding notification area RA2 are extracted. In this case, rotation control is performed to direct the speaker SPH of the directional sound output unit 210H in the direction that causes the directional sound to travel along the route RT2, which has the smallest number of reflections.
  • As described above, in the first example, the control unit 110A detects, based on the peripheral image on the traveling direction side of the vehicle CR, the current position of the vehicle CR, and the map information MPD in the storage unit 120, shields that obstruct the progress of the low directional sound output from the low directional sound output unit 210L.
  • The control unit 110A also detects reflectors that exist on the traveling direction side of the vehicle CR and can reflect the directional sound output from the directional sound output unit 210H. Furthermore, based on the current position and the vehicle speed of the vehicle CR, the control unit 110A specifies an area that the low directional sound does not reach because of the detected shield and in which the approach of the vehicle CR traveling at the current vehicle speed should be notified.
  • The control unit 110A then extracts, using the detected reflectors, a route along which the directional sound reaches the specified area.
  • When such a route is extracted, the control unit 110A performs, on the directional sound output unit 210H, rotation control that directs the speaker SPH of the directional sound output unit 210H in the direction that causes the directional sound to travel along the extracted route.
  • When a plurality of routes reaching the area are extracted, the route actually adopted is determined from among the routes with the smallest number of reflections, which suppresses a decrease in the volume of the directional sound that reaches the area.
  • FIG. 7 shows the arrangement of the terminal device 300 and the server device 400 according to the second example.
  • the terminal device 300 is an aspect of the terminal device 810 in the second embodiment
  • the server apparatus 400 is an aspect of the output sound management device 820 in the second embodiment.
  • the terminal device 300 is arranged in the vehicle CR.
  • a directional sound output unit 210H and a low directional sound output unit 210L are installed in the vehicle CR.
  • a photographing unit 220 is installed in the vehicle CR.
  • the server device 400 is arranged outside the vehicle CR.
  • the terminal device 300 and the server device 400 can communicate with each other via the network 500.
  • the server device 400 can communicate with other terminal devices configured in the same manner as the terminal device 300, but only the terminal device 300 is representatively shown in FIG.
  • FIG. 8 shows a schematic configuration of the terminal device 300.
  • The terminal device 300 differs from the navigation device 100 of the first example described above in that it includes a control unit 110B instead of the control unit 110A, does not include the sensor unit 160, and further includes a wireless communication unit 320 serving as the transmission unit 811 and the reception unit 812.
  • In the following, the description focuses mainly on these differences.
  • the control unit 110B includes a central processing unit (CPU) and its peripheral circuits, and performs overall control of the entire terminal device 300.
  • Various functions as the terminal device 300 are realized by the control unit 110B executing various programs. These functions include functions as the acquisition unit 720, the control unit 740B, and the sound signal generation unit 760 in the second embodiment described above.
  • the control unit 110B acquires the GPS data received from the GPS receiving unit 170, and specifies the current position and the current time based on the acquired GPS data.
  • the control unit 110B receives the image data of the peripheral video in the traveling direction of the vehicle CR sent from the image capturing unit 220. Then, the control unit 110B sends the photographing data, the current position of the vehicle CR, and the vehicle speed to the wireless communication unit 320 as terminal transmission data.
  • control unit 110B receives control information sent from the wireless communication unit 320.
  • control information includes information on rotation control of the speaker SPH of the directional sound output unit 210H and information on generation control of the first external output sound signal.
  • the control unit 110B that has received the control information performs rotation control of the speaker SPH with respect to the directional sound output unit 210H and generation control of the first external output sound signal according to the control information.
  • the wireless communication unit 320 receives the terminal transmission data sent from the control unit 110B. Then, the wireless communication unit 320 transmits the terminal transmission data to the server device 400 via the network 500.
  • the wireless communication unit 320 receives control information transmitted from the server device 400 via the network 500. Then, the wireless communication unit 320 sends the control information to the control unit 110B.
  • FIG. 9 shows a schematic configuration of the server apparatus 400.
  • The server device 400 includes a control unit 110C, a storage unit 410, and an external communication unit 420 serving as the reception unit 821 and the transmission unit 823.
  • the control unit 110C described above includes a central processing unit (CPU) and its peripheral circuits, and performs overall control of the server device 400 as a whole.
  • Various functions as the server device 400 are realized by the control unit 110C executing various programs. These functions include the functions as the detection unit 730 and the generation unit 822 in the second embodiment described above.
  • the storage unit 410 stores various information data used in the server device 400. Such information data includes map information MPD.
  • the storage unit 410 can be accessed by the control unit 110C.
  • the external communication unit 420 receives terminal transmission data transmitted from the terminal device 300 via the network 500. Then, the external communication unit 420 sends the terminal transmission data to the control unit 110C.
  • the external communication unit 420 receives server transmission data such as control information sent from the control unit 110C. Then, the external communication unit 420 sends the server transmission data to the terminal device 300 via the network 500.
  • With the configuration described above, the terminal transmission data output from the control unit 110B is sent to the control unit 110C via the wireless communication unit 320, the network 500, and the external communication unit 420.
  • the server transmission data output from the control unit 110C is sent to the control unit 110B via the external communication unit 420, the network 500, and the wireless communication unit 320.
  • shooting data of peripheral images in the traveling direction of the vehicle CR is sequentially sent from the shooting unit 220 to the control unit 110B.
  • GPS data is sequentially transmitted from the GPS receiving unit 170 to the control unit 110B.
  • the control unit 110B sends the image data and the current position included in the GPS data to the control unit 110C as terminal transmission data.
  • In addition, the control unit 110B calculates the vehicle speed from the time change of the current position included in the GPS data, and sends the calculated vehicle speed to the control unit 110C as part of the terminal transmission data.
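  • Deriving the vehicle speed from the time change of the current position can be sketched as below, assuming each GPS fix provides latitude, longitude, and a timestamp; the haversine distance is used here purely for illustration.

```python
import math

def speed_from_gps(prev_fix, curr_fix):
    """Vehicle speed in km/h from two GPS fixes, each a (lat_deg, lon_deg, time_s) tuple."""
    lat1, lon1, t1 = prev_fix
    lat2, lon2, t2 = curr_fix
    if t2 <= t1:
        return 0.0
    # Haversine great-circle distance in metres.
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    dist_m = 2 * r * math.asin(math.sqrt(a))
    return dist_m / (t2 - t1) * 3.6

# Example: two fixes one second apart, roughly 10 m apart, give about 36 km/h.
print(round(speed_from_gps((35.00000, 135.00000, 0.0), (35.00009, 135.00000, 1.0))))
```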
  • The control unit 110C that has received the terminal transmission data sent from the control unit 110B generates the control information described above in the same manner as the control unit 110A, while appropriately referring to the map information MPD in the storage unit 410 based on the photographing data and the current position. Then, the control unit 110C sends the generated control information to the control unit 110B.
  • Upon receiving the control information, the control unit 110B controls the rotation of the speaker SPH of the directional sound output unit 210H according to the control information, and supplies the first external output sound signal to the directional sound output unit 210H.
  • As a result, rotation control is performed so that the speaker SPH of the directional sound output unit 210H is directed in the direction that causes the directional sound to travel along the extracted route.
  • As described above, in the second example, the control unit 110B of the terminal device 300 and the control unit 110C of the server device 400 cooperate to detect, based on the peripheral image on the traveling direction side of the vehicle CR, the current position of the vehicle CR, and the map information MPD in the storage unit 410, shields that obstruct the progress of the low directional sound output from the low directional sound output unit 210L.
  • The control unit 110B and the control unit 110C also cooperate to detect reflectors that exist on the traveling direction side of the vehicle CR and can reflect the directional sound output from the directional sound output unit 210H.
  • Based on the current position and the vehicle speed of the vehicle CR, the control unit 110C specifies an area that the low directional sound does not reach because of the detected shield and in which the approach of the vehicle CR traveling at the current vehicle speed should be notified. The control unit 110C then extracts, using the detected reflectors, a route along which the directional sound reaches the specified area.
  • In the first and second examples described above, the output direction of the directional sound propagating outside the vehicle is controlled by controlling the direction of the speaker SPH.
  • However, the present invention can also be applied to a case where a directional sound output unit that controls the output direction of the directional sound propagating outside the vehicle by another method is connected.
  • In the examples described above, the shields and reflectors are detected based on the photographing data, the current position, and the map information.
  • However, the shields and reflectors may be detected based only on the photographing data, or based only on the current position and the map information.
  • In the examples described above, a flat speaker is used as the speaker SPH that outputs the directional sound, but another type of speaker may be used as long as it outputs a directional sound.
  • Likewise, a cone speaker is used as the speaker SPL that outputs the low directional sound, but another type of speaker may be used as long as it outputs a low directional sound.
  • In the first example described above, the storage unit of the navigation device stores the sound source data. However, when a sharable storage unit is available, that storage unit may be used; in this case, the sound data of the external output sound can be omitted from the information stored in the storage unit of the navigation device.
  • In the first example, the control unit of the navigation device generates the first and second external output sound signals. However, the directional sound output unit may generate the first external output sound signal, and the low directional sound output unit may generate the second external output sound signal.
  • Similarly, in the second example, the storage unit of the terminal device stores the sound source data. However, when a sharable storage unit is available, that storage unit may be used for storing the sound source data; in this case, the sound source data can be omitted from the information stored in the storage unit of the terminal device.
  • the terminal device includes a sound output unit, a display unit, and an input unit.

Landscapes

  • Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Navigation (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)

Abstract

In the present invention, a detection unit (730) detects a shield that obstructs the progress of a low directional sound output from a low directional sound output unit (910L). The detection unit (730) also detects a reflector that can reflect a directional sound output from a directional sound output unit (910H) and that is present in the traveling direction of a vehicle (CR). Then, based on the current position and the current speed of the vehicle (CR), a control unit (740A) specifies an area in which the approach of the vehicle (CR) traveling at the current speed should be notified and which the low directional sound does not reach because of the detected shield. The control unit (740A) also uses the reflector included in the detection results to extract a route for causing the directional sound to reach the shielding notification area. When such a route is extracted, the control unit (740A) subjects the directional sound output unit (910H) to rotation control that directs a speaker (SPH) of the directional sound output unit (910H) in a direction for causing the directional sound to travel along the extracted route.
PCT/JP2012/058466 2012-03-29 2012-03-29 Dispositif sonore, dispositif de gestion du son émis, dispositif terminal, et procédé de contrôle du son émis WO2013145242A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2012/058466 WO2013145242A1 (fr) 2012-03-29 2012-03-29 Dispositif sonore, dispositif de gestion du son émis, dispositif terminal, et procédé de contrôle du son émis

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2012/058466 WO2013145242A1 (fr) 2012-03-29 2012-03-29 Dispositif sonore, dispositif de gestion du son émis, dispositif terminal, et procédé de contrôle du son émis

Publications (1)

Publication Number Publication Date
WO2013145242A1 true WO2013145242A1 (fr) 2013-10-03

Family

ID=49258607

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/058466 WO2013145242A1 (fr) 2012-03-29 2012-03-29 Dispositif sonore, dispositif de gestion du son émis, dispositif terminal, et procédé de contrôle du son émis

Country Status (1)

Country Link
WO (1) WO2013145242A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011031695A (ja) * 2009-07-30 2011-02-17 Denso Corp Vehicle presence notification device
JP2012038136A (ja) * 2010-08-09 2012-02-23 Tabuchi Electric Co Ltd Electric vehicle and travel control system for the electric vehicle

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011031695A (ja) * 2009-07-30 2011-02-17 Denso Corp Vehicle presence notification device
JP2012038136A (ja) * 2010-08-09 2012-02-23 Tabuchi Electric Co Ltd Electric vehicle and travel control system for the electric vehicle

Similar Documents

Publication Publication Date Title
JP6531144B2 (ja) Emergency handling system for autonomous vehicles (ADV)
WO2011118064A1 (fr) Device for producing vehicle-like sounds and method for producing vehicle-like sounds
US20190039613A1 (en) Apparatus and method for changing route of vehicle based on emergency vehicle
JP2021043788A (ja) Vehicle remote instruction system
JP2020091790A (ja) Automated driving system
JP2008242844A (ja) In-vehicle device for driving support
JP2011109170A (ja) Vehicle surroundings display device and vehicle surroundings display method
JP5263507B2 (ja) Vehicle driving support device
JP5980607B2 (ja) Navigation device
JP2006277547A (ja) Driving assistance device for vehicle
JP5954520B2 (ja) Vehicle approach notification device
JP2016040644A (ja) Driving support system and data structure
JP2020009285A (ja) Driving support device, method, and program
JP5521575B2 (ja) Vehicle approach notification device
JP2014044458A (ja) In-vehicle device and danger notification method
JPH1063995A (ja) Emergency vehicle position detection device and method, and evacuation instruction method
WO2013145242A1 (fr) Acoustic device, output sound management device, terminal device, and output sound control method
JP2021018636A (ja) Vehicle remote instruction system
WO2013145083A1 (fr) Audio device, sound output management device, terminal device, and sound output control method
JP2008305283A (ja) Driving support device and driving support method
JP4944283B1 (ja) Acoustic device and output sound control method
JP6310381B2 (ja) Information processing device, method for providing guidance on traffic signal information, and computer program
JP6505199B2 (ja) Driving support system and program
JP5211234B2 (ja) Vehicle-evoking sound output control device and vehicle-evoking sound output control method
KR101971262B1 (ko) Accident management method and system based on traffic prediction

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12873051

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12873051

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP