WO2018163545A1 - Information processing apparatus, information processing method, and recording medium - Google Patents
Information processing apparatus, information processing method, and recording medium
- Publication number
- WO2018163545A1 (PCT/JP2017/044081)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sound source
- moving body
- information processing
- processing apparatus
- sound
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R3/00—Circuits for transducers, loudspeakers or microphones
- H04R3/005—Circuits for transducers, loudspeakers or microphones for combining the signals of two or more microphones
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/02—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
- G01S15/50—Systems of measurement, based on relative movement of the target
- G01S15/52—Discriminating between fixed and moving objects or between objects moving at different speeds
- G01S15/523—Discriminating between fixed and moving objects or between objects moving at different speeds for presence detection
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R11/00—Arrangements for holding or mounting articles, not otherwise provided for
- B60R11/02—Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R11/00—Arrangements for holding or mounting articles, not otherwise provided for
- B60R11/02—Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
- B60R11/0217—Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof for loud-speakers
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/02—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
- G01S15/50—Systems of measurement, based on relative movement of the target
- G01S15/58—Velocity or trajectory determination systems; Sense-of-movement determination systems
- G01S15/582—Velocity or trajectory determination systems; Sense-of-movement determination systems using transmission of interrupted pulse-modulated waves and based upon the Doppler effect resulting from movement of targets
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/02—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
- G01S15/50—Systems of measurement, based on relative movement of the target
- G01S15/58—Velocity or trajectory determination systems; Sense-of-movement determination systems
- G01S15/588—Velocity or trajectory determination systems; Sense-of-movement determination systems measuring the velocity vector
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/52017—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
- G01S7/52023—Details of receivers
- G01S7/52025—Details of receivers for pulse systems
- G01S7/52026—Extracting wanted echo signals
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/539—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R3/00—Circuits for transducers, loudspeakers or microphones
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04S—STEREOPHONIC SYSTEMS
- H04S7/00—Indicating arrangements; Control arrangements, e.g. balance control
- H04S7/30—Control circuits for electronic adaptation of the sound field
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2520/00—Input parameters relating to overall vehicle dynamics
- B60W2520/10—Longitudinal speed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2556/00—Input parameters relating to data
- B60W2556/45—External transmission of data to or from the vehicle
- B60W2556/50—External transmission of data to or from the vehicle for navigation systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R2499/00—Aspects covered by H04R or H04S not otherwise provided for in their subgroups
- H04R2499/10—General applications
- H04R2499/13—Acoustic transducers and sound field adaptation in vehicles
Definitions
- the present disclosure relates to an information processing apparatus, an information processing method, and a recording medium.
- Patent Document 1 discloses a technique for reducing the user's feeling of being cut off from the surroundings by capturing external sound through headphones.
- the present disclosure provides a mechanism for selectively capturing external sound from an appropriate sound source into an internal space.
- an information processing apparatus is provided that includes an acquisition unit that acquires audio signals from sound sources existing outside a moving body, a generation unit that generates, based on the audio signals acquired by the acquisition unit, an audio signal of a target sound source, that is, a sound source whose distance from the moving body corresponds to the speed of the moving body, and an output control unit that causes an output device to output the audio signal generated by the generation unit toward the internal space of the moving body.
- an information processing method is provided in which audio signals are acquired from sound sources existing outside a moving body, an audio signal of a target sound source, a sound source whose distance from the moving body corresponds to the speed of the moving body, is generated based on the acquired audio signals, and the generated audio signal is output toward the internal space of the moving body by an output device.
- a recording medium is provided on which a program is recorded that causes a computer to function as an acquisition unit that acquires audio signals from sound sources existing outside a moving body, a generation unit that generates, based on the acquired audio signals, an audio signal of a target sound source whose distance from the moving body corresponds to the speed of the moving body, and an output control unit that causes an output device to output the generated audio signal toward the internal space of the moving body.
- a signal generated based on audio signals acquired from sound sources existing outside the moving body is output toward the internal space of the moving body. External sound can therefore be taken into the internal space, and external sound from an appropriate sound source can be captured selectively according to the speed of the moving body.
- a mechanism for selectively capturing external sound from an appropriate sound source into the internal space is provided.
- the above effects are not necessarily limiting; any of the effects shown in this specification, or other effects that can be grasped from this specification, may be achieved together with or in place of the above effects.
- FIG. 1 is a diagram for describing an example of an information processing apparatus according to an embodiment of the present disclosure. As shown in FIG. 1, the moving body 10 travels in the vicinity of the sound sources 20A and 20B.
- the moving body 10 has an internal space, that is, a space in which a person stays inside the moving body 10.
- the moving body 10 shown in FIG. 1 is provided with a seat in the internal space, so that a driver, a passenger, etc. can stay there.
- the moving body 10 is an automobile.
- the moving body 10 may be any moving body having an internal space in which a person stays, such as a construction vehicle, a ship, an airplane, a train, a vehicle in an amusement park, or playground equipment.
- the sound source 20A is a clock tower that sounds a time signal on the hour.
- the sound source 20B is a pedestrian, who suddenly produces sounds such as a voice or footsteps.
- these sounds outside the vehicle (hereinafter also referred to as external sounds) are difficult for people in the vehicle to hear, because the internal space of the moving body 10 is isolated from the external space.
- hereinafter, the internal space of the moving body 10 is also referred to as the vehicle interior space, and the external space is also referred to as the vehicle exterior space.
- the information processing apparatus reproduces the external sound toward the vehicle interior space, that is, performs processing for capturing the external sound into the vehicle interior space.
- the information processing apparatus detects external sounds using the microphones 110A to 110D, applies predetermined signal processing to the audio signals representing the detected external sounds, and outputs the processed signals from the speakers 120A to 120D into the interior space.
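The disclosure does not specify what the "predetermined signal processing" is. Purely as an illustrative sketch, a simple delay-and-sum beamformer is one common way to emphasize sound arriving from a chosen direction before playing it back in the cabin; the function name and the integer-sample delay model below are assumptions, not part of the disclosure.

```python
def delay_and_sum(mic_signals, mic_delays_s, fs):
    """Illustrative delay-and-sum beamformer (not the patent's method).

    mic_signals: list of equal-length channels (lists of samples).
    mic_delays_s: per-microphone arrival delay, in seconds, for the
    direction of interest. Each channel is advanced by its delay so
    that sound from that direction adds coherently, then averaged.
    """
    n = len(mic_signals[0])
    base = min(mic_delays_s)
    out = [0.0] * n
    for sig, delay in zip(mic_signals, mic_delays_s):
        shift = round((delay - base) * fs)  # integer-sample advance
        for i in range(n - shift):
            out[i] += sig[i + shift]
    k = len(mic_signals)
    return [v / k for v in out]
```

A channel arriving two samples late, once advanced by two samples, lines up with the reference channel; sound from other directions adds incoherently and is attenuated.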
- the external sound can be taken in for driving assistance, for example.
- external sound taken into the internal space lets the driver know the surrounding situation. For example, the driver can prevent an accident by hearing the sound of the pedestrian 20B and reducing speed in case the pedestrian 20B jumps out into the road.
- External sound can be taken in for entertainment purposes, for example.
- in a vehicle fitted with glass for weather protection and safety, such as a dedicated vehicle in a safari park, a sound environment as if there were no glass can be realized, and people in the vehicle interior space can enjoy a realistic experience.
- the information processing apparatus according to the present embodiment is realized as an apparatus mounted on the moving body, or as a server apparatus capable of communicating with the moving body, and controls the external sound capturing process.
- the information processing apparatus according to the present embodiment will be described as being mounted on the moving body 10.
- FIG. 2 is a block diagram illustrating an example of a logical configuration of the information processing apparatus according to the present embodiment.
- the information processing apparatus 100 includes a sensor unit 110, an output unit 120, a communication unit 130, a storage unit 140, and a processing unit 150.
- the sensor unit 110 has a function of detecting various information.
- the sensor unit 110 outputs the detected sensor information to the processing unit 150.
- the sensor unit 110 may include a sensor that detects information on the external space of the moving body.
- the sensor unit 110 detects external sound as information on the external space of the moving body.
- the sensor that detects external sound may include a plurality of microphones, such as the microphones 110A to 110D shown in FIG. 1. Of course, there may be only one microphone.
- a microphone that detects external sound is typically installed on the moving body with at least part of its components, such as the diaphragm, exposed to the external space (that is, in contact with the outside air).
- a microphone that detects external sound is installed on the outer wall of the vehicle body.
- the sensor unit 110 may detect a captured image, depth information, an infrared image, or the like as information on the external space of the moving body.
- the sensor unit 110 may include a sensor that detects information related to the state of the moving object. For example, the sensor unit 110 detects the speed, acceleration, position information, or the like of the moving body as information regarding the state of the moving body.
- the sensor unit 110 may include a sensor that detects information on the internal space of the moving body.
- the sensor unit 110 detects input information, sound, captured images, biological information, or the like by a person in the internal space as information on the internal space of the moving body.
- the sensor unit 110 may include a clock that detects time information.
- the output unit 120 has a function of outputting various information.
- the output unit 120 outputs information based on control by the processing unit 150.
- the output unit 120 can output information to the internal space of the moving body.
- the output unit 120 may include a plurality of speakers, such as the speakers 120A to 120D illustrated in FIG. 1. Of course, there may be only one speaker.
- the speaker is typically installed in an interior space such as the inside of a door.
- the output unit 120 may output an image (still image or moving image), vibration, or the like.
- the communication unit 130 has a function of transmitting and receiving signals to and from other devices.
- the communication unit 130 can communicate using any communication standard such as cellular communication such as 4G or 5G, wireless LAN (Local Area Network), Wi-Fi (registered trademark), or Bluetooth (registered trademark).
- the communication unit 130 can communicate using V2X (Vehicle to Everything) communication.
- the storage unit 140 has a function of temporarily or permanently storing information for the operation of the information processing apparatus 100.
- the storage unit 140 stores known sound source information described later.
- the processing unit 150 provides various functions of the information processing apparatus 100.
- the processing unit 150 includes an acquisition unit 151, a setting unit 153, a generation unit 155, and an output control unit 157.
- the processing unit 150 may further include components other than these. That is, the processing unit 150 can perform operations other than the operations of these components.
- the function of each component of the acquisition unit 151, the setting unit 153, the generation unit 155, and the output control unit 157 will be described in detail later.
- the information processing apparatus 100 acquires an audio signal (that is, a signal representing an external sound) from a sound source that exists outside the moving body.
- for example, the information processing apparatus 100 acquires audio signals of sounds actually being emitted by sound sources, as detected by the microphones 110A to 110D shown in FIG. 1.
- after acquiring the audio signal, the information processing apparatus 100 (for example, the setting unit 153) sets the operation mode.
- for example, the information processing apparatus 100 selects and sets an operation mode from an operation mode group including a plurality of operation modes.
- the operation mode group may include a first operation mode in which output to the internal space of the moving body is not performed. By operating in the first operation mode, the information processing apparatus 100 can stop the external sound from being taken into the vehicle interior space.
- the operation mode group may include a second operation mode in which output to the internal space of the moving object is performed.
- the information processing apparatus 100 can take in external sound into the vehicle interior space by operating in the second operation mode.
- the second operation mode can be subdivided into third to fifth operation modes described below.
- the second operation mode may include a third operation mode in which a known sound source is a target sound source.
- a known sound source is a sound source that emits sound according to a predetermined condition.
- examples of known sound sources include sources that emit sound on a schedule determined by conditions such as place and time, for instance the clock tower 20A shown in FIG. 1, a fireworks display, or a railroad crossing.
- in the third operation mode, sounds such as the chime of a clock tower at a tourist spot or a crossing signal can be taken into the vehicle interior space, so the third operation mode is effective for both driving support and entertainment.
- the second operation mode may include a fourth operation mode in which an unknown sound source is a target sound source.
- an unknown sound source is a sound source that suddenly emits sound.
- an example of an unknown sound source is the pedestrian 20B shown in FIG. 1.
- in the fourth operation mode, suddenly generated sounds, such as sounds from pedestrians or oncoming vehicles, can be taken into the vehicle interior space, so the fourth operation mode is particularly effective for driving support.
- the second operation mode may include a fifth operation mode in which a known sound source and an unknown sound source are target sound sources.
- the information processing apparatus 100 can take in external sound from both the known sound source and the unknown sound source into the vehicle interior space.
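The relationship between the operation modes described above can be condensed into a small sketch. The enum and function names below are invented for illustration and do not appear in the disclosure; the second operation mode is the umbrella over the third to fifth modes.

```python
from enum import Enum, auto

class Mode(Enum):
    NO_OUTPUT = auto()   # first operation mode: nothing is output to the cabin
    KNOWN = auto()       # third operation mode: known sound sources only
    UNKNOWN = auto()     # fourth operation mode: unknown (sudden) sources only
    BOTH = auto()        # fifth operation mode: known and unknown sources

def captures(mode: Mode, source_is_known: bool) -> bool:
    """Whether a source of the given kind is captured into the cabin.

    KNOWN, UNKNOWN, and BOTH are the subdivisions of the second
    operation mode, in which output to the internal space is performed.
    """
    if mode is Mode.NO_OUTPUT:
        return False
    if mode is Mode.KNOWN:
        return source_is_known
    if mode is Mode.UNKNOWN:
        return not source_is_known
    return True  # Mode.BOTH captures every source
```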
- the information processing apparatus 100 (for example, the setting unit 153) can set (in other words, switch) the operation mode based on various criteria.
- for example, the information processing apparatus 100 may set the operation mode based on audio signals from sound sources existing outside the moving body. Specifically, it can switch between the first operation mode and the second operation mode according to the presence or absence, loudness, or type of external sound.
- the information processing apparatus 100 may set an operation mode based on the known sound source information.
- the known sound source information may include, for example, map information, and the information processing apparatus 100 may set the operation mode based on the map information indicating the position of the known sound source and the position information of the moving body.
- the map information includes information indicating a range in which sound from a known sound source should be taken.
- the range in which sound from a known sound source should be captured may be defined, for example, as a circle centered on the position of the known sound source; in that case, the information indicating the range is the position information of the known sound source and distance information giving the radius of the circle.
- the map information 30 includes position information indicating a position 31 of a known sound source and information indicating a range 32 within a predetermined distance from the known sound source.
- the information processing apparatus 100 refers to the map information 30 and sets the third operation mode when the moving body is located within the range 32, and the fourth operation mode when the moving body is located outside the range 32.
- the map information may further include information indicating a time zone in which the known sound source emits sound.
- in that case, the information processing apparatus 100 also sets the operation mode based on the current time. For example, it sets the third operation mode when the moving body is located within the capture range of a known sound source and the current time falls within the time zone in which that source emits sound. As a result, the information processing apparatus 100 can take sound from the known sound source into the vehicle interior precisely during the time zone in which the source emits sound.
- the map information may also include information indicating road types, such as general roads, highways, elevated roads, and underground roads, and the information processing apparatus 100 may set the operation mode based on such information.
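The map-based switching described above can be sketched as follows, assuming each known source is stored with a position, a capture radius, and an active time window. The data layout, field names, and the flat x/y distance model are illustrative assumptions, not part of the disclosure.

```python
import math
from datetime import time

def select_mode(vehicle_xy, known_sources, now):
    """Return 'third' when the vehicle is inside some known source's
    capture circle during that source's active time window, else 'fourth'.

    known_sources: iterable of dicts with keys 'xy' (metres), 'radius_m',
    'start', and 'end' (datetime.time). Layout is an assumption.
    """
    for src in known_sources:
        dist = math.hypot(vehicle_xy[0] - src["xy"][0],
                          vehicle_xy[1] - src["xy"][1])
        in_range = dist <= src["radius_m"]
        in_window = src["start"] <= now <= src["end"]
        if in_range and in_window:
            return "third"   # capture the known sound source
    return "fourth"          # only sudden, unknown sources are captured
```

A clock tower with a 100 m capture radius that rings around noon would put a vehicle 50 m away into the third mode at 12:00, but leave it in the fourth mode at 9:00 or when 500 m away.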
- the known sound source information may include additional sound.
- the additional sound is a sound signal associated with a known sound source, and is stored in advance in the storage unit 140 or an external server device.
- the information processing apparatus 100 acquires and outputs additional sound stored in advance related to the known sound source.
- the additional sound is, for example, narration related to a known sound source, such as the history of the clock tower 20A in the example shown in FIG. 1, or a sound effect in an amusement park. In the example shown in FIG. 1, when the car 10 is located within a predetermined distance of the clock tower 20A during the time zone in which the clock tower 20A rings, the information processing apparatus 100 outputs to the vehicle interior space the sound actually ringing from the clock tower 20A, together with a sound explaining the history of the clock tower 20A. Additional sound can thus enhance the amusement value in entertainment applications.
- the storage location of the known sound source information is arbitrary.
- the known sound source information may be stored in the information processing apparatus 100 or may be stored in a server apparatus on the Internet.
- for example, the map information among the known sound source information may be stored in the information processing apparatus 100, while the additional sound is stored in a server apparatus and downloaded as necessary.
- the information processing apparatus 100 may also set the operation mode based on information from an external device.
- the external device may be any device capable of V2X (Vehicle to Everything) communication, such as another car, a device carried by a pedestrian, or an RSU (Road Side Unit).
- for example, the information processing apparatus 100 recognizes, based on information from an RSU, the presence of a vehicle traveling in the driver's blind spot, sets the fourth operation mode, and takes the sound from that vehicle into the interior space, which can help prevent accidents.
- the information processing apparatus 100 may set an operation mode based on a user operation. For example, the information processing apparatus 100 may set an operation mode according to an operation mode selection operation by the user.
- after setting the operation mode, the information processing apparatus 100 (for example, the generation unit 155) selects, from the sound sources (unknown sound sources and/or known sound sources) existing around the moving body, the target sound sources to be captured into the vehicle interior according to the set operation mode. Of course, all sound sources may be selected as target sound sources. In the third or fifth operation mode, a known sound source may be excluded from the target sound sources, or a known sound source may always be selected as a target sound source.
- the information processing apparatus 100 may use one or a plurality of determination criteria described below in combination.
- the information processing apparatus 100 may select, as a target sound source, a sound source whose distance from the moving body is a distance according to the speed of the moving body.
- specifically, the higher the speed of the moving body, the farther the sound sources the information processing apparatus 100 selects as target sound sources; the lower the speed, the nearer the sound sources it selects.
- when the vehicle speed is high, the moving body leaves nearby sound sources immediately, so sound from distant sound sources is considered more useful; when the vehicle speed is low, it takes time to approach distant sound sources, so sound from nearby sound sources is considered more useful.
- the information processing apparatus 100 can capture a more useful sound into the vehicle interior space by using a sound source at a distance corresponding to the speed of the moving object as a target sound source.
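One way to realize this speed-dependent selection is a distance window that moves outward and widens with speed. The window shape and every numeric threshold below are illustrative assumptions; the disclosure states only the qualitative relationship between speed and useful distance.

```python
def target_sources(sources, speed_mps):
    """Select (name, distance_m) sources whose distance suits the speed.

    Faster vehicles look farther ahead, since they reach distant sources
    soon and leave nearby ones immediately; slower vehicles favour nearby
    sources. All thresholds are illustrative, not from the disclosure.
    """
    horizon_s = 5.0                                 # look-ahead time
    near_cut = 10.0 if speed_mps > 10.0 else 0.0    # skip very near sources at speed
    far_cut = max(speed_mps * horizon_s, 30.0)      # widen the window with speed
    return [name for name, d in sources if near_cut <= d <= far_cut]
```

At 30 m/s this window keeps a crossing 120 m ahead but drops a pedestrian 8 m away (already being passed); at 5 m/s it keeps the pedestrian and drops the distant crossing.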
- the information processing apparatus 100 may use one or more of the determination criteria described below.
- FIGS. 4 and 5 are diagrams for explaining an example of processing for determining the distance between the moving body and a sound source according to the present embodiment.
- FIGS. 4 and 5 show the position 41 of the moving body 10 at a certain time point (hereinafter also referred to as the reference time point), the position 42 after t seconds, the sound source 20A near the moving body 10 at the reference time point, and the sound source 20B far from it. FIG. 4 is an example in which the speed of the moving body 10 is high, and FIG. 5 is an example in which the speed of the moving body 10 is low.
- for example, the information processing apparatus 100 may determine the distance between the moving body and a sound source based on the time-series change in the azimuth of the sound source relative to the moving body.
- referring to FIG. 4, when the speed of the moving body 10 is high, the time-series change over t seconds in the azimuth of the sound source 20A relative to the moving body 10 (from angle 43 to angle 44) is large, whereas the change over t seconds in the azimuth of the sound source 20B (from angle 45 to angle 46) is small.
- accordingly, when the speed of the moving body 10 is high, the information processing apparatus 100 can determine that a sound source whose azimuth relative to the moving body changes greatly over time is near, and that a sound source whose azimuth changes little is far.
- whether the sound source observed at the reference time point and the sound source observed t seconds later are the same may be determined based on, for example, the spectral shape of the sound.
- the information processing apparatus 100 may also determine the distance between the moving body and a sound source based on the time-series change in sound pressure from the sound source.
- referring to FIG. 4, when the speed of the moving body 10 is high, the time-series change over t seconds in the sound pressure from the sound source 20A is large, because the distance between the moving body 10 and the sound source 20A changes greatly. In contrast, the change over t seconds in the sound pressure from the sound source 20B is small, because the distance between the moving body 10 and the sound source 20B changes little. Therefore, when the speed of the moving body 10 is high, the information processing apparatus 100 can determine that a sound source with a large time-series change in sound pressure is near, and that a sound source with a small change is far.
- the time-series change of the direction of the sound source and the time-series change of the sound pressure are different depending on whether the distance between the moving body and the sound source is long or near. It is thought that there are few. Specifically, when the speed of the moving body 10 is slow, the time-series change (from angle 43 to angle 44) of the direction of the sound source 20A relative to the moving body 10 is small, and the direction t of the sound source 20B is t. The time-series change in seconds (from angle 45 to angle 46) is also small.
- likewise, the time-series change of the sound pressure from the sound source 20A over t seconds is small because the distance between the moving body 10 and the sound source 20A does not change much, and the time-series change of the sound pressure from the sound source 20B over t seconds is also small because the distance between the moving body 10 and the sound source 20B does not change much.
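The determination for the high-speed case described above can be sketched as follows; the threshold values are illustrative assumptions, not values from the source:

```python
def classify_distance(azimuth_change_deg, pressure_change_db,
                      azimuth_thresh=10.0, pressure_thresh=3.0):
    """When the moving body is fast, a large time-series change in azimuth or
    sound pressure over t seconds indicates a nearby source; small changes
    indicate a distant one. Threshold values here are illustrative."""
    if abs(azimuth_change_deg) > azimuth_thresh or abs(pressure_change_db) > pressure_thresh:
        return "near"
    return "far"
```

As the text notes, this criterion is only meaningful when the speed of the moving body (or the relative speed) is high; at low speed both cues are small for near and far sources alike.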
- the information processing apparatus 100 may determine the distance between the moving body and the sound source based on the absolute value of the sound pressure from the sound source.
- the absolute value of the sound pressure is considered to be larger as the distance between the moving body and the sound source is shorter, and smaller as it is longer. Therefore, when the speed of the moving body is low, the information processing apparatus 100 can determine the distance between the moving body and the sound source by referring to the absolute value of the sound pressure.
- the information processing apparatus 100 can increase the determination accuracy by referring to the absolute value of the sound pressure even when the speed of the moving body is high.
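Assuming a point source that follows the 1/r law (a 6 dB drop per doubling of distance) and a known reference level, which the source does not specify, a sketch of the absolute-sound-pressure distance estimate could be:

```python
def estimate_distance_m(spl_db, ref_spl_db=90.0, ref_distance_m=1.0):
    """Distance estimate from the absolute sound pressure level, assuming a
    point source whose level falls 6 dB per doubling of distance and whose
    level at the reference distance is known (assumed values)."""
    return ref_distance_m * 10.0 ** ((ref_spl_db - spl_db) / 20.0)
```

In practice the reference level would differ per sound source type (horn, siren, voice), so this only ranks sources of the same type by distance.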
- the information processing apparatus 100 has been described as selecting, as the target sound source, a sound source whose distance from the moving body corresponds to the speed of the moving body.
- the speed may be a relative speed between the moving body and the sound source.
- for example, the information processing apparatus 100 can determine the distance from the moving body based on the determination criterion for a low speed described above when the moving body and the sound source run side by side and the relative speed is low. Likewise, the information processing apparatus 100 can determine the distance from the moving body based on the determination criterion for a high speed described above when the moving body and the sound source approach each other head-on and the relative speed is high.
- the information processing apparatus 100 may select whether or not to use a sound source as the target sound source according to the type of the sound source. For example, the information processing apparatus 100 may selectively take a human voice into the vehicle interior space while not taking in music playing outside the vehicle. By such switching, the information processing apparatus 100 can selectively capture only the sound that should be captured into the vehicle interior space.
- the information processing apparatus 100 may select whether or not to use a sound source as the target sound source depending on whether or not the sound from the sound source is directed at the moving body. For example, the information processing apparatus 100 may selectively take into the vehicle interior space a horn sounded toward the moving body, while not taking in a horn sounded toward another vehicle. Such switching can be performed based on, for example, the direction of the car that honked the horn. By such switching, the information processing apparatus 100 can selectively capture only the sound that should be captured into the vehicle interior space.
- after selecting the target sound source, the information processing apparatus 100 (for example, the generation unit 155) generates an audio signal from the target sound source based on the acquired audio signal.
- the generation of an audio signal from the target sound source can be performed by applying so-called beam forming processing to the acquired audio signal.
- the information processing apparatus 100 may apply signal processing that cancels sound other than sound from the target sound source to the acquired audio signal. Further, the information processing apparatus 100 may apply signal processing that emphasizes the sound from the target sound source to the acquired audio signal.
- the information processing apparatus 100 may use an additional sound as the audio signal from the target sound source, instead of or together with the generation by the signal processing described above.
- after generating the audio signal from the target sound source, the information processing apparatus 100 (for example, the output control unit 157) causes the generated audio signal to be output toward the internal space of the moving body. For example, the information processing apparatus 100 outputs the generated audio signal through the speakers 120A to 120D illustrated in FIG.
- FIG. 6 is a diagram for explaining an example of signal processing by the information processing apparatus 100 according to the present embodiment.
- the information processing apparatus 100 inputs a plurality of audio signals detected by a plurality of microphones (for example, a microphone array) included in the sensor unit 110 to the howling suppression unit 161.
- the howling suppression unit 161 performs processing for suppressing howling. Howling is a phenomenon in which the system oscillates because a feedback loop is formed, in which the signal emitted from the speaker is input to the microphone again. The howling suppression process will be described with reference to FIG. 7.
- FIG. 7 is a diagram for explaining an example of howling suppression processing according to the present embodiment.
- as shown in FIG. 7, howling suppression processing by the howling suppression unit 161 is applied to the audio signal input from the microphone 110, signal processing by the signal processing unit 201 is then applied, and the result is amplified by the amplifier 175A or 175B and output from the speaker 120.
- in FIG. 7, the other signal processing shown in FIG. 6 is simplified or omitted, and the signal processing unit 201 includes the wind noise removing unit 162 and the other blocks shown in FIG. 6.
- the howling suppression unit 161 measures the spatial transfer function H in advance.
- the howling suppression unit 161 monitors the signal X reproduced from the speaker 120 and estimates the signal Y that passes through the spatial transfer function H and enters the microphone 110 again.
- the howling suppression unit 161 analyzes the frequency characteristics of the signal Y, and when the gain of the fed-back signal Y exceeds a specified value, suppresses howling by suppressing the signal in the band exceeding the specified value. For example, the howling suppression unit 161 suppresses the signal in such a band by applying a notch filter. Note that the howling suppression unit 161 may suppress howling by stopping reproduction or lowering the gain when it is difficult to suppress howling with a notch filter.
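A minimal sketch of the notch-filter step, assuming a standard biquad notch (Audio EQ Cookbook form) and a per-band feedback-gain estimate supplied by the monitoring step; the function names and the Q value are illustrative, not from the source:

```python
import math

def notch_coeffs(freq_hz, fs_hz, q=30.0):
    """Biquad notch filter coefficients centered at freq_hz (EQ Cookbook form)."""
    w0 = 2.0 * math.pi * freq_hz / fs_hz
    alpha = math.sin(w0) / (2.0 * q)
    b = [1.0, -2.0 * math.cos(w0), 1.0]
    a = [1.0 + alpha, -2.0 * math.cos(w0), 1.0 - alpha]
    return [c / a[0] for c in b], [c / a[0] for c in a]

def biquad_filter(x, b, a):
    """Direct-form-I IIR filtering of the sequence x."""
    y, x1, x2, y1, y2 = [], 0.0, 0.0, 0.0, 0.0
    for xn in x:
        yn = b[0] * xn + b[1] * x1 + b[2] * x2 - a[1] * y1 - a[2] * y2
        x2, x1, y2, y1 = x1, xn, y1, yn
        y.append(yn)
    return y

def suppress_howling(x, fs_hz, gain_db_by_freq, gain_limit_db=0.0):
    """Notch out every band whose estimated feedback gain exceeds the limit."""
    y = list(x)
    for freq_hz, gain_db in gain_db_by_freq.items():
        if gain_db > gain_limit_db:  # loop gain above the limit: oscillation risk
            b, a = notch_coeffs(freq_hz, fs_hz)
            y = biquad_filter(y, b, a)
    return y
```

Bands whose estimated feedback gain stays under the limit pass through untouched, so the external sound is altered as little as possible.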
- as illustrated in FIG. 6, the information processing apparatus 100 inputs the signal to the wind noise removing unit 162 after applying the howling suppression process.
- the wind noise removing unit 162 performs a process of removing wind noise.
- the wind noise removal may be performed by a fixed filter. Wind noise has the feature that its energy is concentrated in the low frequency band. Therefore, the wind noise removing unit 162 can remove wind noise by applying a high-pass filter that attenuates components below about 200 to 300 Hz.
- wind noise removal may also be performed by signal processing. Since wind noise is caused by air turbulence generated in the vicinity of the microphone rather than by sound waves, different noise is input to each microphone. That is, the correlation between microphones is low for wind noise and high for normal sound waves. Therefore, the wind noise removing unit 162 calculates the correlation between the microphones for each frequency band, determines that wind noise exists when the correlation value falls below a certain threshold, and lowers the gain of the time and band in which the wind noise exists. By performing such processing, wind noise can be removed dynamically.
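The correlation-based gating described above might be sketched per frequency band as follows; the threshold and attenuation values are assumptions:

```python
import math

def zero_lag_correlation(frame_a, frame_b):
    """Normalized correlation between two microphone frames of the same band."""
    dot = sum(a * b for a, b in zip(frame_a, frame_b))
    na = math.sqrt(sum(a * a for a in frame_a))
    nb = math.sqrt(sum(b * b for b in frame_b))
    if na == 0.0 or nb == 0.0:
        return 0.0
    return dot / (na * nb)

def gate_wind_noise(frame_a, frame_b, threshold=0.5, attenuation=0.1):
    """Low inter-microphone correlation suggests wind noise: lower the gain of
    that time/band; otherwise pass the frame through unchanged."""
    if zero_lag_correlation(frame_a, frame_b) < threshold:
        return [attenuation * v for v in frame_a]
    return list(frame_a)
```

A real implementation would run this per frequency band (e.g. on band-pass filtered or STFT frames) as the text describes; the sketch shows the decision for one band.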
- Processing group 170 As illustrated in FIG. 6, the information processing apparatus 100 performs each process included in the processing group 170 after applying the wind noise removal process. The processing in the processing group 170 differs for each set operation mode. The processing group 170 illustrated in FIG. 6 includes processing executed in the fifth operation mode.
- the known sound source control unit 171 controls signal processing related to known sound sources. Specifically, the known sound source control unit 171 controls the external sound signal separation / selection unit 173A, the external sound signal correction filter 174A, and the amplifier 175A based on the position information, the time, and the map information included in the known sound source information 172. Note that the signal input to the processing group 170 passes through the external sound signal separation / selection unit 173A, the external sound signal correction filter 174A, and the amplifier 175A in this order.
- the known sound source control unit 171 may realize the operation in the first operation mode or the fourth operation mode by setting the amplification amount of the amplifier 175A to 0 (that is, setting the amplitude to 0) and stopping the reproduction of the signal related to the known sound source.
- the unknown sound source control unit 176 controls signal processing related to unknown sound sources. Specifically, the unknown sound source control unit 176 controls the external sound signal separation / selection unit 173B, the external sound signal correction filter 174B, and the amplifier 175B based on the vehicle speed (that is, the speed of the moving body) and the microphone sound pressure.
- the signal input to the processing group 170 passes through the external sound signal separation / selection unit 173B, the external sound signal correction filter 174B, and the amplifier 175B in this order.
- the unknown sound source control unit 176 may realize the operation in the first operation mode or the third operation mode by setting the amplification amount of the amplifier 175B to 0 (that is, setting the amplitude to 0) and stopping the reproduction of the signal related to the unknown sound source.
- the external sound signal separation / selection unit 173 (173A or 173B) separates and / or selects the external sound signal.
- the external sound signal separation / selection unit 173 can employ a conventionally known beam forming technique or sound source separation technique.
- an example will be described with reference to FIG. 8.
- FIG. 8 is a diagram for explaining an example of the external sound signal separation / selection process according to the present embodiment. As shown in FIG. 8, sound sources 20A, 20B, and 20C exist around the moving body 10.
- the following first to third methods are examples of sound source separation techniques using two or more microphones.
- the first method is a method of extracting a specific sound based on directionality. According to the first method, for example, only the sound source 20A can be extracted.
- the first method is, for example, delay-and-sum beamforming (DS: Delay and Sum Beamforming).
- the second method is a method of removing a specific sound based on directionality. According to the second method, for example, only the sound source 20A can be removed and the sound sources 20B and 20C can be extracted.
- the second method is, for example, blind spot control type beam forming (NBF: Null Beamforming).
- the third method is a method that, without knowledge of the sound source characteristics, separates the obtained signals into components that are statistically independent. According to the third method, the sound sources 20A, 20B, and 20C can be extracted individually.
- the third method is, for example, independent component analysis (ICA).
- the fourth method is a method of extracting each sound source based on the spectral characteristics of the sound source. According to the fourth method, the sound sources 20A, 20B, and 20C can be extracted individually.
- the fourth method is, for example, non-negative matrix factorization (NMF).
- the external sound signal separation / selection unit 173A calculates the direction of the known sound source to be acquired using the position information and map information of the moving body 10.
- the external sound signal separation / selection unit 173B estimates the direction of the unknown sound source based on the phase difference of the input signal for the unknown sound source.
- the external sound signal separation / selection unit 173 calculates the coefficient of the delayed beam forming in order to acquire the sound in the direction obtained by the calculation.
- the external sound signal separation / selection unit 173 multiplies the microphone input by a delay beamforming coefficient, and outputs a sound signal of a known sound source or an unknown sound source to be acquired to the external sound signal correction filter 174.
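A minimal delay-and-sum sketch under simplifying assumptions (far-field source, integer-sample and non-negative delays, linear array; the function names are illustrative, not the unit's actual implementation):

```python
import math

def steering_delays(mic_positions_m, azimuth_deg, fs_hz, c_m_s=343.0):
    """Integer-sample arrival delays for a far-field source at the given
    azimuth, for a linear array with the given microphone positions."""
    theta = math.radians(azimuth_deg)
    return [round(fs_hz * p * math.cos(theta) / c_m_s) for p in mic_positions_m]

def delay_and_sum(signals, delays):
    """Align each microphone signal by its (non-negative) arrival delay and
    average: sound from the steered direction adds coherently, the rest is
    attenuated."""
    n = min(len(s) for s in signals)
    d_max = max(delays)
    return [sum(s[i + d] for s, d in zip(signals, delays)) / len(signals)
            for i in range(n - d_max)]
```

The "coefficients" mentioned in the text correspond here to the per-microphone delays; a fractional-delay, windowed implementation would replace the integer rounding in practice.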
- the external sound signal correction filter 174 (174A or 174B) performs processing for correcting the external sound signal.
- the external sound signal correction processing by the external sound signal correction filter will be described with reference to FIG.
- FIG. 9 is a diagram for explaining an example of the external sound signal correction processing according to the present embodiment.
- the input signal S detected by the microphone 110 passes through the external sound signal correction filter 174, the amplifier 175, and the speaker 120 and is reproduced as a reproduction signal Z.
- the other signal processing shown in FIG. 6 is simplified or omitted.
- the reproduction signal Z is a signal that has acquired the characteristics of this path. Where the microphone characteristic is M, the filter characteristic is G, the amplifier characteristic is A, and the speaker characteristic is D, the reproduction signal Z is expressed by the formula Z = S × M × G × A × D.
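Interpreting the characteristics M, G, A, and D as per-frequency-bin gains (an interpretation for illustration, not stated in the source), a sketch of choosing the correction filter G so that the reproduction Z matches a target characteristic (here flat, Z = S) could be:

```python
def correction_filter(mic, amp, spk, target=1.0):
    """Per-bin filter characteristic G such that Z = S*M*G*A*D = target*S,
    i.e. G = target / (M*A*D); bins with zero path gain get G = 0."""
    return [target / (m * a * d) if m * a * d != 0.0 else 0.0
            for m, a, d in zip(mic, amp, spk)]

def reproduce(source, mic, g, amp, spk):
    """Reproduction signal Z = S*M*G*A*D, evaluated per frequency bin."""
    return [s * m * gi * a * d
            for s, m, gi, a, d in zip(source, mic, g, amp, spk)]
```

With G chosen this way, the coloration introduced by the microphone, amplifier, and speaker cancels, so the sound reproduced in the cabin keeps the character of the external sound S.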
- the amplifier 175 (175A or 175B) amplifies the input signal and outputs it.
- the amplification amount is controlled by the known sound source control unit 171 or the unknown sound source control unit 176.
- the information processing apparatus 100 uses an adder 163 to add the signal output from the processing group 170 to which the process related to the known sound source is applied and the signal to which the process related to the unknown sound source is applied.
- the audio mixer 164 adds a predetermined signal, via the adder 165, to the signal for capturing the external sound.
- the predetermined signal include additional sound included in the known sound source information 172, a navigation sound signal output from the car navigation device, and an audio signal output from the music playback device.
- the information processing apparatus 100 outputs from the speaker 120 an audio signal that has undergone the series of processes described above.
- FIG. 10 is a diagram for explaining an example of signal processing by the information processing apparatus 100 according to the present embodiment.
- FIG. 10 shows signal processing in the third operation mode with an NC (noise canceling) function.
- compared with the signal processing in the fifth operation mode described above with reference to FIG. 6, the blocks related to the unknown sound source (that is, the unknown sound source control unit 176, the external sound signal separation / selection unit 173B, the external sound signal correction filter 174B, and the amplifier 175B) are removed, and an NC control unit 177 and an NC filter 178, which are blocks for performing signal processing for noise canceling, are added.
- the NC control unit 177 controls signal processing for noise canceling. Specifically, the NC control unit 177 sets the filter coefficient of the NC filter 178 based on a preset NC setting. The NC filter 178 applies a filter to the signal input to the processing group 170.
- the NC control unit 177 can reduce noise by an arbitrary method such as an FB (Feedback) method, an FF (Feedforward) method, or an AFF (Adaptive Feedforward) method.
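As an illustration of the AFF (Adaptive Feedforward) idea only, not the patent's own implementation, an LMS filter can shape a reference noise signal into anti-noise so that the residual at the error microphone converges toward zero; the step size and tap count are assumed values:

```python
import math

def adaptive_ff_cancel(reference, disturbance, mu=0.05, taps=4):
    """LMS-based feedforward canceling: adapt FIR weights so the anti-noise
    matches the disturbance heard at the error microphone; returns the
    residual error sequence."""
    w = [0.0] * taps
    buf = [0.0] * taps
    residuals = []
    for r, d in zip(reference, disturbance):
        buf = [r] + buf[:-1]                      # newest reference sample first
        anti = sum(wi * bi for wi, bi in zip(w, buf))
        e = d - anti                              # residual heard in the cabin
        w = [wi + mu * e * bi for wi, bi in zip(w, buf)]  # LMS weight update
        residuals.append(e)
    return residuals
```

The residual starts at the full disturbance level and decays as the weights adapt, which is why adaptive methods can track slowly changing road or engine noise.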
- the adder 163 adds the signal output from the processing group 170 to which the processing related to the known sound source is applied and the noise cancellation signal output from the NC filter 178.
- FIG. 11 is a diagram for explaining an example of signal processing by the information processing apparatus 100 according to the present embodiment.
- FIG. 11 shows signal processing in the fourth operation mode with the NC function.
- compared with the signal processing in the fifth operation mode described above with reference to FIG. 6, the blocks related to the known sound source (that is, the known sound source control unit 171, the external sound signal separation / selection unit 173A, the external sound signal correction filter 174A, and the amplifier 175A) are removed, and an NC control unit 177 and an NC filter 178, which are blocks for performing signal processing for noise canceling, are added.
- the adder 163 adds the noise cancellation signal output from the processing group 170 (that is, output from the NC filter 178) and the signal to which the processing related to the unknown sound source is applied.
- FIG. 12 is a diagram for explaining an example of signal processing by the information processing apparatus 100 according to the present embodiment.
- FIG. 12 shows signal processing in the fifth operation mode with the NC function.
- compared with the signal processing in the fifth operation mode without the NC function described above with reference to FIG. 6, an NC control unit 177 and an NC filter 178, which are blocks for performing signal processing for noise canceling, as well as a switch 166 and an adder 167, are added.
- the NC filter 178 shown in FIG. 12 applies a filter to the audio signal detected by the microphone 110 in accordance with control by the NC control unit 177 based on preset NC settings.
- the adder 167 adds the noise cancellation signal to the output of the adder 163 (the sum of the signal to which the processing related to the known sound source is applied and the signal to which the processing related to the unknown sound source is applied), and outputs the result to the adder 165.
- the switch 166 can switch whether to reproduce the noise cancellation signal output from the NC filter 178.
- the information processing apparatus 100 may emit the sound of the vehicle interior space (that is, the internal sound) to the vehicle exterior space.
- the operation mode for releasing the internal sound to the outside space is also referred to as a sixth operation mode.
- the information processing apparatus 100 outputs an audio signal from a sound source (for example, a person such as a driver) existing in the internal space of the moving body 10 toward the outside of the moving body 10. Thereby, for example, a person inside the vehicle can talk to a person outside the vehicle.
- the sixth operation mode may be set based on various criteria. For example, the sixth operation mode can be set according to the presence / absence or volume of an internal sound, a user operation, or the like.
- the sixth operation mode can be used in combination with the second operation mode described above.
- in this way, a person inside the vehicle and a person outside the vehicle can talk. Therefore, for example, it is possible to receive a refueling service at a gas station without opening a window. Payment can also be made without opening a window by using electronic money.
- people in the car can talk with people outside the car with the windows closed, which is advantageous for crime prevention.
- FIG. 13 is a view for explaining an example of the process of releasing the internal sound to the outside space according to the present embodiment.
- the information processing apparatus 100 causes the audio signal detected by the microphone 110 provided in the internal space of the moving body 10 to be output by the speaker 120 provided on the door outer surface of the moving body 10.
- for example, the information processing apparatus 100 may switch to the sixth operation mode and emit the internal sound in response to a touch operation on the window 11 provided in the moving body 10. Specifically, when the driver touches the window 11, the mode may be switched to the sixth operation mode, and when the driver releases the hand from the window 11, the mode may be switched to another mode.
- thereby, the driver can easily start or stop a conversation with a person outside the vehicle.
- the window 11 may be a video see-through display.
- for example, the window 11 displays an image of the scene outside the vehicle on its vehicle-interior side while displaying a blacked-out screen on its vehicle-exterior side.
- thereby, a person inside the vehicle can see the outside of the vehicle in the same manner as when the window 11 is made of glass, while peeping from outside the vehicle can be prevented.
- the window 11 may be a video see-through display whose transparency can be controlled.
- the window 11 normally has low transparency, displays an image outside the vehicle on the inside of the vehicle, and displays a black screen on the outside of the vehicle.
- as shown by the window 11B in FIG. 13, the window 11 may increase its transparency when operating in the sixth operation mode, enabling a person inside the vehicle and a person outside the vehicle to see each other. This can facilitate conversation between a person inside the vehicle and a person outside the vehicle.
- the information processing apparatus 100 may be realized as an apparatus mounted on any type of vehicle such as an automobile, an electric vehicle, a hybrid electric vehicle, and a motorcycle. Further, at least a part of the components of the information processing apparatus 100 may be realized in a module for an apparatus mounted on a vehicle (for example, an integrated circuit module configured by one die).
- FIG. 14 is a block diagram illustrating an example of a schematic configuration of a vehicle control system 900 to which the technology according to the present disclosure can be applied.
- the vehicle control system 900 includes an electronic control unit 902, a storage device 904, an input device 906, a vehicle outside sensor 908, a vehicle state sensor 910, a passenger sensor 912, a communication IF 914, an output device 916, a power generation device 918, a braking device 920, a steering 922, and a lamp actuating device 924.
- the electronic control unit 902 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the vehicle control system 900 according to various programs.
- the electronic control unit 902 can be formed as an ECU (Electronic Control Unit) together with a storage device 904 described later.
- a plurality of ECUs may be included in the vehicle control system 900.
- for example, each of the various sensors or drive systems may be provided with an ECU that controls it, and an ECU that controls those ECUs in a coordinated manner may further be provided.
- the plurality of ECUs are connected via an in-vehicle communication network that conforms to an arbitrary standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), or Flexray.
- the electronic control unit 902 can form, for example, the processing unit 150 shown in FIG.
- the storage device 904 is a data storage device formed as an example of a storage unit of the vehicle control system 900.
- the storage apparatus 904 is realized by, for example, a magnetic storage device such as an HDD, a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
- the storage device 904 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like.
- the storage device 904 stores programs executed by the electronic control unit 902, various data, various data acquired from the outside, and the like.
- the storage device 904 can form, for example, the storage unit 140 shown in FIG.
- the input device 906 is realized by a device in which information is input by a passenger (driver or passenger) such as a mouse, a keyboard, a touch panel, a button, a microphone, a switch, and a lever.
- the input device 906 may be, for example, a remote control device using infrared rays or other radio waves, or may be an external connection device such as a mobile phone or a PDA corresponding to the operation of the vehicle control system 900.
- the input device 906 may be a camera, for example, and in that case, the passenger can input information by gesture.
- the input device 906 may include, for example, an input control circuit that generates an input signal based on information input by the user using the above-described input means and outputs the input signal to the electronic control unit 902.
- the passenger can operate the input device 906 to input various data to the vehicle control system 900 and to instruct processing operations.
- the input device 906 can form, for example, the sensor unit 110 shown in FIG.
- the vehicle outside sensor 908 is realized by a sensor that detects information outside the vehicle.
- the vehicle outside sensor 908 may include, for example, a sonar device, a radar device, a LIDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging) device, a camera, a stereo camera, a ToF (Time Of Flight) camera, an infrared sensor, an environment sensor, a microphone, and the like.
- the vehicle exterior sensor 908 can form, for example, the sensor unit 110 shown in FIG.
- Vehicle state sensor 910 is realized by a sensor that detects information related to the vehicle state.
- the vehicle state sensor 910 may include a sensor that detects an operation by the driver such as an accelerator opening, a brake depression pressure, or a steering angle.
- the vehicle state sensor 910 may include a sensor that detects the state of the power source, such as the rotational speed or torque of the internal combustion engine or motor.
- the vehicle state sensor 910 may include a sensor for detecting information related to the movement of the vehicle such as a gyro sensor or an acceleration sensor.
- the vehicle state sensor 910 may include a GNSS module that receives GNSS signals from GNSS (Global Navigation Satellite System) satellites (for example, GPS signals from GPS (Global Positioning System) satellites) and measures position information including the latitude, longitude, and altitude of the device.
- the vehicle state sensor 910 may detect the position by transmission / reception with Wi-Fi (registered trademark), a mobile phone / PHS / smartphone or the like, or near field communication.
- the vehicle state sensor 910 can form, for example, the sensor unit 110 shown in FIG.
- the passenger sensor 912 is realized by a sensor that detects information related to the passenger.
- the passenger sensor 912 may include a camera, a microphone, and an environment sensor provided in the vehicle.
- the passenger sensor 912 may include a biological sensor that detects biological information of the passenger.
- the biometric sensor is provided on, for example, a seat surface or a steering wheel, and can detect biometric information of a passenger sitting on the seat or a driver who holds the steering wheel.
- the passenger sensor 912 can form, for example, the sensor unit 110 shown in FIG.
- the various sensors such as the vehicle outside sensor 908, the vehicle state sensor 910, and the passenger sensor 912 each output information indicating the detection result to the electronic control unit 902. These various sensors may set the sensing range, accuracy, and the like based on control by the electronic control unit 902. These various sensors may also include a recognition module that performs recognition processing based on raw data, such as processing for recognizing the traveling position of the host vehicle on the road based on the position of white lines included in a captured image.
- the communication IF 914 is a communication interface that mediates communication with other devices by the vehicle control system 900.
- the communication IF 914 may include, for example, a V2X communication module.
- V2X communication is a concept including vehicle-to-vehicle (Vehicle to Vehicle) communication and road-to-vehicle (Vehicle to Infrastructure) communication.
- the communication IF 914 may include, for example, a communication module for wireless LAN (Local Area Network), Wi-Fi (registered trademark), 3G, LTE (Long Term Evolution), Bluetooth (registered trademark), NFC (Near Field Communication), or WUSB (Wireless USB).
- the communication IF 914 can transmit and receive signals and the like according to a predetermined protocol such as TCP / IP, for example, with the Internet or a communication device outside the vehicle.
- the communication IF 914 can form, for example, the communication unit 130 illustrated in FIG.
- the output device 916 is realized by a device that can notify the passenger or the outside of the acquired information visually or audibly.
- Such devices include display devices such as instrument panels, head-up displays, projectors or lamps, and audio output devices such as speakers or headphones.
- the display device visually displays the results obtained by various processes performed by the vehicle control system 900 in various formats such as text, images, tables, and graphs. At that time, a virtual object such as an AR (Augmented Reality) object may be displayed.
- the audio output device converts an audio signal composed of reproduced audio data, acoustic data, and the like into an analog signal and outputs it aurally.
- the output device 916 can form, for example, the output unit 120 shown in FIG.
- the power generation device 918 is a device for generating a driving force of the vehicle.
- the power generation device 918 may be realized by, for example, an internal combustion engine. In that case, the power generation device 918 performs start control, stop control, throttle valve opening control, fuel injection control, EGR (Exhaust Gas Recirculation) control, and the like based on control commands from the electronic control unit 902.
- the power generation device 918 may be realized by, for example, a motor, an inverter, and a battery. In that case, based on control commands from the electronic control unit 902, the power generation device 918 can perform a power running operation of supplying electric power from the battery to the motor via the inverter to output positive torque, and a regenerative operation of inputting torque to the motor and charging the battery via the inverter.
- the braking device 920 is a device for applying a braking force to the vehicle, or decelerating or stopping the vehicle.
- the braking device 920 may include, for example, a brake installed on each wheel, a brake pipe or an electric circuit for transmitting the depression pressure of the brake pedal to the brake, and the like.
- the braking device 920 may include a control device for operating an anti-slip or anti-skid mechanism by brake control, such as ABS (Antilock Brake System) or ESC (Electronic Stability Control).
- Steering 922 is a device for controlling the traveling direction (steering angle) of the vehicle.
- the steering 922 can include, for example, a steering wheel, a steering shaft, a steering gear, and a tie rod.
- the steering 922 may include power steering for assisting steering by the driver.
- the steering 922 may include a power source such as a motor for realizing automatic steering.
- the lamp operating device 924 is a device that operates various lamps such as a headlight, a blinker, a vehicle width light, a fog light, or a stop lamp.
- the lamp operating device 924 controls, for example, the blinking of the lamp, the amount of light, or the irradiation direction.
- the power generation device 918, the braking device 920, the steering 922, and the lamp operating device 924 may operate based on a manual operation by the driver or based on an automatic operation by the electronic control unit 902.
- each of the above components may be realized using a general-purpose member, or may be realized by hardware specialized for the function of each component. Therefore, it is possible to change the hardware configuration to be used as appropriate according to the technical level at the time of carrying out this embodiment.
- a computer program for realizing each function of the information processing apparatus 100 according to the present embodiment as described above can be produced and installed in an ECU or the like.
- a computer-readable recording medium storing such a computer program can be provided.
- the recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like.
- the above computer program may be distributed via a network, for example, without using a recording medium.
- the information processing apparatus 100 acquires an audio signal from a sound source that exists outside the moving body, and selects as the target sound source a sound source whose distance to the moving body corresponds to the speed of the moving body.
- An audio signal from the target sound source is generated based on the acquired audio signal and output toward the internal space of the moving body.
- with regard to a car, it is possible to realize in the interior space a realistic sound environment as if the walls of the car did not exist.
- since a person in the vehicle interior can hear external sound from an appropriately selected sound source, the technology is effective for driving support applications such as accident prevention, and also for entertainment applications such as listening to sound effects on amusement park rides.
- the present technology can be applied to a sightseeing bus traveling in the city.
- a sightseeing bus can take into its interior space sound from a known sound source, such as the bell of a tourist attraction or the sound effect of an attraction, or sound from an unknown sound source, such as the sound of seaside waves seen from the car window or the sound of a train running alongside.
- the present technology can be applied to cars that are not open-top.
- the present technology can be applied to amusement park vehicles.
- for example, in a vehicle with a transparent outer wall, it is possible to provide a simulated warp experience, or a safe and realistic driving experience in an area where dangerous wildlife lives.
- An acquisition unit for acquiring an audio signal from a sound source existing outside the moving body;
- a generating unit that generates an audio signal from a target sound source whose distance to the moving body among the sound sources is a distance corresponding to a speed of the moving body based on the audio signal acquired by the acquiring unit;
- An output control unit for outputting the audio signal generated by the generation unit toward the internal space of the moving body;
- An information processing apparatus comprising the above.
- the information processing apparatus further includes a setting unit that sets an operation mode from an operation mode group including a first operation mode in which no output toward the internal space is performed and a second operation mode in which the output is performed.
- the second operation mode includes a third operation mode in which a known sound source is the target sound source and a fourth operation mode in which an unknown sound source is the target sound source.
- the second operation mode includes a fifth operation mode in which a known sound source and an unknown sound source are used as the target sound sources.
- the setting unit sets the operation mode based on map information indicating a position of a known sound source and position information of the moving body.
- the map information includes information indicating a time period during which a known sound source emits sound, and the setting unit sets the operation mode further based on the current time.
- the information processing apparatus according to any one of (3) to (6), wherein the acquisition unit acquires a previously stored audio signal related to a known sound source.
- the setting unit sets an operation mode based on information from an external device.
- the generation unit selects a sound source farther from the moving body as the target sound source as the speed of the moving body is higher, and selects a sound source closer to the moving body as the target sound source as the speed of the moving body is lower.
- the information processing apparatus according to any one of (1) to (8).
- the generation unit selects whether or not to use the sound source as the target sound source according to the type of the sound source.
- the generation unit selects whether or not to use the sound source as the target sound source according to whether or not the target of the sound from the sound source is the moving body.
- the output control unit outputs an audio signal from a sound source existing in the internal space toward the outside of the moving body in response to a touch operation on a window provided in the moving body.
- the window is a video see-through display capable of controlling transparency.
Abstract
Description
1. Overview
2. Functional configuration
3. Technical features
3.1. Acquisition of audio signals
3.2. Setting of the operation mode
3.3. Selection of the target sound source
3.4. Generation of audio signals
3.5. Output of audio signals
4. Specific signal processing examples
5. Supplement
6. Application examples
7. Conclusion
FIG. 1 is a diagram for describing an example of an information processing apparatus according to an embodiment of the present disclosure. As shown in FIG. 1, a moving body 10 is traveling around sound sources 20A and 20B.
FIG. 2 is a block diagram showing an example of the logical configuration of the information processing apparatus according to the present embodiment. As shown in FIG. 2, the information processing apparatus 100 according to the present embodiment includes a sensor unit 110, an output unit 120, a communication unit 130, a storage unit 140, and a processing unit 150.
The sensor unit 110 has a function of detecting various kinds of information. The sensor unit 110 outputs the detected sensor information to the processing unit 150.
The output unit 120 has a function of outputting various kinds of information. The output unit 120 outputs information under the control of the processing unit 150.
The communication unit 130 has a function of transmitting and receiving signals to and from other devices. For example, the communication unit 130 can communicate using any communication standard, such as cellular communication (e.g., 4G or 5G), wireless LAN (Local Area Network), Wi-Fi (registered trademark), or Bluetooth (registered trademark). In particular, when the moving body is a car, the communication unit 130 can communicate using V2X (Vehicle to Everything) communication.
The storage unit 140 has a function of temporarily or permanently storing information for the operation of the information processing apparatus 100. For example, the storage unit 140 stores known sound source information, which will be described later.
The processing unit 150 provides various functions of the information processing apparatus 100. The processing unit 150 includes an acquisition unit 151, a setting unit 153, a generation unit 155, and an output control unit 157. Note that the processing unit 150 may further include components other than these; that is, the processing unit 150 may also perform operations other than those of these components. The functions of the acquisition unit 151, the setting unit 153, the generation unit 155, and the output control unit 157 will be described in detail later.
<3.1. Acquisition of audio signals>
First, the information processing apparatus 100 (e.g., the acquisition unit 151) acquires an audio signal (i.e., a signal representing external sound) from a sound source existing outside the moving body. For example, the information processing apparatus 100 acquires the audio signal of the sound actually emitted by a sound source outside the moving body, as detected by the microphones 110A to 110D shown in FIG. 1.
After acquiring the audio signal, the information processing apparatus 100 (e.g., the setting unit 153) sets the operation mode.
The operation mode group may include a first operation mode in which no output toward the internal space of the moving body is performed. By operating in the first operation mode, the information processing apparatus 100 can stop taking external sound into the vehicle interior space.
The information processing apparatus 100 (e.g., the setting unit 153) can set (in other words, switch) the operation mode based on various criteria.
The information processing apparatus 100 may set the operation mode based on the audio signal from a sound source existing outside the moving body. Specifically, the information processing apparatus 100 can switch between the first operation mode and the second operation mode according to, for example, the presence or absence, the loudness, or the type of external sound.
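The switching described above, between the first operation mode (no output) and the second operation mode (output) according to the presence or loudness of external sound, can be sketched as follows. This is an illustrative sketch only, not the patented implementation; the mode constants, function name, and threshold value are assumptions.

```python
import numpy as np

# Hypothetical mode constants (naming is an assumption, not from the publication).
MODE_1_NO_OUTPUT = 1   # external sound is not taken into the cabin
MODE_2_OUTPUT = 2      # external sound is taken into the cabin

def select_mode(mic_frame: np.ndarray, threshold_db: float = -50.0) -> int:
    """Pick an operation mode from one frame of microphone samples.

    mic_frame: mono samples normalized to [-1, 1].
    threshold_db: assumed loudness threshold separating "no relevant
    external sound" from external sound worth taking into the cabin.
    """
    rms = np.sqrt(np.mean(mic_frame ** 2))
    level_db = 20.0 * np.log10(max(rms, 1e-10))  # guard against log(0)
    return MODE_2_OUTPUT if level_db > threshold_db else MODE_1_NO_OUTPUT
```

A real implementation would additionally classify the *type* of external sound (voice, horn, music), which this energy-only sketch does not attempt.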
The information processing apparatus 100 may set the operation mode based on known sound source information.
The information processing apparatus 100 may set the operation mode based on information from an external device.
For example, the external device may be a device capable of V2X (Vehicle to Everything) communication, such as a car, a device carried by a pedestrian, or an RSU (Road Side Unit). For example, the information processing apparatus 100 can recognize, based on information from an RSU, the presence of a car traveling in the driver's blind spot, set the fourth operation mode, and take the sound from that car into the vehicle interior space, thereby preventing an accident before it happens.
The information processing apparatus 100 may set the operation mode based on a user operation. For example, the information processing apparatus 100 may set the operation mode selected by the user.
After setting the operation mode, the information processing apparatus 100 (e.g., the generation unit 155) selects, in accordance with the set operation mode, the sound sources to be taken into the vehicle interior space as target sound sources from among the sound sources (unknown sound sources and/or known sound sources) existing around the moving body. Of course, all of the sound sources may be selected as target sound sources. Note that, in the third mode or the fifth mode, there may be cases where a known sound source is not selected as the target sound source, or known sound sources may always be selected as target sound sources.
The information processing apparatus 100 may select, as the target sound source, a sound source whose distance to the moving body corresponds to the speed of the moving body.
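A minimal sketch of this speed-dependent selection, assuming a simple linear look-ahead model: the faster the vehicle travels, the farther away the selected band of sound sources. The function name, constants, and the look-ahead model itself are illustrative assumptions, not taken from the publication.

```python
def select_target_sources(sources, speed_mps,
                          lookahead_s=3.0, band_m=15.0):
    """Select sound sources whose distance matches the vehicle speed.

    sources: list of (source_id, distance_m) pairs.
    speed_mps: vehicle speed in metres per second.
    lookahead_s, band_m: assumed constants; the band of interesting
    distances is centred on the distance the vehicle covers in
    `lookahead_s` seconds, so faster driving selects farther sources
    and slower driving selects nearer ones.
    """
    center = speed_mps * lookahead_s
    return [sid for sid, d in sources if abs(d - center) <= band_m]
```

For example, with sources at 5 m, 45 m, and 120 m, a vehicle at 2 m/s would select only the nearest one, while a vehicle at 40 m/s would select only the farthest.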
The information processing apparatus 100 may select whether or not to use a sound source as the target sound source according to the type of the sound source. For example, the information processing apparatus 100 selectively takes human voices into the vehicle interior space, while not taking in music played outside the vehicle. With such switching, the information processing apparatus 100 can selectively take in only the sounds that should be taken into the vehicle interior space.
The information processing apparatus 100 may select whether or not to use a sound source as the target sound source according to whether or not the target of the sound from the sound source is the moving body. For example, the information processing apparatus 100 selectively takes a horn honked at the moving body into the vehicle interior space, while not taking in a horn honked at another car. Such switching can be performed based on, for example, the orientation of the car that honked the horn. With such switching, the information processing apparatus 100 can selectively take in only the sounds that should be taken into the vehicle interior space.
After selecting the target sound source, the information processing apparatus 100 (e.g., the generation unit 155) generates an audio signal from the target sound source based on the acquired audio signal.
After generating the audio signal from the target sound source, the information processing apparatus 100 (e.g., the output control unit 157) causes the generated audio signal to be output toward the internal space of the moving body. For example, the information processing apparatus 100 outputs the generated audio signal through the speakers 120A to 120D shown in FIG. 1.
Hereinafter, an example of specific signal processing will be described with reference to FIGS. 6 to 12.
FIG. 6 is a diagram for describing an example of signal processing by the information processing apparatus 100 according to the present embodiment. As shown in FIG. 6, the information processing apparatus 100 inputs a plurality of audio signals detected by a plurality of microphones (e.g., a microphone array) included in the sensor unit 110 to a howling suppression unit 161.
The howling suppression unit 161 performs processing to suppress howling. Howling is a phenomenon in which oscillation occurs because a feedback loop is formed in which the signal emitted from a speaker is input to the microphone again. The howling suppression processing will be described with reference to FIG. 7.
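The details of the howling suppression processing are deferred to FIG. 7 and are not reproduced in this text. As a hedged sketch of one common family of techniques (not necessarily the method of the publication), a sustained spectral peak characteristic of a feedback tone can be detected and attenuated with a narrow IIR notch filter; the function names and thresholds below are assumptions.

```python
import numpy as np

def detect_howling_bin(frame, sample_rate, peak_ratio=10.0):
    """Return the frequency (Hz) of a suspicious spectral peak, or None.

    A howling tone shows up as a single bin far above the average
    spectral magnitude; peak_ratio is an assumed detection threshold.
    """
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    k = int(np.argmax(spectrum))
    if spectrum[k] > peak_ratio * (np.mean(spectrum) + 1e-12):
        return k * sample_rate / len(frame)
    return None

def notch(signal, sample_rate, freq_hz, r=0.98):
    """Apply a simple second-order IIR notch at freq_hz (pole radius r)."""
    w = 2.0 * np.pi * freq_hz / sample_rate
    b = [1.0, -2.0 * np.cos(w), 1.0]          # zeros on the unit circle
    a = [1.0, -2.0 * r * np.cos(w), r * r]    # poles just inside it
    out = np.zeros_like(signal, dtype=float)
    x1 = x2 = y1 = y2 = 0.0
    for i, x in enumerate(signal):
        y = b[0] * x + b[1] * x1 + b[2] * x2 - a[1] * y1 - a[2] * y2
        out[i] = y
        x2, x1 = x1, x
        y2, y1 = y1, y
    return out
```

In practice an adaptive feedback canceller, as used in hearing aids and PA systems, would track the feedback path continuously rather than notching a single tone.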
As shown in FIG. 6, the information processing apparatus 100 inputs the signal to which the howling suppression processing has been applied to a wind noise removal unit 162. The wind noise removal unit 162 performs processing to remove wind noise.
As shown in FIG. 6, after applying the wind noise removal processing, the information processing apparatus 100 performs each process included in a processing group 170. The processing in the processing group 170 differs depending on the set operation mode. The processing group 170 shown in FIG. 6 includes the processing executed in the fifth operation mode.
The known sound source control unit 171 controls signal processing related to known sound sources. Specifically, the known sound source control unit 171 controls an external sound signal separation/selection unit 173A, an external sound signal correction filter 174A, and an amplifier 175A based on position information, time, and the map information included in known sound source information 172. The signal input to the processing group 170 passes through the external sound signal separation/selection unit 173A, the external sound signal correction filter 174A, and the amplifier 175A in this order. Note that the known sound source control unit 171 may realize operation in the first operation mode or the fourth operation mode by setting the amplification amount of the amplifier 175A to 0 (i.e., setting the amplitude to 0) to stop the reproduction of signals related to known sound sources.
The unknown sound source control unit 176 controls signal processing related to unknown sound sources. Specifically, the unknown sound source control unit 176 controls an external sound signal separation/selection unit 173B, an external sound signal correction filter 174B, and an amplifier 175B based on the vehicle speed (i.e., the speed of the moving body) and the microphone sound pressure. The signal input to the processing group 170 passes through the external sound signal separation/selection unit 173B, the external sound signal correction filter 174B, and the amplifier 175B in this order. Note that the unknown sound source control unit 176 may realize operation in the first operation mode or the third operation mode by setting the amplification amount of the amplifier 175B to 0 (i.e., setting the amplitude to 0) to stop the reproduction of signals related to unknown sound sources.
The external sound signal separation/selection unit 173 (173A or 173B) separates and/or selects external sound signals (audio signals of external sound). The external sound signal separation/selection unit 173 can employ conventionally known beamforming techniques or sound source separation techniques. An example will be described below with reference to FIG. 8.
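The publication states only that conventionally known beamforming or sound source separation techniques may be employed. As an illustration of the simplest such technique, the sketch below implements a delay-and-sum beamformer; the array geometry, function names, and the integer-sample delay approximation are assumptions, not details from the publication.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, at roughly 20 degrees Celsius

def delay_and_sum(channels, mic_positions, direction, sample_rate):
    """Steer a microphone array toward `direction` (delay-and-sum).

    channels: (n_mics, n_samples) array of simultaneously recorded signals.
    mic_positions: (n_mics, 3) microphone positions in metres.
    direction: 3-vector pointing from the array toward the source.
    Each channel is delayed (integer-sample approximation) so that sound
    arriving from `direction` adds coherently across microphones.
    """
    direction = np.asarray(direction, dtype=float)
    direction = direction / np.linalg.norm(direction)
    # A mic closer to the source hears the wavefront earlier, so it needs
    # a larger compensating delay: delays proportional to p . u / c.
    delays_s = mic_positions @ direction / SPEED_OF_SOUND
    delays_s -= delays_s.min()
    n = channels.shape[1]
    out = np.zeros(n)
    for ch, d in zip(channels, delays_s):
        shift = int(round(d * sample_rate))
        if shift:
            out[shift:] += ch[: n - shift]
        else:
            out += ch
    return out / len(channels)
```

Sound from the steered direction is reinforced, while sound from other directions adds incoherently and is attenuated; this is the weakest but most transparent member of the beamforming family the unit 173 could use.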
The external sound signal correction filter 174 (174A or 174B) performs processing to correct the external sound signal. The external sound signal correction processing by the external sound signal correction filter will be described with reference to FIG. 9.
Z = S · M · G · A · D …(1)
G = 1 / (M · A · D) …(2)
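Equations (1) and (2) express that the correction filter G is chosen to invert the cascade of the microphone response M, the amplifier response A, and the driver (speaker) response D, so that the reproduced sound Z equals the external sound S. A per-frequency-bin sketch of this inversion follows; the regularization constant `eps` is an assumption added here to avoid division by near-zero bins, and is not part of the publication's formulas.

```python
import numpy as np

def correction_filter(M, A, D, eps=1e-8):
    """Per-bin inverse filter G = 1 / (M * A * D), per equation (2).

    M, A, D: complex frequency responses of the microphone, amplifier,
    and driver. eps regularizes bins where |M*A*D| is close to zero,
    at the cost of a small deviation from the exact inverse.
    """
    H = M * A * D
    return np.conj(H) / (np.abs(H) ** 2 + eps)

# With this G applied, equation (1) gives Z = S * M * G * A * D ~= S:
# the cabin loudspeaker reproduces the external sound as picked up.
```

Exact inversion is only possible where the cascade response is minimum-phase and nonzero; real systems approximate it, which is why the regularized form is used here.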
The amplifier 175 (175A or 175B) amplifies and outputs the input signal. The amplification amount is controlled by the known sound source control unit 171 or the unknown sound source control unit 176.
As shown in FIG. 6, the information processing apparatus 100 adds, with an adder 163, the signal to which the processing related to known sound sources has been applied and the signal to which the processing related to unknown sound sources has been applied, both output from the processing group 170.
The audio mixer 164 adds a predetermined signal, via an adder 165, to the signal for taking in external sound. Examples of the predetermined signal here include additional audio included in the known sound source information 172, a navigation audio signal output from a car navigation device, and an audio signal output from a music playback device.
FIG. 10 is a diagram for describing an example of signal processing by the information processing apparatus 100 according to the present embodiment. FIG. 10 shows the signal processing in the third operation mode with an NC (noise cancelling) function. As shown in FIG. 10, in the third operation mode with the NC function, the blocks related to unknown sound sources (i.e., the unknown sound source control unit 176, the external sound signal separation/selection unit 173B, the external sound signal correction filter 174B, and the amplifier 175B) are removed from the signal processing in the fifth operation mode described above with reference to FIG. 6. On the other hand, as shown in FIG. 10, in the third operation mode with the NC function, an NC control unit 177 and an NC filter 178, which are blocks that perform signal processing for noise cancelling, are added.
FIG. 11 is a diagram for describing an example of signal processing by the information processing apparatus 100 according to the present embodiment. FIG. 11 shows the signal processing in the fourth operation mode with the NC function. As shown in FIG. 11, in the fourth operation mode with the NC function, the blocks related to known sound sources (i.e., the known sound source control unit 171, the known sound source information 172, the external sound signal separation/selection unit 173A, the external sound signal correction filter 174A, and the amplifier 175A) are removed from the signal processing in the fifth operation mode described above with reference to FIG. 6. On the other hand, as shown in FIG. 11, in the fourth operation mode with the NC function, an NC control unit 177 and an NC filter 178, which are blocks that perform signal processing for noise cancelling, are added.
FIG. 12 is a diagram for describing an example of signal processing by the information processing apparatus 100 according to the present embodiment. FIG. 12 shows the signal processing in the fifth operation mode with the NC function. As shown in FIG. 12, in the fifth operation mode with the NC function, the NC control unit 177 and the NC filter 178, which are blocks that perform signal processing for noise cancelling, as well as a switch 166 and an adder 167, are added to the signal processing in the fifth operation mode without the NC function described above with reference to FIG. 6.
The above has described the technology in which the information processing apparatus 100 takes external sound into the vehicle interior space. Conversely, the information processing apparatus 100 may emit sound from the vehicle interior space (i.e., internal sound) into the space outside the vehicle. The operation mode in which internal sound is emitted into the space outside the vehicle is hereinafter also referred to as a sixth operation mode.
This makes it possible, for example, for a person inside the vehicle to talk to a person outside the vehicle. The sixth operation mode can be set based on various criteria. For example, the sixth operation mode can be set according to the presence or loudness of internal sound, a user operation, or the like.
The technology according to the present disclosure can be applied to various products. For example, the information processing apparatus 100 may be realized as an apparatus mounted on any type of vehicle, such as an automobile, an electric vehicle, a hybrid electric vehicle, or a motorcycle. In addition, at least some of the components of the information processing apparatus 100 may be realized in a module (e.g., an integrated circuit module configured with one die) for an apparatus mounted on a vehicle.
An embodiment of the present disclosure has been described above in detail with reference to FIGS. 1 to 14. As described above, the information processing apparatus 100 according to the present embodiment acquires an audio signal from a sound source existing outside the moving body, generates, based on the acquired audio signal, an audio signal from a target sound source whose distance to the moving body corresponds to the speed of the moving body, and outputs it toward the internal space of the moving body. This makes it possible to take external sound into the internal space of the moving body. Furthermore, it becomes possible to selectively take external sound from an appropriate sound source, according to the speed of the moving body, into the internal space of the moving body. Specifically, with regard to a car, it becomes possible to realize in the vehicle interior space a realistic sound environment as if the walls of the car did not exist. In addition, since a person in the vehicle interior space can hear external sound from an appropriately selected sound source, the technology is effective for driving support applications such as accident prevention, and also for entertainment applications such as listening to sound effects on amusement park rides.
(1)
An acquisition unit that acquires an audio signal from a sound source existing outside a moving body;
a generation unit that generates, based on the audio signal acquired by the acquisition unit, an audio signal from a target sound source whose distance to the moving body among the sound sources corresponds to the speed of the moving body; and
an output control unit that causes the audio signal generated by the generation unit to be output toward the internal space of the moving body.
An information processing apparatus comprising the above.
(2)
The information processing apparatus according to (1), further comprising a setting unit that sets an operation mode from an operation mode group including a first operation mode in which no output toward the internal space is performed and a second operation mode in which the output is performed.
(3)
The information processing apparatus according to (2), wherein the second operation mode includes a third operation mode in which a known sound source is the target sound source and a fourth operation mode in which an unknown sound source is the target sound source.
(4)
The information processing apparatus according to (2) or (3), wherein the second operation mode includes a fifth operation mode in which a known sound source and an unknown sound source are the target sound sources.
(5)
The information processing apparatus according to (3) or (4), wherein the setting unit sets the operation mode based on map information indicating positions of known sound sources and position information of the moving body.
(6)
The map information includes information indicating a time period during which a known sound source emits sound, and
the setting unit sets the operation mode further based on the current time; the information processing apparatus according to (5).
(7)
The information processing apparatus according to any one of (3) to (6), wherein the acquisition unit acquires a previously stored audio signal related to a known sound source.
(8)
The information processing apparatus according to any one of (2) to (7), wherein the setting unit sets the operation mode based on information from an external device.
(9)
The information processing apparatus according to any one of (1) to (8), wherein the generation unit selects a sound source farther from the moving body as the target sound source as the speed of the moving body is higher, and selects a sound source closer to the moving body as the target sound source as the speed of the moving body is lower.
(10)
The information processing apparatus according to (9), wherein the generation unit determines the distance between the moving body and the sound source based on a time-series change in the direction of the sound source relative to the moving body.
(11)
The information processing apparatus according to (9) or (10), wherein the generation unit determines the distance between the moving body and the sound source based on a time-series change in the sound pressure from the sound source.
(12)
The information processing apparatus according to any one of (9) to (11), wherein the generation unit determines the distance between the moving body and the sound source based on the absolute value of the sound pressure from the sound source.
(13)
The information processing apparatus according to any one of (1) to (12), wherein the generation unit selects whether or not to use the sound source as the target sound source according to the type of the sound source.
(14)
The information processing apparatus according to any one of (1) to (13), wherein the generation unit selects whether or not to use the sound source as the target sound source according to whether or not the target of the sound from the sound source is the moving body.
(15)
The information processing apparatus according to any one of (1) to (14), wherein the output control unit causes an audio signal from a sound source existing in the internal space to be output toward the outside of the moving body in response to a touch operation on a window provided in the moving body.
(16)
The information processing apparatus according to (15), wherein the window is a video see-through display whose transparency can be controlled.
(17)
The information processing apparatus according to any one of (1) to (16), wherein the internal space is a space in which a person stays in the moving body.
(18)
The information processing apparatus according to any one of (1) to (17), wherein the moving body is an automobile.
(19)
Acquiring an audio signal from a sound source existing outside a moving body;
generating, based on the acquired audio signal, an audio signal from a target sound source whose distance to the moving body among the sound sources corresponds to the speed of the moving body; and
causing the generated audio signal to be output by an output device toward the internal space of the moving body.
An information processing method including the above.
(20)
A recording medium on which a program is recorded, the program causing a computer to function as:
an acquisition unit that acquires an audio signal from a sound source existing outside a moving body;
a generation unit that generates, based on the audio signal acquired by the acquisition unit, an audio signal from a target sound source whose distance to the moving body among the sound sources corresponds to the speed of the moving body; and
an output control unit that causes the audio signal generated by the generation unit to be output toward the internal space of the moving body.
20 Sound source
100 Information processing apparatus
110 Sensor unit, microphone
120 Output unit, speaker
130 Communication unit
140 Storage unit
150 Processing unit
151 Acquisition unit
153 Setting unit
155 Generation unit
157 Output control unit
Claims (20)
- An acquisition unit that acquires an audio signal from a sound source existing outside a moving body;
a generation unit that generates, based on the audio signal acquired by the acquisition unit, an audio signal from a target sound source whose distance to the moving body among the sound sources corresponds to the speed of the moving body; and
an output control unit that causes the audio signal generated by the generation unit to be output toward the internal space of the moving body.
An information processing apparatus comprising the above. - The information processing apparatus according to claim 1, further comprising a setting unit that sets an operation mode from an operation mode group including a first operation mode in which no output toward the internal space is performed and a second operation mode in which the output is performed.
- The information processing apparatus according to claim 2, wherein the second operation mode includes a third operation mode in which a known sound source is the target sound source and a fourth operation mode in which an unknown sound source is the target sound source.
- The information processing apparatus according to claim 2, wherein the second operation mode includes a fifth operation mode in which a known sound source and an unknown sound source are the target sound sources.
- The information processing apparatus according to claim 3, wherein the setting unit sets the operation mode based on map information indicating positions of known sound sources and position information of the moving body.
- The map information includes information indicating a time period during which a known sound source emits sound, and
the setting unit sets the operation mode further based on the current time; the information processing apparatus according to claim 5. - The information processing apparatus according to claim 3, wherein the acquisition unit acquires a previously stored audio signal related to a known sound source.
- The information processing apparatus according to claim 2, wherein the setting unit sets the operation mode based on information from an external device.
- The information processing apparatus according to claim 1, wherein the generation unit selects a sound source farther from the moving body as the target sound source as the speed of the moving body is higher, and selects a sound source closer to the moving body as the target sound source as the speed of the moving body is lower.
- The information processing apparatus according to claim 9, wherein the generation unit determines the distance between the moving body and the sound source based on a time-series change in the direction of the sound source relative to the moving body.
- The information processing apparatus according to claim 9, wherein the generation unit determines the distance between the moving body and the sound source based on a time-series change in the sound pressure from the sound source.
- The information processing apparatus according to claim 9, wherein the generation unit determines the distance between the moving body and the sound source based on the absolute value of the sound pressure from the sound source.
- The information processing apparatus according to claim 1, wherein the generation unit selects whether or not to use the sound source as the target sound source according to the type of the sound source.
- The information processing apparatus according to claim 1, wherein the generation unit selects whether or not to use the sound source as the target sound source according to whether or not the target of the sound from the sound source is the moving body.
- The information processing apparatus according to claim 1, wherein the output control unit causes an audio signal from a sound source existing in the internal space to be output toward the outside of the moving body in response to a touch operation on a window provided in the moving body.
- The information processing apparatus according to claim 15, wherein the window is a video see-through display whose transparency can be controlled.
- The information processing apparatus according to claim 1, wherein the internal space is a space in which a person stays in the moving body.
- The information processing apparatus according to claim 1, wherein the moving body is an automobile.
- Acquiring an audio signal from a sound source existing outside a moving body;
generating, based on the acquired audio signal, an audio signal from a target sound source whose distance to the moving body among the sound sources corresponds to the speed of the moving body; and
causing the generated audio signal to be output by an output device toward the internal space of the moving body.
An information processing method including the above. - A recording medium on which a program is recorded, the program causing a computer to function as:
an acquisition unit that acquires an audio signal from a sound source existing outside a moving body;
a generation unit that generates, based on the audio signal acquired by the acquisition unit, an audio signal from a target sound source whose distance to the moving body among the sound sources corresponds to the speed of the moving body; and
an output control unit that causes the audio signal generated by the generation unit to be output toward the internal space of the moving body.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201780087599.9A CN110366852B (zh) | 2017-03-09 | 2017-12-07 | 信息处理设备、信息处理方法和记录介质 |
EP17899809.2A EP3595331A1 (en) | 2017-03-09 | 2017-12-07 | Information processing device, information processing method and recording medium |
US16/482,626 US11543521B2 (en) | 2017-03-09 | 2017-12-07 | Information processing apparatus, information processing method, and recording medium |
KR1020197023857A KR20190125304A (ko) | 2017-03-09 | 2017-12-07 | 정보 처리 장치, 정보 처리 방법 및 기록 매체 |
JP2019504328A JP7040513B2 (ja) | 2017-03-09 | 2017-12-07 | 情報処理装置、情報処理方法及び記録媒体 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017-044901 | 2017-03-09 | ||
JP2017044901 | 2017-03-09 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018163545A1 true WO2018163545A1 (ja) | 2018-09-13 |
Family
ID=63447478
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/044081 WO2018163545A1 (ja) | 2017-03-09 | 2017-12-07 | 情報処理装置、情報処理方法及び記録媒体 |
Country Status (6)
Country | Link |
---|---|
US (1) | US11543521B2 (ja) |
EP (1) | EP3595331A1 (ja) |
JP (1) | JP7040513B2 (ja) |
KR (1) | KR20190125304A (ja) |
CN (1) | CN110366852B (ja) |
WO (1) | WO2018163545A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020188941A1 (ja) * | 2019-03-15 | 2020-09-24 | 本田技研工業株式会社 | 車両用コミュニケーション装置及びプログラム |
US20220308157A1 (en) * | 2019-08-28 | 2022-09-29 | Sony Interactive Entertainment Inc. | Image processing apparatus, system, image processing method, and image processing program |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113496694A (zh) * | 2020-03-19 | 2021-10-12 | 上汽通用汽车有限公司 | 一种车辆声学系统、车辆用座椅以及车辆 |
KR20210136569A (ko) * | 2020-05-08 | 2021-11-17 | 삼성전자주식회사 | 경보 장치, 상기 장치를 포함하는 경보 시스템, 및 경보 방법 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH04358936A (ja) * | 1991-05-31 | 1992-12-11 | Sony Corp | 車両用音声伝達装置 |
JP2008149917A (ja) * | 2006-12-18 | 2008-07-03 | Denso Corp | 車両用周囲音報知装置 |
JP2009051333A (ja) * | 2007-08-27 | 2009-03-12 | Nissan Motor Co Ltd | 車両用聴覚モニタ装置 |
JP2009251799A (ja) * | 2008-04-03 | 2009-10-29 | Nissan Motor Co Ltd | 車外情報提供装置及び車外情報提供方法 |
JP2013256249A (ja) * | 2012-06-14 | 2013-12-26 | Mitsubishi Electric Corp | 音声伝達装置 |
JP2015173369A (ja) | 2014-03-12 | 2015-10-01 | ソニー株式会社 | 信号処理装置、信号処理方法、およびプログラム |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3163868B2 (ja) * | 1993-09-20 | 2001-05-08 | 富士通株式会社 | 音選択再生装置 |
US20020150262A1 (en) * | 2001-03-29 | 2002-10-17 | Carter Jerome D. | Method and apparatus for communicating to vehicle occupants |
WO2006006553A1 (ja) * | 2004-07-14 | 2006-01-19 | Matsushita Electric Industrial Co., Ltd. | 報知装置 |
JP3949701B1 (ja) * | 2006-03-27 | 2007-07-25 | 株式会社コナミデジタルエンタテインメント | 音声処理装置、音声処理方法、ならびに、プログラム |
CN101464168B (zh) * | 2009-01-20 | 2010-06-16 | 清华大学 | 一种车辆加速噪声的噪声源识别方法 |
JP2011252853A (ja) * | 2010-06-03 | 2011-12-15 | Toyota Motor Corp | 音源方向検出装置 |
JP5533897B2 (ja) * | 2012-01-20 | 2014-06-25 | 日産自動車株式会社 | 車両用聴覚モニタ装置 |
JP2014061786A (ja) * | 2012-09-21 | 2014-04-10 | Sony Corp | 移動体およびプログラム |
US9469247B2 (en) | 2013-11-21 | 2016-10-18 | Harman International Industries, Incorporated | Using external sounds to alert vehicle occupants of external events and mask in-car conversations |
CN108603778B (zh) | 2016-02-04 | 2021-08-13 | 高准公司 | 用于振动流量计量器的压力补偿及相关方法 |
CN106023617A (zh) * | 2016-06-01 | 2016-10-12 | 乐视控股(北京)有限公司 | 车辆行驶安全的检测方法和装置 |
-
2017
- 2017-12-07 EP EP17899809.2A patent/EP3595331A1/en not_active Withdrawn
- 2017-12-07 JP JP2019504328A patent/JP7040513B2/ja active Active
- 2017-12-07 KR KR1020197023857A patent/KR20190125304A/ko not_active Application Discontinuation
- 2017-12-07 WO PCT/JP2017/044081 patent/WO2018163545A1/ja unknown
- 2017-12-07 CN CN201780087599.9A patent/CN110366852B/zh active Active
- 2017-12-07 US US16/482,626 patent/US11543521B2/en active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP3595331A4 |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020188941A1 (ja) * | 2019-03-15 | 2020-09-24 | 本田技研工業株式会社 | 車両用コミュニケーション装置及びプログラム |
JPWO2020188941A1 (ja) * | 2019-03-15 | 2020-09-24 | ||
JP7379462B2 (ja) | 2019-03-15 | 2023-11-14 | 本田技研工業株式会社 | 車両用コミュニケーション装置及びプログラム |
US20220308157A1 (en) * | 2019-08-28 | 2022-09-29 | Sony Interactive Entertainment Inc. | Image processing apparatus, system, image processing method, and image processing program |
Also Published As
Publication number | Publication date |
---|---|
JP7040513B2 (ja) | 2022-03-23 |
US20210286073A1 (en) | 2021-09-16 |
EP3595331A4 (en) | 2020-01-15 |
US11543521B2 (en) | 2023-01-03 |
CN110366852B (zh) | 2021-12-21 |
KR20190125304A (ko) | 2019-11-06 |
JPWO2018163545A1 (ja) | 2020-01-09 |
CN110366852A (zh) | 2019-10-22 |
EP3595331A1 (en) | 2020-01-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3121064B1 (en) | Vehicle control device and vehicle control method thereof | |
CN104658548B (zh) | 用外部声音向车辆驾乘人员警告外部事件并掩蔽车内谈话 | |
CN110691299B (zh) | 音频处理系统、方法、装置、设备及存储介质 | |
JP7040513B2 (ja) | 情報処理装置、情報処理方法及び記録媒体 | |
KR101526407B1 (ko) | 차량용 가상 엔진음 시스템 및 그 제어 방법 | |
JP2013198065A (ja) | 音声提示装置 | |
US11024280B2 (en) | Signal processing apparatus, method, and program | |
US11190155B2 (en) | Learning auxiliary feature preferences and controlling the auxiliary devices based thereon | |
US20170303037A1 (en) | Enhanced audio landscape | |
WO2020120754A1 (en) | Audio processing device, audio processing method and computer program thereof | |
US11854541B2 (en) | Dynamic microphone system for autonomous vehicles | |
KR101487474B1 (ko) | 자동차의 가상 음향 발생 장치 및 이를 이용한 자동차의 가상 음향 발생 방법 | |
JP6981095B2 (ja) | サーバ装置、記録方法、プログラム、および記録システム | |
WO2019175273A1 (en) | Electronic device, method and computer program | |
WO2021175735A1 (en) | Electronic device, method and computer program | |
US20220116724A1 (en) | Three-dimensional (3d) audio notification for vehicle | |
CN110139205B (zh) | 用于辅助信息呈现的方法及装置 | |
WO2023204076A1 (ja) | 音響制御方法及び音響制御装置 | |
US20230026188A1 (en) | Remote support device, remote support system, and remote support method | |
KR20190031053A (ko) | 차량 제어 방법 | |
WO2018062476A1 (ja) | 車載装置、出力方法及びプログラム | |
CN115179930A (zh) | 车辆控制方法、装置、车辆及可读存储介质 | |
CN116368398A (zh) | 语音声源定位方法、装置及系统 | |
CN117789741A (zh) | 语音信号处理方法、装置、车辆及存储介质 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17899809 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 20197023857 Country of ref document: KR Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2019504328 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2017899809 Country of ref document: EP Effective date: 20191009 |