WO2019015954A1 - Method, device and system for the generation of spatial sound fields - Google Patents

Method, device and system for the generation of spatial sound fields

Info

Publication number
WO2019015954A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
chassis
audio
circuitry
information
Prior art date
Application number
PCT/EP2018/067981
Other languages
French (fr)
Inventor
Andreas Schwager
Michael Enenkl
Original Assignee
Sony Corporation
Sony Europe Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corporation, Sony Europe Limited filed Critical Sony Corporation
Priority to DE112018003683.9T priority Critical patent/DE112018003683T5/en
Publication of WO2019015954A1 publication Critical patent/WO2019015954A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04SSTEREOPHONIC SYSTEMS 
    • H04S7/00Indicating arrangements; Control arrangements, e.g. balance control
    • H04S7/30Control circuits for electronic adaptation of the sound field
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04RLOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R2499/00Aspects covered by H04R or H04S not otherwise provided for in their subgroups
    • H04R2499/10General applications
    • H04R2499/13Acoustic transducers and sound field adaptation in vehicles

Definitions

  • Braking hard during a curve causes the forces on the tires to be distributed unevenly. This may cause forces equivalent to up to 10 t of mass on a single tire.
  • chassis shift can be calculated based on the forces obtained by the sensors in the suspension system
  • Fig. 3 illustrates a schematic example assuming that a force F acts down on the left front wheel and the rear wheels are fixed. In this case a torsional moment is obtained for a vehicle of width w.
  • The torsion is connected to the applied torsional moment via the torsional stiffness c_T and the torsion angle.
  • The torsion angle as obtained with the above model results in the speaker pairs having a relative displacement as drawn in Fig. 4.
  • the relative position displacement can be calculated as
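  • The torsional-moment and displacement formulas belonging to the items above are rendered as images in the original publication and are not reproduced in this text. A plausible reconstruction, assuming that the force F acts at one front wheel, that w denotes the track width, c_T the torsional stiffness, and d the lateral distance of a speaker from the torsion axis (all assumptions of this sketch), is:

```latex
M_T = F\,w, \qquad
\varphi = \frac{M_T}{c_T} = \frac{F\,w}{c_T}, \qquad
\Delta \approx d\,\sin\varphi \approx d\,\varphi \quad (\text{small } \varphi)
```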
  • The following embodiment concerns lateral accelerations that result from driving in a curve, which also influence the relative speaker positions in a significant way.
  • Fig. 5a illustrates a two-track model with twistable frame as disclosed in B. Heißing, M. Ersoy: Chassis Handbook: Fundamentals, Driving Dynamics, Components, Mechatronics, Perspectives. 2010. Springer Science & Business Media, pp. 76-79.
  • A vehicle moving into a curve of radius r possesses a lateral acceleration a_y, which can be expressed through the vehicle speed v.
  • The second term of this expression is due to the displacement of the center of mass through rotation.
  • A first-order Taylor approximation is applied to linearize this expression.
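  • The corresponding lateral-acceleration and torsion formulas are likewise not reproduced in this text extraction. A hedged sketch of the standard relations, writing r for the curve radius, v for the vehicle speed, m for the sprung mass, h for the height of the centre of mass above the roll axis and c_T for the torsional stiffness (the symbol names and the simple roll-moment model are assumptions of this sketch, not the patent's exact expressions), is:

```latex
a_y \approx \frac{v^2}{r}, \qquad
M_{\mathrm{roll}} \approx m\,a_y\,h, \qquad
\varphi \approx \frac{M_{\mathrm{roll}}}{c_T}
```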
  • Fig. 5b illustrates the spring travel in the two-track model with twistable frame, wherein exemplary forces F_1 on the front axle and F_2 on the rear axle are acting (i.e. resulting from the roll moment described above).
  • The spring travels may differ, for example, due to the different mass distribution on the vehicle axles.
  • Fig. 6 illustrates an embodiment of a method comprising rendering audio considering an automotive chassis shift caused by vehicle dynamics.
  • A lateral acceleration a_y is received.
  • The torsion angle φ is calculated based on the lateral acceleration a_y and based on vehicle parameters (such as m_1, m_2, h_1, h_2, c_t, c_s, s_s).
  • The speaker position displacement Δ is calculated from the torsion angle φ and the stationary speaker positions.
  • The 3d audio renderer is configured with the corrected speaker positions.
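  • A minimal Python sketch of this processing chain (lateral acceleration → torsion angle → corrected speaker positions) is given below. The vehicle parameters, the simple roll-moment model and the assumption that the frame twists about the longitudinal axis are illustrative assumptions, not the patent's exact formulas.

```python
import numpy as np

def torsion_angle_deg(a_y, sprung_mass=1200.0, com_height=0.3,
                      torsional_stiffness=6700.0):
    """Approximate frame torsion angle in degrees.

    a_y                  lateral acceleration in m/s^2
    sprung_mass          mass acting on the frame in kg (assumed value)
    com_height           lever arm of the centre of mass above the roll axis in m
    torsional_stiffness  stiffness in N*m per degree (assumed interpretation)
    """
    roll_moment = sprung_mass * a_y * com_height   # N*m
    return roll_moment / torsional_stiffness       # degrees

def corrected_speaker_positions(static_positions, phi_deg):
    """Rotate the stationary speaker positions about the longitudinal (x) axis
    by the torsion angle to obtain the corrected positions."""
    phi = np.radians(phi_deg)
    rot = np.array([[1.0, 0.0, 0.0],
                    [0.0, np.cos(phi), -np.sin(phi)],
                    [0.0, np.sin(phi),  np.cos(phi)]])
    return np.asarray(static_positions, dtype=float) @ rot.T

# Example: 0.6 g lateral acceleration, two front speakers 0.7 m off the axis.
speakers = [[2.0, 0.7, 0.9], [2.0, -0.7, 0.9]]
phi = torsion_angle_deg(a_y=0.6 * 9.81)
print(phi, corrected_speaker_positions(speakers, phi))
```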
  • Movements of a car chassis are measured directly, e.g. by a strain gauge located at the positions where the distortion is significant. Sensors inside the chassis suspension, suspension springs, etc. may also be helpful for estimating chassis movements.
  • chassis shifts are determined by calibration and measurements.
  • Separate calibrations of the speaker locations are performed, i.e. the speaker locations are measured in the case of no force acting on the system. The measurement is then repeated under the effect of external forces, and the dependence between chassis shift and external forces is recorded in a database. This calibration may be performed for a predetermined number of supporting points.
  • Chassis shifts are then obtained by determining the current external forces that act on the car and by querying the database to obtain the chassis shift that corresponds to the current external forces. If the current external forces do not match any one of the supporting points for which calibration has been performed, the actual speaker positions may be obtained using linear interpolation. This procedure may be refined by using more supporting points in between.
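  • A minimal sketch of this calibration-and-interpolation lookup is given below; the supporting-point values are made-up calibration data, not measurements from the patent.

```python
import numpy as np

# Supporting points recorded during calibration: external force acting on the
# chassis (N) -> measured shift of a given speaker from its unloaded position (mm).
calib_force_n = np.array([0.0, 2000.0, 4000.0, 8000.0])
calib_shift_mm = np.array([0.0, 0.8, 1.7, 3.9])

def speaker_shift_mm(current_force_n):
    """Linearly interpolate the chassis/speaker shift between supporting points."""
    return float(np.interp(current_force_n, calib_force_n, calib_shift_mm))

print(speaker_shift_mm(3000.0))  # lies between the 2000 N and 4000 N points
```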
  • Fig. 7 provides an embodiment of a 3d audio rendering that is based on a digitalized Monopole Synthesis algorithm.
  • The theoretical background of this technique is described in more detail in patent application US 2016/0037282 A1, which is herewith incorporated by reference.
  • A target sound field is modelled as at least one target monopole placed at a defined target position.
  • the target sound field is modelled as one single target monopole.
  • the target sound field is modelled as multiple target monopoles placed at respective defined target positions.
  • each target monopole may represent a noise cancelation source comprised in a set of multiple noise cancelation sources positioned at a specific location within a space.
  • the position of a target monopole may be moving.
  • a target monopole may adapt to the movement of a noise source to be attenuated. If multiple target monopoles are used to represent a target sound field, then the methods of synthesizing the sound of a target monopole based on a set of defined synthesis monopoles as described below may be applied for each target monopole independently, and the contributions of the synthesis monopoles obtained for each target monopole may be summed to reconstruct the target sound field.
  • the synthesis is thus performed in the form of delayed and amplified components of the source signal x.
  • the modified amplification factor according to equation (118) of reference US 2016/0037282 A1 can be used.
  • a mapping factor as described with regard to Fig. 9 can be used to modify the amplification.
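  • A generic Python sketch of this delay-and-amplify synthesis is given below. It only illustrates the idea of feeding each synthesis monopole (loudspeaker) a delayed, attenuated copy of the source signal x; the modified amplification factor and mapping factor of US 2016/0037282 A1 are not reproduced, and the geometry and 1/r gain are simplifying assumptions.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s

def monopole_drive_signals(x, fs, target_pos, speaker_positions):
    """One drive signal per speaker: the source x delayed by the propagation
    time from the target monopole to the speaker and scaled by 1/distance."""
    target = np.asarray(target_pos, dtype=float)
    drives = []
    for sp in np.asarray(speaker_positions, dtype=float):
        r = np.linalg.norm(sp - target)                   # metres
        delay_samples = int(round(fs * r / SPEED_OF_SOUND))
        gain = 1.0 / max(r, 1e-3)                         # simple 1/r attenuation
        y = np.zeros(len(x) + delay_samples)
        y[delay_samples:] = gain * x
        drives.append(y)
    return drives

fs = 48000
x = np.sin(2 * np.pi * 440 * np.arange(fs // 10) / fs)   # 100 ms test tone
drives = monopole_drive_signals(x, fs, target_pos=[0.0, 0.5, 1.0],
                                speaker_positions=[[-0.8, 1.2, 1.0],
                                                   [0.0, 1.4, 1.0],
                                                   [0.8, 1.2, 1.0]])
print([len(d) for d in drives])
```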
  • Fig. 8 schematically shows an exemplary speaker displacement in a 3d audio system.
  • a user of the 3d audio system is located at 801.
  • Three speakers SP1, SP2 and SP3 generate a virtual sound source 802 (e.g. a monopole source). Due to chassis shift, the position of speaker SP2 is displaced to SP2' (dashed lines).
  • the travelling times of sound from the respective speaker positions and the user 801 to the virtual sound source 802 are Δt_0, Δt_1, Δt_2, Δt'_2 and Δt_3. If we consider a 3D rendering system which is using the amplitude and the phase relation between the real speaker SP2 and a virtual source 802, it is easy to see that the chassis shift creates an imbalance of the delays in the system.
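  • A small numerical illustration of this delay imbalance follows; the coordinates are made-up example values, not taken from Fig. 8.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s

def travel_time(a, b):
    """Propagation time of sound between two points, in seconds."""
    return np.linalg.norm(np.asarray(a, float) - np.asarray(b, float)) / SPEED_OF_SOUND

virtual_source = [0.0, 1.0, 1.0]     # virtual sound source 802
sp2 = [0.9, 1.8, 0.9]                # nominal position of speaker SP2
sp2_shifted = [0.9, 1.8, 0.915]      # SP2' after a 15 mm chassis shift

dt2 = travel_time(sp2, virtual_source)
dt2_shifted = travel_time(sp2_shifted, virtual_source)

# Delay to add to (or remove from) the SP2 feed so the wave front arrives as if
# the speaker had not moved.
compensation = dt2 - dt2_shifted
print(f"dt2 = {dt2 * 1e3:.3f} ms, dt2' = {dt2_shifted * 1e3:.3f} ms, "
      f"compensation = {compensation * 1e6:.2f} us")
```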
  • the technology according to an embodiment of the present disclosure is applicable to various products.
  • the technology according to an embodiment of the present disclosure may be implemented as a device included in a mobile body that is any of various kinds of automobiles, electric vehicles, hybrid electric vehicles, motorcycles, bicycles, personal mobility vehicles, airplanes, trains, choppers, drones, ships, robots, construction machinery, agricultural machinery (e.g., tractors), and the like.
  • Fig. 9 is a block diagram depicting an example of schematic configuration of a vehicle control system 7000 as an example of a mobile body control system to which the technology according to an embodiment of the present disclosure can be applied.
  • the vehicle control system 7000 includes a plurality of electronic control units connected to each other via a communication network 7010.
  • the vehicle control system 7000 includes a driving system control unit 7100, a body system control unit 7200, a battery control unit 7300, an outside-vehicle information detecting unit 7400, an in-vehicle information detecting unit 7500, and an integrated control unit 7600.
  • the communication network 7010 connecting the plurality of control units to each other may, for example, be a vehicle-mounted communication network compliant with an arbitrary standard such as controller area network (CAN), local interconnect network (LIN), local area network (LAN), FlexRay (registered trademark), or the like.
  • CAN controller area network
  • LIN local interconnect network
  • LAN local area network
  • FlexRay registered trademark
  • Each of the control units includes: a microcomputer that performs arithmetic processing according to various kinds of programs; a storage section that stores the programs executed by the microcomputer, parameters used for various kinds of operations, or the like; and a driving circuit that drives various kinds of control target devices.
  • Each of the control units further includes: a network interface (I/F) for performing communication with other control units via the communication network 7010; and a communication I/F for performing communication with a device, a sensor, or the like, within and without the vehicle by wire communication or radio communication.
  • In Fig. 9, the integrated control unit 7600 includes a microcomputer 7610, a general-purpose communication I/F 7620, a dedicated communication I/F 7630, a positioning section 7640, a beacon receiving section 7650, an in-vehicle device I/F 7660, a sound/image output section 7670, a vehicle-mounted network I/F 7680, and a storage section 7690.
  • the other control units similarly include a microcomputer, a communication I/F, a storage section, and the like.
  • the driving system control unit 7100 controls the operation of devices related to the driving system of the vehicle in accordance with various kinds of programs.
  • the driving system control unit 7100 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine, a driving motor, or the like, a driving force transmitting mechanism for transmitting the driving force to wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
  • the driving system control unit 7100 may have a function as a control device of an antilock brake system (ABS), electronic stability control (ESC), or the like.
  • ABS antilock brake system
  • ESC electronic stability control
  • the driving system control unit 7100 is connected with a vehicle state detecting section 7110.
  • the vehicle state detecting section 7110 includes at least one of a gyro sensor that detects the angular velocity of axial rotational movement of a vehicle body, an acceleration sensor that detects the lateral (a_y, in operation 601 of Fig. 6) and forward acceleration of the vehicle, and sensors for detecting an amount of operation of an accelerator pedal, an amount of operation of a brake pedal, the steering angle of a steering wheel, an engine speed or the rotational speed of wheels, and the like.
  • the lateral acceleration can for example be used to calculate (in operation 602 of Fig. 6) the torsion angle for sound correction.
  • the driving system control unit 7100 performs arithmetic processing using a signal input from the vehicle state detecting section 7110, and controls the internal combustion engine, the driving motor, an electric power steering device, the brake device, and the like.
  • the body system control unit 7200 controls the operation of various kinds of devices provided to the vehicle body in accordance with various kinds of programs.
  • the body system control unit 7200 functions as a control device for a keyless entry system, a smart key system, a power window device, or various kinds of lamps such as a headlamp, a backup lamp, a brake lamp, a turn signal lamp, a fog lamp, or the like.
  • radio waves transmitted from a mobile device as an alternative to a key, or signals of various kinds of switches can be input to the body system control unit 7200.
  • the body system control unit 7200 receives these input radio waves or signals, and controls a door lock device, the power window device, the lamps, or the like, of the vehicle.
  • the battery control unit 7300 controls a secondary battery 7310, which is a power supply source for the driving motor, in accordance with various kinds of programs.
  • the battery control unit 7300 is supplied with information about a battery temperature, a battery output voltage, an amount of charge remaining in the battery, or the like, from a battery device including the secondary battery 7310.
  • the battery control unit 7300 performs arithmetic processing using these signals, and performs control for regulating the temperature of the secondary battery 7310 or controls a cooling device provided to the battery device or the like.
  • the outside-vehicle information detecting unit 7400 detects information about the outside of the vehicle including the vehicle control system 7000.
  • the outside-vehicle information detecting unit 7400 is connected with at least one of an imaging section 7410 and an outside-vehicle information detecting section 7420.
  • the imaging section 7410 includes at least one of a time-of-flight (ToF) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras.
  • ToF time-of-flight
  • the outside-vehicle information detecting section 7420 includes at least one of an environmental sensor for detecting current atmospheric conditions or weather conditions and a peripheral information detecting sensor for detecting another vehicle, an obstacle, a pedestrian, or the like, on the periphery of the vehicle including the vehicle control system 7000.
  • the environmental sensor may be at least one of a rain drop sensor detecting rain, a fog sensor detecting a fog, a sunshine sensor detecting a degree of sunshine, and a snow sensor detecting a snowfall
  • the peripheral information detecting sensor may be at least one of an ultrasonic sensor, a radar device, and a LIDAR device (Light detection and Ranging device, or Laser imaging detection and ranging device).
  • Each of the imaging section 7410 and the outside-vehicle information detecting section 7420 may be provided as an independent sensor or device, or may be provided as a device in which a plurality of sensors or devices are integrated.
  • the outside-vehicle information detecting unit 7400 makes the imaging section 7410 image an image of the outside of the vehicle, and receives imaged image data. In addition, the outside-vehicle information detecting unit 7400 receives detection information from the outside-vehicle information detecting section 7420 connected to the outside-vehicle information detecting unit 7400. In a case where the outside-vehicle information detecting section 7420 is an ultrasonic sensor, a radar device, or a LIDAR device, the outside-vehicle information detecting unit 7400 transmits an ultrasonic wave, an electromagnetic wave, or the like, and receives information of a received reflected wave.
  • the outside-vehicle information detecting unit 7400 may perform processing of detecting an object such as a human, a vehicle, an obstacle, a sign, a character on a road surface, or the like, or processing of detecting a distance thereto.
  • the outside-vehicle information detecting unit 7400 may perform environment recognition processing of recognizing a rainfall, a fog, road surface conditions, or the like, on the basis of the received information.
  • the outside-vehicle information detecting unit 7400 may calculate a distance to an object outside the vehicle on the basis of the received information.
  • the outside-vehicle information detecting unit 7400 may perform image recognition processing of recognizing a human, a vehicle, an obstacle, a sign, a character on a road surface, or the like, or processing of detecting a distance thereto.
  • the outside-vehicle information detecting unit 7400 may subject the received image data to processing such as distortion correction, alignment, or the like, and combine the image data imaged by a plurality of different imaging sections 7410 to generate a bird's-eye view image or a panoramic image.
  • the outside-vehicle information detecting unit 7400 may perform viewpoint conversion processing using the image data imaged by the imaging section 7410 including the different imaging parts.
  • the in-vehicle information detecting unit 7500 detects information about the inside of the vehicle.
  • the in-vehicle information detecting unit 7500 is, for example, connected with a driver state detecting section 7510 that detects the state of a driver.
  • the driver state detecting section 7510 may include a camera that images the driver, a biosensor that detects biological information of the driver, a microphone that collects sound within the interior of the vehicle, or the like.
  • the biosensor is, for example, disposed in a seat surface, the steering wheel, or the like, and detects biological information of an occupant sitting in a seat or the driver holding the steering wheel.
  • the in-vehide information detecting unit 7500 may calculate a degree of fatigue of the driver or a degree of concentration of the driver, or may determine whether the driver is dozing.
  • the in-vehicle information detecting unit 7500 may subject an audio signal obtained by the collection of the sound to processing such as noise canceling processing, or the like.
  • the integrated control unit 7600 controls general operation within the vehicle control system 7000 in accordance with various kinds of programs.
  • the integrated control unit 7600 is connected with an input section 7800.
  • the input section 7800 is implemented by a device capable of input operation by an occupant, such, for example, as a touch panel, a button, a microphone, a switch, a lever, or the like.
  • the integrated control unit 7600 may be supplied with data obtained by voice recognition of voice input through the microphone.
  • the input section 7800 may, for example, be a remote control device using infrared rays or other radio waves, or an external connecting device such as a mobile telephone, a personal digital assistant (PDA), or the like, that supports operation of the vehicle control system 7000.
  • PDA personal digital assistant
  • the input section 7800 may be, for example, a camera. In that case, an occupant can input information by gesture. Alternatively, data may be input which is obtained by detecting the movement of a wearable device that an occupant wears. Further, the input section 7800 may, for example, include an input control circuit, or the like, that generates an input signal on the basis of information input by an occupant, or the like, using the above-described input section 7800, and which outputs the generated input signal to the integrated control unit 7600. An occupant, or the like, inputs various kinds of data or gives an instruction for processing operations to the vehicle control system 7000 by operating the input section 7800.
  • the storage section 7690 may include a read only memory (ROM) that stores various kinds of programs executed by the microcomputer and a random access memory (RAM) that stores various kinds of parameters, operation results, sensor values, or the like.
  • ROM read only memory
  • RAM random access memory
  • the storage section 7690 may be implemented by a magnetic storage device such as a hard disc drive (HDD), or the like, a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
  • HDD hard disc drive
  • the general-purpose communication I/F 7620 is a communication I/F used widely, which communication I/F mediates communication with various apparatuses present in an external environment 7750.
  • the general-purpose communication I/F 7620 may implement a cellular communication protocol such as global system for mobile communications (GSM (registered trademark)), worldwide interoperability for microwave access (WiMAX (registered trademark)), long term evolution (LTE (registered trademark)), LTE-advanced (LTE-A), or the like, or another wireless communication protocol such as wireless LAN (referred to also as wireless fidelity (Wi-Fi (registered trademark))), Bluetooth (registered trademark), or the like.
  • GSM global system for mobile communications
  • WiMAX registered trademark
  • LTE long term evolution
  • LTE-A LTE-advanced
  • Wi-Fi wireless fidelity
  • Bluetooth registered trademark
  • the general-purpose communication I/F 7620 may, for example, connect to an apparatus (for example, an application server or a control server) present on an external network (for example, the Internet, a cloud network, or a company-specific network) via a base station or an access point
  • an apparatus for example, an application server or a control server
  • an external network for example, the Internet, a cloud network, or a company-specific network
  • the general-purpose communication I/F 7620 may connect to a terminal present in the vicinity of the vehicle (which terminal is, for example, a terminal of the driver, a pedestrian, or a store, or a machine type communication (MTC) terminal) using a peer to peer (P2P) technology, for example.
  • P2P peer to peer
  • the dedicated communication I/F 7630 is a communication I/F that supports a communication protocol developed for use in vehicles.
  • the dedicated communication I/F 7630 may implement a standard protocol such, for example, as wireless access in vehicle environment (WAVE), which is a combination of institute of electrical and electronic engineers (IEEE) 802.11p as a lower layer and IEEE 1609 as a higher layer, dedicated short range communications (DSRC), or a cellular communication protocol.
  • WAVE wireless access in vehicle environment
  • IEEE institute of electrical and electronic engineers
  • DSRC dedicated short range communications
  • the dedicated communication I/F 7630 typically carries out V2X communication as a concept including one or more of communication between a vehicle and a vehicle (Vehicle to Vehicle), communication between a vehicle and a road (Vehicle to Infrastructure), communication between a vehicle and a home (Vehicle to Home), and communication between a vehicle and a pedestrian (Vehicle to Pedestrian).
  • the positioning section 7640 performs positioning by receiving a global navigation satellite system (GNSS) signal from a GNSS satellite (for example, a GPS signal from a global positioning system (GPS) satellite), and generates positional information including the latitude, longitude, and altitude of the vehicle.
  • GNSS global navigation satellite system
  • GPS global positioning system
  • the positioning section 7640 may identify a current position by exchanging signals with a wireless access point, or may obtain the positional information from a terminal such as a mobile telephone, a personal handyphone system (PHS), or a smart phone that has a positioning function.
  • PHS personal handyphone system
  • the beacon receiving section 7650 receives a radio wave or an electromagnetic wave transmitted from a radio station installed on a road, or the like, and thereby obtains information about the current position, congestion, a closed road, a necessary time, or the like.
  • the function of the beacon receiving section 7650 may be included in the dedicated communication I/F 7630 described above.
  • the in-vehicle device I/F 7660 is a communication interface that mediates connection between the microcomputer 7610 and various in-vehicle devices 7760 present within the vehicle.
  • the in-vehicle device I/F 7660 may establish a wireless connection using a wireless communication protocol such as wireless LAN, Bluetooth (registered trademark), near field communication (NFC), or wireless universal serial bus (WUSB).
  • NFC near field communication
  • WUSB wireless universal serial bus
  • the in-vehicle device I/F 7660 may establish a wired connection by universal serial bus (USB), high-definition multimedia interface (HDMI (registered trademark)), mobile high-definition link (MHL), or the like, via a connection terminal (and a cable if necessary) not depicted in the figures.
  • USB universal serial bus
  • HDMI high-definition multimedia interface
  • the in-vehicle devices 7760 may, for example, include at least one of a mobile device and a wearable device possessed by an occupant and an information device carried into or attached to the vehicle.
  • the in-vehicle devices 7760 may also include a navigation device that searches for a path to an arbitrary destination.
  • the in-vehicle device I/F 7660 exchanges control signals or data signals with these in-vehicle devices 7760.
  • the vehicle-mounted network I/F 7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010.
  • the microcomputer 7610 of the integrated control unit 7600 controls the vehicle control system 7000 in accordance with various kinds of programs on the basis of information obtained via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning section 7640, the beacon receiving section 7650, the in-vehicle device I/F 7660, and the vehicle-mounted network I/F 7680.
  • the microcomputer 7610 may calculate a control target value for the driving force generating device, the steering mechanism, or the braking device on the basis of the obtained information about the inside and outside of the vehicle, and output a control command to the driving system control unit 7100.
  • the microcomputer 7610 may perform cooperative control intended to implement functions of an advanced driver assistance system (ADAS) which functions include collision avoidance or shock mitigation for the vehicle, following driving based on a following distance, vehicle speed maintaining driving, a warning of collision of the vehicle, a warning of deviation of the vehicle from a lane, or the like.
  • ADAS advanced driver assistance system
  • the microcomputer 7610 may perform cooperative control intended for automatic driving, which makes the vehicle travel autonomously without depending on the operation of the driver, or the like, by controlling the driving force generating device, the steering mechanism, the braking device, or the like, on the basis of the obtained information about the surroundings of the vehicle.
  • the microcomputer 7610 of the integrated control unit 7600 may in particular control the vehicle control system 7000 in accordance with various kinds of programs on the basis of information obtained via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning section 7640, the beacon receiving section 7650, the in-vehicle device I/F 7660, and the vehicle-mounted network I/F 7680.
  • the microcomputer 7610 may implement adaptive algorithms for generating artificial sound as described in the embodiment above.
  • the microcomputer 7610 may control vehicle to active noise control device communication as described in the embodiments above.
  • the microcomputer 7610 may generate three-dimensional distance information between the vehicle and an object such as a surrounding structure, a person, or the like, and generate local map information including information about the surroundings of the current position of the vehicle, on the basis of information obtained via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning section 7640, the beacon receiving section 7650, the in-vehicle device I/F 7660, and the vehicle-mounted network I/F 7680. This information may be used as input for an adaptive sound generation as described in the embodiments above.
  • the microcomputer 7610 may predict danger such as collision of the vehicle, approaching of a pedestrian, or the like, an entry to a closed road, or the like, on the basis of the obtained information, and generate a warning signal.
  • the warning signal may, for example, be a signal for producing a warning sound or lighting a warning lamp.
  • the sound/image output section 7670 transmits an output signal of at least one of a sound and an image to an output device capable of visually or audibly notifying information to an occupant of the vehicle or the outside of the vehicle.
  • the sound/image output section 7670 may be used to generate artificial sound as described in the embodiment above.
  • an audio speaker 7710, a display section 7720, and an instrument panel 7730 are illustrated as the output device.
  • the display section 7720 may, for example, include at least one of an on-board display and a head-up display.
  • the display section 7720 may have an augmented reality (AR) display function.
  • AR augmented reality
  • the output device may be other than these devices, and may be another device such as headphones, a wearable device such as an eyeglass type display worn by an occupant, or the like, a projector, a lamp, or the like.
  • In a case where the output device is a display device, the display device visually displays results obtained by various kinds of processing performed by the microcomputer 7610 or information received from another control unit in various forms such as text, an image, a table, a graph, or the like.
  • the audio output device converts an audio signal constituted of reproduced audio data or sound data, or the like, into an analog signal, and audibly outputs the analog signal.
  • the microcomputer 7610 can calculate sound correction parameters to adjust the monopole synthesis system for vehicle frame transformations (as described with regard to Figs. 7 and 8).
  • each individual control unit may include a plurality of control units.
  • the vehicle control system 7000 may include another control unit not depicted in the figures.
  • part or the whole of the functions performed by one of the control units in the above description may be assigned to another control unit. That is, predetermined arithmetic processing may be performed by any of the control units as long as information is transmitted and received via the communication network 7010.
  • a sensor or a device connected to one of the control units may be connected to another control unit, and a plurality of control units may mutually transmit and receive detection information via the communication network 7010.
  • a computer program for realizing the functions of the information processing device 100 according to the present embodiment described with reference to Fig. 9 can be implemented in one of the control units, or the like.
  • a computer readable recording medium storing such a computer program can also be provided.
  • the recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like.
  • the above-described computer program may be distributed via a network, for example, without the recording medium being used.
  • An apparatus comprising circuitry configured to render audio considering an automotive chassis shift caused by vehicle dynamics.
  • rendering audio comprises providing noise cancellation considering the automotive chassis shift.
  • circuitry is configured to determine chassis shift based on information on vehicle dynamics.
  • circuitry is configured to obtain the information on vehicle dynamics from sensors integrated in a vehicle.
  • circuitry is configured to determine chassis shift by calculating speaker positions or corrections of speaker positions based on information on vehicle dynamics.
  • circuitry is configured to determine chassis shift by calculating a torsion of the chassis.
  • circuitry is configured to determine a vehicle lateral acceleration and to determine a chassis torsion based on the determined vehicle lateral acceleration.
  • 3D audio system comprising the apparatus of any one of (1) to (14).
  • Automotive controller comprising the apparatus of any one of (1) to (14).
  • Computer program comprising instructions which, when executed on a processor, cause the processor to render audio considering an automotive chassis shift caused by vehicle dynamics.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Acoustics & Sound (AREA)
  • Signal Processing (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)

Abstract

An apparatus comprising circuitry configured to render spatial audio, wherein a chassis shift is determined based on sensor information on vehicle dynamics. Due to internal movements of the chassis, such as torsion, the position of the loudspeakers is not stable. The provided circuitry estimates the difference in loudspeaker position due to torsion and corrects the path length by adjusting the phase or providing a delay.

Description

METHOD, DEVICE AND SYSTEM FOR THE GENERATION
OF SPATIAL SOUND FIELDS
TECHNICAL FIELD
The present disclosure generally pertains to methods, devices and systems for the generation of spatial sound fields, in particular to 3D Audio inside an automotive chassis.
TECHNICAL BACKGROUND
Spatial audio rendering techniques for the generation of spatial sound fields create a virtual acoustic environment (also called 3D audio, or 3D virtual audio). They produce "artificial" wave fronts synthesized by a large number of individually driven speakers. Examples are the so-called 5.1 or 7.1 systems, which are composed of 5 or 7 loudspeaker enclosures and one or two extra subwoofers, which are designed to reproduce the low frequency range of sound with a higher energy. In such systems mathematical algorithms are used to reproduce a sound field as exactly as possible.
The most well-known system for the generation of a spatial sound field is the so-called Wavefield synthesis (WFS). Here the reproduction of a sound field is based on the Huygens principle and approximates it with a number of acoustic enclosures. This method has a relatively high computational complexity. A target sound field is modelled as at least one target monopole placed at a defined target position.
There exist 3D audio systems that are optimized for in-car use. Such systems are typically based on cross-talk cancelation techniques to control the sound arriving at each of the listener's ears. The low acoustic power requirements and well defined listener locations are beneficial to the optimization of such in-car systems. Although there exist 3D audio systems that are optimized for in-car use it is generally desirable to improve techniques for the generation of a spatial sound field.
SUMMARY
According to a first aspect the disclosure provides an apparatus comprising circuitry configured to render audio considering an automotive chassis shift caused by vehicle dynamics.
According to a further aspect the disclosure provides a method comprising rendering audio considering an automotive chassis shift caused by vehicle dynamics.
According to a further aspect the disclosure provides a computer program comprising instructions which, when executed on a processor, cause the processor to render audio considering an automotive chassis shift caused by vehicle dynamics.
Further aspects are set forth in the dependent claims, the following description and the drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
Embodiments are explained by way of example with respect to the accompanying drawings, in which:
Fig. 1 shows a process of determining chassis shift;
Fig. 2 shows a process of 3d audio rendering based on speaker positions that consider chassis shift;
Fig. 3 illustrates a schematic example where it is assumed that a force F acts down on the left front wheel and the rear wheels are fixed;
Fig. 4 illustrates an embodiment of how a displacement of a car body frame may cause a shift of speaker positions;
Fig.5a illustrates a two-track model with twistable frame;
Fig. 5b illustrates the spring travel in the two-track model with twistable frame;
Fig. 6 illustrates an embodiment of a method comprising rendering audio considering an automotive chassis shift caused by vehicle dynamics;
Fig. 7 provides an embodiment of a 3d audio rendering that is based on a digitalized Monopole Synthesis algorithm;
Fig. 8 schematically shows an exemplary speaker displacement in a 3d audio environment; and
Fig. 9 is a block diagram depicting an example of schematic configuration of a vehicle control system as an example of a mobile body control system to which the technology according to an embodiment of the present disclosure can be applied.
DETAILED DESCRIPTION OF EMBODIMENTS
Before a detailed description of the embodiments under reference of Fig. 1, general explanations are made.
The embodiments relate to an apparatus comprising circuitry configured to render audio considering an automotive chassis shift caused by vehicle dynamics. In particular, the embodiments concern the finding that, in the case of a 3D audio sound system installed inside a car, the speaker positions might not be stable because of the limited torsional rigidity of the car. This might impair the sound quality. The apparatus according to the embodiments described below may allow correct playout of 3D sound inside an automotive chassis even if the speakers face individual displacements due to distortions such as torque applied on the chassis by driving maneuvers.
An automotive chassis may for example consist of an internal vehicle frame that supports the vehicle in its construction and use and can also provide protection for some internal parts. An example of a chassis as referred to in the embodiments may be the underpart of a motor vehicle, consisting of the frame on which the body is mounted. The term chassis as used here also comprises the concept of a "running" chassis in which the running gear, such as wheels and transmission, and the driver's seat are included.
Vehicle dynamics such as braking, driving in a curve, accelerating, etc. may cause chassis shifts. The term chassis shift as used in the embodiments described below relates to any internal movements of the chassis, such as torsion, bending, and the like. The car body torsional rigidity is designed by car manufacturers as a compromise between the following points: comfort of the driver, driving dynamic properties, achieving a minimal weight of the chassis, manufacturing costs, providing sufficient space for passengers and luggage, minimal air resistance, an attractive shape, etc. All these parameters influence the stiffness of the cars. Chassis shifts result in respective shifts of the car body and the components inside the car, such as loudspeakers that are located inside the car. For example, if the car performs heavy braking during a left curve, the car's mass generates high forces towards the front right side. Such forces cause the chassis to shift and twist. The curve-inner rear wheel will be lifted into the air, and the front axle on the curve's outer side will be pressed downwards. Speakers that are fixedly mounted to the chassis (e.g. via the vehicle body) will move in position according to the chassis shift.
Circuitry may include a processor, a memory (RAM, ROM or the like), a storage, input means (e.g. an interface to a vehicle communication bus), and output means (an audio interface, loudspeakers, etc.). Moreover, it may include sensors for sensing vehicle dynamics or for sensing environmental parameters (e.g. radar, humidity, light, temperature), etc.
The circuitry may be configured to render 3d audio considering the automotive chassis shift. For example, the apparatus may relate to playout of 3D sound inside a flexible automotive chassis, e.g. inside the chassis of a car. A 3D audio sound controller according to an embodiment thus considers chassis shifts when playing out sound. The 3D audio sound controller may add delays or phase rotations (see Fig. 6 and the corresponding description for more details) to the speakers' individual signals in order to play out correct, high-quality 3D audio in spite of the chassis movements. By adding the delays to the sound playout for every speaker individually, the same sound quality might be enjoyed by the passengers as if there were no deformation of the car chassis. That is, according to some embodiments, when speakers change their position, the signals fed to the individual speakers are predistorted in order to maintain the relationship of the phases of all audio signals inside the area where the 3D audio effects are intended to work. This allows eliminating negative consequences for 3D audio caused by displacements of the speakers. This maintains the sound quality even if the car is maneuvered in a way where the chassis might be deformed. The circuitry may for example be configured to use Wavefield synthesis and/or monopole synthesis techniques to generate 3d sound sources.
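A minimal sketch of this per-speaker predistortion is given below, assuming the positions of the speakers and of the listening point are known. The fractional delay is applied as a phase rotation in the frequency domain; the positions and the test signal are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s

def compensate_displacement(signal, fs, nominal_pos, actual_pos, listener_pos):
    """Delay `signal` by the change in propagation time caused by the speaker
    moving from nominal_pos to actual_pos, as seen from listener_pos."""
    d_nominal = np.linalg.norm(np.asarray(nominal_pos, float) - np.asarray(listener_pos, float))
    d_actual = np.linalg.norm(np.asarray(actual_pos, float) - np.asarray(listener_pos, float))
    delay = (d_nominal - d_actual) / SPEED_OF_SOUND   # seconds, may be negative

    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum *= np.exp(-2j * np.pi * freqs * delay)   # linear phase shift = delay
    return np.fft.irfft(spectrum, n=len(signal))

fs = 48000
x = np.random.randn(fs // 10)                         # 100 ms of test signal
y = compensate_displacement(x, fs,
                            nominal_pos=[1.2, 1.5, 0.9],
                            actual_pos=[1.2, 1.5, 0.912],
                            listener_pos=[0.0, 0.0, 1.1])
print(len(y))
```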
The circuitry may also be configured to provide noise cancellation considering the automotive chassis shift. Noise cancellation may for example be achieved by detecting ambient noise and by playing out 3D sound inside a flexible automotive chassis, the 3D sound constituting anti-sound that is foreseen to cancel the detected ambient noise.
Furthermore, considering chassis shifts may also be used in cross-channel compensation audio playback, which uses cancellation to produce stereo/HRTF effects. For these, even small displacements of the speakers would be an issue.
For cars on race tracks, sportive car sounds (engine rotations, ignition and exhaust noises, turbo wheel, backfire, wheel noises if drifting or sliding, etc.) might be presented to the driver via the 3D audio sound system. Such an in-car sound could be a frequently selected feature for e.g. an electric car, if the driver likes to enjoy the sound of a combustion engine. On the outside, the car is quiet. In the 'race-track' application the forces on the car's chassis are maximal and the 3D audio should work with full performance.
The circuitry may be configured to determine the chassis shift based on information on vehicle dynamics. The information on vehicle dynamics may for example be obtained by sensors integrated in a vehicle. For example, a vehicle has integrated sensors that may permanently provide and record information that is indicative of the vehicle dynamics, such as yaw rate, pitch rate and roll-over rate, wheel speeds obtained from wheel-spin sensors, and information obtained from steering wheel angle sensors, throttle and braking sensors that detect any driving maneuver of the car. Braking forces and accelerations can also be measured by a modern car.
According to some embodiments, the circuitry is configured to calculate speaker positions or corrections of speaker positions based on information on vehicle dynamics. For example, the circuitry is configured to determine speaker positions based on the determined chassis shift. The speaker positions may for example relate to an array of speakers distributed inside a vehicle.
For example, the circuitry may be configured to determine the chassis shift by calculating a torsion of the chassis.
Calculating a torsion of a chassis may for example comprise calculating a torsion angle. Chassis torsion may result in the relative position of the front speakers changing with respect to the position of the rear speakers. According to the embodiments described below in more detail, speaker displacements caused by such chassis shifts are determined and considered in order to play out 3d audio in a correct way.
The circuitry may be configured to calculate the torsion of the chassis based on a predefined torsional stiffness of the vehicle body. Methods to determine the torsional stiffness of an automotive chassis are known and comprise analytic methods, simulation methods, or experimental methods. For example, a typical small family car has a torsional stiffness of 6.7 kNm/°, whereas a mid-engined sports car shows a typical torsional rigidity of 65 kNm/°. A predefined torsional stiffness of a vehicle body may for example be expressed as a parameter in the range from 5 kNm/° to 100 kNm/° or the like.
The circuitry may be configured to determine one or more forces on the tires of a vehicle and to determine a chassis torsion based on the determined forces on the tires. The forces on the tires of a vehicle may for example be determined using acceleration sensors that are positioned at the tires.
Alternatively or in addition, the circuitry may be configured to determine a vehicle lateral acceleration and to determine a chassis torsion based on the determined vehicle lateral acceleration. A vehicle lateral acceleration may for example be determined using an acceleration sensor that is located at an arbitrary position in or on a vehicle.
Alternatively or in addition, the circuitry may be configured to determine resonant excitations of the suspension system to predict chassis shifts.
Alternatively or in addition, the circuitry may be configured to obtain direct measurements of movements of the chassis.
Alternatively or in addition, the circuitry may be configured to determine chassis shifts based on calibration.
The embodiments also relate to a method comprising rendering audio considering an automotive chassis shift caused by vehicle dynamics. The method may comprise any of the processes and/or operations that are described above or in the detailed description of the embodiments below.
The embodiments also relate to a computer program comprising instructions which, when executed on a processor, cause the processor to render audio considering an automotive chassis shift caused by vehicle dynamics. The computer program may implement any of the processes and/or operations that are described above or in the detailed description of the embodiments below.
Determining chassis shift based on vehicle dynamics
Fig. 1 shows a process of determining chassis shift. A car 101 has integrated sensors that permanently provide and record information 102 that is indicative of the vehicle dynamics, such as yaw-rate, pitch-rate and roll-over rate, wheel speeds obtained from wheel-spin sensors, and information obtained from steering wheel angle sensors, throttle and braking sensors that detect any driving maneuver of the car. Braking forces and accelerations can also be measured by a modern car. At 103 the information 102 on the vehicle dynamics obtained from the sensors is used to determine the chassis shifts. The prediction of the chassis shifts yields (corrected) speaker positions 104.
3D audio rendering considering chassis shift
Fig. 2 shows a process of 3D audio rendering based on speaker positions that consider chassis shift. A 3D audio rendering operation 201 obtains audio data from an audio source 202. Still further, the 3D audio rendering operation 201 obtains (corrected) speaker positions 203 that have been obtained by predicting chassis shifts (as described with regard to Fig. 1). The 3D audio rendering operation 201 renders the audio data from the audio source 202 considering the (corrected) speaker positions 203. Thus, the 3D audio rendering operation 201 can play out correct 3D audio to a speaker array 204 to which the (corrected) speaker positions 203 relate.
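The two figures describe a simple two-stage pipeline: predict the chassis shift from the recorded sensor information, then hand the corrected speaker positions to the renderer. The following is a minimal sketch of that flow; all names (VehicleDynamics, predict_speaker_positions, the renderer object and its set_speaker_positions method) are hypothetical and not part of the disclosed system, and the shift model is passed in as a callable so that any of the estimation methods described in this disclosure can be plugged in.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

Position = Tuple[float, float, float]   # x, y, z of a speaker in the chassis frame (m)

# Hypothetical container for the recorded vehicle-dynamics information 102 of Fig. 1.
@dataclass
class VehicleDynamics:
    yaw_rate: float                                   # rad/s
    pitch_rate: float                                 # rad/s
    roll_rate: float                                  # rad/s
    lateral_acceleration: float                       # m/s^2
    wheel_speeds: Tuple[float, float, float, float]   # m/s

def predict_speaker_positions(
        nominal_positions: List[Position],
        dynamics: VehicleDynamics,
        estimate_shift: Callable[[Position, VehicleDynamics], Position]) -> List[Position]:
    """Step 103 of Fig. 1: turn vehicle dynamics into corrected speaker positions.
    `estimate_shift` returns a per-speaker displacement (dx, dy, dz) and can be any
    of the chassis-shift models described in this disclosure."""
    corrected = []
    for x, y, z in nominal_positions:
        dx, dy, dz = estimate_shift((x, y, z), dynamics)
        corrected.append((x + dx, y + dy, z + dz))
    return corrected

def render_block(renderer, audio_block, corrected_positions: List[Position]):
    """Step 201 of Fig. 2: configure the renderer with the corrected positions
    and render the next block of audio for the speaker array 204."""
    renderer.set_speaker_positions(corrected_positions)   # hypothetical renderer API
    return renderer.render(audio_block)
```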
Torsion due to braking in a curve
Braking hard in a curve causes the forces on the tires to be distributed unevenly. This may cause forces up to an equivalent of 10 t in mass on only one tire.
Forces acting on the tires can be measured by sensors in the suspension system such as linear potentiometers or rotary potentiometers. Using the model described below, the chassis shift can be calculated based on the forces obtained by the sensors in the suspension system.
Fig. 3 illustrates a schematic example assuming that a force F acts down on the left front wheel while the rear wheels are fixed. In this case, for a vehicle of width w, a torsional moment of

MT = F · w/2

is obtained. Torsion is connected to the applied torsional moment through the torsional stiffness cT and the torsion angle Δφ:

MT = cT · Δφ

Identifying half the vehicle width with the lever length l = w/2, combining the two former equations and solving for the torsion angle Δφ yields

Δφ = MT / cT = F · l / cT
If it is considered that two front and two rear speakers are located at the sides of the car, a torsion angle as obtained with the above model results in the speaker pairs having a relative displacement as drawn in Fig. 4. The relative position displacement can be calculated from the torsion angle Δφ and the distance of the speaker mounting positions from the torsion axis.
Assuming, as an example, the torsional stiffness of a typical small family car as quoted above, a vehicle width in the order of 2 m, and a force equivalent to the weight of about 5 t acting on the tire, the model described above results in a torsion angle of a few degrees, which corresponds to a speaker position displacement in the order of 9 cm.
Audio waves at 10 kHz (probably one of the highest frequencies that is audible) have a wavelength of 3.5 cm, well below the 9 cm position displacement determined above. This shows that speaker distance changes of this order of magnitude will indeed have an effect on 3D audio reproduction.
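As a minimal illustration of the braking-in-a-curve model above, the following sketch computes the torsion angle from the force on one front wheel and the resulting speaker displacement. The function names are hypothetical; the displacement formula assumes the speaker sits at half the vehicle width from the torsion axis, which is a simplifying geometric assumption, so the printed value only indicates the order of magnitude of the displacement discussed above.

```python
import math

def torsion_angle_deg(force_n: float, width_m: float, stiffness_knm_per_deg: float) -> float:
    """Torsion angle for a vertical force on one front wheel, rear wheels fixed:
    MT = F * w/2 and Delta_phi = MT / cT."""
    torsional_moment_knm = force_n * (width_m / 2.0) / 1000.0   # kNm
    return torsional_moment_knm / stiffness_knm_per_deg

def speaker_displacement_m(torsion_deg: float, lever_m: float) -> float:
    """Approximate displacement of a speaker mounted lever_m away from the torsion axis."""
    return lever_m * math.sin(math.radians(torsion_deg))

# Example values: ~50 kN (weight of about 5 t) on one tire, a 2 m wide car,
# and the family-car torsional stiffness quoted above (6.7 kNm/deg).
phi_deg = torsion_angle_deg(force_n=50_000.0, width_m=2.0, stiffness_knm_per_deg=6.7)
dz = speaker_displacement_m(phi_deg, lever_m=1.0)
print(f"torsion angle ~{phi_deg:.1f} deg, speaker displacement ~{100 * dz:.0f} cm")
```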
Torsion due to lateral acceleration in a curve
The following embodiment concerns lateral accelerations that result from driving in a curve, which also influence the relative speaker positions in a significant way.
Fig. 5a illustrates a two-track model with a twistable frame as disclosed in B. Heißing, M. Ersoy: Chassis Handbook: Fundamentals, Driving Dynamics, Components, Mechatronics, Perspectives. 2010. Springer Science & Business Media, p. 76-79.
A vehicle moving through a curve of radius r possesses a lateral acceleration ay which can be expressed through the vehicle speed v as

ay = v² / r

Therefore a vehicle with body mass m experiences a centrifugal force

Fc = m · ay = m · v² / r
In most vehicles the suspension roll axis and the vehicle's center of mass have a distance h in altitude, hence the centrifugal force Fc becomes torsional. Let g be the gravitational acceleration and φ the roll angle; then the total roll moment on a single vehicle axle is given by

M = Fc · h · cos(φ) + m · g · h · sin(φ)

The second term is due to the displacement of the center of mass through rotation. For simplification, a first-order Taylor approximation is applied, which yields

M ≈ Fc · h + m · g · h · φ
To counter the roll moment, vehicles are equipped with a spring suspension. According to the model, two springs per axle are considered, whereby each spring has suspension stiffness cs and spring travel fs, and the two springs are placed at a distance ss from each other. The spring travel is linearly approximated through

fs ≈ (ss / 2) · φ

In summary, the roll moment is balanced by the suspension through

M = 2 · cs · fs · (ss / 2) = (cs · ss² / 2) · φ
As a model for a vehicle frame, two axles connected through a torsional spring with stiffness ct are considered. This results in one roll balance equation per axle, in which the suspension moment of each axle is complemented by the coupling moment of the torsional spring, ct · (φ1 - φ2) for the first axle and ct · (φ2 - φ1) for the second axle. Assuming identical suspension parameters for both axles, the torsion angle between both axles is defined as Δφ = (φ2 - φ1), and the first equation is subtracted from the second equation. The result can now be solved for the torsion angle Δφ and simplified to an expression that depends on the lateral acceleration ay and the vehicle parameters m1, h1, m2, h2, cs, ss and ct.
Assuming a spring distance of about the vehicle width, ss = 2 m, together with typical values for the spring stiffness cs, the axle masses and heights, the frame stiffness ct and the lateral acceleration ay, a torsion angle Δφ is obtained which yields a relative speaker displacement of Δy ≈ 0.1 mm and Δz ≈ 12 mm.
Fig. 5b illustrates the spring travel in the two-track model with twistable frame, wherein exemplary forces F1 on the front axle and F2 on the rear axle are at action (i.e. resulting from the roll moment described above). The spring travels of the front and rear axle differ, for example due to a different mass distribution on the vehicle axles.
Fig. 6 illustrates an embodiment of a method comprising rendering audio considering an automotive chassis shift caused by vehicle dynamics. At 601, a lateral acceleration ay is received. At 602, the torsion angle Δφ is calculated based on the lateral acceleration ay and based on vehicle parameters (such as m1, h1, m2, h2, ct, cs, ss). At 603, the speaker position displacement Δy, Δz is calculated from the torsion angle Δφ and the stationary speaker positions. Finally, at 604, the 3D audio renderer is configured with the corrected speaker positions.
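A corresponding sketch of operations 601 to 604 is given below. The closed-form expression for the torsion angle is an assumption here (a two-track roll model in which the difference of the axle roll moments is balanced by the suspension and the frame torsion spring, with the gravitational restoring terms neglected); the renderer object and its set_speaker_positions method are likewise hypothetical.

```python
import math
from typing import List, Tuple

def torsion_angle(a_y: float, m1: float, h1: float, m2: float, h2: float,
                  c_s: float, s_s: float, c_t: float) -> float:
    """Operation 602: torsion angle between front and rear axle (radians).
    Assumed simplified closed form: the difference of the axle roll moments,
    (m2*h2 - m1*h1) * a_y, is balanced by the suspension (c_s * s_s^2 / 2)
    and by the frame torsion spring (2 * c_t)."""
    return (m2 * h2 - m1 * h1) * a_y / (c_s * s_s ** 2 / 2.0 + 2.0 * c_t)

def displaced_speaker(position: Tuple[float, float, float],
                      delta_phi: float) -> Tuple[float, float, float]:
    """Operation 603: rotate a front speaker about the longitudinal (x) axis by
    delta_phi relative to the rear of the chassis; y is lateral, z is vertical."""
    x, y, z = position
    y_new = y * math.cos(delta_phi) - z * math.sin(delta_phi)
    z_new = y * math.sin(delta_phi) + z * math.cos(delta_phi)
    return x, y_new, z_new

def configure_renderer(renderer, stationary_positions: List[Tuple[float, float, float]],
                       a_y: float, params: dict) -> None:
    """Operations 601-604: from measured lateral acceleration to corrected positions."""
    delta_phi = torsion_angle(a_y, **params)   # params: m1, h1, m2, h2, c_s, s_s, c_t
    corrected = [displaced_speaker(p, delta_phi) for p in stationary_positions]
    renderer.set_speaker_positions(corrected)  # hypothetical renderer API
```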
Direct measurement of movements of the car chassis
According to a further embodiment, movements of a car chassis are measured directly by e.g. a strain gauge located at the positions where the distortion is significant. Sensors inside the chassis suspension, suspension springs, etc. may also be helpful for estimating chassis movements.
Predicting chassis shifts based on calibration
According to a further embodiment, chassis shifts are determined by calibration and measurements. To compensate for the different speaker locations, separate calibrations of the speaker locations are performed, i.e. the speaker locations are measured in the case of no force acting on the system. The measurement is then repeated under the effect of external forces, and the dependence between chassis shift and external forces is recorded in a database. This calibration may be performed for a predetermined number of supporting points.
After this calibration phase, during operation of a car, chassis shifts are obtained by determining the current external forces that act on the car and by querying the database to obtain the chassis shift that corresponds to the current external forces. If the current external forces do not match any one of the supporting points for which calibration has been performed, the actual speaker positions may be obtained using linear interpolation. This procedure may be refined by using more supporting points in between.
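The calibration lookup described above can be sketched as a small table of supporting points with linear interpolation in between. The class and method names below are hypothetical, and the table is keyed by a single scalar force for simplicity; a real implementation would interpolate over several force components.

```python
import bisect

class ChassisShiftCalibration:
    """Calibration table: external force (N) -> measured speaker positions.
    Linear interpolation between supporting points, as a one-dimensional sketch."""

    def __init__(self):
        self._forces = []      # sorted supporting points
        self._positions = []   # list of speaker-position lists, same order

    def add_supporting_point(self, force, speaker_positions):
        idx = bisect.bisect_left(self._forces, force)
        self._forces.insert(idx, force)
        self._positions.insert(idx, speaker_positions)

    def lookup(self, force):
        """Return speaker positions interpolated for the current force."""
        if force <= self._forces[0]:
            return self._positions[0]
        if force >= self._forces[-1]:
            return self._positions[-1]
        hi = bisect.bisect_left(self._forces, force)
        lo = hi - 1
        t = (force - self._forces[lo]) / (self._forces[hi] - self._forces[lo])
        return [
            tuple(a + t * (b - a) for a, b in zip(p_lo, p_hi))
            for p_lo, p_hi in zip(self._positions[lo], self._positions[hi])
        ]
```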
System for digitalized monopole synthesis in the case of integer delays
Fig. 7 provides an embodiment of a 3D audio rendering that is based on a digitalized Monopole Synthesis algorithm. The theoretical background of this technique is described in more detail in patent application US 2016/0037282 A1, which is herewith incorporated by reference.
The technique which is implemented in the embodiments of US 2016/0037282 A1 is conceptually similar to Wavefield synthesis, which uses a restricted number of acoustic enclosures to generate a defined sound field. The fundamental basis of the generation principle of the embodiments is, however, specific, since the synthesis does not try to model the sound field exactly but is based on a least squares approach. A target sound field is modelled as at least one target monopole placed at a defined target position. In one embodiment, the target sound field is modelled as one single target monopole. In other embodiments, the target sound field is modelled as multiple target monopoles placed at respective defined target positions. For example, each target monopole may represent a noise cancelation source comprised in a set of multiple noise cancelation sources positioned at a specific location within a space. The position of a target monopole may be moving. For example, a target monopole may adapt to the movement of a noise source to be attenuated. If multiple target monopoles are used to represent a target sound field, then the methods of synthesizing the sound of a target monopole based on a set of defined synthesis monopoles as described below may be applied for each target monopole independently, and the contributions of the synthesis monopoles obtained for each target monopole may be summed to reconstruct the target sound field.
A source signal x(n) is fed to delay units labelled by z^(-np) and to amplification units ap, where p = 1, ..., N is the index of the respective synthesis monopole used for synthesizing the target monopole signal. The delay and amplification units according to this embodiment may apply equation (117) of reference US 2016/0037282 A1 to compute the resulting signals yp(n) = sp(n) which are used to synthesize the target monopole signal. The resulting signals sp(n) are power amplified and fed to loudspeaker Sp.
In this embodiment, the synthesis is thus performed in the form of delayed and amplified components of the source signal x.
According to this embodiment, the delay np for a synthesis monopole indexed p corresponds to the propagation time of sound for the Euclidean distance r = Rp0 = |rp - r0| between the target monopole r0 and the generator rp.
Further, according to this embodiment, the amplification factor ap is chosen inversely proportional to the distance r = Rp0.
In alternative embodiments of the system, the modified amplification factor according to equation (118) of reference US 2016/0037282 A1 can be used.
In yet alternative embodiments of the system, a mapping factor as described with regard to Fig. 9 can be used to modify the amplification.
Fig. 8 schematically shows an exemplary speaker displacement in a 3D audio system. A user of the 3D audio system is located at 801. Three speakers SP1, SP2 and SP3 generate a virtual sound source 802 (e.g. a monopole source). Due to chassis shift, the position of speaker SP2 is displaced to SP2' (dashed lines). The travelling times of sound from the respective speaker positions and the user 801 to the virtual sound source 802 are Δt0, Δt1, Δt2, Δt'2 and Δt3. If we consider a 3D rendering system which is using the amplitude and the phase relation between the real speaker SP2 and the virtual source 802, it is easy to see that the chassis shift creates an imbalance of the delays in the system. In Fig. 8 the shifted position of the real speaker (SP2') would create the impression of the virtual source being shifted to the right. If we change the delay from Δt2 to Δt'2 and correct the amplitude, we can maintain the virtual position of the virtual source 802 despite the displacement of speaker SP2 due to chassis shift.
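Building on the delay and gain relations described for the monopole synthesis above (a delay corresponding to the propagation time over the Euclidean distance, and an amplification inversely proportional to that distance), the following sketch recomputes a speaker's delay and gain after a chassis-induced displacement, as in the SP2 to SP2' example of Fig. 8. The coordinates, sample rate and the distance clamp are illustrative assumptions; the equations of US 2016/0037282 A1 are not reproduced here.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s at roughly 20 degrees Celsius

def delay_and_gain(speaker_pos, target_pos, sample_rate=48_000):
    """Per-speaker integer delay (in samples) and amplification for synthesizing a
    target monopole at target_pos: delay = distance / c, gain ~ 1 / distance."""
    r = math.dist(speaker_pos, target_pos)
    delay_samples = round(r / SPEED_OF_SOUND * sample_rate)
    gain = 1.0 / max(r, 1e-3)   # clamp to avoid division by zero near the source
    return delay_samples, gain

# Nominal position of SP2 and its shifted position SP2' due to chassis torsion
# (coordinates are made up for illustration only).
virtual_source = (2.0, 0.0, 1.2)
sp2_nominal = (1.0, -0.8, 0.9)
sp2_shifted = (1.0, -0.8, 1.02)   # e.g. 12 mm to a few cm of vertical displacement

print(delay_and_gain(sp2_nominal, virtual_source))
print(delay_and_gain(sp2_shifted, virtual_source))   # corrected delay and gain
```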
Examples of application
The technology according to an embodiment of the present disclosure, in particular a 3D audio rendering device as described above, is applicable to various products. For example, the technology according to an embodiment of the present disclosure may be implemented as a device included in a mobile body that is any of various kinds of automobiles, electric vehicles, hybrid electric vehicles, motorcycles, bicycles, personal mobility vehicles, airplanes, trains, choppers, drones, ships, robots, construction machinery, agricultural machinery (e.g., tractors), and the like.
Fig. 9 is a block diagram depicting an example of schematic configuration of a vehicle control system 7000 as an example of a mobile body control system to which the technology according to an embodiment of the present disclosure can be applied. The vehicle control system 7000 includes a plurality of electronic control units connected to each other via a communication network 7010. In the example depicted in Fig. 9, the vehicle control system 7000 includes a driving system control unit 7100, a body system control unit 7200, a battery control unit 7300, an outside-vehicle information detecting unit 7400, an in-vehicle information detecting unit 7500, and an integrated control unit 7600. The communication network 7010 connecting the plurality of control units to each other may, for example, be a vehicle-mounted communication network compliant with an arbitrary standard such as controller area network (CAN), local interconnect network (LIN), local area network (LAN), FlexRay (registered trademark), or the like.
Each of the control units includes: a microcomputer that performs arithmetic processing according to various kinds of programs; a storage section that stores the programs executed by the microcomputer, parameters used for various kinds of operations, or the like; and a driving circuit that drives various kinds of control target devices. Each of the control units further includes: a network interface (I/F) for performing communication with other control units via the communication network 7010; and a communication I/F for performing communication with a device, a sensor, or the like, within and without the vehicle by wire communication or radio communication. A functional configuration of the integrated control unit 7600 illustrated in Fig. 9 includes a microcomputer 7610, a general-purpose communication I/F 7620, a dedicated communication I/F 7630, a positioning section 7640, a beacon receiving section 7650, an in-vehicle device I/F 7660, a sound/image output section 7670, a vehicle-mounted network I/F 7680, and a storage section 7690. The other control units similarly include a microcomputer, a communication I/F, a storage section, and the like.
The driving system control unit 7100 controls the operation of devices related to the driving system of the vehicle in accordance with various kinds of programs. For example, the driving system control unit 7100 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine, a driving motor, or the like, a driving force transmitting mechanism for transmitting the driving force to wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like. The driving system control unit 7100 may have a function as a control device of an antilock brake system (ABS), electronic stability control (ESC), or the like.
The driving system control unit 7100 is connected with a vehicle state detecting section 7110. The vehicle state detecting section 7110, for example, includes at least one of a gyro sensor that detects the angular velocity of axial rotational movement of a vehicle body, an acceleration sensor that detects the lateral (ay, as received in operation 601 of Fig. 6) and forward acceleration of the vehicle, and sensors for detecting an amount of operation of an accelerator pedal, an amount of operation of a brake pedal, the steering angle of a steering wheel, an engine speed or the rotational speed of wheels, and the like. In the present application the lateral acceleration can for example be used to calculate (in operation 602 of Fig. 6) the torsion angle for sound correction. The driving system control unit 7100 performs arithmetic processing using a signal input from the vehicle state detecting section 7110, and controls the internal combustion engine, the driving motor, an electric power steering device, the brake device, and the like.
The body system control unit 7200 controls the operation of various kinds of devices provided to the vehicle body in accordance with various kinds of programs. For example, the body system control unit 7200 functions as a control device for a keyless entry system, a smart key system, a power window device, or various kinds of lamps such as a headlamp, a backup lamp, a brake lamp, a turn signal lamp, a fog lamp, or the like. In this case, radio waves transmitted from a mobile device as an alternative to a key, or signals of various kinds of switches, can be input to the body system control unit 7200. The body system control unit 7200 receives these input radio waves or signals, and controls a door lock device, the power window device, the lamps, or the like, of the vehicle.
The battery control unit 7300 controls a secondary battery 7310, which is a power supply source for the driving motor, in accordance with various kinds of programs. For example, the battery control unit 7300 is supplied with information about a battery temperature, a battery output voltage, an amount of charge remaining in the battery, or the like, from a battery device including the secondary battery 7310. The battery control unit 7300 performs arithmetic processing using these signals, and performs control for regulating the temperature of the secondary battery 7310 or controls a cooling device provided to the battery device or the like.
The outside-vehicle information detecting unit 7400 detects information about the outside of the vehicle including the vehicle control system 7000. For example, the outside-vehicle information detecting unit 7400 is connected with at least one of an imaging section 7410 and an outside-vehicle information detecting section 7420. The imaging section 7410 includes at least one of a time-of-flight (ToF) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras. The outside-vehicle information detecting section 7420, for example, includes at least one of an environmental sensor for detecting current atmospheric conditions or weather conditions and a peripheral information detecting sensor for detecting another vehicle, an obstacle, a pedestrian, or the like, on the periphery of the vehicle including the vehicle control system 7000.
The environmental sensor, for example, may be at least one of a rain drop sensor detecting rain, a fog sensor detecting a fog, a sunshine sensor detecting a degree of sunshine, and a snow sensor detecting a snowfall. The peripheral information detecting sensor may be at least one of an ultrasonic sensor, a radar device, and a LIDAR device (Light detection and Ranging device, or Laser imaging detection and ranging device). Each of the imaging section 7410 and the outside-vehicle information detecting section 7420 may be provided as an independent sensor or device, or may be provided as a device in which a plurality of sensors or devices are integrated.
The outside-vehicle information detecting unit 7400 makes the imaging section 7410 image an image of the outside of the vehicle, and receives imaged image data. In addition, the outside-vehicle information detecting unit 7400 receives detection information from the outside-vehicle information detecting section 7420 connected to the outside-vehicle information detecting unit 7400. In a case where the outside-vehicle information detecting section 7420 is an ultrasonic sensor, a radar device, or a LIDAR device, the outside-vehicle information detecting unit 7400 transmits an ultrasonic wave, an electromagnetic wave, or the like, and receives information of a received reflected wave. On the basis of the received information, the outside-vehicle information detecting unit 7400 may perform processing of detecting an object such as a human, a vehicle, an obstacle, a sign, a character on a road surface, or the like, or processing of detecting a distance thereto. The outside-vehicle information detecting unit 7400 may perform environment recognition processing of recognizing a rainfall, a fog, road surface conditions, or the like, on the basis of the received information. The outside-vehicle information detecting unit 7400 may calculate a distance to an object outside the vehicle on the basis of the received information. In addition, on the basis of the received image data, the outside-vehicle information detecting unit 7400 may perform image recognition processing of recognizing a human, a vehicle, an obstacle, a sign, a character on a road surface, or the like, or processing of detecting a distance thereto. The outside-vehicle information detecting unit 7400 may subject the received image data to processing such as distortion correction, alignment, or the like, and combine the image data imaged by a plurality of different imaging sections 7410 to generate a bird's-eye view image or a panoramic image. The outside-vehicle information detecting unit 7400 may perform viewpoint conversion processing using the image data imaged by the imaging section 7410 including the different imaging parts.
The in-vehicle information detecting unit 7500 detects information about the inside of the vehicle. The in-vehicle information detecting unit 7500 is, for example, connected with a driver state detecting section 7510 that detects the state of a driver. The driver state detecting section 7510 may include a camera that images the driver, a biosensor that detects biological information of the driver, a microphone that collects sound within the interior of the vehicle, or the like. The biosensor is, for example, disposed in a seat surface, the steering wheel, or the like, and detects biological information of an occupant sitting in a seat or the driver holding the steering wheel. On the basis of detection information input from the driver state detecting section 7510, the in-vehicle information detecting unit 7500 may calculate a degree of fatigue of the driver or a degree of concentration of the driver, or may determine whether the driver is dozing. The in-vehicle information detecting unit 7500 may subject an audio signal obtained by the collection of the sound to processing such as noise canceling processing, or the like.
The integrated control unit 7600 controls general operation within the vehicle control system 7000 in accordance with various kinds of programs. The integrated control unit 7600 is connected with an input section 7800. The input section 7800 is implemented by a device capable of input operation by an occupant, such, for example, as a touch panel, a button, a microphone, a switch, a lever, or the like. The integrated control unit 7600 may be supplied with data obtained by voice recognition of voice input through the microphone. The input section 7800 may, for example, be a remote control device using infrared rays or other radio waves, or an external connecting device such as a mobile telephone, a personal digital assistant (PDA), or the like, that supports operation of the vehicle control system 7000. The input section 7800 may be, for example, a camera. In that case, an occupant can input information by gesture. Alternatively, data may be input which is obtained by detecting the movement of a wearable device that an occupant wears. Further, the input section 7800 may, for example, include an input control circuit, or the like, that generates an input signal on the basis of information input by an occupant, or the like, using the above-described input section 7800, and which outputs the generated input signal to the integrated control unit 7600. An occupant, or the like, inputs various kinds of data or gives an instruction for processing operations to the vehicle control system 7000 by operating the input section 7800.
The storage section 7690 may include a read only memory (ROM) that stores various kinds of programs executed by the microcomputer and a random access memory (RAM) that stores various kinds of parameters, operation results, sensor values, or the like. In addition, the storage section 7690 may be implemented by a magnetic storage device such as a hard disc drive (HDD), or the like, a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
The general-purpose communication I/F 7620 is a communication I/F used widely, which communication I/F mediates communication with various apparatuses present in an external environment 7750. The general-purpose communication I/F 7620 may implement a cellular communication protocol such as global system for mobile communications (GSM (registered trademark)), worldwide interoperability for microwave access (WiMAX (registered trademark)), long term evolution (LTE (registered trademark)), LTE-advanced (LTE-A), or the like, or another wireless communication protocol such as wireless LAN (referred to also as wireless fidelity (Wi-Fi (registered trademark))), Bluetooth (registered trademark), or the like. The general-purpose communication I/F 7620 may, for example, connect to an apparatus (for example, an application server or a control server) present on an external network (for example, the Internet, a cloud network, or a company-specific network) via a base station or an access point. In addition, the general-purpose communication I/F 7620 may connect to a terminal present in the vicinity of the vehicle (which terminal is, for example, a terminal of the driver, a pedestrian, or a store, or a machine type communication (MTC) terminal) using a peer to peer (P2P) technology, for example.
The dedicated communication I/F 7630 is a communication I/F that supports a communication protocol developed for use in vehicles. The dedicated communication I/F 7630 may implement a standard protocol such, for example, as wireless access in vehicle environment (WAVE), which is a combination of institute of electrical and electronic engineers (IEEE) 802.11p as a lower layer and IEEE 1609 as a higher layer, dedicated short range communications (DSRC), or a cellular communication protocol. The dedicated communication I/F 7630 typically carries out V2X communication as a concept including one or more of communication between a vehicle and a vehicle (Vehicle to Vehicle), communication between a vehicle and a road (Vehicle to Infrastructure), communication between a vehicle and a home (Vehicle to Home), and communication between a vehicle and a pedestrian (Vehicle to Pedestrian).
The positioning section 7640, for example, performs positioning by receiving a global navigation satellite system (GNSS) signal from a GNSS satellite (for example, a GPS signal from a global positioning system (GPS) satellite), and generates positional information including the latitude, longitude, and altitude of the vehicle. Incidentally, the positioning section 7640 may identify a current position by exchanging signals with a wireless access point, or may obtain the positional information from a terminal such as a mobile telephone, a personal handyphone system (PHS), or a smart phone that has a positioning function.
The beacon receiving section 7650, for example, receives a radio wave or an electromagnetic wave transmitted from a radio station installed on a road, or the like, and thereby obtains information about the current position, congestion, a closed road, a necessary time, or the like. Incidentally, the function of the beacon receiving section 7650 may be included in the dedicated communication I/F 7630 described above.
The in-vehicle device I/F 7660 is a communication interface that mediates connection between the microcomputer 7610 and various in-vehicle devices 7760 present within the vehicle. The in-vehicle device I/F 7660 may establish a wireless connection using a wireless communication protocol such as wireless LAN, Bluetooth (registered trademark), near field communication (NFC), or wireless universal serial bus (WUSB). In addition, the in-vehicle device I/F 7660 may establish a wired connection by universal serial bus (USB), high-definition multimedia interface (HDMI (registered trademark)), mobile high-definition link (MHL), or the like, via a connection terminal (and a cable if necessary) not depicted in the figures. The in-vehicle devices 7760 may, for example, include at least one of a mobile device and a wearable device possessed by an occupant and an information device carried into or attached to the vehicle. The in-vehicle devices 7760 may also include a navigation device that searches for a path to an arbitrary destination. The in-vehicle device I/F 7660 exchanges control signals or data signals with these in-vehicle devices 7760.
The vehicle-mounted network I/F 7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010. The vehicle-mounted network I/F 7680 transmits and receives signals, or the like, in conformity with a predetermined protocol supported by the communication network 7010.
The microcomputer 7610 of the integrated control unit 7600 controls the vehicle control system 7000 in accordance with various kinds of programs on the basis of information obtained via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning section 7640, the beacon receiving section 7650, the in-vehicle device I/F 7660, and the vehicle-mounted network I/F 7680. For example, the microcomputer 7610 may calculate a control target value for the driving force generating device, the steering mechanism, or the braking device on the basis of the obtained information about the inside and outside of the vehicle, and output a control command to the driving system control unit 7100. For example, the microcomputer 7610 may perform cooperative control intended to implement functions of an advanced driver assistance system (ADAS), which functions include collision avoidance or shock mitigation for the vehicle, following driving based on a following distance, vehicle speed maintaining driving, a warning of collision of the vehicle, a warning of deviation of the vehicle from a lane, or the like. In addition, the microcomputer 7610 may perform cooperative control intended for automatic driving, which makes the vehicle travel autonomously without depending on the operation of the driver, or the like, by controlling the driving force generating device, the steering mechanism, the braking device, or the like, on the basis of the obtained information about the surroundings of the vehicle.
The microcomputer 7610 of the integrated control unit 7600 may in particular control the vehicle control system 7000 in accordance with various kinds of programs on the basis of information obtained via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning section 7640, the beacon receiving section 7650, the in-vehicle device I/F 7660, and the vehicle-mounted network I/F 7680. For example, the microcomputer 7610 may implement adaptive algorithms for generating artificial sound as described in the embodiment above. Likewise, the microcomputer 7610 may control vehicle to active noise control device communication as described in the embodiments above.
The microcomputer 7610 may generate three-dimensional distance information between the vehicle and an object such as a surrounding structure, a person, or the like, and generate local map information including information about the surroundings of the current position of the vehicle, on the basis of information obtained via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning section 7640, the beacon receiving section 7650, the in-vehicle device I/F 7660, and the vehicle-mounted network I/F 7680. This information may be used as input for an adaptive sound generation as described in the embodiments above. In addition, the microcomputer 7610 may predict danger such as collision of the vehicle, approaching of a pedestrian, or the like, an entry to a closed road, or the like, on the basis of the obtained information, and generate a warning signal. The warning signal may, for example, be a signal for producing a warning sound or lighting a warning lamp.
The sound/image output section 7670 transmits an output signal of at least one of a sound and an image to an output device capable of visually or audibly notifying information to an occupant of the vehicle or the outside of the vehicle. In particular, the sound/image output section 7670 may be used to generate artificial sound as described in the embodiment above. In the example of Fig. 9, an audio speaker 7710, a display section 7720, and an instrument panel 7730 are illustrated as the output device. The display section 7720 may, for example, include at least one of an on-board display and a head-up display. The display section 7720 may have an augmented reality (AR) display function. The output device may be other than these devices, and may be another device such as headphones, a wearable device such as an eyeglass type display worn by an occupant, or the like, a projector, a lamp, or the like. In a case where the output device is a display device, the display device visually displays results obtained by various kinds of processing performed by the microcomputer 7610 or information received from another control unit in various forms such as text, an image, a table, a graph, or the like. In addition, in a case where the output device is an audio output device, the audio output device converts an audio signal constituted of reproduced audio data or sound data, or the like, into an analog signal, and audibly outputs the analog signal. Together with acquired parameters of the vehicle state detecting section 7110, the microcomputer 7610 can calculate sound correction parameters to adjust the monopole synthesis system for vehicle frame transformations (such as described with regard to Figs. 7 and 8).
Incidentally, at least two control units connected to each other via the communication network 7010 in the example depicted in Fig. 9 may be integrated into one control unit. Alternatively, each individual control unit may include a plurality of control units. Further, the vehicle control system 7000 may include another control unit not depicted in the figures. In addition, part or the whole of the functions performed by one of the control units in the above description may be assigned to another control unit. That is, predetermined arithmetic processing may be performed by any of the control units as long as information is transmitted and received via the communication network 7010. Similarly, a sensor or a device connected to one of the control units may be connected to another control unit, and a plurality of control units may mutually transmit and receive detection information via the communication network 7010.
Incidentally, a computer program for realizing the functions of the information processing device 100 according to the present embodiment described with reference to Fig. 9 can be implemented in one of the control units, or the like. In addition, a computer readable recording medium storing such a computer program can also be provided. The recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like. In addition, the above-described computer program may be distributed via a network, for example, without the recording medium being used.
*** It should be recognized that the embodiments describe methods with an exemplary ordering of method steps. The specific ordering of method steps is however given for illustrative purposes only and should not be construed as binding.
It should also be noted that the division of the control or circuitry of Fig. 9 into units is only made for illustration purposes and that the present disclosure is not limited to any specific division of functions in specific units. For instance, at least parts of the circuitry could be implemented by a respective programmed processor, field programmable gate array (FPGA), dedicated circuits, and the like.
All units and entities described in this specification and claimed in the appended claims can, if not stated otherwise, be implemented as integrated circuit logic, for example on a chip, and functionality provided by such units and entities can, if not stated otherwise, be implemented by software.
In so far as the embodiments of the disclosure described above are implemented, at least in part, using software-controlled data processing apparatus, it will be appreciated that a computer program providing such software control and a transmission, storage or other medium by which such a computer program is provided are envisaged as aspects of the present disclosure.
Note that the present technology can also be configured as described below:
(1) An apparatus comprising circuitry configured to render audio considering an automotive chassis shift caused by vehicle dynamics.
(2) The apparatus of (1), wherein the circuitry is configured to render 3d audio considering the automotive chassis shift.
(3) The apparatus of anyone of (1) or (2), wherein the rendering audio comprises providing noise cancellation considering the automotive chassis shift.
(4) The apparatus of anyone of (1) to (3), wherein the circuitry is configured to determine chassis shift based on information on vehicle dynamics.
(5) The apparatus of anyone of (1) to (4), wherein the circuitry is configured to obtain the information on vehicle dynamics from sensors integrated in a vehicle.
(6) The apparatus of (5), wherein the information on vehicle dynamics comprises at least one of yaw-rate, pitch-rate, roll-over rate, wheel-speeds, steering wheel angle, throttling forces, and braking forces, and wherein the circuitry is configured to process the chassis shift based on these parameters.
(7) The apparatus of anyone of (1) to (6), wherein the circuitry is configured to determine chassis shift by calculating speaker positions or corrections of speaker positions based on information on vehicle dynamics.
(8) The apparatus of anyone of (1) to (7), wherein the circuitry is configured to determine chassis shift by calculating a torsion of the chassis.
(9) The apparatus of anyone of (1) to (8), wherein the circuitry is configured to calculate the torsion of the chassis based on a predefined torsional stiffness of the vehicle body.
(10) The apparatus of anyone of (1) to (9), wherein the circuitry is configured to determine one or more forces on the tires of a vehicle and to determine a chassis torsion based on the determined forces on the tires.
(11) The apparatus of anyone of (1) to (10), wherein the circuitry is configured to determine a vehicle lateral acceleration and to determine a chassis torsion based on the determined vehicle lateral acceleration.
(12) The apparatus of anyone of (1) to (11), wherein the circuitry is configured to determine resonant excitations of the suspension system to predict chassis shifts.
(13) The apparatus of anyone of (1) to (12), wherein the circuitry is configured to obtain direct measurements of movements of the chassis.
(14) The apparatus of anyone of (1) to (13), wherein the circuitry is configured to determine chassis shifts based on calibration.
(15) 3D audio system comprising the apparatus of anyone of (1) to (14).
(16) Automotive controller comprising the apparatus of anyone of (1) to (14).
(17) Method comprising rendering audio considering an automotive chassis shift caused by vehicle dynamics.
(18) Computer program comprising instructions which, when executed on a processor, cause the processor to render audio considering an automotive chassis shift caused by vehicle dynamics.

Claims

1. An apparatus comprising circuitry configured to render audio considering an automotive chassis shift caused by vehicle dynamics.
2. The apparatus of claim 1, wherein the circuitry is configured to render 3D audio considering the automotive chassis shift.
3. The apparatus of claim 1, wherein the rendering audio comprises providing noise cancellation considering the automotive chassis shift.
4. The apparatus of claim 1, wherein the circuitry is configured to determine chassis shift based on information on vehicle dynamics.
5. The apparatus of claim 1, wherein the circuitry is configured to obtain the information on vehicle dynamics from sensors integrated in a vehicle.
6. The apparatus of claim 5, wherein the information on vehicle dynamics comprises at least one of yaw-rate, pitch-rate, roll-over rate, wheel-speeds, steering wheel angle, throttling forces, and braking forces, and wherein the circuitry is configured to process the chassis shift based on these parameters.
7. The apparatus of claim 1, wherein the circuitry is configured to determine chassis shift by calculating speaker positions or corrections of speaker positions based on information on vehicle dynamics.
8. The apparatus of claim 1 , wherein the circuitry is configured to determine chassis shift by calculating a torsion of the chassis.
9. The apparatus of claim 8, wherein the circuitry is configured to calculate the torsion of the chassis based on a predefined torsional stiffness of the vehicle body.
10. The apparatus of claim 1, wherein the circuitry is configured to determine one or more forces on the tires of a vehicle and to determine a chassis torsion based on the determined forces on the tires.
11. The apparatus of claim 1 , wherein the circuitry is configured to determine a vehicle lateral acceleration and to determine a chassis torsion based on the determined vehicle lateral acceleration.
12. The apparatus of claim 1 , wherein the circuitry is configured to determine resonant excitations of the suspension system to predict chassis shifts.
13. The apparatus of claim 1, wherein the circuitry is configured to obtain direct measurements of movements of the chassis.
14. The apparatus of claim 1, wherein the circuitry is configured to determine chassis shifts based on calibration.
15. 3D audio system comprising the apparatus of claim 1.
16. Automotive controller comprising the apparatus of claim 1.
17. Method comprising rendering audio considering an automotive chassis shift caused by vehicle dynamics.
18. Computer program comprising instructions which, when executed on a processor, cause the processor to render audio considering an automotive chassis shift caused by vehicle dynamics.
PCT/EP2018/067981 2017-07-19 2018-07-03 Method, device and system for the generation of spatial sound fields WO2019015954A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE112018003683.9T DE112018003683T5 (en) 2017-07-19 2018-07-03 Method, device and system for generating spatial sound fields

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP17182199 2017-07-19
EP17182199.4 2017-07-19

Publications (1)

Publication Number Publication Date
WO2019015954A1 true WO2019015954A1 (en) 2019-01-24

Family

ID=59631539

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2018/067981 WO2019015954A1 (en) 2017-07-19 2018-07-03 Method, device and system for the generation of spatial sound fields

Country Status (2)

Country Link
DE (1) DE112018003683T5 (en)
WO (1) WO2019015954A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022149825A1 (en) * 2021-01-06 2022-07-14 삼성전자 주식회사 Electronic device and sound rendering method of electronic device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130156213A1 (en) * 2011-12-16 2013-06-20 Bose Corporation Virtual Audio System Tuning
US20160037282A1 (en) 2014-07-30 2016-02-04 Sony Corporation Method, device and system
US20160144782A1 (en) * 2014-11-21 2016-05-26 Hyundai Motor Company Vehicle, control method of vehicle, and vehicle driving sound control apparatus and method
WO2016165776A1 (en) * 2015-04-17 2016-10-20 Huawei Technologies Co., Ltd. Apparatus and method for driving an array of loudspeakers with drive signals
US20170078822A1 (en) * 2015-09-11 2017-03-16 GM Global Technology Operations LLC Vehicle sound enhancement

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
B. HEISSING; M. ERSOY: "Chassis Handbook: Fundamentals, Driving Dynamics, Components, Mechatronics, Perspectives", 2010, SPRINGER SCIENCE & BUSINESS MEDIA, pages: 76 - 79
HASHFI HAZIMI ET AL: "Vertical bending strength and torsional rigidity analysis of formula student car chassis", AIP CONFERENCE PROCEEDINGS, vol. 1927, 1 January 2018 (2018-01-01), NEW YORK, US, pages 030050, XP055492297, ISSN: 0094-243X, DOI: 10.1063/1.5024109 *

Also Published As

Publication number Publication date
DE112018003683T5 (en) 2020-05-14

Similar Documents

Publication Publication Date Title
US10650798B2 (en) Electronic device, method and computer program for active noise control inside a vehicle
JP6834964B2 (en) Image processing equipment, image processing methods, and programs
JP7294148B2 (en) CALIBRATION DEVICE, CALIBRATION METHOD AND PROGRAM
US10587863B2 (en) Image processing apparatus, image processing method, and program
JP2017220051A (en) Image processing device, image processing method and vehicle
US20220408212A1 (en) Electronic device, method and computer program
JPWO2014174839A1 (en) Vehicle acoustic control apparatus and vehicle acoustic control method
US20220018932A1 (en) Calibration apparatus, calibration method, program, and calibration system and calibration target
US20220306130A1 (en) System, method and computer program to suppress vibrations in a vehicle
WO2018070266A1 (en) Image processing device and image processing method
JP7040513B2 (en) Information processing equipment, information processing method and recording medium
CN111587572A (en) Image processing apparatus, image processing method, and program
US20200175959A1 (en) Apparatus, system, method and computer program
JPWO2018020884A1 (en) Terminal device and device system
WO2019015954A1 (en) Method, device and system for the generation of spatial sound fields
US20220012552A1 (en) Information processing device and information processing method
US20210042886A1 (en) Image processing apparatus, image processing method, and program
US20230356603A1 (en) Information processing apparatus, information processing method, and program
JP2018046353A (en) Communication device and communication system
WO2024062976A1 (en) Information processing device and information processing method
WO2020017172A1 (en) Information processing device, information processing method, and program
US12012039B2 (en) Apparatus and method for generating sound of electrification vehicle
US20230412923A1 (en) Signal processing device, imaging device, and signal processing method
US11753028B1 (en) Pedal control system and method for an electric vehicle
WO2021106558A1 (en) Radar device, radar device manufacturing method, and transceiver

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18733917

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 18733917

Country of ref document: EP

Kind code of ref document: A1