WO2006095688A1 - Information reproducing device, information reproducing method, information reproducing program, and computer-readable recording medium - Google Patents

Information reproducing device, information reproducing method, information reproducing program, and computer-readable recording medium Download PDF

Info

Publication number
WO2006095688A1
WO2006095688A1 PCT/JP2006/304281 JP2006304281W WO2006095688A1 WO 2006095688 A1 WO2006095688 A1 WO 2006095688A1 JP 2006304281 W JP2006304281 W JP 2006304281W WO 2006095688 A1 WO2006095688 A1 WO 2006095688A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
sound
passenger
unit
image
Prior art date
Application number
PCT/JP2006/304281
Other languages
English (en)
Japanese (ja)
Inventor
Koji Koga
Takeshi Sato
Goro Kobayashi
Original Assignee
Pioneer Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pioneer Corporation filed Critical Pioneer Corporation
Publication of WO2006095688A1 publication Critical patent/WO2006095688A1/fr

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04SSTEREOPHONIC SYSTEMS 
    • H04S7/00Indicating arrangements; Control arrangements, e.g. balance control
    • H04S7/30Control circuits for electronic adaptation of the sound field
    • H04S7/302Electronic adaptation of stereophonic sound system to listener position or orientation
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00Arrangements for holding or mounting articles, not otherwise provided for
    • B60R2011/0001Arrangements for holding or mounting articles, not otherwise provided for characterised by position
    • B60R2011/0003Arrangements for holding or mounting articles, not otherwise provided for characterised by position inside the vehicle
    • B60R2011/0012Seats or parts thereof
    • B60R2011/0017Head-rests
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/02Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
    • B60R2011/0276Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof for rear passenger use
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04RLOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R2499/00Aspects covered by H04R or H04S not otherwise provided for in their subgroups
    • H04R2499/10General applications
    • H04R2499/13Acoustic transducers and sound field adaptation in vehicles

Definitions

  • Information reproducing apparatus, information reproducing method, information reproducing program, and computer-readable recording medium
  • The present invention relates to an information reproducing apparatus, an information reproducing method, an information reproducing program, and a computer-readable recording medium that acquire information related to the seating state of a passenger in a vehicle and reproduce audio data and the like.
  • The use of the present invention is not limited to the above-described information reproducing apparatus, information reproducing method, information reproducing program, and computer-readable recording medium.
  • Conventionally, each passenger is identified by the passenger selecting his or her own setting information from setting information registered in advance, or by the passenger himself or herself; this is done, for example, by reading information stored in an ID card with an ID card reader provided for each vehicle seat.
  • The sound image localization of an audio signal is performed by detecting the head shape of the listener, accurately simulating the transfer characteristics from the speaker position to the listener based on a head-related transfer function or the like to obtain filter coefficients, and localizing the sound image using the obtained filter coefficients (a minimal sketch of this kind of HRTF-based filtering follows).
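  • The following is a minimal, illustrative sketch of this kind of HRTF-based localization, not the implementation described in this document: a mono signal is convolved with left and right head-related impulse responses (the filter coefficients) chosen for the desired virtual source direction. The hrir_left and hrir_right arrays are hypothetical inputs that would come from measurement or simulation.

```python
import numpy as np

def localize_with_hrtf(mono_audio: np.ndarray,
                       hrir_left: np.ndarray,
                       hrir_right: np.ndarray) -> np.ndarray:
    """Render a mono signal at a virtual position by filtering it with the
    left/right head-related impulse responses (HRIRs) for that direction."""
    left = np.convolve(mono_audio, hrir_left)     # filter coefficients = HRIR taps
    right = np.convolve(mono_audio, hrir_right)
    n = max(len(left), len(right))
    out = np.zeros((n, 2))
    out[:len(left), 0] = left
    out[:len(right), 1] = right
    peak = max(float(np.max(np.abs(out))), 1e-9)  # avoid clipping / divide-by-zero
    return out / peak
```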
  • In one prior art technique, the in-vehicle environment preferred by each passenger in each seat of the vehicle (driver's seat, front passenger seat, etc.), such as the positions and operating states of the in-vehicle devices, is stored in an IC-based ID card as profile information for each passenger. The in-vehicle equipment is then adjusted by reading the passenger's profile information from the ID card, so that the environment is set to the passenger's preference.
  • In another prior art technique, the face of the user is photographed with a CCD camera, and the width of the user's face and the size of the auricles are detected from this image.
  • Then, the head-related transfer function, that is, the transfer function from the rear speaker to the user's ears, is calculated, and the sound image of the rear speaker is localized using the front speakers by performing filter processing with a DSP so as to reproduce the characteristics of that head-related transfer function.
  • Patent Document 1: Japanese Patent Application Laid-Open No. 2002-104105
  • Patent Document 2: Japanese Patent Application Laid-Open No. 2003-230199
  • However, with the prior art disclosed in Patent Document 1, although in-vehicle device setting information can be stored according to each passenger's preference, there is the problem that, for example, the in-vehicle device settings cannot actively be made to suit the passenger.
  • With the prior art disclosed in Patent Document 2, in which the listener's face image is photographed, filter processing is performed, and the sound image is localized, complicated operations such as image calculation processing and audio filter processing are essential, so there is the problem that, for example, sound image localization accompanying sound reproduction cannot be performed simply and easily.
  • To solve these problems, an information reproducing apparatus according to the present invention comprises: acquisition means for detecting a seating state of a passenger in a vehicle and acquiring information relating to the seating state; identification means for identifying the passenger based on the information relating to the seating state acquired by the acquisition means; reproduction means for reproducing audio data associated with the passenger identified by the identification means; and control means for controlling, based on the identification result of the identification means, the sound field formed by the sound of the audio data reproduced by the reproduction means.
  • The information reproducing method according to the invention of claim 7 includes: an acquisition step of detecting a seating state of a passenger in a vehicle and acquiring information on the seating state; an identification step of identifying the passenger based on the information on the seating state acquired in the acquisition step; a reproduction step of reproducing audio data associated with the passenger identified in the identification step; and a control step of controlling, based on the identification result of the identification step, the sound field formed by the sound of the audio data reproduced in the reproduction step.
  • An information reproducing program according to the invention of claim 8 causes a computer to execute the information reproducing method according to claim 7.
  • A computer-readable recording medium according to the present invention records the information reproducing program according to claim 8.
  • FIG. 1 is an explanatory diagram showing an example of the interior of a vehicle in which the information reproducing apparatus according to the embodiment is mounted.
  • FIG. 2 is a block diagram showing an example of a functional configuration of an information reproducing apparatus according to an embodiment.
  • FIG. 3 is a flowchart showing an example of an information reproduction processing procedure of the information reproducing apparatus according to the embodiment.
  • FIG. 4 is a block diagram showing an example of a hardware configuration of an information reproducing apparatus according to the embodiment.
  • FIG. 5 is a block diagram illustrating an example of an internal configuration of an image processing unit in the information reproducing apparatus according to the embodiment.
  • FIG. 6 is a block diagram illustrating an example of the internal configuration of the audio processing unit in the information reproducing apparatus according to the embodiment.
  • FIG. 7 is an explanatory diagram showing an example of the inside of a vehicle on which the information reproducing apparatus according to the example is mounted.
  • FIG. 8 is a flowchart showing an example of an information reproduction processing procedure of the information reproducing apparatus according to the embodiment.
  • FIG. 1 is an explanatory diagram showing an example of the interior of a vehicle in which the information reproducing apparatus according to the embodiment of the present invention is mounted.
  • As shown in FIG. 1, an image display device (display) 121a, an audio output device (speaker) 122, and an information reproducing device 126a are provided around the driver's seat 111 and the passenger seat 112.
  • A photographing device (camera) 123 is provided on the ceiling 114 of the vehicle, and an image display device (display) 121b and an information reproducing device 126b are provided on the back of the passenger seat 112 for the passenger in the rear seat 113.
  • An audio output device (speaker) (not shown) is provided behind the rear seat 113, and a photographing device (camera) 123 is also provided in each information reproducing device 126 (126a, 126b).
  • FIG. 2 is a block diagram showing an example of a functional configuration of the information reproducing apparatus according to the embodiment of the present invention.
  • The information reproducing apparatus includes an acquisition unit 201, an identification unit 202, a reproduction unit 203, a control unit 204, and a storage unit 205, and further includes a photographing device (camera) 206, an image display device (display) 207, and an audio output device (speaker) 208.
  • The information reproducing apparatus may have a structure that can be attached to and detached from the vehicle.
  • The acquisition unit 201 detects the seating state of a passenger in each of the seats 111 to 113 of the vehicle in FIG. 1, and acquires information on the seating state.
  • Each of the seats 111 to 113 is a seat of the vehicle in which the information reproducing apparatus is mounted.
  • The passenger is a person who rides in the vehicle, and includes a person sitting in the driver's seat 111 as well as persons sitting in the passenger seat 112 and the rear seat 113.
  • The seating state is detected, for example, by detecting the position in the vehicle where the passenger is seated, using image data of an image captured by the photographing device 206 described later.
  • The information on the seating state includes information such as the seating position, sitting height, face image, and ear position of the passenger.
  • The acquisition unit 201 may also be configured by so-called seating sensors provided in each of the seats 111 to 113, for example.
  • The information regarding the seating state may also include, for example, information regarding the body parts of the passenger.
  • The body-part information of the passenger refers to the positions of the respective parts of the body.
  • Specifically, the information on the body parts is information on the positions of parts of the body such as the shoulders, abdomen, and head, the lengths of the arms and legs, and the passenger's height, weight, and sitting height.
  • The information on the body parts also includes, for example, information on the positions of the eyes and ears in the head. Further, the information on the body parts may be information representing abilities of the body parts, such as visual acuity and hearing ability, in addition to such positional information.
  • When acquiring information related to body parts as information related to the seating state, the acquisition unit 201 analyzes, for example, image data captured by the photographing device 206, and calculates and acquires the body-part information based on the resulting image of the passenger's body parts, as sketched below. Note that the acquisition unit 201 may instead acquire the body-part information through an input operation by the passenger or by reading it from an external recording device.
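  • As a purely illustrative sketch of the kind of geometric estimation the acquisition unit could perform (the face detector, camera geometry, and body proportions assumed here are not taken from this document), body-part information such as head centre, ear height, and sitting height might be derived from a detected face bounding box:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class SeatingInfo:
    seat_position: str                 # e.g. "driver", "front_passenger", "rear_left"
    head_center_xy: Tuple[int, int]    # pixel coordinates of the head centre
    ear_height_px: int                 # estimated vertical pixel position of the ears
    sitting_height_px: int             # head top measured from a seat reference line

def estimate_seating_info(face_box: Tuple[int, int, int, int],
                          seat_position: str,
                          seat_reference_y: int) -> SeatingInfo:
    """face_box = (x, y, w, h) from any external face detector (assumed input).
    Ears are approximated at mid-height of the face box; sitting height is the
    distance from the top of the head to a per-seat reference line calibrated
    in advance. All proportions are rough anthropometric assumptions."""
    x, y, w, h = face_box
    head_center = (x + w // 2, y + h // 2)
    ear_height = y + h // 2                 # ears roughly level with face centre
    sitting_height = seat_reference_y - y   # larger value = taller passenger
    return SeatingInfo(seat_position, head_center, ear_height, sitting_height)
```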
  • The identification unit 202 identifies the passenger based on the information regarding the seating state acquired by the acquisition unit 201. Here, identification means determining whether reproduction information for image data on the image display device 207 and for audio data on the audio output device 208 exists for the passenger, and which reproduction information is associated with that passenger.
  • For example, face authentication processing may be performed using the passenger's face image information included in the information related to the seating state in order to identify and recognize the individual. Note that the information on the seating state and information such as the identification result of the identification unit 202 are stored in the storage unit 205 described later.
  • The reproduction unit 203 reproduces image data and audio data associated with the passenger identified by the identification unit 202.
  • The image data and audio data are reproduced based on the reproduction information stored in the storage unit 205.
  • Examples of the image data reproduced by the reproduction unit 203 include image data obtained by recording a TV program and image data captured by the photographing device 206.
  • The audio data reproduced by the reproduction unit 203 includes audio data such as music and sound effects.
  • Based on the identification result of the identification unit 202, the control unit 204 controls the sound field formed by the sound that the audio output device 208 outputs from the audio data reproduced by the reproduction unit 203. Specifically, the sound field control includes, for example, adjusting the volume of the sound output from the audio output device 208, equalization processing, and sound image localization.
  • Here, a case where the control unit 204 performs sound image localization as the control of the sound field is described.
  • The sound image localization is performed using, for example, a known sound image localization method.
  • As the sound image localization method, it is possible to use, for example, a method of filtering the audio data based on a head-related transfer function to generate a virtual sound source in the sound field inside the vehicle, or a method of changing the audio output delay characteristics of the audio output device 208 (a minimal delay-based sketch follows).
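  • A minimal sketch of the second approach (changing per-speaker output delays) is shown below; it time-aligns each speaker's wavefront at the listening position. The speaker coordinates, listener position, and speed of sound are illustrative assumptions, not values from this document.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, room-temperature assumption

def delay_align(channels: dict, speaker_pos: dict, listener_pos, fs: int) -> dict:
    """channels: speaker name -> mono samples; speaker_pos: name -> (x, y) in metres.
    Pads each channel with zeros so that sound from every speaker arrives at the
    listener position simultaneously (the farthest speaker gets no extra delay)."""
    listener = np.asarray(listener_pos, dtype=float)
    dist = {name: float(np.linalg.norm(np.asarray(pos, dtype=float) - listener))
            for name, pos in speaker_pos.items()}
    farthest = max(dist.values())
    aligned = {}
    for name, samples in channels.items():
        extra_s = (farthest - dist[name]) / SPEED_OF_SOUND    # seconds of delay
        pad = int(round(extra_s * fs))                        # delay in samples
        aligned[name] = np.concatenate([np.zeros(pad), np.asarray(samples, dtype=float)])
    return aligned
```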
  • The storage unit 205 stores the information on the seating state and information such as the identification result of the identification unit 202.
  • The storage unit 205 also stores reproduction information of the image data and audio data associated with each passenger.
  • The reproduction information includes, for example, information on the reproduction history and music selection history of image data and audio data reflecting each passenger's preferences, and information for selecting images and audio considered highly relevant to the identified passenger based on attributes such as age and gender.
  • Depending on its type, the information stored in the storage unit 205 is stored temporarily, or is stored in a learned or statistical manner like a database; a minimal selection sketch based on such stored information follows.
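  • As a purely illustrative sketch (the field names and scoring rule are assumptions, not the data model of this document), reproduction information could be chosen for an identified passenger by combining the stored play history with coarse attributes such as age:

```python
from collections import Counter

def select_tracks(passenger: dict, history: list, catalog: dict, limit: int = 5) -> list:
    """passenger: {'id': ..., 'age': ...}; history: list of (passenger_id, track_id);
    catalog: track_id -> {'genre': ..., 'min_age': ...}. Hypothetical schema.
    Prefers tracks the passenger has played before, then age-appropriate tracks."""
    played = Counter(track for pid, track in history if pid == passenger["id"])
    ranked = sorted(
        catalog,
        key=lambda t: (played[t], catalog[t].get("min_age", 0) <= passenger["age"]),
        reverse=True,
    )
    return ranked[:limit]
```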
  • The photographing device 206 captures images of the vehicle interior, for example.
  • Image data of the captured images is provided to the acquisition unit 201.
  • The image display device 207 displays the image data reproduced by the reproduction unit 203 on its display screen.
  • The audio output device 208 outputs, into the vehicle interior, sound based on the audio data reproduced by the reproduction unit 203 and controlled by the control unit 204.
  • FIG. 3 is a flowchart showing an example of an information reproduction processing procedure of the information reproducing apparatus according to the embodiment of the present invention.
  • First, an image of the vehicle interior is captured (step S301), and it is determined whether a passenger is seated in any of the seats 111 to 113 (see FIG. 1; the same applies hereinafter) (step S302). If it is determined that no passenger is seated (step S302: No), the process returns to step S301 and an image is captured again.
  • If it is determined that a passenger is seated (step S302: Yes), the acquisition unit 201 (see FIG. 2; the same applies hereinafter) detects the seating state and acquires information on the seating state (step S303).
  • Next, the identification unit 202 identifies the passenger (step S304).
  • Then, based on the reproduction information of the image data and audio data associated with the identified passenger stored in the storage unit 205 (see FIG. 2; the same applies hereinafter), the reproduction unit 203 reproduces images and sound on the image display device 207 (see FIG. 2; the same applies hereinafter) and the audio output device 208 (see FIG. 2; the same applies hereinafter) (step S305).
  • When the reproduction unit 203 reproduces the image data and audio data, the control unit 204 controls, based on the identification result of the identification unit 202, the sound field formed by the sound of the audio data reproduced by the reproduction unit 203 (step S306).
  • The control of the sound field in step S306 includes, for example, performing sound image localization according to the seating position of the passenger so as to form a sound field in which the sound is heard optimally by the passenger.
  • As described above, according to the embodiment, the passenger is identified based on the information on the seating state of the passenger in the vehicle, the audio data associated with the identified passenger is reproduced, and the sound field formed by the sound of the reproduced audio data can be controlled based on the identification result. For this reason, the sound field inside the vehicle can be optimized automatically in accordance with the passenger, without manually performing input operations for sound field control. A minimal end-to-end sketch of this processing flow follows.
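  • As a minimal end-to-end sketch of the flow in FIG. 3 (steps S301 to S306), with every object standing in for the corresponding unit and therefore using hypothetical interfaces:

```python
def information_reproduction_loop(camera, acquisition, identification, reproduction, control):
    """One simplified rendering of the FIG. 3 procedure; each argument is an object
    whose methods stand in for the corresponding unit (illustrative interfaces only)."""
    while True:
        image = camera.capture()                                   # S301: photograph interior
        if not acquisition.someone_seated(image):                  # S302: seated?
            continue                                               # No -> capture again
        seating_info = acquisition.get_seating_info(image)         # S303: acquire seating info
        passenger = identification.identify(seating_info)          # S304: identify passenger
        audio = reproduction.play_for(passenger)                   # S305: play associated data
        control.localize_sound(audio, seating_info, passenger)     # S306: control sound field
```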
  • FIG. 4 is a block diagram showing an example of the hardware configuration of the information reproducing apparatus according to the embodiment of the present invention.
  • The information reproducing apparatus is mounted on the vehicle in a detachable structure as described above, and includes a navigation control unit 400, a user operation unit (remote controller, touch panel, etc.) 401, a display unit (monitor) 402, a position acquisition unit 403, a recording medium 404, a recording medium decoding unit 405, a guidance sound output unit 406, a communication unit 407, a route search unit 408, a route guidance unit 409, a guidance sound generation unit 410, a speaker 411, an image processing unit 412, an image input/output I/F 413, an audio processing unit 414, and a photographing unit 415.
  • The navigation control unit 400 controls the entire information reproducing apparatus and, by performing various arithmetic processes according to control programs, comprehensively controls each unit included in the information reproducing apparatus.
  • The navigation control unit 400 is configured by, for example, a CPU (Central Processing Unit) that executes predetermined arithmetic processing, a ROM (Read Only Memory) that stores various control programs, a RAM (Random Access Memory) that functions as a work area for the CPU, and the like.
  • For example, when guiding the vehicle along a route, the navigation control unit 400 calculates the position on the map where the vehicle is traveling, based on the current position information of the vehicle acquired by the position acquisition unit 403 and the map information obtained from the recording medium 404 via the recording medium decoding unit 405, and outputs the calculation result to the display unit 402.
  • When performing the above-described route guidance, the navigation control unit 400 exchanges information related to route guidance with the route search unit 408, the route guidance unit 409, and the guidance sound generation unit 410, and outputs the resulting information to the display unit 402 and the guidance sound output unit 406.
  • The user operation unit 401 outputs information input by the user, such as characters, numerical values, and various instructions, to the navigation control unit 400.
  • As the configuration of the user operation unit 401, various known forms such as a push-button switch that detects physical pressing/non-pressing, a touch panel, a keyboard, and a joystick can be employed.
  • The user operation unit 401 may also be configured, for example, in a form in which input operations are performed by voice, using a microphone that picks up sound from the outside.
  • The user operation unit 401 may be provided integrally with the information reproducing apparatus, or, like a remote controller, may be configured to be operable from a position separated from the information reproducing apparatus.
  • The user operation unit 401 may be configured in any one of the various forms described above, or in a plurality of forms. The user inputs information by performing an input operation appropriate to the form of the user operation unit 401.
  • Information input by an input operation of the user operation unit 401 includes, for example, destination information for navigation: specifically, for example, when the information reproducing apparatus is provided in a vehicle, the point that the passengers of the vehicle want to reach is set. Information input to the user operation unit 401 also includes, regarding information reproduction, selection information for the audio reproduced by the audio processing unit 414 described later: specifically, for example, audio data such as music desired by the passengers of the vehicle is selected and set.
  • When a touch panel is adopted as the form of the user operation unit 401, the touch panel is used by being stacked on the display screen side of the display unit 402.
  • In this case, the input by a touch operation is recognized by managing the display timing of the display unit 402 and the operation timing on the touch panel (user operation unit 401) together with its position coordinates.
  • By adopting a touch panel stacked on the display unit 402 as the form of the user operation unit 401, a large amount of information can be input without enlarging the user operation unit 401.
  • As this touch panel, various known touch panels such as a resistive film type and a pressure-sensitive type can be adopted.
  • The display unit 402 includes, for example, a CRT (Cathode Ray Tube), a TFT liquid crystal display, an organic EL display, a plasma display, or the like.
  • Specifically, the display unit 402 can be configured by, for example, a video I/F (not shown) and a video display device connected to the video I/F.
  • The video I/F consists of, for example, a graphics controller that controls the entire display device, a buffer memory such as a VRAM (Video RAM), and a control IC or GPU (Graphics Processing Unit) that controls the display based on image information output from the graphics controller.
  • The display unit 402 displays icons, cursors, menus, windows, and various information such as characters and images. The display unit 402 also displays image data processed by the image processing unit 412 described later.
  • The position acquisition unit 403 acquires the current position information (latitude and longitude information) of the vehicle on which the information reproducing apparatus is mounted, for example, by receiving radio waves from artificial satellites.
  • The current position information is obtained by receiving radio waves from artificial satellites and determining the geometric position relative to those satellites, and can be measured anywhere on the earth.
  • The position acquisition unit 403 includes a GPS (Global Positioning System) antenna (not shown).
  • The position acquisition unit 403 can be configured by, for example, a tuner that demodulates the radio waves received from the artificial satellites, an arithmetic circuit that calculates the current position based on the demodulated information, and the like.
  • The radio wave from the artificial satellites is the L1 wave, a 1.57542 GHz carrier wave carrying the C/A (Coarse and Access) code and a navigation message.
  • Using this radio wave, the current position (latitude and longitude) of the vehicle on which the information reproducing apparatus is mounted is detected.
  • In detecting the current position, information collected by various sensors such as a vehicle speed sensor and a gyro sensor may also be taken into account.
  • The vehicle speed sensor detects the vehicle speed from the output shaft of the transmission of the vehicle in which the information reproducing apparatus is mounted.
  • The angular velocity sensor detects the angular velocity when the vehicle turns and outputs angular velocity information and relative heading information.
  • The mileage sensor counts the pulses of a pulse signal output as the wheels rotate, calculates the number of pulses per wheel rotation, and outputs mileage information based on that pulse count; a small worked example follows.
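  • For illustration only (the pulse count per revolution and wheel circumference below are assumed values, not figures from this document), the mileage follows from the pulse count as:

```python
def mileage_m(pulse_count: int,
              pulses_per_rev: int = 4,
              wheel_circumference_m: float = 1.9) -> float:
    """Distance travelled = wheel revolutions x wheel circumference."""
    revolutions = pulse_count / pulses_per_rev
    return revolutions * wheel_circumference_m

# Example: 2000 pulses at 4 pulses/rev with a 1.9 m circumference -> 950.0 m
```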
  • The inclination angle sensor detects the inclination angle of the road surface and outputs inclination angle information.
  • The lateral G sensor detects lateral G, the outward acceleration generated by centrifugal force when the vehicle corners, and outputs lateral G information.
  • The current position information of the vehicle acquired by the position acquisition unit 403 and the information detected by the vehicle speed sensor, gyro sensor, angular velocity sensor, mileage sensor, inclination angle sensor, and lateral G sensor are output to the navigation control unit 400.
  • The recording medium 404 records various control programs and various information in a state readable by a computer.
  • The recording medium 404 accepts the writing of information by the recording medium decoding unit 405 and records the written information in a nonvolatile manner.
  • The recording medium 404 can be realized by an HD (Hard Disk), for example.
  • The recording medium 404 is not limited to an HD. Instead of or in addition to an HD, a medium that can be attached to and detached from the recording medium decoding unit 405 and that is portable, such as a DVD (Digital Versatile Disk) or a CD (Compact Disk), may be used as the recording medium 404.
  • The recording medium 404 is not limited to a DVD or a CD.
  • A portable medium that can be attached to and detached from the recording medium decoding unit 405, such as a CD-ROM (CD-R, CD-RW), an MO (Magneto-Optical disk), or a memory card, can also be used.
  • The recording medium 404 records an information reproducing program, a navigation program, image data, audio data, map information, and the like that realize the present invention.
  • The image data refers to, for example, two-dimensional array values representing images captured inside the vehicle.
  • The audio data refers to data for reproducing sound such as music.
  • The map information includes background information representing features such as buildings, rivers, and the ground surface, and road shape information representing the shape of roads, and is drawn in 2D or 3D.
  • The background information includes background shape information representing the shape of the background and background type information representing the type of the background.
  • The background shape information includes, for example, information indicating representative points, polylines, polygons, and coordinates of features.
  • The background type information includes, for example, text information indicating the name, address, and telephone number of a feature, and type information indicating the type of the feature, such as building or river.
  • The road shape information is information on a road network having a plurality of nodes and links.
  • A node is information indicating an intersection or junction where a plurality of roads meet, such as a three-way junction, a crossroads, or a five-way junction.
  • A link is information indicating a road connecting nodes.
  • Some links have shape interpolation points that allow curved roads to be expressed.
  • The road shape information also has traffic condition information.
  • The traffic condition information indicates, for example, the characteristics of each intersection, the length (distance) of each link, vehicle width, traveling direction, traffic restrictions, road type, and the like.
  • The characteristics of intersections include, for example, complex intersections such as three-way and five-way junctions, intersections where roads branch at shallow angles, intersections around the destination, entrances and junctions of expressways, and intersections with a high route deviation rate.
  • The route deviation rate can be calculated from, for example, past driving history. Examples of road types include expressways, toll roads, and general roads. A minimal sketch of such a node-and-link road network data structure follows.
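  • A minimal sketch of how such a node-and-link road network could be represented is given below; the field names are illustrative and are not the data format of this document.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Node:
    node_id: int
    lat: float
    lon: float
    is_intersection: bool = True        # e.g. crossroads, three-way or five-way junction

@dataclass
class Link:
    start: int                          # node_id of one end
    end: int                            # node_id of the other end
    length_m: float                     # part of the traffic condition information
    road_type: str = "general"          # "expressway", "toll", or "general"
    one_way: bool = False               # traveling-direction restriction
    shape_points: List[Tuple[float, float]] = field(default_factory=list)  # curve interpolation
```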
  • In the present embodiment, the image data, audio data, and map information are described as being recorded on the recording medium 404, but the configuration is not limited to this.
  • The image data, audio data, and map information need not be recorded on a medium provided integrally with the hardware of the information reproducing apparatus, and may be provided outside the information reproducing apparatus.
  • In that case, the information reproducing apparatus acquires the image data and audio data via a network through the communication unit 407, for example.
  • Similarly, the information reproducing apparatus acquires the map information via a network through the communication unit 407, for example.
  • The image data, audio data, and map information acquired in this way may be stored, for example, in the RAM of the navigation control unit 400.
  • The recording medium decoding unit 405 controls the reading and writing of information on the recording medium 404.
  • For example, when an HD is used as the recording medium 404, the recording medium decoding unit 405 is an HDD (Hard Disk Drive).
  • Similarly, when a DVD or a CD is used, the recording medium decoding unit 405 is a DVD drive or a CD drive.
  • When a writable and removable recording medium such as a CD-ROM (CD-R, CD-RW), an MO, or a memory card is used as the recording medium 404, a dedicated drive device capable of writing to that recording medium may be used as appropriate as the recording medium decoding unit 405.
  • The guidance sound output unit 406 reproduces the navigation guidance sound by controlling output to the connected speaker 411.
  • Specifically, the guidance sound output unit 406 can be realized by an audio I/F (not shown) connected to the speaker 411 for audio output.
  • The audio I/F can be configured with, for example, a D/A converter that performs D/A conversion of digital audio data, an amplifier that amplifies the analog audio signal output from the D/A converter, and an A/D converter that performs A/D conversion of analog audio signals.
  • The communication unit 407 performs communication with, for example, another information reproducing apparatus.
  • The communication unit 407 of this embodiment may be, for example, a communication module that communicates with a communication server (not shown) via a base station (not shown), such as a mobile phone, or a communication module that performs direct wireless communication with another information reproducing apparatus.
  • Here, wireless communication is communication performed using radio waves, infrared rays, or ultrasonic waves, without using a wired line as the communication medium.
  • As standards that enable wireless communication, technologies such as wireless LAN, IrDA (Infrared Data Association), HomeRF (Home Radio Frequency), and Bluetooth can be used; of these, wireless LAN, for example, can be used, being preferable in terms of information transfer speed.
  • The communication unit 407 may receive road traffic information such as congestion and traffic regulations regularly (or irregularly). Reception of road traffic information by the communication unit 407 may be performed at the timing when the road traffic information is distributed from the VICS (Vehicle Information and Communication System) center, or by periodically requesting road traffic information from the VICS center.
  • The communication unit 407 can be realized as, for example, an AM/FM tuner, a TV tuner, a VICS/beacon receiver, or another communication device.
  • Since VICS is a well-known technology, detailed explanation is omitted here; briefly, VICS is an information communication system that sends road traffic information such as congestion and traffic regulations, edited and processed at the VICS center, to in-vehicle navigation devices in real time and displays it as text and graphics.
  • VICS information (road traffic information) is provided by “beacons” installed on individual roads and by “FM multiplex broadcasting”.
  • “Beacons” include “radio wave beacons” used mainly on expressways and “optical beacons” used on major general roads.
  • With “FM multiplex broadcasting”, road traffic information over a wide area can be received.
  • With a “beacon”, it is possible to receive the road traffic information needed at the vehicle's current location, such as detailed information on nearby roads based on the position of the vehicle. If the communication method used with another information reproducing apparatus differs from the communication method used to receive image data, audio data, and road traffic information, the communication unit 407 has a plurality of communication means corresponding to the respective methods.
  • The route search unit 408 calculates an optimal route from the current position to the destination based on the current position information of the vehicle acquired by the position acquisition unit 403 and the destination information input by the user; a minimal route-search sketch over the node-and-link network is given below.
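  • A minimal sketch of one standard way such a search could be performed is shown below (Dijkstra's algorithm over the node-and-link structures sketched earlier); the distance-only cost is an assumption, since a real route search unit would also weigh road type, regulations, and traffic information.

```python
import heapq
from collections import defaultdict

def shortest_route(links, start_id, goal_id):
    """links: iterable of Link objects (see the earlier sketch); returns node ids."""
    graph = defaultdict(list)
    for link in links:
        graph[link.start].append((link.end, link.length_m))
        if not link.one_way:
            graph[link.end].append((link.start, link.length_m))
    dist = {start_id: 0.0}
    prev = {}
    heap = [(0.0, start_id)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal_id:
            break
        if d > dist.get(node, float("inf")):
            continue                      # stale heap entry
        for nxt, weight in graph[node]:
            nd = d + weight
            if nd < dist.get(nxt, float("inf")):
                dist[nxt], prev[nxt] = nd, node
                heapq.heappush(heap, (nd, nxt))
    path, node = [], goal_id
    while node != start_id:
        path.append(node)
        node = prev[node]                 # KeyError here means the goal is unreachable
    path.append(start_id)
    return list(reversed(path))
```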
  • The route guidance unit 409 generates real-time route guidance information based on the guidance route information found by the route search unit 408 or the route information received by the communication unit 407, the current position information acquired by the position acquisition unit 403, and the map information obtained from the recording medium 404 via the recording medium decoding unit 405.
  • The route guidance information generated by the route guidance unit 409 is output to the display unit 402 via the navigation control unit 400.
  • The guidance sound generation unit 410 generates tone and voice information corresponding to guidance patterns. That is, based on the route guidance information generated by the route guidance unit 409, it sets a virtual sound source corresponding to the guidance point, generates voice guidance information, and outputs these to the guidance sound output unit 406 via the navigation control unit 400.
  • The speaker 411 reproduces (outputs) the navigation guidance sound output from the guidance sound output unit 406 and the audio output from the audio processing unit 414 described later.
  • Headphones or the like may also be provided as the speaker 411 so that the output form of the guidance sound and audio can be changed as appropriate, instead of the guidance sound and audio forming a sound field throughout the entire vehicle interior.
  • The image processing unit 412 performs image processing on image data acquired from the photographing unit 415 and the communication unit 407 (described later) via the image input/output I/F 413, and on image data recorded on the recording medium 404. Specifically, the image processing unit 412 is configured by, for example, a GPU. The image processing includes, for example, reproducing image data of a recorded TV program obtained via the image input/output I/F 413, based on the reproduction information associated with the passenger. The image processing unit 412 also performs, for example, analysis processing of image data in accordance with control commands from the navigation control unit 400.
  • For example, based on the image data captured by the photographing unit 415, which includes photographing devices such as a camera, the image processing unit 412 identifies and recognizes the passenger from the image of the vehicle interior and analyzes information related to the seating state of the passenger. Specifically, the analysis of the information related to the seating state is carried out, for example, by determining the state in which the passenger is seated from the seating position information of the passenger inside the vehicle and the information on the body parts. For this reason, the image processing unit 412 may be configured to have a DSP (Digital Signal Processor) function, for example.
  • The image input/output I/F 413 inputs and outputs, to and from the outside, the image data handled by the image processing unit 412.
  • The image input/output I/F 413 outputs to the image processing unit 412, for example, image data read from a recording medium 404 storing images captured by a DSC or DVC, image data transferred from a DSC or DVC via USB (Universal Serial Bus) or IEEE 1394 (Institute of Electrical and Electronics Engineers 1394), and image data input from the communication unit 407 by communication such as infrared, and it outputs the image data output from the image processing unit 412 to the recording medium 404 and the communication unit 407.
  • The image input/output I/F 413 has the function of a controller that controls reading/writing of the recording medium 404 when inputting/outputting image data to/from the recording medium 404. Further, the image input/output I/F 413 may have the function of a communication controller that controls communication with the communication unit 407 when inputting/outputting image data to/from the communication unit 407.
  • The audio processing unit 414 selects from among the audio data obtained from the recording medium 404 through the recording medium decoding unit 405 and the audio data obtained from the communication unit 407 through the navigation control unit 400, and reproduces the selected audio data. The audio processing unit 414 also performs reproduction processing of audio data stored in a storage device such as an audio database (hereinafter “audio DB”) 611 (see FIG. 6) described later. The audio data to be reproduced includes audio data such as music and sound effects. The reproduction processing includes, for example, control of the sound field formed by the sound output from the speaker 411. In addition, when the information reproducing apparatus includes an AM/FM tuner or a TV tuner, the audio processing unit 414 may be configured to reproduce, for example, radio or television sound.
  • The audio processing unit 414 performs audio data reproduction processing based on the audio data reproduction information associated with the passenger identified and recognized by the image processing unit 412.
  • The audio processing unit 414 also controls the output of the sound that the speaker 411 emits based on the selected and reproduced audio data; specifically, it controls the sound output state by, for example, volume adjustment, equalizing processing, and sound image localization.
  • The audio output control by the audio processing unit 414 is performed, for example, in response to an input operation from the user operation unit 401 or under the control of the navigation control unit 400.
  • The photographing unit 415 includes the photographing device (camera) 123 mounted in the vehicle in FIG. 1 and external photographing devices such as the above-described DSC and DVC, has a photoelectric conversion element such as a C-MOS or CCD, and captures images of the vehicle interior.
  • The photographing unit 415 is connected to the information reproducing apparatus by wire or wirelessly, and photographs video (images) including, for example, the passengers inside the vehicle in response to a photographing command from the navigation control unit 400.
  • Image data of the images captured by the photographing unit 415 is output to the image processing unit 412 via the image input/output I/F 413.
  • The information reproducing apparatus may also obtain the information related to the seating state of the passengers inside the vehicle based on signals detected by seating sensors, such as membrane switches mounted in the seats 111 to 113 (see FIG. 1; the same applies hereinafter), in place of or in addition to the image data photographed by the photographing unit 415 and analyzed by the image processing unit 412.
  • In this case, the signals from the seating sensors are input to the navigation control unit 400, and the navigation control unit 400 determines the seating state of the passengers.
  • Specifically, the acquisition unit 201 and the identification unit 202 in FIG. 2 realize their functions by, for example, the image processing unit 412 and the navigation control unit 400, and the reproduction unit 203 realizes its function by, for example, the image processing unit 412, the audio processing unit 414, and the navigation control unit 400.
  • The control unit 204 in FIG. 2 realizes its function by, for example, the audio processing unit 414 and the navigation control unit 400, and the storage unit 205 realizes its function by, for example, the recording medium 404 and the recording medium decoding unit 405.
  • The photographing device 206 in FIG. 2 realizes its function by, for example, the photographing unit 415.
  • The image display device 207 in FIG. 2 realizes its function by, for example, the display unit 402, and the audio output device 208 realizes its function by, for example, the speaker 411.
  • FIG. 5 is a block diagram showing an example of the internal configuration of the image processing unit in the information reproducing apparatus according to the embodiment of the present invention.
  • FIG. 6 is a block diagram showing an example of the internal configuration of the audio processing unit in the information reproducing apparatus according to the embodiment of the present invention.
  • As shown in FIG. 5, the image processing unit 412 includes an image analysis unit 510, a display control unit 511, an image recognition unit 512, an image storage unit 513, a passenger recognition unit 514, and a passenger database (hereinafter “passenger DB”) 515.
  • The image analysis unit 510 analyzes external image data input to the image processing unit 412 from the photographing unit 415 (see FIG. 4; the same applies hereinafter) through the image input/output I/F 413, and image data input from the recording medium 404 (see FIG. 4; the same applies hereinafter) through the recording medium decoding unit 405 (see FIG. 4; the same applies hereinafter) and the navigation control unit 400 (see FIG. 4; the same applies hereinafter).
  • The image analysis unit 510 is configured by, for example, a GPU.
  • The display control unit 511 performs control for displaying the image data output from the image analysis unit 510 on the display screen of the display unit 402.
  • The image recognition unit 512 recognizes what is shown in the image based on the image data input to the image analysis unit 510. Specifically, the image recognition unit 512 recognizes, for example, where in the vehicle a passenger is seated.
  • The image storage unit 513 stores the image data input to the image analysis unit 510.
  • The image storage unit 513 also stores the reproduction information of the image data to be reproduced by the information reproducing apparatus. This reproduction information is stored in association with the identified passenger.
  • If an image of a passenger is included in the image data input to the image analysis unit 510, the passenger recognition unit 514 reads out the passenger's image and performs identification/recognition processing of the passenger represented by that image. Specifically, the identification/recognition processing is performed, for example, by face authentication based on the passenger's face image information, as sketched below. Since face authentication is a known technique, its detailed description is omitted here.
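  • Purely as an illustration of this identification/recognition step (face authentication itself is a known technique not detailed here), a registered passenger could be looked up by comparing a face feature vector against stored vectors; the embedding source, cosine-similarity measure, and threshold are hypothetical choices.

```python
import numpy as np

def identify_passenger(face_embedding: np.ndarray,
                       passenger_db: dict,
                       threshold: float = 0.6):
    """passenger_db: passenger_id -> stored embedding (hypothetical schema).
    Returns the best-matching passenger id, or None if no match is close enough."""
    best_id, best_score = None, -1.0
    for pid, stored in passenger_db.items():
        denom = float(np.linalg.norm(face_embedding) * np.linalg.norm(stored))
        score = float(np.dot(face_embedding, stored)) / denom if denom else 0.0
        if score > best_score:
            best_id, best_score = pid, score
    return best_id if best_score >= threshold else None
```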
  • The passenger DB 515 stores image data including images of the vehicle's passengers, personal identification data such as the age and gender of these passengers, and the like.
  • As shown in FIG. 6, the audio processing unit 414 includes an audio reproduction processing unit 610, an audio database (hereinafter “audio DB”) 611, a history database (hereinafter “history DB”) 612, a sound field control unit 613, and a parameter storage unit 614.
  • The audio reproduction processing unit 610 performs selection and reproduction processing of the audio data input to the audio processing unit 414 and the audio data stored in the audio DB 611. The audio reproduction processing unit 610 also performs selection and reproduction processing of audio data associated with the passenger identified by the image processing unit 412 (see FIG. 5; the same applies hereinafter).
  • The audio data associated with the passenger is, for example, audio data with a rhythm or tempo considered suitable for the passenger's age.
  • The audio DB 611 stores the audio data to be selected and reproduced by the audio processing unit 414.
  • The audio data stored in the audio DB 611 may be audio data input to the audio processing unit 414 from the recording medium 404 (see FIG. 4; the same applies hereinafter) or the communication unit 407 (see FIG. 4; the same applies hereinafter), or may be audio data provided in the information reproducing apparatus in advance.
  • The history DB 612 stores information on the reproduction history and music selection history of music when the audio data selected and reproduced by the audio processing unit 414 is music data. The history DB 612 stores, for example, information on the reproduction history and music selection history of music played while driving when the information reproducing apparatus is mounted in a vehicle.
  • Based on the audio data output from the audio reproduction processing unit 610 and the information regarding the seating state output from the navigation control unit 400, the sound field control unit 613 reads sound field parameters for sound field control from the parameter storage unit 614 and, based on the read sound field parameters, controls the sound field formed by the sound output from the speaker 411. The sound field is controlled by, for example, sound image localization.
  • The parameter storage unit 614 stores the sound field parameters for sound field control used by the sound field control unit 613.
  • The stored sound field parameters are determined in advance so that an optimal sound field can be formed in sound image localization in accordance with information such as the passenger's seating position, sitting height, and ear position.
  • FIG. 7 is an explanatory diagram showing an example of the interior of the vehicle in which the information reproducing apparatus according to the embodiment of the present invention is mounted.
  • As shown in FIG. 7, a plurality of speakers 411 (SP1 to SP4) of the information reproducing apparatus are installed inside the vehicle: SP1 in front of the passenger seat 712, SP2 in front of the driver's seat 711, SP3 behind the rear left seat 713, and SP4 behind the rear right seat 714.
  • The sound field control unit 613 (see FIG. 6; the same applies hereinafter) changes, for example, the audio output delay characteristics of SP1 to SP4 based on the sound field parameters.
  • The sound field control unit 613 reads the sound field parameters from the parameter storage unit 614 (see FIG. 6; the same applies hereinafter) and appropriately performs sound image localization in accordance with the seating position pattern of the passengers, as in the sketch below.
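  • A minimal sketch of that parameter lookup follows, assuming the parameter storage is keyed by the set of occupied seats and holds per-speaker delays in milliseconds; the table contents and interface are illustrative, not values from this document.

```python
# Hypothetical parameter table: occupied-seat pattern -> per-speaker delay (ms).
SOUND_FIELD_PARAMS = {
    frozenset({"driver"}):                    {"SP1": 2.1, "SP2": 0.0, "SP3": 1.4, "SP4": 1.6},
    frozenset({"driver", "front_passenger"}): {"SP1": 0.9, "SP2": 0.9, "SP3": 1.2, "SP4": 1.2},
    frozenset({"driver", "rear_left"}):       {"SP1": 1.5, "SP2": 0.4, "SP3": 0.0, "SP4": 1.1},
}

def apply_sound_field(occupied_seats, set_speaker_delay):
    """Look up the delay pattern for the current seating pattern and push it to
    each speaker through the supplied callback (illustrative interface)."""
    params = SOUND_FIELD_PARAMS.get(frozenset(occupied_seats))
    if params is None:
        params = {sp: 0.0 for sp in ("SP1", "SP2", "SP3", "SP4")}  # fallback: no offsets
    for speaker, delay_ms in params.items():
        set_speaker_delay(speaker, delay_ms)
    return params
```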
  • FIG. 8 is a flowchart showing an example of the information reproduction processing procedure of the information reproducing apparatus according to the embodiment of the present invention.
  • First, the photographing unit 415 (see FIG. 4; the same applies hereinafter) captures an image of the vehicle interior (step S801).
  • Image data of the captured image is input to the image processing unit 412 (see FIG. 5; the same applies hereinafter) via the image input/output I/F 413 (see FIG. 4; the same applies hereinafter), and the image analysis unit 510 (see FIG. 5; the same applies hereinafter) determines whether a passenger is seated in a seat (step S802). If it is determined that no passenger is seated (step S802: No), the process returns to step S801 to capture an image. If it is determined that a passenger is seated (step S802: Yes), the navigation control unit 400 (see FIG. 4; the same applies hereinafter) acquires seating position information from the image processing unit 412 (step S803).
  • Next, the navigation control unit 400 determines whether there is a change in the seating position of the passenger (step S804). If it is determined that there is a change in the seating position (step S804: Yes), the process returns to step S803, and the navigation control unit 400 acquires the seating position information from the image processing unit 412 again. If it is determined that there is no change in the seating position (step S804: No), the navigation control unit 400 identifies the passenger inside the vehicle based on the information on the passenger identified by the image processing unit 412 (step S805).
  • Next, the navigation control unit 400 selects the audio data for the audio processing unit 414 (see FIG. 6; the same applies hereinafter) based on the reproduction information associated with the identified passenger, and the sound field control unit 613 (see FIG. 6; the same applies hereinafter) reads from the parameter storage unit 614 the sound field parameters for generating a sound field optimal for the seating position of the passenger (step S806).
  • Then, the audio processing unit 414 reproduces the sound based on the selected audio data (step S807), and the sound field control unit 613 localizes the sound image of the sound field formed by that sound, based on the read sound field parameters (step S808).
  • Note that, based on the reproduction information associated with the identified passenger, the image processing unit 412 may appropriately reproduce image data together with the reproduction of the audio data by the audio processing unit 414. The image display on the display screen of the display unit 402 (see FIG. 4; the same applies hereinafter) may also be controlled based on the seating position of the passenger so that the images can be viewed optimally.
  • As described above, according to the example, the information reproducing apparatus acquires the seating position information of the passenger based on image data obtained by photographing the vehicle interior, and can perform sound image localization so as to control a sound field suited to that seating position. For this reason, the sound field inside the vehicle can be optimized automatically in accordance with the passenger.
  • As described above, the sound field is controlled based on the seating state of the passenger, so that the sound field inside the vehicle can be optimized in accordance with the passenger.
  • The information reproducing method described in the present embodiment can be realized by executing a prepared program on a computer such as a personal computer or a workstation.
  • This program is recorded on a computer-readable recording medium such as a hard disk, a flexible disk, a CD-ROM, an MO, or a DVD, and is executed by being read from the recording medium by the computer.
  • The program may also be a transmission medium that can be distributed through a network such as the Internet.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Acoustics & Sound (AREA)
  • Signal Processing (AREA)
  • Navigation (AREA)

Abstract

The invention relates to an information reproducing device having: an acquisition section (201) for detecting the seating state of a vehicle occupant; an identification section (202) for identifying the occupant based on the seating-state information acquired by the acquisition section (201); a reproduction section (203) for reproducing sound data associated with the occupant identified by the identification section (202); and a control section (204) for controlling, based on the result of the identification by the identification section (202), a sound field formed by the sound of the sound data reproduced by the reproduction section (203).
PCT/JP2006/304281 2005-03-11 2006-03-06 Dispositif de reproduction d’informations, méthode de reproduction d’informations, programme de reproduction d’informations et support lisible par ordinateur WO2006095688A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2005-069778 2005-03-11
JP2005069778 2005-03-11

Publications (1)

Publication Number Publication Date
WO2006095688A1 true WO2006095688A1 (fr) 2006-09-14

Family

ID=36953282

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2006/304281 WO2006095688A1 (fr) 2005-03-11 2006-03-06 Dispositif de reproduction d’informations, méthode de reproduction d’informations, programme de reproduction d’informations et support lisible par ordinateur

Country Status (1)

Country Link
WO (1) WO2006095688A1 (fr)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH099398A (ja) * 1995-06-20 1997-01-10 Matsushita Electric Ind Co Ltd 音像定位装置
JPH1146394A (ja) * 1997-07-25 1999-02-16 Sony Corp 情報処理装置および方法、記録媒体、並びに伝送媒体
JP2002140603A (ja) * 2000-10-31 2002-05-17 Omron Corp 画像形成装置および情報提供装置
JP2003111200A (ja) * 2001-09-28 2003-04-11 Sony Corp 音響処理装置
JP2003130649A (ja) * 2001-10-22 2003-05-08 Clarion Co Ltd 車載装置の設定変更装置
JP2004102714A (ja) * 2002-09-10 2004-04-02 Kyodo Printing Co Ltd 広告システム及び広告方法

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008213634A (ja) * 2007-03-02 2008-09-18 Denso Corp 運転環境設定システム、車載装置、携帯装置、管理装置、車載装置用プログラム、携帯装置用プログラム及び管理装置用プログラム
US9174552B2 (en) 2007-03-02 2015-11-03 Denso Corporation Driving-environment setup system, in-vehicle device and program thereof, portable device and program thereof, management device and program thereof
WO2012141057A1 (fr) * 2011-04-14 2012-10-18 株式会社Jvcケンウッド Dispositif de génération de champs sonores, système de génération de champs sonores et procédé de génération d'un champ sonore
JP2012231448A (ja) * 2011-04-14 2012-11-22 Jvc Kenwood Corp 音場生成装置、音場生成システム、及び音場生成方法
JP2016082443A (ja) * 2014-10-17 2016-05-16 学校法人 中央大学 スピーカ配置選択装置、スピーカ配置選択方法及び音場制御システム
JP2019193108A (ja) * 2018-04-25 2019-10-31 パイオニア株式会社 音響装置
US20220174446A1 (en) * 2019-03-22 2022-06-02 Sony Group Corporation Acoustic signal processing device, acoustic signal processing system, acoustic signal processing method, and program

Similar Documents

Publication Publication Date Title
JP4516111B2 (ja) 画像編集装置、画像編集方法、画像編集プログラムおよびコンピュータに読み取り可能な記録媒体
JP4533897B2 (ja) 処理制御装置、そのプログラム、および、そのプログラムを記録した記録媒体
WO2006095688A1 (fr) Dispositif de reproduction d’informations, méthode de reproduction d’informations, programme de reproduction d’informations et support lisible par ordinateur
WO2006101012A1 (fr) Dispositif, procede et programme de mise a jour d'informations cartographiques et support d'enregistrement lisible par ordinateur
JP4652099B2 (ja) 画像表示装置、画像表示方法、画像表示プログラム、および記録媒体
JP3708141B2 (ja) 電子地図装置
WO2007135865A1 (fr) Dispositif, procédé et programme de commande d'imagerie et support d'enregistrement
JP4791064B2 (ja) 環境音再生を伴うナビゲーション装置
US20070115433A1 (en) Communication device to be mounted on automotive vehicle
WO2007032278A1 (fr) Terminal de communication, appareil de guidage, procede de guidage et support d'enregistrement
JP2007259146A (ja) 字幕検出装置、字幕検出方法、字幕検出プログラム、および記録媒体
WO2007023900A1 (fr) Dispositif, procédé et programme de fourniture de contenu et support d'enregistrement lisible par ordinateur
JP2009223187A (ja) 表示内容制御装置、表示内容制御方法及び表示内容制御方法プログラム
JP2008252589A (ja) 音量制御装置、音量制御方法、音量制御プログラムおよび記録媒体
JP2006189977A (ja) 画像編集装置、画像編集方法、画像編集プログラムおよびコンピュータに読み取り可能な記録媒体
WO2006095689A1 (fr) Dispositif d’assistance à la conduite, méthode d’assistance à la conduite et programme d’assistance à la conduite
JP2008160447A (ja) 放送番組受信装置、放送番組受信計画装置、放送番組受信方法、放送番組受信計画方法、プログラム、および記録媒体
WO2006109469A1 (fr) Dispositif de support de composition musicale, procede de support de composition musicale, programme de support de composition musicale et moyen d'enregistrement
JP2006177814A (ja) 情報提供装置
WO2007043464A1 (fr) Dispositif de contrôle de sortie, procédé de contrôle de sortie, programme de contrôle de sortie et support d’enregistrement lisible par un ordinateur
US11538218B2 (en) System and method for three-dimensional reproduction of an off-road vehicle
JP4584176B2 (ja) 情報転送システム、携帯型情報処理装置、情報転送方法、情報転送プログラムおよびコンピュータに読み取り可能な記録媒体
JP2008160445A (ja) 放送波情報表示装置、放送波情報表示方法、放送波情報表示プログラム、および記録媒体
JP2010175854A (ja) エンジン音出力装置、出力制御方法、出力制御プログラムおよび記録媒体
JPH10132595A (ja) ナビゲーション装置および方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: RU

122 Ep: pct application non-entry in european phase

Ref document number: 06715301

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP