WO2006095688A1 - Information reproduction device, information reproduction method, information reproduction program, and computer-readable recording medium - Google Patents

Information reproduction device, information reproduction method, information reproduction program, and computer-readable recording medium Download PDF

Info

Publication number
WO2006095688A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
sound
passenger
unit
image
Prior art date
Application number
PCT/JP2006/304281
Other languages
French (fr)
Japanese (ja)
Inventor
Koji Koga
Takeshi Sato
Goro Kobayashi
Original Assignee
Pioneer Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pioneer Corporation filed Critical Pioneer Corporation
Publication of WO2006095688A1 publication Critical patent/WO2006095688A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04S STEREOPHONIC SYSTEMS
    • H04S7/00Indicating arrangements; Control arrangements, e.g. balance control
    • H04S7/30Control circuits for electronic adaptation of the sound field
    • H04S7/302Electronic adaptation of stereophonic sound system to listener position or orientation
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00Arrangements for holding or mounting articles, not otherwise provided for
    • B60R2011/0001Arrangements for holding or mounting articles, not otherwise provided for characterised by position
    • B60R2011/0003Arrangements for holding or mounting articles, not otherwise provided for characterised by position inside the vehicle
    • B60R2011/0012Seats or parts thereof
    • B60R2011/0017Head-rests
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/02Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
    • B60R2011/0276Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof for rear passenger use
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R2499/00Aspects covered by H04R or H04S not otherwise provided for in their subgroups
    • H04R2499/10General applications
    • H04R2499/13Acoustic transducers and sound field adaptation in vehicles

Definitions

  • Information reproducing apparatus, information reproducing method, information reproducing program, and computer-readable recording medium
  • the present invention relates to an information reproduction apparatus, an information reproduction method, an information reproduction program, and a computer-readable recording medium that acquire information related to the sitting state of a passenger in a vehicle and reproduce audio data and the like.
  • the use of the present invention is not limited to the above-described information reproducing apparatus, information reproducing method, information reproducing program, and computer-readable recording medium.
  • In the in-vehicle device adjustment apparatus described above, each passenger is identified, for example, by the passenger selecting his or her own setting information from setting information registered in advance, or by the passenger carrying an ID card storing that setting information and having the information on the ID card read by an ID card reader provided for each vehicle seat.
  • In a virtual speaker amplifier of this kind, sound image localization of the audio signal is performed by detecting the head shape of the listener, accurately simulating the transfer characteristics from the speaker positions to the listener's ears based on head-related transfer functions or the like to obtain filter coefficients, and localizing the sound image using the obtained filter coefficients.
  • For example, in the in-vehicle profile system and drive environment setting method described in Patent Document 1, the preferred in-vehicle environment set by each passenger in each seat of the vehicle (driver's seat, front passenger seat, and so on), such as the positions and operating states of in-vehicle devices, is stored on an IC-based ID card as profile information for each passenger. The passenger's profile information is then read from the ID card and the in-vehicle devices are adjusted, setting the vehicle interior to the passenger's preferred environment.
  • In the virtual speaker amplifier described in Patent Document 2, for example, the face of the user (listener) is photographed with a CCD camera, and the width of the user's face and the size of the auricles are detected from this image.
  • Using the face width and auricle size as head shape data of the user, the head-related transfer functions, which are the transfer functions from the rear speakers to the user's ears, are calculated. Filter processing is then performed by the DSP of the USB amplifier so as to realize the characteristics of these head-related transfer functions, whereby the sound image localization of the rear speakers is reproduced with the front speakers.
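As an illustrative aside (not part of the patent text), the following minimal Python sketch shows the kind of filtering such a virtual speaker amplifier performs: a mono signal intended for a rear speaker is convolved with a left/right head-related impulse response (HRIR) pair so that it can be emitted from the front speakers. The HRIR arrays, sample rate, and placeholder data are assumptions for illustration; a real system would derive the filters from the measured head shape as described above.

```python
import numpy as np
from scipy.signal import fftconvolve

def virtualize_rear_channel(rear_signal, hrir_left, hrir_right):
    """Render a rear-speaker signal over front speakers by HRIR filtering.

    rear_signal : 1-D array, mono audio intended for the rear speaker.
    hrir_left, hrir_right : head-related impulse responses (assumed to be
        measured or approximated from the listener's head shape) describing
        the path from the rear-speaker position to the left and right ears.
    Returns an (N, 2) array to be played on the front left/right speakers.
    """
    left = fftconvolve(rear_signal, hrir_left)
    right = fftconvolve(rear_signal, hrir_right)
    n = max(len(left), len(right))
    out = np.zeros((n, 2))
    out[:len(left), 0] = left
    out[:len(right), 1] = right
    return out

# Example with placeholder data: 1 second of noise and dummy 128-tap HRIRs.
rng = np.random.default_rng(0)
signal = rng.standard_normal(44100)
hrir_l = rng.standard_normal(128) * np.exp(-np.arange(128) / 32.0)
hrir_r = rng.standard_normal(128) * np.exp(-np.arange(128) / 32.0)
stereo = virtualize_rear_channel(signal, hrir_l, hrir_r)
```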
  • Patent Document 1 Japanese Patent Application Laid-Open No. 2002-104105
  • Patent Document 2 JP 2003-230199 A
  • With the prior art of Patent Document 1, in-vehicle device setting information reflecting the passenger's preferences can be stored, but one problem given as an example is that the in-vehicle device settings cannot be actively adapted to the passenger.
  • With the prior art of Patent Document 2, in which the head-related transfer function is obtained by photographing the listener's face image and sound image localization is performed by filter processing, complicated processes such as image computation and audio filtering are essential, so another problem given as an example is that sound image localization accompanying sound reproduction cannot be performed simply and easily.
  • The information reproducing apparatus according to the invention of claim 1 comprises: acquisition means for detecting a seating state of a passenger in a vehicle and acquiring information on the seating state; identification means for identifying the passenger based on the information on the seating state acquired by the acquisition means; reproduction means for reproducing audio data associated with the passenger identified by the identification means; and control means for controlling, based on the identification result obtained by the identification means, the sound field formed by the sound of the audio data reproduced by the reproduction means.
  • The information reproduction method according to the invention of claim 7 comprises: an acquisition step of detecting a seating state of a passenger in a vehicle and acquiring information on the seating state; an identification step of identifying the passenger based on the information on the seating state acquired in the acquisition step; a reproduction step of reproducing audio data associated with the passenger identified in the identification step; and a control step of controlling, based on the identification result obtained in the identification step, the sound field formed by the sound of the audio data reproduced in the reproduction step.
  • An information reproduction program according to the invention of claim 8 causes a computer to execute the information reproduction method according to claim 7.
  • A computer-readable recording medium according to the invention has the information reproduction program according to claim 8 recorded thereon.
  • FIG. 1 is an explanatory diagram showing an example of the interior of a vehicle on which an information reproducing apparatus according to an embodiment is mounted.
  • FIG. 2 is a block diagram showing an example of a functional configuration of an information reproducing apparatus according to an embodiment.
  • FIG. 3 is a flowchart showing an example of an information reproduction processing procedure of the information reproducing apparatus according to the embodiment.
  • FIG. 4 is a block diagram showing an example of a hardware configuration of an information reproducing apparatus according to the embodiment.
  • FIG. 5 is a block diagram illustrating an example of the internal configuration of the image processing unit in the information reproducing apparatus according to the embodiment.
  • FIG. 6 is a block diagram illustrating an example of the internal configuration of the audio processing unit in the information reproducing apparatus according to the embodiment.
  • FIG. 7 is an explanatory diagram showing an example of the inside of a vehicle on which the information reproducing apparatus according to the example is mounted.
  • FIG. 8 is a flowchart showing an example of an information reproduction processing procedure of the information reproducing apparatus according to the embodiment.
  • FIG. 1 is an explanatory diagram showing an example of the interior of a vehicle on which an information reproducing apparatus according to an embodiment of the present invention is mounted.
  • an image display device (display) 121a, an audio output device (speaker) 122, and an information reproducing device 126a are provided around the driver's seat 111 and the passenger seat 112.
  • A photographing device (camera) 123 is provided on the ceiling 114 of the vehicle, and an image display device (display) 121b and an information reproducing device 126b are provided on the back of the passenger seat 112 for the passenger in the rear seat 113.
  • a sound output device (speaker) (not shown) is provided behind the rear seat 113, and a photographing device (camera) 123 is provided in each information reproduction device 126 (126a, 126b).
  • FIG. 2 is a block diagram showing an example of a functional configuration of the information reproducing apparatus according to the embodiment of the present invention.
  • In FIG. 2, the information reproducing apparatus includes an acquisition unit 201, an identification unit 202, a reproduction unit 203, a control unit 204, and a storage unit 205, and further includes a photographing device (camera) 206, an image display device (display) 207, and an audio output device (speaker) 208.
  • the information reproducing apparatus may have a structure that can be attached to and detached from the vehicle.
  • the acquisition unit 201 detects the seating state of the passenger on each of the seats 111 to 113 of the vehicle in FIG. 1, and acquires information on the seating state.
  • Each of the seats 111 to 113 is a seat of a vehicle on which an information reproducing device is mounted.
  • the passenger is a person who gets on the vehicle, and includes a person who gets on the driver's seat 111 and a person who gets on the passenger seat 112 and the rear seat 113.
  • The seating state is detected, for example, by detecting the position in the vehicle where the passenger is seated using image data of an image taken by the photographing device 206 described later.
  • the information on the sitting state includes information such as seating position information, sitting height information, face image information, and ear position information of the passenger.
  • the acquisition unit 201 may be configured by so-called seating sensors provided in each of the seats 111 to 113, for example.
  • The information on the seating state may also include, for example, information on the body parts of the passenger.
  • the body part of the passenger is the position of each part of the body.
  • Specifically, the information on the body parts is information on the positions of parts of the body, such as the positions of the shoulders, abdomen, and head, and on the lengths of the arms and legs, the height, the weight, and the sitting height.
  • the information on the body part includes, for example, information on the positions of eyes and ears in the head. Further, the information on the body part may be information representing the ability of the body part such as visual acuity and hearing ability, in addition to the information on these positions alone.
  • When acquiring information on the body parts as information on the seating state, the acquisition unit 201 specifically analyzes, for example, the image data captured by the photographing device 206 and calculates and acquires the body part information based on the resulting image of the passenger's body parts. The acquisition unit 201 may also acquire the body part information as information on the seating state through an input operation by the passenger or by reading it from an external recording device.
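The analysis described above is left abstract in the text, so here is a hedged Python sketch of one way body-part information could be derived from an interior camera frame. The face detector (detect_face_bbox), the seat regions, and the geometric ratio used for the ear height are all hypothetical assumptions, not values taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class SeatingInfo:
    seat_id: str          # which seat the face was found over
    head_center: tuple    # (x, y) in image coordinates
    ear_height_px: float  # approximate vertical position of the ears
    face_width_px: float

def analyze_seating(frame, seat_regions, detect_face_bbox):
    """Derive coarse body-part information from one interior camera frame.

    frame : image array (e.g. a NumPy array from the camera).
    seat_regions : dict mapping seat_id -> (x0, y0, x1, y1) image region.
    detect_face_bbox : hypothetical callable returning (x, y, w, h) of a
        face inside a region, or None if no face is found.
    """
    results = []
    for seat_id, (x0, y0, x1, y1) in seat_regions.items():
        bbox = detect_face_bbox(frame[y0:y1, x0:x1])
        if bbox is None:
            continue  # seat appears unoccupied
        x, y, w, h = bbox
        head_center = (x0 + x + w / 2.0, y0 + y + h / 2.0)
        # Assumption: the ears sit roughly at the vertical middle of the face box.
        ear_height_px = y0 + y + 0.5 * h
        results.append(SeatingInfo(seat_id, head_center, ear_height_px, float(w)))
    return results
```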
  • The identification unit 202 identifies the passenger based on the information on the seating state acquired by the acquisition unit 201. Here, identification means determining whether reproduction information exists for the image data to be displayed on the image display device 207 and the audio data to be output from the audio output device 208 for that passenger, and which reproduction information is associated with the passenger.
  • For example, face authentication processing may be performed using the passenger's face image information included in the information on the seating state to identify and recognize the individual. Information on the seating state and information such as the identification result obtained by the identification unit 202 are stored in the storage unit 205 described later.
  • the reproduction unit 203 reproduces image data and audio data associated with the passenger identified by the identification unit 202.
  • Image data and audio data are reproduced based on the reproduction information stored in the storage unit 205.
  • Examples of the image data reproduced by the reproduction unit 203 include image data obtained by recording a TV program, image data captured by the image capturing device 206, and the like.
  • The audio data reproduced by the reproduction unit 203 includes music such as musical pieces and sound data such as sound effects.
  • Based on the identification result obtained by the identification unit 202, the control unit 204 controls the sound field formed by the audio output from the audio output device 208 on the basis of the audio data reproduced by the reproduction unit 203. Specifically, the sound field control includes, for example, adjustment of the volume of the sound output from the audio output device 208, equalization processing, and sound image localization.
  • In this case, the control unit 204 performs, for example, sound image localization as the control of the sound field. The sound image localization is carried out using, for example, a known sound image localization method.
  • As the sound image localization method, it is possible, for example, to filter the audio data based on head-related transfer functions so as to generate a virtual sound source in the sound field inside the vehicle, or to change the audio output delay characteristics of the audio output device 208.
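To make the delay-based option concrete, the sketch below (an illustration, not the patent's implementation) chooses per-speaker output delays so that the direct sound from every speaker arrives at the identified passenger's head at the same time, which shifts the perceived sound image. The speaker coordinates, sample rate, and speed of sound are assumed values.

```python
import math

SPEED_OF_SOUND_M_S = 343.0  # assumed value for air at room temperature

def delay_samples_for_listener(speaker_positions, listener_position, sample_rate=44100):
    """Return per-speaker delays (in samples) that equalize arrival times.

    speaker_positions : dict name -> (x, y) coordinates in metres.
    listener_position : (x, y) of the identified passenger's head.
    The farthest speaker gets zero delay; nearer speakers are delayed so
    that the direct sound from all speakers arrives together.
    """
    distances = {
        name: math.dist(pos, listener_position)
        for name, pos in speaker_positions.items()
    }
    farthest = max(distances.values())
    return {
        name: round((farthest - d) / SPEED_OF_SOUND_M_S * sample_rate)
        for name, d in distances.items()
    }

# Illustrative cabin layout (metres), not taken from the patent figures.
speakers = {"SP1": (0.6, 1.2), "SP2": (-0.6, 1.2), "SP3": (0.6, -1.2), "SP4": (-0.6, -1.2)}
print(delay_samples_for_listener(speakers, listener_position=(0.4, 0.7)))
```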
  • the storage unit 205 stores information on the sitting state and information such as an identification result by the identification unit 202.
  • the storage unit 205 stores reproduction information of image data and audio data associated with the passenger.
  • The reproduction information includes, for example, information on the reproduction history and music selection history of image data and audio data reflecting each passenger's preferences, and information for selecting images and audio considered highly relevant to the identified passenger based on attributes such as age and gender.
  • Depending on its type, the various information stored in the storage unit 205 is stored temporarily, accumulated through learning, or stored statistically in the manner of a database.
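As a hedged illustration of how such stored reproduction information might drive selection, the sketch below scores candidate tracks by how often the identified passenger has played them and falls back to an age-based default when no history exists. The data structures, age bands, and fallback rule are assumptions for illustration only.

```python
def choose_track(passenger_id, candidates, play_history, age_defaults, passenger_age=None):
    """Pick audio data for an identified passenger.

    play_history : dict (passenger_id, track_id) -> play count.
    age_defaults : dict age_band -> list of track_ids used when no history exists.
    """
    scored = [
        (play_history.get((passenger_id, track), 0), track)
        for track in candidates
    ]
    best_count, best_track = max(scored)
    if best_count > 0:
        return best_track
    # No history for this passenger: fall back to an age-band default, if known.
    if passenger_age is not None:
        band = "child" if passenger_age < 13 else "adult"
        for track in age_defaults.get(band, []):
            if track in candidates:
                return track
    return candidates[0]

history = {("p1", "song_a"): 5, ("p1", "song_b"): 1}
print(choose_track("p1", ["song_a", "song_b", "song_c"], history, {"adult": ["song_c"]}, 34))
```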
  • the imaging device 206 captures an image inside the vehicle, for example.
  • Image data of the captured image is provided to the acquisition unit 201.
  • the image display device 207 displays the image data reproduced by the reproducing unit 203 on the display screen.
  • the sound output device 208 outputs sound based on the sound data reproduced by the reproducing unit 203 and controlled by the control unit 204 to the inside of the vehicle.
  • FIG. 3 is a flowchart showing an example of an information reproduction processing procedure of the information reproducing apparatus according to the embodiment of the present invention.
  • First, an image of the vehicle interior is captured (step S301), and it is determined whether a passenger is seated in any of the seats 111 to 113 (see FIG. 1; the same applies hereinafter) (step S302). If it is determined that no passenger is seated (step S302: No), the process returns to step S301 and an image is captured again.
  • If it is determined that a passenger is seated (step S302: Yes), the acquisition unit 201 (see FIG. 2; the same applies hereinafter) detects the seating state and acquires information on the seating state (step S303).
  • Next, the identification unit 202 identifies the passenger (step S304).
  • The reproduction unit 203 then reproduces images and sound on the image display device 207 (see FIG. 2; the same applies hereinafter) and the audio output device 208 (see FIG. 2; the same applies hereinafter), based on the reproduction information of the image data and audio data associated with the identified passenger, which is stored in the storage unit 205 (see FIG. 2; the same applies hereinafter) (step S305).
  • When the reproduction unit 203 reproduces the image data and audio data, the control unit 204 controls, based on the identification result obtained by the identification unit 202, the sound field formed by the sound of the audio data reproduced by the reproduction unit 203 (step S306).
  • The sound field control in step S306 includes, for example, performing sound image localization according to the seating position of the passenger so as to form a sound field in which the sound is heard optimally by that passenger.
  • As described above, according to this embodiment, the passenger is identified based on the information on the seating state of the passenger in the vehicle, the audio data associated with the identified passenger is reproduced, and the sound field formed by the sound of the reproduced audio data can be controlled based on the identification result. For this reason, the sound field inside the vehicle can be optimized automatically according to the passenger, without the passenger manually performing input operations for sound field control.
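The flow of FIG. 3 can be summarized in code. The sketch below is a hypothetical rendering of steps S301 to S306 using placeholder component objects (camera, acquirer, identifier, player, controller); it shows the order of operations only and is not the actual implementation.

```python
import time

def information_reproduction_loop(camera, acquirer, identifier, player, controller, poll_s=1.0):
    """Simplified rendering of steps S301-S306 of FIG. 3 (illustrative only)."""
    while True:
        frame = camera.capture()                      # S301: photograph the vehicle interior
        seating = acquirer.acquire(frame)             # S302/S303: detect and acquire seating info
        if not seating:
            time.sleep(poll_s)                        # nobody seated yet, try again
            continue
        passenger = identifier.identify(seating)      # S304: identify the passenger
        audio = player.play_for(passenger)            # S305: reproduce associated image/audio data
        controller.control_sound_field(audio, passenger, seating)  # S306: control the sound field
```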
  • FIG. 4 is a block diagram showing an example of the hardware configuration of the information reproducing apparatus according to the embodiment of the present invention.
  • As described above, the information reproducing apparatus is mounted on the vehicle in a detachable structure and includes a navigation control unit 400, a user operation unit (remote control, touch panel) 401, a display unit (monitor) 402, a position acquisition unit 403, a recording medium 404, a recording medium decoding unit 405, a guidance sound output unit 406, a communication unit 407, a route search unit 408, a route guidance unit 409, a guidance sound generation unit 410, a speaker 411, an image processing unit 412, an image input/output I/F 413, an audio processing unit 414, and a photographing unit 415.
  • the navigation control unit 400 controls the entire information reproducing apparatus, for example, and performs various arithmetic processes according to the control program, thereby comprehensively controlling each unit included in the information reproducing apparatus.
  • The navigation control unit 400 can be configured by, for example, a CPU (Central Processing Unit) that executes predetermined arithmetic processing, a ROM (Read Only Memory) that stores various control programs, a RAM (Random Access Memory) that functions as a work area for the CPU, and the like.
  • For example, when guiding the vehicle along a route, the navigation control unit 400 calculates the position on the map at which the vehicle is traveling based on the current position information of the vehicle acquired by the position acquisition unit 403 and the map information obtained from the recording medium 404 via the recording medium decoding unit 405, and outputs the calculation result to the display unit 402.
  • When performing such route guidance, the navigation control unit 400 also exchanges information related to route guidance with the route search unit 408, the route guidance unit 409, and the guidance sound generation unit 410, and outputs the resulting information to the display unit 402 and the guidance sound output unit 406.
  • the user operation unit 401 outputs information input by the user, such as characters, numerical values, and various instructions, to the navigation control unit 400.
  • For the user operation unit 401, various known forms can be employed, such as a push-button switch that detects physical pressing/non-pressing, a touch panel, a keyboard, and a joystick.
  • The user operation unit 401 may also take the form of an input operation using, for example, a microphone that picks up sound from the outside.
  • The user operation unit 401 may be provided integrally with the information reproducing apparatus, or may be configured so that it can be operated from a position separate from the information reproducing apparatus, like a remote controller.
  • The user operation unit 401 may be configured in any one of the various forms described above, or in a plurality of forms. The user inputs information by performing an input operation appropriate to the form of the user operation unit 401.
  • Information input by an input operation of the user operation unit 401 includes, for example, information on a destination regarding navigation. Specifically, for example, when the information reproducing apparatus is provided in a vehicle or the like, a point to be reached by the passenger of this vehicle is set. Further, as information input to the user operation unit 401, for example, regarding information reproduction, selection information of audio reproduced by an audio processing unit 414 described later can be cited. Specifically, for example, audio data such as music desired by the passenger of this vehicle is selected and set.
  • When a touch panel is adopted as the form of the user operation unit 401, the touch panel is used stacked on the display screen side of the display unit 402.
  • Input by a touch operation is recognized by managing the display timing on the display unit 402 together with the operation timing on the touch panel (user operation unit 401) and its position coordinates.
  • By adopting a touch panel stacked on the display unit 402 as the form of the user operation unit 401, a large amount of information can be input without enlarging the user operation unit 401.
  • As the touch panel, various known types such as a resistive film type and a pressure-sensitive type can be adopted.
  • Display unit 402 includes, for example, a CRT (Cathode Ray Tube), a TFT liquid crystal display, an organic EL display, a plasma display, and the like.
  • Specifically, the display unit 402 can be configured by, for example, a video I/F (not shown) and a video display device connected to the video I/F.
  • The video I/F consists of, for example, a graphics controller that controls the entire display device, a buffer memory such as a VRAM (Video RAM), and a control IC or GPU (Graphics Processing Unit) that controls display based on the image information output from the graphics controller.
  • the display unit 402 displays icons, cursors, menus, windows, or various information such as characters and images. Further, the display unit 402 displays image data processed by an image processing unit 412 described later.
  • The position acquisition unit 403 acquires the current position information (latitude and longitude information) of the vehicle on which the information reproducing apparatus is mounted, for example, by receiving radio waves from artificial satellites.
  • The current position information is obtained by receiving radio waves from the artificial satellites and determining the geometric position relative to the satellites, and can be measured anywhere on the earth.
  • The position acquisition unit 403 includes a GPS (Global Positioning System) antenna (not shown).
  • The position acquisition unit 403 can be configured by, for example, a tuner that demodulates the radio waves received from the satellites, an arithmetic circuit that calculates the current position based on the demodulated information, and the like.
  • The radio wave from the artificial satellites is, for example, the L1 radio wave with a carrier of 1.57542 GHz, carrying the C/A (Coarse and Access) code and the navigation message, and by receiving it the current position (latitude and longitude) of the vehicle on which the information reproducing apparatus is mounted is detected.
  • information collected by various sensors such as a vehicle speed sensor and a gyro sensor may be taken into account.
  • the vehicle speed sensor detects the vehicle speed from the output shaft of the transmission of the vehicle equipped with the information reproducing device.
  • angular velocity sensor detects the angular velocity when the vehicle rotates and outputs angular velocity information and relative orientation information.
  • The mileage sensor counts the number of pulses of a pulse signal output with the rotation of the wheels over a given period to calculate the number of pulses per wheel revolution, and outputs mileage information based on that number.
  • the inclination angle sensor detects the inclination angle of the road surface and outputs inclination angle information.
  • The lateral G sensor detects the lateral G, which is the outward force (gravity) generated by centrifugal force when the vehicle corners, and outputs lateral G information.
  • The current position information of the vehicle acquired by the position acquisition unit 403 and the information detected by the vehicle speed sensor, gyro sensor, angular velocity sensor, mileage sensor, inclination angle sensor, and lateral G sensor are output to the navigation control unit 400.
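As a rough illustration of how these sensor outputs could supplement the satellite fix, the following sketch dead-reckons the position forward from the last known position using the vehicle speed and heading. The flat-earth approximation and the rule of trusting a fresh GPS fix whenever one is available are assumptions, not details from the patent.

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean radius, rough flat-earth approximation

def update_position(lat_deg, lon_deg, speed_m_s, heading_deg, dt_s, gps_fix=None):
    """Advance (lat, lon) by dead reckoning, preferring a fresh GPS fix if given."""
    if gps_fix is not None:
        return gps_fix  # trust the satellite fix when it is available
    d = speed_m_s * dt_s
    dlat = (d * math.cos(math.radians(heading_deg))) / EARTH_RADIUS_M
    dlon = (d * math.sin(math.radians(heading_deg))) / (
        EARTH_RADIUS_M * math.cos(math.radians(lat_deg))
    )
    return lat_deg + math.degrees(dlat), lon_deg + math.degrees(dlon)

# One second of travel at 15 m/s heading due east from an arbitrary point.
print(update_position(35.0, 139.0, 15.0, 90.0, 1.0))
```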
  • the recording medium 404 records various control programs and various information in a state readable by a computer.
  • the recording medium 404 accepts writing of information by the recording medium decoding unit 405 and records the written information in a nonvolatile manner.
  • the recording medium 404 can be realized by HD (Hard Disk), for example.
  • The recording medium 404 is not limited to an HD. Instead of, or in addition to, an HD, a medium such as a DVD (Digital Versatile Disk) or CD (Compact Disk) that can be attached to and detached from the recording medium decoding unit 405 and that is portable may be used as the recording medium 404.
  • The recording medium 404 is not limited to a DVD or CD either; portable media that can be attached to and detached from the recording medium decoding unit 405, such as a CD-ROM (CD-R, CD-RW), an MO (Magneto-Optical disk), or a memory card, can also be used.
  • the recording medium 404 records an information reproduction program, a navigation program, image data, audio data, map information, and the like that realize the present invention.
  • The image data refers to, for example, two-dimensional array values representing an image, such as an image taken inside the vehicle.
  • The audio data refers to data for reproducing music such as musical pieces.
  • The map information includes background information representing features such as buildings, rivers, and the ground surface, and road shape information representing the shapes of roads, and is drawn in two or three dimensions.
  • the background information includes background shape information representing the shape of the background and background type information representing the type of the background.
  • the background shape information includes information indicating, for example, representative points of features, polylines, polygons, coordinates of features, and the like.
  • The background type information includes, for example, text information indicating the name, address, telephone number, and the like of a feature, and type information indicating the kind of feature, such as a building or a river.
  • the road shape information is information relating to a road network having a plurality of nodes and links.
  • A node is information indicating an intersection where a plurality of roads meet, such as a three-way junction, a crossroads, or a five-way junction.
  • a link is information indicating a road connecting nodes.
  • Some links have shape interpolation points that allow the expression of curved roads.
  • Road shape information has traffic condition information.
  • the traffic condition information is information indicating the characteristics of the intersection, the length (distance) of each link, vehicle width, traveling direction, traffic prohibition, road type, and the like.
  • The characteristics of intersections include, for example, complicated intersections such as three-way and five-way junctions, intersections where roads branch at shallow angles, intersections near the destination, entrances and junctions of expressways, and intersections with a high route deviation rate.
  • The route deviation rate can be calculated, for example, from the past driving history. Examples of road types include expressways, toll roads, and general roads.
  • In this embodiment, the image data, audio data, and map information are recorded on the recording medium 404, but the arrangement is not limited to this. The image data, audio data, and map information need not be recorded on a medium provided integrally with the hardware of the information reproducing apparatus, and may instead be provided outside the information reproducing apparatus.
  • the information reproducing apparatus acquires image data and audio data via the network through the communication unit 407, for example.
  • the information reproducing apparatus acquires map information via a network, for example, through the communication unit 407.
  • the image data, audio data, and map information acquired in this way may be stored in the RAM of the navigation control unit 400, for example.
  • the recording medium decoding unit 405 controls reading and writing of information on the recording medium 404.
  • Specifically, when an HD is used as the recording medium 404, the recording medium decoding unit 405 is, for example, an HDD (Hard Disk Drive); when a DVD or a CD is used, it is a DVD drive or a CD drive. When a writable and detachable recording medium such as a CD-ROM (CD-R, CD-RW), an MO, or a memory card is used as the recording medium 404, a dedicated drive device capable of writing to and reading from that medium may be used as appropriate as the recording medium decoding unit 405.
  • the guidance sound output unit 406 reproduces the navigation guidance sound by controlling the output to the connected speaker 411.
  • Specifically, the guidance sound output unit 406 can be realized by an audio I/F (not shown) connected to the speaker 411 for audio output.
  • The audio I/F can be configured with, for example, a D/A converter that performs D/A conversion of digital audio data, an amplifier that amplifies the analog audio signal output from the D/A converter, and an A/D converter that performs A/D conversion of analog audio signals.
  • the communication unit 407 performs communication with, for example, another information reproducing apparatus.
  • The communication unit 407 may be, for example, a communication module such as a mobile phone that communicates with a communication server (not shown) via a base station (not shown), or a communication module that performs direct wireless communication with another device.
  • Here, wireless communication is communication performed using radio waves, infrared rays, or ultrasonic waves, without using a wired line as the communication medium.
  • As standards that enable wireless communication, technologies such as wireless LAN, IrDA (Infrared Data Association), HomeRF (Home Radio Frequency), and Bluetooth can be used. Of these, wireless LAN, for example, is preferable from the viewpoint of information transfer speed.
  • The communication unit 407 may receive road traffic information, such as information on traffic congestion and traffic regulations, regularly (or irregularly). The road traffic information may be received at the timing when it is distributed from the VICS (Vehicle Information and Communication System) center, or by periodically requesting road traffic information from the VICS center.
  • The communication unit 407 can be realized as, for example, an AM/FM tuner, a TV tuner, a VICS/beacon receiver, or another communication device.
  • Since VICS is a well-known technology, a detailed explanation is omitted here. VICS is an information communication system that transmits road traffic information, such as information on traffic congestion and traffic regulations edited and processed at the VICS center, to vehicle navigation devices in real time and displays it as text and graphics.
  • Methods for receiving VICS information (road traffic information) include beacons installed along individual roads and FM multiplex broadcasting. The beacons include radio wave beacons used mainly on expressways and optical beacons used on major general roads. With FM multiplex broadcasting, road traffic information for a wide area can be received, while with a beacon it is possible to receive the road traffic information needed at the vehicle's current location, such as detailed information on the roads immediately ahead of the vehicle. If the communication method used with another information reproducing apparatus differs from the communication method used to receive image data, audio data, and road traffic information, the communication unit 407 has a plurality of communication means corresponding to the respective methods.
  • The route search unit 408 calculates the optimal route from the current position to the destination based on the current position information of the vehicle acquired by the position acquisition unit 403 and the destination information input by the user.
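The route search over the node-and-link road network described earlier is essentially a shortest-path computation. The sketch below uses Dijkstra's algorithm with link length as the cost; using plain distance rather than travel time or the traffic condition information mentioned above is a simplifying assumption, and the node identifiers are assumed to be comparable values such as strings.

```python
import heapq

def shortest_route(links, start_node, goal_node):
    """Dijkstra's algorithm over a road network.

    links : dict node -> list of (neighbor_node, length_m), built from the
        road shape information (nodes and links) in the map data.
    Returns (total_length_m, [node, ...]) or (inf, []) if unreachable.
    """
    best = {start_node: 0.0}
    prev = {}
    heap = [(0.0, start_node)]
    while heap:
        cost, node = heapq.heappop(heap)
        if node == goal_node:
            path = [node]
            while node != start_node:
                node = prev[node]
                path.append(node)
            return cost, path[::-1]
        if cost > best.get(node, float("inf")):
            continue  # stale heap entry
        for neighbor, length in links.get(node, []):
            new_cost = cost + length
            if new_cost < best.get(neighbor, float("inf")):
                best[neighbor] = new_cost
                prev[neighbor] = node
                heapq.heappush(heap, (new_cost, neighbor))
    return float("inf"), []

# Tiny illustrative network: A -> B -> D is shorter than A -> C -> D.
links = {"A": [("B", 100), ("C", 80)], "B": [("D", 50)], "C": [("D", 120)]}
print(shortest_route(links, "A", "D"))
```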
  • The route guidance unit 409 generates real-time route guidance information based on the guidance route information found by the route search unit 408 or the route information received by the communication unit 407, the current position information acquired by the position acquisition unit 403, and the map information obtained from the recording medium 404 via the recording medium decoding unit 405.
  • the route guidance information generated by the route guidance unit 409 is output to the display unit 402 via the navigation control unit 400.
  • The guidance sound generation unit 410 generates tone and voice information corresponding to guidance patterns. That is, based on the route guidance information generated by the route guidance unit 409, it sets a virtual sound source corresponding to the guidance point, generates voice guidance information, and outputs them to the guidance sound output unit 406 via the navigation control unit 400.
  • the speaker 411 reproduces (outputs) the navigation guidance sound output from the guidance sound output unit 406 and the sound output from the audio processing unit 414 described later.
  • Headphones or the like may be provided as the speaker 411, and the output form of the guidance sound and audio may be changed as appropriate so that the guidance sound and audio do not form a sound field over the entire vehicle interior.
  • The image processing unit 412 performs image processing on, for example, the image data acquired from the photographing unit 415 and the communication unit 407 (described later) via the image input/output I/F 413, and on the image data recorded on the recording medium 404. Specifically, the image processing unit 412 is configured by, for example, a GPU. The image processing includes, for example, reproducing image data of a recorded TV program obtained via the image input/output I/F 413 based on the reproduction information associated with the passenger. The image processing unit 412 also performs, for example, analysis processing of image data in accordance with control commands from the navigation control unit 400.
  • Based on the image data taken by the photographing unit 415, which includes photographing devices such as cameras, the image processing unit 412 identifies and recognizes the passenger from the image of the vehicle interior and analyzes information on the seating state of the passenger. Specifically, the analysis of the information on the seating state is carried out, for example, by determining the state in which the passenger is seated from the passenger's seating position inside the vehicle and the information on the passenger's body parts. For this reason, the image processing unit 412 may be configured to have a DSP (Digital Signal Processor) function, for example.
  • The image input/output I/F 413 inputs and outputs, to and from external devices, the image data that is input to and output from the image processing unit 412.
  • Specifically, the image input/output I/F 413 outputs to the image processing unit 412, for example, image data read from the recording medium 404 storing images captured by a DSC or DVC, image data transferred from a DSC or DVC by USB (Universal Serial Bus), IEEE 1394 (Institute of Electrical and Electronic Engineers 1394), or communication such as infrared, and image data input from the communication unit 407, and it outputs the image data output from the image processing unit 412 to the recording medium 404 and the communication unit 407.
  • The image input/output I/F 413 has the function of a controller that controls read/write access to the recording medium 404 when inputting and outputting image data to and from the recording medium 404. Further, the image input/output I/F 413 may have the function of a communication controller that controls communication with the communication unit 407 when inputting and outputting image data to and from the communication unit 407.
  • The audio processing unit 414 selects from among the audio data obtained from the recording medium 404 through the recording medium decoding unit 405 and the audio data obtained from the communication unit 407 through the navigation control unit 400, and reproduces the selected audio data. The audio processing unit 414 also performs reproduction processing of audio data stored in a storage device such as the audio database (hereinafter "audio DB") 611 (see FIG. 6) described later. The audio data to be reproduced includes audio data such as music and sound effects. The reproduction processing includes, for example, control of the sound field formed by the sound output from the speaker 411. In addition, when the information reproducing apparatus includes an AM/FM tuner or a TV tuner, the audio processing unit 414 may be configured to reproduce, for example, radio or television sound.
  • the audio processing unit 414 performs audio data reproduction processing based on the audio data reproduction information associated with the passenger identified and recognized by the image processing unit 412.
  • The audio processing unit 414 also controls the output of the sound emitted from the speaker 411 based on the selected and reproduced audio data. More specifically, it controls the audio output state by performing, for example, volume adjustment, equalization processing, and sound image localization.
  • The audio output control by the audio processing unit 414 is performed in response to, for example, an input operation on the user operation unit 401 or control by the navigation control unit 400.
  • The photographing unit 415 includes the photographing device (camera) 123 mounted on the vehicle in FIG. 1 and external photographing devices such as the above-described DSC and DVC, has a photoelectric conversion element such as a C-MOS or CCD sensor, and takes images of the vehicle interior.
  • This photographing unit 415 is connected to the information reproducing device by wire or wirelessly, and photographs a video (image) including, for example, a passenger inside the vehicle by a photographing command from the navigation control unit 400.
  • Image data of the images captured by the photographing unit 415 is output to the image processing unit 412 via the image input/output I/F 413.
  • The information reproducing apparatus may also obtain the information on the seating state of the passengers inside the vehicle, instead of or in addition to obtaining it from the image data photographed by the photographing unit 415 and analyzed by the image processing unit 412, based on signals detected by seating sensors such as membrane switches mounted in the seats 111 to 113 (see FIG. 1; the same applies hereinafter).
  • In this case, the signals from the seating sensors are input to the navigation control unit 400, and the navigation control unit 400 determines the seating state of the passengers.
  • Specifically, the acquisition unit 201 and the identification unit 202 in FIG. 2 realize their functions by means of, for example, the image processing unit 412 and the navigation control unit 400, and the reproduction unit 203 realizes its function by means of, for example, the image processing unit 412, the audio processing unit 414, and the navigation control unit 400.
  • The control unit 204 in FIG. 2 realizes its function by means of, for example, the audio processing unit 414 and the navigation control unit 400, and the storage unit 205 realizes its function by means of, for example, the recording medium 404 and the recording medium decoding unit 405.
  • the imaging device 206 in FIG. 2 realizes its function by the imaging unit 415, for example.
  • the image display device 207 in FIG. 2 specifically realizes its function by the display unit 402, for example, and the audio output device 208 realizes its function by, for example, the speaker 411.
  • FIG. 5 is a block diagram showing an example of the internal configuration of the image processing unit in the information reproducing apparatus according to the embodiment of the present invention.
  • FIG. 6 is a block diagram showing an example of the internal configuration of the audio processing unit in the information reproducing apparatus according to the embodiment of the present invention.
  • In FIG. 5, the image processing unit 412 includes an image analysis unit 510, a display control unit 511, an image recognition unit 512, an image storage unit 513, a passenger recognition unit 514, and a passenger database (hereinafter "passenger DB") 515.
  • The image analysis unit 510 analyzes the image data input to the image processing unit 412 from the photographing unit 415 (see FIG. 4; the same applies hereinafter) and from external sources via the image input/output I/F 413, as well as the image data input from the recording medium 404 (see FIG. 4; the same applies hereinafter) via the recording medium decoding unit 405 (see FIG. 4; the same applies hereinafter) and the navigation control unit 400 (see FIG. 4; the same applies hereinafter).
  • Specifically, the image analysis unit 510 is configured by, for example, a GPU.
  • the display control unit 511 performs control for displaying the image data output from the image analysis unit 510 on the display screen of the display unit 402.
  • the image recognition unit 512 recognizes what image image is included in the image data based on the image data input to the image analysis unit 510. Specifically, the image recognizing unit 512 recognizes, for example, where the passenger is seated in the vehicle.
  • the image storage unit 513 stores the image data input to the image analysis unit 510.
  • The image storage unit 513 also stores the reproduction information of the image data reproduced by the information reproducing apparatus. This reproduction information is stored in association with the identified passenger.
  • When an image of a passenger is included in the image data input to the image analysis unit 510, the passenger recognition unit 514 reads out the passenger's image and performs identification and recognition processing for the passenger represented by the image. Specifically, the identification and recognition processing is performed, for example, by face authentication based on the passenger's face image information. Since face authentication is a known technique, its description is omitted here.
  • the passenger DB 515 stores image data including image images of passengers of the vehicle, personal identification data such as the age and gender of these passengers, and the like.
  • In FIG. 6, the audio processing unit 414 includes an audio reproduction processing unit 610, an audio database (hereinafter "audio DB") 611, a history database (hereinafter "history DB") 612, a sound field control unit 613, and a parameter storage unit 614.
  • the audio reproduction processing unit 610 performs selection / reproduction processing of audio data input to the audio processing unit 414 and audio data stored in the audio DB 611. Also, the audio reproduction processing unit 610 performs selection / reproduction processing of audio data associated with the passenger identified by the image processing unit 412 (see FIG. 5, the same applies hereinafter).
  • The audio data associated with a passenger is, for example, audio data with a rhythm or tempo considered optimal for the passenger's age.
  • the audio DB 611 stores audio data to be selected and reproduced by the audio processing unit 414.
  • The audio data stored in the audio DB 611 may be audio data input to the audio processing unit 414 from the recording medium 404 (see FIG. 4; the same applies hereinafter) or the communication unit 407 (see FIG. 4; the same applies hereinafter), or audio data provided in the information reproducing apparatus from the start.
  • the history DB 612 stores information related to the playback history and music selection history of the music when the audio data selected and played back by the audio processing unit 414 is music data. This history DB 612 stores, for example, information related to the playback history and music selection history of music played during driving when the information playback device is mounted on a vehicle.
  • Based on the audio data output from the audio reproduction processing unit 610 and the information on the seating state output from the navigation control unit 400, the sound field control unit 613 reads the sound field parameters for sound field control from the parameter storage unit 614, and controls the sound field formed by the sound output from the speaker 411 based on the read sound field parameters. The sound field is controlled by, for example, sound image localization.
  • the parameter storage unit 614 stores sound field parameters for sound field control used in the sound field control unit 613.
  • The stored sound field parameters are determined in advance so that, in sound image localization, an optimal sound field can be formed in accordance with information such as the passenger's seating position, sitting height, and ear position.
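A minimal sketch of how such predetermined sound field parameters could be stored and looked up is shown below. The seating-pattern keys, the gain/delay values, and the fallback rule are invented for illustration and do not come from the patent.

```python
# Hypothetical table: occupied-seat pattern -> per-speaker (gain, delay in ms).
SOUND_FIELD_PARAMETERS = {
    ("driver",): {"SP1": (0.8, 0.9), "SP2": (1.0, 0.0), "SP3": (0.7, 1.6), "SP4": (0.7, 1.4)},
    ("driver", "front_passenger"): {"SP1": (1.0, 0.0), "SP2": (1.0, 0.0), "SP3": (0.8, 1.2), "SP4": (0.8, 1.2)},
    ("driver", "rear_right"): {"SP1": (0.9, 0.5), "SP2": (1.0, 0.0), "SP3": (0.8, 1.0), "SP4": (1.0, 0.0)},
}

def lookup_sound_field_parameters(occupied_seats):
    """Return the predetermined parameters for the detected seating pattern.

    Falls back to the driver-only preset if the exact pattern is not stored.
    """
    key = tuple(sorted(occupied_seats))
    return SOUND_FIELD_PARAMETERS.get(key, SOUND_FIELD_PARAMETERS[("driver",)])

print(lookup_sound_field_parameters({"front_passenger", "driver"}))
```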
  • FIG. 7 is an explanatory diagram showing an example of the interior of a vehicle on which the information reproducing apparatus according to an example of the present invention is mounted.
  • In FIG. 7, a plurality of speakers 411 (SP1 to SP4) of the information reproducing apparatus are installed inside the vehicle: SP1 in front of the passenger seat 712, SP2 in front of the driver's seat 711, SP3 behind the rear left seat 713, and SP4 behind the rear right seat 714.
  • The sound field control unit 613 (see FIG. 6; the same applies hereinafter) adjusts, for example, the audio output delay characteristics of SP1 to SP4 based on the sound field parameters. The sound field control unit 613 reads the sound field parameters from the parameter storage unit 614 (see FIG. 6; the same applies hereinafter) and performs sound image localization appropriately according to the boarding position pattern of the passengers.
  • FIG. 8 is a flowchart showing an example of the information reproduction processing procedure of the information reproducing apparatus according to the embodiment of the present invention.
  • First, the photographing unit 415 (see FIG. 4; the same applies hereinafter) takes an image of the vehicle interior (step S801). The image data of the captured image is input to the image processing unit 412 (see FIG. 5; the same applies hereinafter) via the image input/output I/F 413 (see FIG. 4; the same applies hereinafter), and the image analysis unit 510 (see FIG. 5; the same applies hereinafter) determines whether a passenger is seated in a seat (step S802). If it is determined that no passenger is seated (step S802: No), the process returns to step S801 and an image is taken again. If it is determined that a passenger is seated (step S802: Yes), the navigation control unit 400 (see FIG. 4; the same applies hereinafter) acquires seating position information from the image processing unit 412 (step S803).
  • the navigation control unit 400 determines whether there is a change in the seating position of the occupant (step S804). If it is determined that there is a change in the seating position (step S804: Yes), the process returns to step S803, and the navigation control unit 400 acquires the seating position information from the image processing unit 412 again. If it is determined that there is no change in the seating position (step S804: No), the navigation control unit 400 identifies the passenger inside the vehicle based on the information about the passenger identified in the image processing unit 412 ( Step S805).
  • Next, the navigation control unit 400 causes the audio processing unit 414 (see FIG. 6; the same applies hereinafter) to select audio data based on the reproduction information associated with the identified passenger, and the sound field control unit 613 (see FIG. 6; the same applies hereinafter) reads the sound field parameters for forming a sound field optimal for the passenger's seating position (step S806).
  • The audio processing unit 414 then reproduces the sound based on the selected audio data (step S807), and the sound field control unit 613 localizes the sound image of the sound field formed by that sound based on the read sound field parameters (step S808).
  • The image processing unit 412 may also reproduce image data as appropriate, based on the reproduction information associated with the identified passenger, together with the reproduction of the audio data by the audio processing unit 414. In addition, the image display on the display screen of the display unit 402 (see FIG. 4; the same applies hereinafter) may be controlled based on the passenger's seating position so that the image can be viewed optimally.
  • As described above, the information reproducing apparatus acquires the seating position information of the passenger based on image data obtained by photographing the vehicle interior, and can perform sound image localization that controls the sound field to suit the seating position. For this reason, the sound field inside the vehicle can be optimized automatically in accordance with the passenger.
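Tying the example of FIGS. 7 and 8 together, the following hypothetical sketch renders steps S801 to S808 as a loop that waits for a stable seating position, identifies the passenger, selects audio from the reproduction information, reads the sound field parameters, and applies them. Every component interface here is a placeholder assumed for illustration.

```python
def example_pipeline(camera, image_unit, nav_unit, audio_unit, field_unit, param_store):
    """Illustrative rendering of steps S801-S808 of FIG. 8 (not the actual implementation)."""
    while True:
        frame = camera.capture()                               # S801: photograph the interior
        if not image_unit.passenger_seated(frame):             # S802: anyone seated?
            continue
        position = nav_unit.get_seating_position(image_unit)   # S803: seating position info
        if nav_unit.seating_position_changed(position):        # S804: wait until stable
            continue
        passenger = nav_unit.identify_passenger(image_unit)    # S805: identify the passenger
        track = audio_unit.select_audio(passenger)             # S806: select by reproduction info
        params = param_store.read_parameters(position)         # S806: read sound field parameters
        audio_unit.play(track)                                 # S807: reproduce the audio
        field_unit.localize_sound_image(params)                # S808: localize the sound image
        break
```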
  • As described above, since the sound field is controlled based on the seating state of the passenger, the sound field inside the vehicle can be optimized according to the passenger.
  • the information reproduction method described in the present embodiment can be realized by executing a prepared program on a computer such as a personal computer or a workstation.
  • This program is recorded on a computer-readable recording medium such as a hard disk, a flexible disk, a CD-ROM, an MO, and a DVD, and is executed by being read by the computer.
  • this program may be a transmission medium that can be distributed through a network such as the Internet.

Abstract

An information reproduction device has an acquisition section (201) for detecting the condition of seating of a vehicle occupant; an identification section (202) for identifying the occupant based on the information on the seating condition acquired by the acquisition section (201); a reproduction section (203) for reproducing sound data associated with the occupant identified by the identification section (202); and a control section (204) for controlling, based on the result identified by the identification section (202), a sound field formed by a sound of the sound data reproduced by the reproduction section (203).

Description

明 細 書  Specification
情報再生装置、情報再生方法、情報再生プログラムおよびコンピュータ に読み取り可能な記録媒体  Information reproducing apparatus, information reproducing method, information reproducing program, and computer-readable recording medium
技術分野  Technical field
[0001] 本発明は、車両への搭乗者の着座状態に関する情報を取得し、音声データなどの 再生をおこなう情報再生装置、情報再生方法、情報再生プログラムおよびコンビユー タに読み取り可能な記録媒体に関する。ただし、本発明の利用は、上述した情報再 生装置、情報再生方法、情報再生プログラムおよびコンピュータに読み取り可能な記 録媒体に限られない。  [0001] The present invention relates to an information reproduction apparatus, an information reproduction method, an information reproduction program, and a computer-readable recording medium that acquire information related to the sitting state of a passenger in a vehicle and reproduce audio data and the like. However, the use of the present invention is not limited to the above-described information reproducing apparatus, information reproducing method, information reproducing program, and computer-readable recording medium.
背景技術  Background art
[0002] Conventionally, there have been proposed in-vehicle equipment adjustment devices that identify the passenger seated in each seat of a vehicle and adjust the position and operating state of in-vehicle equipment, such as seats and mirrors, according to setting information stored for each passenger, as well as virtual speaker amplifiers that accurately localize a sound image according to the head shape of the listener, and the like.
[0003] In such an in-vehicle equipment adjustment device, each passenger is identified, for example, by the passenger selecting his or her own setting information from setting information registered in advance, or by the passenger carrying an ID card that stores his or her setting information and having the card read by an ID card reader provided for each seat of the vehicle.
[0004] In the virtual speaker amplifier described above, the sound image of an audio signal is localized, for example, by detecting the head shape of the listener, accurately simulating the transfer characteristics from the speaker positions to the listener's ears based on head-related transfer functions or the like to obtain filter coefficients, and localizing the sound image using the obtained filter coefficients.
[0005] For example, in the in-vehicle profile system and drive environment setting method described in Patent Document 1 below, the preferred in-vehicle environment set by each passenger at each seat of the vehicle (driver's seat, front passenger seat, and so on), such as the positions and operating states of in-vehicle equipment, is stored on an IC-based ID card as profile information for each passenger. The passenger's profile information is then read from the ID card and the in-vehicle equipment is adjusted accordingly, so that the vehicle interior is set to the passenger's preferred environment.
[0006] Also, for example, in the virtual speaker amplifier described in Patent Document 2 below, the face of the user (listener) is photographed with a CCD camera, and the width of the user's face and the size of the auricles are detected from this image. Using the face width and auricle size as the user's head shape data, head-related transfer functions, which are the transfer functions from the rear speakers to the user's ears, are computed. Filter processing is then performed by the DSP of a USB amplifier so as to realize the characteristics of these head-related transfer functions, whereby the sound image localization of the rear speakers is realized with the front speakers.
[0007] 特許文献 1 :特開 2002— 104105号公報  Patent Document 1: Japanese Patent Application Laid-Open No. 2002-104105
特許文献 2:特開 2003 - 230199号公報  Patent Document 2: JP 2003-230199 A
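As background for the filtering step described in paragraph [0006], the following is a minimal sketch of how a measured head-related impulse response pair could be applied to a signal to place a virtual source. It is an illustration of the general HRTF-convolution technique, not the implementation of Patent Document 2; the impulse-response arrays are assumed to come from an external measurement or database.

```python
import numpy as np

def render_virtual_source(mono: np.ndarray,
                          hrir_left: np.ndarray,
                          hrir_right: np.ndarray) -> np.ndarray:
    """Convolve a mono signal with a left-ear and right-ear head-related impulse
    response so the sound appears to arrive from the direction at which the
    impulse responses were measured. Returns a 2 x N stereo array."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right], axis=0)
```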
Disclosure of the invention
Problems to be solved by the invention
[0008] However, when an ID card is used to identify passengers as in Patent Document 1, a device for reading the ID card must be installed for each seat of the vehicle. One problem, for example, is that installing such special equipment for every seat imposes a burden in terms of interior space, cost, and so on.
[0009] In addition, an ID card must be created and carried for each passenger, which makes the creation and handling of the ID cards cumbersome. In particular, one problem, for example, is that if an ID card is lost or damaged, the passenger can no longer be identified.
[0010] Furthermore, with the prior art of Patent Document 1, although setting information for in-vehicle equipment reflecting the passenger's preferences can be stored, one problem, for example, is that the settings of the in-vehicle equipment cannot be actively adapted to the passenger.
[0011] Moreover, when the head-related transfer functions are obtained by photographing the listener's face image and the sound image is localized by filter processing as in Patent Document 2, complex processes such as image computation and audio filtering are indispensable. One problem, for example, is that sound image localization accompanying audio reproduction cannot be performed simply and easily.
Means for solving the problem
[0012] An information reproducing apparatus according to the invention of claim 1 includes: acquisition means for detecting the seating state of a passenger in a vehicle and acquiring information on the seating state; identification means for identifying the passenger based on the information on the seating state acquired by the acquisition means; reproduction means for reproducing audio data associated with the passenger identified by the identification means; and control means for controlling, based on the identification result obtained by the identification means, the sound field formed by the sound of the audio data reproduced by the reproduction means.
[0013] An information reproducing method according to the invention of claim 7 includes: an acquisition step of detecting the seating state of a passenger in a vehicle and acquiring information on the seating state; an identification step of identifying the passenger based on the information on the seating state acquired in the acquisition step; a reproduction step of reproducing audio data associated with the passenger identified in the identification step; and a control step of controlling, based on the identification result obtained in the identification step, the sound field formed by the sound of the audio data reproduced in the reproduction step.
[0014] An information reproducing program according to the invention of claim 8 causes a computer to execute the information reproducing method according to claim 7.
[0015] A computer-readable recording medium according to the invention of claim 9 records the information reproducing program according to claim 8.
Brief Description of Drawings
[0016] [FIG. 1] FIG. 1 is an explanatory diagram showing an example of the interior of a vehicle in which an information reproducing apparatus according to an embodiment is mounted.
[FIG. 2] FIG. 2 is a block diagram showing an example of the functional configuration of the information reproducing apparatus according to the embodiment.
[FIG. 3] FIG. 3 is a flowchart showing an example of an information reproduction processing procedure of the information reproducing apparatus according to the embodiment.
[FIG. 4] FIG. 4 is a block diagram showing an example of the hardware configuration of an information reproducing apparatus according to an example.
[FIG. 5] FIG. 5 is a block diagram showing an example of the internal configuration of an image processing unit in the information reproducing apparatus according to the example.
[FIG. 6] FIG. 6 is a block diagram showing an example of the internal configuration of an audio processing unit in the information reproducing apparatus according to the example.
[FIG. 7] FIG. 7 is an explanatory diagram showing an example of the interior of a vehicle in which the information reproducing apparatus according to the example is mounted.
[FIG. 8] FIG. 8 is a flowchart showing an example of an information reproduction processing procedure of the information reproducing apparatus according to the example.
Explanation of symbols
201 Acquisition unit
202 Identification unit
203 Reproduction unit
204 Control unit
205 Storage unit
206 Imaging device (camera)
207 Image display device
208 Audio output device
400 Navigation control unit
412 Image processing unit
414 Audio processing unit
510 Image analysis unit
610 Audio reproduction processing unit
613 Sound field control unit
BEST MODE FOR CARRYING OUT THE INVENTION
[0018] Exemplary embodiments of an information reproducing apparatus, an information reproducing method, an information reproducing program, and a computer-readable recording medium recording the program according to the present invention will be described in detail below with reference to the accompanying drawings.
[0019] (Embodiment)
First, the interior of a vehicle in which an information reproducing apparatus according to an embodiment of the present invention is mounted will be described. FIG. 1 is an explanatory diagram showing an example of the interior of a vehicle in which the information reproducing apparatus according to the embodiment of the present invention is mounted. In FIG. 1, for example, an image display device (display) 121a, an audio output device (speaker) 122, and an information reproducing device 126a are provided around a driver's seat 111 and a front passenger seat 112.
[0020] An imaging device (camera) 123 is provided on the ceiling 114 of the vehicle, and an image display device (display) 121b and an information reproducing device 126b are provided on the front passenger seat 112, facing the passengers in the rear seat 113. An audio output device (speaker), not shown, is provided behind the rear seat 113, and each of the information reproducing devices 126 (126a, 126b) is provided with an imaging device (camera) 123.
[0021] (Functional configuration of the information reproducing apparatus)
Next, the functional configuration of the information reproducing apparatus according to the embodiment of the present invention will be described. FIG. 2 is a block diagram showing an example of the functional configuration of the information reproducing apparatus according to the embodiment of the present invention. In FIG. 2, the information reproducing apparatus includes an acquisition unit 201, an identification unit 202, a reproduction unit 203, a control unit 204, and a storage unit 205, and further includes an imaging device (camera) 206, an image display device (display) 207, and an audio output device (speaker) 208. The information reproducing apparatus may have a structure that allows it to be attached to and detached from the vehicle.
[0022] The acquisition unit 201 detects the seating state of passengers in the seats 111 to 113 of the vehicle in FIG. 1 and acquires information on the seating state. The seats 111 to 113 are the seats of the vehicle in which the information reproducing apparatus is mounted. A passenger is a person riding in the vehicle, and includes not only the driver in the driver's seat 111 but also persons in the front passenger seat 112 and the rear seat 113. Detecting the seating state specifically means, for example, detecting at which position in the vehicle a passenger is seated, using image data of an image captured by the imaging device 206 described later.
[0023] The information on the seating state specifically includes, for example, the passenger's seating position information, sitting height information, face image information, and ear position information. The acquisition unit 201 may also be configured with so-called seating sensors provided in the seats 111 to 113, for example.
[0024] Here, the information on the seating state may include, for example, information on the body parts of the passenger. The body parts of the passenger are the positions of the individual parts of the body. Information on body parts is specifically, for example, information on the position of each part of the body, such as the positions of the shoulders, abdomen, and head, the length of the arms and legs, height, weight, and sitting height. The information on body parts also includes, for example, information on the positions of the eyes and ears on the head. Furthermore, the information on body parts is not limited to such position information and may be information representing the capabilities of parts of the body, such as visual acuity and hearing ability.
[0025] When such body part information is acquired as information on the seating state, the acquisition unit 201 specifically analyzes, for example, image data captured by the imaging device 206 and calculates and acquires the body part information based on the resulting image of the passenger's body parts. The acquisition unit 201 may also acquire body part information as information on the seating state through an input operation by the passenger or by reading it from an external recording device, for example.
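To picture the detection just described, the following is a minimal sketch, not taken from the patent, of deciding which seats are occupied by comparing fixed per-seat regions of a cabin camera frame against an empty-cabin reference image. The region coordinates and threshold are illustrative assumptions.

```python
import numpy as np

# Hypothetical pixel regions (y0, y1, x0, x1) of each seat in the cabin camera frame.
SEAT_REGIONS = {
    "driver":    (100, 220, 40, 160),
    "passenger": (100, 220, 200, 320),
    "rear":      (240, 360, 40, 320),
}

def detect_seating(frame: np.ndarray, empty_ref: np.ndarray,
                   threshold: float = 25.0) -> dict:
    """Return {seat_name: bool} by measuring how much each seat region of the
    current frame differs from the same region of an empty-cabin reference."""
    occupied = {}
    for name, (y0, y1, x0, x1) in SEAT_REGIONS.items():
        diff = np.abs(frame[y0:y1, x0:x1].astype(float)
                      - empty_ref[y0:y1, x0:x1].astype(float))
        occupied[name] = diff.mean() > threshold  # large change -> someone is seated
    return occupied
```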
[0026] The identification unit 202 identifies the passenger based on the information on the seating state acquired by the acquisition unit 201. Identification means determining, for example, whether there is reproduction information for image data on the image display device 207 and for audio data on the audio output device 208 corresponding to the passenger, and what reproduction information is associated with the passenger.
[0027] In order to identify the passenger in more detail, face authentication processing may be performed using, for example, the passenger's face image information included in the information on the seating state, so that the individual person is specified and identified. The information on the seating state and information such as the identification result obtained by the identification unit 202 are stored in the storage unit 205 described later.
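A minimal sketch, not from the patent, of how such face-based identification could be organized: each registered passenger is represented by a face feature vector, and a detected face is matched to the nearest registered vector. The feature extractor that produces the vectors and the distance threshold are assumptions.

```python
import numpy as np

def identify_passenger(face_vec: np.ndarray, profiles: dict,
                       max_dist: float = 0.6):
    """profiles maps passenger_id -> registered face feature vector.
    Returns the closest passenger_id, or None if no registered face is close enough."""
    best_id, best_dist = None, float("inf")
    for pid, ref_vec in profiles.items():
        dist = float(np.linalg.norm(face_vec - ref_vec))
        if dist < best_dist:
            best_id, best_dist = pid, dist
    return best_id if best_dist <= max_dist else None
```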
[0028] The reproduction unit 203 reproduces image data and audio data associated with the passenger identified by the identification unit 202. The image data and audio data are reproduced based on reproduction information stored in the storage unit 205. Examples of image data reproduced by the reproduction unit 203 include image data of recorded TV programs and image data captured by the imaging device 206. The audio data reproduced by the reproduction unit 203 includes audio data such as music and sound effects.
[0029] Based on the identification result obtained by the identification unit 202, the control unit 204 controls the sound field formed by the sound output from the audio output device 208 according to the audio data reproduced by the reproduction unit 203. Control of the sound field specifically includes, for example, adjusting the volume of the sound output from the audio output device 208, equalizing processing, and sound image localization.
[0030] When the control unit 204 performs sound image localization as sound field control, the sound image is localized using, for example, a known sound image localization method. Since such methods are publicly known, their description is omitted here; as sound image localization methods it is possible to use, for example, a method of filtering the audio data based on head-related transfer functions to generate a virtual sound source in the sound field inside the vehicle, or a method of varying the audio output delay characteristics of the audio output device 208.
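As an illustration of the delay-based approach mentioned above (a sketch under an assumed cabin geometry, not the patent's implementation), per-speaker delays and gains can be derived from the distance between each speaker and the identified passenger's seat so that all speaker signals arrive aligned at the listener:

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s
SAMPLE_RATE = 44100     # Hz, assumed output rate

def delays_and_gains(speaker_positions: dict, listener_pos: np.ndarray) -> dict:
    """speaker_positions maps speaker name -> (x, y, z) in metres, cabin coordinates.
    Returns per-speaker delay in samples and a simple 1/distance gain."""
    dists = {name: float(np.linalg.norm(np.asarray(pos) - listener_pos))
             for name, pos in speaker_positions.items()}
    farthest = max(dists.values())
    out = {}
    for name, d in dists.items():
        delay_s = (farthest - d) / SPEED_OF_SOUND   # delay the nearer speakers
        out[name] = {"delay_samples": int(round(delay_s * SAMPLE_RATE)),
                     "gain": min(1.0, 1.0 / max(d, 0.1))}
    return out
```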
[0031] As described above, the storage unit 205 stores the information on the seating state and information such as the identification result obtained by the identification unit 202. The storage unit 205 also stores reproduction information for the image data and audio data associated with each passenger. The reproduction information includes, for example, information on the reproduction history and music selection history of image data and audio data reflecting each passenger's preferences, or information on the selection of images and sounds considered well suited to the passenger based on the identified passenger's age and gender. Depending on its type, the various information stored in the storage unit 205 is, for example, stored temporarily or stored in a learning or statistical manner, as in a database.
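One possible shape for such per-passenger reproduction information is sketched below. This is an illustrative data structure, not one defined by the patent; the field names are assumptions.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class PassengerProfile:
    """Hypothetical record held by the storage unit for one identified passenger."""
    passenger_id: str
    age: Optional[int] = None
    gender: Optional[str] = None
    playback_history: list = field(default_factory=list)  # track / programme IDs
    preferred_volume: float = 0.5                          # 0.0 .. 1.0

    def record_playback(self, item_id: str) -> None:
        # Accumulating history lets later selections follow the passenger's preferences.
        self.playback_history.append(item_id)
```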
[0032] The imaging device 206 captures, for example, images of the vehicle interior. The image data of the captured images is provided to the acquisition unit 201. The image display device 207 displays the image data reproduced by the reproduction unit 203 on its display screen. The audio output device 208 outputs, into the vehicle interior, sound based on the audio data reproduced by the reproduction unit 203 and whose sound field is controlled by the control unit 204.
[0033] (Information reproduction processing procedure of the information reproducing apparatus)
Next, the information reproduction processing procedure of the information reproducing apparatus according to the embodiment of the present invention will be described. FIG. 3 is a flowchart showing an example of the information reproduction processing procedure of the information reproducing apparatus according to the embodiment of the present invention. First, an image of the vehicle interior is captured by the imaging device 206 (see FIG. 2; the same applies hereinafter) (step S301). Next, based on the image data of the captured image, it is determined whether a passenger is seated in a seat (step S302).
[0034] If it is determined that no passenger is seated in the seats 111 to 113 (see FIG. 1; the same applies hereinafter) (step S302: No), the process returns to step S301 and an image is captured again. If it is determined that a passenger is seated in one of the seats 111 to 113 (step S302: Yes), the acquisition unit 201 (see FIG. 2; the same applies hereinafter) detects the seating state and acquires information on the seating state (step S303).
[0035] When the information on the seating state has been acquired, the identification unit 202 (see FIG. 2; the same applies hereinafter) identifies the passenger (step S304). When the passenger has been identified, the reproduction unit 203 (see FIG. 2; the same applies hereinafter) reads the reproduction information of the image data and audio data associated with the identified passenger from the storage unit 205 (see FIG. 2; the same applies hereinafter) and reproduces the images and sound on the image display device 207 (see FIG. 2; the same applies hereinafter) and the audio output device 208 (see FIG. 2; the same applies hereinafter) (step S305).
[0036] During the reproduction of the image data and audio data by the reproduction unit 203, the control unit 204 controls, based on the identification result obtained by the identification unit 202, the sound field formed by the sound of the audio data reproduced by the reproduction unit 203 (step S306). The sound field control in step S306 includes, for example, localizing the sound image according to the passenger's seating position so as to form a sound field in which the passenger can hear the sound optimally. The series of information reproduction processes in this flowchart then ends.
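Tying steps S301 to S306 together, a compact sketch of the control flow might look as follows. It reuses the helper functions sketched earlier and assumes hypothetical `camera`, `storage`, and `player` objects standing in for the imaging device 206, the storage unit 205, and the reproduction/control units 203 and 204; it is not the patent's actual implementation.

```python
def information_reproduction_cycle(camera, storage, player, empty_ref, profiles):
    """One pass through steps S301-S306 of the flowchart sketch."""
    while True:
        frame = camera.capture_interior()              # S301: capture a cabin image
        seating = detect_seating(frame, empty_ref)     # S302: is anyone seated?
        if any(seating.values()):
            break                                      # S302: Yes -> continue
    face_vec = camera.face_features(frame)             # S303: seating-state information
    passenger_id = identify_passenger(face_vec, profiles)   # S304: identify passenger
    player.play(storage.reproduction_info(passenger_id))    # S305: play associated data
    listener_pos = storage.seat_coordinates(seating)         # S306: shape the sound field
    player.apply_sound_field(delays_and_gains(player.speaker_positions, listener_pos))
```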
[0037] As described above, according to the information reproducing apparatus of the embodiment of the present invention, the passenger is identified based on information on the passenger's seating state in the vehicle, the image data and audio data associated with the identified passenger are reproduced, and the sound field formed by the sound of the reproduced audio data can be controlled based on the identification result. The sound field inside the vehicle can therefore be brought automatically to an optimum state for the passenger without manual input operations for sound field control.
[0038] Next, an example based on the embodiment of the present invention will be described in detail. Here, a case where the information reproducing apparatus according to this embodiment is applied to an in-vehicle navigation apparatus is described as an example.
Example
[0039] (Hardware configuration of the information reproducing apparatus)
First, the hardware configuration of the information reproducing apparatus according to the example of the present invention will be described. FIG. 4 is a block diagram showing an example of the hardware configuration of the information reproducing apparatus according to the example of the present invention.
[0041] ナビゲーシヨン制御部 400は、たとえば情報再生装置全体の制御を司り、制御プロ グラムにしたがって各種の演算処理を実行することにより、情報再生装置が備える各 部を統括的に制御する。ナビゲーシヨン制御部 400は、たとえば所定の演算処理を 実行する CPU (Central Processing Unit)や、各種制御プログラムを格納する R OM (Read Only Memory)および CPUのワークエリアとして機能する RAM (Ran dom Access Memory)などによって構成されるマイクロコンピュータなどによって 実現することができる。  [0041] The navigation control unit 400 controls the entire information reproducing apparatus, for example, and performs various arithmetic processes according to the control program, thereby comprehensively controlling each unit included in the information reproducing apparatus. The navigation control unit 400 includes, for example, a CPU (Central Processing Unit) that executes predetermined arithmetic processing, a ROM (Read Only Memory) that stores various control programs, and a RAM (Random Access Memory) that functions as a work area for the CPU. ) Or the like.
[0042] また、ナビゲーシヨン制御部 400は、たとえば車両の経路誘導に際し、位置取得部 403によって取得された車両の現在位置に関する情報 (現在位置情報)と、記録媒 体 404から記録媒体デコード部 405を介して得られた地図情報とに基づ 、て、地図 上のどの位置を車両が走行しているかを算出し、算出結果を表示部 402へ出力する 。このナビゲーシヨン制御部 400は、上記経路誘導に際し、経路探索部 408、経路誘 導部 409および案内音生成部 410の間で経路誘導に関する情報の入出力をおこな い、その結果得られる情報を表示部 402および案内音出力部 406へ出力する。  [0042] In addition, the navigation control unit 400, for example, when the vehicle route is guided, information on the current position of the vehicle (current position information) acquired by the position acquisition unit 403 and the recording medium 404 to the recording medium decoding unit 405 Based on the map information obtained via the map, the position on the map where the vehicle is traveling is calculated, and the calculation result is output to the display unit 402. The navigation control unit 400 inputs / outputs information related to route guidance between the route search unit 408, the route guidance unit 409, and the guidance sound generation unit 410 when performing the above-described route guidance, and obtains information obtained as a result. The information is output to the display unit 402 and the guidance sound output unit 406.
[0043] ユーザ操作部 401は、文字、数値、各種指示など、ユーザによって入力操作された 情報をナビゲーション制御部 400に対して出力する。ユーザ操作部 401の構成とし ては、たとえば物理的な押下 Z非押下を検出する押しボタン式スィッチ、タツチパネ ル、キーボード、ジョイスティックなどの公知の各種形態を採用することができる。この ユーザ操作部 401は、たとえば外部力もの音声を入力するマイクなどを用いて、音声 によって入力操作をおこなう形態としてもよ!、。 [0043] The user operation unit 401 outputs information input by the user, such as characters, numerical values, and various instructions, to the navigation control unit 400. As the configuration of the user operation unit 401, various known forms such as a push button switch that detects physical pressing Z non-pressing, a touch panel, a keyboard, and a joystick can be employed. This user operation unit 401 uses, for example, a microphone for inputting sound of external power to You can also use the input operation mode!
[0044] また、このユーザ操作部 401は、情報再生装置に対して一体的に設けられていても よいし、リモコンのように情報再生装置力 離間した位置力 操作可能な形態であつ てもよい。また、ユーザ操作部 401は、上述した各種形態のうちのいずれか一つの形 態で構成されていても、複数の形態で構成されていてもよい。ユーザは、ユーザ操作 部 401の形態に応じて、適宜入力操作をおこなうことで情報を入力する。  [0044] Further, the user operation unit 401 may be provided integrally with the information reproducing apparatus, or may be configured to be able to be operated with a position force separated from the information reproducing apparatus, such as a remote controller. . In addition, the user operation unit 401 may be configured in any one of the various forms described above, or may be configured in a plurality of forms. The user inputs information by appropriately performing an input operation according to the form of the user operation unit 401.
[0045] ユーザ操作部 401の入力操作によって入力される情報として、たとえばナビゲーシ ヨンに関しては、目的地の情報などが挙げられる。具体的に、たとえば情報再生装置 が車両などに備えられている場合には、この車両の搭乗者が到達目標とする地点が 設定される。また、ユーザ操作部 401に入力される情報として、たとえば情報再生に 関しては、後述する音声処理部 414によって再生される音声の選択情報などが挙げ られる。具体的には、たとえばこの車両の搭乗者が所望とする楽曲などの音声データ が選択され設定される。  [0045] Information input by an input operation of the user operation unit 401 includes, for example, information on a destination regarding navigation. Specifically, for example, when the information reproducing apparatus is provided in a vehicle or the like, a point to be reached by the passenger of this vehicle is set. Further, as information input to the user operation unit 401, for example, regarding information reproduction, selection information of audio reproduced by an audio processing unit 414 described later can be cited. Specifically, for example, audio data such as music desired by the passenger of this vehicle is selected and set.
[0046] ここで、たとえばユーザ操作部 401の形態としてタツチパネルを採用する場合、この タツチパネルは、表示部 402の表示画面側に積層して使用される。この場合、表示 部 402における表示タイミングと、タツチパネル (ユーザ操作部 401)に対する操作タ イミングおよびその位置座標とを管理することによって、入力操作による入力情報を 認識する。  Here, for example, when a touch panel is adopted as the form of the user operation unit 401, the touch panel is used by being stacked on the display screen side of the display unit 402. In this case, the input timing by the input operation is recognized by managing the display timing on the display unit 402, the operation timing for the touch panel (user operation unit 401) and its position coordinates.
[0047] ユーザ操作部 401の形態として表示部 402に積層されたタツチパネルを採用する ことにより、ユーザ操作部 401の形態を大型化することなぐ多くの情報入力をおこな うことができる。このタツチパネルとしては、抵抗膜式、感圧式など公知の各種タツチ パネルを採用することが可能である。  [0047] By adopting a touch panel stacked on the display unit 402 as a form of the user operation unit 401, it is possible to input a large amount of information without enlarging the form of the user operation unit 401. As this touch panel, various known touch panels such as a resistive film type and a pressure sensitive type can be adopted.
[0048] 表示部 402は、たとえば CRT (Cathode Ray Tube)、 TFT液晶ディスプレイ、 有機 ELディスプレイ、プラズマディスプレイなどを含む。表示部 402は、具体的には 、たとえば図示しな 、映像 IZFや映像 IZFに接続された映像表示用のディスプレイ 装置によって構成することができる。  [0048] Display unit 402 includes, for example, a CRT (Cathode Ray Tube), a TFT liquid crystal display, an organic EL display, a plasma display, and the like. Specifically, the display unit 402 can be configured by, for example, a video IZF or a video display device connected to the video IZF, not shown.
[0049] ここで、映像 IZFは、具体的には、たとえばディスプレイ装置全体の制御をおこなう グラフィックコントローラと、即時表示可能な画像情報を一時的に記憶する VRAM (V ideo RAM)などのバッファメモリと、グラフィックコントローラから出力される画像情報 に基づいて、ディスプレイ装置を表示制御する制御 ICや GPU (Graphics Process ing Unit)などによって構成される。表示部 402には、アイコン、カーソル、メニュー 、ウィンドウあるいは文字や画像などの各種情報が表示される。また、表示部 402に は、後述する画像処理部 412によって処理された画像データが表示される。 Here, specifically, the video IZF is, for example, a graphic controller that controls the entire display device, and a VRAM (V It consists of a buffer memory such as ideo RAM) and a control IC or GPU (Graphics Processing Unit) that controls display based on image information output from the graphics controller. The display unit 402 displays icons, cursors, menus, windows, or various information such as characters and images. Further, the display unit 402 displays image data processed by an image processing unit 412 described later.
[0050] The position acquisition unit 403 acquires the current position information (latitude and longitude information) of the vehicle in which the information reproducing apparatus is mounted, for example, by receiving radio waves from artificial satellites. The current position information is obtained by receiving radio waves from the satellites and determining the geometric position relative to the satellites, and can be measured anywhere on the earth. The position acquisition unit 403 includes a GPS antenna, not shown.
[0051] GPS (Global Positioning System) is a system that accurately determines a position on the ground by receiving radio waves from four or more satellites. Since it is a publicly known technique, the description of GPS is omitted here. The position acquisition unit 403 can be configured by, for example, a tuner that demodulates the radio waves received from the satellites and an arithmetic circuit that calculates the current position based on the demodulated information.
[0052] The radio waves from the satellites are, for example, L1 waves, which have a carrier frequency of 1.57542 GHz and carry the C/A (Coarse/Acquisition) code and the navigation message. The current position (latitude and longitude) of the vehicle in which the information reproducing apparatus is mounted is thereby detected. When detecting the current position of the vehicle, information collected by various sensors such as a vehicle speed sensor and a gyro sensor may also be taken into account. The vehicle speed sensor detects the vehicle speed from the output-side shaft of the transmission of the vehicle in which the information reproducing apparatus is mounted.
[0053] In addition, when detecting the current position of the vehicle, information collected by various sensors such as an angular velocity sensor, a travel distance sensor, a tilt angle sensor, and a lateral G (gravity) sensor may be taken into account. The angular velocity sensor detects the angular velocity when the vehicle turns and outputs angular velocity information and relative heading information. The travel distance sensor calculates the number of pulses per wheel revolution by counting the pulses of a pulse signal of a predetermined period output as the wheels rotate, and outputs travel distance information based on the number of pulses per revolution.
[0054] The tilt angle sensor detects the tilt angle of the road surface and outputs tilt angle information. The lateral G sensor detects lateral G, the outward force (gravity) generated by centrifugal force when the vehicle corners, and outputs lateral G information. The current position information of the vehicle acquired by the position acquisition unit 403 and the information detected by the vehicle speed sensor, gyro sensor, angular velocity sensor, travel distance sensor, tilt angle sensor, and lateral G sensor are output to the navigation control unit 400.
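As a simple illustration of how the sensor outputs listed above could supplement GPS between fixes (a dead-reckoning sketch with assumed units, not the patent's positioning algorithm), a position estimate can be advanced from the vehicle speed and yaw rate:

```python
import math

def dead_reckon(x: float, y: float, heading_rad: float,
                speed_mps: float, yaw_rate_rps: float, dt: float):
    """Advance an (x, y, heading) estimate by one time step dt using the
    vehicle speed sensor and the angular velocity (gyro) sensor outputs."""
    heading_rad += yaw_rate_rps * dt
    x += speed_mps * dt * math.cos(heading_rad)
    y += speed_mps * dt * math.sin(heading_rad)
    return x, y, heading_rad
```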
[0055] The recording medium 404 records various control programs and various information in a form readable by a computer. The recording medium 404 accepts the writing of information by the recording medium decoding unit 405 and records the written information in a nonvolatile manner. The recording medium 404 can be realized, for example, by an HD (hard disk).
[0056] The recording medium 404 is not limited to an HD; instead of or in addition to an HD, a medium that is removable from the recording medium decoding unit 405 and portable, such as a DVD (Digital Versatile Disk) or CD (Compact Disk), may be used as the recording medium 404. Furthermore, the recording medium 404 is not limited to DVDs and CDs; removable and portable media such as CD-ROMs (CD-R, CD-RW), MOs (Magneto-Optical disks), and memory cards, which can be attached to and detached from the recording medium decoding unit 405, can also be used.
[0057] The recording medium 404 records an information reproducing program that realizes the present invention, a navigation program, image data, audio data, map information, and the like. Here, the image data refers to, for example, the values of a two-dimensional array representing an image captured of the vehicle interior. The audio data refers to data for reproducing music such as musical pieces. The map information includes background information representing features such as buildings, rivers, and the ground surface, and road shape information representing the shapes of roads, and is drawn in two or three dimensions on the display screen of the display unit 402.
[0058] The background information includes background shape information representing the shape of the background and background type information representing the type of the background. The background shape information includes, for example, information indicating representative points, polylines, polygons, and coordinates of features. The background type information includes, for example, text information indicating the names, addresses, telephone numbers, and the like of features, and type information indicating the type of a feature, such as a building or a river.
[0059] The road shape information is information on a road network having a plurality of nodes and links. A node is information indicating an intersection where a plurality of roads cross, such as a three-way, four-way, or five-way intersection. A link is information indicating a road connecting nodes. Some links have shape interpolation points that make it possible to represent curved roads. The road shape information includes traffic condition information. The traffic condition information is information indicating the characteristics of intersections, the length (distance) of each link, vehicle width, travel direction, traffic prohibitions, road type, and the like.
[0060] The characteristics of intersections include, for example, complex intersections such as three-way and five-way intersections, intersections where roads branch at a shallow angle, intersections near the destination, expressway entrances/exits and junctions, and intersections with a high route deviation rate. The route deviation rate can be calculated, for example, from past travel histories. Road types include, for example, expressways, toll roads, and ordinary roads.
[0061] In this example, the image data, audio data, and map information are, for example, recorded on the recording medium 404, but this is not a limitation. The image data, audio data, and map information are not limited to being recorded integrally with the hardware of the information reproducing apparatus, and may be provided outside the information reproducing apparatus.
[0062] In that case, the information reproducing apparatus acquires image data and audio data via a network through the communication unit 407, for example. The information reproducing apparatus also acquires map information via a network through the communication unit 407, for example. The image data, audio data, and map information acquired in this way may be stored in, for example, the RAM of the navigation control unit 400.
[0063] The recording medium decoding unit 405 controls the reading and writing of information on the recording medium 404. For example, when an HD is used as the recording medium 404, the recording medium decoding unit 405 is an HDD (Hard Disk Drive). Similarly, when a DVD or CD (including CD-R and CD-RW) is used as the recording medium 404, the recording medium decoding unit 405 is a DVD drive or a CD drive.
[0064] When a writable and removable recording medium 404 such as a CD-ROM (CD-R, CD-RW), MO, or memory card is used, a dedicated drive device capable of writing information to the various recording media and reading information stored on them may be used as the recording medium decoding unit 405, as appropriate.
[0065] The guidance sound output unit 406 reproduces the navigation guidance sound by controlling the output to the connected speaker 411. There may be one speaker 411 or a plurality of speakers 411. Specifically, the guidance sound output unit 406 can be realized by an audio I/F (not shown) connected to the speaker 411 for audio output. More specifically, the audio I/F can be composed of, for example, a D/A converter that performs D/A conversion of digital audio data, an amplifier that amplifies the analog audio signal output from the D/A converter, and an A/D converter that performs A/D conversion of analog audio signals.
[0066] The communication unit 407 communicates with, for example, other information reproducing apparatuses. The communication unit 407 of this example may be a communication module that communicates with a communication server (not shown) via a base station (not shown), like a mobile phone, or a communication module that performs direct wireless communication with other information reproducing apparatuses.
[0067] Here, wireless communication is communication performed using radio waves, infrared rays, or ultrasonic waves without using wire lines as the communication medium. Standards that enable wireless communication include various technologies such as wireless LAN, IrDA (Infrared Data Association), HomeRF (Home Radio Frequency), and Bluetooth, and in this example any of the various known wireless communication technologies can be used. In view of the information transfer speed and other factors, wireless LAN can preferably be used as one example.
[0068] The communication unit 407 may also receive road traffic information, such as congestion and traffic regulations, periodically (or irregularly). The reception of road traffic information by the communication unit 407 may be performed at the timing at which the VICS (Vehicle Information and Communication System) center distributes the road traffic information, or by periodically requesting road traffic information from the VICS center. The communication unit 407 can be realized as, for example, an AM/FM tuner, a TV tuner, a VICS/beacon receiver, or other communication equipment.
[0069] VICS, which is a publicly known technique and therefore not described in detail, is an information and communication system that transmits road traffic information such as congestion and traffic regulations, edited and processed at the VICS center, to in-vehicle equipment such as car navigation apparatuses in real time and displays it as text and graphics. Methods of conveying the road traffic information (VICS information) edited and processed at the VICS center to the navigation apparatus include the use of "beacons" installed along the roads and "FM multiplex broadcasting".
[0070] "Beacons" include "radio wave beacons" used mainly on expressways and "optical beacons" used on major ordinary roads. When "FM multiplex broadcasting" is used, road traffic information for a wide area can be received. When "beacons" are used, it is possible to receive the road traffic information needed at the location of the vehicle, such as detailed information on nearby roads based on the vehicle's position. If the communication method used with other information reproducing apparatuses differs from the communication method used to receive image data, audio data, and road traffic information, the communication unit 407 may include a plurality of communication means corresponding to each.
[0071] The route search unit 408 calculates an optimum route from the current position to the destination based on the current position information of the vehicle acquired by the position acquisition unit 403 and the destination information input by the user. The route guidance unit 409 generates real-time route guidance information based on the information on the guidance route found by the route search unit 408 or the route information received by the communication unit 407, the current position information acquired by the position acquisition unit 403, and the map information obtained from the recording medium 404 via the recording medium decoding unit 405. The route guidance information generated by the route guidance unit 409 is output to the display unit 402 via the navigation control unit 400.
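To make the search over the node-and-link road network described above concrete, the following is a minimal sketch of a standard shortest-path search on an assumed adjacency-list graph; it illustrates the general technique, not the route search unit 408's specific algorithm or cost model.

```python
import heapq

def search_route(links: dict, start: str, goal: str):
    """links maps node_id -> list of (neighbour_node_id, link_length).
    Returns (total_length, list_of_node_ids) for the shortest route, or None."""
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbour, length in links.get(node, []):
            if neighbour not in visited:
                heapq.heappush(queue, (cost + length, neighbour, path + [neighbour]))
    return None  # no route between the two nodes
```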
[0072] The guidance sound generation unit 410 generates tone and voice information corresponding to patterns. That is, based on the route guidance information generated by the route guidance unit 409, it sets virtual sound sources corresponding to guidance points and generates voice guidance information, and outputs these to the guidance sound output unit 406 via the navigation control unit 400.
[0073] The speaker 411 reproduces (outputs) the navigation guidance sound output from the guidance sound output unit 406 and the sound output from the audio processing unit 414 described later. For example, headphones or the like may be provided for the speaker 411, and the output form of the guidance sound and other audio may be changed as appropriate so that the entire vehicle interior does not become the sound field into which the guidance sound and audio are output.
[0074] The image processing unit 412 performs image processing on image data acquired from the imaging unit 415 and the communication unit 407, described later, through the image input/output I/F 413, on image data recorded on the recording medium 404, and the like. Specifically, the image processing unit 412 is configured by, for example, a GPU. The image processing includes, for example, reproducing image data of recorded TV programs and the like acquired via the image input/output I/F 413 based on the reproduction information associated with the passenger. The image processing unit 412 also performs, for example, analysis processing of image data in accordance with control commands from the navigation control unit 400.
[0075] The analysis of image data means identifying and recognizing the passengers from images of the vehicle interior and analyzing information on the passengers' seating states, based on image data captured by the imaging unit 415, which includes imaging devices such as a DSC (Digital Still Camera) or DVC (Digital Video Camera). Specifically, the analysis of the information on the seating state is performed by judging in what state a passenger is seated, based on, for example, the passenger's seating position information inside the vehicle and information on body parts. For this purpose, the image processing unit 412 may be configured to have the functions of a DSP (Digital Signal Processor), for example.
[0076] The image input/output I/F 413 inputs and outputs image data exchanged between the image processing unit 412 and the outside. The image input/output I/F 413 outputs to the image processing unit 412, for example, image data from the recording medium 404 storing image data captured by the above-mentioned DSC or DVC, and image data stored in a DSC or DVC and input from the communication unit 407 by communication such as USB (Universal Serial Bus), IEEE 1394 (Institute of Electrical and Electronics Engineers 1394), or infrared, and outputs image data output from the image processing unit 412 to the recording medium 404 and the communication unit 407.
[0077] When inputting and outputting image data to and from the recording medium 404, the image input/output I/F 413 preferably has the function of a controller that controls reading and writing on the recording medium 404. When inputting and outputting image data to and from the communication unit 407, the image input/output I/F 413 preferably has the function of a communication controller that controls communication by the communication unit 407.
[0078] The audio processing unit 414 selects audio data obtained from the recording medium 404 through the recording medium decoding unit 405, audio data obtained from the communication unit 407 through the navigation control unit 400, and the like, and performs reproduction processing of the selected audio data. The audio processing unit 414 also performs reproduction processing of audio data stored in a storage device such as an audio database (hereinafter, "audio DB") 611 (see FIG. 6) described later. The audio data to be reproduced includes audio data such as musical pieces and sound effects. The reproduction processing includes, for example, control of the sound field formed by the sound output from the speaker 411. When the information reproducing apparatus includes an AM/FM tuner or a TV tuner, the audio processing unit 414 may also be configured to reproduce radio or television sound, for example.
[0079] The audio processing unit 414 also performs reproduction processing of audio data based on the audio data reproduction information associated with the passenger identified and recognized by the image processing unit 412. Based on the selected and reproduced audio data, the audio processing unit 414 controls the output of the sound output from the speaker 411. Specifically, it controls the sound output state by, for example, adjusting the sound volume, performing equalizing processing, and performing sound image localization. This control of the sound output by the audio processing unit 414 is performed, for example, in response to input operations from the user operation unit 401 or under the control of the navigation control unit 400.
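As a small illustration of the volume adjustment and equalizing mentioned above, the following sketch operates on a raw sample array and splits it into two bands around an assumed 1 kHz crossover; it is a generic example, not the audio processing unit 414's actual pipeline.

```python
import numpy as np

def apply_volume_and_eq(samples: np.ndarray, sample_rate: int,
                        volume: float, bass_gain: float, treble_gain: float) -> np.ndarray:
    """Scale the overall volume, then apply a crude two-band equalizer by splitting
    the signal at ~1 kHz with a one-pole low-pass filter and re-mixing the bands."""
    alpha = 1.0 / (1.0 + sample_rate / (2.0 * np.pi * 1000.0))  # ~1 kHz crossover
    low = np.empty(len(samples), dtype=float)
    acc = 0.0
    for i, s in enumerate(samples.astype(float)):
        acc += alpha * (s - acc)   # one-pole low-pass state update
        low[i] = acc
    high = samples.astype(float) - low
    return volume * (bass_gain * low + treble_gain * high)
```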
[0080] 撮影部 415は、図 1における車両に搭載された撮影装置 (カメラ) 123や、上述した DSCおよび DVCなどの外部の撮影装置により構成され、 C— MOSあるいは CCDな どの光電変換素子を有し、車両内部の画像を撮影する。この撮影部 415は、情報再 生装置と有線または無線で接続され、ナビゲーシヨン制御部 400からの撮影命令に より、たとえば車両内部の搭乗者を含む映像 (画像)を撮影する。撮影部 415で撮影 された画像の画像データは、画像入出力 IZF413を介して画像処理部 412に出力 される。  [0080] The imaging unit 415 includes an imaging device (camera) 123 mounted on the vehicle in FIG. 1 and an external imaging device such as the above-described DSC and DVC, and includes a photoelectric conversion element such as a C-MOS or a CCD. And take images inside the vehicle. This photographing unit 415 is connected to the information reproducing device by wire or wirelessly, and photographs a video (image) including, for example, a passenger inside the vehicle by a photographing command from the navigation control unit 400. Image data of the image captured by the imaging unit 415 is output to the image processing unit 412 via the image input / output IZF 413.
[0081] Instead of or in addition to the image data captured by the imaging unit 415 and analyzed by the image processing unit 412, the information reproducing apparatus may acquire the information on the seating state of the passengers inside the vehicle based on signals detected by seating sensors, such as membrane switches, mounted in the seats 111 to 113 (see FIG. 1; the same applies hereinafter). In this case, the signals from the seating sensors are input to the navigation control unit 400, and the navigation control unit 400 determines the seating state of the passengers.
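As an illustration of this seating-sensor alternative, the following sketch shows how the navigation control unit 400 might map membrane-switch signals to a per-seat occupancy state; the seat labels and the signal format are assumptions, not details given in the patent.

```python
# Hypothetical sketch: seat names and the raw signal format are assumed.
SEAT_NAMES = ("driver", "front_passenger", "rear_left", "rear_right")

def seating_state_from_sensors(sensor_signals: dict) -> dict:
    """Map raw seating-sensor signals (e.g. membrane switches in each seat)
    to a per-seat occupancy flag, as the navigation control unit 400 might do."""
    return {seat: bool(sensor_signals.get(seat, 0)) for seat in SEAT_NAMES}

# Example: only the driver-seat switch is closed.
print(seating_state_from_sensors({"driver": 1}))
# {'driver': True, 'front_passenger': False, 'rear_left': False, 'rear_right': False}
```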
[0082] Specifically, the functions of the acquisition unit 201 and the identification unit 202 in FIG. 2 are realized by, for example, the image processing unit 412 and the navigation control unit 400, and the function of the reproduction unit 203 is realized by, for example, the image processing unit 412, the audio processing unit 414, and the navigation control unit 400. The function of the control unit 204 in FIG. 2 is realized by, for example, the audio processing unit 414 and the navigation control unit 400, and the function of the storage unit 205 is realized by, for example, the recording medium 404 and the recording medium decoding unit 405.
[0083] The function of the imaging device 206 in FIG. 2 is realized by, for example, the imaging unit 415. Further, the function of the image display device 207 in FIG. 2 is realized by, for example, the display unit 402, and the function of the audio output device 208 is realized by, for example, the speakers 411.
[0084] The internal configurations of the image processing unit 412 and the audio processing unit 414 are described next. FIG. 5 is a block diagram showing an example of the internal configuration of the image processing unit in the information reproducing apparatus according to the embodiment of the present invention, and FIG. 6 is a block diagram showing an example of the internal configuration of the audio processing unit in the information reproducing apparatus according to the embodiment.
[0085] In FIG. 5, the image processing unit 412 includes an image analysis unit 510, a display control unit 511, an image recognition unit 512, an image storage unit 513, a passenger recognition unit 514, and a passenger database (hereinafter, "passenger DB") 515. The image analysis unit 510 analyzes image data input to the image processing unit 412 from the imaging unit 415 (see FIG. 4; the same applies hereinafter) or from an external source through the image input/output I/F 413, as well as image data input to the image processing unit 412 from the recording medium 404 (see FIG. 4; the same applies hereinafter) through the recording medium decoding unit 405 (see FIG. 4; the same applies hereinafter) and the navigation control unit 400 (see FIG. 4; the same applies hereinafter). The image analysis unit 510 is specifically constituted by, for example, a GPU.
[0086] The display control unit 511 performs control for displaying the image data output from the image analysis unit 510 on the display screen of the display unit 402. The image recognition unit 512 recognizes, based on the image data input to the image analysis unit 510, what images are contained in that image data; specifically, it recognizes, for example, at which position inside the vehicle a passenger is seated. The image storage unit 513 stores the image data input to the image analysis unit 510, as well as reproduction information of the image data reproduced by the information reproducing apparatus. This reproduction information is stored in association with the identified passenger.
[0087] When the image data input to the image analysis unit 510 contains an image of a passenger, the passenger recognition unit 514 reads out the passenger images stored in advance in the passenger DB 515 and identifies and recognizes the passenger represented by the image. The identification and recognition processing is specifically performed by, for example, face authentication based on facial image information of the passenger. Since face authentication is a known technique, its description is omitted here. The passenger DB 515 stores image data containing images of the vehicle's passengers, as well as personal identification data such as the age and gender of these passengers.
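The patent treats face authentication itself as a known technique, so the following is only an illustrative sketch of matching a captured face against the passenger DB 515 by nearest-embedding comparison; the face embedding passed in is assumed to come from an external face-recognition component, and all names and the threshold are hypothetical.

```python
# Minimal sketch of passenger identification against the passenger DB 515.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def identify_passenger(face_embedding, passenger_db, threshold=0.8):
    """Return the best-matching passenger record (with age, gender, etc.)
    or None when no stored face is similar enough."""
    best_id, best_score = None, threshold
    for passenger_id, record in passenger_db.items():
        score = cosine_similarity(face_embedding, record["face_embedding"])
        if score > best_score:
            best_id, best_score = passenger_id, score
    return passenger_db[best_id] if best_id else None
```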
[0088] Meanwhile, in FIG. 6, the audio processing unit 414 includes an audio reproduction processing unit 610, an audio database (hereinafter, "audio DB") 611, a history database (hereinafter, "history DB") 612, a sound field control unit 613, and a parameter storage unit 614. The audio reproduction processing unit 610 selects and reproduces audio data input to the audio processing unit 414 and audio data stored in the audio DB 611. The audio reproduction processing unit 610 also selects and reproduces the audio data associated with the passenger identified by the image processing unit 412 (see FIG. 5; the same applies hereinafter). The audio data associated with a passenger is, for example, audio data associated by a rhythm, tempo, or the like considered optimal for the age of the passenger.
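The age-based association is described only in general terms, so the following is a hedged sketch of how the audio reproduction processing unit 610 might pick tracks from the audio DB 611; the concrete age brackets and tempo ranges are assumptions chosen only to make the example concrete.

```python
# Illustrative only: the age-to-tempo mapping below is an assumption; the patent
# merely says tracks are associated with a rhythm or tempo suited to the
# passenger's age.
def select_tracks_for_passenger(passenger, audio_db):
    """Pick tracks from the audio DB 611 whose tempo falls in a range
    chosen from the identified passenger's age."""
    age = passenger["age"]
    if age < 13:
        lo, hi = 90, 130      # assumed: lively tempo for children
    elif age < 60:
        lo, hi = 80, 140
    else:
        lo, hi = 60, 100      # assumed: calmer tempo for older passengers
    return [track for track in audio_db if lo <= track["bpm"] <= hi]
```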
[0089] The audio DB 611 stores the audio data selected and reproduced by the audio processing unit 414. The audio data stored in the audio DB 611 may be audio data input to the audio processing unit 414 from the recording medium 404 (see FIG. 4; the same applies hereinafter) or from the communication unit 407 (see FIG. 4; the same applies hereinafter), or audio data provided in the information reproducing apparatus in advance. The history DB 612 stores information on the playback history and track selection history of music when the audio data selected and reproduced by the audio processing unit 414 is music data. For example, when the information reproducing apparatus is mounted on a vehicle, the history DB 612 stores information on the playback history and track selection history of music played during a drive.
[0090] Based on the audio data output from the audio reproduction processing unit 610 and the information on the seating state output from the navigation control unit 400, the sound field control unit 613 reads sound field parameters for sound field control from the parameter storage unit 614 and, based on the read sound field parameters, controls the sound field formed by the sound output from the speakers 411. The sound field is controlled by, for example, sound image localization.
[0091] The parameter storage unit 614 stores the sound field parameters for sound field control used by the sound field control unit 613. For sound image localization, the stored sound field parameters are determined in advance so that an optimal sound field can be formed in accordance with information such as the seating position, sitting height, and ear position of the passenger. Sound image localization by the sound field control unit 613 is described below. FIG. 7 is an explanatory diagram showing an example of the interior of a vehicle equipped with the information reproducing apparatus according to the embodiment of the present invention.
[0092] In FIG. 7, a plurality of speakers 411 (see FIG. 4; the same applies hereinafter) of the information reproducing apparatus, SP1 to SP4, are installed inside the vehicle: SP1 in front of the passenger seat 712, SP2 in front of the driver seat 711, SP3 behind the rear left seat 713, and SP4 behind the rear right seat 714.
[0093] When a passenger is seated only in the driver seat 711, the sound field control unit 613 (see FIG. 6; the same applies hereinafter) sets, based on the sound field parameters, the audio output delay values of SP1 to SP4 to, for example, SP1 = +0.001 s, SP2 = -0.001 s, SP3 = +0.003 s, and SP4 = +0.002 s, and performs sound image localization so that an optimal sound field is provided for the passenger in the driver seat 711.
[0094] When passengers are seated only in the driver seat 711 and the passenger seat 712, the sound field control unit 613 sets, based on the sound field parameters, the audio output delay values of SP1 to SP4 to, for example, SP1 = 0 s, SP2 = 0 s, SP3 = +0.002 s, and SP4 = +0.002 s, and performs sound image localization so that an optimal sound field is provided for the passengers in the driver seat 711 and the passenger seat 712. For other seating patterns, the sound field control unit 613 likewise reads the corresponding sound field parameters from the parameter storage unit 614 (see FIG. 6; the same applies hereinafter) and performs sound image localization as appropriate.
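A minimal sketch of this delay selection follows, reusing the example delay values from paragraphs [0093] and [0094]; the dictionary layout, seat labels, and zero-delay fallback are illustrative assumptions rather than part of the disclosed embodiment.

```python
# Delay values (seconds) are the examples from paragraphs [0093]-[0094];
# the table structure and lookup function are assumptions.
DELAY_PARAMS = {
    ("driver",): {"SP1": +0.001, "SP2": -0.001, "SP3": +0.003, "SP4": +0.002},
    ("driver", "front_passenger"): {"SP1": 0.0, "SP2": 0.0, "SP3": +0.002, "SP4": +0.002},
}

def sound_field_delays(occupied_seats):
    """Return per-speaker output delays for the current seating pattern,
    falling back to no delay when the pattern is not stored."""
    key = tuple(sorted(occupied_seats))
    return DELAY_PARAMS.get(key, {"SP1": 0.0, "SP2": 0.0, "SP3": 0.0, "SP4": 0.0})

print(sound_field_delays(["driver"]))
# {'SP1': 0.001, 'SP2': -0.001, 'SP3': 0.003, 'SP4': 0.002}
```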
[0095] (Information Reproduction Processing Procedure of the Information Reproducing Apparatus)
Next, the information reproduction processing procedure of the information reproducing apparatus according to the embodiment of the present invention is described. FIG. 8 is a flowchart showing an example of the information reproduction processing procedure of the information reproducing apparatus according to the embodiment. Here, the case where the information reproduction processing is performed based on the seating position of the passenger is described. In FIG. 8, first, an image of the vehicle interior is captured by the imaging unit 415 (see FIG. 4; the same applies hereinafter) provided inside the vehicle (step S801).
[0096] The image data of the captured image is input to the image processing unit 412 (see FIG. 5; the same applies hereinafter) via the image input/output I/F 413 (see FIG. 4; the same applies hereinafter), and the image analysis unit 510 (see FIG. 5; the same applies hereinafter) or the like determines whether a passenger is seated on a seat (step S802). If it is determined that no passenger is seated (step S802: No), the procedure returns to step S801 to capture another image. If it is determined that a passenger is seated (step S802: Yes), the navigation control unit 400 (see FIG. 4; the same applies hereinafter) acquires seating position information from the image processing unit 412 (step S803).
[0097] Based on the seating position information acquired in this way, the navigation control unit 400 determines whether the seating position of the passenger has changed (step S804). If it determines that the seating position has changed (step S804: Yes), the procedure returns to step S803 and the navigation control unit 400 acquires the seating position information from the image processing unit 412 again. If it determines that the seating position has not changed (step S804: No), the navigation control unit 400 identifies the passenger inside the vehicle based on the information on the passenger identified by the image processing unit 412 (step S805).
[0098] Next, the navigation control unit 400 causes the audio processing unit 414 (see FIG. 6; the same applies hereinafter) to select audio data based on the reproduction information associated with the identified passenger, and the sound field control unit 613 (see FIG. 6; the same applies hereinafter) reads sound field parameters from the parameter storage unit 614 (see FIG. 6; the same applies hereinafter) so as to generate a sound field optimal for the seating position of the passenger (step S806).
[0099] The audio processing unit 414 then reproduces sound based on the selected audio data (step S807), and the sound field control unit 613 localizes the sound image of the sound field formed by that sound, based on the read sound field parameters (step S808). This completes the series of information reproduction processes in this flowchart. The image processing unit 412 may also reproduce image data as appropriate, together with the reproduction of the audio data by the audio processing unit 414, based on the reproduction information associated with the identified passenger. Furthermore, based on the seating position of the passenger, the image display on the display screen of the display unit 402 (see FIG. 4; the same applies hereinafter) may be controlled when reproducing the image data so that the image can be viewed optimally.
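The flow of FIG. 8 (steps S801 to S808) can be summarized in the sketch below, with the device interfaces passed in as callables; all helper names are placeholders standing in for the imaging unit 415, image processing unit 412, navigation control unit 400, and audio processing unit 414, not functions defined in the patent.

```python
# Sketch of the flowchart in FIG. 8; the seven callables are assumed interfaces.
def information_reproduction_cycle(capture_image, get_seating, identify,
                                   select_audio, load_params, play, localize):
    # S801/S802: keep capturing interior images until a passenger is seated
    while True:
        image = capture_image()
        position = get_seating(image)
        if position:
            break
    # S803/S804: re-acquire the seating position until it stops changing
    while True:
        image = capture_image()
        new_position = get_seating(image)
        if new_position == position:
            break
        position = new_position
    passenger = identify(image)        # S805: identify the seated passenger
    tracks = select_audio(passenger)   # S806: audio data tied to the passenger
    params = load_params(position)     # S806: sound field parameters for the seats
    play(tracks)                       # S807: reproduce the selected audio data
    localize(params)                   # S808: localize the sound image
```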
[0100] As described above, the information reproducing apparatus according to this embodiment acquires the seating position information of the passengers based on image data obtained by capturing the vehicle interior, and can control the sound field to suit the seating positions by performing sound image localization. The sound field inside the vehicle can therefore be automatically optimized for the passengers.
[0101] As described above, the information reproducing apparatus, information reproducing method, information reproducing program, and computer-readable recording medium according to the present invention automatically control the sound field as appropriate based on the seating state of the passengers, producing the effect that the sound field inside the vehicle can be optimized for the passengers.
[0102] The information reproduction method described in this embodiment can be realized by executing a prepared program on a computer such as a personal computer or a workstation. The program is recorded on a computer-readable recording medium such as a hard disk, flexible disk, CD-ROM, MO, or DVD, and is executed by being read from the recording medium by the computer. The program may also be a transmission medium that can be distributed via a network such as the Internet.

Claims

[1] An information reproducing apparatus comprising:
acquisition means for detecting a seating state of a passenger in a vehicle and acquiring information on the seating state;
identification means for identifying the passenger based on the information on the seating state acquired by the acquisition means;
reproduction means for reproducing audio data associated with the passenger identified by the identification means; and
control means for controlling, based on an identification result obtained by the identification means, a sound field formed by the sound of the audio data reproduced by the reproduction means.
[2] The information reproducing apparatus according to claim 1, wherein the acquisition means acquires, as the information on the seating state, at least one of seating position information, sitting height information, facial image information, and ear position information of the passenger.
[3] The information reproducing apparatus according to claim 1 or 2, wherein the identification means identifies the age and gender of the passenger.
[4] The information reproducing apparatus according to claim 1, wherein the reproduction means reproduces image data associated with the passenger.
[5] The information reproducing apparatus according to claim 4, wherein the reproduction means selects, based on the information on the seating state, at least one of the image data and the audio data to be reproduced.
[6] The information reproducing apparatus according to claim 1, wherein the control means localizes a sound image of the sound field formed by the sound of the audio data reproduced by the reproduction means.
[7] An information reproducing method comprising:
an acquisition step of detecting a seating state of a passenger in a vehicle and acquiring information on the seating state;
an identification step of identifying the passenger based on the information on the seating state acquired in the acquisition step;
a reproduction step of reproducing audio data associated with the passenger identified in the identification step; and
a control step of controlling, based on an identification result obtained in the identification step, a sound field formed by the sound of the audio data reproduced in the reproduction step.
[8] An information reproduction program that causes a computer to execute the information reproduction method according to claim 7.
[9] A computer-readable recording medium on which the information reproduction program according to claim 8 is recorded.
PCT/JP2006/304281 2005-03-11 2006-03-06 Information reproduction device, information reproduction method, information reproduction program, and computer-readable recording medium WO2006095688A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2005069778 2005-03-11
JP2005-069778 2005-03-11

Publications (1)

Publication Number Publication Date
WO2006095688A1 true WO2006095688A1 (en) 2006-09-14

Family

ID=36953282

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2006/304281 WO2006095688A1 (en) 2005-03-11 2006-03-06 Information reproduction device, information reproduction method, information reproduction program, and computer-readable recording medium

Country Status (1)

Country Link
WO (1) WO2006095688A1 (en)



Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH099398A (en) * 1995-06-20 1997-01-10 Matsushita Electric Ind Co Ltd Sound image localization device
JPH1146394A (en) * 1997-07-25 1999-02-16 Sony Corp Information-processing device and method, recording medium and transmission medium there
JP2002140603A (en) * 2000-10-31 2002-05-17 Omron Corp Image forming device and image providing device
JP2003111200A (en) * 2001-09-28 2003-04-11 Sony Corp Sound processor
JP2003130649A (en) * 2001-10-22 2003-05-08 Clarion Co Ltd Device for changing onboard equipment setting
JP2004102714A (en) * 2002-09-10 2004-04-02 Kyodo Printing Co Ltd Advertisement system and advertisement method

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008213634A (en) * 2007-03-02 2008-09-18 Denso Corp Operation environment setting system, vehicle-mounted device, portable device, management device, program for vehicle-mounted device, program for portable device, and program for management device
US9174552B2 (en) 2007-03-02 2015-11-03 Denso Corporation Driving-environment setup system, in-vehicle device and program thereof, portable device and program thereof, management device and program thereof
WO2012141057A1 (en) * 2011-04-14 2012-10-18 株式会社Jvcケンウッド Sound field generating device, sound field generating system and method of generating sound field
JP2012231448A (en) * 2011-04-14 2012-11-22 Jvc Kenwood Corp Sound field generation device, sound field generation system, and sound field generation method
JP2016082443A (en) * 2014-10-17 2016-05-16 学校法人 中央大学 Speaker arrangement selection unit, speaker arrangement selection method and sound field control system
JP2019193108A (en) * 2018-04-25 2019-10-31 パイオニア株式会社 Sound apparatus
US20220174446A1 (en) * 2019-03-22 2022-06-02 Sony Group Corporation Acoustic signal processing device, acoustic signal processing system, acoustic signal processing method, and program


Legal Events

Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase (Ref country code: DE)
NENP Non-entry into the national phase (Ref country code: RU)
122 Ep: pct application non-entry in european phase (Ref document number: 06715301; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)