WO2007023900A1 - Content providing device, content providing method, content providing program, and computer-readable recording medium - Google Patents

Content providing device, content providing method, content providing program, and computer-readable recording medium

Info

Publication number
WO2007023900A1
WO2007023900A1 (PCT/JP2006/316614)
Authority
WO
WIPO (PCT)
Prior art keywords
passenger
content
information
content providing
output
Prior art date
Application number
PCT/JP2006/316614
Other languages
English (en)
Japanese (ja)
Inventor
Hiroaki Shibasaki
Original Assignee
Pioneer Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pioneer Corporation
Publication of WO2007023900A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00: Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02: electric constitutive elements
    • B60R16/023: for transmission of signals between vehicle parts or subsystems
    • B60R16/0231: Circuits relating to the driving or the functioning of the vehicle

Definitions

  • Content providing apparatus, content providing method, content providing program, and computer-readable recording medium
  • the present invention relates to a content providing apparatus that provides content in a mobile body, a content providing method, a content providing program, and a computer-readable recording medium.
  • the present invention is not limited to the above-described content providing apparatus, content providing method, content providing program, and computer-readable recording medium.
  • a passenger of the moving object can view various contents via a display device such as a display and an acoustic device such as a speaker mounted on the moving object.
  • Various types of content include, for example, radio and television broadcasts, and music and video recorded on recording media such as a CD (Compact Disk) or DVD (Digital Versatile Disk). The passenger adjusts the sound and the screen to view such content.
  • ID: IDentification
  • IC: Integrated Circuit
  • Patent Document 1 Japanese Patent Application Laid-Open No. 2002-104105
  • each passenger needs to create and possess an ID card storing profile information, and to take measures to prevent loss and damage of the ID card.
  • One example is the problem that, because the interior environment is adjusted using the ID card, the environment cannot be set according to the preferences of passengers who do not possess an ID card.
  • the content providing apparatus according to the invention is a content providing apparatus that provides content in a mobile object, and includes: identifying means for identifying a passenger of the mobile object; passenger information obtaining means for obtaining information relating to the passenger identified by the identifying means (hereinafter referred to as "passenger information"); output means for outputting the content; and control means for controlling the output means to output the content based on the passenger information acquired by the passenger information obtaining means.
  • the content providing apparatus according to the invention is also a content providing apparatus that provides content in a mobile body, and includes: behavior detecting means for detecting the behavior of a passenger; output means for outputting the content; and control means for controlling the output means to output the content based on a result detected by the behavior detecting means.
  • the content providing method according to the invention of claim 6 is a content providing method for providing content in a mobile object, and includes: a specifying step of specifying a passenger of the mobile object; a passenger information acquisition step of acquiring information related to the passenger specified in the specifying step (hereinafter referred to as "passenger information"); and a control step of controlling the output of the content based on the passenger information acquired in the passenger information acquisition step.
  • the content providing method according to the invention of claim 7 is a content providing method for providing content in a mobile object, and includes: a behavior detecting step of detecting the behavior of a passenger; and a control step of controlling the output of the content based on a result detected in the behavior detecting step.
  • a content providing program according to claim 8 causes a computer to execute the content providing method according to claim 6 or 7.
  • a computer-readable recording medium according to the invention of claim 9 records the content providing program according to claim 8.
  • FIG. 1 is a block diagram illustrating an example of a functional configuration of a content providing apparatus according to the present embodiment.
  • FIG. 2 is a flowchart showing the contents of processing of the content providing apparatus according to the present embodiment.
  • FIG. 3 is a block diagram showing an example of a hardware configuration of a navigation device according to the present embodiment.
  • FIG. 4 is an explanatory view showing an example of the inside of a vehicle equipped with a navigation device according to the present embodiment.
  • FIG. 5 is a flowchart showing the contents of processing using passenger information in the navigation device according to the present embodiment.
  • FIG. 6 is a flowchart showing the contents of processing using behavior information in the navigation apparatus according to the present embodiment.
  • FIG. 1 is a block diagram showing an example of a functional configuration of a content providing apparatus according to the present embodiment.
  • the content providing apparatus 100 includes a passenger identification unit 101, a passenger information acquisition unit 102, an output control unit 103, a content output unit 104, and a behavior detection unit 105.
  • the passenger identification unit 101 identifies a passenger.
  • the identification of the passenger is, for example, identification of each passenger's relationship to the owner of the moving body (owner, relative, friend, other person, etc.).
  • the passenger information acquisition unit 102 acquires the passenger information related to the passenger specified by the passenger specification unit 101.
  • Passenger information is, for example, information including characteristics of the passenger such as preferences, age, and gender, as well as the configuration of the passengers such as their seating arrangement and number.
  • the behavior detection unit 105 detects the behavior of the passenger.
  • the passenger's behavior is information including the passenger's physical state such as sleepiness, fatigue, and general physical condition, and may also include the arrangement and number of passengers exhibiting a predetermined behavior.
  • the output control unit 103 controls the output of the content based on at least one of the passenger information acquired by the passenger information acquisition unit 102 and the behavior of the passenger detected by the behavior detection unit 105.
  • For example, the output is controlled by controlling the sound and the display, such as the volume and sound quality of the sound, or the size of the subtitles and the brightness of the display. In addition, the audio and display output may be switched on and off.
  • the content output unit 104 outputs content according to control by the output control unit 103.
  • the content is output by, for example, a display device such as a display mounted on a moving body or an acoustic device such as a speaker.
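As a rough illustration of how the functional units of FIG. 1 fit together, the following Python sketch models the passenger identification unit 101, passenger information acquisition unit 102, behavior detection unit 105, output control unit 103, and content output unit 104 as simple classes. The class names, fields, and rules are illustrative assumptions, not the patent's concrete implementation.

```python
from dataclasses import dataclass, field

@dataclass
class PassengerInfo:                 # result of units 101/102
    relationship: str = "other"      # owner / relative / friend / other
    age: int | None = None
    gender: str | None = None
    preferences: list[str] = field(default_factory=list)

@dataclass
class OutputForm:                    # decided by unit 103, applied by unit 104
    volume: int = 5                  # 0..10
    brightness: int = 5              # 0..10
    subtitle_size: str = "medium"
    enabled: bool = True

class OutputControlUnit:
    """Sketch of output control unit 103: derives an output form from
    passenger information and/or detected behavior."""
    def decide(self, info: PassengerInfo | None, behavior: str | None) -> OutputForm:
        form = OutputForm()
        if info and info.age is not None and info.age >= 65:
            form.volume = 3          # e.g. softer volume for an elderly passenger
        if behavior == "sleepy":
            form.brightness = 2      # dim the display for a sleepy passenger
        return form

class ContentOutputUnit:
    """Sketch of content output unit 104: a display/speaker applying the form."""
    def play(self, title: str, form: OutputForm) -> None:
        if form.enabled:
            print(f"playing '{title}' at volume {form.volume}, "
                  f"brightness {form.brightness}, subtitles {form.subtitle_size}")

# Minimal wiring of the units (identification/behavior values are hard-coded here).
control, output = OutputControlUnit(), ContentOutputUnit()
output.play("news program", control.decide(PassengerInfo(age=70), behavior="sleepy"))
```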
  • FIG. 2 is a flowchart showing the contents of the processing of the content providing apparatus according to the present embodiment.
  • the content providing apparatus 100 determines whether or not there is an instruction to provide the content (step S201).
  • the instruction to provide content may be given, for example, by the passenger operating an operation unit (not shown) and specifying the type of content.
  • in step S201, after waiting for a content provision instruction (step S201: Yes), the passenger identification unit 101 identifies the passenger (step S202).
  • the identification of the passenger is, for example, the identification of the relationship between the owner of the mobile object and the passenger.
  • the occupant information acquisition unit 102 acquires occupant information regarding the occupant identified in step S202 (step S203).
  • Passenger information is, for example, information including characteristics such as preferences, age, and sex regarding the passenger, and configuration such as the arrangement and number of passengers.
  • the behavior detecting unit 105 detects the behavior of the passenger (step S204).
  • the passenger's behavior is information including the physical condition of the passenger such as sleepiness, fatigue, and physical condition.
  • Note that the acquisition of the passenger information (step S203) and the detection of the passenger's behavior (step S204) may be performed in either order, and only one of these steps may be performed.
  • the content output unit 104 outputs content according to the control of the output control unit 103 (step S205).
  • the output control is performed, for example, by the output control unit 103 based on at least one of the passenger information acquired in step S203 and the behavior of the passenger detected in step S204. Then, a series of processing ends.
  • in the above description, after waiting for a content provision instruction (step S201: Yes), the passenger is identified (step S202), the passenger information is acquired (step S203), and the behavior is detected (step S204). However, steps S202 to S204 may be performed before the content provision instruction is given. For example, when the passenger boards the vehicle, the passenger may be identified (step S202), the passenger information acquired (step S203), and the behavior detected (step S204); then, after waiting for a content provision instruction (step S201: Yes), the content may be controlled and output (step S205).
  • As described above, according to the content providing apparatus, content providing method, content providing program, and computer-readable recording medium of the present embodiment, the output can be controlled based on the passenger information and the behavior of the passenger without the user having to control the content output through input operations. Therefore, comfortable content can be provided to passengers efficiently.
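To make the FIG. 2 flow concrete, here is a minimal sketch of steps S201 to S205 as a single provision cycle. The step functions are hypothetical stand-ins for the units described above, with hard-coded return values.

```python
# Hypothetical sketch of the FIG. 2 flow (steps S201-S205).

def wait_for_instruction() -> bool:             # step S201
    return True                                 # stand-in for an operation-unit event

def identify_passenger() -> str:                # step S202
    return "owner"                              # e.g. relationship to the vehicle owner

def acquire_passenger_info(passenger_id: str) -> dict:   # step S203
    return {"id": passenger_id, "age": 40, "prefers": "jazz"}

def detect_behavior() -> str:                   # step S204
    return "awake"                              # could also be "sleepy", "tired", ...

def output_content(info: dict, behavior: str) -> None:    # step S205
    volume = 3 if behavior == "sleepy" else 6   # illustrative control rule
    print(f"playing {info['prefers']} for {info['id']} at volume {volume}")

def content_provision_cycle() -> None:
    if wait_for_instruction():                  # S201: Yes
        passenger_id = identify_passenger()     # S202
        info = acquire_passenger_info(passenger_id)   # S203
        behavior = detect_behavior()            # S204
        output_content(info, behavior)          # S205

content_provision_cycle()
```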
  • FIG. 3 is a block diagram showing an example of a hardware configuration of a navigation apparatus according to the present embodiment.
  • a navigation device 300 is mounted on a moving body such as a vehicle, and includes a navigation control unit 301, a user operation unit 302, a display unit 303, a position acquisition unit 304, a recording medium 305, a recording medium decoding unit 306, an audio output unit 307, a communication unit 308, a route search unit 309, a route guidance unit 310, an audio generation unit 311, a speaker 312, a passenger photographing unit 313, and an audio processing unit 314.
  • the navigation control unit 301 controls the entire navigation device 300.
  • the navigation control unit 301 can be realized, for example, by a microcomputer including a CPU (Central Processing Unit) that executes predetermined arithmetic processing, a ROM (Read Only Memory) that stores various control programs, and a RAM (Random Access Memory) that functions as a work area for the CPU.
  • during route guidance, based on the information on the current position acquired by the position acquisition unit 304 and the map information obtained from the recording medium 305 via the recording medium decoding unit 306, the navigation control unit 301 calculates which position on the map the vehicle is traveling at, and outputs the calculation result to the display unit 303.
  • during route guidance, the navigation control unit 301 also exchanges information related to route guidance with the route search unit 309, the route guidance unit 310, and the voice generation unit 311, and outputs the resulting information to the display unit 303 and the audio output unit 307.
  • the navigation control unit 301 generates passenger identification information and behavior information, which will be described later, based on the passenger image or behavior of the passenger photographed by the passenger photographing unit 313. Then, content output such as audio and video is controlled based on the identification information and behavior information of the occupant according to the content reproduction instruction input by the user operating the user operation unit 302.
  • the content output control is, for example, control of the volume, the sound quality (the balance between low and high frequencies), the size of subtitles, the brightness of the display unit 303, or output on/off when playback of music or video recorded on the recording medium 305, or of a radio or television broadcast received by the communication unit 308, is instructed. Note that if there are a plurality of display units 303 and speakers 312, each of them may be controlled individually.
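Since volume, sound quality, subtitle size, brightness, and on/off may each be adjusted, and each display unit 303 and speaker 312 may be controlled individually when there are several, one way to organize such control is to keep one settings object per physical output. The following Python sketch does that; the device names ("front", "rear"), the fields, and the example rule are assumptions.

```python
from dataclasses import dataclass

@dataclass
class AudioSettings:
    volume: int = 5        # 0..10
    bass: int = 0          # low-frequency emphasis, -5..+5
    treble: int = 0        # high-frequency emphasis, -5..+5
    on: bool = True

@dataclass
class VideoSettings:
    brightness: int = 5    # 0..10
    subtitle_size: str = "medium"
    on: bool = True

class PerOutputController:
    """Keeps independent settings for each display 303 and speaker 312."""
    def __init__(self, displays: list[str], speakers: list[str]):
        self.video = {name: VideoSettings() for name in displays}
        self.audio = {name: AudioSettings() for name in speakers}

    def quiet_rear_for_child(self) -> None:
        # Illustrative rule: lower volume and enlarge subtitles at the rear seat.
        self.audio["rear"].volume = 3
        self.video["rear"].subtitle_size = "large"

controller = PerOutputController(displays=["front", "rear"], speakers=["front", "rear"])
controller.quiet_rear_for_child()
print(controller.audio["rear"], controller.video["rear"])
```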
  • the user operation unit 302 acquires information input by the user by operating operation means such as a remote controller, a switch, and a touch panel, and outputs the acquired information to the navigation control unit 301.
  • Display unit 303 includes, for example, a CRT (Cathode Ray Tube), a TFT liquid crystal display, an organic EL display, a plasma display, and the like.
  • the display unit 303 can be configured by, for example, a video I/F (interface) and a video display device connected to the video I/F.
  • the video I/F is composed of, for example, a graphics controller that controls the entire display device, a buffer memory such as a VRAM (Video RAM) that temporarily stores image information that can be displayed immediately, and a control IC that controls the display of the display device based on the image information output from the graphics controller.
  • the display unit 303 displays traffic information, map information, information on route guidance, content related to video output from the navigation control unit 301, and other various information according to the output control of the navigation control unit 301.
  • the position acquisition unit 304 includes a GPS receiver and various sensors such as a vehicle speed sensor, an angular velocity sensor, and an acceleration sensor, and acquires information on the current position of the moving object (the current position of the navigation device 300).
  • the GPS receiver receives radio waves from GPS satellites and determines a geometric position relative to the GPS satellites.
  • GPS is an abbreviation for Global Positioning System, a system that accurately determines a position on the ground by receiving radio waves from four or more satellites.
  • the GPS receiver is composed of an antenna for receiving radio waves from GPS satellites, a tuner for demodulating the received radio waves, and an arithmetic circuit for calculating the current position based on the demodulated information.
  • in the recording medium 305, various control programs and various information are recorded in a computer-readable state.
  • the recording medium 305 can be realized by, for example, an HD (Hard Disk), a DVD (Digital Versatile Disk), a CD (Compact Disk), or a memory card.
  • the recording medium 305 may accept writing of information by the recording medium decoding unit 306 and record the written information in a nonvolatile manner.
  • map information used for route search and route guidance is recorded in the recording medium 305.
  • the map information recorded in the recording medium 305 includes background data representing features such as buildings, rivers, and the ground surface, and road shape data representing the shape of roads, and is drawn in two or three dimensions on the display screen.
  • while the navigation device 300 is guiding a route, the map information read from the recording medium 305 by the recording medium decoding unit 306 and a mark indicating the position of the moving body acquired by the position acquisition unit 304 are displayed on the display unit 303.
  • the recording medium 305 also records registered identification information for identifying a passenger, specific behavior information for determining the behavior of the passenger, output form information for determining the content output, and content such as video and music.
  • the registered identification information includes, for example, information obtained by extracting feature points of a passenger image taken with a camera or the like, such as a face pattern, eye iris, fingerprint data, or voice data.
  • the specific behavior information includes, for example, information obtained by extracting features related to the specific behavior state such as drowsiness and fatigue, such as eyelid movement, volume level, and heart rate.
  • the output form information recorded in the recording medium 305 is information related to content output.
  • the output form information is associated with the passenger information of the passenger identified based on the registered identification information, or with the specific behavior state of the passenger determined based on the specific behavior information. The output form information may be read by the navigation control unit 301 when the navigation control unit 301 performs control related to the output according to the result of identifying the passenger or determining the passenger's behavior.
  • Passenger information is information including features such as preferences, age, sex, etc., and the arrangement of passengers and number of passengers.
  • the relationship between the passenger information, the specific behavior state, and the output form information may be, for example, a setting that lowers the volume and softens the sound quality for elderly or female passengers.
  • the setting may be such that the caption displayed on the display unit 303 is enlarged.
  • As a general setting, a low volume and a mild sound quality may be used.
  • Alternatively, the volume may be decreased and the luminance of the display unit 303 lowered.
  • a preset output format of each passenger may be used.
  • the passenger's viewing history may be recorded to match the previous viewing setting.
  • the output form information may have a structure in which the relationship between the output form and the age and sex is recorded in advance, or a structure that can be registered by the user.
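The output form information described in the preceding paragraphs amounts to a lookup from passenger attributes or a specific behavior state to presentation settings, with a pre-registered relationship or a stored viewing history taking priority. One possible representation is sketched below; the rule set, thresholds, and default values are assumptions, not values taken from the patent.

```python
# Hypothetical output-form table: the viewing history wins if present,
# otherwise rules are checked in order and the first match applies.
RULES = [
    (lambda p: p.get("age", 0) >= 65,         {"volume": 3, "sound": "mild"}),
    (lambda p: p.get("gender") == "female",   {"volume": 3, "sound": "mild"}),
    (lambda p: p.get("behavior") == "sleepy", {"volume": 2, "brightness": 2}),
]
DEFAULT_FORM = {"volume": 5, "sound": "normal", "brightness": 5}

def output_form_for(passenger: dict, history: dict | None = None) -> dict:
    if history and passenger.get("id") in history:
        return history[passenger["id"]]        # reuse the previous viewing settings
    for predicate, form in RULES:
        if predicate(passenger):
            return {**DEFAULT_FORM, **form}
    return DEFAULT_FORM

print(output_form_for({"id": "p1", "age": 72}))                      # rule match
print(output_form_for({"id": "p2"}, history={"p2": {"volume": 8}}))  # history match
```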
  • although the map information, content, and the like are recorded on the recording medium 305 in the above description, the configuration is not limited to this.
  • Map information, content, and the like may be recorded on a server outside the navigation apparatus 300.
  • in that case, the navigation device 300 acquires the map information and the content from the server via a network, for example through the communication unit 308.
  • the acquired information is stored in RAM or the like.
  • the recording medium decoding unit 306 controls reading/writing of information on the recording medium 305.
  • the recording medium decoding unit 306 is, for example, an HDD (Hard Disk Drive).
  • the audio output unit 307 reproduces sound such as a guidance sound by controlling the output to the connected speaker 312. There may be one speaker 312 or a plurality of speakers.
  • the audio output unit 307 can include, for example, a D/A converter that performs D/A conversion of digital audio information, an amplifier that amplifies the analog audio signal output from the D/A converter, and an A/D converter that performs A/D conversion of analog audio information.
  • the communication unit 308 obtains various types of information from the outside.
  • the communication unit 308 is, for example, an FM multiplex tuner, a VICS (registered trademark)/beacon receiver, a wireless communication device, or another communication device, and communicates with other communication devices via communication media such as a mobile phone, PHS, communication card, or wireless LAN. Alternatively, it may be a device that can receive radio broadcast waves, television broadcast waves, or satellite broadcasts.
  • the information acquired by the communication unit 308 includes traffic information such as congestion and traffic regulations distributed by a road traffic information communication system center, traffic information collected by operators in their own ways, and other public data and content on the Internet.
  • the communication unit 308 may request traffic information or content from a server storing traffic information and content nationwide via the network and obtain the requested information.
  • it may be configured to receive video signals or audio signals from radio broadcasts, television broadcasts, or satellite broadcasts.
  • the route search unit 309 searches for the optimal route using the map information acquired from the recording medium 305 via the recording medium decoding unit 306, the traffic information acquired via the communication unit 308, and the like.
  • the optimal route is the route that best matches the user's request.
  • the route guidance unit 310 is obtained from the optimum route information searched by the route search unit 309, the position information of the moving body acquired by the position acquisition unit 304, and the recording medium 305 via the recording medium decoding unit 306. Based on the obtained map information, route guidance information for guiding the user to the destination point is generated.
  • the route guidance information generated at this time may be information that considers the traffic jam information received by the communication unit 308.
  • the route guidance information generated by the route guidance unit 310 is output to the display unit 303 via the navigation control unit 301.
  • the sound generation unit 311 generates information on various sounds such as guidance sounds. That is, based on the route guidance information generated by the route guidance unit 310, it sets a virtual sound source corresponding to a guidance point, generates voice guidance information, and outputs it to the audio output unit 307 via the navigation control unit 301.
  • the passenger photographing unit 313 photographs the passenger. It can capture video or still images; for example, it photographs an image of the passenger and the behavior of the passenger and outputs them to the navigation control unit 301.
  • the sound processing unit 314 reproduces sound such as music by controlling the output to the connected speaker 312 according to the control related to the output of the navigation control unit 301.
  • the sound processing unit 314 may have substantially the same configuration as the sound output unit 307.
  • the passenger identification unit 101, the passenger information acquisition unit 102, and the behavior detection unit 105, which are functional components of the content providing apparatus 100 according to the embodiment, are realized by the navigation control unit 301 and the passenger photographing unit 313; the output control unit 103 is realized by the navigation control unit 301; and the content output unit 104 is realized by the display unit 303 and the speaker 312.
  • FIG. 4 is an explanatory diagram showing an example of the inside of a vehicle in which a navigation device according to this embodiment is mounted.
  • the interior of the vehicle has a driver seat 411, a passenger seat 412, and a rear seat 413. A display device (display unit 303), an acoustic device (speaker 312), and an information reproducing device 426a are provided around the driver seat 411 and the passenger seat 412. The passenger seat 412 is provided with a display device 421b and an information reproducing device 426b for the passenger of the rear seat 413, and an acoustic device (not shown) is provided behind the rear seat 413.
  • each information reproducing device 426 (426a, 426b) is provided with a photographing device (passenger photographing unit 313) 423 that can photograph a passenger.
  • Each information reproducing device 426 (426a, 426b) may have a structure that can be attached to and detached from the vehicle.
  • FIG. 5 is a flowchart showing the contents of the processing using the passenger information in the navigation device according to the present embodiment.
  • a case where preference information which is a passenger's preference is used as the passenger information will be described.
  • in the flowchart of FIG. 5, first, the navigation apparatus 300 determines whether or not a content reproduction instruction has been given (step S501).
  • the content reproduction instruction may be configured, for example, by a passenger operating the user operation unit 302.
  • after waiting for a content reproduction instruction (step S501: Yes), the passenger photographing unit 313 photographs a passenger image (step S502). The passenger image is captured, for example, by taking a still image of the passenger's face.
  • the navigation control unit 301 generates passenger identification information from the passenger image captured in step S502 (step S503).
  • the identification information includes, for example, information obtained by extracting the feature points of the passenger's face, and is collated with the registered identification information recorded on the recording medium 305.
  • the navigation control unit 301 collates the registered identification information pre-registered in the recording medium 305 with the identification information generated in step S503, and determines whether or not the identification information matches (step S504).
  • if the identification information matches (step S504: Yes), the navigation control unit 301 reads the output form information associated with the matching registered identification information from the recording medium 305 (step S505).
  • the output form information is, for example, information related to the output of content suitable for the passenger, and is information including volume and sound quality for audio or subtitles and brightness for video.
  • the navigation control unit 301 determines the output form of the content based on the output form information read in step S505 (step S506), and outputs it to the display unit 303 or the audio processing unit 314.
  • the output form is, for example, the volume, sound quality, subtitle size, display brightness, or the like.
  • the output form is determined based on, for example, the age, gender, number of passengers, personal preferences and viewing history of the passengers.
  • the audio processing unit 314 performs audio processing and outputs the content to the speaker 312.
  • the display unit 303 or the speaker 312 reproduces the content based on the output form determined in step S506 (step S507), and the series of processing ends.
  • if the identification information does not match in step S504 (step S504: No), it is determined whether or not to register the passenger (step S508).
  • for the passenger registration, a message asking whether to register the passenger may be displayed on the display unit 303 to prompt the passenger to decide whether or not to register.
  • if the passenger is not registered in step S508 (step S508: No), the navigation control unit 301 outputs an output form selection request to the display unit 303 (step S510) and accepts a selection of the output form from the passenger (step S511). The navigation control unit 301 then controls the output of the content to the display unit 303 or the audio processing unit 314 based on the selected output form. The audio processing unit 314 performs audio processing and outputs the content to the speaker 312. The display unit 303 or the speaker 312 reproduces the content (step S507), and the series of processing ends.
  • if the passenger is to be registered in step S508 (step S508: Yes), a message prompting registration is displayed on the display unit 303, and the registration identification information of the passenger is registered (step S509).
  • the registration identification information of the passenger may be registered, for example, by extracting feature points from the passenger image photographed by the passenger photographing unit 313, or by the user operating the user operation unit 302 to register the age and gender. Then, the process returns to step S504 and the same processing is repeated.
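The identification part of FIG. 5 reduces to comparing a feature vector extracted from the captured face image against registered identification information and, on a match, looking up the associated output form (steps S503 to S506), with registration or a manual selection as the fallback (steps S508 to S511). A simplified sketch follows; the nearest-neighbour comparison, the distance threshold, and the stored values are assumptions rather than the patent's concrete algorithm.

```python
import math

# Registered identification information: passenger id -> (face feature vector,
# associated output form information). All values are illustrative.
REGISTERED = {
    "owner": ([0.12, 0.80, 0.45], {"volume": 6, "brightness": 6}),
    "child": ([0.70, 0.20, 0.90], {"volume": 3, "subtitle_size": "large"}),
}
MATCH_THRESHOLD = 0.15   # assumed distance below which "identification matches"

def extract_features(image: list[float]) -> list[float]:
    """Stand-in for step S503: extract feature points from the face image.
    In this sketch the 'image' is already a feature vector."""
    return image

def identify(features: list[float]) -> str | None:
    """Step S504: collate with the registered identification information."""
    best_id, best_dist = None, float("inf")
    for pid, (reference, _form) in REGISTERED.items():
        dist = math.dist(features, reference)
        if dist < best_dist:
            best_id, best_dist = pid, dist
    return best_id if best_dist <= MATCH_THRESHOLD else None

def decide_output_form(image: list[float]) -> dict:
    pid = identify(extract_features(image))
    if pid is not None:                # S504: Yes -> read the output form (S505/S506)
        return REGISTERED[pid][1]
    # S504: No -> the patent offers registration (S508-S509) or a manual
    # output-form selection (S510-S511); this sketch simply falls back to a default.
    return {"volume": 5}

print(decide_output_form([0.13, 0.79, 0.44]))   # close to "owner" -> their form
print(decide_output_form([0.00, 0.00, 0.00]))   # unknown -> default form
```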
  • in the above description, the passenger image is photographed (step S502) after waiting for the content reproduction instruction (step S501: Yes). However, the passenger image may be taken before the content reproduction instruction is given (step S502). For example, the passenger may be photographed when boarding, when the vehicle engine is started, when the passenger performs an operation, or at predetermined intervals during travel (step S502), after which the apparatus waits for a content reproduction instruction.
  • in the above description, identification information is generated by photographing a passenger image and extracting feature points of the passenger's face. However, instead of identification information for each individual passenger, the number and composition of the passengers may be generated as identification information. Specifically, for example, by identifying the number and composition of passengers, content such as lively music may be played when many people are riding together, or content such as children's programs may be played when a family with children is riding. Next, a case where the passenger's behavior is photographed by the passenger photographing unit 313 will be described.
  • FIG. 6 is a flowchart showing the contents of the processing using the behavior information in the navigation device according to the present embodiment.
  • the navigation apparatus 300 determines whether or not there is an instruction for content reproduction (step S601).
  • the content reproduction instruction may be, for example, a configuration in which the passenger operates the user operation unit 302.
  • after waiting for a content reproduction instruction (step S601: Yes), the passenger photographing unit 313 photographs the behavior of the passenger (step S602). For example, the movement of the passenger's eyeballs may be photographed.
  • the navigation control unit 301 generates behavior information of the passenger from the eyeball movement photographed in step S602 (step S603).
  • the behavior information includes, for example, information obtained by extracting feature points of the passenger's eyeball movement, and is collated with the specific behavior information recorded in advance in the recording medium 305, which includes characteristics of eyeball movement corresponding to states such as sleepiness and fatigue.
  • the navigation control unit 301 collates the specific behavior information pre-registered in the recording medium 305 with the behavior information generated in step S603, and determines whether or not the passenger is in a specific behavior state (step S604). If the passenger is in the specific behavior state (step S604: Yes), the navigation control unit 301 reads the output form information associated with that specific behavior state from the recording medium 305 (step S605).
  • the output form information is, for example, information related to the output of content suited to the passenger's behavior; if the passenger shows sleepy behavior, it includes lowering the volume and sound quality, reducing the brightness of the video, or switching the display unit 303 and the speaker 312 on or off.
  • the navigation control unit 301 determines the output form of the content based on the output form information read in step S605 (step S606). Specifically, for example, when the passenger shows sleepy behavior, the volume of the content to be played may be increased if the passenger is the driver, and decreased if the passenger is a child.
  • the navigation control unit 301 controls the output of content in the display unit 303 or the audio processing unit 314 based on the output mode determined in step S606. Then, the audio processing unit 314 performs audio processing and outputs the content to the speaker 312. The display unit 303 or the speaker 312 reproduces the content (step S607), and the series of processing ends.
  • if the passenger is not in the specific behavior state in step S604 (step S604: No), the process returns to step S602 and the same processing is repeated. Alternatively, a content output form for the case where the passenger is not in the specific behavior state may be set in advance and used for playback.
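The behavior branch of FIG. 6 can be pictured as classifying the photographed eye movement against pre-registered specific behavior patterns (steps S603 and S604) and then selecting the output form tied to that state, possibly differing by which passenger is concerned (steps S605 and S606). A sketch under those assumptions; the blink-rate heuristic and the numbers are illustrative, not the patent's detection method.

```python
def classify_behavior(blink_rate_hz: float, eyes_closed_ratio: float) -> str:
    """Toy stand-in for collating eyeball movement with the specific behavior
    information (steps S603-S604): slow blinking or long closures -> 'sleepy'."""
    if eyes_closed_ratio > 0.4 or blink_rate_hz < 0.1:
        return "sleepy"
    return "normal"

def output_form_for_behavior(state: str, role: str) -> dict:
    """Steps S605-S606: output form associated with the specific behavior state,
    differentiated here by the passenger's role (driver vs. child)."""
    if state == "sleepy":
        if role == "driver":
            return {"volume": 8}                  # raise the volume for a sleepy driver
        return {"volume": 2, "brightness": 2}     # quiet and dim for a sleepy child
    return {"volume": 5, "brightness": 5}

state = classify_behavior(blink_rate_hz=0.05, eyes_closed_ratio=0.5)
print(state, output_form_for_behavior(state, role="driver"))
print(state, output_form_for_behavior(state, role="child"))
```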
  • in FIG. 6, the apparatus waits for a content reproduction instruction and, when the instruction is given (step S601: Yes), photographs the behavior of the passenger (step S602). However, the behavior of the passenger may be photographed before the content reproduction instruction is given (step S602). For example, the passenger's behavior may be photographed when boarding, when the vehicle engine is started, when the passenger performs an operation, or at predetermined intervals during travel (step S602), after which the apparatus waits for a content reproduction instruction.
  • in step S603, behavior information is generated by photographing the movement of the passenger's eyeballs, but behavior information may also be generated from the opening and closing of a window, the movement of the passenger's entire body, the sound volume in the vehicle, and the like. For example, when a passenger opens a window, it may be assumed that the passenger feels hot or uncomfortable, and the output may be controlled accordingly.
  • the content output in the present embodiment is performed through one or more display units 303 and speakers 312, and each of them may be controlled individually.
  • a display unit 303 may be provided for each seat of the vehicle, and content suitable for each passenger of each seat may be reproduced.
  • the processing using the passenger information and the processing using the behavior information, described with reference to FIGS. 5 and 6, may also be combined.
  • in the present embodiment, the passenger is photographed by the passenger photographing unit 313 such as a camera, and the identification information or behavior information of the passenger is generated. Instead of photographing the passenger, however, the identification information for identifying the passenger or the passenger's behavior information may be generated from information acquired by other sensors.
  • the identification information or the behavior information may be generated by using, for example, a seating sensor that detects a load distribution and a total load on a seat on which a passenger is seated. Information on the number and physique of the passengers can be obtained by the seating sensor.
  • One or more fingerprint sensors may be provided at predetermined positions in the vehicle. The fingerprint sensor can acquire the passenger's fingerprint information and identify the passenger.
  • a voice sensor such as a microphone may be provided in the car. Voice information such as the volume, sound quality, and pitch of the passenger can be acquired by the voice sensor, so that the passenger can be identified, and the number, gender, sleepiness, etc. can be determined.
  • a human body sensor that measures a pulse or the like may also be used. For example, by using information such as the pulse, the physical condition of the passenger can be grasped, and the behavior of the passenger can be determined.
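The sensor alternatives just listed (seating sensors for the number and build of passengers, fingerprint sensors for identity, microphones for voice, and a pulse sensor for physical condition) can feed the same identification and behavior pipeline as the camera. Below is a schematic sketch of merging such readings into passenger and behavior information; the field names, thresholds, and the drowsiness heuristic are assumptions.

```python
from dataclasses import dataclass

@dataclass
class SensorReadings:
    seat_loads_kg: list[float]           # one value per seat, from seating sensors
    fingerprint_id: str | None = None    # matched id from a fingerprint sensor, if any
    voice_pitch_hz: float | None = None  # from an in-cabin microphone
    pulse_bpm: float | None = None       # from a human body (pulse) sensor

def to_passenger_and_behavior(r: SensorReadings) -> tuple[dict, str]:
    occupied = [w for w in r.seat_loads_kg if w > 10.0]   # assumed occupancy threshold
    passenger = {
        "count": len(occupied),
        "id": r.fingerprint_id,                           # None if unidentified
        "likely_child": any(w < 35.0 for w in occupied),  # crude build heuristic
    }
    # Very rough behavior estimate: a low pulse is treated here as a sign of drowsiness.
    behavior = "sleepy" if (r.pulse_bpm is not None and r.pulse_bpm < 55) else "normal"
    return passenger, behavior

readings = SensorReadings(seat_loads_kg=[72.0, 28.0, 0.0], pulse_bpm=52.0)
print(to_passenger_and_behavior(readings))
```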
  • As described above, the output can be controlled based on the passenger information and the behavior of the passenger even without the user controlling the content output through input operations. Therefore, comfortable content can be provided to passengers efficiently.
  • Further, the navigation apparatus 300 photographs the passenger and generates identification information and behavior information of the passenger. Since the identification information and behavior information can be compared with the registered identification information and the specific behavior information to determine the output form of the content, comfortable content can be provided without the passenger making the settings himself or herself.
  • Since the content can also be output based on the passenger's viewing history, it is possible, for example, to avoid repeating content that the passenger has recently viewed, or to resume content that the passenger has viewed only partway, so that complete content can be provided to passengers. Therefore, even if the passenger does not configure the content himself or herself, the content provision can be optimized.
  • the content providing method described in the present embodiment can be realized by executing a program prepared in advance on a computer such as a personal computer or a workstation.
  • This program is recorded on a computer-readable recording medium such as a hard disk, a flexible disk, a CD-ROM, an MO, or a DVD, and is executed by being read by the computer.
  • the program may be a transmission medium that can be distributed via a network such as the Internet.

Abstract

The present invention relates to a content providing device (100) which provides content in a mobile body. In the device, a passenger specification section (101) specifies a passenger of the mobile body, and a passenger information acquisition section (102) acquires information on the passenger (hereinafter referred to as "passenger information") specified by the passenger specification section (101). In addition, a content output section (104) outputs content, and an output control section (103) controls the content output section (104) to output the content based on the passenger information acquired by the passenger information acquisition section (102).
PCT/JP2006/316614 2005-08-24 2006-08-24 Content providing device, content providing method, content providing program, and computer-readable recording medium WO2007023900A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2005242638 2005-08-24
JP2005-242638 2005-08-24

Publications (1)

Publication Number Publication Date
WO2007023900A1 (fr) 2007-03-01

Family

ID=37771643

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2006/316614 WO2007023900A1 (fr) 2005-08-24 2006-08-24 Content providing device, content providing method, content providing program, and computer-readable recording medium

Country Status (1)

Country Link
WO (1) WO2007023900A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009208592A (ja) * 2008-03-04 2009-09-17 Pioneer Electronic Corp Mobile body-mounted device setting system and server device for a mobile body-mounted device setting system
JP2019119445A (ja) * 2018-01-04 2019-07-22 Harman International Industries, Incorporated Mood roof for an augmented media experience in a vehicle cabin
CN110287386A (zh) * 2018-03-19 2019-09-27 Honda Motor Co., Ltd. Information providing system, information providing method, and medium
US20210155250A1 (en) * 2019-11-22 2021-05-27 Mobile Drive Technology Co.,Ltd. Human-computer interaction method, vehicle-mounted device and readable storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003306091A (ja) * 2002-04-15 2003-10-28 Nissan Motor Co Ltd Driver determination device
JP2004037292A (ja) * 2002-07-04 2004-02-05 Sony Corp Navigation device, service device for a navigation device, and service providing method using a navigation device

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003306091A (ja) * 2002-04-15 2003-10-28 Nissan Motor Co Ltd Driver determination device
JP2004037292A (ja) * 2002-07-04 2004-02-05 Sony Corp Navigation device, service device for a navigation device, and service providing method using a navigation device

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009208592A (ja) * 2008-03-04 2009-09-17 Pioneer Electronic Corp Mobile body-mounted device setting system and server device for a mobile body-mounted device setting system
JP2019119445A (ja) * 2018-01-04 2019-07-22 Harman International Industries, Incorporated Mood roof for an augmented media experience in a vehicle cabin
US11958345B2 (en) 2018-01-04 2024-04-16 Harman International Industries, Incorporated Augmented media experience in a vehicle cabin
JP7475812B2 (ja) 2018-01-04 2024-04-30 Harman International Industries, Incorporated Mood roof for an augmented media experience in a vehicle cabin
CN110287386A (zh) * 2018-03-19 2019-09-27 Honda Motor Co., Ltd. Information providing system, information providing method, and medium
US20210155250A1 (en) * 2019-11-22 2021-05-27 Mobile Drive Technology Co.,Ltd. Human-computer interaction method, vehicle-mounted device and readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 06796727; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)