WO2007020808A1 - Content providing device, content providing method, content providing program, and computer-readable recording medium - Google Patents

Content providing device, content providing method, content providing program, and computer-readable recording medium

Info

Publication number
WO2007020808A1
Authority
WO
WIPO (PCT)
Prior art keywords
passenger
information
content
content providing
behavior
Prior art date
Application number
PCT/JP2006/315405
Other languages
English (en)
Japanese (ja)
Inventor
Hiroaki Shibasaki
Tadayasu Kaneko
Matsuaki Haruki
Motoyuki Yamashita
Jun Oosugi
Mari Kitada
Original Assignee
Pioneer Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pioneer Corporation filed Critical Pioneer Corporation
Publication of WO2007020808A1 publication Critical patent/WO2007020808A1/fr


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29 Geographical information databases

Definitions

  • Content providing apparatus, content providing method, content providing program, and computer-readable recording medium
  • the present invention relates to a content providing apparatus that provides content in a mobile body, a content providing method, a content providing program, and a computer-readable recording medium.
  • the present invention is not limited to the above-described content providing apparatus, content providing method, content providing program, and computer-readable recording medium.
  • a passenger of the moving body can view various types of content.
  • Various types of content include, for example, radio and television broadcasts, and music and video recorded on recording media such as a CD (Compact Disk) or a DVD (Digital Versatile Disk).
  • A passenger of the mobile body can view the content through a display device such as a display or an audio device such as a speaker mounted on the mobile body.
  • Each passenger's preference regarding the in-vehicle environment (such as the position and operating status of in-vehicle devices) is stored as profile information for each passenger in an IC (Integrated Circuit) card or ID (IDentification) card.
  • a navigation apparatus in a moving body such as a vehicle guides the moving body to a destination using the position information of the moving body, the information on the destination, and the map information.
  • the destination point is set based on the user's input, and the user operates the operation unit with a remote control or touch panel to input information on the name and position of the destination point.
  • Patent Document 1: Japanese Patent Application Laid-Open No. 2002-104105
  • Patent Document 2: JP 2000-88597
  • The content providing apparatus according to the invention is a content providing apparatus that provides content in a mobile object, and includes: identifying means for identifying a passenger of the mobile object; passenger information acquiring means for acquiring information relating to the passenger identified by the identifying means (hereinafter referred to as "passenger information"); determining means for determining content to be output based on the passenger information acquired by the passenger information acquiring means; and output means for outputting the content determined by the determining means.
  • The content providing apparatus according to another aspect of the invention is a content providing apparatus that provides content in a mobile object, and includes: behavior detecting means for detecting the behavior of the passenger; determining means for determining content to be output based on the result detected by the behavior detecting means; and output means for outputting the content determined by the determining means.
  • The content providing method according to the invention of claim 10 is a content providing method for providing content in a mobile object, and includes: a specifying step of specifying a passenger of the mobile object; a passenger information acquiring step of acquiring the passenger information of the passenger specified in the specifying step; a determining step of determining content to be output based on the passenger information acquired in the passenger information acquiring step; and an output step of outputting the content determined in the determining step.
  • The content providing method according to the invention of claim 11 is a content providing method for providing content in a mobile object, and includes: a behavior detecting step of detecting the behavior of the passenger; a determining step of determining content to be output based on the result detected in the behavior detecting step; and an output step of outputting the content determined in the determining step.
  • a content providing program according to the invention of claim 12 causes a computer to execute the content providing method according to claim 10 or 11.
  • a computer-readable recording medium records the content providing program according to claim 12.
  • FIG. 1 is a block diagram showing an example of a functional configuration of a content providing apparatus according to the first embodiment.
  • FIG. 2 is a flowchart showing details of processing of the content providing apparatus according to the first embodiment.
  • FIG. 3 is a block diagram of an example of a hardware configuration of the navigation device according to the first embodiment.
  • FIG. 4 is an explanatory diagram of an example of the interior of the vehicle in which the navigation device according to the first embodiment is mounted.
  • FIG. 5 is a flowchart of the process using passenger information in the navigation device according to the first embodiment.
  • FIG. 6 is a flowchart of the process using behavior information in the navigation device according to the first embodiment.
  • FIG. 7 is a block diagram showing an example of a functional configuration of the content providing apparatus according to the second embodiment.
  • FIG. 8 is a flowchart showing details of processing of the content providing apparatus according to the second embodiment.
  • FIG. 9 is a flowchart of the process using the passenger information in the navigation device according to the second embodiment.
  • FIG. 10 is a flowchart of the contents of the process using the behavior information in the navigation device according to the second embodiment.
  • FIG. 1 is a block diagram illustrating an example of a functional configuration of the content providing apparatus according to the first embodiment.
  • In FIG. 1, the content providing apparatus 100 includes a passenger identification unit 101, a passenger information acquisition unit 102, a content determination unit 103, a content output unit 104, and a behavior detection unit 105.
  • the passenger identification unit 101 identifies the passenger.
  • the identification of the passenger is, for example, the identification of the relationship (owner, relative, friend, other person, etc.) with the owner of the moving body in each passenger.
  • the passenger information acquisition unit 102 acquires the passenger information related to the passenger specified by the passenger specifying unit 101.
  • Passenger information is, for example, information including characteristics such as preferences, age, and gender related to the passenger, and the configuration such as the arrangement and number of passengers.
  • the behavior detection unit 105 detects the behavior of the passenger.
  • the passenger's behavior is information including the physical condition of the passenger such as sleepiness, fatigue and physical condition, and may be information including the arrangement and number of passengers exhibiting the predetermined behavior.
  • The content determination unit 103 determines the content to be output based on at least one of the passenger information acquired by the passenger information acquisition unit 102 and the behavior of the passenger detected by the behavior detection unit 105.
  • the content to be output is, for example, music or video recorded in advance on a recording medium (not shown), and radio broadcast or television broadcast received by a communication unit (not shown).
  • the content output unit 104 outputs the content determined by the content determination unit 103.
  • the content is output by, for example, a display device such as a display mounted on a moving body or an acoustic device such as a speaker.
  • FIG. 2 is a flowchart showing the contents of the processing of the content providing apparatus according to the first embodiment.
  • First, the content providing apparatus 100 determines whether or not there is an instruction to provide content (step S201).
  • The instruction to provide content may be given, for example, by the passenger operating an operation unit (not shown).
  • In step S201, the apparatus waits for an instruction to provide content; if there is an instruction (step S201: Yes), the passenger identification unit 101 identifies the passenger (step S202).
  • the identification of the passenger is, for example, the identification of the relationship between the owner of the mobile object and the passenger.
  • the occupant information acquisition unit 102 acquires occupant information about the occupant identified in step S202 (step S203).
  • Passenger information is, for example, information including characteristics such as preferences, age, and sex regarding the passenger, and configuration such as the arrangement and number of passengers.
  • the behavior detecting unit 105 detects the behavior of the passenger (step S204).
  • the passenger's behavior is information including the physical condition of the passenger such as sleepiness, fatigue, and physical condition.
  • Note that the acquisition of the passenger information (step S203) and the detection of the behavior of the passenger (step S204) may be performed in reverse order, or only one of them may be performed.
  • Next, the content determination unit 103 determines the content to be output based on at least one of the passenger information acquired in step S203 and the passenger behavior detected in step S204 (step S205). Then, the content output unit 104 outputs the content determined in step S205 (step S206), and the series of processes ends.
  • In the above description, after waiting for an instruction to provide content (step S201: Yes), the passenger is identified (step S202), the passenger information is acquired (step S203), and the behavior is detected (step S204); however, steps S202 to S204 may be performed before the instruction to provide content is given. For example, the passenger may be identified (step S202), the passenger information acquired (step S203), and the behavior detected (step S204) at the time of boarding, and the apparatus may then wait for an instruction to provide content (step S201: Yes) before determining the content (step S205).
  • As described above, according to the first embodiment, content can be output based on the passenger information and behavior of the passenger even if the user does not set the content by his or her own input operation. Therefore, content suitable for the passenger can be output efficiently.
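A minimal sketch, in Python, of how the flow of FIG. 2 could be realized; it is for illustration only, and the helper callables, field names, and content catalogue below are hypothetical rather than part of the original disclosure.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Behavior:
    sleepy: bool = False
    tired: bool = False

def provide_content(instruction_given: bool,
                    identify: Callable[[], dict],
                    acquire_info: Callable[[dict], dict],
                    detect_behavior: Callable[[], Behavior],
                    catalogue: dict[str, str]) -> str | None:
    """Rough equivalent of steps S201 to S206 of the first embodiment (illustrative only)."""
    if not instruction_given:              # step S201: wait for a content provision instruction
        return None
    passenger = identify()                 # step S202: identify the passenger
    info = acquire_info(passenger)         # step S203: acquire passenger information
    behavior = detect_behavior()           # step S204: detect the passenger's behavior
    # Step S205: determine the content from the passenger information and/or the behavior.
    if behavior.sleepy:
        return catalogue.get("up_tempo_music")
    for pref in info.get("preferences", []):
        if pref in catalogue:
            return catalogue[pref]         # content matching the passenger's preferences
    return catalogue.get("default")        # step S206: the chosen content is then output

# Example call with hypothetical stand-ins for the identification and detection units:
content = provide_content(
    True,
    identify=lambda: {"relationship": "owner"},
    acquire_info=lambda passenger: {"preferences": ["jazz"]},
    detect_behavior=lambda: Behavior(),
    catalogue={"jazz": "jazz_album_01", "up_tempo_music": "up_tempo_playlist", "default": "radio"},
)
```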
  • Example 1 of the present invention will be described.
  • a navigation device mounted on a moving body such as a vehicle (including a four-wheeled vehicle and a two-wheeled vehicle) will be described.
  • FIG. 3 is a block diagram of an example of a hardware configuration of the navigation device according to the first embodiment.
  • A navigation device 300 is mounted on a moving body such as a vehicle, and includes a navigation control unit 301, a user operation unit 302, a display unit 303, a position acquisition unit 304, a recording medium 305, a recording medium decoding unit 306, an audio output unit 307, a communication unit 308, a route search unit 309, a route guidance unit 310, an audio generation unit 311, a speaker 312, a passenger photographing unit 313, and an audio processing unit 314.
  • the navigation control unit 301 controls the entire navigation device 300.
  • The navigation control unit 301 can be realized by, for example, a microcomputer including a CPU (Central Processing Unit) that executes predetermined arithmetic processing, a ROM (Read Only Memory) that stores various control programs, and a RAM (Random Access Memory) that functions as a work area for the CPU.
  • Based on the information on the current position acquired by the position acquisition unit 304 and the map information obtained from the recording medium 305 via the recording medium decoding unit 306, the navigation control unit 301 calculates, during route guidance, at which position on the map the vehicle is traveling, and outputs the calculation result to the display unit 303.
  • the navigation control unit 301 performs route guidance.
  • the route search unit 309, the route guidance unit 310, and the voice generation unit 311 input / output information related to route guidance, and output the obtained information to the display unit 303 and the voice output unit 307.
  • the navigation control unit 301 generates passenger identification information and behavior information, which will be described later, based on the passenger image or behavior of the passenger photographed by the passenger photographing unit 313. Then, in accordance with the content reproduction instruction input by the user operating the user operation unit 302, content reproduction such as audio and video is controlled based on the identification information and behavior information of the passenger.
  • the content playback control is performed by, for example, determining the content to be played back from music or video recorded on the recording medium 305 or radio broadcast or television broadcast received by the communication unit 308, and displaying the display unit 303 or the audio processing unit 314. Output to.
  • the user operation unit 302 acquires information input by the user by operating operation means such as a remote control, a switch, and a touch panel, and outputs the acquired information to the navigation control unit 301.
  • Display unit 303 includes, for example, a CRT (Cathode Ray Tube), a TFT liquid crystal display, an organic EL display, a plasma display, and the like.
  • The display unit 303 can be configured by, for example, a video I/F (interface) and a video display device connected to the video I/F.
  • The video I/F includes, for example, a graphics controller that controls the entire display device, a buffer memory such as a VRAM (Video RAM) that temporarily stores image information that can be displayed immediately, and a control IC that controls the display of the display device based on the image information output from the graphics controller.
  • the display unit 303 displays traffic information, map information, information on route guidance, content about video output from the navigation control unit 301, and other various types of information.
  • The position acquisition unit 304 includes a GPS receiver and various sensors such as a vehicle speed sensor, an angular velocity sensor, and an acceleration sensor, and acquires information on the current position of the moving body (the current position of the navigation device 300).
  • The GPS receiver receives radio waves from GPS satellites and determines a geometric position relative to the GPS satellites; GPS stands for Global Positioning System.
  • The GPS receiver includes an antenna for receiving radio waves from the GPS satellites, a tuner that demodulates the received radio waves, and an arithmetic circuit that calculates the current position based on the demodulated information.
  • the recording medium 305 can be realized by, for example, an HD (Hard Disk), a DVD (Digital Versatile Disk), a CD (Compact Disk), or a memory card. Note that the recording medium 305 may accept writing of information by the recording medium decoding unit 306 and record the written information in a nonvolatile manner.
  • map information used for route search and route guidance is recorded in the recording medium 305.
  • The map information recorded in the recording medium 305 includes background data representing features such as buildings, rivers, and the ground surface, and road shape data representing the shape of roads, and is drawn in 2D or 3D on the display screen of the display unit 303.
  • When the navigation device 300 is guiding a route, the map information read from the recording medium 305 by the recording medium decoding unit 306 and a mark indicating the position of the moving body acquired by the position acquisition unit 304 are displayed on the display unit 303.
  • The recording medium 305 also records registered identification information for identifying the passenger, specific behavior information for determining the behavior of the passenger, and content such as video and music.
  • the registered identification information includes, for example, information obtained by extracting feature points of a passenger image photographed by a camera or the like, such as a face pattern, an eye iris, fingerprint data, or voice data.
  • the specific behavior information includes, for example, information obtained by extracting features related to the specific behavior state such as drowsiness and fatigue, such as eyelid movement, volume level, and heart rate.
  • The content recorded in the recording medium 305 is recorded in association with, for example, the passenger information of the passenger identified based on the registered identification information or the specific behavior state of the passenger, and can be read out according to the result of the identification of the passenger and the determination of the behavior in the navigation control unit 301.
  • the passenger information is, for example, information including characteristics such as preference, age, and sex, and the configuration such as the arrangement of passengers and the number of passengers.
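For illustration, the associations described above between registered identification information, passenger information, specific behavior states, and content could be organized as in the following sketch; all type names, fields, and example values are assumptions made for this example, not the structure defined by the patent.

```python
from dataclasses import dataclass, field

@dataclass
class RegisteredPassenger:
    # Registered identification information: feature points extracted from a camera image,
    # iris data, fingerprint data, or voice data could be held here (a single feature
    # vector is used for simplicity).
    face_features: list[float]
    # Passenger information associated with the registered identification information.
    preferences: list[str] = field(default_factory=list)
    age: int = 0
    gender: str = ""

@dataclass
class SpecificBehavior:
    # Specific behavior information: characteristic values for a behavior state such as
    # sleepiness or fatigue (e.g. eyelid movement, volume level, heart rate).
    name: str
    eyelid_rate: float

# Content associated with passenger information or specific behavior states, so that it can
# be read out after the identification of the passenger or the determination of the behavior.
content_by_preference = {"jazz": "jazz_album_01", "animation": "kids_anime_03"}
content_by_behavior = {"sleepy": "up_tempo_playlist"}
```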
  • map information, content, and the like are recorded on the recording medium 305.
  • Map information, content, and the like may instead be recorded in a server outside the navigation device 300. In this case, the navigation device 300 acquires the map information and content from the server via the network through the communication unit 308, for example, and the acquired information is stored in the RAM or the like.
  • The recording medium decoding unit 306 controls reading/writing of information on the recording medium 305.
  • the recording medium decoding unit 306 is an HDD (Hard Disk Drive).
  • The audio output unit 307 reproduces sound such as a guidance sound by controlling output to the connected speaker 312. There may be one speaker 312 or a plurality of speakers.
  • The audio output unit 307 can include, for example, a D/A converter that performs D/A conversion of digital audio information, an amplifier that amplifies the analog audio signal output from the D/A converter, and an A/D converter that performs A/D conversion of analog audio information.
  • the communication unit 308 obtains various types of information from the outside.
  • The communication unit 308 includes an FM multiplex tuner, a VICS (registered trademark)/beacon receiver, a wireless communication device, and other communication devices, and communicates with other communication devices via a communication medium such as a mobile phone, PHS, communication card, or wireless LAN.
  • it may be a device that can communicate by radio broadcast radio waves, television broadcast radio waves, or satellite broadcasts.
  • Information acquired by the communication unit 308 includes traffic information, such as congestion and traffic regulations, distributed by a road traffic information communication system center, traffic information collected independently by operators, and other public data and content available on the Internet.
  • the communication unit 308 may request traffic information or content from a server storing traffic information and content nationwide via the network and obtain the requested information.
  • it may be configured to receive video signals or audio signals from radio broadcasts, television broadcasts, or satellite broadcasts.
  • The route search unit 309 searches for the optimal route from the departure point to the destination point, using the map information acquired from the recording medium 305 via the recording medium decoding unit 306, the traffic information acquired via the communication unit 308, and the like.
  • the optimal route is the route that best matches the user's request.
  • The route guidance unit 310 generates route guidance information for guiding the user to the destination point based on the optimal route information searched by the route search unit 309, the position information of the moving body acquired by the position acquisition unit 304, and the map information obtained from the recording medium 305 via the recording medium decoding unit 306.
  • the route guidance information generated at this time may be information that considers the traffic jam information received by the communication unit 308.
  • the route guidance information generated by the route guidance unit 310 is output to the display unit 303 via the navigation control unit 301.
  • The sound generation unit 311 generates information of various sounds such as a guidance sound. That is, based on the route guidance information generated by the route guidance unit 310, it sets a virtual sound source corresponding to the guidance point, generates voice guidance information, and outputs it to the audio output unit 307 via the navigation control unit 301.
  • the passenger photographing unit 313 photographs a passenger.
  • the passenger may be photographed using either a moving image or a still image.
  • the passenger image and the behavior of the passenger are photographed and output to the navigation control unit 301.
  • the audio processing unit 314 reproduces audio such as music output from the navigation control unit 301 by controlling output to the connected speaker 312.
  • the sound processing unit 314 may have substantially the same configuration as the sound output unit 307.
  • The passenger identification unit 101, the passenger information acquisition unit 102, and the behavior detection unit 105, which are functional components of the content providing apparatus 100 according to the first embodiment, realize their functions by the navigation control unit 301 and the passenger photographing unit 313; the content determination unit 103 realizes its function by the navigation control unit 301, and the content output unit 104 by the display unit 303 and the speaker 312.
  • FIG. 4 is an explanatory diagram of an example of the inside of the vehicle on which the navigation device according to the first embodiment is mounted.
  • The interior of the vehicle has a driver seat 411, a passenger seat 412, and a rear seat 413. Around the driver seat 411 and the passenger seat 412, a display device (display unit 303), an acoustic device (speaker 312), and an information reproducing device 426a are provided. A display device 421b and an information reproducing device 426b for the passenger of the rear seat 413 are provided on the passenger seat 412, and an acoustic device (not shown) is provided behind the rear seat 413.
  • Each information reproducing device 426 (426a, 426b) is provided with a photographing device (passenger photographing unit 313) 423 that can photograph a passenger.
  • Each information reproducing device 426 (426a, 426b) may have a structure that can be attached to and detached from the vehicle.
  • FIG. 5 is a flowchart of the process using the passenger information in the navigation device according to the first embodiment.
  • the navigation apparatus 300 first determines whether or not a content reproduction instruction has been given (step S501).
  • the content reproduction instruction may be configured, for example, by a passenger operating the user operation unit 302.
  • In step S501, the apparatus waits for a content reproduction instruction; if there is an instruction (step S501: Yes), the passenger photographing unit 313 then photographs a passenger image (step S502).
  • The passenger image is taken, for example, as a still image of the passenger's face.
  • the navigation control unit 301 generates passenger identification information from the passenger image taken in step S502 (step S503).
  • the identification information includes, for example, information obtained by extracting the feature points of the passenger's face, and is collated with the registered identification information recorded on the recording medium 305.
  • Next, the navigation control unit 301 collates the registered identification information registered in advance in the recording medium 305 with the identification information generated in step S503, and determines whether or not the identification information matches (step S504). If the identification information matches (step S504: Yes), the navigation control unit 301 reads from the recording medium 305 the preference information of the passenger associated with the registered identification information that matches the identification information (step S505).
  • The preference information is, for example, information related to the passenger's content preferences; for music, it includes information about genres such as Japanese music, Western music, and singers, and for video, genres such as animation.
  • the navigation control unit 301 determines the content to be played based on the preference information read in step S 505 (step S 506), and outputs it to the display unit 303 or the audio processing unit 314.
  • the audio processing unit 314 performs audio processing and outputs the content to the speaker 312.
  • the display unit 303 or the speaker 312 reproduces the content (step S507), and the series of processing ends.
  • If the identification information does not match in step S504 (step S504: No), it is determined whether or not to register the passenger (step S508).
  • Whether or not to register the passenger may be determined, for example, by displaying a message on the display unit 303 indicating that passenger registration is required and prompting the passenger to decide whether or not to register.
  • If the passenger is not to be registered in step S508 (step S508: No), the navigation control unit 301 outputs a content selection request to the display unit 303 (step S510) and accepts a selection of content (step S511). Then, the navigation control unit 301 outputs the content to the display unit 303 or the audio processing unit 314 based on the selected content, the audio processing unit 314 performs audio processing and outputs the content to the speaker 312, the display unit 303 or the speaker 312 reproduces the content (step S507), and the series of processes ends.
  • If the passenger is to be registered in step S508 (step S508: Yes), a message prompting the passenger to register is displayed on the display unit 303, for example, and the registered identification information and preference information of the passenger are registered (step S509).
  • Registration of registration identification information of a passenger is performed by, for example, extracting feature points from a passenger image photographed by the passenger photographing unit 313.
  • the preference information may be registered by the user operating the user operation unit 302. Then, the process returns to step S504 and the same processing is repeated.
  • In the above description, the passenger image is photographed (step S502) after waiting for the content reproduction instruction (step S501: Yes); however, the passenger image may be taken before the content reproduction instruction is given (step S502). For example, the passenger may be photographed at the time of boarding, when the vehicle engine is started, when the passenger performs an operation, or at predetermined intervals during travel (step S502), and the apparatus may then wait for a content reproduction instruction.
  • In the above description, the passenger image is photographed and the identification information is generated based on information obtained by extracting feature points of the passenger's face; however, instead of identification information of an individual passenger, the number and composition of passengers may be generated as the identification information. Specifically, for example, the number and composition of passengers may be identified so that content such as live music is played when many people are riding together, and content such as programs for children is played when a family with children is riding.
  • In step S505, the preference information is read from the recording medium 305 as the passenger information of the passenger whose identification information matches; however, the age and gender of the passenger may be read instead, and content suitable for the age and gender may be played. For example, content such as a program for children may be played when the passenger is a child, and content such as a cooking program when the passengers are all female. In addition, the viewing history of the passenger may be recorded so that, for example, content that was viewed halfway during the previous boarding is replayed from the point where it was interrupted.
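The collation and selection steps of FIG. 5 (roughly steps S503 to S507) might be sketched as follows; the feature-similarity measure, the matching threshold, and the history handling are illustrative assumptions only, not the method defined in the claims.

```python
def identify(features: list[float], registered: list[dict], threshold: float = 0.8) -> dict | None:
    """Step S504 (illustrative): collate generated identification info with registered info."""
    def similarity(a: list[float], b: list[float]) -> float:
        # Naive inverse-distance similarity between two feature vectors (assumption).
        return 1.0 / (1.0 + sum((x - y) ** 2 for x, y in zip(a, b)))
    best = max(registered, key=lambda p: similarity(features, p["face_features"]), default=None)
    if best is not None and similarity(features, best["face_features"]) >= threshold:
        return best
    return None  # step S504: No -> go to the registration / manual selection branch

def choose_content(passenger: dict, content_by_preference: dict[str, str],
                   history: dict[str, str]) -> str | None:
    """Steps S505-S506 (illustrative): read preference info and decide the content to play."""
    unfinished = history.get(passenger.get("name", ""))
    if unfinished:                         # resume content viewed halfway on the last boarding
        return unfinished
    for pref in passenger.get("preferences", []):
        if pref in content_by_preference:
            return content_by_preference[pref]
    return None                            # fall back to a content selection request (step S510)
```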
  • FIG. 6 is a flowchart of the contents of the process using the behavior information in the navigation device according to the first embodiment.
  • the navigation apparatus 300 determines whether or not a content reproduction instruction has been given (step S601).
  • the content reproduction instruction may be, for example, a configuration in which the passenger operates the user operation unit 302.
  • In step S601, the apparatus waits for a content reproduction instruction; if there is an instruction (step S601: Yes), the passenger photographing unit 313 photographs the behavior of the passenger (step S602). For example, the movement of the passenger's eyeballs may be photographed.
  • Next, the navigation control unit 301 generates behavior information of the passenger from the movement of the passenger's eyeballs photographed in step S602 (step S603).
  • The behavior information includes, for example, information obtained by extracting feature points of the passenger's eye movements, and is collated with specific behavior information, recorded in advance in the recording medium 305, that includes characteristics of eye movements associated with states such as sleepiness and fatigue.
  • Next, the navigation control unit 301 collates the specific behavior information registered in advance in the recording medium 305 with the behavior information generated in step S603 and determines whether or not the passenger is in a specific behavior state (step S604). If the passenger is in a specific behavior state (step S604: Yes), the navigation control unit 301 determines the content to be played based on that specific behavior state (step S605). Specifically, for example, when the passenger shows sleepy behavior and the passenger is the driver, relatively lively, up-tempo music may be determined as the content to be played.
  • The navigation control unit 301 reads the content determined in step S605 from the recording medium 305 and outputs it to the display unit 303 or the audio processing unit 314. Then, the audio processing unit 314 performs audio processing and outputs the content to the speaker 312. The display unit 303 or the speaker 312 reproduces the content (step S606), and the series of processing ends.
  • If the passenger is not in a specific behavior state in step S604 (step S604: No), the process returns to step S602 and the same processing is repeated. Alternatively, if the passenger is not in a specific behavior state, the passenger may be notified to that effect and prompted to select content, or content for the case of not being in a specific behavior state may be set in advance and played back.
  • In the above description, the behavior of the passenger is photographed (step S602) after waiting for the content reproduction instruction (step S601: Yes); however, the behavior of the passenger may be photographed (step S602) before the content reproduction instruction is issued. For example, the behavior of the passenger may be photographed at the time of boarding, when the vehicle engine is started, when the passenger performs an operation, or at predetermined intervals during travel (step S602), and the apparatus may then wait for a content reproduction instruction.
  • In step S602, the behavior information is generated by photographing the movement of the passenger's eyeballs; however, behavior information may also be generated from the opening and closing of windows, the movement of the passenger's whole body, the sound volume in the vehicle, and the like. For example, when a passenger opens a window, this may indicate a behavior of feeling hot or uncomfortable, and behavior information relating to such behavior may be generated from whole-body movement and sound volume.
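As a rough illustration of steps S603 to S605, sleepiness could be judged from simple eye-movement features and the content chosen accordingly; the eyelid-closure and blink-rate metrics, the thresholds, and the playlist name are hypothetical assumptions rather than the specific behavior information used by the patent.

```python
def detect_specific_behavior(eyelid_closure_ratio: float, blink_rate_per_min: float) -> str | None:
    """Step S604 (illustrative): decide whether the behavior info indicates a specific state."""
    if eyelid_closure_ratio > 0.6 or blink_rate_per_min < 5:   # thresholds are assumptions
        return "sleepy"
    return None

def content_for_behavior(state: str | None, is_driver: bool) -> str | None:
    """Step S605 (illustrative): determine the content from the specific behavior state."""
    if state == "sleepy" and is_driver:
        return "up_tempo_playlist"     # lively, up-tempo music to counter drowsiness
    return None                        # step S604: No -> keep observing or ask the passenger
```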
  • In the first embodiment, the content is output via one or more display units 303 or speakers 312, and content reproduction may be controlled individually for each of them.
  • a display unit 303 may be provided for each seat of the vehicle, and content suitable for each passenger of each seat may be reproduced.
  • The processing using the passenger information and the processing using the behavior information have been described separately with reference to FIGS. 5 and 6; however, they may be combined into a configuration in which both functions are processed together.
  • In the above description, the passenger is photographed by the passenger photographing unit 313, such as a camera, and the passenger identification information or the behavior information is generated.
  • it may be configured to generate identification information that identifies the passenger or passenger behavior information from information acquired by other sensors.
  • the identification information or the behavior information may be generated using, for example, a seating sensor that detects a load distribution or a total load on a seat on which a passenger is seated. Information on the number and physique of the passengers can be obtained by the seating sensor.
  • One or more fingerprint sensors may be provided at predetermined positions in the vehicle. The fingerprint sensor can acquire the passenger's fingerprint information and identify the passenger.
  • a voice sensor such as a microphone may be provided in the car. Voice information such as the volume, sound quality, and pitch of the passenger can be acquired by the voice sensor, so that the passenger can be identified, and the number, gender, sleepiness, etc. can be determined.
  • A human body sensor for measuring the pulse or the like may also be used. For example, by using information such as the pulse, the physical condition of the passenger can be grasped, and the behavior of the passenger can be determined.
  • A configuration may also be adopted in which personal information is acquired from a mobile phone owned by the passenger.
  • information such as the number of people may be acquired by detecting the attachment / detachment of a seat belt or a child seat.
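A small sketch of how information from such alternative sensors, for example seat load sensors and seat-belt switches, could be turned into identification information about the number of passengers; the threshold values and field names are assumptions made for illustration only.

```python
def passengers_from_seat_sensors(seat_loads_kg: dict[str, float],
                                 belts_fastened: dict[str, bool],
                                 min_load_kg: float = 15.0) -> dict:
    """Estimate the number and rough physique of passengers (hypothetical thresholds)."""
    occupied = {seat: load for seat, load in seat_loads_kg.items() if load >= min_load_kg}
    children = [seat for seat, load in occupied.items() if load < 35.0]
    return {
        "count": len(occupied),
        "child_seats": children,
        "unfastened": [s for s in occupied if not belts_fastened.get(s, False)],
    }

# Example: driver plus one child in the rear seat
print(passengers_from_seat_sensors({"driver": 70.0, "rear_left": 22.0},
                                    {"driver": True, "rear_left": True}))
```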
  • As described above, according to the navigation device 300 of Example 1, content can be output based on the passenger information and behavior of the passenger even if the user does not set the content by his or her own input operation. Therefore, content suitable for the passenger can be output efficiently.
  • the navigation device 300 captures the passenger and generates identification information and behavior information of the passenger.
  • The content to be played can be determined by comparing the identification information and the behavior information with the registered identification information and the specific behavior information.
  • Furthermore, since content can be played back based on the passenger's viewing history, it is possible, for example, to avoid repeating content that the passenger has recently viewed, or to resume content that the passenger watched halfway, thereby providing content to the passenger without interruption. Therefore, content provision can be optimized even if the passenger does not set the content himself or herself.
  • FIG. 7 is a block diagram of an example of a functional configuration of the content providing apparatus according to the second embodiment.
  • In FIG. 7, the content providing apparatus 700 includes a passenger identification unit 701, a passenger information acquisition unit 702, a spot information setting unit 703, a spot information output unit 704, and a behavior detection unit 705.
  • the passenger identification unit 701 identifies the passenger.
  • the identification of the passenger is, for example, the identification of the relationship (owner, relative, friend, other person, etc.) with the owner of the moving body in each passenger.
  • the passenger information acquisition unit 702 acquires passenger information related to the passenger specified by the passenger identification unit 701.
  • Passenger information is, for example, information including characteristics such as preferences, age, and gender of the passenger, and the configuration such as the arrangement and number of passengers.
  • the behavior detection unit 705 detects the behavior of the passenger.
  • The passenger's behavior is, for example, information including the physical state of the passenger such as sleepiness, fatigue, and physical condition, and may also include information such as the arrangement and number of passengers exhibiting a predetermined behavior.
  • The point information setting unit 703 sets the point information based on at least one of the passenger information acquired by the passenger information acquisition unit 702 and the behavior of the passenger detected by the behavior detection unit 705.
  • the point information is, for example, information including a destination point and a stop point in route guidance, which is acquired by a communication unit (not shown) or set with reference to a recording medium (not shown).
  • The point information may also be set by making a request for use, such as a reservation, to a destination or stop-off facility via a communication unit (not shown), and setting the point for which the request is accepted.
  • the point information output unit 704 outputs the point information set by the point information setting unit 703.
  • The point information is output, for example, by a display device such as a display mounted on the moving body or by an audio device such as a speaker.
  • the point information may be output by outputting a plurality of candidates and allowing the user to select a destination point or a stop-off point.
  • FIG. 8 is a flowchart showing the contents of processing of the content providing apparatus according to the second embodiment.
  • the content providing apparatus 700 determines whether or not a route guidance instruction has been given (step S801).
  • the route guidance may be instructed from the operation unit (not shown) by the passenger, for example.
  • In step S801, the apparatus waits for a route guidance instruction; if there is an instruction (step S801: Yes), the passenger identification unit 701 identifies the passenger (step S802).
  • the identification of the passenger is, for example, the identification of the relationship between the owner of the mobile object and the passenger.
  • the occupant information acquisition unit 702 acquires occupant information related to the occupant identified in step S802 (step S803).
  • Passenger information is, for example, information including characteristics such as preferences, age, and gender of the passenger, and the configuration such as the arrangement and number of passengers.
  • the behavior detection unit 705 detects the behavior of the passenger (step S804).
  • the passenger's behavior is information including the physical condition of the passenger such as sleepiness, fatigue, and physical condition.
  • Note that the acquisition of the passenger information (step S803) and the detection of the behavior of the passenger (step S804) may be performed in reverse order, or only one of them may be performed.
  • Next, the point information setting unit 703 sets the point information based on at least one of the passenger information acquired in step S803 and the passenger behavior detected in step S804 (step S805).
  • the spot information output unit 704 outputs the spot information set in step S805 (step S806), and the series of processing ends.
  • In the above description, processing such as acquisition of the passenger information and detection of the behavior is performed after a route guidance instruction is given; however, the trigger is not limited to a route guidance instruction. For example, the processing may be performed at the same time as the moving body starts moving, or at predetermined intervals while the moving body is moving.
  • As described above, according to the second embodiment, point information can be set and output based on the passenger information and behavior of the passenger even if the user does not set the point information by his or her own input operation. Therefore, point information suitable for the passenger can be output efficiently and accurately.
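For illustration, the point-setting step of FIG. 8 (step S805) could be sketched as follows; the point tables and selection rules are hypothetical and only indicate how passenger information and behavior might map to a destination or stop-off point.

```python
def set_point_info(passenger_info: dict, behavior_state: str | None,
                   points_by_preference: dict[str, str],
                   points_by_behavior: dict[str, str]) -> str | None:
    """Rough equivalent of step S805: choose point info from passenger info and/or behavior."""
    if behavior_state and behavior_state in points_by_behavior:
        return points_by_behavior[behavior_state]      # e.g. a rest area for a sleepy driver
    for pref in passenger_info.get("preferences", []):
        if pref in points_by_preference:
            return points_by_preference[pref]          # e.g. a restaurant the passenger likes
    return None                                        # fall back to manual selection

# Example usage with hypothetical data:
point = set_point_info({"preferences": ["ramen"]}, None,
                       {"ramen": "Ramen Shop A"}, {"sleepy": "Rest Area B"})
```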
  • Example 2 of the present invention will be described below.
  • In Example 2 as well, the content providing apparatus of the present invention is implemented by the navigation device 300 shown in FIG. 3.
  • The interior of the vehicle equipped with the navigation device 300 is almost the same as that shown in FIG. 4.
  • the map information recorded on the recording medium 305 includes destination point information and stop-off point information.
  • The destination point information and stop-off point information are associated, for example, with the passenger information of the passenger identified based on the registered identification information or with the specific behavior state of the passenger, and are read according to the result of the passenger identification and the behavior determination in the navigation control unit 301.
  • Passenger information is, for example, information including characteristics such as preferences, age, and gender, and the arrangement and number of passengers.
  • the specific behavior state is, for example, the physical state of the passenger such as sleepiness, fatigue, or physical condition.
  • the destination point information and the stop point information include facility information such as restaurants and amusement facilities.
  • The facility information is, for example, discount information according to the number and age of passengers, recommendation information, vacancy information of the facility, and the like, and can also be acquired from the facility by the communication unit 308.
  • the current position acquired by the position acquisition unit 304 or the departure point designated by the user from the user operation unit 302 is set as the departure point of the route searched by the route search unit 309.
  • The destination point may be set from the destination point information associated with the above-described passenger information of the passenger or the specific behavior state of the passenger.
  • a stop point may be set on the route searched by the route search unit 309.
  • The stop-off point may be set, for example, to a point specified by the user from the user operation unit 302, or based on the stop-off point information associated with the above-described passenger information of the passenger or the specific behavior state of the passenger.
  • The setting of the destination point and the stop-off point may be configured such that a request for use of the facility is made via the communication unit 308 as necessary, and the point for which the request is accepted is set. Requests regarding the use of a facility include, for example, confirming whether the facility has room for the number of passengers, asking about availability, reserving a restaurant that offers a discount according to the number of passengers, and reserving an amusement facility so that waiting time can be reduced; the destination point or stop-off point may then be set to a facility that has been confirmed or reserved.
  • The passenger identification unit 701, the passenger information acquisition unit 702, and the behavior detection unit 705, which are functional components of the content providing apparatus 700 according to the second embodiment, realize their functions by the navigation control unit 301 and the passenger photographing unit 313; the point information setting unit 703 realizes its function by the route search unit 309, and the point information output unit 704 by the display unit 303 and the speaker 312.
  • FIG. 9 is a flowchart of the process using the passenger information in the navigation device according to the second embodiment.
  • a process for setting a destination point using passenger information will be described.
  • the navigation apparatus 300 first determines whether or not a route guidance request has been made (step S901).
  • The route guidance request may be made, for example, by the passenger operating the user operation unit 302.
  • In step S901, the apparatus waits for a route guidance request; if there is a request (step S901: Yes), the passenger photographing unit 313 then photographs a passenger image (step S902).
  • the passenger image is taken by taking a still image of the face of the passenger.
  • the navigation control unit 301 generates passenger identification information from the passenger image captured in step S902 (step S903).
  • the identification information includes, for example, information obtained by extracting the feature points of the passenger's face, and is collated with the registered identification information recorded on the recording medium 305.
  • Next, the navigation control unit 301 collates the registered identification information registered in advance in the recording medium 305 with the identification information generated in step S903, and determines whether or not the identification information matches (step S904). If the identification information matches (step S904: Yes), the navigation control unit 301 reads from the recording medium 305 the destination point information associated with the passenger information of the passenger whose registered identification information matches the identification information (step S905).
  • the destination point information associated with the passenger information relates to the passenger's preference, for example, and may be configured to read destination point information such as a restaurant preferred by the passenger.
  • Information indicating that the passenger has stopped at a point in the past may be managed in association with the destination point information, and such destination point information may be excluded so that other destination point information is read.
  • the determination of “stopped in the past” can be made, for example, when a destination point of the vehicle is set in the past and the vehicle has arrived at the destination point.
  • Alternatively, the passenger may set his or her own evaluation for destination point information that the passenger has visited in the past, and destination point information with a high evaluation may be read.
  • The evaluation may be set, for example, by having the passenger input his or her own score for a destination point, and a destination point having a high score may be treated as a destination point with a high evaluation.
  • the route search unit 309 sets a destination point based on the destination point information read in step S905 (step S906).
  • the route guidance unit 310 generates route guidance information for guiding the user to the destination set in step S906 and outputs the route guidance information to the navigation control unit 301. . Then, the navigation control unit 301 controls the navigation device 300 based on the route guidance information, guides the route (step S907), and ends the series of processes.
  • If the identification information does not match in step S904 (step S904: No), it is determined whether or not to register the passenger (step S908).
  • Whether or not to register the passenger may be determined, for example, by displaying a message on the display unit 303 indicating that passenger registration is required and prompting the passenger to decide whether or not to register.
  • If the passenger is not to be registered in step S908 (step S908: No), the navigation control unit 301 outputs a destination point selection request to the display unit 303 (step S910) and accepts a selection of a destination point (step S911).
  • the route search unit 309 sets a destination point based on the selection of the destination point in step S911 (step S906).
  • the route guidance unit 310 generates route guidance information for guiding the user to the destination set in step S906 and outputs the route guidance information to the navigation control unit 301.
  • the navigation control unit 301 controls the navigation device 300 based on the route guidance information, guides the route (step S907), and ends the series of processes.
  • If the passenger is to be registered in step S908 (step S908: Yes), a message prompting the passenger to register is displayed on the display unit 303, for example, and the registered identification information of the passenger is registered (step S909). Registration of the registered identification information of the passenger is performed, for example, by extracting feature points from a passenger image photographed by the passenger photographing unit 313. The passenger information may also be registered by the user operating the user operation unit 302. Then, the process returns to step S904 and the same processing is repeated.
  • a configuration is adopted in which a passenger image is photographed (step S902) after waiting for a route guidance request and when there is a request (step S901: Yes).
  • the passenger image may be taken before the route guidance is requested (step S902).
  • the passenger image may be taken at the time of boarding, when the engine of the vehicle is started, or when the passenger is operated (step S902), and a route guidance request may be waited.
  • In the above description, the passenger image is photographed and the identification information is generated based on information obtained by extracting feature points of the passenger's face; however, instead of identification information of an individual passenger, the number and composition of passengers may be generated as the identification information. Specifically, for example, the number and composition of passengers may be identified so that boarding by a large group and boarding by a family with children can be distinguished.
  • In the above description, the destination point information associated with the passenger information relates, for example, to the passenger's preference; however, the destination point information may also be associated with age, gender, arrangement, number of passengers, history, and the like. More specifically, a restaurant for the elderly if the passenger is an elderly person, a facility that the passenger has visited in the past, a shop that the passenger has rated highly in the past, a shop offering a discount according to the number of passengers, a shop with a recommended service, a facility for families in the case of a family, or an amusement facility in the case of a group of friends may be associated with the destination point information.
  • The setting of the destination point in step S906 may be configured such that a request for use of the facility is made via the communication unit 308 as necessary, and the point for which the request is accepted is set. More specifically, a reservation may be made for a restaurant or amusement facility according to the number of passengers, and that facility may be set as the destination point.
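One possible way to read destination point information while excluding previously visited points, preferring highly rated ones, and optionally requesting a reservation via the communication unit is sketched below; the record fields and the reservation callback are assumptions for illustration, not the procedure claimed in the patent.

```python
def pick_destination(candidates: list[dict], visited: set[str],
                     request_reservation=None) -> dict | None:
    """Steps S905-S906 (illustrative): filter and rank destination point information."""
    # Exclude points the passenger has stopped at in the past, then prefer high evaluations.
    remaining = [c for c in candidates if c["name"] not in visited]
    remaining.sort(key=lambda c: c.get("evaluation", 0), reverse=True)
    for cand in remaining:
        # Optionally ask the facility (via the communication unit) whether it can accept the party.
        if request_reservation is None or request_reservation(cand):
            return cand
    return None

# Example usage with hypothetical data:
dest = pick_destination(
    [{"name": "Restaurant X", "evaluation": 4}, {"name": "Restaurant Y", "evaluation": 5}],
    visited={"Restaurant Y"},
)
```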
  • FIG. 10 is a flowchart of the contents of the process using the behavior information in the navigation device according to the second embodiment.
  • Here, a process of setting a stop-off point using the behavior information will be described.
  • the navigation apparatus 300 determines whether or not there is a request for route guidance (step S1001).
  • The route guidance request may be made, for example, by the passenger operating the user operation unit 302.
  • In step S1001, the apparatus waits for a route guidance request; if there is a request (step S1001: Yes), the passenger photographing unit 313 photographs the behavior of the passenger (step S1002). For example, the movement of the passenger's eyeballs may be photographed.
  • Next, the navigation control unit 301 generates behavior information of the passenger from the movement of the passenger's eyeballs photographed in step S1002 (step S1003).
  • The behavior information includes, for example, information obtained by extracting feature points of the movement of the passenger's eyeballs, and is collated with specific behavior information, recorded in advance in the recording medium 305, that includes characteristics of eye movements associated with states such as sleepiness and fatigue.
  • Next, the navigation control unit 301 collates the specific behavior information registered in advance in the recording medium 305 with the behavior information generated in step S1003, and determines whether or not the passenger is in a specific behavior state (step S1004). If the passenger is in a specific behavior state (step S1004: Yes), the navigation control unit 301 reads the stop-off point information associated with that specific behavior state (step S1005). Specifically, for example, when the passenger shows sleepy behavior and the passenger is the driver, a point where the driver can take a break may be read from the recording medium 305 as the stop-off point information.
  • Next, the route search unit 309 sets a stop-off point based on the stop-off point information read in step S1005 (step S1006). Then, the route guidance unit 310 generates route guidance information for guiding the user to the stop-off point set in step S1006 and outputs it to the navigation control unit 301. The navigation control unit 301 controls the navigation device 300 based on the route guidance information, guides the route (step S1007), and ends the series of processes.
  • step S 1004 if the passenger is not in the specific behavior state (step S 1004: No), the process returns to step S 1002 and the same processing is repeated. If the passenger is not in a specific motion state, the passenger may be notified of this fact and prompted to set a stop point. In addition, when the state is not a specific behavior state, a route may be guided in advance by setting a break point as a stop point every predetermined time.
  • the configuration is such that the behavior of the passenger is photographed (Step S1002) after waiting for the request for route guidance (Step S1001: Yes). It may be configured so that the behavior of the passenger is photographed before the force route guidance is requested (step S1002). For example, intensify and photograph the passenger's behavior at predetermined intervals during boarding, when the vehicle engine is started, when the passenger is operating, and during travel (step S 1 002). You may wait. Alternatively, the behavior of passengers can be photographed at predetermined intervals, and behaviors such as fatigue and sleepiness can be promptly guided to resting points. As good as.
• Furthermore, although behavior information is generated by photographing the movement of the passenger's eyeballs in step S1002, behavior information may also be generated from the opening and closing of a window, the movement of the passenger's entire body, the sound volume inside the vehicle, and the like. For example, if a passenger opens a window, this may indicate that the passenger feels hot or uncomfortable, and behavior information reflecting this may be generated; behavior information may likewise be generated from whole-body movement and sound volume.
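• As a minimal sketch of this idea, the signal names and thresholds below are assumptions for illustration only, not values taken from the embodiment; they merely show how coarse behavior information could be derived from such non-camera signals.

def infer_behavior_from_cabin_signals(window_open_ratio,
                                      body_movement_level,
                                      cabin_volume_db):
    # Generate coarse behavior information from non-camera signals.
    states = []
    if window_open_ratio > 0.5:
        # A widely opened window may mean the passenger feels hot or uncomfortable.
        states.append("possibly hot or uncomfortable")
    if body_movement_level < 0.1 and cabin_volume_db < 40:
        # Very little movement in a quiet cabin may indicate sleepiness.
        states.append("possibly sleepy")
    if cabin_volume_db > 75:
        states.append("lively conversation or loud audio")
    return states

print(infer_behavior_from_cabin_signals(0.8, 0.05, 35))
# -> ['possibly hot or uncomfortable', 'possibly sleepy']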
• The setting of the stop point in step S1006 may also be configured such that a request to use a facility is made via the communication unit 308 as necessary, and the point whose request is accepted is set as the stop point. More specifically, a reservation may be made at a restaurant for a break according to the number of passengers, and that restaurant may be set as the stop point.
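• The following is a minimal sketch of such reservation-based stop point setting; request_reservation is a hypothetical callable standing in for whatever exchange the communication unit 308 actually performs, and the facility names are illustrative.

def set_stop_point_with_reservation(candidates, party_size, request_reservation):
    # Ask each candidate facility, in order, whether it can accept the party
    # (the request would go out via the communication unit 308); the first
    # facility that accepts becomes the stop point.
    for facility in candidates:
        if request_reservation(facility, party_size):
            return facility
    return None  # no facility accepted; fall back to another stop point

# Example with a stand-in reservation function that only "Rest stop B" accepts:
chosen = set_stop_point_with_reservation(
    ["Cafe A", "Rest stop B"], 4,
    lambda facility, size: facility == "Rest stop B")
print(chosen)  # -> Rest stop B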
• Although the processing using the passenger information and the processing using the behavior information have been described separately with reference to Figs. 9 and 10, the apparatus may also be configured to combine both functions.
• In the above description, the passenger is photographed by the passenger photographing unit 313, such as a camera, and the identification information or behavior information of the passenger is generated. However, the apparatus may also be configured to generate the identification information or behavior information of the passenger from information acquired by other sensors.
• For example, the identification information or behavior information may be generated using a seating sensor that detects the load distribution and total load on the seat in which the passenger sits.
  • One or more fingerprint sensors may be provided at predetermined positions in the vehicle. The fingerprint sensor can obtain the passenger's fingerprint information and identify the passenger.
• Alternatively, a voice sensor such as a microphone may be provided in the vehicle. Voice information such as the passenger's volume, voice quality, and pitch can be obtained by the voice sensor, so that the passenger can be identified and the number of passengers, gender, sleepiness, and the like can be determined.
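• For illustration only, the sketch below matches seat-sensor readings against pre-registered passenger profiles; the profile format, tolerances, and passenger names are assumptions introduced here rather than part of the embodiment.

# Seat-load profiles assumed to be registered in advance for each passenger.
REGISTERED_PROFILES = {
    "passenger_a": {"total_load": 62.0, "distribution": (0.55, 0.45)},
    "passenger_b": {"total_load": 81.0, "distribution": (0.48, 0.52)},
}

def identify_by_seat_sensor(total_load, distribution,
                            load_tol=3.0, dist_tol=0.05):
    # Return the registered passenger whose stored profile is closest to the
    # measured seat-sensor values, or None if nothing matches.
    for name, profile in REGISTERED_PROFILES.items():
        load_ok = abs(profile["total_load"] - total_load) <= load_tol
        dist_ok = all(abs(a - b) <= dist_tol
                      for a, b in zip(profile["distribution"], distribution))
        if load_ok and dist_ok:
            return name
    return None

print(identify_by_seat_sensor(61.2, (0.54, 0.46)))  # -> passenger_a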
• As described above, even if the user does not set point information by his or her own input operation, point information can be output based on the passenger information and the behavior of the passenger. Therefore, point information suited to the passenger can be output efficiently and accurately.
• Furthermore, the passenger is photographed to generate the identification information and behavior information of the passenger, and because the generated information can be compared with the registered identification information and the specific behavior information to set the destination point or stop point, a point that is optimal for the passenger's situation can be set without the passenger setting it himself or herself.
• In addition, since the stop point is set in accordance with the behavior of the passenger, when the driver shows drowsiness, guiding the driver to a place where a break can be taken helps relieve the drowsiness and supports safe driving.
• Moreover, since the destination point or stop point can be set based on the passenger's history of destination points or stop points, it is possible, for example, to avoid repeatedly guiding the passenger to the most recently visited point, or conversely to guide the passenger to a point that has a good reputation with the passenger. Therefore, the point setting can be optimized even if the passenger does not set the destination point or stop point himself or herself.
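• A minimal sketch of such history-based point setting follows; the history format, the ratings, and the rule of skipping the most recent entries are assumptions used only to illustrate the idea.

def choose_point_from_history(history, recent_window=3):
    # history: list of (point_name, rating) pairs ordered oldest -> newest.
    # Skip anything visited within the last `recent_window` entries and
    # return the highest-rated remaining point, or None.
    recent = {name for name, _ in history[-recent_window:]}
    candidates = [(name, rating) for name, rating in history[:-recent_window]
                  if name not in recent]
    if not candidates:
        return None
    return max(candidates, key=lambda item: item[1])[0]

history = [("Cafe A", 4), ("Park B", 5), ("Cafe A", 4),
           ("Diner C", 2), ("Museum D", 5), ("Cafe A", 4)]
print(choose_point_from_history(history))  # -> Park B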
• The content providing method described in the first and second embodiments can be realized by executing a program prepared in advance on a computer such as a personal computer or a workstation.
• This program is recorded on a computer-readable recording medium such as a hard disk, a flexible disk, a CD-ROM, an MO, or a DVD, and is executed by being read from the recording medium by the computer.
• The program may also be a transmission medium that can be distributed via a network such as the Internet.

Landscapes

  • Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Remote Sensing (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)

Abstract

A content providing device (100) provides content in a mobile body. A passenger identifying section (101) identifies a passenger of the mobile body, and a passenger information obtaining section (102) obtains information relating to the passenger (hereinafter referred to as "passenger information") identified by the passenger identifying section (101). A content determining section (103) then determines, based on the passenger information obtained by the passenger information obtaining section (102), the content to be provided, and a content output section (104) outputs the content determined by the content determining section (103).
PCT/JP2006/315405 2005-08-19 2006-08-03 Dispositif de fourniture de contenu, méthode de fourniture de contenu, programme de fourniture de contenu et support d’enregistrement lisible par ordinateur WO2007020808A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2005238717 2005-08-19
JP2005-238718 2005-08-19
JP2005238718 2005-08-19
JP2005-238717 2005-08-19

Publications (1)

Publication Number Publication Date
WO2007020808A1 true WO2007020808A1 (fr) 2007-02-22

Family

ID=37757471

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2006/315405 WO2007020808A1 (fr) 2005-08-19 2006-08-03 Dispositif de fourniture de contenu, méthode de fourniture de contenu, programme de fourniture de contenu et support d’enregistrement lisible par ordinateur

Country Status (1)

Country Link
WO (1) WO2007020808A1 (fr)


Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003306091A (ja) * 2002-04-15 2003-10-28 Nissan Motor Co Ltd ドライバ判定装置
JP2004037292A (ja) * 2002-07-04 2004-02-05 Sony Corp ナビゲーション装置、ナビゲーション装置のサービス装置及びナビゲーション装置によるサービス提供方法

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015195014A (ja) * 2014-03-28 2015-11-05 パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America 情報提示方法
JP2019164477A (ja) * 2018-03-19 2019-09-26 本田技研工業株式会社 情報提供システム、情報提供方法、及びプログラム
CN110287386A (zh) * 2018-03-19 2019-09-27 本田技研工业株式会社 信息提供系统、信息提供方法以及介质

Similar Documents

Publication Publication Date Title
JP4533897B2 (ja) 処理制御装置、そのプログラム、および、そのプログラムを記録した記録媒体
EP1267590A2 (fr) Système et procédé de représentation de contenus
JP2003109162A (ja) エージェント装置
WO2007032278A1 (fr) Terminal de communication, appareil de guidage, procede de guidage et support d'enregistrement
JP2007108134A (ja) コンテンツデータ再生装置
JP3752159B2 (ja) 情報記録装置
JP2024041746A (ja) 情報処理装置
JP4903505B2 (ja) ナビゲーション装置及び方法、ナビゲーションプログラム、並びに記憶媒体。
US20220299334A1 (en) Recommendation information providing method and recommendation information providing system
JP2019132928A (ja) 車両用楽曲提供装置
WO2007023900A1 (fr) Dispositif, procédé et programme de fourniture de contenu et support d'enregistrement lisible par ordinateur
WO2007020808A1 (fr) Dispositif de fourniture de contenu, méthode de fourniture de contenu, programme de fourniture de contenu et support d’enregistrement lisible par ordinateur
WO2006095688A1 (fr) Dispositif de reproduction d’informations, méthode de reproduction d’informations, programme de reproduction d’informations et support lisible par ordinateur
JP6785889B2 (ja) サービス提供装置
WO2007043464A1 (fr) Dispositif de contrôle de sortie, procédé de contrôle de sortie, programme de contrôle de sortie et support d’enregistrement lisible par un ordinateur
JP6884605B2 (ja) 判定装置
WO2007108337A1 (fr) Dispositif de reproduction de contenu, procede de reproduction de contenu, programme de reproduction de contenu et support d'enregistrement lisible par ordinateur
JP2018097474A (ja) 情報提供システム
CN113450177A (zh) 信息提供系统、装置及其控制方法、服务器、记录介质
JP2006190206A (ja) 処理装置、その方法、そのプログラム、および、そのプログラムを記録した記録媒体
JP2008305239A (ja) 通信装置及びプログラム
JP4741933B2 (ja) 情報提供装置、情報提供方法、情報提供プログラムおよびコンピュータに読み取り可能な記録媒体
JP2001304896A (ja) 車両用ナビゲーション装置
US20210236930A1 (en) Information processing apparatus, information processing system, non-transitory computer readable medium, and vehicle
JP2019163984A (ja) 情報提供装置およびその制御方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 06782265

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP