WO2024042359A1 - Information processing method and information processing device - Google Patents

Information processing method and information processing device

Info

Publication number
WO2024042359A1
Authority
WO
WIPO (PCT)
Prior art keywords
display, notification information, information, display section, face image
Application number
PCT/IB2023/000484
Other languages
French (fr)
Japanese (ja)
Inventor
睿申 夏
美友紀 茂田
Original Assignee
Nissan Motor Co., Ltd. (日産自動車株式会社)
Renault S.A.S. (ルノー エス.ア.エス.)
Application filed by Nissan Motor Co., Ltd. and Renault S.A.S.
Publication of WO2024042359A1

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00 - Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/02 - Arrangements for holding or mounting articles, not otherwise provided for, for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/02 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed
    • G09G5/36 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/37 - Details of the operation on graphic patterns

Definitions

  • the present invention relates to an information processing method and an information processing apparatus for notifying users of information.
  • There are known robots that can communicate with users.
  • For example, a device has been proposed that communicates with a user by displaying information such as a character string on a display device that also displays a facial image (see, for example, JP2019-124855A).
  • the purpose of the present invention is to more appropriately notify users of information.
  • One aspect of the present invention is an information processing method that provides notification information to a user using an agent device that has a display device that displays a facial image.
  • This information processing method includes a control process that controls the display mode of the display device. In this control process, when notification information is to be displayed on the display device, a switching effect using the display device is executed, and after the display of the face image is erased, the notification information is displayed.
  • FIG. 1 is a diagram showing a simplified example of the configuration of the interior of a vehicle.
  • FIG. 2 is a simplified front view showing an example of the external configuration of the agent device.
  • FIG. 3 is a diagram showing an example of the system configuration of an information processing system installed in a vehicle.
  • FIG. 4 is a diagram showing an example of transition when providing notification information using two display units.
  • FIG. 5 is a diagram showing an example of transition when providing notification information using two display units.
  • FIG. 6 is a diagram showing an example of transition when providing notification information using two display units.
  • FIG. 7 is a diagram illustrating a transition example in the case where notification information is provided using two display sections realized by a flat display panel rotatably provided in the face section.
  • FIG. 8 is a diagram showing an example of transition when providing notification information using a slideable display section and a fixed display section.
  • FIG. 9 is a diagram illustrating an example of transition when providing notification information using two display units.
  • FIG. 10 is a diagram showing an example of transition when providing notification information using one display unit.
  • FIG. 11 is a flowchart illustrating an example of notification information output processing in the information processing device.
  • FIG. 12 is a flowchart illustrating an example of notification information output processing in the information processing device.
  • FIG. 13 is a flowchart illustrating an example of notification information output processing in the information processing device.
  • FIG. 1 is a diagram showing a simplified example of the configuration of the interior of a vehicle C1. Note that FIG. 1 shows an example of the external appearance when the front side of the driver's seat and passenger seat (not shown) is viewed from the rear side of the vehicle C1. Further, in FIG. 1, for ease of explanation, illustrations other than the dashboard 2, steering wheel 3, front window 4, rearview mirror 5, camera 101, and agent device 200 are omitted.
  • the agent device 200 is a small robot installed on the dashboard 2 of the vehicle C1.
  • the agent device 200 is a robot imitating a human.
  • FIG. 1 shows an example in which the agent device 200 is installed on the dashboard 2, the present invention is not limited to this.
  • the agent device 200 may be installed above the front window 4.
  • FIG. 1 shows an example in which the agent device 200 is a robot imitating a human
  • the present invention is not limited to this.
  • A robot imitating an animal such as a rabbit or a pig, a robot imitating a virtual creature (e.g. the face of an anime character), or a robot imitating another object (e.g. a television-type or radio-type device) may also be used as the agent device 200. In this way, a simulated agent can be used as the agent device 200.
  • The agent device 200 executes various operations based on instructions from the information processing device 110 (see FIG. 3). For example, the agent device 200 outputs various information related to driving support when the user performs a driving operation, based on the control of the information processing device 110. As this driving support, notification of a moving object ahead or behind is assumed. For example, as a notification of a moving object ahead, it is possible to output a voice saying, "There is a railroad crossing ahead, so be careful," or "There are people ahead." In this way, the agent device 200 performs driving assistance.
  • Notification means to convey or inform some kind of information. Hereinafter, information to be notified to the user of the vehicle C1 (an occupant of the vehicle C1) will be referred to as notification information.
  • the notification information may be communicated to the user through image display or audio output. This embodiment mainly shows an example in which notification information is conveyed to the user by displaying an image. Note that the notification shown in this embodiment may also be referred to as notification, transmission, or the like.
  • the camera 101 is installed on the ceiling inside the vehicle C1, and captures an image of a subject inside the vehicle C1 to generate an image (image data).
  • the camera 101 is configured by, for example, one or more camera devices or image sensors capable of capturing an image of a subject.
  • the camera 101 can be provided above the front window 4, that is, above the rearview mirror 5.
  • Although FIG. 1 shows an example including at least one camera 101, it is also possible to include two or more imaging devices and use images from all or some of these imaging devices.
  • the installation location of each imaging device is not limited to the example shown in FIG. 1, and can be changed as appropriate.
  • One or more devices capable of capturing objects present in all directions around the vehicle C1 and objects inside the vehicle C1, such as a 360-degree camera, may also be used.
  • FIG. 2 is a simplified front view showing an example of the external configuration of the agent device 200.
  • a face image, notification information, etc. are displayed on the display unit 210.
  • the agent device 200 can be turned on in response to a user operation regarding the agent device 200.
  • the on/off operation of the vehicle C1 means an on or off operation of a start key related to starting or stopping the vehicle C1.
  • FIG. 2(A) shows a front view of the agent device 200 in a normal state.
  • the agent device 200 includes a substantially box-shaped body part 202 and a substantially box-shaped face part 201 arranged vertically via a connecting part 203. Further, the face section 201 is provided with a display section 210 that displays various images. A body image showing the body is displayed on the surface of the body part 202.
  • For the display section 210, various display media, such as a light-emitting sheet, a liquid crystal display, or an organic EL (Electro Luminescence) display, can be used.
  • the connecting portion 203 is recognized as a portion corresponding to the neck of a living creature.
  • the display section 210 of the face section 201 displays various parts that make up the agent's face, such as eyes 211, nose 212, and mouth 213.
  • the agent device 200 is installed on the dashboard 2 of the vehicle C1 in a direction in which the eye portion 211 looks inside the vehicle.
  • the side where the display unit 210 of the agent device 200 is provided will be referred to as the front side of the agent device 200, and the side opposite to the front side of the agent device 200 will be referred to as the rear side of the agent device 200.
  • the left side in FIG. 2 will be referred to as the right side of the agent device 200, and the right side in FIG. 2 will be referred to as the left side of the agent device 200.
  • FIG. 2(B) shows a front view of the agent device 200 with each part of the face 201 facing left.
  • FIG. 2C shows a front view of the agent device 200 with each part of the face section 201 facing to the right.
  • By moving the black parts of the eyes 211 to the left side, it is possible to make the face section 201 appear to be facing left.
  • By moving the black parts of the eyes 211 to the right side and making the mouth 213 an oval shape elongated in the vertical direction, it is possible to make the face section 201 appear to be facing right.
  • FIG. 2(D) shows a front view of the agent device 200 in a state where the face section 201 is expressing an effect that makes the user feel that it is like a living thing, as a switching effect before displaying the notification information.
  • As an effect that makes the agent feel like a living thing, for example, a display that conveys a surprised expression to the user can be used.
  • Specifically, the eyes 211 are formed into oval shapes elongated in the vertical direction, the black parts of the eyes 211 are moved downward, and the mouth 213 is formed into an oval shape elongated in the vertical direction.
  • This makes the expression of the face section 201 look surprised.
  • Furthermore, by displaying an exclamation mark image 214 to draw attention, or by outputting surprising audio information, it is possible to express a state of being even more surprised.
  • It is assumed that the user, surprised by the agent device 200, turns their attention to the display unit 210 to see what will happen next.
  • By making the agent device 200 look surprised, it is possible to make the user feel that it is like a living thing.
  • By using effects with various images, such as the exclamation mark image 214, it is possible to further attract the user's attention.
  • Various effects using means other than images, for example, effects using movements of the face section 201, may also be performed to attract the user's attention.
  • the effects using the movement of the face 201 of the agent device 200 will be described in detail with reference to FIGS. 5 to 9.
  • FIGS. 2(E) and 2(F) show front views of the agent device 200 in a state where notification information (notification images 400, 410) is displayed on the display unit 210.
  • FIG. 2E shows an example of displaying a notification image 400 on the display unit 210 for notifying the user of fastening the seat belt.
  • FIG. 2F shows an example in which a notification image 410 for notifying that the remaining battery level of the vehicle C1 is decreasing is displayed on the display unit 210.
  • These notification information items are just examples, and other information may be displayed as notification information.
  • a notification image may be displayed on the display unit 210 to notify that any door of the vehicle C1 is ajar. As notification information regarding this ajar door, image information indicating the door can be displayed on the display unit 210.
  • FIG. 2 shows an example in which the face part 201 and the body part 202 are configured as separate bodies, the face part 201 and the body part 202 may be configured as an integrated housing.
  • In this embodiment, a face image is displayed as an image representing a creature-like appearance (living-creature display), but another image (for example, the agent's whole body, a mechanical object, or a virtual face) may be displayed as the creature-like image.
  • the agent device 200 installed on the dashboard 2 is a three-dimensional anthropomorphic robot, it is assumed that its presence is recognized by the user of the vehicle C1. For this reason, for example, when notifying the user of the vehicle C1 of some information, it is conceivable to do so using the agent device 200.
  • the display unit 210 which previously displayed a face image that gives the user a feeling of being a living creature, may switch from the face image to display information. For example, at the timing when the face image shown in any one of FIGS. 2(A) to (C) is displayed on the display unit 210, the notification information shown in either FIG. 2(E) or (F) is displayed on the display unit 210.
  • However, after the transition from the face image that gives the impression of a living creature to the notification information that does not, the display may lack the appearance of a living creature.
  • In this case, the user's interest in the notification information displayed on the display unit 210 may decrease due to the lack of living-creature-likeness, and it may become difficult to appropriately notify the user of the notification information. Therefore, when displaying notification information, it is important to make the user feel that the agent is like a living thing at the time of switching, thereby increasing the user's interest in the displayed notification information and conveying the notification information to the user more appropriately.
  • Therefore, in this embodiment, when displaying notification information on the display section 210, a switching effect using the display section 210 is performed and the display of the face image on the display section 210 is erased before the notification information is displayed. For example, at the timing when the face image shown in any one of FIGS. 2(A) to (C) is displayed on the display unit 210, the notification information shown in either FIG. 2(E) or (F) is displayed on the display unit 210.
  • Specifically, the switching effect is executed by displaying the face image shown in FIG. 2(D) on the display unit 210 before displaying the notification information. In this way, by performing a switching effect that makes the user feel that the agent is like a living creature before the notification information is displayed, the user's interest in the displayed notification information is increased, and it becomes possible to convey the notification information to the user more appropriately.
  • FIG. 2 is an example, and other switching effects may be performed.
  • FIG. 10 shows another example of switching effect using the same display screen.
  • a switching effect using a plurality of display screens may be performed. An example of this is shown in FIG.
  • These switching effects can be executed by switching processing using software. Moreover, these switching effects make it possible to exclusively display the face image and the notification information.
  • Alternatively, a switching effect using physical movement of the face section 201 may be performed. Examples of this are shown in FIGS. 5 to 9. These switching effects can be executed by switching processing using physical movements such as rotation and parallel movement, and they make it possible to display the face image and the notification information exclusively. Note that even when performing a switching effect using these physical movements, it is possible to make the display position of the face image and the display position of the notification information the same. In this case, the notification information is displayed at the position of the face image after the switching effect that makes the user feel that the agent is like a living thing, so it is possible to further increase the user's interest in the notification information and to notify the user more appropriately.
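As an illustration only, the control sequence described above (face image, then switching effect, then erasure, then notification) can be sketched in Python. All class and function names here are hypothetical and are not part of the disclosed embodiment; the sketch assumes the display shows exactly one content item at a time (exclusive display).

```python
from dataclasses import dataclass
import time


@dataclass
class Display:
    """Minimal stand-in for the display section 210."""
    content: str = "face"

    def show(self, content: str) -> None:
        self.content = content


def show_notification(display: Display, notification: str,
                      effect_sec: float = 0.0) -> list:
    """Sketch of the control process: run a switching effect (surprised
    face), erase the face image, then display the notification exclusively."""
    sequence = []
    display.show("surprised_face")   # switching effect, as in FIG. 2(D)
    sequence.append(display.content)
    time.sleep(effect_sec)           # hold the effect briefly
    display.show("blank")            # erase the face image display
    sequence.append(display.content)
    display.show(notification)       # notification, as in FIG. 2(E)/(F)
    sequence.append(display.content)
    return sequence
```

Because each `show` call replaces the previous content, the face image and the notification information are never displayed at the same time, which mirrors the exclusive display described above.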
  • FIG. 3 is a diagram showing an example of the system configuration of the information processing system 100 installed in the vehicle C1.
  • the information processing system 100 includes a camera 101, a position information acquisition sensor 102, an audio input unit 103, sensors 104, an information processing device 110, and an agent device 200.
  • the information processing device 110 and the agent device 200 are connected by a communication method using wired communication or wireless communication.
  • the information processing device 110 is connected to the network 20 by a communication method using wireless communication.
  • the network 20 is a network such as a public line network or the Internet.
  • the agent device 200 may also be connected to the network 20 using a communication method using wireless communication.
  • FIG. 3 shows an example in which the information processing device 110 and the agent device 200 are configured as separate devices, the information processing device 110 and the agent device 200 may be configured as an integrated device.
  • the camera 101 captures an image of a subject and generates an image (image data) under the control of the information processing device 110, and outputs image information regarding the generated image to the information processing device 110.
  • the camera 101 is provided at least inside the vehicle C1, and captures an image of a subject inside the vehicle C1 to generate an image (image data).
  • FIG. 1 shows a camera 101 provided inside the vehicle C1.
  • the camera 101 includes, for example, one or more camera devices or image sensors capable of capturing an image of a subject.
  • For example, one camera 101 may be provided at the front of the vehicle C1 to capture an image of a subject ahead of the vehicle C1 and generate an image (image data), and another camera 101 may be provided at the rear of the vehicle C1 to capture an image of a subject behind the vehicle C1 and generate an image (image data).
  • the position information acquisition sensor 102 acquires position information regarding the position where the vehicle C1 is present, and outputs the acquired position information to the information processing device 110.
  • the position information includes various data related to the position such as latitude, longitude, altitude, etc. at the time of receiving the GNSS signal.
  • the location information may be acquired using other location information acquisition methods. For example, location information may be derived using information from nearby access points and base stations. Alternatively, location information may be acquired using a beacon. For example, based on the information acquired by the position information acquisition sensor 102, it is possible to determine the state of the vehicle C1, for example, whether it is running, stopped, or moving backward.
  • It is also possible to determine whether the vehicle C1 is near a facility outside the vehicle C1, such as a coffee shop, based on the position information acquired by the position information acquisition sensor 102.
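As a hedged sketch of how a vehicle state (running or stopped) might be derived from consecutive GNSS fixes, the following computes the haversine distance between two latitude/longitude fixes and compares the implied speed against a threshold. The threshold value and all names are illustrative assumptions, not part of the disclosed embodiment.

```python
import math


def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in meters between two GNSS fixes."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def vehicle_state(prev_fix: tuple, curr_fix: tuple, dt_sec: float,
                  stop_threshold_mps: float = 0.5) -> str:
    """Classify the vehicle as 'running' or 'stopped' from two position
    fixes taken dt_sec seconds apart."""
    dist = haversine_m(prev_fix[0], prev_fix[1], curr_fix[0], curr_fix[1])
    speed = dist / dt_sec
    return "stopped" if speed < stop_threshold_mps else "running"
```

Determining reverse motion would additionally require heading information, which a bare position fix does not carry; in practice the vehicle speed sensor mentioned below would be a more direct source.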
  • The audio input unit 103 is provided inside the vehicle C1, acquires sounds inside the vehicle C1 based on the control of the information processing device 110, and outputs sound information regarding the acquired sounds to the information processing device 110.
  • As the audio input unit 103, for example, one or more microphones or sound acquisition sensors can be used.
  • the sensors 104 are various sensors installed in the vehicle C1, and output detection information acquired by each sensor to the information processing device 110.
  • the sensors include a human sensor, a distance sensor, a vehicle speed sensor, an acceleration sensor, a seating sensor, a seatbelt sensor, a door sensor, and a battery sensor. Note that these are just examples, and other sensors may be used. Further, only some of these sensors may be used.
  • the human sensor is a sensor that detects the presence or absence, number of people, position, state, etc. of people inside the vehicle C1. For example, it is possible to use a human sensor using infrared rays, ultrasonic waves, visible light, an image sensor, or the like. For example, if a person is seated in the driver's seat, passenger seat, or rear seat, the presence of that person can be detected by a human sensor. Further, by using both the human sensor and the seating sensor, it is possible to improve the accuracy of detecting the seating state of each seat.
  • the distance sensor is a sensor that detects the distance to a person existing inside the vehicle C1, the distance to a notification target, etc.
  • Various known ranging sensors can be used as the distance sensor.
  • the seating sensor (or seat sensor) is a sensor that detects the presence or absence of an occupant sitting in each seat of the vehicle C1.
  • the seat belt sensor is a sensor that detects whether or not the occupant seated in each seat of the vehicle C1 is wearing a seat belt.
  • the door sensor is a sensor that detects whether each door of the vehicle C1 is ajar.
  • the battery sensor is a sensor for measuring the remaining amount of the battery installed in the vehicle C1. For each of these sensors, known sensors can be used.
  • the information processing device 110 includes a control section 120, a storage section 130, and a communication section 140.
  • the communication unit 140 exchanges various information with other devices using wired communication or wireless communication under the control of the control unit 120.
  • the control unit 120 controls each unit based on various programs stored in the storage unit 130.
  • the control unit 120 is realized by, for example, a processing device such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit). For example, it is possible to increase the calculation speed by performing image processing using a GPU. Further, by executing parallel calculations using GPU, it is possible to further increase the calculation speed.
  • Note that the control unit 120 may also be realized by a vehicle ECU (Electronic Control Unit).
  • the control unit 120 has a conversion function that converts characters into speech. This conversion function is realized by, for example, TTS (Text to Speech).
  • The control unit 120 executes control processing to control the operating state of the agent device 200 based on each piece of information output from the camera 101, the position information acquisition sensor 102, the audio input unit 103, the sensors 104, the communication unit 140, and the like.
  • the control unit 120 includes a display switching determination unit 121 and an agent control unit 122.
  • The display switching determination unit 121 determines whether to switch the content displayed on the display unit 210 based on each piece of information output from the camera 101, the position information acquisition sensor 102, the audio input unit 103, the sensors 104, the communication unit 140, and the like, and outputs the determination result to the agent control unit 122. Specifically, the display switching determination unit 121 determines the timing of switching from the face image to the notification information.
  • Specifically, the display switching determination unit 121 determines whether notification information has occurred based on each piece of information output from the camera 101, the position information acquisition sensor 102, the audio input unit 103, the sensors 104, the communication unit 140, and the like. For example, based on the seat belt sensor, which detects whether a seat belt is worn, and the seating sensor (or seat sensor), which detects the presence or absence of an occupant in each seat of the vehicle C1, it is possible to determine whether each occupant of the vehicle C1 is wearing a seat belt. Therefore, if it is determined that an occupant is not wearing a seat belt even though he or she is seated, it is determined that notification information has occurred.
  • Further, the display switching determination unit 121 may adjust the timing of switching from the face image to the notification information based on the importance level of the notification information.
  • The timing of switching from the face image to the notification information may also be adjusted based on whether or not the face image is displayed. A method for adjusting these switching timings will be described in detail with reference to FIGS. 12 and 13.
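The determination and timing rules above can be sketched as follows. The importance scale, the delay values, and all names are hypothetical assumptions for illustration; the document itself defers the actual timing methods to FIGS. 12 and 13.

```python
from dataclasses import dataclass


@dataclass
class SeatStatus:
    """Combined reading of the seating sensor and the seat belt sensor
    for one seat."""
    occupied: bool
    belt_fastened: bool


def notification_occurred(seats: list) -> bool:
    """Notification information occurs when any occupied seat has an
    unfastened seat belt."""
    return any(s.occupied and not s.belt_fastened for s in seats)


def switch_delay_sec(importance: int, face_displayed: bool) -> float:
    """Hypothetical timing rule: higher importance switches sooner, and
    if no face image is currently displayed there is nothing to erase,
    so no switching delay is needed."""
    if not face_displayed:
        return 0.0
    return {3: 0.5, 2: 1.0, 1: 2.0}.get(importance, 2.0)
```

A real implementation would fold these checks into the event loop of the display switching determination unit 121, polling the sensors 104 and forwarding the result to the agent control unit 122.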
  • The agent control unit 122 controls the operating state of the agent device 200 based on the determination result output from the display switching determination unit 121. For example, the agent control unit 122 causes the display unit 210 to display each part (the eyes 211, the nose 212, and the mouth 213) that makes up the face of the simulated agent. Further, the agent control unit 122 causes the display unit 210 to display information (notification information) to be notified to the occupants of the vehicle C1. Further, the agent control unit 122 causes the sound output unit 220 to output the voice of the simulated agent, the voice corresponding to information to be notified to the occupants of the vehicle C1, and the like.
  • the agent control unit 122 determines notification information to be displayed on the display unit 210 and notification information to be output from the sound output unit 220, and executes control to output the notification information.
  • Notification information regarding objects existing inside the vehicle C1 is stored in the notification information DB 132. Examples of outputting each piece of notification information are shown in FIGS. 2 and 4 to 10. For example, if the notification information is about wearing a seat belt, an image showing the seat belt (notification image 400 (see FIG. 2(E))) is displayed as the notification information. Further, if the notification information is related to the remaining battery level, an image indicating the battery (notification image 410 (see FIG. 2(F))) is displayed as the notification information. Further, if the notification information is about an ajar door, an image showing the door is displayed as the notification information.
  • the agent control unit 122 executes control to cause the agent device 200 to perform a switching effect. This switching effect will be explained in detail with reference to FIGS. 4 to 10.
  • the storage unit 130 is a storage medium that stores various information.
  • the storage unit 130 stores various information (for example, a control program, an agent information DB 131, a notification information DB 132, and a map information DB 133) necessary for the control unit 120 to perform various processes.
  • The storage unit 130 also stores various information acquired via the communication unit 140. Examples of the storage unit 130 that can be used include ROM (Read Only Memory), RAM (Random Access Memory), SRAM (Static Random Access Memory), HDD (Hard Disk Drive), SSD (Solid State Drive), or a combination thereof.
  • The agent information DB 131 stores information necessary to realize various operations of the agent device 200. For example, facial image information (e.g., the eyes 211, the nose 212, and the mouth 213) displayed on the display unit 210 and audio information output from the sound output unit 220 are stored in the agent information DB 131. Further, for example, operation information for operating the face section 201 (see FIGS. 5 to 9) when performing a switching effect is stored in the agent information DB 131.
  • The notification information DB 132 stores information necessary for outputting notification information regarding objects existing inside or outside the vehicle C1. For example, in the case of notification information regarding an ajar door, image information indicating the door and audio information for notifying of the ajar door are stored in the notification information DB 132 as notification information. For example, in the case of notification information regarding the fastening of a seat belt, image information indicating the seat belt (notification image 400 (see FIG. 2(E))) and audio information for notifying of the fastening of the seat belt are stored in the notification information DB 132 as notification information. For example, in the case of notification information regarding the remaining battery level, image information indicating the battery (notification image 410 (see FIG. 2(F))) and audio information for notifying of the remaining battery level are stored in the notification information DB 132 as notification information.
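The lookup described above can be sketched as a simple mapping from an event type to its image and audio assets. The keys, file names, and structure below are hypothetical, since the actual contents of the notification information DB 132 are not specified beyond such image/audio pairs.

```python
# Hypothetical contents of the notification information DB 132:
# each event type maps to the image and audio assets used when the
# corresponding notification information is output.
NOTIFICATION_DB = {
    "seatbelt":  {"image": "notification_image_400.png",
                  "audio": "fasten_seatbelt.wav"},
    "battery":   {"image": "notification_image_410.png",
                  "audio": "battery_low.wav"},
    "door_ajar": {"image": "door_ajar.png",
                  "audio": "door_ajar.wav"},
}


def select_notification(event: str) -> dict:
    """Return the display/audio assets for a notification event, or
    raise ValueError if the event type is not registered in the DB."""
    try:
        return NOTIFICATION_DB[event]
    except KeyError:
        raise ValueError(f"unknown notification event: {event}")
```

In the described system, the agent control unit 122 would perform this selection and then hand the image to the display unit 210 and the audio to the sound output unit 220.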
  • the map information DB 133 stores map information such as road information regarding roads necessary for route guidance of the vehicle C1.
  • the map information includes road gradient, road intersections, the number of road lanes, road width information, and road undulation information.
  • the map information also includes road signs indicating speed limits, one-way traffic, and the like, as well as crosswalks and lane markings. Further, the map information may include information on road structures (for example, traffic lights and telephone poles), information on facilities such as buildings, information on tourist guides in the surrounding area, and the like.
  • the agent information DB 131, notification information DB 132, and map information DB 133 may be stored and used in the storage unit 130 included in the vehicle C1, or may be obtained and used from an external device via the network 20.
  • the agent device 200 is a robot device that performs various operations based on instructions from the information processing device 110.
  • the agent device 200 includes a display section 210, a sound output section 220, and a drive section 230. Note that the display section 210, the sound output section 220, and the drive section 230 are controlled by a control section (not shown) included in the agent device 200.
  • the display unit 210 displays various images based on instructions from the information processing device 110.
  • as the display unit 210, a display panel such as an organic EL (Electro Luminescence) panel or an LCD (Liquid Crystal Display) panel can be used.
  • the display unit 210 may be configured as a touch panel that allows the user to perform operation input by touching or approaching the display surface with a finger, or a separate user interface may be used.
  • Although FIG. 2 shows an example in which the agent device 200 includes one display unit 210, the agent device 200 may include multiple display units. Examples in which the agent device 200 includes a plurality of display units are shown in FIGS. 4 to 9. Note that each display section shown in FIGS. 4 to 9 corresponds to the display section 210 shown in FIG. 3.
  • the sound output unit 220 outputs various sounds based on instructions from the information processing device 110.
  • as the sound output section 220, for example, one or more speakers can be used.
  • the display section 210 and the sound output section 220 are examples of user interfaces, and some of them may be omitted or other user interfaces may be used.
  • the drive section 230 drives each section of the agent device 200 based on instructions from the information processing device 110.
  • for example, the drive section 230 is a drive device that realizes a mechanism for rotating the face portion 201 (see FIGS. 5, 6, and 9), a mechanism for moving the display panel 270 (see FIG. 7) in the face portion 201, or a mechanism for moving the display section 300 (see FIG. 8).
  • the drive section 230 is configured with motors, servo motors, and the like that can drive each section. Note that the drive unit 230 will be described in detail with reference to FIGS. 5 to 9.
  • FIG. 2 shows an example in which one display unit 210 is used to provide notification information. As described above, it is also possible to provide notification information using a plurality of display units. Therefore, FIGS. 4 to 9 show examples in which notification information is provided using a plurality of display units. Note that FIGS. 4 to 9 show examples in which notification information regarding the remaining battery level (notification image 410 (see FIG. 2(F))) is output. Further, FIG. 10 shows an example in which notification information regarding the fastening of the seat belt (notification image 400 (see FIG. 2(E))) is output.
  • FIG. 4 is a diagram showing an example of transition when providing notification information using two display units 241 and 242.
  • FIG. 4 shows an example in which the agent device 200 includes two display units 241 and 242 arranged in the vertical direction.
  • the display section 241 is installed on the face portion 201, and the display section 242 is installed on the body portion 202. That is, in the example shown in FIG. 4, the display section 241 is provided in place of the display section 210 in the agent device 200 shown in FIG. 2, and the display section 242 is provided on the body portion 202.
  • the display section 241 mainly displays facial images, while the display section 242 mainly displays notification information. For example, in normal times, various facial images are displayed on the display unit 241, similar to the examples shown in FIGS. 2(A) to 2(C).
  • the display switching determination unit 121 determines, based on each piece of information output from the camera 101, the position information acquisition sensor 102, the audio input unit 103, the sensors 104, the communication unit 140, etc., whether information to be notified to the user of the vehicle C1 (notification information) has occurred. Then, when notification information is generated, the display switching determination unit 121 outputs a notification to that effect to the agent control unit 122.
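The determination described in the bullet above can be sketched as a simple predicate over sensor-derived vehicle state. This is a minimal illustration; the class, fields, and threshold below (VehicleState, LOW_BATTERY_THRESHOLD) are assumptions, not taken from the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class VehicleState:
    """Hypothetical snapshot of conditions monitored via the sensors 104 etc."""
    door_ajar: bool = False
    seatbelt_unfastened: bool = False
    battery_level: float = 1.0  # normalized 0.0-1.0

LOW_BATTERY_THRESHOLD = 0.2  # illustrative threshold

def determine_notification(state: VehicleState) -> Optional[str]:
    """Return a notification key if any monitored condition has occurred."""
    if state.door_ajar:
        return "door_ajar"
    if state.seatbelt_unfastened:
        return "seatbelt"
    if state.battery_level < LOW_BATTERY_THRESHOLD:
        return "battery"
    return None  # no notification information generated
```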
  • the agent control unit 122 executes control to display the notification information on the display unit 242. Specifically, the agent control unit 122 executes a switching effect using the display unit 241, and after erasing the display of the face image on the display unit 241, causes the display unit 242 to display the notification information. Note that instead of performing the switching effect using the display section 241, a switching effect using the display sections 241 and 242, a switching effect using only the display section 242, etc. may be performed.
  • when notification information is generated, the agent control unit 122 executes an effect (switching effect) at a timing before displaying the notification information. Specifically, the agent control unit 122 executes the switching effect by controlling the display state of the display unit 241 so that the facial image on the display unit 241 has a surprised facial expression (see FIG. 2(D)).
  • the agent control unit 122 erases the display of the face image on the display unit 241, and causes the display unit 242 to display the notification image 410 as notification information.
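The FIG. 4 sequence described above (switching effect on the face display, erase the face image, then show the notification on the second display) can be sketched as follows. The Display class and all method names are illustrative assumptions, not the patent's implementation.

```python
# Hypothetical sketch of the FIG. 4 two-display switching sequence.
class Display:
    """Stub for a display section such as 241 or 242."""
    def __init__(self, name):
        self.name = name
        self.content = None
        self.log = []

    def show(self, image):
        self.content = image
        self.log.append(("show", image))

    def clear(self):
        self.content = None
        self.log.append(("clear", None))

def notify_two_displays(face_display, info_display, notification_image):
    face_display.show("face_surprised")    # switching effect (cf. FIG. 2(D))
    face_display.clear()                   # erase the face image
    info_display.show(notification_image)  # notification on the second display

d241, d242 = Display("241"), Display("242")
notify_two_displays(d241, d242, "notification_image_410")
```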
  • FIG. 5 is a diagram showing an example of transition when providing notification information using two display sections 252 and 253.
  • the agent device 200 includes a casing 250 (face portion 201) that can be rotated in the left-right direction; a display portion 252 (see FIGS. 5(A) and 5(B)) is installed on one surface of the casing 250, and a display unit 253 is installed on the other surface of the casing 250.
  • the display section 252 mainly displays face images, and the display section 253 mainly displays notification information.
  • the housing 250 (face portion 201) and the body portion 202 are connected by the neck portion 251, which serves as a rotation support portion.
  • the neck portion 251 is connected to a drive portion 230 (see FIG. 3), and the rotating operation of the drive portion 230 allows the casing 250 to be rotated in the left-right direction (arrow A1 direction) using the neck portion 251 as a rotation axis.
  • the rotation range of the housing 250 can be set, for example, in a range of 180 to 360 degrees.
  • the front-back direction of the housing 250 can be reversed.
  • the display unit 252 side is set to be the front side, and various facial images are displayed on the display unit 252, similar to the examples shown in FIGS. 2(A) to 2(C).
  • the display switching determination unit 121 determines whether or not notification information has been generated, and if notification information has been generated, outputs a notification to that effect to the agent control unit 122.
  • the agent control unit 122 executes control to display the notification information on the display unit 253. Specifically, the agent control unit 122 executes a switching effect using the housing 250, erases the display of the face image on the display unit 252, and then displays the notification information on the display unit 253.
  • when notification information is generated, as shown in FIGS. 5(A) and 5(B), the agent control unit 122 executes an effect (switching effect) intended to give the user a feeling of a living being at a timing before displaying the notification information. Specifically, the agent control unit 122 executes the switching effect by controlling the display state of the display unit 252 so that the facial image on the display unit 252 has a surprised facial expression (see FIG. 2(D)).
  • next, the agent control unit 122 erases the display of the face image on the display unit 252, and rotates the casing 250 to the right (in the direction of arrows A2 and A3) using the neck portion 251 as the rotation axis. Note that during this rotation, the face image on the display section 252 may be displayed, and when this rotation ends, the face image on the display section 252 may be erased. Furthermore, before the rotation starts or during this rotation, the sound output unit 220 may output audio information to give the user a feeling of a living being.
  • the agent control unit 122 displays a notification image 410 as notification information on the display unit 253 on the user side of the vehicle C1.
  • in this way, when notification information is generated, a switching effect is executed using the housing 250 (display sections 252 and 253), and after the display of the face image on the display section 252 is erased, the notification information (notification image 410) is displayed on the display section 253.
  • the display position of the face image and the display position of the notification information are the same position as viewed from the user. Therefore, it is possible to display the notification information at a position that is easily visible to the user who is viewing the face image. With these, it becomes possible to more appropriately notify the user of the notification information.
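The FIG. 5 flow can be sketched as a housing that flips 180 degrees about the neck axis so that the rear display faces the user. All names below are hypothetical, and the angle handling is a simplification of the 180-360 degree rotation range mentioned above.

```python
# Illustrative sketch (assumed names) of the FIG. 5 rotating-housing flow.
class RotatingHousing:
    def __init__(self):
        self.angle = 0          # 0 deg: display 252 faces the user
        self.front = "252"
        self.rear = "253"
        self.shown = {}

    def show(self, display, image):
        self.shown[display] = image

    def erase(self, display):
        self.shown[display] = None

    def rotate_to(self, angle):
        # simplified: the text allows a rotation range of 180-360 degrees
        assert 0 <= angle <= 360
        self.angle = angle

    def facing_user(self):
        return self.rear if self.angle % 360 == 180 else self.front

def rotate_and_notify(housing, notification_image):
    housing.show("252", "face_surprised")  # switching effect
    housing.erase("252")                   # erase the face image
    housing.rotate_to(180)                 # flip about the neck portion 251
    housing.show(housing.facing_user(), notification_image)

h = RotatingHousing()
rotate_and_notify(h, "notification_image_410")
```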
  • FIG. 6 is a diagram showing an example of transition when providing notification information using two display sections 264 and 265.
  • the agent device 200 includes a vertically rotatable housing 260 (face portion 201); a display unit 264 (see FIG. 6(A)) is installed on one surface of the housing 260, and a display section 265 is installed on the other surface of the housing 260.
  • the display section 264 mainly displays face images, and the display section 265 mainly displays notification information.
  • the casing 260 is installed inside a frame 261 that has a square shape when viewed from the front, and both ends of the casing 260 in the left-right direction and the inner surface of the frame 261 are connected via rotation support parts 262 and 263.
  • since the rotation support parts 262 and 263 are provided at positions corresponding to the ears of the face portion 201, the rotation support parts 262 and 263 may be configured as the ears of the face portion 201.
  • the rotation support parts 262 and 263 are connected to the drive section 230 (see FIG. 3), and the rotating operation of the drive section 230 allows the housing 260 (face portion 201) to be rotated in the vertical direction (arrow A10 direction) inside the frame portion 261.
  • the rotation range of the housing 260 (face portion 201) can be set, for example, in a range of 180 degrees to 360 degrees.
  • the front-back direction of the housing 260 can be reversed.
  • the display unit 264 side is set to be the front side, and various facial images are displayed on the display unit 264, similar to the examples shown in FIGS. 2(A) to 2(C).
  • the display switching determination unit 121 determines whether or not notification information has been generated, and if notification information has been generated, outputs a notification to that effect to the agent control unit 122.
  • the agent control unit 122 executes control to display the notification information on the display unit 265. Specifically, the agent control unit 122 executes a switching effect using the housing 260 and, after erasing the display of the face image on the display unit 264, causes the display unit 265 to display the notification information.
  • when notification information is generated, as shown in FIGS. 6(A) and 6(B), the agent control unit 122 executes an effect (switching effect) intended to give the user a feeling of a living being at a timing before displaying the notification information. Specifically, the agent control unit 122 executes the switching effect by controlling the display state of the display unit 264 so that the facial image on the display unit 264 has a surprised facial expression (see FIG. 2(D)).
  • the agent control unit 122 erases the display of the face image on the display unit 264, and rotates the housing 260 in the vertical direction (arrow A11, A12 direction). Note that during this rotation, the face image on the display section 264 may be displayed, and when this rotation ends, the face image on the display section 264 may be erased. Furthermore, before the rotation starts or at the time of this rotation, the sound output unit 220 may output audio information to make the user feel like a living thing.
  • the agent control unit 122 displays a notification image 410 as notification information on the display unit 265 on the user side of the vehicle C1.
  • in this way, when notification information is generated, a switching effect is performed using the housing 260 (display sections 264 and 265), and after the display of the face image on the display section 264 is erased, the notification information (notification image 410) is displayed on the display section 265.
  • the display position of the face image and the display position of the notification information are the same position as viewed from the user. Therefore, it is possible to display the notification information in a position that is easily visible to the user who is viewing the face image. With these, it becomes possible to more appropriately notify the user of the notification information.
  • the left-right rotation mechanism shown in FIG. 5 is one example; other mechanisms may also be used, for example, one in which a plate-shaped casing with display sections on both sides has a rotation axis at the center in the front-rear and left-right directions and is rotated around that rotation axis.
  • likewise, the vertical rotation mechanism shown in FIG. 6 is one example; other mechanisms may also be used, for example, one in which a plate-shaped casing with display sections on both sides has a rotation axis at the center in the front-rear and vertical directions and is rotated around that rotation axis.
  • FIGS. 5 and 6 show examples in which display sections are provided on both sides of the casings 250 and 260.
  • however, a display section may be provided only on one surface of the casings 250 and 260, and an operation of swinging the casings 250 and 260 may be performed as a switching effect. In this case, a switching effect of rotating the housing is executed, and after the display of the face image on the one display section is erased, the notification information is displayed on that same display section.
  • FIG. 7 is a diagram showing an example of transition when providing notification information using two display sections realized by a flat display panel 270 (see FIGS. 7(B) and 7(C)) that is rotatably provided on the face portion 201.
  • the display panel 270 has a first display section 271 provided on one surface, and a second display section 272 provided on the opposite surface. Further, the display panel 270 is provided with a rotation support section 273 at one end thereof, and is provided on the display section 280 so as to be rotatable about the rotation support section 273 as a rotation axis. Further, the display section 280 is installed on the face section 201.
  • the rotation support section 273 is connected to the drive section 230 (see FIG. 3), and the rotating operation of the drive section 230 allows the display panel 270 to be rotated in the vertical direction (in the direction of arrow A20 (see FIG. 7(B))) using the rotation support section 273 as a rotation axis.
  • the rotation range of the display panel 270 can be set to, for example, a range of 180 degrees.
  • the display device constituted by the display panel 270 and the display section 280 is controlled, by the rotating operation of the display panel 270, to enter either a first state in which a face image is displayed or a second state in which notification information is displayed. As shown in FIG. 7(A), in the first state, the display panel 270 is arranged in the upper area of the display section 280, and the display area consists of the first display section 271 of the display panel 270 and the lower display area 281 of the display section 280. Further, as shown in FIG. 7(C), in the second state, the display panel 270 is arranged in the lower area of the display section 280, and the display area consists of the second display section 272 of the display panel 270 and the upper display area 282 of the display section 280. Note that in the first state, a face image is mainly displayed, and in the second state, notification information is mainly displayed.
  • the movable display panel 270 functions like a lid that covers the upper display area of the display unit 280 or the lower display area of the display unit 280. Therefore, the display panel 270 can also be referred to as a foldable display section or a lid section.
  • the display switching determination unit 121 determines whether or not notification information has been generated, and if notification information has been generated, outputs a notification to that effect to the agent control unit 122.
  • the agent control unit 122 executes control to display the notification information on the display device configured by the display panel 270 and the display section 280. Specifically, the agent control unit 122 executes a switching effect using the display panel 270 and the display section 280, and after erasing the display of the face image in the first state, displays the notification information in the second state.
  • when notification information is generated, the agent control unit 122 executes an effect (switching effect) intended to give the user a feeling of a living being at a timing before displaying the notification information. Specifically, the agent control unit 122 executes the switching effect by controlling the display states of the display panel 270 and the display section 280 so that the facial image in the first state has a surprised facial expression (see FIG. 2(D)).
  • next, the agent control unit 122 erases the display of the face image on the first display section 271 of the display panel 270 and the lower display area 281 of the display section 280, and rotates the display panel 270 downward (in the direction of arrows A21 and A22) using the rotation support section 273 as a rotation axis. Note that during this rotation, the face image may be displayed, and at the end of this rotation, the face image may be erased.
  • the sound output unit 220 may output audio information to make the user feel like a living creature.
  • the agent control unit 122 displays a notification image 410 as notification information in a second state in which the display panel 270 is placed in the lower area of the display unit 280. That is, the notification image 410 is displayed in a display area formed by the second display section 272 of the display panel 270 and the upper display area 282 of the display section 280. In this way, when notification information is generated, a switching effect is performed in which the display panel 270 is rotated to transition from the first state to the second state, and the facial image is displayed in the first state. After erasing, the notification information (notification image 410) is displayed in the second state. Note that, as shown in FIG. 7C, notification information (notification image 410) may be displayed while the display panel 270 is rotating.
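The first/second state behavior of the foldable panel described above can be modeled as a small state machine. The class name and surface labels below are assumptions for illustration only.

```python
# Minimal sketch of the FIG. 7 display device: the hinged panel's position
# selects which panel surface and which half of display 280 together form
# the visible display area.
class FoldablePanelDisplay:
    def __init__(self):
        self.state = "first"  # panel 270 in the upper area of display 280

    def rotate_panel(self):
        """Rotate the display panel 270 about rotation support 273."""
        self.state = "second" if self.state == "first" else "first"

    def visible_surfaces(self):
        if self.state == "first":
            # first display section 271 + lower display area 281
            return ("271", "281")
        # second display section 272 + upper display area 282
        return ("272", "282")

dev = FoldablePanelDisplay()
dev.rotate_panel()  # transition to the second (notification) state
```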
  • in this way, when switching from the face image to the notification information, a face image with a surprised expression is displayed in the first state, and the display panel 270 is rotated.
  • by rotating the display panel 270, the agent device 200 can express an interesting movement of removing the mask of its face and revealing its true appearance, thereby increasing the user's interest in the agent device 200. Further, it is possible to give the user an interesting impression by displaying the notification information on the real face of the agent device 200 after the mask of the face of the agent device 200 is removed. With these, it is possible to give the user a feeling of a living being and to increase the user's interest in being notified of the notification information.
  • the display position of the face image and the display position of the notification information are the same position as viewed from the user. Therefore, it is possible to display the notification information at a position that is easily visible to the user who is viewing the face image. With these, it becomes possible to more appropriately notify the user of the notification information.
  • Although FIG. 7 shows an example in which the display panel 270 is rotated in the vertical direction using the rotation support portion 273 extending in the left-right direction as a rotation axis, the present invention is not limited to this. For example, the display panel 270 may be rotated in the left-right direction using a rotation support portion extending in the vertical direction as a rotation axis.
  • FIG. 8 is a diagram illustrating an example of transition when providing notification information using a slideable display section 300 and a fixed display section 310.
  • the display unit 300 is a plate-shaped housing that includes a display screen 301, and the display unit 310 is a plate-shaped housing that includes a display screen 311.
  • the display section 300 is installed in front of the display section 310 so as to be movable in the vertical direction. Specifically, the display section 300 is attached to rail sections 305 and 306 fixed to the front of the body section 202 so as to be slidable in a direction substantially parallel to the display surface of the display section 310.
  • the slide mechanism realized by the rail parts 305 and 306 can employ a known technique.
  • the size of the display screen 301 of the display unit 300 and the size of the display screen 311 of the display unit 310 may be the same (or substantially the same) or may be different sizes.
  • the size of the display screen 301 of the display unit 300 can be set to a size that can cover at least a portion of the display screen 311 of the display unit 310.
  • the display unit 300 is controlled to transition to either a first state (see FIG. 8(A)) in which a face image is displayed or a second state (see FIG. 8(C)) in which notification information is displayed.
  • as shown in FIG. 8(A), in the first state, the display section 300 is arranged in front of the display section 310 so as to cover the display screen 311 of the display section 310.
  • as shown in FIG. 8(C), in the second state, the display unit 300 in the first state is slid in a direction substantially parallel to the display screen 311 of the display unit 310, so that the display screen 311 of the display unit 310 can be viewed by the user.
  • the display section 300 is directly or indirectly connected to the drive section 230 (see FIG. 3), and is driven by the drive section 230 in the vertical direction (in the direction of arrow A30 (see FIG. 8(B))) along the rail sections 305 and 306.
  • the display switching determination unit 121 determines whether or not notification information has been generated, and if notification information has been generated, outputs a notification to that effect to the agent control unit 122.
  • the agent control unit 122 executes control to display the notification information on the display unit 310. Specifically, the agent control unit 122 executes a switching effect using the display unit 300 and, after erasing the display of the face image on the display unit 300, causes the display unit 310 to display the notification information.
  • when notification information is generated, as shown in FIGS. 8(A) and 8(B), the agent control unit 122 executes an effect (switching effect) intended to give the user a feeling of a living being at a timing before displaying the notification information. Specifically, the agent control unit 122 executes the switching effect by controlling the display state of the display unit 300 so that the facial image on the display unit 300 has a surprised facial expression (see FIG. 2(D)).
  • the agent control unit 122 erases the face image on the display unit 300 and moves the display unit 300 downward (in the direction of arrow A31) along the rails 305 and 306.
  • the face image may be displayed during this movement, and the face image may be erased at the end of this movement.
  • the sound output unit 220 may output audio information to make the user feel like a living creature.
  • the agent control unit 122 causes the display unit 310 to display a notification image 410 as notification information.
  • in this way, when notification information is generated, the display section 300 is slid downward (in the direction of arrow A31) along the rail sections 305 and 306 to transition from the first state to the second state, and the notification information (notification image 410) is displayed on the display unit 310.
  • the display position of the face image and the display position of the notification information are the same position as viewed from the user. Therefore, it is possible to display the notification information at a position that is easily visible to the user who is viewing the face image. With these, it becomes possible to more appropriately notify the user of the notification information.
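The slide-to-reveal behavior of FIG. 8 can be sketched as follows, under assumed names; the drive section's rail motion is reduced to a simple up/down position flag.

```python
# Illustrative sketch of the FIG. 8 slide mechanism: display 300 slides
# down along the rails, revealing display 310 behind it.
class SlidingDisplay:
    def __init__(self):
        self.position = "up"       # display 300 covers display 310
        self.front_content = None  # content on display screen 301
        self.rear_content = None   # content on display screen 311

    def visible_screen(self):
        return "301" if self.position == "up" else "311"

    def slide_down(self):
        """Slide display 300 down along rail sections 305 and 306."""
        self.position = "down"

def slide_and_notify(dev, notification_image):
    dev.front_content = "face_surprised"  # switching effect on display 300
    dev.front_content = None              # erase the face image
    dev.slide_down()                      # arrow A31: reveal display 310
    dev.rear_content = notification_image

dev8 = SlidingDisplay()
slide_and_notify(dev8, "notification_image_410")
```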
  • FIG. 8 shows an example in which the display unit 300 in the second state is placed in front of the body portion 202, so that a part of the body portion 202 is hidden by the display unit 300 when viewed from the user. However, the display unit 300 may be housed inside the body portion 202 so that the body portion 202 is not hidden when viewed from the user. Alternatively, an image identical to the part of the body portion 202 covered by the display unit 300 when viewed from the user may be displayed on the display unit 300.
  • FIG. 9 is a diagram showing an example of transition when providing notification information using two display sections 324 and 325.
  • FIG. 9 shows an example in which the agent device 200 includes a casing 320 (face portion 201) that is rotatable in the vertical direction.
  • FIG. 9(D) shows a perspective view of the external appearance of the vertically rotatable casing 320 (face portion 201).
  • a display section 324 (see FIG. 9(A)) is installed on one surface of the casing 320, and a display unit 325 (see FIG. 9(C)) is installed on the other surface adjacent to the surface on which the display section 324 is installed.
  • the display section 324 mainly displays face images, and the display section 325 mainly displays notification information.
  • FIG. 9 is a modification of FIG. 6, and differs in the shape of the housing 320, the positional relationship between the two display units in the housing 320, etc.
  • the casing 320 is installed inside a frame 321 that has a square shape when viewed from the front, and both ends of the casing 320 in the left-right direction and the inner surface of the frame 321 are connected via rotation support parts 322 and 323.
  • since the rotation support parts 322 and 323 are provided at positions corresponding to the ears of the face portion 201, the rotation support parts 322 and 323 may be configured as the ears of the face portion 201.
  • the rotational supports 322 and 323 are connected to a drive unit 230 (see FIG. 3), and the rotational movement of the drive unit 230 causes the housing 320 to move in the vertical direction (arrow A40 direction). In this way, the housing 320 can be rotated in the vertical direction inside the frame portion 321.
  • the rotation range of the housing 320 can be set, for example, in a range of 180 degrees to 360 degrees.
  • the two display sections 324 and 325 can be switched by rotating the housing 320 approximately 90 degrees in the vertical direction.
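The adjacent-face arrangement above implies a mapping from the housing's rotation angle to the display facing the user. The function below sketches that mapping under assumed 45-degree visibility boundaries, which are illustrative rather than taken from the patent.

```python
# Hypothetical angle-to-display mapping for the FIG. 9 cuboid housing 320:
# displays 324 and 325 sit on adjacent faces, so a ~90-degree rotation
# about supports 322/323 swaps which display faces the user.
def display_facing_user(angle_deg: float) -> str:
    """Map housing rotation angle to the display facing the user.

    0 degrees  -> display 324 (face image)
    90 degrees -> display 325 (notification information)
    """
    angle = angle_deg % 360
    if angle < 45 or angle >= 315:
        return "324"
    if 45 <= angle < 135:
        return "325"
    return "other_face"  # remaining faces of the cuboid
```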
  • the display unit 324 side is set to be the front side, and as in the examples shown in FIGS. 2(A) to (C), various faces are displayed on the display unit 324. The image is displayed.
  • the display switching determination unit 121 determines whether or not notification information has been generated, and if notification information has been generated, outputs a notification to that effect to the agent control unit 122.
  • the agent control unit 122 executes control to display the notification information on the display unit 325. Specifically, the agent control unit 122 executes a switching effect using the housing 320, erases the face image on the display unit 324, and then displays the notification information on the display unit 325.
  • when notification information is generated, as shown in FIGS. 9(A) and 9(B), the agent control unit 122 executes an effect (switching effect) intended to give the user a feeling of a living being at a timing before displaying the notification information. Specifically, the agent control unit 122 executes the switching effect by controlling the display state of the display unit 324 so that the facial image on the display unit 324 has a surprised facial expression (see FIG. 2(D)).
  • next, the agent control unit 122 erases the face image on the display unit 324, and rotates the housing 320 downward (in the direction of arrow A41) using the rotational supports 322 and 323 as rotation axes. Note that during this rotation, the face image on the display section 324 may be displayed, and when this rotation ends, the face image on the display section 324 may be erased. Furthermore, before the rotation starts or during this rotation, the sound output unit 220 may output audio information to give the user a feeling of a living being.
  • the agent control unit 122 displays a notification image 410 as notification information on the display unit 325 on the user side of the vehicle C1.
  • in this way, when notification information is generated, a switching effect is performed using the housing 320 (display sections 324 and 325), and after the face image on the display section 324 is erased, the notification information (notification image 410) is displayed on the display section 325.
  • in this way, when switching from the face image to the notification information, a face image with a surprised expression is displayed on the display unit 324, and the housing 320 is rotated.
  • by rotating the housing 320, it is possible to express a movement in which the face of the agent device 200 faces downward, and thus it is possible to increase the user's interest in the agent device 200.
  • the display position of the face image and the display position of the notification information are approximately the same position as viewed from the user.
  • the display size of the notification information is larger than the display size of the face image. Therefore, it is possible to display the notification information in a large size at a position that is easily visible to the user who is viewing the face image. With these, it becomes possible to more appropriately notify the user of the notification information.
  • a facial image is displayed on the horizontally elongated display section 324, and notification information is displayed on the vertically wide display section 325.
  • This allows the user to visually and intuitively understand that information more important than the facial image is displayed on the display unit 325.
  • the notification information can be displayed on the back near the face of the agent device 200 (or in front of the face). This can give the user the interesting impression that the notification information is displayed on the abdominal part of the agent device 200. With these, it is possible to give the user the feeling that the device is like a living thing and to increase the user's interest in the notification information.
  • a face image is displayed on the horizontally elongated display section 324, and notification information is displayed on the vertically wide display section 325, but the present invention is not limited to this.
  • the notification information may be displayed on the horizontally elongated display section 324 and the face image on the vertically wide display section 325, or the face image and the notification information may be displayed on two other adjacent surfaces.
  • although FIG. 9 shows an example using two display sections provided on two adjacent surfaces of the rectangular parallelepiped housing 320, two display sections provided on two adjacent surfaces of a cube-shaped housing 320 may also be used.
  • a single display section may also be used.
  • although FIG. 9 shows an example in which the rectangular parallelepiped housing 320 is rotated in the vertical direction, the present invention is not limited thereto.
  • for example, a rectangular parallelepiped housing configured to display a face image elongated in the vertical direction may be rotated in the left-right direction.
  • Example of providing notification information using one display section: FIGS. 5 to 9 showed examples in which notification information is displayed by switching the screen contents of a plurality of display sections.
  • FIG. 10 shows an example in which notification information is displayed by switching the screen content of one display unit.
  • FIG. 10 is a diagram showing an example of transition when providing notification information using one display unit 210.
  • FIG. 10 shows an example in which the display unit 210 has a display function that can display each image with different display characteristics.
  • the display characteristics refer to characteristics when displaying images, such as image quality, color, resolution, and presence or absence of stereoscopic viewing.
  • different display characteristics can be realized by switching between a three-dimensional image and a two-dimensional image.
  • different display characteristics can be realized by switching between high image quality and low image quality.
  • different display characteristics can be realized by switching between a color image and a monochrome image.
  • for example, by displaying the face image as a three-dimensional image on the display section 241 and displaying the notification information as a two-dimensional image on the display section 241, it becomes possible to display the face image and the notification information with different display characteristics.
  • for example, by displaying the face image on the display unit 241 as a high-quality image and displaying the notification information on the display unit 241 as a low-quality image, it is possible to display the face image and the notification information with different display characteristics.
  • the high quality face image is, for example, a colorful and beautiful image
  • the low-quality face image is, for example, at least one of a black-and-white image, a pixelated image, and a mosaic image.
  • for example, by displaying the face image on the display unit 241 as a color image and displaying the notification information on the display unit 241 as a black-and-white image, it is possible to display the face image and the notification information with different display characteristics.
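As an illustrative sketch only (the patent does not specify any implementation, and all names below are hypothetical), the display-characteristic pairing described above could be modeled as a simple selection between two presets: rich characteristics for the face image and deliberately contrasting ones for the notification information.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DisplayCharacteristics:
    """Characteristics used when rendering an image on the shared display."""
    stereoscopic: bool   # three-dimensional (True) vs. two-dimensional (False)
    high_quality: bool   # high vs. low image quality
    color: bool          # color vs. monochrome

# Hypothetical presets: the face image uses "rich" characteristics, while the
# notification information uses contrasting ones, so the user perceives the
# two as different screens.
FACE_CHARACTERISTICS = DisplayCharacteristics(
    stereoscopic=True, high_quality=True, color=True)
NOTIFICATION_CHARACTERISTICS = DisplayCharacteristics(
    stereoscopic=False, high_quality=False, color=False)

def characteristics_for(content_kind: str) -> DisplayCharacteristics:
    """Select display characteristics depending on what is being shown."""
    if content_kind == "face":
        return FACE_CHARACTERISTICS
    if content_kind == "notification":
        return NOTIFICATION_CHARACTERISTICS
    raise ValueError(f"unknown content kind: {content_kind}")
```

Switching the active preset from `FACE_CHARACTERISTICS` to `NOTIFICATION_CHARACTERISTICS` would correspond to the 3D-to-2D, high-to-low quality, or color-to-monochrome switching effects mentioned above.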
  • the display switching determination unit 121 determines whether or not notification information has been generated, and if notification information has been generated, outputs a notification to that effect to the agent control unit 122.
  • the agent control unit 122 executes control to display the notification information on the display unit 210. Specifically, the agent control unit 122 executes a switching effect using the display unit 210, erases the face image on the display unit 210, and then displays the notification information on the display unit 210.
  • the agent control unit 122 executes the switching effect. Specifically, as shown in FIG. 10(A), the agent control unit 122 controls the display state of the display unit 210 so that the face image on the display unit 210 shows a surprised expression (see FIG. 2(D)), thereby executing the switching effect.
  • the agent control unit 122 erases the face image on the display unit 210 and causes the display unit 210 to display the notification image 400 as notification information.
  • an effect that makes the user feel that the face image display screen and the notification information display screen are different screens may be performed as a switching effect.
  • switching from a stereoscopic image to a two-dimensional image, switching from high image quality to low image quality, switching from a color image to a monochrome image, etc. can be performed as a switching effect.
  • the switching effect may be performed by changing the color scheme or changing the resolution.
  • the switching effect may be performed using software-based image processing.
  • the notification information may be gradually displayed.
  • in addition, audio information that makes the user feel the device is like a living thing may be output from the sound output unit 220.
  • a face image with a surprised expression is displayed on the display unit 210 when switching from the face image to the notification information, and the notification information is then displayed with display characteristics different from those of the face image.
  • FIG. 10 shows an example in which notification information is displayed as a display characteristic different from that of a face image by switching the screen content of one display unit 210.
  • each of the plurality of display sections may be configured to have different display characteristics.
  • for example, the first display section can be an LCD (Liquid Crystal Display) and the second display section an OLED (organic electroluminescence) display. Further, the first display section can be a glossy screen and the second display section a non-glossy screen. Further, the first display section can be a flat screen and the second display section a curved screen. Further, the first display section can be a stereoscopic screen capable of displaying a three-dimensional image and the second display section a screen that displays a two-dimensional image. Further, the first display section and the second display section can be made different in size.
  • in this case, even if the displayed images are visually the same, their brightness differs, so different effects can be produced.
  • a switching effect is executed on at least one of the first display section and the second display section. For example, the switching effect is performed by moving the casings of the first display section and the second display section, or by displaying a surprised expression on the first display section.
  • FIG. 11 is a flowchart illustrating an example of notification information output processing in the information processing device 110. Further, this notification information output processing is executed by the control unit 120 based on a program stored in the storage unit 130. Moreover, this notification information output process is always executed in every control cycle. Further, this notification information output processing will be explained with reference to FIGS. 1 to 10 as appropriate.
  • step S501 the display switching determination unit 121 executes an analysis process to analyze input information.
  • This input information includes information input to the information processing device 110 from the camera 101, the position information acquisition sensor 102, the audio input unit 103, the sensors 104, etc., and information acquired from the outside via the communication unit 140 and input to the information processing device 110.
  • step S502 the display switching determination unit 121 determines whether there is information (notification information) to be notified to the occupants of the vehicle C1, based on the analysis result of the input information.
  • for example, the remaining battery level of the vehicle C1 can be detected based on the battery sensor. Therefore, when it is determined that the remaining battery amount is equal to or less than a threshold value, it is determined that notification information for notifying that the remaining battery amount is low exists.
  • for example, it is possible to determine, for each seat of the vehicle C1, whether an occupant is seated and whether the seat belt is worn. Therefore, if it is determined that an occupant is seated but not wearing a seat belt, it is determined that notification information for that occupant exists.
  • for example, it is possible to obtain the presence or absence of commercial facilities, tourist facilities, etc. around the vehicle C1. Therefore, for example, if it is determined that a commercial facility such as a coffee shop exists around the vehicle C1, it is determined that an advertisement for the coffee shop exists as notification information. Further, for example, if it is determined that a tourist facility such as ABC Castle exists around the vehicle C1, it is determined that information for guiding the user to ABC Castle exists as notification information.
  • such information around the vehicle C1 can be acquired from an external device such as an information providing server.
  • in step S503, the display switching determination unit 121 determines whether information to be notified to the occupants of the vehicle C1 has been generated. If such information has been generated, the process advances to step S504. On the other hand, if no such information has been generated, the notification information output process ends.
  • step S504 the agent control unit 122 executes a switching effect from the face image to the notification information.
  • for example, as shown in FIG. 10(A) and FIG. 10(B), a switching effect using each part of the agent device 200 is executed.
  • audio information related to the switching effect may be output from the sound output unit 220.
  • although FIGS. 5 to 9 show examples in which both a surprised facial expression and the movement of the face part 201 are executed as the switching effect, only the movement of the face part 201 may be executed as the switching effect.
  • step S505 the agent control unit 122 causes the agent device 200 to perform output processing to output notification information.
  • for example, the notification image 400 or 410 is displayed on the display section.
  • in addition, audio information related to the notification information may be output from the sound output unit 220.
  • in step S506, the agent control unit 122 determines whether the end timing of the notification information output in step S505 has come. When the end timing of the notification information has come, the process advances to step S507. On the other hand, if it is not yet the end timing of the notification information, the process returns to step S505.
  • examples of the end timing of the notification information include when the occupant of the vehicle C1 performs a predetermined action corresponding to the notification information, when a predetermined time has elapsed since the notification information was displayed, and when the current location of the vehicle C1 moves out of the display target area of the notification information.
  • the predetermined action is to increase the remaining battery amount or to charge the battery.
  • An increase in the remaining battery capacity can be determined based on information detected by the battery sensor.
  • battery charging can be determined based on the exchange between the charging equipment and the vehicle C1 via the charging cable when the charging cable is connected to the vehicle C1.
  • the predetermined action is the fastening of the seat belt by the occupant seated in the seat targeted for notification. Whether or not the seat belt is fastened can be determined based on information detected by the seat belt sensor.
  • the predetermined action is the action of closing the door targeted for notification. Whether or not the door is closed can be determined based on information detected by the door sensor.
  • if the notification information output in step S505 is notification information regarding a commercial facility, tourist facility, etc. existing around the vehicle C1, the predetermined action is an operation of moving the vehicle C1 to the location targeted for notification. Whether such an operation has occurred can be determined based on the position information of the vehicle C1 acquired by the position information acquisition sensor 102. Furthermore, in this case, an action in which the occupant utters a sound regarding the location targeted for notification is also treated as the predetermined action.
  • for example, an action of uttering a sound related to the XYZ coffee shop, such as "XYZ coffee is nice" or "XYZ coffee looks delicious", can be the predetermined action.
  • if the information is about a tourist facility existing around the vehicle C1, such as ABC Castle, an action of uttering a voice related to ABC Castle, such as "I like ABC Castle" or "I want to see ABC Castle", can be the predetermined action.
  • the presence or absence of these voice-related actions can be determined based on the degree of matching (or similarity) between the voice information acquired by the voice input unit 103 and a predetermined keyword stored in the storage unit 130. Note that as a method for determining the degree of matching (or similarity) of voices, it is possible to employ a known voice recognition technique.
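To make the keyword-matching idea above concrete, the following sketch compares recognized text against stored keywords. It is purely illustrative: the patent only refers to a "known voice recognition technique", and `difflib`'s string ratio is used here as a stand-in similarity measure, with all names and the threshold being hypothetical.

```python
import difflib

# Hypothetical keywords as might be stored in the storage unit 130 for
# the XYZ coffee shop advertisement.
KEYWORDS = ["XYZ coffee is nice", "XYZ coffee looks delicious"]

def matches_keyword(recognized_text: str, keywords: list[str],
                    threshold: float = 0.6) -> bool:
    """Return True if the recognized utterance matches any stored keyword.

    A substring hit counts directly; otherwise a difflib similarity ratio
    is compared against the threshold. A real system would compare acoustic
    or semantic features rather than raw strings.
    """
    for keyword in keywords:
        if keyword in recognized_text:
            return True
        ratio = difflib.SequenceMatcher(None, recognized_text, keyword).ratio()
        if ratio >= threshold:
            return True
    return False
```

An utterance such as "XYZ coffee is nice" would then be detected as the predetermined action, while unrelated speech would not.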
  • further, the timing at which a predetermined period of time has elapsed since the notification information was displayed may be used as the end timing of the notification information.
  • further, the timing at which the current location of the vehicle C1 moves out of the display target area of the notification information may be used as the end timing. For example, if the notification information is about a commercial facility existing around the vehicle C1, such as an XYZ coffee shop, the timing at which the vehicle C1 moves outside a predetermined area based on the XYZ coffee shop, for example, outside a 2 km radius around the shop, can be set as the end timing of the notification information.
  • step S507 the agent control unit 122 executes a notification information termination process to terminate the display of the notification information output in step S505, and after this termination process, executes a display process to display a face image.
  • as the notification information termination process, for example, if the notification information is about a commercial facility or a tourist facility, a deletion process is executed to erase the notification information, and then a display process is executed to display the face image. Note that in the examples shown in FIGS. 5 to 9, the display section that displays the face image moves when the notification information is displayed. Therefore, in the face image display process in step S507, the face image is displayed after the display section that displays it has been returned to its original position.
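The S501 to S507 flow described above can be pictured as the following sketch. The patent leaves sensing and rendering to other components, so each step is passed in as a caller-supplied callable; all names are hypothetical, not the patent's API.

```python
def notification_output_process(analyze, has_notification, do_switch_effect,
                                show_notification, end_timing_reached,
                                end_notification, show_face):
    """Sketch of the FIG. 11 flow (steps S501-S507)."""
    inputs = analyze()                      # S501: analyze input information
    if not has_notification(inputs):        # S502/S503: anything to notify?
        return False                        # no notification -> process ends
    do_switch_effect()                      # S504: switching effect (face -> notification)
    show_notification()                     # S505: output the notification information
    while not end_timing_reached():         # S506: wait for the end timing
        show_notification()                 # keep outputting until the end timing
    end_notification()                      # S507: termination process...
    show_face()                             # ...then redisplay the face image
    return True
```

For instance, `has_notification` could return True when the battery level in the analyzed inputs is below a threshold, and `end_timing_reached` could check the predetermined action, the elapsed time, or the vehicle's position, as described above.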
  • FIG. 12 is a flowchart illustrating an example of notification information output processing in the information processing device 110. Further, this notification information output processing is executed by the control unit 120 based on a program stored in the storage unit 130. Moreover, this notification information output process is always executed in every control cycle. Further, this notification information output processing will be explained with reference to FIGS. 1 to 11 as appropriate.
  • this output control process is a partial modification of the notification information output process shown in FIG. 11. Specifically, it differs in that steps S511 and S512 are added to the notification information output process shown in FIG. 11. The other points are the same as the notification information output process shown in FIG. 11, so the same reference numerals as in FIG. 11 are used and the description thereof is omitted.
  • in step S511, the display switching determination unit 121 determines whether the importance of the notification information to be output is high.
  • the importance of the notification information can be determined based on, for example, its urgency. For example, information regarding the driving of the vehicle C1 and the safety of its occupants is assumed to have a high degree of urgency. Therefore, the importance of such notification information is set high. For example, the importance level is set higher than the standard for notification information regarding seat-belt fastening, doors left ajar, the remaining battery level, and the like.
  • on the other hand, the urgency of providing information about the surroundings of the vehicle C1, tourist information, gourmet information, etc. is low. Therefore, for such information, the importance of the notification information is set lower than the standard. For example, the importance level of notification information regarding facilities around the vehicle C1 (for example, restaurants and commercial facilities) is set low. Note that the importance level of each piece of notification information is stored in the notification information DB 132 (see FIG. 3).
  • if the importance of the notification information is not high, the process advances to step S512.
  • step S512 the agent control unit 122 executes an effect to smoothly erase the face image.
  • the effect of smoothly erasing the face image is realized, for example, by smoothly displaying the gesture of the face image.
  • for example, the face image can be smoothly erased by expressing the gesture of the surprised face image shown in FIG. 4(A) with a relatively slow movement and then erasing the face image.
  • note that when the agent control unit 122 executes the effect of smoothly erasing the face image, the face image and the notification information may be temporarily displayed at the same time.
  • an effect may be performed in which the face image is gradually erased and the notification information is gradually displayed.
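The importance-based branching of FIG. 12 (high importance proceeds promptly; otherwise the face image is erased smoothly first) can be sketched as follows. The notification kinds and importance table are hypothetical stand-ins for what the text says is stored in the notification information DB 132.

```python
# Hypothetical importance levels, standing in for entries of the
# notification information DB 132 (the actual storage format is unspecified).
IMPORTANCE = {
    "seatbelt_unfastened": "high",
    "door_ajar": "high",
    "battery_low": "high",
    "nearby_restaurant_ad": "low",
    "tourist_info": "low",
}

def choose_switching(notification_kind: str) -> str:
    """Sketch of steps S511/S512: pick a quick switch or a smooth erase."""
    importance = IMPORTANCE.get(notification_kind, "low")
    if importance == "high":
        return "quick_switch"      # proceed directly to S504: notify promptly
    return "smooth_erase_first"    # S512: slowly erase the face image first
```

With this split, urgent items such as a low battery interrupt the face image quickly, while advertisements and sightseeing information yield a gentler transition.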
  • FIG. 13 is a flowchart illustrating an example of notification information output processing in the information processing device 110. Further, this notification information output processing is executed by the control unit 120 based on a program stored in the storage unit 130. Moreover, this notification information output process is always executed in every control cycle. Further, this notification information output processing will be explained with reference to FIGS. 1 to 12 as appropriate.
  • this output control process is a partial modification of the notification information output process shown in FIG. 12.
  • it differs from the notification information output process shown in FIG. 12 in that the process in step S512 is omitted and the processes in steps S521 and S522 are added. The other points are the same as the notification information output process shown in FIG. 12, so the same reference numerals as in FIG. 12 are used and the description thereof is omitted.
  • in step S521, the display switching determination unit 121 determines whether a performance using the face image is in progress. For example, if a predetermined exchange (for example, a conversation) is occurring between the agent device 200 and an occupant of the vehicle C1, it is determined that a performance using the face image is in progress. Furthermore, for example, when the agent device 200 is performing a behavior that appeals to its autonomy (for example, sleeping or acting tired), it is likewise determined that a performance using the face image is in progress. For example, if the agent device 200 is behaving as if it were about to fall asleep while eagerly putting out advertising information for a ramen restaurant near the vehicle C1 as notification information, there is a risk of making the user feel uncomfortable.
  • in step S522, the display switching determination unit 121 determines whether the notification information determined in step S503 is still being generated. That is, it is assumed that, for some reason, outputting the notification information becomes unnecessary. When output of the notification information is no longer necessary, it is determined that the notification information is not being generated. If the notification information is being generated, the process returns to step S521. On the other hand, if the notification information is not being generated, the notification information output process ends.
  • for example, in the case of notification information regarding an advertisement for a commercial facility, when the vehicle C1 moves far away from the commercial facility, it becomes unnecessary to output the notification information. Further, for example, in the case of notification information regarding an advertisement for a commercial facility, output of the notification information may no longer be necessary when a predetermined time (for example, about 10 minutes) has elapsed since the notification information was generated.
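The S521/S522 wait loop, which defers the notification while a face-image performance is in progress and abandons it if the notification becomes stale, can be sketched as below. The function names are hypothetical, and `max_cycles` merely guards this illustration against looping forever.

```python
def wait_for_face_performance(is_performing, still_generated, max_cycles=1000):
    """Sketch of steps S521/S522 from the FIG. 13 flow.

    is_performing:   returns True while a face-image performance
                     (e.g., a conversation) is in progress (S521).
    still_generated: returns True while the notification information
                     still needs to be output (S522).
    """
    for _ in range(max_cycles):
        if not is_performing():           # S521: performance finished?
            return "output_notification"  # proceed to the switching effect
        if not still_generated():         # S522: notification no longer needed?
            return "discard"              # end the output process
    return "discard"
```

This mirrors the loop in the text: while the performance continues and the notification remains valid, the process keeps returning to S521; the notification is only output once the performance ends.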
  • note that it is important to prevent the switching effect of the agent device 200 from interfering with the driver while driving. For example, it is assumed that, while the driver in the driver's seat is driving, the driver notices the switching effect of the agent device 200 but does not direct his or her line of sight toward the agent device 200. Therefore, it is preferable to perform the switching effect of the agent device 200 while the vehicle C1 is stopped. However, if the target is a passenger other than the driver, the process may be executed while the vehicle C1 is running.
  • in the above, an example using the agent device 200 installed in the vehicle C1 has been shown.
  • this embodiment is also applicable to agent devices that can be removed from the vehicle C1, agent devices that can be installed outside the vehicle, and the like.
  • for example, it is assumed that when a user carrying a portable agent device gets into the vehicle C1, the agent device is installed on the dashboard 2 of the vehicle C1, and when the user gets off the vehicle C1, the user takes the agent device along.
  • further, it is also assumed that users use agent devices at home.
  • in this case, it is assumed that notification information regarding each device installed in the home is provided. For example, it is envisaged that a notification is given at the timing when the bath has finished heating, when cooking with a cooking utensil has finished, and so on. Further, for example, when the entrance door or a window is open, it is assumed that a notification to that effect is provided. Furthermore, for example, if the user forgets to turn off the gas stove, it is assumed that a notification to that effect is provided. In these cases, notification information regarding the bath, various cooking utensils, doors, windows, etc. is provided. The degree of importance can be set based on safety, the degree of urgency, and the like.
  • for objects with a high degree of urgency or objects whose safety must be maintained, the importance level can be set higher than the standard, and for objects with a low degree of urgency, the importance level can be set lower than the standard.
  • examples of objects with a high degree of urgency or whose safety must be maintained are a gas stove (to prevent forgetting to turn off the fire), an entrance door, and a window (to notify when it is open).
  • objects with a low degree of urgency include, for example, a bath (notifies the timing when the bath is boiled) and cooking utensils (notifies the timing when cooking is finished).
  • the present embodiment may be applied by displaying the agent image as a two-dimensional image. In this case, both the face and body are displayed as two-dimensional images.
  • a switching effect is executed between the time the face image is displayed and the time the notification information is displayed, so that the face image and the notification information are displayed separately. That is, the switching effect is performed between the display of the face image and the display of the notification information so that the two are separated.
  • since the switching effect is executed after the face image is displayed (or while the face image is being displayed) and the notification information is displayed after this switching effect, the notification information displayed after the switching effect can leave a stronger impression on the user.
  • further, the user can, to some extent, predict the content that will be displayed after the switching effect is executed. For example, when a switching effect is executed while a face image is displayed, the user can anticipate that important notification information will be displayed after this switching effect. In this way, by displaying the notification information after the execution of the switching effect, the notification information becomes predictable and confusion on the user's part can be alleviated. This makes it possible to appropriately convey the notification information.
  • here, consider a case where the agent device 200 outputs the notification information without executing the switching effect and a case where the notification information is output after executing the switching effect.
  • the agent device 200 is more likely to feel like a living thing when the notification information is output after executing the switching effect than when it is output without the switching effect. Therefore, in this embodiment, the notification information is output after the switching effect is executed. This makes it possible to make the user feel that the agent device 200 is like a living thing and to appropriately convey the notification information.
  • note that when notification information with a high degree of urgency (for example, notification information with a high degree of importance) occurs, the notification information can be displayed after quickly executing a switching effect that exclusively switches between the face image and the notification information. In this way, by quickly performing the switching effect at least in an emergency, it becomes possible to make the user feel that the agent device 200 is like a living thing and to quickly convey the notification information.
  • each device that executes a part of each of these processes constitutes an information processing system.
  • for example, at least a portion of each process can be executed using in-vehicle devices, devices that can be used by users (e.g., smartphones, tablet terminals, personal computers, car navigation devices, IVI systems), various information processing devices such as servers connectable via a predetermined network such as the Internet, and various other electronic devices.
  • further, a part (or all) of the functions of the information processing device 110 (or the information processing system 100) may be provided by an application that can be provided via a predetermined network such as the Internet.
  • This application is, for example, SaaS (Software as a Service).
  • the information processing method according to this embodiment is an information processing method that provides notification information to a user using an agent device 200 having a display unit 210 (an example of a display device) that displays a face image.
  • this information processing method includes a control process (steps S503 to S507) for controlling the display mode of the display unit 210. In the control process, when displaying notification information on the display unit 210, a switching effect using the display unit 210 is executed, the display of the face image is erased, and then the notification information is displayed.
  • the program according to the present embodiment is a program that causes a computer to execute each of these processes. In other words, the program according to this embodiment is a program that causes a computer to implement each function that can be executed by the information processing device 110.
  • in this way, when switching from the face image to the notification information, a switching effect using the display unit 210, for example the display of a surprised expression, is executed, so the user can be made to feel that the device is like a living creature and the user's interest in the notification target can be increased. This makes it possible to more appropriately notify the user of the notification information.
  • the display unit 210 includes a first display section (display sections 241, 252, 264, 271, 281, 300, and 324) that displays a face image and a second display section (display sections 242, 253, 265, 272, 282, 310, and 325) that displays notification information. In the control process (steps S504 and S505), when displaying the notification information on the second display section, a switching effect is executed on at least one of the first display section and the second display section, the display of the face image on the first display section is erased, and then the notification information is displayed on the second display section.
  • for example, as a switching effect using the first display section, a surprised facial expression may be displayed. As a switching effect using the first display section and the second display section, the housing equipped with the first display section and the second display section may be rotated. Further, for example, a switching effect using the first display section or the second display section may be executed by displaying images with different display characteristics on at least one of the first display section and the second display section.
  • an effect of moving the display unit 210 is executed as the switching effect. For example, as shown in FIGS. 5(A) and 5(B), FIGS. 6(A) and 6(B), and FIGS. 7(A) to 7(C), a switching effect that moves the display section of the agent device 200 is executed.
  • the display unit 210 (an example of a display device) is composed of a cube-shaped or rectangular-parallelepiped-shaped housing in which a first display section (display sections 252, 264, and 324) that displays the face image is provided on one surface and a second display section (display sections 253, 265, and 325) that displays the notification information is provided on another surface different from the one surface. In the control processing (steps S504, S505), when the notification information is to be displayed on the second display section, a rotation operation that switches the surface of the display device visible to the user from the one surface to the other surface is performed as the switching effect, the face image is erased from the first display section, and the notification information is then displayed on the second display section.
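The rotation-based variant above can likewise be sketched in Python. This is a hypothetical model, not the patent's implementation; the surface names and the `notify` helper are illustrative assumptions.

```python
class RotatingHousing:
    """Sketch of a box-shaped housing whose visible surface is switched by rotation."""
    def __init__(self, surfaces):
        self.surfaces = surfaces   # content shown on each surface, keyed by name
        self.visible = "front"     # surface currently facing the user

    def rotate_to(self, surface):
        # The physical rotation itself serves as the switching effect.
        self.visible = surface


def notify(housing, notification):
    # Erase the face image on the first display section (the front surface)...
    housing.surfaces["front"] = None
    # ...rotate the housing so another surface faces the user...
    housing.rotate_to("side")
    # ...and display the notification information on that surface.
    housing.surfaces["side"] = notification
```

A 180-degree rotation of a flat case (opposite surfaces) and a 90-degree rotation of a cube (adjacent surfaces) both fit this same pattern; only the pair of surfaces involved differs.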
  • the rotating motion of the display unit 210 is executed as the switching effect, making the movement of the face appear lifelike to the user and increasing the user's interest in the notification information being provided. This makes it possible to notify the user of the notification information more appropriately.
  • the display unit 210 (an example of a display device) consists of a flat case in which a first display section (display sections 252, 264) is provided on one surface and a second display section (display sections 253, 265) is provided on the opposite surface. In the control process (step S504), a rotation action that rotates the first and second display sections by 180 degrees around a rotation axis is executed as the switching effect. For example, as shown in FIGS. 5(A) and (B) and 6(A) and (B), a switching effect that rotates the display section of the agent device 200 is performed.
  • the rotating motion of the display unit 210 is executed as the switching effect, making the user feel that the agent's face moves like a living creature, for example by turning backwards or downwards. This increases the user's interest in the notification information being provided and makes it possible to notify the user of the notification information more appropriately.
  • the display unit 210 (an example of a display device) consists of a cube-shaped or rectangular-parallelepiped-shaped housing in which a display section 324 (an example of a first display section) is provided on one surface and a display section 325 (an example of a second display section) is provided on another surface adjacent to the one surface.
  • in this case, a rotation action of rotating the housing by 90 degrees is performed as the switching effect. For example, as shown in FIGS. 9(A) and (B), a switching effect that rotates the display sections 324 and 325 of the agent device 200 is performed.
  • the display unit 210 (an example of a display device) includes a flat display panel 270, which has a first display section 271 provided on one surface and a second display section 272 provided on the opposite surface, and a display section 280 (an example of a third display section).
  • the display unit 210 can change between a first state, in which the face image is displayed in a first display area constituted by the first display section 271 and a part of the display section 280, and a second state, in which the notification information is displayed in a second display area constituted by the second display section 272 and a part of the display section 280.
  • in the control process, when the notification information is to be displayed on the display unit 210, a switching effect in which the display panel 270 is rotated to transition from the first state to the second state is performed, the face image in the first display area is erased, and the notification information is then displayed in the second display area.
  • a switching effect is performed in which the display panel 270 of the agent device 200 is rotated.
  • the unexpected movement of the display panel 270 makes the agent appear lifelike to the user and increases the user's interest in the notification information being provided. This makes it possible to notify the user of the notification information more appropriately.
  • the display unit 210 (an example of a display device) includes a display section 300 (an example of a first display section) that displays the face image and a display section 310 (an example of a second display section) that displays the notification information. The display section 300 is installed so that it can change between a first state, in which it covers the display surface of the display section 310 from the front, and a second state, in which it has been slid in a direction substantially parallel to the display surface of the display section 310 so that the user can view that display surface. In the control processing (steps S504, S505), when the notification information is to be displayed on the display section 310, a switching effect in which the display section 300 is slid to transition from the first state to the second state is performed, the face image on the display section 300 is erased, and the notification information is then displayed on the display section 310.
  • the unexpected movement of the display section 300 makes the agent appear lifelike to the user and increases the user's interest in the notification information being provided. This makes it possible to notify the user of the notification information more appropriately.
  • the display unit 210 can display images with different display characteristics, switching at least one of a three-dimensional image and a two-dimensional image, high image quality and low image quality, and a color image and a monochrome image. In the control processing (steps S504, S505), when the notification information is to be displayed on the display unit 210, an effect of displaying a predetermined image with display characteristics different from those used when displaying the face image is executed as the switching effect, the face image on the display unit 210 is erased, and the notification information is then displayed on the display unit 210.
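The idea of giving the notification information display characteristics that differ from the face image in at least one dimension can be sketched as follows. This is a hypothetical illustration; the characteristic names and the choice to flip the color axis are assumptions, not taken from the patent.

```python
def notification_style(face_style):
    """Pick display characteristics that differ from the face image in at least
    one of: 3D/2D image, high/low image quality, color/monochrome image."""
    flipped = dict(face_style)
    # Here we flip only the color characteristic; any of the three axes
    # (or several at once) could be flipped instead.
    flipped["color"] = "monochrome" if face_style["color"] == "color" else "color"
    return flipped
```

Because at least one characteristic always differs, the user can visually distinguish the notification information from the face image at a glance.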
  • by displaying the notification information with different display characteristics when switching from the face image, it is possible to make the agent appear lifelike to the user and increase the user's interest in the notification information being provided. Furthermore, the difference in display characteristics allows the user to clearly distinguish the notification information from the face image. This makes it possible to notify the user of the notification information more appropriately.
  • the display unit 210 (an example of a display device) includes a first display section (display sections 241, 252, 264, 271, 281, 300, and 324) that displays the face image and a second display section (display sections 242, 253, 265, 272, 282, 310, and 325) that displays the notification information, the two display sections differing in at least one display characteristic among a three-dimensional image and a two-dimensional image, high image quality and low image quality, and a color image and a monochrome image. In the control process (steps S504, S505), when the notification information is to be displayed on the second display section, a switching effect is performed using at least one of the first display section and the second display section, the face image is erased from the first display section, and the notification information is then displayed on the second display section.
  • when switching from the face image to the notification information, displaying the notification information on the second display section, which has different display characteristics, makes the agent appear lifelike to the user and increases the user's interest in the notification information being provided. Furthermore, the difference in display characteristics allows the user to clearly distinguish the notification information from the face image. This makes it possible to notify the user of the notification information more appropriately.
  • the degree of importance is determined based on the degree of urgency of the notification information, and the timing of switching from the display of the face image to the display of the notification information is changed based on the degree of importance of the notification information.
  • notification information with high importance can be output quickly, and notification information with low importance can be output at an appropriate timing. This makes it possible to notify the user of the notification information more appropriately.
  • in the control process (steps S504, S511, S512, S521, S522), if the importance of the notification information is higher than a standard, the display is immediately switched from the face image to the notification information. If the importance of the notification information is lower than the standard and the face image displayed on the display unit 210 (an example of a display device) is performing a predetermined effect, the display is switched from the face image to the notification information after the predetermined effect is completed; if the face image is not performing the predetermined effect, the display is switched from the face image to the notification information after the notification information is generated.
  • the information processing device 110 is an information processing device that provides notification information to a user using an agent device 200 that has a display unit 210 (an example of a display device) that displays a facial image.
  • the information processing device 110 includes a control unit 120 that controls the display mode of the display unit 210. When the notification information is to be displayed on the display unit 210, the control unit 120 executes a switching effect using the display unit 210, erases the display of the face image, and then displays the notification information on the display unit 210. An information processing system that can execute each process realized by the information processing device 110 may also be used.
  • when switching from the face image to the notification information, a switching effect, for example the display of a surprised expression, is performed using the display unit 210. This makes the device appear lifelike to the user and increases the user's interest in the notification information being provided, making it possible to notify the user of the notification information more appropriately.
  • each processing procedure shown in this embodiment is an example for realizing this embodiment; to the extent that this embodiment can still be realized, the order of parts of each processing procedure may be changed, parts of each processing procedure may be omitted, or other processing steps may be added.
  • each process shown in this embodiment is executed based on a program for causing a computer to execute each processing procedure. Therefore, this embodiment can also be understood as an embodiment of a program that implements the functions of executing these processes, and of a recording medium that stores the program. For example, when an update process is performed to add a new function to an information processing device, the program can be stored in the storage device of that information processing device, making it possible to cause the updated information processing device to perform each process described in this embodiment.


Abstract

An information processing method provides notification information to a user using an agent apparatus having a display unit for displaying a face image. The information processing method includes a control process for controlling the display form of the display unit. In the control process, when the display unit is caused to display the notification information, a switching effect using the display unit is executed, and the notification information is displayed by the display unit after the display of the face image is erased.

Description

Information processing method and information processing device
The present invention relates to an information processing method and an information processing apparatus for notifying users of information.
Conventionally, there are robots that can communicate with users. For example, a device has been proposed that communicates with a user by displaying information such as a character string on a display device that also displays a facial image (see, for example, JP2019-124855A).
With the above-mentioned conventional technology, it is possible to gradually change the facial image, completely hide it, and display information such as character strings on the display device. Consider the case where information to be reported (notification information) is displayed on a display device that has been displaying a face image that makes the user feel as if the device is a living creature. In this case, from the moment of switching from the face image, which conveys a lifelike impression, to the notification information, which does not, the lifelike quality may be lacking. In such a case, the user's interest in the notification information displayed on the display device may decrease due to this lack of lifelike quality, and it may become difficult to appropriately notify the user of the notification information. Therefore, when displaying information that should be reported to the user, it is important to heighten the user's interest in the information being reported at the time of switching, so that the information can be conveyed to the user more appropriately.
The purpose of the present invention is to notify users of information more appropriately.
One aspect of the present invention is an information processing method that provides notification information to a user using an agent device that has a display device that displays a face image. This information processing method includes a control process that controls the display mode of the display device. In this control process, when the notification information is to be displayed on the display device, a switching effect using the display device is executed, and the notification information is displayed after the display of the face image is erased.
FIG. 1 is a simplified diagram showing an example configuration of the interior of a vehicle.
FIG. 2 is a simplified front view showing an example of the external configuration of the agent device.
FIG. 3 is a diagram showing an example of the system configuration of an information processing system installed in a vehicle.
FIG. 4 is a diagram showing a transition example in which notification information is provided using two display sections.
FIG. 5 is a diagram showing a transition example in which notification information is provided using two display sections.
FIG. 6 is a diagram showing a transition example in which notification information is provided using two display sections.
FIG. 7 is a diagram showing a transition example in which notification information is provided using two display sections realized by a flat display panel rotatably provided on the face portion.
FIG. 8 is a diagram showing a transition example in which notification information is provided using a slidable display section and a fixed display section.
FIG. 9 is a diagram showing a transition example in which notification information is provided using two display sections.
FIG. 10 is a diagram showing a transition example in which notification information is provided using one display section.
FIG. 11 is a flowchart showing an example of notification information output processing in the information processing device.
FIG. 12 is a flowchart showing an example of notification information output processing in the information processing device.
FIG. 13 is a flowchart showing an example of notification information output processing in the information processing device.
Embodiments of the present invention will be described below with reference to the accompanying drawings.
[Example of agent device installation]
FIG. 1 is a simplified diagram showing an example configuration of the interior of the vehicle C1. Note that FIG. 1 shows an example of the external appearance when the area forward of the driver's seat and passenger seat (not shown) is viewed from the rear side of the vehicle C1. Further, in FIG. 1, for ease of explanation, everything other than the dashboard 2, steering wheel 3, front window 4, rearview mirror 5, camera 101, and agent device 200 is omitted.
The agent device 200 is a small robot installed on the dashboard 2 of the vehicle C1. This embodiment shows an example in which the agent device 200 is a robot imitating a human. Note that although FIG. 1 shows an example in which the agent device 200 is installed on the dashboard 2, the present invention is not limited to this. For example, the agent device 200 may be installed at the top of the front window 4. Further, although FIG. 1 shows an example in which the agent device 200 is a robot imitating a human, the present invention is not limited to this. For example, the agent device 200 may be a robot imitating an animal such as a rabbit or a pig, a robot imitating a virtual creature (for example, the face of an anime character), or a robot imitating another object (for example, a television-type or radio-type device). In this way, an agent rendered as a simulated living creature can be used as the agent device 200.
The agent device 200 executes various operations based on instructions from the information processing device 110 (see FIG. 3). For example, under the control of the information processing device 110, the agent device 200 outputs various information related to driving support when the user performs a driving operation. Such driving support may include, for example, notification of a moving object in front of or behind the vehicle. For example, as a notification regarding the road ahead, it can output a voice message such as "There is a railroad crossing ahead, so be careful" or "There are people ahead." In this way, the agent device 200 provides driving assistance.
Here, notification means conveying or communicating some kind of information. Information to be conveyed to the user of the vehicle C1 (an occupant of the vehicle C1) is referred to below as notification information. The notification information may be conveyed to the user through image display or audio output. This embodiment mainly shows an example in which the notification information is conveyed to the user through image display. Note that the notification described in this embodiment may also be referred to as a notice, a communication, or the like.
The camera 101 is installed on the ceiling inside the vehicle C1 and captures images of subjects inside the vehicle C1 to generate images (image data). The camera 101 is configured by, for example, one or more camera devices or image sensors capable of capturing images of a subject. For example, the camera 101 can be provided at the top of the front window 4, that is, above the rearview mirror 5. Although FIG. 1 shows an example including at least one camera 101, two or more imaging devices may be provided, and images from all or some of these imaging devices may be used. Further, the installation location of each imaging device is not limited to the example shown in FIG. 1 and can be changed as appropriate. Alternatively, one or more devices capable of capturing subjects in all directions around the vehicle C1 as well as subjects inside it, such as a 360-degree camera, may be used.
[Example of external configuration of agent device]
FIG. 2 is a simplified front view showing an example of the external configuration of the agent device 200. This embodiment shows an example in which, when the agent device 200 is in the on state, a face image, notification information, and the like are displayed on the display unit 210. Note that the agent device 200 can be turned on in response to a turn-on operation of the vehicle C1, and can also be turned on in response to a user operation on the agent device 200. The on/off operation of the vehicle C1 means an on or off operation of a start key for starting or stopping the vehicle C1.
FIG. 2(A) shows a front view of the agent device 200 in its normal state. The agent device 200 includes a substantially box-shaped body part 202 and a substantially box-shaped face part 201 arranged vertically and joined via a connecting part 203. The face part 201 is provided with a display unit 210 that displays various images, and a body image is displayed on the surface of the body part 202. As the display unit 210, various display media such as a light-emitting sheet, a liquid crystal display, or an organic EL (electroluminescence) display can be used. The connecting part 203 is recognized as the part corresponding to the neck of a living creature.
The display unit 210 of the face part 201 displays the parts that make up the agent's face, for example, eyes 211, a nose 212, and a mouth 213. This embodiment shows an example in which the agent device 200 is installed on the dashboard 2 of the vehicle C1 so that the eyes 211 face the interior of the vehicle. In the following description, the side of the agent device 200 on which the display unit 210 is provided is referred to as the front side of the agent device 200, and the opposite side is referred to as the rear side. Also, the left side of FIG. 2 is referred to as the right side of the agent device 200, and the right side of FIG. 2 as its left side.
FIG. 2(B) shows a front view of the agent device 200 with each part of the face 201 facing left, and FIG. 2(C) shows a front view with each part facing right. As shown in FIG. 2(B), by moving the eyes 211 to the left, shifting the pupils of the eyes 211 to the left, and making the mouth 213 a vertically elongated oval, the face 201 can be given the expression of looking to the left. Similarly, as shown in FIG. 2(C), by moving the eyes 211 to the right, shifting the pupils to the right, and making the mouth 213 a vertically elongated oval, the face 201 can be given the expression of looking to the right.
FIG. 2(D) shows a front view of the agent device 200 in a state where, just before the notification information is displayed, the face 201 is performing an effect intended to make the user feel that the device is a living creature, as the switching effect. As such an effect, for example, a display that conveys a surprised expression to the user can be used.
For example, as shown in FIG. 2(D), by making the eyes 211 vertically elongated ovals, shifting the pupils of the eyes 211 downward, and making the mouth 213 a vertically elongated oval, the face 201 can be given a surprised expression. Furthermore, by displaying an exclamation mark image 214 to draw attention or by outputting surprised audio, an even more startled state can be expressed. As a result, it is expected that the user, seeing that the agent device 200 is surprised, will turn their attention to the display unit 210, wondering whether something is about to happen. In addition, the surprised expression of the agent device 200 makes the device appear lifelike to the user. This directs the user's attention to the display unit 210 and heightens the user's interest in the notification information that is about to be displayed. In this way, as effects for making the user feel that the device is a living creature, biological features such as eyes, a nose, and a mouth can be exploited to attract the user's attention. That is, these effects can also be understood as effects for guiding the user's attention.
In this way, performing effects using various images such as the exclamation mark image 214 can further attract the user's attention. The user's attention may also be attracted by effects that use means other than images such as the exclamation mark image 214, for example, various effects that use the movement of the face 201. Effects using the movement of the face 201 of the agent device 200 are described in detail with reference to FIGS. 5 to 9.
FIGS. 2(E) and 2(F) show front views of the agent device 200 in a state where notification information (notification images 400, 410) is displayed on the display unit 210. FIG. 2(E) shows an example in which a notification image 400 prompting the user to fasten the seat belt is displayed on the display unit 210, and FIG. 2(F) shows an example in which a notification image 410 notifying the user that the remaining battery level of the vehicle C1 is low is displayed on the display unit 210. Note that these pieces of notification information are merely examples, and other information may be displayed as notification information. For example, a notification image indicating that one of the doors of the vehicle C1 is ajar may be displayed on the display unit 210; as notification information regarding such an ajar door, image information representing the door can be displayed on the display unit 210.
Note that the shapes, display modes, and audio output modes of the agent device 200 described here are merely examples, and other shapes, display modes, and audio output modes are also possible. Further, although FIG. 2 shows an example in which the face part 201 and the body part 202 are configured as separate bodies, they may be configured as a single integrated housing.
 In this embodiment, a face image is displayed as the image expressing creature-likeness (the living-creature display), but other images (for example, the agent's whole body, a mechanical object, or a virtual face) may be displayed as the image expressing creature-likeness.
 Here, since the agent device 200 installed on the dashboard 2 is a three-dimensional anthropomorphic robot, it is assumed to have a strong presence as perceived by the user of the vehicle C1. For this reason, when some information is to be conveyed to the user of the vehicle C1, it is conceivable to convey it using the agent device 200, for example by switching the display unit 210 from the face image that gives the user an impression of a living creature to the notification information. For instance, at the moment when one of the face images shown in FIGS. 2(A) to 2(C) is displayed on the display unit 210, the notification information shown in FIG. 2(E) or 2(F) could be displayed on the display unit 210. In that case, however, creature-likeness may be lacking from the moment of the switch onward, since the display transitions from a face image that conveys a living creature to notification information that does not. The lack of creature-likeness may then reduce the user's interest in the notification information displayed on the display unit 210, making it difficult to convey the notification information to the user appropriately. Therefore, when displaying notification information, it is important to give the user an impression of a living creature at the moment of the switch, thereby heightening the user's interest in the notification information being displayed and conveying the notification information to the user more appropriately.
 Therefore, in this embodiment, when notification information is to be displayed on the display unit 210, a switching effect using the display unit 210 is performed, the face image on the display unit 210 is erased, and the notification information is then displayed on the display unit 210. For example, when the notification information shown in FIG. 2(E) or 2(F) is to be displayed while one of the face images shown in FIGS. 2(A) to 2(C) is displayed on the display unit 210, the switching effect is performed by displaying the face image shown in FIG. 2(D) on the display unit 210 before the notification information is displayed. By performing a switching effect that gives the user an impression of a living creature before the notification information is displayed, the user's interest in the notification information being displayed is heightened, and the notification information can be conveyed to the user more appropriately.
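The sequence just described (face image, brief switching effect with a surprised expression, erasure, then the notification) can be pictured as a short ordered procedure. The following is only an illustrative sketch; the class and content labels are invented for illustration and are not part of the patent's implementation.

```python
# Illustrative sketch of the face-image -> switching-effect -> notification
# display sequence described above. All names are hypothetical.

class AgentDisplay:
    """Models a single display unit switching from a face image to notification info."""

    def __init__(self):
        self.shown = "face:normal"  # a face image is shown in the normal state
        self.log = []               # record of successive display states

    def _show(self, content):
        self.shown = content
        self.log.append(content)

    def notify(self, notification_image):
        # 1. Switching effect: a surprised facial expression (cf. FIG. 2(D))
        #    preserves the creature-like impression just before the switch.
        self._show("face:surprised")
        # 2. Erase the face image from the display unit.
        self._show("blank")
        # 3. Display the notification information (e.g. notification image 400).
        self._show(notification_image)

display = AgentDisplay()
display.notify("notification:seatbelt")
print(display.log)  # → ['face:surprised', 'blank', 'notification:seatbelt']
```

The point of the sketch is the strict ordering: the notification content never appears until the surprised-face effect has run and the face image has been erased.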
 The switching effect shown in FIG. 2 is one example, and other switching effects may be performed. For example, FIG. 10 shows another example of a switching effect using the same display screen. Besides a switching effect using the same display screen, a switching effect using a plurality of display screens may be performed; an example is shown in FIG. 4. These switching effects can be executed by switching processing implemented in software, and they make it possible to display the face image and the notification information exclusively of each other.
 A switching effect using physical movement of the face part 201 may also be performed; examples are shown in FIGS. 5 to 9. These switching effects can be executed by switching processing that uses physical movement such as rotation or translation, and they likewise make it possible to display the face image and the notification information exclusively of each other. Even when a switching effect using such physical movement is performed, the display position of the face image and the display position of the notification information can be made the same. In that case, since the notification information appears at the position of the face image after a switching effect that gives the user an impression of a living creature, the user's interest in the notification information can be heightened further, and the notification information can be conveyed to the user more appropriately.
 [Configuration example of the information processing system]
 FIG. 3 is a diagram showing an example of the system configuration of the information processing system 100 installed in the vehicle C1.
 The information processing system 100 includes a camera 101, a position information acquisition sensor 102, an audio input unit 103, sensors 104, an information processing device 110, and an agent device 200. The information processing device 110 and the agent device 200 are connected by wired or wireless communication. The information processing device 110 is connected to the network 20 by wireless communication; the network 20 is a network such as a public line network or the Internet. The agent device 200 may also be connected to the network 20 by wireless communication. Although FIG. 3 shows an example in which the information processing device 110 and the agent device 200 are configured as separate devices, they may be configured as an integrated device.
 The camera 101 captures an image of a subject and generates an image (image data) under the control of the information processing device 110, and outputs image information regarding the generated image to the information processing device 110. The camera 101 is provided at least inside the vehicle C1 and captures a subject inside the vehicle C1 to generate an image (image data); FIG. 1 shows the camera 101 provided inside the vehicle C1. As described above, the camera 101 is constituted by, for example, one or more camera devices or image sensors capable of capturing a subject. For example, one camera 101 may be provided at the front of the vehicle C1 to capture a subject ahead of the vehicle C1 and generate an image (image data), and another camera 101 may be provided at the rear of the vehicle C1 to capture a subject behind the vehicle C1 and generate an image (image data).
 The position information acquisition sensor 102 acquires position information regarding the position of the vehicle C1 and outputs the acquired position information to the information processing device 110. It can be realized by, for example, a GNSS receiver that acquires position information using a GNSS (Global Navigation Satellite System); the position information includes data such as the latitude, longitude, and altitude at the time the GNSS signal is received. Position information may also be acquired by other methods, for example derived from information provided by nearby access points or base stations, or acquired using beacons. Based on the information acquired by the position information acquisition sensor 102, the state of the vehicle C1, for example whether it is running, stopped, or reversing, can be determined.
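As one illustration of how the vehicle's running/stopped state could be judged from successive position fixes, the displacement between two latitude/longitude readings can be converted to an approximate ground speed and compared against a small threshold. This is a hypothetical sketch (the threshold value and function names are invented), not the method disclosed in the patent.

```python
import math

def ground_speed_mps(fix_a, fix_b, dt_seconds):
    """Approximate ground speed (m/s) from two (lat, lon) fixes in degrees,
    using an equirectangular approximation, adequate over short intervals."""
    lat_a, lon_a = map(math.radians, fix_a)
    lat_b, lon_b = map(math.radians, fix_b)
    earth_radius = 6371000.0  # mean Earth radius in metres
    x = (lon_b - lon_a) * math.cos((lat_a + lat_b) / 2)
    y = lat_b - lat_a
    distance = earth_radius * math.hypot(x, y)
    return distance / dt_seconds

def vehicle_state(speed_mps, threshold=0.5):
    """Classify as 'stopped' or 'moving'; 0.5 m/s is an invented threshold."""
    return "stopped" if speed_mps < threshold else "moving"

# Two fixes 0.001 degrees of latitude apart, 10 s apart: roughly 11 m/s.
speed = ground_speed_mps((35.0, 139.0), (35.001, 139.0), 10.0)
print(vehicle_state(speed))  # → moving
```

A real implementation would also use the vehicle speed sensor among the sensors 104; this sketch only shows that position fixes alone already suffice for a coarse running/stopped judgment.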
 Furthermore, it is possible to determine, based on the position information acquired by the position information acquisition sensor 102, that the vehicle C1 is near a facility outside the vehicle C1, such as a coffee shop.
 The audio input unit 103 is provided inside the vehicle C1; under the control of the information processing device 110, it acquires sounds inside the vehicle C1 and outputs sound information regarding the acquired sounds to the information processing device 110. As the audio input unit 103, for example, one or more microphones or sound acquisition sensors can be used.
 The sensors 104 are various sensors installed in the vehicle C1 and output the detection information acquired by each sensor to the information processing device 110. The sensors include, for example, a human sensor, a distance sensor, a vehicle speed sensor, an acceleration sensor, a seating sensor, a seatbelt sensor, a door sensor, and a battery sensor. These are merely examples; other sensors may be used, or only some of these sensors may be used.
 The human sensor is a sensor that detects the presence or absence, number, positions, states, and so on of people inside the vehicle C1. For example, a human sensor using infrared rays, ultrasonic waves, visible light, an image sensor, or the like can be used. For example, when a person is seated in the driver's seat, the passenger seat, or a rear seat, the presence of that person can be detected by the human sensor. By using both the human sensor and the seating sensor, the accuracy of detecting the seating state of each seat can be increased.
 The distance sensor is a sensor that detects the distance to a person inside the vehicle C1, the distance to an object to be notified about, and so on. Various sensors, such as a ranging sensor, can be used as the distance sensor.
 The seating sensor (or seat sensor) is a sensor that detects whether an occupant is seated in each seat of the vehicle C1. The seatbelt sensor is a sensor that detects whether the occupant seated in each seat of the vehicle C1 is wearing the seatbelt. The door sensor is a sensor that detects whether each door of the vehicle C1 is ajar. The battery sensor is a sensor for measuring the remaining charge of the battery installed in the vehicle C1. Known sensors can be used for each of these.
 The information processing device 110 includes a control unit 120, a storage unit 130, and a communication unit 140. The communication unit 140 exchanges various information with other devices using wired or wireless communication under the control of the control unit 120.
 The control unit 120 controls each unit based on various programs stored in the storage unit 130. The control unit 120 is realized by, for example, a processing device such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit). For example, executing image processing on a GPU can increase the computation speed, and executing parallel computation on a GPU can increase it further. The vehicle ECU (Electronic Control Unit) of the vehicle C1 may also be used as the control unit 120, or a processing device separate from the vehicle ECU may be provided as the control unit 120. The control unit 120 also has a conversion function that converts text into speech, realized by, for example, TTS (Text to Speech).
 The control unit 120 executes control processing that controls the operating state of the agent device 200 based on the information output from the camera 101, the position information acquisition sensor 102, the audio input unit 103, the sensors 104, the communication unit 140, and so on. Specifically, the control unit 120 includes a display switching determination unit 121 and an agent control unit 122.
 The display switching determination unit 121 determines, based on the information output from the camera 101, the position information acquisition sensor 102, the audio input unit 103, the sensors 104, the communication unit 140, and so on, whether to switch the content displayed on the display unit 210, and outputs the determination result to the agent control unit 122. Specifically, the display switching determination unit 121 determines the timing of switching from the face image to the notification information.
 For example, the display switching determination unit 121 determines, based on the information output from the camera 101, the position information acquisition sensor 102, the audio input unit 103, the sensors 104, the communication unit 140, and so on, whether notification information has been generated. For example, based on the seatbelt sensor, which detects whether a seatbelt is fastened, and the seating sensor (or seat sensor), which detects whether an occupant is present in each seat of the vehicle C1, it is possible to determine whether the occupant of each seat of the vehicle C1 is wearing a seatbelt. If it is determined that an occupant is seated but not wearing a seatbelt, it is determined that notification information has been generated.
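The seatbelt determination described above reduces to combining the two per-seat sensor readings: notification information is generated if any seat is occupied while its belt is unfastened. A minimal sketch, with invented function and seat names:

```python
def notification_needed(seat_occupied, seatbelt_fastened):
    """Return True if any seat is occupied (seating sensor) while its
    seatbelt is not fastened (seatbelt sensor). Inputs are per-seat dicts."""
    return any(
        occupied and not seatbelt_fastened.get(seat, False)
        for seat, occupied in seat_occupied.items()
    )

occupied = {"driver": True, "passenger": True, "rear": False}
fastened = {"driver": True, "passenger": False, "rear": False}
print(notification_needed(occupied, fastened))  # → True (passenger seated, belt off)
```

The empty rear seat contributes nothing to the result; only occupied seats are checked, matching the "seated but not wearing a seatbelt" condition in the text.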
 The display switching determination unit 121 may adjust the timing of switching from the face image to the notification information based on the importance of the notification information, or based on the operating state of the face image (for example, whether a predetermined effect is being performed). These methods of adjusting the switching timing will be described in detail with reference to FIGS. 12 and 13.
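The adjustment rules themselves are detailed only with reference to FIGS. 12 and 13, outside this passage. One plausible reading, sketched here purely as an assumption, is that a high-importance notification pre-empts any running face-image effect while a lower-importance one waits for the effect to finish; the importance labels and return values below are invented.

```python
def switch_delay_seconds(importance, effect_remaining_seconds):
    """Hypothetical timing rule: how long to postpone the switch from the
    face image to the notification information. 'high' importance switches
    immediately; otherwise the currently running face-image effect is
    allowed to complete first."""
    if importance == "high":
        return 0.0
    return effect_remaining_seconds

print(switch_delay_seconds("high", 2.5))  # → 0.0
print(switch_delay_seconds("low", 2.5))   # → 2.5
```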
 The agent control unit 122 controls the operating state of the agent device 200 based on the determination result from the display switching determination unit 121 and the information output from the camera 101, the position information acquisition sensor 102, the audio input unit 103, the sensors 104, the communication unit 140, and so on. For example, the agent control unit 122 causes the display unit 210 to display the parts constituting the face of the anthropomorphic agent (the eyes 211, the nose 212, and the mouth 213). The agent control unit 122 also causes the display unit 210 to display information to be conveyed to the occupants of the vehicle C1 (notification information), and causes the sound output unit 220 to output the voice of the anthropomorphic agent, speech corresponding to the information to be conveyed to the occupants of the vehicle C1, and so on.
 For example, the agent control unit 122 determines the notification information to be displayed on the display unit 210 and the notification information to be output from the sound output unit 220, and executes control to output them. Notification information regarding objects inside the vehicle C1 is stored in the notification information DB 132. Output examples of each item of notification information are shown in FIGS. 2 and 4 to 10. For example, for notification information regarding fastening of the seatbelt, an image representing the seatbelt (notification image 400, see FIG. 2(E)) is displayed as the notification information; for notification information regarding the remaining battery level, an image representing the battery (notification image 410, see FIG. 2(F)) is displayed; and for notification information regarding an ajar door, an image representing the door is displayed.
 Furthermore, when switching the display state of the display unit 210 from the face image to the notification information, the agent control unit 122 executes control that causes the agent device 200 to perform a switching effect. This switching effect will be described in detail with reference to FIGS. 4 to 10.
 The storage unit 130 is a storage medium that stores various information. For example, the storage unit 130 stores the information necessary for the control unit 120 to perform its various processes (for example, a control program, an agent information DB 131, a notification information DB 132, and a map information DB 133), as well as various information acquired via the communication unit 140. As the storage unit 130, for example, a ROM (Read Only Memory), RAM (Random Access Memory), SRAM (Static Random Access Memory), HDD (Hard Disk Drive), SSD (Solid State Drive), or a combination thereof can be used.
 The agent information DB 131 stores the information necessary to realize the various operations of the agent device 200. For example, the face image information displayed on the display unit 210 (for example, the eyes 211, the nose 212, and the mouth 213) and the audio information output from the sound output unit 220 are stored in the agent information DB 131. In addition, for example, motion information for moving the face part 201 (see FIGS. 5 to 9) when a switching effect is performed is stored in the agent information DB 131.
 The notification information DB 132 stores the information necessary for outputting notification information regarding objects inside or outside the vehicle C1. For example, for notification information regarding an ajar door, image information representing the door and audio information for announcing the ajar door are stored in the notification information DB 132 as notification information. For notification information regarding fastening of the seatbelt, image information representing the seatbelt (notification image 400, see FIG. 2(E)) and audio information for prompting the user to fasten the seatbelt are stored in the notification information DB 132 as notification information. Likewise, for notification information regarding the remaining battery level, image information representing the battery (notification image 410, see FIG. 2(F)) and audio information for announcing the remaining battery level are stored in the notification information DB 132 as notification information.
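The per-notification records just described, an image plus a voice message for each notification type, can be pictured as a keyed table. The keys, file names, and message texts below are invented stand-ins for the contents of the notification information DB 132:

```python
# Hypothetical in-memory stand-in for the notification information DB 132.
NOTIFICATION_DB = {
    "door_ajar": {
        "image": "door.png",
        "voice": "A door is ajar.",
    },
    "seatbelt": {
        "image": "notification_image_400.png",  # cf. FIG. 2(E)
        "voice": "Please fasten your seatbelt.",
    },
    "battery_low": {
        "image": "notification_image_410.png",  # cf. FIG. 2(F)
        "voice": "The battery level is low.",
    },
}

def lookup_notification(kind):
    """Return the (image, voice) pair for one notification type."""
    record = NOTIFICATION_DB[kind]
    return record["image"], record["voice"]

print(lookup_notification("battery_low")[0])  # → notification_image_410.png
```

Each record pairs what the display unit 210 shows with what the sound output unit 220 speaks, mirroring how the text pairs image information and audio information per notification type.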
 The map information DB 133 stores map information, such as road information, necessary for route guidance of the vehicle C1. The map information includes road gradients, road intersections, the number of lanes, road width information, and road undulation information. It also includes road signs indicating speed limits, one-way traffic, and the like, as well as road markings such as crosswalks and lane lines. The map information may further include road structures (for example, traffic lights and utility poles), facility information such as buildings, surrounding tourist guide information, and the like. The agent information DB 131, the notification information DB 132, and the map information DB 133 may be stored in the storage unit 130 of the vehicle C1, or may be acquired from an external device via the network 20.
 The agent device 200 is a robot device that performs various operations based on instructions from the information processing device 110. The agent device 200 includes a display unit 210, a sound output unit 220, and a drive unit 230, which are controlled by a control unit (not shown) provided in the agent device 200.
 The display unit 210 displays various images based on instructions from the information processing device 110. As the display unit 210, a display panel such as an organic EL (Electro Luminescence) panel or an LCD (Liquid Crystal Display) panel can be used. The display unit 210 may be configured as a touch panel on which the user performs operation input by touching or approaching the display surface with a finger, or may be configured as a separate user interface.
 Although FIG. 2 shows an example in which the agent device 200 includes a single display unit 210, the agent device 200 may include a plurality of display units; examples are shown in FIGS. 4 to 9. Each display unit shown in FIGS. 4 to 9 corresponds to the display unit 210 shown in FIG. 3.
 The sound output unit 220 outputs various sounds based on instructions from the information processing device 110. As the sound output unit 220, for example, one or more speakers can be used. The display unit 210 and the sound output unit 220 are examples of user interfaces; some of them may be omitted, or other user interfaces may be used.
 The drive unit 230 drives each part of the agent device 200 based on instructions from the information processing device 110. For example, the drive unit 230 is a drive device that realizes a mechanism for spinning the face part 201 (see FIGS. 5, 6, and 9), a mechanism for moving the display panel 270 in the face part 201 (see FIG. 7), or a mechanism for moving the display unit 300 (see FIG. 8). The drive unit 230 is constituted by, for example, motors or servomotors capable of driving each part, and will be described in detail with reference to FIGS. 5 to 9.
 [Example of providing notification information using a plurality of display units]
 FIG. 2 showed an example in which notification information is provided using a single display unit 210. As described above, notification information can also be provided using a plurality of display units, and FIGS. 4 to 9 show such examples. FIGS. 4 to 9 show examples in which notification information regarding the remaining battery level (notification image 410, see FIG. 2(F)) is output, and FIG. 10 shows an example in which notification information regarding fastening of the seatbelt (notification image 400, see FIG. 2(E)) is output.
 [Example of providing notification information using a plurality of display units arranged vertically]
 FIG. 4 is a diagram showing a transition example in which notification information is provided using two display units 241 and 242. In FIG. 4, the agent device 200 includes the two display units 241 and 242 arranged vertically: the display unit 241 is installed on the face part 201, and the display unit 242 is installed on the body part 202. That is, the example shown in FIG. 4 corresponds to the agent device 200 shown in FIG. 2 with the display unit 241 provided in place of the display unit 210 and the display unit 242 provided on the body part 202.
 The display unit 241 mainly displays face images, and the display unit 242 mainly displays notification information. For example, in the normal state, various face images are displayed on the display unit 241, as in the examples shown in FIGS. 2(A) to 2(C).
 The display switching determination unit 121 determines, based on the information output from the camera 101, the position information acquisition sensor 102, the audio input unit 103, the sensors 104, the communication unit 140, and so on, whether information to be conveyed to the user of the vehicle C1 (notification information) has been generated. When notification information has been generated, the display switching determination unit 121 notifies the agent control unit 122 accordingly.
 Next, when the display switching determination unit 121 determines that notification information has been generated, the agent control unit 122 executes control for displaying the notification information on the display unit 242. Specifically, the agent control unit 122 performs a switching effect using the display unit 241, erases the face image on the display unit 241, and then displays the notification information on the display unit 242. Instead of a switching effect using the display unit 241 alone, a switching effect using both the display units 241 and 242, a switching effect using only the display unit 242, or the like may be performed.
 For example, when notification information occurs, as shown in FIG. 4(A), the agent control unit 122 executes, at a timing before the notification information is displayed, an effect for making the user feel that the agent is lifelike (a switching effect). Specifically, the agent control unit 122 controls the display state of the display unit 241 so that the face image on the display unit 241 takes on a surprised facial expression (see FIG. 2(D)).
 Next, as shown in FIG. 4(B), the agent control unit 122 erases the face image from the display unit 241 and causes the display unit 242 to display a notification image 410 as the notification information.
 In this way, when the display switches from the face image to the notification information, displaying a face image with a surprised expression on the display unit 241 makes the agent feel lifelike to the user and heightens the user's interest in the notification that is about to be made. This makes it possible to notify the user of the notification information more appropriately.
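As a minimal sketch in Python, the two-display switching sequence of FIG. 4 could be expressed as below. The display objects, state strings, and method names are illustrative assumptions, not taken from the publication.

```python
class TwoDisplayAgent:
    """Sketch of the agent control for fixed displays 241 (face) and 242
    (notification). Display contents are modeled as simple strings."""

    def __init__(self):
        self.display_241 = "neutral_face"  # face image display
        self.display_242 = None            # notification display, blank

    def notify(self, notification_image):
        # Switching effect: surprised expression on display 241 (FIG. 4(A))
        self.display_241 = "surprised_face"
        # Erase the face image from display 241
        self.display_241 = None
        # Display the notification information on display 242 (FIG. 4(B))
        self.display_242 = notification_image
```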
 [Example of providing notification information by rotating a housing with display units on both faces in the left-right direction]
 FIG. 5 is a diagram showing a transition example in which notification information is provided using two display units 252 and 253. FIG. 5 shows an example in which the agent device 200 includes a housing 250 (face portion 201) that is rotatable in the left-right direction, a display unit 252 (see FIGS. 5(A) and 5(B)) is installed on one face of the housing 250, and a display unit 253 (see FIG. 5(C)) is installed on the other face of the housing 250. The display unit 252 mainly displays face images, and the display unit 253 mainly displays notification information.
 Specifically, the housing 250 (face portion 201) and the body portion 202 are connected by the neck portion 251, which serves as a rotation support. The neck portion 251 is connected to the drive unit 230 (see FIG. 3), and the rotating operation of the drive unit 230 can rotate the housing 250 in the left-right direction (the direction of arrow A1) about the neck portion 251 as a rotation axis. The rotation range of the housing 250 can be set, for example, within a range of 180 to 360 degrees. For example, rotating the housing 250 by about 180 degrees in the left-right direction can reverse its front-rear orientation. In the normal state, the display unit 252 side is set to face the front, and various face images are displayed on the display unit 252, as in the examples shown in FIGS. 2(A) to 2(C).
 As in FIG. 4, the display switching determination unit 121 determines whether notification information has occurred and, when it has, notifies the agent control unit 122 to that effect.
 Next, when the display switching determination unit 121 determines that notification information has occurred, the agent control unit 122 executes control for displaying the notification information on the display unit 253. Specifically, the agent control unit 122 executes a switching effect using the housing 250 and, after erasing the face image from the display unit 252, causes the display unit 253 to display the notification information.
 For example, when notification information occurs, as shown in FIGS. 5(A) and 5(B), the agent control unit 122 executes, at a timing before the notification information is displayed, an effect for making the user feel that the agent is lifelike (a switching effect). Specifically, the agent control unit 122 controls the display state of the display unit 252 so that the face image on the display unit 252 takes on a surprised facial expression (see FIG. 2(D)).
 Next, as shown in FIG. 5(B), the agent control unit 122 erases the face image from the display unit 252 and rotates the housing 250 to the right (in the directions of arrows A2 and A3) about the neck portion 251 as a rotation axis. Note that the face image on the display unit 252 may remain displayed during this rotation and be erased when the rotation ends. In addition, before or during this rotation, the sound output unit 220 may output audio information for making the user feel that the agent is lifelike.
 Next, as shown in FIG. 5(C), the agent control unit 122 causes the display unit 253, which now faces the user of the vehicle C1, to display a notification image 410 as the notification information. In this way, when notification information occurs, a switching effect using the housing 250 (display units 252 and 253) is executed, and after the face image is erased from the display unit 252, the notification information (notification image 410) is displayed on the display unit 253.
 Thus, when the display switches from the face image to the notification information, a face image with a surprised expression is displayed on the display unit 252 and the housing 250 (face portion 201) is rotated. This rotation of the housing 250 can express a movement in which the face of the agent device 200 turns to look backward, which can heighten the user's interest in the agent device 200. It can also give the user the impression that, after the face of the agent device 200 turns around, the notification information is displayed on the head or back of the agent device 200. These effects make the agent feel lifelike to the user and heighten the user's interest in the notification that is about to be made. In addition, the display position of the face image and that of the notification information are the same as seen from the user, so the notification information can be displayed at a position that is easy to see for the user who was looking at the face image. These effects make it possible to notify the user of the notification information more appropriately.
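The turn-around sequence of FIG. 5 might be modeled as below. The object model, the fixed 180-degree turn, and the `keep_face_during_turn` option are assumptions for illustration (the publication allows a 180- to 360-degree rotation range and leaves the erase timing open).

```python
class RotatingHousingAgent:
    """Sketch of the housing-250 turn: display 252 on the front face,
    display 253 on the rear face, rotation about the neck portion 251."""

    def __init__(self):
        self.angle = 0                  # 0 deg: display 252 faces the user
        self.display_252 = "neutral_face"
        self.display_253 = None

    def notify(self, notification_image, keep_face_during_turn=False):
        self.display_252 = "surprised_face"    # switching effect (FIG. 5(A))
        if not keep_face_during_turn:
            self.display_252 = None            # erase before the turn
        self.angle = (self.angle + 180) % 360  # turn about the neck (FIG. 5(B))
        if keep_face_during_turn:
            self.display_252 = None            # or erase when the turn ends
        self.display_253 = notification_image  # rear face now faces the user
```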
 [Example of providing notification information by rotating a housing with display units on both faces in the up-down direction]
 FIG. 6 is a diagram showing a transition example in which notification information is provided using two display units 264 and 265. FIG. 6 shows an example in which the agent device 200 includes a housing 260 (face portion 201) that is rotatable in the up-down direction, a display unit 264 (see FIG. 6(A)) is installed on one face of the housing 260, and a display unit 265 (see FIG. 6(C)) is installed on the other face of the housing 260. The display unit 264 mainly displays face images, and the display unit 265 mainly displays notification information.
 Specifically, the housing 260 is installed inside a frame portion 261 that is rectangular in front view, and the left and right ends of the housing 260 are connected to the inner surfaces of the frame portion 261 by rotation support portions 262 and 263. Since the rotation support portions 262 and 263 are provided at positions corresponding to the ears of the face portion 201, they may be configured as the ears of the face portion 201. The rotation support portions 262 and 263 are connected to the drive unit 230 (see FIG. 3), and the rotating operation of the drive unit 230 rotates the face portion 201 in the up-down direction (the direction of arrow A10) about the rotation support portions 262 and 263 as a rotation axis. In this way, the housing 260 is rotatable in the up-down direction inside the frame portion 261. The rotation range of the housing 260 (face portion 201) can be set, for example, within a range of 180 to 360 degrees. For example, rotating the housing 260 by about 180 degrees in the up-down direction can reverse its front-rear orientation. In the normal state, the display unit 264 side is set to face the front, and various face images are displayed on the display unit 264, as in the examples shown in FIGS. 2(A) to 2(C).
 As in FIG. 4, the display switching determination unit 121 determines whether notification information has occurred and, when it has, notifies the agent control unit 122 to that effect.
 Next, when the display switching determination unit 121 determines that notification information has occurred, the agent control unit 122 executes control for displaying the notification information on the display unit 265. Specifically, the agent control unit 122 executes a switching effect using the housing 260 and, after erasing the face image from the display unit 264, causes the display unit 265 to display the notification information.
 For example, when notification information occurs, as shown in FIGS. 6(A) and 6(B), the agent control unit 122 executes, at a timing before the notification information is displayed, an effect for making the user feel that the agent is lifelike (a switching effect). Specifically, the agent control unit 122 controls the display state of the display unit 264 so that the face image on the display unit 264 takes on a surprised facial expression (see FIG. 2(D)).
 Next, as shown in FIG. 6(B), the agent control unit 122 erases the face image from the display unit 264 and rotates the housing 260 in the up-down direction (the directions of arrows A11 and A12) about the rotation support portions 262 and 263 as a rotation axis. Note that the face image on the display unit 264 may remain displayed during this rotation and be erased when the rotation ends. In addition, before or during this rotation, the sound output unit 220 may output audio information for making the user feel that the agent is lifelike.
 Next, as shown in FIG. 6(C), the agent control unit 122 causes the display unit 265, which now faces the user of the vehicle C1, to display a notification image 410 as the notification information. In this way, when notification information occurs, a switching effect using the housing 260 (display units 264 and 265) is executed, and after the face image is erased from the display unit 264, the notification information (notification image 410) is displayed on the display unit 265.
 Thus, when the display switches from the face image to the notification information, a face image with a surprised expression is displayed on the display unit 264 and the housing 260 (face portion 201) is rotated. This rotation of the housing 260 can express a movement in which the face of the agent device 200 turns to look downward, which can heighten the user's interest in the agent device 200. It can also give the user the impression that, after the face of the agent device 200 turns downward, the notification information is displayed on the head or back of the agent device 200. These effects make the agent feel lifelike to the user and heighten the user's interest in the notification that is about to be made. In addition, the display position of the face image and that of the notification information are the same as seen from the user, so the notification information can be displayed at a position that is easy to see for the user who was looking at the face image. These effects make it possible to notify the user of the notification information more appropriately.
 Note that the left-right rotation mechanism shown in FIG. 5 is one example; another mechanism may be adopted in which a plate-shaped housing with display units on both faces has a rotation axis at its center in the front-rear and left-right directions and is rotated about that axis. Likewise, the up-down rotation mechanism shown in FIG. 6 is one example; another mechanism may be adopted in which a plate-shaped housing with display units on both faces has a rotation axis at its center in the front-rear and up-down directions and is rotated about that axis.
 FIGS. 5 and 6 show examples in which display units are provided on both faces of the housings 250 and 260. However, swinging the housing 250 or 260 back and forth can also make the user feel that the agent is lifelike. Therefore, a display unit may be provided on only one face of the housing 250 or 260, and an operation of swinging the housing back and forth may be executed as the switching effect. In this case, when notification information occurs, the swinging switching effect is executed, and after the face image is erased from the single display unit, the notification information is displayed on that same display unit.
 In this way, it is possible to attract the user's attention by spinning the face portion 201 around or swinging it back and forth, and then to execute the notification operation for the user. This allows the user to be notified appropriately. When the face portion 201 itself starts moving in this way, the user's attention can be drawn toward what is about to be displayed.
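The swinging (wobble) switching effect for a single-display housing could be sketched as an angle sequence commanded to the drive unit; the amplitude and cycle count below are illustrative assumptions, not values from the publication.

```python
def wobble_angles(amplitude_deg=15, cycles=3):
    """Return the swing-angle sequence for the switching effect:
    the housing swings left and right, then returns to neutral."""
    seq = []
    for _ in range(cycles):
        seq += [amplitude_deg, -amplitude_deg]
    # Finish facing the user so the same display can then
    # show the notification information.
    seq.append(0)
    return seq
```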
 [Example of providing notification information using a plurality of display units realized by a movable display panel]
 FIG. 7 is a diagram showing a transition example in which notification information is provided using two display units realized by a flat display panel 270 (see FIGS. 7(B) and 7(C)) that is rotatably provided on the face portion 201.
 The display panel 270 has a first display unit 271 on one face and a second display unit 272 on the opposite face. A rotation support portion 273 is provided at one end of the display panel 270, and the display panel 270 is mounted on the display unit 280 so as to be rotatable about the rotation support portion 273 as a rotation axis. The display unit 280 is installed on the face portion 201.
 The rotation support portion 273 is connected to the drive unit 230 (see FIG. 3), and the rotating operation of the drive unit 230 can rotate the display panel 270 in the up-down direction (the direction of arrow A20 (see FIG. 7(B))) about the rotation support portion 273 as a rotation axis. The rotation range of the display panel 270 can be set, for example, to a range of 180 degrees.
 The display device constituted by the display panel 270 and the display unit 280 is controlled, by the rotating operation of the display panel 270, into either of two display states: a first state in which a face image is displayed and a second state in which notification information is displayed. As shown in FIG. 7(A), in the first state, the display panel 270 is placed over the upper region of the display unit 280, and the display region is formed by the first display unit 271 of the display panel 270 and the lower display area 281 of the display unit 280. As shown in FIG. 7(D), in the second state, the display panel 270 is placed over the lower region of the display unit 280, and the display region is formed by the second display unit 272 of the display panel 270 and the upper display area 282 of the display unit 280. In the first state, a face image is mainly displayed, and in the second state, notification information is mainly displayed.
 In this way, the movable display panel 270 functions like a lid that covers either the upper display region or the lower display region of the display unit 280. For this reason, the display panel 270 can also be referred to as a foldable display unit or a lid portion.
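The composition of the visible display region in each state can be sketched as a simple mapping. The region names follow the description; the function itself is an illustrative assumption.

```python
def visible_region(state):
    """Return the sub-displays that together form the visible screen of
    the display device made up of panel 270 and display unit 280."""
    regions = {
        # panel over the upper region: face image (FIG. 7(A))
        "first": ["first_display_271", "lower_display_area_281"],
        # panel over the lower region: notification (FIG. 7(D))
        "second": ["second_display_272", "upper_display_area_282"],
    }
    if state not in regions:
        raise ValueError("state must be 'first' or 'second'")
    return regions[state]
```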
 As in FIG. 4, the display switching determination unit 121 determines whether notification information has occurred and, when it has, notifies the agent control unit 122 to that effect.
 Next, when the display switching determination unit 121 determines that notification information has occurred, the agent control unit 122 executes control for displaying the notification information on the display device constituted by the display panel 270 and the display unit 280. Specifically, the agent control unit 122 executes a switching effect using the display panel 270 and the display unit 280 and, after erasing the face image displayed in the first state, displays the notification information in the second state.
 For example, when notification information occurs, as shown in FIGS. 7(A) to 7(C), the agent control unit 122 executes, at a timing before the notification information is displayed, an effect for making the user feel that the agent is lifelike (a switching effect). Specifically, as shown in FIG. 7(A), the agent control unit 122 controls the display states of the display panel 270 and the display unit 280 so that the face image in the first state takes on a surprised facial expression (see FIG. 2(D)).
 Next, as shown in FIGS. 7(B) and 7(C), the agent control unit 122 erases the face image from the first display unit 271 of the display panel 270 and the lower display area 281 of the display unit 280, and rotates the display panel 270 downward (in the directions of arrows A21 and A22) about the rotation support portion 273 as a rotation axis. Note that the face image may remain displayed during this rotation and be erased when the rotation ends. In addition, before or during this rotation, the sound output unit 220 may output audio information for making the user feel that the agent is lifelike.
 Next, as shown in FIG. 7(D), the agent control unit 122 displays a notification image 410 as the notification information in the second state, in which the display panel 270 is placed over the lower region of the display unit 280. That is, the notification image 410 is displayed in the display region formed by the second display unit 272 of the display panel 270 and the upper display area 282 of the display unit 280. In this way, when notification information occurs, an effect of rotating the display panel 270 to transition from the first state to the second state is executed as the switching effect, and after the face image displayed in the first state is erased, the notification information (notification image 410) is displayed in the second state. Note that, as shown in FIG. 7(C), the notification information (notification image 410) may be displayed while the display panel 270 is rotating.
 Thus, when the display switches from the face image to the notification information, a face image with a surprised expression is displayed in the first state and the display panel 270 is rotated. This rotation of the display panel 270 can express an amusing movement in which the agent device 200 seems to remove a mask from its face and reveal its true face, which can heighten the user's interest in the agent device 200. It can also give the user the amusing impression that, after the mask is removed, the notification information is displayed on the true face of the agent device 200. These effects make the agent feel lifelike to the user and heighten the user's interest in the notification that is about to be made. In addition, the display position of the face image and that of the notification information are the same as seen from the user, so the notification information can be displayed at a position that is easy to see for the user who was looking at the face image. These effects make it possible to notify the user of the notification information more appropriately.
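The ordering freedom in the panel-flip sequence of FIG. 7 (the notification may appear either after the rotation or already during it) might be sketched as follows; the step labels are illustrative names, not API calls from the publication.

```python
def flip_sequence(show_during_rotation=False):
    """Return the ordered steps of the panel-270 flip switching effect."""
    steps = ["show_surprised_face", "erase_face_image", "start_rotation"]
    if show_during_rotation:
        # FIG. 7(C): notification shown while the panel is still rotating
        steps += ["show_notification", "end_rotation"]
    else:
        # FIG. 7(D): notification shown once the second state is reached
        steps += ["end_rotation", "show_notification"]
    return steps
```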
 Although FIG. 7 shows an example in which the display panel 270 is rotated in the up-down direction about the rotation support portion 273 extending in the left-right direction, the configuration is not limited to this. For example, the display panel 270 may be rotated in the left-right direction about a rotation support portion extending in the up-down direction.
 [Example of providing notification information using a slidable display unit]
 FIG. 8 is a diagram showing a transition example in which notification information is provided using a slidable display unit 300 and a fixed display unit 310. The display unit 300 is a plate-shaped housing having a display screen 301, and the display unit 310 is a plate-shaped housing having a display screen 311.
 The display unit 300 is installed in front of the display unit 310 so as to be movable in the up-down direction. Specifically, the display unit 300 is attached to rail portions 305 and 306, which are fixed to the front of the body portion 202, so as to be slidable in a direction substantially parallel to the display surface of the display unit 310. A known technique can be adopted for the slide mechanism realized by the rail portions 305 and 306.
 The size of the display screen 301 of the display unit 300 and that of the display screen 311 of the display unit 310 may be the same (or substantially the same) or may differ. For example, the display screen 301 of the display unit 300 can be sized to cover at least part of the display screen 311 of the display unit 310.
 The display unit 300 is controlled so as to transition to either a first state in which a face image is displayed (see FIG. 8(A)) or a second state in which notification information is displayed (see FIG. 8(C)). As shown in FIG. 8(A), in the first state, the display unit 300 is placed in front of the display unit 310 so as to cover the display screen 311 of the display unit 310. As shown in FIG. 8(C), in the second state, the display unit 300 has been slid from the first state in a direction substantially parallel to the display screen 311 of the display unit 310, allowing the user to see both the display screen 301 of the display unit 300 and the display screen 311 of the display unit 310.
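The two-state slide behavior might be modeled as below; the object model, state strings, and method names are illustrative assumptions rather than an implementation from the publication.

```python
class SlidingDisplayAgent:
    """Sketch of the slidable display 300 covering the fixed display 310."""

    def __init__(self):
        self.state = "first"              # display 300 covers screen 311
        self.display_300 = "neutral_face"
        self.display_310 = None

    def notify(self, notification_image):
        self.display_300 = "surprised_face"    # switching effect (FIG. 8(A))
        self.display_300 = None                # erase the face image
        self.state = "second"                  # slide down along rails 305/306
        self.display_310 = notification_image  # screen 311 is now visible
```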
 The display unit 300 is directly or indirectly connected to the drive unit 230 (see FIG. 3), and driving the drive unit 230 can move the display unit 300 in the up-down direction (the direction of arrow A30 (see FIG. 8(B))) along the rail portions 305 and 306.
 As in FIG. 4, the display switching determination unit 121 determines whether notification information has occurred and, when it has, notifies the agent control unit 122 to that effect.
 Next, when the display switching determination unit 121 determines that notification information has occurred, the agent control unit 122 executes control for displaying the notification information on the display unit 310. Specifically, the agent control unit 122 executes a switching effect using the display unit 300 and, after erasing the face image from the display unit 300, causes the display unit 310 to display the notification information.
 For example, when notification information occurs, as shown in FIGS. 8(A) and 8(B), the agent control unit 122 executes, at a timing before the notification information is displayed, an effect for making the user feel that the agent is lifelike (a switching effect). Specifically, as shown in FIG. 8(A), the agent control unit 122 controls the display state of the display unit 300 so that the face image on the display unit 300 takes on a surprised facial expression (see FIG. 2(D)).
 次に、図8(B)に示すように、エージェント制御部122は、表示部300における顔画像を消去し、レール部305、306に沿って表示部300を下方向(矢印A31方向)に移動させる。なお、この移動時には、顔画像を表示状態とし、この移動の終了時に、顔画像を消去してもよい。また、この移動の開始前、又は、この移動時において、ユーザに生き物らしさを感じさせるための音声情報を音出力部220から出力させてもよい。 Next, as shown in FIG. 8B, the agent control unit 122 erases the face image on the display unit 300 and moves the display unit 300 downward (in the direction of arrow A31) along the rails 305 and 306. let Note that the face image may be displayed during this movement, and the face image may be erased at the end of this movement. Furthermore, before starting this movement or during this movement, the sound output unit 220 may output audio information to make the user feel like a living creature.
Next, as shown in FIG. 8(C), the agent control unit 122 causes the display unit 310 to display a notification image 410 as the notification information. Thus, when notification information is generated, an effect of transitioning from the first state to the second state by sliding the display unit 300 downward (in the direction of arrow A31) along the rail portions 305 and 306 is executed as the switching effect, and after the face image on the display unit 300 is erased, the notification information (notification image 410) is displayed on the display unit 310.
Sliding the display unit 300 vertically along the rail portions 305 and 306 in this way can give the user the impression that the display screen 301 of the display unit 300 is swiping across the display screen 311 of the display unit 310.
Thus, when the notification information is to be displayed, a face image with a surprised expression is shown on the display unit 300 at the moment of switching from the face image to the notification information, and the display unit 300 is then moved. This movement of the display unit 300 can express an amusing motion in which a filter hiding the face of the agent device 200 is removed to reveal its true appearance, which heightens the user's interest in the agent device 200. Moreover, after the filter hiding the face is removed, the user receives the amusing impression that the notification information is displayed on the agent device 200's true face. These effects make the agent feel lifelike to the user and heighten the user's interest in the notification. Furthermore, the display position of the face image and the display position of the notification information coincide as seen from the user, so the notification information appears at a position that is easy to see for a user who was looking at the face image. All of this makes it possible to notify the user of the notification information more appropriately.
In the example shown in FIG. 8, the display unit 300 in the second state is placed in front of the body portion 202, so that part of the body portion 202 is covered, and thus hidden, by the display unit 300 as seen from the user. However, when the display unit 300 is put into the second state, it may instead be housed inside the body portion 202 so that the body portion 202 is not hidden from the user. Alternatively, when the display unit 300 in the second state is placed in front of the body portion 202, the display unit 300 may display the same image as the part of the body portion 202 it covers as seen from the user (or another image related to the agent device 200). In either case, the user can still recognize the body portion 202 even while the notification information is displayed on the face portion 201, so the agent device 200 continues to feel lifelike.
[Example of providing notification information by rotating a rectangular-parallelepiped display device vertically]
FIG. 9 is a diagram showing an example of transitions when notification information is provided using two display units 324 and 325. FIG. 9 shows an example in which the agent device 200 includes a housing 320 (face portion 201) that can rotate vertically. FIG. 9(D) is a perspective view of the external appearance of this vertically rotatable housing 320 (face portion 201). As shown in FIG. 9(D), a display unit 324 (see FIG. 9(A)) is installed on one face of the housing 320, and a display unit 325 (see FIG. 9(C)) is installed on another face of the housing 320 adjacent to it. The display unit 324 mainly displays face images, and the display unit 325 mainly displays notification information.
The example shown in FIG. 9 is a modification of FIG. 6, differing in the shape of the housing 320, the positional relationship between the two display units on the housing 320, and so on.
Specifically, the housing 320 is installed inside a frame portion 321 that is rectangular when viewed from the front, and the left and right ends of the housing 320 are connected to the inner surfaces of the frame portion 321 by rotation support portions 322 and 323. Since the rotation support portions 322 and 323 are located where the ears of the face portion 201 would be, they may be configured as the ears of the face portion 201. The rotation support portions 322 and 323 are connected to the drive unit 230 (see FIG. 3), and the rotary motion of the drive unit 230 rotates the housing 320 vertically (in the direction of arrow A40) about the rotation support portions 322 and 323 as a rotation axis. The housing 320 can thus rotate vertically inside the frame portion 321. The rotation range of the housing 320 can be set, for example, within a range of 180 to 360 degrees, and rotating it vertically by about 90 degrees switches between the two display units 324 and 325. Normally, as shown in FIG. 9(A), the display unit 324 faces the front, and various face images are displayed on it as in the examples shown in FIGS. 2(A) to 2(C).
As in FIG. 4, the display switching determination unit 121 determines whether notification information has been generated and, if so, notifies the agent control unit 122 to that effect.
Next, when the display switching determination unit 121 determines that notification information has been generated, the agent control unit 122 executes control for displaying that notification information on the display unit 325. Specifically, the agent control unit 122 executes a switching effect using the housing 320 and, after erasing the face image on the display unit 324, causes the display unit 325 to display the notification information.
For example, when notification information is generated, as shown in FIGS. 9(A) and 9(B), the agent control unit 122 executes, before displaying the notification information, a switching effect intended to make the user feel that the agent is lifelike. Specifically, as shown in FIG. 9(A), the agent control unit 122 controls the display state of the display unit 324 so that the face image on the display unit 324 takes on a surprised expression (see FIG. 2(D)).
Next, as shown in FIG. 9(B), the agent control unit 122 erases the face image on the display unit 324 and rotates the housing 320 downward (in the direction of arrow A41) about the rotation support portions 322 and 323 as a rotation axis. Alternatively, the face image on the display unit 324 may remain displayed during this rotation and be erased when the rotation ends. In addition, before or during the rotation, the sound output unit 220 may output audio intended to make the user feel that the agent is lifelike.
Next, as shown in FIG. 9(C), the agent control unit 122 causes the display unit 325, which now faces the user in the vehicle C1, to display a notification image 410 as the notification information. Thus, when notification information is generated, a switching effect using the housing 320 (display units 324 and 325) is executed, and after the face image on the display unit 324 is erased, the notification information (notification image 410) is displayed on the display unit 325.
Thus, when the notification information is to be displayed, a face image with a surprised expression is shown on the display unit 324 at the moment of switching from the face image to the notification information, and the housing 320 is then moved. The rotation of the housing 320 can express a motion in which the agent device 200 bows its face downward, which heightens the user's interest in the agent device 200. Moreover, after the face turns downward, the user receives the impression that the notification information is displayed on the head or back of the agent device 200. These effects make the agent feel lifelike to the user and heighten the user's interest in the notification. Furthermore, the display position of the face image and the display position of the notification information are approximately the same as seen from the user, and the notification information can be displayed at a larger size than the face image. The notification information can therefore be displayed large, at a position that is easy to see for a user who was looking at the face image. All of this makes it possible to notify the user of the notification information more appropriately.
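The rotation-type variant can be sketched the same way: the two display units sit on adjacent faces of the housing, so a rotation of about 90 degrees swaps which one faces the user. A minimal illustrative sketch, with hypothetical names not taken from the patent:

```python
class RotatingAgentDisplay:
    """Sketch of the rotation-type switching effect of FIG. 9.

    Display unit 324 (face image) and display unit 325 (notification)
    occupy adjacent faces of housing 320; rotating the housing ~90 degrees
    about supports 322/323 swaps which unit faces the user.
    """
    def __init__(self):
        self.angle = 0        # 0 degrees: display unit 324 faces the user
        self.front = "324"    # which display unit currently faces the user
        self.face = "normal"  # face image state on display unit 324

    def notify(self, info: str) -> str:
        self.face = "surprised"                 # switching effect on unit 324
        self.face = ""                          # erase the face image
        self.angle = (self.angle + 90) % 360    # rotate about supports 322/323
        self.front = "325" if self.angle % 180 == 90 else "324"
        return f"display {self.front}: {info}"  # notification on the new front

r = RotatingAgentDisplay()
print(r.notify("door ajar"))
```

Because the two units are on adjacent faces rather than opposite ones (as in FIG. 6), a quarter turn rather than a half turn is enough to switch them.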
Also, as shown in FIG. 9(D), the face image is displayed on the horizontally elongated display unit 324, and the notification information is displayed on the vertically wider display unit 325. This lets the user grasp visually and intuitively that information more important than the face image is being displayed on the display unit 325. In addition, rotating the display unit 324 (which displays the face image) and displaying the notification information on the comparatively large display unit 325 adjacent above it gives the user the amusing impression that the notification information is displayed on the agent device 200's back near its face (or on its belly in front of its face). These effects make the agent feel lifelike to the user and heighten the user's interest in the notification information.
Note that although FIG. 9 shows an example in which the face image is displayed on the horizontally elongated display unit 324 and the notification information on the vertically wider display unit 325, the arrangement is not limited to this. For example, the notification information may be displayed on the horizontally elongated display unit 324 and the face image on the vertically wider display unit 325, or the face image and the notification information may be displayed on two other adjacent faces.
Also, although FIG. 9 shows an example using two display units provided on two adjacent faces of the rectangular-parallelepiped housing 320, two display units provided on two adjacent faces of a cube-shaped housing may be used instead.
Likewise, although FIG. 9 shows an example in which the rectangular-parallelepiped housing 320 is rotated vertically, this is not limiting. For example, a rectangular-parallelepiped housing configured to display a vertically elongated face image may be rotated horizontally.
[Example of providing notification information using one display unit]
FIGS. 4 to 9 show examples in which notification information is displayed by switching the screen contents of a plurality of display units. However, by switching the screen content displayed on a single display unit, it is also possible to make the user perceive the face-image screen and the notification-information screen as different screens. FIG. 10 therefore shows an example in which notification information is displayed by switching the screen content of one display unit.
FIG. 10 is a diagram showing an example of transitions when notification information is provided using a single display unit 210. FIG. 10 shows an example in which the display unit 210 has a display function capable of displaying images with different display characteristics. Here, display characteristics means the characteristics with which an image is displayed, such as image quality, color, resolution, and the presence or absence of stereoscopy. For example, different display characteristics can be realized by switching between a stereoscopic image and a planar image, by switching between high and low image quality, or by switching between a color image and a monochrome image.
For example, displaying the face image as a stereoscopic image and the notification information as a planar image on the display unit 210 makes it possible to display the two with different display characteristics. Likewise, the face image may be displayed as a high-quality image and the notification information as a low-quality image; in this case the high-quality face image is, for example, a colorful, attractive image, while the low-quality image is, for example, at least one of a monochrome image, a pixelated image, and a mosaic image. Similarly, the face image may be displayed as a color image and the notification information as a monochrome image.
As in FIG. 4, the display switching determination unit 121 determines whether notification information has been generated and, if so, notifies the agent control unit 122 to that effect.
Next, when the display switching determination unit 121 determines that notification information has been generated, the agent control unit 122 executes control for displaying that notification information on the display unit 210. Specifically, the agent control unit 122 executes a switching effect using the display unit 210 and, after erasing the face image on the display unit 210, causes the display unit 210 to display the notification information.
For example, when notification information is generated, as shown in FIG. 10(A), the agent control unit 122 executes, before displaying the notification information, a switching effect intended to make the user feel that the agent is lifelike. Specifically, as shown in FIG. 10(A), the agent control unit 122 controls the display state of the display unit 210 so that the face image on the display unit 210 takes on a surprised expression (see FIG. 2(D)).
Next, as shown in FIG. 10(B), the agent control unit 122 erases the face image on the display unit 210 and causes the display unit 210 to display a notification image 400 as the notification information. When transitioning the display from the face image to the notification information, an effect that makes the user perceive the face-image screen and the notification-information screen as different screens may be executed as the switching effect. For example, as described above, switching from a stereoscopic image to a planar image, from high to low image quality, or from a color image to a monochrome image can serve as the switching effect, as can changing the color scheme or the resolution. In this way the display unit 210 can realize different display characteristics (a first display characteristic and a second display characteristic).
Also, when transitioning the display from the face image to the notification information, the switching effect may be realized through software image processing. For example, an effect in which the face image gradually disappears may be followed by an effect in which the notification information gradually appears. Making the face image fade out in this way gives the user a sense of anticipation that some information is about to appear, and directs the user's attention toward the display unit 210. In addition, before or during this switch, the sound output unit 220 may output audio intended to make the user feel that the agent is lifelike.
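The software-based fade-out/fade-in effect can be sketched as a sequence of opacity steps. A minimal sketch under stated assumptions: the step count, the `(layer, alpha)` representation, and the linear ramp are illustrative choices, not values from the patent.

```python
def crossfade_steps(n: int = 5) -> list:
    """Return (layer, alpha) pairs: the face image fades out, then the
    notification fades in. Alpha is opacity in [0, 1]; n is the number
    of steps per phase (an assumed, illustrative parameter)."""
    steps = []
    for i in range(n + 1):   # face image gradually disappears
        steps.append(("face", round(1 - i / n, 2)))
    for i in range(n + 1):   # notification information gradually appears
        steps.append(("notification", round(i / n, 2)))
    return steps

for layer, alpha in crossfade_steps(2):
    print(layer, alpha)
```

A renderer would apply each alpha in turn at the display's frame rate; the gap between the face reaching alpha 0 and the notification reaching full opacity is what creates the anticipation described above.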
Thus, when the notification information is to be displayed, a face image with a surprised expression is shown on the display unit 210 at the moment of switching from the face image to the notification information, and the notification information is then displayed with different display characteristics. This makes the agent feel lifelike to the user and heightens the user's interest in the notification. The difference in display characteristics also allows the user to clearly distinguish the face image from the notification information, so the notification information can be conveyed to the user more appropriately.
Note that FIG. 10 shows an example in which notification information is displayed with display characteristics different from those of the face image by switching the screen content of a single display unit 210. However, in the embodiments that display notification information by switching the screen contents of a plurality of display units (for example, FIGS. 4 to 9), the display units may likewise each be given different display characteristics.
For example, the first display unit may be an LCD (Liquid Crystal Display) and the second an OLED (Organic Light-Emitting Diode) display; the first may have a glossy screen and the second a non-glossy screen; the first may be flat and the second curved; the first may be a stereoscopic screen capable of displaying stereoscopic images and the second a screen that displays planar images; or the two may differ in size. When the first display unit is an LCD and the second an OLED display, for example, the displayed images may look the same but differ in brightness, allowing different effects to be performed. In these cases, when the notification information is displayed on the second display unit, a switching effect is executed on at least one of the first and second display units, for example by moving the housings of the first and second display units or by displaying a surprised expression on the first display unit.
[Operation example of the information processing device]
FIG. 11 is a flowchart showing an example of the notification information output processing in the information processing device 110. This processing is executed by the control unit 120 based on a program stored in the storage unit 130, and it is executed every control cycle. It is described below with reference to FIGS. 1 to 10 as appropriate.
In step S501, the display switching determination unit 121 executes analysis processing that analyzes input information. This input information includes information input to the information processing device 110 from the camera 101, the position information acquisition sensor 102, the audio input unit 103, the sensors 104, and the like, as well as information acquired from outside via the communication unit 140 and input to the information processing device 110.
In step S502, the display switching determination unit 121 determines, based on the analysis result of the input information, whether there is information (notification information) that should be conveyed to the occupants of the vehicle C1.
For example, the remaining battery level of the vehicle C1 can be detected based on the battery sensor. When it is determined that the remaining battery level is at or below a threshold, it is determined that there is notification information for reporting that the battery level is low.
As another example, based on a seat-belt sensor that detects whether a seat belt is fastened and a seating sensor (or seat sensor) that detects whether each seat of the vehicle C1 is occupied, it can be determined whether the occupant of each seat of the vehicle C1 is wearing a seat belt. If it is determined that an occupant is seated but not wearing a seat belt, it is determined that there is notification information for that occupant.
Similarly, an ajar door of the vehicle C1 can be detected based on ajar-door sensors that detect whether each door of the vehicle C1 is ajar. When an ajar door is detected, it is determined that there is notification information concerning that door.
Also, based on information acquired from an external device, for example an information providing server, via the communication unit 140 and on the current location of the vehicle C1 acquired by the position information acquisition sensor 102, the presence or absence of landmarks, commercial facilities, and the like around the vehicle C1 can be obtained. For example, if it is determined that a commercial facility such as a coffee shop exists around the vehicle C1, it is determined that an advertisement for that coffee shop exists as notification information; if it is determined that a tourist attraction such as ABC Castle exists nearby, it is determined that information guiding the user to ABC Castle exists as notification information.
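The determination checks of steps S501 and S502 can be sketched as a single function over the analyzed input. This is an illustrative sketch only: the battery threshold, dictionary field names, and message strings are assumptions, not values from the patent.

```python
# Hypothetical vehicle state -> pending notification information (S501/S502).
def find_notifications(state: dict) -> list:
    pending = []
    if state.get("battery_pct", 100) <= 20:      # battery sensor (assumed threshold)
        pending.append("battery low")
    for seat in state.get("seats", []):          # seating + seat-belt sensors
        if seat["occupied"] and not seat["belted"]:
            pending.append(f"seat belt unfastened: {seat['name']}")
    if state.get("door_ajar"):                   # ajar-door sensor
        pending.append("door ajar")
    for poi in state.get("nearby", []):          # server info + current location
        pending.append(f"nearby: {poi}")
    return pending

print(find_notifications({
    "battery_pct": 15,
    "seats": [{"name": "driver", "occupied": True, "belted": False}],
    "door_ajar": True,
    "nearby": ["coffee shop"],
}))
```

An empty return list corresponds to the "no" branch of step S503, in which the notification information output processing simply ends for that control cycle.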
In step S503, the display switching determination unit 121 determines whether information that should be conveyed to the occupants of the vehicle C1 has arisen. If such information has arisen, the process advances to step S504; otherwise, the notification information output processing ends.
In step S504, the agent control unit 122 executes the switching effect from the face image to the notification information. For example, as shown in FIGS. 4(A), 5(A) and (B), 6(A) and (B), 7(A) to (C), 8(A) and (B), 9(A) and (B), and 10(A), a switching effect using the various parts of the agent device 200 is executed. Audio related to the switching effect may also be output from the sound output unit 220 at this time. Although FIGS. 5 to 9 show examples in which both the surprised facial expression and the movement of the face portion 201 are executed as the switching effect, the movement of the face portion 201 alone may be executed as the switching effect.
 In step S505, the agent control unit 122 causes the agent device 200 to execute output processing for the notification information. For example, as shown in FIGS. 4(B), 5(C), 6(C), 7(D), 8(C), 9(C), and 10(B), notification images 400 and 410 are displayed on the display section 210. In this case, audio information related to the notification information may be output from the sound output unit 220.
 In step S506, the agent control unit 122 determines whether the end timing of the notification information output in step S505 has arrived. If it has, the process advances to step S507; otherwise, the process returns to step S505.
 Here, the end timing of the notification information can be, for example, when the occupant of the vehicle C1 performs a predetermined action corresponding to the notification information, when a predetermined time has elapsed since the notification information was displayed, or when the current location of the vehicle C1 moves outside the display area associated with the notification information.
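The three end-timing conditions above can be combined into a single predicate, sketched below. The parameter names and the 600-second default are illustrative assumptions, not values stated in the embodiment.

```python
def notification_ended(action_done, seconds_displayed, inside_display_area,
                       max_display_seconds=600):
    """Return True when display of the notification information should end.

    action_done         -- the occupant performed the predetermined action
                           (e.g. fastened the seatbelt, closed the door)
    seconds_displayed   -- time elapsed since the notification was displayed
    inside_display_area -- the vehicle C1 is still inside the notification's
                           display area (e.g. within 2 km of the facility)
    """
    if action_done:
        return True
    if seconds_displayed >= max_display_seconds:
        return True
    if not inside_display_area:
        return True
    return False
```

Step S506 would evaluate this predicate each control cycle and advance to step S507 as soon as it returns True.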
 For example, if the notification information output in step S505 concerns the remaining battery level, the predetermined action is an increase in the remaining battery level or an operation to charge the battery. An increase in the remaining battery level can be determined based on information detected by the battery sensor. Battery charging can be determined based on the exchange between the charging equipment and the vehicle C1 via the charging cable when the charging cable is connected to the vehicle C1.
 Similarly, if the notification information output in step S505 concerns fastening a seatbelt, the predetermined action is the occupant seated in the seat targeted by the notification fastening the seatbelt. Whether the seatbelt has been fastened can be determined based on information detected by the seatbelt sensor.
 Likewise, if the notification information output in step S505 concerns a door of the vehicle C1 being ajar, the predetermined action is closing the door targeted by the notification. Whether the door has been closed can be determined based on information detected by the door sensor.
 Further, if the notification information output in step S505 concerns commercial facilities, tourist facilities, or the like around the vehicle C1, the predetermined action is moving the vehicle C1 to the location targeted by the notification. Whether the vehicle C1 has been moved to that location can be determined based on the position information of the vehicle C1 acquired by the position information acquisition sensor 102. In this case, an occupant uttering speech about the location targeted by the notification may also serve as the predetermined action. For example, if the notification information concerns a commercial facility around the vehicle C1, such as the XYZ coffee shop, an utterance about the XYZ coffee shop, such as "XYZ coffee sounds nice" or "XYZ coffee looks delicious", can be treated as the predetermined action. Similarly, if the notification information concerns a tourist facility around the vehicle C1, such as ABC Castle, an utterance about ABC Castle, such as "I like ABC Castle" or "I want to see ABC Castle", can be treated as the predetermined action. Whether such an utterance has occurred can be determined based on the degree of matching (or similarity) between the voice information acquired by the voice input unit 103 and a predetermined keyword stored in the storage unit 130. A known voice recognition technique can be employed to determine the degree of matching (or similarity).
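The keyword-matching step can be sketched as below. This is a minimal stand-in: it compares already-transcribed text with `difflib` string similarity in place of the "known voice recognition technique" the embodiment refers to, and the keyword list and 0.75 threshold are hypothetical.

```python
from difflib import SequenceMatcher

# Hypothetical keywords that would be stored in the storage unit 130 for a
# notification about the XYZ coffee shop.
KEYWORDS = ["XYZ coffee sounds nice", "XYZ coffee looks delicious"]

def utterance_matches(utterance, keywords=KEYWORDS, threshold=0.75):
    """Treat the utterance as the predetermined action when its similarity
    to any stored keyword reaches the threshold."""
    return any(
        SequenceMatcher(None, utterance.lower(), k.lower()).ratio() >= threshold
        for k in keywords
    )
```

An utterance close to a stored keyword ends the notification, while unrelated speech ("turn on the radio") does not.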
 Further, if the notification information output in step S505 concerns commercial facilities, tourist facilities, or the like around the vehicle C1, the end timing of the notification information may be the time at which a predetermined period has elapsed since the notification information was displayed, or the time at which the current location of the vehicle C1 moves outside the display area of the notification information. For example, if the notification information concerns a commercial facility around the vehicle C1, such as the XYZ coffee shop, the end timing can be the time at which the vehicle C1 moves outside a predetermined area based on the XYZ coffee shop, for example outside a radius of 2 km centered on the XYZ coffee shop.
 In step S507, the agent control unit 122 executes notification information termination processing to end the display of the notification information output in step S505, and then executes display processing to display the face image. For example, if the notification information concerns a commercial facility or a tourist facility, erasure processing to erase the notification information is executed, followed by display processing to display the face image. Note that in the examples shown in FIGS. 5 to 9, the display section that displays the face image moves when the notification information is displayed. Therefore, in the face image display processing in step S507, the face image is displayed after the display section that displays the face image has been returned to its original position.
 [Example of changing the timing of switching to notification information based on the importance of the notification information]
 FIG. 12 is a flowchart illustrating an example of the notification information output processing in the information processing device 110. This processing is executed by the control unit 120 based on a program stored in the storage unit 130, and is executed repeatedly in every control cycle. The following description refers to FIGS. 1 to 11 as appropriate.
 Note that this output processing is a partial modification of the notification information output processing shown in FIG. 11; specifically, it differs in that steps S511 and S512 are added. The other points are the same as in the processing shown in FIG. 11, so the same reference numerals as in FIG. 11 are used and their description is omitted.
 In step S511, the display switching determination unit 121 determines whether the importance of the notification information to be output is high. The importance of the notification information can be determined based on, for example, its urgency. For example, information concerning the driving of the vehicle C1 or the safety of its occupants is assumed to have a high degree of urgency, so the importance of such notification information is set high. For example, the importance of notification information concerning seatbelt fastening, a door being ajar, the remaining battery level, and the like is set higher than the reference level. On the other hand, the provision of information about the surroundings of the vehicle C1, tourist information, gourmet information, and the like is assumed to have a low degree of urgency, so the importance of such notification information is set lower than the reference level. For example, the importance of notification information concerning facilities around the vehicle C1 (for example, restaurants and commercial facilities) is set low. The importance level of each piece of notification information is stored in the notification information DB 132 (see FIG. 3).
 If the importance of the notification information to be output is high, the process advances to step S504; if it is low, the process advances to step S512.
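The branch in step S511 can be sketched with a small lookup table standing in for the notification information DB 132; the category names below are hypothetical labels chosen for illustration.

```python
# Hypothetical importance table, standing in for the notification information
# DB 132: "high" corresponds to driving/safety items, "low" to information
# about the vehicle's surroundings.
IMPORTANCE = {
    "seatbelt": "high",
    "door_ajar": "high",
    "battery_low": "high",
    "facility_ad": "low",
    "tourist_guide": "low",
}

def next_step(notification_kind):
    """Return the flowchart step to branch to from step S511."""
    if IMPORTANCE.get(notification_kind, "low") == "high":
        return "S504"   # execute the switching effect immediately
    return "S512"       # smoothly erase the face image first
```

A seatbelt warning thus goes straight to the switching effect, while a facility advertisement first passes through the smooth-erase effect of step S512.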
 In step S512, the agent control unit 122 executes an effect that smoothly erases the face image. This effect is realized, for example, by smoothly displaying a gesture of the face image. For example, the face image can be erased smoothly by expressing the gesture of the surprised face image shown in FIG. 4(A) with a relatively slow movement before erasing it. For notification information of low importance, there is little need to notify the user in a hurry, so it is possible to move to the output processing of the notification information while enhancing the creature-like quality of the agent device 200. That is, the notification information can be displayed after the face image has been smoothly erased.
 Note that in the examples shown in FIGS. 5 to 9, the display section that displays the face image is moved when the notification information is displayed. In such cases, the movement of the display section may make the face image difficult to see. Therefore, in the examples shown in FIGS. 5 to 9, the process of smoothly erasing the face image may be omitted when switching to the notification information.
 Further, when the importance of the notification information is low, the face image and the notification information may be displayed simultaneously for a time. Alternatively, an effect may be executed in which the face image is gradually erased while the notification information is gradually displayed.
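One way to realize the gradual erase/display effect is a linear cross-fade of the two images' opacities, sketched below; the one-second duration and 10 fps frame rate are illustrative assumptions.

```python
def crossfade_alphas(duration_s=1.0, fps=10):
    """Yield (face_alpha, notification_alpha) pairs that gradually erase the
    face image while gradually displaying the notification information."""
    steps = int(duration_s * fps)
    for i in range(steps + 1):
        t = i / steps
        yield (round(1.0 - t, 3), round(t, 3))
```

A renderer would draw each frame with the face image at `face_alpha` and the notification image at `notification_alpha`, so the two are briefly visible together during the transition.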
 [Example of delaying the timing of switching to notification information based on whether a face image effect is in progress]
 FIG. 13 is a flowchart illustrating an example of the notification information output processing in the information processing device 110. This processing is executed by the control unit 120 based on a program stored in the storage unit 130, and is executed repeatedly in every control cycle. The following description refers to FIGS. 1 to 12 as appropriate.
 Note that this output processing is a partial modification of the notification information output processing shown in FIG. 12; specifically, it differs in that step S512 is omitted and steps S521 and S522 are added. The other points are the same as in the processing shown in FIG. 12, so the same reference numerals as in FIG. 12 are used and their description is omitted.
 In step S521, the display switching determination unit 121 determines whether a face image effect is in progress. For example, if a predetermined exchange (for example, a conversation) is taking place between the agent device 200 and an occupant of the vehicle C1, it is determined that a face image effect is in progress. Likewise, if the agent device 200 is executing a behavior to express its autonomy (for example, a sleeping or tired behavior), it is determined that a face image effect is in progress. For example, if the agent device 200 eagerly outputs advertising information for a ramen restaurant near the vehicle C1 as notification information while behaving as if it is about to fall asleep, the user may feel a sense of incongruity. In that case, the user may no longer feel trust in or attachment to the agent device 200 as a living creature. Therefore, to emphasize the autonomy of the agent device 200 and give a sense of consistency to its expression of autonomy, notification information of low importance is not output while the agent device 200 is executing a behavior to express its autonomy. This makes it possible to increase the user's trust in and attachment to the agent device 200 as a living creature, and to increase interest in the notification information. If a face image effect is in progress, the process advances to step S522; otherwise, the process advances to step S504.
 In step S522, the display switching determination unit 121 determines whether the notification information determined in step S503 is still being generated. That is, it is conceivable that output of the notification information becomes unnecessary for some reason; in that case, it is determined that the notification information is no longer being generated. If the notification information is still being generated, the process returns to step S521; otherwise, the notification information output process ends.
 For example, in the case of notification information concerning an advertisement for a commercial facility, output of the notification information becomes unnecessary when the vehicle C1 has moved far away from that facility. Likewise, output of such notification information may be made unnecessary when a predetermined time (for example, about 10 minutes) has elapsed since the notification information was generated.
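The S521/S522 loop can be sketched as a polling wait, shown below. The two callables are hypothetical stand-ins for queries to the display switching determination unit 121, and the poll interval and timeout are illustrative.

```python
import time

def wait_for_effect_to_finish(effect_in_progress, notification_valid,
                              poll_interval=0.5, max_wait=600.0):
    """Loop over steps S521/S522: defer a low-importance notification while a
    face image effect is in progress, and abandon it if it becomes invalid.

    effect_in_progress / notification_valid -- callables returning bool.
    Returns True if the notification should now be output (proceed to S504),
    False if it should be dropped.
    """
    waited = 0.0
    while waited < max_wait:
        if not notification_valid():        # S522: no longer being generated
            return False
        if not effect_in_progress():        # S521: effect has finished
            return True                     # proceed to step S504
        time.sleep(poll_interval)
        waited += poll_interval
    return False
```

For example, a facility advertisement is dropped as soon as `notification_valid()` turns False (the vehicle left the area or 10 minutes elapsed), even if the effect never ends.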
 Here, it is important to prevent the switching effect of the agent device 200 from distracting the driver while driving. For example, while the driver in the driver's seat is driving, the driver may be aware of the switching effect of the agent device 200 but may not direct his or her line of sight toward it. Therefore, the switching effect of the agent device 200 is preferably executed while the vehicle C1 is stopped. However, when the effect is directed at an occupant other than the driver, it may be executed while the vehicle C1 is traveling.
 [Example of an agent device that can be installed somewhere other than a vehicle]
 The above description has shown an example of the agent device 200 installed in the vehicle C1. However, this embodiment is also applicable to an agent device that is removable from the vehicle C1, an agent device that can be installed somewhere other than a vehicle, and the like. For example, a user carrying a portable agent device may place the agent device on the dashboard 2 of the vehicle C1 when getting into the vehicle, and take it along when getting out. It is also conceivable that the user uses the agent device at home.
 For example, in the home, notification information concerning devices installed in the home can be output. For example, a notification may be given when the bath has finished filling with hot water or when a cooking appliance has finished cooking. A notification may also be given when the entrance door or a window is open, or when the user has forgotten to turn off the gas stove. In these cases, notification information concerning the bath, cooking appliances, doors, windows, and the like is provided. The importance can be set based on safety, urgency, and the like: for objects with high urgency, the importance can be set higher than the reference level, and for objects with low urgency, lower than the reference level. For example, objects with high urgency, for which safety must be maintained, include the gas stove (to prevent the fire from being left on) and the entrance door and windows (to notify that they are open). On the other hand, objects with low urgency include, for example, the bath (to notify when it has finished filling) and cooking appliances (to notify when cooking is finished).
 [Example of representing the agent as a two-dimensional image]
 The above description has used the three-dimensional agent device 200 as an example. However, as long as the creature-like quality of the agent can be expressed, this embodiment may also be applied by displaying the agent as a two-dimensional image. In this case, both the face part and the body part are displayed as two-dimensional images.
 [Examples of effects of this embodiment]
 Here, when the display target of the agent device 200 is switched from the face image to the notification information, simply switching from one to the other does not clearly separate the face image from the notification information, so the agent device 200 may appear insufficiently creature-like. For example, if notification information is mechanically displayed on the face part 201 on which the face image is intended to be recognized, the user may come to recognize the face part 201 as a machine. Similarly, if the notification information is displayed abruptly after the face image, the user may come to recognize the area displaying the face image as a mechanical display screen. In these cases, the user's attachment to the agent device 200 may decrease, and the user's interest in the notification information displayed on the agent device 200 may decrease, making it difficult to convey the notification information to the user appropriately.
 Further, when switching the display target of the agent device 200 from the face image to the notification information, displaying the face image and the notification information mixed together may confuse the user. Therefore, in this embodiment, instead of displaying them mixed together, a switching effect is executed between the time the face image is displayed and the time the notification information is displayed, and the face image and the notification information are displayed separately. That is, the switching effect is executed between the display of the face image and the display of the notification information so that the two are clearly separated.
 In this way, the switching effect is executed after (or during) the display of the face image, and the notification information is displayed after the switching effect, so the user can prepare for the notification information that follows. That is, by clearly distinguishing the face image display screen, which makes emotional movements, from the notification information display screen, which reports predetermined information, the user can to some extent anticipate what will be displayed after the switching effect. For example, when the switching effect is executed from the display of the face image, the user can anticipate that important notification information will be displayed afterward. Displaying the notification information after the switching effect in this way makes the notification information predictable and alleviates confusion for the user, which makes it possible to convey the notification information appropriately.
 Here, consider the case where the agent device 200 outputs the notification information without executing the switching effect and the case where it outputs the notification information after executing the switching effect. It is assumed that the agent device 200 is more often felt to be creature-like in the latter case. Therefore, in this embodiment, the notification information is output after the switching effect is executed. This makes it possible for the user to feel the creature-like quality of the agent device 200 and to convey the notification information appropriately.
 Further, in this embodiment, by executing effects that give the agent device 200 a sense of intelligence while ensuring its autonomy, the notification information can be conveyed efficiently while the creature-like quality is maintained.
 Further, in this embodiment, when notification information of high urgency, for example notification information of high importance, occurs, the notification information can be displayed after quickly executing the switching effect for displaying the face image and the notification information exclusively. By quickly executing the switching effect at least in an emergency in this way, the user can be made to feel the creature-like quality of the agent device 200, and the notification information can be conveyed quickly.
 Note that although the above description has shown an example in which the notification information output processing and the like are executed by the agent device 200 and the information processing device 110 (or the information processing system 100), all or part of each of these processes may be executed by another device. In that case, an information processing system is constituted by the devices each executing a part of these processes. For example, at least a part of each process can be executed using in-vehicle devices, devices usable by the user (for example, smartphones, tablet terminals, personal computers, car navigation devices, and IVI systems), and various information processing devices and electronic devices such as servers connectable via a predetermined network such as the Internet.
 Further, a part (or all) of an information processing system capable of executing the functions of the information processing device 110 (or the information processing system 100) may be provided by an application that can be provided via a predetermined network such as the Internet, for example as SaaS (Software as a Service).
 [Configuration examples of this embodiment and their effects]
 The information processing method according to this embodiment is an information processing method that provides notification information to a user using an agent device 200 having a display section 210 (an example of a display device) that displays a face image. This information processing method includes control processing (steps S503 to S507) that controls the display mode of the display section 210. In the control processing, when notification information is to be displayed on the display section 210, a switching effect using the display section 210 is executed and the display of the face image is erased before the notification information is displayed. The program according to this embodiment is a program that causes a computer to execute each of these processes; in other words, a program that causes a computer to realize each function executable by the information processing device 110.
 According to this configuration, when switching from the face image to the notification information, executing a switching effect using the display unit 210 (for example, displaying a surprised expression) makes the agent feel lifelike to the user and heightens the user's interest in the notification being presented. This makes it possible to convey the notification information to the user more appropriately.
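 The ordering of the control processing described above can be sketched in a few lines of Python. This is a hypothetical model for illustration only, not the patented implementation; `DisplayUnit`, `notify`, and the log strings are invented names. The essential sequence is: execute the switching effect, erase the face image, then display the notification information.

```python
from dataclasses import dataclass


@dataclass
class Notification:
    text: str


class DisplayUnit:
    """Hypothetical stand-in for display unit 210."""

    def __init__(self):
        self.shown = None
        self.log = []           # records the order of display operations

    def show_face(self):
        self.shown = "face"
        self.log.append("face")

    def play_switch_effect(self):
        # e.g. briefly display a surprised expression
        self.log.append("switch_effect")

    def erase_face(self):
        self.shown = None
        self.log.append("erase")

    def show_notification(self, n: Notification):
        self.shown = n.text
        self.log.append("notify")


def notify(display: DisplayUnit, n: Notification):
    """Control processing (cf. steps S503-S507): effect, erase, then notify."""
    display.play_switch_effect()
    display.erase_face()
    display.show_notification(n)
```

A caller would first show the face, then invoke `notify`; the log then reads `["face", "switch_effect", "erase", "notify"]`, reflecting the claimed ordering.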
 In the information processing method according to this embodiment, the display unit 210 includes a first display section (display sections 241, 252, 264, 271 and 281, 300, 324) that displays the face image and a second display section (display sections 242, 253, 265, 272 and 282, 310, 325) that displays the notification information. In the control processing (steps S504, S505), when the notification information is to be displayed on the second display section, a switching effect is executed on at least one of the first display section and the second display section, and the notification information is displayed on the second display section after the display of the face image on the first display section is erased. For example, a surprised expression may be displayed as a switching effect using the first display section, or a housing provided with the first and second display sections may be rotated as a switching effect using both sections. It is also possible, for example, to execute a switching effect using the first or second display section by displaying images with different display characteristics on at least one of them.
 According to this configuration, executing a switching effect when switching from the face image to the notification information makes the agent feel lifelike to the user and heightens the user's interest in the notification. In addition, because the face image and the notification information are shown on different display sections, the user can clearly distinguish between them. The notification information can therefore be conveyed to the user more appropriately.
 In the information processing method according to this embodiment, the control processing (step S504) executes, as the switching effect, an effect of moving the display unit 210. For example, as shown in FIGS. 5(A)(B), 6(A)(B), 7(A) to (C), 8(A)(B), and 9(A)(B), a switching effect that moves the display section of the agent device 200 is executed.
 According to this configuration, executing an effect of moving the display unit 210 when switching from the face image to the notification information makes the agent feel lifelike to the user and heightens the user's interest in the notification. The notification information can therefore be conveyed to the user more appropriately.
 In the information processing method according to this embodiment, the display unit 210 (an example of a display device) consists of a cube-shaped or rectangular-parallelepiped housing in which a first display section (display sections 252, 264, 324) that displays the face image is provided on one surface and a second display section (display sections 253, 265, 325) that displays the notification information is provided on another surface different from the one surface. In the control processing (steps S504, S505), when the notification information is to be displayed on the second display section, a rotation operation that switches the surface of the display device visible to the user from the one surface to the other surface is executed as the switching effect, and the notification information is displayed on the second display section after the display of the face image on the first display section is erased.
 According to this configuration, executing the rotation of the display unit 210 as the switching effect when switching from the face image to the notification information makes the movement of the face feel lifelike to the user and heightens the user's interest in the notification. The notification information can therefore be conveyed to the user more appropriately.
 In the information processing method according to this embodiment, the display unit 210 (an example of a display device) consists of a flat-plate housing in which the first display section (display sections 252, 264) is provided on one surface and the second display section (display sections 253, 265) is provided on the other surface, which is the surface opposite the one surface. In the control processing (step S504), a rotation operation that rotates the first and second display sections 180 degrees about a rotation axis is executed as the switching effect. For example, as shown in FIGS. 5(A)(B) and 6(A)(B), a switching effect that rotates the display section of the agent device 200 is executed.
 According to this configuration, executing the rotation of the display unit 210 as the switching effect when switching from the face image to the notification information lets the user perceive lifelike movement, as if the agent's face were turning to look behind or downward. This heightens the user's interest in the notification, and the notification information can therefore be conveyed to the user more appropriately.
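 The 180-degree rotation variant admits a simple state-machine sketch (a hedged illustration only; `RotatingAgentDisplay` and its attributes are invented names that do not appear in the patent): the face side is erased first, the housing is then rotated so the opposite surface faces the user, and the notification is shown on that surface.

```python
class RotatingAgentDisplay:
    """Hypothetical flat two-sided housing: face on the front surface
    (0 degrees toward the user), notification display on the back
    (visible after a 180-degree rotation about the housing's axis)."""

    def __init__(self):
        self.angle = 0            # degrees; 0 = face side toward the user
        self.face_visible = True
        self.back_content = None

    def rotate(self, degrees):
        self.angle = (self.angle + degrees) % 360

    def show_notification(self, text):
        # switching effect: erase the face, then turn the housing so the
        # back surface (the notification display) faces the user
        self.face_visible = False
        self.rotate(180)
        self.back_content = text
```

The 90-degree variant of FIGS. 9(A)(B) would differ only in the rotation amount and in which adjacent surface carries the second display section.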
 In the information processing method according to this embodiment, the display unit 210 (an example of a display device) consists of a cube-shaped or rectangular-parallelepiped housing in which the display section 324 (an example of the first display section) is provided on one surface and the display section 325 (an example of the second display section) is provided on another surface adjacent to the one surface. In the control processing (step S504), a rotation operation that rotates the display sections 324 and 325 by 90 degrees about a rotation axis is executed as the switching effect. For example, as shown in FIGS. 9(A)(B), a switching effect that rotates the display sections 324 and 325 of the agent device 200 is executed.
 According to this configuration, the user can be given the amusing impression that the notification information is displayed on the back behind the face of the agent device 200 (or on the abdomen in front of the face). This gives the agent a lifelike quality and heightens the user's interest in the notification information.
 In the information processing method according to this embodiment, the display unit 210 (an example of a display device) includes a flat display panel 270, in which the first display section 271 is provided on one surface and the second display section 272 on the opposite surface, and a display section 280 (an example of a third display section). The display panel 270 is mounted on the display section 280 so as to be rotatable about one end as a rotation axis. By rotation of the display panel 270, the display unit 210 is controlled to take one of two display states: a first state in which the face image is displayed in a first display area formed by the first display section 271 and part of the display section 280, and a second state in which the notification information is displayed in a second display area formed by the second display section 272 and part of the display section 280. In the control processing (steps S504, S505), when the notification information is to be displayed on the display unit 210, an effect of rotating the display panel 270 to transition from the first state to the second state is executed as the switching effect, and the notification information is displayed in the second display area after the display of the face image in the first display area is erased. For example, as shown in FIGS. 7(A) to (C), a switching effect that rotates the display panel 270 of the agent device 200 is executed.
 According to this configuration, because the display panel 270 is rotated to execute the switching effect from the face image to the notification information, the unexpected movement of the display panel 270 makes the agent feel lifelike to the user and heightens the user's interest in the notification. The notification information can therefore be conveyed to the user more appropriately.
 In the information processing method according to this embodiment, the display unit (an example of a display device) includes a display section 300 (an example of the first display section) that displays the face image and a display section 310 (an example of the second display section) that displays the notification information. The display section 300 is installed so that it can transition between a first state in which it covers the display surface of the display section 310 from the front and a second state in which it has slid in a direction substantially parallel to the display surface of the display section 310 so that the user can see both the display surface of the display section 300 and the display surface of the display section 310. In the control processing (steps S504, S505), when the notification information is to be displayed on the display section 310, an effect of sliding the display section 300 to transition from the first state to the second state is executed as the switching effect, and the notification information is displayed on the display section 310 after the display of the face image on the display section 300 is erased.
 According to this configuration, because the display section 300 is slid to execute the switching effect from the face image to the notification information, the unexpected movement of the display section 300 makes the agent feel lifelike to the user and heightens the user's interest in the notification. The notification information can therefore be conveyed to the user more appropriately.
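 The sliding-cover variant can be sketched in the same spirit (hypothetical names throughout; the offset value is arbitrary and only marks that the cover has slid aside): the face display first erases its image, then slides parallel to the covered display, revealing the notification behind it.

```python
class SlidingCoverDisplay:
    """Hypothetical setup: display section 300 (face) slides in front of
    display section 310 (notifications), covering it in the first state."""

    def __init__(self):
        self.cover_offset = 0      # 0 = covering section 310; >0 = slid aside
        self.face = "face"         # content of the sliding front display
        self.notification = None   # content of the revealed rear display

    def slide_open(self, offset=100):
        # erase the face before the rear display becomes visible
        self.face = None
        self.cover_offset = offset

    def show_notification(self, text):
        self.slide_open()
        self.notification = text
```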
 In the information processing method according to this embodiment, the display unit 210 (an example of a display device) can display each image with different display characteristics by switching at least one of: stereoscopic versus flat images, high versus low image quality, and color versus monochrome images. In the control processing (steps S504, S505), when the notification information is to be displayed on the display unit 210, an effect of displaying a predetermined image with display characteristics different from those used when displaying the face image is executed as the switching effect, and the notification information is displayed on the display unit 210 after the display of the face image on the display unit 210 is erased.
 According to this configuration, displaying the notification information with different display characteristics when switching from the face image makes the agent feel lifelike to the user and heightens the user's interest in the notification. The difference in display characteristics also allows the user to clearly distinguish the face image from the notification information. The notification information can therefore be conveyed to the user more appropriately.
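 The display-characteristics variant can be illustrated by deriving a notification style that differs from the face style in at least one attribute. This is a toy model under stated assumptions: `FACE_STYLE`, the attribute names, and the `flip` parameter are inventions for illustration, not the patent's terminology.

```python
# Assumed baseline characteristics for the face image (hypothetical values).
FACE_STYLE = {"stereoscopic": True, "quality": "high", "color": True}


def notification_style(face_style, flip="color"):
    """Derive notification display characteristics that differ from the
    face image's characteristics in the chosen attribute (`flip`)."""
    style = dict(face_style)
    if flip == "quality":
        style["quality"] = "low" if style["quality"] == "high" else "high"
    else:
        # boolean attributes: stereoscopic/flat, color/monochrome
        style[flip] = not style[flip]
    return style
```

For example, `notification_style(FACE_STYLE)` would yield a monochrome style while keeping quality and stereoscopy unchanged, so the two kinds of content remain visually distinguishable.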
 In the information processing method according to this embodiment, the display unit 210 (an example of a display device) includes a first display section (display sections 241, 252, 264, 271 and 281, 300, 324) that displays the face image and a second display section (display sections 242, 253, 265, 272 and 282, 310, 325) that displays the notification information, and the first and second display sections differ in at least one of: stereoscopic versus flat images, high versus low image quality, and color versus monochrome images. In the control processing (steps S504, S505), when the notification information is to be displayed on the second display section, a switching effect is executed on at least one of the first display section and the second display section, and the notification information is displayed on the second display section after the display of the face image on the first display section is erased.
 According to this configuration, displaying the notification information on the second display section, which has different display characteristics, when switching from the face image makes the agent feel lifelike to the user and heightens the user's interest in the notification. The difference in display characteristics also allows the user to clearly distinguish the face image from the notification information. The notification information can therefore be conveyed to the user more appropriately.
 In the information processing method according to this embodiment, the control processing (steps S504, S511, S512) determines an importance based on the urgency of the notification information and, based on the importance of the notification information, changes the timing of switching from the display of the face image to the display of the notification information.
 According to this configuration, notification information of high importance can be output promptly, while notification information of low importance can be output at an appropriate timing. The notification information can therefore be conveyed to the user more appropriately.
 In the information processing method according to this embodiment, the control processing (steps S504, S511, S512, S521, S522) switches from the display of the face image to the display of the notification information immediately after the notification information occurs when the importance of the notification information is higher than a reference. When the importance of the notification information is lower than the reference, the switch from the face image to the notification information is made after the end of a predetermined performance if the face image displayed on the display unit 210 (an example of a display device) is executing that performance, and after the notification information occurs if the displayed face image is not executing a predetermined performance.
 According to this configuration, notification information of high importance can be output promptly, while notification information of low importance can be output at an appropriate timing according to the performance state of the face image. The notification information can therefore be conveyed to the user more appropriately.
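 The importance-based timing rule described above reduces to a small decision function, sketched below. This is illustrative only: the threshold value and the return labels are hypothetical stand-ins for the reference and the three switching timings described in the text.

```python
IMPORTANCE_THRESHOLD = 5  # hypothetical reference level for "high importance"


def switch_timing(importance, performance_running):
    """Decide when to switch from the face image to the notification
    (cf. steps S504, S511, S512, S521, S522).

    importance: numeric importance determined from the notification's urgency
    performance_running: whether the face image is mid-performance
    """
    if importance > IMPORTANCE_THRESHOLD:
        # high importance: switch immediately after the notification occurs
        return "immediately"
    if performance_running:
        # low importance while a performance is in progress: wait for it to end
        return "after_performance_ends"
    # low importance, no performance in progress
    return "after_notification_occurs"
```

Note that a high-importance notification preempts even an ongoing performance, which is what allows urgent information to reach the user without delay.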
 The information processing device 110 provides notification information to a user using an agent device 200 having a display unit 210 (an example of a display device) that displays a face image. The information processing device 110 includes a control unit 120 that controls the display mode of the display unit 210; when the notification information is to be displayed on the display unit 210, the control unit 120 executes a switching effect using the display unit 210 and displays the notification information on the display unit 210 after erasing the display of the face image. An information processing system capable of executing each of the processes realized by the information processing device 110 may also be provided.
 According to this configuration, when switching from the face image to the notification information, executing a switching effect using the display unit 210 (for example, displaying a surprised expression) makes the agent feel lifelike to the user and heightens the user's interest in the notification being presented. This makes it possible to convey the notification information to the user more appropriately.
 Note that each processing procedure described in this embodiment is an example for realizing the embodiment; to the extent that the embodiment can still be realized, the order of some steps may be changed, some steps may be omitted, and other steps may be added.
 Each process described in this embodiment is executed based on a program for causing a computer to execute the corresponding processing procedure. This embodiment can therefore also be understood as an embodiment of a program that realizes the functions for executing these processes, and of a recording medium that stores that program. For example, the program can be stored in the storage device of an information processing device through an update process that adds a new function to the device, enabling the updated information processing device to carry out each process described in this embodiment.
 Although embodiments of the present invention have been described above, the above embodiments merely illustrate some application examples of the present invention and are not intended to limit the technical scope of the present invention to their specific configurations.
 This application claims priority based on Japanese Patent Application No. 2022-135323 filed with the Japan Patent Office on August 26, 2022, the entire contents of which are incorporated herein by reference.

Claims (13)

  1.  An information processing method for providing notification information to a user using an agent device having a display device that displays a face image, the method comprising:
     a control process for controlling a display mode of the display device,
     wherein, in the control process, when the notification information is to be displayed on the display device, a switching effect using the display device is executed, and the notification information is displayed after the display of the face image is erased.
  2.  The information processing method according to claim 1, wherein
     the display device includes a first display section that displays the face image and a second display section that displays the notification information, and
     in the control process, when the notification information is to be displayed on the second display section, a switching effect is executed on at least one of the first display section and the second display section, and the notification information is displayed on the second display section after the display of the face image on the first display section is erased.
  3.  The information processing method according to claim 1, wherein, in the control process, an effect of moving the display device is executed as the switching effect.
  4.  The information processing method according to claim 3, wherein
     the display device consists of a cube-shaped or rectangular-parallelepiped housing in which a first display section that displays the face image is provided on one surface and a second display section that displays the notification information is provided on another surface different from the one surface, and
     in the control process, when the notification information is to be displayed on the second display section, a rotation operation that switches the surface of the display device visible to the user from the one surface to the other surface is executed as the switching effect, and the notification information is displayed on the second display section after the display of the face image on the first display section is erased.
  5.  The information processing method according to claim 4, wherein
     the display device consists of a flat-plate housing in which the first display section is provided on the one surface and the second display section is provided on the other surface, which is the surface opposite the one surface, and
     in the control process, a rotation operation that rotates the display device 180 degrees about a rotation axis is executed as the switching effect.
  6.  The information processing method according to claim 4, wherein
     the display device consists of a cube-shaped or rectangular-parallelepiped housing in which the first display section is provided on the one surface and the second display section is provided on the other surface, which is a surface adjacent to the one surface, and
     in the control process, a rotation operation that rotates the display device 90 degrees about a rotation axis is executed as the switching effect.
  7.  The information processing method according to claim 3, wherein
     the display device includes a flat display panel, in which a first display section is provided on one surface and a second display section is provided on the opposite surface, and a third display section,
     the display panel is mounted on the third display section so as to be rotatable about one end as a rotation axis,
     the display device is controlled, by rotation of the display panel, to take one of a first state in which the face image is displayed in a first display area formed by the first display section and part of the third display section and a second state in which the notification information is displayed in a second display area formed by the second display section and part of the third display section, and
     in the control process, when the notification information is to be displayed on the display device, an effect of rotating the display panel to transition from the first state to the second state is executed as the switching effect, and the notification information is displayed in the second display area after the display of the face image in the first display area is erased.
  8.  The information processing method according to claim 3, wherein
     the display device includes a first display section that displays the face image and a second display section that displays the notification information,
     the first display section is installed so that it can transition between a first state in which it covers the display surface of the second display section from the front and a second state in which it has slid in a direction substantially parallel to the display surface of the second display section so that the user can see both the display surface of the first display section and the display surface of the second display section, and
     in the control process, when the notification information is to be displayed on the second display section, an effect of sliding the first display section to transition from the first state to the second state is executed as the switching effect, and the notification information is displayed on the second display section after the display of the face image on the first display section is erased.
  9.  請求項1に記載の情報処理方法であって、
     前記表示装置は、立体画像及び平面画像と、高画質及び低画質と、カラー画像及び白黒画像と、のうちの少なくとも1つの切り替えにより異なる表示特性で各画像を表示することが可能であり、
     前記制御処理では、前記表示装置に前記報知情報を表示させる場合には、前記顔画像の表示時における表示特性とは異なる表示特性とする所定画像を表示する演出を前記切替演出として実行し、かつ、前記表示装置における前記顔画像の表示を消去した後に、前記表示装置に前記報知情報を表示させる、
    情報処理方法。
    The information processing method according to claim 1,
    The display device is capable of displaying each image with different display characteristics by switching at least one of: a three-dimensional image and a two-dimensional image; high image quality and low image quality; and a color image and a monochrome image,
    In the control process, when displaying the notification information on the display device, an effect of displaying a predetermined image with display characteristics different from those used when displaying the face image is executed as the switching effect, and the notification information is displayed on the display device after the display of the face image on the display device is erased,
    Information processing method.
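Claim 9's single-display variant — use a characteristic change (for example, color to monochrome) as the switching effect before erasing the face image — can be sketched like this. All names are illustrative stand-ins; the choice of monochrome as the differing characteristic is an assumption for the example, not something the publication fixes.

```python
class AgentDisplay:
    """Stand-in for a single agent display (illustrative names only)."""

    def __init__(self):
        self.mode = "color"     # face image rendered in color by default
        self.content = "face"
        self.history = []       # records the order of operations

    def set_mode(self, mode):
        self.mode = mode
        self.history.append("mode:" + mode)

    def show(self, content):
        self.content = content
        self.history.append("show:" + content)

    def clear(self):
        self.content = None
        self.history.append("clear")


def notify_with_mode_switch(display, notification):
    """Claim-9 ordering: show a predetermined image whose display
    characteristics differ from the face image's (here, monochrome
    instead of color) as the switching effect, erase the face image,
    then display the notification."""
    display.set_mode("monochrome")    # characteristic change: color -> monochrome
    display.show("transition_image")  # the predetermined image as the effect
    display.clear()                   # erase before notifying
    display.show(notification)
```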
  10.  請求項1に記載の情報処理方法であって、
     前記表示装置は、前記顔画像を表示する第1表示部と、前記報知情報を表示する第2表示部とを含み、
     前記第1表示部及び前記第2表示部は、立体画像及び平面画像と、高画質及び低画質と、カラー画像及び白黒画像と、のうちの少なくとも1つが異なる表示特性を備え、
     前記制御処理では、前記第2表示部に前記報知情報を表示させる場合には、前記第1表示部及び前記第2表示部のうちの少なくとも1つにおいて切替演出を実行し、かつ、前記第1表示部における前記顔画像の表示を消去した後に、前記第2表示部に前記報知情報を表示させる、
    情報処理方法。
    The information processing method according to claim 1,
    The display device includes a first display section that displays the face image, and a second display section that displays the notification information,
    The first display section and the second display section differ in display characteristics in at least one of: a three-dimensional image and a two-dimensional image; high image quality and low image quality; and a color image and a monochrome image,
    In the control process, when displaying the notification information on the second display section, a switching effect is executed on at least one of the first display section and the second display section, and the notification information is displayed on the second display section after the display of the face image on the first display section is erased,
    Information processing method.
  11.  請求項1から10の何れかに記載の情報処理方法であって、
     前記制御処理では、前記報知情報の緊急度に基づいて前記重要度を判定し、前記報知情報の重要度に基づいて、前記顔画像の表示から前記報知情報の表示に切り替えるタイミングを変更する、
    情報処理方法。
    The information processing method according to any one of claims 1 to 10,
    In the control process, the degree of importance is determined based on the degree of urgency of the notification information, and the timing of switching from displaying the face image to displaying the notification information is changed based on the degree of importance of the notification information.
    Information processing method.
  12.  請求項11に記載の情報処理方法であって、
     前記制御処理では、
     前記報知情報の重要度が基準よりも高い場合には、当該報知情報が発生した直後に、前記顔画像の表示から前記報知情報の表示に切り替え、
     前記報知情報の重要度が基準よりも低い場合において、前記表示装置に表示されている前記顔画像が所定の演出を実行しているときには、当該所定の演出の終了後に、前記顔画像の表示から前記報知情報の表示に切り替え、前記表示装置に表示されている前記顔画像が前記所定の演出を実行していないときには、当該報知情報が発生した後に、前記顔画像の表示から前記報知情報の表示に切り替える、
    情報処理方法。
    The information processing method according to claim 11,
    In the control process,
    When the importance of the notification information is higher than a reference, the display is switched from the face image to the notification information immediately after the notification information occurs,
    When the importance of the notification information is lower than the reference: if the face image displayed on the display device is performing a predetermined effect, the display is switched from the face image to the notification information after the predetermined effect ends; and if the face image displayed on the display device is not performing the predetermined effect, the display is switched from the face image to the notification information after the notification information occurs,
    Information processing method.
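The timing rules of claims 11 and 12 amount to a small decision procedure: switch immediately for high-importance notifications, otherwise wait for any running face-image effect to finish. A minimal sketch, with illustrative names and return values that are not defined in the publication:

```python
from enum import Enum, auto


class Importance(Enum):
    LOW = auto()
    HIGH = auto()


def decide_switch_timing(importance, effect_in_progress):
    """Return when to switch from the face-image display to the
    notification display, per the claim-12 timing rules."""
    if importance is Importance.HIGH:
        # High importance: switch immediately after the notification occurs.
        return "immediately"
    if effect_in_progress:
        # Low importance while a predetermined effect is running:
        # wait for the effect to end before switching.
        return "after_effect_ends"
    # Low importance, no effect running: switch after the notification occurs.
    return "after_notification"
```

Per claim 11, the importance fed into this decision would itself be derived from the notification's urgency; that mapping is left abstract here.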
  13.  顔画像を表示する表示装置を有するエージェント機器を用いて報知情報をユーザに提供する情報処理装置であって、
     前記表示装置の表示態様を制御する制御部を備え、
     前記制御部は、前記表示装置に前記報知情報を表示させる場合には、前記表示装置を用いた切替演出を実行し、かつ、前記顔画像の表示を消去した後に、前記報知情報を表示させる、
    情報処理装置。
    An information processing device that provides notification information to a user using an agent device having a display device that displays a face image,
    comprising a control unit that controls a display mode of the display device,
    When displaying the notification information on the display device, the control unit executes a switching effect using the display device, and displays the notification information after erasing the display of the face image.
    Information processing device.
PCT/IB2023/000484 2022-08-26 2023-08-16 Information processing method and information processing device WO2024042359A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022135323 2022-08-26
JP2022-135323 2022-08-26

Publications (1)

Publication Number Publication Date
WO2024042359A1 true WO2024042359A1 (en) 2024-02-29

Family

ID=90012654

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2023/000484 WO2024042359A1 (en) 2022-08-26 2023-08-16 Information processing method and information processing device

Country Status (1)

Country Link
WO (1) WO2024042359A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS59501380A * 1982-08-06 1984-08-02 Harwal Industries Pty Ltd Color display
JP2006347478A (en) * 2005-06-20 2006-12-28 Fujitsu Ten Ltd Vehicle-mounted instrument with touch panel display
JP4729545B2 * 2007-09-13 2011-07-20 Kyocera Corp Mobile communication terminal
JP4959420B2 * 2007-05-25 2012-06-20 Bridgestone Corp Double-side display type information display device
JP2020157855A * 2019-03-26 2020-10-01 Honda Motor Co., Ltd. Agent device, control method of agent device, and program

Similar Documents

Publication Publication Date Title
JP7475812B2 (en) Mood roof for an enhanced media experience in the vehicle cabin
US10710608B2 (en) Provide specific warnings to vehicle occupants before intense movements
JP7126709B2 (en) foldable virtual reality device
US20190101976A1 (en) Systems and methods to provide an interactive space based on predicted events
KR20210011416A (en) Shared environment for vehicle occupants and remote users
JP6761340B2 (en) Simulation system and program
US20220092862A1 (en) Devices, Methods, and Graphical User Interfaces for Interacting with Three-Dimensional Environments
CN107650795A (en) System, the method and apparatus of vehicle-mounted media content are presented based on vehicle sensor data
JP2012530317A (en) System to change virtual view
CN107650796A (en) System, the method and apparatus of vehicle-mounted media content are presented based on vehicle sensor data
CN107351763A (en) Control device for vehicle
CN109716266A (en) Immersion is virtually shown
WO2019124158A1 (en) Information processing device, information processing method, program, display system, and moving body
JP2000207575A (en) Space fusing device and application devices adapting the same
JP5857330B2 (en) System and program
WO2021241431A1 (en) Information processing device, information processing method, and computer-readable recording medium
WO2024042359A1 (en) Information processing method and information processing device
EP3869302A1 (en) Vehicle, apparatus and method to reduce the occurence of motion sickness
CN110979202B (en) Method, device and system for changing automobile style
JP5687879B2 (en) Information processing apparatus, automobile, information processing method and program
JP2024031649A (en) Information processing method and information processing device
JP2007094082A (en) Mobile object simulator system, control method and control program thereof
JP2024025227A (en) Information processing method and information processing device
JP2009008821A (en) Image display
JP7332823B1 (en) program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23855839

Country of ref document: EP

Kind code of ref document: A1