WO2023042423A1 - Information processing device, information processing method, and information processing program - Google Patents


Info

Publication number
WO2023042423A1
WO2023042423A1 (PCT/JP2022/007776)
Authority
WO
WIPO (PCT)
Prior art keywords
content
information
information processing
viewing
external device
Prior art date
Application number
PCT/JP2022/007776
Other languages
English (en)
Japanese (ja)
Inventor
文彦 飯田
健太 安部
雄司 北澤
Original Assignee
ソニーグループ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニーグループ株式会社
Publication of WO2023042423A1


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00Data switching networks
    • H04L12/28Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home

Definitions

  • the present disclosure relates to an information processing device, an information processing method, and an information processing program.
  • the present disclosure proposes an information processing device capable of enhancing the presence of content viewed by a user.
  • An information processing device includes a determination unit. Based on content information related to content viewed by a user using a viewing device and information on external devices different from the viewing device that exist in the space where the content is viewed, the determination unit determines, from among the external devices, a target device to be linked with viewing of the content.
  • FIG. 1 is a diagram illustrating a configuration example of an information processing device according to an embodiment of the present disclosure.
  • FIG. 2 is a diagram showing an arrangement example of audio-visual devices and external devices according to an embodiment of the present disclosure.
  • FIGS. 3 to 7 are diagrams each showing an example of a UI according to an embodiment of the present disclosure.
  • FIG. 8 is a diagram showing an example of evaluation results of an external device according to an embodiment of the present disclosure.
  • FIG. 9 is a diagram illustrating a correspondence relationship between a user's sensation and a target device that affects that sensation according to an embodiment of the present disclosure.
  • FIG. 10 is a diagram showing an example of control of a controlled object according to the embodiment of the present disclosure.
  • FIG. 11 is a diagram illustrating an example of arrangement of external devices according to an embodiment of the present disclosure.
  • FIG. 12 is a sequence diagram showing an example of data communication according to the embodiment of the present disclosure.
  • FIG. 13 is a flow chart showing an example of processing executed by a control unit of the information processing device according to the embodiment of the present disclosure.
  • FIG. 1 is a diagram illustrating a configuration example of an information processing device according to an embodiment of the present disclosure.
  • The information processing apparatus 1 according to the present embodiment controls devices or equipment installed in the viewing environment other than the display (hereinafter sometimes referred to as a "monitor") when a user views content such as video and audio; such devices are hereinafter sometimes referred to as "external devices".
  • the information processing device 1 is connected to the viewing device 2 and the external device 3 .
  • the viewing device 2 is connected to a content DB (database) 4 .
  • the content DB 4 is a storage device that stores video and audio content such as movies and live music.
  • the content DB 4 may be a recording device that records content in advance, or a streaming server that distributes content to the viewing device 2 .
  • the audio-visual device 2 includes a monitor that displays video of content acquired from the content DB 4 and a speaker that outputs audio.
  • A case where the viewing device 2 is a television 21 (see FIG. 2) will be described below, but the viewing device 2 according to the embodiment is not limited to the television 21.
  • The viewing device 2 may be a device other than the television 21, such as an HMD (Head Mounted Display), smartphone, tablet terminal, personal computer, or portable media player, as long as it has a function of outputting video and audio. The audio output by the audio-visual device 2 may be monaural, 2ch, 5.1ch, or the like, without limitation.
  • The external device 3 is a home appliance that exists in the space where the user views content, is capable of wireless or wired communication with the information processing device 1, and whose original main purpose is to improve the comfort of the user's environment.
  • A case where the external devices 3 are an air conditioner 31, an electric fan 32, and a humidifier 33 (see FIG. 2) will be described below, but the external devices 3 according to the embodiment are not limited to these home appliances.
  • As long as the external device 3 is a device that can communicate with the information processing device 1 and improves the comfort of the user's environment, it may be, for example, a floor heating device, a lighting device, an electric curtain, a smart speaker, an aroma device, or the like. The external device 3 may also include a sensor for grasping the position of the user in the room. The number of external devices 3 may be one or more.
  • the information processing device 1 includes a communication section 11 , an operation input section 12 and a control section 13 .
  • the communication unit 11 is a communication interface that transmits various types of information between the viewing device 2 and the external device 3 and the control unit 13 .
  • The operation input unit 12 is, for example, a touch panel display; it displays icons and the like that serve as operation buttons operated by the user, receives the user's operation of an icon, and outputs a signal corresponding to the operation to the control unit 13.
  • the control unit 13 includes a microcomputer having a CPU (Central Processing Unit), ROM (Read Only Memory), RAM (Random Access Memory), and various circuits (processors).
  • the control unit 13 includes an analysis unit 14 , a determination unit 15 , and a target device control unit 16 which function when the CPU executes a program stored in the ROM using the RAM as a work area.
  • Part or all of the analysis unit 14, the determination unit 15, and the target device control unit 16 provided in the control unit 13 may be configured by hardware such as an ASIC (Application Specific Integrated Circuit) or FPGA (Field Programmable Gate Array).
  • the analysis unit 14, the determination unit 15, and the target device control unit 16 included in the control unit 13 implement or execute the information processing operations described below.
  • the internal configuration of the control unit 13 is not limited to the configuration shown in FIG. 1, and may be another configuration as long as it performs the information processing described later.
  • the analysis unit 14 acquires content information related to the content viewed by the user from the viewing device 2 , analyzes the content of the video, and extracts information necessary for controlling the external device 3 .
  • the content information includes the content itself and additional information (meta information) related to the content.
  • the content information may not include additional information.
  • the analysis unit 14 also acquires and refers to the effect plan.
  • the analysis unit 14 may acquire additional information in advance, for example, when a production plan such as smoke or special effects is added as additional information to a specific song by the production side.
  • the analysis unit 14 outputs the additional information to the determination unit 15 as an analysis result.
  • In addition, the analysis unit 14 analyzes the image of the environment in the content, the timing of changes in that environment, the situation of the performer, the timing of changes in the performer's situation, and the like, and outputs the analysis result to the determination unit 15.
  • Based on content information related to the content that the user views on the viewing device 2 and information on the external devices 3, different from the viewing device 2, that exist in the space where the content is viewed (hereinafter referred to as "external device information"), the determination unit 15 determines, from among the external devices 3, a target device to be linked with viewing of the content. The determination unit 15 also determines the target device based on viewing environment information regarding the space in which the content is viewed (hereinafter sometimes referred to as the "viewing space").
  • the determining unit 15 can acquire external device information from the external device 3 by communicating with the external device 3, for example.
  • the external device information includes information indicating the external device 3 existing in the viewing space and the control target of the external device 3 .
  • the external device information also includes information indicating the position of the external device 3 in the space where the content is viewed. Based on communication with the external device 3, the target device control unit 16 recognizes the presence of the external device 3 in the content viewing space. Further, the external device information includes information such as the responsiveness of control of the controlled object by the external device 3 and the influence range of the controlled object by the control of the external device 3, for example.
  • The viewing environment information includes at least one of: information indicating the time zone in which the content is viewed, information indicating the floor plan of the viewing space, and information indicating the behavior of each person when there are multiple people in the viewing space.
  • the determining unit 15 acquires information on the time zone in which the content is viewed from the viewing device 2 . Further, the determining unit 15 acquires information on the number of persons present in the viewing space and the behavior of the persons present in the viewing space, for example, from an image captured by a camera (not shown) installed in the viewing space.
  • the determination unit 15 determines the external device 3 capable of reproducing at least part of the situation of a specific scene in the content as the target device to be linked with the viewing of the content in the viewing environment of the content. In other words, the determining unit 15 determines the external device 3 that satisfies a predetermined condition as the target device to be linked with viewing of the content, based on the external device information and the information indicating the situation of the specific scene in the content.
  • the information indicating the situation of the specific scene includes at least one of information on the recording time of the content, information on the environment of the person in the content, and information on the state of the person in the content.
  • the predetermined condition includes that there is a relationship between the information indicating the situation of the specific scene and the controlled object to be changed by the external device 3 .
  • the determination unit 15 determines a target device from among the external devices 3 whose control targets to be changed by the external device 3 are the same or similar to the characteristics included in the information indicating the situation of the specific scene. Then, the determination unit 15 outputs the analysis result by the analysis unit 14 and the above-described viewing environment information to the target device control unit 16 together with the information indicating the determined target device.
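The matching step described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the feature names, device names, and the feature-to-control mapping are assumptions made for the example.

```python
# Controlled objects of each external device (illustrative, following FIG. 8).
DEVICES = {
    "air_conditioner_31": {"temperature", "air_volume", "humidity"},
    "electric_fan_32": {"air_volume", "wind_direction"},
    "humidifier_33": {"humidity"},
}

# Assumed relationship between features of a specific scene and the
# controlled objects that can reproduce them.
FEATURE_TO_CONTROLS = {
    "wind": {"air_volume", "wind_direction"},
    "heat": {"temperature"},
    "mist": {"humidity"},
}

def determine_target_devices(scene_features, devices=DEVICES):
    """Return devices whose controlled objects relate to the scene features."""
    wanted = set().union(*(FEATURE_TO_CONTROLS[f] for f in scene_features))
    return [name for name, controls in devices.items() if controls & wanted]
```

For a windy scene, this selects every device that can change air volume or wind direction, mirroring the "same or similar characteristics" condition in the text.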
  • The target device control unit 16 combines the analysis result input from the determination unit 15, the external device information, and the viewing environment information, and performs information processing for controlling the target device in conjunction with the video.
  • Note that the target device control unit 16 may be configured to control the target device linked to the video based on at least one of the analysis result, the external device information, and the viewing environment information.
  • the information generated here is transmitted as a command to the external device 3.
  • Methods for transmitting the command include wireless methods such as Wi-Fi (registered trademark) and Bluetooth (registered trademark), optical methods such as IR (infrared light), and mechanical methods that realize mechanical operation. The information processing device 1 may also cooperate with an external service that provides an IoT (Internet of Things) linkage service.
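A command as described above might be serialized as below before being handed to whichever transport is available. The field names and JSON encoding are assumptions for illustration; the patent does not specify a command format.

```python
import json

def build_command(device_id, control, value):
    """Serialize a control command for an external device.

    The transport (Wi-Fi, Bluetooth, an IR bridge, or an IoT linkage
    service) is out of scope here; this only shows the payload shape.
    """
    return json.dumps({"device": device_id, "control": control, "value": value})
```

For example, `build_command("electric_fan_32", "air_volume", 2)` yields a JSON string a gateway could forward to the fan.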
  • FIG. 2 is a diagram showing an arrangement example of the audio-visual device 2 and the external device 3 according to the embodiment.
  • a sofa 102 on which a user U who views content sits is placed in the center of a room 101, and a television 21, which is an example of the viewing device 2, is installed on the wall facing the sofa 102.
  • An air conditioner 31, which is an example of the external device 3, is installed on the wall to the right of the user U; an electric fan 32, also an example of the external device 3, is placed to the front left; and a humidifier 33, another example of the external device 3, is placed to the rear right.
  • When the user U wants the target device to operate in conjunction with the content being viewed, the user U first registers the target device to be operated in the information processing apparatus 1.
  • For example, the user U registers the external device 3 as the target device using a smartphone in which an application program (hereinafter referred to as an "app") for realizing the functions of the information processing device 1 is installed in advance.
  • the information processing device 1 may be realized by a dedicated terminal device.
  • FIGS. 3 to 7 are diagrams showing examples of UIs (user interfaces) according to the embodiment of the present disclosure. Note that the icons shown in FIGS. 3 to 7 are given the same reference numerals as the actual objects shown in FIG. 2.
  • When the application is activated by the user U and the type and layout of the external devices 3 are input, the information processing apparatus 1 displays floor plan information of the viewing space, as shown in FIG. 3.
  • the floor plan information includes the size of the room 101, the position of the door, and the like.
  • Next, the information processing device 1 scans for external devices 3 on the same network. At that time, the position of the air conditioner 31 on the floor plan is fixed in advance because it does not move much. Information about the installation position of the air conditioner 31 is held in the air conditioner 31 or in a predetermined server.
  • the electric fan 32 and the humidifier 33 are external devices 3 that can be moved by the user U. Therefore, the information processing device 1 cannot grasp the locations of the electric fan 32 and the humidifier 33 . However, since the fan 32 and the humidifier 33 are on the network, the information processing device 1 can recognize that the fan 32 and the humidifier 33 are present.
  • the information processing apparatus 1 displays the icons of the fan 32 and the humidifier 33 outside the frame of the room 101 on the UI, and the user U selects and moves them. Thereby, as shown in FIG. 4, the user U can select the electric fan 32 by touching the icon of the electric fan 32 with the fingertip Uf, for example.
  • the user U can register the position of the fan 32 in the room 101 in the information processing device 1 as shown in FIG. 5 by moving the icon of the fan 32 on the UI to the actual placement position.
  • the target device control unit 16 outputs information for notifying the user U of the range of influence of the target device.
  • For example, the influence range 32A of the electric fan 32 is displayed on the UI as shown in FIG. 6.
  • The user U can check whether or not the user's position when viewing the content is included in the influence range 32A of the electric fan 32; if it is not included, the user U can change the installation position of the electric fan 32 in the real space with reference to the influence range 32A.
  • the target device control unit 16 may be configured to output information to be notified to the user U when the position where the user U views the content is outside the influence range of the target device.
  • the target device control unit 16 outputs information proposing to the user U the position of the target device suitable for viewing the content when the position where the user U views the content is outside the influence range of the target device.
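The check behind this notification can be sketched geometrically. As a simplifying assumption (the patent does not specify the geometry), the influence range is modelled as a circle around the device on the floor plan:

```python
import math

def in_influence_range(user_pos, device_pos, radius_m):
    """True when the viewing position lies inside the device's influence
    range, modelled here as a circle of the given radius (an assumption).

    Positions are (x, y) coordinates on the floor plan, in metres.
    """
    return math.dist(user_pos, device_pos) <= radius_m
```

When this returns False for the viewing position, the UI would prompt the user U to move either the device or the viewing position.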
  • The information processing apparatus 1 displays the recommended placement position 33A of the humidifier 33 when the user U selects the icon of the humidifier 33, allowing the user U to optimize the real environment. If ideal devices and layouts are specified by the content provider, a flow for purchasing or leasing those devices may be presented accordingly to encourage the user to improve the experience.
  • In addition, the information processing apparatus 1 can identify the position of a device based on the wireless signal strength of its communication module or the like, or can perform object recognition on image data of the room 101 captured by the user U to grasp device positions and attached objects.
  • FIG. 8 is a diagram illustrating an example of evaluation results of target devices according to the embodiment of the present disclosure.
  • the target devices shown in FIG. 8 are an air conditioner 31, a humidifier 33, and an electric fan 32 here.
  • the controlled object shown in FIG. 8 is a physical characteristic that can be controlled when each external device 3 operates.
  • the objects controlled by the air conditioner 31 are the temperature of the room 101 , the air volume to be output, and the humidity of the room 101 .
  • the humidity of the room 101 is controlled by the humidifier 33 .
  • the control target of the electric fan 32 is the output air volume and wind direction.
  • The control target may also be smell, sound, light, or the like.
  • control targets may be grouped for each of the five human senses.
  • the external device 3 that affects the sense of touch includes an air conditioner 31, a humidifier 33, an electric fan 32, and the like.
  • the external devices 3 that affect vision include a television 21, lighting, electric curtains, personal computers, and the like.
  • the external device 3 that affects hearing includes a speaker, a television 21, an electric fan 32, and the like.
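The grouping by sense described above is a simple lookup structure. The sketch below just encodes the examples given in the text; the key and device names are illustrative.

```python
# External devices grouped by the human sense they mainly affect,
# following the examples in the text.
SENSE_TO_DEVICES = {
    "touch": ["air_conditioner_31", "humidifier_33", "electric_fan_32"],
    "sight": ["television_21", "lighting", "electric_curtain", "personal_computer"],
    "hearing": ["speaker", "television_21", "electric_fan_32"],
}

def devices_for_sense(sense):
    """Return the external devices affecting the given sense (empty if none)."""
    return SENSE_TO_DEVICES.get(sense, [])
```

Note that one device can appear under several senses; the television 21 and the electric fan 32 affect both sight or touch and hearing.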
  • the influence range shown in FIG. 8 is the range in which space is affected when the external device 3 operates.
  • the air conditioner 31 and the humidifier 33 change the environment of the entire space, so if there is a user other than the user U who is the viewer, consideration must be given to their operation.
  • Since the electric fan 32 can blow air at a pinpoint, its influence range is small.
  • Also, the air conditioner 31 controls temperature, humidity, and blown air volume, each of which has different characteristics such as its range of influence. For this reason, the information processing apparatus 1 may obtain these influence ranges from catalog specifications based on the model number of the external device 3 or the like, or from manual input by the user U.
  • the responsiveness shown in FIG. 8 means the time required for the presented controlled object to change to a desired value. For example, in the case of the air conditioner 31, it takes a certain amount of time to cause an air conditioning change (room temperature change). These responsivenesses may be times estimated from the controlled object, or may be actually measured.
  • the information processing device 1 refers to a reference table in which response time estimates are described in advance based on the output and temperature difference.
  • the information processing apparatus 1 accumulates space control information when the air conditioner 31 was driven in the past, and refers to the log.
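The reference-table lookup described above might look like the following. The table values are invented for the sketch; in practice they would come from catalog data or from logs of past operation.

```python
# Illustrative responsiveness table (values are made up): estimated minutes
# for the air conditioner to realize a given temperature difference (in °C)
# at a given output level.
RESPONSE_TABLE = {
    ("high", 1): 3, ("high", 3): 8, ("high", 5): 15,
    ("low", 1): 6, ("low", 3): 15, ("low", 5): 30,
}

def estimate_response_minutes(output_level, temp_diff_c):
    """Look up the smallest tabulated temperature difference covering the request."""
    diffs = sorted(d for (lvl, d) in RESPONSE_TABLE if lvl == output_level)
    for d in diffs:
        if temp_diff_c <= d:
            return RESPONSE_TABLE[(output_level, d)]
    raise ValueError("temperature difference outside table range")
```

Rounding up to the next tabulated entry gives a conservative estimate, which matters later when the control start time is scheduled ahead of a scene.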
  • The information processing apparatus 1 needs to consider the impact of presentation when changing the output or operation of the target device strongly affects the viewing environment. For this reason, the determination unit 15 limits the conditions for determining the target device according to the behavior of each person in the viewing space. For example, when there are multiple people in the environment and only some of them are watching the video, operating an external device 3 that affects the entire environment, such as the air conditioner 31, in conjunction with the content being watched may cause discomfort to those who are not watching the video.
  • the information processing apparatus 1 takes measures such as excluding the external device 3 that affects the entire environment from the target device list, or lowering the range of parameters to be changed. For example, the information processing apparatus 1 acquires an image of the interior of the room from a camera (not shown) attached to the room 101, and determines the number of people in the room from the image. The information processing device 1 may determine the number of people in the room based on information acquired from a human sensor that detects people in the room 101 .
  • Furthermore, the information processing device 1 analyzes the behavior of each person and evaluates whether the multiple people are engaged in the same intended behavior. At this time, even if there are a plurality of people, if all of them are watching the television 21, no special measures are required. On the other hand, when one person is watching the video and another person is doing a task other than video watching, the information processing apparatus 1 applies constraints so that control parameters affecting people other than the viewer are not used.
  • For example, the information processing apparatus 1 excludes the humidifier 33, which affects the environment of the entire room 101, from the target devices, and limits the air conditioner 31 to its fan function without using its temperature change function. In addition to such control, the information processing apparatus 1 may take measures such as reducing the range of temperature increase by the air conditioner 31.
  • In addition, when the user U explicitly inputs on the app that a device should be excluded from the current target devices, the information processing apparatus 1 generates constraint conditions taking such device information into consideration.
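The occupant-based constraint described above can be sketched as a filter over the candidate target devices. The record format and the `"influence"` field are assumptions for illustration; the patent only states the principle.

```python
def constrain_targets(devices, num_viewers, num_occupants):
    """Exclude whole-room devices when someone in the room is not watching.

    Each device is a dict with an "influence" field ("whole_room" or
    "pinpoint", per the evaluation of FIG. 8).
    """
    if num_viewers < num_occupants:  # someone is doing a task other than viewing
        return [d for d in devices if d["influence"] != "whole_room"]
    return list(devices)
```

With one viewer among two occupants, the humidifier (whole-room influence) would be dropped while the pinpoint electric fan remains usable.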
  • The information processing device 1 performs video analysis of the content viewed by the user U. The purpose of the video analysis is to control each target device and realize linkage between the video content and the viewing environment.
  • the information processing apparatus 1 performs video analysis by referring not only to the video content itself, but also to information such as the timing and time when the video content was recorded. Specifically, the information processing apparatus 1 determines the recording time of the video content, the special effect presentation of the venue in the video content, the action information of the performer in the video content, and the like by video analysis.
  • the information processing device 1 controls the electric fan 32 to blow air to the user U.
  • Further, the information processing device 1 may control the target device according to the physical condition of the performer or the audience. For example, if the performer is sweating or has an increased heart rate, this is recognized, and the air conditioner 31 and the humidifier 33 raise the temperature and humidity around the user U so that the state of the user U approaches that of the performer.
  • The information processing device 1 can estimate the heart rate of the performer from the video using image recognition or the like, or based on data from a group of sensors separately attached to the performer or the audience. Further, the information processing device 1 may be configured to acquire biometric information such as the heart rate and body temperature of the performer from metadata attached to the content.
  • the information processing device 1 stores the environmental information of the venue and the information of the sensor group acquired from the content in a predetermined storage device, and can be used to control the target device when the same content is viewed next time.
  • the environmental information of the venue, the information of the sensor group, and the like may include not only information actually measured but also information calculated by information processing.
  • the information processing device 1 may control the external device 3 according to the operating states of the performers and the audience. For example, if the performer is running, it is assumed that the performer is exposed to the wind due to his or her movement, so the wind is controlled. For example, even when the performer is standing still, if the performer's hair is swaying, the information processing apparatus 1 controls the wind because it is assumed that the wind is being generated. In this way, the target device control unit 16 of the information processing apparatus 1 controls the target device determined by the determination unit 15 to reproduce at least part of the situation of the specific scene in the space where the content is viewed.
  • The determination unit 15 determines, as the target device to be linked with viewing of the content, the external device 3 capable of bringing the viewing environment of the content closer to the environment of the specific scene in the content. In other words, the determination unit 15 determines the external device 3 that satisfies a predetermined condition as the target device to be linked with viewing of the content, based on the viewing environment information of the content and the environment information of the specific scene in the content.
  • Environmental information in a specific scene includes information extracted from video analysis results and metadata. Specifically, it includes the recording time of the video content, the special effect presentation of the venue in the video content, and the action information of the performer in the video content.
  • the determining unit 15 determines a target device to be linked with content viewing based on the result of video analysis and information extracted from metadata added in advance to the video content.
  • the predetermined conditions include that there is a relationship between the environmental information in the specific scene and the controlled object that the external device 3 changes.
  • The target device is selected from among the external devices 3 whose control targets to be changed match the characteristics included in the environment information of the specific scene.
  • When the target device control unit 16 determines from the information indicating the situation of the specific scene that the scene is a performance using fire, it raises the room temperature with the air conditioner 31 and controls the electric fan 32 to generate warm airflow in the viewing space.
  • The target device control unit 16 can also determine and control, as the target device, an external device 3 whose control target to be changed is similar. For example, a scene that uses fire may include a moment where flames rise momentarily. In this case, in addition to controlling the air conditioner 31, the target device control unit 16 may be configured to turn on the lighting device to momentarily brighten the viewing space at the moment the flames rise.
  • When the target device control unit 16 determines from the information indicating the situation of the specific scene that the scene is a performance using dry ice, it lowers the room temperature with the air conditioner 31 and controls the electric fan 32 to generate cool airflow in the viewing space.
  • the timing at which the determination unit 15 determines the external device 3 as the target device may be determined in advance, or may be determined each time at a predetermined timing during distribution in the case of real-time distribution.
  • the target device may be determined each time between songs, the timing at which the performer changes, the timing at which the presentation changes, and the like.
  • the environment information in the specific scene includes video analysis results of video content, metadata, and the like.
  • FIG. 10 is a diagram illustrating an example of controlling a controlled object according to the embodiment of the present disclosure.
  • the performer is running between times t1 and t2
  • dry ice is used for special effects between times t4 and t5
  • the performer is sweating between times t8 and t9.
  • The target device control unit 16 adjusts the control start timing of the target device according to its responsiveness. For example, while the performer is running, the information processing apparatus 1 determines from at least one of the video analysis result and the metadata that the scene is a wind-blowing scene, and performs control to deliver wind to the user U. As described above, wind has high responsiveness, so a control signal instructing the fan 32 to blow air is generated and transmitted only during the period t1 to t2 while the performer is running.
  • After that, when smog from dry ice or the like occurs as a special effect in the video content, the information processing device 1 performs control to lower the temperature, because the effect is intended to evoke a feeling of coldness. At this time, the information processing device 1 generates and transmits a control signal for lowering the temperature to the air conditioner 31.
  • Taking into consideration the low responsiveness of temperature adjustment, the information processing apparatus 1 starts the control to lower the temperature at time t3, which precedes by the buffer time the time t4 at which the use of dry ice as a special effect starts.
  • After that, the information processing device 1 generates and transmits operation signals to the air conditioner 31 and the humidifier 33 in order to suggest to the user U the rise in temperature and humidity that occurs while the performer is sweating.
  • Considering the low responsiveness of temperature adjustment, the information processing apparatus 1 starts the control to raise the temperature from time t7, which precedes by the buffer time the time t8 at which the performer starts sweating. Further, since humidity adjustment has even lower responsiveness than temperature adjustment, the information processing apparatus 1 starts the control to raise the humidity from time t6, which precedes time t7 by the buffer time.
  • In this way, the information processing apparatus 1 generates a control signal for the target device according to at least one of the video analysis result of the video content and its metadata, and can thereby reproduce at least part of the situation of a specific scene in the space where the content is viewed.
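The buffer-time adjustment described above amounts to subtracting a per-device lead time from the scene start. The buffer values and device names below are invented for illustration; only the ordering (fan immediate, air conditioner earlier, humidifier earliest) follows the description of FIG. 10.

```python
# Sketch of adjusting control start times by responsiveness.
# Buffer values are illustrative assumptions, in seconds.
BUFFER_SECONDS = {
    "fan_32": 0,               # wind: high responsiveness, no lead time
    "air_conditioner_31": 30,  # temperature: low responsiveness
    "humidifier_33": 60,       # humidity: even lower responsiveness
}

def control_start_time(device: str, scene_start: float) -> float:
    """Start a low-responsiveness device a buffer time before the scene."""
    return scene_start - BUFFER_SECONDS[device]
```

For a dry-ice scene starting at t4 = 100 s, the air conditioner would be started at 70 s (the t3 of FIG. 10), while the fan is driven only from the scene start itself.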
  • The surrounding environment of the user U at the time of viewing may also be considered. For example, late at night lighting output is more effective than in the daytime, so it is used more often in that time zone.
  • The conditions may also include a so-called parental control function that restricts the device functions each user U can access, and an energy-saving mode that restricts functions in consideration of overall energy consumption.
  • For video content saved in advance, the information processing apparatus 1 can set the buffer time and the like beforehand by performing video analysis in advance, but for real-time distribution it cannot. Therefore, when video content of a real-time event is viewed, the information processing apparatus 1 performs processing such as preferentially selecting external devices 3 with high responsiveness.
  • The information processing apparatus 1 excludes, from the list of target devices to be controlled, any external device 3 for which information presentation is difficult due to the layout of the external devices 3, the positional relationship between the external device 3 and the viewer, and the like.
  • Each of the fans 32 and 34 has an influence range 32A, 34A within which its output has an effect.
  • the information processing apparatus 1 determines that sufficient information presentation cannot be performed when the user U does not enter the influence ranges 32A and 34A.
  • the information processing apparatus 1 excludes the fan 34 from the list of target devices to be controlled because the user U is outside the influence range 34A. In this manner, the information processing apparatus 1 can perform optimal target device control by appropriately updating the list of target devices to be controlled according to the positional relationship between the user U and the external device 3 .
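The exclusion of the fan 34 described above can be sketched with a simple circular influence-range model; the positions, radii, and data layout are illustrative assumptions rather than anything specified in the disclosure.

```python
import math

def in_influence_range(user_xy, device_xy, radius):
    """True if the user sits inside the device's influence range,
    modeled here (as an assumption) as a circle around the device."""
    return math.dist(user_xy, device_xy) <= radius

def prune_target_devices(devices, user_xy):
    """Drop devices whose influence range does not reach the user,
    as the apparatus does for fan 34 in the example above."""
    return {name: d for name, d in devices.items()
            if in_influence_range(user_xy, d["position"], d["radius"])}
```

With the user near fan 32 but far from fan 34, only fan 32 remains in the list of target devices to be controlled.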
  • The determination unit 15 determines the external device 3 included in the field of view of the user U as the target device to be linked with viewing of the content. When information is to be presented visually, the information processing apparatus 1 may perform control that takes into account whether the control target is an object within the field of view of the user U, or the influence of an HMD when the field of view is covered by the HMD or the like.
  • For example, if the viewing device is a non-see-through HMD, the information processing apparatus 1 does not control devices included in the occluded area; if it is a see-through HMD, devices within that area can be controlled.
  • When the information processing device 1 presents a scent or the like, it may take time to remove the scent. In that case, the information processing device 1 may use the electric fan 32, a ventilation fan, or the like to promote ventilation. In addition, when it is difficult to present information to the user U, the information processing apparatus 1 may notify the user U of a position where a better experience can be provided and guide the user there.
  • In the example using the television 21, even if the environment outside the field of view of the user U is manipulated, the manipulation is not visually recognized by the user U and does not contribute to the goal of obtaining a sense of immersion.
  • the HMD system may have a function of grasping the position of the user U.
  • the target device may be controlled with higher accuracy.
  • the information processing apparatus 1 may exclude light from illumination that the user U cannot visually recognize, etc., from the control targets depending on the viewing state of the user U.
  • the information processing device 1 excludes lighting devices and the like from target devices when the HMD is worn.
  • When the user U interrupts, rewinds, or fast-forwards viewing of the content, the information processing device 1 performs processing such as stopping the operation of the target device. At this time, the information processing apparatus 1 may be configured to perform control to reproduce at least part of the situation of a specific scene in the space where the content is viewed, in conjunction with video operations such as fast-forwarding, limited to target devices with high responsiveness.
  • Since stopping the target device abruptly may reduce the sense of immersion for the user U, the manner in which the target device is stopped may be set accordingly.
  • the target device control section 16 may be configured to output information regarding the control details for the above-described target device.
  • The information processing apparatus 1 may output and store the information regarding the control details for the target device in a predetermined storage device, and may reuse or edit that information when the content is viewed from the next time onward.
  • Some external devices 3 output a response sound when a control signal is received from the information processing device 1. Therefore, if the target device has a function to turn off the response sound, the information processing apparatus 1 turns it off in advance. For target devices whose response sound cannot be turned off, the information processing apparatus 1 modifies the control signal so as to reduce the control frequency.
  • FIG. 12 is a sequence diagram showing an example of data communication according to the embodiment of the present disclosure.
  • The information processing apparatus 1 transmits a control signal to turn on the electric fan 32 before the scene in which the performer runs in the video content (step S1), and transmits an operation instruction to the electric fan 32 when the scene in which the performer runs occurs (step S2).
  • After the running scene ends, the information processing device 1 transmits a stop instruction to the electric fan 32 (step S3).
  • the electric fan 32 transmits a stop notification to the information processing device 1 when stopping the operation (step S4).
  • the information processing device 1 transmits a control signal to switch on the air conditioner 31 (step S5), and inquires about the current temperature to the air conditioner 31 (step S6).
  • the air conditioner 31 notifies the information processing device 1 of the current temperature in response to the inquiry (step S7).
  • The information processing apparatus 1 transmits an output command to lower the temperature to the air conditioner 31 before the scene in which dry ice is used as a special effect (step S8), and transmits a stop instruction to the air conditioner 31 after that scene ends (step S9). When the air conditioner 31 stops operating, it transmits a stop notification to the information processing device 1 (step S10).
  • the information processing device 1 inquires about the current temperature to the air conditioner 31 (step S11), and inquires about the current humidity to the humidifier 33 (step S12).
  • the air conditioner 31 notifies the information processing device 1 of the current temperature (step S13).
  • The humidifier 33 notifies the information processing device 1 of the current humidity (step S14).
  • Before the scene in which the performer sweats, the information processing device 1 transmits an output command to raise the temperature to the air conditioner 31 (step S15) and an output command to raise the humidity to the humidifier 33 (step S16).
  • the information processing device 1 transmits a stop command to the air conditioner 31 and the humidifier 33 before the scene in which the performer sweats ends (steps S17 and S18).
  • When the air conditioner 31 and the humidifier 33 stop operating, they transmit a stop notification to the information processing device 1 (steps S19 and S20).
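The command exchange of FIG. 12 can be sketched with an in-memory stub device. The message names mirror steps S1 to S4 for the electric fan 32; the class itself and its return values are invented for illustration, not a real appliance API.

```python
# Minimal sketch of the power-on / operate / stop / stop-notification
# exchange between the information processing device and a target device.
class StubDevice:
    def __init__(self, name):
        self.name = name
        self.log = []        # every command received, in order
        self.running = False

    def send(self, command):
        """Receive a command and return the device's reply."""
        self.log.append(command)
        if command in ("power_on", "operate"):
            self.running = True
        elif command in ("stop", "power_off"):
            self.running = False
            return "stop_notification"  # device reports it has stopped (step S4)
        return "ack"

fan = StubDevice("fan_32")
fan.send("power_on")       # step S1: before the running scene
fan.send("operate")        # step S2: the running scene occurs
reply = fan.send("stop")   # step S3, answered by the step S4 notification
```

The same pattern extends to the air conditioner and humidifier steps (S5 onward), with inquiry messages such as the temperature query simply returning a value instead of an acknowledgement.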
  • FIG. 13 is a flowchart illustrating an example of processing executed by the control unit of the information processing device according to the embodiment of the present disclosure.
  • The control unit 13 first registers the target devices (step S101) and evaluates the target devices (step S102). After that, the control unit 13 senses the number of users (step S103) and determines whether or not a plurality of users exist in the same space (step S104).
  • If the control unit 13 determines that a plurality of users do not exist in the same space (step S104, No), it moves the process to step S107.
  • If the control unit 13 determines that a plurality of users exist in the same space (step S104, Yes), it determines whether or not the users are performing actions for the same purpose (step S105).
  • If the control unit 13 determines that the users are performing actions for the same purpose (step S105, Yes), it moves the process to step S107. If the control unit 13 determines that the users are not performing actions for the same purpose (step S105, No), it adds a target device use restriction condition (step S106) and moves the process to step S107.
  • In step S107, the control unit 13 generates constraints based on conditions other than the number of users. Thereafter, the control unit 13 selects or reads the video content (step S108) and performs video analysis (step S109).
  • the control unit 13 determines the target device (step S110).
  • the timing at which the determination unit 15 determines the external device 3 as the target device may be determined in advance, or may be determined each time at a predetermined timing during distribution in the case of real-time distribution.
  • the target device may be determined each time between songs, the timing at which the performer changes, the timing at which the presentation changes, and the like.
  • The control unit 13 generates a target device control signal (step S111), outputs the target device control signal to the target device at a timing according to the scene of the video content (step S112), and ends the process.
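The flow of FIG. 13 can be condensed into a sketch that records which steps run. The helper labels are invented, and each step is reduced to a name so that only the branching around steps S104 to S106 is modeled.

```python
# Condensed sketch of the FIG. 13 flow (steps S101-S112).
def run_control_flow(num_users: int, same_purpose: bool) -> list:
    steps = ["register_targets",      # S101
             "evaluate_targets",      # S102
             "sense_users"]           # S103
    if num_users > 1:                 # S104: multiple users in the same space?
        steps.append("check_same_purpose")        # S105
        if not same_purpose:
            steps.append("add_use_restriction")   # S106
    steps += ["generate_constraints", # S107: conditions other than user count
              "select_content",       # S108
              "analyze_video",        # S109
              "determine_targets",    # S110
              "generate_signal",      # S111
              "output_signal"]        # S112
    return steps
```

A single-user run skips steps S105 and S106 entirely, while multiple users with differing purposes pick up the use-restriction condition before the constraint generation of step S107.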
  • The information processing device 1 includes a determination unit 15.
  • The determination unit 15 determines, from among the external devices 3, the target device to be linked with viewing of the content, based on content information related to content viewed by the user U using the viewing device 2 and information on an external device 3 different from the viewing device 2 that exists in the space where the content is viewed. Accordingly, the information processing apparatus 1 can enhance the presence of the content viewed by the user U by linking the operation of the target device with viewing of the content.
  • the information of the external device 3 includes information indicating the control target of the external device 3.
  • the determination unit 15 can determine the external device 3 that controls the control target that enhances the presence of the content as the target device that is linked to the viewing of the content.
  • the determination unit 15 determines the external device 3 that satisfies a predetermined condition as the target device to be linked with viewing of the content. As a result, the information processing apparatus 1 controls the target device and reproduces at least part of the situation of the specific scene in the space where the content is viewed, thereby enhancing the presence of the content.
  • the predetermined condition includes the relationship between the information indicating the situation of the specific scene and the controlled object that the external device 3 changes.
  • the information processing apparatus 1 controls the target device and reproduces at least part of the situation of the specific scene in the space where the content is viewed, thereby enhancing the presence of the content.
  • The information indicating the situation of the specific scene includes the video analysis result of the content, and the determination unit 15 determines the target device linked with viewing of the content based on the video analysis result.
  • As a result, even if additional information indicating the situation of the specific scene is not attached to the content, the information processing apparatus 1 can enhance the presence of the content by controlling the target device to reproduce at least part of the situation of the specific scene in the space where the content is viewed.
  • the information indicating the situation of the specific scene includes at least one of information on the recording time of the content, information on the environment of the person in the content, and information on the state of the person in the content.
  • the information processing apparatus 1 brings the space in which the content is viewed closer to the environment at the recording time of the content and the environment of the person in the content, and brings the state of the user U viewing the content closer to the state of the person in the content. By doing so, it is possible to enhance the presence of the content.
  • a person in the content may be a performer or an audience member.
  • the information processing apparatus 1 includes a target device control unit 16 that controls the output of the external (target) device 3 determined by the determination unit 15 so as to reproduce at least part of the situation of a specific scene.
  • the information processing apparatus 1 can enhance the presence of the content by bringing the space in which the content is viewed closer to the environment of the specific scene.
  • The target device control unit 16 outputs information regarding the content of control for the target device. By outputting and storing this information in a predetermined storage device, the amount of processing required for controlling the target device when the content is viewed from the next time onward can be reduced.
  • the information on the external device 3 includes information indicating the responsiveness of control of the control target by the external device 3 .
  • the target device control unit 16 adjusts the control start timing of the target device according to the responsiveness.
  • the information processing apparatus 1 can enhance the presence of the specific scene by, for example, starting the operation of the target device with relatively low responsiveness before the specific scene.
  • When viewing of the content is interrupted, rewound, or fast-forwarded, the target device control unit 16 controls only target devices that have higher responsiveness than the other target devices to reproduce at least part of the situation of the specific scene in the content viewing space. As a result, the information processing apparatus 1 can make the operation of the target devices immediately follow when viewing of the content is suddenly interrupted, rewound, or fast-forwarded.
  • the information of the external device 3 includes information indicating the range of influence of the control target controlled by the external device 3 .
  • the target device control unit 16 outputs information for notifying the user U of the range of influence. Thereby, the information processing apparatus 1 can clearly indicate to the user U the influence range of the target device.
  • the target device control unit 16 outputs information to be notified to the user U when the position where the user U views the content is outside the influence range. Thereby, the information processing apparatus 1 can make the user U reconsider the appropriate installation position of the target device.
  • the target device control unit 16 outputs information proposing a position suitable for viewing the content to the user U when the position where the user U views the content is outside the influence range. Thereby, the information processing apparatus 1 can prompt the user U to move the target device to a position suitable for viewing.
  • Based on communication with the external device 3, the target device control unit 16 recognizes the presence of the external device 3 in the content viewing space. Thereby, the information processing device 1 can automatically recognize the presence of the external device 3 in the space where the content is viewed.
  • the information on the external device 3 includes information indicating the position of the external device 3 in the content viewing space. Thereby, the information processing device 1 can recognize the exact position of the external device 3 in the space where the content is viewed.
  • the determination unit 15 determines the target device based on the viewing environment information regarding the space in which the content is viewed. As a result, the information processing apparatus 1 can effectively enhance the presence of the content by appropriately controlling the target device according to, for example, the environment of the space where the content is viewed.
  • The viewing environment information includes at least one of information indicating the time zone in which the content is viewed, information indicating the floor plan, and information indicating the behavior of each person when there are multiple persons in the space where the content is viewed.
  • the information processing apparatus 1 can appropriately control the target device according to the time zone in which the content is viewed and the space in which the content is viewed. For example, when the viewing time is nighttime, the information processing apparatus 1 can effectively enhance the realism of the content by determining the lighting device as the target device to be linked with the viewing of the content.
  • the viewing environment information includes information indicating the actions of each person when there are multiple people in the space where the content is viewed.
  • The determination unit 15 determines the target device according to the behavior of each person. As a result, for example, when the space where the content is viewed contains both a person who is viewing the content and a person who is not, the information processing apparatus 1 can determine as the target device an external device 3 that does not affect the person who is not viewing the content, and link it with viewing of the content.
  • the determining unit 15 determines the external device 3 included in the field of view of the user U as the target device to be linked with viewing of the content. As a result, the information processing apparatus 1 can prevent, for example, unnecessarily linking lighting or the like that is out of the field of view of the user U with the viewing of the content.
  • If the target device has a function to turn off the response sound, the determination unit 15 sets it to off in advance. This reduces the obstruction of the sense of immersion caused by the response sound.
  • the present technology can also take the following configuration.
  • (1) An information processing device comprising: a determination unit that determines, based on content information about content viewed by a user using a viewing device and information on an external device different from the viewing device existing in a space where the content is viewed, a target device to be linked with viewing of the content from among the external devices.
  • the information of the external device is The information processing apparatus according to (1), including information indicating a control target of the external device.
  • the decision unit determining the external device that satisfies a predetermined condition as the target device to be linked with viewing of the content, based on the information of the external device and the information indicating the situation of a specific scene in the content; The information processing device according to (2) above.
  • the predetermined condition is Including that there is a relationship between the information indicating the situation of the specific scene and the controlled object changed by the external device, The information processing device according to (3) above.
  • the information indicating the situation of the specific scene is including video analysis results of the content;
  • the decision unit determining the target device to be linked with viewing of the content based on the video analysis result;
  • the information processing apparatus according to (3) or (4).
  • the information indicating the situation of the specific scene is including at least one of information on the recording time of the content, information on the environment of the person in the content, and information on the state of the person in the content, The information processing apparatus according to any one of (3) to (5).
  • (7) The information processing apparatus according to any one of (3) to (6) above, comprising: a target device control unit that controls the target device determined by the determination unit to reproduce at least part of the situation of the specific scene in the viewing space.
  • (8) The target device control unit outputting information about control content for the target device; The information processing device according to (7) above.
  • the information of the external device is including information indicating the responsiveness of control of the controlled object by the external device; The target device control unit Adjusting the control start timing of the target device according to the responsiveness; The information processing device according to (8) above.
  • the target device control unit When the viewing of the content is interrupted, rewound, or fast-forwarded, only the target device having the higher response than other target devices can reproduce at least part of the situation of the specific scene in the viewing space. to control to reproduce,
  • the information processing device according to (9) above.
  • the information of the external device is including information indicating the range of influence of the controlled object by the control of the external device;
  • the target device control unit outputting information for notifying the user of the range of influence;
  • (12) The target device control unit outputting information to be notified to the user when the position where the user views the content is outside the range of influence;
  • the target device control unit The information processing apparatus according to (11), wherein, when a position where the user views the content is outside the range of influence, information for proposing a position suitable for viewing the content to the user is output.
  • the target device control unit recognizing the presence of the external device in the space for viewing the content based on communication with the external device; The information processing apparatus according to any one of (7) to (13).
  • the information of the external device is including information indicating the position of the external device in the space where the content is viewed; The information processing apparatus according to any one of (1) to (14).
  • the decision unit determining the target device based on viewing environment information related to a space for viewing the content; The information processing apparatus according to any one of (1) to (15).
  • the viewing environment information includes: including at least one of information indicating the time zone in which the content is viewed, information indicating the floor plan, and information indicating behavior of each person when there are a plurality of persons in the space where the content is viewed , The information processing device according to (16) above.
  • (18) The viewing environment information including, when there are multiple people in the space where the content is viewed, information indicating the behavior of each person, and the decision unit determining the target device according to the behavior of each person; The information processing apparatus according to (16) or (17).
  • (19) The decision unit determining the external device included in the field of view of the user as the target device to be linked with viewing of the content; The information processing apparatus according to any one of (1) to (18).
  • (20) An information processing method in which a computer determines, based on content information about content viewed by a user using a viewing device and information on an external device different from the viewing device existing in a space where the content is viewed, a target device to be linked with viewing of the content from among the external devices.
  • (21) An information processing program that causes a computer to execute a determination procedure for determining, based on content information about content viewed by a user using a viewing device and information on an external device different from the viewing device existing in a space where the content is viewed, a target device to be linked with viewing of the content from among the external devices.


Abstract

An information processing device (1) according to an embodiment of the present disclosure comprises a determination unit (15). On the basis of content information relating to content to be viewed by a user by means of a viewing device (2) and information relating to external devices (3) that are different from the viewing device (2) and are present in a space in which the content is to be viewed, the determination unit (15) determines, from among the external devices (3), a subject device to be associated with viewing of the content.
PCT/JP2022/007776 2021-09-17 2022-02-25 Dispositif, procédé et programme de traitement d'informations WO2023042423A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021152603 2021-09-17
JP2021-152603 2021-09-17

Publications (1)

Publication Number Publication Date
WO2023042423A1 true WO2023042423A1 (fr) 2023-03-23

Family

ID=85602626

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/007776 WO2023042423A1 (fr) 2021-09-17 2022-02-25 Dispositif, procédé et programme de traitement d'informations

Country Status (1)

Country Link
WO (1) WO2023042423A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005050126A (ja) * 2003-07-29 2005-02-24 Matsushita Electric Ind Co Ltd コンテンツ再生システム及び、それに関する、装置または方法または記録媒体またはプログラム
JP2016054054A (ja) * 2014-09-03 2016-04-14 株式会社東芝 照明装置制御プログラム、電子機器及びシステム
WO2017002435A1 (fr) * 2015-07-01 2017-01-05 ソニー株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations et programme


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22869575

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE