WO2020003695A1 - Information processing device, information processing method, and information processing system - Google Patents

Information processing device, information processing method, and information processing system

Info

Publication number
WO2020003695A1
WO2020003695A1 PCT/JP2019/015868
Authority
WO
WIPO (PCT)
Prior art keywords
information
agent device
agent
unit
image
Prior art date
Application number
PCT/JP2019/015868
Other languages
French (fr)
Japanese (ja)
Inventor
Toshiyuki Sekiya
Original Assignee
Sony Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corporation
Publication of WO2020003695A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/16 - Sound input; Sound output

Definitions

  • The technology disclosed in this specification relates to an information processing apparatus, an information processing method, and an information processing system that support the use of an agent interacting with a user.
  • Agents that present various information to users according to the application and situation while interacting with them by voice and the like have begun to spread. For example, agents are known that turn home appliances such as lighting and air conditioners on and off and adjust their operation, answer by voice when asked about weather forecasts, stock and exchange information, or news, accept product orders, and read aloud the contents of purchased books.
  • The agent function is generally provided by cooperation between an agent device installed around the user in a home or the like and an agent service built on the cloud.
  • The agent device mainly provides a user interface, such as voice input for receiving speech from the user and voice output for responding to the user's inquiries by voice.
  • The agent service, on the other hand, performs high-load processing such as recognition and semantic analysis of the voice input to the agent device, information retrieval in response to the user's inquiry, and voice synthesis based on the processing results.
  • Some agents have a display function such as a projector in addition to voice input / output (for example, refer to Patent Document 1).
  • The purpose of the technology disclosed in this specification is to provide an information processing apparatus, an information processing method, and an information processing system that support the use of an agent that interacts with a user.
  • A first aspect of the technology disclosed in this specification is an information processing apparatus including: a receiving unit that receives first information including information on an environment in which an agent device is installed; a processing unit that generates, based on the first information, second information proposing an installation position of the agent device in the environment; and a transmitting unit that returns the second information.
  • Here, the first information includes a captured image of the room where the agent device is to be installed, and the second information includes an image in which the agent device is superimposed on a proposed installation position in the captured image.
  • The processing unit proposes an installation position for the agent device by comparing the result of object recognition on the captured image with the places suitable for the agent device and the devices or places to be avoided.
  • A second aspect of the technology disclosed in this specification is an information processing apparatus including: a transmitting unit that transmits first information including information on an environment in which an agent device is installed; a receiving unit that receives second information regarding an installation position of the agent device proposed based on the first information; and a presentation unit that presents the second information.
  • The information processing apparatus according to the second aspect further includes a photographing unit. The transmitting unit transmits the first information including an image, captured by the photographing unit, of the room where the agent device is to be installed; the receiving unit receives the second information including an image in which the agent device is superimposed on a proposed installation position in the captured image; and the presentation unit displays the image in which the agent device is superimposed on the proposed installation position.
  • A third aspect of the technology disclosed in this specification is an information processing method including: a transmitting step of transmitting first information including information on an environment in which an agent device is installed; a receiving step of receiving second information on an installation position of the agent device proposed based on the first information; and a presentation step of presenting the second information.
  • A fourth aspect of the technology disclosed in this specification is an information processing system including: a first device that transmits first information including information on an environment in which an agent device is installed, receives second information on an installation position of the agent device proposed based on the first information, and presents the second information; and a second device that receives the first information from the first device and returns the second information.
  • The term “system” here refers to a logical collection of a plurality of devices (or functional modules that realize specific functions); it does not particularly matter whether each device or functional module is housed in a single enclosure.
  • FIG. 1 is a diagram illustrating an example of an application environment of an agent.
  • FIG. 2 is a diagram showing a configuration example of the placement recommendation system 200.
  • FIG. 3 is a diagram for explaining the operation procedure of the placement recommendation system 200.
  • FIG. 4 is a diagram showing a state where candidate positions for installing agent devices in a room are determined.
  • FIG. 5 is a diagram for explaining a method of photographing a candidate position for installing an agent device.
  • FIG. 6 is a diagram for explaining a method of photographing a candidate position for installing an agent device.
  • FIG. 7 is a diagram illustrating a configuration example of the agent device table.
  • FIG. 8 is a diagram illustrating an example of an image obtained by photographing a corner where an agent device is to be placed.
  • FIG. 9 is a diagram showing a state where candidate positions for installing agent devices are superimposed on a captured image.
  • FIG. 10 is a diagram showing a state where candidate positions for installing agent devices are superimposed on a captured image.
  • FIG. 11 is a diagram illustrating a state in which candidate positions for installing agent devices are superimposed on a captured image.
  • FIG. 12 is a diagram illustrating an example of an image of the room captured by the companion device 220.
  • FIG. 13 is a diagram illustrating an example in which the captured image illustrated in FIG. 12 is encrypted by blurring.
  • FIG. 14 is a diagram illustrating an example in which an agent device is superimposed on a candidate position in the encrypted captured image illustrated in FIG. 13.
  • FIG. 15 is a diagram illustrating an example in which the image illustrated in FIG. 14 is decrypted.
  • FIG. 16 is a diagram illustrating an example of an image of a room captured by the companion device 220.
  • FIG. 17 is a diagram illustrating an example in which the captured image illustrated in FIG. 16 is encrypted by generalization.
  • FIG. 18 is a diagram illustrating an example in which an agent device is superimposed on a candidate position in the encrypted captured image illustrated in FIG. 17.
  • FIG. 19 is a diagram illustrating an example in which the image illustrated in FIG. 18 is decrypted.
  • FIG. 20 is a diagram illustrating a configuration example of the furniture information table.
  • FIG. 21 is a diagram exemplifying a layout of a room where other furniture is placed at an optimal installation position of the agent device.
  • FIG. 22 is a diagram exemplifying a layout of a room in which other furniture is placed at an optimal installation position of the agent device.
  • FIG. 23 is a diagram exemplifying a layout of a room in which other furniture is placed at an optimal installation position of the agent device.
  • FIG. 24 is a diagram exemplifying a composite image in which furniture is moved and an agent device is installed.
  • FIG. 25 is a diagram showing an installation cost table of the agent device.
  • FIG. 26 is a diagram illustrating an operation example of an agent device according to a change in the layout of a room.
  • FIG. 27 is a diagram illustrating an example of changing the layout of furniture in a room.
  • FIG. 28 is a diagram schematically illustrating a functional configuration of an agent device 210 that performs fine adjustment of a position.
  • FIG. 29 is a diagram showing a state in which a message prompting the user to perform a task for improving the installation position of the agent device is displayed on the projector screen.
  • FIG. 30 is a diagram showing a state in which a message prompting the user to perform a task for improving the installation position of the agent device is displayed on the projector screen.
  • FIG. 31 is a diagram showing a state in which a message prompting the user to perform a task for improving the installation position of the agent device is displayed on the projector screen.
  • FIG. 32 is a diagram showing a configuration example of a UI screen for drawing an illustration of a room layout.
  • FIG. 33 is a diagram illustrating a configuration example of a UI screen in which an agent device icon is superimposed on a candidate position.
  • FIG. 34 is a flowchart showing a basic processing procedure for recommending an optimal installation location of an agent device.
  • FIG. 35 is a flowchart showing a processing procedure for recommending an optimal installation location of an agent device in consideration of a moving cost of furniture and an installation cost according to a distance from an outlet.
  • FIG. 36 is a flowchart showing a processing procedure for recommending an optimal installation location of an agent device in consideration of a user's desired installation location.
  • FIG. 37 is a flowchart showing a processing procedure for recommending an optimal installation location of an agent device in consideration of feedback from a user.
  • An agent having a voice interaction function with a user is also called a “smart speaker”, an “AI speaker”, an “AI assistant”, or the like, but in this specification it is simply called an “agent”.
  • FIG. 1 shows a living room 1 as an example of an application environment of an agent.
  • A television receiver 11 and an agent device 12 are installed on a sideboard 13.
  • A sofa 14 is placed so as to face the television receiver 11, and a sofa table 15 is placed in front of the sofa 14.
  • The agent device 12 mainly uses a voice UI (User Interface) to provide various services to the user, such as turning home appliances such as lighting and air conditioners on and off and adjusting their operation, answering by voice when asked about weather forecasts, stock and exchange information, or news, receiving product orders, and reading out the contents of purchased books.
  • If the agent device 12 has a display function such as a projector, services based on information display can also be provided.
  • The sound emitted from the television receiver 11 reaches the three persons sitting on the sofa 14 not only as direct waves but also as waves reflected from the wall, ceiling, and floor surfaces.
  • Likewise, the sound emitted from the agent device 12 reaches the three persons sitting on the sofa 14 not only as direct waves but also as waves reflected from the wall, ceiling, and floor surfaces.
  • The application environment of the technology disclosed in this specification is not limited to a living room shared by family members in a typical home as shown in FIG. 1.
  • The technology disclosed in this specification can be applied to various rooms in which an agent device is installed, such as a private room used by a specific user, for example a study or a bedroom. Further, the technology disclosed in this specification can be applied not only to homes but also to corporate offices.
  • When the agent device 12 is installed on the sideboard 13 alongside the television receiver 11 as in the example shown in FIG. 1, the sounds output from the two devices overlap when they reach the user's ears, so that it becomes difficult for the user to ask the agent questions, and the agent device 12 easily mistakes the sound output from the television receiver 11 for a voice command from the user. In many cases, it is also difficult for the user to judge at which position the voice input and voice output of the agent device 12 can be performed effectively.
  • If the agent device 12 has a projector display function but is installed away from a wall as in the example shown in FIG. 1, the screen may not be projected properly.
  • Even when a wall is available, if the wall has a pattern or irregularities, the projected image mixes with the pattern or becomes uneven and is difficult to see.
  • Therefore, in this specification, the following technology for supporting a user in installing the agent device 12 in a room is proposed. By using this technology, the user can easily install the agent device 12 at a position in the room where the performance of the agent device 12 can be fully exhibited, and can fully enjoy the services provided by the agent device 12.
  • FIG. 2 shows a configuration example of a placement recommendation system 200 that recommends to the user where to place an agent device in a room.
  • The illustrated placement recommendation system 200 is composed of an agent device 210 to be installed in a room, a companion device 220 that accompanies the user, and a computing device 230 that performs the operation of searching for candidate places to install the agent device 210 in the room.
  • The agent device 210 is assumed to have almost the same hardware configuration as a general voice agent, which is also called a “smart speaker”, “AI speaker”, “AI assistant”, or the like. However, the agent device 210 need not be a dedicated device; it may be an agent application resident in a device such as an information home appliance or an IoT (Internet of Things) device.
  • The agent device 210 illustrated in FIG. 2 includes a control unit 211 including a CPU (Central Processing Unit) and a memory, an expression unit 212 that outputs information processed by the control unit 211 to the outside, a sensor unit 213, and a communication unit 214.
  • The expression unit 212 basically includes a speaker.
  • The speaker may be a stereo speaker or a multi-channel speaker, and some of the speakers may be externally connected to the main body of the agent device 210.
  • The expression unit 212 may also include a projector.
  • The configuration of the sensor unit 213 is arbitrary.
  • The sensor unit 213 may include a microphone, a camera, an object detection sensor, and a depth sensor.
  • The camera may be, for example, a camera having an angle of view of 90 degrees, an all-around camera having an angle of view of 360 degrees, a stereo camera, or a multi-lens camera.
  • The layout of the furniture installed in the room can be detected based on the detection results of the camera, the object detection sensor, and the depth sensor.
  • The sensor unit 213 may include environment sensors that detect the environment in the room, such as an illuminance sensor, a temperature sensor, and a humidity sensor.
  • The sensor unit 213 may include an infrared sensor or a human detection sensor.
  • The sensor unit 213 may also include biological sensors that detect the user's pulse, sweating, brain waves, myoelectric potential, exhalation, and the like. Some or all of the sensors constituting the sensor unit 213 may be externally or wirelessly connected to the agent device 210.
  • The communication unit 214 interconnects the agent device 210 with external devices using wired communication such as Ethernet (registered trademark) or wireless communication such as Wi-Fi (registered trademark).
  • In this embodiment, the companion device 220 is assumed as an external device of the agent device 210. For example, environment information sensed by the sensor unit 213 and information such as the device name of the agent device 210 are transmitted to the companion device 220 via the communication unit 214.
  • FIG. 2 mainly illustrates components necessary for the location recommendation, but it is assumed that the agent device 210 includes components other than those illustrated.
  • The communication unit 214 may also be connected to an external network such as the Internet via an access point or a router, although this is not shown.
  • Although the agent device 210 may be battery-powered, it is assumed that power is supplied from a commercial power supply via a power cable (not shown), because the device needs to respond to inquiries from the user 24 hours a day.
  • The companion device 220 is a device that accompanies the user when the placement of the agent device in the room is recommended to the user.
  • The companion device 220 is realized by executing a predetermined program (in this specification, also tentatively referred to as the “companion” application) on an information terminal carried by the user, such as a smartphone or a tablet.
  • The companion device 220 illustrated in FIG. 2 includes a control unit 221 including a CPU and a memory, a display unit 222 that presents information processed by the control unit 221 to the outside, a photographing unit 223, a communication unit 224, an information encryption unit 225, and an information decryption unit 226.
  • The control unit 221 executes the companion application and, in cooperation with the computing device 230, carries out processing for recommending to the user where to place the agent device 210 in the room.
  • The display unit 222 is, for example, the screen or speaker of a smartphone.
  • The photographing unit 223 is a camera mounted on an information terminal such as a smartphone, and is mainly used to photograph the scene of the room where the agent device 210 is to be placed.
  • The communication unit 224 interconnects the companion device 220 with external devices using wired communication such as Ethernet (registered trademark) or wireless communication such as Wi-Fi (registered trademark).
  • The communication unit 224 may also support a wireless communication method of a mobile communication system such as LTE (Long Term Evolution) or LTE-Advanced.
  • The agent device 210 and the computing device 230 are assumed as the external devices to which the companion device 220 connects using the communication unit 224.
  • Environment information sensed by the sensor unit 213 and information such as the device name of the agent device 210 are received from the agent device 210 via the communication unit 224.
  • The information received from the agent device 210 is transmitted to the computing device 230 via the communication unit 224, and information regarding candidate positions of the agent device 210 in the room is received from the computing device 230 via the communication unit 224.
  • The information on the candidate placement of the agent device 210 is presented to the user using, for example, the display unit 222.
  • The information transmitted to and received from external devices via the communication unit 224 includes privacy-related information that the user does not want to disclose to the outside. Therefore, when transmitting information to an external device, the companion device 220 uses the information encryption unit 225 to perform encryption processing that converts the part of the transmitted information that the user does not want to disclose into a format that the external device cannot interpret. When information subjected to this encryption processing is returned from the external device, the companion device 220 uses the information decryption unit 226 to perform decryption processing that restores the encrypted part to its original state.
  • Here, the agent device 210 is an external device that does not require information encryption.
  • On the other hand, the entity of the computing device 230 is assumed to be on a cloud (described later), and it corresponds to an external device that requires information encryption. Details of the encryption and decryption processing will be described later.
  • The computing device 230 is a device that, in cooperation with the companion device 220, provides a “placement recommendation service” that recommends to the user where to place the agent device 210 in the room.
  • The entity of the computing device 230 is, for example, a software module executed on a cloud, and it is assumed to recommend placements for a large number of agent devices installed in individual homes and the like.
  • The term “cloud” here generally refers to cloud computing (Cloud Computing). The cloud provides computing services via a network such as the Internet.
  • The computing device 230 shown in FIG. 2 includes a control unit 231 including a CPU and a memory, a communication unit 232, an agent device table 233, and a furniture information table 234.
  • The control unit 231 cooperates with the companion device 220 to execute an application program for recommending to the user where to place the agent device 210 in the room.
  • The control unit 231 acquires the information necessary for executing this application from the agent device table 233 and the furniture information table 234 as appropriate.
  • The communication unit 232 interconnects the computing device 230 with external devices using wired communication such as Ethernet (registered trademark) or wireless communication such as Wi-Fi (registered trademark).
  • The companion device 220 is assumed as an external device of the computing device 230. From the companion device 220, information such as the environment information of the agent device 210 (for example, an image of the inside of the room) and the device name is received via the communication unit 232. The control unit 231 then calculates information on candidate placements of the agent device 210 based on the received information, and transmits the calculation results to the companion device 220 via the communication unit 232.
  • The agent device table 233 is a table that summarizes, for each agent device name, the device's capabilities (functions), device specifications, places suitable for installation, and devices and places to be avoided; details will be described later.
  • The furniture information table 234 is a table that summarizes the moving cost of each piece of furniture by furniture name; details will be described later. The moving cost quantifies the difficulty of moving a piece of furniture (or the burden that moving it places on the user) based on its weight, volume, installation conditions, and the like. A rough sketch of both tables as data structures follows below.
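  • As a purely illustrative sketch (not part of the patent text), the two tables can be pictured as simple lookup structures keyed by device name and furniture name. The field names and example values below are assumptions.

      # Minimal Python sketch of the two lookup tables (field names and values are assumptions).
      AGENT_DEVICE_TABLE = {
          "XXXX": {
              "capabilities": ["voice_operation", "speaker_playback"],
              "spec": {"microphones": 2, "speakers": 1},
              "suitable_places": ["room_center"],
              "avoid": ["television_receiver", "speaker", "kitchen"],
          },
          "ZZZZ": {
              "capabilities": ["voice_operation", "projector_display", "speaker_playback"],
              "spec": {"microphones": 3, "speakers": 1, "projector": True},
              "suitable_places": ["flat_white_wall"],
              "avoid": ["television_receiver", "speaker", "kitchen", "window"],
          },
      }

      # Moving cost quantifies how hard a piece of furniture is to move (higher = harder).
      FURNITURE_INFO_TABLE = {
          "photo_stand": 1,
          "vase": 1,
          "speaker": 2,
          "television_receiver": 4,
          "sofa": 6,
          "refrigerator": 9,
      }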
  • It is assumed that the computing device 230, which is a cloud server, simultaneously provides the service of recommending agent device placements to each of a large number of companion devices 220.
  • Alternatively, the computing device 230 may not be on a cloud but may instead be a PC (Personal Computer) installed in the same room as the agent device 210 and the companion device 220, or another application executed in parallel on the information terminal that runs the companion application (in the latter case, the companion device 220 and the computing device 230 are built simultaneously on the same hardware device).
  • The process for recommending the placement of the agent device 210 to the user is performed, for example, during the initial setup of the agent device 210. Alternatively, the process may be executed when the user wants to change the placement of the agent device 210 or when the layout of the room is changed.
  • First, the user starts the companion application on the companion device 220 (step 1). The user then uses the photographing unit 223 to photograph the corner where the agent device 210 is to be placed (step 2).
  • For example, the user photographs each of the candidate positions 401 to 403 facing toward the candidate position from inside the room (see FIG. 5), and also photographs with his or her back to each of the candidate positions 401 to 403 (see FIG. 6).
  • The reason for photographing toward the candidate position is to confirm the suitability of the projection surface for the projector (when the agent device has a projector display function).
  • The reason for photographing with one's back to the candidate position is to confirm the suitability of input devices such as the microphone or camera and the acoustic environment of the speaker.
  • FIGS. 5 and 6 are only examples of the method of photographing a candidate position for installing an agent device, and the method is not limited thereto.
  • Other possible shooting methods include shooting the entire room as a zoomed-out, pulled-back image, shooting a panorama from the center of the room, and shooting the front, right, rear, and left directions at 90-degree intervals from the center of the room.
  • A shooting method that looks down on the candidate position from above, or one that looks up at it from below, can also be used.
  • Next, the companion device 220 transmits to the computing device 230 data including an image of the candidate position where the agent device 210 is to be installed and information such as the device name of the agent device 210 (step 3).
  • The data transmission to the computing device 230 is performed, for example, in the form of uploading data from a smartphone to the cloud.
  • When the information transmitted from the companion device 220 to the computing device 230 includes information that the user does not want to disclose to the outside, that information is encrypted by the information encryption unit 225 before being transmitted to the computing device 230. Details of the encryption process will be described later.
  • Next, the computing device 230 evaluates the candidate positions for installing the agent device 210 (step 4). Specifically, the computing device 230 refers to the agent device table 233 to acquire information on the functions of the queried agent device 210 and on the places that suit its specifications. The computing device 230 also performs object recognition on the room based on the transmitted captured image. During object recognition, it detects the positions of devices, such as a television receiver, that conflict with the voice input/output functions of the agent device 210. When the target agent device 210 has a projector display function, it also detects white walls, judges the unevenness of the walls, identifies windows that would interfere with a projected image, detects brightness, and so on.
  • The computing device 230 then determines, for each candidate position in the room, whether it is a place that suits the functions and specifications of the agent device, whether a device to be avoided is nearby, and whether the position itself is a place to be avoided. Further, the computing device 230 quantifies the degree to which each candidate position suits the agent device 210 and assigns priorities.
  • In addition, the computing device 230 refers to the furniture information table 234 and plans the movement of furniture that is an obstacle at a candidate position, or plans another candidate position within the room (step 5). For example, if a vase placed on the center table prevents the user from seeing the image projected by the projector, moving the vase is planned.
  • The computing device 230 then assigns priorities to the candidate positions received from the companion device 220, the furniture movement plans, and the candidate positions it has planned itself, creates proposal data for the installation position of the agent device 210 (step 6), and transmits this to the companion device 220 (step 7).
  • The companion device 220 outputs the proposal data for the installation position of the agent device 210 received from the computing device 230 to the display unit 222 and presents it to the user (step 8).
  • The user changes the installation position of the agent device 210 in the room based on the proposal data presented by the companion device 220, and moves furniture as necessary (step 9).
  • After installing the agent device 210 at a desired position, the user may again photograph the agent device 210, upload the photographed image to the computing device 230, and request the computing device 230 to verify the actual installation location. At the time of shooting, the image that the agent device 210 projects onto the wall with its projector display function may be photographed as well. The computing device 230 then verifies whether the current installation position of the agent device 210 is appropriate. For example, the computing device 230 re-creates proposal data for fine position adjustment so that the projection size of the projector becomes the desired size (or so that trapezoidal distortion is corrected), and returns the proposal data to the companion device 220. The companion device 220 presents the received proposal data to the user again, and the user looks at it and adjusts the position of the agent device 210.
  • The computing device 230 evaluates each candidate position in light of the functions and device specifications of the agent device. For example, for an agent device without a projector display function, candidate positions can be evaluated without considering the presence of a wall onto which images can be projected, whereas for an agent device with a projector display function, candidate positions are evaluated with priority given to the presence of such a wall.
  • In order to evaluate candidate positions for installing an agent device, the computing device 230 uses the agent device table 233, which summarizes, for each agent device name, the device's capabilities (functions), device specifications, places suitable for installation, and devices and places to be avoided.
  • FIG. 7 shows a configuration example of the agent device table 233.
  • The companion device 220 sends the computing device 230 an image of the corner where the agent device is to be placed and the device name of the agent device to be placed.
  • Here, a case will be described in which the captured image shown in FIG. 8 and the agent device name “XXXX” are transmitted from the companion device 220 to the computing device 230.
  • The computing device 230 performs object recognition on the captured image sent from the companion device 220 and grasps the environment of the room where the agent device is to be installed, such as the layout of the furniture and the color of the walls.
  • In the example shown in FIG. 8, it can be grasped through object recognition that a dining table is placed at the front of a room enclosed by white walls, that in the living area a sideboard and a television receiver are installed near the wall with a sofa facing the television screen, that a rug is laid in front of the sofa with a center table placed on it, and that a side table is placed on the right side of the sofa.
  • The computing device 230 refers to the agent device table 233 for the agent device name “XXXX” and learns that this device has two microphones and one speaker, that it has voice operation and speaker playback functions, that the center of the room is a suitable place for it, and that television receivers, speakers, and the kitchen are devices or places to be avoided.
  • The computing device 230 then compares the object recognition results of the captured image with the information obtained from the agent device table 233 on the places suitable for the agent device and on the devices or places to be avoided.
  • In this way, candidate positions are calculated together with priorities.
  • Specifically, the computing device 230 calculates, for each candidate position, a score that quantifies how well the position suits the agent device, based on the information on places suitable for the agent device and on devices or places to be avoided, and assigns priorities in descending order of score. Details of the method of calculating the score for each candidate position will be described later (a rough sketch is given below).
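  • The patent does not fix the scoring to a particular formula; the following Python sketch is only one possible, assumed realization in which recognized objects near a candidate add to or subtract from its score. All weights, thresholds, and helper names are hypothetical.

      from math import hypot

      def score_candidate(candidate, recognized_objects, device_profile,
                          near_threshold=1.5, w_suitable=10.0, w_avoid=8.0):
          """Assumed additive score: reward nearby suitable places, penalize nearby items to avoid.

          candidate          -- (x, y) position in room coordinates
          recognized_objects -- list of {"label": str, "pos": (x, y)} from object recognition
          device_profile     -- one entry of the agent device table
          """
          score = 0.0
          for obj in recognized_objects:
              d = hypot(candidate[0] - obj["pos"][0], candidate[1] - obj["pos"][1])
              if d >= near_threshold:
                  continue
              closeness = near_threshold - d
              if obj["label"] in device_profile["suitable_places"]:
                  score += w_suitable * closeness
              if obj["label"] in device_profile["avoid"]:
                  score -= w_avoid * closeness
          return score

      def rank_candidates(candidates, recognized_objects, device_profile):
          # Assign priorities in descending order of score.
          return sorted(candidates,
                        key=lambda c: score_candidate(c, recognized_objects, device_profile),
                        reverse=True)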
  • Suppose, for example, that the computing device 230 obtains the highest score for the side table on the right side of the sofa, followed by the dining table.
  • The computing device 230 creates proposal data indicating the candidate positions with high scores and returns the proposal data to the companion device 220.
  • The companion device 220 outputs the proposal data for the installation position of the agent device 210 received from the computing device 230 to the display unit 222 and presents it to the user.
  • Specifically, the companion device 220 superimposes an agent device icon on each high-scoring candidate position in the image of the corner where the agent device is to be placed, and displays the result on the screen.
  • In the example shown in FIG. 9, icons 901 and 902 of the agent device are displayed superimposed on the side table on the right side of the sofa and on the dining table, respectively.
  • Through a display screen such as the one shown in FIG. 9, the user can easily grasp where in the room it is recommended to place the agent device, and can easily picture how the room will look with the agent device installed at the candidate position.
  • A numerical value indicating the priority order may be displayed near each of the agent device icons 901 and 902.
  • The process of superimposing the icon of the agent device on the captured image may be performed by the computing device 230, or may be performed by the companion device 220 based on information received from the computing device 230.
  • If the superimposed icon image corresponds to the actual appearance of the agent device to be placed (that is, the appearance of the agent device with the device name “XXXX”), the user can easily imagine how the agent device will look when installed in the room. A simple compositing sketch is given below.
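  • As a purely illustrative sketch of the icon superimposition (the patent does not describe it at this level of detail), a pre-rendered image of the device's appearance can simply be pasted at the candidate position. The library choice, file names, and coordinates below are assumptions.

      from PIL import Image

      def superimpose_agent_icon(room_image_path, icon_path, position, out_path):
          """Paste an agent device icon at a candidate (x, y) position in the room photo."""
          room = Image.open(room_image_path).convert("RGBA")
          icon = Image.open(icon_path).convert("RGBA")  # e.g. rendered appearance of device "XXXX"
          room.paste(icon, position, mask=icon)         # alpha channel keeps the background visible
          room.convert("RGB").save(out_path)

      # Hypothetical usage:
      # superimpose_agent_icon("room.jpg", "agent_xxxx.png", (640, 380), "proposal.jpg")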
  • In this case too, the computing device 230 performs object recognition on the captured image sent from the companion device 220 and can grasp the environment of the room where the agent device is to be installed, such as the furniture layout and wall color (same as above).
  • The computing device 230 refers to the agent device table 233 for the agent device name “YYYY” and learns that this device has three microphones, a camera with a 360-degree angle of view, and one speaker, that it has a camera shooting function, that the center of the room is a suitable place for 360-degree (all-around) shooting, and that television receivers, speakers, the kitchen, and windows are devices or places to be avoided.
  • The computing device 230 then calculates, for each candidate position, a score that quantifies how well the position suits the agent device, based on the information on places suitable for the agent device and on devices or places to be avoided, and assigns priorities in descending order of score (details of the score calculation method will be described later). Places near windows, which are prone to oblique light when the user approaches the camera, are avoided. Suppose, for example, that the computing device 230 obtains a high score for the position on the dining table near the center of the room. The computing device 230 creates proposal data indicating the candidate position with the high score and returns it to the companion device 220.
  • The companion device 220 outputs the proposal data for the installation position of the agent device 210 received from the computing device 230 to the display unit 222 and presents it to the user.
  • FIG. 10 shows a display example of a screen in which the icon 1001 of the agent device is superimposed on the high-scoring candidate position on the dining table near the center of the room. It is preferable that the icon image correspond to the actual appearance of the agent device to be placed (that is, the appearance of the agent device with the device name “YYYY”).
  • Through a display screen such as the one shown in FIG. 10, the user can easily grasp where in the room it is recommended to place the agent device, and can easily picture how the room will look with the agent device installed at the candidate position.
  • In this case too, the computing device 230 performs object recognition on the captured image sent from the companion device 220 and can grasp the environment of the room where the agent device is to be installed, such as the furniture layout and wall color (same as above).
  • The computing device 230 refers to the agent device table 233 for the agent device name “ZZZZ” and learns that this device has three microphones, one speaker, and a projector, and that it has voice operation, projector display, and speaker playback functions.
  • It also learns that a place near a flat white wall is suitable for projecting images, and that television receivers, speakers, the kitchen, and windows are devices or places to be avoided.
  • The computing device 230 then calculates, for each candidate position, a score that quantifies how well the position suits the agent device, based on the information on places suitable for the agent device and on devices or places to be avoided, and assigns priorities in descending order of score.
  • Windows, onto which images cannot be projected, and places where obstacles such as a television receiver cover the wall are avoided.
  • Walls that are brightly lit by another light source, such as those near a window or a lighting fixture, are also avoided.
  • Suppose, for example, that the computing device 230 obtains a high score for the position on the dining table near a wall of the room.
  • The computing device 230 creates proposal data indicating the candidate position with the high score and returns it to the companion device 220.
  • The companion device 220 outputs the proposal data for the installation position of the agent device 210 received from the computing device 230 to the display unit 222 and presents it to the user.
  • FIG. 11 shows a display example of a screen in which the icon 1101 of the agent device is superimposed on the high-scoring candidate position on the dining table. It is preferable that the icon image correspond to the actual appearance of the agent device to be placed (that is, the appearance of the agent device with the device name “ZZZZ”).
  • Through a display screen such as the one shown in FIG. 11, the user can easily grasp where in the room it is recommended to place the agent device, and can easily picture how the room will look with the agent device installed at the candidate position.
  • In order to receive from the computing device 230 the information service concerning candidate positions for installing the agent device, the companion device 220 must transmit to the outside an image of the corner where the agent device is to be placed, together with the agent device name.
  • However, the photographed image may include privacy-related information that the user does not want to disclose to the outside.
  • The shooting methods include shooting toward the candidate position, shooting with the candidate position at one's back, shooting the entire room as a pulled-back image, and shooting a panorama from the center of the room (or shooting the front, right, rear, and left directions at 90-degree intervals), as described above.
  • The wider the camera's field of view, the greater the risk that privacy-related information will be captured in the image.
  • Therefore, before transmission, the information encryption unit 225 performs encryption processing that converts the part of the image that the user does not want to disclose into a format that cannot be interpreted.
  • When the processed information is returned, the companion device 220 performs decryption processing that restores the encrypted portion to its original state using the information decryption unit 226.
  • Suppose that the image of the corner where the agent device is to be placed shows several family members, as in FIG. 12.
  • In this case, the companion device 220 uses the information encryption unit 225 to encrypt the information by blurring the face areas of the family members so that individuals cannot be identified, and then transmits the image to the computing device 230.
  • The information encryption unit 225 may use image processing such as mosaicking in addition to blurring.
  • On the computing device 230 side, even for an image in which human faces are blurred, the layout of the furniture in the room, the positions of the windows, and the like can be grasped by object recognition, so each candidate position for installing the agent device can still be evaluated correctly.
  • The computing device 230 therefore does not need to be aware of whether the captured image sent from the companion device 220 has been encrypted by blurring or the like.
  • An image in which the icon of the agent device is superimposed on the candidate position in the face-blurred captured image is then returned from the computing device 230 to the companion device 220.
  • The companion device 220 may directly present the user with the image in which the icons 1401 and 1402 of the agent devices are superimposed, as shown in FIG. 14.
  • Alternatively, the information decryption unit 226 may create an image in which the icons 1501 and 1502 of the agent devices are re-mapped onto the captured image with the blurring restored to its original state, and present that image to the user, as shown in FIG. 15. A minimal sketch of the blurring step is given below.
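  • The following is a minimal sketch, not taken from the patent, of the blurring-based obfuscation on the companion device 220 side; it assumes OpenCV's bundled Haar cascade for face detection, and the restoration step simply keeps the original image locally and re-maps the returned icon positions onto it.

      import cv2

      def blur_faces(image_path, out_path):
          """Blur detected face regions so that individuals cannot be identified."""
          img = cv2.imread(image_path)
          gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
          cascade = cv2.CascadeClassifier(
              cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
          faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
          for (x, y, w, h) in faces:
              roi = img[y:y + h, x:x + w]
              img[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 0)
          cv2.imwrite(out_path, img)
          # The boxes can be kept locally so the companion device can restore the view later.
          return [(int(x), int(y), int(w), int(h)) for (x, y, w, h) in faces]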
  • Even when the image of the corner where the agent device is to be placed shows no family members, there are cases where the user does not want to disclose the furniture or the layout of the room.
  • In such cases, the companion device 220 performs encryption processing by replacing the image with a similar image that can safely be disclosed, and then transmits the encrypted image to the computing device 230.
  • The similar image referred to here is, for example, a generalized image obtained by replacing the entire room or individual distinctive pieces of furniture with generic imagery while keeping the arrangement of the furniture and the structure of the room the same.
  • The computing device 230 does not need to be aware of whether the captured image sent from the companion device 220 has been encrypted by generalization. On the computing device 230 side, even for a generalized image, as long as the layout of the furniture in the room, the positions of the windows, and the like remain almost the same, the same objects as in the original captured image can be recognized, so each candidate position for installing the agent device can be evaluated correctly.
  • The computing device 230 returns to the companion device 220 an image in which the icon of the agent device is superimposed on the candidate position in the generalized image.
  • The companion device 220 may present the user with the image in which the icon 1801 of the agent device is superimposed on the still-generalized image, as shown in FIG. 18.
  • Alternatively, the information decryption unit 226 may create an image in which the icon 1901 of the agent device is re-mapped onto the captured image restored to its original state, and present that image to the user (see FIG. 19).
  • FIG. 32 shows a configuration example of a UI screen 3200 for drawing an illustration of a room layout.
  • A floor plan 3201 of the room is displayed in the upper half of the illustrated UI screen, and a plurality of furniture icons 3202 are displayed in the lower half.
  • The floor plan 3201 can be automatically generated by, for example, capturing a three-dimensional image of the actual room, converting the visible edges of the three-dimensional objects into solid lines, and converting hidden edges into dashed lines as necessary.
  • Alternatively, the floor plan creation application may prepare a standard floor plan, such as a square one, in advance, or may allow the user to select from among several floor plan samples.
  • Furniture icons 3202 represent main furniture such as a refrigerator, a television receiver, a sink, a sofa, a speaker, an outlet, a table, a sideboard, and a window in standard patterns.
  • The user can create an illustration of the room layout by selecting the icons corresponding to the furniture actually placed in the room and arranging them at the corresponding locations on the floor plan 3201 in the upper half of the screen.
  • The information encryption unit 225 may automatically convert a captured image into a room layout illustration.
  • Conversely, the information decryption unit 226 may automatically restore the floor plan illustration to the original captured image.
  • A UI screen for drawing an illustration, such as the one shown in FIG. 32, is displayed on the screen of the smartphone, and the user can draw the floor plan illustration by touching the UI screen.
  • Alternatively, the floor plan creation application may be started on an information terminal, such as a tablet or a personal computer, that allows easier editing than the companion device 220, and the created illustration may be transmitted from that information terminal to the companion device 220.
  • The illustration of the room layout is transmitted from the companion device 220 to the computing device 230 together with the device name of the agent device.
  • In the illustration, each piece of furniture is represented by an icon, so the computing device 230 can identify the furniture more accurately than by performing object recognition on a captured image.
  • The computing device 230 evaluates candidate positions for installing the agent device within the illustration of the room layout.
  • The icon of the agent device is then superimposed on the recommended place in the illustration, and the illustration is returned to the companion device 220.
  • The companion device 220 displays the illustration with the agent device icon 3301 superimposed on the UI screen 3200, as shown in FIG. 33, and proposes the installation location of the agent to the user.
  • Alternatively, instead of sending the room layout illustration itself to the computing device 230, the companion device 220 may send geometric information indicating the shape of the room layout and numerical information such as the position of each furniture icon.
  • On the computing device 230 side, the illustration of the room layout can be reproduced based on the received numerical information.
  • Likewise, the computing device 230 may return the position information of the recommended installation place instead of an illustration in which the agent device icon is superimposed on the recommended installation place.
  • In that case, the companion device 220 may perform the process of superimposing the agent device icon 3301 on the floor plan 3201 of the room.
  • As yet another alternative, the companion device 220 may replace the sofa, center table, and other objects included in the image with their object names and position information on the image, as shown below, and transmit these to the computing device 230.
  • From the object names and the position information on the image, the computing device 230 calculates candidate agent positions such as the following.
  • Agent1 (x2, y2, h2, w2)
  • Since the image itself is not sent from the companion device 220, the computing device 230 does not superimpose the agent device icon on an image but returns the information on the candidate positions of the agent to the companion device 220 as it is. On the companion device 220 side, based on the received information on the candidate positions, the icon of the agent device may be superimposed on the corresponding positions in the captured image and presented to the user. Note that only the position information of objects such as furniture and the agent device is exchanged between the computing device 230 and the companion device 220, and image information with a large data volume is not transmitted, which reduces the required communication bandwidth. This exchange can be pictured as the small payload sketched below.
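  • A purely illustrative sketch of such a position-only exchange follows; the JSON field names are assumptions, and the coordinates are placeholders in the (x, y, h, w) form used above.

      import json

      # Sent from the companion device 220 to the computing device 230 (no image data).
      request = {
          "device_name": "XXXX",
          "objects": [
              {"name": "Sofa",        "bbox": [120, 300, 180, 260]},
              {"name": "CenterTable", "bbox": [340, 360, 90, 140]},
          ],
      }

      # Returned from the computing device 230: candidate positions only.
      response = {
          "candidates": [
              {"name": "Agent1", "bbox": [520, 280, 60, 60], "priority": 1},
          ],
      }

      uploaded = json.dumps(request)                 # upload from the companion device
      reply = json.loads(json.dumps(response))       # parsed reply on the companion side
      for c in reply["candidates"]:
          print(c["name"], c["bbox"])                # superimpose icons locally at these boxes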
  • Example of Operation Considering the Movement of Furniture: Next, a method will be described in which the placement recommendation system 200 proposes candidate positions for installing the agent device while also taking into account the movement of objects in the room, such as furniture.
  • The furniture information table 234 is a table that describes the furniture that can be moved by the user and its moving costs.
  • The moving cost quantifies the difficulty of moving a piece of furniture (or the burden that moving it places on the user) based on its weight, volume, installation conditions, and the like.
  • FIG. 20 shows a configuration example of the furniture information table 234.
  • Low moving costs are assigned to lightweight, easy-to-move furniture such as speakers, photo stands, and vases, and high moving costs are assigned to heavy objects such as closets and refrigerators.
  • The computing device 230 refers to the agent device table 233 (see FIG. 7) for the agent device name “ZZZZ” and recognizes that the target agent device has a projector display function.
  • The computing device 230 also performs object recognition on the captured image 2100 and finds that the white wall 2102 facing the sofa 2101 has no irregularities. The computing device 230 can therefore identify the top of the cabinet 2103 as a candidate position suitable for projection.
  • The computing device 230 then refers to the furniture information table 234 for “photo stand”, confirms that its moving cost is low, creates proposal data that sets the top of the cabinet 2103 as a candidate position while moving the photo stand 2104 to another location, and returns the proposal data to the companion device 220.
  • At this time, the computing device 230 may composite onto the captured image shown in FIG. 21 an image (not shown) in which the photo stand 2104 has been removed from the top of the cabinet 2103 and an agent device with the device name “ZZZZ” has been installed there, and transmit the result to the companion device 220.
  • The computing device 230 may also transmit to the companion device 220 a message notifying the user that the photo stand 2104 should be moved.
  • The companion device 220 presents the image received from the computing device 230 to the user and, if a message accompanies the image, presents the message as well, thereby prompting the user to move the photo stand 2104 and install the agent device on the cabinet 2103. A sketch of how such move proposals can be gated by the moving cost is given below.
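  • As a purely illustrative sketch (the patent does not specify this logic), a candidate that is blocked by furniture could be kept or discarded depending on whether the moving cost in the furniture information table stays below some threshold; the threshold and penalty values are assumptions.

      def plan_obstruction_moves(obstructions, furniture_costs,
                                 max_move_cost=5, penalty_per_cost=0.5):
          """Return (score_penalty, move_plan) for a blocked candidate, or None to discard it.

          obstructions    -- names of furniture blocking the candidate, e.g. ["photo_stand"]
          furniture_costs -- the furniture information table (name -> moving cost)
          """
          move_plan, score_penalty = [], 0.0
          for name in obstructions:
              cost = furniture_costs.get(name, max_move_cost + 1)
              if cost > max_move_cost:
                  return None  # too burdensome to move; do not propose this candidate
              move_plan.append(f"move {name} away from the candidate position")
              score_penalty += penalty_per_cost * cost  # moving things lowers the priority a little
          return score_penalty, move_plan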
  • Likewise, the computing device 230 refers to the agent device table 233 (see FIG. 7) for the agent device name “ZZZZ” and recognizes that the target agent device has a projector display function.
  • The computing device 230 also performs object recognition on the captured image 2200 and, since the white wall 2202 facing the sofa 2201 has no irregularities, can identify the top of the sideboard 2203 as a candidate position suitable for projection.
  • However, the television receiver 2204 and the vase 2205 are already placed on the sideboard 2203.
  • The television receiver 2204 and the vase 2205 obstruct the placement of the agent device on the sideboard 2203.
  • Moreover, the television receiver is specified as a device to be avoided in the agent device table 233 (see FIG. 7). Therefore, the computing device 230 refers to the furniture information table 234 for “television receiver” and “vase”, confirms that their moving costs are not high, creates proposal data that sets the candidate position 2206 on the sideboard 2203, moves the television receiver 2204 away from the candidate position 2206 on the sideboard 2203, and moves the vase 2205 to another location, and returns the proposal data to the companion device 220.
  • At this time, with respect to the captured image shown in FIG. 22, the computing device 230 may composite, for example, an image (not shown) in which the television receiver 2204 has been moved away from the candidate position 2206 on the sideboard 2203, the vase 2205 has been removed from the sideboard 2203, and an agent device with the device name “ZZZZ” has been installed on the sideboard 2203, and transmit the result to the companion device 220.
  • The computing device 230 may also transmit to the companion device 220 a message notifying the user that the television receiver 2204 and the vase 2205 should be moved.
  • The companion device 220 presents the image received from the computing device 230 to the user and, if a message accompanies the image, presents the message as well, thereby prompting the user to move the television receiver 2204 and the vase 2205 and place the agent device on the sideboard 2203.
  • The computing device 230 refers to the agent device table 233 (see FIG. 7) for the agent device name “WWWW” and finds that the target agent device is equipped with a camera shooting function and that a place near a wall is suitable for it. Also, by performing object recognition on the captured image, the computing device 230 can identify the telephone stand 2301 placed against the wall as a candidate position suitable for camera shooting.
  • The computing device 230 then refers to the furniture information table 234 for “sofa”, confirms that its moving cost is not high, creates proposal data that sets a candidate position on the telephone stand 2301 and turns the sofa 2302 toward the telephone stand 2301, and returns the proposal data to the companion device 220.
  • At this time, the computing device 230 may composite onto the captured image shown in FIG. 23, for example, an image 2302' in which the sofa 2302 has been turned and an image in which an agent device with the device name “WWWW” is installed on the telephone stand 2301, and transmit the result to the companion device 220. The computing device 230 may also transmit to the companion device 220 a message notifying the user that the orientation of the sofa 2302 should be changed.
  • FIG. 24 illustrates a composite image created in this way based on FIG. 23.
  • The companion device 220 presents the image received from the computing device 230 to the user and, if a message accompanies the image, presents the message as well, thereby prompting the user to change the orientation of the sofa 2302 and install the agent device 2401 on the telephone stand 2301.
  • the difficulty of movement may differ depending on the individual circumstances of each home, so that the user may be able to customize the furniture information table 234.
  • a speaker is lightweight and easy to carry, but if the user does not want to move the speaker from the viewpoint of the sound effect, the moving cost is increased, or if the vase is fixed from the viewpoint of the beauty of the room, the moving cost is increased. .
  • conversely, a moving cost may be lowered to make it easier to propose the corresponding spot as a candidate position for the agent device.
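As an illustration of how such per-household customization might be represented, the sketch below models the furniture information table 234 as a simple mapping from object category to moving cost with user overrides. The category names, default values, and function name are assumptions made for this sketch, not the actual data format used by the arithmetic device 230.

```python
# Minimal sketch (assumed data layout): a user-adjustable furniture moving-cost table.
DEFAULT_MOVE_COSTS = {
    "television receiver": 3.0,  # heavy, cabling involved
    "vase": 1.0,                 # light and easy to carry
    "speaker": 1.0,
    "sofa": 4.0,
    "table": 2.5,
}

def customized_move_costs(user_overrides):
    """Return a moving-cost table with per-household overrides applied.

    user_overrides: dict such as {"speaker": 8.0} when the user does not want
    the speaker moved for acoustic reasons, or {"vase": 0.5} when the vase
    may be moved freely.
    """
    costs = dict(DEFAULT_MOVE_COSTS)
    costs.update(user_overrides)
    return costs

# Example: raise the speaker's cost, lower the vase's cost.
costs = customized_move_costs({"speaker": 8.0, "vase": 0.5})
```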
  • the embodiment has been described in which the installation location of the agent device is proposed, focusing mainly on the functions of the agent device.
  • the farther the installation location is from an outlet, the longer the power cable becomes, which may interfere with walking and impair the aesthetics of the room.
  • in some cases an extension cord is required, or commercial power cannot reach the location at all and the device must be battery-driven (which makes charging troublesome).
  • an installation cost table that defines the installation cost according to the distance from the outlet (see FIG. 25) may therefore be prepared, and the arithmetic device 230 may propose the installation location of the agent device by taking this installation cost into account, in addition to the places suitable for the function of the agent device (and the devices or places to be avoided) and the moving cost of the furniture.
  • when the arithmetic device 230 receives from the companion device 220 an image of the corner where the agent device is to be placed, it performs object recognition on the captured image and searches for the position of the outlet in the room. When the outlet cannot be recognized because it is hidden in the captured image, the position of the outlet in the room may be estimated based on empirical rules or learned data about the architectural design of houses.
  • the arithmetic device 230 searches for a place suitable for the target agent device to perform its functions, taking the installation cost into consideration together with the furniture moving cost. For example, when two or more candidate positions have the same score, considering the installation cost of each candidate position gives the higher priority to the candidate position closest to the outlet.
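A minimal sketch of this tie-breaking step follows; it assumes candidate positions that already carry scores and a helper that measures the distance to the nearest recognized (or estimated) outlet, neither of which is specified in the description.

```python
import math

def prioritize_candidates(candidates, outlets):
    """Order candidate positions by score, breaking ties by outlet distance.

    candidates: list of (x, y, score) tuples.
    outlets: list of (x, y) outlet positions recognized or estimated in the room.
    """
    def outlet_distance(candidate):
        x, y, _ = candidate
        return min(math.hypot(x - ox, y - oy) for ox, oy in outlets)

    # Higher score first; among equal scores, the position closest to an outlet wins.
    return sorted(candidates, key=lambda c: (-c[2], outlet_distance(c)))
```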
  • Operation Examples According to Room Layout Changes: Although various operation examples have been described above, they have in common that a place suitable for the agent device to perform its functions is searched for according to the room layout. Therefore, when the layout of the room changes, such as when furniture is moved, relocation of the agent device may be proposed.
  • the companion device 220 may be used to request the computing device 230 for a proposal regarding the installation location of the agent device 210, according to the procedure already described with reference to FIG. 3.
  • alternatively, the agent device 210 may detect, based on sensor information or the like acquired by the sensor unit 213, that an event that changes the installation location suitable for the agent device 210 itself, such as a change in the room layout, has occurred, and may notify the user that the device should be relocated using the display unit 212, or notify the companion device 220 via the communication unit 214.
  • the sensor unit 213 senses the environment in the room where the agent device 210 is installed (S2601), and transmits environment data to the CPU in the control unit 211 at regular intervals (S2602).
  • the CPU temporarily stores the received environment data in the memory (S2603), reads the environment data stored in the past and compares it with the current environment data (S2604), and determines whether the room environment has changed in a way that calls for changing the installation location of the agent device 210 itself.
  • when the sensor unit 213 includes a camera, it sends images of the room taken by the camera to the CPU in the control unit 211. The CPU sequentially and temporarily stores the photographed images of the room in the memory, calculates the difference between photographed images at regular intervals, and detects changes in the room environment, such as movement of furniture, that call for changing the installation location of the agent device 210 itself. For example, as shown in FIG. 27, by taking the image difference between the photographed image 2701 before the change and the photographed image 2702 after the change, it can be detected that the position and orientation of the sofa 2710 have changed.
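The kind of frame differencing the control unit 211 could perform is sketched below. The use of OpenCV, the pixel threshold, and the changed-area ratio are illustrative assumptions rather than the actual implementation.

```python
import cv2
import numpy as np

def room_layout_changed(prev_img_path, curr_img_path, pixel_thresh=30, area_ratio=0.05):
    """Return True if the room image differs enough to suggest furniture was moved."""
    prev = cv2.imread(prev_img_path, cv2.IMREAD_GRAYSCALE)
    curr = cv2.imread(curr_img_path, cv2.IMREAD_GRAYSCALE)
    curr = cv2.resize(curr, (prev.shape[1], prev.shape[0]))

    diff = cv2.absdiff(prev, curr)                      # per-pixel difference
    _, mask = cv2.threshold(diff, pixel_thresh, 255, cv2.THRESH_BINARY)
    changed = np.count_nonzero(mask) / mask.size        # fraction of changed pixels
    return changed > area_ratio
```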
  • when the sensor unit 213 includes a human sensor, it also sends the human sensor data to the CPU in the control unit 211 (S2605).
  • when it determines that the room environment has changed in such a way, the CPU in the control unit 211 notifies the user using the display unit 212 (S2606).
  • when the display unit 212 has a speaker reproduction function, the user is notified by an audio message announcing that an environmental change calling for a change in the installation location of the agent device 210 has occurred in the room, or announcing the content of the change.
  • when the display unit 212 has a projector display function, a screen indicating that the environment in the room has changed, or indicating the content of the change, is projected on a wall to notify the user (S2608).
  • the CPU in the control unit 211 may execute the expression operation by the expression unit 212 at the timing when the presence of the user can be confirmed by the human sensor of the sensor unit 213.
  • when the CPU in the control unit 211 detects a change in the environment in the room, it also notifies the companion device 220 via the communication unit 214 (S2609).
  • the companion device 220 also uses the display unit 222 to notify the user of a change in the room environment in which the installation location of the agent device 210 needs to be changed.
  • the user can know from at least one of the agent device 210 and the companion device 220 that there has been a change in the room environment that calls for changing the installation location of the agent device 210. Then, similarly to the initial setup of the agent device 210, the user can photograph the room and enjoy the service of proposing the installation position of the agent device 210 according to the procedure already described with reference to FIG. 3.
  • the arithmetic device 230 may propose a rough installation position, and may perform a fine-tuning of the actual position based on the sensing of the agent device 210 itself. There is a limit to the granularity of information that the arithmetic device 230 can acquire through object recognition of the captured image sent from the companion device 220.
  • the agent device 210, on the other hand, estimates the environment around the installation position based on the sensing results of the sensor unit 213, and can therefore adjust its own position at a finer granularity.
  • FIG. 28 schematically shows the functional configuration of the agent device 210 for performing fine adjustment of the position.
  • the illustrated agent device 210 is provided with a projector display function as the display unit 212, and it is assumed that an installation position near a wall is proposed.
  • the agent device 210 includes a six-axis sensor 2801, a distance sensor 2802, a human sensor 2803, and the like as the sensor unit 213.
  • the 6-axis sensor 2801 is, for example, a sensor unit that includes an inertial measurement unit (IMU) and measures position and orientation.
  • the distance sensor 2802 may be a distance sensor using reflection of laser, ultrasonic waves, infrared rays, or the like.
  • the control unit 211 includes a self-orientation detection unit 2811, a wall distance estimation unit 2812, and an installation position evaluation unit 2813.
  • These functional modules 2811 to 2813 may be, for example, software modules executed by a CPU in the control unit 211.
  • the self-orientation detecting unit 2811 detects the posture of the main body of the agent device 210 based on the detection signal of the six-axis sensor 2801.
  • the wall distance estimating unit 2812 estimates the distance from the main body of the agent device 210 to the wall serving as the projection plane based on the detection signal of the distance sensor 2802.
  • the installation position evaluation unit 2813 evaluates whether the installation position of the agent device 210 is appropriate based on the detected posture of the agent device 210 and the distance to the wall.
  • when the installation position evaluation unit 2813 finds that the main body of the agent device 210 is installed too far from the wall, that the wall is inclined with respect to the emission direction of the projector, or that the installation surface is inclined or unstable, the user is notified by voice reproduction or on the projector screen using the display unit 212. For example, messages such as "It looks like I am away from the wall. Please move me closer to the wall.", "I am a little tilted. Isn't it hard to read my letters?", or "Please put me on a level table; I might slip." may be displayed on the projector screen to prompt the user to improve the installation position of the agent device 210 (see FIGS. 29 to 31).
  • the agent device 210 may output these messages by voice instead of, or together with, the screen display.
  • the control unit 211 may execute the expression operation by the display unit 212 at the timing when the presence of the user can be confirmed by the human sensor 2803.
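A minimal sketch of how the installation position evaluation unit 2813 might turn these readings into user-facing messages is shown below; the threshold values, message wording, and function name are assumptions for illustration only.

```python
def evaluate_installation(posture_deg, wall_distance_m,
                          max_tilt_deg=5.0, max_wall_distance_m=0.5):
    """Return a list of user-facing messages asking to improve the placement.

    posture_deg: tilt of the device body estimated from the 6-axis sensor (degrees).
    wall_distance_m: distance to the projection wall from the distance sensor (meters).
    """
    messages = []
    if wall_distance_m > max_wall_distance_m:
        messages.append("It looks like I am away from the wall. Please move me closer to the wall.")
    if abs(posture_deg) > max_tilt_deg:
        messages.append("I am a little tilted. Please place me on a level surface.")
    return messages  # an empty list means the current placement is acceptable
```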
  • Basic operation: the calculation device 230 calculates a score score(x, y) at each position (x, y) in the room, based on information about the places suitable for the target agent device and the devices or places to be avoided, determines the optimal installation position (x opt , y opt ) at which the score becomes maximum, and presents it to the companion device 220.
  • the score score (x, y) at each position (x, y) in the room is defined as in the following equation (1).
  • one parameter indicates what kind of object is at which location.
  • A location is represented by the position (x, y) and the vertical and horizontal size of the object.
  • The location of each object can therefore be represented by the following expression (2).
  • This parameter can be expressed as in the following equation (3).
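Equations (1) to (3) themselves appear only in the drawings, so the following LaTeX rendering is a hedged reconstruction from the surrounding text: the score is some function F of the candidate position and a layout parameter (written here as theta) that records, for every recognized object, its category, position, and size. The symbols and the exact functional form in the original may differ.

```latex
% Hedged reconstruction; the symbols \theta, c_i, w_i, h_i are assumptions.
\mathrm{score}(x, y) = F(x, y;\, \theta)                              \tag{1}
\mathrm{location}_i = (x_i,\, y_i,\, w_i,\, h_i)                      \tag{2}
\theta = \{\, (c_i,\ \mathrm{location}_i) \mid i = 1, \dots, N \,\}   \tag{3}
```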
  • FIG. 34 shows, in the form of a flowchart, a basic processing procedure in which the computing device 230 recommends an optimal installation location of the agent device to the companion device 220.
  • the arithmetic device 230 receives, from the companion device 220, the captured image of the room where the agent device is installed and the device name of the target agent device (step S3401).
  • the arithmetic device 230 performs object recognition on the received captured image and estimates the attribute (category), position, and size of each object in the room (step S3402). Based on this estimation result, the parameter of the above equation (3) can be obtained.
  • the computing device 230 refers to the agent device table 233 for the device name received in step S3401, and obtains information on the locations suitable for the agent device and the devices and locations to be avoided. Then, using a function F that lowers the score as the position approaches a device or place to be avoided and raises it as the position moves away, the arithmetic device 230 calculates the score score(x, y) obtained when the agent device is placed at the position (x, y), according to the above equation (1) (step S3403).
  • next, the optimal installation position (x opt , y opt ) at which the score becomes maximum is determined (step S3404). If there are a plurality of candidate positions, the calculation is also performed for the second and third positions, and the candidate positions are prioritized based on their scores.
  • the computing device 230 then superimposes an image of the agent device at the installation position (x opt , y opt ) determined in step S3404 on the captured image received in step S3401, transmits the resulting image to the requesting companion device 220 (step S3405), and ends this processing.
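The sketch below traces steps S3401 to S3405 in code form. The object recognition call, the table lookup, and the compositing helper are placeholders with assumed interfaces, and the coarse grid search merely stands in for whatever optimization the arithmetic device 230 actually performs.

```python
def recommend_installation(captured_image, device_name,
                           recognize_objects, agent_device_table, score_fn, composite):
    """Basic flow of FIG. 34: find the highest-scoring position and return a composite image.

    recognize_objects, score_fn and composite are injected helpers (assumed interfaces):
    - recognize_objects(image) -> list of (category, x, y, w, h)
    - score_fn(x, y, objects, preferences) -> float
    - composite(image, device_name, position) -> image with the agent device superimposed
    """
    # S3402: object recognition gives the layout parameter (attribute, position, size).
    objects = recognize_objects(captured_image)

    # S3403: look up suitable places and devices/places to avoid for this device name.
    preferences = agent_device_table[device_name]

    # S3403-S3404: evaluate the score over a coarse grid and keep the best position.
    best_pos, best_score = None, float("-inf")
    width, height = captured_image.size  # assuming a PIL-style image object
    for gx in range(0, width, 20):
        for gy in range(0, height, 20):
            s = score_fn(gx, gy, objects, preferences)
            if s > best_score:
                best_pos, best_score = (gx, gy), s

    # S3405: superimpose the agent device at the chosen position and return the image.
    return composite(captured_image, device_name, best_pos), best_pos
```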
  • in addition, a parameter relating to the moving cost of each piece of furniture placed around the room and to the installation cost based on the distance from the outlet can be considered.
  • this parameter can be expressed as in the following equation (5); it includes the moving cost of each object recognized from the captured image and the installation cost according to the distance from the outlet.
  • here, g TV is the moving cost of the television receiver, g Table is the moving cost of the table, and g c is the installation cost according to the distance from the outlet.
  • the score score (x, y) at each position (x, y) in the room can be defined as in the following expression (6).
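As with equations (1) to (3), equations (5) and (6) are only in the drawings. The LaTeX below is therefore a hedged guess at their shape: a cost parameter (written here as phi) collecting the per-object moving costs and the outlet-distance installation cost, and a score that keeps the layout term of equation (1) while penalizing those costs. Treating the penalty as a simple subtraction is an assumption.

```latex
% Hedged reconstruction; the symbol \phi and the subtractive form are assumptions.
\phi = \{\, g_{\mathrm{TV}},\ g_{\mathrm{Table}},\ \dots,\ g_{c} \,\}  \tag{5}
\mathrm{score}(x, y) = F(x, y;\, \theta)
  \;-\; \sum_{i \in \text{moved objects}} g_{i}
  \;-\; g_{c}\bigl(d_{\mathrm{outlet}}(x, y)\bigr)                     \tag{6}
```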
  • FIG. 35 is a flowchart showing a processing procedure for the arithmetic device 230 to recommend an optimal installation location of the agent device to the companion device 220 in consideration of the moving cost of furniture and the installation cost according to the distance from the outlet.
  • the arithmetic device 230 receives, from the companion device 220, the captured image of the room where the agent device is installed and the device name of the target agent device (step S3501).
  • the arithmetic device 230 performs object recognition on the received captured image and estimates the attribute (category), position, and size of each object in the room (step S3502). Based on this estimation result, the parameter shown in the above equation (3) can be obtained.
  • next, the computing device 230 refers to the furniture information table 234 (see FIG. 20) and the installation cost table (see FIG. 25), and acquires the parameter relating to the furniture moving costs and to the installation cost according to the distance from the outlet (step S3503).
  • the computing device 230 refers to the agent device table 233 for the device name received in step S3501, and acquires information on the places suitable for the agent device and the devices and places to be avoided. Then, using a function F that lowers the score as the position approaches a device or place to be avoided and raises it as the position moves away, the arithmetic device 230 calculates the score score(x, y) obtained when the agent device is placed at the position (x, y), according to the above equation (6) (step S3504).
  • the arithmetic device 230 then superimposes an image of the agent device at the installation position (x opt , y opt ) determined in step S3505 on the captured image received in step S3501, combines it with an image of the furniture after movement, transmits the resulting image to the requesting companion device 220 (step S3506), and ends this processing.
  • I-3. Recommendation of an installation location in consideration of the user's desired installation location: there are cases where the user has a desired installation location for the agent device. For example, the user may regard the agent device as a piece of furniture or an interior item and be particular about where it is placed in the room for aesthetic reasons, or may plan to place other items and want to limit where the agent device can be installed; the reasons vary.
  • FIG. 36 shows, in the form of a flowchart, a processing procedure in which the computing device 230 recommends an optimal installation location of the agent device to the companion device 220 in consideration of the user's desired installation location.
  • the arithmetic device 230 receives, from the companion device 220, the captured image of the room where the agent device is installed and the device name of the target agent device (step S3601).
  • the arithmetic device 230 performs object recognition on the received captured image and estimates the attribute (category), position, and size of each object in the room (step S3602). Based on this estimation result, the parameter shown in the above equation (3) can be obtained.
  • next, the computing device 230 refers to the furniture information table 234 (see FIG. 20) and the installation cost table (see FIG. 25), and acquires the parameter relating to the furniture moving costs and to the installation cost according to the distance from the outlet (step S3603).
  • the computing device 230 then acquires the desired installation position (x u , y u ) where the user wishes to install the agent device (step S3604).
  • The method by which the computing device 230 acquires the desired installation position (x u , y u ) is arbitrary.
  • the computing device 230 refers to the agent device table 233 for the device name received in step S3601, and acquires information on the places suitable for the agent device and the devices and places to be avoided. Then, using a function F that lowers the score as the position approaches a device or place to be avoided and raises it as the position moves away, the arithmetic device 230 calculates the score score(x, y) obtained when the agent device is placed at the position (x, y), according to the above equation (6) under the constraint on the installation position shown in the above equation (8) (step S3605).
  • the arithmetic device 230 calculates the score score(x opt , y opt ) at the optimal installation position (x opt , y opt ) determined in step S3606 according to the following equation (9), and checks whether the score is equal to or greater than a predetermined threshold th (step S3607).
  • if the score score(x opt , y opt ) at the optimal installation position (x opt , y opt ) is equal to or greater than the predetermined threshold th (Yes in step S3607), the arithmetic device 230 superimposes an image of the agent device at the installation position (x opt , y opt ) determined in step S3606 on the captured image received in step S3601, combines it with an image of the furniture after movement, transmits the resulting image to the requesting companion device 220 (step S3608), and ends the process.
  • on the other hand, when the score score(x opt , y opt ) at the optimal installation position (x opt , y opt ) is less than the predetermined threshold th (No in step S3607), the arithmetic device 230 notifies the companion device 220 that the desired installation position (x u , y u ) of the user acquired in step S3604 is not a suitable location for the agent device (step S3609), and ends the process.
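A rough code rendering of the part of the FIG. 36 flow that handles the user's desired position is given below: the search is restricted to a neighborhood of (x u , y u ), and the best score is compared against the threshold th. The neighborhood radius, the grid step, and the helper names are assumptions; equation (8) presumably expresses the positional constraint and equation (9) the score at the optimum.

```python
def recommend_near_desired(desired_pos, score_fn, threshold, radius=50, step=10):
    """Search for the best position near the user's desired position (x_u, y_u).

    Returns (position, score) if the best score reaches the threshold,
    otherwise (None, score), meaning the desired area is unsuitable.
    """
    xu, yu = desired_pos
    best_pos, best_score = None, float("-inf")
    for dx in range(-radius, radius + 1, step):
        for dy in range(-radius, radius + 1, step):
            s = score_fn(xu + dx, yu + dy)
            if s > best_score:
                best_pos, best_score = (xu + dx, yu + dy), s

    if best_score >= threshold:
        return best_pos, best_score   # suitable: proceed to compose and send the image
    return None, best_score           # unsuitable: notify the companion device
```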
  • Recommendation of an installation location in consideration of feedback from the user: the calculation device 230 returns to the requesting companion device 220 an image in which the agent device is superimposed at the optimal installation position. On the companion device 220 side, the image on which the agent device is superimposed is presented to the user, and it is recommended that the agent device be placed at the optimal installation position. The user looks at the image on which the agent device is superimposed, or actually puts the agent device at that location, and decides whether to accept the recommended location. The companion device 220 may then feed back to the arithmetic device 230 whether or not the user has approved the recommended installation position.
  • FIG. 37 shows, in the form of a flowchart, a processing procedure for the arithmetic device 230 to recommend an optimal installation location of an agent device in consideration of feedback from a user.
  • the arithmetic device 230 receives, from the companion device 220, the captured image of the room where the agent device is installed and the device name of the target agent device (step S3701). Then, the arithmetic device 230 performs the same processing as in steps S3602 to S3608 of the flowchart shown in FIG. 36, for example, to determine the optimal installation position of the agent device, and superimposes the image of the agent device on that location. The image is transmitted to the companion device 220 (step S3702).
  • next, the arithmetic device 230 checks whether the user has approved the recommended installation position (step S3703).
  • if the user approves the recommended installation position (Yes in step S3704), the process ends.
  • otherwise, the computing device 230 acquires the desired installation position (x u , y u ) where the user wishes to install the agent device (step S3705).
  • the method by which the computing device 230 acquires the desired installation position (x u , y u ) is arbitrary (as described above).
  • the computing device 230 refers to the agent device table 233 for the device name received in step S3701, and acquires information on the places suitable for the agent device and the devices and places to be avoided. Then, using a function F that lowers the score as the position approaches a device or place to be avoided and raises it as the position moves away, the arithmetic device 230 calculates the score score(x, y) obtained when the agent device is placed at the position (x, y), according to the above equation (6) under the constraint on the installation position shown in the above equation (8) (step S3706).
  • the arithmetic device 230 calculates the score score(x opt , y opt ) at the optimal installation position (x opt , y opt ) determined in step S3606 according to the above equation (9), and checks whether the score is equal to or greater than the predetermined threshold th (step S3708).
  • if the score score(x opt , y opt ) at the optimal installation position (x opt , y opt ) is equal to or greater than the predetermined threshold th (Yes in step S3708), the arithmetic device 230 superimposes an image of the agent device at the installation position (x opt , y opt ) determined in step S3606 on the captured image received in step S3701, combines it with an image of the furniture after movement, transmits the resulting image to the requesting companion device 220 (step S3709), and ends the process.
  • on the other hand, when the score score(x opt , y opt ) at the optimal installation position (x opt , y opt ) is less than the predetermined threshold th (No in step S3708), the arithmetic device 230 notifies the companion device 220 that the desired installation position (x u , y u ) of the user acquired in step S3705 is not a suitable location for the agent device (step S3710), and ends the process.
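The overall interaction of FIG. 37 can be pictured as the retry loop sketched below, in which the companion device reports either approval or a new desired position after each recommendation. The callback interfaces and the bound on the number of rounds are assumptions.

```python
def recommend_with_feedback(initial_recommend, ask_user, recommend_near_desired, max_rounds=3):
    """Repeat recommendations until the user approves or no suitable place remains.

    initial_recommend() -> (position, image) for the unconstrained optimum.
    ask_user(image) -> (approved: bool, desired_pos or None), fed back by the companion device.
    recommend_near_desired(desired_pos) -> (position or None, image or None).
    """
    position, image = initial_recommend()
    for _ in range(max_rounds):
        approved, desired_pos = ask_user(image)
        if approved:
            return position              # user accepted the recommendation
        if desired_pos is None:
            break
        position, image = recommend_near_desired(desired_pos)
        if position is None:
            return None                  # desired area is not suitable
    return None
```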
  • as described above, the optimal position for installing the agent device 210 is presented simply by transmitting an image of the room photographed by the companion device 220 to the arithmetic device 230.
  • the companion device 220 is realized in the form of executing a companion application on an information terminal such as a smartphone or a tablet, for example.
  • the computing device 230 is a server on the cloud. Therefore, the user only has to perform the ordinary, simple operation of uploading an image captured with a smartphone to the cloud in order to be presented with the optimal installation position of the agent device 210, which is highly convenient.
  • when transmitting the captured image of the room to the computing device 230, the companion device 220 performs image processing such as blurring or mosaicing information that the user does not want to disclose, or performs encryption processing that replaces that information with other information. Because of this, the user can enjoy the service of recommending the location of the agent device 210 while protecting his or her privacy.
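One simple way such masking could be done on the companion device 220 side is sketched below with OpenCV: regions the user marks as private are pixelated before the room image is uploaded. The region format and the mosaic block size are illustrative assumptions.

```python
import cv2

def mosaic_regions(image, regions, block=16):
    """Pixelate user-selected regions (x, y, w, h) before uploading the room image."""
    out = image.copy()
    for (x, y, w, h) in regions:
        roi = out[y:y + h, x:x + w]
        # Shrink and re-enlarge the region to create a mosaic effect.
        small = cv2.resize(roi, (max(1, w // block), max(1, h // block)),
                           interpolation=cv2.INTER_LINEAR)
        out[y:y + h, x:x + w] = cv2.resize(small, (w, h), interpolation=cv2.INTER_NEAREST)
    return out
```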
  • in addition, the installation location of the agent device 210 can be recommended in consideration of moving the furniture in the room, and the location can be recommended anew when the layout of the room changes.
  • the technology disclosed in this specification can be applied not only to a voice agent but also to the installation of various devices in which an agent application resides, such as information home appliances and IoT devices.
  • the technology disclosed in the present specification may have the following configurations.
  • a receiving unit that receives first information including information about an environment in which an agent device is installed;
  • a processing unit configured to generate, based on the first information, second information that suggests an installation position of the agent device in the environment;
  • a transmitting unit that returns the second information;
  • An information processing apparatus comprising: (1-1) a receiving step of receiving first information including information on an environment in which an agent device is installed; A processing step of generating second information that proposes an installation position of the agent device in the environment based on the first information; A transmitting step of returning the second information;
  • An information processing method comprising: (2) the first information includes a captured image of a room where the agent device is installed, The second information includes an image in which the agent device is superimposed on a proposed installation position in the captured image, The information processing device according to (1).
  • the processing unit proposes an installation position of the agent device based on a result of object recognition of the captured image and a function or specification of the agent device.
  • the processing unit compares the result of object recognition of the captured image with a place where the agent device is suitable and a device or place to be avoided, and proposes an installation position of the agent device.
  • (4-1) The processing unit proposes, for the agent device equipped with a speaker playback function, an installation position that avoids a place where a television receiver or another speaker is located, The information processing device according to (4).
  • the processing unit proposes an installation position at the center of the room to the agent device equipped with a omnidirectional or wide-angle camera photographing function, The information processing device according to (4). (4-3) The processing unit proposes an installation position near a white flat wall with respect to the agent device equipped with a projector display function, The information processing device according to (4). (5) when there are a plurality of candidate positions for installing the agent device, the processing unit generates the second information in which a priority is assigned to each candidate position; The information processing device according to any one of (1) to (4). (6) The processing unit proposes an installation position of the agent device in consideration of movement of furniture in a room where the agent device is installed. The information processing device according to any one of (1) to (5).
  • the processing unit proposes an installation position of the agent device in consideration of movement of furniture in the room based on a movement cost for each furniture.
  • the processing unit generates the second information including a composite image of a room where the agent device is installed by moving furniture.
  • the processing unit proposes an installation position of the agent device in consideration of an installation cost based on a distance from an outlet, The information processing device according to any one of (1) to (9).
  • (10) a transmitting unit that transmits first information including information on an environment in which the agent device is installed; A receiving unit that receives second information regarding the installation position of the agent device proposed based on the first information; A presentation unit for presenting the second information;
  • An information processing apparatus comprising: (11) further including a photographing unit, The transmission unit transmits the first information including an image of the room in which the agent device is installed by the imaging unit, The receiving unit receives the second information including an image in which the agent device is superimposed on a suggested installation position in the captured image, The presentation unit displays the image in which the agent device is superimposed on a proposed installation position, The information processing device according to (10).
  • the second information includes a plurality of candidate positions for installing the agent device, and a priority assigned to each candidate position,
  • the presentation unit presents each priority of the plurality of candidate positions,
  • the information processing device according to any one of (10) and (11).
  • an information encryption unit for encrypting predetermined information included in the first information;
  • An information decryption unit that recovers based on the encrypted predetermined information included in the second information; Further comprising,
  • the information processing apparatus according to any one of (10) to (12).
  • the information encryption unit performs predetermined image processing on a human face area in an image included in the first information;
  • the information encryption unit replaces an image included in the first information with a generalized or generic image. The information processing device according to (13).
  • the information encryption unit replaces an image included in the first information with an illustration drawn by a user.
  • the information encryption unit replaces an image included in the first information with position information of each object reflected in the image.
  • the information processing device according to (13). (18) further comprising a sensor unit, The transmitting unit transmits the first information in response to the sensor unit detecting a change in the environment in which an agent device is installed,
  • the information processing apparatus according to any one of (10) to (17).
  • An information processing method comprising: (20) transmitting first information including information on an environment in which an agent device is to be installed, and receiving second information about an installation position of the agent device proposed based on the first information, A first device for presenting second information; A second device that receives the first information from the first device and returns the second information;
  • An information processing system comprising:

Abstract

Provided are an information processing device, information processing method, and information processing system, for assisting with use of an agent for conversing with a user. This information processing device comprises: a receiving part for receiving first information from a companion device, said first information including a photographic image of a room wherein an agent device is to be installed; a processing part for generating second information including an image whereon the agent device is superimposed in an installation position determined by comparing a result of an object recognition on the photographic image with a location suitable for the agent device and a device or location to be avoided by the agent device; and a transmission part for transmitting the second information to the companion device.

Description

情報処理装置及び情報処理方法、並びに情報処理システムInformation processing apparatus, information processing method, and information processing system
 本明細書で開示する技術は、ユーザと対話するエージェントの利用を支援する情報処理装置及び情報処理方法、並びに情報処理システムに関する。 The technology disclosed in this specification relates to an information processing apparatus, an information processing method, and an information processing system that support the use of an agent interacting with a user.
 最近、音声などを用いてユーザと対話を行いながら、用途や状況に応じて種々の情報をユーザに提示するエージェントが普及し始めている。例えば、照明やエアコンなどの家電機器のオンオフや調整操作を代行する他、天気予報や株・為替情報、ニュースについて聞かれると音声で回答したり、商品の注文を受け付けたり、購入した書籍の内容を読み上げたりするエージェントが知られている。 Recently, agents that present various information to users according to applications and situations while interacting with users using voices and the like have begun to spread. For example, besides turning on / off and adjusting operations of home appliances such as lighting and air conditioners, responding to voices when asked about weather forecasts, stock and exchange information, news, accepting product orders, and the contents of purchased books Agents that read aloud are known.
 エージェント機能は、一般に、家庭内などでユーザの周囲に設置されるエージェントデバイスと、クラウド上に構築されるエージェントサービスの連携により提供される。例えば、エージェントデバイスは、ユーザが発話する音声を受け付ける音声入力、並びにユーザからの問い合せに対して音声で回答する音声出力といったユーザインターフェースを主に提供する。一方のエージェントサービス側では、エージェントデバイスで入力された音声の認識や意味解析、ユーザの問い合わせに応じた情報検索などの処理、処理結果に基づく音声合成など、負荷の高い処理を実行する。また、エージェントの中には、音声入出力の他に、プロジェクタなどの表示機能を装備したものもある(例えば、特許文献1を参照のこと)。 The agent function is generally provided by cooperation between an agent device installed around a user in a home or the like and an agent service built on the cloud. For example, the agent device mainly provides a user interface such as a voice input for receiving a voice spoken by the user and a voice output for responding to an inquiry from the user by voice. On the other hand, the agent service performs high-load processing such as recognition and semantic analysis of voice input by the agent device, information retrieval in response to a user inquiry, and voice synthesis based on the processing result. Some agents have a display function such as a projector in addition to voice input / output (for example, refer to Patent Document 1).
WO2016/158792WO2016 / 158792
 本明細書で開示する技術の目的は、ユーザと対話するエージェントの利用を支援する情報処理装置及び情報処理方法、並びに情報処理システムを提供することにある。 The purpose of the technology disclosed in this specification is to provide an information processing apparatus, an information processing method, and an information processing system that support the use of an agent that interacts with a user.
 本明細書で開示する技術の第1の側面は、
 エージェントデバイスを設置する環境に関する情報を含む第1の情報を受信する受信部と、
 前記第1の情報に基づいて、前記環境における前記エージェントデバイスの設置位置を提案する第2の情報を生成する処理部と、
 前記第2の情報を返信する送信部と、
を具備する情報処理装置である。
A first aspect of the technology disclosed in the present specification is as follows.
A receiving unit that receives first information including information on an environment in which the agent device is installed;
A processing unit configured to generate, based on the first information, second information that suggests an installation position of the agent device in the environment;
A transmitting unit that returns the second information;
It is an information processing apparatus including:
 前記第1の情報は、前記エージェントデバイスを設置する部屋の撮影画像を含み、前記第2の情報は、前記撮影画像中の提案する設置位置に前記エージェントデバイスを重畳した画像を含む。そして、前記処理部は、前記撮影画像を物体認識した結果と、前記エージェントデバイスが適している場所並びに避けるべき機器又は場所とを比較して、前記エージェントデバイスの設置位置を提案する。 The first information includes a captured image of a room where the agent device is installed, and the second information includes an image in which the agent device is superimposed on a proposed installation position in the captured image. Then, the processing unit proposes an installation position of the agent device by comparing the result of object recognition of the captured image with a place where the agent device is suitable and a device or a place to be avoided.
 また、本明細書で開示する技術の第2の側面は、
 エージェントデバイスを設置する環境に関する情報を含む第1の情報を送信する送信部と、
 前記第1の情報に基づいて提案された前記エージェントデバイスの設置位置に関する第2の情報を受信する受信部と、
 前記第2の情報を提示する提示部と、
を具備する情報処理装置である。
A second aspect of the technology disclosed in the present specification is as follows.
A transmitting unit that transmits first information including information on an environment in which the agent device is installed;
A receiving unit that receives second information regarding the installation position of the agent device proposed based on the first information;
A presentation unit for presenting the second information;
It is an information processing apparatus including:
 第2の側面に係る情報処理装置は、撮影部をさらに含む。そして、前記送信部は、前記エージェントデバイスを設置する部屋を前記撮影部で撮影した画像を含む前記第1の情報を送信し、前記受信部は、前記撮影画像中の提案される設置位置に前記エージェントデバイスが重畳された画像を含む前記第2の情報を受信し、前記提示部は、提案される設置位置に前記エージェントデバイスが重畳された前記画像を表示する。 情報 処理 The information processing device according to the second aspect further includes a photographing unit. Then, the transmitting unit transmits the first information including an image of the room where the agent device is installed by the imaging unit, and the receiving unit transmits the first information to a suggested installation position in the captured image. The second information including the image on which the agent device is superimposed is received, and the presentation unit displays the image on which the agent device is superimposed at a suggested installation position.
 また、本明細書で開示する技術の第3の側面は、
 エージェントデバイスを設置する環境に関する情報を含む第1の情報を送信する送信ステップと、
 前記第1の情報に基づいて提案された前記エージェントデバイスの設置位置に関する第2の情報を受信する受信ステップと、
 前記第2の情報を提示する提示ステップと、
を有する情報処理方法である。
A third aspect of the technology disclosed in the present specification is as follows.
Transmitting first information including information on an environment in which the agent device is installed;
A receiving step of receiving second information on an installation position of the agent device proposed based on the first information;
A presentation step of presenting the second information;
An information processing method having the following.
 また、本明細書で開示する技術の第4の側面は、
 エージェントデバイスを設置する環境に関する情報を含む第1の情報を送信するとともに、前記第1の情報に基づいて提案された前記エージェントデバイスの設置位置に関する第2の情報を受信して、前記第2の情報を提示する第1のデバイスと、
 前記第1のデバイスから前記第1の情報を受信し、前記第2の情報を返信する第2のデバイスと、
を具備する情報処理システムである。
A fourth aspect of the technology disclosed in the present specification is as follows.
Transmitting first information including information on an environment in which the agent device is installed, and receiving second information on an installation position of the agent device proposed based on the first information, A first device for presenting information;
A second device that receives the first information from the first device and returns the second information;
An information processing system including:
 但し、ここで言う「システム」とは、複数の装置(又は特定の機能を実現する機能モジュール)が論理的に集合した物のことを言い、各装置や機能モジュールが単一の筐体内にあるか否かは特に問わない。 However, the term “system” as used herein refers to a logical collection of a plurality of devices (or functional modules that realize specific functions), and each device or functional module is in a single housing. It does not matter in particular.
 本明細書で開示する技術によれば、ユーザと対話するエージェントの部屋内での設置を支援する情報処理装置及び情報処理方法、並びに情報処理システムを提供することができる。 According to the technology disclosed in this specification, it is possible to provide an information processing apparatus, an information processing method, and an information processing system that support installation of an agent interacting with a user in a room.
 なお、本明細書に記載された効果は、あくまでも例示であり、本発明の効果はこれに限定されるものではない。また、本発明が、上記の効果以外に、さらに付加的な効果を奏する場合もある。 The effects described in the present specification are merely examples, and the effects of the present invention are not limited to these. In addition, the present invention may exhibit additional effects other than the above effects.
 本明細書で開示する技術のさらに他の目的、特徴や利点は、後述する実施形態や添付する図面に基づくより詳細な説明によって明らかになるであろう。 {Other objects, features, and advantages of the technology disclosed in this specification will become apparent from the following embodiments and the more detailed description based on the accompanying drawings.
図1は、エージェントの適用環境の一例を示した図である。FIG. 1 is a diagram illustrating an example of an application environment of an agent. 図2は、置き場所推薦システム200の構成例を示した図である。FIG. 2 is a diagram showing a configuration example of the placement place recommendation system 200. 図3は、置き場所推薦システム200の動作手順を説明するための図である。FIG. 3 is a diagram for explaining the operation procedure of the place recommendation system 200. 図4は、部屋内にエージェントデバイスを設置する候補位置が決められた様子を示した図である。FIG. 4 is a diagram showing a state where candidate positions for installing agent devices in a room are determined. 図5は、エージェントデバイスを設置する候補位置を撮影する方法を説明するための図である。FIG. 5 is a diagram for explaining a method of photographing a candidate position for installing an agent device. 図6は、エージェントデバイスを設置する候補位置を撮影する方法を説明するための図である。FIG. 6 is a diagram for explaining a method of photographing a candidate position for installing an agent device. 図7は、エージェントデバイステーブルの構成例を示した図である。FIG. 7 is a diagram illustrating a configuration example of the agent device table. 図8は、エージェントデバイスを置こうとしている一角を撮影した画像の一例を示した図である。FIG. 8 is a diagram illustrating an example of an image obtained by photographing a corner where an agent device is to be placed. 図9は、エージェントデバイスを設置する候補位置を撮影画像に重畳した様子を示した図である。FIG. 9 is a diagram showing a state where candidate positions for installing agent devices are superimposed on a captured image. 図10は、エージェントデバイスを設置する候補位置を撮影画像に重畳した様子を示した図である。FIG. 10 is a diagram showing a state where candidate positions for installing agent devices are superimposed on a captured image. 図11は、エージェントデバイスを設置する候補位置を撮影画像に重畳した様子を示した図である。FIG. 11 is a diagram illustrating a state in which candidate positions for installing agent devices are superimposed on a captured image. 図12は、コンパニオンデバイス220が部屋を撮影した画像の一例を示した図である。FIG. 12 is a diagram illustrating an example of an image of the room captured by the companion device 220. 図13は、図12に示した撮影画像をぼかしにより暗号化した例を示した図である。FIG. 13 is a diagram illustrating an example in which the captured image illustrated in FIG. 12 is encrypted by blurring. 図14は、図13に示した暗号化された撮影画像中の候補位置にエージェントデバイスを重畳した例を示した図である。FIG. 14 is a diagram illustrating an example in which an agent device is superimposed on a candidate position in the encrypted captured image illustrated in FIG. 図15は、図14に示した画像を復号した例を示した図である。FIG. 15 is a diagram illustrating an example in which the image illustrated in FIG. 14 is decoded. 図16は、コンパニオンデバイス220が部屋を撮影した画像の一例を示した図である。FIG. 16 is a diagram illustrating an example of an image of a room captured by the companion device 220. 図17は、図16に示した撮影画像を一般化又は汎用化により暗号化した例を示した図である。FIG. 17 is a diagram illustrating an example in which the captured image illustrated in FIG. 16 is encrypted by generalization or generalization. 図18は、図17に示した暗号化された撮影画像中の候補位置にエージェントデバイスを重畳した例を示した図である。FIG. 18 is a diagram illustrating an example in which an agent device is superimposed on a candidate position in the encrypted captured image illustrated in FIG. 図19は、図18に示した画像を復号した例を示した図である。FIG. 19 is a diagram illustrating an example in which the image illustrated in FIG. 18 is decoded. 図20は、家具情報テーブルの構成例を示した図である。FIG. 20 is a diagram illustrating a configuration example of the furniture information table. 図21は、エージェントデバイスの最適な設置位置に他の家具が置かれている部屋のレイアウトを例示した図である。FIG. 21 is a diagram exemplifying a layout of a room where other furniture is placed at an optimal installation position of the agent device. 図22は、エージェントデバイスの最適な設置位置に他の家具が置かれている部屋のレイアウトを例示した図である。FIG. 22 is a diagram exemplifying a layout of a room in which other furniture is placed at an optimal installation position of the agent device. 
図23は、エージェントデバイスの最適な設置位置に他の家具が置かれている部屋のレイアウトを例示した図である。FIG. 23 is a diagram exemplifying a layout of a room in which other furniture is placed at an optimal installation position of the agent device. 図24は、家具を移動させてエージェントデバイスを設置した合成画像を例示した図である。FIG. 24 is a diagram exemplifying a composite image in which furniture is moved and an agent device is installed. 図25は、エージェントデバイスの設置コストテーブルを示した図である。FIG. 25 is a diagram showing an installation cost table of the agent device. 図26は、部屋のレイアウト変更に応じたエージェントデバイスの動作例を示した図である。FIG. 26 is a diagram illustrating an operation example of an agent device according to a change in the layout of a room. 図27は、部屋内の家具のレイアウトの変更例を示した図である。FIG. 27 is a diagram illustrating an example of changing the layout of furniture in a room. 図28は、位置の微調整を行うエージェントデバイス210の機能的構成を模式的に示した図である。FIG. 28 is a diagram schematically illustrating a functional configuration of an agent device 210 that performs fine adjustment of a position. 図29は、エージェントデバイスの設置位置の改善する作業を促すメッセージをプロジェクタ画面に表示する様子を示した図である。FIG. 29 is a diagram showing a state in which a message urging a task of improving the installation position of the agent device is displayed on the projector screen. 図30は、エージェントデバイスの設置位置の改善する作業を促すメッセージをプロジェクタ画面に表示する様子を示した図である。FIG. 30 is a diagram illustrating a state in which a message for prompting a task for improving the installation position of the agent device is displayed on the projector screen. 図31は、エージェントデバイスの設置位置の改善する作業を促すメッセージをプロジェクタ画面に表示する様子を示した図である。FIG. 31 is a diagram illustrating a state in which a message urging a task of improving the installation position of the agent device is displayed on the projector screen. 図32は、部屋の間取りのイラストを描画するUI画面の構成例を示した図である。FIG. 32 is a diagram showing a configuration example of a UI screen for drawing an illustration of a room layout. 図33は、候補位置にエージェントデバイスのアイコンが重畳されたUI画面の構成例を示した図である。FIG. 33 is a diagram illustrating a configuration example of a UI screen in which an agent device icon is superimposed on a candidate position. 図34は、エージェントデバイスの最適な設置場所を薦めるための基本的な処理手順を示したフローチャートである。FIG. 34 is a flowchart showing a basic processing procedure for recommending an optimal installation location of an agent device. 図35は、家具の移動コストとコンセントからの距離に応じた設置コストを考慮して、エージェントデバイスの最適な設置場所を薦めるための処理手順を示したフローチャートである。FIG. 35 is a flowchart showing a processing procedure for recommending an optimal installation location of an agent device in consideration of a moving cost of furniture and an installation cost according to a distance from an outlet. 図36は、ユーザの設置希望位置を考慮して、エージェントデバイスの最適な設置場所を薦めるための処理手順を示したフローチャートである。FIG. 36 is a flowchart showing a processing procedure for recommending an optimal installation location of an agent device in consideration of a user's desired installation location. 図37は、ユーザからのフィードバックを考慮して、エージェントデバイスの最適な設置場所を薦めるための処理手順を示したフローチャートである。FIG. 37 is a flowchart showing a processing procedure for recommending an optimal installation location of an agent device in consideration of feedback from a user.
 以下、図面を参照しながら本明細書で開示する技術の実施形態について詳細に説明する。 Hereinafter, embodiments of the technology disclosed in this specification will be described in detail with reference to the drawings.
 ユーザとの音声対話機能を備えたエージェントは、「スマートスピーカ」、「AIスピーカ」、「AIアシスタント」などとも呼ばれるが、本明細書では、単に「エージェント」と呼ぶことにする。 エ ー ジ ェ ン ト An agent having a voice interaction function with a user is also called a “smart speaker”, an “AI speaker”, an “AI assistant”, or the like, but in this specification, is simply called an “agent”.
 図1には、エージェントの適用環境の一例として、リビングルーム1を示している。サイドボード13上には、テレビ受像機11と、エージェントデバイス12が設置されている。また、リビングルーム1内には、テレビ受像機11と対面するようにソファ14が設置され、ソファ14の前方にはソファテーブル15が備えられている。 FIG. 1 shows a living room 1 as an example of an application environment of an agent. On the sideboard 13, a television receiver 11 and an agent device 12 are installed. In the living room 1, a sofa 14 is provided so as to face the television receiver 11, and a sofa table 15 is provided in front of the sofa 14.
 図1に示す例では、親子3人がソファ14に座っている。3人は、テレビ受像機11に表示されているテレビ番組の視聴者であるとともに、エージェントデバイス12のユーザであり、エージェントデバイス12に対して問い合わせして、返答を待つ。もちろん、エージェントデバイス12がいずれかのユーザに対して自律的に話しかけるシーンも想定される。エージェントデバイス12は、主に音声UI(User Interface)を利用して、照明やエアコンなどの家電機器のオンオフや調整操作を代行する他、天気予報や株・為替情報、ニュースについて聞かれると音声で回答したり、商品の注文を受け付けたり、購入した書籍の内容を読み上げたりするといった、さまざまなサービスをユーザに提供する。また、エージェントデバイス12がプロジェクタなどの表示機能を装備する場合には、情報表示をベースとしたサービスも併せて提供することができる。 In the example shown in FIG. 1, three parents and children are sitting on the sofa 14. The three are the viewers of the television program displayed on the television receiver 11 and the users of the agent device 12, inquire about the agent device 12, and wait for a reply. Of course, a scene in which the agent device 12 speaks autonomously to any user is also assumed. The agent device 12 mainly uses a voice UI (User @ Interface) to perform on / off and adjustment operations of home appliances such as lighting and air conditioners, and, when asked about weather forecasts, stock / exchange information, and news, sounds. Various services are provided to the user, such as answering, receiving an order for a product, and reading out the content of a purchased book. When the agent device 12 has a display function such as a projector, a service based on information display can also be provided.
 テレビ受像機11から発される音声の音波は、ソファ14に座っている3人に直接波として届く他、壁面や天井、床面からの反射波としても届く。エージェントデバイス12から発される音声も同様に、ソファ14に座っている3人に直接波として届く他、壁面や天井、床面からの反射波としても届く。 (4) The sound wave of the sound emitted from the television receiver 11 reaches not only three persons sitting on the sofa 14 as direct waves, but also as reflected waves from a wall surface, a ceiling, and a floor surface. Similarly, the sound emitted from the agent device 12 reaches not only three persons sitting on the sofa 14 as direct waves, but also as reflected waves from a wall surface, a ceiling, and a floor surface.
 なお、本明細書で開示する技術の適用環境は、図1に示したような一般家庭内で家族が共有するリビングルームには限定されない。書斎や寝室といった特定のユーザが使用する個室など、エージェントデバイスが設置されるさまざまな部屋にも本明細書で開示する技術を適用可能である。また、家庭内だけでなく、企業のオフィスにも、本明細書で開示する技術を適用することができる。 適用 Note that the application environment of the technology disclosed in this specification is not limited to a living room shared by family members in a general home as shown in FIG. The technology disclosed in the present specification can be applied to various rooms in which an agent device is installed, such as a private room used by a specific user such as a study or a bedroom. Further, the technology disclosed in this specification can be applied not only to homes but also to corporate offices.
 ユーザがエージェントデバイス12から提供されるサービスを享受するには、テレビ受像機11を始め、同じ部屋内に設置された他の音響装置との位置関係を十分に考慮する必要がある。 In order for the user to enjoy the service provided by the agent device 12, it is necessary to sufficiently consider the positional relationship between the television receiver 11 and other audio devices installed in the same room.
 図1に示した例のように、サイドボード13上にテレビ受像機11と並んでエージェントデバイス12が設置されていると、互いに出力する音声が重なり合ってユーザの耳に届くので、ユーザが聴き取り難い問う問題や、エージェントデバイス12が、テレビ受像機11が出力する音声をユーザからの音声コマンドと聞き違えるといった問題が発生し易い。エージェントデバイス12の音声入力や音声出力を効果的に行える位置がどこか、ユーザが分かり難い場合もある。 When the agent device 12 is installed on the sideboard 13 alongside the television receiver 11 as in the example shown in FIG. 1, the sounds output from each other overlap and reach the user's ear, so that the user Difficult questions and a problem that the agent device 12 mistakes the voice output from the television receiver 11 for a voice command from the user easily occur. In some cases, it is difficult for the user to understand the position where the voice input and voice output of the agent device 12 can be effectively performed.
 また、エージェントデバイス12がプロジェクタ表示機能を装備する場合、図1に示した例のように壁から離間して設置されていると、画面をうまく投射できないことがある。付言すると、エージェントデバイス12と対向する壁に柄模様のある壁紙が貼られていたり凹凸があったりすると、投射した映像と柄模様が混ざり合い、又は投射した画像が凸凹になり、見え難いという問題がある。 In addition, when the agent device 12 has a projector display function, if the agent device 12 is installed away from a wall as in the example shown in FIG. 1, the screen may not be projected properly. In addition, if wallpaper with a pattern is stuck on the wall facing the agent device 12 or if there is unevenness, the projected image and the pattern are mixed, or the projected image becomes uneven, making it difficult to see. There is.
 要するに、部屋内には、エージェントデバイス12が持つ性能を十分に発揮するのに適した設置位置と不適切な設置位置がある。しかしながら、ユーザが適切な設置位置を必ずしも正確に判断できる訳ではなく、設計者が意図しない場所にエージェントデバイス12が設置されて、ユーザがエージェントデバイス12によって提供されるサービスを十分に享受できなくなることが懸念される。エージェントデバイス12を適切な場所に設置するには技術的な専門知識が必要な場合もあり、一般ユーザでは判断がつかないこともある。 In short, in the room, there are an installation position suitable for sufficiently exhibiting the performance of the agent device 12 and an inappropriate installation position. However, it is not always possible for the user to accurately determine an appropriate installation position, and the agent device 12 is installed in a place not intended by the designer, and the user cannot fully enjoy the service provided by the agent device 12. Is concerned. Technical expertise may be required to install the agent device 12 at an appropriate location, and a general user may not be able to determine.
 そこで、本明細書では、ユーザがエージェントデバイス12を部屋内に設置する作業を支援する技術について、以下で提案する。当該技術を利用すれば、ユーザは、部屋内で、エージェントデバイス12の性能を十分に発揮できる位置にエージェントデバイス12を容易に設置することができるようになるので、エージェントデバイス12は本来の性能を十分に発揮することができ、ユーザはエージェントデバイス12によって提供されるサービスを享受できるようになる。 Therefore, in the present specification, the following technology for supporting a user to install the agent device 12 in a room is proposed. If this technology is used, the user can easily install the agent device 12 in a room at a position where the performance of the agent device 12 can be sufficiently exhibited. The agent device 12 can fully enjoy the service, and the user can enjoy the service provided by the agent device 12.
A.システム構成
 図2には、部屋の中でエージェントデバイスの置き場所をユーザに推薦する置き場所推薦システム200の構成例を示している。図示の置き場所推薦システム200は、部屋の中に設置するエージェントデバイス210と、ユーザを連れ立つコンパニオンデバイス220と、部屋の中でエージェントデバイス210を設置する場所の候補を探索する処理を実施する演算デバイス230で構成される。
A. System Configuration FIG. 2 shows a configuration example of a storage location recommendation system 200 that recommends a storage location of an agent device in a room to a user. The illustrated place recommendation system 200 performs an operation of searching for an agent device 210 to be installed in a room, a companion device 220 to take the user, and a candidate for a place to install the agent device 210 in the room. It is composed of a device 230.
 エージェントデバイス210は、「スマートスピーカ」、「AIスピーカ」、「AIアシスタント」などとも呼ばれる、一般的な音声エージェントとほぼ同じハードウェア構成を装備することを想定している。但し、エージェントデバイス210は、専用のデバイスではなく、情報家電やIoT(Internet of Things)デバイスなどの機器に常駐するエージェント用のアプリケーションであってもよい。 The agent device 210 is assumed to be equipped with almost the same hardware configuration as a general voice agent, also called “smart speaker”, “AI speaker”, “AI assistant”, and the like. However, the agent device 210 is not a dedicated device, but may be an application for an agent resident in a device such as an information home appliance or an IoT (Internet of Things) device.
The agent device 210 shown in FIG. 2 includes a control unit 211 composed of a CPU (Central Processing Unit) and memory, an expression unit 212 that presents information processed by the control unit 211 to the outside, a sensor unit 213 that detects information about the surroundings of the place where the agent device 210 is installed, and a communication unit 214 that communicates with external devices.
The expression unit 212 basically includes a speaker. The speaker may be a stereo speaker or a multi-channel speaker. Some of the speakers may be externally connected to the main body of the agent device 210. The expression unit 212 may also include a projector.
The configuration of the sensor unit 213 (that is, what kinds of sensor elements it includes) is arbitrary. For example, the sensor unit 213 may include a microphone, a camera, an object detection sensor, and a depth sensor. The camera may be, for example, a camera with a 90-degree angle of view, an omnidirectional camera with a 360-degree angle of view, a stereo camera, or a multi-lens camera. The layout of furniture installed in the room can be detected based on the detection results of the camera, the object detection sensor, and the depth sensor. The sensor unit 213 may also include environment sensors that detect the environment in the room, such as an illuminance sensor, a temperature sensor, and a humidity sensor. It may further include an infrared sensor or a human presence sensor, as well as biometric sensors that detect the user's pulse, perspiration, brain waves, myoelectric potential, exhalation, and the like. Some or all of the sensors constituting the sensor unit 213 may be externally connected or wirelessly connected to the agent device 210.
The communication unit 214 interconnects with devices external to the agent device 210 using wired communication such as Ethernet (registered trademark) or wireless communication such as Wi-Fi (registered trademark). In this embodiment, the companion device 220 is assumed as the external device of the agent device 210; for example, environment information sensed by the sensor unit 213 and information such as the device name of the agent device 210 are transmitted to the companion device 220 via the communication unit 214.
Note that while FIG. 2 mainly depicts the components necessary for placement recommendation, the agent device 210 is assumed to include components other than those illustrated. The communication unit 214 may also connect to an external network such as the Internet via an access point or a router, although this is not shown.
The agent device 210 may be battery-driven, but given the need to respond to user inquiries 24 hours a day, it is assumed to be powered from a commercial power supply via a power cable (not shown).
The companion device 220 is a device that accompanies the user when recommending a placement location for the agent device in the room. The companion device 220 is realized, for example, by executing a predetermined program (provisionally also referred to herein as the "companion" application) on an information terminal carried by the user, such as a smartphone or tablet.
The companion device 220 shown in FIG. 2 includes a control unit 221 composed of a CPU and memory, an expression unit 222 that presents information processed by the control unit 221 to the outside, an imaging unit 223, a communication unit 224, an information encryption unit 225, and an information decryption unit 226.
The control unit 221 executes the companion application and, in cooperation with the computing device 230, performs processing for recommending a placement location for the agent device 210 in the room to the user.
The expression unit 222 is, for example, the screen or speaker of a smartphone. The imaging unit 223 is, for example, a camera mounted on an information terminal such as a smartphone, and is used mainly to photograph the scene of the room where the agent device 210 is to be placed.
The communication unit 224 interconnects with devices external to the companion device 220 using wired communication such as Ethernet (registered trademark) or wireless communication such as Wi-Fi (registered trademark). Of course, the communication unit 224 may also support wireless communication schemes of mobile communication systems such as LTE (Long Term Evolution) or LTE-Advanced.
In this embodiment, the agent device 210 and the computing device 230 are assumed as the external devices to which the companion device 220 connects using the communication unit 224. From the agent device 210, information such as environment information sensed by the sensor unit 213 and the device name of the agent device 210 is received via the communication unit 224. The information received from the agent device 210 is transmitted to the computing device 230 via the communication unit 224, and information on candidate placement locations for the agent device 210 in the room is received from the computing device 230 via the communication unit 224. The information on the candidate placement locations for the agent device 210 is presented to the user using, for example, the expression unit 222.
The information exchanged with external devices via the communication unit 224 includes privacy-related information that the user does not want to disclose to the outside. Therefore, when transmitting information to an external device, the companion device 220 uses the information encryption unit 225 to perform encryption processing that converts the portions of the transmitted information the user does not want to disclose into a format the external device cannot decipher. When information that has undergone this encryption processing is returned from the external device, the companion device 220 uses the information decryption unit 226 to perform decryption processing that restores the encrypted information to its original state. Since the companion device 220 and the agent device 210 are used within the home or by the same user, the agent device 210 is treated as an external device that does not require information encryption. On the other hand, the computing device 230 is assumed to be implemented in the cloud (described later) and therefore corresponds to an external device that requires information encryption. Details of the encryption and decryption processing are given later.
The computing device 230 is a device that, in cooperation with the companion device 220, provides a "placement recommendation service" that recommends a placement location for the agent device 210 in the room to the user. In this embodiment, the computing device 230 is assumed to be, for example, a software module executed in the cloud that provides a service for recommending placement locations for a large number of agent devices installed in individual homes and the like. In this specification, the term "cloud" generally refers to cloud computing. The cloud provides computing services via a network such as the Internet.
The computing device 230 shown in FIG. 2 includes a control unit 231 composed of a CPU and memory, a communication unit 232, an agent device table 233, and a furniture information table 234.
The control unit 231 executes, in cooperation with the companion device 220, an application program for recommending a placement location for the agent device 210 in the room to the user. The control unit 231 obtains the information necessary for executing this application from the agent device table 233 and the furniture information table 234 as appropriate.
The communication unit 232 interconnects with devices external to the computing device 230 using wired communication such as Ethernet (registered trademark) or wireless communication such as Wi-Fi (registered trademark). In this embodiment, the companion device 220 is assumed as the external device of the computing device 230. From the companion device 220, information such as environment information of the agent device 210 (for example, images captured of the room) and the device name is received via the communication unit 232. The control unit 231 then calculates information on candidate placement locations for the agent device 210 based on the received information and transmits the calculation result to the companion device 220 via the communication unit 232.
The agent device table 233 compiles, for each agent device name, the device's capabilities (functions), device specifications, locations suited to installation, and devices or locations to be avoided; details are given later. The furniture information table 234 compiles the moving cost for each furniture name; details are given later. The moving cost is a numerical expression of how difficult a piece of furniture is to move (or the burden on the user of moving it), based on its weight, volume, installation conditions, and so on.
Although only one companion device 220 is depicted in the system configuration example shown in FIG. 2, in practice the computing device 230, which is a cloud server, is assumed to simultaneously provide the service of recommending placement locations for the agent device 210 to each of a large number of companion devices 220.
Note that the computing device 230 need not be in the cloud; it may be a PC (Personal Computer) installed in the same room as the agent device 210 and the companion device 220, or another application executed in parallel on the information terminal that runs the companion application (in the latter case, the companion device 220 and the computing device 230 are built simultaneously on the same hardware device).
B. Basic Operation
Next, the operation procedure of the placement recommendation system 200 is described with reference to FIG. 3.
The processing for recommending a placement location for the agent device 210 to the user is performed, for example, at the time of the initial setup of the agent device 210. Alternatively, it may also be executed when the user wants to change the placement of the agent device 210 or when the layout of the room has been changed.
First, the user launches the companion application on the companion device 220 (step 1). The user then uses the imaging unit 223 to photograph the corner of the room where the agent device 210 is to be placed (step 2).
For example, in a room 400 laid out as shown in FIG. 4, assume that the three locations indicated by reference numbers 401 to 403, on a sideboard, a counter, and a desk, are the candidate positions where the user is considering placing the agent device 210. In such a case, the user photographs each of the candidate positions 401 to 403 facing toward the position from within the room (see FIG. 5), and also photographs with his or her back to each of the candidate positions 401 to 403 (see FIG. 6). Photographing toward a candidate position, as in FIG. 5, is done to confirm the suitability of the projection surface for a projector (provided the agent device is equipped with a projector display function). Photographing with one's back to a candidate position, as in FIG. 6, is done to confirm the suitability of input devices such as a microphone or camera, or of the acoustic environment for a speaker.
Note that FIGS. 5 and 6 are merely examples of how candidate positions for installing the agent device can be photographed, and the method is not limited to these. Other possible methods include photographing the entire room as a zoomed-out wide shot, taking a panoramic shot from the center of the room, and photographing the front, right, rear, and left directions at 90-degree intervals (from the center of the room). Photographing a candidate position from above (bird's-eye view) or from below (looking up) can also be mentioned.
The companion device 220 transmits to the computing device 230 data including the images photographed of the candidate positions for installing the agent device 210 and information on the device name of the agent device 210 (step 3). The data transmission to the computing device 230 is performed, for example, in the form of uploading data from a smartphone to the cloud.
Since the information transmitted from the companion device 220 to the computing device 230 includes information that the user does not want to disclose to the outside, it is encrypted by the information encryption unit 225 before being transmitted to the computing device 230. Details of the encryption processing are given later.
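As a concrete illustration of step 3, the following is a minimal sketch of what the upload from the companion device 220 to the computing device 230 could look like, assuming a simple JSON-over-HTTP interface; the endpoint URL, field names, and the `requests` dependency are illustrative assumptions and are not specified in this description. The images are assumed to have already passed through the information encryption unit 225.

```python
import base64
import requests  # assumed HTTP client; the actual transport is not specified


def upload_candidate_images(device_name, image_paths,
                            endpoint="https://example.com/placement/evaluate"):
    """Send the agent device name and candidate-position photos (step 3).

    The endpoint and field names are hypothetical; the photos are assumed
    to have already been processed by the information encryption unit 225.
    """
    payload = {
        "agent_device_name": device_name,  # e.g. "XXXX", "YYYY", "ZZZZ"
        "images": [
            base64.b64encode(open(p, "rb").read()).decode("ascii")
            for p in image_paths
        ],
    }
    resp = requests.post(endpoint, json=payload, timeout=30)
    resp.raise_for_status()
    return resp.json()  # expected to contain prioritized candidate positions
```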
The computing device 230 evaluates the candidate positions for installing the agent device 210 (step 4). Specifically, the computing device 230 refers to the agent device table 233 and obtains information such as the locations suited to the functions and specifications of the agent device 210 being queried. The computing device 230 also performs object recognition within the room based on the transmitted captured images. During object recognition, it detects locations where devices such as a television receiver that would conflict with the voice input/output functions of the agent device 210 are present. If the target agent device 210 has a projector display function, it also detects white walls and judges their unevenness, identifies windows that would obstruct a projected image, detects brightness, and so on. The computing device 230 then determines whether each candidate position in the room is a location suited to the functions and specifications of the agent device, and whether devices to be avoided are nearby or the position itself is a location to be avoided. Furthermore, the computing device 230 quantifies the degree to which each candidate position suits the agent device 210 and assigns priorities.
If the computing device 230 finds that none of the candidate positions sent from the companion device 220 is a location suitable for installing the agent device 210, it refers to the furniture information table 234 and devises a plan to move the furniture that is an obstacle at each candidate position, or devises other candidate positions within the room (step 5). For example, if a vase placed on the center table gets in the way of the user viewing the image projected by the projector, moving the vase is proposed.
The computing device 230 then prioritizes each of the candidate positions sent from the companion device 220, the furniture movement plans, and the candidate positions it has devised itself, creates proposal data for the installation position of the agent device 210 (step 6), and transmits it to the companion device 220 (step 7).
The companion device 220 outputs the proposal data for the installation position of the agent device 210 received from the computing device 230 to the expression unit 222 and presents it to the user (step 8).
The user then changes the installation position of the agent device 210 in the room based on the proposal data presented on the companion device 220, and also moves furniture as necessary (step 9).
After installing the agent device 210 at the desired position, the user may further repeat photographing the agent device 210 and uploading the captured images to the computing device 230 to ask the computing device 230 to verify the actual installation location. When photographing, an image that the agent device 210 projects onto the wall using its projector display function may also be captured. The computing device 230 then verifies whether the current installation position of the agent device 210 is appropriate. For example, the computing device 230 re-creates proposal data for making slight position adjustments so that the projection size of the projector becomes as desired (or so that keystone distortion is corrected), and returns it to the companion device 220. The companion device 220 presents the newly received proposal data to the user, who looks at it and adjusts the position of the agent device 210.
C. Specific Operation Examples
Next, specific operation examples of the placement recommendation system 200 are described.
The computing device 230 evaluates each candidate position in light of the functions, device specifications, and so on of the agent device. For example, for an agent device not equipped with a projector display function, candidate positions can be evaluated without considering the presence of a wall onto which video can be projected, whereas for an agent device equipped with a projector display function, candidate positions must be evaluated with priority given to the presence of such a wall.
Therefore, in order to evaluate candidate positions for installing an agent device, the computing device 230 uses the agent device table 233, which compiles, for each agent device name, the device's capabilities (functions), device specifications, locations suited to installation, and devices or locations to be avoided. FIG. 7 shows a configuration example of the agent device table 233.
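As one possible in-memory rendering of the agent device table 233, the sketch below uses a Python dictionary keyed by device name. The concrete entries merely restate the examples for "XXXX", "YYYY", and "ZZZZ" that appear later in this section; they are not the actual contents of FIG. 7, and the key names are illustrative.

```python
# A hypothetical, simplified rendering of the agent device table 233.
AGENT_DEVICE_TABLE = {
    "XXXX": {
        "capabilities": ["voice_operation", "speaker_playback"],
        "spec": {"microphones": 2, "speakers": 1},
        "suitable_places": ["room_center"],
        "avoid": ["tv", "speaker", "kitchen"],
    },
    "YYYY": {
        "capabilities": ["voice_operation", "camera_capture", "speaker_playback"],
        "spec": {"microphones": 3, "speakers": 1, "camera_fov_deg": 360},
        "suitable_places": ["room_center"],
        "avoid": ["tv", "speaker", "kitchen", "window"],
    },
    "ZZZZ": {
        "capabilities": ["voice_operation", "projector_display", "speaker_playback"],
        "spec": {"microphones": 3, "speakers": 1, "projector": True},
        "suitable_places": ["near_flat_white_wall"],
        "avoid": ["tv", "speaker", "kitchen", "window"],
    },
}


def lookup_agent_device(device_name):
    """Return the table entry used to evaluate candidate positions."""
    return AGENT_DEVICE_TABLE[device_name]
```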
From the companion device 220, an image photographed of the corner where the agent device is to be placed and the device name of the agent device to be placed are sent to the computing device 230. Here, an operation example is described for the case where the captured image shown in FIG. 8 and the agent device name "XXXX" are sent from the companion device 220 to the computing device 230.
The computing device 230 performs object recognition on the captured image sent from the companion device 220 and grasps the environment of the room where the agent device is to be installed, such as the layout of the furniture and the color of the walls. In the case of the image shown in FIG. 8, it can grasp through object recognition that a dining table is placed on the near side of a room enclosed by white walls, and that in the living area a sideboard and a television receiver are installed along the wall, a sofa is placed facing the television screen, a rug is laid in front of the sofa with a center table on it, and a side table is placed to the right of the sofa.
The computing device 230 also queries the agent device table 233 for the agent device name "XXXX" and learns that this device has two microphones and one speaker, has voice operation and speaker playback functions, that the center of the room is a suitable location for it, and that television receivers, speakers, and the kitchen are devices or locations to be avoided.
The computing device 230 then compares the object recognition results of the captured image with the information obtained from the agent device table 233 on locations suited to the agent device and devices or locations to be avoided, and calculates one or more candidate positions with priorities. Specifically, based on the information on locations suited to the agent device and devices or locations to be avoided, the computing device 230 calculates a score that quantifies the degree to which each candidate position suits the agent device, and assigns priorities in descending order of score. Details of how the score is calculated for each candidate position are given later.
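The exact score calculation is deferred in this description, but a minimal sketch of the idea, under the assumption that suitability to the device's preferred locations is rewarded and proximity to devices or places to be avoided is penalized, could look like the following. The weights, distance features, and helper data structures are illustrative and reuse the hypothetical table entries sketched earlier.

```python
from dataclasses import dataclass, field


@dataclass
class Candidate:
    name: str                      # e.g. "side table right of sofa"
    distance_to_center: float      # metres, estimated from object recognition
    distance_to_wall: float
    nearby_objects: list = field(default_factory=list)  # recognized labels


def score_candidate(c, entry, w_suit=1.0, w_avoid=2.0):
    """Quantify how well one candidate position suits the agent device.

    `entry` is a record from the agent device table 233 (see the earlier sketch).
    Higher scores are better; priorities are assigned in descending score order.
    """
    score = 0.0
    if "room_center" in entry["suitable_places"]:
        score += w_suit / (1.0 + c.distance_to_center)
    if "near_flat_white_wall" in entry["suitable_places"]:
        score += w_suit / (1.0 + c.distance_to_wall)
    # Penalize every nearby object that the table says should be avoided.
    score -= w_avoid * sum(1 for obj in c.nearby_objects if obj in entry["avoid"])
    return score


def rank_candidates(candidates, entry):
    return sorted(candidates, key=lambda c: score_candidate(c, entry), reverse=True)
```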
For example, suppose the computing device 230 finds that the score is highest on the side table to the right of the sofa, followed by the dining table. The computing device 230 creates proposal data indicating the candidate positions with high scores and returns it to the companion device 220.
The companion device 220 outputs the proposal data for the installation position of the agent device 210 received from the computing device 230 to the expression unit 222 and presents it to the user. In this embodiment, the companion device 220 superimposes agent device icons on the high-scoring candidate positions in the image photographed of the corner where the agent device is to be placed and displays this on the screen. If the scores were highest on the side table to the right of the sofa and then on the dining table, agent device icons 901 and 902 are superimposed on the side table to the right of the sofa and on the dining table, respectively, as shown in FIG. 9. Through a display screen like that of FIG. 9, the user can easily grasp where in the room the agent device is recommended to be placed, and can easily picture what the room would look like with the agent device installed at a candidate position. A numerical value indicating the priority may also be displayed near each of the agent device icons 901 and 902.
Note that the process of superimposing the agent device icons on the captured image may be performed by the computing device 230, or may be performed on the companion device 220 side based on information from the computing device 230. If the superimposed icon image corresponds to the actual device to be placed (that is, the appearance of the agent device with device name "XXXX"), the user can more easily imagine the agent device installed in the room.
Next, an operation example is described for the case where the captured image shown in FIG. 8 and the agent device name "YYYY" are sent from the companion device 220 to the computing device 230.
The computing device 230 performs object recognition on the captured image sent from the companion device 220 and can grasp the environment of the room where the agent device is to be installed, such as the layout of the furniture and the color of the walls (same as above).
The computing device 230 also queries the agent device table 233 for the agent device name "YYYY" and learns that this device has three microphones, a camera with a 360-degree angle of view, and one speaker, has voice operation, camera shooting, and speaker playback functions, that the center of the room is a suitable location because of the 360-degree (omnidirectional) shooting, and that television receivers, speakers, the kitchen, and window areas are devices or locations to be avoided.
The computing device 230 then calculates, based on the information on locations suited to the agent device and devices or locations to be avoided, a score that quantifies the degree to which each candidate position suits the agent device, and assigns priorities in descending order of score. Details of the score calculation method are given later. Locations near windows, which tend to produce oblique lighting when the user approaches the camera, are avoided. For example, suppose the computing device 230 finds the highest score at a location on the dining table toward the center of the room. The computing device 230 creates proposal data indicating the high-scoring candidate position and returns it to the companion device 220.
The companion device 220 outputs the proposal data for the installation position of the agent device 210 received from the computing device 230 to the expression unit 222 and presents it to the user. FIG. 10 shows a display example of a screen in which an agent device icon 1001 is superimposed on the high-scoring candidate position on the dining table toward the center of the room. The icon image preferably corresponds to the actual device to be placed (that is, the appearance of the agent device with device name "YYYY"). Through a display screen like that of FIG. 10, the user can easily grasp where in the room the agent device is recommended to be placed, and can easily picture what the room would look like with the agent device installed at the candidate position.
Next, an operation example is described for the case where the captured image shown in FIG. 8 and the agent device name "ZZZZ" are sent from the companion device 220 to the computing device 230.
The computing device 230 performs object recognition on the captured image sent from the companion device 220 and can grasp the environment of the room where the agent device is to be installed, such as the layout of the furniture and the color of the walls (same as above).
The computing device 230 also queries the agent device table 233 for the agent device name "ZZZZ" and learns that this device has three microphones, one speaker, and a projector, has voice operation, projector display, and speaker playback functions, that a location near a white, flat wall is suitable for projecting video, and that television receivers, speakers, the kitchen, and window areas are devices or locations to be avoided.
The computing device 230 then calculates, based on the information on locations suited to the agent device and devices or locations to be avoided, a score that quantifies the degree to which each candidate position suits the agent device, and assigns priorities in descending order of score. Details of the score calculation method are given later. Windows onto which video cannot be projected, and places where obstacles covering the wall, such as a television receiver, are installed, are avoided. Wall areas brightened by other light sources, such as near a window or a lighting fixture, are also avoided. For example, suppose the computing device 230 finds the highest score at a location on the dining table toward the wall of the room. The computing device 230 creates proposal data indicating the high-scoring candidate position and returns it to the companion device 220.
The companion device 220 outputs the proposal data for the installation position of the agent device 210 received from the computing device 230 to the expression unit 222 and presents it to the user. FIG. 11 shows a display example of a screen in which an agent device icon 1101 is superimposed on the high-scoring candidate position on the dining table toward the wall of the room. The icon image preferably corresponds to the actual device to be placed (that is, the appearance of the agent device with device name "ZZZZ"). Through a display screen like that of FIG. 11, the user can easily grasp where in the room the agent device is recommended to be placed, and can easily picture what the room would look like with the agent device installed at the candidate position.
D. Operation Example Considering Privacy Protection
Next, privacy protection in the placement recommendation system 200 is described.
As described above, in order to receive from the computing device 230 the information service concerning candidate positions for installing the agent device, the companion device 220 must transmit to the outside, together with the agent device name, an image photographed of the corner where the agent device is to be placed. However, the captured image may contain privacy-related information that the user does not want to disclose to the outside.
As photographing methods, photographing toward a candidate position, photographing with one's back to a candidate position, photographing the entire room as a wide shot, and taking a panoramic shot from the center of the room (or photographing the front, right, rear, and left directions at 90-degree intervals) were mentioned above. The wider the camera's field of view, the greater the risk that privacy-related information is captured in the image.
Therefore, when transmitting a captured image to the computing device 230, the companion device 220 uses the information encryption unit 225 to perform encryption processing that converts the portions of information in the image the user does not want to disclose into a format that cannot be deciphered. When an image that has undergone this encryption processing is returned from the computing device 230, the companion device 220 uses the information decryption unit 226 to perform decryption processing that restores the encrypted portions of the information to their original state.
For example, suppose that the image photographed of the corner where the agent device is to be placed is one in which several family members appear, as shown in FIG. 12. In such a case, the companion device 220 blurs the face regions of the family members as shown in FIG. 13, performs encryption processing in the information encryption unit 225 so that individuals cannot be identified, and then transmits the image to the computing device 230. Note that the information encryption unit 225 may use image processing other than blurring, such as applying a mosaic.
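A minimal sketch of this blurring step in the information encryption unit 225 is shown below, assuming OpenCV and its bundled Haar cascade face detector are available; the description does not prescribe any particular detector or blur filter, so these are assumptions.

```python
import cv2  # assumed dependency; any face detector / blur filter would do


def blur_faces(image_path, out_path):
    """Blur detected face regions so individuals cannot be identified (cf. FIG. 13)."""
    img = cv2.imread(image_path)
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in cascade.detectMultiScale(gray, 1.1, 5):
        roi = img[y:y + h, x:x + w]
        img[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 0)
    cv2.imwrite(out_path, img)
```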
On the computing device 230 side, even for an image in which people's faces have been blurred, the layout of the furniture in the room, the position of the windows, and so on can still be grasped through object recognition, so each candidate position for installing the agent device can be evaluated correctly. The computing device 230 does not need to be aware of whether the captured image sent from the companion device 220 has been encrypted by blurring or the like.
From the computing device 230, an image in which agent device icons are superimposed on the candidate positions in the face-blurred captured image is returned to the companion device 220, as shown in FIG. 14. In such a case, the companion device 220 may present to the user as-is the image with the icons of the agent devices 1401 and 1402 superimposed while the faces remain blurred, as in FIG. 14, or, as in FIG. 15, the information decryption unit 226 may create an image in which the icons of the agent devices 1501 and 1502 are remapped onto the captured image with the blurring restored to its original state, and present that to the user.
There are also cases where the image photographed of the corner where the agent device is to be placed shows no family members, as in FIG. 16, but the user still does not want to disclose the furniture or the layout of the room. In such a case, as shown in FIG. 17, the companion device 220 performs encryption processing in the information encryption unit 225 by replacing the image with a similar image that poses no problem even if disclosed, and then transmits it to the computing device 230. The similar image referred to here is, for example, a generalized or genericized image in which the entire room or individual distinctive pieces of furniture are replaced with generic ones through image processing, while keeping the arrangement of the furniture and the structure of the room approximately the same.
The computing device 230 does not need to be aware of whether the captured image sent from the companion device 220 has been encrypted through generalization or genericization. On the computing device 230 side, even a generalized or genericized image allows the same objects to be recognized as in the original captured image, provided that the layout of the furniture in the room, the position of the windows, and so on are approximately the same, so each candidate position for installing the agent device can be evaluated correctly.
From the computing device 230, an image in which an agent device icon is superimposed on a candidate position in the generalized or genericized image is returned to the companion device 220, as shown in FIG. 18. In such a case, the companion device 220 may present to the user as-is the image with the icon of the agent device 1801 superimposed on the still generalized or genericized image, as in FIG. 18, or, as in FIG. 19, the information decryption unit 226 may create an image in which the icon of the agent device 1901 is remapped onto the restored original captured image, and present that to the user.
Privacy-related information can also be kept undisclosed by replacing the captured image with an illustration rather than a generalized or genericized image. An application for creating a room floor plan may be run on the companion device 220 so that the user draws the floor plan of the room himself or herself.
FIG. 32 shows a configuration example of a UI screen 3200 for drawing an illustration of a room floor plan. A floor plan 3201 of the room is displayed in the upper half of the illustrated UI screen, and a plurality of furniture icons 3202 are displayed in the lower half.
The floor plan 3201 can be generated automatically by, for example, photographing the actual room three-dimensionally, converting the visible edges of the captured solid into solid lines, and converting hidden edges into dashed lines as necessary. Alternatively, for cases such as when the user does not want to disclose the room's floor plan itself, the floor plan creation application may prepare standard floor plans such as rectangles in advance, or the user may be allowed to select from among several sample floor plans.
The furniture icons 3202 represent major pieces of furniture, such as a refrigerator, television receiver, sink, sofa, speaker, power outlet, table, sideboard, and window, each in a standard pictorial form. The user can turn the room's floor plan into an illustration by selecting the icons corresponding to the furniture actually placed in the room and arranging them at the corresponding locations on the floor plan 3201 in the upper half of the screen.
Alternatively, instead of the user illustrating the room's floor plan by manual operation as described above, the information encryption unit 225 may automatically convert the captured image into an illustration of the room's floor plan. In this case, the information decryption unit 226 may automatically restore the floor plan illustration to the original captured image.
When the companion device 220 is configured as, for example, a smartphone, a UI screen for drawing the illustration, as shown in FIG. 32, is displayed on the smartphone's screen. The user can then draw the illustration of the room's floor plan by touch operations on that UI screen. Alternatively, the floor plan creation application may be launched on an information terminal on which editing operations are easier than on the companion device 220, such as a tablet or personal computer, and the created illustration may be transmitted from that information terminal to the companion device 220.
The illustration of the room's floor plan is then transmitted from the companion device 220 to the computing device 230 together with the device name of the agent device. Since each piece of furniture is iconified in the floor plan illustration, the computing device 230 can identify the furniture more accurately than by performing object recognition on a captured image. The computing device 230 then evaluates candidate positions for installing the agent device within the floor plan illustration, superimposes an agent device icon on the recommended location on the illustration, and returns it to the companion device 220. As shown in FIG. 33, the companion device 220 displays the illustration with the agent device icon 3301 superimposed on the UI screen 3200, proposing the agent's installation location to the user.
Alternatively, instead of sending the floor plan illustration itself to the computing device 230, the companion device 220 may transmit geometric information indicating the shape of the room's floor plan and numerical information such as the position information of each furniture icon. On the computing device 230 side, the floor plan illustration can be reproduced based on the received numerical information. The computing device 230 may also return the position information of the recommended installation location rather than an illustration with the agent device icon superimposed on it, and the companion device 220 side may then perform the processing of superimposing the agent device icon 3301 on the room floor plan 3200.
Furthermore, by handling the objects contained in the captured image as object names and their position information on the image, the information can be encrypted into a format that is even harder to decipher than when a captured image or illustration is used. For example, if the image photographed of the corner where the agent device is to be placed is as shown in FIG. 16, the companion device 220 replaces the sofa, center table, and other objects contained in the image with object names and position information on the image, as shown below, and transmits these to the computing device 230.
Sofa: (x0, y0, h0, w0)
Table: (x1, y1, h1, w1)
…
On the computing device 230 side, even from just the object names and their position information on the image, the layout of the furniture in the room, the position of the windows, and so on can be grasped spatially or three-dimensionally in the same way as from the original captured image; therefore, each candidate position for installing the agent device can be evaluated correctly. In this case, the computing device 230 calculates an agent candidate position such as the following from the object names and the position information on the image.
Agent1: (x2, y2, h2, w2)
Since no image is sent from the companion device 220, the computing device 230 does not superimpose an agent device icon onto an image, but instead returns the above candidate position information for the agent to the companion device 220 as-is. On the companion device 220 side, the agent device icon may then be superimposed at the corresponding position on the captured image, based on the received candidate position information for the agent, and presented to the user. Note that since only the position information of objects such as furniture and the agent device is transmitted between the computing device 230 and the companion device 220, and no data-heavy image information is transmitted, this also saves communication bandwidth.
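A minimal sketch of this object-name-plus-bounding-box exchange is shown below, assuming a JSON representation; the field names and coordinate values are illustrative and not specified in this description.

```python
import json

# What the companion device sends instead of an image: labels and (x, y, h, w) boxes.
room_objects = [
    {"label": "Sofa",  "box": [120, 340, 180, 420]},
    {"label": "Table", "box": [380, 400, 110, 200]},
]
request_body = json.dumps({"agent_device_name": "ZZZZ", "objects": room_objects})

# What the computing device returns: only the recommended position, no image data.
response_body = json.dumps({"Agent1": {"box": [560, 300, 90, 90]}})

# The companion device maps the returned box back onto its own photograph,
# which was never transmitted, and overlays the agent device icon there.
recommended = json.loads(response_body)["Agent1"]["box"]
print("Overlay agent icon at", recommended)
```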
E. Operation Example Considering Furniture Movement
Next, a method is described in which the placement recommendation system 200 proposes candidate positions for installing the agent device that also involve moving objects in the room, such as furniture.
In step 5 of the operation procedure of the placement recommendation system 200 shown in FIG. 3, the furniture information table 234 is referred to in order to devise plans to move the furniture that is an obstacle at each candidate position, or to devise other candidate positions within the room.
The furniture information table 234 describes furniture that can be moved by the user and its moving cost. The moving cost is a numerical expression of how difficult a piece of furniture is to move (or the burden on the user of moving it), based on its weight, volume, installation conditions, and so on. FIG. 20 shows a configuration example of the furniture information table 234. In the illustrated furniture information table 234, lightweight, easy-to-move furniture such as speakers, photo frames, and vases is given a low moving cost, while heavy items such as wardrobes and refrigerators are given a high moving cost.
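As with the agent device table, a possible in-memory form of the furniture information table 234 is sketched below. The cost values are illustrative rather than the actual contents of FIG. 20, and the helper merely shows how a low-cost obstacle could be picked for a move proposal in step 5.

```python
# Hypothetical moving costs: higher means harder to move (heavier, bulkier, etc.).
FURNITURE_INFO_TABLE = {
    "speaker": 1, "photo_frame": 1, "vase": 1,
    "side_table": 3, "sofa": 5, "tv": 5,
    "wardrobe": 9, "refrigerator": 10,
}


def cheapest_obstacle_to_move(obstacles, table=FURNITURE_INFO_TABLE):
    """Pick the obstructing object whose relocation burdens the user least."""
    known = [o for o in obstacles if o in table]
    return min(known, key=table.__getitem__) if known else None


# e.g. a photo frame on the cabinet is proposed for relocation before a wardrobe would be.
print(cheapest_obstacle_to_move(["photo_frame", "wardrobe"]))  # -> "photo_frame"
```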
Several operation examples of proposing candidate positions for installing the agent device that include moving objects in the room, such as furniture, are described below.
For example, suppose the captured image 2100 shown in FIG. 21 and the agent device name "ZZZZ" are sent from the companion device 220 to the computing device 230. The computing device 230 queries the agent device table 233 (see FIG. 7) for the agent device name "ZZZZ" and learns that the target agent device is equipped with a projector display function. The computing device 230 also performs object recognition on the captured image 2100 and, since the white wall 2102 facing the sofa 2101 has no unevenness, can find the top of the cabinet 2103 as a candidate position suitable for projection.
However, a photo frame 2104 is already placed on top of the cabinet 2103 and would be in the way of installing the agent device. The computing device 230 therefore queries the furniture information table 234 for "photo frame" and, after confirming that its moving cost is low, creates proposal data designating the top of the cabinet 2103 as a candidate position and proposing that the photo frame 2104 be moved somewhere else, and returns it to the companion device 220.
The computing device 230 may, for example, compose an image (not shown) in which the photo frame 2104 has been removed from the top of the cabinet 2103 in the captured image shown in FIG. 21 and an agent device with device name "ZZZZ" has been placed there, and transmit it to the companion device 220. The computing device 230 may also transmit to the companion device 220 a message notifying that the photo frame 2104 should be moved. The companion device 220 presents the image received from the computing device 230 to the user, and if a message accompanying the image is received, presents that as well, thereby prompting the user to move the photo frame 2104 and install the agent device on top of the cabinet 2103.
Next, suppose the captured image 2200 shown in FIG. 22 and the agent device name "ZZZZ" are sent from the companion device 220 to the computing device 230. The computing device 230 queries the agent device table 233 (see FIG. 7) for the agent device name "ZZZZ" and learns that the target agent device is equipped with a projector display function. The computing device 230 also performs object recognition on the captured image 2200 and, since the white wall 2202 facing the sofa 2201 has no unevenness, can find the top of the sideboard 2203 as a candidate position suitable for projection.
However, a television receiver 2204 and a vase 2205 are already placed on the sideboard 2203 and would be in the way of installing the agent device there. Furthermore, since the sound of the television receiver 2204 interferes with the voice input/output of the agent device, it is designated in the agent device table 233 (see FIG. 7) as a device to be avoided. The computing device 230 therefore queries the furniture information table 234 for "television receiver" and "vase" and, after confirming that their moving costs are not high, creates proposal data that sets a candidate position 2206 on the sideboard 2203, moves the television receiver 2204 along the sideboard 2203 away from the candidate position 2206, and moves the vase 2205 somewhere else, and returns it to the companion device 220.
The computing device 230 may, for example, compose an image (not shown) based on the captured image shown in FIG. 22 in which the television receiver 2204 has been moved along the sideboard 2203 away from the candidate position 2206, the vase 2205 has been removed from the sideboard 2203, and an agent device with device name "ZZZZ" has been installed on the sideboard 2203, and transmit it to the companion device 220. The computing device 230 may also transmit to the companion device 220 a message notifying that the television receiver 2204 and the vase 2205 should be moved. The companion device 220 presents the image received from the computing device 230 to the user, and if a message accompanying the image is received, presents that as well, thereby prompting the user to move the television receiver 2204 and the vase 2205 and install the agent device on the sideboard 2203.
 また、コンパニオンデバイス220から演算デバイス230へ、図23に示す撮影画像2300と、エージェント機器名“WWWW”が送られてきたとする。演算デバイス230は、エージェントデバイステーブル233(図7を参照のこと)にエージェント機器名“WWWW”を照会して、対象とするエージェントデバイスがカメラ撮影機能を装備し、壁際が適した場所であることを把握する。また、演算デバイス230は、撮影画像2200を物体認識して、カメラ撮影に適している候補位置として壁際に置かれている電話台2301を見つけ出すことができる。 Also, assume that the captured image 2300 shown in FIG. 23 and the agent device name “WWWW” have been sent from the companion device 220 to the computing device 230. The computing device 230 refers to the agent device table “233” (see FIG. 7) for the agent device name “WWW”, and determines that the target agent device is equipped with a camera photographing function and is a suitable place near a wall. Figure out. Also, the computing device 230 can recognize the photographed image 2200 as an object and find out the telephone stand 2301 placed on the wall as a candidate position suitable for camera photographing.
 しかしながら、ユーザが着座するソファ2302が電話台2301に背を向けて設置されているので、エージェントデバイスを電話台2301に置くと、ユーザを正面から撮影できなくなるという問題がある。そこで、演算デバイス230は、家具情報テーブル234に「ソファ」を照会して、移動コストが高くないことを確認すると、電話台2301上に候補位置を設定するとともに、ソファ2302が電話台2301の方を向くように移動させるという提案データを作成して、コンパニオンデバイス220に返送する。 However, since the sofa 2302 on which the user sits is placed with his / her back facing the telephone stand 2301, placing the agent device on the telephone stand 2301 poses a problem that the user cannot be photographed from the front. Then, the arithmetic device 230 refers to the “sofa” in the furniture information table 234 and confirms that the moving cost is not high, and sets a candidate position on the telephone stand 2301 and sets the sofa 2302 to the telephone stand 2301. Is created, and the proposal data is moved to the companion device 220.
 演算デバイス230は、例えば、図23に示した撮影画像に対して、ソファ2302の向きを変えた画像2302´、及び、機器名“WWWW”のエージェントデバイスを電話台2301上に設置した画像を合成して、コンパニオンデバイス220に送信するようにしてもよい。演算デバイス230は、ソファ2302の向きを変えるべきことを通知するメッセージを併せてコンパニオンデバイス220に送信するようにしてもよい。図24には、図23に基づいて作成された合成画像を例示している。コンパニオンデバイス220は、演算デバイス230から受信した画像をユーザに提示し、画像に付随するメッセージを受信した場合はそれもユーザに提示して、ソファ2302の向きを変えて、電話台2301の上にエージェントデバイス2401を設置するようにユーザを促すことができる。 The arithmetic device 230 combines, for example, an image 2302 ′ in which the sofa 2302 has been turned and an image in which an agent device with the device name “WWWW” is installed on the telephone stand 2301 with the captured image shown in FIG. Then, it may be transmitted to the companion device 220. The arithmetic device 230 may transmit a message notifying that the direction of the sofa 2302 should be changed to the companion device 220 together. FIG. 24 illustrates a composite image created based on FIG. The companion device 220 presents the image received from the computing device 230 to the user, and when a message accompanying the image is received, also presents it to the user, changes the orientation of the sofa 2302, and places the message on the telephone stand 2301. The user can be prompted to install the agent device 2401.
 なお、同じ家具カテゴリであっても、移動の難易が家庭毎の個別の事情により相違する場合があるので、ユーザが家具情報テーブル234をカスタマイズできるようにしてもよい。例えば、スピーカは、軽量で持ち運び易いが、音響効果の観点から移動させたくない場合には移動コストを高くしたり、部屋の美観の観点から花瓶を固定したい場合には移動コストを高くしたりする。また、タンスは、中身が空であり移動が困難でなければ移動コストを低くして、エージェントデバイスの候補位置を提案し易くしてもよい。 Note that even in the same furniture category, the difficulty of movement may differ depending on the individual circumstances of each home, so that the user may be able to customize the furniture information table 234. For example, a speaker is lightweight and easy to carry, but if the user does not want to move the speaker from the viewpoint of the sound effect, the moving cost is increased, or if the vase is fixed from the viewpoint of the beauty of the room, the moving cost is increased. . In addition, if the closet is empty and it is difficult to move, the move cost may be reduced to make it easier to propose a candidate position of the agent device.
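As a rough illustration of this kind of customization, the sketch below layers per-household overrides on top of default moving costs; the field names and cost values are illustrative assumptions only.

    # Hypothetical sketch: per-household overrides on top of default furniture moving costs.
    DEFAULT_MOVING_COST = {
        "speaker": 1.0,   # light and easy to carry by default
        "vase": 1.0,
        "chest": 5.0,     # normally heavy
    }

    def effective_moving_cost(category: str, household_overrides: dict) -> float:
        """Return the moving cost used by the placement search,
        preferring a household-specific value when one exists."""
        return household_overrides.get(category, DEFAULT_MOVING_COST.get(category, 3.0))

    # Example: this household never wants the speaker moved (acoustics)
    # but keeps an empty, easy-to-move chest.
    overrides = {"speaker": 10.0, "chest": 1.0}
    print(effective_moving_cost("speaker", overrides))  # 10.0
    print(effective_moving_cost("vase", overrides))     # 1.0 (default)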
F. Operation Example Considering Power Cable Wiring
Next, a method by which the placement location recommendation system 200 proposes candidate positions for installing the agent device, taking the routing of the power cable into account, will be described.
The embodiments described so far propose an installation location for the agent device with attention mainly to the functions the agent device provides. However, even a location that is excellent for projector display or camera imaging may, if it is far from a power outlet, require a long power cable run, which can obstruct walking and spoil the appearance of the room. Problems can also arise such as needing an extension cord, or having to run on battery power because a commercial power supply cannot be reached (making charging a chore).
Therefore, as shown in FIG. 25, an installation cost table that defines an installation cost according to the distance from an outlet may be prepared, and the computing device 230 may propose the installation location of the agent device in consideration of the installation cost, in addition to the locations suited to the functions of the agent device (or the places and devices to be avoided) and the moving costs of the furniture.
When the computing device 230 receives from the companion device 220 an image of the corner of the room where the agent device is to be placed, it performs object recognition on the captured image and searches for the positions of power outlets in the room. If an outlet is hidden in the captured image and cannot be recognized, the position of the outlet in the room may be estimated based on empirical rules or learning data concerning house architectural design.
The computing device 230 then searches for a place suited to the function of the target agent device, taking the installation cost into account together with the furniture moving costs. For example, when two or more candidate positions with the same score are found, considering the installation cost of each candidate position gives a higher priority to the candidate position closest to an outlet.
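A minimal sketch of this tie-breaking step is given below, assuming the installation cost is a simple lookup keyed by outlet-distance bands; the band boundaries and cost values are illustrative assumptions, not values taken from FIG. 25.

    # Hypothetical sketch: break ties between equally scored candidates
    # by preferring the one with the lowest outlet-distance installation cost.
    import math

    def installation_cost(distance_m: float) -> float:
        # Assumed cost bands; the installation cost table of FIG. 25 defines the real values.
        if distance_m <= 1.0:
            return 0.0
        if distance_m <= 2.0:
            return 1.0
        if distance_m <= 3.0:
            return 2.0
        return 5.0

    def rank_candidates(candidates, outlets):
        """candidates: list of (x, y, score); outlets: list of (x, y).
        Sort by score descending, then by installation cost ascending."""
        def nearest_outlet_distance(x, y):
            return min(math.hypot(x - ox, y - oy) for ox, oy in outlets)
        return sorted(
            candidates,
            key=lambda c: (-c[2], installation_cost(nearest_outlet_distance(c[0], c[1]))),
        )

    ranked = rank_candidates([(1.0, 2.0, 0.8), (4.0, 2.0, 0.8)], outlets=[(0.5, 2.0)])
    print(ranked[0])  # the candidate nearer the outlet wins the tie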
G. Operation Example in Response to Room Layout Changes
The various operation examples described so far have in common that they search for a place suited to the agent device performing its functions in accordance with the layout of the room. Therefore, when the room layout changes, for example because furniture has been moved, a relocation of the agent device may be proposed.
For example, after the user has rearranged the room, the user may use the companion device 220 to request a proposal from the computing device 230 regarding the installation location of the agent device 210, following the procedure described with reference to FIG. 3.
Alternatively, when the agent device 210 detects, based on sensor information acquired by the sensor unit 213 or the like, that an event has occurred that changes the installation location suited to the agent device 210 itself, such as a change in the room layout, it may notify the user of the relocation of the agent device via the display unit 212, or notify the companion device 220 via the communication unit 214.
The operation of the agent device 210 in this case will be described with reference to FIG. 26.
The sensor unit 213 senses the environment in the room where the agent device 210 is installed (S2601) and transmits environment data to the CPU in the control unit 211 at regular intervals (S2602). The CPU temporarily stores the received environment data in memory (S2603), reads out the environment data stored in the past and compares it with the current environment data (S2604), and determines whether there has been a change in the room environment that warrants changing the installation location of the agent device 210 itself.
When the sensor unit 213 includes a camera, the sensor unit 213 sends images of the room captured by the camera to the CPU in the control unit 211. The CPU sequentially and temporarily stores the captured images of the room in memory and takes the difference between captured images taken at regular intervals, thereby detecting changes in the room environment, such as the movement of furniture, that warrant changing the installation location of the agent device 210 itself. For example, as shown in FIG. 27, by taking the image difference between the captured image 2701 before the change and the captured image 2702 after the change, it can be detected that the position and orientation of the sofa 2710 have changed.
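The image-difference check can be sketched roughly as below, assuming OpenCV-style grayscale differencing with a fixed threshold; the threshold value and the changed-pixel ratio used to declare a layout change are assumptions made for illustration.

    # Hypothetical sketch: detect a layout change by differencing two room images.
    import cv2
    import numpy as np

    def layout_changed(prev_path: str, curr_path: str,
                       pixel_thresh: int = 30, area_ratio: float = 0.05) -> bool:
        prev = cv2.cvtColor(cv2.imread(prev_path), cv2.COLOR_BGR2GRAY)
        curr = cv2.cvtColor(cv2.imread(curr_path), cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(prev, curr)                     # per-pixel difference
        _, mask = cv2.threshold(diff, pixel_thresh, 255, cv2.THRESH_BINARY)
        changed_ratio = np.count_nonzero(mask) / mask.size
        return changed_ratio > area_ratio                  # large change -> furniture likely moved

    if layout_changed("room_2701.png", "room_2702.png"):
        print("Room layout appears to have changed; placement should be re-evaluated.")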
When the sensor unit 213 includes a human sensor, the sensor unit 213 also sends human sensor data to the CPU in the control unit 211 (S2605).
When the CPU in the control unit 211 detects a change in the room environment that warrants changing the installation location of the agent device 210 itself, it notifies the user via the display unit 212 (S2606). For example, when the display unit 212 has a speaker playback function, it outputs a voice message announcing that an environmental change requiring relocation of the agent device 210 has occurred in the room and what has changed, thereby notifying the user (S2607). When the display unit 212 has a projector display function, it projects onto a wall a screen indicating that the room environment has changed and what has changed, thereby notifying the user (S2608).
However, if voice output or screen display is performed while the user is absent, the user cannot be notified. Therefore, the CPU in the control unit 211 may perform the above presentation by the display unit 212 at a timing when the presence of the user has been confirmed by the human sensor of the sensor unit 213.
When the CPU in the control unit 211 detects a change in the room environment, it also notifies the companion device 220 via the communication unit 214 (S2609). The companion device 220 likewise uses its display unit 222 to notify the user that there has been a change in the room environment that warrants changing the installation location of the agent device 210.
In this way, the user can learn from at least one of the agent device 210 and the companion device 220 that there has been a change in the room environment that warrants changing the installation location of the agent device 210. The user can then photograph the room, as at the initial setup of the agent device 210, and receive the installation position proposal service for the agent device 210 according to the procedure already described with reference to FIG. 3.
H. Fine Adjustment Proposed by the Agent Device Itself
In the basic operation of the placement location recommendation system 200 shown in FIG. 3, after the user installs the agent device 210 in the room based on the proposal data presented by the companion device 220, photographing of the agent device 210 and uploading of the captured image to the computing device 230 can be repeated to verify the suitability of the installation location and adjust the position of the agent device 210.
Alternatively, the computing device 230 may propose only a rough installation position, while fine adjustment of the actual position is performed based on sensing by the agent device 210 itself. There is a limit to the granularity of information that the computing device 230 can obtain through object recognition of the captured image sent from the companion device 220. In contrast, the agent device 210 can estimate the surrounding environment of the installation position based on the sensing results of its sensor unit 213 and fine-tune its own position at a finer granularity.
FIG. 28 schematically shows the functional configuration of an agent device 210 that performs such fine adjustment of its position. It is assumed here that the illustrated agent device 210 has a projector display function as the display unit 212 and that an installation position near a wall has been proposed for it.
The agent device 210 includes, as the sensor unit 213, a 6-axis sensor 2801, a distance sensor 2802, a human sensor 2803, and the like. The 6-axis sensor 2801 is a sensor unit that consists of, for example, an inertial measurement unit (IMU) and measures position and posture. The distance sensor 2802 may be a distance sensor that uses the reflection of laser light, ultrasonic waves, infrared light, or the like.
The control unit 211 includes a self-orientation detection unit 2811, a wall distance estimation unit 2812, and an installation position evaluation unit 2813. These functional modules 2811 to 2813 may be, for example, software modules executed by the CPU in the control unit 211.
The self-orientation detection unit 2811 detects the posture of the agent device 210 main body based on the detection signal of the 6-axis sensor 2801. The wall distance estimation unit 2812 estimates the distance from the agent device 210 main body to the wall serving as the projection surface based on the detection signal of the distance sensor 2802. The installation position evaluation unit 2813 then evaluates whether the installation position of the agent device 210 main body is appropriate based on the detected posture of the agent device 210 main body and the distance to the wall.
If the installation position evaluation unit 2813 determines as a result of the evaluation that the installation position is inappropriate, for example that the agent device 210 main body is too far from the wall, that the wall is inclined with respect to the projector's emission direction, or that the installation surface is inclined and unstable, it notifies the user by voice playback or on the projector screen via the display unit 212. For example, messages prompting the user to improve the installation position of the agent device 210, such as "It looks like I am far from the wall. Please move me a little closer.", "I seem to be placed at a slight angle. Isn't my text hard to read?", or "Please place me on a level surface, or I may slip and fall.", may be displayed on the projector screen (see FIGS. 29 to 31). Alternatively, the agent device 210 may output a voice message instead of, or in addition to, the screen display.
However, if voice output or screen display is performed while the user is absent, the user cannot be notified. Therefore, the control unit 211 may perform the above presentation by the display unit 212 at a timing when the presence of the user has been confirmed by the human sensor 2803.
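As a rough sketch of the installation position evaluation described above, the code below combines an assumed IMU tilt reading with an assumed wall-distance reading and returns a corrective message; the tolerance values and message wording are illustrative assumptions rather than values from the embodiment.

    # Hypothetical sketch: evaluate the installation position from tilt and wall distance.
    def evaluate_installation(tilt_deg: float, wall_distance_m: float,
                              max_tilt_deg: float = 5.0,
                              max_wall_distance_m: float = 1.5) -> str:
        """tilt_deg: device tilt estimated from the 6-axis sensor.
        wall_distance_m: distance to the projection wall from the distance sensor."""
        if wall_distance_m > max_wall_distance_m:
            return "It looks like I am far from the wall. Please move me a little closer."
        if tilt_deg > max_tilt_deg:
            return "I seem to be placed at a slight angle. Please place me on a level surface."
        return "The current position looks fine."

    print(evaluate_installation(tilt_deg=2.0, wall_distance_m=2.3))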
I. Processing Operations for Recommending an Installation Location
I-1. Basic Operation
When the computing device 230 recognizes objects such as furniture and windows installed in the room from a captured image of the room (or an illustration of the room layout), it calculates a score score(x, y) for each position (x, y) in the room based on information about the locations suited to the target agent device and the devices or places to be avoided, finds the optimal installation position (x_opt, y_opt) at which the score is maximized, and proposes it to the companion device 220.
Here, using a function F such that the score increases as the position (x, y) of the agent device approaches a suitable place and decreases as it moves away, and decreases as the position approaches a device or place to be avoided and increases as it moves away, the score score(x, y) at each position (x, y) in the room is defined as in the following equation (1).
    score(x, y) = F(x, y | Θ)    (1)
In the above equation (1), Θ is a parameter indicating what kind of object is located where. When the two-dimensional position of an object is represented by (x, y) and its vertical and horizontal sizes by (h, w), the location of each object can be represented as in the following equation (2), and the parameter Θ can be expressed as in the following equation (3).
    location_i = (x_i, y_i, h_i, w_i)    (2)

    Θ = { (category_i, x_i, y_i, h_i, w_i) | i = 1, …, N }    (3)
Therefore, finding the optimal installation position (x_opt, y_opt) of the agent device can be treated as an optimization problem of searching for the position where the score score(x, y) = F(x, y | Θ) is maximized, which can be expressed as in the following equation (4).
    (x_opt, y_opt) = argmax_(x, y) F(x, y | Θ)    (4)
FIG. 34 shows, in the form of a flowchart, the basic processing procedure by which the computing device 230 recommends the optimal installation location of the agent device to the companion device 220.
The computing device 230 receives, from the companion device 220, a captured image of the room where the agent device is to be installed and the device name of the target agent device (step S3401).
The computing device 230 performs object recognition on the received captured image and estimates the attribute (category), position, and size of each object in the room (step S3402). Based on this estimation result, the parameter Θ of the above equation (3) can be obtained.
Next, the computing device 230 queries the agent device table 233 with the device name received in step S3401 and obtains information on the locations suited to the agent device and the devices and places to be avoided. Then, using the function F, whose score falls as the position approaches a device or place to be avoided and rises as it moves away, the computing device 230 calculates the score score(x, y) obtained when the agent device is placed at the position (x, y), according to the above equation (1) (step S3403).
Next, the computing device 230 solves the optimization problem shown in the above equation (4), that is, searching for the position where the score score(x, y) = F(x, y | Θ) is maximized, and determines the optimal installation position (x_opt, y_opt) of the agent device (step S3404). When there are a plurality of candidate positions, the second and third positions are also calculated, and the candidate positions are prioritized based on their scores.
The computing device 230 then superimposes an image of the agent device at the installation position (x_opt, y_opt) determined in step S3404 on the captured image received in step S3401, transmits the resulting image to the requesting companion device 220 (step S3405), and ends this processing.
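A minimal sketch of this basic search is given below, assuming the room is discretized into a grid and the suitability term is modeled with simple distance-based attraction and repulsion; the grid resolution, weights, and distance falloff are illustrative assumptions and not part of the described embodiment.

    # Hypothetical sketch: grid search for the position maximizing score(x, y) = F(x, y | Theta).
    import math

    def score(x, y, suitable_places, avoid_places):
        """Higher near suitable places, lower near devices/places to be avoided."""
        s = 0.0
        for (sx, sy) in suitable_places:
            s += 1.0 / (1.0 + math.hypot(x - sx, y - sy))   # attraction
        for (ax, ay) in avoid_places:
            s -= 1.0 / (1.0 + math.hypot(x - ax, y - ay))   # repulsion
        return s

    def best_position(room_w, room_h, suitable_places, avoid_places, step=0.1):
        best = None
        x = 0.0
        while x <= room_w:
            y = 0.0
            while y <= room_h:
                sc = score(x, y, suitable_places, avoid_places)
                if best is None or sc > best[2]:
                    best = (x, y, sc)
                y += step
            x += step
        return best  # (x_opt, y_opt, max score)

    print(best_position(4.0, 3.0, suitable_places=[(0.2, 1.5)], avoid_places=[(3.5, 0.5)]))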
I-2. Recommending an Installation Location Considering Furniture Moving Costs and the Distance from an Outlet
When defining the function F, in addition to the parameter Θ concerning the locations suited to the target agent device and the devices and places to be avoided, a parameter Ψ concerning the moving cost of each piece of furniture scattered in the room and the installation cost based on the distance from an outlet can be taken into account. By allowing furniture to be moved, the optimal installation position of the agent device can be determined from a wider range of candidates, and by considering the distance from an outlet, an installation position with a short power cable run can be determined. The parameter Ψ can be expressed as in the following equation (5). The parameter Ψ shown in equation (5) includes the moving cost of each object recognized from the captured image and the installation cost according to the distance from an outlet, where g_TV is the moving cost of the television receiver, g_Table is the moving cost of the table, and g_c is the installation cost according to the distance from the outlet.
    Ψ = { g_TV, g_Table, …, g_c }    (5)
In this case, the score score(x, y) at each position (x, y) in the room can be defined as in the following equation (6). The optimal installation position (x_opt, y_opt) of the agent device can then be treated as an optimization problem of searching for the position where the score score(x, y) = F(x, y | Θ, Ψ) is maximized, which can be expressed as in the following equation (7).
    score(x, y) = F(x, y | Θ, Ψ)    (6)

    (x_opt, y_opt) = argmax_(x, y) F(x, y | Θ, Ψ)    (7)
FIG. 35 shows, in the form of a flowchart, the processing procedure by which the computing device 230 recommends the optimal installation location of the agent device to the companion device 220 in consideration of the furniture moving costs and the installation cost according to the distance from an outlet.
The computing device 230 receives, from the companion device 220, a captured image of the room where the agent device is to be installed and the device name of the target agent device (step S3501).
The computing device 230 performs object recognition on the received captured image and estimates the attribute (category), position, and size of each object in the room (step S3502). Based on this estimation result, the parameter Θ shown in the above equation (3) can be obtained.
The computing device 230 also queries the furniture information table 234 (see FIG. 20) and the installation cost table (see FIG. 25) to obtain the parameter Ψ concerning the furniture moving costs and the installation cost according to the distance from an outlet (step S3503).
Next, the computing device 230 queries the agent device table 233 with the device name received in step S3501 and obtains information on the locations suited to the agent device and the devices and places to be avoided. Then, using the function F, whose score falls as the position approaches a device or place to be avoided and rises as it moves away, the computing device 230 calculates the score score(x, y) obtained when the agent device is placed at the position (x, y), according to the above equation (6) (step S3504).
Next, the computing device 230 solves the optimization problem shown in the above equation (7), that is, searching for the position where the score score(x, y) = F(x, y | Θ, Ψ) is maximized, and determines the optimal installation position (x_opt, y_opt) of the agent device, allowing furniture to be moved and taking the distance from an outlet into account (step S3505). When there are a plurality of candidate positions, the second and third positions are also calculated, and the candidate positions are prioritized based on their scores.
The computing device 230 then superimposes an image of the agent device at the installation position (x_opt, y_opt) determined in step S3505 on the captured image received in step S3501, combines it with images of the furniture after being moved, transmits the resulting image to the requesting companion device 220 (step S3506), and ends this processing.
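The sketch below extends the earlier grid-search scoring with penalty terms for furniture moving costs and the outlet-distance installation cost, corresponding roughly to F(x, y | Θ, Ψ); the clearance radius and penalty weights are assumptions made for illustration.

    # Hypothetical sketch: score with furniture moving cost and outlet installation cost.
    import math

    def score_with_costs(x, y, suitable_places, avoid_places,
                         furniture, outlets, move_weight=0.5, outlet_weight=0.3):
        """furniture: list of (fx, fy, moving_cost) for items that would have to be moved
        if they overlap the candidate position; outlets: list of (ox, oy)."""
        s = 0.0
        for (sx, sy) in suitable_places:
            s += 1.0 / (1.0 + math.hypot(x - sx, y - sy))
        for (ax, ay) in avoid_places:
            s -= 1.0 / (1.0 + math.hypot(x - ax, y - ay))
        # Penalty for furniture that sits close enough to the candidate to need moving.
        for (fx, fy, moving_cost) in furniture:
            if math.hypot(x - fx, y - fy) < 0.5:          # assumed clearance radius
                s -= move_weight * moving_cost
        # Penalty growing with the distance to the nearest outlet (longer cable run).
        if outlets:
            d = min(math.hypot(x - ox, y - oy) for ox, oy in outlets)
            s -= outlet_weight * d
        return s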
I-3. Recommending an Installation Location Considering the User's Desired Position
In some cases the user has a preferred installation position for the agent device. The reasons vary: the user may regard the agent device as a piece of furniture and be particular about where it sits in the room for aesthetic reasons, or may plan to place other items and want to restrict where the agent device can go.
If the position where the user wishes to install the agent device is denoted (x_u, y_u), the calculation of the score score(x, y) shown in the above equation (6) and the solution of the optimization problem shown in the above equation (7) are carried out under the condition shown in the following equation (8).
    (Equation (8): a constraint restricting the candidate position (x, y) to the user's desired installation position (x_u, y_u) and its vicinity.)
FIG. 36 shows, in the form of a flowchart, the processing procedure by which the computing device 230 recommends the optimal installation location of the agent device to the companion device 220 in consideration of the user's desired installation position.
The computing device 230 receives, from the companion device 220, a captured image of the room where the agent device is to be installed and the device name of the target agent device (step S3601).
The computing device 230 performs object recognition on the received captured image and estimates the attribute (category), position, and size of each object in the room (step S3602). Based on this estimation result, the parameter Θ shown in the above equation (3) can be obtained.
The computing device 230 also queries the furniture information table 234 (see FIG. 20) and the installation cost table (see FIG. 25) to obtain the parameter Ψ concerning the furniture moving costs and the installation cost according to the distance from an outlet (step S3603).
Further, the computing device 230 acquires the installation position (x_u, y_u) that the user desires for the agent device (step S3604). The method by which the computing device 230 acquires the desired installation position (x_u, y_u) is arbitrary. For example, the user may indicate the desired installation position by touching or clicking on the display screen showing the captured image, and the companion device 220 may transmit the input desired installation position (x_u, y_u) to the computing device 230.
Next, the computing device 230 queries the agent device table 233 with the device name received in step S3601 and obtains information on the locations suited to the agent device and the devices and places to be avoided. Then, using the function F, whose score falls as the position approaches a device or place to be avoided and rises as it moves away, the computing device 230 calculates the score score(x, y) obtained when the agent device is placed at the position (x, y), according to the above equation (6) and under the installation position condition shown in the above equation (8) (step S3605).
Next, under the installation position condition shown in the above equation (8), the computing device 230 solves the optimization problem shown in the above equation (7), that is, searching for the position where the score score(x, y) = F(x, y | Θ, Ψ) is maximized, and determines the optimal installation position (x_opt, y_opt) of the agent device, allowing furniture to be moved and taking the distance from an outlet into account (step S3606). When there are a plurality of candidate positions, the second and third positions are also calculated, and the candidate positions are prioritized based on their scores.
Next, the computing device 230 calculates the score score(x_opt, y_opt) at the optimal installation position (x_opt, y_opt) determined in step S3606 according to the following equation (9), and checks whether the score is equal to or greater than a predetermined threshold th (step S3607).
    score(x_opt, y_opt) = F(x_opt, y_opt | Θ, Ψ)    (9)
If the score score(x_opt, y_opt) at the optimal installation position (x_opt, y_opt) is equal to or greater than the predetermined threshold th (Yes in step S3607), the computing device 230 superimposes an image of the agent device at the installation position (x_opt, y_opt) determined in step S3606 on the captured image received in step S3601, combines it with images of the furniture after being moved, transmits the resulting image to the requesting companion device 220 (step S3608), and ends this processing.
On the other hand, if the score score(x_opt, y_opt) at the optimal installation position (x_opt, y_opt) is less than the predetermined threshold th (No in step S3607), the computing device 230 notifies the companion device 220 that the user's desired installation position (x_u, y_u) acquired in step S3604 is not suitable as an installation location for the agent device (step S3609), and ends this processing.
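The constrained search and threshold check can be sketched as below, assuming the condition of equation (8) is approximated by restricting the grid search to a small radius around the user's desired position; the radius and threshold values are assumptions for illustration.

    # Hypothetical sketch: search only near the user's desired position, then apply a threshold.
    import math

    def best_position_near(desired, score_fn, radius=0.5, step=0.1, threshold=0.3):
        """desired: (x_u, y_u); score_fn: callable (x, y) -> score.
        Returns (x_opt, y_opt, score) or None if no nearby position clears the threshold."""
        xu, yu = desired
        best = None
        x = xu - radius
        while x <= xu + radius:
            y = yu - radius
            while y <= yu + radius:
                if math.hypot(x - xu, y - yu) <= radius:   # stay within the allowed vicinity
                    sc = score_fn(x, y)
                    if best is None or sc > best[2]:
                        best = (x, y, sc)
                y += step
            x += step
        if best is None or best[2] < threshold:
            return None   # desired position is not suitable; notify the companion device
        return best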
I-4. Recommending an Installation Location Considering Feedback from the User
The computing device 230 returns, to the requesting companion device 220, an image in which the agent device is superimposed at the optimal installation position, in accordance with one of the processing procedures shown in FIGS. 34 to 36. The companion device 220 presents the image with the superimposed agent device to the user and recommends placing the agent device at that optimal installation position. The user looks at the image with the superimposed agent device, or actually tries placing the agent device at that location, and decides whether to accept the recommended location. The companion device 220 may also feed back to the computing device 230 whether the user has accepted the recommended installation position.
FIG. 37 shows, in the form of a flowchart, the processing procedure by which the computing device 230 recommends the optimal installation location of the agent device in consideration of feedback from the user.
The computing device 230 receives, from the companion device 220, a captured image of the room where the agent device is to be installed and the device name of the target agent device (step S3701). The computing device 230 then performs processing similar to, for example, steps S3602 to S3608 of the flowchart shown in FIG. 36 to determine the optimal installation position of the agent device, and transmits to the companion device 220 an image in which an image of the agent device is superimposed at that location (step S3702).
Thereafter, upon receiving from the companion device 220 the user's response regarding the above optimal installation position (step S3703), the computing device 230 checks whether the user has accepted the recommended installation position (step S3704).
If the user has accepted the recommended installation position (Yes in step S3704), this processing ends.
On the other hand, if the user does not accept the recommended installation position (No in step S3704), the computing device 230 acquires the installation position (x_u, y_u) that the user desires for the agent device (step S3705). The method by which the computing device 230 acquires the desired installation position (x_u, y_u) is arbitrary, as described above.
Next, the computing device 230 queries the agent device table 233 with the device name received in step S3701 and obtains information on the locations suited to the agent device and the devices and places to be avoided. Then, using the function F, whose score falls as the position approaches a device or place to be avoided and rises as it moves away, the computing device 230 calculates the score score(x, y) obtained when the agent device is placed at the position (x, y), according to the above equation (6) and under the installation position condition shown in the above equation (8) (step S3706).
Next, under the installation position condition shown in the above equation (8), the computing device 230 solves the optimization problem shown in the above equation (7), that is, searching for the position where the score score(x, y) = F(x, y | Θ, Ψ) is maximized, and determines the optimal installation position (x_opt, y_opt) of the agent device, allowing furniture to be moved and taking the distance from an outlet into account (step S3707). When there are a plurality of candidate positions, the second and third positions are also calculated, and the candidate positions are prioritized based on their scores.
Next, the computing device 230 calculates the score score(x_opt, y_opt) at the optimal installation position (x_opt, y_opt) determined in step S3707 according to the above equation (9), and checks whether the score is equal to or greater than the predetermined threshold th (step S3708).
If the score score(x_opt, y_opt) at the optimal installation position (x_opt, y_opt) is equal to or greater than the predetermined threshold th (Yes in step S3708), the computing device 230 superimposes an image of the agent device at the installation position (x_opt, y_opt) determined in step S3707 on the captured image received in step S3701, combines it with images of the furniture after being moved, transmits the resulting image to the requesting companion device 220 (step S3709), and ends this processing.
On the other hand, if the score score(x_opt, y_opt) at the optimal installation position (x_opt, y_opt) is less than the predetermined threshold th (No in step S3708), the computing device 230 notifies the companion device 220 that the user's desired installation position (x_u, y_u) acquired in step S3705 is not suitable as an installation location for the agent device (step S3710), and ends this processing.
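A rough sketch of this accept/reject loop on the computing-device side is shown below; the function names and the single-retry structure are assumptions made for illustration, since the embodiment leaves the exact protocol open.

    # Hypothetical sketch: recommend, then re-evaluate around the user's desired position on rejection.
    def handle_placement_request(room_image, device_name, recommend_fn, constrained_fn, ask_user_fn):
        """recommend_fn: returns (position, preview_image) for the unconstrained optimum.
        constrained_fn: returns a result near a desired position, or None if unsuitable.
        ask_user_fn: presents a preview and returns ("accept", None) or ("reject", desired_position)."""
        position, preview = recommend_fn(room_image, device_name)
        decision, desired = ask_user_fn(preview)
        if decision == "accept":
            return position
        result = constrained_fn(room_image, device_name, desired)
        if result is None:
            return None   # notify the companion device that the desired position is not suitable
        return result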
J. Conclusion
According to the placement location recommendation system 200 of this embodiment, the optimal position at which to install the agent device 210 is presented simply by transmitting an image of the room captured with the companion device 220 to the computing device 230.
The companion device 220 is realized, for example, in the form of a companion application running on an information terminal such as a smartphone or tablet, and the computing device 230 is in substance a server on the cloud. Therefore, the user is presented with the optimal installation position of the agent device 210 merely by performing the everyday, simple operation of uploading an image taken with a smartphone to the cloud, which is both easy and convenient.
In addition, when transmitting the captured image of the room to the computing device 230, the companion device 220 applies image processing such as blurring or mosaicking to information the user does not want disclosed, or applies encryption processing that replaces it with other information, so the user can enjoy the service of recommending a placement location for the agent device 210 while protecting the user's privacy.
Furthermore, according to the placement location recommendation system 200 of this embodiment, the placement location of the agent device 210 is recommended in consideration of the movement of furniture in the room, so objects other than the agent device 210 can be treated as noise or as obstacles to installation when recommending the placement location of the agent device 210.
The technology disclosed in this specification has been described in detail above with reference to specific embodiments. However, it is obvious that those skilled in the art can modify or substitute the embodiments without departing from the gist of the technology disclosed in this specification.
The technology disclosed in this specification can be applied not only to voice agents but also to the installation of various devices on which agent applications reside, such as information home appliances and IoT devices.
In short, the technology disclosed in this specification has been described by way of example, and the contents of this specification should not be interpreted restrictively. To determine the gist of the technology disclosed in this specification, the claims should be taken into consideration.
 なお、本明細書の開示の技術は、以下のような構成をとることも可能である。
(1)エージェントデバイスを設置する環境に関する情報を含む第1の情報を受信する受信部と、
 前記第1の情報に基づいて、前記環境における前記エージェントデバイスの設置位置を提案する第2の情報を生成する処理部と、
 前記第2の情報を返信する送信部と、
を具備する情報処理装置。
(1-1)エージェントデバイスを設置する環境に関する情報を含む第1の情報を受信する受信ステップと、
 前記第1の情報に基づいて、前記環境における前記エージェントデバイスの設置位置を提案する第2の情報を生成する処理ステップと、
 前記第2の情報を返信する送信ステップと、
を有する情報処理方法。
(2)前記第1の情報は、前記エージェントデバイスを設置する部屋の撮影画像を含み、
 前記第2の情報は、前記撮影画像中の提案する設置位置に前記エージェントデバイスを重畳した画像を含む、
上記(1)に記載の情報処理装置。
(3)前記処理部は、前記撮影画像を物体認識した結果と、前記エージェントデバイスが備える機能又は仕様に基づいて、前記エージェントデバイスの設置位置を提案する、
上記(2)に記載の情報処理装置。
(4)前記処理部は、前記撮影画像を物体認識した結果と、前記エージェントデバイスが適している場所並びに避けるべき機器又は場所とを比較して、前記エージェントデバイスの設置位置を提案する、
上記(2)又は(3)のいずれかに記載の情報処理装置。
(4-1)前記処理部は、スピーカ再生機能を装備する前記エージェントデバイスに対して、テレビ受像機又は他のスピーカのある場所を避ける設置位置を提案する、
上記(4)に記載の情報処理装置。
(4-2)前記処理部は、全周囲又は広角のカメラ撮影機能を装備する前記エージェントデバイスに対して部屋の中央の設置位置を提案する、
上記(4)に記載の情報処理装置。
(4-3)前記処理部は、プロジェクタ表示機能を装備する前記エージェントデバイスに対して白く平坦な壁の近くとなる設置位置を提案する、
上記(4)に記載の情報処理装置。
(5)前記処理部は、前記エージェントデバイスを設置する候補位置が複数ある場合には、候補位置毎に優先順位を付けた前記第2の情報を生成する、
上記(1)乃至(4)のいずれかに記載の情報処理装置。
(6)前記処理部は、前記エージェントデバイスを設置する部屋内の家具の移動も考慮して、前記エージェントデバイスの設置位置を提案する、
上記(1)乃至(5)のいずれかに記載の情報処理装置。
(7)前記処理部は、家具毎の移動コストに基づく部屋内の家具の移動も考慮して、前記エージェントデバイスの設置位置を提案する、
上記(6)に記載の情報処理装置。
(8)前記処理部は、家具を移動させて前記エージェントデバイスを設置した部屋の合成画像を含む前記第2の情報を生成する、
上記(7)に記載の情報処理装置。
(9)前記処理部は、コンセントからの距離に基づく設置コストも考慮して、前記エージェントデバイスの設置位置を提案する、
上記(1)乃至(9)のいずれかに記載の情報処理装置。
(10)エージェントデバイスを設置する環境に関する情報を含む第1の情報を送信する送信部と、
 前記第1の情報に基づいて提案された前記エージェントデバイスの設置位置に関する第2の情報を受信する受信部と、
 前記第2の情報を提示する提示部と、
を具備する情報処理装置。
(11)撮影部をさらに含み、
 前記送信部は、前記エージェントデバイスを設置する部屋を前記撮影部で撮影した画像を含む前記第1の情報を送信し、
 前記受信部は、前記撮影画像中の提案される設置位置に前記エージェントデバイスが重畳された画像を含む前記第2の情報を受信し、
 前記提示部は、提案される設置位置に前記エージェントデバイスが重畳された前記画像を表示する、
上記(10)に記載の情報処理装置。
(12)前記第2の情報は、前記エージェントデバイスを設置する複数の候補位置と、候補位置毎に付けられた優先順位を含み、
 前記提示部は、前記複数の候補位置の各優先順位を提示する、
上記(10)又は(11)のいずれかに記載の情報処理装置。
(13)前記第1の情報に含まれる所定の情報を暗号化する情報暗号部と、
 前記第2の情報に含まれる暗号化された前記所定の情報を基に復元する情報復号部と、
をさらに備える、
上記(10)乃至(12)のいずれかに記載の情報処理装置。
(14)前記情報暗号部は、前記第1の情報に含まれる画像中の人の顔領域に対して所定の画像処理を施す、
上記(13)に記載の情報処理装置。
(15)前記情報暗号部は、前記第1の情報に含まれる画像を一般化又は汎用化した画像に置き換える、
上記(13)に記載の情報処理装置。
(16)前記情報暗号部は、前記第1の情報に含まれる画像をユーザによって描画されたイラストに置き換える、
上記(13)に記載の情報処理装置。
(17)前記情報暗号部は、前記第1の情報に含まれる画像を前記画像に映り込んだ各物体の位置情報に置き換える、
上記(13)に記載の情報処理装置。
(18)センサ部をさらに備え、
 前記送信部は、前記センサ部がエージェントデバイスを設置する前記環境の変化を検出したことに応じて、前記第1の情報を送信する、
上記(10)乃至(17)のいずれかに記載の情報処理装置。
(19)エージェントデバイスを設置する環境に関する情報を含む第1の情報を送信する送信ステップと、
 前記第1の情報に基づいて提案された前記エージェントデバイスの設置位置に関する第2の情報を受信する受信ステップと、
 前記第2の情報を提示する提示ステップと、
を有する情報処理方法。
(20)エージェントデバイスを設置する環境に関する情報を含む第1の情報を送信するとともに、前記第1の情報に基づいて提案された前記エージェントデバイスの設置位置に関する第2の情報を受信して、前記第2の情報を提示する第1のデバイスと、
 前記第1のデバイスから前記第1の情報を受信し、前記第2の情報を返信する第2のデバイスと、
を具備する情報処理システム。
The technology disclosed in the present specification may have the following configurations.
(1) a receiving unit that receives first information including information about an environment in which an agent device is installed;
A processing unit configured to generate, based on the first information, second information that suggests an installation position of the agent device in the environment;
A transmitting unit that returns the second information;
An information processing apparatus comprising:
(1-1) a receiving step of receiving first information including information on an environment in which an agent device is installed;
A processing step of generating second information that proposes an installation position of the agent device in the environment based on the first information;
A transmitting step of returning the second information;
An information processing method comprising:
(2) the first information includes a captured image of a room where the agent device is installed,
The second information includes an image in which the agent device is superimposed on a proposed installation position in the captured image,
The information processing device according to (1).
(3) The processing unit proposes an installation position of the agent device based on a result of object recognition of the captured image and a function or specification of the agent device.
The information processing device according to (2).
(4) The processing unit compares the result of object recognition of the captured image with a place where the agent device is suitable and a device or place to be avoided, and proposes an installation position of the agent device.
The information processing device according to any one of (2) and (3).
(4-1) The processing unit proposes, for the agent device equipped with a speaker playback function, an installation position that avoids a place where a television receiver or another speaker is located,
The information processing device according to (4).
(4-2) The processing unit proposes an installation position at the center of the room to the agent device equipped with a omnidirectional or wide-angle camera photographing function,
The information processing device according to (4).
(4-3) The processing unit proposes an installation position near a white flat wall with respect to the agent device equipped with a projector display function,
The information processing device according to (4).
(5) when there are a plurality of candidate positions for installing the agent device, the processing unit generates the second information in which a priority is assigned to each candidate position;
The information processing device according to any one of (1) to (4).
(6) The processing unit proposes an installation position of the agent device in consideration of movement of furniture in a room where the agent device is installed.
The information processing device according to any one of (1) to (5).
(7) The processing unit proposes an installation position of the agent device in consideration of movement of furniture in the room based on a movement cost for each furniture.
The information processing device according to (6).
(8) The processing unit generates the second information including a composite image of a room where the agent device is installed by moving furniture.
The information processing device according to (7).
(9) The processing unit proposes an installation position of the agent device in consideration of an installation cost based on a distance from an outlet,
The information processing device according to any one of (1) to (9).
(10) a transmitting unit that transmits first information including information on an environment in which the agent device is installed;
A receiving unit that receives second information regarding the installation position of the agent device proposed based on the first information;
A presentation unit for presenting the second information;
An information processing apparatus comprising:
(11) further including a photographing unit,
The transmission unit transmits the first information including an image of the room in which the agent device is installed by the imaging unit,
The receiving unit receives the second information including an image in which the agent device is superimposed on a suggested installation position in the captured image,
The presentation unit displays the image in which the agent device is superimposed on a proposed installation position,
The information processing device according to (10).
(12) The second information includes a plurality of candidate positions for installing the agent device, and a priority assigned to each candidate position,
The presentation unit presents each priority of the plurality of candidate positions,
The information processing device according to any one of (10) and (11).
(13) an information encryption unit for encrypting predetermined information included in the first information;
An information decryption unit that recovers based on the encrypted predetermined information included in the second information;
Further comprising,
The information processing apparatus according to any one of (10) to (12).
(14) the information encryption unit performs predetermined image processing on a human face area in an image included in the first information;
The information processing device according to (13).
(15) The information encryption unit replaces an image included in the first information with a generalized or generalized image.
The information processing device according to (13).
(16) The information encryption unit replaces an image included in the first information with an illustration drawn by a user.
The information processing device according to (13).
(17) The information encryption unit replaces an image included in the first information with position information of each object reflected in the image.
The information processing device according to (13).
(18) further comprising a sensor unit,
The transmitting unit transmits the first information in response to the sensor unit detecting a change in the environment in which an agent device is installed,
The information processing apparatus according to any one of (10) to (17).
(19) a transmitting step of transmitting first information including information on an environment in which the agent device is installed;
A receiving step of receiving second information on an installation position of the agent device proposed based on the first information;
A presentation step of presenting the second information;
An information processing method comprising:
(20) transmitting first information including information on an environment in which an agent device is to be installed, and receiving second information about an installation position of the agent device proposed based on the first information, A first device for presenting second information;
A second device that receives the first information from the first device and returns the second information;
An information processing system comprising:
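Clauses (10) to (12), (19), and (20) above describe an exchange in which a companion device sends first information (the installation environment, typically a room photo) and receives back second information proposing where to place the agent device. The following Python sketch illustrates roughly what that exchange could look like; the endpoint URL, JSON field names, and data layout are assumptions made for illustration and are not taken from this publication.

```python
# Hypothetical companion-device side of the exchange in clauses (10)-(12), (19), (20).
# The endpoint URL, JSON fields, and dataclass fields below are illustrative assumptions.
import base64
import json
import urllib.request
from dataclasses import dataclass
from typing import List


@dataclass
class CandidatePosition:
    label: str       # e.g. "shelf next to the TV" (hypothetical)
    x: float         # normalized position within the captured room image
    y: float
    priority: int    # 1 = most recommended (cf. clause (12))


def request_placement(image_path: str, agent_model: str,
                      url: str = "https://example.invalid/placement") -> List[CandidatePosition]:
    """Send the room photo and agent device model; return prioritized candidates."""
    with open(image_path, "rb") as f:
        room_image_b64 = base64.b64encode(f.read()).decode("ascii")

    # "First information": the environment in which the agent device is to be installed.
    first_information = {"agent_model": agent_model, "room_image": room_image_b64}

    req = urllib.request.Request(
        url,
        data=json.dumps(first_information).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # "Second information": proposed installation positions with priorities.
        second_information = json.load(resp)

    candidates = [CandidatePosition(**c) for c in second_information["candidates"]]
    return sorted(candidates, key=lambda c: c.priority)
```

A presentation step (clause (11)) would then overlay the agent device at the highest-priority position in the captured image before showing it to the user.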
 200 ... Placement recommendation system
 210 ... Agent device, 211 ... Control unit, 212 ... Expression unit
 213 ... Sensor unit, 214 ... Communication unit
 220 ... Companion device, 221 ... Control unit, 222 ... Expression unit
 223 ... Photographing unit, 224 ... Communication unit
 225 ... Information encryption unit, 226 ... Information decryption unit
 230 ... Computing device, 231 ... Control unit, 232 ... Communication unit
 233 ... Agent device table, 234 ... Furniture information table
 2801 ... 6-axis sensor, 2802 ... Distance sensor, 2803 ... Human presence sensor
 2811 ... Self-orientation detection unit, 2812 ... Wall distance estimation unit
 2813 ... Installation position evaluation unit
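The information encryption unit (225) and information decryption unit (226) listed above correspond to the anonymization described in clauses (13) to (17), for example processing human face regions before the room image leaves the device. A minimal sketch of that step using OpenCV is shown below; the cascade file and blur parameters are ordinary defaults chosen for illustration, not values specified in this publication.

```python
# Hypothetical anonymization step (cf. clauses (13)-(14)): blur detected face
# regions in the room photo before it is transmitted as part of the first information.
import cv2


def blur_faces(image_path: str, out_path: str) -> None:
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    for (x, y, w, h) in cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5):
        roi = img[y:y + h, x:x + w]
        # Replace each face region with a heavily blurred copy (kernel size must be odd).
        img[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 0)
    cv2.imwrite(out_path, img)
```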

Claims (20)

  1. An information processing apparatus comprising:
     a receiving unit that receives first information including information on an environment in which an agent device is to be installed;
     a processing unit that generates, based on the first information, second information proposing an installation position of the agent device in the environment; and
     a transmitting unit that returns the second information.
  2. The information processing apparatus according to claim 1, wherein
     the first information includes a captured image of a room in which the agent device is to be installed, and
     the second information includes an image in which the agent device is superimposed at a proposed installation position in the captured image.
  3. The information processing apparatus according to claim 2, wherein the processing unit proposes the installation position of the agent device based on a result of object recognition performed on the captured image and on functions or specifications of the agent device.
  4. The information processing apparatus according to claim 2, wherein the processing unit proposes the installation position of the agent device by comparing a result of object recognition performed on the captured image with places for which the agent device is suited and with equipment or places to be avoided.
  5. The information processing apparatus according to claim 1, wherein, when there are a plurality of candidate positions at which the agent device can be installed, the processing unit generates the second information with a priority assigned to each candidate position.
  6. The information processing apparatus according to claim 1, wherein the processing unit proposes the installation position of the agent device also in consideration of moving furniture in the room in which the agent device is to be installed.
  7. The information processing apparatus according to claim 6, wherein the processing unit proposes the installation position of the agent device in consideration of moving furniture in the room based on a movement cost of each piece of furniture.
  8. The information processing apparatus according to claim 7, wherein the processing unit generates the second information including a composite image of the room in which the agent device is installed after the furniture has been moved.
  9. The information processing apparatus according to claim 1, wherein the processing unit proposes the installation position of the agent device also in consideration of an installation cost based on a distance from a power outlet.
  10. An information processing apparatus comprising:
      a transmitting unit that transmits first information including information on an environment in which an agent device is to be installed;
      a receiving unit that receives second information regarding an installation position of the agent device proposed based on the first information; and
      a presentation unit that presents the second information.
  11. The information processing apparatus according to claim 10, further comprising a photographing unit, wherein
      the transmitting unit transmits the first information including an image of the room in which the agent device is to be installed, captured by the photographing unit,
      the receiving unit receives the second information including an image in which the agent device is superimposed at a proposed installation position in the captured image, and
      the presentation unit displays the image in which the agent device is superimposed at the proposed installation position.
  12. The information processing apparatus according to claim 10, wherein
      the second information includes a plurality of candidate positions at which the agent device can be installed and a priority assigned to each candidate position, and
      the presentation unit presents the priority of each of the plurality of candidate positions.
  13. The information processing apparatus according to claim 10, further comprising:
      an information encryption unit that encrypts predetermined information included in the first information; and
      an information decryption unit that restores information based on the encrypted predetermined information included in the second information.
  14. The information processing apparatus according to claim 13, wherein the information encryption unit performs predetermined image processing on a human face region in an image included in the first information.
  15. The information processing apparatus according to claim 13, wherein the information encryption unit replaces an image included in the first information with a generalized or generic image.
  16. The information processing apparatus according to claim 13, wherein the information encryption unit replaces an image included in the first information with an illustration drawn by a user.
  17. The information processing apparatus according to claim 13, wherein the information encryption unit replaces an image included in the first information with position information of each object appearing in the image.
  18. The information processing apparatus according to claim 10, further comprising a sensor unit, wherein the transmitting unit transmits the first information in response to the sensor unit detecting a change in the environment in which the agent device is installed.
  19. An information processing method comprising:
      a transmitting step of transmitting first information including information on an environment in which an agent device is to be installed;
      a receiving step of receiving second information regarding an installation position of the agent device proposed based on the first information; and
      a presentation step of presenting the second information.
  20. An information processing system comprising:
      a first device that transmits first information including information on an environment in which an agent device is to be installed, receives second information regarding an installation position of the agent device proposed based on the first information, and presents the second information; and
      a second device that receives the first information from the first device and returns the second information.
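Claims 3 to 9 describe factors the computing device could weigh when ranking candidate installation positions: objects recognized in the captured image compared against places the agent device suits and equipment or places it should avoid, the cost of moving furniture, and the distance to a power outlet. The following sketch shows one plausible scoring scheme; the tables, weights, and object names are illustrative assumptions only, as the publication does not specify concrete values.

```python
# Hypothetical candidate-position scoring (cf. claims 3-9). All tables, weights,
# and object names below are assumptions for illustration.
from dataclasses import dataclass, field
from typing import List

# Cf. the agent device table (233): assumed suitability lists for one device model.
SUITABLE_NEARBY = {"tv", "sofa"}
AVOID_NEARBY = {"sink", "stove", "speaker"}

# Cf. the furniture information table (234): assumed movement cost per piece.
MOVE_COST = {"chair": 1.0, "small table": 3.0, "bookshelf": 8.0}


@dataclass
class Candidate:
    name: str
    nearby_objects: List[str]
    distance_to_outlet_m: float
    furniture_to_move: List[str] = field(default_factory=list)


def score(c: Candidate) -> float:
    s = 2.0 * sum(o in SUITABLE_NEARBY for o in c.nearby_objects)
    s -= 3.0 * sum(o in AVOID_NEARBY for o in c.nearby_objects)
    s -= 0.5 * c.distance_to_outlet_m                             # installation cost (claim 9)
    s -= sum(MOVE_COST.get(f, 2.0) for f in c.furniture_to_move)  # movement cost (claim 7)
    return s


def rank(candidates: List[Candidate]) -> List[Candidate]:
    """Highest score first; the resulting order maps to the priorities of claim 5."""
    return sorted(candidates, key=score, reverse=True)
```

The composite image of claim 8 (the room rendered with furniture moved and the agent device in place) would be generated separately from this ranking and returned as part of the second information.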
PCT/JP2019/015868 2018-06-25 2019-04-11 Information processing device, information processing method, and information processing system WO2020003695A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018120135 2018-06-25
JP2018-120135 2018-06-25

Publications (1)

Publication Number Publication Date
WO2020003695A1 (en) 2020-01-02

Family

ID=68987020

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/015868 WO2020003695A1 (en) 2018-06-25 2019-04-11 Information processing device, information processing method, and information processing system

Country Status (1)

Country Link
WO (1) WO2020003695A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012205201A (en) * 2011-03-28 2012-10-22 Brother Ind Ltd Remote conference apparatus
JP2016057551A * 2014-09-12 2016-04-21 Canon Inc. Projector system
CN107527615A * 2017-09-13 2017-12-29 Lenovo (Beijing) Co., Ltd. Information processing method, device, equipment, system and server

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022194627A1 (en) * 2021-03-16 2022-09-22 Signify Holding B.V. Systems and methods to detect airflow patterns using lighting embedded sensors
JP7485457B1 (en) 2023-03-16 2024-05-16 Necプラットフォームズ株式会社 Server management system, server management method, and program

Similar Documents

Publication Publication Date Title
US10834359B2 (en) Information processing apparatus, information processing method, and program
US11762620B2 (en) Accessing functions of external devices using reality interfaces
US11343633B2 (en) Environmental condition based spatial audio presentation
US9854206B1 (en) Privacy-aware indoor drone exploration and communication framework
CN112166350B (en) System and method for ultrasonic sensing in smart devices
CN109407821B (en) Collaborative interaction with virtual reality video
JP6481210B2 (en) Information processing apparatus, control method, and program
JP2010529738A (en) Home video communication system
JP6517255B2 (en) Character image generation apparatus, character image generation method, program, recording medium, and character image generation system
JP2015507394A (en) Spatial bookmark
JP2006510081A (en) Method and apparatus for correcting head posture in videophone image
JP2005323139A (en) Conference recording device, conference recording method, designing method and program
TW202112145A (en) Determination of an acoustic filter for incorporating local effects of room modes
JP2022057771A (en) Communication management device, image communication system, communication management method, and program
KR102638946B1 (en) Information processing device and information processing method, and information processing system
WO2020003695A1 (en) Information processing device, information processing method, and information processing system
EP2830309A1 (en) Information processing apparatus, information processing method, and program
US20230188921A1 (en) Audio system with dynamic target listening spot and ambient object interference cancelation
WO2020213170A1 (en) Information display device and activity plan display system
WO2019044100A1 (en) Information processing apparatus, information processing method, and program
US20230199422A1 (en) Audio system with dynamic target listening spot and ambient object interference cancelation
US20230188923A1 (en) Audio system with dynamic target listening spot and ambient object interference cancelation
US20230188922A1 (en) Audio system with dynamic target listening spot and ambient object interference cancelation
WO2019239902A1 (en) Information processing device, information processing method and program
TW202324373A (en) Audio system with dynamic target listening spot and ambient object interference cancelation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19827445

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19827445

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP