WO2019138682A1 - Information processing device, information processing method, and program - Google Patents


Info

Publication number
WO2019138682A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
space
user
display device
image
Prior art date
Application number
PCT/JP2018/042046
Other languages
French (fr)
Japanese (ja)
Inventor
健一 瀬田
誠也 一森
礼史 後藤
伸也 鈴木
賢司 久永
拓也 生江
布施 博明
Original Assignee
ソニー株式会社 (Sony Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ソニー株式会社 (Sony Corporation)
Publication of WO2019138682A1 publication Critical patent/WO2019138682A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/4728 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for selecting a Region Of Interest [ROI], e.g. for requesting a higher resolution version of a selected region

Definitions

  • the present disclosure relates to an information processing device, an information processing method, and a program.
  • A display method using a stereoscopic display employing a parallax barrier method, a lenticular method, or the like has been proposed as one method for displaying different images according to the position from which the user views the display.
  • With these display methods, although different images can be displayed on one display, the user's visible range is limited.
  • Patent Document 1 discloses an information display device in which, after the brightness of the image is adjusted, different images can be simultaneously provided to a plurality of users by having each user wear glasses with a polarizing filter.
  • With the device of Patent Document 1, however, the users can merely view different videos; even when a plurality of users are present in a certain place, information on a remote place cannot be provided while being varied for each user.
  • In view of this, the present disclosure proposes an information processing method, an information processing system, an information processing apparatus, and a program capable of providing remote-place information adapted to the state of each user even when a plurality of users exist in a certain place.
  • According to the present disclosure, there is provided an information processing apparatus including: an information acquisition unit that acquires the user's position information and viewpoint information in a first space; an image acquisition unit that acquires a captured image of a second space in a predetermined direction based on the position information and the viewpoint information; and a provision control unit that controls space environment information provided to the user in the first space based on the captured image of the second space.
  • According to the present disclosure, there is also provided an information processing method including: acquiring the user's position information and viewpoint information in a first space; acquiring a captured image of a second space in a predetermined direction based on the position information and the viewpoint information; and controlling space environment information provided to the user in the first space based on the captured image of the second space.
  • Furthermore, according to the present disclosure, there is provided a program causing a computer to function as: an information acquisition unit that acquires the user's position information and viewpoint information in a first space; an image acquisition unit that acquires a captured image of a second space in a predetermined direction based on the position information and the viewpoint information; and a provision control unit that controls space environment information provided to the user in the first space based on the captured image of the second space.
  • As described above, according to the present disclosure, space environment information including image information of another base, acquired according to the user's position information and viewpoint information, is output to a predetermined user.
  • FIG. 1 is a block diagram showing an example of the configuration of the information processing device 3 according to an embodiment of the present disclosure. FIG. 2 is a block diagram showing an example of the configuration of the information processing system 1 according to the embodiment. FIG. 3 is a block diagram showing an example of the configuration of the imaging unit 10 according to the embodiment. FIG. 4 is a block diagram showing an example of the configuration of the control device 20 according to the embodiment. FIG. 5 is a block diagram showing an example of the configuration of the first display device 30 according to the embodiment. FIG. 6 is a flowchart showing an example of the flow of operation according to the embodiment.
  • In the present specification and drawings, a plurality of components having substantially the same functional configuration may be distinguished by attaching different letters to the same reference numeral.
  • When it is not necessary to distinguish each of a plurality of components having substantially the same functional configuration, only the same reference numeral is given.
  • In order to mutually display the situation at each point, a network communication facility, an imaging device (camera), and a display device (display) are used.
  • These configurations are used in so-called video chat and video conferencing, but methods using these configurations have, for example, the following problems.
  • Since the information displayed on the display is flat, it may appear unnatural depending on the position and angle from which the user views the display.
  • In a situation where the user is basically always directly facing the display, such as video chat using a personal computer (PC), this appearance is not a problem; however, with a large display fixed to a wall, for example, the unnatural appearance becomes noticeable.
  • As a result of earnest examination of the situation described above, the present inventors arrived at the information processing technology according to an embodiment of the present disclosure.
  • According to the present embodiment, described in detail below, it is possible to image a plurality of bases and provide optimal video information and the like according to the position and viewpoint of each user; remote places can thereby be connected in a natural state, as if they were directly joined, realizing smoother and more natural communication.
  • Hereinafter, the configuration and operation of the information processing technique according to an embodiment of the present disclosure that exhibits such effects will be described in order.
  • FIG. 1 is a block diagram showing a configuration example of an information processing device 3 according to an embodiment of the present disclosure.
  • FIG. 2 is a block diagram showing a configuration example of the information processing system 1 according to the embodiment.
  • the information processing apparatus 3 includes an information acquisition unit 5, an image acquisition unit 7, and a provision control unit 9.
  • the information acquisition unit 5 has a function of acquiring user's position information and viewpoint information.
  • the image acquisition unit 7 has a function of acquiring a captured image of the second space in a predetermined direction based on the position information and the viewpoint information acquired by the information acquisition unit 5.
  • the provision control unit 9 has a function of controlling space environment information to be provided to the user present in the first space based on the captured image of the second space acquired by the image acquisition unit 7.
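  • As a purely illustrative aid, the hand-off among these three units can be sketched in Python as follows. Every class, method, and field name here (UserState, get_user_state, and so on) is a hypothetical stand-in for the roles of the units 5, 7, and 9 described above, not an interface defined by the present disclosure.

```python
from dataclasses import dataclass

@dataclass
class UserState:
    position: tuple   # user position in the first space (x, y, z)
    viewpoint: tuple  # gaze-direction vector of the user

class InformationAcquisitionUnit:
    """Corresponds to unit 5: acquires position and viewpoint information."""
    def get_user_state(self, sensor_data) -> UserState:
        # In practice this is derived from cameras and/or HMD sensors.
        return UserState(sensor_data["position"], sensor_data["viewpoint"])

class ImageAcquisitionUnit:
    """Corresponds to unit 7: acquires a captured image of the second space
    in a direction determined by the user's position and viewpoint."""
    def __init__(self, remote_camera):
        self.remote_camera = remote_camera  # hypothetical camera object
    def capture(self, state: UserState):
        return self.remote_camera.capture(direction=state.viewpoint,
                                          origin=state.position)

class ProvisionControlUnit:
    """Corresponds to unit 9: decides what space environment information
    is provided to the user in the first space."""
    def provide(self, captured_image, displays):
        for display in displays:
            display.show(captured_image)

def update(info_unit, image_unit, provision_unit, sensor_data, displays):
    """One update cycle of the information processing device 3 (illustrative)."""
    state = info_unit.get_user_state(sensor_data)
    image = image_unit.capture(state)
    provision_unit.provide(image, displays)
```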
  • As shown in FIG. 2, the information processing system 1 includes, at each of a plurality of bases, an information processing apparatus 3 that includes at least an imaging unit 10 and a control device 20 and, as necessary, further includes at least one of the first display device 30 and the second display device 40; the plurality of information processing apparatuses 3 are connected to one another via a known network 50 such as the Internet.
  • the imaging unit 10, the first display device 30 and the second display device 40 are connected to the control device 20 by wire or wirelessly.
  • Although FIG. 2 illustrates the case where the information processing devices 3 provided at two points, namely the base A and the base B, are connected to each other, the number of information processing devices 3 connected to each other via the network 50 is not limited to the example illustrated in FIG. 2; three or more information processing apparatuses 3 may be connected to one another via the network 50.
  • The imaging unit 10 is installed at each of the base A as the first space and the base B as the second space, and has a function of imaging at least a part of the base where it is installed and generating a captured image.
  • The image information on the captured image generated by the imaging unit 10 is one piece of space environment information, which is information on the environment of the space where the unit is installed.
  • The imaging unit 10 may also have a function of acquiring audio in the space where it is installed and generating audio information on the acquired audio. Such audio information is also one piece of space environment information.
  • the imaging unit 10 may further have a function of analyzing the space environment information as described above.
  • The above-mentioned space environment information is not limited to the image information and the voice information acquired by the imaging unit 10; it may also include various types of information, corresponding to the human senses, related to each base and the various objects existing there (including the users present at each base), such as various character information, information on vibrations generated at each base, and information on odors obtained at each base.
  • The control device 20 is a device that integrally controls the overall functions of the information processing device 3 according to the present embodiment. It has an analysis function of analyzing the space environment information described above, a function of controlling the imaging processing and sound collection processing performed by imaging devices having an imaging function installed at other sites, and a function of controlling the display of the above-described space environment information on the first display device 30 and the second display device 40.
  • the imaging device installed at another site may be, for example, a known imaging device such as a digital still camera or a digital video camera, and may be used as the imaging unit 10.
  • the control device 20 may also have a function of controlling various operations in the imaging unit 10.
  • the image acquisition unit 7 and the provision control unit 9 in FIG. 1 correspond to the control device 20.
  • The information acquisition unit 5 in FIG. 1 may be included in the imaging unit 10 shown in FIG. 2, may be included in the control device 20, or may be provided in a distributed manner across the imaging unit 10 and the control device 20.
  • The first display device 30 is, for example, a glasses-type display, a head-mounted display (HMD), or the like that the user wears and uses, and has a function of outputting to the user the various space environment information transmitted from the control device 20.
  • The first display device 30 preferably has a function of acquiring viewpoint information on the viewpoint of the wearing user and position information on the position of the wearing user by using various sensors, such as an acceleration sensor or a gyro sensor, mounted on the first display device 30.
  • the second display device 40 has a function of outputting various kinds of space environment information transmitted from the control device 20 to the user.
  • Unlike the first display device 30, the second display device 40 is used while fixed at a predetermined position of the base, such as on a wall surface existing at the base.
  • the second display device 40 may be various displays fixed to the wall surfaces existing at each site, or the wall surfaces themselves existing at each site may function as a display.
  • The user present at each site refers to the space environment information of the other site displayed on at least one of the first display device 30 and the second display device 40, and thereby recognizes the space environment information of the other site. At this time, each user present at each site can obtain a feeling as if the site were connected to the other site via the second display device 40 fixed at a certain position in the site.
  • the space environment information of the other site to be output may not necessarily be output to both the first display device 30 and the second display device 40. That is, space environment information of another site may be output only to the first display device 30 or may be output only to the second display device 40.
  • the network 50 is realized using a known information communication technology such as the Internet, for example, and mutually connects the control device 20A provided at the site A and the control device 20B provided at the site B.
  • FIG. 3 is a block diagram showing a configuration example of the imaging unit 10 according to the present embodiment.
  • The imaging unit 10 is realized by, for example, a known imaging device, such as a digital still camera or a digital video camera, that has at least an imaging function and preferably further has a sound collecting function. The imaging unit 10 according to the present embodiment includes, for example, an image acquisition unit 101, a sound acquisition unit 103, a space environment information analysis unit 105, a communication control unit 107, and a storage unit 109.
  • the image acquisition unit 101 has a function of imaging at least a part of the installed base and acquiring various types of image information related to the image, including entity data of the imaged image.
  • the acquired various image information is transmitted to at least one of the space environment information analysis unit 105 and the communication control unit 107.
  • The image acquisition unit 101 acquires, at any time, various captured images used to extract position information related to the user's position and viewpoint information related to the user's viewpoint, various captured images to be displayed at other sites, and the like.
  • The image acquisition unit 101 is preferably configured with a plurality of cameras so that a plurality of positions of the base where it is installed can be imaged simultaneously. With such a configuration, even when a plurality of users are present at different positions of the same site, images for extracting the position information and viewpoint information of each of the plurality of users and images to be displayed at other sites can be acquired simultaneously.
  • The voice acquisition unit 103 has a function of collecting sound from at least a part of the base where it is installed and acquiring various voice information, including entity data of the acquired voice.
  • the acquired voice is transmitted to at least one of the space environment information analysis unit 105 and the communication control unit 107.
  • the voice acquisition unit 103 is preferably configured by a plurality of microphones so as to simultaneously collect voice information of a plurality of positions of the installed base.
  • In addition, the imaging unit 10 may be provided with functions for acquiring the other various types of information that may be included in the space environment information.
  • the space environment information analysis unit 105 is a processing unit that realizes the function of the information acquisition unit 5, and acquires position information and viewpoint information of the user.
  • Using the image information transmitted from the image acquisition unit 101, the sound information transmitted from the sound acquisition unit 103, and the like, the space environment information analysis unit 105 analyzes the space environment information, that is, the information related to the space environment of the base where the imaging unit 10 is installed, by means of various known analysis and recognition techniques. The space environment information analysis unit 105 thereby generates various secondary information including position information and viewpoint information.
  • the space environment information analysis unit 105 can perform the above-described analysis processing for any existing users.
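  • As one hedged illustration of how such known analysis and recognition techniques might be applied per user, the sketch below derives rough position and viewpoint estimates for every person detected in a camera frame. The helpers detect_users, estimate_head_pose, and camera_model are assumed stand-ins for whatever detector, pose estimator, and calibrated camera an actual implementation uses.

```python
from dataclasses import dataclass

@dataclass
class SecondaryInfo:
    user_id: int
    position: tuple   # estimated (x, y, z) in the base's coordinate frame
    viewpoint: tuple  # estimated gaze-direction vector

def analyze_frame(frame, detect_users, estimate_head_pose, camera_model):
    """Generate secondary information for every user visible in `frame`.

    `detect_users` and `estimate_head_pose` are hypothetical stand-ins for
    known person-detection and head-pose-estimation techniques."""
    results = []
    for user_id, bbox in enumerate(detect_users(frame)):
        # Back-project the detection into the room's coordinate frame
        # (requires calibrated camera intrinsics/extrinsics).
        position = camera_model.back_project(bbox)
        # Head pose serves as a proxy for the direction of the viewpoint.
        viewpoint = estimate_head_pose(frame, bbox)
        results.append(SecondaryInfo(user_id, position, viewpoint))
    return results
```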
  • A function similar to that of the space environment information analysis unit 105 may be included in the control device 20 described later; when the control device 20 has a function equivalent to that of the space environment information analysis unit 105, the imaging unit 10 need not have the space environment information analysis unit 105.
  • the function of the space environment information analysis unit 105 as described above may be distributed and implemented in the imaging unit 10 and the control device 20.
  • Information on the analysis result (that is, various types of generated secondary information) by the space environment information analysis unit 105 is transmitted to the control device 20 described later via the communication control unit 107.
  • the above analysis method is merely an example, and various analysis methods implemented by the space environment information analysis unit 105 according to the present embodiment are not limited to the above.
  • The communication control unit 107 has a communication control function for transmitting to the control device 20 the image information from the image acquisition unit 101, the voice information from the voice acquisition unit 103, the various secondary information from the space environment information analysis unit 105, and the like. The communication control unit 107 also has a function of receiving the various control instruction information transmitted from the control device 20. Based on the acquired control instruction information, the imaging processing by the image acquisition unit 101 and the audio acquisition processing by the audio acquisition unit 103 are controlled to a desired state; for example, the imaging range and the audio acquisition range are set to the desired states.
  • the storage unit 109 is an example of a storage device provided in the imaging unit 10.
  • the storage unit 109 appropriately stores various programs, databases, and the like that are used when the imaging unit 10 according to the present embodiment performs various processes as described above.
  • the storage unit 109 may also record various types of information acquired by the image acquisition unit 101, the audio acquisition unit 103, and the like as described above as history information. Furthermore, various parameters that need to be stored when the imaging unit 10 according to the present embodiment performs some processing, progress of processing, and the like may be appropriately recorded in the storage unit 109.
  • The image acquisition unit 101, the voice acquisition unit 103, the space environment information analysis unit 105, the communication control unit 107, and the like can freely read from and write to the storage unit 109.
  • Each component described above may be configured using a general-purpose member or circuit, or may be configured by hardware specialized for the function of each component. Further, all functions of each component may be performed by a CPU or the like. Therefore, it is possible to change the configuration to be used as appropriate according to the technical level at which the present embodiment is implemented.
  • FIG. 4 is a block diagram showing a configuration example of the control device 20 according to the present embodiment.
  • the control device 20 includes a communication control unit 201, a space environment information analysis unit 203, an output control unit 205, an imaging unit control unit 207, and a storage unit 209.
  • The communication control unit 201 has a function of controlling the communication processing performed with other devices. Specifically, the communication control unit 201 controls the communication processing performed among the imaging unit 10, the first display device 30, and the second display device 40, as well as the communication processing performed, via the network 50, with the control device 20 provided at another location. For example, the communication control unit 201 acquires the space environment information, such as image information and audio information, transmitted from the imaging unit 10, and the various secondary information generated by the imaging unit 10, and transmits them to the space environment information analysis unit 203. Further, the communication control unit 201 transmits space environment information to the first display device 30 or the second display device 40 based on the control instruction information transmitted from the output control unit 205, so that the desired space environment information is displayed on the first display device 30 or the second display device 40.
  • The space environment information analysis unit 203 is a processing unit that has the same function as the space environment information analysis unit 105 of the imaging unit 10 and realizes the function of the information acquisition unit 5; it acquires the user's position information and viewpoint information. Specifically, the space environment information analysis unit 203 analyzes the space environment information, such as image information and voice information, transmitted from the imaging unit 10 using various known analysis and recognition techniques. The space environment information analysis unit 203 thereby generates various secondary information including the user's position information and viewpoint information.
  • the space environment information analysis unit 203 can perform the above-described analysis processing for any existing users.
  • The space environment information analysis unit 203 need not have the function of generating all the various types of secondary information described above, and may have only the function of generating the position information and viewpoint information of the user A. As mentioned earlier, the space environment information analysis unit 203 need not necessarily be provided in the control device 20 as long as the imaging unit 10 is provided with the space environment information analysis unit 105. In addition, the function of the space environment information analysis unit may be implemented in a distributed manner across the imaging unit 10 and the control device 20.
  • the space environment information is transmitted to the output control unit 205 together with information on the analysis result (that is, various generated secondary information) by the space environment information analysis unit 203 as needed.
  • the output control unit 205 causes the space environment information transmitted from the control device 20 provided at another site to be displayed on at least one of the first display device 30 or the second display device 40 in the site.
  • For example, the output control unit 205A of the control device 20A provided at the site A can cause the space environment information of the site B, transmitted from the control device 20B provided at the site B, to be output only to the first display device 30A of the site A, or only to the second display device 40A of the site A.
  • the output control unit 205A can also output space environment information to both the first display device 30A and the second display device 40A.
  • Furthermore, the output control unit 205 can divide the space environment information to be output between the first display device 30A and the second display device 40A. At this time, the output control unit 205 can, for example, cause space environment information to be displayed on the first display device 30A so as to be superimposed on the output content (in other words, the part of the space environment information) displayed on the display screen of the second display device 40A.
  • The following method can be used, for example, as the superimposed display method for causing the first display device 30A to display space environment information superimposed on the output content of the second display device 40A. First, based on the position information and viewpoint information of the user present at the site A, the imaging unit 10B of the site B captures the space of the site B that matches the position of the user of the site A and the direction of the user's viewpoint, and the control device 20B generates space environment information including the captured image information. Next, the output control unit 205A of the control device 20A of the base A uses the space environment information transmitted from the base B to specify the position on the display screen of the second display device 40A that matches the position of the user of the base A and the direction of the user's viewpoint. Then, the output control unit 205A displays the space environment information at the corresponding position of the first display device 30A so that it is superimposed on the specified position of the display screen of the second display device 40A.
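  • For illustration, the position matching the user's position and viewpoint direction can be specified with ordinary ray-plane geometry. The following sketch (using numpy; all names are illustrative, not taken from the present disclosure) intersects the user's gaze ray with the plane of the second display device and converts the hit point to on-screen coordinates, at which the first display device would superimpose its image.

```python
import numpy as np

def gaze_point_on_display(user_pos, gaze_dir, display_origin, display_normal,
                          display_u, display_v):
    """Intersect the user's gaze ray with the display plane.

    display_origin: a corner of the second display device (world coordinates)
    display_normal: unit normal vector of the display plane
    display_u, display_v: unit vectors along the screen's width and height
    Returns (u, v) coordinates on the screen in metres, or None if the
    user is not looking toward the display."""
    user_pos = np.asarray(user_pos, float)
    gaze_dir = np.asarray(gaze_dir, float)
    denom = np.dot(gaze_dir, display_normal)
    if abs(denom) < 1e-9:
        return None  # gaze is parallel to the display plane
    t = np.dot(np.asarray(display_origin, float) - user_pos, display_normal) / denom
    if t <= 0:
        return None  # the display is behind the user
    hit = user_pos + t * gaze_dir
    rel = hit - np.asarray(display_origin, float)
    return float(np.dot(rel, display_u)), float(np.dot(rel, display_v))
```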
  • The imaging unit control unit 207 controls the imaging processing and sound collection processing performed by the imaging unit 10 installed at the same site as the control device 20 and by imaging apparatuses having an imaging function installed at other sites.
  • For example, the imaging unit control unit 207A of the control device 20A provided at the base A can control, according to the position information and viewpoint information acquired by the imaging unit 10A or the control device 20A, an imaging device installed at the base B that has an imaging function for acquiring images or sound of the base B. Thereby, images and sound of the base B at the position corresponding to the position information and viewpoint information acquired by the imaging unit 10A or the control device 20A are acquired.
  • Specifically, the imaging unit control unit 207A can control the acquisition position of the images, sound, and the like acquired by the imaging unit 10A according to the relative angle of the user A viewed from the second display device 40A, that is, the direction of the user A as seen from the second display device 40A.
  • The apparatus for acquiring the images and sound of the site B may be, for example, a known imaging apparatus, such as a digital still camera or a digital video camera, that has at least an imaging function and preferably further has a sound collecting function, and it may be the imaging unit 10B installed at the base B.
  • Furthermore, the imaging unit control unit 207A of the control device 20A provided at the site A can control the acquisition position of the images, sound, and the like acquired by the imaging unit 10A based on the position information and viewpoint information of the user B transmitted from the control device 20B of the site B.
  • Specifically, the imaging unit control unit 207A can control the acquisition position of the images, sound, and the like acquired by the imaging unit 10A according to the relative angle of the user B viewed from the second display device 40B, that is, the direction of the user B as seen from the second display device 40B.
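  • A minimal sketch of the "relative angle of the user viewed from the display" computation is given below; the camera.pan call is a hypothetical stand-in for however the imaging unit actually changes its acquisition position.

```python
import math

def relative_angle_from_display(display_pos, display_normal, user_pos):
    """Signed horizontal angle of the user as seen from the display (radians).

    0 means the user stands straight in front of the display; positive or
    negative values mean the user is off to one side."""
    dx = user_pos[0] - display_pos[0]
    dz = user_pos[2] - display_pos[2]
    user_bearing = math.atan2(dx, dz)
    display_bearing = math.atan2(display_normal[0], display_normal[2])
    return user_bearing - display_bearing

def steer_remote_camera(camera, display_pos, display_normal, user_pos):
    # Mirror the user's offset: capture the remote space from the matching
    # direction so the displayed image looks correct from where the user stands.
    angle = relative_angle_from_display(display_pos, display_normal, user_pos)
    camera.pan(angle)  # hypothetical camera-control API
```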
  • the storage unit 209 is an example of a storage device provided in the control device 20.
  • the storage unit 209 appropriately records various programs, databases, and the like that are used when the control device 20 according to the present embodiment performs the various processes as described above. Further, various information generated by the imaging unit 10, various secondary information generated by the space environment information analysis unit 203, and the like may be recorded in the storage unit 209 as history information. Furthermore, various parameters that need to be stored when the control device 20 according to the present embodiment performs some processing, progress of processing, and the like may be appropriately recorded in the storage unit 209.
  • The communication control unit 201, the space environment information analysis unit 203, the output control unit 205, the imaging unit control unit 207, and the like can freely read from and write to the storage unit 209.
  • Each component described above may be configured using a general-purpose member or circuit, or may be configured by hardware specialized for the function of each component. Further, all functions of each component may be performed by a CPU or the like. Therefore, it is possible to change the configuration to be used as appropriate according to the technical level at which the present embodiment is implemented.
  • a computer program for realizing each function of the control device 20 according to the present embodiment as described above can be prepared and implemented on a personal computer or the like.
  • a computer readable recording medium in which such a computer program is stored can be provided.
  • the recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory or the like.
  • the above computer program may be distributed via, for example, a network without using a recording medium.
  • The first display device 30 is a display device worn and used by the user; examples of such a first display device 30 include wearable displays such as a glasses-type display and a head-mounted display. Various sensors, such as a camera for photographing the user, an acceleration sensor, and a gyro sensor, are preferably mounted on these wearable displays.
  • FIG. 5 is a block diagram showing a configuration example of the first display device 30 according to the present embodiment.
  • As shown in FIG. 5, the first display device 30 includes a viewpoint information acquisition unit 301, a position information acquisition unit 303, a voice acquisition unit 305, an image output unit 307, a voice output unit 309, a communication control unit 311, and a storage unit 313.
  • the viewpoint information acquisition unit 301 acquires viewpoint information related to the viewpoint of the user wearing the first display device 30.
  • the viewpoint information acquired by the viewpoint information acquisition unit 301 is output to the communication control unit 311 and transmitted to the control device 20 present in the same base.
  • the acquisition method of the viewpoint information of the user by the viewpoint information acquisition unit 301 is not particularly limited, and a known viewpoint detection method can be used. As such a viewpoint detection method, for example, a method described in Japanese Patent Application Laid-Open No. 10-222287 can be mentioned.
  • The position information acquisition unit 303 acquires position information on the position of the user using the first display device 30 within the space where it is used.
  • Specifically, the position information acquisition unit 303 obtains the position information of the user wearing the first display device 30 using the output of sensors used for acquiring position information, such as an acceleration sensor or a gyro sensor. The method of acquiring such user position information is not particularly limited, and a known method, such as a self-position estimation technique for wearable displays, can be used.
  • Examples of such self-position estimation technologies include technologies using information from GNSS (Global Navigation Satellite System), indoor positioning, mobile networks, wireless LAN (Local Area Network) base stations, and the like.
  • the position information acquired by the position information acquisition unit 303 is output to the communication control unit 311 and transmitted to the control device 20 present in the same base.
  • The voice acquisition unit 305 acquires, via a sound collection microphone or the like provided in the first display device 30, various voice information such as the sounds existing around the first display device 30 and the voice uttered by the user wearing the first display device 30. The acquired voice information is output to the communication control unit 311 and transmitted to the control device 20 present in the same base.
  • The image output unit 307 acquires, from the communication control unit 311, the output instruction information and the space environment information transmitted from the control device 20 in the same site, and displays the space environment information on the display provided in the first display device 30 according to the acquired output instruction information.
  • Using the method described above, the image output unit 307 can display the space environment information so that its image is superimposed on the output content of the second display device 40; to the wearing user, the space environment information including the image information can thus be displayed as if it were being output directly from the second display device 40.
  • The audio output unit 309 acquires the output instruction information and the audio information transmitted from the communication control unit 311 and, according to the output instruction information, outputs the audio information from a speaker or the like provided in the first display device 30.
  • The communication control unit 311 has a function of controlling the communication processing performed with the control device 20. Specifically, the communication control unit 311 transmits to the control device 20 the viewpoint information acquired by the viewpoint information acquisition unit 301, the position information acquired by the position information acquisition unit 303, and the voice information acquired by the voice acquisition unit 305. Further, the communication control unit 311 can transmit the space environment information and the various output instruction information transmitted from the control device 20 to the image output unit 307 and the audio output unit 309.
  • the storage unit 313 is an example of a storage device included in the first display device 30.
  • The storage unit 313 appropriately stores various programs, databases, and the like that are used when the first display device 30 according to the present embodiment performs the various processes described above. Various information acquired by the viewpoint information acquisition unit 301, the position information acquisition unit 303, the voice acquisition unit 305, and the like may also be recorded in the storage unit 313 as history information. Furthermore, various parameters that need to be stored when the first display device 30 according to the present embodiment performs some processing, the progress of processing, and the like may be appropriately recorded in the storage unit 313. The viewpoint information acquisition unit 301, the position information acquisition unit 303, the voice acquisition unit 305, the image output unit 307, the voice output unit 309, the communication control unit 311, and the like can freely read from and write to the storage unit 313.
  • The case where the first display device 30 has a viewpoint information acquisition function and a position information acquisition function has been described above, but it is also conceivable that the first display device 30 has no viewpoint information or position information acquisition function.
  • In that case, the space environment information analysis unit of the control device 20 provided in the same base may extract the viewpoint information and position information described above by analyzing the space environment information.
  • Each component described above may be configured using a general-purpose member or circuit, or may be configured by hardware specialized for the function of each component. Further, all functions of each component may be performed by a CPU or the like. Therefore, it is possible to change the configuration to be used as appropriate according to the technical level at which the present embodiment is implemented. Subsequently, the configuration of the second display device 40 will be described in detail.
  • the second display device 40 is, for example, a display fixed to a space, such as a wall surface of a base, and is realized by various known displays. Any user existing in the provided space can recognize the content displayed on the second display device 40.
  • the second display device 40 is connected to the control device 20 by wire or wireless, and has a function of outputting the space environment information transmitted from the control device 20.
  • the second display device 40 may be provided integrally with the imaging unit 10 or the control device 20.
  • an audio output unit may be provided so that when the first display device 30 does not output audio, the second display device 40 can output audio.
  • the configuration of the second display device 40 has been described above. Subsequently, the operation of the information processing system 1 according to the present embodiment will be described.
  • FIG. 6 is a sequence diagram showing an example of the flow of operation according to this embodiment. It is composed of objects representing the imaging unit 10A, the control device 20A, the first display device 30A, and the second display device 40A provided at the base A, and the imaging unit 10B and the control device 20B provided at the base B; the broken line extending from the top to the bottom of each object represents its lifeline.
  • Here, the case where the control device 20 is integrated with the imaging unit 10 and the second display device 40 will be described.
  • the control device 20, the imaging unit 10, and the second display device 40 may be provided independently of each other.
  • the imaging unit 10A captures at least a part of the installed site A, acquires image information of the site A (S101), and transmits the acquired image information to the control device 20A (S103).
  • Next, the space environment information analysis unit 203 provided in the control device 20A analyzes the space environment information based on the image information transmitted from the imaging unit 10A, and generates, as secondary information, the position information and viewpoint information of the user A present at the site A (S105). Then, the control device 20A transmits imaging instruction information based on the generated position information and viewpoint information (S107).
  • the imaging unit 10B images a predetermined position of the base B by the image acquisition unit 101B included in the imaging unit 10B (S109).
  • the audio acquisition unit 103B provided in the imaging unit 10B can pick up the audio at a predetermined position of the base B.
  • the imaging unit 10B analyzes space environment information including the acquired image information, and generates secondary information such as position information of the user B existing at the site B (S111). Then, the imaging unit 10B transmits the space environment information and the secondary information to the control device 20A via the control device 20B (S113).
  • Next, the control device 20A determines the output destination and output position of the received space environment information and secondary information (S115). Specifically, the output control unit 205A included in the control device 20A determines the space environment information to be output to the first display device 30A and the space environment information to be output to the second display device 40A. Furthermore, for example, when the image displayed on the first display device 30A is to be superimposed on the output content of the second display device 40A, the output control unit 205A decides the position of the image to be displayed on the first display device 30A so that the displayed image is superimposed.
  • the control device 20A transmits the output instruction information to the output destination determined as described above, that is, at least one of the first display device 30A and the second display device 40A (S117).
  • the first display device 30A or the second display device 40A outputs the space environment information and, if necessary, the secondary information at a position according to the output instruction information (S119).
  • The secondary information described above may be generated not only by the control device 20A but also by the imaging unit 10A.
  • The viewpoint information and the position information, which are part of the secondary information, may also be acquired by the viewpoint information acquisition unit 301 and the position information acquisition unit 303 of the first display device 30A. The whole of this sequence is sketched below.
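  • Collapsed into straight-line code, steps S101 to S119 look roughly like the following Python sketch. Every object and method here is an illustrative stand-in for the units of FIGS. 3 to 5, not an API defined by the present disclosure.

```python
def operation_example_1(imaging_a, control_a, imaging_b, control_b,
                        display_1a, display_2a):
    # S101/S103: image base A and send the image information to control 20A.
    image_a = imaging_a.capture()
    # S105: analyze and generate user A's position/viewpoint information.
    position, viewpoint = control_a.analyze(image_a)
    # S107: transmit imaging instructions based on that secondary information.
    control_b.receive_instruction(position, viewpoint)
    # S109/S111: image the matching position of base B and analyze it.
    env_info = imaging_b.capture_at(position, viewpoint)
    secondary_b = imaging_b.analyze(env_info)
    # S113: environment and secondary info travel back via control 20B.
    control_a.receive(env_info, secondary_b)
    # S115: decide output destinations and the superimposition position.
    plan = control_a.decide_output(env_info, secondary_b)
    # S117/S119: each display outputs its share of the information.
    display_1a.output(plan.for_first_display)
    display_2a.output(plan.for_second_display)
```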
  • FIG. 7 shows an example of space environment information recognized by the user A who uses the information processing system 1 according to the present embodiment.
  • As shown in FIG. 7, the image of the site B is displayed on the first display device 30A worn by the user A and on the second display device 40A installed at the site A, and the image of the user B present at the site B is displayed so as to be superimposed on the output content of the second display device 40A.
  • The user A can thereby obtain a natural sense of being connected to the remote place, making it possible to realize smoother and more natural communication.
  • The left side of FIG. 8A shows the base A where the user A1 and the user A2 are present, and the right side of FIG. 8A shows the base B connected to the base A. As shown in the left diagram of FIG. 8A, the user A1 and the user A2 are located at different places in the base A.
  • the imaging unit 10A of the base A acquires space environment information including image information and the like of the base A, and the control device 20A generates position information and viewpoint information of each of the user A1 and the user A2.
  • the control device 20A transmits the generated position information and viewpoint information and the space environment information acquired by the imaging unit 10A to the control device 20B of the base B.
  • The imaging unit control unit 207B included in the control device 20B provided at the base B controls the imaging unit 10B to capture images of the positions of the base B corresponding to the transmitted position information and viewpoint information of the user A1 and the user A2.
  • The space environment information acquired by this imaging is transmitted to the control device 20A, and the output control unit 205A included in the control device 20A outputs, to the first display device 30A and the second display device 40A, space environment information corresponding to each of the user A1 and the user A2.
  • The left diagram in FIG. 8B shows the space environment information that the user A1 recognizes through the first display device 30A and the second display device 40A, and the right diagram in FIG. 8B shows the space environment information that the user A2 recognizes through the first display device 30A and the second display device 40A.
  • As illustrated in the left diagram of FIG. 8B, information corresponding to the position and viewpoint of the user A1 is presented to the user A1, and as illustrated in the right diagram of FIG. 8B, information corresponding to the position and viewpoint of the user A2 is presented to the user A2. More specifically, on the second display device 40A, an image of the wall surface and floor of the base B, which is the common part of the images recognized by the user A1 and the user A2, is displayed. Further, on the first display device 30A worn by the user A1, the image of a chair is displayed so as to overlap the output content of the second display device 40A; similarly, on the first display device 30A worn by the user A2, the image of a sofa is displayed so as to be superimposed on the output content of the second display device 40A.
  • In this way, images of the base B corresponding to the positions and viewpoints of a plurality of users are provided, and the control device 20A can cause the second display device 40A to display the common part of the image information provided to the users. An illustrative sketch of such a split follows.
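  • Assuming the per-user images are already rendered and aligned pixel for pixel, one way to realize this split is to show only the pixels on which every user's image agrees on the second display device and route each per-user remainder to the corresponding first display device. The numpy sketch below is an assumption for illustration, not the algorithm stated in the present disclosure.

```python
import numpy as np

def split_common_and_individual(user_images):
    """user_images: list of HxWx3 arrays, one per user, pre-aligned.

    Returns (common, overlays): `common` goes to the second display device,
    and each overlay goes to the corresponding user's first display device."""
    stack = np.stack(user_images)
    # A pixel is "common" if every user's image agrees on it.
    common_mask = np.all(stack == stack[0], axis=0).all(axis=-1)
    common = np.where(common_mask[..., None], stack[0], 0)
    overlays = [np.where(common_mask[..., None], 0, img) for img in user_images]
    return common, overlays
```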
  • The first display device 30 can output not only image information but also text information as space environment information. For example, when the user B present at the site B is not within the range of the site B displayed based on the viewpoint information and position information of the user A present at the site A, the position information of the user B can be displayed as text information. Such a case will be described with reference to FIGS. 9, 10A, and 10B.
  • FIG. 9 is a sequence diagram showing a flow of an operation in which position information of the user B present at the site B is displayed on the first display device 30A.
  • The operation up to the point where the control device 20B provided at the base B transmits the imaging instruction information to the imaging unit 10B after the image of the base A is acquired is the same as in the operation example 1; its detailed description is therefore omitted, and it is also omitted from the sequence diagram.
  • First, the imaging unit 10B captures a plurality of positions of the base B using the plurality of cameras provided in the imaging unit 10B, and acquires space environment information (S201). At this time, the imaging unit 10B captures not only the image to be displayed to the user A present at the site A but also other positions of the site B.
  • the control device 20B analyzes the space environment information, and generates secondary information such as position information of the user B and identification information for identifying the user (S203).
  • Here, it is identified that the user present at the site B is the user B (S205).
  • the secondary information on the user B thus acquired is transmitted to the control device 20A via the control device 20B (S207), and the output instruction information is transmitted to the first display device 30A (S209).
  • the first display device 30A outputs space environment information according to the output instruction information transmitted from the control device 20A (S211).
  • FIG. 10A shows a state in which the user B does not exist at the position displayed for the user A who is at the site A.
  • the arrow shown in FIG. 10A indicates the direction of the viewpoint of the user A.
  • the user B is not located in the range displayed corresponding to the position and the viewpoint of the user A.
  • In such a case, the control device 20A can provide the user A present at the site A with the position information of the user B present at the site B, extracted from the space environment information of the site B acquired by the imaging unit 10B.
  • secondary information such as position information related to the user B may be acquired not only by the imaging unit 10B but also by the position information acquisition unit 303 provided in the first display device 30B used by the user B.
  • The method of identifying the user may be identification based on device-specific information of the first display device 30B used by the user B, or identification based on an image of the user B acquired using a camera provided in the first display device 30B. A sketch of the image-or-text fallback follows.
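  • The fallback logic of this operation example can be sketched as follows: if the user B's position falls inside the region currently displayed to the user A, the user B is rendered as an image; otherwise the user B's location is emitted as text for the first display device. The visible_region helper and the message format are assumptions for illustration.

```python
def presence_info_for_user_a(user_b_position, visible_region,
                             user_b_name="User B"):
    """visible_region: an object with a `contains(position)` test describing
    the part of site B currently displayed to user A (hypothetical helper)."""
    if visible_region.contains(user_b_position):
        # Within the displayed range: show user B as an image.
        return {"type": "image", "position": user_b_position}
    # Out of the displayed range: fall back to text information.
    x, y, _ = user_b_position
    return {"type": "text",
            "message": f"{user_b_name} is at ({x:.1f}, {y:.1f}) in site B"}
```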
  • FIG. 11 is a sequence diagram showing the flow in which the user B present at the base B calls out to the user A present at the base A and makes the user A recognize the user B.
  • In a state where the base A and the base B are connected by the information processing system 1 according to the present embodiment, as shown in FIG. 12A, the user A at the base A is gazing at the second display device 40A but does not recognize the user B at the base B.
  • the arrow shown in FIG. 12A indicates the direction of the viewpoint of the user A.
  • The operation up to the point where the control device 20B provided at the base B transmits the imaging instruction information to the imaging unit 10B after the image of the base A is acquired is the same as in the operation example 1; its detailed description is therefore omitted, and it is also omitted from the sequence diagram.
  • the imaging unit 10B provided at the base B images a plurality of positions of the base B using a plurality of cameras provided at the imaging unit 10B (S301).
  • In addition, the voice of the base B is acquired using the voice acquisition unit 103B provided in the imaging unit 10B (S303).
  • Here, when the user B calls out to the user A, voice information including the call is acquired by the voice acquisition unit 103B, and the space environment information analysis unit 105B analyzes the voice information and identifies the calling user and the content of the call (S305).
  • That is, it is identified that the call is made by the user B to the user A.
  • the audio information acquired in this manner is transmitted to the control device 20A via the control device 20B (S307), and the output instruction information is transmitted to the first display device 30A (S309).
  • the first display device 30A outputs information according to the output instruction transmitted from the control device 20A (S311).
  • the first display device 30A used by the user A can display that the user B is calling the user A.
  • In this manner, the control device 20A can provide the user A present at the base A with the audio information related to the sound of the base B, extracted from the space environment information of the base B acquired by the imaging unit 10B.
  • The first display device 30A may also output the audio information itself. In addition, the identification of the calling user and the content of the call performed in S305 may instead be performed by the control device 20A.
  • FIG. 13 is a sequence diagram showing the flow in which the user B present at the site B makes the user A present at the site A recognize the user B by means of gaze and motion.
  • FIG. 14A shows a state in which the user B is calling the user A by gaze and movement.
  • The arrow in the left diagram of FIG. 14A indicates the direction of the viewpoint of the user A, and the arrow in the right diagram of FIG. 14A indicates the direction of the viewpoint of the user B.
  • the user A who is at the site A is not looking at the second display device 40A and is not aware of the user B who is at the site B.
  • the user B who is at the base B is in a state of gazing at the user A through the second display device 40A.
  • the operation until the user B recognizes the image of the base A according to the position and the viewpoint of the user B is the same operation as the operation example 1, and thus the detailed description is omitted.
  • First, the imaging unit 10B images the site B (S401) and generates the viewpoint information of the user B from the acquired space environment information (S403). Then, for example, when the user B waves his or her hand while gazing at the user A, the space environment information analysis unit 105B of the imaging unit 10B generates secondary information such as the viewpoint information and motion information of the user B (S405), and the secondary information is transmitted to the control device 20A via the control device 20B (S407). The control device 20A identifies that the user B is calling the user A based on the transmitted viewpoint information and motion information (S409), and the output control unit 205A included in the control device 20A transmits output instruction information to the first display device 30A (S411). Then, for example, call information as shown in FIG. 14B is output to the first display device 30A worn by the user A (S413).
  • In this manner, the control device 20A can provide the user A present at the base A with the motion information on the movement of the user B present at the base B, extracted from the space environment information of the base B acquired by the imaging unit 10B. Thereby, even though the user A is not looking at the second display device 40A and cannot see the movement of the user B's hand, the user A can obtain a feeling as if being called from behind.
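  • The identification performed in S409 could, for example, look like the sketch below: a call is flagged when the user B's gaze ray passes sufficiently close to the user A's displayed position while a waving motion is detected. The angle threshold and the waving flag (assumed to come from a gesture recognizer) are illustrative assumptions.

```python
import numpy as np

def is_calling(gaze_origin, gaze_dir, target_pos, waving: bool,
               angle_threshold_deg=10.0):
    """Return True if user B is judged to be calling the user at target_pos."""
    if not waving:  # motion information, e.g. from a gesture recognizer
        return False
    gaze_dir = np.asarray(gaze_dir, float)
    to_target = np.asarray(target_pos, float) - np.asarray(gaze_origin, float)
    cos_angle = np.dot(gaze_dir, to_target) / (
        np.linalg.norm(gaze_dir) * np.linalg.norm(to_target))
    angle = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    return angle <= angle_threshold_deg  # gazing roughly at the target
```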
  • Next, a case where only the first display device 30 according to the present embodiment displays the space environment information of another base will be described with reference to FIGS. 15A, 15B, and 15C.
  • the first display device 30 can display space environment information of another site without being limited to the size of the second display device 40.
  • the left view of FIG. 15A shows that the user A who is at the base A is located at a distance d from the second display device 40A.
  • the imaging unit 10A captures an image of the site A, and generates position information of the user A from the acquired space environment information. Specifically, the distance d between the user A and the second display device 40A can be calculated.
  • With this distance d as a trigger, the output control unit 205A provided in the control device 20A controls the first display device 30A and the second display device 40A so that the output image can be switched from the image of the base B corresponding to the position and viewpoint of the user A to the image of the entire imaging range captured by the imaging unit 10B provided at the base B.
  • Specifically, the output control unit 205A included in the control device 20A virtually displays the image information captured by the imaging unit 10B on the wall on which the second display device 40A is installed. The user A can then recognize the image of the base B by looking, through the first display device 30A, at the wall on which the image of the base B is virtually displayed.
  • There are cases where the privacy of the site B should be maintained. In such cases, the image displayed on the second display device 40A may be displayed clearly while the image virtually displayed on the wall is displayed indistinctly; for example, the image displayed on the wall surface can be a wire-frame image or a mosaic image.
  • The case has been described above where, with the distance between the user and the second display device 40 as a trigger, the control device 20 switches the display so that the image displayed on the first display device 30 shows the space environment information of the other base without being limited to the size of the second display device 40. Such display switching is not limited to being triggered by the distance between the user and the second display device 40, and may also be triggered by the number of users at the base. For example, as shown in FIG. 16A, assume the case where the base A and the base B are connected by the information processing system 1 according to the present embodiment and a plurality of users are present at the base B.
In this case, for example when the number of users present at the site B is equal to or greater than a predetermined threshold, the output control unit 205A included in the control device 20A transmits an instruction to switch the output image to the first display device 30 and the second display device 40. Specifically, as shown in FIG. 16B, the output control unit 205A provided in the control device 20A switches the output image from the image of the site B corresponding to the position and the viewpoint of the user A to the image of the entire imaging range of the imaging unit 10B provided at the site B.
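A minimal sketch of such a switching policy is given below, combining the two triggers described above. The threshold values and the mode names are assumptions made for illustration, not values specified in the present disclosure.

```python
def select_remote_view(distance_d: float, num_remote_users: int,
                       distance_threshold: float = 2.5,
                       user_threshold: int = 3) -> str:
    """Choose which image of the remote site to output: the image that
    depends on the local user's position and viewpoint, or the entire
    imaging range of the remote imaging unit."""
    if distance_d >= distance_threshold or num_remote_users >= user_threshold:
        return "entire_imaging_range"      # e.g. rendered virtually on the wall
    return "viewpoint_dependent_image"     # e.g. shown on display 40A

print(select_remote_view(3.0, 1))  # far from the display  -> entire_imaging_range
print(select_remote_view(1.0, 4))  # many remote users     -> entire_imaging_range
print(select_remote_view(1.0, 1))  # -> viewpoint_dependent_image
```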
FIG. 17A is an explanatory view schematically showing the state of the site B. The left view of FIG. 17A shows that the user B is present in the direction corresponding to the position and the viewpoint of the user A, who is at the site A. The right view of FIG. 17A shows that the user B has moved out of the range corresponding to the position and the viewpoint of the user A.
In the former case, in the image recognized by the user A, the user B is displayed on the second display device 40A, as shown in the left drawing of FIG. 17B. When the user B moves out of that range, the output control unit 205A of the control device 20A displays the image of the user B on the first display device 30A without superimposing it on the display image of the second display device 40A. Thereby, the user A can continue to recognize the presence of the user B, as shown in the central view of FIG. 17B. At this time, the image of the user B may be displayed with just enough visibility that his or her presence can be recognized; for example, the user B displayed only on the first display device 30A may be displayed as an image generated with a wire frame or the like. Alternatively, the user B may be displayed on the first display device 30A so as to be superimposed on the display content of the second display device 40A, without being displayed on the second display device 40A itself.
Furthermore, the image of the site B corresponding to the position and the viewpoint of the user A at the site A may be output only to the first display device 30A, and not necessarily to the second display device 40A. In this case, it is sufficient that the second display device 40A includes a marker for determining the output position of the image of the site B displayed on the first display device 30A. The second display device 40A can then be used as an auxiliary display for outputting a predetermined image.
For example, a common image may be displayed on the second display device 40A provided at the site A and the second display device 40B provided at the site B. For instance, reference material for a conference may be displayed on the second display device 40A, or a screen of an online game being played may be displayed. Meanwhile, on the first display device 30, images of the mutual sites corresponding to each user's viewpoint information and position information may be displayed. An example of an image recognized by the user A in this case is shown in FIG. 18. Thereby, the user A can confirm the common information while simultaneously confirming the state of the user B, who is at the site B. As a result, users at sites separated from each other can exchange other information at the same time while gaining the sense that the sites are naturally connected.
Next, a case where the control device 20A reduces the resolution of the second display device 40A will be described. The imaging unit 10A captures an image of the site A, and the space environment information extraction unit 107A extracts, from the space environment information analyzed by the space environment information analysis unit 105A, the distance d between the user A and the second display device 40A. When the distance d becomes equal to or greater than the predetermined threshold value d1, the output control unit 205A provided in the control device 20A can control the second display device 40A to reduce the resolution of the output image, as shown in FIG. 19B. In addition, when the viewpoint information of the user A extracted by the space environment information extraction unit 107A from the analyzed space environment information indicates that the viewpoint of the user A is not on the second display device 40A, the output control unit 205A included in the control device 20A can likewise control the second display device 40A to reduce the resolution of the output image.
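The following sketch illustrates one possible form of this resolution control, assuming hypothetical resolution values and a hypothetical threshold d1; none of these concrete numbers come from the present disclosure.

```python
def output_resolution(distance_d: float, gaze_on_display: bool,
                      d1: float = 3.0,
                      full=(3840, 2160), reduced=(1280, 720)):
    """Pick the output resolution for the second display device: reduce it
    when the user is at least d1 away, or when the user's viewpoint is not
    on the display."""
    if distance_d >= d1 or not gaze_on_display:
        return reduced
    return full

print(output_resolution(4.2, True))   # far away          -> (1280, 720)
print(output_resolution(1.0, False))  # looking elsewhere -> (1280, 720)
print(output_resolution(1.0, True))   # near and watching -> (3840, 2160)
```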
FIG. 20 is a block diagram showing the hardware configuration of the control device 20. As shown in FIG. 20, the control device 20 includes a central processing unit (CPU) 251, a read only memory (ROM) 252, a random access memory (RAM) 253, and a host bus 254. The control device 20 further includes a bridge 255, an external bus 256, an interface 257, an imaging / sound collecting device 258, an input device 259, a display device 260, an audio output device 261, a storage device (HDD) 262, a drive 263, and a network interface 264.
Note that the display device 260 and the audio output device 261 are generally provided in at least one of the first display device 30 and the second display device 40, and the control device 20 and the second display device 40 may be used as an integrated unit. In that case, the display device 260 and the audio output device 261 may be included in the hardware of the control device 20.

The CPU 251 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the control device 20 according to various programs. The CPU 251 may also be a microprocessor. The ROM 252 stores programs used by the CPU 251, calculation parameters, and the like. The RAM 253 temporarily stores programs used in execution by the CPU 251 and parameters that change as appropriate during that execution. These are mutually connected by the host bus 254, which is configured of a CPU bus or the like. The functions of the space environment information analysis unit 203, the output control unit 205, and the like can be realized by the cooperation of the CPU 251, the ROM 252, and the RAM 253 with software.
The host bus 254 is connected to an external bus 256, such as a peripheral component interconnect / interface (PCI) bus, via the bridge 255. Note that the host bus 254, the bridge 255, and the external bus 256 do not necessarily need to be configured separately, and their functions may be implemented on one bus.
The imaging and sound collecting device 258 has a function of capturing images of, and collecting sound from, the site where the user is. The images and sound acquired by the imaging and sound collecting device 258 are output to another site. The imaging and sound collecting device 258 includes a camera, a microphone, and the like.

The input device 259 may be configured of an input unit for the user to input information, such as a mouse, a keyboard, a touch panel, a button, a microphone, a sensor, a switch, and a lever, and an input control circuit that generates an input signal based on the user's input. By operating the input device 259 to input various data to the control device 20, the user of the control device 20 can also instruct it to perform processing operations.
The display device 260 includes, for example, display devices such as a cathode ray tube (CRT) display device, a liquid crystal display (LCD) device, a projector device, an organic light emitting diode (OLED) device, and a lamp. When provided integrally with the control device 20, the display device 260 corresponds, for example, to the image output unit provided in the second display device 40. The audio output device 261 includes an audio output device such as a speaker and headphones. When provided integrally with the control device 20, the audio output device 261 corresponds, for example, to the audio output unit provided in the second display device 40.
The storage device 262 is a device for data storage configured as an example of the storage unit 209 of the control device 20 according to the present embodiment. The storage device 262 may include a storage medium, a recording device that records data in the storage medium, a reading device that reads data from the storage medium, and a deletion device that deletes data recorded in the storage medium. The storage device 262 is configured of, for example, a hard disk drive (HDD), a solid state drive (SSD), a memory having an equivalent function, or the like. The storage device 262 drives the storage medium and stores programs executed by the CPU 251 and various data.
The drive 263 is a reader / writer for a storage medium, and is built into or externally attached to the control device 20. The drive 263 reads out information recorded in a mounted removable storage medium 270, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and outputs the read information to the RAM 253 or the storage device 262. The drive 263 can also write information to the removable storage medium 270.

The network interface 264 is, for example, a communication interface configured of a communication device or the like for connecting to the network 50. The network interface 264 may be a wireless LAN compatible communication device or a wired communication device that performs wired communication.
FIG. 21 is a block diagram showing the hardware configuration of the first display device 30. As shown in FIG. 21, the first display device 30 can include a CPU 351, a ROM 352, a RAM 353, a host bus 354, a bridge 355, an external bus 356, an interface 357, an imaging / sound collecting device 358, an input device 359, a display device 360, an audio output device 361, a storage device 362, an inertial sensor 363, and a network interface 364. The CPU 351, the ROM 352, the RAM 353, the host bus 354, the bridge 355, the external bus 356, the interface 357, the display device 360, and the network interface 364 are basically the same as in the hardware configuration of the control device 20, and thus their description is omitted here.
The host bus 354 is connected to an external bus 356, such as a PCI bus, via the bridge 355. Note that the host bus 354, the bridge 355, and the external bus 356 do not necessarily need to be configured separately, and their functions may be implemented on one bus.
The imaging and sound collecting device 358 has a function of estimating the user's position and recognizing images, and a function of detecting the user's viewpoint. The imaging and sound collecting device 358 includes a camera, a microphone, and the like.

The input device 359 may be configured of an input unit for the user to input information, such as a touch panel, a button, a microphone, a sensor, and a switch, and an input control circuit that generates an input signal based on the user's input and outputs it to the CPU 351. By operating the input device 359 to input various data to the first display device 30, the user of the first display device 30 can also instruct it to perform processing operations.
The display device 360 includes, for example, display devices such as a CRT display device, a liquid crystal display device, an OLED device, and a lamp. The display device 360 corresponds to the image output unit 307. The audio output device 361 includes an audio output device such as a speaker and headphones. The audio output device 361 corresponds, for example, to the audio output unit 309.
The storage device 362 is a device for data storage configured as an example of the storage unit 313 of the first display device 30 according to the present embodiment. The storage device 362 may include a storage medium, a recording device that records data in the storage medium, a reading device that reads data from the storage medium, and a deletion device that deletes data recorded in the storage medium. The storage device 362 is configured of, for example, an SSD or a memory having an equivalent function. The storage device 362 drives the storage medium and stores programs executed by the CPU 351 and various data.
The inertial sensor 363 functions as a detection device that detects which position the user is at and which direction the user is facing, and may be an acceleration sensor, a gyro sensor, or the like. The inertial sensor 363 corresponds to the position information acquisition unit 303.

The network interface 364 is, for example, a communication interface configured of a communication device or the like for connecting to the network 50. The network interface 364 may be a wireless LAN compatible communication device or a wired communication device that performs wired communication. However, since the first display device 30 often has a head mounted or glasses type structure and is highly likely to be worn and used by each individual user, a wireless LAN compatible communication device is preferable.
(1) An information processing apparatus including: an information acquisition unit that acquires position information of a user in a first space and viewpoint information of the user; an image acquisition unit that acquires a captured image of a second space in a predetermined direction based on the position information and the viewpoint information; and a provision control unit that controls space environment information provided to the user in the first space based on the captured image of the second space.
(4) The information processing apparatus according to any one of (1) to (3), wherein the provision control unit displays the space environment information including the captured image on a second display device provided in the first space.
(5) The information processing apparatus according to any one of (1) to (3), wherein a plurality of users exist in the first space, and the provision control unit causes a second display device provided in the first space to display, as part of the space environment information, a common portion of the captured images provided to the plurality of users.
(6) The information processing apparatus according to any one of (1) to (5), wherein the image acquisition unit further acquires a captured image of the second space according to a relative angle of the user as viewed from a second display device provided in the first space.
(7) The information processing apparatus according to any one of (1) to (6), wherein the provision control unit further provides, as the space environment information, position information on the position of another user present in the second space to a predetermined user present in the first space.
(8) The information processing apparatus according to any one of (1) to (7), wherein the provision control unit further provides, as the space environment information, sound information on the sound of the second space to a predetermined user present in the first space.
(9) The information processing apparatus according to any one of (1) to (8), wherein the provision control unit further provides, as the space environment information, operation information on the operations of another user present in the second space to a predetermined user present in the first space.
(10) The information processing apparatus according to any one of (1) to (9), wherein the provision control unit switches the image to be displayed on the first display device used by the user from image information of the second space in the direction corresponding to the position and the viewpoint of the predetermined user present in the first space to predetermined image information.
(11) The information processing apparatus according to any one of (1) to (10), wherein the provision control unit refers to information on the number of other users present in the second space and, when the number is greater than or equal to a predetermined threshold, switches the image information displayed on the first display device used by the user from image information of the second space in the direction corresponding to the position and the viewpoint of the predetermined user present in the first space to predetermined image information of the second space.
(12) The information processing apparatus according to any one of the above, wherein the provision control unit displays, on the first display device used by the user, an image having lower visibility than the image displayed on the second display device provided in the first space.
(13) The information processing apparatus according to any one of (1) to (12), wherein the provision control unit switches the image to be displayed on the first display device used by the user, according to position information on the position of another user present in the second space, from an image of the second space in the direction corresponding to the position and the viewpoint of the predetermined user present in the first space to another image including the other user present in the second space.
(14) The information processing apparatus according to any one of (1) to (13), wherein the provision control unit causes the image of the other user to be output in a low-visibility state according to the position of the other user present in the second space.
(15) The information processing apparatus according to any one of the above, wherein the provision control unit causes the first display device used by the user to display the space environment information so as to overlap the display content of the second display device provided in the first space.
(16) The information processing apparatus according to any one of the above, wherein the provision control unit causes the second display device provided in the first space and the second display device provided in the second space to display common image information.
(17) The information processing apparatus according to any one of (1) to (16), wherein the provision control unit reduces the resolution of the second display device when the distance between the position of a predetermined user present in the first space and the position of the second display device provided in the first space is equal to or greater than a predetermined threshold value.
(18) The information processing apparatus according to any one of (1) to (17), wherein the provision control unit sets the resolution of the second display device.
(19) The information processing apparatus according to any one of (1) to (18), wherein the information acquisition unit extracts the position information and the viewpoint information of a predetermined user using information acquired from a first display device used by the predetermined user.
(20) The information processing apparatus according to any one of (1) to (18), wherein the information acquisition unit extracts the position information and the viewpoint information of a predetermined user from the space environment information acquired by the image acquisition unit provided in each of the first space and the second space.
(21) An information processing method including: acquiring position information of a user in a first space and viewpoint information of the user; acquiring a captured image of a second space in a predetermined direction based on the position information and the viewpoint information; and controlling space environment information provided to the user in the first space based on the captured image of the second space.
(22) A program for causing a computer to function as: an information acquisition unit that acquires position information of a user in a first space and viewpoint information of the user; an image acquisition unit that acquires a captured image of a second space in a predetermined direction based on the position information and the viewpoint information; and a provision control unit that controls space environment information provided to the user in the first space based on the captured image of the second space.

Abstract

[Problem] Proposed is an information processing device that makes it possible to provide remote location information adapted to the state of each of a plurality of users even if the users are in one place. [Solution] This information processing device comprises: an information acquisition unit that acquires position information of a user in a first space and viewpoint information of said user; an image acquisition unit that acquires a captured image for a second space which is in a prescribed direction, on the basis of the position information and said viewpoint information; and a provision control unit that controls spatial environment information to be provided to the user in the first space, on the basis of the captured image for the second space.

Description

INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM
The present disclosure relates to an information processing device, an information processing method, and a program.
Conventionally, a display method using a stereoscopic display employing a parallax barrier method, a lenticular method, or the like has been proposed as one of the methods for displaying different images according to the position from which the user views the display. However, with these display methods, although different images can be displayed on one display, the visible range for the user is limited.
On the other hand, for example, Patent Document 1 below discloses an information display device that can simultaneously provide different videos to a plurality of users by adjusting the brightness of the image and having each user wear glasses with a polarizing filter.
JP 2015-111279 A
However, an information display device such as that disclosed in Patent Document 1 merely enables different videos to be viewed; even when a plurality of users are present in a certain place, it cannot provide remote location information while varying it for each user.
Therefore, in view of the above circumstances, the present disclosure proposes an information processing method, an information processing system, an information processing apparatus, and a program capable of providing remote location information adapted to the state of each user even when a plurality of users are present in a certain place.
According to the present disclosure, there is provided an information processing apparatus including: an information acquisition unit that acquires position information of a user in a first space and viewpoint information of the user; an image acquisition unit that acquires a captured image of a second space in a predetermined direction based on the position information and the viewpoint information; and a provision control unit that controls space environment information provided to the user in the first space based on the captured image of the second space.
Further, according to the present disclosure, there is provided an information processing method including: acquiring position information of a user in a first space and viewpoint information of the user; acquiring a captured image of a second space in a predetermined direction based on the position information and the viewpoint information; and controlling space environment information provided to the user in the first space based on the captured image of the second space.
Further, according to the present disclosure, there is provided a program for causing a computer to function as: an information acquisition unit that acquires position information of a user in a first space and viewpoint information of the user; an image acquisition unit that acquires a captured image of a second space in a predetermined direction based on the position information and the viewpoint information; and a provision control unit that controls space environment information provided to the user in the first space based on the captured image of the second space.
According to the present disclosure, space environment information including image information of another site, acquired according to the position information and the viewpoint information of a predetermined user, is output to that user.
As described above, according to the present disclosure, even when a plurality of users are present in a certain place, it is possible to provide remote location information adapted to the state of each user.
Note that the above effects are not necessarily limiting; along with or in place of the above effects, any of the effects shown in the present specification, or other effects that can be grasped from the present specification, may be exhibited.
[Brief description of the drawings: block diagrams showing configuration examples of the information processing device 3, the information processing system 1, the imaging unit 10, the control device 20, and the first display device 30 according to the embodiment; a flowchart and sequence diagrams showing examples of the flow of operation according to the embodiment; an explanatory diagram showing an example of an image recognized by the user, and explanatory diagrams showing operation examples, according to the embodiment; and block diagrams showing hardware configuration examples of the control device 20 and the first display device 30 according to the embodiment.]
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. In the present specification and the drawings, components having substantially the same functional configuration are assigned the same reference numerals, and redundant description is omitted.
Further, in the present specification and the drawings, a plurality of components having substantially the same functional configuration may be distinguished by attaching different letters after the same reference numeral. However, when it is not necessary to particularly distinguish each of a plurality of components having substantially the same functional configuration, only the same reference numeral is given.
The description will be made in the following order.
<< 1. Background >>
<< 2. Configuration >>
<2-1. Overall configuration>
<2-2. Configuration of Imaging Unit 10>
<2-3. Configuration of control device 20>
<2-4. Configuration of First Display Device 30>
<2-5. Configuration of Second Display Device 40>
<< 3. Operation >>
<3-1. Operation example 1>
<3-2. Operation example 2>
<3-3. Operation example 3>
<3-4. Operation example 4>
<3-5. Operation example 5>
<3-6. Operation example 6>
<3-7. Operation example 7>
<3-8. Operation example 8>
<3-9. Operation example 9>
<< 4. Hardware configuration >>
<4-1. Hardware configuration of control device 20>
<4-2. Hardware Configuration of First Display Device 30>
<< 5. Conclusion >>
<< 1. Background >>
First, the background that led to the creation of the information processing technology according to an embodiment of the present disclosure will be briefly described.
Generally, in order to perform communication, including communication between users, across physically separated points, a network communication facility is used together with an imaging device (camera) and a display device (display) at each point for displaying the situation at the other point. These configurations are used in so-called video chat and video conferencing, but methods using them have, for example, the following problems.
First, since the information displayed on a display is merely two-dimensional, there is a problem that it can look unnatural depending on the position and angle from which the user views the display. For example, when the user is basically always positioned directly in front of the display, as in video chat using a personal computer (PC), the appearance is not much of a problem; however, when a large display is installed on a wall surface, for example, the unnatural appearance becomes pronounced.
Second, when a plurality of users share a space with a remote place using a display whose installation position is fixed, such as a wall display, there is a problem that video information and audio information suited to each user cannot be provided. Each user is at a different position and, unlike in video chat using a PC, not all users are always positioned in front of the display, so video information and audio information suited to each individual user cannot be provided.
Therefore, as a result of intensive studies focusing on the above circumstances, the present inventors have created the information processing technology according to an embodiment of the present disclosure. According to the present embodiment, which will be described in detail below, a plurality of sites are imaged, and optimal video information and the like are provided according to the position of each user, the viewpoint of each user, and so on. This makes it possible to connect remote places in a natural state, as if they were actually connected, and to realize smoother and more natural communication. Hereinafter, the configuration and operation of the information processing technology according to an embodiment of the present disclosure that exhibits such effects will be described in order.
<< 2. Configuration >>
<2-1. Overall configuration>
First, the configurations of the information processing device 3 according to an embodiment of the present disclosure and of the information processing system 1 according to the present embodiment will be described with reference to FIGS. 1 and 2. FIG. 1 is a block diagram showing a configuration example of the information processing device 3 according to an embodiment of the present disclosure. FIG. 2 is a block diagram showing a configuration example of the information processing system 1 according to the embodiment.
The information processing device 3 according to the present embodiment includes an information acquisition unit 5, an image acquisition unit 7, and a provision control unit 9. The information acquisition unit 5 has a function of acquiring the user's position information and viewpoint information. The image acquisition unit 7 has a function of causing a captured image of a second space in a predetermined direction to be acquired, based on the position information and the viewpoint information acquired by the information acquisition unit 5. The provision control unit 9 has a function of controlling the space environment information provided to the user present in the first space, based on the captured image of the second space acquired by the image acquisition unit 7.
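To make the division of roles concrete, here is a minimal Python skeleton of these three units and how they might be chained; the interfaces, type aliases, and method names are assumptions of this sketch and do not appear in the present disclosure.

```python
from typing import Protocol, Tuple

Position = Tuple[float, float, float]   # user position in the first space
Viewpoint = Tuple[float, float, float]  # e.g. a gaze direction vector

class InformationAcquisition(Protocol):      # role of unit 5
    def acquire(self) -> Tuple[Position, Viewpoint]: ...

class ImageAcquisition(Protocol):            # role of unit 7
    def capture(self, position: Position, viewpoint: Viewpoint) -> bytes: ...

class ProvisionControl(Protocol):            # role of unit 9
    def provide(self, captured_image: bytes) -> None: ...

class InformationProcessingDevice:
    """Chains the three units: acquire the user's position and viewpoint,
    capture the second space in the matching direction, and control the
    space environment information provided to the user."""
    def __init__(self, info: InformationAcquisition,
                 image: ImageAcquisition, provision: ProvisionControl):
        self.info, self.image, self.provision = info, image, provision

    def step(self) -> None:
        position, viewpoint = self.info.acquire()
        captured = self.image.capture(position, viewpoint)
        self.provision.provide(captured)
```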
Hereinafter, the information acquisition unit 5, the image acquisition unit 7, and the provision control unit 9 will be described in detail while being compared with FIG. 2.
As shown in FIG. 2, in the information processing system 1 according to the present embodiment, an information processing device 3 that includes at least an imaging unit 10 and a control device 20, and that further includes at least one of a first display device 30 and a second display device 40 as necessary, is provided at each of a plurality of sites, and the plurality of information processing devices 3 are connected to one another via a known network 50 such as the Internet. In each information processing device 3, the imaging unit 10, the first display device 30, and the second display device 40 are connected to the control device 20 by wire or wirelessly.
Note that although FIG. 2 illustrates the case where the information processing devices 3 provided at two points, the site A and the site B, are connected to each other, the number of information processing devices 3 connected to one another via the network 50 is not limited to the example illustrated in FIG. 2, and three or more information processing devices 3 may be connected to one another via the network 50.
The imaging unit 10 according to the present embodiment is installed in each of the site A, which is the first space, and the site B, which is the second space, and has a function of imaging at least a part of the site where it is installed to generate a captured image. The image information on the captured image generated by the imaging unit 10 is one type of space environment information, that is, information on the environment of the space in which the unit is installed. The imaging unit 10 may also have a function of acquiring audio in the installed space and generating audio information on the acquired audio; such audio information is also one type of space environment information. Furthermore, the imaging unit 10 may have a function of analyzing the space environment information described above.
Here, the space environment information described above may include not only the image information and the audio information acquired by the imaging unit 10, but also various kinds of information corresponding to the human senses and related to the environment of a site, such as various character information related to each site and to the various objects present at each site (including the users present at each site), information on various vibrations generated at each site, and information on odors acquired at each site.
The control device 20 according to the present embodiment is a device that comprehensively controls the overall functions of the information processing device 3 according to the present embodiment. It has an analysis function of analyzing the space environment information described above, controls the imaging processing and sound collection processing performed by imaging devices installed at other sites, and has a function of controlling the display of the space environment information described above on the first display device 30 and the second display device 40. The imaging device installed at another site may be, for example, a known imaging device such as a digital still camera or a digital video camera, and may be one used as the imaging unit 10. The control device 20 may also have a function of controlling various operations of the imaging unit 10.
As is apparent from a comparison between FIG. 1 and FIG. 2, the image acquisition unit 7 and the provision control unit 9 in FIG. 1 correspond to the control device 20. The information acquisition unit 5 in FIG. 1 may be provided in the imaging unit 10 shown in FIG. 2, may be provided in the control device 20, or may be distributed between the imaging unit 10 and the control device 20.
The first display device 30 according to the present embodiment is worn and used by the user, like a glasses-type display or a head mounted display (HMD), for example, and has a function of outputting to the user the various kinds of space environment information transmitted from the control device 20. The first display device 30 preferably also has a function of acquiring viewpoint information on the viewpoint of the user wearing it and position information on the position of that user, for example by using various sensors mounted on the first display device 30, such as an acceleration sensor and a gyro sensor.
The second display device 40 according to the present embodiment has a function of outputting to the user the various kinds of space environment information transmitted from the control device 20, and is used while fixed at a predetermined position within a site, such as on a wall surface present at each site. The second display device 40 may be any of various displays fixed to a wall surface present at each site, or the wall surface itself present at each site may function as a display.

The user present at each site recognizes the space environment information of another site by referring to the space environment information displayed on at least one of the first display device 30 and the second display device 40. At this time, each user present at each site can get a feeling of being connected to the other site via the second display device 40 fixed at a certain position within the site. However, as will be described later, the space environment information of the other site does not necessarily have to be output to both the first display device 30 and the second display device 40; that is, it may be output only to the first display device 30, or only to the second display device 40.
The network 50 is realized using a known information communication technology such as the Internet, for example, and connects the control device 20A provided at the site A and the control device 20B provided at the site B to each other.
<2-2. Configuration of Imaging Unit 10>
Next, the configuration of the imaging unit 10 according to the present embodiment will be described in detail with reference to FIG. 3. FIG. 3 is a block diagram showing a configuration example of the imaging unit 10 according to the present embodiment.
As shown in FIG. 3, the imaging unit 10 according to the present embodiment is realized by a known imaging device, such as a digital still camera or a digital video camera, that has at least an imaging function and preferably further has a sound collecting function. The imaging unit 10 according to the present embodiment includes, for example, an image acquisition unit 101, a sound acquisition unit 103, a space environment information analysis unit 105, a communication control unit 107, and a storage unit 109.
The image acquisition unit 101 has a function of imaging at least a part of the site where it is installed and acquiring various kinds of image information on the image, including the actual data of the captured image. The acquired image information is transmitted to at least one of the space environment information analysis unit 105 and the communication control unit 107. Specifically, the image acquisition unit 101 acquires, as needed, various captured images used to extract position information on the user's position and viewpoint information on the user's viewpoint, various captured images to be displayed at other sites, and the like. The image acquisition unit 101 is preferably configured of a plurality of cameras so that a plurality of positions of the installed site can be imaged simultaneously. With such a configuration, even when a plurality of users are present at different positions of the same site, images for extracting the position information and the viewpoint information of each of the plurality of users, and images to be displayed at other sites, can be acquired at the same time.
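As a rough illustration of such multi-camera acquisition, the sketch below grabs one frame from each of several cameras, assuming OpenCV is available and that the cameras appear as consecutive device indices; both assumptions are specific to this sketch. (A real unit would use synchronized hardware triggers for truly simultaneous capture.)

```python
import cv2  # assuming OpenCV is available for camera access

def capture_all(camera_indices):
    """Grab one frame from each camera of the image acquisition unit,
    so that several positions of the site are imaged at nearly the same time."""
    frames = {}
    for idx in camera_indices:
        cap = cv2.VideoCapture(idx)
        ok, frame = cap.read()
        if ok:
            frames[idx] = frame
        cap.release()
    return frames

frames = capture_all([0, 1])  # two cameras covering different positions
print({idx: f.shape for idx, f in frames.items()})
```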
The sound acquisition unit 103 has a function of acquiring audio from at least a part of the site where it is installed and acquiring various kinds of audio information, including the actual data of the acquired audio. The acquired audio is transmitted to at least one of the space environment information analysis unit 105 and the communication control unit 107. The sound acquisition unit 103 is preferably configured of a plurality of microphones so that audio at a plurality of positions of the installed site can be collected simultaneously.
In addition to the image information and the audio information described above, the imaging unit 10 may also be provided with functions for acquiring the various other kinds of information that can be included in the space environment information.
The space environment information analysis unit 105 is a processing unit that realizes the function of the information acquisition unit 5, and acquires the user's position information and viewpoint information. Using the image information transmitted from the image acquisition unit 101, the audio information transmitted from the sound acquisition unit 103, and the like, the space environment information analysis unit 105 analyzes the space environment information, that is, information on the space environment of the site where the imaging unit 10 is installed, while utilizing various known analysis and recognition techniques. Thereby, the space environment information analysis unit 105 generates various kinds of secondary information including the position information and the viewpoint information. Examples of such secondary information include the position information and the viewpoint information of the user A present at the site A, operation information on the various movements the user A is making, the relative angle of the user A as viewed from the second display device, and utterance information on the contents of the user A's speech. When a plurality of users are present at the site A, the space environment information analysis unit 105 can perform the above analysis processing for any of the users present.
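For illustration, the secondary information could be serialized into a structure like the following before being sent to the control device; every field name and value here is a hypothetical example for this sketch, not a format defined in the present disclosure.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class UserSecondaryInfo:
    user_id: str               # e.g. "user_A"
    position: tuple            # position estimated from the captured images
    viewpoint: tuple           # estimated gaze direction
    gesture: str               # recognized movement, e.g. "wave" or "none"
    relative_angle_deg: float  # angle of the user seen from display 40
    utterance: str             # speech-recognition result

info = UserSecondaryInfo("user_A", (3.2, 1.5), (0.0, -1.0), "none", 12.5, "hello")
print(json.dumps(asdict(info)))  # transmitted to the control device 20
```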
Note that a function similar to that of the space environment information analysis unit 105 may be provided in the control device 20 described later; when a function equivalent to the space environment information analysis unit 105 is implemented in the control device 20, the imaging unit 10 does not have to include the space environment information analysis unit 105. The function of the space environment information analysis unit 105 described above may also be distributed between the imaging unit 10 and the control device 20.
Information on the analysis results produced by the space environment information analysis unit 105 (that is, the various kinds of generated secondary information) is transmitted to the control device 20 described later via the communication control unit 107.
Hereinafter, an example of the methods by which the space environment information analysis unit 105 analyzes the space environment will be briefly described. First, the methods of extracting the user's position information, viewpoint information, and motion information will be described.
As a method of extracting the user's position information using the captured images generated by the imaging unit 10, various known techniques can be used; for example, the various position estimation techniques disclosed in JP 2010-16743 A, Japanese Patent No. 5869883, JP 2017-182739 A, and the like can be used as appropriate.
As a method of extracting the user's viewpoint information using the captured images generated by the imaging unit 10, various known techniques can be used; for example, the various gaze estimation techniques disclosed in JP 2005-230049 A, JP 2015-153302 A, JP 2017-169685 A, and the like can be used as appropriate.
As a method of extracting the user's motion information using the captured images generated by the imaging unit 10, known gesture recognition techniques can be used; for example, the technique disclosed in JP 2013-205983 A and the like can be used as appropriate.
As a method of extracting utterance information on the contents of the user's speech using the audio information generated by the imaging unit 10, known speech recognition techniques can be used.
Note that the above analysis methods (extraction methods) are merely examples, and the various analysis methods implemented by the space environment information analysis unit 105 according to the present embodiment are not limited to the above.
The communication control unit 107 has a communication control function for transmitting to the control device 20 the image information transmitted from the image acquisition unit 101, the audio information transmitted from the sound acquisition unit 103, the various kinds of secondary information transmitted from the space environment information analysis unit 105, and the like. The communication control unit 107 also has a function of receiving the various kinds of control instruction information transmitted from the control device 20. Based on the acquired control instruction information, the imaging processing by the image acquisition unit 101 and the audio acquisition processing by the sound acquisition unit 103 are controlled to desired states; for example, the imaging range, the audio acquisition range, and the like are set to desired states.
The storage unit 109 is an example of a storage device provided in the imaging unit 10. The storage unit 109 appropriately stores various programs, databases, and the like that the imaging unit 10 according to the present embodiment uses when performing the various kinds of processing described above. The storage unit 109 may also record, as history information, the various kinds of information acquired by the image acquisition unit 101, the sound acquisition unit 103, and the like. Furthermore, various parameters that need to be stored when the imaging unit 10 according to the present embodiment performs some processing, the intermediate progress of processing, and the like may be recorded in the storage unit 109 as appropriate. The image acquisition unit 101, the sound acquisition unit 103, the space environment information analysis unit 105, the communication control unit 107, and the like can freely perform read / write processing on the storage unit 109.
An example of the configuration of the imaging unit 10 has been described above in detail. Each of the above components may be configured using general-purpose members or circuits, or may be configured by hardware specialized for the function of each component. Alternatively, all the functions of each component may be performed by a CPU or the like. Therefore, the configuration to be used can be changed as appropriate according to the technical level at the time of implementing the present embodiment.
<2-3. Configuration of the control device 20>
 Next, an example of the configuration of the control device 20 according to the present embodiment will be described in detail with reference to FIG. 4. FIG. 4 is a block diagram showing a configuration example of the control device 20 according to the present embodiment.
 As shown in FIG. 4, the control device 20 includes, for example, a communication control unit 201, a space environment information analysis unit 203, an output control unit 205, an imaging unit control unit 207, and a storage unit 209.
 The communication control unit 201 has a function of controlling communication processing performed with other devices. Specifically, the communication control unit 201 controls the communication processing performed with the imaging unit 10, the first display device 30, and the second display device 40, and controls the communication processing performed, via the network 50, with a control device 20 provided at another site. For example, the communication control unit 201 acquires space environment information such as the image information and audio information transmitted from the imaging unit 10, as well as the various kinds of secondary information generated by and transmitted from the imaging unit 10, and sends them to the space environment information analysis unit 203. In addition, based on the control instruction information sent from the output control unit 205, the communication control unit 201 transmits space environment information to the first display device 30 or the second display device 40, causing the first display device 30 or the second display device 40 to display the desired space environment information.
 The space environment information analysis unit 203 is a processing unit that has the same functions as the space environment information analysis unit 105 of the imaging unit 10 and that realizes the function of the information acquisition unit 5; it acquires the position information and viewpoint information of the user. Specifically, the space environment information analysis unit 203 analyzes the space environment information, such as the image information and audio information transmitted from the imaging unit 10, using various known analysis and recognition techniques. The space environment information analysis unit 203 thereby generates various kinds of secondary information including the position information and viewpoint information of the user. Examples of such secondary information include the position information and viewpoint information of the user A present at the site A, motion information on various motions performed by the user A, the relative angle of the user A as seen from the second display device 40, and utterance information on the contents of the user A's speech. Furthermore, when a plurality of users are present at the site A, the space environment information analysis unit 203 can perform the analysis processing described above for any of the users present.
 Note that the space environment information analysis unit 203 does not need to have the functions for generating all of the various kinds of secondary information described above, and may have only the function of generating the position information and viewpoint information of the user A. As mentioned earlier, the space environment information analysis unit 203 does not necessarily have to be provided in the control device 20 as long as the imaging unit 10 is provided with the space environment information analysis unit 105. The functions of the space environment information analysis unit described above may also be distributed between, and implemented in, the imaging unit 10 and the control device 20.
 The space environment information is transmitted to the output control unit 205, as needed, together with information on the analysis results obtained by the space environment information analysis unit 203 (that is, the various kinds of generated secondary information).
 The output control unit 205 causes the space environment information transmitted from a control device 20 provided at another site to be displayed on at least one of the first display device 30 and the second display device 40 within its own site. For example, the output control unit 205A of the control device 20A provided at the site A can cause the space environment information of the site B, transmitted from the control device 20B provided at the site B, to be output only to the first display device 30A of the site A, or only to the second display device 40A of the site A. Furthermore, the output control unit 205A can also cause the space environment information to be output to both the first display device 30A and the second display device 40A. In this case, the output control unit 205 can divide the space environment information to be output between the first display device 30A and the second display device 40A. At this time, the output control unit 205 can, for example, cause the first display device 30A to display space environment information so that it is superimposed on the output content displayed on the display screen of the second display device 40A (in other words, on part of the space environment information to be displayed).
 As a superimposed display method for causing the first display device 30A to display space environment information so that it is superimposed on the output content of the second display device 40A, the following method, for example, can be used. That is, based on the position information and viewpoint information of the user present at the site A, the imaging unit 10B of the site B captures the space of the site B that corresponds to the position and viewpoint direction of the user at the site A, and the control device 20B of the site B generates space environment information including the captured image information. Using the space environment information transmitted from the site B, the output control unit 205A of the control device 20A of the site A specifies the position on the display screen of the second display device 40A that corresponds to the position and viewpoint direction of the user at the site A. Subsequently, the output control unit 205A of the control device 20A of the site A causes the space environment information to be displayed at a specific position on the first display device 30A so that it is superimposed on the specified position on the display screen of the second display device 40A.
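 One plausible way to compute the "specified position" in the method above is a ray-plane intersection: the user's gaze ray is intersected with the plane of the second display device to find the screen point on which the overlay of the first display device should be anchored. The following sketch assumes a shared 3-D coordinate frame and NumPy; both are illustrative assumptions.

```python
# Minimal geometry sketch for the superimposed display method: where
# does user A's gaze ray hit the plane of the second display 40A?
import numpy as np

def screen_intersection(user_pos, gaze_dir, screen_origin, screen_normal):
    """Return the 3-D point where the gaze ray meets the screen plane,
    or None if the user is looking away from the screen."""
    user_pos = np.asarray(user_pos, dtype=float)
    gaze_dir = np.asarray(gaze_dir, dtype=float)
    screen_origin = np.asarray(screen_origin, dtype=float)
    screen_normal = np.asarray(screen_normal, dtype=float)
    denom = float(np.dot(gaze_dir, screen_normal))
    if abs(denom) < 1e-9:      # gaze parallel to the screen plane
        return None
    t = float(np.dot(screen_origin - user_pos, screen_normal)) / denom
    if t < 0:                  # screen is behind the user
        return None
    return user_pos + t * gaze_dir
```

 The returned point, converted to screen coordinates, would give the position on the second display device 40A with which the rendering on the first display device 30A should be registered.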
 In addition to the method described above, the display position can also be determined by applying, for example, the technology disclosed in JP 2017-182302 A. That is, by applying the technology disclosed in JP 2017-182302 A, the position at which the superimposed display is to be made on the first display device 30A can be determined using the position information and viewpoint information of the user and the images captured at the other site.
 Specific examples of the output control of space environment information by the output control unit 205 according to the present embodiment will be described in detail below as "operation examples".
 The imaging unit control unit 207 comprehensively controls the imaging processing and sound collection processing performed by the imaging unit 10 installed at the same site as the control device 20 and by imaging devices having an imaging function installed at other sites. For example, the imaging unit control unit 207A of the control device 20A provided at the site A can control an imaging device that is installed at the site B and that has an imaging function for acquiring images and sounds of the site B, according to the position information and viewpoint information acquired by the imaging unit 10A or the control device 20A. As a result, images and sounds of the site B are acquired at the position corresponding to the position information and viewpoint information acquired by the imaging unit 10A or the control device 20A. The imaging unit control unit 207A can also control the acquisition position of the images or sounds acquired by the imaging unit 10A according to the relative angle of the user A as seen from the second display device 40A, that is, the direction of the user A as seen from the second display device 40A. Note that the device that acquires the images and sounds of the site B may be a known imaging device, such as a digital still camera or a digital video camera, that has at least an imaging function and preferably further has a sound collection function, or it may be the imaging unit 10B installed at the site B.
 Also, for example, the imaging unit control unit 207A of the control device 20A provided at the site A can control the acquisition position of the images or sounds acquired by the imaging unit 10A based on the position information and viewpoint information of the user B transmitted from the control device 20B of the site B. The imaging unit control unit 207A can likewise control the acquisition position of the images or sounds acquired by the imaging unit 10A according to the relative angle of the user B as seen from the second display device 40B, that is, the direction of the user B as seen from the second display device 40B.
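 As one way to obtain the relative angle mentioned above, the direction of the user as seen from the display can be computed from their positions in a top-down 2-D frame; the resulting angle could then be forwarded by the imaging unit control unit as a pan target for the remote camera. The coordinate convention below is an illustrative assumption.

```python
# Minimal sketch: relative angle of a user as seen from a display,
# measured from the display's facing direction (top-down 2-D frame).
import math

def relative_angle_deg(display_pos, display_facing_deg, user_pos):
    dx = user_pos[0] - display_pos[0]
    dy = user_pos[1] - display_pos[1]
    bearing = math.degrees(math.atan2(dy, dx))
    # Normalize into (-180, 180] relative to the display's facing.
    return (bearing - display_facing_deg + 180.0) % 360.0 - 180.0

# Example: pan the remote camera by this angle so that the captured
# view matches the direction from which the user looks at the screen.
```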
 The storage unit 209 is an example of a storage device provided in the control device 20. The storage unit 209 appropriately stores the various programs, databases, and the like that the control device 20 according to the present embodiment uses when performing the various kinds of processing described above. The various kinds of information generated by the imaging unit 10, the various kinds of secondary information generated by the space environment information analysis unit 203, and the like may also be recorded in the storage unit 209 as history information. Furthermore, various parameters that need to be saved when the control device 20 according to the present embodiment performs some processing, intermediate results of that processing, and the like may be recorded in the storage unit 209 as appropriate. The communication control unit 201, the space environment information analysis unit 203, the output control unit 205, the imaging unit control unit 207, and the like can freely perform read/write processing on the storage unit 209.
 An example of the configuration of the control device 20 has been described above in detail. Each of the components described above may be configured using general-purpose members or circuits, or may be configured by hardware specialized for the function of that component. Alternatively, all of the functions of the components may be performed by a CPU or the like. The configuration to be used can therefore be changed as appropriate according to the technical level at the time the present embodiment is implemented.
 Note that a computer program for realizing each of the functions of the control device 20 according to the present embodiment as described above can be created and implemented on a personal computer or the like. A computer-readable recording medium storing such a computer program can also be provided. The recording medium is, for example, a magnetic disk, an optical disc, a magneto-optical disc, a flash memory, or the like. The above computer program may also be distributed via, for example, a network, without using a recording medium.
<2-4. Configuration of the first display device 30>
 Next, an example of the configuration of the first display device 30 will be described in detail.
 The first display device 30 is a display device that is worn and used by a user; examples of such a first display device 30 include wearable displays such as glasses-type displays and head-mounted displays. These wearable displays are preferably equipped with, for example, a camera for photographing the user and various sensors such as an acceleration sensor and a gyro sensor. FIG. 5 is a block diagram showing a configuration example of the first display device 30 according to the present embodiment.
 As shown in FIG. 5, the first display device 30 includes a viewpoint information acquisition unit 301, a position information acquisition unit 303, an audio acquisition unit 305, an image output unit 307, an audio output unit 309, a communication control unit 311, and a storage unit 313.
 The viewpoint information acquisition unit 301 acquires viewpoint information on the viewpoint of the user wearing the first display device 30. The viewpoint information acquired by the viewpoint information acquisition unit 301 is output to the communication control unit 311 and transmitted to the control device 20 present at the same site. The method by which the viewpoint information acquisition unit 301 acquires the user's viewpoint information is not particularly limited, and a known viewpoint detection method can be used. An example of such a viewpoint detection method is the method described in JP H10-222287 A.
 The position information acquisition unit 303 acquires position information on the position, within the space where the device is used, of the user who uses the first display device 30. The position information acquisition unit 303 acquires the position information of the user wearing the first display device 30 by using the output of sensors used for acquiring position information, such as an acceleration sensor and a gyro sensor. The method of acquiring such user position information is not particularly limited; for example, a known method such as a self-position estimation technique for wearable displays can be used. Examples of such self-position estimation techniques include techniques using GNSS (Global Navigation Satellite System), indoor positioning, mobile networks, information from wireless LAN (Local Area Network) base stations, and the like. A further example of a self-position estimation technique is the technique described in JP 2009-237848 A. The position information acquired by the position information acquisition unit 303 is output to the communication control unit 311 and transmitted to the control device 20 present at the same site.
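 Purely to illustrate the data flow of such sensor-based self-position estimation, the following naive dead-reckoning loop integrates acceleration twice. In practice this drifts quickly, which is why the techniques cited above fuse GNSS, indoor positioning, or radio base-station information; the world-frame, gravity-compensated input is an assumption of the sketch.

```python
# Naive dead-reckoning sketch: double-integrate acceleration samples.
# Real systems fuse absolute fixes (GNSS, indoor positioning, etc.)
# to bound the drift inherent in this loop.
import numpy as np

def dead_reckon(accel_samples, dt, pos0=(0.0, 0.0, 0.0)):
    """accel_samples: iterable of world-frame (ax, ay, az) in m/s^2
    with gravity already removed; dt: sample interval in seconds."""
    pos = np.asarray(pos0, dtype=float)
    vel = np.zeros(3)
    for a in accel_samples:
        vel = vel + np.asarray(a, dtype=float) * dt
        pos = pos + vel * dt
    return pos
```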
 The audio acquisition unit 305 acquires, via a sound collection microphone or the like provided in the first display device 30, various kinds of audio information such as the voice uttered by the user wearing the first display device 30 and the sounds present around the first display device 30. The acquired audio information is output to the communication control unit 311 and transmitted to the control device 20 present at the same site.
 The image output unit 307 acquires, from the communication control unit 311, the output instruction information and space environment information transmitted from the control device 20 within the same site, and displays the space environment information on a display provided in the first display device 30 according to the acquired output instruction information. Using the method described earlier, the image output unit 307 can display the space environment information so that the image is superimposed on the output content of the second display device 40; to the user wearing the first display device 30, the space environment information including the image information can thus be displayed as if it were being output directly from the second display device 40.
 The audio output unit 309 acquires the output instruction information and audio information transmitted from the communication control unit 311 and, according to the output instruction information, outputs the audio information from a speaker or the like provided in the first display device 30.
 The communication control unit 311 has a function of controlling the communication processing performed with the control device 20. Specifically, the communication control unit 311 receives the viewpoint information acquired by the viewpoint information acquisition unit 301, the position information acquired by the position information acquisition unit 303, and the audio acquired by the audio acquisition unit 305, and causes them to be transmitted to the control device 20. The communication control unit 311 can also send the space environment information and the various kinds of output instruction information transmitted from the control device 20 to the image output unit 307 and the audio output unit 309.
 The storage unit 313 is an example of a storage device provided in the first display device 30. The storage unit 313 appropriately stores the various programs, databases, and the like that the first display device 30 according to the present embodiment uses when performing the various kinds of processing described above. The various kinds of information acquired by the viewpoint information acquisition unit 301, the position information acquisition unit 303, the audio acquisition unit 305, and the like may also be recorded in the storage unit 313 as history information. Furthermore, various parameters that need to be saved when the first display device 30 according to the present embodiment performs some processing, intermediate results of that processing, and the like may be recorded in the storage unit 313 as appropriate. The viewpoint information acquisition unit 301, the position information acquisition unit 303, the audio acquisition unit 305, the image output unit 307, the audio output unit 309, the communication control unit 311, and the like can freely perform read/write processing on the storage unit 313.
 In the description above, the case where the first display device 30 has the viewpoint information acquisition function and the position information acquisition function has been taken as an example; however, a case where the first display device 30 does not have the viewpoint information or position information acquisition functions is also conceivable. In that case, the space environment information acquisition unit of the control device 20 provided at the same site may extract the viewpoint information and position information described above by analyzing the space environment information.
 An example of the configuration of the first display device 30 has been described above in detail. Each of the components described above may be configured using general-purpose members or circuits, or may be configured by hardware specialized for the function of that component. Alternatively, all of the functions of the components may be performed by a CPU or the like. The configuration to be used can therefore be changed as appropriate according to the technical level at the time the present embodiment is implemented.
 Next, the configuration of the second display device 40 will be described in detail.
<2-5. Configuration of the second display device 40>
 The second display device 40 is a display fixed in a space, for example installed on a wall surface of a site, and is realized by any of various known displays. Any user present in the space where it is provided can recognize the content displayed on the second display device 40. The second display device 40 is connected to the control device 20 by wire or wirelessly, and has a function of outputting the space environment information transmitted from the control device 20. Here, when a plurality of users are present at the site A, a portion common to the image information provided to those users may be displayed, as part of the space environment information, as the image output to the second display device 40. The second display device 40 may also be provided integrally with the imaging unit 10 or the control device 20. Furthermore, an audio output unit may be provided so that the second display device 40 can output audio when no audio is output by the first display device 30.
 The configuration of the second display device 40 has been described above. Next, the operation of the information processing system 1 according to the present embodiment will be described.
<<3. Operation>>
<3-1. Operation example 1>
 FIG. 6 is a sequence diagram showing an example of the flow of operations according to the present embodiment; it is composed of the objects of the imaging unit 10A, the control device 20A, the first display device 30A, and the second display device 40A provided at the site A, and the imaging unit 10B and the control device 20B provided at the site B, and the broken lines extending from top to bottom indicate the lifelines of the respective objects. In the present disclosure, the case where the control device 20 is integrated with the imaging unit 10 and the second display device 40 will be described. However, the control device 20, the imaging unit 10, and the second display device 40 may each be provided independently.
 The flow of operations according to the present embodiment will be described with reference to FIG. 6. First, the imaging unit 10A images at least a part of the site A where it is installed, acquires image information of the site A (S101), and transmits the acquired image information to the control device 20A (S103). In the control device 20A, the space environment information analysis unit 203 provided in the control device 20A analyzes the space environment information based on the image information transmitted from the imaging unit 10A, and generates, as secondary information, the position information and viewpoint information of the user A present at the site A (S105). The control device 20A then transmits imaging instruction information based on the generated position information and viewpoint information (S107).
 Next, based on the received imaging instruction information, the imaging unit 10B images a predetermined position of the site B with the image acquisition unit 101B provided in the imaging unit 10B (S109). At this time, based on the imaging instruction, the audio acquisition unit 103B provided in the imaging unit 10B can collect the sound at the predetermined position of the site B.
 Subsequently, the imaging unit 10B analyzes the space environment information including the acquired image information and generates secondary information such as the position information of the user B present at the site B (S111). The imaging unit 10B then transmits the space environment information and the secondary information to the control device 20A via the control device 20B (S113). The control device 20A determines the output destinations and output positions of the received space environment information and secondary information (S115). Specifically, the output control unit 205A provided in the control device 20A determines the space environment information to be output to the first display device 30A and the space environment information to be output to the second display device 40A. Furthermore, when, for example, an image to be displayed on the first display device 30A is to be superimposed on the output content of the second display device 40A, the position of the image to be displayed on the first display device 30A is determined so that the displayed image is superimposed correctly.
 The control device 20A then transmits output instruction information to the output destination determined as described above, that is, to at least one of the first display device 30A and the second display device 40A (S117), and the first display device 30A or the second display device 40A outputs the space environment information and, if necessary, the secondary information at the position corresponding to the output instruction information (S119). Note that the secondary information may be generated not only by the control device 20A but also by the imaging unit 10A. The viewpoint information and position information, which are among the kinds of secondary information, may also be acquired by the viewpoint information acquisition unit 301 and the position information acquisition unit 303 of the first display device 30A.
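 For orientation, the sequence S101 to S119 can be summarized as the following Python sketch. Every object and method name is a hypothetical stand-in; the embodiment fixes only the order of the exchanges, not any API.

```python
# Hypothetical orchestration of operation example 1 (S101-S119).
def operation_example_1(imaging_unit_a, controller_a, controller_b,
                        imaging_unit_b, hmd_a, wall_display_a):
    image_a = imaging_unit_a.capture()                      # S101, S103
    pos_a, view_a = controller_a.analyze(image_a)           # S105
    controller_b.request_capture(pos_a, view_a)             # S107
    env_b = imaging_unit_b.capture_at(pos_a, view_a)        # S109
    secondary_b = imaging_unit_b.analyze(env_b)             # S111
    controller_a.receive(env_b, secondary_b)                # S113
    plan = controller_a.decide_outputs(env_b, secondary_b)  # S115
    for device in (hmd_a, wall_display_a):                  # S117
        device.render(plan.content_for(device),             # S119
                      plan.position_for(device))
```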
 FIG. 7 shows an example of the space environment information recognized by a user A who uses the information processing system 1 according to the present embodiment. As shown in FIG. 7, images of the site B are displayed on the first display device 30A worn by the user A and on the second display device 40A installed at the site A, and an image of the user B present at the site B is displayed so as to be superimposed on the output content of the second display device 40A. In this way, according to the present disclosure, the sites are connected naturally, as if directly joined to the remote place, and smoother and more natural communication can be realized.
<3-2. Operation example 2>
 Next, an operation example in the case where a plurality of users present at at least one of the site A and the site B use the information processing system 1 according to the present embodiment will be described. FIG. 8A shows the site A, where a user A1 and a user A2 are present, and the right part of FIG. 8A shows the site B connected to the site A. As shown in the left part of FIG. 8A, the user A1 and the user A2 are located at different places within the site A.
 The imaging unit 10A of the site A acquires space environment information including image information and the like of the site A, and the control device 20A generates position information and viewpoint information for each of the user A1 and the user A2. The control device 20A transmits the generated position information and viewpoint information, together with the space environment information acquired by the imaging unit 10A, to the control device 20B of the site B. The imaging unit control unit 207B provided in the control device 20B provided at the site B controls the imaging unit 10B and causes it to image the positions of the site B corresponding to the transmitted position information and viewpoint information of each of the user A1 and the user A2. The space environment information acquired by this imaging is transmitted to the control device 20A, and the output control unit 205A provided in the control device 20A causes the first display device 30A and the second display device 40A to output the space environment information corresponding to each of the user A1 and the user A2. The left part of FIG. 8B shows the space environment information that the user A1 recognizes through the first display device 30A and the second display device 40A, and the right part of FIG. 8B shows the space environment information that the user A2 recognizes through the first display device 30A and the second display device 40A. As shown in the left part of FIG. 8B, the user A1 is presented with information corresponding to the position and viewpoint of the user A1, and as shown in the right part of FIG. 8B, the user A2 is presented with information corresponding to the position and viewpoint of the user A2. More specifically, the second display device displays the images of the wall surface and floor of the site B, which are the parts common to the images recognized by the user A1 and the user A2. On the first display device 30A worn by the user A1, an image of a chair is displayed so as to be superimposed on the output content of the second display device 40A; similarly, on the first display device 30A worn by the user A2, an image of a sofa is displayed so as to be superimposed on the output content of the second display device 40A. In this way, according to the information processing system 1 according to the present embodiment, images of the site B corresponding to the positions and viewpoints of a plurality of users can be provided, and the control device 20A can cause the second display device 40A to display the part common to the image information provided to the plurality of users.
<3-3. Operation example 3>
 The first display device 30 can output not only image information but also text information as space environment information. For example, when the user B present at the site B is not within the range of the site B determined by the viewpoint information and position information of the user A present at the site A, the position information of the user B can be displayed as text information. Such a case will be described with reference to FIGS. 9, 10A, and 10B. FIG. 9 is a sequence diagram showing the flow of the operation in which the position information of the user B present at the site B is displayed on the first display device 30A. The operations from the acquisition of the image of the site A until the control device 20B provided at the site B transmits imaging instruction information to the imaging unit 10B are the same as in operation example 1, so a detailed description is omitted and they are also omitted from the sequence diagram. The imaging unit 10B images a plurality of positions of the site B using the plurality of cameras provided in the imaging unit 10B, and acquires space environment information (S201). At this time, the imaging unit 10B images not only the view to be displayed to the user A present at the site A but also other positions of the site B. Next, the control device 20B analyzes the space environment information and generates secondary information such as the position information of the user B and identification information for identifying the user (S203). Here, it is identified that the user present at the site B is the user B (S205). The secondary information on the user B acquired in this way is transmitted to the control device 20A via the control device 20B (S207), and output instruction information is transmitted to the first display device 30A (S209). The first display device 30A outputs space environment information according to the output instruction information transmitted from the control device 20A (S211).
 FIG. 10A shows a state in which the user B is not present at a position that would be displayed to the user A at the site A. The arrow shown in FIG. 10A indicates the direction of the viewpoint of the user A. As shown in the right part of FIG. 10A, the user B is not located within the range displayed in correspondence with the position and viewpoint of the user A. However, through the operations described above, information about the user B and the position information of the user B are displayed on the first display device 30A used by the user A, as shown in FIG. 10B. In this way, according to the present disclosure, the control device 20A can provide the user A present at the site A with the position information of the user B present at the site B, extracted from the space environment information of the site B acquired by the imaging unit 10B. Note that the secondary information such as the position information on the user B may be acquired not only by the imaging unit 10B but also by the position information acquisition unit 303 provided in the first display device 30B used by the user B. Furthermore, the method of identifying the user may be a method of identification based on the device-specific information of the first display device 30B used by the user B, or a method of identification based on an image of the user B acquired using a camera provided in the first display device 30B.
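 A minimal sketch of the decision behind this operation example is given below: if the user B's position falls outside the region of the site B that corresponds to the user A's position and viewpoint, a text label is drawn instead. The rectangular region test, the label format, and the hmd object are illustrative assumptions.

```python
# Sketch for operation example 3: show off-screen users as text.
def describe_remote_user(user_b_pos, visible_region, hmd) -> bool:
    """visible_region: (x_min, x_max, y_min, y_max) of site B that
    corresponds to user A's position and viewpoint."""
    x_min, x_max, y_min, y_max = visible_region
    x, y = user_b_pos
    inside = x_min <= x <= x_max and y_min <= y <= y_max
    if not inside:
        hmd.draw_text(f"User B is at ({x:.1f}, {y:.1f}) in site B")
    return inside
```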
<3-4. Operation example 4>
 By using the information processing system 1 according to the present embodiment, when a user at one site is not aware of the presence of another user at another site, the other user can call out to that user and the contents of the call can be notified. Cases where a user is not aware of a user at another site include, for example, the case where the user A at the site A is not gazing at the second display device 40A, or is not looking at the second display device 40A at all. In such cases, the information processing system 1 according to the present embodiment can notify information about a call from the user B to the user A. The case of notifying information about a call from the user B to the user A will be described in detail with reference to FIGS. 11, 12A, and 12B.
 FIG. 11 is a sequence diagram showing the flow in which the user B present at the site B calls out to the user A at the site A and makes the user A aware of the user B. Although the site A and the site B are connected by the information processing system 1 according to the present embodiment, the user A at the site A is not gazing at the second display device 40A, as shown in FIG. 12A, and is not aware of the user B at the site B. The arrow shown in FIG. 12A indicates the direction of the viewpoint of the user A. The operations from the acquisition of the image of the site A until the control device 20B provided at the site B transmits imaging instruction information to the imaging unit 10B are the same as in operation example 1, so a detailed description is omitted and they are also omitted from the sequence diagram.
 First, the imaging unit 10B provided at the site B images a plurality of positions of the site B using the plurality of cameras provided in the imaging unit 10B (S301). At this time, the sounds of the site B are acquired using the audio acquisition unit 103B provided in the imaging unit 10B (S303). For example, when the user B calls out to the user A, audio information including the call is acquired by the audio acquisition unit 103B, speech information is generated by the space environment information analysis unit 105B, and the calling user and the contents of the call are identified (S305). Here, it is identified that the call is from the user B to the user A. The audio information acquired in this way is transmitted to the control device 20A via the control device 20B (S307), and output instruction information is transmitted to the first display device 30A (S309). The first display device 30A outputs information according to the output instruction transmitted from the control device 20A, as shown in FIG. 12B, for example (S311). For example, as shown in FIG. 12B, the first display device 30A used by the user A can display a message to the effect that the user B is calling the user A.
 In this way, the control device 20A can provide the user A present at the site A with the audio information on the sounds of the site B extracted from the space environment information of the site B acquired by the imaging unit 10B. Note that the first display device 30A may output the audio information itself. The identification of the calling user and the contents of the call performed in S305 may also be performed by the control device 20A.
<3-5. Operation example 5>
 The call to a user is not limited to a voice call as described above; it is also possible to call out using the user's viewpoint and movements. A call using the viewpoint and movements will be described with reference to FIGS. 13, 14A, and 14B. FIG. 13 is a sequence diagram showing the flow in which the user B present at the site B makes the user A at the site A aware of the user B by means of a line of sight and movements. FIG. 14A shows the user B calling out to the user A by means of a line of sight and movements. The arrow shown in the left part of FIG. 14A indicates the direction of the viewpoint of the user A, and the arrow shown in the right part of FIG. 14A indicates the direction of the viewpoint of the user B. As shown in the left part of FIG. 14A, the user A at the site A is not gazing at the second display device 40A and is not aware of the user B at the site B. As shown in the right part of FIG. 14A, the user B at the site B is gazing at the user A through the second display device 40B. The operations up to the point where the user B recognizes the image of the site A corresponding to the position and viewpoint of the user B are the same as in operation example 1, so a detailed description is omitted.
 First, the user B is gazing at the user A; the imaging unit 10B images the site B (S401) and generates the viewpoint information of the user B from the acquired space environment information (S403). Then, when the user B, for example, makes a hand-waving movement while gazing at the user A, the space environment information analysis unit 105B of the imaging unit 10B generates secondary information such as the viewpoint information of the user B and the motion information of the user (S405), and the secondary information is transmitted to the control device 20A via the control device 20B (S407). The control device 20A identifies, based on the transmitted viewpoint information and motion information, that the user B is calling the user A (S409), and the output control unit 205A provided in the control device 20A transmits output instruction information to the first display device 30A (S411). Then, call information, for example as shown in FIG. 14B, is output to the first display device 30A worn by the user A (S413). In this way, the control device 20A can provide the user A present at the site A with the motion information on the movements of the user B present at the site B, extracted from the space environment information of the site B acquired by the imaging unit 10B, as further space environment information. As a result, even though the user A is not gazing at the second display device 40A and cannot see the hand movements of the user B, the user A can get the sensation of being called from behind, as it were.
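 The identification in S409 can be pictured as a simple rule over the transmitted secondary information, as in the sketch below; the gesture label, the gaze-target encoding, and the notification text are all assumptions made for illustration.

```python
# Sketch for operation example 5: gaze plus a waving motion is
# interpreted as a call, and the HMD of the addressed user is notified.
def detect_call(gaze_target: str, motion_label: str, hmd) -> bool:
    if gaze_target == "user_A" and motion_label == "wave":  # S409
        hmd.draw_text("User B is calling you")              # S413
        return True
    return False
```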
<3-6. Operation example 6>
 In this operation example, the case where only the first display device 30 according to the present embodiment displays the space environment information of another site will be described with reference to FIGS. 15A, 15B, and 15C. The first display device 30 can display the space environment information of another site without being limited by the size of the second display device 40. The left part of FIG. 15A shows the user A at the site A located at a distance d from the second display device 40A. At this time, the imaging unit 10A images the site A and generates the position information of the user A from the acquired space environment information. Specifically, the distance d between the user A and the second display device 40A can be calculated. When the distance d becomes equal to or greater than a predetermined threshold d0, the output control unit 205A provided in the control device 20A controls the first display device 30A and the second display device 40A to switch the output image from the image of the site B corresponding to the position and viewpoint of the user A to an image of the entire imaging range captured by the imaging unit 10B provided at the site B. Specifically, the output control unit 205A provided in the control device 20A virtually displays the image information captured by the imaging unit 10B across the entire wall on which the second display device 40A is installed. The user A can then recognize the image of the site B by looking, through the first display device 30A, at the wall on which the image of the site B is virtually displayed.
 In this way, when the user moves away from the second display device 40 by a predetermined distance or more, images of the other site can be displayed on the wall on which the second display device 40 is installed. As a result, without being limited by the size of the second display device 40, the user can feel as if the site A and the site B were connected with no wall between them.
 In this operation example, by using the entire wall surface of the site A as a virtual display device, the user A can recognize the entire imaging range of the site B captured by the imaging unit 10B; in practice, however, there may be cases where the privacy of the site B should be maintained. In such cases, as shown in FIG. 15C, only the image within the range displayed on the second display device 40A may be displayed clearly, while the image displayed on the wall surface is displayed indistinctly. For example, the image displayed on the wall surface can be a wire-frame image or a mosaic image. With such a display, the user A at the site A can recognize the space environment of the site B while consideration is given to privacy.
<3-7. Operation example 7>
 In operation example 6, the case was described in which, with the distance between the user and the second display device 40 as a trigger, the control device 20 switches the image displayed on the first display device 30 so as to display the space environment information of the other site without being limited by the size of the second display device 40. Such display switching is not limited to the distance between the user and the second display device 40; the number of users present at a site may also serve as the trigger. For example, as shown in FIG. 16A, a case is assumed in which the site A and the site B are connected by the information processing system 1 according to the present embodiment and a plurality of users are present at the site B. In this case, when the number of users extracted from the space environment information acquired by the imaging unit 10B is equal to or greater than a predetermined threshold, the output control unit 205A provided in the control device 20A transmits an instruction to switch the output image to the first display device 30 and the second display device 40. Specifically, the output control unit 205A provided in the control device 20A switches the output image from the image of the site B corresponding to the position and viewpoint of the user A to an image of the entire imaging range of the imaging unit 10B provided at the site B, as shown in FIG. 16B. By switching the display image as described above, the user can, for example, experience a party or the like being held at a remote place with a sense of presence. Note that the display image switching described in this operation example can also be performed by pressing a physical button, by a user gesture, or the like.
<3-8. Operation example 8>
The information processing system 1 according to the present embodiment can also display only a specific target. An example is the case where only user B, present at site B, is displayed to user A. FIG. 17A is an explanatory view schematically showing the state of site B. The left view of FIG. 17A shows user B standing in the direction corresponding to the position and viewpoint of user A at site A, while the right view of FIG. 17A shows user B having moved out of the range corresponding to user A's position and viewpoint. When user B is in the direction corresponding to the position and viewpoint of user A at site A, as in the left view of FIG. 17A, the image recognized by user A shows user B on the second display device 40A, as in the left view of FIG. 17B. However, when user B moves away from the position corresponding to user A's position and viewpoint, as shown in the right view of FIG. 17A, user A can no longer see user B through the second display device 40A provided at site A. Even in such a case, the output control unit 205A of the control device 20A can transmit output instruction information so that the image of user B is displayed on the first display device 30A without being superimposed on the display image of the second display device 40A, whereby only user B is displayed on the first display device 30A.

By executing the above operation, user A can continue to recognize the presence of user B even when user B moves, as shown in the central view of FIG. 17B. When the image of user B is displayed on the first display device 30A without being superimposed on the display image of the second display device 40A, the image of user B may be displayed with just enough visibility for user B's presence to be recognized, as shown in the right view of FIG. 17B. For example, user B displayed only on the first display device 30A may be rendered as a wire-frame image or the like. Displaying user B with visibility limited to what is needed to recognize user B's presence makes it possible to protect user B's privacy. Note that the displayed user B may instead be shown on the first display device 30A so as to be superimposed on the display content of the second display device 40A, without being displayed on the second display device 40A itself.
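As a rough sketch of the decision described in this operation example (geometry simplified, all names assumed), the system only needs to test whether user B's position falls inside the region of site B visible through the second display device 40A:

```python
def in_view_through_display(user_b_pos, viewport_rect) -> bool:
    """True if user B's projected position lies inside the region of site B
    visible through the second display device 40A (assumed 2D projection)."""
    x, y = user_b_pos
    left, top, right, bottom = viewport_rect
    return left <= x <= right and top <= y <= bottom

def render_target(user_b_pos, viewport_rect) -> str:
    """Decide where to render user B, per the behaviour described above."""
    if in_view_through_display(user_b_pos, viewport_rect):
        return "second_display"   # normal case: user B appears on the wall display
    return "hmd_wireframe"        # out of range: low-visibility render on device 30A
```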
<3-9. Operation example 9>
As described so far, the image of site B corresponding to the position and viewpoint of user A at site A may be output only to the first display device 30A and does not necessarily have to be output to the second display device 40A. When the image of site B is output only to the first display device 30A, it suffices for the second display device 40A to be provided with a marker for determining the output position of the image of site B displayed on the first display device 30A.
In such a case, the second display device 40A can be used as an auxiliary display for outputting a predetermined image. For example, as shown in FIG. 18A, a common image may be displayed on the second display device 40A provided at site A and the second display device 40B provided at site B. Specifically, when user A at site A and user B at site B hold a meeting, a conference, or the like, reference material for the meeting may be displayed on the second display device 40A. A screen of an online game being played may also be displayed. Meanwhile, the first display device 30 may display images of the respective remote sites corresponding to each user's viewpoint information and position information. As illustrated by the example of an image recognized by user A in FIG. 18B, user A can check the common information while simultaneously observing user B at site B. By displaying common image information on the second display device 40A provided at site A and the second display device 40B provided at site B in this way, users can exchange other information at the same time while retaining the sense that the mutually distant sites are naturally connected.
Also, when the user views from a position far from the second display device 40, distant objects would normally be difficult to see, so a high-definition image does not always have to be displayed on the second display device 40. The arrow in the left view of FIG. 19A indicates the viewpoint of user A and shows a state in which user A is not gazing at the second display device 40A. For example, as shown in FIG. 19A, when user A's viewpoint is not on the second display device 40A, or when user A gazes at the second display device 40A from a position a predetermined distance away, that is, when user A's focus is not on the second display device 40A, the control device 20A can reduce the resolution of the second display device 40A. In detail, the imaging unit 10A images site A, and the spatial environment information extraction unit 107A extracts the distance d between user A and the second display device 40A from the spatial environment information analyzed by the spatial environment information analysis unit 105A. When the distance d becomes equal to or greater than a predetermined threshold d1, the output control unit 205A provided in the control device 20A controls the second display device 40A to reduce the resolution of the output image, as shown in FIG. 19B. Likewise, when the viewpoint information of user A extracted by the spatial environment information extraction unit 107A from the spatial environment information analyzed by the spatial environment information analysis unit 105A indicates that user A's viewpoint is not on the second display device 40A, the output control unit 205A provided in the control device 20A can control the second display device 40A to reduce the resolution of the output image. By performing such an operation, a natural appearance is achieved by lowering the video resolution when user A stands at a position away from the second display device 40A, and the reduced amount of video data saves communication bandwidth between the information communication systems. Saving bandwidth between the main systems of site A and site B in particular also leads to an improvement in the performance the user actually perceives.
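A hedged sketch of this resolution control follows; the threshold distance, the resolutions, and the gaze test are illustrative assumptions only.

```python
DISTANCE_THRESHOLD_M = 3.0  # assumed value of the threshold d1 (metres)

def choose_resolution(distance_m: float, gaze_on_display: bool,
                      full=(3840, 2160), reduced=(1280, 720)):
    """Return the output resolution for the second display device 40A."""
    if distance_m >= DISTANCE_THRESHOLD_M or not gaze_on_display:
        return reduced  # viewer far away or looking elsewhere: save bandwidth
    return full         # viewer close and attentive: full detail

# Example: a viewer 4 m away who is not gazing at the display gets (1280, 720).
print(choose_resolution(4.0, gaze_on_display=False))
```

Lowering the streamed resolution in these cases is what reduces the data volume exchanged between the main systems of site A and site B.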
<<4. Hardware configuration>>
The embodiments of the present technology have been described above. The information processing described above, such as the output control and the imaging, is realized by the cooperation of software with the hardware of the control device 20 and the first display device 30 described below.
<4-1. Hardware configuration of control device 20>
FIG. 20 is a block diagram showing the hardware configuration of the control device 20. The control device 20 includes a CPU (Central Processing Unit) 251, a ROM (Read Only Memory) 252, a RAM (Random Access Memory) 253, and a host bus 254. The control device 20 can further include a bridge 255, an external bus 256, an interface 257, an imaging and sound-collecting device 258, an input device 259, a display device 260, an audio output device 261, a storage device (HDD) 262, a drive 263, and a network interface 264. The display device 260 and the audio output device 261 are usually provided in at least one of the first display device 30 and the second display device 40, but the control device 20 and the second display device 40 may also be used as an integrated unit. For this reason, the display device 260 and the audio output device 261 may be included in the hardware of the control device 20.
The CPU 251 functions as an arithmetic processing unit and a control unit, and controls overall operation in the control device 20 according to various programs. The CPU 251 may also be a microprocessor. The ROM 252 stores programs, calculation parameters, and the like used by the CPU 251. The RAM 253 temporarily stores programs used in the execution of the CPU 251 and parameters that change as appropriate during that execution. These components are mutually connected by the host bus 254, which is configured of a CPU bus or the like. The functions of the spatial environment information analysis unit 203, the output control unit 205, and the like can be realized by the cooperation of the CPU 251, the ROM 252, and the RAM 253 with software.
The host bus 254 is connected to an external bus 256 such as a PCI (Peripheral Component Interconnect/Interface) bus via the bridge 255. Note that the host bus 254, the bridge 255, and the external bus 256 do not necessarily need to be configured separately; their functions may be implemented on a single bus.
The imaging and sound-collecting device 258 has a function of capturing images and collecting sound related to the site where the user is located. The images and sound acquired by the imaging and sound-collecting device 258 are output to the other site. The imaging and sound-collecting device 258 includes a camera, a microphone, and the like.
The input device 259 can be configured of input means for a member to input information, such as a mouse, keyboard, touch panel, buttons, microphone, sensors, switches, and levers, and an input control circuit that generates an input signal based on the member's input and outputs it to the CPU 251. By operating the input device 259, the operator of the control device 20 can input various data to the control device 20 and instruct processing operations.
The display device 260 includes display devices such as a CRT (Cathode Ray Tube) display device, a liquid crystal display (LCD) device, a projector device, an OLED (Organic Light Emitting Diode) device, and lamps. The display device 260 corresponds, for example, to the image output unit provided in the second display device 40 when the latter is provided integrally with the control device 20. The audio output device 261 includes audio output devices such as speakers and headphones. The audio output device 261 corresponds, for example, to the audio output unit provided in the second display device 40 when the latter is provided integrally with the control device 20.
The storage device 262 is a device for data storage configured as an example of the storage unit 209 of the control device 20 according to the present embodiment. The storage device 262 may include a storage medium, a recording device that records data in the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded in the storage medium, and the like. The storage device 262 is configured of, for example, an HDD (Hard Disk Drive), an SSD (Solid State Drive), or a memory having an equivalent function. The storage device 262 drives the storage and stores programs executed by the CPU 251 and various data.
The drive 263 is a reader/writer for storage media, and is built into or externally attached to the control device 20. The drive 263 reads out information recorded in a mounted removable storage medium 270 such as a magnetic disk, optical disk, magneto-optical disk, or semiconductor memory, and outputs it to the RAM 253 or the storage device 262. The drive 263 can also write information to the removable storage medium 270.
The network interface 264 is, for example, a communication interface configured of a communication device or the like for connecting to the network 50. The network interface 264 may be a wireless-LAN-compatible device or a wired device that performs communication by wire.
<4-2. Hardware configuration of first display device 30>
FIG. 21 is a block diagram showing the hardware configuration of the first display device 30. The first display device 30 can include a CPU 351, a ROM 352, a RAM 353, a host bus 354, a bridge 355, an external bus 356, an interface 357, an imaging and sound-collecting device 358, an input device 359, a display device 360, an audio output device 361, a storage device 362, an inertial sensor 363, and a network interface 364. Here, the CPU 351, the ROM 352, the RAM 353, the host bus 354, the bridge 355, the external bus 356, the interface 357, the display device 360, and the network interface 364 are basically the same as in the hardware configuration of the control device 20, so their description is omitted here.
The host bus 354 is connected to an external bus 356 such as a PCI bus via the bridge 355. Note that the host bus 354, the bridge 355, and the external bus 356 do not necessarily need to be configured separately; their functions may be implemented on a single bus.
The imaging and sound-collecting device 358 has a function of user position estimation and image recognition, and a function of detecting the user's viewpoint. The imaging and sound-collecting device 358 includes a camera, a microphone, and the like.
The input device 359 can be configured of input means for the user to input information, such as a touch panel, buttons, a microphone, sensors, and switches, and an input control circuit that generates an input signal based on the member's input and outputs it to the CPU 351. By operating the input device 359, the operator of the first display device 30 can input various data to the first display device 30 and instruct processing operations.
The display device 360 includes display devices such as a CRT display device, a liquid crystal display device, an OLED device, and lamps. The display device 360 corresponds to the image output unit 307. The audio output device 361 includes audio output devices such as speakers and headphones. The audio output device 361 corresponds, for example, to the audio output unit 309.
The storage device 362 is a device for data storage configured as an example of the storage unit 313 of the first display device 30 according to the present embodiment. The storage device 362 may include a storage medium, a recording device that records data in the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded in the storage medium, and the like. The storage device 362 is configured of, for example, an SSD or a memory having an equivalent function. The storage device 362 drives the storage and stores programs executed by the CPU 351 and various data.
The inertial sensor 363 functions as a detection device that detects the user's position, sensing where the user is and which direction the user is facing, and may be an acceleration sensor, a gyro sensor, or the like. The inertial sensor 363 corresponds to the position information acquisition unit 303.
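As a purely illustrative sketch of how an inertial sensor can yield the facing direction mentioned above (the sampling rate and sensor interface are assumptions, not part of the disclosure), the gyroscope's yaw rate can be integrated over time:

```python
DT_S = 0.01  # assumed 100 Hz sampling period

def integrate_heading(heading_deg: float, yaw_rate_dps: float) -> float:
    """Update the heading estimate from one gyroscope sample (deg/s)."""
    return (heading_deg + yaw_rate_dps * DT_S) % 360.0

# Example: turning at 90 deg/s for one second of samples ends near 90 degrees.
h = 0.0
for _ in range(100):
    h = integrate_heading(h, 90.0)
print(round(h, 1))  # 90.0
```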
The network interface 364 is, for example, a communication interface configured of a communication device or the like for connecting to the network 50. The network interface 364 may be a wireless-LAN-compatible device or a wired device that performs communication by wire. However, since the first display device 30 often has a head-mounted or eyeglass-type structure and is likely to be worn and used by each individual user, a wireless-compatible device is preferable.
<<5. Conclusion>>
The preferred embodiments of the present disclosure have been described above in detail with reference to the accompanying drawings, but the technical scope of the present disclosure is not limited to these examples. It is obvious that a person having ordinary knowledge in the technical field of the present disclosure can conceive of various changes or modifications within the scope of the technical idea described in the claims, and it is understood that these also naturally fall within the technical scope of the present disclosure.
In addition, the effects described in this specification are merely illustrative or exemplary, and are not limiting. That is, the technology according to the present disclosure can exhibit other effects apparent to those skilled in the art from the description of this specification, in addition to or instead of the effects described above.
The following configurations also belong to the technical scope of the present disclosure.
(1)
An information processing device including:
an information acquisition unit that acquires position information of a user in a first space and viewpoint information of the user;
an image acquisition unit that causes a captured image of a second space in a predetermined direction to be acquired on the basis of the position information and the viewpoint information; and
a provision control unit that controls spatial environment information provided to the user in the first space on the basis of the captured image of the second space.
(2)
The information processing device according to (1), in which a plurality of users are present in the first space, and the provision control unit causes the spatial environment information to be displayed differently for each user.
(3)
The information processing device according to (1) or (2), in which the provision control unit causes a first display device used by the user to display the spatial environment information including the captured image.
(4)
The information processing device according to any one of (1) to (3), in which the provision control unit causes a second display device provided in the first space to display the spatial environment information including the captured image.
(5)
The information processing device according to any one of (1) to (3), in which a plurality of users are present in the first space, and the provision control unit causes a second display device provided in the first space to display a common portion of the captured images provided to the plurality of users as part of the spatial environment information.
(6)
The information processing device according to any one of (1) to (5), in which the image acquisition unit further acquires the captured image of the second space according to a relative angle of the user as viewed from a second display device provided in the first space.
(7)
The information processing device according to any one of (1) to (6), in which the provision control unit further provides, as the spatial environment information, position information on the position of another user present in the second space to a predetermined user present in the first space.
(8)
The information processing device according to any one of (1) to (7), in which the provision control unit further provides, as the spatial environment information, audio information on the audio of the second space to a predetermined user present in the first space.
(9)
The information processing device according to any one of (1) to (8), in which the provision control unit further provides, as the spatial environment information, motion information on the motion of another user present in the second space to a predetermined user present in the first space.
(10)
The information processing device according to any one of (1) to (9), in which, when the distance between the position of the user present in the first space and the position of a second display device provided in the first space is equal to or greater than a predetermined threshold, the provision control unit switches the image displayed on the first display device used by the user from image information of the second space in a direction corresponding to the position and viewpoint of the predetermined user present in the first space to predetermined image information of the second space.
(11)
The information processing device according to any one of (1) to (10), in which the provision control unit refers to information on the number of other users present in the second space and, when the number is equal to or greater than a predetermined threshold, switches the image information displayed on the first display device used by the user from image information of the second space in a direction corresponding to the position and viewpoint of the predetermined user present in the first space to predetermined image information of the second space.
(12)
The information processing device according to any one of (1) to (11), in which, when the distance between the position of the user present in the first space and the position of a second display device provided in the first space is equal to or greater than a predetermined threshold, the provision control unit causes the first display device used by the user to display an image having lower visibility than the image displayed on the second display device provided in the first space.
(13)
The information processing device according to any one of (1) to (12), in which the provision control unit switches the image displayed on the first display device used by the user, according to position information on the position of another user present in the second space, from an image of the second space in a direction corresponding to the position and viewpoint of the predetermined user present in the first space to another image including the other user present in the second space.
(14)
The information processing device according to any one of (1) to (13), in which the provision control unit causes the image of another user present in the second space to be output in a low-visibility state according to the position of the other user.
(15)
The information processing device according to any one of (1) to (14), in which the provision control unit causes the first display device used by the user to display the spatial environment information so as to be superimposed on the display content of a second display device provided in the first space.
(16)
The information processing device according to any one of (1) to (15), in which the provision control unit causes a second display device provided in the first space and a second display device provided in the second space to display common image information.
(17)
The information processing device according to any one of (1) to (16), in which, when the distance between the position of a predetermined user present in the first space and the position of a second display device provided in the first space is equal to or greater than a predetermined threshold, the provision control unit reduces the resolution of the second display device.
(18)
The information processing device according to any one of (1) to (17), in which, when the viewpoint of a predetermined user present in the first space is not located on a second display device provided in the first space, the provision control unit reduces the resolution of the second display device.
(19)
The information processing device according to any one of (1) to (18), in which the information acquisition unit extracts the position information and the viewpoint information of a predetermined user using information acquired from a first display device used by the predetermined user.
(20)
The information processing device according to any one of (1) to (18), in which the information acquisition unit extracts the position information and the viewpoint information of a predetermined user from the spatial environment information acquired by the image acquisition units provided in each of the first space and the second space.
(21)
An information processing method including:
acquiring position information of a user in a first space and viewpoint information of the user;
causing a captured image of a second space in a predetermined direction to be acquired on the basis of the position information and the viewpoint information; and
controlling spatial environment information provided to the user in the first space on the basis of the captured image of the second space.
(22)
A program for causing a computer to function as:
an information acquisition unit that acquires position information of a user in a first space and viewpoint information of the user;
an image acquisition unit that causes a captured image of a second space in a predetermined direction to be acquired on the basis of the position information and the viewpoint information; and
a provision control unit that controls spatial environment information provided to the user in the first space on the basis of the captured image of the second space.
DESCRIPTION OF SYMBOLS
1    Information processing system
3    Information processing device
5    Information acquisition unit
7    Image acquisition unit
9    Provision control unit
10   Imaging unit
20   Control device
30   First display device
40   Second display device
50   Network
101  Image acquisition unit
103, 305  Audio acquisition unit
105, 203  Spatial environment information analysis unit
107, 201, 311  Communication control unit
109, 209, 313  Storage unit
205  Output control unit
207  Imaging unit control unit
301  Viewpoint information acquisition unit
303  Position information acquisition unit
307  Image output unit
309  Audio output unit

Claims (22)

1. An information processing device comprising:
an information acquisition unit that acquires position information of a user in a first space and viewpoint information of the user;
an image acquisition unit that causes a captured image of a second space in a predetermined direction to be acquired on the basis of the position information and the viewpoint information; and
a provision control unit that controls spatial environment information provided to the user in the first space on the basis of the captured image of the second space.
2. The information processing device according to claim 1, wherein a plurality of users are present in the first space, and the provision control unit causes the spatial environment information to be displayed differently for each user.
3. The information processing device according to claim 1, wherein the provision control unit causes a first display device used by the user to display the spatial environment information including the captured image.
4. The information processing device according to claim 1, wherein the provision control unit causes a second display device provided in the first space to display the spatial environment information including the captured image.
5. The information processing device according to claim 1, wherein a plurality of users are present in the first space, and the provision control unit causes a second display device provided in the first space to display a common portion of the captured images provided to the plurality of users as part of the spatial environment information.
6. The information processing device according to claim 1, wherein the image acquisition unit further causes the captured image of the second space to be acquired according to a relative angle of the user as viewed from a second display device provided in the first space.
7. The information processing device according to claim 1, wherein the provision control unit further provides, as the spatial environment information, position information on the position of another user present in the second space to a predetermined user present in the first space.
8. The information processing device according to claim 1, wherein the provision control unit further provides, as the spatial environment information, audio information on the audio of the second space to a predetermined user present in the first space.
9. The information processing device according to claim 1, wherein the provision control unit further provides, as the spatial environment information, motion information on the motion of another user present in the second space to a predetermined user present in the first space.
10. The information processing device according to claim 1, wherein, when the distance between the position of the user present in the first space and the position of a second display device provided in the first space is equal to or greater than a predetermined threshold, the provision control unit switches the image displayed on the first display device used by the user from image information of the second space in a direction corresponding to the position and viewpoint of the predetermined user present in the first space to predetermined image information of the second space.
11. The information processing device according to claim 1, wherein the provision control unit refers to information on the number of other users present in the second space and, when the number is equal to or greater than a predetermined threshold, switches the image information displayed on the first display device used by the user from image information of the second space in a direction corresponding to the position and viewpoint of the predetermined user present in the first space to predetermined image information of the second space.
12. The information processing device according to claim 1, wherein, when the distance between the position of the user present in the first space and the position of a second display device provided in the first space is equal to or greater than a predetermined threshold, the provision control unit causes the first display device used by the user to display an image having lower visibility than the image displayed on the second display device provided in the first space.
13. The information processing device according to claim 1, wherein the provision control unit switches the image displayed on the first display device used by the user, according to position information on the position of another user present in the second space, from an image of the second space in a direction corresponding to the position and viewpoint of the predetermined user present in the first space to another image including the other user present in the second space.
14. The information processing device according to claim 1, wherein the provision control unit causes the image of another user present in the second space to be output in a low-visibility state according to the position of the other user.
15. The information processing device according to claim 1, wherein the provision control unit causes the first display device used by the user to display the spatial environment information so as to be superimposed on the display content of a second display device provided in the first space.
16. The information processing device according to claim 1, wherein the provision control unit causes a second display device provided in the first space and a second display device provided in the second space to display common image information.
17. The information processing device according to claim 1, wherein, when the distance between the position of a predetermined user present in the first space and the position of a second display device provided in the first space is equal to or greater than a predetermined threshold, the provision control unit reduces the resolution of the second display device.
18. The information processing device according to claim 1, wherein, when the viewpoint of a predetermined user present in the first space is not located on a second display device provided in the first space, the provision control unit reduces the resolution of the second display device.
19. The information processing device according to claim 1, wherein the information acquisition unit extracts the position information and the viewpoint information of a predetermined user using information acquired from a first display device used by the predetermined user.
20. The information processing device according to claim 1, wherein the information acquisition unit extracts the position information and the viewpoint information of a predetermined user from the spatial environment information acquired by the image acquisition units provided in each of the first space and the second space.
21. An information processing method comprising:
acquiring position information of a user in a first space and viewpoint information of the user;
causing a captured image of a second space in a predetermined direction to be acquired on the basis of the position information and the viewpoint information; and
controlling spatial environment information provided to the user in the first space on the basis of the captured image of the second space.
22. A program for causing a computer to function as:
an information acquisition unit that acquires position information of a user in a first space and viewpoint information of the user;
an image acquisition unit that causes a captured image of a second space in a predetermined direction to be acquired on the basis of the position information and the viewpoint information; and
a provision control unit that controls spatial environment information provided to the user in the first space on the basis of the captured image of the second space.
PCT/JP2018/042046 2018-01-09 2018-11-14 Information processing device, information processing method, and program WO2019138682A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018001123 2018-01-09
JP2018-001123 2018-01-09

Publications (1)

Publication Number Publication Date
WO2019138682A1 (en)

Family

ID=67218941

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/042046 WO2019138682A1 (en) 2018-01-09 2018-11-14 Information processing device, information processing method, and program

Country Status (1)

Country Link
WO (1) WO2019138682A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08149433A (en) * 1994-11-20 1996-06-07 Casio Comput Co Ltd Video telephone system
JP2006140747A (en) * 2004-11-11 2006-06-01 Nippon Telegr & Teleph Corp <Ntt> Video communication apparatus and method for controlling same
JP2009267729A (en) * 2008-04-24 2009-11-12 Sony Corp Image processing apparatus, image processing method, program, and recording medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114500846A (en) * 2022-02-12 2022-05-13 北京蜂巢世纪科技有限公司 Method, device and equipment for switching viewing angles of live action and readable storage medium
CN114500846B (en) * 2022-02-12 2024-04-02 北京蜂巢世纪科技有限公司 Live action viewing angle switching method, device, equipment and readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18900266

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18900266

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP