WO2014208169A1 - Information processing device, control method, program, and recording medium - Google Patents

Information processing device, control method, program, and recording medium Download PDF

Info

Publication number
WO2014208169A1
Authority
WO
WIPO (PCT)
Prior art keywords
information processing
irradiation point
operator
remote
information
Prior art date
Application number
PCT/JP2014/059531
Other languages
French (fr)
Japanese (ja)
Inventor
Tomoya Narita
Taku Inoue
Takehiro Hagiwara
Original Assignee
Sony Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corporation
Publication of WO2014208169A1 publication Critical patent/WO2014208169A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06F3/0386Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry for light pen
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/141Systems for two-way working between two video terminals, e.g. videophone
    • H04N7/142Constructional details of the terminal equipment, e.g. arrangements of the camera and the display
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/15Conference systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/038Indexing scheme relating to G06F3/038
    • G06F2203/0383Remote input, i.e. interface arrangements in which the signals generated by a pointing device are transmitted to a PC at a remote location, e.g. to a PC in a LAN

Definitions

  • the present disclosure relates to an information processing apparatus, a control method, a program, and a storage medium.
  • video conferencing
  • Patent Document 1 discloses an attempt to achieve an optimum layout according to the operation state.
  • An information processing apparatus is proposed that includes a recognition unit that recognizes the position of an irradiation point made by a laser pointer on a local-side projection image, an acquisition unit that acquires information on the operator of the laser pointer, and a first transmission unit that transmits the position of the irradiation point and the operator information to the remote side so that a display device installed on the remote side displays, together with the projection image, a remote irradiation point located at the coordinate position corresponding to the irradiation point along with the operator information.
  • A control method is proposed that includes a step of recognizing the position of an irradiation point made by a laser pointer on a local-side projection image, a step of acquiring information on the operator of the laser pointer, and a step of transmitting the position of the irradiation point and the operator information to the remote side so that a display device installed on the remote side displays, together with the projection image, a remote irradiation point located at the coordinate position corresponding to the irradiation point along with the operator information.
  • A program is proposed for causing a computer to function as a recognition unit that recognizes the position of an irradiation point made by a laser pointer on a local-side projection image, an acquisition unit that acquires information on the operator of the laser pointer, and a first transmission unit that transmits the position of the irradiation point and the operator information to a display device installed on the remote side in order to display, together with the projection image, a remote irradiation point located at the coordinate position corresponding to the irradiation point along with the operator information.
  • A storage medium is proposed that stores a program for causing a computer to function as a recognition unit that recognizes the position of an irradiation point made by a laser pointer on a local-side projection image, an acquisition unit that acquires information on the operator of the laser pointer, and a first transmission unit that transmits the position of the irradiation point and the operator information to a display device installed on the remote side in order to display, together with the projection image, a remote irradiation point located at the coordinate position corresponding to the irradiation point along with the operator information.
  • According to the present disclosure, the information of the operator who operates the laser pointer is clearly associated with the indication point displayed on the remote side, so that a communication conference connecting a plurality of remote locations can be conducted smoothly.
  • a remote location cooperation system is formed by a plurality of information processing devices 1A and 1B installed in a plurality of remote locations, and each information processing device (communication device) 1A, 1B is connected via a network.
  • the information processing apparatus 1A installed in the ROOM • A is connected to each device installed in the ROOM • A by wire / wireless.
  • the information processing apparatus 1A is connected to a projector 3A, a content display apparatus 4A, a camera 5A that captures the image projected on the screen S1, a camera 6A that captures the operator 7A of the laser pointer 2A, a microphone 8A, and a speaker 9A.
  • the system configuration shown in FIG. 1 is an example.
  • the microphone 8A and the camera 6A may be integrated, the content display device 4A and the information processing device 1A may be integrated, or the camera 5A may be built in the projector 3A.
  • the information processing apparatus 1A acquires content from the content display apparatus 4A and transmits projection image data (display signal) to the projector 3A.
  • the contents are, for example, charts, documents, other various graphic images, maps, websites, 3D objects, and the like, and are hereinafter also referred to as image data for projection.
  • the content display device 4A is a storage device that stores content, and may be, in addition to the notebook PC shown in FIG. 1, a desktop PC, a tablet terminal, a smartphone, or the like.
  • the projector 3A projects the projection image data transmitted from the information processing apparatus 1A onto the screen S1 according to the control of the information processing apparatus 1A.
  • the laser pointer 2A irradiates the screen S1 with the visible light laser beam and the invisible light marker at the same position or only with the visible light marker.
  • the visible light laser light and the visible light marker are laser light and marker images that are irradiated with visible light visible to the human eye.
  • the invisible light marker is a marker image irradiated with light that cannot be seen by the human eye, such as infrared or ultraviolet light.
  • the invisible light marker / visible light marker is, for example, a figure such as a star or a cross, or a one-dimensional or two-dimensional barcode in which specific information is embedded.
  • the operator 7A makes a presentation while indicating an arbitrary portion of the screen S1 with a visible light laser beam or a visible light marker emitted from the laser pointer 2A.
  • the camera 5A captures the projected image projected on the screen S1, and outputs the captured image to the information processing apparatus 1A.
  • the camera 5A has an invisible light imaging function for capturing invisible light, such as an infrared camera or an ultraviolet camera, or a visible light imaging function for capturing visible light.
  • based on the visible/invisible light captured image of the projection image captured by the camera 5A, the information processing apparatus 1A recognizes the coordinate position on the projection image of the irradiation point P1 (visible light laser light, visible light marker, or invisible light marker) produced by the laser pointer 2A.
  • the information processing apparatus 1A can calculate the three-dimensional position of the laser pointer 2A. Specifically, based on the visible/invisible light captured image of the projection image captured by the camera 5A, the information processing apparatus 1A calculates the three-dimensional position of the laser pointer 2A with respect to the projection image from the size, distortion, and the like of the shape of the visible/invisible light marker emitted from the laser pointer 2A.
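The size-and-distortion estimate above can be sketched with a simple pinhole-camera model. This is an illustrative reconstruction, not the patent's actual method: the focal length, the marker's physical size, and the assumption of a square marker are all hypothetical.

```python
import math

# Illustrative sketch only: assumes a pinhole camera with a known focal
# length (in pixels) and a square marker of known physical width.
def estimate_pointer_distance(marker_width_px: float,
                              focal_length_px: float,
                              marker_width_m: float) -> float:
    """The farther the laser pointer is from the screen, the smaller its
    marker appears in the captured image: distance = f * W / w."""
    return focal_length_px * marker_width_m / marker_width_px

def estimate_tilt_deg(marker_width_px: float, marker_height_px: float) -> float:
    """A square marker irradiated at an angle is foreshortened along one
    axis; the ratio of the two observed axes gives the tilt angle."""
    ratio = min(marker_width_px, marker_height_px) / max(marker_width_px,
                                                         marker_height_px)
    return math.degrees(math.acos(ratio))
```

Combining the estimated distance with the tilt angle yields a coarse three-dimensional positional relationship between the pointer and the projection surface.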
  • the information processing apparatus 1A acquires information on the operator 7A of the laser pointer 2A.
  • the information of the operator 7A is, for example, a name or nickname, a title, an affiliation, a current location, or a pre-registered face image or avatar image.
  • the information processing apparatus 1A analyzes the visible/invisible light captured image of the projection image captured by the camera 5A, and identifies the operator 7A of the laser pointer 2A from the shape and color of the visible/invisible light marker emitted from the laser pointer 2A, or by reading the embedded information (user ID).
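As a minimal sketch of this identification step, the user ID read from the marker can key into a pre-registered operator database. The registry contents, field names, and IDs below are invented purely for illustration.

```python
from typing import Optional

# Hypothetical pre-registered operator information, keyed by the user ID
# embedded in the visible/invisible light marker (all entries invented).
OPERATOR_DB = {
    "marker-01": {"name": "Alice", "affiliation": "ROOM A", "avatar": "alice.png"},
    "marker-02": {"name": "Bob", "affiliation": "ROOM C", "avatar": "bob.png"},
}

def acquire_operator_info(marker_user_id: str) -> Optional[dict]:
    """Return the operator record for the ID read from the marker, or
    None when the marker carries an unregistered ID."""
    return OPERATOR_DB.get(marker_user_id)
```

In the patent's terms, this database could live either inside the information processing apparatus or on the cloud.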
  • based on the calculated three-dimensional position, the information processing apparatus 1A controls the movement (pan, tilt, zoom) of the camera 6A, which is different from the camera 5A that captures the projected image, so that the operator 7A of the laser pointer 2A is included in the angle of view of the camera 6A.
  • the information processing apparatus 1A can acquire a real-time image of the operator 7A of the laser pointer 2A.
  • the information processing apparatus 1A can also direct the highly directional microphone 8A toward the operator 7A of the laser pointer 2A based on the calculated three-dimensional position, and collect the utterances (voice data) of the operator irradiating with the laser pointer 2A.
  • the information processing apparatus 1A on the local side transmits the projection image data (display signal), the coordinate position of the irradiation point P1, the information of the operator 7A, the real-time video of the operator 7A captured by the camera 6A, the voice data collected by the microphone 8A, and the like to the information processing apparatus 1B on the remote side.
  • the information processing apparatus 1B installed in the remote ROOM • B is connected to each device installed in the ROOM • B by wire / wireless.
  • the information processing device 1B is connected to a projector 3B (an example of a display device), a camera 6B that captures an image of a viewer 7B, a microphone 8B, and a speaker 9B.
  • the information processing apparatus 1B transmits the projection image data received from the information processing apparatus 1A to the projector 3B, and causes the projector 3B to project the image data onto the screen S2. At this time, the information processing apparatus 1B performs control to superimpose the remote irradiation point P1' on the projection image projected on the screen S2, at the coordinate position corresponding to the coordinate position of the irradiation point P1 received from the information processing apparatus 1A. Thereby, the same locus as the irradiation point P1 irradiated on the projection image projected on the screen S1 on the ROOM • A side can also be displayed on the projection image projected on the screen S2 on the remote side.
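One way to place the remote irradiation point at the "corresponding coordinate position" when the local and remote projections differ in resolution is to normalize the point against the local image size and rescale. A minimal sketch; the helper name and tuple conventions are assumptions, not terms from the patent.

```python
def to_remote_coords(point_px, local_size, remote_size):
    """Map an irradiation point from local projection-image pixels to the
    remote projection by normalizing against the local width/height and
    rescaling, so the remote point lands at the corresponding position."""
    x, y = point_px
    local_w, local_h = local_size
    remote_w, remote_h = remote_size
    return (x / local_w * remote_w, y / local_h * remote_h)
```

For example, a point at the center of a 1920x1080 local projection maps to the center of a 1280x720 remote projection.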
  • the information processing apparatus 1B superimposes and displays information on the operator 7A of the laser pointer 2A (for example, a name and a face image) in association with the remote irradiation point P1'.
  • thereby, the viewer 7B can intuitively understand who on the ROOM • A side is operating and explaining the remote irradiation point P1' superimposed on the screen S2.
  • the information processing apparatus 1B may superimpose and display an image of the operator 7A captured in real time by the camera 6A as the information about the operator 7A.
  • when the information processing apparatus 1B receives the audio data collected by the microphone 8A from the local information processing apparatus 1A, it controls the speaker 9B to reproduce the audio data.
  • the information processing apparatus 1B can also transmit the utterance (voice data) of the viewer 7B collected by the microphone 8B to the information processing apparatus 1A of the ROOM • A.
  • as described above, the information of the operator 7A who is operating the laser pointer 2A is superimposed and displayed in association with the remote irradiation point P1' displayed on the remote side, so that the viewer can intuitively understand who is operating and explaining, and a communication conference connecting a plurality of remote locations can be conducted smoothly.
  • FIG. 1 illustrates the case of a communication conference that connects two remote locations as an example, but the remote location cooperation system according to the present embodiment is not limited to this. For example, as shown in FIG. 2, it can be similarly applied to a communication conference connecting three remote locations.
  • FIG. 2 is a diagram for describing a case where the remote location cooperation system according to an embodiment of the present disclosure is applied to a communication conference that connects three remote locations.
  • a remote location cooperation system according to an embodiment of the present disclosure is formed by a plurality of information processing devices 1A, 1B, and 1C installed in a plurality of remote locations, and each information processing device (communication device) 1A, 1B, and 1C is connected via a network.
  • the information processing apparatus 1C installed in the ROOM • C is connected to each device installed in the ROOM • C by wire / wireless.
  • the information processing apparatus 1C is connected to a projector 3C, a content display apparatus 4C, a camera 5C that captures the image projected on the screen S3, a camera 6C that captures the operator 7C of the laser pointer 2C, a microphone 8C, and a speaker 9C.
  • the system configuration shown in FIG. 2 is an example, and for example, the microphone 8C and the camera 6C may be integrated, the content display device 4C and the information processing device 1C may be integrated, or the camera 5C may be built in the projector 3C.
  • the information processing apparatus 1A transmits the coordinate position of the irradiation point P1, information on the operator 7A, and the like to the information processing apparatus 1B and the information processing apparatus 1C that are remote to the information processing apparatus 1A.
  • the information processing apparatus 1C transmits the coordinate position of the irradiation point P3, information on the operator 7C, and the like to the information processing apparatus 1B and the information processing apparatus 1A that are remote to the information processing apparatus 1C.
  • a remote irradiation point P1' corresponding to the irradiation point P1 and a remote irradiation point P3' corresponding to the irradiation point P3 are superimposed and displayed on the projection image projected on the screen S2 of the ROOM • B.
  • the remote irradiation point P1' is displayed with the same locus as the irradiation point P1 irradiated on the projection image projected on the screen S1 on the ROOM • A side, and the remote irradiation point P3' is displayed with the same locus as the irradiation point P3 irradiated on the projection image projected on the screen S3 on the ROOM • C side.
  • a remote irradiation point P3' corresponding to the irradiation point P3 is superimposed and displayed on the projection image projected on the screen S1 of the ROOM • A, and a remote irradiation point P1' corresponding to the irradiation point P1 is superimposed and displayed on the projection image projected on the screen S3 of the ROOM • C.
  • the remote location cooperation system according to this embodiment can be applied to a communication conference that connects three remote locations.
  • an internal configuration example of each configuration forming the remote location cooperation system according to the present embodiment will be described with reference to a plurality of embodiments.
  • FIG. 3 is a block diagram illustrating an internal configuration example of the information processing apparatus 1A-1 and the information processing apparatus 1B-1 that form the remote location cooperation system according to the first embodiment.
  • the information processing apparatus 1A-1 is an example of the information processing apparatus 1A installed in the ROOM • A described with reference to FIG. 1. As shown in FIG. 3, the information processing apparatus 1A-1 includes a transmission / reception unit 11A, an irradiation position recognition unit 12, an operator information acquisition unit 13, a three-dimensional position calculation unit 14, a transmission unit 15A, an output control unit 16, and a reception unit 17A.
  • the transmission / reception unit 11A has a function of transmitting / receiving data by connecting to each device installed in the ROOM • A by wire / wireless. Specifically, the transmission / reception unit 11A receives content from the content display device 4A, transmits image data for projection based on the content to the projector 3A, and receives a captured image of the projection image from the camera 5A. In addition, the transmission / reception unit 11A transmits an imaging control signal to the camera 6A that images the operator 7A, and receives a captured image from the camera 6A. In addition, the transmission / reception unit 11A transmits a sound collection control signal to the microphone 8A that collects the speech of the operator 7A, and receives sound collection data from the microphone 8A. Further, the transmission / reception unit 11A transmits the sound collection data transmitted from the remote information processing apparatus 1B-1 to the speaker 9A.
  • the method of wireless communication between the transmission / reception unit 11A and each device installed in the ROOM • A is not particularly limited; for example, they may be connected by a wireless LAN, Wi-Fi (registered trademark), Bluetooth (registered trademark), or the like.
  • based on the visible light / invisible light captured image of the projection image captured by the camera 5A, the irradiation position recognition unit 12 recognizes the coordinate position of the irradiation point P1 irradiated onto the projection image with the visible light laser light or the visible / invisible light marker by the laser pointer 2A. Specifically, for example, the irradiation position recognition unit 12 detects the position coordinates of the irradiation point P1 from the difference between the image projected by the projector 3A (projection image data) and the visible / invisible light captured image of that projection image.
  • the irradiation position recognition unit 12 can also increase accuracy by additionally analyzing the difference between the visible light / invisible light captured image of the previous frame and the visible light / invisible light captured image of the currently projected image.
  • the irradiation position recognition unit 12 outputs the recognized coordinate position of the irradiation point P1 to the transmission unit 15A.
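The difference-based detection above can be sketched as follows. This is an illustrative stand-in, not the patent's implementation: real frames would first need lens and keystone correction, images here are plain row-major brightness lists, and the threshold value is an arbitrary assumption.

```python
def find_irradiation_point(projected, captured, threshold=50):
    """Scan the per-pixel brightness difference between the image sent to
    the projector and the camera's capture of the screen; the largest
    difference above the threshold is taken as the irradiation point."""
    best, best_diff = None, threshold
    for y, (prow, crow) in enumerate(zip(projected, captured)):
        for x, (p, c) in enumerate(zip(prow, crow)):
            diff = c - p          # the laser adds brightness on top of content
            if diff > best_diff:
                best, best_diff = (x, y), diff
    return best                   # None when no point is detected
```

The frame-to-frame difference mentioned above could be added by also comparing `captured` against the previous captured frame to reject static highlights.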
  • the operator information acquisition unit 13 analyzes the visible / invisible light captured image of the projection image captured by the camera 5A, identifies the operator 7A from the shape and color of the visible / invisible light marker emitted from the laser pointer 2A or from the information embedded in the marker, and acquires the operator information. Specifically, for example, the operator information acquisition unit 13 extracts the operator information associated with the shape and color of the visible / invisible light marker from an operator information DB (database) (not shown) in the information processing apparatus 1A-1 or on the cloud. The operator information acquisition unit 13 can also acquire operator information based on the information read from the visible / invisible light marker. The operator information acquisition unit 13 outputs the acquired operator information to the transmission unit 15A.
  • the three-dimensional position calculation unit 14 calculates the three-dimensional position of the laser pointer 2A with respect to the projection image projected on the screen S1. Specifically, based on the visible / invisible light captured image of the projection image captured by the camera 5A, the three-dimensional position calculation unit 14 calculates the three-dimensional position (positional relationship) of the laser pointer 2A with respect to the projection image from the size and distortion of the shape of the visible / invisible light marker emitted from the laser pointer 2A. The three-dimensional position calculation unit 14 outputs the calculated three-dimensional position to the transmission unit 15A and the output control unit 16.
  • the transmission unit 15A is connected to the information processing apparatus 1B-1 on the remote side via a network and transmits data.
  • the transmission unit 15A transmits the coordinate position of the irradiation point P1 recognized by the irradiation position recognition unit 12, the operator information acquired by the operator information acquisition unit 13, the three-dimensional position of the laser pointer 2A (the positional relationship between the projection image and the laser pointer 2A), the image data for projection, the collected sound data transmitted from the microphone 8A, and the like to the information processing apparatus 1B-1.
  • the receiving unit 17A is connected to the information processing apparatus 1B-1 on the remote side via a network and receives data. For example, the receiving unit 17A receives audio data collected on the remote side.
  • the output control unit 16 has a function of processing data output from each device installed in the ROOM A. Specifically, for example, the output control unit 16 performs content output processing.
  • the content output processing is processing for generating projection image data (display signal) based on content data acquired from the content display device 4A, for example. Further, the output control unit 16 controls to reproduce the audio data of the viewer 7B received from the information processing apparatus 1B-1 via the receiving unit 17A from the speaker 9A.
  • the output control unit 16 also transmits, from the transmission / reception unit 11A, a control signal for controlling the movement of the camera 6A so that the camera 6A captures the operator 7A based on the three-dimensional position of the laser pointer 2A, and a control signal for controlling the direction of directivity of the microphone 8A so that it collects the speech of the operator 7A. What is calculated by the three-dimensional position calculation unit 14 is the positional relationship between the projection image (screen S1) and the laser pointer 2A; if the positional relationships among the other devices are known, the output control unit 16 can specify the relative positional relationship between the camera 6A and the operator 7A and the relative positional relationship between the microphone 8A and the operator 7A.
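Given such a relative positional relationship, the pan/tilt command for the camera 6A reduces to two angles. A sketch under the assumption of a shared room coordinate frame in metres; the axis convention is invented here, not specified in the patent.

```python
import math

def pan_tilt_to(camera_pos, target_pos):
    """Return the pan and tilt angles (degrees) that aim the camera at the
    operator. Coordinates are (x: right, y: up, z: depth) in a shared
    room frame; this convention is an assumption for illustration."""
    dx = target_pos[0] - camera_pos[0]
    dy = target_pos[1] - camera_pos[1]
    dz = target_pos[2] - camera_pos[2]
    pan = math.degrees(math.atan2(dx, dz))                   # horizontal swing
    tilt = math.degrees(math.atan2(dy, math.hypot(dx, dz)))  # vertical swing
    return pan, tilt
```

The same two angles could drive the directivity of the microphone 8A toward the operator.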
  • the information processing apparatus 1B-1 is an example of the information processing apparatus 1B installed in the ROOM • B described with reference to FIG. As illustrated in FIG. 3, the information processing apparatus 1B-1 includes a transmission / reception unit 11B, a transmission unit 15B, a reception unit 17B, and a superimposed image signal generation unit 18.
  • the receiving unit 17B connects to the information processing apparatus 1A-1 on the remote side via a network and receives data.
  • the reception unit 17B receives the coordinate position of the irradiation point P1, operator information, the three-dimensional position of the laser pointer 2A, projection image data, sound collection data, and the like.
  • the receiving unit 17B outputs the coordinate position of the irradiation point P1, operator information, the three-dimensional position of the laser pointer 2A, and the content to the superimposed image signal generation unit 18, and outputs the audio data to the transmission / reception unit 11B.
  • based on the coordinate position of the irradiation point P1, the superimposed image signal generation unit 18 generates an image signal in which the remote irradiation point P1' is superimposed at the coordinate position on the projection image data corresponding to the coordinate position of the irradiation point P1. As a result, the information processing apparatus 1B-1 can display, on the projection image on the screen S2 on the ROOM • B side, the same locus as the irradiation point P1 irradiated on the projection image on the screen S1 on the ROOM • A side.
  • the superimposed image signal generation unit 18 can also generate an image signal in which the operator information (name, face image, avatar image, etc.) is superimposed at a position corresponding to the remote irradiation point P1' on the projection image.
  • the superimposed image signal generation unit 18 can also generate an image signal in which a whole body image of a person is superimposed on the projection image data at a position corresponding to the three-dimensional position information (positional relationship) of the laser pointer 2A with respect to the projection image (screen S1). This makes it appear as if the operator 7A of the remote irradiation point P1' were actually present on the other side of the screen S2.
  • the whole body image of a person is, for example, a silhouette image, an avatar, CG (computer graphics) of the operator, or an image obtained by cutting out the operator's whole body from a captured image of the operator.
  • in the following, a case where a silhouette image is used as the whole body image of a person will be described.
  • the superimposed image signal generation unit 18 transmits the generated image signal to the projector 3B via the transmission / reception unit 11B.
  • the transmission / reception unit 11B is connected to each device installed in the ROOM • B by wire / wireless and has a function of transmitting / receiving data. Specifically, the transmission / reception unit 11B transmits image data for projection based on the image signal generated by the superimposed image signal generation unit 18 to the projector 3B, and transmits the audio data from the ROOM • A side to the speaker 9B. Moreover, the transmission / reception unit 11B receives the captured image of the viewer 7B from the camera 6B, and receives the audio data of the viewer 7B collected by the microphone 8B.
  • the method of wireless communication between the transmission / reception unit 11B and each device installed in the ROOM • B is not particularly limited; for example, they may be connected by a wireless LAN, Wi-Fi (registered trademark), Bluetooth (registered trademark), or the like.
  • the transmission unit 15B transmits the utterance (voice data) of the viewer 7B collected by the microphone 8B and the captured image of the viewer 7B captured by the camera 6B to the information processing apparatus 1A-1 via the network.
  • the information processing apparatuses 1A-1 and 1B-1 may convert the voices collected from the microphones 8A and 8B into text by performing voice recognition and output the text data to the other party.
  • thereby, the text data of the voice of a user speaking at a remote location can be projected on the screen near the remote irradiation point P1' corresponding to the irradiation point P1 operated by that user, together with the user's avatar image.
  • the information processing apparatuses 1A-1 and 1B-1 may also translate the collected voice and output it to the other party. Subsequently, an operation process of the remote cooperation system according to the first embodiment will be described with reference to FIG. 4.
  • FIG. 4 is a diagram for explaining the operation processing of the remote cooperation system according to the first embodiment.
  • the information processing apparatus 1A-1 starts projecting content data.
  • the information processing apparatus 1A-1 transmits projection image data based on the content data to the projector 3A, and causes the projector 3A to project the image data onto the screen S1.
  • In step S103, the information processing apparatus 1A-1 acquires a captured image of the projection image captured by the camera 5A.
  • In step S106, the irradiation position recognition unit 12 of the information processing apparatus 1A-1 recognizes the coordinate position of the irradiation point P1 based on the captured image.
  • In step S109, steps S103 and S106 are repeated until the coordinate position of the irradiation point P1 is recognized.
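The capture-and-recognize loop of steps S103 to S109 can be sketched as a bright-spot search over a captured frame. This is only an illustrative sketch under stated assumptions: the grayscale list-of-rows frame format, the threshold value, and the function name are inventions for illustration; the actual irradiation position recognition unit 12 may instead rely on a visible / invisible light marker rather than raw brightness.

```python
def find_irradiation_point(frame, threshold=240):
    """Return the (x, y) coordinate of the brightest pixel above
    `threshold` in a grayscale frame (list of rows), or None when the
    irradiation point is not yet recognized (loop back to step S103)."""
    best = None
    best_value = threshold
    for y, row in enumerate(frame):
        for x, value in enumerate(row):
            if value > best_value:
                best_value = value
                best = (x, y)
    return best

# A dark frame with one bright spot at column 3, row 1:
frame = [
    [10, 12, 11, 13, 10],
    [11, 10, 12, 250, 11],
    [10, 11, 10, 12, 10],
]
assert find_irradiation_point(frame) == (3, 1)
assert find_irradiation_point([[10, 20], [30, 40]]) is None  # not recognized
```

When `None` is returned, steps S103 and S106 would simply be repeated on the next captured frame, as the flowchart describes.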
  • In step S112, the operator information acquisition unit 13 of the information processing apparatus 1A-1 acquires information on the operator 7A who operates the irradiation point P1 with the laser pointer 2A.
  • Specifically, the operator information acquisition unit 13 analyzes the captured image of the projection image captured by the camera 5A, specifies the operator based on the shape and color of the visible / invisible light marker emitted from the laser pointer 2A, or based on information embedded in the visible / invisible light marker, and thereby acquires the operator information.
  • In step S115, the three-dimensional position calculation unit 14 of the information processing apparatus 1A-1 calculates the three-dimensional position (relative positional relationship) of the laser pointer 2A with respect to the projection image (screen S1). Specifically, the three-dimensional position calculation unit 14 analyzes the captured image of the projection image captured by the camera 5A, and calculates the three-dimensional position of the laser pointer 2A based on the shape, size, and distortion of the visible / invisible light marker irradiated from the laser pointer 2A.
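The use of the marker's apparent size in step S115 can be illustrated with a simple pinhole-camera model: the farther the laser pointer is from the screen, the smaller its projected marker appears in the captured image. The function name, parameters, and the size-only model below are illustrative assumptions; the disclosed calculation may additionally use the marker's shape, inclination, and distortion.

```python
def pointer_position_from_marker(marker_center_px, marker_width_px,
                                 marker_width_m, focal_px, image_center_px):
    """Estimate the pointer's position relative to the camera from the
    apparent size of its projected marker (pinhole model sketch):
    a larger apparent marker means a closer pointer."""
    cx, cy = marker_center_px
    icx, icy = image_center_px
    depth = focal_px * marker_width_m / marker_width_px   # distance in metres
    x = (cx - icx) * depth / focal_px                     # lateral offset
    y = (cy - icy) * depth / focal_px                     # vertical offset
    return (x, y, depth)

# Illustrative numbers: a 0.10 m marker appearing 50 px wide with a
# 1000 px focal length puts the pointer about 2.0 m away.
x, y, z = pointer_position_from_marker((1500, 500), 50, 0.10, 1000.0, (960, 540))
assert abs(z - 2.0) < 1e-9
```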
  • In step S118, the transmission unit 15A of the information processing apparatus 1A-1 transmits the content data acquired from the content display apparatus 4A (specifically, at least the currently projected image data), the position coordinates of the irradiation point P1, and the operator information to the information processing apparatus 1B-1.
  • In step S121, the output control unit 16 of the information processing apparatus 1A-1 controls the imaging direction of the camera 6A and the directivity of the microphone 8A so that they are directed toward the operator 7A, based on the three-dimensional position calculated by the three-dimensional position calculation unit 14.
  • The captured image captured by the camera 6A and the audio data collected by the microphone 8A are also transmitted from the transmission unit 15A of the information processing apparatus 1A-1 to the information processing apparatus 1B-1.
  • In step S124, on the remote side, the information processing apparatus 1B-1 generates an image signal in which the remote irradiation point P1′ and the operator information are superimposed on the content data transmitted from the information processing apparatus 1A-1 (the same image as that currently projected on the ROOM A side).
  • At this time, the remote irradiation point P1′ is superimposed at the same position as the coordinate position of the irradiation point P1.
  • Image data for projection based on the image signal generated in this way is transmitted to the projector 3B and projected onto the screen S2.
  • Thereby, the remote irradiation point P1′ on the projected image on the screen S2 in the remote ROOM B is displayed with the same locus as the irradiation point P1 irradiated onto the screen S1 on the ROOM A side. Further, since the operator information is superimposed and displayed at a position corresponding to the remote irradiation point P1′, the viewer 7B can intuitively grasp who on the ROOM A side is operating the remote irradiation point P1′.
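Superimposing the remote irradiation point P1′ "at the same position" implies mapping the recognized coordinates from the local projection resolution to the remote one, so that P1′ traces the same locus on screen S2. A minimal sketch, assuming simple proportional scaling between the two resolutions (the function and parameter names are illustrative):

```python
def map_to_remote(point_local, local_size, remote_size):
    """Map an irradiation-point coordinate from the local projection
    resolution to the remote projection resolution by proportional
    scaling, so the remote irradiation point P1' follows the same
    locus as the local irradiation point P1."""
    x, y = point_local
    lw, lh = local_size
    rw, rh = remote_size
    return (round(x * rw / lw), round(y * rh / lh))

# A point at the centre of a 1280x720 local image maps to the centre
# of a 1920x1080 remote image:
assert map_to_remote((640, 360), (1280, 720), (1920, 1080)) == (960, 540)
```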
  • The position corresponding to the remote irradiation point P1′ is, for example, in the vicinity of the remote irradiation point P1′; by displaying the operator information near the remote irradiation point P1′, the viewer can intuitively grasp who is operating it. On the other hand, when the operator information is displayed near the remote irradiation point P1′, the original content (projected image) indicated by the remote irradiation point P1′ may become difficult to see. Therefore, for example, as shown on the left side of FIG. 5, operator information 32 such as the operator's profile image, name, location, and the last utterance sentence recognized by voice recognition may be superimposed and displayed at a position away from the cursor 30 of the remote irradiation point P1′ displayed on the image projected on the screen S2, linked to the cursor 30 by a solid line. Alternatively, the operator information 32 linked by a solid line drawn from the cursor 30 of the remote irradiation point P1′ may be displayed for a few seconds in a dedicated area 35 provided around the reduced content portion.
  • The cursor 30 (remote irradiation point P1′) shown in FIG. 5 is included in the image projected from the projector 3B onto the screen S2, but the method of clearly indicating the remote irradiation point P1′ is not limited to this.
  • For example, the information processing apparatus 1B-1 may perform control so that the screen S2 is actually irradiated with visible laser light according to the coordinate position of the irradiation point P1.
  • The operation processing according to the present embodiment has been described above.
  • Note that the operation processing according to the present embodiment is not limited to the example shown in FIG. 4.
  • For example, the three-dimensional position of the laser pointer 2A with respect to the projection image (screen S1) may be transmitted to the remote side, and a silhouette image of the operator (an example of a whole-body image of a person) may be displayed on the remote side.
  • FIG. 6 is a flowchart showing an operation process when silhouette display is performed on the remote side in the remote cooperation system according to the first embodiment.
  • In steps S100 to S115 shown in FIG. 6, processing similar to that shown in FIG. 4 is performed.
  • In step S119, the transmission unit 15A of the information processing apparatus 1A-1 transmits the content data acquired from the content display apparatus 4A, the position coordinates of the irradiation point P1, the operator information, and the three-dimensional position to the information processing apparatus 1B-1.
  • In step S121, the same processing as that shown in FIG. 4 is performed.
  • In step S125, on the remote side, the information processing apparatus 1B-1 generates an image signal in which the remote irradiation point P1′, the operator information, and a silhouette image corresponding to the three-dimensional position are superimposed on the content data transmitted from the information processing apparatus 1A-1. Image data for projection based on the image signal generated in this way is transmitted to the projector 3B and projected onto the screen S2.
  • Thereby, the remote irradiation point P1′ is superimposed on the projection image 36 projected on the screen S2 installed in ROOM B, and a dummy silhouette image 38 linked to the remote irradiation point P1′ is further superimposed.
  • The display position of the silhouette image 38 is adjusted to correspond to the three-dimensional position of the laser pointer 2A (its positional relationship with the screen S1). For example, when the operator 7A irradiates the screen S1 with the laser light (irradiation point P1) from the left side on the ROOM A side, the silhouette image 38 is displayed on the left side of the remote irradiation point P1′. Thereby, the viewer 7B in ROOM B can perform face-to-face communication through the screen with the operator 7A at the remote location (ROOM A).
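The left/right placement of the silhouette image 38 relative to the remote irradiation point P1′ can be sketched as follows. Only the lateral component of the laser pointer's three-dimensional position is used here, and the pixel sizes, margin, and all names are illustrative assumptions, not values from the disclosure.

```python
def place_silhouette(remote_point, pointer_x_m, frame_width,
                     silhouette_width=160, margin=20):
    """Place the dummy silhouette image beside the remote irradiation
    point P1' according to the operator's lateral offset relative to
    the screen centre: a negative offset (operator standing to the left
    of screen S1) yields a silhouette on the left of P1', mirroring the
    real arrangement on the local side."""
    px, py = remote_point
    if pointer_x_m < 0:          # operator stands to the left of the screen
        x = max(0, px - margin - silhouette_width)
    else:                        # operator stands to the right
        x = min(frame_width - silhouette_width, px + margin)
    return {"x": x, "y": py, "side": "left" if pointer_x_m < 0 else "right"}

# Operator irradiates from the left side (lateral offset -0.8 m):
p = place_silhouette((900, 400), -0.8, 1920)
assert p["side"] == "left"
```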
  • Further, an operator information image 37 may be superimposed and displayed (projected) in association with the silhouette image 38.
  • In addition, information on the other members 7A-1 and 7A-2 in ROOM A may be superimposed and displayed (projected) on the projection image 36 projected on the screen S2.
  • Thereby, the viewer 7B can also intuitively grasp who, other than the operator, is present at the remote location (ROOM A) on the other side of the screen.
  • As described above, in the first embodiment, the content (projection image data), the operator information, the coordinate position of the irradiation point P1, and the like are individually transmitted to the remote side and superimposed and displayed (projected) on the remote side.
  • However, the remote cooperation system according to the present disclosure is not limited to this.
  • For example, a second embodiment is also conceivable in which the content (projection image data), the operator information, the coordinate position of the irradiation point P1, and the like are combined on the local side to generate image data (an image signal), which is then transmitted to the remote side.
  • FIG. 8 is a block diagram showing an internal configuration example of each device forming the remote location cooperation system according to the second embodiment.
  • The information processing device 1A-2 and the information processing device 1B-2 according to the second embodiment shown in FIG. 8 differ from the information processing device 1A-1 and the information processing device 1B-1 according to the first embodiment in the following point: the information processing apparatus 1A-2 includes the superimposed image signal generation unit 18 that the information processing apparatus 1B-1 has in the first embodiment.
  • The superimposed image signal generation unit 18 included in the information processing apparatus 1A-2 generates, based on the coordinate position of the irradiation point P1 recognized by the irradiation position recognition unit 12, an image signal in which the remote irradiation point P1′ is superimposed at the corresponding coordinate position on the projection image data.
  • The superimposed image signal generation unit 18 also generates an image signal in which the operator information is superimposed at a position corresponding to the remote irradiation point P1′ on the projection image.
  • The superimposed image signal generated in this way is transmitted from the transmission unit 15A to the information processing apparatus 1B-2. Then, the information processing apparatus 1B-2 transmits the superimposed image signal received by the reception unit 17B from the transmission / reception unit 11B to the projector 3B installed in ROOM B, and causes the projector 3B to project it onto the screen S2.
  • As described above, the remote cooperation system can also generate, on the local side, an image in which the remote irradiation point P1′ and the operator information are superimposed on the projection image, and transmit the generated superimposed image signal to the remote side.
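Generating the superimposed image signal on the local side, as in the second embodiment, can be sketched as stamping a cursor into a copy of the projection frame before transmission. The cross-shaped cursor, the grayscale list-of-rows frame, and the function name are illustrative assumptions; the point is assumed to lie inside the frame.

```python
def composite_superimposed_frame(frame, point, arm=2, value=255):
    """Stamp a cross-shaped cursor (the remote irradiation point P1')
    into a grayscale frame given as a list of rows. A copy is returned
    so the original projection image data is left untouched."""
    h, w = len(frame), len(frame[0])
    out = [row[:] for row in frame]
    x, y = point
    for d in range(-arm, arm + 1):
        if 0 <= x + d < w:
            out[y][x + d] = value   # horizontal arm of the cursor
        if 0 <= y + d < h:
            out[y + d][x] = value   # vertical arm of the cursor
    return out

frame = [[0] * 7 for _ in range(7)]
out = composite_superimposed_frame(frame, (3, 3))
assert out[3][1] == out[1][3] == 255   # cursor arms are drawn
assert frame[3][3] == 0                # source frame is unmodified
```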
  • In the embodiments described above, the case of a system configuration in which the information processing devices (communication devices) 1A and 1B (see FIG. 1) and the information processing devices 1A to 1C (see FIG. 2) are mutually connected via a network by P2P (peer-to-peer) has been described.
  • However, the system configuration of the remote cooperation system according to the present disclosure is not limited to the model connected by P2P, and may be a system configuration connected in a server-client manner as shown in FIG. 9, for example.
  • FIG. 9 is a diagram for explaining another system configuration example of the remote cooperation system according to the present disclosure.
  • The remote cooperation system according to the present embodiment may have a system configuration in which a plurality of information processing apparatuses 1A to 1C are each connected to a server (communication server) 100.
  • Variations of the internal configurations of the devices forming the remote cooperation system when such a system configuration is applied will be described below as the third to seventh embodiments.
  • Here, a case of a server-client system in which a plurality of information processing apparatuses 1A and 1B are connected to the server 100 will be described.
  • FIG. 10 is a block diagram illustrating an internal configuration example of each device forming the remote location cooperation system according to the third embodiment.
  • The information processing device 1A-3 and the information processing device 1B-3 according to the third embodiment shown in FIG. 10 differ from the information processing device 1A-1 and the information processing device 1B-1 according to the first embodiment in the following points. That is, the transmission unit 15A of the information processing apparatus 1A-3 transmits data to the server 100-1, the reception unit 17A receives data from the server 100-1, the reception unit 17B of the information processing apparatus 1B-3 receives data from the server 100-1, and the transmission unit 15B transmits data to the server 100-1.
  • The server 100-1 includes a transfer unit 110, which transfers the data transmitted from the transmission unit 15A of the information processing device 1A-3 as it is to the information processing device 1B-3, and transfers the data transmitted from the transmission unit 15B of the information processing device 1B-3 as it is to the information processing apparatus 1A-3.
  • In this way, the information processing apparatuses 1A-3 and 1B-3 can realize remote cooperation through the server 100-1.
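The transfer unit 110 behaves as a pass-through relay. A minimal in-memory sketch, assuming exactly two clients and using queues in place of real network sockets (the client identifiers and method names are illustrative, not part of the disclosure):

```python
import queue

class TransferUnit:
    """Sketch of the transfer unit 110: each message received from one
    information processing apparatus is forwarded unchanged ("as it is")
    to the other apparatus."""

    def __init__(self):
        self.outboxes = {"1A-3": queue.Queue(), "1B-3": queue.Queue()}

    def transfer(self, sender, message):
        # Forward the message unmodified to the peer of the sender.
        peer = "1B-3" if sender == "1A-3" else "1A-3"
        self.outboxes[peer].put(message)

    def receive(self, client):
        return self.outboxes[client].get_nowait()

server = TransferUnit()
server.transfer("1A-3", {"irradiation_point": (640, 360), "operator": "7A"})
assert server.receive("1B-3") == {"irradiation_point": (640, 360), "operator": "7A"}
```

A production relay would of course sit behind sockets or an HTTP endpoint; the point of the sketch is only that the server stores no state and performs no image processing in the third embodiment.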
  • FIG. 11 is a block diagram showing an example of the internal configuration of each device forming the remote location cooperation system according to the fourth embodiment.
  • The information processing device 1A-4 and the information processing device 1B-4 according to the fourth embodiment shown in FIG. 11 differ from the information processing device 1A-3 and the information processing device 1B-3 according to the third embodiment in the following point: the information processing apparatus 1A-4 includes the superimposed image signal generation unit 18 that the information processing apparatus 1B-3 has in the third embodiment.
  • As in the second embodiment, the superimposed image signal generation unit 18 included in the information processing apparatus 1A-4 generates, based on the coordinate position of the irradiation point P1 recognized by the irradiation position recognition unit 12, an image signal in which the remote irradiation point P1′ is superimposed at the corresponding coordinate position on the projection image data. The superimposed image signal generation unit 18 also generates an image signal in which the operator information is superimposed at a position corresponding to the remote irradiation point P1′ on the projection image.
  • The superimposed image signal generated in this way is transmitted from the transmission unit 15A to the information processing apparatus 1B-4 via the server 100-1. Then, the information processing apparatus 1B-4 transmits the superimposed image signal received by the reception unit 17B from the transmission / reception unit 11B to the projector 3B installed in ROOM B, and causes the projector 3B to project it onto the screen S2.
  • As described above, even in the case of the server-client connection, the remote cooperation system can generate, on the local side, an image in which the remote irradiation point P1′ and the operator information are superimposed on the projection image, and transmit the superimposed image signal thus generated to the remote side.
  • In the third and fourth embodiments described above, the server 100-1 merely mediates the transmission and reception of data between the information processing apparatuses. In the fifth embodiment, in contrast, the server 100-2 generates the superimposed image signal.
  • FIG. 12 is a block diagram showing an example of the internal configuration of each device forming the remote location cooperation system according to the fifth embodiment.
  • The information processing apparatuses 1A-5 and 1B-5 and the server 100-2 according to the fifth embodiment shown in FIG. 12 differ from the internal configurations according to the fourth embodiment shown in FIG. 11 in the following point: the server 100-2 includes a transmission / reception unit 111 and a superimposed image signal generation unit 120 having the same function as the superimposed image signal generation unit 18 included in the information processing apparatus 1A-5.
  • The superimposed image signal generation unit 120 included in the server 100-2 generates, based on the coordinate position of the irradiation point P1 transmitted from the information processing apparatus 1A-5, an image signal in which the remote irradiation point P1′ is superimposed at the corresponding coordinate position on the projection image data. In addition, the superimposed image signal generation unit 120 generates an image signal in which the operator information is superimposed at a position corresponding to the remote irradiation point P1′ on the projection image.
  • The superimposed image signal generated in this way is transmitted from the transmission / reception unit 111 to the information processing apparatus 1B-5. Then, the information processing apparatus 1B-5 transmits the superimposed image signal received by the reception unit 17B from the transmission / reception unit 11B to the projector 3B installed in ROOM B, and causes the projector 3B to project it onto the screen S2.
  • As described above, in the case of the server-client connection, the remote cooperation system can also generate, on the server side, an image in which the remote irradiation point P1′ and the operator information are superimposed on the projection image, and transmit the superimposed image signal thus generated to the remote side.
  • In the embodiments described above, the content output processing is performed by the local output control unit 16; in the sixth embodiment, the server 100-3 also performs the content output processing.
  • Here, the content output processing is, for example, processing for generating projection image data (a display signal) based on the content data acquired from the content display device 4A.
  • The sixth embodiment will be specifically described below with reference to FIG. 13.
  • FIG. 13 is a block diagram showing an example of the internal configuration of each device forming the remote location cooperation system according to the sixth embodiment.
  • The information processing apparatuses 1A-6 and 1B-6 and the server 100-3 according to the sixth embodiment shown in FIG. 13 differ from the internal configurations according to the fifth embodiment shown in FIG. 12 in the following point: the server 100-3 includes, in addition to the transmission / reception unit 111 and the superimposed image signal generation unit 120, an output control unit 130 having the same function as the output control unit 16 included in the information processing apparatus 1A-5.
  • The transmission / reception unit 111 of the server 100-3 receives content data from the content display device 4A installed in ROOM A via the information processing device 1A-6, or from a content server on the network.
  • The transmission / reception unit 111 of the server 100-3 then transmits the projection image data (display signal) generated by the output control unit 130 based on the acquired content data to the information processing apparatuses 1A-6 and 1B-6 on the local side and the remote side. Thereby, the same image data is projected from the projectors 3A and 3B onto the screens S1 and S2 on the local side and the remote side, respectively.
  • The transmission unit 15A of the information processing apparatus 1A-6 transmits the coordinate position of the irradiation point P1, the operator information, and the three-dimensional position of the laser pointer 2A to the server 100-3.
  • The superimposed image signal generation unit 120 of the server 100-3 generates, based on the coordinate position of the irradiation point P1 received from the information processing apparatus 1A-6, an image signal in which the remote irradiation point P1′ is superimposed, at the coordinate position corresponding to the coordinate position of the irradiation point P1, on the projection image data output from the output control unit 130.
  • The superimposed image signal generation unit 120 also generates an image signal in which the operator information is superimposed at a position corresponding to the remote irradiation point P1′ on the projection image.
  • The superimposed image signal generated in this way is transmitted from the transmission / reception unit 111 to the information processing apparatus 1B-6. Then, the information processing apparatus 1B-6 transmits the superimposed image signal received by the reception unit 17B from the transmission / reception unit 11B to the projector 3B installed in ROOM B, and causes the projector 3B to project it onto the screen S2.
  • Further, based on the three-dimensional position of the laser pointer 2A received from the information processing apparatus 1A-6, the output control unit 130 of the server 100-3 transmits, to the information processing apparatus 1A-6, a control signal for controlling the imaging direction of the camera 6A in ROOM A and the directivity of the microphone 8A so that they are directed toward the operator 7A.
  • As described above, in the sixth embodiment, the content output processing is performed on the server side, so that the processing burden on the local side can be reduced.
  • Next, a seventh embodiment will be described with reference to FIG. 14.
  • In the seventh embodiment, most of the processing is performed on the server side, and the local side performs only the processing of transmitting a visible light / invisible light captured image for recognizing the coordinate position of the irradiation point P1 to the server side.
  • FIG. 14 is a block diagram showing an example of the internal configuration of each device forming the remote cooperation system according to the seventh embodiment.
  • The information processing apparatuses 1A-7 and 1B-7 and the server 100-4 according to the seventh embodiment shown in FIG. 14 differ from the internal configurations according to the sixth embodiment shown in FIG. 13 in the following point: the server 100-4 includes an irradiation position recognition unit 140, an operator information acquisition unit 150, and a three-dimensional position calculation unit 160 having the same functions as the irradiation position recognition unit 12, the operator information acquisition unit 13, and the three-dimensional position calculation unit 14 included in the information processing apparatus 1A-6.
  • The other configurations of the server 100-4 and the processing contents of the transmission / reception unit 111, the superimposed image signal generation unit 120, and the output control unit 130 are the same as those in the sixth embodiment.
  • In the seventh embodiment, the information processing apparatus 1A-7 continuously transmits the captured image of the projection image captured by the camera 5A to the server 100-4, and the irradiation position recognition unit 140 of the server 100-4 recognizes the coordinate position of the irradiation point P1 on the projection image.
  • The operator information acquisition unit 150 of the server 100-4 also specifies the operator and acquires the operator information based on the captured image transmitted from the information processing apparatus 1A-7.
  • The coordinate position of the irradiation point P1 and the operator information are output to the superimposed image signal generation unit 120 and used to generate the superimposed image signal.
  • The three-dimensional position calculation unit 160 of the server 100-4 calculates the three-dimensional position of the laser pointer 2A based on the captured image transmitted from the information processing apparatus 1A-7.
  • The calculated three-dimensional position is output to the output control unit 130 and used for controlling the camera 6A and the microphone 8A on the local side, or is output to the superimposed image signal generation unit 120 and used to determine the placement of the silhouette image superimposed on the projection image.
  • As described above, in the seventh embodiment, most of the processing, such as the processing of recognizing the coordinate position of the irradiation point P1 based on the captured image of the projection image projected on the local screen S1 and the processing of acquiring the operator information, is performed on the server side, so that the processing load on the local side can be further reduced.
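The seventh-embodiment division of labour can be sketched end to end: the local side merely uploads the captured frame, while the server recognizes the irradiation point and produces the superimposed image for the remote side. The brightest-pixel recognition and single-pixel stamping below are simplified stand-ins for units 140 and 120, and all names, the frame format, and the threshold are illustrative assumptions.

```python
def server_side_pipeline(captured_frame, projection_frame, threshold=240):
    """Server-side sketch: recognize the irradiation point in the
    uploaded captured frame (stand-in for unit 140), then superimpose
    the remote irradiation point P1' onto a copy of the projection
    image data (stand-in for unit 120). Returns None while no
    irradiation point is recognized."""
    # Irradiation position recognition: brightest pixel above threshold.
    point, best = None, threshold
    for y, row in enumerate(captured_frame):
        for x, v in enumerate(row):
            if v > best:
                best, point = v, (x, y)
    if point is None:
        return None  # wait for the next uploaded frame
    # Superimposed image signal generation: mark P1' in a copy.
    out = [row[:] for row in projection_frame]
    out[point[1]][point[0]] = 255
    return out

captured = [[0, 0, 0], [0, 250, 0], [0, 0, 0]]
content = [[7] * 3 for _ in range(3)]
result = server_side_pipeline(captured, content)
assert result[1][1] == 255 and content[1][1] == 7
```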
  • It is also possible to create a computer program for causing hardware such as a CPU, a ROM, and a RAM incorporated in the information processing apparatus 1 and the server 100 to exhibit the functions of the information processing apparatus 1 and the server 100 described above.
  • A computer-readable storage medium storing the computer program is also provided.
  • For example, steps S112 and S115 may be processed in parallel or in the reverse order.
  • Similarly, steps S121, S118, and S124 may be processed in parallel.
  • In the embodiments described above, a superimposed image (an image in which the remote irradiation point P1′ and the operator information are superimposed on the content) is projected onto the screen S2 from the projector 3B (an example of a display device).
  • However, the display output on the remote side is not necessarily limited to projection.
  • For example, the superimposed image may be displayed and output on the display screen of a television device or a PC (personal computer).
  • In addition, the present technology can also be configured as follows.
  • (1) An information processing apparatus including: a recognition unit that recognizes a position of an irradiation point by a laser pointer with respect to a projection image on a local side; an acquisition unit that acquires information on an operator of the laser pointer; and a first transmission unit that transmits the position of the irradiation point and the information on the operator to a remote side in order to display, on a display device installed on the remote side, the projection image, a remote irradiation point located at a coordinate position corresponding to the irradiation point, and the information on the operator.
  • (2) The information processing apparatus according to (1), wherein the acquisition unit acquires the information on the operator by analyzing a non-visible light / visible light marker irradiated onto the projection image by the laser pointer, based on a captured image obtained by capturing the projection image.
  • (5) The information processing apparatus according to any one of (1) to (4), further including a calculation unit that calculates a three-dimensional position of the laser pointer with respect to the projection image by analyzing the non-visible light / visible light marker irradiated onto the projection image by the laser pointer, based on a captured image obtained by capturing the projection image.
  • (6) The information processing apparatus according to (5), wherein the calculation unit calculates the three-dimensional position of the laser pointer based on at least one of a shape, a size, an inclination, and a distortion of the non-visible light / visible light marker.
  • (7) The information processing apparatus according to (5) or (6), further including a second transmission unit that transmits an imaging control signal to a local imaging device so as to capture an image of the operator of the laser pointer according to the three-dimensional position calculated by the calculation unit.
  • (8) The information processing apparatus according to any one of (5) to (7), further including a third transmission unit that transmits a sound collection control signal to a directional sound collection unit on the local side so that the sound collection unit is directed toward the operator of the laser pointer according to the three-dimensional position calculated by the calculation unit.
  • (9) The information processing apparatus according to any one of (5) to (8), wherein the first transmission unit transmits the three-dimensional position calculated by the calculation unit to the remote side in order to display a whole-body image of a person at a position corresponding to the three-dimensional position on the projection image on the remote side.
  • A program for causing a computer to function as: a recognition unit that recognizes a position of an irradiation point by a laser pointer with respect to a projection image on a local side; an acquisition unit that acquires information on an operator of the laser pointer; and a first transmission unit that transmits the position of the irradiation point and the information on the operator to a remote side in order to display, on a display device installed on the remote side, the projection image, a remote irradiation point located at a coordinate position corresponding to the irradiation point, and the information on the operator.
  • A storage medium storing a program for causing a computer to function as: a recognition unit that recognizes a position of an irradiation point by a laser pointer with respect to a projection image on a local side; an acquisition unit that acquires information on an operator of the laser pointer; and a first transmission unit that transmits the position of the irradiation point and the information on the operator to a remote side in order to display, on a display device installed on the remote side, the projection image, a remote irradiation point located at a coordinate position corresponding to the irradiation point, and the information on the operator.

Abstract

[Problem] To provide an information processing device, control method, program, and recording medium whereby information associated with an operator operating a laser pointer is clearly shown in correspondence with an indicating point displayed on a remote side, allowing a smooth teleconference connecting a plurality of distant locations. [Solution] An information processing device provided with a recognition unit that recognizes the position of an illuminated point due to a laser pointer on a projected image, an acquisition unit that acquires information associated with the operator of said laser pointer, and a first transmission unit that transmits the position of the illuminated point and the information associated with the operator to a remote side in order to display the following, in addition to the aforementioned projected image, on a display device on the remote side: a remote illuminated point at a coordinate position corresponding to the abovementioned illuminated point; and the information associated with the abovementioned operator.

Description

Information processing apparatus, control method, program, and storage medium
 The present disclosure relates to an information processing apparatus, a control method, a program, and a storage medium.
 従来、複数の遠隔地を結んで双方向の画像および音声による会議を行うテレビ会議(ビデオ会議)システムが利用されていたが、会議室全体を映そうとすると会議参加者の顔が小さくなってしまい、会議参加者の顔を大きく映そうとすると全員の顔を映せないという事情があった。 Conventionally, video conferencing (video conferencing) systems have been used that connect two or more remote locations to perform interactive video and audio conferences. In other words, there was a situation where all members' faces could not be shown when trying to show the faces of participants in the conference.
 また、テレビ会議システムでは、プロジェクタで投影するスクリーンに対して特定の操作者がレーザーポインタで指示して説明している様子を映して遠隔地の会議室に送信するといった利用方法も行われている。 Also, in the video conference system, there is a method of using a method in which a specific operator indicates with a laser pointer on a screen projected by a projector and transmits it to a remote conference room. .
In addition to the videoconferences described above, electronic conferencing systems conducted over a network by an information processing apparatus and a plurality of terminal devices capable of data communication with that apparatus have also become widespread in recent years. Regarding such electronic conferencing systems, Patent Literature 1 below, for example, discloses an attempt at an optimal layout according to the operation state.
Patent Literature 2 below discloses a communication conference system in which, among a plurality of videoconference terminal devices displaying the same presentation material, coordinate information on the position indicated by the presenter is transmitted to the viewer side, and a predetermined mark is superimposed at the same position on the viewer-side videoconference terminal device.
Patent Literature 1: JP 2011-134124 A
Patent Literature 2: JP 07-162826 A
In a videoconferencing system, however, if both the screen and an operator standing away from the screen are captured, the screen appears small and the projected content becomes hard to see; conversely, if only the screen is captured, the presenter's face is not shown, so viewers cannot tell who is giving the explanation or how.
Moreover, Patent Literature 1 makes no mention of transmitting the operator's appearance, and while Patent Literature 2 does transmit video of the operator, it makes no mention of transmitting information about the operator when the operator is pointing from a distance with a laser pointer or the like.
The present disclosure therefore proposes a novel and improved information processing apparatus, control method, program, and storage medium capable of facilitating a smooth communication conference connecting a plurality of remote locations by clearly displaying information on the operator of a laser pointer in association with the indication point displayed on the remote side.
According to the present disclosure, an information processing apparatus is proposed that includes: a recognition unit that recognizes the position of an irradiation point produced by a laser pointer on a local projection image; an acquisition unit that acquires information on the operator of the laser pointer; and a first transmission unit that transmits the position of the irradiation point and the operator information to the remote side so that a display device installed on the remote side displays, together with the projection image, a remote irradiation point located at a coordinate position corresponding to the irradiation point and the operator information.
According to the present disclosure, a control method is proposed that includes: a step of recognizing the position of an irradiation point produced by a laser pointer on a local projection image; a step of acquiring information on the operator of the laser pointer; and a step of transmitting the position of the irradiation point and the operator information to the remote side so that a display device installed on the remote side displays, together with the projection image, a remote irradiation point located at a coordinate position corresponding to the irradiation point and the operator information.
According to the present disclosure, a program is proposed that causes a computer to function as: a recognition unit that recognizes the position of an irradiation point produced by a laser pointer on a local projection image; an acquisition unit that acquires information on the operator of the laser pointer; and a first transmission unit that transmits the position of the irradiation point and the operator information to the remote side so that a display device installed on the remote side displays, together with the projection image, a remote irradiation point located at a coordinate position corresponding to the irradiation point and the operator information.
According to the present disclosure, a storage medium is proposed that stores a program causing a computer to function as: a recognition unit that recognizes the position of an irradiation point produced by a laser pointer on a local projection image; an acquisition unit that acquires information on the operator of the laser pointer; and a first transmission unit that transmits the position of the irradiation point and the operator information to the remote side so that a display device installed on the remote side displays, together with the projection image, a remote irradiation point located at a coordinate position corresponding to the irradiation point and the operator information.
As described above, according to the present disclosure, information on the operator of a laser pointer is clearly displayed in association with the indication point displayed on the remote side, making it possible to smoothly conduct a communication conference connecting a plurality of remote locations.
FIG. 1 is a diagram illustrating an overview of a remote cooperation system according to an embodiment of the present disclosure.
FIG. 2 is a diagram illustrating a case where the remote cooperation system according to an embodiment of the present disclosure is applied to a communication conference connecting three remote locations.
FIG. 3 is a block diagram showing an example internal configuration of each information processing apparatus forming the remote cooperation system according to the first embodiment.
FIG. 4 is a diagram illustrating operation processing of the remote cooperation system according to the first embodiment.
FIG. 5 is a diagram illustrating a display example of operator information displayed on an image projected on the remote-side screen.
FIG. 6 is a flowchart showing operation processing when silhouette display is performed on the remote side in the remote cooperation system according to the first embodiment.
FIG. 7 is a diagram illustrating an example of superimposed display of a silhouette image.
FIG. 8 is a block diagram showing an example internal configuration of each apparatus forming the remote cooperation system according to the second embodiment.
FIG. 9 is a diagram illustrating another system configuration example of the remote cooperation system according to the present disclosure.
FIG. 10 is a block diagram showing an example internal configuration of each apparatus forming the remote cooperation system according to the third embodiment.
FIG. 11 is a block diagram showing an example internal configuration of each apparatus forming the remote cooperation system according to the fourth embodiment.
FIG. 12 is a block diagram showing an example internal configuration of each apparatus forming the remote cooperation system according to the fifth embodiment.
FIG. 13 is a block diagram showing an example internal configuration of each apparatus forming the remote cooperation system according to the sixth embodiment.
FIG. 14 is a block diagram showing an example internal configuration of each apparatus forming the remote cooperation system according to the seventh embodiment.
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. Note that, in this specification and the drawings, structural elements having substantially the same function and configuration are denoted by the same reference numerals, and redundant description is omitted.
The description proceeds in the following order.
 1. Overview of a remote cooperation system according to an embodiment of the present disclosure
 2. Embodiments
  2-1. First embodiment
  2-2. Second embodiment
  2-3. Third embodiment
  2-4. Fourth embodiment
  2-5. Fifth embodiment
  2-6. Sixth embodiment
  2-7. Seventh embodiment
 3. Summary
<<1. Overview of a remote cooperation system according to an embodiment of the present disclosure>>
First, an overview of a remote cooperation system according to an embodiment of the present disclosure will be described with reference to FIG. 1. As shown in FIG. 1, the remote cooperation system according to an embodiment of the present disclosure is formed by a plurality of information processing apparatuses 1A and 1B installed at a plurality of remote locations, and the information processing apparatuses (communication apparatuses) 1A and 1B are connected to each other via a network.
The information processing apparatus 1A installed in ROOM A connects by wire or wirelessly to the other devices installed in ROOM A. Specifically, the information processing apparatus 1A connects to a projector 3A, a content display device 4A, a camera 5A that captures the image projected on a screen S1, a camera 6A that captures an operator 7A of a laser pointer 2A, a microphone (hereinafter referred to as a mic) 8A, and a speaker 9A. Note that the system configuration shown in FIG. 1 is an example: for instance, the mic 8A and the camera 6A may be integrated, the content display device 4A and the information processing apparatus 1A may be integrated, or the camera 5A may be built into the projector 3A.
The information processing apparatus 1A acquires content from the content display device 4A and transmits projection image data (a display signal) to the projector 3A. The content is, for example, charts, text, various other graphic images, maps, websites, 3D objects, and the like, and is hereinafter also referred to as projection image data. The content display device 4A is a storage device that stores content, and may be a desktop PC, a tablet terminal, a smartphone, or the like in addition to the notebook PC shown in FIG. 1.
The projector 3A projects the projection image data transmitted from the information processing apparatus 1A onto the screen S1 under the control of the information processing apparatus 1A.
The laser pointer 2A irradiates the screen S1 with a visible laser beam and an invisible-light marker at the same position, or with a visible-light marker alone. The visible laser beam and the visible-light marker are a laser beam and a marker image produced with visible light perceptible to the human eye. The invisible-light marker is a marker image produced with invisible light, such as infrared or ultraviolet light, that is imperceptible to the human eye.
The invisible-light/visible-light marker may be any of various figures such as a star or a cross, or a one-dimensional or two-dimensional barcode in which specific information is embedded. The operator 7A gives a presentation while indicating an arbitrary location on the screen S1 with the visible laser beam or visible-light marker emitted from the laser pointer 2A.
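As a toy illustration of how specific information might be embedded in such a marker (the disclosure does not specify the marker format; the bit layout below is an assumption, with no synchronization pattern or error correction), a small user ID can be mapped to a row of on/off cells and read back:

```python
def encode_marker(user_id: int, bits: int = 8) -> list:
    """Encode a small user ID as a row of on/off cells (LSB first).

    Toy stand-in for the 1-D barcode marker described above; a real
    marker would add sync patterns and error correction.
    """
    return [(user_id >> i) & 1 for i in range(bits)]


def decode_marker(cells: list) -> int:
    """Recover the embedded user ID from the detected cell pattern."""
    return sum(bit << i for i, bit in enumerate(cells))
```

For example, `decode_marker(encode_marker(90))` returns `90`, round-tripping the embedded ID.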
The camera 5A captures the projection image projected on the screen S1 and outputs the captured image to the information processing apparatus 1A. The camera 5A has an invisible-light imaging function for capturing invisible light, as in an infrared or ultraviolet camera, or a visible-light imaging function for capturing visible light.
Based on the visible-light/invisible-light captured image of the projection image captured by the camera 5A, the information processing apparatus 1A recognizes the coordinate position of an irradiation point P1 produced by the laser pointer 2A (the point on the projection image indicated by the visible laser beam, visible-light marker, or invisible-light marker).
The information processing apparatus 1A can also calculate the three-dimensional position of the laser pointer 2A. Specifically, based on the visible-light/invisible-light captured image of the projection image captured by the camera 5A, the information processing apparatus 1A calculates the three-dimensional position of the laser pointer 2A relative to the projection image from the size, distortion, and other properties of the shape of the visible-light/invisible-light marker emitted from the laser pointer 2A.
The information processing apparatus 1A also acquires information on the operator 7A of the laser pointer 2A. The operator information is, for example, a name such as a full name or nickname, a job title, an affiliation, a current location, or a pre-registered face image or avatar image. Specifically, the information processing apparatus 1A analyzes the visible-light/invisible-light captured image of the projection image captured by the camera 5A, reads out the shape, color, or embedded information (user ID) of the visible-light/invisible-light marker emitted from the laser pointer 2A, and identifies the operator 7A of the laser pointer 2A.
Based on the calculated three-dimensional position, the information processing apparatus 1A may also control the movement (pan, tilt, and zoom) of a camera 6A, separate from the camera 5A that captures the projection image, so that the operator 7A of the laser pointer 2A falls within the angle of view of the camera 6A. This allows the information processing apparatus 1A to acquire real-time video of the operator 7A of the laser pointer 2A. Further, based on the calculated three-dimensional position, the information processing apparatus 1A can point the highly directional mic 8A toward the operator 7A of the laser pointer 2A and pick up the speech (audio data) of the operator who irradiated the laser pointer.
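The camera-steering step above amounts to computing pan/tilt angles from the operator's calculated three-dimensional position. The following is a minimal geometric sketch, assuming a camera at the origin and an x-right, y-up, z-forward coordinate convention (neither is specified in the disclosure):

```python
import math


def pan_tilt_toward(operator_pos, camera_pos=(0.0, 0.0, 0.0)):
    """Pan/tilt angles (degrees) that aim a camera at the operator.

    operator_pos: (x, y, z) position of the operator, e.g. derived from
    the three-dimensional position calculated for the laser pointer.
    """
    dx = operator_pos[0] - camera_pos[0]
    dy = operator_pos[1] - camera_pos[1]
    dz = operator_pos[2] - camera_pos[2]
    pan = math.degrees(math.atan2(dx, dz))                   # rotate left/right
    tilt = math.degrees(math.atan2(dy, math.hypot(dx, dz)))  # rotate up/down
    return pan, tilt
```

An operator one meter to the right and one meter ahead of the camera yields a pan of 45 degrees and a tilt of 0; the same mic-steering logic could reuse these angles.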
The local-side information processing apparatus 1A then transmits the projection image data (display signal), the coordinate position of the irradiation point P1, the information on the operator 7A, the real-time video of the operator 7A captured by the camera 6A, the audio data picked up by the mic 8A, and the like to the remote-side information processing apparatus 1B.
Meanwhile, the information processing apparatus 1B installed in the remote-side ROOM B connects by wire or wirelessly to the other devices installed in ROOM B. Specifically, the information processing apparatus 1B connects to a projector 3B (an example of a display device), a camera 6B that captures a viewer 7B, a mic 8B, and a speaker 9B.
The information processing apparatus 1B transmits the projection image data received from the information processing apparatus 1A to the projector 3B, which projects it onto a screen S2. At this time, the information processing apparatus 1B performs control so that a remote irradiation point P1' is superimposed on the projection image projected on the screen S2, at the coordinate position corresponding to the coordinate position of the irradiation point P1 received from the information processing apparatus 1A. As a result, the projection image on the remote-side screen S2 can display the same trajectory as the irradiation point P1 irradiated on the projection image on the ROOM A-side screen S1.
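The correspondence between the local irradiation point P1 and the remote irradiation point P1' can be sketched as a normalized-coordinate mapping between the two projection resolutions. This is a minimal sketch; the disclosure does not specify how the coordinate correspondence is computed:

```python
def map_to_remote(point_local, local_size, remote_size):
    """Map an irradiation point from the local projection's pixel space
    to the remote display's pixel space, so that the remote point P1'
    traces the same trajectory as the local point P1.
    """
    x, y = point_local
    local_w, local_h = local_size
    remote_w, remote_h = remote_size
    return round(x * remote_w / local_w), round(y * remote_h / local_h)
```

For example, a point at the center of a 1920x1080 local projection maps to the center of a 1280x720 remote projection, so the relative trajectory is preserved across differing resolutions.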
As shown in FIG. 1, the information processing apparatus 1B also superimposes information on the operator 7A of the laser pointer 2A (for example, a name and face image) in association with the remote irradiation point P1'. This allows the viewer 7B to intuitively grasp who on the ROOM A side is operating the remote irradiation point P1' superimposed on the screen S2 and giving the explanation. Note that the information processing apparatus 1B may instead superimpose, as the information on the operator 7A, video of the operator 7A captured in real time by the camera 6A.
When the information processing apparatus 1B receives audio data picked up by the mic 8A from the local-side information processing apparatus 1A, it controls the speaker 9B to play back the data. The information processing apparatus 1B can also transmit the speech (audio data) of the viewer 7B picked up by the mic 8B to the information processing apparatus 1A in ROOM A.
As described above, in the remote cooperation system according to the present embodiment, information on the operator 7A operating the laser pointer 2A is superimposed in association with the remote irradiation point P1' displayed on the remote side. This lets viewers intuitively grasp who is operating the pointer and giving the explanation, enabling a smooth communication conference connecting a plurality of remote locations.
Although FIG. 1 illustrates a communication conference connecting two remote locations as an example, the remote cooperation system according to the present embodiment is not limited to this, and can similarly be applied to a communication conference connecting three remote locations, as shown in FIG. 2, for example.
(Communication conference connecting three remote locations)
FIG. 2 is a diagram for describing a case where the remote cooperation system according to an embodiment of the present disclosure is applied to a communication conference connecting three remote locations. The example shown in FIG. 2 assumes a communication conference connecting three remote locations: ROOM A, ROOM B, and ROOM C. As shown in FIG. 2, the remote cooperation system according to an embodiment of the present disclosure is formed by a plurality of information processing apparatuses 1A, 1B, and 1C installed at a plurality of remote locations, and the information processing apparatuses (communication apparatuses) 1A, 1B, and 1C are connected via a network.
The devices installed in ROOM A and ROOM B are the same as in FIG. 1, so their description is omitted here.
The information processing apparatus 1C installed in ROOM C connects by wire or wirelessly to the other devices installed in ROOM C. Specifically, the information processing apparatus 1C connects to a projector 3C, a content display device 4C, a camera 5C that captures the image projected on a screen S3, a camera 6C that captures an operator 7C of a laser pointer 2C, a mic 8C, and a speaker 9C. Note that the system configuration shown in FIG. 2 is an example: for instance, the mic 8C and the camera 6C may be integrated, the content display device 4C and the information processing apparatus 1C may be integrated, or the camera 5C may be built into the projector 3C.
In the system configuration described above, the information processing apparatus 1A transmits the coordinate position of the irradiation point P1, the information on the operator 7A, and the like to the information processing apparatuses 1B and 1C, which are remote relative to the information processing apparatus 1A. Likewise, the information processing apparatus 1C transmits the coordinate position of an irradiation point P3, information on the operator 7C, and the like to the information processing apparatuses 1B and 1A, which are remote relative to the information processing apparatus 1C.
Then, for example, on the projection image projected on the screen S2 of ROOM B, a remote irradiation point P1' corresponding to the irradiation point P1 and a remote irradiation point P3' corresponding to the irradiation point P3 are superimposed. The remote irradiation point P1' is displayed with the same trajectory as the irradiation point P1 irradiated on the projection image on the ROOM A-side screen S1, and the remote irradiation point P3' is displayed with the same trajectory as the irradiation point P3 irradiated on the projection image on the ROOM C-side screen S3.
Further, information on the operator 7A of the laser pointer 2A is superimposed in association with the remote irradiation point P1', and information on the operator 7C of the laser pointer 2C is superimposed in association with the remote irradiation point P3'. This allows the viewer 7B to intuitively grasp who is operating each of the remote irradiation points P1' and P3' superimposed on the screen S2.
Note that a remote irradiation point P3' corresponding to the irradiation point P3 is superimposed on the projection image projected on the ROOM A screen S1, and a remote irradiation point P1' corresponding to the irradiation point P1 is superimposed on the projection image projected on the ROOM C screen S3.
As described above, the remote cooperation system according to the present embodiment can also be applied to a communication conference connecting three remote locations. Next, example internal configurations of the components forming the remote cooperation system according to the present embodiment will be described with reference to a plurality of embodiments.
<<2. Embodiments>>
<2-1. First embodiment>
(2-1-1. Configuration)
FIG. 3 is a block diagram showing an example internal configuration of the information processing apparatus 1A-1 and the information processing apparatus 1B-1 that form the remote cooperation system according to the first embodiment.
(Information processing apparatus 1A-1)
The information processing apparatus 1A-1 is an example of the information processing apparatus 1A installed in ROOM A described with reference to FIG. 1. As shown in FIG. 3, the information processing apparatus 1A-1 includes a transmission/reception unit 11A, an irradiation position recognition unit 12, an operator information acquisition unit 13, a three-dimensional position calculation unit 14, a transmission unit 15A, an output control unit 16, and a reception unit 17A.
The transmission/reception unit 11A connects by wire or wirelessly to each device installed in ROOM A and has a function of transmitting and receiving data. Specifically, the transmission/reception unit 11A receives content from the content display device 4A, transmits projection image data based on the content to the projector 3A, and receives the captured image of the projection image from the camera 5A. The transmission/reception unit 11A also transmits an imaging control signal to the camera 6A that captures the operator 7A, and receives a captured image from the camera 6A. It further transmits a sound-pickup control signal to the mic 8A that picks up the speech of the operator 7A, and receives the picked-up audio data from the mic 8A. In addition, the transmission/reception unit 11A transmits audio data received from the remote-side information processing apparatus 1B-1 to the speaker 9A.
The wireless communication scheme between the transmission/reception unit 11A and the devices installed in ROOM A is not particularly limited; the connection may use, for example, a wireless LAN, Wi-Fi (registered trademark), or Bluetooth (registered trademark).
Based on the visible-light/invisible-light captured image of the projection image captured by the camera 5A, the irradiation position recognition unit 12 recognizes the coordinate position of the irradiation point P1 produced on the projection image by the visible laser beam or the visible-light/invisible-light marker of the laser pointer 2A. Specifically, the irradiation position recognition unit 12 detects the position coordinates of the irradiation point P1 by, for example, detecting the difference between the image being projected by the projector 3A (the projection image data) and the visible-light/invisible-light captured image of the projection. The irradiation position recognition unit 12 can also improve accuracy by additionally analyzing the difference between the visible-light/invisible-light captured image of the frame preceding the currently projected image and that of the currently projected image. The irradiation position recognition unit 12 outputs the recognized coordinate position of the irradiation point P1 to the transmission unit 15A.
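The difference detection performed by the irradiation position recognition unit 12 can be sketched as follows. This is a minimal sketch, assuming grayscale frames as NumPy arrays and a simple brightest-difference-pixel criterion; the actual detection method and threshold are not specified in the disclosure:

```python
import numpy as np


def find_irradiation_point(projected, captured, threshold=50):
    """Locate the irradiation point as the pixel where the captured frame
    departs most from the projected content; return (x, y) or None.
    """
    # Signed difference between what was projected and what the camera saw.
    diff = np.abs(captured.astype(np.int16) - projected.astype(np.int16))
    if diff.ndim == 3:          # collapse color channels to intensity
        diff = diff.sum(axis=2)
    y, x = np.unravel_index(np.argmax(diff), diff.shape)
    if diff[y, x] < threshold:
        return None             # no laser spot detected in this frame
    return int(x), int(y)
```

The same routine applied to consecutive captured frames (previous vs. current) implements the accuracy-improving frame-to-frame difference mentioned above.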
The operator information acquisition unit 13 analyzes the visible/non-visible light captured image of the projected image captured by the camera 5A, reads out the shape and color of the visible/non-visible light marker emitted from the laser pointer 2A and any information embedded in it, identifies the operator 7A, and acquires operator information. Specifically, for example, the operator information acquisition unit 13 extracts the operator information associated with the shape and color of the visible/non-visible light marker from an operator information DB (database, not shown) held by the information processing apparatus 1A-1 or located on the cloud. The operator information acquisition unit 13 can also acquire operator information based on information read from the visible/non-visible light marker itself. The operator information acquisition unit 13 outputs the acquired operator information to the transmission unit 15A.
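A minimal sketch of this lookup, with a hypothetical in-memory table standing in for the operator information DB held by the apparatus or on the cloud (all names and records below are illustrative assumptions):

```python
# Hypothetical operator information DB keyed by marker shape and color.
OPERATOR_DB = {
    ("circle", "red"): {"name": "Operator A", "location": "ROOM A"},
    ("star", "green"): {"name": "Operator B", "location": "ROOM B"},
}

def identify_operator(marker_shape, marker_color, embedded_info=None):
    """Resolve operator information from the marker's shape and color,
    preferring information embedded in the marker itself when present."""
    if embedded_info is not None:
        return embedded_info  # the marker carries the operator info directly
    return OPERATOR_DB.get((marker_shape, marker_color))

print(identify_operator("circle", "red"))
# → {'name': 'Operator A', 'location': 'ROOM A'}
```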
The three-dimensional position calculation unit 14 calculates the three-dimensional position of the laser pointer 2A with respect to the projection image projected onto the screen S1. Specifically, based on the visible/non-visible light captured image of the projected image captured by the camera 5A, the three-dimensional position calculation unit 14 calculates the three-dimensional position (positional relationship) of the laser pointer 2A with respect to the projected image from the size, distortion, and so on of the shape of the visible/non-visible light marker emitted from the laser pointer 2A. The three-dimensional position calculation unit 14 outputs the calculated three-dimensional position to the transmission unit 15A and the output control unit 16.
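How marker size and distortion could map to a three-dimensional position can be illustrated with a simplified sketch. The linear-growth calibration and the foreshortening model below are our assumptions, not part of the specification; a real implementation would recover a full pose of the pointer:

```python
import math

def estimate_pointer_distance(marker_size_on_screen, reference_size,
                              reference_distance):
    """Estimate the pointer-to-screen distance from the apparent size of the
    marker projected on the screen, assuming the marker grows linearly with
    distance and was calibrated once at a known reference distance."""
    return reference_distance * (marker_size_on_screen / reference_size)

def estimate_incidence_angle(marker_width, marker_height):
    """A marker projected obliquely is foreshortened along one axis; the
    width/height ratio gives a rough angle of incidence (0 = head-on)."""
    ratio = min(marker_width, marker_height) / max(marker_width, marker_height)
    return math.degrees(math.acos(ratio))

# A marker twice its calibrated size suggests the pointer is twice as far.
print(estimate_pointer_distance(20.0, 10.0, 1.0))  # → 2.0
# A 2:1 foreshortening corresponds to roughly a 60-degree oblique angle.
print(estimate_incidence_angle(5.0, 10.0))
```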
The transmission unit 15A connects to the remote-side information processing apparatus 1B-1 via a network and transmits data. For example, the transmission unit 15A transmits to the information processing apparatus 1B-1 the coordinate position of the irradiation point P1 recognized by the irradiation position recognition unit 12, the operator information acquired by the operator information acquisition unit 13, the three-dimensional position of the laser pointer 2A (the positional relationship between the projection image and the laser pointer 2A), the projection image data, the sound data collected by the microphone 8A, and so on.
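For illustration only, the data transmitted in one update could be grouped as follows; the field names and types are assumptions for the sketch, not part of the specification:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class LocalToRemoteMessage:
    """Illustrative payload from the local apparatus 1A-1 to the remote
    apparatus 1B-1."""
    irradiation_point: Tuple[int, int]               # coordinate position of P1
    operator_info: dict                              # name, face image, avatar, etc.
    pointer_position_3d: Tuple[float, float, float]  # pointer 2A vs. projection image
    projection_image: bytes                          # image data being projected
    sound_data: Optional[bytes] = None               # collected by microphone 8A

msg = LocalToRemoteMessage(
    irradiation_point=(40, 30),
    operator_info={"name": "Operator A", "location": "ROOM A"},
    pointer_position_3d=(0.5, -0.2, 1.8),
    projection_image=b"...frame bytes...",
)
print(msg.irradiation_point)  # → (40, 30)
```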
The reception unit 17A connects to the remote-side information processing apparatus 1B-1 via the network and receives data. For example, the reception unit 17A receives sound data collected on the remote side.
The output control unit 16 has a function of processing the data to be output from each device installed in ROOM A. Specifically, for example, the output control unit 16 performs content output processing, that is, processing that generates projection image data (a display signal) based on, for example, the content data acquired from the content display device 4A. The output control unit 16 also performs control so that the sound data of the viewer 7B, received from the information processing apparatus 1B-1 via the reception unit 17A, is played back from the speaker 9A.
Based on the three-dimensional position of the laser pointer 2A, the output control unit 16 also causes the transmission/reception unit 11A to transmit a control signal for controlling the movement of the camera 6A so that the camera 6A captures the operator 7A, and a control signal for controlling the direction of directivity of the microphone 8A so that it picks up the speech of the operator 7A. What the three-dimensional position calculation unit 14 calculates is the positional relationship between the projection image (screen S1) and the laser pointer 2A; if the positional relationships among the other devices are known, the output control unit 16 can determine the relative positional relationship between the camera 6A and the operator 7A, and between the microphone 8A and the operator 7A. For example, when the positional relationship between the projector 3A and the screen S1 and that between the camera 6A and the projector 3A are known, and the laser pointer 2A and the operator 7A are in the usual relationship (located in the same place), the output control unit 16 can determine the relative positional relationship between the camera 6A and the operator 7A.
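The chaining of known positional relationships described above can be sketched as follows, using translations only and illustrative offset values (a full implementation would compose rotations as well, and the offsets here are assumptions, not from the specification):

```python
import numpy as np

# Known fixed offsets between the devices in ROOM A (illustrative, in metres).
screen_from_projector = np.array([0.0, 0.0, 2.0])   # projector 3A -> screen S1
projector_from_camera = np.array([-1.0, 0.0, 0.0])  # camera 6A -> projector 3A

def operator_position_from_camera(pointer_from_screen):
    """Chain the known offsets to locate the operator 7A relative to camera 6A,
    assuming the operator stands where the laser pointer 2A is."""
    return projector_from_camera + screen_from_projector + pointer_from_screen

# The pointer was computed to be 1.5 m in front of the screen, 0.5 m to its left.
direction = operator_position_from_camera(np.array([-0.5, 0.0, -1.5]))
print(direction)  # vector toward which camera 6A / microphone 8A can be steered
```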
(Information processing apparatus 1B-1)
The information processing apparatus 1B-1 is an example of the information processing apparatus 1B installed in ROOM B, described with reference to FIG. 1. As illustrated in FIG. 3, the information processing apparatus 1B-1 includes a transmission/reception unit 11B, a transmission unit 15B, a reception unit 17B, and a superimposed image signal generation unit 18.
The reception unit 17B connects to the remote-side information processing apparatus 1A-1 via the network and receives data. For example, the reception unit 17B receives the coordinate position of the irradiation point P1, the operator information, the three-dimensional position of the laser pointer 2A, the projection image data, the collected sound data, and so on. The reception unit 17B outputs the coordinate position of the irradiation point P1, the operator information, the three-dimensional position of the laser pointer 2A, and the content to the superimposed image signal generation unit 18, and outputs the sound data to the transmission/reception unit 11B.
Based on the coordinate position of the irradiation point P1, the superimposed image signal generation unit 18 generates an image signal in which a remote irradiation point P1' is superimposed on the projection image data at the coordinate position corresponding to that of the irradiation point P1. This allows the information processing apparatus 1B-1 to display, on the projected image on the ROOM B side screen S2, the same trajectory as that of the irradiation point P1 on the projected image on the ROOM A side screen S1.
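A minimal sketch of this superimposition, drawing a simple square cursor at the received coordinates (the actual cursor appearance is a design choice not specified here, and the grayscale frame is an illustrative stand-in for the projection image data):

```python
import numpy as np

def superimpose_remote_point(frame, point, size=3):
    """Return a copy of the projection image data with the remote irradiation
    point P1' drawn at the coordinate position received from the local side."""
    out = frame.copy()
    x, y = point
    out[max(0, y - size):y + size + 1, max(0, x - size):x + size + 1] = 255
    return out

frame = np.zeros((120, 160), dtype=np.uint8)   # grayscale projection image
composited = superimpose_remote_point(frame, (40, 30))
print(composited[30, 40])  # → 255
```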
The superimposed image signal generation unit 18 can also generate an image signal in which operator information (a name, face image, avatar image, or the like) is superimposed at a position on the projected image corresponding to the remote irradiation point P1'.
Furthermore, the superimposed image signal generation unit 18 can generate an image signal in which a full-body image of a person is superimposed on the projection image data at a position corresponding to the three-dimensional position information (positional relationship) of the laser pointer 2A with respect to the projection image (screen S1). This makes it appear as if the operator 7A of the remote irradiation point P1' is actually present beyond the screen S2. Concrete examples of the full-body image of a person include a silhouette image, an avatar, a CG (computer graphics) rendering of the operator, and a video of the operator's whole body cut out from footage of the operator. The example described later with reference to FIG. 7 uses a silhouette image as the full-body image of a person.
The superimposed image signal generation unit 18 transmits the generated image signal to the projector 3B via the transmission/reception unit 11B.
The transmission/reception unit 11B connects by wire or wirelessly to each device installed in ROOM B and has a function of transmitting and receiving data. Specifically, the transmission/reception unit 11B transmits projection image data based on the image signal generated by the superimposed image signal generation unit 18 to the projector 3B, and transmits the ROOM A side sound data to the speaker 9B. The transmission/reception unit 11B also receives a captured image of the viewer 7B from the camera 6B, and receives sound data of the viewer 7B's speech from the microphone 8B.
The wireless communication method between the transmission/reception unit 11B and each device installed in ROOM B is not particularly limited; for example, they may be connected via a wireless LAN, Wi-Fi (registered trademark), or Bluetooth (registered trademark).
The transmission unit 15B transmits the speech (sound data) of the viewer 7B collected by the microphone 8B and the captured image of the viewer 7B captured by the camera 6B to the information processing apparatus 1A-1 via the network.
The internal configuration of each device forming the remote cooperation system according to the first embodiment has been described above in detail. Note that the information processing apparatuses 1A-1 and 1B-1 may apply speech recognition to the sound collected from the microphones 8A and 8B, convert it into text, and output the text data to the other side. On the other side, the text of what the user at the remote location is saying can then be projected onto the screen near the remote irradiation point P1' corresponding to the irradiation point P1 that the remote user is operating, or together with that user's avatar image. The information processing apparatuses 1A-1 and 1B-1 may also translate the collected speech and output the translation to the other side. Next, the operation processing of the remote cooperation system according to the first embodiment is described with reference to FIG. 4.
 (2-1-2. Operation processing)
FIG. 4 is a diagram for explaining the operation processing of the remote cooperation system according to the first embodiment. As shown in FIG. 4, first, in step S100, the information processing apparatus 1A-1 starts projecting content data. Specifically, the information processing apparatus 1A-1 transmits projection image data based on the content data to the projector 3A and causes the projector 3A to project it onto the screen S1.
Next, in step S103, the information processing apparatus 1A-1 acquires a captured image of the projected image captured by the camera 5A.
Next, in step S106, the irradiation position recognition unit 12 of the information processing apparatus 1A-1 recognizes the coordinate position of the irradiation point P1 based on the captured image.
Steps S103 and S106 above are repeated until the coordinate position of the irradiation point P1 is recognized (step S109).
When the coordinate position of the irradiation point P1 has been recognized (S109/Yes), in step S112 the operator information acquisition unit 13 of the information processing apparatus 1A-1 acquires information on the operator 7A who is operating the irradiation point P1 with the laser pointer 2A. Specifically, for example, the operator information acquisition unit 13 analyzes the captured image of the projected image captured by the camera 5A, identifies the operator based on the shape and color of the visible/non-visible light marker emitted from the laser pointer 2A or on information embedded in the marker, and acquires the operator information.
Subsequently, in step S115, the three-dimensional position calculation unit 14 of the information processing apparatus 1A-1 calculates the three-dimensional position (relative positional relationship) of the laser pointer 2A with respect to the projected image (screen S1). Specifically, the three-dimensional position calculation unit 14 analyzes the captured image of the projected image captured by the camera 5A, and calculates the three-dimensional position of the laser pointer 2A by analyzing the size, distortion, and so on of the shape of the visible/non-visible light marker emitted from the laser pointer 2A.
Then, in step S118, the transmission unit 15A of the information processing apparatus 1A-1 transmits the content data acquired from the content display device 4A (specifically, at least the currently projected image data), the position coordinates of the irradiation point P1, and the operator information to the information processing apparatus 1B-1.
Next, in step S121, the output control unit 16 of the information processing apparatus 1A-1 controls the imaging direction of the camera 6A and the directivity of the microphone 8A so that they are directed at the operator 7A, based on the three-dimensional position calculated by the three-dimensional position calculation unit 14. The image captured by the camera 6A and the sound data collected by the microphone 8A are also transmitted from the transmission unit 15A of the information processing apparatus 1A-1 to the information processing apparatus 1B-1.
Meanwhile, in step S124 on the remote side, the information processing apparatus 1B-1 generates an image signal in which the remote irradiation point P1' and the operator information are superimposed on the content data transmitted from the information processing apparatus 1A-1 (the same image as that currently projected on the ROOM A side). The remote irradiation point P1' is superimposed at the same position as the coordinate position of the irradiation point P1. Projection image data based on the image signal generated in this way is transmitted to the projector 3B and projected onto the screen S2.
As described above, the remote irradiation point P1' is displayed on the projected image on the screen S2 of ROOM B, the remote side, following the same trajectory as the irradiation point P1 on the ROOM A side screen S1. Moreover, since the operator information is displayed superimposed at a position corresponding to the remote irradiation point P1', the viewer 7B can intuitively grasp who is operating the remote irradiation point P1' on the ROOM A side.
Here, the position corresponding to the remote irradiation point P1' is, for example, in the vicinity of the remote irradiation point P1'; with the operator information displayed near the remote irradiation point P1', the viewer 7B can intuitively grasp who is operating it. However, if the operator information is displayed close to the remote irradiation point P1', the content (projected image) that the remote irradiation point P1' is actually pointing at may become difficult to see. Therefore, for example, as shown on the left of FIG. 5, operator information 32 such as the operator's profile image, name, location (the indication "Place: ROOM A"), and the most recent speech-recognized utterance may be superimposed at a position away from the cursor 30 of the remote irradiation point P1' displayed on the image projected onto the screen S2, linked to the cursor 30 by a solid line.
Alternatively, at timings such as when the irradiation point P1 moves at the remote location (ROOM A) or when a new operator appears, the operator information 32, linked by a solid line to the cursor 30 of the remote irradiation point P1', may be displayed for a few seconds in a dedicated area 35 provided around the content, which is reduced in size, as shown on the right of FIG. 5.
Note that while the cursor 30 (remote irradiation point P1') shown in FIG. 5 is included in the image projected onto the screen S2 from the projector 3B, the method of indicating the remote irradiation point P1' is not limited to this. For example, if a laser irradiation device is installed in ROOM B, the information processing apparatus 1B-1 may indicate the remote irradiation point P1' by performing control so that an actual visible laser beam is irradiated onto the screen S2 according to the coordinate position of the irradiation point P1. This enables a highly realistic presentation on the remote side.
The operation processing according to the present embodiment has been described above. Note that the operation processing according to the present embodiment is not limited to the example shown in FIG. 4. For example, the remote cooperation system according to the present embodiment may transmit the three-dimensional position of the laser pointer 2A with respect to the projected image (screen S1) to the remote side, and display a silhouette image of the operator (an example of a full-body image of a person) on the remote side according to the three-dimensional position. This is described in detail below with reference to FIGS. 6 and 7.
(Silhouette image display)
FIG. 6 is a flowchart showing the operation processing when silhouette display is performed on the remote side in the remote cooperation system according to the first embodiment.
In steps S100 to S115 shown in FIG. 6, the same processing as in the corresponding steps of FIG. 4 is performed.
Next, in step S119, the transmission unit 15A of the information processing apparatus 1A-1 transmits the content data acquired from the content display device 4A, the position coordinates of the irradiation point P1, the operator information, and the three-dimensional position to the information processing apparatus 1B-1.
Next, in step S121, the same processing as in the corresponding step of FIG. 4 is performed.
Meanwhile, in step S125 on the remote side, the information processing apparatus 1B-1 generates an image signal in which the remote irradiation point P1', the operator information, and a silhouette image corresponding to the three-dimensional position are superimposed on the content data transmitted from the information processing apparatus 1A-1. Projection image data based on the image signal generated in this way is transmitted to the projector 3B and projected onto the screen S2.
An example of the superimposed display of the silhouette image is described with reference to FIG. 7. As shown in FIG. 7, the remote irradiation point P1' is superimposed on the projected image 36 projected onto the screen S2 installed in ROOM B, and a dummy silhouette image 38 is further superimposed, linked to the remote irradiation point P1'. The display position of the silhouette image 38 is adjusted to correspond to the three-dimensional position of the laser pointer 2A (its positional relationship with the screen S1). For example, when the operator 7A on the ROOM A side is irradiating the screen S1 with laser light (irradiation point P1) from the left side, the silhouette image 38 is displayed to the left of the remote irradiation point P1', as shown in FIG. 7. This allows the viewer 7B in ROOM B to engage in face-to-face communication through the screen with the operator 7A at the remote location (ROOM A).
Also, as shown in FIG. 7, an operator information image 37 may be superimposed (projected), linked to the silhouette image 38.
Furthermore, information 7A-1 and 7A-2 on the members present in ROOM A may be superimposed (projected) on the projected image 36 projected onto the screen S2. This allows the viewer 7B to intuitively grasp who, besides the operator, is present at the remote location (ROOM A) beyond the screen.
   <2-2. Second Embodiment>
In the first embodiment described above, the content (the image data being projected), the operator information, the coordinate position of the irradiation point P1, and so on are transmitted individually to the remote side and superimposed for display (projection) on the remote side. However, the remote cooperation system according to the present disclosure is not limited to this; a second embodiment is also conceivable in which the content (the image data being projected), the operator information, the coordinate position of the irradiation point P1, and so on are combined on the local side to generate image data (an image signal), which is then transmitted to the remote side. This is described in detail below with reference to FIG. 8.
FIG. 8 is a block diagram showing an example of the internal configuration of each device forming the remote cooperation system according to the second embodiment. Compared with the internal configurations of the information processing apparatuses 1A-1 and 1B-1 according to the first embodiment shown in FIG. 3, the information processing apparatuses 1A-2 and 1B-2 according to the second embodiment shown in FIG. 8 differ in that the superimposed image signal generation unit 18, which the information processing apparatus 1B-1 had, is included in the information processing apparatus 1A-2.
Based on the coordinate position of the irradiation point P1 recognized by the irradiation position recognition unit 12, the superimposed image signal generation unit 18 of the information processing apparatus 1A-2 generates an image signal in which the remote irradiation point P1' is superimposed on the projection image data at the coordinate position corresponding to that of the irradiation point P1. The superimposed image signal generation unit 18 also generates an image signal in which operator information is superimposed at a position on the projected image corresponding to the remote irradiation point P1'.
The superimposed image signal generated in this way is transmitted from the transmission unit 15A to the information processing apparatus 1B-2. The information processing apparatus 1B-2 then transmits the superimposed image signal received by the reception unit 17B from the transmission/reception unit 11B to the projector 3B installed in ROOM B, and causes the projector 3B to project it onto the screen S2.
As described above, the remote cooperation system according to the present embodiment can also generate, on the local side, an image in which the remote irradiation point P1' and the operator information are superimposed on the projected image, and transmit the generated superimposed image signal to the remote side.
   <2-3. Third Embodiment>
The first and second embodiments described above use a system configuration in which the information processing (communication) apparatuses 1A and 1B (see FIG. 1), or the information processing apparatuses 1A to 1C (see FIG. 2), connect to one another via a network in P2P (peer-to-peer) fashion. However, the system configuration of the remote cooperation system according to the present disclosure is not limited to a P2P connection model; for example, it may be a server-client system configuration, as shown in FIG. 9.
FIG. 9 is a diagram for explaining another example system configuration of the remote cooperation system according to the present disclosure. As shown in FIG. 9, the remote cooperation system according to the present embodiment may have a system configuration in which a plurality of information processing apparatuses 1A to 1C each connect to a server (communication server) 100. Variations of the internal configurations of the devices forming a remote cooperation system to which such a system configuration is applied are described below as the third to seventh embodiments. In the third to seventh embodiments, a server-client system in which a plurality of information processing apparatuses 1A and 1B each connect to the server 100 is described.
(Example internal configuration)
FIG. 10 is a block diagram showing an example of the internal configuration of each device forming the remote cooperation system according to the third embodiment. Compared with the internal configurations of the information processing apparatuses 1A-1 and 1B-1 according to the first embodiment shown in FIG. 3, the information processing apparatuses 1A-3 and 1B-3 according to the third embodiment shown in FIG. 10 differ in the following respects: the transmission unit 15A of the information processing apparatus 1A-3 transmits data to the server 100-1, the reception unit 17A receives data from the server 100-1, the reception unit 17B of the information processing apparatus 1B-3 receives data from the server 100-1, and the transmission unit 15B transmits data to the server 100-1.
 サーバ100-1は、転送部110を有し、情報処理装置1A-3の送信部15Aから送信されたデータをそのまま情報処理装置1B-3に転送し、情報処理装置1B-3の送信部15Bから送信されたデータをそのまま情報処理装置1A-3に転送する。 The server 100-1 has a transfer unit 110; it forwards the data transmitted from the transmission unit 15A of the information processing apparatus 1A-3 to the information processing apparatus 1B-3 as it is, and forwards the data transmitted from the transmission unit 15B of the information processing apparatus 1B-3 to the information processing apparatus 1A-3 as it is.
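The pass-through role of the transfer unit 110 can be sketched in a few lines of code. This is only an illustrative model, not the patented implementation; the class, method, and client identifiers are hypothetical.

```python
class TransferUnit:
    """Relays payloads between registered clients without modification,
    mirroring the pass-through behavior of transfer unit 110 (sketch)."""

    def __init__(self):
        self.clients = {}  # client_id -> inbox (list of received payloads)

    def register(self, client_id):
        self.clients[client_id] = []

    def forward(self, sender_id, payload):
        # Deliver the payload unchanged to every client except the sender.
        for client_id, inbox in self.clients.items():
            if client_id != sender_id:
                inbox.append(payload)

server = TransferUnit()
server.register("1A-3")
server.register("1B-3")
server.forward("1A-3", {"irradiation_point": (0.4, 0.7), "operator": "7A"})
```

In this model the server never inspects or rewrites the payload, which is what distinguishes the third and fourth embodiments from the server-side compositing of the fifth embodiment.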
 情報処理装置1A-3および情報処理装置1B-3が互いに送受信するデータの内容は、第1の実施形態と同様であるので、ここでの説明は省略する。 Since the contents of data transmitted and received between the information processing apparatus 1A-3 and the information processing apparatus 1B-3 are the same as those in the first embodiment, description thereof is omitted here.
 このように、第3の実施形態によれば、各情報処理装置1A-3、1B-3が、サーバ100-1を介して遠隔地協調を実現することができる。 As described above, according to the third embodiment, the information processing apparatuses 1A-3 and 1B-3 can realize remote cooperation via the server 100-1.
   <2-4.第4の実施形態>
 上記第3の実施形態では、リモート側で、コンテンツ(投影している画像データ)と、操作者情報や、照射点P1の座標位置等の重畳画像が生成されているが、第4の実施形態では、ローカル側で重畳画像が生成される。以下、図11を参照して具体的に説明する。
<2-4. Fourth Embodiment>
In the third embodiment, the superimposed image, in which the operator information, the coordinate position of the irradiation point P1, and the like are superimposed on the content (the projected image data), is generated on the remote side; in the fourth embodiment, the superimposed image is generated on the local side. A specific description is given below with reference to FIG. 11.
 図11は、第4の実施形態による遠隔地協調システムを形成する各装置の内部構成例を示すブロック図である。図11に示す第4の実施形態による情報処理装置1A-4および情報処理装置1B-4は、図10に示す第3の実施形態による情報処理装置1A-3および情報処理装置1B-3の内部構成と比較すると、情報処理装置1B-3が有していた重畳画像信号生成部18を、情報処理装置1A-4が有する点が異なる。 FIG. 11 is a block diagram showing an internal configuration example of each device forming the remote cooperation system according to the fourth embodiment. The information processing apparatus 1A-4 and the information processing apparatus 1B-4 according to the fourth embodiment shown in FIG. 11 differ from the information processing apparatus 1A-3 and the information processing apparatus 1B-3 according to the third embodiment shown in FIG. 10 in that the superimposed image signal generation unit 18, which the information processing apparatus 1B-3 had, is included in the information processing apparatus 1A-4.
 情報処理装置1A-4が有する重畳画像信号生成部18は、上記第2の実施形態と同様に、照射位置認識部12により認識された照射点P1の座標位置に基づいて、照射点P1の座標位置に対応する投影用の画像データ上の座標位置に、リモート照射点P1’を重畳させた画像信号を生成する。また、重畳画像信号生成部18は、投影画像上の、リモート照射点P1’に対応する位置に、操作者情報を重畳させた画像信号を生成する。 As in the second embodiment, the superimposed image signal generation unit 18 of the information processing apparatus 1A-4 generates, based on the coordinate position of the irradiation point P1 recognized by the irradiation position recognition unit 12, an image signal in which the remote irradiation point P1' is superimposed at the coordinate position on the projection image data corresponding to the coordinate position of the irradiation point P1. The superimposed image signal generation unit 18 also generates an image signal in which the operator information is superimposed at the position on the projection image corresponding to the remote irradiation point P1'.
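The compositing step described above can be sketched as follows, assuming the projection image data is modeled as a 2-D array of pixel values. The function names and the marker value are illustrative and not taken from the disclosure.

```python
def superimpose_point(image, point_xy, marker=9):
    """Return a copy of `image` (rows of pixel values) with a marker value
    written at the remote irradiation point, leaving the original intact."""
    x, y = point_xy
    out = [row[:] for row in image]
    out[y][x] = marker
    return out

def annotate_operator(point_xy, operator_info):
    """Pair the remote irradiation point with the operator information
    that should be rendered next to it on the projection image."""
    return {"pos": point_xy, "operator": operator_info}

frame = [[0] * 4 for _ in range(3)]              # toy 4x3 projection image data
composited = superimpose_point(frame, (2, 1))    # remote irradiation point P1'
label = annotate_operator((2, 1), "operator 7A")
```

A real implementation would render a cursor graphic and a text or image label rather than a single pixel, but the flow is the same: the recognized coordinates select a position in the projection image data, and the operator information is attached to that position.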
 このように生成された重畳画像信号は、送信部15Aからサーバ100-1を介して、情報処理装置1B-4に送信される。そして、情報処理装置1B-4は、受信部17Bにより受信した重畳画像信号を、送受信部11BからROOM・Bに設置されているプロジェクタ3Bに送信し、プロジェクタ3BからスクリーンS2に投影させる。 The superimposed image signal generated in this way is transmitted from the transmission unit 15A to the information processing apparatus 1B-4 via the server 100-1. The information processing apparatus 1B-4 then sends the superimposed image signal received by the reception unit 17B from the transmission/reception unit 11B to the projector 3B installed in ROOM B, and causes the projector 3B to project it onto the screen S2.
 以上説明したように、本実施形態による遠隔地協調システムは、サーバ-クライアント接続される場合でも、ローカル側でリモート照射点P1’および操作者情報を投影画像に重畳させた画像を生成し、生成した重畳画像信号をリモート側に送信することが可能である。 As described above, even when a server-client connection is used, the remote cooperation system according to the present embodiment can generate, on the local side, an image in which the remote irradiation point P1' and the operator information are superimposed on the projection image, and transmit the generated superimposed image signal to the remote side.
   <2-5.第5の実施形態>
 上記第3、第4の実施形態では、サーバ100-1は、各情報処理装置間のデータの送受信を仲介するだけであったが、第5の実施形態では、サーバ100-2が重畳画像信号の生成を行う。
<2-5. Fifth Embodiment>
In the third and fourth embodiments, the server 100-1 merely mediates the transmission and reception of data between the information processing apparatuses; in the fifth embodiment, the server 100-2 generates the superimposed image signal.
 すなわち、上述した第3、第4の実施形態では、コンテンツ(投影している画像データ)と、操作者情報や、照射点P1の座標位置等を、リモート側またはローカル側で重畳(合成)して重畳画像を生成しているが、第5の実施形態では、サーバ側で重畳画像が生成される。以下、図12を参照して具体的に説明する。 That is, in the third and fourth embodiments described above, the content (the projected image data), the operator information, the coordinate position of the irradiation point P1, and the like are superimposed (combined) on the remote side or the local side to generate the superimposed image, whereas in the fifth embodiment the superimposed image is generated on the server side. A specific description is given below with reference to FIG. 12.
 図12は、第5の実施形態による遠隔地協調システムを形成する各装置の内部構成例を示すブロック図である。図12に示す第5の実施形態による情報処理装置1A-5、1B-5およびサーバ100-2は、図11に示す第4の実施形態による各内部構成と比較すると、以下の点が異なる。すなわち、情報処理装置1A-4が有していた重畳画像信号生成部18と同様の機能を有する重畳画像信号生成部120と、送受信部111をサーバ100-2が有する点が異なる。 FIG. 12 is a block diagram showing an internal configuration example of each device forming the remote cooperation system according to the fifth embodiment. The information processing apparatuses 1A-5 and 1B-5 and the server 100-2 according to the fifth embodiment shown in FIG. 12 differ from the internal configurations according to the fourth embodiment shown in FIG. 11 in that the server 100-2 includes a transmission/reception unit 111 and a superimposed image signal generation unit 120 having the same function as the superimposed image signal generation unit 18 that the information processing apparatus 1A-4 had.
 サーバ100-2が有する重畳画像信号生成部120は、情報処理装置1A-5から送信された照射点P1の座標位置に基づいて、照射点P1の座標位置に対応する投影用の画像データ上の座標位置に、リモート照射点P1’を重畳させた画像信号を生成する。また、重畳画像信号生成部120は、投影画像上の、リモート照射点P1’に対応する位置に、操作者情報を重畳させた画像信号を生成する。 Based on the coordinate position of the irradiation point P1 transmitted from the information processing apparatus 1A-5, the superimposed image signal generation unit 120 of the server 100-2 generates an image signal in which the remote irradiation point P1' is superimposed at the coordinate position on the projection image data corresponding to the coordinate position of the irradiation point P1. The superimposed image signal generation unit 120 also generates an image signal in which the operator information is superimposed at the position on the projection image corresponding to the remote irradiation point P1'.
 このように生成された重畳画像信号は、送受信部111から情報処理装置1B-5に送信される。そして、情報処理装置1B-5は、受信部17Bにより受信した重畳画像信号を、送受信部11BからROOM・Bに設置されているプロジェクタ3Bに送信し、プロジェクタ3BからスクリーンS2に投影させる。 The superimposed image signal generated in this way is transmitted from the transmission/reception unit 111 to the information processing apparatus 1B-5. The information processing apparatus 1B-5 then sends the superimposed image signal received by the reception unit 17B from the transmission/reception unit 11B to the projector 3B installed in ROOM B, and causes the projector 3B to project it onto the screen S2.
 以上説明したように、本実施形態による遠隔地協調システムは、サーバ-クライアント接続される場合において、サーバ側でリモート照射点P1’および操作者情報を投影画像に重畳させた画像を生成し、生成した重畳画像信号をリモート側に送信することが可能である。 As described above, when a server-client connection is used, the remote cooperation system according to the present embodiment can generate, on the server side, an image in which the remote irradiation point P1' and the operator information are superimposed on the projection image, and transmit the generated superimposed image signal to the remote side.
   <2-6.第6の実施形態>
 上述した第4、第5の実施形態では、コンテンツの出力処理はローカル側の出力制御部16で行われていたが、第6の実施形態では、サーバ100-3がコンテンツの出力処理も行う。コンテンツの出力処理とは、上述したように、例えばコンテンツ表示用装置4Aから取得したコンテンツデータに基づいて投影用の画像データ(表示信号)を生成する処理である。以下、第6の実施形態について図13を参照して具体的に説明する。
<2-6. Sixth Embodiment>
In the fourth and fifth embodiments described above, the content output processing is performed by the local output control unit 16, but in the sixth embodiment, the server 100-3 also performs content output processing. As described above, the content output processing is processing for generating projection image data (display signal) based on content data acquired from the content display device 4A, for example. Hereinafter, the sixth embodiment will be specifically described with reference to FIG.
 図13は、第6の実施形態による遠隔地協調システムを形成する各装置の内部構成例を示すブロック図である。図13に示す第6の実施形態による情報処理装置1A-6、1B-6およびサーバ100-3は、図12に示す第5の実施形態による各内部構成と比較すると、以下の点が異なる。すなわち、情報処理装置1A-5が有していた出力制御部16と同様の機能を有する出力制御部130と、送受信部111および重畳画像信号生成部120をサーバ100-3が有する点が異なる。 FIG. 13 is a block diagram showing an internal configuration example of each device forming the remote cooperation system according to the sixth embodiment. The information processing apparatuses 1A-6 and 1B-6 and the server 100-3 according to the sixth embodiment shown in FIG. 13 differ from the internal configurations according to the fifth embodiment shown in FIG. 12 in that the server 100-3 includes an output control unit 130 having the same function as the output control unit 16 that the information processing apparatus 1A-5 had, in addition to the transmission/reception unit 111 and the superimposed image signal generation unit 120.
 サーバ100-3の送受信部111は、ROOM・Aに設置されているコンテンツ表示用装置4Aから情報処理装置1A-6を介して、またはネットワーク上のコンテンツサーバから、コンテンツデータを受信する。また、サーバ100-3の送受信部111は、取得したコンテンツデータに基づいて、出力制御部130により生成された投影用の画像データ(表示信号)を、ローカル側およびリモート側の各情報処理装置1A-6、1B-6に送信する。これにより、ローカル側およびリモート側では、同じ画像データが各プロジェクタ3A、3BからスクリーンS1、S2にそれぞれ投影される。 The transmission/reception unit 111 of the server 100-3 receives content data from the content display device 4A installed in ROOM A via the information processing apparatus 1A-6, or from a content server on the network. The transmission/reception unit 111 of the server 100-3 also transmits the projection image data (display signal), generated by the output control unit 130 based on the acquired content data, to the information processing apparatuses 1A-6 and 1B-6 on the local side and the remote side. As a result, the same image data is projected from the projectors 3A and 3B onto the screens S1 and S2 on the local side and the remote side, respectively.
 また、ローカル側でレーザーポインタ2Aにより照射される照射点P1が操作(移動)された場合、情報処理装置1A-6の送信部15Aは照射点P1の座標位置、操作者情報、レーザーポインタ2Aの3次元位置を、サーバ100-3に送信する。サーバ100-3の重畳画像信号生成部120は、情報処理装置1A-6から受信した照射点P1の座標位置に基づいて、出力制御部130から出力された投影用の画像データ上の、照射点P1の座標位置に対応する座標位置に、リモート照射点P1’を重畳させた画像信号を生成する。また、重畳画像信号生成部120は、投影画像上の、リモート照射点P1’に対応する位置に、操作者情報を重畳させた画像信号を生成する。 When the irradiation point P1 emitted by the laser pointer 2A is operated (moved) on the local side, the transmission unit 15A of the information processing apparatus 1A-6 transmits the coordinate position of the irradiation point P1, the operator information, and the three-dimensional position of the laser pointer 2A to the server 100-3. Based on the coordinate position of the irradiation point P1 received from the information processing apparatus 1A-6, the superimposed image signal generation unit 120 of the server 100-3 generates an image signal in which the remote irradiation point P1' is superimposed, on the projection image data output from the output control unit 130, at the coordinate position corresponding to the coordinate position of the irradiation point P1. The superimposed image signal generation unit 120 also generates an image signal in which the operator information is superimposed at the position on the projection image corresponding to the remote irradiation point P1'.
 このように生成された重畳画像信号は、送受信部111から情報処理装置1B-6に送信される。そして、情報処理装置1B-6は、受信部17Bにより受信した重畳画像信号を、送受信部11BからROOM・Bに設置されているプロジェクタ3Bに送信し、プロジェクタ3BからスクリーンS2に投影させる。 The superimposed image signal generated in this way is transmitted from the transmission/reception unit 111 to the information processing apparatus 1B-6. The information processing apparatus 1B-6 then sends the superimposed image signal received by the reception unit 17B from the transmission/reception unit 11B to the projector 3B installed in ROOM B, and causes the projector 3B to project it onto the screen S2.
 また、サーバ100-3の出力制御部130は、情報処理装置1A-6から受信したレーザーポインタ2Aの3次元位置に基づいて、ROOM・Aのカメラ6Aの撮像方向やマイク8Aの指向性を操作者7Aに向けるようカメラ6A、マイク8Aの動きを制御する制御信号を情報処理装置1A-6に送信する。 Based on the three-dimensional position of the laser pointer 2A received from the information processing apparatus 1A-6, the output control unit 130 of the server 100-3 also transmits to the information processing apparatus 1A-6 a control signal that controls the movement of the camera 6A and the microphone 8A in ROOM A so that the imaging direction of the camera 6A and the directivity of the microphone 8A are directed toward the operator 7A.
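One way to turn the operator's three-dimensional position into such a control signal is to compute pan and tilt angles that aim the camera at that position. The sketch below assumes a camera at the origin looking along the +z axis; the function name and the coordinate convention are illustrative, not from the disclosure.

```python
import math

def pan_tilt_toward(camera_pos, target_pos):
    """Compute pan/tilt angles (degrees) that aim a camera at a target point.
    camera_pos and target_pos are (x, y, z) tuples; the camera's rest
    orientation is assumed to look along +z, with x to the right and y up."""
    dx = target_pos[0] - camera_pos[0]
    dy = target_pos[1] - camera_pos[1]
    dz = target_pos[2] - camera_pos[2]
    pan = math.degrees(math.atan2(dx, dz))                    # left/right angle
    tilt = math.degrees(math.atan2(dy, math.hypot(dx, dz)))   # up/down angle
    return pan, tilt

# Operator 1 m to the right and 1 m in front of the camera -> pan 45 degrees.
pan, tilt = pan_tilt_toward((0.0, 0.0, 0.0), (1.0, 0.0, 1.0))
```

The resulting angles would be packaged into the control signal sent to the local side; directing the microphone's beam works analogously with the pan angle alone.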
 以上説明したように、第6の実施形態では、コンテンツ出力処理をサーバ側で行うので、ローカル側の処理負担を軽減することができる。 As described above, in the sixth embodiment, the content output processing is performed on the server side, so that the processing burden on the local side can be reduced.
   <2-7.第7の実施形態>
 次に、図14を参照して第7の実施形態について説明する。第7の実施形態では、ほとんどの処理をサーバ側で行い、ローカル側では、照射位置P1の座標位置を認識するための可視光/非可視光撮像画像をサーバ側に送る処理のみを行う。
<2-7. Seventh Embodiment>
Next, a seventh embodiment will be described with reference to FIG. 14. In the seventh embodiment, most of the processing is performed on the server side; the local side only performs the processing of sending the visible light/invisible light captured images, used for recognizing the coordinate position of the irradiation point P1, to the server side.
 図14は、第7の実施形態による遠隔地協調システムを形成する各装置の内部構成例を示すブロック図である。図14に示す第7の実施形態による情報処理装置1A-7、1B-7およびサーバ100-4は、図13に示す第6の実施形態による各内部構成と比較すると、以下の点が異なる。すなわち、情報処理装置1A-6が有していた照射位置認識部12、操作者情報取得部13、および3次元位置算出部14と同様の機能を有する照射位置認識部140、操作者情報取得部150、および3次元位置算出部160をサーバ100-4が有する点が異なる。 FIG. 14 is a block diagram showing an internal configuration example of each device forming the remote cooperation system according to the seventh embodiment. The information processing apparatuses 1A-7 and 1B-7 and the server 100-4 according to the seventh embodiment shown in FIG. 14 differ from the internal configurations according to the sixth embodiment shown in FIG. 13 in that the server 100-4 includes an irradiation position recognition unit 140, an operator information acquisition unit 150, and a three-dimensional position calculation unit 160, which have the same functions as the irradiation position recognition unit 12, the operator information acquisition unit 13, and the three-dimensional position calculation unit 14 that the information processing apparatus 1A-6 had.
 また、サーバ100-4の他の構成、送受信部111、重畳画像信号生成部120、および出力制御部130の処理内容は、第6の実施形態と同様である。 The other configurations of the server 100-4, and the processing contents of the transmission/reception unit 111, the superimposed image signal generation unit 120, and the output control unit 130, are the same as those in the sixth embodiment.
 このような構成により、第7の実施形態では、情報処理装置1A-7が、カメラ5Aにより撮像された投影画像の撮像画像をサーバ100-4に継続的に送信し、サーバ100-4の照射位置認識部140により、投影画像上の照射点P1の座標位置が認識される。また、サーバ100-4の操作者情報取得部150も、情報処理装置1A-7から送信された撮像画像に基づいて、操作者の特定や操作者情報の取得を行う。照射点P1の座標位置や、操作者情報は、重畳画像信号生成部120に出力され、重畳画像信号の生成に用いられる。 With this configuration, in the seventh embodiment, the information processing apparatus 1A-7 continuously transmits the captured images of the projection image captured by the camera 5A to the server 100-4, and the irradiation position recognition unit 140 of the server 100-4 recognizes the coordinate position of the irradiation point P1 on the projection image. The operator information acquisition unit 150 of the server 100-4 also identifies the operator and acquires the operator information based on the captured images transmitted from the information processing apparatus 1A-7. The coordinate position of the irradiation point P1 and the operator information are output to the superimposed image signal generation unit 120 and used to generate the superimposed image signal.
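The recognition step that the server performs on each received frame can be sketched with a toy brightest-pixel search. A real system would use marker detection on the visible/invisible-light image rather than raw intensity; the function name and data layout here are illustrative only.

```python
def find_irradiation_point(frame):
    """Return the (x, y) coordinates of the brightest pixel in a grayscale
    frame: a toy stand-in for how the irradiation position recognition unit
    might locate the irradiation point P1 in a captured projection image."""
    best_value, best_xy = -1, None
    for y, row in enumerate(frame):
        for x, value in enumerate(row):
            if value > best_value:
                best_value, best_xy = value, (x, y)
    return best_xy

# A small synthetic frame with a bright laser spot at column 1, row 1.
captured = [
    [10, 10, 10],
    [10, 255, 10],
    [10, 10, 12],
]
```

The recognized coordinates would then be handed to the superimposed image signal generation unit 120, exactly as the paragraph above describes.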
 また、サーバ100-4の3次元位置算出部160は、情報処理装置1A-7から送信された撮像画像に基づいて、レーザーポインタ2Aの3次元位置を算出する。算出された3次元位置は、出力制御部130に出力されてローカル側におけるカメラ6Aやマイク8Aの制御に用いられたり、重畳画像信号生成部120に出力されて投影画像に重畳されるシルエット画像の配置を決定する際に用いられたりする。 The three-dimensional position calculation unit 160 of the server 100-4 also calculates the three-dimensional position of the laser pointer 2A based on the captured images transmitted from the information processing apparatus 1A-7. The calculated three-dimensional position is output to the output control unit 130 and used to control the camera 6A and the microphone 8A on the local side, or is output to the superimposed image signal generation unit 120 and used to determine the placement of a silhouette image superimposed on the projection image.
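A simple geometric ingredient of such a calculation is the pinhole-camera relation between a marker's real size and its apparent size in the captured image, which yields the pointer's distance. This is a sketch under that assumption; configuration statement (6) also mentions shape, tilt, and distortion, which this toy model ignores, and the parameter names are hypothetical.

```python
def marker_distance(real_size_m, apparent_size_px, focal_length_px):
    """Pinhole-camera estimate: distance = focal length * real size / apparent
    size. Illustrates one way distance to the laser pointer's marker could be
    inferred from its apparent size in a captured image (not the patented
    method)."""
    return focal_length_px * real_size_m / apparent_size_px

# A 5 cm marker imaged at 50 px with a 1000 px focal length is about 1 m away.
distance_m = marker_distance(0.05, 50, 1000)
```

Combining this distance with the marker's image coordinates and orientation would give a full three-dimensional position of the pointer relative to the screen.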
 以上説明したように、第7の実施形態では、ローカル側のスクリーンS1に投影される投影画像の撮像画像に基づく照射点P1の座標位置の認識処理や、操作者情報の取得処理等ほとんどの処理をサーバ側で行うので、ローカル側の処理負担をさらに軽減することができる。 As described above, in the seventh embodiment, most of the processing, such as recognizing the coordinate position of the irradiation point P1 based on the captured image of the projection image projected on the local screen S1 and acquiring the operator information, is performed on the server side, so the processing load on the local side can be further reduced.
   <<3.まとめ>>
 上述したように、本実施形態による遠隔地協調システムでは、ローカル側でレーザーポインタを操作している操作者の情報を、リモート側で表示(投影)される指示点(リモート照射点)に対応付けて明示(投影)させることで、複数の遠隔地を結ぶ通信会議を円滑に行うことができる。
<< 3. Summary >>
As described above, in the remote cooperation system according to the present embodiment, information on the operator operating the laser pointer on the local side is explicitly displayed (projected) in association with the pointing position (remote irradiation point) displayed (projected) on the remote side, so that a communication conference connecting a plurality of remote locations can be conducted smoothly.
 以上、添付図面を参照しながら本開示の好適な実施形態について詳細に説明したが、本技術はかかる例に限定されない。本開示の技術分野における通常の知識を有する者であれば、特許請求の範囲に記載された技術的思想の範疇内において、各種の変更例または修正例に想到し得ることは明らかであり、これらについても、当然に本開示の技術的範囲に属するものと了解される。 The preferred embodiments of the present disclosure have been described in detail above with reference to the accompanying drawings, but the present technology is not limited to these examples. It is obvious that a person having ordinary knowledge in the technical field of the present disclosure can conceive various changes or modifications within the scope of the technical idea described in the claims, and it is understood that these also naturally belong to the technical scope of the present disclosure.
 例えば、情報処理装置1、サーバ100に内蔵されるCPU、ROM、およびRAM等のハードウェアに、上述した情報処理装置1、サーバ100の機能を発揮させるためのコンピュータプログラムも作成可能である。また、当該コンピュータプログラムを記憶させたコンピュータ読み取り可能な記憶媒体も提供される。 For example, it is possible to create a computer program for causing hardware such as the CPU, ROM, and RAM built into the information processing apparatus 1 and the server 100 to exhibit the functions of the information processing apparatus 1 and the server 100 described above. A computer-readable storage medium storing the computer program is also provided.
 また、上述したフローチャートは、必ずしも図示された順序で処理されなくともよく、例えば並行に処理されたり逆の順序で処理されたりしてもよい。例えば、図4に示すフローチャートにおいて、ステップS112とS115は並行に処理されてもよいし、逆の順序で処理されてもよい。また、ステップS121と、S118およびS124は、並行に処理されてもよい。 The flowcharts described above also do not necessarily have to be processed in the order shown; the steps may, for example, be processed in parallel or in the reverse order. For example, in the flowchart shown in FIG. 4, steps S112 and S115 may be processed in parallel or in the reverse order. Step S121 and steps S118 and S124 may also be processed in parallel.
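Running independent flowchart steps concurrently can be sketched as follows. The step functions are hypothetical stand-ins for steps S112 and S115, whose actual contents are defined in the flowchart of FIG. 4, not here.

```python
from concurrent.futures import ThreadPoolExecutor

def step_s112():
    # Stand-in for step S112 (e.g. recognizing the irradiation point).
    return "S112 done"

def step_s115():
    # Stand-in for step S115 (e.g. acquiring operator information).
    return "S115 done"

# Independent steps need not follow the drawn order; here they run in parallel
# and their results are collected once both have finished.
with ThreadPoolExecutor(max_workers=2) as pool:
    futures = [pool.submit(step_s112), pool.submit(step_s115)]
    results = [f.result() for f in futures]
```

Because the two steps share no state, executing them in parallel or in the reverse order yields the same results as the drawn order.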
 また、上述した各実施形態では、リモート側において、プロジェクタ3B(表示装置の一例)から重畳画像(コンテンツにリモート照射点P1’や操作者情報が重畳されている画像)がスクリーンS2に投影されているが、リモート側での表示出力は必ずしも投影に限られない。例えばリモート側において、重畳画像が、テレビジョン装置やPC(パーソナルコンピュータ)の表示画面に表示出力されてもよい。 In each of the embodiments described above, on the remote side, the superimposed image (an image in which the remote irradiation point P1' and the operator information are superimposed on the content) is projected onto the screen S2 from the projector 3B (an example of a display device), but the display output on the remote side is not necessarily limited to projection. For example, on the remote side, the superimposed image may be displayed and output on the display screen of a television device or a PC (personal computer).
 なお、本技術は以下のような構成も取ることができる。
(1)
 ローカル側の投影画像に対するレーザーポインタによる照射点の位置を認識する認識部と、
 前記レーザーポインタの操作者の情報を取得する取得部と、
 リモート側に設置されている表示装置において、前記投影画像と共に、前記照射点に対応する座標位置に位置するリモート照射点および前記操作者の情報を表示させるために、前記照射点の位置と前記操作者の情報を前記リモート側に送信する第1の送信部と、
を備える、情報処理装置。
(2)
 前記取得部は、前記投影画像を撮像した撮像画像に基づいて、前記投影画像に対して前記レーザーポインタにより照射される非可視光/可視光マーカーを解析することにより、前記操作者の情報を取得する、前記(1)に記載の情報処理装置。
(3)
 前記認識部は、前記レーザーポインタから照射される非可視光/可視光マーカーの位置座標を前記照射点の位置として認識する、前記(1)または(2)に記載の情報処理装置。
(4)
 前記非可視光/可視光マーカーは、図形または1次元/2次元バーコードにより形成される、前記(2)または(3)に記載の情報処理装置。
(5)
 前記情報処理装置は、
 前記投影画像を撮像した撮像画像に基づいて、前記投影画像に対して前記レーザーポインタにより照射される非可視光/可視光マーカーを解析することにより、前記投影画像に対する前記レーザーポインタの3次元位置を算出する算出部をさらに備える、前記(1)~(4)のいずれか1項に記載の情報処理装置。
(6)
 前記算出部は、前記非可視光/可視光マーカーの形、大きさ、傾き、および歪みの少なくともいずれかに基づいて、前記レーザーポインタの3次元位置を算出する、前記(5)に記載の情報処理装置。
(7)
 前記情報処理装置は、
 前記算出部により算出された前記3次元位置に応じて、前記レーザーポインタの操作者を撮像するよう、撮像制御信号をローカル側の撮像装置に送信する第2の送信部をさらに備える、前記(5)または(6)に記載の情報処理装置。
(8)
 前記情報処理装置は、
 前記算出部により算出された前記3次元位置に応じて、前記レーザーポインタの操作者の方向を向くよう、収音制御信号をローカル側の指向性収音部に送信する第3の送信部をさらに備える、前記(5)~(7)のいずれか1項に記載の情報処理装置。
(9)
 前記第1の送信部は、前記リモート側において、投影画像上の前記3次元位置に応じた位置に人物の全身画像を表示するために、前記算出部により算出された前記3次元位置を前記リモート側に送信する、前記(5)~(8)のいずれか1項に記載の情報処理装置。
(10)
 ローカル側の投影画像に対するレーザーポインタによる照射点の位置を認識するステップと、
 前記レーザーポインタの操作者の情報を取得するステップと、
 リモート側に設置されている表示装置において、前記投影画像と共に、前記照射点に対応する座標位置に位置するリモート照射点および前記操作者の情報を表示させるために、前記照射点の位置と前記操作者の情報を前記リモート側に送信するステップと、
を含む、制御方法。
(11)
 コンピュータを、
 ローカル側の投影画像に対するレーザーポインタによる照射点の位置を認識する認識部と、
 前記レーザーポインタの操作者の情報を取得する取得部と、
 リモート側に設置されている表示装置において、前記投影画像と共に、前記照射点に対応する座標位置に位置するリモート照射点および前記操作者の情報を表示させるために、前記照射点の位置と前記操作者の情報を前記リモート側に送信する第1の送信部と、
として機能させるためのプログラム。
(12)
 コンピュータを、
 ローカル側の投影画像に対するレーザーポインタによる照射点の位置を認識する認識部と、
 前記レーザーポインタの操作者の情報を取得する取得部と、
 リモート側に設置されている表示装置において、前記投影画像と共に、前記照射点に対応する座標位置に位置するリモート照射点および前記操作者の情報を表示させるために、前記照射点の位置と前記操作者の情報を前記リモート側に送信する第1の送信部と、
として機能させるためのプログラムが記憶された、記憶媒体。
In addition, the present technology may also be configured as below.
(1)
A recognition unit for recognizing the position of the irradiation point by the laser pointer with respect to the projection image on the local side;
An acquisition unit for acquiring information of an operator of the laser pointer;
a first transmission unit that transmits the position of the irradiation point and the information on the operator to the remote side, in order to cause a display device installed on the remote side to display, together with the projection image, a remote irradiation point located at a coordinate position corresponding to the irradiation point and the information on the operator;
An information processing apparatus comprising:
(2)
The information processing apparatus according to (1), wherein the acquisition unit acquires the information on the operator by analyzing an invisible light/visible light marker emitted onto the projection image by the laser pointer, based on a captured image of the projection image.
(3)
The information processing apparatus according to (1) or (2), wherein the recognition unit recognizes a position coordinate of an invisible light / visible light marker irradiated from the laser pointer as a position of the irradiation point.
(4)
The information processing apparatus according to (2) or (3), wherein the invisible light/visible light marker is formed by a figure or a one-dimensional/two-dimensional barcode.
(5)
The information processing apparatus according to any one of (1) to (4), further including:
a calculation unit that calculates the three-dimensional position of the laser pointer with respect to the projection image by analyzing the invisible light/visible light marker emitted onto the projection image by the laser pointer, based on a captured image of the projection image.
(6)
The information processing apparatus according to (5), wherein the calculation unit calculates the three-dimensional position of the laser pointer based on at least one of the shape, size, inclination, and distortion of the invisible light/visible light marker.
(7)
The information processing apparatus according to (5) or (6), further including:
a second transmission unit that transmits an imaging control signal to an imaging device on the local side so as to image the operator of the laser pointer, according to the three-dimensional position calculated by the calculation unit.
(8)
The information processing apparatus according to any one of (5) to (7), further including:
a third transmission unit that transmits a sound collection control signal to a directional sound collection unit on the local side so that it faces the operator of the laser pointer, according to the three-dimensional position calculated by the calculation unit.
(9)
The information processing apparatus according to any one of (5) to (8), wherein the first transmission unit transmits the three-dimensional position calculated by the calculation unit to the remote side in order to display, on the remote side, a whole-body image of a person at a position corresponding to the three-dimensional position on the projection image.
(10)
Recognizing the position of the irradiation point by the laser pointer on the local projection image;
Obtaining information of an operator of the laser pointer;
transmitting the position of the irradiation point and the information on the operator to the remote side, in order to cause a display device installed on the remote side to display, together with the projection image, a remote irradiation point located at a coordinate position corresponding to the irradiation point and the information on the operator;
A control method including the steps above.
(11)
Computer
A recognition unit for recognizing the position of the irradiation point by the laser pointer with respect to the projection image on the local side;
An acquisition unit for acquiring information of an operator of the laser pointer;
a first transmission unit that transmits the position of the irradiation point and the information on the operator to the remote side, in order to cause a display device installed on the remote side to display, together with the projection image, a remote irradiation point located at a coordinate position corresponding to the irradiation point and the information on the operator;
A program for causing a computer to function as the above units.
(12)
Computer
A recognition unit for recognizing the position of the irradiation point by the laser pointer with respect to the projection image on the local side;
An acquisition unit for acquiring information of an operator of the laser pointer;
a first transmission unit that transmits the position of the irradiation point and the information on the operator to the remote side, in order to cause a display device installed on the remote side to display, together with the projection image, a remote irradiation point located at a coordinate position corresponding to the irradiation point and the information on the operator;
A storage medium storing a program for causing a computer to function as the above units.
 1、1A(1A-1~1A-7)、1B(1B-1~1B-7)、1C  情報処理装置
 2、2A、2C  レーザーポインタ
 3A~3C  プロジェクタ
 4A、4B  コンテンツ表示用装置
 5A、5B  カメラ(投影画像撮影用)
 6A~6C  カメラ(操作者/閲覧者撮影用)
 7A、7C  操作者
 7B  閲覧者
 8A~8C  マイクロホン
 9A~9C  スピーカ
 11A、11B  送受信部
 12、140  照射位置認識部
 13、150  操作者情報取得部
 14、160  3次元位置算出部
 15A、15B  送信部
 16、130  出力制御部
 17A、17B  受信部
 18、120  重畳画像信号生成部
 100(100-1~100-4)  サーバ
 110  転送部
 111  送受信部
 S1~S3  スクリーン
 P1、P2  照射点
 P1’、P2’  リモート照射点
 
 
1, 1A (1A-1 to 1A-7), 1B (1B-1 to 1B-7), 1C  Information processing apparatus
2, 2A, 2C  Laser pointer
3A to 3C  Projector
4A, 4B  Content display device
5A, 5B  Camera (for capturing the projection image)
6A to 6C  Camera (for capturing the operator/viewer)
7A, 7C  Operator
7B  Viewer
8A to 8C  Microphone
9A to 9C  Speaker
11A, 11B  Transmission/reception unit
12, 140  Irradiation position recognition unit
13, 150  Operator information acquisition unit
14, 160  Three-dimensional position calculation unit
15A, 15B  Transmission unit
16, 130  Output control unit
17A, 17B  Reception unit
18, 120  Superimposed image signal generation unit
100 (100-1 to 100-4)  Server
110  Transfer unit
111  Transmission/reception unit
S1 to S3  Screen
P1, P2  Irradiation point
P1', P2'  Remote irradiation point

Claims (12)

  1.  ローカル側の投影画像に対するレーザーポインタによる照射点の位置を認識する認識部と、
     前記レーザーポインタの操作者の情報を取得する取得部と、
     リモート側に設置されている表示装置において、前記投影画像と共に、前記照射点に対応する座標位置に位置するリモート照射点および前記操作者の情報を表示させるために、前記照射点の位置と前記操作者の情報を前記リモート側に送信する第1の送信部と、
    を備える、情報処理装置。
    A recognition unit for recognizing the position of the irradiation point by the laser pointer with respect to the projection image on the local side;
    An acquisition unit for acquiring information of an operator of the laser pointer;
    a first transmission unit that transmits the position of the irradiation point and the information on the operator to the remote side, in order to cause a display device installed on the remote side to display, together with the projection image, a remote irradiation point located at a coordinate position corresponding to the irradiation point and the information on the operator;
    An information processing apparatus comprising:
  2.  前記取得部は、前記投影画像を撮像した撮像画像に基づいて、前記投影画像に対して前記レーザーポインタにより照射される非可視光/可視光マーカーを解析することにより、前記操作者の情報を取得する、請求項1に記載の情報処理装置。 The information processing apparatus according to claim 1, wherein the acquisition unit acquires the information on the operator by analyzing an invisible light/visible light marker emitted onto the projection image by the laser pointer, based on a captured image of the projection image.
  3.  前記認識部は、前記レーザーポインタから照射される非可視光/可視光マーカーの位置座標を前記照射点の位置として認識する、請求項1に記載の情報処理装置。 The information processing apparatus according to claim 1, wherein the recognition unit recognizes a position coordinate of an invisible light / visible light marker irradiated from the laser pointer as a position of the irradiation point.
  4.  前記非可視光/可視光マーカーは、図形または1次元/2次元バーコードにより形成される、請求項2に記載の情報処理装置。 The information processing apparatus according to claim 2, wherein the invisible light / visible light marker is formed by a figure or a one-dimensional / two-dimensional barcode.
  5.  前記情報処理装置は、
     前記投影画像を撮像した撮像画像に基づいて、前記投影画像に対して前記レーザーポインタにより照射される非可視光/可視光マーカーを解析することにより、前記投影画像に対する前記レーザーポインタの3次元位置を算出する算出部をさらに備える、請求項1に記載の情報処理装置。
    The information processing apparatus according to claim 1, further comprising:
    a calculation unit that calculates the three-dimensional position of the laser pointer with respect to the projection image by analyzing the invisible light/visible light marker emitted onto the projection image by the laser pointer, based on a captured image of the projection image.
  6.  前記算出部は、前記非可視光/可視光マーカーの形、大きさ、傾き、および歪みの少なくともいずれかに基づいて、前記レーザーポインタの3次元位置を算出する、請求項5に記載の情報処理装置。 The information processing apparatus according to claim 5, wherein the calculation unit calculates the three-dimensional position of the laser pointer based on at least one of the shape, size, inclination, and distortion of the invisible light/visible light marker.
  7.  前記情報処理装置は、
     前記算出部により算出された前記3次元位置に応じて、前記レーザーポインタの操作者を撮像するよう、撮像制御信号をローカル側の撮像装置に送信する第2の送信部をさらに備える、請求項5に記載の情報処理装置。
    The information processing apparatus according to claim 5, further comprising:
    a second transmission unit that transmits an imaging control signal to an imaging device on the local side so as to image the operator of the laser pointer, according to the three-dimensional position calculated by the calculation unit.
  8.  The information processing apparatus according to claim 5, further comprising:
     a third transmission unit that transmits a sound collection control signal to a local-side directional sound collection unit so that the sound collection unit is directed toward the operator of the laser pointer in accordance with the three-dimensional position calculated by the calculation unit.
  9.  The information processing apparatus according to claim 5, wherein the first transmission unit transmits the three-dimensional position calculated by the calculation unit to the remote side so that, on the remote side, a whole-body image of a person is displayed at a position corresponding to the three-dimensional position on the projection image.
  10.  A control method including:
     recognizing a position of an irradiation point produced by a laser pointer on a local-side projection image;
     acquiring information on an operator of the laser pointer; and
     transmitting the position of the irradiation point and the operator information to a remote side so that a display device installed on the remote side displays, together with the projection image, a remote irradiation point located at a coordinate position corresponding to the irradiation point, and the operator information.
  11.  A program for causing a computer to function as:
     a recognition unit that recognizes a position of an irradiation point produced by a laser pointer on a local-side projection image;
     an acquisition unit that acquires information on an operator of the laser pointer; and
     a first transmission unit that transmits the position of the irradiation point and the operator information to a remote side so that a display device installed on the remote side displays, together with the projection image, a remote irradiation point located at a coordinate position corresponding to the irradiation point, and the operator information.
  12.  A storage medium storing a program for causing a computer to function as:
     a recognition unit that recognizes a position of an irradiation point produced by a laser pointer on a local-side projection image;
     an acquisition unit that acquires information on an operator of the laser pointer; and
     a first transmission unit that transmits the position of the irradiation point and the operator information to a remote side so that a display device installed on the remote side displays, together with the projection image, a remote irradiation point located at a coordinate position corresponding to the irradiation point, and the operator information.


PCT/JP2014/059531 2013-06-26 2014-03-31 Information processing device, control method, program, and recording medium WO2014208169A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013133892 2013-06-26
JP2013-133892 2013-06-26

Publications (1)

Publication Number Publication Date
WO2014208169A1 true WO2014208169A1 (en) 2014-12-31

Family

ID=52141523

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/059531 WO2014208169A1 (en) 2013-06-26 2014-03-31 Information processing device, control method, program, and recording medium

Country Status (1)

Country Link
WO (1) WO2014208169A1 (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001022519A (en) * 1999-07-06 2001-01-26 Casio Comput Co Ltd Image information processor and storage medium
JP2005267257A (en) * 2004-03-18 2005-09-29 Nara Institute Of Science & Technology Handwritten information input system
JP2010152717A (en) * 2008-12-25 2010-07-08 Canon Inc Image processor, method, and program
JP2013522766A (en) * 2010-03-16 2013-06-13 インターフェイズ・コーポレーション Interactive display system

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2549264A (en) * 2016-04-06 2017-10-18 Rolls-Royce Power Eng Plc Apparatus, methods, computer programs, and non-transitory computer readable storage mediums for enabling remote control of one or more devices
US10606340B2 (en) 2016-04-06 2020-03-31 Rolls-Royce Power Engineering Plc Apparatus, methods, computer programs, and non-transitory computer readable storage mediums for enabling remote control of one or more devices
GB2549264B (en) * 2016-04-06 2020-09-23 Rolls Royce Power Eng Plc Apparatus, methods, computer programs, and non-transitory computer readable storage mediums for enabling remote control of one or more devices
CN108303062A (en) * 2016-12-27 2018-07-20 株式会社和冠 Image information processing device and image information processing method


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14816957

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 14816957

Country of ref document: EP

Kind code of ref document: A1