US20170127017A1 - Communication system, communication apparatus and communication method - Google Patents
- Publication number
- US20170127017A1 (application Ser. No. 15/297,334)
- Authority
- US
- United States
- Prior art keywords
- communication apparatus
- image
- unit
- area
- output
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/141—Systems for two-way working between two video terminals, e.g. videophone
- H04N7/147—Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/165—Management of the audio stream, e.g. setting of volume, audio stream path
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
-
- H04N5/23238—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/15—Conference systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03545—Pens or stylus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
Definitions
- the present invention relates to a communication system, a communication apparatus, and a communication method.
- a video (television) conference system that realizes a remote conference by using a communication network.
- This video conference system can conduct a conference between remote locations almost like a face-to-face meeting. A communication apparatus (terminal device) of the video conference system, set up in the conference room where one of the parties participates in the remote conference, converts an image of the conference room (e.g., the conference participants) and sound (e.g., a speaker's voice) into digital data and transmits the digital data to the communication apparatus of the other party, which displays the image on a display and outputs the sound from a speaker in the other party's conference room.
- in the communication apparatus (terminal device), a microphone is used to acquire the conference participants' voices, and a camera is used to acquire their images.
- however, the camera has a limited angle of view, and therefore cannot capture an image of a conference participant who is outside that angle of view.
- one known approach is to use a panoramic camera capable of capturing a 360-degree panoramic image. On the other hand, a general microphone is omnidirectional, and therefore picks up ambient sound in addition to a participant's speech.
- another known approach is to use a microphone array to control directivity, i.e., to increase the sensitivity to sound arriving from a particular direction, thereby suppressing pickup of ambient sound and enabling a participant's speech to be heard clearly.
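The directivity control described above is commonly realized with a delay-and-sum beamformer. The following is a minimal sketch of that technique, not the patent's implementation; the function name and the far-field plane-wave assumption are ours.

```python
import numpy as np

def delay_and_sum(signals, mic_positions, target_direction, fs, c=343.0):
    """Steer a microphone array toward `target_direction` (a unit vector)
    by compensating each microphone's arrival delay, then averaging.

    signals: (n_mics, n_samples) array of simultaneously recorded channels.
    mic_positions: (n_mics, 3) microphone coordinates in meters.
    fs: sampling rate in Hz; c: speed of sound in m/s.
    """
    # Arrival-time offset of a far-field plane wave at each microphone.
    delays = mic_positions @ target_direction / c          # seconds
    delays -= delays.min()                                 # make non-negative
    shifts = np.round(delays * fs).astype(int)             # whole samples

    n_mics, n_samples = signals.shape
    out = np.zeros(n_samples)
    for sig, s in zip(signals, shifts):
        # Advance each channel by its delay so the target direction
        # adds coherently; other directions partially cancel.
        out[: n_samples - s] += sig[s:]
    return out / n_mics
```

For sound arriving broadside (equal delays), the channels add coherently and the output reproduces the source; off-axis sound is attenuated by the misaligned summation.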
- Japanese Unexamined Patent Application Publication No. 2007-274463 discloses a method in which a conference terminal forms a beam of sound picked up by a microphone array according to a selected placement pattern of participants.
- Japanese Patent No. 5028944 discloses a technology that detects the direction of a speaker by using a microphone array composed of a plurality of microphones and makes the shooting direction of a camera follow the detected direction.
- a communication system comprising a first communication apparatus and a second communication apparatus that transmits/receives data to/from the first communication apparatus
- the first communication apparatus includes: an area dividing unit that divides, of an output image which is an image that the first communication apparatus has output and includes at least a shot image obtained by shooting the surroundings of the second communication apparatus, a display area indicating an area where the shot image is displayed into as many unit areas as the number of people captured in the shot image; a first transmission control unit that performs control of transmitting coordinate information which indicates, of the output image, a unit area corresponding to a position pointed by a user to the second communication apparatus; and an output control unit that, when the first communication apparatus has received output information including a voice subjected to directivity control according to, of the shot image, an area corresponding to the coordinate information from the second communication apparatus, performs control of outputting the received output information
- the second communication apparatus includes: an acquiring unit that acquires the shot image; an identifying unit that
- Exemplary embodiments of the present invention also provide a communication apparatus comprising: an area dividing unit that divides, of an output image which is an image that the communication apparatus has output and includes at least a shot image obtained by shooting the surroundings of another communication apparatus which communicates with the communication apparatus, a display area indicating an area where the shot image is displayed into as many unit areas as the number of people captured in the shot image; a transmission control unit that performs control of transmitting coordinate information which indicates, of the output image, a unit area corresponding to a position pointed by a user to the other communication apparatus; and an output control unit that, when the communication apparatus has received output information including a voice subjected to directivity control according to, of the shot image, an area corresponding to the coordinate information from the other communication apparatus, performs control of outputting the received output information.
- Exemplary embodiments of the present invention also provide a communication method for a communication system including a first communication apparatus and a second communication apparatus that transmits/receives data to/from the first communication apparatus, the communication method comprising: dividing, by the first communication apparatus, of an output image which is an image that the first communication apparatus has output and includes at least a shot image obtained by shooting the surroundings of the second communication apparatus, a display area indicating an area where the shot image is displayed into as many unit areas as the number of people captured in the shot image; performing, by the first communication apparatus, control of transmitting coordinate information which indicates, of the output image, a unit area corresponding to a position pointed by a user to the second communication apparatus; when having received output information including a voice subjected to directivity control according to, of the shot image, an area corresponding to the coordinate information from the second communication apparatus, performing, by the first communication apparatus, control of outputting the received output information; acquiring, by the second communication apparatus, the shot image; when having received the coordinate information from the first communication apparatus, identifying, by the
- Exemplary embodiments of the present invention also provide a communication method for a communication apparatus, the communication method comprising: dividing, of an output image which is an image that the communication apparatus has output and includes at least a shot image obtained by shooting the surroundings of another communication apparatus which communicates with the communication apparatus, a display area indicating an area where the shot image is displayed into as many unit areas as the number of people captured in the shot image; performing control of transmitting coordinate information which indicates, of the output image, a unit area corresponding to a position pointed by a user to the other communication apparatus; and when the communication apparatus has received output information including a voice subjected to directivity control according to, of the shot image, an area corresponding to the coordinate information from the other communication apparatus, performing control of outputting the received output information.
- FIG. 1 is a diagram showing an example of a configuration of a communication system;
- FIG. 2 is a diagram showing an example of a hardware configuration of a conference terminal;
- FIG. 3 is a diagram showing an example of the installation of microphones;
- FIG. 4 is a diagram for explaining details of the conference terminal;
- FIG. 5 is a diagram showing an example of a hardware configuration of a projector;
- FIG. 6 is a schematic diagram showing an example of circumstances of a video conference in an embodiment of the present invention;
- FIGS. 7A, 7B, and 7C are diagrams for explaining how to specify a cutout area of a projected image;
- FIG. 8 is a diagram for explaining a cutout image;
- FIG. 9 is a diagram showing an example of functions that the projector has;
- FIG. 10 is a diagram showing an example of a projected image;
- FIG. 11 is a diagram showing an example of functions that a control unit of the conference terminal has;
- FIG. 12 is a flowchart showing an example of the operation of the projector;
- FIG. 13 is a flowchart showing an example of the operation of the conference terminal;
- FIG. 14 is a flowchart showing another example of the operation of the conference terminal;
- FIG. 15 is a flowchart showing still another example of the operation of the conference terminal;
- FIG. 16 is a diagram showing an example of a projected image; and
- FIG. 17 is a diagram for explaining a variation of how to divide a display area.
- FIG. 1 is a diagram showing an example of a configuration of a communication system 1 according to the present embodiment.
- one conference terminal 2 is set up in each of Locations A and B.
- the conference terminals 2 set up in Locations A and B are each connected to a server 4 via a network 3 such as the Internet.
- the number of conference terminals 2 (the number of locations) included in the communication system 1 is not limited to this, and can be arbitrarily changed.
- the server 4 monitors whether each conference terminal 2 is connected to the server 4 , and performs control required at the time of a conference, such as control of calling the conference terminals 2 at the start of the conference.
- the conference terminal 2 transmits image and voice data to the server 4
- the server 4 transmits the image and voice data to the other conference terminal 2 on the side of the other party.
- the conference terminal 2 receives image and voice data of the other conference terminal 2 on the side of the other party through the server 4 .
- when a conference is conducted between Locations A and B, data that the conference terminal 2 of Location A has transmitted is transmitted to the conference terminal 2 of Location B through the server 4, and is not transmitted to the other conference terminals 2 (conference terminals 2 not participating in the conference).
- data that the conference terminal 2 of Location B has transmitted is transmitted to the conference terminal 2 of Location A participating in the conference through the server 4 , and is not transmitted to the other conference terminals 2 not participating in the conference.
- FIG. 2 is a diagram showing an example of a hardware configuration of a conference terminal 2 .
- the conference terminal 2 includes a panoramic camera 10 , a display unit 11 , a microphone array 12 , a speaker 13 , a CPU 14 , a storage device 15 , a memory 16 , a LAN I/F unit 17 , and an operation unit 18 .
- the panoramic camera 10 is an example of an “image shooting unit,” and generates a shot image obtained by shooting an image.
- the panoramic camera 10 generates a panoramic image (an example of a shot image) obtained by shooting a 360-degree panorama around the panoramic camera 10 (which can be considered as a 360-degree panorama around the conference terminal 2 ), and transmits the generated panoramic image to the CPU 14 .
- the panoramic camera 10 is composed of a known omnidirectional camera or the like. By shooting a 360-degree panorama around the panoramic camera 10 , a shot image in which all conference participants existing around the panoramic camera 10 are captured can be generated.
- a panoramic image here means an image generated by combining multiple images (shot images) taken either by moving one camera, by using one camera with a plurality of imaging sensors, or by using multiple cameras.
- the area shot by the panoramic camera 10 is a 360-degree panorama around the panoramic camera 10 ; however, the area can be smaller than this.
- the display unit 11 has a function of displaying image data received from the CPU 14 on a screen.
- the display unit 11 is composed of a liquid crystal display device or the like.
- the microphone array 12 includes a plurality of microphones installed dispersively in the conference terminal 2 , and has a function of acquiring conference participant's voice and transmitting the acquired voice to the CPU 14 .
- the microphone array 12 is composed of a plurality of omnidirectional microphones, and forms directivity by means of a beamformer.
- the term “beamformer” here refers to a technique for forming directivity by using differences in the time for sound to reach the individual microphones.
- the directivity formed by the beamformer can be set to any target direction, such as a vertical or horizontal direction, according to the arrangement of the microphones. Therefore, the area from which the microphone array picks up sound (the target area) can be arbitrarily changed.
- FIG. 3 is a diagram showing an example where six microphones a to f included in the microphone array 12 are installed in a housing of a conference terminal 2 .
- the CPU 14 switches each microphone on/off, i.e., enables or disables each microphone, and adds up the voices picked up by the enabled microphones, and can thereby pick up a voice from an arbitrary area.
- the correspondence relationship between the microphones and their installation positions in the housing of the conference terminal 2 has been stored in the storage device 15 of the conference terminal 2 in advance.
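The stored correspondence and the on/off selection can be sketched as follows. The angular coverage assigned to microphones a to f below is hypothetical (the text only says a correspondence table is stored in advance), as are the function names.

```python
# Hypothetical table: the angular range (degrees around the terminal)
# each microphone covers. The patent stores a correspondence between
# microphones and their installation positions in the storage device.
MIC_COVERAGE = {
    "a": (0, 60), "b": (60, 120), "c": (120, 180),
    "d": (180, 240), "e": (240, 300), "f": (300, 360),
}

def mics_for_area(start_deg, end_deg):
    """Return the microphones to enable for a target angular area
    (those whose coverage overlaps the requested range)."""
    return [m for m, (lo, hi) in MIC_COVERAGE.items()
            if lo < end_deg and hi > start_deg]

def pick_up(voices, start_deg, end_deg):
    """Enable the relevant microphones and add up their samples;
    disabled microphones contribute nothing."""
    enabled = mics_for_area(start_deg, end_deg)
    n = len(voices[enabled[0]])
    mixed = [sum(voices[m][i] for m in enabled) / len(enabled)
             for i in range(n)]
    return mixed, enabled
```

For example, a target area spanning 30–90 degrees would enable microphones a and b only.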
- the speaker 13 has a function of outputting voice data received from the CPU 14
- the CPU 14 controls the operation of the entire conference terminal 2 .
- the CPU 14 has a function of controlling a video conference, a CODEC function, etc.;
- the CODEC function encodes an image acquired from the panoramic camera 10 and voice data acquired from the microphone array 12 and transmits the encoded image and voice data to the LAN I/F unit 17; it also decodes the other party's image and voice data received by the LAN I/F unit 17 and transmits the decoded image and voice data to the display unit 11 and the speaker 13.
- the CODEC used by the CPU 14 is, for example, H.264/AVC or H.264/SVC.
- the CPU 14 further has a function of controlling the directivity of the microphone array 12 , a function of displaying a close-up of a speaker who is one of conference participants captured in a panoramic image acquired from the panoramic camera 10 , etc.
- the storage device 15 stores therein various control programs (such as a video conference control program) executed by the CPU 14 , a conversion table to be described later, etc.
- the storage device 15 is, for example, a non-volatile storage medium such as a flash memory or an HDD.
- the memory 16 is used for loading a program executed by the CPU 14 and for temporary storage of operation data.
- the memory 16 is, for example, a volatile memory such as a DDR memory.
- the LAN I/F unit 17 connects the conference terminal 2 to another conference terminal 2 via the network 3 , and transmits/receives data (image and voice data) to/from the other conference terminal 2 .
- the LAN I/F unit 17 is, for example, a wired LAN interface compatible with 10BASE-T, 100BASE-TX, or 1000BASE-T for connection to an Ethernet (trademark) network, or a wireless LAN interface compatible with 802.11a/b/g/n/ac.
- the operation unit 18 is a device used in various operations (various operations related to control over devices of the conference terminal 2 ) made by a user, and includes, for example, a keyboard, buttons, etc.
- FIG. 4 is a diagram for explaining details of a conference terminal 2 .
- the conference terminal 2 is connected to a projector 300 which is an example of an output device.
- the projector 300 has a function of projecting an image input from the conference terminal 2 on a projection plane (for example, a screen) set up in a location where the projector 300 is placed.
- This projector 300 enables so-called interactive manipulation, i.e., enables a user to input various operations (such as pointing, clicking, and scrolling) by directly manipulating an area of the projection plane on which the image is projected with a special wireless interactive pen (a dedicated stylus pen).
- a DLP interactive projector manufactured by TI Inc. has a unique pixel-level tracking system embedded in the projection beam and is thus always aware of the position at which a dedicated stylus pen is pointing on the projection plane; it therefore requires no calibration at start-up, and no recalibration even after the projector has been moved.
- the projector 300 based on this system and a dedicated stylus pen 400 are used.
- the projector 300 has a function of performing wireless communication with each of the stylus pen 400 and the conference terminal 2 ; the stylus pen 400 has a function of performing wireless communication with the projector 300 .
- the projector 300 performs wireless communication with the stylus pen 400 and can thereby acquire, from the stylus pen 400, information indicating the position at which the stylus pen 400 is pointing on the projection plane; it can therefore always be aware of the position at which the stylus pen 400 (which can be considered as the user) is pointing on the projection plane.
- the conference terminal 2 also performs wireless communication with the projector 300, and can thereby always be aware of the position at which the stylus pen 400 is pointing on the projection plane.
- a pair of the same projector 300 and its dedicated stylus pen 400 is set up in each of Locations A and B.
- a conference terminal 2 and a projector 300 connected to the conference terminal 2 correspond to a “first communication apparatus” or a “second communication apparatus.”
- the type of an output device connected to the conference terminal 2 is optional; for example, an interactive whiteboard can be connected to the conference terminal 2 .
- a single device having both the functions of the conference terminal 2 and the functions of the projector 300 can be set up in each location.
- the single device corresponds to the “first communication apparatus” or the “second communication apparatus.”
- the conference terminal 2 includes a network unit 100 , a control unit 101 , a display control unit 102 , a wireless communication unit 109 , a camera I/F receiver 111 , a lens-characteristics holding unit 112 , and a distortion-correction processing unit 113 besides the panoramic camera 10 , the microphone array 12 , the speaker 13 , the storage device 15 , etc.
- Respective functions of the network unit 100, the control unit 101, the display control unit 102, the wireless communication unit 109, the camera I/F receiver 111, and the distortion-correction processing unit 113 can be realized by the CPU 14 executing a program stored in the storage device 15 or the like, or at least some of these functions can be realized by dedicated hardware circuitry (such as a semiconductor integrated circuit). Furthermore, for example, the lens-characteristics holding unit 112 can be realized by the storage device 15.
- the network unit 100 transmits/receives data to/from another conference terminal 2 that is the other party of a conference.
- the control unit 101 is a part that performs various controls and arithmetic operations. Functions that the control unit 101 has will be described in detail later.
- the display control unit 102 has a function of controlling display (projection of an image on the projection plane) performed by the projector 300 .
- the wireless communication unit 109 performs wireless communication with the projector 300 , and acquires position information that indicates the position at which the stylus pen 400 is pointing on the projection plane from the projector 300 .
- the wireless communication unit 109 can notify the control unit 101 of the position information acquired from the projector 300 .
- a panoramic image generated by the panoramic camera 10 is sent to the camera I/F receiver 111 .
- the camera I/F receiver 111 is assumed to be a high-speed serial I/F, such as V-by-One (trademark) or HDMI (trademark).
- the distortion-correction processing unit 113 corrects distortion of the panoramic image subjected to serial/parallel conversion by the camera I/F receiver 111 , and outputs the corrected panoramic image to the control unit 101 .
- the lens-characteristics holding unit 112 stores therein conversion parameters for correcting distortion according to lens characteristics, and the distortion-correction processing unit 113 can use a conversion parameter to correct distortion of a panoramic image.
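A one-coefficient radial model illustrates the kind of per-pixel correction such conversion parameters enable. This is only a minimal sketch; the real parameters depend on the lens characteristics held in unit 112 and are not specified in the text.

```python
def undistort_point(x, y, cx, cy, k1):
    """Correct radial lens distortion for one pixel around the image
    center (cx, cy) using a single coefficient k1 (a hypothetical
    minimal model, not the patent's conversion table)."""
    dx, dy = x - cx, y - cy
    r2 = dx * dx + dy * dy          # squared distance from the center
    scale = 1 + k1 * r2             # radial correction factor
    return cx + dx * scale, cy + dy * scale
```

With k1 = 0 the mapping is the identity; a small positive k1 pushes off-center pixels outward, compensating barrel distortion typical of wide-angle lenses.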
- FIG. 5 is a schematic diagram showing an example of a hardware configuration of the projector 300 .
- the projector 300 includes a CPU 311 , a storage unit 312 , an input unit 313 , a communication I/F 314 , and a projecting unit 315 ; these units are connected via a bus.
- the CPU 311 executes a program stored in the storage unit 312 , and controls the operation of the projector 300 overall.
- the storage unit 312 is composed of a ROM or HDD for storing therein a program executed by the CPU 311 and data required to execute the program, a RAM that serves as a work area of the CPU 311 , etc.
- the input unit 313 is used to perform various inputs to the projector 300 , and includes a touch panel, key switches, etc.
- the communication I/F 314 is an interface for communicating with the stylus pen 400 and the conference terminal 2 .
- the projecting unit 315 projects image data to be projected on the projection plane, such as a screen.
- the projecting unit 315 includes a projection optical system, such as a projection lens. Functions that the projector 300 has will be described later.
- FIG. 6 is a schematic diagram showing an example of circumstances of a video conference in the present embodiment.
- in Location 1, a conference terminal 2 is put on a desk. Since the conference terminal 2 is equipped with the panoramic camera 10, it is assumed to be put at the center of the desk. As described above, this conference terminal 2 is equipped with the microphone array 12 including the microphones a to f. FIG. 6 shows five people, Persons D to H, participating in the video conference in Location 1.
- a projector 300 is connected to the conference terminal 2 set up in Location 1 via a video output cable; an image including a panoramic image on the side of Location 2 (an image in which a panoramic image on the side of Location 2 is displayed) is projected on a projection plane in Location 1.
- an image projected on the projection plane on the side of Location 1 is referred to as a “projected image 1 ”
- an image projected on a projection plane on the side of Location 2 is referred to as a “projected image 2 .”
- a projected image here is an example of an “output image.”
- a panoramic image in which all people participating in the conference in Location 2 (in this example, Persons A to C) are captured is displayed.
- a cutout image that is a close-up of Person A, who is a speaker, is displayed. In a video conference system with a conventional speaker-tracking function, when any one of the conference participants starts speaking, a close-up of the speaker is displayed as a cutout image.
- a conference terminal 2 is put on a desk.
- the conference terminal 2 is assumed to be put on the center of the desk, and is equipped with the microphone array 12 including the microphones a to f.
- a projector 300 is connected to the conference terminal 2 set up in Location 2 via a video output cable; an image including a panoramic image on the side of Location 1 (an image in which a panoramic image on the side of Location 1 is displayed) is projected on a projection plane in Location 2 .
- a panoramic image in which all people participating in the conference in Location 1 (in this example, Persons D to H) are captured is displayed.
- a speaker is normally displayed in a cutout image; however, for example, if any one of the conference participants in Location 2 has specified an area enclosed by a dotted line with the stylus pen 400 as shown in FIG. 6 , a voice subjected to directivity control according to the specified area is output, and an image of the specified area is displayed as a cutout image on the projected image 2 . That is, by specifying an area in which, of the conference participants captured in the panoramic image, a person whose close-up is to be displayed is captured, an output focused on the specified area can be performed.
- how to specify a cutout area of a projected image is explained with reference to FIGS. 7A, 7B, and 7C.
- upon receipt of a press on a START icon 310 instructing the start of cutout-area specification, the projector 300 performs control of displaying a pop-up screen for confirming the start of cutout-area specification, as shown in FIG. 7A (under the control of the display control unit 102). Then, upon receipt of a press on the “OK” button on the pop-up screen, the projector 300 performs control of displaying a message prompting the user to specify the center point of a cutout, as shown in FIG. 7B. After that, the user can specify the center point of the cutout with the stylus pen 400 or the like.
- a display area indicating an area where a panoramic image (a panoramic image on the side of the other party of a conference) is displayed is divided into as many unit areas as the number of people captured in the panoramic image; when the user has performed an operation of pointing with the stylus pen 400 or the like at a point on any one of the people captured in the display area (which need not be the exact center of that person) as the center point of a cutout, the unit area including that point is specified as the cutout area.
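The equal division and the lookup of the unit area containing the pointed position can be sketched as follows. A left-to-right split of the display area is one plausible reading of the scheme; the function names are illustrative, not from the patent.

```python
def divide_display_area(width, height, n_people):
    """Equally divide the panoramic display area into one unit area
    per person, left to right. Each unit area is (x0, y0, x1, y1)."""
    w = width / n_people
    return [(round(i * w), 0, round((i + 1) * w), height)
            for i in range(n_people)]

def unit_area_for_point(areas, x, y):
    """Return the unit area containing the pointed position,
    or None if the point lies outside the display area."""
    for x0, y0, x1, y1 in areas:
        if x0 <= x < x1 and y0 <= y < y1:
            return (x0, y0, x1, y1)
    return None
```

Pointing anywhere inside a person's unit area then selects that whole area as the cutout area, which is what simplifies the specifying operation.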
- This can simplify the operation of specifying, of a display area of a projected image, an area in which an object to be focused on is captured (the operation of specifying a cutout area), and therefore, it is possible to improve the user-friendliness.
- the projector 300 transmits coordinate information indicating the coordinates of the specified cutout area (a rectangle enclosed by a dotted line in the example shown in FIGS. 7A, 7B, and 7C ) on the projection plane to the conference terminal 2 .
- the conference terminal 2 performs control of transmitting the coordinate information received from the projector 300 to another conference terminal 2 that is the other party of the conference.
- an area 330 in which Person H is captured is assumed to be specified as a cutout area.
- the area 330 is a rectangular area
- coordinate information of the area 330 is information indicating respective coordinates of four vertices (A, B, C, and D) of the area 330 .
- the coordinates of the vertices A, B, C, and D on the projected image 2 are (Xa, Ya), (Xb, Yb), (Xc, Yc), and (Xd, Yd), respectively.
- This coordinate information is transmitted to the conference terminal 2 of the Location 1 side.
- the conference terminal 2 on the side of Location 1 generates output information including a cutout image which is, of the panoramic image on the side of Location 1 , an area corresponding to the coordinate information received from the conference terminal 2 on the side of Location 2 and a voice subjected to directivity control according to the area, and transmits the generated output information to the conference terminal 2 on the side of Location 2 .
- the conference terminal 2 on the side of Location 1 cuts out, out of the panoramic image acquired from the panoramic camera 10 included in the conference terminal 2 , an image of a rectangular area defined by four vertices: A′ (Xa′, Ya′), B′ (Xb′, Yb′), C′ (Xc′, Yc′), and D′ (Xd′, Yd′) as a cutout image.
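The conversion from projected-image coordinates (A–D) to panoramic-image coordinates (A′–D′), and the cutout itself, might look like the following. The patent relies on a stored conversion table, so the linear scaling here is only an assumption, and the names are ours.

```python
def to_panorama_coords(pt, display_area, pano_size):
    """Map a point on the projected image's panoramic display area
    to the corresponding point on the original panoramic image by
    linear scaling (an assumed stand-in for the conversion table)."""
    (dx0, dy0, dx1, dy1) = display_area   # display area on the projection
    (pw, ph) = pano_size                  # panoramic image dimensions
    x, y = pt
    return ((x - dx0) * pw / (dx1 - dx0),
            (y - dy0) * ph / (dy1 - dy0))

def cut_out(pano, verts):
    """Crop the rectangle bounded by the four converted vertices
    from a panoramic image stored as a list of pixel rows."""
    xs = [v[0] for v in verts]
    ys = [v[1] for v in verts]
    x0, x1 = int(min(xs)), int(max(xs))
    y0, y1 = int(min(ys)), int(max(ys))
    return [row[x0:x1] for row in pano[y0:y1]]
```

The cropped region is then encoded and sent back as the cutout image in the output information.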
- the conference terminal 2 of the Location 1 side controls the directivity of the microphone array 12 to increase the sensitivity of a microphone closest to the position defined by the coordinates of the area of the panoramic image corresponding to the coordinate information received from the conference terminal 2 of the Location 2 side based on position information indicating a relationship between the positions of the microphones included in the microphone array 12 and the coordinates of the panoramic image.
- the conference terminal 2 of the Location 1 side transmits the output information including the cutout image cut out as described above and the voice subjected to directivity control to the conference terminal 2 of the Location 2 side.
- the conference terminal 2 of the Location 2 side outputs the output information received from the conference terminal 2 of the Location 1 side.
- FIG. 9 is a diagram showing an example of functions that a projector 300 has.
- the projector 300 includes a projection control unit 321 , an area dividing unit 322 , a unit-area selecting unit 323 , and a coordinate-information-transmission control unit 324 .
- functions related to the present embodiment are mainly shown as an example; however, functions that the projector 300 has are not limited to these.
- the projection control unit 321 performs control of projecting an image input from the conference terminal 2 on a projection plane under the control of the display control unit 102 .
- the area dividing unit 322 divides, of a projected image indicating an image that the projector 300 has projected on the projection plane, a display area indicating an area where a panoramic image (a panoramic image on another conference terminal 2 that is the other party of a conference) obtained by shooting the surroundings of the other conference terminal 2 of the other party is displayed into as many unit areas as the number of people captured in the panoramic image.
- the area dividing unit 322 equally divides the display area into as many unit areas as the number of user's operations performed to point at people captured in the display area (which can be considered as the number of positions in the display area indicated by the user's operations), thereby obtaining a plurality of unit areas.
- a projected image output by the projector 300 is assumed to be as shown in FIG. 10 .
- when the projector 300 has received a press on an icon for instructing it to execute calibration of a correspondence relationship between positions pointed by a user and a plurality of unit areas on a projected image, as with the case of the above-described icon for specifying a cutout area, the projector 300 goes into calibration mode, and the area dividing unit 322 performs control of outputting information (which can be an image or a voice) prompting the user to point at the center point of each of the people captured in the display area.
- the area dividing unit 322 equally divides the display area into five parts laterally (horizontally), thereby obtaining five unit areas (unit areas 401 to 405 ).
- the projector 300 exits the calibration mode when it has obtained the unit areas.
- the unit-area selecting unit 323 selects, of a projected image output by the projector 300 , a unit area corresponding to the position pointed by a user.
- the unit-area selecting unit 323 selects a unit area including the coordinates of the specified center point of the cutout.
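The equal-division and selection behavior of the area dividing unit 322 and the unit-area selecting unit 323 can be sketched as below. The function names and the representation of a unit area as a pair of lateral bounds are illustrative assumptions; the patent only specifies that the display area is divided equally and that the unit area containing the pointed position is selected.

```python
def divide_display_area(x_left: int, x_right: int, n_people: int):
    """Return the (x_min, x_max) lateral bounds of each equal unit area."""
    width = (x_right - x_left) / n_people
    return [(x_left + i * width, x_left + (i + 1) * width)
            for i in range(n_people)]

def select_unit_area(unit_areas, pointed_x: float) -> int:
    """Index of the unit area containing the pointed lateral position."""
    for i, (x_min, x_max) in enumerate(unit_areas):
        if x_min <= pointed_x < x_max:
            return i
    return len(unit_areas) - 1  # clamp to the last unit area

areas = divide_display_area(0, 1000, 5)  # five unit areas, as in FIG. 10
print(areas[0])                          # (0.0, 200.0)
print(select_unit_area(areas, 450))      # 2  (the third unit area)
```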
- the coordinate-information-transmission control unit 324 performs control of transmitting coordinate information indicating a unit area selected by the unit-area selecting unit 323 to the conference terminal 2 connected to the projector 300 .
- the above-described functions that the projector 300 has can be realized by the CPU 311 executing a program stored in the storage unit 312 or the like, or at least some of these functions can be realized by dedicated hardware circuitry (such as a semiconductor integrated circuit).
- FIG. 11 is a diagram showing an example of functions that the control unit 101 of a conference terminal 2 has.
- functions related to the present embodiment are mainly shown as an example; however, functions that the control unit 101 has are not limited to these.
- the control unit 101 includes a first transmission control unit 121 , an acquiring unit 122 , an identifying unit 123 , a cutting-out unit 124 , a directivity control unit 125 , a second transmission control unit 126 , and an output control unit 127 .
- the first transmission control unit 121 performs, when having received coordinate information from a projector 300 connected to the conference terminal 2 , control of transmitting the received coordinate information to another conference terminal 2 that is the other party of a conference. That is, the first transmission control unit 121 performs control of transmitting coordinate information indicating, of a projected image, a unit area corresponding to the position pointed by a user to another conference terminal 2 that is the other party of a conference.
- the acquiring unit 122 acquires a panoramic image obtained by the panoramic camera 10 shooting the surroundings of the conference terminal 2 .
- the acquiring unit 122 acquires a corrected panoramic image input from the distortion-correction processing unit 113 .
- the identifying unit 123 identifies, of a panoramic image acquired by the acquiring unit 122 , an area corresponding to the received coordinate information based on correspondence information indicating a correspondence relationship between the coordinates of the projected image (which can be considered as the coordinates of an area on which an image is projected on a projection plane) and the coordinates of the panoramic image.
- the correspondence information has been stored in the storage device 15 in advance.
- a general video conference system enables a user to freely change the layout (change the display mode), such as to project only an image of the user's party or to project only an image of the other party; therefore, the relationship between the coordinates of a projected image and the coordinates of a panoramic image is not always in one-to-one correspondence. Accordingly, the correspondence information in this example associates the coordinates of a projected image with the coordinates of a panoramic image with respect to each display mode (layout information) of the projector 300 .
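Because the layout can change, the correspondence information described above must be keyed by display mode. A minimal sketch of such a per-layout lookup is shown below; the mode names, the linear scale/offset mapping, and the numeric values are all assumptions for illustration (the patent does not specify the form the correspondence information takes).

```python
CORRESPONDENCE = {
    # display mode -> (x scale, x offset, y scale, y offset)
    "other_party_only": (2.0, 0.0, 1.0, 0.0),
    "split_view":       (4.0, 960.0, 2.0, 0.0),
}

def to_panorama_coords(mode: str, x: float, y: float):
    """Map projected-image coordinates to panoramic-image coordinates."""
    sx, ox, sy, oy = CORRESPONDENCE[mode]
    return (x * sx + ox, y * sy + oy)

print(to_panorama_coords("other_party_only", 320, 120))  # (640.0, 120.0)
```

The identifying unit 123 would consult the entry for the projector's current display mode before translating the received coordinate information into panoramic-image coordinates.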
- the cutting-out unit 124 cuts out, of a panoramic image acquired by the acquiring unit 122 , an image of an area identified by the identifying unit 123 as a cutout image.
- the directivity control unit 125 controls the directivity of the microphone array 12 to increase the sensitivity of, of a plurality of microphones installed dispersively in the conference terminal 2 , a microphone corresponding to an area identified by the identifying unit 123 (in this example, an area within a panoramic image).
- the directivity control unit 125 can determine the microphone corresponding to the coordinates of the area identified by the identifying unit 123 based on position information indicating a relationship between the positions of the microphones included in the microphone array 12 and the coordinates of the panoramic image.
- the position information can be stored in, for example, the storage device 15 or the like in advance.
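Using that position information, the microphone to boost can be chosen as the one nearest the identified area. The sketch below assumes a hypothetical six-microphone layout whose positions are expressed as lateral panoramic-image coordinates; the actual mapping stored in the storage device 15 is not specified by the patent.

```python
# Hypothetical microphone positions in lateral panoramic-image coordinates.
MIC_POSITIONS = {1: 160, 2: 480, 3: 800, 4: 1120, 5: 1440, 6: 1760}

def closest_microphone(area_x_min: float, area_x_max: float) -> int:
    """Return the id of the microphone nearest the area's lateral center."""
    center = (area_x_min + area_x_max) / 2
    return min(MIC_POSITIONS, key=lambda m: abs(MIC_POSITIONS[m] - center))

print(closest_microphone(700, 1000))  # 3  (center 850 is nearest mic 3 at 800)
```

The directivity control unit 125 would then raise the sensitivity of the returned microphone (and, in a real beamforming implementation, adjust per-microphone delays rather than a single gain).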
- the second transmission control unit 126 performs control of transmitting output information including at least a voice subjected to directivity control by the directivity control unit 125 to another conference terminal 2 .
- the second transmission control unit 126 performs control of transmitting output information including a voice subjected to directivity control by the directivity control unit 125 and a cutout image cut out by the cutting-out unit 124 to another conference terminal 2 .
- the second transmission control unit 126 performs control of transmitting output information including a panoramic image acquired by the acquiring unit 122 , a voice subjected to directivity control by the directivity control unit 125 , and a cutout image cut out by the cutting-out unit 124 to another conference terminal 2 .
- the output information may include at least a voice subjected to directivity control by the directivity control unit 125 (a voice subjected to directivity control according to, of a shot image acquired by the acquiring unit 122 , an area corresponding to coordinate information received from another conference terminal 2 ).
- the control unit 101 can be configured to not include, for example, the cutting-out unit 124 .
- the second transmission control unit 126 performs control of transmitting general conference information including a panoramic image acquired by the acquiring unit 122 , a cutout image that is a close-up of a speaker who is one of conference participants captured in the panoramic image, and voice data picked up by the microphone array 12 to the other conference terminal 2 .
- the output control unit 127 performs control of outputting an image and voice received from another conference terminal 2 .
- the output control unit 127 performs control of instructing the display control unit 102 to cause the projector 300 to output (project) an image received from another conference terminal 2 onto a projection plane and outputting a voice received from the other conference terminal 2 from the speaker 13 .
- the output control unit 127 in the present embodiment performs control of outputting the received output information. More specifically, the output control unit 127 performs control of instructing the display control unit 102 to output a composite image of a cutout image and a panoramic image that are included in the received output information and outputting a voice included in the received output information from the speaker 13 .
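One way the composite image of the cutout image and the panoramic image could be assembled is sketched below. The vertical stacking layout is purely an assumption; the patent only states that a composite of the two images is output.

```python
import numpy as np

def composite(cutout: np.ndarray, panorama: np.ndarray) -> np.ndarray:
    """Stack the cutout image above the panoramic image in one frame."""
    width = max(cutout.shape[1], panorama.shape[1])

    def pad(img):
        # Pad each image on the right to the common width with black pixels.
        out = np.zeros((img.shape[0], width, 3), dtype=img.dtype)
        out[:, :img.shape[1]] = img
        return out

    return np.vstack([pad(cutout), pad(panorama)])

cut = np.zeros((200, 300, 3), dtype=np.uint8)   # stand-in cutout image
pan = np.zeros((240, 1920, 3), dtype=np.uint8)  # stand-in panoramic image
print(composite(cut, pan).shape)  # (440, 1920, 3)
```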
- the output control unit 127 performs control of outputting the received general conference information.
- the functions that the control unit 101 has can be realized by the CPU 14 executing a program stored in the storage device 15 or the like, or at least some of these functions can be realized by dedicated hardware circuitry (such as a semiconductor integrated circuit).
- the panoramic camera 10 and the speaker 13 are included in the conference terminal 2 ; however, the configuration of the conference terminals 2 is not limited to this, and, for example, the panoramic camera 10 and the speaker 13 can be provided outside of the conference terminals 2 .
- FIG. 12 is a flowchart showing an example of the operation of the projector 300 when a cutout area is specified.
- Upon receipt of a press on the START icon 310 (YES at Step S1), the projector 300 receives an operation of specifying the center point of a cutout (Step S2). Then, the projector 300 selects a unit area corresponding to the center point of the cutout specified at Step S2 (Step S3). And then, upon receipt of a press on the Exit icon 320 (YES at Step S4), the projector 300 transmits coordinate information indicating the unit area selected at Step S3 to the conference terminal 2 (Step S5).
- FIG. 13 is a flowchart showing an example of the operation of the conference terminal 2 upon receipt of coordinate information from the projector 300 connected to the conference terminal 2 .
- the first transmission control unit 121 performs control of transmitting the received coordinate information to another conference terminal 2 (Step S 7 ).
- FIG. 14 is a flowchart showing an example of the operation of the conference terminal 2 upon receipt of coordinate information from another conference terminal 2 .
- the identifying unit 123 identifies, of a panoramic image acquired by the acquiring unit 122 (a panoramic image acquired from the panoramic camera 10 of the conference terminal 2 ), an area corresponding to the received coordinate information based on correspondence information (Step S 11 ).
- the cutting-out unit 124 cuts out, out of the panoramic image acquired by the acquiring unit 122 , an image of the area identified at Step S 11 as a cutout image (Step S 12 ).
- the directivity control unit 125 controls the directivity of the microphone array 12 to increase the sensitivity of, of a plurality of microphones installed dispersively in the conference terminal 2 , a microphone corresponding to the area identified at Step S 11 (Step S 13 ). And then, the second transmission control unit 126 performs control of transmitting output information including the panoramic image acquired by the acquiring unit 122 , the cutout image cut out at Step S 12 , and a voice subjected to directivity control obtained as a result of Step S 13 to the other conference terminal 2 (Step S 14 ).
- FIG. 15 is a flowchart showing an example of the operation of the conference terminal 2 upon receipt of output information from another conference terminal 2 .
- the output control unit 127 performs control of outputting the received output information (Step S 21 ).
- Upon receipt of coordinate information from another conference terminal 2 that is the other party of a conference, the conference terminal 2 in the present embodiment identifies, of a panoramic image acquired from the panoramic camera 10 of the conference terminal 2 , an area corresponding to the received coordinate information based on correspondence information, and cuts out an image of the identified area as a cutout image. Furthermore, the conference terminal 2 controls the directivity of the microphone array 12 to increase the sensitivity of, of a plurality of microphones installed dispersively in the conference terminal 2 , a microphone corresponding to the identified area. Then, the conference terminal 2 transmits output information including the cutout image and a voice subjected to the directivity control to the other conference terminal 2 , and the other conference terminal 2 outputs the received output information.
- the other conference terminal 2 can perform an intended output.
- the shooting range of the panoramic camera is a 360-degree panorama.
- the essentials of the present embodiment are to specify a portion of a shot image as a cutout image and control the directivity of the microphone array to increase the sensitivity of a microphone corresponding to the cutout image. Therefore, as the shooting range, the angle of view of the camera can be below 360 degrees, and, for example, can be about 80 degrees.
- the area dividing unit 322 equally divides the display area into as many unit areas as the number of user's operations performed to point at people captured in the display area, thereby obtaining a plurality of unit areas; however, for example, we assume a case where the positions of people captured in the display area are converged on one side as shown in FIG. 16 . In this case, if the display area is equally divided laterally by the number of people captured in the display area, obtained unit areas do not correspond one-to-one to the people captured in the display area; therefore, it may be difficult to appropriately specify an area where a target person is captured (as a cutout area).
- the area dividing unit 322 can be configured to divide the display area into unit areas that correspond one-to-one to multiple positions in the display area indicated by user's operations on the basis of a relative positional relationship between four vertices of a projected image and the multiple positions in the display area indicated by the user's operations. Specific contents of this are explained below.
- When the projector 300 has gone into the calibration mode, the area dividing unit 322 performs control of outputting information (which can be an image or a voice) prompting a user to point at four vertices of a projected image.
- the area dividing unit 322 does not have to perform this control because respective coordinates of four vertices of an image (an example of an output image) that the interactive whiteboard displays thereon are recognized in advance.
- the area dividing unit 322 performs control of outputting information prompting the user to point at the center point of each of the people captured in the display area, and detects multiple positions in the display area indicated by user's operations.
- the positions of cutout lines (cutout lines extending vertically) for cutting out multiple unit areas that correspond one-to-one to the indicated positions can be found.
- the focused indicated position is referred to as the “focused position.”
- the lateral position of a right-hand cutout line for cutting out a unit area including the focused position can be found as the position at a distance from the left side of the display area in a lateral direction of, for example, a difference between the lateral length of the display area and the distance from the right side of the display area to the lateral position of the focused position plus half the distance between the lateral position of the focused position and the lateral position of the indicated position adjacent to the right side of the focused position.
- the lateral position of a left-hand cutout line for cutting out the unit area including the focused position can be found as the position at a distance from the left side of the display area in the lateral direction of, for example, the difference between the lateral length of the display area and the distance from the right side of the display area to the lateral position of the focused position minus half the distance between the lateral position of the focused position and the lateral position of the indicated position adjacent to the left side of the focused position.
- the lateral position of a left-hand cutout line for cutting out the unit area including the focused position can be found as the position at a distance from the left side of the display area in the lateral direction of, for example, a difference between the lateral length of the display area and the distance from the right side of the display area to the lateral position of the focused position minus half the distance between the lateral position of the focused position and the lateral position of the indicated position adjacent to the right side of the focused position.
- the lateral position of a right-hand cutout line for cutting out the unit area including the focused position can be found as the position at a distance from the left side of the display area in the lateral direction of, for example, a difference between the lateral length of the display area and the distance from the right side of the display area to the lateral position of the focused position plus half the distance between the lateral position of the focused position and the lateral position of the indicated position adjacent to the left side of the focused position.
- the lateral direction is x-direction
- the vertical direction is y-direction.
- the coordinates of the upper-left vertex of the projected image are (a1, a2)
- the coordinates of the upper-right vertex are (b1, b2)
- the coordinates of the lower-right vertex are (c1, c2)
- the coordinates of the lower-left vertex are (0, 0).
- the coordinates of the indicated position corresponding to Person D are (d1, d2)
- the coordinates of the indicated position corresponding to Person E are (e1, e2)
- the coordinates of the indicated position corresponding to Person F are (f1, f2)
- the coordinates of the indicated position corresponding to Person G are (g1, g2)
- the coordinates of the indicated position corresponding to Person H are (h1, h2).
- the lateral position of a right-hand cutout line for cutting out a unit area including the indicated position corresponding to Person G can be found as the position at a distance of (b1 − a1) − (b1 − g1) + ((h1 − g1)/2) from the left side of the display area in the x-direction.
- the lateral position of a left-hand cutout line for cutting out the unit area including the indicated position corresponding to Person G can be found as the position at a distance of (b1 − a1) − (b1 − g1) − ((g1 − f1)/2) from the left side of the display area in the x-direction.
- the following is how to find cutout lines for cutting out a unit area including the indicated position corresponding to Person H that has the adjacent indicated position on the left thereof and has no adjacent indicated position on the right thereof.
- the lateral position of a right-hand cutout line for cutting out the unit area including the indicated position corresponding to Person H can be found as the position at a distance of (b1 − a1) − (b1 − h1) + ((h1 − g1)/2) from the left side of the display area in the x-direction.
- the lateral position of a left-hand cutout line for cutting out the unit area including the indicated position corresponding to Person H can be found as the position at a distance of (b1 − a1) − (b1 − h1) − ((h1 − g1)/2) from the left side of the display area in the x-direction.
- the following is how to find cutout lines for cutting out a unit area including the indicated position corresponding to Person D that has no adjacent indicated position on the left thereof and has the adjacent indicated position on the right thereof.
- the lateral position of a right-hand cutout line for cutting out the unit area including the indicated position corresponding to Person D can be found as the position at a distance of (b1 − a1) − (b1 − d1) + ((e1 − d1)/2) from the left side of the display area in the x-direction.
- the lateral position of a left-hand cutout line for cutting out the unit area including the indicated position corresponding to Person D can be found as the position at a distance of (b1 − a1) − (b1 − d1) − ((e1 − d1)/2) from the left side of the display area in the x-direction.
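The cutout-line formulas above can be written out directly. In the sketch below, the numeric values chosen for a1, b1 and the indicated positions d1..h1 are made-up sample values; the formulas themselves follow the text (each cutout line sits halfway between the focused position and its adjacent indicated position).

```python
def right_cutout(a1, b1, p, right_neighbor):
    """Right-hand cutout line for focused position p with a right neighbor."""
    return (b1 - a1) - (b1 - p) + (right_neighbor - p) / 2

def left_cutout(a1, b1, p, left_neighbor):
    """Left-hand cutout line for focused position p with a left neighbor."""
    return (b1 - a1) - (b1 - p) - (p - left_neighbor) / 2

a1, b1 = 0, 1000                              # left/right edges of the projected image
d1, e1, f1, g1, h1 = 100, 220, 340, 460, 580  # sample indicated positions

# Unit area for Person G, who has neighbors on both sides (F left, H right).
print(left_cutout(a1, b1, g1, f1))   # 400.0
print(right_cutout(a1, b1, g1, h1))  # 520.0
```

For the edge cases (Persons D and H), the missing neighbor's half-distance is replaced by mirroring the half-distance to the one existing neighbor, as the text describes.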
- Programs executed by the conference terminal 2 or the projector 300 can be provided in such a manner that each program is recorded on a computer-readable recording medium, such as a CD-ROM, a flexible disk (FD), a CD-R, a digital versatile disk (DVD), or a universal serial bus (USB) flash drive, in an installable or executable file format, or can be provided or distributed via a network such as the Internet.
- the programs can be provided in such a manner that each program is built into a ROM or the like in advance.
- any of the above-described apparatus, devices or units can be implemented as a hardware apparatus, such as a special-purpose circuit or device, or as a hardware/software combination, such as a processor executing a software program.
- any one of the above-described and other methods of the present invention may be embodied in the form of a computer program stored in any kind of storage medium.
- storage media include, but are not limited to, flexible disks, hard disks, optical discs, magneto-optical discs, magnetic tapes, nonvolatile memory, semiconductor memory, read-only memory (ROM), etc.
- any one of the above-described and other methods of the present invention may be implemented by an application specific integrated circuit (ASIC), a digital signal processor (DSP) or a field programmable gate array (FPGA), prepared by interconnecting an appropriate network of conventional component circuits or by a combination thereof with one or more conventional general purpose microprocessors or signal processors programmed accordingly.
- Processing circuitry includes a programmed processor, as a processor includes circuitry.
- a processing circuit also includes devices such as an application specific integrated circuit (ASIC), digital signal processor (DSP), field programmable gate array (FPGA) and conventional circuit components arranged to perform the recited functions.
Description
- The present application claims priority under 35 U.S.C. §119 to Japanese Patent Application No. 2015-212270 filed Oct. 28, 2015. The contents of which are incorporated herein by reference in their entirety.
- 1. Field of the Invention
- The present invention relates to a communication system, a communication apparatus, and a communication method.
- 2. Description of the Related Art
- As a form of a communication system for performing data transmission/reception among multiple communication apparatuses, there is known a video (television) conference system that realizes a remote conference by using a communication network. This video conference system can conduct a conference between remote locations almost like a face-to-face meeting in such a way that, by using a communication apparatus (a terminal device) of the video conference system set up in a conference room where one of the parties participating in the remote conference is located, an image of the conference room, such as of the conference participants, and voice, such as a speaker's voice, are converted into digital data, and the digital data is transmitted to a communication apparatus of the other party to display the image on a display and output the voice from a speaker in the conference room of the other party.
- In such a video conference system, a microphone is used to acquire conference participants' voice, and a camera is used to acquire their image. The camera has an angle of view, and therefore cannot capture an image of a conference participant who is outside the angle of view. To solve this problem, there is known a method of using a panoramic camera capable of capturing a 360-degree panoramic image. On the other hand, a general microphone is omnidirectional, and therefore picks up the ambient sound besides participant's speech. To solve this problem, there is known a method of using a microphone array to control the directivity, i.e., increase the sensitivity of a microphone that picks up the sound from a particular direction, thereby suppressing the ambient sound from being picked up and enabling participant's speech to be heard clearly. For example, Japanese Unexamined Patent Application Publication No. 2007-274463 has disclosed a method of how a conference terminal forms a beam of sound picked up by a microphone array according to a selected placement pattern of participants. Furthermore, Japanese Patent No. 5028944 has disclosed a technology to detect the direction of a speaker by use of a microphone array composed of an array of a plurality of microphones and make the shooting direction of a camera follow the detected direction.
- There can be considered a method of realizing a more realistic video conference with a combination of a panoramic camera and a microphone array, namely, by shooting a whole conference room with the panoramic camera and, if there is a speaker, setting the sound pickup area of a microphone toward at least the speaker. However, although this combination can set the sound pickup area of a microphone toward a speaker, a communication apparatus of the other party may not want output focused on the speaker. Therefore, there is a problem that this combination does not enable an output intended by each communication apparatus composing a communication system.
- In view of the above, there is a need to provide a communication system, a communication apparatus, and a communication method that enable one's intended output.
- According to exemplary embodiments of the present invention, there is provided a communication system comprising a first communication apparatus and a second communication apparatus that transmits/receives data to/from the first communication apparatus, wherein the first communication apparatus includes: an area dividing unit that divides, of an output image which is an image that the first communication apparatus has output and includes at least a shot image obtained by shooting the surroundings of the second communication apparatus, a display area indicating an area where the shot image is displayed into as many unit areas as the number of people captured in the shot image; a first transmission control unit that performs control of transmitting coordinate information which indicates, of the output image, a unit area corresponding to a position pointed by a user to the second communication apparatus; and an output control unit that, when the first communication apparatus has received output information including a voice subjected to directivity control according to, of the shot image, an area corresponding to the coordinate information from the second communication apparatus, performs control of outputting the received output information, and the second communication apparatus includes: an acquiring unit that acquires the shot image; an identifying unit that, when the second communication apparatus has received the coordinate information from the first communication apparatus, identifies, of the shot image acquired by the acquiring unit, an area corresponding to the received coordinate information based on correspondence information indicating a correspondence relationship between the coordinates of the output image and the coordinates of the shot image; a directivity control unit that controls the directivity of a microphone array including a plurality of microphones to increase the sensitivity of, of the microphones installed in the second communication
apparatus, a microphone corresponding to the area identified by the identifying unit; and a second transmission control unit that performs control of transmitting output information including at least a voice subjected to directivity control by the directivity control unit to the first communication apparatus.
- Exemplary embodiments of the present invention also provide a communication apparatus comprising: an area dividing unit that divides, of an output image which is an image that the communication apparatus has output and includes at least a shot image obtained by shooting the surroundings of another communication apparatus which communicates with the communication apparatus, a display area indicating an area where the shot image is displayed into as many unit areas as the number of people captured in the shot image; a transmission control unit that performs control of transmitting coordinate information which indicates, of the output image, a unit area corresponding to a position pointed by a user to the other communication apparatus; and an output control unit that, when the communication apparatus has received output information including a voice subjected to directivity control according to, of the shot image, an area corresponding to the coordinate information from the other communication apparatus, performs control of outputting the received output information.
- Exemplary embodiments of the present invention also provide a communication method for a communication system including a first communication apparatus and a second communication apparatus that transmits/receives data to/from the first communication apparatus, the communication method comprising: dividing, by the first communication apparatus, of an output image which is an image that the first communication apparatus has output and includes at least a shot image obtained by shooting the surroundings of the second communication apparatus, a display area indicating an area where the shot image is displayed into as many unit areas as the number of people captured in the shot image; performing, by the first communication apparatus, control of transmitting coordinate information which indicates, of the output image, a unit area corresponding to a position pointed by a user to the second communication apparatus; when having received output information including a voice subjected to directivity control according to, of the shot image, an area corresponding to the coordinate information from the second communication apparatus, performing, by the first communication apparatus, control of outputting the received output information; acquiring, by the second communication apparatus, the shot image; when having received the coordinate information from the first communication apparatus, identifying, by the second communication apparatus, of the shot image acquired at the acquiring, an area corresponding to the received coordinate information on the basis of correspondence information indicating a correspondence relationship between the coordinates of the output image and the coordinates of the shot image; controlling, by the second communication apparatus, the directivity of a microphone array including a plurality of microphones to increase the sensitivity of, of the microphones installed in the second communication apparatus, a microphone corresponding to the area 
identified at the identifying; and performing, by the second communication apparatus, control of transmitting output information including at least a voice subjected to directivity control at the controlling to the first communication apparatus.
- Exemplary embodiments of the present invention also provide a communication method for a communication apparatus, the communication method comprising: dividing, of an output image which is an image that the communication apparatus has output and includes at least a shot image obtained by shooting the surroundings of another communication apparatus which communicates with the communication apparatus, a display area indicating an area where the shot image is displayed into as many unit areas as the number of people captured in the shot image; performing control of transmitting coordinate information which indicates, of the output image, a unit area corresponding to a position pointed by a user to the other communication apparatus; and when the communication apparatus has received output information including a voice subjected to directivity control according to, of the shot image, an area corresponding to the coordinate information from the other communication apparatus, performing control of outputting the received output information.
-
FIG. 1 is a diagram showing an example of a configuration of a communication system; -
FIG. 2 is a diagram showing an example of a hardware configuration of a conference terminal; -
FIG. 3 is a diagram showing an example of the installation of microphones; -
FIG. 4 is a diagram for explaining details of the conference terminal; -
FIG. 5 is a diagram showing an example of a hardware configuration of a projector; -
FIG. 6 is a schematic diagram showing an example of circumstances of a video conference in an embodiment of the present invention; -
FIGS. 7A, 7B, and 7C are diagrams for explaining how to specify a cutout area of a projected image; -
FIG. 8 is a diagram for explaining a cutout image; -
FIG. 9 is a diagram showing an example of functions that the projector has; -
FIG. 10 is a diagram showing an example of a projected image; -
FIG. 11 is a diagram showing an example of functions that a control unit of the conference terminal has; -
FIG. 12 is a flowchart showing an example of the operation of the projector; -
FIG. 13 is a flowchart showing an example of the operation of the conference terminal; -
FIG. 14 is a flowchart showing another example of the operation of the conference terminal; -
FIG. 15 is a flowchart showing still another example of the operation of the conference terminal; -
FIG. 16 is a diagram showing an example of a projected image; and -
FIG. 17 is a diagram for explaining a variation of how to divide a display area. - The accompanying drawings are intended to depict exemplary embodiments of the present invention and should not be interpreted to limit the scope thereof. Identical or similar reference numerals designate identical or similar components throughout the various drawings.
- The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the present invention.
- As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise.
- In describing preferred embodiments illustrated in the drawings, specific terminology may be employed for the sake of clarity. However, the disclosure of this patent specification is not intended to be limited to the specific terminology so selected, and it is to be understood that each specific element includes all technical equivalents that have the same function, operate in a similar manner, and achieve a similar result.
- An exemplary embodiment of a communication system, a communication apparatus, and a communication method according to the present invention will be described in detail below with reference to accompanying drawings.
-
FIG. 1 is a diagram showing an example of a configuration of a communication system 1 according to the present embodiment. In the example shown in FIG. 1, one conference terminal 2 is set up in each of Locations A and B. The conference terminals 2 set up in Locations A and B are each connected to a server 4 via a network 3 such as the Internet. Incidentally, the number of conference terminals 2 (the number of locations) included in the communication system 1 is not limited to this, and can be arbitrarily changed. - The
server 4 monitors whether eachconference terminal 2 is connected to theserver 4, and performs control required at the time of a conference, such as control of calling theconference terminals 2 at the start of the conference. When oneconference terminal 2 transmits data during the conference, theconference terminal 2 transmits image and voice data to theserver 4, and theserver 4 transmits the image and voice data to theother conference terminal 2 on the side of the other party. When oneconference terminal 2 receives data, theconference terminal 2 receives image and voice data of theother conference terminal 2 on the side of the other party through theserver 4. For example, when a conference is conducted in Locations A and B, data that aconference terminal 2 of Location A has transmitted is transmitted to aconference terminal 2 of Location B through theserver 4, and is not transmitted to the other conference terminals 2 (conference terminals 2 not participating in the conference). Likewise, data that theconference terminal 2 of Location B has transmitted is transmitted to theconference terminal 2 of Location A participating in the conference through theserver 4, and is not transmitted to theother conference terminals 2 not participating in the conference. By performing the control described above, a conference can be conducted among multiple conference terminals 2 (in multiple locations). - Subsequently, a configuration of the
conference terminals 2 is explained. Incidentally, theconference terminals 2 set up in Locations A and B have the same configuration; therefore, in the following description, oneconference terminal 2 is cited.FIG. 2 is a diagram showing an example of a hardware configuration of aconference terminal 2. As shown inFIG. 2 , theconference terminal 2 includes apanoramic camera 10, adisplay unit 11, amicrophone array 12, aspeaker 13, aCPU 14, astorage device 15, amemory 16, a LAN I/F unit 17, and anoperation unit 18. - The
panoramic camera 10 is an example of an “image shooting unit,” and generates a shot image obtained by shooting an image. In this example, thepanoramic camera 10 generates a panoramic image (an example of a shot image) obtained by shooting a 360-degree panorama around the panoramic camera 10 (which can be considered as a 360-degree panorama around the conference terminal 2), and transmits the generated panoramic image to theCPU 14. For example, thepanoramic camera 10 is composed of a known omnidirectional camera or the like. By shooting a 360-degree panorama around thepanoramic camera 10, a shot image in which all conference participants existing around thepanoramic camera 10 are captured can be generated. The term “panoramic image” here means an image generated by combining multiple images (shot images) taken by moving one camera with a plurality of imaging sensors or by using multiple cameras with a plurality of imaging sensors. Incidentally, in this example, the area shot by thepanoramic camera 10 is a 360-degree panorama around thepanoramic camera 10; however, the area can be smaller than this. - The
display unit 11 has a function of displaying image data received from theCPU 14 on a screen. In this example, thedisplay unit 11 is composed of a liquid crystal display device or the like. - The
microphone array 12 includes a plurality of microphones installed dispersively in the conference terminal 2, and has a function of acquiring a conference participant's voice and transmitting the acquired voice to the CPU 14. A microphone array is composed of a plurality of omnidirectional microphones, and forms directivity by means of a beamformer. In general, a “beamformer” is a technique to form directivity by use of differences in the time for sound to reach the respective microphones. The direction targeted by the directivity formed by the beamformer, such as a vertical direction or a horizontal direction, can be freely set according to the arrangement of the microphones. Therefore, the area from which the microphone array picks up sound (the target area) can be arbitrarily changed. -
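As an illustration of the delay-and-sum principle described above, the following sketch shifts each microphone signal by a per-microphone delay and averages the results; the function name and the sample-based delays are assumptions for illustration, not the terminal's actual implementation.

```python
# Illustrative delay-and-sum beamformer: each microphone signal is shifted
# by a per-microphone delay (in samples) chosen so that sound arriving from
# the target direction adds up in phase, then the shifted signals are
# averaged. All names here are assumptions, not the terminal's actual code.

def delay_and_sum(signals, delays):
    """signals: list of equal-length sample lists, one per microphone.
    delays: per-microphone delay in samples (steers the beam)."""
    n = len(signals[0])
    out = [0.0] * n
    for sig, d in zip(signals, delays):
        for i in range(n):
            if 0 <= i - d < n:      # samples shifted outside the window are dropped
                out[i] += sig[i - d]
    return [v / len(signals) for v in out]
```

With delays matched to a source's arrival times at each microphone, that source's waveform reinforces while off-axis sound is attenuated by the averaging.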
FIG. 3 is a diagram showing an example where six microphones a to f included in the microphone array 12 are installed in a housing of a conference terminal 2. The CPU 14 switches each microphone on/off, i.e., enables or disables each microphone, and adds up the respective voices picked up by the enabled microphones, and thereby can pick up a voice from an arbitrary area. The correspondence relationship between the microphones and their installation positions in the housing of the conference terminal 2 has been stored in the storage device 15 of the conference terminal 2 in advance. - Returning to the explanation of
FIG. 2, the speaker 13 has a function of outputting voice data received from the CPU 14. The CPU 14 controls the operation of the entire conference terminal 2. For example, the CPU 14 has a function of controlling a video conference, a CODEC function, etc.; the CODEC function is a function of encoding an image acquired from the panoramic camera 10 and voice data acquired from the microphone array 12 and transmitting the encoded image and voice data to the LAN I/F unit 17, and also decoding image and voice data on the side of the other party of a conference that the LAN I/F unit 17 has received and transmitting the decoded image and voice data to the display unit 11 and the speaker 13. The CODEC used by the CPU 14 is, for example, H.264/AVC or H.264/SVC. The CPU 14 further has a function of controlling the directivity of the microphone array 12, a function of displaying a close-up of a speaker who is one of the conference participants captured in a panoramic image acquired from the panoramic camera 10, etc. - The
storage device 15 stores therein various control programs (such as a video conference control program) executed by theCPU 14, a conversion table to be described later, etc. Thestorage device 15 is, for example, a non-volatile storage medium such as a flash memory or an HDD. - The
memory 16 is used for unfolding of a program executed by theCPU 14 and temporary storage of operation data. Thememory 16 is, for example, a volatile memory such as a DDR memory. The LAN I/F unit 17 connects theconference terminal 2 to anotherconference terminal 2 via thenetwork 3, and transmits/receives data (image and voice data) to/from theother conference terminal 2. The LAN I/F unit 17 is, for example, a wired LAN that is compatible with 10BASE-T, 100BASE-TX, or 1000BASE-T and is connected to an Ethernet(trademark) network or a wireless LAN compatible with 802.11a/b/g/n/ac. - The
operation unit 18 is a device used in various operations (various operations related to control over devices of the conference terminal 2) made by a user, and includes, for example, a keyboard, buttons, etc. -
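Building on the stored correspondence between the microphones and their installation positions (FIG. 3), one plausible way for the CPU 14 to decide which microphones to enable for a given target area is sketched below; the angle table, threshold, and function names are hypothetical, not taken from the embodiment.

```python
# Hypothetical table of microphone installation positions in the housing,
# expressed as angles (degrees) around the terminal; in the embodiment this
# correspondence is stored in the storage device 15 in advance.
MIC_POSITIONS = {"a": 0, "b": 60, "c": 120, "d": 180, "e": 240, "f": 300}

def enabled_mics(target_angle, width=90):
    """Enable the microphones whose installation angle lies within
    +/- width/2 of the target direction (angles wrap around 360 degrees)."""
    def diff(x, y):
        d = abs(x - y) % 360
        return min(d, 360 - d)     # shortest angular distance
    return sorted(m for m, a in MIC_POSITIONS.items()
                  if diff(a, target_angle) <= width / 2)
```

The voices of the enabled microphones would then be summed as described for the on/off switching above.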
FIG. 4 is a diagram for explaining details of a conference terminal 2. The conference terminal 2 is connected to a projector 300, which is an example of an output device. The projector 300 has a function of projecting an image input from the conference terminal 2 on a projection plane (for example, a screen) set up in the location where the projector 300 is placed. This projector 300 enables so-called interactive manipulation, i.e., enables a user to input various operations (such as pointing, clicking, and scrolling) by directly manipulating an area of the projection plane on which the image is projected with a special wireless interactive pen (a dedicated stylus pen). A DLP interactive projector manufactured by TI Inc. has a unique pixel-level tracking system embedded in a projection beam and thus is always aware of the position at which a dedicated stylus pen is pointing on the projection plane; therefore, it does not require the execution of calibration at the time of start-up, and, even if the projector has been moved, does not require any calibration. In the present embodiment, the projector 300 based on this system and a dedicated stylus pen 400 are used. - The
projector 300 has a function of performing wireless communication with each of thestylus pen 400 and theconference terminal 2; thestylus pen 400 has a function of performing wireless communication with theprojector 300. Theprojector 300 performs wireless communication with thestylus pen 400, thereby can acquire information indicating the position at which thestylus pen 400 is pointing on the projection plane from thestylus pen 400, and therefore can be always aware of the position at which thestylus pen 400 is pointing (which can be considered as the position at which a user is pointing) on the projection plane. Furthermore, theconference terminal 2, also, performs wireless communication with theprojector 300, thereby can be always aware of the position at which thestylus pen 400 is pointing on the projection plane. - In this example, a pair of the
same projector 300 and itsdedicated stylus pen 400 is set up in each of Locations A and B. Here, aconference terminal 2 and aprojector 300 connected to theconference terminal 2 correspond to a “first communication apparatus” or a “second communication apparatus.” Incidentally, the type of an output device connected to theconference terminal 2 is optional; for example, an interactive whiteboard can be connected to theconference terminal 2. - Furthermore, for example, a single device having both the functions of the
conference terminal 2 and the functions of theprojector 300 can be set up in each location. In this configuration, the single device corresponds to the “first communication apparatus” or the “second communication apparatus.” - As shown in
FIG. 4 , theconference terminal 2 includes anetwork unit 100, acontrol unit 101, adisplay control unit 102, awireless communication unit 109, a camera I/F receiver 111, a lens-characteristics holding unit 112, and a distortion-correction processing unit 113 besides thepanoramic camera 10, themicrophone array 12, thespeaker 13, thestorage device 15, etc. Respective functions of thenetwork unit 100, thecontrol unit 101, thedisplay control unit 102, thewireless communication unit 109, the camera I/F receiver 111, and the distortion-correction processing unit 113 can be realized by theCPU 14 executing a program stored in thestorage device 15 or the like, or at least some of these functions can be realized by a dedicated hardware circuitry (such as a semiconductor integrated circuit). Furthermore, for example, the lens-characteristics holding unit 112 can be realized by thestorage device 15. - The
network unit 100 transmits/receives data to/from anotherconference terminal 2 that is the other party of a conference. - The
control unit 101 is a part that performs various controls and arithmetic operations. Functions that thecontrol unit 101 has will be described in detail later. - The
display control unit 102 has a function of controlling display (projection of an image on the projection plane) performed by theprojector 300. - The
wireless communication unit 109 performs wireless communication with theprojector 300, and acquires position information that indicates the position at which thestylus pen 400 is pointing on the projection plane from theprojector 300. Thewireless communication unit 109 can notify thecontrol unit 101 of the position information acquired from theprojector 300. - A panoramic image generated by the
panoramic camera 10 is sent to the camera I/F receiver 111. The camera I/F receiver 111 is assumed to be a high-speed serial I/F, such as V-by-One (trademark) or HDMI (trademark). The distortion-correction processing unit 113 corrects distortion of the panoramic image subjected to serial/parallel conversion by the camera I/F receiver 111, and outputs the corrected panoramic image to thecontrol unit 101. The lens-characteristics holding unit 112 stores therein conversion parameters for correcting distortion according to lens characteristics, and the distortion-correction processing unit 113 can use a conversion parameter to correct distortion of a panoramic image. -
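As a rough illustration of parameter-based correction like that performed by the distortion-correction processing unit 113, the sketch below applies a one-parameter radial model; the model and the parameter k1 are assumptions for illustration, not the actual lens characteristics held by the lens-characteristics holding unit 112.

```python
# Illustrative one-parameter radial distortion correction: a point is moved
# along its radius from the image center by a factor determined by the
# conversion parameter k1 (an assumed model, not the unit's actual one).

def undistort(point, center, k1):
    x, y = point
    cx, cy = center
    dx, dy = x - cx, y - cy
    r2 = dx * dx + dy * dy          # squared distance from the image center
    scale = 1.0 + k1 * r2           # radial correction factor
    return (cx + dx * scale, cy + dy * scale)
```

With k1 = 0 the mapping is the identity; a positive k1 pushes points outward, compensating barrel-type distortion.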
FIG. 5 is a schematic diagram showing an example of a hardware configuration of theprojector 300. As shown inFIG. 5 , theprojector 300 includes a CPU 311, a storage unit 312, an input unit 313, a communication I/F 314, and a projecting unit 315; these units are connected via a bus. - The CPU 311 executes a program stored in the storage unit 312, and controls the operation of the
projector 300 overall. The storage unit 312 is composed of a ROM or HDD for storing therein a program executed by the CPU 311 and data required to execute the program, a RAM that serves as a work area of the CPU 311, etc. The input unit 313 is used to perform various inputs to theprojector 300, and includes a touch panel, key switches, etc. The communication I/F 314 is an interface for communicating with thestylus pen 400 and theconference terminal 2. The projecting unit 315 projects image data to be projected on the projection plane, such as a screen. The projecting unit 315 includes a projection optical system, such as a projection lens. Functions that theprojector 300 has will be described later. -
FIG. 6 is a schematic diagram showing an example of the circumstances of a video conference in the present embodiment. In Location 1, a conference terminal 2 is put on a desk. Since the conference terminal 2 is equipped with the panoramic camera 10, the conference terminal 2 is assumed to be put at the center of the desk. As described above, this conference terminal 2 is equipped with the microphone array 12 including the microphones a to f. In Location 1, five people, Persons D to H, are participating in the video conference. Furthermore, a projector 300 is connected to the conference terminal 2 set up in Location 1 via a video output cable; an image including a panoramic image on the side of Location 2 (an image in which a panoramic image on the side of Location 2 is displayed) is projected on a projection plane in Location 1. In the following description, an image projected on the projection plane on the side of Location 1 is referred to as a “projected image 1”, and an image projected on a projection plane on the side of Location 2 is referred to as a “projected image 2.” When there is no need to make a distinction between projected images 1 and 2, they are simply referred to as “projected images.” Below the projected image 1, a panoramic image in which all people participating in the conference in Location 2 (in this example, Persons A to C) are captured is displayed. Above the projected image 1, a cutout image that is a close-up of Person A, who is a speaker, is displayed. In the case of a video conference system with a conventional speaker-tracking function, when any one of the conference participants starts speaking, a close-up of the speaker is displayed in a cutout image. - Also in
Location 2, aconference terminal 2 is put on a desk. Just like inLocation 1, since theconference terminal 2 is equipped with thepanoramic camera 10, theconference terminal 2 is assumed to be put on the center of the desk, and is equipped with themicrophone array 12 including the microphones a to f. InLocation 2, it is shown that three people, Persons A to C, are participating in the video conference. Just like inLocation 1, aprojector 300 is connected to theconference terminal 2 set up inLocation 2 via a video output cable; an image including a panoramic image on the side of Location 1 (an image in which a panoramic image on the side ofLocation 1 is displayed) is projected on a projection plane inLocation 2. Below a projectedimage 2, a panoramic image in which all people participating in the conference in Location 1 (in this example, Persons D to H) are captured is displayed. A speaker is normally displayed in a cutout image; however, for example, if any one of the conference participants inLocation 2 has specified an area enclosed by a dotted line with thestylus pen 400 as shown inFIG. 6 , a voice subjected to directivity control according to the specified area is output, and an image of the specified area is displayed as a cutout image on the projectedimage 2. That is, by specifying an area in which, of the conference participants captured in the panoramic image, a person whose close-up is to be displayed is captured, an output focused on the specified area can be performed. This applies to not only a person but an object; by specifying an area in which, of objects captured in a panoramic image, an object to be focused on is captured, an output focused on the specified area can be performed. A configuration of a conference terminal 2 (a function that thecontrol unit 101 has) for realizing this will be described later. - How to specify a cutout area of a projected image is explained with
FIGS. 7A, 7B, and 7C . Upon receipt of pressing on aSTART icon 310 to instruct to start cutout-area specification, aprojector 300 performs control of displaying a pop-up screen for confirmation of the start of cutout-area specification as shown inFIG. 7A (under the control of the display control unit 102). Then, upon receipt of pressing on an “OK” button on the pop-up screen, theprojector 300 performs control of displaying a message prompting a user to specify the center point of a cutout as shown inFIG. 7B . After that, the user can perform an operation of specifying the center point of a cutout with thestylus pen 400 or the like. - As will be described later, in the present embodiment, of a projected image, a display area indicating an area where a panoramic image (a panoramic image on the side of the other party of a conference) is displayed is divided into as many unit areas as the number of people captured in the panoramic image; when a user has performed an operation of pointing at the center point (which is not necessarily the center point) of any one of the people captured in the display area as the center point of a cutout with the
stylus pen 400 or the like, a unit area including the center point of the cutout is specified as a cutout area. This can simplify the operation of specifying, of a display area of a projected image, an area in which an object to be focused on is captured (the operation of specifying a cutout area), and therefore, it is possible to improve the user-friendliness. - Then, upon receipt of pressing on an
Exit icon 320 to instruct to end the cutout-area specification as shown inFIG. 7C , theprojector 300 transmits coordinate information indicating the coordinates of the specified cutout area (a rectangle enclosed by a dotted line in the example shown inFIGS. 7A, 7B, and 7C ) on the projection plane to theconference terminal 2. Then, theconference terminal 2 performs control of transmitting the coordinate information received from theprojector 300 to anotherconference terminal 2 that is the other party of the conference. - Here, as shown in
FIGS. 6 and 8, of the projected image 2 (which can be considered as an area of the projection plane on the side of Location 2 on which an image including the panoramic image on the side of Location 1 is projected), an area 330 in which Person H is captured is assumed to be specified as a cutout area. In this example, the area 330 is a rectangular area, and the coordinate information of the area 330 is information indicating the respective coordinates of the four vertices (A, B, C, and D) of the area 330. In this example, the coordinates of the vertex A on the projected image 2 are (Xa, Ya), the coordinates of the vertex B are (Xb, Yb), the coordinates of the vertex C are (Xc, Yc), and the coordinates of the vertex D are (Xd, Yd). This coordinate information is transmitted to the conference terminal 2 of the Location 1 side. - The
conference terminal 2 on the side ofLocation 1 generates output information including a cutout image which is, of the panoramic image on the side ofLocation 1, an area corresponding to the coordinate information received from theconference terminal 2 on the side ofLocation 2 and a voice subjected to directivity control according to the area, and transmits the generated output information to theconference terminal 2 on the side ofLocation 2. In the example shown inFIG. 8 , theconference terminal 2 on the side ofLocation 1 cuts out, out of the panoramic image acquired from thepanoramic camera 10 included in theconference terminal 2, an image of a rectangular area defined by four vertices: A′ (Xa′, Ya′), B′ (Xb′, Yb′), C′ (Xc′, Yc′), and D′ (Xd′, Yd′) as a cutout image. Furthermore, theconference terminal 2 of theLocation 1 side controls the directivity of themicrophone array 12 to increase the sensitivity of a microphone closest to the position defined by the coordinates of the area of the panoramic image corresponding to the coordinate information received from theconference terminal 2 of theLocation 2 side based on position information indicating a relationship between the positions of the microphones included in themicrophone array 12 and the coordinates of the panoramic image. - Then, the
conference terminal 2 of theLocation 1 side transmits the output information including the cutout image cut out as described above and the voice subjected to directivity control to theconference terminal 2 of theLocation 2 side. Theconference terminal 2 of theLocation 2 side outputs the output information received from theconference terminal 2 of theLocation 1 side. - Detailed contents of respective functions that a
projector 300 and thecontrol unit 101 of aconference terminal 2 have are explained below. First, functions that aprojector 300 has are explained.FIG. 9 is a diagram showing an example of functions that aprojector 300 has. As shown inFIG. 9 , theprojector 300 includes aprojection control unit 321, anarea dividing unit 322, a unit-area selecting unit 323, and a coordinate-information-transmission control unit 324. For convenience of explanation, inFIG. 9 , functions related to the present embodiment are mainly shown as an example; however, functions that theprojector 300 has are not limited to these. - The
projection control unit 321 performs control of projecting an image input from theconference terminal 2 on a projection plane under the control of thedisplay control unit 102. - The
area dividing unit 322 divides, of a projected image indicating an image that the projector 300 has projected on the projection plane, a display area indicating an area where a panoramic image (a panoramic image on the side of another conference terminal 2 that is the other party of a conference) obtained by shooting the surroundings of the other conference terminal 2 is displayed into as many unit areas as the number of people captured in the panoramic image. In the present embodiment, the area dividing unit 322 equally divides the display area into as many unit areas as the number of user's operations performed to point at the people captured in the display area (which can be considered as the number of positions in the display area indicated by the user's operations), and thereby can obtain a plurality of unit areas. - For example, a projected image output by the
projector 300 is assumed to be as shown in FIG. 10. In the present embodiment, when the projector 300 has received pressing on an icon for instructing it to execute calibration of the correspondence relationship between the positions pointed at by a user and a plurality of unit areas on a projected image, as with the case of the above-described icon for specifying a cutout area, the projector 300 goes into calibration mode, and the area dividing unit 322 performs control of outputting information (which can be an image or a voice) prompting the user to point at the center point of each of the people captured in the display area. In the example shown in FIG. 10, five people, Persons D, E, F, G, and H, are captured in the display area, so the user performs an operation of pointing at the center point of each of these people with the stylus pen 400 or the like. The number of pointing operations (the number of user's operations performed to point at people) equals the total number of people captured in the display area; in this example, the number of pointing operations is “five.” Therefore, the area dividing unit 322 equally divides the display area into five laterally (horizontally), and thereby can obtain five unit areas (unit areas 401 to 405). The projector 300 exits the calibration mode when it has obtained the unit areas. - Returning to the explanation of
FIG. 9 , the unit-area selecting unit 323 selects, of a projected image output by theprojector 300, a unit area corresponding to the position pointed by a user. In this example, when theprojector 300 has received an operation of specifying the center point of a cutout, the unit-area selecting unit 323 selects a unit area including the coordinates of the specified center point of the cutout. - The coordinate-information-
transmission control unit 324 performs control of transmitting coordinate information indicating a unit area selected by the unit-area selecting unit 323 to theconference terminal 2 connected to theprojector 300. - The above-described functions that the
projector 300 has (theprojection control unit 321, thearea dividing unit 322, the unit-area selecting unit 323, and the coordinate-information-transmission control unit 324) can be realized by the CPU 311 executing a program stored in the storage unit 312 or the like, or at least some of these functions can be realized by a dedicated hardware circuitry (such as a semiconductor integrated circuit). - Subsequently, functions that the
control unit 101 of aconference terminal 2 has are explained.FIG. 11 is a diagram showing an example of functions that thecontrol unit 101 of aconference terminal 2 has. For convenience of explanation, inFIG. 11 , functions related to the present embodiment are mainly shown as an example; however, functions that thecontrol unit 101 has are not limited to these. - As shown in
FIG. 11 , thecontrol unit 101 includes a first transmission control unit 121, an acquiring unit 122, an identifying unit 123, a cutting-out unit 124, a directivity control unit 125, a secondtransmission control unit 126, and anoutput control unit 127. - The first transmission control unit 121 performs, when having received coordinate information from a
projector 300 connected to theconference terminal 2, control of transmitting the received coordinate information to anotherconference terminal 2 that is the other party of a conference. That is, the first transmission control unit 121 performs control of transmitting coordinate information indicating, of a projected image, a unit area corresponding to the position pointed by a user to anotherconference terminal 2 that is the other party of a conference. - The acquiring unit 122 acquires a panoramic image obtained by the
panoramic camera 10 shooting the surroundings of theconference terminal 2. In this example, the acquiring unit 122 acquires a corrected panoramic image input from the distortion-correction processing unit 113. - When the
conference terminal 2 has received from anotherconference terminal 2 coordinate information indicating, of a display area (an area where a panoramic image of theconference terminal 2 side is displayed) of a projected image that theother conference terminal 2 has output, a unit area corresponding to the position pointed by a user of theother conference terminal 2, the identifying unit 123 identifies, of a panoramic image acquired by the acquiring unit 122, an area corresponding to the received coordinate information based on correspondence information indicating a correspondence relationship between the coordinates of the projected image (which can be considered as the coordinates of an area on which an image is projected on a projection plane) and the coordinates of the panoramic image. In this example, the correspondence information has been stored in thestorage device 15 in advance. Furthermore, a general video conference system enables a user to freely change the layout (change the display mode), such as to project only an image of the user's party or to project only an image of the other party; therefore, the relationship between the coordinates of a projected image and the coordinates of a panoramic image is not always in one-to-one correspondence. Accordingly, the correspondence information in this example associates the coordinates of a projected image with the coordinates of a panoramic image with respect to each display mode (layout information) of theprojector 300. - The cutting-out unit 124 cuts out, of a panoramic image acquired by the acquiring unit 122, an image of an area identified by the identifying unit 123 as a cutout image.
- The directivity control unit 125 controls the directivity of the
microphone array 12 to increase the sensitivity of, of a plurality of microphones installed dispersively in the conference terminal 2, a microphone corresponding to an area identified by the identifying unit 123 (in this example, an area within a panoramic image). The directivity control unit 125 can determine the microphone corresponding to the coordinates of the area identified by the identifying unit 123 based on position information indicating a relationship between the positions of the microphones included in the microphone array 12 and the coordinates of the panoramic image. The position information can be stored in, for example, the storage device 15 or the like in advance. - The second
transmission control unit 126 performs control of transmitting output information including at least a voice subjected to directivity control by the directivity control unit 125 to another conference terminal 2. In the present embodiment, the second transmission control unit 126 performs control of transmitting output information including a voice subjected to directivity control by the directivity control unit 125 and a cutout image cut out by the cutting-out unit 124 to another conference terminal 2. More specifically, the second transmission control unit 126 performs control of transmitting output information including a panoramic image acquired by the acquiring unit 122, a voice subjected to directivity control by the directivity control unit 125, and a cutout image cut out by the cutting-out unit 124 to another conference terminal 2. Incidentally, the output information may include at least a voice subjected to directivity control by the directivity control unit 125 (a voice subjected to directivity control according to, of a shot image acquired by the acquiring unit 122, an area corresponding to coordinate information received from another conference terminal 2). Furthermore, the control unit 101 can be configured to not include, for example, the cutting-out unit 124. - Furthermore, if the
conference terminal 2 has not received coordinate information from another conference terminal 2, the second transmission control unit 126 performs control of transmitting general conference information including a panoramic image acquired by the acquiring unit 122, a cutout image that is a close-up of a speaker who is one of conference participants captured in the panoramic image, and voice data picked up by the microphone array 12 to the other conference terminal 2. - The
output control unit 127 performs control of outputting an image and voice received from another conference terminal 2. The output control unit 127 performs control of instructing the display control unit 102 to cause the projector 300 to output (project) an image received from another conference terminal 2 onto a projection plane and outputting a voice received from the other conference terminal 2 from the speaker 13. When the conference terminal 2 has received output information from another conference terminal 2, the output control unit 127 in the present embodiment performs control of outputting the received output information. More specifically, the output control unit 127 performs control of instructing the display control unit 102 to output a composite image of a cutout image and a panoramic image that are included in the received output information and outputting a voice included in the received output information from the speaker 13. - Furthermore, when the
conference terminal 2 has received general conference information from another conference terminal 2, the output control unit 127 performs control of outputting the received general conference information. - The above-described functions that the
control unit 101 has (the first transmission control unit 121, the acquiring unit 122, the identifying unit 123, the cutting-out unit 124, the directivity control unit 125, the second transmission control unit 126, and the output control unit 127) can be realized by the CPU 14 executing a program stored in the storage device 15 or the like, or at least some of the functions that the control unit 101 has can be realized by dedicated hardware circuitry (such as a semiconductor integrated circuit). - Furthermore, in the above example, the
panoramic camera 10 and the speaker 13 are included in the conference terminal 2; however, the configuration of the conference terminals 2 is not limited to this, and, for example, the panoramic camera 10 and the speaker 13 can be provided outside of the conference terminals 2. -
FIG. 12 is a flowchart showing an example of the operation of the projector 300 when a cutout area is specified. Upon receipt of pressing on the START icon 310 (YES at Step S1), the projector 300 receives an operation of specifying the center point of a cutout (Step S2). Then, the projector 300 selects a unit area corresponding to the center point of the cutout specified at Step S2 (Step S3). And then, upon receipt of pressing on the Exit icon 320 (YES at Step S4), the projector 300 transmits coordinate information indicating the unit area selected at Step S3 to the conference terminal 2 (Step S5). -
FIG. 13 is a flowchart showing an example of the operation of the conference terminal 2 upon receipt of coordinate information from the projector 300 connected to the conference terminal 2. When the conference terminal 2 has received coordinate information (YES at Step S6), the first transmission control unit 121 performs control of transmitting the received coordinate information to another conference terminal 2 (Step S7). -
FIG. 14 is a flowchart showing an example of the operation of the conference terminal 2 upon receipt of coordinate information from another conference terminal 2. When the conference terminal 2 has received coordinate information from another conference terminal 2 (YES at Step S10), the identifying unit 123 identifies, of a panoramic image acquired by the acquiring unit 122 (a panoramic image acquired from the panoramic camera 10 of the conference terminal 2), an area corresponding to the received coordinate information based on correspondence information (Step S11). Then, the cutting-out unit 124 cuts out, out of the panoramic image acquired by the acquiring unit 122, an image of the area identified at Step S11 as a cutout image (Step S12). Then, the directivity control unit 125 controls the directivity of the microphone array 12 to increase the sensitivity of, of a plurality of microphones installed dispersively in the conference terminal 2, a microphone corresponding to the area identified at Step S11 (Step S13). And then, the second transmission control unit 126 performs control of transmitting output information including the panoramic image acquired by the acquiring unit 122, the cutout image cut out at Step S12, and a voice subjected to directivity control obtained as a result of Step S13 to the other conference terminal 2 (Step S14). -
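The flow of Steps S10 to S14 can be sketched end to end as below. This is an illustrative stand-in rather than the patented implementation: the panorama is a plain list, the correspondence information a dictionary, and the directivity step simply picks the microphone nearest the horizontal center of the identified area. All names and data shapes are invented.

```python
def identify(coord_info, correspondence):
    """Step S11: map received coordinate information to a panoramic area."""
    return correspondence[coord_info]

def cut_out(panorama, area):
    """Step S12: crop the identified area out of the panorama."""
    x0, x1 = area
    return panorama[x0:x1]

def steer(area, mic_positions):
    """Step S13: favor the microphone nearest the area's horizontal center."""
    center = sum(area) / 2
    return min(range(len(mic_positions)),
               key=lambda i: abs(mic_positions[i] - center))

def handle_coordinate_info(coord_info, panorama, correspondence, mic_positions):
    """Steps S11-S14: assemble the output information to transmit back."""
    area = identify(coord_info, correspondence)
    return {
        "panorama": panorama,
        "cutout": cut_out(panorama, area),
        "microphone": steer(area, mic_positions),
    }
```

A real terminal would replace the list slicing with image cropping and the index selection with actual beam steering of the microphone array, but the data flow between the steps is the same.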
FIG. 15 is a flowchart showing an example of the operation of the conference terminal 2 upon receipt of output information from another conference terminal 2. When the conference terminal 2 has received output information from another conference terminal 2 (YES at Step S20), the output control unit 127 performs control of outputting the received output information (Step S21). - As described above, upon receipt of coordinate information from another
conference terminal 2 that is the other party of a conference, the conference terminal 2 in the present embodiment identifies, of a panoramic image acquired from the panoramic camera 10 of the conference terminal 2, an area corresponding to the received coordinate information based on correspondence information, and cuts out an image of the identified area as a cutout image. Furthermore, the conference terminal 2 controls the directivity of the microphone array 12 to increase the sensitivity of, of a plurality of microphones installed dispersively in the conference terminal 2, a microphone corresponding to the identified area. Then, the conference terminal 2 transmits output information including the cutout image and a voice subjected to the directivity control to the other conference terminal 2, and the other conference terminal 2 outputs the received output information. Accordingly, the other conference terminal 2 can perform an intended output. Incidentally, in the present embodiment, the shooting range of the panoramic camera is a 360-degree panorama. However, the essentials of the present embodiment are to specify a portion of a shot image as a cutout image and control the directivity of the microphone array to increase the sensitivity of a microphone corresponding to the cutout image. Therefore, as the shooting range, the angle of view of the camera can be below 360 degrees, and, for example, can be about 80 degrees. - The embodiment according to the present invention is explained above; however, the present invention is not limited to the embodiment as described above, and, in the practical phase, components can be modified without departing from the scope of the invention. Furthermore, various inventions can be formed by appropriate combinations of several of the components described in the above embodiment. Moreover, for example, some of the components described in the above embodiment can be eliminated.
- In the above embodiment, the
area dividing unit 322 equally divides the display area into as many unit areas as the number of user's operations performed to point at people captured in the display area, thereby obtaining a plurality of unit areas; however, for example, we assume a case where the positions of people captured in the display area are converged on one side as shown in FIG. 16. In this case, if the display area is equally divided laterally by the number of people captured in the display area, the obtained unit areas do not correspond one-to-one to the people captured in the display area; therefore, it may be difficult to appropriately specify an area where a target person is captured (as a cutout area). - Accordingly, for example, the
area dividing unit 322 can be configured to divide the display area into unit areas that correspond one-to-one to multiple positions in the display area indicated by user's operations on the basis of a relative positional relationship between four vertices of a projected image and the multiple positions in the display area indicated by the user's operations. Specific contents of this are explained below. - When the
projector 300 has gone into the calibration mode, the area dividing unit 322 performs control of outputting information (which can be an image or a voice) prompting a user to point at four vertices of a projected image. Incidentally, for example, if an output device connected to the conference terminal 2 is an interactive whiteboard, the area dividing unit 322 does not have to perform this control because respective coordinates of four vertices of an image (an example of an output image) that the interactive whiteboard displays thereon are recognized in advance. - Then, the
area dividing unit 322 performs control of outputting information prompting the user to point at the center point of each of the people captured in the display area, and detects multiple positions in the display area indicated by user's operations. By figuring out a relative positional relationship between the multiple positions in the display area indicated by the user's operations (which may be referred to as the “indicated positions” in the following description) and four vertices of the display area, the positions of cutout lines (cutout lines extending vertically) for cutting out multiple unit areas that correspond one-to-one to the indicated positions can be found. - Here we explain this focusing on one of the indicated positions. In the following description, the focused indicated position is referred to as the “focused position.” First, we assume a case where there are the indicated positions adjacent to both sides of the focused position. In this case, the lateral position of a right-hand cutout line for cutting out a unit area including the focused position can be found as the position at a distance from the left side of the display area in a lateral direction of, for example, a difference between the lateral length of the display area and the distance from the right side of the display area to the lateral position of the focused position plus half the distance between the lateral position of the focused position and the lateral position of the indicated position adjacent to the right side of the focused position. 
Likewise, the lateral position of a left-hand cutout line for cutting out the unit area including the focused position can be found as the position at a distance from the left side of the display area in the lateral direction of, for example, the difference between the lateral length of the display area and the distance from the right side of the display area to the lateral position of the focused position minus half the distance between the lateral position of the focused position and the lateral position of the indicated position adjacent to the left side of the focused position.
- Next, we assume a case where there is no indicated position adjacent to the left side of the focused position, and there is the indicated position adjacent to the right side of the focused position. In this case, the lateral position of a right-hand cutout line for cutting out a unit area including the focused position can be found in the same manner as the first case. On the other hand, the lateral position of a left-hand cutout line for cutting out the unit area including the focused position can be found as the position at a distance from the left side of the display area in the lateral direction of, for example, a difference between the lateral length of the display area and the distance from the right side of the display area to the lateral position of the focused position minus half the distance between the lateral position of the focused position and the lateral position of the indicated position adjacent to the right side of the focused position.
- Furthermore, we assume a case where there is the indicated position adjacent to the left side of the focused position, and there is no indicated position adjacent to the right side of the focused position. In this case, the lateral position of a left-hand cutout line for cutting out a unit area including the focused position can be found in the same manner as the first case. On the other hand, the lateral position of a right-hand cutout line for cutting out the unit area including the focused position can be found as the position at a distance from the left side of the display area in the lateral direction of, for example, a difference between the lateral length of the display area and the distance from the right side of the display area to the lateral position of the focused position plus half the distance between the lateral position of the focused position and the lateral position of the indicated position adjacent to the left side of the focused position.
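Measured from the left side of the display area, the three cases above reduce to a single rule: extend each cutout line half the gap to the adjacent indicated position, and at either end of the row reuse the gap from the opposite side. The following hypothetical sketch (assuming the indicated x-coordinates are sorted in ascending order) is one way to express that rule; it is not code from the disclosure.

```python
def cutout_lines(xs, i):
    """Left and right vertical cutout-line positions for indicated position xs[i].

    xs: x-coordinates of the indicated positions, measured from the left side
    of the display area and sorted ascending. Mirrors the three cases in the
    text: half the gap to each adjacent indicated position, with the gap on
    the opposite side reused at either end of the row.
    """
    x = xs[i]
    half_left = (x - xs[i - 1]) / 2 if i > 0 else (xs[i + 1] - x) / 2
    half_right = (xs[i + 1] - x) / 2 if i < len(xs) - 1 else (x - xs[i - 1]) / 2
    return x - half_left, x + half_right
```

With this formulation the dense case-by-case prose collapses into two conditional expressions, one per side.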
- For example, we assume a projected image shown in
FIG. 17 . InFIG. 17 , the lateral direction is x-direction, and the vertical direction is y-direction. In the example shown inFIG. 17 , the coordinates of the upper-left vertex of the projected image is (a1, a2), the coordinates of the upper-right vertex is (b1, b2), the coordinates of the lower-right vertex is (c1, c2), and the coordinates of the lower-left vertex is (0, 0). Furthermore, of respective coordinates of five indicated positions corresponding one-to-one to five people (Persons D to H) captured in the display area, the coordinates of the indicated position corresponding to Person D is (d1, d2), the coordinates of the indicated position corresponding to Person F is (e1, e2), the coordinates of the indicated position corresponding to Person F is (f1, f2), the coordinates of the indicated position corresponding to Person G is (g1, g2), and the coordinates of the indicated position corresponding to Person H is (h1, h2). - For example, as shown in
FIG. 17 , the lateral position of a right-hand cutout line for cutting out a unit area including the indicated position corresponding to Person G can be found as the position at a distance of (b1−a1)−(b1−g1)+((h1−g1)/2) from the left side of the display area in the x-direction. On the other hand, the lateral position of a left-hand cutout line for cutting out the unit area including the indicated position corresponding to Person G can be found as the position at a distance of (b1−a1)−(b1−g1)−((gl−f1)/2) from the left side of the display area in the x-direction. - Furthermore, the following is how to find cutout lines for cutting out a unit area including the indicated position corresponding to Person H that has the adjacent indicated position on the left thereof and has no adjacent indicated position on the right thereof. The lateral position of a right-hand cutout line for cutting out the unit area including the indicated position corresponding to Person H can be found as the position at a distance of (b1−a1)−(b1−h1)+((h1−g1)/2) from the left side of the display area in the x-direction. On the other hand, the lateral position of a left-hand cutout line for cutting out the unit area including the indicated position corresponding to Person H can be found as the position at a distance of (b1−a1)−(b1−h1)−((h1−g1)/2) from the left side of the display area in the x-direction.
- Moreover, the following is how to find cutout lines for cutting out a unit area including the indicated position corresponding to Person D that has no adjacent indicated position on the left thereof and has the adjacent indicated position on the right thereof. The lateral position of a right-hand cutout line for cutting out the unit area including the indicated position corresponding to Person D can be found as the position at a distance of (b1−a1)−(b1−d1)+((e1−d1)/2) from the left side of the display area in the x-direction. On the other hand, the lateral position of a left-hand cutout line for cutting out the unit area including the indicated position corresponding to Person D can be found as the position at a distance of (b1−a1)−(b1−d1)−((e1−d1)/2) from the left side of the display area in the x-direction.
- Programs executed by the
conference terminal 2 or the projector 300 can be provided in such a manner that each program is recorded on a computer-readable recording medium, such as a CD-ROM, a flexible disk (FD), a CD-R, a digital versatile disk (DVD), or a universal serial bus (USB) flash drive, in an installable or executable file format, or can be provided or distributed via a network such as the Internet. Furthermore, the programs can be provided in such a manner that each program is built into a ROM or the like in advance. - According to exemplary embodiments of the present invention, it is possible to provide a communication system, a communication apparatus, and a communication method that enable one's intended output.
- The above-described embodiments are illustrative and do not limit the present invention. Thus, numerous additional modifications and variations are possible in light of the above teachings. For example, at least one element of different illustrative and exemplary embodiments herein may be combined with each other or substituted for each other within the scope of this disclosure and appended claims. Further, features of components of the embodiments, such as the number, the position, and the shape, are not limited to the embodiments and thus may be set as appropriate. It is therefore to be understood that within the scope of the appended claims, the disclosure of the present invention may be practiced otherwise than as specifically described herein.
- The method steps, processes, or operations described herein are not to be construed as necessarily requiring their performance in the particular order discussed or illustrated, unless specifically identified as an order of performance or clearly identified through the context. It is also to be understood that additional or alternative steps may be employed.
- Further, any of the above-described apparatus, devices or units can be implemented as a hardware apparatus, such as a special-purpose circuit or device, or as a hardware/software combination, such as a processor executing a software program.
- Further, as described above, any one of the above-described and other methods of the present invention may be embodied in the form of a computer program stored in any kind of storage medium. Examples of storage media include, but are not limited to, flexible disks, hard disks, optical discs, magneto-optical discs, magnetic tapes, nonvolatile memory, semiconductor memory, read-only memory (ROM), etc.
- Alternatively, any one of the above-described and other methods of the present invention may be implemented by an application specific integrated circuit (ASIC), a digital signal processor (DSP) or a field programmable gate array (FPGA), prepared by interconnecting an appropriate network of conventional component circuits or by a combination thereof with one or more conventional general purpose microprocessors or signal processors programmed accordingly.
- Each of the functions of the described embodiments may be implemented by one or more processing circuits or circuitry. Processing circuitry includes a programmed processor, as a processor includes circuitry. A processing circuit also includes devices such as an application specific integrated circuit (ASIC), digital signal processor (DSP), field programmable gate array (FPGA) and conventional circuit components arranged to perform the recited functions.
Claims (7)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-212270 | 2015-10-28 | ||
JP2015212270A JP6551155B2 (en) | 2015-10-28 | 2015-10-28 | Communication system, communication apparatus, communication method and program |
Publications (2)
Publication Number | Publication Date |
---|---|
US20170127017A1 true US20170127017A1 (en) | 2017-05-04 |
US9648278B1 US9648278B1 (en) | 2017-05-09 |
Family
ID=58638416
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/297,334 Active US9648278B1 (en) | 2015-10-28 | 2016-10-19 | Communication system, communication apparatus and communication method |
Country Status (2)
Country | Link |
---|---|
US (1) | US9648278B1 (en) |
JP (1) | JP6551155B2 (en) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10429995B2 (en) | 2016-07-13 | 2019-10-01 | Ricoh Company, Ltd. | Coordinate detecting apparatus |
US10524048B2 (en) * | 2018-04-13 | 2019-12-31 | Bose Corporation | Intelligent beam steering in microphone array |
US10917620B1 (en) * | 2019-08-21 | 2021-02-09 | Delta Electronics, Inc. | Projection apparatus, projection system, and operation method |
CN114026842A (en) * | 2019-06-21 | 2022-02-08 | 佳能株式会社 | Image providing system and control method thereof |
US20220311969A1 (en) * | 2021-03-15 | 2022-09-29 | Amazon Technologies, Inc. | Audiovisual device |
US11561598B2 (en) | 2020-04-07 | 2023-01-24 | Ricoh Company, Ltd. | Power supply device, power supply system, power supply control method, and recording medium |
US11625218B2 (en) | 2020-04-07 | 2023-04-11 | Ricoh Company, Ltd. | Sound output device, sound output system, and output sound control method with appropriately controllable volume, and recording medium |
US11627007B2 (en) * | 2018-06-07 | 2023-04-11 | Maxell, Ltd. | Mobile information terminal |
WO2024027100A1 (en) * | 2022-08-02 | 2024-02-08 | 科大讯飞股份有限公司 | Speech interaction method, and related apparatus, device and storage medium |
US11949997B2 (en) | 2021-03-15 | 2024-04-02 | Amazon Technologies, Inc. | Electronic device with shutter assembly |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019062448A (en) * | 2017-09-27 | 2019-04-18 | カシオ計算機株式会社 | Image processing apparatus, image processing method, and program |
JP7164831B2 (en) * | 2018-03-30 | 2022-11-02 | 株式会社リコー | Communication management system, communication system, communication method, and program |
JP7371369B2 (en) * | 2018-07-31 | 2023-10-31 | 株式会社リコー | Communication terminals and image communication systems |
JP7225735B2 (en) * | 2018-11-27 | 2023-02-21 | 株式会社リコー | VIDEO CONFERENCE SYSTEM, COMMUNICATION TERMINAL AND MICROPHONE CONTROL METHOD OF COMMUNICATION TERMINAL |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0955925A (en) * | 1995-08-11 | 1997-02-25 | Nippon Telegr & Teleph Corp <Ntt> | Picture system |
JP4048511B2 (en) * | 1998-03-13 | 2008-02-20 | 富士通株式会社 | Fisheye lens camera device and image distortion correction method thereof |
JP2000105671A (en) | 1998-05-11 | 2000-04-11 | Ricoh Co Ltd | Coordinate input and detecting device, and electronic blackboard system |
JP4439763B2 (en) * | 2001-07-04 | 2010-03-24 | 株式会社リコー | Image recording / reproducing system and image recording / reproducing method |
JP2007274463A (en) | 2006-03-31 | 2007-10-18 | Yamaha Corp | Remote conference apparatus |
JP5028944B2 (en) | 2006-10-17 | 2012-09-19 | ヤマハ株式会社 | Audio conference device and audio conference system |
US8289363B2 (en) * | 2006-12-28 | 2012-10-16 | Mark Buckler | Video conferencing |
US8395653B2 (en) * | 2010-05-18 | 2013-03-12 | Polycom, Inc. | Videoconferencing endpoint having multiple voice-tracking cameras |
JP5589644B2 (en) * | 2010-07-27 | 2014-09-17 | 日本精機株式会社 | Peripheral image display device and display method thereof |
US8537195B2 (en) * | 2011-02-09 | 2013-09-17 | Polycom, Inc. | Automatic video layouts for multi-stream multi-site telepresence conferencing system |
JP5776313B2 (en) | 2011-04-28 | 2015-09-09 | 株式会社リコー | Conference equipment |
JP6303270B2 (en) | 2012-05-18 | 2018-04-04 | 株式会社リコー | Video conference terminal device, video conference system, video distortion correction method, and video distortion correction program |
JP2014143678A (en) * | 2012-12-27 | 2014-08-07 | Panasonic Corp | Voice processing system and voice processing method |
US9369672B2 (en) * | 2013-03-14 | 2016-06-14 | Polycom, Inc. | Intelligent layouts for call scaling and layout persistence |
JP5958833B2 (en) | 2013-06-24 | 2016-08-02 | パナソニックIpマネジメント株式会社 | Directional control system |
JP6201519B2 (en) | 2013-08-21 | 2017-09-27 | 株式会社リコー | Coordinate detection apparatus, coordinate detection method, and electronic information board system |
US9602771B2 (en) * | 2014-12-10 | 2017-03-21 | Polycom, Inc. | Automated layouts optimized for multi-screen and multi-camera videoconferencing calls |
US9641585B2 (en) * | 2015-06-08 | 2017-05-02 | Cisco Technology, Inc. | Automated video editing based on activity in video conference |
- 2015-10-28: JP application JP2015212270A filed (patent JP6551155B2; status: Expired - Fee Related)
- 2016-10-19: US application US15/297,334 filed (patent US9648278B1; status: Active)
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10429995B2 (en) | 2016-07-13 | 2019-10-01 | Ricoh Company, Ltd. | Coordinate detecting apparatus |
US10524048B2 (en) * | 2018-04-13 | 2019-12-31 | Bose Corporation | Intelligent beam steering in microphone array |
US10721560B2 (en) | 2018-04-13 | 2020-07-21 | Bose Coporation | Intelligent beam steering in microphone array |
US11627007B2 (en) * | 2018-06-07 | 2023-04-11 | Maxell, Ltd. | Mobile information terminal |
US12081352B2 (en) | 2018-06-07 | 2024-09-03 | Maxell, Ltd. | Mobile information terminal |
CN114026842A (en) * | 2019-06-21 | 2022-02-08 | 佳能株式会社 | Image providing system and control method thereof |
US10917620B1 (en) * | 2019-08-21 | 2021-02-09 | Delta Electronics, Inc. | Projection apparatus, projection system, and operation method |
US11561598B2 (en) | 2020-04-07 | 2023-01-24 | Ricoh Company, Ltd. | Power supply device, power supply system, power supply control method, and recording medium |
US11625218B2 (en) | 2020-04-07 | 2023-04-11 | Ricoh Company, Ltd. | Sound output device, sound output system, and output sound control method with appropriately controllable volume, and recording medium |
US20220311969A1 (en) * | 2021-03-15 | 2022-09-29 | Amazon Technologies, Inc. | Audiovisual device |
US11949997B2 (en) | 2021-03-15 | 2024-04-02 | Amazon Technologies, Inc. | Electronic device with shutter assembly |
WO2024027100A1 (en) * | 2022-08-02 | 2024-02-08 | 科大讯飞股份有限公司 | Speech interaction method, and related apparatus, device and storage medium |
Also Published As
Publication number | Publication date |
---|---|
US9648278B1 (en) | 2017-05-09 |
JP6551155B2 (en) | 2019-07-31 |
JP2017085372A (en) | 2017-05-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9648278B1 (en) | Communication system, communication apparatus and communication method | |
JP6547496B2 (en) | Communication apparatus, communication method, program and communication system | |
JP6582874B2 (en) | COMMUNICATION SYSTEM, COMMUNICATION DEVICE, COMMUNICATION METHOD, AND PROGRAM | |
US11736801B2 (en) | Merging webcam signals from multiple cameras | |
US9942513B1 (en) | Automated configuration of behavior of a telepresence system based on spatial detection of telepresence components | |
JP6303270B2 (en) | Video conference terminal device, video conference system, video distortion correction method, and video distortion correction program | |
CA2874715C (en) | Dynamic video and sound adjustment in a video conference | |
JP6171263B2 (en) | Remote conference system and remote conference terminal | |
EP3785429B1 (en) | Videoconferencing device and method | |
CN108293104B (en) | Information processing system, wireless terminal, and information processing method | |
JP6528574B2 (en) | INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND INFORMATION PROCESSING PROGRAM | |
JP2007295335A (en) | Camera device and image recording and reproducing method | |
JP6590152B2 (en) | Information processing apparatus, conference system, and control method for information processing apparatus | |
JP6500366B2 (en) | Management device, terminal device, transmission system, transmission method and program | |
JP2009027246A (en) | Television conference apparatus | |
JP2017168903A (en) | Information processing apparatus, conference system, and method for controlling information processing apparatus | |
JP6565777B2 (en) | COMMUNICATION DEVICE, CONFERENCE SYSTEM, PROGRAM, AND DISPLAY CONTROL METHOD | |
JP5464290B2 (en) | Control device, control method, and camera system | |
JP2017092950A (en) | Information processing apparatus, conference system, information processing method, and program | |
JP2017158134A (en) | Information processing apparatus, conference system, and method for controlling information processing apparatus | |
JP2010028299A (en) | Conference photographed image processing method, conference device, and the like | |
JP2017108287A (en) | Communication device, control method and control program | |
JP2020191514A (en) | Information processing unit and information processing method | |
JP2010045539A (en) | Information processor and control method for the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: RICOH COMPANY, LTD., JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TAKAHASHI, MASATO;KUWATA, KOJI;IGARASHI, KIYOTO;AND OTHERS;REEL/FRAME:040063/0468 Effective date: 20161013 |
|
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |