EP1297699A1 - Communication terminal and system - Google Patents

Communication terminal and system

Info

Publication number
EP1297699A1
Authority
EP
European Patent Office
Prior art keywords
screen
image
terminal according
local
terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP01947590A
Other languages
English (en)
French (fr)
Inventor
Michel Beaudoin Lafon
Nicolas Roussel
Jacques Martin
Jean-Dominique Gascuel
Georges Buchner
Hervé LISSEK
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Gula Consulting LLC
Original Assignee
France Telecom SA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by France Telecom SA filed Critical France Telecom SA
Publication of EP1297699A1 publication Critical patent/EP1297699A1/de
Withdrawn legal-status Critical Current

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/15Conference systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/141Systems for two-way working between two video terminals, e.g. videophone
    • H04N7/142Constructional details of the terminal equipment, e.g. arrangements of the camera and the display

Definitions

  • the invention relates to a communication terminal and a communication system incorporating it. It applies to the field of group communication systems in interactive or broadcast mode, and more particularly multimedia systems such as videoconferencing systems.
  • by "videoconferencing system" is meant any videocommunication system allowing connections to be established between individuals or groups with at least two of the following three media: sound, video, data, in a bilaterally symmetrical or asymmetrical manner.
  • Such systems allow one or more users present at a first determined site to communicate verbally with one or more users present at at least one second determined site, distant from the first site, and allow each user of a site to see the image of a user from another site at a spatial location consistent with the sound generated by that user.
  • remote sites can thus be connected two by two by a suitable communication or broadcasting network, in general a digital network such as ISDN (Integrated Services Digital Network), ATM ("Asynchronous Transfer Mode"), ADSL ("Asymmetric Digital Subscriber Line"), IP, or other, in point-to-point mode or in multipoint (or multisite) mode.
  • Audio data corresponding in particular to the sounds produced by the users, as well as video data, corresponding in particular to the image of the users, are transmitted via this network.
  • the term "local" is used with reference to said first determined site and the term "remote" with reference to said second determined site, it being understood that any site is both a local site and a remote site, depending on the point of view adopted.
  • the invention relates to the end part of such a system, called the "terminal". From the transmission point of view, it uses the same network media and the same protocols as those used for videoconferencing, videophony, television broadcasting, or other services.
  • the terminal according to the invention could be substituted for conventional audiovisual terminal equipment (televisions, videophones, videoconferencing equipment, etc.)
  • the known videoconferencing equipment comprises, at each determined site, shooting means such as a video camera and sound pickup means such as a microphone, which respectively allow the acquisition of the image and the acquisition of the sound of the voice of a local user.
  • they also include image restitution means, such as a video projector cooperating with a projection screen, and sound restitution means, such as speakers, which respectively allow the reproduction of the image and of the sound.
  • Video conferences are formal meetings. In each site, users sit at a table, the screen being arranged vertically on the other side of the table. They behave as if they were sitting at a meeting table, and as if the remote users were sitting opposite them, on the other side of the table.
  • the invention aims to allow the implementation of a videoconferencing system beyond current uses, and thus to promote the appearance of a new form of teleconviviality, the informal meeting between individuals distant from each other.
  • the invention provides a communication terminal for a videoconferencing system between one or more local users and one or more remote users, comprising:
  • - sound pickup means for producing audio data corresponding to the sound generated by local users
  • - image restitution means comprising a screen disposed substantially horizontally, for restoring on the screen at least the image of a remote user from image data;
  • the horizontal arrangement of the screen of the terminal according to the invention allows a large number of people to be distributed around, below or above it without being annoyed.
  • the invention indeed proposes several embodiments making it possible to envisage new applications of videocommunication, in the general public or other field.
  • the terminal has for example the general shape of a well, around, above or below which users can be distributed, which makes it an original, attractive and user-friendly communication equipment.
  • the invention also proposes a communication system incorporating such a terminal.
  • Such a system is multimedia and can be interactive. It allows communication between groups of people distant from each other.
  • Other characteristics and advantages of the invention will become apparent on reading the description which follows. This is purely illustrative and should be read in conjunction with the accompanying drawings, in which:
  • Figure 7 is a diagram showing the sound processing means of two terminals.
  • In FIG. 1 there are shown two terminals 1 and 2 of a videoconferencing system according to the invention. These two terminals are distant from each other. The actual distance between the two terminals 1 and 2 depends on the application. It can range from a few meters to several hundred or even several thousand kilometers. By convention, the remainder of the description is given from the point of view of the site of terminal 1. In other words, the site of terminal 1 is called the local site and the site of terminal 2 is called the remote site. The image and the sound of the voice of a user remote from the first terminal respectively correspond to the image and the sound of the voice of a local user of the second terminal, and vice versa.
  • Each terminal 1 or 2 includes a screen 10 or 20 respectively.
  • the screen of a terminal is arranged substantially horizontally. This means that the screen plane has a zero or small angle with the horizontal (for example less than 15 degrees).
  • the screen is preferably planar, but can also have a curved shape or the shape of a polygonal pyramid, concave or convex.
  • the screen is for example a diffusing screen, opaque or translucent, a CRT ("Cathode Ray Tube") screen, a plasma screen, a liquid crystal screen, or the like. It can also be a screen suitable for stereoscopy, namely a screen of the aforementioned type covered with an appropriate lenticular array, or a screen comprising optical valves (liquid crystals, PLZT, etc.) which alternately address the left view to the left eye and the right view to the right eye using appropriate glasses. In a preferred embodiment, it is a translucent diffusing screen, such as a frosted glass panel.
  • the terminal has the general shape of a well closed at one of its ends by the screen.
  • the other end forms the bottom of the well.
  • the shape of the well in section in the plane of the screen can be circular as shown in Figure 1, polygonal or other. Note that it can also take the shape of an open curve.
  • the height of the well, counted from the ground level on which the local users 11 stand, is for example substantially equal to the height of a table (typically 0.72 m) when users are seated, or more, for example one meter, when users are standing. The bottom of the well then rests on the ground.
  • a video projector is preferably installed inside the well.
  • a platform around the well on which local users can stand.
  • a transparent protective glass can cover the screen, so that the user can lean on or place an object on the terminal without risking damaging the screen.
  • the protective glass can be replaced by a glass or a touch screen, so that the screen becomes a touch screen.
  • the glass or the touch screen is treated so as to avoid reflections of the local users leaning over the screen.
  • a layer of anti-reflection material is applied to the face of the screen facing the local users.
  • the terminal 1 is installed so that the screen 10 is substantially at ground level on which the users stand.
  • a handrail 18 can be installed around the well, to allow local users to lean on it to lean over the well. In this way, users can distribute themselves around the well. Alternatively, there is no handrail and users can stand over the well.
  • the well 1 extends downwards from the ceiling 4 of the room which shelters it.
  • the plane of the screen 10 is located above the heads of the users 11 (for example 2.5 meters from the ground level 3 on which the users 11 stand). In this way, the users can arrange themselves around and below the screen 10. They can also sit in reclined armchairs so as to be in a more comfortable position.
  • FIG. 4 there is shown, in a diagram in section along an axis A-B (visible in the diagram of Figure 5b), the video means of a terminal according to the invention.
  • These video means comprise on the one hand the shooting means and on the other hand the image restitution means.
  • the aforementioned protective glass is visible in FIG. 4 and bears the reference 16.
  • the shooting means comprise one or more cameras. In the example shown, there are three cameras of which only two 12 and 13 are visible in the figures.
  • the cameras are preferably placed inside a coping 17 of the well, where they are protected.
  • the optical axis of each camera is inclined relative to the plane of the screen so as to be able to produce data relating to the image of a local user standing at a determined position around the well (here on the other side of the well, facing the camera in question).
  • the image rendering means comprise at least one video projector 14 disposed on the side of the screen 10 opposite to local users 11. In other words, the image reproduction means operate by rear projection.
  • the projector 14 is arranged inside the well, so that the video signal which it produces comes to illuminate the face of the screen 10 facing the bottom 19 of the well. In this way, it is protected and, moreover, it is not visible from the outside of the well.
  • the optical axis of the projector 14 is inclined relative to the plane of the screen 10 in a direction opposite to it and is oriented towards a mirror 15 placed in the well so as to reflect the optical signal generated by the projector 14 towards the screen 10. This makes it possible to increase the size of the image projected on the screen 10 compared to a direct projection.
  • Other mirrors such as the mirror 15 can also be used. The number and shape of these mirrors depend on the arrangement of the projector 14 in the well.
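The gain in image size obtained by folding the optical path can be sketched with a back-of-envelope calculation; the path lengths and projection half-angle below are illustrative assumptions, not values taken from the description.

```python
import math

# For a projector of fixed projection half-angle, the image width on the
# screen grows linearly with the optical path length. Folding the path with
# the mirror 15 lengthens it inside the same well, hence the larger image
# compared with direct projection. Values below are illustrative.

def image_width(path_m, half_angle_deg):
    """Projected image width (m) for a given optical path and half-angle."""
    return 2.0 * path_m * math.tan(math.radians(half_angle_deg))

direct = image_width(0.6, 20.0)   # direct projection inside the well
folded = image_width(1.1, 20.0)   # path lengthened by the mirror 15
```

With these assumed values, the folded path yields an image nearly twice as wide for the same well depth.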
  • FIG 5a there is shown in a vertical sectional view along an axis A-B a terminal according to the invention.
  • Figure 5b there is shown a top view of the terminal according to the invention.
  • These figures show the audio means of the terminal. These means comprise on the one hand the sound pickup means and on the other hand the sound reproduction means of the terminal.
  • the sound pickup means include microphones.
  • these microphones are three in number, and bear the references 101, 102, and 103.
  • the three microphones 101 to 103 are, for example, suspended above the center of the screen 10 and form two by two a 120 degree angle. They are preferably directional microphones, of cardioid, hypercardioid, or other directivity, oriented from the center towards the periphery of the screen 10.
  • a microphone is oriented so as to produce audio data corresponding to the sound generated by a user standing or moving in a specific area relative to the screen or to the cameras.
  • the microphones are arranged substantially at the height of the mouth of the local users 11, and they face the cameras. The users are framed by the cameras, and they can speak between two microphones.
  • the sound pickup means comprise an omnidirectional microphone. Preferably, this microphone is then coupled to an echo cancellation device.
  • the sound reproduction means include speakers.
  • the loudspeakers are uniformly distributed around the periphery of the well.
  • they are arranged on the side of the screen 10 opposite to local users 11, that is to say on the side of the face of the screen 10 facing the bottom 19 of the well.
  • the microphones 101 to 103 are for example integral with a suspension fixed to a system of arches (not shown) forming a gantry above the screen 10. These arches are for example fixed to the vertical walls or to the coping 17 of the well. They delimit the location of the users.
  • the speakers 104 to 106 are preferably mounted on a suitable suspension. The suspended mounting of the speakers and/or microphones makes it possible to avoid any risk of structure-borne ("solidien") coupling between them.
  • the sound reproduction means may comprise flat and transparent loudspeakers maintained, for example by gluing, on the face of the screen 10 opposite the bottom 19 of the well (the one facing the local users 11).
  • the sound reproduced by these speakers is then in subjective correspondence with the image reproduced on the zone of the screen 10 on which they are arranged.
  • Such speakers have recently appeared on the market and are offered, for example, by the company NXT Corporation.
  • the aforementioned flat speakers are not transparent.
  • the video reproduction means then comprise a projector placed on the same side of the screen as the local users, the screen being in this case an opaque diffusing screen.
  • the internal walls of the well are preferably coated with an acoustic absorbent material 6 making it possible to avoid the "sounding board” effect inside the well.
  • Each device comprises a management unit comprising audio processing means and video processing means.
  • These two types of means will now be described separately with reference respectively to the diagrams in FIG. 6 and FIG. 7. This separate presentation has been chosen for convenience. However, these two groups of means naturally form a functional whole, so that the sounds and images reproduced in each terminal are consistent with one another. In the diagrams in FIGS. 6 and 7, we consider the example of a point-to-point videoconference system between terminals 1 and 2.
  • the video management unit 15 of the terminal 1 comprises an image composition matrix M receiving as input the video signals generated by the cameras 12 and 13.
  • the output of the matrix M is coupled to the input of a video stream duplicator D.
  • the output of this duplicator D is connected to the input of a video coder 17.
  • the compressed video data at the output of the coder 17 correspond to at least one image of a local user 11 (FIG. 1). They are transmitted to the remote terminal 2 via a communication network, through interfaces (not shown) appropriate to the type of network used for the transmission.
  • the output of the duplicator D is also connected to the video input of a digital video processing unit VPU, such as a station of the SGI family offered by the company Silicon Graphics, or any other real-time image processing equipment allowing image composition, special effects, image overlay, etc.
  • the VPU unit also includes a data input for receiving, via an analog/digital (A/D) converter, video data delivered by the output of a video decoder 18. These video data are transmitted from the remote terminal 2 via the communication network through the appropriate interfaces (not shown). They correspond to at least one image of a remote user 21 (FIG. 1).
  • the video output of the VPU unit is connected to the video input of the video projector 14. It delivers for example a video signal for an image in 1024x768 format (pixels).
  • the VPU unit performs the following functions:
  • composition of local images and / or distant images either by fusion or by superposition, as will be explained below;
  • deformations can correspond to ripples on the screen, in order to simulate the agitation of the surface of water and thus increase the resemblance to a real well; the screen can then be a touch screen, the ripple effect being produced when a local user touches the screen;
  • these deformations can be coded by computer data transmitted with the audio data and the video data, for example according to the ITU T.120 protocol or similar; possibly also, digital or analog overlay of additional video images coded in JPEG format (for example drawings, plans, graphics, or the like making it possible to illustrate the words of the users); these additional images, which do not correspond to images taken by the shooting means according to the invention, are coded by computer data transmitted with the audio data and the video data, for example according to the ITU T.120 protocol or the like.
  • Composing an image from local images and remote images allows each user to see other users on the screen, including those who are on the same site as him. This avoids the head movements of local users, visible to remote users, going from the screen to the face of other local users. This also allows each local user to place themselves around the well as if the two user groups (local and remote) were one and the same group of people present around the well.
  • the purpose of coding the data transmitted from one terminal to the other is to compress the video data in order to limit the bandwidth necessary for transmission, which makes it possible to adapt the bit rate to the lines offered by telecommunications operators.
  • the encoder 17 and the decoder 18 are for example an encoder and a decoder of MPEG type ("Moving Pictures Experts Group"), that is to say they conform to that group's video compression standard. They can also be of ITU ("International Telecommunication Union") H.263 type or of AVI ("Audio Video Interleaved") type.
  • FIG. 6 the diagram of a management unit 25 of the remote terminal 2 is likewise represented.
  • This unit 25 being identical to the management unit 15 of the local terminal 1, it will not be described anew.
  • the elements of the remote terminal 2 corresponding to the elements of the local terminal 1, visible in particular in FIG. 6, bear the same reference as the latter, with the digit 2 in place of the digit 1 as the tens digit.
  • the video management unit of these terminals is of course modified correspondingly, depending on the equipment available, as those skilled in the art will immediately perceive.
  • In FIG. 8, examples of images 31, 32, 33 and 34, generated respectively by the cameras 12 and 13 of the local terminal 1 and by the cameras 22 and 23 of the remote terminal 2, have been symbolically represented.
  • FIGS. 9 to 11 show examples of the composition of local and remote images produced by the VPU.
  • the images shown in these figures are rectangular, but this does not in any way limit the shape of the screen which, as mentioned above, can be circular, oval, polygonal or the like.
  • FIG. 9 shows the composite image 35 obtained by superimposing images 31 to 34.
  • the images are superimposed by transparency.
  • the light intensity of the images thus superimposed is not necessarily identical for each image.
  • the light intensity of the image of local users is lower than that of the image of remote users. In this way, local users appear on the screen only as shadows, "ghosts" or the like.
  • This mode of composition produces a homogeneous image observable by all the people gathered around the well.
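This superposition can be sketched as a weighted additive blend, the local image receiving a lower weight so that local users appear only as "ghosts"; the tiny grayscale grids and the weight values below are illustrative assumptions.

```python
# Weighted additive blend of equally sized grayscale images, as in the
# "superposition by transparency" composition: the local image is given a
# lower weight than the remote one, so local users show up only as shadows.
# Image sizes and weight values are illustrative.

def superpose(images, weights):
    """Blend images pixel by pixel with per-image weights, clipped to 255."""
    height, width = len(images[0]), len(images[0][0])
    out = [[0.0] * width for _ in range(height)]
    for img, weight in zip(images, weights):
        for y in range(height):
            for x in range(width):
                out[y][x] += weight * img[y][x]
    return [[min(255, round(v)) for v in row] for row in out]

local = [[200, 200], [200, 200]]    # bright image of a local user
remote = [[100, 100], [100, 100]]   # image of a remote user
composite = superpose([local, remote], [0.25, 1.0])  # locals as "ghosts"
```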
  • Figure 10 shows the composite image 36 obtained by fusion.
  • This mode of composition consists in assigning a particular zone of the restored image to each image source (camera), thereby operating a partition of the restored image.
  • This mode of composition produces an image with discontinuities or “crossfades”. This makes it possible to display or enhance the image of a user only, for example the one who speaks.
  • These special effects are generated by the VPU unit which mixes and adjusts the luminance levels of the images. We can for example assign the area of the screen closest to each local user to the image of this local user, so that the user can thus see himself on the screen as if he saw himself by reflection at the surface of the water in an actual well.
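The fusion mode can be sketched as a partition of the restored image, each zone showing one camera's image; the single-row images and equal zone sizes below are simplifying assumptions, since the description leaves the zone geometry open.

```python
# "Fusion" composition: the restored image is partitioned and each zone
# displays one source camera's image. Here a single row of pixels is split
# into equal zones, one per source; the real zone geometry (e.g. the area
# nearest each user) is not fixed by the description.

def fuse(images, width):
    """Tile the first width // len(images) pixels of each source side by side."""
    zone = width // len(images)
    out = []
    for img in images:
        out.extend(img[:zone])
    return out

# Four single-row source images (cameras 12, 13, 22, 23), 8-pixel screen row:
sources = [[31] * 8, [32] * 8, [33] * 8, [34] * 8]
row = fuse(sources, 8)   # each source gets a 2-pixel zone
```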
  • The compositions of Figures 9 and 10 produce an effect identical to that produced by the water in an actual well: each person sees himself the right way up and sees the other people upside down.
  • Figure 11 shows the image 37 obtained by merging images 31 to 34.
  • This composition allows users to focus their attention on the center of the screen and no longer on its periphery as in the other two types of composition. In addition, it allows each local user to see another person from the front.
  • the composition of the images will be such that the image of a local user and that of a remote user are located opposite on the screen.
  • the location on the screen of the sound image of a remote user is consistent with the location on the screen of the image of this remote user as reproduced on the screen. This is achieved by appropriate control of the aforementioned VPU video processing unit and an audio processing unit (which will now be described).
  • In FIG. 7 there are shown the audio processing means of the management unit 15 of the local terminal 1 (in the left part of the figure) and the corresponding means of the management unit 25 of the remote terminal 2 (in the right part of the figure). In fact these means are identical for each terminal. It will be noted that the elements of the remote terminal 2 corresponding to the elements of the local terminal 1, visible in particular in FIG. 7, bear the same reference as the latter, with the digit 2 in place of the digit 1 as the hundreds digit.
  • the management unit 15 of the local terminal 1 comprises a digital audio processing unit APU, produced for example in the form of one or more DSPs ("Digital Signal Processors") or in the form of digital audio cards for a PC-type computer.
  • Analog or digital inputs of the APU unit are connected to microphones 101, 102 and 103 via analog pre-amplifiers A.
  • Analog outputs of the APU unit are connected to an audio encoder 117, such as an MP3-type encoder ("MPEG-1 Layer III", the audio compression standard popularized by the Internet), or the signals are processed according to standardized telecommunication coding modes (ITU G.711, G.722, G.728, G.723, G.729, etc., depending on the bit rate) to produce compressed audio data.
  • This compressed audio data corresponds to the sounds generated by the local users and picked up by the microphones 101 to 103.
  • This data is transmitted via the communication network via appropriate interfaces (not shown) to the remote terminal 2.
  • the management unit 15 further comprises an audio decoder 118 which is the dual of the audio coder of the management unit 25 of the remote terminal 2, such as an MP3 or G7xx decoder (see above).
  • the decoder 118 receives audio data from the communication network (not shown) connecting the local terminal 1 to the remote terminal 2 via the appropriate interfaces (also not shown).
  • Analog or digital outputs of the decoder 118 are connected, via respective power amplifier PA, to the speakers 104, 105 and 106 of the sound reproduction means. The sounds reproduced by these speakers from said audio data correspond to the sounds generated by remote users 21 (FIG. 1).
  • a microphone of the local terminal 1 is associated with a loudspeaker of the remote terminal 2.
  • the microphones 101 to 103 of the terminal 1 are respectively associated with the loudspeakers 204 to 206 of the terminal 2.
  • a loudspeaker of the local terminal 1 is associated with a microphone of the remote terminal 2.
  • the speakers 104 to 106 of the terminal 1 are respectively associated with the microphones 201 to 203 of the terminal 2.
  • the APU unit performs the following functions:
  • the "pan-pot” spatial effect equivalent to stereophony in a plane, in this case that of the screen
  • the "pan-pot” effect corresponds to a stereophony with several reproduction routes
  • this effect is obtained, in a configuration with at least three local speakers associated with three remote microphones, by ranking the levels (intensities) at the output of the microphones and by choosing the association of the local speakers and the remote microphones according to this ranking; this makes it possible to position the sound image of a remote speaker in a determined area of the screen plane; this area is naturally chosen so that its location on the screen is consistent with the location of the image of the remote speaker reproduced on the screen.
  • the control of the spatial effect completes the effect naturally produced by the directivity of the microphones. For example, if we consider a pair of directional microphones and their pair of associated speakers, this effect corresponds to a simple stereophonic effect. But with at least three speakers, a “pan-pot” type spatial effect allows the sound image to move in a plane corresponding to the plane of the screen.
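One possible gain law for such a "pan-pot" placement of a sound image in the screen plane is constant-power panning between the two loudspeakers adjacent to the target direction; the cosine/sine law and the 0°/120°/240° speaker layout below are illustrative choices, not ones fixed by the description.

```python
import math

# Illustrative "pan-pot" sketch: three loudspeakers assumed at 0°, 120° and
# 240° around the well, with a constant-power cosine/sine law panning the
# sound image between the two speakers adjacent to the target angle.

SPEAKER_ANGLES = [0.0, 120.0, 240.0]

def pan_pot(target_deg):
    """Return the three speaker gains placing a sound image at target_deg."""
    target = target_deg % 360.0
    gains = [0.0, 0.0, 0.0]
    for i, angle in enumerate(SPEAKER_ANGLES):
        offset = (target - angle) % 360.0
        if offset < 120.0:                  # target lies between speakers i, i+1
            frac = offset / 120.0           # 0 at speaker i, 1 at speaker i+1
            gains[i] = math.cos(frac * math.pi / 2.0)
            gains[(i + 1) % 3] = math.sin(frac * math.pi / 2.0)
            break
    return gains

gains = pan_pot(60.0)   # sound image halfway between the 0° and 120° speakers
```

The cosine/sine pair keeps the total radiated power constant while the image moves, which is the usual reason for choosing this law over a linear crossfade.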
  • Echo cancellation is particularly useful if the sound pickup means include an omnidirectional microphone. In addition to echo problems, it also overcomes the stability problems of audio loops.
  • Local microphones pick up sounds generated by local users. These sounds are output by the remote speakers and can be picked up by the remote microphones. They are then reproduced by the local speakers, can in turn be picked up by the local microphones, and so on. Therefore, in the event of coupling between the microphones and the loudspeakers of a terminal, an acoustic loop is created whose stability must be ensured in order to avoid saturation of the audio means (by the "Larsen", i.e. feedback, effect).
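The stability condition on this acoustic loop amounts to a loop-gain check: the product of the gains around one round trip (local microphone, remote speaker, remote microphone, local speaker) must remain below unity, otherwise the Larsen effect appears. The gain figures below are illustrative assumptions.

```python
# Minimal sketch of the acoustic-loop stability condition: one round trip
# runs local mic -> remote speaker -> remote mic -> local speaker -> local
# mic. If the product of the gains around the loop reaches 1, the loop
# saturates (Larsen effect). Gain values are illustrative.

def loop_stable(mic_to_speaker_gain, speaker_to_mic_coupling):
    """True if one round trip through both sites attenuates the signal."""
    # Two electrical mic->speaker hops and two acoustic speaker->mic
    # couplings per round trip.
    round_trip = (mic_to_speaker_gain ** 2) * (speaker_to_mic_coupling ** 2)
    return round_trip < 1.0

stable = loop_stable(1.0, 0.3)        # directional mics: weak coupling
unstable = not loop_stable(2.0, 0.8)  # amplified loop: would saturate
```

This is why the directivity of the microphones matters: it lowers the speaker-to-mic coupling factor, keeping the round-trip gain well under one.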
  • the acoustic stability comes essentially from the directivity of the microphones, associated with the acoustic diffraction at the edge of the coping 17 of the well.
  • Stability reinforcement can be obtained by choosing a second-order cardioid or hypercardioid directivity.
  • the directivity of the microphones also has the advantage of improving the spatial effect.
  • the risk of coupling between the microphones and the loudspeakers can also be advantageously reduced by observing the following measure.
  • the loudspeaker of the local site which is associated with the microphone of the remote site that is closest to the loudspeaker of the remote site associated with a determined microphone of the local site is placed as far as possible from said determined microphone of the local site.
  • The terms "close" and "distant" used above are understood in the acoustic sense, that is to say they refer respectively to a strong coupling and to a weak coupling, taking into account the directivity of the microphones and/or loudspeakers, the diffractions, reflections and/or acoustic absorptions due to their installation, and in general all the acoustic disturbances involved in the physical installation of the elements of the terminal considered.
  • FIG. 12 illustrates an example of application of this measure in the case considered previously of well-shaped terminals whose screen surface is circular, each having three microphones and three associated loudspeakers.
  • the same elements as in Figure 7 have the same references.
  • Thus the speaker 104 of terminal 1, which is associated with the microphone 201 of terminal 2 that is closest to the speaker 204 of terminal 2 associated with the microphone 101 of terminal 1, is placed as far as possible from the microphone 101.
  • the loudspeaker 104 is arranged in the coping of the well at an angular position corresponding to an angle of +120° (counted in the trigonometric direction) relative to the radius corresponding to the directivity axis of the microphone 101.
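The angular layout just described can be computed directly: the three microphone directivity axes are spaced 120° apart (taken here at 0°, 120° and 240°, an illustrative choice of origin), and each associated loudspeaker sits at +120° from its microphone's axis.

```python
# Angular layout sketch: microphone directivity axes spaced 120° apart,
# each associated loudspeaker placed in the coping at +120° from its
# microphone's axis, counted in the trigonometric direction. The 0° origin
# is an arbitrary illustrative choice.

MIC_AXES_DEG = [0, 120, 240]        # microphones 101, 102, 103

def speaker_positions(mic_axes, offset_deg=120):
    """Angular position of each associated loudspeaker, modulo 360°."""
    return [(axis + offset_deg) % 360 for axis in mic_axes]

positions = speaker_positions(MIC_AXES_DEG)   # speakers 104, 105, 106
```

Each speaker thus ends up on the axis of the next microphone around the well, which is what keeps it far, in the acoustic sense, from its own microphone's pickup lobe.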
  • the system includes at least two videoconferencing terminals as described above. Each terminal is distant from the other terminals.
  • the terminals are connected by a communication or broadcasting network for the transmission of audio and video data.
  • the terminals are distributed over different sites, for example the sites of different establishments of a company.
  • the network can be of WAN type ("Wide Area Network"). It can also be the Internet, a broadband network over ATM ("Asynchronous Transfer Mode"), a network of dedicated lines, and in general any type of communication network using wire, radio, optical and/or satellite links.
  • the terminals are preferably installed in a usual meeting place, an open space forming a forum such as an entrance hall, a rest room (where the coffee machines are usually located), a cafeteria or simply a corridor.
  • the terminals which, as we have said, can be in continuous operation, create the link, the interface, between the different sites.
  • the terminals are distributed in different locations of the same site, for example arranged in a public place, a large room, a reception room, etc. It is in this type of application that the terminals can advantageously have the shape of a table.
  • the network can also be of the LAN type (from the English “Local Area Network”).
  • the terminals can operate continuously, so as to allow informal communication and without appointment between users distant from each other.
  • the wells therefore constitute, in each site, a window open to a remote site. Users can then, while passing near the well, look into the well for eye contact with a user at the remote site. A conversation can then easily be initiated.
  • the videoconferencing well therefore allows informal and friendly communication between remote users.
  • the system comprises a communication network R comprising links 310 for connecting the terminals 300 in multipoint mode.
  • links 310 are of course bidirectional, with symmetrical or asymmetrical bit rates.
  • the network is fully meshed, that is to say that links 310 directly connect each terminal to each of the other terminals.
  • the network may only be partially meshed.
  • the system comprises a communication network R comprising a multipoint videoconferencing equipment or EVM (in English "Multipoint Conferencing Unit” or MCU), also called videoconferencing bridge.
  • This equipment, known per se, is connected by a link 320 to each of the terminals 300. It provides multiplexing and switching of the audio, video, and other data originating from or intended for each of the terminals.
  • the network can contain both an EVM and direct links between at least some pairs of terminals. This corresponds to a combination of the examples shown in Figures 13 and 14.
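The trade-off between the fully meshed network and the centralized EVM/MCU bridge described above can be illustrated by counting the links each topology requires. The sketch below is illustrative only: the function names are ours, and the counts follow from elementary graph theory rather than from the patent itself.

```python
def full_mesh_links(n: int) -> int:
    """Fully meshed network: one direct bidirectional link per pair
    of terminals, i.e. n*(n-1)/2 links in total."""
    return n * (n - 1) // 2

def mcu_star_links(n: int) -> int:
    """Star topology through a central videoconferencing bridge (EVM/MCU):
    one link from each terminal to the bridge."""
    return n

# The mesh grows quadratically with the number of terminals,
# while the bridge topology grows only linearly.
for n in (2, 4, 8, 16):
    print(f"{n} terminals: mesh={full_mesh_links(n)}, bridge={mcu_star_links(n)}")
```

This is why a partially meshed network or a bridge becomes attractive as the number of sites grows, at the cost of routing all streams through shared equipment.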

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Telephonic Communication Services (AREA)
EP01947590A 2000-07-04 2001-06-22 Übertragungsendgerät- und system Withdrawn EP1297699A1 (de)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
FR0008670A FR2811501B1 (fr) 2000-07-04 2000-07-04 Terminal et systeme de communication
FR0008670 2000-07-04
PCT/FR2001/001981 WO2002003692A1 (fr) 2000-07-04 2001-06-22 Terminal et systeme de communication

Publications (1)

Publication Number Publication Date
EP1297699A1 true EP1297699A1 (de) 2003-04-02

Family

ID=8852083

Family Applications (1)

Application Number Title Priority Date Filing Date
EP01947590A Withdrawn EP1297699A1 (de) 2000-07-04 2001-06-22 Übertragungsendgerät- und system

Country Status (6)

Country Link
US (1) US7190388B2 (de)
EP (1) EP1297699A1 (de)
JP (1) JP2004502381A (de)
AU (1) AU2001269243A1 (de)
FR (1) FR2811501B1 (de)
WO (1) WO2002003692A1 (de)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2832282B1 (fr) * 2001-11-12 2004-07-30 France Telecom Systeme audiovisuel modulaire avec modules concatenables pour mettre en presence une scene locale et une scene distante
JP2004012712A (ja) * 2002-06-05 2004-01-15 Olympus Corp テーブル型ディスプレイ装置
JP2004015332A (ja) * 2002-06-05 2004-01-15 Olympus Corp テーブル型ディスプレイ装置およびその組立方法
GB0321083D0 (en) * 2003-09-09 2003-10-08 British Telecomm Video communications method and system
US20070033539A1 (en) * 2005-08-04 2007-02-08 Thielman Jeffrey L Displaying information
FR2908900B1 (fr) * 2006-11-20 2009-02-20 Euroinvest Sarl Table de projection comprenant un moyen de presentation d'images
GB0712099D0 (en) * 2007-06-22 2007-08-01 Wivenhoe Technology Ltd Transmission Of Audio Information
JP5529617B2 (ja) * 2010-04-21 2014-06-25 日本電信電話株式会社 遠隔会議装置、遠隔会議方法、および遠隔会議プログラム
FR3050894B1 (fr) * 2016-04-27 2019-04-12 Jean Neimark Systeme de communication interactive et multi-utilisateurs pour des applications dans les domaines de l'evenementiel ou des loisirs
JP7464927B2 (ja) 2022-09-12 2024-04-10 公立大学法人公立はこだて未来大学 通信システム、通信装置、プログラム、及び制御方法

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3888576A (en) * 1972-08-07 1975-06-10 Francis Bolgar Film projectors
DE3816093A1 (de) * 1988-05-11 1989-11-23 Bke Bildtechnisches Konstrukti Video-projektionsbox fuer deckeneinbau
US4918535A (en) * 1988-10-25 1990-04-17 Robert Bosch Gmbh Television projection system
JPH04122186A (ja) * 1990-09-12 1992-04-22 Sharp Corp Tv会議システム
JPH05181437A (ja) * 1992-01-06 1993-07-23 Fuji Xerox Co Ltd 電子装置
JPH06338829A (ja) * 1993-05-28 1994-12-06 American Teleph & Telegr Co <Att> 通信システム内の反響除去方法と装置
US5639151A (en) * 1996-02-16 1997-06-17 Mcnelley; Steve H. Pass-through reflective projection display
JP3840266B2 (ja) * 1996-03-15 2006-11-01 株式会社 日立製作所 表示装置およびその操作方法
FR2761562B1 (fr) * 1997-03-27 2004-08-27 France Telecom Systeme de visioconference
JP3266045B2 (ja) * 1997-04-11 2002-03-18 ヤマハ株式会社 ステレオシステム
JPH11288040A (ja) * 1998-03-31 1999-10-19 Denso Corp 映像・音声一体型スクリーン及び再生装置
JPH11331827A (ja) * 1998-05-12 1999-11-30 Fujitsu Ltd テレビカメラ装置
JP4169830B2 (ja) * 1998-06-04 2008-10-22 株式会社ザナヴィ・インフォマティクス 車載用ディスプレイ装置
JP3994183B2 (ja) * 1998-07-28 2007-10-17 キヤノン株式会社 表示制御装置、表示制御方法、及び記憶媒体
US6330022B1 (en) * 1998-11-05 2001-12-11 Lucent Technologies Inc. Digital processing apparatus and method to support video conferencing in variable contexts
US6275251B1 (en) * 1998-11-05 2001-08-14 Motorola, Inc. Teleconference system with personal presence cells
JP2000142260A (ja) * 1998-11-18 2000-05-23 Matsushita Electric Ind Co Ltd 車載用ディスプレイ装置
JP2000166656A (ja) * 1998-12-11 2000-06-20 Mitsubishi Electric Corp
US6304648B1 (en) * 1998-12-21 2001-10-16 Lucent Technologies Inc. Multimedia conference call participant identification system and method
JP3703328B2 (ja) * 1999-02-16 2005-10-05 キヤノン株式会社 電子会議システム及びその制御方法

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO0203692A1 *

Also Published As

Publication number Publication date
US20040068736A1 (en) 2004-04-08
FR2811501A1 (fr) 2002-01-11
AU2001269243A1 (en) 2002-01-14
US7190388B2 (en) 2007-03-13
FR2811501B1 (fr) 2003-06-13
JP2004502381A (ja) 2004-01-22
WO2002003692A1 (fr) 2002-01-10

Similar Documents

Publication Publication Date Title
CA2284884C (fr) Systeme de visioconference
US8836750B2 (en) Telepresence system, telepresence method, and video collection device
CN101534413B (zh) 一种远程呈现的系统、装置和方法
JP3871209B2 (ja) 劇場の映像・音響システム構造
US8289367B2 (en) Conferencing and stage display of distributed conference participants
US20040008423A1 (en) Visual teleconferencing apparatus
JP2011528207A (ja) ライブテレポーティングのシステムおよび装置
WO2002023902A1 (fr) Systeme interactif audiovisuel
EP1297699A1 (de) Übertragungsendgerät- und system
WO2003043324A1 (fr) Systeme audiovisuel modulaires pour met tre en presence une scene locale et une scene distante
CN109274921A (zh) 视频会议系统
Kentgens et al. From Spatial Recording to Immersive Reproduction—Design & Implementation of a 3DOF Audio-Visual VR System
KR101954680B1 (ko) 반전 망원경 카메라를 이용한 화상 회의 시스템
JPS63240283A (ja) 映像送受信装置
CN1086893C (zh) 影像处理系统
KR200273728Y1 (ko) 극장용 영상, 음향 광학계(劇場用 映像,音響 光學係)
FR2711874A1 (fr) Terminal de télécommunications visuelles et sonores.
FR2848762A1 (fr) Systeme audiovisuel interactif
FR2743242A1 (fr) Systeme de restitution simultanee, synchrone ou asynchrone, de plusieurs sources audiovisuelles et de donnees, prealablement compressees et stockees sous forme de flux de donnees numeriques
JPH04150684A (ja) 表示・撮像装置
WO2005052775A2 (en) Videophone
KR20140022580A (ko) 화상회의 장치

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20021230

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE TR


AX Request for extension of the european patent

Extension state: AL LT LV MK RO SI

17Q First examination report despatched

Effective date: 20040414

APBN Date of receipt of notice of appeal recorded

Free format text: ORIGINAL CODE: EPIDOSNNOA2E

APBR Date of receipt of statement of grounds of appeal recorded

Free format text: ORIGINAL CODE: EPIDOSNNOA3E

APAF Appeal reference modified

Free format text: ORIGINAL CODE: EPIDOSCREFNE

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: GULA CONSULTING LIMITED LIABILITY COMPANY

APBT Appeal procedure closed

Free format text: ORIGINAL CODE: EPIDOSNNOA9E

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

18W Application withdrawn

Effective date: 20110411