EP2368167A1 - Verfahren und einrichtung zum verarbeiten von textdaten - Google Patents
Verfahren und einrichtung zum verarbeiten von textdaten
- Publication number
- EP2368167A1 (application EP09802187A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- terminal
- data
- image
- mcu
- text
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
- 238000012545 processing Methods 0.000 title claims abstract description 55
- 238000000034 method Methods 0.000 title claims abstract description 19
- 239000002131 composite material Substances 0.000 claims description 9
- 230000005540 biological transmission Effects 0.000 claims description 4
- 230000006870 function Effects 0.000 claims description 3
- 238000004590 computer program Methods 0.000 claims description 2
- 238000007781 pre-processing Methods 0.000 claims 2
- 238000004891 communication Methods 0.000 description 14
- 239000000872 buffer Substances 0.000 description 3
- 238000002360 preparation method Methods 0.000 description 3
- 238000003672 processing method Methods 0.000 description 2
- 230000000007 visual effect Effects 0.000 description 2
- 238000011161 development Methods 0.000 description 1
- 238000012546 transfer Methods 0.000 description 1
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/02—Details
- H04L12/16—Arrangements for providing special services to substations
- H04L12/18—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
- H04L12/1813—Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
- H04L12/1827—Network arrangements for conference optimisation or adaptation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F15/00—Digital computers in general; Data processing equipment in general
- G06F15/16—Combinations of two or more digital computers each having at least an arithmetic unit, a program unit and a register, e.g. for a simultaneous processing of several programs
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/04—Real-time or near real-time messaging, e.g. instant messaging [IM]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/07—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail characterised by the inclusion of specific contents
- H04L51/10—Multimedia information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/141—Systems for two-way working between two video terminals, e.g. videophone
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/141—Systems for two-way working between two video terminals, e.g. videophone
- H04N7/147—Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/15—Conference systems
- H04N7/152—Multipoint control units therefor
Definitions
- The invention relates to the field of textual data processing, in particular in the context of videoconferencing between a plurality of users.
- Conversational videophone services have recently undergone significant development. Such services consist of the joint use of video, audio and text means to organize a conference between a plurality of participants.
- A chat feature is required to exchange messages in the form of textual data.
- In a point-to-point context between two participants, there are methods of exchanging text independently of the exchange of video, with, for example, text entered on the client side and exchanged as whole sentences.
- An example of such a point-to-point system is shown in FIG. 1A.
- The terminals A and B exchange video data over a first exchange channel 1, audio data over a second exchange channel 2, and text data over a third exchange channel 3.
- The transfer of text data is governed by a text communication protocol, such as the T.140, SIP Message or SIP Info protocols.
- As an example, the terminal A uses a text data exchange channel 4 managed by a T.140-type protocol, while the terminal B uses a text data exchange channel 5 managed by a SIP Message-type protocol.
- The terminal B cannot receive the textual data provided by the channel 4, and conversely the terminal A cannot receive the data provided by the channel 5.
- This problem is particularly acute in the context of multipoint videoconferencing systems, where a central control unit manages the videoconference between a plurality of participants, typically more than three.
- The terminal A uses a text data exchange channel 6 managed by a T.140 protocol. It receives textual data from a terminal B over a textual data exchange channel 7 managed by the same T.140 protocol, as well as textual data from a terminal C over a textual data exchange channel 8 managed by the SIP Message protocol. In such a case, the terminal A can communicate textually with the terminal B, but is unable to communicate textually with the terminal C.
- A solution to this text conversation problem, during a videoconference between several participants, is of course to require that each participant use the same textual communication protocol, as is the case for the terminals A and B in FIG. 1C.
- However, this constraint is cumbersome to implement, or even simply unfeasible, when the participants' terminals already have their own textual communication protocol and cannot change it.
- Moreover, participants' terminals are not capable of receiving textual data from multiple text sources, even if these sources use the same protocol.
- Indeed, most terminals do not support a multipoint text dialogue function.
- One of the aims of the invention is to overcome the disadvantages of the prior art cited above.
- To this end, the invention proposes a textual data processing method, in a telecommunication context between at least a first terminal and a second terminal, comprising, after reception by a textual data processing device of textual data from the first terminal, the generation of image data integrating the received textual data, for transmission of these image data at least to the second terminal.
- The present invention improves the situation by making multiple text conversations possible, homogeneously and dynamically, with any terminal supporting at least the sending of textual data.
- The second terminal does not need to use the same textual communication protocol as the first terminal to receive the text data thereof.
- The second terminal can now receive multiple text streams coming from different sources and transferred using the same protocol.
- The processing device embeds at least a portion of a text corresponding to said text data in at least one source image.
- The text data of the first terminal will thus be readily available to the second terminal, simply by viewing the image data received from the processing device.
- The processing device also receives text data from the second terminal and embeds at least part of a text corresponding to said textual data received from the second terminal in said source image.
- At least part of the source image is communicated from the first terminal to the processing device.
- Information of a textual nature from the first terminal will thus be supplemented by information of a visual nature.
- The source image is a composite image of which at least part is communicated from the second terminal to the processing device. Textual information from the terminals will thus be supplemented by visual information from them.
- The embedding of the text data is performed in a display area defined in said source image.
- The source image comprises at least a first display area, in which at least a portion of a first image communicated from the first terminal to the processing device is displayed, as well as a second display area, in which at least a portion of a second image communicated from the second terminal to the processing device is displayed.
- The processing device embeds at least part of a text corresponding to said text data from the first terminal in the first display area, and at least a portion of a text corresponding to said text data from the second terminal in the second display area.
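- As a rough illustration of such a two-zone layout, the sketch below uses the Pillow imaging library; the zone geometry, helper name and text placement are assumptions for the example, not definitions taken from the patent.

```python
# Illustrative sketch only: a source image with two display areas, each showing
# one terminal's image with that terminal's text embedded in the same area.
from PIL import Image, ImageDraw

def compose_two_zones(im1, d1, im2, d2, zone_size=(320, 240)):
    """Build a source image with two display areas and per-terminal text."""
    w, h = zone_size
    im_s = Image.new("RGB", (2 * w, h), "black")         # the source image
    for idx, (image, text) in enumerate(((im1, d1), (im2, d2))):
        zone = image.resize(zone_size)                   # display area of terminal idx+1
        ImageDraw.Draw(zone).text((5, h - 20), text, fill="white")
        im_s.paste(zone, (idx * w, 0))
    return im_s

# Example with two dummy terminal images
im = compose_two_zones(Image.new("RGB", (160, 120), "navy"), "Hello from T1",
                       Image.new("RGB", (160, 120), "darkgreen"), "Hi from T2")
im.save("composite.png")
```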
- The present invention also proposes a device for processing textual data, in the context of telecommunication between at least a first terminal and a second terminal, comprising processing means adapted to implement the method as described above.
- The processing device comprises preprocessing means for generating the source image according to the method described above.
- The preprocessing means comprise mixing means for generating said source image in the form of a composite image according to the method described above.
- The present invention also provides a video-text conversation system comprising a textual data processing device as described above, connected to at least two terminals able to send textual and/or image data to the processing device.
- The present invention finally proposes a computer program for the implementation of the method as described above.
- Such a program may be downloadable via a telecommunication network and/or stored in a memory of a processing device and/or stored on a storage medium intended to cooperate with a processing device.
- FIGS. 1A, 1B and 1C illustrate various conventional videoconferencing systems with textual communication;
- FIG. 2 diagrammatically represents a textual data processing device according to a preferred embodiment of the invention;
- FIGS. 3A and 3B illustrate different modes of presentation of textual data and images according to the present invention;
- FIG. 4 illustrates a videoconferencing system with textual communication according to a preferred embodiment of the invention;
- FIG. 5 illustrates the steps of a textual data processing method according to a preferred embodiment of the invention.
- FIG. 2 shows schematically a conversation system comprising a textual data processing device MCU according to a preferred embodiment of the invention.
- Text data is understood to mean one or more data items corresponding to one or more successive characters forming a piece of text, for example characters represented by data coded in ASCII format. Such data can be entered on the keyboard of an ordinary terminal.
- Any textual data communication protocol, such as the T.140, SIP Message or SIP Info protocols, can be used to transmit the textual data between the terminals and the control unit, as long as the protocols used by the different terminals T1, T2, T3 are recognized by the control unit MCU.
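- One way to picture this protocol independence is a small normalization layer in the MCU that reduces every incoming payload to plain characters before any further processing. The sketch below is purely an assumption for illustration; real T.140 and SIP stacks are considerably richer than these two-line extractors.

```python
# Hypothetical normalization layer, not a real T.140 / SIP implementation:
# whichever protocol a terminal uses, the MCU keeps only the plain characters.
from typing import Callable, Dict

def t140_text(payload: bytes) -> str:
    return payload.decode("utf-8")        # T.140 carries UTF-8 encoded text

def sip_body_text(payload: bytes) -> str:
    return payload.decode("utf-8")        # simplified: text body of a SIP request

EXTRACTORS: Dict[str, Callable[[bytes], str]] = {
    "T.140": t140_text,
    "SIP Message": sip_body_text,
    "SIP Info": sip_body_text,
}

def receive_text(protocol: str, payload: bytes) -> str:
    """Return the textual data regardless of the sending terminal's protocol."""
    return EXTRACTORS[protocol](payload)

print(receive_text("SIP Message", b"Hello from Bob"))
```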
- With the present invention, the different users do not need to use the same textual communication protocol.
- The control unit MCU thus receives the textual data of at least one terminal, for example T1, and then generates from these, within a processing means PROC, image data Im intended to be transmitted to at least one other terminal, for example T2.
- The control unit MCU generates image data Im intended to be sent to Jen, through the terminal T2, or even to Sam, through the terminal T3.
- This image data is also sent to the terminal T1, so that Bob can verify that his data d1 have been integrated into the image data Im.
- The generation of the image data Im can be performed, for example, by embedding a portion of text corresponding to the textual data d1 in a portion of a source image Ims.
- This source image Ims may be a simple still image, such as a completely black image, on which a portion of text corresponding to the textual data d1 is embedded in white characters.
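- A minimal sketch of this simplest case, using the Pillow library; the function name, image size and text position are illustrative assumptions.

```python
# Minimal sketch: embed received text in white characters on a black still image.
from PIL import Image, ImageDraw, ImageFont

def embed_on_black(d1: str, size=(640, 480)) -> Image.Image:
    """Return image data Im: the text d1 drawn in white on a black source image."""
    im_s = Image.new("RGB", size, "black")               # still, completely black source image
    ImageDraw.Draw(im_s).text((10, 10), d1, fill="white",
                              font=ImageFont.load_default())
    return im_s

embed_on_black("Hello from Bob").save("im.png")
```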
- The source image Ims may also consist of a succession of images {Ims}1,...,k of a video stream, received for example from a source external to the control unit MCU that generates such a video stream, such as a TV program.
- The source image data Ims can also come from the users themselves.
- Bob, Jen and/or Sam can transmit respective image data Im1, Im2, Im3, which will be used to embed a portion of text corresponding to the transmitted text data d1, d2 and/or d3.
- The terminal T1 transmits textual data d1, corresponding to a text entered by the user Bob, as well as image data Im1, corresponding for example to a picture of this user.
- The processing means PROC uses the image data Im1 as source image data Ims, in order to embed a part of the text corresponding to the textual data d1 entered by this user Bob. It then generates image data Im corresponding to this picture of the user Bob, on which the text he has entered appears. This image data is then transmitted to the user Jen, or even to other users such as Sam, and possibly also to the user Bob, as described above.
- In another example, the terminal T1 transmits textual data d1, corresponding to a text entered by the user Bob, as well as a video stream consisting of a series of image data {Im1}1,...,k, corresponding for example to the video captured by a webcam belonging to Bob.
- The processing means PROC will then successively use the image data {Im1}1,...,k as source image data {Ims}1,...,k, in order to embed on each of these images a part of the text corresponding to the text data d1 entered by the user Bob.
- The means PROC then generates a video stream consisting of successive image data {Im}1,...,k.
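- The per-frame case can be pictured as follows, under the same illustrative assumptions as above: the embedding is simply applied to every image of the incoming stream to produce the outgoing stream.

```python
# Sketch: apply the text embedding to each frame of the incoming video stream.
from typing import Iterable, Iterator
from PIL import Image, ImageDraw

def embed_on_stream(frames: Iterable[Image.Image], d1: str) -> Iterator[Image.Image]:
    """Yield the outgoing stream {Im}1,...,k from the incoming stream {Im1}1,...,k."""
    for im1 in frames:
        frame = im1.copy()                # each incoming frame serves as a source image
        ImageDraw.Draw(frame).text((10, 10), d1, fill="white")
        yield frame

dummy_stream = (Image.new("RGB", (320, 240), "gray") for _ in range(3))
outgoing = list(embed_on_stream(dummy_stream, "Hello from Bob"))
```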
- The text corresponding to the transmitted textual data can be embedded character by character, as the textual data arrive at the control unit MCU. This is referred to as a "real-time" text embedding mode, in which the text appears as it is entered. Such an embedding mode offers great interactivity between the participants.
- Alternatively, the processing means PROC may comprise a memory, of buffer type for example, for storing the received textual data until a data item indicating the end of a sentence (such as the ASCII data corresponding to the "Enter" key of a keyboard) is received. Only then does the processing means PROC process all the stored text data and embed the entire sentence on the source image Ims.
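- A possible shape for such a buffer is sketched below; this structure is an assumption for illustration, with the carriage-return and line-feed characters produced by the Enter key taken as the end-of-sentence marker.

```python
# Sketch of the buffered, sentence-at-a-time embedding mode.
class SentenceBuffer:
    """Accumulates received characters and releases a whole sentence at once."""
    def __init__(self):
        self._chars = []

    def feed(self, char: str):
        """Feed one received character; return the sentence once Enter arrives."""
        if char in ("\r", "\n"):                      # end-of-sentence marker
            sentence = "".join(self._chars)
            self._chars = []
            return sentence                           # ready to be embedded on Ims
        self._chars.append(char)
        return None

buf = SentenceBuffer()
for c in "Hello\r":
    done = buf.feed(c)
print(done)   # -> "Hello"
```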
- The control unit MCU can also simultaneously receive text data d1, d2, d3 respectively from the users Bob, Jen and Sam.
- The processing means PROC can then embed simultaneously, in real time, the text parts corresponding to these different data within the same image to be broadcast to the different terminals.
- The processing means PROC can also store separately the textual data coming from the users Bob, Jen and/or Sam, in order to display the corresponding texts only when an entire sentence has been received from one of them. This storage can be done within a single buffer common to all these users, or in buffers dedicated respectively to each user.
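- Whichever option is chosen, the bookkeeping can be pictured as below; the dictionary of per-user buffers is an assumption for the example, not a structure prescribed by the patent.

```python
# Sketch with buffers dedicated respectively to each user (one entry per participant).
from collections import defaultdict

buffers = defaultdict(str)          # text buffer per participant

def on_character(user: str, char: str):
    """Store characters per user; release the sentence only when it is complete."""
    if char in ("\r", "\n"):
        sentence, buffers[user] = buffers[user], ""
        print(f"embed in {user}'s zone: {sentence}")   # placeholder for the embedding step
    else:
        buffers[user] += char

for c in "Hi!\n":
    on_character("Jen", c)
```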
- The source image Ims is a composite image consisting of three images Im1, Im2, Im3, each representing one of the participants Bob, Jen and Sam.
- A central text zone Z is superimposed on this composite image and is used to embed the text parts corresponding to each participant.
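- A hypothetical rendering of this presentation mode, again with Pillow: the three participant images are placed side by side and all text parts are written into a shared zone Z. Tile sizes and positions are invented for the illustration.

```python
# Sketch: composite source image Ims built from Im1, Im2, Im3 plus a text zone Z.
from PIL import Image, ImageDraw

def compose_with_zone_z(images, text_lines, tile=(210, 160), canvas=(630, 480)):
    im_s = Image.new("RGB", canvas, "black")
    for i, image in enumerate(images[:3]):
        im_s.paste(image.resize(tile), (i * tile[0], 0))   # Im1, Im2, Im3 side by side
    draw = ImageDraw.Draw(im_s)
    for j, line in enumerate(text_lines):                   # central zone Z for all text parts
        draw.text((20, tile[1] + 20 + 16 * j), line, fill="white")
    return im_s

participants = [Image.new("RGB", (160, 120), c) for c in ("navy", "maroon", "teal")]
compose_with_zone_z(participants, ["Bob: hello", "Jen: hi", "Sam: hey"]).save("fig3a.png")
```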
- The control unit MCU comprises a source image preparation means PREP, which receives the image data Im1, Im2 and Im3 from each of the terminals T1, T2, T3.
- This image data can first be decoded by respective decoders DECOD1, DECOD2, DECOD3, in order to handle the different types of image coding used during transmission from the terminals to the control unit MCU.
- In another presentation mode, each participant has his own zone Z1, Z2 or Z3, in which are displayed, for example, an image or a video stream transmitted from his terminal, as well as the characters he has entered.
- This type of presentation has, in addition to the advantage of homogeneity, that of differentiating, visually and immediately, which user has entered a particular text.
- FIG. 4 illustrates a videoconferencing system with textual communication using a control unit MCU according to a preferred embodiment of the invention.
- The three users Bob, Jen and Sam use the respective terminals A, B, C to communicate with each other via a control unit MCU similar to that described above.
- The terminals A, B, C use the video channels 21, 31, 41 and the audio channels 22, 32, 42 to organize the videoconference, according to established protocols.
- For the textual communication, the terminal A uses the T.140 protocol, the terminal B the SIP Message protocol, and the terminal C the SIP Info protocol.
- FIG. 5 illustrates the steps of a method 100 of textual data processing according to a preferred embodiment of the invention.
- In a first input step 110, textual data d1 are entered on at least one terminal T1, by means of a keyboard for example.
- Source image data Im1 are captured on the same terminal T1 during a second capture step 120.
- This data can consist of a single image, or even a video stream, and can be captured by means of a webcam, for example.
- The capture step 120 is shown as following the input step 110 in FIG. 5, but these two steps can also be performed simultaneously, or in the reverse order, in which the image data capture step precedes the textual data input step.
- A source image preparation step 140 follows, in which the image data Im1 is processed, converted to a specific format, or combined with other image data received from other terminals to form a composite source image. This step is performed within the preparation means PREP of the control unit MCU. A source image Ims, or even a source image stream {Ims}1,...,k composed of a succession of source images, is then obtained.
- The textual data d1 are then embedded in the previously prepared source image Ims or stream {Ims}1,...,k, in one of the presentation modes described above, for example.
- An image Im, or the corresponding image stream {Im}1,...,k, is then obtained.
- The image Im or the image stream {Im}1,...,k is broadcast to the participants' terminals during a broadcast step 160, if necessary after being encoded by a coding means COD adapted to these terminals.
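- Putting these steps together, a miniature end-to-end pass over the preparation, embedding and broadcast steps might look as follows; this is a sketch under the assumptions above, with capture and transport reduced to plain function arguments since the patent does not tie the method to any particular API.

```python
# Miniature end-to-end sketch: preparation (step 140), embedding, broadcast (step 160).
from PIL import Image, ImageDraw

def process_text_data(d1: str, im1: Image.Image, recipients) -> Image.Image:
    """Prepare Ims from Im1, embed d1, then hand the resulting image Im to each recipient."""
    im_s = im1.resize((640, 480))                            # step 140: source image preparation
    ImageDraw.Draw(im_s).text((10, 450), d1, fill="white")   # embedding of the textual data d1
    for send in recipients:                                  # step 160: broadcast to the terminals
        send(im_s)
    return im_s

# Usage with a dummy captured image and a file-writing stand-in for a terminal
process_text_data("Hello from Bob",
                  Image.new("RGB", (320, 240), "gray"),
                  [lambda im: im.save("to_T2.png")])
```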
Landscapes
- Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Computer Networks & Wireless Communication (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Human Computer Interaction (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
- Telephonic Communication Services (AREA)
- Data Exchanges In Wide-Area Networks (AREA)
- Information Transfer Between Computers (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR0858907 | 2008-12-22 | ||
PCT/FR2009/052559 WO2010072945A1 (fr) | 2008-12-22 | 2009-12-16 | Procédé et dispositif de traitement de données textuelles |
Publications (1)
Publication Number | Publication Date |
---|---|
EP2368167A1 true EP2368167A1 (de) | 2011-09-28 |
Family
ID=40897699
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP09802187A Withdrawn EP2368167A1 (de) | 2008-12-22 | 2009-12-16 | Verfahren und einrichtung zum verarbeiten von textdaten |
Country Status (4)
Country | Link |
---|---|
US (1) | US8848015B2 (de) |
EP (1) | EP2368167A1 (de) |
JP (1) | JP2012513693A (de) |
WO (1) | WO2010072945A1 (de) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5892021B2 (ja) * | 2011-12-26 | 2016-03-23 | キヤノンマーケティングジャパン株式会社 | 会議サーバ、会議システム、会議サーバの制御方法、プログラムおよび記録媒体 |
JP6089516B2 (ja) * | 2012-09-11 | 2017-03-08 | 沖電気工業株式会社 | 多地点会議サーバ及び多地点会議サーバプログラム、並びに、多地点会議システム |
JP2016015009A (ja) * | 2014-07-02 | 2016-01-28 | ソニー株式会社 | 情報処理システム、情報処理端末、および情報処理方法 |
CN108810443A (zh) * | 2017-04-28 | 2018-11-13 | 南宁富桂精密工业有限公司 | 视频画面合成方法及多点控制单元 |
JP2019049854A (ja) * | 2017-09-10 | 2019-03-28 | 益満 大 | プログラム及び情報処理システム |
US11562124B2 (en) * | 2021-05-05 | 2023-01-24 | Rovi Guides, Inc. | Message modification based on message context |
US11463389B1 (en) * | 2021-05-05 | 2022-10-04 | Rovi Guides, Inc. | Message modification based on device compatability |
US11563701B2 (en) | 2021-05-05 | 2023-01-24 | Rovi Guides, Inc. | Message modification based on message format |
US12015581B1 (en) * | 2023-07-13 | 2024-06-18 | Kyndryl, Inc. | Selectively exclude recipients from an end-to-end encryption enabled group chat |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0898424A2 (de) * | 1993-10-01 | 1999-02-24 | Vicor, Inc. | Gemeinsamer Kollaborationsinitiator in einem Multimedia Kollaborationssystem |
US6069622A (en) * | 1996-03-08 | 2000-05-30 | Microsoft Corporation | Method and system for generating comic panels |
US20020198716A1 (en) * | 2001-06-25 | 2002-12-26 | Kurt Zimmerman | System and method of improved communication |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6253082A (ja) * | 1985-09-02 | 1987-03-07 | Nippon Telegr & Teleph Corp <Ntt> | 多地点間映像会議方式 |
AU2001245426A1 (en) * | 2000-03-03 | 2001-09-17 | Lawrence R. Jones | Picture communications system and associated network services |
JP2002259313A (ja) * | 2001-03-01 | 2002-09-13 | Square Co Ltd | 電子会議方法およびそのシステム |
JP2003219047A (ja) * | 2002-01-18 | 2003-07-31 | Matsushita Electric Ind Co Ltd | 通信装置 |
JP2004128614A (ja) * | 2002-09-30 | 2004-04-22 | Toshiba Corp | 画像表示制御装置及び画像表示制御プログラム |
JP3819852B2 (ja) | 2003-01-29 | 2006-09-13 | 富士通株式会社 | 通信支援方法、通信支援装置、通信支援プログラム及び通信支援プログラムを記録したコンピュータ読み取り可能な記録媒体 |
JP2004274342A (ja) | 2003-03-07 | 2004-09-30 | Cni Kk | Tv会議システム |
JP2004304601A (ja) * | 2003-03-31 | 2004-10-28 | Toshiba Corp | Tv電話装置、tv電話装置のデータ送受信方法 |
JP2005222431A (ja) | 2004-02-09 | 2005-08-18 | Fuji Xerox Co Ltd | 共同作業システム |
JP2005346252A (ja) * | 2004-06-01 | 2005-12-15 | Nec Corp | 情報伝達システムおよび情報伝達方法 |
US8123782B2 (en) | 2004-10-20 | 2012-02-28 | Vertiflex, Inc. | Interspinous spacer |
US20080005269A1 (en) * | 2006-06-29 | 2008-01-03 | Knighton Mark S | Method and apparatus to share high quality images in a teleconference |
US8373799B2 (en) * | 2006-12-29 | 2013-02-12 | Nokia Corporation | Visual effects for video calls |
JP5211557B2 (ja) | 2007-06-15 | 2013-06-12 | 富士通株式会社 | Web会議支援プログラム、該プログラムを記録した記録媒体、Web会議支援装置、およびWeb会議支援方法 |
KR101341504B1 (ko) * | 2007-07-12 | 2013-12-16 | 엘지전자 주식회사 | 휴대 단말기 및 휴대 단말기에서의 멀티미디어 컨텐츠 생성방법 |
-
2009
- 2009-12-16 WO PCT/FR2009/052559 patent/WO2010072945A1/fr active Application Filing
- 2009-12-16 US US13/139,417 patent/US8848015B2/en not_active Expired - Fee Related
- 2009-12-16 JP JP2011541556A patent/JP2012513693A/ja active Pending
- 2009-12-16 EP EP09802187A patent/EP2368167A1/de not_active Withdrawn
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0898424A2 (de) * | 1993-10-01 | 1999-02-24 | Vicor, Inc. | Gemeinsamer Kollaborationsinitiator in einem Multimedia Kollaborationssystem |
US6069622A (en) * | 1996-03-08 | 2000-05-30 | Microsoft Corporation | Method and system for generating comic panels |
US20020198716A1 (en) * | 2001-06-25 | 2002-12-26 | Kurt Zimmerman | System and method of improved communication |
Non-Patent Citations (2)
Title |
---|
KAZUO WATABE ET AL: "Distributed multiparty desktop conferencing system: MERMAID", PROCEEDINGS OF THE 1990 ACM CONFERENCE ON COMPUTER-SUPPORTED COOPERATIVE WORK , CSCW '90, 1 January 1990 (1990-01-01), New York, New York, USA, pages 27 - 38, XP055283211, ISBN: 978-0-89791-402-4, DOI: 10.1145/99332.99338 * |
See also references of WO2010072945A1 * |
Also Published As
Publication number | Publication date |
---|---|
WO2010072945A1 (fr) | 2010-07-01 |
JP2012513693A (ja) | 2012-06-14 |
US20110249083A1 (en) | 2011-10-13 |
US8848015B2 (en) | 2014-09-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2368167A1 (de) | Verfahren und einrichtung zum verarbeiten von textdaten | |
EP2255527B1 (de) | Verfahren zur implementierung von rich-video-inhalten auf mobilen endgeräten | |
CA2284884C (fr) | Systeme de visioconference | |
WO2006134055A1 (fr) | Procede de gestion de l'execution par un serveur d'une application offrant au moins un service multimedia interactif a au moins un terminal, produit programme d'ordinateur et serveur correspondants | |
EP1856901B1 (de) | Verfahren und system zum liefern von informationen an teilnehmer an einem telefongespräch | |
WO2004017636A1 (fr) | Procede de diffusion en temps reel de fichiers multimedias au cours d'une visioconference, sans rupture de communication, et interface homme-machine pour la mise en oeuvre | |
WO2009147348A2 (fr) | Procédé et système d'enregistrement automatique d'une session de communication | |
FR3092718A1 (fr) | Procédé de traitement de flux audiovidéo en conférence multipartite, dispositifs, système et programme correspondants | |
EP2064900B1 (de) | Verfahren zur übertragung eines audiostroms zwischen mehreren endgeräten | |
EP1964363B1 (de) | Verfahren zum transferieren von kommunikationsströmen | |
FR3113447A1 (fr) | Procede de connexion a une visio-conference securisee par authentification forte | |
EP2684353A1 (de) | Vorrichtung und verfahren zum verteilten mischen von datenströmen | |
WO2008046697A1 (fr) | Enrichissement de la signalisation dans une session de communication de type ' push to talk ' par insertion d'une carte de visite | |
WO2007093616A1 (fr) | Procédé et dispositif de gestion d'au moins un groupe d'utilisateurs, produit programme d'ordinateur correspondant | |
WO2007015012A1 (fr) | Service de personnalisation de communications par traitement des flux media audio et/ou video | |
EP2064855B1 (de) | Verfahren zur kommunikation zwischen mehreren endgeräten | |
FR2896648A1 (fr) | Procede et systeme de conversation multimedia | |
FR2776457A1 (fr) | Systeme de visioconference multipoint a presence permanente | |
WO2021255375A1 (fr) | Procede d'acces et dispositif de gestion d'acces a une session de communication securisee entre des terminaux de communication participants par un terminal de communication requerant | |
FR3000357A1 (fr) | Procede de transfert de communication audio et/ou video depuis un premier terminal vers un deuxieme terminal | |
FR2930699A1 (fr) | Negociation optimisee de ressources de codage entre clients de communication | |
FR2992808A1 (fr) | Systeme, serveur, procede, produit programme d'ordinateur et moyen de stockage pour la mise en oeuvre d'une conference multipoints | |
EP1065590A1 (de) | Systemarchitektur zur Bearbeitung von Multimedia Daten | |
FR2865048A1 (fr) | Procede et terminal de transmission d'une carte de visite | |
FR2990821A1 (fr) | Procedes et systemes pour l'acces a une conference electronique |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20110519 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR |
|
DAX | Request for extension of the european patent (deleted) | ||
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: ORANGE |
|
17Q | First examination report despatched |
Effective date: 20150910 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: H04L 12/58 20060101ALI20170712BHEP Ipc: H04N 7/14 20060101ALI20170712BHEP Ipc: G06F 15/16 20060101ALI20170712BHEP Ipc: G06F 3/00 20060101ALI20170712BHEP Ipc: H04L 12/18 20060101AFI20170712BHEP |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
RIN1 | Information on inventor provided before grant (corrected) |
Inventor name: ALLEGRO, STEPHANE |
|
INTG | Intention to grant announced |
Effective date: 20170829 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 20180109 |