WO2008052458A1 - Method, system and terminal for acquiring media feature information - Google Patents

Method, system and terminal for acquiring media feature information

Info

Publication number
WO2008052458A1
WO2008052458A1 (PCT/CN2007/070791)
Authority
WO
WIPO (PCT)
Prior art keywords
media
information
terminal
tag information
shared
Prior art date
Application number
PCT/CN2007/070791
Other languages
English (en)
Chinese (zh)
Inventor
Jian Yang
Guoqiao Chen
Lei Wang
Yi Zhang
Original Assignee
Huawei Technologies Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. filed Critical Huawei Technologies Co., Ltd.
Priority to CN200780000333.2A priority Critical patent/CN101317434B/zh
Publication of WO2008052458A1 publication Critical patent/WO2008052458A1/fr

Links

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00: Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/10: Architectures or entities
    • H04L65/1016: IP multimedia subsystem [IMS]
    • H04L65/1066: Session management
    • H04L65/1083: In-session procedures
    • H04L65/1095: Inter-network session transfer or sharing
    • H04L65/1101: Session protocols
    • H04L65/1104: Session initiation protocol [SIP]
    • H04L65/40: Support for services or applications
    • H04L65/401: Support for services or applications wherein the services involve a main real-time session and one or more additional parallel real-time or time sensitive sessions, e.g. white board sharing or spawning of a subconference

Definitions

  • the present invention relates to network communication technologies, and more particularly to a method and system for acquiring media feature information and a terminal device.

Background of the invention
  • the traditional circuit service allows the user to use only one type of service in each session. For example, suppose an end user is making a voice call. If the user then wants to send an instant message to the other party, the user must end or pause the current voice call in order to send the instant message from the same terminal.
  • CSI: Combination of CS and IMS services
  • IMS: IP Multimedia Subsystem
  • FIG. 1 is a schematic flowchart of a conventional CSI service call setup. As shown in Figure 1, the following steps are included:
  • Step 101 The user terminal (UE) A and the user terminal B establish a circuit domain call.
  • Step 102 User terminal A and user terminal B perform IMS capability interaction.
  • the interactive content includes the CSI terminal's Mobile Station Integrated Services Digital Network number (MSISDN), its Session Initiation Protocol Uniform Resource Identifier (SIP-URI), and terminal capability information.
  • terminal capability information is used to identify a service set that can be successfully invoked when an IMS session is established between users, including an IMS streaming media type, a media format parameter supported by the IMS media type, such as a codec format and a media file format.
  • the user terminal can also exchange the following capability information through the IMS network: the capability for circuit domain video telephony, the capability for circuit domain voice telephony, the capability for MMS, and other IMS-based service capabilities, such as push-to-talk over cellular (PoC, Push-to-talk Over Cellular).
  • Step 103 The user terminal A triggers the IMS service, and sends a session invitation (INVITE) message to the user terminal B through the IMS network A, and the media attribute is marked as "inactive".
  • Step 104 IMS network A transmits an INVITE message to IMS network B.
  • Step 105 The IMS network B transmits the INVITE message to the user terminal B.
  • Step 106 Based on the correspondence between the MSISDN and the SIP-URI obtained through the IMS capability interaction in step 102, user terminal B associates the received INVITE message with the CS service in use.
  • the user terminal B side then directly initializes the Internet Protocol connectivity access network (IP-CAN, IP-Connectivity Access Network) and establishes the bearer; otherwise, establishment starts from step 115.
  • Step 107 User terminal B sends a 200 OK message to IMS network B, and the media attribute is marked as "inactive".
  • Step 108 IMS network B sends a 200 OK message to IMS network A.
  • Step 109 The IMS network A sends a 200 OK message to the user terminal A.
  • Step 110 User terminal A sends a response message to IMS network A.
  • the user terminal A side initializes the media attribute and establishes the IP-CAN bearer.
  • Step 111 IMS network A sends a response message to IMS network B.
  • Step 112 The IMS network B sends a response message to the user terminal B.
  • Step 113 After the IP-CAN is established on the user terminal A side, the user terminal A sends an INVITE message to the IMS network A again, and the media attribute is marked as "active".
  • Step 114 IMS network A sends an INVITE message to IMS network B.
  • Step 115 IMS network B sends an INVITE message to user terminal B.
  • Step 116 The user terminal B performs a corresponding business action.
  • a real-time transport protocol (RTP) media channel is established, a codec device is started, and a user media stream is received or transmitted.
  • Step 117 After the IP-CAN bearer is established on the user terminal B side, user terminal B sends a 200 OK message to IMS network B, and the media attribute is marked as "active".
  • Step 118 IMS network B sends a 200 OK message to IMS network A.
  • Step 119 The IMS network A sends a 200 OK message to the user terminal A.
  • Step 120 After receiving the 200 OK message, the user terminal A sends a response message to the IMS network A.
  • Step 121 IMS network A sends a response message to IMS network B.
  • Step 122 The IMS network B sends a response message to the user terminal B.
  • Step 123 The multimedia session of the two parties is established.
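The two-phase media negotiation above, an INVITE with the media attribute marked "inactive" (step 103) followed by a second INVITE marking it "active" (step 113) once the IP-CAN bearer is ready, can be sketched as SDP construction. This is an illustrative sketch only; the address, port, and codec below are placeholder assumptions, not values from the patent.

```python
def build_sdp(media_port: int, active: bool) -> str:
    """Build a minimal SDP body for the CSI video session.

    The first INVITE marks the media "inactive"; once the IP-CAN bearer
    is ready, the re-INVITE marks it active (conventionally a=sendrecv).
    Origin, connection address, and codec are placeholders.
    """
    direction = "sendrecv" if active else "inactive"
    lines = [
        "v=0",
        "o=- 0 0 IN IP4 192.0.2.1",      # placeholder origin
        "s=CSI video share",
        "c=IN IP4 192.0.2.1",            # placeholder connection address
        "t=0 0",
        f"m=video {media_port} RTP/AVP 96",
        "a=rtpmap:96 H263-2000/90000",   # example codec mapping
        f"a={direction}",
    ]
    return "\r\n".join(lines) + "\r\n"

first_invite_sdp = build_sdp(49170, active=False)  # step 103
re_invite_sdp = build_sdp(49170, active=True)      # step 113
```

In standard SDP the held state is expressed with `a=inactive` and the fully active state with `a=sendrecv`, which is how the patent's "media attribute marked inactive/active" is rendered here.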
  • Video sharing technology is the application of CSI technology, allowing users to share a video while making a voice call.
  • FIG. 2 is a schematic structural diagram of a system for implementing video sharing. As shown in Figure 2, the system consists of user terminal A, user terminal B and communication network.
  • the user terminal includes two parts, a voice communication module and a video sharing client: the voice communication module is used to complete the user's voice session in the circuit domain, transmitting the voice information of both communicating parties over the traditional circuit network;
  • the video sharing client is used for video sharing.
  • the sender terminal can send a video or other multimedia file to the receiver terminal over the packet domain network through its own video sharing client; the receiver terminal receives the transmitted video through its own video sharing client and watches it.
  • the shared video may be existing video information stored on the sender terminal, or may be video information actually captured by the sender terminal device.
  • when video sharing is implemented, the receiver user can only passively receive the video sent by the sender user, and cannot know whether the video is captured by the sender user in real time or pre-exists on the sender terminal.
  • the receiver user needs an objective way to know the feature information of the received video, such as whether the video is captured in real time and whether information has been superimposed on it, but the existing video sharing technology cannot satisfy such a request.

Summary of the invention
  • the embodiment of the invention provides a method for acquiring media feature information, which enables the media sharing recipient user to know the feature information of the received media.
  • a method for acquiring media feature information includes: generating, by a sender terminal, media tag information that identifies the shared media feature information, and sending the media tag information to a receiver terminal; and receiving and acquiring, by the receiver terminal, the media tag information content.
  • Embodiments of the present invention provide a system for acquiring media feature information, which enables a media sharing recipient user to learn feature information of a received media.
  • a system for acquiring media feature information including a communication network, a sender terminal, and a receiver terminal;
  • the communication network is configured to transmit a circuit domain and a packet domain session information of a sender terminal and a receiver terminal in a communication process;
  • the sender terminal is configured to generate media tag information that identifies the shared media feature information, and send the generated media tag information to the receiver terminal.
  • the receiver terminal is configured to receive and acquire media tag information content from the sender terminal.
  • the embodiment of the invention provides a terminal device for acquiring media feature information, and the device is used to enable the media sharing receiver user to know the feature information of the received media.
  • a terminal device for acquiring media feature information including a generating unit, a sending unit, and a receiving unit;
  • the generating unit is configured to generate media tag information that identifies the shared media feature information; the sending unit is configured to send the media tag information to a receiver terminal; and the receiving unit is configured to receive and acquire the media tag information content from the corresponding sender terminal.
  • the user acting as the media sender generates, by means of the terminal device, media tag information that identifies the shared media feature information, and sends the media tag information to the receiver terminal either actively or at the receiver's request during media sharing.
  • the receiver terminal receives and acquires the media tag information content, so that the receiver user can learn the feature information of the shared media.
  • FIG. 1 is a schematic flowchart of a conventional CSI service call setup
  • FIG. 2 is a schematic structural diagram of a system for implementing video sharing
  • FIG. 3 is a schematic structural diagram of an embodiment of a system of the present invention.
  • FIG. 4 is a schematic structural diagram of an embodiment of a terminal device according to the present invention.
  • Figure 5 is a flow chart of an embodiment of a method of the present invention.
  • Figure 6 is a flow chart of a preferred embodiment of the method of the present invention.
  • FIG. 7 is a schematic diagram of carrying video tag information in signaling through a CSI service call setup process according to an embodiment of the present invention.

Mode for carrying out the invention
  • the sender terminal generates media tag information identifying the shared media feature information, and transmits the media tag information to the receiver terminal; and the receiver terminal receives and acquires the media tag information content.
  • FIG. 3 is a schematic structural view of an embodiment of a system of the present invention. As shown in FIG. 3, the system is mainly composed of a communication network, a sender terminal, and a receiver terminal.
  • the communication network is configured to transmit circuit domain and packet domain session information of the sender terminal and the receiver terminal during communication;
  • the sender terminal is configured to generate media tag information that identifies the shared media feature information, and to send the generated media tag information to the receiver terminal; the receiver terminal is configured to receive and acquire the media tag information content from the sender terminal.
  • FIG. 4 is a schematic structural diagram of an embodiment of a user terminal device according to the present invention. As shown in FIG. 4, the user terminal device includes: a generating unit 41, a sending unit 42 and a receiving unit 43;
  • the generating unit 41 is configured to generate media tag information that identifies the shared media feature information;
  • the sending unit 42 is configured to send the generated media tag information to the receiver terminal, for example in a customized message format, or carried in an existing message or in the streaming media information sent to the receiver terminal.
  • the receiving unit 43 is configured to receive and acquire media tag information content from the corresponding sender terminal.
  • the generating unit 41 specifically includes a determining subunit 411 and a generating subunit 412: the determining subunit 411 is configured to determine whether the shared media information is real-time information; the generating subunit 412 is configured to generate media tag information according to the determination result.
  • the receiving unit 43 specifically includes a receiving subunit 431 and a parsing subunit 432: the receiving subunit 431 is configured to receive media tag information from the corresponding sender terminal; the parsing subunit 432 is configured to parse the media tag information to obtain the media tag information content.
  • the receiving unit 43 may further include: a processing subunit 433, configured to display media tag information content according to the parsing result, or play the received shared media or store as a file according to the parsing result.
  • the device shown in FIG. 4 further includes a voice communication unit 44 for transmitting or receiving voice information during communication; and a media sharing client 45 for transmitting or receiving shared media information.
  • each unit in the user terminal device of the embodiment of the present invention may be uniformly incorporated into the media sharing client, and the functions thereof are similar to those of the respective units, and are not described herein again.
  • the content of the media tag information in the embodiments shown in FIG. 3 and FIG. 4 may include real-time/non-real-time status, the media format, an instruction requiring the receiver terminal to play the received shared media or save it as a file, whether saving is allowed, and other information.
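As a rough illustration, the tag content enumerated above can be modelled as a small structure with a trivial serialization suitable for a customized message format. All field and key names here are invented for this sketch; the patent does not define a concrete wire format.

```python
from dataclasses import dataclass

@dataclass
class MediaTagInfo:
    """Illustrative model of the media tag information content
    (real-time status, media format, play-or-save instruction,
    save permission). Names are assumptions, not from the patent."""
    real_time: bool       # real-time vs. non-real-time media
    media_format: str     # e.g. a codec or file format label
    play_only: bool       # receiver must play rather than save as a file
    allow_saving: bool    # whether saving is permitted at all

    def serialize(self) -> str:
        # Simple key=value;... form for a customized message body.
        return (f"rt={int(self.real_time)};fmt={self.media_format};"
                f"play={int(self.play_only)};save={int(self.allow_saving)}")

    @classmethod
    def parse(cls, text: str) -> "MediaTagInfo":
        fields = dict(item.split("=", 1) for item in text.strip().split(";"))
        return cls(real_time=fields["rt"] == "1",
                   media_format=fields["fmt"],
                   play_only=fields["play"] == "1",
                   allow_saving=fields["save"] == "1")
```

A round trip (`MediaTagInfo.parse(tag.serialize())`) recovers the original structure, which is all the receiver-side parsing subunit would need from such a format.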
  • the shared media can be information such as video, pictures, or sound. The following describes the solution of the embodiments of the present invention in detail, taking video information as the shared media and real-time/non-real-time information as the media tag information content as an example:
  • the determining subunit 411 in the sender terminal judges the feature information of the shared video, such as its real-time/non-real-time nature. This can be implemented by monitoring which application is enabled at the video output end, that is, by determining whether the video information comes from an existing video storage unit in the terminal or from the output of a camera or similar program, and thereby judging whether the video is real-time.
  • the generating subunit 412 generates video tag information according to the received determination result and transmits the generated video tag information to the sending unit 42; the sending unit 42 transmits the video tag information to the receiving subunit 431 in the receiving unit 43 of the receiver terminal.
  • the receiving subunit 431 sends the received video tag information to the parsing subunit 432; the parsing subunit 432 parses the video feature information out of the video tag information and sends it to the processing subunit 433; the processing subunit 433 displays the video feature information in a way both parties can understand. For example, a small icon can appear on the screen, with different icon patterns indicating the real-time or non-real-time nature of the video information, or text on the screen can explain it directly.
  • in the above process, the video tag information is transmitted and received through the sending unit 42 and the receiving unit 43 in the terminal device; the video tag information can also be sent and received by calling an existing message communication module, for example a short message service (SMS), multimedia message service (MMS), or Session Initiation Protocol (SIP) module.
  • existing short message signaling generally includes reserved bits, which can be used to identify video feature information, for example 0 for real-time and 1 for non-real-time; when the short message reaches the receiver terminal, the required video tag information is parsed from the short message and the subsequent processing flow continues.
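A hedged sketch of the reserved-bit idea: which octet and bit of a short message are actually reserved depends on the SMS encoding in use, so the position used below (octet 0, bit 0) is purely an assumption for illustration.

```python
def set_realtime_flag(pdu: bytearray, real_time: bool) -> bytearray:
    """Mark video feature information in a reserved bit of a message.

    Following the convention described in the text: 0 = real-time,
    1 = non-real-time. The chosen octet/bit is illustrative only.
    """
    if real_time:
        pdu[0] &= ~0x01   # clear the bit: 0 means real-time
    else:
        pdu[0] |= 0x01    # set the bit: 1 means non-real-time
    return pdu

def is_realtime(pdu: bytes) -> bool:
    """Receiver side: parse the same reserved bit back out."""
    return (pdu[0] & 0x01) == 0
```

The receiver terminal would call `is_realtime` on the incoming message and continue with its normal display flow.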
  • the shared video stream may also be used to carry the video tag information, and details are not described herein.
  • FIG. 5 is a flowchart of an embodiment of the method of the present invention. As shown in Figure 5, the following steps are included:
  • Step 501 The sender terminal generates media tag information that identifies the shared media feature information, and sends the media tag information to the receiver terminal.
  • the sender terminal may actively send the media tag information to the receiver terminal, or may send it after receiving a shared media feature information request from the receiver terminal.
  • the manner in which the receiver terminal sends the request information may be determined by the specific situation; the simplest way, for example, is for the receiving user to make the request verbally during the two-party conversation.
  • Step 502 The receiver terminal receives and acquires the content of the media tag information.
  • FIG. 6 is a flow chart of a preferred embodiment of the method of the present invention. It is assumed that the shared media is video information and the media mark information content is real-time/non-real-time information in this embodiment. As shown in Figure 6, the following steps are included:
  • Step 601 The sender terminal determines the feature information of the shared video.
  • the feature information of the shared video is real-time/non-real-time information.
  • the sender terminal can determine whether the shared video is real-time by monitoring which application is enabled at the video output end, that is, by determining whether the shared video comes from an existing storage device on the terminal or from the output of a camera or similar program, and thereby judging whether the shared video is captured in real time.
  • Step 602 The sender terminal generates video tag information according to the determination result.
  • if the determination result indicates that the video is captured in real time, the sender terminal generates video tag information identifying it as real-time; if the determination result indicates that the video is non-real-time, the sender terminal generates video tag information identifying it as non-real-time.
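Steps 601 and 602 can be condensed into one illustrative function. Representing "which application is enabled at the video output end" as a simple source label, and the tag strings themselves, are assumptions of this sketch, not the patent's definitions.

```python
def generate_video_tag(source: str) -> str:
    """Determine the shared video's real-time nature from its source
    and generate the corresponding video tag information.

    'camera' models output from a camera-like program (captured in real
    time); 'storage' models an existing file on the terminal.
    """
    if source == "camera":
        return "video-tag: real-time"
    if source == "storage":
        return "video-tag: non-real-time"
    raise ValueError(f"unknown video source: {source}")
```

The sender terminal would then hand the returned tag string to its sending unit (step 603).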
  • Step 603 The sender terminal sends the generated video tag information to the receiver terminal.
  • the sender terminal may send the generated video tag information to the receiver terminal in a customized message format.
  • the video tag information may, most simply, use a signaling format consisting of two parts, a signaling header and a signaling entity, with the signaling entity part using 0 and 1 to represent real-time and non-real-time respectively. Existing signaling such as SMS, MMS, or SIP messages, or the shared video stream itself, can also carry the video tag information, which is then sent to the receiver terminal.
  • Step 604 The receiver terminal receives and parses the video tag information.
  • Step 605 The receiver terminal displays the feature information of the shared video according to the parsing result.
  • the specific display mode may be any way both parties understand; for example, a small icon may appear on the screen, with different icon patterns indicating the real-time or non-real-time nature of the video information, or text on the screen may explain it directly.
  • in general, after the multimedia session establishment of the sender terminal and the receiver terminal is completed, that is, after step 123 shown in FIG. 1, the sender terminal starts to send the shared video to the receiver terminal and then sends the generated video tag information. In an actual application, however, the generated video tag information may instead be sent to the receiver terminal in some signaling during the multimedia session establishment process, before the session is established. After receiving the video tag information, the receiver terminal may parse and display it immediately, or may wait until the subsequent multimedia session is established and, after receiving the shared video stream sent by the sender terminal, display it together with the shared video. The specific display method is not limited.
  • FIG. 7 is a schematic diagram of carrying video tag information by using signaling in a CSI service call setup process according to an embodiment of the present invention.
  • the sender terminal carries the generated video tag information in the signaling shown in FIG. 7 and sends it to the receiver terminal.
  • before starting the signaling interaction, user terminal a, that is, the sender terminal, and user terminal b, that is, the receiver terminal, first perform capability interaction.
  • the sender terminal or the receiver terminal sends a SIP OPTIONS message to the other party, carrying a Video Share flag in the User-Agent header field of the OPTIONS message to indicate that it supports the Video Share service.
  • the corresponding function can also be represented in other ways, that is, by extending different tag names and identifiers; the function and principle remain extending a header field to indicate the terminal's support for real-time or non-real-time shared video, only the specific implementation differs. This extension can also be applied to other header fields, such as the Accept header field.
  • real-time/non-real-time shared video support capability can also be indicated by extending new Feature Tags; for example, +g.3gpp.csisc.clip and +g.3gpp.csics.live could indicate the terminal's support for non-real-time (clip) and real-time (live) shared video, respectively.
  • the Feature tag can also be placed in other header fields.
  • the terminal that receives the OPTIONS message sends a 200 OK message to the peer, carrying the Video Share flag in the User-Agent header field of the 200 OK message to indicate that it supports the Video Share service. In this way, through the capability interaction, each party learns whether the other supports the Video Share service.
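The capability exchange could look roughly like the following. The exact User-Agent token announcing Video Share support is an assumption of this sketch; the patent only says a "Video Share flag" is carried in the User-Agent header field.

```python
def build_options_headers(user_agent_base: str) -> dict:
    """Headers for the capability-exchange OPTIONS message.

    The 'VideoShare' token is an illustrative stand-in for the
    Video Share flag described in the text.
    """
    return {
        "User-Agent": f"{user_agent_base} VideoShare",
        "Accept": "application/sdp",
    }

def peer_supports_video_share(response_headers: dict) -> bool:
    """Inspect a 200 OK's headers: a User-Agent carrying the flag
    means the peer supports the Video Share service."""
    return "VideoShare" in response_headers.get("User-Agent", "")
```

Each side sends OPTIONS with these headers and inspects the peer's 200 OK the same way, so both parties learn the other's support before the CSI call setup begins.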
  • the CSI service call setup process shown in Figure 7 specifically includes the following steps:
  • Steps 701 to 702 The sender terminal sends an INVITE request to the receiver terminal through the application server (AS, Application Server), where the media type information to be used is carried.
  • for example, a SubAPPName live or SubAPPName clip identifier (alongside a feature tag such as +g.3gpp.cs-voice) identifies the shared video to be sent as real-time or non-real-time.
  • Steps 703 to 705 After receiving the INVITE request, the receiver terminal sends a 183 message to the AS, and carries the media stream type and coding mode that the receiver terminal can receive in the session description protocol (SDP) of the message.
  • the 183 message requires the sender terminal to perform resource reservation while the receiver terminal performs its own resource reservation at the same time; after receiving the 183 message, the AS sends a PRACK message to the receiver terminal, and the receiver terminal returns a 200 OK message for the PRACK.
  • Steps 706 to 708 The AS sends the 183 message to the sender terminal; the sender terminal sends a PRACK message to the AS; and the AS returns a 200 OK message for the PRACK.
  • Step 706 can be performed after step 703, and there is no timing relationship with steps 704 and 705.
  • Steps 709 to 710 The sender terminal's resource reservation succeeds, and it sends an update (UPDATE) message to the receiver terminal through the AS.
  • the video tag information can also be carried in the UPDATE message in this step; the specific carrying mode is similar to that of the INVITE request and is not described here.
  • Steps 711 to 712 The receiver terminal receives the UPDATE message; its resource reservation succeeds, and it sends a 200 OK message to the sender terminal.
  • Steps 713 to 714 The receiver terminal sends a 180 message to the sender terminal, indicating that the receiver terminal has started ringing.
  • Steps 715 to 716 The receiver terminal sends a 200 OK message to the sender terminal in response to the INVITE message.
  • Steps 717 to 718 The sender terminal sends an ACK message to the receiver terminal to confirm the session establishment. At this point, the multimedia session is established; in the subsequent process, the sender terminal and the receiver terminal perform real-time video sharing through RTP messages.
  • Steps 719 to 722 The sender terminal sends a BYE message to the receiver terminal to end the video sharing; the receiver terminal returns a 200 OK response to the BYE message.
  • in the above embodiment, the video tag information is carried by extending a SIP header field at the signaling layer. The video tag information can also be carried in the SDP body of the signaling shown in FIG. 7, for example the SDP of the INVITE request: a new "a" attribute can be added under the video media description to carry the video tag information, such as an attribute "a=sourctype:live" to identify the shared video as real-time video. The specific usage can be determined according to actual needs.
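Parsing the proposed SDP extension attribute might look like this (keeping the source text's spelling "sourctype"; the attribute is the patent's own extension, not a standard SDP attribute):

```python
from typing import Optional

def shared_video_is_realtime(sdp: str) -> Optional[bool]:
    """Look for the extension attribute 'a=sourctype:...' in an SDP body.

    Returns True for a live (real-time) video, False for a stored clip,
    and None when the attribute is absent.
    """
    for line in sdp.splitlines():
        if line.startswith("a=sourctype:"):
            return line.split(":", 1)[1].strip() == "live"
    return None
```

A receiver terminal would run this over the SDP of the incoming INVITE (or UPDATE) and fall back to its default behavior when the attribute is missing.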
  • the above embodiments take real-time/non-real-time information as the video tag information content as an example. The video tag information may also indicate the media format, whether the receiver terminal is required to play the received shared video or save it as a file, whether saving is allowed, and other information.
  • after receiving the video tag information, the receiver terminal performs a corresponding operation according to the parsed video tag information content; the specific operation depends on the content received.
  • for the embodiment shown in FIG. 6, where the video tag information is real-time/non-real-time information, the corresponding operation is to display the real-time or non-real-time nature of the video information with different icon patterns on the screen of the receiver terminal. If the video tag information identifies that the receiver terminal is required to play the received shared video or save it as a file, the receiver terminal's operation is to play the subsequently received shared video or store it as a file. Other situations are not described again.
  • the above applies equally to the technical solutions in the embodiments shown in FIG. 6 and FIG. 7. It can be seen that with the technical solution of the embodiments of the present invention, a user acting as the media sender uses the terminal device to generate media tag information that identifies the shared media feature information and, during media sharing, sends it to the receiver terminal either actively or at the receiver terminal's request;
  • the receiver terminal receives and acquires the media tag information content and performs operations such as display according to the acquired content, so that the receiving user learns the feature information of the currently shared media.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Telephonic Communication Services (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

The present invention provides a method and system for acquiring media feature information. A user acting as the media sender generates, by means of the terminal, media tag information identifying the shared media feature information, and transmits this media tag information to the receiver terminal either actively or at the receiver's request during media sharing; the receiver terminal receives and acquires the media tag information content, so that the receiving user can learn the shared media feature information. A terminal for acquiring media feature information is also provided; a receiving user of media sharing can learn the shared media feature information by using the terminal.
PCT/CN2007/070791 2006-10-30 2007-09-26 Procédé, système et terminal d'acquisition d'informations de caractères médiatiques. WO2008052458A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN200780000333.2A CN101317434B (zh) 2006-10-30 2007-09-26 一种获取媒体特征信息的方法和系统以及终端设备

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN200610150365.8A CN100581197C (zh) 2006-10-30 2006-10-30 一种获取媒体特征信息的方法和系统以及终端设备
CN200610150365.8 2006-10-30

Publications (1)

Publication Number Publication Date
WO2008052458A1 true WO2008052458A1 (fr) 2008-05-08

Family

ID=38943574

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2007/070791 WO2008052458A1 (fr) 2006-10-30 2007-09-26 Procédé, système et terminal d'acquisition d'informations de caractères médiatiques.

Country Status (3)

Country Link
CN (2) CN100581197C (fr)
DE (1) DE102007051828B4 (fr)
WO (1) WO2008052458A1 (fr)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110088076A1 (en) * 2009-10-08 2011-04-14 Futurewei Technologies, Inc. System and Method for Media Adaptation
CN102685563B (zh) * 2011-03-15 2015-11-25 华为终端有限公司 互联网协议电视内容共享方法、装置以及终端设备
CN103078851B (zh) * 2012-12-28 2016-09-07 Tcl集团股份有限公司 消息接收、发送方法、消息交互系统及dlan设备
CN105009543A (zh) * 2013-01-31 2015-10-28 诺基亚技术有限公司 媒体项目的递送
CN105472296B (zh) * 2014-09-09 2019-02-05 联想(北京)有限公司 实时性校验方法和装置
US9112849B1 (en) * 2014-12-31 2015-08-18 Spotify Ab Methods and systems for dynamic creation of hotspots for media control

Citations (3)

Publication number Priority date Publication date Assignee Title
CN1149795A (zh) * 1995-11-02 1997-05-14 邝冬英 多媒体数字传输广播系统
JP2004214916A (ja) * 2002-12-27 2004-07-29 Yamaha Corp ファイル作成端末
CN1767446A (zh) * 2004-10-28 2006-05-03 华为技术有限公司 一种增值业务请求的处理方法

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
GB9826157D0 (en) * 1998-11-27 1999-01-20 British Telecomm Announced session control
US20040184432A1 (en) * 2003-03-19 2004-09-23 Ralitsa Gateva Method for controlling streaming services
US20050004968A1 (en) * 2003-07-02 2005-01-06 Jari Mononen System, apparatus, and method for a mobile information server


Also Published As

Publication number Publication date
DE102007051828A1 (de) 2008-05-08
CN101317434A (zh) 2008-12-03
CN101317434B (zh) 2013-03-20
CN101090415A (zh) 2007-12-19
DE102007051828B4 (de) 2014-07-03
CN100581197C (zh) 2010-01-13

Similar Documents

Publication Publication Date Title
US8195147B2 (en) Method of enabling a combinational service and communication network implementing the service
US8483378B2 (en) Method and system for implementing multimedia ring back tone service and multimedia caller identification service
US11108838B2 (en) Method, user equipment and application server for adding media stream of multimedia session
JP5363461B2 (ja) グループ呼機能の問い合わせ
KR100871237B1 (ko) 무선통신 시스템에서 이동 단말의 얼라팅 정보 송수신 시스템 및 방법
JP5628296B2 (ja) セッションプッシュ伝送
JP2008523662A (ja) 画像ベースのプッシュ・ツー・トークのユーザインタフェース向き画像交換方法
WO2005027460A1 (fr) Services multimedia combines
TW200427268A (en) Method and system for group communications
US9246955B2 (en) Capability query handling in a communication network
WO2008052458A1 (fr) Procédé, système et terminal d'acquisition d'informations de caractères médiatiques.
RU2526710C2 (ru) Способ и система передачи вызова по протоколу sip с помощью абонентской приставки
EP2627100A1 (fr) Procédé et dispositif pour l'affichage d'information
WO2008006311A1 (fr) Procédé et dispositif d'utilisation d'un identificateur de terminal utilisateur
WO2009089797A1 (fr) Procédé de mise en oeuvre de service de tonalité de retour d'appel et/ou de tonalité de retour d'appel multimédia et de production de demande sdp multimédia anticipée
US20120190347A1 (en) Method and System for Representing Multimedia Ring Tone For IM
WO2011018008A1 (fr) Procédé, dispositif et système de commande de service d'interface i1
WO2010043168A1 (fr) Procédé d'envoi et de réception de fichier de tonalité d'appel multimédia
FR2907621A1 (fr) Enrichissement de la signalisation dans une session de communication de type "push to talk" par insertion d'une carte de visite
WO2012034423A1 (fr) Procédé et système de reproduction de contenu multimédia précoce dans une session
WO2017000481A1 (fr) Procédé et appareil de composition de numéro pour un appel vocal
WO2017000781A1 (fr) Procédé et appareil de communication vidéo
KR100963010B1 (ko) 스마트 카드를 이용한 sip 기반 영상통화 서비스 시스템및 그 방법
TWI281816B (en) The communication method and system of the internet phone
WO2008067757A1 (fr) Procédé, système et serveur d'applications de traitement de service

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200780000333.2

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07816982

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07816982

Country of ref document: EP

Kind code of ref document: A1