CN101695090B - Method for realizing real-time sharing of audio and video of mobile terminal and mobile terminal - Google Patents


Info

Publication number
CN101695090B
CN101695090B · CN200910205518.8A
Authority
CN
China
Prior art keywords
data
audio
video data
time
video
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN200910205518.8A
Other languages
Chinese (zh)
Other versions
CN101695090A (en)
Inventor
王冰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ZTE Corp
Original Assignee
ZTE Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ZTE Corp filed Critical ZTE Corp
Priority to CN200910205518.8A priority Critical patent/CN101695090B/en
Publication of CN101695090A publication Critical patent/CN101695090A/en
Application granted granted Critical
Publication of CN101695090B publication Critical patent/CN101695090B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Abstract

The invention provides a method for real-time sharing of audio and video between mobile terminals, and a corresponding mobile terminal. The method comprises the following steps: A. a sending terminal records the audio-video data to be played, compresses and encodes the data, and transmits it to a receiving terminal; and B. the receiving terminal decompresses and decodes the received audio-video data, then invokes the corresponding driver interfaces to play it. With the invention, the receiving terminal can watch a file in synchronization with the sending terminal even when it lacks a decoder for the file being displayed, and the receiving terminal can display and play, in real time, whatever content the sending terminal is displaying or playing, whether or not that content is stored as a file.

Description

Method for realizing real-time audio and video sharing between mobile terminals, and mobile terminal
Technical field
The present invention relates to the field of mobile communications, and in particular to a method and a mobile terminal for realizing real-time audio and video sharing between mobile terminals.
Background technology
Media playback is one of the key functions of current mobile terminals, and media sharing is becoming increasingly important as an aspect of resource sharing. At present, real-time sharing of media such as audio and video between mobile terminals is generally achieved by first transferring the media file to the receiving terminal, which then decodes and plays it. However, if the receiving terminal does not support decoding of the media file in question, it cannot play the file at all. Moreover, because the file transfer takes time, the receiving terminal must wait for the download to finish before playback can begin. Finally, a user who wants to share what the terminal is currently playing or displaying may first have to record that content into a file and then transfer it, which defeats real-time, synchronous sharing.
Among the existing patent literature, patent application CN200810066097.0 proposes "a method for sharing the playback of multimedia files, and a multimedia player". Although the proposed method and player can buffer the media in the player as a real-time stream, the player itself must still decode the media, which places high demands on its decoding capability, and file formats the player does not support cannot be shared at all. Furthermore, that application does not address real-time sharing of non-file content: for example, how a terminal shares, in real time, video or audio received by its CMMB (China Mobile Multimedia Broadcasting) receiver or radio receiver with another terminal, or how it shares the content shown on its LCD (Liquid Crystal Display) or the sound it is producing.
Summary of the invention
The technical problem to be solved by the invention is to provide a method and a mobile terminal for real-time audio and video sharing between mobile terminals, so that the audio-video data played by a sending terminal is transmitted to a receiving terminal in synchronization.
To solve the above technical problem, the invention provides a method for realizing real-time audio and video sharing between mobile terminals, comprising the following steps:
A. the sending terminal records the audio-video data to be played, compresses and encodes the data, and transmits it to the receiving terminal;
B. the receiving terminal decompresses and decodes the received audio-video data, then calls the corresponding driver interfaces to play it.
Further, in the above method, step A is specifically implemented as follows:
the sending terminal extracts the audio-video data to be played at a given rate; the data extracted in adjacent Δt intervals is stored, in chronological order, in different buffers, forming a sequence of Δt data packets;
the Δt data packets in the different buffers are compressed and encoded in turn;
the compressed and encoded Δt data packets are transmitted to the receiving terminal.
Further, in the above method, the extraction of audio-video data is specifically implemented as follows:
video data is sampled periodically at the extraction period, while the audio data within each extraction period is captured in full.
Further, the above method additionally comprises, after the Δt data packets in the different buffers have been compressed and encoded in turn:
storing the compressed and encoded Δt data packets in a data queue.
Further, in the above method, step B is specifically implemented as follows:
the receiving terminal receives the Δt data packets sent by the sending terminal;
the Δt data packets are decompressed and decoded, then stored alternately in different buffers;
the audio-video data in each Δt data packet is taken from the buffers in alternation, separated and adjusted, and then the LCD driver interface and the CODEC driver interface are called to play the video and audio respectively.
The invention also provides a mobile terminal, comprising:
a data recording module, which records the audio-video data of the mobile terminal and then sends a message to a first data processing module;
the first data processing module, which, upon receiving the message sent by the data recording module, compresses and encodes the audio-video data and then notifies a data sending module;
the data sending module, which, upon receiving the notification sent by the first data processing module, sends the compressed and encoded audio-video data.
Further, the above mobile terminal has the following features:
the data recording module specifically extracts the terminal's own audio-video data at a given rate, stores the data extracted in adjacent Δt intervals in different buffers in chronological order to form a sequence of Δt data packets, and sends a message to the first data processing module each time a Δt data packet is formed;
the first data processing module, upon receiving each such message, compresses and encodes the Δt data packets in the different buffers in turn and then notifies the data sending module;
the data sending module, upon receiving the notification, sends the compressed and encoded Δt data packets.
Further, the above mobile terminal also comprises:
a data receiving module, which receives audio-video data sent by another mobile terminal and then sends a message to a second data processing module;
the second data processing module, which, upon receiving the message, decompresses and decodes the audio-video data and then notifies a driver interface module;
the driver interface module, which, upon receiving the notification, calls the corresponding driver interfaces to display and play the audio-video data.
Further, the above mobile terminal has the following features:
the audio-video data received by the data receiving module takes the form of second Δt data packets;
the second data processing module specifically decompresses and decodes the second Δt data packets and stores them alternately in different buffers;
the driver interface module specifically takes the audio-video data of each second Δt data packet from the buffers in alternation, separates and adjusts it, and calls the LCD driver interface and the CODEC driver interface to play it.
The invention also provides a mobile terminal, comprising:
a data receiving module, which receives audio-video data sent by another mobile terminal and then sends a message to a third data processing module;
the third data processing module, which, upon receiving the message, decompresses and decodes the audio-video data and then notifies a driver interface module;
the driver interface module, which, upon receiving the notification, calls the corresponding driver interfaces to play the audio-video data.
In summary, with the method and mobile terminal for real-time audio and video sharing provided by the invention, the receiving terminal can watch a file in synchronization with the sending terminal even without a decoder for that file; it can play media in synchronization even without the sending terminal's media hardware (such as a CMMB receiver, camera, or radio receiver); and it can display and play, in real time, whatever content the sending terminal is displaying or playing, whether or not that content is stored as a file. Real-time audio and video sharing between mobile terminals increases communication between users and enriches the user's multimedia experience.
Brief description of the drawings
The accompanying drawings are provided for a further understanding of the invention and form part of the specification; together with the embodiments, they serve to explain the invention and are not to be construed as limiting it. In the drawings:
Fig. 1 is a schematic diagram of a mobile terminal (sending terminal) according to an embodiment of the invention;
Fig. 2 is a schematic diagram of a mobile terminal (receiving terminal) according to an embodiment of the invention;
Fig. 3 is a schematic diagram of a mobile terminal (with both sending and receiving functions) according to an embodiment of the invention;
Fig. 4 is a flowchart of the method for realizing real-time audio and video sharing between mobile terminals according to an embodiment of the invention;
Fig. 5 shows the control and data flow of thread T110 according to an embodiment of the invention;
Fig. 6 shows the control and data flow of thread T120 according to an embodiment of the invention;
Fig. 7 shows the control and data flow of thread T130 according to an embodiment of the invention;
Fig. 8 shows the control and data flow of thread T210 according to an embodiment of the invention;
Fig. 9 shows the control and data flow of thread T220 according to an embodiment of the invention;
Fig. 10 shows the control and data flow of thread T230 according to an embodiment of the invention.
Embodiment
At present, various mobile terminals, and mobile phones in particular, almost universally support Bluetooth as a short-range wireless transmission mode. The embodiment of the invention therefore uses Bluetooth to carry the real-time audio and video sharing between mobile terminals, although the invention is not limited to this transmission mode. The main technical scheme is as follows: the sending terminal records the audio-video data it is about to display and play, compresses and encodes it, and transmits it to the receiving terminal over Bluetooth; the receiving terminal decompresses and decodes the data and displays and plays it by calling the corresponding driver interfaces. The preferred embodiments of the invention are described below with reference to the drawings; it should be understood that these preferred embodiments serve only to describe and explain the invention and are not intended to limit it.
Fig. 1 is a schematic diagram of the mobile terminal (sending terminal) of an embodiment of the invention. As shown in Fig. 1, the mobile terminal of this embodiment, i.e. the sending terminal S100, comprises a data recording module S110, a data processing module S120 and a data sending module S130.
The data recording module S110 extracts the audio-video data of the sending terminal S100 as raw RGB and PCM data, pre-processes it to reduce the data volume, stores it in a buffer until a Δt data packet has been formed, and then notifies the data processing module S120 for further processing;
the data processing module S120, upon receiving the notification from the data recording module S110, compresses and encodes the data stored in the buffer, stores the result in a data queue, and then notifies the data sending module S130 for further processing;
the data sending module S130, upon receiving the notification from the data processing module S120, calls the Bluetooth interface to send the audio-video data held in the data queue to the receiving terminal over Bluetooth.
Fig. 2 is a schematic diagram of the mobile terminal (receiving terminal) of an embodiment of the invention. As shown in Fig. 2, the mobile terminal of this embodiment, i.e. the receiving terminal R100, comprises a data receiving module R110, a data processing module R120 and a driver interface module R130.
The data receiving module R110 receives the audio-video data packets transmitted from the sending terminal over Bluetooth, stores them in data queue P2, and notifies the data processing module R120;
the data processing module R120, upon receiving the notification from the data receiving module R110, decompresses and decodes the audio-video data in data queue P2, recovers each data sample of the Δt data packet, stores the samples in a buffer, and then notifies the driver interface module R130;
the driver interface module R130, upon receiving the notification from the data processing module R120, processes the data samples in the buffer to obtain the original RGB and PCM data and calls the LCD driver interface and the CODEC driver interface to display and play the audio-video data.
Fig. 3 is a schematic diagram of a mobile terminal of an embodiment of the invention with both sending and receiving functions. As shown in Fig. 3, the mobile terminal 10 of this embodiment comprises the functional modules of the above sending terminal (data recording module S110, data processing module S120 and data sending module S130), used to send the audio-video data it is displaying and playing to another mobile terminal, as well as the functional modules of the above receiving terminal (data receiving module R110, data processing module R120 and driver interface module R130), used to display and play the audio-video data sent by another mobile terminal. The mobile terminal 10 of this embodiment can thus act as either a sending terminal or a receiving terminal.
The method for realizing real-time audio and video sharing with the mobile terminal provided by the invention is described in detail below. The method works mainly by extracting the sending terminal's original audio-video data, namely RGB data and PCM data, encoding and compressing it to reduce the packet size effectively, and sending it to the receiving terminal over Bluetooth; the receiving terminal receives the audio-video data over Bluetooth, decompresses and decodes it, and finally plays it back through the LCD driver and the audio CODEC driver.
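The overall pipeline just described (capture raw data, compress, transmit, decompress, play) can be sketched in miniature. This is a minimal illustration, not the patented implementation: `zlib` stands in for the unspecified compression codec, and plain callables stand in for the Bluetooth link and the driver layer.

```python
import zlib

def share(raw_chunks, transmit):
    """Sender side: compress each captured chunk of raw audio-video
    data and push it over the transmission link."""
    for chunk in raw_chunks:
        transmit(zlib.compress(chunk))

def render(received, play):
    """Receiver side: decompress each chunk and hand it to the
    driver layer for display and playback."""
    for blob in received:
        play(zlib.decompress(blob))
```

Because the receiving side only ever sees raw decoded data, it needs no knowledge of the original file format, which is the point of the scheme.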
As shown in Fig. 4, the method for realizing real-time audio and video sharing between mobile terminals of this embodiment comprises the following steps:
S401: the sending terminal records the audio-video data it is playing, compresses and encodes the data, and transmits it to the receiving terminal over Bluetooth;
In this method, the sending terminal S100 comprises the data recording module S110, the data processing module S120 and the data sending module S130. In this embodiment, thread T110 implements the function of the data recording module S110, thread T120 implements the function of the data processing module S120, and thread T130 together with the Bluetooth module implements the function of the data sending module S130.
In this embodiment, thread T110 records the audio-video data of one time interval Δt. This audio-video data comprises the RGB data the sending terminal is about to output to its LCD driver module and the PCM data it is about to output to its audio CODEC (coder-decoder) module. The recording should reduce the raw audio-video data volume as much as possible. For the video RGB data, this can be done by reducing the resolution, the bit depth, or the extraction rate of the video data, by other means of reducing the raw video data volume, or by transmitting only audio and no video. For the audio data, it can be done by reducing the bit depth or the extraction rate of the audio data, by applying μ-law or A-law coding, by other means of reducing the raw audio data volume, or by transmitting only video and no audio. The time spent reducing the data volume during recording should be less than the sampling period of the audio-video data. Thread T110 stores the recorded data in memory using double or multiple buffering: for example, after thread T110 has recorded into buffer B1 for one interval Δt, it notifies thread T120 to process the data in B1, and while T120 is processing B1, T110 continues to store newly sampled data in buffer B2. In this way the data sampled in adjacent Δt intervals is kept in different buffers in chronological order.
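Two of the data-reduction options above (lower video resolution and lower audio bit depth) can be illustrated with a short sketch; the helper names and the plain-list data representation are illustrative assumptions, not part of the patent.

```python
def downsample_frame(frame, factor):
    """Keep every `factor`-th pixel of every `factor`-th row, shrinking
    the RGB data volume by roughly a factor of factor**2."""
    return [row[::factor] for row in frame[::factor]]

def reduce_bit_depth(pcm_samples, from_bits=16, to_bits=8):
    """Requantize signed PCM samples by dropping low-order bits,
    e.g. from 16-bit to 8-bit range."""
    shift = from_bits - to_bits
    return [s >> shift for s in pcm_samples]
```

Either reduction is lossy; the scheme tolerates this because the goal is a synchronized live view, not a bit-exact copy.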
Fig. 5 shows the control and data flow of thread T110 according to an embodiment of the invention. As shown in Fig. 5, thread T110 periodically extracts RGB data and PCM data at a given extraction rate, pre-processes them with simple image and audio adjustment and compression, and stores the result in buffer B1 as one data sample. Data samples accumulate in buffer B1 in this way until an interval Δt has elapsed and a complete Δt data packet has been formed. Thread T110 then notifies thread T120 to process the Δt data packet in buffer B1, while storing the data samples of the next Δt interval in buffer B2; the process then alternates between buffers B1 and B2. Double buffering of the sampled audio-video data is described here as an example; the sampled data can equally be stored in multiple buffers in a similar way, which need not be repeated here. Note that the RGB data and the PCM data are extracted differently: the RGB data is sampled periodically at the extraction period, whereas the extracted PCM data is a complete record of the PCM data within that extraction period.
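The alternation between buffers B1 and B2 described above can be sketched as follows. This is a hypothetical illustration in which a fixed sample count plays the role of the Δt interval and a queue of completed packets stands in for the notification sent to thread T120.

```python
from collections import deque

class DoubleBufferRecorder:
    """Alternates between two buffers: while one Δt packet is being
    processed, new samples accumulate in the other buffer."""
    def __init__(self, samples_per_packet):
        self.buffers = [[], []]
        self.active = 0            # index of the buffer being filled (B1 or B2)
        self.n = samples_per_packet
        self.ready = deque()       # completed Δt packets handed to the encoder

    def record(self, sample):
        buf = self.buffers[self.active]
        buf.append(sample)
        if len(buf) == self.n:     # a full Δt packet has been captured
            self.ready.append(list(buf))
            buf.clear()
            self.active ^= 1       # switch to the other buffer
```

A real implementation would guard the hand-off with synchronization between the recording and encoding threads; that is omitted here for clarity.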
Thread T120 then processes the data saved by thread T110. Fig. 6 shows the control and data flow of thread T120 according to an embodiment of the invention. As shown in Fig. 6, thread T120, on receiving the notification from thread T110, processes the Δt data packets in buffers B1 and B2 in turn. The processing can comprise encoding and compression steps, or take other forms; the aim is to eliminate as much of the redundancy between the data samples in a Δt data packet as possible, minimizing the storage space needed and reducing the demand on Bluetooth transmission bandwidth. The time thread T120 spends compressing and encoding one Δt interval of data must be less than Δt, otherwise the data of the next Δt interval cannot be compressed and encoded in time. After encoding and compression, the Δt data packet is stored in data queue P1 and thread T130 is notified. The data queue P1 makes it possible to adjust flexibly how many Δt data packets the Bluetooth module sends at a time, optimizing the efficiency of the Bluetooth transmission stage.
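The compress-and-enqueue step performed by thread T120 might look like the following sketch, with `zlib` standing in for the unspecified codec and a thread-safe `Queue` playing the role of data queue P1.

```python
import pickle
import zlib
from queue import Queue

packet_queue = Queue()  # plays the role of data queue P1

def encode_packet(samples):
    """Serialize one Δt packet, compress away inter-sample redundancy,
    and enqueue it for the sender thread. Returns both sizes so the
    caller can monitor the compression ratio."""
    raw = pickle.dumps(samples)
    compressed = zlib.compress(raw)
    packet_queue.put(compressed)
    return len(raw), len(compressed)

def decode_packet(blob):
    """Inverse operation, as run on the receiving terminal."""
    return pickle.loads(zlib.decompress(blob))
```

Compression is most effective precisely because adjacent samples in a Δt packet are highly correlated, which is the redundancy the text above refers to.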
Thread T130 then processes the data saved by thread T120. Fig. 7 shows the control and data flow of thread T130 according to an embodiment of the invention. As shown in Fig. 7, thread T130, on receiving the notification from thread T120, selects data from data queue P1 and sends it over the Bluetooth air interface of the Bluetooth module. The data can be sent over Bluetooth using protocols such as OPP (Object Push Profile) or FTP (File Transfer Protocol), or using other protocols. How data is selected from data queue P1 can be adapted to the circumstances: a fixed number of Δt data packets can be transmitted each time, or the number can be adjusted dynamically according to link conditions, and the number of Δt data packets to be transmitted can be communicated to the receiving terminal through the Bluetooth module interface before each transmission. The Δt data packets may need to be processed before sending, for example by appending an end-of-packet marker so that the receiving terminal can determine when a complete Δt data packet has been received.
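The end-of-packet marker mentioned above can be sketched like this. The marker bytes are an invented example, and the sketch assumes (as a real implementation would have to guarantee, e.g. by escaping) that the marker sequence never occurs inside a compressed payload.

```python
END_MARK = b"\xff\x00PKT_END\x00\xff"  # hypothetical marker, assumed absent from payloads

def frame(payload: bytes) -> bytes:
    """Append the end marker so the receiver can find packet
    boundaries in the Bluetooth byte stream."""
    return payload + END_MARK

def split_stream(buffer: bytes):
    """Split a received byte stream into complete packets plus the
    trailing remainder belonging to a still-incomplete packet."""
    parts = buffer.split(END_MARK)
    return parts[:-1], parts[-1]
```

The same pair of functions serves both ends of the link: the sender frames, the receiver splits.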
S402: the receiving terminal decompresses and decodes the received audio-video data, then calls the corresponding driver interfaces to play it.
In this method, the receiving terminal R100 comprises the data receiving module R110, the data processing module R120 and the driver interface module R130. In this embodiment, the Bluetooth module together with thread T210 implements the function of the data receiving module R110, thread T220 implements the function of the data processing module R120, and thread T230 implements the function of the driver interface module R130.
Fig. 8 shows the control and data flow of thread T210 according to an embodiment of the invention. As shown in Fig. 8, when the Bluetooth module receives, over the air interface, the audio-video data sent by the sending terminal over Bluetooth, it notifies thread T210. Thread T210 stores the data at the current write position in data queue P2. When it detects that a complete Δt data packet has been received, for example by recognizing the end-of-packet marker, it stores any data received after the marker in another unit of data queue P2 and notifies thread T220 to process the packet. If thread T210 does not obtain a complete Δt data packet within a predetermined time T, it notifies the upper-layer application to perform error handling, so that the receiving terminal does not remain in the receiving state indefinitely.
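The guard against waiting forever for an incomplete packet could be realized along these lines; the class name and the injectable clock are illustrative choices made so the timeout behavior is easy to test, and the error signal here is simply a `TimeoutError` rather than a notification to an upper layer.

```python
import time

class PacketAssembler:
    """Accumulates received chunks; raises if no complete packet
    arrives within `timeout` seconds, mirroring the predetermined
    time T described above."""
    def __init__(self, end_mark, timeout, clock=time.monotonic):
        self.end_mark = end_mark
        self.timeout = timeout
        self.clock = clock
        self.buf = b""
        self.started = None   # time the current packet started arriving

    def feed(self, chunk):
        """Add one chunk; return a complete packet, or None if more
        data is still needed."""
        now = self.clock()
        if self.started is None:
            self.started = now
        self.buf += chunk
        if self.end_mark in self.buf:
            pkt, _, self.buf = self.buf.partition(self.end_mark)
            self.started = None if not self.buf else now
            return pkt
        if now - self.started > self.timeout:
            raise TimeoutError("incomplete Δt packet")
        return None
```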
Thread T220 then processes the audio-video data. Thread T220 receives the notification from thread T210, takes data from receive queue P2, decompresses and decodes it to obtain one Δt interval of data, stores it in double or multiple buffers, and notifies the driver interface module R130. Fig. 9 shows the control and data flow of thread T220 according to an embodiment of the invention. As shown in Fig. 9, thread T220, on receiving the notification from thread T210, takes one Δt data packet from the current read position of data queue P2, decompresses and decodes it, recovers each data sample of the Δt data packet, stores the samples in one of multiple buffers, for example buffer B3 or buffer B4, and notifies thread T230 for subsequent processing. If the previous Δt data packet was stored in buffer B3, the next one is stored in buffer B4; that is, the storage and processing of the data samples of successive Δt data packets alternate between the two buffers.
Thread T230 then implements the interfacing with the drivers. On receiving the notification from thread T220, thread T230 takes data from the double (or multiple) buffers and pre-processes it to meet the requirements of the LCD and the audio CODEC for RGB and PCM data, and calls the corresponding LCD driver and audio driver. The RGB data is used to refresh the LCD at the sending terminal's RGB sampling rate; the PCM data is played by calling the audio driver according to its sampling rate, sample resolution, compression mode, and other parameters.
Fig. 10 shows the control and data flow of thread T230 according to an embodiment of the invention.
As shown in Fig. 10, thread T230, on receiving the notification from thread T220, takes the audio-video data of one sample of a Δt data packet from data buffer B3 or B4, separates it into PCM data and RGB data, adjusts the RGB data and the PCM data, and then calls the LCD driver interface and the CODEC driver interface to display and play the audio-video data of that sample. The separation and adjustment steps are mainly intended to make the data finally passed to the LCD driver interface and the CODEC driver interface meet the requirements of those interfaces. The sampled audio-video data is processed differently for PCM data and RGB data; this is determined by their signal characteristics and is reflected in the way the data samples are taken.
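The per-sample separation and driver hand-off performed by thread T230 can be sketched as follows; the dict layout of a sample and the callable driver interfaces are assumptions made for illustration only.

```python
def split_sample(sample):
    """Each Δt sample is assumed (for this sketch) to be a dict holding
    the video frame and the audio captured during that extraction period."""
    return sample["rgb"], sample["pcm"]

def play_packet(packet, lcd_write, codec_write):
    """Separate each sample and hand the parts to the LCD and CODEC
    driver interfaces (modelled here as plain callables)."""
    for sample in packet:
        rgb, pcm = split_sample(sample)
        lcd_write(rgb)   # refresh the display with this frame
        codec_write(pcm)  # queue this period's audio for playback
```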
In summary, with the mobile terminal and the method for realizing real-time audio and video sharing of the invention, the audio-video data played by a sending terminal can be transmitted to a receiving terminal in synchronization. Because the data sent is the data finally output to the LCD and the audio CODEC, the receiving terminal's LCD image and sound stay synchronized with the sending terminal. The receiving terminal can thus watch a file in synchronization even without a decoder for the file being played by the sending terminal, or play media in synchronization even without the sending terminal's media hardware (such as a CMMB receiver, camera, or radio receiver), letting users share media information synchronously in real time. Real-time audio and video sharing between mobile terminals increases communication between users and enriches the user's multimedia experience.
The above are only preferred embodiments of the invention and do not limit it; for those skilled in the art, the invention admits various modifications and variations. Any amendment, equivalent replacement, improvement, and the like made within the spirit and principles of the invention shall fall within its scope of protection.

Claims (6)

1. A method for realizing real-time audio and video sharing between mobile terminals, comprising the following steps:
A. the sending terminal records the audio-video data to be played, compresses and encodes the data, and transmits it to the receiving terminal;
B. the receiving terminal decompresses and decodes the received audio-video data, then calls the corresponding driver interfaces to play it,
wherein step A is specifically implemented as follows:
the sending terminal extracts the audio-video data to be played at a given rate; the data extracted in adjacent Δt intervals is stored, in chronological order, in different buffers, forming a sequence of Δt data packets;
the Δt data packets in the different buffers are compressed and encoded in turn;
the compressed and encoded Δt data packets are transmitted to the receiving terminal;
and step B is specifically implemented as follows:
the receiving terminal receives the Δt data packets sent by the sending terminal;
the Δt data packets are decompressed and decoded, then stored alternately in different buffers;
the audio-video data in each Δt data packet is taken from the buffers in alternation, separated and adjusted, and then the LCD driver interface and the CODEC driver interface are called to play the video and audio respectively.
2. the method for claim 1, is characterized in that: described extraction audio, video data specific implementation is:
According to decimation periods timing sampling video data, and extract complete voice data in described decimation periods.
3. the method for claim 1, is characterized in that: after the Δ t time data bag in buffering areas different described in described compression coding in turn, also comprise:
Δ t time data bag after compression coding is stored in data queue.
4. A mobile terminal, comprising:
a data recording module, which records the audio-video data of the mobile terminal and then sends a message to a first data processing module;
the first data processing module, which, upon receiving the message sent by the data recording module, compresses and encodes the audio-video data and then notifies a data sending module; and
the data sending module, which, upon receiving the notification sent by the first data processing module, sends the compressed and encoded audio-video data,
wherein the data recording module specifically extracts the terminal's own audio-video data at a given rate, stores the data extracted in adjacent Δt intervals in different buffers in chronological order to form a sequence of Δt data packets, and sends a message to the first data processing module each time a Δt data packet is formed;
the first data processing module, upon receiving each such message, compresses and encodes the Δt data packets in the different buffers in turn and then notifies the data sending module; and
the data sending module, upon receiving the notification, sends the compressed and encoded Δt data packets.
5. The mobile terminal of claim 4, characterized by further comprising:
a data receiving module, for receiving the audio-video data sent by another mobile terminal and then sending a message to a second data processing module;
the second data processing module, for decompressing and decoding the audio-video data after receiving the message, and then notifying a driving interface module;
and the driving interface module, for calling the corresponding driving interface to display and play the audio-video data after receiving the notification.
6. The mobile terminal of claim 5, characterized in that:
the audio-video data received by the data receiving module is a second Δt-time data packet;
the second data processing module is specifically configured to decompress and decode the second Δt-time data packet and then alternately store it in different buffers;
and the driving interface module is specifically configured to alternately take the audio-video data out of the second Δt-time data packets in the buffers, separate and adjust it, and then call the LCD driving interface and the CODEC driving interface to play the audio-video data.
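On the receiving side (claims 5–6), each incoming Δt packet is decompressed and decoded, stored alternately in two buffers, then drained in the same alternating order: the audio-video data is separated and handed to the LCD and CODEC driving interfaces. A sketch under the same assumptions as above — `decode`, `lcd_write`, and `codec_write` are hypothetical stubs for the real driver interfaces:

```python
import ast

def decode(blob):
    """Hypothetical stand-in for decompressing/decoding a Δt-time packet."""
    return ast.literal_eval(blob.decode())

played = {"lcd": [], "codec": []}

def lcd_write(frames):       # stand-in for the LCD driving interface (video)
    played["lcd"].extend(frames)

def codec_write(samples):    # stand-in for the CODEC driving interface (audio)
    played["codec"].extend(samples)

def play_stream(packets):
    buffers = [None, None]
    for i, blob in enumerate(packets):
        buffers[i % 2] = decode(blob)   # alternate storage into two buffers
        pkt = buffers[i % 2]            # drain alternately, in arrival order
        lcd_write(pkt["video"])         # separated video goes to the LCD driver
        codec_write(pkt["audio"])       # separated audio goes to the CODEC driver

# Example: packets shaped like those produced by the sender-side sketch
example = [repr({"video": [f"f{i}"], "audio": [f"a{i}"]}).encode() for i in range(3)]
play_stream(example)
print(played["lcd"])   # ['f0', 'f1', 'f2']
```

As on the sender, the alternating buffers decouple decoding of packet *n*+1 from playback of packet *n*; the "adjust" step of claim 6 (e.g. audio-video synchronization) is omitted here for brevity.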
CN200910205518.8A 2009-10-20 2009-10-20 Method for realizing real-time sharing of audio and video of mobile terminal and mobile terminal Active CN101695090B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN200910205518.8A CN101695090B (en) 2009-10-20 2009-10-20 Method for realizing real-time sharing of audio and video of mobile terminal and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN200910205518.8A CN101695090B (en) 2009-10-20 2009-10-20 Method for realizing real-time sharing of audio and video of mobile terminal and mobile terminal

Publications (2)

Publication Number Publication Date
CN101695090A CN101695090A (en) 2010-04-14
CN101695090B true CN101695090B (en) 2014-06-11

Family

ID=42094030

Family Applications (1)

Application Number Title Priority Date Filing Date
CN200910205518.8A Active CN101695090B (en) 2009-10-20 2009-10-20 Method for realizing real-time sharing of audio and video of mobile terminal and mobile terminal

Country Status (1)

Country Link
CN (1) CN101695090B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101945096B (en) * 2010-07-13 2013-01-02 上海未来宽带技术股份有限公司 Video live broadcast system facing to set-top box and PC of mobile phone and working method thereof
CN101902524A (en) * 2010-07-13 2010-12-01 上海未来宽带技术及应用工程研究中心有限公司 Mobile phone capable of being taken as video source of live video system and audio-video transmission method
CN102377457A (en) * 2010-08-11 2012-03-14 中兴通讯股份有限公司 Method and device for sharing audio information by mobile communication terminals
CN102075728B (en) * 2011-01-18 2015-08-12 中兴通讯股份有限公司 The method and system of a kind of shared audio frequency and/or video
CN105979284B (en) * 2016-05-10 2019-07-19 杨�远 Mobile terminal video sharing method
WO2019023919A1 (en) * 2017-08-01 2019-02-07 Vishare Technology Limited Methods and apparatus for video streaming with improved synchronization
CN115460425B (en) * 2022-07-29 2023-11-24 上海赫千电子科技有限公司 Audio and video synchronous transmission method based on vehicle-mounted Ethernet transmission

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1933594A (en) * 2005-09-14 2007-03-21 王世刚 Multichannel audio-video frequency data network transmitting and synchronous playing method
CN101009824A (en) * 2006-01-24 2007-08-01 成都索贝数码科技股份有限公司 A network transfer method for audio/video data
CN101500158A (en) * 2008-12-26 2009-08-05 深圳市同洲电子股份有限公司 Visible interphone and audio/video data transmission method and system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6590604B1 (en) * 2000-04-07 2003-07-08 Polycom, Inc. Personal videoconferencing system having distributed processing architecture

Also Published As

Publication number Publication date
CN101695090A (en) 2010-04-14

Similar Documents

Publication Publication Date Title
CN101695090B (en) Method for realizing real-time sharing of audio and video of mobile terminal and mobile terminal
US9491505B2 (en) Frame capture and buffering at source device in wireless display system
JP4585479B2 (en) Server apparatus and video distribution method
CN101427579B (en) Time-shifted presentation of media streams
EP2802151A1 (en) Method and apparatus for providing content, method and apparatus for reproducing content
EP2995087A1 (en) Video streaming in a wireless communication system
CN1893383B (en) Method of providing recordable time according to remaining memory capacity and terminal therefor
CN102883152A (en) Media streaming with adaptation
CN101883097A (en) Method and device for realizing that server equipment shares screen of client equipment
CN101594528A (en) Information processing system, messaging device, information processing method and program
US7493644B1 (en) Method, apparatus, and system for video fast forward functionality in a mobile phone
TW201129098A (en) Method and video receiving system for adaptively decoding embedded video bitstream
US9497245B2 (en) Apparatus and method for live streaming between mobile communication terminals
US8736771B2 (en) Display apparatus, communication apparatus, displaying method and program recording medium
EP1511326B1 (en) Apparatus and method for multimedia reproduction using output buffering in a mobile communication terminal
WO2016107174A1 (en) Method and system for processing multimedia file data, player and client
US20140126878A1 (en) Self-configuring media devices and methods
KR100710386B1 (en) Method for recording and playing data for broadcasting, and communication terminal and system for the same
US20070058576A1 (en) Mobile communication terminal and method for reproducing digital broadcasting
CN117336283A (en) Data communication method, device, electronic equipment and storage medium
CN113225309A (en) Multimedia file online playing method, device, server and storage medium
KR20200029881A (en) Image processing apparatus and controlling method thereof
KR20070027142A (en) The display device for recording the compressed data formats of other types and method for controlling the same
KR20040000313A (en) System for recording and reading audiovisual information

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant