CN1921610B - Client-based video stream interactive processing method and processing system - Google Patents

Client-based video stream interactive processing method and processing system

Info

Publication number
CN1921610B
CN1921610B CN2006101129068A CN200610112906A
Authority
CN
China
Prior art keywords
video
interactive
data
client
frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN2006101129068A
Other languages
Chinese (zh)
Other versions
CN1921610A (en)
Inventor
龚湘明
龚湘京
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canal Age (Beijing) Technology Development Co., Ltd.
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN2006101129068A
Publication of CN1921610A
Application granted
Publication of CN1921610B
Legal status: Active
Anticipated expiration

Landscapes

  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention relates to a client-based video stream interactive processing method, and to an apparatus and a system for implementing the method. The client-based video stream interactive processing system performs global identification processing on a video data stream; if interactive logic stream data has been obtained and a single frame of the video data stream is decoded successfully, inter-frame processing is performed on the video stream. The invention greatly enriches the techniques and fields of video application, and its generic design spares users frequent upgrades of their applications, better achieving the goal of reducing the cost of use.

Description

Client-based video stream interactive processing method and processing system
Technical field
The present invention relates to a client-based video stream interactive processing method, and to an apparatus and a system for implementing the method.
Background technology
At present, available video interaction technologies all work as follows: the server finishes compositing the final picture and sound and encodes the composited video content into a video stream; the video background picture, the foreground display elements and the accompanying audio — that is, the playback order of plots/scenes, the constraints of the interactive program logic and so on — are all realized on the server side. The client's role is to receive the encoded video stream and, using a device such as a television set, a set-top box or a computer, decode it back into video images and sound for playback, and to input control commands such as play, stop, fast forward and rewind for simple control of the video stream.
Fig. 1 shows the structure of a prior-art server-side video stream processing system. As can be seen from the figure, in the prior art the interactive video stream is generated on the server side. Changing the picture content in real time requires an extremely large amount of computation, and this function is currently achieved mainly with various hardware character generators, effect processors and editors, together with computer programs that are complicated to operate.
Because each channel of interactive video requires dedicated equipment and/or a server with a large reserve of computing capacity, the number of users that such server-side interactive video systems can carry is very limited. In general, one set of interactive audio/video editing and compositing servers supports only a few channels of interactive video, and usually only one interactive request can be handled at a time. This centralized, server-side data processing model has therefore become the bottleneck in the development of interactive video technology.
In addition, most interactive video processing techniques merely add a simple menu interface, and the user can only select video chapters/segments through simple control operations. In practice, however, a large amount of business information needs to be overlaid on the video — for example, playing a specific scene precisely at a specific video frame, operating on a particular element in the picture, or performing some automatic logic processing. Such complex interactive video processing is extremely difficult for a traditional centralized interactive video system.
Controlling the logic flow of interactive video often requires specialized operating knowledge and professional tools as well as a large amount of production time, and the generated interactive video content is hard to modify or adjust in real time, which also makes application development inconvenient. Video file formats and codecs are numerous and each has its own strengths, but apart from a few multimedia video formats designed from the start to include interactive applications, most video formats contain no index information for chapters or plots; and arbitrarily defining new file formats or adding extra metadata streams, in order to preserve compatibility and generality to the greatest extent, again makes some interactive applications relatively difficult.
Traditional video interaction technology either has the server generate the video picture data stream and transfer it to a terminal player (dedicated or general-purpose) for direct playback, or uses a dedicated player for a specially designed video file format, such as Real Networks' RM/RMVB formats and Microsoft's WMV format. The video media formats of those two companies, because of the limited openness and compatibility of their players, and because of users' habits and other reasons, have hindered the adoption of their interactive functions. The present technique is a different solution, one that is independent of the file format and the player. It can add video-and-application interaction to nearly any video playback program on the terminal (whether newly developed or fairly old), greatly enriching the techniques and fields of video application. The generic design spares the user frequent upgrades of the application and better achieves the goal of reducing the cost of use.
Summary of the invention
The object of the present invention is to provide a client-based video stream interactive processing method, and an apparatus and a system for implementing the method.
The client-based video stream interactive processing method according to the present invention comprises the following steps: performing global identification processing on a video data stream; and, if interactive logic stream data has been obtained, performing inter-frame processing on the video stream when a single frame of the video data stream is decoded successfully.
The global identification processing of the video data stream comprises the following steps: performing data stream identification; identifying the video frame information within the identified data stream; obtaining the interactive logic data associated with the identified video frames; and preprocessing the obtained interactive logic data elements.
The inter-frame processing comprises the following steps: responding to changes in user input information and in frame content; computing the state attributes of the interactive elements in real time; performing element effect processing and primitive caching; and outputting the video and executing the interactive logic.
The client-based video stream interactive processing system according to the present invention comprises: a data stream processing unit 11 for performing global identification processing on the video data stream; and a single-frame processing unit 12 for performing inter-frame processing on the video stream.
The data stream processing unit 11 further comprises: a data stream identification module 111 for identifying the source of the data stream; a video frame information extraction module 112 for identifying the various attribute information of the video frames; an interactive logic data acquisition module 113 for extracting from the server the various pieces of interactive information provided for the client; and an interactive element preprocessing module 114 for preprocessing the obtained interactive elements, such as forming an index queue and buffering.
The single-frame processing unit 12 further comprises: an input module 121 for responding to changes in frame content according to information input by the user; a real-time computing module 122 for computing the state attributes of the interactive elements in real time; a processing module 123 for performing effect processing and primitive caching on the elements; and an output module 124 for outputting the video and executing other visual operation instructions.
Description of drawings
Fig. 1 shows the structure of a prior-art server-side video stream processing system;
Fig. 2 is a block diagram of the terminal-oriented interactive video processing system according to the present invention;
Fig. 3 is a flow chart of the terminal-oriented interactive video processing method according to the present invention;
Fig. 4 is a flow chart of an example data stream identification method;
Fig. 5 is a flow chart of an example method of identifying video frame information;
Fig. 6 is a flow chart of an example method of obtaining the interactive logic data stream;
Fig. 7 is a flow chart of an example interactive element preprocessing method;
Fig. 8 is a flow chart of data extraction and preprocessing for picture changes and user input during inter-frame processing;
Fig. 9 is a flow chart of an example method of computing the interactive element state attributes in real time between frames;
Fig. 10 is a flow chart of an example method of interactive element video rendering and primitive caching;
Fig. 11 shows an example submodule for outputting the video and executing the interactive logic.
Embodiment
The system and method of the present invention will now be described in detail with reference to the accompanying drawings.
Fig. 2 is a block diagram of the terminal-oriented interactive video processing system according to the present invention. As shown in Fig. 2, the terminal-oriented interactive video processing system comprises a data stream processing unit 11 and a single-frame processing unit 12. The data stream processing unit 11 comprises: a data stream identification module 111 for identifying the source of the data stream; a video frame information extraction module 112 for identifying the various attribute information of the video frames; an interactive logic data acquisition module 113 for extracting from the server the various pieces of interactive information provided for the client; and an interactive element preprocessing module 114 for preprocessing the obtained interactive elements, such as forming an index queue and buffering. The single-frame processing unit 12 further comprises: an input module 121 for responding to changes in frame content according to information input by the user; a real-time computing module 122 for computing the state attributes of the interactive elements in real time; a processing module 123 for performing effect processing and primitive caching on the elements; and an output module 124 for outputting the video and executing the interactive logic.
The terminal-oriented interactive video processing system of the present invention can be loaded into a computer system in several ways: by actively loading the system through the third-party extension interface or plug-in code of an open media player or other video software; or, for an older player or other video software without a third-party module loading function, by loading the system automatically through the extension interfaces reserved in the operating system's video specification; or, if the player or other video software does not use the operating system's standard extension interfaces, or its extension interfaces are limited, by setting a hook on the video display window (screen) so as to load the system; in addition, for hardware or video software with live-capture characteristics, such as a camera or a video capture card, the system can also be dynamically loaded onto the data stream by monitoring the APIs of the system hardware driver layer. Loading a program on such extension interfaces and APIs is prior art in this field and is not repeated here. During loading, the system of the present invention automatically selects the best loading mode by detecting the characteristics of the playback program, the video file and the hardware modules.
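For illustration only, the loading-mode selection just described can be pictured roughly as the following Python sketch. The HostInfo fields, the probe semantics and the mode names are assumptions made for the example and do not come from the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HostInfo:
    has_plugin_interface: bool          # open player exposes a third-party extension API
    has_os_video_extension: bool        # OS video specification reserves an extension slot
    video_window_handle: Optional[int]  # window that could be hooked as a last resort
    is_live_capture: bool               # camera / capture-card style live source

def choose_loading_mode(host: HostInfo) -> str:
    """Pick a loading mode in roughly the priority order the description gives."""
    if host.has_plugin_interface:
        return "player-plugin"          # load through the player's own extension API
    if host.has_os_video_extension:
        return "os-video-extension"     # load through the OS video specification
    if host.is_live_capture:
        return "driver-api-monitor"     # follow the capture data stream via driver-layer APIs
    if host.video_window_handle is not None:
        return "window-hook"            # hook the video display window directly
    return "unsupported"

if __name__ == "__main__":
    print(choose_loading_mode(HostInfo(False, True, 0x1A2B, False)))  # os-video-extension
```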
After the generic video interaction module has finished loading, it waits for the system's trigger message carrying the decoded data of the compressed video and the address at which that data is stored. Once the module detects video data, it handles the situation differently according to the circumstances. When the data stream triggers for the first time, the module preprocesses the video: it obtains data such as the whole video data stream, the video file information and index information, and the player and system information, processes the collected data, submits it to the interactive application server, and fetches data back for handling. When the video data stream triggers again, the module decides whether to carry out the inter-frame processing functions according to whether interactive data has been obtained and what the interactive data specifically instructs.
The data stream identification module 111 of the video data stream processing unit 11 first determines, according to the encoding format of the video stream and the dynamically connected modules, whether the data stream comes from a digital video capture device, from a network video stream packet, or from a file on a local hard disk or optical disc. It then identifies the data frame information, such as the video width and height, color bit depth, color depth, video metadata, video frame rate, frame count, playback time and current time (when the processing thread runs asynchronously or the video is a live stream, the frame number at preprocessing may not be start frame 0), and so on. When a specially defined marker is present at a specific position in the video data stream, digital watermark extraction is performed on the frame data to obtain the information hidden in the digital watermark.
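A minimal sketch of the kind of frame information collected at this stage, and of a marker-triggered watermark extraction, is given below. The field names, the WATERMARK_MARKER value and the payload layout are assumptions for illustration; the patent does not specify them.

```python
from dataclasses import dataclass, field
from typing import Optional

WATERMARK_MARKER = b"\x00WMK"  # hypothetical "specially defined marker" value

@dataclass
class FrameStreamInfo:
    source: str            # "capture-device" | "network-stream" | "local-file"
    width: int
    height: int
    color_bits: int
    frame_rate: float
    frame_count: int
    duration_s: float
    current_frame: int     # may not be 0 for live / asynchronous processing
    metadata: dict = field(default_factory=dict)

def maybe_extract_watermark(frame_bytes: bytes) -> Optional[bytes]:
    """Return the hidden payload if the frame carries the special marker."""
    pos = frame_bytes.find(WATERMARK_MARKER)
    if pos < 0:
        return None
    start = pos + len(WATERMARK_MARKER)
    length = frame_bytes[start]            # illustrative layout: 1 length byte + payload
    return frame_bytes[start + 1:start + 1 + length]

# usage
frame = b"...pixels..." + WATERMARK_MARKER + bytes([5]) + b"hello" + b"...pixels..."
print(maybe_extract_watermark(frame))      # b'hello'
print(FrameStreamInfo("local-file", 720, 576, 24, 25.0, 1500, 60.0, 0))
```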
In the processing performed by the video data stream processing unit 11, digital watermark information is obtained from the frames that carry the special markers. This technique is adopted mainly to better solve the problem of locating plots and content chapters within the media. By using digital watermarking, specific scenes and plots can be indexed without relying on a particular file format or coding algorithm. Because digital watermark extraction is a commonly used prior-art technique, it is not elaborated here. Once a film source has been protected with a digital watermark, in most cases the original information can still be recovered even if the original video undergoes lossy processing such as transcoding, picture scaling, stretching, light softening or sharpening, which further extends the applicability and generality of the interactive video function.
After the video frame information extraction module 112 has extracted the various basic pieces of information about the video, the client sends this information, together with the client's system parameters, to the server for processing. The server classifies the uploaded data and stores it in the corresponding database. Finally, according to the collected information, the server completes processing such as querying the client's rights, identifying the media, recognizing the plot and scene, and correcting and locating the time axis, and then dynamically generates the corresponding interactive application data script according to the business logic module associated with these media/plots in the database. After encryption, the script is sent back to the client. This process is carried out by the interactive logic data acquisition module 113.
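The round trip to the server can be pictured as a simple request/response exchange. The JSON layout, the field names and the in-memory "database" below are purely hypothetical; the patent only states that the client uploads video information plus system parameters and receives an (encrypted) interactive application script back, so encryption and transport are omitted in this sketch.

```python
import json

def build_upload(frame_info: dict, client_params: dict) -> bytes:
    """Package the extracted video information plus client parameters for upload."""
    return json.dumps({"video": frame_info, "client": client_params}).encode("utf-8")

def handle_upload_on_server(payload: bytes, script_db: dict) -> bytes:
    """Server side: classify the upload, look up the business logic for the media,
    and return an interactive application script (encryption omitted)."""
    request = json.loads(payload.decode("utf-8"))
    media_id = request["video"].get("name", "unknown")
    script = script_db.get(media_id, {"elements": [], "logic": []})
    return json.dumps({"media": media_id, "script": script}).encode("utf-8")

# usage
db = {"demo.mp4": {"elements": [{"id": "btn1", "appears_at": 3.0}],
                   "logic": ["onClick:pause"]}}
reply = handle_upload_on_server(build_upload({"name": "demo.mp4", "fps": 25},
                                             {"os": "win"}), db)
print(json.loads(reply)["script"]["elements"][0]["id"])   # btn1
```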
After the client receives the interactive data, it parses the script item by item and builds an index queue for the interactive elements in the script; then, according to statistics on how long each interactive element will be used, the interactive element preprocessing module 114 buffers the frequently used interactive elements that would otherwise consume a large amount of resources.
At this point the data stream preprocessing flow is finished; from then on, processing mainly consists of inter-frame processing between video frames.
Inter-frame processing refers to the processing performed between two consecutive original video pictures: through the effect of the interactive elements on the frames processed in real time, a visual reconstruction of the original video picture is achieved. While the player is playing the video, the inter-frame processing function module is loaded between every two frame pictures. Inter-frame processing mainly comprises the following flow: (1) the input module 121 detects changes in the video content and the user's various inputs, and outputs data such as the display priorities of the interactive elements; (2) the real-time computing module 122 dynamically executes the interactive control script and computes the attribute changes of the interactive elements; (3) the processing module 123 renders and caches the interactive elements; and (4) the output module 124 outputs the video and executes other visual operation instructions.
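The four stages can be pictured as a single loop that runs once per frame interval. The following Python sketch is illustrative only and is not part of the claimed invention; the Element class, the toy "script" that moves an element, and all of the function names are assumptions made for the example.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Element:
    name: str
    x: int = 0
    y: int = 0
    visible: bool = True
    priority: int = 0

def detect_changes(events: List[Dict], elements: List[Element]) -> None:
    """(1) Input stage: adjust display priority from user events (toy rule)."""
    for ev in events:
        for el in elements:
            if ev.get("target") == el.name:
                el.priority += 1

def run_script(elements: List[Element]) -> None:
    """(2) Real-time computing stage: stand-in for the interactive control script."""
    for el in elements:
        el.x = (el.x + 1) % 320          # e.g. the script moves the element each frame

def render_and_cache(elements: List[Element]) -> List[str]:
    """(3) Effect/primitive-cache stage: here just a draw list ordered by priority."""
    drawable = [el for el in elements if el.visible]
    drawable.sort(key=lambda el: el.priority, reverse=True)
    return [f"{el.name}@({el.x},{el.y})" for el in drawable]

def output(frame_no: int, draw_list: List[str]) -> str:
    """(4) Output stage: composite the foreground onto the frame (string stand-in)."""
    return f"frame {frame_no}: " + " ".join(draw_list)

elements = [Element("logo"), Element("button", priority=1)]
for frame_no in range(3):                # the per-frame-interval loop
    detect_changes([{"target": "button"}], elements)
    run_script(elements)
    print(output(frame_no, render_and_cache(elements)))
```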
In the inter-frame processing, the serial number of the current frame is obtained automatically; meanwhile watermark detection and data extraction are performed on the video data stream, the watermark processing here being similar to that in the preprocessing flow. Next, it is detected whether the data stream carries an additional data marker; if such a marker exists and can be recognized, the additional data is extracted. Then the most recent message events of external input devices such as the mouse and keyboard are detected. If an input device event falls within the video window, then, according to the input information, collision detection between the video and the interactive elements, and collision/occlusion detection among the interactive elements, are carried out in the interactive element queue and in the temporary interactive element buffer produced during the previous round of inter-frame processing. Visual elements that cannot be displayed in this flow are marked as skipped. At the same time, according to the input device information, the logical levels of the interactive elements are revised and sorted so as to reduce the system's resource consumption.
Once the display priorities and logical layer relationships of the interactive elements have been determined, the attributes of each interactive element are computed in real time according to the interactive logic script. When this module runs, it first obtains the buffer queue of interactive elements for the current frame, and then computes and renders the attributes of the elements one by one.
Before an interactive element is computed and rendered, a timestamp is first recorded, and an unprocessed interactive element is then taken from the queue. The real-time delimiter checks whether it has been buffered; if buffered data exists, the next step is carried out, otherwise a new data object and buffer are created. It is then detected whether any attribute of the interactive element needs to change (attributes such as screen position, width and height, the three rotation angles, color, and so on). If the corresponding frame on the script timeline carries an associated dynamic script and instructions for the relevant attributes, the attributes are modified according to the instructions, or the dynamic script is executed on the internal virtual machine, to achieve the dynamic change of the element's attributes, and the renderer is then told to render this interactive element again; otherwise the renderer is told directly to render with the original buffered image and the new parameters. Finally the real-time delimiter estimates the time remaining in the frame interval. If the remaining time is insufficient, dynamic rendering of some elements is abandoned and the cached data of the previous frame is output directly, so as to guarantee real-time processing.
A terminal playing video must solve the real-time problem: if it is not solved, frames are dropped when the user terminal plays the video file, degrading the user's viewing quality. The present invention therefore adopts the real-time delimiter technique: the rendering time of each element is measured and accumulated, the rendering capacity and the graphics cache of the interactive elements are allocated judiciously, and, relying on the persistence of human vision, the number of times interactive elements are rendered directly is reduced as far as possible, so that more picture elements can be processed in real time with limited computing resources. With this technique, video picture rendering performance can be improved by roughly 30%.
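The real-time delimiter idea — spend the frame interval rendering what fits and fall back to cached bitmaps for the rest — can be sketched as below. The function signature and the 10 ms budget in the usage lines are assumptions for illustration.

```python
import time

def render_elements_with_budget(elements, frame_interval_s, render_fn, cache):
    """Render as many elements as the frame interval allows; for the rest,
    reuse the bitmap cached from a previous frame (real-time delimiter idea).

    `elements` is an iterable of element ids, `render_fn(el)` produces a fresh
    bitmap, and `cache` maps element id -> last rendered bitmap."""
    deadline = time.perf_counter() + frame_interval_s
    output = {}
    for el in elements:
        if time.perf_counter() < deadline or el not in cache:
            bitmap = render_fn(el)        # full dynamic render while time remains
            cache[el] = bitmap
        else:
            bitmap = cache[el]            # budget exhausted: reuse previous result
        output[el] = bitmap
    return output

# usage with a deliberately slow renderer and a ~10 ms budget
cache = {}
slow_render = lambda el: (time.sleep(0.004), f"bitmap({el})")[1]
print(render_elements_with_budget(["a", "b", "c", "d"], 0.010, slow_render, cache))
```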
The video rendering module of the present technique has two major characteristics. First, it has several dozen common graphics processing techniques built in; the default image processing modes include scaling, sharpening, blurring, feathering, mosaic, inversion, grayscale, linear adjustment, embossing, balance, hue, noise, graining, matting, warping, folding, masking, duplication, tiling, blending, transparency and dozens of other graphic processing methods, and more processing modules can be added through plug-ins to achieve further effects. Second, video rendering is performed by writing directly in the data format of the video frame; compared with the conventional video processing approach of converting the video's YUV color system to RGB, performing the graphics processing in RGB and then converting back, no complicated and tedious data format conversions are needed, which also improves video rendering speed to some extent.
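Working directly on the decoder's native frame format can be illustrated with a trivial luma adjustment on a planar YUV 4:2:0 buffer; for such an operation no YUV-to-RGB round trip is needed. This function is an illustrative example only, not one of the module's built-in effects, and the assumed frame layout (Y plane followed by U and V planes) is stated in the comment.

```python
def brighten_yuv420(frame: bytearray, width: int, height: int, delta: int) -> None:
    """Brighten a planar YUV 4:2:0 frame in place by adjusting only the Y plane.

    Assumed layout: Y plane (width*height bytes) followed by the U and V planes;
    chroma is untouched, so no color-space conversion is required."""
    y_size = width * height
    for i in range(y_size):
        frame[i] = max(0, min(255, frame[i] + delta))

# usage: a tiny 4x2 frame, all mid-gray luma
w, h = 4, 2
frame = bytearray([128] * (w * h) + [128] * (w * h // 2))  # Y plane + U/V planes
brighten_yuv420(frame, w, h, 40)
print(list(frame[:w * h]))   # [168, 168, ...]
```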
Finally, after video rendering is finished and before the next round of inter-frame processing begins, the interactive logic execution module executes the instructions defined for the current frame or activated by external devices. By executing these instructions, an external third-party program can be invoked, the playback of the video data stream can be controlled, other interactive video logic instructions can be read, and so on.
Fig. 3 is a flow chart of the terminal-oriented interactive video processing method according to the present invention. First, global identification processing is performed on the video data stream. This comprises: at step S301, performing data stream identification; at step S302, identifying the video frame information within the identified data stream; at step S303, obtaining the interactive logic data associated with the identified video frames, the data including, for example, text, images, animation, video, audio and control logic code scripts; and at step S304, preprocessing the obtained interactive logic data elements — in this step the features of the interactive elements are parsed, for example to analyze the times at which the interactive elements appear and the ordering of scenes, and the interactive elements are output to the target buffer. After the global identification of the video data stream, single-frame processing of the video frames is carried out. If interactive logic stream data has been obtained and a single frame of the video data stream is decoded successfully, the inter-frame processing flow is entered, and the following operations are performed between every two frame pictures: at step S305, responding to changes in the user input information and the frame content; at step S306, computing the state attributes of the interactive elements in real time, the state attributes being, for example, screen position, width and height, the three rotation angles, color, and so on; at step S307, performing element effect processing and primitive caching; and at step S308, outputting the video and executing the interactive logic. At step S309 it is judged whether the video data stream has ended; if not, steps S305-S308 are executed again until all video frames have been processed.
Referring to Figs. 4-10, the concrete operation of the terminal-oriented interactive video processing method of the present invention is illustrated. Fig. 4 illustrates a data stream identification method. The method comprises executing a dedicated watchdog code which, at step S401, monitors the video-related system API interfaces and I/O messages such as those of the mouse, keyboard, hard disk/optical disc and video peripherals; then, at step S402, it judges whether the system video module has been activated. If the system video module is activated, step S403 is executed: hardware devices, such as a camera, capture card, video card, USB or IEEE 1394 device, are detected; at steps S404 and S405, network data stream detection and local and optical-disc file module video playback detection are then carried out, and the method proceeds to step S406. If the system video module is not activated at step S402, the method goes directly to step S406. At step S406, the detection result data is output. Finally it is judged whether to unload the watchdog (S407); if it should be unloaded, the method ends, otherwise the method returns to step S401. That is, after a video-related module is found to be in use, the source of the video is first determined to be local, an optical disc, a network stream or an external hardware capture device; the identified data is assembled into a specific data stream and sent to the next processing module. The module runs in a relatively independent thread and can wait until it receives the exit-thread message before unloading itself.
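A rough Python sketch of the watchdog of Fig. 4, written as a polling thread, is given below. The class name, the probe callables and the polling interval are assumptions for the example; a real implementation would hook the video-related system APIs rather than poll.

```python
import threading
import time

class VideoWatchdog(threading.Thread):
    """Poll-style stand-in for the watchdog of Fig. 4: monitor video-related
    activity, detect the source, emit a result, and unload itself on request."""

    def __init__(self, probe_video_active, detect_source, report, poll_s=0.5):
        super().__init__(daemon=True)
        self.probe_video_active = probe_video_active  # callable -> bool   (S402)
        self.detect_source = detect_source            # callable -> str    (S403-S405)
        self.report = report                          # callable(dict)     (S406)
        self.poll_s = poll_s
        self._stop = threading.Event()                # the "unload watchdog" message (S407)

    def run(self):
        while not self._stop.is_set():                # S401: keep monitoring
            if self.probe_video_active():
                self.report({"active": True, "source": self.detect_source()})
            else:
                self.report({"active": False})
            time.sleep(self.poll_s)

    def unload(self):
        self._stop.set()

# usage with dummy probes
wd = VideoWatchdog(lambda: True, lambda: "local-file", print, poll_s=0.1)
wd.start(); time.sleep(0.25); wd.unload(); wd.join()
```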
Fig. 5 illustrates a method of identifying video frame information. This method can, for example, run in an independent thread. At step S501, video stream data is read synchronously; where necessary, the video stream data can of course also be read asynchronously. At step S502, the data format of the video frame is identified. At step S503 it is judged whether the format of the video frame is a supported video frame format. If it is not a supported format, step S508 is executed directly. If it is a supported format, then at step S504 information such as the video size and color bit depth is extracted and used to set the size of the interactive foreground frame buffer. At step S505 the built-in video metadata is extracted, the video metadata including, for example, meta-information such as the video name, the copyright owner and the video specification, and at step S506 meta-information such as the total playback time of the video stream, the frame rate, the frame count and the current frame is obtained. At step S507 the data watermark information in the video stream is detected, and any running watermark or digital watermark is restored to the specified information format. Finally, at step S508, the collected video frame recognition result data stream is output to the next-stage module for processing, and the identification of video frame information ends.
Fig. 6 illustrates a method of obtaining the interactive logic data stream. At step S601, the client submits the collected video meta-information and data watermark information to the application gateway; at S602 the application gateway decodes and reads the video information submitted by the client, analyzes and classifies it, and then submits the different pieces of information to concrete applications or query services (steps S603-S607). At steps S608-S614, the system obtains the user's operational authority and rights in the system or in a particular service according to the user's basic information, and, combining the basic information of the video (display size, program duration, frame rate, data stream format, current time), the meta-information (video name, copyright, copyright owner, etc.) and the additional data watermark information, further determines the characteristics of the video, the ownership of the copyright and the content plot and scene. Through the business logic that retrieves the scene/plot control and the generator of interactive element resources, a policy service for the interactive control of the video plot is formed and, together with the control strategy that accompanies video playback control, the interaction control script used for client control can be generated. At step S615 the resulting strategy is returned to the concrete service/data query to generate the interactive data for client control. The interactive data is then packed, encoded and compressed by the application gateway (S616) and downloaded to the client's computer. Finally, the client receives and decodes the interactive data (S617) and outputs it to the other modules.
Fig. 7 illustrates an interactive element preprocessing method. At step S701, interactive control data is read from the server, and at step S702 the interactive control data stream obtained from the server is parsed to obtain the interactive element feature definition description data and the interactive element behavior description data. At step S703 the interactive elements are sorted by frequency of occurrence — for example first by appearance time and scene order, and then by element usage frequency — so that a priority processing queue is formed at step S704. At step S705 it is judged whether an element has a high frequency of occurrence or needs to be displayed within the next x seconds. If so, the interactive element object content is generated (S706) and output to the target buffer (S707). If not, the processing ends directly.
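The preprocessing of Fig. 7 amounts to building a priority queue and pre-generating the elements that are frequent or about to appear. The sketch below assumes a usage-count threshold and a 10-second window purely for illustration; the patent leaves both unspecified ("x seconds").

```python
from dataclasses import dataclass
from typing import List

@dataclass
class InteractiveElement:
    name: str
    first_appearance_s: float   # when it first appears on the timeline
    uses: int                   # how often it is used over the programme

def build_priority_queue(elements: List[InteractiveElement],
                         now_s: float, window_s: float = 10.0):
    """Order elements as in Fig. 7: first by appearance time/scene order, then by
    usage frequency; pre-generate those that are frequent or due within window_s."""
    queue = sorted(elements, key=lambda e: (e.first_appearance_s, -e.uses))
    target_buffer = [e.name for e in queue
                     if e.uses >= 5 or e.first_appearance_s - now_s <= window_s]
    return queue, target_buffer

# usage
els = [InteractiveElement("ad-banner", 120.0, 2),
       InteractiveElement("hotspot", 3.0, 12),
       InteractiveElement("subtitle", 1.0, 40)]
queue, buffered = build_priority_queue(els, now_s=0.0)
print([e.name for e in queue], buffered)
```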
Fig. 8 illustrates a method of extracting and preprocessing data about picture changes and user input during inter-frame processing. It comprises detecting three kinds of information or events: the video stream data watermark, the video stream additional data marker, and mouse/keyboard activity events. Specifically, at step S801 the single-frame video data stream is first checked for the existence of a digital watermark; if no data watermark exists, the method goes directly to step S803; otherwise the watermark information is extracted at step S802 and the method proceeds to step S803. At step S803 the existence of additional data information in the data stream is detected; if it exists, can be recognized and is useful, it is extracted at step S804 and the method proceeds to S805; otherwise the method goes directly to step S805. At step S805 the mouse and keyboard event queue is extracted. If an input event has occurred, hot-zone detection of the video and the interactive elements (S806) and collision/occlusion detection (S807) are carried out, and at step S808 the interactive element state information is output and the processing ends; if no input event has occurred, the interactive element state information is output directly and the processing ends.
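Hot-zone detection (S806) and a simple rectangle-based collision/occlusion test (S807) can be sketched as follows; the rectangle model and the layer numbering are assumptions for the example.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ElementRect:
    name: str
    x: int
    y: int
    w: int
    h: int
    layer: int   # higher layers occlude lower ones

def hot_zone_hits(point: Tuple[int, int], elements: List[ElementRect]) -> List[str]:
    """Hot-zone detection (S806): which element rectangles contain the input point."""
    px, py = point
    return [e.name for e in elements
            if e.x <= px < e.x + e.w and e.y <= py < e.y + e.h]

def occluded_pairs(elements: List[ElementRect]) -> List[Tuple[str, str]]:
    """Occlusion detection (S807): (top, bottom) pairs whose rectangles overlap."""
    pairs = []
    for a in elements:
        for b in elements:
            if a.layer > b.layer and not (a.x + a.w <= b.x or b.x + b.w <= a.x or
                                          a.y + a.h <= b.y or b.y + b.h <= a.y):
                pairs.append((a.name, b.name))
    return pairs

els = [ElementRect("button", 10, 10, 50, 20, layer=2),
       ElementRect("banner", 0, 0, 100, 30, layer=1)]
print(hot_zone_hits((15, 15), els))   # ['button', 'banner']
print(occluded_pairs(els))            # [('button', 'banner')]
```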
Fig. 9 illustrates a method of computing the interactive element state attributes in real time between frames. The method runs independently as an asynchronous thread: interactive operation data requests are submitted to the server by an independent sub-thread rather than having the main thread process data and fetch data from the server at the same time. This improves the operating efficiency of the system and avoids dropped frames during playback. First, the interactive element state information is obtained at step S901. After the state information of an interactive element is received, the real-time delimiter is run and a processing timestamp is recorded (S902). At step S903 an unprocessed interactive element is taken from the interactive element queue, and at step S904 it is judged whether this interactive element object has been buffered. If it has not been cached, the method proceeds to S905, where an associated object is created and the interactive element data is cached. Caching can greatly reduce the CPU processing time: if there are many picture elements and the video is a continuous sequence of 24-30 frames per second, the cost of temporarily building an interactive element primitive object during the time-critical inter-frame processing is relatively high; if the interactive element and its element picture are cached and merely merged at the time of use, the CPU time is greatly reduced, and in some complex applications inter-frame processing efficiency is improved by roughly 30-50%. After the data has been buffered, it is judged at step S906 whether the attributes of the interactive element have changed. If so, the dynamic attributes of the interactive element are computed according to the basic running environment and the mouse/keyboard information input by the user (S907), and the method then proceeds to S908; if not, it goes directly to S908. At step S908 the message of the interactive element object change is broadcast. Finally, at step S909, the real-time delimiter decides, according to the running time, whether to process the next interactive element. This pattern mainly serves to guarantee real-time processing.
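The caching argument of Fig. 9 — rebuild an element's primitive only when its attributes change, otherwise reuse the cached one — can be illustrated as below. The attribute tuple used as the cache key and the simulated build cost are assumptions for the example.

```python
import time

class ElementPrimitiveCache:
    """Cache rendered element primitives between frames (Fig. 9, S904-S905):
    building a primitive is expensive, so rebuild it only when the element's
    attributes actually change, otherwise reuse the cached result."""

    def __init__(self, build_fn):
        self.build_fn = build_fn              # expensive primitive builder
        self._store = {}                      # element id -> (attrs, primitive)

    def get(self, element_id, attrs):
        cached = self._store.get(element_id)
        if cached is not None and cached[0] == attrs:
            return cached[1]                  # attributes unchanged: reuse (S906 "no")
        primitive = self.build_fn(element_id, attrs)   # changed or first use (S905/S907)
        self._store[element_id] = (attrs, primitive)
        return primitive

def slow_build(element_id, attrs):
    time.sleep(0.01)                          # stand-in for real rasterisation cost
    return f"{element_id}{attrs}"

cache = ElementPrimitiveCache(slow_build)
t0 = time.perf_counter(); cache.get("logo", (0, 0, 64, 64)); first = time.perf_counter() - t0
t0 = time.perf_counter(); cache.get("logo", (0, 0, 64, 64)); second = time.perf_counter() - t0
print(first > second)   # True: the cached path skips the expensive rebuild
```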
Fig. 10 illustrates a method of interactive element video rendering and primitive caching. The illustrated method runs independently as an asynchronous thread. At step S1001 the thread monitors system messages, and at step S1002 it judges whether the source effect of an interactive element needs to be updated. If an update is needed, a further judgment is made, namely a visibility check (S1003). If no update is needed, the method jumps directly to S1008. If the element is judged visible at step S1003, step S1004 is executed: the visual image of this element is generated and this source image is cached. The image is then scaled and rotated (S1005), other extended graphics processing interfaces are called (S1006), and the resulting image is cached again and placed in the temporary primitive buffer (S1007). At step S1008 the picture is processed for shadow, blur and position, and finally, at step S1009, it is composited into the visual foreground buffer and this module ends temporarily or is suspended.
Fig. 11 illustrates a method of outputting the video and executing the interactive logic. Specifically, at step S1101 an independent process monitors the frame refresh messages. At step S1102 it is judged whether there is a frame update message. If there is, then at step S1103 the picture of the visual interactive foreground frame is merged with the video data frame, and at step S1104 the composited video data stream is immediately output to any screen or to the codec of the next stage. If it is judged that there is no frame update message, step S1104 is executed directly. At step S1105 it is judged whether any other interactive control command exists. If there is another interactive control command, then at step S1106 the video stream is paused, jumped or switched according to the instruction, and, as the case may be, other extension interfaces are called at step S1107 to perform further interaction functions, until the processing of this frame interval is finished and the processing is temporarily suspended or terminated. If no other interactive control command exists, the method goes directly to the suspension or termination step.
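The merge of the interactive foreground frame with the video data frame at step S1103 can be illustrated with a trivial pixel-level composite; the None-means-transparent convention and the list-of-rows frame representation are assumptions made for the example.

```python
def composite_foreground(frame, foreground):
    """Merge the interactive foreground frame onto the decoded video frame
    (step S1103). Both are lists of rows of (r, g, b) tuples; a foreground
    pixel equal to None is treated as fully transparent."""
    out = []
    for frow, grow in zip(frame, foreground):
        out.append([g if g is not None else f for f, g in zip(frow, grow)])
    return out

# usage: 2x2 video frame, foreground draws one red pixel in the corner
video = [[(0, 0, 0), (0, 0, 0)],
         [(0, 0, 0), (0, 0, 0)]]
fg    = [[(255, 0, 0), None],
         [None, None]]
print(composite_foreground(video, fg))
# [[(255, 0, 0), (0, 0, 0)], [(0, 0, 0), (0, 0, 0)]]
```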
The concrete steps of the exemplary method of the present invention and the exemplary structure of the system of the present invention have been given above. These examples, however, are only illustrative and do not limit the present invention. The protection scope of the present invention is defined by the appended claims. Adaptive improvements and/or changes may be made to the present invention without departing from the scope of the appended claims.

Claims (22)

1. A client-based video stream interactive processing method, comprising the following steps:
performing global identification processing on a video data stream; and
if interactive logic stream data has been obtained, performing inter-frame processing on the video stream when a single frame of the video data stream is decoded successfully;
wherein the global identification processing of the video data stream comprises the following steps:
performing data stream identification;
identifying the video frame information within the identified data stream;
obtaining the interactive logic data associated with the identified video frames;
preprocessing the obtained interactive logic data elements;
the inter-frame processing comprises the following steps:
responding to changes in user input information and in frame content;
computing the state attributes of the interactive elements in real time;
performing element effect processing and primitive caching; and
outputting the video and executing the interactive logic;
wherein the data stream identification is performed according to the following operations:
monitoring the system API interfaces related to video;
judging whether the system video module is activated;
detecting hardware devices if the system video module is activated;
performing network data stream detection and file module video playback detection; and
outputting the detection result data;
identifying the video frame information within the identified data stream comprises the following steps:
reading the video stream data and identifying the data format of the video frames;
if the data format is a supported video frame format, extracting the relevant information to set the size of the interactive foreground frame buffer;
extracting the built-in video metadata and obtaining the meta-information related to playback of the video stream;
detecting the data watermark information in the video stream and restoring any running watermark or digital watermark to the specified information format;
outputting the recognition result of the video frames;
preprocessing the obtained interactive logic data elements comprises the following steps:
reading interactive control data from the server;
parsing the obtained interactive control data stream to obtain interactive element feature definition description data and interactive element behavior description data;
sorting the interactive elements by frequency of occurrence, and sorting them by element usage frequency, so as to form a priority processing queue; and
if there is an element that needs to be displayed, generating the interactive element object content and outputting it to the target buffer.
2. The client-based video stream interactive processing method according to claim 1, wherein the operation of reading the video stream data is performed synchronously.
3. The client-based video stream interactive processing method according to claim 1, wherein the operation of reading the video stream data is performed asynchronously.
4. The client-based video stream interactive processing method according to claim 1, wherein obtaining the interactive logic data associated with the identified video frames comprises the following steps:
the client submitting the collected video meta-information and data watermark information to the application gateway; and
the application gateway decoding and reading the video information submitted by the client, analyzing and classifying it, and then submitting the different pieces of information to concrete applications or query services.
5. The client-based video stream interactive processing method according to claim 1, wherein obtaining the interactive logic data associated with the identified video frames further comprises the following steps:
the system obtaining the user's operational authority and rights in the system or in a particular service according to the user's basic information, and determining the attributes of the video in combination with the relevant information of the video frames;
using the business logic and the generator of interactive element resources to form a policy service for interactive control of the video plot, thereby generating the interaction control script used for client control;
returning the resulting strategy to the concrete service/data query to generate the interactive data for client control; and
returning the above data to the client.
6. The client-based video stream interactive processing method according to claim 1, 4 or 5, wherein the interactive logic data comprises text, images, animation, video, audio and control logic code scripts.
7. The client-based video stream interactive processing method according to claim 1, wherein responding to changes in user input information and in frame content comprises the following steps:
if a digital watermark exists in the single-frame video data stream, extracting the watermark information;
if additional data information exists in the data stream, extracting the information;
extracting input events;
if an input event exists, performing hot-zone detection of the video and the interactive elements and collision/occlusion detection; and
outputting the interactive element state information.
8. The client-based video stream interactive processing method according to claim 1, wherein computing the state attributes of the interactive elements in real time comprises the following steps:
obtaining the interactive element state information, running the real-time operation delimiter, and recording a processing timestamp for the data processing;
taking an unprocessed interactive element from the interactive element queue;
if the interactive element is not buffered, creating an associated object and caching this interactive element;
if the attributes of the interactive element have changed, computing the dynamic attributes of the interactive element according to the basic running environment and the user's input information; and
broadcasting the message of the interactive element object change.
9. The client-based video stream interactive processing method according to claim 8, wherein the computing of the state attributes of the interactive elements in real time runs independently as an asynchronous thread.
10. The client-based video stream interactive processing method according to claim 1, wherein performing element effect processing and primitive caching comprises the following steps:
a thread monitoring system messages;
judging whether the source effect of an interactive element needs to be updated and, if so, performing a visibility check; if the element is visible, generating the visual image of the element and caching the image;
scaling and rotating the image;
calling other extended graphics processing interfaces and caching the resulting image again in the temporary primitive buffer; and
processing the picture for shadow, blur and position, and compositing it into the visual foreground buffer.
11. The client-based video stream interactive processing method according to claim 1, wherein outputting the video and executing the interactive logic comprises the following steps:
monitoring frame refresh messages with an independent process;
if there is a frame update message, merging the picture of the visual interactive foreground frame with the video data frame, and outputting the composited video data stream to any screen or to the codec of the next stage; and
if another interactive control command also exists, performing the corresponding video stream control operation according to the instruction, and calling other extension interfaces to perform further interaction functions, until the processing of this frame interval is finished.
12. A client-based video stream interactive processing system, comprising:
a data stream processing unit (11) for performing global identification processing on the video data stream; and a single-frame processing unit (12) for performing inter-frame processing on the video stream;
the data stream processing unit (11) further comprising: a data stream identification module (111) for identifying the source of the data stream; a video frame information extraction module (112) for identifying the various attribute information of the video frames; an interactive logic data acquisition module (113) for extracting from the server the various pieces of interactive information provided for the client; and an interactive element preprocessing module (114) for preprocessing the obtained interactive elements, such as forming an index queue and buffering;
the single-frame processing unit (12) further comprising: an input module (121) for responding to changes in frame content according to information input by the user; a real-time computing module (122) for computing the state attributes of the interactive elements in real time; a processing module (123) for performing effect processing and primitive caching on the elements; and an output module (124) for outputting the video and executing other visual operation instructions;
the data stream identification module (111) performing the following concrete operations: monitoring the system API interfaces related to video; judging whether the system video module is activated; detecting hardware devices if the system video module is activated; performing network data stream detection and file module video playback detection; and outputting the detection result data;
the video frame information extraction module (112) performing the following concrete operations: reading the video stream data and identifying the data format of the video frames; if the data format is a supported video frame format, extracting the relevant information to set the size of the interactive foreground frame buffer; extracting the built-in video metadata and obtaining the meta-information related to playback of the video stream; detecting the data watermark information in the video stream and restoring any running watermark or digital watermark to the specified information format; and outputting the recognition result of the video frames;
the interactive element preprocessing module (114) performing the following concrete operations: reading interactive control data from the server; parsing the obtained interactive control data stream to obtain interactive element feature definition description data and interactive element behavior description data; sorting the interactive elements by frequency of occurrence, and sorting them by element usage frequency, so as to form a priority processing queue; and, if there is an element that needs to be displayed, generating the interactive element object content and outputting it to the target buffer.
13. The client-based video stream interactive processing system according to claim 12, wherein the operation of reading the video stream data is performed synchronously.
14. The client-based video stream interactive processing system according to claim 12, wherein the operation of reading the video stream data is performed asynchronously.
15. The client-based video stream interactive processing system according to claim 12, wherein the interactive logic data acquisition module (113) performs the following concrete operations:
the client submitting the collected video meta-information and data watermark information to the application gateway; and
the application gateway decoding and reading the video information submitted by the client, analyzing and classifying it, and then submitting the different pieces of information to concrete applications or query services.
16. The client-based video stream interactive processing system according to claim 12, wherein the interactive logic data acquisition module (113) is further used to perform the following concrete operations:
the system obtaining the user's operational authority and rights in the system or in a particular service according to the user's basic information, and determining the attributes of the video in combination with the relevant information of the video frames;
using the business logic and the generator of interactive element resources to form a policy service for interactive control of the video plot, thereby generating the interaction control script used for client control;
returning the resulting strategy to the concrete service/data query to generate the interactive data for client control; and
returning the above data to the client.
17. The client-based video stream interactive processing system according to claim 12, 15 or 16, wherein the interactive logic data comprises text, images, animation, video, audio and control logic code scripts.
18. The client-based video stream interactive processing system according to claim 12, wherein the input module (121) is used to perform the following operations:
if a digital watermark exists in the single-frame video data stream, extracting the watermark information;
if additional data information exists in the data stream, extracting the information;
extracting input events;
if an input event exists, performing hot-zone detection of the video and the interactive elements and collision/occlusion detection; and
outputting the interactive element state information.
19. The client-based video stream interactive processing system according to claim 12, wherein the real-time computing module (122) is used to perform the following operations:
obtaining the interactive element state information, running the real-time operation delimiter, and recording a processing timestamp for the data processing;
taking an unprocessed interactive element from the interactive element queue;
if the interactive element is not buffered, creating an associated object and caching this interactive element;
if the attributes of the interactive element have changed, computing the dynamic attributes of the interactive element according to the basic running environment and the user's input information; and
broadcasting the message of the interactive element object change.
20. The client-based video stream interactive processing system according to claim 19, wherein the computing of the state attributes of the interactive elements in real time runs independently as an asynchronous thread.
21. The client-based video stream interactive processing system according to claim 12, wherein the processing module (123) is used to perform the following operations:
a thread monitoring system messages;
judging whether the source effect of an interactive element needs to be updated and, if so, performing a visibility check; if the element is visible, generating the visual image of the element and caching the image;
scaling and rotating the image;
calling other extended graphics processing interfaces and caching the resulting image again in the temporary primitive buffer; and
processing the picture for shadow, blur and position, and compositing it into the visual foreground buffer.
22. The client-based video stream interactive processing system according to claim 12, wherein the output module (124) is used to perform the following operations:
monitoring frame refresh messages with an independent process;
if there is a frame update message, merging the picture of the visual interactive foreground frame with the video data frame, and outputting the composited video data stream to any screen or to the codec of the next stage; and
if another interactive control command also exists, performing the corresponding video stream control operation according to the instruction, and calling other extension interfaces to perform further interaction functions, until the processing of this frame interval is finished.
CN2006101129068A 2006-09-11 2006-09-11 Client-based video stream interactive processing method and processing system Active CN1921610B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2006101129068A CN1921610B (en) 2006-09-11 2006-09-11 Client-based video stream interactive processing method and processing system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2006101129068A CN1921610B (en) 2006-09-11 2006-09-11 Client-based video stream interactive processing method and processing system

Publications (2)

Publication Number Publication Date
CN1921610A CN1921610A (en) 2007-02-28
CN1921610B true CN1921610B (en) 2011-06-22

Family

ID=37779145

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2006101129068A Active CN1921610B (en) 2006-09-11 2006-09-11 Client-based video stream interactive processing method and processing system

Country Status (1)

Country Link
CN (1) CN1921610B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103384311B (en) * 2013-07-18 2018-10-16 博大龙 Interdynamic video batch automatic generation method
CN103458320B (en) * 2013-08-29 2016-12-28 上海赛特斯信息科技股份有限公司 Realize the method that video adds digital watermarking
CN103491196B (en) * 2013-10-09 2017-01-04 百度在线网络技术(北京)有限公司 The acquisition methods of multimedia address and device in webpage
CN104079838A (en) * 2014-07-08 2014-10-01 丽水桉阳生物科技有限公司 Character generator with financial data caption making and playing function
CN105049955B (en) * 2015-07-02 2019-02-05 浙江工商大学 A kind of real-time method and system for passing screen
CN106851332B (en) * 2017-01-04 2019-09-20 北京百度网讯科技有限公司 Video stream processing method, device and system
CN108024117A (en) * 2017-11-29 2018-05-11 广东技术师范学院 A kind of method and system that loop filtering processing is carried out to video flowing
KR102512446B1 (en) * 2018-05-04 2023-03-22 구글 엘엘씨 Hot-word free adaptation of automated assistant function(s)
CN112104909A (en) * 2019-06-18 2020-12-18 上海哔哩哔哩科技有限公司 Interactive video playing method and device, computer equipment and readable storage medium
CN115037732B (en) * 2022-06-01 2024-04-23 中国电力科学研究院有限公司 Method, device, equipment and medium for remote real machine debugging through streaming media

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1728781A (en) * 2004-07-30 2006-02-01 新加坡科技研究局 Method and apparatus for insertion of additional content into video

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1728781A (en) * 2004-07-30 2006-02-01 新加坡科技研究局 Method and apparatus for insertion of additional content into video

Also Published As

Publication number Publication date
CN1921610A (en) 2007-02-28

Similar Documents

Publication Publication Date Title
CN1921610B (en) Client-based video stream interactive processing method and processing system
CN112291627B (en) Video editing method and device, mobile terminal and storage medium
JP4726097B2 (en) System and method for interfacing MPEG coded audio-visual objects capable of adaptive control
US6912726B1 (en) Method and apparatus for integrating hyperlinks in video
US11418832B2 (en) Video processing method, electronic device and computer-readable storage medium
CN101300567B (en) Method for media sharing and authoring on the web
US6573898B1 (en) Analysis of properties of effects for rendering and caching of media data
JP4908460B2 (en) System and method for enhanced visual presentation using interactive video streams
CN109325145B (en) Video thumbnail obtaining method, terminal and computer readable storage medium
JP2005108230A (en) Printing system with embedded audio/video content recognition and processing function
CN112804459A (en) Image display method and device based on virtual camera, storage medium and electronic equipment
JP2007534279A (en) Systems and methods for using graphics hardware for real time 2D and 3D, single and high definition video effects
WO2006115604A2 (en) Media timeline sorting
CN101076106A (en) Interdynamic video system of IPTV two-dimensional frame marked information
CN109587546A (en) Method for processing video frequency, device, electronic equipment and computer-readable medium
CN116210221A (en) Time alignment of MPEG and GLTF media
JP2023519372A (en) 3D video processing method, apparatus, readable storage medium and electronic equipment
US7941739B1 (en) Timeline source
JP2001167037A (en) System and method for dynamic multimedia web cataloging utilizing java(r)
CN1205539C (en) System and method for program-controlled generating continuous media representation
KR20080044872A (en) Systems and methods for processing information or data on a computer
US7692562B1 (en) System and method for representing digital media
US7934159B1 (en) Media timeline
US20050021552A1 (en) Video playback image processing
WO2009044351A1 (en) Generation of image data summarizing a sequence of video frames

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20181211

Address after: 100000 room 1403-2, 37 Building, 34 hospital, Chaoyang District, Beijing.

Patentee after: Beijing Xinguang digital cinema line Co., Ltd.

Address before: 100034 Building 303, No. 6, Xisi Bingma Hutong, Xicheng District, Beijing

Co-patentee before: Gong Xiangjing

Patentee before: Gong Xiangming

TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20220526

Address after: 100124 room 4026, 4 / F, building 24, second courtyard, shimencun Road, Chaoyang District, Beijing

Patentee after: Canal age (Beijing) Technology Development Co.,Ltd.

Address before: 100000 room 1403-2, 37 Building, 34 hospital, Chaoyang District, Beijing.

Patentee before: Beijing Xinguang Digital Cinema Line Co.,Ltd.