Detailed Description of the Embodiments
Referring now to the accompanying drawings, the system and method of the present invention are described in detail.
Fig. 2 shows a block diagram of the terminal-oriented interactive video processing system according to the present invention. As shown in Fig. 2, the terminal-oriented interactive video processing system comprises a data stream processing unit 11 and a single-frame processing unit 12. The data stream processing unit 11 comprises: a data stream identification module 111 for identifying the source of the data stream; a video frame information extraction module 112 for identifying the various attribute information of video frames; an interactive logic data acquisition module 113 for extracting, from a server, the various interactive information provided for the client; and an interactive element preprocessing module 114 for preprocessing the obtained interactive elements, such as building a queue index and buffering them. The single-frame processing unit 12 further comprises: an input module 121 through which the user inputs responses to changes in the frame content; a real-time computation module 122 for calculating the state attributes of the interactive elements in real time; a processing module 123 for performing effect processing and primitive caching on said elements; and an output module 124 for outputting video and executing the interactive logic.
The terminal-oriented interactive video processing system of the present invention can be loaded into a computer system in several ways. For example: the third-party extension interface or plug-in mechanism of an open media player or other video-capable software can be used to load the system actively; alternatively, for a legacy player or other video software without a third-party module loading function, the extension interfaces reserved in the operating system's video specification can be used to load the system automatically; or, if the player or other video software does not use the standard extension interfaces of the operating system, or its extension interfaces are restricted, a hook can be set on the video display window (screen) to load the system; in addition, for hardware or video software with live characteristics, such as cameras and video capture cards, the system can also be dynamically loaded onto the data stream by monitoring the API of the system's hardware driver layer. Loading programs through such extension interfaces and APIs is prior art in this field and is not repeated here. During loading, the system of the present invention automatically selects the best loading mode by detecting the characteristics of the playback program, the video file and the hardware modules.
After the generic video interactive module has been loaded, it waits for the system's trigger message for decoded compressed video data, together with the storage address of that data. Once the module detects video data, the interactive module handles it differently according to the actual situation. When the data stream triggers for the first time, video preprocessing is performed: data such as the entire video data stream, video file information and index information, and player and system information are obtained; the collected data is processed, submitted to the interactive application server, and the returned data is fetched and processed. When the video data stream triggers again, whether to execute the relevant inter-frame processing functions is decided according to whether interactive data has been obtained and according to the concrete instructions contained in that interactive data.
The data stream identification module 111 of the video data stream processing unit 11 first determines, according to the data encoding format of the video stream and the dynamically connected modules, whether the data stream comes from a digital video capture device, from a network video stream packet, or from a file on a local hard disk or optical disc. It then identifies the information of the data frames, such as the video width and height, color bit depth, color depth, video metadata, video frame rate, frame count, playback time and current time (when the processing thread runs in asynchronous mode or the video is a live network stream, the first preprocessed frame may not be start frame 0). When a specially defined tag is contained at a specific position in the video data stream, digital watermark extraction is performed on those frame data to obtain the information hidden in the digital watermark.
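As a minimal sketch of the kind of per-stream record the identification steps above could produce, the following Python snippet collects the attributes the text names (width, height, bit depth, frame rate, frame count, current frame, metadata, watermark) into one structure; all field names and the derived playback-time helper are illustrative assumptions, not the patent's actual data layout.

```python
# Hedged sketch: a record of the per-stream attributes module 112 is
# described as extracting. Field names are illustrative assumptions.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class FrameStreamInfo:
    width: int                 # video width in pixels
    height: int                # video height in pixels
    color_bits: int            # color bit depth
    frame_rate: float          # frames per second
    frame_count: int           # total number of frames (0 if live)
    current_frame: int = 0     # may be nonzero for live/async streams
    metadata: dict = field(default_factory=dict)   # video metadata
    watermark: Optional[bytes] = None  # hidden watermark payload, if any

def playback_seconds(info: FrameStreamInfo) -> float:
    """Derive total playback time from frame count and frame rate."""
    if info.frame_rate <= 0:
        return 0.0
    return info.frame_count / info.frame_rate

info = FrameStreamInfo(width=640, height=360, color_bits=24,
                       frame_rate=25.0, frame_count=250,
                       metadata={"name": "demo"})
print(playback_seconds(info))  # 10.0
```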
In the processing of the video data stream processing unit 11, digital watermark information is obtained from frames that carry a special mark. This technique is adopted mainly to better solve the problem of locating media plots and content chapters. By using digital watermark technology, specific scenes and plots of the content can be indexed without depending on the concrete file format or encoding algorithm. Since this digital watermark extraction is a technical means frequently adopted in the prior art, it is not elaborated here. With this technique in use, once the source film has been protected with a digital watermark, in most cases the original information can still be recovered even after the original video has undergone lossy graphics processing such as video transcoding, picture scaling, stretching, light softening or sharpening; this further expands the range of application of the interactive video function and increases its versatility.
After the video frame information extraction module 112 has extracted the various basic information of the video, the client sends this information, together with the client's system parameters, to the server side for processing. The server side classifies the uploaded data and stores it in a relational database. Then, according to the collected information, processing such as querying the client's rights, media identification, plot and scene recognition, and time-axis correction and positioning is completed; next, according to the business logic module in the database corresponding to the interactive application for this medium/plot, the corresponding interactive application data script is generated dynamically. Finally, after encryption, it is sent back to the client. This process is completed by the interactive logic data acquisition module 113.
After the client receives the interactive data, it parses the script item by item and builds a queue index for the interactive elements in the script. Then, according to statistics on how long each interactive element is used, the interactive element preprocessing module 114 buffers those frequently used interactive elements that would otherwise cause heavy resource consumption.
When the program has run to this point, the data stream preprocessing flow is finished; thereafter the processing mainly consists of inter-frame processing between video frames.
Inter-frame processing refers to the processing performed between two consecutive original video pictures; by applying interactive element effects on the frames in real time, a visual reconstruction of the original video picture is achieved. While the player is playing the video, the inter-frame processing function module is loaded between every two frame pictures. Inter-frame processing mainly comprises the following main flows: (1) the input module 121 detects changes in the video content and the user's various inputs, and outputs data such as the display priority of the interactive elements; (2) the real-time computation module 122 dynamically executes the interactive control script and calculates the attribute changes of the interactive elements; (3) the processing module 123 renders and caches the interactive elements; (4) the output module 124 outputs the video and executes other visual operation instructions.
In the inter-frame processing procedure, the sequence number of the current frame is obtained automatically; at the same time, watermark detection and data extraction are performed on the video data stream, the watermark handling in this flow being similar to that in the preprocessing flow. Next, the data stream is checked for an additional-data tag; if one is present and can be identified, the additional data is extracted. Then the most recent message events of external input devices such as the mouse and keyboard are detected. If an input device event falls within the video window, then, according to the input information, collision detection between the video and the interactive elements, and collision/occlusion detection among the interactive elements, are performed in the interactive element queue and in the temporary interactive element buffer produced during the previous inter-frame pass. Visual elements that cannot be displayed in this flow are marked to be skipped. At the same time, according to the input device information, the logical levels of the interactive elements are revised and sorted, so as to reduce the resource consumption of the system.
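The input pass described above can be sketched as a hit test over the element queue: the topmost layer that contains the pointer wins (a simple occlusion rule), and elements entirely outside the video window are tagged to be skipped. The element fields, the layer convention and the rectangle format are all assumptions made for illustration.

```python
# Hedged sketch of the inter-frame input pass: hit-testing a pointer
# event against queued interactive elements and marking off-screen
# elements to be skipped. Element fields are illustrative assumptions.

def hits(rect, x, y):
    """Axis-aligned rectangle hit test; rect = (left, top, w, h)."""
    l, t, w, h = rect
    return l <= x < l + w and t <= y < t + h

def process_input(elements, event, screen_w, screen_h):
    """elements: dicts with 'rect' and 'layer' (higher = on top).
    Returns the clicked element or None; tags off-screen ones 'skip'."""
    clicked = None
    for el in sorted(elements, key=lambda e: -e["layer"]):
        l, t, w, h = el["rect"]
        # Mark elements fully outside the video window to be skipped.
        el["skip"] = l >= screen_w or t >= screen_h or l + w <= 0 or t + h <= 0
        if clicked is None and not el["skip"] and hits(el["rect"], *event):
            clicked = el  # topmost layer wins, as an occlusion rule
    return clicked

els = [{"rect": (10, 10, 50, 50), "layer": 1},
       {"rect": (30, 30, 50, 50), "layer": 2},
       {"rect": (700, 10, 50, 50), "layer": 3}]
top = process_input(els, (40, 40), 640, 360)
print(top["layer"])    # 2: the upper layer occludes the lower one
print(els[2]["skip"])  # True: off-screen element is marked skipped
```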
After the display priority and logical layer relations of the interactive elements have been determined, the attributes of each interactive element are calculated in real time according to the interactive logic script. When this module runs, it first obtains the buffer queue of interactive elements for the current frame, and then calculates and renders the attributes of the elements one by one.
Before the interactive elements are calculated and rendered, a timestamp mark is made first; then an unprocessed interactive element is taken from the queue and its buffer is checked by the real-time delimiter. If buffered data exists, the next step is carried out; otherwise a new data object and buffer are created. Then each attribute of the interactive element is checked for changes (such as screen position, width and height, the three rotation angles, color and the like). If, on the corresponding frame of the script timeline, there is an associated active script and an instruction for the relevant attribute, the dynamic script is executed on the internal virtual machine according to the instruction to change the element attributes dynamically, and the renderer is then notified to re-render this interactive element with the new parameters; otherwise the renderer is told directly to render with the original buffered image. Finally the real-time delimiter estimates the time remaining before the next frame. If the remaining time is insufficient, the dynamic rendering of some elements is abandoned and the cached output of the previous frame is used directly, so as to guarantee the real-time characteristic of the processing.
The terminal must solve the real-time problem when playing video: if it cannot, frames are skipped when the user terminal plays a video file, which reduces the user's viewing quality. Therefore the present invention adopts the real-time delimiter technique: the rendering time of each element is counted and calculated, the rendering capability and the graphics cache of the interactive elements are allocated reasonably, and, by exploiting the persistence-of-vision characteristic of human sight, the number of direct renderings of interactive elements is reduced to the greatest extent, realizing real-time processing of more picture elements under limited computational resources. Through the above technical processing, the video picture rendering capability can be improved by nearly 30%.
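The real-time delimiter's budget behavior can be sketched as follows: stamp the start of the inter-frame pass, render elements while time remains within the frame budget, and fall back to each element's cached image once the budget is exhausted. The function and field names are illustrative, and a deterministic fake clock is injected so the behavior is reproducible.

```python
# Hedged sketch of the "real-time delimiter": render within a frame
# budget, then reuse cached images. Names are illustrative assumptions.

import time

def interframe_render(elements, frame_budget_s, render_one, now=time.monotonic):
    """Render elements while within frame_budget_s seconds; past the
    budget, reuse the previous frame's cached image instead."""
    start = now()
    results = []
    for el in elements:
        if now() - start < frame_budget_s:
            el["cache"] = render_one(el)            # fresh render, update cache
            results.append(("rendered", el["cache"]))
        else:
            results.append(("cached", el.get("cache")))  # reuse old image
    return results

# Deterministic fake clock: start time, then one check per element.
clock = iter([0.0, 0.0, 0.01, 0.04])
fake_now = lambda: next(clock)
els = [{"cache": "old-A"}, {"cache": "old-B"}, {"cache": "old-C"}]
out = interframe_render(els, 0.033, lambda e: "new", now=fake_now)
print([kind for kind, _ in out])  # ['rendered', 'rendered', 'cached']
```

The 0.033 s budget corresponds to roughly one frame at 30 fps; the third element misses the budget and falls back to its cached image, which is the degradation path the text describes.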
The video rendering module of the present technique has two major characteristics. First, it has dozens of common graphics processing techniques built in; the default image processing modes include scaling, sharpening, blurring, feathering, mosaic, inversion, grayscale, linear adjustment, embossing, balance, hue, noise, graining, scratch-like effects, distortion, folding, masking, duplication, tiling, blending, transparency and dozens of other graphic processing methods, and more processing modules can be added externally through plug-ins to realize further effects. Second, the video rendering processing writes directly to the native data format of the video frame. Compared with conventional video processing techniques, which must convert the video from its YUV color system to the RGB color system, perform RGB-to-RGB graphics processing, and then convert back to YUV, no complicated and tedious data format conversion is required in between, which also improves the video rendering speed to a certain degree.
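A brightness adjustment illustrates the second characteristic: brightness only affects the luma (Y) plane of YUV data, so the direct path can modify the Y plane in its native format and skip the YUV-to-RGB round trip entirely. The sketch below is a minimal illustration of that idea, not the patent's actual rendering code, and it represents a luma plane as a plain list of 0-255 values.

```python
# Hedged sketch: operating directly on the frame's native YUV data.
# Brightness touches only the Y (luma) plane, so no YUV<->RGB
# conversion is needed. The flat-list plane format is an assumption.

def brighten_yuv_direct(y_plane, delta):
    """Adjust brightness on the luma plane, clamped to the 0-255 range."""
    return [min(255, max(0, y + delta)) for y in y_plane]

y = [16, 128, 250]                  # a tiny luma plane
print(brighten_yuv_direct(y, 10))   # [26, 138, 255] -- clamped at 255
```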
Finally, after the video rendering is finished and before the next round of inter-frame processing begins, the interactive logic execution module executes the instructions defined for the current frame or activated by external devices. By executing these instructions, external third-party programs can be invoked, the playback of the video data stream can be controlled, other interactive video logic instructions can be read, and so on.
Fig. 3 is a flow chart of the terminal-oriented interactive video processing method according to the present invention. What is performed first is the identification processing of the video data stream as a whole, which includes: at step S301, performing data stream identification; at step S302, identifying the video frame information in the identified data stream; at step S303, obtaining the interactive logic data related to the identified video frames, said data comprising, for example, text, images, animation, video, audio, control logic code scripts and the like; and at step S304, preprocessing the obtained interactive logic data elements. In this step the features of the interactive elements are parsed, for example to analyze the appearance time and scene ordering of the interactive elements, and the interactive elements are output to a target buffer. After the whole-stream identification of the video data stream has been performed, single-frame processing of the video frames is carried out. If interactive logic stream data has been obtained and the single frame of the video data stream is decoded successfully, the inter-frame processing flow is entered, and the following operations are performed between every two frame pictures: at step S305, responding to the user's input information and to changes in the frame content; at step S306, calculating the state attributes of the interactive elements in real time, said state attributes being, for example, screen position, width and height, the three rotation angles, color and the like; then, at step S307, performing the effect processing and primitive caching of the elements; at step S308, outputting the video and executing the interactive logic; and at step S309, judging whether the video data stream has finished, and if not, returning to execute steps S305-S308 until all video frames have been processed.
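The overall control flow of Fig. 3 can be sketched as a short loop in which each step is a placeholder standing in for the module the text describes: S301-S304 run once as preprocessing, and S305-S308 repeat for every frame until the stream ends (S309). The function names and the trace representation are illustrative assumptions.

```python
# Hedged sketch of the Fig. 3 flow as a control loop; each step label
# is a stand-in for the corresponding module, not real processing.

def process_stream(frames, get_interactive_data):
    trace = ["S301:identify_stream", "S302:identify_frames"]
    data = get_interactive_data()          # S303: fetch logic data
    trace.append("S304:preprocess_elements")
    for _ in frames:                       # repeat until stream ends (S309)
        trace.append("S305:handle_input")
        trace.append("S306:compute_attributes")
        trace.append("S307:render_and_cache")
        trace.append("S308:output_and_execute")
    return data, trace

data, trace = process_stream(range(2), lambda: {"script": []})
print(trace.count("S308:output_and_execute"))  # 2: once per frame pair
```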
Referring to Figs. 4-10, the concrete operations of the terminal-oriented interactive video processing method of the present invention are illustrated by example. Fig. 4 illustrates a data stream recognition method. The method comprises executing a special watchdog routine which, at step S401, monitors the system API interfaces related to video, as well as I/O messages of the mouse, keyboard, hard disk/optical drive and video peripherals. Then, at step S402, it is judged whether the system video module has been activated. If the system video module has been activated, step S403 is executed: the hardware devices are detected, said hardware devices being, for example, a camera, capture card, video card, USB or IEEE 1394 device. Afterwards, at steps S404 and S405, network data stream detection and local/optical-disc file video playback detection are carried out, and the method then proceeds to step S406. If the system video module is not activated at step S402, the method goes directly to step S406. At step S406, the detection result data is output. Finally it is judged (S407) whether to unload the watchdog; if unloading is required the method ends, and if not, the method returns to execute step S401. That is, after a video-related module is found to be in use, the source of the video is first determined to be local, an optical disc, a network stream or an external hardware capture device; the identified data is formed into a specific data stream and sent to the next processing module. The module runs in a relatively independent thread, and can always wait until a thread exit message is obtained before dismantling itself.
Fig. 5 illustrates a method of identifying video frame information. This method can, for example, run in an independent thread. At step S501, video stream data is read in a synchronous manner; of course, the video stream data can also be read asynchronously where necessary. Then, at step S502, the data format of the video frame is identified. At step S503 it is judged whether the format of the video frame is a supported video frame format. If it is not a supported format, step S508 is executed directly. If it is a supported video frame format, then at step S504 information such as the video size and color bit depth is extracted and used to set the size of the interactive foreground frame buffer. At step S505 the built-in video metadata is extracted, said video metadata comprising, for example, meta-information such as the video name, copyright owner and video specification; and at step S506 meta-information such as the total playback time, frame rate, frame count and current frame of the video stream is obtained. At step S507 the data watermark information in the video stream is detected, and any running watermark or digital watermark therein is restored to the specified information format. Finally, at step S508, the collected recognition result data stream for the above video frames is output to the next-stage module, and the identification of the video frame information ends.
Fig. 6 illustrates a method of obtaining an interactive logic data stream. At step S601, the client submits the collected video meta-information and data watermark information to the application gateway; then at step S602 the application gateway decodes and reads the video information submitted by the client, analyzes and classifies it, and submits the different items of information to concrete application or query services (steps S603-S607). Then, at steps S608-S614, the system obtains, according to the user's basic information, the user's operational rights and entitlements in the system or in the concrete service. Combining the basic information of the video (display size, program time, frame rate, data stream format and current time), the meta-information (video name, copyright, copyright owner, etc.) and the data watermark additional information and the like, it further locates the characteristics of the video, the ownership of its copyright, and its content, plot and scenes; by retrieving the business logic for scene/plot control and the generator of interactive element resources, the policy service for interactive control of the video plot is formed, and, combined with the control strategy cooperating with video playback control, the interaction control script used for client control can be generated. The resulting strategy is then returned to the concrete service/data query, which generates the interactive data for client control at step S615. The interactive data is then packed, encoded and compressed by the application gateway (S616) and downloaded onto the client's computer. Finally, the client receives and decodes the interactive data (S617) and outputs the interactive data to the other modules.
Fig. 7 illustrates a method of preprocessing interactive elements. At step S701, interactive control data is read from the server; at step S702, the interactive control data stream obtained from the server is parsed to obtain the interactive element characteristic definition description data and the interactive element behavior description data. Then, at step S703, the interactive elements are sorted by frequency of occurrence: for example, they are first sorted by appearance time and scene order, and then sorted again by element usage frequency, thereby forming a priority processing queue at step S704. Then, at step S705, it is judged whether an element occurs frequently or needs to be displayed within the next x seconds. If so, the interactive element object content is generated (S706) and output to the target buffer (S707). If not, the processing ends directly.
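The ordering and pre-buffering decision of steps S703-S705 can be sketched as a two-key sort followed by a filter: sort by appearance time and then by usage frequency (descending), and pre-buffer only elements that are used often or appear within the next x seconds. The field names, the 5-second window and the frequency threshold are assumptions chosen for illustration.

```python
# Hedged sketch of the S703-S705 queue ordering and pre-buffer filter.
# Field names and thresholds are illustrative assumptions.

def preprocess_queue(elements, now, x=5.0, freq_threshold=10):
    """Sort by appearance time then frequency; pick elements to buffer."""
    queue = sorted(elements, key=lambda e: (e["appear_at"], -e["freq"]))
    to_buffer = [e for e in queue
                 if e["freq"] >= freq_threshold        # used frequently
                 or 0 <= e["appear_at"] - now <= x]    # appears soon
    return queue, to_buffer

els = [{"id": "b", "appear_at": 30.0, "freq": 2},
       {"id": "a", "appear_at": 2.0, "freq": 1},
       {"id": "c", "appear_at": 40.0, "freq": 50}]
queue, buf = preprocess_queue(els, now=0.0)
print([e["id"] for e in queue])  # ['a', 'b', 'c']
print([e["id"] for e in buf])    # ['a', 'c']: 'b' is rare and far away
```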
Fig. 8 illustrates a method, performed during inter-frame processing, of data extraction and preprocessing for picture changes and changes in user input. It comprises the detection of three kinds of information or events: detecting the video stream data watermark, detecting the video stream additional data mark, and detecting mouse/keyboard activity events. Specifically, at step S801, the single-frame video data stream is first checked for the presence of a digital watermark. If no data watermark exists, the method goes directly to step S803; otherwise, if a digital watermark exists, the watermark information is extracted at step S802, and the method then proceeds to step S803. At step S803 the data stream is checked for the existence of additional data information; if identifiable and useful information exists, it is extracted at step S804 and the method then proceeds to S805; otherwise the method goes directly to step S805. At step S805, the mouse and keyboard event queue is extracted. If an input event has occurred, hot-zone detection of the video and interactive elements is performed (S806), followed by collision/occlusion detection (S807); then step S808 is executed to output the interactive element state information and end the processing. If no input event has occurred, the interactive element state information is output directly and the processing ends.
Fig. 9 illustrates a method of calculating the state attributes of interactive elements in real time between frames. The method runs independently in an asynchronous thread mode; that is, requests for interactive operation data are submitted to the server by an independent sub-thread, rather than having the main thread perform data processing and fetch data from the server at the same time. Such a mode improves the operating efficiency of the system and avoids frame skipping during playback. First, the interactive element state information is obtained at step S901. After the state information of the interactive elements is received, the real-time delimiter is run and a processing timestamp is stamped on the data processing (S902). At step S903, an unprocessed interactive element is then extracted from the interactive element queue, and at step S904 it is judged whether this interactive element object has been cached. If it has not been cached, the method proceeds to S905, where a related object is created and the data of this interactive element is cached. Caching can greatly reduce the CPU processing time. That is to say, if there are many picture elements, and the video is a continuous sequence of 24-30 frames per second, then in inter-frame processing, which has high real-time requirements, the cost of temporarily creating an interactive element primitive object each time is large; if instead the interactive element and its element picture are cached, and merely merged in when used, the CPU time is greatly reduced, and in some complex applications inter-frame processing efficiency is raised by about 30-50%. After the data has been buffered, it is judged at step S906 whether the attributes of the interactive element have changed. If so, the dynamic attributes of the interactive element are calculated (S907) according to the basic running environment and the mouse/keyboard information input by the user, and the method then proceeds to S908; if not, the method goes directly to S908. At step S908, the message of the interactive element object change is broadcast. Then, at step S909, the real-time delimiter determines, according to the running time, whether to process the next interactive element. Such a mode is mainly intended to guarantee the real-time processing characteristic.
Fig. 10 illustrates a method of interactive element video rendering and primitive caching. The illustrated method runs independently in an asynchronous thread mode. At step S1001 the thread monitors system messages, and at step S1002 it is judged whether the source effect of the interactive element needs to be updated. If an update is needed, it is further determined whether the element is visible, i.e. a visibility detection is performed (S1003). If no update is needed, the method jumps directly to S1008. If it is judged at step S1003 that the element is visible, step S1004 is executed: an attempt is made to generate the visual image of this element, and this source graphic is cached. Afterwards the image is scaled and rotated (S1005), other extended graphics processing interfaces are called (S1006), and the resulting graphic is cached again and placed into the temporary primitive buffer (S1007). Then, at step S1008, the picture is processed for shadow, blurring and position, and finally, at step S1009, it is superimposed into the visual foreground buffer and this module temporarily finishes or suspends.
Fig. 11 illustrates a method of outputting video and executing interactive logic. Specifically, at step S1101 an independent process monitors frame refresh messages. At step S1102 it is then judged whether there is a frame update message. If there is a frame update message, then at step S1103 the picture of the visual interactive foreground frame and the video data frame are merged, and at step S1104 the synthesized video data stream is immediately output to any screen or to the next-stage codec and the like. If it is judged that there is no frame update message, step S1104 is executed directly. At step S1105 it is judged whether other interactive control commands exist. If other interactive control commands exist, then at step S1106 video stream pausing, jumping, switching control and the like are performed according to the instructions, and, depending on the circumstances, other extension interfaces are called at step S1107 to execute more interaction functions, until this inter-frame processing is finished and the processing is finally temporarily suspended or terminated. If no other interactive control command exists, the method goes directly to the suspend or terminate step.
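The merge of step S1103 can be sketched as overlaying the interactive foreground frame onto the video frame before output. In the toy sketch below a pixel is a single gray value and `None` marks a transparent foreground pixel; a real system would blend full-color planes, so only the control flow is illustrated, not the pixel format.

```python
# Hedged sketch of step S1103: compositing the interactive foreground
# frame over the video frame. None = transparent foreground pixel.

def merge_frames(video, foreground):
    """Overlay foreground onto video; None in foreground lets the
    underlying video pixel show through."""
    assert len(video) == len(foreground)
    return [fg if fg is not None else bg
            for bg, fg in zip(video, foreground)]

video = [10, 20, 30, 40]
overlay = [None, 255, None, 0]   # interactive elements cover pixels 1 and 3
print(merge_frames(video, overlay))  # [10, 255, 30, 0]
```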
The concrete steps of exemplary methods of the present invention and exemplary structures of the system of the present invention have been given above. However, the above examples are merely illustrative and do not limit the present invention. The protection scope of the present invention is defined by the appended claims. Adaptive improvements and/or changes may be made to the present invention without departing from the scope of the appended claims.