NZ619460B2 - Methods and apparatus for an embedded appliance - Google Patents
- Publication number
- NZ619460B2, NZ619460A, NZ61946012A
- Authority
- NZ
- New Zealand
- Prior art keywords
- media
- signal
- media signal
- format
- parameters
- Prior art date
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/02—Handling of images in compressed format, e.g. JPEG, MPEG
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2340/0407—Resolution change, inclusive of the use of different resolutions for different screen areas
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2340/0407—Resolution change, inclusive of the use of different resolutions for different screen areas
- G09G2340/0435—Change or adaptation of the frame rate of the video stream
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/06—Colour space transformation
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/04—Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller
- G09G2370/042—Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller for monitor identification
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/12—Use of DVI or HDMI protocol in interfaces along the display data pipeline
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/22—Detection of presence or absence of input display information or of connection or disconnection of a corresponding information source
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/003—Details of a display terminal, the details relating to the control arrangement of the display terminal and to the interfaces thereto
- G09G5/005—Adapting incoming signals to the display format of the display terminal
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/003—Details of a display terminal, the details relating to the control arrangement of the display terminal and to the interfaces thereto
- G09G5/006—Details of the interface to the display terminal
- G09G5/008—Clock recovery
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/25—Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
- H04N21/262—Content or additional data distribution scheduling, e.g. sending additional data at off-peak times, updating software modules, calculating the carousel transmission frequency, delaying a video stream transmission, generating play-lists
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/42203—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] sound input device, e.g. microphone
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/4223—Cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/432—Content retrieval operation from a local storage medium, e.g. hard-disk
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/433—Content storage operation, e.g. storage operation in response to a pause request, caching operations
- H04N21/4334—Recording operations
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
- H04N21/4402—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
- H04N21/4402—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
- H04N21/440218—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by transcoding between formats or standards, e.g. from MPEG-2 to MPEG-4
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/478—Supplemental services, e.g. displaying phone caller identification, shopping application
- H04N21/4788—Supplemental services, e.g. displaying phone caller identification, shopping application communicating with other users, e.g. chatting
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/04—Synchronising
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/01—Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/804—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components
- H04N9/8042—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components involving data reduction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/82—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
- H04N9/8205—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
Abstract
An apparatus comprising a media module (430A, 430B) and a modification module included in an embedded appliance is disclosed. The media module is configured to receive a first media signal associated with a first input port of the embedded appliance and a second media signal associated with a second input port of the embedded appliance. The media module is configured to identify, at a first time, a first set of media signal parameters based on the first media signal. The modification module is configured to receive a modification instruction associated with a session format having a second set of media signal parameters different from the first set. The modification module is configured to modify the first media signal based on the first set of media signal parameters and the modification instruction to produce a first modified media signal in the session format and having the second set of media signal parameters. The media module is configured to identify, at a second time after the first time, a third set of media signal parameters different from the first set and based on the first media signal. The modification module is configured to receive a second modification instruction associated with the session format, and to modify the first media signal based on the third set of media signal parameters and the second modification instruction to produce a second modified media signal in the session format and having the second set of media signal parameters.
Description
METHODS AND APPARATUS FOR AN EMBEDDED APPLIANCE
CROSS-REFERENCE TO RELATED APPLICATION
This application claims priority to U.S. Provisional Application No. 61/503,472,
filed June 30, 2011, and entitled “METHODS AND APPARATUS FOR AN EMBEDDED
APPLIANCE,” the disclosure of which is hereby incorporated herein by reference in its
entirety.
BACKGROUND
Some embodiments relate generally to an apparatus and method for an embedded
appliance.
The ability to capture live media recordings of, for example, classroom instruction
and meetings for on-demand availability and time-shifted viewing has become valuable to
institutions such as universities and businesses. Although some commercial solutions for
capturing and publishing live recordings are known, these solutions are often implemented
on general purpose devices such as a personal computer (PC). Because these PC-based
capture solutions use general purpose components and software, they are expensive, difficult
to maintain, inefficient when capturing and storing signals, vulnerable to security threats,
require special technical support and can be difficult to integrate into, for example, a smart
classroom environment. Thus, a need exists for a purpose-built multimedia capture device.
In some embodiments, an apparatus comprises a media module and a
modification module included in an embedded appliance. The media module is configured to
receive a first media signal associated with a first input port of the embedded appliance and a
second media signal associated with a second input port of the embedded appliance. The
media module is configured to identify a first set of media signal parameters based on the
first media signal. The modification module is configured to receive a modification
instruction associated with a session format having a second set of media signal parameters
different from the first set of media signal parameters. The modification module is
configured to modify the first media signal based on the first set of media signal parameters
and the modification instruction to produce a first modified media signal in the session
format and having the second set of media signal parameters.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a system block diagram that illustrates embedded appliances coupled to
a control server over a network, according to an embodiment.
FIG. 2 is a system block diagram that illustrates an embedded appliance having
input ports, a processor, a memory and multiple modules, according to an embodiment.
FIG. 3 is a block diagram that shows the flow of media signals through a control
server, according to an embodiment.
FIG. 4 is a system block diagram that illustrates an embedded appliance having
two sets of input ports associated with two sets of modules, a processor, and a memory,
according to an embodiment.
FIG. 5 is a flowchart that illustrates a method of using an embedded appliance,
according to an embodiment.
FIG. 6 is a block diagram that illustrates a hardware detection module coupled to
a software detection module configured to measure and test the timing of horizontal and
vertical sync pulses in an embedded appliance, according to an embodiment.
FIG. 7 is a flowchart illustrating a method of detecting or identifying a video
standard for signals received in an embedded appliance, according to an embodiment.
FIG. 8 is a schematic illustration of VGA (Video Graphics Array) sync signals,
according to an embodiment.
FIG. 9 is a schematic illustration of the frame parameters that make up the timing
for a VGA frame, according to an embodiment.
DETAILED DESCRIPTION
An embedded appliance for multimedia capture (also referred to herein as an
“embedded appliance”) is a device dedicated to capturing, processing, storing and/or sending
real-time media signals (e.g., audio signal, video signal, visual-capture signal, digital-image
signal). The embedded appliance can capture real-time media signal(s) that can include
digital-image signals, visual-capture signals, audio signals and/or video signals of, for
example, an in-progress classroom presentation. As the media signal(s) are being captured,
the embedded appliance can process and/or otherwise modify the signal(s) in real-time by,
for example, compressing, indexing, encoding, decoding, synchronizing and/or formatting,
for example, deinterleaving, decimating, scaling, modifying gain, modifying audio levels,
and/or audio multiplexing, the content. Embedded appliances can be, for example,
distributed throughout a network and coordinated according to a schedule to capture, process,
store and send the real-time media signals for eventual retrieval by a user from, for example,
a control server and/or a server(s) configured as, for example, a course management system.
Media streams being captured on the embedded appliance optionally can also be monitored
and/or further processed by a control server before distribution.
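As one illustration of the kind of real-time modification listed above, frame decimation (dropping frames to reduce, say, a 60 fps capture to 30 fps) can be sketched as follows. This is a hypothetical sketch; the patent does not specify the appliance's pipeline at this level:

```python
def decimate(frames, in_rate, out_rate):
    """Keep every n-th frame to reduce in_rate to out_rate.
    Assumes in_rate is an integer multiple of out_rate."""
    step = in_rate // out_rate
    return frames[::step]

frames = list(range(60))        # one second of a 60 fps capture
out = decimate(frames, 60, 30)  # keep every 2nd frame
assert len(out) == 30
```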
As a dedicated (i.e., specific-purpose) device having an embedded environment,
the embedded appliance uses a hardened operating system (OS) and a processor (e.g.,
processor system) to capture, process, store and/or send real-time media signals. The
hardened OS is configured to resist security attacks (e.g., prevent access by an unauthorized
user or program) and facilitate functions related only to the capturing, processing, storing
and/or sending of real-time media signals. In other words, the hardware and software within
the embedded appliance are integrated into and designed specifically for capturing,
processing, storing and/or sending real-time media signals. Because the hardware and
software for capturing, processing, storing and/or sending real-time media signals are
integrated into the embedded environment of the embedded appliance, the costs and
complexity associated with installation, scaling, design, deployment and technical support
can be lower than that for a general purpose system.
A real-time media signal represents an image and/or a sound of an event that is
being acquired by a sensor at substantially the same time as the event is occurring and that is
transmitted without a perceivable delay between the sensor when acquired and an embedded
appliance. The capturing, processing, storing and/or sending of the real-time media signals
by the embedded appliance can be performed at any time. Throughout the specification, real-
time media signals are also referred to as media signals.
In some embodiments, an embedded appliance can include a media module and a
modification module. The media module can be configured to receive a first media signal
from a first input port of the embedded appliance and a second media signal from a second
input port of the embedded appliance. The first media signal and the second media signal
can be, for example, an audio signal received at an audio input port of the embedded
appliance, a visual-capture media signal received at a visual-capture input port of the
embedded appliance, a video media signal received at a video input port of the embedded
appliance, or a digital-image media signal received at a digital-image input port of the
embedded appliance.
The media module can be configured to identify a first set of media signal
parameters based on the first media signal. The first set of media signal parameters can
include, for example, a resolution of the first media signal, a frame rate of the first media
signal, a bit rate of the first media signal, or a clock rate of the first media signal.
The modification module can be configured to receive a modification instruction
associated with a session format having a second set of media signal parameters different
from the first set of media signal parameters. In some embodiments, the session format is
one from a set of predefined session formats, where each predefined session format from the
set of predefined session formats is associated with a predefined set of media signal
parameters from a group of predefined sets of media signal parameters. In such
embodiments, the media module can be configured to identify the first set of media signal
parameters from the group of predefined sets of media signal parameters. In some
embodiments, the session format can be selected from the set of predefined session formats
based on, for example, the first set of media signal parameters, a user-selected output
parameter, or a capability of the embedded appliance.
Furthermore, the modification module can be configured to modify the first media
signal based on the first set of media signal parameters and the modification instruction to
produce a first modified media signal in the session format and having the second set of
media signal parameters. In some embodiments, the modification module can be configured
to modify the first media signal by performing on the first media signal, for example,
deinterleaving, decimating, resizing, color space converting, modifying gain, adjusting audio
level, or audio multiplexing.
As used in this specification, the singular forms “a,” “an” and “the” include plural
referents unless the context clearly indicates otherwise. Thus, for example, the term “an audio
input port” is intended to mean a single audio input port or a combination of audio input
ports.
FIG. 1 is a block diagram that illustrates embedded appliances 100 distributed
across a network 110 and connected to a control server 120. The control server 120, in this
embodiment, is connected with a server 130 that is configured, for example, as a course
management system (e.g., a server running Blackboard™, WebCT, and/or Moodle). The
network 110 can be any type of network including a local area network (LAN) or wide area
network (WAN) implemented as a wired or wireless network in a variety of environments
such as, for example, an office complex or a university campus. The embedded appliances
100 can capture real-time media signals including audio signals, visual-capture signals,
digital-image signals and/or video signals acquired through electronic capture devices or
sensors such as microphones, web cameras, video cameras, still cameras and video players.
The embedded appliances 100 can also be configured to process, store and/or send (e.g.,
streaming the signal over a network using a real-time protocol, such as RTP) captured real-
time media signals. Data associated with the content captured by real-time media signals can
also be processed, stored, and/or sent; such data can include, for example, capture time,
capture location, and/or speaker’s name.
The embedded appliances 100 can be prompted to start and stop capturing real-
time media signals in response to start and stop indicators generated by, for example, the
control server 120 or the embedded appliances 100. The start and stop indicators can be
generated according to a schedule determined and/or stored by the control server 120 and/or
each embedded appliance 100. If implemented in, for example, a university campus
environment, embedded appliances 100 can be fixed in university classrooms and connected
via a university communications network. An embedded appliance 100 can be prompted, for
example, according to a schedule stored on the embedded appliance 100 to capture media
signals from a particular university classroom at a specific time.
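The schedule-driven start/stop behavior described above can be sketched as follows. This is a minimal illustration only; the `CaptureEvent` structure, its field names, and the exact-time matching rule are hypothetical, not taken from the appliance's actual scheduler:

```python
from dataclasses import dataclass

@dataclass
class CaptureEvent:
    room: str
    start: int   # scheduled start time, epoch seconds
    stop: int    # scheduled stop time, epoch seconds

def indicator_for(schedule, room, now):
    """Return "start", "stop", or None for the appliance in `room` at time `now`."""
    for event in schedule:
        if event.room != room:
            continue
        if event.start == now:
            return "start"
        if event.stop == now:
            return "stop"
    return None

# A stored schedule prompting capture in one classroom at a specific time:
schedule = [CaptureEvent("ENG-101", start=1000, stop=4600)]
print(indicator_for(schedule, "ENG-101", 1000))  # start
print(indicator_for(schedule, "ENG-101", 4600))  # stop
print(indicator_for(schedule, "ENG-101", 2000))  # None
```

In practice the schedule could live on the control server or on the appliance itself, as the passage above notes.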
In some embodiments, media signals captured by each embedded appliance 100
can be processed, stored and sent to the control server 120. The control server 120 processes
the media signals and sends them to the server 130 where the content of the media signals is
made available for distribution. In other embodiments, although not shown in FIG. 1, the
embedded appliances 100 can be coupled to the server 130, and media signals captured by
each embedded appliance 100 can be processed, stored and sent to the server 130 without
going through the control server 120. The content of the media signals is then made
available for distribution at the server 130.
In some embodiments, the content of the media signals can be made available for
distribution to a user 140 at the control server 120 or the server 130. In some embodiments,
the content of the media signals can be made available for distribution to a user substantially
immediately (e.g., in real-time), can be stored for distribution at a time other than real-time,
and/or can be simultaneously provided to a user in real-time and stored for distribution at a
later time. In some embodiments, further processing of the media signals can be performed
on the control server 120, the server 130 and/or another processing device (not shown in
FIG. 1) before the content of the media signals is made available for distribution. The embedded
appliances 100, the control server 120 and/or the server 130 can process the media signals
by, for example, compressing, indexing, encoding, decoding, synchronizing and/or
formatting, for example, deinterleaving, decimating, scaling, modifying gain, modifying
audio levels, and/or audio multiplexing, the media signals.
The embedded appliances 100 can be prompted to start and stop sending
processed real-time media signals in response to start and/or stop indicators generated by, for
example, the control server 120 or the embedded appliance 100. The start and/or stop
indicators can be generated according to a schedule or according to defined conditions. In
some embodiments, the start and/or stop indicator can be a trigger signal generated by a
trigger generator within a control server and received by a trigger receiver within an
embedded appliance. More details regarding trigger signals in the context of video signal
capturing are set forth in U.S. Patent Application No. 10/076,872, Publication No. US
2002/0175991 A1, “GPI Trigger Over TCP/IP for Video Acquisition,” which is incorporated
herein by reference.
The embedded appliances 100 can also be configured to send media signals after
any stage of processing. For example, an embedded appliance 100 can be configured to send
to the control server 120, based on network traffic conditions, unsynchronized and
unformatted portions of audio and digital-image signals after the signals have been
encoded. The control server 120 can be configured to synchronize and format the audio and
digital-image signals received from the embedded appliance 100.
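Server-side synchronization of such unsynchronized audio and digital-image portions could be sketched as nearest-timestamp pairing, under the simple assumption that each captured item carries a timestamp. The function and data layout below are illustrative only, not the control server's actual method:

```python
def synchronize(audio_chunks, image_frames):
    """Pair each captured image with the audio chunk whose timestamp is closest.
    Each item is a (timestamp_ms, payload) tuple; both lists are assumed non-empty."""
    paired = []
    for ts, frame in image_frames:
        nearest = min(audio_chunks, key=lambda chunk: abs(chunk[0] - ts))
        paired.append((ts, frame, nearest[1]))
    return paired

# Audio arrives every 40 ms; images arrive irregularly:
audio = [(0, "a0"), (40, "a1"), (80, "a2")]
frames = [(5, "f0"), (78, "f1")]
print(synchronize(audio, frames))
# [(5, 'f0', 'a0'), (78, 'f1', 'a2')]
```

A real implementation would also have to account for clock drift between capture devices, which this sketch ignores.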
The capturing of media signals on the embedded appliance 100 can also be
monitored by the control server 120 through, for example, a confidence monitoring signal.
Examples of confidence monitoring are described in U.S. Patent No. 7,720,251, entitled
“Embedded Appliance for Multimedia Capture,” which is herein incorporated by reference in
its entirety (the ’251 patent).
Although FIG. 1 only shows a single control server 120 connected with multiple
embedded appliances 100, in other embodiments, more than one
control server 120 can be connected with any combination of embedded appliances 100. For
example, two control servers 120 can be configured to coordinate the capturing, processing,
storing and/or sending of media signals captured by embedded appliances 100. The
embedded appliances 100 can be programmed to recognize multiple control servers 120 and
can be programmed to, for example, send a portion of a processed media signal to one of the
control servers 120.
More specifically, as discussed further below, a given control server (e.g., control
server 120) can be configured to generate and send instructions (e.g., modification
instructions, requirements on desired output media signals) to the embedded appliance 100,
such that modules in the embedded appliance 100 can perform signal detection, modification,
encoding, and/or the like, based on the instructions. A separate storage server can be
configured to receive output media signals from the embedded appliance 100, process the
output media signals to make them available for users, and/or distribute the output media
signals to other devices and users.
The control server that generates and sends instructions to the embedded
appliance can receive user input that specifies the desired output media signals, such as
desired characteristics and/or parameters for the output media signals. Such user input can
be received before the particular format of the input media devices at the embedded
appliance is known or before the media signals are received at the embedded appliance. The
control server can send the instructions based on the user input to the embedded appliance so
that the requirements on the desired output media signal can be generated within the
embedded appliance based on the instructions, as described below in connection with FIG. 2.
Alternatively, in other embodiments, the requirements on the desired output media signal can
be received at the embedded appliance 200 from an external resource such as, for example, a
user (e.g., via a direct control signal) and/or any other type of external device that controls
the embedded appliance.
FIG. 2 is a system block diagram that illustrates an embedded appliance 200 with
input ports 210, a media module 230, a modification module 232, an encoding module 234, a
processor 250, and a memory 260. The embedded appliance 200 can be structurally and
functionally similar to the embedded appliances 100 shown and described with respect to
FIG. 1. While FIG. 2 depicts the processor 250 coupled to the media module 230 and the
modification module 232 via the encoding module 234, in some embodiments, the processor
250 can be directly coupled to the media module 230 and/or the modification module 232. In
such embodiments, the processor 250 can send instructions and/or control signals directly to
the media module 230 and/or the modification module 232, via, for example, a bus (not
shown in FIG. 2).
The embedded appliance 200 captures real-time media signals from various
electronic devices via the input ports 210 in response to start and stop indicators generated
by, for example, a scheduler (not shown in FIG. 2) in the embedded appliance 200, a
scheduler in the control server 220, and/or from a direct control signal 240 from a user via a
user interface (not shown in FIG. 2) of the embedded appliance 200. In some embodiments,
the embedded appliance 200 can include an alarm module (not shown in FIG. 2). Examples
of schedulers and alarm modules are described in the ’251 patent.
The embedded appliance 200 receives and processes and/or modifies the media
signals using the media module 230, the modification module 232, and/or the encoding
module 234. Said another way, the embedded appliance 200 can receive a raw (or native)
media signal(s), and send and/or store a processed and/or modified media signal (“encoded
media signal”). The embedded appliance 200 can use the memory 260 to perform any of the
above described functions, such as storing encoded media signals. The embedded appliance
200 captures and transmits encoded media signals to the control server 220 when prompted
by, for example, a scheduler and/or a user. The captured encoded media signals can be sent
to the control server 220 as, for example, a multiplexed signal over a network connection via
an output port (not shown) of the embedded appliance 200.
The input ports 210 include an audio input port(s) 202, a visual-capture input
port(s) 204, a video input port(s) 206 and a digital-image input port(s) 208. Each of the input
ports 210 is integrated as part of the embedded environment of the embedded appliance 200.
The media signals captured by the input ports 210 can be received as analog signals and/or
as digital signals. In some embodiments, a portion of the media signals can be analog, and a
portion of the media signals can be digital.
The audio input port(s) 202 is used to capture an audio signal. The audio input
port(s) 202 can be, for example, an RCA (Radio Corporation of America) stereo audio input
port(s), a 1/4” jack stereo audio input port(s), an XLR (Cannon X Series, Latch, Rubber)
input port(s), a balanced wire block, an HDMI (High-Definition Multimedia Interface) input port(s)
and/or a USB (Universal Serial Bus) port(s). The audio signal can be produced by any type
of device capable of producing an audio signal, for example, a standalone microphone or a
microphone connected to a video camera. The embedded appliance 200 can include more or
fewer audio input ports, and/or can include more than one audio input port format, for
example, one RCA audio input port and one wire block audio input port.
The visual-capture input port(s) 204 receives a digital or analog VGA signal
through, for example, a VGA input port(s), DVI (Digital Visual Interface) input port(s),
XGA (Extended Graphics Array) input port(s), HD (High Definition)-15 input port(s), HDMI
input port(s) and/or BNC (Bayonet Neill-Concelman) connector port(s). The visual-capture
input port 204 captures images produced by, for example, a computer or a microscope. An
electronic device connected to the visual-capture input port 204 can also be used to capture
images from, for example, an electronic whiteboard transmitting images via, for example, a
VGA signal. The embedded appliance 200 can include more or fewer visual-capture input
ports, and/or can include more than one visual-capture input port format, for example, one
VGA visual-capture input port and one DVI visual-capture input port.
The video input port(s) 206 receives motion video signals from devices such as
video cameras via an input port(s) that includes, but is not limited to, an S-video input port(s),
composite video input port(s), HDMI input port(s) and/or component video input port(s).
The embedded appliance 200 can include more or fewer video input ports, and/or can include
more than one video input port format, for example, one HDMI video input port and one
composite video input port.
The digital-image input port(s) 208 captures digital images via an input port(s)
such as an Ethernet port(s), a DVI port(s) and/or a USB port(s). The digital images can be
acquired using, for example, a digital camera or a web camera. The embedded appliance 200
can include more or fewer digital-image input ports, and/or can include more than one digital-
image input port format, for example, one DVI digital-image input port and one USB digital-
image input port.
The embedded appliance 200 includes hardware modules and/or software
modules implemented in hardware, which can include, for example, ASICs (Application
Specific Integrated Circuits), CPUs (Central Processing Units), FPGAs (Field Programmable
Gate Arrays), modules, DSPs (Digital Signal Processors), processors and/or co-processors.
The hardware modules and/or software modules can be configured to perform functions
specifically related to capturing, processing, storing and/or sending media signals.
The media module 230 can be implemented as an integrated circuit such as a
video chip, audio chip, and/or audio-video chip. The media module 230 can be configured to
receive a media signal, decode the media signal, identify input media signal parameters
and/or characteristics, convert the media signal, and/or forward the media signal to the
modification module 232. By way of example, the media module 230 can be an audio chip
that receives an analog audio signal from the audio input port 202, converts the analog audio
signal into a digital audio signal, and forwards the digital audio signal to the modification
module 232.
The media module 230 can identify media signal parameters and/or characteristics
(parameters) for the received media signal, and can be configured to send the identified input
media signal parameters to the modification module 232 and/or the processor 250. The
media signal parameters identified at the media module 230 can include, for example, a
resolution of the media signal, a frame rate of the media signal, an aspect ratio of the media
signal, a bit rate of the media signal, a clock rate of the media signal, and/or the like. By way
of example, the media module 230 can determine that a media signal received via the video
input port 206 is a 1080p 24 fps (frames per second) video signal (e.g., 1920x1080 resolution
video at 24 frames per second), and can send a signal representing those input media signal
parameters to the modification module 232 and/or processor 250.
In some embodiments, the media module 230 can be configured to detect and/or
identify digital parameters (e.g., frame rate, aspect ratio, etc.) for received media signals by
reading values for the digital parameters from a set of registers at the media module 230.
Such a detection of digital parameters can be done at, for example, an integrated circuit (e.g.,
an ADV7441A chip) of the media module 230. Furthermore, in some embodiments, such a
detection can be performed automatically at the media module 230 without any instruction,
indication, input or command received from a controller (e.g., the control server 220, the
processor 250) or a user (e.g., via the direct control signal 240). That is, the media module
230 can be configured to automatically perform the detection of digital parameters on a
received media signal in response to receiving that media signal and without any other input.
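Register-based parameter detection of this kind can be sketched as below. The register names, layout, and scaling are invented for illustration; real decoder chips such as the ADV7441A expose comparable status registers, but with different names and encodings:

```python
# Hypothetical decoder status registers, already read from the chip.
# "fps_x100" stores the frame rate times 100 to avoid fractional registers.
REGISTERS = {"width": 1920, "height": 1080, "fps_x100": 2997, "interlaced": 0}

def identify_parameters(regs):
    """Derive input media signal parameters from decoder status register values."""
    return {
        "resolution": (regs["width"], regs["height"]),
        "frame_rate": regs["fps_x100"] / 100.0,
        "scan": "interlaced" if regs["interlaced"] else "progressive",
    }

params = identify_parameters(REGISTERS)
print(params["resolution"], params["frame_rate"], params["scan"])
# (1920, 1080) 29.97 progressive
```

The identified parameters would then be forwarded to the modification module and/or processor, as the passage above describes.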
While FIG. 2 depicts the embedded appliance 200 as having one media module
230, in some embodiments, the embedded appliance 200 can include more or fewer media
modules. In one such embodiment, the embedded appliance 200 can include a video chip
media module 230 configured to receive, convert, and send video signals from the visual-
capture input port 204, the video input port 206, and/or the digital-image input port 208, and
can include an audio chip media module 230 configured to receive, convert, and send audio
signals from the audio input port 202. While FIG. 2 depicts the embedded appliance 200 as
having one modification module 232 and one encoding module 234, in some embodiments
(e.g., as shown and described below), the embedded appliance 200 can have two or more of each,
providing two separately encoded representations of the input signals, possibly with different
characteristics (e.g., resolutions, frame rates, bit rates, aspect ratios, etc.).
The modification module 232 can be, for example, an FPGA configured to receive
media signals from the media module 230, process and/or otherwise modify the media
signals, and send the modified media signals to the encoding module 234. By way of
example, the modification module 232 can deinterleave (interlaced to progressive), decimate
(scale in time, e.g., 60 fps to 24 fps), resize (scale in height and/or width, e.g., upscale and/or
downscale resolution), perform color space conversion (scale in density), modify gain, adjust
audio level(s), and/or perform audio multiplexing (selecting an audio signal from a group of
audio signals or combining audio signals).
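Decimation, one of the operations listed above, can be illustrated with a short sketch that drops frames so a 60 fps stream becomes 24 fps. This is a generic frame-dropping scheme, not necessarily the algorithm the FPGA actually implements:

```python
def decimate(frames, in_fps, out_fps):
    """Reduce frame rate by dropping frames (e.g., 60 fps -> 24 fps).
    Frame i is kept when a new output slot opens at the lower rate."""
    kept = []
    for i, frame in enumerate(frames):
        if (i * out_fps) // in_fps != ((i - 1) * out_fps) // in_fps:
            kept.append(frame)
    return kept

frames = list(range(60))              # one second of 60 fps video
print(len(decimate(frames, 60, 24)))  # 24
```

Resizing and color space conversion would operate within each frame rather than across frames, so they are not shown here.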
In some embodiments, the modification module 232 can modify the signal based
on modification instructions received from the processor 250, modification instructions
received from the encoding module 234 and/or input media signal parameters received from
the media module 230. The modification instructions can be generated at the processor 250
or the encoding module 234 based on requirements on a desired output media signal, such as
desired characteristics and/or parameters for the output media signal. In some embodiments,
the requirements on the desired output media signal can be generated within the embedded
appliance 200 such as, for example, at the processor 250. In other embodiments, the
requirements on the desired output media signal can be received at the embedded appliance
200 from an external resource such as, for example, a user (e.g., via the direct control signal
240), the control server 220 and/or any other type of external device that controls the
embedded appliance 200.
Furthermore, in some embodiments, requirements on a desired output media
signal (e.g., information of desired characteristics and/or parameters of the output media
signal) can be received or generated at the embedded appliance 200 prior to an input media
signal being received at the embedded appliance 200. In such embodiments, the
requirements on the desired output media signal can be defined independent of the input
media signal (that is, without any information of the input media signal). The modification
instructions can be generated at, for example, the processor 250 based on the requirements on
the desired output media signal and/or information (e.g., parameters) of the input media
signal. The modification module 232 can be configured to modify the input media signal in
real-time, based on the parameters of the input media signal identified at the media module
230 and the modification instructions, to produce the desired output media signal.
For example, at a first time the processor 250 can receive a first signal from the
control server 220 indicating that any input video signal is to be modified into an output
video signal with a resolution of 1024x768 at 24 fps. At a second time after the first time,
the modification module 232 can receive a media signal, for example a video signal with a
resolution of 1920x1080 at 30 fps, from the media module 230. The modification module
232 can then receive a first modification instruction from the processor 250 and/or the
encoding module 234 associated with modifying a video signal with a resolution of
1920x1080 at 30 fps to a video signal with a resolution of 1024x768 at 24 fps. By following
the modification instruction, the modification module 232 can resize the video signal from
1920x1080 to 1024x768, and decimate the video signal from 30 fps to 24 fps. After the
modification, the modification module 232 can send the modified media signal to the
encoding module 234. Furthermore, when a second modification instruction received from
the processor 250 indicates that any input video signal is to be modified into two output
media signals with different resolutions, for example, with the second modification for output
of an 800x600 video stream at 15 fps, a second modification module (not shown in FIG. 2)
can resize and decimate the input video signal to those parameters in real-time, and send the
second output video signal to a second encoding module (not shown in FIG. 2). The second
modification instruction can be generated at the processor 250 based on a second signal
indicating the second modification for output, which is received from a user (e.g., via the
direct control signal 240) prior to the input video signal being received at the modification
module 232.
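The flow in this example, comparing identified input parameters against the requested output parameters to derive the operations carried by a modification instruction, can be sketched as follows. The dictionary keys and operation names are hypothetical, chosen only to mirror the operations named in the text:

```python
def modification_instruction(input_params, output_params):
    """Compare identified input parameters with requested output parameters
    and list the operations the modification module must apply."""
    ops = []
    if input_params["resolution"] != output_params["resolution"]:
        ops.append(("resize", output_params["resolution"]))
    if input_params["fps"] > output_params["fps"]:
        ops.append(("decimate", output_params["fps"]))
    if input_params.get("interlaced"):
        ops.append(("deinterleave", None))
    return ops

# The example above: 1920x1080 @ 30 fps in, 1024x768 @ 24 fps requested.
detected = {"resolution": (1920, 1080), "fps": 30, "interlaced": False}
requested = {"resolution": (1024, 768), "fps": 24}
print(modification_instruction(detected, requested))
# [('resize', (1024, 768)), ('decimate', 24)]
```

A second requested output (e.g., 800x600 at 15 fps) would simply produce a second instruction list for the second modification module.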
The encoding module 234 can be a digital signal processor configured to encode a
modified media signal received from the modification module 232. The encoding module
234 is configured to determine media signal modifications and associated modification
instructions, and can send those modification instructions to the media module 230 and/or the
modification module 232. In some embodiments, when the processor 250 indicates multiple
modifications for the same input stream, two or more encoding modules can be used to
provide multiple output media streams. The encoding module 234 is also configured to
encode, for example, compress, the modified media signal into an encoded signal using a
session format, such as, for example, H.264/MPEG (Moving Picture Experts Group) 4 AVC
(H.264) at 1920x1080 resolution. The session format can include an encoded signal profile
(e.g., H.264 profile) and level (e.g., H.264 level), as well as other characteristics such as
resolution. The session format can be determined by, for example, a process that selects
the session format from a set of available session formats, based on the input media signal
parameters, user-selected (or default) output parameters, and/or the capabilities of the
embedded appliance 200. For example, in some embodiments, the control server 220 can
determine a session format based on the output parameters and the embedded appliance
capabilities, and can then send a signal representing the determined session format to the
encoding module 234 via the processor 250. An output parameter can be, for example, a
resolution, speed, and/or file size requested by a user (e.g., a professor that will generate the
content on which the media signals will be based).
The control server 220 can be configured to be coupled to two or more embedded
appliances 200, and each of the two or more embedded appliances 200 can have different
capabilities. An embedded appliance capability can be, for example, a maximum native
resolution supported by the input ports, the internal processing capability, and internal
storage. The control server 220 can determine a session format in such a heterogeneous
appliance environment by basing the determination of the session format on an individual
embedded appliance capability in addition to the user-selected parameter for that appliance.
For example, the selection of a given set of output parameters can result in a first session
format for a first embedded appliance 200, but the selection of the same set of output
parameters can result in a second session format, different from the first session format, for a
second embedded appliance 200.
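That capability-dependent selection can be illustrated with a small sketch in which the same requested output resolves to different session formats on appliances with different maximum native resolutions. The format table, format names, and selection rule here are invented for illustration:

```python
# Hypothetical predefined session formats: (name, max_resolution, max_fps).
SESSION_FORMATS = [
    ("h264_high_1080p", (1920, 1080), 30),
    ("h264_main_720p",  (1280, 720),  30),
    ("h264_base_480p",  (854, 480),   30),
]

def select_session_format(requested_res, appliance_max_res):
    """Pick the first predefined session format that both covers the
    user-selected output resolution and fits the appliance capability."""
    for name, res, _fps in SESSION_FORMATS:
        covers_request = res[0] >= requested_res[0] and res[1] >= requested_res[1]
        fits_appliance = res[0] <= appliance_max_res[0] and res[1] <= appliance_max_res[1]
        if covers_request and fits_appliance:
            return name
    return SESSION_FORMATS[-1][0]   # fall back to the most compressed format

# Same user-selected output, two appliances with different capabilities:
print(select_session_format((1280, 720), (1920, 1080)))  # h264_high_1080p
print(select_session_format((1280, 720), (1280, 720)))   # h264_main_720p
```

The point of the sketch is only that the capability term changes the outcome for the same request, as the paragraph above states.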
A media signal(s) encoded in the session format can be compressed and/or
otherwise processed to a greater degree than a native or raw signal, but still configured to be
decoded and/or subsequently encoded using a second format. This allows the subsequently-
encoded media signal to be compressed and/or otherwise processed to a greater degree than
the native or raw signal, but to a lesser or a greater degree than the media
signal encoded with the session format. By way of example, consider a raw signal that is
stored in 10 units of space in a memory; the media signal based on that raw signal and
encoded with the session format is stored in 5 units of space in a memory. In this example,
the media signal encoded with the session format can be decoded and then encoded by the
control server in a second format and is stored in 8 units of space in a memory, and can be
encoded by the control server in a third format and is stored in 3 units of space in a memory.
As this example illustrates, the session format can be selected by the control server and
notified to the embedded appliance such that the embedded appliance compresses or
otherwise processes a raw signal into a format appropriate for transport to the control server
and subsequent processing by the control server into the desired output format(s). In some
embodiments, the degree of compression and/or processing using the session format can
determine the maximum range of formats from which a subsequent encoding format can be
selected. In this manner, if a user requires a high degree of flexibility post capture, as
indicated by the selected output parameters, the determined session format may include a low
degree of processing and/or compression, resulting in a larger file size. But if a user requires
a low degree of flexibility, as indicated by the user-selected output parameters, the
determined session format may include a high degree of processing and/or compression,
resulting in a smaller file size. Note that in both cases, a common media format can be used
but the parameters and/or levels for the media format can differ as just described.
The encoding module 234 can send the encoded signal to the processor 250. In
some embodiments, the encoding module 234 can encode and send a video signal received
from the modification module 232, and can send an unencoded audio signal associated with
that video signal to the processor 250. In such embodiments, the processor 250 can encode
the audio signal. In other embodiments, the encoding module 234 can encode and send both
an encoded video signal and an associated encoded audio signal to the processor 250. While
described above with reference to H.264, the encoding module 234 can encode media signals
into other formats, such as, for example, an MPEG-2 format. The encoding module 234
can also compress media signals into more than one format simultaneously. For example, if
the embedded appliance 200 receives a digital-image signal and an associated audio signal,
the digital-image signal can be compressed into a JPEG (Joint Photographic Experts Group)
format while the audio signal can be compressed into an MPEG audio layer-3 (MP3) format.
In some embodiments, the encoding module 234 can compress a single media signal into
multiple formats simultaneously. Similarly, one or more media signals can be compressed
into a single compressed stream.
The processor 250 can receive an encoded media signal from the encoding
module 234, store the encoded media signal in the memory 260, and/or send the encoded
media signal to the control server 220. In some embodiments, the processor 250 can store
the encoded media signal in the memory 260 and can send the encoded media signal to the
control server 220 at a later time, such as, for example, during a perceived low-traffic time
for the control server 220 and/or the network to which the embedded appliance 200 is
connected. The processor 250 is configured to receive input media signal parameters from
the media module 230 and/or the modification module 232, and to receive user-selected
parameters from the control server 220 and/or the direct control signal 240. Similar to the
encoding module 234, the processor 250 can also be configured to determine media signal
modifications and associated modification instructions, and can send those modification
instructions to the media module 230 and/or the modification module 232. The processor
250 is also configured to determine an encoding format and associated encoding instructions
and can send those encoding instructions to the encoding module 234. The processor 250 is
configured to store an encoded media signal in the memory 260 and to send the encoded
media signal to the control server 220 substantially immediately and/or at a time other than
real-time based on a send indicator associated with a schedule.
The processor 250 and/or the encoding module 234 can be configured to
determine additional instructions to send to the media module 230 and/or the modification
module 232 in real-time when the input media signal changes during a capture session. By
way of example, the embedded appliance 200 can begin capturing media signals in response
to a start indication received from a scheduler or user, and can begin to receive 1920x1080
video at 60 fps. Based on a set of parameters of 1920x1080 video at 24 fps that is requested
by a user, the processor 250 and/or the encoding module 234 can define and send a
modification instruction to the modification module 232 to only perform decimation on the
media signals to reduce the signals from 60 fps to 24 fps. After the modification instruction
has been sent, the media signals received by the embedded appliance 200 may change to
1024x768 video at 30 fps. For example, a user of the embedded appliance 200 may
disconnect a particular video device with a given input and connect a different video device
with a different output format. The processor 250 and/or the encoding module 234, in real-
time, can receive an indication from the media module 230 and/or the modification module
232 that the input media signal parameters of the media signal have changed, and the
processor 250 and/or the encoding module 234 can define and send a new modification
instruction to the modification module 232 to resize the new media signals up to 1920x1080
and to perform decimation on the new media signals to reduce the speed of the media signals
from 30 fps to 24 fps. Anytime the format of a media signal changes and/or a new media
signal is added, the processor 250 and/or the encoding module 234 can define and send a new
modification instruction, or instructions, to maintain the same modified media signal being
received by the encoding module 234.
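As a rough illustration of how such modification instructions could be derived by comparing the detected input format against the requested session format, consider the following sketch. The function and instruction names here are hypothetical, not part of the described appliance:

```python
# Sketch: recompute a modification instruction whenever the detected
# input format differs from the requested session format.
# All names here are illustrative, not part of the described appliance.

def make_modification_instruction(input_w, input_h, input_fps,
                                  session_w, session_h, session_fps):
    """Return the operations needed to map the input format to the session format."""
    ops = []
    if (input_w, input_h) != (session_w, session_h):
        ops.append(("resize", session_w, session_h))
    if input_fps > session_fps:
        ops.append(("decimate", input_fps, session_fps))
    return ops

# 1920x1080 @ 60 fps in, 1920x1080 @ 24 fps requested: decimation only.
print(make_modification_instruction(1920, 1080, 60, 1920, 1080, 24))
# Input hot-swapped to 1024x768 @ 30 fps: resize up plus decimation.
print(make_modification_instruction(1024, 768, 30, 1920, 1080, 24))
```

Under this sketch, the hot-swap scenario above simply yields a new instruction list containing both a resize and a decimation step.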
In some embodiments, the processor 250 can receive from the control server 220
instructions representing the encoding parameters for media signals (e.g., the session format)
and/or scheduling instructions for one or more media capture sessions. In embodiments
where the processor 250 has received the output parameters and/or the encoding parameters
(e.g., the session format) and received a scheduling instruction, the embedded appliance 200 can
capture media signals, based on the schedule or based on a direct control signal from a user,
whether or not the embedded appliance 200 remains connected to the control server 220.
Said another way, the embedded appliance 200 can continue to operate, e.g., capture media
signals, if the embedded appliance 200 is intentionally or unintentionally disconnected from
the control server 220. In such embodiments, the embedded appliance 200 can continue to
store encoded media signals until onboard memory and/or external memory is filled. In such
embodiments, the embedded appliance 200 can be configured to overwrite low priority
encoded media signals with higher priority encoded media signals.
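The overwrite behavior described above could be sketched as follows; the storage representation and function name are illustrative assumptions, not part of the described appliance:

```python
# Sketch: when onboard storage is full, evict the lowest-priority stored
# capture to make room for a higher-priority one. Illustrative only.

def store_with_priority(storage, capacity, item, priority):
    """storage: list of (priority, item) tuples. Returns True if stored."""
    if len(storage) < capacity:
        storage.append((priority, item))
        return True
    # Storage full: find the lowest-priority entry.
    lowest = min(range(len(storage)), key=lambda i: storage[i][0])
    if storage[lowest][0] < priority:
        storage[lowest] = (priority, item)  # overwrite low with high
        return True
    return False  # full of equal- or higher-priority captures
```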
The embedded appliance 200 captures any combination of real-time media signals
received through the input ports 210. Each of the media signals, although collected via
different input ports 210, is synchronously acquired by the embedded appliance 200. For
example, even though the sound of chalk against a classroom board can be received via a
microphone through the audio input port 202, the motion of a professor’s hand wielding the
chalk can be received synchronously using a video camera connected to the video input port
206. These media signals are synchronously received and processed by the embedded
appliance 200.
In some embodiments, the embedded appliance 200 can be configured to capture
only certain portions of media signals. The embedded appliance 200 can be configured to,
for example, capture and store sounds received via a microphone while ignoring static and/or
silence. The embedded appliance 200 can also be configured to, for example, capture a video
signal or a digital-image signal only when movement or a substantial change in a scene is
detected. In many embodiments, each of the input ports 210 included in the embedded
appliance 200 can be configured to capture one or more media signals at different and/or
variable rates. For example, the video input port 206 can be configured to receive video
signals at a high frame rate compared with a frame rate of digital images received by the
digital-image input port 208.
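A minimal sketch of such selective capture might look like the following; the thresholds, function names, and pixel-difference metric are illustrative assumptions only, not details from the described appliance:

```python
# Sketch: capture audio only above a silence threshold, and capture video
# only when enough of the scene has changed. Thresholds are illustrative.

def should_capture_audio(rms_level, silence_threshold=0.01):
    """Ignore static and/or silence below the threshold."""
    return rms_level > silence_threshold

def should_capture_video(frame, prev_frame, change_threshold=0.05):
    """Capture only when a substantial change in the scene is detected."""
    # Fraction of pixels that changed between consecutive frames.
    changed = sum(1 for a, b in zip(frame, prev_frame) if a != b)
    return changed / len(frame) > change_threshold
```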
The memory 260 can be any appropriate type of fixed and/or removable storage
device. The memory 260 can be, but is not limited to, a tape, digital-video-disk (DVD),
digital-video-cassette (DVC), random-access-memory (RAM), solid state drive (SSD), flash
memory and/or hard disk drive. The size of the memory 260 can vary depending on the
amount of storage needed for a particular application. For example, the size of the memory
260 can be higher if the embedded appliance 200 is intended to capture large quantities of
media signals compressed in a lossless format. The size of the memory 260 can also be
higher if the embedded appliance 200 is intended to, for example, capture media signals over
relatively long periods of time (e.g., during network down time) without uploading captured
media signals to, for example, the control server 220. The memory 260 can be used to
prevent the loss of captured media signals that cannot be sent to, for example, the control
server 220 because of a network outage. In some embodiments, the processor 250 can, if
necessary, use the memory 260 to buffer information received via the input ports 210 before
transmission.
In some embodiments, a scheduler (not shown in FIG. 2) can be disposed in the
embedded appliance 200 and/or in the control server 220, and can generate start and stop
indicators to prompt the embedded appliance 200 to, for example, start and stop capturing
and/or start and stop sending media signals. The scheduler can access a schedule that is
either stored locally on the embedded appliance 200 or on the control server 220. The
schedule can include, for example, start and stop times that are specific to input ports 210.
For example, if a professor will teach a one-hour class on one day of the week, every week
for four months, the scheduler can use a schedule to prompt the embedded appliance 200 to
capture the professor’s lecture for one hour on the day of the lecture every week for the four-
month time period. The scheduler can be configured to capture or send media signals
according to more than one schedule stored on, for example, the embedded appliance 200.
The scheduler can generate a schedule or receive a schedule from the control
server 220. For example, the scheduler can generate a schedule for sending captured media
signals based on input from the control server 220 indicating preferred transmission times.
In some embodiments, the scheduler can access and execute a schedule that is, for example,
sent from the control server 220 and stored in the memory 260 of the embedded appliance
200. In some embodiments, the scheduler can be used to start and stop not only the capturing
and/or sending of media signals by the embedded appliance 200, but also the processing
and/or storing of media signals.
Rather than using a schedule to prompt the capturing and/or sending of media
signals, the scheduler can prompt certain functions to be performed based on defined criteria.
For example, the scheduler can be configured to prompt the sending of media signals from
the embedded appliance 200 when a certain amount of bandwidth is available for use by the
embedded appliance 200. In some embodiments, the scheduler is included as a hardware
and/or software module that is separate from the processor 250.
While FIG. 2 depicts the embedded appliance 200 having a discrete media
module 230, modification module 232, encoding module 234, and processor 250, in some
embodiments, the embedded appliance 200 includes a single processor that can be any type
of processor (e.g., an embedded processor or a general purpose processor) configured to
define and/or operate within an embedded environment. The single processor can be
configured to execute the functions performed by the processor 250, the media module 230,
the modification module 232, the encoding module 234 and/or other functions within the
embedded appliance 200. In some embodiments, each of the modules and the processor can be
implemented in a single piece of hardware, across multiple pieces of hardware, and/or on shared
hardware.
In some embodiments, the start and stop indicators from the scheduler can be
based on variables such as the storage and/or sending capacity of each embedded appliance
200. The control server 220 can query each embedded appliance 200 to determine, for
example, how much capacity of the memory 260 of each embedded appliance 200 is
available. The control server 220 can also, for example, receive a signal from each
embedded appliance 200 indicating how much capacity of the memory 260 of each
embedded appliance 200 is available. The control server 220 can then prioritize and prompt
the sending of information from the embedded appliances 200 based on memory capacity
indicators.
FIG. 2 also illustrates that the embedded appliance 200 can be controlled using a
direct control signal 240 from, for example, a user. The embedded appliance 200 can include
an interface such as a graphical user interface (GUI) (not shown in FIG. 2), physical display
(not shown in FIG. 2) or buttons (not shown in FIG. 2) to produce the direct control signal
240 to control some or all of the functions that can be performed by the embedded appliance
200. The direct control signal 240 can be used to, for example, modify a schedule stored on
the embedded appliance 200, modify the processing of media signals, troubleshoot an error
on the embedded appliance 200 or control the embedded appliance 200, for example, while
the control server 220 is down. The direct control signal 240 can also be used to, for
example, start and stop capturing and/or sending of media signals. The embedded appliance
200 can be configured to require authentication (e.g., username/password) of, for example, a
user before accepting a direct control signal 240 sent via an interface (not shown in FIG. 2)
from the user. The direct control signal 240 can also be generated using, for example, an
interface (not shown in FIG. 2) that is not directly coupled to the embedded appliance 200.
In some embodiments, the embedded appliance 200 can be directly controlled using the
control server 220.
In some embodiments, the embedded appliance 200 can include other software
and/or hardware modules to perform other processing functions such as, for example,
encoding, decoding, indexing, formatting and/or synchronization of media signals.
While FIG. 2 depicts the embedded appliance 200 being coupled to a single
control server 220 that both controls and/or instructs the operations of the embedded
appliance 200 and receives the output media signals from the embedded appliance 200, in
some embodiments (as shown and described with respect to FIG. 1), the embedded appliance
200 can be coupled to two or more than two server devices that each performs a different
functionality. For example, the embedded appliance 200 can be coupled to a control server
(similar to the control server 120 in FIG. 1) and a storage server (similar to the server 130 in
FIG. 1). The control server can be configured to generate and send instructions (e.g.,
modification instructions, requirements on desired output media signals) to the embedded
appliance 200, such that modules in the embedded appliance 200 can perform signal
detection, modification, encoding, and/or the like, based on the instructions. The storage
server can be configured to receive output media signals from the embedded appliance 200,
process the output media signals to make them available for users, and/or distribute the
output media signals to other devices and users.
FIG. 3 is a block diagram that shows the flow of media signals from an embedded
appliance (similar to the embedded appliance 100 and the embedded appliance 200 in FIGS.
1 and 2) through modules in a control server 320. The control server 320 receives encoded
real-time or stored media signals 305 encoded in a session format, and including an encoded
video signal 311 and an encoded audio signal 313. Although this figure shows that each of
the components of the media signals 305 is received as a multiplexed signal, over, for
example, an Internet protocol (IP) network connection that can be demultiplexed by the
control server 320 when received, in some embodiments, the media signals 305 can be sent
to the control server 320 as one or more discrete signals. For example, audio and video
signals can be combined into a single MPEG-2 signal at the embedded appliance before
being sent by the embedded appliance to the control server 320. Also, the control server 320
can receive media signals 305 from more than one embedded appliance and can process each
of the media signals 305 in parallel using, for example, multi-threaded processing.
Each of the compressed media signals 305 that are received by the control server
320 is similarly processed. Each of the media signals 305 can be processed by one of the
decode modules 315 (e.g., decode module 312A or 312B), index modules 325 (e.g., index
module 314A or 314B) and encode modules 335 (e.g., encode module 316A or 316B). After
each of the media signals 305 has been processed (e.g., individually processed, processed as
a group), the signals are synchronized and/or formatted by the synchronizer/formatter 380.
The processing of the encoded video signal 311 will be used herein as a
representative example of the processing of the compressed media signals 305. The
processing of the remaining signals 305 can be understood in light of this representative
example.
When the encoded video signal 311 is received by the control server 320, the
encoded video signal 311 can be decompressed from the session format by the decode
module 315 into a decoded video signal. The decode module 315 can be configured to detect
the session format of the encoded video signal 311 when the encoded video signal 311 is
received so that the signal 311 can be properly decoded/decompressed. The encoded video
signal 311, when converted into a decoded video signal, can be decoded to a format
other than the session format and can be used by the control server 320 to continue
processing the signal. In some embodiments, the encoded video signal 311 can be received
in the session format and can be stored in that format. In such embodiments, the control
server 320 can decode the encoded video signal 311 at a later time, for example, at the
request of a user.
The decoded video signal is then processed by the index module 325 to index the
decoded video signal by, for example, determining and marking scene changes. The
indexing is performed so that the decoded video signal can later be properly synchronized
with the other media signals 305 by the synchronizer/formatter 380 and to provide relevant
index points for use by, for example, an end-user (not shown in FIG. 3). Segments, rather
than scenes, can be detected from the encoded audio signal 313 using the index module 314B
so that the encoded audio signal 313 can be properly synchronized with the other media
signals 305 and to provide relevant index points for use by, for example, an end-user. The
decoded video signal with indexing (e.g., scene change markings) is then encoded by the
encode module 316A into an encoding that can be synchronized and formatted by the
synchronizer/formatter 380.
Returning to the general discussion of FIG. 3, the synchronizer/formatter 380
receives the media signals 305 after processing through the decode module 315, the index
module 325 and the encode module 335. The synchronizer/formatter 380 indexes,
synchronizes and formats the media signals so that they can be accessed by a user via a user
interface 340. In the synchronization process, the scenes from each of the media signals and
the audio segments are synchronized so that, for example, the sound of a dropped pen hitting
a floor is matched with video of the pen hitting the floor. The synchronized media signal can
be formatted by the synchronizer/formatter 380 into one or more formats that can be used by
a user. By way of example, the user can initially request certain output parameters for the
encoded media signal, resulting in the media signal being encoded in a session format, but
later request the encoded media signal in a different format. For example, the output
parameters can result in an encoded media signal having 1024x768 video at 24 fps; but then
the user can request to download the media format to a portable device having a maximum
resolution of 800x600. In such an example, the control server 320 can send the stored
encoded media signal 305 through the decode module(s) 315, the index module(s) 325, the
encode module(s) 335, and the synchronizer/formatter 380 to reformat the media signal 305
at 800x600 video at 24 fps. In this manner, the encoded video signal can take up less memory
on the portable device.
The synchronizer/formatter 380 can receive collateral material 370 and can
combine the collateral material 370 with the media signals 305 that have been processed by the
modules. The collateral material 370 can be, for example, additional marking information
that can be combined with the processed media signals to aid in the synchronizing process.
In some embodiments, the collateral material 370 can be additional media signals captured
by other multimedia capture devices (not shown in FIG. 3) that are to be combined with the
media signals 305 already shown. Although not shown in FIG. 3, the control server 320 can
include separate modules that decode, index (e.g., segment detect or optical character
recognition) and/or encode the collateral material 370 received by the control server 320.
Although FIG. 3 shows that separate modules perform decoding, indexing,
encoding, synchronizing and formatting, the functions of each of the modules can be further
subdivided and/or combined into one or more processors or modules. These functions can
also be subdivided and/or combined onto more than one control server. Also, the control
server 320 can include a memory (not shown in FIG. 3) or a separate database (not shown in
FIG. 3) for storing information and/or buffering information that is received from one or
more embedded appliances.
Any combination of the functions performed by any of the modules and/or other
components of the control server 320 can alternatively be performed on an embedded
appliance. For example, the indexing can be performed by an embedded appliance before the
media signals are compressed and transmitted to the control server 320.
The control server 320 can also receive an input signal from a user via the user
interface 340. The user interface 340 can be, for example, a remote computer that is
interfacing with the control server 320 via a network connection and/or can be an interface
that is integrated into the control server 320. The user interface 340 can be used to control
any of the modules and their associated functions and/or to specify parameters for processing
information on the control server 320. A user input signal can specify, for example, the type
of format that should be used by the synchronizer/formatter 380 for a particular set of media
signals 305 received at the control server 320. A user interface 340 can be configured so that
a user can manually manipulate any of the media signals 305 received by embedded
appliances distributed across a network.
The user interface 340 can also be used to access, monitor and/or control any
embedded appliances (not shown in FIG. 3) that can be connected to the control server 320
and distributed, for example, over a network. Access to embedded appliances and/or the
control server 320 via the user interface 340 can be, for example, password protected. The
user interface 340 can be used to define, for example, schedules used by the embedded
appliance or schedules used by the control server 320 to send signals to start and stop
capturing, processing, storing and/or sending by distributed embedded appliances. The user
interface 340 can also be used to view confidence monitoring signals that can be generated
by embedded appliances connected to the control server 320.
The user interface 340 can also be used to access the final synchronized/formatted
content generated by the control server 320. More than one user interface 340 can be
distributed across a network and can be configured to access the content produced by the
control server 320 (e.g., personal computers distributed over a university network accessing
the control server 320). In some embodiments, the control server 320 sends the content to a
server (not shown in FIG. 3) where the content is made available to one or more users
through the user interface 340.
FIG. 4 is a system block diagram that illustrates an embedded appliance 400
having two sets of input ports (input ports 410A and 410B) associated with two sets of
modules, a processor 450, and a memory 460, according to an embodiment. The embedded
appliance 400 can be similar to the embedded appliance 200 (in FIG. 2) and can include
similar components with similar functionality. By way of example, the processor 450 of the
embedded appliance 400 can be similar to the processor 250 of the embedded appliance 200.
Unlike the embedded appliance 200, however, the embedded appliance 400 includes two sets
of inputs and modules, including two sets of input ports 410A and 410B, two media modules
430A and 430B, two sets of modification modules 432A and 432B, two synchronization
modules 470A and 470B, and two sets of encoding modules 434A and 434B. In this manner,
the embedded appliance 400 can simultaneously process and modify more simultaneous
signals from more inputs. By way of example, the “A” set of inputs and modules can
capture, process, and store one or more media signals using a first session format, while the
“B” set of inputs and modules can capture, process, and live stream the same (or different)
one or more media signals using a second session format. In other embodiments, both sets of
inputs can be used for a live stream and/or for a stored encoded media signal. Additionally,
as discussed above with respect to FIG. 2 and shown in FIG. 4, each channel (the “A”
channel and the “B” channel) can have one or more than one modification module (e.g., the
modification modules 432A and 432B) and/or one or more than one encoding module (e.g.,
the encoding modules 434A and 434B).
As shown in FIG. 4, the embedded appliance 400 includes the synchronization
modules 470A and 470B that are not included in the embedded appliance 200 shown in FIG.
2. The synchronization modules 470A and 470B align sets of input signals with disparate
time bases to a common time base. The common time base can be derived from one input
signal or from a reference time base unaligned with any input signal. The synchronization
modules 470A and 470B cause the numbers of media samples (e.g., audio samples, video
frames) during a specific time period to be in correct agreement throughout a capture session
for the sample rates requested by, for example, the control server 420. In
some embodiments, the synchronization modules 470A and 470B use sample deletion and
sample insertion to ensure that all media signals are synchronized after encoding. In other
embodiments, the synchronization modules 470A and 470B use sample blending techniques
(e.g., resampling, telecine, etc.).
For example, if the control server 420 instructs the processor 450 to capture video
at 15 fps and audio at 44100 samples per second (sps), the synchronization modules 470A
and 470B each can use an audio clock as the time base. If the actual input video frame rate is
ideally 29.97 fps, then the modification modules 432A and 432B can be configured to
decimate frames from 29.97 fps to 15 fps using, for example, a simple counter with a
numerator of 15000 and a denominator of 29970. In operation, the modification modules
432A and 432B can be configured to add 15000 to the numerator for each input video frame
and emit a video frame whenever the numerator is at least equal to the denominator. The
numerator is then reduced modulo the denominator for the next video frame. That is, the
denominator is subtracted from the numerator until the numerator is less than the
denominator. Such a method is then repeated for the next input video frame.
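The counter method described above can be rendered directly in code. The following is an illustrative sketch of that description, not code from the appliance itself:

```python
# Sketch of the counter-based decimation described above: add 15000 to an
# accumulator for each input frame and emit a frame whenever the
# accumulator reaches 29970, then reduce it modulo the denominator.
# Reduces an ideal 29.97 fps input to 15 fps output.

def decimate(frame_count, num=15000, den=29970):
    """Return the indices of input frames that would be emitted."""
    acc = 0
    emitted = []
    for i in range(frame_count):
        acc += num
        if acc >= den:
            emitted.append(i)
            acc %= den  # subtract den until acc < den
        # otherwise this input frame is dropped
    return emitted
```

Over 2997 input frames (about 100 seconds at 29.97 fps), this emits exactly 1500 frames, matching the requested 15 fps ratio.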
The method described above will provide for the proper ratio of input and output
frames. The method alone, however, typically does not account for an input clock that varies
over the duration of capture, nor does it typically recover from the loss of an input signal.
For example, in practice, the input clock is not the ideal 29.97 fps but may drift up or down
as the source equipment (e.g., a source video camera, a source computer providing the
display images) warms or cools. When multiple sources are involved, their clocks will
almost always be derived from different time bases and thus should undergo treatment to
maintain a perfect synchronization when encoded with idealized frame and sample rates. If
the preceding method were used in a high frame rate setting, for example, even a small
amount of clock drift between sources could result in noticeable loss of sync between the
audio and video after hours of capture.
To address this issue, timestamps on the sampled media signals (video frames or
audio samples) and a sample count can be used (e.g., at the media modules 430A and 430B) to
encode exactly the proper number of video frames by the encoding modules 434A and 434B
for a given number of audio samples. The synchronization modules 470A and 470B can be
configured to maintain a time window, allowing a configurable amount of leeway, in which a
frame arrives from the modification modules 432A and 432B. For example, if the
synchronization module 470A receives a frame that arrives too early (that is, the timestamp
of the frame is earlier than the current encoding window, possibly because the input clock
has drifted and is now fast), the synchronization module 470A does not send that frame to
the encoding module 434A. If the synchronization module 470A determines that the current
time window has expired, the synchronization module 470A sends the previous frame to the
encoding module 434A, resulting in a duplicate frame (unless the previous frame was too
early). After a configurable number of duplicated frames, the synchronization module 470A
can switch to a frame that contains an indication of lost signal (e.g., a black screen, a blue
screen, a screen with certain text, etc.). Whenever a frame is sent to the encoding module
434A, the synchronization module 470A will update its time window to the ideal window
based on the time base and the number of frames so far encoded. This method allows all the
input media signals to remain synchronized after encoding despite being supplied with
disparate and varying clocks.
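The per-frame decision in the window-based gating described above might be sketched as follows. The function, its arguments, and the lost-signal placeholder value are illustrative assumptions, not the appliance's actual implementation:

```python
# Sketch of the time-window gating described above: drop early frames,
# duplicate the previous frame when the window expires, and substitute a
# lost-signal frame after too many duplicates. Names are illustrative.

def gate_frame(ts, window_start, window_len, prev_frame, frame,
               dup_count, max_dups):
    """Return (action, frame_to_encode) for one encoding window."""
    if ts < window_start:
        # Frame arrived too early (input clock may have drifted fast).
        return ("drop", None)
    if ts >= window_start + window_len:
        # Window expired without a timely frame.
        if dup_count >= max_dups:
            return ("lost_signal", "BLACK_FRAME")
        return ("duplicate", prev_frame)  # repeat the previous frame
    return ("encode", frame)              # frame landed inside the window
```

After each emitted frame, the caller would advance the window to the next ideal position based on the time base and the number of frames encoded so far, as described above.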
In some embodiments, modules other than the synchronization modules 470A and
470B can also perform a function related to the synchronization functionality on media
signals. For example, as described above, the media modules 430A and 430B can be
configured to determine a timestamp for each frame of media signals received from the input
ports 410A and 410B, such that the media signals can be synchronized based on the
timestamps at the synchronization modules 470A and 470B.
FIG. 5 is a flowchart illustrating a method 5000 of capturing, processing, storing
and/or sending of media signals using an embedded appliance according to an embodiment
of the invention. According to the method 5000, and with reference to FIG. 2, the processor
250 can receive a signal representing a session format based on output parameters and/or
embedded appliance capabilities from the control server 220, at 5002. In some embodiments,
the output parameters can be input directly into the embedded appliance 200 by the user via
an interface described above. The session format can be, for example, a desired format for a
capture session as specified by the user of the control server 220 or a user providing direct
input to the embedded appliance 200. As such, the session format can be specified
independent of the format of the media signals to be captured during the capture session. In
other words, the session format can be specified by a user of the control server 220 or the user
providing direct input to the embedded appliance 200 without that user having any
knowledge of the format of the media signals to be captured or the types of media capture
devices coupled to the input ports 210 of the embedded appliance 200.
The processor 250 can receive an indication to start a capture session, at 5004.
The indication to start the capture session can be based on, for example, a schedule or a
direct input from a user of the embedded appliance 200. A capture session can be any
amount of time and can be determined, for example, by a schedule, a default value (e.g., 1-
hour increments), or dynamically based on user input. The processor 250 and/or the
encoding module 234 can receive a first value of a first parameter of an input media signal
from the media module 230 and/or the modification module 232, at 5006. The first value of
the first parameter of the input media signal can be, for example, a value of a resolution or
frame rate of a video media signal received at an input port 210 and automatically detected
by the media module 230 upon receiving the video media signal from the input port 210.
The processor 250 and/or the encoding module 234 can send a first modification
instruction based on the first value of the first parameter and the session format to the media
module 230 and/or the modification module 232, at 5008. This first modification instruction
can be calculated, for example, by the processor 250 and/or the encoding module 234 after
the first value of the first parameter and the session format are received. In other words, this
first modification instruction can be calculated during or after the capture session, and need
not be predetermined or selected from a preexisting list of options before the capture session
starts. In fact, the first modification instruction can be calculated for any format of media
signals or any type of media capture devices coupled to the input ports 210 of the embedded
appliance 200, and is not limited or constrained by the formats of media signals or the types
of media capture devices coupled to the input ports 210 of the embedded appliance 200.
The processor 250 can store in the memory 260 and/or send to the control server
220 an encoded media signal received from the encoding module 234, at 5010. When the
encoded media signal is sent to a control server, the encoded media signal can be sent to the
control server 220 that initially sent the signal representing the session format or to a
different server designated to receive the encoded media signal for possible further
processing and subsequent distribution.
The processor 250 and/or the encoding module 234 can receive a second value of
the first parameter of an input media signal from the media module 230 and/or the
modification module 232, at 5012. The second value of the first parameter of the input
media signal can be, for example, a value of a resolution or frame rate of a video media
signal received at an input port 210 and automatically detected by the media module 230
upon receiving the video media signal from the input port 210.
The processor 250 and/or the encoding module 234 can send a second
modification instruction based on the second value of the first parameter and the session
format to the media module 230 and/or the modification module 232, at 5014. Similar to the
discussion above regarding the first modification instruction, this second modification
instruction can be calculated, for example, by the processor 250 and/or the encoding module
234 after the second value of the first parameter and the session format are received. In other
words, this second modification instruction can be calculated during or after the capture
session, and need not be predetermined or selected from a preexisting list of options before
the capture session starts. In fact, the second modification instruction can be calculated for
any format of media signals or any type of media capture devices coupled to the input ports
210 of the embedded appliance 200, and is not limited or constrained by the formats of media
signals or the types of media capture devices coupled to the input ports 210 of the embedded
appliance 200.
The processor 250 can store in the memory 260 and/or send to the control server
220 an encoded media signal received from the encoding module 234, at 5016. When this
encoded media signal is sent to a control server, the encoded media signal can be sent to the
control server 220 that initially sent the signal representing the session format or to a
different server designated to receive the encoded media signal for possible further
processing and subsequent distribution.
The processor 250 can receive an indication to stop the capture session based on
the schedule, a stop indicator associated with the schedule, the default value, and/or
dynamically based on user input, at 5018. The processor 250 can stop sending and/or storing
the encoded media signal, at 5020.
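The numbered steps of method 5000 can be condensed into a control-flow sketch, with the step numbers as comments. Every callable name here is a placeholder standing in for the behavior described above, not an actual API of the appliance:

```python
# Condensed sketch of method 5000's control flow. All function arguments
# are placeholder callables supplied by the caller; the numbered comments
# map to the steps described in the text.

def run_capture_session(get_session_format, wait_for_start, detect_input,
                        send_modification, store_or_send, should_stop):
    session_format = get_session_format()             # 5002
    wait_for_start()                                  # 5004
    last_value = None
    while not should_stop():                          # 5018 / 5020
        value = detect_input()                        # 5006 / 5012
        if value != last_value:
            # New modification instruction whenever the input changes.
            send_modification(value, session_format)  # 5008 / 5014
            last_value = value
        store_or_send()                               # 5010 / 5016
```

Note that a modification instruction is only recomputed when the detected input parameter changes, mirroring the first/second value steps in the method.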
FIG. 6 is a block diagram that illustrates a hardware detection module 610
coupled to a software detection module 620 configured to measure and test the timing of
horizontal and vertical sync pulses in an embedded appliance, according to an embodiment.
In some embodiments, the hardware detection module 610 and the software detection module
620 can be located anywhere in the embedded appliance. For example, the hardware
detection module 610 can be part of a modification module (e.g., the modification module
432A or 432B in FIG. 4) of the embedded appliance, and the software detection module 620
can be stored in a memory and/or executed at a processor of a synchronization module (e.g.,
the synchronization module 470A or 470B in FIG. 4) or an encoding module (e.g., the
encoding module 434A or 434B in FIG. 4) of the embedded appliance.
The hardware detection module 610 and the software detection module 620 can
be any hardware module and software module (stored and/or executed in hardware),
respectively, which are collectively configured to determine frame parameters based on
media signals (e.g., VGA sync signals) received from, for example, input ports of the
embedded appliance. Although not shown in Figure 6, the hardware detection module 610 can
include, for example, circuits, counters, etc., which are configured to determine a set of
measurements based on the received media signals. The software detection module 620 can
include, for example, a memory, a processor, software (e.g., a method or process), etc., which
are configured to perform a method (e.g., the method of Figure 7) to determine frame
parameters based on the set of measurements.
Although the hardware detection module 610 and the software detection module
620 are described herein as a hardware module and a software module, respectively, in other
embodiments, the hardware detection module 610 and the software detection module 620 can
be implemented in any other combination such as, for example, both being hardware
modules, both being software modules, the hardware detection module 610 being a software
module and the software detection module 620 being a hardware module, etc.
As shown in Figure 6, the hardware detection module 610 can be configured to
receive signals associated with media signals such as a vertical sync signal (Vsync), a
horizontal sync signal (Hsync), a clock signal (Clock), and/or the like. In some
embodiments, the sync signals received at the hardware detection module 610 can be, for
example, VGA sync signals.
Figure 8 is a schematic illustration of VGA sync signals, according to an
embodiment. The top half of Figure 8 shows a vertical sync signal (Vsync 801) and a
horizontal sync signal (Hsync 802), each including multiple pulses, over the course of more
than two frames. In this diagram, the x-axis represents time and the y-axis represents the
amplitude of the signals. The Vsync 801 and the Hsync 802 are similar to the vertical sync
signal and the horizontal sync signal, respectively, shown and described with respect to
Figure 6 as being received at the hardware detection module 610.
In the example of Figure 8, the Hsync pulses of the Hsync 802 occur too often to
distinguish visually on the diagram. Accordingly, the bottom half of Figure 8 shows an
expanded vertical sync signal (Expanded Vsync 803), which is an expansion of the area
around one Vsync pulse of the Vsync 801; and an expanded horizontal sync signal
(Expanded Hsync 804), which includes five Hsync pulses of the Hsync 802 in the same time
frame as the Expanded Vsync 803. The diagram for the Expanded Vsync 803 also shows
two valid regions (in grey) where a Vsync transition (e.g., from low to high, or from high to
low) can occur.
In some embodiments, the Vsync 801 and the Hsync 802 are the only two input
signals that are included in a typical input to a hardware detection module (e.g., the hardware
detection module 610 in Figure 6) for detection of a video standard (e.g., VGA detection).
Additionally, a clock input (e.g., the clock signal (Clock) in Figure 6) can be available at the
hardware detection module; this clock input can be any stable clock with a period shorter
than the shortest expected Hsync pulse of the Hsync 802. Such a clock input can serve as a
time base for all time-related measurements for the VGA detections.
Returning to Figure 6, the hardware detection module 610 can be configured to
measure values based on the received sync signals (Vsync, Hsync) and clock signal (Clock).
As shown in Figure 6, the values measured at the hardware signal detection module 610 can
include, for example, the length of time that Vsync is high (value 611), the length of time that
Vsync is low (value 612), the length of time that Hsync is high (value 613), the length of
time that Hsync is low (value 614), the number of lines where Vsync is high (value 615), the
number of lines where Vsync is low (value 616), and/or the like. In the case of the values
611-614, the length of time is defined as the number of pulses of the input clock for that
state of the signal. For the values 615 and 616, the registers of the hardware detection
module 610 can contain an actual number of lines (Hsync pulses) counted. Specifically, the
value 615 represents the actual number of lines counted when Vsync is high (e.g., digital 1),
and the value 616 represents the actual number of lines counted when Vsync is low (e.g.,
digital 0). All of the registers of the hardware detection module 610 can be simple
synchronous counters that are buffered in such a way that a single read of the registers will
return valid values for a complete frame. These measurements are then read as sync
measurements by the software detection module 620, as shown in Figure 6.
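The behavior of these measurement registers can be modeled in software as follows. This sketch is illustrative only: it samples idealized 0/1 sync levels once per input-clock tick, and covers the Vsync durations (values 611 and 612) and the per-state line counts (values 615 and 616) but not the Hsync durations (values 613 and 614):

```python
def measure_sync(vsync, hsync):
    """Count, per Vsync state, input-clock ticks and Hsync rising edges.

    `vsync` and `hsync` are equal-length sequences of 0/1 samples, one per
    input-clock tick -- a software stand-in for the buffered synchronous
    counters of the hardware detection module.
    """
    ticks = {1: 0, 0: 0}   # clock ticks while Vsync is high / low (values 611, 612)
    lines = {1: 0, 0: 0}   # Hsync rising edges while Vsync is high / low (values 615, 616)
    prev_h = 0
    for v, h in zip(vsync, hsync):
        ticks[v] += 1
        if h and not prev_h:   # a rising edge of Hsync marks the start of a line
            lines[v] += 1
        prev_h = h
    return ticks, lines
```

A real implementation would double-buffer these counts so that, as the text notes, a single read returns a coherent set of values for one complete frame.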
The software detection module 620 can be configured to determine, based on the
sync measurements received from the hardware detection module 610, a set of frame
parameters used for identification of a video standard (e.g., a VGA standard). Figure 9 is a
schematic illustration of the frame parameters that characterize the timing for a VGA frame,
according to an embodiment. As shown in Figure 9, the frame parameters include: Horizontal
Back Porch 901, Horizontal Active 902, Horizontal Front Porch 903, Hsync 904, Vertical
Back Porch 905, Vertical Active 906, Vertical Front Porch 907, and Vsync 908. Additional
parameters include, for example, a frame rate, a Vsync polarity, a pixel rate, an Hsync
polarity, and/or other frame parameters.
Returning to Figure 6, the software detection module 620 transforms the sync
measurements (611-616) received from the hardware detection module 610 into the set of
frame parameters (e.g., the 12 frame parameters discussed above with respect to Figure 9).
In some embodiments, these frame parameters can be used by a media module (e.g., the
media module 430A/B), a modification module (e.g., the modification module
432A/B), and/or an encoding module (e.g., the encoding module 434A/B)
associated with the hardware detection module 610 and the software detection module 620.
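As an illustration of this transformation, a few of the simpler frame parameters can be derived directly from the six measurements. The function and field names below are assumptions made for the sketch, and the polarity rule used (the sync pulse occupies the shorter of the two Vsync states) is a common convention rather than a detail taken from the text:

```python
def derive_frame_parameters(m, clock_hz):
    """Turn sync measurements into a few frame parameters.

    `m` maps illustrative measurement names to counts: Vsync high/low
    durations in clock ticks, and line counts while Vsync is high/low.
    The disclosure derives 12 parameters; this sketch derives four.
    """
    frame_ticks = m["vsync_high_ticks"] + m["vsync_low_ticks"]   # one full frame
    total_lines = m["lines_vsync_high"] + m["lines_vsync_low"]
    frame_rate = clock_hz / frame_ticks                          # clock is the time base
    # Convention: the sync pulse is the shorter Vsync state.
    vsync_polarity = ("positive" if m["vsync_high_ticks"] < m["vsync_low_ticks"]
                      else "negative")
    return {"total_lines": total_lines, "frame_rate": frame_rate,
            "vsync_polarity": vsync_polarity, "frame_ticks": frame_ticks}
```

Note how the clock input acts as the time base: the frame rate falls out of dividing the clock frequency by the number of clock ticks counted per frame.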
Figure 7 is a flowchart illustrating a method 7000 of detecting or identifying a video
standard for signals, according to an embodiment. In some embodiments, the method 7000
can be executed at a software detection module in a media module of an embedded
appliance, such as the software detection module 620 shown and described with respect to
Figure 6. Specifically, the method 7000 uses the data received from a hardware detection
module (e.g., the hardware detection module 610 in Figure 6) and applies complex software
methods or processes to derive the resolution and timing of, for example, a VGA signal. By
performing the method 7000, the software detection module is capable of detecting or
identifying a video standard for each video signal from multiple possible video standards
such as, for example, Discrete Monitor Timing (DMT), Generalized Timing Formula (GTF),
Coordinated Video Timing (CVT), Coordinated Video Timing with Reduced Blanking
(CVT-RB), and High Definition Television (HDTV) using the horizontal sync and vertical
sync signals.
As shown in Figure 7, the detection starts with receiving a signal representing the
measurements, at 7001, from the hardware detection module and testing them for validity.
The measurements can be the values 611-616 of Figure 6. The measurements are deemed
valid by cross-checking detected pulse widths against a range derived from detected pulse
counts. If the measurements are determined to be invalid or illegal, the measurements can be
dropped or discarded and the software detection module is ready to receive new
measurements.
At 7002, the measurements are tested for an exact match with some known
standard values such as, for example, values for DMT and HDTV. If a suitable match with a
known standard (e.g., DMT, HDTV) is determined, a result identifying or representing the
known standard is generated at the software detection module and returned to, for example, a
processor of the embedded appliance. Otherwise, if a suitable match is not made, then at
7003, the measurements are used to calculate estimated timings for a set of other known
standards including, for example, CVT, CVT-RB, and/or GTF standards. These estimated
timings are then tested for validity, and any invalid or illegal combinations are discarded.
Next, valid estimated timings are tested for an estimated match with the set of
known standards. If a match with a known standard (e.g., CVT, CVT-RB, GTF) is
determined, a result including the known standard is generated at the software detection
module and returned to, for example, a processor of the embedded appliance. Otherwise, if
no match is determined with any known standard at 7003, then at 7004, a minimal-matching
method or process can be applied on the measurements to search for a minimal match based
on the measurements. Such a minimal-matching method can be similar to (a portion of) the
approach used at 7001-7003, except that one or more of the measured values is removed
from the match criteria for the minimal-matching method. In some embodiments, the step of
7004 can be repeated several times using different match criteria. This repetition of 7004 can
continue until a match is found, or until no measured value remains to be removed.
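The three-stage cascade just described — exact match, estimated match, then minimal match with criteria removed one at a time — can be sketched as follows. The table layout, the key names, and the 1% tolerance are illustrative assumptions, not values from the disclosure:

```python
def identify_standard(meas, table, keys=("h_pulse", "v_pulse", "lines")):
    """Three-stage match loosely following method 7000.

    `meas` is a dict of measured values; `table` maps standard names to
    reference dicts with the same keys. Returns a (stage, name) pair,
    or None when no stage produces a match.
    """
    def matches(ref, ks, tol):
        return all(abs(meas[k] - ref[k]) <= tol * ref[k] for k in ks)

    for name, ref in table.items():          # stage 1: exact match (DMT/HDTV-style)
        if matches(ref, keys, 0.0):
            return ("exact", name)
    for name, ref in table.items():          # stage 2: estimated match (CVT/GTF-style)
        if matches(ref, keys, 0.01):
            return ("estimated", name)
    for dropped in keys:                     # stage 3: minimal match, one key removed
        ks = [k for k in keys if k != dropped]
        for name, ref in table.items():
            if matches(ref, ks, 0.01):
                return ("minimal", name)
    return None
```

Each later stage relaxes the criteria of the one before it, mirroring the fall-through from 7002 to 7003 to 7004.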
In some embodiments, the process illustrated by the flowchart in Figure 7 can be re-
applied to a range of measurement values to define a list of candidate timings. These
candidate timings can then be searched for the best match. Stated another way, the method
executed at the software detection module can loop through a range of one or more
parameters, generating a timing estimate for each of the measurement values in the range.
When the loop is complete, a best-fit method can be applied to the results to select the final
timing.
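This loop-and-best-fit selection can be sketched as follows; the function and its parameters are illustrative assumptions:

```python
def best_timing(value_range, estimate, target):
    """Generate a timing estimate for each value in `value_range` and
    return the (value, estimate) pair whose estimate is closest to
    `target` -- a stand-in for the best-fit selection described above.
    """
    estimates = {v: estimate(v) for v in value_range}
    best = min(estimates, key=lambda v: abs(estimates[v] - target))
    return best, estimates[best]
```

For instance, sweeping candidate frame rates and estimating total clock ticks per frame from each, the candidate whose estimate lands closest to the measured tick count is selected as the final timing.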
While various embodiments have been described above, it should be understood
that they have been presented by way of example only, and not limitation. Where methods
described above indicate certain events occurring in certain order, the ordering of certain
events may be modified. Additionally, certain of the events may be performed concurrently
in a parallel process when possible, as well as performed sequentially as described above.
Some embodiments described herein relate to a computer storage product with a
non-transitory computer-readable medium (also can be referred to as a non-transitory
processor-readable medium) having instructions or computer code thereon for performing
various computer-implemented operations. The computer-readable medium (or processor-
readable medium) is non-transitory in the sense that it does not include transitory propagating
signals per se (e.g., a propagating electromagnetic wave carrying information on a
transmission medium such as space or a cable). The media and computer code (also can be
referred to as code) may be those designed and constructed for the specific purpose or
purposes. Examples of computer-readable media include, but are not limited to: magnetic
storage media such as hard disks, floppy disks, and magnetic tape; optical storage media such
as Compact Disc/Digital Video Discs (CD/DVDs), Compact Disc-Read Only Memories
(CD-ROMs), and holographic devices; magneto-optical storage media such as optical disks;
carrier wave signal processing modules; and hardware devices that are specially configured
to store and execute program code, such as Application-Specific Integrated Circuits (ASICs),
Programmable Logic Devices (PLDs), Read-Only Memory (ROM) and Random-Access
Memory (RAM) devices.
[1110] Examples of computer code include, but are not limited to, micro-code or micro-
instructions, machine instructions, such as produced by a compiler, code used to produce a
web service, and files containing higher-level instructions that are executed by a computer
using an interpreter. For example, embodiments may be implemented using Java, C++, or
other programming languages (e.g., object-oriented programming languages) and
development tools. Additional examples of computer code include, but are not limited to,
control signals, encrypted code, and compressed code.
[1111] In conclusion, among other things, an apparatus and method for capturing,
processing, storing and/or sending media signals using an embedded appliance is described.
While various embodiments of the invention have been described above, it should be
understood that they have been presented by way of example only and various changes in
form and details may be made. For example, processors and/or modules of an embedded
appliance can be included on separate electronic boards in one or more housings, and can
have dedicated memory (RAM, etc.).
Claims (15)
1. An apparatus, comprising: a media module configured to receive a first media signal and a second media signal, the first media signal being associated with a first input port of an embedded appliance, the second media signal being associated with a second input port of the embedded appliance, the media module configured to identify, at a first time, a first plurality of media signal parameters based on the first media signal; and a modification module included in the embedded appliance and configured to receive a first modification instruction associated with a session format having a second plurality of media signal parameters different from the first plurality of media signal parameters, the modification module configured to modify the first media signal based on the first plurality of media signal parameters and the first modification instruction to produce a first modified media signal in the session format and having the second plurality of media signal parameters, the media module configured to identify, at a second time after the first time, a third plurality of media signal parameters different from the first plurality of media signal parameters and based on the first media signal, the modification module configured to receive a second modification instruction associated with the session format, the modification module configured to modify the first media signal based on the third plurality of media signal parameters and the second modification instruction to produce a second modified media signal in the session format and having the second plurality of media signal parameters.
2. The apparatus of claim 1, wherein the first modification instruction is independent of a format of the first media signal and a format of the second media signal.
3. The apparatus of claim 1, wherein the first plurality of media signal parameters include at least two of a resolution of the first media signal, a frame rate of the first media signal, a bit rate of the first media signal, or a clock rate of the first media signal.
4. The apparatus of claim 1, wherein: the session format is a first predefined session format from a plurality of predefined session formats, the media module is configured to identify the first plurality of media signal parameters from a plurality of predefined sets of media signal parameters, each predefined set of media signal parameters being associated with a predefined session format from the plurality of predefined session formats.
5. The apparatus of claim 1, wherein the modification module is configured to modify the first media signal by performing on the first media signal at least one of deinterleaving, decimating, resizing, color space converting, modifying gain, adjusting audio level, or audio multiplexing.
6. The apparatus of claim 1, wherein the session format is selected from a plurality of predefined session formats based on at least one of the first plurality of media signal parameters, a user-selected output parameter, or a capability of the embedded appliance.
7. The apparatus of claim 1, wherein: the media module configured to identify a first format of the first media signal when the first plurality of media parameters match a predefined set of media parameters from a plurality of predefined sets of media parameters, the predefined set of media parameters including a first number of media parameters and being associated with the first format, the media module configured to identify a second format of the first media signal when the first format is not identified and when the first plurality of media parameters match a subset of media parameters of the predefined set of media parameters, the subset of media parameters including a second number of media parameters less than the first number of media parameters and being associated with the second format, the media module configured to send a first signal indicating the first format when the first format is identified, the media module configured to send a second signal indicating the second format when the second format is identified.
8. The apparatus of claim 7, wherein: the media module is configured to identify a third format of the first media signal when the first format is not identified, when the second format is not identified and when a subset of the first plurality of media parameters match a predefined set of media parameters from the plurality of predefined sets of media parameters or an estimated set of media parameters from the plurality of estimated sets of media parameters, the media module configured to send a signal indicating an identification of the third format when the third format is identified.
9. The apparatus of claim 7, wherein: the subset of media parameters is a first subset of media parameters, the media module is configured to identify a third format of the first media signal when the first format is not identified and when the second format is not identified, the media module is configured to identify the third format by iteratively defining a second subset of the first plurality of media parameters of the predefined set of media parameters until that second subset of the first plurality of media parameters matches the predefined set of media parameters from the plurality of predefined sets of media parameters or an estimated set of media parameters from the plurality of estimated sets of media parameters, the media module configured to send a signal indicating an identification of the third format when the third format is identified.
10. The apparatus of claim 7, wherein: the plurality of media parameters include a vertical sync pulse width, a horizontal sync pulse width and a clock line number, the media module configured to calculate the vertical sync pulse width based on a vertical sync signal associated with the media signal, the media module configured to calculate the horizontal sync pulse width based on a horizontal sync signal associated with the media signal, the media module configured to calculate the clock line number based on a clock signal associated with the media signal.
11. The apparatus of claim 7, wherein: the first format is identified from one of a Discrete Monitor Timing (DMT) standard or a High Definition Television (HDTV) standard, and the second format is identified from one of a Coordinated Video Timing (CVT) standard, a Coordinated Video Timing with Reduced Blanking (CVT-RB) standard or a Generalized Timing Formula (GTF) standard.
12. The apparatus of claim 1, further comprising: a synchronization module operatively coupled to the media module and included in the embedded appliance, the synchronization module configured to receive the first modified media signal having a format and a time base and being associated with the first input port of the embedded appliance, the synchronization module configured to receive the second media signal having a time base different from the time base of the first media signal and being associated with the second input port of the embedded appliance, the synchronization module configured to modify at least one of the first modified media signal or the second media signal based on a common time base and a modification instruction such that the first modified media signal and the second media signal are synchronized to the common time base.
13. The apparatus of claim 12, wherein the synchronization module is configured to modify the at least one of the first modified media signal or the second media signal based on a time window such that the first modified media signal and the second media signal are synchronized to the common time base independent of clock drift.
14. The apparatus of claim 12, wherein the synchronization module is configured to define the common time base based on at least one of the time base of the first media signal or the time base of the second media signal.
15. The apparatus of claim 12, wherein the modification instruction is a first modification instruction, the time base of the first modified media signal is a first time base during a first time window, the first modified media signal has a second time base during a second time window, the time base of the second media signal is a third time base, the second media signal having a fourth time base during the second time window, the fourth time base is different from the second time base, the synchronization module is configured to modify during the first time window, the synchronization module configured to modify, at a time during the second time window, at least one of the first modified media signal or the second media signal based on the common time base and a second modification instruction such that the first modified media signal and the second media signal are synchronized to the common time base.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201161503472P | 2011-06-30 | 2011-06-30 | |
US61/503,472 | 2011-06-30 | ||
PCT/US2012/044879 WO2013003698A2 (en) | 2011-06-30 | 2012-06-29 | Methods and apparatus for an embedded appliance |
Publications (2)
Publication Number | Publication Date |
---|---|
NZ619460A NZ619460A (en) | 2014-11-28 |
NZ619460B2 true NZ619460B2 (en) | 2015-03-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AU2022200777B2 (en) | Methods and apparatus for an embedded appliance | |
US9819973B2 (en) | Embedded appliance for multimedia capture | |
NZ619460B2 (en) | Methods and apparatus for an embedded appliance | |
AU2019204751B2 (en) | Embedded appliance for multimedia capture | |
AU2013254937B2 (en) | Embedded Appliance for Multimedia Capture | |
CA2914803C (en) | Embedded appliance for multimedia capture | |
AU2012202843A1 (en) | Embedded appliance for multimedia capture |