WO1996032722A1 - Editing system - Google Patents
- Publication number
- WO1996032722A1 (PCT/JP1996/000963)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- clip
- event
- image data
- management record
- Prior art date
Classifications
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/36—Monitoring, i.e. supervising the progress of recording or reproducing
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/02—Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
- G11B27/031—Electronic editing of digitised analogue information signals, e.g. audio or video signals
- G11B27/034—Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/34—Indicating arrangements
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B2220/00—Record carriers by type
- G11B2220/20—Disc-shaped record carriers
- G11B2220/25—Disc-shaped record carriers characterised in that the disc is based on a specific recording technology
- G11B2220/2508—Magnetic discs
- G11B2220/2512—Floppy disks
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B2220/00—Record carriers by type
- G11B2220/40—Combinations of multiple record carriers
- G11B2220/41—Flat as opposed to hierarchical combination, e.g. library of tapes or discs, CD changer, or groups of record carriers that together store one title
- G11B2220/415—Redundant array of inexpensive disks [RAID] systems
Definitions
- the present invention relates to an editing system.
- it is suitable for editing material that requires speed, such as sports broadcasting and news reporting.
- Background art
- Current editing systems mainly use a VTR as the recording medium for recording material, so time is lost in searching, fast-forwarding and rewinding the tape, and in practice programs have been created by controlling the VTR right up until just before on-air playback.
- Conventional editing devices are not designed to handle material that requires real-time responsiveness, such as sports broadcasting and news reporting.
- Disclosure of the invention
- the present invention has been made in view of the above points, and is intended to propose an editing system that can realize high-speed real-time editing while being small in size compared to the related art.
- To this end, the present invention provides an editing system comprising: input/output means which receives input video data supplied in real time, outputs video data obtained from the input video data in real time as first video data, and outputs video data reproduced from a recording medium in real time as second video data; a main recording/reproducing apparatus including recording/reproducing means which executes, at substantially the same time, a recording operation for recording the input video data supplied in real time on the recording medium and a reproducing operation for reproducing video data recorded on the recording medium in real time; and control means for controlling the recording and reproducing operations of the main recording/reproducing apparatus.
- FIG. 1 is a drawing for explaining the overall configuration of the editing system of the present invention.
- FIG. 2 is a drawing showing the internal configuration of the computer 1.
- FIG. 3 is a drawing showing a graphical display for GUI displayed on the monitor 14 of the computer 1.
- FIG. 4 is a drawing for showing first management record data for clip data, event data, and program data.
- FIG. 5 is a drawing showing the second management record data for the clip data.
- FIG. 6 is a drawing for showing second management record data for event data and program data.
- FIG. 7 is a drawing to list the order of marking of the clip image data and the index number, clip number and event number attached to it.
- FIG. 8 is a drawing for showing each clip image data displayed in the clip display area 28, the event display area 29, and the program display area 30.
- FIG. 9A is a diagram showing a link state of management record data for managing program data.
- FIG. 9B is a drawing showing a link state of management record data for managing event data.
- FIG. 9C is a diagram showing a link state of management record data for managing clip data.
- FIG. 10 is a drawing showing the internal configuration of the hard disk array 2.
- FIG. 11 is a flowchart showing the initial operation of the editing system.
- FIG. 12 is a flowchart illustrating the first marker operation of the editing system.
- FIGS. 13A and 13B are flowcharts for illustrating the second marking operation of the editing system.
- FIG. 1 conceptually shows the hardware configuration of the entire editing system.
- This editing system consists of a computer 1 and a hard disk array 2.
- Computer 1 is installed with an application program for editing video data sent to the computer.
- This application program, which is installed on the editing computer, is a program that runs under the computer's operating system.
- The application program includes a GUI (graphical user interface) for generating control commands.
- The hard disk array 2 contains a plurality of hard disks connected in an array.
- The hard disk array 2 is controlled so that recording and reproduction are seemingly performed simultaneously. That is, a real-time video signal can be reproduced from the hard disks while an input real-time video signal is being recorded.
- the computer 1 and the hard disk array 2 are connected by a communication cable based on the communication format of the RS-422 interface.
- the RS-422 interface communication format is a communication format capable of simultaneously transmitting and receiving a video signal and a control command.
- the input video signal 3 input to the editing system is a composite video signal captured by a video camera or the like or a composite video signal transmitted from a VTR.
- This composite video signal is a signal transmitted in accordance with the SDI (Serial Digital Interface) format.
- The video signal 4 output from the editing system is also a composite video signal transmitted in accordance with the SDI format.
- the video signal input to and output from the editing system may be a component video signal.
- the video signals input to and output from the editing system are not limited to digital video signals, but may be analog composite video signals.
- a composite video signal captured by a video camera or the like is input to the computer 1 and the hard disk array 2, respectively.
- the video signal input to the computer 1 is displayed on the monitor of the editing computer 1.
- the video signal input to the hard disk array 2 is encoded in real time and recorded on the hard disk.
- An operator operating the computer 1 can specify edit points such as an in point (edit start point) and an out point (edit end point) by operating a pointing device such as the mouse connected to the computer 1.
- In addition, editing is controlled through the GUI displayed on the monitor of the computer 1, and control commands for controlling the hard disk array 2 can be generated.
- the generated control command is transmitted to the hard disk array 2 as an RS-422 control command, and the reproduction control of the hard disk array 2 is performed.
- the reproduced video signal 4 is displayed on the monitor of the computer 1 and sent out.
- The computer 1 includes: a system bus 5 for transmitting control signals and video signals; a CPU 10 for controlling the entire computer; a first video processor 11 and a second video processor 12 for performing image processing and the like on video signals; a display controller 13 for controlling the video signal and the graphics display for the GUI shown on the video monitor 14; an HDD interface 15 for controlling a local hard disk drive (local HDD) 15a; an FDD interface 16 for controlling a floppy disk drive 16a; a pointing device interface 17 for generating control data based on commands from pointing devices such as the cursor control device (commonly called a mouse) 17a, the control panel 17b and the keyboard 17c; and an external interface 18 provided with a software driver for data communication with the hard disk array 2 based on the RS-422 communication format.
- The system bus 5 is a bus for communicating video data, command data and address data inside the computer 1, and consists of an image data bus 5a for transmitting video data and a command data bus 5b for transmitting control signals.
- The first and second video processors 11 and 12, the display controller 13, the HDD interface 15 and the FDD interface 16 are each connected to the image data bus 5a. Therefore, video data can be transmitted between these blocks via the image data bus 5a.
- The CPU 10, the first video processor 11, the second video processor 12, the display controller 13, the HDD interface 15, the FDD interface 16, the pointing device interface 17 and the external interface 18 are connected to the command data bus 5b. In other words, all the blocks inside the computer 1 are connected via the command data bus 5b.
- the CPU 10 is a block for controlling the entire computer.
- The CPU 10 has a ROM 10a that stores the operating system of the computer 1, and a RAM 10b into which the application program recorded on the hard disk 15a is uploaded and stored.
- The CPU 10 executes software programs based on the operating system stored in the ROM 10a.
- When starting the application program, the CPU 10 reads the application program recorded on the hard disk of the hard disk drive 15a and uploads it into the RAM 10b.
- The first video processor 11 is a block that receives the first composite video signal V2 input to the computer 1, converts the composite video signal V2 into digital data, and temporarily buffers the converted data.
- The first video processor 11 includes a processor controller 11a for controlling the entire video processor 11, a data converter 11b for converting the received analog composite video signal into digital component video data, and a frame memory 11c for temporarily storing several frames of video data transmitted from the data converter 11b.
- The processor controller 11a sends to the data converter 11b a control signal for data conversion and a control signal for extracting a time code from the composite video signal V2 input to the data converter 11b. Further, the processor controller 11a outputs control signals for controlling the read/write timing and the read/write addresses of the frame memory 11c. Specifically, the processor controller 11a controls the frame memory 11c so that the time code sent to the display controller 13 and the video data (frame data) correspond to each other.
- The data converter 11b converts the analog composite video signal into a component video signal based on the control signal from the processor controller 11a, and then converts it into digital video data. At this time, the time code data is also extracted.
- The digitally converted video data is sent to the frame memory 11c, and the extracted time code is sent to the processor controller 11a.
- This composite video signal V2 is an analog composite video signal in which a time code is superimposed in the vertical synchronization period of the input video signal V1.
- This time code is a signal inserted into two lines, 14H and 16H or 12H and 14H, during the vertical blanking period, and is called a VITC (Vertical Interval Time Code).
- In this embodiment, a time code generated by a time code generator synchronized with the external system is used; however, a time code generator may instead be provided inside the hard disk array 2 and the internally generated time code may be used. In this embodiment, the time code generated in this way is superimposed on the composite video signal.
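- The time code handled throughout the system identifies each frame by hours, minutes, seconds and frames (HH:MM:SS:FF). As a minimal illustrative sketch (the struct layout and function names below are assumptions, not taken from the patent), such a time code can be represented and formatted as follows:

```c
#include <stdio.h>

/* Illustrative frame-accurate time code of the form HH:MM:SS:FF. */
typedef struct {
    int hours;
    int minutes;
    int seconds;
    int frames;   /* frame number within the second */
} TimeCode;

/* Format a time code into the conventional HH:MM:SS:FF string. */
static void timecode_to_string(const TimeCode *tc, char *buf, size_t len)
{
    snprintf(buf, len, "%02d:%02d:%02d:%02d",
             tc->hours, tc->minutes, tc->seconds, tc->frames);
}

int main(void)
{
    TimeCode tc = { 0, 0, 10, 12 };
    char buf[16];
    timecode_to_string(&tc, buf, sizeof buf);
    printf("%s\n", buf);   /* prints 00:00:10:12 */
    return 0;
}
```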
- the digitally converted video data is temporarily stored in the frame memory 11c.
- The read/write timing of this frame memory 11c is controlled by the processor controller 11a.
- The frame memory 11c is composed of two frame memories and has a total capacity of 4 Mbytes.
- The video data stored in the frame memory 11c is video data of 1520 × 960 pixels, and the frame memory can store two frames of such video data.
- The 1520 × 960 pixel video data stored in the frame memory 11c is read out based on the read control of the processor controller 11a.
- The video data read from the frame memory 11c is not the full 1520 × 960 pixel video data of all pixels; its data amount is thinned out to 380 × 240 pixels.
- Thinning out the data amount here simply means reducing the sampling rate at which video data is read from the frame memory 11c to 1/4, thereby reducing the amount of video data read.
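- The reduction from 1520 × 960 pixels to 380 × 240 pixels therefore corresponds to keeping every fourth pixel in each direction. The following is a minimal sketch of such 1/4 subsampling (the buffer layout and function name are assumptions for illustration only); the same idea, applied again with a step of 4, would also yield the 95 × 60 pixel clip images described later:

```c
#include <stdint.h>
#include <stddef.h>

#define SRC_W 1520
#define SRC_H 960
#define STEP  4                 /* read every 4th pixel in each direction */
#define DST_W (SRC_W / STEP)    /* 380 */
#define DST_H (SRC_H / STEP)    /* 240 */

/* Thin out a full-resolution frame into a 380 x 240 image by lowering the
 * read sampling rate to 1/4, as done when reading the frame memory 11c. */
static void thin_out_frame(uint8_t src[SRC_H][SRC_W],
                           uint8_t dst[DST_H][DST_W])
{
    for (size_t y = 0; y < DST_H; y++)
        for (size_t x = 0; x < DST_W; x++)
            dst[y][x] = src[y * STEP][x * STEP];
}

static uint8_t full_frame[SRC_H][SRC_W];      /* one full-resolution frame */
static uint8_t display_frame[DST_H][DST_W];   /* thinned-out display image */

int main(void)
{
    thin_out_frame(full_frame, display_frame);
    return 0;
}
```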
- the read video data of 380 pixels X 240 pixels is sent to the display controller 13 via the image data bus 5a.
- The second video processor 12 has exactly the same configuration as the first video processor 11.
- That is, it includes a processor controller 12a for controlling the entire video processor 12, a data converter 12b for converting the received analog composite video signal into digital component video data, and a frame memory 12c for temporarily storing several frames of video data transmitted from the data converter 12b.
- The first video processor 11 and the second video processor 12 differ only in that the composite video signal V2 is input to the first video processor 11, whereas the composite video signal V3 is input to the second video processor 12.
- Since the composite video signal V2 is a video signal in which a time code is superimposed in the vertical synchronization period of the input video signal V1 inside the hard disk array 2, it is temporally identical to the input video signal input in real time. That is, the video data stored in the frame memory 11c is the same video data as that obtained by digitizing the input video signal V1.
- the composite video signal V 3 is a video signal reproduced from the hard disk array according to an instruction from the computer 1. Therefore, the composite video signal V3 is a video signal that is not temporally related to the input video signal V1. This will be described in detail below.
- When the operator instructs the computer 1 to play back desired video data, the computer 1 outputs a playback command to the hard disk array 2.
- the hard disk array 2 reproduces video data specified by the operator and a time code corresponding to the video data in response to a command from the computer 1.
- The hard disk array 2 stores time codes and video data so that they correspond to each other on a frame basis. The detailed internal configuration of the hard disk array will be described later.
- the reproduced time code is superimposed during the vertical synchronization period of the reproduced video data.
- the video data on which the time code is superimposed is converted into an analog composite video signal V3 so that the video data can be transmitted to the computer 1, and is output to the computer 1 side.
- the composite video signal V3 supplied to the second processor is transmitted through the data conversion unit 12b and the frame memory 12c in the same manner as the composite video signal supplied to the first video processor. Transmitted to the display controller I3 as digital video data of 80 pixels X 240 pixels.
- the display controller 13 is a control block for controlling data displayed on the monitor 14.
- The display controller 13 has a memory controller 13a and a VRAM (video random access memory) 13b.
- the memory controller 13a controls the read / write timing of the VRAM 13b in accordance with the internal synchronization inside the computer 1.
- This VRAM 13b stores the video data from the frame memory 11c of the first video processor 11, the video data from the frame memory 12c of the second video processor 12, and the image data from the CPU 10, based on timing control signals from the memory controller 13a.
- The image data stored in this VRAM 13b is read out from the VRAM 13b based on timing control signals from the memory controller 13a synchronized with the internal synchronization of the computer, and is displayed graphically on the video monitor 14.
- The graphic display shown on the monitor 14 in this manner serves as the graphic display for the GUI.
- The image data sent from the CPU 10 to the VRAM 13b is, for example, image data such as windows and cursors.
- The hard disk interface 15 is a block for interfacing with the local hard disk drive (HDD) 15a provided inside the computer 1.
- The hard disk interface 15 and the hard disk drive 15a communicate with each other based on the SCSI (Small Computer System Interface) transmission format.
- An application program that boots on the computer 1 is installed on the hard disk drive 15a; when the application program is executed, it is read from the hard disk drive 15a and uploaded to the RAM 10b. When the application program is terminated, the work files created by the editing operation and stored in the RAM 10b are downloaded to the hard disk 15a.
- The floppy disk interface 16 is a block for interfacing with the floppy disk drive (FDD) 16a provided inside the computer 1. Communication between the floppy disk interface 16 and the floppy disk drive 16a is performed based on the SCSI transmission format.
- The floppy disk drive 16a stores an EDL (edit decision list) indicating the result of the editing operation.
- The pointing device interface 17 is a block for interfacing with the mouse 17a, the control panel 17b and the keyboard 17c connected to the computer 1.
- The pointing device interface 17 receives from the mouse 17a, for example, detection information from the two-dimensional rotary encoder provided in the mouse 17a and click information from the left and right buttons provided on the mouse 17a.
- The pointing device interface 17 decodes the received information and sends it to the CPU 10.
- Similarly, the pointing device interface 17 receives information from the control panel 17b and the keyboard 17c, decodes the received information and sends it to the CPU 10.
- the external interface 18 is a block for communicating with the hard disk array 2 connected to the outside of the computer 1.
- The external interface 18 has an RS-422 driver for converting command data generated by the CPU 10 into the RS-422 communication protocol.
- The graphic display shown on the monitor 14 can be roughly divided into ten areas: the recording video display area 21, the timing display area 22, the playback video display area 23, the recording video marking area 24, the playback speed setting area 25, the recycle box area 26, the playback video marking area 27, the clip display area 28, the event display area 29 and the program display area 30. These areas will be described below in order.
- The recording video display area 21 has a recording video screen 21a, a recording start point display section 21b, a storage capacity remaining time display section 21c and a recording display section 21d.
- The video signal displayed on the recording video screen 21a is a video signal obtained from the composite video signal V2 output from the hard disk array 2, and is supplied from the frame memory 11c to the VRAM 13b so as to be 380 × 240 pixels.
- In the recording start point display section 21b, a time code indicating the point in time at which the hard disk array 2 started recording the video signal displayed on the recording video screen 21a is displayed.
- the remaining time of the storage capacity of the hard disk array 2 is displayed in the storage capacity remaining time display section 21c.
- Since the total storage capacity of the hard disk array 2 is known in advance, the remaining time can easily be obtained by subtracting the time elapsed since the start of recording (the current time minus the recording start time) from the recordable time of the hard disk array 2.
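- In other words, the remaining time is the fixed recordable time of the array minus the time that has elapsed since recording started. A minimal sketch of this calculation (times in seconds; the function and parameter names are illustrative):

```c
#include <stdio.h>

/* Remaining recordable time of the hard disk array, in seconds.
 * total_recordable : recordable time derived from the array's total capacity
 * record_start     : time at which recording started
 * now              : current time
 */
static long remaining_record_time(long total_recordable,
                                  long record_start, long now)
{
    long elapsed = now - record_start;      /* time already recorded      */
    long remaining = total_recordable - elapsed;
    return remaining > 0 ? remaining : 0;   /* never report negative time */
}

int main(void)
{
    /* e.g. one hour of capacity, recording started 150 seconds ago */
    printf("%ld seconds remaining\n", remaining_record_time(3600, 100, 250));
    return 0;
}
```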
- The timing display area 22 includes a one-minute display section 22a, a time display section 22b, an input video signal time code display section 22c, a playback video signal time code display section 22d, an on-air display section 22e, a mode button 22f, a preroll button 22g and a playback speed setting (DMC: dynamic motion control) button 22h.
- The one-minute display section 22a is an area for counting one minute (or three minutes) in units of seconds and visually displaying the count. As the count progresses, the color of the display changes second by second, so the operator can visually grasp the passage of time.
- The one-minute display section 22a is used, for example, to count one minute from the time an IN point is specified on the input video side or the playback video side until the OUT point is specified, or to count one minute from the start of a preview when reviewing a created program.
- the current time is displayed on the time display section 22b.
- The input video signal time code display section 22c displays the time code corresponding to the video signal displayed in the recording video display area 21.
- This time code is a time code extracted from the vertical synchronization period of the composite video signal V 2 by the processor controller 11 a of the first video processor 11.
- The playback video signal time code display section 22d displays the time code corresponding to the video signal displayed in the playback video display area 23.
- This time code is a time code extracted from the vertical synchronization period of the composite video signal V3 by the processor controller 12a of the second video processor 12.
- When a signal indicating the on-air state is supplied from the outside, the display of the on-air display section 22e changes to red. This on-air signal is supplied while the composite video signal V3 output from the hard disk array 2 is on the air. Since the color of the on-air display section 22e changes in this way, the operator can visually grasp the on-air state.
- The mode button 22f is a button used to switch between a picture mode, in which graphics as shown in FIG. 3 are displayed, and a timeline mode, in which a timeline is displayed. When this mode button 22f is clicked with the mouse 17a, the display switches between the picture mode and the timeline mode.
- The use of the preroll button 22g and the playback speed setting (DMC: dynamic motion control) button 22h will be described later.
- The playback video display area 23 is provided with a playback video screen 23a, a shuttle button 23b, a jog button 23c and a playback state display section 23d.
- The video signal displayed on the playback video screen 23a is a video signal obtained from the composite video signal V3 reproduced by the hard disk array 2, and is supplied from the frame memory 12c to the VRAM 13b so as to be 380 × 240 pixels.
- The shuttle button 23b is used when fast-forwarding (so-called shuttling) the video data reproduced from the hard disk array 2 and displayed on the playback video screen 23a.
- When the shuttle button 23b is selected with the mouse 17a and dragged in the direction in which the video data is to be advanced, the playback of the hard disk array 2 is controlled in accordance with the drag.
- The jog button 23c is used when the operator wants to advance the video data reproduced from the hard disk array 2 and displayed on the playback video screen 23a frame by frame.
- The recording video marking area 24 is an area used when marking in-point or out-point clip image data from the video data displayed on the recording video screen 21a.
- "Marking" here means specifying an in point or an out point, or setting an in point or an out point.
- The "clip image" here is a still image.
- The recording video marking area 24 includes an IN clip display area 24a, a time code display section 24b, a mark IN button 24c, an OUT clip display area 24d, a time code display section 24e and a mark OUT button 24f.
- The IN clip display area 24a is an area for displaying the clip image data of the in point marked by the operator by clicking the mark IN button 24c.
- The clip image data displayed in the IN clip display area 24a is image data obtained from the composite video signal V2 output from the hard disk array 2, and is supplied from the frame memory 11c to the VRAM 13b so as to be 95 × 60 pixels.
- The time code display section 24b displays the time code of the clip image data displayed in the IN clip display area 24a. This time code is the time code extracted from the composite video signal V2 by the processor controller 11a of the first video processor 11 when the operator clicks the mark IN button 24c and marks the in point.
- The mark IN button 24c is a button for marking an in point. The operator clicks this button while watching the video data displayed on the recording video screen 21a. When the button 24c is clicked, clip image data (95 × 60 pixels) is generated from the video data displayed on the recording video screen 21a at that time, and the generated clip image data is displayed in the IN clip display area 24a. The specific operation will be described later.
- The OUT clip display area 24d is an area for displaying the clip image data of the out point marked by the operator by clicking the mark OUT button 24f.
- The clip image data displayed in the OUT clip display area 24d is image data obtained from the composite video signal V2 output from the hard disk array 2, and is supplied from the frame memory 11c to the VRAM 13b so as to be 95 × 60 pixels.
- The time code display section 24e displays the time code of the clip image data displayed in the OUT clip display area 24d. This time code is the time code extracted from the composite video signal V2 by the processor controller 11a of the first video processor 11 when the operator clicks the mark OUT button 24f and marks the out point.
- The mark OUT button 24f is a button for marking an out point. When the button 24f is clicked, clip image data (95 × 60 pixels) is generated from the video data displayed on the recording video screen 21a at that time, and the generated clip image data is displayed in the OUT clip display area 24d. The specific operation will be described later.
- The recycle box 26 is an area used to delete generated clip image data.
- To delete clip image data, the operator specifies the clip image data with the mouse and drags it into the area of the recycle box 26.
- To restore deleted clip image data, clicking the recycle box 26 displays all of the clip image data discarded in the recycle box 26. If the operator specifies the clip image data to be restored from among them, the specified clip image data is restored.
- The playback video marking area 27 is an area used when marking in-point or out-point clip image data from the video data displayed on the playback video screen 23a.
- The playback video marking area 27 has an IN clip display area 27a, a time code display section 27b, a mark IN button 27c, an OUT clip display area 27d, a time code display section 27e and a mark OUT button 27f.
- The IN clip display area 27a is an area for displaying the clip image data of the in point marked by the operator by clicking the mark IN button 27c.
- The clip image data displayed in the IN clip display area 27a is image data obtained from the composite video signal V3 output from the hard disk array 2, and is supplied from the frame memory 12c to the VRAM 13b so as to be 95 × 60 pixels.
- the time code display section 27b displays the time code of the clip image data displayed in the clip display area 27a.
- This time code is the time code extracted from the composite video signal V3 by the processor controller 12a of the second video processor 12 when the operator clicks the mark IN button 27c and marks the in point.
- The mark IN button 27c is a button for marking an in point. The operator clicks this button while watching the video data displayed on the playback video screen 23a. When the button 27c is clicked, clip image data (95 × 60 pixels) is generated from the video data displayed on the playback video screen 23a at that time, and the generated clip image data is displayed in the IN clip display area 27a. The specific operation will be described later.
- The OUT clip display area 27d is an area for displaying the clip image data of the out point marked by the operator by clicking the mark OUT button 27f.
- The clip image data displayed in the OUT clip display area 27d is image data obtained from the composite video signal V3 output from the hard disk array 2, and is supplied from the frame memory 12c to the VRAM 13b so as to be 95 × 60 pixels.
- The time code display section 27e displays the time code of the clip image data displayed in the OUT clip display area 27d.
- This time code is the time code extracted from the composite video signal V3 by the processor controller 12a of the second video processor 12 when the operator clicks the mark OUT button 27f and marks the out point.
- The mark OUT button 27f is a button for marking an out point. The operator clicks the mark OUT button 27f while watching the video data displayed on the playback video screen 23a. When the button 27f is clicked, clip image data (95 × 60 pixels) is generated from the video data displayed on the playback video screen 23a at that time, and the generated clip image data is displayed in the OUT clip display area 27d. The specific operation will be described later.
- The clip display area 28 is an area for displaying the clip image data marked by clicking the mark IN button 24c and the mark OUT button 24f provided in the recording video marking area 24, and the clip image data marked by clicking the mark IN button 27c and the mark OUT button 27f provided in the playback video marking area 27.
- The clip image data displayed in the clip display area 28 is clip image data that is not used as the in point or out point of an event.
- The clip image data used as the in points and out points of events is displayed in the event display area 29.
- The clip display area 28 includes a clip image data display area 28a, a time code display section 28b, a clip type display section 28c, a clip number display section 28d, a feed button 28e and a return button 28f.
- The clip image data display area 28a displays clip image data of 95 × 60 pixels moved from one of the IN clip display area 24a, the OUT clip display area 24d, the IN clip display area 27a or the OUT clip display area 27d.
- The time code display section 28b displays the time code of the clip image data displayed in the clip image data display area 28a. This time code is moved together with the clip image data when the clip image data is moved from the IN clip display area 24a, the OUT clip display area 24d, the IN clip display area 27a or the OUT clip display area 27d to the clip image data display area 28a.
- The clip type display section 28c displays data indicating whether the clip image data displayed in the clip image data display area 28a is in-point or out-point clip image data. If the clip image data displayed in the clip image data display area 28a is clip image data obtained from the IN clip display area 24a, red "IN" characters are displayed. If it is clip image data obtained from the OUT clip display area 24d, red "OUT" characters are displayed. If it is clip image data obtained from the IN clip display area 27a, blue "IN" characters are displayed. If it is clip image data obtained from the OUT clip display area 27d, blue "OUT" characters are displayed.
- the clip number assigned to the clip image data displayed in the clip image data display area 28a is displayed in the clip number display section 28d.
- This clip number is a number that is automatically added to the clip image data in the order in which the clip image data is marked.
- The feed button 28e and the return button 28f are used to advance or move back the display of the clip image data in the clip display area 28. When a large number of clip image data have been generated, not all of them can be displayed in the clip display area on the monitor. In such a case, by operating the feed button 28e and the return button 28f to advance or move back the clip image data, all of the clip image data can be displayed on the monitor.
- The event display area 29 is an area for displaying the clip image data of events generated by clicking the mark IN button 24c and the mark OUT button 24f provided in the recording video marking area 24 in sequence, or by clicking the mark IN button 27c and the mark OUT button 27f provided in the playback video marking area 27 in sequence. For each event, either the in-point clip image data or the out-point clip image data is displayed.
- Like the clip display area 28, the event display area 29 has a clip image data display area 29a, a time code display section 29b, a clip type display section 29c, an event number display section 29d, a feed button 29e, a return button 29f and an event title display section 29g.
- The clip type display section 29c displays whether the clip image data of the event displayed in the clip image data display area 29a is in-point or out-point clip image data. If the in-point clip image data is displayed as the clip image data of the event, the characters "IN" are shown on the clip type display section. If the operator wants to display the out-point clip image data instead of the in-point clip image data, clicking this clip type display section 29c causes the out-point clip image data to be displayed. Thereafter, each time the clip type display section 29c is clicked, the display switches between the in-point clip image data and the out-point clip image data.
- the event number display section 29 d displays the event number assigned to the generated event. This event number is a number that is automatically assigned to the event in the order in which the event was generated, and is completely independent of the clip number.
- In the event title display section 29g, the title attached to the event is displayed in characters.
- In the program display area 30, a copy of the clip image data of the events displayed in the event display area 29 is displayed.
- A program is created by dragging the events displayed in the event display area 29 into the program display area 30; the events copied there can be freely rearranged.
- Like the event display area 29, the program display area 30 has a clip image data display area 30a, a time code display section 30b, a clip type display section 30c, an event number display section 30d, a feed button 30e, a return button 30f and an event title display section 30g.
- The recording start button 31a and the recording end button 31b are buttons for sending recording start and recording end control commands to the hard disk array 2.
- When the recording start button 31a is clicked, the CPU 10 recognizes that the recording start button 31a has been pressed and instructs the external interface 18 to output a recording start command.
- The external interface 18 converts the instruction from the CPU 10 into a recording start command (REC START command) defined by RS-422 and sends it to the hard disk array 2.
- The hard disk array 2 starts recording the input video signal V1 on the hard disks in response to the received recording start command.
- When the recording end button 31b is clicked, the CPU 10 detects that the recording end button 31b has been pressed and instructs the external interface 18 to output a recording end command.
- The external interface 18 converts the instruction from the CPU 10 into a recording stop command (REC STOP command) defined by RS-422 and sends it to the hard disk array 2.
- The hard disk array 2 ends the recording of the input video signal V1 on the hard disks in response to the received recording stop command.
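- The pattern is the same for each button: the CPU 10 recognizes the button press and asks the external interface 18 to translate the instruction into the corresponding RS-422 command before sending it to the hard disk array 2. The sketch below illustrates only that translation step (the enum values, function names and printed strings are assumptions; the actual RS-422 framing is not described here):

```c
#include <stdio.h>

/* Abstract editing commands issued by the CPU 10. */
typedef enum {
    CMD_REC_START,    /* recording start button 31a */
    CMD_REC_STOP,     /* recording end button 31b   */
    CMD_PLAY_START    /* preview button 32          */
} EditCommand;

/* Illustrative stand-in for the external interface 18: map an abstract
 * command to an RS-422 command name and send it to the hard disk array. */
static void external_interface_send(EditCommand cmd)
{
    const char *rs422_name;
    switch (cmd) {
    case CMD_REC_START:  rs422_name = "REC START";  break;
    case CMD_REC_STOP:   rs422_name = "REC STOP";   break;
    case CMD_PLAY_START: rs422_name = "PLAY START"; break;
    default:             rs422_name = "UNKNOWN";    break;
    }
    /* In the real system this would be framed according to the RS-422
     * protocol and written to the serial line to the hard disk array 2. */
    printf("RS-422 -> hard disk array 2: %s\n", rs422_name);
}

int main(void)
{
    external_interface_send(CMD_REC_START);
    external_interface_send(CMD_REC_STOP);
    return 0;
}
```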
- The preview button 32 is used to preview a selected event or program.
- The clip image data of the specified event or program is displayed on the playback video screen 23a as a still image.
- When the preview button 32 is clicked, the CPU 10 detects that the preview button has been pressed and instructs the external interface 18 to output a playback start command.
- The external interface 18 converts the instruction from the CPU 10 into a playback start command (PLAY START command) defined by RS-422 and sends it to the hard disk array 2.
- the hard disk array 2 starts reproducing the composite video signal V3 from the hard disk in response to the received reproduction start command.
- The new event button 33 is used to create a new event. It is clicked when an event whose in point and out point have been changed from those of the event specified by the operator is to be registered as a separate new event.
- The replace button 34 is used when the operator wants to change the in point and out point of the selected event. It is clicked when the event specified by the operator is to be replaced by the event whose in point and out point have been changed, rather than registering it as a separate new event.
- The delete button 35 is used to delete a selected event or program.
- the erased event or program is discarded in the recycle box 26.
- The clip data includes data for displaying the clip image data in the clip display area and data for storing the clip image data. The same applies to the event data and the program data.
- First, the first management record data for the clip data, the event data and the program data will be described with reference to FIG. 4.
- the first management record data for the clip data is data for managing all the clip image data displayed in the clip display area 28.
- the first management record data for event data is data for managing all the clip image data displayed in the event display area 29.
- the first management record data for program data is data for managing all the clip image data displayed in the program display area 30.
- There is only one set of first management record data each for the clip data, the event data and the program data.
- The first management record data has a pointer to the previously linked data, a pointer to the data linked later, the horizontal display size for one page, the vertical display size for one page, the display position on the screen, the display start position, and the total number of links.
- The pointer to the previously linked data is data indicating the pointer of the management record data linked before this first management record data. If there is no previously linked management record data, the pointer of the first management record data itself is recorded.
- The pointer to the data linked later is data indicating the pointer of the management record data linked after this first management record data. If no management record data is linked later, the pointer of the first management record data itself is recorded.
- The horizontal display size for one page is data indicating the maximum number of clip image data displayed in the horizontal direction in each of the clip display area 28, the event display area 29 and the program display area 30. In this embodiment, each of the display areas 28, 29 and 30 can display eleven clip image data in the horizontal direction, so data indicating "11" is recorded in each first management record data as the horizontal display size for one page.
- The vertical display size for one page is data indicating the maximum number of clip image data displayed in the vertical direction in each of the clip display area 28, the event display area 29 and the program display area 30.
- Since each of the clip display area 28, the event display area 29 and the program display area 30 can display only one clip image data in the vertical direction, data indicating "one" is recorded in each first management record data.
- The display position on the screen is data indicating in which display area the clip image data is displayed. In this embodiment, the program display area 30 is located at the upper part of the screen, the event display area 29 in the middle, and the clip display area 28 at the lower part.
- The display start position is data indicating from which position the clip image data is displayed in the clip display area 28, the event display area 29 and the program display area 30. In this embodiment, eleven clip image data can be displayed in the clip display area 28, eleven in the event display area 29 and eleven in the program display area 30, so a total of 33 clip image data can be displayed.
- These display positions are managed by numbering the 33 positions from the top of the screen. For example, the display positions of the program display area 30 are determined to be positions 1 to 11, the display positions of the event display area 29 to be positions 12 to 22, and the display positions of the clip display area 28 to be positions 23 to 33.
- Accordingly, in the first management record data for clip data, data indicating "23" is recorded as the display start position; in the first management record data for event data, data indicating "12" is recorded as the display start position; and in the first management record data for program data, data indicating "1" is recorded as the display start position.
- the total number of links is data indicating how many management record data are linked after the first management record data.
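- Taken together, the first management record data can be pictured as the head node of a doubly linked list, with exactly one instance each for clip data, event data and program data. The following C sketch is illustrative only; the field names and types are assumptions, since the patent only names the fields:

```c
/* Head record of the management-record list; one each for clip data,
 * event data and program data. */
struct FirstManagementRecord {
    void *prev;             /* pointer to previously linked data (own pointer if none)    */
    void *next;             /* pointer to data linked later (own pointer if none)         */
    int   horizontal_size;  /* clip images per row on one page: 11                        */
    int   vertical_size;    /* clip images per column on one page: 1                      */
    int   display_area;     /* display position on screen (clip / event / program area)   */
    int   display_start;    /* display start position: 1 (program), 12 (event), 23 (clip) */
    int   total_links;      /* number of second management records linked after this one  */
};
```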
- The second management record data for the clip data is data for managing, for each clip image data, the clip image data displayed in the clip display area 28. Accordingly, there are as many second management record data for the clip data as there are clip image data displayed in the clip display area 28.
- The second management record data for the clip data has a pointer to the previously linked data, a pointer to the data linked later, an attribute, a clip image data handle, a clip type, time code data, and the index number of the clip image data.
- The pointer to the previously linked data is data indicating the pointer of the management record data linked before this second management record data. Since a second management record data always has first management record data or second management record data before it, the pointer to the previously linked data is always recorded.
- The pointer to the data linked later is data indicating the pointer of the management record data linked after this second management record data. If there is no management record data linked later, the pointer of this second management record data itself is recorded.
- The attribute is data indicating whether this second management record data is for clip data, for event data or for program data.
- The clip image data handle is data indicating the address where the clip image data is recorded. Therefore, by referring to the clip image data handle in the second management record data corresponding to a desired clip image data, the address where that clip image data is stored can be obtained.
- The clip type is data indicating whether the clip image data managed by this second management record data is in-point clip image data or out-point clip image data.
- The time code data is data indicating the time code of the clip image data managed by this second management record data.
- the index number of the clip image data is the index number assigned to the clip image data.
- The index number is a number sequentially assigned to all marked clip image data, regardless of the generation of in points, out points and events. That is, it is the same number as the clip number displayed in the clip number display section 28d. All clip image data are managed by this index number.
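- As an illustrative sketch, the second management record data for clip data can be written as a list node carrying the fields just described (field names, types and the time code representation are assumptions):

```c
typedef struct { int hours, minutes, seconds, frames; } TimeCode;  /* HH:MM:SS:FF */

/* One node per clip image data shown in the clip display area 28. */
struct SecondManagementRecordClip {
    void    *prev;               /* pointer to previously linked record (always present)   */
    void    *next;               /* pointer to record linked later (own pointer if none)   */
    int      attribute;          /* whether this record is for clip, event or program data */
    void    *clip_image_handle;  /* address where the 95 x 60 clip image data is stored    */
    int      clip_type;          /* IN-point or OUT-point clip image data                  */
    TimeCode time_code;          /* time code of the clip image data                       */
    int      index_number;       /* number assigned in marking order to all clip images    */
};
```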
- the second management record data for event data is data for managing the clip image data displayed in the event display area 29 for each clip image data. Therefore, there are the same number of second management record data for event data as the number of clip image data displayed in the event display area 29.
- The second management record data for program data is data for managing, for each clip image data, the clip image data displayed in the program display area 30. Therefore, there are as many second management record data for program data as there are clip image data displayed in the program display area 30.
- The second management record data for event data and program data has a pointer to the previously linked data, a pointer to the data linked later, an attribute, an event number, title information, subtitle information, the clip image data handle of the in point, the clip type of the in point, the time code data of the in point, the index number of the clip image data of the in point, the clip image data handle of the out point, the clip type of the out point, the time code data of the out point, the index number of the clip image data of the out point, a slow type, a symbol type, and symbol time code data. The pointer to the previously linked data, the pointer to the data linked later and the attribute are the same as in the second management record data for clip data described above, so their description is omitted here.
- the event numbers are numbers assigned to the events in the order in which they were generated. This event number is displayed on the event number display section 29d.
- The title and subtitle are the title and subtitle assigned in advance to the registered event, and are stored as actual characters.
- The title is displayed in the event title display section 29g.
- The clip image data handle of the in point is data indicating the address where the in-point clip image data is recorded. Therefore, by referring to the in-point clip image data handle in the second management record data corresponding to a desired in-point clip image data, the address where that in-point clip image data is stored can be obtained.
- The clip type of the in point is data indicating whether the in-point clip image data managed by this second management record data is in-point clip image data or out-point clip image data. Here, data indicating in-point clip image data is stored.
- the time code data at the in-point is data indicating a time code of clip image data at the in-point managed by the second management record data.
- The index number of the clip image data of the in point is the index number assigned to the in-point clip image data. Like the index number in the second management record data for clip data described above, this index number is sequentially assigned to all marked clip image data, regardless of the generation of in points, out points and events.
- The clip image data handle of the out point is data indicating the address where the out-point clip image data is recorded. Therefore, by referring to the out-point clip image data handle in the second management record data corresponding to a desired out-point clip image data, the address where that out-point clip image data is stored can be obtained.
- The clip type of the out point is data indicating whether the out-point clip image data managed by this second management record data is in-point clip image data or out-point clip image data. Here, data indicating out-point clip image data is stored.
- The time code data of the out point is data indicating the time code of the out-point clip image data managed by this second management record data.
- The index number of the clip image data of the out point is the index number assigned to the out-point clip image data. Like the index number in the second management record data for clip data described above, this index number is sequentially assigned to all marked clip image data, regardless of the generation of in points, out points and events.
- The slow type is data indicating whether playback speed control using the playback speed setting area 25 is set for the event or program managed by this second management record data, or whether normal playback control is set.
- The symbol type is data indicating whether or not clip image data defined as a symbol exists in the period between the in point and the out point of the event managed by this second management record data.
- A symbol means representative clip image data that represents the event.
- The symbol time code data is the time code of the clip image data set as the symbol.
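- The second management record data for event data and program data can be sketched in the same illustrative way (field names, types and fixed buffer sizes are assumptions; the patent only lists the fields):

```c
typedef struct { int hours, minutes, seconds, frames; } TimeCode;  /* HH:MM:SS:FF */

/* One node per event (or per event placed in a program). */
struct SecondManagementRecordEvent {
    void    *prev, *next;        /* doubly linked list pointers                     */
    int      attribute;          /* event data or program data                      */
    int      event_number;       /* assigned in the order events are generated      */
    char     title[64];          /* title shown in the event title display 29g      */
    char     subtitle[64];
    void    *in_handle;          /* address of the IN-point clip image data         */
    int      in_clip_type;       /* marks this as IN-point data                     */
    TimeCode in_time_code;
    int      in_index_number;
    void    *out_handle;         /* address of the OUT-point clip image data        */
    int      out_clip_type;      /* marks this as OUT-point data                    */
    TimeCode out_time_code;
    int      out_index_number;
    int      slow_type;          /* DMC playback-speed control or normal playback   */
    int      symbol_type;        /* whether a representative symbol image is set    */
    TimeCode symbol_time_code;   /* time code of the symbol clip image data         */
};
```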
- The "marking" row shown in FIG. 7 indicates whether the marking was performed as an IN point or an OUT point.
- The "INDEX No." row indicates the index numbers assigned to the clip image data of the marked in points and out points.
- The index number is a number sequentially assigned to all marked clip image data, regardless of whether the mark is an in point or an out point. Therefore, as shown in FIG. 7, index numbers 1 to 15 are sequentially assigned to the marked clip image data.
- The "Clip No." row indicates the clip number displayed in the clip number display section 28d of the clip display area 28. The clip number displayed in the clip number display section 28d is the same number as the index number.
- The "Event No." row indicates the event number displayed in the event number display section 29d of the event display area 29. The event number is a number automatically assigned in the order of event generation, irrespective of the index number and the clip number.
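- To make the three numbering schemes concrete, the following is a minimal illustrative sketch (the counters and function names are assumptions, not the patent's implementation): every marking advances a single index counter, the clip number simply mirrors the index number, and the event number comes from a separate counter that only advances when an IN/OUT pair is registered as an event:

```c
#include <stdio.h>

static int next_index = 1;   /* runs over every marked clip image */
static int next_event = 1;   /* runs only over registered events  */

/* Called for every mark (IN or OUT): returns the index number, which is also
 * used as the clip number shown in the clip number display section 28d. */
static int assign_index(void)
{
    return next_index++;
}

/* Called when an IN point / OUT point pair is registered as an event. */
static int assign_event_number(void)
{
    return next_event++;
}

int main(void)
{
    int in_idx  = assign_index();         /* e.g. index 1 */
    int out_idx = assign_index();         /* e.g. index 2 */
    int ev      = assign_event_number();  /* event 1, independent of the indexes */
    printf("IN index %d, OUT index %d -> event number %d\n", in_idx, out_idx, ev);
    return 0;
}
```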
- FIG. 8 shows which clip image data is displayed in the clip display area 28, the event display area 29 and the program display area 30 when marking is performed as shown in FIG. 7.
- In the clip display area 28, the clip image data of index number 1, the clip image data of index number 6, the clip image data of index number 7, the clip image data of index number 12, the clip image data of index number 13 and the clip image data of index number 14 are displayed in this order.
- In the event display area 29, the four created events are displayed.
- The clip image data of index number 2 is displayed as the event of event number 1.
- the clip image data of index number 4 is displayed as the event of event number 2.
- the clip image data of index number 8 is displayed as the event of event number 3
- the clip image data of index number 10 is displayed as the event of event number 4.
- Clip image data is not displayed in the program display area 30 simply by specifying an in point and an out point.
- In the program display area 30, a program as shown in FIG. 8 is created by rearranging the four events displayed in the event display area 29.
- The program is a program in which the event of event number 2, the event of event number 4 and the event of event number 1 are connected in series in this order. Therefore, in the program display area 30, the clip image data of index number 4 registered as the event of event number 2, the clip image data of index number 10 registered as the event of event number 4, and the clip image data of index number 7 registered as the event of event number 1 are displayed.
- FIG. 9A, FIG. 9B and FIG. 9C show how the clip image data is managed by the first management record data and the second management record data.
- FIG. 9C shows how the clip image data displayed in the clip display area 28 is managed.
- The management record data 101 is the first management record data for clip data. As shown in FIG. 4, this first management record data 101 has data for managing the entire clip display area 28 and the positions of the clip image data displayed in the clip display area 28.
- The second management record data 201 is data for managing the clip image data of index number 1.
- The second management record data 201 has a clip image data handle for indicating the address where the clip image data of index number 1 is stored.
- the management record data 206 linked after the second management record data 201 is the second management record data for the clip.
- The second management record data 206 is data for managing the clip image data of index number 6, and has a clip image data handle for indicating the address where the clip image data of index number 6 is stored.
- After the second management record data 206, the second management record data 207 for managing the clip image data of index number 7 is linked, and the second management record data 212 is linked after that. After the second management record data 212, the second management record data 213 for managing the clip image data of index number 13 is linked, and after the second management record data 213, the second management record data 214 for managing the clip image data of index number 14 is linked.
- FIG. 9B shows how clip image data displayed in the event display area 29 is managed.
- The management record data 102 is the first management record data for event data. As shown in FIG. 4, the first management record data 102 has data for managing the entire event display area 29 and the positions of the clip image data displayed in the event display area 29.
- The management record data 202 is second management record data for event data.
- The second management record data 202 has data for managing the clip image data at the in point indicated by index number 2 and the clip image data at the out point indicated by index number 3. More specifically, the second management record data 202 has an in-point clip image data handle indicating the address of the in-point clip image data indicated by index number 2 and, similarly, an out-point clip image data handle indicating the address of the out-point clip image data indicated by index number 3.
- After the second management record data 202, the second management record data 204 for managing the in-point clip image data of index number 4 and the out-point clip image data of index number 5 is linked. After the second management record data 204, the second management record data 208 for managing the in-point clip image data of index number 8 and the out-point clip image data of index number 9 is linked, and after the second management record data 208, the second management record data 210 for managing the in-point clip image data of index number 10 and the out-point clip image data of index number 11 is linked.
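- As an aid to reading FIGS. 9A to 9C, the following is a minimal sketch of the linked structure described above. It is not the patent's actual data layout: the names SecondRecord, FirstRecord, in_handle, out_handle and next are assumptions standing in for the second management record data, the first management record data, the clip image data handles and the "pointer to the data linked later".

```python
# Minimal sketch of the management record chains described above.
# All names and types are illustrative assumptions, not the patent's layout.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class SecondRecord:
    """Second management record data: holds handles (addresses) of clip image data."""
    index_no: int                            # index number of the in-point clip image data
    in_handle: Optional[str] = None          # handle to the in-point clip image data
    out_handle: Optional[str] = None         # handle to the out-point clip image data (events)
    next: Optional["SecondRecord"] = None    # the "pointer to the data linked later"


@dataclass
class FirstRecord:
    """First management record data: manages one display area and the head of a chain."""
    area_name: str
    head: Optional[SecondRecord] = None

    def linked_order(self) -> List[int]:
        """Walk the chain and return the index numbers in display order."""
        order, rec = [], self.head
        while rec is not None:
            order.append(rec.index_no)
            rec = rec.next
        return order


# Chain for the event display area 29 in FIG. 9B: records 202 -> 204 -> 208 -> 210.
events = FirstRecord("event display area 29")
r202 = SecondRecord(2, in_handle="addr_idx2", out_handle="addr_idx3")
r204 = SecondRecord(4, in_handle="addr_idx4", out_handle="addr_idx5")
r208 = SecondRecord(8, in_handle="addr_idx8", out_handle="addr_idx9")
r210 = SecondRecord(10, in_handle="addr_idx10", out_handle="addr_idx11")
events.head = r202
r202.next, r204.next, r208.next = r204, r208, r210

print(events.linked_order())   # -> [2, 4, 8, 10]
```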
- FIG. 9A shows how the clip image data displayed in the program display area 30 is managed.
- The management record data 103 is the first management record data for program data. As shown in FIG. 4, the first management record data 103 has data for managing the entire program display area 30 and the positions of the clip image data displayed in the program display area 30.
- After the first management record data 103, the second management record data 204 for managing the clip image data is linked, and after the second management record data 204 follows the second management record data for the clip image data of the in point of index number 10. After that, the second management record data 208 for managing the in-point clip image data of index number 8 and the out-point clip image data of index number 9 is linked.
- Comparing FIG. 9B, which shows the management of event data, with FIG. 9A, which shows the management of programs, the order in which the clip image data of index number 2, the clip image data of index number 4, and the clip image data of index number 10 are stored has not been changed between FIG. 9B and FIG. 9A. This means that the storage location of the clip image data has not been changed at all. What differs between FIG. 9B and FIG. 9A is that the link order of the second management record data has been changed.
- That is, to change the display order of events, the link order of the second management record data, which directly manages the clip image data, is changed instead of changing the storage location of the clip image data representing the events. Therefore, there is an effect that the display order of events can be changed at high speed.
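- As a sketch of why this is fast, continuing the hypothetical SecondRecord/FirstRecord model above: swapping two events rewrites only the link fields of the second management record data, and the clip image data referenced by the handles never moves.

```python
# Continues the hypothetical record model sketched above.  Swapping the display
# order of the first two events only relinks the second management record data;
# no clip image data is copied or moved.
def swap_first_two(first: "FirstRecord") -> None:
    a = first.head
    if a is None or a.next is None:
        return                       # fewer than two records: nothing to swap
    b = a.next
    a.next, b.next, first.head = b.next, a, b

swap_first_two(events)
print(events.linked_order())         # -> [4, 2, 8, 10]
```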
- This is not limited to the change of the display order of events; the same applies to the change of the display order of the clip image data displayed in the clip display area 28.
- Also when the display order of the clip image data is changed, or when clip image data is deleted or newly added, only the second management record data is changed; the storage position of the clip image data itself is not actually changed.
- the first to fifteenth markings will be described in order.
- Before the first marking, the first management record data 101 for clip data, the first management record data 102 for event data, and the first management record data 103 for program data have already been generated at the head addresses of the respective storage areas. However, since there is no second management record data linked to any of the management record data, its own address is stored in the "pointer to the data linked later".
- Reading from the frame memory 11e is controlled so as to form clip image data of 95 pixels × 60 pixels.
- The formed clip image data is stored as the clip image data of index number 1 in the empty area of the RAM 10b. Simultaneously with this storage, the formed clip image data is displayed in the in-clip display area 24a.
- The second management record data 201 for managing this clip image data is temporarily stored in a register in the CPU 10 and is not yet stored in the RAM 10b. The reason is that, at this point, it is unclear to which management record data the second management record data 201 is to be linked.
- the clip image data of index number 2 is similarly formed and stored in the free area of the RAM 10b.
- Since the in point was marked twice in succession, the clip image data of index number 1 displayed in the in-clip display area 24a will not be used as an event. Therefore, the clip image data of index number 1 displayed in the in-clip display area 24a is moved to the clip display area 28.
- It is thereby decided that the second management record data 201 for managing the clip image data of index number 1 is linked to the first management record data 101 for clip data. Therefore, as shown in FIG. 9C, the second management record data 201 temporarily stored in the register of the CPU 10 is stored in the RAM 10b as linked to the first management record data 101.
- The clip image data of index number 2 generated by the second marking is newly displayed in the in-clip display area 24a.
- The second management record data 202 for managing the clip image data of index number 2 is newly stored temporarily in a register in the CPU 10.
- In the third marking, the clip image data of index number 3 is formed in the same manner and stored in the free area of the RAM 10b. Since this third marking is an out point, an event is formed with the clip image data of index number 2 as the in point and the clip image data of index number 3 as the out point. Therefore, the clip image data of index number 2 displayed in the in-clip display area 24a is copied to the event display area 29.
- It is thereby decided that the second management record data 202 for managing the clip image data of index number 2 is linked to the first management record data 102 for event data. Therefore, as shown in FIG. 9B, the second management record data 202 temporarily stored in the register of the CPU 10 is stored in the RAM 10b as linked to the first management record data 102.
- The clip image data of index number 3 generated by this third marking is newly displayed in the out-clip display area 24d.
- Since it has been determined that the second management record data 202 for managing the clip image data of index number 3 is linked to the first management record data 102, it is not temporarily stored in a register in the CPU 10.
- In the fourth marking, the clip image data of index number 4 is formed in the same manner and stored in the empty area of the RAM 10b.
- The formed clip image data is displayed in the in-clip display area 24a.
- The second management record data 204 for managing the clip image data of index number 4 is temporarily stored in a register in the CPU 10.
- Since the clip image data of index number 3 displayed in the out-clip display area 24d has already been registered as an event, it is cleared from the display area 24d.
- the clip image data of index number 5 is formed and stored in the empty area of RAM10b.
- Since this fifth marking is an out point, an event is formed with the clip image data of index number 4 as the in point and the clip image data of index number 5 as the out point. Therefore, the clip image data of index number 4 displayed in the in-clip display area 24a is copied to the event display area 29.
- By this fifth marking, it is decided that the second management record data 204 for managing the clip image data of index number 4 stored in the register is linked to the second management record data 202, and it is stored in the RAM 10b as linked to the second management record data 202.
- The clip image data of index number 5 generated by the fifth marking is newly displayed in the out-clip display area 24d.
- Since it has been determined that the second management record data 204 for managing the clip image data of index number 5 is linked to the second management record data 202, it is not temporarily stored in a register in the CPU 10.
- In the sixth marking, the clip image data of index number 6 is formed and stored in the empty area of the RAM 10b. Simultaneously with the storage, the formed clip image data of index number 6 is displayed in the in-clip display area 24a. Similarly to the fourth marking, the second management record data 206 for managing the clip image data of index number 6 is temporarily stored in a register in the CPU 10. In addition, since the clip image data of index number 5 displayed in the out-clip display area 24d has already been registered as an event, it is cleared from the display area 24d.
- In the seventh marking, the clip image data of index number 7 is formed in the same manner and stored in the empty area of the RAM 10b. Since the in point was marked twice in succession, the clip image data of index number 6 displayed in the in-clip display area 24a is moved to the clip display area 28.
- At the same time, as shown in FIG. 9C, the second management record data 206 stored in the register of the CPU 10 is stored in the RAM 10b as linked to the second management record data 201.
- The formed clip image data of index number 7 is displayed in the in-clip display area 24a.
- The second management record data 207 for managing the clip image data of index number 7 is temporarily stored in a register in the CPU 10.
- The ninth to fifteenth markings are also performed in the same manner, so their description is omitted.
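- The way the markings drive the two chains can be illustrated by continuing the earlier SecondRecord/FirstRecord sketch. The rule names below are assumptions made for illustration; the "pending" variable plays the role of the register in the CPU 10 that temporarily holds a second management record data.

```python
# Continues the earlier SecondRecord / FirstRecord sketch.  A pending in point is
# held aside (as the register in the CPU 10 does in the text); a second in point
# in a row moves the pending clip onto the clip chain, while an out point turns
# the pending in point into an event record appended to the event chain.
from typing import Optional

def append(first: FirstRecord, rec: SecondRecord) -> None:
    """Link rec at the tail of the chain managed by the first management record data."""
    if first.head is None:
        first.head = rec
        return
    tail = first.head
    while tail.next is not None:
        tail = tail.next
    tail.next = rec

clip_chain = FirstRecord("clip display area 28")
event_chain = FirstRecord("event display area 29")
pending_in: Optional[SecondRecord] = None

def mark(kind: str, index_no: int) -> None:
    global pending_in
    if kind == "IN":
        if pending_in is not None:              # two in points in succession:
            append(clip_chain, pending_in)      # the earlier clip moves to the clip chain
        pending_in = SecondRecord(index_no, in_handle=f"addr_idx{index_no}")
    elif kind == "OUT" and pending_in is not None:
        pending_in.out_handle = f"addr_idx{index_no}"
        append(event_chain, pending_in)         # in point + out point -> event
        pending_in = None

# First five markings of FIG. 7: IN, IN, OUT, IN, OUT.
for kind, idx in [("IN", 1), ("IN", 2), ("OUT", 3), ("IN", 4), ("OUT", 5)]:
    mark(kind, idx)

print(clip_chain.linked_order())    # -> [1]
print(event_chain.linked_order())   # -> [2, 4]
```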
- FIG. 10 shows the entire configuration of the hard disk array 2.
- The hard disk array 2 mainly includes a system controller 70, a video data input/output unit 71, a video data processing unit 72, a hard disk 73 for video data, an audio data input/output unit 74, an audio data processing unit 75, and a hard disk 76 for audio data.
- the system controller 70 has a CPU 70a and a time code generator 70b.
- The system controller 70 further has a DMA controller (Direct Memory Access controller) 70c, a SCSI controller 70d, a DMA controller 70e, and a SCSI controller 70f.
- the CPU 70a is a central processing circuit for controlling all the blocks of the disk array 2.
- The CPU 70a receives control commands based on the RS-422 communication protocol supplied from the outside to the system controller 70 and, according to those commands, controls the DMA controllers 70c and 70e and the SCSI controllers 70d and 70f.
- The CPU 70a also receives the external time code supplied to the system controller 70 from the outside, or the time code from the time code generator 70b, and supplies the received time code data to the video data input/output unit 71 and the audio data input/output unit 74. Further, the CPU 70a manages, in frame units, all the recording addresses of the video data stored in the video hard disk 73 and all the time codes of the recorded frames.
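- A hedged sketch of the per-frame bookkeeping attributed to the CPU 70a: for every recorded frame it keeps a time code and a recording address, so that a later request for a given time code can be resolved to a disk address. The table layout and function names are assumptions made purely for illustration.

```python
# Hypothetical illustration of the CPU 70a's frame bookkeeping: one entry per
# recorded frame, mapping its time code to the recording address on the disk.
frame_table: dict = {}                      # time code -> recording address

def note_recorded_frame(time_code: str, address: int) -> None:
    """Remember where a frame with this time code was recorded."""
    frame_table[time_code] = address

def address_for(time_code: str) -> int:
    """Resolve a marked time code back to the address to read from the disk."""
    return frame_table[time_code]

note_recorded_frame("00:01:23:15", 0x4A30)
print(hex(address_for("00:01:23:15")))      # -> 0x4a30
```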
- The DMA controller 70c controls, in accordance with commands from the CPU 70a, the read timing when reading video data from the buffer memories 72b and 72e provided in the video data processing unit 72 and the write timing when writing video data to the buffer memories 72b and 72e.
- The SCSI controller 70d converts the control commands from the CPU 70a, the frame-unit video data received from the DMA controller 70c, and the time code data related to that frame video data into SCSI communication commands and sends them to the hard disk 73 for video data. It also converts the video data reproduced from the hard disk 73 from SCSI communication commands and supplies it to the DMA controller 70c.
- The DMA controller 70e controls, in accordance with commands from the CPU 70a, the read timing when reading audio data from the buffer memories 75b and 75e provided in the audio data processing unit 75 and the write timing when writing audio data to the buffer memories 75b and 75e.
- The SCSI controller 70f converts the control commands from the CPU 70a, the frame-unit audio data received from the DMA controller 70e, and the time code data related to that frame audio data into SCSI communication commands and sends them to the hard disk 76 for audio data. It also converts the audio data reproduced from the hard disk 76 from SCSI communication commands and supplies it to the DMA controller 70e.
- The input system of the video data input/output unit 71 extracts the synchronization signal of the input video signal V1, converts the input video signal V1 into component signals, and has an A/D conversion circuit 71b for converting the analog component video signals into digital video data.
- The output system of the video data input/output unit 71 includes a D/A conversion circuit 71d that converts the playback video signal of the first channel supplied from the video data processing unit 72 into an analog signal; an encoder that converts the playback video signal of the first channel into a composite signal and superimposes a time code on the vertical synchronization period of the output composite video signal based on the phase of the external synchronization signal supplied to the system controller 70; a D/A conversion circuit 71f that converts the playback video signal of the second channel supplied from the video data processing unit 72 into an analog signal; and an encoder that converts the playback video signal of the second channel into a composite signal and superimposes the external synchronization signal supplied to the system controller 70 on the output composite video signal.
- The input system of the video data processing unit 72 includes a compression unit 72a that compresses the video data supplied from the video data input/output unit 71 in frame units based on the JPEG standard, and a buffer memory 72b that stores the video data from the compression unit 72a based on the write command of the DMA controller 70c and supplies the compressed video data to the expansion unit 72c or the DMA controller 70c based on the read command from the DMA controller 70c.
- The output system of the video data processing unit 72 includes an expansion unit 72c that receives the compressed video data from the buffer memory 72b, decompresses it, and outputs it as video data of the first channel, and an expansion unit 72d that receives the compressed video data from the buffer memory 72e, decompresses it, and outputs it as video data of the second channel. The buffer memory 72b stores video data for the first channel and the buffer memory 72e stores video data for the second channel.
- The buffer memories 72b and 72e are composed of FIFO memories and have a capacity to store 15 frames of video data.
- The audio data input/output unit 74, like the video data input/output unit 71, has a decoder 74a for converting the input audio signal A1, an A/D conversion circuit 74b for converting the analog audio signal into digital audio data, a D/A conversion circuit 74d for converting the playback audio data supplied from the audio data processing unit 75 into an analog signal, and an encoder 74c for converting the analog audio signal supplied from the D/A conversion circuit into the audio signal A2.
- The audio data processing unit 75 includes a compression unit 75a for compressing the audio data supplied from the audio data input/output unit 74, a buffer memory 75b that stores the audio data from the compression unit 75a and supplies the compressed audio data to the DMA controller 70e based on control commands from the DMA controller 70e, a buffer memory 75d that receives the reproduced audio data from the DMA controller 70e and outputs it to the expansion unit 75c, and an expansion unit 75c that receives the playback audio data from the buffer memory 75d and decompresses the compressed audio data.
- the video signal supplied to the video data input / output unit 71 is subjected to predetermined input / output processing and supplied to the video data processing unit 72.
- the video data compressed by the compression section 72a of the video data processing section 72 is supplied to the buffer memory 72b.
- The same video data is supplied to both the DMA controller 70c and the expansion circuit 72c in accordance with the read command from the DMA controller 70c.
- The CPU 70a supplies the time code data supplied from the time code generator 70b to the SCSI controller 70d so that the time code data is associated with the video data supplied to the DMA controller 70c.
- The SCSI controller 70d supplies a recording command and the video data to the hard disk 73 so that the video data received from the DMA controller 70c is recorded at the address specified by the CPU 70a.
- The video data supplied to the expansion circuit 72c is expanded as video data of the first channel and sent to the video data input/output unit 71. The video data input/output unit 71 superimposes the time code supplied from the CPU 70a on the vertical synchronization period of the supplied video data and sends it out as the composite video signal V2.
- When a playback command based on the RS-422 communication protocol is sent to the hard disk array 2, the CPU 70a outputs a playback command to the hard disk 73 via the SCSI controller 70d.
- the reproduced video data is stored in the buffer memory 72e in accordance with the writing timing of the DMA controller 70c.
- The video data read from the buffer memory 72e is expanded as video data of the second channel by the expansion circuit 72d and sent to the video data input/output unit 71. The video data input/output unit 71 superimposes the time code supplied from the CPU 70a on the vertical synchronization period of the supplied video data and sends it out as the composite video signal V3.
- While the DMA controller 70c outputs a read command to the buffer memory 72b, the video signal from the buffer memory 72b is recorded on the hard disk 73. While the DMA controller 70c outputs a write command to the buffer memory 72e, the video signal recorded on the hard disk 73 is played back and the playback video data is supplied to the buffer memory 72e. That is, the transfer of the video data to be recorded to the hard disk 73 is buffered by the buffer memory 72b, and the transfer of the video data reproduced from the hard disk 73 to the video data input/output unit 71 is buffered by the buffer memory 72e.
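- The buffering just described can be pictured with a small, illustrative-only sketch: the record path and the playback path each have their own FIFO (standing in for the buffer memories 72b and 72e), so transfers to and from the disk do not have to run in lockstep with the frame-accurate video input and output. The 15-frame capacity follows the buffer size stated earlier; everything else is an assumption.

```python
from collections import deque

# Illustrative-only sketch of the double buffering described above.
record_buffer: deque = deque(maxlen=15)     # stands in for buffer memory 72b
playback_buffer: deque = deque(maxlen=15)   # stands in for buffer memory 72e

def on_input_frame(frame: bytes) -> None:
    record_buffer.append(frame)             # video input side fills "72b"

def drain_one_frame_to_disk(write_to_disk) -> None:
    if record_buffer:                       # "read command" side: empty "72b"
        write_to_disk(record_buffer.popleft())

def on_frame_from_disk(frame: bytes) -> None:
    playback_buffer.append(frame)           # "write command" side: fill "72e"

def next_output_frame():
    """Video output side drains the playback buffer; None when nothing is buffered."""
    return playback_buffer.popleft() if playback_buffer else None
```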
- In step SP1, the operation is started when the operator designates execution of the application program.
- In step SP2, since the application program is stored on the local hard disk 15a, the CPU 10 uploads the application program to the operating RAM 10b provided in the CPU 10.
- In step SP3, when the upload to the RAM 10b in the CPU 10 ends, the CPU 10 executes the application program.
- step SP4 a memory area of RAM 10b is secured.
- the reason for securing this memory area is that a plurality of clip image data and edited data are generated by the editing operation, and the respective data are stored in the RAM 10b.
- The first management record data for clip data, event data and program data shown in FIGS. 9A, 9B and 9C are recorded in the RAM 10b.
- In step SP5, a work folder for storing the program data, event data and the like generated by the editing operation on the computer 1 is generated.
- The generated work folder is recorded on the local hard disk 15a.
- In step SP6, the CPU 10 transfers graphic data for displaying the GUI graphics on the monitor 14 to the VRAM 13b in real time, in synchronization with the internal clock of the computer.
- step SP7 the same graphic as the graphic data stored in the VRAM 13b is displayed on the monitor 14.
- step SP8 it is confirmed whether or not the input video signal V2 is to be displayed on the recording video screen 21a. If the video display is not specified, it is determined that the editing operation is not to be performed, and the process proceeds to step SP16 and ends. In a normal case, since the input video signal V2 needs to be displayed on the recording video screen 21a in order to perform the editing operation, the process proceeds to step SP9.
- In step SP9, the computer 1 outputs, to the hard disk array 2, a control command for outputting the video signal V2 to the computer 1.
- Upon receiving the control command from the computer 1, the hard disk array 2 generates the video signal V2 from the input video signal V1 as a video signal of the first channel and sends the video signal V2 to the computer 1.
- In step SP10, the data converter 11b extracts the time code from the composite video signal V2 supplied to the computer 1 and converts the input composite video signal into digital component video data.
- the converted video data is temporarily stored in the frame memory 11e on a frame basis.
- The processor controller 11a sends the time code data extracted by the data converter 11b to the CPU 10.
- step SP 11 the video data stored in the frame memory 11 e is transferred to VRAM 13 b.
- The transferred video data is video data of 380 pixels × 240 pixels, because the sampling rate is reduced when the data is read from the frame memory 11e.
- The data transferred to the VRAM 13b is not only the video data from the frame memory 11e. In the area where video data is displayed, video data is transferred from the frame memory 11e via the image data bus 5a, and in the area where graphics for the GUI are displayed, image data is transferred from the CPU 10.
- the data stored in the VRAM 13b is updated in real time, so that video data can be displayed on the monitor 14 in real time.
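- A rough sketch of this split of the VRAM 13b: one rectangle of the screen is refreshed with video data from the frame memory while the rest carries the GUI graphics. The dimensions, the nested-list "VRAM" and the drawing order are assumptions made only for illustration.

```python
# Rough illustration of a frame buffer split between a video rectangle and GUI
# graphics.  Sizes and representation are assumptions, not the device's layout.
SCREEN_W, SCREEN_H = 640, 480
VIDEO_RECT = (0, 0, 380, 240)                 # area refreshed with video data

vram = [[0] * SCREEN_W for _ in range(SCREEN_H)]   # stand-in for the VRAM 13b

def blit(dst, src, x0, y0):
    """Copy a block of pixel rows into dst at (x0, y0)."""
    for dy, row in enumerate(src):
        dst[y0 + dy][x0:x0 + len(row)] = row

def refresh(video_frame, gui_frame):
    """GUI graphics cover the screen; the live video rectangle is drawn on top."""
    blit(vram, gui_frame, 0, 0)
    x, y, w, h = VIDEO_RECT
    blit(vram, [row[:w] for row in video_frame[:h]], x, y)

# One refresh with a 380 x 240 video frame and a full-screen GUI frame.
refresh([[1] * 380 for _ in range(240)], [[2] * SCREEN_W for _ in range(SCREEN_H)])
print(vram[0][0], vram[0][379], vram[0][380])      # -> 1 1 2
```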
- step SP12 the graphic data and the video data stored in the VRAM 13b are displayed on the monitor 14 in real time.
- In step SP13, it is determined whether or not recording of the video data displayed on the recording video screen 21a by the hard disk array 2 is to be started. To start recording, the "START" button 31a is clicked.
- In step SP14, when the start of recording is specified, the CPU 10 outputs a recording start command to the external interface 18. The external interface 18 converts it into the RS-422 standard communication format and sends it to the hard disk array 2.
- step SP15 since the recording has been started by the hard disk array 2, it is determined that all the initial settings have been completed, and the process is terminated.
- step SP20 is started.
- step SP20 it is determined whether or not a new marking has been made.
- The judgment as to whether or not a marking has been made is based on whether or not the mouse was clicked while the cursor was positioned in the area of the mark-in button 24c or the mark-out button 24f. This determination is based on an interrupt command supplied to the CPU 10.
- If the mark-in button 24c was clicked, it is determined that an in point has been designated, and the flow advances to step SP22.
- If the mark-out button 24f was clicked, it is determined that an out point has been designated, and the flow advances to step SP30.
- step SP22 in-point clip image data is formed.
- The clip image data at this in point is formed when the video data stored in the frame memory 11e is read out to the VRAM 13b, and the sampling rate is reduced so that the data amount becomes 1/16 of the video data recorded in the frame memory 11e.
- This clip image data is composed of 95 pixels × 60 pixels.
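- A sketch of how such a 95 × 60 clip image can be obtained by sub-sampling: keeping every fourth pixel in each direction reduces a 380 × 240 frame (the size transferred to the VRAM, as noted earlier) to 95 × 60, i.e. 1/16 of the data amount. Representing the frame as a list of pixel rows and the decimate-by-4 step are assumptions made only for illustration.

```python
# Illustrative sub-sampling: every 4th pixel in each direction turns a
# 380 x 240 frame into a 95 x 60 clip image (1/16 of the data amount).
def make_clip_image(frame):
    """frame: 240 rows of 380 pixels -> 60 rows of 95 pixels."""
    return [row[::4] for row in frame[::4]]

frame = [[(x, y) for x in range(380)] for y in range(240)]
clip = make_clip_image(frame)
print(len(clip), len(clip[0]))   # -> 60 95
```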
- In step SP23, the clip image data stored at the position of the in-clip display area 24a of the VRAM 13b is displayed in the in-clip display area 24a in accordance with the reading of data from the VRAM 13b.
- In step SP24, it is determined whether or not the clip image data of the in point marked in step SP21 is the first marking. If it is the first marking, the flow returns to step SP21; if it is the second or later marking, the flow proceeds to step SP25.
- In step SP25, it is determined whether or not the previously marked clip image data is clip image data at an in point. If the previously marked clip image data is in-point clip image data, the process proceeds to step SP26; if the previously marked clip image data is out-point clip image data, the process proceeds to step SP27.
- In step SP26, the clip image data of the previously marked in point is moved from the in-clip display area 24a to the clip display area 28.
- That is, this is the case where the in point has been marked twice in succession; the description of FIGS. 7, 8 and 9 may be referred to.
- In step SP27, it is determined whether an event has been generated by the previously marked out point. If an event has been generated by the previous marking, the process proceeds to step SP29; if an event has not been generated by the previous marking, the process proceeds to step SP28.
- In step SP28, the clip image data of the previously marked out point displayed in the out-clip display area 24d is moved to the clip display area 28. This is because the previously marked out point is determined to be clip image data that does not generate an event.
- In step SP29, the clip image data displayed in the out-clip display area 24d is cleared. Since the clip image data displayed in the out-clip display area 24d has already been registered as the out point of an event, there is no need to move it to the clip display area 28.
- In step SP30, the clip image data at the out point is formed.
- The clip image data at this out point is formed when the video data stored in the frame memory 11e is read out to the VRAM 13b, and the sampling rate is reduced so that the data amount becomes 1/16 of the video data recorded in the frame memory 11e. Note that this clip image data is composed of 95 pixels × 60 pixels.
- In step SP31, the clip image data stored at the position of the out-clip display area 24d of the VRAM 13b is displayed in the out-clip display area 24d in accordance with the reading of data from the VRAM 13b.
- In step SP32, it is determined whether or not the clip image data of the out point marked in step SP21 is the first marking. If it is the first marking, the flow returns to step SP21; if it is the second or later marking, the flow proceeds to step SP33.
- In step SP33, it is determined whether or not the previously marked clip image data is clip image data at an in point. If the previously marked clip image data is in-point clip image data, the process proceeds to step SP34; if the previously marked clip image data is out-point clip image data, the process proceeds to step SP36.
- In step SP34, event registration is performed. In this way, when an out point is marked after an in point, it is automatically registered as an event.
- The description of the second management record data created for the registration of the event can be better understood by referring to the description of FIG. 7 and FIG. 8.
- step SP35 the generated clip image data at the in-point of the event is copied to the event display area 29.
- Step SP36, step SP37 and step SP38 are the same as step SP27, step SP28 and step SP29, so their description is omitted.
- step SP39 it is determined whether or not to end the marking. Until the marking is completed, the flow shown in FIG. 12 is repeated.
- FIG. 13A and FIG. 13B show the flow used when creating an event from a video signal reproduced from the hard disk array 2. This flow is started from a state in which clip image data has already been stored.
- In step SP41, it is determined whether or not clip image data has been designated.
- Double-clicking the mouse (clicking twice in succession) while the cursor is within the display position of clip image data means that the clip image data is designated.
- In step SP42, when clip image data is designated, the designated clip image data is displayed in the in-clip display area 27a if it is in-point clip image data, or in the out-clip display area 27d if it is out-point clip image data.
- The CPU 10 refers to the time code included in the designated clip image data and outputs a control command to the external interface 18 so that the video data of that time code is still-played back.
- the external interface 18 converts the still playback command into the RS-422 protocol and sends it to the disk array 2.
- The hard disk array 2 refers to the received time code and the related data on the storage addresses of the video data, and sends out the video data reproduced from the hard disk 73 as still video data of the second channel.
- step SP44 the video data transmitted from the disk array 2 is received, and predetermined image processing is performed in the second video processor 12.
- The still playback video data stored in the frame memory is transferred to the VRAM 13b so as to become 380 pixels × 240 pixels.
- step SP46 the playback video data stored in the VRAM 13b is displayed on the playback video screen 23a.
- The video data sent from the hard disk array 2 is not a real-time video signal but still video data; therefore, only a still image is displayed on the playback video screen 23a.
- step SP48 a playback start command is output to the hard disk array 2 via the external interface 18.
- the hard disk array 2 receives the playback start command and sends normal playback video data to the computer 1.
- step SP51 it is determined whether or not marking has been performed.
- The determination as to whether or not a marking has been made is based on whether or not the mouse was clicked while the cursor was positioned in the area of the mark-in button 27c or the mark-out button 27f. This determination is based on an interrupt command supplied to the CPU 10.
- If the mark-in button 27c was clicked, it is determined that an in point has been designated, and the flow advances to step SP52.
- If the mark-out button 27f was clicked, it is determined that an out point has been designated, and the flow advances to step SP55.
- step SP52 clip image data at the in-point is formed.
- The clip image data at this in point is formed when the video data stored in the frame memory 12e is read out to the VRAM 13b, and the sampling rate is reduced so that the data amount becomes 1/16 of the video data recorded in the frame memory 12e.
- This clip image data is composed of 95 pixels × 60 pixels.
- In step SP53, the clip image data stored at the position of the in-clip display area 27a of the VRAM 13b is displayed in the in-clip display area 27a in accordance with the reading of data from the VRAM 13b.
- In step SP54, the clip image data of the previously marked in point displayed in the in-clip display area 27a is moved to the clip display area 28.
- In step SP55, the clip image data of the out point is formed.
- The clip image data at this out point is formed when the video data stored in the frame memory 12e is read out to the VRAM 13b, and the sampling rate is reduced so that the data amount becomes 1/16 of the video data recorded in the frame memory 12e. Note that this clip image data is composed of 95 pixels × 60 pixels.
- In step SP56, the clip image data stored at the position of the out-clip display area 27d of the VRAM 13b is displayed in the out-clip display area 27d in accordance with the reading of data from the VRAM 13b.
- In step SP57, it is determined whether or not the previously marked clip image data is clip image data at an in point. If the previously marked clip image data is in-point clip image data, the process proceeds to step SP58; if it is out-point clip image data, the process proceeds to step SP57.
- In step SP58, it is determined whether or not to newly register an event. Clicking the "NEW EVENT" button 33 means that a new event is to be registered.
- step SP60 the event is registered.
- When the out point is marked after the in point and the "NEW EVENT" button 33 is clicked, it is registered as an event.
- The description of the second management record data created for the registration of the event can be better understood by referring to the description of FIG. 7 and FIG. 8.
- In step SP61, the clip image data of the in point of the generated event is copied to the event display area 29.
- step SP62 it is determined whether or not the stop of the playback of the video data displayed on the playback video screen 23a is specified. When the stop is specified, the process proceeds to Step SP63.
- step SP63 a stop command is output to the hard disk array 2 and the processing ends.
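- In outline, the playback-side flow above amounts to: designate a stored clip, ask the hard disk array for a still of that time code, start normal playback, mark an in point and an out point, and register the pair as an event. The helper names in the following sketch (send_still_playback, send_playback_start, register_event) are placeholders, not an actual command set.

```python
# Hedged outline of creating an event from reproduced video.  The callbacks
# stand in for the RS-422 control commands and the record bookkeeping.
def create_event_from_playback(clip_time_code, in_time_code, out_time_code,
                               send_still_playback, send_playback_start,
                               register_event):
    send_still_playback(clip_time_code)          # designated clip shown as a still
    send_playback_start(clip_time_code)          # then normal playback for marking
    register_event(in_time_code, out_time_code)  # "NEW EVENT" registers the pair

create_event_from_playback(
    "00:02:10:00", "00:02:12:05", "00:02:15:20",
    send_still_playback=lambda tc: print("still playback at", tc),
    send_playback_start=lambda tc: print("playback from", tc),
    register_event=lambda i, o: print("event registered:", i, "->", o),
)
```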
- the present invention relates to an editing system.
- it is suitable for materials that require speed, such as sports broadcasts and news reports.
- It can also be used for a computer graphics device.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Television Signal Processing For Recording (AREA)
- Management Or Editing Of Information On Record Carriers (AREA)
Description
Claims
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP96908375A EP0764951B1 (en) | 1995-04-08 | 1996-04-08 | Editing system |
KR1019960706675A KR970703595A (ko) | 1995-04-08 | 1996-04-08 | 편집 시스템(Editing system) |
JP53087096A JP3837746B2 (ja) | 1995-04-08 | 1996-04-08 | 編集システム |
US08/750,330 US5930446A (en) | 1995-04-08 | 1996-04-08 | Edition system |
DE69623712T DE69623712T2 (de) | 1995-04-08 | 1996-04-08 | Schnittsystem |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP10821795 | 1995-04-08 | ||
JP7/108217 | 1995-04-08 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO1996032722A1 true WO1996032722A1 (fr) | 1996-10-17 |
Family
ID=14479008
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP1996/000963 WO1996032722A1 (fr) | 1995-04-08 | 1996-04-08 | Systeme de mise en forme |
Country Status (7)
Country | Link |
---|---|
US (1) | US5930446A (ja) |
EP (2) | EP0764951B1 (ja) |
JP (1) | JP3837746B2 (ja) |
KR (1) | KR970703595A (ja) |
CN (2) | CN1139934C (ja) |
DE (1) | DE69623712T2 (ja) |
WO (1) | WO1996032722A1 (ja) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0845781A2 (en) * | 1996-11-28 | 1998-06-03 | Sony Corporation | Video-audio data processors and methods of processing and saving input video-audio data |
EP0871177A2 (en) * | 1997-04-08 | 1998-10-14 | MGI Software Corp. | A non-timeline, non-linear digital multimedia composition method and system |
KR100603161B1 (ko) * | 1997-04-12 | 2006-12-13 | 소니 가부시끼 가이샤 | 편집시스템및편집방법 |
KR100603173B1 (ko) * | 1997-04-12 | 2006-12-15 | 소니 가부시끼 가이샤 | 편집장치및편집방법 |
JP2008543124A (ja) * | 2005-03-07 | 2008-11-27 | エムシーアイ・エルエルシー | ネットワーク上でデジタルメディアの分散編集及び記憶を提供するための方法及びシステム |
EP2079234A2 (en) | 2008-01-09 | 2009-07-15 | Sony Corporation | Video searching apparatus, editing apparatus, video searching method, and program |
US8782523B2 (en) | 2007-11-22 | 2014-07-15 | Sony Corporation | Unit video representing device, editing console, editing system, and animation editing method |
Families Citing this family (181)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6167350A (en) * | 1996-04-12 | 2000-12-26 | Sony Corporation | Method and apparatus for selecting information signal range and editing apparatus for information signal |
CA2260932C (en) * | 1996-07-29 | 2003-11-18 | Avid Technology, Inc. | Motion video processing circuit for capture, playback and manipulation of digital motion video information on a computer |
US6628303B1 (en) | 1996-07-29 | 2003-09-30 | Avid Technology, Inc. | Graphical user interface for a motion video planning and editing system for a computer |
US7055100B2 (en) * | 1996-09-20 | 2006-05-30 | Sony Corporation | Editing system, editing method, clip management apparatus, and clip management method |
US6661430B1 (en) * | 1996-11-15 | 2003-12-09 | Picostar Llc | Method and apparatus for copying an audiovisual segment |
US6201925B1 (en) * | 1996-11-15 | 2001-03-13 | Futuretel, Inc. | Method and apparatus for editing video files |
US6324335B1 (en) * | 1996-11-29 | 2001-11-27 | Sony Corporation | Editing system and editing method |
GB2325776B (en) * | 1996-12-09 | 2000-10-11 | Sony Corp | Editing device,editing system and editing method |
US20010016113A1 (en) * | 1997-02-10 | 2001-08-23 | Nikon Corporation | Information processing apparatus and method |
JP4462642B2 (ja) * | 1997-03-05 | 2010-05-12 | ソニー株式会社 | 編集装置及び編集方法 |
JP3419645B2 (ja) * | 1997-03-14 | 2003-06-23 | 株式会社日立国際電気 | 動画像編集方法 |
EP0869445A3 (en) * | 1997-03-31 | 2002-11-06 | Hitachi, Ltd. | Video signal processing system and method using time code |
US6052508A (en) * | 1997-04-04 | 2000-04-18 | Avid Technology, Inc. | User interface for managing track assignment for portable digital moving picture recording and editing system |
JP4536164B2 (ja) * | 1997-04-12 | 2010-09-01 | ソニー株式会社 | 編集装置及び編集方法 |
WO1998047145A1 (fr) * | 1997-04-12 | 1998-10-22 | Sony Corporation | Systeme d'edition et procede d'edition |
JP3988205B2 (ja) * | 1997-05-27 | 2007-10-10 | ソニー株式会社 | 映像信号記録再生装置、映像信号記録再生方法、映像信号再生装置及び映像信号再生方法 |
GB9716033D0 (en) | 1997-07-30 | 1997-10-01 | Discreet Logic Inc | Processing edit decision list data |
US6463444B1 (en) | 1997-08-14 | 2002-10-08 | Virage, Inc. | Video cataloger system with extensibility |
US6567980B1 (en) | 1997-08-14 | 2003-05-20 | Virage, Inc. | Video cataloger system with hyperlinked output |
US7295752B1 (en) | 1997-08-14 | 2007-11-13 | Virage, Inc. | Video cataloger system with audio track extraction |
US6360234B2 (en) * | 1997-08-14 | 2002-03-19 | Virage, Inc. | Video cataloger system with synchronized encoders |
JP4303798B2 (ja) * | 1997-09-11 | 2009-07-29 | ソニー株式会社 | 撮像装置、編集装置及び編集システム |
JP4078691B2 (ja) * | 1997-09-19 | 2008-04-23 | ソニー株式会社 | 記録再生制御システム、記録再生制御方法および記録再生制御装置 |
US8963681B2 (en) | 1997-10-27 | 2015-02-24 | Direct Source International, Llc | Operating control system for electronic equipment |
US7394347B2 (en) * | 1997-10-27 | 2008-07-01 | World Wide Innovations, Llc | Locking device for electronic equipment |
JPH11297040A (ja) * | 1998-04-07 | 1999-10-29 | Sony Corp | 再生信号処理装置 |
AU4439899A (en) | 1998-06-16 | 2000-01-05 | United Video Properties, Inc. | Interactive television program guide with simultaneous watch and record capabilities |
US6163510A (en) | 1998-06-30 | 2000-12-19 | International Business Machines Corporation | Multimedia search and indexing system and method of operation using audio cues with signal thresholds |
US6833865B1 (en) * | 1998-09-01 | 2004-12-21 | Virage, Inc. | Embedded metadata engines in digital capture devices |
US6452612B1 (en) * | 1998-12-18 | 2002-09-17 | Parkervision, Inc. | Real time video production system and method |
US20030214605A1 (en) * | 1998-12-18 | 2003-11-20 | Snyder Robert J. | Autokeying method, system, and computer program product |
US20030001880A1 (en) * | 2001-04-18 | 2003-01-02 | Parkervision, Inc. | Method, system, and computer program product for producing and distributing enhanced media |
US6952221B1 (en) | 1998-12-18 | 2005-10-04 | Thomson Licensing S.A. | System and method for real time video production and distribution |
US11109114B2 (en) | 2001-04-18 | 2021-08-31 | Grass Valley Canada | Advertisement management method, system, and computer program product |
US9123380B2 (en) | 1998-12-18 | 2015-09-01 | Gvbb Holdings S.A.R.L. | Systems, methods, and computer program products for automated real-time execution of live inserts of repurposed stored content distribution, and multiple aspect ratio automated simulcast production |
US6909874B2 (en) * | 2000-04-12 | 2005-06-21 | Thomson Licensing Sa. | Interactive tutorial method, system, and computer program product for real time media production |
US6760916B2 (en) | 2000-01-14 | 2004-07-06 | Parkervision, Inc. | Method, system and computer program product for producing and distributing enhanced media downstreams |
US7024677B1 (en) | 1998-12-18 | 2006-04-04 | Thomson Licensing | System and method for real time video production and multicasting |
US7549128B2 (en) * | 2000-08-08 | 2009-06-16 | Thomson Licensing | Building macro elements for production automation control |
US8560951B1 (en) | 1998-12-18 | 2013-10-15 | Thomson Licensing | System and method for real time video production and distribution |
US20040027368A1 (en) * | 2002-05-09 | 2004-02-12 | Parkervision, Inc. | Time sheet for real time video production system and method |
US6748421B1 (en) * | 1998-12-23 | 2004-06-08 | Canon Kabushiki Kaisha | Method and system for conveying video messages |
EP1080469A1 (en) * | 1999-02-25 | 2001-03-07 | Applied Magic, Inc. | Non-linear multimedia editing system integrated into a television, set-top box or the like |
US6757906B1 (en) | 1999-03-30 | 2004-06-29 | Tivo, Inc. | Television viewer interface system |
US6847778B1 (en) | 1999-03-30 | 2005-01-25 | Tivo, Inc. | Multimedia visual progress indication system |
EP1183689A1 (en) | 1999-03-30 | 2002-03-06 | Tivo, Inc. | System for automatic playback position correction after fast forward or reverse |
US7665111B1 (en) | 1999-10-20 | 2010-02-16 | Tivo Inc. | Data storage management and scheduling system |
US20020118954A1 (en) | 2001-12-07 | 2002-08-29 | Barton James M. | Data storage management and scheduling system |
EP1968067A1 (en) | 1999-03-30 | 2008-09-10 | Tivo, Inc. | Multimedia program bookmarking system |
US8689265B2 (en) | 1999-03-30 | 2014-04-01 | Tivo Inc. | Multimedia mobile personalization system |
DE60045377D1 (de) * | 1999-03-30 | 2011-01-27 | Tivo Inc | Fernsehbetrachterschnittstellensystem |
US7543325B2 (en) | 1999-03-30 | 2009-06-02 | Tivo Inc. | System for remotely controlling client recording and storage behavior |
US7242847B1 (en) * | 1999-06-18 | 2007-07-10 | Intel Corporation | Systems and methods for editing video streams using a grid-based representation |
US20030182567A1 (en) | 1999-10-20 | 2003-09-25 | Tivo Inc. | Client-side multimedia content targeting system |
JP2001195825A (ja) | 1999-10-29 | 2001-07-19 | Sony Corp | 記録再生装置および方法 |
JP2001143447A (ja) * | 1999-11-16 | 2001-05-25 | Sony Corp | データ編集装置および方法、並びにデータ記録再生装置 |
JP2001197366A (ja) * | 2000-01-12 | 2001-07-19 | Hitachi Ltd | 画像合成方法及び画像合成プログラムを記録した記録媒体 |
JP4168307B2 (ja) | 2000-03-17 | 2008-10-22 | ソニー株式会社 | 情報再生装置および画像表示制御方法、並びに記録媒体 |
JP4348821B2 (ja) * | 2000-03-27 | 2009-10-21 | ソニー株式会社 | 編集装置、編集方法 |
US7257641B1 (en) * | 2000-03-30 | 2007-08-14 | Microsoft Corporation | Multipoint processing unit |
EP1273008A2 (en) * | 2000-03-31 | 2003-01-08 | Parkervision, Inc. | Method, system and computer program product for full news integration and automation in a real time video production environment |
GB2361098A (en) * | 2000-04-05 | 2001-10-10 | Sony Uk Ltd | Editing system and method using metadata |
EP1947649A3 (en) | 2000-04-05 | 2014-07-09 | Sony United Kingdom Limited | Audio/video reproducing apparatus and method |
US7904922B1 (en) * | 2000-04-07 | 2011-03-08 | Visible World, Inc. | Template creation and editing for a message campaign |
US7870579B2 (en) | 2000-04-07 | 2011-01-11 | Visible Worl, Inc. | Systems and methods for managing and distributing media content |
US7962948B1 (en) * | 2000-04-07 | 2011-06-14 | Virage, Inc. | Video-enabled community building |
US7895620B2 (en) | 2000-04-07 | 2011-02-22 | Visible World, Inc. | Systems and methods for managing and distributing media content |
US7260564B1 (en) * | 2000-04-07 | 2007-08-21 | Virage, Inc. | Network video guide and spidering |
US7870578B2 (en) | 2000-04-07 | 2011-01-11 | Visible World, Inc. | Systems and methods for managing and distributing media content |
US7861261B2 (en) | 2000-04-07 | 2010-12-28 | Visible World, Inc. | Systems and methods for managing and distributing media content |
US7222163B1 (en) * | 2000-04-07 | 2007-05-22 | Virage, Inc. | System and method for hosting of video content over a network |
US8171509B1 (en) | 2000-04-07 | 2012-05-01 | Virage, Inc. | System and method for applying a database to video multimedia |
US7900227B2 (en) | 2000-04-07 | 2011-03-01 | Visible World, Inc. | Systems and methods for managing and distributing media content |
US7890971B2 (en) | 2000-04-07 | 2011-02-15 | Visible World, Inc. | Systems and methods for managing and distributing media content |
JP4332988B2 (ja) * | 2000-04-27 | 2009-09-16 | ソニー株式会社 | 信号処理装置及び方法 |
JP2001319419A (ja) | 2000-05-02 | 2001-11-16 | Matsushita Electric Ind Co Ltd | データ記録システム、記録先決定装置、媒体、および情報集合体 |
US7475404B2 (en) | 2000-05-18 | 2009-01-06 | Maquis Techtrix Llc | System and method for implementing click-through for browser executed software including ad proxy and proxy cookie caching |
US8086697B2 (en) | 2005-06-28 | 2011-12-27 | Claria Innovations, Llc | Techniques for displaying impressions in documents delivered over a computer network |
US6785901B1 (en) * | 2000-05-19 | 2004-08-31 | Webtv Networks, Inc. | Altering locks on programming content |
US9038108B2 (en) | 2000-06-28 | 2015-05-19 | Verizon Patent And Licensing Inc. | Method and system for providing end user community functionality for publication and delivery of digital media content |
EP1173013A3 (en) * | 2000-07-14 | 2004-04-21 | Sony Corporation | Remote control device for recording/reproducing apparatus and video signal recording/reproducing apparatus |
WO2002007435A1 (en) * | 2000-07-17 | 2002-01-24 | Sensory Science Corporation | Multimedia appliance |
JP4725758B2 (ja) * | 2000-08-25 | 2011-07-13 | ソニー株式会社 | 情報処理装置および情報処理方法、並びに記録媒体 |
WO2002047086A1 (en) * | 2000-12-05 | 2002-06-13 | Matsushita Electric Industrial Co., Ltd. | Recording/reproducing apparatus and record medium |
US7447754B2 (en) * | 2000-12-06 | 2008-11-04 | Microsoft Corporation | Methods and systems for processing multi-media editing projects |
US7103677B2 (en) * | 2000-12-06 | 2006-09-05 | Microsoft Corporation | Methods and systems for efficiently processing compressed and uncompressed media content |
US7114161B2 (en) * | 2000-12-06 | 2006-09-26 | Microsoft Corporation | System and related methods for reducing memory requirements of a media processing system |
US6774919B2 (en) | 2000-12-06 | 2004-08-10 | Microsoft Corporation | Interface and related methods for reducing source accesses in a development system |
US6912717B2 (en) * | 2000-12-06 | 2005-06-28 | Microsoft Corporation | Methods and systems for implementing dynamic properties on objects that support only static properties |
US7287226B2 (en) * | 2000-12-06 | 2007-10-23 | Microsoft Corporation | Methods and systems for effecting video transitions represented by bitmaps |
US6959438B2 (en) * | 2000-12-06 | 2005-10-25 | Microsoft Corporation | Interface and related methods for dynamically generating a filter graph in a development system |
US6961943B2 (en) * | 2000-12-06 | 2005-11-01 | Microsoft Corporation | Multimedia processing system parsing multimedia content from a single source to minimize instances of source files |
US6882891B2 (en) * | 2000-12-06 | 2005-04-19 | Microsoft Corporation | Methods and systems for mixing digital audio signals |
US6768499B2 (en) * | 2000-12-06 | 2004-07-27 | Microsoft Corporation | Methods and systems for processing media content |
US7114162B2 (en) | 2000-12-06 | 2006-09-26 | Microsoft Corporation | System and methods for generating and managing filter strings in a filter graph |
US6834390B2 (en) * | 2000-12-06 | 2004-12-21 | Microsoft Corporation | System and related interfaces supporting the processing of media content |
US6954581B2 (en) * | 2000-12-06 | 2005-10-11 | Microsoft Corporation | Methods and systems for managing multiple inputs and methods and systems for processing media content |
US6983466B2 (en) | 2000-12-06 | 2006-01-03 | Microsoft Corporation | Multimedia project processing systems and multimedia project processing matrix systems |
US20020106191A1 (en) * | 2001-01-05 | 2002-08-08 | Vm Labs, Inc. | Systems and methods for creating a video montage from titles on a digital video disk |
US7356250B2 (en) * | 2001-01-05 | 2008-04-08 | Genesis Microchip Inc. | Systems and methods for creating a single video frame with one or more interest points |
JP2002222101A (ja) * | 2001-01-29 | 2002-08-09 | Pioneer Electronic Corp | 情報記録再生装置 |
US6873344B2 (en) * | 2001-02-22 | 2005-03-29 | Sony Corporation | Media production system using flowgraph representation of operations |
US20070089151A1 (en) * | 2001-06-27 | 2007-04-19 | Mci, Llc. | Method and system for delivery of digital media experience via common instant communication clients |
US8972862B2 (en) | 2001-06-27 | 2015-03-03 | Verizon Patent And Licensing Inc. | Method and system for providing remote digital media ingest with centralized editorial control |
JP3600555B2 (ja) * | 2001-06-27 | 2004-12-15 | 株式会社東芝 | 動画像編集装置及び動画像編集方法 |
US7970260B2 (en) | 2001-06-27 | 2011-06-28 | Verizon Business Global Llc | Digital media asset management system and method for supporting multiple users |
US20060236221A1 (en) * | 2001-06-27 | 2006-10-19 | Mci, Llc. | Method and system for providing digital media management using templates and profiles |
JP4593842B2 (ja) * | 2001-08-03 | 2010-12-08 | キヤノン株式会社 | 動画像検索装置及びその制御方法 |
US7398004B1 (en) | 2001-10-16 | 2008-07-08 | Sonic Solutions | Software methods for authoring multimedia content to be written to optical media |
DE60225060T2 (de) * | 2001-12-25 | 2008-05-21 | Matsushita Electric Industrial Co., Ltd., Kadoma | Vorrichtung und verfahren zur wiedergabe von inhalten |
WO2003063060A2 (en) * | 2002-01-24 | 2003-07-31 | Broadcom Corporation | Asymmetric digital subscriber line modem apparatus and methods therefor |
EP1502435A4 (en) * | 2002-05-06 | 2009-08-12 | Mattel Inc | PERIPHERAL DEVICE AND SYSTEM FOR EDITING AUDIOVISUAL CONTENT |
AU2003230350A1 (en) * | 2002-05-09 | 2003-11-11 | Parkervision, Inc. | Video production system for automating the execution of a video show |
US7603341B2 (en) | 2002-11-05 | 2009-10-13 | Claria Corporation | Updating the content of a presentation vehicle in a computer network |
JP4379681B2 (ja) * | 2003-04-04 | 2009-12-09 | ソニー株式会社 | 携帯型情報処理装置、情報送信制御方法及び情報通信システム |
US20060098941A1 (en) * | 2003-04-04 | 2006-05-11 | Sony Corporation 7-35 Kitashinagawa | Video editor and editing method, recording medium, and program |
JP2005004866A (ja) * | 2003-06-11 | 2005-01-06 | Sony Corp | 情報処理装置および方法、記録媒体、並びにプログラム |
JP2005012256A (ja) * | 2003-06-16 | 2005-01-13 | Canon Inc | データ処理装置 |
JP4096310B2 (ja) * | 2003-06-18 | 2008-06-04 | ソニー株式会社 | 情報作成装置および方法、再生装置および方法、並びにプログラム |
JP4313639B2 (ja) * | 2003-09-29 | 2009-08-12 | パイオニア株式会社 | 信号処理装置 |
US7725828B1 (en) * | 2003-10-15 | 2010-05-25 | Apple Inc. | Application of speed effects to a video presentation |
KR100561417B1 (ko) * | 2004-02-09 | 2006-03-16 | 삼성전자주식회사 | Av 데이터의 재생상태를 전환할 수 있는 인터랙티브그래픽 스트림을 기록한 정보저장매체, 그 재생방법 및 장치 |
CN1661671B (zh) * | 2004-02-27 | 2010-05-05 | 雅马哈株式会社 | 用于数字混合器的情景数据编辑装置 |
US7308476B2 (en) * | 2004-05-11 | 2007-12-11 | International Business Machines Corporation | Method and system for participant automatic re-invite and updating during conferencing |
KR101058007B1 (ko) * | 2004-05-18 | 2011-08-19 | 삼성전자주식회사 | 기록 매체에 저장된 데이터를 삭제하는 방법 및 삭제된데이터를 복원하는 방법 |
US8296366B2 (en) * | 2004-05-27 | 2012-10-23 | Microsoft Corporation | Efficient routing of real-time multimedia information |
US8078602B2 (en) | 2004-12-17 | 2011-12-13 | Claria Innovations, Llc | Search engine for a computer network |
US8255413B2 (en) | 2004-08-19 | 2012-08-28 | Carhamm Ltd., Llc | Method and apparatus for responding to request for information-personalization |
JP4085392B2 (ja) * | 2004-09-02 | 2008-05-14 | ソニー株式会社 | 記録再生装置及びその方法並びにプログラム |
US7613383B2 (en) * | 2004-12-02 | 2009-11-03 | Hitachi, Ltd. | Editing method and recording and reproducing device |
US7693863B2 (en) | 2004-12-20 | 2010-04-06 | Claria Corporation | Method and device for publishing cross-network user behavioral data |
US8073866B2 (en) | 2005-03-17 | 2011-12-06 | Claria Innovations, Llc | Method for providing content to an internet user based on the user's demonstrated content preferences |
US7434155B2 (en) * | 2005-04-04 | 2008-10-07 | Leitch Technology, Inc. | Icon bar display for video editing system |
JP4396567B2 (ja) * | 2005-04-15 | 2010-01-13 | ソニー株式会社 | 素材記録装置および素材記録方法 |
JP4442499B2 (ja) * | 2005-04-15 | 2010-03-31 | ソニー株式会社 | 素材記録装置および素材記録方法 |
JP2006303595A (ja) * | 2005-04-15 | 2006-11-02 | Sony Corp | 素材記録装置および素材記録方法 |
JP4442500B2 (ja) * | 2005-04-15 | 2010-03-31 | ソニー株式会社 | 素材記録装置および素材記録方法 |
US9401080B2 (en) | 2005-09-07 | 2016-07-26 | Verizon Patent And Licensing Inc. | Method and apparatus for synchronizing video frames |
US8631226B2 (en) | 2005-09-07 | 2014-01-14 | Verizon Patent And Licensing Inc. | Method and system for video monitoring |
US9076311B2 (en) | 2005-09-07 | 2015-07-07 | Verizon Patent And Licensing Inc. | Method and apparatus for providing remote workflow management |
US20070107012A1 (en) * | 2005-09-07 | 2007-05-10 | Verizon Business Network Services Inc. | Method and apparatus for providing on-demand resource allocation |
CN101305425B (zh) * | 2005-11-07 | 2012-06-27 | 皇家飞利浦电子股份有限公司 | 光盘节目编辑方法及装置 |
US8024376B2 (en) * | 2006-03-10 | 2011-09-20 | Pioneer Corporation | Information processing apparatus, information processing method, and information processing program |
EP2014090A4 (en) | 2006-04-24 | 2011-11-30 | Visible World Inc | SYSTEMS AND METHODS FOR GENERATING MULTIMEDIA CONTENT USING MICROTENDANCES |
US8463106B2 (en) * | 2006-04-28 | 2013-06-11 | Panasonic Corporation | Contents reproducing device |
EP2024974A1 (en) * | 2006-05-29 | 2009-02-18 | THOMSON Licensing | Moving image editing system and moving image editing method |
JP2009044436A (ja) * | 2007-08-08 | 2009-02-26 | Toshiba Corp | 映像処理装置及び映像処理システム |
JP4697289B2 (ja) * | 2008-11-05 | 2011-06-08 | ソニー株式会社 | 撮像装置、撮像装置の表示制御方法 |
US8769421B2 (en) * | 2009-04-30 | 2014-07-01 | Apple Inc. | Graphical user interface for a media-editing application with a segmented timeline |
US8549404B2 (en) | 2009-04-30 | 2013-10-01 | Apple Inc. | Auditioning tools for a media editing application |
US8881013B2 (en) * | 2009-04-30 | 2014-11-04 | Apple Inc. | Tool for tracking versions of media sections in a composite presentation |
US8555169B2 (en) * | 2009-04-30 | 2013-10-08 | Apple Inc. | Media clip auditioning used to evaluate uncommitted media content |
US8701007B2 (en) * | 2009-04-30 | 2014-04-15 | Apple Inc. | Edit visualizer for modifying and evaluating uncommitted media content |
US9564173B2 (en) | 2009-04-30 | 2017-02-07 | Apple Inc. | Media editing application for auditioning different types of media clips |
US8522144B2 (en) * | 2009-04-30 | 2013-08-27 | Apple Inc. | Media editing application with candidate clip management |
US20110113361A1 (en) * | 2009-11-06 | 2011-05-12 | Apple Inc. | Adjustment presets for digital images |
CN101909161B (zh) * | 2009-12-17 | 2013-12-25 | 新奥特(北京)视频技术有限公司 | Video editing method and apparatus |
US10324605B2 (en) | 2011-02-16 | 2019-06-18 | Apple Inc. | Media-editing application with novel editing tools |
US8875025B2 (en) | 2010-07-15 | 2014-10-28 | Apple Inc. | Media-editing application with media clips grouping capabilities |
JP5833822B2 (ja) * | 2010-11-25 | 2015-12-16 | Panasonic Intellectual Property Management Co., Ltd. | Electronic device |
US9099161B2 (en) | 2011-01-28 | 2015-08-04 | Apple Inc. | Media-editing application with multiple resolution modes |
US8621355B2 (en) | 2011-02-02 | 2013-12-31 | Apple Inc. | Automatic synchronization of media clips |
US8966367B2 (en) | 2011-02-16 | 2015-02-24 | Apple Inc. | Anchor override for a media-editing application with an anchored timeline |
US9026909B2 (en) | 2011-02-16 | 2015-05-05 | Apple Inc. | Keyword list view |
US11747972B2 (en) | 2011-02-16 | 2023-09-05 | Apple Inc. | Media-editing application with novel editing tools |
US9997196B2 (en) | 2011-02-16 | 2018-06-12 | Apple Inc. | Retiming media presentations |
US9536564B2 (en) | 2011-09-20 | 2017-01-03 | Apple Inc. | Role-facilitated editing operations |
US9437247B2 (en) | 2011-11-14 | 2016-09-06 | Apple Inc. | Preview display for multi-camera media clips |
JP5915158B2 (ja) * | 2011-12-22 | 2016-05-11 | Sony Corporation | Time code display apparatus and time code display method |
US20150309998A1 (en) * | 2012-01-12 | 2015-10-29 | Thomson Licensing | Method and apparatus for playing a mp4 file container while generating such a file |
CN103313122B (zh) * | 2012-03-09 | 2018-02-27 | Lenovo (Beijing) Co., Ltd. | Data processing method and electronic device |
US8994777B2 (en) * | 2012-03-26 | 2015-03-31 | Salesforce.Com, Inc. | Method and system for web conference recording |
US9106961B2 (en) * | 2012-05-11 | 2015-08-11 | Cisco Technology, Inc. | Method, system, and apparatus for marking point of interest video clips and generating composite point of interest video in a network environment |
US9014544B2 (en) | 2012-12-19 | 2015-04-21 | Apple Inc. | User interface for retiming in a media authoring tool |
CN104347096A (zh) * | 2013-08-09 | 2015-02-11 | 上海证大喜马拉雅网络科技有限公司 | Recording system and method integrating audio clipping, resumed recording and merging |
USD737320S1 (en) * | 2013-10-03 | 2015-08-25 | La Crosse Technology, Ltd. | Display screen with icon |
USD741368S1 (en) * | 2013-10-17 | 2015-10-20 | Microsoft Corporation | Display screen with transitional graphical user interface |
US10471348B2 (en) | 2015-07-24 | 2019-11-12 | Activision Publishing, Inc. | System and method for creating and sharing customized video game weapon configurations in multiplayer video games via one or more social networks |
USD781914S1 (en) | 2015-11-18 | 2017-03-21 | Domo, Inc. | Display screen or portion thereof with a graphical user interface |
USD1041506S1 (en) * | 2022-01-20 | 2024-09-10 | Clo Virtual Fashion Inc. | Display panel with icon |
USD1041504S1 (en) * | 2022-01-20 | 2024-09-10 | Clo Virtual Fashion Inc. | Display panel with icon |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA1217559A (en) * | 1982-12-22 | 1987-02-03 | Ronald C. Barker | Video composition method and apparatus |
US4717971A (en) * | 1984-08-24 | 1988-01-05 | Eastman Kodak Company | Partitioned editing method for a collection of video still pictures |
EP0268270B1 (en) * | 1986-11-20 | 1993-11-03 | Matsushita Electric Industrial Co., Ltd. | Information editing apparatus |
US5047867A (en) * | 1989-06-08 | 1991-09-10 | North American Philips Corporation | Interface for a TV-VCR system |
US5218672A (en) * | 1990-01-19 | 1993-06-08 | Sony Corporation Of America | Offline editing system with user interface for controlling edit list generation |
US5237648A (en) * | 1990-06-08 | 1993-08-17 | Apple Computer, Inc. | Apparatus and method for editing a video recording by selecting and displaying video clips |
EP0526064B1 (en) * | 1991-08-02 | 1997-09-10 | The Grass Valley Group, Inc. | Video editing system operator interface for visualization and interactive control of video material |
US5396339A (en) * | 1991-12-06 | 1995-03-07 | Accom, Inc. | Real-time disk system |
WO1993021635A1 (en) * | 1992-04-10 | 1993-10-28 | Avid Technology, Inc. | Method for visually and audibly representing computer instructions for editing video |
US5404316A (en) * | 1992-08-03 | 1995-04-04 | Spectra Group Ltd., Inc. | Desktop digital video processing system |
EP0598516A3 (en) * | 1992-11-05 | 1996-07-03 | Sony Corp | Recording and playback of moving pictures. |
GB2273220B (en) * | 1992-12-07 | 1997-01-08 | Quantel Ltd | A video processing system |
US5333091B2 (en) * | 1993-01-08 | 1996-12-17 | Arthur D Little Enterprises | Method and apparatus for controlling a videotape player to automatically scan past recorded commercial messages |
JPH06259887A (ja) * | 1993-03-09 | 1994-09-16 | Sony Corp | Recording and reproducing apparatus |
JPH06282965A (ja) * | 1993-03-26 | 1994-10-07 | Sony Corp | Video editing apparatus |
DE69424896T2 (de) * | 1993-04-13 | 2000-12-14 | Sony Corp., Tokio/Tokyo | Editiergerät |
US5339393A (en) * | 1993-04-15 | 1994-08-16 | Sony Electronics, Inc. | Graphical user interface for displaying available source material for editing |
US5440348A (en) * | 1993-04-16 | 1995-08-08 | Avid Technology, Inc. | Method and user interface for creating, specifying and adjusting motion picture transitions |
DE69535679T2 (de) * | 1994-03-16 | 2009-01-02 | Sony Corp. | Bildeditionssystem |
US5659793A (en) * | 1994-12-22 | 1997-08-19 | Bell Atlantic Video Services, Inc. | Authoring tools for multimedia application development and network delivery |
1996
- 1996-04-08 JP JP53087096A patent/JP3837746B2/ja not_active Expired - Fee Related
- 1996-04-08 CN CNB961903074A patent/CN1139934C/zh not_active Expired - Fee Related
- 1996-04-08 CN CNB200310124338XA patent/CN1312609C/zh not_active Expired - Fee Related
- 1996-04-08 WO PCT/JP1996/000963 patent/WO1996032722A1/ja active IP Right Grant
- 1996-04-08 EP EP96908375A patent/EP0764951B1/en not_active Expired - Lifetime
- 1996-04-08 US US08/750,330 patent/US5930446A/en not_active Expired - Fee Related
- 1996-04-08 KR KR1019960706675A patent/KR970703595A/ko active IP Right Grant
- 1996-04-08 DE DE69623712T patent/DE69623712T2/de not_active Expired - Fee Related
- 1996-04-08 EP EP20020076161 patent/EP1248258A2/en not_active Withdrawn
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0194583A (ja) * | 1987-10-05 | 1989-04-13 | Matsushita Electric Ind Co Ltd | Information editing apparatus |
JPH04344975A (ja) * | 1991-05-22 | 1992-12-01 | Toshiba Corp | Image workstation |
JPH05342267A (ja) * | 1992-06-10 | 1993-12-24 | Nippon Denki Micom Technol Kk | Automatic glossary index creation method in a document processing system |
Non-Patent Citations (1)
Title |
---|
See also references of EP0764951A4 * |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0845781A2 (en) * | 1996-11-28 | 1998-06-03 | Sony Corporation | Video-audio data processors and methods of processing and saving input video-audio data |
EP0845781A3 (en) * | 1996-11-28 | 2001-10-17 | Sony Corporation | Video-audio data processors and methods of processing and saving input video-audio data |
EP0871177A2 (en) * | 1997-04-08 | 1998-10-14 | MGI Software Corp. | A non-timeline, non-linear digital multimedia composition method and system |
EP0871177A3 (en) * | 1997-04-08 | 2000-10-04 | MGI Software Corp. | A non-timeline, non-linear digital multimedia composition method and system |
KR100603161B1 (ko) * | 1997-04-12 | 2006-12-13 | Sony Corporation | Editing system and editing method |
KR100603173B1 (ko) * | 1997-04-12 | 2006-12-15 | Sony Corporation | Editing apparatus and editing method |
JP2008543124A (ja) * | 2005-03-07 | 2008-11-27 | MCI LLC | Method and system for providing distributed editing and storage of digital media over a network |
US8782523B2 (en) | 2007-11-22 | 2014-07-15 | Sony Corporation | Unit video representing device, editing console, editing system, and animation editing method |
EP2079234A2 (en) | 2008-01-09 | 2009-07-15 | Sony Corporation | Video searching apparatus, editing apparatus, video searching method, and program |
Also Published As
Publication number | Publication date |
---|---|
JP3837746B2 (ja) | 2006-10-25 |
CN1139934C (zh) | 2004-02-25 |
KR970703595A (ko) | 1997-07-03 |
DE69623712T2 (de) | 2003-05-28 |
CN1149924A (zh) | 1997-05-14 |
CN1534512A (zh) | 2004-10-06 |
EP0764951A1 (en) | 1997-03-26 |
CN1312609C (zh) | 2007-04-25 |
EP0764951A4 (en) | 2000-11-22 |
DE69623712D1 (de) | 2002-10-24 |
EP0764951B1 (en) | 2002-09-18 |
US5930446A (en) | 1999-07-27 |
EP1248258A2 (en) | 2002-10-09 |
Similar Documents
Publication | Title |
---|---|
WO1996032722A1 (fr) | Editing system |
JP4207099B2 (ja) | Image editing apparatus and method |
US6674955B2 (en) | Editing device and editing method |
US6441832B1 (en) | Hierarchical processing apparatus and hierarchical processing method for video and audio data |
JP4117616B2 (ja) | Editing system, control method thereof, and editing apparatus |
US20030091329A1 (en) | Editing system and editing method |
WO1998024091A1 (fr) | Video editing system and method |
KR19990072012A (ko) | Editing system and editing method |
KR20060018861A (ko) | Editing apparatus and method |
US6628889B2 (en) | Editing device, editing system and editing method |
JP4229199B2 (ja) | Editing apparatus and editing method |
WO2000063914A1 (fr) | Data recording/reproducing device, data editing device and data recording method |
JP4174718B2 (ja) | Editing apparatus and editing method |
JPH10164497A (ja) | Editing system |
JP4161279B2 (ja) | Display management device, data display method and clip image data display method |
JP4253913B2 (ja) | Editing device, data recording/reproducing device and editing material recording method |
EP0911829A1 (en) | Editing system and editing method |
JP2008182765A (ja) | Image editing apparatus and method |
JP3951196B2 (ja) | Editing device |
JP4172525B2 (ja) | Editing apparatus and editing method |
JP4484804B2 (ja) | Recording and playback apparatus |
JPH10162556A (ja) | Editing system |
JP2000308000A (ja) | Editing device, data recording/reproducing device and editing information creation method |
JPH10290419A (ja) | Editing device |
US20080212935A1 (en) | Playback apparatus, playback method, and program |
Legal Events
Code | Title | Description |
---|---|---|
WWE | Wipo information: entry into national phase | Ref document number: 96190307.4; Country of ref document: CN |
AK | Designated states | Kind code of ref document: A1; Designated state(s): CN JP KR US |
AL | Designated countries for regional patents | Kind code of ref document: A1; Designated state(s): BE DE FI FR GB IT |
WWE | Wipo information: entry into national phase | Ref document number: 1996908375; Country of ref document: EP |
WWE | Wipo information: entry into national phase | Ref document number: 08750330; Country of ref document: US |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
WWP | Wipo information: published in national office | Ref document number: 1996908375; Country of ref document: EP |
WWG | Wipo information: grant in national office | Ref document number: 1996908375; Country of ref document: EP |