WO2010055638A1 - 映像データ作成装置、映像データ作成方法、および、映像データ作成用のプログラムとその記録媒体、集積回路 - Google Patents
- Publication number
- WO2010055638A1 (PCT/JP2009/005975)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- video
- video data
- information
- block
- video stream
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/82—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
- H04N9/8205—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
- H04N9/8227—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal the additional signal being at least another television signal
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/02—Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
- G11B27/031—Electronic editing of digitised analogue information signals, e.g. audio or video signals
- G11B27/034—Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/102—Programmed access in sequence to addressed parts of tracks of operating record carriers
- G11B27/105—Programmed access in sequence to addressed parts of tracks of operating record carriers of operating discs
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/426—Internal components of the client ; Characteristics thereof
- H04N21/42646—Internal components of the client ; Characteristics thereof for reading from or writing on a non-volatile solid state storage medium, e.g. DVD, CD-ROM
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/432—Content retrieval operation from a local storage medium, e.g. hard-disk
- H04N21/4325—Content retrieval operation from a local storage medium, e.g. hard-disk by playing back content from the storage medium
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/434—Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; Remultiplexing of multiplex streams; Extraction or processing of SI; Disassembling of packetised elementary stream
- H04N21/4347—Demultiplexing of several video streams
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/85—Assembly of content; Generation of multimedia applications
- H04N21/854—Content authoring
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/84—Television signal recording using optical recording
- H04N5/85—Television signal recording using optical recording on discs or drums
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B20/00—Signal processing not specific to the method of recording or reproducing; Circuits therefor
- G11B20/10—Digital recording or reproducing
- G11B20/10527—Audio or video recording; Data buffering arrangements
- G11B2020/10537—Audio or video recording
Definitions
- the present invention relates to creation of video data capable of multi-angle playback.
- video data capable of multi-angle playback is video data that allows a scene to be viewed from several different angles, for example by switching angles with a button while a given playback section is played back on a playback device.
- Patent Document 1 discloses a technique for creating video data capable of multi-angle playback.
- the above-described prior art assumes that a plurality of video data, transmitted mainly as transport streams, is recorded and edited in a single batch to produce multi-angle playable video data. Therefore, when collecting video data shot by many people and editing it all at once, a large-capacity recording medium is required to hold the large amount of video data, and the necessary portions must be found and edited by hand, so a great deal of labor is required to create multi-angle playable video data.
- An object of the present invention is to provide a video data creation device that can easily create multi-angle playable video data without performing laborious editing operations.
- a video data creation device of the present invention creates third video data from first video data and second video data. The first video data is recorded on a first recording medium and includes a first video stream, first playback path information indicating a playback path of the first video stream, and first video information that associates, with each block in the first video stream, shooting information including the shooting time at which the video in that block was shot. The second video data is recorded on a second recording medium different from the first recording medium and includes a second video stream, playback path information indicating a playback path of the second video stream, and second video information that associates, with each block in the second video stream, shooting information including the shooting time at which the video in that block was shot.
- the device comprises: an acquisition unit that acquires the first video data, the second video data, and collation reference information, recorded on the first recording medium, indicating which elements of the shooting information serve as the reference when collating blocks in the first video stream against blocks in the second video stream; a detection unit that detects, from the shooting information corresponding to each block in the first video stream and each block in the second video stream, sets of blocks that match on the elements indicated by the collation reference information; and a creation unit.
- when the detection unit finds a matching set of blocks, the creation unit creates playback path information indicating a playback path on which the detected blocks of the first and second video streams can be played back in synchronization, and creates the third video data including that playback path information, the first video stream, the blocks of the second video stream detected by the detection unit, and the video information of the first video stream.
- blocks to be played back in multiple angles are detected from the video streams by collating the shooting information included in the video information, so multi-angle playable video data can be created easily, without laborious editing operations.
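The detection step described above can be sketched as follows. This is a minimal illustrative sketch, not the patent's actual implementation: block shooting information is modeled as a dict, and `detect_matching_blocks` is a hypothetical name.

```python
# Minimal sketch of the detection step: names and structures are
# illustrative. Each block's shooting information is modeled as a
# dict, e.g. {"time": "10:01", "location": (35.0, 135.0)}.

def detect_matching_blocks(stream1, stream2, reference_elements):
    """Return (index1, index2) pairs of blocks whose shooting
    information matches on every element named in the collation
    reference information."""
    matches = []
    for i, info1 in enumerate(stream1):
        for j, info2 in enumerate(stream2):
            if all(info1.get(e) == info2.get(e) for e in reference_elements):
                matches.append((i, j))
    return matches

s1 = [{"time": "10:00"}, {"time": "10:01"}, {"time": "10:02"}]
s2 = [{"time": "10:01"}, {"time": "10:02"}]
print(detect_matching_blocks(s1, s2, ["time"]))  # [(1, 0), (2, 1)]
```

With the collation reference information set to the shooting time alone, only blocks shot at the same time are paired.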
- the video data creation device may further include a writing unit for overwriting the third video data on the first recording medium.
- the playback path information may include a video stream identifier indicating each video stream to be played back on the playback path it describes, and a flag indicating whether a plurality of video streams are to be played back on that path.
- the creation unit may create the playback path information of the third video data by adding, to the playback path information whose path contains the set of blocks detected by the detection unit among the paths indicated by the playback path information included in the first or second video data, the identifier of the other video stream to be played back on that path.
- with this configuration, the third video data can be created by changing only the playback path information; it is not necessary to change the video streams themselves.
- the collation reference information may indicate a plurality of elements included in the shooting information, and the detection unit may treat a set of blocks as matching when, among the shooting information elements corresponding to each block of the set, all of the elements indicated by the collation reference information match.
- when the collation reference information indicates a plurality of elements included in the shooting information and the shooting time is among the indicated elements, the detection unit may preferentially determine whether the shooting time information matches.
- the shooting information may include shooting location information indicating the location where the video of the block was shot, and the detection unit may treat a set of blocks as matching when the shooting location information corresponding to each block of the set matches.
- with this configuration, blocks that shot the same location are detected from video streams included in different video data, so that, for example, a multi-angle video can be created in which scenes shot at the same location at different times can be viewed by switching between them.
- the shooting information may include photographed-person information indicating a person photographed in the video of the block, and the detection unit may treat a set of blocks as matching when the photographed-person information corresponding to each block of the set matches.
- with this configuration, blocks in which the same person is photographed are detected from video streams included in different video data, so a multi-angle video can be created in which scenes showing that person can be viewed by switching between them.
- the video data creation device may further include a writing unit that writes the third video data and the collation reference information to a third recording medium different from the first recording medium.
- with this configuration, by writing the created video data and the collation reference information to a third recording medium different from the first recording medium, the multi-angle video data created based on the collation reference information set by the person who first wrote video data to the first recording medium can be recorded on further recording media and handed on one after another.
- FIG. 5 is a diagram showing an example of use of the video data creation device according to the first embodiment.
- FIG. 1 is a diagram showing a configuration of a video data creation device in Embodiment 1.
- FIG. 6 is a diagram showing a relationship between a video stream and blocks according to the first embodiment.
- FIG. 5 is a diagram showing a structure of reproduction path information in the first embodiment.
- FIG. 5 is a diagram showing a structure of video information in the first embodiment.
- FIG. 6 is a view for explaining reproduction path information creation processing according to the first embodiment.
- FIG. 7 is a flowchart showing an operation of video information collation processing in the first embodiment.
- FIG. 6 is a flowchart showing an operation of reproduction path information creation processing according to the first embodiment.
- FIG. 6 is a diagram showing a configuration of a video data creation device according to Embodiment 2.
- FIG. 6 is a diagram showing a structure of image information in the second embodiment.
- FIG. 15 is a flowchart showing the operation of image stream creation processing in the second embodiment.
- Embodiment 1 describes a video data creation device that reads video data recorded on an optical disc and video data recorded on a memory card, creates multi-angle playable video data from them, erases the video data recorded on the optical disc, and writes the multi-angle playable video data onto the original optical disc, thereby adding the memory-card video data to the optical-disc video data.
- the video data creation device assumes the following usage: when users shoot video from various angles at the same time at an event such as an athletic meet, one user writes the video data he or she shot to an optical disc and hands the disc to another user of a video data creation device; the user who receives the disc appends the video data he or she shot to the video data recorded on the disc; and in this way multi-angle playable video data is created from video data shot by a plurality of users.
- FIG. 1 is a diagram for explaining the usage situation of the video data creation device. Assume that the users U1, U2, and U3 have video data 1, 2, and 4, respectively, shot mainly of their own children at the same athletic meet.
- the user U1 writes, to the optical disc, the video data 1 that he shot and the collation reference information indicating the conditions under which a multi-angle video is created, and hands the optical disc to the user U2.
- the collation reference information indicates which features of the shot video, such as the shooting time, the shooting location, or the person photographed, are used as the condition when creating a multi-angle video. For example, for an athletic meet, by setting as the collation reference information the condition that the name of the photographed person is the name of user U1's child, scenes in which that child appears can be extracted from video data shot by other people to create a multi-angle video.
- the user U2 who has received the optical disc uses his or her own video data creation device to read, based on the collation reference information recorded on the disc, the video data 1 recorded on the disc and the video data 2 that user U2 shot, and creates multi-angle playable video data 3 composed of the video data 1 and a part of the video data 2. The user U2 writes the created video data 3 to the optical disc and hands the optical disc to the user U3.
- likewise, the user U3 who has received the optical disc uses his or her own video data creation device to read, based on the collation reference information recorded on the disc, the video data 3 recorded on the disc and the video data 4 that user U3 shot, and creates multi-angle playable video data 5 composed of the video data 3 and a part of the video data 4.
- in this way, multi-angle playable video data is created from the video data recorded on the optical disc and the video data held by the owner of each video data creation device, and the optical disc on which the created video data is recorded is handed on.
- as the optical disc is passed along, it comes to contain a multi-angle video made up of scenes extracted, based on the collation reference information, from video data shot by many people.
- for example, if the user U1 records on the optical disc the video data he or she shot together with collation reference information indicating the condition that the name of the photographed person is the name of user U1's child, and the disc is handed in order to the users U2, U3, and so on, then multi-angle playable video data is created from the video data shot by each user based on that collation reference information, and user U1 can eventually obtain video data in which his or her child can be viewed from multiple angles.
- the video data creation device 100 of this embodiment receives the video data 1 recorded on the optical disc 160, the collation reference information, and the video data 2 recorded on the memory card 170 via the optical disc drive 161 and the memory card slot 171, respectively. Based on the collation reference information, video data 3 capable of multi-angle playback is created from video data 1 and video data 2. Also, the video data creation device 100 overwrites the created video data 3 on the optical disc 160 via the optical disc drive 161.
- here, overwriting means erasing the video data 1 recorded on the optical disc 160 and writing the video data 3; the data other than the video data 1 recorded on the optical disc 160, that is, the collation reference information, is not changed.
- the video data creation device 100 includes an acquisition unit 101, a creation unit 102, a detection unit 103, a storage unit 105, and a writing unit 104.
- the acquisition unit 101 acquires the video data 1 recorded on the optical disc 160 and the collation reference information via the optical disc drive 161, and the video data 2 recorded on the memory card 170 via the memory card slot 171. It has the function to do.
- the acquisition unit 101 has a function of outputting the acquired video data 1, video data 2, and collation reference information to the creation unit 102.
- the creation unit 102 has a function of creating the multi-angle playable video data 3 based on the video data 1, the video data 2, and the collation reference information acquired by the acquisition unit 101.
- the creation unit 102 accesses the storage unit 105 in order to store the created video data 3 and data used during creation of the video data 3.
- the creation unit 102 instructs the detection unit 103 to detect the portions to be played back from the video data 1 and the video data 2 based on the collation reference information, and creates the video data 3 based on the result detected by the detection unit 103.
- the creating unit 102 instructs the writing unit 104 to write the created video data 3 to the optical disc 160.
- upon receiving an instruction from the creation unit 102, the detection unit 103 has a function of collating the video stream included in the video data 1 against the video stream included in the video data 2 based on the collation reference information, and detecting, from the blocks that make up each video stream, sets of blocks that match on the elements of the shooting information indicated by the collation reference information.
- the storage unit 105 is a storage medium that stores the video data 3 created by the creation unit 102 and data used during the creation of the video data 3.
- the writing unit 104 has a function of receiving the instruction from the creating unit 102 and overwriting the video data 3 created by the creating unit 102 on the optical disc 160 via the optical disc drive 161.
- <Video data> Next, the structure of the video data handled by the video data creation device 100 will be described.
- Video data is a set of one or more playback path information, one or more video streams, and video information associated with each video stream.
- <Video stream> A video stream is the actual data containing the captured video.
- the video stream includes a PTS (Presentation Time Stamp) indicating a position on the video stream, and the range of the video stream corresponding to each PTS is called a block.
- a video stream is identified by a video stream identifier.
- the video stream identifier is, for example, the file name of a file that records the video stream.
- Video data 1 includes only one video stream, and the video stream identifier indicating the video stream is “ST1”.
- Video data 2 includes only one video stream, and the video stream identifier indicating the video stream is “ST2”.
- FIG. 3 shows an example of the video stream indicated by the video stream identifier “ST1” included in the video data 1 and the video stream indicated by the video stream identifier “ST2” included in the video data 2.
- the video stream indicated by the video stream identifier “ST1” is composed of 9 blocks of PTS from “0” to “80” as shown in FIG. 3 (a).
- the video stream indicated by the video stream identifier “ST2” is composed of 3 blocks of PTS from “0” to “20” as shown in Fig. 3 (b).
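The two example streams can be modeled as lists of PTS-indexed blocks. This is an illustrative sketch only; the helper name and the dict representation are assumptions, with blocks spaced 10 PTS units apart as in FIG. 3.

```python
# Illustrative model of the example streams: each stream is a list of
# blocks, each block identified by its PTS (spaced 10 units apart).
def make_stream(stream_id, start_pts, end_pts, step=10):
    return [{"stream": stream_id, "pts": pts}
            for pts in range(start_pts, end_pts + 1, step)]

st1 = make_stream("ST1", 0, 80)   # 9 blocks, PTS 0..80
st2 = make_stream("ST2", 0, 20)   # 3 blocks, PTS 0..20
print(len(st1), len(st2))  # 9 3
```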
- <Playback path information> Playback path information is information indicating a path for playing back a video stream; it includes a video stream identifier indicating the video stream to be played back, the playback start PTS and playback end PTS of that stream, and a multi-angle flag indicating whether multi-angle playback is possible in the range indicated by the playback start PTS and playback end PTS.
- Fig. 4 shows an example of playback path information included in video data 1 and video data 2.
- as shown in Fig. 4 (a), the video data 1 includes only one piece of playback path information, whose playback path information number in video data 1 is 1; that playback path information indicates the entire range of the video stream ST1, that is, that the range from playback start PTS 0 to playback end PTS 80 is played back. Since the multi-angle flag of this playback path information is OFF, multi-angle playback is not possible.
- similarly, the video data 2 includes only one piece of playback path information, whose playback path information number in video data 2 is 1; that playback path information indicates the entire range of the video stream ST2, that is, that the range from playback start PTS 0 to playback end PTS 20 is played back. Since the multi-angle flag of this playback path information is OFF, multi-angle playback is not possible.
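One entry of playback path information, as just described, can be sketched as a small record type. The field names mirror the description above and are illustrative, not any normative file format.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical model of one playback-path-information entry; field
# names follow the description in the text, not a real disc format.
@dataclass
class PlaybackPath:
    number: int            # playback path information number
    stream_ids: List[str]  # video stream identifier(s) to play back
    start_pts: int         # playback start PTS
    end_pts: int           # playback end PTS
    multi_angle: bool      # multi-angle flag

# The single path of video data 1: the whole range of stream ST1.
path1 = PlaybackPath(number=1, stream_ids=["ST1"],
                     start_pts=0, end_pts=80, multi_angle=False)
print(path1.multi_angle)  # False
```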
- <Video information> Video information is information associated with each video stream that indicates, for each block making up the stream, shooting information about the video included in the block.
- the shooting information consists of elements such as the shooting time, which indicates the time at which the video in the block was shot; the shooting location, which indicates by longitude and latitude the place where the video in the block was shot; and the photographed person, which indicates the name of a person appearing in the video in the block.
- FIGS. 5 (a) and 5 (b) show examples of the video information “M1” corresponding to the video stream “ST1” and the video information “M2” corresponding to the video stream “ST2”.
- the video information “M1” corresponding to the video stream “ST1” is a table in which the video stream identifier, PTS, and shooting information are associated with each of the nine blocks of the video stream “ST1”, as shown in FIG. 5 (a).
- the video information “M2” corresponding to the video stream “ST2” is a table in which the video stream identifier, PTS, and shooting information are associated with each of the three blocks of the video stream “ST2” as shown in FIG. 5 (b).
- the video information is created at the same time the video stream is recorded during shooting: the shooting time is obtained from a clock, the shooting location from GPS (Global Positioning System), and the photographed person by a face recognition function. To obtain the photographed person by face recognition, however, correspondences between the face recognition data needed for recognition and person names must be registered in a database in advance.
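Building one video-information entry at recording time can be sketched as below. The clock, GPS, and face-recognition inputs are stand-in values, and the function name is an assumption for illustration.

```python
# Illustrative sketch of building a video-information entry while a
# block is recorded; clock/GPS/face-recognition values are stand-ins.
def make_entry(stream_id, pts, clock_time, gps_position, persons):
    return {
        "stream": stream_id,       # video stream identifier
        "pts": pts,                # block position in the stream
        "time": clock_time,        # from the recorder's clock
        "location": gps_position,  # (latitude, longitude) from GPS
        "persons": persons,        # names from the face-recognition DB
    }

entry = make_entry("ST1", 0, "2009-11-10 10:00",
                   (34.99, 135.75), ["Taro"])
print(entry["stream"], entry["pts"])  # ST1 0
```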
- <Collation reference information> The collation reference information indicates which elements of the shooting information included in each entry of the video information are used as the collation reference when creating multi-angle playable video data; that is, it indicates the shooting time, the shooting location, the photographed person, or a combination of these. In the following description, it is assumed that the only shooting information element indicated by the collation reference information is the shooting time.
- when the video data creation device 100 acquires the video data 1, the video data 2, and the collation reference information, it collates the video information M2 included in the video data 2 against the video information M1 included in the video data 1, and detects sets of blocks that match on the elements of the shooting information indicated by the collation reference information. If matching sets of blocks are detected, the video data creation device 100 changes the video information M2 and the corresponding video stream ST2 so that the PTSs of those blocks coincide. Subsequently, the video data creation device 100 creates playback path information based on the video information M1 and the corresponding video stream ST1 and on the changed video information M2 and the corresponding video stream ST2, and assembles the created playback path information, the video information M1 and video stream ST1, and the changed video information M2 and video stream ST2 into the multi-angle playable video data 3.
- collating means comparing the elements of the photographing information to determine whether or not they match.
- matching means that the values are identical or close to each other, within an allowable range set according to the precision with which the shooting information was recorded at shooting time. For example, in the example of FIG. 5, the shooting time is recorded in units of minutes, so time differences of less than one minute cannot be distinguished; a difference of less than one minute is therefore judged a match. Similarly, the shooting location is recorded in units of 0′0.001″, so location differences of less than 0′0.001″ cannot be distinguished, and a difference of less than 0′0.001″ is judged a match. The photographed person is judged a match when the registered person names are identical.
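The tolerance-based matching just described can be sketched as follows. The function names and the exact tolerance values are illustrative assumptions chosen to mirror the example in the text (whole-minute times, ~0.001 arc-second locations).

```python
# Sketch of "matching within an allowable range": values recorded at
# limited precision are treated as matching when they differ by less
# than one recording unit. Tolerances here are illustrative.
def times_match(minutes_a, minutes_b, tolerance_min=1):
    # shooting times as integer minutes since some epoch
    return abs(minutes_a - minutes_b) < tolerance_min

def locations_match(loc_a, loc_b, tolerance_deg=0.001 / 3600):
    # locations as (lat, lon) in degrees; ~0.001 arc-second tolerance
    return all(abs(a - b) < tolerance_deg for a, b in zip(loc_a, loc_b))

def persons_match(name_a, name_b):
    # photographed persons match only on identical registered names
    return name_a == name_b

print(times_match(600, 600))  # True  (same recorded minute)
print(times_match(600, 601))  # False (one full minute apart)
```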
- when the video information M1 shown in Fig. 5 (a) is collated with the video information M2 shown in Fig. 5 (b), it is found that the element of the shooting information indicated by the collation reference information, that is, the shooting time, matches between the blocks of M1 with PTS 40 to 60 and the blocks of M2 with PTS 0 to 20.
- the video data creation device 100 then changes the PTS of each detected entry included in the video information M2 so that it matches the PTS of the corresponding entry included in the video information M1. The changed video information “M2” is as shown in FIG.
- in addition, the video data creation device 100 changes the PTS of the video stream ST2 corresponding to the video information M2 in accordance with the changed PTS of the video information M2.
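This PTS change can be sketched as a uniform shift. The sketch assumes, as in the example, that M2's matched blocks at PTS 0 to 20 correspond to M1's blocks at PTS 40 to 60, so every PTS is shifted by the same offset; `shift_pts` is an illustrative name.

```python
# Sketch of the PTS change: once M2's blocks at PTS 0..20 are found
# to match M1's blocks at PTS 40..60, every PTS in M2 (and in the
# corresponding stream ST2) is shifted by the same offset so that
# matching blocks share a PTS.
def shift_pts(entries, offset):
    return [{**e, "pts": e["pts"] + offset} for e in entries]

m2 = [{"pts": 0}, {"pts": 10}, {"pts": 20}]
offset = 40 - 0  # matched M1 PTS minus matched M2 PTS
print(shift_pts(m2, offset))  # [{'pts': 40}, {'pts': 50}, {'pts': 60}]
```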
- the video data creation device 100 creates playback path information of the video data 3.
- Fig. 6 is a diagram showing how the playback path information of video data 3 is created.
- FIG. 6 shows that the shooting times match between the range of the video information M1 with PTS 40 to 60 and the range of the video information M2 with PTS 0 to 20, and that the video data 3 includes the video stream ST1 corresponding to the video information M1 and the video stream ST2 whose PTS has been changed to match the blocks in the range 40 to 60.
- the playback path information of the video data 3 is created so that ranges played back with the same set of video streams are specified by one piece of playback path information, and ranges played back with different sets of video streams are specified by different pieces of playback path information.
- in the first range, the only video stream to be played back is the video stream ST1, so playback path information is created with playback start PTS 0, playback end PTS 30, the video stream identifier “ST1” indicating the video stream to be played back, and the multi-angle flag “OFF”, and playback path information number “1” is assigned to it.
- in the next range, the video streams to be played back change to the two video streams ST1 and ST2, so playback path information is created with playback start PTS 40, playback end PTS 60, the video stream identifiers “ST1” and “ST2” indicating the video streams to be played back, and the multi-angle flag “ON”, and playback path information number “2” is assigned to it.
- in the last range, the video stream to be played back changes back to the video stream ST1 only, so playback path information is created with playback start PTS 70, playback end PTS 80, the video stream identifier “ST1” indicating the video stream to be played back, and the multi-angle flag “OFF”, and playback path information number “3” is assigned to it.
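The segmentation above can be sketched as a scan over ST1's block range that starts a new playback path whenever the set of streams to play changes. This is an illustrative sketch for the worked example only; `build_paths` and its dict output are assumptions, with blocks spaced 10 PTS units apart.

```python
# Sketch of splitting video data 3 into playback paths: ranges where
# only ST1 plays get the multi-angle flag OFF; the range where ST1
# and the shifted ST2 overlap gets both identifiers and the flag ON.
def build_paths(st1_range, st2_range, step=10):
    paths, number = [], 1
    pts = st1_range[0]
    while pts <= st1_range[1]:
        overlap = st2_range[0] <= pts <= st2_range[1]
        streams = ["ST1", "ST2"] if overlap else ["ST1"]
        start = pts
        # extend the path while the overlap status stays the same
        while (pts + step <= st1_range[1]
               and (st2_range[0] <= pts + step <= st2_range[1]) == overlap):
            pts += step
        paths.append({"number": number, "streams": streams,
                      "start": start, "end": pts, "multi_angle": overlap})
        number += 1
        pts += step
    return paths

for p in build_paths((0, 80), (40, 60)):
    print(p)
```

For ST1 spanning PTS 0 to 80 and the shifted ST2 spanning 40 to 60, this yields the three paths of the example: 0 to 30 (OFF), 40 to 60 (ON), and 70 to 80 (OFF).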
- the playback path information of the video data 3 created in this way is as shown in FIG.
- <Video data creation process>
- video data creation processing by the video data creation device 100 will be described using the flowchart of FIG.
- the acquisition unit 101 acquires the video data 1 recorded on the optical disc 160 and the collation reference information (S801). Subsequently, the acquisition unit 101 acquires the video data 2 recorded on the memory card 170 (S802).
- Next, the creation unit 102 counts the number of pieces of video information included in the video data 1 acquired by the acquisition unit 101 and assigns that number to the variable N1, which stores the number of pieces of video information in the video data 1 (S803). Subsequently, the creation unit 102 counts the number of pieces of video information included in the video data 2 and assigns that number to the variable N2, which stores the number of pieces of video information in the video data 2 (S804).
- The variable i is initialized to 1 (S805) and incremented by 1 (S813) until it reaches N1 (S812); the i-th video information M1(i) of the video data 1 is acquired (S806), and the following processing is repeated.
- The variable j is initialized to 1 (S807) and incremented by 1 (S811) until it reaches N2; the j-th video information M2(j) of the video data 2 is acquired (S808), and the following processing is repeated.
- The collation processing of video information A and video information B is performed (S809), with the video information A being M1(i) and the video information B being M2(j).
- The video data 3 is created as described above.

<Collation processing>

- Next, the collation processing of video information A and video information B will be described using the flowchart of FIG. 9.
- First, the number of entries of the video information A is counted and assigned to a variable NA, which stores the number of entries of the video information A (S901).
- Similarly, the number of entries of the video information B is counted and assigned to a variable NB, which stores the number of entries of the video information B (S902).
- The variable i is initialized to 1 (S903), and the following processing is repeated while incrementing i by 1 (S915) until it reaches NA (S914).
- First, the i-th entry A(i) of the video information A is acquired (S904). Subsequently, the element RA(i) of the shooting information indicated by the collation reference information is acquired from the entry A(i) (S905).
- The variable j is initialized to 1 (S906), and the following processing is repeated while incrementing j by 1 (S913) until it reaches NB (S912).
- First, the j-th entry B(j) of the video information B is acquired (S907). Subsequently, the element RB(j) of the shooting information indicated by the collation reference information is acquired from the entry B(j) (S908).
- If the shooting information elements RA(i) and RB(j) match (Y in S909),
- the difference between the PTS of the entry A(i) and the PTS of the entry B(j) is calculated (S910),
- and the j-th element C(j) of the match flag C is set to ON (S921).
- Finally, the PTS of the video stream corresponding to the video information B is changed so as to match the PTS of the video information B (S916).
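As a rough sketch (not the patented implementation), the collation loop above can be expressed in Python. The entry representation and the field names `pts` and `shooting_time` are illustrative assumptions, not taken from the patent text:

```python
# Sketch of the collation flow (S901-S916): entries whose shooting-information
# element matches are paired, and the PTS values associated with video
# information B are shifted so that the matched blocks line up with A.

def collate(video_info_a, video_info_b, ref_element="shooting_time"):
    """Return (match_flags, shifted_b): per-entry match flags for B and a
    copy of B whose PTS values are aligned to A at the first match."""
    match_flags = [False] * len(video_info_b)
    shifted_b = [dict(entry) for entry in video_info_b]
    shift = None
    for entry_a in video_info_a:                    # S903-S915: loop over A
        for j, entry_b in enumerate(video_info_b):  # S906-S913: loop over B
            if entry_a[ref_element] == entry_b[ref_element]:   # S909
                if shift is None:
                    shift = entry_a["pts"] - entry_b["pts"]    # S910
                match_flags[j] = True                          # S921: flag ON
    if shift:
        for entry in shifted_b:                     # S916: align B's PTS
            entry["pts"] += shift
    return match_flags, shifted_b
```

With the example data of the embodiment (matched blocks at PTS 40 to 60 in video data 1), B's PTS values are shifted by the first detected difference so that the matched blocks share the same PTS.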
<Playback path information creation processing>

- Next, the playback path information creation processing of the video data 3 will be described using the flowchart of FIG. 10.
- First, the video stream set M is emptied (S1003).
- The variable i is initialized to 1 (S1004), and the following processing is repeated while incrementing i by 1 (S1022).
- The variable t is assigned to the variable Tbeg (S1005). Subsequently, the variable t is assigned to the variable Tend (S1006).
- The variable t is incremented by 1 (S1010), and this is repeated while the video stream set N remains empty.
- The variable Tend is updated to the variable t (S1012), the variable t is incremented by 1 (S1013), and the above processing is repeated for the incremented variable t.
- The playback start PTS of the i-th playback path information P(i) is set to Tbeg (S1014), and the playback end PTS is set to Tend (S1015). Further, the elements of the video stream set N are added to the playback path information P(i) (S1016).
- The number n of elements of the video stream set N is obtained (S1017). If n is greater than 1 (Y in S1018), the multi-angle flag of the playback path information P(i) is set to ON (S1019); if n is not greater than 1 (N in S1018), the multi-angle flag of the playback path information P(i) is set to OFF (S1020).
- The video stream set M is set to the video stream set N (S1021), the playback path information number i is incremented by 1 (S1022), and the next playback path information is created.
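The grouping performed by the flow of FIG. 10 can be sketched as follows. This is a simplified reading of the flowchart, assuming that one time unit corresponds to 10 PTS ticks as in the example video data; the dictionary keys are illustrative, not from the patent:

```python
# Sketch of playback-path creation: consecutive time units during which the
# same set of video streams is active are grouped into one piece of playback
# path information, and the multi-angle flag is set to ON when that set
# contains more than one stream (S1018-S1020).

def build_playback_paths(active_streams_per_unit):
    """active_streams_per_unit: list of sets of stream identifiers,
    one per time unit (1 unit = 10 PTS ticks in this sketch)."""
    paths = []
    t = 0
    while t < len(active_streams_per_unit):
        current = active_streams_per_unit[t]
        if not current:              # skip units with no stream to play
            t += 1
            continue
        begin = t
        while (t + 1 < len(active_streams_per_unit)
               and active_streams_per_unit[t + 1] == current):
            t += 1                   # extend Tend while the set is unchanged
        paths.append({
            "number": len(paths) + 1,
            "start_pts": begin * 10,          # S1014: playback start PTS
            "end_pts": t * 10,                # S1015: playback end PTS
            "streams": sorted(current),       # S1016: stream identifiers
            "multi_angle": len(current) > 1,  # S1018-S1020
        })
        t += 1
    return paths
```

Applied to the example (ST1 alone at PTS 0 to 30 and 70 to 80, ST1 and ST2 at PTS 40 to 60), this produces the three pieces of playback path information described above.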
<Playback operation>

- Here, how a playback device plays back the multi-angle-playable video data created by the video data creation device 100 will be described.
- The playback device performs playback in the order of the playback path information numbers, in accordance with the playback path information included in the video data.
- Specifically, the video stream indicated by the playback path information is played back from the playback start PTS to the playback end PTS.
- When the multi-angle flag included in the playback path information is ON, the playback path information indicates a plurality of video streams; one of them is played back, and when a remote-control button operation by the user is accepted, playback switches to another of the video streams.
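A minimal, hypothetical sketch of this playback behavior (the patent does not specify a player implementation; the `angle_index` parameter and the field names are illustrative):

```python
# Sketch of playback: path infos are followed in number order, and within a
# multi-angle section a user-selected angle index chooses which of the listed
# streams is rendered; outside multi-angle sections the single stream plays.

def stream_to_render(paths, pts, angle_index=0):
    """Return the stream identifier to render at the given PTS, or None if
    no playback path covers it. `angle_index` models the remote-control
    angle selection and is honoured only in multi-angle sections."""
    for path in paths:                          # played in number order
        if path["start_pts"] <= pts <= path["end_pts"]:
            streams = path["streams"]
            if path["multi_angle"]:
                return streams[angle_index % len(streams)]
            return streams[0]
    return None
```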
<<Embodiment 2>>

- In Embodiment 1, a video data creation device that creates multi-angle-playable video data from two or more video streams included in two pieces of video data was described. In the present embodiment, a video data creation device that creates multi-angle-playable video data from a video stream included in video data and from image data will be described.
- The video data creation device 100 of the present embodiment acquires the video data 1 and the collation reference information recorded on the optical disc 160, and the image data recorded on the memory card 170, via the optical disc drive 161 and the memory card slot 171, respectively, and creates the video data 3. The video data creation device 100 then overwrites the optical disc 160 with the created video data 3 via the optical disc drive 161.
- The video data creation device 100 of the present embodiment has a configuration in which an image stream creation unit 106 is added to the video data creation device 100 of Embodiment 1.
- The image stream creation unit 106 has a function of detecting, from the image data acquired by the acquisition unit 101, image data whose shooting information element indicated by the collation reference information matches that of a block constituting the video stream included in the video data 1, a function of creating an image stream that displays the detected image data as a slide show, and a function of creating video information for the created image stream.
- The processing by which the creation unit 102 creates multi-angle-playable video data from the video data 1 and from the image stream and video information created by the image stream creation unit 106 is the same as in Embodiment 1.

<Image data>

- Next, the structure of the image data from which the image stream creation unit 106 creates an image stream will be described.
- Image data consists of the shot data itself with image information added to it.
- The image information is information such as the shooting time, the shooting location, and the person photographed when the image data was shot.
- The image information is generated at the time of shooting and is associated with each piece of image data in accordance with, for example, the Exif standard.
- Under the Exif standard, information such as the image resolution, the compression format, the camera model used for shooting, the camera settings at the time of shooting, and the shooting time is stored as tags in a single file together with the image data, so information such as the person photographed can easily be stored in association with the image data.
- FIG. 12(a) shows an example of the image information included in the image data.
- Here, five pieces of image data are recorded in the memory card 170, and the image data identifiers indicating the respective pieces of image data are PICT1 to PICT5.
- When the image information in FIG. 12(a) is collated with the video information M1 included in the video data 1 of FIG. 5(a), the entries with PTS 40 to 60 of the video information M1 and the image information of the image data PICT2 to PICT4 have matching shooting times, so the image stream creation unit 106 creates an image stream ST2 having the video information shown in FIG. 12(b).
<Image stream creation processing>

- Next, the image stream creation processing will be described with reference to FIG. 13. In the image stream creation processing, the video information of the image stream is created at the same time as the image stream itself.
- First, the video data 1 and the collation reference information recorded on the optical disc 160 are acquired (S1301). Subsequently, the image data recorded on the memory card 170 is acquired (S1302).
- The number of entries of the video information V to be collated, included in the video data 1, is assigned to NV (S1303).
- The number of pieces of image data is assigned to NI (S1304).
- The video information of the image stream is emptied (S1305), and a variable n indicating which entry of the video information of the image stream is being processed is initialized to 1 (S1306).
- The variable i is initialized to 1 (S1307), and the following processing is repeated while incrementing i by 1 (S1322) until it reaches NI, the number of pieces of image data (S1321).
- The element RI(i) of the shooting information indicated by the collation reference information is acquired from the image information I(i) included in the i-th image data (S1308).
- The variable j is initialized to 1 (S1309), and the following processing is repeated while incrementing j by 1 (S1320) until it reaches NV, the number of entries of the video information V (S1319).
- The j-th entry V(j) of the video information V is acquired (S1310). Subsequently, the element RV(j) of the shooting information indicated by the collation reference information is acquired from the entry V(j) (S1311).
- If the shooting information elements RI(i) and RV(j) match (Y in S1312),
- the shooting information of the image information I(i) is added to the video information of the image stream (S1313).
- The PTS of the n-th entry is then calculated from the shooting time and the PTS of the (n-1)-th entry of the video information of the image stream, and the calculated PTS is set in that entry (S1316).
- The PTS is calculated as follows. As can be seen from the video information M1 in FIG. 5(a), time and PTS have the relationship that the PTS increases by 10 per minute. Using this relationship, with the PTS of the first entry set to 0, the PTS of the n-th entry can be calculated from the difference between its shooting time and that of the (n-1)-th entry.
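This PTS derivation can be sketched as follows. The time format and the rate constant are taken from the example in the description (PTS advancing by 10 per minute); the function name is an illustrative assumption:

```python
# Sketch of the PTS derivation (S1316): the first entry gets PTS 0, and each
# later entry's PTS is the previous entry's PTS offset by the shooting-time
# difference converted at 10 PTS per minute, as observed in video info M1.

from datetime import datetime

PTS_PER_MINUTE = 10

def shooting_times_to_pts(times, fmt="%H:%M"):
    if not times:
        return []
    stamps = [datetime.strptime(t, fmt) for t in times]
    pts_list = [0]                                    # first entry: PTS 0
    for prev, cur in zip(stamps, stamps[1:]):
        minutes = (cur - prev).total_seconds() / 60   # diff to (n-1)-th entry
        pts_list.append(pts_list[-1] + int(minutes * PTS_PER_MINUTE))
    return pts_list
```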
- The image stream and its video information are created by repeating the above processing while incrementing by 1 the variable n, which indicates the entry being processed in the video information (S1318).
- The created image stream and its video information are then used in place of the video stream and video information included in the video data 2 of Embodiment 1. This makes it possible to create multi-angle-playable video data from a video stream and image data.
<<Supplement>>

- The above-described embodiments may be modified as follows.
- In the above embodiments, the shooting time was described as an example of the shooting information element indicated by the collation reference information, but the present invention is not necessarily limited to this.
- The element of the shooting information indicated by the collation reference information is not limited to the shooting time; it may be any information indicating the content of a block in the video stream, such as the shooting location or the person photographed. In the above embodiments, this can be realized by acquiring, instead of the shooting time, the relevant element of the shooting information indicated by the collation reference information from the video information.
- However, such elements must be generated in the video information in advance, for example by acquiring the shooting location using GPS and identifying the person photographed using a face recognition function.
- In the above embodiments, the only shooting information element indicated by the collation reference information was the shooting time, but the present invention is not necessarily limited to this.
- All of the plurality of shooting information elements indicated by the collation reference information may be collated, and a pair of blocks may be determined to match only when all of the collated elements match.
- Collating all of the plurality of elements means, for example, that when the elements indicated by the collation reference information are the shooting time and the shooting location, it is determined whether the shooting location also matches in addition to the shooting time.
- Alternatively, the shooting time may be collated with priority. That is, when detecting a set of matching blocks based on two pieces of video information, a set of blocks whose shooting times match is detected first, and it is then determined whether the other shooting information elements indicated by the collation reference information also match for the detected set of blocks.
- By collating the shooting times with priority, it is possible to create multi-angle video in the original sense: a combination of videos shot from different angles at the same time.
- In the above embodiments, video data shot by another person is acquired by having the optical disc 160 on which the video data is recorded handed over, but the present invention is not necessarily limited to this.
- Another recording medium may be handed over, or the medium may be delivered by mail or the like.
- Alternatively, video data shot with the moving-image shooting function of a digital still camera, a mobile phone, or the like may be transmitted and received via a network, and multi-angle-playable video data may be created from the transmitted and received video data.
- In the above embodiments, the video data 1 is recorded on the optical disc 160 and the video data 2 is recorded on the memory card 170, but the present invention is not necessarily limited to this.
- Unless the video data is transmitted and received via a network as described above, the recording medium on which the video data 1 is recorded may be any recording medium that the video data creation device 100 can read and that can be handed over.
- The recording medium on which the video data 2 is recorded may be any recording medium that the video data creation device 100 can read.
- In the above embodiments, the video data 3 created by the video data creation device 100 is overwritten onto the optical disc 160, but the present invention is not necessarily limited to this.
- The video data 3 may be recorded on a recording medium different from the one on which the video data 1 is recorded, and that medium may be handed over in turn.
- In the above embodiments, the allowable range used when collating shooting information is set as appropriate according to the accuracy with which the shooting information was recorded at the time of shooting, but the present invention is not necessarily limited to this. That is, a unit of accuracy coarser than the unit in which the shooting information is recorded may be set as the allowable range. For example, when the shooting time is recorded in units of one minute, a difference of less than five minutes may be determined to be a match.
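Such a coarse-tolerance comparison can be sketched as follows. Both the time format and the five-minute threshold are the example values from the passage above, not fixed by the invention:

```python
# Sketch of tolerance-based matching: shooting times recorded in 1-minute
# units are treated as matching when their difference is under 5 minutes.

from datetime import datetime

def times_match(time_a, time_b, tolerance_minutes=5, fmt="%H:%M"):
    a = datetime.strptime(time_a, fmt)
    b = datetime.strptime(time_b, fmt)
    return abs((a - b).total_seconds()) / 60 < tolerance_minutes
```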
- The video data creation device of the present invention can be used in devices that record video data on a recording medium, such as personal computers, DVD recorders, BD recorders, and digital video cameras. It can also be used in devices, such as digital still cameras and mobile phones, that can shoot video data and communicate with an external device, when recording video data on a recording medium via that external device.
- 100: Video data creation device
- 101: Acquisition unit
- 102: Creation unit
- 103: Detection unit
- 104: Writing unit
- 105: Storage unit
- 106: Image stream creation unit
- 160: Optical disc
- 161: Optical disc drive
- 170: Memory card
- 171: Memory card slot
Abstract
Description
In the present embodiment, a video data creation device will be described that reads video data recorded on an optical disc and video data recorded on a memory card, creates multi-angle-playable video data, erases the video data recorded on the optical disc, and writes the created multi-angle-playable video data back to the original optical disc, thereby appending the video data recorded on the memory card to the video data recorded on the optical disc.
<Configuration>

Next, the configuration of the video data creation device of the present embodiment will be described with reference to FIG. 2.
<Video data>

Next, the structure of the video data handled by the video data creation device 100 will be described.

- Video stream

A video stream is the actual data containing the shot video. A video stream contains PTSs (Presentation Time Stamps) indicating positions within the stream, and the range of the video stream corresponding to each PTS is called a block. A video stream is identified by a video stream identifier, which is, for example, the file name of the file in which the video stream is recorded.
- Playback path information

Playback path information is information indicating a path for playing back a video stream; it contains a video stream identifier indicating the video stream to be played back, the playback start PTS and playback end PTS of that video stream, and a multi-angle flag indicating whether multi-angle playback is possible in the range delimited by the playback start PTS and playback end PTS.

As shown in FIG. 4(a), the video data 1 contains only one piece of playback path information, with playback path information number 1, and that playback path information covers the full range of the video stream ST1; that is, it indicates that the range from playback start PTS 0 to playback end PTS 80 is to be played back. Furthermore, the multi-angle flag of this playback path information is OFF, indicating that multi-angle playback is not possible.
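The structures described above (video stream blocks addressed by PTS, and playback path information carrying an identifier, a PTS range, and a multi-angle flag) could be modelled as follows. The class and field names are illustrative assumptions, not taken from the patent text:

```python
# Illustrative model: a video stream is a sequence of blocks keyed by PTS,
# and playback path information names the stream(s) to play, a PTS range,
# and a multi-angle flag.

from dataclasses import dataclass, field

@dataclass
class PlaybackPathInfo:
    number: int                 # playback path information number
    stream_ids: list            # video stream identifiers (e.g. file names)
    start_pts: int              # playback start PTS
    end_pts: int                # playback end PTS
    multi_angle: bool = False   # multi-angle flag

@dataclass
class VideoData:
    streams: dict = field(default_factory=dict)   # id -> blocks keyed by PTS
    paths: list = field(default_factory=list)     # playback path information

# Video data 1 of the example: one path covering ST1's full range, PTS 0-80.
video_data_1 = VideoData(
    streams={"ST1": {}},
    paths=[PlaybackPathInfo(1, ["ST1"], 0, 80, multi_angle=False)],
)
```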
- Video information

Video information is information that is associated with each video stream and that indicates, for each block constituting the video stream, shooting information about the video contained in that block.
<Collation reference information>

Collation reference information is information indicating which element of the shooting information contained in each entry of the video information is to be used as the reference for collation when creating multi-angle-playable video data. That is, the collation reference information indicates one of the shooting time, the shooting location, and the person photographed, or a combination of these. In the following description, the shooting information element indicated by the collation reference information is assumed to be the shooting time only.
<Operation>

Next, an overview of the operation by which the video data creation device 100 creates multi-angle-playable video data will be described.
<Video data creation processing>

Next, the video data creation processing by the video data creation device 100 will be described using the flowchart of FIG. 8.
<Collation processing>

Next, the collation processing of video information A and video information B will be described using the flowchart of FIG. 9.
<Playback path information creation processing>

Next, the playback path information creation processing of the video data 3 will be described using the flowchart of FIG. 10.
<Concrete example>

When multi-angle-playable video data created using the video data 1 in this way is overwritten onto the optical disc 160 and the optical disc 160 is handed over, the person who receives the optical disc 160 can, based on the collation reference information recorded on the optical disc 160, create video data in which video data that he or she has shot is added to the video data recorded on the optical disc 160, overwrite the optical disc 160 with it, and hand it to the next person. In this way, multi-angle-playable video data can be created from video data shot by many people.
<Playback operation>

Here, how a playback device plays back the multi-angle-playable video data created by the video data creation device 100 will be described.
<<Embodiment 2>>

In Embodiment 1, a video data creation device that creates multi-angle-playable video data from two or more video streams included in two pieces of video data was described; in the present embodiment, a video data creation device that creates multi-angle-playable video data from a video stream included in video data and from image data will be described.
<Configuration>

Next, the configuration of the video data creation device of the present embodiment will be described with reference to FIG. 11.
<Image data>

Next, the structure of the image data from which the image stream creation unit 106 creates an image stream will be described.
<Image stream creation processing>

Next, the image stream creation processing will be described with reference to FIG. 13. In the image stream creation processing, the video information of the image stream is created at the same time as the image stream itself.
<<Supplement>>

The above-described embodiments may be modified as follows.
101: Acquisition unit
102: Creation unit
103: Detection unit
104: Writing unit
105: Storage unit
106: Image stream creation unit
160: Optical disc
161: Optical disc drive
170: Memory card
171: Memory card slot
Claims (12)
- A video data creation device for creating third video data from: first video data recorded on a first recording medium and including a first video stream, first playback path information indicating a playback path of the first video stream, and first video information in which shooting information, including the shooting time at which the video contained in each block of the first video stream was shot, is associated with each such block; and second video data recorded on a second recording medium different from the first recording medium and including a second video stream, playback path information indicating a playback path of the second video stream, and second video information in which shooting information, including the shooting time at which the video contained in each block of the second video stream was shot, is associated with each such block,
the video data creation device comprising:
an acquisition unit operable to acquire the first video data, the second video data, and collation reference information that is recorded on the first recording medium and that indicates an element of the shooting information serving as a reference for collating blocks in the first video stream with blocks in the second video stream;
a detection unit operable to detect, from among the blocks in the first video stream and the blocks in the second video stream, a set of blocks for which, among the shooting information associated with each block, the element of the shooting information indicated by the collation reference information matches; and
a creation unit operable, when the detection unit detects a matching set of blocks, to create the third video data including: playback path information indicating a playback path on which the blocks of the first video stream and the blocks of the second video stream detected as matching by the detection unit can be played back such that the set of blocks is synchronized; the first video stream; the blocks of the second video stream detected as matching by the detection unit; and the video information of the first video stream.
- The video data creation device according to claim 1, further comprising a writing unit operable to overwrite the first recording medium with the third video data.
- The video data creation device according to claim 2, wherein
the playback path information includes a video stream identifier indicating a video stream to be played back on the playback path indicated by the playback path information, and a multi-stream flag indicating whether there is more than one video stream to be played back on the playback path indicated by the playback path information, and
the creation unit creates the playback path information included in the third video data by adding a video stream identifier indicating the second video stream to the playback path information that, among the playback paths indicated by the playback path information included in the first video data or the second video data, indicates a playback path containing a block included in the set of blocks detected by the detection unit, and by turning on the multi-stream flag included in that playback path information.
- The video data creation device according to claim 3, wherein
the collation reference information indicates a plurality of elements included in the shooting information, and
the detection unit detects the matching set of blocks when all of the shooting information elements indicated by the collation reference information match among the shooting information elements corresponding to each block of the set.
- The video data creation device according to claim 4, wherein,
when the collation reference information indicates a plurality of elements included in the shooting information and the shooting time information is among the shooting information elements indicated by the collation reference information, the detection unit determines whether the blocks match by collating the shooting time information with priority.
- The video data creation device according to claim 4, wherein
the shooting information has, as an element, shooting location information indicating the location at which the video of the block was shot, and
the detection unit detects the matching set of blocks when the shooting location information matches among the shooting information elements corresponding to each block of the set.
- The video data creation device according to claim 4, wherein
the shooting information has, as an element, photographed-person information indicating the person appearing in the video of the block, and
the detection unit detects the matching set of blocks when the photographed-person information matches among the shooting information elements corresponding to each block of the set.
- The video data creation device according to claim 1, further comprising a writing unit operable to write the third video data and the collation reference information to a third recording medium different from the first recording medium.
- A video data creation method for creating third video data from: first video data recorded on a first recording medium and including a first video stream, first playback path information indicating a playback path of the first video stream, and first video information in which shooting information, including the shooting time at which the video contained in each block of the first video stream was shot, is associated with each such block; and second video data recorded on a second recording medium different from the first recording medium and including a second video stream, playback path information indicating a playback path of the second video stream, and second video information in which shooting information, including the shooting time at which the video contained in each block of the second video stream was shot, is associated with each such block,
the video data creation method comprising:
an acquisition step of acquiring the first video data, the second video data, and collation reference information that is recorded on the first recording medium and that indicates an element of the shooting information serving as a reference for collating blocks in the first video stream with blocks in the second video stream;
a detection step of detecting, from among the blocks in the first video stream and the blocks in the second video stream, a set of blocks for which, among the shooting information associated with each block, the element of the shooting information indicated by the collation reference information matches; and
a creation step of, when a matching set of blocks is detected in the detection step, creating the third video data including: playback path information indicating a playback path on which the blocks of the first video stream and the blocks of the second video stream detected as matching in the detection step can be played back such that the set of blocks is synchronized; the first video stream; the blocks of the second video stream detected as matching in the detection step; and the video information of the first video stream.
- A video data creation program that causes a computer to execute video data creation processing for creating third video data from: first video data recorded on a first recording medium and including a first video stream, first playback path information indicating a playback path of the first video stream, and first video information in which shooting information, including the shooting time at which the video contained in each block of the first video stream was shot, is associated with each such block; and second video data recorded on a second recording medium different from the first recording medium and including a second video stream, playback path information indicating a playback path of the second video stream, and second video information in which shooting information, including the shooting time at which the video contained in each block of the second video stream was shot, is associated with each such block,
the video data creation processing comprising:
an acquisition step of acquiring the first video data, the second video data, and collation reference information that is recorded on the first recording medium and that indicates an element of the shooting information serving as a reference for collating blocks in the first video stream with blocks in the second video stream;
a detection step of detecting, from among the blocks in the first video stream and the blocks in the second video stream, a set of blocks for which, among the shooting information associated with each block, the element of the shooting information indicated by the collation reference information matches; and
a creation step of, when a matching set of blocks is detected in the detection step, creating the third video data including: playback path information indicating a playback path on which the blocks of the first video stream and the blocks of the second video stream detected as matching in the detection step can be played back such that the set of blocks is synchronized; the first video stream; the blocks of the second video stream detected as matching in the detection step; and the video information of the first video stream.
- A recording medium on which is recorded a video data creation program that causes a computer to execute video data creation processing for creating third video data from: first video data recorded on a first recording medium and including a first video stream, first playback path information indicating a playback path of the first video stream, and first video information in which shooting information, including the shooting time at which the video contained in each block of the first video stream was shot, is associated with each such block; and second video data recorded on a second recording medium different from the first recording medium and including a second video stream, playback path information indicating a playback path of the second video stream, and second video information in which shooting information, including the shooting time at which the video contained in each block of the second video stream was shot, is associated with each such block,
the video data creation processing comprising:
an acquisition step of acquiring the first video data, the second video data, and collation reference information that is recorded on the first recording medium and that indicates an element of the shooting information serving as a reference for collating blocks in the first video stream with blocks in the second video stream;
a detection step of detecting, from among the blocks in the first video stream and the blocks in the second video stream, a set of blocks for which, among the shooting information associated with each block, the element of the shooting information indicated by the collation reference information matches; and
a creation step of, when a matching set of blocks is detected in the detection step, creating the third video data including: playback path information indicating a playback path on which the blocks of the first video stream and the blocks of the second video stream detected as matching in the detection step can be played back such that the set of blocks is synchronized; the first video stream; the blocks of the second video stream detected as matching in the detection step; and the video information of the first video stream.
- An integrated circuit for video data creation that creates third video data from: first video data recorded on a first recording medium and including a first video stream, first playback path information indicating a playback path of the first video stream, and first video information in which shooting information, including the shooting time at which the video contained in each block of the first video stream was shot, is associated with each such block; and second video data recorded on a second recording medium different from the first recording medium and including a second video stream, playback path information indicating a playback path of the second video stream, and second video information in which shooting information, including the shooting time at which the video contained in each block of the second video stream was shot, is associated with each such block,
the integrated circuit comprising:
an acquisition unit operable to acquire the first video data, the second video data, and collation reference information that is recorded on the first recording medium and that indicates an element of the shooting information serving as a reference for collating blocks in the first video stream with blocks in the second video stream;
a detection unit operable to detect, from among the blocks in the first video stream and the blocks in the second video stream, a set of blocks for which, among the shooting information associated with each block, the element of the shooting information indicated by the collation reference information matches; and
a creation unit operable, when the detection unit detects a matching set of blocks, to create the third video data including: playback path information indicating a playback path on which the blocks of the first video stream and the blocks of the second video stream detected as matching by the detection unit can be played back such that the set of blocks is synchronized; the first video stream; the blocks of the second video stream detected as matching by the detection unit; and the video information of the first video stream.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2009801397909A CN102217303A (zh) | 2008-11-13 | 2009-11-10 | 影像数据生成装置、影像数据生成方法、程序及其记录介质、集成电路 |
US13/119,805 US20110164858A1 (en) | 2008-11-13 | 2009-11-10 | Video data creation device, video data creation method, video data creation program, recording medium thereof, and integrated circuit |
JP2010537683A JPWO2010055638A1 (ja) | 2008-11-13 | 2009-11-10 | 映像データ作成装置、映像データ作成方法、および、映像データ作成用のプログラムとその記録媒体、集積回路 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008-290536 | 2008-11-13 | ||
JP2008290536 | 2008-11-13 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010055638A1 true WO2010055638A1 (ja) | 2010-05-20 |
Family
ID=42169784
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/005975 WO2010055638A1 (ja) | 2008-11-13 | 2009-11-10 | 映像データ作成装置、映像データ作成方法、および、映像データ作成用のプログラムとその記録媒体、集積回路 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20110164858A1 (ja) |
JP (1) | JPWO2010055638A1 (ja) |
CN (1) | CN102217303A (ja) |
WO (1) | WO2010055638A1 (ja) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101905638B1 (ko) * | 2012-05-15 | 2018-12-05 | 삼성전자주식회사 | 동영상 재생 장치 및 방법 |
EP3118854B1 (en) * | 2014-09-10 | 2019-01-30 | Panasonic Intellectual Property Corporation of America | Recording medium, playback device, and playback method |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004304486A (ja) * | 2003-03-31 | 2004-10-28 | Nec Corp | 動画像編集装置および動画像編集方法 |
JP2006109494A (ja) * | 2002-09-25 | 2006-04-20 | Matsushita Electric Ind Co Ltd | 再生装置、プログラム、再生方法 |
JP2007013939A (ja) * | 2005-05-30 | 2007-01-18 | Matsushita Electric Ind Co Ltd | メタデータ付与装置及びメタデータ付与方法 |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4226247B2 (ja) * | 2002-01-15 | 2009-02-18 | 富士フイルム株式会社 | 画像処理装置 |
JP3948979B2 (ja) * | 2002-02-18 | 2007-07-25 | パイオニア株式会社 | 情報記録媒体、情報記録装置及び方法、情報再生装置及び方法、情報記録再生装置及び方法、記録又は再生制御用のコンピュータプログラム、並びに制御信号を含むデータ構造 |
KR100548383B1 (ko) * | 2003-07-18 | 2006-02-02 | 엘지전자 주식회사 | 이동통신 시스템의 디지털 비디오 신호처리 장치 및 방법 |
2009
- 2009-11-10 JP JP2010537683A patent/JPWO2010055638A1/ja not_active Withdrawn
- 2009-11-10 WO PCT/JP2009/005975 patent/WO2010055638A1/ja active Application Filing
- 2009-11-10 CN CN2009801397909A patent/CN102217303A/zh active Pending
- 2009-11-10 US US13/119,805 patent/US20110164858A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
JPWO2010055638A1 (ja) | 2012-04-12 |
US20110164858A1 (en) | 2011-07-07 |
CN102217303A (zh) | 2011-10-12 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200980139790.9 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09825892 Country of ref document: EP Kind code of ref document: A1 |
|
DPE1 | Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101) | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2010537683 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 09825892 Country of ref document: EP Kind code of ref document: A1 |