WO2008047643A1 - Image processing program, image processing device and image processing method - Google Patents

Image processing program, image processing device and image processing method

Info

Publication number
WO2008047643A1
WO2008047643A1 (PCT/JP2007/069755)
Authority
WO
WIPO (PCT)
Prior art keywords
image data
time
data
moving image
cpu
Prior art date
Application number
PCT/JP2007/069755
Other languages
English (en)
Japanese (ja)
Inventor
Akihiko Obama
Original Assignee
Nikon Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nikon Corporation filed Critical Nikon Corporation
Priority to US12/311,705 priority Critical patent/US8837910B2/en
Priority to JP2008539757A priority patent/JP5168147B2/ja
Priority to EP07829493A priority patent/EP2073540A4/fr
Publication of WO2008047643A1 publication Critical patent/WO2008047643A1/fr


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00: Details of television systems
    • H04N 5/76: Television signal recording
    • H04N 5/765: Interface circuits between an apparatus for recording and another apparatus
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/50: Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F 16/58: Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G: PHYSICS
    • G11: INFORMATION STORAGE
    • G11B: INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B 27/00: Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B 27/02: Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B 27/031: Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B 27/034: Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00: Details of television systems
    • H04N 5/76: Television signal recording
    • H04N 5/765: Interface circuits between an apparatus for recording and another apparatus
    • H04N 5/77: Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
    • H04N 5/772: Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera, the recording apparatus and the television camera being placed in the same enclosure

Definitions

  • Editing processing of an image processing program according to the present invention may be configured as the following (i) to (iv).
  • Playback data may be created so that sound effects are generated and played back when the display of multiple images is switched.
  • FIG. 35 is a time chart showing a moving image displayed in the scene start priority mode in the modification.
  • FIG. 36 is a diagram for explaining the overall configuration of equipment used to provide a program product.
  • BEST MODE FOR CARRYING OUT THE INVENTION
  • A4 Constant time switching mode
  • the first priority video data A is shot during the time t3 to t4, and the second priority video data B is shot during the time t1 to t6.
  • FIG. 6 (g) shows the case where the shooting time of moving image data A selected as the first priority data is between time t1 and time t2, the shooting time of moving image data B selected as the second priority data is between time t5 and t6, and the shooting time of video data C is between time t3 and t4, that is, the case where the three video data do not overlap in the same time zone.
  • moving image data A is selected by the CPU 104 between time t1 and time t2, moving image data C between time t3 and time t4, and moving image data B between time t5 and time t6.
  • each set of moving image data is selected by the CPU 104 regardless of the priority data setting.
  • moving image data A is captured at times t2 to t3
  • moving image data B is image data captured at times t1 to t4.
  • CPU 104 selects moving image data B.
  • the recording start time of movie data A is reached.
  • the CPU 104 compares the shooting time lengths in the tag information of the moving image data A and B. As a result of the comparison, since the shooting time of the moving image data B is longer, the selection of the moving image data B is prioritized by the CPU 104. Thereafter, the selection of the video data B by the CPU 104 is continued until time t4. Therefore, video data A is not selected by the CPU 104.
  • moving image data A is captured at times t1 to t3
  • moving image data B is image data captured at times t2 to t4.
  • the video data A is selected by the CPU 104 at time t1.
  • Video data A and B overlap between times t2 and t3.
  • the CPU 104 compares the shooting times of the moving image data A and the moving image data B. Since the shooting time of the moving image data A is longer, the moving image data A is selected with priority by the CPU 104.
  • the moving image data B is selected by the CPU 104 at time t1.
  • time t2 which is the shooting start time of moving image data C
  • the CPU 104 selects moving image data C having the longest shooting time. After this time, there is no video data with a longer recording time than video data C, so selection of video data C by CPU 104 continues until time t6.
  • CPU 104 compares the shooting times of moving image data A, B, and C that overlap in the same time period. As a result of comparison, the CPU 104 selects the video data B with the longest shooting time. Thereafter, the selection of the moving image data B by the CPU 104 is continued until time t6. Therefore, the video data A and C having a short shooting time are not selected by the CPU 104.
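  • As a rough illustration of the long-time priority selection described above (this sketch is not part of the patent; the Clip class, function name, and times are purely illustrative assumptions), the rule "among the clips overlapping at a given time, select the one with the longest shooting time" could be written as follows:

```python
from dataclasses import dataclass

@dataclass
class Clip:
    name: str     # e.g. "A", "B", "C"
    start: float  # shooting start time
    end: float    # shooting end time

    @property
    def length(self) -> float:
        return self.end - self.start

def select_longest_priority(clips, t):
    """Among the clips whose shooting period covers time t,
    return the one with the longest shooting time (or None)."""
    active = [c for c in clips if c.start <= t < c.end]
    return max(active, key=lambda c: c.length) if active else None

# Illustrative times only, not the figures of the patent.
clips = [Clip("A", 2, 3), Clip("B", 1, 4), Clip("C", 3, 6)]
for t in (1.5, 2.5, 3.5, 5.0):
    chosen = select_longest_priority(clips, t)
    print(t, chosen.name if chosen else "-")
```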
  • A3 Scene start priority mode in switching display mode
  • moving image data A is taken at time t1 to t3, as in FIGS. 5 (d) and 6 (c) to 8 (c).
  • moving image data B is the image data taken from time t2 to t4.
  • CPU 104 selects video data A. Since the shooting start time of the moving image data B is reached at time t2, the moving image data B is selected by the CPU 104. When the shooting end time of the video data B comes at time t4, the CPU 104 finishes selecting the video data B.
  • the fixed time switching mode will be described with reference to FIG.
  • the CPU 104 switches the moving image data to be selected every switching time tm within the overlapping time.
  • the switching time tm is set from the menu screen by the operation of the input device 101 by the user.
  • the order in which the display of the movie corresponding to the movie data is switched is determined based on the shooting start time of each movie data. If the shooting time of the video data is less than the switching time tm within the overlapping time, the CPU 104 does not select the video data.
  • the moving image data B is selected by the CPU 104.
  • the moving image data C is selected by the CPU 104.
  • although it is the shooting start time of the video data A, the switching time tm has not yet elapsed from time t2. Therefore, the moving image data C remains selected by the CPU 104.
  • the moving image data Ab is selected by the CPU 104 at time t2a.
  • the moving image data Bc is selected by the CPU 104 at time t3a when the switching time tm has elapsed from time t2a.
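  • The fixed time switching behaviour described above can be sketched roughly as follows; this is purely illustrative, not the patent's implementation, and it approximates the ordering rules (clips are visited in shooting-start order and a clip is skipped if it is not active for at least tm at the current time):

```python
def fixed_time_switching(clips, tm, t_start, t_end):
    """clips: list of (name, start, end) shooting periods.  Rotate the
    selected clip every tm within [t_start, t_end), visiting clips in
    shooting-start order and skipping any clip that is not active for
    at least tm at the current time."""
    ordered = sorted(clips, key=lambda c: c[1])
    schedule, t, i = [], t_start, 0
    while t < t_end:
        picked = None
        for step in range(len(ordered)):             # at most one full round
            name, start, end = ordered[(i + step) % len(ordered)]
            if start <= t and end - t >= tm:         # active long enough
                picked = name
                i = (i + step + 1) % len(ordered)    # continue rotation after it
                break
        if picked is None:
            break
        schedule.append((t, picked))
        t += tm
    return schedule

# e.g. rotate among A, B and C every tm = 1.0 inside their common overlap
print(fixed_time_switching([("A", 0, 5), ("B", 1, 6), ("C", 2, 7)], 1.0, 2.0, 5.0))
```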
  • the parallel display mode will be described with reference to FIG. In FIG. 11, it is assumed that the moving image A is set to be displayed on the left screen of the monitor 105 and the moving image B is set to be displayed on the right screen of the monitor 105.
  • the CPU 104 selects moving image data A, which is priority data, as large-screen display image data.
  • the CPU 104 displays the moving image A corresponding to the moving image data A in the large screen display area of the monitor 105.
  • the CPU 104 does not display the small screen display area on the monitor 105. Since there is no image data having a higher priority than the moving image data A, the CPU 104 displays the moving image A in the large screen display area of the monitor 105 until time t4.
  • the video data A which is priority data
  • the image data B which is non-priority data
  • the CPU 104 selects video data A, which is priority data, as large screen display image data.
  • the CPU 104 displays the moving image A corresponding to the moving image data A in the large screen display area of the monitor 105.
  • the video Since there is no moving image data overlapping with data A, the small screen display area is not displayed on the monitor 105 by the CPU 104.
  • the CPU 104 displays the moving image A in the large screen display area of the monitor 105 until time t3.
  • movie data B which is non-priority data
  • the movie Ba corresponding to the movie data Ba is displayed in the small screen display area of the monitor 105.
  • the display of the moving image data A in the large screen display area of the monitor 105 by the CPU 104 is ended. Therefore, the moving image Bb corresponding to the moving image data Bb is displayed on the large screen display area of the monitor 105 by the CPU 104.
  • the small screen display area is not displayed on the monitor 105 after time t3.
  • the CPU 104 selects the second priority moving image data B as the large-screen display image data.
  • the CPU 104 displays the moving image Ba corresponding to the selected moving image data Ba in the large screen display area of the monitor 105.
  • the small screen display area is not displayed on the monitor 105.
  • CPU 104 selects video data C, which is non-priority data, as small-screen display image data, and the video Ca corresponding to this video data Ca is displayed in the small screen display area of the monitor 105.
  • FIG. 13 (c): as in FIG. 5 (d), FIG. 6 (c) to FIG. 9 (c), and FIG. 11 (c), the video data A was taken from time t1 to t3, and the video data B was taken from time t2 to t4.
  • the moving image data Bb is selected as the small screen display image data, and the moving image Bb corresponding to the moving image data Bb is displayed in the small screen display area of the monitor 105 by the CPU 104.
  • CPU 104 selects movie data A as the large screen display image data, and movie A corresponding to this movie data A is displayed in the large screen display area of monitor 105.
  • the CPU 104 switches the display of the small screen display area of the monitor 105 to the display of the moving image Cb corresponding to the moving image data Cb selected as the small screen display image data.
  • the video Bc corresponding to the video data Bc is not displayed by the CPU 104.
  • the still image data is the first priority data
  • the movie data A is the second priority data
  • the movie data B1, which is composed of the movie data B1a to B1d, and the movie data B2 are non-priority data.
  • the shooting time of movie data A is from time t3 to t5
  • the shooting time of movie data B1 is from time t1 to t7
  • the shooting time of movie data B2 is from time t8 to t9.
  • the shooting times of the still image data S1, S2, S3, S4, and S5 are times t2, t4, t5, t6, and t7, respectively.
  • the moving image data B1 is selected by the CPU 104, and the moving image B1a corresponding to the moving image data B1a is displayed on the monitor 105.
  • the CPU 104 selects the still image data S1.
  • By the CPU 104, the still image S1 is displayed on the monitor 105 instead of the moving image B1a.
  • the display of the still image S1 on the monitor 105 by the CPU 104 is terminated. Then, the CPU 104 selects the moving image data B1c, and the CPU 104 displays the moving image B1c corresponding to the moving image data B1c on the monitor 105 instead of the still image S1.
  • moving image data A is selected by the CPU 104, and the moving image Aa is displayed on the monitor 105 instead of the moving image B1c.
  • the still image data S2 is selected by the CPU 104, and the still image S2 is displayed on the monitor 105 by the CPU 104 instead of the movie Aa.
  • the still image data S3 is selected by the CPU 104. In this case, the still image S3 is displayed on the monitor 105 instead of the still image S2 by the CPU 104.
  • Fig. 16 (e) is an example in which only one display mode is selected in the moving image editing mode setting.
  • the still image data S1 to S5, which are the first priority data, take precedence over the video data B1, which is non-priority data, and are selected by the CPU 104.
  • of the video data A and video data B1 that overlap in the interval from time t3 to t4, video data A, which is the second priority data, has priority over video data B1, which is non-priority data.
  • there is a gap between the selected still image data S1 and S2 in the section from time t2a to time t4.
  • five still image data S1 to S5, each of which is reproduced for time ts, are edited as continuous still image data.
  • moving image data A is first priority data
  • still image data is second priority data
  • moving image data B is non-priority data.
  • the overlapping relationship between moving image data A, moving image data B, and still images S1 to S5 is the same as in FIG. 16 (e).
  • there is a gap between the selected still image data S1 and moving image data Aa in the section from time t2a to time t3.
  • the still image data S1 and the moving image data Aa are edited as continuous data.
  • there is a gap between the selected moving image data B and still image data S3 in the section from time t6 to time t7.
  • the still image data S3 is edited as still image data continuous with the moving image data Bb.
  • the still image simultaneous insertion mode when the fixed-time switching display mode is selected in the video editing mode setting will be described with reference to FIG. As described above, the switching time of moving image data is denoted tm.
  • moving image data A exists from shooting time t2 to t4, still image data S1 and S2 exist at shooting times t1 and t3, and movie data A and still image data S2 overlap in the same time zone.
  • the still image data S1 is selected by the CPU 104, and the still image S1 is displayed on the monitor 105.
  • the moving image data A is selected by the CPU 104, and the moving image A is displayed on the monitor 105.
  • the still image data S2 is selected by the CPU 104, and the still image S2 is displayed on the monitor 105 instead of the moving image A.
  • the still image data S2 is reproduced and displayed during the reproduction time ts. That is, the playback time ts is applied to the display time of the still image S2 that overlaps the moving image A.
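  • The still image simultaneous insertion just described (a still shot during a movie is shown for the playback time ts in place of the movie) could be sketched as below; this is an illustrative simplification with assumed names and times, not the patent's code:

```python
def insert_still(movie_start, movie_end, still_time, ts):
    """Split the display of a movie (shot from movie_start to movie_end)
    so that a still image shot at still_time is shown for the playback
    time ts in place of the movie."""
    segments = []
    if movie_start < still_time:
        segments.append(("movie", movie_start, still_time))
    segments.append(("still", still_time, still_time + ts))
    if still_time + ts < movie_end:
        segments.append(("movie", still_time + ts, movie_end))
    return segments

# Movie A from t2 to t4, still S2 shot at t3, playback time ts = 0.5
print(insert_still(2.0, 4.0, still_time=3.0, ts=0.5))
```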
  • moving image data A, moving image data Bl, moving image data B2, and still image data S1 to S6 are shown.
  • the shooting time for movie data A is from time t3 to t6, and the shooting time for movie data B1 is from time t1 to t9.
  • the shooting time of movie data B2 is between time t10 and t11.
  • the shooting times of the still image data S1, S2, S3, S4, S5 and S6 are times t2, t4, t5, t6, t7 and t8, respectively.
  • Movie data A and movie data B1 overlap between time t3 and time t6, and still image data S1 to S6 overlap with movie data B1.
  • the moving image data B1 is selected by the CPU 104, and the moving image B1 is displayed on the monitor 105 for the switching time tm.
  • time t2 which is the shooting time of the still image data S1
  • the switching time tm has not elapsed from time t1. Therefore, the still image data S1 is not selected by the CPU 104, and the selection of the moving image data B1 is continued.
  • a video B1b corresponding to the video data B1b is displayed on the monitor 105.
  • Images corresponding to a plurality of image data whose shooting dates and times overlap in the same time zone are displayed in parallel on the monitor 105 on two screens.
  • two different images displayed in parallel are selected by operating the input device 101. Selection of images to be displayed in parallel is performed on the menu screen displayed on the monitor 105 as described above. It is also possible to select whether the selected image is displayed on either the left or right screen of the monitor 105.
  • Assume that the shooting time t100 of the still image data is a specific time tr before the shooting start time t200 of the video data, and that the still image display starts at the time t300 given by (video shooting start time t200 - predetermined time ta), that is, the predetermined time ta before the video display. The display screen control before the moving image display is then performed as follows, according to whether the time tr is longer or shorter than the predetermined time ta.
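  • The quantities in the preceding item can be restated as a short, self-contained sketch (illustrative only; the helper name and return format are not from the patent):

```python
def still_display_start(t200, tr, ta):
    """t100 = t200 - tr is the shooting time of the still image,
    t300 = t200 - ta is the planned start of its display (ta before
    the video display starts).  The screen control before the video
    is chosen according to whether tr or ta is longer."""
    t100 = t200 - tr
    t300 = t200 - ta
    relation = "tr >= ta" if tr >= ta else "tr < ta"
    return {"t100": t100, "t300": t300, "relation": relation}

print(still_display_start(t200=20.0, tr=6.0, ta=4.0))
```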
  • In Fig. 21 (a), as in the case of Fig. 15 (a), the CPU 104 displays the still images S1, S2, and S3 corresponding to the still image data S1, S2, and S3, respectively, on the right screen of the monitor 105. A black screen is displayed on the left screen of the monitor 105.
  • CPU 104 selects moving image data B as the left screen display image data, and moving image B is displayed on the left screen of the monitor 105. Thereafter, the video B is displayed on the left screen of the monitor 105 by the CPU 104 until time t8. On the other hand, at time t1, the CPU 104 displays a black screen on the right screen of the monitor 105. At time t2, the CPU 104 selects the still image data S1 as the right screen display image data, and the still image S1 is displayed on the right screen of the monitor 105.
  • the selection of the priority camera is set by selecting the camera name displayed in the pull-down menu on the menu screen displayed on the monitor 105 as described above. In addition, it is possible to set so that still images are always played back and displayed in the small screen display area. This setting is also performed by operating the input device 101 from the menu screen displayed on the monitor 105.
  • the still image S1 is not displayed in the large screen display area of the monitor 105 by the CPU 104.
  • time t2a which is a time ta before time t2
  • still image data S1 is selected as large screen display image data by the CPU 104, and the still image S1 corresponding to this still image data S1 is displayed in the large screen display area of the monitor 105. At this time, the small screen display area is not displayed on the monitor 105.
  • the still image data S3 is selected as the small screen display image data by the CPU 104, and the display in the small screen display area of the monitor 105 is switched from the still image S2 to the still image S3.
  • the CPU 104 ends the display of the moving image A in the large screen display area of the monitor 105.
  • the CPU 104 ends the display of the small screen display area of the monitor 105.
  • the still image data S4 whose photographing time is time t6 is selected by the CPU 104 as large screen display image data. Therefore, the still image S4 is displayed in the large screen display area of the monitor 105 from time t5 for the reproduction time ts so as to be continuous with the moving image A.
  • FIG. 22 (c) is an example in which still image data is priority data and moving image data is non-priority data, and the overlapping relationship between moving image data A and still image data S1 to S4 is the same as in FIG. 22 (b).
  • the still image data S2 is selected as the large-screen display image data by the CPU 104, and the display is switched from the still image S1 to the still image S2 in the large-screen display area of the monitor 105.
  • the still image data S3 is selected as the large screen display image data by the CPU 104, and the display is switched from the still image S2 to the still image S3 in the large screen display area of the monitor 105.
  • the still image data S4 whose shooting time is the time t6 is selected by the CPU 104 as the large screen display image data. Therefore, the still image S4 is displayed in the large screen display area of the monitor 105 from time t4a to time t5a when the reproduction time ts elapses so as to be continuous with the still image S3.
  • the still image data S1 is not selected by the CPU 104.
  • the CPU 104 selects the moving image data A as the large screen display image data, and the moving image A is displayed in the large screen display area of the monitor 105.
  • the CPU 104 selects the still image data S1 as the small screen display image data, and the still image S1 is displayed in the small screen display area of the monitor 105.
  • the still image data S2 is selected as the small screen display image data by the CPU 104, and the display is switched from the still image S1 to the still image S2 in the small screen display area of the monitor 105.
  • FIG. 22 (e) shows a case where the overlapping relationship between the moving image data A and B and the still image data S1 to S5 is the same as that in FIG. 15 (d).
  • the CPU 104 selects the moving image data A as the large screen display image data, and the display in the large screen display area of the monitor 105 is switched to the moving image A having the highest priority. Furthermore, the moving image data B is selected as the small screen display image data by the CPU 104, and the display in the small screen display area of the monitor 105 is switched from the still image S1 to the video Bb corresponding to the second priority video data Bb. That is, the still image S1 is not displayed by the CPU 104.
  • the display of the moving image Bb in the small screen display area of the monitor 105 is switched to the still image S3 by the CPU 104. Thereafter, the CPU 104 continues to display the high-priority video Bc in the large screen display area of the monitor 105.
  • the still image data S4 is selected as the small screen display image data by the CPU 104, and the display in the small screen display area of the monitor 105 is switched from the still image S3 to the still image S4.
  • the still image data S5 is selected as the small screen display image data, and the playback display in the small screen display area of the monitor 105 is switched from the still image S4 to the still image S5 by the CPU 104.
  • the CPU 104 ends the display of the moving image Bc in the large screen display area of the monitor 105 and the still image S5 in the small screen display area of the monitor 105.
  • the CPU 104 ends the display of the still image S1 in the large screen display area of the monitor 105.
  • the moving image B1c corresponding to the moving image data B1c selected as the large screen display image data is displayed instead of the still image S1 by the CPU 104.
  • the display of the moving image B1b in the small screen display area of the monitor 105 is ended by the CPU 104. That is, the small screen display area is not displayed on the monitor 105.
  • the CPU 104 selects the moving image data A as the large screen display image data, and the display in the large screen display area of the monitor 105 switches from the moving image B1 to the moving image A corresponding to the moving image data A having a higher priority. Thereafter, the video A is displayed in the large screen display area of the monitor 105 by the CPU 104 until time t5, which is the shooting end time of the video data A. Further, at time t3, the moving image data B1d is selected as the small screen display image data by the CPU 104, and the moving image B1d is displayed in the small screen display area of the monitor 105.
  • the CPU 104 selects the still image data S2 as the small screen display image data, and the display in the small screen display area of the monitor 105 is switched from the video B1d to the still image S2.
  • the display of the moving image A in the large screen display area of the monitor 105 is ended by the CPU 104.
  • still image data S3 is selected as large screen display image data by the CPU 104, and the still image S3 is displayed in the large screen display area of the monitor 105 instead of the video Ab.
  • the CPU 104 accelerates the display start of the still image S5.
  • as the display start of the still image S5 is accelerated, the CPU 104 also advances the display start time of the video data B2 from time t9 to time t7 so that the large screen display area of the monitor 105 does not display only the still image.
  • the shooting time of movie data A selected as priority data is between time t3 and t5
  • the shooting time of movie data B1 is between times t1 and t7, and the shooting time of the video data B2 is between times t9 and t10.
  • the shooting times of still image data S1, S2, S3, S4, S5, S6 and S7 are times t2, t4, t5, t6, t8, t11 and t12, respectively.
  • moving image data A and B1 overlap between time t3 and time t5, and still image data S1 to S4 overlap with moving image data B1. It is assumed that the still image is set to be displayed in the small screen display area of the monitor 105.
  • the CPU 104 selects video data A, which has a higher priority than the video data B1, as the large screen display image data. In the large screen display area, the CPU 104 switches the display from the movie B1 to the movie A. At this time, in the small screen display area of the monitor 105, the display of the still image S1 corresponding to the still image data S1 selected as the small screen display image data by the CPU 104 continues.
  • the shooting time of still image data S5 is reached.
  • the CPU 104 does not select the still image S5 corresponding to the still image data S5.
  • the CPU 104 selects the moving image data B2 as the large screen display image data and the still image data S5 as the small screen display image data.
  • the CPU 104 displays the moving image B2 in the large screen display area of the monitor 105 and the still image S5 in the small screen display area of the monitor 105.
  • FIG. 26 (a) shows a case where program X generated based on moving image data A, B, and C overlaps with still image data S1 and S2.
  • the CPU 104 displays still images S1 and S2 corresponding to the still image data S1 and S2 on the monitor 105 prior to the start of program display. At this time, the display time of each of the still images S1 and S2 is the playback time ts.
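  • A minimal sketch of the program pre- and post-combination idea (discrete movies are collected into one continuous program and the overlapping stills are placed before or after it, each shown for ts) is given below; the function name, data layout, and times are illustrative assumptions, not the patent's implementation:

```python
def build_program_with_stills(movies, still_times, ts, mode="pre"):
    """movies: list of (name, start, end); still_times: shooting instants.
    Concatenate the discrete movies into one continuous program and place
    the stills before ("pre") or after ("post") it, each shown for ts."""
    program = sorted(movies, key=lambda m: m[1])                  # program X
    still_block = [("still", s, ts) for s in sorted(still_times)]
    movie_block = [("movie", name, end - start) for name, start, end in program]
    if mode == "pre":
        return still_block + movie_block
    return movie_block + still_block

print(build_program_with_stills([("A", 3, 6), ("B", 8, 10)],
                                still_times=[2, 7], ts=1.0, mode="pre"))
```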
  • A program for performing each process of FIGS. 29 to 32 is stored in a memory (not shown) in the CPU 104. When a signal for starting the application is input by operating the input device 101, the CPU 104 starts the application. Each step in FIG. 29 to FIG. 32 is a process executed based on a command from the CPU 104.
  • step S101 it is determined whether the moving image audio mode is selected. If the moving image audio mode is selected, an affirmative determination is made in step S101 and the process proceeds to step S102. If the moving image audio mode is not selected, that is, if the still image audio mode is selected, a negative determination is made in step S101, and the process proceeds to step S106.
  • step S103 it is determined whether or not the selection of the priority voice has been completed. If the moving image data to be the priority audio has been selected, an affirmative determination is made in step S103 and the process proceeds to step S104. In step S104, the priority voice mode is set, the processing in the subroutine is terminated, and the process returns to the calling source. If the moving image data to be the priority audio has not been selected, a negative determination is made in step S103, and step S103 is repeated.
  • If the still image audio setting is selected in step S101, the process proceeds to step S106.
  • step S106 it is determined whether or not the still image audio ON mode is selected. If the still image audio ON mode is selected, an affirmative decision is made in step S106 and the process proceeds to step S107. In step S107, the still image audio ON mode is set, the processing in the subroutine is terminated, and the process returns to the calling source. If the still image audio ON mode is not selected, that is, if the still image audio OFF mode is selected, a negative determination is made in step S106 and the process proceeds to step S108. In step S108, the still image audio OFF mode is set, the processing in the subroutine is terminated, and the process returns to the caller.
  • step S202 it is determined whether or not the still image simultaneous insertion mode is selected. If the still image simultaneous insertion mode is selected, an affirmative determination is made in step S202 and the process proceeds to step S204. If the still image simultaneous insertion mode is not selected, a negative determination is made in step S202, and the process proceeds to step S206.
  • step S204 is affirmed and the process proceeds to step S205.
  • step S205 the still image simultaneous insertion mode is set, the processing in the subroutine is terminated, and the process returns to the calling source. If the playback time ts is not set, the determination in step S204 is negative, and step S204 is repeated until the playback time ts is set.
  • In step S206, when the program pre- and post-combination mode is selected, the process proceeds to step S210.
  • step S210 it is determined whether the pre-insertion mode has been selected. If the pre-insertion mode is selected, step S210 is affirmed and the process proceeds to step S211.
  • step S211 the pre-insertion mode in the program pre- and post-combination mode is set, the processing in the subroutine is terminated, and the process returns to the calling source. If the pre-insertion mode is not selected, a negative determination is made in step S210 and the process proceeds to step S212.
  • step S212 it is determined whether the post-insertion mode has been selected. When the post-insertion mode is selected, an affirmative determination is made in step S212 and the process proceeds to step S213.
  • step S213 the post-insertion mode in the program pre- and post-combination mode is set, the processing in the subroutine is terminated, and the process returns to the calling source. If the post-insertion mode is not selected, that is, if the before/after insertion mode is selected, a negative determination is made in step S212 and the process proceeds to step S214.
  • step S214 the before/after insertion mode in the program pre- and post-combination mode is set, the processing in the subroutine is terminated, and the process returns to the calling source.
  • step S302 it is determined whether or not the priority camera setting mode is selected.
  • step S307 it is determined whether the long-time video priority mode is selected. If the long-time video priority mode is selected, step S307 is affirmed and the process proceeds to step S308. If the long-time video priority mode is not selected, a negative determination is made in step S307 and the process proceeds to step S311.
  • step S308 it is determined whether or not the combination display mode is selected.
  • step S308 is affirmed and the process proceeds to step S309.
  • step S309 the combination display mode in the long-time video priority mode is set, the processing in the subroutine is terminated, and the process returns to the calling source. If the combination display mode is not selected, that is, if the only-one display mode is selected, a negative determination is made in step S308 and the process proceeds to step S310.
  • step S310 the only-one display mode in the long-time video priority mode is set, the processing in the subroutine is terminated, and the process returns to the calling source.
  • If the simultaneous display mode is selected in step S301, the process proceeds to step S315.
  • step S315 it is determined whether or not the parallel display mode is selected. If the parallel display mode is selected, an affirmative determination is made in step S315 and the process proceeds to step S316. In step S316, the parallel display mode is set, the processing in the subroutine is terminated, and the process returns to the calling source. If the parallel display mode is not selected, that is, if the small screen display mode is selected, a negative determination is made in step S315 and the process proceeds to step S317.
  • step S317 it is determined whether or not the scene start priority mode has been selected. If the scene start priority mode is selected, an affirmative determination is made in step S317 and the process proceeds to step S318.
  • step S318 the scene start priority mode in the small screen display mode is set, the processing in the subroutine is terminated, and the process returns to the calling source. If the scene start priority mode is not selected, that is, if the priority camera setting mode is selected, a negative determination is made in step S317 and the process proceeds to step S319.
  • step S319 it is determined whether or not a priority camera has been set in the same manner as in step S303. If the priority camera is set, step S319 is affirmed and the process proceeds to step S320. If the priority camera is not set, step S319 is negatively determined, and step S319 is repeated until the priority camera is set. In step S320, the priority camera setting mode in the small screen display mode is set, the processing in the subroutine is terminated, and the process returns to the calling source.
  • the CPU 104 refers to the EXIF information and tag information of each of the plurality of image data read from the plurality of electronic cameras 200 to the HDD 102, and extracts, from among the plurality of image data, image data whose shooting dates and times overlap in the same time zone. The CPU 104 edits the extracted image data and creates reproduction data. The CPU 104 displays the reproduction data on the monitor 105. At this time, according to the editing mode set by the user's operation of the input device 101, the CPU 104 selects image data corresponding to the editing mode from among the plurality of image data. Therefore, when a user edits or appreciates images, the user no longer needs to manually select an image to be displayed on the monitor 105 from a plurality of images whose shooting dates and times overlap in the same time zone, which improves convenience.
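  • The core extraction step summarized above (grouping image data whose shooting periods, read from EXIF/tag information, overlap in the same time zone) can be sketched as follows; the data layout and function name are assumptions made for illustration, not the patent's implementation:

```python
def extract_overlapping(items):
    """Each item is a (name, start, end) tuple taken from EXIF / tag
    information; a still image uses start == end (its shooting instant).
    Returns groups of names whose shooting periods overlap."""
    items = sorted(items, key=lambda x: x[1])
    groups, current, current_end = [], [], None
    for name, start, end in items:
        if current and start <= current_end:
            current.append(name)
            current_end = max(current_end, end)
        else:
            if len(current) > 1:
                groups.append(current)
            current, current_end = [name], end
    if len(current) > 1:
        groups.append(current)
    return groups

data = [("A", 2, 3), ("B", 1, 4), ("C", 3, 6), ("S1", 2.5, 2.5), ("S2", 7, 7)]
print(extract_overlapping(data))   # [['B', 'A', 'S1', 'C']]
```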
  • a priority camera setting mode is provided as one of the editing modes.
  • the CPU 104 selects, from a plurality of image data whose shooting dates and times overlap in the same time zone, the image data selected as priority data, that is, the image data acquired by the camera with the highest priority. Therefore, since the image captured by the electronic camera 200 selected by the user as the priority camera is displayed on the monitor 105, it is possible to prevent a situation in which a desired image is dropped from editing when the user edits the image.
  • A fixed time switching mode is provided as one of the editing modes.
  • the CPU 104 switches the image data to be selected at fixed time intervals among a plurality of image data whose shooting dates and times overlap in the same time zone. That is, during one fixed period within the same time zone, the CPU 104 selects one set of image data, and during the next fixed period it selects a different set of image data.
  • the fixed time is the switching time tm when selecting moving image data, and the playback time ts when selecting still image data. Accordingly, a plurality of different images are switched at regular time intervals and displayed on the monitor 105, so that the user can obtain various information when editing the images.
  • the user can select either video priority or still image priority.
  • the CPU 104 selects moving image data from the plurality of extracted image data, and displays the moving image corresponding to the selected moving image data in the large screen display area of the monitor 105.
  • with still image priority, the CPU 104 selects still image data from among the plurality of image data and displays a still image corresponding to the still image data in the large screen display area of the monitor 105. Therefore, either a moving image or a still image can be displayed in the large screen display area of the monitor 105 according to the user's preference, which improves convenience when the user edits the image.
  • a scene start priority mode is provided in the small screen display mode of the simultaneous display mode.
  • the CPU 104 compares the shooting start times based on the EXIF information and tag information of the plurality of extracted image data. As a result of the comparison, the CPU 104 displays the image corresponding to the image data with the later shooting start time in the large screen display area of the monitor 105. Further, the image corresponding to the image data with the earlier shooting start time is displayed in the small screen display area of the monitor 105. Therefore, an image including new information different from the previous information is displayed in the large screen display area of the monitor 105, so that the editing operation becomes easy when the user edits the image. It can also be used for appreciation purposes.
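  • The scene start priority rule above (the clip that started most recently goes to the large screen display area, the others to the small screen display area) could be sketched as follows; the function name and data layout are illustrative assumptions, not the patent's code:

```python
def assign_screens(active_clips):
    """active_clips: (name, start, end) clips overlapping at the current
    time.  The clip with the latest shooting start time goes to the large
    screen display area; the remaining clips go to the small screen area."""
    if not active_clips:
        return None, []
    ordered = sorted(active_clips, key=lambda c: c[1], reverse=True)
    return ordered[0][0], [c[0] for c in ordered[1:]]

print(assign_screens([("A", 2, 5), ("B", 1, 6)]))   # ('A', ['B'])
```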
  • a moving image pre- and post-integration mode is provided.
  • the CPU 104 extracts video data and a plurality of still image data
  • the CPU 104 displays, on the monitor 105, the still images corresponding to the plurality of still image data before the display of the video corresponding to the video data starts or after it ends. Therefore, since the moving image and the still images are not displayed mixed together on the monitor 105, the editing work becomes easy when the user edits the image.
  • a program pre- and post-combination mode is provided.
  • the CPU 104 collects a plurality of different moving image data, that is, discrete moving image data, and creates one continuous program.
  • the CPU 104 displays the still images corresponding to the still image data on the monitor 105 before the display of the program starts or after it ends. Therefore, since the program and the still images are not displayed together on the monitor 105, the editing work is facilitated when the user edits the image.
  • a mode for reproducing still image audio, i.e., a still image audio priority mode, is provided.
  • in this mode, when the CPU 104 extracts still image data with audio and moving image data,
  • the moving image corresponding to the moving image data is displayed on the monitor 105, and the sound corresponding to the still image data with sound is reproduced by the speaker 106. Therefore, when important information is included in the sound corresponding to the still image data with sound, the sound can be reliably reproduced, so that desired information can be obtained when the user performs an editing operation.
  • a short-time video setting mode may be added as an image editing mode.
  • in the short-time video setting mode, it is set whether or not to display, on the monitor 105, the video corresponding to video data whose shooting time is less than the predetermined time. This setting is performed when the user performs a selection setting operation on the menu screen.
  • the short video setting mode is
  • the short-time video priority mode may be added to the switching display mode of the video editing process. In this case, even in the short-time video priority mode, the combination display mode and only one display mode are provided. If the short-time video priority mode is provided, a “short-time video priority mode” check box may be provided on the menu screen of FIG. 3 (a) so that it can be selected by a user operation.
  • the combination display mode and the one-only display mode will be described.
  • the video Ba corresponding to the video data Ba is displayed at times t1 to t2, the video A corresponding to the video data A is displayed at times t2 to t3, and the video Bc corresponding to the video data Bc is displayed at times t3 to t4.
  • the shooting start times and the shooting time lengths of the moving image data A and B are the same as those in the example shown in Fig. 7 (c).
  • the moving image data A is selected by the CPU 104.
  • Video data A and B overlap between times t2 and t3.
  • the CPU 104 compares the shooting time length of the moving image data A and the moving image data B.
  • the shooting time and the shooting time length of each of the moving image data A1, A2, and B are the same as in Fig. 7 (d).
  • the CPU 104 selects moving image data A1.
  • the CPU 104 compares the shooting time lengths of the moving image data A1 and the moving image data B. As a result of the comparison, the CPU 104 selects moving image data A1 having a short shooting time length.
  • time t3 and t4 there is no other moving image data other than moving image data B, so moving image data B is selected by CPU 104.
  • the CPU 104 compares the shooting time lengths of the moving image data A2 and the moving image data B. As a result of the comparison, the CPU 104 selects moving image data B having a shorter shooting time length than the moving image data A2.
  • moving image data A2 is selected by the CPU 104. Therefore, the video A1 corresponding to the video data A1 is displayed at times t1 to t3, the video Bb corresponding to the video data Bb is displayed at times t3 to t4, and the video Bc corresponding to the video data Bc is displayed at times t4 to t5.
  • the video A2b corresponding to the video data A2 is displayed at times t5 to t6.
  • the CPU 104 compares the shooting time lengths of the moving image data A, the moving image data B, and the moving image data C. As a result of the comparison, the CPU 104 selects the moving image data A having the shortest shooting time length.
  • the CPU 104 compares the shooting time lengths of moving image data B and moving image data C. As a result of the comparison, the CPU 104 selects moving image data B having a short shooting time length. After time t5, since there is no other moving image data overlapping with moving image data C, CPU 104 selects moving image data C between times t5 and t6.
  • the CPU 104 compares the shooting time lengths of the moving image data B and the moving image data C. As a result of the comparison, the CPU 104 selects moving image data C having a short shooting time length. After time t5, since there is no other moving image data overlapping with moving image data B, the CPU 104 selects moving image data B between times t5 and t6. Therefore, the video Ba corresponding to the video data Ba is displayed from time t1 to t2, the video Ca corresponding to the video data Ca is displayed from time t2 to t3, the movie A corresponding to the movie data A is displayed from time t3 to t4, the movie Cc corresponding to the movie data Cc is displayed between times t4 and t5, and the movie Bc corresponding to the movie data Bc is displayed between times t5 and t6.
  • FIG. 34 (b) it is assumed that the shooting start times and the shooting time lengths of the moving image data A and B are the same as in the example shown in FIG. 8 (b).
  • the CPU 104 compares the shooting time lengths of the overlapping video data A and B in the same time zone. As a result of the comparison, the CPU 104 selects the moving image data A having the shortest shooting time length. Therefore, the moving image data B is not selected by the CPU 104 at time tl.
  • the shooting start time t2 of the moving image data A is reached, the moving image data A is selected by the CPU 104. Thereafter, the selection of the video data A by the CPU 104 is continued until time t3.
  • Fig. 34 (b) is an example in which only one display mode is set. Unlike the combination display mode in Fig. 33 (b), the video data B is not selected by the CPU 104 at time t3. That is, the moving image data B is not selected by the CPU 104.
  • the shooting start times and the shooting time lengths of the moving image data A and B are the same as in the example shown in FIG. 8 (c).
  • the CPU 104 compares the shooting time lengths of the overlapping video data A and B in the same time zone. As a result of the comparison, the CPU 104 selects the moving image data B having the shortest shooting time length. Therefore, the moving image data A is not selected by the CPU 104 at the shooting start time tl of the moving image data A.
  • the shooting start time t2 of the moving image data B is reached, the moving image data B is selected by the CPU 104. Thereafter, the CPU 104 continues to select the video data B until time t4. That is, the moving image data A is not selected by the CPU 104.
  • Fig. 35 (a) shows a case where there is no other moving image data other than moving image data A that overlaps in the same time zone, as in Fig. 9 (a).
  • the moving image data A is selected by the CPU 104, and the moving image A corresponding to the moving image data A is reproduced and displayed on the monitor 105.
  • each of the following programs (1) to (24) may be installed alone or in combination, and the corresponding processing is realized by the CPU 104.
  • An image processing program in which the editing process includes an image data selection process for selecting one image data as selected image data from the extracted plurality of image data, and a reproduction data creation process for creating reproduction data so that one image is displayed on a single screen based on the selected image data.
  • the image data selection process selects the image data having the earliest shooting start time from among the plurality of extracted image data.
  • the image data selection means selects, based on the extracted plurality of image data, first image data which is one of the plurality of extracted image data for a first time zone within the same time zone of the shooting date and time, and selects other image data from among the plurality of extracted image data for a second time zone that is different from the first time zone within the same time zone of the shooting date and time.
  • the editing process … an image corresponding to image data captured by the low-priority camera.
  • the reproduction data is created so that the image is displayed in the large screen display area.
  • An image processing program in which the editing process further includes a process of collecting a plurality of discrete moving image data read in the reading process into a continuous program, extracting a plurality of still image data whose shooting dates and times overlap in the same time zone with the plurality of moving image data before the program editing processing, and creating the reproduction data so as to display the images corresponding to the plurality of extracted still image data before the reproduction start time or after the reproduction end time of the program.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Library & Information Science (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Television Signal Processing For Recording (AREA)
  • Management Or Editing Of Information On Record Carriers (AREA)

Abstract

The invention concerns an image processing program for causing a computer to execute a reading process for reading a plurality of image data obtained by shooting with a plurality of cameras, and an editing process for extracting, from among the read image data, image data whose shooting dates and times fall in the same time zone, and for generating reproduction data by editing the extracted image data.
PCT/JP2007/069755 2006-10-10 2007-10-10 Programme de traitement d'image, dispositif de traitement d'image et procédé de traitement d'image WO2008047643A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US12/311,705 US8837910B2 (en) 2006-10-10 2007-10-10 Image processing program, image processing device and image processing method
JP2008539757A JP5168147B2 (ja) 2006-10-10 2007-10-10 画像処理プログラム、画像処理装置および画像処理方法
EP07829493A EP2073540A4 (fr) 2006-10-10 2007-10-10 Programme de traitement d'image, dispositif de traitement d'image et procédé de traitement d'image

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2006-276189 2006-10-10
JP2006276189 2006-10-10

Publications (1)

Publication Number Publication Date
WO2008047643A1 true WO2008047643A1 (fr) 2008-04-24

Family

ID=39313882

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2007/069755 WO2008047643A1 (fr) 2006-10-10 2007-10-10 Programme de traitement d'image, dispositif de traitement d'image et procédé de traitement d'image

Country Status (4)

Country Link
US (1) US8837910B2 (fr)
EP (1) EP2073540A4 (fr)
JP (1) JP5168147B2 (fr)
WO (1) WO2008047643A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011197914A (ja) * 2010-03-18 2011-10-06 Brother Industries Ltd 通信システム、議事録作成方法、議事録作成装置、および議事録作成プログラム
JP2016100778A (ja) * 2014-11-21 2016-05-30 カシオ計算機株式会社 画像処理装置、画像処理方法及びプログラム

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101567814B1 (ko) * 2009-01-21 2015-11-11 삼성전자주식회사 슬라이드 쇼를 제공하는 방법, 장치, 및 컴퓨터 판독가능 저장매체
US8707158B2 (en) * 2009-08-05 2014-04-22 Microsoft Corporation Customizing a form in a model-based system
US20140263181A1 (en) 2013-03-15 2014-09-18 Jaeyoung Park Method and apparatus for generating highly repetitive pulsed plasmas
US10121123B1 (en) 2013-04-15 2018-11-06 Atomized Llc Systems and methods for managing related visual elements
KR20180013523A (ko) * 2016-07-29 2018-02-07 삼성전자주식회사 이미지의 유사도에 기초하여 이미지들을 연속적으로 표시하는 방법 및 장치
US10644968B1 (en) * 2016-12-09 2020-05-05 Tableau Software, Inc. Sampling in sliding windows with tight optimality and time decayed design
US11494900B2 (en) * 2019-07-26 2022-11-08 Case Western Reserve University Prognosis of prostate cancer with computerized histomorphometric features of tumor morphology from routine hematoxylin and eosin slides
US11659214B2 (en) * 2020-07-20 2023-05-23 Netflix, Inc. Automated workflows from media asset differentials
US11910050B2 (en) * 2021-05-21 2024-02-20 Deluxe Media Inc. Distributed network recording system with single user control

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11341449A (ja) * 1998-05-22 1999-12-10 Ricoh Co Ltd 放送型配信方法およびその方法をコンピュータに実行させるためのプログラムを記録したコンピュータ読み取り可能な記録媒体
JP2000244914A (ja) * 1999-02-18 2000-09-08 Matsushita Electric Ind Co Ltd 映像音声多重化カメラ装置
JP2000316146A (ja) * 1999-04-28 2000-11-14 Matsushita Electric Ind Co Ltd 遠隔監視システム
JP2004064396A (ja) 2002-07-29 2004-02-26 Fuji Photo Film Co Ltd 画像生成方法および装置並びにプログラム

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5751892A (en) * 1995-06-15 1998-05-12 Kabushiki Kaisha Toshiba Multi-scene recording medium and apparatus for reproducing data therefrom
EP1547082A1 (fr) 2002-09-26 2005-06-29 Koninklijke Philips Electronics N.V. Appareil d'enregistrement d'un fichier principal et de fichiers auxiliaires sur une piste d'un support d'enregistrement
US20040130635A1 (en) 2002-10-09 2004-07-08 Canon Kabushiki Kaisha Image processing apparatus and image processing method
JP2004328185A (ja) 2003-04-23 2004-11-18 Konica Minolta Photo Imaging Inc 再生情報ファイル生成プログラム
JP4228767B2 (ja) * 2003-04-25 2009-02-25 ソニー株式会社 再生装置、再生方法、再生プログラムおよび記録媒体
JP2004343472A (ja) 2003-05-16 2004-12-02 Konica Minolta Photo Imaging Inc 再生情報ファイル生成プログラム
JP2005123775A (ja) 2003-10-15 2005-05-12 Sony Corp 再生装置、再生方法、再生プログラムおよび記録媒体
JP4193743B2 (ja) * 2004-04-09 2008-12-10 ソニー株式会社 編集装置および方法、プログラム、並びに記録媒体
JP2006033155A (ja) * 2004-07-13 2006-02-02 Make Softwear:Kk 写真プリント提供装置、写真プリント提供方法および写真プリント提供プログラム
JP4649980B2 (ja) * 2004-12-21 2011-03-16 ソニー株式会社 画像編集装置、画像編集方法、プログラム
JP4881034B2 (ja) * 2005-02-28 2012-02-22 富士フイルム株式会社 電子アルバム編集システム、電子アルバム編集方法、及び電子アルバム編集プログラム

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11341449A (ja) * 1998-05-22 1999-12-10 Ricoh Co Ltd 放送型配信方法およびその方法をコンピュータに実行させるためのプログラムを記録したコンピュータ読み取り可能な記録媒体
JP2000244914A (ja) * 1999-02-18 2000-09-08 Matsushita Electric Ind Co Ltd 映像音声多重化カメラ装置
JP2000316146A (ja) * 1999-04-28 2000-11-14 Matsushita Electric Ind Co Ltd 遠隔監視システム
JP2004064396A (ja) 2002-07-29 2004-02-26 Fuji Photo Film Co Ltd 画像生成方法および装置並びにプログラム

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2073540A4 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011197914A (ja) * 2010-03-18 2011-10-06 Brother Industries Ltd 通信システム、議事録作成方法、議事録作成装置、および議事録作成プログラム
JP2016100778A (ja) * 2014-11-21 2016-05-30 カシオ計算機株式会社 画像処理装置、画像処理方法及びプログラム

Also Published As

Publication number Publication date
JP5168147B2 (ja) 2013-03-21
EP2073540A1 (fr) 2009-06-24
US20100027959A1 (en) 2010-02-04
US8837910B2 (en) 2014-09-16
JPWO2008047643A1 (ja) 2010-02-25
EP2073540A4 (fr) 2011-06-08

Similar Documents

Publication Publication Date Title
WO2008047643A1 (fr) Programme de traitement d'image, dispositif de traitement d'image et procédé de traitement d'image
JP4760438B2 (ja) 画像再生装置および画像再生プログラム
US20040095474A1 (en) Imaging apparatus using imaging template
JP5524653B2 (ja) 表示制御装置及びその制御方法
JP2010103765A (ja) 電子機器および映像処理方法
JP4389941B2 (ja) 再生制御装置および方法、並びにプログラム
JP4735388B2 (ja) 再生装置および方法、並びにプログラム
JP2013097455A (ja) 表示制御装置、表示制御装置の制御方法およびプログラム
US8750685B2 (en) Image processing apparatus
US20060070000A1 (en) Image display device and control method of the same
JP2006041888A (ja) 情報処理装置および方法、記録媒体、並びにプログラム
US8644685B2 (en) Image editing device, image editing method, and program
JP2005260749A (ja) 電子カメラ、及び電子カメラの制御プログラム
JP2008258926A (ja) 画像再生装置、画像再生プログラム、記録媒体、画像再生方法
US20100186026A1 (en) Method for providing appreciation object automatically according to user's interest and video apparatus using the same
JP2011078139A (ja) 画像再生装置、音楽加工プログラム、および画像再生プログラム
JP2009087520A (ja) 記録再生装置
JP4771085B2 (ja) コンテンツ情報表示装置
JP2011061707A (ja) コンテンツ再生装置およびコンテンツ再生方法
JP2008311808A (ja) 情報処理装置、情報処理方法、プログラム、および記録媒体
JP6529647B2 (ja) 表示制御装置およびその制御方法
JP4347779B2 (ja) 電子アルバム表示システム、電子アルバム表示方法、リモートコントローラ、及びリモートコントロールプログラム
TW202345606A (zh) 提供視角切換之影音播放系統及方法
JPH1118031A (ja) 映像プレゼンテーション及びプリント供給装置
Ozer Pinnacle Studio 11 for Windows: Visual QuickStart Guide

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07829493

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2008539757

Country of ref document: JP

WWE Wipo information: entry into national phase

Ref document number: 2007829493

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 12311705

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE