WO2015146242A1 - Image processing device - Google Patents

Image processing device

Info

Publication number
WO2015146242A1
Authority
WO
WIPO (PCT)
Prior art keywords
frame
frames
channel
similarity
comparison
Prior art date
Application number
PCT/JP2015/051364
Other languages
French (fr)
Japanese (ja)
Inventor
Koji Kita (北 耕次)
Original Assignee
NK Works Co., Ltd. (Nkワークス株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NK Works Co., Ltd.
Publication of WO2015146242A1 publication Critical patent/WO2015146242A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H04N5/91 Television signal processing therefor
    • H04N5/93 Regeneration of the television signal or of selected parts thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V20/46 Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames
    • G06V20/47 Detecting features for summarising video content
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/19 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B27/28 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • G11B27/30 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on the same track as the main recording
    • G11B27/3027 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on the same track as the main recording, wherein the used signal is digitally coded
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/34 Indicating arrangements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H04N5/91 Television signal processing therefor
    • H04N5/915 Television signal processing therefor for field- or frame-skip recording or reproducing

Definitions

  • The present invention relates to an image processing apparatus and an image processing program, and more particularly to an image processing apparatus and an image processing program for classifying frames by channel when frames belonging to different channels are mixed in one moving image.
  • An intermittent recording method called time lapse is known as a method for recording moving images. In this method, recording is performed at a frame rate lower than normal, that is, with frames dropped, which enables recording on the same recording medium for a longer time than usual. This time-lapse method is often employed in security cameras, and video from other security cameras may be inserted into the gap time created by lowering the frame rate. In that case, the recording is a single moving image in which videos from a plurality of cameras, that is, videos from a plurality of channels, alternate along one timeline.
  • A moving image in which videos from multiple channels are recorded on a single timeline can be played back on a monitor camera by camera, that is, channel by channel, using a dedicated device.
  • In some such systems, channel number data is assigned to each frame of the video; at playback time, by referring to this data, the video of any one channel can be extracted from a moving image in which videos of a plurality of channels are mixed.
  • An object of the present invention is to provide an image processing apparatus and an image processing program that enable reproduction channel by channel when frames belonging to different channels are mixed in one moving image, regardless of the specifications under which the moving image was recorded.
  • An image processing apparatus according to a first aspect is an image processing apparatus for classifying frames into their respective channels when frames belonging to different channels are mixed in one moving image, and includes an automatic classification unit.
  • The automatic classification unit performs image processing on a plurality of frames included in the moving image to calculate a similarity between frames based on a local color tendency and/or an overall color tendency, and classifies the plurality of frames into a plurality of channels according to the similarity.
  • That is, a function is provided for classifying these frames channel by channel. Specifically, image processing is first performed on a plurality of frames included in the moving image to calculate the similarity between frames, and the plurality of frames are automatically classified into a plurality of channels accordingly.
  • Here, the similarity between frames is determined based on the local and/or overall color tendency within the frame screen. According to this function, therefore, a moving image in which frames belonging to different channels are mixed can be separated into a moving image for each channel based on the local and/or overall color tendency within the frame screen, regardless of the specifications of the moving image. That is, when frames belonging to different channels are mixed in one moving image, reproduction channel by channel becomes possible regardless of the specifications under which the moving image was recorded.
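  • For illustration, the following is a minimal Python sketch of such similarity-based channel classification. The similarity function here (a comparison of mean RGB values) is a hypothetical stand-in for the local/overall color-tendency measure described above, and the greedy one-reference-per-channel loop is an assumed simplification of the automatic classification process, not the patent's prescribed algorithm.

```python
import numpy as np

def similarity(frame_a: np.ndarray, frame_b: np.ndarray) -> float:
    """Crude overall color-tendency similarity in [0, 1]:
    compares the mean RGB values of two H x W x 3 uint8 frames."""
    mean_a = frame_a.reshape(-1, 3).mean(axis=0)
    mean_b = frame_b.reshape(-1, 3).mean(axis=0)
    return 1.0 - float(np.abs(mean_a - mean_b).mean()) / 255.0

def classify(frames, threshold=0.95):
    """Greedy channel assignment: compare each frame against one
    reference frame per known channel; open a new channel when no
    reference is similar enough."""
    references = []   # one reference frame per detected channel
    labels = []       # channel index assigned to each frame
    for frame in frames:
        scores = [similarity(frame, ref) for ref in references]
        if scores and max(scores) >= threshold:
            labels.append(int(np.argmax(scores)))
        else:
            references.append(frame)           # new channel
            labels.append(len(references) - 1)
    return labels
```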
  • An image processing apparatus according to a second aspect is the image processing apparatus according to the first aspect, wherein the automatic classification unit includes a setting unit, a calculation unit, and a determination unit.
  • The setting unit sets, as a reference frame, a specific frame included in the moving image or a frame obtained by combining two or more specific frames included in the moving image, and sets another specific frame included in the moving image as a comparison frame.
  • The calculation unit divides each of the comparison frame and the reference frame into a plurality of partial areas, calculates a first color index for each partial area, derives the composition of the comparison frame and the composition of the reference frame based on the first color indices, and calculates the similarity between the two compositions.
  • The determination unit determines, based on the similarity, whether the comparison frame belongs to the same channel as the reference frame or as the frames from which the reference frame was combined.
  • Here, dividing a frame into a plurality of partial areas includes not only dividing the entire frame into a plurality of non-overlapping partial areas, but also dividing a part of the frame (for example, the frame excluding its outer border area) into a plurality of non-overlapping partial areas, and dividing the entire frame or a part of it into a plurality of partially overlapping partial areas.
  • In this configuration, the composition of each of the comparison frame and the reference frame is derived based on the color tendency, and whether or not the two frames come from the same channel is determined according to the similarity of the two compositions.
  • The composition is the arrangement of the elements (for example, a target object and the background) that appear in the frame screen. Therefore, based on the compositions of the frames, it can be determined whether the compared frames belong to the same channel.
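  • As a concrete, hypothetical example of this second aspect, the sketch below uses the mean brightness of each block as the first color index, treats the resulting block map as the frame's composition, and scores composition similarity by correlation. The grid size and the choice of index are assumptions for illustration, not values prescribed by the patent.

```python
import numpy as np

def block_index(frame: np.ndarray, rows: int = 4, cols: int = 4) -> np.ndarray:
    """First color index per partial area: mean brightness of each block
    in a rows x cols grid over the whole frame."""
    h, w = frame.shape[:2]
    gray = frame.astype(float).mean(axis=2)        # crude luminance
    idx = np.empty((rows, cols))
    for r in range(rows):
        for c in range(cols):
            idx[r, c] = gray[r * h // rows:(r + 1) * h // rows,
                             c * w // cols:(c + 1) * w // cols].mean()
    return idx

def composition_similarity(frame_a: np.ndarray, frame_b: np.ndarray) -> float:
    """Similarity of the two compositions, mapped to [0, 1] from the
    correlation of the block-index maps."""
    a, b = block_index(frame_a).ravel(), block_index(frame_b).ravel()
    if a.std() == 0 or b.std() == 0:   # flat frames carry no composition
        return 1.0 if np.allclose(a, b) else 0.0
    return float((np.corrcoef(a, b)[0, 1] + 1.0) / 2.0)
```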
  • An image processing apparatus according to a third aspect is the image processing apparatus according to the first aspect, wherein the automatic classification unit includes a setting unit, a calculation unit, and a determination unit.
  • The setting unit sets, as a reference frame, a specific frame included in the moving image or a frame obtained by combining two or more specific frames included in the moving image, and sets another specific frame included in the moving image as a comparison frame.
  • The calculation unit divides each of the comparison frame and the reference frame into a plurality of partial areas, calculates a first color index for each partial area, derives the compositions of the comparison frame and the reference frame based on the first color indices, divides the comparison frame and the reference frame into a plurality of new partial areas based on the compositions, calculates a second color index for at least one of the new partial areas in each of the comparison frame and the reference frame, and calculates the similarity between the second color index of the comparison frame and the second color index of the reference frame.
  • The determination unit determines, based on the similarity, whether the comparison frame belongs to the same channel as the reference frame or as the frames from which the reference frame was combined.
  • In this configuration, the composition of each of the comparison frame and the reference frame is derived based on the color tendency, and the comparison frame and the reference frame are each divided into a plurality of partial areas based on the composition. Whether or not the two frames come from the same channel is then determined according to the similarity between the color tendency in a partial area of the comparison frame and the color tendency in the corresponding partial area of the reference frame. Therefore, it can be determined whether the compared frames belong to the same channel based on a color tendency that takes the composition of the frames into account.
  • An image processing apparatus according to a fourth aspect is the image processing apparatus according to the first aspect, wherein the automatic classification unit includes a setting unit, a calculation unit, and a determination unit.
  • The setting unit sets, as a reference frame, a specific frame included in the moving image or a frame obtained by combining two or more specific frames included in the moving image, and sets another specific frame included in the moving image as a comparison frame.
  • The calculation unit divides each of the comparison frame and the reference frame into a plurality of partial areas, calculates a first color index for each partial area, derives the compositions of the comparison frame and the reference frame based on the first color indices, divides the comparison frame and the reference frame into a plurality of new partial areas based on the compositions, calculates a second color index for at least two of the new partial areas in each of the comparison frame and the reference frame, calculates for each frame a third color index that is the difference between the second color indices, and calculates the similarity between the third color index of the comparison frame and the third color index of the reference frame.
  • The determination unit determines, based on the similarity, whether the comparison frame belongs to the same channel as the reference frame or as the frames from which the reference frame was combined.
  • In this configuration, the composition of each of the comparison frame and the reference frame is derived based on the local color tendency, and the comparison frame and the reference frame are each divided into a plurality of partial areas based on the composition. Whether or not the two frames come from the same channel is then determined according to the similarity between the difference in color index among the partial areas of the comparison frame and the difference in color index among the partial areas of the reference frame.
  • The color index difference between the partial areas is, for example, a saturation difference or a density difference between those areas. Therefore, it can be determined whether the compared frames belong to the same channel based on a color tendency that takes the composition of the frames into account.
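  • A minimal sketch of this fourth aspect's third color index, assuming the two composition-derived partial areas are given as boolean masks and using mean saturation as the second color index (both assumptions chosen for illustration):

```python
import numpy as np

def mean_saturation(frame: np.ndarray, mask: np.ndarray) -> float:
    """Second color index of one partial area: mean HSV-style saturation
    of the pixels selected by the boolean mask."""
    rgb = frame.astype(float) / 255.0
    mx, mn = rgb.max(axis=2), rgb.min(axis=2)
    sat = np.where(mx > 0, (mx - mn) / np.maximum(mx, 1e-6), 0.0)
    return float(sat[mask].mean())

def third_index(frame, mask_a, mask_b) -> float:
    """Third color index: difference of the second color indices of two
    composition-derived partial areas (e.g., subject vs. background)."""
    return mean_saturation(frame, mask_a) - mean_saturation(frame, mask_b)

def third_index_similarity(frame_a, frame_b, mask_a, mask_b) -> float:
    """Frames whose area-to-area index differences agree score near 1,
    even if their overall exposure differs."""
    d = abs(third_index(frame_a, mask_a, mask_b)
            - third_index(frame_b, mask_a, mask_b))
    return 1.0 - min(d, 1.0)
```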
  • An image processing apparatus according to a fifth aspect is the image processing apparatus according to any one of the first to fourth aspects, wherein the automatic classification unit evaluates the color tendency based on at least one of density, brightness, luminance, saturation, and hue.
  • In this case, the color tendency of a frame can be determined based on at least one of density, saturation, and hue.
  • An image processing program according to a sixth aspect is an image processing program for classifying frames into their respective channels when frames belonging to different channels are mixed in one moving image.
  • The program causes a computer to perform image processing on a plurality of frames included in the moving image, calculate a similarity between frames with respect to a local color tendency and/or an overall color tendency, and classify the plurality of frames into a plurality of channels according to the similarity.
  • With this program, the same effect as the first aspect can be achieved.
  • That is, a moving image in which frames belonging to different channels are mixed can be separated into a moving image for each channel based on the local and/or overall color tendency within the frame screen. When frames belonging to different channels are mixed in one moving image, reproduction channel by channel becomes possible regardless of the specifications under which the moving image was recorded.
  • FIG. 1 is a block diagram of an image processing apparatus according to a first embodiment of the present invention.
  • A diagram of the basic screen before image data is captured.
  • A diagram of the basic screen after image data has been captured.
  • A flowchart showing the flow of the automatic classification process according to the first embodiment of the present invention.
  • A conceptual diagram explaining the method of calculating the similarity according to the second embodiment of the present invention.
  • A flowchart showing the flow of the automatic classification process according to the second embodiment of the present invention.
  • An image processing apparatus 1 shown in FIG. 1 is an embodiment of an image processing apparatus according to the present invention.
  • the image processing apparatus 1 is a general-purpose personal computer.
  • The image processing program 2, which is an embodiment of the image processing program according to the present invention, is provided stored on a computer-readable recording medium 60 such as a CD-ROM, a DVD-ROM, or a USB memory, and is installed from that medium.
  • the image processing program 2 is application software for supporting image processing for moving images and still images.
  • the image processing program 2 causes the image processing apparatus 1 to execute steps included in the operations described later.
  • The image processing apparatus 1 includes a display 10, an input unit 20, a storage unit 30, and a control unit 40. These units 10 to 40 are connected to one another via a bus line, cable 5, or the like, and can communicate as appropriate.
  • the display 10 is composed of a liquid crystal display or the like, and displays a screen or the like described later to the user.
  • the input unit 20 includes a mouse, a keyboard, and the like, and accepts an operation from the user for the image processing apparatus 1.
  • the storage unit 30 is a non-volatile storage area configured from a hard disk or the like.
  • the control unit 40 includes a CPU, a ROM, a RAM, and the like.
  • the image processing program 2 is stored in the storage unit 30.
  • a software management area 50 is secured in the storage unit 30.
  • the software management area 50 is an area used by the image processing program 2.
  • In the software management area 50, an original image area 51, a processed file area 52, and an impression degree definition area 53 are secured. The role of each of the areas 51 to 53 will be described later.
  • the control unit 40 virtually operates as the automatic classification unit 41 and the reclassification unit 45 by reading and executing the image processing program 2 stored in the storage unit 30.
  • The automatic classification unit 41 also operates as the setting unit 42, the calculation unit 43, and the determination unit 44. The operation of each of the units 41 to 45 will be described later.
  • When the control unit 40 detects that the user has performed a predetermined operation via the input unit 20, it activates the image processing program 2.
  • When the image processing program 2 is activated, a basic screen W1 (see FIG. 2) is displayed on the display 10. Note that the display of all elements such as screens, windows, and buttons shown on the display 10 is controlled by the control unit 40.
  • the basic screen W1 accepts an instruction for taking image data into the original image area 51 from the user.
  • the image data taken into the original image area 51 is a target of reproduction processing, image processing, and channel separation processing described later.
  • The control unit 40 captures image data from still image files or moving image files into the original image area 51.
  • a still image file is a data file in a still image format
  • a moving image file is a data file in a moving image format.
  • When capturing image data from still image files, the user operates the input unit 20 to specify one still image file or one folder.
  • In the former case, the control unit 40 prompts the user to input the address path and file name of the still image file in the storage unit 30.
  • In the latter case, the control unit 40 prompts the user to input the address path and folder name of the folder in the storage unit 30.
  • The control unit 40 then stores the specified still image file, or all the still image files in the specified folder, in the original image area 51 as a still image file group.
  • Note that the term "group" here does not imply a plurality of elements; the number of elements may be one.
  • When capturing image data from a moving image file, the user operates the input unit 20 to input the address path and file name of one moving image file in the storage unit 30.
  • When the control unit 40 detects that the user has specified a moving image file, it displays a moving image capture window (not shown) superimposed on the basic screen W1.
  • The moving image capture window accepts, from the user, selection of an arbitrary section out of the whole timeline of the specified moving image file.
  • When the control unit 40 detects that the user has selected a specific section via the input unit 20, it generates a still image file group corresponding one-to-one to the frames included in the selected section of the specified moving image file.
  • The control unit 40 stores this still image file group in the original image area 51. Therefore, in the present embodiment, the image data subjected to the playback processing, image processing, and channel separation processing described later is not a moving image file but still image files.
  • The control unit 40 recognizes the still image file group captured in the original image area 51 as being arranged along one timeline, regardless of whether the group is derived from a moving image file or from still image files. The arrangement is determined automatically from the attributes of the files.
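  • The one-to-one generation of still image files from a selected section can be sketched as follows; this is a hypothetical implementation using OpenCV, and the file naming scheme is an assumption.

```python
import cv2  # assumes OpenCV (opencv-python) is installed

def capture_section(video_path: str, start: int, end: int, out_dir: str) -> int:
    """Write one still image file per frame of the selected section
    [start, end] (0-indexed frame numbers), mirroring the capture step."""
    cap = cv2.VideoCapture(video_path)
    cap.set(cv2.CAP_PROP_POS_FRAMES, start)
    written = 0
    for n in range(start, end + 1):
        ok, frame = cap.read()
        if not ok:                  # section ran past the last frame
            break
        cv2.imwrite(f"{out_dir}/frame_{n:06d}.png", frame)
        written += 1
    cap.release()
    return written
```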
  • the control unit 40 displays the display window W2 (see FIG. 3) on the basic screen W1.
  • One display window W2 is created for each timeline of the still image file groups captured in the original image area 51.
  • When a display window W2 first opens, one still image file included in the still image file group captured in the original image area 51 (for example, the still image file of the first frame on the timeline) is displayed in it. Thereafter, as will be described later, the frame displayed in the display window W2 is switched in response to user operations.
  • the control unit 40 can reproduce a frame group belonging to the timeline corresponding to the display window W2 as a moving image in the display window W2.
  • On the basic screen W1, a window selection pull-down menu T1, a playback button T2, a frame advance button T3, a frame return button T4, and a timeline bar T5 are arranged.
  • the window selection pull-down menu T1 accepts selection of which display window W2 is active from the user.
  • a timeline corresponding to the active display window W2 is referred to as an active timeline
  • a frame group belonging to the active timeline is referred to as an active frame group.
  • a frame currently displayed in the active display window W2 is referred to as an active display frame.
  • The playback button T2 receives a command from the user to play back the active frame group as a moving image.
  • When the control unit 40 detects that the user has pressed the playback button T2 via the input unit 20, it sequentially displays the frames included in the active frame group in the active display window W2 in frame-advance format along the timeline. Playback starts from the active display frame at the time the playback button T2 is pressed.
  • the playback button T2 accepts a playback stop command from the user.
  • When the control unit 40 detects that the user has pressed the playback button T2 via the input unit 20 during playback, it fixes the display in the active display window W2 to the active display frame at that moment.
  • The frame advance button T3 and the frame return button T4 receive commands from the user to switch the active display frame to the next or the previous frame, respectively, one frame at a time along the active timeline.
  • the timeline bar T5 is an object that schematically shows the active timeline.
  • the timeline bar T5 is equally divided by the number of frames of the active frame group in the direction in which the bar extends.
  • the nth divided region from the left on the timeline bar T5 corresponds to the nth frame on the active timeline (n is a natural number).
  • the timeline bar T5 displays the divided area A1 corresponding to the selected frame group and the divided area A2 corresponding to the non-selected frame group in different modes.
  • the selected frame group is a frame group corresponding to the currently selected section on the active timeline.
  • the non-selected frame group is a frame group corresponding to a section that is not currently selected on the active timeline.
  • the timeline bar T5 accepts selection of an arbitrary section on the active timeline from the user.
  • the user can select any number of arbitrary frames from the active frame group by operating the divided areas on the timeline bar T5 via the input unit 20.
  • the control unit 40 recognizes the selected frame group as an object of image processing and channel separation processing described later.
  • the active display frame is switched to a frame corresponding to the most recently selected divided area.
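  • The mapping between a click position on the timeline bar T5 and a frame on the active timeline reduces to simple integer arithmetic, sketched below (pixel coordinates are an assumed representation):

```python
def frame_at(x: int, bar_width: int, frame_count: int) -> int:
    """Return n (1-indexed) such that a click at horizontal position x
    falls in the n-th of frame_count equal divided areas of the bar."""
    n = x * frame_count // bar_width + 1
    return min(max(n, 1), frame_count)   # clamp to the valid range

# e.g. on a 600 px bar showing 120 frames, a click at x = 300 selects
# frame_at(300, 600, 120) == 61
```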
  • the control unit 40 can execute a plurality of image processing modules such as noise removal, sharpness, brightness / contrast / saturation adjustment, image resolution, rotation, addition of characters / arrows / mosaic, and image averaging.
  • the image processing module is incorporated in the image processing program 2.
  • the user can select any one of the image processing modules in any order and any number of times by operating the basic screen W1 via the input unit 20. Each time the control unit 40 detects that the user has selected an image processing module, the control unit 40 executes the image processing module on the selected frame group.
  • As the image processing modules are executed, each frame is processed sequentially into a first-order frame, a second-order frame, a third-order frame, and so on.
  • The 0th-order frame corresponds to the still image file stored in the original image area 51.
  • The (m + 1)th-order frame corresponds to the still image file obtained after an image processing module is executed once on the still image file corresponding to the mth-order frame (m is an integer of 0 or more).
  • the control unit 40 sequentially generates still image files corresponding to the first and subsequent frames, and separately stores these still image files in the processed file area 52.
  • FIG. 4 is a conceptual diagram showing how still image groups belonging to one timeline are managed by the image processing program 2.
  • the N axis on the horizontal axis indicates the order of frames on the timeline
  • the M axis on the vertical axis indicates the order of processing.
  • a square corresponding to the coordinates (n, m) in the NM space in FIG. 4 represents the still image Q (n, m).
  • the still image Q (n, m) is an mth-order still image of the nth frame on the timeline (n is a natural number and m is an integer of 0 or more).
  • The control unit 40 manages, for each frame, the currently selected value of the coordinate m as the parameter m_s. Immediately after the still image file group is captured into the original image area 51, m_s has the initial value 0. Thereafter, each time an image processing module is executed, the m_s of the frame is incremented by 1. The user can also freely change the m_s of the selected frame group by performing a predetermined operation via the input unit 20. Executing an image processing module on a frame means executing it on the m_s-th-order still image of that frame; changing m_s therefore changes the target of the image processing module. Likewise, displaying a frame means displaying its m_s-th-order still image, so changing m_s also changes what is displayed in the active display window W2.
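  • The Q(n, m) bookkeeping can be modeled as below. This is a sketch of the data structure implied by the description; the decision to discard orders above m_s when a module is re-applied is an assumption.

```python
class TimelineImages:
    """Per-frame history of processed still images Q(n, m) together with
    the currently selected processing order m_s of each frame."""

    def __init__(self, originals):
        # originals[n] is the 0th-order still image of the (n+1)-th frame
        self.history = [[img] for img in originals]
        self.m_s = [0] * len(originals)

    def apply_module(self, n, module):
        """Run an image processing module on frame n's m_s-th image and
        select the result as the new (m_s + 1)-th order image."""
        src = self.history[n][self.m_s[n]]
        del self.history[n][self.m_s[n] + 1:]  # assumed: stale orders dropped
        self.history[n].append(module(src))
        self.m_s[n] += 1

    def displayed(self, n):
        """Displaying frame n means displaying its m_s-th order image."""
        return self.history[n][self.m_s[n]]
```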
  • the channel separation process is a process of separating a moving image in which videos belonging to different channels are mixed into moving images for each channel.
  • A moving image to be subjected to the channel separation process is typically a moving image in which a plurality of moving images recorded in time-lapse fashion are mixed on one timeline, in a format in which the video of another moving image is embedded in the gap time of each moving image (hereinafter referred to as a mixed time-lapse moving image).
  • a mixed time-lapse moving image is a moving image in which videos of a plurality of channels are alternately switched along a timeline.
  • The channel separation process is executed on the selected frame group, but a plurality of frames are required to execute it. Therefore, while only one frame is selected as the selected frame group, the operation button for starting the channel separation process is disabled, so the process cannot be started. Alternatively, the apparatus may be configured so that, even when only one frame is selected, all frames on the active timeline become the target of the channel separation process, provided the active timeline contains a plurality of frames. Hereinafter, the frame group subjected to the channel separation process is referred to as the target frame group, and the moving image composed of the target frame group is referred to as the target moving image.
  • The channel separation process includes an automatic classification process, which automatically classifies the plurality of frames included in the target moving image into a plurality of channels, and a reclassification process, in which the user corrects the result of the automatic classification process by manual operation.
  • the automatic classification process is executed by the automatic classification unit 41.
  • the automatic classification process is a process of automatically classifying these frames into a plurality of channels by performing image processing on a plurality of frames included in the target moving image.
  • the automatic classification unit 41 calculates the similarity between frames, and classifies a plurality of frames included in the target moving image into a plurality of channels according to the similarity.
  • the similarity between frames is calculated as an indicator that the compared frames belong to the same channel.
  • a plurality of channels are detected as a result of the automatic classification process. A plurality of frames belong to each channel.
  • a plurality of channels are usually detected.
  • When the automatic classification process ends, each frame included in the target moving image is labeled either with the channel name of one of the channels or as "unclassified".
  • the reclassification process is executed by the reclassification unit 45.
  • The reclassification process is a process of individually classifying arbitrary frames included in the target moving image into arbitrary channels in accordance with user operations. Specifically, in the reclassification process, unclassified frames that were not classified into any channel by the automatic classification process can be individually classified into specific channels, and frames classified into an incorrect channel by the automatic classification process can be moved to the correct channel. Therefore, when the user judges that the result of the automatic classification process is wrong, the user can correct the channel to which a problematic frame belongs on a frame-by-frame basis. In the reclassification process, it is also possible to create a new channel or to integrate a plurality of channels into one channel in accordance with user operations. In short, the reclassification process serves for manual correction of the result of the automatic classification process.
  • When the automatic classification unit 41 detects that the user has performed a predetermined operation via the input unit 20, it starts the channel separation process.
  • When the channel separation process starts, the setting unit 42 first displays an area setting window W3 (see FIG. 5), which shows the selected frame, superimposed on the basic screen W1.
  • The area setting window W3 is a screen that accepts, within the selected frame, designation of the area used as the reference for calculating the similarity between frames in the automatic classification process.
  • the selected frame is one frame included in the target frame group.
  • the selected frame can be a top frame along the timeline in the target frame group or an active display frame.
  • the active display frame is the frame most recently selected on the timeline bar T5, and thus is always included in the target frame group.
  • A date and time display area is often provided in the frame screen (the lower left portion in the example of FIG. 5). If the similarity between frames is determined using information on the entire frame including the date/time display area, the similarity may be determined to be low even for frames belonging to the same channel. Conversely, a black border is sometimes provided in the frame screen; if the similarity between frames is determined using information on the entire frame including the black border, the similarity may be determined to be high even for frames belonging to different channels.
  • Similarly, when the target moving image is a mixed time-lapse moving image composed of video from security cameras installed in different elevators, frames belonging to different channels are easily judged similar, because the scenery inside the elevators looks alike.
  • The area setting window W3 is provided to avoid such situations: the similarity is calculated only within the designated area, excluding areas that may cause erroneous determination, so that the similarity between frames calculated by the automatic classification process can be an accurate indicator of belonging to the same channel.
  • The user therefore designates, in the selected frame on the area setting window W3, an area that characterizes the background of each channel. In the elevator example, an area showing a poster or the like that characterizes the scenery in each elevator may be designated.
  • A surrounding line 71 indicating the area designated by the user is displayed in the selected frame. This allows the user to confirm whether the correct area has been designated.
  • the cancel button 73 is a button for canceling area designation via the area setting window W3.
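  • In effect, the designated area acts as a mask on the similarity computation. A minimal sketch, assuming the area is a single rectangle given in pixel coordinates and using a mean absolute color difference as the similarity measure (both assumptions):

```python
import numpy as np

def masked_similarity(frame_a: np.ndarray, frame_b: np.ndarray,
                      region: tuple) -> float:
    """Similarity in [0, 1] computed only inside the designated area, so
    date/time overlays or black borders elsewhere cannot skew the result.
    region = (top, left, bottom, right) in pixels."""
    t, l, b, r = region
    crop_a = frame_a[t:b, l:r].astype(float)
    crop_b = frame_b[t:b, l:r].astype(float)
    return 1.0 - float(np.abs(crop_a - crop_b).mean()) / 255.0
```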
  • After the automatic classification process, the reclassification unit 45 displays the reclassification window W4 shown in FIG. 6 superimposed on the basic screen W1 as the user interface for reclassification.
  • On the reclassification window W4, a channel list area C1, a frame list area C2, a playback area C3, and a timeline bar C4 are arranged.
  • In the channel list area C1, channel objects 91 corresponding respectively to the channels detected by the automatic classification process are arranged.
  • the channel object 91 is in the form of an icon.
  • The channel name of the corresponding channel is displayed on each channel object 91.
  • Each channel object 91 also displays a thumbnail image of a representative frame that represents the plurality of frames belonging to the corresponding channel.
  • the representative frame is switched as appropriate in response to a user operation described later.
  • the default is, for example, the first frame along the timeline among the frames belonging to the corresponding channel.
  • When there are too many channel objects 91 to fit in the channel list area C1, a scroll bar appears in the area C1, and the area C1 is thereby substantially enlarged.
  • the channel object 91 also displays the number of frames belonging to the corresponding channel. The number of such frames changes in response to a user operation described later, but the display of the number of frames in the channel object 91 is also switched in real time in accordance with the change.
  • an unclassified object 92 is also arranged in the channel list area C1.
  • The unclassified object 92 is an object corresponding to the classification that collects unclassified frames, that is, frames in the target frame group that belong to no channel. This classification does not correspond to an actual channel, but in the sense that frames are grouped together, it virtually constitutes one channel. The unclassified object 92 therefore has a form similar to the channel object 91, and in this embodiment it is an icon of the same size. Specifically, on the unclassified object 92, "unclassified" is displayed instead of a channel name, and the number of unclassified frames is displayed instead of the number of frames belonging to a channel.
  • the display of the number of images in the unclassified object 92 is also switched in real time in accordance with the change in the number of unclassified frames.
  • the unclassified object 92 also displays thumbnail images of representative frames representing “unclassified” frames.
  • the representative frame is also switched as appropriate in response to a user operation to be described later.
  • the default is, for example, the first frame along the timeline among the frames belonging to the “unclassified” category.
  • a boundary line 93 is drawn between the unclassified object 92 and the channel object 91.
  • the reclassifying unit 45 accepts a selection operation for selecting any one object from all the channel objects 91 and the unclassified objects 92 in the channel list area C1 from the user. Of all the objects 91 and 92, the currently selected object is displayed in a different manner from the other objects. In the example of FIG. 6, the object 91 of the channel CH01 is displayed in a different color from the other objects 91 and 92.
  • Hereinafter, the channel or classification corresponding to the currently selected object 91 or 92 is referred to as the selected channel.
  • In the frame list area C2, thumbnail images 94 of all frames belonging to the selected channel are displayed in a list.
  • When the number of frames belonging to the selected channel is so large that not all thumbnail images 94 can be arranged in the frame list area C2, a scroll bar 95 appears in the area C2, and the area C2 is thereby substantially enlarged. By switching the selection state of the objects 91 and 92, the user can thus selectively display, in the frame list area C2, all unclassified frames or all frames belonging to any one of the channels. All the unclassified and classified frames included in the target frame group can therefore be checked while making effective use of the limited space of the frame list area C2.
  • The thumbnail images 94 in the frame list area C2 are arranged in multiple rows, in chronological order from left to right and from top to bottom as the order n of the original frame on the timeline increases.
  • the reclassifying unit 45 manages a specific single frame belonging to each channel as an active frame, and also manages a specific single unclassified frame as an active frame.
  • a surrounding frame 96 is attached to the thumbnail image 94 of the active frame as a sign for distinguishing it from the thumbnail images 94 of other frames.
  • The thumbnail images 94 in the frame list area C2 are each recognized as an object, and the user can select any one of them by a click operation or the like. A plurality of thumbnail images 94 can also be selected simultaneously by repeating the click operation while pressing a specific key on the keyboard.
  • Each time such a selection operation is performed, the reclassifying unit 45 switches the active frame of the selected channel to the frame corresponding to the most recently selected thumbnail image 94. At this time, the surrounding frame 96 moves accordingly.
  • The active frame is linked with the thumbnail images displayed on the channel objects 91 and the unclassified object 92; each time the active frame of a channel or classification is changed, the thumbnail image on the corresponding object 91 or 92 is also switched. That is, in this embodiment, the representative frame of each channel and classification coincides with its active frame.
  • The thumbnail image 94 of the active frame in the frame list area C2 can be moved individually by a drag and drop operation so as to be dropped onto any of the objects 91 and 92 in the channel list area C1. When a plurality of thumbnail images 94 are selected simultaneously in the frame list area C2, they can be moved together onto any one of the objects 91 and 92. (Although this is a "move," the thumbnail image 94 disappears, as if inserted into a folder, when released on the target object 91 or 92.) In other words, the reclassification unit 45 accepts from the user an operation of associating an arbitrary thumbnail image 94 in the frame list area C2 with an arbitrary channel or classification.
  • When this association operation is performed, the frame corresponding to the moved thumbnail image 94 is reclassified into the channel or classification corresponding to the object 91 or 92 onto which it was moved. If the frame was originally classified into that channel or classification, the operation is ignored and no reclassification takes place.
  • In the channel list area C1, a newly created object 97 is arranged after the row of objects 91 and 92.
  • the newly created object 97 is an object for newly creating a channel, and is in the form of an icon in this embodiment.
  • the same operation as the above association operation can be performed not only on the objects 91 and 92 but also on the newly created object 97.
  • That is, the reclassification unit 45 accepts from the user an operation of individually moving the thumbnail image 94 of the active frame in the frame list area C2 onto the newly created object 97 by drag and drop.
  • A plurality of simultaneously selected thumbnail images 94 can likewise be moved together onto the newly created object 97. (Again, although this is a "move," the thumbnail image 94 disappears, as if inserted into a folder, when released on the newly created object 97.)
  • When this association operation is performed, a channel object 91 is newly created at the position where the newly created object 97 was displayed.
  • the newly created channel object 91 is an object representing a new channel whose elements are frames corresponding to one or a plurality of thumbnail images 94 moved by the association operation.
  • At the same time, the frames corresponding to the one or more moved thumbnail images 94 are reclassified into the new channel. Immediately after the channel object 91 is created, if only one thumbnail image 94 was moved, that thumbnail image 94 is displayed on the object 91; if a plurality were moved, one of them (for example, the last selected one) is displayed on the newly created channel object 91.
  • A channel name is also assigned and displayed as appropriate.
  • The newly created object 97 then moves to the position after the row of objects 91 and 92, which now includes the newly created channel object 91.
  • A reclassified frame is labeled anew with the channel name CH01, CH02, ... or "unclassified," according to the channel or classification into which it was reclassified.
  • the thumbnail image 94 of the reclassified frame is deleted from the frame list area C2.
  • the remaining thumbnail images 94 are aligned so as to fill the space where the deleted thumbnail images 94 were placed.
  • The reclassified frames always include the active frame. Therefore, after reclassification, the active frame of the selected channel is changed to the nearest of the following frames along the timeline or, if there is none, to the nearest of the preceding frames.
  • The change of the active frame of the selected channel is reflected as appropriate in the position of the surrounding frame 96 and in the thumbnail images displayed on the objects 91 and 92. In addition, the numbers of frames belonging to the source and destination channels and/or classifications of the reclassified frames are recalculated and reflected in the display of the corresponding objects 91 and 92.
  • the above various association operations can be realized in modes other than the drag and drop operation.
  • Specifically, the objects 91, 92, and 97 in the channel list area C1 each display, at the lower right, a character string indicating a specific key on the keyboard.
  • When the reclassifying unit 45 detects that a specific key displayed on an object 91, 92, or 97 has been pressed, it determines that the active frame of the selected channel at that time, that is, the frame marked with the surrounding frame 96 in the frame list area C2, has been associated with the object 91, 92, or 97 corresponding to that key. Subsequent processing is the same as when the association operation is performed by drag and drop.
  • The reclassification unit 45 can also integrate a plurality of channels (in this paragraph and the next, including the "unclassified" classification) in response to a user operation.
  • Specifically, any one of the objects 91 and 92 can be individually moved by a drag and drop operation so as to be dropped onto any other of the objects 91 and 92.
  • a plurality of objects 91 and 92 may be selected at the same time, and the plurality of objects 91 and 92 may be collectively moved onto another arbitrary one of objects 91 and 92.
  • the reclassification unit 45 accepts an operation for associating an arbitrary channel in the channel list area C1 with another arbitrary channel from the user.
  • When this association operation is performed, the channel corresponding to the moved object 91 or 92 is integrated into the channel corresponding to the destination object 91 or 92.
  • Specifically, all frames belonging to the channel corresponding to the moved object 91 or 92 are re-labeled with the channel name (CH01, CH02, ... or "unclassified") of the channel corresponding to the destination object 91 or 92.
  • the destination objects 91 and 92 become objects that represent a new integrated channel.
  • As the active frame of the integrated channel, the active frame of the destination object 91 or 92 continues to be used.
  • the reclassifying unit 45 calculates the number of frames belonging to the integrated channel and reflects it on the display on the objects 91 and 92.
  • In the channel list area C1, when a channel object 91 is moved, it disappears, and the remaining objects 91 and 97 are realigned to fill the vacated space. When the unclassified object 92 is moved, its frame count becomes 0, but the unclassified object 92 itself remains.
  • the order of arrangement of these objects 91 in the channel list area C1 can be freely changed by a drag and drop operation. For example, when it is desired to move the channel object 91 of the channel CH01 to the right side of the channel object 91 of the channel CH02 in the state of FIG. 6, the channel object 91 of the channel CH01 is placed between the channel objects 91 of the channels CH02 and CH03. Move it.
  • a frame group belonging to the selected channel can be reproduced as a moving image.
  • a play button 76, a frame advance button 77, and a frame return button 78 are arranged on the reclassification window W4.
  • The playback button 76 receives a command from the user to play back the frame group belonging to the selected channel as a moving image.
  • When the reclassifying unit 45 detects that the playback button 76 has been pressed, it sequentially displays the frames belonging to the selected channel in the playback area C3 in frame-advance format along the timeline. Playback starts from the active frame of the selected channel at the time the playback button 76 is pressed and proceeds at the frame rate corresponding to the selected channel. During playback, the frames displayed in the playback area C3 are switched in sequence, and at each switch the active frame managed by the reclassifying unit 45 is updated in real time to the newly displayed frame.
  • the playback button 76 receives a command to stop playback from the user.
  • When the reclassifying unit 45 detects that the playback button 76 is pressed during playback, it fixes the display in the playback area C3 to the active frame at that moment. While playback is stopped, the active frame of the selected channel is always displayed in the playback area C3. Accordingly, when the active frame is changed by a selection operation on a thumbnail image 94 in the frame list area C2 while stopped, the display in the playback area C3 also changes in real time.
  • During playback, the active frame is updated continuously, and the surrounding frame 96 in the frame list area C2 moves in real time to the position surrounding the thumbnail image 94 of the latest active frame.
  • the thumbnail images of the objects 91 and 92 corresponding to the selected channel are also updated in real time. That is, the display in the reproduction area C3 is synchronized with the position of the surrounding frame 96 and the display of the thumbnail images of the objects 91 and 92.
  • However, such synchronization need not be maintained during playback; in that case, for example, the position of the surrounding frame 96 may be moved and/or the thumbnail images of the objects 91 and 92 updated only when playback stops.
  • The frame advance button 77 and the frame return button 78 each accept a command from the user to switch the display in the playback area C3 to the next or the previous frame, one frame at a time, within the frame group belonging to the selected channel along the timeline.
  • the change of the display in the reproduction area C3 by the operation of the buttons 77 and 78 is also linked to the change of the active frame.
  • The playback functions described above are useful for finding frames misclassified into an existing channel. When a frame from a different channel suddenly appears during playback of the moving image, a human viewer can notice it immediately. In that case, playback is stopped at once, the thumbnail image 94 of the frame in question is located in the frame list area C2, and the thumbnail image 94 is moved to the correct channel by the association operation described above.
  • The frame rate can be calculated by various methods. For example, it can be calculated by dividing the number of frames belonging to the selected channel by the difference between the time of the first frame and the time of the last frame belonging to that channel. Alternatively, the frame rate of the selected channel can be calculated by the following formula: (frame rate of the moving image corresponding to the active timeline) × (number of frames belonging to the selected channel) ÷ (number of frames belonging to the active timeline).
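  • Both calculation methods follow directly from the description (times are in seconds; which method is used is a configuration choice):

```python
def frame_rate_from_times(times: list[float]) -> float:
    """Method 1: number of frames in the channel divided by the span
    between its first and last frame times."""
    span = times[-1] - times[0]
    return len(times) / span if span > 0 else 0.0

def frame_rate_from_share(timeline_fps: float,
                          channel_frames: int,
                          timeline_frames: int) -> float:
    """Method 2: (frame rate of the active timeline's moving image)
    x (frames in the channel) / (frames in the active timeline)."""
    return timeline_fps * channel_frames / timeline_frames
```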
  • The reclassification unit 45 recalculates the frame rate of the selected channel every time reclassification is performed and updates the display of the frame rate area C5 in real time. Near the frame rate area C5, a channel name area C6 showing the channel name or classification name of the selected channel is arranged. The user can therefore see exactly which channel the video displayed in the playback area C3 belongs to and which channel's frame rate is shown in the frame rate area C5.
  • the timeline bar C4 is an object that schematically shows the active timeline, like the timeline bar T5 on the basic screen W1.
  • The target frame group subjected to the channel separation process constitutes part or all of the active frame group. Since it is important that the active timeline can also be managed on the reclassification window W4, a timeline bar C4 similar to the timeline bar T5 on the basic screen W1 is arranged there.
  • the timeline bar C4 extends in the left-right direction and is equally divided in the left-right direction by the number of frames of the active frame group.
  • the nth divided area from the left on the timeline bar C4 corresponds to the nth frame on the active timeline (n is a natural number).
  • The reclassifying unit 45 displays a straight line 83 on the timeline bar C4 at the position of the divided area corresponding to the active frame of the selected channel, as a marker for that area. That is, the straight line 83 indicates the position on the active timeline of the frame displayed in the area C3 and marked with the surrounding frame 96 in the area C2.
  • a target range bar 85 is displayed below the timeline bar C4. Similar to the timeline bar C4, the target range bar 85 extends in the left-right direction over a range corresponding to the section occupied by the target frame group.
  • the timeline bar C4 can be expanded or contracted in the left-right direction.
  • A scale change bar 86 is arranged on the reclassification window W4; it is an object combining two objects, a slide groove 61 and a slider 62. The slide groove 61 is rendered as a groove extending linearly in the left-right direction, and the slider 62 is rendered as an object that slides left and right within the slide groove 61.
  • the user can reciprocate the slider 62 in the left-right direction along the slide groove 61 by operating the slider 62 via the input unit 20.
  • the reclassifying unit 45 changes the horizontal scale of the timeline bar C4 stepwise according to the horizontal position of the slider 62 in the slide groove 61.
  • As the slider 62 moves to the right, the horizontal scale of the timeline bar C4 becomes shorter stepwise; as the slider 62 moves to the left, the horizontal scale becomes longer stepwise.
  • When the horizontal scale of the timeline bar C4 is enlarged beyond a certain level, the timeline bar C4 changes to the thumbnail list C7 (see FIG. 7). When the timeline bar C4 or the thumbnail list C7 no longer fits in its display area, a scroll bar 87 for scrolling in the horizontal direction appears, and the display area is thereby substantially enlarged in the left-right direction.
  • The thumbnail list C7 is a horizontal arrangement of thumbnail images 88 of all frames belonging to the active timeline. The user can therefore check, on the reclassification window W4, frames of the active timeline other than the target frame group.
  • the nth thumbnail image 88 from the left on the thumbnail list C7 is a thumbnail image of the nth frame on the active timeline (n is a natural number).
  • a surrounding frame 89 is attached to the thumbnail image 88 of the active frame of the selected channel as a sign for distinguishing from the other thumbnail images 88.
  • the timeline bar C4 and the thumbnail list C7 accept the selection of an arbitrary frame on the active timeline from the user, similarly to the timeline bar T5.
  • The user can select an arbitrary frame from the active frame group by clicking a divided area on the timeline bar C4 or a thumbnail image 88 in the thumbnail list C7.
  • When a frame is selected, the reclassifying unit 45 determines whether the frame is included in the target frame group. If it is, the channel or classification to which the frame belongs becomes the selected channel, and at the same time the reclassifying unit 45 makes the frame the active frame of the selected channel.
  • the change of the selected channel and the active frame here is reflected in real time on the display of the channel list area C1, the frame list area C2, the reproduction area C3, the frame rate area C5, and the channel name area C6.
  • the positions of the straight line 83 and the surrounding frame 89 are also changed in real time.
  • Using the various functions described above, the user can reclassify the frames included in the target frame group. When the user judges that reclassification is complete, the user presses the OK button E5. In response, the reclassification unit 45 closes the reclassification window W4 and newly creates one or more moving images belonging to new timelines. The newly created moving images correspond one-to-one to the channels finally defined on the reclassification window W4 and selected by the user (hereinafter referred to as the final channels). The user selects the final channels by placing a selection mark in the selection box 98 arranged on each channel object 91, as shown in FIG. 6.
  • The default value of the selection box 98 is "selected." Therefore, for any channel for which the user determines that a moving image need not be created, the user only has to operate the selection box on the corresponding object 91 to change the state to "non-selected."
  • a selection box 99 is also arranged below the reproduction area C3.
  • the selection box 99 is an object for selecting whether or not to create a moving image of the selected channel (excluding the “unclassified” classification), and is synchronized with the selection box 98 on the channel object 91.
  • the reclassifying unit 45 refers to the selection state of the selection boxes 98 and 99 to determine the final channels.
  • the newly created video is obtained by arranging all the frames belonging to the corresponding final channel along a new timeline.
  • a new display window W2 corresponding to each of these new moving images is created.
  • the reclassifying unit 45 creates a still image file group corresponding one-to-one to the frame groups included in these new moving images, stores it in the processed file area 52, and handles it as 0th-order frames. That is, thereafter, these new moving images can be processed in the same manner as a still image file group captured into the original image area 51. Specifically, they can be reproduced in the same manner in a display window W2 and are likewise subject to the various image processing operations.
  • the reclassifying unit 45 may determine whether or not an unclassified frame remains and, if so, notify the user to that effect. In this case, for example, when it is determined that an unclassified frame remains, the reclassification unit 45 displays a confirmation window W5 as shown in FIG. 8 on the display 10. When the “Yes” button 74 is pressed, the process proceeds to the new-movie creation process described above; when the “No” button 75 is pressed, the confirmation window W5 is closed and the reclassification window W4 is restored. On the other hand, if it is determined that no unclassified frame remains, the process immediately proceeds to the new-movie creation process described above.
  • the automatic classification process starts when the start button 72 on the area setting window W3 is pressed.
  • the plurality of frames constituting the target frame group are represented as F_1, F_2, ..., F_J (J is an integer of 2 or more). Note that the order of the frames F_1, F_2, ..., F_J is the same as their order on the timeline.
  • in step S1, the automatic classification unit 41 reduces the size of each frame F_1, F_2, ..., F_J included in the target frame group.
  • the frames may be reduced so that the number of pixels in the horizontal and/or vertical direction becomes a specified number, or reduced by a predetermined ratio from the original size.
  • the aspect ratio may or may not be saved.
  • This reduction step speeds up the following processing and reduces the influence of noise.
  • the frames F_1, F_2, ..., F_J after reduction are also expressed as F_1, F_2, ..., F_J.
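As a concrete illustration of this reduction step, here is a minimal Python sketch, assuming the frames are held as NumPy arrays and that OpenCV is available; the target width of 160 pixels is an illustrative choice, not a value given in the embodiment.

```python
import cv2  # OpenCV, assumed available for resizing


def reduce_frames(frames, target_width=160):
    """Shrink each frame to a fixed width while preserving aspect ratio.

    Shrinking speeds up the block comparisons that follow and suppresses
    pixel-level noise, as described for step S1.
    """
    reduced = []
    for frame in frames:
        h, w = frame.shape[:2]
        scale = target_width / float(w)
        new_size = (target_width, max(1, int(round(h * scale))))
        # INTER_AREA is well suited to downscaling.
        reduced.append(cv2.resize(frame, new_size, interpolation=cv2.INTER_AREA))
    return reduced
```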
  • in step S2, the setting unit 42 defines a channel CH_1 and classifies the frame F_1 into the channel CH_1. Specifically, a label with the channel name CH_1 is given to the frame F_1.
  • the setting unit 42 also sets the frame F_1 as the reference frame G_1 representing the frame group belonging to the channel CH_1. Note that the number of channels can be increased at any time in step S10 described later.
  • the calculation unit 43, the determination unit 44, and the setting unit 42 repeat the following steps S3 to S10 for each comparison frame F_j, comparing it with the existing reference frames G_1, G_2, ..., G_L (L is the number of existing channels).
  • in step S3, the calculation unit 43 divides the entire comparison frame F_j into a specified number K of partial regions (blocks) D_1, D_2, ..., D_K (K is an integer of 2 or more) (see FIG. 10).
  • the “entire” comparison frame F_j here refers to the entire designated area if an area has been designated on the area setting window W3, and to the entire screen of the frame if no area has been designated.
  • the partial regions D_1, D_2, ..., D_K do not have to cover the comparison frame F_j exhaustively and without overlap; they may partially overlap one another.
  • the shape does not need to be a rectangle, and may be, for example, a circle or another polygon. Further, the shape and size can be uniform or different from each other.
  • in step S4, the calculation unit 43 divides each of the reference frames G_1, G_2, ..., G_L of the existing channels CH_1, CH_2, ..., CH_L into partial regions D'_1, D'_2, ..., D'_K (see FIG. 10).
  • the partial regions D'_1, D'_2, ..., D'_K are defined so as to occupy the same positions in the frame screen as the partial regions D_1, D_2, ..., D_K, respectively.
  • in step S5, as the local similarity Y_lk between each pair of partial regions D_k and D'_k, a correlation coefficient called normalized cross-correlation (ZNCC) is calculated; in other embodiments, other similarity measures such as the correlation coefficient NCC, or measures such as SSD and SAD, may be calculated instead.
  • the normalized cross-correlation is defined as ZNCC = Σ_i (A_i − Ā)(B_i − B̄) / √( Σ_i (A_i − Ā)² · Σ_i (B_i − B̄)² ), where A_i and B_i are the pixel values of the two regions being compared and Ā and B̄ are their respective averages.
  • in step S6, using only the partial regions D_k and D'_k having a high local similarity Y_lk, the degree of similarity between the comparison frame F_j and the reference frame G_l is calculated again to obtain the total similarity B_l (see FIG. 10).
  • here too, the normalized cross-correlation (ZNCC) is calculated as the total similarity B_l.
  • the total similarity B_l and the local similarity Y_lk may be calculated by different methods.
  • the reason why information on the partial regions D_k and D'_k having a low local similarity Y_lk is not considered in calculating the overall similarity B_l is as follows: frames of the same channel share the same background image but differ in their moving-body portions. If the overall similarity B_l were determined using the information of the entire frame when the proportion of the moving-body portion in the frame is large, the overall similarity B_l would become low even between frames of the same channel.
  • therefore, the partial regions D_k and D'_k having a low local similarity Y_lk, which are considered to contain many moving-body portions, are not used in calculating the total similarity B_l.
  • in this way, the influence of moving objects is reduced, and it can be accurately determined whether or not the compared frames belong to the same channel.
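As an illustration, the following Python sketch shows one way steps S3 to S6 could be realized for grayscale frames. The 4×4 block grid, the fraction of blocks kept, and the use of the mean of the surviving local ZNCC values as a stand-in for the re-calculated total similarity B_l are assumptions of this sketch, not details fixed by the embodiment.

```python
import numpy as np


def split_blocks(img, rows=4, cols=4):
    """Divide a grayscale frame into rows x cols rectangular blocks."""
    h, w = img.shape
    return [img[r * h // rows:(r + 1) * h // rows,
                c * w // cols:(c + 1) * w // cols]
            for r in range(rows) for c in range(cols)]


def zncc(a, b):
    """Zero-mean normalized cross-correlation of two same-sized regions."""
    a = a.astype(np.float64).ravel() - a.mean()
    b = b.astype(np.float64).ravel() - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return 0.0 if denom == 0 else float((a * b).sum() / denom)


def total_similarity(comp, ref, keep_ratio=0.5):
    """Approximate B_l: ZNCC per block pair (local similarity Y_lk),
    then evaluate again using only the blocks with high local similarity,
    so that blocks containing moving bodies are ignored."""
    local = [zncc(d, dp) for d, dp in zip(split_blocks(comp), split_blocks(ref))]
    kept = sorted(local, reverse=True)[:max(1, int(len(local) * keep_ratio))]
    return sum(kept) / len(kept)
```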
  • next, the determination unit 44 finds the maximum of the overall similarities B_1, B_2, ..., B_L calculated in step S6 and determines whether that maximum is greater than a predetermined threshold. If it is determined to be greater than the threshold, the process proceeds to step S8; if it is determined to be equal to or less than the threshold, the process proceeds to step S10.
  • in step S8, the determination unit 44 determines that the comparison frame F_j belongs to the channel CH_MAX.
  • here, MAX is the value of l that gives the maximum total similarity B_l. More specifically, the determination unit 44 assigns a label with the channel name CH_MAX to the comparison frame F_j, thereby classifying the comparison frame F_j into the channel CH_MAX. As a result, the comparison frame F_j belongs to the same channel as the frame group already belonging to the channel CH_MAX.
  • in step S9, the setting unit 42 updates the reference frame G_MAX of the channel CH_MAX. Specifically, the setting unit 42 combines the comparison frame F_j and the existing reference frame G_MAX into a new reference frame G_MAX.
  • a weighted average is employed as the mode of combination, such that the weight of the comparison frame F_j is larger than that of the existing reference frame G_MAX.
  • as a result, the reference frame G_MAX representing the frame group belonging to the channel CH_MAX becomes a composite image in which more weight is placed on the most recent frames.
  • alternatively, the comparison frame F_j may be set as the new reference frame G_MAX as it is. It is also possible to omit step S9 so as not to update the reference frame G_MAX.
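A minimal sketch of the weighted-average update of step S9, assuming NumPy arrays; the weight of 0.7 for the comparison frame is illustrative (the embodiment only requires that it exceed the weight of the existing reference frame).

```python
import numpy as np


def update_reference(ref, comp, comp_weight=0.7):
    """Blend the comparison frame F_j into the reference frame G_MAX.

    comp_weight > 0.5 biases the composite toward the newest frame,
    as described for step S9.
    """
    blended = comp_weight * comp.astype(np.float64) \
        + (1.0 - comp_weight) * ref.astype(np.float64)
    return np.clip(blended, 0, 255).astype(np.uint8)
```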
  • step S10 is executed when all of the total similarities B_1, B_2, ..., B_L calculated in step S6 are determined to be less than or equal to the predetermined threshold.
  • that is, step S10 is executed when the comparison frame F_j is not similar to the reference frame G_l of any channel CH_l.
  • in step S10, the setting unit 42 defines a new channel CH_(L+1) in addition to the existing channels CH_1, CH_2, ..., CH_L, and classifies the comparison frame F_j into the channel CH_(L+1). Specifically, a label with the new channel name CH_(L+1) is given to the comparison frame F_j.
  • the setting unit 42 also sets the comparison frame F_j as the reference frame G_(L+1) representing the frame group belonging to the new channel CH_(L+1).
  • in step S11, the existing channels CH_1, CH_2, ..., CH_L are corrected.
  • specifically, the calculation unit 43 calculates, over the entire frame, the overall similarity between each pair of the reference frames G_1, G_2, ..., G_L.
  • as in step S3, the entire frame here refers to the entire designated area when an area is specified on the area setting window W3, and to the entire frame screen when no area is specified.
  • here, the normalized cross-correlation (ZNCC) is calculated as this similarity.
  • however, this similarity may be calculated by a method different from those used for the total similarity B_l and the local similarity Y_lk. Then, if any of these similarities exceeds a predetermined threshold, the determination unit 44 integrates the channels CH_l corresponding to the reference frames G_l giving that similarity into one channel. Specifically, a label with the same channel name is re-assigned to all the frames F_j belonging to the channels CH_l to be integrated. Three or more channels CH_l may be integrated into one.
  • further, if a channel CH_l containing only one frame still exists after the channel integration, the determination unit 44 eliminates that channel CH_l. Specifically, the determination unit 44 re-assigns the “unclassified” label to the frames F_j belonging to all the channels CH_l to be eliminated.
  • in the present embodiment, a channel CH_l containing only one frame F_j is the target of elimination, but this reference value may be set to two, three, and so on. It is also possible to set an upper limit on the number of channels that can be created; in this case, channels can be eliminated in ascending order of frame count so that the number of channels does not exceed the upper limit.
  • finally, the determination unit 44 sequentially re-assigns channel names CH01, CH02, ... to the finally created channels CH_l, and updates the label given to each frame F_j with these new channel names. After that, the automatic classification process ends.
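A sketch of this correction step, reusing zncc() from the earlier sketch as the whole-frame similarity; the merge threshold of 0.9 and the dictionary-based bookkeeping are assumptions of this illustration.

```python
def correct_channels(refs, labels, merge_threshold=0.9):
    """Step S11 sketch: merge channels whose reference frames are similar,
    then eliminate channels that still hold only one frame.

    refs   -- dict mapping channel name to reference frame (grayscale array)
    labels -- dict mapping frame index to channel name
    """
    names = sorted(refs)
    # Merge: relabel every frame of a similar channel into the earlier one.
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if a in refs and b in refs \
                    and zncc(refs[a], refs[b]) > merge_threshold:
                for idx, ch in labels.items():
                    if ch == b:
                        labels[idx] = a
                del refs[b]
    # Eliminate: channels containing a single frame become "unclassified".
    for ch in list(refs):
        members = [idx for idx, name in labels.items() if name == ch]
        if len(members) <= 1:
            for idx in members:
                labels[idx] = "unclassified"
            del refs[ch]
    return refs, labels
```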
  • FIG. 11 shows the flow of automatic classification processing according to the second embodiment.
  • the second embodiment differs from the first embodiment only in the automatic classification algorithm. As can be seen by comparing FIG. 9 and FIG. 11, the only difference between the two automatic classification processes is that steps S24 to S26 are inserted in place of steps S4 to S6 in the second embodiment. Therefore, for the sake of simplicity, only this difference will be described below.
  • step S24 is executed after steps S1 to S3, which are similar to those in the first embodiment.
  • in step S24, the calculation unit 43 detects feature points in the partial regions D_1, D_2, ..., D_K of the comparison frame F_j.
  • it is preferable that the same number of feature points be detected in each of the partial regions D_1, D_2, ..., D_K.
  • the feature points in the comparison frame F_j are represented as P_1, P_2, ..., P_U (U is an integer of 2 or more).
  • in step S25, each partial region V_u is set as a region of a predetermined size centered on the feature point P_u. Then, the calculation unit 43 sets partial regions V'_1, V'_2, ..., V'_U for each of the reference frames G_1, G_2, ..., G_L of the existing channels CH_1, CH_2, ..., CH_L (see FIG. 12).
  • the partial regions V'_1, V'_2, ..., V'_U are defined so as to occupy the same positions in the frame screen as the partial regions V_1, V_2, ..., V_U, respectively.
  • because the feature points are detected for each of the partial regions D_1, D_2, ..., D_K, the feature points P_1, P_2, ..., P_U are detected not from a biased part of the screen of the comparison frame F_j but approximately evenly from the entire comparison frame F_j.
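The embodiment does not name a specific feature detector; the sketch below uses OpenCV's Shi-Tomasi corner detector purely as an illustration of per-block detection, with the grid size and per-block point count as assumptions.

```python
import cv2


def detect_even_features(gray, rows=4, cols=4, per_block=2):
    """Detect corners separately in each block D_k so that the feature
    points P_1..P_U are spread roughly evenly over the whole screen."""
    h, w = gray.shape
    points = []
    for r in range(rows):
        for c in range(cols):
            y0, y1 = r * h // rows, (r + 1) * h // rows
            x0, x1 = c * w // cols, (c + 1) * w // cols
            corners = cv2.goodFeaturesToTrack(
                gray[y0:y1, x0:x1], maxCorners=per_block,
                qualityLevel=0.01, minDistance=5)
            if corners is not None:
                # Shift block-local coordinates back to frame coordinates.
                for x, y in corners.reshape(-1, 2):
                    points.append((int(x) + x0, int(y) + y0))
    return points
```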
  • in step S26, as the local similarity Y_lu between each pair of partial regions V_u and V'_u, a correlation coefficient called normalized cross-correlation (ZNCC) is calculated; however, other similarities may be calculated, as in the first embodiment.
  • the overall similarities in the first and second embodiments are alike in that both represent the similarity over the entire frame, and therefore the same symbol B is used.
  • then, using only the partial regions V_u and V'_u having a high local similarity Y_lu within the entire frame, the degree of similarity between the comparison frame F_j and the reference frame G_l is calculated again to obtain the total similarity B_l.
  • here, the normalized cross-correlation (ZNCC) is calculated as the total similarity B_l, but other similarities may be calculated, as in the first embodiment.
  • the total similarity B_l and the local similarity Y_lu may be calculated by different methods.
  • the overall similarities B_1, B_2, ..., B_L are calculated in the above step S26.
  • the subsequent processing flow is the same as in the first embodiment.
  • in the second embodiment, information on areas other than the partial regions V_1, V_2, ..., V_U near the feature points P_1, P_2, ..., P_U is not considered, for the following reason: even between frames of different channels shot in similar scenes, the overall similarity B_l would become high if it were determined using the information of the entire frame. For example, in two images taken at different locations in the same store, the same uniformly colored wallpaper or floor may occupy most of the frame; in such a case, the overall similarity B_l becomes high.
  • FIG. 13 shows a flow of automatic classification processing according to the third embodiment.
  • the third embodiment is different from the first and second embodiments only in the automatic classification algorithm.
  • the only difference between the automatic classification processes according to the first and third embodiments is that, in the third embodiment, a subroutine S30 is inserted in place of steps S5 and S6. Therefore, for the sake of simplicity, only this difference will be described below.
  • in the first and second embodiments, a correlation coefficient is calculated as the similarity between frames.
  • in the third embodiment, by contrast, the similarity between frames is evaluated based on the color tendency of the frames.
  • as the color tendency of a frame, both an overall color tendency in the frame screen and a local color tendency in the frame screen are comprehensively considered. This will be specifically described below.
  • subroutine S30 is a process for evaluating various color indices indicating local or overall color tendencies in the frame screen; details thereof are shown in FIG. 17.
  • within the subroutine S30, density is taken into consideration in steps S31 to S33, saturation in steps S34 to S38, and hue in steps S39 to S43.
  • in steps S31 to S33, density is taken into consideration, but brightness or luminance may be used instead of density.
  • in steps S44 to S46 included in the subroutine S30, the overall impression degree of the frame is considered.
  • in step S31, the calculation unit 43 calculates the average density Hd of the entire comparison frame F_j.
  • the “entire” comparison frame F_j here refers to the entire designated area if an area has been designated on the area setting window W3, and to the entire screen of the frame if no area has been designated.
  • the average density is calculated as the average of the densities of all the pixels in the target area, and the density of each pixel is calculated as the average of the RGB values of that pixel.
  • in step S32, the calculation unit 43 compares the average density of each partial region D_k of the comparison frame F_j with the average density Hd of the entire frame; a value of “1” is given to a partial region whose average density is higher than Hd, and a value of “0” is given otherwise (see FIG. 14).
  • “1” and “0” are indicators of whether the density of the local region is relatively high or low with respect to the entire frame, and are color indices representing a local color tendency related to density (lightness) (hereinafter, density indices).
  • similarly, a density index is calculated for each partial region D'_k of each reference frame G_l (see FIG. 14). Incidentally, as shown in FIGS. 13 and 17, step S32 is repeatedly executed; therefore, for a reference frame G_l whose density indices have already been calculated, those values are referred to and a new calculation is omitted.
  • in step S33, the similarity Bd_l is calculated: the calculation unit 43 calculates, as the similarity Bd_l, the similarity between the distribution pattern of the density indices in the comparison frame F_j and the distribution pattern of the density indices in each reference frame G_l.
  • specifically, the number of pairs of partial regions D_k and D'_k whose density indices “1” and “0” do not match is counted, and the square of the value obtained by dividing this count by the number K of partial regions is used as the similarity Bd_l (see FIG. 14).
  • in general, the composition of an image is determined by the arrangement of a plurality of elements in the screen.
  • many images have a composition that is roughly divided into an area representing an object of interest and an area representing the background. That is, it can be considered that one of the regions with density index “1” and the regions with density index “0” represents the object of interest, and the other represents its background.
  • therefore, the distribution pattern of the density indices in the comparison frame F_j represents the composition of the comparison frame F_j, and the distribution pattern of the density indices in the reference frame G_l represents the composition of the reference frame G_l. Accordingly, the similarity Bd_l represents the similarity in composition between the comparison frame F_j and the reference frame G_l.
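A sketch of steps S31 to S33 for frames converted to a grayscale density image, reusing split_blocks() from the earlier sketch. Note that, read literally, Bd_l grows with the number of mismatching blocks, so under this reading a small value indicates similar compositions; that reading is an assumption of this illustration.

```python
import numpy as np


def density_indices(gray, rows=4, cols=4):
    """Steps S31-S32 sketch: '1' for each block whose average density
    exceeds the frame average Hd, '0' otherwise."""
    hd = gray.mean()
    blocks = split_blocks(gray, rows, cols)  # from the earlier sketch
    return np.array([1 if b.mean() > hd else 0 for b in blocks])


def composition_similarity(idx_comp, idx_ref):
    """Step S33 sketch: Bd_l as the squared ratio of mismatching
    density indices to the block count K."""
    mismatches = int(np.sum(idx_comp != idx_ref))
    return (mismatches / float(len(idx_comp))) ** 2
```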
  • in step S34, based on the distribution pattern of the density indices in the comparison frame F_j, the calculation unit 43 divides the comparison frame F_j into two partial regions R_1 and R_2 (see FIG. 15).
  • the partial region R_1 is the union of all the partial regions D_k having the density index “1” in the comparison frame F_j, and the partial region R_2 is the union of all the partial regions D_k having the density index “0” in the comparison frame F_j. That is, in step S34, based on the composition of the comparison frame F_j, the comparison frame F_j is divided into a region representing the object of interest and a region representing the background.
  • in step S35, similarly to step S34, based on the distribution pattern of the density indices in each of the reference frames G_1, G_2, ..., G_L of the existing channels CH_1, CH_2, ..., CH_L, the calculation unit 43 divides each of the reference frames G_1, G_2, ..., G_L into partial regions R'_1 and R'_2 (see FIG. 15).
  • the partial region R'_1 is the union of all the partial regions D'_k having the density index “1” in the reference frame G_l, and the partial region R'_2 is the union of all the partial regions D'_k having the density index “0” in the reference frame G_l. That is, in step S35, based on the composition of the reference frame G_l, the reference frame G_l is divided into a region representing the object of interest and a region representing the background.
  • next, the calculation unit 43 calculates three average values, one for each of R, G, and B, for each of the partial regions R_1 and R_2 in the comparison frame F_j. Subsequently, the calculation unit 43 calculates the saturation of the partial region R_1 from the three RGB average values of the partial region R_1, and the saturation of the partial region R_2 from the three RGB average values of the partial region R_2. The calculation unit 43 then calculates the relative saturation between the partial regions R_1 and R_2 (hereinafter, relative saturation). The relative saturation is calculated as the absolute value of the difference between the saturation of the partial region R_1 and the saturation of the partial region R_2. The relative saturation is a color index representing a color tendency related to saturation (hereinafter, saturation index).
  • similarly, the relative saturation of each reference frame G_l is calculated as the absolute value of the difference between the saturations of the partial regions R'_1 and R'_2.
  • then, the similarity Bs_l is calculated.
  • specifically, the similarity Bs_l is calculated as the square of the value obtained by dividing the difference between the two relative saturations by 255.
  • the division by 255 is performed in order to normalize the similarity Bs_l to a value between 0 and 1.
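A sketch of the saturation comparison described above; the regions are given as boolean masks, and max minus min of the region's RGB averages is used as the saturation measure, which is an assumption (the embodiment does not fix the saturation formula).

```python
import numpy as np


def region_saturation(img_rgb, mask):
    """Saturation of a region, computed from the region's three RGB
    averages; max - min of those averages is used here."""
    means = [float(img_rgb[..., ch][mask].mean()) for ch in range(3)]
    return max(means) - min(means)


def saturation_similarity(comp, ref, r1, r2, r1p, r2p):
    """Bs_l: squared difference of the two relative saturations,
    normalized by 255. r1, r2, r1p, r2p are boolean masks for the
    regions R_1, R_2, R'_1 and R'_2."""
    rel_comp = abs(region_saturation(comp, r1) - region_saturation(comp, r2))
    rel_ref = abs(region_saturation(ref, r1p) - region_saturation(ref, r2p))
    return ((rel_comp - rel_ref) / 255.0) ** 2
```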
  • in step S39, the calculation unit 43 divides the comparison frame F_j into two regions, a main region O_1 and a sub region O_2. Specifically, of the partial regions R_1 and R_2 calculated in step S34, the region with the larger area is set as the main region O_1 and the region with the smaller area as the sub region O_2 (see FIG. 16). That is, in step S39, based on the composition of the comparison frame F_j, the comparison frame F_j is divided into a region representing the object of interest and a region representing the background.
  • in step S40, the calculation unit 43 similarly divides each of the reference frames G_1, G_2, ..., G_L into two regions, a main region O'_1 and a sub region O'_2.
  • specifically, of the partial regions R'_1 and R'_2, the region with the larger area is set as the main region O'_1 and the region with the smaller area as the sub region O'_2 (see FIG. 16). That is, in step S40, based on the composition of the reference frame G_l, the reference frame G_l is divided into a region representing the object of interest and a region representing the background.
  • next, the calculation unit 43 calculates the average hue of the main region O_1 and the average hue of the sub region O_2.
  • the average hue is a color index representing a local color tendency related to hue (hereinafter, hue index), and is calculated as the average of the hues of all the pixels in the target region.
  • hue indices are likewise calculated for the main region O'_1 and the sub region O'_2 of each reference frame G_l; for a reference frame G_l whose hue indices have already been calculated, those values are referred to and a new calculation is omitted. Step S40 can be omitted in the same way.
  • then, the similarity Bh_1l is calculated as the square of the value obtained by dividing the difference between the hue indices of the main regions O_1 and O'_1 by 180.
  • similarly, the similarity Bh_2l is calculated: the square of the value obtained by dividing the difference between the hue indices of the sub regions O_2 and O'_2 by 180 is used as the similarity Bh_2l.
  • the division by 180 is performed in order to normalize the similarities Bh_1l and Bh_2l to values between 0 and 1.
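A sketch of the hue comparison; OpenCV's HSV conversion (H channel 0 to 179 for 8-bit images) is assumed, and the plain average is used for the hue index even though hue is circular, which is a simplification of this illustration.

```python
import cv2


def region_mean_hue(img_bgr, mask):
    """Hue index: average hue (OpenCV H channel, 0-179) of a masked region."""
    hue = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2HSV)[..., 0]
    return float(hue[mask].mean())


def hue_similarities(comp, ref, main_c, sub_c, main_r, sub_r):
    """Bh_1l and Bh_2l: squared, 180-normalized hue-index differences
    for the main regions (O_1 vs O'_1) and sub regions (O_2 vs O'_2)."""
    bh1 = ((region_mean_hue(comp, main_c) - region_mean_hue(ref, main_r)) / 180.0) ** 2
    bh2 = ((region_mean_hue(comp, sub_c) - region_mean_hue(ref, sub_r)) / 180.0) ** 2
    return bh1, bh2
```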
  • Subsequent steps S44 to S46 are steps for evaluating a color index indicating an overall color tendency in the frame screen.
  • in the present embodiment, various impression degrees Z_1, Z_2, ..., Z_I (I is an integer of 2 or more) are defined in order to evaluate the overall color tendency in the frame screen.
  • various kinds of information defining these impression degrees Z_1, Z_2, ..., Z_I are stored in the impression degree definition area 53 in the software management area 50.
  • each impression degree Z_i is associated with one or a plurality of color conditions, and a weight is defined for each of these color conditions. Note that the weights of the color conditions associated with the same impression degree Z_i total 1.
  • FIG. 18 shows an example of the impression degree Z_1, “natural”: the impression degree Z_1 “natural” is associated with the three color conditions “green”, “brown”, and “beige”, to which weights of 0.5, 0.25, and 0.25 are assigned, respectively.
  • for each color condition, evaluation values are defined over the density (lightness), saturation, and hue values.
  • for each pixel, the value of each color condition is calculated by deriving the evaluation values corresponding to the density, saturation, and hue values of the pixel and then multiplying these evaluation values together. Then, the value of the impression degree Z_i of the pixel is calculated as the weighted sum of the values of the respective color conditions, using the above weights.
  • the entire frame here refers to the entire designated area when an area is designated on the area setting window W3, and to the entire frame screen when no area is designated.
  • in step S44, the calculation unit 43 calculates the value of the impression degree Z_i for each pixel included in the entire frame of the comparison frame F_j.
  • the average of these values is then calculated, and this average is set as the value of the impression degree Z_i of the entire frame.
  • the value of the impression degree Z_i of the entire frame is a color index representing an overall color tendency of the frame with respect to the impression degree Z_i (hereinafter, impression index).
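A sketch of how an impression index could be computed from such definitions. The evaluation functions below are hypothetical stand-ins (for example, triangular membership curves would fit the description); only the multiply-then-weighted-sum structure is taken from the text.

```python
def pixel_impression(pixel_dsh, color_conditions):
    """Impression degree Z_i of one pixel.

    pixel_dsh        -- (density, saturation, hue) of the pixel
    color_conditions -- list of (weight, eval_d, eval_s, eval_h), where each
                        eval_* maps a component value to an evaluation value;
                        the condition value is the product of the three
                        evaluation values, and Z_i is their weighted sum.
    """
    d, s, h = pixel_dsh
    return sum(w * ed(d) * es(s) * eh(h) for w, ed, es, eh in color_conditions)


def frame_impression(pixels_dsh, color_conditions):
    """Impression index: average of the per-pixel Z_i values over the
    entire frame (or the designated area)."""
    values = [pixel_impression(p, color_conditions) for p in pixels_dsh]
    return sum(values) / len(values)
```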
  • in step S45, the impression indices are likewise calculated for each reference frame G_l. Since step S45 is repeatedly executed, for a reference frame G_l whose impression indices have already been calculated, those values are referred to and a new calculation is omitted.
  • then, in step S47, the total similarities B_1, B_2, ..., B_L are calculated from the various similarities described above.
  • the subsequent processing flow is the same as in the first embodiment.
  • the image processing program 2 can handle image processing for a wide variety of moving images.
  • the image processing program 2 can be used in a scene where an organization such as the police analyzes a surveillance video of a security camera for investigation of an incident.
  • for example, many security cameras may be installed in the same store, and their surveillance videos are often recorded in the form of a mixed time-lapse movie.
  • if a dedicated device is used at the store where the security cameras are installed, each channel included in the mixed time-lapse video can be separated; however, organizations such as the police do not possess dedicated devices corresponding to every model of every manufacturer, so playback becomes difficult. Therefore, the timeline separation process, which can separate a mixed time-lapse moving image into a moving image for each channel regardless of the specification of the moving image, can be utilized in such a scene.
  • the reclassification window W4 in the above embodiment may be composed of a plurality of windows.
  • a frame list area for displaying a list of frames may be provided for each of a plurality of channels and a classification of “unclassified”.
  • in order to use the screen space efficiently, it is preferable to display the frame groups as thumbnails in each frame list area, and to arrange the thumbnail images along the timeline.
  • in this case, the icon-shaped channel objects 91 and unclassified object 92 may be omitted: the frame list area for each channel can be used as the channel object for the association operation, and the frame list area for the “unclassified” classification can be used as the unclassified object for the association operation.
  • that is, by selecting an arbitrary frame in the “unclassified” frame list area and dragging and dropping it onto the frame list area (channel object) of an arbitrary channel, the frame can be classified into the channel associated with that area. The same applies when a frame included in a channel is moved to another channel or classification.
  • alternatively, a frame list area for displaying a list of the “unclassified” frame group and a frame list area for displaying a list of the frame group belonging to a channel may be provided, with the “unclassified” frame group always displayed in the former area. In this case, only the list of frames belonging to the currently selected channel may be displayed in the latter area.
  • the composition is determined based on the color index related to density, but the composition may be determined based on the color index related to hue and / or saturation, for example.
  • in the third embodiment, the similarity Bs_l is calculated based on the relative saturations between the partial regions R_1 and R_2 and between R'_1 and R'_2, but the difference in absolute saturation between the partial regions R_1 and R'_1 and the difference in absolute saturation between R_2 and R'_2 may instead be used as the similarity Bs_l.
  • similarly, the difference in density between the partial regions R_1 and R'_1 and the difference in density between the partial regions R_2 and R'_2 may be calculated as similarities and added to the total similarity B_l.
  • in the above embodiment, the area setting window W3 is displayed immediately after the start of the channel separation process so that an area can be set to avoid erroneous determination.
  • however, the area setting window W3 may be omitted so that no area can be set.
  • a button may be provided on the reclassification window W4 for setting the area and instructing the redo of the automatic classification process.
  • in this case, only when the result of the automatic classification process performed without setting an area is not appropriate (there are many erroneous determinations), the user presses the above button to display the area setting window W3, sets the area, and then executes the automatic classification process again.
  • it is also possible to display the area setting window W3 immediately after the start of the channel separation process so that the automatic classification process can be executed with an area set, while also providing the above button on the reclassification window W4.
  • in the above embodiment, a new reference frame G_MAX is obtained by combining the comparison frame F_j and the existing reference frame G_MAX by weighted averaging.
  • however, the method of updating the reference frame G_MAX is not limited to this. For example, for each pixel, a histogram of the pixel values of all frames included in the channel CH_MAX (including the comparison frame F_j at that time) may be created, and the most frequent pixel value may be used as the pixel value of that pixel in the reference frame G_MAX. Alternatively, the median pixel value of the histogram may be used as the pixel value of that pixel in the reference frame G_MAX.
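A sketch of these two alternative update rules for grayscale frames; np.apply_along_axis over a frame stack is slow but keeps the per-pixel histogram idea explicit.

```python
import numpy as np


def reference_from_mode(frames):
    """Per-pixel most frequent value over all frames of the channel."""
    stack = np.stack(frames).astype(np.uint8)  # shape (N, H, W)
    # For each pixel, build a 256-bin histogram and keep the mode.
    mode = np.apply_along_axis(
        lambda v: np.bincount(v, minlength=256).argmax(), 0, stack)
    return mode.astype(np.uint8)


def reference_from_median(frames):
    """Per-pixel median over all frames, the other alternative mentioned."""
    return np.median(np.stack(frames), axis=0).astype(np.uint8)
```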
  • <5-6> It may be determined, immediately after step S10, whether or not the number of channels L has reached a certain number, and the channel correction process may be performed when it is determined that the number has been reached. This is because the processing load becomes large if the number of channels L increases too much. However, even if a channel containing only one frame exists, it is preferable not to eliminate that channel if its one frame is temporally within a certain distance of the currently selected comparison frame F_j. This is because such a channel was created recently, and there is a high possibility that frames to be classified into it will appear later. If the number of channels L still exceeds a certain value after eliminating channels in this way, channels containing two or more frames may also be eliminated in ascending order of frame count, so that the number of channels is kept below the upper limit.

Abstract

Provided is an image processing device for classifying each frame into a channel when frames belonging to different channels are mixed within a single video. The image processing device is provided with an automatic classification unit. The automatic classification unit calculates the similarities between a plurality of frames in the video on the basis of local color trends and/or overall color trends by applying image processing to the frames and classifies the plurality of frames into a plurality of channels on the basis of the similarities.

Description

Image processing device
The present invention relates to an image processing apparatus and an image processing program, and more particularly to an image processing apparatus and an image processing program for classifying frames for each channel when frames belonging to different channels are mixed in one moving image.
An intermittent recording method called time lapse is known as a method for recording moving images. This is a method of recording at a lower frame rate than normal, so to speak dropping frames, and it enables recording on the same recording medium for a longer time than usual. Security cameras often employ this time-lapse method, and furthermore, video from another security camera may be inserted into the gap time created by lowering the frame rate. In this case, a moving image is recorded in which the videos of a plurality of cameras, that is, the videos of a plurality of channels, are alternately switched along one timeline.
Videos in which the videos of a plurality of channels are recorded together on one timeline can be displayed on a monitor for each camera, that is, for each channel, by using a dedicated device. In the dedicated device of Patent Document 1, channel number data is assigned to each frame of the video, and at the time of playback, by referring to this data, only the video of an arbitrary channel can be extracted from the video in which the videos of a plurality of channels are mixed.
JP 2002-319210 A
However, when a video in which the videos of a plurality of channels are mixed is played back on a general playback device, the video of each channel cannot be viewed. This is because the specifications for recognizing the channel of each frame differ for each manufacturer or model, and general playback devices do not support these specifications.
An object of the present invention is to provide an image processing apparatus and an image processing program that, when frames belonging to different channels are mixed in one moving image, enable playback for each channel regardless of the specifications under which the moving image was recorded.
An image processing apparatus according to a first aspect of the present invention is an image processing apparatus for classifying frames into channels when frames belonging to different channels are mixed in one moving image, and includes an automatic classification unit. The automatic classification unit calculates the similarity between frames based on a local color tendency and/or an overall color tendency by performing image processing on a plurality of frames included in the moving image, and classifies the plurality of frames into a plurality of channels according to the similarity.
Here, when frames belonging to different channels are mixed in one moving image, a function for classifying these frames for each channel is provided. Specifically, first, the similarity between frames is calculated by performing image processing on the plurality of frames included in the moving image, and the plurality of frames are automatically classified into a plurality of channels accordingly. Here, the similarity between frames is determined based on the local and/or overall color tendency in the frame screen. Therefore, according to this function, regardless of the specifications of the moving image, a moving image in which frames belonging to different channels are mixed can be separated into a moving image for each channel based on the local and/or overall color tendency in the frame screen. That is, when frames belonging to different channels are mixed in one moving image, playback for each channel becomes possible regardless of the specifications under which the moving image was recorded.
An image processing apparatus according to a second aspect of the present invention is the image processing apparatus according to the first aspect, wherein the automatic classification unit includes a setting unit, a calculation unit, and a determination unit. The setting unit sets, as a reference frame, a specific frame included in the moving image or a frame obtained by combining two or more specific frames included in the moving image, and sets another specific frame included in the moving image as a comparison frame. The calculation unit divides the comparison frame and the reference frame into a plurality of partial areas, calculates a first color index for each partial area, derives the compositions of the comparison frame and the reference frame based on the first color index, and calculates the similarity between the composition of the comparison frame and the composition of the reference frame. The determination unit determines, based on the similarity, whether or not the comparison frame belongs to the same channel as the reference frame or the frames from which it was combined.
Note that “dividing a frame into a plurality of partial areas” includes not only dividing the entire frame exhaustively and without overlap into a plurality of partial areas, but also dividing a part of the entire frame (for example, a central area excluding an outer frame area) exhaustively and without overlap into a plurality of partial areas, dividing the whole or part of the frame into a plurality of partially overlapping partial areas, and the like.
Here, the compositions of the comparison frame and the reference frame are derived based on the color tendency, and whether or not both frames originate from the same channel is determined according to the similarity of the two compositions. The composition is the arrangement of the elements (for example, an object of interest and the background) that appear in the frame screen. Therefore, based on the composition of the frames, it can be determined whether or not the compared frames belong to the same channel.
An image processing apparatus according to a third aspect of the present invention is the image processing apparatus according to the first aspect, wherein the automatic classification unit includes a setting unit, a calculation unit, and a determination unit. The setting unit sets, as a reference frame, a specific frame included in the moving image or a frame obtained by combining two or more specific frames included in the moving image, and sets another specific frame included in the moving image as a comparison frame. The calculation unit divides the comparison frame and the reference frame into a plurality of partial areas, calculates a first color index for each partial area, derives the compositions of the comparison frame and the reference frame based on the first color index, divides the comparison frame and the reference frame into a plurality of new partial areas based on the compositions, calculates a second color index in at least one of the new partial areas for each of the comparison frame and the reference frame, and calculates the similarity between the second color index of the comparison frame and the second color index of the reference frame. The determination unit determines, based on the similarity, whether or not the comparison frame belongs to the same channel as the reference frame or the frames from which it was combined.
Here, the compositions of the comparison frame and the reference frame are derived based on the color tendency. Further, the comparison frame and the reference frame are divided into a plurality of partial areas based on the compositions. Then, whether or not both frames originate from the same channel is determined according to the similarity between the color tendency in the partial area of the comparison frame and the color tendency in the partial area of the reference frame. Therefore, it can be determined whether or not the compared frames belong to the same channel based on a color tendency that takes the composition of the frames into account.
An image processing apparatus according to a fourth aspect of the present invention is the image processing apparatus according to the first aspect, wherein the automatic classification unit includes a setting unit, a calculation unit, and a determination unit. The setting unit sets, as a reference frame, a specific frame included in the moving image or a frame obtained by combining two or more specific frames included in the moving image, and sets another specific frame included in the moving image as a comparison frame. The calculation unit divides the comparison frame and the reference frame into a plurality of partial areas, calculates a first color index for each partial area, derives the compositions of the comparison frame and the reference frame based on the first color index, divides the comparison frame and the reference frame into a plurality of new partial areas based on the compositions, calculates second color indices in at least two of the new partial areas for each of the comparison frame and the reference frame, calculates, for each of the comparison frame and the reference frame, a third color index that is the difference between the second color indices, and calculates the similarity between the third color index of the comparison frame and the third color index of the reference frame. The determination unit determines, based on the similarity, whether or not the comparison frame belongs to the same channel as the reference frame or the frames from which it was combined.
Here, the compositions of the comparison frame and the reference frame are derived based on the local color tendency. Further, the comparison frame and the reference frame are divided into a plurality of partial areas based on the compositions. Then, whether or not both frames originate from the same channel is determined according to the similarity between the difference in the color index between the plurality of partial areas in the comparison frame and the difference in the color index between the plurality of partial areas in the reference frame. The difference in the color index between a plurality of partial areas is, for example, a difference in saturation or a difference in density between the partial areas. Therefore, it can be determined whether or not the compared frames belong to the same channel based on a color tendency that takes the composition of the frames into account.
An image processing apparatus according to a fifth aspect of the present invention is the image processing apparatus according to any one of the first to fourth aspects, wherein the automatic classification unit evaluates the color tendency based on at least one of density, brightness, luminance, saturation, and hue.
Here, the color tendency of a frame can be determined based on at least one of density, saturation, and hue.
An image processing program according to a sixth aspect of the present invention is an image processing program for classifying frames into channels when frames belonging to different channels are mixed in one moving image, and causes a computer to execute a step of calculating the similarity between frames regarding a local color tendency and/or an overall color tendency by performing image processing on a plurality of frames included in the moving image, and classifying the plurality of frames into a plurality of channels according to the similarity. Here, the same effects as in the first aspect can be achieved.
According to the present invention, regardless of the specifications of a moving image, a moving image in which frames belonging to different channels are mixed can be separated into a moving image for each channel based on the local and/or overall color tendency in the frame screen. That is, when frames belonging to different channels are mixed in one moving image, playback for each channel becomes possible regardless of the specifications under which the moving image was recorded.
FIG. 1 is a block diagram of an image processing apparatus according to a first embodiment of the present invention.
FIG. 2 is a diagram of a basic screen before image data is captured.
FIG. 3 is a diagram of the basic screen after image data has been captured.
FIG. 4 is a diagram showing a still image group belonging to one timeline.
FIG. 5 is a diagram showing an area setting window.
FIG. 6 is a diagram showing a reclassification window.
FIG. 7 is another diagram showing the reclassification window.
FIG. 8 is a diagram showing a confirmation window.
FIG. 9 is a flowchart showing the flow of automatic classification processing according to the first embodiment of the present invention.
FIG. 10 is a conceptual diagram explaining a method of calculating similarity according to the first embodiment of the present invention.
FIG. 11 is a flowchart showing the flow of automatic classification processing according to a second embodiment of the present invention.
FIG. 12 is a conceptual diagram explaining a method of calculating similarity according to the second embodiment of the present invention.
FIG. 13 is a flowchart showing the flow of automatic classification processing according to a third embodiment of the present invention.
FIG. 14 is a conceptual diagram explaining a method of calculating similarity according to the third embodiment of the present invention.
FIG. 15 is a conceptual diagram explaining another method of calculating similarity according to the third embodiment of the present invention.
FIG. 16 is a conceptual diagram explaining still another method of calculating similarity according to the third embodiment of the present invention.
FIG. 17 is a flowchart showing the flow of processing for evaluating a color tendency according to the third embodiment of the present invention.
FIG. 18 is a diagram explaining information defining an impression degree.
Hereinafter, an image processing apparatus and an image processing program according to some embodiments of the present invention will be described with reference to the drawings.
<1. First Embodiment>
<1-1. Overview of Image Processing Device>
An image processing apparatus 1 shown in FIG. 1 is an embodiment of the image processing apparatus according to the present invention. The image processing apparatus 1 is a general-purpose personal computer. An image processing program 2, which is an embodiment of the image processing program according to the present invention, is provided to the image processing apparatus 1 from a computer-readable recording medium 60 that stores it, such as a CD-ROM, a DVD-ROM, or a USB memory, and is installed on the apparatus. The image processing program 2 is application software for supporting image processing of moving images and still images. The image processing program 2 causes the image processing apparatus 1 to execute the steps included in the operations described later.
The image processing apparatus 1 includes a display 10, an input unit 20, a storage unit 30, and a control unit 40. These units 10 to 40 are connected to one another via bus lines, cables 5, or the like, and can communicate as appropriate. The display 10 is composed of a liquid crystal display or the like, and displays the screens and the like described later to the user. The input unit 20 is composed of a mouse, a keyboard, and the like, and accepts the user's operations on the image processing apparatus 1. The storage unit 30 is a non-volatile storage area composed of a hard disk or the like. The control unit 40 is composed of a CPU, a ROM, a RAM, and the like.
The image processing program 2 is stored in the storage unit 30. A software management area 50 is secured in the storage unit 30. The software management area 50 is an area used by the image processing program 2. In the software management area 50, an original image area 51, a processed file area 52, and an impression degree definition area 53 are secured. The role of each of the areas 51 to 53 will be described later.
By reading and executing the image processing program 2 stored in the storage unit 30, the control unit 40 virtually operates as an automatic classification unit 41 and a reclassification unit 45. The automatic classification unit 41 also operates as a setting unit 42, a calculation unit 43, and a determination unit 44. The operation of each of the units 41 to 45 will be described later.
<1-2. Details of Configuration and Operation of Image Processing Device>
When the control unit 40 detects that the user has performed a predetermined operation via the input unit 20, it starts the image processing program 2. When the image processing program 2 is started, a basic screen W1 (see FIG. 2) is displayed on the display 10. Note that the display of all elements displayed on the display 10, such as screens, windows, and buttons, is controlled by the control unit 40.
<1-2-1. Importing image data>
The basic screen W1 accepts from the user an instruction to import image data into the original image area 51. The image data imported into the original image area 51 becomes the target of the playback processing, image processing, and channel separation processing described later. The control unit 40 imports image data into the original image area 51 from still image files or a moving image file. In this specification, a still image file is a data file in a still image format, and a moving image file is a data file in a moving image format.
When importing image data from still image files, the user designates one still image file or one folder by operating the input unit 20. In the former case, the control unit 40 prompts the user to input the address path and file name of the still image file in the storage unit 30. In the latter case, the control unit 40 prompts the user to input the address path and folder name of the folder in the storage unit 30. Thereafter, the control unit 40 stores the designated still image file, or all the still image files in the designated folder, in the original image area 51 as a still image file group. In this specification, the term “group” does not necessarily imply a plurality of elements; it may be one.
On the other hand, when importing image data from a moving image file, the user operates the input unit 20 to input the address path and file name of one moving image file in the storage unit 30. When the control unit 40 detects that the user has designated a moving image file, it displays a moving image import window (not shown) over the basic screen W1. The moving image import window accepts from the user the selection of an arbitrary section out of the entire timeline of the designated moving image file. When the control unit 40 detects that the user has selected a specific section via the input unit 20, it generates a still image file group corresponding one-to-one to the frames included in that section of the designated moving image file. Thereafter, the control unit 40 stores this still image file group in the original image area 51. Therefore, in the present embodiment, the image data to be subjected to the playback processing, image processing, and channel separation processing described later is not a moving image file but still image files.
Note that the control unit 40 recognizes the still image file group imported into the original image area 51 as being arranged along one timeline, not only when the group is derived from a moving image file but also when it is derived from still image files. The ordering is assigned automatically from the attributes of the files and the like.
<1-2-2. Playback processing>
When a still image file group is imported into the original image area 51, the control unit 40 displays a display window W2 (see FIG. 3) superimposed on the basic screen W1. As many display windows W2 are created as there are timelines of the still image file groups imported into the original image area 51.
In the display window W2, first, one still image file included in the still image file group imported into the original image area 51 (for example, the still image file of the first frame on the timeline) is displayed. Thereafter, as will be described later, the frame displayed in the display window W2 switches in response to user operations.
The control unit 40 can play back, within a display window W2, the frame group belonging to the timeline corresponding to that display window W2 as a moving image. As shown in FIG. 3, a window selection pull-down menu T1, a play button T2, a frame advance button T3, a frame return button T4, and a timeline bar T5 are arranged on the basic screen W1.
Even when a plurality of display windows W2 exist, only one display window W2 is active. The window selection pull-down menu T1 accepts from the user the selection of which display window W2 to make active. Hereinafter, the timeline corresponding to the active display window W2 is called the active timeline, and the frame group belonging to the active timeline is called the active frame group. The frame currently displayed in the active display window W2 is called the active display frame.
The play button T2 accepts from the user a command to play back the active frame group as a moving image. When the control unit 40 detects that the user has pressed the play button T2 via the input unit 20, it displays the frames included in the active frame group within the active display window W2 sequentially, frame by frame, along the timeline. Playback starts from the active display frame at the time the play button T2 is pressed. The play button T2 also accepts a command to stop playback. When the control unit 40 detects that the user has pressed the play button T2 during playback, it fixes the display in the active display window W2 to the active display frame at that point.
The frame advance button T3 and the frame return button T4 respectively accept from the user a command to switch the active display frame to the next or the previous frame along the active timeline.
The timeline bar T5 is an object that schematically represents the active timeline. The timeline bar T5 is divided equally, in the direction in which the bar extends, by the number of frames in the active frame group. The n-th divided region from the left on the timeline bar T5 corresponds to the n-th frame on the active timeline (n is a natural number).
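As an aside, this correspondence between divided regions and frames can be expressed compactly; the following sketch (hypothetical names, Python) maps a click position on the bar back to the 1-based frame number n:

```python
def frame_at(click_x, bar_width, frame_count):
    """Return the 1-based frame number n for a click at click_x on a bar
    of width bar_width divided equally among frame_count frames."""
    if not 0 <= click_x < bar_width:
        raise ValueError("click outside the timeline bar")
    return int(click_x / bar_width * frame_count) + 1
```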
As shown in FIG. 3, the timeline bar T5 displays the divided regions A1 corresponding to the selected frame group and the divided regions A2 corresponding to the non-selected frame group in different styles. The selected frame group is the frame group corresponding to the currently selected section on the active timeline. The non-selected frame group is the frame group corresponding to the sections not currently selected on the active timeline.
The timeline bar T5 accepts from the user the selection of an arbitrary section on the active timeline. In other words, by operating the divided regions on the timeline bar T5 via the input unit 20, the user can select any number of arbitrary frames from the active frame group. The control unit 40 recognizes the selected frame group as the target of the image processing and channel separation processing described later. Each time the user selects a divided region on the timeline bar T5, the active display frame switches to the frame corresponding to the most recently selected divided region.
<1-2-3. Image processing>
Hereinafter, image processing for the selected frame group will be described. The control unit 40 can execute a plurality of image processing modules, such as noise removal, sharpness, brightness/contrast/saturation adjustment, image resolution, rotation, addition of text/arrows/mosaic, and image averaging. The image processing modules are incorporated in the image processing program 2.
By operating the basic screen W1 via the input unit 20, the user can select any of the image processing modules, in any order, any number of times. Each time the control unit 40 detects that the user has selected an image processing module, it executes that image processing module on the selected frame group.
As an image processing module is executed on a frame once, twice, three times, and so on, that frame is progressively processed into its first-order, second-order, third-order, ... versions. The 0th-order frame corresponds to the still image file stored in the original image area 51. The (m+1)th-order frame corresponds to the still image file obtained by executing an image processing module once on the still image file corresponding to the m-th-order frame (m is an integer of 0 or more). The control unit 40 sequentially generates still image files corresponding to the first- and higher-order frames, and stores these still image files separately in the processed file area 52.
FIG. 4 is a conceptual diagram showing how a still image group belonging to one timeline is managed by the image processing program 2. In FIG. 4, the horizontal N axis indicates the order of the frames on the timeline, and the vertical M axis indicates the order of processing. The square at coordinates (n, m) in the N-M space of FIG. 4 represents the still image Q(n, m). The still image Q(n, m) is the m-th-order still image of the n-th frame on the timeline (n is a natural number, and m is an integer of 0 or more).
The control unit 40 manages, for each frame, the value of the currently selected coordinate m as a parameter m_s. Immediately after the still image file group is imported into the original image area 51, m_s has the initial value 0. Thereafter, each time an image processing module is executed, the m_s of that frame is incremented by one. The user can also freely change the m_s of the selected frame group by performing a predetermined operation via the input unit 20. Note that executing an image processing module on a frame means executing it on the m_s-th-order still image of that frame; changing m_s therefore changes the target of image processing module execution. Likewise, displaying a frame means displaying the still image at that frame's coordinate m_s, so changing m_s also changes what is displayed in the active display window W2.
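The bookkeeping along the M axis can be pictured with the following minimal sketch (Python; the class and its single-chain behavior are illustrative assumptions, not the embodiment's data structures):

```python
class FrameHistory:
    """Tracks the still images Q(n, 0..m) of one frame n and the
    currently selected order m_s."""

    def __init__(self, original_path):
        self.versions = [original_path]  # versions[m] is the m-th order image
        self.m_s = 0                     # currently selected order

    def apply_module(self, module):
        # Execute a processing module on the m_s-th order image; for
        # simplicity this sketch keeps a single linear chain, so any
        # later orders are discarded before the new result is appended.
        result = module(self.versions[self.m_s])
        del self.versions[self.m_s + 1:]
        self.versions.append(result)
        self.m_s += 1                    # m_s is incremented by one

    def current(self):
        return self.versions[self.m_s]   # the image a display window shows
```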
<1-3. Channel separation processing>
Hereinafter, the channel separation processing implemented in the image processing program 2 will be described. Channel separation processing separates a moving image in which videos belonging to different channels are mixed into one moving image per channel. A typical target of channel separation processing is a plurality of moving images recorded in a time-lapse manner and mixed onto one timeline, with the video of the other moving images filling the gap times of each moving image (hereinafter, a mixed time-lapse moving image). A mixed time-lapse moving image is a moving image in which the videos of a plurality of channels switch alternately along the timeline.
Channel separation processing is executed on the selected frame group, but it requires a plurality of frames. Therefore, when only one frame is selected as the selected frame group, the operation button for starting channel separation processing is disabled and the processing cannot be started. Alternatively, even when only one frame is selected as the selected frame group, if a plurality of frames exist on the active timeline, all frames on the active timeline may be made subject to channel separation processing. Hereinafter, the frame group subject to channel separation processing is called the target frame group, and the moving image composed of the target frame group is called the target moving image.
Channel separation processing consists of automatic classification processing, which automatically classifies the plurality of frames included in the target moving image into a plurality of channels, and reclassification processing, in which the user manually corrects the result of the automatic classification.
Automatic classification processing is executed by the automatic classification unit 41. It applies image processing to the plurality of frames included in the target moving image to automatically classify these frames into a plurality of channels. Specifically, the automatic classification unit 41 calculates the similarity between frames and, according to that similarity, classifies the plurality of frames included in the target moving image into a plurality of channels. The similarity between frames is calculated as an indicator that the compared frames belong to the same channel. The details of the automatic classification processing will be described later; as a result of it, a plurality of channels are detected, and a plurality of frames belong to each channel. As long as the target is a mixed time-lapse moving image, a plurality of channels will usually be detected, but the automatic classification processing may also detect only one channel. Frames determined in the automatic classification processing not to belong to any channel are labeled "unclassified". Therefore, as a result of the automatic classification processing, each frame included in the target moving image is labeled either with the channel name of one of the channels or as "unclassified".
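As a rough illustration of similarity-driven grouping (not the concrete algorithm of this embodiment, which is described in section 1-4), the following sketch assigns each frame to the best-matching existing channel or founds a new one; a real implementation can also leave frames unclassified:

```python
import numpy as np  # used only for argmax in this sketch

def classify(frames, similarity, threshold=0.9):
    """Return one channel index per frame, greedily grouping frames
    whose similarity to a channel representative exceeds threshold."""
    representatives = []  # one representative frame per detected channel
    labels = []
    for f in frames:
        scores = [similarity(f, rep) for rep in representatives]
        if scores and max(scores) >= threshold:
            labels.append(int(np.argmax(scores)))
        else:
            representatives.append(f)  # no match: found a new channel
            labels.append(len(representatives) - 1)
    return labels
```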
Reclassification processing is executed by the reclassification unit 45. It classifies arbitrary frames included in the target moving image into arbitrary channels individually, in accordance with user operations. Specifically, in the reclassification processing, unclassified frames that were not classified into any channel by the automatic classification processing can be individually classified into a specific channel, and a frame that was classified into the wrong channel by the automatic classification processing can be reclassified into the correct channel. Therefore, when the user feels that the result of the automatic classification processing is wrong, the user can correct the channel to which the problematic frame belongs on a frame-by-frame basis. In the reclassification processing, it is also possible to create a new channel or to integrate a plurality of channels into one channel in accordance with user operations. That is, the reclassification processing is used to manually correct the result of the automatic classification processing.
When the automatic classification unit 41 detects that the user has performed a predetermined operation via the input unit 20, it starts the channel separation processing. When the channel separation processing starts, the setting unit 42 first displays an area setting window W3 (see FIG. 5) showing the selected frame superimposed on the basic screen W1. The area setting window W3 is a screen that accepts from the user the designation, within the selected frame, of an area that serves as the basis for calculating the similarity between frames in the automatic classification processing. The selected frame is one frame included in the target frame group; for example, it can be the first frame along the timeline in the target frame group, or the active display frame. As is clear from the foregoing, the active display frame is the frame most recently selected on the timeline bar T5 and is therefore always included in the target frame group.
When the target moving image is footage from a security camera or the like, a date-and-time display area is often provided within the frame (the lower-left portion in the example of FIG. 5). If the similarity between frames is judged using the information of the entire frame including such a date-and-time display area, the similarity may be judged low even between frames belonging to the same channel. A black border may also be provided within the frame; conversely, in such a case, if the similarity between frames is judged using the information of the entire frame including the black border, the similarity may be judged high even between frames belonging to different channels. Likewise, when the target moving image is a mixed time-lapse moving image composed of footage from security cameras installed in different elevators, the similarity between frames belonging to different channels tends to be judged high, because the scenery inside any elevator looks much the same. The area setting window W3 serves to avoid such situations by excluding areas that could cause misjudgment from the basis of the similarity calculation, so that the similarity between frames calculated in the automatic classification processing is an accurate indicator of belonging to the same channel. Specifically, the user designates, within the selected frame, an area that characterizes the background of each channel. In the elevator example, an area showing a poster or the like that characterizes the scenery inside each elevator may be designated. When the user designates an arbitrary area within the selected frame by operating a mouse or the like on the area setting window W3, an enclosing line 71 indicating the designated area is displayed within the selected frame. This allows the user to confirm whether the correct area has been designated.
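For illustration, restricting the comparison to the designated area might look like the following sketch, which scores two frames by normalized correlation inside a rectangle (the rectangle format and the metric are assumptions for the example):

```python
import numpy as np

def masked_similarity(a, b, rect):
    """Compare two frames (NumPy arrays) only inside rect = (x, y, w, h)."""
    x, y, w, h = rect
    pa = a[y:y + h, x:x + w].astype(np.float64).ravel()
    pb = b[y:y + h, x:x + w].astype(np.float64).ravel()
    pa -= pa.mean()  # normalized cross-correlation as the example metric
    pb -= pb.mean()
    denom = np.linalg.norm(pa) * np.linalg.norm(pb)
    return float(pa @ pb / denom) if denom else 0.0
```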
When the user presses the start button 72 on the area setting window W3, the automatic classification unit 41 detects this and the automatic classification processing described later starts. At the same time, the area setting window W3 is closed. If the start button 72 is pressed while no area is designated within the selected frame, the entire frame is used as the basis for calculating the similarity in the automatic classification processing. The cancel button 73 is a button for canceling the area designation via the area setting window W3.
While the automatic classification processing is running, a dialog box showing its progress is displayed. When the automatic classification processing ends, the dialog box is closed and the reclassification unit 45 starts the reclassification processing. The details of the automatic classification processing will be described later. The reclassification unit 45 first displays the reclassification window W4 shown in FIG. 6 superimposed on the basic screen W1, as the user interface for performing reclassification after the automatic classification processing.
On the reclassification window W4, a channel list area C1, a frame list area C2, a playback area C3, and a timeline bar C4 are arranged. In the channel list area C1, channel objects 91 respectively corresponding to the channels detected by the automatic classification processing are arrayed. In the example of FIG. 6, four channel objects 91 corresponding to the four channels CH01, CH02, CH03, and CH04 are arrayed. In the present embodiment, the channel objects 91 take the form of icons.
As shown in FIG. 6, each channel object 91 displays the channel name CH01, CH02, ... of the corresponding channel. Each channel object 91 also displays a thumbnail image of a representative frame representing the plurality of frames belonging to the corresponding channel. The representative frame switches as appropriate in response to user operations described later; the default is, for example, the first frame along the timeline among the frames belonging to the corresponding channel. Although not shown in FIG. 6, when the number of channels grows and the channel objects 91 for all channels can no longer fit in the channel list area C1, a scroll bar appears in the area C1, substantially enlarging it. Each channel object 91 also displays the number of frames belonging to the corresponding channel. This number changes in response to user operations described later, and the count displayed on the channel object 91 switches in real time to match.
In addition to the channel objects 91, an unclassified object 92 is also arranged in the channel list area C1. The unclassified object 92 corresponds to the classification that collects the unclassified frames in the target frame group, i.e., those belonging to no channel. This classification does not correspond to an actual channel, but in the sense that it collects frames it virtually constitutes one channel. The unclassified object 92 therefore has a form similar to the channel objects 91 and, in the present embodiment, is an icon of the same size. Specifically, the unclassified object 92 displays "unclassified" instead of a channel name, and displays the number of unclassified frames instead of the number of frames belonging to a channel. The count displayed on the unclassified object 92 also switches in real time as the number of unclassified frames changes. The unclassified object 92 likewise displays a thumbnail image of a representative frame representing the "unclassified" frames. This representative frame also switches as appropriate in response to user operations described later; the default is, for example, the first frame along the timeline among the frames belonging to the "unclassified" classification. In the present embodiment, a boundary line 93 is drawn between the unclassified object 92 and the channel objects 91.
The reclassification unit 45 accepts from the user, within the channel list area C1, a selection operation that selects any one object from among all the channel objects 91 and the unclassified object 92. Among all the objects 91 and 92, the currently selected object is displayed in a style different from the others. In the example of FIG. 6, the object 91 of channel CH01 is displayed in a color different from the other objects 91 and 92. Hereinafter, the channel or classification corresponding to the currently selected object 91 or 92 is called the selected channel.
In the frame list area C2, the thumbnail images 94 of all frames belonging to the selected channel are displayed as a list. When the number of frames belonging to the selected channel is large and not all thumbnail images 94 fit in the frame list area C2, a scroll bar 95 appears in the area C2, substantially enlarging it. Therefore, by switching the selection state of the objects 91 and 92, the user can selectively list in the frame list area C2 all unclassified frames or all frames belonging to any given channel. This makes it possible to review all of the unclassified and classified frames included in the target frame group while making effective use of the limited space in the frame list area C2. In particular, this list display makes it easy to spot a frame that was classified into the wrong channel. In the present embodiment, the thumbnail images 94 in the frame list area C2 can be arrayed in multiple rows, and are arranged in chronological order, from left to right and top to bottom, as the order n of the original frames on the timeline increases.
The reclassification unit 45 manages one specific frame belonging to each channel as the active frame, and likewise manages one specific unclassified frame as an active frame. In the frame list area C2, the thumbnail image 94 of the active frame bears an enclosing frame 96 as a sign distinguishing it from the thumbnail images 94 of the other frames. Each thumbnail image 94 in the frame list area C2 is recognized as an object, and the user can select any one thumbnail image 94 from among all the thumbnail images 94 in the frame list area C2 by a click operation or the like. It is also possible to select a plurality of thumbnail images 94 in the frame list area C2 simultaneously, for example by repeating click operations while holding down a specific key on the keyboard. Each time a thumbnail image 94 is selected in the frame list area C2, the reclassification unit 45 switches the active frame of the selected channel to the frame corresponding to the most recently selected thumbnail image 94. At this time, the enclosing frame 96 moves as well. The active frames are linked with the thumbnail images displayed on the channel objects 91 and the unclassified object 92: each time the active frame of a channel or classification changes, the thumbnail image of the object 91 or 92 corresponding to that channel or classification switches as well. That is, in the present embodiment, the representative frame of each channel and classification coincides with its active frame.
The thumbnail image 94 of the active frame in the frame list area C2 can be moved individually by a drag-and-drop operation so as to be dropped onto one of the objects 91 and 92 in the channel list area C1. When a plurality of thumbnail images 94 are simultaneously selected in the frame list area C2, these thumbnail images 94 can be moved onto one of the objects 91 and 92 in a batch. (Although this is called "moving", a thumbnail image 94 released on the target object 91 or 92 disappears, as if it had been inserted into a folder.) In other words, the reclassification unit 45 accepts from the user an operation that associates any thumbnail image 94 in the frame list area C2 with any channel or classification. When this association operation is performed, the frames corresponding to the thumbnail images 94 moved by the operation are reclassified into the channel or classification corresponding to the object 91 or 92 onto which they were moved. If the frame corresponding to a thumbnail image 94 involved in the operation was originally classified into the channel or classification corresponding to the object 91 or 92 involved, the operation is ignored and no reclassification is performed.
As shown in FIG. 6, a new-creation object 97 is arranged in the channel list area C1, after the array of the objects 91 and 92. The new-creation object 97 is an object for creating a new channel and, in the present embodiment, takes the form of an icon. The same operation as the association operation described above can be performed not only on the objects 91 and 92 but also on the new-creation object 97. Specifically, the reclassification unit 45 accepts from the user a drag-and-drop operation that individually moves the thumbnail image 94 of the active frame in the frame list area C2 onto the new-creation object 97. When a plurality of thumbnail images 94 are simultaneously selected in the frame list area C2, they can also be moved onto the new-creation object 97 in a batch. (Again, a thumbnail image 94 released on the new-creation object 97 disappears, as if it had been inserted into a folder.) When this association operation is performed, a channel object 91 is newly created at the position where the new-creation object 97 was.
The newly created channel object 91 represents a new channel whose elements are the frames corresponding to the one or more thumbnail images 94 moved by the association operation. In other words, the frames corresponding to the one or more moved thumbnail images 94 are reclassified into the new channel. Therefore, immediately after the channel object 91 is created, if only one thumbnail image 94 was moved, that thumbnail image 94 is displayed on the object 91. If a plurality of thumbnail images 94 were moved, one of them (for example, the last one selected) is displayed on the newly created channel object 91. A channel name is also assigned and displayed as appropriate. With the creation of the new channel object 91, the new-creation object 97 moves to after the array of objects 91 and 92, which now includes the newly created object 91.
When frames are reclassified by the various association operations above, the reclassified frames are relabeled with the channel name CH01, CH02, ... or "unclassified", according to the destination channel or classification. The thumbnail images 94 of the reclassified frames are removed from the frame list area C2, and the remaining thumbnail images 94 in the area C2 are realigned so as to fill the space the removed ones occupied. The reclassified frames always include the active frame. Therefore, after the reclassification, the active frame of the selected channel is changed to the nearest later frame along the timeline or, if there is none, the nearest earlier frame along the timeline. The change of the selected channel's active frame is reflected as appropriate in the position of the enclosing frame 96 and in the thumbnail images displayed on the objects 91 and 92. In addition, the numbers of frames belonging to the channels and/or classifications corresponding to the source and destination objects 91 and 92 of the reclassified frames are recalculated and reflected in the display of the objects 91 and 92.
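The bookkeeping behind such a reclassification reduces to relabeling and recounting, as in the following minimal sketch (the dictionary layout is an assumption for the example):

```python
def reclassify(labels, frame_ids, dest):
    """Relabel the given frames with dest (e.g. "CH01" or "unclassified")
    and return the recomputed per-label frame counts."""
    for fid in frame_ids:
        labels[fid] = dest
    counts = {}
    for label in labels.values():
        counts[label] = counts.get(label, 0) + 1
    return counts
```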
The various association operations above can also be realized in modes other than drag-and-drop. Specifically, as shown in FIG. 6, the objects 91, 92, and 97 each display, at their lower right, a character string indicating a specific key on the keyboard. When the reclassification unit 45 detects that a specific key displayed on an object 91, 92, or 97 has been pressed, it determines that the active frame of the selected channel at that time, i.e., the frame currently bearing the enclosing frame 96 in the frame list area C2, has been associated with the object 91, 92, or 97 corresponding to that key. The subsequent processing is the same as when the association operation is performed by drag-and-drop.
The reclassification unit 45 can also integrate a plurality of channels (in this paragraph and the next, including the "unclassified" classification) in response to user operations. Specifically, within the channel list area C1, any one object 91 or 92 can be individually moved by a drag-and-drop operation so as to be dropped onto another arbitrary object 91 or 92. A plurality of objects 91 and 92 may also be made simultaneously selectable, so that they can be moved onto another arbitrary object 91 or 92 in a batch. (Again, moved objects 91 and 92 released on the destination object 91 or 92 disappear, as if inserted into a folder.) In other words, the reclassification unit 45 accepts from the user an operation that associates any channel in the channel list area C1 with any other channel. When this association operation is performed, the channels corresponding to the objects 91 and 92 moved by the operation are integrated into the channel corresponding to the object 91 or 92 onto which they were moved.
When channels are integrated by the association operation above, all frames belonging to the channels corresponding to the moved objects 91 and 92 are relabeled with the channel name ("unclassified", CH01, CH02, ...) of the channel corresponding to the destination object 91 or 92. As a result, the destination object 91 or 92 becomes the object representing the new, integrated channel. As the active frame of the integrated channel, the active frame of the destination object 91 or 92 continues to be used. The reclassification unit 45 also calculates the number of frames belonging to the integrated channel and reflects it in the display on the object 91 or 92. Within the channel list area C1, when a channel object 91 has been moved, the moved channel object 91 disappears, and the remaining objects 91 and 97 are realigned so as to fill the space where the vanished channel object 91 was arranged. When the unclassified object 92 has been moved, its frame count becomes 0, but the unclassified object 92 itself remains.
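Channel integration itself is likewise a bulk relabeling, sketched below under the same assumed layout:

```python
def merge_channels(labels, src, dest):
    """Relabel every frame of channel src with dest, integrating src
    into dest; all other labels are left untouched."""
    return {fid: (dest if ch == src else ch) for fid, ch in labels.items()}
```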
When a plurality of channel objects 91 exist, the order in which these objects 91 are arrayed in the channel list area C1 can be changed freely by drag-and-drop. For example, in the state of FIG. 6, to move the channel object 91 of channel CH01 to the right of the channel object 91 of channel CH02, the channel object 91 of channel CH01 may simply be moved between the channel objects 91 of channels CH02 and CH03.
Next, the playback area C3 will be described. The playback area C3 can play back the frame group belonging to the selected channel as a moving image. As shown in FIG. 6, a play button 76, a frame advance button 77, and a frame return button 78 are arranged on the reclassification window W4.
The play button 76 accepts from the user a command to play back the frame group belonging to the selected channel as a moving image. When the reclassification unit 45 detects that the play button 76 has been pressed, it displays the frames included in the frame group belonging to the selected channel within the playback area C3 sequentially, frame by frame, along the timeline. Playback starts from the active frame of the selected channel at the time the play button 76 is pressed, and is performed at the frame rate corresponding to the selected channel. During playback, the frame displayed in the playback area C3 switches in sequence, and at each switch the active frame managed by the reclassification unit 45 is updated in real time to the frame after the switch. The play button 76 also accepts a command to stop playback. When the reclassification unit 45 detects that the play button 76 has been pressed during playback, it fixes the display in the playback area C3 to the active frame at that point. Even while stopped, the active frame of the selected channel is always displayed in the playback area C3. Therefore, while stopped, when the active frame is changed by a selection operation on a thumbnail image 94 in the frame list area C2, the display in the playback area C3 changes in real time as well.
During playback, the active frame is updated continually, and the enclosing frame 96 in the frame list area C2 moves in real time to the position surrounding the thumbnail image 94 of the latest active frame. At the same time, the thumbnail image of the object 91 or 92 corresponding to the selected channel is also updated in real time. That is, the display in the playback area C3 is synchronized with the position of the enclosing frame 96 and with the thumbnail images on the objects 91 and 92. In other embodiments, however, such synchronization need not be maintained during playback; in that case, for example, the enclosing frame 96 may move and/or the thumbnail images of the objects 91 and 92 may be updated only when playback stops.
The frame advance button 77 and the frame return button 78 respectively accept from the user a command to switch the display in the playback area C3 to the next or the previous frame along the timeline within the frame group belonging to the selected channel. Changes of the display in the playback area C3 by operating the buttons 77 and 78 are likewise linked to changes of the active frame.
The playback function above is useful for finding frames misclassified into an existing channel. When, during playback of a moving image, a frame belonging to a different channel suddenly appears, a person can spot it immediately. In such a case, the user may simply stop playback at once, locate the thumbnail image 94 of the corresponding frame in the frame list area C2, and move that thumbnail image 94 to the correct channel by the association operations above.
Below the playback area C3 on the reclassification window W4, a frame rate area C5 displaying the frame rate of the selected channel is shown. The frame rate can be calculated in various ways; for example, it can be calculated as the number of frames belonging to the selected channel divided by the difference between the time of the first frame and the time of the last frame belonging to the selected channel. Alternatively, the frame rate of the selected channel can be calculated based on the following formula:
(frame rate of the moving image corresponding to the active timeline) × (number of frames belonging to the selected channel) ÷ (number of frames belonging to the active timeline)
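Both calculations translate directly into code; the sketch below assumes frame times in seconds for the first variant:

```python
def rate_from_times(n_frames, t_first, t_last):
    """Frames belonging to the channel divided by the spanned time
    (assumes t_last > t_first, in seconds)."""
    return n_frames / (t_last - t_first)

def rate_from_share(source_fps, n_channel, n_timeline):
    """The alternative formula: the source frame rate scaled by the
    channel's share of the frames on the active timeline."""
    return source_fps * n_channel / n_timeline
```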
As described above, the frames belonging to the selected channel change as frames are reclassified. The reclassification unit 45 therefore recalculates the frame rate of the selected channel each time a reclassification is performed, and changes the display of the frame rate area C5 in real time. In the vicinity of the frame rate area C5, a channel name area C6 displaying the channel name or classification name of the selected channel is arranged. The user can therefore understand exactly which channel the video displayed in the playback area C3 belongs to, and which channel's frame rate is displayed in the frame rate area C5.
Next, the timeline bar C4 will be described. Like the timeline bar T5 on the basic screen W1, the timeline bar C4 is an object that schematically represents the active timeline. The target frame group subject to channel separation processing constitutes part or all of the active frame group. It is therefore important that the active timeline can be managed on the reclassification window W4 as well, so a timeline bar C4 similar to the timeline bar T5 on the basic screen W1 is arranged there. The timeline bar C4 extends in the left-right direction and is divided equally, in that direction, by the number of frames in the active frame group. The n-th divided region from the left on the timeline bar C4 corresponds to the n-th frame on the active timeline (n is a natural number).
On the timeline bar C4, the reclassification unit 45 displays a straight line 83 at the position of the divided region corresponding to the active frame of the selected channel, as a sign indicating that divided region. That is, the straight line 83 indicates the position on the active timeline of the frame displayed in the area C3 and within the enclosing frame 96 of the area C2. Below the timeline bar C4, a target range bar 85 is displayed. Like the timeline bar C4, the target range bar 85 extends in the left-right direction, over the range corresponding to the section occupied by the target frame group.
The timeline bar C4 can be expanded and contracted in the left-right direction. Specifically, a scale change bar 86 is arranged on the reclassification window W4; it is an object combining two objects, a slide groove 61 and a slider 62. The slide groove 61 has a GUI like a groove extending straight in the left-right direction, and the slider 62 has a GUI that slides left and right within the slide groove 61. By operating the slider 62 via the input unit 20, the user can move the slider 62 back and forth along the slide groove 61. The reclassification unit 45 changes the left-right scale of the timeline bar C4 stepwise according to the left-right position of the slider 62 within the slide groove 61. Specifically, as the slider 62 moves to the right, the left-right scale of the timeline bar C4 becomes shorter stepwise, and as the slider 62 moves to the left, it becomes longer stepwise. When the left-right scale of the timeline bar C4 changes, the left-right sizes of all the divided regions change uniformly in proportion. However, when the slider 62 reaches the leftmost position in the slide groove 61, the timeline bar C4 changes into a thumbnail list C7 (see FIG. 7). When the left-right scale of the timeline bar C4 or the thumbnail list C7 is so long that it does not fit within the reclassification window W4, a scroll bar 87 that scrolls in the left-right direction appears, and the area for displaying the timeline bar C4 is substantially enlarged in the left-right direction.
The thumbnail list C7 is an array, in the left-right direction, of the thumbnail images 88 of all frames belonging to the active timeline. The user can therefore also review, on the reclassification window W4, the frames of the active timeline outside the target frame group. The n-th thumbnail image 88 from the left in the thumbnail list C7 is the thumbnail image of the n-th frame on the active timeline (n is a natural number). In the thumbnail list C7, the thumbnail image 88 of the active frame of the selected channel bears an enclosing frame 89 as a sign distinguishing it from the other thumbnail images 88.
Like the timeline bar T5, the timeline bar C4 and the thumbnail list C7 accept from the user the selection of an arbitrary frame on the active timeline. In other words, by selecting a divided region on the timeline bar C4 or any thumbnail image 88 in the thumbnail list C7 with a click operation or the like, the user can select an arbitrary frame from the active frame group. Each time a frame is selected on the timeline bar C4 or the thumbnail list C7, the reclassification unit 45 determines whether that frame is included in the target frame group. If it is determined to be included, the channel or classification to which the frame belongs is switched to the selected channel, and at the same time the reclassification unit 45 makes that frame the active frame of the selected channel. The changes of the selected channel and the active frame here are reflected in real time in the display of the channel list area C1, the frame list area C2, the playback area C3, the frame rate area C5, and the channel name area C6. The positions of the straight line 83 and the enclosing frame 89 are also changed in real time.
Using the various functions described above, the user can reclassify the plurality of frames included in the target frame group. When the user judges that the reclassification is complete, the user presses the OK button E5. In response, the reclassification unit 45 closes the reclassification window W4 and newly creates one or more moving images belonging to new timelines. The newly created moving images correspond one-to-one to the channels finally defined on the reclassification window W4 that were selected by the user (hereinafter, the final channels). As shown in FIG. 6, the user selects the final channels by placing a check in the selection box 98 arranged on each channel object 91. In the present embodiment, the default value of the selection box 98 is the "selected" state; the user therefore only needs to operate the selection box on the corresponding object 91 and set it to the "non-selected" state for those channels for which a moving image is judged unnecessary. In the present embodiment, a selection box 99 is also arranged below the playback area C3. The selection box 99 is an object for selecting whether a moving image should be created for the selected channel (the "unclassified" classification is excluded), and is synchronized with the selection box 98 on the channel object 91. The reclassification unit 45 determines the selection states of the selection boxes 98 and 99 and decides the final channels.
 Each moving image newly created as described above is obtained by arranging all frames belonging to the corresponding final channel along a new timeline. At this time, display windows W2 corresponding one-to-one to these new moving images are also newly created. Further, the reclassification unit 45 creates a group of still image files corresponding one-to-one to the frames included in these new moving images, stores them in the processed file area 52, and treats them as 0th-order frames. From this point on, these new moving images can be processed in the same way as a group of still image files imported into the original image area 51: they can be played back in a display window W2 and subjected to the various kinds of image processing in the same manner.
 Note that no moving image is created for the "unclassified" classification. Therefore, when the OK button E5 is pressed, the reclassification unit 45 may determine whether any unclassified frames remain and, if so, inform the user of that fact. In this case, for example, when it determines that unclassified frames remain, the reclassification unit 45 displays a confirmation window W5 as shown in FIG. 8 on the display 10. It then waits for the "Yes" button 74 to be pressed before proceeding to the moving image creation process described above. If the "No" button 75 is pressed, the confirmation window W5 is closed and the display returns to the reclassification window W4. On the other hand, if it is determined that no unclassified frames remain, the process immediately proceeds to the moving image creation process described above.
 <1-4. Automatic Classification Algorithm>
 The details of the automatic classification process will now be described with reference to FIG. 9. As described above, the automatic classification process starts when the start button 72 on the area setting window W3 is pressed. In the following, the frames constituting the target frame group are denoted F_1, F_2, ..., F_J (J is an integer of 2 or more). The order of the frames F_1, F_2, ..., F_J is their order on the timeline.
 First, in step S1, the automatic classification unit 41 reduces the size of each frame F_1, F_2, ..., F_J included in the target frame group. The frames may be reduced, for example, so that the number of horizontal and/or vertical pixels becomes a specified number, or so that they become a predetermined fraction of the original size. The aspect ratio may or may not be preserved. This reduction step speeds up the subsequent processing and reduces the influence of noise. In the following, the reduced frames are also denoted F_1, F_2, ..., F_J.
 In the subsequent step S2, the setting unit 42 defines a channel CH_1 and classifies the frame F_1 into the channel CH_1. Specifically, it labels the frame F_1 with the channel name CH_1. The setting unit 42 also sets the frame F_1 as a reference frame G_1 representing the group of frames belonging to the channel CH_1. Note that the number of channels can increase at any time through step S10, described later. For each newly created channel CH_l, a reference frame G_l representing the group of frames belonging to that channel is likewise set (l = 2, 3, ...).
 When the above preprocessing is complete, the setting unit 42 sequentially selects frames F_j (j = 2, 3, ..., J) from the remaining frames F_2, F_3, ..., F_J along the timeline and sets each as the comparison frame. Each time a comparison frame F_j is set, the calculation unit 43, the determination unit 44, and the setting unit 42 repeat the following steps S3 to S10 for the comparison frame F_j, comparing it against the existing reference frames G_1, G_2, ..., G_L (L is the number of existing channels).
 In step S3, the calculation unit 43 divides the entire comparison frame F_j into a specified number K (K is an integer of 2 or more) of partial regions (blocks) D_1, D_2, ..., D_K (see FIG. 10). Here, "the entire comparison frame F_j" means the designated area if an area was designated on the area setting window W3, and the entire screen of the frame otherwise. The partial regions D_1, D_2, ..., D_K need not cover the whole of the comparison frame F_j without gaps, and they may partially overlap one another. Their shape need not be rectangular; it may be, for example, circular or another polygon. Their shapes and sizes may be uniform or may differ from one another.
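 By way of illustration only, the block division of step S3 might be sketched as follows in Python/NumPy. The function name and the choice of a uniform, non-overlapping grid are assumptions made for simplicity; as noted above, the embodiment also permits overlapping or non-rectangular regions.

```python
import numpy as np

def split_into_blocks(frame: np.ndarray, rows: int, cols: int):
    """Divide a frame (H x W [x C] array) into a uniform rows x cols grid.

    Returns a list of K = rows * cols sub-arrays D_1, ..., D_K. A plain
    grid is used here only for simplicity; the embodiment also allows
    overlapping or non-rectangular regions.
    """
    h, w = frame.shape[:2]
    ys = np.linspace(0, h, rows + 1, dtype=int)
    xs = np.linspace(0, w, cols + 1, dtype=int)
    blocks = []
    for i in range(rows):
        for j in range(cols):
            blocks.append(frame[ys[i]:ys[i + 1], xs[j]:xs[j + 1]])
    return blocks
```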
 In the subsequent step S4, the calculation unit 43 divides each of the reference frames G_1, G_2, ..., G_L of the existing channels CH_1, CH_2, ..., CH_L into partial regions D'_1, D'_2, ..., D'_K (see FIG. 10). The partial regions D'_1, D'_2, ..., D'_K are defined so as to occupy the same positions within the frame's screen as the partial regions D_1, D_2, ..., D_K, respectively.
 Next, in step S5, the calculation unit 43 calculates, for each combination of partial regions D_k and D'_k (k = 1, 2, ..., K), the similarity between the comparison frame F_j and each reference frame G_l (l = 1, 2, ..., L) (hereinafter, the local similarity) Y_lk. That is, the local similarity Y_lk is the similarity between the partial region D_k of the comparison frame F_j and the partial region D'_k of the reference frame G_l. In the present embodiment, a correlation coefficient called zero-mean normalized cross-correlation (ZNCC) is calculated, but in other embodiments other similarity measures may be used, such as the correlation coefficient called NCC or the measures called SSD and SAD.
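 The ZNCC named here can be written compactly. The following is a minimal sketch, assuming the two regions are given as equally sized grayscale NumPy arrays; the small epsilon guarding against zero-variance (flat) regions is an implementation assumption.

```python
import numpy as np

def zncc(a: np.ndarray, b: np.ndarray, eps: float = 1e-12) -> float:
    """Zero-mean normalized cross-correlation of two same-shaped regions.

    Returns a value in [-1, 1]; 1 means identical up to a brightness and
    contrast shift.
    """
    a = a.astype(np.float64).ravel()
    b = b.astype(np.float64).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.sqrt((a * a).sum()) * np.sqrt((b * b).sum())
    return float((a * b).sum() / (denom + eps))
```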
 Next, in step S6, the calculation unit 43 calculates the whole-frame similarity (hereinafter, the total similarity) B_l between the comparison frame F_j and each reference frame G_l (l = 1, 2, ..., L). Specifically, for each l (l = 1, 2, ..., L), the calculation unit 43 determines the partial regions D_k, D'_k with high local similarity Y_lk. Here, for example, regions whose local similarity Y_lk falls within a specified upper fraction may be judged high, or regions whose local similarity Y_lk exceeds a predetermined threshold may be judged high. Then, for each l (l = 1, 2, ..., L), the calculation unit 43 recalculates the similarity between the comparison frame F_j and the reference frame G_l using only the partial regions D_k, D'_k with high local similarity Y_lk within the whole frame, and takes the result as the total similarity B_l (see FIG. 10). In the present embodiment, ZNCC is also calculated for the total similarity B_l, but as with the local similarity Y_lk, other similarity measures may be used. The total similarity B_l and the local similarity Y_lk may also be calculated by different methods.
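 Combining the two preceding sketches, the two-stage computation of steps S5 and S6 might look as follows. Keeping the upper fraction of the blocks by local similarity is one of the two selection rules described above, and the keep_ratio value is an assumption; the zncc helper is the one sketched earlier.

```python
import numpy as np
# assumes zncc() from the previous sketch

def total_similarity(frame_blocks, ref_blocks, keep_ratio=0.5):
    """Total similarity B_l per steps S5-S6 (sketch).

    Ranks the block pairs by local ZNCC (step S5) and recomputes ZNCC over
    only the best-matching fraction of blocks (step S6), so that blocks
    likely to contain moving objects are excluded.
    """
    local = np.array([zncc(d, dp) for d, dp in zip(frame_blocks, ref_blocks)])
    order = np.argsort(local)[::-1]                    # best blocks first
    keep = order[: max(1, int(len(local) * keep_ratio))]
    a = np.concatenate([frame_blocks[k].ravel() for k in keep])
    b = np.concatenate([ref_blocks[k].ravel() for k in keep])
    return zncc(a, b)
```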
 As described above, in the present embodiment, the information of the partial regions D_k, D'_k with low local similarity Y_lk is not taken into account when calculating the total similarity B_l, for the following reason. Frames of the same channel share the same background image but differ in the portions showing moving objects. When the moving-object portion occupies a large share of the frame, judging the total similarity B_l from the information of the whole frame would yield a low total similarity B_l even between frames of the same channel. Therefore, to avoid misjudgment, the partial regions D_k, D'_k with low local similarity Y_lk, which are likely to contain much of the moving objects, are not used in calculating the total similarity B_l. As a result, the influence of moving objects is reduced, and whether the compared frames belong to the same channel can be judged accurately.
 In the subsequent step S7, the determination unit 44 determines the largest of the total similarities B_1, B_2, ..., B_L calculated in step S6 and judges whether this maximum exceeds a predetermined threshold. If it is judged to exceed the threshold, the process proceeds to step S8; if it is judged to be at or below the threshold, the process proceeds to step S10.
 In step S8, the determination unit 44 determines that the comparison frame F_j belongs to the channel CH_MAX, where MAX is the value of l that gives the largest total similarity B_l. Specifically, the determination unit 44 labels the comparison frame F_j with the channel name CH_MAX, classifying the comparison frame F_j into the channel CH_MAX. The comparison frame F_j thus comes to belong to the same channel as the group of frames belonging to the channel CH_MAX.
 After step S8, the process proceeds to step S9. In step S9, the setting unit 42 updates the reference frame G_MAX of the channel CH_MAX. Specifically, the setting unit 42 combines the comparison frame F_j with the existing reference frame G_MAX to form a new reference frame G_MAX. In the present embodiment, the combination takes the form of a weighted average in which the comparison frame F_j is weighted more heavily than the existing reference frame G_MAX. In this way, the reference frame G_MAX representing the group of frames belonging to the channel CH_MAX becomes a composite image weighted toward the most recent frames. As a result, time-series changes can be accommodated when judging whether frames belong to the same channel. In another embodiment, the comparison frame F_j may simply be set as the new reference frame G_MAX as it is. It is also possible to omit step S9 and leave the reference frame G_MAX unchanged.
 Step S10, on the other hand, is executed when all of the total similarities B_1, B_2, ..., B_L calculated in step S6 are judged to be at or below the predetermined threshold; in other words, when the comparison frame F_j resembles none of the reference frames G_l of the channels CH_l. In step S10, the setting unit 42 defines a new channel CH_{L+1} in addition to the existing channels CH_1, CH_2, ..., CH_L and classifies the comparison frame F_j into the channel CH_{L+1}. Specifically, the comparison frame F_j is labeled with the new channel name CH_{L+1}. The setting unit 42 also sets the comparison frame F_j as the reference frame G_{L+1} representing the group of frames belonging to the new channel CH_{L+1}.
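 Taken together, steps S7 to S10 (with the reference frame update of step S9) amount to the following per-frame loop. This is a sketch only: the threshold, the weighting factor, and the helpers split and total_similarity are illustrative assumptions, not values fixed by the embodiment, and channels are numbered from 0 rather than CH_1 for simplicity.

```python
import numpy as np
# assumes total_similarity() and a block-splitting helper from the sketches above

def classify_frames(frames, split, threshold=0.8, new_weight=0.6):
    """Assign each frame to a channel per steps S2, S7-S10 (sketch).

    frames: grayscale NumPy arrays in timeline order.
    split:  a function dividing a frame into a list of blocks.
    """
    labels = [0]                            # F_1 founds the first channel (step S2)
    refs = [frames[0].astype(np.float64)]   # one reference frame per channel
    for f in frames[1:]:
        fb = split(f)
        sims = [total_similarity(fb, split(g)) for g in refs]
        best = int(np.argmax(sims))
        if sims[best] > threshold:          # step S7 -> step S8
            labels.append(best)
            # step S9: weighted average favouring the newer frame
            refs[best] = new_weight * f + (1.0 - new_weight) * refs[best]
        else:                               # step S10: open a new channel
            labels.append(len(refs))
            refs.append(f.astype(np.float64))
    return labels, refs
```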
 When steps S3 to S10 have been completed for all frames F_1, F_2, ..., F_J, the process proceeds to step S11. In step S11, the existing channels CH_1, CH_2, ..., CH_L are revised. Specifically, the calculation unit 43 calculates the whole-frame similarity between every pair of reference frames G_1, G_2, ..., G_L. As in step S3, "the whole frame" here means the designated area if an area was designated on the area setting window W3, and the entire screen of the frame otherwise. In the present embodiment, ZNCC is calculated as the similarity, as in steps S5 and S6, but other similarity measures may be used, and this similarity may be calculated by a method different from those for the total similarity B_l and the local similarity Y_lk. If any of these similarities exceeds a predetermined threshold, the determination unit 44 merges the channels CH_l corresponding to the reference frames G_l giving such a similarity into a single channel. Specifically, all frames F_j belonging to the channels being merged are relabeled with the same channel name. Note that three or more channels CH_l may be merged into one here.
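 The merging of step S11 might be sketched as follows, again reusing the zncc helper. The merge threshold is an assumed value, and a small union-find structure is used so that chains of pairwise-similar channels collapse into one, covering the case where three or more channels are merged.

```python
# assumes zncc() from the earlier sketch

def merge_channels(labels, refs, merge_threshold=0.8):
    """Step S11 (sketch): merge channels whose reference frames are similar.

    Compares reference frames pairwise with whole-frame ZNCC and relabels
    all frames of a merged channel with its representative's index.
    """
    parent = list(range(len(refs)))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path compression
            x = parent[x]
        return x

    for a in range(len(refs)):
        for b in range(a + 1, len(refs)):
            if zncc(refs[a], refs[b]) > merge_threshold:
                parent[find(b)] = find(a)   # union: b joins a's channel

    return [find(l) for l in labels]
```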
 Further, if, even after the above channel merging, there remains a channel CH_l containing only one frame, the determination unit 44 eliminates that channel CH_l. Specifically, the determination unit 44 relabels the frames F_j belonging to every channel CH_l to be eliminated as "unclassified". In the present embodiment, channels containing only one frame F_j are subject to elimination, but this threshold may instead be set to two frames, three frames, and so on. An upper limit may also be placed on the number of channels that can be created. In that case, channels with fewer frames F_j can be eliminated first so that the number of channels does not exceed the upper limit.
 After the above processing, the numbers contained in the channel names CH_l of the finally created channels are unlikely to be consecutive. The determination unit 44 therefore reassigns channel names CH01, CH02, ... in order to the finally created channels CH_l and updates the labels given to each frame F_j with these new channel names. The automatic classification process then ends.
 <2. Second Embodiment>
 A second embodiment of the present invention will now be described. FIG. 11 shows the flow of the automatic classification process according to the second embodiment; the second embodiment differs from the first embodiment only in the automatic classification algorithm. Moreover, as a comparison of FIG. 9 and FIG. 11 shows, the only difference between the two automatic classification processes is that in the second embodiment steps S24 to S26 are inserted in place of steps S4 to S6. For simplicity, only this difference is described below.
 In the second embodiment, step S24 is executed after steps S1 to S3, which are the same as in the first embodiment. In step S24, the calculation unit 43 detects feature points within each partial region D_1, D_2, ..., D_K of the comparison frame F_j. The number of feature points detected within each partial region D_1, D_2, ..., D_K is preferably the same. In the following, the feature points in the comparison frame F_j are denoted P_1, P_2, ..., P_U (U is an integer of 2 or more).
 The calculation unit 43 sets a partial region V_u in the vicinity of each feature point P_u (u = 1, 2, ..., U) within the comparison frame F_j (see FIG. 12). The partial region V_u is a region of a predetermined size centered on the feature point P_u. The calculation unit 43 then sets partial regions V'_1, V'_2, ..., V'_U on each of the reference frames G_1, G_2, ..., G_L of the existing channels CH_1, CH_2, ..., CH_L (see FIG. 12). The partial regions V'_1, V'_2, ..., V'_U are defined so as to occupy the same positions within the frame's screen as the partial regions V_1, V_2, ..., V_U, respectively.
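 As one possible realization of step S24, the per-block feature detection and the placement of the windows V_u might be sketched as follows. The embodiment does not name a detector, so Shi-Tomasi corner detection via OpenCV is used here purely as an example; the window half-size and the per-block point count are assumed parameters, the frame is assumed to be a grayscale uint8 array, and blocks_slices is assumed to be a list of (y0, y1, x0, x1) bounds describing the block grid.

```python
import cv2
import numpy as np

def feature_regions(frame, blocks_slices, pts_per_block=4, half=8):
    """Step S24 (sketch): detect feature points per block and return square
    windows V_u around them as (y0, y1, x0, x1) bounds.

    The same bounds are applied to each reference frame G_l to obtain the
    corresponding regions V'_u. NumPy slicing truncates at the image edge,
    so out-of-range bounds on the bottom/right are harmless.
    """
    regions = []
    for (y0, y1, x0, x1) in blocks_slices:
        sub = frame[y0:y1, x0:x1]
        pts = cv2.goodFeaturesToTrack(sub, pts_per_block, 0.01, 5)
        if pts is None:                 # featureless block: skip it
            continue
        for (x, y) in pts.reshape(-1, 2):
            cy, cx = int(y0 + y), int(x0 + x)
            regions.append((max(cy - half, 0), cy + half,
                            max(cx - half, 0), cx + half))
    return regions
```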
 As described above, in the present embodiment, the feature points P_1, P_2, ..., P_U are detected from the comparison frame F_j in units of the partial regions D_1, D_2, ..., D_K. As a result, the feature points P_1, P_2, ..., P_U are not concentrated in one part of the screen of the comparison frame F_j but are detected roughly evenly across the whole of the comparison frame F_j.
 Next, in step S25, the calculation unit 43 calculates, for each combination of partial regions V_u and V'_u (u = 1, 2, ..., U), the local similarity Y_lu between the comparison frame F_j and each reference frame G_l (l = 1, 2, ..., L). That is, the local similarity Y_lu is the similarity between the partial region V_u of the comparison frame F_j and the partial region V'_u of the reference frame G_l. Since the local similarities in the first and second embodiments both represent similarity over a partial region, the same symbol Y is used for both. In the present embodiment, ZNCC is calculated, but as in the first embodiment, other similarity measures may be used.
 Next, in step S26, the calculation unit 43 calculates the whole-frame total similarity B_l between the comparison frame F_j and each reference frame G_l (l = 1, 2, ..., L). Since the total similarities in the first and second embodiments both represent similarity over the whole frame, the same symbol B is used for both. Specifically, for each l (l = 1, 2, ..., L), the calculation unit 43 determines the partial regions V_u, V'_u with high local similarity Y_lu. Here, for example, regions whose local similarity Y_lu falls within a specified upper fraction may be judged high, or regions whose local similarity Y_lu exceeds a predetermined threshold may be judged high. Then, for each l (l = 1, 2, ..., L), the similarity between the comparison frame F_j and the reference frame G_l is recalculated using only the regions V_u, V'_u with high local similarity Y_lu within the whole frame, and the result is taken as the total similarity B_l. In this embodiment too, ZNCC is calculated for the total similarity B_l, but as in the first embodiment, other similarity measures may be used, and the total similarity B_l and the local similarity Y_lu may be calculated by different methods.
 Through the above step S26, the total similarities B_1, B_2, ..., B_L are calculated. The subsequent flow of processing is the same as in the first embodiment.
 As described above, in the present embodiment, the information of regions other than the partial regions V_1, V_2, ..., V_U near the feature points P_1, P_2, ..., P_U is not taken into account when calculating the total similarity. This is because, even for frames of different channels, if they capture similar scenes, judging the total similarity B_l from the information of the whole frame would yield a high total similarity B_l. For example, in two videos shot at different locations inside the same store, the same uniformly colored wallpaper, flooring, and the like may occupy most of the frame in both; in such cases the total similarity B_l would become high. Therefore, to avoid misjudgment, regions likely to show such backgrounds of uniform color, that is, regions other than the vicinities of the feature points P_1, P_2, ..., P_U, are not used in calculating the total similarity B_l. As a result, the influence of similar backgrounds is reduced, and whether the compared frames belong to the same channel can be judged accurately.
 Further, in the present embodiment, the information of the partial regions V_u, V'_u with low local similarity Y_lu is likewise not taken into account when calculating the total similarity B_l. This is because, when the moving-object portion occupies a large share of the frame, judging the total similarity B_l from the information of the whole frame would yield a low total similarity B_l even between frames of the same channel. Therefore, to avoid misjudgment, the partial regions V_u, V'_u with low local similarity Y_lu, which are likely to contain much of the moving objects, are not used in calculating the total similarity B_l. As a result, the influence of moving objects is reduced, and whether the compared frames belong to the same channel can be judged even more accurately.
 <3. Third Embodiment>
 A third embodiment of the present invention will now be described. FIG. 13 shows the flow of the automatic classification process according to the third embodiment; the third embodiment differs from the first and second embodiments only in the automatic classification algorithm. As a comparison of FIG. 13 and FIG. 9 shows, the only difference between the automatic classification processes of the first and third embodiments is that in the third embodiment a subroutine S30 is inserted in place of steps S5 and S6. For simplicity, only this difference is described below.
 As noted above, in the first and second embodiments a correlation coefficient is calculated as the inter-frame similarity; in the third embodiment, the inter-frame similarity is evaluated based on the color tendencies of the frames. As the color tendencies of a frame, both the overall color tendency within the frame's screen and local color tendencies within the frame's screen are considered together. This is described concretely below.
 First, in the third embodiment, steps S1 to S4, which are the same as in the first embodiment, are executed, followed by the subroutine S30. The subroutine S30 is a process that evaluates various color indices indicating local or overall color tendencies within the frame's screen; its details are shown in FIG. 17. Specifically, steps S31 to S33 of the subroutine S30 consider density, steps S34 to S38 consider saturation, and steps S39 to S43 consider hue. Although density is considered in steps S31 to S33, lightness or luminance may be used in its place. Steps S44 to S46 of the subroutine S30 consider the overall impression of the frame.
 First, in step S31, the calculation unit 43 calculates the average density Hd of the entire comparison frame F_j. Here, "the entire comparison frame F_j" means the designated area if an area was designated on the area setting window W3, and the entire screen of the frame otherwise. The calculation unit 43 also calculates the average density Ed_k of each partial region D_k (k = 1, 2, ..., K) in the comparison frame F_j. The average density is calculated as the mean density of all pixels in the target area, and the density of each pixel is calculated as the mean of that pixel's RGB values. The calculation unit 43 then assigns each partial region D_k (k = 1, 2, ..., K) the value "1" if Ed_k > Hd and the value "0" if Ed_k <= Hd (see FIG. 14). These values "1" and "0" are indices indicating whether the density of the local region is relatively high or low with respect to the whole frame, that is, color indices representing the local color tendency with respect to density (brightness) (hereinafter, density indices).
 In the subsequent step S32, the calculation unit 43 performs the same processing as in step S31 on each of the reference frames G_1, G_2, ..., G_L of the existing channels CH_1, CH_2, ..., CH_L. That is, for each l (l = 1, 2, ..., L), the calculation unit 43 calculates a density index for each partial region D'_k (k = 1, 2, ..., K) in the reference frame G_l (see FIG. 14). As shown in FIG. 13 and FIG. 17, step S32 is executed repeatedly, so for any reference frame G_l whose density indices have already been calculated, those values are referenced and a fresh calculation is omitted.
 In the subsequent step S33, the calculation unit 43 calculates the similarity Bd_l between the comparison frame F_j and each reference frame G_l (l = 1, 2, ..., L) based on the density indices calculated in steps S31 and S32. Specifically, the calculation unit 43 calculates, as the similarity Bd_l, the similarity between the distribution pattern of the density indices within the comparison frame F_j and the distribution pattern of the density indices within each reference frame G_l. In the present embodiment, the number of combinations of partial regions D_k, D'_k whose "1"/"0" density indices do not match is counted, and the square of the value obtained by dividing this count by the number K of partial regions is taken as the similarity Bd_l (see FIG. 14). The division by K keeps the similarity Bd_l within the range 0 to 1, normalizing it. Note that the similarities calculated in step S33 and in steps S38 and S43, described later, take smaller values the more similar the frames are; in this sense, these similarities can also be regarded as dissimilarities.
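 Steps S31 to S33 might be sketched as follows, assuming RGB frames given as H x W x 3 NumPy arrays and assuming blocks_slices is a list of (y0, y1, x0, x1) bounds describing the block grid. Note that bd_similarity returns a value that is smaller the more similar the patterns are, as described above.

```python
import numpy as np

def density_pattern(frame, blocks_slices):
    """Steps S31/S32 (sketch): per-block binary density indices.

    Pixel density is the mean of the RGB values; a block gets 1 if its
    mean density exceeds the whole-frame mean density, else 0.
    """
    dens = frame.astype(np.float64).mean(axis=2)   # per-pixel density
    overall = dens.mean()
    return np.array([1 if dens[y0:y1, x0:x1].mean() > overall else 0
                     for (y0, y1, x0, x1) in blocks_slices])

def bd_similarity(pat_f, pat_g):
    """Step S33 (sketch): square of (mismatch count / K); behaves as a
    dissimilarity in [0, 1] -- smaller means more similar."""
    return float(((pat_f != pat_g).sum() / len(pat_f)) ** 2)
```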
 The composition of an image is determined by the arrangement of multiple elements within its screen. Simplified, many images have a composition that divides broadly into a region representing an object of interest and a region representing its background. That is, of the regions with density index "1" and those with density index "0", one can be regarded as representing the object of interest and the other as representing its background. On this view, the distribution pattern of the density indices within the comparison frame F_j represents the composition of the comparison frame F_j, and the distribution pattern of the density indices within the reference frame G_l represents the composition of the reference frame G_l. The similarity Bd_l can therefore be said to represent the similarity in composition between the comparison frame F_j and the reference frame G_l.
 In the subsequent step S34, the calculation unit 43 divides the comparison frame F_j into two partial regions R_1 and R_2 based on the distribution pattern of the density indices within the comparison frame F_j (see FIG. 15). The partial region R_1 is the union of all partial regions D_k whose density index is "1" within the comparison frame F_j, and the partial region R_2 is the union of all partial regions D_k whose density index is "0". That is, in step S34, the comparison frame F_j is divided, based on its composition, into a region representing the object of interest and a region representing its background.
 In the subsequent step S35, as in step S34, the calculation unit 43 divides each of the reference frames G_1, G_2, ..., G_L of the existing channels CH_1, CH_2, ..., CH_L into partial regions R'_1 and R'_2, based on the distribution pattern of the density indices in each reference frame (see FIG. 15). The partial region R'_1 is the union of all partial regions D'_k whose density index is "1" within the reference frame G_l, and the partial region R'_2 is the union of all partial regions D'_k whose density index is "0". That is, in step S35, the reference frame G_l is divided, based on its composition, into a region representing the object of interest and a region representing its background.
 In the subsequent step S36, the calculation unit 43 calculates three averages, one each for R, G, and B, for each of the partial regions R_1 and R_2 in the comparison frame F_j. The calculation unit 43 then calculates the saturation of the partial region R_1 from its three RGB averages and the saturation of the partial region R_2 from its three RGB averages. Next, the calculation unit 43 calculates the relative saturation between the partial regions R_1 and R_2 (hereinafter, the relative saturation). The relative saturation is calculated as the absolute value of the difference between the saturation of the partial region R_1 and the saturation of the partial region R_2. The relative saturation is a color index representing the color tendency with respect to saturation (hereinafter, the saturation index).
 In the subsequent step S37, the calculation unit 43 performs the same processing as in step S36 on each of the reference frames G_1, G_2, ..., G_L of the existing channels CH_1, CH_2, ..., CH_L. That is, for each l (l = 1, 2, ..., L), the calculation unit 43 calculates the saturation of each of the partial regions R'_1 and R'_2 in the reference frame G_l and calculates the relative saturation, the absolute value of their difference. As shown in FIG. 13 and FIG. 17, step S37 is executed repeatedly, so for any reference frame G_l whose relative saturation has already been calculated, that value is referenced and a fresh calculation is omitted. Step S35 can likewise be omitted in that case.
 In the subsequent step S38, the calculation unit 43 calculates the similarity Bs_l between the relative saturation of the comparison frame F_j calculated in step S36 and the relative saturation of each reference frame G_l (l = 1, 2, ..., L) calculated in step S37. In the present embodiment, the similarity Bs_l is calculated as the square of the value obtained by dividing the difference between the two relative saturations by 255. The division by 255 keeps the similarity Bs_l within the range 0 to 1, normalizing it.
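 Steps S36 to S38 might be sketched as follows. The embodiment does not spell out the saturation formula, so max(R, G, B) - min(R, G, B) of a region's average RGB values is used here as one common definition, and this is an assumption; the mask argument marks the region R_1 (density index "1"), its complement being R_2.

```python
import numpy as np

def relative_saturation(frame, mask):
    """Steps S36/S37 (sketch): relative saturation between R_1 and R_2.

    frame: H x W x 3 RGB array; mask: H x W boolean array marking R_1.
    Saturation of a region is taken as max - min of its average RGB values
    (assumed definition).
    """
    sats = []
    for m in (mask, ~mask):
        avg_rgb = frame[m].astype(np.float64).mean(axis=0)  # (R, G, B) means
        sats.append(avg_rgb.max() - avg_rgb.min())
    return abs(sats[0] - sats[1])

def bs_similarity(rel_f, rel_g):
    """Step S38: squared, 255-normalized difference of relative saturations."""
    return float(((rel_f - rel_g) / 255.0) ** 2)
```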
 In the subsequent step S39, the calculation unit 43 divides the comparison frame F_j into two regions, a main region O_1 and a sub region O_2. Specifically, of the partial regions R_1 and R_2 obtained in step S34, the larger region is set as the main region O_1 and the smaller as the sub region O_2 (see FIG. 16). That is, step S39 divides the comparison frame F_j, based on its composition, into a region representing the object of interest and a region representing its background.
 In the subsequent step S40, as in step S39, the calculation unit 43 divides each of the reference frames G_1, G_2, ..., G_L into two regions, a main region O'_1 and a sub region O'_2. Specifically, of the partial regions R'_1 and R'_2 obtained in step S35, the larger region is set as the main region O'_1 and the smaller as the sub region O'_2 (see FIG. 16). That is, in step S40, the reference frame G_l is divided, based on its composition, into a region representing the object of interest and a region representing its background.
 In the subsequent step S41, the calculation unit 43 calculates the average hue of the main region O_1 and the average hue of the sub region O_2. This average hue is a color index representing the local color tendency with respect to hue (hereinafter, the hue index), and is calculated as the mean hue of all pixels in the target area.
 In the subsequent step S42, the calculation unit 43 performs the same processing as in step S41 on each of the reference frames G_1, G_2, ..., G_L of the existing channels CH_1, CH_2, ..., CH_L. That is, for each l (l = 1, 2, ..., L), the calculation unit 43 calculates a hue index for each of the regions O'_1 and O'_2 in the reference frame G_l. As shown in FIG. 13 and FIG. 17, step S42 is executed repeatedly, so for any reference frame G_l whose hue indices have already been calculated, those values are referenced and a fresh calculation is omitted. Step S40 can likewise be omitted in that case.
 In the subsequent step S43, the calculation unit 43 calculates the similarity Bh_1l between the hue index of the main region O_1 in the comparison frame F_j and the hue index of the main region O'_1 in each reference frame G_l (l = 1, 2, ..., L). In the present embodiment, the similarity Bh_1l is calculated as the square of the value obtained by dividing the difference between the hue indices of the main regions O_1 and O'_1 by 180. The calculation unit 43 also calculates the similarity Bh_2l between the hue index of the sub region O_2 in the comparison frame F_j and the hue index of the sub region O'_2 in each reference frame G_l (l = 1, 2, ..., L). Specifically, the square of the value obtained by dividing the difference between the hue indices of the sub regions O_2 and O'_2 by 180 is taken as the similarity Bh_2l. The division by 180 keeps the similarities Bh_1l and Bh_2l within the range 0 to 1, normalizing them.
 The subsequent steps S44 to S46 evaluate a color index indicating the overall color tendency within the frame's screen. In the present embodiment, various impression degrees Z_1, Z_2, ..., Z_I (I is an integer of 2 or more) are defined in order to evaluate the overall color tendency within the frame's screen. The various pieces of information defining these impression degrees Z_1, Z_2, ..., Z_I are stored in the impression degree definition area 53 within the software management area 50.
 The impression degrees Z_i (i = 1, 2, ..., I) are as follows. Each impression degree Z_i is associated with one or more color conditions, and weights are defined for these color conditions. The weights of the color conditions associated with the same impression degree Z_i sum to 1. FIG. 18 shows the example of the impression degree Z_1, "natural", which is associated with the three color conditions "green", "brown", and "beige", with weights of 0.5, 0.25, and 0.25, respectively. As shown in FIG. 18, each color condition defines evaluation values over the values of density (lightness), saturation, and hue. Given a pixel, the value of each color condition is calculated by deriving the evaluation values corresponding to the pixel's density, saturation, and hue values and multiplying these evaluation values together. The value of the impression degree Z_i for that pixel is then calculated as the weighted sum of the values of its color conditions, using the weights above.
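 The per-pixel evaluation of an impression degree might be sketched as follows. The evaluation curves of FIG. 18 are defined graphically in the embodiment, so the caller is assumed to supply them as functions; the "green" condition shown is a crude hypothetical stand-in with assumed cutoffs, not the actual definition.

```python
def impression_value(pixel_dsh, conditions):
    """Per-pixel impression degree Z_i (sketch).

    pixel_dsh:  (density, saturation, hue) of one pixel.
    conditions: list of (weight, (f_d, f_s, f_h)) pairs, where each f_*
                maps the corresponding component to an evaluation value;
                the condition value is the product of the three evaluation
                values, and Z_i is the weighted sum over the conditions.
    """
    d, s, h = pixel_dsh
    return sum(w * f_d(d) * f_s(s) * f_h(h)
               for (w, (f_d, f_s, f_h)) in conditions)

# Hypothetical stand-in for the "green" condition of Z_1 "natural"
green = (0.5, (lambda d: 1.0,                              # any density
               lambda s: 1.0 if s > 30 else 0.0,           # assumed cutoff
               lambda h: 1.0 if 90 <= h <= 150 else 0.0))  # assumed hue band
```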
 In step S44, the calculation unit 43 calculates the value of each impression degree Z_i (i = 1, 2, ..., I) over the whole frame of the comparison frame F_j. Here, "the whole frame" means the designated area if an area was designated on the area setting window W3, and the entire screen of the frame otherwise. Specifically, for each i (i = 1, 2, ..., I), the calculation unit 43 calculates the value of the impression degree Z_i for each pixel contained in the whole frame of the comparison frame F_j, calculates the mean of these values, and takes that mean as the value of the impression degree Z_i over the whole frame. The value of the impression degree Z_i over the whole frame is a color index representing the whole-frame color tendency with respect to the impression degree Z_i (hereinafter, the impression index).
 In the subsequent step S45, the calculation unit 43 performs the same processing as in step S44 on each of the reference frames G_1, G_2, ..., G_L of the existing channels CH_1, CH_2, ..., CH_L. That is, for each l (l = 1, 2, ..., L), the calculation unit 43 calculates the impression index of the reference frame G_l for each of the impression degrees Z_1, Z_2, ..., Z_I. As shown in FIG. 13 and FIG. 17, step S45 is executed repeatedly, so for any reference frame G_l whose impression indices have already been calculated, those values are referenced and a fresh calculation is omitted.
 In the subsequent step S46, the calculation unit 43 calculates the whole-frame similarity Bi_l between the comparison frame F_j and each reference frame G_l (l = 1, 2, ..., L) based on the impression degrees Z_1, Z_2, ..., Z_I. Specifically, for each l (l = 1, 2, ..., L) and each i (i = 1, 2, ..., I), the calculation unit 43 calculates the square of the difference between the value of the impression degree Z_i over the whole frame of the comparison frame F_j and the value of the impression degree Z_i over the whole frame of the reference frame G_l. Then, for each l (l = 1, 2, ..., L), it calculates the value obtained by subtracting from 1 the square root of the sum of these I squares (the distance between the comparison frame F_j and the reference frame G_l in the I-dimensional impression degree space), and takes this as the similarity Bi_l.
 Next, in step S47, the calculation unit 43 calculates the similarity (hereinafter, the total similarity) B_l between the comparison frame F_j and each reference frame G_l (l = 1, 2, ..., L) based on the already calculated similarities Bd_l, Bs_l, Bh_1l, Bh_2l, and Bi_l (l = 1, 2, ..., L). Since the total similarities in the first and third embodiments both represent similarity over the whole frame, the same symbol B is used for both. In the present embodiment, for each l (l = 1, 2, ..., L), the calculation unit 43 calculates the value obtained by subtracting from 1 the square root of the sum of Bd_l, Bs_l, Bh_1l, and Bh_2l (the distance between the comparison frame F_j and the reference frame G_l in the four-dimensional space of these four indices) (the composition comparison result), and then multiplies this value by Bi_l (the impression comparison result) to obtain the total similarity B_l. In other embodiments, different methods may be adopted: for example, the composition comparison result and/or the impression comparison result may be multiplied by appropriate coefficients, or have appropriate values added or subtracted, before the two values are multiplied together. Alternatively, the total similarity B_l may be taken as the sum, with appropriate coefficients, of the square root of 1 - Bd_l, the square root of 1 - Bs_l, the square root of 1 - Bh_1l, the square root of 1 - Bh_2l, and Bi_l.
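 The combination described for the present embodiment might be sketched as follows; bd, bs, bh1, and bh2 stand for the dissimilarities Bd_l, Bs_l, Bh_1l, and Bh_2l, and bi for the impression comparison result Bi_l.

```python
import math

def overall_similarity(bd, bs, bh1, bh2, bi):
    """Step S47 (sketch): combine the composition-related dissimilarities
    into a composition comparison result (1 minus the distance in the
    4-dimensional index space) and multiply by the impression comparison
    result, as in the described embodiment."""
    composition = 1.0 - math.sqrt(bd + bs + bh1 + bh2)
    return composition * bi
```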
 Through the above step S47, the total similarities B_1, B_2, ..., B_L are calculated. The subsequent flow of processing is the same as in the first embodiment.
 <4. Applications>
 The image processing program 2 can handle image processing for a wide variety of moving images and can be used, for example, when an organization such as the police analyzes surveillance footage from security cameras to investigate an incident. For example, many security cameras may be installed in a single store, and their surveillance footage is often recorded in the form of a mixed time-lapse movie. In such a case, the individual channels contained in the mixed time-lapse movie can be separated using the dedicated equipment in the store where the security cameras are installed; the police and similar organizations, however, do not own the dedicated equipment for every model of every manufacturer, which makes playback difficult. The timeline separation process described above, which can separate a mixed time-lapse movie into per-channel movies regardless of the movie's specifications, can therefore be put to use in such situations.
 <5. Modifications>
 Although one embodiment of the present invention has been described above, the present invention is not limited to that embodiment, and various changes are possible without departing from its spirit. For example, the following changes can be made.
 <5-1>
 The reclassification window W4 in the above embodiment may be composed of a plurality of windows.
 <5-2>
 In the above embodiment, switching the selected channel switches the group of frames displayed in the frame list area C2 in units of channel or classification. However, a frame list area for displaying a list of frames may instead be provided for each of the channels and for the "unclassified" classification. In this case, to use the screen space efficiently, the frames are preferably displayed as thumbnails within each frame list area, and the thumbnail images are preferably arranged along the timeline. Also in this case, the channel objects 91 and the unclassified object 92 in icon form may be omitted, with each channel's frame list area used as the channel object for association operations and the frame list area for the "unclassified" classification used as the unclassified object for association operations. That is, it is possible to configure the system so that selecting an arbitrary frame in the "unclassified" frame list area and dragging and dropping it onto an arbitrary channel frame list area (channel object) classifies that frame into the channel corresponding to that area. The same applies when moving a frame contained in one channel to another channel or classification.
 Alternatively, a frame list area for listing the "unclassified" frames and a frame list area for listing the frames belonging to a channel may be provided, with the "unclassified" frames always displayed in the former area. In this case, only the frames belonging to the currently selected channel may be listed in the latter area.
 <5-3>
 In the third embodiment, various color indices are defined and combined in various ways to define the total similarity B_l, but the color indices may be defined in other ways, and the total similarity B_l may be defined by combining the color indices in other ways. For example, in the third embodiment the composition is determined based on the color index relating to density, but the composition may instead be determined based on color indices relating to hue and/or saturation. Also, although the similarity Bs_l is calculated based on the relative saturation between the partial regions R_1 and R_2, the absolute difference in saturation between the partial regions R_1 and R'_1, or between the partial regions R_2 and R'_2, may be used as the similarity Bs_l. Further, the difference in density between the partial regions R_1 and R'_1, or between the partial regions R_2 and R'_2, may be calculated as a similarity and factored into the total similarity B_l.
 <5-4>
 In the above embodiment, the area setting window W3 is displayed immediately after the channel separation process starts, so that areas can be set to avoid erroneous determinations. However, the area setting window W3 may be suppressed so that no such areas can be set. Alternatively, the area setting window W3 may not be displayed immediately after the channel separation process starts, while the reclassification window W4 provides a button for setting areas and ordering a redo of the automatic classification process. In this case, the user presses this button to display the area setting window W3 only when the result of the automatic classification performed without setting areas is unsatisfactory (contains many erroneous determinations); after setting the areas, the user can have the automatic classification process executed again. Alternatively, the area setting window W3 may be displayed immediately after the channel separation process starts so that the automatic classification process can be executed with areas set, while the above-described button is additionally provided on the reclassification window W4.
 <5-5>
 In step S9 of the above embodiment, the comparison frame Fj and the existing reference frame GMAX are combined by weighted averaging to produce the new reference frame GMAX. However, the method of updating the reference frame GMAX is not limited to this. For example, for each pixel, a histogram of the pixel values of all frames included in the channel CHMAX (including the comparison frame Fj at that point) may be created, and the most frequent pixel value may be adopted as the value of that pixel in the reference frame GMAX. Alternatively, the median pixel value of the histogram may be adopted as the value of that pixel in the reference frame GMAX.
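 As a rough sketch of the two histogram-based alternatives, assuming 8-bit grayscale frames for simplicity (the function name and the per-pixel 256-bin layout are illustrative; the embodiment's step S9 itself uses a weighted average):

```python
import numpy as np

def update_reference_frame(channel_frames, method="mode"):
    """Per-pixel reference frame from all frames of one channel.

    `channel_frames` is a list of H x W uint8 grayscale arrays (the
    frames currently assigned to CHMAX, including the comparison frame
    Fj). Illustrative only; a production version would vectorize the
    histogram more memory-efficiently.
    """
    stack = np.stack(channel_frames, axis=0)          # (N, H, W)
    if method == "median":
        return np.median(stack, axis=0).astype(np.uint8)
    # "mode": most frequent value per pixel, via a 256-bin histogram
    n, h, w = stack.shape
    flat = stack.reshape(n, -1)                       # (N, H*W) columns per pixel
    counts = np.apply_along_axis(np.bincount, 0, flat, minlength=256)
    return counts.argmax(axis=0).astype(np.uint8).reshape(h, w)
```

 One appeal of the mode or median over a weighted average is robustness: pixel values that appear in only a few frames barely influence the resulting reference frame.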
 <5-6>
 In the above embodiment, whether the number of channels L has reached a predetermined number may be determined immediately after step S10 or the like, and when it is determined to have reached that number, a correction process for existing channels similar to step S11 may be performed. This is because the processing load becomes excessive when the number of channels L grows too large. However, even when a channel containing only one frame exists, it is preferable not to delete that channel if its single frame is within a predetermined temporal distance of the currently selected comparison frame Fj: such a channel was created recently, and frames to be classified into it are likely to appear later. When deleting channels in this way alone does not keep the number of channels L from exceeding the predetermined value, channels containing two or more frames, and then three or more frames, may also be made targets of deletion in turn, so that the number of channels is kept at or below the predetermined number.
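 A pruning pass along these lines might look as follows; `max_channels`, `protect_window`, and the escalation over frame counts are illustrative stand-ins for the "predetermined" values in the text.

```python
from dataclasses import dataclass, field

@dataclass
class Channel:
    frames: list = field(default_factory=list)  # temporal indices of assigned frames

def prune_channels(channels, j, max_channels, protect_window=30):
    """Delete small channels until at most `max_channels` remain.

    `j` is the temporal index of the current comparison frame Fj. A
    single-frame channel whose frame lies within `protect_window`
    frames of Fj is kept, since it was likely created recently. Starts
    with single-frame channels and escalates to larger ones if needed.
    """
    size_limit = 1
    while len(channels) > max_channels:
        if size_limit > max(len(ch.frames) for ch in channels):
            break  # only protected channels remain; stop escalating
        channels = [
            ch for ch in channels
            if len(ch.frames) > size_limit
            or (len(ch.frames) == 1 and abs(ch.frames[0] - j) <= protect_window)
        ]
        size_limit += 1
    return channels
```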
DESCRIPTION OF SYMBOLS

1 image processing apparatus
2 image processing program
41 automatic classification unit
42 setting unit
43 calculation unit
44 determination unit
45 reclassification unit
91 channel object
92 unclassified object
C3 playback area
W4 reclassification window (reclassification screen)

Claims (7)

  1.  An image processing apparatus for classifying frames by channel when frames belonging to different channels are mixed in a single moving image, the apparatus comprising:
     an automatic classification unit that calculates a similarity between frames based on a local color tendency and/or an overall color tendency by performing image processing on a plurality of frames included in the moving image, and classifies the plurality of frames into a plurality of channels according to the similarity.
  2.  The image processing apparatus according to claim 1, wherein the automatic classification unit includes:
     a setting unit that sets, as a reference frame, a specific frame included in the moving image or a frame obtained by combining two or more specific frames included in the moving image, and sets another specific frame included in the moving image as a comparison frame;
     a calculation unit that divides the comparison frame and the reference frame into a plurality of partial areas, calculates a first color index for each of the partial areas, derives a composition of the comparison frame and a composition of the reference frame based on the first color indices, and calculates the similarity between the composition of the comparison frame and the composition of the reference frame; and
     a determination unit that determines, based on the similarity, whether or not the comparison frame belongs to the same channel as the reference frame or the frames from which the reference frame was combined.
  3.  The image processing apparatus according to claim 1, wherein the automatic classification unit includes:
     a setting unit that sets, as a reference frame, a specific frame included in the moving image or a frame obtained by combining two or more specific frames included in the moving image, and sets another specific frame included in the moving image as a comparison frame;
     a calculation unit that divides the comparison frame and the reference frame into a plurality of partial areas, calculates a first color index for each of the partial areas, derives a composition of the comparison frame and a composition of the reference frame based on the first color indices, divides the comparison frame and the reference frame into a new plurality of partial areas based on the compositions, calculates, for each of the comparison frame and the reference frame, a second color index in at least one of the new partial areas, and calculates the similarity between the second color index of the comparison frame and the second color index of the reference frame; and
     a determination unit that determines, based on the similarity, whether or not the comparison frame belongs to the same channel as the reference frame or the frames from which the reference frame was combined.
  4.  The image processing apparatus according to claim 1, wherein the automatic classification unit includes:
     a setting unit that sets, as a reference frame, a specific frame included in the moving image or a frame obtained by combining two or more specific frames included in the moving image, and sets another specific frame included in the moving image as a comparison frame;
     a calculation unit that divides the comparison frame and the reference frame into a plurality of partial areas, calculates a first color index for each of the partial areas, derives a composition of the comparison frame and a composition of the reference frame based on the first color indices, divides the comparison frame and the reference frame into a new plurality of partial areas based on the compositions, calculates, for each of the comparison frame and the reference frame, a second color index in at least two of the new partial areas, calculates, for each of the comparison frame and the reference frame, a third color index that is a difference between the second color indices, and calculates the similarity between the third color index of the comparison frame and the third color index of the reference frame; and
     a determination unit that determines, based on the similarity, whether or not the comparison frame belongs to the same channel as the reference frame or the frames from which the reference frame was combined.
  5.  The image processing apparatus according to any one of claims 1 to 4, wherein the automatic classification unit evaluates the color tendencies based on at least one of density, brightness, luminance, saturation, and hue.
  6.  An image processing program for classifying frames by channel when frames belonging to different channels are mixed in a single moving image, the program causing a computer to execute:
     a step of calculating a similarity between frames relating to a local color tendency and/or an overall color tendency by performing image processing on a plurality of frames included in the moving image, and classifying the plurality of frames into a plurality of channels according to the similarity.
  7.  An image processing method for classifying frames by channel when frames belonging to different channels are mixed in a single moving image, the method comprising:
     a step of calculating a similarity between frames relating to a local color tendency and/or an overall color tendency by performing image processing on a plurality of frames included in the moving image, and classifying the plurality of frames into a plurality of channels according to the similarity.
PCT/JP2015/051364 2014-03-27 2015-01-20 Image processing device WO2015146242A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-066866 2014-03-27
JP2014066866A JP2017118155A (en) 2014-03-27 2014-03-27 Apparatus and program for image processing

Publications (1)

Publication Number Publication Date
WO2015146242A1 true WO2015146242A1 (en) 2015-10-01

Family

ID=54194784

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/051364 WO2015146242A1 (en) 2014-03-27 2015-01-20 Image processing device

Country Status (2)

Country Link
JP (1) JP2017118155A (en)
WO (1) WO2015146242A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015146243A1 (en) * 2014-03-27 2015-10-01 Nkワークス株式会社 Image processing device

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000090239A (en) * 1998-09-10 2000-03-31 Matsushita Electric Ind Co Ltd Image retrieving device
JP2001036867A (en) * 1999-07-23 2001-02-09 Gen Tec:Kk Synchronous display device for plural channel moving images
JP2002319210A (en) * 2002-02-04 2002-10-31 Hitachi Ltd Recording and reproducing device, and reproducing device
JP2004094379A (en) * 2002-08-29 2004-03-25 Ntt Comware Corp Similar image retrieval device
JP2004254244A (en) * 2003-02-21 2004-09-09 Matsushita Electric Ind Co Ltd Monitoring camera system and its image recording and reproducing method
JP2013157874A (en) * 2012-01-31 2013-08-15 Nk Works Kk Image processing program and image processing device
JP2013239205A (en) * 2013-08-20 2013-11-28 Glory Ltd Image processing method

Also Published As

Publication number Publication date
JP2017118155A (en) 2017-06-29

Similar Documents

Publication Publication Date Title
WO2015146241A1 (en) Image processing device
JP6485452B2 (en) Image processing device
JP4840426B2 (en) Electronic device, blurred image selection method and program
KR101747511B1 (en) Image processing device and image processing method
JP7286392B2 (en) Image processing device, image processing method, and program
JP6009481B2 (en) Image processing apparatus, important person determination method, image layout method, program, and recording medium
JP6279825B2 (en) Image processing apparatus, image processing method, program, and imaging apparatus
KR101942987B1 (en) Method, system for removing background of a video, and a computer-readable storage device
US11627227B2 (en) Image processing apparatus, image processing method, and storage medium
JP2011254240A (en) Image processing device, image processing method and program
US8861889B2 (en) Image processing device, method and program for extracting still image data from moving image data
CN108377351A (en) Image processing apparatus and image processing method for the laying out images in template
US8630527B2 (en) Image editing apparatus and method for controlling the same, and storage medium storing program
JP2015011585A (en) Image processing apparatus, image forming apparatus, image forming system, image processing method, and program
JP2006259788A (en) Image output device
US20100067787A1 (en) Image processing apparatus
US20210090312A1 (en) Image processing apparatus, image processing method, image processing program, and recording medium storing image processing program
WO2015146242A1 (en) Image processing device
JP2015022351A (en) Image processing program and image processing apparatus
JP6948787B2 (en) Information processing equipment, methods and programs
JP6904695B2 (en) Image processing device, image processing method, and program
JP5385059B2 (en) Image display method, program, image display apparatus, and imaging apparatus
JP6334584B2 (en) Display control apparatus, display control method, and program
JP5232107B2 (en) Image display method, program, image display apparatus, and imaging apparatus
JP2006172090A (en) Representative image selecting device and method and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15770379

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15770379

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP