WO2006046321A1 - Apparatus, method, and program product for reproducing or recording moving image - Google Patents

Apparatus, method, and program product for reproducing or recording moving image

Info

Publication number
WO2006046321A1
WO2006046321A1 PCT/JP2005/003551
Authority
WO
WIPO (PCT)
Prior art keywords
image data
identification object
central processing
object image
data
Prior art date
Application number
PCT/JP2005/003551
Other languages
French (fr)
Inventor
Shinya Satoh
Original Assignee
Earth Beat, Inc.
Priority date
Filing date
Publication date
Application filed by Earth Beat, Inc. filed Critical Earth Beat, Inc.
Publication of WO2006046321A1 publication Critical patent/WO2006046321A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/79Processing of colour television signals in connection with recording
    • H04N9/80Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/804Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components
    • H04N9/8042Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components involving data reduction
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/19Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B27/28Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/78Television signal recording using magnetic recording
    • H04N5/781Television signal recording using magnetic recording on disks or drums
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/84Television signal recording using optical recording
    • H04N5/85Television signal recording using optical recording on discs or drums
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/907Television signal recording using static stores, e.g. storage tubes or semiconductor memories

Definitions

  • the present invention generally relates to an apparatus, a method, and a program product for reproducing or recording a moving image. More specifically, the invention relates to an apparatus, a method, and a program product for identifying images forming a moving image and reproducing the moving image from a predetermined image. The invention also relates to an apparatus, a method, and a program product for identifying images forming a moving image and storing the moving image together with a reproduction time corresponding to a predetermined image.
  • a program for reproducing and editing moving image data, which is recorded in a recording medium such as an HD or a DVD, can cause a processor to display a reproduction button, a rewind button, a fast-forward button, a suspend button, a stop button, a chapter generation button, and a chapter selection button on a display, to reproduce the moving image data on the display, and to store a reproduction time corresponding to suspended moving image data as chapter data.
  • each of an apparatus for reproducing and editing moving image data and a remote controller comprises a reproduction button, a rewind button, a fast-forward button, a suspension button, a stop button, a chapter reproduction button, and a chapter selection button, and can reproduce moving image data on a television and store a reproduction time corresponding to suspended moving image data as chapter data.
  • Figure 1 shows an example of a GUI that is displayed on a display.
  • reference numerals 1, 2, 3, 4, 5, 9, and 10 correspond to a reproduction button, a rewind button, a fast-forward button, a suspension button, a stop button, a chapter generation button, and a chapter selection button, respectively.
  • a pointer (not shown) is also displayed on the display.
  • the user can search for a desired image (scene) and reproduce a moving image from the image by clicking the rewind button 2 or the fast-forward button 3, moving the reproduction time button 7 or clicking a certain position in the total reproduction time area 8.
  • the user can generate chapter data, which makes it possible to reproduce moving image data from the desired image, by clicking the suspension button 4 and further clicking the chapter generation button 9.
  • a screen of a television (not shown) displays the moving image display area 6 in Figure 1.
  • the user presses the reproduction button 1 of an apparatus for reproducing and editing moving image data or a remote controller therefor, whereby reproduction of moving image data is started.
  • an image is displayed on the screen of the television.
  • the user can search for a desired image (scene) and reproduce a moving image from the image by pressing the rewind button 2 or the fast-forward button 3.
  • the desired image (scene) is reproduced, the user can generate chapter data, which makes it possible to reproduce the moving image data from the desired image, by pressing the suspension button 4 and further pressing the chapter generation button 9.
  • An apparatus which reproduces a moving image in accordance with the present invention, comprises central processing means (21), recording medium control means (23) that controls a recording medium (22), input means control means (25) that controls input means (24), and display means control means (27) that controls display means (26).
  • the central processing means (21) inputs a moving image data portion, which corresponds to a first reproduction time, of moving image data stored in the recording medium (22) from the recording medium control means (23), decodes the moving image data portion, and extracts image data corresponding to the first reproduction time as reference image data.
  • the central processing means (21) outputs control data for displaying the reference image data on the display means (26) as first representative image data to the display means control means (27).
  • the central processing means (21) inputs a moving image data portion, which corresponds to a second reproduction time which is different from the first reproduction time, of the moving image data stored in the recording medium (22) from the recording medium control means (23), decodes the moving image data portion, and extracts image data corresponding to the second reproduction time as identification object image data.
  • the central processing means (21) determines whether the reference image data and the identification object image data are different and, when the reference image data and the identification object image data are different, outputs control data for displaying the identification object image data on the display means (26) as second representative image data to the display means control means (27).
  • the central processing means (21) inputs operation data, which indicates that one representative image data of the first or the second representative image data displayed on the display means (26) is selected by the input means (24), from the input means control means (25), reproduces moving image data from a reproduction time corresponding to the selected representative image data, and outputs control data for displaying a moving image under reproduction on the display means (26) to the display means control means (27).
  • the central processing means (21) sets the identification object image data as previous identification object image data and determines whether the reference image data and the previous identification object image data are similar.
  • the central processing means (21) inputs a moving image data portion, which corresponds to a third reproduction time which is different from the first and the second reproduction times, of the moving image data stored in the recording medium (22) from the recording medium control means (23), decodes the moving image data portion, and extracts image data corresponding to the third reproduction time as present identification object image data.
  • the central processing means (21) determines whether the reference image data and the present identification object image data are similar.
  • the central processing means (21) sets the present identification object image data as new reference image data.
  • the central processing means (21) calculates a fitness value indicating a degree of coincidence of the reference image data and the identification object image data, when the calculated fitness value is smaller than a first fitness value, determines that the reference image data and the identification object image data are different, and when the calculated fitness value is smaller than a second fitness value which is larger than the first fitness value, determines that the reference image data and the identification object image data are similar.
  • the central processing means (21) sets the identification object image data as previous identification object image data and generates previous class data that indicates that the reference image data and the previous identification object image data are identical, similar, or partially similar.
  • the central processing means (21) inputs a moving image data portion, which corresponds to a third reproduction time which is different from the first and the second reproduction times, of moving image data stored in the recording medium (22) from the recording medium control means (23), decodes the moving image data portion, and extracts image data corresponding to the third reproduction time as present identification object image data.
  • When the reference image data and the present identification object image data are not different, the central processing means (21) generates present class data that indicates that the reference image data and the present identification object image data are identical, similar, or partially similar. When the previous class data and the present class data are different, the central processing means (21) sets the present identification object image data as new reference image data.
  • the central processing means (21) calculates a fitness value indicating a degree of coincidence of the reference image data and the identification object image data, when the calculated fitness value is smaller than the first fitness value, determines that the reference image data and the identification object image data are different, when the calculated fitness value is smaller than the second fitness value which is larger than the first fitness value, determines that the reference image data and the identification object image data are partially similar, when the calculated fitness value is smaller than a third fitness value which is larger than the second fitness value, determines that the reference image data and the identification object image data are similar, and when the calculated fitness value is larger than the third fitness value, determines that the reference image data and the identification object image data are identical.

(Scene search mode)
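The four-way threshold test above can be sketched in Python; the concrete threshold values and the one-argument fitness interface are illustrative assumptions, not taken from the claims.

```python
def classify(fitness, first=0.5, second=0.7, third=0.9):
    """Map a fitness value (degree of coincidence between the reference
    image data and the identification object image data) to a class.
    The three threshold values are hypothetical placeholders."""
    if fitness < first:
        return "different"
    if fitness < second:
        return "partially similar"
    if fitness < third:
        return "similar"
    return "identical"
```

With this ordering, raising the fitness value moves an image pair from "different" through "partially similar" and "similar" to "identical", matching the nesting of the three thresholds.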
  • the central processing means (21) inputs a moving image data portion, which corresponds to a first reproduction time, of moving image data stored in the recording medium (22) from the recording medium control means (23), decodes the moving image data portion, and extracts image data corresponding to the first reproduction time as identification object image data.
  • the central processing means (21) determines whether the reference image data and the identification object image data are different and, when the reference image data and the identification object image data are not different, outputs control data for displaying the identification object image data on the display means (26) to the display means control means (27) as first representative image data.
  • the central processing means (21) inputs operation data, which indicates that the first representative image data displayed on the display means (26) is selected by the input means (24), from the input means control means (25), reproduces moving image data from a reproduction time corresponding to the selected representative image data, and outputs control data for displaying a moving image under reproduction on the display means (26) to the display means control means (27).
  • the central processing means (21) sets the identification object image data as previous identification object image data and determines whether the reference image data and the previous identification object image data are similar.
  • the central processing means (21) inputs a moving image data portion, which corresponds to a second reproduction time which is different from the first reproduction time, of the moving image data stored in the recording medium (22) from the recording medium control means (23), decodes the moving image data portion, and extracts image data corresponding to the second reproduction time as present identification object image data.
  • the central processing means (21) determines whether the reference image data and the present identification object image data are similar.
  • When the reference image data and the previous identification object image data are not similar and the reference image data and the present identification object image data are similar, the central processing means (21) outputs control data for displaying the present identification object image data on the display means (26) as second representative image data to the display means control means (27).
  • When the reference image data and the previous identification object image data are similar, and the reference image data and the present identification object image data are not similar, the central processing means (21) outputs control data for displaying the present identification object image data on the display means (26) as the second representative image data to the display means control means (27).
  • the central processing means (21) inputs operation data, which indicates that the second representative image data displayed on the display means (26) is selected by the input means (24), from the input means control means (25), reproduces moving image data from a reproduction time corresponding to the selected representative image data, and outputs control data for displaying a moving image under reproduction on the display means (26) to the display means control means (27).
  • the central processing means (21) sets the identification object image data as previous identification object image data and generates previous class data that indicates that the reference image data and the previous identification object image data are identical, similar, or partially similar.
  • the central processing means (21) inputs a moving image data portion, which corresponds to a second reproduction time which is different from the first reproduction time, of the moving image data stored in the recording medium (22) from the recording medium control means (23), decodes the moving image data portion, and extracts image data corresponding to the second reproduction time as present identification object image data.
  • When the reference image data and the present identification object image data are not different, the central processing means (21) generates present class data that indicates that the reference image data and the present identification object image data are identical, similar, or partially similar. When the previous class data and the present class data are different, the central processing means (21) outputs control data for displaying the present identification object image data on the display means (26) as the second representative image data to the display means control means (27).
  • the central processing means (21) inputs operation data, which indicates that the second representative image data displayed on the display means (26) is selected by the input means (24), from the input means control means (25), reproduces moving image data from a reproduction time corresponding to the selected representative image data, and outputs control data for displaying a moving image under reproduction on the display means (26) to the display means control means (27).
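The two transition rules above (not similar to similar, and similar to not similar) amount to scanning the extracted frames and reporting each point where similarity to the reference image toggles. A minimal sketch; the `is_similar` predicate stands in for the fitness-value comparison and is an assumption of this sketch.

```python
def scene_boundaries(reference, frames, times, is_similar):
    """Return the reproduction times at which similarity to the reference
    image toggles; each such frame would be displayed as representative
    image data. `is_similar` is a placeholder for the fitness comparison."""
    boundaries = []
    prev = None
    for img, t in zip(frames, times):
        cur = is_similar(reference, img)
        if prev is not None and cur != prev:
            boundaries.append(t)  # previous and present similarity differ
        prev = cur
    return boundaries

# toy run with single-letter "images": similarity to "B" toggles at times 2 and 4
print(scene_boundaries("B", list("AABBA"), [0, 1, 2, 3, 4],
                       lambda r, i: r == i))  # → [2, 4]
```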
  • An apparatus which stores a moving image in accordance with the invention, comprises the central processing means (21), the recording medium control means (23) that controls the recording medium (22), and video signal output means control means (82) that controls video signal output means (81).
  • the central processing means (21) inputs a video signal, which is outputted from the video signal output means (81), from the video signal output means control means (82), encodes the video signal, and outputs control data for storing the moving image data in the recording medium (22) to the recording medium control means (23).
  • the central processing means (21) extracts image data from a video signal corresponding to a first storage time as reference image data.
  • the central processing means (21) sets the first storage time as a first reproduction time.
  • the central processing means (21) extracts image data from a video signal, which corresponds to a second storage time which is different from the first storage time, as identification object image data.
  • the central processing means (21) determines whether the reference image data and the identification object image data are different and, when the reference image data and the identification object image data are different, sets the second storage time as a second reproduction time.
  • the central processing means (21) outputs control data for storing the first and the second reproduction times in the recording medium (22) as chapter data to the recording medium control means (23).
  • the central processing means (21) sets the identification object image data as previous identification object image data and generates previous class data that indicates that the reference image data and the previous identification object image data are identical, similar, or partially similar.
  • the central processing means (21) extracts image data from a video signal, which corresponds to a third storage time which is different from the first and the second storage times, as present identification object image data.
  • the central processing means (21) generates present class data that indicates that the reference image data and the present identification object image data are identical, similar, or partially similar.
  • the central processing means (21) sets the present identification object image data as new reference image data.
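On the recording side, the behavior above amounts to marking a chapter whenever an incoming frame differs from the current reference image. A minimal sketch, assuming the differing frame then becomes the new reference; the `is_different` predicate is a placeholder for the fitness-value comparison.

```python
def chapter_times(frames, storage_times, is_different):
    """Return the storage times to record as chapter data (reproduction
    times). The first storage time is always a chapter; a new chapter is
    added whenever a frame differs from the reference image."""
    reference = frames[0]
    chapters = [storage_times[0]]  # first storage time -> first reproduction time
    for img, t in zip(frames[1:], storage_times[1:]):
        if is_different(reference, img):
            chapters.append(t)     # second (third, ...) reproduction time
            reference = img        # assumption: the new scene becomes the reference
    return chapters

# toy run with single-letter "images": the scene changes at times 2 and 4
print(chapter_times(list("AABBC"), [0, 1, 2, 3, 4],
                    lambda r, i: r != i))  # → [0, 2, 4]
```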
  • a method and a program product, which reproduce a moving image in accordance with the invention are incorporated in the apparatus that reproduces a moving image in accordance with the invention.
  • a method and a program product, which store a moving image in accordance with the invention are incorporated in the apparatus that stores a moving image in accordance with the invention.
  • Figure 1 shows an example of a conventional GUI that is displayed on a display
  • Figure 2 shows a functional block diagram of an apparatus that reproduces a moving image in accordance with the invention
  • Figure 3 shows an example of a GUI in accordance with the invention that is displayed on display means
  • Figure 4 shows a functional block diagram of input means that remotely controls the apparatus that reproduces a moving image in accordance with the invention
  • Figure 5 represents images forming one moving image data and images obtained by grouping the moving image data by time division in accordance with the invention
  • Figure 6 shows an example in which representative image data of grouped images (scenes) 1 to Q shown in Figure 5 are displayed in a display area 38;
  • Figure 7 represents images forming a part of one moving image data and images grouped by a scene cut mode in accordance with the invention
  • Figure 8 shows an example in which representative image data of grouped images (scenes) 1 to M shown in Figure 7 are displayed in the display area 38;
  • Figure 9 is a diagram for explaining planarization processing for a histogram distribution;
  • Figure 10 is a diagram for explaining an image model
  • Figure 11 represents images forming one moving image data, a suspended image (reference image), and images grouped by a scene search mode in accordance with the invention
  • Figure 12 shows an example in which suspended image data and a representative image of grouped images (scenes) 1 to T shown in Figure 11 are displayed in the display area 38;
  • Figure 13 represents images forming one moving image data, a suspended image and images before and after the image (one set of reference image data), and images grouped by the scene search mode in accordance with the invention
  • Figure 14 shows a functional block diagram of an apparatus that stores a moving image in accordance with the invention
  • Figure 15 shows an example of a GUI in accordance with the invention that is displayed on display means.
  • Figure 16 represents images forming one moving image data and a representative image of images grouped by a chapter mode in accordance with the invention.
  • FIG. 2 shows a functional block diagram of an apparatus that reproduces a moving image in accordance with the invention.
  • An apparatus 20, which reproduces a moving image in accordance with the invention comprises central processing means 21, recording medium control means 23 that controls a recording medium 22, input means control means 25 that controls input means 24, display means control means 27 that controls display means 26, and a storage device 28.
  • the apparatus 20 for reproducing a moving image is, for example, a computer, an HD player, or a DVD player.
  • the central processing means 21 is, for example, a CPU, an MPU, or a DSP.
  • the recording medium 22 is, for example, an HD, a DVD, or a semiconductor memory. Moving image data is stored in the recording medium 22. A format of moving image data is, for example, MPEG, VOB, DV, or M-JPEG.
  • the recording medium control means 23 is, for example, a driver for the recording medium 22 or an interface between the recording medium 22 and the central processing means 21.
  • the input means 24 is, for example, a mouse, a keyboard, a button, or a remote controller.
  • the input means control means 25 is, for example, an interface between the input means 24 and the central processing means 21.
  • the display means 26 is, for example, a display or a television.
  • the display means control means 27 is, for example, a driver for the display means 26 or an interface between the display means 26 and the central processing means 21.
  • the storage device 28 is, for example, a semiconductor memory.
  • a program for reproducing an image in accordance with the invention is stored in the storage device 28. Note that this program may be stored in the recording medium 22.
  • a computer program product comprises this program stored on a medium usable by the central processing means 21, such as a semiconductor memory, an HD, a DVD, an FD, an MO disk, or a CD.
  • the central processing means 21 outputs control data for displaying a scene cut button 30, a scene search button 31, a reproduction button 32, a stop button 33, a return button 34, a suspension button 35, an image selection button 36, a pointer 37, a display area 38, and a total reproduction time area 39 on the display means 26 to the display means control means 27 in accordance with the program.
  • the display means 26 inputs control data from the display means control means 27 and displays the scene cut button 30, the scene search button 31, the reproduction button 32, the stop button 33, the return button 34, the suspension button 35, the image selection button 36, the pointer 37, the display area 38, and the total reproduction time area 39.
  • Figure 3 shows an example of a GUI to be displayed on the display means 26.
  • the input means 24 (e.g. a mouse) inputs operation by a user for moving the pointer 37 displayed on the display means 26 and outputs operation data for moving the pointer 37 to the input means control means 25.
  • the central processing means 21 inputs the operation data from the input means control means 25, converts the operation data into control data for moving the pointer 37 on the display means 26, and outputs the control data to the display means control means 27 in accordance with the program.
  • the display means 26 inputs the control data from the display means control means 27 and moves a display position of the pointer 37.
  • the input means 24 inputs operation by the user for clicking the pointer 37 displayed on the display means 26 and outputs operation data for clicking the pointer 37 to the input means control means 25.
  • the central processing means 21 inputs the operation data from the input means control means 25 and calculates a display position of the pointer 37 displayed on the display means 26 in accordance with the program.
  • when the calculated display position corresponds to the scene cut button 30, the central processing means 21 determines that clicking of the scene cut button 30 is effective and executes a scene cut mode which is explained below. Note that instead of displaying the buttons 30, 31, 32, 33, 34, 35, and 36 and the pointer 37 with the display means 26, the input means 24 can comprise those buttons 30, 31, 32, 33, 34, 35, and 36.
  • FIG. 4 shows a functional block diagram of input means that remotely controls the apparatus that reproduces a moving image in accordance with the invention.
  • the input means 24 is, for example, the remote controller.
  • the central processing means 21 inputs the operation data from the input means control means 25, determines that pressing of the scene cut button 30 is effective, and executes the scene cut mode to be described below in accordance with the program.
  • Figure 5 represents images forming data of one moving image and images obtained by grouping the moving image data according to time division. As shown in Figure 5, the moving image data includes P images (frames) corresponding to a total reproduction time.
  • Figure 6 shows an example in which representative image data of grouped images (scenes) 1 to Q shown in Figure 5 is displayed in the display area 38.
  • the central processing means 21 outputs control data for extracting total reproduction time data of the moving image data stored in the recording medium 22, to the recording medium control means 23, in accordance with the program.
  • the recording medium control means 23 inputs the control data from the central processing means 21, reads out the total reproduction time data from the moving image data, and outputs the total reproduction time data to the central processing means 21.
  • the central processing means 21 stores the total reproduction time data in the storage device 28.
  • the central processing means 21 sets a maximum number of pieces of representative image data to be displayed in the display area 38 to, for example, twelve and stores this maximum number in the storage device 28.
  • the central processing means 21 divides the total reproduction time data by the maximum number of pieces and stores the value obtained by the division (total reproduction time/maximum number of pieces) in the storage device 28 as first predetermined time interval data.
  • the total reproduction time is, for example, 2 hours (120 minutes)
  • the first predetermined time interval is, for example, 10 minutes (2 hours/12).
  • the central processing means 21 generates one set of reproduction time data, which starts from a reproduction time 0, increases at the first predetermined time interval, and is smaller than the total reproduction time in accordance with the program.
  • the one set of reproduction times is constituted by first to twelfth reproduction times
  • the first reproduction time is 0 minutes
  • the second reproduction time is 10 minutes
  • the third reproduction time is 20 minutes
  • the fourth reproduction time is 30 minutes
  • the fifth reproduction time is 40 minutes
  • the sixth reproduction time is 50 minutes
  • the seventh reproduction time is 60 minutes
  • the eighth reproduction time is 70 minutes
  • the ninth reproduction time is 80 minutes
  • the tenth reproduction time is 90 minutes
  • the eleventh reproduction time is 100 minutes
  • the twelfth reproduction time is 110 minutes.
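The arithmetic above — one set of reproduction times starting at 0, spaced by the total reproduction time divided by the maximum number of pieces, all smaller than the total — can be sketched as follows (whole-minute intervals are assumed for simplicity):

```python
def reproduction_times(total_minutes, max_pieces=12):
    """Generate one set of reproduction times starting from 0, increasing
    by the first predetermined time interval, and smaller than the total
    reproduction time."""
    interval = max(1, total_minutes // max_pieces)  # e.g., 120 minutes / 12 = 10 minutes
    return list(range(0, total_minutes, interval))[:max_pieces]

print(reproduction_times(120))  # → [0, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 110]
```

For the 2-hour example in the text this yields exactly the first to twelfth reproduction times listed above.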
  • the central processing means 21 outputs control data for extracting an image corresponding to the one set of reproduction times to the recording medium control means 23
  • the recording medium control means 23 inputs the control data from the central processing means 21, reads out one set of moving image data portions corresponding to the one set of reproduction times, and outputs the one set of moving image data portions to the central processing means 21.
  • the central processing means 21 decodes the one set of moving image data portions and extracts one set of image data (e.g., 720 pixels x 480 pixels) having color information.
  • the central processing means 21 stores the one set of image data having color information in the storage device 28.
  • Figure 5 represents images extracted first to Qth in this way.
  • the central processing means 21 recognizes images belonging to reproduction times satisfying this first condition as one group.
  • a reproduction time e.g., the first reproduction time (0 minutes)
  • a reproduction time e.g., the second reproduction time (10 minutes)
  • the central processing means 21 recognizes images belonging to reproduction times satisfying this second condition as one group (e.g., images grouped Qth) .
  • Figure 5 represents images grouped first to Qth in this way.
  • the central processing means 21 determines images that represent the images first to Qth, respectively.
  • the central processing means 21 determines, for example, the images extracted first to Qth as images representing the images grouped first to Qth.
  • the central processing means 21 may newly extract image data corresponding to a reproduction time larger than the first reproduction time and smaller than the reproduction time corresponding to the image extracted second and determine the image as an image representing images grouped first.
  • the central processing means 21 changes a size (e.g., 720 pixels x 480 pixels) of an image (image having color information) representing grouped images to a size suitable for icon display (e.g., 120 pixels x 80 pixels) and stores the image in the storage means 28 as representative image data for icon display.
  • the central processing means 21 outputs control data for displaying first to Qth representative image data for icon display in the display area on the display means 26 to the display means control means 27.
  • the display means 26 inputs the control data from the display means control means 27 and displays the first to the Qth representative image data for icon display in the display area 38 ( Figure 6).
  • the central processing means 21 can automatically select one representative image data for icon display (e.g., first representative image data for icon display).
  • the user can select one representative image data for icon display (e.g., first representative image data for icon display).
  • a reproduction time button 61 which corresponds to a reproduction time (e.g., the first reproduction time (0 minutes)) corresponding to the representative image data, is displayed in the total reproduction time area 39.
  • a reproduction time period (e.g., first reproduction time ≤ reproduction time period < second reproduction time) 62, which corresponds to grouped images (e.g., the images grouped first) corresponding to the selected representative image data, is displayed in the total reproduction time area 39.
  • the central processing means 21 reproduces moving image data for a reproduction time period corresponding to the selected representative image or from a reproduction time corresponding to the selected representative image to the total reproduction time.
  • the display means 26 displays an image under reproduction on the entire display area 38.
  • the central processing means 21 can automatically select the selected representative image data (e.g., the first representative image data for icon display) again.
  • the user can also select another representative image data for icon display (e.g., the second representative image data for icon display).
  • the central processing means 21 groups images according to the scene cut mode. In the following explanation, it is assumed that the second representative image for icon display is selected.
  • Figure 7 represents images forming a part of one moving image data and images grouped by the scene cut mode. As shown in Figure 7, the images include J images (frames) corresponding to a reproduction time period.
  • Figure 8 shows an example in which representative image data of grouped images (scenes) 1 to M shown in Figure 7 are displayed in the display area 38. ((Initial setting))
  • the central processing means 21 stores one set of reproduction time (e.g., first to twelfth reproduction times) data in the storage means 28 as a previous one set of reproduction time (previous first to twelfth reproduction times) data.
  • the first to the Qth representative image data for icon display are stored as previous first to Qth representative image data for icon display.
  • first to Qth reproduction time period data are stored as previous first to Qth reproduction time period data.
  • the central processing means 21 divides the previous second reproduction time period data, corresponding to the representative image data selected by the user, by the maximum number of pieces data, divides the obtained value (previous second reproduction time period/maximum number of pieces) by a predetermined value (e.g., 4), and stores the resulting value (previous second reproduction time period/maximum number of pieces/4) in the storage means 28 as second predetermined time interval data.
  • the second predetermined time interval is, for example, 12.5 seconds (10 minutes/12/4) . It is to be noted that the central processing means 21 may change the maximum number of pieces of representative image data to be displayed in the display area 38.
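The interval arithmetic above, using the example numbers from the text (a 10-minute previous reproduction time period, a maximum of 12 representative images, and the predetermined value 4), can be sketched as:

```python
# Second predetermined time interval = previous period / max pieces / 4.
previous_period = 10 * 60   # previous second reproduction time period, in seconds
max_pieces = 12             # maximum number of representative image pieces
predetermined_value = 4     # divisor given in the text

second_interval = previous_period / max_pieces / predetermined_value
print(second_interval)      # 12.5 seconds, matching the example
```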
  • the central processing means 21 generates one set of reproduction time data that starts from the previous second reproduction time (the reproduction time at the start of the reproduction time period corresponding to the selected representative image), increases at the second predetermined time interval, and is smaller than the reproduction time (the previous third reproduction time) which has lapsed by the first predetermined time interval from the previous second reproduction time.
  • the one set of reproduction times is constituted by first to Nth reproduction times.
  • the central processing means 21 extracts image data having color information corresponding to the first reproduction time (the reproduction time at the start of the previous second reproduction time period) and stores the image data in the storage means 28.
  • the central processing means 21 converts the image data having color information into image data having only luminance information (gray information) and further changes a size (e.g., 720 pixels x 480 pixels) of the image data to a size suitable for identification (e.g., 80 pixels x 60 pixels).
  • the central processing means 21 stores the image data having only luminance information, the size of which is changed to the size for identification, in the storage means 28 as reference image data.
  • the central processing means 21 subjects a histogram distribution of the image data having only luminance information to planarization processing, and stores the image data subjected to the planarization processing in the storage means 28 as reference image data.
  • Figure 9 is a diagram for explaining the planarization processing for the histogram distribution.
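A minimal sketch of the planarization step, assuming it is the standard cumulative-histogram equalization mapping over 8-bit luminance levels; the function name and the flat-list image representation are illustrative, not from the source.

```python
def planarize(pixels, levels=256):
    """Flatten the luminance histogram of a list of pixel values (0..levels-1)."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    # Map each old level to a new level via the cumulative distribution,
    # spreading concentrated levels across the full range.
    cdf = 0
    mapping = [0] * levels
    for level in range(levels):
        cdf += hist[level]
        mapping[level] = round((levels - 1) * cdf / total)
    return [mapping[p] for p in pixels]
```

For example, an image whose values cluster in the dark range (10, 20, 30, 40) is stretched across the full 0-255 range, which is what makes the subsequent pixel-difference comparison robust to overall brightness.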
  • the central processing means 21 determines an image model from the histogram distribution of the reference image data and stores the determined image model data in the storage means 28.
  • the central processing means 21 can set an image model A as an image in which a histogram distribution concentrates in a dark portion or a bright portion of a luminance level (when one pixel is represented by eight bits, for example, 70% or more of the total number of pixels concentrates in the 0 to 63 levels or the 192 to 255 levels), set an image model B to an image other than the image model A in which a histogram distribution concentrates in a specific portion of a luminance level, and set an image model C to an image other than the image models A and B.
  • FIG. 10 is a diagram for explaining the image models.
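The image-model decision can be sketched as below. Only the model A rule (70% or more of the pixels in the 0-63 or 192-255 band) is specified in the text; reusing the same 70% threshold over quarter-range bands for model B's "specific portion" is an assumption, as is the function name.

```python
def image_model(pixels, concentration=0.7):
    """Classify an 8-bit luminance image into model A, B, or C."""
    n = len(pixels)
    # Model A: histogram concentrates in the dark or bright portion.
    dark_or_bright = sum(1 for p in pixels if p <= 63 or p >= 192)
    if dark_or_bright / n >= concentration:
        return "A"
    # Model B: concentration in some other specific band (assumed quarter bands).
    for lo in range(0, 256, 64):
        if sum(1 for p in pixels if lo <= p < lo + 64) / n >= concentration:
            return "B"
    # Model C: everything else (no strong concentration).
    return "C"
```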
  • the central processing means 21 changes a size (e.g., 720 pixels x 480 pixels) of the image data having color information corresponding to the reference image data to a size suitable for icon display (e.g., 120 pixels x 80 pixels).
  • the central processing means 21 stores the image data having color information, the size of which is changed to the size suitable for icon display, in the storage means 28 as the first representative image data.
  • the display means 26 displays the first representative image data in the display area 38 ( Figure 8). ((Setting for an identification object image))
  • the central processing means 21 extracts an image having color information corresponding to the second reproduction time (a reproduction time lapsed by the second predetermined time interval from the first reproduction time) and stores the image in the storage means 28.
  • the central processing means 21 converts the image having color information to an image having only luminance information and further changes a size of image data to a size suitable for identification.
  • the central processing means 21 stores the image having only luminance information, the size of which is changed to the size for identification, in the storage means 28 as identification object image data.
  • the central processing means 21 subjects a histogram distribution of the image data having only luminance information to the planarization processing and stores the planarized image data in the storage means 28 as identification object image data.
  • the central processing means 21 determines an image model from a histogram distribution of identification object image data and stores the determined image model data in the storage means 28. ((Calculation of a fitness value))
  • the central processing means 21 calculates a sum of absolute values of differences between luminance levels of respective pixels of reference image data and luminance levels of respective corresponding pixels of identification object image data. Further, the central processing means 21 subtracts a value, which is obtained by dividing the sum by the total number of pixels and a maximum value of the luminance levels, from 1. The central processing means 21 stores an obtained value in the storage means 28 as fitness data.
  • a fitness value of 0 means that the reference image data and the identification object image data coincide with each other 0%
  • a fitness value of 1 means that the reference image data and the identification object image data coincide with each other 100%.
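The fitness computation described above can be sketched as follows; the function name and the flat-list image representation are illustrative.

```python
def fitness(reference, candidate, max_level=255):
    """1 minus the sum of absolute luminance differences, normalized by
    (total number of pixels x maximum luminance level): 1.0 = full match."""
    assert len(reference) == len(candidate)
    diff = sum(abs(r - c) for r, c in zip(reference, candidate))
    return 1 - diff / (len(reference) * max_level)
```

Identical images yield 1.0; a black image compared with a white image yields 0.0.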
  • when the fitness data is smaller than a predetermined value (e.g., 0.88), the central processing means 21 determines that the reference image data and the identification object image data are different. On the other hand, when the fitness data is equal to or larger than the predetermined value (e.g., 0.88), the central processing means 21 determines that the reference image data and the identification object image data are not different in accordance with the program.
  • the central processing means 21 stores determination result data, which indicates whether the reference image data and the identification object image data are different, in the storage means 28.
  • the central processing means 21 divides a range from 1 to the predetermined value into plural sections and stores class data indicating a coincidence in the storage means 28 for each of the sections.
  • for example, the range from 1 to 0.88 is divided into three sections (1 to 0.95, 0.95 to 0.93, and 0.93 to 0.88).
  • when the fitness value falls in the second section (e.g., 0.95 to 0.93), the central processing means 21 determines that the reference image data and the identification object image data are similar.
  • when the fitness value falls in the third section (e.g., 0.93 to 0.88), the central processing means 21 determines that the reference image data and the identification object image data are partially similar. It is preferable to determine classes representing a coincidence (identical, similar, and partially similar) in this way. ((Identification technique 2))
  • the central processing means 21 determines that the reference image data and the identification object image data are different.
  • the central processing means 21 determines classes representing a coincidence (identical, similar, and partially similar) .
  • the central processing means 21 determines that the reference image data and the identification object image data are different.
  • the central processing means 21 determines classes representing a coincidence (identical (fitness value ≥ 0.994), similar (0.994 > fitness value ≥ 0.988), and partially identical (0.988 > fitness value ≥ 0.982)). ((Identification technique 3))
  • the central processing means 21 determines that the reference image data and the identification object image data are different.
  • the central processing means 21 determines classes representing a coincidence (identical, similar, and partially similar) .
  • the central processing means 21 determines that the reference image data and the identification object image data are different.
  • the central processing means 21 determines classes representing a coincidence (identical, similar, and partially similar).
  • the central processing means 21 determines that the reference image data and the identification object image data are different.
  • the central processing means 21 determines classes representing a coincidence (identical (fitness value ≥ 0.95), similar (0.95 > fitness value ≥ 0.93), and partially identical (0.93 > fitness value ≥ 0.86)).
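Taken together, the thresholding for identification technique 1 can be sketched as below; the function name is illustrative, and only the 0.88 / 0.93 / 0.95 boundaries come from the text.

```python
def classify(fitness_value, threshold=0.88):
    """Map a fitness value to the determination and coincidence class."""
    if fitness_value < threshold:
        return "different"          # below the predetermined value
    if fitness_value >= 0.95:
        return "identical"          # first section: 1 to 0.95
    if fitness_value >= 0.93:
        return "similar"            # second section: 0.95 to 0.93
    return "partially similar"      # third section: 0.93 to 0.88
```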
  • Determination result data indicating whether reference image data and identification object image data are different and class data representing a degree of coincidence of the reference image data and the identification object image data are stored in the storage means 28 as previous determination result data and previous class data, respectively.
  • the central processing means 21 extracts an image having color information corresponding to the next reproduction time (third reproduction time) out of a set of reproduction times.
  • the central processing means 21 generates identification object image data, image model data, and fitness data from the image data having color information corresponding to the next reproduction time and obtains determination result data and class data.
  • the central processing means 21 stores the identification object image data (image data having only luminance information, a size of which is changed to a size suitable for identification) in the storage means 28 as new reference image data.
  • the central processing means 21 stores image model data of the identification object image data in the storage means 28 as new image model data of the reference image data.
  • the central processing means 21 determines whether previous class data and present class data coincide with each other. Even when it is determined that the reference image data and the identification object image data are not different, when the previous class data and the present class data are different, the central processing means 21 stores present identification object image data in the storage means 28 as new reference image data. In addition, the central processing means 21 stores image model data of the present identification object image data in the storage means 28 as new image model data of the reference image data.
  • image data corresponding to a first reproduction time is reference image data
  • image data corresponding to a second reproduction time is previous identification object image data
  • image data corresponding to a third reproduction time is present identification object image data.
  • present identification object image (image corresponding to a third reproduction time) data is stored as new reference image data.
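The reference-update rule in the example above can be condensed into a sketch like the following; the names are illustrative, and the class values are the three coincidence classes from the text.

```python
def update_reference(reference, present, is_different, prev_class, present_class):
    """The present identification object image becomes the new reference when
    it is judged different from the current reference, or when it matches but
    its coincidence class differs from the previous comparison's class."""
    if is_different or prev_class != present_class:
        return present
    return reference
```

This is what lets a slowly moving scene stay in one group (class unchanged) while a class change promotes the third-reproduction-time image to the new reference, as in the example.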
  • the central processing means 21 changes a size of image data having color information corresponding to an updated reference image data to a size suitable for icon display and stores the image data in the storage means 28 as representative image data.
  • the display means 26 displays second to Mth representative image data generated in this way in the display area 38 ( Figure 8). ((Advantages of a scene cut mode))
  • Respective representative image data of images grouped into M images are displayed in the display area 38 as shown in Figure 8. Since plural representative image data, which are determined as different, are displayed in the display area 38, a user can comprehend contents of a moving image easily. In other words, the user can search for a desired image (scene) easily. In addition, since it is possible to display the plural representative image data, which are determined as different, in the display area 38 instantaneously, it is possible to comprehend contents of a moving image instantaneously.
  • the central processing means 21 updates the reference image data. Consequently, it is possible to recognize moving scenes as one group rather than recognizing the scenes as separate groups. As a result, the user can comprehend contents of a moving image more easily.
  • the central processing means 21 can also automatically select one representative image data for icon display (e.g., first representative image data for icon display).
  • the user can also select one representative image data for icon display (e.g., first representative image data for icon display).
  • the reproduction time button 61 which corresponds to a reproduction time corresponding to the representative image data, is displayed in the total reproduction time area 39.
  • the reproduction time period (e.g., reproduction time corresponding to first representative image data ≤ reproduction time period < reproduction time corresponding to second representative image data) 62, which corresponds to grouped images (e.g., images grouped first) corresponding to the selected representative image data, is displayed in the total reproduction time area 39.
  • the central processing means 21 reproduces moving image data for a reproduction time period corresponding to the selected representative image or from a reproduction time corresponding to the selected representative image to the total reproduction time.
  • the display means 26 displays an image under reproduction in the entire display area 38. ((Sub-grouping))
  • the central processing means 21 can sub-group images according to a scene cut mode.
  • the central processing means 21 reproduces moving image data for a reproduction time period corresponding to the selected representative image or from a reproduction time corresponding to the selected representative image to the total reproduction time.
  • the display means 26 displays an image under reproduction in the entire display area 38.
  • Figure 11 represents images constituting one moving image data, a suspended image (reference image), and images grouped by the scene search mode.
  • Figure 12 shows an example in which suspended image data and representative image data of the grouped images (scenes) 1 to T shown in Figure 11 are displayed in the display area 38. ((Initial setting))
  • the central processing means 21 stores, for example, 0.1 second in the storage means 28 as predetermined time interval data in accordance with the program. ((Generation of a reference image))
  • the central processing means 21 converts suspended image (image having color information) data into image data having only luminance information and further changes a size (e.g., 720 pixels x 480 pixels) of the image data to a size suitable for identification (e.g., 80 pixels x 60 pixels) in accordance with the program.
  • the central processing means 21 stores the image data having only luminance information, the size of which is changed to the size for identification, in the storage means 28 as reference image data.
  • the central processing means 21 subjects a histogram distribution of the image data having only luminance information to planarization processing and stores the image data subjected to the planarization processing in the storage means 28 as reference image data.
  • the central processing means 21 determines an image model from a histogram distribution of the reference image data and stores the determined image model data in the storage means 28.
  • the central processing means 21 can set an image model A as an image in which a histogram distribution concentrates in a dark portion or a bright portion of a luminance level, set an image model B to an image other than the image model A in which a histogram distribution concentrates in a specific portion of a luminance level, and set an image model C to an image other than the image models A and B. ((Display of a reference image))
  • the central processing means 21 changes a size (e.g., 720 pixels x 480 pixels) of suspended image (image having color information) data to a size suitable for icon display (e.g., 120 pixels x 80 pixels) and stores the image in the storage means 28.
  • the display means 26 displays the image data, the size of which is changed to the size for icon display, in the display area 38 ( Figure 12). ((Setting for an identification object image))
  • the central processing means 21 generates one set of reproduction time data that starts from a reproduction time 0 and increases at a predetermined time interval (e.g., 0.1 second).
  • the one set of reproduction time data is constituted by first to Cth reproduction times.
  • the central processing means 21 extracts an image having color information that constitutes image data corresponding to a first reproduction time (e.g., 0 second) and stores the image in the storage means 28.
  • the central processing means 21 converts the image having color information into an image having only luminance information and further changes a size of the image data to a size suitable for identification.
  • the central processing means 21 stores the image having only luminance information, the size of which is changed to the size for identification, in the storage means 28 as identification object image data.
  • the central processing means 21 subjects a histogram distribution of the image data having only luminance information to planarization processing and stores the planarized image data in the storage means 28 as identification object image data.
  • the central processing means 21 determines an image model from a histogram distribution of the identification object image data and stores the determined image model data in the storage means 28. ((Pre-processing of an identification technique))
  • the central processing means 21 determines whether image model data of the reference image data and image model data of the identification object image data coincide with each other. ((Calculation of a fitness value))
  • the central processing means 21 calculates a sum of absolute values of differences between luminance levels of respective pixels of the reference image data and luminance levels of corresponding respective pixels of the identification object image data. Moreover, the central processing means 21 subtracts a value, which is obtained by dividing the sum by the total number of pixels and a maximum value of the luminance levels, from 1. The central processing means 21 stores an obtained value in the storage means 28 as fitness data.
  • when the fitness data is smaller than a predetermined value (e.g., 0.95), the central processing means 21 determines that the reference image data and the identification object image data are different.
  • when the fitness data is equal to or larger than the predetermined value (e.g., 0.95), the central processing means 21 determines that the reference image data and the identification object image data are not different in accordance with the program.
  • the central processing means 21 stores determination result data, which indicates whether the reference image data and the identification object image data are different, in the storage means 28.
  • the central processing means 21 divides a range of 1 to the predetermined value into plural sections and stores class data representing a coincidence in the storage means 28 for each of the sections. For example, the central processing means 21 divides 1 to 0.895 into three sections (1 to 0.95, 0.95 to 0.93, and 0.93 to 0.895).
  • when the fitness value falls in the second section (e.g., 0.95 to 0.93), the central processing means 21 determines that the reference image data and the identification object image data are similar.
  • when the fitness value falls in the third section (e.g., 0.93 to 0.895), the central processing means 21 determines that the reference image data and the identification object image data are partially similar. It is preferable to determine classes representing a coincidence (identical, similar, and partially similar) in this way.
  • the central processing means 21 determines that the reference image data and the identification object image data are different
  • the central processing means 21 determines classes representing a coincidence (identical, similar, and partially similar)
  • the central processing means 21 determines that the reference image data and the identification object image data are different.
  • the central processing means 21 determines classes representing a coincidence (identical (fitness value ≥ 0.994), similar (0.994 > fitness value ≥ 0.988), and partially identical (0.988 > fitness value ≥ 0.982)).
  • the central processing means 21 determines that the reference image data and the identification object image data are different.
  • the central processing means 21 determines classes representing a coincidence (identical, similar, and partially similar) .
  • the central processing means 21 determines that the reference image data and the identification object image data are different. Preferably, the central processing means 21 determines classes representing a coincidence (identical, similar, and partially similar).
  • the central processing means 21 changes a size of image data having color information corresponding to the identification object image data to a size suitable for icon display and stores the image data in the storage means 28 as the representative image data.
  • the central processing means 21 determines whether contents of previous class data and contents of present class data coincide with each other. Even when it is determined that the reference image data and the identification object image data are not different, when the contents of the present class data are identical and the contents of the previous class data are identical, the central processing means 21 does not generate representative image data from present identification object image data. In addition, even when it is determined that the reference image data and the identification object image data are not different, when the contents of the present class data are similar or partially similar and the contents of the previous class data are similar or partially similar, the central processing means 21 does not generate representative image data from the present identification object image data.
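The duplicate-suppression rule above can be sketched as follows; `matches` corresponds to the "not different" determination, and the function name is illustrative.

```python
def should_generate(matches, prev_class, present_class):
    """A representative (search-hit) image is generated only when the image
    matches the reference and the run of coincidence classes changes:
    identical-after-identical and similar/partially-similar-after-
    similar/partially-similar repeats are skipped."""
    if not matches:
        return False
    if prev_class == "identical" and present_class == "identical":
        return False
    loose = ("similar", "partially similar")
    if prev_class in loose and present_class in loose:
        return False
    return True
```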
  • the display means 26 displays the first to the Tth representative image data generated in this way in the display area 38 ( Figure 12).
  • the display means 26 displays an indication 70 indicating contents of class data of the representative image data (identical, similar, and partially similar) .
  • the central processing means 21 can automatically select one representative image data for icon display (e.g., first representative image data for icon display).
  • a user can select one representative image data for icon display (e.g., first representative image data for icon display).
  • the reproduction time button 61 which corresponds to a reproduction time corresponding to the representative image data, is displayed in the total reproduction time area 39.
  • a reproduction time period (e.g., reproduction time corresponding to first representative image data ≤ reproduction time period < reproduction time corresponding to second representative image data) 62, which corresponds to grouped images (e.g., the images grouped first) corresponding to the selected representative image data, is displayed in the total reproduction time area 39.
  • the central processing means 21 reproduces moving image data for a reproduction time period corresponding to the selected representative image or from a reproduction time corresponding to the selected representative image to the total reproduction time.
  • the display means 26 displays an image under reproduction in the entire display area 38. ((Advantages of the scene search mode)) As shown in Figure 12, since an image identical with the reference image data is displayed in the display area 38, a user can search for a desired image (scene) easily.
  • the central processing means 21 may generate reference image data not only from suspended image (image having color information) data but also from an image a few frames before the suspended image and an image a few frames after the suspended image in accordance with the program.
  • Figure 13 shows an example in which the central processing means 21 generates one set of reference image data from a suspended image, an image one frame before the suspended image, and an image one frame after the suspended image.
  • FIG. 14 shows a block diagram of an apparatus that stores a moving image in accordance with the invention.
  • An apparatus 80 which stores a moving image in accordance with the invention, comprises a central processing means 21, a recording medium control means 23 that controls a recording medium 22, an input means control means 25 that controls an input means 24, a display means control means 27 that controls a display means 26, a storage device 28, and a video signal output means control means 82 that controls a video signal output means 81.
  • moving image data is not stored in the recording medium 22 in advance; instead, moving image data is stored in the recording medium 22 in accordance with the invention.
  • a program for storing an image in accordance with the invention is stored in the storage device 28. Note that this program may be stored in the recording medium 22.
  • a computer program product comprises this program stored on a central processing means 21 usable medium such as a semiconductor memory, an HD, a DVD, an FD, an MO disk, a CD, etc.
  • the video signal output means 81 is, for example, a TV tuner, a video camera, a computer, an HD player, or a DVD player.
  • the video signal output means control means 82 is, for example, an interface between the video signal output means 81 and the central processing means 21 or an input terminal for a video signal.
  • the central processing means 21 outputs control data for displaying a recording button 90, a stop button 33, a moving image input selection button 91, a pointer 37, and a display area 38 on a display means 26 to a display means control means 27 in accordance with the program.
  • the display means 26 inputs the control data from the display means control means 27 and displays the recording button 90, the stop button 33, the moving image input selection button 91, the pointer 37, and the display area 38.
  • Figure 15 shows an example of a GUI to be displayed on the display means 26. Note that, in Figure 15, the moving image input selection button 91 includes 1 to 6 channel buttons and an external input button.
  • the input means 24 may comprise those buttons 90, 33, and 91.
  • when the moving image input selection button 91 (e.g., the 1 channel button or the external input button) is selected by a user, the central processing means 21 outputs control data for inputting a corresponding video signal to the video signal output means control means 82.
  • the video signal output means 81 inputs the control data from the video signal output means control means 82 and outputs a corresponding video signal to the video signal output means control means 82.
  • the central processing means 21 inputs the corresponding video signal from the video signal output means control means 82 and outputs control data for displaying a video signal on the display means 26 to the display means control means 27.
  • the display means 26 displays a video in the entire display area 38.
  • the central processing means 21 executes a chapter mode to be explained below.
  • the central processing means 21 ends the chapter mode.
  • Figure 16 represents images forming one moving image data and representative images of images grouped by the chapter mode.
  • moving image data to be stored in accordance with the invention includes H images (frames) corresponding to a total storage time.
  • the chapter mode generates a representative image according to the same technique as the scene cut mode described above. ((Initial setting))
  • the central processing means 21 stores, for example, 0.033 second in the storage means as predetermined time interval data.
  • the central processing means 21 encodes a video signal at a predetermined rate (e.g., 30 image frames/second) and outputs control data for storing moving image data in the recording medium 22 to the recording medium control means 23 in accordance with the program.
  • the central processing means 21 extracts image data having color information from a video signal corresponding to a first storage time (e.g., 0 second) and stores the image data in the storage means 28.
  • the central processing means 21 converts the image data having color information into image data having only luminance information (gray information) and further changes a size (e.g., 720 pixels x 480 pixels) of the image data to a size suitable for identification (e.g., 80 pixels x 60 pixels).
  • the central processing means 21 stores the image data having only luminance information, the size of which is changed to the size for identification, in the storage means 28 as reference image data.
  • the central processing means 21 subjects a histogram distribution of the image data having only luminance information to planarization processing and stores the image data subjected to the planarization processing in the storage means 28 as reference image data.
  • the central processing means 21 determines an image model from the histogram distribution of the reference image data and stores the determined image model data in the storage means 28.
  • the central processing means 21 stores image data having color information corresponding to the reference image data in the storage means 28 as first representative image data.
  • the central processing means 21 extracts image data having color information from a video signal corresponding to a second storage time (a storage time lapsed by the predetermined time interval from the first storage time) and stores the image data in the storage means 28.
  • the central processing means 21 converts an image having color information into an image having only luminance information and further changes a size of the image data to a size suitable for identification.
  • the central processing means 21 stores the image having only luminance information, the size of which is changed to the size for identification, in the storage means 28 as identification object image data.
  • the central processing means 21 subjects a histogram distribution of the image data having only luminance information to the planarization processing and stores the planarized image data in the storage means 28 as identification object image data.
  • the central processing means 21 determines an image model from a histogram distribution of identification object image data and stores the determined image model data in the storage means 28. ((Calculation of a fitness value))
  • the central processing means 21 calculates a sum of absolute values of differences between luminance levels of respective pixels of the reference image data and luminance levels of corresponding respective pixels of the identification object image data. Moreover, the central processing means 21 subtracts a value, which is obtained by dividing the sum by the total number of pixels and a maximum value of the luminance levels, from 1. The central processing means 21 stores an obtained value in the storage means 28 as fitness data.
  • the central processing means 21 determines whether the reference image data and the identification object image data are different with the same technique (one of the identification techniques 1 to 3) as that in the scene cut mode explained above and stores determination result data in the storage means 28.
  • a degree of coincidence (identical, similar, or partially similar) between the reference image data and the identification object image data is determined, and class data is stored in the storage means 28.
  • the determination result data which indicates whether the reference image data and the identification object image data are different, and the class data, which indicates whether the reference image data and the identification object image data coincide with each other, are stored in the storage means 28 as previous determination result data and previous class data, respectively.
  • Regardless of the contents of the determination result data and the class data, the central processing means 21 generates identification object image data, image model data, and fitness data from a video signal corresponding to the next storage time (a storage time lapsed by the predetermined time interval from the second storage time) and obtains determination result data and class data. ((Update of a reference image)) When it is determined that the reference image data and the identification object image data are different, the central processing means 21 stores the identification object image data in the storage means 28 as new reference image data. In addition, the central processing means 21 stores image model data of the identification object image data in the storage means 28 as new image model data of the reference image data.
  • the central processing means 21 stores present identification object image data in the storage means 28 as new reference image data.
  • the central processing means 21 stores image model data of the present identification object image data in the storage means 28 as new image model data of the reference image data.
  • the central processing means 21 stores image data having color information corresponding to the updated reference image data in the storage means 28 as representative image data. ((End of storage of moving image data)) When the stop button 33 is selected by the user, the central processing means 21 stops encoding a video signal, and the recording medium 22 stops storing moving image data. ((Generation of chapter data))
  • the central processing means 21 stores a set of storage time data (i.e., a set of reproduction time data) corresponding to first to Gth representative images in the recording medium 22 as chapter data. ((Advantage of a chapter mode))
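The fitness computation in ((Calculation of a fitness value)) above reduces to a single formula: one minus the sum of absolute luminance differences divided by the product of the total pixel count and the maximum luminance level. A minimal Python sketch follows; the 8-bit luminance range (maximum level 255) and the flat-sequence image representation are illustrative assumptions, not part of the disclosure.

```python
def fitness(reference, candidate, max_level=255):
    """Degree of coincidence between two equal-size luminance images.

    Implements: 1 - (sum of absolute luminance differences)
                    / (total number of pixels * maximum luminance level).
    Returns 1.0 for identical images, approaching 0.0 as they diverge.
    Images are flat sequences of luminance values in 0..max_level
    (e.g., 80 x 60 = 4800 pixels after the size reduction described above).
    """
    assert len(reference) == len(candidate)
    sad = sum(abs(r - c) for r, c in zip(reference, candidate))
    return 1.0 - sad / (len(reference) * max_level)
```

For instance, two identical images yield a fitness of exactly 1.0, while a black image compared with a white image yields 0.0.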

Abstract

A method including decoding a moving image data portion corresponding to a first reproduction time and extracting image data corresponding to the first reproduction time as reference image data; displaying the reference image data as first representative image data; decoding a moving image data portion corresponding to a second reproduction time and extracting image data corresponding to the second reproduction time as identification object image data; determining whether the reference image data and the identification object image data are different and, when the reference image data and the identification object image data are different, displaying the identification object image data as second representative image data; and reproducing moving image data from a reproduction time corresponding to selected representative image data of the first or the second representative image data.

Description

DESCRIPTION
APPARATUS, METHOD, AND PROGRAM PRODUCT FOR REPRODUCING OR RECORDING MOVING IMAGE
TECHNICAL FIELD
The present invention generally relates to an apparatus, a method, and a program product for reproducing or recording a moving image. More specifically, the invention relates to an apparatus, a method, and a program product for identifying images forming a moving image and reproducing the moving image from a predetermined image. The invention also relates to an apparatus, a method, and a program product for identifying images forming a moving image and storing the moving image together with a reproduction time corresponding to a predetermined image.
BACKGROUND ART
As is well known, a program for reproducing and editing moving image data, which is recorded in a recording medium such as an HD or a DVD, can cause a processor to display a reproduction button, a rewind button, a fast-forward button, a suspension button, a stop button, a chapter generation button, and a chapter selection button on a display, to reproduce the moving image data on the display, and to store a reproduction time corresponding to suspended moving image data as chapter data. Alternatively, each of an apparatus for reproducing and editing moving image data and a remote controller comprises a reproduction button, a rewind button, a fast-forward button, a suspension button, a stop button, a chapter generation button, and a chapter selection button, and can reproduce moving image data on a television and store a reproduction time corresponding to suspended moving image data as chapter data.
Figure 1 shows an example of a GUI that is displayed on a display. In Figure 1, reference numerals 1, 2, 3, 4, 5, 9, and 10 correspond to a reproduction button, a rewind button, a fast-forward button, a suspension button, a stop button, a chapter generation button, and a chapter selection button, respectively. A user clicks the reproduction button 1 via a pointer (not shown) displayed on the display, whereby reproduction of moving image data is started. When the reproduction of the moving image data is started, an image is displayed in a moving image display area 6 on the display and a reproduction time button 7 is displayed in a total reproduction time area 8. The user can search for a desired image (scene) and reproduce a moving image from the image by clicking the rewind button 2 or the fast-forward button 3, moving the reproduction time button 7, or clicking a certain position in the total reproduction time area 8. In addition, when the desired image (scene) is reproduced, the user can generate chapter data, which makes it possible to reproduce moving image data from the desired image, by clicking the suspension button 4 and further clicking the chapter generation button 9.
A screen of a television (not shown) displays the moving image display area 6 in Figure 1. The user presses the reproduction button 1 of an apparatus for reproducing and editing moving image data or a remote controller therefor, whereby reproduction of moving image data is started. When the reproduction of the moving image data is started, an image is displayed on the screen of the television. The user can search for a desired image (scene) and reproduce a moving image from the image by pressing the rewind button 2 or the fast-forward button 3. In addition, when the desired image (scene) is reproduced, the user can generate chapter data, which makes it possible to reproduce the moving image data from the desired image, by pressing the suspension button 4 and further pressing the chapter generation button 9.
DISCLOSURE OF THE INVENTION
In the case in which the user clicks or presses the rewind button 2 or the fast-forward button 3 or moves the reproduction time button 7, images forming a moving image are reproduced successively on the display or the television. Therefore, whereas the user can comprehend contents of the moving image, the time required for searching for a desired image is increased. In a case in which the user clicks a certain position in the total reproduction time area 8, images forming a moving image are reproduced instantaneously in the display area 6 on the display. However, the user is not able to comprehend the contents of the moving image and, as a result, has to click other positions in the total reproduction time area 8 many times. In other words, it is difficult for the user to search for a desired image and, as a result, the time required for searching for the desired image is increased.
In a case in which the user sets a chapter in a moving image, in general, it is necessary to search for a plurality of desired images. Therefore, a time required for searching for the desired images is increased. As a result, operations required to set a chapter are complicated.
It is an object of the invention to provide an apparatus, a method, and a program product for reproducing a moving image according to a new technique.
It is another object of the invention to provide a new technique that allows a user to readily comprehend contents of a moving image when the user searches for a desired image. It is still another object of the invention to provide a technique that allows a user to easily search for a desired image.
It is still another object of the invention to provide a technique for setting a chapter according to a new technique.
It is still another object of the invention to provide a technique for setting a chapter easily.
It is still another object of the invention to provide an apparatus, a method, and a program product for storing a moving image according to the new technique. Those skilled in the art will be able to readily understand other objects of the invention by referring to the claims, drawings and embodiments of the invention as set out below.
(Scene cut mode)
An apparatus, which reproduces a moving image in accordance with the present invention, comprises central processing means (21), recording medium control means (23) that controls a recording medium (22), input means control means (25) that controls input means (24), and display means control means (27) that controls display means (26). The central processing means (21) inputs a moving image data portion, which corresponds to a first reproduction time, of moving image data stored in the recording medium (22) from the recording medium control means (23), decodes the moving image data portion, and extracts image data corresponding to the first reproduction time as reference image data. The central processing means (21) outputs control data for displaying the reference image data on the display means (26) as first representative image data to the display means control means (27). The central processing means (21) inputs a moving image data portion, which corresponds to a second reproduction time which is different from the first reproduction time, of the moving image data stored in the recording medium (22) from the recording medium control means (23), decodes the moving image data portion, and extracts image data corresponding to the second reproduction time as identification object image data. The central processing means (21) determines whether the reference image data and the identification object image data are different and, when the reference image data and the identification object image data are different, outputs control data for displaying the identification object image data on the display means (26) as second representative image data to the display means control means (27).
The central processing means (21) inputs operation data, which indicates that one representative image data of the first or the second representative image data displayed on the display means (26) is selected by the input means (24), from the input means control means (25), reproduces moving image data from a reproduction time corresponding to the selected representative image data, and outputs control data for displaying a moving image under reproduction on the display means (26) to the display means control means (27).
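The scene cut flow above (keep a reference image, compare each later frame against it, and emit a new representative image whenever the two are judged different) might be sketched as follows. The fitness measure, the frame representation, and the 0.5 threshold are simplifying assumptions for illustration only.

```python
def fitness(a, b, max_level=255):
    # coincidence measure: 1 - SAD / (pixel count * maximum luminance level)
    return 1.0 - sum(abs(x - y) for x, y in zip(a, b)) / (len(a) * max_level)

def scene_cut(frames, threshold=0.5):
    """Return indices of representative frames, one per detected scene.

    A frame whose fitness against the current reference falls below
    `threshold` is judged "different": it starts a new scene, becomes the
    new representative image, and replaces the reference image.
    """
    representatives = [0]          # the first frame is the first reference
    reference = frames[0]
    for i, frame in enumerate(frames[1:], start=1):
        if fitness(reference, frame) < threshold:
            representatives.append(i)
            reference = frame      # update of the reference image
    return representatives
```

Running this over four small frames where the third differs sharply from the first two produces two scenes, represented by frames 0 and 2.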
Preferably, when the reference image data and the identification object image data are not different, the central processing means (21) sets the identification object image data as previous identification object image data and determines whether the reference image data and the previous identification object image data are similar. The central processing means (21) inputs a moving image data portion, which corresponds to a third reproduction time which is different from the first and the second reproduction times, of the moving image data stored in the recording medium (22) from the recording medium control means (23), decodes the moving image data portion, and extracts image data corresponding to the third reproduction time as present identification object image data. When the reference image data and the present identification object image data are not different, the central processing means (21) determines whether the reference image data and the present identification object image data are similar. When the reference image data and the previous identification object image data are not similar and the reference image data and the present identification object image data are similar, the central processing means (21) sets the present identification object image data as new reference image data. When the reference image data and the previous identification object image data are similar and the reference image data and the present identification object image data are not similar, the central processing means (21) sets the present identification object image data as new reference image data. 
The central processing means (21) calculates a fitness value indicating a degree of coincidence of the reference image data and the identification object image data, when the calculated fitness value is smaller than a first fitness value, determines that the reference image data and the identification object image data are different, and when the calculated fitness value is smaller than a second fitness value which is larger than the first fitness value, determines that the reference image data and the identification object image data are similar.
More preferably, when the reference image data and the identification object image data are not different, the central processing means (21) sets the identification object image data as previous identification object image data and generates previous class data that indicates that the reference image data and the previous identification object image data are identical, similar, or partially similar. The central processing means (21) inputs a moving image data portion, which corresponds to a third reproduction time which is different from the first and the second reproduction times, of moving image data stored in the recording medium (22) from the recording medium control means (23), decodes the moving image data portion, and extracts image data corresponding to the third reproduction time as present identification object image data. When the reference image data and the present identification object image data are not different, the central processing means (21) generates present class data that indicates that the reference image data and the present identification object image data are identical, similar, or partially similar. When the previous class data and the present class data are different, the central processing means (21) sets the present identification object image data as new reference image data. 
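The class determination used here (identical, similar, partially similar, or different) can be read as a three-threshold partition of the fitness value. A sketch follows; the numeric threshold values are illustrative assumptions and only their ordering (first < second < third) is taken from the text.

```python
def classify(fit, first=0.5, second=0.7, third=0.9):
    """Map a fitness value to a class under the three-threshold scheme:
    below `first` -> different; below `second` -> partially similar;
    below `third` -> similar; otherwise identical.
    """
    if fit < first:
        return "different"
    if fit < second:
        return "partially similar"
    if fit < third:
        return "similar"
    return "identical"
```

Comparing the previous and present class strings then decides whether the reference image is updated, as described above.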
The central processing means (21) calculates a fitness value indicating a degree of coincidence of the reference image data and the identification object image data, when the calculated fitness value is smaller than the first fitness value, determines that the reference image data and the identification object image data are different, when the calculated fitness value is smaller than the second fitness value which is larger than the first fitness value, determines that the reference image data and the identification object image data are partially similar, when the calculated fitness value is smaller than a third fitness value which is larger than the second fitness value, determines that the reference image data and the identification object image data are similar, and when the calculated fitness value is larger than the third fitness value, determines that the reference image data and the identification object image data are identical. (Scene search mode)
The central processing means (21), which reproduces a moving image in accordance with the invention, sets reference image data. The central processing means (21) inputs a moving image data portion, which corresponds to a first reproduction time, of moving image data stored in the recording medium (22) from the recording medium control means (23), decodes the moving image data portion, and extracts image data corresponding to the first reproduction time as identification object image data. The central processing means (21) determines whether the reference image data and the identification object image data are different and, when the reference image data and the identification object image data are not different, outputs control data for displaying the identification object image data on the display means (26) to the display means control means (27) as first representative image data. The central processing means (21) inputs operation data, which indicates that the first representative image data displayed on the display means (26) is selected by the input means (24), from the input means control means (25), reproduces moving image data from a reproduction time corresponding to the selected representative image data, and outputs control data for displaying a moving image under reproduction on the display means (26) to the display means control means (27).
Preferably, when the reference image data and the identification object image data are not different, the central processing means (21) sets the identification object image data as previous identification object image data and determines whether the reference image data and the previous identification object image data are similar. The central processing means (21) inputs a moving image data portion, which corresponds to a second reproduction time which is different from the first reproduction time, of the moving image data stored in the recording medium (22) from the recording medium control means (23), decodes the moving image data portion, and extracts image data corresponding to the second reproduction time as present identification object image data. When the reference image data and the present identification object image data are not different, the central processing means (21) determines whether the reference image data and the present identification object image data are similar. When the reference image data and the previous identification object image data are not similar and the reference image data and the present identification object image data are similar, the central processing means (21) outputs control data for displaying the present identification object image data on the display means (26) as second representative image data to the display means control means (27). When the reference image data and the previous identification object image data are similar, and the reference image data and the present identification object image data are not similar, the central processing means (21) outputs control data for displaying the present identification object image data on the display means (26) as the second representative image data to the display means control means (27). 
The central processing means (21) inputs operation data, which indicates that the second representative image data displayed on the display means (26) is selected by the input means (24), from the input means control means (25), reproduces moving image data from a reproduction time corresponding to the selected representative image data, and outputs control data for displaying a moving image under reproduction on the display means (26) to the display means control means (27).
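In the scene search mode, the roles are reversed: the reference image is fixed (e.g., a suspended image) and the moving image is scanned for portions that are not different from it. The following simplified sketch ignores the previous/present similarity transitions described above and just collects candidate frames; the fitness measure and threshold are illustrative assumptions.

```python
def fitness(a, b, max_level=255):
    # coincidence measure: 1 - SAD / (pixel count * maximum luminance level)
    return 1.0 - sum(abs(x - y) for x, y in zip(a, b)) / (len(a) * max_level)

def scene_search(reference, frames, threshold=0.5):
    """Indices of frames judged *not different* from the reference image;
    each index maps to a reproduction time whose frame is a candidate
    representative image for the searched scene."""
    return [i for i, frame in enumerate(frames)
            if fitness(reference, frame) >= threshold]
```

With a near-black reference, a sequence of near-black, white, and near-black frames yields matches at indices 0 and 2.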
More preferably, when the reference image data and the identification object image data are not different, the central processing means (21) sets the identification object image data as previous identification object image data and generates previous class data that indicates that the reference image data and the previous identification object image data are identical, similar, or partially similar. The central processing means (21) inputs a moving image data portion, which corresponds to a second reproduction time which is different from the first reproduction time, of the moving image data stored in the recording medium (22) from the recording medium control means (23), decodes the moving image data portion, and extracts image data corresponding to the second reproduction time as present identification object image data. When the reference image data and the present identification object image data are not different, the central processing means (21) generates present class data that indicates that the reference image data and the present identification object image data are identical, similar, or partially similar. When the previous class data and the present class data are different, the central processing means (21) outputs control data for displaying the present identification object image data on the display means (26) as the second representative image data to the display means control means (27). The central processing means (21) inputs operation data, which indicates that the second representative image data displayed on the display means (26) is selected by the input means (24), from the input means control means (25), reproduces moving image data from a reproduction time corresponding to the selected representative image data, and outputs control data for displaying a moving image under reproduction on the display means (26) to the display means control means (27). (Chapter mode)
An apparatus, which stores a moving image in accordance with the invention, comprises the central processing means (21), the recording medium control means (23) that controls the recording medium (22), and video signal output means control means (82) that controls video signal output means (81). The central processing means (21) inputs a video signal, which is outputted from the video signal output means (81), from the video signal output means control means (82), encodes the video signal, and outputs control data for storing the moving image data in the recording medium (22) to the recording medium control means (23).
The central processing means (21) extracts image data from a video signal corresponding to a first storage time as reference image data. The central processing means (21) sets the first storage time as a first reproduction time. The central processing means (21) extracts image data from a video signal, which corresponds to a second storage time which is different from the first storage time, as identification object image data. The central processing means (21) determines whether the reference image data and the identification object image data are different and, when the reference image data and the identification object image data are different, sets the second storage time as a second reproduction time. The central processing means (21) outputs control data for storing the first and the second reproduction times in the recording medium (22) as chapter data to the recording medium control means (23). Preferably, when the reference image data and the identification object image data are not different, the central processing means (21) sets the identification object image data as previous identification object image data and generates previous class data that indicates that the reference image data and the previous identification object image data are identical, similar, or partially similar. The central processing means (21) extracts image data from a video signal, which corresponds to a third storage time which is different from the first and the second storage times, as present identification object image data. When the reference image data and the present identification object image data are not different, the central processing means (21) generates present class data that indicates that the reference image data and the present identification object image data are identical, similar, or partially similar. 
When the previous class data and the present class data are different, the central processing means (21) sets the present identification object image data as new reference image data.
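Putting the chapter mode together: during recording, each captured frame is compared with the current reference image, a chapter (storage time) is recorded when the frame is judged different or when the class changes, and the reference image is then updated. The following sketch is a simplified reading of that flow; the thresholds, the capture interval, and the exact class-change handling are assumptions for illustration.

```python
def fitness(a, b, max_level=255):
    # 1 - SAD / (pixel count * maximum luminance level)
    return 1.0 - sum(abs(x - y) for x, y in zip(a, b)) / (len(a) * max_level)

def classify(fit, first=0.5, second=0.7, third=0.9):
    if fit < first:
        return "different"
    if fit < second:
        return "partially similar"
    return "similar" if fit < third else "identical"

def chapter_times(frames, interval=1.0 / 30):
    """Storage times (seconds) of representative images, i.e., chapter data.

    `frames` are luminance images captured every `interval` seconds.
    A new chapter is recorded when a frame is "different" from the
    reference or its class differs from the previous class.
    """
    chapters = [0.0]               # the first storage time is always a chapter
    reference = frames[0]
    prev_class = None
    for i, frame in enumerate(frames[1:], start=1):
        cls = classify(fitness(reference, frame))
        if cls == "different" or (prev_class is not None and cls != prev_class):
            chapters.append(i * interval)   # chapter at this storage time
            reference = frame               # update of the reference image
            prev_class = None
        else:
            prev_class = cls
    return chapters
```

With a 1-second interval and a hard cut at the third frame, this records chapters at 0.0 s and 2.0 s, matching the set of reproduction times stored as chapter data above.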
A method and a program product, which reproduce a moving image in accordance with the invention, are incorporated in the apparatus that reproduces a moving image in accordance with the invention. A method and a program product, which store a moving image in accordance with the invention, are incorporated in the apparatus that stores a moving image in accordance with the invention.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 shows an example of a conventional GUI that is displayed on a display;
Figure 2 shows a functional block diagram of an apparatus that reproduces a moving image in accordance with the invention;
Figure 3 shows an example of a GUI in accordance with the invention that is displayed on display means;
Figure 4 shows a functional block diagram of input means that remotely controls the apparatus that reproduces a moving image in accordance with the invention;
Figure 5 represents images forming one moving image data and images obtained by grouping moving image data with a time division in accordance with the invention;
Figure 6 shows an example in which representative image data of grouped images (scenes) 1 to Q shown in Figure 5 are displayed in a display area 38;
Figure 7 represents images forming a part of one moving image data and images grouped by a scene cut mode in accordance with the invention;
Figure 8 shows an example in which representative image data of grouped images (scenes) 1 to M shown in Figure 7 are displayed in the display area 38;
Figure 9 is a diagram for explaining planarization processing for a histogram distribution;
Figure 10 is a diagram for explaining an image model;
Figure 11 represents images forming one moving image data, a suspended image (reference image), and images grouped by a scene search mode in accordance with the invention;
Figure 12 shows an example in which suspended image data and a representative image of grouped images (scenes) 1 to T shown in Figure 11 are displayed in the display area 38;
Figure 13 represents images forming one moving image data, a suspended image and images before and after the image (one set of reference image), and images grouped by the scene search mode in accordance with the invention;
Figure 14 shows a functional block diagram of an apparatus that stores a moving image in accordance with the invention;
Figure 15 shows an example of a GUI in accordance with the invention that is displayed on display means; and
Figure 16 represents images forming one moving image data and a representative image of images grouped by a chapter mode in accordance with the invention.
BEST MODE FOR CARRYING OUT THE INVENTION
Embodiments of the invention to be explained below indicate exemplary embodiments of the invention described in claims. The invention should not be limited to the following embodiments. In addition, those skilled in the art are capable of modifying the following embodiments to easily constitute the invention described in claims.
(A. First embodiment)
Figure 2 shows a functional block diagram of an apparatus that reproduces a moving image in accordance with the invention. An apparatus 20, which reproduces a moving image in accordance with the invention, comprises central processing means 21, recording medium control means 23 that controls a recording medium 22, input means control means 25 that controls input means 24, display means control means 27 that controls display means 26, and a storage device 28.
The apparatus 20 for reproducing a moving image is, for example, a computer, an HD player, or a DVD player.
The central processing means 21 is, for example, a CPU, an MPU, or a DSP. The recording medium 22 is, for example, an HD, a DVD, or a semiconductor memory. Moving image data is stored in the recording medium 22. A format of moving image data is, for example, MPEG, VOB, DV, or M-JPEG. The recording medium control means 23 is, for example, a driver for the recording medium 22 or an interface between the recording medium 22 and the central processing means 21. The input means 24 is, for example, a mouse, a keyboard, a button, or a remote controller. The input means control means 25 is, for example, an interface between the input means 24 and the central processing means 21. The display means 26 is, for example, a display or a television. The display means control means 27 is, for example, a driver for the display means 26 or an interface between the display means 26 and the central processing means 21. The storage device 28 is, for example, a semiconductor memory. A program for reproducing an image in accordance with the invention is stored in the storage device 28. Note that this program may be stored in the recording medium 22. A computer program product comprises this program stored on a central processing means 21 usable medium such as a semiconductor memory, an HD, a DVD, a FD, a MO disk, a CD, etc.
The central processing means 21 outputs control data for displaying a scene cut button 30, a scene search button 31, a reproduction button 32, a stop button 33, a return button 34, a suspension button 35, an image selection button 36, a pointer 37, a display area 38, and a total reproduction time area 39 on the display means 26 to the display means control means 27 in accordance with the program. The display means 26 inputs control data from the display means control means 27 and displays the scene cut button 30, the scene search button 31, the reproduction button 32, the stop button 33, the return button 34, the suspension button 35, the image selection button 36, the pointer 37, the display area 38, and the total reproduction time area 39. Figure 3 shows an example of a GUI to be displayed on the display means 26.
The input means 24 (e.g. a mouse) inputs operation by a user for moving the pointer 37 displayed on the display means 26 and outputs operation data for moving the pointer 37 to the input means control means 25. The central processing means 21 inputs the operation data from the input means control means 25, converts the operation data into control data for moving the pointer 37 on the display means 26, and outputs the control data to the display means control means 27 in accordance with the program. The display means 26 inputs the control data from the display means control means 27 and moves a display position of the pointer 37.
In addition, the input means 24 inputs operation by the user for clicking the pointer 37 displayed on the display means 26 and outputs operation data for clicking the pointer 37 to the input means control means 25. The central processing means 21 inputs the operation data from the input means control means 25 and calculates a display position of the pointer 37 displayed on the display means 26 in accordance with the program. When the display position of the pointer 37 displayed on the display means 26 corresponds to a display position of the scene cut button 30 displayed on the display means 26, the central processing means 21 determines that clicking of the scene cut button 30 is effective and executes a scene cut mode which is explained below. Note that instead of displaying the buttons 30, 31, 32, 33, 34, 35, and 36 and the pointer 37 with the display means 26, the input means 24 can comprise those buttons 30,
31, 32, 33, 34, 35, and 36. The input means 24 comprising such buttons is, for example, a remote controller. Figure 4 shows a functional block diagram of input means that remotely controls the apparatus that reproduces a moving image in accordance with the invention. The input means 24 (e.g., the remote controller) inputs operation by the user for pressing the scene cut button 30 comprised in the input means 24 and outputs the operation data to the input means control means 25. The central processing means 21 inputs the operation data from the input means control means 25, determines that pressing of the scene cut button 30 is effective, and executes the scene cut mode to be described below in accordance with the program. (Scene cut mode) Figure 5 represents images forming data of one moving image and images obtained by grouping the moving image data according to time division. As shown in Figure 5, the moving image data includes P images (frames) corresponding to a total reproduction time. Figure 6 shows an example in which representative image data of grouped images (scenes) 1 to Q shown in Figure 5 is displayed in the display area 38.
((Initial setting))
The central processing means 21 outputs control data for extracting total reproduction time data of the moving image data stored in the recording medium 22, to the recording medium control means 23, in accordance with the program. The recording medium control means 23 inputs the control data from the central processing means 21, reads out the total reproduction time data from the moving image data, and outputs the total reproduction time data to the central processing means 21. The central processing means 21 stores the total reproduction time data in the storage means 28. The central processing means 21 sets a maximum number of pieces of representative image data to be displayed in the display area 38 to, for example, twelve and stores the maximum number of pieces data in the storage means 28. Moreover, the central processing means 21 divides the total reproduction time data by the maximum number of pieces data and stores a value obtained by the division (total reproduction time/maximum number of pieces) in the storage means 28 as first predetermined time interval data. When the total reproduction time is, for example, 2 hours (120 minutes), the first predetermined time interval is, for example, 10 minutes (2 hours/12). ((Initial operation))
The central processing means 21 generates one set of reproduction time data, which starts from a reproduction time 0, increases at the first predetermined time interval, and is smaller than the total reproduction time in accordance with the program. For example, if the one set of reproduction times is constituted by first to twelfth reproduction times, the first reproduction time is 0 minutes, the second reproduction time is 10 minutes, the third reproduction time is 20 minutes, the fourth reproduction time is 30 minutes, the fifth reproduction time is 40 minutes, the sixth reproduction time is 50 minutes, the seventh reproduction time is 60 minutes, the eighth reproduction time is 70 minutes, the ninth reproduction time is 80 minutes, the tenth reproduction time is 90 minutes, the eleventh reproduction time is 100 minutes, and the twelfth reproduction time is 110 minutes. The central processing means 21 outputs control data for extracting an image corresponding to the one set of reproduction times to the recording medium control means 23. The recording medium control means 23 inputs the control data from the central processing means 21, reads out one set of moving image data portions corresponding to the one set of reproduction times, and outputs the one set of moving image data portions to the central processing means 21. The central processing means 21 decodes the one set of moving image data portions and extracts one set of image data (e.g., 720 pixels x 480 pixels) having color information. The central processing means 21 stores the one set of image data having color information in the storage means 28. Figure 5 represents images extracted first to Qth in this way.
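The initial-setting arithmetic and the generation of the one set of reproduction times described above can be sketched as follows. This is an illustrative Python sketch only: the function names are ours, and the embodiment prescribes no particular implementation.

```python
# Illustrative sketch of the initial setting and initial operation.
# Function names are hypothetical; the embodiment prescribes no implementation.

def first_interval(total_time_min, max_pieces=12):
    """First predetermined time interval = total reproduction time
    divided by the maximum number of pieces (e.g., 120 min / 12 = 10 min)."""
    return total_time_min / max_pieces

def reproduction_times(total_time_min, interval_min):
    """One set of reproduction times: starts at 0, increases at the
    interval, and stays smaller than the total reproduction time."""
    times = []
    t = 0.0
    while t < total_time_min:
        times.append(t)
        t += interval_min
    return times

# A 2-hour moving image yields the twelve times 0, 10, ..., 110 minutes.
print(reproduction_times(120, first_interval(120)))
```

For the 2-hour example in the text, the sketch reproduces the first to twelfth reproduction times (0 through 110 minutes).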
When a reproduction time is equal to or larger than a reproduction time (e.g., the first reproduction time (0 minutes)) corresponding to the extracted image (e.g., the image extracted first) and smaller than a reproduction time (e.g., the second reproduction time (10 minutes)) corresponding to an image extracted next (e.g., the image extracted second), the central processing means 21 recognizes images belonging to reproduction times satisfying this first condition as one group. However, in a case that there is no image extracted next, when a reproduction time is equal to or larger than a reproduction time corresponding to an extracted image (e.g., the image extracted (Q-1)th) and equal to or smaller than the total reproduction time, the central processing means 21 recognizes images belonging to reproduction times satisfying this second condition as one group (e.g., the images grouped Qth). Figure 5 represents images grouped first to Qth in this way. The central processing means 21 determines images that represent the images grouped first to Qth, respectively. The central processing means 21 determines, for example, the images extracted first to Qth as images representing the images grouped first to Qth. The central processing means 21 may newly extract image data corresponding to a reproduction time larger than the first reproduction time and smaller than the reproduction time corresponding to the image extracted second and determine the image as an image representing images grouped first. The central processing means 21 changes a size (e.g., 720 pixels x 480 pixels) of an image (image having color information) representing grouped images to a size suitable for icon display (e.g., 120 pixels x 80 pixels) and stores the image in the storage means 28 as representative image data for icon display. The central processing means 21 outputs control data for displaying first to Qth representative image data for icon display in the display area on the display means 26 to the display means control means 27.
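The two grouping conditions above can be expressed, under the same caveat that this is a hypothetical sketch rather than the claimed logic itself, as a small helper that maps a reproduction time to its group:

```python
def group_of(time, extraction_times, total_time):
    """Return the 0-based group index of a frame at `time`.

    First condition: t_i <= time < t_(i+1) for consecutive extraction
    times. Second condition: for the last extracted image, membership
    runs up to and including the total reproduction time."""
    for i in range(len(extraction_times) - 1):
        if extraction_times[i] <= time < extraction_times[i + 1]:
            return i
    if extraction_times[-1] <= time <= total_time:
        return len(extraction_times) - 1
    raise ValueError("reproduction time outside the moving image")
```

For extraction times of 0, 10, and 20 minutes in a 30-minute moving image, a frame at 5 minutes falls in the first group and a frame at 25 minutes in the last group, which extends to the total reproduction time inclusive.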
The display means 26 inputs the control data from the display means control means 27 and displays the first to the Qth representative image data for icon display in the display area 38 (Figure 6). When the first to the Qth representative image data for icon display is displayed in the display area 38, the central processing means 21 can automatically select one representative image data for icon display (e.g., first representative image data for icon display). Alternatively, the user can select one representative image data for icon display (e.g., first representative image data for icon display). When one representative image data for icon display is selected, a reproduction time button 61, which corresponds to a reproduction time (e.g., the first reproduction time (0 minutes)) corresponding to the representative image data, is displayed in the total reproduction time area 39. In addition, a reproduction time period (e.g., first reproduction time ≤ reproduction time period < second reproduction time) 62, which corresponds to grouped images (e.g., the images grouped first) corresponding to the selected representative image data, is displayed in the total reproduction time area 39. When a reproduction button is selected by the user (when the reproduction button 32 is clicked or pressed) in a state in which one representative image data for icon display is selected, the central processing means 21 reproduces moving image data for a reproduction time period corresponding to the selected representative image or from a reproduction time corresponding to the selected representative image to the total reproduction time. The display means 26 displays an image under reproduction on the entire display area 38. When the user selects the return button 34 in a state in which an image under reproduction or an image after stop is displayed on the entire display area 38, the first to the Qth representative image data for icon display are displayed in the display area 38 again.
When the first to the Qth representative image data for icon display are displayed in the display area 38 again, the central processing means 21 can automatically select the selected representative image data (e.g., the first representative image data for icon display) again. The user can also select another representative image data for icon display (e.g., the second representative image data for icon display). For example, the user clicks the second representative image for icon display via the pointer 37. Alternatively, the user clicks or presses a right button constituting the image selection button 36.
When the scene cut button 30 is selected by the user in a state in which the representative image data for icon display (e.g., the second representative image data for icon display) is selected, the central processing means 21 groups images according to the scene cut mode. In the following explanation, it is assumed that the second representative image for icon display is selected.
Figure 7 represents images forming a part of one moving image data and images grouped by the scene cut mode. As shown in Figure 7, the images include J images (frames) corresponding to a reproduction time period. Figure 8 shows an example in which representative image data of grouped images (scenes) 1 to M shown in Figure 7 are displayed in the display area 38. ((Initial setting))
When the scene cut button 30 is selected by the user, the central processing means 21 stores one set of reproduction time (e.g., first to twelfth reproduction times) data in the storage means 28 as a previous one set of reproduction time (previous first to twelfth reproduction times) data. In addition, the first to the Qth representative image data for icon display are stored as previous first to Qth representative image data for icon display. Moreover, first to Qth reproduction time period data are stored as previous first to Qth reproduction time period data.
The central processing means 21 divides the previous second reproduction time period data, corresponding to the representative image data selected by the user, by the maximum number of pieces data and stores a value, which is obtained by dividing a value obtained by the division (previous second reproduction time period/maximum number of pieces) by a predetermined value (e.g., 4), (previous second reproduction time period/maximum number of pieces/4) in the storage means 28 as second predetermined time interval data. When the previous second reproduction time period is, for example, 10 minutes (600 seconds) and the maximum number of pieces is 12, the second predetermined time interval is, for example, 12.5 seconds (10 minutes/12/4). It is to be noted that the central processing means 21 may change the maximum number of pieces of representative image data to be displayed in the display area 38.
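The second predetermined time interval is again a single division chain, sketched below under the same assumptions (hypothetical function, parameter names ours):

```python
def second_interval(period_sec, max_pieces=12, divisor=4):
    """Second predetermined time interval = reproduction time period
    / maximum number of pieces / predetermined value (e.g., 4)."""
    return period_sec / max_pieces / divisor

# The text's example: a 10-minute (600 s) period gives 12.5 seconds.
print(second_interval(600))
```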
The central processing means 21 generates one set of reproduction time data that starts from a previous second reproduction time (a reproduction time at the start of a reproduction time period corresponding to the selected representative image), increases at the second predetermined time interval, and is smaller than a reproduction time (the previous third reproduction time) reached when the first predetermined time interval has elapsed from the previous second reproduction time. The one set of reproduction times is constituted by first to Nth reproduction times.
((Setting for a reference image)) The central processing means 21 extracts image data having color information corresponding to the first reproduction time (the reproduction time at the start of the previous second reproduction time period) and stores the image data in the storage means 28. The central processing means 21 converts the image data having color information into image data having only luminance information (gray information) and further changes a size (e.g., 720 pixels x 480 pixels) of the image data to a size suitable for identification (e.g., 80 pixels x 60 pixels). The central processing means 21 stores the image data having only luminance information, the size of which is changed to the size for identification, in the storage means 28 as reference image data. Preferably, the central processing means 21 subjects a histogram distribution of the image data having only luminance information to planarization processing, and stores the image data subjected to the planarization processing in the storage means 28 as reference image data. Figure 9 is a diagram for explaining the planarization processing for the histogram distribution.
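Reading the "planarization processing" of the histogram distribution as conventional histogram equalization (our interpretation, not stated explicitly in the text), the operation can be sketched on a grayscale image given as a flat list of luminance levels:

```python
def planarize(pixels, levels=256):
    """Planarize (equalize) the histogram of a grayscale image, given as
    a flat list of luminance levels 0..levels-1, by spreading the
    cumulative distribution over the full luminance range."""
    n = len(pixels)
    if n < 2:
        return pixels[:]
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    # cumulative distribution of the luminance histogram
    cdf, running = [0] * levels, 0
    for level in range(levels):
        running += hist[level]
        cdf[level] = running
    # map each pixel through the normalized cumulative distribution
    return [round((cdf[p] - 1) * (levels - 1) / (n - 1)) for p in pixels]
```

For example, four pixels crowded into the dark levels 0 to 3 are spread over the full 0 to 255 range, which is the flattening effect Figure 9 illustrates.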
((Model determination for a reference image)) Preferably, the central processing means 21 determines an image model from the histogram distribution of the reference image data and stores the determined image model data in the storage means 28. For example, it is possible to set an image model A as an image in which a histogram distribution concentrates in a dark portion or a bright portion of a luminance level (when one pixel is represented by eight bits, for example, 70% or more of the total number of pixels concentrates in the 0 to 63 level or the 192 to 255 level), set an image model B to an image other than the image model A in which a histogram distribution concentrates in a specific portion of a luminance level (when one pixel is represented by eight bits, for example, 70% or more of the total number of pixels concentrates in the 32 to 63 level, the 64 to 127 level, the 96 to 159 level, the 128 to 191 level, or the 160 to 223 level), and set an image model C to an image other than the image models A and B. Figure 10 is a diagram for explaining the image models.
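The model determination can be sketched as a classification over a 256-bin luminance histogram. Note one assumption: the last specific band is taken as 160 to 223, treating the printed "233" as a scanning artifact of the 32-level stride of the other bands.

```python
def image_model(hist, ratio=0.70):
    """Classify a 256-bin luminance histogram into model A, B, or C.

    A: >= 70% of all pixels lie in the dark (0-63) or bright (192-255)
       portion of the luminance range.
    B: not A, but >= 70% lie in one of the specific bands below (last
       band assumed to be 160-223; the printed "233" is read as a
       scanning artifact).
    C: neither A nor B."""
    total = sum(hist)
    share = lambda lo, hi: sum(hist[lo:hi + 1]) / total
    if share(0, 63) >= ratio or share(192, 255) >= ratio:
        return "A"
    for lo, hi in [(32, 63), (64, 127), (96, 159), (128, 191), (160, 223)]:
        if share(lo, hi) >= ratio:
            return "B"
    return "C"
```

A histogram concentrated at a dark level classifies as model A, one concentrated mid-range as model B, and a uniform histogram as model C.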
((Display of a representative image)) The central processing means 21 changes a size (e.g., 720 pixels x 480 pixels) of the image data having color information corresponding to the reference image data to a size suitable for icon display (e.g., 120 pixels x 80 pixels). The central processing means 21 stores the image data having color information, the size of which is changed to the size suitable for icon display, in the storage means 28 as the first representative image data. The display means 26 displays the first representative image data in the display area 38 (Figure 8). ((Setting for an identification object image))
The central processing means 21 extracts an image having color information corresponding to the second reproduction time (a reproduction time lapsed by the second predetermined time interval from the first reproduction time) and stores the image in the storage means 28.
The central processing means 21 converts the image having color information to an image having only luminance information and further changes a size of image data to a size suitable for identification. The central processing means 21 stores the image having only luminance information, the size of which is changed to the size for identification, in the storage means 28 as identification object image data.
Preferably, like the reference image data, the central processing means 21 subjects a histogram distribution of the image data having only luminance information to the planarization processing and stores the planarized image data in the storage means 28 as identification object image data.
((Model determination for an identification object image))
Preferably, the central processing means 21 determines an image model from a histogram distribution of identification object image data and stores the determined image model data in the storage means 28. ((Calculation of a fitness value))
The central processing means 21 calculates a sum of absolute values of differences between luminance levels of respective pixels of reference image data and luminance levels of respective corresponding pixels of identification object image data. Further, the central processing means 21 subtracts a value, which is obtained by dividing the sum by the total number of pixels and a maximum value of the luminance levels, from 1. The central processing means 21 stores an obtained value in the storage means 28 as fitness data.
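The fitness computation just described is a normalized sum of absolute luminance differences. A minimal sketch (function name ours; an actual identification size such as 80 x 60 pixels would give 4800-element lists):

```python
def fitness(reference, target, max_level=255):
    """Fitness value = 1 - (sum of absolute luminance differences)
    / (total number of pixels) / (maximum luminance level),
    normalized to the range 0..1."""
    assert len(reference) == len(target)
    diff_sum = sum(abs(r - t) for r, t in zip(reference, target))
    return 1 - diff_sum / len(reference) / max_level
```

With four pixels, an all-0 reference against an all-255 identification object gives fitness 0 (0% coincidence), and two all-0 images give fitness 1 (100% coincidence).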
In order to explain the fitness data more easily, a case in which the total number of pixels of the reference and identification object image data is four and the maximum value of the luminance levels is the 255 level is assumed. For example, concerning the reference image data, luminance levels of the first, second, third, and fourth pixels are 0, 0, 0, and 0, respectively, and concerning the identification object image data, luminance levels of the first, second, third, and fourth pixels are 255, 255, 255, and 255, respectively. The central processing means 21 calculates a sum = |0-255|+|0-255|+|0-255|+|0-255| = 1020. Moreover, the central processing means 21 obtains a fitness value = 1-(1020/4/255) = 0. As another example, concerning the reference image data, luminance levels of the first, second, third, and fourth pixels are 0, 0, 0, and 0, respectively, and concerning the identification object image data, luminance levels of the first, second, third, and fourth pixels are 0, 0, 0, and 0, respectively. The central processing means 21 calculates a sum = |0-0|+|0-0|+|0-0|+|0-0| = 0 and obtains a fitness value = 1-(0/4/255) = 1. In this way, the fitness value is normalized in a range of 0 to 1. The fitness value 0 means that the reference image data and the identification object image data coincide with each other 0%, and the fitness value 1 means that the reference image data and the identification object image data coincide with each other 100%. Note that the total number of pixels of reference and identification object image data to be actually used depends on the size for identification (e.g., in the case of 80 pixels x 60 pixels, 80x60 = 4800). ((Identification technique 1))
When fitness data is smaller than a predetermined value (e.g., 0.88), the central processing means 21 determines that reference image data and identification object image data are different. On the other hand, when the fitness data is equal to or larger than the predetermined value (e.g., 0.88), the central processing means 21 determines that the reference image data and the identification object image data are not different in accordance with the program. The central processing means 21 stores determination result data, which indicates whether the reference image data and the identification object image data are different, in the storage means 28. When it is determined that the reference image data and the identification object image data are not different, preferably, the central processing means 21 divides a range from 1 to the predetermined value into plural sections and stores class data indicating a coincidence in the storage means 28 for each of the sections. For example, 1 to 0.88 is divided into three sections (1 to 0.95, 0.95 to 0.93, and 0.93 to 0.88). When the fitness data is equal to or larger than 0.95, it is determined that the reference image data and the identification object image data are identical. When the fitness data is smaller than 0.95 and equal to or larger than 0.93, the central processing means 21 determines that the reference image data and the identification object image data are similar. When the fitness data is smaller than 0.93 and equal to or larger than 0.88, the central processing means 21 determines that the reference image data and the identification object image data are partially similar. It is preferable to determine classes representing a coincidence (identical, similar, and partially similar) in this way. ((Identification technique 2))
Preferably, when image model data of the reference image data is not an image model A in which a histogram distribution concentrates in a dark portion or a bright portion of a luminance level and when fitness data is smaller than a predetermined value (e.g., 0.88), the central processing means 21 determines that the reference image data and the identification object image data are different. Preferably, the central processing means 21 determines classes representing a coincidence (identical, similar, and partially similar).
In addition, when image model data of the reference image data is the image model A in which a histogram distribution concentrates in a dark portion or a bright portion of a luminance level and when fitness data is smaller than a second predetermined value (e.g., 0.982), the central processing means 21 determines that the reference image data and the identification object image data are different. Preferably, the central processing means 21 determines classes representing a coincidence (identical (fitness value≥0.994), similar (0.994>fitness value≥0.988), and partially similar (0.988>fitness value≥0.982)). ((Identification technique 3))
More preferably, when image model data of the reference image data and image model data of the identification object image data are different and the image model data of the reference image data is not the image model A in which a histogram distribution concentrates in a dark portion or a bright portion of a luminance level, and when fitness data is smaller than a first predetermined value (e.g., 0.88), the central processing means 21 determines that the reference image data and the identification object image data are different. Preferably, the central processing means 21 determines classes representing a coincidence (identical, similar, and partially similar).
In addition, when image model data of the reference image data and image model data of the identification object image data are different and the image model data of the reference image data is the image model A in which a histogram distribution concentrates in a dark portion or a bright portion of a luminance level and when fitness data is smaller than a second predetermined value (e.g., 0.982), the central processing means 21 determines that the reference image data and the identification object image data are different. Preferably, the central processing means 21 determines classes representing a coincidence (identical, similar, and partially similar).
Moreover, when image model data of the reference image data and image model data of the identification object image data coincide with each other and when fitness data is smaller than a third predetermined value (e.g., 0.86), the central processing means 21 determines that the reference image data and the identification object image data are different. Preferably, the central processing means 21 determines classes representing a coincidence (identical (fitness value≥0.95), similar (0.95>fitness value≥0.93), and partially similar (0.93>fitness value≥0.86)).
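Identification technique 3, the most elaborate of the three, can be sketched as follows. The thresholds are taken from the text; the class boundaries for the model A case follow those given for technique 2, which the text does not restate here, and the function name is ours.

```python
def identify(fit, ref_model, obj_model):
    """Identification technique 3: pick thresholds from the image
    models, then judge 'different' or a coincidence class.

    Models coincide -> threshold 0.86 (class cuts 0.95/0.93);
    models differ, reference is model A -> 0.982 (0.994/0.988);
    models differ, reference not model A -> 0.88 (0.95/0.93)."""
    if ref_model == obj_model:
        cut, hi, mid = 0.86, 0.95, 0.93
    elif ref_model == "A":
        cut, hi, mid = 0.982, 0.994, 0.988
    else:
        cut, hi, mid = 0.88, 0.95, 0.93
    if fit < cut:
        return "different"
    if fit >= hi:
        return "identical"
    if fit >= mid:
        return "similar"
    return "partially similar"
```

For instance, a fitness of 0.87 is "partially similar" when the models coincide but "different" when they differ, reflecting the looser third predetermined value for coinciding models.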
((Update of an identification object image)) Determination result data indicating whether reference image data and identification object image data are different and class data representing a degree of coincidence of the reference image data and the identification object image data are stored in the storage means 28 as previous determination result data and previous class data, respectively.
Regardless of contents of the determination result data and the class data, the central processing means 21 extracts an image having color information corresponding to the next reproduction time (third reproduction time) out of a set of reproduction times.
The central processing means 21 generates identification object image data, image model data, and fitness data from the image data having color information corresponding to the next reproduction time and obtains determination result data and class data.
((Update of reference image)) When it is determined that the reference image data and the identification object image data are different, the central processing means 21 stores the identification object image data (image data having only luminance information, a size of which is changed to a size suitable for identification) in the storage means 28 as new reference image data. In addition, the central processing means 21 stores image model data of the identification object image data in the storage means 28 as new image mode data of the reference image data.
Preferably, when it is determined that the reference image data and the identification object image data are not different, the central processing means 21 determines whether previous class data and present class data coincide with each other. Even when it is determined that the reference image data and the identification object image data are not different, when the previous class data and the present class data are different, the central processing means 21 stores present identification object image data in the storage means 28 as new reference image data. In addition, the central processing means 21 stores image model data of the present identification object image data in the storage means 28 as new image model data of the reference image data.
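The update rule for the reference image described above condenses into a single predicate (a hypothetical helper; names ours):

```python
def update_reference(is_different, prev_class, present_class):
    """The reference image is replaced when the identification object
    is judged different, or when the two are judged not different but
    the coincidence class (identical/similar/partially similar) has
    changed since the previous comparison."""
    return is_different or prev_class != present_class
```

For example, when the previous comparison was "identical" and the present one is "similar", the reference is updated even though the images are not judged different; two consecutive "similar" results leave the reference unchanged.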
For example, a state in which image data corresponding to a first reproduction time is reference image data, image data corresponding to a second reproduction time is previous identification object image data, and image data corresponding to a third reproduction time is present identification object image data is assumed. In this state, when it is determined that the reference image data and the previous identification object image data are not different and the reference image data and the previous identification object image data are identical, and when it is determined that the reference image data and the present identification object image data are not different and the reference image data and the present identification object image data are similar, the present identification object image (the image corresponding to the third reproduction time) data is stored as new reference image data.
((Display of a representative image))
Every time it is determined that the reference image data and the identification object image data are different and therefore a reference image is updated, the central processing means 21 changes a size of image data having color information corresponding to the updated reference image data to a size suitable for icon display and stores the image data in the storage means 28 as representative image data. The display means 26 displays second to Mth representative image data generated in this way in the display area 38 (Figure 8).
((Advantages of a scene cut mode))
Respective representative image data of the images grouped into M groups are displayed in the display area 38 as shown in Figure 8. Since plural representative image data, which are determined as different, are displayed in the display area 38, a user can comprehend the contents of a moving image easily. In other words, the user can search for a desired image (scene) easily. In addition, since it is possible to display the plural representative image data, which are determined as different, in the display area 38 instantaneously, it is possible to comprehend the contents of a moving image instantaneously.
Moreover, even when it is determined that the reference image data and the identification object image data are not different, when previous class data and present class data are different, the central processing means 21 updates the reference image data. Consequently, it is possible to recognize moving scenes as one group rather than recognizing the scenes as separate groups. As a result, the user can comprehend contents of a moving image more easily.
((Reproduction of an image))
When first to Mth representative image data for icon display are displayed in the display area 38, the central processing means 21 can also automatically select one representative image data for icon display (e.g., first representative image data for icon display). Alternatively, the user can also select one representative image data for icon display (e.g., first representative image data for icon display). When one representative image data for icon display is selected, the reproduction time button 61, which corresponds to a reproduction time corresponding to the representative image data, is displayed in the total reproduction time area 39. In addition, the reproduction time period (e.g., reproduction time corresponding to first representative image data ≤ reproduction time period < reproduction time corresponding to second representative image data) 62, which corresponds to grouped images (e.g., images grouped first) corresponding to the selected representative image data, is displayed in the total reproduction time area 39. When a reproduction button is selected by the user in a state in which one representative image data for icon display is selected, the central processing means 21 reproduces moving image data for a reproduction time period corresponding to the selected representative image or from a reproduction time corresponding to the selected representative image to the total reproduction time. The display means 26 displays an image under reproduction in the entire display area 38. ((Sub-grouping)) When the scene cut button 30 is further selected by the user in a state in which one representative image data among the first to the Mth representative image data for icon display is selected, the central processing means 21 can sub-group images according to a scene cut mode.
When a reproduction button is selected by the user in a state in which representative image data of the sub-grouped images (scenes) are displayed in the display area 38 and one representative image data is selected, the central processing means 21 reproduces moving image data for a reproduction time period corresponding to the selected representative image or from a reproduction time corresponding to the selected representative image to the total reproduction time. The display means 26 displays an image under reproduction in the entire display area 38. When the return button 34 is selected by the user in a state in which the representative image data of the sub-grouped images (scenes) are displayed in the display area 38, the representative image data of images (scenes) 1 to M grouped previously are displayed in the display area 38 again.
(Scene search mode)
Regardless of whether the scene cut mode is used, the user can select the reproduction button 32 to reproduce a moving image. In addition, the user can select the suspension button 35 to stop a moving image under reproduction at a certain image (scene). Thereafter, when the user selects the scene search button 31, a scene search mode to be described below is executed. Figure 11 represents images constituting one moving image data, a suspended image (reference image), and images grouped by the scene search mode. Figure 12 shows an example in which suspended image data and representative image data of the grouped images (scenes) 1 to T shown in Figure 11 are displayed in the display area 38. ((Initial setting))
The central processing means 21 stores, for example, 0.1 second in the storage means 28 as predetermined time interval data in accordance with the program. ((Generation of a reference image))
The central processing means 21 converts suspended image (image having color information) data into image data having only luminance information and further changes a size (e.g., 720 pixels x 480 pixels) of the image data to a size suitable for identification (e.g., 80 pixels x 60 pixels) in accordance with the program. The central processing means 21 stores the image data having only luminance information, the size of which is changed to the size for identification, in the storage means 28 as reference image data. Preferably, the central processing means 21 subjects a histogram distribution of the image data having only luminance information to planarization processing and stores the image data subjected to the planarization processing in the storage means 28 as reference image data.
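The reference-image preparation above can be sketched in Python. The patent gives no code, so the function names, the BT.601 luma weights, the nearest-neighbour resize, and the equalization formula are all assumptions; images are represented as nested lists for self-containment.

```python
# Sketch of reference-image preparation: RGB -> luminance only, resize to the
# identification size, then "planarize" (equalize) the histogram.

def to_luminance(rgb_rows):
    # ITU-R BT.601 luma weights -- a common choice; the patent only says
    # "image data having only luminance information".
    return [[int(0.299 * r + 0.587 * g + 0.114 * b) for (r, g, b) in row]
            for row in rgb_rows]

def resize_nearest(rows, out_w, out_h):
    # Nearest-neighbour downscale, e.g. 720x480 -> 80x60.
    in_h, in_w = len(rows), len(rows[0])
    return [[rows[y * in_h // out_h][x * in_w // out_w] for x in range(out_w)]
            for y in range(out_h)]

def equalize(rows, levels=256):
    # Histogram planarization: map each level through the cumulative
    # distribution so the histogram becomes approximately flat.
    flat = [p for row in rows for p in row]
    hist = [0] * levels
    for p in flat:
        hist[p] += 1
    cdf, total = [], 0
    for c in hist:
        total += c
        cdf.append(total)
    n = len(flat)
    lut = [round(cdf[v] * (levels - 1) / n) for v in range(levels)]
    return [[lut[p] for p in row] for row in rows]
```

A real implementation would operate on decoded frame buffers; the pipeline order (luminance, resize, equalize) follows the text.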
((Model determination for a reference image)) Preferably, the central processing means 21 determines an image model from a histogram distribution of the reference image data and stores the determined image model data in the storage means 28. For example, the central processing means 21 can set an image model A as an image in which a histogram distribution concentrates in a dark portion or a bright portion of a luminance level, set an image model B to an image other than the image model A in which a histogram distribution concentrates in a specific portion of a luminance level, and set an image model C to an image other than the image models A and B. ((Display of a reference image))
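The model determination can be sketched as a concentration test on the luminance histogram. The patent states the rule only qualitatively, so the 0.8 concentration threshold, the quarter-range regions, and the sliding-window scheme below are hypothetical interpretations:

```python
# Hypothetical classifier for image models A, B, and C:
# A -- histogram concentrated in a dark or bright portion,
# B -- concentrated in some other specific portion,
# C -- everything else.

def classify_model(hist, levels=256, concentration=0.8):
    total = sum(hist)
    dark = sum(hist[: levels // 4])        # lowest quarter of levels
    bright = sum(hist[3 * levels // 4:])   # highest quarter of levels
    if dark / total >= concentration or bright / total >= concentration:
        return "A"
    # Slide a quarter-width window to detect concentration elsewhere.
    window = levels // 4
    for start in range(0, levels - window + 1, window // 2):
        if sum(hist[start:start + window]) / total >= concentration:
            return "B"
    return "C"
```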
The central processing means 21 changes a size (e.g., 720 pixels x 480 pixels) of suspended image (image having color information) data to a size suitable for icon display (e.g., 120 pixels x 80 pixels) and stores the image in the storage means 28. The display means 26 displays the image data, the size of which is changed to the size for icon display, in the display area 38 (Figure 12). ((Setting for an identification object image))
The central processing means 21 generates one set of reproduction time data that start from a reproduction time 0 and increase at a predetermined time interval (e.g., 0.1 second). For example, the one set of reproduction time data are constituted by first to Cth reproduction time. The central processing means 21 extracts an image having color information that constitutes image data corresponding to a first reproduction time (e.g., 0 second) and stores the image in the storage means 28. The central processing means 21 converts the image having color information into an image having only luminance information and further changes a size of the image data to a size suitable for identification. The central processing means 21 stores the image having only luminance information, the size of which is changed to the size for identification, in the storage means 28 as identification object image data. Preferably, like the reference image data, the central processing means 21 subjects a histogram distribution of the image data having only luminance information to planarization processing and stores the planarized image data in the storage means 28 as identification object image data. ((Model determination for an identification object image)) The central processing means 21 determines an image model from a histogram distribution of the identification object image data and stores the determined image model data in the storage means 28. ((Pre-processing of an identification technique))
The central processing means 21 determines whether image model data of the reference image data and image model data of the identification object image data coincide with each other. ((Calculation of a fitness value))
When image model data of the reference image data and image model data of the identification object image data coincide with each other, the central processing means 21 calculates a sum of absolute values of differences between luminance levels of respective pixels of the reference image data and luminance levels of corresponding respective pixels of the identification object image data. Moreover, the central processing means 21 subtracts a value, which is obtained by dividing the sum by the total number of pixels and a maximum value of the luminance levels, from 1. The central processing means 21 stores an obtained value in the storage means 28 as fitness data.
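The fitness calculation above reduces to one formula: 1 minus the sum of absolute luminance differences, normalised by the pixel count and the maximum level. A sketch (pure Python, images as nested lists; the function name is an assumption):

```python
# Fitness value: identical images give 1.0, maximally different images 0.0.

def fitness(ref_rows, obj_rows, max_level=255):
    # Sum of absolute differences between corresponding pixels.
    diff_sum = sum(abs(a - b)
                   for ref_row, obj_row in zip(ref_rows, obj_rows)
                   for a, b in zip(ref_row, obj_row))
    n_pixels = len(ref_rows) * len(ref_rows[0])
    # Subtract the normalised sum from 1, as described in the text.
    return 1.0 - diff_sum / (n_pixels * max_level)
```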
When image model data of the reference image data and image model data of the identification object image data do not coincide with each other, the central processing means 21 does not have to obtain fitness data. ((Identification technique 1))
When image model data of the reference image data and image model data of the identification object image data coincide with each other and fitness data is smaller than a predetermined value (e.g., 0.895), the central processing means 21 determines that the reference image data and the identification object image data are different. On the other hand, when fitness data is equal to or larger than the predetermined value (e.g., 0.895), the central processing means 21 determines, in accordance with the program, that the reference image data and the identification object image data are not different. The central processing means 21 stores determination result data, which indicates whether the reference image data and the identification object image data are different, in the storage means 28. When it is determined that the reference image data and the identification object image data are not different, preferably, the central processing means 21 divides a range of 1 to the predetermined value into plural sections and stores class data representing a coincidence in the storage means 28 for each of the sections. For example, the central processing means 21 divides 1 to 0.895 into three sections (1 to 0.95, 0.95 to 0.93, and 0.93 to 0.895). When fitness data is equal to or larger than 0.95, the central processing means 21 determines that the reference image data and the identification object image data are identical. When fitness data is smaller than 0.95 and equal to or larger than 0.93, the central processing means 21 determines that the reference image data and the identification object image data are similar. When fitness data is smaller than 0.93 and equal to or larger than 0.895, the central processing means 21 determines that the reference image data and the identification object image data are partially similar. It is preferable to determine classes representing a coincidence (identical, similar, and partially similar) in this way.
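Identification technique 1 can be sketched as a threshold cascade. The thresholds are the example values from the text; representing the result as a string is an assumption:

```python
# Identification technique 1: a single "different" threshold (0.895), with the
# not-different range split into three coincidence classes.

def identify1(fit, threshold=0.895):
    if fit < threshold:
        return "different"
    if fit >= 0.95:
        return "identical"
    if fit >= 0.93:
        return "similar"
    return "partially similar"   # 0.895 <= fit < 0.93
```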
((Identification technique 2))
Preferably, when image model data of the reference image data and image model data of the identification object image data coincide with each other, when image model data of the reference image data is not the image models A and B and when fitness data is smaller than a first predetermined value (e.g., 0.895), the central processing means 21 determines that the reference image data and the identification object image data are different. Preferably, the central processing means 21 determines classes representing a coincidence (identical, similar, and partially similar). In addition, when image model data of the reference image data is the image model A or B and when fitness data is smaller than a second predetermined value (e.g., 0.982), the central processing means 21 determines that the reference image data and the identification object image data are different. Preferably, the central processing means 21 determines classes representing a coincidence (identical (fitness value≥0.994), similar (0.994>fitness value≥0.988), and partially similar (0.988>fitness value≥0.982)).
((Identification technique 3)) More preferably, when image model data of the reference image data and image model data of the identification object image data coincide with each other, when the image model data of the reference image data is not the image models A and B and when fitness data is smaller than a first predetermined value (e.g., 0.895), the central processing means 21 determines that the reference image data and the identification object image data are different. Preferably, the central processing means 21 determines classes representing a coincidence (identical, similar, and partially similar).
In addition, when image model data of the reference image data is the image model B and when fitness data is smaller than a second predetermined value (e.g., 0.982), the central processing means 21 determines that the reference image data and the identification object image data are different. Preferably, the central processing means 21 determines classes representing a coincidence. Moreover, when image model data of the reference image data is the image model A and when fitness data is smaller than a third predetermined value (e.g., 0.994), the central processing means 21 determines that the reference image data and the identification object image data are different. Preferably, the central processing means 21 determines classes representing a coincidence (identical (fitness value≥0.998), similar (0.998>fitness value≥0.996), and partially similar (0.996>fitness value≥0.994)). ((Update of an identification object image)) Determination result data indicating whether reference image data and identification object image data are different and class data representing a degree of coincidence of the reference image data and the identification object image data are stored in the storage means 28 as previous determination result data and previous class data, respectively. Regardless of contents of the determination result data and the class data, the central processing means 21 extracts an image having color information corresponding to the next reproduction time (second reproduction time) out of one set of reproduction times. The central processing means 21 generates identification object image data, image model data, and fitness data from the image data having color information corresponding to the next reproduction time and obtains determination result data and class data. ((Display of a representative image))
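The model-dependent thresholds of identification technique 3 can be tabulated. A sketch, with model "C" standing in for "not A and not B" and the boundary values taken from the example figures in the text:

```python
# Identification technique 3: the "different" threshold and the class
# boundaries depend on the image model of the reference image.
# Each triple is (identical, similar, different) cut-offs.

BOUNDS = {
    "A": (0.998, 0.996, 0.994),   # model A: different below 0.994
    "B": (0.994, 0.988, 0.982),   # model B: different below 0.982
    "C": (0.95, 0.93, 0.895),     # other models: different below 0.895
}

def identify3(fit, ref_model, obj_model):
    if ref_model != obj_model:
        return "different"        # image models must coincide first
    ident, sim, diff = BOUNDS[ref_model]
    if fit < diff:
        return "different"
    if fit >= ident:
        return "identical"
    if fit >= sim:
        return "similar"
    return "partially similar"
```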
When it is determined that the reference image data and the identification object image data are not different, the central processing means 21 changes a size of image data having color information corresponding to the identification object image data to a size suitable for icon display and stores the image data in the storage means 28 as the representative image data.
Preferably, when it is determined that the reference image data and the identification object image data are not different, the central processing means 21 further compares the contents of the previous class data with the contents of the present class data. Even when it is determined that the reference image data and the identification object image data are not different, when the contents of the present class data are identical and the contents of the previous class data are identical, the central processing means 21 does not generate representative image data from present identification object image data. In addition, even when it is determined that the reference image data and the identification object image data are not different, when the contents of the present class data are similar or partially similar and the contents of the previous class data are similar or partially similar, the central processing means 21 does not generate representative image data from the present identification object image data.
For example, a case in which it is determined that the reference image data and identification object image data corresponding to a second reproduction time are not different is assumed.
In this state, when it is determined that the reference image data and the identification object image data corresponding to the second reproduction time are identical and it is further determined that the reference image data and identification object image data corresponding to a first reproduction time are identical, representative image data is not generated from the identification object image data corresponding to the second reproduction time. On the other hand, when it is determined that the reference image data and the identification object image data corresponding to the second reproduction time are identical and it is further determined that the reference image data and the identification object image data corresponding to the first reproduction time are similar, representative image data is generated from the identification object image data corresponding to the second reproduction time.
In addition, in this state, when it is determined that the reference image data and the identification object image data corresponding to the second reproduction time are similar and it is further determined that the reference image data and the identification object image data corresponding to the first reproduction time are partially similar, representative image data is not generated from the identification object image data corresponding to the second reproduction time. On the other hand, when it is determined that the reference image data and the identification object image data corresponding to the second reproduction time are similar and it is further determined that the reference image data and the identification object image data corresponding to the first reproduction time are identical, representative image data is generated from the identification object image data corresponding to the second reproduction time.
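The suppression rule illustrated by these examples can be sketched as follows, assuming (as the text suggests) that "similar" and "partially similar" form one group for the comparison:

```python
# A representative image is generated only when the class changes between the
# "identical" group and the "similar / partially similar" group.

NEAR = {"similar", "partially similar"}

def should_generate(prev_class, present_class):
    if present_class == "identical" and prev_class == "identical":
        return False               # no change: identical -> identical
    if present_class in NEAR and prev_class in NEAR:
        return False               # no change within the near group
    return True
```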
The display means 26 displays the first to the Tth representative image data generated in this way in the display area 38 (Figure 12).
More preferably, the display means 26 displays an indication 70 indicating contents of class data of the representative image data (identical, similar, and partially similar).
((Reproduction of an image))
When first to Tth representative image data for icon display are displayed in the display area 38, the central processing means 21 can automatically select one representative image data for icon display (e.g., first representative image data for icon display). Alternatively, a user can select one representative image data for icon display (e.g., first representative image data for icon display). When one representative image data for icon display is selected, the reproduction time button 61, which corresponds to a reproduction time corresponding to the representative image data, is displayed in the total reproduction time area 39. In addition, a reproduction time period (e.g., reproduction time corresponding to first representative image data ≤ reproduction time period < reproduction time corresponding to second representative image data) 62, which corresponds to grouped images (e.g., the images grouped first) corresponding to the selected representative image data, is displayed in the total reproduction time area 39. When a reproduction button is selected by the user in a state in which one representative image data for icon display is selected, the central processing means 21 reproduces moving image data for a reproduction time period corresponding to the selected representative image or from a reproduction time corresponding to the selected representative image to the total reproduction time. The display means 26 displays an image under reproduction in the entire display area 38. ((Advantages of the scene search mode)) As shown in Figure 12, since an image identical with the reference image data is displayed in the display area 38, a user can search for a desired image (scene) easily.
In addition, since an image similar or partially similar to the reference image data is displayed in the display area 38, the user can search for a desired image (e.g., an image different from the reference image partially or an enlarged image of the reference image). ((Modification of a reference image)) The central processing means 21 may generate reference image data not only from suspended image (image having color information) data but also from an image a few frames before the suspended image and an image a few frames after the suspended image in accordance with the program. Figure 13 shows an example in which the central processing means 21 generates one set of reference image data from a suspended image, an image one frame before the suspended image, and an image one frame after the suspended image.
As shown in Figure 13, when the central processing means 21 generates plural reference images, since a larger number of representative images will be displayed, it is more likely that a desired image of a user is included in the representative images.
(B. Second embodiment)
Figure 14 shows a block diagram of an apparatus that stores a moving image in accordance with the invention. An apparatus 80, which stores a moving image in accordance with the invention, comprises a central processing means 21, a recording medium control means 23 that controls a recording medium 22, an input means control means 25 that controls an input means 24, a display means control means 27 that controls a display means 26, a storage device 28, and a video signal output means control means 82 that controls a video signal output means 81.
Moving image data is not stored in the recording medium 22 in advance; moving image data is stored in the recording medium 22 in accordance with the invention. A program for storing an image in accordance with the invention is stored in the storage device 28. Note that this program may be stored in the recording medium 22. A computer program product comprises this program stored on a central processing means 21 usable medium such as a semiconductor memory, an HD, a DVD, an FD, an MO disk, a CD, etc. The video signal output means 81 is, for example, a TV tuner, a video camera, a computer, an HD player, or a DVD player. The video signal output means control means 82 is, for example, an interface between the video signal output means 81 and the central processing means 21 or an input terminal for a video signal. The central processing means 21 outputs control data for displaying a recording button 90, a stop button 33, a moving image input selection button 91, a pointer 37, and a display area 38 on a display means 26 to a display means control means 27 in accordance with the program. The display means 26 inputs the control data from the display means control means 27 and displays the recording button 90, the stop button 33, the moving image input selection button 91, the pointer 37, and the display area 38. Figure 15 shows an example of a GUI to be displayed on the display means 26. Note that, in Figure 15, the moving image input selection button 91 includes 1 to 6 channel buttons and an external input button.
Instead of displaying the buttons 90, 33, and 91 and the pointer 37 with the display means 26, the input means 24 (e.g., a remote controller) may comprise those buttons 90, 33, and 91.
When the moving image input selection button 91 (e.g., the 1 channel button or the external input button) is selected by a user, the central processing means 21 outputs control data for inputting a corresponding video signal to the video signal output means control means 82. The video signal output means 81 inputs the control data from the video signal output means control means 82 and outputs a corresponding video signal to the video signal output means control means 82. The central processing means 21 inputs the corresponding video signal from the video signal output means control means 82 and outputs control data for displaying a video signal on the display means 26 to the display means control means 27. The display means 26 displays a video in the entire display area 38.
When the recording button 90 is selected (the recording button 90 is clicked or pressed) by the user in a state in which a video is displayed in the display area 38, the central processing means 21 executes a chapter mode to be explained below. When the stop button 33 is selected by the user, the central processing means 21 ends the chapter mode.
(Chapter mode)
Figure 16 represents images forming one moving image data and representative images of images grouped by the chapter mode. As shown in Figure 16, moving image data to be stored in accordance with the invention includes H images (frames) corresponding to a total storage time. Note that the chapter mode generates a representative image according to the same technique as the scene cut mode described above. ((Initial setting))
The central processing means 21 stores, for example, 0.033 second in the storage means 28 as predetermined time interval data.
((Start of storage of moving image data)) When the recording button 90 is selected by a user, the central processing means 21 encodes a video signal at a predetermined rate (e.g., 30 image frames/second) and outputs control data for storing moving image data in the recording medium 22 to the recording medium control means 23 in accordance with the program.
((Setting for a reference image))
The central processing means 21 extracts image data having color information from a video signal corresponding to a first storage time (e.g., 0 second) and stores the image data in the storage means 28. The central processing means 21 converts the image data having color information into image data having only luminance information (gray information) and further changes a size (e.g., 720 pixels x 480 pixels) of the image data to a size suitable for identification (e.g., 80 pixels x 60 pixels). The central processing means 21 stores the image data having only luminance information, the size of which is changed to the size for identification, in the storage means 28 as reference image data. Preferably, the central processing means 21 subjects a histogram distribution of the image data having only luminance information to planarization processing and stores the image data subjected to the planarization processing in the storage means 28 as reference image data.
((Model determination for a reference image)) Preferably, the central processing means 21 determines an image model from the histogram distribution of the reference image data and stores the determined image model data in the storage means 28.
((Setting for a representative image))
The central processing means 21 stores image data having color information corresponding to the reference image data in the storage means 28 as first representative image data.
((Setting for an identification object image)) The central processing means 21 extracts image data having color information from a video signal corresponding to a second storage time (a storage time lapsed by the predetermined time interval from the first storage time) and stores the image data in the storage means 28. The central processing means 21 converts an image having color information into an image having only luminance information and further changes a size of the image data to a size suitable for identification. The central processing means 21 stores the image having only luminance information, the size of which is changed to the size for identification, in the storage means 28 as identification object image data. Preferably, like the reference image data, the central processing means 21 subjects a histogram distribution of the image data having only luminance information to the planarization processing and stores the planarized image data in the storage means 28 as identification object image data.
((Model determination for an identification object image)) Preferably, the central processing means 21 determines an image model from a histogram distribution of identification object image data and stores the determined image model data in the storage means 28. ((Calculation of a fitness value))
The central processing means 21 calculates a sum of absolute values of differences between luminance levels of respective pixels of the reference image data and luminance levels of corresponding respective pixels of the identification object image data. Moreover, the central processing means 21 subtracts a value, which is obtained by dividing the sum by the total number of pixels and a maximum value of the luminance levels, from 1. The central processing means 21 stores an obtained value in the storage means 28 as fitness data.
((Identification technique))
The central processing means 21 determines whether the reference image data and the identification object image data are different with the same technique (one of the identification techniques 1 to 3) as that in the scene cut mode explained above and stores determination result data in the storage means 28. When it is determined that the reference image data and the identification object image data are not different, preferably, a coincidence (identical, similar, and partially similar) between the reference image data and the identification object image data is determined, and class data is stored in the storage means 28. ((Update of an identification object image))
The determination result data, which indicates whether the reference image data and the identification object image data are different, and the class data, which indicates whether the reference image data and the identification object image data coincide with each other, are stored in the storage means 28 as previous determination result data and previous class data, respectively.
Regardless of contents of the determination result data and the class data, the central processing means 21 generates identification object image data, image model data, and fitness data from a video signal corresponding to the next storage time (a storage time lapsed by the predetermined time interval from the second storage time) and obtains determination result data and class data. ((Update of a reference image)) When it is determined that the reference image data and the identification object image data are different, the central processing means 21 stores the identification object image data in the storage means 28 as new reference image data. In addition, the central processing means 21 stores image model data of the identification object image data in the storage means 28 as new image model data of the reference image data.
Preferably, even when it is determined that the reference image data and the identification object image data are not different, when previous class data and present class data are different, the central processing means 21 stores present identification object image data in the storage means 28 as new reference image data. In addition, the central processing means 21 stores image model data of the present identification object image data in the storage means 28 as new image model data of the reference image data.
((Setting for a representative image)) Every time it is determined that the reference image data and the identification object image data are different and therefore a reference image is updated, the central processing means 21 stores image data having color information corresponding to the updated reference image data in the storage means 28 as representative image data. ((End of storage of moving image data)) When the stop button 33 is selected by the user, the central processing means 21 stops encoding a video signal, and the recording medium 22 stops storing moving image data. ((Generation of chapter data))
The central processing means 21 stores a set of storage time data (i.e., a set of reproduction time data) corresponding to first to Gth representative images in the recording medium 22 as chapter data. ((Advantage of a chapter mode))
When moving image data is stored, a chapter is automatically created. As a result, operations required for setting a chapter are not complicated.
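The core loop of the chapter mode can be sketched as follows. This sketch omits the preferable class-change reference update and assumes a hypothetical `identify` callback returning "different" or a coincidence class (per one of identification techniques 1 to 3); representatives, and therefore chapter times, are recorded whenever the reference is updated:

```python
# Sketch of chapter generation: sample frames at the predetermined interval,
# compare each sample with the current reference, and record a chapter time
# (storage time of a representative image) at every reference update.

def chapter_times(samples, identify):
    # samples: (storage_time, image) pairs taken at the predetermined
    # interval; identify(ref, img) compares two images.
    ref = None
    times = []
    for t, img in samples:
        if ref is None or identify(ref, img) == "different":
            ref = img            # reference is updated; a chapter starts here
            times.append(t)
    return times
```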

Claims

1. An apparatus (20) for reproducing a moving image comprising: central processing means (21); recording medium control means (23) that controls a recording medium (22); input means control means (25) that controls input means (24); and display means control means (27) that controls display means (26), wherein the central processing means (21) inputs a moving image data portion, which corresponds to a first reproduction time, of moving image data stored in the recording medium (22) from the recording medium control means (23), decodes the moving image data portion, and extracts image data corresponding to the first reproduction time as reference image data, the central processing means (21) outputs control data for displaying the reference image data on the display means (26) as first representative image data to the display means control means (27), the central processing means (21) inputs a moving image data portion, which corresponds to a second reproduction time which is different from the first reproduction time, of the moving image data stored in the recording medium (22) from the recording medium control means (23), decodes the moving image data portion, and extracts image data corresponding to the second reproduction time as identification object image data, the central processing means (21) determines whether the reference image data and the identification object image data are different and, when the reference image data and the identification object image data are different, outputs control data for displaying the identification object image data on the display means (26) as second representative image data to the display means control means (27), and the central processing means (21) inputs operation data, which indicates that one representative image data of the first or the second representative image data displayed on the display means (26) is selected by the input means (24), from the input means control means (25), reproduces moving image data from a
reproduction time corresponding to the selected representative image data, and outputs control data for displaying a moving image under reproduction on the display means (26) to the display means control means (27).
2. The apparatus (20) according to claim 1, wherein when the reference image data and the identification object image data are not different, the central processing means (21) sets the identification object image data as previous identification object image data and determines whether the reference image data and the previous identification object image data are similar, the central processing means (21) inputs a moving image data portion, which corresponds to a third reproduction time which is different from the first and the second reproduction times, of the moving image data stored in the recording medium (22) from the recording medium control means (23), decodes the moving image data portion, and extracts image data corresponding to the third reproduction time as present identification object image data, when the reference image data and the present identification object image data are not different, the central processing means (21) determines whether the reference image data and the present identification object image data are similar, when the reference image data and the previous identification object image data are not similar and the reference image data and the present identification object image data are similar, the central processing means (21) sets the present identification object image data as new reference image data, and when the reference image data and the previous identification object image data are similar and the reference image data and the present identification object image data are not similar, the central processing means (21) sets the present identification object image data as new reference image data.
3. The apparatus (20) according to claim 2, wherein the central processing means (21) calculates a fitness value indicating a degree of coincidence of the reference image data and the identification object image data, when the calculated fitness value is smaller than a first fitness value, determines that the reference image data and the identification object image data are different, and when the calculated fitness value is smaller than a second fitness value which is larger than the first fitness value, determines that the reference image data and the identification object image data are similar.
4. The apparatus (20) according to claim 1, wherein when the reference image data and the identification object image data are not different, the central processing means (21) sets the identification object image data as previous identification object image data and generates previous class data that indicates that the reference image data and the previous identification object image data are identical, similar, or partially similar, the central processing means (21) inputs a moving image data portion, which corresponds to a third reproduction time which is different from the first and the second reproduction times, of moving image data stored in the recording medium (22) from the recording medium control means (23), decodes the moving image data portion, and extracts image data corresponding to the third reproduction time as present identification object image data, when the reference image data and the present identification object image data are not different, the central processing means (21) generates present class data that indicates that the reference image data and the present identification object image data are identical, similar, or partially similar, and when the previous class data and the present class data are different, the central processing means (21) sets the present identification object image data as new reference image data.
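Claim 4 replaces the reference image whenever the generated class data ("identical", "similar", or "partially similar") changes between the previous and the present identification object frames. A sketch of that update rule, where `classify` is an assumed caller-supplied function returning one of the four class labels (the claims do not name such a function):

```python
def update_reference(reference, previous_class, present_frame, classify):
    """Claim-4 style reference update (sketch).

    `classify(reference, frame)` is assumed to return one of
    'different', 'partially similar', 'similar', or 'identical'.
    Returns (new_reference, new_previous_class)."""
    present_class = classify(reference, present_frame)
    if present_class == "different":
        # The 'different' case is handled elsewhere in the claims
        # (a new representative image); reset the class history here.
        return present_frame, None
    if previous_class is not None and present_class != previous_class:
        # Class data changed between consecutive frames:
        # adopt the present frame as the new reference image data.
        return present_frame, present_class
    return reference, present_class
```

Tracking the class transition rather than the raw fitness lets gradual drift (e.g. identical → similar → partially similar) refresh the reference before the comparison degenerates.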
5. The apparatus (20) according to claim 4, wherein the central processing means (21) calculates a fitness value indicating a degree of coincidence of the reference image data and the identification object image data, when the calculated fitness value is smaller than a first fitness value, determines that the reference image data and the identification object image data are different, when the calculated fitness value is smaller than a second fitness value which is larger than the first fitness value, determines that the reference image data and the identification object image data are partially similar, when the calculated fitness value is smaller than a third fitness value which is larger than the second fitness value, determines that the reference image data and the identification object image data are similar, and when the calculated fitness value is larger than the third fitness value, determines that the reference image data and the identification object image data are identical.
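Claims 3 and 5 define the classification purely by comparing a single fitness value against ordered thresholds (first < second < third). A sketch with assumed threshold values and an assumed mean-absolute-difference metric — the claims fix neither, only the ordering:

```python
# Assumed threshold values; the claims require only
# first < second < third on a common fitness scale.
FIRST_FITNESS, SECOND_FITNESS, THIRD_FITNESS = 0.4, 0.7, 0.9

def frame_fitness(a, b):
    """Degree of coincidence in [0, 1] (1.0 = identical); mean
    absolute pixel difference is an assumed stand-in metric."""
    diff = sum(abs(x - y) for x, y in zip(a, b))
    return 1.0 - diff / (255.0 * len(a))

def classify(reference, candidate):
    """Map one fitness value to the four classes of claim 5.

    Checks run lowest threshold first, so each frame falls into
    exactly one class."""
    f = frame_fitness(reference, candidate)
    if f < FIRST_FITNESS:
        return "different"
    if f < SECOND_FITNESS:
        return "partially similar"
    if f < THIRD_FITNESS:
        return "similar"
    return "identical"
```

Claim 3 is the two-threshold special case of the same scheme: below the first threshold is "different", below the second is "similar".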
6. An apparatus (20) for reproducing a moving image comprising: central processing means (21); recording medium control means (23) that controls a recording medium (22); input means control means (25) that controls input means (24); and display means control means (27) that controls display means (26), wherein the central processing means (21) sets reference image data, the central processing means (21) inputs a moving image data portion, which corresponds to a first reproduction time, of moving image data stored in the recording medium (22) from the recording medium control means (23), decodes the moving image data portion, and extracts image data corresponding to the first reproduction time as identification object image data, the central processing means (21) determines whether the reference image data and the identification object image data are different and, when the reference image data and the identification object image data are not different, outputs control data for displaying the identification object image data on the display means (26) to the display means control means (27) as first representative image data, and the central processing means (21) inputs operation data, which indicates that the first representative image data displayed on the display means (26) is selected by the input means (24), from the input means control means (25), reproduces moving image data from a reproduction time corresponding to the selected representative image data, and outputs control data for displaying a moving image under reproduction on the display means (26) to the display means control means (27).
7. The apparatus (20) according to claim 6, wherein when the reference image data and the identification object image data are not different, the central processing means (21) sets the identification object image data as previous identification object image data and determines whether the reference image data and the previous identification object image data are similar, the central processing means (21) inputs a moving image data portion, which corresponds to a second reproduction time which is different from the first reproduction time, of the moving image data stored in the recording medium (22) from the recording medium control means (23), decodes the moving image data portion, and extracts image data corresponding to the second reproduction time as present identification object image data, when the reference image data and the present identification object image data are not different, the central processing means (21) determines whether the reference image data and the present identification object image data are similar, when the reference image data and the previous identification object image data are not similar and the reference image data and the present identification object image data are similar, the central processing means (21) outputs control data for displaying the present identification object image data on the display means (26) as second representative image data to the display means control means (27), when the reference image data and the previous identification object image data are similar and the reference image data and the present identification object image data are not similar, the central processing means (21) outputs control data for displaying the present identification object image data on the display means (26) as the second representative image data to the display means control means (27), and the central processing means (21) inputs operation data, which indicates that the second representative image data displayed on the 
display means (26) is selected by the input means (24), from the input means control means (25), reproduces moving image data from a reproduction time corresponding to the selected representative image data, and outputs control data for displaying a moving image under reproduction on the display means (26) to the display means control means (27).
8. The apparatus (20) according to claim 6, wherein when the reference image data and the identification object image data are not different, the central processing means (21) sets the identification object image data as previous identification object image data and generates previous class data that indicates that the reference image data and the previous identification object image data are identical, similar, or partially similar, the central processing means (21) inputs a moving image data portion, which corresponds to a second reproduction time which is different from the first reproduction time, of the moving image data stored in the recording medium (22) from the recording medium control means (23), decodes the moving image data portion, and extracts image data corresponding to the second reproduction time as present identification object image data, when the reference image data and the present identification object image data are not different, the central processing means (21) generates present class data that indicates that the reference image data and the present identification object image data are identical, similar, or partially similar, when the previous class data and the present class data are different, the central processing means (21) outputs control data for displaying the present identification object image data on the display means (26) as second representative image data to the display means control means (27), and the central processing means (21) inputs operation data, which indicates that the second representative image data displayed on the display means (26) is selected by the input means (24), from the input means control means (25), reproduces moving image data from a reproduction time corresponding to the selected representative image data, and outputs control data for displaying a moving image under reproduction on the display means (26) to the display means control means (27).
9. An apparatus (80) for storing a moving image comprising: central processing means (21); recording medium control means (23) that controls a recording medium (22); and video signal output means control means (82) that controls video signal output means (81), wherein the central processing means (21) inputs a video signal, which is outputted from the video signal output means (81), from the video signal output means control means (82), encodes the video signal into moving image data, and outputs control data for storing the moving image data in the recording medium (22) to the recording medium control means (23), the central processing means (21) extracts image data from a video signal corresponding to a first storage time as reference image data, the central processing means (21) sets the first storage time as a first reproduction time, the central processing means (21) extracts image data from a video signal, which corresponds to a second storage time which is different from the first storage time, as identification object image data, the central processing means (21) determines whether the reference image data and the identification object image data are different and, when the reference image data and the identification object image data are different, sets the second storage time as a second reproduction time, and the central processing means (21) outputs control data for storing the first and the second reproduction times in the recording medium (22) to the recording medium control means (23) as chapter data.
10. The apparatus (80) according to claim 9, wherein when the reference image data and the identification object image data are not different, the central processing means (21) sets the identification object image data as previous identification object image data and generates previous class data that indicates that the reference image data and the previous identification object image data are identical, similar, or partially similar, the central processing means (21) extracts image data from a video signal, which corresponds to a third storage time which is different from the first and the second storage times, as present identification object image data, when the reference image data and the present identification object image data are not different, the central processing means (21) generates present class data that indicates that the reference image data and the present identification object image data are identical, similar, or partially similar, and when the previous class data and the present class data are different, the central processing means (21) sets the present identification object image data as new reference image data.
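On the recording side (claims 9 and 10) the same comparison runs against frames as they are captured: the first storage time and every storage time at which a frame is found "different" from the reference become reproduction times stored as chapter data. A sketch, again with an assumed fitness metric and threshold, and with the reference update folded in:

```python
def build_chapter_data(timed_frames, first_fitness=0.5):
    """Claim-9 style chapter marking during recording (sketch).

    `timed_frames` is an iterable of (storage_time, frame) pairs in
    capture order, each frame a flat list of 8-bit grayscale values.
    Returns the list of reproduction times to store as chapter data.
    The fitness metric and the threshold value are assumptions."""
    def fitness(a, b):
        diff = sum(abs(x - y) for x, y in zip(a, b))
        return 1.0 - diff / (255.0 * len(a))

    chapters = []
    reference = None
    for t, frame in timed_frames:
        if reference is None:
            reference = frame
            chapters.append(t)   # first storage time -> first reproduction time
        elif fitness(reference, frame) < first_fitness:
            reference = frame    # adopt as new reference (assumed, per claim 10)
            chapters.append(t)   # scene changed -> new chapter point
    return chapters
```

Because the comparison runs at storage time, the chapter data is ready when recording stops, so a reproducing apparatus can present the representative images without rescanning the stream.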
11. A method of reproducing a moving image comprising: a step of decoding a moving image data portion, which corresponds to a first reproduction time, of moving image data stored in a recording medium (22) and extracting image data corresponding to the first reproduction time as reference image data; a step of displaying the reference image data on a display means (26) as first representative image data; a step of decoding a moving image data portion, which corresponds to a second reproduction time which is different from the first reproduction time, of the moving image data stored in the recording medium (22) and extracting image data corresponding to the second reproduction time as identification object image data; a step of determining whether the reference image data and the identification object image data are different and, when the reference image data and the identification object image data are different, displaying the identification object image data on the display means (26) as second representative image data; and a step of reproducing moving image data from a reproduction time corresponding to selected representative image data of the first or the second representative image data displayed on the display means (26).
12. A method of reproducing a moving image comprising: a step of setting reference image data; a step of decoding a moving image data portion, which corresponds to a first reproduction time, of moving image data stored in a recording medium (22) and extracting image data corresponding to the first reproduction time as identification object image data; a step of determining whether the reference image data and the identification object image data are different and, when the reference image data and the identification object image data are not different, displaying the identification object image data on a display means (26) as first representative image data; and a step of reproducing moving image data from a reproduction time corresponding to the selected first representative image data.
13. A method of storing a moving image data comprising: a step of encoding a video signal, which is outputted from a video signal output means (81), and storing moving image data in a recording medium (22); a step of extracting image data from a video signal corresponding to a first storage time as reference image data; a step of setting the first storage time as a first reproduction time; a step of extracting image data from a video signal, which corresponds to a second storage time which is different from the first storage time, as identification object image data; a step of determining whether the reference image data and the identification object image data are different and, when the reference image data and the identification object image data are different, setting the second storage time as a second reproduction time; and a step of storing the first and the second reproduction times in the recording medium (22) as chapter data.
14. A computer program product for reproducing a moving image, said computer program product comprising a program and a central processing means usable medium, wherein the program causes a central processing means (21) to control the execution of: a step of decoding a moving image data portion, which corresponds to a first reproduction time, of moving image data stored in a recording medium (22) and extracting image data corresponding to the first reproduction time as reference image data; a step of displaying the reference image data on a display means (26) as first representative image data; a step of decoding a moving image data portion, which corresponds to a second reproduction time which is different from the first reproduction time, of the moving image data stored in the recording medium (22) and extracting image data corresponding to the second reproduction time as identification object image data; a step of determining whether the reference image data and the identification object image data are different and, when the reference image data and the identification object image data are different, displaying the identification object image data on the display means (26) as second representative image data; and a step of reproducing moving image data from a reproduction time corresponding to selected representative image data of the first or the second representative image data displayed on the display means (26).
15. A computer program product for reproducing a moving image, said computer program product comprising a program and a central processing means usable medium, wherein the program causes a central processing means (21) to control the execution of: a step of setting reference image data; a step of decoding a moving image data portion, which corresponds to a first reproduction time, of moving image data stored in a recording medium (22) and extracting image data corresponding to the first reproduction time as identification object image data; a step of determining whether the reference image data and the identification object image data are different and, when the reference image data and the identification object image data are not different, displaying the identification object image data on the display means (26) as first representative image data; and a step of reproducing moving image data from a reproduction time corresponding to the selected first representative image data.
16. A computer program product for storing a moving image, said computer program product comprising a program and a central processing means usable medium, wherein the program causes a central processing means (21) to control the execution of: a step of encoding a video signal, which is outputted from a video signal output means (81), and storing moving image data in a recording medium (22); a step of extracting image data from a video signal corresponding to a first storage time as reference image data; a step of setting the first storage time as a first reproduction time; a step of extracting image data from a video signal, which corresponds to a second storage time which is different from the first storage time, as identification object image data; a step of determining whether the reference image data and the identification object image data are different and, when the reference image data and the identification object image data are different, setting the second storage time as a second reproduction time; and a step of storing the first and the second reproduction times in the recording medium (22) as chapter data.
PCT/JP2005/003551 2004-10-28 2005-02-24 Apparatus, method, and program product for reproducing or recording moving image WO2006046321A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2004-314305 2004-10-28
JP2004314305A JP2006129050A (en) 2004-10-28 2004-10-28 Device, method, and program for regenerating or recording animation

Publications (1)

Publication Number Publication Date
WO2006046321A1 true WO2006046321A1 (en) 2006-05-04

Family

ID=36227572

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2005/003551 WO2006046321A1 (en) 2004-10-28 2005-02-24 Apparatus, method, and program product for reproducing or recording moving image

Country Status (2)

Country Link
JP (1) JP2006129050A (en)
WO (1) WO2006046321A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008185526A (en) * 2007-01-31 2008-08-14 Nec Corp Color discrimination device and method
JP5287093B2 (en) * 2008-09-29 2013-09-11 株式会社三洋物産 Game machine
JP2018114398A (en) * 2018-05-07 2018-07-26 株式会社三洋物産 Game machine

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0963176A (en) * 1995-08-28 1997-03-07 Sony Corp Video cd reproducing device and method thereof
JP2003037796A (en) * 2001-07-24 2003-02-07 Hitachi Ltd Information recording and reproducing device
JP2004297399A (en) * 2003-03-26 2004-10-21 Sharp Corp Portable equipment
JP2004297658A (en) * 2003-03-28 2004-10-21 Fuji Photo Film Co Ltd Image processing program and apparatus


Also Published As

Publication number Publication date
JP2006129050A (en) 2006-05-18

Similar Documents

Publication Publication Date Title
US8285114B2 (en) Electronic apparatus and display method
US8184947B2 (en) Electronic apparatus, content categorizing method, and program therefor
US8666223B2 (en) Electronic apparatus and image data management method
US8326623B2 (en) Electronic apparatus and display process method
US20080212953A1 (en) Optical Disc Recording/Reproducing Apparatus
KR101318459B1 (en) Method of viewing audiovisual documents on a receiver, and receiver for viewing such documents
EP3185539B1 (en) Information processing apparatus, imaging apparatus, image display control method and computer program
US20120182479A1 (en) Electronic apparatus and face image display control method of the electronic apparatus
US8201105B2 (en) Electronic apparatus and image display control method of the electronic apparatus
US9071806B2 (en) Reproducing apparatus
US20080267576A1 (en) Method of displaying moving image and image playback apparatus to display the same
US20070200936A1 (en) Apparatus, method, and program for controlling moving images
JP4856105B2 (en) Electronic device and display processing method
US20070168867A1 (en) Video reproduction device
JP2000350156A (en) Method for storing moving picture information and recording medium recording the information
US7035483B2 (en) Image search apparatus
KR100603158B1 (en) Video signal recording apparatus and method, video signal reproduction apparatus and method, video signal recording and reproduction apparatus and method, and recording medium
EP1544861A1 (en) Apparatus, method and program for reproducing information, and information recording medium
JP4667356B2 (en) Video display device, control method therefor, program, and recording medium
WO2006046321A1 (en) Apparatus, method, and program product for reproducing or recording moving image
US20020006268A1 (en) Video-signal recording &amp; playback apparatus, video-signal recording &amp; playback method and recording medium
JP2002354406A (en) Dynamic image reproducing equipment
JP2006253984A (en) Image display control apparatus and program
JP2006101076A (en) Method and device for moving picture editing and program
JP2007515864A (en) Video image processing method

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BW BY BZ CA CH CN CO CR CU CZ DK DM DZ EC EE EG ES FI GB GD GE GM HR HU ID IL IN IS KE KG KP KR LC LK LR LS LT LU LV MA MD MG MN MW MX MZ NA NI NO NZ OM PG PL PT RO RU SC SD SE SG SK SL SM TJ TM TN TR TT TZ UA UG US UZ VC YU ZA ZM

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ NA SD SZ TZ UG ZM ZW AM AZ BY KG MD RU TJ TM AT BE BG CH CY DE DK EE ES FI FR GB GR HU IE IS LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 69(1) EPC, EPO FORM 1205A SENT ON 12/09/07.

122 Ep: pct application non-entry in european phase

Ref document number: 05710775

Country of ref document: EP

Kind code of ref document: A1