US9300904B2 - Image processing apparatus, image processing system, and image reading apparatus - Google Patents

Image processing apparatus, image processing system, and image reading apparatus

Info

Publication number
US9300904B2
US9300904B2 (application US14/213,714, US201414213714A)
Authority
US
United States
Prior art keywords
data
necessary
moving image
unnecessary
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related, expires
Application number
US14/213,714
Other languages
English (en)
Other versions
US20140218565A1 (en)
Inventor
Takashi Miyoshi
Akio Kosaka
Hidekazu Iwaki
Arata Shinozaki
Mitsunori Kubo
Takayuki Nakatomi
Nobuyuki Watanabe
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Corp
Original Assignee
Olympus Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2011207819A (JP5809906B2)
Priority claimed from JP2011207818A (JP5809905B2)
Priority claimed from JP2012002581A (JP6027745B2)
Application filed by Olympus Corp filed Critical Olympus Corp
Assigned to OLYMPUS CORPORATION. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KUBO, MITSUNORI; IWAKI, HIDEKAZU; MIYOSHI, TAKASHI; NAKATOMI, TAKAYUKI; SHINOZAKI, ARATA; WATANABE, NOBUYUKI; KOSAKA, AKIO
Assigned to OLYMPUS CORPORATION. CORRECTIVE ASSIGNMENT TO CORRECT THE EXECUTION DATE FOR THE THIRD ASSIGNOR PREVIOUSLY RECORDED AT REEL: 032730 FRAME: 0464. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT. Assignors: IWAKI, HIDEKAZU; KUBO, MITSUNORI; MIYOSHI, TAKASHI; NAKATOMI, TAKAYUKI; SHINOZAKI, ARATA; WATANABE, NOBUYUKI; KOSAKA, AKIO
Publication of US20140218565A1
Application granted
Publication of US9300904B2
Assigned to OLYMPUS CORPORATION. CHANGE OF ADDRESS. Assignors: OLYMPUS CORPORATION



Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/907Television signal recording using static stores, e.g. storage tubes or semiconductor memories
    • GPHYSICS
    • G11INFORMATION STORAGE
    • G11BINFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/132Sampling, masking or truncation of coding units, e.g. adaptive resampling, frame skipping, frame interpolation or high-frequency transform coefficient masking
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/136Incoming video signal characteristics or properties
    • H04N19/137Motion inside a coding unit, e.g. average field, frame or block difference
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/167Position within a video image, e.g. region of interest [ROI]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/17Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
    • H04N19/172Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a picture, frame or field
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/46Embedding additional information in the video signal during the compression process
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/433Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • H04N21/4334Recording operations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/433Content storage operation, e.g. storage operation in response to a pause request, caching operations
    • H04N21/4335Housekeeping operations, e.g. prioritizing content for deletion because of storage space restrictions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/91Television signal processing therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/79Processing of colour television signals in connection with recording
    • H04N9/80Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/804Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components
    • H04N9/8042Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components involving data reduction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/79Processing of colour television signals in connection with recording
    • H04N9/80Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/82Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
    • H04N9/8205Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/765Interface circuits between an apparatus for recording and another apparatus
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/765Interface circuits between an apparatus for recording and another apparatus
    • H04N5/77Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
    • H04N5/772Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera the recording apparatus and the television camera being placed in the same enclosure

Definitions

  • the present invention relates to a technique to automatically determine whether each moving image frame of moving images is necessary or unnecessary.
  • Patent document 1 Japanese Laid-open Patent Publication No. 2008-182544
  • Patent document 2 Japanese Laid-open Patent Publication No. 2005-295285
  • Patent document 3 Japanese Laid-open Patent Publication No. 2000-209483
  • An image processing apparatus of the present invention includes a necessary/unnecessary determination data generating unit configured to generate necessary/unnecessary determination data corresponding respectively to each moving image frame of an input moving image, used to determine whether or not the moving image frame is an unnecessary moving image frame; an image encoding unit configured to encode the input moving image, and also to divide each moving image frame of the moving image after encoding respectively into a plurality of pieces of data, and assign identification information respectively to, and record, in a recording unit, those pieces of data; a necessary/unnecessary determining unit configured to determine whether or not the moving image frame corresponding to the necessary/unnecessary determination data is an unnecessary moving image frame, based on the necessary/unnecessary determination data; and a moving image file managing unit configured to rewrite, in a table in the recording unit in which identification information recorded in the recording unit and information indicating a state of data at a recording position in the recording unit corresponding to the identification information are associated, information indicating a state of data corresponding to the identification information of a moving image frame determined as an unnecessary moving image frame into information indicating that there is no data.
  • the image reading apparatus of the present invention includes a necessary/unnecessary determining unit configured to read out the meta data from a recording unit in which are recorded: necessary/unnecessary determination data corresponding respectively to each moving image frame of an input moving image, used to determine whether or not the moving image frame is a necessary moving image frame; each moving image frame of the moving image after encoding, to which identification information is assigned; and meta data in which identification information assigned to the moving image frame and necessary/unnecessary determination data corresponding to the moving image frame are integrated, and, based on the necessary/unnecessary determination data in the meta data, to determine whether or not a moving image frame corresponding to identification information in the meta data is a necessary moving image frame; and a reading-out control unit configured to, using a table in which identification information recorded in the recording unit and information indicating a recording position in the recording unit corresponding to the identification information are associated, read out only a moving image frame recorded at a recording position in the recording unit corresponding to identification information of a moving image frame determined as a necessary moving image frame.
  • an image processing system of the present invention is an image processing system including an image processing apparatus configured to record a moving image in a recording unit and an image reading apparatus configured to read out a moving image from the recording unit, where the image processing apparatus includes a necessary/unnecessary determination data generating unit configured to generate necessary/unnecessary determination data corresponding respectively to each moving image frame of an input moving image, used to determine whether or not the moving image frame is a necessary moving image frame; an image encoding unit configured to encode the input moving image, and assign identification information respectively to, and record, in a recording unit, each moving image frame of the moving image after encoding; and a meta data generating unit configured to generate, and record, in the recording unit, meta data by integrating identification information assigned to the moving image frame and necessary/unnecessary determination data corresponding to the moving image frame; and the image reading apparatus includes a necessary/unnecessary determining unit configured to read out meta data from the recording unit, and based on necessary/unnecessary determination data in the meta data, to determine whether or not a moving image frame corresponding to identification information in the meta data is a necessary moving image frame.
  • an image processing apparatus of the present invention includes a feature data obtaining unit configured to obtain feature data corresponding to image data; a reference feature data obtaining unit configured to obtain reference feature data; a feature data evaluating unit configured to perform evaluation of the feature data based on the reference feature data; a necessary/unnecessary determining unit configured to perform necessary/unnecessary determination of image data corresponding to the feature data based on the evaluation performed by the feature data evaluating unit; and a control processing unit configured to perform control based on the necessary/unnecessary determination.
  • a computer-readable recording medium recording a program of the present invention makes a computer function as: a feature data obtaining unit configured to obtain feature data corresponding to image data; a reference feature data obtaining unit configured to obtain reference feature data; a feature data evaluating unit configured to perform evaluation of the feature data based on the reference feature data obtained by the reference feature data obtaining unit; a necessary/unnecessary determining unit configured to perform necessary/unnecessary determination of image data based on the evaluation performed by the feature data evaluating unit; and a control processing unit configured to perform control based on the necessary/unnecessary determination.
  • an image processing method of the present invention includes obtaining, by a feature data obtaining unit, feature data corresponding to image data; performing, by a feature data evaluating unit, evaluation of the feature data based on reference feature data obtained by a reference feature data obtaining unit; performing, by a necessary/unnecessary determining unit, necessary/unnecessary determination of image data corresponding to the feature data based on the evaluation; and performing, by a control processing unit, control based on the necessary/unnecessary determination.
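  • The claim structure above maps onto a small processing loop. The following is a minimal sketch of that pipeline and not the patent's implementation; every name (FeaturePipeline, get_feature, on_unnecessary) is hypothetical, and the scalar feature-versus-threshold comparison merely stands in for whatever evaluation the feature data evaluating unit performs.

```python
from typing import Callable, List, Sequence

class FeaturePipeline:
    def __init__(self, get_feature: Callable[[bytes], float],
                 reference: float, threshold: float,
                 on_unnecessary: Callable[[int], None]) -> None:
        self.get_feature = get_feature        # feature data obtaining unit
        self.reference = reference            # reference feature data
        self.threshold = threshold
        self.on_unnecessary = on_unnecessary  # control processing unit hook

    def process(self, frames: Sequence[bytes]) -> List[bool]:
        results = []
        for i, frame in enumerate(frames):
            feature = self.get_feature(frame)          # obtain feature data
            deviation = abs(feature - self.reference)  # evaluate against reference
            necessary = deviation <= self.threshold    # necessary/unnecessary determination
            results.append(necessary)
            if not necessary:
                self.on_unnecessary(i)  # control, e.g. mark the frame for deletion
        return results
```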
  • FIG. 1 is a diagram illustrating an image processing apparatus of an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating an example of meta data, a moving image frame, a sector in a recording unit, and a cluster in a recording unit.
  • FIG. 3A is a diagram illustrating an example of a directory.
  • FIG. 3B is a diagram illustrating an example of FAT.
  • FIG. 3C is a diagram illustrating an example of FAT after information indicating the state of data is rewritten.
  • FIG. 4 is a diagram illustrating an example of a moving image frame, a unit of compression, and a cluster in a recording unit at the time of encoding in the MPEG format.
  • FIG. 5A is a diagram illustrating an image capturing apparatus of embodiment 1.
  • FIG. 5B is a diagram illustrating a variation example (part 1) of an image capturing apparatus of embodiment 1.
  • FIG. 5C is a diagram illustrating a variation example (part 2) of an image capturing apparatus of embodiment 1.
  • FIGS. 6A and 6B are a flowchart illustrating operations of a necessary/unnecessary determining unit of embodiment 1.
  • FIG. 7 is a diagram illustrating an example of a moving image frame determined as necessary and a moving image frame determined as unnecessary in embodiment 1.
  • FIG. 8 is a diagram illustrating another example of a moving image frame determined as necessary and a moving image frame determined as unnecessary in embodiment 1.
  • FIG. 9 is a diagram illustrating an image capturing apparatus of embodiment 2.
  • FIG. 10 is a flowchart illustrating operations of a necessary/unnecessary determining unit of embodiment 2.
  • FIG. 11 is a diagram illustrating an image capturing apparatus of embodiment 3.
  • FIG. 12 is a flowchart illustrating operations of a necessary/unnecessary determining unit of embodiment 3.
  • FIG. 13 is a diagram illustrating an image capturing apparatus of embodiment 4.
  • FIG. 14 is a flowchart illustrating operations of a necessary/unnecessary determining unit of embodiment 4.
  • FIG. 15 is a diagram illustrating another example of a moving image frame determined as necessary and a moving image frame determined as unnecessary in embodiment 4.
  • FIG. 16 is a diagram illustrating an image capturing apparatus of embodiment 5.
  • FIG. 17 is a diagram illustrating an image processing system of an embodiment of the present invention.
  • FIG. 18 is a diagram illustrating an image processing system of another embodiment of the present invention.
  • FIG. 19 is a diagram illustrating an image processing system of another embodiment of the present invention.
  • FIG. 20 is a diagram illustrating an image processing system of another embodiment of the present invention.
  • FIG. 21 is a diagram illustrating an image processing system of another embodiment of the present invention.
  • FIG. 22 is a diagram illustrating an example of meta data, a moving image frame, a sector in a recording unit, and a cluster in a recording unit.
  • FIG. 23A is a diagram illustrating an example of a directory.
  • FIG. 23B is a diagram illustrating an example of FAT.
  • FIG. 24 is a flowchart illustrating an example of operations of an image reading apparatus.
  • FIG. 25 is a diagram illustrating an example of a moving image frame, a unit of compression, and a cluster in a recording unit at the time of encoding in the MPEG format.
  • FIG. 26A is a diagram illustrating an image processing system including an image capturing apparatus as embodiment 1 of the image processing apparatus illustrated in FIG. 21 .
  • FIG. 26B is a diagram illustrating a variation example (part 1) of an image processing system including an image capturing apparatus as embodiment 1 of the image processing apparatus illustrated in FIG. 21 .
  • FIG. 26C is a diagram illustrating a variation example (part 2) of an image processing system including an image capturing apparatus as embodiment 1 of the image processing apparatus illustrated in FIG. 21 .
  • FIG. 27 is a flowchart illustrating operation of a necessary/unnecessary determining unit in an image processing system including an image capturing apparatus as embodiment 2 of the image processing apparatus illustrated in FIG. 21 .
  • FIG. 28 is a diagram illustrating an example of a moving image frame determined as necessary and a moving image frame determined as unnecessary in an image processing system including an image capturing apparatus as embodiment 2 of the image processing apparatus illustrated in FIG. 21 .
  • FIG. 29 is a diagram illustrating another example of a moving image frame determined as necessary and a moving image frame determined as unnecessary in an image processing system including an image capturing apparatus as embodiment 2 of the image processing apparatus illustrated in FIG. 21 .
  • FIG. 30 is a diagram illustrating an image processing system including an image capturing apparatus as embodiment 3 of the image processing apparatus illustrated in FIG. 21 .
  • FIG. 31 is a flowchart illustrating operation of a necessary/unnecessary determining unit in an image processing system including an image capturing apparatus as embodiment 3 of the image processing apparatus illustrated in FIG. 21 .
  • FIG. 32 is a diagram illustrating an image processing system including an image capturing apparatus as embodiment 4 of the image processing apparatus illustrated in FIG. 21 .
  • FIG. 33 is a flowchart illustrating operation of a necessary/unnecessary determining unit in an image processing system including an image capturing apparatus as embodiment 4 of the image processing apparatus illustrated in FIG. 21 .
  • FIG. 34 is a diagram illustrating an image processing system including an image capturing apparatus as embodiment 5 of the image processing apparatus illustrated in FIG. 21 .
  • FIG. 35 is a flowchart illustrating operation of a necessary/unnecessary determining unit in an image processing system including an image capturing apparatus as embodiment 5 of the image processing apparatus illustrated in FIG. 21 .
  • FIG. 36 is a diagram illustrating another example of a moving image frame determined as necessary and a moving image frame determined as unnecessary in an image processing system including an image capturing apparatus as embodiment 6 of the image processing apparatus illustrated in FIG. 21 .
  • FIG. 37 is a diagram illustrating an image processing apparatus of another embodiment of the present invention.
  • FIG. 38 is a diagram illustrating an image processing apparatus of another embodiment of the present invention.
  • FIG. 39 is a flowchart illustrating an example of operations of a control processing unit illustrated in FIG. 38 .
  • FIG. 40 is a diagram illustrating an example of data obtained by a reference feature data obtaining unit.
  • FIG. 41A is a diagram illustrating an example of peripheral apparatuses of an image processing apparatus.
  • FIG. 41B is a diagram illustrating an example of peripheral apparatuses of an image processing apparatus.
  • FIG. 41C is a diagram illustrating an example of peripheral apparatuses of an image processing apparatus.
  • FIG. 42 is a diagram illustrating another image processing apparatus of the present invention.
  • FIG. 43 is a flowchart illustrating an example of operations of a reference feature data generating unit illustrated in FIG. 42 .
  • FIG. 44 is a diagram illustrating an example of data obtained by a reference feature data obtaining unit.
  • FIG. 45 is a flowchart illustrating an example of operations of a control processing unit illustrated in FIG. 42 .
  • FIG. 46 is a diagram illustrating an image processing apparatus of another embodiment of the present invention.
  • FIG. 47 is a flowchart illustrating an example of operations of a reference feature data generating unit illustrated in FIG. 46 .
  • FIG. 48 is a diagram illustrating an example of data obtained by a reference feature data obtaining unit.
  • FIG. 49 is a diagram illustrating an image processing apparatus of another embodiment of the present invention.
  • FIG. 50 is a flowchart illustrating an example of operations of a control processing unit illustrated in FIG. 49 .
  • FIG. 51 is a diagram illustrating an example of an image processing apparatus of another embodiment of the present invention.
  • FIG. 52 is a flowchart illustrating an example of operations of a reference feature data generating unit illustrated in FIG. 51 .
  • FIG. 53A is a diagram illustrating an example of data obtained by a reference feature data obtaining unit.
  • FIG. 53B is a diagram illustrating an example of data obtained by a reference feature data obtaining unit.
  • FIG. 54 is a diagram illustrating an image processing apparatus of another embodiment of the present invention.
  • FIG. 55 is a diagram illustrating an example of image data for which necessary/unnecessary determination has been made.
  • FIG. 56 is a diagram illustrating an example of image data for which necessary/unnecessary determination has been made.
  • FIG. 57 is a diagram illustrating an image processing apparatus of another embodiment of the present invention.
  • FIG. 1 is a diagram illustrating an image processing apparatus of an embodiment of the present invention.
  • the image processing apparatus 100 of FIG. 1 includes a necessary/unnecessary determination data generating unit 103, an image encoding unit 104, a meta data generating unit 105, a recording unit 106, a necessary/unnecessary determining unit 107, a moving image file managing unit 108, and a control processing unit 109. Meanwhile, the control processing unit 109 controls the operations of each unit in the image processing apparatus 100.
  • the necessary/unnecessary determination data generating unit 103 generates, for each moving image frame of a moving image input from outside (for example, from an image capturing apparatus, an external recording unit, or the like) to the image processing apparatus 100, necessary/unnecessary determination data used to determine whether or not that moving image frame is an unnecessary moving image frame.
  • the image encoding unit 104 encodes a moving image input from outside to the image processing apparatus 100, divides each moving image frame of the moving image after encoding into a plurality of pieces of data, assigns identification information to each piece of data, and records those pieces in the recording unit 106.
  • the image encoding unit 104 performs encoding of input moving images, as illustrated in FIG. 2 , in the MPEG2 (Moving Picture Experts Group phase 2) format, and while dividing each moving image frame of the moving images after encoding respectively into a plurality of TS (Transport Stream) packets, assigns “identification information PID (Packet Identification)” (K, K+1, . . . ) respectively to the TS packets, and records them in the recording unit 106 .
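  • As a rough illustration of the division described above, the sketch below splits one encoded moving image frame into fixed-size TS-like packets and tags each with identification information (PID). The 188-byte packet size and the 13-bit PID packed into the header follow the MPEG-2 TS layout; everything else (function name, padding choice, omitted header fields such as the continuity counter) is a simplifying assumption, not the patent's encoder.

```python
TS_PACKET_SIZE = 188              # fixed MPEG-2 TS packet size
HEADER_SIZE = 4
PAYLOAD_SIZE = TS_PACKET_SIZE - HEADER_SIZE

def packetize(frame_data: bytes, pid: int) -> list:
    """Split one encoded frame into TS-like packets carrying `pid`."""
    packets = []
    for off in range(0, len(frame_data), PAYLOAD_SIZE):
        payload = frame_data[off:off + PAYLOAD_SIZE].ljust(PAYLOAD_SIZE, b'\xff')
        # sync byte 0x47, 13-bit PID in bytes 1-2, payload-only flag in byte 3
        header = bytes([0x47, (pid >> 8) & 0x1F, pid & 0xFF, 0x10])
        packets.append(header + payload)
    return packets
```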
  • the recording unit 106 illustrated in FIG. 1 may be a recording unit built into the image processing apparatus 100 (for example, a hard disc or RAM (Random Access Memory)), or an external recording unit that may be inserted into/removed from the image processing apparatus 100 (for example, a medium such as a CD-R (Compact Disc Recordable), a DVD-R (Digital Versatile Disc Recordable), a memory card, or a USB memory), and is constituted by, for example, a non-volatile memory or a flash memory.
  • the recording unit 106 includes an identification information/recording position table that indicates correspondence between identification information assigned respectively to each piece of data in the image encoding unit 104 and information indicating the recording position in the recording unit 106 .
  • the recording unit 106 includes an identification information/recording position table that indicates correspondence between the identification information PID and the cluster number indicating the position of the cluster in the recording unit 106 .
  • the recording unit 106 includes a recording position/data state information table that indicates correspondence between information indicating the recording position in the recording unit 106 and information indicating the state of the data at each recording position (for example, when data continues to another recording position, information indicating that recording position; information indicating that data ends at the recording position; or information indicating that there is no data at the recording position).
  • For example, as illustrated in FIG. 3B, the recording unit 106 includes a FAT (File Allocation Table) as a recording position/data state information table that indicates correspondence between the cluster number indicating the position of the cluster in the recording unit 106 and information indicating the state of data at each cluster number (for example, when data continues to another cluster, the cluster number indicating the position of that cluster; information "FFFF" indicating that data ends at the cluster; or information "0000" indicating that there is no data in the cluster).
  • the identification information PID integrated with the necessary/unnecessary determination data is not limited to the identification information PID of the top TS packet among the respective TS packets after the division of the moving image frame after encoding.
  • the necessary/unnecessary determining unit 107 determines whether or not the moving image frame corresponding to the identification information in the meta data is an unnecessary moving image frame.
  • the moving image file managing unit 108 identifies the recording position in the recording unit 106 corresponding to the identification information of the moving image frame determined by the necessary/unnecessary determining unit 107 as an unnecessary moving image frame, using the identification information/recording position table in the recording unit 106 .
  • the moving image file managing unit 108 identifies information indicating the state of data corresponding to the identified recording position, using the recording position/data state information table in the recording unit 106, and rewrites the identified information indicating the state of data into information indicating that there is no data.
  • the moving image file managing unit 108 rewrites the information "0004" indicating the state of data at the identified cluster number "0003" into "0000", using the FAT illustrated in FIG. 3B, and also, in a case in which the moving image frame f2 is also recorded in another cluster, rewrites the information "FFFF" indicating the state of data at the cluster number "0004" indicating the position of that cluster into "0000".
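  • The two tables and the rewrite just described can be modeled compactly. The sketch below is an assumed, simplified model (Python dictionaries standing in for the on-media tables; the class and method names are hypothetical): mark_frame_deleted() walks the cluster chain of one frame and rewrites every FAT entry to "no data" (0x0000), and the recovery copy models the table for recovery described further below.

```python
END_OF_CHAIN = 0xFFFF   # "FFFF": data ends at this cluster
NO_DATA = 0x0000        # "0000": there is no data in this cluster

class Recorder:
    def __init__(self):
        self.pid_to_cluster = {}  # identification information/recording position table
        self.fat = {}             # recording position/data state information table
        self.recovery_fat = {}    # copy of rewritten entries, kept for recovery

    def mark_frame_deleted(self, pid):
        """Rewrite the FAT entries of the frame recorded for `pid` to 'no data'."""
        cluster = self.pid_to_cluster[pid]
        while cluster not in (END_OF_CHAIN, NO_DATA):
            nxt = self.fat[cluster]
            self.recovery_fat.setdefault(cluster, nxt)  # remember original entry
            self.fat[cluster] = NO_DATA                 # e.g. "0004" -> "0000"
            cluster = nxt

    def recover_frame(self, pid):
        """Put a deleted frame back using the recovery copy."""
        cluster = self.pid_to_cluster[pid]
        while cluster in self.recovery_fat:
            self.fat[cluster] = self.recovery_fat.pop(cluster)
            cluster = self.fat[cluster]
```

With pid_to_cluster = {7: 0x0003} and fat = {0x0003: 0x0004, 0x0004: END_OF_CHAIN}, calling mark_frame_deleted(7) leaves both entries at NO_DATA, matching the rewrite from FIG. 3B to FIG. 3C.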
  • the recording unit 106 may include a table in which all the pieces of identification information recorded in the recording unit 106 and information that indicates the state of data at the recording position in the recording unit 106 corresponding respectively to each of the pieces of identification information are associated.
  • the moving image file managing unit 108 may be configured to, using the table, rewrite information that indicates the state of data at the recording position in the recording unit 106 corresponding to the identification information of a moving image frame determined as an unnecessary moving image frame by the necessary/unnecessary determining unit 107 into information indicating that there is no data.
  • since the image processing apparatus 100 of an embodiment of the present invention rewrites information that indicates the state of data at the recording position in the recording unit 106 corresponding to a moving image frame determined as unnecessary among the respective moving image frames of a moving image into information indicating that there is no data in the stage before the editing work and playback work, when the moving image is read out from the recording unit 106 at the time of editing and playback, only the frames other than the unnecessary moving image frames are read out.
  • the unnecessary moving image frame may be removed from the moving image file in the stage before the editing work and the playback work.
  • the time taken to load the moving image file onto the editing program and the playback program may be shortened at the time of editing work and the playback work, and the decrease in the efficiency of the editing work and the playback work may be suppressed.
  • since the image processing apparatus 100 of the embodiment of the present invention only rewrites the information that indicates the state of data at the recording position in the recording unit 106 in order to read out frames other than the unnecessary moving image frames from the recording unit 106, the data processing amount may be reduced, and the load put on the image processing apparatus 100 may be suppressed.
  • since the image processing apparatus 100 of the embodiment of the present invention uses identification information assigned to each piece of data after the division of the moving image frame after encoding to identify the recording position in the recording unit 106 corresponding to the unnecessary moving image frame, the data processing amount may be reduced, and the load put on the image processing apparatus 100 may be suppressed.
  • the necessary/unnecessary determining unit 107 may be configured to send top information Ki indicating the top moving image frame in a plurality of successive unnecessary moving image frames, and last information Kj indicating the last moving image frame to the moving image file managing unit 108 . That is, a flag indicating the top of moving images determined as unnecessary, and a flag indicating the last, may be described on the identification information PID.
  • the moving image file managing unit 108 rewrites information that indicates the state of data at the recording positions in the recording unit 106 corresponding respectively to the top information Ki and the last information Kj into information indicating that there is no data, and also rewrites information that indicates the state of data at the recording positions in the recording unit 106 corresponding to the moving image frames between the respective moving image frames corresponding to the top information Ki and the last information Kj into information indicating that there is no data.
  • the moving image file managing unit 108 may also be configured to rewrite, when both the flag indicating the top of the unnecessary moving image frames and the flag indicating the last of the unnecessary moving image frames are present, information that indicates the state of data at the recording position in the recording unit 106 corresponding to the identification information of the moving image frame that exists between the top of the unnecessary moving image frames and the last of the unnecessary moving image frames into information indicating that there is no data.
  • the rewriting of data may be performed when the pair of the flag indicating the top of unnecessary determination and the flag indicating the last is received.
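  • A sketch of this flag-pair handling follows (names are illustrative; frame_pids is assumed to hold the frames' identification information in recording order, and free() would be the per-frame FAT rewrite sketched earlier). Rewriting happens only once both flags of a top/last pair have been received, as the text requires.

```python
def free_unnecessary_runs(frame_pids, events, free):
    """events: iterable of (pid, 'top'|'last'); free(pid) marks one frame as no data."""
    index = {pid: i for i, pid in enumerate(frame_pids)}
    top = None
    for pid, kind in events:
        if kind == 'top':
            top = pid                       # Ki: run of unnecessary frames begins
        elif kind == 'last' and top is not None:
            # both flags of the pair are present: free Ki..Kj inclusive
            for i in range(index[top], index[pid] + 1):
                free(frame_pids[i])
            top = None
```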
  • the moving image file managing unit 108 rewrites all the pieces of information that indicate the state of data at the recording positions in the recording unit 106 corresponding to the range from the GOP (Group Of Pictures) in which the top frame of the unnecessary moving images is included to the GOP in which the last frame of the unnecessary moving images is included into information indicating that there is no data. Meanwhile, in the MPEG format, the unit of compression is, for example, an I frame (Intra Picture), B1 frame-B3 frame (Bidirectionally Predictive Picture), or a P frame (Predictive Picture), and SH stands for Sequence Header (see FIG. 4).
  • the recording unit 106 may include a table for recovery with the same content as the FAT.
  • the moving image file managing unit 108 is able to restore information indicating the state of data at the recording position in the recording unit 106 corresponding to the identification information of the data being the recovery target to the original information, using the table for recovery. Accordingly, even after a determination as an unnecessary moving image frame is made and information that indicates the state of data at the recording position in the recording unit 106 corresponding to the identification information of the moving image frame is rewritten into information indicating that there is no data, the moving image frame may be put back into a state in which it is available to be read out.
  • FIG. 5A is a diagram illustrating an image capturing apparatus 500 as embodiment 1 of the image processing apparatus 100 illustrated in FIG. 1. Meanwhile, the same numerals are assigned to the same configurations as those illustrated in FIG. 1, and their explanation is omitted.
  • the image capturing apparatus 500 illustrated in FIG. 5A includes an image capturing unit 101, an image processing unit 102, a necessary/unnecessary determination data generating unit 103, an image encoding unit 104, a meta data generating unit 105, a recording unit 106, a necessary/unnecessary determining unit 107, a moving image file managing unit 108, and a control processing unit 109.
  • the image capturing unit 101 converts the subject image formed by the lens on an imaging device such as a CCD (Charge Coupled Device) into an electric signal, to obtain a still image or a moving image composed of a plurality of moving image frames.
  • the image processing unit 102 performs various image processing, such as sharpness correction, contrast correction, luminance/chrominance signal generation, and white balance correction, on the still image or each moving image frame of the moving image obtained by the image capturing unit 101.
  • the image encoding unit 104 performs encoding in the JPEG (Joint Photographic Experts Group) format to the still image frame after image processing. In addition, the image encoding unit 104 performs encoding in the MPEG format and the like to the moving image after image processing.
  • the image capturing apparatus 500 illustrated in FIG. 5A is characterized in that a subject detection processing unit 501 is included as the necessary/unnecessary determination data generating unit 103; the subject detection processing unit 501 detects, in each moving image frame of the moving image after image processing, the "position of the subject" and the "size of the subject" as the necessary/unnecessary determination data; and the meta data generating unit 105 integrates the detected "position of the subject" and "size of the subject" with the identification information PID and outputs them to the necessary/unnecessary determining unit 107 as meta data.
  • FIGS. 6A and 6B are a flowchart illustrating an example of operations of the necessary/unnecessary determining unit 107 in the image capturing apparatus 500 illustrated in FIG. 5A.
  • the necessary/unnecessary determining unit 107 inputs the “position of the subject” and the “size of the subject” (S 11 ).
  • the subject detection processing unit 501 regards a rectangular area enclosing the detected subject as the subject area, and regards the upper-left pixel position (x0, y0) of the subject area as the “position of the subject”.
  • as the "size of the subject", the subject detection processing unit 501 regards the number of pixels in the lateral direction of the subject area as W, and also regards the number of pixels in the vertical direction of the subject area as H.
  • when no subject is detected, the subject detection processing unit 501 sets x0 to a value smaller than 0.
  • the upper-left pixel position of the moving image frame is regarded as (1, 1).
  • the necessary/unnecessary determining unit 107 judges whether or not x0 is smaller than 0, that is, whether or not a subject has been detected (S 12 ).
  • the necessary/unnecessary determining unit 107 determines whether or not W is smaller than a threshold Thr_W, or whether or not H is smaller than a threshold Thr_H, that is, whether or not the subject is smaller than a prescribed size (S 13 ). Meanwhile, the threshold Thr_W and the threshold Thr_H may be set in advance by the user and the like.
  • the necessary/unnecessary determining unit 107 determines whether or not x0+W/2 is smaller than a threshold Thr_L_x, that is, whether or not the subject area deviates outwards from the left edge of the moving image frame (S 14 ). Meanwhile, the threshold Thr_L_x may be set in advance by the user and the like.
  • the necessary/unnecessary determining unit 107 determines whether or not x0+W/2 is larger than a threshold Thr_R_x, that is, whether the right edge of the subject area deviates outwards from the right edge of the moving image frame (S 15 ). Meanwhile, the threshold Thr_R_x may be set in advance by the user and the like.
  • the necessary/unnecessary determining unit 107 determines whether or not y0+H/2 is larger than a threshold Thr_U_y, that is, whether or not the upper edge of the subject area deviates outwards from the upper edge of the moving image frame (S 16). Meanwhile, the threshold Thr_U_y may be set in advance by the user and the like.
  • the necessary/unnecessary determining unit 107 determines whether or not y0+H/2 is smaller than the threshold Thr_D_y, that is, whether or not the bottom edge of the subject area deviates outwards from the bottom edge of the moving image frame (S 17 ). Meanwhile, the threshold Thr_D_y may be set in advance by the user and the like.
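  • Taken together, steps S 12 through S 17 amount to a single predicate over the subject position (x0, y0) and size (W, H). The sketch below transcribes the comparisons as written; the threshold values are placeholders (the text says they may be set in advance by the user), and the coordinate convention implied by the upper/bottom tests (y increasing upward) is an assumption.

```python
def frame_is_unnecessary(x0, y0, W, H,
                         Thr_W=40, Thr_H=40,          # placeholder thresholds
                         Thr_L_x=0, Thr_R_x=1920,
                         Thr_U_y=1080, Thr_D_y=0):
    if x0 < 0:                       # S12: no subject detected
        return True
    if W < Thr_W or H < Thr_H:       # S13: subject smaller than prescribed size
        return True
    cx, cy = x0 + W / 2, y0 + H / 2  # center of the subject area
    if cx < Thr_L_x:                 # S14: deviates past the left edge
        return True
    if cx > Thr_R_x:                 # S15: deviates past the right edge
        return True
    if cy > Thr_U_y:                 # S16: deviates past the upper edge
        return True
    if cy < Thr_D_y:                 # S17: deviates past the bottom edge
        return True
    return False                     # subject present, large enough, inside frame
```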
  • the necessary/unnecessary determining unit 107 reads out, from the storing unit, the necessary/unnecessary determination flag FLG_n of the currently-obtained moving image frame and the necessary/unnecessary determination flag FLG_n−1 of the last-obtained moving image frame (S 20), and determines whether or not the multiplication result of the necessary/unnecessary determination flag FLG_n and the necessary/unnecessary determination flag FLG_n−1 is smaller than zero, that is, whether or not the last-obtained moving image frame is the last moving image frame in one or more moving image frames that are unnecessary at the time of editing and playback (S 21).
  • upon determining that the moving image capturing has been terminated (Yes in S 24), the necessary/unnecessary determining unit 107 terminates the determination process of the unnecessary moving image frame, and upon determining that the moving image capturing has not been terminated yet (No in S 24), returns to S 11 and inputs meta data corresponding to the moving image frame to be obtained next.
  • the necessary/unnecessary determining unit 107 does not execute S 22 and S 23 , and determines whether or not the moving image capturing has been terminated (S 24 ).
  • the necessary/unnecessary determining unit 107 reads out the necessary/unnecessary determination flag FLG_n of the currently-obtained moving image frame and the necessary/unnecessary determination flag FLG_n−1 of the last-obtained moving image frame from the storing unit (S 27), and determines whether or not the multiplication result of the necessary/unnecessary determination flag FLG_n and the necessary/unnecessary determination flag FLG_n−1 is smaller than zero, that is, whether or not the currently-obtained moving image frame is the top moving image frame in one or more moving image frames that are unnecessary at the time of editing and playback (S 28).
  • the necessary/unnecessary determining unit 107 does not execute S 29 and S 30 , and determines whether or not the moving image capturing has been terminated (S 24 ).
  • the moving image file managing unit 108 rewrites information that indicates the state of data at the recording positions in the recording unit 106 corresponding to all the identification information between the identification information PID output together with the top information Ki and the identification information PID output together with the last information Kj respectively into information indicating that there is no data.
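  • The multiplication test used in S 20-S 21 (and S 27-S 28) is a sign-flip detector. Assuming the necessary/unnecessary flag is stored as +1 (necessary) and −1 (unnecessary), which the text implies but never states outright, a negative product FLG_n × FLG_n−1 marks a boundary of a run of unnecessary frames:

```python
def detect_run_boundaries(flags):
    """flags: per-frame values, +1 necessary / -1 unnecessary (assumed encoding).
    Yields (frame_index, 'top'|'last') events for runs of unnecessary frames."""
    for n in range(1, len(flags)):
        if flags[n] * flags[n - 1] < 0:   # sign flip => run boundary
            if flags[n] < 0:
                yield n, 'top'            # current frame starts an unnecessary run
            else:
                yield n - 1, 'last'       # previous frame ended the run
```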
  • the image capturing apparatus 500 illustrated in FIG. 5A is effective in reducing the amount of data of moving images obtained when the subject moves wildly, such as when watching a sport or a race, as whether a moving image frame is necessary/unnecessary is determined according to the position and the size of the subject.
  • top information Ki is set for the moving image frame of the identification information corresponding to the last part of the successive frames.
  • the difference between frames may also be obtained based on the statistics of the images; for example, a moving image frame may be determined as unnecessary when the total value of the level differences between frames is equal to or smaller than a prescribed value.
  • in one variation example, a difference calculating unit 5011 that integrates the differences between frames is provided.
  • in another variation example, a histogram difference calculating unit 5012 that calculates the histogram and calculates the difference in the histograms between frames, or their integral, is provided.
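  • As an illustration of this histogram-based variation, the sketch below (pure Python; the function names, bin count, and 8-bit luminance range are assumptions) computes per-frame luminance histograms and treats a pair of consecutive frames as static, and hence a removal candidate, when the accumulated histogram difference is at or below a prescribed value:

```python
def luminance_histogram(pixels, bins=16):
    """pixels: iterable of 0..255 luminance values."""
    hist = [0] * bins
    for p in pixels:
        hist[min(p * bins // 256, bins - 1)] += 1
    return hist

def histogram_difference(frame_a, frame_b, bins=16):
    ha = luminance_histogram(frame_a, bins)
    hb = luminance_histogram(frame_b, bins)
    return sum(abs(a - b) for a, b in zip(ha, hb))

def is_static(frame_a, frame_b, threshold=100):
    # near-identical histograms suggest an unchanged, possibly unnecessary frame
    return histogram_difference(frame_a, frame_b) <= threshold
```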
  • FIG. 9 is a diagram illustrating an image capturing apparatus 900 as embodiment 2 of the image processing apparatus 100 illustrated in FIG. 1. Meanwhile, the same numerals are assigned to the same configurations as those illustrated in FIG. 5, and their explanation is omitted.
  • the image capturing apparatus 900 illustrated in FIG. 9 is characterized in that it includes a focus information processing unit 901 as the necessary/unnecessary determination data generating unit 103; the focus information processing unit 901 sets "focus information" as the necessary/unnecessary determination data for each moving image frame of the moving image after image processing; and the meta data generating unit 105 integrates the "focus information" with the identification information PID and outputs them to the necessary/unnecessary determining unit 107 as meta data.
  • FIG. 10 is a flowchart illustrating an example of the operations of the necessary/unnecessary determining unit 107 in the image capturing apparatus 900 illustrated in FIG. 9.
  • the necessary/unnecessary determining unit 107 inputs "focus information" represented in the meta data (S 31). Meanwhile, every time the image capturing unit 101 obtains a moving image frame, the focus information processing unit 901 obtains a contrast evaluation value C0 corresponding to the moving image frame from the image processing unit 102, also obtains the lens position L0 of the focus lens from the image capturing unit 101, and outputs them to the meta data generating unit 105. In addition, when the contrast evaluation value C0 is smaller than 0, it is assumed that the focusing has failed, that is, the focus lens is out of focus.
  • when the necessary/unnecessary determining unit 107 determines that C0 is equal to or larger than 0, that is, that the focus lens is in focus (No in S 32), it keeps the lens position L0_n corresponding to the currently-obtained moving image frame in a storing unit that is inside or outside the necessary/unnecessary determining unit 107 (S 33).
  • the necessary/unnecessary determining unit 107 reads out the lens position L0_n corresponding to the currently-obtained moving image frame and the lens position L0_n−1 corresponding to the last-obtained moving image frame from the storing unit (S 34), and determines whether or not the absolute value of the difference between the lens position L0_n and the lens position L0_n−1 is larger than a threshold Thr_L, that is, whether or not the focus is unstable (S 35).
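  • The focus checks of S 32 through S 35 reduce to the small predicate sketched below (hypothetical names, placeholder threshold): a frame is a removal candidate when focusing has failed (C0 smaller than 0) or when the focus lens position jumped by more than Thr_L between consecutive frames. Embodiment 3 (FIG. 12) applies the same sliding comparison to the zoom information Z0 with threshold Thr_Z.

```python
def focus_frame_unnecessary(C0, L0_n, L0_prev, Thr_L=5.0):
    """C0: contrast evaluation value; L0_n/L0_prev: focus lens positions."""
    if C0 < 0:                  # S32: focusing has failed (out of focus)
        return True
    if L0_prev is not None and abs(L0_n - L0_prev) > Thr_L:
        return True             # S35: focus is unstable between frames
    return False
```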
  • the necessary/unnecessary determining unit 107 reads out the necessary/unnecessary determination flag FLG_n of the currently-obtained moving image frame and the necessary/unnecessary determination flag FLG_n−1 of the last-obtained moving image frame from the storing unit (S 38), and determines whether or not the multiplication result of the necessary/unnecessary determination flag FLG_n and the necessary/unnecessary determination flag FLG_n−1 is smaller than zero, that is, whether or not the last-obtained moving image frame is the last moving image frame in one or more moving image frames that are unnecessary at the time of editing and playback (S 39).
  • the necessary/unnecessary determining unit 107 integrates last information Kj indicating the last moving image frame in one or more unnecessary moving image frames and the identification information PID corresponding to the last-obtained moving image frame (S 40 ), and outputs it to the moving image file managing unit 108 (S 41 ).
  • upon receiving a moving image capturing termination instruction and the like from the user and determining that the moving image capturing has been terminated (Yes in S 42), the necessary/unnecessary determining unit 107 terminates the determination process of the unnecessary moving image frame, and upon determining that the moving image capturing has not been terminated yet (No in S 42), returns to S 31 and inputs meta data corresponding to the moving image frame to be obtained next.
  • the necessary/unnecessary determining unit 107 does not execute S 40 and S 41 , and determines whether or not the moving image capturing has been terminated (S 42 ).
  • the necessary/unnecessary determining unit 107 reads out the necessary/unnecessary determination flag FLG_n of the currently-obtained moving image frame and the necessary/unnecessary determination flag FLG_n−1 of the last-obtained moving image frame from the storing unit (S 45), and determines whether or not the multiplication result of the necessary/unnecessary determination flag FLG_n and the necessary/unnecessary determination flag FLG_n−1 is smaller than zero, that is, whether or not the currently-obtained moving image frame is the top moving image frame in one or more moving image frames that are unnecessary at the time of editing and playback (S 46).
  • the necessary/unnecessary determining unit 107 integrates top information Ki indicating the top moving image frame in one or more unnecessary moving image frames and the identification information PID corresponding to the currently-obtained moving image frame (S 47), outputs it to the moving image file managing unit 108 (S 48), and determines whether or not the moving image capturing has been terminated (S 42).
  • the necessary/unnecessary determining unit 107 does not execute S 47 and S 48 , and determines whether or not the moving image capturing has been terminated (S 42 ).
  • the moving image file managing unit 108 rewrites information that indicates the state of data at the recording positions in the recording unit 106 corresponding to all the identification information between the identification information PID output together with the top information Ki and the identification information PID output together with the last information Kj respectively into information indicating that there is no data.
  • the image capturing apparatus 900 illustrated in FIG. 9 is effective in reducing the amount of data of moving images obtained when not only the subject but also the person who is capturing the image is moving, as the necessary/unnecessary determination about a moving image frame is made according to whether or not focusing has failed or whether or not the focus is unstable.
  • FIG. 11 is a diagram illustrating an image capturing apparatus 1100 as embodiment 3 of the image processing apparatus 100 illustrated in FIG. 1. Meanwhile, the same numerals are assigned to the same configurations as those illustrated in FIG. 5, and their explanation is omitted.
  • the image capturing apparatus 1100 illustrated in FIG. 11 is characterized in that it includes a zoom information processing unit 1101 as the necessary/unnecessary determination data generating unit 103; the zoom information processing unit 1101 sets "zoom information" as the necessary/unnecessary determination data for each moving image frame of the moving image after image processing; and the meta data generating unit 105 integrates the "zoom information" with the identification information PID and outputs them to the necessary/unnecessary determining unit 107 as meta data.
  • FIG. 12 is a flowchart illustrating an example of the operations of the necessary/unnecessary determining unit 107 in the image capturing apparatus 1100 illustrated in FIG. 11.
  • the necessary/unnecessary determining unit 107 inputs "zoom information" represented in the meta data (S 51). Meanwhile, every time the image capturing unit 101 obtains a moving image frame, the zoom information processing unit 1101 obtains zoom information Z0 (for example, the lens position or the amount of lens movement of the zoom lens) corresponding to the moving image frame from the image capturing unit 101, and outputs it to the meta data generating unit 105.
  • the necessary/unnecessary determining unit 107 keeps the input zoom information Z0 in a storing unit that is inside or outside the necessary/unnecessary determining unit 107 (S 52 ).
  • the necessary/unnecessary determining unit 107 reads out the zoom information Z0_n corresponding to the currently-obtained moving image frame and the zoom information Z0_n−1 corresponding to the last-obtained moving image frame from the storing unit (S 53), and determines whether or not the absolute value of the difference between the zoom information Z0_n and the zoom information Z0_n−1 is larger than a threshold Thr_Z, that is, whether or not the lens position of the zoom lens is unstable (S 54).
  • the necessary/unnecessary determining unit 107 reads out the necessary/unnecessary determination flag FLG_n of the currently-obtained moving image frame and the necessary/unnecessary determination flag FLG_n−1 of the last-obtained moving image frame from the storing unit (S 57), and determines whether or not the multiplication result of the necessary/unnecessary determination flag FLG_n and the necessary/unnecessary determination flag FLG_n−1 is smaller than zero, that is, whether or not the last-obtained moving image frame is the last moving image frame in one or more moving image frames that are unnecessary at the time of editing and playback (S 58).
  • the necessary/unnecessary determining unit 107 integrates the last information Kj indicating the last moving image frame in one or more unnecessary moving image frames and the identification information PID corresponding to the last-obtained moving image frame (S 59 ), and outputs it to the moving image file managing unit 108 (S 60 ).
  • the necessary/unnecessary determining unit 107 terminates the determination process of the unnecessary moving image frame, and upon determining that the moving image capturing has not been terminated yet (No in S 61 ), returns to S 51 , and inputs meta data corresponding to the moving image frame to be obtained next.
  • the necessary/unnecessary determining unit 107 does not execute S 59 and S 60 , and determines whether or not the moving image capturing has been terminated (S 61 ).
  • the necessary/unnecessary determining unit 107 reads out, from the storing unit, the necessary/unnecessary determination flag FLG_n of the currently-obtained moving image frame and the necessary/unnecessary determination flag FLG_n ⁇ 1 of the last-obtained moving image frame (S 64 ), and determines whether or not the multiplication result of the necessary/unnecessary determination flag FLG_n and the necessary/unnecessary determination flag FLG_n ⁇ 1 is smaller than zero, that is, whether or not the currently-obtained moving image frame is the top moving image frame in one or more moving image frames that are unnecessary at the time of editing and playback (S 65 ).
  • the necessary/unnecessary determining unit 107 integrates the top information Ki indicating the top moving image frame in one or more unnecessary moving image frames and the identification information PID corresponding to the currently-obtained moving image frame (S 66 ), and outputs it to the moving image file managing unit 108 (S 67 ), and determines whether or not the image capturing has been terminated (S 61 ).
  • the necessary/unnecessary determining unit 107 does not execute S 66 and S 67 , and determines whether or not the moving image capturing has been terminated (S 61 ).
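  • the determination loop above (S 51 through S 67) can be outlined by the following minimal sketch, assuming a hypothetical flag convention in which FLG_n is +1 for a necessary frame and -1 for an unnecessary one, so that the product of FLG_n and FLG_n-1 being smaller than zero marks a run boundary; the function and variable names are illustrative, not the apparatus's actual interfaces:

      def zoom_determination(frames, thr_z):
          """Yield (PID, marker) pairs bounding each run of unnecessary frames."""
          prev_z = None
          prev_flg = 1           # assume the stream starts with a necessary frame
          prev_pid = None
          for pid, z0 in frames:             # (identification information PID, zoom information Z0)
              if prev_z is None:
                  flg = 1                    # first frame: nothing to compare against
              else:
                  # S 54: the zoom lens position is unstable when |Z0_n - Z0_n-1| > Thr_Z
                  flg = -1 if abs(z0 - prev_z) > thr_z else 1
              if flg * prev_flg < 0:         # S 58 / S 65: a sign change marks a run boundary
                  if flg > 0:
                      yield prev_pid, "last" # S 59: last frame of an unnecessary run
                  else:
                      yield pid, "top"       # S 66: top frame of an unnecessary run
              prev_z, prev_flg, prev_pid = z0, flg, pid

  • the same loop structure applies to the focus-based and RSSI-based embodiments; only the per-frame condition that sets the flag changes.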
  • the moving image file managing unit 108 rewrites the information that indicates the state of data at the recording positions in the recording unit 106 corresponding to all the identification information between the identification information PID output together with the top information Ki and the identification information PID output together with the last information Kj into information indicating that there is no data.
  • the image capturing unit 1100 illustrated in FIG. 11 is effective in reducing the amount of data of moving images obtained when, for example, the subject is moving in the forward/backward directions with respect to the person who is capturing the image, as the necessary/unnecessary determination about a moving image frame is made according to whether or not the lens position of the zoom lens is unstable.
  • the information used for the necessary/unnecessary determination process about the unnecessary moving image frame is not limited to the position and size of the subject, the focus information, or the zoom information, as in embodiments 1 through 3 described above.
  • for example, the histogram of the luminance gradation of the respective image frames after image processing may also be used for the necessary/unnecessary determination process.
  • two or more pieces of information from among the position and size of the subject, the focus information, the zoom information, and the histogram of the luminance gradation mentioned above may be combined and used for the necessary/unnecessary determination process about the unnecessary moving image frame.
  • FIG. 13 is a diagram illustrating an image capturing unit 1300 being embodiment 4 of the image processing apparatus 100 illustrated in FIG. 1 . Meanwhile, to the same configuration as the configuration illustrated in FIG. 5 , the same numeral is assigned and explanation for the configuration is omitted.
  • the image capturing unit 1300 illustrated in FIG. 13 is characterized in that it includes a radio antenna 1301 and a reception signal strength detecting unit 1302 as the necessary/unnecessary determination data generating unit 103 , and by the reception signal strength detecting unit 1302 , for each moving image frame of the moving image after image processing, respectively, RSSI (Received Signal Strength Indication) R0 is detected as the necessary/unnecessary determination data, and by the meta data generating unit 105 , the RSSI R0 and the identification information PID are integrated and output to the necessary/unnecessary determining unit 107 as meta data.
  • the radio antenna 1301 is supposed to receive a signal transmitted from a transmitter provided in the subject, and a signal transmitted from the image capturing unit 1300 and reflected on the subject.
  • FIG. 14 is a flowchart illustrating an example of the operations of the necessary/unnecessary determining unit 107 in the image capturing unit 1300 illustrated in FIG. 13 .
  • the necessary/unnecessary determining unit 107 inputs “RSSI” represented in the meta data (S 71 ). Meanwhile, every time the image capturing unit 101 obtains a moving image frame, the reception signal strength detecting unit 1302 associates the level of the reception signal received by the radio antenna 1301 with the moving image frame and outputs it to the meta data generating unit 105 as the RSSI R0.
  • the necessary/unnecessary determining unit 107 determines whether or not the input RSSI R0 is larger than a threshold Thr_R, that is, whether or not the subject is at a far position with respect to the image capturing unit 1300 (S 72 ).
  • the necessary/unnecessary determining unit 107 reads out, from the storing unit, the necessary/unnecessary determination flag FLG_n of the currently-obtained moving image frame and the necessary/unnecessary determination flag FLG_n ⁇ 1 of the last-obtained moving image frame (S 75 ), and determines whether or not the multiplication result of the necessary/unnecessary determination flag FLG_n and the necessary/unnecessary determination flag FLG_n ⁇ 1 is smaller than zero, that is, whether or not the last-obtained moving image frame is the last moving image frame in one or more moving image frames that are unnecessary at the time of editing and playback (S 76 ).
  • the necessary/unnecessary determining unit 107 terminates the determination process of the unnecessary moving image frame, and upon determining that the moving image capturing has not been terminated yet (No in S 79 ), returns to S 71 , and inputs meta data corresponding to the moving image frame to be obtained next.
  • the necessary/unnecessary determining unit 107 does not execute S 77 and S 78 , and determines whether or not the moving image capturing has been terminated (S 79 ).
  • the necessary/unnecessary determining unit 107 reads out the necessary/unnecessary determination flag FLG_n of the currently-obtained moving image frame and the necessary/unnecessary determination flag FLG_n − 1 of the last-obtained moving image frame from the storing unit (S 82 ), and determines whether or not the multiplication result of the necessary/unnecessary determination flag FLG_n and the necessary/unnecessary determination flag FLG_n − 1 is smaller than zero, that is, whether or not the currently-obtained moving image frame is the top moving image frame in one or more moving image frames that are unnecessary at the time of editing and playback (S 83 ).
  • the necessary/unnecessary determining unit 107 does not execute S 84 and S 85 , and determines whether or not the moving image capturing has been terminated (S 79 ).
  • the moving image file managing unit 108 rewrites the information that indicates the state of data at the recording positions in the recording unit 106 corresponding to all the identification information between the identification information PID output together with the top information Ki and the identification information PID output together with the last information Kj into information indicating that there is no data.
  • the image capturing unit 1300 illustrated in FIG. 13 is effective in reducing the amount of data of moving images obtained when, for example, the subject is moving in the forward/backward directions with respect to the image capturing unit 1300 , as the necessary/unnecessary determination about a moving image frame is made according to the RSSI R0.
  • the configuration is made so that, in the necessary/unnecessary determining unit 107 , the necessary/unnecessary determination of the moving image frame is made based on whether or not the RSSI R0 is larger than the threshold Thr_R, but the configuration may also be made so that this determination is made in the reception signal strength detecting unit 1302 .
  • in this case, meta data is generated by integrating a flag indicating an unnecessary moving image frame and the identification information corresponding to the moving image frame.
  • the moving image file managing unit 108 rewrites the information that indicates the state of data at the recording position in the recording unit 106 corresponding to the identification information in the meta data into information indicating that there is no data.
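  • a minimal sketch of this rewrite follows, assuming FAT-like tables of the kind described for the recording unit later in this document: a hypothetical identification information/recording position table pid_to_cluster, and a recording position/data state information table fat in which each entry holds the next cluster number, “FFFF” for end of data, or “0000” for no data:

      def mark_no_data(pid_to_cluster, fat, pid):
          """Rewrite every data-state entry of one frame's cluster chain to 'no data'."""
          cluster = pid_to_cluster[pid]   # recording position of the unnecessary frame
          while cluster != "FFFF":        # "FFFF" marks the end of the chain
              next_cluster = fat[cluster]
              fat[cluster] = "0000"       # "0000" indicates that there is no data
              cluster = next_cluster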
  • the necessary/unnecessary determining unit 107 may be omitted.
  • FIG. 16 is a diagram illustrating an image capturing unit 1600 being embodiment 5 of the image processing apparatus 100 illustrated in FIG. 1 . Meanwhile, to the same configuration as the configuration illustrated in FIG. 5 , the same numeral is assigned and explanation for the configuration is omitted.
  • the image capturing unit 1600 is characterized in that it includes a reduced image generating unit 1601 in the stage preceding the necessary/unnecessary determination data generating unit 103 , and by the reduced image generating unit 1601 , the respective moving image frames of the moving image after image processing are respectively reduced, and the reduced images and the identification information PID are integrated and output to the necessary/unnecessary determination data generating unit 103 as meta data.
  • the necessary/unnecessary determination data generating unit 103 illustrated in FIG. 16 generates the necessary/unnecessary determination data used to determine whether or not the moving image frame corresponding to identification information is an unnecessary moving image frame, based on the reduced image in input meta data.
  • the necessary/unnecessary determination data generated by the necessary/unnecessary determination data generating unit 103 illustrated in FIG. 16 is assumed to be, for example, the position and the size of the subject, or the histogram of the luminance gradation and the like described above.
  • the operations of the necessary/unnecessary determining unit 107 and moving image file managing unit 108 illustrated in FIG. 16 are similar to the operations of the necessary/unnecessary determining unit 107 and moving image file managing unit 108 in embodiments 1 through 4 described above.
  • the necessary/unnecessary determination data generating unit 103 illustrated in FIG. 16 may be provided inside the necessary/unnecessary determining unit 107 .
  • since the image capturing unit 1600 illustrated in FIG. 16 uses a reduced image when generating the necessary/unnecessary determination data, the load put on the image capturing unit 1600 when generating the necessary/unnecessary determination data may be suppressed.
  • FIG. 17 is a diagram illustrating an image processing system as a variation example in which a part of functions of the image capturing apparatus illustrated in FIG. 16 are placed outside. Meanwhile, to the same configuration as the configuration illustrated in FIG. 1 , the same numeral is assigned and explanation for the configuration is omitted.
  • the image processing system illustrated in FIG. 17 includes the image processing apparatus 100 and a server 1700 , and the image processing apparatus 100 and the server 1700 transmit/receive data to/from each other via a network 1701 .
  • the image processing apparatus 100 illustrated in FIG. 17 differs from the image processing apparatus 100 illustrated in FIG. 1 in that the server 1700 that is outside the image processing apparatus 100 is made to perform the determination process of the unnecessary moving image frame.
  • a transmitting unit 1702 of the image processing apparatus 100 illustrated in FIG. 17 transmits meta data generated by the meta data generating unit 105 to the server 1700 .
  • the necessary/unnecessary determining unit 1703 of the server 1700 illustrated in FIG. 17 determines whether or not the moving image frame corresponding to the identification information in the meta data is an unnecessary moving image frame, based on the necessary/unnecessary determination data in meta data received by the receiving unit 1704 , and transmits the determination result to the image processing apparatus 100 by the transmitting unit 1705 .
  • the moving image file managing unit 108 of the image processing apparatus 100 illustrated in FIG. 17 identifies the recording position in the recording unit 106 corresponding to the identification information of a moving image frame determined as an unnecessary moving image frame according to the determination result received by the receiving unit 1706 , using the identification information/recording position table in the recording unit 106 .
  • the moving image file managing unit 108 identifies information that indicates the state of data corresponding to the identified recording position, using the recording position/data state information table in the recording unit 106 , and rewrites the identified information that indicates the state of data into information that indicates that there is no data.
  • the recording unit 106 illustrated in FIG. 17 may include a table in which all the pieces of identification information recorded in the recording unit 106 and information that indicates the state of data at the recording position in the recording unit 106 respectively corresponding to each of the pieces of identification information are associated.
  • the moving image file managing unit 108 illustrated in FIG. 17 may be configured to rewrite, using this table, information that indicates the state of data at the recording position in the recording unit 106 corresponding to identification information of a moving image frame determined as an unnecessary moving image frame, transmitted from the server 1700 , into information that indicates that there is no data.
  • the image processing apparatus 100 illustrated in FIG. 17 may be replaced with one of the image capturing unit in embodiments 1 through 5 described above.
  • since the image processing system illustrated in FIG. 17 is configured to perform the necessary/unnecessary determination process about a moving image frame at the server 1700 side that is outside the image processing apparatus 100 , the load put on the image processing apparatus 100 may be suppressed.
  • FIG. 18 is a diagram illustrating another example of an image processing system as a variation example in which a part of functions of the image capturing apparatus illustrated in FIG. 16 are placed outside. Meanwhile, to the same configuration as the configuration of the image processing system illustrated in FIG. 17 , the same numeral is assigned and explanation for the configuration is omitted.
  • the image processing system illustrated in FIG. 18 differs from the image processing system illustrated in FIG. 17 in that, instead of the necessary/unnecessary determination data generating unit 103 , the reduced image generating unit 1601 illustrated in FIG. 16 is provided in the image processing apparatus 100 , and also, a necessary/unnecessary determination data generating unit 1800 is provided in the server 1700 .
  • by the reduced image generating unit 1601 of the image processing apparatus 100 , a reduced image of each of the respective moving image frames of the moving image after image processing is generated, and the reduced image and the identification information PID are integrated by the meta data generating unit 105 and transmitted, as meta data, from the image processing apparatus 100 to the server 1700 via the network 1701 .
  • by the necessary/unnecessary determination data generating unit 1800 of the server 1700 , based on the reduced image in the received meta data, the necessary/unnecessary determination data used to determine whether or not the moving image frame corresponding to the identification information in the meta data is unnecessary is generated. Meanwhile, the necessary/unnecessary determination data generated by the necessary/unnecessary determination data generating unit 1800 is assumed to be, for example, the position and the size of the subject, or the histogram of the luminance gradation and the like described above.
  • the necessary/unnecessary determining unit 1703 of the server 1700 determines whether or not the moving image frame corresponding to the identification information in received meta data is a moving image frame that is unnecessary at the time of editing and playback, based on the necessary/unnecessary determination data generated by the necessary/unnecessary determination data generating unit 1800 , and transmits the determination result to the image processing apparatus 100 by the transmitting unit 1705 .
  • the moving image file managing unit 108 of the image processing apparatus 100 identifies the recording position in the recording unit 106 corresponding to the identification information of the moving image frame determined as an unnecessary moving image frame, received by the receiving unit 1706 , using the identification information/recording position table in the recording unit 106 .
  • the moving image file managing unit 108 identifies information that indicates the state of data corresponding to the identified recording position, using a recording position/data state information table in the recording unit 106 and rewrites the identified information that indicates the state of data into information indicating that there is no data.
  • the recording unit 106 illustrated in FIG. 18 may include a table in which all the pieces of identification information recorded in the recording unit 106 and information that indicates the state of data at the recording position in the recording unit 106 respectively corresponding to each of the pieces of identification information are associated.
  • the moving image file managing unit 108 illustrated in FIG. 18 may be configured to rewrite, using this table, information that indicates the state of data at the recording position in the recording unit 106 corresponding to identification information of a moving image frame determined as an unnecessary moving image frame, transmitted from the server 1700 , into information that indicates that there is no data.
  • since the image processing system illustrated in FIG. 18 includes the necessary/unnecessary determination data generating unit 1800 in the server 1700 , the load put on the image processing apparatus 100 may be suppressed.
  • since the image processing system illustrated in FIG. 18 uses a reduced image when generating the necessary/unnecessary determination data, the load put on the server 1700 when generating the necessary/unnecessary determination data may be suppressed.
  • FIG. 19 is a diagram illustrating another example of the image processing system in FIGS. 17, 18 . Meanwhile, to the same configuration as the configuration of the image processing apparatus 100 illustrated in FIG. 1 and the same configuration as the image processing system illustrated in FIG. 17 , the same numeral is assigned and explanation for the configuration is omitted.
  • the image processing system illustrated in FIG. 19 differs from the image processing system illustrated in FIG. 17 in that the image processing apparatus 100 further includes the necessary/unnecessary determining unit 107 , a load state measuring unit 1900 , and a distribution switching unit 1901 .
  • the load state measuring unit 1900 is an example of a detecting unit that detects information for determining whether or not the image processing apparatus 100 is in a state in which a load may be applied. Meanwhile, the detecting unit in the claims is, for example, the load state measuring unit 1900 in FIG. 19 .
  • the load state measuring unit 1900 measures the state of the load on the resources (for example, the CPU, the memory and the like) of the image processing apparatus 100 .
  • the distribution switching unit 1901 switches the transmission destination of meta data generated by the meta data generating unit 105 to either one of the necessary/unnecessary determining unit 1703 of the server 1700 and the necessary/unnecessary determining unit 107 of the image processing apparatus 100 : when the load measured by the load state measuring unit 1900 is high, the meta data is transmitted to the necessary/unnecessary determining unit 1703 of the server 1700 , and when the load is low, the meta data is transmitted to the necessary/unnecessary determining unit 107 of the image processing apparatus 100 , as sketched below.
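  • a minimal sketch of this switching, assuming a hypothetical load measure in the range [0, 1] and an illustrative threshold; the two determining units are modeled as callables:

      def route_meta_data(meta_data, load_ratio, send_to_server, determine_locally,
                          load_threshold=0.8):   # threshold value is illustrative
          """Route meta data to the server when the local load is high."""
          if load_ratio > load_threshold:
              send_to_server(meta_data)      # necessary/unnecessary determining unit 1703 (server 1700)
          else:
              determine_locally(meta_data)   # necessary/unnecessary determining unit 107 (apparatus 100)

  • the power source mode variant described below (FIG. 20) follows the same routing pattern, with the power-saving (standby) mode taking the place of the high-load condition.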
  • the image processing system illustrated in FIG. 19 is configured so that, when the load put on the resources of the image processing apparatus 100 is high, it is determined that it is impossible to sufficiently use the resources of the image processing apparatus 100 in performing the necessary/unnecessary determination process about the moving image frame, and the server 1700 that is outside the image processing apparatus 100 is made to perform the necessary/unnecessary determination about the moving image frame, and therefore, the efficiency of the necessary/unnecessary determination process about the moving image frame may be improved.
  • FIG. 20 is a diagram illustrating another embodiment of the image processing system illustrated in FIG. 19 . Meanwhile, to the same configuration as the image processing system illustrated in FIG. 17 , the same numeral is assigned, and explanation of the configuration is omitted.
  • the image processing system illustrated in FIG. 20 differs from the image processing system illustrated in FIG. 17 in that the image processing apparatus 100 further includes the necessary/unnecessary determining unit 107 , a power source mode detecting unit 2000 and a distribution switching unit 2001 .
  • the power source mode detecting unit 2000 is an example of a detecting unit that detects information for determining whether or not the image processing apparatus 100 is in a state in which a load may be applied. Meanwhile, the detecting unit in the claims is, for example, the power source mode detecting unit 2000 .
  • the power source mode detecting unit 2000 detects the power source mode of the image processing apparatus 100 (for example, a start-up mode in which the power is supplied to all of the respective units of the image processing apparatus 100 and a power-saving (standby) mode in which the power is supplied to a part of the respective units of the image processing apparatus 100 ).
  • the distribution switching unit 2001 switches the transmission destination of meta data generated by the meta data generating unit 105 to either one of the necessary/unnecessary determining unit 1703 of the server 1700 and the necessary/unnecessary determining unit 107 of the image processing apparatus 100 : when the power source mode of the image processing apparatus 100 is the “power-saving (standby) mode”, the meta data is transmitted to the necessary/unnecessary determining unit 1703 of the server 1700 , and when the power source mode is the “start-up mode”, the meta data is transmitted to the necessary/unnecessary determining unit 107 of the image processing apparatus 100 .
  • the image processing system illustrated in FIG. 20 is configured so that, when the power source mode of the image processing apparatus 100 is the “power-saving (standby) mode”, it is determined that it is impossible to sufficiently use the resources of the image processing apparatus 100 in performing the necessary/unnecessary determination process about the moving image frame, and the server 1700 that is outside the image processing apparatus 100 is made to perform the necessary/unnecessary determination process about the moving image frame, and therefore, the efficiency of the necessary/unnecessary determination process about the moving image frame may be improved.
  • FIG. 21 is a diagram illustrating an image processing system of another embodiment of the present invention.
  • the image processing system illustrated in FIG. 21 includes an image processing apparatus 100 a that records input moving images in a recording unit 106 a , and an image reading apparatus 108 a that reads out moving images from the recording unit 106 a.
  • the recording unit 106 a illustrated in FIG. 21 may be a recording unit that is built inside the image processing apparatus 100 a (for example, a hard disk, RAM (Random Access Memory) and the like), or may be an external recording unit that is insertable/removable to/from the image processing apparatus 100 a (for example, a medium such as a CD (Compact Disc) or DVD (Digital Versatile Disc), a ROM (Read Only Memory), a memory card constituted by a non-volatile memory or a flash memory, a USB memory, and the like).
  • the recording unit 106 a illustrated in FIG. 21 is assumed to be an external recording unit. Meanwhile, when the recording unit 106 a is built in the image processing apparatus 100 a , the image processing apparatus 100 a and the image reading apparatus 108 a are connected to each other by a communication line and the like.
  • the image processing apparatus 100 a includes a subject detection processing unit 103 a , an image encoding unit 104 a , a meta data generating unit 105 a , and a control processing unit 109 a . Meanwhile, the control processing unit 109 a controls the operation of the respective units of the image processing apparatus 100 a.
  • the subject detection processing unit 103 a generates, for each moving image frame of a moving image input from outside (for example, an image capturing apparatus, an external recording unit, and the like) to the image processing apparatus 100 a , necessary/unnecessary determination data used to determine whether or not the moving image frame is an unnecessary moving image frame.
  • the image encoding unit 104 a encodes moving images input from outside to the image processing apparatus 100 a , and also, while dividing each moving image frame of the moving images after encoding into a plurality of pieces of data respectively, assigns identification information to those pieces of data and records them in the recording unit 106 a .
  • the image encoding unit 104 a performs encoding of input moving images, as illustrated in FIG. 22 , in the MPEG2 format, and while dividing each moving image frame of the moving images after encoding respectively into a plurality of TS packets, assigns identification information PID (K, K+1, . . . ) respectively to the TS packets, and records them in the recording unit 106 a.
  • the recording unit 106 a includes an identification information/recording position table that indicates correspondence between identification information assigned respectively to each piece of data in the image encoding unit 104 a and information indicating the recording position in the recording unit 106 a .
  • for example, as illustrated in FIG. 23A , the recording unit 106 a includes a directory as an identification information/recording position table that indicates correspondence between the identification information PID and the cluster number indicating the position of the cluster in the recording unit 106 a.
  • the recording unit 106 a includes a recording position/data state information table that indicates correspondence between information indicating the recording position in the recording unit 106 a and information indicating the state of the data at each recording position (for example, when data continues to another recording position, information indicating that recording position; information indicating that data ends at the recording position; or information indicating that there is no data at the recording position).
  • for example, as illustrated in FIG. 23B , the recording unit 106 a includes a FAT as a recording position/data state information table that indicates correspondence between the cluster number indicating the position of the cluster in the recording unit 106 a and information indicating the state of data at each cluster number (for example, when data continues to another cluster, the cluster number of that cluster; information “FFFF” indicating that data ends at the cluster; or information “0000” indicating that there is no data in the cluster).
  • the meta data generating unit 105 a generates meta data by integrating one piece of the identification information assigned respectively to the respective pieces of data obtained by dividing the moving image frame after encoding with the necessary/unnecessary determination data corresponding to the moving image frame, and records the generated meta data in the recording unit 106 a .
  • meanwhile, the identification information PID integrated with the necessary/unnecessary determination data is not limited to the identification information PID of the top TS packet among the respective TS packets obtained by dividing the moving image frame after encoding.
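  • a minimal sketch of this division and meta data generation, assuming fixed-size TS payloads and sequential PIDs (K, K+1, . . . ) as in the FIG. 22 example, and using the PID of the top TS packet as the identification information; packet_size and the dict layout are illustrative:

      def encode_and_packetize(frame_bytes, first_pid, determination_data, packet_size=184):
          """Divide one encoded frame into TS packets and build its meta data."""
          packets = []
          for i in range(0, len(frame_bytes), packet_size):
              pid = first_pid + i // packet_size            # PID (K, K+1, ...)
              packets.append((pid, frame_bytes[i:i + packet_size]))
          # meta data: identification information PID of the top TS packet,
          # integrated with the necessary/unnecessary determination data
          meta_data = {"PID": first_pid, "determination_data": determination_data}
          return packets, meta_data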
  • the image encoding unit 104 a may be configured to encode moving images input from outside to the image processing apparatus 100 a , and also, to assign identification information respectively to the respective moving image frames of the moving image after encoding, and to record them in the recording unit 106 a .
  • in this case, an identification information/recording position table indicating correspondence between the identification information assigned respectively to the respective moving image frames and information indicating the recording position in the recording unit 106 a is provided in the recording unit 106 a .
  • the meta data generating unit 105 a generates meta data by integrating identification information assigned to the moving image frame after encoding and necessary/unnecessary determination data corresponding to the moving image frame, and records the generated meta data in the recording unit 106 a.
  • the image reading apparatus 108 a illustrated in FIG. 21 is a personal computer and the like, and includes a necessary/unnecessary determining unit 107 a , an identification information/recording position table for reading-out 110 a , a recording position/data state information table for reading-out 111 a , a reading-out control unit 112 a , a reading-out unit 113 a , and a control processing unit 114 a . Meanwhile, the control processing unit 114 a controls the respective units in the image reading apparatus 108 a .
  • the necessary/unnecessary determining unit 107 a , reading-out control unit 112 a , the reading-out unit 113 a , and the control processing unit 114 a are constituted by a microcomputer and the like.
  • the necessary/unnecessary determining unit 107 a determines whether or not the moving image frame corresponding to the identification information in the meta data is an unnecessary moving image frame.
  • the identification information/recording position table for reading-out 110 a is a table with the same contents as the identification information/recording position table in the recording unit 106 a , and is similar to the directory illustrated in FIG. 23A for example.
  • the recording position/data state information table for reading-out 111 a is a table with the same contents as the recording position/data state information table in the recording unit 106 a , and is similar to the FAT illustrated in FIG. 23B .
  • the reading-out control unit 112 a reads out, from the recording unit 106 a , only the moving image frames determined as necessary moving image frames by the necessary/unnecessary determining unit 107 a , among the respective moving image frames of the moving image file recorded in the recording unit 106 a.
  • the reading-out unit 113 a outputs the moving image file read out from the recording unit 106 a by the reading-out control unit 112 a to an editing program and a playback program of moving images executed inside or outside the image reading apparatus 108 a , or, to a display apparatus provided inside or outside the image reading apparatus 108 a.
  • FIG. 24 is a flowchart illustrating an example of the operations of the image reading apparatus 108 a.
  • the necessary/unnecessary determining unit 107 a reads out, from the recording unit 106 a , meta data corresponding to one or more moving image frames in the respective moving image frames of the moving image file indicated in the reading-out request and the like (S 12 a ).
  • the necessary/unnecessary determining unit 107 a determines whether or not the moving image frame corresponding to the identification information in the meta data is a necessary moving image frame (S 13 a ).
  • the necessary/unnecessary determining unit 107 a determines whether or not all the necessary moving image frames in the respective moving image frames in the moving image file indicated in the reading-out request and the like have been read out from the recording unit 106 a (S 14 a ).
  • when the necessary/unnecessary determining unit 107 a determines that all the necessary moving image frames have been read out (Yes in S 14 a ), it terminates the reading-out process, and when it determines that all the necessary moving image frames have not been read out (No in S 14 a ), it returns to S 12 a and reads out meta data corresponding to the next one or more moving image frames from the recording unit 106 a.
  • the reading-out control unit 112 a refers to the identification information/recording position table for reading-out 110 a and the recording position/data state information table for reading-out 111 a (S 15 a ), reads out the necessary moving image frame from the recording unit 106 a (S 16 a ), and determines whether or not all the necessary moving image frames have been read out (S 14 a ).
  • the reading-out control unit 112 a reads out, using the FAT illustrated in FIG. 23B , data recorded at the identified cluster number “0003”, and also reads out data recorded at the next cluster number “0004”.
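  • a minimal sketch of this readout (S 15 a , S 16 a ), assuming dict-based copies of the directory (FIG. 23A) and the FAT (FIG. 23B); read_cluster is a hypothetical accessor for data recorded in the recording unit 106 a:

      def read_necessary_frame(directory, fat, pid, read_cluster):
          """Follow the cluster chain of a necessary frame and collect its data."""
          data = b""
          cluster = directory[pid]            # e.g. PID -> cluster number "0003"
          while True:
              data += read_cluster(cluster)   # read the data recorded at this cluster
              state = fat[cluster]            # next cluster, "FFFF" (end), or "0000" (no data)
              if state in ("FFFF", "0000"):
                  break
              cluster = state                 # e.g. "0003" continues to "0004"
          return data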
  • in the above description, the configuration is made so that the necessary/unnecessary determination about moving image frames is made for each piece of meta data, that is, for each moving image frame, but the configuration may also be made so that the necessary/unnecessary determination about moving image frames is made for every prescribed number of moving image frames.
  • the prescribed number of moving image frames may be set in advance by the user and the like based on the amount of the moving image frames that are necessary at the time of editing and playback.
  • a table in which all the pieces of identification information recorded in the recording unit 106 a and information indicating the recording position in the recording unit 106 a corresponding respectively to the pieces of identification information are associated may be provided.
  • the reading-out control unit 112 a may be configured to read out, using this table, only the data recorded at the recording position in the recording unit 106 a corresponding to the identification information of the moving image frame determined as a necessary moving image frame by the necessary/unnecessary determining unit 107 a , from the recording unit 106 a.
  • the image processing system illustrated in FIG. 21 is configured so that, in the stage preceding the editing work and the playback work, the moving image frames determined as necessary moving image frames among the respective moving image frames of the moving image file are read out from the recording unit 106 a , and therefore, when reading out the moving image file from the recording unit 106 a , only moving image frames other than unnecessary moving image frames are read out.
  • the time taken to load the moving image file onto the editing program and the playback program may be shortened at the time of the editing work and the playback work, and the decrease in the efficiency of the editing work and the playback work may be suppressed.
  • the image processing system illustrated in FIG. 21 is configured to use the identification information assigned to each piece of data or each moving image frame after the division of the moving image frame to identify the recording position in the recording unit 106 a corresponding to the necessary moving image frame, and therefore, the data processing in generating meta data and in reading out meta data may be reduced, and the load put on the image processing apparatus 100 a and the image reading apparatus 108 a may be suppressed.
  • meanwhile, the image encoding unit 104 a , when performing encoding in the MPEG format on an input moving image, performs the encoding, as illustrated in FIG. 25 for example, in units of compression of SH (sequence header) and GOP (Group of Pictures) (for example, I frame (Intra Picture), B1 frame through B3 frame, P frame).
  • the reading-out control unit 112 a reads out, from the recording unit 106 a , the GOP including the necessary moving image frame.
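  • a minimal sketch of GOP-unit readout, assuming each frame record carries the number of the GOP it belongs to; a GOP is read out whenever it contains at least one necessary frame:

      def gops_to_read(frames):
          """frames: iterable of (gop_number, pid, is_necessary) records."""
          selected = set()
          for gop_number, _pid, is_necessary in frames:
              if is_necessary:
                  selected.add(gop_number)   # read out the whole GOP including this frame
          return sorted(selected)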
  • FIG. 26A is a diagram illustrating an image processing system including an image capturing unit 600 a as embodiment 1 of the image processing apparatus 100 a . To the same configuration as the configuration illustrated in FIG. 21 , the same numeral is assigned and explanation for it is omitted.
  • the image capturing unit 600 a illustrated in FIG. 26A is, for example, a digital camera and the like, and includes an image capturing unit 101 a , an image processing unit 102 a , a subject detection processing unit 601 a , an image encoding unit 104 a , a meta data generating unit 105 a , and a control processing unit 109 a.
  • the image capturing unit 101 a converts the subject image formed on the imaging device such as a CCD (Charge Coupled Device) by the lens into an electric signal, to obtain still image frames or moving images composed of a plurality of moving image frames.
  • the image processing unit 102 a performs various image processing such as sharpness correction, contrast correction, luminance/chrominance signal generation and white balance correction, to the still image or each moving image frame of the moving image obtained by the image capturing unit 101 a.
  • the image encoding unit 104 a performs encoding in the JPEG (Joint Photographic Experts Group) format to the still image frame after image processing. In addition, the image encoding unit 104 a performs encoding in the MPEG format and the like to the moving image after image processing.
  • the image capturing unit 600 a illustrated in FIG. 26A is characterized in that a subject detection processing unit 601 a is included as the necessary/unnecessary determination data generating unit 103 , and by the subject detection processing unit 601 a , in each moving image frame of the moving image after image processing, respectively, the “position of the subject” and the “size of the subject” are detected as the necessary/unnecessary determination data, and by the meta data generating unit 105 a , the detected “position of the subject” and “size of the subject” and the identification information PID are integrated, and recorded in the recording unit 106 a as meta data.
  • FIG. 27 is a flowchart illustrating an example of the operations of the necessary/unnecessary determining unit 107 a illustrated in FIG. 26A .
  • the necessary/unnecessary determining unit 107 a inputs the “position of the subject” and the “size of the subject” indicated in meta data read out from the recording unit 106 a (S 21 a ). Meanwhile, after detecting the subject within the moving image frame after image processing, the subject detection processing unit 601 a regards a rectangular area enclosing the detected subject as the subject area, and regards the upper-left pixel position (x0, y0) of the subject area as the “position of the subject”. In addition, as the “size of the subject”, the subject detection processing unit 601 a regards the number of pixels in the lateral direction of the subject area as W, and the number of pixels in the vertical direction of the subject area as H. Meanwhile, when no subject is detected from the moving image frame, the subject detection processing unit 601 a makes x0 a value smaller than 0. In addition, the upper-left pixel position of the moving image frame is assumed as (1, 1).
  • the necessary/unnecessary determining unit 107 a determines whether or not x0 is smaller than 0, that is, whether or not the subject has been detected (S 22 a ).
  • the necessary/unnecessary determining unit 107 a determines whether or not W is smaller than a threshold Thr_W, or whether or not H is smaller than a threshold Thr_H, that is, whether or not the subject is smaller than a prescribed size (S 23 a ). Meanwhile, the threshold Thr_W and the threshold Thr_H may be set in advance by the user and the like.
  • the necessary/unnecessary determining unit 107 a determines whether or not x0+W/2 is smaller than a threshold Thr_L_x, that is, whether or not the left edge of the subject area deviates outward from the left edge of the moving image frame (S 24 a ). Meanwhile, the threshold Thr_L_x may be set in advance by the user and the like.
  • the necessary/unnecessary determining unit 107 a determines whether or not x0+W/2 is larger than a threshold Thr_R_x, that is, whether the right edge of the subject area deviates outward from the right edge of the moving image frame (S 25 a ). Meanwhile, the threshold Thr_R_x may be set by the user and the like in advance.
  • the necessary/unnecessary determining unit 107 a determines whether or not y0+H/2 is larger than a threshold Thr_U_y, that is, whether or not the upper edge of the subject area deviates outward from the upper edge of the moving image frame (S 26 a ). Meanwhile, the threshold Thr_U_y may be set by the user and the like in advance.
  • the necessary/unnecessary determining unit 107 a determines whether or not y0+H/2 is smaller than a threshold Thr_D_y, that is, whether or not the bottom edge of the subject area deviates outward from the bottom edge of the moving image frame (S 27 a ). Meanwhile, the threshold Thr_D_y may be set by the user and the like in advance.
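  • the checks S 22 a through S 27 a can be summarized by the following minimal sketch; the comparisons follow the text above, and the threshold container is illustrative (all thresholds are user-set):

      def subject_frame_unnecessary(x0, y0, w, h, thr):
          """thr: dict with keys "W", "H", "L_x", "R_x", "U_y", "D_y"."""
          if x0 < 0:                          # S 22 a: no subject detected
              return True
          if w < thr["W"] or h < thr["H"]:    # S 23 a: subject smaller than the prescribed size
              return True
          cx, cy = x0 + w / 2, y0 + h / 2     # center of the subject area
          return (cx < thr["L_x"]             # S 24 a: deviates from the left edge
                  or cx > thr["R_x"]          # S 25 a: deviates from the right edge
                  or cy > thr["U_y"]          # S 26 a: deviates from the upper edge
                  or cy < thr["D_y"])         # S 27 a: deviates from the bottom edge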
  • the image reading apparatus 108 a illustrated in FIG. 26A is effective in reducing the amount of data at the time of reading out moving image files obtained when the subject moves wildly, such as in sports and races, as whether a moving image frame is necessary or unnecessary is determined according to the position and the size of the subject.
  • in the description above, the configuration is made so that the necessary/unnecessary determination of the moving image frame is performed based on the composition index described in meta data, that is, the position and the size of the subject, but the configuration may also be made, as illustrated in FIG. 29 , so that, when moving image frames in similar compositions are obtained successively, these moving image frames are determined as moving image frames that are unnecessary at the time of editing and playback.
  • in this case, the reading-out control unit 112 a does not read out, from the recording unit 106 a , the moving image frames of identification information corresponding to a part of the successive frames.
  • the difference between frames may be obtained based on the statistics of the image. For example, a moving image frame may be determined as unnecessary when the total value of the level differences between frames is equal to or smaller than a prescribed value, or when the total value of the amounts of change of the histograms of the frames to compare is equal to or smaller than a prescribed value.
  • in this case, a difference calculating unit 1031 a that integrates the differences between frames is provided instead of the subject detection processing unit 103 a .
  • alternatively, a histogram difference calculating unit 1032 a that calculates histograms and calculates the differences of the histograms of frames, or the integrated value of those differences, is provided instead of the subject detection processing unit 103 a.
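  • a minimal sketch of these two criteria, using numpy arrays for the frames; the threshold parameters correspond to the prescribed values above and are illustrative:

      import numpy as np

      def similar_by_level(frame_a, frame_b, thr_level):
          """True when the total level difference between the frames is small."""
          return np.abs(frame_a.astype(int) - frame_b.astype(int)).sum() <= thr_level

      def similar_by_histogram(frame_a, frame_b, thr_hist, bins=256):
          """True when the total amount of change between the frames' histograms is small."""
          h_a, _ = np.histogram(frame_a, bins=bins, range=(0, 256))
          h_b, _ = np.histogram(frame_b, bins=bins, range=(0, 256))
          return np.abs(h_a - h_b).sum() <= thr_hist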
  • FIG. 30 is a diagram illustrating an image processing system including an image capturing apparatus 1000 a as embodiment 2 of the image processing apparatus 100 a . Meanwhile, to the same configuration as the configuration illustrated in FIG. 26 , the same numeral is assigned and description for the configuration is omitted.
  • the image capturing apparatus 1000 a illustrated in FIG. 30 is characterized in that, a focus information processing unit 1001 a is provided, and by the focus information processing unit 1001 a , for each moving image frame of the moving image after image processing, respectively, “focus information” is set as the necessary/unnecessary determination data, and by the meta data generating unit 105 a , the “focus information” and the identification information PID are integrated and recorded in the recording unit 106 a as meta data.
  • FIG. 31 is a flowchart illustrating an example of the operation of the necessary/unnecessary determining unit 107 a illustrated in FIG. 30 .
  • the necessary/unnecessary determining unit 107 a inputs “focus information” indicated in meta data read out from the recording unit 106 a (S 31 a ). Meanwhile, every time the image capturing unit 101 a obtains a moving image frame, the focus information processing unit 1001 a obtains a contrast evaluation value C0 corresponding to the moving image frame from the image processing unit 102 a , and also obtains the lens position L0 of the focus lens from the image capturing unit 101 a , and outputs them to the meta data generating unit 105 a . In addition, when the focusing has failed, that is, when the focus lens is out of focus, the image processing unit 102 a makes the contrast evaluation value C0 a value smaller than 0.
  • when the necessary/unnecessary determining unit 107 a determines that C0 is equal to or larger than 0, that is, that the focus lens is in focus (No in S 32 a ), it keeps the lens position L0_n corresponding to the currently-obtained moving image frame in a storing unit that is inside or outside the necessary/unnecessary determining unit 107 a (S 33 a ).
  • the necessary/unnecessary determining unit 107 a reads out the lens position L0_n corresponding to the currently-obtained moving image frame and the lens position L0_n ⁇ 1 corresponding to the last-obtained moving image frame from the storing unit described above (S 34 a ), and determines whether or not the absolute value of the difference between the lens position L0_n and the lens position L0_n ⁇ 1 is larger than a threshold Thr_L, that is, whether or not the focus is unstable (S 35 a ). Meanwhile, the threshold Thr_L may be set by the user and the like in advance.
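  • the two conditions above (S 32 a and S 35 a ) amount to the following minimal sketch; l0_prev is None for the first frame, and the names are illustrative:

      def focus_unnecessary(c0, l0_n, l0_prev, thr_l):
          """Unnecessary when focusing failed, or when the focus lens position is unstable."""
          if c0 < 0:              # S 32 a: contrast evaluation value below 0 means focusing failed
              return True
          if l0_prev is None:     # no last-obtained frame to compare against
              return False
          return abs(l0_n - l0_prev) > thr_l   # S 35 a: |L0_n - L0_n-1| > Thr_L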
  • the image reading apparatus 108 a illustrated in FIG. 30 is effective in reducing the amount of data at the time of reading out moving image files obtained when not only the subject but also the person who is capturing the image is moving, as the necessary/unnecessary determination about a moving image frame is made according to whether or not focusing has failed or whether or not the focus is unstable.
  • FIG. 32 is a diagram illustrating an image processing system including an image capturing apparatus 1200 a as embodiment 3 of the image processing apparatus 100 a . Meanwhile, to the same configuration as the configuration illustrated in FIG. 26 , the same numeral is assigned and description for the configuration is omitted.
  • the image capturing apparatus 1200 a illustrated in FIG. 32 is characterized in that, a zoom information processing unit 1201 a is provided, and by the zoom information processing unit 1201 a , for each moving image frame of the moving image after image processing, respectively, “zoom information” is set as the necessary/unnecessary determination data, and by the meta data generating unit 105 a , the “zoom information” and the identification information PID are integrated and recorded in the recording unit 106 a as meta data.
  • FIG. 33 is a flowchart illustrating an example of the operation of the necessary/unnecessary determining unit 107 a illustrated in FIG. 32 .
  • the necessary/unnecessary determining unit 107 a inputs the “zoom information” indicated in meta data read out from the recording unit 106 a (S 41 a ). Meanwhile, every time the image capturing unit 101 a obtains a moving image frame, the zoom information processing unit 1201 a obtains zoom information Z0 (for example, the lens position, the amount of lens movement of the zoom lens and the like) corresponding to the moving image frame from the image capturing unit 101 a , and outputs it to the meta data generating unit 105 a.
  • the necessary/unnecessary determining unit 107 a keeps the input zoom information Z0 in a storing unit that is inside or outside the necessary/unnecessary determining unit 107 a (S 42 a ).
  • the necessary/unnecessary determining unit 107 a reads out the zoom information Z0_n corresponding to the currently-obtained moving image frame and the zoom information Z0_n − 1 corresponding to the last-obtained moving image frame from the storing unit (S 43 a ), and determines whether or not the absolute value of the difference between the zoom information Z0_n and the zoom information Z0_n − 1 is larger than a threshold Thr_Z, that is, whether or not the lens position of the zoom lens is unstable (S 44 a ). Meanwhile, the threshold Thr_Z may be set by the user and the like in advance.
  • the image reading apparatus 108 a illustrated in FIG. 32 is effective in reducing the amount of data at the time of reading out moving image files obtained when, for example, the subject is moving in the forward/backward directions with respect to the person who is capturing the image, as the necessary/unnecessary determination about a moving image frame is made according to whether or not the lens position of the zoom lens is unstable.
  • the information used for the necessary/unnecessary determination process about the moving image frames that are necessary at the time of editing and playback of the moving image is not limited to the position and size of the subject, the focus information, or the zoom information described above.
  • for example, the histogram of the luminance gradation of the respective image frames after image processing may also be used.
  • two or more pieces of information from among the position and size of the subject, the focus information, the zoom information, and the histogram of the luminance gradation mentioned above may be combined and used for the necessary/unnecessary determination process about the necessary moving image frame at the time of editing and playback of the moving image.
  • FIG. 34 is a diagram illustrating an image processing system including an image capturing apparatus 1400 a as embodiment 4 of the image processing apparatus 100 a . Meanwhile, to the same configuration as the configuration illustrated in FIG. 26 , the same numeral is assigned and description for the configuration is omitted.
  • the image capturing apparatus 1400 a illustrated in FIG. 34 is characterized in that a radio antenna 1401 a and a reception signal strength detecting unit 1402 a are provided, and by the reception signal strength detecting unit 1402 a , for each moving image frame of the moving image after image processing, respectively, RSSI R0 is detected, and by the meta data generating unit 105 a , the RSSI R0 and the identification information PID are integrated and recorded in the recording unit 106 a as meta data. Meanwhile, it is assumed that the radio antenna 1401 a receives a signal transmitted from a transmitter provided in the subject and a signal transmitted from the image capturing apparatus 1400 a and reflected on the subject.
  • FIG. 35 is a flowchart illustrating an example of the operation of the necessary/unnecessary determining unit 107 a illustrated in FIG. 34 .
  • the necessary/unnecessary determining unit 107 a inputs the RSSI R0 indicated in meta data read out from the recording unit 106 a (S 51 a ). Meanwhile, every time the image capturing unit 101 a captures a moving image frame, the reception signal strength detecting unit 1402 a outputs the level of the reception signal received by the radio antenna 1401 a , while associating it with the moving image frame, to the meta data generating unit 105 a as RSSI R0.
  • the necessary/unnecessary determining unit 107 a determines whether or not the input RSSI R0 is larger than a threshold Thr_R, that is, whether or not the subject is far with respect to the image capturing apparatus 1400 a (S 52 a ). Meanwhile, the threshold Thr_R may be set by the user and the like in advance.
  • the image reading apparatus 108 a illustrated in FIG. 34 is effective in reducing the amount of data at the time of reading out moving image files obtained when, for example, the subject is moving in the forward/backward directions with respect to the image capturing apparatus 1400 a , as the necessary/unnecessary determination about a moving image frame is made according to the RSSI R0.
  • the configuration is made so that the necessary/unnecessary determination of the moving image frame is made, in the necessary/unnecessary determining unit 107 a , based on whether or not the RSSI R0 is larger than the threshold Thr_R, but the configuration may also be made so that the necessary/unnecessary determination of the moving image frame is made, in the reception signal strength detecting unit 1402 a of the image capturing apparatus 1400 a , based on whether or not the RSSI R0 is larger than the threshold Thr_R.
  • the necessary/unnecessary determining unit 107 a may be omitted.
  • in the description above, the image processing apparatus 100 a and the image reading apparatus 108 a are configured as independent apparatuses, but as illustrated in FIG. 37 , they may be configured as an integrated apparatus by including the function of the image reading apparatus 108 a in the image processing apparatus 100 a .
  • the operations of each unit of the image processing apparatus 100 a illustrated in FIG. 37 are similar to the operations described above and description for them is omitted.
  • FIG. 38 is a diagram illustrating an image processing apparatus of another embodiment of the present invention. Meanwhile, the objective of the image processing apparatus 10 illustrated in FIG. 38 is to improve the efficiency of the editing work and playback work of a plurality of pieces of image data.
  • the image processing apparatus 10 illustrated in FIG. 38 includes an input unit 11 , a processing unit 12 , an output unit 13 , a feature data obtaining unit 14 , a reference feature data obtaining unit 15 , a feature data evaluating unit 16 , and a control processing unit 17 .
  • the input unit 11 and the feature data obtaining unit 14 may be configured as one.
  • meanwhile, image processing in the present application is assumed to mean the processing of data related to images, such as the evaluation and necessary/unnecessary determination of an image using feature data of the image, and is not limited to the processing of image data itself.
  • the input unit 11 obtains image data (for example, a moving image and a plurality of still images) from outside (for example, an external server and the image capturing unit).
  • the processing unit 12 performs various image processing (control), such as sharpness correction, contrast correction, luminance/chrominance signal generation, white balance correction, meta data assignment, and trimming, on the image data determined as necessary by the control processing unit 17 in the image data obtained by the input unit 11 .
  • the output unit 13 outputs the image data that has undergone the various image processing by the processing unit 12 to the outside (for example, a display apparatus or a recording apparatus).
  • the feature data obtaining unit 14 obtains, from outside, feature data corresponding to image data (for example, meta data respectively corresponding to a plurality of still images arranged in time series, moving image header data, or data associated with the frames and packets of the moving image by time code).
  • the reference feature data obtaining unit 15 obtains reference feature data. Meanwhile, the reference feature data obtaining unit 15 may obtain the reference feature data from outside, or may store it in advance. Meanwhile, the reference feature data obtaining unit 15 may be a recording unit built into the image processing apparatus 10 , such as a hard disk or a memory constituted by, for example, a ROM, RAM, non-volatile memory or flash memory, or may be an external recording unit that is insertable into/removable from the image processing apparatus 10 , such as a CD, DVD, memory card, or USB memory.
  • the feature data evaluating unit 16 evaluates feature data based on reference feature data.
  • reference feature data is assumed to be, for example, a value evaluated by evaluation items that indicate the degree of satisfaction or the degree of dissatisfaction of the user about image data.
  • reference feature data is assumed to be a value evaluated by evaluation items such as the blurring, shaking, contrast, definition, angle of view misalignment, asymmetry, overlapping and the like of the image data corresponding to the feature data.
  • reference feature data is assumed to be a value evaluated by evaluation items such as the open/closed state of the subject's eyes, the facial expression, the orientation of the face, the eye color (red eye in flash photography), and the like.
  • the feature data evaluating unit 16 evaluates the shaking, blurring, definition and the like from the condition of the outlines and the focus in the image data corresponding to the feature data.
  • when the camera is equipped with sensors that detect its motion (for example, an angular velocity or acceleration sensor), the output values of those sensors at the shutter timing may be stored and used for the evaluation of shaking.
  • the feature data evaluating unit 16 evaluates the contrast of image data corresponding to feature data, by calculating the luminance of the entire image data.
  • the feature data evaluating unit 16 evaluates overlapping on the subject, such as a finger or a strap overlapping the imaging lens, by detecting the occurrence of an extremely blurred focus in a certain area of the image data corresponding to the feature data.
  • the feature data evaluating unit 16 evaluates the angle of view misalignment by analyzing the verticality and horizontality of buildings, natural objects, and humans in the image data corresponding to the feature data. Meanwhile, for example, the feature data evaluating unit 16 evaluates the open/closed state of the eyes by analyzing the eye region in the image data corresponding to the feature data. Meanwhile, for example, the feature data evaluating unit 16 evaluates the facial expression by comparing the image data corresponding to the feature data with various expression patterns and by estimating the shape of the eyes and mouth. Meanwhile, for example, the feature data evaluating unit 16 evaluates the orientation of the face by estimating the positional relationship of the eyes, nose, mouth, and ears with each other, and by estimating the face outline and the like, in the image data corresponding to the feature data.
  • the feature data evaluating unit 16 may compare feature data with a plurality of pieces of reference feature data, and may evaluate the feature data while sorting it in a phased manner. Meanwhile, for example, the feature data evaluating unit 16 may evaluate feature data by scoring. Meanwhile, for example, the feature data evaluating unit 16 may be configured to compare feature data with reference feature data corresponding to respective items, evaluating the feature data for each item, and to evaluate the feature data comprehensively based on the evaluation results for the respective items; a sketch of such scoring follows.
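  • As a rough illustration of the per-item comparison and comprehensive scoring just described, the following sketch uses hypothetical item names, reference limits, and weights; none of these concrete values appear in the patent text.

```python
# Reference feature data per evaluation item (maximum tolerated value)
# and the weight of each item in the comprehensive evaluation.
REFERENCE_LIMITS = {"blur": 0.3, "shake": 0.2, "contrast_error": 0.5}
ITEM_WEIGHTS     = {"blur": 0.4, "shake": 0.4, "contrast_error": 0.2}

def comprehensive_score(feature_data: dict) -> float:
    """Evaluate each item against its reference limit (pass = 1, fail = 0)
    and combine the per-item results into one weighted score in [0, 1]."""
    total = 0.0
    for item, limit in REFERENCE_LIMITS.items():
        passed = feature_data.get(item, 0.0) <= limit
        total += ITEM_WEIGHTS[item] * (1.0 if passed else 0.0)
    return total
```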
  • the control processing unit 17 controls the operation of each of the input unit 11 , the processing unit 12 , the output unit 13 , the feature data obtaining unit 14 , the reference feature data obtaining unit 15 , and the feature data evaluating unit 16 . Meanwhile, the control processing unit 17 includes a necessary/unnecessary determining unit 17 - 1 .
  • the necessary/unnecessary determining unit 17 - 1 performs necessary/unnecessary determination of image data corresponding to feature data based on the evaluation result of the feature data evaluating unit 16 .
  • the feature data obtaining unit 14 , the necessary/unnecessary determining unit 17 - 1 , and the processing unit 12 may be realized by a DSP (Digital Signal Processor), as a computer, executing a program recorded in a recording medium, or may be realized by a program incorporated in an FPGA (Field Programmable Gate Array) or an ASIC (Application Specific Integrated Circuit), as a computer.
  • the recording medium mentioned above may be a non-transitory recording medium.
  • the non-transitory recording medium is not particularly limited, and is, for example, a CD-ROM or the like.
  • FIG. 39 is a flowchart illustrating an example of the operation of the control processing unit 17 illustrated in FIG. 38 .
  • when the control processing unit 17 makes the feature data obtaining unit 14 obtain feature data corresponding to the first image data in a plurality of pieces of image data (Yes in S 21 b ), it takes out reference feature data from data obtained by the reference feature data obtaining unit 15 (S 22 b ), and makes the feature data evaluating unit 16 evaluate the feature data based on the reference feature data (S 23 b ).
  • the necessary/unnecessary determining unit 17 - 1 of the control processing unit 17 performs necessary/unnecessary determination of image data corresponding to the obtained feature data based on the evaluation result of the feature data evaluating unit 16 (S 24 b ).
  • in the case of determination as unnecessary (No in S 24 b ), the control processing unit 17 makes the feature data obtaining unit 14 obtain feature data corresponding to the next image data (S 25 b ), and makes the feature data evaluating unit 16 evaluate the feature data based on the reference feature data (S 23 b ).
  • in the case of determination as necessary (Yes in S 24 b ), the control processing unit 17 makes the input unit 11 obtain the image data corresponding to the feature data (S 26 b ); after that, it makes the processing unit 12 perform various image processing on the image data determined as necessary (S 27 b ), and makes the output unit 13 output the image data after the image processing (S 28 b ).
  • then, the control processing unit 17 makes the feature data obtaining unit 14 obtain feature data corresponding to the next image data (S 25 b ), and makes the feature data evaluating unit 16 evaluate the feature data based on the reference feature data (S 23 b ).
  • when no unprocessed image data remains, the control processing unit 17 terminates the image processing for the plurality of pieces of image data.
  • for example, the control processing unit 17 takes out "peak position area 10 through 245" as reference feature data corresponding to "luminance histogram" from the data table, illustrated in FIG. 40 , obtained by the reference feature data obtaining unit 15 .
  • when the peak position of the luminance histogram indicated in the feature data falls below this area, the necessary/unnecessary determining unit 17 - 1 determines that the image data is too dark and is unnecessary image data, and sends control data indicating that the image data is determined as unnecessary to the input unit 11 .
  • when the peak position of the luminance histogram rises above this area, the necessary/unnecessary determining unit 17 - 1 determines that the image data is too bright and is unnecessary image data, and sends control data indicating that the image data is determined as unnecessary to the input unit 11 .
  • otherwise, the necessary/unnecessary determining unit 17 - 1 determines that the image data is necessary image data, and sends control data indicating that the image data is determined as necessary to the input unit 11 .
  • meanwhile, for example, the control processing unit 17 takes out "one or more" as reference feature data corresponding to "the number of faces detected" from the data table, illustrated in FIG. 40 , obtained by the reference feature data obtaining unit 15 .
  • when no face is detected in the feature data, the necessary/unnecessary determining unit 17 - 1 sends control data indicating that the image data is determined as unnecessary to the input unit 11 .
  • when one or more faces are detected, the necessary/unnecessary determining unit 17 - 1 sends control data indicating that the image data is determined as necessary to the input unit 11 . Both rules are sketched below.
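  • The following is a minimal sketch of the two rules above, assuming an 8-bit grayscale image and an externally supplied face count; the helper name is hypothetical, while the peak range (10 through 245) and the "one or more faces" criterion come from the text.

```python
import numpy as np

def determine_by_histogram_and_faces(gray: np.ndarray, num_faces: int) -> bool:
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    peak = int(np.argmax(hist))   # peak position of the luminance histogram
    if peak < 10:                 # peak below the area: too dark, unnecessary
        return False
    if peak > 245:                # peak above the area: too bright, unnecessary
        return False
    return num_faces >= 1         # reference feature data: one or more faces
```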
  • accordingly, the load on the processing unit 12 and on the output destination of the output unit 13 may be reduced.
  • in the description above, the editing work and playback work are made to be done by the processing unit 12 , but the editing work and playback work may also be done using a processing unit of an external image processing apparatus.
  • in the image processing apparatus 10 illustrated in FIG. 38 , since the configuration is made so that, among a plurality of pieces of image data, various image processing is performed only on the image data selected based on the evaluation result of the feature data evaluating unit 16 before output to the outside, the load of the editing work and playback work at the output destination may be reduced, and the efficiency of the editing work and playback work may be improved.
  • the load on the processing unit 12 may be reduced.
  • the image processing apparatus 10 may obtain feature data from an image capturing apparatus 43 (image capturing unit) via communication apparatuses 41 , 42 .
  • the communication apparatus 42 and the image capturing unit 43 are constituted by, for example, a digital camera with a communication function, a camera-equipped mobile phone, a smartphone, or an image capturing unit connected via an adapter having a communication function.
  • the image processing apparatus 10 performs necessary/unnecessary determination of the image data corresponding to the feature data, and in the case of determination as necessary, makes a request for the image data to the image capturing unit 43 .
  • when the image processing apparatus 10 receives the requested image data, it performs various image processing on the image data and sends the image data after the image processing to a display apparatus 44 . That is, the necessary/unnecessary determining unit 17 - 1 makes a request for image data determined as necessary to the image capturing unit 43 being an external apparatus. Meanwhile, the necessary/unnecessary determining unit 17 - 1 may be configured to make another external apparatus make the request for image data determined as necessary to the image capturing unit 43 .
  • the image capturing unit 43 may be connected to a network 45 .
  • in this case, the image processing apparatus 10 performs transmission/reception of data to/from the image capturing unit 43 via the communication apparatus 42 and the network 45 .
  • the image processing apparatus 10 may include an image capturing unit 46 (image capturing unit). In this case, based on feature data sent from the image capturing unit 46 , the image processing apparatus 10 performs necessary/unnecessary determination of the image data corresponding to the feature data, and in the case of determination as necessary, makes a request for the image data to the image capturing unit 46 . Then, when the image processing apparatus 10 receives the requested image data, it performs various image processing on the image data and sends the image data after the image processing to the display apparatus 44 . In this case, depending on the external situation, the feature data evaluating unit 16 may review the reference feature data to be taken out from the data obtained by the reference feature data obtaining unit 15 .
  • FIG. 42 is a diagram illustrating an image processing apparatus of a variation example of the embodiment illustrated in FIG. 38 . Meanwhile, to the same configuration as the configuration illustrated in FIG. 38 , the same numeral is assigned and explanation for the configuration is omitted. Meanwhile, the objective of the image processing apparatus 50 illustrated in FIG. 42 is, for example, to improve the efficiency of the editing work and playback work of a plurality of pieces of image data.
  • the image processing apparatus 50 illustrated in FIG. 42 differs from the image processing apparatus 10 illustrated in FIG. 38 in that a power state detecting unit 51 (state detecting unit), an evaluation policy generating unit 52 , a reference feature data generating unit 53 , a transmitting unit 54 , and a receiving unit 55 are further included.
  • the power state detecting unit 51 sequentially detects the power type and the power state of the image processing apparatus 50 .
  • the power state detecting unit 51 detects “AC power” or “DC power” as the power type of the image processing apparatus 50 , and detects “DC power remaining capacity” as the power state.
  • the power state detecting unit 51 detects, as the power state of the DC power, values such as the expected remaining time and the current and voltage values, either as momentary values or as average values over a certain period in the past. Alternatively, all the past power states may be detected and kept with time stamps.
  • the AC power represents a power supply without limit, such as the commercial power supply, and the DC power represents a power supply with limitations on the remaining capacity and the usage capacity, such as a battery (storage battery).
  • when the evaluation policy generating unit 52 detects that the power state detecting unit 51 is operating, it generates an evaluation policy about the power. For example, when the power type of the image processing apparatus 50 is "DC power" and the power state is "DC power remaining capacity: less than 20%", the evaluation policy generating unit 52 generates an evaluation policy to "obtain an image with a resolution equal to or below VGA (640×480)". Meanwhile, under the same conditions, the evaluation policy generating unit 52 may instead generate an evaluation policy to "select an image that makes the total number of pixels in the image area equal to or below 640×480 regardless of the resolution in the X direction and Y direction".
  • the reference feature data generating unit 53 generates reference feature data based on the evaluation policy generated by the evaluation policy generating unit 52 and the detection result of the power state detecting unit 51 .
  • the necessary/unnecessary determining unit 17 - 1 illustrated in FIG. 42 sends an image obtaining request, including the necessary/unnecessary determination result of the image data corresponding to the feature data and identification information of the image data, to the transmitting unit 54 .
  • the transmitting unit 54 transmits the image obtaining request sent from the necessary/unnecessary determining unit 17 - 1 to the server 56 via a network.
  • the network may be, for example, the Internet, a LAN, or an on-chip network, as well as an interface such as USB or PCI. That is, the necessary/unnecessary determining unit 17 - 1 makes a request for image data determined as necessary to the server 56 being an external apparatus. Meanwhile, the necessary/unnecessary determining unit 17 - 1 may also be configured to make another external apparatus make the request for image data determined as necessary, to the server 56 .
  • the server 56 includes a receiving unit 57 , a selecting unit 58 , a saving unit 59 , and a transmitting unit 60 .
  • the receiving unit 57 receives an image obtaining request transmitted from the image processing apparatus 50 .
  • the selecting unit 58 takes out image data corresponding to identification information indicated in a received image obtaining request from a plurality of pieces of data saved in the saving unit 59 .
  • the transmitting unit 60 transmits image data taken out by the selecting unit 58 to the image processing apparatus 50 via a network.
  • the receiving unit 55 of the image processing apparatus 50 receives image data transmitted from the server 56 .
  • the input unit 11 illustrated in FIG. 42 transmits image data received by the receiving unit 55 to the processing unit 12 .
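  • The following is a minimal sketch of the image obtaining request round trip just described between the image processing apparatus 50 (transmitting unit 54 and receiving unit 55 ) and the server 56 ; the JSON message layout and the raw-socket transport are assumptions, since the patent does not specify a wire format.

```python
import json
import socket

def obtain_image(server_addr: tuple, image_id: str) -> bytes:
    """Send an image obtaining request carrying the identification
    information of image data determined as necessary, then receive the
    image data that the server's selecting unit took out of its saving
    unit."""
    request = json.dumps({"image_id": image_id, "necessary": True})
    with socket.create_connection(server_addr) as sock:
        sock.sendall(request.encode() + b"\n")   # transmitting unit 54
        sock.shutdown(socket.SHUT_WR)
        chunks = []
        while (chunk := sock.recv(4096)):        # receiving unit 55
            chunks.append(chunk)
    return b"".join(chunks)
```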
  • FIG. 43 is a flowchart illustrating an example of the operation of the reference feature data generating unit 53 .
  • the reference feature data generating unit 53 generates reference feature data based on the evaluation policy generated by the evaluation policy generating unit 52 and the detection result of the power state detecting unit 51 (S 61 b ), and after that, saves the generated reference feature data in the reference feature data obtaining unit 15 (S 62 b ).
  • a data table illustrated in FIG. 44 is stored in advance in a storing unit 56 illustrated in FIG. 42 .
  • the reference feature data generating unit 53 takes out "resolution: equal to or below 640×480" from the data table illustrated in FIG. 44 as reference feature data corresponding to the power type "DC power" and the power state "DC power remaining capacity: equal to or below 20%", and saves it in the reference feature data obtaining unit 15 .
  • the reference feature data generating unit 53 takes out "resolution: equal to or below 1280×960" from the data table illustrated in FIG. 44 as reference feature data corresponding to the power type "DC power" and the power state "DC power remaining capacity: equal to or above 20%", and saves it in the reference feature data obtaining unit 15 .
  • when the power type of the image processing apparatus 50 detected by the power state detecting unit 51 is "AC power", the reference feature data generating unit 53 takes out the corresponding reference feature data from the data table illustrated in FIG. 44 in the same manner, and saves it in the reference feature data obtaining unit 15 .
  • that is, the reference feature data generating unit 53 generates reference feature data so that, in the low-energy state where the available power capacity is equal to or below a prescribed value, the amount of data of the image data to be obtained becomes smaller than in the non-low-energy state; a sketch of this lookup follows.
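  • The following sketch illustrates a FIG. 44 -style table lookup; the DC-power rows follow the text, while the behavior for "AC power" is an assumption, since that row of the table is not spelled out above.

```python
def reference_resolution(power_type: str, dc_remaining: float = 1.0):
    """Return the upper-limit resolution (width, height) chosen as
    reference feature data, or None when no limit applies."""
    if power_type == "DC power":
        if dc_remaining <= 0.2:    # low-energy state
            return (640, 480)
        return (1280, 960)         # non-low-energy state
    return None                    # "AC power": assumed unrestricted here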
  • FIG. 45 is a flowchart illustrating an example of the operation of the control processing unit 17 illustrated in FIG. 42 .
  • when the control processing unit 17 makes the feature data obtaining unit 14 obtain feature data corresponding to the first image data in a plurality of pieces of image data (S 81 b ), it takes out reference feature data from data obtained by the reference feature data obtaining unit 15 (S 82 b ), and makes the feature data evaluating unit 16 evaluate the feature data based on the reference feature data (S 83 b ).
  • the necessary/unnecessary determining unit 17 - 1 of the control processing unit 17 performs necessary/unnecessary determination of image data corresponding to the obtained feature data based on the evaluation result of the feature data evaluating unit 16 (S 84 b ).
  • in the case of determination as unnecessary (No in S 84 b ), the control processing unit 17 makes the feature data obtaining unit 14 obtain feature data corresponding to the next image data (S 85 b ), and makes the feature data evaluating unit 16 evaluate the feature data based on the reference feature data (S 83 b ).
  • in the case of determination as necessary (Yes in S 84 b ), the control processing unit 17 makes the transmitting unit 54 transmit an image obtaining request to the server 56 (S 86 b ), and also makes the receiving unit 55 receive image data corresponding to the image obtaining request (S 87 b ); after that, it makes the processing unit 12 perform various image processing on the image data (S 88 b ), and makes the output unit 13 output the image data after the image processing (S 89 b ).
  • the control processing unit 17 makes the feature data obtaining unit 14 obtain feature data corresponding to the next image data (S 85 b ), and makes the feature data evaluating unit 16 evaluate the feature data based on the reference feature data (S 83 b ).
  • when no unprocessed image data remains, the control processing unit 17 terminates the image processing for the plurality of pieces of image data.
  • the control processing unit 17 takes out "resolution: equal to or below 640×480" as reference feature data from data obtained by the reference feature data obtaining unit 15 .
  • when the resolution indicated in the feature data is equal to or below 640×480, the necessary/unnecessary determining unit 17 - 1 determines the image data as necessary.
  • when the resolution indicated in the feature data exceeds 640×480, the necessary/unnecessary determining unit 17 - 1 determines the image data as unnecessary.
  • accordingly, the load on the processing unit 12 and on the output destination of the output unit 13 may be reduced.
  • FIG. 46 is a diagram illustrating an image processing apparatus of a variation example of the embodiment illustrated in FIG. 42 . Meanwhile, to the same configuration illustrated in FIG. 42 , the same numeral is assigned and explanation for the configuration is omitted. Meanwhile, the objective of the image processing apparatus 90 illustrated in FIG. 46 is, for example, to improve the efficiency of the editing work and playback work of a plurality of pieces of image data.
  • the image processing apparatus 90 illustrated in FIG. 46 differs from the image processing apparatus 50 illustrated in FIG. 42 in that a connection state determining unit 91 (state detecting unit) is included instead of the power state detecting unit 51 .
  • the connection state determining unit 91 obtains, from the transmitting unit 54 and the receiving unit 55 of the image processing apparatus 90 , the connection state to the network (connected or disconnected, and the like), the type of the network (wired or wireless, and the like), the network standard (IEEE802.3ab, IEEE802.11n, and the like), information about the network traffic, and the like, as network information, and outputs it to the evaluation policy generating unit 52 and the reference feature data generating unit 53 .
  • when the evaluation policy generating unit 52 of the present embodiment detects that the connection state determining unit 91 is operating, it generates an evaluation policy about the network information. For example, when the network type is "wired" and the network occupancy (the ratio of the current network traffic with respect to the upper limit of the network traffic) is "equal to or above 50%", the evaluation policy generating unit 52 generates an evaluation policy to "obtain an image of a resolution equal to or below VGA (640×480)". Meanwhile, the evaluation policy generating unit 52 may also combine a plurality of pieces of reference feature data.
  • for example, the evaluation policy generating unit 52 may generate an evaluation policy to "obtain an image of a resolution equal to or below VGA (640×480) and of a data size equal to or below 20 KB". Meanwhile, when combining reference feature data, the evaluation policy generating unit 52 may use an evaluation formula, as in the sketch below.
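  • The following sketch illustrates combining reference feature data with an evaluation formula; the AND combination of the feature IDs "2" and "3" mirrors the "2∧3" example described below, and the dictionary layout and field names are assumptions.

```python
# Reference feature data keyed by feature ID.
CRITERIA = {
    2: lambda f: f["width"] * f["height"] <= 640 * 480,  # resolution limit
    3: lambda f: f["size_kb"] <= 20,                     # data size limit
}

def satisfies_formula(feature_data: dict, formula=(2, 3)) -> bool:
    """Evaluate the logical AND of every criterion named in the formula."""
    return all(CRITERIA[fid](feature_data) for fid in formula)

# Usage sketch: satisfies_formula({"width": 640, "height": 480, "size_kb": 18})
# evaluates to True, so the image data would be determined as necessary.
```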
  • FIG. 47 is a flowchart illustrating an example of the operation of the reference feature data generating unit 53 illustrated in FIG. 46 .
  • the reference feature data generating unit 53 generates reference feature data based on the evaluation policy generated by the evaluation policy generating unit 52 and the determination result (network information) of the connection state determining unit 91 (S 101 ), and after that, saves the generated reference feature data in the reference feature data obtaining unit 15 (S 102 ).
  • a data table illustrated in FIG. 48 is stored in advance in a storing unit 92 illustrated in FIG. 46 .
  • the connection state determining unit 91 outputs network information indicating that the network type is “wireless” and the network occupancy is “86.4%”.
  • the reference feature data generating unit 53 then takes out the corresponding reference feature data from the data table illustrated in FIG. 48 and saves it in the reference feature data obtaining unit 15 .
  • that is, the reference feature data generating unit 53 generates reference feature data so that, in the low-speed state where the amount of data transferred per unit time over the available network is equal to or below a prescribed value, the amount of data of the image data to be obtained becomes smaller than in the non-low-speed state.
  • the necessary/unnecessary determining unit 17 - 1 illustrated in FIG. 46 takes out an evaluation formula "2∧3", together with "resolution: equal to or below 640×480" and "image size: equal to or below 20 KB" as reference feature data corresponding to the feature IDs "2" and "3", from data obtained from the reference feature data obtaining unit 15 .
  • when the feature data satisfies the evaluation formula, the necessary/unnecessary determining unit 17 - 1 sends an image obtaining request indicating that the image data is determined as necessary, through the transmitting unit 54 , to the server 56 . That is, the necessary/unnecessary determining unit 17 - 1 makes a request for image data determined as necessary to the server 56 being an external apparatus. Meanwhile, the necessary/unnecessary determining unit 17 - 1 may be configured to make another external apparatus make the request for image data determined as necessary to the server 56 .
  • when the feature data does not satisfy the evaluation formula, the necessary/unnecessary determining unit 17 - 1 sends an image obtaining request indicating that the image data is determined as unnecessary, through the transmitting unit 54 , to the server 56 .
  • the evaluation formula used in the present embodiment is assumed to be set by an evaluation policy stored in advance, but it may also be set by inputting an evaluation policy or an evaluation formula itself from a display unit 141 described later, or the evaluation formula or the evaluation policy may be changed, or set to a prescribed one, according to the result detected by the connection state determining unit 91 (state detecting unit).
  • in the embodiments above, feature data is evaluated by reference feature data about the power capacity detected by the power state detecting unit 51 illustrated in FIG. 42 , or by reference feature data regarding the connection state with the network determined by the connection state determining unit 91 illustrated in FIG. 46 ; however, the configuration may also be made so that feature data is evaluated by reference feature data in which a plurality of such types of reference feature data are combined. By evaluating feature data with a combination of a plurality of types of reference feature data in this way, the determination of the data that is needed more may be performed according to the situation of the network and of the device used for the editing work and the playback work.
  • FIG. 49 is a diagram illustrating an image processing apparatus of a variation example of the embodiment illustrated in FIG. 42 . Meanwhile, to the same configuration illustrated in FIG. 46 , the same numeral is assigned and explanation for the configuration is omitted. Meanwhile, the objective of the image processing apparatus 120 illustrated in FIG. 49 is, for example, to improve the efficiency of the editing work and playback work of a plurality of pieces of image data.
  • the image processing apparatus 120 illustrated in FIG. 49 is different from the image processing apparatus 90 illustrated in FIG. 46 in that an internal server 121 and a distribution switching unit 122 are further included.
  • the internal server 121 has a configuration similar to that of the server 56 , and functions as a cache of the image data obtained from the server 56 .
  • that is, image data transmitted from the server 56 to the receiving unit 55 of the image processing apparatus 120 is saved in the internal server 121 .
  • the update of image data in the internal server 121 may be performed at the time of the input of image data, regularly, or at the time of the occurrence of a specific event. Meanwhile, when its capacity is smaller than that of the server 56 , the internal server 121 decides which image data to save by giving priority to image data with a high frequency of access and image data whose latest access time is closest to the current time.
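  • Such a saving policy may be sketched as follows: when capacity runs short, the entry with the lowest frequency of access is evicted, with ties broken by the oldest latest-access time. The class layout is an assumption.

```python
import time

class ImageCache:
    """A hedged sketch of the internal server's cache of image data."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.entries = {}  # image_id -> [data, access_count, last_access]

    def get(self, image_id: str):
        entry = self.entries[image_id]
        entry[1] += 1              # raise the frequency of access
        entry[2] = time.time()     # refresh the latest access time
        return entry[0]

    def put(self, image_id: str, data: bytes):
        if image_id not in self.entries and len(self.entries) >= self.capacity:
            # Keep frequently and recently accessed image data with priority.
            victim = min(self.entries,
                         key=lambda k: (self.entries[k][1], self.entries[k][2]))
            del self.entries[victim]
        self.entries[image_id] = [data, 1, time.time()]
```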
  • the distribution switching unit 122 switches the transmission destination of the image obtaining request to either the transmitting unit 54 or the internal server 121 of the image processing apparatus 120 , according to the network information output from the connection state determining unit 91 . That is, for example, when the image processing apparatus 120 is connected to the network, the distribution switching unit 122 makes the outside of the image processing apparatus 120 perform the work of transmitting the image obtaining request to the server 56 via the network and extracting the desired image data from a plurality of pieces of image data. On the other hand, when the image processing apparatus 120 is not connected to the network, the distribution switching unit 122 makes the inside of the image processing apparatus 120 perform the work, transmitting the image obtaining request to the internal server 121 and extracting the desired image data from a plurality of pieces of image data. Meanwhile, for example, the configuration may also be made so that, even when the image processing apparatus 120 is connected to the network, the distribution switching unit 122 transmits the image obtaining request to the internal server 121 when the network occupancy is higher than a threshold. This rule is sketched below.
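  • The switching rule may be sketched as follows; the occupancy threshold and the string labels are illustrative assumptions.

```python
def choose_destination(connected: bool, occupancy: float,
                       occupancy_threshold: float = 0.5) -> str:
    """Route the image obtaining request according to the network
    information output from the connection state determining unit 91."""
    if connected and occupancy <= occupancy_threshold:
        return "external server 56"    # transmit via the transmitting unit 54
    return "internal server 121"       # extract from the internal cache
```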
  • FIG. 50 is a flowchart illustrating an example of the operation of the control processing unit 17 illustrated in FIG. 49 .
  • when the control processing unit 17 makes the feature data obtaining unit 14 obtain feature data corresponding to the first image data in a plurality of pieces of image data (Yes in S 131 b ), it takes out reference feature data from data obtained by the reference feature data obtaining unit 15 (S 132 ), and makes the feature data evaluating unit 16 evaluate the feature data based on the reference feature data (S 133 ).
  • the necessary/unnecessary determining unit 17 - 1 of the control processing unit 17 performs necessary/unnecessary determination of image data corresponding to the obtained feature data based on the evaluation result of the feature data evaluating unit 16 (S 134 ).
  • in the case of determination as unnecessary (No in S 134 ), the control processing unit 17 makes the feature data obtaining unit 14 obtain feature data corresponding to the next image data (S 135 ), and makes the feature data evaluating unit 16 evaluate the feature data based on the reference feature data (S 133 ).
  • in the case of determination as necessary (Yes in S 134 ), the control processing unit 17 generates an image obtaining request (S 136 ), and switches the transmission destination of the image obtaining request, by the distribution switching unit 122 , to either the external server 56 or the internal server 121 (S 137 ).
  • when the transmission destination is switched to the external server 56 , the control processing unit 17 makes the transmitting unit 54 transmit the image obtaining request to the server 56 (S 138 ), and also makes the receiving unit 55 receive image data corresponding to the image obtaining request (S 139 ). That is, the control processing unit 17 makes a request for image data determined as necessary to the server 56 being an external apparatus. Meanwhile, the control processing unit 17 may also be configured to make another external apparatus make the request for image data determined as necessary, to the server 56 .
  • when the transmission destination is switched to the internal server 121 , the control processing unit 17 sends the image obtaining request to the internal server 121 (S 140 ), and receives, by the input unit 11 , the image data corresponding to the image obtaining request sent from the internal server 121 (S 141 ).
  • when the internal server 121 receives an image obtaining request, it takes out the image data corresponding to the identification information indicated in the image obtaining request from a plurality of pieces of image data stored in the storing unit 123 illustrated in FIG. 49 , and sends the taken-out image data to the input unit 11 .
  • after that, the control processing unit 17 makes the processing unit 12 perform various image processing on the image data (S 142 ), and makes the output unit 13 output the image data after the image processing (S 143 ).
  • the control processing unit 17 makes the feature data obtaining unit 14 obtain feature data corresponding to the next image data (S 135 ), and makes the feature data evaluating unit 16 evaluate the feature data based on the reference feature data (S 133 ).
  • when no unprocessed image data remains, the control processing unit 17 terminates the image processing for the plurality of pieces of image data.
  • the configuration may also be made so that, when the image obtaining request is transmitted to the server 56 , the control processing unit 17 sends the received image data to the processing unit 12 and also caches it in the internal server 121 . Meanwhile, when the image obtaining request is sent to the internal server 121 , the control processing unit 17 may forward the image data, sent from the internal server 121 to the input unit 11 , on to the processing unit 12 .
  • FIG. 51 is a diagram illustrating an image processing apparatus of another embodiment of the present invention. Meanwhile, to the same configuration illustrated in FIG. 42 , the same numeral is assigned and explanation for the configuration is omitted. Meanwhile, the objective of the image processing apparatus 140 illustrated in FIG. 51 is, for example, to improve the efficiency of the editing work and playback work of a plurality of pieces of image data.
  • the image processing apparatus 140 illustrated in FIG. 51 differs from the image processing apparatus 50 illustrated in FIG. 42 in that a display unit 141 is included instead of the power state detecting unit 51 .
  • the display unit 141 is a user interface such as a touch panel display, to which the user's intention about the evaluation policy is input by the user operation.
  • the input user intention is sent to the reference feature data generating unit 53 .
  • the evaluation policy generating unit 52 illustrated in FIG. 51 may be configured to assign a priority order to a plurality of evaluation policies based on the input user intention, and after that, to send the evaluation policies to the reference feature data generating unit 53 .
  • the reference feature data generating unit 53 uses the evaluation policies according to the priority order.
  • the user intention at this time indicates, for example, a policy to evaluate at least one of the display speed, the resolution, and the amount of data of the image data with priority.
  • FIG. 52 is a flowchart illustrating an example of the operation of the reference feature data generating unit 53 illustrated in FIG. 51 .
  • the reference feature data generating unit 53 illustrated in FIG. 51 generates reference feature data based on an evaluation policy generated by the evaluation policy generating unit 52 and the user intention input by the display unit 141 (S 151 ) and after that, saves the generated reference feature data in the reference feature data obtaining unit 15 (S 152 ).
  • for example, the display unit 141 includes a background priority mode button and a subject priority mode button, and when either one of the buttons is pressed by the user, it sends the mode corresponding to the pressed button to the reference feature data generating unit 53 as the user intention.
  • when the background priority mode button is pressed, the reference feature data generating unit 53 sets the upper limit values of the acceleration and speed of the camera relatively low, so that the background may be enjoyed.
  • the reference feature data generating unit 53 sets the upper limit value of the camera acceleration as reference feature data to “equal to or below 1 G” corresponding to the “background priority mode”, and the upper limit value of the camera speed as reference feature data to “equal to or below 1 cm/s” corresponding to the “background priority mode”.
  • when the subject priority mode button is pressed, the reference feature data generating unit 53 sets the upper limit values of the acceleration and speed of the camera relatively high, so that the subject may be enjoyed. For example, as illustrated in FIG. 53A , the reference feature data generating unit 53 sets the upper limit value of the camera acceleration as reference feature data to "equal to or below 5 G" corresponding to the "subject priority mode", and the upper limit value of the camera speed as reference feature data to "equal to or below 5 cm/s" corresponding to the "subject priority mode".
  • These combinations may be saved in the evaluation policy generating unit 52 in advance, or a user interface that enables editing by the user may be provided. Meanwhile, there is no limitation on the type of the parameter to be set. Meanwhile, the selection of the mode may also be an analog setting by a slide bar, instead of the choice between the two by buttons; a sketch of the mode-to-limits mapping follows.
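  • The following sketch illustrates the mode-dependent upper limit values of FIG. 53A ; the numeric limits follow the text, while the dictionary layout and function name are assumptions.

```python
MODE_LIMITS = {
    "background priority mode": {"max_accel_g": 1.0, "max_speed_cm_s": 1.0},
    "subject priority mode":    {"max_accel_g": 5.0, "max_speed_cm_s": 5.0},
}

def camera_motion_acceptable(mode: str, accel_g: float, speed_cm_s: float) -> bool:
    """Check a frame's camera acceleration and speed against the upper
    limit values set as reference feature data for the selected mode."""
    limits = MODE_LIMITS[mode]
    return (accel_g <= limits["max_accel_g"]
            and speed_cm_s <= limits["max_speed_cm_s"])
```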
  • meanwhile, the display unit 141 may include a pro mode button and an amateur mode button, and when either one of the buttons is pressed by the user, it sends the mode corresponding to the pressed button to the reference feature data generating unit 53 as the user intention.
  • when the pro mode button is pressed, the reference feature data generating unit 53 sets, as the extraction criterion, the image data of the five pieces preceding and following the image data from which the subject is detected, and sets the upper limit values of the acceleration and speed of the camera relatively high, so that there is a wide variation for editing. For example, as illustrated in the drawings, the reference feature data generating unit 53 sets the image extraction criterion as reference feature data to "the five frames preceding and following the subject-detected frame" corresponding to the "pro mode", the upper limit value of the camera acceleration as reference feature data to "equal to or below 5 G" corresponding to the "pro mode", and the upper limit value of the camera speed as reference feature data to "equal to or below 5 cm/s" corresponding to the "pro mode".
  • when the amateur mode button is pressed, the reference feature data generating unit 53 sets only the image data from which the subject is detected as the extraction criterion, and sets the upper limit values of the acceleration and speed of the camera relatively low, to realize fuss-free editing. For example, as illustrated in the drawings, the reference feature data generating unit 53 sets the image extraction criterion as reference feature data to "only the subject-detected frame" corresponding to the "amateur mode", the upper limit value of the camera acceleration as reference feature data to "equal to or below 1 G" corresponding to the "amateur mode", and the upper limit value of the camera speed as reference feature data to "equal to or below 1 cm/s" corresponding to the "amateur mode".
  • These combinations may be saved in the evaluation policy generating unit 52 in advance, or a user interface that enables editing by the user may be provided. Meanwhile, there is no limitation about the type of the parameter to be set. Meanwhile, the selection of the mode may also be an analog setting by a slide bar, instead of the choice between the two by buttons.
  • the types of the user intention and the evaluation policy are not limited to the configuration described above.
  • the timing of the mode switching may be set before image capturing according to the purpose of the day, or may be set each time before previewing the image data.
  • a priority recognizing unit may be provided in the display unit 141 , and the user may intentionally increase the priority for each scene, based on a sign captured in the image captured by the camera (such as a scissors sign formed by the user's hand, a thumbs-up, and the like), shaking of the camera, or rating information assigned in advance by the user at the time of image capturing.
  • when the control processing unit 17 makes the feature data obtaining unit 14 obtain feature data corresponding to the first image data in a plurality of pieces of image data, it takes out reference feature data from data obtained by the reference feature data obtaining unit 15 , and makes the feature data evaluating unit 16 evaluate the feature data based on the reference feature data.
  • the necessary/unnecessary determining unit 17 - 1 of the control processing unit 17 performs necessary/unnecessary determination of image data corresponding to the feature data based on the evaluation result of the feature data evaluating unit 16 .
  • in the case of determination as unnecessary, the control processing unit 17 makes the feature data obtaining unit 14 obtain feature data corresponding to the next image data, and makes the feature data evaluating unit 16 evaluate the feature data based on the reference feature data.
  • in the case of determination as necessary, the control processing unit 17 makes the transmitting unit 54 transmit the image obtaining request to the server 56 , and also makes the receiving unit 55 receive image data corresponding to the image obtaining request; after that, it makes the processing unit 12 perform various image processing on the image data, and makes the output unit 13 output the image data after the image processing.
  • the control processing unit 17 makes the feature data obtaining unit 14 obtain feature data corresponding to the next image data and makes the feature data evaluating unit 16 evaluate the feature data based on the reference feature data.
  • when no unprocessed image data remains, the control processing unit 17 terminates the image processing for the plurality of pieces of image data.
  • for example, the control processing unit 17 takes out, from data obtained by the reference feature data obtaining unit 15 , "five frames preceding and following the subject-detected frame" as the image extraction criterion, "equal to or below 5 G" as the upper limit value of the camera acceleration, and "equal to or below 5 cm/s" as the upper limit value of the camera speed.
  • when the feature data satisfies the image extraction criterion and both upper limit values, the necessary/unnecessary determining unit 17 - 1 determines the image data as necessary.
  • when it does not, the necessary/unnecessary determining unit 17 - 1 determines the image data as unnecessary.
  • accordingly, the load on the processing unit 12 and on the output destination of the output unit 13 may be reduced.
  • FIG. 54 is a diagram illustrating an image processing apparatus of a variation example of the embodiment illustrated in FIG. 38 . Meanwhile, to the same configuration as the configuration illustrated in FIG. 38 , the same numeral is assigned and explanation for the configuration is omitted. Meanwhile, the objective of the image processing apparatus 170 illustrated in FIG. 54 is to improve the efficiency of the editing work and playback work of a plurality of pieces of image data.
  • the image processing apparatus 170 illustrated in FIG. 54 differs from the image processing apparatus 10 illustrated in FIG. 38 in that a feature data extracting unit 171 is further included.
  • the feature data extracting unit 171 extracts feature data from image data obtained by the input unit 11 .
  • the specific subject in the feature data extracted at this time is an object (a human face or body, an automobile, a flower, an animal, the sky, a mountain, the sea, a road or a building), a marker, or a color and luminance.
  • the feature data extracting unit 171 includes a subject detecting unit, and by the subject detecting unit, extracts the “position of the subject” and the “size of the subject” from image data.
  • the feature data evaluating unit 16 illustrated in FIG. 54 takes out "the width in the X direction and the width in the Y direction of the image data" and "the prescribed size" from data obtained by the reference feature data obtaining unit 15 , and when it makes at least one of the determinations "the subject has not been detected", "the size of the subject is smaller than the prescribed size", "the position of the subject deviates outward from the left edge of the image data", "the position of the subject deviates outward from the right edge of the image data", "the position of the subject deviates outward from the upper edge of the image data", and "the position of the subject deviates outward from the bottom edge of the image data", it sends the determination result to the necessary/unnecessary determining unit 17 - 1 .
  • the necessary/unnecessary determining unit 17 - 1 performs necessary/unnecessary determination of image data obtained by the input unit 11 , based on the determination result sent from the feature data evaluating unit 16 . For example, as illustrated in FIG. 55 , when it is determined that, in a moving image composed of a plurality of pieces of image data P1 through P6, “the size of the subject is smaller than the prescribed size” in the image data P1, the necessary/unnecessary determining unit 17 - 1 determines the image data P1 as unnecessary.
  • similarly, when such a determination is made about the image data P6 , the necessary/unnecessary determining unit 17 - 1 determines the image data P6 as unnecessary image data. Then, the necessary/unnecessary determining unit 17 - 1 controls the operation of the input unit 11 so that the image data P6 is not output from the input unit 11 to the processing unit 12 . Accordingly, the load on the processing unit 12 may be reduced. A sketch of this rule follows.
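  • The following is a hedged sketch of the subject-based determination above; the bounding-box representation of the subject's position and size is an assumption.

```python
def subject_frame_necessary(subject, img_w: int, img_h: int,
                            min_size: int) -> bool:
    """Image data is unnecessary when no subject is detected, the subject
    is smaller than the prescribed size, or its position deviates outward
    from an edge of the image data."""
    if subject is None:                        # subject has not been detected
        return False
    x, y, w, h = subject                       # position and size of the subject
    if w < min_size or h < min_size:           # smaller than the prescribed size
        return False
    if x < 0 or y < 0 or x + w > img_w or y + h > img_h:
        return False                           # deviates outward from an edge
    return True
```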
  • the feature data extracting unit 171 extracts the composition of image data.
  • the feature data evaluating unit 16 illustrated in FIG. 54 takes out "the prescribed value" and "the prescribed number" as reference feature data from data obtained by the reference feature data obtaining unit 15 , and when it makes a determination that "there are equal to or more than the prescribed number of successive pieces of image data in which the difference between two successive pieces of image data is equal to or below the prescribed value", it sends the determination result to the necessary/unnecessary determining unit 17 - 1 .
  • the necessary/unnecessary determining unit 17 - 1 performs necessary/unnecessary determination of image data obtained by the input unit 11 , based on the determination result sent from the feature data evaluating unit 16 . For example, as illustrated in FIG. 56 , when it is determined that, in a moving image composed of a plurality of pieces of image data P7 through P11, “there are two or more pieces of successive image data in which the difference between the image data P7, P8 is equal to or below the prescribed value”, the necessary/unnecessary determining unit 17 - 1 determines the image data P9 through P11 following those image data P7, P8 as unnecessary image data.
  • then, the necessary/unnecessary determining unit 17 - 1 controls the operation of the input unit 11 so that the image data P9 through P11 are not output from the input unit 11 to the processing unit 12 . Accordingly, the load on the processing unit 12 may be reduced. This rule is sketched below.
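  • The following sketch illustrates the composition-based rule above; the concrete threshold values are illustrative, as the text only speaks of "the prescribed value" and "the prescribed number".

```python
import numpy as np

def mark_static_frames(frames, prescribed_value=2.0, prescribed_number=2):
    """Return one necessary/unnecessary flag per frame: once the
    difference between successive pieces of image data stays at or below
    the prescribed value for the prescribed number of pieces, the
    following pieces are determined as unnecessary."""
    necessary = [True] * len(frames)
    similar_run = 1  # length of the current run of mutually similar frames
    for i in range(1, len(frames)):
        diff = float(np.mean(np.abs(frames[i].astype(np.float64)
                                    - frames[i - 1].astype(np.float64))))
        similar_run = similar_run + 1 if diff <= prescribed_value else 1
        if similar_run > prescribed_number:
            necessary[i] = False   # e.g. P9 through P11 after the static P7, P8
    return necessary
```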
  • the feature data extracting unit 171 may also be configured to extract feature data from a thumbnail image (a reduced image of the image data), low-resolution image data (an image of a lower resolution than the image data), or partial image data (a part of the image data) obtained by the input unit 11 .
  • FIG. 57 is a diagram illustrating an image processing apparatus of another embodiment of the present invention. Meanwhile, to the same configuration as the configuration illustrated in FIG. 38 , the same numeral is assigned and explanation for the configuration is omitted. Meanwhile, the objective of the image processing apparatus 200 illustrated in FIG. 57 is to improve the efficiency of the editing work and playback work of a plurality of pieces of image data.
  • the image processing apparatus 200 illustrated in FIG. 57 differs from the image processing apparatus 10 illustrated in FIG. 38 in that a feature data saving unit 201 is further included.
  • the feature data obtaining unit 14 illustrated in FIG. 57 obtains feature data from outside regularly and saves it in the feature data saving unit 201 .
  • in the image processing apparatus 200 illustrated in FIG. 57 , when the image capturing unit is removed from the image processing apparatus 200 , when the network is disconnected, or when feature data cannot be obtained regularly from outside due to a high network occupancy and the like, the necessary/unnecessary determination of image data may be continued using the feature data saved in the feature data saving unit 201 .
  • the image data for which the necessary/unnecessary determination is performed at that time may be obtained later, when the image capturing unit is attached to the image processing apparatus 200 , when the network connection is recovered, and the like.
  • an offline necessary/unnecessary determination may be realized as well.
  • the feature data saving unit 201 may be provided outside the image processing apparatus 200 (for example, on the network and on a removable memory).
  • the efficiency of the editing work and playback work of moving images may be improved.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Television Signal Processing For Recording (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)
US14/213,714 2011-09-22 2014-03-14 Image processing apparatus, image processing system, and image reading apparatus Expired - Fee Related US9300904B2 (en)

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
JP2011-207819 2011-09-22
JP2011-207818 2011-09-22
JP2011207819A JP5809906B2 (ja) 2011-09-22 2011-09-22 画像読出し装置及び画像処理システム
JP2011207818A JP5809905B2 (ja) 2011-09-22 2011-09-22 画像処理装置及び画像処理システム
JP2012-002581 2012-01-10
JP2012002581A JP6027745B2 (ja) 2012-01-10 2012-01-10 画像処理装置
PCT/JP2012/074227 WO2013042766A1 (ja) 2011-09-22 2012-09-21 画像処理装置、画像処理システム、及び画像読出し装置

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/074227 Continuation WO2013042766A1 (ja) 2011-09-22 2012-09-21 画像処理装置、画像処理システム、及び画像読出し装置

Publications (2)

Publication Number Publication Date
US20140218565A1 US20140218565A1 (en) 2014-08-07
US9300904B2 true US9300904B2 (en) 2016-03-29

Family

ID=47914526

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/213,714 Expired - Fee Related US9300904B2 (en) 2011-09-22 2014-03-14 Image processing apparatus, image processing system, and image reading apparatus

Country Status (2)

Country Link
US (1) US9300904B2 (ja)
WO (1) WO2013042766A1 (ja)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102013224962A1 (de) * 2013-12-05 2015-06-11 Robert Bosch Gmbh Anordnung zum Erstellen eines Bildes einer Szene
GB2539461B (en) 2015-06-16 2020-01-08 Canon Kk Image data encapsulation
KR20180036464A (ko) * 2016-09-30 2018-04-09 삼성전자주식회사 이미지 처리 방법 및 이를 지원하는 전자 장치

Patent Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0530498A (ja) 1991-07-24 1993-02-05 Seiko Epson Corp 動画像データ圧縮装置
JPH07121789A (ja) 1993-10-22 1995-05-12 Canon Inc 遠隔監視装置
JPH09214879A (ja) 1996-02-07 1997-08-15 Toshiba Corp 動画像処理方法
US20050219666A1 (en) 1998-11-20 2005-10-06 Nikon Corporation Image processing apparatus having image selection function, and recording medium having image selection function program
US20060256396A1 (en) 1998-11-20 2006-11-16 Nikon Corporation Image processing apparatus having image selection function, and recording medium having image selection function program
JP2000209483A (ja) 1999-01-18 2000-07-28 Nikon Corp 画像選別機能を有する電子カメラ、およびプログラムを記録した記録媒体
JP2002010179A (ja) 2000-06-19 2002-01-11 Olympus Optical Co Ltd プリント画像選択装置、自動選択機能付き写真プリンタ装置及び自動選択プリントシステム並びに記録媒体
US20120106850A1 (en) * 2001-03-08 2012-05-03 Christof Koch Computation of intrinsic perceptual saliency in visual environments, and applications
JP2005020196A (ja) 2003-06-24 2005-01-20 Casio Comput Co Ltd 画像撮影装置、及び画像整理装置
JP2005020409A (ja) 2003-06-26 2005-01-20 Casio Comput Co Ltd 画像撮影装置、画像整理装置、プログラム
JP2005295285A (ja) 2004-03-31 2005-10-20 Japan Aviation Electronics Industry Ltd 動画像データ生成方法
JP2008061107A (ja) 2006-09-01 2008-03-13 Sega Corp 自動写真館
JP2008182544A (ja) 2007-01-25 2008-08-07 Sony Corp 画像保存装置、画像保存方法
JP2008306334A (ja) 2007-06-06 2008-12-18 Fujifilm Corp 動画ファイル生成方法およびこれを用いた動画撮影装置
US20100091864A1 (en) * 2007-06-07 2010-04-15 Fujitsu Limited Moving-image-similarity determination device, encoding device, and feature calculating method
JP2009004950A (ja) 2007-06-20 2009-01-08 Victor Co Of Japan Ltd 撮像装置および編集装置
JP2009253542A (ja) 2008-04-03 2009-10-29 Casio Comput Co Ltd 撮影装置、画像記録方法、及びプログラム
US20090268943A1 (en) 2008-04-25 2009-10-29 Sony Corporation Composition determination device, composition determination method, and program
JP2009267787A (ja) 2008-04-25 2009-11-12 Sony Corp 構図判定装置、構図判定方法、プログラム
JP2010147659A (ja) 2008-12-17 2010-07-01 Nikon Corp 電子カメラ
US20100312765A1 (en) * 2009-06-04 2010-12-09 Canon Kabushiki Kaisha Information processing apparatus, information processing method and program therefor

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
International Search Report (ISR) dated Oct. 23, 2012 issued in International Application No. PCT/JP2012/074227.
Japanese Office Action (and English translation thereof) dated Feb. 2, 2016, issued in counterpart Japanese Application No. 2012-002581.
Japanese Office Action (and English translation thereof) dated May 19, 2015, issued in counterpart Japanese Application No. 2011-207819.

Also Published As

Publication number Publication date
US20140218565A1 (en) 2014-08-07
WO2013042766A1 (ja) 2013-03-28

Legal Events

Date Code Title Description
AS Assignment

Owner name: OLYMPUS CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MIYOSHI, TAKASHI;KOSAKA, AKIO;IWAKI, HIDEKAZU;AND OTHERS;SIGNING DATES FROM 20140303 TO 20140320;REEL/FRAME:032730/0464

AS Assignment

Owner name: OLYMPUS CORPORATION, JAPAN

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE EXECUTION DATE FOR THE THIRD ASSIGNOR PREVIOUSLY RECORDED AT REEL: 032730 FRAME: 0464. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNORS:MIYOSHI, TAKASHI;KOSAKA, AKIO;IWAKI, HIDEKAZU;AND OTHERS;SIGNING DATES FROM 20140303 TO 20140324;REEL/FRAME:033302/0185

ZAAA Notice of allowance and fees due

Free format text: ORIGINAL CODE: NOA

ZAAB Notice of allowance mailed

Free format text: ORIGINAL CODE: MN/=.

ZAAA Notice of allowance and fees due

Free format text: ORIGINAL CODE: NOA

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: OLYMPUS CORPORATION, JAPAN

Free format text: CHANGE OF ADDRESS;ASSIGNOR:OLYMPUS CORPORATION;REEL/FRAME:039344/0502

Effective date: 20160401

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20240329