CN102891965B - Image processing apparatus, control method therefor, and image processing method - Google Patents


Info

Publication number
CN102891965B
CN102891965B (granted publication of application CN201210350890.XA / CN201210350890A)
Authority
CN
China
Prior art keywords
image
information
system controller
face
classified information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201210350890.XA
Other languages
Chinese (zh)
Other versions
CN102891965A
Inventor
中濑雄一
稻垣温
参纳雅人
池田平
丹羽智弓
渡边等
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc
Publication of CN102891965A
Application granted
Publication of CN102891965B
Status: Expired - Fee Related


Classifications

    • H ELECTRICITY — H04 ELECTRIC COMMUNICATION TECHNIQUE — H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 1/00167 Connection or combination of a still picture apparatus with another apparatus in a digital photofinishing system — Processing or editing
    • G PHYSICS — G06 COMPUTING; CALCULATING OR COUNTING — G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/583 Retrieval of still image data characterised by using metadata automatically derived from the content
    • H04N 1/32128 Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title, attached to the image data, e.g. file header
    • H04N 23/611 Control of cameras or camera modules based on recognised objects, where the recognised objects include parts of the human body
    • H04N 23/667 Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • H04N 23/67 Focus control based on electronic image sensor signals
    • H04N 23/74 Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • H04N 25/443 Extracting pixel data from image sensors by reading pixels from selected 2D regions of the array, e.g. for windowing or digital zooming
    • H04N 5/772 Interface circuits between a recording apparatus and a television camera, the recording apparatus and the television camera being placed in the same enclosure
    • H04N 9/8042 Transformation of the television signal for recording involving pulse code modulation of the colour picture signal components involving data reduction
    • H04N 9/8227 Transformation of the television signal for recording involving the multiplexing of an additional signal and the colour video signal, the additional signal being at least another television signal
    • H04N 2101/00 Still video cameras
    • H04N 2201/3214 Display, printing, storage or transmission of additional information relating to a job, e.g. of a date
    • H04N 2201/3215 Display, printing, storage or transmission of additional information relating to a job, e.g. of a time or duration
    • H04N 2201/3226 Display, printing, storage or transmission of identification information or the like, e.g. ID code, index, title, part of an image, reduced-size image
    • H04N 2201/325 Modified version of the image, e.g. part of the image, image reduced in size or resolution, thumbnail or screennail
    • H04N 2201/3251 Modified version of the image, where the modified version relates to a person or face
    • H04N 2201/3277 Storage or retrieval of prestored additional information, the additional information being stored in the same storage device as the image data
    • H04N 5/781 Television signal recording using magnetic recording on disks or drums
    • H04N 5/907 Television signal recording using static stores, e.g. storage tubes or semiconductor memories

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Library & Information Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)
  • Television Signal Processing For Recording (AREA)

Abstract

The invention provides an image capture apparatus and an image capture method, and discloses a technique that makes it possible to provide appropriate classification information to an edited image. When the editing operation is determined to be cropping, a system controller uses an image processor to crop the decompressed image to the desired size, and performs face detection on the cropped image. The system controller then generates the header of the image data of the edited image. When automatic provision of classification information is set to "ON", classification information is automatically provided based on the detected face information.

Description

Image processing apparatus, control method therefor, and image processing method
This application is a divisional of the application filed on August 11, 2008, with application No. 200810144494.5 and entitled "Image capture apparatus and image capture method".
Technical Field
The present invention relates to an image capture apparatus, and a method therefor, capable of providing classification information to images.
Background Art
With the widespread use of digital cameras, storing image data captured by a digital camera, as well as image data captured from a personal computer (hereinafter "PC"), on various types of recording media has gradually become common.
Image data stored on a recording medium is reproduced by an image reproducing apparatus such as a digital camera, a PC, or a printer, in order to view, edit, or organize the image data, or for other reasons.
Some digital cameras, PCs, and printers can provide an image with attributes that enable effective image retrieval, and allow the user to retrieve images using such an attribute as a search key.
There are digital cameras that can display the in-focus position of a captured image. For example, a technique disclosed in Japanese Patent Laid-Open No. 2003-143444 can display the in-focus position in an enlarged manner so that the user can easily identify it. There is also a technique for storing positional information about the in-focus position in a captured image.
However, the above techniques merely allow an image to be checked using its editing (operation) history, and do not consider providing an appropriate attribute according to the content of the image. In particular, when an image is edited, for example by cropping away an unwanted part of the image, classification information different from the classification information representing the conditions at the time of shooting may be appropriate.
For example, for an image of a person captured in a shooting mode suitable for portraits, there is no problem in providing classification information relating to people at the time of capture. However, if the image is later cropped so that the person is no longer included, the classification information relating to people may no longer be appropriate.
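As a rough, hypothetical sketch of this problem (not taken from the patent text), the following fragment shows how a "people" tag assigned at capture time can become stale after cropping, and how re-running subject detection on the cropped region could refresh it. The `Image`, `crop`, and `reclassify` names, and the rectangle-based stand-in for a face detector, are all invented for illustration.

```python
from dataclasses import dataclass
from typing import List, Tuple

Rect = Tuple[int, int, int, int]  # (x, y, width, height)

@dataclass
class Image:
    size: Tuple[int, int]  # (width, height)
    faces: List[Rect]      # face regions, standing in for a real detector
    tags: List[str]        # classification information

def crop(img: Image, region: Rect) -> Image:
    """Crop the image, keeping only faces that remain fully inside the region."""
    rx, ry, rw, rh = region
    kept = [(x, y, w, h) for (x, y, w, h) in img.faces
            if x >= rx and y >= ry and x + w <= rx + rw and y + h <= ry + rh]
    # Shift the surviving face coordinates into the cropped frame.
    shifted = [(x - rx, y - ry, w, h) for (x, y, w, h) in kept]
    return Image(size=(rw, rh), faces=shifted, tags=list(img.tags))

def reclassify(img: Image) -> Image:
    """Re-derive the 'people' tag from the faces actually present after editing."""
    tags = [t for t in img.tags if t != "people"]
    if img.faces:
        tags.append("people")
    return Image(size=img.size, faces=img.faces, tags=tags)

# A portrait tagged "people" at capture time.
portrait = Image(size=(4000, 3000), faces=[(1500, 800, 600, 600)], tags=["people"])
# Crop away the region containing the person: the capture-time tag is now stale.
cropped = crop(portrait, (0, 0, 1200, 3000))
refreshed = reclassify(cropped)
print(refreshed.tags)  # -> []
```

Representing the face as a stored rectangle rather than running an actual detector is enough to show the point: the tag should be re-derived from the edited content rather than copied unchanged from the original header.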
Summary of the Invention
The present invention provides a technique for providing appropriate classification information to an edited image.
The present invention also provides a technique for carefully handling, during image editing, the shooting information provided to an image, and a technique for appropriately displaying shooting information during image reproduction.
According to one aspect of the present invention, an image capture apparatus comprises: an image capture unit configured to capture an image; a setting unit configured to set a shooting mode used in the image capture unit; a providing unit configured to provide classification information to a captured image according to the shooting mode set by the setting unit, or according to subject information detected from the image captured by the image capture unit; and an editing unit configured to edit the image having the classification information provided by the providing unit, wherein, when the image is edited by the editing unit, the providing unit determines classification information for the image according to the result of the editing performed by the editing unit, and provides the determined classification information to the edited image.
According to another aspect of the present invention, an image capture method is provided, comprising: capturing an image; setting a shooting mode used in the capturing; providing classification information to the image according to the shooting mode, or according to subject information detected from the captured image; and editing the image having the classification information, wherein, when the image is edited in the editing, classification information for the image is determined according to the result of the editing, and the determined classification information is provided to the edited image.
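The claimed flow, assigning classification information from the shooting mode or detected subject information at capture and then re-determining it from the edit result, can be sketched as follows. The mode names and the simple rules are assumptions made for illustration only, not rules taken from the patent.

```python
def classify_at_capture(shooting_mode: str, faces_detected: int) -> set:
    """Assign classification info from the shooting mode or detected subjects."""
    tags = set()
    if shooting_mode == "portrait" or faces_detected > 0:
        tags.add("people")
    if shooting_mode in ("landscape", "sunset"):
        tags.add("scenery")
    return tags

def classify_after_edit(old_tags: set, faces_after_edit: int) -> set:
    """Re-determine tags from the edit result instead of copying them over."""
    tags = set(old_tags)
    if faces_after_edit > 0:
        tags.add("people")
    else:
        tags.discard("people")
    return tags

print(classify_at_capture("portrait", 1))  # -> {'people'}
print(classify_after_edit({"people"}, 0))  # -> set()
```

The key design point mirrored here is that `classify_after_edit` takes the edited image's own subject information as input, so a crop that removes the person also removes the "people" tag.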
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the accompanying drawings.
Brief Description of the Drawings
Fig. 1 is an external view of a digital camera according to an embodiment of the present invention.
Fig. 2 is a block diagram illustrating an example configuration of the digital camera according to the embodiment.
Fig. 3 is a flowchart illustrating an exemplary overall operation of the digital camera according to the embodiment.
Fig. 4 is a flowchart illustrating exemplary processing in the still image recording mode.
Fig. 5 is a flowchart illustrating exemplary face detection processing.
Fig. 6 is a flowchart illustrating exemplary shooting processing.
Fig. 7 is a flowchart illustrating exemplary recording processing.
Fig. 8 is a flowchart illustrating exemplary header generation processing.
Fig. 9 illustrates an example directory and file structure.
Fig. 10 illustrates an example data structure of a still image file.
Fig. 11 is a flowchart illustrating exemplary processing in the moving image recording mode.
Fig. 12 illustrates an example format for storing moving image data.
Fig. 13 is a flowchart illustrating exemplary thumbnail file recording processing.
Fig. 14 is a flowchart illustrating exemplary processing in the receiving mode.
Fig. 15 is a flowchart illustrating exemplary processing in the reproduction mode.
Fig. 16 is a flowchart illustrating exemplary processing in an input wait state when there is no image.
Figs. 17A and 17B are flowcharts illustrating exemplary processing in an input wait state for a reproduced image.
Figs. 18A and 18B are flowcharts illustrating exemplary editing processing.
Figs. 19A to 19D are explanatory diagrams illustrating a specific example of cropping processing.
Fig. 20 is a flowchart illustrating exemplary processing in a classification information setting mode.
Fig. 21 is a flowchart illustrating exemplary image file management processing.
Fig. 22 is a flowchart illustrating exemplary image search processing.
Fig. 23 is a flowchart illustrating exemplary file analysis processing.
Fig. 24 is a flowchart illustrating exemplary header generation processing according to the second embodiment.
Figs. 25A to 25D are explanatory diagrams illustrating a specific example of cropping processing according to the second embodiment.
Embodiments
Embodiments are described below with reference to the accompanying drawings. In the embodiments described below, the present invention is applied to an image capture apparatus (for example, a digital camera) capable of shooting still images and moving images.
First Embodiment
Structure of the Digital Camera
Fig. 1 is an external view of a digital camera 100 according to an embodiment of the present invention. An image display unit 28 displays images and various types of information. A power switch 72 switches the power between on (ON) and off (OFF) states. Reference numeral 61 denotes a shutter release button. A mode selector 60 switches among the various operating modes of the digital camera 100; specifically, for example, it can switch among a still image recording mode, a moving image recording mode, and a reproduction mode.
An operation unit 70 receives various types of operations from the user. The operation unit 70 includes operation members such as the various buttons shown in Fig. 1 and a touch panel on the screen of the image display unit 28. Examples of the buttons of the operation unit 70 include a reset button, a menu button, a set button, a four-way button (up, down, right, and left) arranged in a cross, and a wheel.
A connection cable 111 connects the digital camera 100 to an external device. A connector 112 joins the connection cable 111 and the digital camera 100.
A storage medium 200 may be, for example, a memory card or a hard disk. A storage medium slot 201 accommodates the storage medium 200. When the storage medium 200 is inserted in the storage medium slot 201, the storage medium 200 can communicate with the digital camera 100. A cover 202 covers the storage medium slot 201.
Fig. 2 is a block diagram illustrating an example configuration of the digital camera 100 according to the present embodiment. An image capture unit 22 includes a charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) element, and converts an optical image formed through an imaging lens 103 and a shutter 101, which has an aperture function, into an electrical signal. A barrier 102 protects the image capture system, which includes the imaging lens 103, the shutter 101, and the image capture unit 22, from contamination and damage by covering the image capture section, including the imaging lens 103, of the digital camera 100.
An analog-to-digital (A/D) converter 23 converts analog signals to digital signals. The A/D converter 23 converts the analog signal output from the image capture unit 22 to a digital signal, and also converts the analog signal output from an audio controller 11 to a digital signal.
A timing generator 12 is controlled by a memory controller 15 and a system controller 50. The timing generator 12 provides clock signals and control signals to the image capture unit 22, the audio controller 11, the A/D converter 23, and a digital-to-analog (D/A) converter 13.
An image processor 24 performs predetermined processing, such as resizing (e.g., pixel interpolation or reduction) or color conversion, on data from the A/D converter 23 or the memory controller 15. The image processor 24 also performs predetermined computations using the obtained image data, and the system controller 50 controls exposure and ranging based on the computation results. This enables through-the-lens (TTL) autofocus (AF), automatic exposure (AE), and flash pre-emission (EF) processing. The image processor 24 further performs predetermined computations using the obtained image data, and TTL automatic white balance (AWB) processing can be performed based on the computation results.
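The patent does not give the AE computation itself, but as an invented illustration of the kind of feedback the image processor could supply for TTL automatic exposure, one can average the luminance of the obtained image data and convert the result into an exposure correction in stops. The target value and the log2 mapping are assumptions.

```python
import math

def mean_luminance(pixels):
    """Average Rec. 601 luma of an image given as a list of (R, G, B) tuples."""
    if not pixels:
        return 0.0
    total = sum(0.299 * r + 0.587 * g + 0.114 * b for r, g, b in pixels)
    return total / len(pixels)

def exposure_correction(luma, target=118.0):
    """Correction in stops: positive means increase exposure, negative decrease."""
    return math.log2(target / max(luma, 1e-6))

pixels = [(200, 200, 200)] * 3 + [(50, 50, 50)]
print(mean_luminance(pixels))     # -> 162.5
print(exposure_correction(59.0))  # -> 1.0 (one stop brighter)
```

A real camera meters multiple regions and weights them (often biased toward the in-focus or face region), but the feedback loop from image statistics to exposure control has this general shape.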
Data output from the A/D converter 23 is written to a memory 32 through the image processor 24 and the memory controller 15, or directly through the memory controller 15. The memory 32 stores image data obtained by the image capture unit 22 and converted to digital data by the A/D converter 23, and stores image data to be displayed on the image display unit 28. The memory 32 is also used to store audio data recorded through a microphone 10, still images, moving images, and the file headers used in image files. Accordingly, the memory 32 has a storage capacity sufficient to store a predetermined number of still images, a moving image of a predetermined duration, and audio data.
A compression/decompression unit 16 compresses and decompresses image data using, for example, adaptive discrete cosine transform (ADCT). Triggered by the shutter 101, the compression/decompression unit 16 reads and compresses image data stored in the memory 32, and writes the processed data back to the memory 32. The compression/decompression unit 16 also decompresses compressed image data read into the memory 32 from, for example, a recording unit 19 of the storage medium 200, and writes the processed data to the memory 32. The image data written to the memory 32 by the compression/decompression unit 16 is formed into a file by the file unit of the system controller 50, and the file data is recorded on the storage medium 200 through an interface 18. The memory 32 also serves as a memory for image display (video memory).
The D/A converter 13 converts image data for display stored in the memory 32 to an analog signal, and supplies the analog signal to the image display unit 28. The image display unit 28 displays data on a display (e.g., a liquid crystal display (LCD)) based on the analog signal supplied from the D/A converter 13. In this way, image data for display written to the memory 32 is displayed on the image display unit 28 through the D/A converter 13.
An audio signal output from the microphone 10 is supplied to the A/D converter 23 through the audio controller 11, which includes, for example, an amplifier. The audio signal is converted to a digital signal by the A/D converter 23, and the digital signal is then stored in the memory 32 through the memory controller 15. Audio data recorded on the storage medium 200 is read into the memory 32 and then converted to an analog signal by the D/A converter 13. Using this analog signal, the audio controller 11 drives a speaker 39 and outputs the audio.
A nonvolatile memory 56 is an electrically erasable and recordable memory, and may be, for example, an electrically erasable programmable read-only memory (EEPROM). The nonvolatile memory 56 stores constants and programs used in the operation of the system controller 50. The programs referred to here are programs for executing the flowcharts described later in the present embodiment.
The system controller 50 controls the entire digital camera 100. The system controller 50 performs the processing described later in the present embodiment by executing the programs stored in the nonvolatile memory 56. A system memory 52 may be a random access memory (RAM), into which constants and variables used in the operation of the system controller 50 and the programs read from the nonvolatile memory 56 are loaded.
The mode selector 60, a first shutter switch 62, a second shutter switch 64, and the operation unit 70 are operating units for inputting various types of operation instructions to the system controller 50. The mode selector 60 switches the operating mode of the system controller 50 among, for example, the still image recording mode, the moving image recording mode, and the reproduction mode.
When the shutter release button 61 of the digital camera 100 is operated partway (half-pressed), the first shutter switch 62 is turned on and a first shutter switch signal SW1 is generated. In response to the first shutter switch signal SW1, the system controller 50 starts AF, AE, AWB, EF, and/or other processing.
When the operation of the shutter release button 61 is completed (fully pressed), the second shutter switch 64 is turned on and a second shutter switch signal SW2 is generated. In response to the second shutter switch signal SW2, the system controller 50 starts a series of shooting operations, from reading the signal out of the image capture unit 22 to writing the image data on the storage medium 200.
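The two-stage release can be modeled as a small state machine. The sketch below (class and method names invented) only records which steps the SW1 and SW2 signals would trigger, in the order described above.

```python
class ShutterController:
    """Minimal model of half-press (SW1) and full-press (SW2) handling."""

    def __init__(self):
        self.events = []

    def on_sw1(self):
        # Half-press: start the metering and focusing preparations.
        for step in ("AF", "AE", "AWB", "EF"):
            self.events.append(step)

    def on_sw2(self):
        # Full-press: run the capture sequence from sensor readout
        # to writing the image data on the recording medium.
        self.events.append("readout")
        self.events.append("record")

ctrl = ShutterController()
ctrl.on_sw1()
ctrl.on_sw2()
print(ctrl.events)  # -> ['AF', 'AE', 'AWB', 'EF', 'readout', 'record']
```

In an actual camera the SW1 steps run while live view continues and SW2 is honored only while SW1's results remain valid; the sketch omits that timing for brevity.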
By selecting function icons from the various icons displayed on the image display unit 28, appropriate functions are assigned scene by scene to the operation members of the operation unit 70, allowing them to serve as function buttons. Examples of function buttons include an end button, a return button, an image-forward button, a jump button, a narrowing-down button, and an attribute change button. For example, when a menu button is pressed, the image display unit 28 presents a menu screen that allows the user to specify various settings. Using the menu screen displayed on the image display unit 28 together with a four-way button and a set button, the user can specify various settings intuitively. The power switch 72 switches between the power-on and power-off states.
The power supply controller 80 includes a battery detection circuit, a DC-DC converter, and a switching circuit for switching the blocks to be energized, and detects whether a battery is attached, the type of the battery, and the remaining battery life. In response to the detection results and instructions from the system controller 50, the power supply controller 80 controls the DC-DC converter and supplies a necessary voltage for a necessary period to the components, including the storage medium 200. The power supply unit 30 may include a primary battery (e.g., an alkaline battery or a lithium battery), a secondary battery (e.g., a nickel-cadmium (NiCd) battery, a nickel-metal hydride (NiMH) battery, or a lithium (Li) battery), or an AC adapter. Connectors 33 and 34 connect the power supply unit 30 and the power supply controller 80.
A real-time clock (RTC) 40 measures date and time. The RTC 40 has an internal power supply separate from the power supply controller 80, so the RTC 40 can continue measuring time even when the power supply unit 30 is turned off. The system controller 50 sets a system timer using the date and time obtained from the RTC 40 at startup, and performs timer control.
The interface 18 is an interface with the storage medium 200 (e.g., a memory card or a hard disk). The connector 35 connects the interface 18 with the storage medium 200. A recording medium presence detector 98 detects whether the storage medium 200 is attached to the connector 35.
The storage medium 200 (e.g., a memory card or a hard disk) includes a recording unit 19 composed of a semiconductor memory or a magnetic disk, an interface 37 with the digital camera 100, and a connector 36 for connection with the digital camera 100.
The communication unit 110 can perform various types of communication, such as RS-232C, Universal Serial Bus (USB), IEEE 1394, P1284, SCSI, modem, LAN, and wireless communication. The connector 112 connects the digital camera 100 to other devices through the communication unit 110; in the case of wireless communication, the connector 112 is an antenna.
Overall Operation of the Digital Camera
Fig. 3 is a flowchart illustrating an exemplary overall operation of the digital camera 100 according to the present embodiment. When the power switch 72 is operated and the power is turned on, in step S1 the system controller 50 initializes flags and control variables.
Then, in step S2, the system controller 50 starts management of the files stored on the storage medium 200. This file management processing will be described below mainly with reference to Fig. 21.
Then, in steps S3, S5, and S7, the system controller 50 determines the setting position of the mode selector 60. If the mode selector 60 is determined to be set to the still image recording mode, the flow proceeds through step S3 to step S4, where still image recording mode processing is performed; this processing will be described below mainly with reference to Fig. 4. If the mode selector 60 is determined to be set to the moving image recording mode, the flow proceeds through steps S3 and S5 to step S6, where moving image recording mode processing is performed; this processing will be described below mainly with reference to Fig. 11. If the mode selector 60 is determined to be set to the reproduction mode, the flow proceeds through steps S3, S5, and S7 to step S8, where reproduction mode processing for reproducing captured still images or moving images is performed; this processing will be described below mainly with reference to Fig. 15. If the mode selector 60 is determined to be set to another mode, the flow proceeds to step S9, where processing corresponding to the selected mode is performed. Examples of other modes include a transmission mode, in which a file stored on the storage medium 200 is transmitted, and a reception mode, in which a file is received from an external device and stored on the storage medium 200. The reception mode processing, an example of such processing, will be described later with reference to Fig. 14.
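The branching of steps S3, S5, S7, and S9 amounts to a simple dispatch on the selector position. The following is a hedged Python sketch of that control flow only; the mode names and returned labels are illustrative assumptions, not identifiers from the patent.

```python
def dispatch_mode(mode):
    """Return the processing branch chosen for a mode selector position (steps S3-S9)."""
    if mode == "still_image_recording":      # step S3 -> step S4
        return "still_image_recording_processing"
    elif mode == "moving_image_recording":   # step S5 -> step S6
        return "moving_image_recording_processing"
    elif mode == "reproduction":             # step S7 -> step S8
        return "reproduction_processing"
    else:                                    # step S9: transmission, reception, etc.
        return "other_mode_processing"
```

The loop of Fig. 3 would call this repeatedly until the power switch is determined to be off in step S10.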
After performing the processing for the mode set by the mode selector 60 (in step S4, S6, S8, or S9), the flow proceeds to step S10. In step S10, the system controller 50 determines the setting position of the power switch 72. If the power switch 72 is determined to be set to the power-on position, the flow returns to step S3. If the power switch 72 is determined to be set to the power-off position, the flow proceeds to step S11, where the system controller 50 performs termination processing. One example of termination processing is changing the display on the image display unit 28 to an end state. Other examples include closing the barrier 102 to protect the image pickup unit, recording the parameters, set values, and set modes, including the flags and control variables, in the nonvolatile memory 56, and shutting off power to components that do not need it. After the termination processing in step S11 is completed, the flow ends and the state switches to the power-off state.
Still Image Recording Mode Processing
Fig. 4 is a flowchart illustrating exemplary processing performed in the still image recording mode shown in step S4 of Fig. 3. The still image recording mode processing shown in Fig. 4 ends, for example, when the mode is switched to another mode with the mode selector 60, or when the power switch 72 is set to the power-off position.
After the still image recording mode processing starts, in step S401 the system controller 50 confirms the shooting mode. The shooting mode is confirmed by either (1) or (2) below. (1) The shooting mode used at the end of the previous still image recording mode processing is acquired from the nonvolatile memory 56 and stored in the system memory 52.
(2) When the user specifies a shooting mode by operating the operation unit 70, the specified shooting mode is stored in the system memory 52.
A shooting mode is defined by a combination of a shutter speed, an aperture value, a flash emission state, and a sensitivity setting suitable for the scene to be shot. The digital camera 100 according to the present embodiment has the following shooting modes:
Auto mode: various camera parameters are determined automatically, based on the measured exposure value, by a program built into the digital camera 100.
Manual mode: the user can freely change the various camera parameters.
Scene mode: a combination of shutter speed, aperture value, flash emission state, and sensitivity setting suitable for the scene to be shot is set automatically.
The scene mode includes the following modes:
Portrait mode: specialized for shooting people, keeping the person in focus while blurring the background.
Night Scene mode: specialized for night scenes, illuminating the person with the flash and recording the background at a slow shutter speed.
Landscape mode: specialized for wide landscape scenes.
Night & Snapshot mode: suitable for shooting beautiful images of night scenes and people without using a tripod.
Kids & Pets mode: for shooting children and fast-moving animals without missing the chance to capture the perfect moment.
Foliage mode: suitable for shooting greenery and colorful autumn foliage.
Party mode: for shooting a subject with faithful color tones under fluorescent or incandescent light while compensating for camera shake.
Snow mode: for shooting against a snowy background without the person appearing dark or the image taking on a blue cast.
Beach mode: for shooting people and other subjects without them appearing dark, even in strongly sunlit scenes of sea or sand.
Fireworks mode: for clearly shooting soaring fireworks at an optimal exposure.
Aquarium mode: sets the sensitivity, white balance, and color tones suitable for shooting fish in an indoor aquarium tank.
Underwater mode: uses a white balance optimal for underwater scenes to shoot images with reduced blue cast.
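For reference in later discussion, the shooting modes listed above can be collected into an enumeration. This is purely an illustrative sketch; the patent does not prescribe any identifiers, so all names below are assumptions.

```python
from enum import Enum

class ShootingMode(Enum):
    """Hypothetical identifiers for the shooting modes described above."""
    AUTO = "auto"
    MANUAL = "manual"
    PORTRAIT = "portrait"
    NIGHT_SCENE = "night_scene"
    LANDSCAPE = "landscape"
    NIGHT_SNAPSHOT = "night_snapshot"
    KIDS_PETS = "kids_pets"
    FOLIAGE = "foliage"
    PARTY = "party"
    SNOW = "snow"
    BEACH = "beach"
    FIREWORKS = "fireworks"
    AQUARIUM = "aquarium"
    UNDERWATER = "underwater"
```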
Referring back to Fig. 4, after the shooting mode is confirmed in step S401, in step S402 the system controller 50 displays the image data supplied from the image pickup unit 22 in a live view display state. Displaying an image in the live view display state here means displaying the image obtained by the image pickup unit in real time. Then, in step S403, the system controller 50 uses the power supply controller 80 to determine whether there is a problem in operating the digital camera 100 caused by the remaining life of the power supply unit 30 (including, e.g., a battery), and whether there is a problem caused by the absence of the storage medium 200 or by its remaining capacity. If a problem is determined to exist in the state of the power supply unit 30 or the storage medium 200 (NO in step S403), the flow proceeds to step S404. In step S404, the image display unit 28, under the control of the system controller 50, displays a predetermined warning using an image or audio, and the flow returns to step S401.
If the states of the power supply unit 30 and the storage medium 200 are determined to have no problem (YES in step S403), the flow proceeds to step S405. In step S405, the system controller 50 sets, if needed, ON or OFF for the setting for automatically providing classification information. After pressing the menu button included in the operation unit 70, the user can freely select ON or OFF for this setting on a menu screen (not shown) displayed on the image display unit 28. The ON/OFF state is specified by a flag indicating whether classification information is automatically provided according to the scene mode and subject information, and the set value (the ON/OFF value of the flag) is held in the system memory 52. Allowing ON or OFF to be specified for this setting can prevent, depending on the circumstances, classification information unwanted by the user from being provided. Classification information will be described below.
Then, in step S406, the system controller 50 determines whether a face is present in the image signal displayed in the live view display state. An example of this face detection processing will be described later with reference to Fig. 5. If a face is detected in the face detection processing, the system controller 50 stores, as face information in the system memory 52, the position coordinates of the face detected in the image signal, the size (e.g., width and height) of the detected face, the number of detected faces, a reliability coefficient, and other related information. If no face is detected in the face detection processing, zero is set in the fields for the position coordinates, size (e.g., width and height), number of detected faces, reliability coefficient, and other related information.
The face detection in step S406 during live view display may be performed on the image stored in the VRAM for live view display, or directly on the captured image obtained from the CCD itself.
Then, in step S407, the system controller 50 determines whether the first shutter switch signal SW1 is ON. If the first shutter switch signal SW1 is OFF ("OFF" in step S407), the flow returns to step S405, and the processing of steps S405 and S406 is repeated. If the first shutter switch signal SW1 is ON ("ON" in step S407), the flow proceeds to step S408. In step S408, the system controller 50 measures the distance to adjust the focus of the taking lens 103 onto the subject, and performs photometry to determine the aperture value and the shutter time (shutter speed). In the photometry processing, flash settings are made if needed. If a face was detected in step S406, ranging may also be performed within the range of the detected face.
Then, in steps S409 and S410, the ON/OFF states of the first shutter switch signal SW1 and the second shutter switch signal SW2 are determined. If the second shutter switch signal SW2 is turned on while the first shutter switch signal SW1 is in the ON state (ON in step S409), the flow proceeds to step S411. If the first shutter switch signal SW1 is turned off, that is, the first shutter switch is released, while the second shutter switch signal SW2 remains off (OFF in step S410), the flow returns to step S405. While the first shutter switch signal SW1 is in the ON state and the second shutter switch signal SW2 is in the OFF state, steps S409 and S410 are repeated.
If the second shutter switch signal SW2 is turned on (the second shutter switch 64 is pressed), in step S411 the system controller 50 changes the display state of the image display unit 28 from the live view display state to a fixed-color display state. Displaying an image in the fixed-color display state means displaying an image of a single color for a fixed period, to intuitively inform the user that an image was captured when the shutter button of the digital camera was pressed. In the present embodiment, a black image (blackout) is displayed in the fixed-color display state. Then, in step S412, the system controller 50 performs shooting processing, including exposure and development. In the exposure, the image data obtained by the image pickup unit 22 and the A/D converter 23 is written into the memory 32 through the image processor 24 and the memory controller 15, or directly from the A/D converter 23 through the memory controller 15. In the development, the system controller 50 uses the memory controller 15 and, if needed, the image processor 24 to read the image data written in the memory 32 and perform various types of processing on the read image data. This shooting processing will be described later with reference to Fig. 6.
Then, in step S413, the system controller 50 performs REC review display of the image data obtained in the shooting processing on the image display unit 28. REC review display means displaying the image data on the image display unit 28 for a predetermined period (review time) after shooting the subject and before recording the image on the recording medium, so that the user can check the captured image. After the REC review display of the image data, in step S414 the system controller 50 records the image data obtained in the shooting processing on the storage medium 200 as an image file. This recording processing will be described later with reference to Fig. 7.
After the recording in step S414 is completed, in step S415 the system controller 50 determines whether the second shutter switch signal SW2 is in the ON state. If the second shutter switch signal SW2 is determined to be in the ON state, the determination in step S415 is repeated until the second shutter switch signal SW2 is turned off. During this period, the REC review display of the image data continues. That is, when the recording in step S414 is completed, the REC review display of the image data on the image display unit 28 continues until the second shutter switch signal SW2 is turned off, that is, until the second shutter switch 64 is released. This allows the user to carefully check the captured image data with the REC review function by keeping the shutter button 61 fully pressed.
After the user shoots an image by fully pressing the shutter button 61, when the user releases the fully pressed state of the shutter button 61 by removing his or her hand from it, the flow proceeds from step S415 to step S416. In step S416, the system controller 50 determines whether the predetermined review time has elapsed. If the predetermined time is determined to have elapsed (YES in step S416), the flow proceeds to step S417. In step S417, the system controller 50 returns the display state of the image display unit 28 from the REC review display state to the live view display state. With this processing, after the captured image data is checked in the REC review display state, the display state of the image display unit 28 automatically changes to the live view display state, which sequentially displays the image data from the image pickup unit 22 in preparation for the next shot.
In step S418, the system controller 50 determines whether the first shutter switch signal SW1 is in the ON state. If the first shutter switch signal SW1 is determined to be in the ON state in step S418, the flow returns to step S409; if it is determined to be in the OFF state, the flow returns to step S405. That is, if the half-pressed state of the shutter button 61 continues (the first shutter switch signal SW1 is in the ON state), the processing prepares for the next shot (in step S409). If the shutter button 61 is released (the first shutter switch signal SW1 is in the OFF state), the series of shooting steps is completed, and the processing returns to the state of waiting for shooting (in step S405).
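The control flow of steps S407 to S418 can be modeled as a small state machine driven by the SW1/SW2 signals. The sketch below is a simplified simulation under stated assumptions: camera operations are stubbed out as recorded event labels, one capture is modeled per full press, and all names are illustrative rather than taken from the patent.

```python
def run_shutter_sequence(events):
    """Simulate steps S407-S418 of Fig. 4 for a list of (sw1, sw2) samples.

    Returns the list of actions taken. Camera operations are stubs; only the
    control flow mirrors the flowchart described above.
    """
    actions = []
    armed = False                          # True once S408 (ranging/photometry) has run
    for sw1, sw2 in events:
        if not armed:
            if sw1:                        # step S407: half-press detected
                actions.append("AF_AE")    # step S408
                armed = True
            # else: keep looping through steps S405-S406 (live view, face detection)
        else:
            if sw2:                        # step S409: full press
                actions.append("capture")  # steps S411-S412 (blackout, exposure, development)
                actions.append("record")   # step S414
                actions.append("rec_review")  # steps S413, S415-S417
            elif not sw1:                  # steps S410/S418: button fully released
                armed = False              # return to waiting (step S405)
    return actions
```

A half-press followed by a full press thus yields the AF/AE, capture, record, and REC review actions in order, and releasing the button re-arms the loop.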
Face Detection
An example of the face detection processing in step S406 of Fig. 4 will now be described with reference to Fig. 5. In step S501, the system controller 50 sends the image data to be subjected to face detection processing to the image processor 24. In step S502, the image processor 24, under the control of the system controller 50, filters the image data with a horizontal band-pass filter (BPF). In step S503, the image processor 24, under the control of the system controller 50, filters the image data processed with the horizontal BPF in step S502 with a vertical BPF. The horizontal and vertical BPFs are used to detect edge components in the image data.
Then, in step S504, the system controller 50 performs pattern matching on the detected edge components to extract candidates for eyes, noses, mouths, and ears. Then, in step S505, the system controller 50 determines, from the eye candidates extracted in step S504, those satisfying predetermined conditions (e.g., the distance between two eyes or their inclination) to be a pair of eyes, and narrows the eye candidates down to those forming such pairs. Then, in step S506, the system controller 50 associates the eye candidates narrowed down in step S505 with the other corresponding parts forming a face (nose, mouth, ears), and filters the data through a non-face condition filter to detect a face. In step S507, the system controller 50 outputs the face information according to the face detection result in step S506, and completes this processing.
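The pipeline of Fig. 5 has a clear structure: edge extraction, candidate matching, eye-pair narrowing, and non-face filtering. The sketch below captures only that structure; the actual filters and matchers run in the image processor 24, so every callable here is a stand-in assumption passed in by the caller.

```python
def detect_faces(image, edge_filter, find_candidates, is_eye_pair, non_face):
    """Structural sketch of steps S501-S507 in Fig. 5.

    The callables stand in for the horizontal/vertical BPFs (S502-S503),
    pattern matching (S504), the eye-pair conditions (S505), and the
    non-face condition filter (S506).
    """
    edges = edge_filter(image)              # S502-S503: extract edge components
    candidates = find_candidates(edges)     # S504: candidate eyes/noses/mouths/ears
    faces = []
    for cand in candidates:
        # S505: keep only candidates forming a valid eye pair;
        # S506: reject anything matching a non-face condition.
        if is_eye_pair(cand) and not non_face(cand):
            faces.append(cand)
    return faces                            # S507: output face information
```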
As described above, subject information can be detected by extracting feature information from the image data displayed in the live view display state. In the present embodiment, face information is described as one example of subject information, but various other types of information, such as red-eye detection information, can also be used.
Shooting Processing
An example of the shooting processing in step S412 of Fig. 4 will now be described with reference to Fig. 6. In step S601, the system controller 50 obtains the date and time of shooting from the system timer and stores it in the system memory 52. Then, in step S602, the system controller 50, based on the photometric data stored in the system memory 52, opens the shutter 101, which has an aperture function, according to the aperture value. This causes the image pickup unit 22 to start exposure (in step S603).
In step S604, the system controller 50 waits for the image pickup unit 22 to complete the exposure according to the photometric data. At the end of the exposure, in step S605 the system controller 50 closes the shutter 101. Then, in step S606, a charge signal is read from the image pickup unit 22, and the image data is written into the memory 32 through the A/D converter 23, the image processor 24, and the memory controller 15, or directly from the A/D converter 23 through the memory controller 15. Steps S601 to S606 correspond to the exposure processing.
Then, in step S607, the system controller 50 reads the image data stored in the memory 32 and, using the memory controller 15 and, if needed, the image processor 24, sequentially performs image processing on the read image data. Examples of this image processing include white balance processing and compression using the compression/decompression unit 16. The processed image data is written into the memory 32. In step S608, the system controller 50 reads the image data from the memory 32, decompresses it using the compression/decompression unit 16, and resizes it for display on the image display unit 28. The system controller 50 then transfers the resized image data through the memory controller 15 to the D/A converter 13 for display on the image display unit 28. When this series of steps is completed, the shooting processing is completed.
Recording Processing
An example of the recording processing in step S414 of Fig. 4 will now be described with reference to Fig. 7. In step S701, the system controller 50 generates a file name for the image data to be recorded, according to a file naming rule. An example of the file naming rule will be described later with reference to Fig. 9. Then, in step S702, the system controller 50 obtains the date and time information stored in the system memory 52 in step S601 of Fig. 6. Then, in step S703, the system controller 50 obtains information about the size of the image data to be recorded.
Then, in step S704, the system controller 50 determines whether a directory for storing the image file generated from the image data exists on the storage medium 200. If such a directory is determined not to exist (NO in step S704), the flow proceeds to step S705, where the system controller 50 generates the directory for storing the image file. An example of the rule for generating directory names will be described later with reference to Fig. 9. Here, the name 100XXX (902 in Fig. 9) is generated.
Then, in step S706, the system controller 50 generates a file header for the image data stored in the memory 32 in step S606 of the shooting processing of Fig. 6. The file header is composed of information about the shooting date and the conditions at the time of shooting. An example of this generation processing will be described later with reference to Fig. 8. An example structure of an image file generated in this manner will be described later with reference to Fig. 9.
After the header generation is completed, in step S707 the system controller 50 generates a directory entry from the file name generated in step S701 and the date and time information obtained in step S702, and records the image file on the storage medium 200.
Header Generation
An example of the header generation processing in step S706 of Fig. 7 will now be described with reference to Fig. 8. In step S801, the system controller 50 obtains from the system memory 52 the ON/OFF value of the setting, specified in step S405 of Fig. 4, for automatically providing classification information, and determines whether to automatically provide classification information to the captured image data. If the setting for automatically providing classification information is determined to be set to "OFF", that is, classification information is not automatically provided (NO in step S801), the flow proceeds to step S809.
If the setting for automatically providing classification information is determined to be set to "ON", that is, classification information is automatically provided (YES in step S801), the flow proceeds to step S802. In step S802, the system controller 50 reads the face information held in the system memory 52 in step S406 of Fig. 4, and determines whether a face was detected. If a face is determined to have been detected (YES in step S802), the flow proceeds to step S804, where the classification information "people" is provided. If no face is determined to have been detected (NO in step S802), the flow proceeds to step S803.
In step S803, the system controller 50 refers to the scene mode at the time of shooting stored in the system memory 52, and determines whether the scene mode is any of "Portrait mode", "Night & Snapshot mode", and "Kids & Pets mode". In these three modes, it is assumed that a person was shot. If the scene mode is determined to be one of them (YES in step S803), the flow proceeds to step S804, where the system controller 50 provides the classification information "people" to the image data. After the classification information "people" is provided in step S804, or if the scene mode is determined to be none of them (NO in step S803), the flow proceeds to step S805.
As described above, in steps S802 to S804, the same classification information "people" is provided according to both the face information, as an example of subject information, and the scene mode, as an example of a camera setting mode at the time of shooting. Subject information and the camera setting mode at the time of shooting are different parameters at shooting time, but depending on the content, they can have similar meanings after shooting. The face information, as one piece of subject information, and "Portrait mode", "Night & Snapshot mode", and "Kids & Pets mode", as camera setting modes at the time of shooting, all carry the same meaning: a person is presumed to have been shot. Therefore, providing the same classification information to image data carrying this meaning enhances the convenience of post-shooting operations (e.g., search operations). That is, providing the same classification information for specific subject information and specific camera setting modes makes it possible to provide classification information that differs from the shooting-time parameters and is suitable for post-shooting operations (e.g., search operations). This can enhance convenience.
In addition, the above classification information providing processing can provide the same classification information for the different scene modes Portrait mode, Night & Snapshot mode, and Kids & Pets mode. Different scene modes correspond to different camera setting modes at the time of shooting, but they can have similar meanings. Portrait mode, Night & Snapshot mode, and Kids & Pets mode all carry the same meaning: a person is presumed to have been shot. Therefore, providing the same classification information to such image data enhances the convenience of post-shooting operations (e.g., search operations). That is, providing the same classification information for multiple specific setting modes among the camera setting modes at the time of shooting makes it possible to provide classification information that differs from the shooting-time parameters and is suitable for post-shooting operations (e.g., search operations). This can enhance the convenience of post-shooting operations.
Referring back to Fig. 8, in step S805 the system controller 50 determines whether the scene mode is any of "Foliage mode", "Landscape mode", and "Fireworks mode". In these three modes, the captured image is presumed to be a landscape shot. If the scene mode is determined to be one of them (YES in step S805), the flow proceeds to step S806, where the system controller 50 provides the classification information "landscape" to the image data. After the classification information "landscape" is provided in step S806, or if the scene mode is determined to be none of them (NO in step S805), the flow proceeds to step S807.
In step S807, the system controller 50 determines whether the scene mode is any of "Party mode", "Snow mode", "Beach mode", "Fireworks mode", "Aquarium mode", and "Underwater mode". In these modes, an event is presumed to have been shot. If the scene mode is determined to be one of them (YES in step S807), the flow proceeds to step S808, where the system controller 50 provides the classification information "event" to the image data.
In the above processing, image data shot in "Fireworks mode" is provided with two types of classification information, "landscape" and "event". That is, multiple types of classification information are provided according to a single scene mode. Even under the same camera setting mode (scene mode) at the time of shooting, the captured image data can have multiple meanings; one such example is an image captured in "Fireworks mode". In such a case, the system controller 50 provides the multiple types of classification information corresponding to the post-shooting meanings. Therefore, classification information that differs from the shooting-time parameters and is suitable for post-shooting operations (e.g., search operations) can be provided. This can enhance the convenience of post-shooting operations of the digital camera 100.
In "Auto mode", "Manual mode", or any other scene mode for which steps S803, S805, and S807 all result in NO, no classification information is provided.
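The decision logic of steps S801 to S808 can be summarized as a pure function from the face detection result and the scene mode to a set of classification tags. The following is a hedged sketch; the mode strings and the function name are illustrative assumptions, while the mode groupings mirror steps S803, S805, and S807 described above.

```python
# Scene-mode groups taken from steps S803, S805, and S807 of Fig. 8.
PEOPLE_MODES = {"portrait", "night_snapshot", "kids_pets"}
LANDSCAPE_MODES = {"foliage", "landscape", "fireworks"}
EVENT_MODES = {"party", "snow", "beach", "fireworks", "aquarium", "underwater"}

def classify(face_detected, scene_mode, auto_enabled=True):
    """Return the set of classification tags for a captured image (steps S801-S808)."""
    tags = set()
    if not auto_enabled:                                # step S801: setting is OFF
        return tags
    if face_detected or scene_mode in PEOPLE_MODES:     # steps S802-S804
        tags.add("people")
    if scene_mode in LANDSCAPE_MODES:                   # steps S805-S806
        tags.add("landscape")
    if scene_mode in EVENT_MODES:                       # steps S807-S808
        tags.add("event")
    return tags
```

Note that "fireworks" appears in both the landscape and event groups, so an image shot in that mode receives both tags, matching the multi-tag behavior described above; Auto and Manual modes with no detected face receive no tags.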
After the processing for providing the classification information in the header is completed, the flow proceeds to step S809. In step S809, the system controller 50 generates the header information using the classification information, the settings at the time of shooting, the information about the shooting date, and so on.
If this processing is performed in editing processing, the information about coordinates described in the header is corrected, and then this processing is completed. The information about coordinates includes positional information about the image, such as face information and focus frame information. When image data is edited so that its angle of field changes, for example by cropping (trimming) or combining, the coordinate information of the pre-edit image is inappropriate for the post-edit image. To address this problem, in the case of cropping, the coordinate information of the edited image is recalculated based on the position and size of the cropped region, and the recalculated coordinate information is described in the header. In the case of combining, the coordinate information is recalculated based on the positional information of the pre-combination images within the combined image, and the recalculated coordinate information is described in the header.
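For the cropping case, the recalculation amounts to translating the stored coordinates by the crop origin and discarding regions that fall outside the new frame. The patent does not specify the exact arithmetic, so the following is a minimal sketch under that assumption, with face regions represented as (x, y, width, height) tuples.

```python
def recalc_face_coords(faces, crop_x, crop_y, crop_w, crop_h):
    """Translate face rectangles (x, y, w, h) into the cropped image's frame.

    Faces whose rectangle does not fit inside the crop are dropped, since their
    coordinates would be inappropriate for the edited image.
    """
    updated = []
    for (x, y, w, h) in faces:
        nx, ny = x - crop_x, y - crop_y          # shift by the crop origin
        if nx >= 0 and ny >= 0 and nx + w <= crop_w and ny + h <= crop_h:
            updated.append((nx, ny, w, h))       # still inside the new frame
    return updated
```

The combining case would run the inverse translation, offsetting each source image's coordinates by its position within the combined image.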
Alternatively, classification information that the user has changed may be left unchanged (not provided again). This can be achieved, for example, by not reflecting the detection result of this processing, based on the following determination: current classification information that differs from the last automatically provided classification information described in the header is classification information changed by the user. For example, when the automatically provided classification information was "landscape" and the classification information is now set to "landscape" and "people", it is determined that the user set "people"; therefore, the attribute "people" is not changed, regardless of the result of automatic provision. When the current result of automatic provision differs from the content of the classification information described in the header, the user may be asked, on a graphical user interface (GUI), to select which attributes to provide or remove.
If a flag indicating that the date is printed on the image is set, this flag may be removed when the image is edited. Alternatively, the date print flag may be changed only when the cropped region is determined not to include the printed date.
Referring back to Fig. 8, when the ON/OFF value of the setting for automatically providing classification information is determined to be OFF (NO in step S801), the setting of classification information (steps S802 to S808) is skipped, and a header without classification information is generated.
As described above, automatically providing classification information used for searching and the like at the time of shooting allows the user to classify image data immediately in the reproduction mode, without having to organize image files in the known manner while reviewing the reproduced image data. Because the classification concepts are based on both the camera setting mode at the time of shooting and the subject information, classification information suited to post-shooting operations (for example, searching image data) can be generated. The processing shown in Fig. 8 corresponds to exemplary processing performed by the providing unit.
In the foregoing, several exemplary scene modes were described as camera setting modes used when automatically providing classification information at the time of shooting. However, the camera setting modes are not limited to these examples. As another example of providing classification information based on the camera setting mode at the time of shooting, when a distant view is shot in manual mode, it is presumed that a landscape is being shot, and the classification information "landscape" is provided. As yet another example, when an image is shot using the self-timer, at least one of the two presumed classifications "people" and "event" is provided.
In the foregoing, face information was described as an example of subject information. However, subject information is not limited to face information. For example, red-eye judgement information can be used; in that case, when red-eye is detected, the classification information "people" can be provided.
Likewise, automatically provided classification information is not limited to the three types "people", "landscape" and "event" described above, as long as the information is easy for the user to use after shooting.
Structure of directories and files
Fig. 9 illustrates an example structure of directories and files recorded on the storage medium 200 in the recording processing described above. An example of the rules for generating directory names and file names is described below with reference to Fig. 9. A DCIM directory 901 is recorded as the root, and subdirectories are generated within the DCIM directory 901. Each subdirectory name consists of six characters, the first three of which are digits. The number represented by these first three digits starts at 100 and is incremented by 1 each time a directory is generated. Fig. 9 shows a subdirectory "100XXX" 902 and a subdirectory "101XXX" 903.
Files generated by the digital camera 100 are created under the subdirectories. In the example shown in Fig. 9, files 904 to 909 generated by the digital camera 100 are created under the subdirectory 902, and files 911 to 918 are created under the subdirectory 903. Each file name generated here consists of an eight-character file name and a three-character extension representing the file type. The last four characters of the file name are digits, starting from 0001. In the still image recording mode, file names are provided such that the number represented by these last four digits is incremented by 1 each time an image is shot. The number represented by these last four digits is referred to below as the file number. Still image files recorded in the still image recording mode are given the extension "JPG". Moving image files recorded in the moving image recording mode are given the extension "AVI". Thumbnail files, which record management information, are given the extension "THM".
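The naming rules above can be sketched as follows. The "XXX", "IMG_" and "MVI_" stems are taken from the file names appearing in the figures; treating them as fixed stems is an assumption made here for illustration.

```python
# Directory names: three digits starting at 100, plus three more characters.
def subdir_name(index, suffix="XXX"):
    return f"{100 + index}{suffix}"          # e.g. 100XXX, 101XXX, ...

# File names: eight characters ending in a four-digit file number starting
# at 0001, plus an extension chosen by record type.
def file_name(number, kind="still"):
    stem = {"still": "IMG_", "movie": "MVI_"}[kind]
    ext = {"still": "JPG", "movie": "AVI"}[kind]
    return f"{stem}{number:04d}.{ext}"       # e.g. IMG_0001.JPG

print(subdir_name(0))         # → 100XXX
print(file_name(2))           # → IMG_0002.JPG
print(file_name(5, "movie"))  # → MVI_0005.AVI
```

The same file number with the THM extension would name the corresponding thumbnail file (for example, MVI_0005.THM for MVI_0005.AVI).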
File structure
Figure 10 illustrates an example data structure of a still image file recorded in the recording processing described above. The image file 1001 comprises a start of image (SOI) marker 1002 at its beginning, indicating the start of the image file, followed immediately by an application marker (APP1) 1003 corresponding to the header. The application marker (APP1) 1003 includes the following information.
Size (APP1 length) 1004
Application marker identifier code (APP1 identifier code) 1005
Creation date and time of the image data (date-time) 1006
Date and time when the image data was generated (original date-time) 1007
Classification information 1018 of the image data
Automatically provided classification information 1020 of the image data
Date print setting 1021 of the image data
Focus frame information 1022 of the image data
Face information 1019
Other shooting information 1009
Thumbnail image (thumbnail data) 1010
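The marker layout above (SOI, then APP1 with a two-byte length field) can be walked with a short sketch. This only locates the APP1 payload; it does not decode the header fields listed above, and the fabricated sample bytes are an assumption for illustration.

```python
import struct

def read_app1_segment(data: bytes):
    # The file must begin with the SOI marker (0xFFD8).
    assert data[0:2] == b"\xff\xd8", "missing SOI marker"
    pos = 2
    while pos + 4 <= len(data):
        marker, length = struct.unpack(">HH", data[pos:pos + 4])
        if marker == 0xFFE1:                       # APP1 marker
            return data[pos + 4:pos + 2 + length]  # payload after the length field
        pos += 2 + length                          # the length counts itself
    return None

# Minimal fabricated prefix: SOI + APP1 carrying a 6-byte "Exif\0\0" id code.
sample = b"\xff\xd8" + b"\xff\xe1" + struct.pack(">H", 8) + b"Exif\x00\x00"
print(read_app1_segment(sample))  # → b'Exif\x00\x00'
```

In a real file the returned payload would contain the identifier code 1005 followed by the header items 1006 to 1022 listed above.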
As described above with reference to Fig. 8, the classification information 1018 differs from the shooting-time parameters and is information suited to post-shooting operations (for example, searching). One or more elements such as "people", "landscape" and "event" can be stored as the classification information 1018 at the time of shooting. General-purpose classification information such as "category 1", "category 2" and "category 3" can also be stored. For image data to be sent to an external device such as a PC via the communication unit 110, classification information that prompts particular processing (for example, e-mail transmission at the transfer destination), such as "work", can also be stored. By a predetermined operation, described below, the user provides desired image data with classification information of these types that is not automatically provided in the processing shown in Fig. 8. Classification information automatically provided at the time of shooting can be edited in the reproduction mode (see Figure 15).
As described above, in addition to the classification information automatically provided at the time of shooting, classification information is also provided that makes it easy for the user to classify image data carefully in the reproduction mode while reviewing the image data, which allows the user to classify data more easily.
In the present embodiment, automatically provided classification information 1020 is also set. The automatically provided classification information 1020 holds the information automatically provided by the system controller 50 of the digital camera 100 according to the present embodiment. Editing of the automatically provided classification information 1020 in the reproduction mode is prohibited (see Figure 15), and the automatically provided classification information 1020 is used to identify classification information that the user intentionally changed, by comparison between the classification information 1018 and the automatically provided classification information 1020.
The face information 1019 is information generated in the face detection processing (step S406 of Fig. 4). The face information 1019 includes the position coordinates of each detected face, the size (width and height) of each detected face, the number of detected faces, and a reliability coefficient. These elements are included for each detected face.
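The fields listed above can be modelled with a small container type. This is only an illustrative in-memory representation; the actual binary encoding of the face information 1019 in the header is not specified here.

```python
from dataclasses import dataclass

@dataclass
class DetectedFace:
    x: int              # position coordinates of the detected face
    y: int
    width: int          # size (width and height) of the detected face
    height: int
    reliability: float  # reliability coefficient for this detection

faces = [DetectedFace(120, 80, 64, 64, 0.92),
         DetectedFace(300, 90, 48, 48, 0.75)]
face_count = len(faces)  # the face count is recorded alongside the per-face entries
print(face_count)  # → 2
```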
The image data recorded in the image file 1001 includes a define quantization table (DQT) 1012, a define Huffman table (DHT) 1013, a start of frame (SOF) marker 1014, a start of scan (SOS) marker 1015, and compressed data 1016. The image file 1001 is terminated by an end of image (EOI) marker 1017, which indicates the end of the image file data.
The date print setting 1021 is a flag indicating whether the shooting date and time are embedded in the captured image at the time of shooting. The date print setting 1021 is used to avoid overlapping date prints when the image is printed on a printer that has a date print function.
The focus frame information 1022 manages, using a coordinate system, the in-focus position and size of the automatic focusing (AF) at the time of shooting. Based on the focus frame information 1022, the user can see the in-focus position.
Moving image recording mode processing
Figure 11 is a flowchart illustrating exemplary processing in the moving image recording mode. When the mode selector 60 is set to the moving image recording mode, the system controller 50 checks the shooting mode. In the present embodiment, the shooting modes in the moving image recording mode are described on the assumption that they are similar to those of the still image recording mode. Of course, the moving image recording mode may have shooting modes dedicated to movie shooting.
When the ON state of the second shutter release signal SW2 is detected in the moving image recording mode, the system controller 50 starts the moving image recording mode processing shown in Figure 11. In step S1101, the system controller 50 sequentially stores, in the memory 32, the image data obtained by the camera unit 22 at a predetermined frame rate. At the same time, the system controller 50 also stores in the memory 32 the audio data obtained through the microphone 10, the audio controller 11 and the A/D converter 23. In the present embodiment, the audio data is assumed to be PCM digital data.
Then, in step S1102, the system controller 50 performs image processing on the image data stored in the memory 32. One example of this image processing is resizing for recording the image data in a file. Then, in step S1103, the system controller 50 compresses the image data and stores it in the memory 32.
Figure 12 illustrates an example format for storing recorded moving image data on the storage medium 200. A fixed-length header region 1201 is arranged at the beginning of the data. The header region 1201 contains data such as the video frame rate and the audio sampling rate. Immediately after the header region 1201, a fixed-length audio data region 1202 is arranged. The audio data region 1202 stores audio data in predetermined recording units (one second in the present embodiment). The audio data is obtained by converting the audio input to the microphone 10 into digital data through the audio controller 11 and the A/D converter 23, and is stored in the memory 32. Immediately after the audio data region 1202, the frame data elements (1203 to 1206) recorded at the predetermined frame rate are stored sequentially. Similarly, data elements 1207 to 1212 represent the moving image data of the next second, and data elements 1213 to 1217 represent the moving image data of the N-th second. In this way, audio data and frame data are generated in order and stored in predetermined recording units, thereby generating the moving image data.
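The one-second recording unit above can be sketched as a toy writer: a fixed-length header, then for each second a fixed-length audio region followed by that second's video frames. All field sizes, the length-prefixed frame encoding, and the header contents are assumptions for illustration, not the actual on-medium format.

```python
import io
import struct

FRAME_RATE = 4            # frames per second (illustrative)
AUDIO_BYTES_PER_SEC = 16  # fixed-length PCM audio region (illustrative)

def write_second(buf, audio: bytes, frames):
    """Write one predetermined recording unit: audio region, then frames."""
    assert len(audio) == AUDIO_BYTES_PER_SEC and len(frames) == FRAME_RATE
    buf.write(audio)
    for frame in frames:
        buf.write(struct.pack(">I", len(frame)))  # simple length-prefixed frame
        buf.write(frame)

buf = io.BytesIO()
# Fixed-length header region: frame rate and audio rate (illustrative fields).
buf.write(struct.pack(">II", FRAME_RATE, AUDIO_BYTES_PER_SEC))
write_second(buf, b"\x00" * AUDIO_BYTES_PER_SEC,
             [b"F%d" % i for i in range(FRAME_RATE)])
print(buf.tell())  # header 8 + audio 16 + 4 frames of (4 + 2) bytes = 48
```

Repeating `write_second` once per second, and appending index information at the end, would mirror the sequence of regions 1201 to 1218 described in the text.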
When one second of data has been stored in the manner described above, referring to step S1104 of Figure 11, the system controller 50 starts recording the moving image data stored in the memory 32 onto the storage medium 200 in parallel with the recording of the moving image and audio. The system controller 50 repeats steps S1101 to S1104 until a request to stop moving image recording is detected (step S1105). The request to stop moving image recording is generated when the second shutter release signal SW2 is detected again, when it is detected that the free space on the storage medium 200 is insufficient, or when it is detected that the free space in the memory 32 is insufficient.
As described above, Fig. 9 illustrates an example structure of the directories and files recorded on the storage medium 200 in the recording processing performed by the digital camera 100. Moving image files recorded in the moving image recording mode have the extension AVI, as shown at 915 and 917. Thumbnail files recording management information have the extension THM, as shown at 916 and 918.
Referring back to Figure 11, when the moving image recording processing stops in response to a request to stop moving image recording, the flow proceeds from step S1105 to step S1106. In step S1106, the system controller 50 writes the moving image data remaining in the memory 32 onto the storage medium 200, and then records index information 1218, in which the offset and size of each piece of audio data and video data are stored. Then, in step S1107, the system controller 50 generates header information (for example, the total number of frames). Then, in step S1108, the system controller 50 describes the total data size in the directory entry and records the total data size information on the storage medium 200. The moving image file recording is thus complete. In step S1109, a thumbnail file is generated as management information for the moving image file; the thumbnail file has the same number as the moving image file and has the extension THM (for example, MVI_0005.THM (916)). An example of the structure of the thumbnail file and the processing for generating and recording it are described later with reference to Figures 10 and 13.
Structure and recording of the thumbnail file
The thumbnail file generated during moving image recording has a file structure similar to that of the image file shown in Figure 10. However, the thumbnail file does not have the thumbnail image region 1010 for recording thumbnail data; the thumbnail image is recorded in the compressed data 1016.
The thumbnail file 1001 includes a start of image (SOI) marker 1002 at its beginning, indicating the start of the image in the thumbnail file, followed immediately by an application marker (APP1) 1003. The application marker (APP1) 1003 includes the following information:
Size (APP1 length) 1004;
Application marker identifier code (APP1 identifier code) 1005;
Creation date and time of the image data (date-time) 1006;
Date and time when the image data was generated (original date-time) 1007;
Classification information 1018 of the image data;
Automatically provided classification information 1020 of the image data;
Date print setting 1021 of the image data;
Focus frame information 1022 of the image data;
Face information 1019; and
Other shooting information 1009.
The image data of the thumbnail file is a reduced image of the first video frame at the start of moving image recording. This image data includes a define quantization table (DQT) 1012, a define Huffman table (DHT) 1013, a start of frame (SOF) marker 1014, a start of scan (SOS) marker 1015, and compressed data 1016 corresponding to the reduced image. The image data is terminated by an end of image (EOI) marker 1017, which indicates the end of the image data.
An example of the thumbnail recording processing in step S1109 of Figure 11 is now described with reference to Figure 13. In step S1301, the system controller 50 generates a thumbnail image. In the present embodiment, the thumbnail image is generated by performing image processing (for example, resizing to a predetermined image size) on the first video frame of the moving image data stored in the memory 32. Then, in step S1302, the compression/decompression unit 16 compresses the thumbnail image generated in step S1301 under the control of the system controller 50. Then, in step S1303, the header including the application marker 1003 (see Figure 10) is generated. This processing was described above with reference to Fig. 8. After the header generation is complete, in step S1304 the system controller 50 writes the thumbnail file, comprising the header and the thumbnail image data, onto the storage medium 200, and the thumbnail recording processing is complete.
Receiving mode processing
Figure 14 is a flowchart illustrating exemplary processing in the receiving mode, which is one of the other modes shown in step S9 of Fig. 3. When the mode selector 60 of the digital camera 100 is switched to the receiving mode, the receiving mode processing shown in Figure 14 is performed. The following description covers the processing for receiving an image file from an external device (communication device) and recording it on the recording medium.
In step S1401, the system controller 50 checks whether a device with which to communicate exists. If it is judged that no communicating device exists (NO in step S1401), the receiving processing is complete. If it is judged that a communicating device exists (YES in step S1401), the flow proceeds to step S1402, in which the system controller 50 judges whether there is a transmission request. If it is judged that there is no transmission request (NO in step S1402), the flow returns to step S1401, and the system controller 50 again checks whether a communicating device exists and waits for a transmission request.
If it is judged that there is a transmission request (YES in step S1402), then in step S1403 the system controller 50 receives data from the communicating device through the communication unit 110 and temporarily holds the received data in the memory 32. Then, in step S1404, the system controller 50 writes the received data onto the storage medium 200. At this point, when the header of the received data includes classification information, the received data is recorded on the storage medium 200 without further processing. When the header does not include classification information, new classification information can be provided by processing substantially identical to that shown in Fig. 8. In this case, the camera setting mode at the time of shooting is obtained by referring to the header of the received data. For example, the face information 1019 included in the header of the received data, or the information about the shooting mode (scene mode) included in the other shooting information 1009, can be referred to. The subject information can likewise be obtained by referring to the header of the received data, or can be newly detected by analyzing the received image data.
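The rule above can be sketched as follows: received data whose header already carries classification information is recorded as-is; otherwise new tags are derived from whatever the received header does carry. The dict-based header and the tag names are assumptions for illustration.

```python
def classify_received(header):
    # Data whose header already has classification information is untouched.
    if header.get("classification"):
        return header["classification"]
    # Otherwise derive tags from face information and scene mode, following
    # the same judgements as the processing of Fig. 8 (illustrative names).
    tags = []
    if header.get("face_info"):
        tags.append("people")
    if header.get("scene_mode") == "landscape":
        tags.append("landscape")
    return tags

print(classify_received({"classification": ["event"]}))  # → ['event']
print(classify_received({"face_info": [{"x": 1}], "scene_mode": "landscape"}))
# → ['people', 'landscape']
```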
After the writing is complete, the flow returns to step S1401, and the system controller 50 again checks whether a device with which to communicate exists and waits for a transmission request. If it is judged that no communicating device exists, the processing is exited.
Reproduction mode processing
Figure 15 is a flowchart illustrating exemplary processing in the reproduction mode shown in step S8 of Fig. 3. In step S1501, the system controller 50 obtains the latest image information from the storage medium 200. The advantage of obtaining the latest image information before calculating the total number of images is that an image can be displayed quickly after the reproduction mode starts.
Then, in step S1502, the system controller 50 judges whether the latest image information was obtained successfully. If it is judged that the acquisition was unsuccessful (NO in step S1502), the flow proceeds to step S1509, in which the system controller 50 enters an input wait state with no image. An example of the processing in step S1509 is described later with reference to the flowchart of Figure 16. One example of a case where the latest image information cannot be obtained is a state in which no image exists; another is a state in which a defect in the medium causes the acquisition to fail. When the latest image information is obtained successfully in step S1502, it is judged that at least one image exists, and the flow therefore proceeds to step S1503.
In step S1503, the system controller 50 reads the latest image data from the storage medium 200 based on the latest image information obtained in step S1501. Then, in step S1504, the system controller 50 performs file analysis and obtains the shooting information and attribute information of the latest image data that was read. An example of this file analysis processing is described later with reference to Figure 23. In step S1505, the system controller 50 displays the latest image data that was read, together with the shooting information and attribute information obtained in step S1504. If, according to the file analysis result in step S1504, it is judged that the read data is invalid due to, for example, file corruption, the system controller 50 also displays an error indication.
In step S1508, the system controller 50 enters an input wait state for reproduction. An example of this input wait state for reproduction is described later with reference to Figures 17A and 17B.
Input wait state with no image
An exemplary processing of the input wait state with no image, shown in step S1509 of Figure 15, is now described with reference to Figure 16. In step S1601, the system controller 50 displays a message meaning "no image" on the image display unit 28 to notify the user that there is no image data. Then, in step S1602, the system controller 50 waits for an operation input. Examples of operation inputs here include user operations on buttons or the battery cover and an event notifying that the power supply is low. If any operation input is detected, the flow proceeds to step S1603. In step S1603, the system controller 50 judges whether the operation input is an operation of the end button. If it is judged that the operation input is an operation of the end button (YES in step S1603), the reproduction mode processing is complete, and the flow proceeds to step S10 of Fig. 3. If it is judged that the operation input is an operation other than that of the end button (NO in step S1603), the flow proceeds to step S1604, in which the system controller 50 performs processing corresponding to the input. For example, even though there is no image data, if the input is an operation of the menu button, a menu is displayed on the image display unit 28, allowing the user to change settings.
Input wait state for reproduction
Referring now to Figure 17 A and 17B, an exemplary process for the input wait state reproduced is described.In step S1701, system controller 50 judges whether to input from the operation of user.The example of operation input used here comprises the event of the low electric power of operation that user carries out button or battery cover and notice power supply.System controller 50 is waited for, until any input detected.If detect any input, then flow process enters step S1702.
In step S1702, the system controller 50 judges whether the detected operation input is an operation of the search key setting button included in the operation unit 70. If it is judged that the operation input is an operation of the search key setting button (YES in step S1702), the flow proceeds to step S1703. In step S1703, the system controller 50 sets the next search key and stores it in the system memory 52. A search key is attribute information that serves as a unit of search. Examples of search keys include the shooting date, classification information, folder and moving image. That is, when searches by shooting date, classification information, folder and moving image are available, the shooting date, classification information, folder and moving image are selected in order as the search key for the images recorded on the storage medium 200. This sequential selection can include cancelling the selected search key, that is, switching to the reproduction mode for all images.
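The sequential selection above can be sketched as a cycle through the keys, with a final step that cancels the key and returns to reproduction of all images. The key names here are illustrative stand-ins for the attributes named in the text.

```python
# Ordered search keys; None represents the cancelled state (all images).
SEARCH_KEYS = ["date", "classification", "folder", "movie", None]

def next_search_key(current):
    # Each press of the search key setting button advances one step and
    # wraps around, so the cycle includes cancelling the key.
    return SEARCH_KEYS[(SEARCH_KEYS.index(current) + 1) % len(SEARCH_KEYS)]

key = None                 # start in the all-images reproduction mode
key = next_search_key(key)
print(key)                 # → date
key = next_search_key(key)
print(key)                 # → classification
```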
If it is judged that the operation input is not an operation of the search key setting button (NO in step S1702), the flow proceeds to step S1704. In step S1704, the system controller 50 judges whether the detected operation input is an operation of the image forward button included in the operation unit 70. If it is judged that the detected operation input is an operation of the image forward button (YES in step S1704), the flow proceeds to step S1705. In step S1705, the system controller 50 reads the next image to be displayed for the search key set in step S1703. The image forward button is composed of a pair of button parts, each indicating a direction in which an image to be displayed exists, and the next image to be displayed is read according to the direction corresponding to the pressed button part. Then, in step S1706, the system controller 50 performs file analysis on the shooting information and attribute information of the image data read in step S1705. An example of this file analysis is described later, mainly with reference to Figure 23. In step S1707, the system controller 50 displays the image data read in step S1705, together with the shooting information and attribute information based on the result of the file analysis in step S1706. If, according to the file analysis result of step S1706, it is judged that the read data is invalid due to, for example, file corruption, the system controller 50 also displays an error indication. After the display is complete, the system controller 50 returns to the input wait state in step S1701.
If it is judged that the detected operation input is not an operation of the image forward button (NO in step S1704), the flow proceeds to step S1709. In step S1709, the system controller 50 judges whether the calculation of the total number of images started in step S2103 of Figure 21 is complete. If it is judged that the calculation is not complete (NO in step S1709), the flow returns to step S1701, in which the system controller 50 waits for an operation input. At this time, a message or an icon notifying the user that the calculation is not complete may be displayed. In this way, before the calculation of the number of images is complete, only the image forward operation by the image forward button and the end operation by the end button are performed; other operation inputs are ignored until the calculation of the number of images is complete.
If it is judged that the calculation of the number of images is complete (YES in step S1709), the flow proceeds to step S1710. In step S1710, the system controller 50 judges whether the classification information setting menu has been selected by an operation on the operation unit 70. If it is judged that the classification information setting menu has been selected (YES in step S1710), the flow proceeds to step S1711. In step S1711, the system controller 50 performs the classification information setting mode processing. An example of this classification information setting mode processing is described later with reference to Figure 20.
If it is judged that the classification information setting menu has not been selected (NO in step S1710), the flow proceeds to step S1712. In step S1712, the system controller 50 judges whether the detected operation input is an operation of the erase button included in the operation unit 70. If it is judged that the detected operation input is an operation of the erase button (YES in step S1712), the flow proceeds to step S1713. In step S1713, the system controller 50 erases the image data currently displayed on the image display unit 28. Subsequently, in step S1714, the total number of images after the erasure is checked. If the total is 0 (YES in step S1714), the flow proceeds to step S1715, in which the system controller 50 returns to the input wait state with no image. This input wait state with no image was described above with reference to Figure 16.
If image data still remains after the erasure (NO in step S1714), the flow proceeds to step S1716, in which the system controller 50 reads the image data to be displayed next, in order to display it. Here, the image data to be displayed next is the image data with the file number following that of the erased image data. If the latest image data was erased, the image data with the file number preceding that of the erased image data is displayed instead. Subsequently, in step S1717, the system controller 50 performs file analysis on the image data read in step S1716 as the image data to be displayed, and obtains its shooting information and attribute information. An example of this file analysis is described later, mainly with reference to Figure 23. In step S1718, the system controller 50 displays the image data read in step S1716 on the image display unit 28, together with the shooting information and attribute information obtained in step S1717. If it is judged from the file analysis result in step S1717 that the read data is invalid due to, for example, file corruption, the system controller 50 also displays an error indication. After the display is complete, the system controller 50 returns to the input wait state in step S1701.
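The rule for choosing the next image after an erasure can be sketched as follows, using bare file numbers as an illustrative stand-in for the recorded images.

```python
def next_after_erase(file_numbers, erased):
    # Prefer the image with the next (higher) file number; when the newest
    # image was erased, fall back to the preceding file number.
    remaining = sorted(n for n in file_numbers if n != erased)
    later = [n for n in remaining if n > erased]
    return later[0] if later else remaining[-1]

print(next_after_erase([1, 2, 3, 4], 2))  # → 3 (the following file number)
print(next_after_erase([1, 2, 3, 4], 4))  # → 3 (the newest image was erased)
```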
If it is judged that the detected operation input is not an operation of the erase button (NO in step S1712), the flow proceeds to step S1719. In step S1719, the system controller 50 judges whether the detected operation input is an operation of the edit button. If it is judged that the detected operation input is an operation of the edit button (YES in step S1719), the flow proceeds to step S1720, in which the system controller 50 performs editing. An example of this editing is described later, mainly with reference to Figures 18A and 18B.
If it is judged that the detected operation input is not an operation of the edit button (NO in step S1719), the flow proceeds to step S1721. In step S1721, the system controller 50 judges whether the detected operation input is an operation of the end button. If it is judged that the detected operation input is an operation of the end button (YES in step S1721), the reproduction mode processing is complete, and the flow proceeds to step S10 of Fig. 3.
If it is judged that the detected operation input is not an operation of the end button (NO in step S1721), the flow proceeds to step S1724. In step S1724, the system controller 50 performs processing corresponding to the other operation inputs. Examples of this processing include editing the image, switching to multi-image reproduction, and displaying a menu when the menu button is pressed. Multi-image reproduction is a reproduction mode in which a set of thumbnail images is displayed on one screen of the image display unit 28.
Editing
An example of the editing processing in step S1720 of Figure 17 is now described with reference to Figures 18A and 18B. One example of the editing processing that can be performed is recording a new image file by trimming (cropping) the image of the image file displayed on the image display unit 28, or by converting its image size (resizing). This editing processing is described below with reference to the flowcharts shown in Figures 18A and 18B. In the following description, the editing is performed on the file whose file name is IMG_0002.JPG (905).
In step S1801, the system controller 50 obtains the image file name (IMG_0002.JPG) of the image data displayed on the image display unit 28. Then, in step S1802, the system controller 50 reads the image data corresponding to the obtained file name from the storage medium 200 into the memory 32. Then, in step S1803, the compression/decompression unit 16 decompresses the image data read in step S1802 under the control of the system controller 50, and stores the decompressed data in the memory 32.
Then, in step S1804, the system controller 50 judges whether the editing to be performed is resizing. If it is judged that the editing to be performed is resizing (YES in step S1804), the flow proceeds to step S1805. In step S1805, the system controller 50 uses the image processor 24 to enlarge or reduce the decompressed image data to a predetermined image size. Then, in step S1806, the system controller 50 compresses the resized image data using the compression/decompression unit 16 and stores it in the memory 32. Then, in step S1807, the system controller 50 obtains the classification information of the original image file read in step S1802 and stores it in the system memory 52. Using a predetermined menu screen, the user can specify the zoom factor for the enlargement or reduction.
Then, in step S1808, the system controller 50 temporarily sets the ON/OFF setting for automatically providing classification information to "OFF". The original ON/OFF setting for automatically providing classification information is recorded (saved) in a different area of the system memory 52. Then, in step S1809, the system controller 50 generates the header of the edited image data. More specifically, the header of the original image file read into the memory 32 is copied, and the header generation described with reference to Fig. 8 is performed for the newly generated image file using the copied header of the original image file. Because the ON/OFF setting for automatically providing classification information is set to "OFF", no classification information is provided automatically. Because the header is generated based on the header of the original image file, the image file newly generated by editing inherits the classification information of the original image file. Items such as the image size region and the creation date and time are changed as appropriate. Then, in step S1810, the system controller 50 restores the recorded (saved) ON/OFF setting for automatically providing classification information.
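The save-suppress-restore sequence of steps S1808 to S1810 can be sketched as follows, modelling the camera state and header as dicts; the field names are assumptions for illustration.

```python
import copy

def build_resized_header(camera, original_header, new_size):
    saved = camera["auto_provide"]
    camera["auto_provide"] = False            # step S1808: temporarily OFF,
                                              # so header generation adds no tags
    header = copy.deepcopy(original_header)   # step S1809: the edited file
                                              # inherits the classification
    header["image_size"] = new_size           # items such as size are updated
    camera["auto_provide"] = saved            # step S1810: restore the setting
    return header

camera = {"auto_provide": True}
src = {"classification": ["people"], "image_size": (4000, 3000)}
out = build_resized_header(camera, src, (1600, 1200))
print(out["classification"], camera["auto_provide"])  # → ['people'] True
```

The deep copy keeps the original header intact while the new header inherits its classification information, mirroring the behaviour described for resize editing.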
Like this, the generation of the view data of newly-generated image file is completed.Therefore, in step S1811, generate the file name of newly-generated image file.In the present embodiment, spanned file title IMG_0003.JPG.Then, in step S1812, image file generated in the above described manner writes on storage medium 200 by system controller 50, and completes editor.
For editing, such as resizing, that does not change the content of the image, the image file before editing and the image file after editing have the same classification information. Therefore, both the automatically provided classification information and any classification information the user provided when the original image data was shot are inherited by the edited image data, which makes operations (for example, searching) highly convenient even for the edited image data.
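The inheritance behavior of steps S1805 to S1810 can be illustrated with a minimal Python sketch. All data structures and names here are hypothetical stand-ins (the actual apparatus operates on JPEG file headers); the point shown is that a resize copies the original header, so classification information carries over unchanged while size-dependent items are updated:

```python
def resize_edit(original, new_size):
    """Sketch of the resize flow (steps S1805-S1810): the edited file
    inherits the original's classification information unchanged,
    because resizing does not alter the image content."""
    edited = {
        "size": new_size,                    # S1805: scale the pixels
        "header": dict(original["header"]),  # S1809: copy original header
    }
    # S1808 forces auto-assignment OFF during header generation, so no
    # new classification information is generated; the copied
    # (inherited) information is kept as-is.
    edited["header"]["size"] = new_size      # update size-dependent items
    return edited

src = {"size": (4000, 3000),
       "header": {"size": (4000, 3000), "classification": ["people"]}}
out = resize_edit(src, (1600, 1200))
print(out["header"]["classification"])  # ['people'] - inherited unchanged
```

Because the classification entry is copied rather than recomputed, a search key that matched the original file also matches the resized file.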
If it determines that the editing to be performed is not resizing (NO in step S1804), the flow proceeds to step S1813, where the system controller 50 determines whether the editing to be performed is cropping. If it determines that the editing to be performed is cropping (YES in step S1813), the flow proceeds to step S1814. In step S1814, the system controller 50 uses the image processor 24 to crop the decompressed image data to a specified size. Then, in step S1815, the system controller 50 performs face detection on the cropped image that remains after cropping. Then, in step S1816, the system controller 50 uses the image processor 24 to resize (enlarge/reduce) the cropped image. Then, in step S1817, the system controller 50 compresses the resized image data using the compression/decompression unit 16 and stores the compressed image data in the memory 32 again.
Then, in step S1818, the system controller 50 acquires the classification information of the original image file read in step S1802 and stores it in the system memory 52. A menu screen allows the user to specify the cropping position for the cropping and the zoom factor for the resizing (enlargement/reduction). Then, in step S1819, the system controller 50 generates the header of the edited image data. More specifically, the header of the original image file read into the memory 32 is copied, and, using the copied header of the original image file, header generation as described with reference to FIG. 8 is performed for the newly generated image file. If the setting for automatically providing classification information is "ON", classification information is automatically provided based on the face information detected in step S1815. Items such as the image size and the creation date and time are changed as appropriate.
In this way, the generation of the image data of the newly generated image file is completed. In step S1811, the file name of the newly generated image file is generated. In the present embodiment, the file name IMG_0003.JPG is generated. Then, in step S1812, the system controller 50 writes the image file generated in the above-described manner onto the storage medium 200, and the editing is completed.
For editing, such as cropping, that involves a change in the image content, classification information is provided anew based on the edited image. Therefore, operations (for example, searching) remain highly convenient even for the edited image data.
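The contrast with the resize case can be sketched as follows. This is a hedged illustration only: the face detector, region format, and label names are hypothetical, but the structure mirrors steps S1813 to S1819, where face detection runs on the cropped image (S1815) and classification information is re-derived rather than inherited:

```python
def crop_edit(original, region, detect_faces, auto_assign=True):
    """Sketch of the crop flow (S1813-S1819): because cropping changes
    the image content, classification information is re-derived from
    the cropped image instead of being inherited."""
    edited = {"region": region, "header": {}}
    faces = detect_faces(region)   # S1815: face detection on the crop
    if auto_assign:                # S1819: re-assign classification
        edited["header"]["classification"] = (
            ["people"] if faces else ["landscape"])
    return edited

# Hypothetical detector: a face survives only if it lies inside the
# crop region, given as (left, top, right, bottom).
face_positions = [(100, 100)]
detect = lambda r: [p for p in face_positions
                    if r[0] <= p[0] < r[2] and r[1] <= p[1] < r[3]]

with_face = crop_edit({}, (0, 0, 200, 200), detect)
without_face = crop_edit({}, (300, 300, 500, 500), detect)
print(with_face["header"]["classification"])     # ['people']
print(without_face["header"]["classification"])  # ['landscape']
```

The same source image thus yields different tags depending on what the cropping area retains, which is exactly the behavior shown later in FIGS. 19A to 19D.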
If it determines that the editing to be performed is not cropping (NO in step S1813), the flow proceeds to step S1820, where the system controller 50 performs other processing. Examples of the other processing include color conversion of the image, change of the image shape, and combination of images. Even in these cases, image analysis corresponding to the editing can be performed, and header generation can be carried out. The processing shown in FIG. 18 corresponds to exemplary processing performed by the editing unit and the re-providing unit.
A concrete example of cropping is now described with reference to FIGS. 19A to 19D. In FIG. 19A, reference numeral 1901 denotes an image before cropping. Reference numeral 1902 denotes a cropping area specified by a user operation for cropping. Reference numeral 1903 denotes a face-detection focusing frame at the time of shooting. Reference numeral 1904 denotes information that appears when the date-print setting is activated, indicating the shooting date. The image 1901 before cropping has the attributes "tag (classification information): people; number of people: 2; date print: yes (ON); face coordinates 1: left, 10 × 10; and face coordinates 2: center, 10 × 10".
FIG. 19B shows a cropped image 1905 that remains after cropping using the cropping area 1902. According to the flowcharts of FIGS. 18A and 18B, the attributes of the cropped image 1905 are "tag (classification information): people; number of people: 1; date print: no (OFF); face coordinates 1: center, 90 × 90; and face coordinates 2: none". The changed attribute information enables appropriate information display and searching when the image is reproduced. For example, as shown by the face-detection focusing frame 1906, even after cropping, it is possible to display which part of the subject was recognized as a face and determined to be a focusing frame.
In FIG. 19C, reference numeral 1911 denotes an image before cropping. Reference numeral 1912 denotes a cropping area specified by a user operation for cropping. Reference numeral 1913 denotes a face-detection focusing frame at the time of shooting. The image 1911 before cropping has the attributes "tag (classification information): people; number of people: 2; date print: yes; face coordinates 1: left, 10 × 10; and face coordinates 2: center, 10 × 10".
FIG. 19D shows a cropped image 1915 that remains after cropping using the cropping area 1912. According to the flowcharts of FIGS. 18A and 18B, the attributes of the cropped image 1915 are "tag (classification information): landscape; number of people: 0; date print: no; face coordinates 1: none; and face coordinates 2: none". The changed attribute information enables appropriate information display and searching when the image is reproduced. For example, according to the processing described with reference to FIGS. 18A and 18B, the image 1911 before cropping has the attribute "tag: people", whereas the cropped image 1915 has the attribute "tag: landscape". Therefore, the cropped image 1915, in which no person appears, is not retrieved with the search key "people", but can be retrieved with the search key "landscape".
The cropped images 1905 and 1915 both have the attribute "date print: no", changed from that of the respective images before cropping. For example, when a printer having a function of adding and printing the date is used to print an image in which the date is embedded, such as the images 1901 and 1911 before cropping, the printer can suppress date printing if "date print: yes" is set, to avoid overlapping printing of the date and time information. When the date-print portion is excluded from the image, as in the cropped images 1905 and 1915, changing the attribute to "date print: no" allows the printer side to appropriately add and print the date.
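The attribute updates shown for FIGS. 19A to 19D can be condensed into one hypothetical helper. The field names are illustrative stand-ins for the header attributes described above; the sketch assumes the cropped area excludes the stamped date, as in images 1905 and 1915:

```python
def update_attributes_after_crop(attrs, faces_in_crop):
    """Sketch of the attribute changes illustrated in FIGS. 19A-19D:
    the date-print flag is cleared (the stamped date is assumed to be
    cropped out), and the tag follows the faces remaining in the crop."""
    new = dict(attrs)
    new["date_print"] = False            # stamped date no longer in image
    new["face_count"] = len(faces_in_crop)
    new["label"] = "people" if faces_in_crop else "landscape"
    return new

before = {"label": "people", "face_count": 2, "date_print": True}
one_face = update_attributes_after_crop(before, [("center", 90, 90)])
no_face = update_attributes_after_crop(before, [])
print(one_face["label"], one_face["face_count"])  # people 1
print(no_face["label"], no_face["date_print"])    # landscape False
```

With "date_print" cleared, a printer that adds the date itself will print it, matching the behavior described for the printer side above.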
classification information setting mode processing
As described with reference to FIGS. 17A and 17B, in the digital camera 100 according to the present embodiment, the classification information setting mode processing is executed by selecting the information setting menu. FIG. 20 is a flowchart showing exemplary processing in the classification information setting mode.
In step S2001, the system controller 50 determines whether there is an operation input from the user. Examples of the operation input used here include an operation of a button or the battery cover by the user and an event notifying low power of the power supply. The system controller 50 waits until any input is detected.
If it determines that the detected operation input is a completion operation indicating that classification information setting is finished (YES in step S2002), the flow proceeds to step S2003. In the present embodiment, examples of the completion operation indicating that classification information setting is finished include an operation of the menu button of the operation unit 70 for exiting classification information setting, an operation for turning off the power, and an operation for changing the mode from the reproduction mode to the shooting mode. In step S2003, the system controller 50 writes the classification information of the image data changed in step S2011, described later, into the image file. The classification information setting mode is then completed, and the processing returns to the input wait state in step S1701 of FIG. 17.
If it determines that the detected operation input is an operation of the image forward button included in the operation unit 70 (YES in step S2005), the flow proceeds from step S2005 to step S2006. In step S2006, the system controller 50 writes the classification information of the image data changed in step S2011, described later, into the image file. Then, in step S2007, the system controller 50 reads the next image data to be displayed. The image forward button is composed of a pair of button portions (rightward and leftward in the present embodiment). The image data to be displayed next changes according to the selected direction.
Then, in step S2008, the system controller 50 performs file analysis on the image data read in step S2007 and acquires attribute information from the file. An example of this file analysis is described below mainly with reference to FIG. 23. In step S2009, the system controller 50 displays the read image data on the image display unit 28. At this time, shooting information and attribute information (for example, classification information) are displayed according to the settings. If, according to the result of the file analysis in step S2008, the read data is determined to be invalid due to, for example, file corruption, the system controller 50 also displays an error indication. After completing the display, the system controller 50 returns to step S2001 and enters the input wait state.
The image forward operation described in steps S2005 to S2009 can be applied to both single reproduction and multi-image reproduction (also called multi-image display), where single reproduction displays a single image on one screen and multi-image reproduction displays a plurality of images (for example, nine images) on one screen. In multi-image display, the cursor moves continuously in response to image-forward instructions, and in response to this movement, the classification information of the image data is written into the image file.
If it determines that the operation input detected in step S2010 is a classification information change operation (YES in step S2010), the flow proceeds from step S2010 to step S2011. In step S2011, the system controller 50 changes the classification information of the displayed image data. At this stage, the change in the classification information is not written into the image file but is stored in the memory 32. Then, in step S2012, the system controller 50 reflects the changed classification information in the display on the image display unit 28.
If it determines that the operation input detected in step S2010 is none of the above operations (NO in step S2010), the flow proceeds to step S2013, where other processing is performed. Examples of the other processing include switching between single reproduction and multi-image reproduction.
As described above, the classification information is written into the image file when the displayed image data is switched or when the classification information setting mode is completed. This reduces the number of accesses to the storage medium 200 and improves the operating speed.
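This deferred-write behavior is a familiar write-back caching pattern. The sketch below is a hypothetical model (the dictionaries stand in for the memory 32 and the storage medium 200): changes accumulate in memory (step S2011) and are flushed to storage only on image switch or mode completion (steps S2003 and S2006):

```python
class ClassificationEditor:
    """Sketch of the deferred write in FIG. 20: edits are held in
    memory and flushed to the file only on image switch or mode exit,
    reducing accesses to the storage medium."""
    def __init__(self, storage):
        self.storage = storage   # stands in for storage medium 200
        self.pending = {}        # stands in for memory 32

    def change(self, filename, info):
        self.pending[filename] = info        # S2011: memory only

    def flush(self, filename):
        if filename in self.pending:         # S2003/S2006: write out
            self.storage[filename] = self.pending.pop(filename)

storage = {"IMG_0001.JPG": ["people"]}
ed = ClassificationEditor(storage)
ed.change("IMG_0001.JPG", ["people", "event"])
print(storage["IMG_0001.JPG"])   # ['people'] - not yet written
ed.flush("IMG_0001.JPG")         # user advances to the next image
print(storage["IMG_0001.JPG"])   # ['people', 'event']
```

Repeated tag changes on the displayed image thus cost no storage writes; only the final state is committed when the display moves on.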
file management
FIG. 21 is a flowchart showing an example of the image file management processing in step S2 of FIG. 3. In step S2101, the system controller 50 clears the confirmation flag for the latest image recorded in the system memory 52. In step S2102, the system controller 50 clears the confirmation flag for the total number of files. In step S2103, the system controller 50 issues an instruction for starting an image search, so that the image search is performed in parallel with the above-described processing. This processing is then completed.
image search
FIG. 22 is a flowchart showing an example of the image search processing performed in response to the image-search start instruction issued in step S2103 of FIG. 21. When the start of an image search is instructed, the flow proceeds from step S2211 to step S2212. In step S2212, the system controller 50 generates a reproduction content list. For example, for a reproducing apparatus conforming to the DCF standard, this is processing that analyzes the directory entries of the DCF root, searches for DCF directories, and adds the DCF directories to the reproduction content list.
Then, in step S2213, the system controller 50 determines whether reproduction content exists. If it determines that no reproduction content exists, that is, there is no directory or file that can be processed by the digital camera 100 (NO in step S2213), the system controller 50 sets the total number of files to 0. Thereafter, in step S2222, the system controller 50 sets the confirmation flag for the total number of files to 1, and this processing is completed.
If it determines that reproduction content exists (YES in step S2213), the system controller 50 initializes the image search directory in step S2214. For a reproducing apparatus conforming to the DCF standard, for example, the DCF directory with the largest number is set as the image search directory. Then, in step S2215, the system controller 50 analyzes the directory entries of the directory to calculate the total number of images in the directory set as the target of the image search. The calculated total number of images in the directory is added to the total number of images in the storage medium 200.
In step S2216, the system controller 50 acquires the information described in the directory entries of the DCF root. Specifically, the minimum file number, the maximum file number, the sum of the file numbers, the sum of the time stamps, the sum of the file sizes, the total number of files, and other items are acquired. These items are stored in the system memory 52 as directory entry information.
Then, in step S2217, the system controller 50 determines whether a reproducible image file, that is, a file that can be processed by the digital camera 100, exists. If it determines that a reproducible image file exists (YES in step S2217), the flow proceeds to step S2218, where the system controller 50 determines the latest image and sets the latest-image confirmation flag to 1.
If there is an instruction, given by operating a button, to terminate the calculation of the total (YES in step S2220), the flow proceeds to step S2222 and exits this processing. If there is no instruction, given by operating a button, to terminate the calculation of the total (NO in step S2220), the system controller 50 determines in step S2221 whether an unsearched directory exists. If it determines that an unsearched directory exists (YES in step S2221), the flow proceeds to step S2219. In step S2219, the unsearched directory is set as the image search directory, and the flow returns to step S2215. In this way, the processing of steps S2215 to S2218 is performed for all directories in the reproduction content list generated in step S2212. After the processing of steps S2215 to S2218 has been completed for all directories, the flow proceeds to step S2222. In step S2222, the system controller 50 issues a latest-image confirmation notification, calculates the total number of images, sets the confirmation flag for the total number of files, and exits this processing.
Even when reproduction content exists, if there is no reproducible image in the directories, that is, the total number of images in the directories is 0, the confirmation flag for the total number of files is set and this processing is exited.
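The accumulation loop of FIG. 22 can be sketched as follows. The list-of-lists representation of DCF directories and the stop callback are hypothetical simplifications; the sketch shows only the per-directory totaling (step S2215), the iteration over unsearched directories (steps S2219 and S2221), and early termination (step S2220):

```python
def count_images(directories, stop_requested=lambda: False):
    """Sketch of the image-search loop (FIG. 22): iterate over the
    reproduction content list (DCF directories), accumulating the
    total image count, with optional early termination (S2220)."""
    total = 0
    for images in directories:   # S2219/S2221: next unsearched directory
        total += len(images)     # S2215: add this directory's count
        if stop_requested():     # S2220: user aborted the totaling
            break
    return total

dirs = [["IMG_0001.JPG", "IMG_0002.JPG"], ["IMG_0003.JPG"]]
print(count_images(dirs))  # 3
print(count_images([]))    # 0 - no reproducible content (NO in S2213)
```

Running the count in parallel with the file management processing, as step S2103 describes, lets the camera display images before the total is final.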
file analysis
An example of the file analysis processing in step S1504 of FIG. 15, steps S1706 and S1717 of FIGS. 17A and 17B, and step S2008 of FIG. 20 is now described with reference to FIG. 23. In step S2301, the system controller 50 determines whether the file to be evaluated has a file header in which shooting information and attribute information (for example, classification information) are described. If it determines that the file has such a file header (YES in step S2301), the system controller 50 acquires the shooting information from the file header in step S2302, and acquires the classification information from the file header in step S2303. In step S2304, the system controller 50 acquires information about the image data body, such as the starting position of the image body and the image compression method.
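The file analysis of FIG. 23 can be sketched with a hypothetical dictionary standing in for a parsed image file (the real apparatus reads an Exif-style binary header; the field names here are illustrative):

```python
def analyze_file(file):
    """Sketch of the file analysis flow (FIG. 23): if the file has a
    header, extract shooting information and classification
    information from it, then locate the image-data body."""
    result = {}
    header = file.get("header")   # S2301: does a header exist?
    if header:
        result["shooting_info"] = header.get("shooting_info")
        result["classification"] = header.get("classification")
    result["body_offset"] = file.get("body_offset")  # S2304: image body
    return result

f = {"header": {"shooting_info": {"date": "2012-09-19"},
                "classification": ["people"]},
     "body_offset": 4096}
print(analyze_file(f)["classification"])  # ['people']
```

A file without such a header simply yields no attribute information, which is what allows step S2009 to fall back to an error indication for corrupt data.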
As described above, according to the present embodiment, when an image is edited, classification information is provided anew to the edited image. Therefore, appropriate classification information can be provided to the edited image.
second embodiment
The second embodiment is described below with reference to the drawings.
A main difference from the first embodiment is that, in the present embodiment, shooting information such as the focusing frame and the shooting date is provided to the image in addition to the classification information, whereas in the first embodiment only the classification information is provided to the image. The following description focuses on this difference.
Detailed descriptions of parts that are the same as in the first embodiment, such as the structure of the digital camera, the processing in shooting, recording, reproduction, and each mode, and the file structure, are omitted.
header generation
Another example of the header generation processing in step S706 of FIG. 7 is now described with reference to FIG. 24.
In step S2401, the system controller 50 acquires from the system memory 52 the ON/OFF setting for automatically providing classification information specified in step S405 of FIG. 4, and determines whether classification information is to be automatically provided to the shot image data. If it determines that the ON/OFF setting for automatically providing classification information is set to "OFF", that is, classification information is not automatically provided (NO in step S2401), the flow proceeds to step S2409.
If it determines that the ON/OFF setting for automatically providing classification information is set to "ON", that is, classification information is automatically provided (YES in step S2401), the flow proceeds to step S2402. In step S2402, the system controller 50 reads the face information held in the system memory 52 in step S406 of FIG. 4 and determines whether a face was detected. If it determines that a face was detected (YES in step S2402), the flow proceeds to step S2404, where the classification information "people" is provided. If it determines that no face was detected (NO in step S2402), the flow proceeds to step S2403.
In step S2403, the system controller 50 refers to the scene mode at the time of shooting stored in the system memory 52, and determines whether the scene mode is any of "portrait mode", "night snapshot mode", and "kids & pets mode". If it determines that the scene mode is any of these (YES in step S2403), the flow proceeds to step S2404, where the system controller 50 provides the classification information "people" to the image data. After the classification information "people" is provided in step S2404, or if it determines that the scene mode is none of these (NO in step S2403), the flow proceeds to step S2405.
As described above, in steps S2402 to S2404, the same classification information "people" is provided according to face information, which is an example of subject information, and the scene mode, which is an example of a camera setting mode at the time of shooting. Subject information and the camera setting mode at the time of shooting are different parameters at shooting time, but depending on their content they can have similar meanings after shooting. Face information, as one kind of subject information, and "portrait mode", "night snapshot mode", and "kids & pets mode", as camera setting modes at the time of shooting, all have the same meaning: "a person is presumed to have been shot". Therefore, providing the same classification information to image data having such information enhances the convenience of post-shooting operations (for example, search operations). That is, by providing the same classification information according to specific subject information and specific camera setting modes, classification information that differs from the shooting-time parameters and is suited to post-shooting operations (for example, search operations) can be provided. This can enhance convenience.
In addition, the above classification information provision processing can provide the same classification information for the different scene modes of portrait mode, night snapshot mode, and kids & pets mode. Different scene modes correspond to different camera setting modes at the time of shooting, but they can have similar meanings. Portrait mode, night snapshot mode, and kids & pets mode all have the same meaning: "a person is presumed to have been shot". Therefore, providing the same classification information to such image data enhances the convenience of post-shooting operations (for example, search operations). That is, by providing the same classification information for a plurality of specific setting modes among the camera setting modes at the time of shooting, classification information that differs from the shooting-time parameters and is suited to post-shooting operations (for example, search operations) can be provided. This can enhance the convenience of post-shooting operations.
Referring back to FIG. 24, in step S2405, the system controller 50 determines whether the scene mode is any of "foliage mode", "landscape mode", and "fireworks mode". If it determines that the scene mode is any of these (YES in step S2405), the flow proceeds to step S2406, where the system controller 50 provides the classification information "landscape" to the image data. After the classification information "landscape" is provided in step S2406, or if it determines that the scene mode is none of these (NO in step S2405), the flow proceeds to step S2407.
In step S2407, the system controller 50 determines whether the scene mode is any of "party mode", "snow mode", "beach mode", "fireworks mode", "aquarium mode", and "underwater mode". In these modes, the shooting of an event is presumed. If it determines that the scene mode is any of these (YES in step S2407), the flow proceeds to step S2408, where the system controller 50 provides the classification information "event" to the image data.
In the above processing, two types of information, "landscape" and "event", are provided to image data shot in "fireworks mode". That is, a plurality of types of information are provided according to a single scene mode. Even under the same camera setting mode (scene mode) at the time of shooting, the shot image data can have multiple meanings. One such example is an image shot in "fireworks mode". In this case, the system controller 50 provides a plurality of types of classification information corresponding to the post-shooting meanings. Therefore, classification information that differs from the shooting-time parameters and is suited to post-shooting operations (for example, search operations) can be provided. This can enhance the convenience of post-shooting operations of the digital camera 100.
In "auto mode", "manual mode", or any other mode for which NO is determined in all of steps S2403, S2405, and S2407, no classification information is provided.
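The decision cascade of steps S2402 to S2408 amounts to a mapping from subject information and scene mode to a set of labels, in which one mode ("fireworks") contributes two labels. The following sketch uses hypothetical mode names (English renderings of the modes above) and returns a set to make the multi-label case explicit:

```python
def auto_classify(face_detected, scene_mode):
    """Sketch of the automatic classification logic (S2402-S2408):
    subject information and the scene mode both map to post-shooting
    labels; "fireworks" yields two labels ("landscape" and "event")."""
    labels = set()
    if face_detected:                                   # S2402 -> S2404
        labels.add("people")
    if scene_mode in ("portrait", "night snapshot", "kids & pets"):
        labels.add("people")                            # S2403 -> S2404
    if scene_mode in ("foliage", "landscape", "fireworks"):
        labels.add("landscape")                         # S2405 -> S2406
    if scene_mode in ("party", "snow", "beach", "fireworks",
                      "aquarium", "underwater"):
        labels.add("event")                             # S2407 -> S2408
    return labels  # empty for auto/manual modes with no face

print(auto_classify(False, "fireworks"))  # {'landscape', 'event'}
print(auto_classify(True, "auto"))        # {'people'}
print(auto_classify(False, "manual"))     # set()
```

Using a set also captures the point that face detection and a people-oriented scene mode do not duplicate the "people" label when both apply.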
After the processing for providing the classification information in the header is completed, the flow proceeds to step S2409.
In step S2409, if the system controller 50 determines that this header generation is being performed during editing and uses classification information to generate the header, the flow proceeds to step S2410.
In step S2410, the coordinate information described in the header before editing is converted. Here, the coordinate information is information, such as face information and focusing frame information, about the position and area of a face in the image. In editing that involves a change of the angle of view, such as cutting out by cropping or combining images, the coordinate information of the image before editing is inappropriate for the edited image. To address this problem, in the case of cropping, the coordinate information is converted for the edited image based on the position and size of the cropped portion.
Then, in step S2411, invalid coordinate information is determined, and priorities are changed. The determination of invalid coordinate information is made by determining whether the position and area coordinates converted in step S2410 are contained in the edited image. For example, conditions for determining invalid coordinate information include: the converted position and area coordinates partially extend beyond the edited image; the converted position and area coordinates extend completely beyond the edited image; and the center of the area of the converted coordinates lies outside the edited image.
If items such as focusing frame information have priorities, such as the degree of focus at the time of shooting, the priorities are reset. For example, when the main focusing frame is determined to be invalid coordinate information, the focusing frame with the next priority is reset as the main focusing frame. When there is no focusing frame with a next priority after the main focusing frame is determined to be invalid coordinate information, the image has no focusing frame coordinates.
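Steps S2410 and S2411 can be sketched together. The coordinate format (x, y, width, height, priority) and the region-center validity test are illustrative choices among the conditions listed above; the sketch translates each frame into the cropped image's coordinate system, discards frames whose center falls outside the crop, and renumbers the surviving priorities:

```python
def transform_face_coords(coords, crop_origin, crop_size):
    """Sketch of S2410-S2411: shift each face/focusing-frame
    coordinate into the cropped image's frame, invalidate entries
    whose region center falls outside the crop, and reassign
    priorities so a surviving frame becomes the main frame."""
    ox, oy = crop_origin
    w, h = crop_size
    kept = []
    for (x, y, fw, fh, priority) in coords:
        nx, ny = x - ox, y - oy              # S2410: translate
        cx, cy = nx + fw / 2, ny + fh / 2    # region center
        if 0 <= cx < w and 0 <= cy < h:      # S2411: validity check
            kept.append([nx, ny, fw, fh, priority])
    # Reassign priorities: the best surviving frame becomes priority 1.
    for new_p, frame in enumerate(sorted(kept, key=lambda f: f[4]), 1):
        frame[4] = new_p
    return kept

# Mirrors FIG. 25B: the priority-2 frame is cropped away and the
# former priority-1 frame survives as the sole (main) frame.
frames = [(10, 10, 10, 10, 2), (100, 100, 10, 10, 1)]
survivors = transform_face_coords(frames, (90, 90), (120, 120))
print(survivors)  # [[10, 10, 10, 10, 1]]
```

An empty result corresponds to the "no focusing frame coordinates" case, which step S2415 then records as an out-of-focus image.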
Then, in step S2412, the coordinate information is set in the header. This is the coordinate information converted during editing as described for steps S2410 and S2411, or, when that processing is not performed during editing in step S2409, the coordinate information determined at the time of shooting.
Then, in step S2413, it is determined whether focusing frame coordinates exist. Here, the focusing frame coordinates are the coordinates processed at the time of shooting for a shot image, or, when this processing is performed during editing, the focusing frame coordinates retained in step S2412. If focusing frame coordinates are determined to exist (YES in step S2413), the image is set as an in-focus image in the header in step S2414. If no focusing frame coordinates are determined to exist (NO in step S2413), the image is set as an out-of-focus image in the header in step S2415.
Then, in step S2416, the system controller 50 generates the header using the classification information and the set values (for example, information about the shooting date and time). Classification information changed by the user is not overwritten.
For example, this can be achieved by the following determination: classification information that differs from the last automatically provided classification information described in the header is determined to be classification information changed by the user, and the result of classification information detection in this processing is not reflected for it.
For example, when the automatically provided classification information is "landscape" and the classification information is set to "landscape" and "people", it is determined that the user set "people"; therefore, the attribute "people" is not changed, regardless of the result of automatic provision.
When the current result of automatic provision differs from the content of the classification information described in the header, the user may be asked on a GUI to select which attribute to provide.
When a flag enabling date printing on the image is set, the flag is cleared. The flag for date printing may be changed only when it is determined that the printed date is not included in the cropped area. Referring back to FIG. 24, when the ON/OFF setting for automatically providing classification information is determined to be OFF (NO in step S2401), the setting of classification information (steps S2402 to S2408) is skipped, so a header without classification information is generated.
As described above, when an image is edited, the coordinate information (for example, focusing information) is converted into coordinate positions in the edited image, and its validity is then determined. Therefore, an image that has lost the focusing frame information, which is one of its attributes, is determined to be an out-of-focus image, and appropriate attributes can thus be provided.
cropping
Cropping according to the second embodiment is described below.
In step S1804, the system controller 50 determines whether the editing to be performed is resizing. If it determines that the editing to be performed is not resizing (NO in step S1804), the flow proceeds to step S1813. In step S1813, the system controller 50 determines whether the editing to be performed is cropping. If it determines that the editing to be performed is cropping (YES in step S1813), the flow proceeds to step S1814.
In step S1814, the system controller 50 uses the image processor 24 to crop the decompressed image data to a specified size. Then, in step S1815, the system controller 50 performs face detection on the cropped image that remains after cropping. Then, in step S1816, the system controller 50 uses the image processor 24 to resize (enlarge/reduce) the cropped image. Then, in step S1817, the system controller 50 compresses the resized image data using the compression/decompression unit 16 and stores the compressed image data in the memory 32 again.
Then, in step S1818, the system controller 50 acquires the classification information of the original image file read in step S1802 and stores it in the system memory 52. A menu screen allows the user to specify the position to be cut out in the cropping and the zoom factor for the resizing (enlargement/reduction). Then, in step S1819, the system controller 50 generates the header of the edited image data. More specifically, the header of the original image file read into the memory 32 is copied, and, using the copied header of the original image file, header generation as described with reference to FIG. 24 is performed for the newly generated image file. If the setting for automatically providing classification information is "ON", classification information is automatically provided based on the face information detected in step S1815. Items such as the image size and the creation date and time are changed as appropriate.
In this way, the generation of the image data of the newly generated image file is completed. In step S1811, the file name of the newly generated image file is generated. In the present embodiment, the file name IMG_0003.JPG is generated. Then, in step S1812, the system controller 50 writes the image file generated in the above-described manner onto the storage medium 200, and this editing is completed.
For editing, such as cropping, that involves changing the content of the image, classification information is provided anew based on the edited image. Therefore, operations (for example, searching) remain highly convenient even for the edited image data.
If it determines that the editing to be performed is not cropping (NO in step S1813), the flow proceeds to step S1820, where the system controller 50 performs other processing. Examples of the other processing include color conversion of the image, change of the image shape, and combination of images. Even in these cases, image analysis corresponding to the editing can be performed, and header generation can be carried out.
The object lesson cut is described referring now to Figure 25 A ~ 25D.
At Figure 25 A, Reference numeral 2501 expression cuts front image.Reference numeral 2502 represents and cuts appointed area by what cut by the operation from user.Reference numeral 2503 represents face detection focusing frame during shooting.Reference numeral 2504 represents the information activating and occur when date print is arranged, and it represents shooting date.Cut front image 2501 and there is attribute " label (classified information): people; Number: 2; Date print: have; Face coordinate 1: left, 10 × 10; Face coordinate 2: in, 10 × 10; Focusing frame coordinate 1: left, 10 × 10; Priority 2; Focusing frame coordinate 2: in, 10 × 10, priority 1; And focus image: be ".
Figure 25 B illustrate use cut after appointed area 2502 cuts retain cut image 2505.According to the flow chart of Figure 18 A and 18B, the attribute cutting image 2505 is " label (classified information): people; Number: 1; Date print: nothing; Face coordinate 1: in, 90 × 90; And facial coordinate 2: nothing; Focusing frame coordinate 1: in, 90 × 90, priority 1: focusing frame coordinate 2: nothing: and focus image: be ".Attribute information after change makes can carry out suitable information displaying and search when reproduced image.Such as, as face detects as shown in focusing frame 2506, even if after cutting, also can show which part in subject and be identified as face and this part be judged as frame of focusing.
In Fig. 25C, reference numeral 2511 denotes an image before cropping. Reference numeral 2512 denotes a cropping designation area to be cut out by a user operation. Reference numeral 2513 denotes a face-detection focusing frame at the time of shooting. The pre-crop image 2511 has the attributes "tag (classification information): person; number of persons: 2; date print: yes; face coordinate 1: left, 10 × 10; face coordinate 2: center, 10 × 10; focusing frame coordinate 1: left, 10 × 10, priority 2; focusing frame coordinate 2: center, 10 × 10, priority 1; and in-focus image: yes".
Fig. 25D shows a cropped image 2515 that remains after cropping using the cropping designation area 2512. According to the flowcharts of Figs. 18A and 18B, the attributes of the cropped image 2515 are "tag (classification information): landscape; number of persons: 0; date print: none; face coordinate 1: none; face coordinate 2: none; focusing frame coordinate 1: none; focusing frame coordinate 2: none; and in-focus image: no". The changed attribute information enables appropriate information display and search when the image is reproduced. For example, according to the processing described with reference to Fig. 18, the pre-crop image 2511 has the attribute "tag: person", whereas the cropped image 2515 has the attribute "tag: landscape". Therefore, a search using the keyword "person" does not retrieve the cropped image 2515, in which no person appears, while a search using the keyword "landscape" does retrieve the cropped image 2515.
Both cropped images 2505 and 2515 have the attribute "date print: none", changed from that of the respective pre-crop images. For example, when a printer having a function of adding and printing the date is used, and the image to be printed already has the date embedded, as in the pre-crop images 2501 and 2511, setting "date print: yes" allows the printer to suppress its own date printing so that overlapping date information is not printed. When the date-printed portion has been excluded from the image, as in the cropped images 2505 and 2515, changing the attribute to "date print: none" allows the printer side to appropriately add and print the date.
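The attribute regeneration illustrated by Figs. 25B and 25D can be sketched as below. The field names and dictionary layout are illustrative assumptions, not the format actually recorded by the apparatus.

```python
def regenerate_attributes(attrs, faces_after_crop, date_inside_crop):
    """Rebuild classification attributes for a cropped image.

    attrs: attribute dict of the pre-crop image.
    faces_after_crop: list of face regions still visible after the crop.
    date_inside_crop: whether the embedded date imprint survived the crop.
    """
    new_attrs = dict(attrs)
    new_attrs["number"] = len(faces_after_crop)
    # Fig. 25D: when no face remains, the tag changes from "person"
    # to "landscape", so a search for "person" no longer matches.
    new_attrs["tag"] = "person" if faces_after_crop else "landscape"
    # Figs. 25B/25D: the embedded date was cropped away, so the flag is
    # cleared and the printer may add the date itself without overlap.
    new_attrs["date_print"] = bool(date_inside_crop)
    return new_attrs

before = {"tag": "person", "number": 2, "date_print": True}
print(regenerate_attributes(before, [], date_inside_crop=False))
# {'tag': 'landscape', 'number': 0, 'date_print': False}
```

A real implementation would also rewrite the face and focusing-frame coordinate fields, as in the previous sketch.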
As described above, according to the present embodiment, appropriate shooting information can be provided even after an image has been edited.
The present invention can also be realized by supplying a system or apparatus with a storage medium storing program code of software that implements at least one of the above-described embodiments. In this case, a computer (or a CPU or micro processing unit (MPU)) of the system or apparatus reads the program code stored in the storage medium.
In this case, the program code itself read from the storage medium realizes the functions of the above-described embodiments, and the program code and the storage medium storing the program code fall within the scope of the present invention.
Examples of the storage medium for supplying the program code include a floppy disk, a hard disk, a magneto-optical disk (MO), a compact disc read-only memory (CD-ROM), a recordable compact disc (CD-R), magnetic tape, a nonvolatile memory card, and a ROM.
The case where the functions of at least one of the above-described embodiments are realized by a computer executing the read program code is also included within the scope of the present invention. For example, the case where an operating system (OS) running on the computer performs part or all of the actual processing according to instructions of the program code, thereby realizing the functions of at least one of the above-described embodiments, is also included within the scope of the present invention.
Furthermore, there is a case where the program code read from the storage medium is written into a memory included on an expansion board inserted into the computer or in a function expansion unit connected to the computer. In this case, for example, a CPU included in the function expansion unit performs part or all of the actual processing according to instructions of the program code, and this is also included within the scope of the present invention.
In the foregoing, an example of applying the present invention to a digital camera has been described. The application is not limited to this example. The present invention is applicable to any apparatus capable of reproducing images, such as a printer, a mobile phone, or a mobile terminal.
According to the present embodiment, when an image is edited, classification information corresponding to the edited image is provided again according to the content of the editing. As a result, appropriate classification information can be provided to the edited image.
According to the present embodiment, appropriate shooting information can be provided even after an image has been edited.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.

Claims (5)

1. An image processing apparatus comprising:
an editing unit configured to crop an image to which attribute information relating to a person or a face of a person has been provided;
a face detection unit configured to detect a face from the cropped image; and
a control unit configured to perform control, when a face is detected from the cropped image, to record the attribute information relating to a person or a face of a person that was provided before the image was cropped, and to perform control, when no face is detected from the cropped image, not to record the attribute information relating to a person or a face of a person that was provided before the image was cropped.
2. The image processing apparatus according to claim 1, wherein, when a part of the face is cut off in the cropping process performed by the editing unit, the control unit cancels the classification information relating to the face that the control unit has provided.
3. The image processing apparatus according to claim 1 or 2, wherein, when at least a part of the face lies outside the cropped image, the control unit cancels the classification information relating to the face.
4. The image processing apparatus according to claim 1 or 2, further comprising:
an image pickup unit configured to capture an image; and
a recording unit configured to record the image captured by the image pickup unit on a storage medium together with the attribute information relating to a person or a face of a person,
wherein the editing unit reads an image recorded on the storage medium and crops the read image.
5. An image processing method comprising the steps of:
cropping an image having attribute information relating to a person or a face of a person;
detecting a face from the cropped image; and
performing control, when a face is detected from the cropped image, to store the attribute information relating to a person or a face of a person that was provided before the image was cropped, and performing control, when no face is detected from the cropped image, not to store the attribute information relating to a person or a face of a person that was provided before the image was cropped.
CN201210350890.XA 2007-08-10 2008-08-11 Image processing equipment and control method, image processing method Expired - Fee Related CN102891965B (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
JP2007210242 2007-08-10
JP2007-210242 2007-08-10
JP2008-117296 2008-04-28
JP2008117296A JP5014241B2 (en) 2007-08-10 2008-04-28 Imaging apparatus and control method thereof
CN2008101444945A CN101365064B (en) 2007-08-10 2008-08-11 Image pickup apparatus and image pickup method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN2008101444945A Division CN101365064B (en) 2007-08-10 2008-08-11 Image pickup apparatus and image pickup method

Publications (2)

Publication Number Publication Date
CN102891965A CN102891965A (en) 2013-01-23
CN102891965B true CN102891965B (en) 2016-02-03

Family

ID=40391178

Family Applications (4)

Application Number Title Priority Date Filing Date
CN201210350890.XA Expired - Fee Related CN102891965B (en) 2007-08-10 2008-08-11 Image processing equipment and control method, image processing method
CN201510355561.8A Active CN105049660B (en) 2007-08-10 2008-08-11 Image processing equipment and its control method
CN2008101444945A Active CN101365064B (en) 2007-08-10 2008-08-11 Image pickup apparatus and image pickup method
CN201510354747.1A Expired - Fee Related CN105007391B (en) 2007-08-10 2008-08-11 Image processing equipment and image processing method

Family Applications After (3)

Application Number Title Priority Date Filing Date
CN201510355561.8A Active CN105049660B (en) 2007-08-10 2008-08-11 Image processing equipment and its control method
CN2008101444945A Active CN101365064B (en) 2007-08-10 2008-08-11 Image pickup apparatus and image pickup method
CN201510354747.1A Expired - Fee Related CN105007391B (en) 2007-08-10 2008-08-11 Image processing equipment and image processing method

Country Status (2)

Country Link
JP (2) JP5014241B2 (en)
CN (4) CN102891965B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011109469A (en) 2009-11-18 2011-06-02 Canon Inc Content receiving apparatus, and method of controlling the same
JP5529568B2 (en) 2010-02-05 2014-06-25 キヤノン株式会社 Image processing apparatus, imaging apparatus, control method, and program
JP5550989B2 (en) * 2010-05-25 2014-07-16 オリンパスイメージング株式会社 Imaging apparatus, control method thereof, and program
JP6075819B2 (en) 2011-12-19 2017-02-08 キヤノン株式会社 Image processing apparatus, control method therefor, and storage medium
KR102360424B1 (en) * 2014-12-24 2022-02-09 삼성전자주식회사 Method of detecting face, method of processing image, face detection device and electronic system including the same
CN108491535B (en) * 2018-03-29 2023-04-07 北京小米移动软件有限公司 Information classified storage method and device
JP7213657B2 (en) * 2018-11-05 2023-01-27 キヤノン株式会社 IMAGING DEVICE, CONTROL METHOD AND PROGRAM THEREOF

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1950722A (en) * 2004-07-30 2007-04-18 松下电工株式会社 Individual detector and accompanying detection device
WO2007060980A1 (en) * 2005-11-25 2007-05-31 Nikon Corporation Electronic camera and image processing device

Family Cites Families (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10232769A (en) * 1997-02-19 1998-09-02 Hitachi Ltd Class addition supporting method
US7453498B2 (en) * 1998-03-26 2008-11-18 Eastman Kodak Company Electronic image capture device and image file format providing raw and processed image data
JP2002215643A (en) * 2001-01-15 2002-08-02 Minolta Co Ltd Image classification program, computer readable recording medium recording image classification program, and method and device for classifying image
JP3984029B2 (en) * 2001-11-12 2007-09-26 オリンパス株式会社 Image processing apparatus and program
US7289132B1 (en) * 2003-12-19 2007-10-30 Apple Inc. Method and apparatus for image acquisition, organization, manipulation, and publication
JP3826043B2 (en) * 2002-01-31 2006-09-27 キヤノン株式会社 Information processing apparatus and method
JP2004172655A (en) * 2002-11-15 2004-06-17 Fuji Photo Film Co Ltd Image processing apparatus and electronic camera
JP4366083B2 (en) * 2003-01-21 2009-11-18 キヤノン株式会社 Information processing apparatus, information processing method, and program
JP4312534B2 (en) * 2003-08-01 2009-08-12 富士フイルム株式会社 Signal processing device
CN100393097C (en) * 2003-11-27 2008-06-04 富士胶片株式会社 Apparatus, method, and program for editing images
JP2005223758A (en) * 2004-02-06 2005-08-18 Canon Inc Image processing apparatus, control method thereof, computer program, and recording medium
JP2005250716A (en) * 2004-03-03 2005-09-15 Canon Inc Image processing system
US8659619B2 (en) * 2004-03-26 2014-02-25 Intellectual Ventures Fund 83 Llc Display device and method for determining an area of importance in an original image
US20050228825A1 (en) * 2004-04-06 2005-10-13 Tsun-Yi Yang Method for managing knowledge from the toolbar of a browser
JP2006025238A (en) * 2004-07-08 2006-01-26 Fuji Photo Film Co Ltd Imaging device
JP2006060652A (en) * 2004-08-23 2006-03-02 Fuji Photo Film Co Ltd Digital still camera
JP2006101156A (en) * 2004-09-29 2006-04-13 Casio Comput Co Ltd Information processing device and program
JP4591038B2 (en) * 2004-10-28 2010-12-01 カシオ計算機株式会社 Electronic camera, image classification device, and program
KR100601709B1 (en) * 2004-11-22 2006-07-18 삼성전자주식회사 Apparatus and method for trimming a picture in digital camera
JP2006191302A (en) * 2005-01-05 2006-07-20 Toshiba Corp Electronic camera device and its operation guiding method
JP2006279252A (en) * 2005-03-28 2006-10-12 Fuji Photo Film Co Ltd Image trimming apparatus, method and program
JP4244972B2 (en) * 2005-08-02 2009-03-25 ソニー株式会社 Information processing apparatus, information processing method, and computer program
US8233708B2 (en) * 2005-08-17 2012-07-31 Panasonic Corporation Video scene classification device and video scene classification method
JP2007121654A (en) * 2005-10-27 2007-05-17 Eastman Kodak Co Photographing device
JP4232774B2 (en) * 2005-11-02 2009-03-04 ソニー株式会社 Information processing apparatus and method, and program
JP2007213231A (en) * 2006-02-08 2007-08-23 Fujifilm Corp Image processor
JP4043499B2 (en) * 2006-09-06 2008-02-06 三菱電機株式会社 Image correction apparatus and image correction method


Also Published As

Publication number Publication date
CN105049660B (en) 2018-05-29
CN105007391A (en) 2015-10-28
CN105007391B (en) 2018-01-30
JP5014241B2 (en) 2012-08-29
CN105049660A (en) 2015-11-11
JP2009065635A (en) 2009-03-26
CN102891965A (en) 2013-01-23
CN101365064B (en) 2012-10-24
JP5490180B2 (en) 2014-05-14
JP2012178879A (en) 2012-09-13
CN101365064A (en) 2009-02-11

Similar Documents

Publication Publication Date Title
JP4958759B2 (en) Display control device, display control device control method, program, and recording medium
US8451347B2 (en) Image processing apparatus, image playing method, image pick-up apparatus, and program and storage medium for use in displaying image data
US9609203B2 (en) Image pickup apparatus and image pickup method
JP5043390B2 (en) Image playback device and program
CN102891965B (en) Image processing equipment and control method, image processing method
JP4818033B2 (en) Image playback apparatus, control method thereof, and program
CN101924876B (en) Imaging apparatus and control method thereof, and image processing apparatus and control method thereof
JP4810376B2 (en) Image reproducing apparatus and control method thereof
JP2008205846A (en) Image processor, image processing method, and computer program
JP4850645B2 (en) Image reproducing apparatus and image reproducing method
JP4761555B2 (en) Data recording apparatus and control method thereof
JP5164353B2 (en) Image reproducing apparatus and control method thereof
JP2008072514A (en) Image reproduction device and control method
JP5377051B2 (en) Image processing apparatus, control method therefor, and program
JP2009225315A (en) Imaging apparatus
JP2008072497A (en) Image processing apparatus
JP2008072498A (en) Image reproducing device, control method therefor, program thereof
JP2008071167A (en) Image processor
JP2008053970A (en) Data recording device, and control method thereof

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20160203

Termination date: 20190811