CN101365064B - Image pickup apparatus and image pickup method - Google Patents

Image pickup apparatus and image pickup method

Info

Publication number
CN101365064B
CN101365064B, CN2008101444945A, CN200810144494A
Authority
CN
China
Prior art keywords
image
information
system controller
editor
classified information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN2008101444945A
Other languages
Chinese (zh)
Other versions
CN101365064A (en)
Inventor
中濑雄一
稻垣温
参纳雅人
池田平
丹羽智弓
渡边等
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc
Priority to CN201510354747.1A (CN105007391B)
Priority to CN201510355561.8A (CN105049660B)
Priority to CN201210350890.XA (CN102891965B)
Publication of CN101365064A
Application granted
Publication of CN101365064B
Legal status: Active (current)

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/00127Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
    • H04N1/00132Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture in a digital photofinishing system, i.e. a system where digital photographic images undergo typical photofinishing processing, e.g. printing ordering
    • H04N1/00167Processing or editing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N1/32101Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N1/32128Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title attached to the image data, e.g. file header, transmitted message header, information on the same page or in the same computer file as the image
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/61Control of cameras or camera modules based on recognised objects
    • H04N23/611Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/667Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/74Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/40Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
    • H04N25/44Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled by partially reading an SSIS array
    • H04N25/443Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled by partially reading an SSIS array by reading pixels from selected 2D regions of the array, e.g. for windowing or digital zooming
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/765Interface circuits between an apparatus for recording and another apparatus
    • H04N5/77Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
    • H04N5/772Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera the recording apparatus and the television camera being placed in the same enclosure
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/79Processing of colour television signals in connection with recording
    • H04N9/80Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/804Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components
    • H04N9/8042Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components involving data reduction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/79Processing of colour television signals in connection with recording
    • H04N9/80Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/82Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
    • H04N9/8205Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
    • H04N9/8227Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal the additional signal being at least another television signal
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2101/00Still video cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3212Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image
    • H04N2201/3214Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image of a date
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3212Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image
    • H04N2201/3215Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to a job, e.g. communication, capture or filing of an image of a time or duration
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3225Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to an image, a page or a document
    • H04N2201/3226Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to an image, a page or a document of identification information or the like, e.g. ID code, index, title, part of an image, reduced-size image
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3225Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to an image, a page or a document
    • H04N2201/325Modified version of the image, e.g. part of the image, image reduced in size or resolution, thumbnail or screennail
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3225Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to an image, a page or a document
    • H04N2201/325Modified version of the image, e.g. part of the image, image reduced in size or resolution, thumbnail or screennail
    • H04N2201/3251Modified version of the image, e.g. part of the image, image reduced in size or resolution, thumbnail or screennail where the modified version of the image is relating to a person or face
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/32Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
    • H04N2201/3201Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
    • H04N2201/3274Storage or retrieval of prestored additional information
    • H04N2201/3277The additional information being stored in the same storage device as the image data
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/78Television signal recording using magnetic recording
    • H04N5/781Television signal recording using magnetic recording on disks or drums
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording
    • H04N5/907Television signal recording using static stores, e.g. storage tubes or semiconductor memories

Abstract

Disclosed is a technique for providing proper classification information to an edited image. When it is determined that the editing is cropping, a system controller cuts a decompressed image down to a desired size using an image processor and performs face detection on the cropped image. The system controller then generates a header for the image data of the edited image. When the setting of automatically providing classification information is 'ON', classification information is automatically provided based on the detected face information.

Description

Image pickup apparatus and image pickup method
Technical Field
The present invention relates to an image pickup apparatus and an image pickup method capable of providing classification information to an image.
Background Art
With the widespread use of digital cameras, storing image data captured by a digital camera, as well as image data acquired via a personal computer (hereinafter referred to as a "PC"), on various types of recording media has gradually become common.
Image data stored on a recording medium is reproduced by an image reproducing apparatus such as a digital camera, a PC, or a printer in order to check, edit, or organize the image data, or for other purposes.
Some digital cameras, PCs, and printers can provide an image with an attribute that makes the image searchable, and allow the user to retrieve images using this attribute as a search key.
There are also digital cameras that can display the in-focus position of a captured image. An exemplary technique of this kind, disclosed in Japanese Patent Laid-Open No. 2003-143444, can display the in-focus position in an enlarged manner so that the user can easily recognize it. There is also a technique for storing positional information about the in-focus position of a captured image.
However, in the above techniques, images are merely checked using the history of editing (operations), and providing a suitable attribute according to the content of the image is not considered. In particular, when an image is edited, for example by cropping, in which an unwanted portion is cut away from the image, classification information different from that representing the conditions at the time of shooting may be appropriate.
For example, when an image of a person is captured in a shooting mode suited to photographing people, there is no problem in providing the image with classification information about people at that time; however, when a portion of the image not containing the person is cut out by cropping, the classification information about people may no longer be appropriate.
Summary of the Invention
The present invention provides a technique for providing suitable classification information to an edited image.
The present invention also provides a technique for carefully handling, at the time of image editing, the shooting information provided to an image, and a technique for allowing shooting information to be displayed appropriately when the image is reproduced.
According to an aspect of the present invention, an image pickup apparatus comprises: an image pickup unit configured to capture an image; a setting unit configured to set a shooting mode used by the image pickup unit; a providing unit configured to provide classification information to a captured image in accordance with the shooting mode set by the setting unit or in accordance with subject information detected from the image captured by the image pickup unit; and an editing unit configured to edit the image provided with the classification information by the providing unit, wherein, when the editing unit edits the image, the providing unit determines classification information for the image in accordance with the result of the editing performed by the editing unit, and provides the determined classification information to the edited image.
According to another aspect of the present invention, an image pickup method is provided, comprising: capturing an image; setting a shooting mode used in the capturing; providing classification information to the image in accordance with the shooting mode or in accordance with subject information detected from the captured image; and editing the image provided with the classification information, wherein, when the image is edited in the editing, classification information for the image is determined in accordance with the result of the editing, and the determined classification information is provided to the edited image.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Brief Description of the Drawings
Fig. 1 is an external view of a digital camera according to an embodiment of the present invention.
Fig. 2 is a block diagram illustrating an exemplary configuration of the digital camera according to the embodiment.
Fig. 3 is a flowchart illustrating exemplary overall operation of the digital camera according to the embodiment.
Fig. 4 is a flowchart illustrating an exemplary process in still-image recording mode.
Fig. 5 is a flowchart illustrating an exemplary face detection process.
Fig. 6 is a flowchart illustrating an exemplary image capture process.
Fig. 7 is a flowchart illustrating an exemplary recording process.
Fig. 8 is a flowchart illustrating an exemplary header generation process.
Fig. 9 illustrates an exemplary directory and file structure.
Fig. 10 illustrates an exemplary data structure of a still image file.
Fig. 11 is a flowchart illustrating an exemplary process in moving-image recording mode.
Fig. 12 illustrates an exemplary format for storing moving image data.
Fig. 13 is a flowchart illustrating an exemplary thumbnail-file recording process.
Fig. 14 is a flowchart illustrating an exemplary process in receiving mode.
Fig. 15 is a flowchart illustrating an exemplary process in playback mode.
Fig. 16 is a flowchart illustrating an exemplary process in the input wait state when there is no image.
Figs. 17A and 17B are flowcharts illustrating an exemplary process in the input wait state during image playback.
Figs. 18A and 18B are flowcharts illustrating an exemplary editing process.
Figs. 19A to 19D are explanatory diagrams illustrating a specific example of cropping.
Fig. 20 is a flowchart illustrating an exemplary process in classification-information setting mode.
Fig. 21 is a flowchart illustrating an exemplary image file management process.
Fig. 22 is a flowchart illustrating an exemplary image search process.
Fig. 23 is a flowchart illustrating an exemplary file analysis process.
Fig. 24 is a flowchart illustrating an exemplary header generation process according to a second embodiment.
Figs. 25A to 25D are explanatory diagrams illustrating a specific example of cropping according to the second embodiment.
Embodiments
Embodiments will be described below with reference to the drawings. In the embodiments described below, an example in which the present invention is applied to an image pickup apparatus (for example, a digital camera) capable of capturing still images and moving images is described.
First Embodiment
Structure of the Digital Camera
Fig. 1 is an external view of a digital camera 100 according to an embodiment of the present invention. An image display unit 28 displays images and various types of information. A power switch 72 switches the power between the on (ON) and off (OFF) states. Reference numeral 61 denotes a shutter button. A mode selector 60 is used to switch among the various operating modes of the digital camera 100; specifically, for example, the mode can be switched among a still-image recording mode, a moving-image recording mode, and a playback mode.
An operation unit 70 receives various types of operations from the user. The operation unit 70 includes operation members such as the various buttons shown in Fig. 1 and a touch panel on the screen of the image display unit 28. Examples of the buttons of the operation unit 70 include a reset button, a menu button, a set button, a four-way button arranged in a cross (up, down, right, and left buttons), and a wheel.
A connection cable 111 connects the digital camera 100 to an external device. A connector 112 couples the connection cable 111 and the digital camera 100 together.
A storage medium 200 is, for example, a memory card or a hard disk. A storage medium slot 201 accommodates the storage medium 200. When held in the storage medium slot 201, the storage medium 200 can communicate with the digital camera 100. A cover 202 covers the storage medium slot 201.
Fig. 2 is a block diagram illustrating an exemplary configuration of the digital camera 100 according to the present embodiment. An image pickup unit 22 includes a charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) element, and converts an optical image formed through an imaging lens 103 and a shutter 101, which has an aperture function, into an electrical signal. A barrier 102 covers the image pickup section of the digital camera 100, which includes the imaging lens 103, to protect the image pickup system, which includes the imaging lens 103, the shutter 101, and the image pickup unit 22, from contamination and damage.
An analog-to-digital (A/D) converter 23 converts analog signals into digital signals. The A/D converter 23 converts analog signals output from the image pickup unit 22 into digital signals, and also converts analog signals output from an audio controller 11 into digital signals.
A timing generator 12 is controlled by a memory controller 15 and a system controller 50, and supplies clock signals and control signals to the image pickup unit 22, the audio controller 11, the A/D converter 23, and a digital-to-analog (D/A) converter 13.
An image processor 24 performs predetermined processing, such as resizing (for example, pixel interpolation or reduction) and color conversion, on data from the A/D converter 23 or from the memory controller 15. The image processor 24 also performs predetermined computations using the captured image data, and the system controller 50 controls exposure and ranging based on the computation results. This enables through-the-lens (TTL) autofocus (AF), automatic exposure (AE), and flash pre-emission (EF) processing. The image processor 24 further performs predetermined computations using the captured image data and can perform TTL auto white balance (AWB) processing based on the computation results.
Data output from the A/D converter 23 is written into a memory 32 via the image processor 24 and the memory controller 15, or directly via the memory controller 15. The memory 32 stores image data that has been obtained by the image pickup unit 22 and converted into digital data by the A/D converter 23, as well as image data to be displayed on the image display unit 28. The memory 32 is also used to store audio data recorded via a microphone 10, still images, moving images, and the file headers used in image files. The memory 32 therefore has a storage capacity sufficient to store a predetermined number of still images, moving image data of a predetermined duration, and audio data.
A compression/decompression unit 16 compresses and decompresses image data using, for example, adaptive discrete cosine transform (ADCT). The compression/decompression unit 16 reads image data that was captured using the shutter 101 as a trigger and stored in the memory 32, compresses it, and writes the processed data back into the memory 32. The compression/decompression unit 16 also decompresses compressed image data read into the memory 32, for example from a recording unit 19 of the storage medium 200, and writes the processed data into the memory 32. Image data written into the memory 32 by the compression/decompression unit 16 is formed into a file by a file unit of the system controller 50, and the file data is recorded on the storage medium 200 via an interface 18. The memory 32 also serves as an image display memory (VRAM).
The D/A converter 13 converts image data for display stored in the memory 32 into an analog signal and supplies the analog signal to the image display unit 28. The image display unit 28 presents data on a display, for example a liquid crystal display (LCD), based on the analog signal supplied from the D/A converter 13. In this way, image data for display written into the memory 32 is displayed by the image display unit 28 via the D/A converter 13.
An audio signal output from the microphone 10 is supplied to the A/D converter 23 via the audio controller 11, which includes, for example, an amplifier. The audio signal is converted into a digital signal by the A/D converter 23 and then stored in the memory 32 via the memory controller 15. Audio data recorded on the storage medium 200 is read into the memory 32 and then converted into an analog signal by the D/A converter 13. Using this analog signal, the audio controller 11 drives a speaker 39 and outputs the audio.
A nonvolatile memory 56 is an electrically erasable and recordable memory, and may be, for example, an electrically erasable programmable read-only memory (EEPROM). The nonvolatile memory 56 stores constants and programs used in the operation of the system controller 50. The programs referred to here are the programs for executing the flowcharts of the present embodiment described later.
The system controller 50 controls the entire digital camera 100. The system controller 50 executes the programs stored in the nonvolatile memory 56 described above to perform the processing of the present embodiment described later. A system memory 52 may be a random-access memory (RAM), into which constants and variables used in the operation of the system controller 50 and programs read from the nonvolatile memory 56 are loaded.
The mode selector 60, a first shutter switch 62, a second shutter switch 64, and the operation unit 70 are operating units for inputting various types of operation instructions to the system controller 50. The mode selector 60 switches the operating mode of the system controller 50 among, for example, the still-image recording mode, the moving-image recording mode, and the playback mode.
The first shutter switch 62 is turned on partway through operation of the shutter button 61 of the digital camera 100 (half press), and generates a first shutter switch signal SW1. In response to the first shutter switch signal SW1, the system controller 50 starts AF, AE, AWB, EF, and/or other processing.
The second shutter switch 64 is turned on when operation of the shutter button 61 is completed (full press), and generates a second shutter switch signal SW2. In response to the second shutter switch signal SW2, the system controller 50 starts a series of image capture operations, from reading the signal from the image pickup unit 22 to writing image data onto the storage medium 200.
By selecting function icons from among the various icons displayed on the image display unit 28, appropriate functions are assigned to the operation members of the operation unit 70 on a scene-by-scene basis, allowing the operation members to act as function buttons. Examples of the function buttons include an end button, a return button, an image-forward button, a jump button, a narrowing-down button, and an attribute-change button. For example, when the menu button is pressed, a menu screen allowing the user to specify various settings is presented on the image display unit 28. The user can intuitively specify various settings using the menu screen presented on the image display unit 28, the four-way button, and the set button. The power switch 72 switches the power between the on and off states.
A power supply controller 80 includes a battery detection circuit, a DC-DC converter, and a switching circuit for switching the blocks to be energized, and detects whether a battery is attached, the type of the battery, and the remaining battery life. The power supply controller 80 controls the DC-DC converter in accordance with the detection results and instructions from the system controller 50, and supplies the necessary voltage to the components, including the storage medium 200, for the necessary period. A power supply unit 30 may include a primary battery (for example, an alkaline battery or a lithium battery), a secondary battery (for example, a nickel-cadmium (NiCd) battery, a nickel-metal hydride (NiMH) battery, or a lithium (Li) battery), or an AC adapter. Connectors 33 and 34 connect the power supply unit 30 and the power supply controller 80 together.
A real-time clock (RTC) 40 measures the date and time. The RTC 40 has an internal power supply separate from the power supply controller 80, so that the RTC 40 can continue keeping time even when the power supply unit 30 is turned off. At startup, the system controller 50 sets a system timer using the date and time obtained from the RTC 40, and performs timer control.
The interface 18 is an interface to the storage medium 200 (for example, a memory card or a hard disk). A connector 35 connects the interface 18 to the storage medium 200. A storage medium presence detector 98 detects whether the storage medium 200 is attached to the connector 35.
The storage medium 200 (for example, a memory card or a hard disk) includes the recording unit 19, which includes a semiconductor memory or a magnetic disk, an interface 37 to the digital camera 100, and a connector 36 for connection to the digital camera 100.
A communication unit 110 can perform various types of communication, such as RS-232C, Universal Serial Bus (USB), IEEE 1394, P1284, SCSI, modem, LAN, and wireless communication. The connector 112 connects the digital camera 100 to other devices via the communication unit 110; in the case of wireless communication, the connector 112 is an antenna.
Overall Operation of the Digital Camera
Fig. 3 is a flowchart illustrating exemplary overall operation of the digital camera 100 according to the present embodiment. When the power switch 72 is operated and the power is turned on, the system controller 50 initializes flags and control variables in step S1.
Then, in step S2, the system controller 50 starts managing the files stored on the storage medium 200. This file management process is described later, mainly with reference to Fig. 21.
Then, in steps S3, S5, and S7, the system controller 50 determines the position to which the mode selector 60 is set. If the mode selector 60 is set to the still-image recording mode, the flow proceeds through step S3 to step S4, where the still-image recording mode process is performed. This process is described later, mainly with reference to Fig. 4. If the mode selector 60 is set to the moving-image recording mode, the flow proceeds through steps S3 and S5 to step S6, where the moving-image recording mode process is performed. This process is described later, mainly with reference to Fig. 11. If the mode selector 60 is set to the playback mode, the flow proceeds through steps S3, S5, and S7 to step S8, where the playback mode process of reproducing captured still images or moving images is performed. This process is described later, mainly with reference to Fig. 15. If the mode selector 60 is set to any other mode, the flow proceeds to step S9, where the process corresponding to the selected mode is performed. Examples of the other modes include a transmission mode process, in which files stored on the storage medium 200 are transmitted, and a receiving mode process, in which files are received from an external device and stored on the storage medium 200. Of these, the receiving mode process is described later with reference to Fig. 14.
After the process of the mode set by the mode selector 60 (step S4, S6, S8, or S9) has been performed, the flow proceeds to step S10. In step S10, the system controller 50 determines the position to which the power switch 72 is set. If the power switch 72 is set to the power-on position, the flow returns to step S3. If the power switch 72 is set to the power-off position, the flow proceeds to step S11, where the system controller 50 performs termination processing. One example of termination processing is changing the display presented on the image display unit 28 to an end state. Other examples include closing the barrier 102 to protect the image pickup section, recording settings such as flags and control variables, setting values, and the set mode into the nonvolatile memory 56, and cutting off power to components that do not require power. When the termination processing in step S11 is complete, the flow ends and the state switches to the power-off state.
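The mode-dispatch loop of Fig. 3 (steps S1 to S11) can be summarized in code. The following is a minimal illustrative sketch, not part of the patent; all method and mode names are assumptions chosen only to make the control flow explicit.

    # Illustrative sketch of the Fig. 3 main loop (steps S1-S11); names are assumed.
    def run_camera(camera):
        camera.init_flags_and_variables()               # S1
        camera.start_file_management()                  # S2 (Fig. 21)
        while True:
            mode = camera.mode_selector.position        # S3, S5, S7
            if mode == "still_image_recording":
                camera.still_image_recording_mode()     # S4 (Fig. 4)
            elif mode == "moving_image_recording":
                camera.moving_image_recording_mode()    # S6 (Fig. 11)
            elif mode == "playback":
                camera.playback_mode()                  # S8 (Fig. 15)
            else:
                camera.other_mode(mode)                 # S9 (e.g., send or receive)
            if camera.power_switch.position == "off":   # S10
                camera.terminate()                      # S11
                break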
Process in Still-Image Recording Mode
Fig. 4 is a flowchart illustrating an exemplary process in the still-image recording mode shown in step S4 of Fig. 3. The still-image recording mode process shown in Fig. 4 is ended by an interrupt, for example, when the mode selector 60 switches the mode to another mode or when the power switch 72 is set to the power-off position.
After the still-image recording mode process starts, in step S401 the system controller 50 confirms the shooting mode. The shooting mode is confirmed in either of the following ways: (1) the shooting mode in use when the previous still-image recording mode process ended is acquired from the nonvolatile memory 56 and stored in the system memory 52; or (2) when the user specifies a shooting mode by operating the operation unit 70, the specified shooting mode is stored in the system memory 52.
A shooting mode is defined by a combination of a shutter speed, an aperture value, a flash state, and a sensitivity setting suitable for the scene being shot. The digital camera 100 according to the present embodiment has the following shooting modes (an illustrative code sketch of such presets is given after this list):
Auto mode: the various camera parameters are determined automatically by a program built into the digital camera 100, based on the measured exposure value.
Manual mode: the user can freely change the various camera parameters.
Scene mode: a combination of shutter speed, aperture value, flash state, and sensitivity setting suitable for the scene being shot is set automatically.
The scene mode includes the following modes:
Portrait mode: specialized for shooting images of people, keeping the person in focus while blurring the background.
Night Scene mode: specialized for night scenes, in which the person is illuminated with the flash and the background is recorded at a slow shutter speed.
Landscape mode: specialized for wide landscape scenes.
Night & Snapshot mode: suitable for taking beautiful images of night scenes and people without using a tripod.
Kids & Pets mode: for shooting images of quickly moving children and animals without missing a photo opportunity at the best moment.
Foliage mode: suitable for shooting vivid images of greenery and autumn foliage.
Party mode: for shooting a subject with faithful color tones under fluorescent or incandescent light, while compensating for camera shake.
Snow mode: for shooting images against a snowy background without the person appearing dark and without a bluish cast.
Beach mode: for shooting images in which the person or other subject does not appear dark, even in scenes such as the sea or a sandy beach with strong sunlight reflection.
Fireworks mode: for clearly shooting images of skyrocketing fireworks with optimal exposure.
Aquarium mode: sets a sensitivity, white balance, and color tone suitable for shooting images of fish in an indoor aquarium tank.
Underwater mode: for shooting images using a white balance optimal for underwater scenes, with a reduced bluish cast.
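Purely for illustration, the relationship between a scene mode and its parameter preset can be represented as a small lookup table; the following sketch and its values are assumptions and are not taken from the patent.

    # Illustrative only: scene-mode presets as a lookup table (values are assumed).
    SCENE_PRESETS = {
        "portrait":      {"aperture": "wide", "flash": "auto", "note": "person in focus, background blurred"},
        "night_scene":   {"shutter": "slow",  "flash": "on",   "note": "flash on person, slow shutter for background"},
        "kids_and_pets": {"shutter": "fast",  "flash": "auto", "note": "fast-moving subjects"},
        "fireworks":     {"shutter": "long",  "flash": "off",  "note": "optimal exposure for fireworks"},
    }

    def apply_scene_mode(camera, mode_name):
        """Apply the preset for the selected scene mode (hypothetical helper)."""
        camera.apply_parameters(SCENE_PRESETS[mode_name])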
Referring back to Fig. 4, once the shooting mode has been confirmed in step S401, in step S402 the system controller 50 displays the image data supplied from the image pickup unit 22 in a live-view display state. Displaying an image in the live-view display state means displaying the image obtained by the image pickup section in real time. Then, in step S403, the system controller 50 uses the power supply controller 80 to determine whether the remaining capacity of the power supply unit 30 (including, for example, a battery) would cause a problem in the operation of the digital camera 100, and whether the presence or remaining capacity of the storage medium 200 would cause a problem in the operation of the digital camera 100. If there is a problem in the state of the power supply unit 30 or the storage medium 200 (NO in step S403), the flow proceeds to step S404. In step S404, the image display unit 28 presents a predetermined warning, using an image or audio, under the control of the system controller 50, and the flow then returns to step S401.
If there is no problem in the states of the power supply unit 30 and the storage medium 200 (YES in step S403), the flow proceeds to step S405. In step S405, the system controller 50 specifies, if necessary, ON or OFF for the setting of automatically providing classification information. After pressing the menu button included in the operation unit 70, the user can select either ON or OFF for the setting of automatically providing classification information on a menu screen (not shown) presented on the image display unit 28. Whether ON or OFF is specified for this setting is represented by a flag indicating whether classification information is provided automatically in accordance with the scene mode and the subject information. The set value (the ON/OFF value of the flag) is held in the system memory 52. Because ON or OFF can be specified for the setting of automatically providing classification information, the user can avoid having undesired classification information provided, depending on the situation. Classification information is described later.
Then, in step S406, the system controller 50 determines whether a human face is present in the image signal displayed in the live-view display state. An exemplary face detection process is described later with reference to Fig. 5. If a human face is detected in the face detection process, the system controller 50 stores, in the system memory 52 as face information, the position coordinates of each detected face in the image signal, the size (for example, width and height) of each detected face, the number of detected faces, reliability coefficients, and other related information. If no human face is detected in the face detection process, zero is set in the fields for the position coordinates, size (for example, width and height), number of detected faces, reliability coefficients, and other related information.
During live-view display, the face detection in step S406 may be performed on the live-view display image stored in the VRAM, or on the captured image itself obtained from the CCD.
Then, in step S407, the system controller 50 determines whether the first shutter switch signal SW1 is ON. If the first shutter switch signal SW1 is OFF (OFF in step S407), the flow returns to step S405, and steps S405 and S406 are repeated. If the first shutter switch signal SW1 is ON (ON in step S407), the flow proceeds to step S408. In step S408, the system controller 50 performs ranging to bring the focus of the imaging lens 103 onto the subject, and performs photometry to determine the aperture value and the shutter time (shutter speed). In the photometry process, the flash is set if necessary. At this time, if a human face was detected in step S406, ranging may also be performed within the range of the detected face.
Then, in steps S409 and S410, the ON/OFF states of the first shutter switch signal SW1 and the second shutter switch signal SW2 are determined. If the second shutter switch signal SW2 is turned on while the first shutter switch signal SW1 is in the ON state (ON in step S409), the flow proceeds to step S411. If the first shutter switch signal SW1 is turned off, that is, the first shutter switch is released, while the second shutter switch signal SW2 remains off (OFF in step S410), the flow returns to step S405. While the first shutter switch signal SW1 is ON and the second shutter switch signal SW2 is OFF, steps S409 and S410 are repeated.
When the second shutter switch signal SW2 is turned on (the second shutter switch 64 is pressed), in step S411 the system controller 50 changes the display state of the image display unit 28 from the live-view display state to a fixed-color display state. Displaying an image in the fixed-color display state means displaying an image of a single color for a certain period, so that the user intuitively knows that an image has been captured when the shutter button of the digital camera is pressed. In the present embodiment, a black image (blackout) is displayed in the fixed-color display state. Then, in step S412, the system controller 50 performs image capture, including exposure and development. In the exposure, image data obtained through the image pickup unit 22 and the A/D converter 23 is written into the memory 32 via the image processor 24 and the memory controller 15, or directly from the A/D converter 23 via the memory controller 15. In the development, the system controller 50 uses the memory controller 15 and, if necessary, the image processor 24 to read the image data written into the memory 32 and perform various types of processing on the read image data. This image capture process is described later with reference to Fig. 6.
Then, in step S413, the system controller 50 performs REC review display on the image display unit 28 for the image data obtained in the image capture process. REC review display means displaying the image data on the image display unit 28 for a predetermined period (a review time) after shooting the subject and before recording the image on the recording medium, so that the user can check the captured image. After the REC review display of the image data, in step S414 the system controller 50 records the image data obtained in the image capture process onto the storage medium 200 as an image file. This recording process is described later with reference to Fig. 7.
When the recording in step S414 is complete, in step S415 the system controller 50 determines whether the second shutter switch signal SW2 is in the ON state. If the second shutter switch signal SW2 is determined to be in the ON state, the determination in step S415 is repeated until the second shutter switch signal SW2 is turned off. During this time, the REC review display of the image data continues. That is, when the recording in step S414 is complete, the REC review display of the image data on the image display unit 28 continues until the second shutter switch signal SW2 is turned off, that is, until the second shutter switch 64 is released. This allows the user to carefully check the captured image data using the REC review function by keeping the shutter button 61 fully pressed.
After the user captures an image by fully pressing the shutter button 61 and then releases the fully pressed state of the shutter button 61, for example by taking his or her hand off the shutter button 61, the flow proceeds from step S415 to step S416. In step S416, the system controller 50 determines whether the predetermined review time has elapsed. If the predetermined time is determined to have elapsed (YES in step S416), the flow proceeds to step S417. In step S417, the system controller 50 returns the display state on the image display unit 28 from the REC review display state to the live-view display state. With this processing, after the captured image data has been checked in the REC review display state, the display state on the image display unit 28 automatically changes to the live-view display state, in which the image data from the image pickup unit 22 is displayed sequentially, in preparation for the next shot.
In step S418, the system controller 50 determines whether the first shutter switch signal SW1 is in the ON state. If the first shutter switch signal SW1 is determined to be in the ON state in step S418, the flow returns to step S409; if it is determined to be in the OFF state, the flow returns to step S405. That is, if the half-pressed state of the shutter button 61 continues, that is, the first shutter switch signal SW1 remains ON, the process prepares for the next shot (in step S409). If the shutter button 61 is released, that is, the first shutter switch signal SW1 is OFF, the series of steps of the image capture process is completed, and the process returns to the state of waiting for a shot (in step S405).
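The shutter-switch handling in steps S405 to S418 amounts to a small state machine. The following Python sketch is illustrative only; every function name is an assumption, not an API defined by the patent.

    # Illustrative sketch of steps S405-S418 (names assumed, not from the patent).
    def still_image_loop(camera):
        while True:
            camera.update_auto_classification_setting()    # S405
            faces = camera.detect_faces_in_live_view()      # S406 (Fig. 5)
            if not camera.sw1_on():                         # S407: wait for half press
                continue
            camera.focus_and_meter(faces)                   # S408: AF/AE, flash if needed
            while camera.sw1_on() and not camera.sw2_on():  # S409/S410
                pass
            if not camera.sw2_on():                         # SW1 released: back to waiting
                continue
            camera.show_blackout()                          # S411: fixed-color display
            image = camera.capture()                        # S412 (Fig. 6)
            camera.rec_review(image)                        # S413
            camera.record(image)                            # S414 (Fig. 7)
            while camera.sw2_on():                          # S415: keep review while fully pressed
                pass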
Face Detection
An example of the face detection process in step S406 of Fig. 4 will now be described with reference to Fig. 5. In step S501, the system controller 50 sends the image data to be subjected to face detection to the image processor 24. In step S502, the image processor 24, under the control of the system controller 50, filters the image data with a horizontal band-pass filter (BPF). In step S503, the image processor 24, under the control of the system controller 50, filters the image data processed with the horizontal BPF in step S502 with a vertical BPF. The horizontal and vertical BPFs are used to detect edge components in the image data.
Then, in step S504, the system controller 50 performs pattern matching on the detected edge components to extract candidate eyes, noses, mouths, and ears. Then, in step S505, the system controller 50 determines, from among the candidate eyes extracted in step S504, those satisfying predetermined conditions (for example, the distance between two eyes or their inclination) to be a pair of eyes, and narrows the candidates down to those forming pairs of eyes. Then, in step S506, the system controller 50 associates the candidate eyes narrowed down in step S505 with the other parts forming the corresponding face (nose, mouth, ears), and filters the result through a non-face condition filter to detect faces. In step S507, the system controller 50 outputs the face information according to the face detection result in step S506, and the process ends.
As described above, subject information can be detected by extracting feature information from the image data displayed in the live-view display state. In the present embodiment, face information has been described as an example of subject information. However, various other types of information, for example red-eye detection information, may also be used.
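As a rough illustration of the pipeline in steps S501 to S507, the following Python sketch strings the stages together. Every helper named below is a hypothetical placeholder standing in for processing performed by the image processor 24 and the system controller 50; none of these identifiers come from the patent.

    # Illustrative pipeline for steps S501-S507; all helpers are hypothetical placeholders.
    def detect_faces(image, max_eye_distance, max_eye_tilt):
        h = horizontal_bandpass(image)                  # S502: horizontal BPF
        edges = vertical_bandpass(h)                    # S503: vertical BPF -> edge components
        parts = match_face_parts(edges)                 # S504: candidate eyes, noses, mouths, ears
        eye_pairs = [p for p in pair_candidates(parts["eyes"])
                     if p.distance <= max_eye_distance and p.tilt <= max_eye_tilt]  # S505
        faces = []
        for pair in eye_pairs:                          # S506: attach nose/mouth/ears, then
            face = assemble_face(pair, parts)           #       drop anything failing the
            if face and passes_non_face_filter(face):   #       non-face condition filter
                faces.append(face)
        # S507: report position, size, and reliability for each detected face
        return [{"pos": f.position, "size": f.size, "reliability": f.reliability}
                for f in faces]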
Image Capture
An example of the image capture process in step S412 of Fig. 4 will now be described with reference to Fig. 6. In step S601, the system controller 50 obtains the date and time at the start of image capture from the system timer and stores it in the system memory 52. Then, in step S602, the system controller 50 opens the shutter 101, which has an aperture function, according to the aperture value based on the photometric data stored in the system memory 52. This causes the image pickup unit 22 to start exposure (step S603).
In step S604, the system controller 50 waits for the image pickup unit 22 to complete the exposure according to the photometric data. At the end of the exposure, in step S605 the system controller 50 closes the shutter 101. Then, in step S606, the system controller 50 reads the charge signal from the image pickup unit 22 and writes the image data into the memory 32 via the A/D converter 23, the image processor 24, and the memory controller 15, or directly from the A/D converter 23 via the memory controller 15. Steps S601 to S606 correspond to the exposure processing.
Then, in step S607, the system controller 50 reads the image data stored in the memory 32 and sequentially performs image processing on the read image data, using the memory controller 15 and, if necessary, the image processor 24. Examples of this image processing include white balance processing and compression using the compression/decompression unit 16. The processed image data is written into the memory 32. In step S608, the system controller 50 reads the image data from the memory 32, decompresses the read image data using the compression/decompression unit 16, and resizes it for display on the image display unit 28. The system controller 50 then sends the resized image data to the D/A converter 13 via the memory controller 15 so that it is displayed on the image display unit 28. When this series of steps is complete, the image capture process ends.
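In outline, steps S601 to S608 follow the sequence sketched below; the object and method names are assumptions used only to make the order of operations explicit.

    # Illustrative ordering of steps S601-S608 (method names are assumed).
    def capture(camera):
        camera.memory["capture_time"] = camera.system_timer.now()   # S601
        camera.shutter.open(aperture=camera.metering.aperture)      # S602
        camera.sensor.expose()                                      # S603-S604: expose and wait
        camera.shutter.close()                                      # S605
        raw = camera.sensor.read_charges()                          # S606: write raw data to memory
        developed = camera.image_processor.develop(raw)             # S607: white balance, compression
        camera.display.show(camera.resize_for_display(developed))   # S608
        return developed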
Recording
An example of the recording process in step S414 of Fig. 4 will now be described with reference to Fig. 7. In step S701, the system controller 50 generates a file name for the image data to be recorded, according to a file naming rule. An example of the file naming rule is described later with reference to Fig. 9. Then, in step S702, the system controller 50 obtains the date and time information stored in the system memory 52 in step S601 of Fig. 6. Then, in step S703, the system controller 50 obtains the size of the image data to be recorded.
Then, in step S704, the system controller 50 determines whether a directory for storing the image file generated from the image data exists on the storage medium 200. If no such directory exists (NO in step S704), the flow proceeds to step S705, where the system controller 50 creates a directory for storing the image file. An example of the rule for generating directory names is described later with reference to Fig. 9. Here, the name 100XXX (902 in Fig. 9) is generated.
Then, in step S706, the system controller 50 generates a file header for the image data stored in the memory 32 in step S606 of the image capture process of Fig. 6. The file header is composed of information about the shooting date and the conditions at the time of shooting. An example of this header generation process is described later with reference to Fig. 8. An exemplary structure of the image file generated in this way is described later with reference to Fig. 9.
When the header generation is complete, in step S707 the system controller 50 generates a directory entry from the file name generated in step S701 and the date and time information obtained in step S702, and records the image file onto the storage medium 200.
Header generation
An example of the header generation processing in step S706 of Fig. 7 will now be described with reference to Fig. 8. At step S801, the system controller 50 obtains from the system memory 52 the ON/OFF value of the automatic classification setting specified in step S405 of Fig. 4, and judges whether classification information is to be provided automatically for the captured image data. If the automatic classification setting is "OFF", that is, classification information is not provided automatically ("NO" in step S801), the flow proceeds to step S809.
If the automatic classification setting is "ON", that is, classification information is provided automatically ("YES" in step S801), the flow proceeds to step S802. At step S802, the system controller 50 reads the face information held in the system memory 52 in step S406 of Fig. 4 and judges whether a face has been detected. If a face has been detected ("YES" in step S802), the flow proceeds to step S804, where the classification information "people" is provided. If no face has been detected ("NO" in step S802), the flow proceeds to step S803.
At step S803, the system controller 50 refers to the scene mode of the image stored in the system memory 52 at the time of shooting and judges whether the scene mode is any of "portrait mode", "night snapshot mode", and "kids & pets mode". In these three modes, it is assumed that a person has been photographed. If the scene mode is one of them ("YES" in step S803), the flow proceeds to step S804, where the system controller 50 provides the classification information "people" for the image data. After the classification information "people" has been provided in step S804, or if the scene mode is none of these modes ("NO" in step S803), the flow proceeds to step S805.
As described above, in steps S802 to S804 the same classification information "people" is provided on the basis of both the face information, which is an example of subject information, and the scene mode, which is an example of a camera setting condition at the time of shooting. Subject information and camera setting conditions are different parameters at the time of shooting, but depending on their content they can have similar meanings after shooting. Face information, as one item of subject information, and "portrait mode", "night snapshot mode", and "kids & pets mode", as camera setting conditions, all carry the same meaning: a person is presumed to have been photographed. Providing the same classification information for image data carrying this kind of information therefore enhances the convenience of post-shooting operations such as search operations. In other words, providing the same classification information on the basis of specific subject information and specific camera setting conditions makes it possible to provide classification information that differs from the shooting-time parameters and is suited to post-shooting operations (for example, search operations), which enhances convenience.
In addition, the above classification processing can provide the same classification information for different scene modes, namely portrait mode, night snapshot mode, and kids & pets mode. Different scene modes correspond to different camera setting conditions at the time of shooting, but they can have similar meanings. Portrait mode, night snapshot mode, and kids & pets mode all carry the same meaning: a person is presumed to have been photographed. Providing the same classification information for this type of image data therefore enhances the convenience of post-shooting operations such as search operations. In other words, providing the same classification information for several specific setting conditions among the camera setting conditions at the time of shooting makes it possible to provide classification information that differs from the shooting-time parameters and is suited to post-shooting operations, which enhances the convenience of post-shooting operations.
Returning to Fig. 8, at step S805 the system controller 50 judges whether the scene mode is any of "foliage mode", "landscape mode", and "fireworks mode". In these three modes, the captured image is presumed to be a landscape shot. If the scene mode is one of them ("YES" in step S805), the flow proceeds to step S806, where the system controller 50 provides the classification information "landscape" for the image data. After the classification information "landscape" has been provided in step S806, or if the scene mode is none of them ("NO" in step S805), the flow proceeds to step S807.
At step S807, the system controller 50 judges whether the scene mode is any of "party mode", "snow mode", "beach mode", "fireworks mode", "aquarium mode", and "underwater mode". In these modes, an event is presumed to have been photographed. If the scene mode is one of them ("YES" in step S807), the flow proceeds to step S808, where the system controller 50 provides the classification information "event" for the image data.
In the above processing, two types of classification information, "landscape" and "event", are provided for image data shot in "fireworks mode". That is, multiple types of classification information are provided for a single scene mode. Even under the same camera setting condition (scene mode) at the time of shooting, the captured image data can have multiple meanings; an image captured in "fireworks mode" is one such example. In this case, the system controller 50 provides multiple types of classification information corresponding to the post-shooting meanings. Classification information that differs from the shooting-time parameters and is suited to post-shooting operations (for example, search operations) can therefore be provided, which enhances the convenience of post-shooting operations of the digital camera 100.
In the case of "auto mode", "manual mode", or any other scene mode for which the judgements in steps S803, S805, and S807 are all "NO", no classification information is provided.
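By way of illustration only, the classification rules of steps S802 to S808 can be summarized in the following Python sketch. The mode identifiers, label strings, and function name are assumptions introduced for this sketch and are not part of the embodiment.

```python
PEOPLE_MODES = {"portrait", "night_snapshot", "kids_and_pets"}
LANDSCAPE_MODES = {"foliage", "landscape", "fireworks"}
EVENT_MODES = {"party", "snow", "beach", "fireworks", "aquarium", "underwater"}

def auto_classify(face_detected: bool, scene_mode: str) -> set:
    """Return the set of classification labels to write into the header."""
    labels = set()
    # Steps S802-S804: a detected face (subject information) or a
    # person-oriented scene mode both yield the same label "people".
    if face_detected or scene_mode in PEOPLE_MODES:
        labels.add("people")
    # Steps S805-S806: landscape-oriented scene modes yield "landscape".
    if scene_mode in LANDSCAPE_MODES:
        labels.add("landscape")
    # Steps S807-S808: event-oriented scene modes yield "event".
    if scene_mode in EVENT_MODES:
        labels.add("event")
    # "auto", "manual", and other scene modes with no face yield no label.
    return labels

# Example: "fireworks" receives both "landscape" and "event".
assert auto_classify(False, "fireworks") == {"landscape", "event"}
```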
After the processing for providing classification information for the header is complete, the flow proceeds to step S809. At step S809, the system controller 50 generates the header information using the classification information, the settings at the time of shooting, information such as the shooting date, and so on.
When this processing is performed during editing, the coordinate-related information described in the header is corrected and the processing is then completed. Coordinate-related information is information, such as face information and focusing frame information, that includes positional information about the image. When image data is edited in a way that changes its angle of field, for example by trimming (cropping) or combining, the coordinate information of the pre-edit image is no longer appropriate for the post-edit image. To address this, in the case of trimming, the coordinate information of the post-edit image is recalculated based on the position and size of the trimmed region, and the recalculated coordinate information is described in the header. In the case of combining, the coordinate information is recalculated based on the position of each pre-combination image within the combined image, and the recalculated coordinate information is described in the header.
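As a minimal sketch of the coordinate correction described above, the following Python fragment remaps a face rectangle into the coordinate system of a trimmed image. The data layout and function name are assumptions and do not reflect the actual header format.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Rect:
    x: int   # left edge
    y: int   # top edge
    w: int   # width
    h: int   # height

def remap_after_crop(face: Rect, crop: Rect) -> Optional[Rect]:
    """Shift a pre-edit face rectangle into the coordinate system of the
    cropped image; return None when the face lies outside the crop."""
    x, y = face.x - crop.x, face.y - crop.y
    if x + face.w <= 0 or y + face.h <= 0 or x >= crop.w or y >= crop.h:
        return None                      # face not contained in the crop
    # Clip to the crop boundaries so the stored rectangle stays valid.
    x0, y0 = max(x, 0), max(y, 0)
    x1, y1 = min(x + face.w, crop.w), min(y + face.h, crop.h)
    return Rect(x0, y0, x1 - x0, y1 - y0)
```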
Alternatively, classification information that the user has changed may be left unchanged (not provided again). For example, this can be achieved by not reflecting the newly detected classification information in this processing, based on the following judgement: classification information described in the header that differs from the classification information automatically provided last time is classification information that the user has changed. For example, when the automatically provided classification information is "landscape" but the classification information in the header contains "landscape" and "people", it is judged that the user set "people", and therefore the attribute "people" is not changed regardless of the result of the automatic provision. When the result of the current automatic provision differs from the content of the classification information described in the header, the user may also be asked to select on a graphical user interface (GUI) which attributes to provide.
When the flag for enabling the date to be printed on the image is set, this flag is cleared. Alternatively, the date print flag may be changed only when it is judged that the printed date is not included in the trimmed region.
Returning to Fig. 8, when the value of the automatic classification setting is OFF ("NO" in step S801), the provision of classification information (steps S802 to S808) is skipped, so header information without classification information is generated.
As described above, automatically providing classification information used, for example, for searching at the time of shooting allows the user to classify image data immediately in the reproduction mode, without having to sort image files in the known manner while checking the reproduced image data. Because the concept of classification is based on both the camera setting conditions at the time of shooting and the subject information, classification information suited to post-shooting concepts of operation (for example, image data search) can be generated. The processing shown in Fig. 8 corresponds to exemplary processing performed by the providing unit.
In the above, several exemplary scene modes have been described as camera setting conditions at the time of shooting on the basis of which classification information is automatically provided. However, the camera setting conditions are not limited to these examples. Another example of automatically providing classification information based on a camera setting condition at the time of shooting is the following: when a distant view is shot in manual mode, it is presumed that a landscape has been shot, and the classification information "landscape" is therefore provided. Yet another example: when an image is shot using the self-timer, at least one of the two types of classification information "people" and "event" is provided by presumption.
In the above, face information has been described as an example of subject information. However, subject information is not limited to face information. For example, red-eye judgement information may be used; in this case, when red-eye is detected, the classification information "people" can be provided.
Likewise, the automatically provided classification information is not limited to the three types "people", "landscape", and "event" described above, as long as the information is convenient for the user after shooting.
Structure of directories and files
Fig. 9 shows an exemplary structure of the directories and files recorded on the storage medium 200 by the recording processing described above. An example of the rules for generating directory names and file names is described below with reference to Fig. 9. A DCIM directory 901 is recorded as the root. Subdirectories are generated in the DCIM directory 901. The name of each subdirectory consists of six characters, the first three of which are digits. The number represented by the first three digits starts at 100 and is incremented by one each time a directory is generated. Fig. 9 shows a subdirectory "100XXX" 902 and a subdirectory "101XXX" 903.
Files generated by the digital camera 100 are created under the subdirectories. In the example shown in Fig. 9, files 904 to 909 generated by the digital camera 100 are created under the subdirectory 902, and files 911 to 918 generated by the digital camera 100 are created under the subdirectory 903. Each file name generated here consists of an eight-character file name and a three-character extension representing the file type. The last four characters of the file name are digits, starting at 0001. In the still image recording mode, file names are provided so that the number represented by these last four digits is incremented by one each time an image is captured. Hereinafter, the number represented by these last four digits is referred to as the file number. Still image files recorded in the still image recording mode are given the extension "JPG". Motion picture files recorded in the motion picture recording mode are given the extension "AVI". Thumbnail files that record management information are given the extension "THM".
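The naming rule can be illustrated with the following short sketch. The helper functions and the literal prefixes ("IMG_", "MVI_") are assumptions chosen to match the examples in Fig. 9, not a definitive implementation.

```python
def dcf_directory_name(index: int, suffix: str = "XXX") -> str:
    """Three digits starting at 100 plus three free characters,
    six characters in total (100XXX, 101XXX, ...)."""
    return f"{100 + index:03d}{suffix}"

def dcf_file_name(file_number: int, kind: str = "still") -> str:
    """Eight-character name whose last four characters are the file number,
    plus a three-character extension chosen by the file type.  The leading
    characters are assumed prefixes for illustration."""
    prefix = {"still": "IMG_", "movie": "MVI_", "thumbnail": "MVI_"}[kind]
    ext = {"still": "JPG", "movie": "AVI", "thumbnail": "THM"}[kind]
    return f"{prefix}{file_number:04d}.{ext}"

# Example: the second still image in the first directory.
path = f"DCIM/{dcf_directory_name(0)}/{dcf_file_name(2)}"   # DCIM/100XXX/IMG_0002.JPG
```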
File structure
Fig. 10 shows an exemplary data structure of a still image file recorded in the recording processing described above. The image file 1001 includes, at its beginning, a start of image (SOI) marker 1002 indicating the start of the image file, followed immediately by an application marker (APP1) 1003 corresponding to the header. The application marker (APP1) 1003 includes the following information:
Size (APP1 length) 1004;
Application marker identifier code (APP1 identifier code) 1005;
Creation date and time of the image data (date/time) 1006;
Date and time when the image data was generated (original date/time) 1007;
Classification information 1018 of the image data;
Automatically provided classification information 1020 of the image data;
Date print setting 1021 of the image data;
Focusing frame information 1022 of the image data;
Face information 1019;
Other shooting information 1009; and
Thumbnail image (thumbnail data) 1010.
The classification information 1018 is information that differs from the shooting-time parameters and is suited to post-shooting operations (for example, search), as described above with reference to Fig. 8. The classification information 1018 can store one or more elements such as "people", "landscape", and "event" provided at the time of shooting. General-purpose classification information such as "category 1", "category 2", and "category 3" can also be stored. For image data to be transferred to an external device such as a PC through the communication unit 110, classification information such as "work" can also be stored to indicate specific processing (for example, transmission to an e-mail destination). Classification information of these types, which is not automatically provided in the processing shown in Fig. 8, is provided by the user for the desired image data through a predetermined operation; the predetermined operation will be described later. The classification information automatically provided at the time of shooting can be edited in the reproduction mode (see Fig. 15).
As described above, when classification information is automatically provided at the time of shooting, classification information that makes it easy for the user to classify the image data carefully in the reproduction mode while checking it is also provided, which allows the user to classify the data more easily.
In the present embodiment, automatically provided classification information 1020 is also set. The automatically provided classification information 1020 holds the information automatically provided by the system controller 50 of the digital camera 100 of the present embodiment. In the reproduction mode, editing of the automatically provided classification information 1020 is prohibited (see Fig. 15); instead, the automatically provided classification information 1020 is used to identify classification information intentionally changed by the user, by comparing the classification information 1018 with the automatically provided classification information 1020.
The face information 1019 is information generated in the face detection processing (step S406 of Fig. 4). The face information 1019 includes the position coordinates of the detected faces, the sizes (width and height) of the detected faces, the number of detected faces, and reliability coefficients; these elements are included for each detected face.
The image data recorded in the image file 1001 includes a define quantization table (DQT) 1012, a define Huffman table (DHT) 1013, a start of frame (SOF) marker 1014, a start of scan (SOS) marker 1015, and compressed data 1016. The image file 1001 is terminated by an end of image (EOI) marker 1017 indicating the end of the image file data.
The date print setting 1021 is a flag indicating whether the shooting date and time were embedded in the captured image at the time of shooting. The date print setting 1021 is used to avoid overlapping date printing when the image is printed with a printer having a date printing function.
The focusing frame information 1022 is used to manage, in a coordinate system, the position and size of the automatic focusing (AF) frame at the time of shooting. The user can see the in-focus position based on the focusing frame information 1022.
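The APP1 fields listed above can be pictured as a simple record. The following dataclass is a rough sketch of their grouping under assumed field names; it is not the byte-level layout of the marker.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Set, Tuple

@dataclass
class FaceInfo:                    # one entry per detected face (1019)
    x: int
    y: int
    width: int
    height: int
    reliability: float

@dataclass
class App1Header:                  # fields carried in the APP1 marker 1003
    date_time: str                 # creation date and time (1006)
    original_date_time: str        # date and time of generation (1007)
    classification: Set[str] = field(default_factory=set)       # 1018, user-editable
    auto_classification: Set[str] = field(default_factory=set)  # 1020, not editable
    date_print: bool = False       # date print setting (1021)
    focus_frame: Optional[Tuple[int, int, int, int]] = None     # 1022: x, y, w, h
    faces: List[FaceInfo] = field(default_factory=list)         # 1019
    other_shooting_info: dict = field(default_factory=dict)     # 1009
```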
Motion picture recording mode processing
Fig. 11 is a flowchart showing exemplary processing in the motion picture recording mode. When the mode selector 60 is set to the motion picture recording mode, the system controller 50 then determines the shooting mode. In the present embodiment, the shooting modes in the motion picture recording mode are described on the assumption that they are similar to the shooting modes in the still image recording mode. Of course, the motion picture recording mode may have shooting modes dedicated to moving image shooting.
When the ON state of the second shutter release signal SW2 is detected in the motion picture recording mode, the system controller 50 starts the motion picture recording mode processing shown in Fig. 11. At step S1101, the system controller 50 sequentially stores the image data obtained by the camera unit 22 in the memory 32 at a predetermined frame rate. At the same time, the system controller 50 also stores in the memory 32 the audio data obtained through the microphone 10, the audio controller 11, and the A/D converter 23. In the present embodiment, the audio data is assumed to be PCM digital data.
Then, at step S1102, the system controller 50 performs image processing on the image data stored in the memory 32. An example of this image processing is resizing for recording the image data in a file. Then, at step S1103, the system controller 50 compresses the image data and stores it in the memory 32.
Fig. 12 shows an exemplary format for storing the recorded moving image data on the storage medium 200. A fixed-length header region 1201 is arranged at the beginning of the data. The header region 1201 includes data such as the video frame rate and the audio sampling rate. A fixed-length audio data region 1202 follows immediately after the header region 1201. The audio data region 1202 stores audio data in predetermined recording units (one second in the present embodiment). The audio data is obtained by sampling the audio input to the microphone 10 and converting it into digital data through the audio controller 11 and the A/D converter 23, and is stored in the memory 32. Immediately after the audio data region 1202, frame data elements (1203 to 1206) recorded at the predetermined frame rate are stored sequentially. Similarly, data elements 1207 to 1212 represent the moving image data of the next second, and data elements 1213 to 1217 represent the moving image data of the N-th second. In this way, audio data and frame data are generated in order and stored in predetermined recording units, thereby generating the moving image data.
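A minimal sketch of the per-second recording unit of Fig. 12 is given below. The function and the shape of the index entries are assumptions; the actual format additionally records fixed-length regions and the index information 1218 described later.

```python
def append_one_second(out, audio_pcm: bytes, frames, index):
    """Append one recording unit (one second of audio followed by the video
    frames captured in that second) and remember offsets and sizes for the
    index information written at the end of the file."""
    index.append(("audio", out.tell(), len(audio_pcm)))
    out.write(audio_pcm)                  # audio data region
    for frame in frames:                  # frame data elements (1203-1206, ...)
        index.append(("frame", out.tell(), len(frame)))
        out.write(frame)
```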
When one second of data has been stored in this manner, referring to step S1104 of Fig. 11, the system controller 50 starts recording the moving image data stored in the memory 32 onto the storage medium 200 in parallel with the recording of the moving image and audio. The system controller 50 repeats steps S1101 to S1104 until a request to stop the moving image recording is detected (step S1105). The request to stop the moving image recording is generated by detecting the second shutter release signal SW2 again, detecting that the free space of the storage medium 200 is insufficient, or detecting that the free space of the memory 32 is insufficient.
As described above, Fig. 9 shows an exemplary structure of the directories and files recorded on the storage medium 200 by the recording processing performed in the digital camera 100. Motion picture files recorded in the motion picture recording mode have the extension AVI, as shown at 915 and 917. Thumbnail files recording management information have the extension THM, as shown at 916 and 918.
Returning to Fig. 11, when the moving image recording processing is stopped in response to a request to stop the moving image recording, the flow proceeds from step S1105 to step S1106. At step S1106, the system controller 50 writes the moving image data remaining in the memory 32 onto the storage medium 200 and then records index information 1218, in which the offset and size of each audio data element and video data element are stored. Then, at step S1107, the system controller 50 generates header information (for example, the total number of frames). Then, at step S1108, the system controller 50 describes the total data size in the directory entry and records the total data size information on the storage medium 200. This completes the recording of the motion picture file. At step S1109, a thumbnail file is generated from the management information of the motion picture file; the thumbnail file has the same number as the motion picture file and has the extension THM (for example, MVI_0005.THM (916)). An example of the structure of the thumbnail file and of the processing for generating and recording the thumbnail file will be described later with reference to Figs. 10 and 13.
Structure and recording of the thumbnail file
The thumbnail file generated during moving image recording has a file structure similar to that of the image file shown in Fig. 10. However, this thumbnail file does not have the thumbnail image region 1010 for recording thumbnail data; the thumbnail image is recorded in the compressed data 1016.
The thumbnail file 1001 includes, at its beginning, a start of image (SOI) marker 1002 indicating the start of the image, followed immediately by an application marker (APP1) 1003 after the SOI 1002. The application marker (APP1) 1003 includes the following information:
Size (APP1 length) 1004;
Application marker identifier code (APP1 identifier code) 1005;
Creation date and time of the image data (date/time) 1006;
Date and time when the image data was generated (original date/time) 1007;
Classification information 1018 of the image data;
Automatically provided classification information 1020 of the image data;
Date print setting 1021 of the image data;
Focusing frame information 1022 of the image data;
Face information 1019; and
Other shooting information 1009.
The image data of the thumbnail file is a reduced image of the first video frame at the start of the moving image recording. This image data includes a define quantization table (DQT) 1012, a define Huffman table (DHT) 1013, a start of frame (SOF) marker 1014, a start of scan (SOS) marker 1015, and compressed data 1016 corresponding to the reduced image. The image data is terminated by an end of image (EOI) marker 1017 indicating its end.
An example of the thumbnail recording processing in step S1109 of Fig. 11 will now be described with reference to Fig. 13. At step S1301, the system controller 50 generates a thumbnail image. In the present embodiment, the thumbnail image is generated by applying image processing (for example, resizing to a predetermined image size) to the first video frame of the moving image data stored in the memory 32. Then, at step S1302, the compression/decompression unit 16 compresses the thumbnail image generated in step S1301 under the control of the system controller 50. Then, at step S1303, the header including the application marker 1003 (see Fig. 10) is generated; this processing has been described above with reference to Fig. 8. After the header generation is complete, at step S1304, the system controller 50 writes the thumbnail file, which includes the header and the thumbnail image data, onto the storage medium 200, and completes the thumbnail recording processing.
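For illustration, the image-side part of Fig. 13 might be sketched as follows. Pillow is used here purely as a stand-in for the camera's image processor and compression unit; the thumbnail size is an assumption.

```python
from io import BytesIO
from PIL import Image

def make_movie_thumbnail(first_frame: Image.Image, size=(160, 120)) -> bytes:
    """Step S1301: reduce the first video frame to a predetermined size;
    step S1302: JPEG-compress it.  The APP1 header of Fig. 10 is then
    generated (step S1303) and written with this data to the THM file."""
    buf = BytesIO()
    first_frame.resize(size).save(buf, format="JPEG")
    return buf.getvalue()
```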
Receiving mode processing
Fig. 14 is a flowchart showing exemplary processing in the receiving mode, one of the other modes shown in step S9 of Fig. 3. When the mode selector 60 of the digital camera 100 is switched to the receiving mode, the receiving mode processing shown in Fig. 14 is performed. The following description covers processing for receiving an image file from an external device (communication device) and recording it on a recording medium.
At step S1401, the system controller 50 checks whether a device to communicate with exists. If no communicating device exists ("NO" in step S1401), the receiving processing is completed. If a communicating device exists ("YES" in step S1401), the flow proceeds to step S1402, where the system controller 50 judges whether there is a transmission request. If there is no transmission request ("NO" in step S1402), the flow returns to step S1401, and the system controller 50 again checks whether a communicating device exists and waits for a transmission request.
If there is a transmission request ("YES" in step S1402), then at step S1403 the system controller 50 receives data from the communicating device through the communication unit 110 and temporarily holds the received data in the memory 32. Then, at step S1404, the system controller 50 writes the received data onto the storage medium 200. At this time, when the header of the received data includes classification information, the received data is recorded on the storage medium 200 without further processing. When the header does not include classification information, classification information can be newly provided by processing substantially the same as that shown in Fig. 8. In this case, the camera setting conditions at the time of shooting are obtained by referring to the header of the received data; for example, the face information 1019 included in the header of the received data, or the information about the shooting mode (scene mode) included in the other shooting information 1009, can be referred to. The subject information can be obtained by referring to the header of the received data, or new subject information can be detected by analyzing the received image data.
After the writing is complete, the flow returns to step S1401, and the system controller 50 again checks whether a device to communicate with exists and waits for a transmission request. If no communicating device exists, this processing is exited.
Reproduction mode processing
Fig. 15 is a flowchart showing exemplary processing in the reproduction mode shown in step S8 of Fig. 3. At step S1501, the system controller 50 obtains the latest image information from the storage medium 200. The advantage of obtaining the latest image information before calculating the total number of images is that the image to be displayed can be shown quickly after the reproduction mode starts.
Then, at step S1502, the system controller 50 judges whether the latest image information has been obtained successfully. If it has not been obtained successfully ("NO" in step S1502), the flow proceeds to step S1509, where the system controller 50 enters an input wait state with no image. An example of the processing in step S1509 will be described later with reference to the flowchart of Fig. 16. One example of a case in which the latest image information cannot be obtained is a state in which there is no image; another example is a state in which a defect of the medium causes the acquisition to fail. At step S1502, when the latest image information has been obtained successfully, it is judged that at least one image exists, so the flow proceeds to step S1503.
At step S1503, the system controller 50 reads the latest image data from the storage medium 200 based on the latest image information obtained in step S1501. Then, at step S1504, the system controller 50 performs file analysis and obtains the shooting information and attribute information of the latest image data that has been read. An example of this file analysis processing will be described later with reference to Fig. 23. At step S1505, the system controller 50 displays the latest image data that has been read. At this time, the system controller 50 also displays the shooting information and attribute information obtained in step S1504. If, according to the file analysis result in step S1504, it is judged that the read data is invalid, for example because of file corruption, the system controller 50 also displays an error indication.
At step S1508, the system controller 50 enters the input wait state for reproduction. An example of this input wait state for reproduction will be described later with reference to Figs. 17A and 17B.
Input wait state with no image
Exemplary processing in the input wait state with no image, shown in step S1509 of Fig. 15, will now be described with reference to Fig. 16. At step S1601, the system controller 50 displays a message meaning "no image" on the image display unit 28 to notify the user that there is no image data. Then, at step S1602, the system controller 50 waits for an operation input. Examples of operation inputs here include operations performed by the user on buttons or the battery cover and an event notifying that the power supply is low. If any operation input is detected, the flow proceeds to step S1603. At step S1603, the system controller 50 judges whether the operation input is an operation of the end button. If the operation input is an operation of the end button ("YES" in step S1603), the reproduction mode processing is completed and the flow proceeds to step S10 of Fig. 3. If the operation input is an operation other than the end button ("NO" in step S1603), the flow proceeds to step S1604, where the system controller 50 performs processing corresponding to the operation input. For example, even though there is no image data, if an operation of the menu button is input, a menu is displayed on the image display unit 28, allowing the user to change settings.
Input wait state for reproduction
Exemplary processing in the input wait state for reproduction will now be described with reference to Figs. 17A and 17B. At step S1701, the system controller 50 judges whether there is an operation input from the user. Examples of operation inputs here include operations performed by the user on buttons or the battery cover and an event notifying that the power supply is low. The system controller 50 waits until any input is detected. If any input has been detected, the flow proceeds to step S1702.
At step S1702, the system controller 50 judges whether the detected operation input is an operation of the search key setting button included in the operation unit 70. If the operation input is an operation of the search key setting button ("YES" in step S1702), the flow proceeds to step S1703. At step S1703, the system controller 50 sets the next search key and stores it in the system memory 52. A search key is attribute information used as a unit of search; examples include the shooting date, classification information, folder, and moving image. That is, when searching by shooting date, classification information, folder, and moving image is available, these are sequentially selected as the search key for the images recorded on the storage medium 200. This sequential selection can include cancelling the selected search key, that is, switching to the reproduction mode for all images.
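The cyclic selection of the search key in step S1703 might look like the following sketch. The key names and their order are assumptions based on the examples given above.

```python
# Candidate search keys; None means "no filter", i.e. reproduce all images.
SEARCH_KEYS = [None, "shooting_date", "classification", "folder", "movie"]

def next_search_key(current):
    """Advance to the next search key each time the search key setting
    button is pressed (step S1703)."""
    i = SEARCH_KEYS.index(current)
    return SEARCH_KEYS[(i + 1) % len(SEARCH_KEYS)]
```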
If the operation input is not an operation of the search key setting button ("NO" in step S1702), the flow proceeds to step S1704. At step S1704, the system controller 50 judges whether the detected operation input is an operation of the image advance button included in the operation unit 70. If the detected operation input is an operation of the image advance button ("YES" in step S1704), the flow proceeds to step S1705. At step S1705, the system controller 50 reads the next image to be displayed according to the search key set in step S1703. The image advance button is composed of a pair of button portions, each indicating a direction in which an image to be displayed exists; the next image to be displayed is read according to the direction corresponding to the pressed button portion. Then, at step S1706, the system controller 50 performs file analysis on the shooting information and attribute information of the image data read in step S1705. An example of this file analysis will be described later mainly with reference to Fig. 23. At step S1707, the system controller 50 displays the image data read in step S1705. At this time, the shooting information and attribute information are displayed based on the result of the file analysis in step S1706. If, according to the file analysis result in step S1706, it is judged that the read data is invalid, for example because of file corruption, the system controller 50 also displays an error indication. After the display is complete, the system controller 50 returns to the input wait state in step S1701.
If the detected operation input is not an operation of the image advance button ("NO" in step S1704), the flow proceeds to step S1709. At step S1709, the system controller 50 judges whether the calculation of the total number of images started in step S2103 of Fig. 21 has been completed. If it has not been completed ("NO" in step S1709), the flow returns to step S1701, where the system controller 50 waits for an operation input. At this time, a message or an icon notifying the user that the calculation has not been completed may be displayed. In this way, before the calculation of the total number of images is completed, only the image advance operation performed with the image advance button and the end operation performed with the end button are carried out; other operation inputs are ignored until the calculation of the total number of images is completed.
If the calculation of the total number of images has been completed ("YES" in step S1709), the flow proceeds to step S1710. At step S1710, the system controller 50 judges whether the classification information setting menu has been selected by an operation of the operation unit 70. If the classification information setting menu has been selected ("YES" in step S1710), the flow proceeds to step S1711. At step S1711, the system controller 50 performs the processing of the classification information setting mode. An example of this processing in the classification information setting mode will be described later with reference to Fig. 20.
If the classification information setting menu has not been selected ("NO" in step S1710), the flow proceeds to step S1712. At step S1712, the system controller 50 judges whether the detected operation input is an operation of the erase button included in the operation unit 70. If the detected operation input is an operation of the erase button ("YES" in step S1712), the flow proceeds to step S1713. At step S1713, the system controller 50 erases the image data currently displayed on the image display unit 28. Subsequently, at step S1714, the total number of images after the erasure is checked. If the total is 0 ("YES" in step S1714), the flow proceeds to step S1715, where the system controller 50 returns to the input wait state with no image. This input wait state with no image has been described above with reference to Fig. 16.
If image data still remains after the erasure ("NO" in step S1714), the flow proceeds to step S1716, where the system controller 50 reads the image data to be displayed next. Here, the image data to be displayed next is the image data having the file number following that of the erased image data. If the latest image data has been erased, the image data having the file number immediately preceding that of the erased image data is displayed instead. Subsequently, at step S1717, the system controller 50 performs file analysis on the image data read in step S1716 as the image data to be displayed and obtains the shooting information and attribute information. An example of this file analysis will be described later mainly with reference to Fig. 23. At step S1718, the system controller 50 displays the image data read in step S1716 on the image display unit 28. At this time, the shooting information and attribute information obtained in step S1717 are also displayed. If, according to the file analysis result in step S1717, it is judged that the read data is invalid, for example because of file corruption, the system controller 50 also displays an error indication. After the display is complete, the system controller 50 returns to the input wait state in step S1701.
If the detected operation input is not an operation of the erase button ("NO" in step S1712), the flow proceeds to step S1719. At step S1719, the system controller 50 judges whether the detected operation input is an operation of the edit button. If the detected operation input is an operation of the edit button ("YES" in step S1719), the flow proceeds to step S1720, where the system controller 50 performs editing. An example of this editing will be described later mainly with reference to Figs. 18A and 18B.
If the detected operation input is not an operation of the edit button ("NO" in step S1719), the flow proceeds to step S1721. At step S1721, the system controller 50 judges whether the detected operation input is an operation of the end button. If the detected operation input is an operation of the end button ("YES" in step S1721), the reproduction mode processing is completed and the flow proceeds to step S10 of Fig. 3.
If the detected operation input is not an operation of the end button ("NO" in step S1721), the flow proceeds to step S1724. At step S1724, the system controller 50 performs processing corresponding to the other operation input. Examples of this processing include editing of the image, switching to multi-image reproduction, and displaying a menu when the menu button is pressed. Multi-image reproduction is a reproduction mode in which a set of thumbnail images is displayed on one screen of the image display unit 28.
Editing
An example of the editing processing in step S1720 of Fig. 17 will now be described with reference to Figs. 18A and 18B. Examples of editing processing that can be performed include recording a new image file produced by trimming (cropping) the image file displayed on the image display unit 28 or by converting its image size (resizing). This editing processing is described below with reference to the flowcharts shown in Figs. 18A and 18B. In the following description, the editing is performed on the file whose file name is IMG_0002.JPG (905).
At step S1801, the system controller 50 obtains the image file name (IMG_0002.JPG) of the image data displayed on the image display unit 28. Then, at step S1802, the system controller 50 reads the image data corresponding to the obtained file name from the storage medium 200 into the memory 32. Then, at step S1803, the compression/decompression unit 16 decompresses the image data read in step S1802 under the control of the system controller 50 and stores the decompressed data in the memory 32.
Then, at step S1804, the system controller 50 judges whether the editing to be performed is resizing. If the editing to be performed is resizing ("YES" in step S1804), the flow proceeds to step S1805. At step S1805, the system controller 50 uses the image processor 24 to enlarge or reduce the decompressed image data to a predetermined image size. Then, at step S1806, the system controller 50 compresses the resized image data using the compression/decompression unit 16 and stores it in the memory 32. Then, at step S1807, the system controller 50 obtains the classification information of the original image file read in step S1802 and stores it in the system memory 52. A predetermined menu screen allows the user to specify the zoom factor for the enlargement/reduction.
Then, at step S1808, the system controller 50 temporarily sets the ON/OFF value of the automatic classification setting to "OFF". The original ON/OFF value of the automatic classification setting is recorded (saved) in a different area of the system memory 52. Then, at step S1809, the system controller 50 generates the header of the edited image data. More specifically, the header of the original image file read into the memory 32 is copied, and the header generation described with reference to Fig. 8 is performed for the newly generated image file using the copied header of the original image file. Because the automatic classification setting has been set to "OFF", classification information is not provided automatically. Because the header is generated based on the header of the original image file, the newly generated image file after editing inherits the classification information from the original image file. Items such as the image size and the creation date and time are changed appropriately. Then, at step S1810, the system controller 50 restores the recorded (saved) value of the ON/OFF automatic classification setting.
In this way, the generation of the image data of the newly generated image file is completed. Then, at step S1811, the file name of the newly generated image file is generated; in the present embodiment, the file name IMG_0003.JPG is generated. Then, at step S1812, the system controller 50 writes the image file generated in this manner onto the storage medium 200 and completes the editing.
For editing such as resizing, which does not change the content of the image, the image file before editing and the image file after editing have the same classification information. Therefore, by having the edited image data inherit the classification information automatically provided to the original image data at the time of shooting together with any classification information provided by the user, operations such as searching remain highly convenient even for the edited image data.
If the editing to be performed is not resizing ("NO" in step S1804), the flow proceeds to step S1813, where the system controller 50 judges whether the editing to be performed is trimming. If the editing to be performed is trimming ("YES" in step S1813), the flow proceeds to step S1814. At step S1814, the system controller 50 uses the image processor 24 to trim the decompressed image data to the designated size. Then, at step S1815, face detection is performed on the trimmed image retained after the trimming. Then, at step S1816, the system controller 50 uses the image processor 24 to resize (enlarge/reduce) the trimmed image. Then, at step S1817, the system controller 50 compresses the resized image data using the compression/decompression unit 16 and stores the compressed image data in the memory 32 again.
Then, at step S1818, the system controller 50 obtains the classification information of the original image file read in step S1802 and stores it in the system memory 52. A menu screen allows the user to specify the trimming position and the zoom factor for the resizing (enlargement/reduction). Then, at step S1819, the system controller 50 generates the header of the edited image data. More specifically, the header of the original image file read into the memory 32 is copied, and the header generation described with reference to Fig. 8 is performed for the newly generated image file using the copied header of the original image file. If the automatic classification setting is "ON", classification information is provided automatically based on the face information detected in step S1815. Items such as the image size and the creation date and time are changed appropriately.
In this way, the generation of the image data of the newly generated image file is completed. At step S1811, the file name of the newly generated image file is generated; in the present embodiment, the file name IMG_0003.JPG is generated. Then, at step S1812, the system controller 50 writes the image file generated in this manner onto the storage medium 200 and completes the editing.
For editing such as trimming, which changes the image content, classification information is provided again based on the edited image. Therefore, highly convenient operations (for example, search) can be performed even for the edited image data.
If the editing to be performed is not trimming ("NO" in step S1813), the flow proceeds to step S1820, where the system controller 50 performs other processing. Examples of other processing include color conversion of the image, change of the image shape, and combination of images. Even in these cases, the corresponding image analysis can be performed and the header generation accompanying the editing can be carried out. The processing shown in Fig. 18 corresponds to exemplary processing performed by the editing unit and the re-providing unit, as sketched below.
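The contrast between the two editing branches can be summarized in a short sketch. The helper names are assumptions, the face detector stands in for the processing of step S1815, and auto_classify refers to the illustrative function sketched after the description of Fig. 8.

```python
def build_edited_header(original_header: dict, edit: str,
                        detect_faces=None, auto_classify=None) -> dict:
    """Resizing (steps S1808-S1810) copies the original header, so the
    classification information is inherited unchanged.  Trimming (steps
    S1815, S1818-S1819) re-detects faces on the edited image and, when the
    automatic classification setting is ON, provides classification
    information again."""
    header = dict(original_header)               # start from a copy
    if edit == "crop":
        faces = detect_faces() if detect_faces else []
        header["faces"] = faces
        if auto_classify is not None:            # re-provide classification
            header["classification"] = auto_classify(bool(faces),
                                                     header.get("scene_mode"))
        header["date_print"] = False             # printed date may have been cut off
    # For "resize" nothing else changes: auto-provision is forced OFF while
    # the header is generated; size and creation date are updated separately.
    return header
```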
A specific example of trimming will now be described with reference to Figs. 19A to 19D. In Fig. 19A, reference numeral 1901 denotes the image before trimming. Reference numeral 1902 denotes the trimming designation region to be cut out by an operation from the user. Reference numeral 1903 denotes the face detection focusing frame at the time of shooting. Reference numeral 1904 denotes the information, indicating the shooting date, that appears when the date print setting is active. The image 1901 before trimming has the attributes "label (classification information): people; number of faces: 2; date print: present (ON); face coordinates 1: left, 10 x 10; face coordinates 2: center, 10 x 10".
Fig. 19B shows the trimmed image 1905 retained after trimming using the trimming designation region 1902. According to the flowcharts of Figs. 18A and 18B, the attributes of the trimmed image 1905 are "label (classification information): people; number of faces: 1; date print: none (OFF); face coordinates 1: center, 90 x 90; face coordinates 2: none". The changed attribute information enables appropriate message display and search when the image is reproduced. For example, as shown by the face detection focusing frame 1906, even after trimming it is possible to display which part of the subject was recognized as a face and judged to be the focusing frame.
In Fig. 19C, reference numeral 1911 denotes the image before trimming. Reference numeral 1912 denotes the trimming designation region to be cut out by an operation from the user. Reference numeral 1913 denotes the face detection focusing frame at the time of shooting. The image 1911 before trimming has the attributes "label (classification information): people; number of faces: 2; date print: present; face coordinates 1: left, 10 x 10; face coordinates 2: center, 10 x 10".
Fig. 19D shows the trimmed image 1915 retained after trimming using the trimming designation region 1912. According to the flowcharts of Figs. 18A and 18B, the attributes of the trimmed image 1915 are "label (classification information): landscape; number of faces: 0; date print: none; face coordinates 1: none; face coordinates 2: none". The changed attribute information enables appropriate message display and search when the image is reproduced. For example, according to the processing described with reference to Figs. 18A and 18B, the image 1911 before trimming has the attribute "label: people", whereas the trimmed image 1915 has the attribute "label: landscape". A search with the search key "people" therefore does not retrieve the trimmed image 1915, in which no person appears, while the trimmed image 1915 can be retrieved with the search key "landscape".
The trimmed images 1905 and 1915 each have the attribute "date print: none" after the change from the corresponding image before trimming. For example, when printing is performed with a printer having a function of adding and printing the date, and an image such as the pre-trimming image 1901 or 1911 already has the date embedded, setting "date print: present" allows the printer to suppress date printing and thereby avoid overlapping printing of the date and time information. When the date print portion is excluded from the image, as in the trimmed images 1905 and 1915, changing the attribute to "date print: none" allows the printer side to add and print the date appropriately.
Classification information setting mode processing
As described with reference to Figs. 17A and 17B, in the digital camera 100 according to the present embodiment, the processing of the classification information setting mode is performed by selecting the classification information setting menu. Fig. 20 is a flowchart showing exemplary processing in the classification information setting mode.
At step S2001, the system controller 50 judges whether there is an operation input from the user. Examples of operation inputs here include operations performed by the user on buttons or the battery cover and an event notifying that the power supply is low. The system controller 50 waits until any input is detected.
If the detected operation input is an end operation indicating that the classification information setting is to be completed ("YES" in step S2002), the flow proceeds to step S2003. In the present embodiment, examples of the end operation indicating completion of the classification information setting include an operation of the menu button of the operation unit 70 for exiting the classification information setting, an operation for turning off the power, and an operation for changing the mode from the reproduction mode to a shooting mode. At step S2003, the system controller 50 writes the classification information of the image data, changed in step S2011 described later, into the image file. Then the classification information setting mode is completed, and the processing returns to the input wait state in step S1701 of Fig. 17.
If the detected operation input is an operation of the image advance button included in the operation unit 70 ("YES" in step S2005), the flow proceeds from step S2005 to step S2006. At step S2006, the system controller 50 writes the classification information of the image data, changed in step S2011 described later, into the image file. Then, at step S2007, the system controller 50 reads the next image data to be displayed. The image advance button is composed of a pair of button portions (rightward and leftward in the present embodiment); the image data to be displayed next is changed according to the selected direction.
Then, at step S2008, the system controller 50 performs file analysis on the image data read in step S2007 and obtains attribute information from the file. An example of this file analysis will be described later mainly with reference to Fig. 23. At step S2009, the system controller 50 displays the read image data on the image display unit 28. At this time, the shooting information and attribute information (for example, classification information) are displayed according to the settings. If, according to the file analysis result in step S2008, it is judged that the read data is invalid, for example because of file corruption, the system controller 50 also displays an error indication. After the display is complete, the system controller 50 returns to step S2001 and enters the input wait state.
The image advance operation described in steps S2005 to S2009 can be applied both to single-image reproduction, in which a single image is displayed on a single screen, and to multi-image reproduction (also referred to as multi-image display), in which a plurality of images (for example, nine images) are displayed on a single screen. In the case of multi-image display, the cursor is moved continuously in response to the image advance instruction, and the classification information of the image data is written into the image file in response to this movement.
If the operation input detected in step S2010 is a classification information change operation ("YES" in step S2010), the flow proceeds from step S2010 to step S2011. At step S2011, the system controller 50 changes the classification information of the displayed image data. At this stage, the change in the classification information is not written into the image file but is stored in the memory 32. Then, at step S2012, the system controller 50 reflects the changed classification information in the display on the image display unit 28.
If the operation input detected in step S2010 is none of the operations described above ("NO" in step S2010), the flow proceeds to step S2013, where other processing is performed. An example of the other processing is switching between single-image reproduction and multi-image reproduction.
As described above, the classification information is written into the image file when the displayed image data is switched or when the classification information setting mode is completed. This reduces the number of accesses to the storage medium 200 and improves the operating speed.
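The deferred write described above can be sketched as a small in-memory cache. The class and method names are invented purely for illustration.

```python
class ClassificationEditor:
    """Hold classification edits in memory (step S2011) and flush them to the
    image file only on image advance or when the mode ends (steps S2003 and
    S2006), reducing writes to the storage medium."""

    def __init__(self, write_header):
        self._write_header = write_header   # callback that rewrites the file header
        self._pending = {}                  # file name -> edited label set

    def change(self, file_name, labels):
        self._pending[file_name] = set(labels)        # kept in memory only

    def flush(self, file_name):
        if file_name in self._pending:
            self._write_header(file_name, self._pending.pop(file_name))
```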
File management
Fig. 21 is a flowchart showing an example of the image file management processing in step S2 of Fig. 3. At step S2101, the system controller 50 clears the confirmation flag of the latest image recorded in the system memory 52. At step S2102, the system controller 50 clears the confirmation flag of the total number of files. At step S2103, the system controller 50 issues an instruction for starting an image search, which is performed in parallel with the processing described above. Then this processing is completed.
Image search
Figure 22 is a flowchart of an example of the image search processing carried out in response to the instruction to start an image search issued in step S2103 of Figure 21. When the start of the image search is instructed, the flow advances from step S2211 to step S2212. In step S2212, the system controller 50 generates a reproduction directory list. For example, in the case of a reproduction apparatus that conforms to the DCF standard, this processing analyzes the directory entries of the DCF root, searches for DCF directories, and adds the DCF directories to the reproduction directory list.
Then, in step S2213, the system controller 50 judges whether a reproduction directory exists. If it is judged that no reproduction directory exists, that is, there is no directory or file that can be handled by the digital camera 100 (NO in step S2213), the system controller 50 sets the total number of files to 0. Thereafter, in step S2222, the system controller 50 sets the confirmation flag for the total number of files to 1 and completes the processing.
If it is judged that a reproduction directory exists (YES in step S2213), the system controller 50 initializes the image search directory in step S2214. For a reproduction apparatus that conforms to the DCF standard, for example, the DCF directory with the largest number is set as the image search directory. Then, in step S2215, the system controller 50 calculates the total number of images in the directory set as the target of the image search by analyzing the directory entries of that directory. The calculated total number of images in the directory is added to the total number of images on the storage medium 200.
In step S2216, the system controller 50 obtains the information described in the directory entries of the DCF root. Specifically, it obtains the minimum file number, the maximum file number, the sum of the file numbers, the sum of the timestamps, the sum of the file sizes, the total number of files, and other items. These items are stored in the system memory 52 as directory entry information.
Then, in step S2217, the system controller 50 judges whether a reproducible image file, that is, a file that can be handled by the digital camera 100, exists. If it is judged that a reproducible image file exists (YES in step S2217), the flow advances to step S2218, where the system controller 50 confirms the latest image and sets the latest-image confirmation flag to 1.
If an instruction to end the total-count calculation has been issued by operating the end button (YES in step S2220), the flow advances to step S2222 and exits this processing. If no such instruction has been issued (NO in step S2220), the system controller 50 judges in step S2221 whether an unsearched directory exists. If it is judged that an unsearched directory exists (YES in step S2221), the flow advances to step S2219. In step S2219, the unsearched directory is set as the image search directory, and the flow returns to step S2215. In this way, the processing of steps S2215 to S2218 is carried out for all directories in the reproduction directory list generated in step S2212. After the processing of steps S2215 to S2218 has been completed for all directories, the flow advances to step S2222. In step S2222, the system controller 50 gives notification of the latest-image confirmation and of the calculated total number of images, sets the confirmation flag for the total number of files, and exits this processing.
Even when a reproduction directory exists, if the directory contains no reproducible images, the confirmation flag for the total number of files is set and this processing is exited; the absence of reproducible images in the directory means that the total number of images is 0.
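For illustration, the search of Figure 22 can be pictured as enumerating DCF-style directories under a DCIM folder, counting reproducible images, and remembering the latest one; the directory-name pattern and return fields in the sketch below are assumptions, since the patent describes the flow only at flowchart level.

```python
# Rough sketch of the image search of Figure 22 for a DCF-style card layout.
# The DCIM/"nnnXXXXX" naming and the returned fields are assumptions.
import os
import re

DCF_DIR_PATTERN = re.compile(r"^\d{3}[A-Z0-9_]{5}$")
IMAGE_EXTENSIONS = {".JPG", ".JPEG"}

def search_images(root):
    # Step S2212: build the reproduction directory list from the DCF root.
    dcim = os.path.join(root, "DCIM")
    if not os.path.isdir(dcim):
        return {"total": 0, "latest": None}      # no reproducible directory
    dirs = sorted(d for d in os.listdir(dcim)
                  if DCF_DIR_PATTERN.match(d)
                  and os.path.isdir(os.path.join(dcim, d)))
    total, latest, latest_mtime = 0, None, -1.0
    # Steps S2215-S2221: visit every directory, accumulating the image count
    # and remembering the most recently modified (i.e. latest) image.
    for d in dirs:
        for name in os.listdir(os.path.join(dcim, d)):
            if os.path.splitext(name)[1].upper() in IMAGE_EXTENSIONS:
                total += 1
                path = os.path.join(dcim, d, name)
                mtime = os.path.getmtime(path)
                if mtime > latest_mtime:
                    latest, latest_mtime = path, mtime
    # Step S2222: the caller would now set the total-count confirmation flag.
    return {"total": total, "latest": latest}
```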
File analysis
An example of the file analysis processing in step S1504 of Figure 15, steps S1706 and S1717 of Figures 17A and 17B, and step S2008 of Figure 20 is now described with reference to Figure 23. In step S2301, the system controller 50 judges whether the file to be evaluated has a file header in which photographing information and attribute information (for example, classification information) are described. If the file is judged to have such a file header (YES in step S2301), the system controller 50 obtains the photographing information from the file header in step S2302 and obtains the classification information from the file header in step S2303. In step S2304, the system controller 50 obtains information about the image data body, for example, the start position of the image data body and the image compression method.
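For illustration, if the image file is assumed to be an Exif/JPEG file (the patent does not fix the format), the header check of step S2301 and the image-body lookup of step S2304 correspond roughly to scanning the JPEG marker segments:

```python
# Illustrative file analysis for an Exif/JPEG file; the JPEG/APP1 layout here
# is an assumption for illustration only.
import struct

def analyze_jpeg(path):
    info = {"has_header": False, "image_body_offset": None}
    with open(path, "rb") as f:
        if f.read(2) != b"\xff\xd8":            # SOI missing: treat as corrupt
            return None
        while True:
            chunk = f.read(4)
            if len(chunk) < 4:                  # truncated file: treat as corrupt
                return None
            marker, length = struct.unpack(">2sH", chunk)
            if marker[0] != 0xFF:
                return None                     # broken marker stream
            if marker == b"\xff\xe1":           # APP1 (Exif): step S2301, header exists
                info["has_header"] = True       # photographing/attribute info lives here
                f.seek(length - 2, 1)
            elif marker == b"\xff\xda":         # SOS: start of compressed image body
                info["image_body_offset"] = f.tell()   # step S2304
                break
            else:
                f.seek(length - 2, 1)           # skip other segments
    return info
```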
As described above, according to the present embodiment, when an image is edited, classification information is provided again to the edited image. Therefore, appropriate classification information can be provided to the edited image.
Second embodiment
A second embodiment is described below with reference to the drawings.
The main difference from the first embodiment is that in the present embodiment, in addition to classification information, photographing information such as the focusing frame and the shooting date is also provided to the image, whereas in the first embodiment only classification information is provided to the image. The following explanation focuses on this difference.
Detailed descriptions of parts that are the same as in the first embodiment, such as the structure of the digital camera, shooting, recording, reproduction, processing in each mode, and the file structure, are omitted.
Header generation
Another example of the header generation processing in step S706 of Fig. 7 is now described with reference to Figure 24.
In step S2401, the system controller 50 obtains from the system memory 52 the ON/OFF setting value, specified in step S405 of Fig. 4, of the setting for automatically providing classification information, and judges whether classification information is to be provided automatically to the captured image data. If the ON/OFF setting value of the setting for automatically providing classification information is judged to be "OFF", that is, classification information is not provided automatically (NO in step S2401), the flow advances to step S2409.
If the ON/OFF setting value of the setting for automatically providing classification information is judged to be "ON", that is, classification information is provided automatically (YES in step S2401), the flow advances to step S2402. In step S2402, the system controller 50 reads the face information held in the system memory 52 in step S406 of Fig. 4 and judges whether a face has been detected. If it is judged that a face has been detected (YES in step S2402), the flow advances to step S2404, where the classification information "People" is provided. If it is judged that no face has been detected (NO in step S2402), the flow advances to step S2403.
In step S2403, the system controller 50 checks the scene mode of the image at the time of shooting, which is stored in the system memory 52, and judges whether the scene mode is any one of "Portrait mode", "Night Snapshot mode", and "Kids & Pets mode". If the scene mode is judged to be any one of them (YES in step S2403), the flow advances to step S2404, where the system controller 50 provides the classification information "People" to the image data. After the classification information "People" has been provided in step S2404, or if the scene mode is judged to be none of them (NO in step S2403), the flow advances to step S2405.
As described above, in steps S2402 to S2404, the same classification information "People" is provided according to face information, which is an example of subject information, and according to the scene mode, which is an example of a camera setting condition at the time of shooting. Subject information and camera setting conditions at the time of shooting are different parameters at the time of shooting, but depending on their content they can have a similar meaning after shooting. Face information, which is one item of subject information, and "Portrait mode", "Night Snapshot mode", and "Kids & Pets mode", which are camera setting conditions at the time of shooting, all have the same meaning: it is inferred that a person was photographed. Therefore, by providing the same classification information to image data with this kind of content, the convenience of post-shooting operations (for example, search operations) is enhanced. That is, by providing the same classification information for specific subject information and specific camera setting conditions, classification information can be provided as a parameter that is suited to post-shooting operations (for example, search operations) and that differs from the parameters at the time of shooting. This can enhance convenience.
In addition, the above classification information provision processing can provide the same classification information for different scene modes, namely Portrait mode, Night Snapshot mode, and Kids & Pets mode. Different scene modes correspond to different camera setting conditions at the time of shooting, but they can have a similar meaning. Portrait mode, Night Snapshot mode, and Kids & Pets mode all have the same meaning: it is inferred that a person was photographed. Therefore, providing the same classification information to such image data enhances the convenience of post-shooting operations (for example, search operations). That is, by providing the same classification information for a plurality of specific setting conditions among the camera setting conditions at the time of shooting, classification information can be provided as a parameter that is suited to post-shooting operations (for example, search operations) and that differs from the parameters at the time of shooting. This can enhance the convenience of post-shooting operations.
Returning to Figure 24, in step S2405, the system controller 50 judges whether the scene mode is any one of "Foliage mode", "Landscape mode", and "Fireworks mode". If the scene mode is judged to be any one of them (YES in step S2405), the flow advances to step S2406, where the system controller 50 provides the classification information "Landscape" to the image data. After the classification information "Landscape" has been provided in step S2406, or if the scene mode is judged to be none of them (NO in step S2405), the flow advances to step S2407.
In step S2407, the system controller 50 judges whether the scene mode is any one of "Party mode", "Snow mode", "Beach mode", "Fireworks mode", "Aquarium mode", and "Underwater mode". In these modes, it is inferred that an event was photographed. If the scene mode is judged to be any one of them (YES in step S2407), the flow advances to step S2408, where the system controller 50 provides the classification information "Event" to the image data.
In the above processing, two types of classification information, "Landscape" and "Event", are provided to image data shot in "Fireworks mode". That is, a plurality of types of classification information are provided for a single scene mode. Even under the same camera setting condition at the time of shooting (scene mode), captured image data can have multiple meanings. An example of this is an image captured in "Fireworks mode". In this case, the system controller 50 provides a plurality of types of classification information corresponding to the post-shooting meanings. Therefore, classification information can be provided as a parameter that is suited to post-shooting operations (for example, search operations) and that differs from the parameters at the time of shooting. This can enhance the convenience of post-shooting operations of the digital camera 100.
If the judgement is "NO" in all of steps S2403, S2405, and S2407, that is, in "Auto mode", "Manual mode", or another mode, no classification information is provided.
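For illustration only, the branching of steps S2401 to S2408 can be condensed into a small mapping from the automatic-provision setting, the face-detection result, and the scene mode to a set of classification tags. The Python names below are illustrative and not part of the disclosure; note that "Fireworks mode" intentionally appears in both the Landscape and the Event groups.

```python
# Sketch of the automatic classification of steps S2401-S2408.
PEOPLE_MODES    = {"Portrait", "Night Snapshot", "Kids & Pets"}
LANDSCAPE_MODES = {"Foliage", "Landscape", "Fireworks"}
EVENT_MODES     = {"Party", "Snow", "Beach", "Fireworks", "Aquarium", "Underwater"}

def auto_classify(auto_enabled, face_detected, scene_mode):
    tags = set()
    if not auto_enabled:                      # step S2401 "OFF": no tags at all
        return tags
    if face_detected or scene_mode in PEOPLE_MODES:
        tags.add("People")                    # steps S2402-S2404
    if scene_mode in LANDSCAPE_MODES:
        tags.add("Landscape")                 # steps S2405-S2406
    if scene_mode in EVENT_MODES:
        tags.add("Event")                     # steps S2407-S2408
    return tags                               # empty in Auto/Manual mode with no face

# Example: Fireworks mode yields both "Landscape" and "Event".
assert auto_classify(True, False, "Fireworks") == {"Landscape", "Event"}
```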
After the processing for providing the classification information for the header is completed, the flow advances to step S2409.
In step S2409, if the system controller 50 judges that this header generation is header generation performed during editing, the flow advances to step S2410.
In step S2410, the coordinate information described in the header before editing is transformed. Here, the coordinate information is information, such as face information and focusing frame information, that relates to a position and a face region in the image. In editing that involves changing the angle of view, for example cropping or other cutting or combination, the coordinate information of the pre-editing image is inappropriate for the post-editing image. To address this, in the case of cropping, the coordinate information is transformed for the edited image based on the position and size of the cropped region.
Then, in step S2411, invalid coordinate information is judged and the priority is changed. The judgement of invalid coordinate information is carried out by judging whether the position and region coordinates transformed in step S2410 are contained within the edited image. Examples of conditions for judging coordinate information to be invalid include: the transformed position and region coordinates partly extend outside the edited image; the transformed position and region coordinates extend completely outside the edited image; and the center of the transformed coordinate region lies outside the edited image.
If, as with focusing frame information, the information has a priority based on, for example, the degree of focus at the time of shooting, the priority is reset. For example, when the main focusing frame is judged to be invalid coordinate information, the focusing frame with the next priority is reset as the main focusing frame. When the main focusing frame is judged to be invalid coordinate information and there is no focusing frame with a next priority, the image has no focusing frame coordinates.
Then, in step S2412, the coordinate information is set in the header. This is the post-editing coordinate information transformed in steps S2410 and S2411, or, when this processing is not performed during editing as judged in step S2409, the coordinate information at the time of shooting.
Then, in step S2413, it is judged whether focusing frame coordinates exist. Here, the focusing frame coordinates are the coordinates of the captured image from the processing at the time of shooting, or, when this processing is performed during editing, the focusing frame coordinates retained in step S2412. If focusing frame coordinates are judged to exist (YES in step S2413), the image is set as an in-focus image in the header in step S2414. If it is judged that there are no focusing frame coordinates (NO in step S2413), the image is set as an out-of-focus image in the header in step S2415.
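For illustration, steps S2410 to S2415 amount to shifting each face or focusing-frame rectangle into the cropped image's coordinate system, discarding rectangles that no longer fit (one of the invalidation rules listed above), renumbering the priorities of the surviving focusing frames, and marking the image as out of focus if none survive. A minimal sketch, assuming regions are axis-aligned rectangles given as (x, y, width, height):

```python
# Sketch of the coordinate transform and validity check of steps S2410-S2415.
# A frame is kept only if it lies entirely inside the cropped image, which is
# one of the invalidation rules described above.

def transform_frames(frames, crop_x, crop_y, crop_w, crop_h):
    """frames: list of dicts {'rect': (x, y, w, h), 'priority': int}."""
    kept = []
    for f in frames:
        x, y, w, h = f["rect"]
        nx, ny = x - crop_x, y - crop_y       # step S2410: shift into crop coordinates
        inside = nx >= 0 and ny >= 0 and nx + w <= crop_w and ny + h <= crop_h
        if inside:                            # step S2411: drop invalid frames
            kept.append({"rect": (nx, ny, w, h), "priority": f["priority"]})
    # Step S2411 (priority reset): renumber so the best surviving frame is 1.
    kept.sort(key=lambda f: f["priority"])
    for rank, f in enumerate(kept, start=1):
        f["priority"] = rank
    return kept

def is_in_focus(focusing_frames):
    # Steps S2413-S2415: the image is in focus only if a focusing frame survived.
    return len(focusing_frames) > 0
```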
Then, in step S2416, the system controller 50 generates the header using the classification information and the setting values (for example, information about the shooting date and time). Classification information that the user has changed is not updated.
For example, this can be achieved by not reflecting the classification information detected in this processing, based on the following judgement: classification information already described in the header that differs from the classification information previously provided automatically is classification information that the user has changed.
For example, when the automatically provided classification information is "Landscape" and the classification information set in the header is "Landscape" and "People", it is judged that the user has set "People"; therefore, the attribute "People" is not changed regardless of the result of automatic provision.
When the current result of automatic provision differs from the content of the classification information described in the header, the user may be asked to choose the attribute to provide on a GUI.
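As a rough illustration of this rule, the comparison can be expressed as a three-way set merge between the previously auto-provided tags, the tags currently in the header, and the newly computed automatic tags; the function below is an assumption about how such a merge might look, not the patent's defined behaviour.

```python
# Sketch of merging new automatic tags while preserving user edits (step S2416).
def merge_classification(previous_auto, header_tags, new_auto):
    """All arguments are sets of tag strings.

    Tags present in the header but absent from the previous automatic result are
    treated as user-added and kept; tags the user removed are not re-added.
    """
    user_added   = header_tags - previous_auto
    user_removed = previous_auto - header_tags
    return (new_auto - user_removed) | user_added

# Example from the description: auto provision gave {"Landscape"}, the user added "People".
assert merge_classification({"Landscape"}, {"Landscape", "People"},
                            {"Landscape"}) == {"Landscape", "People"}
```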
When a flag for enabling date printing on the image is set, the flag is cleared. Alternatively, the flag for date printing may be changed only when it is judged that the printed date is not contained in the cropped region. Returning to Figure 24, when the ON/OFF setting value of the setting for automatically providing classification information is judged to be OFF (NO in step S2401), the setting of classification information (steps S2402 to S2408) is skipped, and a header without classification information is therefore generated.
As described above, when an image is edited, the coordinate information (for example, focusing information) is transformed into coordinate positions in the edited image, and its validity is then judged. Therefore, an image that has lost the focusing frame information, which is one of its attributes, is judged to be an out-of-focus image, and an appropriate attribute can be provided.
Cropping
Cropping according to the second embodiment is described below.
In step S1804, the system controller 50 judges whether the editing to be performed is resizing. If the editing to be performed is judged not to be resizing (NO in step S1804), the flow advances to step S1813. In step S1813, the system controller 50 judges whether the editing to be performed is cropping. If the editing to be performed is judged to be cropping (YES in step S1813), the flow advances to step S1814.
In step S1814, the system controller 50 uses the image processor 24 to crop the decompressed image data to a specific size. Then, in step S1815, the system controller 50 performs face detection on the cropped image retained after cropping. Then, in step S1816, the system controller 50 uses the image processor 24 to resize (enlarge/reduce) the cropped image. Then, in step S1817, the system controller 50 compresses the resized image data using the compression/decompression unit 16 and stores the compressed image data in the memory 32 again.
Then, in step S1818, the system controller 50 obtains the classification information of the original image file read in step S1802 and stores it in the system memory 52. A menu screen allows the user to specify the position to be cut out in cropping and the scaling factor for resizing (enlargement/reduction). Then, in step S1819, the system controller 50 generates the header of the edited image data. More specifically, the header of the original image file read into the memory 32 is copied, and, using the copied header of the original image file, the header generation described with reference to Figure 24 is performed for the newly generated image file. If the setting for automatically providing classification information is set to "ON", classification information is provided automatically based on the face information detected in step S1815. Items such as the image size and the creation date and time are changed as appropriate.
In this way, the generation of the image data of the newly generated image file is completed. In step S1811, the file name of the newly generated image file is generated. In the present embodiment, the file name IMG_0003.JPG is generated. Then, in step S1812, the system controller 50 writes the image file generated in the above manner onto the storage medium 200 and completes the editing.
For editing that involves changing the content of the image, such as cropping, classification information is provided again based on the edited image. Therefore, even for edited image data, highly convenient operations (for example, search) can be performed.
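Tying steps S1814 to S1819 together, a crop edit re-runs face detection on the cropped pixels and regenerates the header from a copy of the original header plus the new detection results. The sketch below is a rough composition of the helper functions sketched earlier in this section (transform_frames, is_in_focus, merge_classification); detect_faces is passed in as a placeholder for the camera's face detector, the Pillow-style crop call is an assumption, and resizing (step S1816) is omitted for brevity.

```python
# End-to-end sketch of the crop edit of steps S1814-S1819, reusing the helpers
# sketched earlier; detect_faces() stands in for the camera's face detection
# and is not defined by the patent.

def crop_edit(image, header, crop_rect, auto_tag_enabled, detect_faces):
    cx, cy, cw, ch = crop_rect
    cropped = image.crop((cx, cy, cx + cw, cy + ch))        # step S1814 (Pillow-style)
    faces = detect_faces(cropped)                           # step S1815

    new_header = dict(header)                               # copy original header (step S1819)
    new_header["face_frames"] = transform_frames(
        header.get("face_frames", []), cx, cy, cw, ch)      # Figure 24, step S2410
    new_header["focus_frames"] = transform_frames(
        header.get("focus_frames", []), cx, cy, cw, ch)
    new_header["in_focus"] = is_in_focus(new_header["focus_frames"])
    if auto_tag_enabled:
        # Re-tag from the new detection result while keeping user edits;
        # scene-mode tags would also feed in here in the full flow.
        new_header["tags"] = merge_classification(
            header.get("auto_tags", set()),
            header.get("tags", set()),
            {"People"} if faces else set())
    new_header["size"] = cropped.size                        # update size, date, etc.
    return cropped, new_header
```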
If the editing to be performed is judged not to be cropping (NO in step S1813), the flow advances to step S1820, where the system controller 50 performs other processing. Examples of other processing include color conversion of an image, changing the shape of an image, and combining images. Even in these cases, the corresponding image analysis can be performed, and header generation can be performed in accordance with the editing.
A specific example of cropping is now described with reference to Figures 25A to 25D.
In Figure 25A, reference numeral 2501 denotes the image before cropping. Reference numeral 2502 denotes the crop-designated region to be cut out by an operation from the user. Reference numeral 2503 denotes a face-detection focusing frame at the time of shooting. Reference numeral 2504 denotes information that appears when the date print setting is activated and that indicates the shooting date. The image 2501 before cropping has the attributes "Tag (classification information): People; Number of persons: 2; Date print: yes; Face coordinates 1: left, 10 × 10; Face coordinates 2: center, 10 × 10; Focusing frame coordinates 1: left, 10 × 10, priority 2; Focusing frame coordinates 2: center, 10 × 10, priority 1; In-focus image: yes".
Figure 25B shows the cropped image 2505 retained after cropping using the crop-designated region 2502. According to the flowcharts of Figures 18A and 18B, the attributes of the cropped image 2505 are "Tag (classification information): People; Number of persons: 1; Date print: no; Face coordinates 1: center, 90 × 90; Face coordinates 2: none; Focusing frame coordinates 1: center, 90 × 90, priority 1; Focusing frame coordinates 2: none; In-focus image: yes". The changed attribute information enables appropriate message display and search when the image is reproduced. For example, as indicated by the face-detection focusing frame 2506, even after cropping it can still be shown which part of the subject is recognized as a face and which part is judged to be the focusing frame.
In Figure 25C, reference numeral 2511 denotes the image before cropping. Reference numeral 2512 denotes the crop-designated region to be cut out by an operation from the user. Reference numeral 2513 denotes a face-detection focusing frame at the time of shooting. The image 2511 before cropping has the attributes "Tag (classification information): People; Number of persons: 2; Date print: yes; Face coordinates 1: left, 10 × 10; Face coordinates 2: center, 10 × 10; Focusing frame coordinates 1: left, 10 × 10, priority 2; Focusing frame coordinates 2: center, 10 × 10, priority 1; In-focus image: yes".
Figure 25D shows the cropped image 2515 retained after cropping using the crop-designated region 2512. According to the flowcharts of Figures 18A and 18B, the attributes of the cropped image 2515 are "Tag (classification information): Landscape; Number of persons: 0; Date print: no; Face coordinates 1: none; Face coordinates 2: none; Focusing frame coordinates 1: none; Focusing frame coordinates 2: none; In-focus image: no". The changed attribute information enables appropriate message display and search when the image is reproduced. For example, according to the processing described with reference to Figure 18, the image 2511 before cropping has the attribute "Tag: People", whereas the cropped image 2515 has the attribute "Tag: Landscape". Therefore, the cropped image 2515, in which no person appears, is not retrieved with the search key "People" but can be retrieved with the search key "Landscape".
The cropped images 2505 and 2515 each have the attribute "Date print: no", changed from that of the image before cropping. For example, when printing is performed using a printer that has a function of adding and printing the date, if "Date print: yes" is set for an image with an embedded date, such as the pre-cropping images 2501 and 2511, the printer can suppress date printing to avoid overlapping printing of the date and time information. When the date-printed portion is excluded from the image, as in the cropped images 2505 and 2515, changing the attribute to "Date print: no" allows the printer side to appropriately add and print the date.
As described above, according to the present embodiment, appropriate photographing information can be provided even after an image has been edited.
The present invention can be realized by providing a system or apparatus with a storage medium that stores program code of software for realizing at least one of the above embodiments. In this case, a computer (or a CPU or a microprocessor unit (MPU)) of the system or apparatus reads the program code stored in the storage medium.
In this case, the program code itself read from the storage medium realizes the functions of the above embodiments, and the program code and the storage medium storing the program code are included within the scope of the present invention.
Examples of the storage medium for providing the program code include a floppy disk, a hard disk, a magneto-optical disk (MO), a compact disc read-only memory (CD-ROM), a recordable compact disc (CD-R), a magnetic tape, a nonvolatile memory card, and a ROM.
The case in which a computer realizes the functions of at least one of the above embodiments by executing the read program code is also included within the scope of the present invention. For example, the case in which the functions of at least one of the above embodiments are realized by an operating system (OS) running on the computer performing part or all of the actual processing according to the instructions of the program code is also included within the scope of the present invention.
Furthermore, there is a case in which the program code read from the storage medium is written into a memory provided on a function expansion board inserted into the computer or in a function expansion unit connected to the computer. In this case, for example, a CPU provided in the function expansion unit performs part or all of the actual processing according to the instructions of the program code; this is also included within the scope of the present invention.
In the above, an example in which the present invention is applied to a digital camera has been described. The application is not limited to this example. The present invention can be applied to apparatuses capable of reproducing images, such as printers, mobile phones, and portable terminals.
According to the present embodiment, when an image is edited, classification information corresponding to the edited image is provided again in accordance with the edited content. As a result, appropriate classification information can be provided to the edited image.
According to the present embodiment, appropriate photographing information can be provided even after an image has been edited.
Although the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the appended claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.

Claims (7)

1. An image pickup apparatus comprising:
an image pickup unit configured to capture an image;
a setting unit configured to set a shooting mode used by the image pickup unit;
a providing unit configured to provide classification information to the captured image according to the shooting mode set by the setting unit, or according to subject information detected from the image captured by the image pickup unit; and
an editing unit configured to edit the image having the classification information provided by the providing unit,
wherein, when the image is edited by the editing unit, the providing unit provides determined classification information to the edited image by changing the classification information provided to the image before editing into classification information determined according to subject information detected from the edited image.
2. The image pickup apparatus according to claim 1, wherein the editing performed by the editing unit crops the image.
3. The image pickup apparatus according to claim 1, further comprising a coordinate information providing unit configured to provide coordinate information to the image,
wherein the coordinate information providing unit provides, to the edited image, coordinate information relating to the edited image again in accordance with the editing performed by the editing unit.
4. The image pickup apparatus according to claim 1, further comprising a photographing information providing unit configured to provide the image with photographing information from the time of shooting,
wherein the photographing information providing unit provides photographing information to the edited image again in accordance with the editing performed by the editing unit.
5. The image pickup apparatus according to claim 4, wherein the photographing information includes at least one of focusing information at the time of shooting and information about the shooting date.
6. The image pickup apparatus according to claim 4, wherein the photographing information providing unit provides the image with a priority, and
the photographing information providing unit provides the priority to the edited image again in accordance with the editing performed by the editing unit.
7. An image pickup method comprising:
capturing an image;
setting a shooting mode used in the capturing;
providing classification information to the image according to the shooting mode, or according to subject information detected from the captured image; and
editing the image having the classification information,
wherein, when the image is edited in the editing, determined classification information is provided to the edited image by changing the classification information provided to the image before editing into classification information determined according to subject information detected from the edited image.
CN2008101444945A 2007-08-10 2008-08-11 Image pickup apparatus and image pickup method Active CN101365064B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201510354747.1A CN105007391B (en) 2007-08-10 2008-08-11 Image processing equipment and image processing method
CN201510355561.8A CN105049660B (en) 2007-08-10 2008-08-11 Image processing equipment and its control method
CN201210350890.XA CN102891965B (en) 2007-08-10 2008-08-11 Image processing equipment and control method, image processing method

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP2007210242 2007-08-10
JP2007210242 2007-08-10
JP2007-210242 2007-08-10
JP2008117296 2008-04-28
JP2008-117296 2008-04-28
JP2008117296A JP5014241B2 (en) 2007-08-10 2008-04-28 Imaging apparatus and control method thereof

Related Child Applications (3)

Application Number Title Priority Date Filing Date
CN201510354747.1A Division CN105007391B (en) 2007-08-10 2008-08-11 Image processing equipment and image processing method
CN201510355561.8A Division CN105049660B (en) 2007-08-10 2008-08-11 Image processing equipment and its control method
CN201210350890.XA Division CN102891965B (en) 2007-08-10 2008-08-11 Image processing equipment and control method, image processing method

Publications (2)

Publication Number Publication Date
CN101365064A CN101365064A (en) 2009-02-11
CN101365064B true CN101365064B (en) 2012-10-24

Family

ID=40391178

Family Applications (4)

Application Number Title Priority Date Filing Date
CN2008101444945A Active CN101365064B (en) 2007-08-10 2008-08-11 Image pickup apparatus and image pickup method
CN201210350890.XA Expired - Fee Related CN102891965B (en) 2007-08-10 2008-08-11 Image processing equipment and control method, image processing method
CN201510354747.1A Expired - Fee Related CN105007391B (en) 2007-08-10 2008-08-11 Image processing equipment and image processing method
CN201510355561.8A Active CN105049660B (en) 2007-08-10 2008-08-11 Image processing equipment and its control method

Family Applications After (3)

Application Number Title Priority Date Filing Date
CN201210350890.XA Expired - Fee Related CN102891965B (en) 2007-08-10 2008-08-11 Image processing equipment and control method, image processing method
CN201510354747.1A Expired - Fee Related CN105007391B (en) 2007-08-10 2008-08-11 Image processing equipment and image processing method
CN201510355561.8A Active CN105049660B (en) 2007-08-10 2008-08-11 Image processing equipment and its control method

Country Status (2)

Country Link
JP (2) JP5014241B2 (en)
CN (4) CN101365064B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011109469A (en) 2009-11-18 2011-06-02 Canon Inc Content receiving apparatus, and method of controlling the same
JP5529568B2 (en) * 2010-02-05 2014-06-25 キヤノン株式会社 Image processing apparatus, imaging apparatus, control method, and program
JP5550989B2 (en) * 2010-05-25 2014-07-16 オリンパスイメージング株式会社 Imaging apparatus, control method thereof, and program
JP6075819B2 (en) 2011-12-19 2017-02-08 キヤノン株式会社 Image processing apparatus, control method therefor, and storage medium
KR102360424B1 (en) * 2014-12-24 2022-02-09 삼성전자주식회사 Method of detecting face, method of processing image, face detection device and electronic system including the same
CN108491535B (en) * 2018-03-29 2023-04-07 北京小米移动软件有限公司 Information classified storage method and device
JP7213657B2 (en) * 2018-11-05 2023-01-27 キヤノン株式会社 IMAGING DEVICE, CONTROL METHOD AND PROGRAM THEREOF

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1780357A (en) * 2004-11-22 2006-05-31 三星电子株式会社 Apparatus and method for trimming picture in digital camera
CN1801890A (en) * 2005-01-05 2006-07-12 株式会社东芝 Electronic camera apparatus and operation guide

Family Cites Families (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10232769A (en) * 1997-02-19 1998-09-02 Hitachi Ltd Class addition supporting method
US7453498B2 (en) * 1998-03-26 2008-11-18 Eastman Kodak Company Electronic image capture device and image file format providing raw and processed image data
JP2002215643A (en) * 2001-01-15 2002-08-02 Minolta Co Ltd Image classification program, computer readable recording medium recording image classification program, and method and device for classifying image
JP3984029B2 (en) * 2001-11-12 2007-09-26 オリンパス株式会社 Image processing apparatus and program
US7289132B1 (en) * 2003-12-19 2007-10-30 Apple Inc. Method and apparatus for image acquisition, organization, manipulation, and publication
JP3826043B2 (en) * 2002-01-31 2006-09-27 キヤノン株式会社 Information processing apparatus and method
JP2004172655A (en) * 2002-11-15 2004-06-17 Fuji Photo Film Co Ltd Image processing apparatus and electronic camera
JP4366083B2 (en) * 2003-01-21 2009-11-18 キヤノン株式会社 Information processing apparatus, information processing method, and program
JP4312534B2 (en) * 2003-08-01 2009-08-12 富士フイルム株式会社 Signal processing device
CN100393097C (en) * 2003-11-27 2008-06-04 富士胶片株式会社 Apparatus, method, and program for editing images
JP2005223758A (en) * 2004-02-06 2005-08-18 Canon Inc Image processing apparatus, control method thereof, computer program, and recording medium
JP2005250716A (en) * 2004-03-03 2005-09-15 Canon Inc Image processing system
US8659619B2 (en) * 2004-03-26 2014-02-25 Intellectual Ventures Fund 83 Llc Display device and method for determining an area of importance in an original image
US20050228825A1 (en) * 2004-04-06 2005-10-13 Tsun-Yi Yang Method for managing knowledge from the toolbar of a browser
JP2006025238A (en) * 2004-07-08 2006-01-26 Fuji Photo Film Co Ltd Imaging device
EP1772752A4 (en) * 2004-07-30 2009-07-08 Panasonic Elec Works Co Ltd Individual detector and accompaniment detection device
JP2006060652A (en) * 2004-08-23 2006-03-02 Fuji Photo Film Co Ltd Digital still camera
JP2006101156A (en) * 2004-09-29 2006-04-13 Casio Comput Co Ltd Information processing device and program
JP4591038B2 (en) * 2004-10-28 2010-12-01 カシオ計算機株式会社 Electronic camera, image classification device, and program
JP2006279252A (en) * 2005-03-28 2006-10-12 Fuji Photo Film Co Ltd Image trimming apparatus, method and program
JP4244972B2 (en) * 2005-08-02 2009-03-25 ソニー株式会社 Information processing apparatus, information processing method, and computer program
JP4841553B2 (en) * 2005-08-17 2011-12-21 パナソニック株式会社 Video scene classification apparatus, video scene classification method, program, recording medium, integrated circuit, and server-client system
JP2007121654A (en) * 2005-10-27 2007-05-17 Eastman Kodak Co Photographing device
JP4232774B2 (en) * 2005-11-02 2009-03-04 ソニー株式会社 Information processing apparatus and method, and program
CN102231801B (en) * 2005-11-25 2013-07-10 株式会社尼康 Electronic camera and image processing device
JP2007213231A (en) * 2006-02-08 2007-08-23 Fujifilm Corp Image processor
JP4043499B2 (en) * 2006-09-06 2008-02-06 三菱電機株式会社 Image correction apparatus and image correction method

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1780357A (en) * 2004-11-22 2006-05-31 三星电子株式会社 Apparatus and method for trimming picture in digital camera
CN1801890A (en) * 2005-01-05 2006-07-12 株式会社东芝 Electronic camera apparatus and operation guide

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
JP特开2006-128966A 2006.05.18

Also Published As

Publication number Publication date
CN105007391A (en) 2015-10-28
CN105049660B (en) 2018-05-29
CN102891965B (en) 2016-02-03
CN101365064A (en) 2009-02-11
CN105049660A (en) 2015-11-11
JP5490180B2 (en) 2014-05-14
JP5014241B2 (en) 2012-08-29
JP2012178879A (en) 2012-09-13
CN105007391B (en) 2018-01-30
CN102891965A (en) 2013-01-23
JP2009065635A (en) 2009-03-26

Similar Documents

Publication Publication Date Title
CN101465962B (en) Display control apparatus, display control method
JP5043390B2 (en) Image playback device and program
CN101365064B (en) Image pickup apparatus and image pickup method
JP4818033B2 (en) Image playback apparatus, control method thereof, and program
CN101595727B (en) Image processing apparatus, control method of the image processing apparatus, and image processing system
US9609203B2 (en) Image pickup apparatus and image pickup method
CN101325678B (en) Image recording apparatus and image recording method
CN101924876B (en) Imaging apparatus and control method thereof, and image processing apparatus and control method thereof
JP4810376B2 (en) Image reproducing apparatus and control method thereof
JP4850645B2 (en) Image reproducing apparatus and image reproducing method
JP2008205846A (en) Image processor, image processing method, and computer program
JP5460001B2 (en) Image search apparatus, image search apparatus control method, program, and recording medium
JP2008053971A (en) Data recording device and control method thereof
JP5164353B2 (en) Image reproducing apparatus and control method thereof
JP2008072514A (en) Image reproduction device and control method
JP2008072497A (en) Image processing apparatus
JP2008072498A (en) Image reproducing device, control method therefor, program thereof
JP2008071167A (en) Image processor
JP2009225315A (en) Imaging apparatus
JP2012165080A (en) Image reproducing apparatus, image reproducing method therefor, and program
JP2008053970A (en) Data recording device, and control method thereof
JP2010068148A (en) Data display, control method for the same, and computer program

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant