WO2008075745A1 - Development server, development client, development system, and development method - Google Patents
Development server, development client, development system, and development method
- Publication number
- WO2008075745A1 (PCT/JP2007/074571; JP2007074571W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- development
- correction
- image data
- data
- image
- Prior art date
Classifications
- All classifications fall under H—ELECTRICITY; H04—ELECTRIC COMMUNICATION TECHNIQUE; H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION. The leaf classifications are:
- H04N1/00132—Connection or combination of a still picture apparatus with another apparatus in a digital photofinishing system, i.e. a system where digital photographic images undergo typical photofinishing processing, e.g. printing ordering
- H04N1/00137—Transmission
- H04N1/00143—Ordering
- H04N1/00161—Viewing or previewing
- H04N1/00167—Processing or editing
- H04N1/00175—Digital image input from a still image storage medium
- H04N1/00177—Digital image input from a user terminal, e.g. personal computer
- H04N1/00183—Photography assistance, e.g. displaying suggestions to the user
- H04N1/00193—Image output to a portable storage medium, e.g. a read-writable compact disk
- H04N1/00244—Connection or combination of a still picture apparatus with a digital computer system, e.g. an internet server
- H04N1/00448—Simultaneous viewing of a plurality of images, e.g. thumbnails, arranged in a one-dimensional array, horizontally
- H04N1/00453—Simultaneous viewing of a plurality of images, e.g. thumbnails, arranged in a two-dimensional array
- H04N1/00461—Marking or otherwise tagging one or more displayed images, e.g. for selective reproduction
- H04N1/32128—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title, attached to the image data, e.g. file header or transmitted message header
- H04N1/6013—Colour correction or control with simulation of several colour-corrected versions of the same image simultaneously on the same picture reproducer
- H04N7/173—Analogue secrecy/subscription systems with two-way working, e.g. subscriber sending a programme selection signal
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams
- H04N21/4223—Cameras (input peripherals connected to client devices)
- H04N21/4334—Recording operations (content storage)
- H04N21/6582—Data stored in the client, e.g. viewing habits, hardware capabilities, transmitted to the server
- H04N21/8153—Monomedia components involving graphical data comprising still images, e.g. texture, background image
- H04N2201/001—Sharing resources, e.g. processing power or memory, with a connected apparatus
- H04N2201/3205—Additional information: identification information, e.g. name or ID code
- H04N2201/3214—Additional information: date of a job
- H04N2201/3215—Additional information: time or duration of a job
- H04N2201/3252—Image capture parameters, e.g. resolution, illumination conditions, orientation of the image capture device
- H04N2201/3256—Colour-related metadata, e.g. colour, ICC profiles
Definitions
- Development server, development client, development system, and development method
- RAW data: image information captured by CCD, CMOS (complementary metal oxide semiconductor), LiveMOS (registered trademark), or other image sensors and recorded in nearly unprocessed form (hereinafter referred to as "RAW data").
- CCD-RAW data: unprocessed image information captured by a CCD.
- Compared with JPEG data produced after processing and compression, RAW data has the advantage that no information is lost and image quality does not deteriorate, although the data size is large. The RAW shooting mode therefore makes it possible to retain a shot image at the camera's highest image quality by recording it as RAW data.
- RAW data is subjected to image quality adjustment by applying various correction parameters after shooting.
- Each manufacturer uses its own proprietary data format for RAW data. For this reason, even users who own a camera with a RAW shooting mode cannot view RAW data directly in a general-purpose viewer on a personal computer, print it on a general-purpose printer, or order prints from a standard photo printing service.
- RAW development software: software for converting RAW data into general-purpose still image data for viewing or printing, such as JPEG data, or into a visible image itself. This conversion is hereinafter referred to as "RAW development", and such software as "RAW development software".
- Patent Document 1 discloses a technique in which RAW data captured on the user side is transmitted to a high-performance server prepared by the service provider, and development processing is centralized on that server.
- FIG. 1 is an explanatory diagram showing the configuration of a conventional development system and the flow of RAW development.
- the development system 10 includes a development server 20 that is an apparatus on the development service providing side, and a development client 30 that is an apparatus on the user side.
- the development client 30 can access a recording medium such as a memory card used for recording RAW data with a digital still camera, and can communicate with the development server 20.
- the development client 30 reads RAW data to be developed from a recording medium such as a memory card (S41), and transmits the read RAW data to the development server 20 together with information on the shooting conditions (S42).
- the development server 20 receives the RAW data (S43), develops the RAW data based on the shooting conditions, and generates image data such as JPEG data (S44).
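The conventional flow of FIG. 1 (S41-S44) can be sketched as a simple request/response exchange. The function names and the dict-based message format below are illustrative assumptions for explanation, not the patent's actual interface:

```python
# Illustrative sketch of the conventional development flow (S41-S44).
# All names and data structures here are assumptions, not the patent's API.

def client_read_and_send(recording_medium):
    """S41/S42: read RAW data from the recording medium and build the request."""
    raw_data = recording_medium["raw_data"]              # S41: read from memory card
    shooting_conditions = recording_medium["conditions"]
    return {"raw": raw_data, "conditions": shooting_conditions}  # S42: transmit

def server_develop(request):
    """S43/S44: receive the RAW data and develop it into JPEG-like image data."""
    raw = request["raw"]                                 # S43: receive
    conditions = request["conditions"]
    # S44: "development" here stands in for demosaicing, tone mapping,
    # and compression based on the shooting conditions.
    return {"format": "JPEG", "pixels": raw, "developed_with": conditions}

medium = {"raw_data": [0.1, 0.5, 0.9], "conditions": {"iso": 200}}
image = server_develop(client_read_and_send(medium))
```

The point of the arrangement is that the computationally heavy step (S44) runs only on the server side.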
- By having the RAW development processing performed on the server, the user can carry out RAW development without preparing a high-performance information processing device on the user side.
- Patent Document 2 describes a technique for automatically adjusting image quality toward hues that human eyes judge to be preferable (hereinafter referred to as "memory colors"). With this technique, the user can perform RAW development with generally preferred image quality.
- A technique for arbitrarily designating image quality at development time from the development client side is described in Patent Document 3, for example. In this technique, intuitive terms and keywords that general users can handle easily are associated in advance with the adjustment contents of correction parameters.
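The keyword-to-parameter association described for Patent Document 3 amounts to a lookup table prepared in advance. The specific keywords and parameter values below are invented for illustration and do not come from the patent:

```python
# Hypothetical mapping from user-friendly keywords to correction-parameter
# adjustments, in the spirit of Patent Document 3. All values are invented.
KEYWORD_PRESETS = {
    "vivid": {"saturation": +20, "contrast": +10},
    "soft":  {"saturation": -10, "sharpness": -15},
    "warm":  {"white_balance_shift": +300},  # shift in kelvin, assumed unit
}

def parameters_for(keyword):
    """Resolve an intuitive keyword into concrete correction parameters."""
    return KEYWORD_PRESETS[keyword]
```

The user only ever sees the keyword; the development software applies the associated parameter adjustments.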
- Patent Document 1: Japanese Unexamined Patent Application Publication No. 2004-165797
- Patent Document 2: Japanese Unexamined Patent Application Publication No. 2006-133875
- Patent Document 3: Japanese Unexamined Patent Application Publication No. 2003-234916
- However, the image quality the user desires is not limited to such predetermined adjustments, and it is not realistic for the user to adjust image quality through repeated trial-and-error operations. A development server and development system are therefore desired that allow the user to easily obtain images of a quality in accordance with the user's wishes and preferences when RAW data must be developed by a development server.
- An object of the present invention is to provide a development server, development client, development system, and development method capable of easily obtaining an image of a quality in accordance with the user's wishes and preferences in the development of RAW data that is difficult for an individual to develop.
- The development server of the present invention includes: an undeveloped image data receiving unit that receives undeveloped image data; a provisionally developed image generation unit that generates a plurality of provisionally developed images by simply developing the undeveloped image data with correction parameters corresponding to different image qualities; a provisionally developed image transmission unit that transmits the plurality of provisionally developed images to a development client in combination with the correction parameters applied to each provisionally developed image, or with information indicating those parameters; a development instruction receiving unit that receives from the development client a development instruction for the undeveloped image data containing the correction parameters applied to one of the provisionally developed images, or information indicating them; and a developing unit that develops the undeveloped image data by applying the correction parameters indicated by the development instruction and generates developed image data.
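The server-side units described above can be sketched as a single class. The method names, the stand-in for "simple development", and the message shapes are assumptions for illustration, not the patent's implementation:

```python
# Sketch of the development server's units. All names are illustrative.
class DevelopmentServer:
    def receive_undeveloped_image(self, raw_data):
        """Undeveloped image data receiving unit."""
        self.raw_data = raw_data

    def generate_provisional_images(self, parameter_sets):
        """Provisionally developed image generation unit: 'simple' (e.g.
        reduced-size) development under each candidate parameter set. The
        transmission unit would send these, each paired with its parameters."""
        self.provisional = [
            {"thumbnail": f"simple-dev({p})", "params": p} for p in parameter_sets
        ]
        return self.provisional

    def receive_development_instruction(self, params):
        """Development instruction receiving unit + developing unit: full
        development of the stored RAW data with the user-selected parameters."""
        return {"developed": True, "params": params, "source": self.raw_data}

# Hypothetical usage: two candidate qualities, user's choice developed fully.
server = DevelopmentServer()
server.receive_undeveloped_image("raw-bytes")
candidates = server.generate_provisional_images([{"contrast": 1}, {"contrast": 2}])
final = server.receive_development_instruction(candidates[1]["params"])
```

The key design point is that the expensive full development runs once, only for the parameter set the user actually chose.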
- The development client of the present invention includes: an undeveloped image data transmission unit that transmits undeveloped image data to a development server that develops the undeveloped image data; a provisionally developed image receiving unit that receives from the development server a plurality of provisionally developed images, produced by simply developing the undeveloped image data with correction parameters corresponding to different image qualities, in combination with the correction parameters applied to each provisionally developed image, or information indicating them; a user interface unit that displays the provisionally developed images and accepts a user selection among them; and a development instruction transmission unit that transmits to the development server, as a development instruction for the undeveloped image data, the correction parameters applied to the provisionally developed image selected in the user interface unit, or information indicating them.
- the development system of the present invention employs a configuration including the development server and the development client.
- The development method of the present invention is a development method in a development server that develops undeveloped image data, and includes the steps of: receiving the undeveloped image data; generating a plurality of provisionally developed images by simply developing the received undeveloped image data with correction parameters corresponding to different image qualities; transmitting the plurality of provisionally developed images to a development client in combination with the correction parameters applied to the respective provisionally developed images or information indicating them; receiving, from the development client, a development instruction for the undeveloped image data together with the correction parameter applied to any of the provisionally developed images or information indicating it; and developing the undeveloped image data by applying the correction parameter indicated by the received development instruction.
- FIG. 1 is an explanatory diagram showing the configuration of a conventional development system and the flow of RAW development.
- FIG. 2 is a system configuration diagram showing the configuration of a development system according to Embodiment 1 of the present invention.
- FIG. 4 is a flowchart showing a flow of RAW data separation processing in the first embodiment.
- FIG. 6 is an explanatory diagram showing an example of provisionally developed image data in the first embodiment
- FIG. 7 is an explanatory diagram showing an example of a temporarily developed image selection display screen in the first embodiment.
- FIG. 8 is an explanatory diagram showing an example of a temporary screen switching display screen in the first embodiment
- FIG. 9 is an explanatory diagram showing an example of a browse display screen in the first embodiment
- FIG. 10 is a system configuration diagram showing the configuration of a development system according to Embodiment 2 of the present invention.
- FIG. 12 is an explanatory diagram showing an example of the configuration of a correction-necessary rule table according to the second embodiment.
- FIG. 13 is a flowchart showing a flow of RAW data separation processing in the second embodiment.
- FIG. 14 is a flowchart showing a flow of correction rule matching processing requiring correction in the second embodiment.
- FIG. 16 is an explanatory diagram showing an example of a configuration of a similarity determination rule table in the third embodiment
- FIG. 17 is a sequence diagram showing the flow of RAW development in the third embodiment.
- FIG. 18 is a flowchart showing a flow of similar image grouping processing in the third embodiment.
- FIG. 19 is a system configuration diagram showing the configuration of a development system according to Embodiment 4 of the present invention.
- FIG. 21 is a sequence diagram showing the flow of RAW development in Embodiment 4.
- FIG. 22 is a flowchart showing a flow of user preference information extraction processing in the fourth embodiment.
- FIG. 23 is a system configuration diagram showing the configuration of a development system according to Embodiment 5 of the present invention.
- FIG. 24 is a system configuration diagram showing the configuration of a development system according to Embodiment 6 of the present invention.
- FIG. 2 is a system configuration diagram showing the configuration of the developing system according to Embodiment 1 of the present invention.
- The development system 100 includes a development client 110, which is a device that inputs RAW data, and a development server 120, which is a device that provides a development service for RAW data.
- the development client 110 is a television installed at home, and is connected to the development server 120 via a network (not shown) so as to be communicable.
- A provisionally developed image is an image in a data format that the development client 110 can display at high speed; the images are displayed one after another by switching in response to operations by the user 130, and are used for selecting the image quality preferred by the user.
- A provisionally developed image is an image developed by a simpler development process than the main development process, and has an image quality close to that of the image data (here, JPEG data) obtained by the main development process.
- Simplification of development processing is realized by, for example, reduction of image resolution and color gradation.
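- The two simplifications named above (lower resolution, fewer gradations) can be illustrated with a minimal sketch. This is an illustration only, not the patent's actual implementation; the function name and the row-major 8-bit pixel list are assumptions.

```python
def simple_develop(pixels, width, height, scale=2, levels=64):
    """Crude 'simple development': downsample by keeping every
    `scale`-th pixel (resolution reduction) and quantize 8-bit
    values to `levels` steps (gradation reduction).
    `pixels` is a row-major list of 8-bit luminance values."""
    step = 256 // levels  # quantization step, e.g. 4 for 64 gradations
    out = []
    for y in range(0, height, scale):
        for x in range(0, width, scale):
            v = pixels[y * width + x]
            out.append((v // step) * step)  # reduce color gradation
    new_w = (width + scale - 1) // scale
    new_h = (height + scale - 1) // scale
    return out, new_w, new_h
```

A 4×4 input with scale 2 thus yields a 2×2 provisional image whose values are snapped to the coarser gradation grid.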
- the correction parameter is a development parameter for setting development conditions for developing a RAW image (main development and temporary development).
- A provisionally developed image thus approximates, in image quality, the main developed image to which the same correction parameters are applied.
- correction parameters include brightness correction values, tone curve shape specification values, color temperature specification values, sharpness specification values, and noise reduction specification values.
- the brightness correction value is a correction parameter for increasing or decreasing the brightness of the entire photograph.
- the tone curve shape specification value is a correction parameter for finely adjusting the gradation.
- The color temperature specified value is a correction parameter for adjusting the white balance.
- the sharpness specification value is a correction parameter for adjusting the degree of contour enhancement.
- noise reduction specification value is a correction parameter for adjusting the noise removal amount.
- correction parameters basically define correction contents that act relatively on the image recorded in the original RAW data.
- the correction parameters have individual optimum values for each RAW data.
- the basic parameter is a correction parameter in a state where adjustment based on the selection of the temporarily developed image by the user is not performed.
- The basic parameters are default correction parameters that leave uncorrected the shooting conditions determined at the time of shooting with the camera and recorded in the shooting information. These shooting conditions include, for example, exposure, white balance (color temperature), and the amount of noise removal with respect to shooting sensitivity.
- the development system 100 adjusts the correction parameter based on the user's selection for the temporarily developed image. As a result, the development system 100 develops the RAW data with correction parameters corresponding to the image quality according to the user's desire and preference without allowing the user to directly adjust the correction parameters.
- The development client 110 transmits RAW data (photograph data) obtained by shooting with a camera (not shown) to the development server 120 and receives provisionally developed images from the development server 120. The development client 110 then presents the provisionally developed images to the user 130, specifies the image quality corresponding to the provisionally developed image selected by the user 130 by means of a correction parameter described later, and instructs the development server 120 to develop the RAW data accordingly.
- the user interface unit 111 includes a liquid crystal display and a receiving unit (both not shown).
- the liquid crystal display displays an image.
- the receiving unit receives a signal from the remote controller 150 as an input device. Using the liquid crystal display and the receiving unit, the user interface unit 111 presents various types of information to the user 130 and accepts information input from the user 130.
- The photo data input / output unit 112 has a memory card slot (not shown) for mounting a memory card as a recording medium.
- the photo data input / output unit 112 inputs / outputs information to / from the memory card 140 installed in the memory card slot via an electrical contact.
- the memory card 140 is inserted into the memory card slot as a recording medium for camera photo data (RAW data).
- the photo data input / output unit 112 acquires the RAW data from the memory card 140 and outputs it to the RAW transmission unit 113.
- RAW data is, for example, undeveloped image information obtained by A / D-converting the amount of light captured by an image sensor such as a CCD or CMOS using a Bayer array, arranging the values in matrix form, and compressing them with Huffman coding. The amount of information per pixel is, for example, 12 bits or 16 bits, and each pixel corresponds to one color of RGB.
- RAW data identification information and shooting information are added to the RAW data by the device that performed the shooting.
- the shooting information is information indicating shooting conditions such as shooting date / time, number of pixels, exposure, white balance (color temperature), and noise removal amount with respect to shooting sensitivity.
- RAW data is packed into NEF format data, for example, with imaging information added.
- In the following, the undeveloped image information alone, and the data including the undeveloped image information together with its accompanying shooting information, are both referred to as RAW data as appropriate.
- the temporarily developed image receiving unit 114 receives the temporarily developed image data sent from the developing server 120 and outputs it to the correction parameter determining unit 115.
- the correction parameter determination unit 115 presents correction parameter options to the user 130 in cooperation with the user interface unit 111. Then, the correction parameter determination unit 115 determines a correction parameter to be used for developing RAW data in accordance with the operation of the remote controller 150 by the user 130.
- The correction parameter determination unit 115 displays the provisionally developed images included in the provisionally developed image data on the liquid crystal display as correction parameter options. Then, when any provisionally developed image is selected from the displayed images by a user operation, the correction parameter determination unit 115 determines the correction parameters added to the selected provisionally developed image as the correction parameters to be used for developing the RAW data.
- The correction parameters are, for example, “brightness correction value: plus 0.5 EV equivalent”, “tone curve shape specification value: luminance input/output points (0, 0), (128, 135), (255, 255)”, “color temperature specified value: 6500 K”, “sharpness specified value: plus 1 from standard”, and “noise reduction specified value: minus 3 from standard”.
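- Purely as an illustration of how such a parameter set might be held and applied, the sketch below encodes the example values above and applies two of them. The dictionary keys and helper functions are hypothetical, not part of the patent; the EV-to-gain relation (a shift of ev EV multiplies linear light by 2**ev) and piecewise-linear tone-curve lookup are standard conventions.

```python
# Hypothetical representation of the example correction parameters above.
params = {
    "brightness_ev": 0.5,                            # plus 0.5 EV equivalent
    "tone_curve": [(0, 0), (128, 135), (255, 255)],  # luminance control points
    "color_temperature_k": 6500,
    "sharpness": 1,                                  # plus 1 from standard
    "noise_reduction": -3,                           # minus 3 from standard
}

def apply_tone_curve(value, curve):
    """Piecewise-linear lookup of an 8-bit luminance value on the curve."""
    for (x0, y0), (x1, y1) in zip(curve, curve[1:]):
        if x0 <= value <= x1:
            return y0 + (y1 - y0) * (value - x0) / (x1 - x0)
    return float(value)

def apply_brightness(linear_value, ev):
    """A brightness shift of `ev` EV multiplies linear light by 2**ev."""
    return linear_value * (2 ** ev)
```

For the example tone curve, a mid-gray input of 128 maps to 135, lifting the midtones slightly while leaving black and white fixed.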
- the correction parameter transmission unit 116 transmits the determined correction parameter, that is, the content of the correction parameter corresponding to the image quality desired by the user 130, to the development server 120.
- The JPEG receiving unit 117 receives JPEG data sent from the development server 120. Then, the JPEG receiving unit 117 writes the received JPEG data to the memory card 140 via the photo data input / output unit 112, or outputs it to the liquid crystal display via the user interface unit 111.
- From the RAW data sent from the development client 110 for which correction parameters need to be determined, the development server 120 generates a plurality of provisionally developed images by applying correction parameters corresponding to different image qualities. Then, the development server 120 combines each generated provisionally developed image with the correction parameter applied to it and transmits them to the development client 110. Further, the development server 120 develops the RAW data according to the correction parameter determined by the development client 110, that is, according to the development image quality desired by the user 130.
- The development server 120 includes an operator interface unit 121, a RAW reception unit 122, a correction target photo data extraction unit 123, a temporary development image generation unit 124, a temporary development image transmission unit 125, a correction parameter reception unit 126, a digital photo development unit 127, and a JPEG transmission unit 128.
- the operator interface unit 121 includes a liquid crystal display (not shown) as a display device, and a keyboard and mouse (not shown) as input devices.
- the RAW reception unit 122 receives the RAW data sent from the development client 110 and outputs it to the correction target photo data extraction unit 123.
- The correction target photo data extraction unit 123 sorts the RAW data into correction target data and non-correction target data. Then, the correction target photo data extraction unit 123 outputs both the correction target data and the non-correction target data to the digital photo development unit 127, and outputs only the correction target data to the temporary development image generation unit 124.
- The correction target data is RAW data that is the target of correction parameter adjustment, that is, RAW data to which correction parameters adjusted based on selections by the user 130 are applied.
- Data that is not subject to correction is RAW data that is not subject to correction parameter adjustment, that is, RAW data to which basic parameters are applied as they are.
- the temporarily developed image generation unit 124 generates a plurality of temporarily developed images corresponding to different image quality, that is, a plurality of temporarily developed images to which different correction parameters are applied, from the correction target data. Then, the temporarily developed image generation unit 124 outputs temporary developed image data obtained by adding correction parameters and identification information to the generated temporarily developed image to the temporarily developed image transmission unit 125.
- The correction parameter added to the provisionally developed image data is a correction parameter for obtaining JPEG data having an image quality corresponding to each provisionally developed image.
- The identification information added to the provisionally developed image data is the identification information of the RAW data on which the provisionally developed image is based, as described above.
- the temporarily developed image transmission unit 125 transmits the temporarily developed image data to the development client 110.
- the correction parameter receiving unit 126 receives, from the development client 110, a correction parameter added to the temporarily developed image selected by the user 130, that is, a correction parameter corresponding to the image quality selected by the user 130. Then, the correction parameter receiving unit 126 outputs the received correction parameters to the digital photo developing unit 127.
- The digital photo development unit 127 converts the non-correction target data into JPEG data without performing image quality correction. On the other hand, for the correction target data, the digital photo development unit 127 corrects the image quality in accordance with the correction parameter input from the correction parameter receiving unit 126 and then converts the data into JPEG data. The digital photo development unit 127 outputs the generated JPEG data to the JPEG transmission unit 128.
- the JPEG transmission unit 128 transmits JPEG data to the development client 110.
- Although not shown, the development server 120 and the development client 110 each have a CPU, a storage medium such as a ROM (read only memory) storing a control program, a working memory such as a RAM (random access memory), and a communication circuit.
- the functions of each unit described above are realized by the CPU executing the control program.
- FIG. 3 is a sequence diagram showing a flow of RAW development in the first embodiment.
- In FIG. 3, the process executed by the development client 110 is shown on the left side, and the process executed by the development server 120 is shown on the right side.
- In step S1000, the photo data input / output unit 112 of the development client 110 reads RAW data from the memory card 140 and outputs the read RAW data to the RAW transmission unit 113. As described above, shooting information is added to this RAW data.
- In step S1100, the RAW transmission unit 113 of the development client 110 transmits the RAW data to the development server 120.
- In step S1200, the RAW reception unit 122 of the development server 120 receives the RAW data transmitted from the development client 110 in step S1100, and outputs the received RAW data to the correction target photo data extraction unit 123.
- In step S1300, the correction target photo data extraction unit 123 of the development server 120 separates the RAW data into correction targets and non-correction targets. Specifically, the correction target photo data extraction unit 123 executes the RAW data separation processing.
- the RAW data sorting process is a process of sorting correction target data and non-correction target data according to the operation of the operator 160 in the operator interface unit 121. In the following description, it is assumed that a plurality of pieces of RAW data are transmitted as a set of data from the development client 110 to the development server 120 and development is performed on the set of data.
- FIG. 4 is a flowchart showing the flow of RAW data separation processing by the correction target photo data extraction unit 123.
- In step S3000, the correction target photo data extraction unit 123 inputs the batch of RAW data sent from the development client 110.
- FIG. 5 is an explanatory diagram showing an example of a correction required input screen.
- In the RAW image list display area 201, a list of the RAW data (thumbnail images) is displayed as provisionally developed images to which the basic parameters are applied (hereinafter referred to as “basic provisionally developed images”).
- Together with the basic provisionally developed images, the result of determining whether correction is required for each piece of RAW data is displayed.
- a basic provisionally developed image is an image that is visually equivalent (or approximate) to the result of applying basic parameters to certain RAW data.
- In the RAW image display area 202, the basic provisionally developed image of the RAW data that is the object of input by the operator (hereinafter referred to as a “RAW image”) is displayed.
- In the shooting information display area 203, a histogram showing the lightness distribution of the image and the shooting conditions set on the camera at the time of shooting are displayed; that is, information accompanying the image displayed in the RAW image display area 202 is displayed in an auxiliary manner.
- By checking or unchecking the correction required check box 204, the operator can specify whether or not the RAW data that is the basis of the RAW image displayed in the RAW image display area 202 should be treated as correction target data.
- In the operation area 205, various display elements and operation buttons for assisting the operation of the operator 160 are arranged.
- The thumbnail images, the basic provisionally developed images, and the histogram are created, for example, by the correction target photo data extraction unit 123 analyzing the RAW data.
- the correction target photo data extraction unit 123 may obtain a thumbnail image from the RAW data.
- The operator 160 shown in FIG. 2 can individually determine, by referring to the display image in the RAW image display area 202 and the contents of the shooting information display area 203, whether or not each piece of RAW data displayed in the RAW image list display area 201 should be a correction target. Further, the operator 160 can input the judgment result by checking or unchecking the correction required check box 204, using the keyboard or mouse of the operator interface unit 121 shown in FIG. 2.
- For RAW data in which exposure, white balance (color temperature), the amount of noise removal with respect to shooting sensitivity, and so on were all properly set by the photographer, a development result without a sense of incongruity can be obtained even when the data is automatically developed using only the basic parameters. If the correction target photo data extraction unit 123 made even RAW data recorded under such shooting conditions a target of image quality correction, the number of provisionally developed images presented to the user 130 would increase, and the work load on the user 130 would increase. Therefore, RAW data that is unproblematic even with only the basic parameters should not be made a correction target.
- On the other hand, RAW data from failed shots is easy to identify from the display image in the RAW image display area 202 and the contents of the shooting information display area 203. Therefore, by using the correction required input screen 200, the operator 160 can accurately and quickly determine whether or not the image quality of each piece of RAW data should be corrected.
- In step S3200 of FIG. 4, the correction target photo data extraction unit 123 inputs the selection results, obtained on the correction required input screen 200 shown in FIG. 5, as to whether or not correction is required for each piece of RAW data.
- In step S3300, the correction target photo data extraction unit 123 determines whether or not correction is required for each piece of RAW data.
- The correction target photo data extraction unit 123 proceeds to step S3400 for RAW data for which correction required was selected (S3300: YES), and to step S3500 for RAW data for which it was not (S3300: NO).
- In step S3400, the correction target photo data extraction unit 123 classifies the RAW data for which correction required was selected as correction targets, and outputs the RAW data to the provisionally developed image generation unit 124 and the digital photo development unit 127.
- In step S3500, the correction target photo data extraction unit 123 classifies the RAW data for which correction required was not selected as non-correction targets, and outputs it to the digital photo development unit 127.
- When the batch of data sent from the development client 110 has been classified into correction targets and non-correction targets, the correction target photo data extraction unit 123 ends the RAW data classification process. Then, the development server 120 proceeds to step S1400 in FIG. 3.
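- Steps S3300 through S3500 amount to a simple partition of the batch. The sketch below illustrates this under assumed data shapes (each RAW item as a dict carrying an "id", and the operator's check-box input as a boolean map); these shapes are not specified in the patent.

```python
def separate_raw_data(raw_batch, needs_correction):
    """Sort a batch of RAW data into correction targets and
    non-correction targets (cf. steps S3300-S3500).
    `needs_correction` maps each item's id to the operator's
    check-box input from the correction required input screen."""
    correction_target, non_correction_target = [], []
    for raw in raw_batch:
        if needs_correction[raw["id"]]:        # S3300: YES
            correction_target.append(raw)      # S3400: correction target
        else:
            non_correction_target.append(raw)  # S3500: non-correction target
    return correction_target, non_correction_target
```

Both lists would then go to the digital photo development unit, while only the correction targets go on to provisional image generation.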
- In step S1400 of FIG. 3, the provisionally developed image generation unit 124 of the development server 120 generates provisionally developed images by performing the simple development processing described above on the correction target data. Then, the temporarily developed image generation unit 124 outputs the provisionally developed image data, obtained by adding the correction parameters and identification information to the generated provisionally developed images, to the temporarily developed image transmission unit 125.
- From one piece of correction target data, the provisionally developed image generation unit 124 generates a plurality of provisionally developed images by applying correction parameters corresponding to different image qualities, in accordance with the number and adjustment width of the correction parameters to be presented to the user 130 as options.
- FIG. 6 is an explanatory diagram showing an example of provisionally developed image data generated from one piece of correction target data.
- When the provisionally developed image generation unit 124 receives correction target data 210 corresponding to one RAW image, it generates provisionally developed image data 211 comprising a plurality of provisionally developed images with different correction parameters and their incidental information. The incidental information includes, for each provisionally developed image, the correction parameter corresponding to that image.
- each of a plurality of frames constituting the temporarily developed image data 211 corresponds to one temporarily developed image.
- each correction parameter is shown as an image quality adjustment parameter composed of an exposure adjustment value (upper) and a color temperature (lower).
- In FIG. 6, the adjustable range of the exposure correction value is set to minus 1 EV (exposure value) to plus 1 EV, and the exposure correction value adjustment interval is set to 1/3 EV.
- the figure also shows the case where the adjustable range of color temperature is 4500K (Kelvin) to 7500K and the color temperature adjustment interval is 500K.
- In step S1500 of FIG. 3, the temporarily developed image transmission unit 125 of the development server 120 transmits the provisionally developed image data input from the temporarily developed image generation unit 124 to the development client 110.
- Here, it is assumed that the provisionally developed image data 211 shown in FIG. 6 is transmitted as a whole. If the number of provisionally developed images is large, the provisionally developed image data 211 may be divided and transmitted.
- In step S1600, the temporarily developed image receiving unit 114 of the development client 110 receives the provisionally developed image data 211 sent from the development server 120, and outputs the received provisionally developed image data 211 to the correction parameter determination unit 115.
- In step S1700, the correction parameter determination unit 115 of the development client 110 displays the provisionally developed image data 211 and accepts an instruction regarding correction content from the user 130 via the user interface unit 111. Specifically, the correction parameter determination unit 115 accepts an instruction from the user 130 as to whether to present the provisionally developed images as a selection display or a switching display. Then, the correction parameter determination unit 115 presents the provisionally developed images based on the provisionally developed image data 211 to the user 130 in the manner of presentation instructed by the user 130.
- the provisionally developed image to be presented is an option for JPEG data image quality and an option for correction parameters.
- the temporarily developed image selection display screen 220 has a temporarily developed image 221, a frame 222, a selection key 223, and a decision key 224.
- the temporarily developed image 221 is a temporarily developed image based on the temporarily developed image data 211.
- the provisionally developed image 221 is arranged in a matrix in the order corresponding to the exposure correction value and the color temperature.
- the determination key 224 is used to determine the temporarily developed image 221 selected using the frame 222 as the temporarily developed image 221 designated by the user 130.
- the operation of the selection key 223 and the determination key 224 is performed in response to, for example, the operation of the direction key 151 provided on the remote controller 150 and the “OK” key 152 for determination input.
- When the number of provisionally developed images 221 is large, the number of provisionally developed images 221 displayed at a time on the liquid crystal panel may be reduced by scrolling or switching pages.
- The correction parameter determination unit 115 determines that the image quality adjustment parameter “exposure correction value: plus 2/3 EV, color temperature: 5500 K” has been selected by the user 130.
- When the switching display is instructed, the correction parameter determination unit 115 displays on the liquid crystal panel a temporary screen switching display screen that displays the provisionally developed images by switching them one at a time on one screen.
- FIG. 8 is an explanatory diagram showing an example of the temporary screen switching display screen, and corresponds to FIG. 7. The same parts as those in FIG. 7 are denoted by the same reference numerals, and description thereof is omitted.
- the temporary screen switching display screen 230 has a temporary developed image display area 231.
- the temporarily developed image display area 231 is an area for displaying a currently selected temporarily developed image in a large size.
- information indicating how the temporary development image displayed in the temporary development image display area 231 changes when the selection key 223 is operated is displayed around the temporary development image display area 231.
- “Brightness: Brighter” is displayed on the right side of the temporarily developed image display area 231. This indicates that when the right direction is chosen with the selection key 223, the display switches to another provisionally developed image whose brightness is one level brighter.
- In the example of FIG. 8, a provisionally developed image corresponding to the image quality adjustment parameter “exposure correction value: minus 1/3 EV, color temperature: 6500 K” is displayed.
- In this case, the correction parameter determination unit 115 determines that the image quality adjustment parameter “exposure correction value: minus 1/3 EV, color temperature: 6500 K” is currently selected, and stores the image quality adjustment parameter corresponding to the currently selected provisionally developed image. Further, when, for example, the right direction is chosen with the selection key 223, the correction parameter determination unit 115 switches the display to the provisionally developed image corresponding to the image quality adjustment parameter “exposure correction value: plus/minus 0 EV, color temperature: 6500 K”.
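- The switching behavior can be sketched as index movement over the two parameter axes. Mapping right/left to exposure (brightness) and up/down to color temperature is an illustrative assumption here, consistent with the “Brightness: Brighter” label on the right side; the patent does not fix this key assignment.

```python
def switch_selection(n_exposures, n_temps, ev_index, temp_index, key):
    """Move the current selection in response to a direction key.
    'right'/'left' step the exposure axis (brightness); 'up'/'down'
    step the color temperature axis (assumed mapping). Indices are
    clamped so the selection stays inside the parameter grid."""
    d_ev = {"right": 1, "left": -1}.get(key, 0)
    d_t = {"up": 1, "down": -1}.get(key, 0)
    ev_index = min(max(ev_index + d_ev, 0), n_exposures - 1)
    temp_index = min(max(temp_index + d_t, 0), n_temps - 1)
    return ev_index, temp_index
```

Each key press thus selects the neighboring provisionally developed image, whose stored image quality adjustment parameter becomes the current candidate.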
- In step S1800 of FIG. 3, the correction parameter determination unit 115 of the development client 110 outputs the correction parameter and identification information corresponding to the selected provisionally developed image to the correction parameter transmission unit 116.
- In step S1900, the correction parameter transmission unit 116 of the development client 110 transmits the correction parameter and identification information input from the correction parameter determination unit 115 to the development server 120.
- In step S2000, the correction parameter receiving unit 126 of the development server 120 receives the correction parameter and identification information sent from the development client 110, and passes the received correction parameter and identification information to the digital photo development unit 127.
- The JPEG transmission unit 128 of the development server 120 transmits the JPEG data generated by the digital photo development unit 127 to the development client 110.
- the JPEG receiving unit 117 of the developing client 110 receives this JPEG data and writes it to the memory card 140 via the photo data input / output unit 112, for example.
- FIG. 9 is an explanatory diagram showing an example of a browse display screen, and corresponds to FIG. 8. The same parts as those in FIG. 8 are denoted by the same reference numerals. FIG. 9 also shows an example of the provisionally developed image data generated in the development server 120.
- the browse display screen 240 has a browser 242 in which a temporarily developed image display area 241 for displaying a temporarily developed image is arranged.
- The browse display screen 240 is displayed on the development client 110.
- The provisionally developed image data 243 on which the browse display screen 240 is based is, for example, HTML (hypertext markup language) data and JPEG data linked from or embedded in the HTML data.
- The development server 120 generates provisionally developed image data 243 that includes a plurality of data portions 244, each corresponding to a specific provisionally developed image. Then, the development server 120 first transmits only one data portion 244 to the development client 110.
- The correction parameter determination unit 115 displays only the provisionally developed image based on the data portion 244 sent from the development server 120 in the temporarily developed image display area 241. Then, the correction parameter determination unit 115 requests the data of another corresponding provisionally developed image from the development server 120 in accordance with operations of the direction key 151 of the remote controller 150, and when the corresponding data portion is returned, switches the display to the provisionally developed image based on that data portion. Then, when the “OK” key 152 of the remote controller 150 is pressed while any provisionally developed image is displayed in the temporarily developed image display area 241, the correction parameter determination unit 115 determines that the displayed provisionally developed image has been selected.
- in the development system 100, the development server 120 sorts the RAW data to narrow down the number of images to be corrected.
- the development server 120 can suppress the number of provisionally developed images transmitted to the development client 110, and can suppress the load on the network.
- the number of provisionally developed images checked by the user 130 can be minimized, and the burden on the user 130 can be reduced.
- simpler presentation means can be adopted for presenting correction parameter options to the user 130, realizing a user interface in which image quality adjustment can be performed with simple operations. That is, the user can select a correction parameter corresponding to the image quality that suits his or her preference using a simple input device such as a television remote controller as it is.
- the combination of the correction parameters is not limited to the combination of the exposure correction value and the color temperature.
- for example, a combination of sharpness (the crispness of edges in the image) and noise reduction (the smoothness of noise contained in the image), a combination of contrast (the degree of enhancement of the brightness difference of the image) and saturation (the vividness of colors), and various other combination patterns can be used.
- the temporary developed image data may include information (a unique identifier or the like) indicating the correction parameter instead of the correction parameter.
- the development server 120 may receive reference information as a development instruction from the development client 110 side, and apply correction parameters corresponding to the received reference information to development.
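Such identifier indirection can be sketched as follows (an illustrative Python sketch; the table and function names are assumptions — the patent only requires that some unique identifier stand in for the correction parameters):

```python
import uuid

# Server-side table mapping identifiers to correction parameters
# (a stand-in for state the development server 120 would keep).
PARAM_TABLE: dict = {}

def register_params(params: dict) -> str:
    """Store correction parameters and return a unique identifier that
    the temporarily developed image data can carry instead of the params."""
    ref = uuid.uuid4().hex
    PARAM_TABLE[ref] = params
    return ref

def resolve_params(ref: str) -> dict:
    """Look up the correction parameters for an identifier received back
    from the development client as a development instruction."""
    return PARAM_TABLE[ref]

ref = register_params({"exposure": +0.5, "color_temp_k": 5500})
print(resolve_params(ref) == {"exposure": 0.5, "color_temp_k": 5500})  # True
```

The client never needs to parse the parameters themselves; it simply echoes the identifier back when instructing development.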
- the development system 300 includes a development server 320 having a configuration different from that of the development server 120 of FIG.
- the development server 320 includes a correction target photo data extraction unit 323 instead of the correction target photo data extraction unit 123 and the operator interface unit 121.
- the correction target photo data extraction unit 323 has a correction-necessity rule storage unit 329.
- the RAW data transmitted from the development client 110 to the development server 320 conforms to EXIF (Exchangeable Image File Format, registered trademark), a standard for digital still camera image files, and various pieces of shooting information (hereinafter "EXIF information") are added to it.
- FIG. 11 is an explanatory diagram showing an example of the configuration of EXIF information.
- the EXIF information 400 includes a plurality of pieces of information. Of the plurality of pieces of information constituting the EXIF information 400, only the information group 401, which is a part of them, is used as material for determining whether correction is necessary.
- This information group 401 includes, for example, manufacturer name, model, shooting date / time, shutter speed, F value, shooting mode, exposure correction value, flash mode, white balance (color temperature), and ISO sensitivity.
- information other than these pieces of information is not used as material for determining whether correction is necessary.
- the correction-necessity rule storage unit 329 shown in FIG. 10 stores a correction-necessity rule table.
- the correction-necessity rule table describes in advance criteria for determining whether the RAW data sent to the development server 320 is a correction target.
- FIG. 12 is an explanatory diagram showing an example of the configuration of the correction-necessity rule table.
- each rule has fields for a rule item 434 and a correction-necessity rule 435.
- Rule item 434 is a field for designating the type of information used for determining whether or not correction is necessary.
- the correction-necessity rule 435 field describes a logical conditional expression for determining the necessity of correction from given numerical values when each piece of information specified in the rule item 434 is given as a numerical value. Here, the contents of each logical conditional expression are shown as the correction-necessity rule 435.
- the shooting information single rule 431 is a rule group for determining whether correction is necessary based on a single piece of shooting information read from the EXIF information 400.
- the shooting information single rule 431 has correction-necessity rules 435 for ISO sensitivity and the exposure correction value.
- the shooting information single rule 431 defines, as the correction-necessity rule 435 for ISO sensitivity, a camera-model-specific threshold value as the criterion for determining that correction is necessary (hereinafter "correction-necessity criterion").
- the shooting information single rule 431 defines, as the correction-necessity rule 435 for the exposure correction value, a correction-necessity criterion of −2 EV or less or +2 EV or more, regardless of camera model.
- the exposure correction value is a value for intentionally changing the exposure amount with respect to the automatic exposure setting of the camera.
- exposure correction values are usually used in the range of about −1 EV to +1 EV. If the exposure correction value deviates significantly from this range, it can be assumed that the photographer set the exposure correction value incorrectly, or that the shooting conditions were extremely difficult, such as a backlit subject rendered dark. Therefore, the shooting information single rule 431 describes the conditional expression "−2 EV or less or +2 EV or more" as the correction-necessity rule 435 for the rule item 434 "exposure correction value".
- the shooting information composite rule 432 is a group of rules for determining whether correction is necessary based on a combination of a plurality of shooting information read from the EXIF information 400.
- the shooting information composite rule 432 defines a correction-necessity rule 435 for each combination of exposure amount, flash, shooting mode, and white balance.
- for example, the shooting information composite rule 432 specifies as a correction-necessity criterion the case where the color temperature is set for an artificial light source such as incandescent light (the color temperature is not in the range of 4500 K to 6000 K) even though the exposure amount suggests sunny conditions (13 EV or more).
- the shooting information composite rule 432 likewise specifies as a correction-necessity criterion the case where the flash is fired but the color temperature is not suitable for flash light (the white balance is not in the range of 5400 K to 5600 K).
- the shooting information composite rule 432 also specifies as a correction-necessity criterion the case where the color temperature is set within a range suited to artificial light sources (the white balance is not in the range of 4000 K to 7000 K) even though the shooting mode and shutter speed suggest outdoor landscape photography (landscape mode and a shutter speed of 1/60 second or faster).
- the unit of exposure is EV; the exposure amount at which a photograph can be taken with appropriate brightness at aperture value F1, shutter speed 1 second, and ISO 100 is defined as 0 EV. Normally, the EV increases by 1 each time the aperture value is multiplied by √2, and increases by 1 each time the shutter speed or the ISO sensitivity doubles. Outdoor exposure on a sunny day is said to be around 14 EV.
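The exposure amount described above can be computed from EXIF fields. The following Python sketch (illustrative only; the function name and the normalization of EV to ISO 100 are assumptions, not part of the patent) performs the calculation that step S3219 needs from the shutter speed, F value, and ISO sensitivity:

```python
import math

def exposure_value(f_number: float, shutter_sec: float, iso: float) -> float:
    """Exposure amount in EV, normalized to ISO 100 (one common convention).

    F1, 1 s, ISO 100 -> 0 EV, as defined in the text; each halving of the
    light reaching the sensor raises log2(N^2/t) by 1 EV.
    """
    return math.log2(f_number ** 2 / shutter_sec) - math.log2(iso / 100.0)

# Sunny outdoor shot: F8, 1/500 s, ISO 100 -> about 15 EV
ev = exposure_value(8.0, 1 / 500, 100)
print(round(ev, 1))  # -> 15.0
```

A value of 13 EV or more would then trigger the "sunny exposure" half of the composite rule described above.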
- the image information rule 433 is a rule group for determining whether correction is necessary based on various information other than the EXIF information 400.
- the image information rule 433 defines correction-necessity rules 435 for underexposure, overexposure, insufficient latitude, and color cast as rule items 434, respectively.
- the amount of data in the bright and dark parts of an image can be read as histogram peaks from the histogram information, which is the luminance distribution of the entire image. Therefore, the image information rule 433 treats the case where there is no peak in the bright part as underexposure, and conversely the case where there is no peak in the dark part as overexposure.
- the image information rule 433 defines these cases as correction-required standards.
- the image information rule 433 specifies the case where there are peaks in both the dark and bright areas as insufficient latitude (the luminance difference exceeds the allowable range), and likewise defines it as a correction-necessity criterion.
- the image information rule 433 assumes that a color cast (color tone abnormality) may have occurred when there is a large difference between the peak shapes of the R, G, and B (red, green, and blue) histograms, and defines this as a correction-necessity criterion.
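The histogram criteria above can be sketched in code. The following Python fragment is illustrative only; the bin ranges, mass thresholds, and peak tolerance are assumptions, since the patent states the criteria qualitatively:

```python
def classify_exposure(luma_hist):
    """Apply the underexposure/overexposure/latitude criteria of the
    image information rule to a 256-bin luminance histogram.
    The quarter-width regions and mass thresholds are assumptions."""
    total = sum(luma_hist) or 1
    dark = sum(luma_hist[:64]) / total      # darkest quarter of the range
    bright = sum(luma_hist[192:]) / total   # brightest quarter of the range
    PEAK = 0.20                             # assumed "has a mountain" mass
    if bright < 0.01 and dark >= PEAK:
        return "underexposure"              # no mountain in the bright part
    if dark < 0.01 and bright >= PEAK:
        return "overexposure"               # no mountain in the dark part
    if dark >= PEAK and bright >= PEAK:
        return "insufficient latitude"      # mountains in both extremes
    return "ok"

def possible_color_cast(r_hist, g_hist, b_hist, tol=32):
    """Flag a possible color cast when the RGB histogram peak positions
    differ by more than `tol` bins (the tolerance is an assumption)."""
    peaks = [max(range(len(h)), key=h.__getitem__)
             for h in (r_hist, g_hist, b_hist)]
    return max(peaks) - min(peaks) > tol

dark_shot = [100] * 64 + [0] * 192
print(classify_exposure(dark_shot))  # -> underexposure
```

A RAW image flagged by any of these checks would then be sorted into the correction targets.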
- the correction-necessity rule table 430 is created by, for example, a camera manufacturer based on the development results obtained when basic parameters are applied to RAW data shot under various shooting conditions. The development server 320 then acquires the correction-necessity rule table 430 created by the manufacturer from, for example, the manufacturer's server via the Internet, or from the camera. Alternatively, the correction-necessity rule table 430 may be created by the user 130 at the development client 110 or by the operator 160 at the development server 320.
- FIG. 13 is a flowchart showing the flow of RAW data separation processing by the correction target photo data extraction unit 323, and corresponds to FIG. 4 of the first embodiment.
- in FIG. 13, the same parts as those in FIG. 4 are denoted by the same reference numerals, and description thereof is omitted.
- the correction target photo data extraction unit 323 reads the EXIF information 400 added to the RAW data input in step S3000.
- here, the correction target photo data extraction unit 323 may read only the information (information group 401) described in the rule items 434 of the correction-necessity rule table 430 shown in FIG. 12.
- at this time, the correction target photo data extraction unit 323 analyzes the EXIF data structure of the RAW data and reads the EXIF information 400. Since the technique of EXIF data structure analysis is known, its detailed description is omitted.
- in step S3210, the correction target photo data extraction unit 323 executes a correction-necessity rule matching process.
- the correction-necessity rule matching process is a process of matching the read EXIF information 400 against the correction-necessity rules 435 in the correction-necessity rule table 430.
- FIG. 14 is a flowchart showing the flow of the correction-necessity rule matching process by the correction target photo data extraction unit 323.
- here, it is assumed that the shooting information single rule 431 and the shooting information composite rule 432 are described in the correction-necessity rule table 430 shown in FIG. 12.
- in step S3211, the correction target photo data extraction unit 323 sets the reading position at the head of the correction-necessity rule table 430 before reading information from it.
- in step S3212, the correction target photo data extraction unit 323 reads out the correction-necessity rule 435 at the currently set reading position.
- in step S3213, the correction target photo data extraction unit 323 determines whether the content of the read correction-necessity rule 435 (hereinafter "rule content" as appropriate) is defined per camera model (hereinafter a "model-specific rule"). If the rule content is a model-specific rule (S3213: YES), the correction target photo data extraction unit 323 proceeds to step S3214. On the other hand, if the rule content is not a model-specific rule, that is, if it is common to all camera models (S3213: NO), it proceeds to step S3215.
- in step S3214, since the rule content is a model-specific rule, the correction target photo data extraction unit 323 reads the manufacturer name and model name from the EXIF information 400 of the RAW data.
- in step S3215, since the rule content is common to all camera models, the correction target photo data extraction unit 323 selects the all-model-common rule content and proceeds to step S3217.
- in step S3217, the correction target photo data extraction unit 323 determines whether the read correction-necessity rule 435 is defined by a single type of shooting information alone, that is, whether it is the shooting information single rule 431. If the correction-necessity rule 435 is the shooting information single rule 431 (S3217: YES), the correction target photo data extraction unit 323 proceeds to step S3218. On the other hand, if the correction-necessity rule 435 is not the shooting information single rule 431, that is, if it is the shooting information composite rule 432 (S3217: NO), the correction target photo data extraction unit 323 proceeds to step S3219.
- in step S3218, the correction target photo data extraction unit 323 reads the information necessary for evaluating the conditional expression of the correction-necessity rule 435 from the EXIF information 400, substitutes it into the conditional expression to evaluate whether correction of the RAW data is necessary, and proceeds to step S3220.
- for example, the correction target photo data extraction unit 323 selects the correction-necessity rule 435 that specifies the conditional expression "ISO sensitivity ≥ 800" corresponding to the rule item 434 "ISO sensitivity".
- in this case, the correction target photo data extraction unit 323 substitutes the ISO sensitivity read from the EXIF information 400 into the "ISO sensitivity" portion of the conditional expression and evaluates whether the conditional expression is satisfied. If the conditional expression is satisfied, the correction target photo data extraction unit 323 determines that the RAW data being processed is data requiring correction.
- similarly, a conditional expression "exposure correction value ≤ −2 EV or exposure correction value ≥ +2 EV" is defined. Therefore, the correction target photo data extraction unit 323 substitutes the exposure correction value read from the EXIF information 400 into the "exposure correction value" portion of the conditional expression and evaluates whether the conditional expression is satisfied. If the conditional expression is satisfied, the correction target photo data extraction unit 323 determines that the RAW data being processed is data requiring correction.
- “or” is a logical OR operator meaning “or”.
- in step S3219, since the correction target photo data extraction unit 323 has not yet acquired all the information necessary for evaluating the conditional expression of the correction-necessity rule 435, it acquires the unacquired information. Specifically, the correction target photo data extraction unit 323 reads the shutter speed, F value, and ISO sensitivity from the EXIF information 400 and calculates the exposure amount.
- in step S3221, since the correction target photo data extraction unit 323 has finished acquiring the information necessary for evaluating the conditional expression of the correction-necessity rule 435, it substitutes the corresponding EXIF information into the conditional expression to evaluate whether correction of the RAW data is necessary, and proceeds to step S3220.
- for example, as the correction-necessity rule 435 corresponding to the rule item 434 "exposure & white balance", a conditional expression "exposure amount ≥ 13 EV and (color temperature < 4500 K or color temperature > 6000 K)" is specified.
- the correction target photo data extraction unit 323 substitutes the calculated exposure amount into the "exposure amount" portion of the conditional expression and evaluates whether the conditional expression is satisfied. If the conditional expression is satisfied, the correction target photo data extraction unit 323 determines that the RAW data being processed is data requiring correction.
- “and” is a logical product operator meaning “and”.
- the EXIF information 400 may not include the color temperature itself, but instead the name of a white balance designation mode such as "Fine" or "Cloudy".
- in this case, a table associating each designation mode with a color temperature may be prepared in advance in the correction target photo data extraction unit 323 for each camera manufacturer name and model name.
- the correction target photo data extraction unit 323 can easily obtain the color temperature of the RAW data by referring to this table.
- in step S3220, the correction target photo data extraction unit 323 determines whether collation against all the correction-necessity rules 435 in the correction-necessity rule table 430 has been completed. If there is a correction-necessity rule 435 that has not yet been checked (S3220: NO), the correction target photo data extraction unit 323 advances the reading position by one in step S3222, returns to step S3212, and checks the next correction-necessity rule 435. On the other hand, when collation against all the correction-necessity rules 435 has been completed (S3220: YES), the correction target photo data extraction unit 323 ends the correction-necessity rule matching process and proceeds to step S3310 of the RAW data separation process shown in FIG. 13.
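The matching loop of steps S3211 to S3222 reduces to iterating over the rule table and evaluating each conditional expression against the EXIF information. A minimal Python sketch follows (the predicate-based table representation and all names are assumptions; model-specific rule selection is omitted for brevity, and the thresholds follow the examples in the text):

```python
# Each correction-necessity rule is modeled as (rule_item, predicate over
# an EXIF dict). Real rules are logical conditional expressions per the text.
RULES = [
    ("ISO sensitivity",
     lambda x: x["iso"] >= 800),
    ("exposure correction value",
     lambda x: x["exposure_comp"] <= -2 or x["exposure_comp"] >= 2),
    ("exposure & white balance",  # shooting information composite rule
     lambda x: x["exposure_ev"] >= 13
               and not (4500 <= x["color_temp_k"] <= 6000)),
]

def needs_correction(exif: dict) -> bool:
    """Return True as soon as any correction-necessity rule matches
    (corresponds to the YES branch of step S3310)."""
    return any(pred(exif) for _, pred in RULES)

shot = {"iso": 100, "exposure_comp": 0, "exposure_ev": 14, "color_temp_k": 3000}
print(needs_correction(shot))  # tungsten WB under sunny exposure -> True
```

Representing rules as data keeps the table replaceable — for example, by a table downloaded from a camera manufacturer — without changing the matching loop.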
- note that when it is determined in step S3218 or step S3221 that the RAW data is data requiring correction, the process may proceed directly to step S3310 in FIG. 13 without passing through step S3220.
- the necessity of correction may be determined from information other than EXIF information 400.
- for example, the histogram information of the RAW data may be acquired, and the necessity of correction may be determined based on the image information rule 433 shown in FIG. 12.
- in step S3310 of FIG. 13, the correction target photo data extraction unit 323 determines, based on the EXIF information, whether the RAW data falls under any correction-necessity rule 435. If the RAW data falls under a correction-necessity rule 435 (S3310: YES), the correction target photo data extraction unit 323 proceeds to step S3400 and sorts the RAW data into the correction targets. On the other hand, if the RAW data does not fall under any correction-necessity rule 435 (S3310: NO), the correction target photo data extraction unit 323 proceeds to step S3500 and sorts the RAW data out of the correction targets.
- the correction target data (RAW data) is output to the provisionally developed image generation unit 124, and the corresponding provisionally developed image is transmitted to the development client 110.
- the correction target photo data extraction unit 323 may pass the determination result of the correction-necessity rule matching process shown in FIG. 14 to the digital photo development unit as information for appropriately performing correction.
- in this way, the development server 320 refers to the correction-necessity rule table 430, which describes criteria for determining whether correction is necessary, and determines whether each piece of RAW data requires correction using the information accompanying the RAW data. As a result, the development server 320 can automatically sort the correction target data, thereby reducing the operator's workload or making operator intervention unnecessary.
- the operator interface unit 121 shown in FIG. 2 of the first embodiment, or an operator interface with reduced functionality, may be provided so that the operator 160 can confirm the sorting result of the RAW data separation process described above.
- for example, the development server 320 displays the RAW images as options on an information display screen such as the correction-necessity input screen 200 shown in FIG. 4, and indicates the result of the RAW data separation process with check marks or the like. Then, the development server 320 performs the final separation after visual confirmation by the operator 160.
- this makes it possible to return to the correction targets data that the RAW data separation process erroneously excluded even though it should be corrected. In other words, it is possible both to reduce the burden on the operator 160 and to achieve accurate data separation.
- FIG. 15 is a system configuration diagram showing the configuration of the development system according to the third embodiment of the present invention, and corresponds to FIG. 2 and FIG. 10 of the first and second embodiments. In FIG. 15, the same parts as those in FIG. 2 and FIG. 10 are denoted by the same reference numerals, and description thereof is omitted.
- the development system 500 includes a development server 520 having a configuration different from that of the development server 120 of FIG. 2 and the development server 320 of FIG.
- the development server 520 includes a digital photo development unit 527 instead of the digital photo development unit 127, and also includes a similar image grouping unit 530 and a representative image extraction unit 531.
- the similar image grouping unit 530 includes a similarity determination rule storage unit 532.
- the similar image grouping unit 530 receives the correction target data output from the correction target photo data extraction unit 323, collects similar images in the input correction target data into groups, and outputs them to the representative image extraction unit 531. Here, similar images are a plurality of correction target data that are similar in the sense that they can be developed using the same correction parameter.
- the similarity determination rule storage unit 532 stores a similarity determination rule table. The similarity determination rule table describes in advance criteria for determining whether a plurality of correction target data are similar.
- FIG. 16 is an explanatory diagram showing an example of the configuration of the similarity determination rule table.
- the similarity determination rule table 600 has fields for a rule item 601 and a similarity determination rule 602.
- the rule item 601 describes the type of information used for determining whether or not a plurality of correction target data are similar.
- the similarity determination rule 602 describes a criterion for determining whether a plurality of correction target data are similar. Specifically, the similarity determination rule 602 describes a logical conditional expression for determining, from given numerical values, whether a plurality of correction target data are similar when each piece of information specified by the rule item 601 is given as a numerical value. Here, the contents of the respective logical conditional expressions are shown as the similarity determination rule 602.
- RAW data shot in similar environments usually share common image quality characteristics. Therefore, when a correction parameter produces a satisfactory development result for any one of a plurality of RAW data with similar shooting environments, applying that correction parameter to the other RAW data generally yields similarly satisfactory development results. As a rule of thumb, it is known that an appropriate correction parameter can be applied in common when the lighting conditions and the shooting method at the time of shooting are similar.
- the similarity determination rule table 600 describes contents focusing on this property.
- the similarity determination rule table 600 describes the similarity determination rule 602 “within 10 seconds between shooting times” corresponding to the rule item 601 “shooting date and time”.
- further, the similarity determination rule table 600 describes the similarity determination rule 602 "shooting interval within 1 minute, shooting mode unchanged, and flash mode unchanged" corresponding to the rule item 601 "shooting date/time and shooting mode". This is because, when the shooting dates/times and shooting modes of a plurality of correction target data satisfy the similarity determination rule 602, the plurality of correction target data can be developed using the same correction parameter. As described in Embodiment 2, the shooting date/time and the shooting mode can be acquired from the EXIF information added to the correction target data.
- the representative image extraction unit 531 extracts one representative image per group from the plurality of correction target data and outputs it to the provisionally developed image generation unit 124 as the correction target data for which a provisionally developed image is to be created.
- FIG. 17 is a sequence diagram showing the flow of RAW development in the third embodiment, and corresponds to FIG. 3 in the first embodiment.
- in FIG. 17, the same parts as those in FIG. 3 are denoted by the same reference numerals, and description thereof is omitted.
- the similar image grouping unit 530 executes a similar image grouping process.
- the similar image grouping process is a process of collecting similar images into groups using the similarity determination rule table of the similarity determination rule storage unit 532.
- FIG. 18 is a flowchart showing a flow of similar image grouping processing by the similar image grouping unit 530.
- in step S4100, when RAW data groups already exist as a result of the group generation process described later, the similar image grouping unit 530 selects one existing group. Then, the similar image grouping unit 530 reads the EXIF information of RAW data belonging to the selected group. The similar image grouping unit 530 may read the EXIF information of any one piece of RAW data belonging to the selected group, or may read the EXIF information of several or all pieces of RAW data belonging to the group and obtain average values. However, as information related to the shooting date and time, it is desirable that the similar image grouping unit 530 read that of the most recently captured RAW data.
- in step S4200, the similar image grouping unit 530 selects and reads out one similarity determination rule 602 from the similarity determination rule table 600.
- in step S4300, the similar image grouping unit 530 evaluates the similarity between the RAW data and the RAW data of the selected group. Specifically, the similar image grouping unit 530 evaluates the similarity based on the content of the selected similarity determination rule 602, using the EXIF information of the RAW data and the EXIF information acquired from the selected group.
- for example, when the similarity determination rule 602 "shooting interval within 10 seconds" is selected, the similar image grouping unit 530 compares the shooting date/time read from the EXIF information of the group with the shooting date/time read from the RAW data. The similar image grouping unit 530 then determines that they are similar when the difference between the two shooting dates/times is within 10 seconds.
- the similar image grouping unit 530 manages each piece of RAW data as a group by associating it with identification information unique to the determined group (hereinafter "group information"). The similar image grouping unit 530 may include the group information in the supplementary information of each piece of temporarily developed image data, so that the development client 110 side can instruct development using the group information when specifying a correction parameter. For such data grouping and group management based on identification information, well-known and widely used ID (identifier) management techniques can be applied, for example.
- in step S4400, the similar image grouping unit 530 determines whether the relationship between the RAW data and the selected group matches the selected similarity determination rule 602. If the relationship between the RAW data and the group does not match the similarity determination rule 602, or if the similarity evaluation has not been performed (S4400: NO), the similar image grouping unit 530 proceeds to step S4500. On the other hand, if the relationship between the RAW data and the group matches the similarity determination rule 602 (S4400: YES), the similar image grouping unit 530 proceeds to step S4600.
- in step S4500, the similar image grouping unit 530 determines whether similarity evaluation has been completed for all similarity determination rules 602 of the similarity determination rule table 600.
- when there is a similarity determination rule 602 that has not yet been used for similarity evaluation (S4500: NO), the similar image grouping unit 530 returns to step S4200, reselects an unused similarity determination rule 602, and performs similarity evaluation. On the other hand, when similarity has been evaluated for all similarity determination rules 602 (S4500: YES), the similar image grouping unit 530 proceeds to step S4700.
- in step S4700, the similar image grouping unit 530 determines whether similarity evaluation has been completed for all existing groups.
- when there is an existing group that has not yet been targeted for similarity evaluation (S4700: NO), the similar image grouping unit 530 returns to step S4100, selects an unselected group, and performs similarity evaluation again.
- the similar image grouping unit 530 proceeds to step S4800 if the similarity is evaluated for all existing groups (S4700: YES).
- in step S4800, since the RAW data is not similar to any existing group, the similar image grouping unit 530 generates a new group, assigns the RAW data to the generated new group, and proceeds to step S4900.
- in step S4600, since there is an existing group that matches the similarity determination rule 602, that is, a similar group, the similar image grouping unit 530 adds the RAW data to that existing group and proceeds to step S4900.
- in step S4900, the similar image grouping unit 530 determines whether grouping of all the RAW data input from the correction target photo data extraction unit 323 has been completed.
- if there is RAW data input from the correction target photo data extraction unit 323 that has not yet been grouped (S4900: NO), the similar image grouping unit 530 returns to step S4000, selects ungrouped RAW data, and performs grouping.
- when grouping of all the input RAW data is completed (S4900: YES), the similar image grouping unit 530 ends this series of processing.
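The grouping flow of FIG. 18 can be sketched as follows (illustrative Python; a single simplified similarity rule — shooting interval within 10 seconds against the group's most recent member — stands in for the full rule table, and all names are assumptions):

```python
def group_similar(shots, max_gap_sec=10):
    """Assign each shot (a dict with a 'time' field in seconds) to an
    existing group when it matches the similarity rule against that
    group's most recent member, otherwise start a new group
    (corresponding to steps S4600 and S4800 of the flowchart)."""
    groups = []
    for shot in sorted(shots, key=lambda s: s["time"]):
        for group in groups:
            if shot["time"] - group[-1]["time"] <= max_gap_sec:
                group.append(shot)   # S4600: add to similar existing group
                break
        else:
            groups.append([shot])    # S4800: generate a new group
    return groups

shots = [{"time": t} for t in (0, 4, 9, 60, 65, 300)]
print([len(g) for g in group_similar(shots)])  # -> [3, 2, 1]
```

Comparing against the group's most recent member reflects the text's preference for reading the shooting date/time of the RAW data captured most recently.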
- in this way, the similar image grouping unit 530 groups, among the RAW data sent from the development client 110, the correction target data to which the same correction parameter can be applied. The development server 520 then outputs the grouped RAW data to the representative image extraction unit 531 with the corresponding group information added, and proceeds to step S1330 in FIG. 17.
- in step S1330 of FIG. 17, the representative image extraction unit 531 extracts the representative image data of each group by selecting one piece of RAW data from each group, and adds the corresponding group information to each piece of extracted representative image data. Then, the representative image extraction unit 531 outputs the representative image data with the group information added to the provisionally developed image generation unit 124.
- the selection of representative image data in each group may be random, or a RAW image having EXIF information close to the average of the group may be selected.
- the processing in steps S1400 to S2000 is the same as the processing described in FIG. 3.
- in step S2000, the development server 520 receives the correction parameter selected by the user 130.
- the temporarily developed image data generated in step S1400 is only that of the correction target data extracted by the representative image extraction unit 531 from the correction target data sorted by the correction target photo data extraction unit 323. Further, the number of correction target data extracted by the representative image extraction unit 531 equals the number of groups into which the correction target data has been divided. Therefore, the number of provisionally developed images transmitted in step S1500 and the number of correction parameters transmitted in step S2000 can be kept low compared to the case where representative image data is not extracted.
- In step S2110, the digital photo development unit 527 develops the correction target data by applying the correction parameters input from the correction parameter receiving unit 126, in the same manner as the digital photo development unit 127 in FIG. 2.
- However, the correction parameters are specified in association with each representative image data item only.
- the digital photo development unit 527 applies the correction parameters specified for the representative image data to all the image data (RAW data) in the same group, and generates JPEG data.
- For this purpose, the development server 520 associates the temporarily developed image, or the RAW data on which it is based, with the group information, and manages each correction parameter and its group information as a set. This makes it easy to identify which correction parameter corresponds to which group.
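As a sketch of this bookkeeping, the Python fragment below applies the correction parameter chosen for each group's representative to every RAW item in that group. The data shapes are hypothetical, and `develop` merely records what would be applied in place of real JPEG generation.

```python
def develop(raw, params):
    """Hypothetical stand-in for RAW development: returns a record of the
    source item and the parameters that would be applied to it."""
    return {"source": raw, "applied": params}

def develop_groups(groups, chosen_params):
    """Apply the correction parameter chosen for each group's representative
    to every member of that group, mirroring the behavior of unit 527.
    groups: group_id -> list of RAW items; chosen_params: group_id -> params."""
    results = []
    for group_id, members in groups.items():
        params = chosen_params[group_id]  # the (group, parameter) set kept by the server
        results.extend(develop(m, params) for m in members)
    return results
```

The group_id keys play the role of the group information managed as a set with each correction parameter.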
- development server 520 groups a plurality of correction target data to which the same correction parameter can be applied.
- the development server 520 sets only representative image data as a target for temporary development image generation. Further, the development server 520 applies the correction parameters designated for each temporarily developed image to all correction target data in the same group.
- As a result, the development server 520 can reduce the size of the temporarily developed image data and the number of correction parameters transmitted to and received from the development client 110. This also allows the development system 500 to minimize the number of provisionally developed images presented as options on the development client 110, so that the user 130 can obtain the desired development results with minimal correction operations. In other words, the burden on the user 130 can be further reduced.
- Note that the development server 520 may determine whether the non-correction target data belongs to any of the groups generated by the similar image grouping unit 530, and develop it using the same correction parameter as the corresponding group. In this case, however, it is desirable that the development server 520 selects the representative image data from the correction target data so that the user 130 can easily recognize the effect of the correction.
- Further, the development server 520 may be provided with an operator interface unit 121 shown in FIG. 2, and the operator 160 may confirm the grouping of the RAW data produced by the similar image grouping process. In this case, the development server 520 displays RAW images as options on an information display screen such as the correction-necessary input screen 200 shown in FIG. 4, presents the result of grouping by the similar image grouping process, and accepts changes made by the operator 160. In this way, an incorrect grouping produced by the similar image grouping process can be corrected; that is, the burden on the operator 160 can be reduced and accurate data separation can be realized.
- FIG. 19 is a system configuration diagram showing the configuration of the developing system according to Embodiment 4 of the present invention, and corresponds to FIGS. 2, 10, and 15 of Embodiment 1 to Embodiment 3.
- the development system 700 includes a development server 720 having a configuration different from that of the development servers 120, 320, and 520 of FIGS. 2, 10, and 15.
- the development server 720 includes a digital photo development unit 727 instead of the digital photo development units 127 and 527, and additionally includes a user preference information extraction unit 733 and a user preference information management unit 734.
- the user preference information extraction unit 733 includes a preference extraction rule storage unit 735.
- the user preference information extraction unit 733 receives the correction parameters output from the correction parameter reception unit 126, extracts the user preference information from the correction parameters, and outputs the user preference information to the user preference information management unit 734.
- the user preference information is information indicating a universal image quality tendency that the user 130 likes.
- the preference extraction rule storage unit 735 stores a preference extraction rule table.
- the preference extraction rule table describes in advance a preference extraction rule that serves as a reference for extracting user preference information from the correction parameters.
- FIG. 20 is an explanatory diagram showing an example of the configuration of the preference extraction rule table.
- the preference extraction table 800 has fields for an extraction condition 801, an extraction target 802, and an extraction result 803.
- the extraction condition 801 describes a condition under which user preference information can be extracted.
- Specifically, the extraction condition 801 describes conditions relating to the EXIF information of the RAW data.
- the extraction target 802 specifies information to be read from the correction parameter when the extraction condition 801 is satisfied.
- the extraction result 803 designates the content to be extracted as user preference information.
- For example, the preference extraction table 800 describes, in association with one another, the extraction condition 801 "flash emission", the extraction target 802 "color temperature", and the extraction result 803 "difference between the default color temperature at flash emission and the color temperature of the correction parameter". This is a preference extraction rule for handling, as user preference information, the difference between the color temperature automatically set by the camera and the color temperature included in the correction parameter.
- Similarly, the preference extraction table 800 describes, in association with one another, the extraction condition 801 "exposure amount of 14 EV or more", the extraction target 802 "color temperature", and the extraction result 803 "difference between the clear sky color temperature and the color temperature of the correction parameter when the exposure amount is 14 EV or more".
- Further, the preference extraction table 800 specifies that the noise removal amount included in the correction parameter can be extracted as user preference information in association with the ISO sensitivity.
- Specifically, the preference extraction table 800 describes, in association with one another, the extraction condition 801 "ISO sensitivity 100 or less", the extraction target 802 "noise removal amount", and the extraction result 803 "noise removal amount when ISO is 100 or less". This is a preference extraction rule specifying that, when the ISO sensitivity included in the EXIF information is 100 or less, the noise removal amount read from the correction parameters is extracted as the noise removal amount the user prefers at ISO 100 or less.
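The three-field rule table described above might be modeled as follows. The predicate forms and EXIF key names (`flash_fired`, `iso`) are illustrative assumptions, not identifiers taken from the patent.

```python
# Each rule mirrors the three fields of table 800: an extraction condition
# (a predicate over EXIF info), an extraction target (which correction-
# parameter field to read), and an extraction result (the key under which
# the value is stored as user preference information).
PREFERENCE_RULES = [
    {
        "condition": lambda exif: exif.get("flash_fired") is True,
        "target": "color_temperature",
        "result": "color_temp_delta_on_flash",
    },
    {
        "condition": lambda exif: exif.get("iso", 0) <= 100,
        "target": "noise_reduction",
        "result": "noise_reduction_at_low_iso",
    },
]

def matching_rules(exif):
    """Return the rules whose extraction condition the EXIF info satisfies."""
    return [r for r in PREFERENCE_RULES if r["condition"](exif)]
```

A rule engine of this shape makes it easy to add new conditions without changing the extraction loop.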
- the user preference information management unit 734 in FIG. 19 has a storage area (not shown) for storing information, holds the user preference information input from the user preference information extraction unit 733, and manages the held user preference information (user profile).
- the digital photo development unit 727 develops the correction target data using the correction parameters sent from the development client 110, and develops the non-correction target data using the information held by the user preference information management unit 734.
- FIG. 21 is a sequence diagram showing the flow of RAW development in the fourth embodiment, and corresponds to FIG. 3 in the first embodiment.
- FIG. 21 parts that are the same as the parts shown in FIG. 3 are given the same reference numerals, and explanation thereof is omitted.
- In step S2000, the correction parameter receiving unit 126 outputs the received correction parameter not only to the digital photo development unit 727 but also to the user preference information extraction unit 733 connected to it.
- In step S2010, the user preference information extraction unit 733 executes a user preference information extraction process.
- the user preference information extraction process is a process for extracting user preference information from the correction parameters using the preference extraction rule table 800.
- FIG. 22 is a flowchart showing the flow of user preference information extraction processing by the user preference information extraction unit 733.
- First, in step S6000, the user preference information extraction unit 733 receives the correction parameter designated by the user 130 from the correction parameter reception unit 126.
- In step S6100, the user preference information extraction unit 733 reads the EXIF information added to the RAW data (correction target data) to be developed using the correction parameters.
- In step S6200, the user preference information extraction unit 733 selects one preference extraction rule in the preference extraction rule table 800, and reads the extraction condition 801, extraction target 802, and extraction result 803 of the selected preference extraction rule.
- In step S6300, the user preference information extraction unit 733 collates the read EXIF information against the read extraction condition 801 to determine whether the extraction condition 801 is met. If the EXIF information matches the extraction condition 801 (S6300: YES), the user preference information extraction unit 733 proceeds to step S6400.
- In step S6500, the user preference information extraction unit 733 determines whether or not the read extraction result 803 depends on a camera-specific value. If the extraction result 803 depends on a camera-specific value (S6500: YES), the user preference information extraction unit 733 proceeds to step S6600. On the other hand, if it does not (S6500: NO), the user preference information extraction unit 733 proceeds to step S6700 without passing through step S6600.
- The camera-specific value includes, for example, the default color temperature at flash emission that the camera originally holds.
- In step S6600, the user preference information extraction unit 733 refers to a correspondence table held in advance and acquires the camera-specific value.
- Specifically, the user preference information extraction unit 733 reads the manufacturer name and model name of the camera that captured the RAW data from the EXIF information, and acquires the camera-specific value by referring to a correspondence table in which the specific values of each camera are associated in advance with manufacturer names and model names.
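A minimal sketch of this correspondence-table lookup follows. The maker/model names and the kelvin values are placeholders; real defaults are camera-firmware specifics not given in the patent.

```python
# Hypothetical correspondence table: (maker, model) -> default color
# temperature at flash emission, in kelvin.
CAMERA_DEFAULTS = {
    ("ExampleCo", "X-1"): 5500,
    ("ExampleCo", "X-2"): 6000,
}

def camera_flash_default(exif):
    """Resolve the camera-specific default from the maker and model
    names read out of the EXIF information; None if the camera is unknown."""
    key = (exif.get("maker"), exif.get("model"))
    return CAMERA_DEFAULTS.get(key)
```

Keeping the table keyed on (maker, model) matches the text's description of looking the value up from EXIF identification fields.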
- In step S6700, the user preference information extraction unit 733 generates an extraction result according to the contents of the read extraction result 803, and proceeds to step S6800.
- For example, when the EXIF information indicates flash emission, the extraction condition 801 is met, so the user preference information extraction unit 733 obtains the color temperature from the EXIF information. In addition, because the corresponding extraction result 803 depends on the camera-specific default color temperature at flash emission, the user preference information extraction unit 733 obtains the manufacturer name and model name from the EXIF information and generates the extraction result using the correspondence table.
- In step S6300, if the read EXIF information does not match the extraction condition 801 (S6300: NO), the user preference information extraction unit 733 proceeds to step S6800 without passing through steps S6400 to S6700.
- In step S6800, the user preference information extraction unit 733 determines whether or not the user preference information extraction process has been completed for all preference extraction rules in the preference extraction table 800.
- If preference extraction rules remain to be processed (S6800: NO), the user preference information extraction unit 733 returns to step S6200, selects the next preference extraction rule, and determines whether user preference information can be extracted.
- the user preference information extraction unit 733 ends the series of processing when the processing is completed for all preference extraction rules (S6800: YES).
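Steps S6200 to S6800 amount to a loop over the rule table. The sketch below models that loop; the rule fields and the subtraction-based result (value minus a camera-specific baseline) are assumptions patterned on the flash-emission example in the text.

```python
def extract_preferences(exif, correction_params, rules):
    """Loop over the preference extraction rules (S6200-S6800 sketch)."""
    prefs = {}
    for rule in rules:
        # S6300: skip this rule if the EXIF info does not meet the condition.
        if not rule["condition"](exif):
            continue
        # Read the extraction target from the correction parameters.
        value = correction_params[rule["target"]]
        # S6500/S6600: offset by a camera-specific value when the rule has one.
        baseline = rule.get("baseline")
        # S6700: generate the extraction result under the rule's result key.
        prefs[rule["result"]] = value - baseline if baseline is not None else value
    return prefs
```

The returned dictionary plays the role of the user preference information handed to the management unit.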
- In this way, the user preference information extraction unit 733 extracts user preference information from the correction parameters sent from the development client 110. Then, the user preference information extraction unit 733 outputs the extracted user preference information to the user preference information management unit 734, and proceeds to step S2020 in FIG. 21.
- In step S2020 of FIG. 21, the user preference information management unit 734 stores the input user preference information in the storage area; if user preference information for the same user is already held, the existing user preference information is updated with the newly input user preference information.
- Note that the user preference information management unit 734 may learn user preferences and preference trends over a long period using known learning methods, such as blending existing information with new preference information weighted by a constant value. Further, the user preference information management unit 734 may assume that a fixed user 130 operates each development client 110, and associate user preference information with the user 130 for each data transmission/reception destination. Alternatively, the user preference information management unit 734 may perform user authentication prior to the development process and associate user preference information with each user 130.
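The constant-weight blending the text mentions can be sketched as an exponential-moving-average style update. The weight value and the dictionary shape are illustrative assumptions.

```python
def update_preference(existing, new, weight=0.3):
    """Blend an existing preference profile with newly extracted values
    using a fixed weight (one known learning method named in the text).
    If no profile exists yet, the new values become the profile."""
    if existing is None:
        return dict(new)
    merged = dict(existing)
    for key, value in new.items():
        if key in merged:
            # weighted blend: keep most of the old value, nudge toward the new
            merged[key] = (1 - weight) * merged[key] + weight * value
        else:
            merged[key] = value
    return merged
```

Repeated calls let the profile track a user's preference trend gradually rather than jumping to the latest correction.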
- In step S2100, the digital photo development unit 727 develops the correction target data using the correction parameters sent from the development client 110, and generates JPEG data.
- In step S2120, the digital photo development unit 727 develops the non-correction target data using the user preference information held by the user preference information management unit 734, and generates JPEG data.
- Specifically, the digital photo development unit 727 inquires of the user preference information management unit 734 about the user preference information for the transmission source of the non-correction target data. If the corresponding user preference information is already held in the user preference information management unit 734, the digital photo development unit 727 acquires it, generates a correction parameter corresponding to the acquired user preference information, and develops the non-correction target data using the generated correction parameter.
- For example, suppose the user preference information management unit 734 holds, as user preference information, the difference between the default color temperature at flash emission and the color temperature of the correction parameter.
- In this case, the digital photo development unit 727 uses this user preference information for non-correction target data that was photographed with flash emission and has the user 130 as the transmission source. That is, the digital photo development unit 727 performs development using a correction parameter obtained by adding the above color temperature difference to the default color temperature at flash emission.
- the user preference information management unit 734 holds, as user preference information, the difference between the clear sky color temperature when the exposure amount is 14 EV or more and the color temperature of the correction parameter.
- In this case, the digital photo development unit 727 uses this user preference information for non-correction target data captured with an exposure amount of 14 EV or more and having the user 130 as the transmission source. That is, the digital photo development unit 727 performs development using a correction parameter obtained by adding the above color temperature difference to the clear sky color temperature at an exposure amount of 14 EV or more.
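Rebuilding a correction parameter from a stored preference, as described for both the flash and 14 EV cases, can be sketched like this. The baseline constants and key names are placeholders, not values from the patent.

```python
def correction_from_preference(exif, prefs,
                               camera_flash_default=5500, clear_sky_temp=6500):
    """Sketch of step S2120: add the stored preference delta to the
    matching baseline to regenerate a color-temperature correction.
    Baseline constants here are illustrative assumptions."""
    if exif.get("flash_fired") and "color_temp_delta_on_flash" in prefs:
        return {"color_temperature":
                camera_flash_default + prefs["color_temp_delta_on_flash"]}
    if exif.get("exposure_ev", 0) >= 14 and "color_temp_delta_high_ev" in prefs:
        return {"color_temperature":
                clear_sky_temp + prefs["color_temp_delta_high_ev"]}
    return {}  # no applicable preference: fall back to default development
```

The empty-dictionary fallback models developing with default parameters when no preference applies.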
- In this way, the development server 720 develops the correction target data using the correction parameters designated by the user 130, extracts user preference information relating to the image quality corrections of the user 130, and develops the non-correction target data using the extracted user preference information. As a result, for the non-correction target data as well as the correction target data, an image having a quality suited to the user's preference can be obtained as the development result without imposing additional work on the user 130.
- FIG. 23 is a system configuration diagram showing the configuration of the developing system according to the fifth embodiment of the present invention, and corresponds to FIG. 2 of the first embodiment.
- the same parts as those in FIG. 2 are denoted by the same reference numerals, and description thereof will be omitted.
- In the development system 800, the first and second development clients 110-1 and 110-2 having the same configuration as the development client 110 in FIG. 2, and a development server 820 having a configuration different from that of the development server 120 in FIG. 2, are arranged.
- the development server 820 includes a communication destination selection unit 829 in addition to the configuration shown in FIG.
- The communication destination selection unit 829 of the development server 820 controls the provisionally developed image transmission unit 125 shown in FIG. 2 so that the temporarily developed image generated from the RAW data is transmitted to a counterpart other than the RAW data transmission source. Further, the communication destination selection unit 829 controls the JPEG transmission unit 128 shown in FIG. 2 so that the JPEG data generated based on the correction content is transmitted to the RAW data transmission source, which is not the correction content transmission source. For example, the communication destination selection unit 829 stores in advance a table (not shown) in which each RAW data transmission source is associated with the counterpart to which the provisionally developed image is to be transmitted. Each time RAW data is received, the communication destination selection unit 829 refers to this table and transmits the temporarily developed image to the counterpart corresponding to the RAW data transmission source.
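The routing table held by unit 829 might look like the following sketch, where client identifiers and table layout are hypothetical.

```python
# Hypothetical routing table: RAW data sender -> where the provisionally
# developed images go, and where the final JPEG data goes.
ROUTES = {
    "client-1": {"preview_to": "client-2", "jpeg_to": "client-1"},
}

def preview_destination(raw_sender):
    """Counterpart that receives the temporarily developed images; may
    differ from the sender, with the sender itself as the fallback."""
    return ROUTES.get(raw_sender, {}).get("preview_to", raw_sender)

def jpeg_destination(raw_sender):
    """The developed JPEG data returns to the RAW data transmission
    source by default."""
    return ROUTES.get(raw_sender, {}).get("jpeg_to", raw_sender)
```

Defaulting both lookups to the sender reproduces the single-client behavior of the earlier embodiments when no route is registered.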
- First, RAW data is transmitted from the first development client 110-1 to the development server 820 (S1100, S1200).
- the development server 820 generates a provisionally developed image from the RAW data classified as the correction target in the received RAW data.
- The communication destination selection unit 829 selects the second development client 110-2, rather than the first development client 110-1, as the transmission destination of the generated temporarily developed image.
- The development server 820 then transmits provisionally developed image data including the provisionally developed image to the second development client 110-2 (S1500, S1600).
- The second development client 110-2 transmits, to the development server 820, a correction parameter indicating the correction content of the temporarily developed image selected by the user from the received temporarily developed images (S1900, S2000).
- The development server 820 generates JPEG data based on the received correction contents, and transmits the generated JPEG data not to the second development client 110-2 but to the first development client 110-1 (S2100). The JPEG data may also be transmitted to the second development client 110-2.
- In this way, the development system 800 can also respond to a request in which the same user owns a plurality of development clients and wishes to perform RAW data acquisition and provisionally developed image selection on different development clients.
- For example, the acquired RAW data can be transmitted from a camera to the development server 820, and the temporarily developed image can be selected on a personal computer equipped with a high-quality display.
- That is, according to the present embodiment, it is possible to respond more flexibly even when users' requests regarding the form of RAW data development become complicated and diversified.
- Note that the number of development clients that the development server 820 can select as communication destinations is not limited to two.
- For example, the correction contents can be determined by a different development client 110 for each individual RAW data item, or the correction contents for one RAW data item can be determined by a plurality of development clients 110 to obtain a plurality of differently developed JPEG data items.
- As a result, the form of RAW data development can be made still more complex and diverse; for example, the scope for creating works from digital images can be further widened.
- In the development system 900, the first and second development clients 110-1 and 110-2 having the same configuration as the development client 110 in FIG. 19, and a development server 920 having a configuration different from that of the development server 720 in FIG. 19, are arranged.
- the development server 920 has a user preference information management unit 934 instead of the user preference information management unit 734 in FIG.
- In response to an inquiry regarding a certain development client 110, the user preference information management unit 934 returns the user preference information of another development client 110.
- Specifically, the user preference information management unit 934 stores in advance a table (not shown) in which each RAW data transmission source is associated with the development client 110 whose user preference information is to be used. Whenever there is an inquiry from the digital photo development unit 727, the user preference information management unit 934 refers to this table and returns the corresponding user preference information.
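The profile-borrowing table of unit 934 can be sketched as follows; client identifiers and the profile shape are hypothetical.

```python
# Hypothetical table: RAW data sender -> development client whose user
# preference information (user profile) should be applied to its data.
PROFILE_SOURCE = {"client-1": "client-2"}

def profile_for(raw_sender, profiles):
    """Return the preference profile to apply for this sender, which may
    belong to another client; falls back to the sender's own profile."""
    owner = PROFILE_SOURCE.get(raw_sender, raw_sender)
    return profiles.get(owner)
```

With this fallback, clients without an entry in the table simply keep using their own profile, as in the fourth embodiment.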
- Here, the case will be described where the second development client 110-2 is registered, in the table stored by the user preference information management unit 934, as the development client 110 whose user preference information is to be used for the first development client 110-1.
- the step numbers in FIG. 24 correspond to the step numbers in FIG. 21 in the fourth embodiment.
- the first development client 110-1 and the development server 920 perform development according to the procedure shown in FIG. 21 (S1100 to S2000).
- the development server 920 that received the correction content generates JPEG data according to the received correction content for the correction target data.
- For the non-correction target data, on the other hand, the development server 920 selects, via the user preference information management unit 934, the user preference information generated and updated based on the correction parameters from the second development client 110-2, which is not the first development client 110-1, and generates JPEG data using the selected user preference information (S2100, S2120).
- Note that the user preference information generated and updated by the development server 920 may be made freely usable for any other development client 110. Such free use can be realized by using techniques well known to those skilled in the art, such as disclosure range information and access control information.
- In this way, a user can use the user preference information (user profile) generated via a development client 110 used by another user, so that the preferences of others can be reflected in the development results. This makes it possible, for example, for a plurality of users to create a collaborative digital image work without bothering the collaborators.
- In each of the embodiments described above, the RAW data acquisition source of the development client has been described as a memory card, but it is not limited to this. Various information recording media, such as an SD (secure digital) memory card, a CompactFlash (registered trademark) card, and a microdrive, may be used.
- Further, RAW data may be acquired from various electronic devices such as cameras. In this case, various known data reading and data communication technologies, such as wired communication using a USB (universal serial bus) cable and wireless communication using a wireless LAN (local area network), can be applied to acquire the RAW data from the electronic device.
- Further, the JPEG data generated by the development server may be appropriately transmitted from the development server or the development client to an external device such as a printer by wired communication or wireless communication.
- In each of the embodiments described above, a plurality of temporarily developed images are generated by applying correction parameters corresponding to different image qualities to one correction target data item, but the present invention is not limited to this. Only one temporarily developed image, corresponding to one type of correction parameter different from the basic parameters, may be generated, and the development client may select whether or not to use the corresponding correction parameter.
- Further, the correction-needed rule storage unit, the similarity determination rule storage unit, and the preference extraction rule storage unit may be arranged on a recording medium such as a hard disk provided outside the development server, or in a database on the network.
- In this way, a common rule can be used by a plurality of development servers, and more uniform development quality can be realized across them.
- rules can be updated and new camera models can be easily handled, enabling a more flexible development system.
- The development client is not limited to a television, and can be applied to various electronic devices such as personal computers, PDAs (personal digital assistants), and mobile phones.
- The development server, development client, development system, and development method according to the present invention are useful as a development server, development client, development system, and development method capable of more easily obtaining an image with a quality in line with the user's wishes and preferences when developing image data, such as RAW data, that is difficult for individuals to develop.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- General Engineering & Computer Science (AREA)
- Computing Systems (AREA)
- Computer Graphics (AREA)
- Image Processing (AREA)
- Television Signal Processing For Recording (AREA)
- Facsimiles In General (AREA)
- Editing Of Facsimile Originals (AREA)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008550186A JP4854748B2 (ja) | 2006-12-21 | 2007-12-20 | 現像サーバおよび現像方法 |
US12/377,266 US8238689B2 (en) | 2006-12-21 | 2007-12-20 | Development server, development client, development system, and development method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2006345001 | 2006-12-21 | ||
JP2006-345001 | 2006-12-21 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2008075745A1 true WO2008075745A1 (ja) | 2008-06-26 |
Family
ID=39536373
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2007/074571 WO2008075745A1 (ja) | 2006-12-21 | 2007-12-20 | 現像サーバ、現像クライアント、現像システム、および現像方法 |
Country Status (3)
Country | Link |
---|---|
US (1) | US8238689B2 (ja) |
JP (1) | JP4854748B2 (ja) |
WO (1) | WO2008075745A1 (ja) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102687146A (zh) * | 2009-10-27 | 2012-09-19 | 苹果公司 | 用于生成和标记照片集合中的事件的方法和系统 |
WO2013187127A1 (ja) * | 2012-06-13 | 2013-12-19 | 富士フイルム株式会社 | 画像処理システム、送信側装置および受信側装置 |
JP2015179909A (ja) * | 2014-03-18 | 2015-10-08 | キヤノン株式会社 | 撮像装置及びその制御方法 |
WO2015186605A1 (ja) * | 2014-06-04 | 2015-12-10 | 株式会社ソニー・コンピュータエンタテインメント | 画像処理装置、画像処理システム、撮像装置、および画像処理方法 |
JP2018133658A (ja) * | 2017-02-14 | 2018-08-23 | キヤノン株式会社 | 画像処理装置、制御方法およびプログラム |
US10063734B2 (en) | 2014-08-25 | 2018-08-28 | Canon Kabushiki Kaisha | Information processing apparatus that receives data from external apparatus via network, method of controlling the same, and storage medium |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4720387B2 (ja) * | 2005-09-07 | 2011-07-13 | ソニー株式会社 | 撮像装置、画像処理装置、および方法、並びにコンピュータ・プログラム |
JP2011049650A (ja) * | 2009-08-25 | 2011-03-10 | Canon Inc | 画像処理装置及び画像処理方法 |
KR101110725B1 (ko) * | 2010-06-16 | 2012-02-27 | 엠텍비젼 주식회사 | 썸네일 이미지를 이용한 후보 이미지 제시 방법 및 이를 수행하는 이미지 신호 처리 장치와 촬상 장치 |
US20120087596A1 (en) * | 2010-10-06 | 2012-04-12 | Kamat Pawankumar Jagannath | Methods and systems for pipelined image processing |
JP5665519B2 (ja) * | 2010-12-15 | 2015-02-04 | キヤノン株式会社 | コンテンツ処理装置、コンテンツ処理装置の制御方法及びプログラム |
US9135952B2 (en) * | 2010-12-17 | 2015-09-15 | Adobe Systems Incorporated | Systems and methods for semi-automatic audio problem detection and correction |
US10074165B2 (en) * | 2014-09-10 | 2018-09-11 | Morpho, Inc. | Image composition device, image composition method, and recording medium |
CN107295325A (zh) * | 2017-08-09 | 2017-10-24 | 京东方科技集团股份有限公司 | 一种用于显示设备的色温调整方法及装置、显示设备 |
KR102499399B1 (ko) * | 2018-03-20 | 2023-02-14 | 삼성전자주식회사 | Isp 업데이트를 알리는 전자 장치 및 그 동작 방법 |
KR20200094525A (ko) * | 2019-01-30 | 2020-08-07 | 삼성전자주식회사 | 서로 연관된 복수의 데이터를 포함하는 하나의 파일을 처리하는 전자 장치 |
WO2020196672A1 (ja) * | 2019-03-28 | 2020-10-01 | 富士フイルム株式会社 | 画像処理装置、画像処理方法、及び画像処理プログラム |
JP2022002376A (ja) * | 2020-06-22 | 2022-01-06 | キヤノン株式会社 | 画像処理装置、画像処理方法およびプログラム |
US11516402B1 (en) * | 2020-12-11 | 2022-11-29 | Lux Optics Incorporated | Realtime image analysis and feedback |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004258955A (ja) * | 2003-02-26 | 2004-09-16 | Nikon Corp | デジタル画像処理の受注方法、およびデジタル画像処理注文用プログラム |
JP2005275454A (ja) * | 2004-03-22 | 2005-10-06 | Fuji Photo Film Co Ltd | 画像処理方法、画像処理システム及び画像処理装置並びに画像処理プログラム |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7128270B2 (en) * | 1999-09-17 | 2006-10-31 | Silverbrook Research Pty Ltd | Scanning device for coded data |
US6398428B1 (en) * | 2000-05-15 | 2002-06-04 | Eastman Kodak Company | Apparatus and method for thermal film development and scanning |
US6816625B2 (en) * | 2000-08-16 | 2004-11-09 | Lewis Jr Clarence A | Distortion free image capture system and method |
JP2003234916A (ja) | 2002-02-08 | 2003-08-22 | Seiko Epson Corp | 画像処理装置、画像処理方法、印刷装置、画像処理プログラムおよび画像処理プログラムを記録した媒体 |
JP2004165797A (ja) | 2002-11-11 | 2004-06-10 | Canon Inc | デジタル写真現像システム、デジタル写真現像サーバ、現像方法、及びプログラム |
US7551205B2 (en) | 2004-03-22 | 2009-06-23 | Fujifilm Corporation | Image processing method, image processing system, image processing apparatus and image processing program |
US7335026B2 (en) * | 2004-10-12 | 2008-02-26 | Telerobotics Corp. | Video surveillance system and method |
JP4822690B2 (ja) | 2004-11-02 | 2011-11-24 | キヤノン株式会社 | 画像処理方法及びその装置と、プリントサービスシステム |
2007
- 2007-12-20 US US12/377,266 patent/US8238689B2/en active Active
- 2007-12-20 JP JP2008550186A patent/JP4854748B2/ja active Active
- 2007-12-20 WO PCT/JP2007/074571 patent/WO2008075745A1/ja active Application Filing
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004258955A (ja) * | 2003-02-26 | 2004-09-16 | Nikon Corp | デジタル画像処理の受注方法、およびデジタル画像処理注文用プログラム |
JP2005275454A (ja) * | 2004-03-22 | 2005-10-06 | Fuji Photo Film Co Ltd | 画像処理方法、画像処理システム及び画像処理装置並びに画像処理プログラム |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102687146A (zh) * | 2009-10-27 | 2012-09-19 | Apple Inc. | Method and system for generating and labeling events in a photo collection
CN102687146B (zh) * | 2009-10-27 | 2016-05-04 | Apple Inc. | Method and system for generating and labeling events in a photo collection
WO2013187127A1 (ja) * | 2012-06-13 | 2013-12-19 | Fujifilm Corporation | Image processing system, transmitting-side device, and receiving-side device
JP5680799B2 (ja) * | 2012-06-13 | 2015-03-04 | Fujifilm Corporation | Image processing system, transmitting-side device, and receiving-side device
US9307211B2 (en) | 2012-06-13 | 2016-04-05 | Fujifilm Corporation | Image processing system, transmitting-side device and receiving-side device |
JP2015179909A (ja) * | 2014-03-18 | 2015-10-08 | Canon Inc | Imaging apparatus and control method therefor
WO2015186605A1 (ja) * | 2014-06-04 | 2015-12-10 | Sony Computer Entertainment Inc. | Image processing device, image processing system, imaging device, and image processing method
JP2015231103A (ja) * | 2014-06-04 | 2015-12-21 | Sony Computer Entertainment Inc. | Image processing device, image processing system, imaging device, and image processing method
US10055815B2 (en) | 2014-06-04 | 2018-08-21 | Sony Interactive Entertainment Inc. | Image processing apparatus, image processing system, imaging apparatus and image processing method |
US10063734B2 (en) | 2014-08-25 | 2018-08-28 | Canon Kabushiki Kaisha | Information processing apparatus that receives data from external apparatus via network, method of controlling the same, and storage medium |
JP2018133658A (ja) * | 2017-02-14 | 2018-08-23 | Canon Inc | Image processing apparatus, control method, and program
Also Published As
Publication number | Publication date |
---|---|
US20100195929A1 (en) | 2010-08-05 |
JP4854748B2 (ja) | 2012-01-18 |
US8238689B2 (en) | 2012-08-07 |
JPWO2008075745A1 (ja) | 2010-04-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4854748B2 (ja) | Development server and development method | |
JP3991196B2 (ja) | Image processing system and image processing server | |
CN101103635B (zh) | White balance correction in digital camera images | |
US20040095478A1 (en) | Image-capturing apparatus, image-processing apparatus, image-recording apparatus, image-processing method, program of the same and recording medium of the program | |
US6011547A (en) | Method and apparatus for reproducing image from data obtained by digital camera and digital camera used therefor | |
US8279481B2 (en) | Update control of image processing control data | |
US20040169873A1 (en) | Automatic determination of custom parameters based on scanned image data | |
US7312824B2 (en) | Image-capturing apparatus, image processing apparatus and image recording apparatus | |
JPWO2005079056A1 (ja) | Image processing device, photographing device, image processing system, image processing method, and program | |
KR20120118383A (ko) | Image correction apparatus, image processing apparatus using the same, and methods thereof | |
JP5407600B2 (ja) | Image processing device, image processing method, and electronic camera | |
KR102146855B1 (ko) | Photographing apparatus and method for sharing photographing settings, and sharing system | |
US20040041926A1 (en) | Image-capturing apparatus, imager processing apparatus and image recording apparatus | |
JP2003060980A (ja) | Image processing system | |
JP4150490B2 (ja) | Image processing system, image processing method, and recording medium | |
US20050206747A1 (en) | Digital camera and template data structure | |
US7609425B2 (en) | Image data processing apparatus, method, storage medium and program | |
JPH1155688A (ja) | Image color conversion method and apparatus | |
JP2003052002A (ja) | Output image adjustment for image files | |
JP4623024B2 (ja) | Electronic camera | |
US20040057617A1 (en) | Image data supply method, recording apparatus and program | |
JP2004328534A (ja) | Image forming method, image processing apparatus, and image recording apparatus | |
JP4292873B2 (ja) | Image processing method, image processing apparatus, and image recording apparatus | |
JP2003134457A (ja) | Electronic camera | |
JP2000261825A (ja) | Image processing method and apparatus, and recording medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 07850996; Country of ref document: EP; Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase | Ref document number: 2008550186; Country of ref document: JP |
WWE | Wipo information: entry into national phase | Ref document number: 12377266; Country of ref document: US |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 07850996; Country of ref document: EP; Kind code of ref document: A1 |