WO2010047336A1 - Image capturing system and image capturing method - Google Patents
Image capturing system and image capturing method
- Publication number
- WO2010047336A1 (PCT application PCT/JP2009/068076)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- shooting
- image
- unit
- image data
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00127—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
- H04N1/00281—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a telecommunication apparatus, e.g. a switched network of teleprinters for the distribution of text-based information, a selective call terminal
- H04N1/00307—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a telecommunication apparatus, e.g. a switched network of teleprinters for the distribution of text-based information, a selective call terminal with a mobile telephone apparatus
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00127—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
- H04N1/00132—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture in a digital photofinishing system, i.e. a system where digital photographic images undergo typical photofinishing processing, e.g. printing ordering
- H04N1/00185—Image output
- H04N1/00198—Creation of a soft photo presentation, e.g. digital slide-show
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/0035—User-machine interface; Control console
- H04N1/00405—Output means
- H04N1/00408—Display of information to the user, e.g. menus
- H04N1/0044—Display of information to the user, e.g. menus for image preview or review, e.g. to help the user position a sheet
- H04N1/00442—Simultaneous viewing of a plurality of images, e.g. using a mosaic display arrangement of thumbnails
- H04N1/00453—Simultaneous viewing of a plurality of images, e.g. using a mosaic display arrangement of thumbnails arranged in a two dimensional array
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/0035—User-machine interface; Control console
- H04N1/00405—Output means
- H04N1/00408—Display of information to the user, e.g. menus
- H04N1/0044—Display of information to the user, e.g. menus for image preview or review, e.g. to help the user position a sheet
- H04N1/00458—Sequential viewing of a plurality of images, e.g. browsing or scrolling
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/21—Intermediate information storage
- H04N1/2104—Intermediate information storage for one or a few pictures
- H04N1/2112—Intermediate information storage for one or a few pictures using still video cameras
- H04N1/2137—Intermediate information storage for one or a few pictures using still video cameras with temporary storage before final recording, e.g. in a frame buffer
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/21—Intermediate information storage
- H04N1/2166—Intermediate information storage for mass storage, e.g. in document filing systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/21—Intermediate information storage
- H04N1/2166—Intermediate information storage for mass storage, e.g. in document filing systems
- H04N1/2179—Interfaces allowing access to a plurality of users, e.g. connection to electronic image libraries
- H04N1/2183—Interfaces allowing access to a plurality of users, e.g. connection to electronic image libraries the stored images being distributed among a plurality of different locations, e.g. among a plurality of users
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/2621—Cameras specially adapted for the electronic generation of special effects during image pickup, e.g. digital cameras, camcorders, video cameras having integrated special effects capability
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2101/00—Still video cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/32—Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
- H04N2201/3201—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
- H04N2201/3225—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to an image, a page or a document
- H04N2201/3247—Data linking a set of images to one another, e.g. sequence, burst or continuous capture mode
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/32—Circuits or arrangements for control or supervision between transmitter and receiver or between image input and image output device, e.g. between a still-image camera and its memory or between a still-image camera and a printer device
- H04N2201/3201—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title
- H04N2201/3225—Display, printing, storage or transmission of additional information, e.g. ID code, date and time or title of data relating to an image, a page or a document
- H04N2201/3253—Position information, e.g. geographical position at time of capture, GPS data
Definitions
- the present invention relates to an image capturing system and an image capturing method in an information processing terminal having an image capturing function, such as a mobile phone terminal or a PDA (Personal Digital Assistant).
- such a terminal has relatively few operation buttons and a small display, like a digital camera or a mobile phone.
- the operations required for such management become more complex, further increasing the burden on the user.
- the present invention aims to automatically classify image files in an information processing terminal equipped with an image capturing function, such as a mobile phone terminal or a PDA, taking into account position information and time information as well as the content of shooting, and to provide an image capturing system and an image capturing method that can reduce the user's burden of managing image files.
- the present invention provides: an imaging unit that captures an image; a shooting mode setting unit that sets a shooting mode for shooting by the imaging unit; a shooting time information acquisition unit that acquires, as shooting time information, position information indicating the position of the device at the time of shooting and the current time; an additional information processing unit that adds the shooting mode setting and the shooting time information at the time of shooting to the image data captured by the imaging unit, as attribute information about the contents of each image; an image data storage unit that stores the image data together with the attribute information added by the additional information processing unit; and a display control unit that searches and displays the image data stored in the image data storage unit according to the attribute information.
- Another aspect of the invention is an image capturing method using an imaging unit that captures an image, comprising the following steps:
- (1) a shooting mode setting step of setting a shooting mode for shooting by the imaging unit; (2) a shooting time information acquisition step of acquiring, during shooting, position information indicating the position of the own device and the current time as shooting time information; (3) an image capturing step of capturing an image; (4) an additional information processing step of adding the shooting mode set at the time of capture and the shooting time information to the image data captured in the image capturing step, as attribute information regarding the contents of each image data.
- according to these aspects, the shooting mode information, the position information of the own device, and the shooting time information at the time of shooting can be added to the image data, and the image data can be saved so that it is searchable according to this attribute information.
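As a minimal sketch of this idea (the record fields and function names are illustrative, not from the patent), the attribute information added at capture time can be modeled as a small metadata record attached to each image, which later supports attribute-based search:

```python
from dataclasses import dataclass

@dataclass
class CapturedImage:
    """Image data with the attribute (tag) information added at capture time."""
    filename: str
    shooting_mode: str   # e.g. "portrait", "landscape"
    position: tuple      # (latitude, longitude) of the device at capture
    shot_at: str         # capture time, ISO-8601

def search_images(images, **criteria):
    """Return images whose attribute information matches every criterion."""
    return [img for img in images
            if all(getattr(img, key) == value for key, value in criteria.items())]

# Two images tagged automatically at capture time.
album = [
    CapturedImage("img001.jpg", "portrait", (35.68, 139.77), "2009-10-01T10:00"),
    CapturedImage("img002.jpg", "landscape", (43.06, 141.35), "2009-10-02T14:30"),
]
portraits = search_images(album, shooting_mode="portrait")
```

Because every attribute is searchable the same way, the same records can later back album views or keyword lookups without the user filing anything by hand.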
- the subject's attributes can be estimated to some extent from the optical shooting mode selected for a given location and time of day, such as portrait mode, landscape mode, or still-image close-up mode at a specific place or time zone, and the captured image data can be classified according to this estimation. As a result, the user's burden of classifying image data can be reduced.
- the system preferably further includes a shooting setting storage unit that associates position information and time information with shooting modes and stores them as shooting mode settings; the shooting mode setting unit then refers to the mode settings according to the shooting time information acquired by the shooting time information acquisition unit and applies the referenced shooting mode to the imaging unit.
- since the shooting setting storage unit stores shooting mode information in advance in association with position information and time information, the shooting mode can be set automatically when position information and time information are acquired during shooting, further reducing the user's burden of selecting a shooting mode.
- the shooting mode stored in the shooting setting storage unit may also be added or changed in association with position information and time information when the user manually changes the shooting settings.
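A hedged sketch of this lookup-and-learn behavior (class and method names, and the coarse time bands, are assumptions for illustration): the storage unit maps a (place, time band) key to a mode, and a manual change by the user updates that mapping so the same setting is applied automatically next time.

```python
class ShootingSettingStorage:
    """Mode setting table: associates (place, time band) with a shooting mode."""

    def __init__(self):
        self._table = {}

    def lookup(self, place, hour):
        """Return the stored shooting mode for this place and time band, if any."""
        return self._table.get((place, self._band(hour)))

    def learn(self, place, hour, mode):
        """Record a manual change so the same setting is applied automatically later."""
        self._table[(place, self._band(hour))] = mode

    @staticmethod
    def _band(hour):
        # Coarse illustrative time bands: morning / daytime / evening / night.
        if 5 <= hour < 11:
            return "morning"
        if 11 <= hour < 17:
            return "daytime"
        if 17 <= hour < 22:
            return "evening"
        return "night"

storage = ShootingSettingStorage()
storage.learn("park", 10, "portrait")   # the user changed the mode manually once
auto_mode = storage.lookup("park", 9)   # same place, same time band: mode is reused
```

A lookup for an unknown place or time band simply returns nothing, in which case the terminal would fall back to its default mode selection.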
- the system preferably further includes a schedule storage unit that stores position information, time information, and schedule content related to the user's schedule as schedule information; when referring to the mode settings, the shooting mode setting unit sets the shooting mode for the imaging unit based on the position information, time information, and type of schedule content included in the schedule information.
- since the shooting mode is set with priority given to the schedule information registered by the user, the user's planned activities can be reflected and the attributes of the image data can be estimated more accurately.
- the schedule information includes the details of the schedule, such as “birthday” or “overseas travel”, which makes it possible to sort image data accurately.
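The priority rule above can be sketched as follows (a simplified stand-in, not the patent's data format; the tuple layout and the returned pair are assumptions): a schedule entry matching the current time and place overrides whatever mode the setting table would otherwise supply, and its content label can also be carried into the image's tag information.

```python
def choose_mode(schedule, table_mode, now, place):
    """Pick the shooting mode, giving priority to matching schedule entries.

    `schedule` is a list of (start, end, place, content, mode) tuples with
    ISO-8601 times, so plain string comparison orders them correctly.
    Returns (mode, schedule_content or None).
    """
    for start, end, sched_place, content, mode in schedule:
        if start <= now <= end and sched_place == place:
            return mode, content
    return table_mode, None

schedule = [
    ("2009-10-01T09:00", "2009-10-01T18:00", "Kyoto", "overseas travel", "landscape"),
]
# During the scheduled trip, the schedule's mode wins over the table's default.
mode, content = choose_mode(schedule, "auto", "2009-10-01T12:00", "Kyoto")
```

Outside any scheduled period the function simply falls back to the table-derived mode, which matches the layered behavior described in the text.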
- the display control unit preferably has a calendar display function that includes the current date; taking the current date as a boundary, this function displays a list of the image data stored in the image data storage unit for dates before the current date, and displays the schedule information stored in the schedule storage unit for dates after the current date.
- the system preferably further includes: an operation history recording unit that records a history of the shooting modes set by the shooting mode setting unit; a tutorial storage unit that stores usage examples of shooting modes as tutorial information in association with position information and time information; an operation history analysis unit that analyzes the shooting mode history and extracts unused shooting modes; a tutorial acquisition unit that refers to the tutorial information and acquires usage examples for the unused shooting modes extracted by the operation history analysis unit; and a tutorial setting unit that adds the position information, time information, and type of schedule content of the usage examples included in the acquired tutorial information to the schedule information as user schedules.
- the system preferably further includes a face detection unit that detects a person's face portion and calculates the coordinate position of the detected face portion within the base image, and a face seal generation unit that converts the face portion detected by the face detection unit into image data of a specific shape; the shooting mode setting unit can select a shooting mode that includes the face detection process by the face detection unit and the image data conversion process by the face seal generation unit, and the operation history recording unit records in the operation history the shooting modes, including the detection and conversion processes, selected by the shooting mode setting unit.
- the face of a person in an image taken by the user can thus be detected and saved as a face sticker usable for icons and other purposes, so each user can reuse the person's face in various ways. Furthermore, since the shooting mode including the detection and conversion processes is also stored in the shooting setting storage unit, it can be set automatically during shooting and suggested through the tutorial when unused.
- as described above, according to the present invention, an information processing terminal equipped with an image capturing function, such as a mobile phone terminal or a PDA, can automatically classify image files in consideration of position information and time information as well as the content of shooting, reducing the user's burden of managing image files.
- FIG. 1 is a conceptual diagram showing the overall configuration of an image capturing system according to an embodiment. FIG. 2 is a front view showing the appearance of the mobile terminal according to the embodiment. FIG. 3 is a block diagram showing the internal configuration of the image capturing system of the mobile terminal according to the embodiment. FIG. 4 is an explanatory diagram showing the display screen of the calendar function according to the embodiment. FIG. 5 is an explanatory diagram showing the display screen of the folders that store image data according to the embodiment. FIG. 6 is an explanatory diagram showing screen transitions of the schedule function according to the embodiment. FIG. 7 is an explanatory diagram showing other screen transitions of the schedule function according to the embodiment. FIG. 8 is an explanatory diagram of the face seal function according to the embodiment.
- FIG. 1 is a conceptual diagram illustrating the overall configuration of the image capturing system according to the present embodiment, and FIG. 2 is a front view illustrating the appearance of the mobile terminal 1.
- the image capturing system according to the present embodiment is roughly configured by a camera-equipped mobile terminal 1 used by a user and a content server 3 installed on the Internet 2.
- the content server 3 is a server that distributes additional content and tutorial information, and includes, for example, a Web server.
- this content server is a server computer (or software having the equivalent function) that transmits information such as HTML (HyperText Markup Language) files, image files, and music files within a document system such as the WWW (World Wide Web); it stores information such as documents and images and transmits it through the Internet 2 in response to requests from the portable terminal 1.
- the content data distributed by the content server 3 may be preinstalled and stored in the mobile terminal 1 in advance; the mobile terminal 1 searches for content data in its own device, and if the content data is not stored there, it may request distribution from the content server 3 and download it.
- the content server 3 includes a regional image storage unit 31 that stores content data, such as regional image data, in association with location information; it serves as a regional image distribution server that distributes content including regional image data unique to each region, such as background and wallpaper data of local attractions and specialties, and frames and templates such as face-in-hole signboards depicting local characters.
- the content unique to each region also includes “recommended information” ranking topics related to the region (sightseeing spots, local specialties, store information, and other news), and map data.
- the camera-equipped mobile terminal 1 is a portable telephone that uses wireless communication; it communicates wirelessly with a relay point such as the base station 22 and can receive communication services such as calls and data communication while moving.
- Examples of the communication system of the cellular phone include an FDMA system, a TDMA system, a CDMA system, a W-CDMA system, a PHS (Personal Handyphone System) system, and the like.
- this mobile phone has functions such as an application software execution function and a GPS (Global Positioning System) function, and also functions as a personal digital assistant (PDA).
- the camera function of the portable terminal 1 is an imaging function for optically taking a digital image.
- the position information acquisition function is a function for acquiring and recording position information indicating the position of the own device at the time of shooting.
- to acquire position information, the position of the own device may be detected based on signals from the satellite 21, as with GPS, or based on the radio wave intensity from the radio base station 22 of the mobile phone network.
- the portable terminal 1 includes an operation device 116, such as operation buttons, a jog dial, and a touch panel, for the user to perform input operations, and an image display unit 113.
- the image display unit 113 displays a main screen 401 and a calendar UI 402 as GUIs.
- FIG. 3 is a block diagram showing an internal configuration according to the image photographing system of the mobile terminal 1.
- the term “module” used in the description refers to a functional unit that achieves a predetermined operation, configured by hardware such as an apparatus or device, by software having that function, or by a combination of the two.
- the mobile terminal 1 includes, as image capturing modules, an image capturing unit 106, a shutter control unit 107, a face detection unit 109, a shooting mode setting unit 104, an additional information processing unit 105, a shooting time information acquisition unit 102, a shooting setting storage unit 103, and a data storage unit 122.
- the imaging unit 106 is a module that optically captures a digital image, executes an imaging process in accordance with control from the shutter control unit 107, and stores the captured image in the data storage unit 122 as image data D1.
- the shutter control unit 107 receives an operation signal based on a user operation from the operation device through the operation signal acquisition unit 108 and executes a photographing process.
- an automatic shooting mode may also be added, in which the shooting process is performed automatically at the moment the facial expression recognition unit 109a in the face detection unit 109 recognizes a specific facial expression such as a smile.
- the additional information processing unit 105 is a module that adds attribute information as tag information to the image data D1 captured by the image capturing unit 106.
- specifically, the additional information processing unit 105 adds the shooting mode acquired from the shooting mode setting unit 104 at the time of capture, and the shooting time information (position information, time, etc.) acquired from the shooting time information acquisition unit 102, to the image data D1 as attribute information regarding the contents of each image.
- as methods for adding the tag information, the tag information may be stored directly in the image data D1, or separate file data may be used as management data (metadata) associated with the image data D1.
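One way to realize the second option, a sidecar metadata file associated with each image, can be sketched as follows (the `.meta.json` naming and JSON format are assumptions for illustration, not the patent's format):

```python
import json
import os
import tempfile

def write_sidecar(image_path, tags):
    """Store tag information as a separate metadata file next to the image."""
    meta_path = image_path + ".meta.json"
    with open(meta_path, "w") as f:
        json.dump(tags, f)
    return meta_path

def read_sidecar(image_path):
    """Read back the tag information associated with the image."""
    with open(image_path + ".meta.json") as f:
        return json.load(f)

# Demonstration with a stand-in image file in a temporary directory.
tmp = tempfile.mkdtemp()
img = os.path.join(tmp, "img001.jpg")
open(img, "wb").close()  # stand-in for captured image data
write_sidecar(img, {"mode": "portrait", "position": [35.68, 139.77]})
tags = read_sidecar(img)
```

Embedding the tags directly in the image file (the first option) would instead typically use in-file metadata fields such as Exif, at the cost of rewriting the image file on each tag edit.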
- the contents of the schedule (location, participants, event contents) corresponding to the photographing time can also be recorded in association with the attribute information of the photographed image data.
- as part of the tag information, the face of a person shown in the image may be recognized and the facial features may be included.
- the facial features may be recorded, for example, by identifying a person with reference to a face sticker (face photo information) set in the address book or the like and associating it with the address book or the like.
- facial expressions such as a smile may be recognized, and the facial expressions may be included in the tag information.
- images taken with smile recognition can be classified by tag information “smile”, and an album in which smile images are collected can be automatically generated.
- this tag information can be displayed in a list for each image, and can be edited, changed, deleted, added, etc. by user operation.
- the attribute information included in the tag information can be used for album classification of the image, automatic mail generation when attached to an electronic mail, and the like.
- the shooting mode setting unit 104 is a module for setting a shooting mode for shooting by the imaging unit 106.
- this shooting mode includes settings such as shutter speed, exposure (aperture), focal length, presence or absence of flash, and filters. Examples include: portrait mode (the background is blurred to emphasize the person), landscape mode (the focus is adjusted to be even from short to long distance, including commemorative photos with multiple subjects), close-up mode (shooting close to the subject), sport mode (shooting at a high shutter speed, including continuous shooting), night portrait mode (the aperture is adjusted for shooting a person against a night view), flash prohibition mode (shooting outdoors), and the like.
- basic settings included in this shooting mode include settings for image size, image quality (resolution), and storage destination (main body or recording medium, etc.).
- a selection (combination) pattern of suitable shooting settings and functions may also be displayed and selected as a camera type. For example: a mode in which the user sets all shooting functions manually may be called “pro camera”; a mode for shooting at night or indoors with a high sensitivity setting, “high-sensitivity camera”; a mode that allows frames, letters, and pictures to be added after shooting, “purikura camera”; a mode that transforms captured images and composites other images, “party camera”; a mode that applies effects such as sepia tone, oil painting touch, pseudo fisheye lens, and filter processing, “art camera”; a mode that performs image analysis such as 2D barcode (QR code) analysis, character recognition (OCR), and face recognition, “search camera”; a mode that shoots movies or images, adds comments, and uploads them to a website on the Internet, “blog camera”; a mode that outputs animal sounds, such as dog or cat calls, to draw a pet's attention to the camera, “pet camera”; and a mode suitable for close-up shots of still subjects such as food, “cooking camera”.
- the shooting mode setting unit 104 can also set or correct the shooting mode for the imaging unit 106 based on the current time and on the position information, time information, and type of schedule content included in the schedule information when referring to the mode settings, and can display a message recommending a more suitable shooting mode setting. For example, if the current time falls within a trip listed in the schedule, portrait mode is set as the default; for a party, a message recommending “party camera” or “purikura camera” is output; and if the current time is at night, “high-sensitivity camera” may be recommended. In conjunction with the shooting mode setting, messages recommending other functions, such as the use of a regional frame or the GPS navigation function, may also be output.
- the shooting setting storage unit 103 is a storage device, such as a non-volatile memory, that stores and holds table data used by the shooting mode setting unit 104 to select a shooting mode according to place and time. Specifically, table data associating position information and time information with shooting modes is stored as a mode setting table T1; upon receiving position information and time information in response to a request from the shooting mode setting unit 104, the mode setting table T1 is referenced and the shooting mode to be used is read out and sent to the shooting mode setting unit 104.
- the shooting mode setting unit 104 refers to the mode setting according to the shooting time information acquired by the shooting time information acquisition unit 102 and sets the referenced shooting mode for the imaging unit 106.
- the shooting time information acquisition unit 102 includes a position information acquisition unit 102a that acquires position information indicating the position of the own device at the time of shooting, and a clock unit 102b that acquires the current time as shooting time information; upon completion of shooting, in response to a request from the additional information processing unit 105, the data acquired from the position information acquisition unit 102a and the clock unit 102b are input to the additional information processing unit 105 as shooting time information.
- the position information acquisition unit 102a is a module that acquires and records position information indicating the position of the own device at the time of shooting.
- the position information acquisition unit 102a detects the position of the own device from satellite 21 signals, as with GPS, or from the radio wave intensity from the radio base station 22 of the mobile phone network.
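A hedged sketch of this two-source positioning (the function signature and the "strongest base station" heuristic are simplifying assumptions; real radio-strength positioning interpolates between several stations): prefer a GPS fix, and fall back to the known location of the strongest base station when no fix is available.

```python
def acquire_position(gps_fix, cell_readings):
    """Return a position, preferring GPS and falling back to cell radio strength.

    gps_fix: (lat, lon) or None when no satellite fix is available.
    cell_readings: list of (signal_strength_dbm, (lat, lon)) per base station.
    """
    if gps_fix is not None:
        return gps_fix
    if not cell_readings:
        return None
    # Fallback: use the location of the strongest (least negative dBm) station.
    strongest = max(cell_readings, key=lambda reading: reading[0])
    return strongest[1]

# Indoors, without a GPS fix, the strongest base station's location is used.
pos = acquire_position(None, [(-95, (35.0, 139.0)), (-70, (35.7, 139.8))])
```

Either source yields a (latitude, longitude) pair that the additional information processing unit can attach to the image data as-is.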
- the clock unit 102b is a module that measures the current time; it may manage time zones and display a time that accounts for time differences according to the position information acquired by the position information acquisition unit 102a.
- the data storage unit 122 is a storage device that stores various types of data.
- it accumulates the captured image data D1, processing data acquired from the content server 3, combined image data obtained by compositing them, and edited data produced by processing the image data D1 (for example, face seals).
- the data storage unit 122 distributes and stores the image data D1 in a plurality of folders 122a to 122c according to the attribute information (tag information) added by the additional information processing unit 105.
- in the present embodiment, the images are described as being stored in folders according to the tag information, but the invention is not limited to this; it is preferable to use a standard folder specified in the basic settings (for example, “My Picture”) as the actual storage location, and to perform pseudo classification based on the tag information added to each image when displaying a list or album.
- the folders 122a to 122c are displayed as icons so that the image data can be stored according to themes, as shown in FIG. 5 (b).
- the image data stored in the folder can be browsed.
- when browsing, related image data can also be detected and displayed by using the above-described attribute information (tag information) as keywords and searching for identical or similar attribute information.
- as a display method for browsing image data, low-resolution thumbnail image data with a small data size may be displayed.
- this photo album is an album in which images are automatically classified according to the tag information, each image being classified by the attributes included in its tag information. Note that the same image may appear to be stored in multiple folders (albums) because the same image data is classified into as many albums as it has tag information entries.
- The types of photo albums can be classified by the items described in the tag information: for example, albums can be grouped by the period of a schedule category (such as "travel"), by the persons appearing in the images and their attributes (for example, "family"), or by calendar date (a specific date such as a birthday).
- The face detection unit 109 is a module that extracts the characteristic geometric arrangement formed by the eyes, nose, mouth, and so on, detects a human face portion, and calculates the coordinate position of the detected face portion in the basic image. This face detection process runs from the stage preceding the shooting operation: as shown in FIG. 8(a), the detected face is highlighted as a marking 408a on the finder screen, and the process also works with an autofocus function that measures the distance to the subject and focuses accordingly.
- The face detection unit 109 has a facial expression recognition unit 109a that recognizes a predetermined facial expression, such as a smile, of the person being photographed; it recognizes the face portion and the facial expression from the monitor image at the time of shooting and inputs the recognition result to the shutter control unit 107.
- The shutter control unit 107 issues a control signal that causes the imaging unit 106 to perform a shutter operation according to the input recognition result.
- Information on the facial features and expressions recognized by the face detection unit 109 and the facial expression recognition unit 109a is described in the tag information, so that each image can be classified by the person shown in it or by the type of facial expression.
- Regarding facial features, a person may be identified, for example, by collating against the face seals registered in the address book.
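The expression-triggered shutter flow above can be sketched as follows; the detector and recognizer here are stand-ins for the face detection unit 109 and facial expression recognition unit 109a, and their call signatures are assumptions for illustration only:

```python
# Illustrative sketch of the smile-triggered shutter decision: the
# shutter control receives a recognition result per detected face and
# fires when the trigger expression is seen.

def should_release_shutter(frame, detect_faces, recognize_expression,
                           trigger="smile"):
    """Return True when any detected face shows the trigger expression."""
    for face_box in detect_faces(frame):
        if recognize_expression(frame, face_box) == trigger:
            return True
    return False

# Stub detectors for demonstration only (real ones run on camera frames).
faces = lambda frame: [(10, 10, 50, 50)]   # one face box (x, y, w, h)
expr = lambda frame, box: "smile"
fired = should_release_shutter("frame-bytes", faces, expr)
```

In the embodiment, the monitoring loop would run on the finder image and hand the decision to the shutter control unit 107.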
- the mobile terminal 1 includes an editing processing unit 110 and an editing data generation unit 111 as editing processing modules.
- The editing processing unit 110 is a module that executes various editing processes in accordance with the operator's actions. In conjunction with the GUI, it performs photo-retouching operations such as drawing on an image with a touch pen or writing characters, and it can also perform image composition that combines a frame image, background image, or the like with a captured basic image.
- the editing processing unit 110 also has a function of displaying a list of tag information associated with each image and editing such information as addition, deletion, and change.
- the editing processing unit 110 also includes an e-mail generation unit 110a that generates an e-mail and transmits it via the communication I / F 101.
- The e-mail generation unit 110a has a function of transmitting images stored in the data storage unit 122 and the captured image data D1 as e-mail attachments, and it can automatically generate the necessary items based on the tag information of the image data. For example, the person named in the tag information is read out, that person is searched for in the address book, and the e-mail address or name found there is automatically entered as the destination address. In addition, the location, time, and event content at the time the image was taken may be extracted from the tag information or the associated schedule information (history information) to compose a sentence, and that sentence may be quoted in the body and title of the e-mail.
- The e-mail generation unit 110a also has a function of inserting a face seal as a pictograph while the mail body is being edited.
- A person's name may be associated with the face seal insertion process through the dictionary conversion function, so that when the person's name is typed in a sentence it is automatically converted into, or inserted as, the corresponding face seal.
- The e-mail generation unit 110a also has a function of automatically quoting the contents of a schedule entry when generating an e-mail. For example, the address book is searched using the name of a person mentioned in the schedule entry, that person's e-mail address is set as the destination, and the e-mail title and body are created from the event contents in the schedule.
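A minimal sketch of the auto-draft idea, assuming tag information is a flat dictionary and the address book maps names to addresses (the keys `person`, `place`, and `event` are hypothetical, not taken from the disclosure):

```python
def draft_mail_from_tags(tags, address_book):
    """Build a draft e-mail from an image's tag info: look the tagged
    person up in the address book for the destination, and compose the
    subject and body from the place and event fields."""
    to_addr = address_book.get(tags.get("person", ""), "")
    subject = f"{tags.get('event', 'photo')} at {tags.get('place', '?')}"
    body = (f"Photo taken at {tags.get('place', '?')} "
            f"during {tags.get('event', 'an event')}.")
    return {"to": to_addr, "subject": subject, "body": body}

book = {"Alice": "alice@example.com"}
mail = draft_mail_from_tags(
    {"person": "Alice", "place": "Kyoto", "event": "trip"}, book)
```

The real unit would additionally attach the image data D1 and fall back to manual entry when the person is not found in the address book.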
- the editing data generation unit 111 is a module that generates editing data as a result of the editing operation by the editing processing unit 110, and the generated data is stored in the data storage unit 122.
- the edit data generation unit 111 also stores the tag information related to the edited image data in the data storage unit 122 together with the edited image data.
- the edit data generation unit 111 includes an image composition unit 111a.
- The image composition unit 111a uses the image captured by the imaging unit 106 as a basic image D11 and composites other image data D12 onto the basic image D11.
- the composition processing by the image composition unit 111a can also be controlled by an editing operation on the editing processing unit 110.
- the image composition unit 111a has a function of a face seal generation unit that converts the face portion detected by the face detection unit 109 into face seals 501a to 501c that are image data of a specific shape.
- The face seals 501a to 501c are produced, for example, by superimposing costume image data on the basic image with reference to the coordinate position of the face portion detected by the face detection unit 109, and cutting out the resulting image.
- This face seal generation process is executed automatically by default when the face detection unit 109 detects a face portion. The user therefore does not need to generate face seals manually; a face seal is automatically generated and accumulated every time a face portion is photographed.
- The image composition unit 111a may also composite an additional image with the captured basic image, using the coordinate position of the face portion detected by the face detection unit 109 as a reference. This additional image may be installed and stored in the mobile terminal 1 in advance, or may be downloaded from the content server 3.
- The face seal generated by the image composition unit 111a can be stored in association with a person registered in the address book, and that person's face seal can then be used in operations and data related to that person.
- For example, a face seal can be displayed as an address book index, or displayed as an icon serving as a photo album index.
- A face seal can also be used as a pictograph representing a person when composing an e-mail: when the name of a person registered in the address book is typed in the e-mail, the corresponding face seal may be automatically detected and added to the sentence.
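As a sketch under simplifying assumptions, the cut-out step of face seal generation can be shown with the image modeled as a list of pixel rows and the detector's output as a box `(x, y, w, h)`; the `shape` field standing in for the "specific shape" of the seal is illustrative:

```python
# Sketch: a face seal is the region of the basic image cropped at the
# coordinates returned by the face detector, tagged with the cut-out
# shape. Real image data would use an image library; a nested list of
# pixel values keeps the example self-contained.

def make_face_seal(image, face_box, shape="circle"):
    x, y, w, h = face_box
    crop = [row[x:x + w] for row in image[y:y + h]]
    return {"pixels": crop, "shape": shape}

img = [[c + 10 * r for c in range(6)] for r in range(6)]  # toy 6x6 image
seal = make_face_seal(img, (1, 2, 3, 2))  # box: x=1, y=2, w=3, h=2
```

The embodiment additionally superimposes costume image data at the same coordinates before cutting out.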
- As user interface modules, the mobile terminal 1 includes an operation signal acquisition unit 108, a display control unit 112, an image display unit 113, a schedule storage unit 121, an operation history recording unit 123, a tutorial storage unit 124, and a tutorial setting unit 114.
- the operation signal acquisition unit 108 is a module that acquires an operation signal from an operation device based on a user operation and inputs an execution command corresponding to the operation signal to the shutter control unit 107 and the editing processing unit 110.
- The operation devices here include a touch panel, operation buttons, an acceleration sensor, and the like. The operation signal is transmitted to each module to be operated, and is also sent to the operation history recording unit 123, where it is accumulated as log data D3.
- The display control unit 112 is a module that generates the image data to be displayed on the image display unit 113, which serves as the GUI. At the time of shooting it displays the finder image captured by the imaging unit 106; at the time of editing it displays changes to images such as icons based on the user operation signals acquired by the operation signal acquisition unit 108.
- As one icon display method, as shown in FIG. 15, the display control unit 112 arranges the icons 601 in a spiral on the image display unit 113; when the user rotates the spiral clockwise or counterclockwise, the icons may move from the center outward while being enlarged, or from the outside toward the center while being reduced.
- The user's rotation operation may, for example, be performed through a physical wheel unit provided on the mobile phone body, or detected as a rotation of the pointing position on the touch panel.
- The display control unit 112 is also provided with a slide show function that sequentially reproduces a plurality of images stored in the data storage unit 122.
- During a slide show, the tag information used as the standard for classifying photo albums is collated, and information related to that tag information (shooting location, time, shooting mode, the schedule at that time, and local information about the location) can be displayed together.
- For example, if the photo album is related to travel, local information on the travel destination, such as a map or a movement trajectory based on the navigation history, may be displayed as an introduction to the slide show.
- The display order of the images in the slide show can be set to the order of shooting time, and when the shooting mode is a photo-booth camera mode, effects such as changing the slide show frame in the style of a photo-booth sticker book may be executed.
- The display control unit 112 also has a function of displaying the attribute information (tag information) of the data stored in the data storage unit 122 as face seals 501a to 501c: for a person named in the tag information, that person's face seal is displayed as an icon representing the person.
- The association between the tag information and the face seal can be based, for example, on the address book data: using a person registered in the address book as a reference, the captured image or face seal is recorded in association with that person.
- The display control unit 112 also has a function of searching for and reading out data stored in the data storage unit 122: a calendar UI (FIGS. 9(a) to 9(c)) and a mail browsing UI (FIGS. 8(c) and 8(d)) are provided, and the corresponding data can be read out by a selection operation on them.
- The schedule storage unit 121 is a storage device that stores and holds position information, time information, and schedule content types relating to the user's schedule as schedule information D2. This schedule information can be customized by each user and is displayed on the calendar UI 402 as shown in FIGS. 9(a) to 9(c). Specifically, the calendar UI 402 provided by the display control unit 112 is a calendar display function that includes the current date: with a grid 405 indicating the current date as the boundary, the month-and-day portion 404 before the current date is shown as a list display of the image data D1 stored in the data storage unit 122, and the schedule information stored in the schedule storage unit 121 is displayed in the month-and-day portion 406 after the current date.
- As an example of the schedule information D2, suppose that, as shown in FIG. 6(a), the user has made a "daily meal" plan for recording daily meals together with image data. As shown in FIG. 6(b), a folder 122c related to this plan can then be created, and the snapshot shooting mode is recorded in the mode setting table T1 in association with the time zones of the three daily meals and the position information of the home. When one of those times arrives, a pop-up message 407 as shown in FIG. 6(c) is displayed, the mode setting table T1 is automatically read out, and a shooting mode suitable for taking a snapshot of the meal is prepared.
- As another example of the schedule information D2, suppose that, as shown in FIG. 7(a), the user makes a travel plan and, as shown in FIG. 7(b), records a meeting at a predetermined place and time in the schedule. Then, as shown in FIG. 7(c), a pop-up message 407 is displayed when the meeting time and place recorded in the schedule are reached; as shown in FIG. 7(d), the mode setting table T1 is automatically read out, the camera is automatically activated, and a shooting mode suitable for taking a snapshot of a person is prepared.
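The lookup in the mode setting table T1 can be sketched as follows; the table layout (a place label plus a time window per entry) and the matching rule are assumptions for illustration, since the disclosure only states that position and time information are associated with a shooting mode:

```python
# Sketch of a T1-style lookup: each entry pairs a place and a time
# window with a shooting mode; the first matching entry wins, and a
# default mode is returned when nothing matches.
import datetime

T1 = [
    {"place": "home", "start": datetime.time(7, 0),
     "end": datetime.time(8, 0), "mode": "meal-snapshot"},
    {"place": "station", "start": datetime.time(10, 0),
     "end": datetime.time(11, 0), "mode": "portrait"},
]

def select_mode(place, now, table=T1, default="auto"):
    """Pick the shooting mode whose place and time window match."""
    for entry in table:
        if entry["place"] == place and entry["start"] <= now <= entry["end"]:
            return entry["mode"]
    return default

mode = select_mode("home", datetime.time(7, 30))
```

In the embodiment the `place` comparison would be a proximity test on GPS coordinates rather than a label match.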
- the tutorial storage unit 124 is a storage device that stores information related to usage examples of the shooting mode as tutorial information 124a in association with position information and time information.
- A tutorial stored in the tutorial storage unit 124 is teaching material that explains how to use the shooting functions and the like; it is content data including moving images, still images, text, and scripts (programs) that cause the function modules of the mobile terminal 1 to execute.
- The current position is measured using position-information acquisition means such as GPS, and area information near the current location, such as recommended spots, store information, gourmet information, and local specialties, is displayed and output.
- This local information may be distributed from the content server 3, and is preferably displayed ranked by importance (recommendation level). The ranking may take into account movie box-office performance, CD and Chaku-Uta download counts, TV audience ratings, and search counts.
- In the tutorial, character strings are displayed in pop-ups 407 as shown in FIGS. 10(a) to 10(c) and FIGS. 11(a) to 11(d).
- Guidance 601 using still images and a schedule notification pop-up screen 602c are displayed, and explanations of usage examples proceed interactively in response to the user's operations on character information and voice output, such as a balloon 602a spoken by a character 602b.
- The browser function is automatically activated to access the Internet, or local information data stored in advance in the main unit is accessed, so as to recommend and explain the recommended function and display related information (ranking information and the like).
- Tutorial information that prompts the user to use the camera function is then activated, and the camera shooting screen is launched while guiding the user's operation.
- Here, a case is taken as an example in which the tutorial information is distributed as a present from the content server 3; starting the tutorial creates a schedule following the theme "Let's play with the camera" (FIG. 11(b)).
- A folder 122 dedicated to this tutorial is also generated (FIG. 11(a)).
- When the scheduled time arrives, operation guidance by the character 602b starts, and an "exposure correction" operation lesson proceeds interactively in response to the user's operations (FIG. 11(d)).
- Such a tutorial analyzes the user's operation history and is executed for unused functions; its execution is configured by the tutorial setting unit 114.
- The contents registered in the schedule are periodically monitored, and when a scheduled time registered in the schedule arrives, the recommended function is explained, an alert for the scheduled time is issued, and related information found by recommendation search is displayed.
- Alternatively, the current time and current position may be acquired from the shooting-time information acquisition unit 102, and a function suited to the situation (time, place, and so on) may be explained along with related information found by recommendation search.
- The tutorial setting unit 114 is a module that adds the position information, time information, and type of schedule content corresponding to a usage example included in the tutorial information 124a acquired by the tutorial acquisition unit 114a to the schedule information as a schedule of the user; it includes a tutorial acquisition unit 114a and an operation history analysis unit 114b.
- the operation history analysis unit 114b is a module that analyzes the shooting mode history and extracts unused shooting modes.
- The tutorial acquisition unit 114a is a module that refers to the tutorial information 124a based on the unused shooting modes extracted by the operation history analysis unit 114b, the schedule registered in the schedule storage unit 121, and the situation-dependent functions acquired via the shooting-time information acquisition unit 102, and obtains tutorials on the unused shooting modes and other functions.
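The core of the operation history analysis can be shown with a short sketch; the flat list-of-mode-names log format is an assumption, since the disclosure only states that mode settings are accumulated as log data D3:

```python
# Sketch: shooting modes that never appear in the operation log are
# the "unused" modes for which a tutorial should be proposed.

def unused_modes(all_modes, log_entries):
    """Return the shooting modes never seen in the log, preserving
    the order of `all_modes`."""
    used = set(log_entries)
    return [m for m in all_modes if m not in used]

modes = ["portrait", "landscape", "macro", "night"]
log = ["portrait", "portrait", "landscape"]
never_used = unused_modes(modes, log)
```

The tutorial acquisition unit would then look each returned mode up in the tutorial information 124a and schedule a lesson for it.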
- The operation history recording unit 123 is a storage device that records the history of shooting modes set by the shooting mode setting unit as log data D3; the mode-setting history based on user operations is accumulated as this log data D3.
- FIG. 12 is a flowchart showing the operation of the image photographing system according to this embodiment.
- First, the position information and current time of the device are acquired periodically by loop processing (S101 and S102), the schedule information D2 is referred to, and it is determined whether the device is at a scheduled place and time, or whether the user has voluntarily started an operation at an unscheduled place or time (S103 and S104). As long as the user performs no operation and no scheduled time arrives, the system remains in a standby state ("N" in S103 and S104).
- The schedule in step S104 may be a schedule for starting the operation lesson tutorial described above. That is, in the tutorial information 124a, information on usage examples of the shooting modes is stored in association with position information and time information, and by adding the position information, time information, and type of schedule content corresponding to a usage example included in the tutorial information 124a to the schedule information as a schedule of the user, an operation lesson on an unused shooting mode is scheduled as the user's own schedule.
- When the user starts an operation voluntarily, shooting is performed in the shooting mode selected by the user (S105 and S107), and the shooting mode is added to the log data D3 (S108).
- On the other hand, table data associating position information and time information with shooting modes is stored in advance in the shooting-time setting storage unit 103 as the shooting mode setting T1. In the shooting mode setting step (S106), the mode setting T1 is referred to according to the shooting-time information acquired in the shooting-time information acquisition step (S101), and the referenced shooting mode is set for the imaging unit 106. The shooting mode is thus automatically selected to prompt the shooting operation (S106), and when the shooting operation is performed (S107), the shooting mode is added to the log data D3 (S108).
- Here, the schedule information includes position information, time information, and schedule content types relating to the user's schedule, and when the mode setting is referred to in the shooting mode setting step (S106), the shooting mode for the imaging unit is set based on the position information, time information, and schedule content type included in the schedule information D2. The schedule information D2 describes the content of the plan, such as "birthday" or "overseas travel"; by using this content type as a keyword, a more accurate shooting mode can be set.
- the image data D1 is sorted according to the attribute information including the shooting location / time and the shooting mode, and stored in the folders 122a to 122c of the data storage unit 122.
- the image data D1 stored in the folders 122a to 122c can be displayed as a list on the calendar UI by displaying icons according to the attribute information of the images.
- In the calendar UI, with the current date as the boundary, the month-and-day portion before the current date is set as a list display of the image data D1 stored in the data storage unit 122, and the month-and-day portion after the current date is set as a display of the schedule information D2 stored in the schedule storage unit 121.
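This past/future split of the calendar UI can be sketched directly; the dictionary shapes for the photo and schedule lookups are illustrative, and the treatment of the current date itself (shown here on the schedule side) is an assumption, since the disclosure only states that the current date is the boundary:

```python
# Sketch: a calendar cell shows stored photos for past dates and
# schedule entries for the current and future dates.
import datetime

def calendar_cell(date, today, photos_by_date, schedule_by_date):
    if date < today:
        return {"kind": "photos", "items": photos_by_date.get(date, [])}
    return {"kind": "schedule", "items": schedule_by_date.get(date, [])}

today = datetime.date(2009, 10, 20)
photos = {datetime.date(2009, 10, 18): ["IMG_001.jpg"]}
plans = {datetime.date(2009, 10, 25): ["trip to Kyoto"]}
past_cell = calendar_cell(datetime.date(2009, 10, 18), today, photos, plans)
future_cell = calendar_cell(datetime.date(2009, 10, 25), today, photos, plans)
```

Rendering both views on one calendar is what lets the single UI act as a diary for the past and a planner for the future.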
- The calendar and schedule may also be displayed in a spiral with the calendar dates as icons 601, the display being changed by moving the spiral forward and backward through the user's rotation operation.
- In this embodiment, a face seal in which a person's face portion is converted into image data of a specific shape can be generated, and the face seal can be freely pasted into the calendar UI as an icon image.
- The face seal can be associated with image tag information, the address book, schedule information (location, participants, event content), and the like.
- The tag information can also include facial features, obtained by recognizing the face of a person shown in the image.
- Facial features can be recorded, for example, by identifying a person with reference to a face seal set in the address book and associating the image with that address book entry.
- An expression such as a smile may also be recognized and included in the tag information; images taken with smile recognition can then be classified under the tag "smile", and an album collecting smile images can be automatically generated.
- The image composition unit 111a described above has a function of downloading additional data specific to a particular region through data exchange with the content server 3, and compositing that region-specific image data with the face seal image data.
- the image composition function can be used in combination when generating the above-described face seal.
- FIG. 13 is a flowchart showing the operation of the image composition unit 111a according to this embodiment.
- an image is taken by the imaging unit 106 (S201).
- At this time, an "automatic shooting mode" in which the imaging unit 106 constantly monitors the finder image may be executed: the facial expression recognition unit 109a performs facial expression recognition on the finder image captured by the imaging unit 106, and when a predetermined expression, such as a smile, of the person being photographed is recognized, the shooting process is executed automatically.
- In step S201, the image data captured by the imaging unit 106 is recorded as a basic image in the data storage unit 122 (S202), and a face is detected in the basic image (S203).
- the face detection unit 109 calculates the coordinate position of the detected face part in the basic image.
- the position information of the own device at the time of shooting is acquired and recorded, and this position information is transmitted to the content server 3 (S204).
- The content server 3, having received this position information (S301), searches the regional image storage database for regional image data associated with the received shooting position (S302), and returns the selected image (S303).
- The distribution of regional image data by the content server 3 can be omitted when the regional image data is preinstalled and stored in the mobile terminal 1; the processing in steps S204 and S301 to S303 may therefore be executed only when the mobile terminal 1 searches for the content data in its own device and finds that it is not stored there.
- Note that, in addition to the regional image data described above, the distributed data includes "recommended information" ranking topics related to the region (sightseeing spots, local specialties, store information, other news) and map data; these pieces of information can be used for the tutorial described above, for photo albums of captured images, for slide show presentation during image reproduction, and so on.
- The camera side, having received the regional image data, or having read out the data in its own device and thereby acquired the regional image data (S205), composites the acquired regional image data D12 with the basic image D11 (S206).
- the image synthesis unit 111a synthesizes the regional image data D12 with the basic image D11 based on the coordinate position of the face part detected by the face detection unit 109.
- This composition editing operation can be performed through the GUI or an operation device 116 such as a numeric keypad, as shown in FIG. 14: the basic image D11 being shot (or edited) is displayed on the GUI, a plurality of icons 1162 for the acquired regional image data are listed in the icon area 1161, and the regional image data D12 to be composited can be selected by clicking one of these icons 1162 with the pointer 1163.
- The image data thus captured or edited and stored can undergo further editing operations such as photo-retouch processing (S207). When the editing operation is finished ("N" in S208), the edited image data is saved (S209).
- the image data is stored in the data storage unit 122 in association with the position information.
- Thereafter, the position information of the accumulated image is generated as index information (tag information) (S210) and displayed as an icon on the GUI; from this index, the corresponding image and related data or programs can be retrieved from the data storage unit 122 and read out or activated.
- The face seal generated in this way can be read out and viewed on the GUI by selecting its index icon through a selection operation such as a touch operation.
- According to the embodiment described above, the shooting mode information at the time of shooting, the position information of the device, and the shooting time information can be added to the image data D1 of a captured image, so the image data D1 can be automatically sorted and stored according to that attribute information.
- For example, the attributes of a subject can be estimated to some extent from the optical shooting mode for a given place and time zone, such as a portrait mode, landscape mode, or still-life close-up mode at a particular place or time of day, and the captured image data D1 can be classified according to that estimation. As a result, the user's burden of classifying image data is reduced.
- Furthermore, the shooting mode for the imaging unit is set based on the position information, time information, and schedule content type included in the schedule. It is therefore possible to set the shooting mode with priority given to the schedule information planned by the user, to reflect the user's planned actions, and to estimate the attributes of the image data D1 more accurately.
- In particular, since the schedule information includes the content of the plan, such as "birthday" or "overseas travel", the mode setting table T1 can be referred to by keyword, and the image data D1 can be sorted accurately.
- In addition, in the calendar UI, with the current date as the boundary, the month-and-day portion before the current date is displayed as a list of the image data D1 stored in the image data storage unit, and the month-and-day portion after the current date is displayed as the schedule information stored in the schedule storage unit. The past can thus serve as a day-by-day diary of image data D1, while future schedules can be entered and displayed, giving the UI a planner function as well. Since both are shown on a single calendar with the current date as the boundary, the diary and planner GUIs are combined into one, which improves visibility, avoids enlarging the display area, and makes effective use of a display area of limited size.
- the tutorial mode is automatically started at an appropriate place and time zone, and the user can be prompted to use the unused function.
- Also, since the face of a person in an image taken by the user can be detected and its face portion stored as a face seal usable for icons and the like, each user can put the face portions of people to various uses. Furthermore, since the shooting mode including the detection and conversion processing is also stored in the shooting-time setting storage unit, it can be set automatically during shooting and set as a tutorial subject when unused.
Abstract
Description
(1) A shooting mode setting step of setting a shooting mode for capture by the imaging unit.
(2) A shooting-time information acquisition step of acquiring, at the time of shooting, position information indicating the position of the device itself and current time information, as shooting-time information.
(3) An imaging step of capturing an image.
(4) An additional information processing step of adding, to the image data captured in the imaging step, the shooting mode setting at the time of capture of the image data and the shooting-time information, as attribute information on the content of each image data.
(5) An image data storage step of storing the image data in the image data accumulation unit together with the attribute information added by the additional information processing unit.
(6) A display control step of retrieving and displaying the image data stored in the image data accumulation unit according to the attribute information of each image.
By operating an image photographing system having the above configuration, the image photographing method of the present invention can be carried out. FIG. 12 is a flowchart showing the operation of the image photographing system according to this embodiment.
Furthermore, the image composition unit 111a described above has a function of downloading, through data exchange with the content server 3, additional data specific to a particular region and compositing that region-specific image data with the face seal image data; this image composition function can be used in combination when generating the face seal described above. FIG. 13 is a flowchart showing the operation of the image composition unit 111a according to this embodiment.
According to the embodiment described above, the shooting mode information at the time of shooting, the position information of the device, and the shooting time information can be added to the image data D1 of a captured image, so the image data D1 can be automatically sorted and stored according to that attribute information. For example, the attributes of a subject can be estimated to some extent from the optical shooting mode for a given place and time zone, such as a portrait mode, landscape mode, or still-life close-up mode at a particular place or time of day, and the captured image data D1 can be classified according to that estimation. As a result, the user's burden of classifying image data can be reduced.
D2 … schedule information
D3 … log data
T1 … mode setting table
1 … mobile terminal
2 … Internet
3 … content server
21 … satellite
22 … wireless base station
102 … shooting-time information acquisition unit
102a … position information acquisition unit
102b … clock unit
103 … shooting-time setting storage unit
104 … shooting mode setting unit
105 … additional information processing unit
106 … imaging unit
107 … shutter control unit
108 … operation signal acquisition unit
109 … face detection unit
109a … facial expression recognition unit
110 … editing processing unit
110a … e-mail generation unit
111 … editing data generation unit
111a … image composition unit
112 … display control unit
113 … image display unit
114 … tutorial setting unit
114a … tutorial acquisition unit
114b … operation history analysis unit
116 … operation device
121 … schedule storage unit
122 … data storage unit
122a–c … folders
123 … operation history recording unit
124 … tutorial storage unit
124a … tutorial information
Claims (12)
- 画像を撮影する撮像部と、
前記撮像部による撮影の撮影モードを設定する撮影モード設定部と、
a shooting-time information acquisition unit that acquires, at the time of shooting, position information indicating the position of the device itself and current time information as shooting-time information;
an additional information processing unit that adds, to the image data captured by the imaging unit, the shooting mode setting in effect when the image data was captured and the shooting-time information, as attribute information concerning the content of each piece of image data;
an image data storage unit that stores the image data together with the attribute information added by the additional information processing unit; and
a display control unit that displays a list of the image data stored in the image data storage unit according to the attribute information of each image,
the image photographing system being characterized by comprising the above.
- The image photographing system according to claim 1, further comprising a shooting-time setting storage unit that stores the position information and the time information in association with shooting modes as shooting mode settings, wherein the shooting mode setting unit refers to the mode settings according to the shooting-time information acquired by the shooting-time information acquisition unit and sets the shooting mode thus referenced for the imaging unit.
- The image photographing system according to claim 2, further comprising a schedule storage unit that stores and holds, as schedule information, position information, time information, and a type of planned activity relating to a user's plans, wherein the shooting mode setting unit, when referring to the mode settings, sets the shooting mode for the imaging unit on the basis of the position information, time information, and type of planned activity contained in the schedule information.
- The image photographing system according to claim 3, wherein the display control unit has a calendar display function that includes the current date, and the calendar display function, using the current date as a boundary, displays for the dates before the current date a list of the image data stored in the image data storage unit, and for the dates after the current date the schedule information stored in the schedule storage unit.
- The image photographing system according to claim 3, further comprising: an operation history recording unit that records a history of the shooting modes set by the shooting mode setting unit; a tutorial storage unit that stores, as tutorial information, information on usage examples of the shooting modes in association with position information and time information; an operation history analysis unit that analyzes the history of the shooting modes and extracts unused shooting modes; a tutorial acquisition unit that, on the basis of an unused shooting mode extracted by the operation history analysis unit, refers to the tutorial information and acquires a usage example of the unused shooting mode; and a tutorial setting unit that adds the position information and time information contained in the tutorial information acquired by the tutorial acquisition unit, together with a type of planned activity corresponding to the usage example, to the schedule information as the user's plans.
- The image photographing system according to claim 1, further comprising: a face detection unit that detects the face portion of a person and calculates the coordinate position of the detected face portion in the basic image; and a face sticker generation unit that converts the face portion detected by the face detection unit into image data of a specific shape, wherein the shooting mode setting unit can select a shooting mode that includes the face portion detection processing by the face detection unit and the image data conversion processing by the face sticker generation unit, and the operation history recording unit records, in the operation history, the history of shooting modes including the detection processing and the conversion processing selected by the shooting mode setting unit.
- An image photographing method using an imaging unit that captures images, the method comprising:
a shooting mode setting step of setting a shooting mode for shooting by the imaging unit;
a shooting-time information acquisition step of acquiring, at the time of shooting, position information indicating the position of the device itself and current time information as shooting-time information;
an imaging step of capturing an image;
an additional information processing step of adding, to the image data captured in the imaging step, the shooting mode setting in effect when the image data was captured and the shooting-time information, as attribute information concerning the content of each piece of image data;
an image data storing step of storing the image data in an image data storage unit together with the attribute information added in the additional information processing step; and
a display control step of searching for and displaying the image data stored in the image data storage unit according to the attribute information of each image.
- The image photographing method according to claim 7, wherein the position information and the time information are stored in advance, in association with shooting modes, in a shooting-time setting storage unit as shooting mode settings, and in the shooting mode setting step, the mode settings are referred to according to the shooting-time information acquired in the shooting-time information acquisition step and the shooting mode thus referenced is set for the imaging unit.
- The image photographing method according to claim 8, further comprising, prior to the shooting mode setting step, a schedule storing step of storing and holding, as schedule information in a schedule storage unit, position information, time information, and a type of planned activity relating to the user's plans, wherein in the shooting mode setting step, when the mode settings are referred to, the shooting mode is set for the imaging unit on the basis of the position information, time information, and type of planned activity contained in the schedule information.
- The image photographing method according to claim 9, wherein in the display control step a calendar including the current date is displayed, and in the calendar display, using the current date as a boundary, the dates before the current date show a list of the image data stored in the image data storage unit and the dates after the current date show the schedule information stored in the schedule storage unit.
- The image photographing method according to claim 9, wherein information on usage examples of the shooting modes is stored in advance, in association with position information and time information, in a tutorial storage unit as tutorial information, the method further comprising: an operation history recording step of recording a history of the shooting modes set in the shooting mode setting step; an operation history analysis step of analyzing the history of the shooting modes and extracting unused shooting modes; a tutorial acquisition step of referring to the tutorial information on the basis of an unused shooting mode extracted in the operation history analysis step and acquiring a usage example of the unused shooting mode; and a tutorial setting step of adding the position information and time information contained in the tutorial information acquired in the tutorial acquisition step, together with a type of planned activity corresponding to the usage example, to the schedule information as the user's plans.
- The image photographing method according to claim 7, wherein, prior to the image data storing step, the face portion of a person is detected, the coordinate position of the detected face portion in the basic image is calculated, and the detected face portion is converted into image data of a specific shape, and wherein in the shooting mode setting step a shooting mode including the face portion detection processing and the image data conversion processing can be selected, and in the operation history recording step the history of shooting modes including the detection processing and the conversion processing selected in the shooting mode setting step is recorded in the operation history.
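Although the claims are stated at the level of functional units, the selection logic at their core — claim 2's lookup of a shooting mode from stored (position, time) associations, and claim 4's calendar that shows stored photos for past dates and schedule entries for future dates — can be sketched in a few lines. The Python below is a minimal illustration only: all class and function names, the 0.05-degree proximity threshold, and the treatment of "today" as a schedule date are assumptions for the sketch, not details taken from the patent.

```python
from dataclasses import dataclass
from datetime import date, datetime, time

@dataclass
class ModeSetting:
    lat: float
    lon: float
    start: time   # start of the associated time window
    end: time     # end of the associated time window
    mode: str     # shooting mode to apply when position and time match

class ShootingTimeSettingStore:
    """Stands in for the claimed shooting-time setting storage unit."""

    def __init__(self, settings, max_deg=0.05):
        self.settings = settings
        self.max_deg = max_deg  # crude nearness threshold, in degrees (assumed)

    def lookup(self, lat, lon, now, default="auto"):
        """Return the mode whose location and time window both match, else default."""
        for s in self.settings:
            near = (abs(s.lat - lat) <= self.max_deg
                    and abs(s.lon - lon) <= self.max_deg)
            if near and s.start <= now.time() <= s.end:
                return s.mode
        return default

def calendar_cell(day, today, photos_by_day, schedule_by_day):
    """Claim 4's split: dates before today list stored photos;
    today and later dates show schedule information (boundary choice assumed)."""
    if day < today:
        return photos_by_day.get(day, [])
    return schedule_by_day.get(day, [])

# Two stored settings for the same place, differing only by time window.
store = ShootingTimeSettingStore([
    ModeSetting(35.68, 139.77, time(18, 0), time(23, 59), "night"),
    ModeSetting(35.68, 139.77, time(6, 0), time(17, 59), "landscape"),
])
# An evening shot near the stored location resolves to the "night" mode.
mode = store.lookup(35.681, 139.772, datetime(2009, 10, 20, 20, 30))
```

A morning shot at the same coordinates would resolve to "landscape", and a shot anywhere else falls back to the default mode, mirroring how the claimed shooting mode setting unit consults the stored mode settings with the shooting-time information before configuring the imaging unit.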
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/124,954 US20110199511A1 (en) | 2008-10-20 | 2009-10-20 | Image photographing system and image photographing method |
JP2010534819A JPWO2010047336A1 (ja) | 2008-10-20 | 2009-10-20 | Image photographing system and image photographing method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008270360 | 2008-10-20 | ||
JP2008-270360 | 2008-10-20 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010047336A1 true WO2010047336A1 (ja) | 2010-04-29 |
Family
ID=42119373
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/068076 WO2010047336A1 (ja) | 2008-10-20 | 2009-10-20 | Image photographing system and image photographing method |
Country Status (3)
Country | Link |
---|---|
US (1) | US20110199511A1 (ja) |
JP (1) | JPWO2010047336A1 (ja) |
WO (1) | WO2010047336A1 (ja) |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20120028491A (ko) * | 2010-09-15 | 2012-03-23 | Samsung Electronics Co., Ltd. | Apparatus and method for managing image data |
WO2012036331A1 (ko) * | 2010-09-17 | 2012-03-22 | LG Electronics Inc. | Method and apparatus for schedule input in a mobile communication terminal |
EP2442550B1 (en) * | 2010-10-14 | 2015-06-17 | Sony Corporation | Image capturing device, system and method |
GB2496893A (en) | 2011-11-25 | 2013-05-29 | Nokia Corp | Presenting Name Bubbles at Different Image Zoom Levels |
WO2014050189A1 (ja) * | 2012-09-27 | 2014-04-03 | Fujifilm Corporation | Imaging device and image processing method |
KR101709482B1 (ko) * | 2013-01-16 | 2017-02-23 | Samsung Electronics Co., Ltd. | Server device and method for controlling the server |
EP3161727A4 (en) * | 2014-06-24 | 2018-02-14 | Pic2go Ltd. | Photo tagging system and method |
US10593023B2 (en) * | 2018-02-13 | 2020-03-17 | Adobe Inc. | Deep-learning-based automatic skin retouching |
CN111163258B (zh) * | 2018-11-07 | 2023-04-18 | ZTE Corporation | Image storage control method, device, and storage medium |
CN110377772A (zh) * | 2019-06-26 | 2019-10-25 | Huawei Technologies Co., Ltd. | Content search method, related device, and computer-readable storage medium |
CN111159113A (zh) * | 2019-12-24 | 2020-05-15 | Guangdong Power Grid Co., Ltd. | Portable digital photography device system and photo screening method |
CN114554179B (zh) * | 2022-02-24 | 2023-08-15 | Beijing Fengchao Shiji Technology Co., Ltd. | Automatic shooting method, system, terminal, and storage medium based on a target model |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003298987A (ja) * | 2002-03-29 | 2003-10-17 | Mitsubishi Electric Corp | Portable communication device and image storage method |
US20050122405A1 (en) * | 2003-12-09 | 2005-06-09 | Voss James S. | Digital cameras and methods using GPS/time-based and/or location data to provide scene selection, and dynamic illumination and exposure adjustment |
JP2011109428A (ja) * | 2009-11-18 | 2011-06-02 | Sony Corp | Information processing device, information processing method, and program |
2009
- 2009-10-20 JP JP2010534819A patent/JPWO2010047336A1/ja active Pending
- 2009-10-20 US US13/124,954 patent/US20110199511A1/en not_active Abandoned
- 2009-10-20 WO PCT/JP2009/068076 patent/WO2010047336A1/ja active Application Filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002010178A (ja) * | 2000-06-19 | 2002-01-11 | Sony Corp | Image management system, image management method, and storage medium |
JP2005091659A (ja) * | 2003-09-17 | 2005-04-07 | Casio Comput Co Ltd | Camera device, camera control program, and camera system |
JP2005286379A (ja) * | 2004-03-26 | 2005-10-13 | Fuji Photo Film Co Ltd | Photographing assistance system and photographing assistance method |
Non-Patent Citations (1)
Title |
---|
KAORU MISAKI: "SmartWrite/SmartCalendar: a proposal for a calendar environment that links casually written memos with photos", INFORMATION PROCESSING SOCIETY OF JAPAN KENKYU HOKOKU (IPSJ SIG Technical Report), vol. 2005, no. 71, 22 July 2005 (2005-07-22), pages 71 - 76 *
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012018061A1 (en) | 2010-08-03 | 2012-02-09 | Ricoh Company, Ltd. | Imaging apparatus and imaging method |
EP2601781A1 (en) * | 2010-08-03 | 2013-06-12 | Ricoh Company, Limited | Imaging apparatus and imaging method |
EP2601781A4 (en) * | 2010-08-03 | 2014-08-27 | Ricoh Co Ltd | IMAGE FORMING APPARATUS AND IMAGE FORMING METHOD |
US8982239B2 (en) | 2010-08-03 | 2015-03-17 | Ricoh Company, Ltd. | Imaging apparatus and imaging method for recording shooting completion information |
KR20150099317A (ko) * | 2014-02-21 | 2015-08-31 | Samsung Electronics Co., Ltd. | Image processing method and apparatus |
KR102327779B1 (ko) * | 2014-02-21 | 2021-11-18 | Samsung Electronics Co., Ltd. | Image processing method and apparatus |
JP2017526074A (ja) * | 2014-08-21 | 2017-09-07 | Huawei Technologies Co., Ltd. | System and method for generating a user expression library for messaging and social networking applications |
JP2018007018A (ja) * | 2016-07-01 | 2018-01-11 | Fuji Xerox Co., Ltd. | Image processing device and image processing program |
CN106878568A (zh) * | 2017-04-12 | 2017-06-20 | Zhongshan Readboy Electronics Co., Ltd. | Optimized switching scheme for mobile phone usage modes |
JP2020188448A (ja) * | 2019-05-14 | 2020-11-19 | SZ DJI Technology Co., Ltd. | Imaging device and imaging method |
JP7463071B2 (ja) | 2019-10-07 | 2024-04-08 | Canon Inc. | Electronic device and control method for electronic device |
Also Published As
Publication number | Publication date |
---|---|
JPWO2010047336A1 (ja) | 2012-03-22 |
US20110199511A1 (en) | 2011-08-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5611829B2 (ja) | Operation control system and operation control method for an information processing device | |
WO2010047336A1 (ja) | Image photographing system and image photographing method | |
US7734654B2 (en) | Method and system for linking digital pictures to electronic documents | |
US8279173B2 (en) | User interface for selecting a photo tag | |
CN101329678B (zh) | Searching and naming entries based on metadata | |
WO2019109245A1 (zh) | Display method and device for a story album | |
US20080317346A1 (en) | Character and Object Recognition with a Mobile Photographic Device | |
EP1990744B1 (en) | User interface for editing photo tags | |
CA2630947C (en) | User interface for selecting a photo tag | |
US9449646B2 (en) | Methods and systems for media file management | |
KR101871779B1 (ko) | Terminal equipped with a photo shooting and management application | |
EP2711853B1 (en) | Methods and systems for media file management | |
JP2006350550A (ja) | Method and system for automatically creating album content | |
JP5601125B2 (ja) | Editor program, editor screen display method, and information processing device equipped with the editor program | |
JP2004240579A (ja) | Image server and image server control program | |
KR20220105958A (ko) | Diary automation system and automatic diary generation method | |
KR20140031436A (ko) | Method for providing a bookmark-based book content and memo management service using a smart device | |
Heid et al. | iPhoto'11: The Macintosh iLife Guide to using iPhoto with OS X Lion and iCloud | |
Anon et al. | Aperture Exposed: The Mac Photographer's Guide to Taming the Workflow | |
Bove | ILife'11 For Dummies | |
JP2005352923A (ja) | Information storage device and information storage method | |
JP2009260596A (ja) | Image group playback device, image group playback method, and image group playback program | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 09822033; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2010534819; Country of ref document: JP; Kind code of ref document: A |
| WWE | Wipo information: entry into national phase | Ref document number: 13124954; Country of ref document: US |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 09822033; Country of ref document: EP; Kind code of ref document: A1 |