WO2010047337A1 - Operation control system and operation control method for an information processing apparatus - Google Patents
- Publication number
- WO2010047337A1 (PCT/JP2009/068077)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- unit
- schedule
- tutorial
- image data
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
- G06F9/453—Help systems
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72469—User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
- H04M1/72472—User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons wherein the items are sorted according to specific criteria, e.g. frequency of use
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/56—Details of telephonic subscriber devices including a user help function
Definitions
- The present invention relates to an operation control system and an operation control method for an information processing apparatus having a plurality of functional modules, such as a mobile phone terminal or a PDA (Personal Digital Assistant).
- Mobile terminal devices, such as mobile phones, that perform information communication over wireless links have become widespread. Besides the call function, they are equipped with many functions such as mail transmission/reception, a digital camera, music playback, television reception, and schedule management.
- In Patent Document 1, a system has been proposed in which the user presses a help button provided on a mobile phone to call a guidance center of the mobile phone service, and an operator gives voice guidance.
- It is an object of the present invention to provide an operation control system and an operation control method for a multi-function information processing apparatus that notify the user of unused functions and guide their operation through a tutorial, thereby prompting the user to make maximum use of the functions of the apparatus.
- The present invention provides an information processing apparatus having a plurality of functional modules, comprising: a clock unit that acquires current time information; an operation history recording unit that records operation histories of the plurality of functional modules;
- a tutorial storage unit that stores information on usage examples of each of the plurality of functional modules as tutorial information in association with time information;
- an operation history analysis unit that analyzes the operation history and extracts unused functional modules;
- a tutorial acquisition unit that, referring to the tutorial information, acquires usage examples of the unused functional modules extracted by the operation history analysis unit;
- a schedule storage unit that stores, as schedule information, time information on the user's schedule and the type of schedule content; and
- a tutorial setting unit that adds the time information included in the acquired tutorial information and the type of schedule content corresponding to the usage example to the schedule information as a user schedule, together with a guide unit that notifies usage examples of the plurality of functional modules based on the schedule information.
- Another aspect of the present invention is an operation control method in an information processing apparatus having a plurality of functional modules, in which information on usage examples of the functional modules is stored in advance in a tutorial storage unit as tutorial information in association with position information and time information, the method comprising:
- an operation history recording step of recording operation histories of the functional modules;
- an operation history analysis step of analyzing the operation history and extracting unused functional modules;
- a tutorial acquisition step of acquiring a usage example of an unused functional module;
- a tutorial setting step of adding the position information, the time information, and the type of schedule content corresponding to the usage example included in the tutorial information acquired in the tutorial acquisition step to the schedule information as a user schedule; and
- a tutorial start step of notifying usage examples of the plurality of functional modules based on the schedule information.
- Since information on unused functions, which differ for each user, and their operation explanations are added to the schedule as usage-example information, operation lessons for functions the user is unaware of can be scheduled as appointments.
- The tutorial mode can thus be started automatically in an appropriate time zone, prompting the user to use an unused function.
- Preferably, the information processing apparatus further includes a position information acquisition unit that acquires position information indicating the position of the apparatus itself, and the tutorial storage unit associates the usage-example information of each of the plurality of functional modules with both the time information and the position information.
- In this case, the schedule storage unit stores and holds the time information, the position information, and the type of schedule content related to the user's schedule as schedule information.
- Preferably, the tutorial storage unit and the operation history analysis unit are provided on a server on a communication network, and the tutorial acquisition unit has a communication unit that acquires usage examples of unused functional modules based on the operation history through the communication network.
- In this way, the location information, time information, and the like at the time of shooting can be collected on the server from the per-user operation history of each functional module, so that the information can be analyzed and the unused functional modules of each user identified.
- Various usage examples can then be set, making it possible to provide usage examples of unused functional modules that are more appropriate for each user.
- Since the operation history analysis and the content data for tutorials are held on the server, the processing burden on the user's device is reduced and its memory capacity is used effectively.
- Preferably, the apparatus further includes a display control unit having a calendar display function that includes the current date; with the current date as a boundary, the calendar displays, for dates before the current date, a list of the image data stored in the image data storage unit, and, for dates after the current date, the schedule information stored in the schedule storage unit.
- Preferably, the plurality of functional modules include: an imaging unit that captures images; a shooting mode setting unit that sets the operation mode of shooting by the imaging unit; a unit that acquires position information indicating the position of the apparatus at the time of shooting;
- a shooting time information acquisition unit that acquires current time information as shooting time information;
- an additional information processing unit that adds the operation mode setting and the shooting time information at the time of shooting to the image data captured by the imaging unit as attribute information on the content of the image; an image data storage unit that stores the image data with the attribute information added by the additional information processing unit; and
- a display control unit that searches the image data stored in the image data storage unit according to the attribute information and displays the results.
- Preferably, the operation history recording unit records the operation mode set by the shooting mode setting unit as part of the operation history.
- Since the shooting mode information, the position information of the apparatus, and the shooting time information at the time of shooting can be added to the image data, attribute information corresponding to each of these items can be attached.
- Image data can then be retrieved and displayed by attribute, reducing the user's burden of classifying image data.
- Since the additional information of the image data is associated with the operation mode information, the position information, and the shooting time information, more detailed searching and browsing of image data becomes possible, and users can be encouraged to use these features as part of the tutorial.
- Preferably, the apparatus includes a face detection unit that detects a person's face and calculates the coordinate position of the detected face within the base image, and a face seal generation unit that converts the face portion detected by the face detection unit into image data of a specific shape;
- the plurality of functional modules include the face detection unit and the face seal generation unit, and the shooting mode setting unit can select an operation mode that includes the face detection process by the face detection unit and the image data conversion process by the face seal generation unit;
- the operation history recording unit then records the operation mode history, including the detection and conversion processes selected by the shooting mode setting unit, in the operation history.
- Since the face of a person in an image taken by the user can be detected and stored separately as data, each user can use the face portion for various purposes. Furthermore, since the operation mode including the detection and conversion processes is also stored in the operation history recording unit, automatic setting of the operation mode during shooting and tutorial scheduling when the mode is unused become possible.
- In a multi-function information processing apparatus such as a mobile phone terminal or a PDA,
- unused functions are brought to the user's attention by a tutorial, encouraging maximum use of the functions of the apparatus.
- As a result, it is expected that users who make use of various kinds of content can be newly acquired.
- FIG. 1 is a conceptual diagram illustrating the overall configuration of an image capturing system according to the embodiment. FIG. 2 is a front view showing the appearance of the mobile terminal according to the embodiment.
- FIG. 3(a) is a block diagram showing the internal configuration related to the schedule and tutorial functions of the mobile terminal according to the embodiment,
- and FIG. 3(b) is a block diagram showing the modules of the image capturing system of the mobile terminal. FIG. 4 is a flowchart showing the operation of the embodiment.
- Further explanatory diagrams show the display screen of the folders that store image data, the display screen of the calendar function, and the screen transitions of the schedule function according to the embodiment.
- FIG. 1 is a conceptual diagram illustrating the overall configuration of the image capturing system according to the present embodiment
- FIG. 2 is a front view illustrating the appearance of the mobile terminal 1.
- The image capturing system is schematically composed of a camera-equipped mobile terminal 1 used by a user and a content server 3 installed on the Internet 2.
- the camera-equipped mobile terminal 1 will be described as an example of an information processing apparatus having a plurality of functional modules.
- the content server 3 is a server that distributes additional content and tutorial information, and includes, for example, a Web server.
- The content server is a server computer (or software with the equivalent function) that transmits information such as HTML (HyperText Markup Language) files, image files, and music files in a document system such as the WWW (World Wide Web). It stores information such as documents and images, and transmits it over the Internet 2 in response to requests from the mobile terminal 1.
- The content data distributed by the content server 3 may be preinstalled and stored in the mobile terminal 1 in advance; the mobile terminal 1 then searches its own storage for the content data and, if it is not stored locally, may issue a distribution request to the content server 3 and download it.
- The content server 3 includes a regional image storage unit 31 that stores content data such as regional image data in association with position information, and serves as a regional image distribution server that distributes content unique to each region, such as frames and "signboard"-style templates depicting various characters.
- The content unique to each region also includes "recommended information" ranking topics related to the region (sightseeing spots, local specialties, store information, other news) and map data.
- The camera-equipped mobile terminal 1 is a mobile phone using wireless communication; it communicates wirelessly with relay points such as base stations and can receive communication services such as calls and data communication while moving.
- Examples of the mobile phone communication system include the FDMA, TDMA, CDMA, W-CDMA, and PHS (Personal Handyphone System) systems.
- the mobile phone is also equipped with functions such as a digital camera function, an application software execution function, or a GPS (Global Positioning System) function, and also functions as a personal digital assistant (PDA).
- This camera function is an imaging function that optically captures digital images.
- The position information acquisition function acquires and records position information indicating the position of the apparatus at the time of shooting. As shown in the figure, possible methods include detecting the position of the apparatus based on signals from the satellites 21, or detecting the position based on the radio wave intensity from the radio base station 22 of the mobile phone network.
- The mobile terminal 1 includes operation devices 116, such as operation buttons, a jog dial, and a touch panel, with which the user performs input operations, and an image display unit 113.
- the image display unit 113 includes a main screen 401 and a calendar UI 402 as GUIs.
- the mobile terminal 1 has a schedule function and a tutorial function using the calendar UI 402.
- FIG. 3A is a block diagram showing an internal configuration related to the schedule function and the tutorial function of the mobile terminal 1.
- The "module" used in this description refers to a functional unit configured by hardware such as an apparatus or device, by software having the equivalent function, or by a combination thereof, and achieving a predetermined operation.
- the mobile terminal 1 includes a schedule storage unit 121, a data storage unit 122, an operation history recording unit 123, a tutorial storage unit 124, and a tutorial setting unit 114 as schedule and tutorial modules.
- the data storage unit 122 is a storage device that stores various types of data.
- It accumulates the captured image data D1, processing data acquired from the content server 3 (such as regional images), combined image data obtained by compositing these, and edited data obtained by processing the image data D1 (for example, face seals).
- the data storage unit 122 distributes and stores the image data D1 in a plurality of folders 122a to 122c according to the attribute information added by the additional information processing unit 105.
- the folders 122a to 122c are displayed as icons so that the image data can be stored according to themes, as shown in FIG. 5 (b).
- the image data stored in the folder can be browsed.
- The image data is browsed by using the attribute information (tag information) described above as a keyword: image data with the same or similar attribute information is searched for, and related image data can also be detected and browsed.
- As a display method for browsing image data, low-resolution thumbnail image data with a small data size may be displayed.
- This photo album is an album in which images are automatically classified according to tag information; each image is classified according to the attributes included in its tag information. Note that the same image appears to be stored in a plurality of folders (albums) because the same image data is classified into as many albums as it has tag information entries.
- the types of photo albums can be classified according to the items described in the tag information.
- For example, photo albums can be grouped by the period of a schedule category (such as "travel"), by the type of event, by the persons in the images and their attributes (for example, "family"), or by calendar date (a specific date such as a birthday).
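The tag-based classification described above can be sketched in a few lines. This is an illustrative Python sketch, not the patent's implementation; the field names ("file", "tags") are assumptions.

```python
from collections import defaultdict

def build_albums(images):
    # An image with several tags is classified into several albums,
    # which is why the same image appears in multiple folders.
    albums = defaultdict(list)
    for image in images:
        for tag in image["tags"]:
            albums[tag].append(image["file"])
    return dict(albums)

photos = [
    {"file": "IMG_001.jpg", "tags": ["travel", "family"]},
    {"file": "IMG_002.jpg", "tags": ["family", "smile"]},
]
albums = build_albums(photos)
# "IMG_001.jpg" is listed under both the "travel" and "family" albums
```

Keying albums by tag rather than by file is what makes the duplicate-looking folders cheap: only references are duplicated, not the image data.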
- The schedule storage unit 121 is a storage device that stores and holds position information, time information, and the types of schedule content related to the user's schedule as schedule information D2. This schedule information can be customized by each user and is displayed on the calendar UI 402 as shown in the figures. Specifically, the calendar UI 402 provided by the display control unit 112 is a calendar display that includes the current date: with the grid 405 indicating the current date as a boundary, the date portion 404 before the current date shows a list of the image data D1 stored in the data storage unit 122, and the date portion 406 after the current date shows the schedule information stored in the schedule storage unit 121.
- As shown in the figure, a folder 122c related to a plan can be created.
- For example, the snapshot shooting mode is recorded in the mode setting table T1 in association with the time zones of the three daily meals and the position information of the home.
- When the registered time and place are reached, a pop-up message 407 as shown in FIG. 7C is displayed and the mode setting table T1 is automatically read,
- so that a shooting mode suitable for taking a snapshot of a meal is prepared as shown in the figure.
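A minimal sketch of the mode setting table T1 lookup might look as follows. The table entries (meal time zones at "home") and the matching rule are assumptions for illustration, not data from the patent.

```python
MODE_SETTING_TABLE = [
    # (start_hour, end_hour, place, shooting_mode)
    (7, 9, "home", "snapshot"),    # breakfast
    (12, 13, "home", "snapshot"),  # lunch
    (19, 21, "home", "snapshot"),  # dinner
]

def lookup_mode(hour, place):
    # Return the shooting mode registered for the current time zone
    # and position, or None when no table entry matches.
    for start, end, table_place, mode in MODE_SETTING_TABLE:
        if start <= hour < end and table_place == place:
            return mode
    return None
```

When `lookup_mode` returns a mode, the terminal would show the pop-up message and preset that mode; a `None` result leaves the current settings untouched.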
- the tutorial storage unit 124 is a storage device that stores information related to usage examples of the shooting mode as tutorial information 124a in association with position information and time information.
- The tutorial stored in the tutorial storage unit 124 is teaching material that explains how to use the shooting function and the like; it is content data including moving images, still images, text, and a script (program) that causes each functional module of the mobile terminal 1 to execute.
- The current position is measured using position information acquisition means such as GPS, and regional information near the current location, such as recommended spots, store information, gourmet information, and local specialties, is displayed.
- The regional information may be distributed from the content server 3, and is preferably displayed ranked according to importance (recommendation level). This ranking may take into account movie box-office results, CD and ringtone (Chaku-Uta) download counts, TV audience ratings, and search counts.
- In the tutorial, a character string is displayed in a pop-up 407 as shown in FIGS. 8A and 9C, and guidance 601 consisting of moving images or still images is displayed together with the schedule.
- A notification pop-up screen 602c is displayed, and the explanation of usage examples proceeds interactively in accordance with the user's response operations, using character information and voice output such as a speech balloon 602a spoken by the character 602b.
- The browser function may also be started automatically to access the Internet, or regional information data stored in advance in the terminal may be accessed, in order to recommend and explain functions and display related information (ranking information and the like).
- In FIGS. 8 and 9, tutorial information that prompts the user to use the camera function is activated, and the camera shooting screen is started while guiding the user's operation.
- Here, a case where the tutorial information is distributed as a present from the content server 3 is taken as an example (FIG. 8(a)); a schedule is registered under the theme "Let's play with the camera", and a folder 122 dedicated to this tutorial is generated (FIGS. 9A and 9B).
- When the scheduled time arrives (FIG. 8(b) or FIG. 9(c)),
- guidance of the operation by the character 602b is started, and in accordance with the user's response operations, operation lessons such as "camera shooting basics (FIG. 8(a))" and "exposure correction (FIG. 9D)" proceed interactively.
- Such a tutorial is executed for an unused function identified by analyzing the user's operation history.
- Its execution is set by the tutorial setting unit 114.
- The contents registered in the schedule are monitored periodically, and when a registered time is reached, an explanation of the recommended function and an alert for the scheduled time are given, and related information is displayed by a recommendation search.
- Alternatively, the current time and current position may be acquired from the shooting time information acquisition unit 102, a function suited to the situation (time, place, etc.) may be explained, and related information may be displayed by a recommendation search.
- The tutorial setting unit 114 is a module that adds the position information, time information, and type of schedule content corresponding to the usage example included in the tutorial information 124a acquired by the tutorial acquisition unit 114a to the schedule information as a user schedule,
- and includes the tutorial acquisition unit 114a and an operation history analysis unit 114b.
- The operation history analysis unit 114b is a module that analyzes the shooting mode history and extracts unused shooting modes.
- The tutorial acquisition unit 114a is a module that refers to the tutorial information 124a and obtains tutorials on the unused shooting modes and other functions, based on the unused shooting modes extracted by the operation history analysis unit 114b, the schedules registered in the schedule storage unit 121, and the situation acquired by the shooting time information acquisition unit 102.
- The operation history recording unit 123 is a storage device that records the history of shooting modes set by the shooting mode setting unit 104 as log data D3; the history of mode settings based on user operations is accumulated as the log data D3.
- FIG. 4 is a flowchart showing operations of the schedule function and the tutorial function according to the present embodiment.
- Unused functional modules are extracted periodically. Specifically, each time the user uses any functional module, log data D3 of the used module is recorded; the accumulated log data D3 is read from the operation history recording unit 123 (S101), and operation history analysis is executed (S102).
- Usage-example information for the unused functional modules is selected with reference to the tutorial information (S103). Then the position information, time information, and type of schedule content corresponding to the usage example included in the tutorial information selected in step S103 are added to the schedule information D2 as a user schedule (S104).
- The tutorial information is also associated with the position information of the apparatus, and in the tutorial setting step the time information, the position information, and the type of schedule content related to the user's schedule are added to the schedule information D2.
- the schedule information D2 is updated (S105).
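Steps S101-S105 can be sketched as follows. All data structures here are illustrative assumptions: the module names, the shape of the log entries, and the tutorial entries are not defined by the patent.

```python
# Hypothetical set of functional modules available on the terminal.
ALL_MODULES = {"camera", "mail", "music", "tv", "gps"}

# Hypothetical tutorial information: a usage example per module, with
# the time and schedule-content type to register (used in S103/S104).
TUTORIAL_INFO = {
    "tv":  {"time": "20:00", "type": "operation lesson"},
    "gps": {"time": "10:00", "type": "operation lesson"},
}

def update_schedule(log_data, schedule_info):
    used = {entry["module"] for entry in log_data}   # S101: read log D3
    unused = ALL_MODULES - used                      # S102: analyze history
    for module in sorted(unused):
        tutorial = TUTORIAL_INFO.get(module)         # S103: select usage info
        if tutorial is not None:
            schedule_info.append({"module": module, **tutorial})  # S104
    return schedule_info                             # S105: updated D2

log = [{"module": "camera"}, {"module": "mail"}, {"module": "music"}]
schedule = update_schedule(log, [])
# schedule now holds tutorial appointments for the unused "tv" and "gps"
```

The set difference in S102 is the whole analysis in this sketch; a real implementation could weight by frequency or recency rather than treating any single use as "used".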
- The updated schedule information D2 can be displayed as a list on the calendar UI, with icons shown according to the type of schedule content.
- With the current date as a boundary, the date portion before the current date is set to a list display of the image data D1 stored in the data storage unit 122,
- and the date portion after the current date is set to a display of the schedule information D2 stored in the schedule storage unit 121.
- FIG. 4B is a flowchart showing a procedure of operation control based on a schedule (tutorial).
- The position information and current time information of the apparatus are acquired periodically by loop processing (S201), the schedule information D2 is referred to (S202), and it is determined whether the current location and time match a schedule entry (S204).
- A tutorial for an unused functional module may be set as a schedule entry;
- the schedule checked in step S204 includes the schedule for starting the operation lesson tutorial described above. That is, in the tutorial information 124a, information on usage examples of, for example, unused shooting modes is stored in association with position information and time information, and the position information, time information, and type of schedule content corresponding to the usage example are added to the schedule information as the user's schedule, so that an operation lesson in the unused shooting mode is scheduled as a user appointment.
- Otherwise, a functional module operated voluntarily by the user is executed (S205 and S207),
- and the operation history is added to the log data D3 (S208).
- When the scheduled location and time are reached in step S204 ("Y" in step S204), the unused functional module is automatically activated, usage examples of the functional modules are notified based on the schedule information, and the user is prompted to use the unused functional module (S206). When the functional module is operated and executed (S207), the operation history is added to the log data D3 (S208).
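The monitoring loop of FIG. 4(b) (S201-S208) can be sketched as a single periodic tick. Function names and the entry schema are assumptions; `start_tutorial` is passed in as a callback standing in for the tutorial launcher.

```python
def check_schedule(schedule_info, now, here):
    # S202/S204: return the entry matching the current time and place,
    # or None (the "N" branch of S204).
    for entry in schedule_info:
        if entry["time"] == now and entry["place"] == here:
            return entry
    return None

def tick(schedule_info, now, here, log_data, start_tutorial):
    entry = check_schedule(schedule_info, now, here)   # S201-S204
    if entry is not None:
        start_tutorial(entry["module"])                # S206: notify usage
        log_data.append({"module": entry["module"], "time": now})  # S208
    return entry

schedule = [{"module": "tv", "time": "20:00", "place": "home"}]
started, log = [], []
tick(schedule, "20:00", "home", log, started.append)
# started == ["tv"]; the activation is also recorded in the log (S208)
```

A real terminal would call `tick` from a timer with fuzzy time/place matching (e.g. a radius around the registered position) rather than the exact equality used here.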
- FIG. 3B is a block diagram illustrating modules of the image capturing system.
- the image shooting system module includes an imaging unit 106, a shutter control unit 107, a face detection unit 109, a shooting mode setting unit 104, and an additional information processing unit 105.
- the imaging unit 106 is a module that optically captures a digital image, executes an imaging process in accordance with control from the shutter control unit 107, and stores the captured image in the data storage unit 122 as image data D1.
- The shutter control unit 107 receives operation signals based on user operations from the operation devices through the operation signal acquisition unit 108 and executes the shooting process.
- An automatic shooting mode may be added in which the shooting process is performed automatically at the moment the facial expression recognition unit 109a in the face detection unit 109 recognizes a specific facial expression such as a smile.
- the additional information processing unit 105 is a module that adds attribute information as tag information to the image data D1 captured by the image capturing unit 106.
- The additional information processing unit 105 adds, to each piece of image data D1, the shooting mode acquired from the shooting mode setting unit 104 at the time of shooting
- and the shooting time information (position information, time, etc.) acquired from the shooting time information acquisition unit 102, as attribute information on the content of the image data D1.
- As methods for adding the tag information, the tag information may be stored directly in the image data D1, or separate file data may be used as management data (metadata) associated with the image data D1.
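The second approach (separate management data) might take the shape of the following sketch: one metadata record per image, serialized next to the image file. The schema and the `.json` sidecar convention are assumptions for illustration, not a format defined by the patent.

```python
import json

def make_tag_record(image_file, shooting_mode, position, time):
    # Attribute (tag) information kept outside the image data itself.
    return {
        "image": image_file,
        "shooting_mode": shooting_mode,  # from the shooting mode setting unit
        "position": position,            # (latitude, longitude) at shooting
        "time": time,
    }

def sidecar_name(image_file):
    # One management-data (metadata) file per image file.
    return image_file + ".json"

record = make_tag_record("IMG_001.jpg", "portrait", (35.68, 139.76),
                         "2009-10-20T12:00")
serialized = json.dumps(record)
restored = json.loads(serialized)  # tuple comes back as a JSON array
```

Keeping the tags in a sidecar means they can be edited, deleted, or extended (as the surrounding text describes) without rewriting the image data.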
- Furthermore, the contents of the schedule corresponding to the shooting time (location, participants, event contents) can be recorded in association with the attribute information of the captured image data.
- As tag information, the face of a person shown in the image may be recognized and its facial features may be included.
- The facial features may be recorded, for example, by identifying the person with reference to a face seal (face photo information) set in the address book or the like, and associating the image with that address book entry.
- facial expressions such as a smile may be recognized, and the facial expressions may be included in the tag information.
- Images taken with smile recognition can be classified by the tag information "smile", and an album collecting smile images can be generated automatically.
- this tag information can be displayed in a list for each image, and can be edited, changed, deleted, added, etc. by user operation.
- the attribute information included in the tag information can be used for album classification of the image, automatic mail generation when attached to an electronic mail, and the like.
- the shooting mode setting unit 104 is a module for setting a shooting mode for shooting by the imaging unit 106.
- The shooting mode includes settings such as shutter speed, exposure (aperture), focal length, flash on/off, and filters. Typical modes include: portrait mode (the background is blurred), landscape mode (adjusted to focus evenly from short to long distance, including commemorative photos with multiple subjects), close-up mode (shooting close to the subject), sport mode (shooting at a high shutter speed, including continuous shooting), night portrait mode (adjusting the aperture, for example when shooting a person against a night view), and flash prohibition mode (for outdoor shooting).
- basic settings included in this shooting mode include settings for image size, image quality (resolution), and storage destination (main body or recording medium, etc.).
- A selection (combination) pattern of suitable shooting settings and functions may also be displayed and selected as a "camera type".
- For example, the mode in which users set all shooting functions themselves may be called the "pro camera";
- the mode for shooting at night or indoors with a high sensitivity setting, the "high-sensitivity camera";
- the mode that allows frames, letters, and pictures to be added after shooting, the "purikura camera";
- the mode that transforms shot images and composites them with other images, the "party camera"; the mode that applies effects such as sepia tone, oil painting touch, pseudo fisheye lens, and filter processing, the "art camera";
- the mode that performs image analysis such as 2D barcode and QR code reading, character recognition (OCR), and face recognition, the "search camera";
- the mode that shoots movies and images, adds comments, and uploads them to a website on the Internet, the "blog camera";
- and modes such as the "pet camera", which outputs the sounds of animals such as dogs and cats to attract the subject's attention, or the "cooking camera", suited to close-up shots of still-life subjects such as dishes, can also be set.
- the shooting mode setting unit 104 can, when referring to the mode setting, set or correct the shooting mode for the imaging unit 106 based on the current time and on the position information, time information, and type of scheduled content included in the schedule information, and can display a message recommending a better shooting mode setting. For example, if the current time falls within a trip listed in the schedule, the portrait mode setting is used as the default; if it falls within a party, a message recommending the preferential use of the "Party camera", "Purikura camera", or the like is output. If the current time is at night, the "High-sensitivity camera" may be recommended. Further, in conjunction with the shooting mode setting, a message recommending other functions may be output, such as recommending the use of a local frame or of the GPS navigation function.
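As a rough sketch of this schedule-driven recommendation: the keyword-to-mode table, mode names, and the night-time cutoff below are illustrative assumptions, not the patent's implementation.

```python
# Hypothetical sketch: recommend a shooting mode from the current schedule
# entry's event type and the time of day. All mappings are illustrative.
RECOMMENDATIONS = {
    "trip": "portrait",
    "party": "party_camera",
    "birthday": "purikura_camera",
}

def recommend_mode(event_type, hour):
    """Return (mode, message) for the current schedule entry."""
    # A night-time hour overrides the event-based default.
    if hour >= 19 or hour < 5:
        return ("high_sensitivity_camera",
                "Recommended: high-sensitivity camera for night shooting")
    mode = RECOMMENDATIONS.get(event_type, "auto")
    return mode, f"Recommended mode for '{event_type}': {mode}"

mode, msg = recommend_mode("party", hour=15)
```

An unknown event type falls back to a neutral "auto" mode rather than suppressing the message entirely.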
- the shooting setting storage unit 103 is a storage device, such as a non-volatile memory, that stores and holds the table data used to select a shooting mode according to place and time when the shooting mode setting unit 104 sets a shooting mode. Specifically, table data associating position information and time information with shooting modes is stored as a mode setting table T1; in response to a request from the shooting mode setting unit 104, the mode setting table T1 is looked up with the received position and time information, and the shooting mode to be used for shooting is read out and sent to the shooting mode setting unit 104.
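A minimal sketch of such a table lookup; the row format (a location radius plus an hour window) and the coordinates are illustrative assumptions about how T1 could be organized.

```python
import math

# Hypothetical mode setting table T1: each row associates a location,
# a time window, and a shooting mode. All values are illustrative.
MODE_TABLE_T1 = [
    # (lat, lon, radius_km, start_hour, end_hour, mode)
    (35.659, 139.700, 2.0, 18, 24, "night_portrait"),   # city centre, evening
    (35.360, 138.727, 10.0, 5, 18, "landscape"),        # mountain area, daytime
]

def lookup_mode(lat, lon, hour, default="auto"):
    """Return the first table row whose location radius and hour window match."""
    for t_lat, t_lon, radius_km, start, end, mode in MODE_TABLE_T1:
        # Rough planar distance (degrees * ~111 km); fine for small radii.
        dist_km = math.hypot(lat - t_lat, lon - t_lon) * 111.0
        if dist_km <= radius_km and start <= hour < end:
            return mode
    return default
```

When no row matches, the default mode is returned so the imaging unit always has a valid setting.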
- the shooting mode setting unit 104 refers to the mode setting according to the shooting time information acquired by the shooting time information acquisition unit 102 and sets the retrieved shooting mode for the imaging unit 106.
- the shooting time information acquisition unit 102 includes a position information acquisition unit 102a that acquires position information indicating the position of the device itself at the time of shooting, and a clock unit 102b that acquires current time information; upon completion of shooting, in response to a request from the additional information processing unit 105, the data acquired from the position information acquisition unit 102a and the clock unit 102b is passed to the additional information processing unit 105 as shooting time information.
- the position information acquisition unit 102a is a module that acquires and records position information indicating the position of the own device at the time of shooting.
- the position information acquisition unit 102a detects the position of the device itself from signals from the satellite 21, as with GPS, or detects the position from the radio field intensity of the radio base station 22 of the mobile phone network.
- the clock unit 102b is a module that measures the current time; it may manage time zones and display a time adjusted for the time difference according to the position information acquired by the position information acquisition unit 102a.
- the data storage unit 122 is a storage device that stores various types of data.
- the captured image data D1, the processing data acquired from the content server 3, composite image data obtained by combining them, and edited data produced by processing the image data D1 (for example, face stickers) are accumulated there.
- the data storage unit 122 distributes and stores the image data D1 in a plurality of folders 122a to 122c according to the attribute information (tag information) added by the additional information processing unit 105.
- although the images may literally be stored in folders according to their tag information, the present invention is not limited to this; it is preferable to use a single typical folder specified in the basic settings (for example, "My picture") as the actual storage location, and to perform pseudo-classification based on the tag information added to each image when displaying a list or an album.
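A short sketch of this pseudo-classification: files stay in one physical folder, and "albums" are built on demand from the tag information. The record layout and tag names are illustrative assumptions.

```python
from collections import defaultdict

# Hypothetical sketch: images live in one folder ("My picture"); album views
# are derived on the fly from each image's tag information.
def build_albums(images):
    """Group image records by each tag they carry.

    One image may appear in several albums, which a physical folder
    layout could not express without duplicating the file."""
    albums = defaultdict(list)
    for image in images:
        for tag in image["tags"]:
            albums[tag].append(image["file"])
    return dict(albums)

images = [
    {"file": "IMG_001.jpg", "tags": ["smile", "party"]},
    {"file": "IMG_002.jpg", "tags": ["smile"]},
    {"file": "IMG_003.jpg", "tags": ["landscape"]},
]
albums = build_albums(images)
```

Deleting or editing a tag then only changes the album view, never the stored file, which matches the editability of tag information described above.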
- the face detection unit 109 is a module that extracts the characteristic geometric shapes formed by the eyes, nose, mouth, and so on, detects a human face portion, and calculates the coordinate position of the detected face portion within the basic image. This face detection process runs from before the shooting operation: as shown in FIG. 10A, the detected face is highlighted as a marking 408a on the finder screen, and the process also works with an autofocus function that measures the distance to the subject and focuses accordingly.
- the face detection unit 109 has a facial expression recognition unit 109a that recognizes a predetermined facial expression, such as a smile, of the person being photographed; it recognizes the face portion and the facial expression from the monitor image at the time of shooting and inputs the recognition result to the shutter control unit 107.
- the shutter control unit 107 sends the imaging unit 106 a control signal that triggers a shutter operation according to the input recognition result.
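The recognition-to-shutter handoff can be sketched as a small event handler; the class and method names are illustrative assumptions, not the patent's module interfaces.

```python
# Hypothetical sketch of smile-triggered shutter control: the expression
# recognizer's result for each monitor frame is fed to the controller,
# which fires the shutter only when a smile is recognized.
class ShutterController:
    def __init__(self):
        self.captured_frames = []

    def on_recognition(self, frame, expression):
        """Receive one recognition result; return True if the shutter fired."""
        if expression == "smile":
            self.captured_frames.append(frame)  # stand-in for the capture
            return True
        return False

ctrl = ShutterController()
results = [ctrl.on_recognition(f, e)
           for f, e in [("frame1", "neutral"),
                        ("frame2", "smile"),
                        ("frame3", "neutral")]]
```

Non-smile frames simply pass through, so the monitor loop keeps running until the desired expression appears.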
- Information on the facial features and expressions recognized by the face detection unit 109 and the facial expression recognition unit 109a is described in the tag information, so that each image can be classified by the person it shows or by the type of facial expression.
- as to the facial features, a person may be identified, for example, by collating against the face stickers registered in the address book.
- the mobile terminal 1 includes an editing processing unit 110 and an editing data generation unit 111 as editing processing modules.
- the editing processing unit 110 is a module that executes various editing processes according to the operator's input. In conjunction with the GUI, it allows photo-retouching operations such as drawing on an image or writing characters with a touch pen, as well as image composition processing that composites an image such as a frame image or background image with a captured basic image.
- the editing processing unit 110 also has a function of displaying a list of tag information associated with each image and editing such information as addition, deletion, and change.
- the editing processing unit 110 also includes an e-mail generation unit 110a that generates an e-mail and transmits it via the communication I / F 101.
- the e-mail generation unit 110a has a function of transmitting images stored in the image data storage unit 122 and captured images D1 as e-mail attachments, and can automatically generate the necessary fields based on the image data's tag information. For example, the person recorded in the tag information is read out, that person is searched for in the address book, and the e-mail address or name found there is automatically entered as the destination address. In addition, the location, time, and event content at the time the image was taken may be extracted from the tag information or the associated schedule information (history information) to compose a sentence, and that sentence may be quoted in the body and title of the e-mail.
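A minimal sketch of drafting mail fields from tag information and an address book; the tag keys, names, and addresses below are illustrative assumptions.

```python
# Hypothetical sketch of automatic e-mail drafting from an image's tag
# information. The address book and tag field names are illustrative.
ADDRESS_BOOK = {"Hanako": "hanako@example.com", "Taro": "taro@example.com"}

def draft_mail(tags):
    """Build (recipients, subject, body) for a mail attaching the image."""
    # People named in the tags become recipients if the address book knows them.
    recipients = [ADDRESS_BOOK[name] for name in tags.get("people", [])
                  if name in ADDRESS_BOOK]
    subject = f"Photos from {tags.get('event', 'our day')}"
    body = (f"Taken at {tags.get('place', 'unknown place')} "
            f"on {tags.get('time', 'unknown time')}.")
    return recipients, subject, body

to, subject, body = draft_mail(
    {"people": ["Hanako"], "event": "birthday party",
     "place": "Shibuya", "time": "2009-10-20 15:00"})
```

Any tag field the image lacks falls back to a neutral placeholder rather than blocking mail creation.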
- the e-mail generator 110a also has a function of inserting a face seal as a pictograph during editing of the mail text.
- the insertion of a person's face sticker may be tied to the dictionary conversion function, so that when that person's name appears in a sentence it is automatically converted into, or accompanied by, the corresponding face sticker.
- the e-mail generation unit 110a also has a function of automatically composing an e-mail by quoting the contents of a schedule entry. For example, the address book is searched using the name of a person appearing in the schedule, that person's mail address is set as the destination, and the e-mail title and body are created from the event contents in the schedule.
- the editing data generation unit 111 is a module that generates editing data as a result of the editing operation by the editing processing unit 110, and the generated data is stored in the data storage unit 122.
- the edit data generation unit 111 also stores the tag information related to the edited image data in the data storage unit 122 together with the edited image data.
- the edit data generation unit 111 includes an image composition unit 111a.
- the image composition unit 111a uses the image captured by the imaging unit 106 as a basic image D11, and composites other image data D12 with the basic image D11.
- the composition processing by the image composition unit 111a can also be controlled by an editing operation on the editing processing unit 110.
- the image composition unit 111a has a function of a face seal generation unit that converts the face portion detected by the face detection unit 109 into a face seal that is image data D1 having a specific shape.
- This face sticker is generated, based on the coordinate position of the face portion detected by the face detection unit 109, by overlaying costume image data or the like on the basic image, or by cutting the face out of the image, as icon image data of various shapes, as shown in the figure.
- This face sticker generation process is executed automatically by default when the face detection unit 109 detects a face portion. The user therefore does not have to generate face stickers manually; a face sticker is automatically generated and accumulated every time a face portion is photographed.
- the image composition unit 111a may composite other image data with the photographed image on the basis of the coordinate position of the face portion 107b detected by the face detection unit 109, as shown in FIGS. 13(a) to 13(c).
- the face sticker generated by the image composition unit 111a can be stored in association with a person registered in the address book, and that person's face sticker can then be used in operations and data related to that person.
- a face sticker can be displayed as an address book index, and a face sticker can be displayed as an icon as a photo album index.
- This face sticker can also be used as a pictograph representing each person when composing an e-mail: when a person name registered in the address book is written in the e-mail, the corresponding face sticker may be detected automatically and added to the sentence.
- the portable terminal 1 includes an operation signal acquisition unit 108, a display control unit 112, and an image display unit 113 as user interface modules.
- the operation signal acquisition unit 108 is a module that acquires an operation signal from an operation device based on a user operation and inputs an execution command corresponding to the operation signal to the shutter control unit 107 and the editing processing unit 110.
- the operation devices here include a touch panel, operation buttons, an acceleration sensor, and the like. Each operation signal is transmitted to the module it operates, and is also sent to the operation history recording unit 123 and accumulated as log data D3.
- the display control unit 112 is a module that generates the image data displayed on the image display unit 113, which serves as the GUI. At the time of shooting it displays the finder image captured by the imaging unit 106; at the time of editing it displays changes of images such as icons based on the user operation signals acquired by the operation signal acquisition unit 108. As one method of displaying icons, as shown in FIG. 14, the display control unit 112 arranges the icons 601 in a spiral on the image display unit 113; when the user rotates the spiral clockwise or counterclockwise, the icons may move from the centre outward while being enlarged, or from the outside toward the centre while being reduced.
- as the user's rotation operation, it is conceivable either to physically provide a wheel unit for rotating operation on the mobile phone body, or to detect rotation of the pointing position on the touch panel as the operation.
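The spiral icon layout can be sketched with an Archimedean spiral whose rotation offset stands in for the wheel or touch gesture; the step, growth, and size constants are illustrative assumptions.

```python
import math

# Hypothetical sketch of the spiral icon layout: icon n sits on an
# Archimedean spiral (r = a * theta) and grows toward the outside.
# A rotation offset shifts every icon along the spiral, so rotating the
# wheel slides icons outward (enlarging) or inward (shrinking).
def spiral_layout(n_icons, rotation=0.0, step=0.6, growth=4.0):
    """Return (x, y, size) per icon, with the screen centre at (0, 0)."""
    positions = []
    for n in range(n_icons):
        theta = n * step + rotation          # angle along the spiral
        r = growth * theta                   # Archimedean spiral radius
        x, y = r * math.cos(theta), r * math.sin(theta)
        size = 16 + 2 * theta                # icons enlarge toward the edge
        positions.append((x, y, size))
    return positions

layout = spiral_layout(8, rotation=0.1)
```

Animating `rotation` over time reproduces the clockwise/counterclockwise swirl described above without recomputing anything per icon beyond its angle.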
- the display control unit 112 is provided with a slide show function that sequentially reproduces a plurality of images stored in the image data storage unit 122.
- during a slide show, the tag information used as the criterion for classifying the photo album is collated, and information related to that tag information (shooting location, time, shooting mode, the schedule at that time, local information about the location) can be displayed together. When the photo album relates to travel, local information on the travel destination, such as a map and a movement trajectory based on the navigation history, may be displayed as an introduction to the slide show.
- the display order of images in the slide show can be set to the order of shooting time, and when the shooting mode was the Purikura camera, an effect such as changing the slide show frame to resemble a Purikura sticker sheet may be executed.
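A small sketch of planning such a slide show from tag information: sort by shooting time, then pick a frame effect from the shooting mode. The mode-to-frame mapping is an illustrative assumption.

```python
# Hypothetical sketch: order slide-show images by the shooting time in their
# tag information and choose a frame effect from the recorded shooting mode.
def slideshow_plan(images):
    """Return (file, frame) pairs in shooting-time order."""
    ordered = sorted(images, key=lambda img: img["time"])
    frames = {"purikura_camera": "purikura_note", "art_camera": "sepia"}
    return [(img["file"], frames.get(img["mode"], "plain")) for img in ordered]

plan = slideshow_plan([
    {"file": "b.jpg", "time": "2009-10-20 16:00", "mode": "purikura_camera"},
    {"file": "a.jpg", "time": "2009-10-20 09:00", "mode": "landscape"},
])
```

ISO-style time strings sort lexicographically in chronological order, so no date parsing is needed for the ordering itself.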
- the display control unit 112 has a function of displaying the attribute information (tag information) of the data stored in the data storage unit 122 as face stickers 501a to 501c; for a person appearing in an image, a face sticker is displayed as an icon indicating that person.
- the association between tag information and face stickers can be based on the address book data, for example: using a person registered in the address book as the reference, photographed images and face stickers are recorded in association with that person.
- the display control unit 112 also has a function of searching for and reading out the data stored in the data storage unit 122; on the GUI, the corresponding data can be read out by a selection operation on a calendar UI or a mail browsing UI (FIGS. 10C and 10D).
- the shooting mode setting unit 104 can select an operation mode that includes the face portion detection processing by the face detection unit 109 and the image data conversion processing by the face sticker generation function of the image composition unit 111a.
- the operation history recording unit 123 records, as part of the operation history, the history of operation modes including the detection processing and conversion processing selected by the shooting mode setting unit 104.
- FIG. 11 is a flowchart showing the operation of the image capturing function according to the present embodiment.
- the position information and current time information of the device itself are acquired periodically by loop processing (S301 and S302), and the schedule information D2 is consulted to determine whether the device is at a scheduled location/time, or whether the user has voluntarily started an operation at an unscheduled location/time (S303 and S304). As long as the user performs no operation and no scheduled time arrives, the device remains in a standby state ("N" in S303 and S304).
- the schedule checked in step S304 may be the start of a tutorial for the operation lessons described above. That is, the tutorial information 124a stores information on usage examples of shooting modes in association with position information and time information; by adding the position information, time information, and type of scheduled content corresponding to a usage example in the tutorial information 124a to the schedule information as the user's schedule, an operation lesson for an unused shooting mode is scheduled as one of the user's appointments.
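The standby-loop decision (S303/S304) can be sketched as a match against the schedule entries; the proximity threshold and entry fields are illustrative assumptions.

```python
# Hypothetical sketch of the S301-S304 standby loop's decision: polled
# position and time are checked against schedule information D2 to decide
# whether a scheduled shooting (or tutorial lesson) should start.
SCHEDULE_D2 = [
    {"lat": 35.0, "lon": 135.7, "time": 14, "type": "tutorial"},
]

def scheduled_event(lat, lon, hour, radius_deg=0.05):
    """Return the matching schedule entry, or None to stay in standby."""
    for entry in SCHEDULE_D2:
        near = (abs(lat - entry["lat"]) <= radius_deg and
                abs(lon - entry["lon"]) <= radius_deg)
        if near and hour == entry["time"]:
            return entry
    return None

hit = scheduled_event(35.0, 135.7, 14)   # at the scheduled place and time
miss = scheduled_event(35.0, 135.7, 9)   # right place, wrong time -> standby
```

Returning `None` corresponds to the "N" branch of S303/S304: the loop simply polls again.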
- if the user performs a voluntary shooting start operation before the scheduled time arrives ("Y" in S303), shooting is performed in the shooting mode selected by the user (S305 and S307), and the shooting mode used is added to the log data D3 (S308).
- table data associating position information and time information with shooting modes is stored in advance in the shooting time setting storage unit 103 as shooting mode settings; in the shooting mode setting step (S305), the mode setting table T1 is looked up according to the shooting time information acquired in the shooting time information acquisition step (S301), and the retrieved shooting mode is set for the imaging unit 106.
- if instead the scheduled time arrives ("Y" in S304), a shooting mode is automatically selected to prompt the shooting operation (S306); when the shooting operation is performed (S307), the shooting mode used is added to the log data D3 (S308).
- the schedule information includes position information, time information, and the types of scheduled content related to the user's schedule; when the mode setting is referred to in the shooting mode setting step (S306), the shooting mode for the imaging unit is set based on the position information, time information, and scheduled content type included in the schedule information D2.
- the schedule information D2 describes the details of each appointment, such as "birthday" or "overseas travel"; by using the type of appointment as a keyword, a more accurate shooting mode can be set.
- the image data D1 is sorted according to the attribute information including the shooting location / time and the shooting mode, and stored in the folders 122a to 122c of the data storage unit 122.
- the image data D1 stored in the folders 122a to 122c can be displayed as a list on the calendar UI by displaying icons according to the attribute information of the images.
- with the current date as the boundary, the portion of the calendar before the current date serves as a list display of the image data D1 stored in the data storage unit 122, and the portion after the current date serves as a display of the schedule information D2 stored in the schedule storage unit 121.
- the calendar and schedule may be arranged in a spiral with each calendar date as an icon 601, and the display range may be changed by moving the spiral display forward and backward with a rotation operation by the user.
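The past-is-diary / future-is-planner split can be sketched as a single dispatch on the date; the date-string format and field names are illustrative assumptions.

```python
# Hypothetical sketch of the calendar UI split: dates before today resolve
# to stored images (a photo diary), later dates resolve to schedule entries.
def calendar_cell(date, today, images_by_date, schedule_by_date):
    """Return what the calendar cell for `date` should display."""
    # ISO date strings compare lexicographically in chronological order.
    if date < today:
        return ("images", images_by_date.get(date, []))
    return ("schedule", schedule_by_date.get(date, []))

images = {"2009-10-18": ["IMG_001.jpg"]}
plans = {"2009-10-25": ["trip to Kyoto"]}
past = calendar_cell("2009-10-18", "2009-10-20", images, plans)
future = calendar_cell("2009-10-25", "2009-10-20", images, plans)
```

Because both halves come from the same dispatch, one calendar widget serves as diary and planner at once, which is the space-saving point made later in the text.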
- a face sticker, in which the face portion of a person has been converted into image data of a specific shape, can be generated, and the face sticker can be freely pasted in the calendar UI as an icon image.
- the face sticker can be associated with image tag information, the address book, schedule information (location, participants, event content), and the like.
- the tag information can include the recognized face of a person shown in the image and the features of that face.
- the facial features can be recorded, for example, by identifying the person with reference to the face stickers registered in the address book and associating the features with that address book entry.
- an expression such as a smile may be recognized, and the expression may be included in the tag information.
- images taken with smile recognition can be classified by tag information “smile”, and an album in which smile images are collected can be automatically generated.
- FIG. 12 is a flowchart showing the operation of the image composition unit 111a according to this embodiment.
- an image is taken by the imaging unit 106 (S401).
- alternatively, an “automatic shooting mode” in which the imaging unit 106 constantly monitors the finder image may be executed: the facial expression recognition unit 109a performs facial expression recognition on the finder image captured by the imaging unit 106 to recognize a predetermined expression such as a smile, and when a smile is recognized, the shooting process is executed automatically.
- the image data captured by the imaging unit 106 is recorded as a basic image in the data storage unit 122 (S402), and a face is detected in the basic image (S403).
- the face detection unit 109 calculates the coordinate position of the detected face part in the basic image.
- the position information of the device itself at the time of shooting is acquired and recorded, and this position information is transmitted to the content server 3 (S404).
- the content server 3 that has received this position information (S501) searches the regional image storage database, based on the received shooting-time position information, for regional image data associated with that position (S502), selects it, and returns the image (S503).
- the distribution of regional image data by the content server 3 can be omitted when the regional image data is preinstalled and stored in the portable terminal 1. Accordingly, the processing of steps S404 and S501 to S503 may be executed only when the portable terminal 1 has searched for the content data in its own device and found that it is not stored there.
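This local-first shortcut can be sketched as a two-level lookup; the region keys, file names, and in-memory stand-ins for the terminal and server stores are illustrative assumptions.

```python
# Hypothetical sketch of the S404/S501-S503 shortcut: regional image data is
# looked up on the device first, and the content server is queried only when
# nothing matching the shooting position is preinstalled.
LOCAL_DATA = {"kyoto": "kyoto_frame.png"}           # preinstalled on terminal
SERVER_DATA = {"kyoto": "kyoto_frame.png",
               "nara": "nara_frame.png"}            # content server 3 stand-in

server_requests = []   # records which lookups actually hit the server

def fetch_regional_image(region):
    """Prefer device-local data; fall back to the content server."""
    if region in LOCAL_DATA:
        return LOCAL_DATA[region]
    server_requests.append(region)                  # steps S404 / S501-S503
    return SERVER_DATA.get(region)

local_hit = fetch_regional_image("kyoto")
server_hit = fetch_regional_image("nara")
```

The request log shows the server is only contacted for the cache miss, which is the traffic saving the paragraph describes.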
- in addition to the regional image data above, the content server 3 may distribute “recommended information” ranking topics related to the region (sightseeing spots, local specialties, store information, other news) and map data; these pieces of information can be used for the tutorial described above, for photo albums of captured images, for presentation of a slide show during image reproduction, and the like.
- the camera side, having received the regional image data or read out and acquired the data stored in its own device (S405), composites the acquired regional image data D12 with the basic image D11 (S406).
- the image synthesis unit 111a synthesizes the regional image data D12 with the basic image D11 based on the coordinate position of the face part detected by the face detection unit 109.
- This composite editing operation can be performed using a GUI as shown in FIG.
- the basic image D11 being shot (or edited) is displayed on the GUI, a plurality of icons 1162 of the acquired regional image data are listed in the icon area 1161, and the regional image data D12 to be composited can be selected by clicking one of these icons 1162 with the pointer 1163.
- the image data thus captured or edited and stored can be subjected to other editing operations such as photo-retouch processing (S407). When no further editing is performed (“N” in S408), the editing operation is terminated and the edited image data is saved (S409).
- the image data is stored in the data storage unit 122 in association with the position information.
- the position information of the accumulated image is generated as index information (tag information) (S410) and displayed as an icon on the GUI; by selecting an icon, the corresponding image and its related data or programs can be retrieved from the data storage unit 122 and read out or activated.
- the face stickers generated in this way can be read out and viewed on the GUI by selecting an index icon with a selection operation such as a touch operation.
- since the calendar UI uses the current date as a boundary, displaying the dates before the current date as a list of the image data D1 stored in the image data storage unit and the dates after the current date as the schedule information stored in the schedule storage unit, the past portion can function as a day-by-day diary of image data D1, while the future portion can describe and display appointments, functioning as a notebook-style scheduler. Because both are shown on a single calendar divided at the current date, the diary and notebook GUIs are combined into one, improving visibility, and the limited display area can be used effectively.
- since the face portion of a person in an image taken by the user can be detected and stored as a face sticker usable for icons and the like, each user can put the face portions of people to various uses. Furthermore, since the shooting modes including the detection processing and conversion processing are also stored in the shooting time setting storage unit, automatic setting at shooting time and tutorial scheduling for unused modes are both possible.
- since the position information and time information at the time of shooting, based on the operation history of the functional modules, can be collected on the server, that information can be analyzed and usage examples of unused functional modules can be configured for each user, making it possible to present usage examples that are more appropriate for each user.
- since the operation history analysis and the content data for tutorials are held on the server, the processing load on the user terminal can be reduced and its memory capacity used effectively.
Landscapes
- Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Software Systems (AREA)
- Theoretical Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Studio Devices (AREA)
- Telephone Function (AREA)
- Telephonic Communication Services (AREA)
Abstract
Description
The method comprises:
(1) an operation history recording step of recording the operation histories of the functional modules;
(2) an operation history analysis step of analyzing the operation histories and extracting unused functional modules;
(3) a tutorial acquisition step of, based on an unused functional module extracted in the operation history analysis step, referring to the tutorial information and acquiring a usage example of that unused functional module;
(4) a tutorial setting step of adding the position information and time information contained in the tutorial information acquired in the tutorial acquisition step, and the type of scheduled content corresponding to the usage example, to the schedule information as the user's schedule; and
(5) a tutorial start step of notifying usage examples of the plurality of functional modules based on the schedule information.
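The five numbered steps above can be sketched end to end as one control cycle; the module names, tutorial entries, and notification text below are illustrative assumptions, not the patent's implementation.

```python
# Hypothetical end-to-end sketch of steps (1)-(5): record the operation
# history, extract unused modules, look up their tutorial entries, add them
# to the schedule, and emit a notification when a scheduled lesson is due.
ALL_MODULES = {"camera", "purikura", "blog_upload"}
TUTORIALS = {"purikura": {"place": "arcade", "time": 2},
             "blog_upload": {"place": "home", "time": 5}}

def control_cycle(history, now):
    unused = ALL_MODULES - set(history)                     # step (2)
    schedule = [{"module": m, **TUTORIALS[m]}               # steps (3) + (4)
                for m in sorted(unused) if m in TUTORIALS]
    due = [e for e in schedule if e["time"] == now]         # step (5)
    return schedule, [f"Try the {e['module']} function" for e in due]

# `history` plays the role of step (1)'s recorded operation history.
schedule, notices = control_cycle(history=["camera"], now=2)
```

Only the lesson whose scheduled time matches the current time produces a notification; the rest stay in the schedule for later cycles.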
By operating the schedule function and the tutorial function having the above configuration, the operation control method of the present invention can be carried out. FIG. 4 is a flowchart showing the operation of the schedule function and the tutorial function according to this embodiment.
The mobile terminal 1 has a large number of functional modules, one of which is an image shooting function. FIG. 3(b) is a block diagram showing the image shooting modules.
By operating the image shooting function having the above configuration, the image shooting method according to this embodiment can be carried out. FIG. 11 is a flowchart showing the operation of the image shooting function according to this embodiment.
Furthermore, the image composition unit 111a described above has a function of downloading, through data exchange with the content server 3, additional data specific to a particular region and compositing that region-specific image data with the face sticker image data. FIG. 12 is a flowchart showing the operation of the image composition unit 111a according to this embodiment.
In step S401, the image data captured by the imaging unit 106 is recorded in the data storage unit 122 as a basic image (S402), and face detection is performed within the basic image (S403). In this face detection, the face detection unit 109 calculates the coordinate position of the detected face portion in the basic image.
According to the embodiment described above, guidance and operation instructions for unused functions, which differ from user to user, are added to the schedule information D2 as information on usage examples, so operation lessons for functions the user did not know about or did not understand can be scheduled as the user's own appointments. As a result, the tutorial mode can be started automatically in an appropriate time slot, prompting the user to make use of functions that had gone unused.
In the above embodiment, the recording and analysis of operation histories and the storage and selection of tutorial information are performed by modules provided in the user terminal, but the present invention is not limited to this; the tutorial storage unit 124 and the operation history analysis unit 114b may be provided on the content server 3 installed on the Internet 2. In this case, the tutorial acquisition unit 114a on the user terminal side may acquire tutorials (usage examples) for unused functional modules based on the operation history via the Internet 2.
D11…basic image
D12…regional image data
D2…schedule information
D3…log data
T1…mode setting table
1…mobile terminal
2…Internet
3…content server
21…satellite
22…radio base station
31…regional image storage unit
102…shooting time information acquisition unit
102a…position information acquisition unit
102b…clock unit
103…shooting time setting storage unit
104…shooting mode setting unit
105…additional information processing unit
106…imaging unit
107…shutter control unit
107b…face portion
108…operation signal acquisition unit
109…face detection unit
109a…facial expression recognition unit
110…editing processing unit
110a…e-mail generation unit
111…editing data generation unit
111a…image composition unit
112…display control unit
113…image display unit
114…tutorial setting unit
114a…tutorial acquisition unit
114b…operation history analysis unit
116…operation device
121…schedule storage unit
122…data storage unit
122a to 122c…folders
123…operation history recording unit
124…tutorial storage unit
124a…tutorial information
Claims (12)
- An operation control system in an information processing device having a plurality of functional modules, comprising:
a clock unit that acquires current time information;
an operation history recording unit that records operation histories of the plurality of functional modules;
a tutorial storage unit that stores information on usage examples of each of the plurality of functional modules as tutorial information in association with time information;
an operation history analysis unit that analyzes the operation histories and extracts unused functional modules;
a tutorial acquisition unit that, based on an unused functional module extracted by the operation history analysis unit, refers to the tutorial information and acquires a usage example of that unused functional module;
a schedule storage unit that stores, as schedule information, time information on the user's schedule and types of scheduled content;
a tutorial setting unit that adds the time information contained in the tutorial information acquired by the tutorial acquisition unit, and the type of scheduled content corresponding to the usage example, to the schedule information as the user's schedule; and
a guide unit that notifies usage examples of the plurality of functional modules based on the schedule information.
- The operation control system of an information processing device according to claim 1, further comprising a position information acquisition unit that acquires position information indicating the position of the device itself, wherein
the tutorial storage unit stores information on usage examples of each of the plurality of functional modules as tutorial information in association with the time information and the position information, and
the schedule storage unit stores, as schedule information, time information on the user's schedule, the position information, and the type of scheduled content.
- The operation control system of an information processing device according to claim 1, wherein
the tutorial storage unit and the operation history analysis unit are provided on a server installed on a communication network, and
the tutorial acquisition unit has a communication unit that acquires, via the communication network, usage examples of the unused functional modules based on the operation history.
- The operation control system of an information processing device according to claim 1, further comprising a display control unit having a calendar display function including the current date, wherein
the calendar display function, with the current date as a boundary, displays the portion before the current date as a list of the image data stored in the image data storage unit, and displays the portion after the current date as the schedule information stored in the schedule storage unit.
- The operation control system of an information processing device according to claim 1, wherein the plurality of functional modules include an imaging unit that captures images, the system further comprising:
a shooting mode setting unit that sets an operation mode for shooting by the imaging unit;
a shooting time information acquisition unit that acquires, at the time of shooting, position information indicating the position of the device itself and current time information as shooting time information;
an additional information processing unit that adds, to the image data captured by the imaging unit, the operation mode setting at the time the image data was shot and the shooting time information, as attribute information on the content of each image data;
an image data storage unit that stores the image data together with the attribute information added by the additional information processing unit; and
a display control unit that retrieves and displays the image data stored in the image data storage unit according to the attribute information of each image,
the operation mode for the shooting being such that
the operation history recording unit records, as part of the operation history, a history of the operation modes set by the shooting mode setting unit.
- The operation control system of an information processing device according to claim 1, further comprising:
a face detection unit that detects a face portion of a person and calculates the coordinate position of the detected face portion in the basic image; and
a face sticker generation unit that converts the face portion detected by the face detection unit into image data of a specific shape, wherein
the plurality of functional modules include the face detection unit and the face sticker generation unit,
the shooting mode setting unit can select an operation mode including the face portion detection processing by the face detection unit and the image data conversion processing by the face sticker generation unit, and
the operation history recording unit records, as part of the operation history, a history of operation modes including the detection processing and conversion processing selected by the shooting mode setting unit.
- An operation control method in an information processing device having a plurality of functional modules, comprising:
storing, in advance, information on usage examples of the functional modules in a tutorial storage unit as tutorial information in association with position information and time information;
an operation history recording step of recording the operation histories of the functional modules;
an operation history analysis step of analyzing the operation histories and extracting unused functional modules;
a tutorial acquisition step of, based on an unused functional module extracted in the operation history analysis step, referring to the tutorial information and acquiring a usage example of that unused functional module;
a tutorial setting step of adding the position information and time information contained in the tutorial information acquired in the tutorial acquisition step, and the type of scheduled content corresponding to the usage example, to the schedule information as the user's schedule; and
a tutorial start step of notifying usage examples of the plurality of functional modules based on the schedule information.
- The operation control method of an information processing device according to claim 7, wherein
the tutorial storage unit stores information on usage examples of each of the plurality of functional modules as tutorial information in association with the time information and the position information of the device itself, and
the tutorial setting step adds time information on the user's schedule, the position information, and the type of scheduled content as schedule information.
- The operation control method of an information processing device according to claim 7, wherein
the tutorial storage unit is provided on a server installed on a communication network, and
the operation history analysis step or the tutorial acquisition step acquires, from the server via the communication network, usage examples of the unused functional modules based on the operation history.
- The operation control method of an information processing device according to claim 7, further comprising a display control step of displaying the schedule information as a list, wherein
the display control step displays a calendar including the current date, and
in the calendar display, with the current date as a boundary, the portion before the current date is displayed as a list of the image data stored in the image data storage unit, and the portion after the current date is displayed as the schedule information stored in the schedule storage unit.
- The operation control method of an information processing device according to claim 7, wherein the plurality of functional modules include an imaging unit that captures images, the method comprising:
a shooting mode setting step of setting a shooting mode for shooting by the imaging unit;
a shooting time information acquisition step of acquiring, at the time of shooting, position information indicating the position of the device itself and current time information as shooting time information;
an imaging step of capturing an image;
an additional information processing step of adding, to the image data captured in the imaging step, the shooting mode setting at the time the image data was shot and the shooting time information, as attribute information on the content of each image data;
an image data storage step of storing the image data in an image data storage unit together with the attribute information added in the additional information processing step; and
a display control step of retrieving and displaying the image data stored in the image data storage unit according to the attribute information of each image.
- The operation control method of an information processing device according to claim 11, wherein,
prior to the image data storage step,
a face portion of a person is detected and the coordinate position of the detected face portion in the basic image is calculated, and
the detected face portion is converted into image data of a specific shape;
the shooting mode setting step can select a shooting mode including the face portion detection processing and the image data conversion processing; and
the operation history recording step records, as part of the operation history, a history of shooting modes including the detection processing and conversion processing selected in the shooting mode setting step.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/125,002 US20110200980A1 (en) | 2008-10-20 | 2009-10-20 | Information processing device operation control system and operation control method |
JP2010534820A JP5611829B2 (ja) | 2008-10-20 | 2009-10-20 | 情報処理装置の動作制御システム及び動作制御方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008270369 | 2008-10-20 | ||
JP2008-270369 | 2008-10-20 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010047337A1 true WO2010047337A1 (ja) | 2010-04-29 |
Family
ID=42119374
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/068077 WO2010047337A1 (ja) | 2008-10-20 | 2009-10-20 | Operation control system and operation control method for an information processing device |
Country Status (3)
Country | Link |
---|---|
US (1) | US20110200980A1 (ja) |
JP (1) | JP5611829B2 (ja) |
WO (1) | WO2010047337A1 (ja) |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8856656B2 (en) * | 2010-03-17 | 2014-10-07 | Cyberlink Corp. | Systems and methods for customizing photo presentations |
US8988456B2 (en) * | 2010-03-25 | 2015-03-24 | Apple Inc. | Generating digital media presentation layouts dynamically based on image features |
US20110296304A1 (en) * | 2010-05-27 | 2011-12-01 | Palm, Inc. | Adaptive Gesture Tutorial |
WO2011151709A2 (en) * | 2010-06-01 | 2011-12-08 | Young-Joo Song | Electronic multimedia publishing systems and methods |
US8584015B2 (en) * | 2010-10-19 | 2013-11-12 | Apple Inc. | Presenting media content items using geographical data |
US8654148B2 (en) * | 2010-12-23 | 2014-02-18 | Sony Corporation | Display control apparatus for deciding a retrieval range for displaying stored pieces of information |
TWI476587B (zh) * | 2011-12-01 | 2015-03-11 | Mstar Semiconductor Inc | 測試電子裝置之功能的測試方法以及測試裝置 |
US8872898B2 (en) * | 2011-12-14 | 2014-10-28 | Ebay Inc. | Mobile device capture and display of multiple-angle imagery of physical objects |
KR102053901B1 (ko) * | 2012-08-16 | 2019-12-09 | 삼성전자주식회사 | Schedule management method, schedule management server, and mobile terminal therefor |
KR102223745B1 (ko) * | 2012-10-29 | 2021-03-08 | 울리히 세우데 | Method for displaying and navigating calendar events in a computer with a graphical user interface |
US9569287B1 (en) * | 2013-03-14 | 2017-02-14 | Dell Software Inc. | System and method for interactive tutorials |
KR102327779B1 (ko) * | 2014-02-21 | 2021-11-18 | 삼성전자주식회사 | Image processing method and apparatus |
US10757159B2 (en) | 2014-07-25 | 2020-08-25 | Gracenote Digital Ventures, Llc | Retrieval and playout of media content |
TWI684918B (zh) * | 2018-06-08 | 2020-02-11 | 和碩聯合科技股份有限公司 | 臉部辨識系統以及加強臉部辨識方法 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008009573A (ja) * | 2006-06-28 | 2008-01-17 | Hitachi Software Eng Co Ltd | Proposal-type operation support system and program |
JP2008067253A (ja) * | 2006-09-11 | 2008-03-21 | Nec Corp | Mobile terminal device |
JP2008215939A (ja) * | 2007-03-01 | 2008-09-18 | Xanavi Informatics Corp | Navigation device and method for activating its functions |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1415176A4 (en) * | 2001-03-19 | 2007-08-22 | Accenture Llp | VALET MOBILE |
US8156128B2 (en) * | 2005-09-14 | 2012-04-10 | Jumptap, Inc. | Contextual mobile content placement on a mobile communication facility |
EP2362649A1 (en) * | 2010-02-16 | 2011-08-31 | Axel Springer Digital TV Guide GmbH | Adaptive placement of auxiliary media in recommender systems |
US8811977B2 (en) * | 2010-05-06 | 2014-08-19 | At&T Mobility Ii Llc | Device-driven intelligence and feedback for performance optimization and planning of a service network |
- 2009-10-20 WO PCT/JP2009/068077 patent/WO2010047337A1/ja active Application Filing
- 2009-10-20 JP JP2010534820A patent/JP5611829B2/ja active Active
- 2009-10-20 US US13/125,002 patent/US20110200980A1/en not_active Abandoned
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012173961A (ja) * | 2011-02-21 | 2012-09-10 | Toshiba Tec Corp | Training device, program, and training system |
US10740057B2 (en) | 2011-06-13 | 2020-08-11 | Sony Corporation | Information processing device, information processing method, and computer program |
JP2016123083A (ja) * | 2014-12-24 | 2016-07-07 | キヤノンマーケティングジャパン株式会社 | Information processing terminal, control method, and program |
JP2016184416A (ja) * | 2016-05-20 | 2016-10-20 | ソニー株式会社 | Information processing device, information processing method, and storage medium |
CN111936970A (zh) * | 2018-03-20 | 2020-11-13 | 微软技术许可有限责任公司 | Cross-application feature linking and educational messaging |
CN111936970B (zh) * | 2018-03-20 | 2024-03-15 | 微软技术许可有限责任公司 | Cross-application feature linking and educational messaging |
JP2020052948A (ja) * | 2018-09-28 | 2020-04-02 | 富士フイルム株式会社 | Image processing device, image processing method, program, and recording medium |
US11176378B2 (en) | 2018-09-28 | 2021-11-16 | Fujifilm Corporation | Image processing device, image processing method, program, and recording medium |
JP7171349B2 (ja) | 2018-09-28 | 2022-11-15 | 富士フイルム株式会社 | Image processing device, image processing method, program, and recording medium |
JP2020204987A (ja) * | 2019-06-19 | 2020-12-24 | カシオ計算機株式会社 | Installation system, server device, user-side device, and installation method |
JP7302322B2 (ja) | 2019-06-19 | 2023-07-04 | カシオ計算機株式会社 | Installation system, server device, user-side device, and installation method |
Also Published As
Publication number | Publication date |
---|---|
US20110200980A1 (en) | 2011-08-18 |
JP5611829B2 (ja) | 2014-10-22 |
JPWO2010047337A1 (ja) | 2012-03-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5611829B2 (ja) | Operation control system and operation control method for an information processing device | |
WO2010047336A1 (ja) | Image capturing system and image capturing method | |
US7734654B2 (en) | Method and system for linking digital pictures to electronic documents | |
CN100476818C (zh) | 基于元数据搜索和命名条目 | |
US8279173B2 (en) | User interface for selecting a photo tag | |
US9076124B2 (en) | Method and apparatus for organizing and consolidating portable device functionality | |
KR20150033308A (ko) | 이동 단말기 및 그 제어 방법 | |
EP3528140A1 (en) | Picture processing method, device, electronic device and graphic user interface | |
EP1990744B1 (en) | User interface for editing photo tags | |
CA2630947C (en) | User interface for selecting a photo tag | |
KR101871779B1 (ko) | 사진 촬영 및 관리 어플리케이션을 구비한 단말기 | |
EP2711853B1 (en) | Methods and systems for media file management | |
JP6967496B2 (ja) | Image processing device, image processing method, and image processing program |
JP5727542B2 (ja) | Terminal, method of using the terminal, notebook application, and refill |
JP2003204506A (ja) | Image input device |
JP2004240579A (ja) | Image server and image server control program |
Micheletti | iPhone Photography and Video For Dummies |
JP5601125B2 (ja) | Editor program, editor screen display method, and information processing device equipped with the editor program |
KR20220105958A (ko) | Diary automation system and automatic diary generation method |
KR20140031436A (ko) | Method for providing a bookmark-based book content and memo management service using a smart device |
JP2005352923A (ja) | Information storage device and information storage method |
Miser | IPhoto 11 Portable Genius |
JP2020038531A (ja) | Document creation support device, document creation support system, and program |
JP2005352922A (ja) | Information storage device and information storage method |
Legal Events

Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 09822034; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2010534820; Country of ref document: JP; Kind code of ref document: A |
| WWE | Wipo information: entry into national phase | Ref document number: 13125002; Country of ref document: US |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 09822034; Country of ref document: EP; Kind code of ref document: A1 |