US11200651B2 - Image processing apparatus, image processing method, and image processing program - Google Patents

Image processing apparatus, image processing method, and image processing program

Info

Publication number
US11200651B2
Authority
US
United States
Prior art keywords
image
remaining
images
attribute
group
Prior art date
Legal status
Active, expires
Application number
US16/580,861
Other versions
US20200104986A1 (en)
Inventor
Tetsuya Matsumoto
Current Assignee
Fujifilm Corp
Original Assignee
Fujifilm Corp
Priority date
Filing date
Publication date
Application filed by Fujifilm Corp
Assigned to FUJIFILM CORPORATION. Assignors: MATSUMOTO, TETSUYA (assignment of assignors interest; see document for details).
Publication of US20200104986A1
Application granted
Publication of US11200651B2

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 - 2D [Two Dimensional] image generation
    • G06T11/60 - Editing figures and text; Combining figures or text
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M1/00 - Substation equipment, e.g. for use by subscribers
    • H04M1/72 - Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 - User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 - User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/7243 - User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M1/72439 - User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
    • G06K9/00664
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 - Commerce
    • G06Q30/06 - Buying, selling or leasing transactions
    • G06Q30/0601 - Electronic shopping [e-shopping]
    • G06Q30/0633 - Lists, e.g. purchase orders, compilation or processing
    • G06Q30/0635 - Processing of requisition or of purchase orders
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 - General purpose image data processing
    • G06T1/20 - Processor architectures; Processor configuration, e.g. pipelining
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 - General purpose image data processing
    • G06T1/60 - Memory management
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G06T5/50 - Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/20 - Image preprocessing
    • G06V10/25 - Determination of region of interest [ROI] or a volume of interest [VOI]
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/10 - Terrestrial scenes
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M1/00 - Substation equipment, e.g. for use by subscribers
    • H04M1/72 - Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 - User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448 - User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M1/00 - Substation equipment, e.g. for use by subscribers
    • H04M1/72 - Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 - User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72469 - User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M1/00 - Substation equipment, e.g. for use by subscribers
    • H04M1/72 - Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 - User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72469 - User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
    • H04M1/72472 - User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons wherein the items are sorted according to specific criteria, e.g. frequency of use
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 - Indexing scheme for image data processing or generation, in general
    • G06T2200/24 - Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 - Indexing scheme for image data processing or generation, in general
    • G06T2200/28 - Indexing scheme for image data processing or generation, in general involving image processing hardware

Definitions

  • The invention relates to an image processing apparatus, an image processing method, and an image processing program.
  • In JP2017-010251A, images of high importance to the user are extracted, so no consideration is given to extracting an image not recognized as important to the user.
  • In JP2017-059124A, images valuable to the user are extracted, so an image recognized as not valuable to the user cannot be extracted.
  • In JP2010-067186A, images highly satisfactory for the user are extracted, so an image recognized as less satisfactory for the user is not extracted.
  • An object of the invention is to unexpectedly provide the user with an enjoyable image.
  • An image processing apparatus comprises a first detection device (first detection means) for detecting, from a first image group, a first remaining image excluding one or a plurality of first images selected from the first image group, a second detection device (second detection means) for detecting, from a second image group, a second remaining image excluding one or a plurality of second images selected from the second image group, and an image output device (image output means) for outputting an image having an attribute in which a frequency for the attribute contained in the image is equal to or less than a first threshold, from the first remaining image and the second remaining image.
  • the invention also provides an image processing method suitable for the image processing apparatus. That is, the method comprises detecting, from a first image group, a first remaining image excluding one or a plurality of first images selected from the first image group by a first detection device (first detection means), detecting, from a second image group, a second remaining image excluding one or a plurality of second images selected from the second image group by a second detection device (second detection means), and outputting an image having an attribute in which a frequency for the attribute contained in the image is equal to or less than a first threshold, from the first remaining image and the second remaining image by an image output device (image output means).
  • the invention also provides a program controlling a computer of an image processing apparatus and a recording medium (portable recording medium) storing the program.
  • the image processing apparatus may include a processor, and the processor may detect, from a first image group, a first remaining image excluding one or a plurality of first images selected from the first image group, detect, from a second image group, a second remaining image excluding one or a plurality of second images selected from the second image group, and output an image having an attribute in which a frequency for the attribute contained in the image is equal to or less than a first threshold, from the first remaining image and the second remaining image.
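Although the apparatus is described in means-plus-function terms, the underlying data flow is simple: subtract the selected images from each group, then keep the remaining images whose attributes are all rare. The following is a minimal, hypothetical Python sketch of that flow, not the patented implementation; the `Image` record, the toy attribute labels, and the 0.5 threshold are invented for illustration, and real attribute extraction (subject detection, metadata parsing) is out of scope.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass(frozen=True)
class Image:
    name: str
    attributes: frozenset  # e.g. frozenset({"grandmother"}); hypothetical labels

def detect_remaining(image_group, selected):
    """First/second detection device: the images that were not selected."""
    return [img for img in image_group if img not in selected]

def output_low_frequency(remaining, first_threshold):
    """Image output device: keep images whose every attribute has
    frequency m/n <= first_threshold among the remaining images."""
    n = len(remaining)
    counts = Counter(a for img in remaining for a in img.attributes)
    return [img for img in remaining
            if img.attributes
            and all(counts[a] / n <= first_threshold for a in img.attributes)]

# Toy data standing in for two monthly groups and the user's picks.
group1 = [Image("I1", frozenset({"child"})),
          Image("I12", frozenset({"grandmother"})),
          Image("I13", frozenset({"park"}))]
group2 = [Image("I101", frozenset({"child"})),
          Image("I112", frozenset({"grandmother"}))]
remaining = (detect_remaining(group1, {group1[0]})
             + detect_remaining(group2, {group2[0]}))
print(output_low_frequency(remaining, first_threshold=0.5))  # -> [I13]
```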
  • the image output device outputs an image having an attribute in which the frequency for the attribute contained in the image is equal to or less than the first threshold and equal to or more than a second threshold.
  • the image output device may output an image having an attribute in which the frequency for the attribute contained in the image is equal to or less than a threshold, from each of the first remaining image and the second remaining image.
  • the image output device may output an image of which a similarity to the first image and the second image is equal to or less than a threshold, as the image having an attribute in which the frequency for the attribute contained in the image is equal to or less than the first threshold, from the first remaining image and the second remaining image.
  • the frequency for the attribute contained in the image may be an appearance frequency of a main subject included in the image.
  • the main subject may belong to at least one of a person or an article.
  • the frequency for the attribute contained in the image may be the number of images captured on a given imaging date, at a given imaging location, by a given photographer, or of a given imaging target.
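Counts of that kind come straight from image metadata once it is parsed. A small hedged sketch; the dict fields and values below are hypothetical stand-ins for what EXIF parsing would produce:

```python
from collections import Counter

# Parsed metadata per image (hypothetical field names and values).
meta = [
    {"date": "2017-01-07", "photographer": "user"},
    {"date": "2017-01-07", "photographer": "user"},
    {"date": "2017-01-16", "photographer": "grandfather"},
]
per_date = Counter(m["date"] for m in meta)            # images per imaging date
per_photographer = Counter(m["photographer"] for m in meta)
print(per_date["2017-01-16"], per_photographer["grandfather"])  # 1 1 -> rare
```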
  • the image output device sets, for example, as the attribute contained in the image, a subject excluding a main subject for which a frequency is equal to or more than a third threshold in the first image and the second image, and outputs an image having the attribute in which the frequency for the attribute contained in the image is equal to or less than the first threshold, from the first remaining image and the second remaining image.
  • the image output device may set, for example, as the attribute contained in the image, the subject excluding the main subject for which the frequency is equal to or more than the third threshold in the first image and the second image, and output a predetermined number or more of images having the attribute in which the frequency for the attribute contained in the image is equal to or less than the first threshold, from the first remaining image and the second remaining image.
  • An imaging time period of the first image group and an imaging time period of the second image group may be different from each other.
  • the image processing apparatus may further comprise an image product creation device (image product creation means) for creating an image product using the image output from the image output device.
  • The output image is thus an unexpected, enjoyable image for the user.
  • FIG. 1 illustrates the exterior of a smartphone.
  • FIG. 2 is a block diagram illustrating an electric configuration of the smartphone.
  • FIG. 3 is one example of a home screen.
  • FIG. 4 is one example of the home screen.
  • FIG. 5 is one example of a captured image.
  • FIG. 6 illustrates an overview of an editing system.
  • FIG. 7 is a flowchart illustrating a process procedure of the smartphone.
  • FIG. 8 illustrates an image selected by a user and an image found from remaining images.
  • FIG. 1 illustrates the exterior of a smartphone 1 (as an example of an image processing apparatus) as one embodiment of the image processing apparatus according to the invention.
  • the smartphone 1 illustrated in FIG. 1 includes a casing 2 having a flat plate shape and comprises a display and input unit 20 in which a display panel 21 as a display unit and an operation panel 22 as an input unit are integrated on one surface of the casing 2 .
  • the casing 2 comprises a microphone 32 , a speaker 31 , an operation unit 40 , and a camera unit 41 .
  • the configuration of the casing 2 is not limited thereto. For example, a configuration in which the display unit and the input unit are independent of each other can be employed, or a configuration having a folded structure or a sliding mechanism can be employed.
  • FIG. 2 is a block diagram illustrating a configuration of the smartphone 1 illustrated in FIG. 1 .
  • main constituents of the smartphone comprise a wireless communication unit 10 , the display and input unit 20 , a call unit 30 , the operation unit 40 , the camera unit 41 , a storage unit 50 , an external input-output unit 60 , a global positioning system (GPS) reception unit 70 , a motion sensor unit 80 , a power supply unit 90 , and a main control unit 100 .
  • As a main function, the smartphone 1 has a wireless communication function of performing mobile wireless communication through a base station apparatus BS and a mobile communication network NW.
  • the wireless communication unit 10 performs wireless communication with the base station apparatus BS accommodated in the mobile communication network NW in accordance with an instruction from the main control unit 100 .
  • transmission and reception of various file data such as voice data and image data, electronic mail data, and the like and reception of web data, streaming data, and the like are performed.
  • the display and input unit 20 is a so-called touch panel that visually delivers information to a user by displaying images (still images and motion images), text information, and the like and detects a user operation performed on the displayed information under control of the main control unit 100 .
  • the display and input unit 20 comprises the display panel 21 and the operation panel 22 .
  • the display panel 21 uses a liquid crystal display (LCD), an organic electro-luminescence display (OELD), or the like as a display device.
  • the operation panel 22 is a device that is mounted in a manner enabling visual recognition of an image displayed on a display surface of the display panel 21 and detects one or a plurality of coordinates operated by a finger of the user or a stylus. In a case where the device is operated by the finger of the user or the stylus, a detection signal generated by the operation is output to the main control unit 100 . Next, the main control unit 100 detects an operation position (coordinates) on the display panel 21 based on the received detection signal.
  • the display panel 21 and the operation panel 22 of the smartphone 1 illustrated as one embodiment of the image processing apparatus according to the present invention are integrated to constitute the display and input unit 20 .
  • the operation panel 22 is arranged to completely cover the display panel 21 .
  • the operation panel 22 may have a function of detecting the user operation even in a region outside the display panel 21 .
  • the operation panel 22 may include a detection region for an overlapping part in overlap with the display panel 21 (hereinafter, referred to as a display region) and a detection region for the other peripheral part not in overlap with the display panel 21 (hereinafter, referred to as a non-display region).
  • the size of the display region may completely match the size of the display panel 21 , but both sizes do not necessarily match.
  • the operation panel 22 may include two sensitive regions including the peripheral part and the other inner part.
  • the width of the peripheral part is appropriately designed depending on the size and the like of the casing 2 .
  • a position detection method employed in the operation panel 22 is exemplified by a matrix switch method, a resistive film method, a surface acoustic wave method, an infrared method, an electromagnetic induction method, an electrostatic capacitive method, and the like. Any method can be employed.
  • the call unit 30 comprises the speaker 31 and the microphone 32 .
  • the call unit 30 converts the voice of the user input through the microphone 32 into voice data processible in the main control unit 100 and outputs the voice data to the main control unit 100 , or decodes the voice data received by the wireless communication unit 10 or the external input-output unit 60 and outputs the decoded voice data from the speaker 31 .
  • the speaker 31 can be mounted on the same surface as the surface on which the display and input unit 20 is disposed, and the microphone 32 can be mounted on a side surface of the casing 2 .
  • the operation unit 40 is a hardware key using a key switch or the like and receives an instruction from the user.
  • the operation unit 40 is a push-button type switch that is mounted on a side surface of the casing 2 of the smartphone 1 .
  • In a case where the operation unit 40 is pressed by the finger or the like, the operation unit 40 enters an ON state. In a case where the finger is released, the operation unit 40 enters an OFF state by a restoring force of a spring or the like.
  • the storage unit 50 stores a control program and control data of the main control unit 100 , application software, address data in which a name, a telephone number, and the like of a communication counterpart are associated, data of transmitted and received electronic mails, web data downloaded by web browsing, and downloaded contents data and also temporarily stores streaming data and the like.
  • the storage unit 50 is configured to include an internal storage unit 51 incorporated in the smartphone and an external storage unit 52 including a slot for an attachable and detachable external memory.
  • Each of the internal storage unit 51 and the external storage unit 52 constituting the storage unit 50 is implemented using a storage medium such as a flash memory type memory, a hard disk type memory, a multimedia card micro type memory, a card type memory (for example, a MicroSD (registered trademark) memory), a random access memory (RAM), or a read only memory (ROM).
  • the external input-output unit 60 acts as an interface for all external apparatuses connected to the smartphone 1 and is directly or indirectly connected to other external apparatuses by communication (for example, Universal Serial Bus (USB) and IEEE 1394) or networks (for example, the Internet, a wireless LAN, Bluetooth (registered trademark), radio frequency identification (RFID), Infrared Data Association (IrDA) (registered trademark), Ultra Wideband (UWB) (registered trademark), and ZigBee (registered trademark)).
  • the external apparatuses connected to the smartphone 1 include a wired/wireless headset, a wired/wireless external charger, a wired/wireless data port, a memory card or a subscriber identity module (SIM) card/user identity module (UIM) card connected through a card socket, an external audio and video apparatus connected through an audio and video input/output (I/O) terminal, a wirelessly connected external audio and video apparatus, a smartphone connected in a wired/wireless manner, a personal computer connected in a wired/wireless manner, a PDA connected in a wired/wireless manner, and an earphone connected in a wired/wireless manner.
  • the external input-output unit can deliver data transferred from the external apparatuses to each constituent inside the smartphone 1 or transfer data inside the smartphone 1 to the external apparatuses.
  • the GPS reception unit 70 receives GPS signals transmitted from GPS satellites ST 1 to STn, executes a position measurement calculation process based on the plurality of received GPS signals, and detects a position including the latitude, the longitude, and the altitude of the smartphone 1 in accordance with an instruction from the main control unit 100 .
  • In a case where positional information can be obtained from the wireless communication unit 10 or the external input-output unit 60 (for example, through a wireless LAN), the GPS reception unit 70 can also detect the position using that positional information.
  • the motion sensor unit 80 comprises, for example, a 3-axis acceleration sensor and detects a physical motion of the smartphone 1 in accordance with an instruction from the main control unit 100 . By detecting the physical motion of the smartphone 1 , the movement direction and the acceleration of the smartphone 1 are detected. The detection result is output to the main control unit 100 .
  • the power supply unit 90 supplies power stored in a battery (not illustrated) to each unit of the smartphone 1 in accordance with an instruction from the main control unit 100 .
  • the main control unit 100 comprises a microprocessor, operates in accordance with the control program and the control data stored in the storage unit 50 , and integrally controls each unit of the smartphone 1 .
  • the main control unit 100 has a mobile communication control function of controlling each unit of a communication system and an application processing function in order to perform voice communication and data communication through the wireless communication unit 10 .
  • the application processing function is implemented by operating the main control unit 100 in accordance with the application software stored in the storage unit 50 .
  • the application processing function includes an infrared communication function of performing data communication with an opposing device by controlling the external input-output unit 60 , an electronic mail function of transmitting and receiving electronic mails, and a web browsing function of browsing a web page.
  • the main control unit 100 has an image processing function such as displaying a video on the display and input unit 20 based on image data (data of a still image or a motion image) such as reception data and downloaded streaming data.
  • the image processing function is a function of causing the main control unit 100 to decode the image data, perform image processing on the decoding result, and display the image on the display and input unit 20 .
  • the main control unit 100 executes display control of the display panel 21 and operation detection control for detecting the user operation performed through the operation unit 40 or the operation panel 22 .
  • the main control unit 100 displays an icon for starting the application software or a software key such as a scroll bar or displays a window for composing an electronic mail.
  • the scroll bar refers to a software key for receiving an instruction to move a display part of an image for a large image or the like not accommodated in the display region of the display panel 21 .
  • the main control unit 100 detects the user operation performed through the operation unit 40 , receives an operation performed on the icon or an input of a text string in an input field of the window through the operation panel 22 , or receives a request for scrolling the display image through the scroll bar.
  • the main control unit 100 has a touch panel control function of determining whether the operation position on the operation panel 22 is in the overlapping part (display region) in overlap with the display panel 21 or the other peripheral part (non-display region) not in overlap with the display panel 21 and controlling the sensitive region of the operation panel 22 and the display position of the software key.
  • the main control unit 100 can detect a gesture operation performed on the operation panel 22 and execute a preset function depending on the detected gesture operation.
  • the gesture operation is not a simple touch operation in the related art and means an operation of drawing a trajectory by the finger or the like or specifying a plurality of positions at the same time, or an operation of drawing a trajectory from at least one of the plurality of positions by a combination thereof.
  • the camera unit 41 is a digital camera performing electronic imaging using an imaging element such as a complementary metal oxide semiconductor (CMOS) or a charge-coupled device (CCD).
  • the camera unit 41 converts the image data obtained by imaging into compressed image data in, for example, Joint Photographic Experts Group (JPEG) and records the image data in the storage unit 50 or outputs the image data through the external input-output unit 60 or the wireless communication unit 10 under control of the main control unit 100 .
  • the camera unit 41 is mounted on the same surface as the display and input unit 20 .
  • the mounting position of the camera unit 41 is not limited thereto.
  • the camera unit 41 may be mounted on the rear surface of the display and input unit 20 , or a plurality of camera units 41 may be mounted. In a case where the plurality of camera units 41 are mounted, imaging may be performed by a single camera unit 41 by switching the camera unit 41 performing the imaging, or imaging may be performed using the plurality of camera units 41 at the same time.
  • the camera unit 41 can be used in various functions of the smartphone 1 .
  • the image obtained by the camera unit 41 can be displayed on the display panel 21 , or the image of the camera unit 41 can be used as an operation input of the operation panel 22 .
  • In the detection of the position by the GPS reception unit 70 , the position can also be detected with reference to the image from the camera unit 41 .
  • a determination of the optical axis direction of the camera unit 41 of the smartphone 1 and a determination of the current usage environment can be performed without using the 3-axis acceleration sensor or along with the 3-axis acceleration sensor.
  • the image from the camera unit 41 can also be used in the application software.
  • the image data of the still picture or the motion picture can be recorded in the storage unit 50 or output through the external input-output unit 60 or the wireless communication unit 10 by adding the positional information obtained by the GPS reception unit 70 , voice information (may be text information obtained by performing voice-to-text conversion by the main control unit or the like) obtained by the microphone 32 , attitude information obtained by the motion sensor unit 80 , and the like to the image data.
  • a program obtained through the Internet or the like is installed in advance on the smartphone 1 .
  • the process to be described below starts by starting the program.
  • the program may be stored in a recording medium such as the external storage unit 52 , and the program read from the external storage unit 52 may be installed on the smartphone 1 .
  • a plurality of images are selected monthly by the user from a collection (image group) of multiple images captured every month.
  • a plurality of images may be automatically selected every month.
  • the selected images are printed.
  • FIG. 3 is one example of the home screen displayed on the display panel 21 of the smartphone 1 .
  • Eleven image display regions 129 are formed over almost the entire home screen (the number of image display regions 129 may be less than 11, or 12 or more).
  • An imaging year and month display region 128 is displayed in the upper left portion of the home screen.
  • the imaging year and month display region 128 displays a text string “January” and a text string “2017”.
  • the imaging year and month display region 128 of the home screen after the start of the program displays the year and month corresponding to the time of the start of the program.
  • An imaging year and month specifying region 121 is formed in the upper portion of the home screen.
  • An imaging year display region 122 and a pull-down button 123 are formed in the imaging year and month specifying region 121 .
  • a pull-down menu is shown, and the user can select the desired imaging year.
  • Imaging month specifying regions 124 , 125 , and 126 are formed on the right side of the imaging year display region 122 .
  • the imaging month specifying regions 124 , 125 , and 126 display “December” (December 2016), “January”, and “February”, respectively.
  • the text string “January” displayed in the imaging month specifying region 125 at the center is surrounded by a frame. The frame indicates that “January” is selected as the month in which the images displayed in the image display regions 129 were captured.
  • a search button 127 is formed on the right side of the imaging month specifying region 125 .
  • An image addition region 130 is formed on the lower left side of the image display regions 129 . By touching the image addition region 130 , the number of image display regions 129 displayed on the home screen is increased by one.
  • An order button 131 on which a text string “order” is displayed is displayed in the lower portion of the home screen.
  • the order button 131 is touched in the case of ordering a print of the image.
  • a home button 132 on which a text string “home” is displayed, a sale button 133 on which a text string “sale” is displayed, and a menu button 134 on which a text string “menu” is displayed are formed in the lowermost portion of the home screen.
  • Among the images captured in the year and month corresponding to the time of the start of the program, an image display region 129 is empty while no image is selected and displays an image once the image is selected.
  • FIG. 3 illustrates one example in which eleven images selected from the images captured in January 2017 are displayed in the image display regions 129 .
  • FIG. 4 is also one example of the home screen displayed on the display panel 21 of the smartphone 1 .
  • the image display regions 129 of the home screen illustrated in FIG. 3 display images selected from the images captured in January 2017. Meanwhile, the image display regions 129 of the home screen illustrated in FIG. 4 display images selected from the images captured in February 2017.
  • An image is selected in the same manner from the captured images after March 2017.
  • the user selects the images every month from the multiple images that are captured every month.
  • the image data representing the selected image is transmitted to an order receiving server and the image is printed.
  • the print is mailed to the user, and by simply filing the mailed prints into an album, the user obtains an album recording the growth of the family.
  • the user may select images captured in or around the month of the order.
  • FIG. 5 illustrates the relation between images captured from January 2017 to December 2017 and the selected images.
  • In January 2017, 100 images from an image I 1 to an image I 100 are captured.
  • the 100 images from the image I 1 to the image I 100 are set as an image group IA 1 for January 2017.
  • Eleven images of the image I 1 to the image I 11 are selected from the image I 1 to the image I 100 included in the image group IA 1 for January 2017.
  • the selected images of the image I 1 to the image I 11 are an image IS 1 selected in January 2017.
  • the selected images of the image I 1 to the image I 11 are the images displayed in the image display regions 129 , as illustrated in FIG. 3 .
  • the image I 12 to the image I 100 excluding the selected image IS 1 are remaining images IR 1 for January 2017 among the image I 1 to the image I 100 included in the image group IA 1 for January 2017.
  • the remaining images IR 1 for January 2017 are images not selected from the image I 1 to the image I 100 included in the image group IA 1 for January 2017.
  • In February 2017, 100 images from an image I 101 to an image I 200 are captured.
  • the 100 images from the image I 101 to the image I 200 are set as an image group IA 2 for February 2017.
  • Eleven images of the image I 101 to the image I 111 are selected from the image I 101 to the image I 200 included in the image group IA 2 for February 2017.
  • the selected images of the image I 101 to the image I 111 are an image IS 2 selected in February 2017.
  • the selected images of the image I 101 to the image I 111 are the images displayed in the image display regions 129 , as illustrated in FIG. 4 .
  • the image I 112 to the image I 200 excluding the selected image IS 2 are remaining images IR 2 for February 2017 among the image I 101 to the image I 200 included in the image group IA 2 for February 2017.
  • the remaining images IR 2 for February 2017 are images not selected from the image I 101 to the image I 200 included in the image group IA 2 for February 2017.
  • From March 2017 to November 2017, a large number of images are likewise captured every month, and images are selected every month from the large number of images.
  • In December 2017, 100 images from an image I 1101 to an image I 1200 are captured.
  • the 100 images from the image I 1101 to the image I 1200 are set as an image group IA 12 for December 2017.
  • Eleven images of the image I 1101 to the image I 1111 are selected from the image I 1101 to the image I 1200 included in the image group IA 12 for December 2017.
  • the selected images of the image I 1101 to the image I 1111 are an image IS 12 selected in December 2017.
  • the image I 1112 to the image I 1200 excluding the selected image IS 12 are remaining images IR 12 for December 2017 among the image I 1101 to the image I 1200 included in the image group IA 12 for December 2017.
  • the remaining images IR 12 for December 2017 are images not selected from the image I 1101 to the image I 1200 included in the image group IA 12 for December 2017.
  • FIG. 6 illustrates an overview of an image editing system according to the embodiment.
  • the smartphone 1 and an order receiving server 151 can communicate with each other through the Internet.
  • the order receiving server 151 and the printer 152 can communicate with each other.
  • image data is transmitted from the smartphone 1 to the order receiving server 151
  • the image data received by the order receiving server 151 is transmitted to the printer 152 .
  • An image represented by the image data is printed. The print is mailed to the user.
  • the image data representing the images selected by the user is transmitted to the order receiving server 151 every month, so that print of the images selected by the user is delivered to the user every month.
  • images that the user considers to be uncommon are determined from the images not selected by the user.
  • the image data representing the determined images is transmitted to the order receiving server 151 every month, so that prints except the images selected by the user are delivered to the user every month (the prints may not be delivered every month, and without printing, the images that the user considers to be uncommon may be determined and then the determined images may be notified to the user).
  • prints of images unexpected for the user are delivered, which adds to the enjoyment.
  • FIG. 7 is a flowchart illustrating a process procedure of the smartphone 1 .
  • a selection button for a remaining image mode is displayed so that images can be found from the remaining images.
  • In a case where a period over which to find images (for example, one year from January 2017 to December 2017) is specified, the processing procedure illustrated in FIG. 7 starts. In the embodiment, it is assumed that one year from January 2017 to December 2017 is specified.
  • a first remaining image is detected from a first image group by the main control unit 100 (which is an example of the first detection device) of the smartphone 1 (step 141 ) and a second remaining image is detected from a second image group by the main control unit 100 (which is an example of the second detection device) (step 142 ).
  • the first image group is any of the image groups captured every month during the specified one year from January 2017 to December 2017 (for example, the image group IA 1 for January 2017), and the first image group may be one image group or a plurality of image groups. Assuming that the first image group is the image group IA 1 for January 2017, the first remaining image is the remaining image IR 1 for January 2017.
  • the second image group is an image group different from the first image group (for example, the image group IA 2 for February 2017, although the first image group and the second image group have different imaging time periods, they may have the same imaging time period) among the image groups captured every month for one year from January 2017 to December 2017, and may be one image group or a plurality of image groups.
  • In a case where the second image group is the image group IA 2 for February 2017, the second remaining image is the remaining image IR 2 for February 2017.
  • both the first image group and the second image group may be one or a plurality of image groups.
  • the first image group is the image group IA 1 for January 2017 and the second image group includes 11 image groups from the image group IA 2 for February 2017 to the image group IA 12 for December 2017.
  • the first remaining image IR 1 is detected from the first image group IA 1 and each of the second remaining images IR 2 to IR 12 is detected from each of the second image groups IA 2 to IA 12 .
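Concretely, steps 141 and 142 are set differences computed per monthly group. A hedged sketch of that bookkeeping, using plain string identifiers in place of actual image files; the 100-per-month sizing follows the example in the text:

```python
# Steps 141-142 as set differences over the twelve 2017 groups (IA1..IA12).
groups = {m: {f"I{(m - 1) * 100 + k}" for k in range(1, 101)}
          for m in range(1, 13)}                       # IA1 .. IA12
selected = {m: {f"I{(m - 1) * 100 + k}" for k in range(1, 12)}
            for m in range(1, 13)}                     # IS1 .. IS12 (11 each)

remaining = {m: groups[m] - selected[m] for m in range(1, 13)}  # IR1 .. IR12
assert len(remaining[1]) == 89   # images I12..I100 remain for January
```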
  • the main control unit 100 determines an image having an attribute in which a frequency for the attribute contained in the image is equal to or less than a first threshold (step 143 ).
  • the “attribute contained in the image” refers to various properties of the image derived from the image and for example, the properties include a main subject contained in the image, an imaging date of the image, an imaging location of the image, a photographer of the image, or an imaging target.
  • for example, subject detection is performed on the image, and the main subject is determined from the position, size, and the like of the subject in the image.
  • for example, it is determined whether the subject is present at the center of the image and occupies at least a certain size relative to the image.
  • the main subject includes any of a person or an article.
  • the “frequency for the attribute contained in the image” means a numerical value represented by m/n, where n is the number of images under consideration and m is the number of those images having the attribute (including a value obtained by multiplying or dividing by a certain number, such as a percentage).
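A worked example of that m/n value, under the assumption that attribute labels per image have already been extracted (the labels below are hypothetical):

```python
from collections import Counter

# Attribute labels per remaining image (hypothetical).
attrs = {
    "I12": {"grandmother"},
    "I13": {"child"},
    "I14": {"child"},
    "I15": {"bicycle"},
}
n = len(attrs)                                      # n: number of images considered
m = Counter(a for s in attrs.values() for a in s)   # m: images containing each attribute
freq = {a: m[a] / n for a in m}
print(freq)  # {'grandmother': 0.25, 'child': 0.5, 'bicycle': 0.25}
```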
  • the determined image is displayed on the display panel 21 (which is an example of the image output device) (step 144 ), and the image data representing the determined image is transmitted from the wireless communication unit 10 (which is an example of the image output device) of the smartphone 1 to the order receiving server 151 (refer to FIG. 6 ) (step 145 ).
  • the image data is transmitted from the order receiving server 151 to the printer 152 (which is an example of the image product creation device), and the print (which is an example of an image product) is made. Since the print is mailed to the user, the user can receive a print representing an unexpected image that the user himself or herself would have been unlikely to select. In the above embodiment, the determined image is displayed on the display panel 21 as a form of image output.
  • the form of the output of the determined image may not be a form that can be recognized by the user, such as display on the display panel 21 .
  • even in a case where the image data representing the determined image is merely subjected to internal processing in the main control unit 100 , such as image evaluation processing, the image is considered to be output as long as the image data representing the determined image is obtained.
  • the image may not be displayed on the display panel 21 . In such a case, the user does not recognize the determined image, but the image output itself is performed inside the main control unit 100 .
  • Since the image determined from the first remaining image IR 1 and each of the second remaining images IR 2 to IR 12 is an image having an attribute in which the frequency for the attribute contained in the image is equal to or less than the first threshold, images having an attribute with a high appearance frequency in the first remaining image IR 1 and each of the second remaining images IR 2 to IR 12 are removed, and an image unexpected for the user can be obtained.
  • FIG. 8 illustrates the images IS 1 to IS 12 , which are selected from the image groups IA 1 to IA 12 obtained from January 2017 to December 2017, and the images ID 1 to ID 12 , which are determined by the processing procedure illustrated in FIG. 7 from the remaining images IR 1 to IR 12 for January 2017 to December 2017.
  • the image ID 1 includes the images I 21 , I 22 , and I 23 that are determined from the remaining image IR 1 and the image ID 2 includes the images I 121 , I 122 , and I 123 that are determined from the remaining image IR 2 .
  • the image ID 12 includes the images I 1121 , I 1122 , and I 1123 that are determined from the remaining image IR 12 .
  • the selected images IS 1 to IS 12 are images selected as the images required by the user from among a large number of images, and can be considered as representing a story of a main role (for example, a child in the user's family) among the plurality of people included in the images captured from January 2017 to December 2017.
  • the images ID 1 to ID 12 , which are determined from the remaining images IR 1 to IR 12 that the user did not consider necessary each month, can be considered as representing a story of a supporting role (for example, the user's parents, whom the user cannot meet often because they live far away) among the plurality of people included in the images captured from January 2017 to December 2017.
  • the image representing the story of the supporting role unexpected for the user can be obtained.
  • an image of the story in which an article such as a bicycle is a supporting role can be obtained.
  • the main control unit 100 determines the image having the attribute in which the frequency for the attribute contained in the image is equal to or less than the first threshold, from the first remaining image IR 1 and each of the second remaining images IR 2 to IR 12 .
  • the main control unit 100 may determine an image having the attribute in which the frequency for the attribute contained in the image is equal to or less than the first threshold and equal to or more than a second threshold. Since the image having the attribute in which the frequency for the attribute contained in the image is equal to or more than the second threshold is determined from the first remaining image IR 1 and each of the second remaining images IR 2 to IR 12 , the image in which the frequency for the attribute contained in the image is too low is removed.
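In code terms, this variation only adds a lower bound to the frequency test. A brief sketch; the threshold values are illustrative, not taken from the patent:

```python
def in_band(freq, first_threshold=0.2, second_threshold=0.05):
    """Keep attributes that are rare but not vanishingly rare:
    second_threshold <= m/n <= first_threshold."""
    return second_threshold <= freq <= first_threshold

frequencies = {"grandmother": 0.17, "child": 0.60, "stranger": 0.01}
kept = {a for a, f in frequencies.items() if in_band(f)}
print(kept)  # {'grandmother'}: 'child' is too common, 'stranger' too rare
```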
  • In the above embodiment, the image is determined from both the first remaining image IR 1 and each of the second remaining images IR 2 to IR 12 .
  • However, the image may be determined from either the first remaining image IR 1 or the second remaining images IR 2 to IR 12 .
  • For example, the image may be determined from only the first remaining image IR 1 , or from at least one remaining image of the second remaining images IR 2 to IR 12 .
  • the main control unit 100 may calculate a similarity to the first image IS 1 selected from the image group IA 1 for January 2017 and the second images IS 2 to IS 12 selected from the second image groups IA 2 to IA 12 for February 2017 to December 2017, and may determine and output the image having the calculated similarity that is equal to or less than a threshold, as the image having an attribute in which the frequency for the attribute contained in the image is equal to or less than the first threshold, from the first remaining image IR 1 and the second remaining images IR 2 to IR 12 .
  • an image that is not similar to any of the images I 1 to I 11 included in the first image IS 1 selected from the image group IA 1 for January 2017 and that is not similar to the images I 101 to I 111 included in the second images IS 2 to IS 12 selected from the image groups IA 2 to IA 12 for February 2017 to December 2017, is determined. Since the image not similar to the image selected by the user is found and printed as described above, the image more unexpected for the user can be obtained.
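The patent does not fix a particular similarity measure, so the sketch below stands in with cosine similarity over precomputed feature vectors; the vectors and the 0.5 threshold are invented for illustration, and a real system could substitute embeddings, perceptual hashes, or another measure:

```python
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Feature vectors standing in for the selected images IS1..IS12 and one
# candidate from the remaining images (all values hypothetical).
selected_feats = [[0.9, 0.1, 0.0], [0.8, 0.2, 0.1]]
candidate = [0.1, 0.2, 0.95]

if max(cosine(candidate, s) for s in selected_feats) <= 0.5:
    print("dissimilar to every selected image -> output as unexpected")
```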
  • In a case where the attribute contained in the image is the main subject included in the image, the user can obtain an image including a subject captured unconsciously and since forgotten.
  • In a case where the attribute contained in the image is the imaging date, an image captured on an imaging date different from the imaging dates of the images selected by the user can be found.
  • Since the imaging date of an image selected by the user is a holiday in many cases, images captured on dates other than holidays may be found by the image output means of the present embodiment.
  • In a case where the attribute contained in the image is the imaging location, an image captured at an imaging location different from the imaging locations of the images selected by the user can be found.
  • the different imaging location includes a location other than the tourist attraction, a location where the user usually does not visit, or a location far away from the user's home.
  • In a case where the attribute contained in the image is the photographer, an image captured by a photographer other than the user can be found (this is effective in a case where images captured by each family member are aggregated into one system).
  • the main control unit 100 may perform a process of detecting an event from the selected images, and an image of an event different from the detected event may be found. In addition, the main control unit 100 may evaluate the not-selected images, and an image having a low evaluation value may be found.
  • the main control unit 100 may detect a main subject included in the frequency equal to or more than the third threshold in the first image IS 1 and the second images IS 2 to IS 12 selected by the user and may set, as the attribute contained in the image, a subject (main subject) different from the detected main subject. Even in such a case, the main control unit 100 may find an image having the attribute in which the frequency for the attribute contained in the image is equal to or less than the first threshold, from the first remaining image IR 1 and the second remaining images IR 2 to IR 12 . A predetermined number or more of images in which the main subject is the subject different from the detected main subject may be found. It is possible to find an image of the next main subject different from the main subject in the selected images.
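Put differently: first find the dominant subjects of the selected images (frequency at or above the third threshold), then look in the remaining images for other subjects that are still rare there. A hedged sketch with invented subject labels and thresholds:

```python
from collections import Counter

def frequent_subjects(images, third_threshold):
    """Subjects whose frequency m/n is at least third_threshold."""
    n = len(images)
    counts = Counter(s for subjects in images for s in subjects)
    return {s for s, m in counts.items() if m / n >= third_threshold}

selected = [{"child"}, {"child", "mother"}, {"child"}]            # IS (toy)
remaining = [{"grandfather"}, {"child"}, {"grandfather", "dog"}]  # IR (toy)

dominant = frequent_subjects(selected, third_threshold=0.5)       # {'child'}
n = len(remaining)
counts = Counter(s for subjects in remaining for s in subjects)
next_main = [subjects for subjects in remaining
             if subjects - dominant                 # has a non-dominant subject
             and all(counts[s] / n <= 0.7           # first threshold (toy)
                     for s in subjects - dominant)]
print(next_main)  # candidates for a 'next main subject'
```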
  • Although the smartphone 1 is used in the above embodiment, a dedicated image processing apparatus, a personal computer, a tablet terminal, or the like can be used instead of the smartphone 1 .
  • the order receiving server 151 instead of the smartphone 1 may perform at least one of the detection of the first remaining image (step 141 in FIG. 7 ), the detection of the second remaining image (step 142 in FIG. 7 ), or the process of determining the image having the attribute in which the frequency for the attribute contained in the image is equal to or less than the first threshold, from the first remaining image and the second remaining image (step 143 in FIG. 7 ).
  • the smartphone 1 may issue an instruction to the printer 152 to print an image. In that case, the smartphone 1 and the printer 152 do not communicate through the Internet but communicate using Wi-Fi or the like.
  • the images are printed one sheet at a time.
  • a plurality of images may be printed on one sheet (an example of an image product) or the images may be printed on a photo book (which is an example of the image product) composed of a plurality of pages.
  • Processing units executing the above process include not only the main control unit 100 functioning as various processing units by executing software but also a programmable logic device such as a field-programmable gate array (FPGA) capable of changing a circuit configuration after manufacturing, a dedicated electric circuit such as an application specific integrated circuit (ASIC) as a processor having a circuit configuration dedicatedly designed to execute a specific process, and the like.
  • One processing unit may be configured with one of those various processors or may be configured with a combination of two or more processors of the same type or different types (for example, a plurality of FPGAs or a combination of a CPU and an FPGA).
  • a first example of configuring a plurality of processing units with one processor is such that as represented by a computer such as a client computer or a server, one processor is configured with a combination of one or more CPUs and software, and the processor functions as the plurality of processing units.
  • a second example is such that as represented by a system on chip or the like, a processor that implements the function of the whole system including the plurality of processing units using one integrated circuit (IC) chip is used.
  • Various processing units are configured using one or more of the various processors as a hardware structure.
  • the hardware structure of the various processors is more specifically an electric circuit in which circuit elements such as a semiconductor element are combined.

Abstract

Provided are an image processing apparatus, an image processing method, and an image processing program capable of unexpectedly obtaining an enjoyable image for a user. The user selects images to be printed from image groups captured from January to December 2017. From the not-selected remaining images, an image having an attribute in which the frequency for the attribute contained in the image is equal to or less than a threshold is found and printed. Since the obtained image is unrelated to the images selected by the user, an image unexpected for the user can be obtained.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
The present application claims priority under 35 U.S.C. § 119 to Japanese Patent Application No. 2018-184355, filed Sep. 28, 2018. Each of the above application(s) is hereby expressly incorporated by reference, in its entirety, into the present application.
BACKGROUND OF THE INVENTION Field of the Invention
The invention relates to an image processing apparatus, an image processing method, and an image processing program.
Description of the Related Art
Extremely large numbers of images are captured along with widespread use of digital cameras, smartphones, and the like. Since it is time-consuming to find a desired image among large numbers of images, it is considered that an image having high importance to a user is automatically extracted (JP2017-010251A), an image valuable to a user is extracted (JP2017-059124A), and an image highly satisfactory for a user is preferentially extracted (JP2010-067186A).
SUMMARY OF THE INVENTION
However, even in a case of an image not recognized as an image important to a user or an image valuable to a user, an enjoyable image for the user may be buried in a large number of captured images, actually. In JP2017-010251A, the image having high importance to the user is extracted, so that a consideration as to extracting an image not recognized as important to the user is not made. In JP2017-059124A, the image valuable to the user is extracted, so that an image recognized as not valuable to the user cannot be extracted. In JP2010-067186A, the image highly satisfactory for the user is extracted, so that an image recognized as less satisfactory for the user is not extracted.
An object of the invention is to unexpectedly provide a user with an enjoyable image.
An image processing apparatus according to the invention comprises a first detection device (first detection means) for detecting, from a first image group, a first remaining image excluding one or a plurality of first images selected from the first image group, a second detection device (second detection means) for detecting, from a second image group, a second remaining image excluding one or a plurality of second images selected from the second image group, and an image output device (image output means) for outputting an image having an attribute in which a frequency for the attribute contained in the image is equal to or less than a first threshold, from the first remaining image and the second remaining image.
The invention also provides an image processing method suitable for the image processing apparatus. That is, the method comprises detecting, from a first image group, a first remaining image excluding one or a plurality of first images selected from the first image group by a first detection device (first detection means), detecting, from a second image group, a second remaining image excluding one or a plurality of second images selected from the second image group by a second detection device (second detection means), and outputting an image having an attribute in which a frequency for the attribute contained in the image is equal to or less than a first threshold, from the first remaining image and the second remaining image by an image output device (image output means).
The invention also provides a program controlling a computer of an image processing apparatus and a recording medium (portable recording medium) storing the program.
Further, the image processing apparatus may include a processor, and the processor may detect, from a first image group, a first remaining image excluding one or a plurality of first images selected from the first image group, detect, from a second image group, a second remaining image excluding one or a plurality of second images selected from the second image group, and output an image having an attribute in which a frequency for the attribute contained in the image is equal to or less than a first threshold, from the first remaining image and the second remaining image.
For example, the image output device outputs an image having an attribute in which the frequency for the attribute contained in the image is equal to or less than the first threshold and equal to or more than a second threshold.
In addition, for example, the image output device may output an image having an attribute in which the frequency for the attribute contained in the image is equal to or less than a threshold, from each of the first remaining image and the second remaining image.
The image output device may output an image of which a similarity to the first image and the second image is equal to or less than a threshold, as the image having an attribute in which the frequency for the attribute contained in the image is equal to or less than the first threshold, from the first remaining image and the second remaining image.
In the image output device, the frequency for the attribute contained in the image may be an appearance frequency of a main subject included in the image.
For example, the main subject may belong to at least one of a person or an article.
In the image output device, for example, the frequency for the attribute contained in the image is the number of captured images for each imaging date, imaging location, photographer, or imaging target.
The image output device sets, for example, as the attribute contained in the image, a subject excluding a main subject for which a frequency is equal to or more than a third threshold in the first image and the second image, and outputs an image having the attribute in which the frequency for the attribute contained in the image is equal to or less than the first threshold, from the first remaining image and the second remaining image.
The image output device may set, for example, as the attribute contained in the image, the subject excluding the main subject for which the frequency is equal to or more than the third threshold in the first image and the second image, and output a predetermined number or more of images having the attribute in which the frequency for the attribute contained in the image is equal to or less than the first threshold, from the first remaining image and the second remaining image.
An imaging time period of the first image group and an imaging time period of the second image group may be different from each other.
The image processing apparatus may further comprise an image product creation device (image product creation means) for creating an image product using the image output from the image output device.
According to the invention, since the image having the attribute in which the frequency for the attribute contained in the image is equal to or less than the first threshold is output from the first remaining image and the second remaining image that are not selected, the output image is an image that is unexpectedly enjoyable for the user.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 illustrates the exterior of a smartphone.
FIG. 2 is a block diagram illustrating an electric configuration of the smartphone.
FIG. 3 is one example of a home screen.
FIG. 4 is one example of the home screen.
FIG. 5 illustrates the relation between captured images and selected images.
FIG. 6 illustrates an overview of an editing system.
FIG. 7 is a flowchart illustrating a process procedure of the smartphone.
FIG. 8 illustrates an image selected by a user and an image found from remaining images.
DESCRIPTION OF THE PREFERRED EMBODIMENTS
FIG. 1 illustrates the exterior of a smartphone 1 as one embodiment of the image processing apparatus according to the invention. The smartphone 1 illustrated in FIG. 1 includes a casing 2 having a flat plate shape and comprises a display and input unit 20 in which a display panel 21 as a display unit and an operation panel 22 as an input unit are integrated on one surface of the casing 2. In addition, the casing 2 comprises a microphone 32, a speaker 31, an operation unit 40, and a camera unit 41. The configuration of the casing 2 is not limited thereto. For example, a configuration in which the display unit and the input unit are independent of each other can be employed, or a configuration having a folded structure or a sliding mechanism can be employed.
FIG. 2 is a block diagram illustrating a configuration of the smartphone 1 illustrated in FIG. 1. As illustrated in FIG. 2, main constituents of the smartphone comprise a wireless communication unit 10, the display and input unit 20, a call unit 30, the operation unit 40, the camera unit 41, a storage unit 50, an external input-output unit 60, a global positioning system (GPS) reception unit 70, a motion sensor unit 80, a power supply unit 90, and a main control unit 100. In addition, main functions of the smartphone 1 have a wireless communication function of performing mobile wireless communication through a base station apparatus BS and a mobile communication network NW.
The wireless communication unit 10 performs wireless communication with the base station apparatus BS accommodated in the mobile communication network NW in accordance with an instruction from the main control unit 100. By using the wireless communication, transmission and reception of various file data such as voice data and image data, electronic mail data, and the like and reception of web data, streaming data, and the like are performed.
The display and input unit 20 is a so-called touch panel that visually delivers information to a user by displaying images (still images and motion images), text information, and the like and detects a user operation performed on the displayed information under control of the main control unit 100. The display and input unit 20 comprises the display panel 21 and the operation panel 22.
The display panel 21 uses a liquid crystal display (LCD), an organic electro-luminescence display (OELD), or the like as a display device. The operation panel 22 is a device that is mounted in a manner enabling visual recognition of an image displayed on a display surface of the display panel 21 and detects one or a plurality of coordinates operated by a finger of the user or a stylus. In a case where the device is operated by the finger of the user or the stylus, a detection signal generated by the operation is output to the main control unit 100. Next, the main control unit 100 detects an operation position (coordinates) on the display panel 21 based on the received detection signal.
As illustrated in FIG. 1, the display panel 21 and the operation panel 22 of the smartphone 1 illustrated as one embodiment of the image processing apparatus according to the present invention are integrated to constitute the display and input unit 20. The operation panel 22 is arranged to completely cover the display panel 21. In the case of employing such an arrangement, the operation panel 22 may have a function of detecting the user operation even in a region outside the display panel 21. In other words, the operation panel 22 may include a detection region for an overlapping part in overlap with the display panel 21 (hereinafter, referred to as a display region) and a detection region for the other peripheral part not in overlap with the display panel 21 (hereinafter, referred to as a non-display region).
The size of the display region may completely match the size of the display panel 21, but both sizes do not necessarily match. In addition, the operation panel 22 may include two sensitive regions including the peripheral part and the other inner part. Furthermore, the width of the peripheral part is appropriately designed depending on the size and the like of the casing 2. Furthermore, a position detection method employed in the operation panel 22 is exemplified by a matrix switch method, a resistive film method, a surface acoustic wave method, an infrared method, an electromagnetic induction method, an electrostatic capacitive method, and the like. Any method can be employed.
The call unit 30 comprises the speaker 31 and the microphone 32. The call unit 30 converts the voice of the user input through the microphone 32 into voice data processible in the main control unit 100 and outputs the voice data to the main control unit 100, or decodes the voice data received by the wireless communication unit 10 or the external input-output unit 60 and outputs the decoded voice data from the speaker 31. In addition, as illustrated in FIG. 1, for example, the speaker 31 can be mounted on the same surface as the surface on which the display and input unit 20 is disposed, and the microphone 32 can be mounted on a side surface of the casing 2.
The operation unit 40 is a hardware key using a key switch or the like and receives an instruction from the user. For example, as illustrated in FIG. 1, the operation unit 40 is a push-button type switch that is mounted on a side surface of the casing 2 of the smartphone 1. In a case where the operation unit 40 is pressed by the finger or the like, the operation unit 40 enters an ON state. In a case where the finger is released, the operation unit 40 enters an OFF state by a restoring force of a spring or the like.
The storage unit 50 stores a control program and control data of the main control unit 100, application software, address data in which a name, a telephone number, and the like of a communication counterpart are associated, data of transmitted and received electronic mails, web data downloaded by web browsing, and downloaded contents data and also temporarily stores streaming data and the like. In addition, the storage unit 50 is configured to include an internal storage unit 51 incorporated in the smartphone and an external storage unit 52 including a slot for an attachable and detachable external memory. Each of the internal storage unit 51 and the external storage unit 52 constituting the storage unit 50 is implemented using a storage medium such as a flash memory type memory, a hard disk type memory, a multimedia card micro type memory, a card type memory (for example, a MicroSD (registered trademark) memory), a random access memory (RAM), or a read only memory (ROM).
The external input-output unit 60 acts as an interface for all external apparatuses connected to the smartphone 1 and is directly or indirectly connected to other external apparatuses by communication (for example, Universal Serial Bus (USB) and IEEE 1394) or networks (for example, the Internet, a wireless LAN, Bluetooth (registered trademark), radio frequency identification (RFID), Infrared Data Association (IrDA) (registered trademark), Ultra Wideband (UWB) (registered trademark), and ZigBee (registered trademark)).
For example, the external apparatuses connected to the smartphone 1 include a wired/wireless headset, a wired/wireless external charger, a wired/wireless data port, a memory card or a subscriber identity module (SIM) card/user identity module (UIM) card connected through a card socket, an external audio and video apparatus connected through an audio and video input/output (I/O) terminal, a wirelessly connected external audio and video apparatus, a smartphone connected in a wired/wireless manner, a personal computer connected in a wired/wireless manner, a PDA connected in a wired/wireless manner, and an earphone connected in a wired/wireless manner. The external input-output unit can deliver data transferred from the external apparatuses to each constituent inside the smartphone 1 or transfer data inside the smartphone 1 to the external apparatuses.
The GPS reception unit 70 receives GPS signals transmitted from GPS satellites ST1 to STn, executes a position measurement calculation process based on the plurality of received GPS signals, and detects a position including the latitude, the longitude, and the altitude of the smartphone 1 in accordance with an instruction from the main control unit 100. When positional information can be obtained from the wireless communication unit 10 or the external input-output unit 60 (for example, a wireless LAN), the GPS reception unit 70 can detect the position using the positional information.
The motion sensor unit 80 comprises, for example, a 3-axis acceleration sensor and detects a physical motion of the smartphone 1 in accordance with an instruction from the main control unit 100. By detecting the physical motion of the smartphone 1, the movement direction and the acceleration of the smartphone 1 are detected. The detection result is output to the main control unit 100.
The power supply unit 90 supplies power stored in a battery (not illustrated) to each unit of the smartphone 1 in accordance with an instruction from the main control unit 100.
The main control unit 100 comprises a microprocessor, operates in accordance with the control program and the control data stored in the storage unit 50, and integrally controls each unit of the smartphone 1. In addition, the main control unit 100 has a mobile communication control function of controlling each unit of a communication system and an application processing function in order to perform voice communication and data communication through the wireless communication unit 10.
The application processing function is implemented by operating the main control unit 100 in accordance with the application software stored in the storage unit 50. For example, the application processing function includes an infrared communication function of performing data communication with an opposing device by controlling the external input-output unit 60, an electronic mail function of transmitting and receiving electronic mails, and a web browsing function of browsing a web page.
In addition, the main control unit 100 has an image processing function such as displaying a video on the display and input unit 20 based on image data (data of a still image or a motion image) such as reception data and downloaded streaming data. The image processing function is a function of causing the main control unit 100 to decode the image data, perform image processing on the decoding result, and display the image on the display and input unit 20.
Furthermore, the main control unit 100 executes display control of the display panel 21 and operation detection control for detecting the user operation performed through the operation unit 40 or the operation panel 22. By executing the display control, the main control unit 100 displays an icon for starting the application software or a software key such as a scroll bar or displays a window for composing an electronic mail. The scroll bar refers to a software key for receiving an instruction to move a display part of an image for a large image or the like not accommodated in the display region of the display panel 21.
In addition, by executing the operation detection control, the main control unit 100 detects the user operation performed through the operation unit 40, receives an operation performed on the icon or an input of a text string in an input field of the window through the operation panel 22, or receives a request for scrolling the display image through the scroll bar.
Furthermore, by executing the operation detection control, the main control unit 100 has a touch panel control function of determining whether the operation position on the operation panel 22 is in the overlapping part (display region) in overlap with the display panel 21 or the other peripheral part (non-display region) not in overlap with the display panel 21 and controlling the sensitive region of the operation panel 22 and the display position of the software key.
In addition, the main control unit 100 can detect a gesture operation performed on the operation panel 22 and execute a preset function depending on the detected gesture operation. The gesture operation is not a simple touch operation of the related art but means an operation of drawing a trajectory with the finger or the like, an operation of specifying a plurality of positions at the same time, or, as a combination thereof, an operation of drawing a trajectory from at least one of a plurality of positions.
The camera unit 41 is a digital camera performing electronic imaging using an imaging element such as a complementary metal oxide semiconductor (CMOS) or a charge-coupled device (CCD). In addition, the camera unit 41 converts the image data obtained by imaging into compressed image data in, for example, Joint Photographic Experts Group (JPEG) and records the image data in the storage unit 50 or outputs the image data through the external input-output unit 60 or the wireless communication unit 10 under control of the main control unit 100. As illustrated in FIG. 1, in the smartphone 1, the camera unit 41 is mounted on the same surface as the display and input unit 20. However, the mounting position of the camera unit 41 is not limited thereto. The camera unit 41 may be mounted on the rear surface of the display and input unit 20, or a plurality of camera units 41 may be mounted. In a case where the plurality of camera units 41 are mounted, imaging may be performed by a single camera unit 41 by switching the camera unit 41 performing the imaging, or imaging may be performed using the plurality of camera units 41 at the same time.
In addition, the camera unit 41 can be used in various functions of the smartphone 1. For example, the image obtained by the camera unit 41 can be displayed on the display panel 21, or the image of the camera unit 41 can be used as an operation input of the operation panel 22. In addition, in the detection of the position by the GPS reception unit 70, the position can be detected with reference to the image from the camera unit 41. Furthermore, with reference to the image from the camera unit 41, a determination of the optical axis direction of the camera unit 41 of the smartphone 1 and a determination of the current usage environment can be performed without using the 3-axis acceleration sensor or along with the 3-axis acceleration sensor. The image from the camera unit 41 can also be used in the application software.
In addition, the image data of the still picture or the motion picture can be recorded in the storage unit 50 or output through the external input-output unit 60 or the wireless communication unit 10 after adding, to the image data, the positional information obtained by the GPS reception unit 70, voice information obtained by the microphone 32 (which may be text information obtained by voice-to-text conversion performed by the main control unit or the like), attitude information obtained by the motion sensor unit 80, and the like.
A program obtained through the Internet or the like is installed in advance on the smartphone 1. The process to be described below starts by starting the program. In addition, the program may be stored in a recording medium such as the external storage unit 52, and the program read from the external storage unit 52 may be installed on the smartphone 1.
In the embodiment, a plurality of images are selected monthly by the user from a collection (image group) of multiple images captured every month. A plurality of images may be automatically selected every month. The selected images are printed.
FIG. 3 is one example of the home screen displayed on the display panel 21 of the smartphone 1.
Eleven image display regions 129 are formed over most of the home screen (the number of image display regions 129 may be less than 11 or greater than or equal to 12). An imaging year and month display region 128 is displayed in the upper left portion of the home screen. The imaging year and month display region 128 displays a text string “January” and a text string “2017”. After the start of the program, the imaging year and month display region 128 displays the year and month corresponding to the time of the start of the program.
An imaging year and month specifying region 121 is formed in the upper portion of the home screen. An imaging year display region 122 and a pull-down button 123 are formed in the imaging year and month specifying region 121. By pulling down the pull-down button 123, a pull-down menu is shown, and the user can select the desired imaging year. Imaging month specifying regions 124, 125, and 126 are formed on the right side of the imaging year display region 122. By scrolling the imaging year and month specifying region 121 to the right and left, months displayed in the imaging month specifying regions 124, 125, and 126 are switched. In the home screen illustrated in FIG. 3, the imaging month specifying regions 124, 125, and 126 display “December” (December 2016), “January”, and “February”, respectively. The text string “January” displayed in the imaging month specifying region 125 at the center is surrounded. By surrounding “January”, it is shown that “January” is selected as a month in which images displayed in the image display regions 129 are captured. A search button 127 is formed on the right side of the imaging month specifying region 125. An image addition region 130 is formed on the lower left side of the image display regions 129. By touching the image addition region 130, the number of image display regions 129 displayed on the home screen is increased by one.
An order button 131 on which a text string “order” is displayed is displayed in the lower portion of the home screen. The order button 131 is touched in the case of ordering a print of the image. In addition, a home button 132 on which a text string “home” is displayed, a sale button 133 on which a text string “sale” is displayed, and a menu button 134 on which a text string “menu” is displayed are formed in the lowermost portion of the home screen.
Since the imaging year and month display region 128 displays the year and month corresponding to the time of the start of the program, the image display regions 129 display no image in a condition where no image has been selected from the images captured in that year and month, and display the selected images in a condition where images have been selected. The example illustrated in FIG. 3 shows eleven images, displayed in the image display regions 129, which are selected from the images captured in January 2017.
FIG. 4 is also one example of the home screen displayed on the display panel 21 of the smartphone 1.
The image display regions 129 of the home screen illustrated in FIG. 3 display images selected from the images captured in January 2017. Meanwhile, the image display regions 129 of the home screen illustrated in FIG. 4 display images selected from the images captured in February 2017.
An image is selected in the same manner from the images captured in and after March 2017. The user selects the images every month from the multiple images that are captured every month. The image data representing the selected images is transmitted to an order receiving server, and the images are printed. The prints are mailed to the user, and the user simply organizes the mailed prints into an album, so that the user can obtain an album recording the growth of the family. However, at the time of ordering, the user may select an image captured around the month of the order.
FIG. 5 illustrates the relation between images captured from January 2017 to December 2017 and the selected images.
In January 2017, 100 images from an image I1 to an image I100 are captured. The 100 images from the image I1 to the image I100 are set as an image group IA1 for January 2017. Eleven images of the image I1 to the image I11 are selected from the image I1 to the image I100 included in the image group IA1 for January 2017. The selected images of the image I1 to the image I11 are an image IS1 selected in January 2017. The selected images of the image I1 to the image I11 are the images displayed in the image display regions 129, as illustrated in FIG. 3. The image I12 to the image I100 excluding the selected image IS1 are remaining images IR1 for January 2017 among the image I1 to the image I100 included in the image group IA1 for January 2017. The remaining images IR1 for January 2017 are images not selected from the image I1 to the image I100 included in the image group IA1 for January 2017.
Similarly, in February 2017, 100 images from an image I101 to an image I200 are captured. The 100 images from the image I101 to the image I200 are set as an image group IA2 for February 2017. Eleven images of the image I101 to the image I111 are selected from the image I101 to the image I200 included in the image group IA2 for February 2017. The selected images of the image I101 to the image I111 are an image IS2 selected in February 2017. The selected images of the image I101 to the image I111 are the images displayed in the image display regions 129, as illustrated in FIG. 4. The image I112 to the image I200 excluding the selected image IS2 are remaining images IR2 for February 2017 among the image I101 to the image I200 included in the image group IA2 for February 2017. The remaining images IR2 for February 2017 are images not selected from the image I101 to the image I200 included in the image group IA2 for February 2017.
Similarly, in the other months, a large number of images are also captured every month and images are selected every month from the large number of images. In December 2017, 100 images from an image I1101 to an image I1200 are captured. The 100 images from the image I1101 to the image I1200 are set as an image group IA12 for December 2017. Eleven images of the image I1101 to the image I1111 are selected from the image I1101 to the image I1200 included in the image group IA12 for December 2017. The selected images of the image I1101 to the image I1111 are an image IS12 selected in December 2017. The image I1112 to the image I1200 excluding the selected image IS12 are remaining images IR12 for December 2017 among the image I1101 to the image I1200 included in the image group IA12 for December 2017. The remaining images IR12 for December 2017 are images not selected from the image I1101 to the image I1200 included in the image group IA12 for December 2017.
As described above, while there are images selected every month, there are also images not selected. In the embodiment, it is possible to find an image that the user does not expect from the images not selected. In FIG. 5, the same number of images (100) is captured every month, and the same number of images (11) is selected every month. However, the number of captured images and the number of selected images may vary from month to month.
FIG. 6 illustrates an overview of an image editing system according to the embodiment.
The smartphone 1 and an order receiving server 151 can communicate with each other through the Internet. In addition, the order receiving server 151 and a printer 152 can communicate with each other.
In a case where image data is transmitted from the smartphone 1 to the order receiving server 151, the image data received by the order receiving server 151 is transmitted to the printer 152. An image represented by the image data is printed. The print is mailed to the user.
As described above, the image data representing the images selected by the user is transmitted to the order receiving server 151 every month, so that prints of the images selected by the user are delivered to the user every month. In the embodiment, as described below, apart from the image data representing the images selected by the user, images that the user would consider uncommon are determined from the images not selected by the user. The image data representing the determined images is also transmitted to the order receiving server 151 every month, so that prints other than those of the images selected by the user are delivered to the user every month (the prints need not be delivered every month; alternatively, without printing, the images that the user would consider uncommon may be determined and then notified to the user). The prints of the images unexpected for the user are delivered, which increases the enjoyment.
FIG. 7 is a flowchart illustrating a process procedure of the smartphone 1.
In a case where the menu button 134 included in the home screen is touched, a selection button of a remaining image mode for finding images from the remaining images appears. In a case where the selection button is touched, a period over which images are found (for example, one year from January 2017 to December 2017) is specified by the user, and the processing procedure illustrated in FIG. 7 starts. In the embodiment, it is assumed that one year from January 2017 to December 2017 is specified.
A first remaining image is detected from a first image group by the main control unit 100 (which is an example of the first detection device) of the smartphone 1 (step 141), and a second remaining image is detected from a second image group by the main control unit 100 (which is an example of the second detection device) (step 142). The first image group is any of the image groups captured every month in the specified year from January 2017 to December 2017 (for example, the image group IA1 for January 2017) and may be one image group or a plurality of image groups. Assuming that the first image group is the image group IA1 for January 2017, the first remaining image is the remaining image IR1 for January 2017. The second image group is an image group different from the first image group among the image groups captured every month in that year (for example, the image group IA2 for February 2017; although the first image group and the second image group typically have different imaging time periods, they may have the same imaging time period), and may be one image group or a plurality of image groups. Assuming that the second image group is the image group IA2 for February 2017, the second remaining image is the remaining image IR2 for February 2017. As long as the first image group and the second image group are different from each other, each may be one or a plurality of image groups. In the embodiment, it is assumed that the first image group is the image group IA1 for January 2017 and the second image group includes the 11 image groups from the image group IA2 for February 2017 to the image group IA12 for December 2017. The first remaining image IR1 is detected from the first image group IA1, and each of the second remaining images IR2 to IR12 is detected from each of the second image groups IA2 to IA12.
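By way of illustration only (the embodiment does not prescribe any implementation), the detection in steps 141 and 142 amounts to a set difference between an image group and the images selected from it. The following minimal Python sketch assumes images are referenced by identifiers; all names are hypothetical.

    def detect_remaining(image_group, selected_images):
        # The remaining images are the images of the group that were
        # not selected.
        selected = set(selected_images)
        return [image for image in image_group if image not in selected]

    # Example with the January 2017 group IA1 and its selected images IS1.
    IA1 = ["I%d" % n for n in range(1, 101)]   # images I1 to I100
    IS1 = ["I%d" % n for n in range(1, 12)]    # selected images I1 to I11
    IR1 = detect_remaining(IA1, IS1)           # remaining images I12 to I100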
Next, from the first remaining image IR1 and each of the second remaining images IR2 to IR12, the main control unit 100 determines an image having an attribute in which a frequency for the attribute contained in the image is equal to or less than a first threshold (step 143). The “attribute contained in the image” refers to various properties of the image derived from the image; for example, the properties include a main subject contained in the image, an imaging date of the image, an imaging location of the image, a photographer of the image, or an imaging target. For the main subject, subject detection is performed on the image, and the determination is made based on the position, size, and the like of the subject in the image; for example, it is determined whether the subject is present at the center of the image and has at least a certain size relative to the image. The main subject includes any of a person or an article. In addition, in a case where “n” images are present in a certain image group (“n” is a natural number) and “m” images having a certain attribute are present among the “n” images (“m” is a natural number), the “frequency for the attribute contained in the image” means the numerical value m/n (including a value obtained by scaling by a constant, such as a percentage).
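By way of illustration only, step 143 can be sketched under the m/n definition above as follows. The accessor attributes_of is a hypothetical stand-in for the subject detection or header reading described herein, and the rule that an image qualifies when at least one of its attributes is sufficiently infrequent is an assumption, since the embodiment does not fix how multiple attributes per image are combined.

    from collections import Counter

    def determine_images(remaining_images, attributes_of, first_threshold):
        # Frequency of an attribute: m/n, where m is the number of images
        # having that attribute and n is the number of images considered.
        n = len(remaining_images)
        counts = Counter(
            attribute
            for image in remaining_images
            for attribute in set(attributes_of(image))
        )
        # Keep images having an attribute whose frequency is at or below
        # the first threshold (assumed combination rule: "any attribute").
        return [
            image for image in remaining_images
            if any(counts[attribute] / n <= first_threshold
                   for attribute in attributes_of(image))
        ]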
The determined image is displayed on the display panel 21 (which is an example of the image output device) (step 144), and the image data representing the determined image is transmitted from the wireless communication unit 10 (which is an example of the image output device) of the smartphone 1 to the order receiving server 151 (refer to FIG. 6) (step 145). The image data is transmitted from the order receiving server 151 to the printer 152 (which is an example of the image product creation device), and the print (which is an example of an image product) is produced. Since the print is mailed to the user, the user can receive a print representing an unexpected image, one unlikely to be selected by the user himself or herself. In the above embodiment, the determined image is displayed on the display panel 21 as a form of image output. However, the form of the output of the determined image need not be a form that can be recognized by the user, such as display on the display panel 21. For example, even in a case where the main control unit 100 subjects the image data representing the determined image to processing such as image evaluation processing, the image is considered to be output as long as the image data representing the determined image is obtained. In a case where the evaluation resulting from the image evaluation processing is low, the image may not be displayed on the display panel 21. In such a case, the user does not recognize the determined image, but the image output itself is performed inside the main control unit 100.
In particular, since the image determined from the first remaining image IR1 and each of the second remaining images IR2 to IR12 is an image having an attribute in which the frequency for the attribute contained in the image is equal to or less than the first threshold, images having attributes with a high appearance frequency in the first remaining image IR1 and each of the second remaining images IR2 to IR12 are removed, and an image unexpected for the user can be obtained.
FIG. 8 illustrates the images IS1 to IS12, which are selected from the image groups IA1 to IA12 obtained from January 2017 to December 2017, and the images ID1 to ID12, which are determined by way of the processing procedure illustrated in FIG. 7 from the remaining images IR1 to IR12 for January 2017 to December 2017. The image ID1 includes the images I21, I22, and I23 that are determined from the remaining image IR1, and the image ID2 includes the images I121, I122, and I123 that are determined from the remaining image IR2. The same applies to the other images, and the image ID12 includes the images I1121, I1122, and I1123 that are determined from the remaining image IR12.
The selected images IS1 to IS12 are images selected by the user as necessary among a large number of images, and can be considered as representing the story of a main character (for example, a child in the user's family) among the people included in the images captured from January 2017 to December 2017. In contrast, the images ID1 to ID12 are determined from the remaining images IR1 to IR12 that the user did not consider necessary each month, and can be considered as representing the story of a supporting character (for example, the user's parents, whom the user cannot meet often because they live far away) among the people included in the images captured from January 2017 to December 2017. An image representing the story of a supporting character, unexpected for the user, can be obtained. Alternatively, an image of a story in which an article such as a bicycle plays a supporting role can be obtained.
In the above embodiment, the main control unit 100 determines the image having the attribute in which the frequency for the attribute contained in the image is equal to or less than the first threshold, from the first remaining image IR1 and each of the second remaining images IR2 to IR12. However, the main control unit 100 may determine an image having an attribute in which the frequency for the attribute contained in the image is equal to or less than the first threshold and equal to or more than a second threshold. Since only an image having an attribute whose frequency is equal to or more than the second threshold is then determined from the first remaining image IR1 and each of the second remaining images IR2 to IR12, images in which the frequency for the attribute contained in the image is too low are removed.
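By way of illustration only, this variant replaces the single comparison with a band test, as in the following hypothetical sketch.

    def frequency_in_band(frequency, first_threshold, second_threshold):
        # Keep attributes infrequent enough to be unexpected
        # (<= first threshold) but not vanishingly rare
        # (>= second threshold).
        return second_threshold <= frequency <= first_threshold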
Further, in the example illustrated in FIG. 8, the image is determined from both the first remaining image IR1 and each of the second remaining images IR2 to IR12. However, the image may be determined from either the first remaining image IR1 or the second remaining images IR2 to IR12; for example, the image may be determined from only the first remaining image IR1, or from at least one of the second remaining images IR2 to IR12.
Furthermore, the main control unit 100 may calculate a similarity to the first image IS1 selected from the image group IA1 for January 2017 and to the second images IS2 to IS12 selected from the second image groups IA2 to IA12 for February 2017 to December 2017, and may determine and output an image having a calculated similarity equal to or less than a threshold, as the image having an attribute in which the frequency for the attribute contained in the image is equal to or less than the first threshold, from the first remaining image IR1 and the second remaining images IR2 to IR12. For example, an image is determined that is not similar to any of the images I1 to I11 included in the first image IS1 selected from the image group IA1 for January 2017 and is not similar to any of the images included in the second images IS2 to IS12 selected from the image groups IA2 to IA12 for February 2017 to December 2017. Since an image not similar to the images selected by the user is found and printed in this manner, an image even more unexpected for the user can be obtained.
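By way of illustration only, this similarity-based variant can be sketched as follows. Cosine similarity over per-image feature vectors is an assumed stand-in, since the embodiment does not prescribe a particular similarity measure, and feature_of is a hypothetical accessor.

    import math

    def cosine_similarity(u, v):
        dot = sum(a * b for a, b in zip(u, v))
        norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
        return dot / norm if norm else 0.0

    def dissimilar_images(remaining_images, selected_images, feature_of, threshold):
        # Output only images whose similarity to every selected image is
        # at or below the threshold.
        return [
            image for image in remaining_images
            if all(
                cosine_similarity(feature_of(image), feature_of(s)) <= threshold
                for s in selected_images
            )
        ]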
Further, in a case where the attribute contained in the image is the main subject included in the image, it becomes possible to obtain an image including a subject different from the main subject, a subject not photographed together with the main subject, or a subject that does not often appear in the images. The user can obtain an image including a subject captured unconsciously and forgotten.
In a case where the attribute contained in the image is the imaging date, an image captured on an imaging date different from the imaging dates of the images selected by the user can be found. For example, in a case where the images selected by the user were mostly captured on holidays, an image captured on a day other than a holiday may be found by the image output means of the present embodiment. In a case where the attribute contained in the image is the imaging location, an image captured at an imaging location different from the imaging locations of the images selected by the user can be found. For example, the different imaging location includes a location other than a tourist attraction, a location the user does not usually visit, or a location far away from the user's home. In addition, in a case where the attribute contained in the image is the photographer, and the photographer of the images selected by the user is mostly the user himself or herself, an image captured by a photographer other than the user can be found (this is effective in a case where images captured by each family member are aggregated into one system).
Furthermore, in a case where the attribute contained in the image is the number of captured images for each imaging target (subject), an image which does not include a frequently captured target (subject) can be found. In a case where the attribute contained in the image is an event recognized from the image, the main control unit 100 performs a process of detecting the event from the selected images, and an image of an event different from the detected event is found. In addition, the main control unit 100 may evaluate the not-selected images and find an image having a low evaluation value.
Furthermore, the main control unit 100 may detect a main subject whose frequency is equal to or more than the third threshold in the first image IS1 and the second images IS2 to IS12 selected by the user, and may set, as the attribute contained in the image, a subject (main subject) different from the detected main subject. Even in such a case, the main control unit 100 may find an image having the attribute in which the frequency for the attribute contained in the image is equal to or less than the first threshold, from the first remaining image IR1 and the second remaining images IR2 to IR12. A predetermined number or more of images in which the main subject is a subject different from the detected main subject may be found. This makes it possible to find an image of a new main subject different from the main subject in the selected images.
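By way of illustration only, this variant can be sketched in two passes: main subjects that are frequent in the selected images (the third threshold) are excluded first, and the first threshold is then applied among the remaining candidates. The accessor main_subject_of is hypothetical, and treating both thresholds as ratios is an assumption.

    from collections import Counter

    def next_main_subject_images(selected_images, remaining_images,
                                 main_subject_of, third_threshold,
                                 first_threshold):
        # Main subjects appearing with frequency at or above the third
        # threshold in the selected images are excluded as attributes.
        counts = Counter(main_subject_of(i) for i in selected_images)
        frequent = {
            subject for subject, count in counts.items()
            if count / len(selected_images) >= third_threshold
        }
        candidates = [
            i for i in remaining_images
            if main_subject_of(i) not in frequent
        ]
        if not candidates:
            return []
        # Among the candidates, keep images whose own main subject is
        # infrequent (at or below the first threshold).
        rest = Counter(main_subject_of(i) for i in candidates)
        return [
            i for i in candidates
            if rest[main_subject_of(i)] / len(candidates) <= first_threshold
        ]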
Since information such as the imaging date, which is an example of the attribute contained in the image, is stored in a header of the image file in which the image data is stored, whether or not the frequency for the attribute contained in the image is equal to or less than the first threshold can be determined by using this information.
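By way of illustration only, an imaging date can be read from such a header as sketched below. The sketch assumes JPEG files carrying Exif metadata and the Pillow library; the constants are standard Exif tag numbers, and error handling is omitted.

    from PIL import Image

    EXIF_IFD = 0x8769            # pointer to the Exif sub-IFD
    DATETIME_ORIGINAL = 0x9003   # Exif DateTimeOriginal tag
    DATETIME = 0x0132            # TIFF DateTime tag

    def imaging_date(path):
        # The imaging date is stored in the Exif sub-IFD; fall back to
        # the base-IFD DateTime when DateTimeOriginal is absent.
        exif = Image.open(path).getexif()
        return exif.get_ifd(EXIF_IFD).get(DATETIME_ORIGINAL) or exif.get(DATETIME)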
While the smartphone 1 is used in the above embodiment, a dedicated image processing apparatus, a personal computer, a tablet terminal, or the like other than the smartphone 1 can be used.
In addition, the order receiving server 151 instead of the smartphone 1 may perform at least one of the detection of the first remaining image (step 141 in FIG. 7), the detection of the second remaining image (step 142 in FIG. 7), or the process of determining the image having the attribute in which the frequency for the attribute contained in the image is equal to or less than the first threshold, from the first remaining image and the second remaining image (step 143 in FIG. 7). The smartphone 1 may also issue an instruction to the printer 152 to print an image. In that case, the smartphone 1 and the printer 152 communicate using Wi-Fi or the like rather than through the Internet.
Furthermore, in the above embodiment, the images are printed one sheet at a time. However, instead of printing one sheet at a time, a plurality of images may be printed on one sheet (an example of an image product) or the images may be printed on a photo book (which is an example of the image product) composed of a plurality of pages.
Processing units executing the above process include not only the main control unit 100 functioning as various processing units by executing software but also a programmable logic device such as a field-programmable gate array (FPGA) capable of changing a circuit configuration after manufacturing, a dedicated electric circuit such as an application specific integrated circuit (ASIC) as a processor having a circuit configuration dedicatedly designed to execute a specific process, and the like.
One processing unit may be configured with one of those various processors or may be configured with a combination of two or more processors of the same type or different types (for example, a plurality of FPGAs or a combination of a CPU and an FPGA). A first example of configuring a plurality of processing units with one processor is such that as represented by a computer such as a client computer or a server, one processor is configured with a combination of one or more CPUs and software, and the processor functions as the plurality of processing units. A second example is such that as represented by a system on chip or the like, a processor that implements the function of the whole system including the plurality of processing units using one integrated circuit (IC) chip is used. Various processing units are configured using one or more of the various processors as a hardware structure.
Furthermore, the hardware structure of the various processors is more specifically an electric circuit in which circuit elements such as a semiconductor element are combined.

Claims (14)

What is claimed is:
1. An image processing apparatus comprising:
a first detection device for detecting, from a first image group, a first remaining image excluding one or a plurality of first images selected from the first image group;
a second detection device for detecting, from a second image group, a second remaining image excluding one or a plurality of second images selected from the second image group; and
an image output device for outputting an image having an attribute in which a frequency for the attribute contained in the image is equal to or less than a first threshold, from the first remaining image and the second remaining image.
2. The image processing apparatus according to claim 1,
wherein the image output device outputs an image having an attribute in which the frequency for the attribute contained in the image is equal to or less than the first threshold and equal to or more than a second threshold.
3. The image processing apparatus according to claim 1,
wherein the image output device outputs an image having an attribute in which the frequency for the attribute contained in the image is equal to or less than a threshold, from each of the first remaining image and the second remaining image.
4. The image processing apparatus according to claim 1,
wherein the image output device outputs an image of which a similarity to the first image and the second image is equal to or less than a threshold, as the image having an attribute in which the frequency for the attribute contained in the image is equal to or less than the first threshold, from the first remaining image and the second remaining image.
5. The image processing apparatus according to claim 1,
wherein, in the image output device, the attribute contained in the image is a main subject included in the image.
6. The image processing apparatus according to claim 5,
wherein the main subject belongs to at least one of a person or an article.
7. The image processing apparatus according to claim 1,
wherein, in the image output device, the attribute contained in the image is an imaging date, an imaging location, a photographer, or an imaging target.
8. The image processing apparatus according to claim 1,
wherein the image output device sets, as the attribute contained in the image, a subject excluding a main subject for which a frequency is equal to or more than a third threshold in the first image and the second image, and outputs an image having the attribute in which the frequency of the attribute contained in the image is equal to or less than the first threshold, from the first remaining image and the second remaining image.
9. The image processing apparatus according to claim 8,
wherein the image output device sets, as the attribute contained in the image, the subject excluding the main subject for which the frequency is equal to or more than the third threshold in the first image and the second image, and outputs a predetermined number or more of images having the attribute in which the frequency for the attribute contained in the image is equal to or less than the first threshold, from the first remaining image and the second remaining image.
10. The image processing apparatus according to claim 1,
wherein an imaging time period of the first image group and an imaging time period of the second image group are different from each other.
11. The image processing apparatus according to claim 1, further comprising:
an image product creation device for creating an image product using the image output from the image output device.
12. An image processing apparatus comprising:
a processor, the processor may perform:
detecting, from a first image group, a first remaining image excluding one or a plurality of first images selected from the first image group;
detecting, from a second image group, a second remaining image excluding one or a plurality of second images selected from the second image group; and
outputting an image having an attribute in which a frequency for the attribute contained in the image is equal to or less than a first threshold, from the first remaining image and the second remaining image.
13. An image processing method comprising:
detecting, from a first image group, a first remaining image excluding one or a plurality of first images selected from the first image group by a first detection device;
detecting, from a second image group, a second remaining image excluding one or a plurality of second images selected from the second image group by a second detection device; and
outputting an image having an attribute in which a frequency for the attribute contained in the image is equal to or less than a first threshold, from the first remaining image and the second remaining image by an image output device.
14. A non-transitory computer-readable recording medium storing a program controlling a computer of an image processing apparatus to perform:
detecting, from a first image group, a first remaining image excluding one or a plurality of first images selected from the first image group;
detecting, from a second image group, a second remaining image excluding one or a plurality of second images selected from the second image group; and
outputting an image having an attribute in which a frequency for the attribute contained in the image is equal to or less than a first threshold, from the first remaining image and the second remaining image.
US16/580,861 2018-09-28 2019-09-24 Image processing apparatus, image processing method, and image processing program Active 2040-07-08 US11200651B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JPJP2018-184355 2018-09-28
JP2018-184355 2018-09-28
JP2018184355A JP7007249B2 (en) 2018-09-28 2018-09-28 Image processing device, image processing method and image processing program

Publications (2)

Publication Number Publication Date
US20200104986A1 US20200104986A1 (en) 2020-04-02
US11200651B2 (en) 2021-12-14

Family

ID=69946309

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/580,861 Active 2040-07-08 US11200651B2 (en) 2018-09-28 2019-09-24 Image processing apparatus, image processing method, and image processing program

Country Status (3)

Country Link
US (1) US11200651B2 (en)
JP (1) JP7007249B2 (en)
CN (1) CN110971752B (en)

Citations (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7170632B1 (en) * 1998-05-20 2007-01-30 Fuji Photo Film Co., Ltd. Image reproducing method and apparatus, image processing method and apparatus, and photographing support system
JP6463231B2 (en) * 2015-07-31 2019-01-30 Fujifilm Corporation Image processing apparatus, image processing method, program, and recording medium

Patent Citations (54)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5946444A (en) * 1993-08-24 1999-08-31 Lucent Technologies, Inc. System and method for creating personalized image collections from multiple locations by using a communications network
US6122411A (en) * 1994-02-16 2000-09-19 Apple Computer, Inc. Method and apparatus for storing high and low resolution images in an imaging device
US6091771A (en) * 1997-08-01 2000-07-18 Wells Fargo Alarm Services, Inc. Workstation for video security system
US6369908B1 (en) * 1999-03-31 2002-04-09 Paul J. Frey Photo kiosk for electronically creating, storing and distributing images, audio, and textual messages
US6608563B2 (en) * 2000-01-26 2003-08-19 Creative Kingdoms, Llc System for automated photo capture and retrieval
US20040201683A1 (en) * 2001-03-30 2004-10-14 Fujitsu Limited Image data dispensing system
US9031383B2 (en) * 2001-05-04 2015-05-12 Legend3D, Inc. Motion picture project management system
US7027070B2 (en) * 2001-11-29 2006-04-11 Agilent Technologies, Inc. Systems and methods for manipulating a graphical display of a printed circuit board model for an automated x-ray inspection system
US20030133159A1 (en) * 2002-01-11 2003-07-17 Genterprise Development Group, Inc. Systems and methods for producing portraits
US20030161009A1 (en) * 2002-02-22 2003-08-28 Kenji Yokoo System and method for processing and ordering photographic prints
US20080267584A1 (en) * 2002-03-26 2008-10-30 Microsoft Corporation Digital Video Segment Identification
US8244101B2 (en) * 2002-03-26 2012-08-14 Microsoft Corporation Digital video segment identification
US8994833B2 (en) * 2002-10-08 2015-03-31 Lifetouch, Inc. Photography system to organize digital photographs and information regarding the subjects therein
US7446800B2 (en) * 2002-10-08 2008-11-04 Lifetouch, Inc. Methods for linking photographs to data related to the subjects of the photographs
US9633257B2 (en) * 2003-03-28 2017-04-25 Abbyy Development Llc Method and system of pre-analysis and automated classification of documents
US7652709B2 (en) * 2003-07-31 2010-01-26 Seiko Epson Corporation Image forming device, image output device, image processing system, image retrieving method, image quality determining method and recording medium
US7836090B2 (en) * 2004-11-17 2010-11-16 Ndsu Research Foundation Method and system for data mining of very large spatial datasets using vertical set inner products
US20060114520A1 (en) * 2004-11-29 2006-06-01 Fuji Photo Film Co., Ltd. Image forming method and image forming apparatus
US20070250490A1 (en) * 2006-04-20 2007-10-25 Seiko Epson Corporation Data processing unit
US20080301133A1 (en) * 2007-05-29 2008-12-04 Microsoft Corporation Location recognition using informative feature vocabulary trees
JP2010067186A (en) 2008-09-12 2010-03-25 Canon Inc Image processing apparatus
US20100067787A1 (en) 2008-09-12 2010-03-18 Canon Kabushiki Kaisha Image processing apparatus
US20100142833A1 (en) 2008-12-09 2010-06-10 Canon Kabushiki Kaisha Image selection device and control method thereof
JP2010141412A (en) 2008-12-09 2010-06-24 Canon Inc Image selection device and control method thereof
US20100177246A1 (en) * 2009-01-15 2010-07-15 Nokia Corporation Increasing frame rate for imaging
US8264587B2 (en) * 2009-01-15 2012-09-11 Nokia Corporation Increasing frame rate for imaging
US9396258B2 (en) * 2009-01-22 2016-07-19 Google Inc. Recommending video programs
US8184913B2 (en) * 2009-04-01 2012-05-22 Microsoft Corporation Clustering videos by location
US8634590B2 (en) * 2010-01-12 2014-01-21 Brother Kogyo Kabushiki Kaisha Image processing device and storage medium storing image processing program
CN101976252A (en) 2010-10-26 2011-02-16 Baidu Online Network Technology (Beijing) Co., Ltd. Picture display system and display method thereof
US8327253B2 (en) * 2010-11-09 2012-12-04 Shutterfly, Inc. System and method for creating photo books using video
US20130080897A1 (en) * 2010-11-09 2013-03-28 Shutterfly, Inc. System and method for creating photo books using video
US20150293688A1 (en) * 2010-11-09 2015-10-15 Shutterfly, Inc. System and method for creating photo products using video
US9727221B2 (en) * 2010-11-09 2017-08-08 Shutterfly, Inc. System and method for creating photo products using video
US9081484B2 (en) * 2010-11-09 2015-07-14 Shutterfly, Inc. System and method for creating photo books using video
US20120117473A1 (en) * 2010-11-09 2012-05-10 Edward Han System and method for creating photo books using video
US9183446B2 (en) * 2011-06-09 2015-11-10 Canon Kabushiki Kaisha Image processing apparatus and image processing method
US20160014482A1 (en) * 2014-07-14 2016-01-14 The Board Of Trustees Of The Leland Stanford Junior University Systems and Methods for Generating Video Summary Sequences From One or More Video Segments
CN105590090A (en) 2014-11-07 2016-05-18 Apparatus and method for detecting object for vehicle
US20160132734A1 (en) 2014-11-07 2016-05-12 Hyundai Mobis Co., Ltd. Apparatus and method for detecting object for vehicle
US20160179846A1 (en) 2014-12-18 2016-06-23 Kabushiki Kaisha Toshiba Method, system, and computer readable medium for grouping and providing collected image content
JP2016119508A (en) 2014-12-18 2016-06-30 Kabushiki Kaisha Toshiba Method, system and program
US20180132011A1 (en) * 2015-04-16 2018-05-10 W.S.C. Sports Technologies Ltd. System and method for creating and distributing multimedia content
JP2017010251A (en) 2015-06-22 2017-01-12 Fujifilm Corporation Image extraction apparatus, image extraction method, program, and recording medium
US20160371536A1 (en) * 2015-06-22 2016-12-22 Fujifilm Corporation Image extraction device, image extraction method, program, and recording medium
JP2017037417A (en) 2015-08-07 2017-02-16 Canon Inc Image processor, method and program
US20170083545A1 (en) * 2015-09-18 2017-03-23 Fujifilm Corporation Image extraction system, image extraction method, image extraction program, and recording medium storing program
JP2017059124A (en) 2015-09-18 2017-03-23 Fujifilm Corporation Image extracting system, image extracting method, image extracting program, and recording medium that stores the program
US20210120058A1 (en) * 2015-10-14 2021-04-22 Google Llc Capture, recording and streaming of media content
JP2017161993A (en) 2016-03-07 2017-09-14 Fujifilm Corporation Image processing device, image processing method, program and recording medium
US20180165856A1 (en) 2016-12-09 2018-06-14 Canon Kabushiki Kaisha Control method and storage medium
JP2018097481A (en) 2016-12-09 2018-06-21 Canon Inc Image processing apparatus, control method, and program
US10123065B2 (en) * 2016-12-30 2018-11-06 Mora Global, Inc. Digital video file generation
US11030730B2 (en) * 2019-04-02 2021-06-08 Shutterfly, Llc Composite group image

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
An Office Action issued by the China National Intellectual Property Administration dated Sep. 30, 2020, which corresponds to Chinese Application No. 201910922231.0 and is related to U.S. Appl. No. 16/580,861; with English language translation.
An Office Action issued by the China National Intellectual Property Administration dated Feb. 20, 2021, which corresponds to Chinese Application No. 201910922231.0 and is related to U.S. Appl. No. 16/580,861; with English language translation.
An Office Action, "Notice of Reasons for Refusal," issued by the Japanese Patent Office dated Aug. 3, 2021, which corresponds to Japanese Patent Application No. 2018-184355 and is related to U.S. Appl. No. 16/580,861; with English language translation.

Also Published As

Publication number Publication date
CN110971752A (en) 2020-04-07
US20200104986A1 (en) 2020-04-02
JP2020052949A (en) 2020-04-02
CN110971752B (en) 2021-08-31
JP7007249B2 (en) 2022-01-24

Similar Documents

Publication Publication Date Title
US9953212B2 (en) Method and apparatus for album display, and storage medium
US9172879B2 (en) Image display control apparatus, image display apparatus, non-transitory computer readable medium, and image display control method
EP2770729B1 (en) Apparatus and method for synthesizing an image in a portable terminal equipped with a dual camera
EP2471254B1 (en) Method for transmitting an image photographed by an image pickup apparatus
CN109905852B (en) Apparatus and method for providing additional information by using caller's telephone number
EP3413184B1 (en) Mobile terminal and method for controlling the same
KR20160021637A (en) Method for processing contents and electronics device thereof
US10638041B2 (en) Mobile terminal and method for controlling the same and operating in an always on camera mode
US20150324082A1 (en) Image display control apparatus, image display apparatus, non-transitory computer readable medium, and image display control method
US10205868B2 (en) Live view control device, live view control method, live view system, and program
US9430989B2 (en) Image display control apparatus, image display apparatus, non-transitory computer readable medium, and image display method for displaying images on a divided display
US10863095B2 (en) Imaging apparatus, imaging method, and imaging program
US11715328B2 (en) Image processing apparatus for selecting images based on a standard
US11200651B2 (en) Image processing apparatus, image processing method, and image processing program
US20220215050A1 (en) Picture Search Method and Device
US11170546B2 (en) Image processing apparatus, image processing method, and image processing program
WO2020158102A1 (en) Facial region detection device, image-capturing device, facial region detection method, and facial region detection program
JP2012123594A (en) Image display device, program, and image display method
KR20120066770A (en) Processing method of personal information for mobile phone

Legal Events

Date Code Title Description
AS Assignment

Owner name: FUJIFILM CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MATSUMOTO, TETSUYA;REEL/FRAME:050476/0883

Effective date: 20190717

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE