JP5773563B2 - Image processing apparatus, image processing apparatus control method, and program - Google Patents


Info

Publication number
JP5773563B2
JP5773563B2 (application JP2009196911A)
Authority
JP
Japan
Prior art keywords
image
image data
processing apparatus
unit
image processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2009196911A
Other languages
Japanese (ja)
Other versions
JP2011048655A (en)
JP2011048655A5 (en)
Inventor
佐藤 英生
Original Assignee
キヤノン株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by キヤノン株式会社
Priority to JP2009196911A priority Critical patent/JP5773563B2/en
Publication of JP2011048655A publication Critical patent/JP2011048655A/en
Publication of JP2011048655A5 publication Critical patent/JP2011048655A5/ja
Application granted granted Critical
Publication of JP5773563B2 publication Critical patent/JP5773563B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Description

  The present invention relates to an image processing apparatus, a control method for the image processing apparatus, and a program.

Conventionally, when image data stored in a data processing apparatus such as a digital camera is subjected to processing such as printing or storage by an image processing apparatus such as an MFP (Multifunction Peripheral), the user can perform printing by, for example, the following methods.
The user stores the image data held in the data processing device on a portable medium such as an SD card, selects the image data to be printed from the operation unit while the portable medium is connected to the image processing apparatus, and the selected image data is transmitted to the image processing apparatus.

Alternatively, the portable media or the data processing apparatus main body is connected to a PC (Personal Computer), and image data to be processed is selected via an application running on the PC. Then, the PC transmits the selected image data to the image processing apparatus, and the image processing apparatus executes processing such as printing and storage based on the image data.
Here, user A may want to give friend B those images, among the images stored in a data processing device such as a digital camera, in which friend B appears.

In this case, the user first moves the image data stored in the data processing apparatus to the PC or the image processing apparatus using an SD card or a cable. The user then has to select the image data one by one, checking each time whether friend B appears in it, and issue an instruction for the selected image data.
Further, as a method of transferring image data between devices, for example, a mechanism for performing high-speed wireless communication between devices in a short distance has been devised, such as a method described in Patent Document 1. If such a mechanism is employed in a data processing apparatus such as a digital camera and an image processing apparatus, it is possible to reduce the work load using an SD card or a cable.

JP 2008-99236 A

However, even when the method of Patent Document 1 is used, the user still has to select the image data to be processed by the image processing apparatus one by one while checking which image data shows friend B. Since this burden is not reduced, the operation remains troublesome.
In particular, when the number of image data is large, it takes time for the user to select desired image data.
The present invention has been made in view of such problems, and provides a mechanism that allows the user, when extracting images showing the face of a particular person, to select the reference face with an easy operation.

The image processing apparatus of the present invention that achieves the above object has the following configuration.
That is, an image processing apparatus comprising: an imaging unit; a storage unit that stores a plurality of images captured and generated by the imaging unit; a display unit that displays any one of the plurality of images stored in the storage unit; an accepting unit that, in response to the image processing apparatus being brought close to an external device while a zoom-up operation has been performed on the displayed image so that a part of the image is zoomed up, accepts a face included in the image zoomed up by the zoom-up operation; an extraction unit that extracts, from the storage unit, images showing the face of the person corresponding to the face accepted by the accepting unit; and an output unit that outputs the images extracted by the extraction unit to the external device.

According to the present invention, when extracting images showing the face of a particular person, the user can select the reference face with an easy operation.

FIG. 1 is a diagram illustrating the configuration of an image processing system.
FIG. 2 is a block diagram illustrating the configuration of a controller unit.
FIG. 3 is a diagram illustrating a UI displayed on the image forming apparatus.
FIG. 4 is a block diagram illustrating the configuration of a data processing apparatus.
FIG. 5 is a schematic diagram showing the characteristics of the image data stored in a secondary storage unit.
FIG. 6 is a diagram illustrating an example of image processing of the image forming apparatus.
FIG. 7 is a diagram illustrating an example of image processing of the image forming apparatus.
FIG. 8 is a flowchart illustrating an example of image processing of the image forming apparatus.
FIG. 9 is a flowchart illustrating an example of image processing of the image forming apparatus.
FIG. 10 is a flowchart illustrating an example of image processing of the image forming apparatus.
FIG. 11 is a diagram illustrating an example of a UI displayed on the image forming apparatus.
FIG. 12 is a diagram explaining the storage destination of an image data group.
FIG. 13 is a diagram illustrating an example of a UI displayed on the image forming apparatus.
FIG. 14 is a diagram illustrating an example of a UI displayed on the image forming apparatus.
FIG. 15 is a diagram illustrating an example of image processing of the image forming apparatus.
FIG. 16 is a flowchart illustrating an example of image processing of the image forming apparatus.
FIG. 17 is a diagram illustrating an example of a UI displayed on the image forming apparatus.
FIG. 18 is a diagram illustrating an image data group acquired by the image forming apparatus.

Next, the best mode for carrying out the present invention will be described with reference to the drawings.
<Description of system configuration>
[First Embodiment]
FIG. 1 is a diagram illustrating the configuration of an image processing system including an image forming apparatus 100 and a data processing apparatus 1000 according to the present embodiment. The image forming apparatus 100 is an example of an image processing apparatus. In this embodiment, an example in which the data processing device is a digital camera will be described. However, the data processing device may instead be, for example, a mobile phone or a mobile PC, as long as it includes a display device.
Note that the data processing apparatus and the image forming apparatus are each provided with a proximity wireless communication function that eliminates complicated connection settings and performs high-speed transfer simply by holding one device over the other device to which data is to be transferred. The user can therefore transmit the image data group stored in the data processing apparatus to the image forming apparatus by holding the data processing apparatus over the image forming apparatus (for example, within a distance of 3 cm or less).

In FIG. 1, the controller unit 110 is electrically connected to a reader unit 200 and a printer unit 300; it receives information from the reader unit 200 and the printer unit 300, and sends various commands to them. The controller unit 110 is also connected to PCs (Personal Computers) 4001 and 4002 via the network 4000, and receives image data and control commands from the PC 4001 and the PC 4002 over the network 4000. An example of the network is Ethernet (registered trademark).

The reader unit 200 optically reads a document image and converts it into image data. The reader unit 200 includes a scanner unit 210 having a function for reading a document and a document feeding unit 290 that transports a document sheet to a position where the scanner unit 210 can read the document sheet.
The scanner controller 210A controls the document feeding unit 290 and the scanner unit 210 based on instructions from the controller unit 110.
The printer unit 300 includes a paper feed unit 310 that stores printing sheets, a marking unit 320 that transfers and fixes image data onto a sheet, and a paper discharge unit 330 that discharges printed sheets. The printer controller 320A of the printer unit 300 feeds a sheet from the paper feed unit 310 based on an instruction from the controller unit 110, prints image data on the fed sheet, and discharges the printed sheet to the paper discharge unit 330.

The paper feed unit 310 can store a plurality of types of sheets. Further, the paper discharge unit 330 can perform post-processing of sorting and stapling on the printed sheets.
The operation unit 250 includes, for example, a hard key, a liquid crystal display unit, and a touch panel unit pasted on the liquid crystal display unit, and receives a user instruction through them. The operation unit 250 transmits a command corresponding to the instruction received from the user to the controller unit 110, and the controller unit 110 performs control according to the received command. The operation unit 250 displays the state of the image forming apparatus 100.

The operation unit 250 also displays on the liquid crystal display unit a soft key for accepting an operation of the image forming apparatus 100 and a display indicating the function and state of the image forming apparatus 100.
An HDD (Hard Disk Drive) 260 stores various settings of the image forming apparatus 100 and image data.
Using these configurations, the image forming apparatus 100 executes, for example, a copy function, an image data transmission function, a printer function, and the like. When executing the copy function, the controller unit 110 reads the image data of the original by the reader unit 200 and prints the read image data on a sheet by the printer unit 300. When executing the image data transmission function, the controller unit 110 converts the image data read by the reader unit 200 into code data, and transmits the code data to the PCs 4001 and 4002 via the network 4000. When executing the printer function, the controller unit 110 analyzes and expands the code data received from the PCs 4001 and 4002 via the network 4000, converts the code data into image data, and outputs the image data to the printer unit 300. The printer unit 300 performs printing based on the image data received from the controller unit 110. In other words, it can be said that the controller unit 110 is an image processing unit, and the printer unit 300 is an image forming unit.

In the present embodiment, an MFP having a plurality of functions is used as an example of the image forming apparatus 100. However, the image forming apparatus 100 may be a copier having only a copy function, or an SFP (Single Function Peripheral) having only a printer function. The wireless communication unit 400 performs wireless communication with a data processing apparatus such as a digital camera, a mobile phone, a PDA, or a notebook personal computer. The user can make the image forming apparatus 100 and the data processing apparatus 1000 communicate by bringing the data processing apparatus 1000 close to the wireless communication unit 400. The wireless communication unit 400 detects that the data processing apparatus 1000 has been brought close, and exchanges control data, image data, and the like with the data processing apparatus 1000. The wireless communication unit 400 may be controlled based on instructions from the controller unit 110, or may include its own CPU that controls it. As described above, the image forming apparatus 100 has a function of performing high-speed wireless communication with the data processing apparatus 1000 at a short distance.

FIG. 2 is a block diagram illustrating a configuration of the controller unit 110 illustrated in FIG.
In FIG. 2, the main controller 111 is mainly composed of a CPU (Central Processing Unit) 112, a bus controller 113, and various I/F (interface) controller circuits.
The CPU 112 and the bus controller 113 collectively control the operation of the entire controller unit 110. The CPU 112 executes various operations based on a program read from a ROM (Read only memory) 114 via the ROM I / F 115. For example, the CPU 112 interprets code data (for example, PDL (page description language)) received from the PC 4001 or PC 4002 shown in FIG. 1 based on the read program. In addition, the CPU 112 executes storage control for memories such as the DRAM 116 and the HDD 260.

  The bus controller 113 controls data transfer input / output from each I / F, and controls bus arbitration and DMA (Direct Memory Access) data transfer. A DRAM (Dynamic Random Access memory) 116 is connected to the main controller 111 via a DRAM I / F 117 and is used as a work area for the CPU 112 to operate and an area for storing image data.

The Codec 118 compresses the raster image data stored in the DRAM 116 using a method such as MH / MR / MMR / JBIG / JPEG. The Codec 118 performs processing such as decompressing the code data stored in a compressed state into raster image data.
The SRAM 119 is used as a temporary work area of the Codec 118. The Codec 118 is connected to the main controller 111 via the I / F 120, and data transfer to and from the DRAM 116 is controlled by the bus controller 113 and is DMA-transferred.
A graphic processor (GP) 135 performs image processing such as image rotation, image scaling, color space conversion, and binarization on raster image data stored in the DRAM 116. Further, the graphic processor 135 is provided with a face recognition module 190 that detects a feature of a person's face. Here, the face recognition module 190 executes processing for recognizing a person's face image from display image data displayed on the data processing apparatus 1000.

The SRAM 136 is used as a temporary work area for the GP 135. The GP 135 is connected to the main controller 111 via the I / F 137, and data transfer to and from the DRAM 116 is controlled by the bus controller 113 and DMA-transferred.
The network controller 121 is connected to the main controller 111 by the I / F 123 and is connected to an external network such as the network 4000 by the connector 122.
An expansion connector 124 for connecting an expansion board and an I / O control unit 126 are connected to the general-purpose high-speed bus 125. An example of the general-purpose high-speed bus 125 is a PCI (Peripheral Component Interconnect) bus.

The I / O control unit 126 is equipped with two channels of asynchronous serial communication unit controllers 127 for transmitting and receiving control commands to and from the CPUs of the reader unit 200 and the printer unit 300. The I / O control unit 126 is connected to the scanner I / F 140 and the printer I / F 145 via the I / O bus 128.
The panel I / F 132 is an I / F that exchanges data with the operation unit 250 illustrated in FIG. 1, and transfers the image data transferred from the LCD controller 131 to the operation unit 250.
Further, the panel I / F 132 transfers a key input signal received through a key such as a hard key or a liquid crystal touch panel key provided in the operation unit 250 to the I / O control unit 126 through the key input I / F 130.

The real-time clock module 133 is for updating / saving the date and time managed in the image forming apparatus 100, and is supplied with power by the backup battery 134.
An E-IDE (Enhanced Integrated Drive Electronics) interface (I / F) 161 is used to connect the HDD 260. The CPU 112 stores image data in the HDD 260 or reads image data from the HDD 260 via the E-IDEI / F 161.
The connector 142 and the connector 147 are connected to the reader unit 200 and the printer unit 300, respectively, and each comprises a start-stop synchronized serial I/F (143, 148) and a video I/F (144, 149).

  The scanner I / F 140 is connected to the reader unit 200 via the connector 142 and is connected to the main controller 111 via the scanner bus 141. The scanner I / F 140 performs predetermined processing on the image data received from the reader unit 200. The scanner I / F 140 outputs a control signal generated based on the video control signal sent from the reader unit 200 to the scanner bus 141. Data transfer from the scanner bus 141 to the DRAM 116 is controlled by the bus controller 113.

The printer I / F 145 is connected to the printer unit 300 via the connector 147 and is connected to the main controller 111 via the printer bus 146. The printer I / F 145 performs predetermined processing on the image data output from the main controller 111 and outputs the processed image data to the printer unit 300.
Transfer of raster image data developed on the DRAM 116 to the printer unit 300 is controlled by the bus controller 113. The raster image data is DMA-transferred to the printer unit 300 via the printer bus 146, printer I / F 145, and video I / F 149.

An SRAM (Static Random Access Memory) 151 is a memory that, by means of power supplied from a backup battery, can retain its stored contents even when the power of the entire image forming apparatus 100 is shut off. The SRAM 151 is connected to the I/O control unit 126 via the bus 150.
Similarly, an EEPROM (Electrically Erasable and Programmable Read Only Memory) 152 is a memory connected to the I / O control unit 126 via the bus 150.
The wireless communication I/F 180 is an I/F that exchanges data with the wireless communication unit 400 illustrated in FIG. 1. The CPU 112 receives data from the wireless communication unit 400, and transfers data to it, via the wireless communication I/F 180.

FIG. 3 is a diagram illustrating an example of a standard screen of the operation display unit of the operation unit 250 illustrated in FIG. 1. This example is a screen displayed on a liquid crystal display unit including a touch panel. When the CPU 112 detects that a button has been pressed, it executes the function corresponding to the pressed button.
In FIG. 3, a copy mode key 524 is a key to be pressed when executing a copy function. When the copy mode key 524 is pressed, a copy mode screen 530 is displayed. An extended function key 501 is a key for entering a mode such as double-sided copying, multiple copying, movement, binding margin setting, and frame erasing setting.

Reference numeral 540 denotes a status line, which displays messages indicating the device status and print information. In the example shown in FIG. 3, it indicates that copying is on standby.
An image mode key 502 is a key for entering a setting mode for performing shading, shadowing, trimming, and masking on a copy image.
A user mode key 503 is a key for registering a mode memory and setting a standard mode screen. An applied zoom key 504 is a key for entering a mode for independently scaling the X and Y directions of a document, and a zoom program mode for calculating a scaling factor from the document size and copy size.

  The equal magnification key 512 is a key for setting the copy magnification to 100%. The reduction key 514 and the enlargement key 515 are keys for performing standard reduction and enlargement, respectively. The zoom key 516 is a key for shifting to an operation for setting an arbitrary variable magnification. A paper selection key 513 is a key to be pressed when selecting a copy paper. A fax key 525 is a key to be pressed when performing a fax. The Box key 526 is a key to be pressed when displaying the Box function. A printer key 527 is a key to be pressed when the print density is changed or when detailed print output information of PDL data from a remote host computer is to be referred to. An ID key 528 is a key for instructing to display an ID of the image forming apparatus (for example, a network address such as an IP address or information such as a machine name).

FIG. 4 is a block diagram for explaining the configuration of the data processing apparatus 1000 shown in FIG.
The data processing apparatus 1000 shown in FIG. 4 includes a CPU 1001, a ROM 1002, a RAM 1003, a wireless communication unit 1004, an imaging unit 1005, an operation unit 1006, a display unit 1007, and a secondary storage unit 1008, which are connected to one another via a bus as shown in the figure.
The CPU 1001 operates according to a program stored in the ROM 1002 and controls various operations of the data processing apparatus 1000.
The ROM 1002 is a nonvolatile memory that stores programs executed by the CPU 1001, including a program for performing face recognition of persons included in captured image data.

The RAM 1003 functions as a work memory for the CPU 1001. The RAM 1003 temporarily stores image data output from the imaging unit 1005 and image data read from the secondary storage unit 1008.
The wireless communication unit 1004 includes an encoding / decoding circuit unit necessary for wireless communication, an antenna, and the like, and communicates with an external device that exists in a range where the wireless communication unit 1004 can communicate.
The imaging unit 1005 includes a lens that forms an image from incident light, a photoelectric converter (such as a CCD or CMOS sensor) that converts the formed image into an analog electric signal, and an AD converter that converts the analog electric signal output from the photoelectric converter into a digital electric signal. The CPU 1001 generates image data based on the digital electric signal output from the imaging unit 1005, adds header information such as the capture date and setting data such as the shooting conditions, and stores the result in the secondary storage unit 1008 as one file.
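This capture-and-store step can be sketched as follows. The container layout, field names, and helper function are invented for illustration; the camera's actual file format is not specified here.

```python
import json
import os
import tempfile
import time

def store_captured_image(pixels: bytes, settings: dict, path: str) -> int:
    """Sketch of how the CPU 1001 might save a capture: the image data
    plus header information (capture date, shooting conditions) written
    out as a single file. The header/body layout is hypothetical."""
    header = {"captured_at": time.strftime("%Y-%m-%d %H:%M:%S"),
              "settings": settings}
    blob = json.dumps(header).encode("utf-8") + b"\n" + pixels
    with open(path, "wb") as f:
        f.write(blob)
    return len(blob)

# Hypothetical usage: two bytes of "pixel" data and one shooting setting.
path = os.path.join(tempfile.gettempdir(), "img0001.bin")
size = store_captured_image(b"\x00\x01", {"iso": 200}, path)
```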

A display unit 1007 includes a liquid crystal display unit, and displays an operation screen and captured image data.
The operation unit 1006 includes a release button for instructing photographing, a mode selection dial for selecting the operation mode of the digital camera, a menu button for calling up menu items, buttons such as a cross cursor button for selecting and confirming menu items, dials, switches, and the like. It also includes a switch for zooming up and zooming out the content displayed on the display unit 1007.

The states and state changes of these buttons, dials, and switches are output as electrical signals to the CPU 1001, and the CPU 1001 performs control according to the instructions.
The secondary storage unit 1008 stores captured image data and the like as a file. The secondary storage unit 1008 may be a built-in nonvolatile memory or a removable memory card.
These units are supplied with power by a power supply device such as a battery (not shown).

FIG. 5 is a schematic diagram for explaining the characteristics of the image data stored in the secondary storage unit 1008 shown in FIG. 4. Here, the characteristic is the face information of the persons imaged by the imaging unit 1005, but the characteristic is not limited thereto. In the following description, it is assumed that the secondary storage unit 1008 of the data processing apparatus 1000 stores image data obtained by capturing the persons shown in FIG. 5.
In FIG. 5, the secondary storage unit 1008 stores an image data group, including image data 601 and 602, in which the person A, person B, person C, and person D captured by the user appear in various combinations. Here, when the user wants to print only the image data showing the person A, the CPU 1001 causes the display unit 1007 to display a preview of the image data 601 in accordance with an instruction from the operation unit 1006.

FIGS. 6 and 7 are diagrams for explaining a series of image processing examples for the image data 601 shown in FIG. 5. FIG. 6A corresponds to the state in which the image data 601 is previewed on the display unit 1007. In this example, the person A and the person B are captured against a background.
Here, as shown in FIG. 6B, the user uses the zoom up / zoom out buttons of the operation unit 1006 to display only the person A in a close-up display state. Then, the data processing apparatus 1000 in such a display state is brought close to the wireless communication unit 400 of the image forming apparatus 100.
As a result, the display image data shown in FIG. 6C and the image data group captured by the user are transferred to the image forming apparatus 100. Reference numeral 1301 in FIG. 6C denotes the previewed display image data (with a face image in close-up) displayed as shown in FIG. 6B, and the image data group 1302 is all the image data stored in the data processing apparatus 1000.
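The transfer described above can be pictured as a single payload carrying both the zoomed-up display image (1301) and the full stored image group (1302). The class and field names below are illustrative only, not the actual wireless protocol:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TransferPayload:
    """What the camera hands to the MFP on proximity detection:
    the previewed close-up used as the face-selection reference,
    plus every image stored on the camera."""
    display_image: bytes  # 1301 in FIG. 6C (the close-up preview)
    image_group: List[bytes] = field(default_factory=list)  # 1302

# Hypothetical usage with placeholder byte strings for image data.
payload = TransferPayload(
    display_image=b"close-up-of-person-A",
    image_group=[b"image-601", b"image-602"],
)
```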

  Here, the image forming apparatus 100 performs face detection on the display image data 1301, received from the data processing apparatus 1000 by wireless communication, using the face recognition module 190 of the graphic processor 135, and recognizes the features of the person A in the display image data 1301. It then extracts image data having similar features from the image data group 1302. As a result, the image forming apparatus 100 extracts and holds the image data having the features of the person A from the received image data group, as shown in FIG. 6D. The extracted image data are the targets to be printed by the printer unit 300. That is, of the eight pieces of image data shown in FIG. 5, the four pieces of image data shown in FIG. 6D are printed.
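This recognize-then-extract step might be sketched as follows, using cosine similarity over toy feature vectors in place of the actual features produced by the face recognition module 190. All names, the vectors, and the threshold are assumptions for illustration:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def extract_matching_images(reference_faces, image_group, threshold=0.8):
    """Return the IDs of images whose detected faces match any reference
    face. reference_faces: feature vectors recognized in the display
    image. image_group: (image_id, [face feature vectors]) pairs."""
    matched = []
    for image_id, faces in image_group:
        if any(cosine_similarity(ref, face) >= threshold
               for ref in reference_faces for face in faces):
            matched.append(image_id)
    return matched

# Toy feature vectors standing in for real face embeddings.
person_a = [1.0, 0.0, 0.2]
person_b = [0.0, 1.0, 0.1]
group = [("601", [person_a, person_b]),
         ("602", [person_b]),
         ("603", [person_a])]
print(extract_matching_images([person_a], group))  # ['601', '603']
```

Only images containing a face close enough to the reference survive the filter, mirroring how the apparatus keeps only the images showing person A.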

The CPU 112 executes the above-described series of image processing when the user simply previews the person to be printed on the display unit 1007 of the data processing apparatus 1000 and brings the apparatus close to the wireless communication unit 400. As a result, only the image data in which the person A appears can be automatically extracted from the image data group and printed at once.
Hereinafter, a case where a plurality of persons such as person A and person B are displayed on the display unit 1007 of the data processing apparatus 1000 will be described.
In this case, as in the case described above, the user uses the buttons of the operation unit 1006 to close up the persons A and B of the image data 601 on the display unit 1007 of the data processing apparatus 1000, as shown in FIG. 7A, and a preview is displayed. At this time, the image data group shown in FIG. 7B is transferred to the image forming apparatus 100.

Here, the preview image 1401 in FIG. 7B corresponds to the display image data 601 displayed on the display unit 1007 of the data processing apparatus 1000, and the image data group 1402 is all the image data stored in the data processing apparatus 1000.
The image forming apparatus 100 sequentially performs face detection on the preview image 1401 received from the data processing apparatus 1000 by the face recognition module 190 of the graphic processor 135. Here, the face recognition module 190 recognizes the characteristics of the person A and the person B in the image data.
Then, an image having the same characteristics as the person A is extracted from the image data group 1402, and image data having the same characteristics as the person B is separately extracted from the image data group 1402.
As a result, the result extracted by the face recognition module 190 based on the features of the person A is shown in FIG. 7C, and the result extracted based on the features of the person B is shown in FIG. 7D. What is important here is that image data in which the person A and the person B appear together exists in both extraction results. The image forming apparatus 100 then holds the extracted image data shown in FIGS. 7C and 7D.

That is, of the eight pieces of image data shown in FIG. 5, the printer unit 300 prints the four pieces of image data shown in FIG. 7C, and subsequently prints the five pieces shown in FIG. 7D, for a total of nine prints. As a result, the images common to FIG. 7C and FIG. 7D are printed on a plurality of sheets.
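The two-person case can be sketched as one extraction pass per reference face, which is why images showing both persons end up printed twice. Here simple tags stand in for real face features; every name and value is illustrative:

```python
from typing import Callable, List, Sequence, Set, Tuple

def extract_per_person(reference_faces: Sequence[str],
                       image_group: Sequence[Tuple[str, Set[str]]],
                       match: Callable[[str, str], bool]) -> List[List[str]]:
    """One extraction pass per reference face (FIG. 7C, then FIG. 7D).
    Images showing both persons land in both result lists, so they are
    printed more than once."""
    results = []
    for ref in reference_faces:
        results.append([image_id for image_id, faces in image_group
                        if any(match(ref, face) for face in faces)])
    return results

# Hypothetical tags stand in for real face features.
group = [("601", {"A", "B"}), ("602", {"B"}), ("603", {"A"}),
         ("604", {"A", "B"}), ("605", {"C"})]
per_person = extract_per_person(["A", "B"], group,
                                lambda ref, face: ref == face)
total_printed = sum(len(r) for r in per_person)  # duplicates counted twice
```

In this toy set, images 601 and 604 show both persons, so they appear in both result lists and the total print count exceeds the number of distinct matching images.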
As described above, in this embodiment, the CPU 112 executes the above-described series of image processing when the user simply previews the persons to be printed on the display unit 1007 of the data processing apparatus 1000 and brings the apparatus close to the wireless communication unit 400. As a result, the image forming apparatus 100 can automatically extract all the image data showing the persons A and B from the image data group and print them together.
Note that the face recognition program installed in the ROM 1002 of the data processing apparatus 1000 can likewise extract image data having features similar to those of a person, in the same way as the face recognition module 190 of the image forming apparatus. The above is the basic image processing flow of the present invention; a more detailed operation is described below. In this embodiment, the case where the data processing device 1000 is a digital camera is described as an example for convenience, but the data processing device 1000 may instead be a device such as a mobile phone or a PDA.

  Also, depending on the image forming apparatus 100, there may be a case where the face recognition module 190 is not installed, or a case where a face recognition program is not installed in the ROM 1002 of the data processing apparatus 1000. Therefore, in the present embodiment, a case will be described in which processing is performed in consideration of whether or not a face recognition program is installed in the image forming apparatus 100.

FIGS. 8 to 10 are flowcharts illustrating an example of a data processing procedure in the image forming apparatus according to the present exemplary embodiment. Note that S2001 to S2011, S2101 to S2118, and S2201 to S2207 indicate steps, which are realized by the CPU 112 loading the control program stored in the ROM 114 and the HDD 260 into the DRAM 116 and executing it. In the following description, the data processing apparatus 1000 is a digital camera.
When an image is previewed on the display unit 1007 of the data processing apparatus 1000 serving as a digital camera and the apparatus is brought close to the wireless communication unit 400 of the image forming apparatus 100, a user interface such as that shown in FIG. 11A is displayed.

FIG. 11 is a diagram illustrating an example of a user interface displayed on the display unit of the operation unit 250 illustrated in FIG. 1. This is an example of a screen on which the user selects whether to use the automatic image selection function and, when images are automatically selected, which processing the image forming apparatus 100 performs, that is, print processing or storage (registration processing). BT1 to BT5 are buttons.
In the present embodiment, the image forming apparatus 100 also has a function of storing data in the HDD 260 in addition to printing from the printer unit 300. In the above-described example of the operation outline in the image data storage area in the HDD 260, the image data storage area of the person A and the image data storage area of the person B are secured as different areas as shown in FIG. preferable.
CPU 112 determines whether or not the user has selected button BT1 on the screen shown in FIG. 11A displayed on the display unit of operation unit 250. If the CPU 112 determines that the user has selected the button BT2 without selecting the button BT1, that is, if it is determined that there has been no instruction for automatic image selection, the process advances to step S2201.

  On the other hand, if the CPU 112 determines in S2001 that button BT1 was selected by the user, the process advances to S2002. In S2002, capability information is exchanged between the image forming apparatus 100 and the digital camera. The capability information exchanged here includes information indicating whether the image forming apparatus 100 has the face recognition module 190 and whether the digital camera has a face recognition program installed.

In S2003, the CPU 112 determines whether the digital camera has a face recognition program (that is, whether it has image selection capability). If the CPU 112 determines that it has no face recognition program, the process advances to S2004, where the CPU 112 determines whether the image forming apparatus 100 has the face recognition module 190.
If the CPU 112 determines in S2003 that the digital camera has a face recognition program, the process advances to S2007, where it determines whether the image forming apparatus 100 has the face recognition module 190.

  If the CPU 112 determines in S2004 that face recognition can be performed by neither the digital camera nor the image forming apparatus 100, the process advances to S2005, and the screen shown in FIG. 11B is displayed on the display unit of the operation unit 250. This notifies the user that automatic image selection, though selected in FIG. 11A, cannot be executed. The screen displays button BT7 for canceling the entire process and button BT6 for continuing the printing or storage operation entered in FIG. 11A on the image data in the digital camera.

In S2012, if the CPU 112 determines that button BT7 was selected, the process ends. If the CPU 112 determines that button BT6, indicating continuation of processing, was selected, the process advances to S2201. In this case, the user manually selects the image data to be printed or stored while checking each image.
If the CPU 112 determines in S2004 that only the image forming apparatus 100 can perform face recognition, it decides in S2006 to perform face recognition on the image forming apparatus 100 side, and the process advances to S2010.
Similarly, if the CPU 112 determines in S2007 that only the digital camera can perform face recognition, it decides in S2008 to perform face recognition on the digital camera side, and the process advances to S2010.

On the other hand, if the determination in S2007 is YES, that is, if the CPU 112 determines that both the image forming apparatus 100 and the digital camera can perform face recognition, the process advances to S2009. In S2009, the CPU 112 decides to perform face recognition on the image forming apparatus 100 side, and the process advances to S2010.
In the present embodiment, when both the image forming apparatus 100 and the digital camera can perform face recognition, face recognition is decided to be performed on the image forming apparatus 100 side; however, it may instead be decided to perform it on the digital camera side.
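The capability negotiation of S2003 to S2009 can be summarized as a small decision function. The following is a minimal sketch under assumed names (`choose_recognizer` and its parameters are illustrative, not part of the patent):

```python
# Hypothetical sketch of the face-recognition capability negotiation in
# S2003-S2009: each side reports whether it can perform face recognition,
# and one side is chosen to do the work.

def choose_recognizer(camera_has_program, mfp_has_module, prefer="mfp"):
    """Return which side performs face recognition, or None if neither can.

    camera_has_program / mfp_has_module model the capability information
    exchanged in S2002; 'prefer' models the optional user preset (FIG. 14).
    """
    if camera_has_program and mfp_has_module:
        # S2009: by default the image forming apparatus recognizes,
        # but the embodiment also allows choosing the camera side.
        return prefer
    if mfp_has_module:        # S2006: only the MFP can recognize
        return "mfp"
    if camera_has_program:    # S2008: only the camera can recognize
        return "camera"
    return None               # S2005: notify the user (FIG. 11B)
```

With both capabilities present, the default result is the image forming apparatus side, matching the embodiment's choice in S2009.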
In S2010, the CPU 112 determines, based on the information exchanged in S2002, whether the digital camera is previewing image data on the display unit 1007. If the CPU 112 determines that no preview is being displayed, the process advances to S2011, the screen shown in FIG. 11C is displayed on the display unit of the operation unit 250, and the process returns to S2001. This screen prompts the user to display a preview on the display unit 1007 of the digital camera and then bring the camera close to the wireless communication unit 400 again.

  Note that the user may move away from the wireless communication unit 400 after confirming the screen shown in FIG. 11C, and as a result a different digital camera or device may be brought close to the wireless communication unit 400. Therefore, when continuation is selected with button BT8 on the screen of FIG. 11C, the process returns to S2001.

On the other hand, if it is determined in S2010 that the digital camera is displaying a preview, the process advances to S2101 shown in FIG. 9.
In S2101, the CPU 112 determines whether it has been determined that the image forming apparatus 100 performs face recognition. If face recognition is to be performed on the image forming apparatus 100 side, the process advances to S2102. In S2102, the display image data of the previewed portion, transferred from the digital camera through communication between the wireless communication unit 400 and the wireless communication unit 1004, is acquired.

The display image data of the portion previewed by the digital camera here is, for example, the display image data 1301 shown in FIG. 6 or the display image data 1401.
In S2103, the CPU 112 acquires all image data stored in the digital camera. All image data here means, for example, the image data group 1302 shown in FIG. 6 or the image data group 1402.
When the acquisition of this image data is completed, the CPU 112 disconnects the connection between the image forming apparatus 100 and the digital camera in S2104. In this way, the digital camera that has been held close to the wireless communication unit 400 can be withdrawn and put away at this stage, reducing the possibility of loss through misplacement.

  In S2105, the face recognition module 190 of the image forming apparatus 100 starts processing for detecting a human face in the acquired previewed display image data; a known technique may be used for the detection. In S2106, the CPU 112 determines whether a human face image was detected in the previewed display image data. If the CPU 112 determines that no face image could be detected, the screen shown in FIG. 13A is displayed on the display unit of the operation unit 250 in S2107, and the process returns to S2001.

In the user interface shown in FIG. 13A, a message notifies the user that no face image could be found in the preview image, prompting the user to change the preview image on the digital camera and restart the process.
If the user selects button BT21 shown in FIG. 13A, the process returns to S2001.
On the other hand, if the CPU 112 determines in S2106 that a face image was detected, the process advances to S2108, where the GP 135 executes the face recognition module 190 to recognize the features of the detected face image. If a plurality of face images were detected in S2106, the features are recognized for one face whose features have not yet been recognized.
In S2109, the GP 135 executes the face recognition module 190 to extract, from the received image data group, image data having features similar to the recognized facial features. In S2110, the CPU 112 determines whether the processing of the image forming apparatus 100 determined in S2001 is printing. If the CPU 112 determines that the selected processing is not printing, the one or more pieces of image data extracted in S2109 are stored in the image data storage area in the HDD 260 in S2111, and the process advances to S2112.

On the other hand, if the CPU 112 determines in S2110 that the selected processing is printing, the image data extracted in S2109 is output to the printer unit 300 in S2112 and printed.
When the processing of the image forming apparatus 100, that is, printing or storage, is completed, the CPU 112 determines in S2113 whether the processing has been completed for all the face images detected in S2106. If the CPU 112 determines that faces remain to be processed, the process returns to S2108, and S2108 to S2113 are repeated until the processing is completed for all detected face images.

On the other hand, if the CPU 112 determines in S2113 that the process has been completed for all faces, the process is terminated.
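The per-face loop of S2108 to S2113 amounts to grouping the stored images by the faces found in the preview. The sketch below is illustrative only: real feature vectors and a similarity threshold are replaced by simple label sets, and `select_similar` and the toy database are assumed names:

```python
# A minimal, self-contained sketch of S2108-S2113 using toy "features"
# (person labels) in place of a real face recognizer; the patent's face
# recognition module 190 is modeled by a set-membership test.

def select_similar(preview_faces, image_db):
    """For each face detected in the preview (S2106), collect the stored
    images whose face features match it (S2108-S2109), one group per
    person, as the loop ending at S2113 does."""
    results = {}
    for face in preview_faces:                 # one pass per detected face
        results[face] = [img for img, faces in image_db
                         if face in faces]     # feature-match stand-in
    return results

# Example: preview shows persons A and B; the camera holds five images.
db = [("img1", {"A"}), ("img2", {"B"}), ("img3", {"A", "B"}),
      ("img4", {"C"}), ("img5", {"A"})]
print(select_similar({"A", "B"}, db))
```

For the example database, person A matches img1, img3, and img5, and person B matches img2 and img3, mirroring how an image showing both persons is printed or stored for each of them.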
By repeating this processing, if the previewed image data is the image data 1301, the four pieces of image data shown in FIG. 6D are printed.

When the previewed image data is the display image data 1401, the four pieces of image data shown in FIG. 6D and five further pieces of image data, a total of nine sheets, are printed.
On the other hand, if it is determined in S2101 that the data processing apparatus 1000 performs image selection, the process advances to S2114. In S2114, the CPU 112 acquires the image data selected by the digital camera. When face recognition has been determined to be performed on the digital camera side, the CPU 1001 of the digital camera loads the face recognition program from the ROM 1002 and executes it.
The program detects a face in the preview image displayed on the display unit 1007 of the digital camera, recognizes its features, and extracts image data having the same features from the secondary storage unit 1008. The extracted image data is then transferred to the image forming apparatus 100 by communication between the wireless communication unit 400 and the wireless communication unit 1004.

Next, in S2115, when the acquisition of image data from the digital camera is completed, the CPU 112 disconnects the connection between the image forming apparatus 100 and the digital camera.
In S2116, the CPU 112 determines whether the processing of the image forming apparatus 100 determined in S2001 is printing. If the CPU 112 determines that it is not printing, the image data received in S2114 is stored in the image data storage area in the HDD 260 in S2117, and this process ends.
On the other hand, if the CPU 112 determines in S2116 that the selected processing is printing, the image data received in S2114 is printed from the printer unit 300 in S2118, and this process ends.
On the other hand, if the CPU 112 determines in S2001 that there was no automatic selection instruction, the process advances to S2201 shown in FIG. 10.

  In S2201, all image data stored in the digital camera is acquired through communication between the wireless communication unit 400 and the wireless communication unit 1004. In S2202, the acquired image data is previewed by displaying the user interface shown in FIG. 13B on the display unit of the operation unit 250 of the image forming apparatus 100. The screen shown in FIG. 13B comprises a scroll bar 2601 for scrolling the preview contents, check boxes 2602 for selecting processing-target images from the previewed images, and an enter button 2604 for proceeding with the processing.

In S2203, the process waits for the user to select the image data to be processed. Image data to be processed is selected by checking the check box next to the previewed image data shown in FIG. 13B.
In S2204, check box input is accepted until the enter button 2604 is pressed. If the CPU 112 determines that the enter button 2604 has been pressed, the process advances to S2205. In S2205, the CPU 112 determines whether the processing of the image forming apparatus 100 determined in S2001 is printing.
If the CPU 112 determines that the selected processing is not printing, the checked image data is stored in the image data storage area in the HDD 260 in S2206, and the process ends.

On the other hand, if the CPU 112 determines in S2205 that the selected processing is printing, the image data marked with a check 2603 is printed from the printer unit 300 in S2207, and this process ends.
In the above-described example, a method was described in which the previewed image data (1301 and 1401) themselves are transmitted from the digital camera to the image forming apparatus 100.
Instead, simpler information may be transmitted, such as a designation of the previewed image data 601 among all the image data in the digital camera shown in FIG. 5, together with coordinate information indicating the previewed range.
If the image forming apparatus 100 that receives such information generates the preview image on its own side, the series of processes described above can likewise be executed.
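Regenerating the preview on the receiving side from an image identifier plus coordinate information is essentially a crop. A minimal sketch, with a pure-Python pixel grid standing in for real image data and an assumed `(left, top, right, bottom)` coordinate convention:

```python
# Illustrative sketch: instead of sending preview pixels, the camera sends
# only the previewed rectangle, and the image forming apparatus rebuilds
# the preview by cropping the full image data it already received.

def crop(pixels, box):
    """Regenerate the previewed region from a full image (list of rows)
    and the coordinate info (left, top, right, bottom) sent by the camera."""
    left, top, right, bottom = box
    return [row[left:right] for row in pixels[top:bottom]]

# A 6x4 toy image whose "pixels" are their own (x, y) coordinates.
full = [[(x, y) for x in range(6)] for y in range(4)]
preview = crop(full, (1, 1, 4, 3))   # previewed range: 3 wide, 2 tall
```

Only the four box coordinates need to cross the wireless link, which is far less data than the preview image itself.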
In addition, it is preferable to erase from the HDD 260 all received image data remaining in the image forming apparatus 100 after the printing process is performed, and likewise to delete from the HDD 260 all received image data that is not a storage target after the storage process is performed.
Further, when a face recognition program is installed in the digital camera and the face recognition module 190 is present in the image forming apparatus 100, the user may be allowed to set in advance which apparatus performs face recognition.

For example, the screen shown in FIG. 14 is displayed on the display unit of the operation unit 250, and the user selects either button BT31 or button BT32.
If it is determined that both the image forming apparatus 100 and the digital camera can perform face recognition, face recognition may be performed by the apparatus the user selected on the screen of FIG. 14. In this way, when both apparatuses can perform face recognition, the user can freely choose which one performs it.
According to the embodiment described above, the user can cause the image processing apparatus to process desired image data with the simple operation of displaying a preview of an image of the person to be printed and bringing the device close to the communication unit of the image forming apparatus. Moreover, merely by bringing the digital camera or the like close to the communication unit of the image forming apparatus and selecting the person to be printed, the number of prints is determined automatically and the printed or stored images are classified by person in the storage device, reducing the user's workload.
It should be noted that the image storage areas of user A and user B shown in FIG. 12 are accessed by different instructions. For example, when user A issues an instruction to display the image storage area, the CPU 112 displays the image data stored in the "A image storage area"; when user B issues such an instruction, the CPU 112 displays the image data stored in the "B image storage area". A password may also be set for each of the "A image storage area" and the "B image storage area". When a password is set and a display instruction is received, the CPU 112 requests entry of the password set for the image storage area concerned, and displays the stored image data only when the entered password matches the one set for that area. The user can also issue a print instruction or a transmission instruction for image data stored in an image storage area: upon a print instruction, the CPU 112 performs printing based on the stored image data; upon a transmission instruction, the CPU 112 transmits the stored image data to the external device.
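The per-person storage areas with optional passwords can be sketched as follows. This is a minimal illustration only; the class and method names (`ImageStorageArea`, `display`) are assumptions, not part of the patent:

```python
# Hedged sketch of the per-person image storage areas of FIG. 12, where
# each area may carry its own password that the CPU requests on a
# display instruction.

class ImageStorageArea:
    def __init__(self, owner, password=None):
        self.owner = owner
        self._password = password      # None means no password set
        self._images = []

    def add(self, image):
        self._images.append(image)

    def display(self, password=None):
        # The password is checked only when one has been set for the area.
        if self._password is not None and password != self._password:
            raise PermissionError("wrong or missing password")
        return list(self._images)

area_a = ImageStorageArea("A", password="1234")
area_a.add("imgA1")
```

A display instruction with the wrong password is refused, while an area with no password set remains freely viewable.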

[Second Embodiment]
In the above-described embodiment, processing for automatically extracting, from the image data captured by the digital camera, image data in which person A or person B appears, and printing or storing it, was described. In contrast, when the image data group stored in the data processing apparatus 1000 is as shown in FIG. 15A, an embodiment in which only image data in which both person A and person B appear is processed will be described below.

FIG. 15 is a diagram illustrating an example of image processing in the image forming apparatus according to the present embodiment. The face image determination processing in the present embodiment is described below.
According to FIG. 15A, image data 3601 and image data 3602 are the only image data in the group in which person A and person B appear at the same time, so the user has no choice but to preview one of them. The detailed content of the image data 3601 is shown in FIG. 15B.
Even if the zoom-up and zoom-out buttons of the operation unit 1006 are used, the display unit 1007 cannot display only person A and person B, as shown in FIG. 15B.
Therefore, in the present embodiment, after the preview image showing persons A, B, and C is transferred to the image forming apparatus 100, persons A and B are specified on the image forming apparatus 100 side, so that only image data in which the desired persons A and B appear is processed. A more detailed operation is described below using a flowchart; for convenience, the case where the data processing apparatus 1000 is a digital camera is taken as an example.

  FIG. 16 is a flowchart illustrating an example of a data processing procedure in the image forming apparatus according to the present exemplary embodiment. In the present embodiment, only the differences from the flowchart shown in FIG. 9 are described; the same steps as in FIG. 9 are therefore denoted by the same step numbers. S3501 and S3502 are realized by the CPU 112 loading a control program from the ROM 114 or the HDD 260 into the DRAM 116 and executing it. These steps are inserted between S2104 and S2105 shown in FIG. 9.

When image data is previewed on the display unit 1007 of the digital camera and the camera is brought close to the communication unit 10 of the image forming apparatus 100, the user interface shown in FIG. 17 is displayed on the display unit of the operation unit 250.
In FIG. 17, BT41 to BT45 are buttons.
The user uses buttons BT41 to BT44 to select whether to use the function of the present invention, to select which processing, that is, printing or storage, the image forming apparatus 100 performs when images are selected automatically, and to make a detailed selection within the image.
The detail selection menu 3001 is a menu for specifying persons A and B from the previewed image data in which persons A, B, and C appear. Checking the check box 3002 enables this function.
FIG. 18 is a diagram showing the display image data and the image data group transferred from the digital camera to the image forming apparatus 100, which consist of the display image data 3401 to be previewed and the entire image data group 3402 stored in the digital camera.

Also in the present embodiment, the operation follows the flow described with reference to FIGS. 8, 9, and 10; however, after the CPU 112 executes S2104, the processing differs from the flow already described.
Specifically, after the display image data 3401 to be previewed and the image data group 3402 are received from the digital camera as shown in FIG. 15C, the process advances to S3501. In S3501, the CPU 112 of the image forming apparatus determines whether detailed selection was selected via the check box 3002 in FIG. 17. If the CPU 112 determines that detailed selection was not selected, the process advances to S2105. In this case, the processing described in the first embodiment is performed for all of persons A, B, and C, and the process ends.

On the other hand, if the CPU 112 determines in S3501 that the detailed selection shown in FIG. 17 was selected, the screen shown in FIG. 15D is displayed on the display unit of the operation unit 250. On this screen, the image forming apparatus 100 specifies the persons A and B to be processed from among the persons A, B, and C shown in the image data previewed on the display unit 1007 of the data processing apparatus 1000.
The screen of FIG. 15D includes the previewed display image data 3401, a redo button 3303, and an enter button 3304.

When the user points to the face of person A and the face of person B with a pointing device such as the touch pen 3301, the CPU 112 displays a cursor 3302 enclosing each selected face image on the screen of the display unit 1007. If the input is wrong, the user can press the redo button 3303 to erase the cursors and point again.
When processing may be performed on the face images enclosed by the cursors 3302, the user presses the enter button 3304. Upon this, the CPU 1001 advances the processing, the processing from S2106 onward is performed on the areas enclosed by the cursors, and this process ends.
As a result, even if persons A, B, and C appear simultaneously in the previewed display image data 3401 shown in FIG. 18, the operation of the present invention can be performed only for the desired persons A and B.
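The second embodiment's selection rule, keeping only the images in which every designated person appears, reduces to a subset test. A minimal sketch with illustrative data shapes (`images_with_all` and the label sets are assumptions, not from the patent):

```python
# Sketch of the second embodiment: only images in which ALL the faces the
# user designated with the cursors 3302 (here persons A and B) appear are
# processed, unlike the per-person processing of the first embodiment.

def images_with_all(selected, image_db):
    """Keep only images whose detected-face set contains every selected
    person (intersection semantics)."""
    return [img for img, faces in image_db if selected <= faces]

# Toy version of the image data group of FIG. 15A.
db = [("3601", {"A", "B"}), ("3602", {"A", "B", "C"}),
      ("3603", {"A"}), ("3604", {"B", "C"})]
print(images_with_all({"A", "B"}, db))
```

For the toy data, only images 3601 and 3602 survive, just as in FIG. 15A only those two images show both person A and person B.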
The object of the present invention can also be achieved by the following processing: a storage medium recording the program code of software that realizes the functions of the above-described embodiments is supplied to a system or apparatus, and a computer (or CPU, MPU, or the like) of the system or apparatus reads the program code stored in the storage medium. In this case, the program code itself read from the storage medium realizes the functions of the above-described embodiments, and the program code and the storage medium storing it constitute the present invention.
The present invention is not limited to the above embodiments; various modifications (including organic combinations of the embodiments) are possible based on the spirit of the present invention, and such modifications are not excluded from the scope of the present invention.

100 Image forming apparatus
1000 Data processing apparatus

Claims (5)

  1. An image processing apparatus comprising imaging means,
    storage means for storing a plurality of images captured and generated by the imaging means;
    display means for displaying any one of the plurality of images stored in the storage means;
    accepting means for accepting, in response to the image processing apparatus being brought close to an external device in a state in which the image displayed by the display means has been zoomed up to close up a part of the image, a face included in the image closed up by the zoom-up operation;
    extracting means for extracting, from the storage means, an image of a person's face corresponding to the face accepted by the accepting means; and
    output means for outputting the image extracted by the extracting means to the external device;
    An image processing apparatus comprising:
  2. The image processing apparatus according to claim 1, wherein the image processing apparatus is a digital camera, a mobile phone, or a mobile PC.
  3. The image processing apparatus according to claim 1, wherein the output means wirelessly transmits the image extracted by the extracting means to the external device.
  4. A method of controlling an image processing apparatus comprising an imaging unit and a storage unit that stores a plurality of images captured and generated by the imaging unit, the method comprising:
    a display step of displaying any one of the plurality of images stored in the storage unit on a display unit;
    an accepting step of accepting, in response to the image processing apparatus being brought close to an external device in a state in which a zoom-up operation to close up a part of the image displayed in the display step has been performed, a face included in the image closed up by the zoom-up operation;
    an extraction step of extracting, from the storage unit, an image showing a person's face corresponding to the face accepted in the accepting step; and
    an output step of outputting the image extracted in the extraction step to the external device.
  5. A program for causing a computer to execute the method of controlling an image processing apparatus according to claim 4.
JP2009196911A 2009-08-27 2009-08-27 Image processing apparatus, image processing apparatus control method, and program Active JP5773563B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2009196911A JP5773563B2 (en) 2009-08-27 2009-08-27 Image processing apparatus, image processing apparatus control method, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2009196911A JP5773563B2 (en) 2009-08-27 2009-08-27 Image processing apparatus, image processing apparatus control method, and program

Publications (3)

Publication Number Publication Date
JP2011048655A JP2011048655A (en) 2011-03-10
JP2011048655A5 JP2011048655A5 (en) 2012-10-11
JP5773563B2 true JP5773563B2 (en) 2015-09-02

Family

ID=43834901

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2009196911A Active JP5773563B2 (en) 2009-08-27 2009-08-27 Image processing apparatus, image processing apparatus control method, and program

Country Status (1)

Country Link
JP (1) JP5773563B2 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002354401A (en) 2001-05-30 2002-12-06 Sony Corp Camera built-in recording and reproducing equipment and reproducing method
JP4367897B2 (en) 2003-02-27 2009-11-18 キヤノン株式会社 Image display control apparatus and method
JP2005086516A (en) * 2003-09-09 2005-03-31 Canon Inc Imaging device, printer, image processor and program
JP4474885B2 (en) * 2003-09-30 2010-06-09 カシオ計算機株式会社 Image classification device and image classification program
JP2006079220A (en) 2004-09-08 2006-03-23 Fuji Photo Film Co Ltd Image retrieval device and method
JP2009147454A (en) * 2007-12-11 2009-07-02 Canon Inc Image processing apparatus, control method of image processing apparatus and program

Also Published As

Publication number Publication date
JP2011048655A (en) 2011-03-10

Similar Documents

Publication Publication Date Title
JP5930777B2 (en) Printing apparatus, portable terminal and control method therefor, printing system, computer program
JP5268274B2 (en) Search device, method, and program
JP3549403B2 (en) File system
US7843578B2 (en) Image forming apparatus and method of controlling the same
JP4125208B2 (en) Image processing apparatus and image processing method
JP4046985B2 (en) Imaging device, file storage warning method, computer-readable storage medium, and program
US7379201B2 (en) Image processing apparatus and image processing method
US7038795B2 (en) Image input/output apparatus, method of controlling image input/output apparatus, image input/output system, and storage media
CN102077576B (en) Configuring apparatus, image output apparatus and methods of controlling same
CN101800826B (en) Image processing apparatus, terminal, printer apparatus and image processing method, having image restoring function
EP2538652A1 (en) Document scanning apparatus
JP2006067116A (en) Data processing method of information processing system, information processing system, storage medium and program
JP6218456B2 (en) Printing apparatus, communication method, and program
JP3977392B2 (en) Image forming apparatus and control method thereof
US8477352B2 (en) Image forming apparatus, control method thereof, image forming system, and program
US7804520B2 (en) Image sending and receiving system, image sending apparatus and image receiving apparatus
JP4141342B2 (en) Image forming apparatus
JP5967980B2 (en) Recording system, recording apparatus, and communication method
US8839104B2 (en) Adjusting an image using a print preview of the image on an image forming apparatus
JP5917096B2 (en) Print setting apparatus, print setting method, and program
JP2005217624A (en) Imaging apparatus, image processing apparatus, and image forming apparatus
US8553274B2 (en) Image processing apparatus, method for controlling the same, and storage medium
JP2006081081A (en) Painting and calligraphic work storage device, image reader and device for forming image
US20030142348A1 (en) Image forming system and image forming apparatus
EP1014258A3 (en) Automatic data routing via voice command annotation

Legal Events

Date Code Title Description
A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20120823

A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20120823

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20130619

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20130625

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20130823

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20131015

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20140114

A911 Transfer of reconsideration by examiner before appeal (zenchi)

Free format text: JAPANESE INTERMEDIATE CODE: A911

Effective date: 20140122

A912 Removal of reconsideration by examiner before appeal (zenchi)

Free format text: JAPANESE INTERMEDIATE CODE: A912

Effective date: 20140131

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20150123

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20150508

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20150630