CN101866349A - Image collating device - Google Patents

Info

Publication number
CN101866349A
Authority
CN
China
Prior art keywords
image
mentioned
sheet
collating device
control part
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN 201010165855
Other languages
Chinese (zh)
Other versions
CN101866349B (en)
Inventor
原聪司
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Corp
Original Assignee
Olympus Imaging Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Imaging Corp filed Critical Olympus Imaging Corp
Priority to CN201210296613.5A priority Critical patent/CN102982062B/en
Publication of CN101866349A publication Critical patent/CN101866349A/en
Application granted granted Critical
Publication of CN101866349B publication Critical patent/CN101866349B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Processing Or Creating Images (AREA)
  • Studio Devices (AREA)
  • Television Signal Processing For Recording (AREA)

Abstract

An object of the present invention is to provide an image collating device with which a user can classify images enjoyably. To this end, in the present invention, when the user's finger or the like touches a part (300a) of an image (300) displayed on a display unit (105), a small region containing the touch position is extracted as an image sheet (300b). A "my map" image is generated by pasting this image sheet (300b) onto another image, and the user can then use the my-map image to search for images.

Description

Image collating device
Technical field
The present invention relates to an image collating device having a function for organizing images recorded on a recording medium.
Background technology
In recent years the recording capacity of the recording media used for storing images has increased, and with this increase a large number of images can be recorded. Conversely, it has become difficult for the user to retrieve a desired image from the many images recorded on such a medium. In view of this situation, various techniques for classifying and organizing the images recorded on a recording medium have been proposed (see, for example, Japanese Patent No. 3200819).
Conventional image classification techniques, such as that of Japanese Patent No. 3200819 mentioned above, include various approaches, for example manual classification and automatic classification, but the classification of the images itself offers no enjoyment.
Summary of the invention
An object of the present invention is to provide an image collating device with which a user can classify images enjoyably.
To achieve the above object, an image collating device according to one aspect of the present invention is characterized by comprising: a display unit that displays a first image recorded in a recording unit; a touch panel that detects whether a touch operation has been performed on the first image displayed on the display unit, and the touch position when such a touch operation has been performed; an image sheet extraction unit that, when the touch operation has been performed, extracts from the first image an image of a small region containing the touch position as an image sheet; and an other-image generation unit that pastes the extracted image sheet onto another image to generate a second image.
Description of drawings
Fig. 1 is a diagram showing the structure of an image collating device according to one embodiment of the present invention.
Fig. 2A is a first diagram showing the structure of a photo-sensing touch panel as one example of the touch panel.
Fig. 2B is a second diagram showing the structure of the photo-sensing touch panel.
Fig. 2C is a third diagram showing the structure of the photo-sensing touch panel.
Fig. 3 is a flowchart showing the main operation of a camera according to the embodiment of the present invention.
Fig. 4 is a flowchart showing the image classification processing.
Fig. 5A is a first diagram outlining the extraction of an image sheet.
Fig. 5B is a second diagram outlining the extraction of an image sheet.
Fig. 6A is a first diagram showing an example of a "my map" image.
Fig. 6B is a second diagram showing an example of a my-map image.
Fig. 7 is a flowchart showing an example of the my-map image generation processing.
Fig. 8 is a diagram illustrating the priority display area.
Fig. 9A is a diagram showing an example of the classification result file.
Fig. 9B is a diagram illustrating the face positions and face sizes recorded in the classification result file.
Fig. 10 is a flowchart showing the reproduction processing of a my-map image.
Fig. 11A is a first diagram outlining the reproduction processing of a my-map image.
Fig. 11B is a second diagram outlining the reproduction processing of a my-map image.
Fig. 11C is a third diagram outlining the reproduction processing of a my-map image.
Fig. 12 is a flowchart showing a variation of the image sheet extraction.
Fig. 13A is a first diagram outlining the image sheet extraction of the variation.
Fig. 13B is a second diagram outlining the image sheet extraction of the variation.
Fig. 13C is a third diagram outlining the image sheet extraction of the variation.
Embodiment
Embodiments of the present invention are described below with reference to the drawings.
Fig. 1 is a diagram showing the structure of an image collating device according to one embodiment of the present invention. Here, the image collating device shown in Fig. 1 is exemplified by a digital camera (hereinafter simply called a camera).
The camera 100 shown in Fig. 1 has a control unit 101, an operation unit 102, an imaging unit 103, a face detection unit 104, a display unit 105, a touch panel 106, a recording unit 107, a clock unit 108, and a communication unit 109.
The control unit 101 is composed of, for example, an LSI (large-scale integrated circuit) for camera control, and controls each block inside the camera 100 of Fig. 1 in response to user operations on the operation unit 102. The control unit 101 also applies image processing, such as white balance correction and compression/decompression, to the images obtained through the imaging unit 103. In addition, the control unit 101 of the present embodiment functions as an image sheet extraction unit that extracts image sheets from an image (a first image) displayed on the display unit 105 in response to operations on the touch panel 106. It further functions as an image sheet classification unit that classifies the extracted image sheets, as an other-image generation unit that generates another image (a second image) in which the image sheets are arranged according to the classification results, and as an image retrieval unit that retrieves images from the other image. Details of these functions are described later.
The operation unit 102 comprises the controls with which the user operates the camera 100. It includes, for example, a power button for switching the camera 100 on and off, a release button with which the user instructs photographing, a reproduction button with which the user instructs image reproduction, and selection buttons with which the user performs various selection operations on the camera 100.
The imaging unit 103 is composed of a photographic lens, an image sensor, an A/D conversion unit, and the like. The photographic lens guides the light beam from a subject (not shown) to the image sensor. The image sensor converts the subject image formed by the light beam from the photographic lens into an electric signal. The A/D conversion unit converts the analog electric signal obtained by the image sensor into digital data (hereinafter called image data).
The face detection unit 104 detects face regions in the image data obtained by the imaging unit 103. Face regions are detected by finding facial features, such as eyes, nose, and mouth, in the image.
The display unit 105 is provided, for example, on the back of the camera 100 and displays various images, such as images obtained through the imaging unit 103 and images recorded in the recording unit 107. It is composed of, for example, a liquid crystal display. Through the display unit 105 the user can check the composition of the subject and the shutter timing.
The touch panel 106 is formed integrally with the display unit 105, and detects whether the user has performed a touch operation on the display screen of the display unit 105 and the touch position when a touch operation has been performed. The touch panel 106 also detects the pressing amount applied to it by the user's finger or the like during a touch operation, and the slide direction and slide amount when the user's finger or the like performs a slide operation after the touch operation.
Fig. 2A to Fig. 2C show the structure of a photo-sensing touch panel as one example of the touch panel 106. The display unit 105 shown in Fig. 2A is illustrated as a liquid crystal display and has a display panel 105a and a backlight 105b. The display panel 105a is a two-dimensional arrangement of pixels containing liquid crystal, and the light transmittance of each pixel is varied under the control of the control unit 101. The backlight 105b uses, for example, white light-emitting diodes (LEDs) as a light source and illuminates the display panel 105a from behind under the control of the control unit 101. In the example of the present embodiment, a photo-sensor serving as the touch panel 106 is arranged on each pixel of the display panel 105a.
In this structure, as shown in Fig. 2A, when the user's finger 200 touches the display panel 105a, the light emitted from the backlight 105b is reflected at the contact position of the finger 200, and this reflected light is received by the photo-sensors. The control unit 101 monitors the output signals of the photo-sensors arranged on the pixels, and detects the touch position of the user's finger 200 from the positions of the photo-sensors producing a signal output.
As indicated by arrow A in Fig. 2B, when the user's finger 200 presses harder on the touch panel 106, the photo-sensors producing a signal output are distributed over a wider range than when the pressing amount is small. From this distribution of the photo-sensors producing a signal output, the pressing force of the user's finger 200 on the touch panel 106 can be detected.
As indicated by arrow B in Fig. 2C, when the user's finger 200 slides, the positions of the photo-sensors producing a signal output also shift in the direction of arrow B. The slide direction and slide amount of the user's finger 200 can thus be detected from the direction and amount of the positional change of the photo-sensors producing a signal output.
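The patent describes this readout only at the block level. As an illustration, the following Python sketch shows one plausible way to derive the three quantities above from a frame of photo-sensor outputs; the function name, threshold, and the spread-based pressure proxy are assumptions, not part of the disclosure.

```python
import numpy as np

def read_touch(sensor_grid, prev_centroid=None, threshold=0.5):
    """Interpret one frame of photo-sensor outputs (2D array, one value per
    display pixel). Returns (centroid, pressure, slide_vector)."""
    active = np.argwhere(sensor_grid > threshold)   # sensors receiving reflected light
    if active.size == 0:
        return None, 0.0, (0.0, 0.0)                # no touch this frame
    centroid = active.mean(axis=0)                  # touch position (row, col)
    # A harder press flattens the fingertip, so the firing sensors spread
    # over a wider area: use that spread as a proxy for pressing force.
    pressure = float(active.std(axis=0).sum())
    slide = (0.0, 0.0)
    if prev_centroid is not None:                   # slide = centroid motion between frames
        slide = tuple(centroid - np.asarray(prev_centroid))
    return tuple(centroid), pressure, slide
```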
The touch panel 106 is not limited to the photo-sensing type; touch panels of various other structures may also be used, such as a pressure-sensitive touch panel that detects touch operations through the contact pressure of the user's finger or the like.
The recording unit 107 records the image data obtained by the imaging unit 103 and compressed by the control unit 101. The image data are recorded in the form of image files to which photographing conditions, such as the photographing date and time measured by the clock unit 108, have been added. The recording unit 107 of the present embodiment also has a recording area serving as a classification information recording unit 107a and a recording area serving as an arranged-image recording unit 107b. The classification information recording unit 107a records a classification result file representing the results of the classification of the image data by the control unit 101. The arranged-image recording unit 107b records other images (hereinafter called "my map" images), which are generated as the result of arranging image sheets according to the classification result file.
The clock unit 108 measures the date and time at which an image was photographed, and the like. The communication unit 109 is a communication circuit, such as a USB interface, for communication between the camera 100 and external devices.
Next, the operation of the camera 100 according to the present embodiment is described. Fig. 3 is a flowchart showing the main operation of the camera 100.
In Fig. 3, the control unit 101 of the camera 100 first determines whether the power of the camera 100 is on (step S101). If the power is off, the control unit 101 ends the processing of Fig. 3.
If the power is on, the control unit 101 determines whether the mode of the camera 100 is the photographing mode (step S102). If it is, the control unit 101 operates the imaging unit 103 for live view display. At this time, the control unit 101 also operates the face detection unit 104 to detect face regions in the image data obtained through the imaging unit 103 (step S103). The control unit 101 then performs an exposure calculation (step S104): it measures the brightness of the subject from the image data obtained through the imaging unit 103, and from this brightness it computes the exposure with which the image to be obtained at the photographing described later will have the proper brightness. At photographing time, the imaging unit 103 is controlled according to the exposure obtained in step S104.
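The patent states only that the subject brightness is measured from the image data and that an exposure giving the proper image brightness is computed. A minimal sketch of such a calculation, assuming an RGB live-view frame and a conventional mid-gray target (both assumptions), might look as follows.

```python
import numpy as np

def compute_exposure_ev(frame_rgb, target_luma=118.0):
    """Measure subject brightness from a live-view frame and return the EV
    correction that would bring the mean luminance to a mid-gray target."""
    luma = (0.299 * frame_rgb[..., 0] +
            0.587 * frame_rgb[..., 1] +
            0.114 * frame_rgb[..., 2])
    mean = max(float(luma.mean()), 1e-3)   # guard against log of zero
    return float(np.log2(target_luma / mean))
```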
Next, the control unit 101 performs live view display on the display unit 105 (step S105). In this live view display, the control unit 101 successively processes the images obtained through the imaging unit 103 and displays them on the display unit 105. By operating the imaging unit 103 continuously and displaying the resulting images continuously on the display unit 105 of the camera 100, the display unit 105 can be used as an electronic viewfinder. Through this live view display the user can decide the composition and the shutter timing.
After the live view display, the control unit 101 determines whether the photographer has performed a release operation, i.e., whether the release button of the operation unit 102 has been pressed (step S106). If the release button has not been pressed, processing returns to step S101 and the control unit 101 again determines whether the power of the camera 100 is on. If the release button has been pressed, the control unit 101 executes the photographing operation according to the exposure computed in step S104. The control unit 101 then applies various image processing to the image data obtained through the imaging unit 103 as a result of the photographing operation, generates an image file, and records the generated image file in the recording unit 107 (step S107). Afterwards, the control unit 101 reproduces the image file recorded in the recording unit 107 and displays the photographed image on the display unit 105 (step S108).
Next, the control unit 101 classifies the image data in the image files recorded in the recording unit 107 (step S109). After the classification, the control unit 101 generates a classification result file from the classification results and records it in the recording unit 107, or updates the classification result file if one has already been generated (step S110). Processing then returns to step S101. Details of the image classification of step S109 are described later.
If, in the determination of step S102, the mode of the camera 100 is not the photographing mode, the control unit 101 determines whether the mode of the camera 100 is the reproduction mode (step S111). If it is not the reproduction mode, processing returns to step S101. If it is the reproduction mode, the control unit 101 performs the reproduction-mode processing: it reproduces the most recently recorded image file (the one with the latest photographing date and time) among the image files recorded in the recording unit 107 and displays it on the display unit 105 (step S112).
The control unit 101 then determines whether image classification has been instructed by the user (step S113). This determination is made according to whether the user has operated the operation unit 102, or whether the user's finger or the like has touched an image classification instruction button (not shown) displayed on the display screen of the display unit 105. If image classification has been instructed, the control unit 101 performs the image classification processing (step S114). After the classification, the control unit 101 generates a classification result file from the classification results and records it in the recording unit 107, or updates the classification result file if one has already been generated (step S115). The image classification processing of step S114 is the same as that of step S109; its details are described later.
After step S113 or step S115, the control unit 101 determines whether a display other than the normal display is to be performed (step S116). This determination is made according to whether the user has operated the operation unit 102, or whether the user's finger or the like has touched a my-map reproduction button or a thumbnail display button (not shown) displayed on the display screen of the display unit 105. If no other display is to be performed, processing returns to step S101.
If another display is to be performed, the control unit 101 determines whether my-map reproduction has been instructed by the user (step S117). If it has, the control unit 101 reproduces a my-map image recorded in the arranged-image recording unit 107b of the recording unit 107 (step S118). After the reproduction processing of the my-map image ends, processing returns to step S116. Details of the my-map reproduction processing are described later.
If my-map reproduction has not been instructed, the control unit 101 determines whether thumbnail display has been instructed (step S119). If it has not, after a predetermined time has elapsed, the control unit 101 displays on the display unit 105 the image with the next most recent photographing date and time among the image files recorded in the recording unit 107 (step S120). Processing then returns to step S113.
If thumbnail display has been instructed, the control unit 101 displays thumbnail images of the images recorded in the recording unit 107 on the display unit 105 (step S121). The control unit 101 then determines from the output of the touch panel 106 whether a thumbnail image has been selected, that is, whether the user's finger or the like has touched any of the thumbnail images displayed on the display unit 105 (step S122). If one of the thumbnail images has been touched, the control unit 101 enlarges the image corresponding to the thumbnail at the touch position and displays it on the display unit 105 (step S123). Processing then returns to step S113. If no thumbnail image has been touched, processing likewise returns to step S113.
Next, the image classification processing of steps S109 and S114 is described with reference to Fig. 4. Before the image classification processing, a photographed or reproduced image (hereinafter simply called the image) 300 is displayed on the display unit 105 as shown in Fig. 5A. When the image classification processing starts, the control unit 101 displays on the display unit 105, as shown in Fig. 5A, classification item indicators 301a to 301c, with which the user classifies image sheets, and an end button 302, with which the user instructs the end of image classification (step S201). The control unit 101 then determines from the output of the touch panel 106 whether the user's finger or the like has performed a touch operation on the display screen of the display unit 105 (step S202), and waits at step S202 until a touch operation is performed.
When a touch operation has been performed, the control unit 101 determines the touch position of the user's finger or the like from the output of the touch panel 106 (step S203). The control unit 101 then determines whether the detected touch position is the position of the end button 302 (step S204). If it is, the control unit 101 ends the processing of Fig. 4, and processing continues from step S110 or step S115.
If the touch position is not the position of the end button 302, the control unit 101 determines whether the touch position is within the image 300 (step S205). If it is not, processing returns to step S202.
If the touch position is within the image 300, the control unit 101 extracts the small region containing the touch position as an image sheet (step S206). For example, when the user's finger 200 touches the position of the flower 300a shown in Fig. 5A, the image sheet 300b is extracted as the small region containing the flower 300a. The size of the image sheet 300b is set according to the pressing force of the user's finger 200: when the user presses the display screen of the display unit 105, an image sheet 300b of a size corresponding to the pressing strength is extracted. When the image sheet 300b has been extracted, a frame is displayed so that the extracted image sheet 300b can be clearly seen, as shown in Fig. 5A. In the example of Fig. 5A a rectangular image sheet 300b is extracted, but the shape of the image sheet is not limited to a rectangle and may be, for example, circular.
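A minimal sketch of this extraction step (S205-S206), assuming a numpy image array and that the touch position and pressure come from a readout like the earlier sketch; the size constants are illustrative only.

```python
def extract_image_sheet(image, touch_rc, pressure, base_size=40, gain=20.0):
    """Cut out a square 'image sheet' centered on the touch position; the
    side length grows with the pressing force of the finger."""
    h, w = image.shape[:2]
    half = max(int(base_size + gain * pressure) // 2, 1)
    r, c = int(touch_rc[0]), int(touch_rc[1])
    top, left = max(r - half, 0), max(c - half, 0)
    bottom, right = min(r + half, h), min(c + half, w)
    return image[top:bottom, left:right].copy()
```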
After the image sheet 300b has been extracted, the control unit 101 detects from the output of the touch panel 106 whether the user's finger or the like has performed a slide operation (step S207). If no slide operation has been performed, processing returns to step S202. If a slide operation has been performed, the control unit 101 moves the position of the image sheet 300b according to the slide amount of the user's finger 200 or the like, as shown in Fig. 5B (step S208). The control unit 101 then determines whether the position of the moved image sheet 300b coincides with any of the classification item indicators 301a to 301c (step S209). If it coincides with none of them, processing returns to step S202. If it coincides with one of them, the control unit 101 generates (or updates) the my-map image corresponding to the classification item at the position of the image sheet 300b (step S210).
Fig. 6A and Fig. 6B show examples of my-map images. Fig. 6A shows an example for the classification item "flower", and Fig. 6B an example for the classification item "pet". As shown in Figs. 6A and 6B, a my-map image in the present embodiment is generated by pasting image sheets 301b onto a background image 401 corresponding to the classification item (for example, a flower garden in the case of "flower" and a pet house or the like in the case of "pet"). In Figs. 6A and 6B the frames indicating the borders of the image sheets 301b are not shown; when actually pasting, it is desirable to paste each image sheet 301b with its frame removed. A my-map image for the classification item "person" is not illustrated, but it suffices, for example, to fit thumbnail images of persons' faces into an image depicting picture frames, or to paste thumbnail images of persons' faces onto images of persons' bodies.
A my-map image of this kind develops into a form matching the user's taste as the user freely pastes parts of photographed or reproduced images onto it. The user can therefore classify images enjoyably.
A return button 402 is also superimposed on the my-map image. The return button 402 is used when the my-map image is reproduced, as described later. An avatar 403, an image representing the user, may also be composited into the my-map image; the selection of the my-map image to be reproduced and the like may then be performed by operating this avatar 403 during the my-map reproduction described later.
Fig. 7 is a flowchart showing an example of the my-map image generation processing. In Fig. 7, the control unit 101 sets the priority of the most recently extracted image sheet higher than the priorities of the other image sheets (step S501). The control unit 101 then determines whether a priority display area remains (step S502). The priority display area is the area of the my-map image in which image sheets are preferentially arranged.
If a priority display area remains, the control unit 101 pastes the image sheet into the priority display area to generate the my-map image, as shown in Fig. 8 (step S505). At this time, image sheets of higher priority (i.e., newer ones) are pasted closer to the center of the priority display area, and image sheets of lower priority (i.e., older ones) closer to its edges.
If no priority display area remains, the control unit 101 reduces the image sheets of lower priority by thinning or the like (step S503). The control unit 101 then pastes the image sheets according to their priorities to generate the my-map image (step S504). Image sheets can thus still be pasted even when they become numerous.
After pasting the image sheets, the control unit 101 records the position of each image sheet in the classification result file described later (step S506). The control unit 101 then composites the image of the avatar into the generated my-map image (step S507). As the avatar image, the user may use an image obtained by photographing himself or herself, or may select a desired avatar image from a plurality of options. After compositing the avatar image, the control unit 101 ends the processing of Fig. 7.
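The patent fixes only the ordering rule: newer sheets toward the center of the priority display area, older ones toward the edges, with old sheets shrunk when space runs out. The spiral placement below is one hypothetical way to realize it, assuming each sheet is a dict with "id" and "priority" keys.

```python
import math

def layout_sheets(sheets, area_w, area_h, shrink=0.6):
    """Place image sheets in the priority display area: the newest
    (highest-priority) sheet lands nearest the center and older ones are
    pushed outward along a spiral; sheets that run out of room are scaled
    down by `shrink`. Returns (sheet_id, x, y, scale) tuples."""
    placed = []
    cx, cy = area_w / 2.0, area_h / 2.0
    for rank, sheet in enumerate(sorted(sheets, key=lambda s: -s["priority"])):
        angle = rank * 2.4                    # golden-angle step between sheets
        radius = 15.0 * math.sqrt(rank)       # lower priority -> farther out
        x = cx + radius * math.cos(angle)
        y = cy + radius * math.sin(angle)
        scale = 1.0
        if not (0 <= x < area_w and 0 <= y < area_h):
            scale = shrink                    # area exhausted: shrink old sheets
            x = min(max(x, 0.0), area_w - 1.0)
            y = min(max(y, 0.0), area_h - 1.0)
        placed.append((sheet["id"], x, y, scale))
    return placed
```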
In this way, the avatar representing the photographer is successively decorated with image sheets embedded in the picture, so the photographer can experience the feeling of a personal world that gradually develops, which becomes a motivation for further photographing. The user thus actively photographs subjects of the same theme, such as flowers, and so easily comes to understand the difference between good and poor photographs, with the secondary effect that his or her photographing technique improves.
Returning now to the explanation of Fig. 4: after the my-map image has been generated in step S210, the control unit 101 records classification information indicating the correspondence between the image 300 and the image sheet 301 in the classification result file (step S211). The control unit 101 also records information other than this correspondence, such as the number of faces in the image, in the classification result file (step S212). Processing then returns to step S202.
Fig. 9A shows an example of the classification result file. The classification result file shown in Fig. 9A has the items "number of faces", "position, feature, and size of faces", "image sheet", and "photographing date and time".
Under "number of faces", the number of face regions detected in the image data by the face detection of step S103 is recorded. Under "position, feature, and size of faces", it is recorded at which position in the image a face of what kind and of what size is contained. In the present embodiment, as shown in Fig. 9B, the screen is divided into nine regions A1 to A9, and the position of a face is expressed by one of A1 to A9. The size of a face is expressed in the three grades D1 to D3 according to its size relative to the regions A1 to A9: D1 when the face region is smaller than a region A1 to A9, D2 when the face region is inscribed in (about the same size as) a region A1 to A9, and D3 when the face region is larger than a region A1 to A9. For the feature of a face, an identifier representing the facial feature is recorded for each set of image data. For example, "image 1" indicates that a face with feature P-A (representing, for example, a family member) exists in region A4 with size D2, and that a face with feature P-A exists in region A6 with size D3.
Under "image sheet", the correspondence between the image sheets classified as described above and the images is recorded. This correspondence information is recorded each time a my-map image is generated. For example, "image 1" indicates correspondence with flower A or person A in a my-map image.
Under "photographing date and time", the photographing date and time measured by the clock unit 108 is recorded. A GPS function may also be provided in the camera 100 so that the photographing position is recorded as well.
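The on-disk format of the classification result file is not disclosed; the following data-structure sketch mirrors the fields of Fig. 9A, with the "image 1" example from the text. All field names, and the example date value, are assumptions.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class FaceEntry:
    region: str       # position code "A1".."A9" (3x3 grid over the frame)
    size: str         # size grade "D1" (smaller than cell) .. "D3" (larger)
    feature_id: str   # face-feature identifier, e.g. "P-A"

@dataclass
class ClassificationRecord:
    image_id: str
    faces: List[FaceEntry] = field(default_factory=list)
    image_sheets: List[str] = field(default_factory=list)  # e.g. ["flower A"]
    shot_datetime: str = ""                                 # from clock unit 108

# Mirrors "image 1" in Fig. 9A: face P-A at A4 with size D2,
# another P-A face at A6 with size D3.
record = ClassificationRecord(
    image_id="image 1",
    faces=[FaceEntry("A4", "D2", "P-A"), FaceEntry("A6", "D3", "P-A")],
    image_sheets=["flower A"],
    shot_datetime="2009-04-20T10:15:00",
)
```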
Next, the my-map image reproduction processing of step S118 is described with reference to Fig. 10. When my-map reproduction has been instructed, the control unit 101 displays a my-map selection screen (step S301). Fig. 11A shows an example of the selection screen. As shown in Fig. 11A, the selection screen shows item indicators 501a to 501c representing the my-map categories corresponding to the classification items described above. In the example of Fig. 11A, a flower garden 501a corresponds to the classification item "flower", a house 501b to the classification item "person", and a pet house 501c to the classification item "pet". An end button 502 is also displayed on the selection screen.
After the selection screen of Fig. 11A has been displayed, the control unit 101 determines from the output of the touch panel 106 whether the user has selected any of the item indicators 501a to 501c (step S302). If none of them has been selected, the control unit 101 determines from the output of the touch panel 106 whether the user has selected the end button 502 (step S303). If the end button 502 has not been selected, processing returns to step S301. If it has been selected, the control unit 101 ends the processing of Fig. 10, and processing returns to step S116 of Fig. 3.
If one of the item indicators 501a to 501c has been selected, the control unit 101 displays the my-map image corresponding to the selected item on the display unit 105, as shown in Fig. 11B (step S304). Fig. 11B shows the my-map image displayed when the flower garden item 501a has been selected; this my-map image was shown in Fig. 6A.
After the my-map image has been displayed, the control unit 101 determines from the output of the touch panel 106 whether the user has selected the return button 402 (step S305). If the return button 402 has been selected, processing returns to step S301, and the control unit 101 again displays the selection screen of Fig. 11A. If the return button 402 has not been selected, the control unit 101 determines from the output of the touch panel 106 whether the user has selected any image sheet in the my-map image (step S306). If no image sheet has been selected, processing returns to step S305. If an image sheet has been selected, the control unit 101 refers to the classification result file and displays the image corresponding to the image sheet selected by the user on the display unit 105 (step S307). Fig. 11C shows a display example of the corresponding image when an image sheet has been selected: when the image sheet 300b has been selected as shown in Fig. 11B, the image 300 corresponding to this image sheet 300b is displayed on the display unit 105. At the time of this display, an indicator 601 showing that the displayed image is recorded in an electronic album may also be displayed.
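Step S307 amounts to a reverse lookup from the tapped image sheet to its source image through the classification result file. A sketch using the hypothetical ClassificationRecord above:

```python
def find_source_images(selected_sheet_id, records):
    """Return the ids of the original images from which the tapped image
    sheet was cut, by scanning the classification records (step S307)."""
    return [r.image_id for r in records if selected_sheet_id in r.image_sheets]

# find_source_images("flower A", [record]) -> ["image 1"]
```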
As described above, according to the present embodiment, a part of an image can be extracted as an image sheet, a my-map image can be generated using the image sheet, and a desired image can be retrieved from the my-map image. The user can therefore classify images enjoyably.
Furthermore, in the present embodiment, the size of the extracted image sheet changes according to the pressing force on the touch panel 106, so the user can generate my-map images using image sheets of any size.
In the embodiment above, the generation of my-map images corresponding to the three categories "flower", "person", and "pet" was illustrated, but other classifications may also be used, and classification items may be created by the user.
Also, in the embodiment above, only the small region containing the user's touch position is extracted as an image sheet. In this case, the extracted image sheet contains not only the main object, such as a flower, but also the background image. When an image sheet is pasted onto a my-map image with the background image included, the generated my-map image may become hard to view. To address this, an image sheet may instead be extracted as shown in Fig. 12, where the processing of Fig. 12 replaces the processing of step S206 in Fig. 4.
In the determination of step S205 of Fig. 4, when the user's finger or the like touches the image displayed on the display unit 105, the control unit 101 sets a small region (a rectangle, a circle, or the like) 701 of a predetermined size containing the touch position, as shown in Fig. 13A (step S401). The control unit 101 then changes the size of the small region 701 according to the pressing force of the user's finger 200 (step S402).
Next, the control unit 101 acquires the image data within the small region 701 and detects the main color (the most frequent color) in the acquired image data (step S403). The control unit 101 then detects, within the small region 701, the portions whose color is approximate to the main color detected in step S403 (step S404). That is, the main color of the small region 701 is regarded as belonging to the most important object in the small region 701. The control unit 101 then treats the portions of the small region 701 not enclosed by the region of the main color and its approximate colors as the background region, and makes this background region transparent (step S405). For example, the main object in the small region 701 extracted as shown in Fig. 13B is a flower, and the main color in this case is the color of the petals. Thus, in the processing of step S405, the portions outside the region 702 having the petal color, for example the branch 703, are made transparent as shown in Fig. 13C. The image sheet is thereby extracted. After the image sheet has been extracted, the processing from step S207 of Fig. 4 onward is executed.
By extracting image sheets in this way, the background portion of an image sheet is not pasted into the my-map image.
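A simplified sketch of steps S403-S405 follows: it keeps pixels close to the dominant color and makes the rest transparent. It omits the connected-region step implied by "enclosed by the main color and its approximate colors", and the quantization step and tolerance value are assumptions.

```python
import numpy as np

def mask_background(sheet_rgb, tolerance=40.0):
    """Find the dominant color inside the small region, then make every
    pixel whose color is far from it transparent, leaving the main subject
    (e.g. the petals). Returns an RGBA image."""
    pixels = sheet_rgb.reshape(-1, 3)
    quant = (pixels // 32) * 32                     # coarse color quantization
    colors, counts = np.unique(quant, axis=0, return_counts=True)
    main = colors[counts.argmax()].astype(float)    # most frequent color
    dist = np.linalg.norm(sheet_rgb.astype(float) - main, axis=-1)
    alpha = np.where(dist <= tolerance, 255, 0).astype(np.uint8)
    return np.dstack([sheet_rgb, alpha])            # transparent background
```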
The present invention has been described above on the basis of an embodiment, but the invention is not limited to the embodiment above, and various modifications and applications are of course possible within the scope of the gist of the present invention. In the embodiment above, a digital camera was shown as an example of the image collating device, but the present embodiment is applicable to various devices having a touch panel.
Also, in the embodiment above, the item indicators 301a to 301c are displayed at all times while the image 300 is reproduced. Alternatively, after the image 300 has been reproduced, the item indicators may be displayed at the moment when the touch time of the user's finger or the like on the image 300 exceeds a predetermined time.
Also, in the embodiment above, the extraction of image sheets and the generation of my-map images from still images have mainly been described, but the method of the embodiment above is also applicable to moving images. For example, if reproduction of a moving image is temporarily stopped at the moment when the user's finger or the like touches the image being reproduced, the same image sheet extraction and my-map image generation as for still images can then be performed. When the corresponding picture in a my-map image generated in this way is selected, the moving image is reproduced from the frame of the moving image corresponding to that image sheet.
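For the moving-image case, a thin hypothetical wrapper over the earlier extraction sketch could pause at the touched frame and store the frame index for later resumption of playback:

```python
def sheet_from_movie(frames, frame_index, touch_rc, pressure):
    """Pause at the touched frame, extract the image sheet from that frame,
    and remember frame_index so that selecting this sheet later resumes
    playback there (glue code; reuses extract_image_sheet from above)."""
    sheet = extract_image_sheet(frames[frame_index], touch_rc, pressure)
    return {"sheet": sheet, "resume_frame": frame_index}
```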
The embodiment above includes inventions at various stages, and various inventions can be extracted by appropriately combining the disclosed technical features. For example, even if several technical features are deleted from all the technical features shown in the embodiment, as long as the problem described above can still be solved and the effects described above obtained, the structure from which those technical features have been deleted can be extracted as an invention.

Claims (7)

1. An image collating device, characterized in that the image collating device comprises:
a display unit that displays a first image recorded in a recording unit;
a touch panel that detects whether a touch operation has been performed on the first image displayed on the display unit, and the touch position when such a touch operation has been performed;
an image sheet extraction unit that, when the touch operation has been performed, extracts from the first image an image of a small region containing the touch position as an image sheet; and
an other-image generation unit that pastes the extracted image sheet onto another image to generate a second image.
2. The image collating device according to claim 1, characterized in that the second image is formed by pasting a plurality of image sheets.
3. The image collating device according to claim 2, characterized in that the image collating device further comprises:
an image sheet classification unit that classifies the image sheets; and
a classification information recording unit that records the classification results as classification information,
wherein the other-image generation unit refers to the classification information and generates the second image using only image sheets classified into the same category.
4. The image collating device according to claim 3, characterized in that
the touch panel further detects a slide operation performed from the touch position, and
the image sheet classification unit classifies the image sheet according to the slide operation detected by the touch panel.
5. The image collating device according to claim 1, characterized in that the image collating device further comprises an image retrieval unit that, when an image sheet pasted on the second image has been selected, displays the first image corresponding to the selected image sheet on the display unit.
6. The image collating device according to claim 1, characterized in that
the touch panel further detects the pressing force at the touch position, and
the image sheet extraction unit determines the small region according to the pressing force and extracts the image sheet corresponding to the determined small region.
7. The image collating device according to claim 1, characterized in that the image sheet extraction unit detects the main color in the small region, makes the image of the region not containing the detected main color transparent, and then extracts the image sheet.
CN2010101658551A 2009-04-20 2010-04-20 Image collating device Expired - Fee Related CN101866349B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210296613.5A CN102982062B (en) 2009-04-20 2010-04-20 Image collating device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009-102208 2009-04-20
JP2009102208A JP2010252266A (en) 2009-04-20 2009-04-20 Image arrangement apparatus

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN201210296613.5A Division CN102982062B (en) 2009-04-20 2010-04-20 Image collating device

Publications (2)

Publication Number Publication Date
CN101866349A true CN101866349A (en) 2010-10-20
CN101866349B CN101866349B (en) 2012-10-03

Family

ID=42958077

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201210296613.5A Expired - Fee Related CN102982062B (en) 2009-04-20 2010-04-20 Image collating device
CN2010101658551A Expired - Fee Related CN101866349B (en) 2009-04-20 2010-04-20 Image collating device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201210296613.5A Expired - Fee Related CN102982062B (en) 2009-04-20 2010-04-20 Image collating device

Country Status (2)

Country Link
JP (1) JP2010252266A (en)
CN (2) CN102982062B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104021215A (en) * 2014-06-20 2014-09-03 深圳市中兴移动通信有限公司 Method and device for sorting images
CN104965654A (en) * 2015-06-15 2015-10-07 广东小天才科技有限公司 Head portrait adjusting method and system

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5979987B2 (en) * 2012-05-30 2016-08-31 キヤノン株式会社 Information processing apparatus, control method thereof, and program
JP6091220B2 (en) * 2013-01-18 2017-03-08 オリンパス株式会社 Imaging apparatus, image processing apparatus, and image processing program
CN110537370B (en) * 2017-02-08 2023-08-25 弗劳恩霍夫应用研究促进协会 Apparatus and method for picture encoding, and apparatus and method for picture decoding
JP2018125887A (en) * 2018-04-12 2018-08-09 株式会社ニコン Electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1604120A (en) * 2003-08-20 2005-04-06 奥西-技术有限公司 Metadata extraction from designated document areas
CN1689068A (en) * 2003-03-04 2005-10-26 富士通株式会社 Image display method, image display program, and information device
CN1728775A (en) * 2004-07-29 2006-02-01 奥林巴斯株式会社 Camera, reproducing apparatus, and album registration method
US20060221779A1 (en) * 2005-03-15 2006-10-05 Fuji Photo Film Co., Ltd. Album generating apparatus, album generating method and program

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3908171B2 (en) * 2003-01-16 2007-04-25 富士フイルム株式会社 Image storage method, apparatus, and program
CN101055592A (en) * 2007-05-20 2007-10-17 宁尚国 Image information generation method and device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1689068A (en) * 2003-03-04 2005-10-26 富士通株式会社 Image display method, image display program, and information device
CN1604120A (en) * 2003-08-20 2005-04-06 奥西-技术有限公司 Metadata extraction from designated document areas
CN1728775A (en) * 2004-07-29 2006-02-01 奥林巴斯株式会社 Camera, reproducing apparatus, and album registration method
US20060221779A1 (en) * 2005-03-15 2006-10-05 Fuji Photo Film Co., Ltd. Album generating apparatus, album generating method and program

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104021215A (en) * 2014-06-20 2014-09-03 深圳市中兴移动通信有限公司 Method and device for sorting images
CN104021215B (en) * 2014-06-20 2020-05-01 努比亚技术有限公司 Picture sorting method and device
CN104965654A (en) * 2015-06-15 2015-10-07 广东小天才科技有限公司 Head portrait adjusting method and system
CN104965654B (en) * 2015-06-15 2019-03-15 广东小天才科技有限公司 A kind of method and system of head portrait adjustment

Also Published As

Publication number Publication date
JP2010252266A (en) 2010-11-04
CN102982062B (en) 2017-03-01
CN101866349B (en) 2012-10-03
CN102982062A (en) 2013-03-20

Similar Documents

Publication Publication Date Title
CN101866349B (en) Image collating device
US6342900B1 (en) Information processing apparatus
CN101790035B (en) Camera
JP5261724B2 (en) Representative image selection by hierarchical clustering
US20080024632A1 (en) Photographing apparatus and photographing method
CN104320571B (en) Electronic equipment and method for electronic equipment
CN101600052B (en) Image processing apparatus and image processing method
KR101475939B1 (en) Method of controlling image processing apparatus, image processing apparatus and image file
US20100287502A1 (en) Image search device and image search method
CN101676913A (en) Image searching device, digital camera and image searching method
CN108200351A (en) Image pickup method, terminal and computer-readable medium
JP2011055190A (en) Image display apparatus and image display method
CN112232260A (en) Subtitle region identification method, device, equipment and storage medium
US8355050B2 (en) Image replay system, digital camera, and image replay device
US8941767B2 (en) Mobile device and method for controlling the same
JP2007266902A (en) Camera
US8866953B2 (en) Mobile device and method for controlling the same
CN103581512B (en) Shooting device and method
JP4920534B2 (en) Slide show generation system and camera having this system
CN102547202A (en) Display control apparatus and method
CN105052130B (en) Mobile device and its control method
JP2008131377A (en) Device, method and program for displaying
JP5302740B2 (en) Image display device
JP5253590B2 (en) Image processing apparatus and camera
JP2010187119A (en) Image capturing apparatus, and program for the same

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C41 Transfer of patent application or patent right or utility model
TR01 Transfer of patent right

Effective date of registration: 20151116

Address after: Tokyo, Japan

Patentee after: Olympus Corporation

Address before: Tokyo, Japan

Patentee before: Olympus Imaging Corp.

CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20121003

Termination date: 20190420