WO2005096180A1 - Image search method, apparatus, and recording medium storing a program - Google Patents
- Publication number
- WO2005096180A1 PCT/JP2005/005649 JP2005005649W
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- search target
- search
- images
- target images
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/583—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
Definitions
- the present invention relates to a technique for searching for a desired image from a large number of images stored in a storage device such as a hard disk drive (HDD).
- Various image search methods have been conventionally proposed in order to efficiently search for an image desired by a user from a large number of still images or moving images stored in a large-capacity storage device such as an HDD.
- feature amounts such as time information and color information are extracted from each of a large number of images to be searched, and a similarity measure between the images is calculated based on these feature amounts.
- a database is constructed by associating images with each other based on that similarity measure.
- Patent Document 1 Japanese Patent Application Laid-Open No. 9-125930
- a large number of pieces of search target information are arranged in a two-dimensional or three-dimensional hierarchical space, and a method of displaying the search target information three-dimensionally is adopted. Specifically, for each piece of search target information, features such as the color, shape, size, type, content, and keywords of the search target image are extracted.
- a feature amount vector is generated from these features, and a similarity measure between the pieces of search target information is calculated based on the feature amount vector.
- a large number of pieces of search target information are arranged in a search space such that pieces whose similarity measures are closer are placed nearer to each other, thereby forming the first search target layer.
- some of the search target information is extracted from this first search target layer to form a second, higher search target layer, and some search target information is extracted from the second search target layer to form a third, higher search target layer.
- the first to n-th (n is an integer equal to or greater than 2) search target layers are constructed by recursively executing this operation of extracting search target information.
- the first to n-th search target layers are displayed in three dimensions.
- Patent Document 2 Japanese Patent Application Laid-Open No. 11-175535
- a main object of the present invention is to provide an image search method, an image search device, and a recording medium storing an image search program that enable a user to efficiently and easily search for a desired image from a large number of images stored in a storage device such as an HDD.
- a first invention is an image search method comprising: (a) extracting, from each of a plurality of search target images, at least one component common to the plurality of search target images; (b) obtaining a feature amount characterizing each of the search target images based on the components; (c) calculating a similarity measure between the search target images using the feature amounts, and associating, via links, the images whose similarity measure is within a predetermined range; and (d) searching for images while calculating the display link distance between two search target images associated via N (N is an integer of 1 or more) links as N.
- a second invention is an image search method comprising: (a) extracting, from each of a plurality of search target images, at least one component common to the plurality of search target images; (b) obtaining a feature amount characterizing each of the search target images based on the components; (c) calculating a similarity measure between the search target images using the feature amounts, and associating, via links, the images whose similarity measure is within a predetermined range; (d) constructing a lower hierarchy from the search target image group associated in step (c); (e) extracting, from the lower hierarchy, M (M is an integer of 2 or more) image groups associated with each other via links, and assigning the extracted image groups to a hierarchy higher than the lower hierarchy as its search target image group; (f) in the upper hierarchy, associating via links the images whose similarity measure is within a predetermined range; and (g) searching for images while calculating the display link distance between two search target images associated via N (N is an integer of 1 or more) links as N, wherein multiple hierarchies are constructed by recursively executing steps (e) and (f).
- a third invention is an image search device comprising: a storage device for storing a plurality of search target images; a feature amount obtaining unit that extracts, from each of the plurality of search target images, at least one component common to the plurality of search target images and obtains a feature amount characterizing each of the search target images based on the components; a network construction unit that calculates a similarity measure between the search target images using the feature amounts and associates, via links, the images whose similarity measure is within a predetermined range; and an image search unit that searches for images while calculating the display link distance between two search target images associated via N (N is an integer of 1 or more) links as N.
- a fourth invention is an image search device comprising: a storage device for storing a plurality of search target images; a feature amount obtaining unit that extracts, from each of the plurality of search target images, at least one component common to the plurality of search target images and obtains a feature amount characterizing each of the search target images based on the components; a network construction unit that calculates a similarity measure between the search target images using the feature amounts, associates via links the images whose similarity measure is within a predetermined range, and builds a lower hierarchy from the associated search target image group; and an image search unit that searches for images while calculating the display link distance between two search target images associated via N (N is an integer of 1 or more) links as N.
- the network construction unit extracts, from the lower hierarchy, M (M is an integer equal to or greater than 2) image groups associated with each other via links, and assigns the extracted image groups to a hierarchy higher than the lower hierarchy as its search target image group; in the upper hierarchy, it associates via links the images whose similarity measure is within a predetermined range, and multiple hierarchies are constructed by executing this recursively.
- a fifth invention is a recording medium on which an image search program is recorded, the program causing a computer to execute: a storage process of storing a plurality of search target images in a storage device; a feature amount obtaining process of extracting, from each of the plurality of search target images, at least one component common to the plurality of search target images and obtaining a feature amount characterizing each of the search target images based on the components; a network construction process of calculating a similarity measure between the search target images using the feature amounts and associating, via links, the images whose similarity measure is within a predetermined range; and an image search process of searching for images while calculating the display link distance between two search target images associated via N (N is an integer of 1 or more) links as N.
- a sixth invention is a recording medium on which an image search program is recorded, the program causing a computer to execute: a storage process of storing a plurality of search target images in a storage device; a feature amount obtaining process of extracting, from each of the plurality of search target images, at least one common component and obtaining a feature amount characterizing each of the search target images based on the components; a network construction process of calculating a similarity measure between the search target images, associating via links the images whose similarity measure is within a predetermined range, and constructing a lower hierarchy from the associated search target image group; an image search process of searching for images while calculating the display link distance between two search target images associated via N (N is an integer of 1 or more) links as N; and an upper hierarchy construction process of extracting, from the lower hierarchy, M (M is an integer of 2 or more) image groups associated with each other via links, assigning the extracted image groups to a hierarchy higher than the lower hierarchy as its search target image group, and, in the upper hierarchy, associating via links the images whose similarity measure is within a predetermined range.
- the feature is that multiple hierarchies are constructed by causing the computer to recursively execute the upper hierarchy construction process.
- FIG. 1 is a functional block diagram schematically illustrating a configuration of an image search device according to an embodiment of the present invention.
- FIG. 2 is a diagram schematically showing a still image divided into four parts.
- FIG. 3 is a diagram schematically showing a still image divided into five parts.
- FIG. 4 is a diagram schematically showing a series of video shots
- FIG. 5 is a diagram showing the correspondence between the search target image and the feature amount.
- FIG. 6 is a diagram schematically showing a database topology (connection form).
- FIG. 7 is a diagram schematically showing the data arrangement of the database
- Fig. 8 is a flowchart showing the procedure of the network-type database construction process.
- Fig. 9(a) is a diagram showing the data arrangement of the network before registering a new image, and Fig. 9(b) is a diagram showing the data arrangement of the network after registration.
- FIG. 10 is a flowchart showing a procedure of a search process using a database.
- FIG. 11 is a flowchart showing a procedure of a list display process.
- FIG. 12 is a diagram schematically showing an example of the display screen
- FIG. 13 is a diagram schematically showing an example of the display screen
- FIG. 14 is a diagram schematically showing an example of a database topology.
- FIG. 15 is a diagram schematically showing an example of a display screen.
- FIG. 16 is a diagram schematically showing an example of a display screen
- FIG. 17 is a diagram schematically showing an example of the display screen
- FIG. 18 is a diagram schematically showing an example of a display screen
- FIG. 19 is a diagram schematically showing an example of the display screen
- FIG. 20 is a flowchart schematically showing the procedure of the hierarchical processing.
- FIG. 21 is a diagram illustrating an example of a topology for describing one layering procedure
- FIG. 22 is a diagram illustrating an example of a topology for describing one layering procedure
- FIG. 23 is a diagram schematically showing a hierarchical network type database.
- FIG. 24 is a flowchart showing the procedure of an image search process using a hierarchical network type database.
- FIG. 25 is a flowchart showing the procedure of the process of moving between layers.
- FIG. 26 is a diagram for explaining one procedure of the inter-layer movement processing.
- FIG. 27 is a diagram for explaining one procedure of the inter-layer movement processing.
- FIG. 1 is a functional block diagram schematically showing a configuration of an image search device 1 according to an embodiment of the present invention.
- the image search device 1 includes a signal processing unit 10, a feature amount obtaining unit 11, a network construction unit 12, a main controller (image search unit) 13, an image synthesizing unit 14, an image database 19, and a network database 20.
- These functional blocks 10 to 14, 19, and 20 are mutually connected via a bus 21 that transmits control signals and data signals.
- the main controller 13 is connected via a user interface 15 to an operation unit 16 to which user's instructions are input, and the image synthesizing unit 14 is connected to a display unit 18 via an output interface 17.
- the display unit 18 is a display device having a resolution capable of displaying a still image or a moving image.
- the operation unit 16 gives the user's input instructions to the main controller 13 via the user interface 15; specifically, it includes a pointing device such as a mouse for detecting a coordinate position on the screen of the display unit 18, and a keyboard.
- alternatively, a touch screen that detects a position touched by a user's finger or the like on the screen of the display unit 18 and gives an instruction corresponding to that position to the main controller 13, or a voice recognition device that recognizes speech uttered by the user and gives the result to the main controller 13, may be employed.
- the main controller 13 has a function of controlling the operation of the functional blocks 10 to 14, 19, and 20.
- the main controller 13 has a hierarchy selection unit 13A, an image selection unit 13B, and a display control unit 13C that execute various search processes.
- the main controller 13 may be composed of an integrated circuit having a microprocessor, a ROM for storing control programs, a RAM, an internal bus, an input/output interface, and the like.
- the hierarchy selection unit 13A, the image selection unit 13B, and the display control unit 13C may be configured by a program executed by a microprocessor or a series of instructions, or may be configured by hardware.
- in the present embodiment, the feature amount acquisition unit 11 and the network construction unit 12 are each configured by independent hardware, but they may instead be configured by a program or a series of instructions executed by the microprocessor of the main controller 13.
- an image search program that causes the microprocessor to execute the search processes of the feature amount acquisition unit 11, the network construction unit 12, and the main controller 13 may be recorded on a recording medium such as an HDD, a non-volatile memory, an optical disk, or a magnetic tape, and used.
- the signal processing unit 10 has a function of receiving an input image signal from the outside and transferring it to the image database 19 via the bus 21 at a predetermined timing. When an analog signal is input, the signal processing unit 10 performs A/D conversion on the input image signal and then transfers the converted signal to the image database 19.
- still image coding methods such as JPEG (Joint Photographic Experts Group), GIF (Graphic Interchange Format), and bitmap, and moving image coding methods such as Motion-JPEG, AVI (Audio Video Interleaving), and MPEG (Moving Picture Experts Group), may be used.
- sources of the input image signal include, for example, movie cameras, digital cameras, television tuners, DVD (Digital Versatile Disk) players, compact disc players, mini disc players, scanners, and wide-area networks such as the Internet.
- the image database 19 is constructed in a large-capacity storage device such as an HDD, and records and manages the still images and moving images (hereinafter referred to as search target images) transferred via the bus 21 according to an existing file system.
- the feature amount acquiring unit 11 and the network constructing unit 12 construct a network-type database by associating the group of search target images recorded in the image database 19 in the form of a network mesh, and record it in the network database 20.
- the feature amount acquisition unit 11 is a functional block that performs a process of acquiring the feature amounts of a large number of search target images (feature amount acquisition process). More specifically, the feature amount acquisition unit 11 extracts, from the many search target images recorded in the image database 19, components common to these search target images, for example, a set of color components of the pixels constituting each image, or metadata. As a set of color components, for example, a set of R (red), G (green), and B (blue) color components, or a set of Y (luminance), Cb (color difference), and Cr (color difference) components, may be used.
- the metadata includes information such as the attribute, semantic content, acquisition source or storage location added to the search target image.
- it can extract information such as the title, recording date and time (absolute time / relative time), acquisition location (latitude / longitude / altitude), genre, performers, keywords, comments, price (yen / dollars / euros), and image size as metadata.
- the feature amount obtaining unit 11 calculates a set of a plurality of feature values that characterize each of the search target images, that is, a feature amount, based on the components extracted from the search target image.
- the network construction unit 12 calculates a similarity measure between the search target images using the feature amounts calculated by the feature amount acquisition unit 11, and constructs a network-type database by associating, via links, the images whose similarity measure is within a predetermined range.
- first, the feature amount acquiring unit 11 reads a still image from the image database 19 and divides the still image into M (M is an integer of 2 or more) blocks. For example, a still image 30 may be divided into four blocks B1, B2, B3, and B4 as shown in FIG. 2, or into five blocks B1, B2, B3, B4, and B5 as shown in FIG. 3. Next, the average value of the R, G, and B components of each block, that is, the feature value, is calculated.
- specifically, let the R, G, and B components of the i-th (i is an integer of 1 or more) pixel of the m-th block of the k-th image be denoted r_i(k, m), g_i(k, m), and b_i(k, m), let their average values be denoted <r(k, m)>, <g(k, m)>, and <b(k, m)>, and let the total number of pixels contained in the block be N. The average values are given by the following equation (1):
- <r(k, m)> = (1/N) Σ_{i=1}^{N} r_i(k, m), <g(k, m)> = (1/N) Σ_{i=1}^{N} g_i(k, m), <b(k, m)> = (1/N) Σ_{i=1}^{N} b_i(k, m) … (1)
- Equation (1) gives the arithmetic average of each of the R, G, and B components.
- instead of the arithmetic average, the geometric average, harmonic average, or weighted average of each of the R, G, and B components may be calculated. The arithmetic mean of two numbers a and b is given by (a + b)/2, and the geometric mean of two positive numbers a and b by (ab)^{1/2}.
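- the block-average computation of equation (1) can be sketched as follows; a minimal Python illustration assuming a NumPy H×W×3 RGB array and a vertical-strip partition (the function name and the 4-block default are illustrative, not from the patent):

```python
import numpy as np

def block_average_features(image, blocks=4):
    """Divide an RGB image into vertical-strip blocks and return the
    arithmetic mean of the R, G, and B components of each block, as in
    equation (1).  image: H x W x 3 array.  The strip partition is an
    assumption; the patent also allows other splits (e.g. FIG. 3)."""
    h, w, _ = image.shape
    feats = []
    for m in range(blocks):
        # m-th block: a vertical strip of the image
        strip = image[:, m * w // blocks : (m + 1) * w // blocks, :]
        # <r(k,m)>, <g(k,m)>, <b(k,m)>: mean over the N pixels of the block
        feats.extend(strip.reshape(-1, 3).mean(axis=0))
    return feats  # the length-3M vector (x(k,1), ..., x(k,3M)) of equation (2)
```

the returned list is exactly the vector quantity used below to compare images.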
- using these feature values, the vector quantity X_k = (x(k, 1), x(k, 2), x(k, 3), …, x(k, 3M−2), x(k, 3M−1), x(k, 3M)) … (2) is formed, where x(k, 3m−2) = <r(k, m)>, x(k, 3m−1) = <g(k, m)>, and x(k, 3m) = <b(k, m)>.
- the vector quantity X_k is defined as one element in a metric space.
- this makes it possible to define the Euclidean distance between two search target images. That is, the Euclidean distance D(p, q) between the p-th (p is an integer of 1 or more) image and the q-th (q is an integer of 1 or more) image is defined by the following equation (3):
- D(p, q) = [ Σ_{j=1}^{3M} (x(p, j) − x(q, j))² ]^{1/2} … (3)
- the feature amount obtaining unit 11 regards the vector quantity X_k as a unique feature amount that characterizes the search target image, and calculates the Euclidean distance D(p, q) as the similarity measure.
- the Euclidean distance becomes smaller as the two search target images become more similar to each other; that is, the similarity measure takes a smaller value for more similar images.
- the reciprocal of the Euclidean distance may be defined as a similarity measure, and the configuration may be changed so that the similarity measure takes a larger value as the two search target images are more similar to each other.
- alternatively, the Manhattan distance D(p, q), defined by the following equation (3A), may be used:
- D(p, q) = Σ_{j=1}^{3M} |x(p, j) − x(q, j)| … (3A)
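- equations (3) and (3A) amount to the following; a short Python sketch with illustrative function names:

```python
import math

def euclidean_distance(xp, xq):
    """Similarity measure D(p, q) of equation (3): smaller means more similar."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(xp, xq)))

def manhattan_distance(xp, xq):
    """Alternative measure of equation (3A); no square root is needed."""
    return sum(abs(a - b) for a, b in zip(xp, xq))
```

either function accepts the length-3M feature vectors X_p and X_q of equation (2).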
- next, the calculation of the similarity measure is described for the case where the search target image is a moving image composed of multiple frames and the components extracted from each frame are the R, G, and B color components.
- assume that the moving image data consists of a series of video shots S_1, S_2, …, S_Ns (Ns is an integer of 2 or more), each of which is composed of a plurality of frames.
- the feature amount acquisition unit 11 can identify each video shot by detecting each scene change Sc.
- the feature amount acquisition unit 11 divides each frame of each video shot S_k (k is an integer from 1 to Ns) into M (M is an integer of 2 or more) blocks B1, B2, …. For example, each frame may be divided into four blocks as shown in FIG. 4.
- next, the feature amount obtaining unit 11 calculates the average value of each of the R, G, and B components of each block, and obtains the feature values by further averaging these averages over the plurality of frames. Specifically, in the k-th video shot S_k, let the R, G, and B components of the i-th pixel of the m-th block of the s-th frame (s is an integer from 1 to N_k, where N_k is the number of frames) be r(i, s; k, m), g(i, s; k, m), and b(i, s; k, m). The feature values <R(k, m)>, <G(k, m)>, and <B(k, m)> of the m-th block, which characterize the k-th video shot S_k, are given by the following equation (4):
- <R(k, m)> = (1/N_k) Σ_{s=1}^{N_k} (1/N) Σ_{i=1}^{N} r(i, s; k, m), <G(k, m)> = (1/N_k) Σ_{s=1}^{N_k} (1/N) Σ_{i=1}^{N} g(i, s; k, m), <B(k, m)> = (1/N_k) Σ_{s=1}^{N_k} (1/N) Σ_{i=1}^{N} b(i, s; k, m) … (4)
- by setting x(k, 3m−2) = <R(k, m)>, x(k, 3m−1) = <G(k, m)>, and x(k, 3m) = <B(k, m)> as in equation (4), the vector quantity X_k given by the above equation (2) can be constructed.
- the vector quantity X_k is treated as an element in a metric space, and, as shown in the above equation (3), the Euclidean distance D(p, q) between two video shots is used as the similarity measure.
- a value that decreases as the Euclidean distance D(p, q) increases, for example its reciprocal, may instead be defined as the similarity measure.
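- the per-shot feature values of equation (4) can be sketched as below; a hypothetical Python helper assuming each frame is a NumPy H×W×3 array and the same vertical-strip blocks used for still images:

```python
import numpy as np

def shot_features(frames, blocks=4):
    """Feature values <R(k,m)>, <G(k,m)>, <B(k,m)> of equation (4):
    block averages computed per frame, then averaged over the N_k frames
    of the shot.  frames: list of H x W x 3 arrays forming one video shot."""
    per_frame = []
    for frame in frames:
        h, w, _ = frame.shape
        row = []
        for m in range(blocks):
            # m-th block of this frame: mean R, G, B over its pixels
            strip = frame[:, m * w // blocks : (m + 1) * w // blocks, :]
            row.extend(strip.reshape(-1, 3).mean(axis=0))
        per_frame.append(row)
    # average each of the 3M components over the frames of the shot
    return np.mean(per_frame, axis=0).tolist()
```

the result is again a length-3M vector, so the distance functions above apply to video shots unchanged.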
- the feature amount obtaining unit 11 also has a function of using the metadata itself, or information included in the metadata, as the feature amount, and calculating a value proportional or inversely proportional to the matching rate of the metadata between the search target images as the similarity measure. Specifically, when the metadata includes numerical information such as the shooting date and time, the shooting location, or the price, the numerical information is treated as the feature amount X_k, and the difference between the feature amount X_p of the p-th image and the feature amount X_q of the q-th image can be calculated as the similarity measure D(p, q).
- when the metadata contains information that is difficult to express numerically, such as a genre or keyword, a numerical value characterizing the genre or keyword with an objective index, such as "90% fun level, 90% excitement level", may be adopted as the feature amount X_k, and the difference between the feature amount X_p of the p-th image and the feature amount X_q of the q-th image calculated as the similarity measure D(p, q).
- when the metadata is a code string that cannot be expressed numerically, such as a title, performer, or comment, the code string itself is used as the feature amount X_k, and the similarity measure D(p, q) is determined by comparing the character string X_p of the p-th image with the character string X_q of the q-th image. For example, the similarity measure D(p, q) can be set to "0" if the two strings X_p and X_q match completely, to "1" if they match partially, and to "2" if they do not match at all.
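- the string-based measure can be sketched as follows; a Python illustration in which the 0/1/2 assignment and the containment test for a "partial match" are assumptions, since the source leaves both underspecified:

```python
def metadata_distance(meta_p, meta_q):
    """Similarity measure D(p, q) from a metadata code string such as a
    title.  Assumed assignment: 0 for a complete match, 1 for a partial
    match, 2 for no match, so that smaller values mean more similar,
    consistent with the distance convention used elsewhere."""
    if meta_p == meta_q:
        return 0
    # partial match (an assumption): one string contains the other
    if meta_p in meta_q or meta_q in meta_p:
        return 1
    return 2
```

any other monotone assignment would work equally well, as only the ordering of the measure matters for link construction.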
- the feature amount obtaining unit 11 calculates the feature amount X_k as described above and stores it in the network database 20 in association with the search target image.
- FIG. 5 is a diagram schematically showing the correspondence between the k-th image to be searched and the feature amount X k .
- each search target image is denoted by an index number k, and the feature amount X_k corresponding to the index number k is stored in the network database 20.
- the network construction unit 12 calculates the similarity measure D (p, q) between the two search target images with reference to a correspondence table as shown in FIG.
- the network construction unit 12 determines whether or not the similarity measure D(p, q) satisfies the relational expression shown in the following expression (5). If it does, the p-th image and the q-th image are determined to be similar to each other, and a network-type database is constructed by associating these search target images with each other and stored in the network database 20.
- D(p, q) ≤ Rth … (5)
- Rth is a threshold value of the similarity measure. It is desirable that the threshold value Rth be set to a value that allows an average of about 5 to 10 images to be associated with each search target image. Further, the display link distances between the associated search target images are all set to the same value. In the present embodiment, the display link distance is set to “1”, but is not limited thereto.
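- the link construction under relation (5) can be sketched as below, a minimal Python illustration assuming the Euclidean measure of equation (3); the function and variable names are illustrative:

```python
def build_network(features, rth):
    """Associate, via links, every pair of images whose similarity measure
    satisfies D(p, q) <= Rth (relation (5)).  features: list of feature
    vectors.  Returns an adjacency list in which every link has the same
    display link distance of 1."""
    def dist(xp, xq):
        # Euclidean measure of equation (3), assumed here
        return sum((a - b) ** 2 for a, b in zip(xp, xq)) ** 0.5
    n = len(features)
    links = {p: [] for p in range(n)}
    for p in range(n):
        for q in range(p + 1, n):
            if dist(features[p], features[q]) <= rth:
                links[p].append(q)
                links[q].append(p)
    return links
```

in practice Rth would be tuned so that each image acquires roughly 5 to 10 links, as the paragraph above recommends.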
- FIG. 6 is a diagram schematically illustrating the topology (connection form) of the network database
- FIG. 7 is a diagram schematically illustrating the data arrangement of the network database. Referring to FIG. 6, the search target images I_1, I_2, … are associated with each other via links C_{1,2}, C_{1,4}, ….
- the links C_{p,q} are connection lines indicating an association between the two search target images I_p and I_q, and the distance (display link distance) of each link is set to "1".
- the search target images I_1, I_2, … may also be regarded as being arranged at both end positions (nodes) of the links C_{1,2}, C_{1,4}, ….
- the display link distance between two search target images associated via N (N is an integer of 1 or more) links is "N".
- in other words, the display link distance between two search target images I_p and I_q can be defined as the number of links on the shortest route among the routes from one search target image I_p to the other search target image I_q.
- the data array of the network-type database has a double array structure of an image array PA and connection arrays CA_1, CA_2, ….
- the image array PA is an array storing pointers "1", "2", "3", … to the connection arrays CA_1, CA_2, …, and each connection array CA_1, CA_2, … is an array of the index numbers (hereinafter referred to as image numbers) of the search target images I_1, I_2, … associated with the corresponding image.
- the image numbers are continuously arranged in ascending order in each array.
- X is a symbol indicating the end of the image array or the connection array.
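- the double array structure and the display link distance can be illustrated as follows, a Python sketch that stands in dictionaries for the pointer arrays and computes the shortest-route link count by breadth-first search (the sample connection arrays are hypothetical, not taken from FIG. 7):

```python
from collections import deque

# Image array PA: pointers (here, dictionary keys) to the connection
# arrays CA_1, CA_2, ...; each connection array lists the image numbers
# linked to that image.  Sample data for illustration only.
connection_arrays = {1: [2, 4], 2: [1, 3], 3: [2], 4: [1, 5], 5: [4]}

def display_link_distance(start, goal, ca):
    """Number of links on the shortest route between two images (BFS)."""
    queue, seen = deque([(start, 0)]), {start}
    while queue:
        node, d = queue.popleft()
        if node == goal:
            return d
        for nxt in ca.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, d + 1))
    return None  # the two images are not connected
```

with the sample arrays above, image 3 is two links away from image 1 (via image 2).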
- a procedure of a network-type database construction process will be described.
- here, assuming that K (K is an integer greater than or equal to 0) search target images are already registered, the process of registering the (K+1)-th new image I_{K+1} in the database will be described.
- first, the main controller 13 records the new image I_{K+1} input from the signal processing unit 10 in the image database 19 (step S1), and registers the new image I_{K+1} in the network database 20 (step S2). At this time, as shown in FIG. 9(b), an area for the connection array CA_{K+1} for the new image I_{K+1} is secured, and a pointer "K+1" to the connection array CA_{K+1} is added to the image array PA.
- next, the main controller 13 causes the feature amount acquiring unit 11 to calculate the feature amount X_{K+1} of the new image I_{K+1} (step S3).
- the feature amount acquiring unit 11 extracts components such as the R, G, and B color components or metadata from the new image I_{K+1}, calculates the feature amount X_{K+1}, and records it in the network database 20.
- the feature amount acquiring unit 11 acquires the feature amount Xj of the j-th image Ij recorded in the image database 19 from the network database 20 (Step S5).
- the feature amount Xj of the j-th image Ij may be newly calculated.
- the network construction unit 12 calculates a similarity measure D(j, K+1) between the j-th image I_j and the new image I_{K+1} using the feature amounts X_j and X_{K+1} (step S6). Further, the network construction unit 12 determines whether or not the similarity measure D(j, K+1) satisfies the relational expression (5) (step S7), and if it is determined that it does not satisfy the relational expression (5), the process proceeds to step S9.
- on the other hand, if the similarity measure D(j, K+1) is determined in step S7 to satisfy the relational expression (5), the j-th image I_j and the new image I_{K+1} are determined to be similar to each other, and the two images I_j and I_{K+1} are associated (step S8). Specifically, as shown in FIG. 9(b), the image number j of the j-th image I_j is added to the connection array CA_{K+1} for the new image I_{K+1}, and the image number K+1 of the new image I_{K+1} is added to the connection array CA_j corresponding to the pointer "j" of the image array PA. The network construction unit 12 then records this data array in the network database 20, after which the processing shifts to step S9.
- in step S9, the main controller 13 determines whether or not the processing has been completed for all the images I_j. If it is determined that the processing has not been completed, the image number j is incremented (step S12), and the processing from step S5 onward is repeated. Meanwhile, when it is determined that the processing has been completed for all the images (step S9), the main controller 13 determines whether or not there is no image associated in step S8 (step S10). If it is determined in step S10 that there is at least one associated image, the above database construction processing ends.
- if it is determined in step S10 that there is no image to associate, the network construction unit 12 finds the image I_j whose similarity measure D(j, K+1) with the new image I_{K+1} is smallest, and associates that image I_j with the new image I_{K+1} (step S11).
- the database construction processing ends.
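- steps S1 to S12 can be sketched as below; a hedged Python illustration in which an adjacency dictionary stands in for the connection arrays and the Euclidean measure of equation (3) is assumed:

```python
def register_image(features, links, x_new, rth):
    """Register a new image I_{K+1} (steps S1-S12): link it to every
    existing image within threshold Rth; if no image qualifies (step S10),
    link it to the single most similar image instead (step S11)."""
    def dist(xp, xq):
        # Euclidean measure of equation (3), assumed here
        return sum((a - b) ** 2 for a, b in zip(xp, xq)) ** 0.5
    k_new = len(features)
    features.append(x_new)          # steps S1-S3: store the image's feature
    links[k_new] = []               # step S2: secure connection array CA_{K+1}
    for j in range(k_new):          # steps S5-S9: scan all existing images
        if dist(features[j], x_new) <= rth:
            links[k_new].append(j)  # step S8: associate the two images
            links[j].append(k_new)
    if k_new > 0 and not links[k_new]:  # steps S10-S11: guarantee one link
        j_best = min(range(k_new), key=lambda j: dist(features[j], x_new))
        links[k_new].append(j_best)
        links[j_best].append(k_new)
    return k_new
```

the final fallback mirrors step S11: every registered image ends up connected to the network, so the display link distance is defined for any pair.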
- FIG. 10 is a flowchart showing the procedure of the image search process
- FIG. 11 is a flowchart showing the procedure of the list display process used in the flowchart of FIG.
- the main controller 13 executes an image list display process (FIG. 11) (step S20).
- in the list display process, the image selection unit 13B sets the display link distance to the initial value Rd (step S30), and then, referring to the network database 20, sets the images whose display link distance from the main image is equal to or less than the initial value Rd as sub-images (step S31).
- the initial value Rd can be specified by the user via the operation unit 16; unless otherwise specified, it is set to a value registered in advance, for example, "5".
- the main image can be arbitrarily selected from a group of images registered in the network database 20. Unless otherwise specified, the image with the image number "1" is selected as the main image.
- the display control unit 13C causes the display unit 18 to display the main image and the sub image selected in step S31 on a single screen in a list format (step S32).
- Specifically, the display control unit 13C reads the main image and the sub-images recorded in the image database 19 and transfers them to the image synthesizing unit 14 via the bus 21.
- the image synthesizing unit 14 synthesizes a thumbnail-size image group obtained by converting the resolution of the transferred main image and sub-image, and outputs the synthesized image to the display unit 18 via the output interface 17.
- The display order of the thumbnail images is the ascending order of the display link distance to the main image, so that sub-images having a higher similarity to the main image are preferentially displayed.
- FIG. 12 is a diagram schematically showing a display screen 40 of the display unit 18.
- On the display screen 40, the main image I1 is displayed together with the sub-images I2 to I25 that are similar to the main image I1. If all the sub-images cannot be displayed on one screen, the user can specify the next-screen selection button 41N by an input operation on the operation unit 16 to list the remaining sub-images on the next screen. Similarly, the user can specify the previous-screen selection button 41B to return the display to the previous screen.
- Alternatively, thumbnail images of the main image and the sub-images may be generated in advance and stored in the image database 19, in which case the image synthesizing unit 14 reads the thumbnail images instead of reading the high-resolution main image and sub-images from the image database 19.
- the user can operate the operation unit 16 to specify a desired target image from the image group displayed on the screen 40. Alternatively, if the target image cannot be found, the user can perform an input operation on the operation unit 16 to specify a sub-image other than the target image as the next main image.
- The image selection unit 13B determines whether or not a target image has been specified by detecting an input instruction from the operation unit 16 (step S33). When the user specifies the target image, the image selection unit 13B determines that the target image has been specified and ends the above processing. On the other hand, when the user specifies a sub-image other than the target image as the next main image, the image selection unit 13B determines that the target image has not been specified (step S33), sets the designated sub-image as the main image (step S34), and then returns the processing to the main routine (FIG. 10).
- In step S21 of the main routine, the image selection unit 13B sets, as sub-images, the images whose display link distance from the main image is equal to or less than the set value Rs (step S21). Thereafter, the display control unit 13C causes the display unit 18 to display the main image and the sub-images in a list format (step S22).
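Selecting sub-images by display link distance amounts to a breadth-first search over the association network. A sketch, under the assumption that the network is stored as an adjacency dict keyed by image number; `rs` plays the role of the set value Rs, and the result is sorted by ascending link distance, matching the thumbnail display order:

```python
from collections import deque

def sub_images(network, main_id, rs):
    """Return the images whose display link distance (shortest-path
    length in the association network) from main_id is <= rs,
    ordered by ascending distance (step S21 / thumbnail order)."""
    dist = {main_id: 0}
    queue = deque([main_id])
    while queue:
        u = queue.popleft()
        if dist[u] == rs:          # no need to look further out
            continue
        for v in network.get(u, ()):
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return sorted((i for i in dist if i != main_id),
                  key=lambda i: (dist[i], i))
```

Because the search stops expanding once distance Rs is reached, the cost depends only on the size of the neighbourhood around the main image, not on the whole database, which is consistent with the low-computation claim made below.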
- the user can appropriately change the set value Rs held by the main controller 13 by performing an input operation on the operation unit 16. For example, in the case of the database shown in Fig.
- FIG. 13 is a diagram illustrating an example of the display screen 40 of the display unit 18. On the display screen 40, the main image I3 is displayed, and the sub-images I1, I2, I5, I6, I7 whose display link distance from the main image I3 is within "1" are listed in thumbnail size.
- the user can perform an input operation on the operation unit 16 to specify a desired target image from the group of images displayed on the screen 40.
- the image selection unit 13B determines whether or not the target image is specified by detecting an input instruction from the operation unit 16 (step S23). When the user specifies the target image, the image selection unit 13B determines that the target image has been specified, and ends the image search process.
- On the other hand, when the user does not specify the target image, the image selection unit 13B determines that the target image has not been specified (step S23), and the processing then shifts to either step S25 or step S26 according to the type of the input instruction (step S24).
- If the input instruction is a "list display instruction", the list display processing of step S25 (FIG. 11) is executed, and then the processing from step S21 onward is repeated.
- If the input instruction designates a sub-image as the next main image, the image selection unit 13B determines that there is a "continuation instruction" (step S24) and sets the designated sub-image as the next main image (step S26). Thereafter, the processing from step S21 onward is repeated.
- For example, when the user enters a continuation instruction designating the sub-image I6, the main image is changed from the image I3 to the image I6 as shown in FIG. 14, and the display screen 40 changes to the one shown in FIG. 15.
- On the display screen 40, the main image I6 is displayed, and the sub-images I3, I5, I10, I11, I12 whose display link distance from the main image I6 is within "1" are listed in thumbnail size. If no sub-image suitable as the next main image is found on the display screen 40, the user can display a list of many thumbnail images as shown in FIG. 12 (step S25) and quickly find a suitable main image there.
- the user can efficiently and easily search for a desired target image. Further, since the image search process mainly uses only the link information of the database, it is possible to perform a high-speed search with a small amount of calculation without performing complicated processing.
- In FIG. 13, the main image I3 has a larger number of horizontal pixels and a smaller number of vertical pixels relative to the display area, so the main image I3 is arranged at the top and the sub-images I1, I2, ... are arranged along the horizontal direction. In FIG. 15, the main image I6 has a smaller number of horizontal pixels and a larger number of vertical pixels than the entire display area, so the main image I6 is arranged on the right side, and the sub-images I3, I5, ... are arranged along the vertical direction on the left of the display area so as not to overlap the main image I6.
- the display control unit 13C can configure an optimal arrangement according to the image sizes of the main image and the sub image.
- In addition to the arrangements shown in FIG. 13 and FIG. 15, the arrangements shown in FIG. 16 to FIG. 19 are also possible.
- “M” indicates a main image
- “S” indicates a sub-image.
- In the above description, the sub-images displayed on the display screen 40 are the group of images whose display link distance from the main image is equal to or less than the set value Rs. However, the images whose display link distance equals the set value Rs, or falls within a predetermined range centered on the set value Rs, may instead be set as sub-images and displayed on the display screen 40.
- For example, when the set value Rs is "3", only the group of images whose display link distance from the main image is "3" may be displayed on the display screen 40, or only the groups of images whose display link distance is "2", "3", or "4" may be displayed on the display screen 40.
- Furthermore, the network construction unit 12 can construct a higher-layer network from the network constructed by the processing procedure shown in FIG. 8 (hereinafter referred to as the 0th-layer network). That is, the network construction unit 12 extracts, from the 0th-layer network, a group of search target images that are indirectly associated with each other through N (N is an integer equal to or greater than 1) intervening search target images, and forms from the extracted search target images an image group belonging to a higher layer. Further, the network construction unit 12 associates, in the higher layer, the search target images that are indirectly associated in the 0th layer, and sets the display link distance between the associated search target images to "1", thereby constructing a first-layer network. By executing the above processing recursively, networks of still higher layers can be constructed.
- FIG. 20 is a flowchart schematically illustrating the procedure of the hierarchical processing.
- The network construction unit 12 reads the network of the 0th layer from the network database 20 (step S40), and sets the layer number i to "1" to construct the network of the first layer (step S41).
- Next, one starting image is selected from the plurality of images belonging to the 0th layer (step S42).
- An arbitrary image can be selected by the user via the operation unit 16 as the starting image, but unless otherwise specified, the image with the smallest image number is selected.
- FIG. 21 is a diagram schematically showing the topology of the 0th-layer network. In FIG. 21, the image I1 is selected as the starting image.
- The network construction unit 12 sets the starting image as a representative image (step S43) and deletes all the images adjacent to the representative image, that is, the images whose display link distance from the representative image is "1" (step S44). For example, as shown in FIG. 21, the images I2, I3, I4 adjacent to the representative image are deleted. Thereafter, the network construction unit 12 determines whether or not all the images have been processed (step S45); when it is determined that all the images have been processed, the process proceeds to step S47, and otherwise the process proceeds to step S46.
- In step S46, an image adjacent to one of the images deleted in step S44 is selected as the next starting image (step S46).
- Here, the image with the smallest image number is selected as the starting image from the plurality of candidate images, and a previous starting image is never selected again. In the example of FIG. 21, the candidate images are the images I5, I6, I7, I8, and among these the image I5, whose image number is the smallest, is selected as the starting image.
- the processing after step S43 is repeatedly executed until it is determined in step S45 that the processing has been completed for all the images.
- As a result, the images I1, I5, ... surrounded by thick frames in FIG. 21 are set as representative images.
- Next, the network construction unit 12 forms the image group of the i-th (higher) layer from the representative image group (step S47). It then associates with each other any two representative images whose display link distance in the (i-1)-th layer is "2", and sets all the display link distances between the associated images to "1" (step S48). As a result, the i-th layer network is constructed. In the example shown in FIG. 22, links are formed between the representative images surrounded by the thick frames shown in FIG. 21.
- Thereafter, the network construction unit 12 determines whether or not to terminate the hierarchization processing (step S49). If it is determined that the processing is not to be terminated, the network construction unit 12 increments the layer number i (step S50), and the processing from step S42 onward is repeated. On the other hand, if it is determined that the hierarchization processing is to be terminated, the network construction unit 12 ends the processing and records the constructed networks of the first to L-th layers (L is an integer of 1 or more) in the network database 20. As a result, as shown in FIG. 23, the networks 50 0 to 50 L of the 0th to L-th layers are constructed.
- In step S44, the images adjacent to the representative image were deleted. Instead, the images whose display link distance from the representative image is N (N is an integer of 2 or more) or less may be deleted.
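One hierarchization pass (steps S42 to S48) can be sketched as follows. Two assumptions are made, noted in the comments: the next starting image is chosen simply as the smallest remaining image number (the flowchart instead prefers neighbours of just-deleted images), and N = 1, so representatives two links apart in the lower layer become directly linked in the upper layer.

```python
def build_higher_layer(network):
    """One hierarchization pass (sketch of steps S42-S48).
    Representatives are picked greedily: take the smallest remaining
    image number, delete its neighbours, repeat (a simplification of
    the flowchart's next-starting-image rule). Then representatives
    that were 2 links apart in the lower layer are linked directly."""
    remaining = set(network)
    reps = []
    while remaining:
        r = min(remaining)                      # smallest image number first
        reps.append(r)
        remaining.discard(r)
        remaining -= set(network.get(r, ()))    # step S44: delete neighbours
    rep_set = set(reps)
    upper = {r: set() for r in reps}
    for r in reps:                              # step S48: link distance-2 pairs
        for mid in network.get(r, ()):
            for s in network.get(mid, ()):
                if s != r and s in rep_set:
                    upper[r].add(s)
                    upper[s].add(r)
    return upper
```

Applying the function to its own output yields the next layer up, which is the recursion the text describes; each layer has roughly fewer nodes than the one below, so the recursion terminates quickly.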
- FIG. 24 is a flowchart schematically showing the procedure of an image search process by the main controller 13.
- First, the hierarchy selection unit 13A (FIG. 1) selects, as the first search target, the network of the highest layer from among the networks of the 0th to L-th layers stored in the network database 20. Alternatively, the first search target may be selected by the user via the operation unit 16.
- Next, the display control unit 13C causes the display unit 18 to list the search target images belonging to the highest layer by executing the image list display process shown in FIG. 11 (step S61). That is, on the screen 40 of the display unit 18, the main image and the sub-images belonging to the highest layer are displayed in a list format as shown in FIG.
- When the user specifies the target image, the search process ends (step S33 in FIG. 11). If the target image cannot be found, the user can designate an image other than the target image as the next main image. In that case, the designated image is set as the main image (FIG. 11, step S34).
- the image selection unit 13B sets an image whose display link distance from the main image is equal to or less than the set value Rs as a sub image (step S62). Thereafter, the display control unit 13C causes the display unit 18 to display the main image and the sub-image in a list format (step S63). The user can perform an input operation on the operation unit 16 to specify a desired target image from a group of images displayed on the screen 40.
- the image selection unit 13B determines whether or not the target image is specified by detecting an input instruction from the operation unit 16 (step S64). When the user specifies the target image, the image selection unit 13B determines that the target image has been specified, and ends the image search process.
- On the other hand, when the user does not specify the target image, the image selection unit 13B determines in step S64 that the target image has not been specified, and the processing then shifts to one of steps S66, S67, and S68 according to the type of the input instruction (step S65).
- If the input instruction is a "list display instruction", the list display process of step S66 (FIG. 11) is executed, and then the processing from step S62 onward is repeated.
- If the user inputs an instruction to change one of the sub-images to the main image (step S65), the designated sub-image is set as the next main image (step S68). Thereafter, the processing from step S62 onward is repeated.
- If the user inputs an instruction to move between layers, the process of moving between layers is performed in step S67.
- the procedure of the process of moving between layers by the layer selecting unit 13A will be described with reference to the flowchart of FIG.
- the symbol C1 in the figure represents a connector.
- First, the hierarchy selection unit 13A determines whether the user's input instruction is a "summary search" or a "detailed search" (step S70). If the input instruction is a "detailed search", it is determined whether or not a network of a layer lower than the current layer exists (step S71). If there is no lower layer, the processing returns to the main routine (FIG. 24), and the processing from step S62 onward is repeated.
- If it is determined in step S71 that there is a lower layer, the hierarchy selection unit 13A switches the search target from the current layer 50 k+1 (k is an integer of 0 or more) to the lower layer 50 k as shown in FIG. (step S72), and the processing returns to the main routine (FIG. 24). Thereafter, the processing from step S62 onward is repeated. As a result, the main image and the sub-images belonging to the lower layer 50 k are displayed on the display screen 40, and the user can search, while viewing the display screen 40, for a target image that may exist in the lower layer 50 k.
- On the other hand, when it is determined in step S70 that the input instruction is a "summary search", the hierarchy selection unit 13A determines whether there is a network of a layer higher than the current layer (step S73). If there is no higher layer, the processing returns to the main routine (FIG. 24), and the processing from step S62 onward is repeated.
- If there is a higher layer, the hierarchy selection unit 13A determines whether the main image exists in the higher layer 50 k+1 (step S74). As illustrated in FIG. 26, when the main image Ij exists in both the current layer 50 k and the higher layer 50 k+1, the hierarchy selection unit 13A switches the search target from the current layer 50 k to the higher layer 50 k+1 (step S75), and then the processing returns to the main routine (FIG. 24). On the other hand, when the main image Ij that exists in the current layer 50 k does not exist in the higher layer 50 k+1, the hierarchy selection unit 13A sets, as the next main image, one of the sub-images Ij+1 that is adjacent to the main image Ij (that is, has the shortest display link distance from the main image Ij) and exists in the higher layer (step S76), switches the search target from the current layer 50 k to the higher layer 50 k+1 (step S75), and then the processing returns to the main routine (FIG. 24). Thereafter, the processing from step S62 onward is repeated.
- As a result, the main image and the sub-images belonging to the higher layer 50 k+1 are displayed on the display screen 40, so that the user can search, while viewing the display screen 40, for a target image that may exist in the higher layer 50 k+1. In this way, the user can efficiently and easily search for a desired target image while moving between layers.
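The upward move of steps S74 to S76 can be sketched as follows. This is a minimal sketch: layers are adjacency dicts, membership in the upper layer is tested by key presence, and among equally adjacent candidates the smallest image number is taken, a tie-break the text leaves open.

```python
def move_up(main_id, lower_net, upper_net):
    """Sketch of steps S74-S76: when switching the search target to the
    upper layer, keep the main image if it is also a node there;
    otherwise promote an adjacent lower-layer image that does exist
    in the upper layer (smallest image number as a tie-break)."""
    if main_id in upper_net:                    # step S74: already present
        return main_id
    for neighbour in sorted(lower_net.get(main_id, ())):
        if neighbour in upper_net:              # step S76: promote neighbour
            return neighbour
    return None                                 # no adjacent upper-layer image
```

Because every upper-layer node is, by construction of the hierarchization pass, at distance 1 from some deleted lower-layer node, a neighbour in the upper layer normally exists whenever the main image itself does not.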
- Since the image search process mainly uses only the hierarchy information and link information of the database, it is possible to perform a high-speed search with a small amount of computation without performing complicated processing.
- the image search device has been described above.
- In the above embodiment, the network topology shown in FIG. 6 is not displayed on the display unit 18; however, when the user specifies the main image from which the target image is to be searched, the topology may be displayed three-dimensionally on the display unit 18.
Landscapes
- Engineering & Computer Science (AREA)
- Library & Information Science (AREA)
- Theoretical Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
- Processing Or Creating Images (AREA)
- Image Analysis (AREA)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/547,082 US20080235184A1 (en) | 2004-03-31 | 2005-03-22 | Image Search Method, Image Search Apparatus, and Recording Medium Having Image Search Program Code Thereon |
JP2006511629A JP4465534B2 (ja) | 2004-03-31 | 2005-03-22 | 画像検索方法、装置及びプログラムを記録した記録媒体 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004-106037 | 2004-03-31 | ||
JP2004106037 | 2004-03-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005096180A1 true WO2005096180A1 (ja) | 2005-10-13 |
Family
ID=35063982
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/005649 WO2005096180A1 (ja) | 2004-03-31 | 2005-03-22 | 画像検索方法、装置及びプログラムを記録した記録媒体 |
Country Status (3)
Country | Link |
---|---|
US (1) | US20080235184A1 (ja) |
JP (1) | JP4465534B2 (ja) |
WO (1) | WO2005096180A1 (ja) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007256310A (ja) * | 2006-03-20 | 2007-10-04 | Seiko Epson Corp | 画像表示システムおよびサーバ装置 |
JP2008129942A (ja) * | 2006-11-22 | 2008-06-05 | Hitachi Ltd | コンテンツ検索装置 |
JP2008134725A (ja) * | 2006-11-27 | 2008-06-12 | Sharp Corp | コンテンツ再生装置 |
JP2010079871A (ja) * | 2008-06-09 | 2010-04-08 | Yahoo Japan Corp | ベクトルデータ検索装置 |
JP2010250529A (ja) * | 2009-04-15 | 2010-11-04 | Yahoo Japan Corp | 画像検索装置、画像検索方法及びプログラム |
JP2012009084A (ja) * | 2011-10-12 | 2012-01-12 | Hitachi Ltd | 情報出力装置 |
US8107690B2 (en) | 2007-04-16 | 2012-01-31 | Fujitsu Limited | Similarity analyzing device, image display device, image display program storage medium, and image display method |
KR20160085004A (ko) * | 2015-01-07 | 2016-07-15 | 한화테크윈 주식회사 | 중복 이미지 파일 검색 방법 및 장치 |
JP2017505937A (ja) * | 2013-12-02 | 2017-02-23 | ラクテン ユーエスエー インコーポレイテッド | オブジェクトネットワークをモデル化するシステム及び方法 |
JP6964372B1 (ja) * | 2021-05-19 | 2021-11-10 | 忠久 片岡 | コード生成方法、コード生成装置、プログラム、データ照合方法 |
JP7128555B1 (ja) * | 2021-05-19 | 2022-08-31 | 忠久 片岡 | データ照合方法 |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1755067A1 (en) * | 2005-08-15 | 2007-02-21 | Mitsubishi Electric Information Technology Centre Europe B.V. | Mutual-rank similarity-space for navigating, visualising and clustering in image databases |
JP4296521B2 (ja) * | 2007-02-13 | 2009-07-15 | ソニー株式会社 | 表示制御装置、表示制御方法、およびプログラム |
US7941442B2 (en) * | 2007-04-18 | 2011-05-10 | Microsoft Corporation | Object similarity search in high-dimensional vector spaces |
US7870130B2 (en) * | 2007-10-05 | 2011-01-11 | International Business Machines Corporation | Techniques for identifying a matching search term in an image of an electronic document |
US8774526B2 (en) * | 2010-02-08 | 2014-07-08 | Microsoft Corporation | Intelligent image search results summarization and browsing |
WO2016098430A1 (ja) * | 2014-12-15 | 2016-06-23 | ソニー株式会社 | 情報処理方法、映像処理装置及びプログラム |
WO2017179258A1 (ja) * | 2016-04-11 | 2017-10-19 | ソニー株式会社 | 情報処理装置、及び情報処理方法 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10124525A (ja) * | 1996-10-23 | 1998-05-15 | Matsushita Electric Ind Co Ltd | 検索装置 |
JP2000112973A (ja) * | 1998-10-02 | 2000-04-21 | Ricoh Co Ltd | 空間インデックス方法及び空間インデックス処理プログラムを格納した媒体 |
JP2001325294A (ja) * | 2000-05-17 | 2001-11-22 | Olympus Optical Co Ltd | 類似画像検索方法および類似画像検索装置 |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6941321B2 (en) * | 1999-01-26 | 2005-09-06 | Xerox Corporation | System and method for identifying similarities among objects in a collection |
JP4078085B2 (ja) * | 2001-03-26 | 2008-04-23 | キヤノン株式会社 | 変倍画像生成装置、方法、及びそのコンピュータプログラムとコンピュータ読み取り可能な記憶媒体 |
2005
- 2005-03-22 US US11/547,082 patent/US20080235184A1/en not_active Abandoned
- 2005-03-22 WO PCT/JP2005/005649 patent/WO2005096180A1/ja active Application Filing
- 2005-03-22 JP JP2006511629A patent/JP4465534B2/ja active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10124525A (ja) * | 1996-10-23 | 1998-05-15 | Matsushita Electric Ind Co Ltd | 検索装置 |
JP2000112973A (ja) * | 1998-10-02 | 2000-04-21 | Ricoh Co Ltd | 空間インデックス方法及び空間インデックス処理プログラムを格納した媒体 |
JP2001325294A (ja) * | 2000-05-17 | 2001-11-22 | Olympus Optical Co Ltd | 類似画像検索方法および類似画像検索装置 |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007256310A (ja) * | 2006-03-20 | 2007-10-04 | Seiko Epson Corp | 画像表示システムおよびサーバ装置 |
JP2008129942A (ja) * | 2006-11-22 | 2008-06-05 | Hitachi Ltd | コンテンツ検索装置 |
JP2008134725A (ja) * | 2006-11-27 | 2008-06-12 | Sharp Corp | コンテンツ再生装置 |
US8107690B2 (en) | 2007-04-16 | 2012-01-31 | Fujitsu Limited | Similarity analyzing device, image display device, image display program storage medium, and image display method |
US8300901B2 (en) | 2007-04-16 | 2012-10-30 | Fujitsu Limited | Similarity analyzing device, image display device, image display program storage medium, and image display method |
JP2010079871A (ja) * | 2008-06-09 | 2010-04-08 | Yahoo Japan Corp | ベクトルデータ検索装置 |
JP2010250529A (ja) * | 2009-04-15 | 2010-11-04 | Yahoo Japan Corp | 画像検索装置、画像検索方法及びプログラム |
JP2012009084A (ja) * | 2011-10-12 | 2012-01-12 | Hitachi Ltd | 情報出力装置 |
JP2017505937A (ja) * | 2013-12-02 | 2017-02-23 | ラクテン ユーエスエー インコーポレイテッド | オブジェクトネットワークをモデル化するシステム及び方法 |
KR20160085004A (ko) * | 2015-01-07 | 2016-07-15 | 한화테크윈 주식회사 | 중복 이미지 파일 검색 방법 및 장치 |
KR102260631B1 (ko) * | 2015-01-07 | 2021-06-07 | 한화테크윈 주식회사 | 중복 이미지 파일 검색 방법 및 장치 |
JP6964372B1 (ja) * | 2021-05-19 | 2021-11-10 | 忠久 片岡 | コード生成方法、コード生成装置、プログラム、データ照合方法 |
JP7128555B1 (ja) * | 2021-05-19 | 2022-08-31 | 忠久 片岡 | データ照合方法 |
Also Published As
Publication number | Publication date |
---|---|
US20080235184A1 (en) | 2008-09-25 |
JP4465534B2 (ja) | 2010-05-19 |
JPWO2005096180A1 (ja) | 2008-02-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2005096180A1 (ja) | 画像検索方法、装置及びプログラムを記録した記録媒体 | |
TWI361619B (en) | Image managing apparatus and image display apparatus | |
JP4507991B2 (ja) | 情報処理装置、情報処理方法、およびプログラム | |
US20210160495A1 (en) | Transferring system for huge and high quality images on network and method thereof | |
TWI395107B (zh) | 內容管理裝置、影像顯示裝置、影像拾訊裝置、處理方法、及使電腦執行處理方法之程式 | |
US8732149B2 (en) | Content output device, content output method, program, program recording medium, and content output integrated circuit | |
US7764849B2 (en) | User interface for navigating through images | |
CN1886740B (zh) | 用于从信息源检索信息的方法和装置 | |
JP4702743B2 (ja) | コンテンツ表示制御装置およびコンテンツ表示制御方法 | |
JP5517683B2 (ja) | 画像処理装置及びその制御方法 | |
JP2007041964A (ja) | 画像処理装置 | |
US20150139608A1 (en) | Methods and devices for exploring digital video collections | |
JP2013105309A (ja) | 情報処理装置、情報処理方法、及びプログラム | |
CN102014250A (zh) | 图像控制装置以及图像控制方法 | |
KR20140043359A (ko) | 정보 처리 장치, 정보 처리 방법 및 컴퓨터 프로그램 제품 | |
JP2007019963A (ja) | 表示制御装置、カメラ、表示制御方法、プログラム、記録媒体 | |
KR100644016B1 (ko) | 동영상 검색 시스템 및 방법 | |
JP2003076718A (ja) | 文書情報コンテンツ閲覧システム及び文書情報コンテンツ閲覧方法及びプログラム及び記録媒体 | |
JP4784656B2 (ja) | 情報処理装置、情報処理方法およびプログラム | |
KR102003492B1 (ko) | 검색 컨텍스트의 유지 | |
WO2001082131A1 (fr) | Dispositif d'extraction d'informations | |
Giang et al. | Street navigation using visual information on mobile phones | |
CN110688492B (zh) | 一种基于轻量级索引的知识图谱查询方法 | |
JP3412748B2 (ja) | 映像データベース検索表示方法、装置および映像データベース検索表示プログラムを記録した記録媒体 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
DPEN | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed from 20040101) | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2006511629 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWW | Wipo information: withdrawn in national office |
Country of ref document: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 11547082 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase |