CN101344964B - Apparatus and method for processing image - Google Patents


Info

Publication number
CN101344964B
CN101344964B CN2008101280432A CN200810128043A
Authority
CN
China
Prior art keywords
contents processing
processing
feature
combination
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN2008101280432A
Other languages
Chinese (zh)
Other versions
CN101344964A (en)
Inventor
Hirofumi Nishida (西田広文)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ricoh Co Ltd
Original Assignee
Ricoh Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ricoh Co Ltd filed Critical Ricoh Co Ltd
Publication of CN101344964A publication Critical patent/CN101344964A/en
Application granted granted Critical
Publication of CN101344964B publication Critical patent/CN101344964B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Processing (AREA)
  • Facsimile Image Signal Circuits (AREA)
  • Image Analysis (AREA)

Abstract

The present invention discloses an apparatus, a method, and a computer program product for processing images. A first combination of a feature and a processing content of image data is stored in a storing unit in a first period, and a second combination of a feature and a processing content of image data is stored in the storing unit in a second period that is later in time. When a change in processing content is detected between the first period and the second period, an updating unit updates the first combination to the second combination. An acquiring unit acquires a processing content for target image data based on the combinations of features and processing contents stored in the storing unit. An output unit outputs the processing content acquired by the acquiring unit.

Description

Image processing apparatus and method
Cross-reference to related applications
The present application claims the benefit of priority of Japanese priority application No. 2007-183000, filed on July 12, 2007, and Japanese priority application No. 2008-109312, filed on April 18, 2008; the entire contents of both applications are incorporated herein by reference.
Technical field
The present invention relates to an apparatus, a method, and a computer program product for processing images.
Background of the invention
With the widespread use of color scanners and digital cameras, the input device and the output device used to input and output image data often differ from each other; for example, a printer outputs image data of an image captured by a digital camera. An output device usually outputs image data after applying a correction based on the features of the image data, for example a background color correction. However, when the input device and the output device differ from each other, it is difficult to recognize the features of the image data.
Various technologies have been developed to address this problem. For example, there is an image processing apparatus that can appropriately perform image processing on image data (see, for example, Japanese Patent Application Laid-open No. 2006-053690). In this apparatus, for image data of a color document input by an image input device such as a scanner, the processing history and processing state of each operation, such as "undo" and "redo", are managed, so that data for creating optimum image data can be managed and processed according to different applications. In addition, an image processing apparatus is disclosed that, by converting the output state, allows the user to visually and intuitively recognize the state transitions of image processing (see, for example, Japanese Patent Application Laid-open No. 2006-074331).
However, not only are the types of image data subjected to image processing diverse, but users' preferences and purposes in using the image data are also diverse. For example, background processing includes background color removal, in which the background is rendered white regardless of its original color, and background cleaning, in which the original background color is retained while stains and show-through are removed. A user may select either technique according to personal preference. However, when the apparatus requires the user to select the content or the parameters of the desired image processing, the user's operation is inconvenient and complicated, and operational efficiency is therefore reduced.
Summary of the invention
It is an object of the present invention to at least partially solve the problems in the conventional technology.
According to one aspect of the present invention, an image processing apparatus is provided that includes: a change detecting unit that detects whether there is a change in processing content between a first period and a second period, a first combination of a feature of first image data and a processing content being stored in a history-information storing unit in the first period, and a second combination of a feature of second image data similar to the first image data and a processing content being stored in the second period, which is later in time than the first period; a history-information updating unit that updates the first combination to the second combination when the change detecting unit detects a change in processing content between the first period and the second period; an image-data acquiring unit that acquires target image data to be processed; a processing-content acquiring unit that acquires a processing content for the target image data based on the combinations of features and processing contents stored in the history-information storing unit; and a processing-content output unit that outputs the processing content acquired by the processing-content acquiring unit.
Furthermore, according to another aspect of the present invention, an image processing method is provided that includes: detecting whether there is a change in processing content between a first period and a second period, a first combination of a feature of first image data and a processing content being stored in the first period, and a second combination of a feature of second image data similar to the first image data and a processing content being stored in the second period, which is later in time than the first period; updating the first combination to the second combination when a change in processing content is detected between the first period and the second period; acquiring target image data to be processed; acquiring a processing content for the target image data based on the combinations of features and processing contents; and outputting the processing content acquired in the acquiring of the processing content.
Moreover, according to still another aspect of the present invention, a computer program product is provided that includes a computer-usable medium having computer-readable program code stored in the medium which, when executed, causes a computer to execute: detecting whether there is a change in processing content between a first period and a second period, a first combination of a feature of first image data and a processing content being stored in the first period, and a second combination of a feature of second image data similar to the first image data and a processing content being stored in the second period, which is later in time than the first period; updating the first combination to the second combination when a change in processing content is detected between the first period and the second period; acquiring target image data to be processed; acquiring a processing content for the target image data based on the combinations of features and processing contents; and outputting the processing content acquired in the acquiring of the processing content.
The above and other objects, features, advantages, and technical and industrial significance of this invention will be better understood by reading the following detailed description of presently preferred embodiments of the invention, when considered in connection with the accompanying drawings.
Brief description of the drawings
Fig. 1 is an electrical connection diagram of an image processing apparatus according to a first embodiment of the present invention;
Fig. 2 is a functional block diagram of the image processing apparatus shown in Fig. 1;
Fig. 3 is a flowchart of history storing processing, executed by the image processing apparatus shown in Fig. 2, for storing feature vectors and processing contents in a history database (DB);
Fig. 4 is a flowchart for describing in detail the feature calculation processing in the history storing processing shown in Fig. 3;
Fig. 5 is a flowchart for describing in detail the classification processing in the feature calculation processing shown in Fig. 4;
Fig. 6 is a pattern diagram for explaining the multiresolution processing in the classification processing shown in Fig. 5;
Fig. 7 is a schematic diagram of examples of mask patterns used to calculate higher-order autocorrelation functions;
Fig. 8 is a pattern diagram of a classification example;
Fig. 9 is a table of an example of history information stored in the history DB;
Fig. 10 is a table of an example of processing contents stored in the history DB;
Fig. 11 is a flowchart of processing-content prediction processing for presenting an optimum processing content to a user based on a prediction function;
Fig. 12 is a flowchart for describing in detail the preference-change detection processing and the learning-data-set update processing in the processing-content prediction processing shown in Fig. 11;
Fig. 13 is a schematic diagram for explaining a combination S;
Fig. 14 is a functional block diagram of an image processing apparatus according to a second embodiment of the present invention;
Fig. 15 is a flowchart of processing-content prediction processing executed by the image processing apparatus shown in Fig. 14;
Fig. 16 is a schematic diagram of a multifunction product according to a third embodiment of the present invention; and
Fig. 17 is a system configuration diagram of an image processing system according to a fourth embodiment of the present invention.
Embodiments
Exemplary embodiments of the present invention are described in detail below with reference to the accompanying drawings.
Fig. 1 is an electrical connection diagram of the image processing apparatus 1 according to the first embodiment of the present invention. As shown in Fig. 1, the image processing apparatus 1 is a computer such as a personal computer (PC) and includes a central processing unit (CPU) 2, a main storage unit 5, a secondary storage unit 6, a removable disk device 8, a network interface (I/F) 10, a display device 11, a keyboard 12, and a pointing device 13 such as a mouse. The CPU 2 centrally controls each unit in the image processing apparatus 1. The main storage unit 5 includes a read-only memory (ROM) 3 and a random-access memory (RAM) 4 for storing information therein. The secondary storage unit 6 includes a hard disk drive (HDD) 7, in which data files (for example, color bitmap image data) are stored. The removable disk device 8 is, for example, a compact disc read-only memory (CD-ROM) drive, which stores information on a CD-ROM, distributes information to the outside, or receives information from the outside. The network I/F 10 is used to communicate with other external computers through a network 9 to transmit information. The display device 11 includes, for example, a cathode ray tube (CRT) or a liquid crystal display (LCD), on which the progress and results of processing are displayed for the operator. The operator uses the keyboard 12 and the pointing device 13 to input commands, information, and the like to the CPU 2. Data is transmitted and received between these components through a data bus 14.
In the present embodiment, the image processing apparatus 1 is applied to a general-purpose PC. However, the invention is not limited to a PC. For example, the invention can be applied to a handheld terminal such as a personal digital assistant (PDA), a palmtop PC, a mobile phone, or a personal handyphone system (PHS).
When the user powers on the image processing apparatus 1, the CPU 2 activates a boot program stored in the ROM 3. Thereafter, the operating system, a program that manages the computer's hardware and software, is loaded from the HDD 7 into the RAM 4 and started. In response to user operations, the operating system starts programs and loads or stores information. "Windows (TM)" and "UNIX (TM)" are typical examples of operating systems. Programs that run on the operating system are called application programs.
The image processing apparatus 1 stores an image processing program in the HDD 7 as an application program. In this sense, the HDD 7 functions as a storage medium that stores the image processing program.
Generally, such an application program is stored in a storage medium 8a, for example an optical information storage medium such as a CD-ROM or a digital versatile disc (DVD-ROM), or a magnetic storage medium such as a floppy disk (FD). The application program recorded in the storage medium 8a is installed in the secondary storage unit 6, such as the HDD 7. Hence the portable storage medium 8a can also serve as a storage medium for storing the application program. Alternatively, the application program can be stored in an external computer connected to a network such as the Internet, and downloaded through the network I/F 10 to be installed in the secondary storage unit 6 such as the HDD 7. Furthermore, the image processing program that implements the image processing apparatus 1 can be provided or distributed to the outside through a network such as the Internet.
When the image processing apparatus 1 runs the image processing program on the operating system, the CPU 2 centrally controls each unit by performing each computation according to the image processing program. Among the computations performed by the CPU 2, the normalization processing performed when accumulating/transmitting image data, which is specific to the present embodiment, is described below. Normalization is processing that converts digital image data received from an external device (for example, a scanner or a digital camera) connected to the image processing apparatus 1 through the network 9 into an ideal form.
When real-time processing is emphasized, this processing needs to be executed at high speed. Therefore, it is preferable to additionally provide a logic circuit (not shown) to the CPU 2 so that the CPU 2 performs the computations using the logic circuit.
Next, the image processing performed by the CPU 2 is described below. Fig. 2 is a functional block diagram of the image processing apparatus 1. The image processing apparatus 1 includes an image-data acquiring unit 100, a feature calculating unit 102, a specification receiving unit 104, a history database (DB) 110, a prediction-function creating unit 120, a processing-content predicting unit (processing-content acquiring unit) 122, a processing-content output unit 124, an image processing unit 126, a change detecting unit 130, and an updating unit 132.
The image-data acquiring unit 100 acquires image data. In addition, when the input image data is a document, the image-data acquiring unit 100 corrects the skew of the document.
The feature calculating unit 102 calculates features of the whole image data. For example, statistical information is used as the features, such as the character ratio and picture ratio of the entire image, the degree of scatter of characters and pictures, the layout density, the spatial distribution, color distribution, and edge distribution of characters and pictures, and the background color.
The specification receiving unit 104 receives a specification of the processing content for the image data acquired by the image-data acquiring unit 100. The specification of the processing content is input by the user. A processing content includes a processing type, parameters of the type, and so on. Examples of processing types are background color correction processing, spatial filtering processing, and resolution enhancement processing.
Background color correction processing includes background color removal, in which the background color is rendered white, and background cleaning, in which the background color is corrected. Methods of background color removal are disclosed in, for example, Japanese Patent Application Laid-open No. 2004-320701 and Japanese Patent Application Laid-open No. 2005-110184. When a set of algorithms or parameters is expressed as "A", background color correction processing is defined as:
A = {background color removal, background cleaning, do nothing}
Spatial filtering processing includes smoothing processing, edge enhancement processing, and adaptive filtering applied to the whole target image to be processed. In adaptive filtering, different processing is performed on each pixel. Adaptive filtering is described in detail in, for example, Japanese Patent Application Laid-open No. 2003-281526. Spatial filtering processing is defined as:
A = {smoothing, edge enhancement, adaptive filtering, do nothing}
Resolution enhancement processing includes processing that increases the resolution of characters, disclosed in, for example, Japanese Patent Application Laid-open No. 2005-063055, and normal image interpolation. Resolution enhancement processing is defined as:
A = {character resolution increase, image interpolation, do nothing}
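Taken together, the three processing types and their algorithm sets A can be sketched as plain data; the set names and string labels below are hypothetical, chosen only to mirror the definitions above.

```python
# Hypothetical sketch of the three algorithm sets A defined above.
BACKGROUND_CORRECTION = {"background_color_removal", "background_cleaning", "do_nothing"}
SPATIAL_FILTERING = {"smoothing", "edge_enhancement", "adaptive_filtering", "do_nothing"}
RESOLUTION_ENHANCEMENT = {"character_resolution_increase", "image_interpolation", "do_nothing"}

# One set A per processing type; a prediction function is later built per set.
PROCESSING_TYPES = {
    "background_correction": BACKGROUND_CORRECTION,
    "spatial_filtering": SPATIAL_FILTERING,
    "resolution_enhancement": RESOLUTION_ENHANCEMENT,
}
```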
The history DB 110 stores therein the features obtained by the feature calculating unit 102 and the processing contents specified by the user for given image data, associated with each user (with the user ID assigned to each user). The history DB 110 stores therein the combinations of features and user-specified processing contents in chronological order. In other words, the history DB 110 stores therein history information H expressed as:
H = {(x(1), a(1)), (x(2), a(2)), ...}   (1)
where "x(k)" denotes the feature extracted from the k-th item of image data, and "a(k)" (a ∈ A) denotes the algorithm or processing parameter, that is, the processing content applied to the image data.
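As a rough illustration (not the patent's implementation), the per-user chronological history H of equation (1) could be kept as a list of (feature vector, processing content) pairs; the class and method names here are hypothetical.

```python
from collections import defaultdict

class HistoryDB:
    """Hypothetical sketch of the history DB 110: per-user chronological
    pairs H = {(x(1), a(1)), (x(2), a(2)), ...}."""

    def __init__(self):
        self._h = defaultdict(list)  # user_id -> [(feature vector x, processing content a), ...]

    def add(self, user_id, x, a):
        # Append in chronological order, as in step S106.
        self._h[user_id].append((tuple(x), a))

    def history(self, user_id):
        return list(self._h[user_id])
```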
The prediction-function creating unit 120 creates, based on the features and processing contents stored in the history DB 110, a prediction function for identifying a processing content for newly acquired image data.
The processing-content predicting unit (processing-content acquiring unit) 122 includes the feature calculating unit 102 and the prediction-function creating unit 120. The processing-content predicting unit 122 predicts the optimum processing content for the target image data based on the features calculated by the feature calculating unit 102 and the prediction function created by the prediction-function creating unit 120.
The processing-content output unit 124 displays, on the display screen, the optimum processing content predicted by the processing-content predicting unit 122.
The image processing unit 126 performs image processing on the image data according to the specification of the processing content received by the specification receiving unit 104.
The change detecting unit 130 detects whether there is any change between a first relation between image data and processing contents stored in the history DB 110 in a first period and a second relation between image data and processing contents stored in the history DB 110 in a second period after the first period. The image data here includes the features of the image data.
The updating unit 132 updates the contents stored in the history DB 110 based on the detection result of the change detecting unit 130.
Fig. 3 is a flowchart of the history storing processing, executed by the image processing apparatus 1, for storing feature vectors and processing contents in the history DB 110. First, the image-data acquiring unit 100 acquires image data (step S100). Then, the feature calculating unit 102 calculates the feature vector of the image data acquired by the image-data acquiring unit 100 (step S102). The specification receiving unit 104 receives a specification of the processing content of the image processing for the image data acquired by the image-data acquiring unit 100 (step S104).
The feature vector calculated by the feature calculating unit 102 and the specified processing content received by the specification receiving unit 104 are stored in the history DB 110 in association with each other (step S106). That is, "(x(k), a(k))" is added to the history information H. The image processing unit 126 performs image processing on the image data according to the specified processing content (step S108).
Fig. 4 is a flowchart for describing in detail the feature calculation processing (step S102) in the history storing processing shown in Fig. 3. The feature calculating unit 102 divides the whole image data acquired by the image-data acquiring unit 100 into square blocks of equal size (step S110). Specifically, the feature calculating unit 102 divides the image data into square blocks of equal size of, for example, 1 square centimeter (1 cm × 1 cm), that is, 80 × 80 pixels at a resolution of 200 dots per inch (dpi), or 120 × 120 pixels at a resolution of 300 dpi.
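A minimal sketch of the block division of step S110, assuming a NumPy grayscale array and the patent's examples of 80 pixels per block at 200 dpi and 120 pixels at 300 dpi; dropping the edge remainders is an assumption of this sketch.

```python
import numpy as np

def split_into_blocks(img, dpi):
    """Hypothetical sketch of step S110: tile a grayscale image (2-D array)
    into roughly 1 cm x 1 cm square blocks of equal size."""
    side = round(dpi * 0.4)  # 1 cm ~ 0.4 inch in the patent's examples (80 px @ 200 dpi)
    h, w = img.shape
    return [img[r:r + side, c:c + side]
            for r in range(0, h - side + 1, side)
            for c in range(0, w - side + 1, side)]
```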
Each block is classified into one of the following three types: "picture", "character", and "other" (step S112). Specifically, as described in the flowchart shown in Fig. 5, an image I is created by reducing the resolution of the target image block to be processed to a low resolution of about 100 dpi (step S120); at the same time, a resolution level L is set (step S122), and the resolution reduction level k is initialized (k ← 0) (step S124). The processing of steps S120 to S124 is performed so that features are extracted not only from the image I but also from images whose resolution is reduced further, as shown in Fig. 6. For example, when the resolution level L is set to 2, features are extracted from each of the image I, an image I_1 having half the resolution of the image I, and an image I_2 having one quarter of the resolution of the image I.
While the resolution reduction level k has not exceeded the resolution level L (Yes at step S126), an image I_k whose resolution is 1/2^k of the resolution of the image I created at step S120 is created (step S128). Then, the image I_k is binarized (step S130). In the binarized image, black pixels are represented by 1 and white pixels by 0.
After an M-dimensional feature vector f_k is obtained from the binarized image I_k having 1/2^k resolution (step S132), the resolution reduction level k is incremented by 1 (k ← k + 1) (step S134).
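Steps S120 to S130 (the resolution pyramid and binarization) can be sketched as follows; the 2 × 2 averaging used for halving the resolution and the fixed threshold of 128 are assumptions of this sketch, not taken from the patent.

```python
import numpy as np

def binarized_pyramid(img, levels):
    """Hypothetical sketch of steps S120-S130: build images I_0..I_L at
    1/2^k resolution (by 2x2 block averaging) and binarize each one
    (1 = black/dark pixel, 0 = white, per the patent's convention)."""
    pyramid = []
    current = img.astype(float)
    for k in range(levels + 1):
        pyramid.append((current < 128).astype(np.uint8))  # dark pixels -> 1
        h, w = current.shape
        # Halve the resolution for the next level I_{k+1}.
        current = current[:h // 2 * 2, :w // 2 * 2].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return pyramid
```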
The method of extracting features from the binarized images I_k (k = 0, ..., L) is described below. When the target image on the screen is denoted by I(r), the "higher-order autocorrelation function (N-th order autocorrelation function)", which extends the autocorrelation function to a higher order (N-th order) with respect to displacement vectors (s_1, s_2, ..., s_N), is defined as:
z_N(s_1, s_2, ..., s_N) = Σ_r I(r)·I(r + s_1)·...·I(r + s_N)   (2)
where the summation Σ is taken over the pixels "r" of the entire image. Accordingly, an infinite number of higher-order autocorrelation functions can be obtained from an image depending on the order and the displacement vectors (s_1, s_2, ..., s_N). In the present embodiment, for simplicity, the order N of the higher-order autocorrelation functions is assumed to be at most 2, and the displacement vectors are restricted to a 3 × 3 pixel region centered on the reference pixel r. As shown in Fig. 7, when equivalent features that differ only by translation are eliminated, the total number of features for a binarized image is 25. Each feature can be calculated by scanning the entire image and accumulating, pixel by pixel, the product corresponding to the local pattern. For example, the feature for local pattern "No. 3" is calculated by taking, over the entire image, the sum of products of the gray value of the reference pixel r and the gray value of the pixel immediately to its right. In this way, a 25-dimensional (M = 25) feature vector f_k is calculated from the image with 1/2^k resolution as:
f_k = (g(k, 1), ..., g(k, 25))   (3)
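The 25 features of equations (2)-(3) are the classical higher-order local autocorrelation (HLAC) features. The sketch below derives the 25 translation-distinct masks (order N ≤ 2, displacements within a 3 × 3 neighborhood) by enumeration rather than a hard-coded table; it is an illustration under those assumptions, not the patent's code.

```python
import numpy as np
from itertools import combinations_with_replacement

def hlac_features(img):
    """Hypothetical sketch: the 25 higher-order local autocorrelation
    features of a binary image, order N <= 2, displacements in a 3x3
    neighborhood, with translation-equivalent patterns removed."""
    offsets = [(dy, dx) for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
    seen, masks = set(), []
    for n in range(3):  # order N = 0, 1, 2
        for combo in combinations_with_replacement(offsets, n):
            pts = frozenset(((0, 0),) + combo)  # reference pixel plus displacements
            # Canonicalize under translation so shifted duplicates collapse.
            my = min(p[0] for p in pts)
            mx = min(p[1] for p in pts)
            canon = frozenset((p[0] - my, p[1] - mx) for p in pts)
            if canon not in seen:
                seen.add(canon)
                masks.append(pts)
    assert len(masks) == 25  # matches the count stated in the text
    h, w = img.shape
    feats = []
    for pts in masks:
        # Sum over reference pixels of the product of mask-pattern pixels.
        prod = np.ones((h - 2, w - 2))
        for dy, dx in pts:
            prod = prod * img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        feats.append(prod.sum())
    return np.array(feats)
```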
Steps S128 to S134 are repeated until the resolution reduction level k incremented at step S134 exceeds the resolution level L.
When the resolution reduction level k incremented at step S134 has exceeded the resolution level L (No at step S126), the block is classified into one of the following three types based on the feature vectors f_0, ..., f_L: "picture", "character", and "other" (step S136).
The method of classifying a block is described in detail below. First, a (25 × (L + 1))-dimensional feature vector x = (g(0,1), ..., g(0,25), ..., g(L,1), ..., g(L,25)) is created from the 25-dimensional feature vectors f_k = (g(k,1), ..., g(k,25)) (k = 0, ..., L). To classify a block based on the feature vector x, learning must be performed in advance. In the present embodiment, the data used for learning is divided into two classes, namely data consisting only of characters and other data containing no characters, and the feature vector x is calculated for each class. Then, the average feature vector p_0 of the character data (whose pixels are hereinafter called "character pixels") and the average feature vector p_1 of the other data (whose pixels are hereinafter called "non-character pixels") are calculated in advance. The feature vector x obtained from the image block to be classified is decomposed into a linear combination of the given feature vectors p_0 and p_1, so that the combination coefficients a_0 and a_1 represent the ratio between character pixels and non-character pixels, that is, the "character likelihood" and the "non-character likelihood". This decomposition is possible because the features based on higher-order local autocorrelation are invariant to the position of the target on the screen and additive with respect to the number of targets. The feature vector x is decomposed as:
x = a_0·p_0 + a_1·p_1 = F^T·a + e   (4)
where "e" denotes an error vector, and "F" and "a" are defined as:
F = [p_0, p_1]^T,  a = (a_0, a_1)^T   (5)
The optimum combination coefficient vector "a" is obtained by the least squares method as:
a = (F·F^T)^(-1)·F·x   (6)
Threshold processing is applied to the parameter a_1 representing the "non-character likelihood" of each block, whereby each block is assigned to one of "picture", "non-picture", and "unknown". When a block is classified as "unknown" or "non-picture" and the parameter a_0 indicating the "character likelihood" exceeds a threshold, the block is classified as "character". When a block is classified as "unknown" or "non-picture" and the parameter a_0 does not exceed the threshold, the block is classified as "other". Fig. 8 is a pattern diagram of a classification example. In the example shown in Fig. 8, black parts represent "character", gray parts represent "picture", and white parts represent "other".
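Equations (4)-(6) amount to a two-vector least squares fit; a minimal sketch, assuming NumPy and the vector names used above:

```python
import numpy as np

def combination_coefficients(x, p0, p1):
    """Hypothetical sketch of equations (4)-(6): decompose a block's
    feature vector x as a_0*p0 + a_1*p1 by least squares,
    a = (F F^T)^-1 F x with F = [p0, p1]^T."""
    F = np.vstack([p0, p1])              # 2 x M matrix of the average vectors
    a = np.linalg.solve(F @ F.T, F @ x)  # solves (F F^T) a = F x
    return a                             # (a_0, a_1)
```

The block is then labeled by thresholding a[1] ("non-character likelihood") and a[0] ("character likelihood") per the rule above.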
Returning to the flowchart shown in Fig. 4, after each block has been classified into one of the three types "picture", "character", and "other", about 20 image features are calculated based on the classification results of all the blocks (step S114). For example, the image features include the ratios of characters and pictures, the layout density (the degree to which characters and pictures are packed into a narrow space), and the degree of scatter of characters and pictures (the degree to which characters and pictures are spread over the page). Specifically, the following five values are calculated as image features. The feature calculating unit 102 calculates various image features including, but not limited to, these five values; it extracts about 20 features, that is, a roughly 20-dimensional feature vector. From the viewpoint of creating, from the history of processing-content specifications, a prediction function by which various users can select the optimum processing content, it is preferable to use as many features as possible.
1. Character ratio Rt ∈ [0, 1]: the ratio of the blocks classified as "character" to all the blocks
2. Non-character ratio Rp ∈ [0, 1]: the ratio of the blocks classified as "picture" to all the blocks
3. Layout density D ∈ [0, 1]: the total area of the blocks classified as "character" and "picture" divided by the drawing area
4. Degree of scatter of characters St (> 0): the variance-covariance matrix of the spatial distribution of the character blocks in the x and y directions, normalized by the area of the image
5. Degree of scatter of non-characters Sp (> 0): the variance-covariance matrix of the spatial distribution of the picture blocks in the x and y directions, normalized by the area of the image
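Features 1-3 follow directly from the block labels; the sketch below assumes a rectangular grid of labels and treats the whole grid as the drawing area, which is a simplifying assumption (the patent's drawing area may be the content's bounding region).

```python
import numpy as np

def layout_features(labels):
    """Hypothetical sketch of features 1-3: labels is a 2-D array of
    strings 'character' / 'picture' / 'other', one entry per block."""
    labels = np.asarray(labels)
    total = labels.size
    rt = np.sum(labels == "character") / total  # 1. character ratio Rt
    rp = np.sum(labels == "picture") / total    # 2. non-character ratio Rp
    d = rt + rp                                 # 3. layout density D (grid = drawing area)
    return rt, rp, d
```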
Fig. 9 is a table of an example of the history information stored in the history DB 110. In the example shown in Fig. 9, the image data, the features obtained from the image data, and the processing contents specified for the image data are stored in association with one another. For example, the processing contents specified for the image data include background color correction processing (background color removal, background cleaning), spatial filtering processing (smoothing, edge enhancement, adaptive filtering), and resolution enhancement processing (character resolution increase, image interpolation) (see Fig. 10). In addition, as shown in Fig. 10, the processing contents specified for the image data include not only the processing types but also parameters. For example, the "3" in "background color removal 3" and the "1" in "edge enhancement 1" represent parameters.
Figure 11 is based on the process flow diagram that anticipation function is used for presenting to the user contents processing prediction processing of optimum contents processing.As the historical information H that uses about the view data of predetermined set, for example during the view data of 20 set, can obtain the optimum prediction function.Therefore, when storage during about the historical information H of the view data of predetermined set, the contents processing prediction processing is initialised.
First, based on the history information H stored in the history DB 110, the prediction function creating unit 120 creates a prediction function for predicting the processing content for image data (step S200). The prediction function is obtained from the history information H by a learning function; in other words, the history information H stored in the history DB 110 is used as the learning data set for creating the prediction function. The prediction function is created by a nearest-neighbor algorithm based on learning of weighted distances.
Given a feature set F representing image contents, a set A of algorithms representing processing contents and parameters, and the history information H, a function f_H(α, x, u) indicating the applicability of an algorithm α ∈ A to a feature x obtained from given unknown image data for a user u is created as the prediction function. The function f_H is expressed as:
f_H : A × R^N × U → R  (R denotes the set of real numbers)   (7)
A different function f_H is created for each algorithm set A. In other words, the function f_H for the algorithm set A of "background color correction processing" is different from the function f_H for the algorithm set A of "spatial filtering processing".
However, the following technical matter exists. In a Bayesian formulation, with "u" denoting the user, "x" the image feature, and "α" the processing, the applicability f_H(α, x, u) can be expressed by the following formula:
p(α|u, x) = p(α|u)·p(x|α, u)/p(x|u) = p(α|u)·p(x|α, u) / Σ_α p(α|u)·p(x|α, u)   (8)
where "p(x|u)" is a normalization factor of the image. When determining the priority among a plurality of processings α, the normalization factor "p(x|u)" can therefore be ignored. In this case, the applicability f_H(α, x, u) is obtained as follows:
f_H(α, x, u) = p(α|u)·p(x|α, u)   (9)
The factor p(α|u) can easily be obtained from the history information H, because each processing content is stored together with the number of times it was specified by each user. The factor p(x|α, u) is the feature distribution of the images to which the user u applied the processing content α, as recorded in the history information H.
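A minimal Python sketch of equation (9) from counted history records might look as follows. The record format `(user, processing)` and the `density_estimate` callback are assumptions; the callback stands in for p(x|α, u), which, as noted below, is hard to estimate in practice:

```python
from collections import Counter

def applicability(history, user, x, density_estimate):
    """Sketch of equation (9): f_H(a, x, u) = p(a|u) * p(x|a, u).
    `history` is a list of (user, processing) records (format assumed);
    `density_estimate(x, a)` is a stand-in for the class-conditional
    feature density p(x|a, u)."""
    counts = Counter(a for u, a in history if u == user)
    total = sum(counts.values())
    # p(a|u) is estimated from the stored specification counts
    return {a: (n / total) * density_estimate(x, a)
            for a, n in counts.items()}
```

With a history in which user "u1" chose "background color removal 3" twice and "edge enhancement 1" once, and a flat density estimate, the sketch yields priors 2/3 and 1/3.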
However, the following four points need to be considered when creating the prediction function:
1. The history information H stored in the history DB 110 depends on the preferences and tasks of the users of the image processing apparatus 1.
2. The history information available for learning in the image processing apparatus 1 should be assumed to be relatively small, from tens to a few hundred sets. This is because on-site adaptation must respond immediately by reading the user's preferences or tasks from as little data as possible.
3. The feature space is a multidimensional (about 20-dimensional) space. Therefore, a feature selection mechanism that selects only the features suitable for prediction, or that weights each feature dimension, is needed to eliminate disturbing factors. Moreover, it must be considered that the subset of features suitable for prediction differs depending on the object to be selected and on the user.
4. The features are continuous and multidimensional, and the number of data sets is small. Under these conditions, obtaining the feature distribution p(x|α, u) is very difficult. Because of the "curse of dimensionality", estimating p(x|α, u) by a nonparametric Parzen window or by the expectation-maximization (EM) algorithm assuming a Gaussian mixture model is impractical.
On the other hand, the nearest-neighbor algorithm is suitable for on-site learning. The nearest-neighbor algorithm is a prediction method that uses the past cases most similar to the case currently being processed, so prediction accuracy increases as the number of collected similar data increases. Moreover, the nearest-neighbor algorithm is a pattern recognition method that does not require estimating a probability distribution such as a Gaussian distribution.
Regarding the problems of little learning data and multidimensional features, the trade-off between the number of collected data and the dimensionality can be resolved as follows: the distance scale of the nearest-neighbor algorithm is weighted according to the contribution of each feature dimension to the prediction and the importance of each learning datum (a combination of a feature and a processing content). Thus, in the present embodiment, a nearest-neighbor algorithm based on learning of weighted distances is used.
In the nearest-neighbor algorithm based on learning of weighted distances, the distance calculated between a prototype point and the target point to be predicted is not a simple Euclidean distance but a distance weighted according to the importance of the prototype point and the importance of the target point of the prediction.
Equation (10) defines the i-th prototype point x_i, and equation (11) defines the target point of the prediction, that is, the given point "y" to be identified. Here, the prototype points x_i correspond to the features included in the history information H, and the given point y corresponds to the feature obtained from the target image data.
x_i = (x_i1, ..., x_id)   (10)
y = (y_1, ..., y_d)   (11)
Further, let the class be denoted by "c". A class represents an element of the set A, that is, the index of an algorithm or the processing parameter to be applied. Each prototype point x_i is associated with category information indicating the class specified by the user.
Based on the weight w_cj of the j-th feature dimension for the class c and the weight v_i of the i-th prototype point x_i, the squared distance Δ(y, x_i) between the i-th prototype point x_i and the given point y is calculated according to equation (12).
Δ(y, x_i) = (1/v_i^2) Σ_{j=1..d} w_cj^2 (y_j − x_ij)^2   (12)
Assuming that the feature dimensionality is denoted by "d", the number of data sets by "N", and the number of classes by "C", the number of parameters is the sum of the N prototype point weights and the Cd per-dimension weights determined for each class. That is, the number of parameters is N + Cd.
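Equation (12) can be sketched directly in Python. The function name and the argument layout (`y`, `x_i`, the prototype weight `v_i`, and the per-dimension weight vector `w_c` of the prototype's class) are illustrative choices, not part of the disclosure:

```python
import numpy as np

def weighted_sq_distance(y, x_i, v_i, w_c):
    """Equation (12): (1 / v_i^2) * sum_j w_cj^2 * (y_j - x_ij)^2."""
    y, x_i, w_c = np.asarray(y), np.asarray(x_i), np.asarray(w_c)
    return float(np.sum((w_c * (y - x_i)) ** 2) / v_i ** 2)
```

With unit weights this reduces to the squared Euclidean distance; increasing a prototype's weight v_i shrinks its distance to every target, which is how a high-importance prototype expands its region of influence.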
The weights v_i and w_cj in equation (12) are obtained automatically from the data by the learning function. As the learning criterion, the error rate assessed by the leave-one-out method is minimized. In detail, the learning function obtains the weights v_i and w_cj by the steepest-descent method under the following four criteria:
1. Suppose that the points around a given prototype point of the same class are sparsely distributed, and that the recognition result differs depending on whether or not the prototype point exists. In this case, the prototype point can be determined to have a large influence on the prediction function for the processing content, that is, a high importance. Therefore, the weight v_i is increased to expand the region influenced by the prototype point.
2. Suppose that the points around a given prototype point of the same class are dense, and that the presence of the prototype point has little influence on the recognition result. In this case, the prototype point can be determined to have low importance. Therefore, the weight v_i is decreased.
3. For a given class c, when the j-th feature dimension strongly influences the prediction, the weight w_cj is increased.
4. For the class c, when the j-th feature dimension is a factor that interferes with the prediction, the weight w_cj is made small enough to approach zero.
The nearest-neighbor algorithm based on learning of weighted distances is described in R. Paredes and E. Vidal, "Learning weighted metrics to minimize nearest-neighbor classification error", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 28, no. 7, pp. 1100-1110, July 2006.
After the prediction function has been created, the image data acquisition unit 100 acquires target image data to be predicted (Yes at step S202), and the feature calculation unit 102 calculates the features of the target image data acquired by the image data acquisition unit 100 (step S204). The features are calculated in the same manner as the feature calculation processing shown in Fig. 4.
Then, the processing content predicting unit (processing content acquiring unit) 122 predicts the optimum processing content for the target image data (step S206). In detail, the processing content predicting unit 122 receives the features calculated for the target image data by the feature calculation unit 102, and predicts the optimum processing content, that is, the algorithm or the processing parameter for the target image data, based on the prediction function. Using the weight v_i of the i-th prototype point x_i and the weight w_cj of the j-th feature dimension for the class c obtained by the prediction function creating unit 120, the squared distance Δ between the i-th prototype point x_i (whose class label is c) and the feature y obtained from the target image data is calculated according to equation (12). The processing content predicting unit 122 identifies the prototype point that minimizes the distance to the prediction target point, and then identifies the recommended algorithm or the recommended parameter from the class label of that prototype point.
In other words, the feature that best matches the feature of the target image data is extracted from the features of the image data stored in the history DB 110. The processing content predicting unit (processing content acquiring unit) 122 predicts the processing content α associated with that best-matching feature as the optimum processing content for the target image data, and acquires the processing content α from the history DB 110.
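The prediction step described above can be sketched as a weighted nearest-prototype classifier. The data layout (parallel lists of prototypes, weights, and class labels) is an assumption for the sketch:

```python
import numpy as np

def predict(y, prototypes, v, w, classes):
    """Return the class (algorithm index or processing parameter) of the
    prototype minimizing the weighted squared distance of equation (12).
    `prototypes[i]` is x_i, `classes[i]` its class label c, `v[i]` its
    prototype weight, and `w[c]` the per-dimension weight vector of c."""
    best_class, best_d = None, np.inf
    for i, x_i in enumerate(prototypes):
        c = classes[i]
        diff = np.asarray(y) - np.asarray(x_i)
        d = np.sum((np.asarray(w[c]) * diff) ** 2) / v[i] ** 2
        if d < best_d:
            best_class, best_d = c, d
    return best_class
```

The class label of the winning prototype is then looked up in the history DB to retrieve the recommended processing content and parameters.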
The processing content output unit 124 presents the optimum processing content α acquired by the processing content predicting unit (processing content acquiring unit) 122 to the user (step S208).
In this way, the optimum processing content is presented to the user. If it is the desired processing content, the user selects it so that it is applied to the image data. The user is thus spared the trouble of entering a processing content or parameters for every piece of input image data.
On the other hand, if the presented processing content is not the desired one, the user can specify another processing content or parameter through a user interface such as a keyboard or a mouse.
When the specification receiving unit 104 receives from the user a specification of a change of processing content together with a user ID (Yes at step S210), the specification receiving unit 104 updates the contents of the history DB 110 according to the specification (step S212). Here, a change specification indicates that a processing content different from the presented one is to be applied to the image data, and includes the processing content that the user designates in place of the presented one. In detail, the combination of the feature obtained from the target image data and the processing content included in the change specification is newly added to the history information H. When the specification receiving unit 104 does not receive any change specification (No at step S210), control proceeds to step S214.
Image processing is performed on the target image data according to the specified processing content (step S214). When it is time to update the learning data set (Yes at step S216), the change detection unit 130 detects whether there is any change in preference and updates the data set (step S218). Control then returns to step S202. The update timing may be set, for example, to when a predetermined time has elapsed since the learning data set was last updated.
Alternatively, the learning data set may be updated, for example, when a predetermined number of combinations of features and processing contents of image data have been newly added to the history DB 110 after the learning data set was set up or last updated.
Further, the learning data set may be updated every time a combination of a feature of image data and a processing content is newly added to the history DB 110. In this way, the update timing can be set to any timing, as long as it is predetermined.
Figure 12 is a flowchart describing in detail the preference change detection and learning data set update processing (step S218) in the processing content prediction process shown in Fig. 11. The change detection unit 130 detects whether there is any change in preference by detecting whether image data whose features are similar to those of the specified image data are combined with a processing content different from the processing content of the specified image data.
The learning data set T_0 used when the prediction function was originally created (step S200 or S224) is set in advance. In addition, the data set T_1 of the history information H added to the history DB 110 after the original creation of the prediction function is set (step S220). The prediction function created based on the learning data set T_0 is called "P_0". The learning data set T_0 is expressed as:
T_0 = {(x_i, y_i): x_i denotes a feature, y_i the selected case (processing content)}   (13)
Next, the error data F is detected (step S222). In detail, the nearest-neighbor algorithm is applied to "T_0 ∪ T_1" by the leave-one-out method to find the data that cannot be recognized correctly. Here, "T_0 ∪ T_1" is indexed: the indices from 1 to |T_0| refer to elements of the learning data set T_0, and the indices from |T_0|+1 to |T_0|+|T_1| refer to elements of the data set T_1.
The error data F is obtained by equation (14). Here, the error data F represents the data that cannot be recognized correctly by the nearest-neighbor algorithm; in other words, F consists of the data whose stored selected case (processing content) differs from the case of the nearest neighbor.
F = {(i, y_i, k): i is the index of a misrecognized datum, y_i the correct case for i, and k the index of the nearest neighbor of i}   (14)
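The leave-one-out error set of equation (14) can be sketched as follows. Plain Euclidean distance and 0-based indexing are simplifying assumptions; the deployed system would use the weighted distance of equation (12):

```python
import numpy as np

def error_set(X, labels):
    """Equation (14) sketch: for each point, find its leave-one-out
    nearest neighbour; collect (i, y_i, k) whenever the neighbour's
    label (selected processing content) disagrees with the point's own."""
    X = np.asarray(X, dtype=float)
    F = []
    for i in range(len(X)):
        d = np.sum((X - X[i]) ** 2, axis=1)
        d[i] = np.inf            # leave-one-out: a point is not its own neighbour
        k = int(np.argmin(d))
        if labels[k] != labels[i]:
            F.append((i, labels[i], k))
    return F
```

For example, with two nearby points labeled "a" and a far point labeled "b", only the "b" point is misrecognized, its nearest neighbour being an "a" point.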
Then, the set S of combinations of contradicting data is detected (step S224). S is the set of pairs of data that, although not far from each other, differ in the selected case between the learning data set T_0 and the data set T_1. In other words, S is the set of pairs of data whose features are similar but whose associated processing contents differ.
The set S is defined as follows:
S = {(i, k): (i, y_i, k) ∈ F, (k, y_k, i) ∈ F, y_i ≠ y_k, i ≤ |T_0| and k > |T_0|}   (15)
Here, "i" and "k" in a pair of S are nearest neighbors of each other and belong to the learning data set T_0 and the data set T_1, respectively. The selected cases of "i" and "k" differ, so that when T_0 and T_1 are combined, they negatively influence each other. Such a pair is therefore determined to be a combination of mutually contradicting data.
Figure 13 is a schematic diagram for explaining the set S. In Fig. 13, the white points represent the learning data set T_0 and the black points represent the data set T_1. Let d be the distance between a datum A in T_0 and a datum B in T_1. Suppose that no other data lie in the region covered by the circle of radius d centered at A and the circle of radius d centered at B, and that the selected cases of A and B differ. In this case, the pair of A and B is added to the set S.
Returning to the flowchart shown in Fig. 12, after the set S has been detected (step S224), the old data included in S are deleted from the learning data set T_0 (step S226). In other words, the processing expressed by equation (16) is performed.
T_0 ← T_0 − {(x_i, y_i): (i, k) ∈ S}   (16)
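Equations (15) and (16) can be sketched together. The sketch uses 0-based indices (so T_0 occupies indices below `n0`), and `F` is assumed to be the list of `(i, y_i, k)` triples of equation (14):

```python
def conflict_set(F, labels, n0):
    """Equation (15) sketch, 0-based: pairs (i, k) of mutual nearest
    neighbours whose labels differ, with i in T_0 (index < n0) and
    k in T_1 (index >= n0)."""
    errs = {(i, k) for i, _, k in F}
    return {(i, k) for i, _, k in F
            if (k, i) in errs and labels[i] != labels[k] and i < n0 <= k}

def prune_t0(T0, S):
    """Equation (16): delete the contradicted old data from T_0."""
    bad = {i for i, _ in S}
    return [p for j, p in enumerate(T0) if j not in bad]
```

The pruned T_0 merged with T_1 then becomes the new learning data set from which the prediction function is re-created.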
Then, the learning data set T_0 from which the old data have been deleted, merged with the data set T_1, becomes the new learning data set. In other words, the learning data set is updated (step S228). The prediction function creating unit 120 creates a prediction function based on the newly set learning data set; in other words, it updates the prediction function (step S230). The prediction function is created in the same manner as the prediction function creation processing at step S200 of the flowchart shown in Fig. 11.
When the number of pairs in S is equal to or greater than a predetermined designated value (Yes at step S232), it is judged that the preference has changed, and the judgment is output to notify the user (step S234). On the other hand, when the number of pairs in S is less than the designated value (No at step S232), it is judged that the preference has not changed, and the judgment is output to notify the user (step S236). The preference change detection processing (step S218) then ends.
The processing described above may be changed or modified.
Next, an image processing apparatus 20 according to a second embodiment of the present invention is described. The image processing apparatus 20 performs the processing content prediction process on a plurality of sets of target image data in a batch. Figure 14 is a functional block diagram of the image processing apparatus 20. The image processing apparatus 20 differs from the image processing apparatus 1 in further including a result storage unit 140 and a result display unit 142. Parts identical to those in Fig. 2 are denoted by the same reference numerals, and their description is omitted. The result storage unit 140 stores therein the results of the image processing performed by the image processing unit 126. The result display unit 142 displays the results stored in the result storage unit 140.
Figure 15 is a flowchart of the processing content prediction process performed by the image processing apparatus 20. Steps identical to those in Fig. 11 of the first embodiment are denoted by the same reference numerals, and their description is omitted. After the optimum processing content is predicted (step S206), image processing is actually performed on each set of target image data according to the predicted processing content (step S240), and the result of the image processing is stored, for example, in the main storage unit 5 (step S242). When any set of target image data remains unprocessed (Yes at step S244), control proceeds to step S204, and steps S204 to S242 are performed for the unprocessed target image data. When image processing has been performed for all the sets of target image data (No at step S244), the results for all the sets of target image data are displayed (step S246). Then, steps S210 to S218 are performed.
In this way, in the image processing apparatus 20 according to the second embodiment, the results of the image processing are displayed so that the user can decide, based on the results, whether to change the processing content. In addition, the processing content prediction process is performed on a plurality of sets of target image data in a batch, so the processing can be performed efficiently.
Figure 16 is a schematic diagram of a multifunction product (MFP) 50 according to a third embodiment of the present invention. The MFP 50 includes a scanner unit 51 as an image reading unit and a printer unit 52 as an image printing unit. The MFP 50 also includes the image processing apparatus 1. In more detail, the image data acquisition unit 100 in the image processing apparatus 1 acquires the image data of an image read by the scanner unit 51 as the target image data, and the image processing apparatus 1 performs the processing for predicting the processing content to be applied to the acquired image data.
Except for the points described above, the configuration of the MFP 50 and the processing performed by the MFP 50 are identical to those of the image processing apparatus 1, and their description is omitted.
Figure 17 is a system layout diagram of an image processing system 60 according to a fourth embodiment of the present invention. The image processing system 60 is a client-server system in which a server computer S is connected to a plurality of client computers C through a network N. The server computer S performs the same processing as, and includes the same functions as, the image processing apparatus 1. Each client computer C sends images to the server computer S. A network scanner NS is also connected to the network N, so that the image data acquisition unit 100 in the server computer S can acquire image data from each client computer C or from the network scanner NS.
The history DB 110 may be stored in another computer, that is, in a server (not shown) other than the server computer S.
Except for the points described above, the configuration of the image processing system 60 and the processing performed by the image processing system 60 are identical to those of the image processing apparatus 1, and their description is omitted.
Although the invention has been described with respect to specific embodiments for a complete and clear disclosure, the appended claims are not to be thus limited but are to be construed as embodying all modifications and alternative constructions that may occur to one skilled in the art that fairly fall within the basic teaching herein set forth.

Claims (14)

1. An image processing apparatus comprising:
a change detection unit that detects whether there is a change in processing content between a first period and a second period, a first combination of a feature of first image data and a processing content being stored in a history information storage unit in the first period, and a second combination of a processing content and a feature of second image data similar to the first image data being stored in the history information storage unit in the second period, which is later in time than the first period;
a history information updating unit that updates the first combination to the second combination when the change detection unit detects a change in processing content between the first period and the second period;
an image data acquiring unit that acquires target image data to be processed;
a processing content acquiring unit that acquires a processing content for the target image data based on the combinations of features and processing contents stored in the history information storage unit; and
a processing content output unit that outputs the processing content acquired by the processing content acquiring unit.
2. The image processing apparatus according to claim 1, wherein
the processing content acquiring unit includes
a prediction function creating unit that creates a prediction function for predicting the processing content for the target image data, based on the combinations of features and processing contents stored in the history information storage unit, and
a feature calculating unit that calculates a feature of the target image data, and
the processing content acquiring unit acquires the processing content for the target image data based on the prediction function created by the prediction function creating unit and the feature calculated by the feature calculating unit.
3. The image processing apparatus according to claim 1, further comprising:
a specification receiving unit that receives, from a user, a specification of a processing content for the target image data, wherein
the change detection unit detects whether there is a change in processing content with respect to the processing content of the target image data specified by the user.
4. The image processing apparatus according to claim 2, wherein the prediction function creating unit creates the prediction function such that the prediction error with respect to the combinations of features and processing contents stored in the history information storage unit is minimized.
5. The image processing apparatus according to claim 4, wherein
the feature includes a plurality of features, and
the prediction function creating unit identifies, from among the plurality of features, a first feature that contributes more to the prediction of the processing content, and creates the prediction function such that the weight of the first feature is greater than the weights of the other features.
6. The image processing apparatus according to claim 4, wherein
the combination includes a plurality of combinations, and
the prediction function creating unit identifies, from among the plurality of combinations, a first combination that contributes more to the prediction of the processing content, and creates the prediction function such that the weight of the first combination is greater than the weights of the other combinations.
7. The image processing apparatus according to claim 5, wherein the prediction function creating unit creates the prediction function by using a nearest-neighbor algorithm combined with learning of weighted distances.
8. The image processing apparatus according to claim 4, wherein the prediction function creating unit creates the prediction function by using a steepest-descent method.
9. The image processing apparatus according to claim 1, wherein the change detection unit detects whether there is a change in processing content when the second period has elapsed.
10. The image processing apparatus according to claim 1, wherein the second period is a period in which a predetermined number of combinations have been stored in the history information storage unit.
11. The image processing apparatus according to claim 1, wherein the second period is a predetermined time period.
12. The image processing apparatus according to claim 1, further comprising:
an image processing unit that performs image processing on the target image data according to the processing content acquired by the processing content acquiring unit; and
an output unit that outputs a result of the image processing performed by the image processing unit.
13. The image processing apparatus according to claim 1, wherein a combination of a feature of image data and a processing content is stored in the history information storage unit for each user.
14. An image processing method comprising:
detecting whether there is a change in processing content between a first period and a second period, a first combination of a feature of first image data and a processing content being stored in the first period, and a second combination of a processing content and a feature of second image data similar to the first image data being stored in the second period, which is later in time than the first period;
updating the first combination to the second combination when a change in processing content between the first period and the second period is detected;
acquiring target image data to be processed;
acquiring a processing content for the target image data based on combinations of features and processing contents; and
outputting the acquired processing content.
CN2008101280432A 2007-07-12 2008-07-10 Apparatus and method for processing image Expired - Fee Related CN101344964B (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP2007183000 2007-07-12
JP2007-183000 2007-07-12
JP2007183000 2007-07-12
JP2008109312A JP5022979B2 (en) 2007-07-12 2008-04-18 Image processing apparatus, image processing method, and program
JP2008109312 2008-04-18
JP2008-109312 2008-04-18

Publications (2)

Publication Number Publication Date
CN101344964A CN101344964A (en) 2009-01-14
CN101344964B true CN101344964B (en) 2011-11-16

Family

ID=40246963

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2008101280432A Expired - Fee Related CN101344964B (en) 2007-07-12 2008-07-10 Apparatus and method for processing image

Country Status (2)

Country Link
JP (1) JP5022979B2 (en)
CN (1) CN101344964B (en)

Cited By (1)

Publication number Priority date Publication date Assignee Title
CN106462397A (en) * 2014-06-11 2017-02-22 富士通株式会社 Program generation device, program generation method and program

Families Citing this family (3)

Publication number Priority date Publication date Assignee Title
JP2013050837A (en) * 2011-08-31 2013-03-14 Dainippon Screen Mfg Co Ltd Classification processing generation device and classification processing generation method
US10163028B2 (en) 2016-01-25 2018-12-25 Koninklijke Philips N.V. Image data pre-processing
CN114330400B (en) * 2020-10-12 2023-12-08 珠海格力电器股份有限公司 Two-dimensional code image processing method, system, device, electronic equipment and storage medium

Citations (4)

Publication number Priority date Publication date Assignee Title
CN1496109A (en) * 2002-09-20 2004-05-12 ������������ʽ���� Output object image data selecting device and method
US6834130B1 (en) * 1998-02-18 2004-12-21 Minolta Co., Ltd. Image retrieval system for retrieving a plurality of images which are recorded in a recording medium, and a method thereof
CN1737852A (en) * 2004-08-10 2006-02-22 株式会社理光 Image processing device, image processing method, image processing program and recording medium
CN1744647A (en) * 2004-09-01 2006-03-08 株式会社理光 Apparatus, method, system, and computer program for managing image processing

Family Cites Families (10)

Publication number Priority date Publication date Assignee Title
JP2815157B2 (en) * 1988-02-25 1998-10-27 株式会社リコー Image processing device
JP2676009B2 (en) * 1991-12-26 1997-11-12 富士写真フイルム株式会社 Radiation image reading condition and / or image processing condition determination method and device, and radiation image analysis method and device
JPH0737087A (en) * 1993-07-19 1995-02-07 Matsushita Electric Ind Co Ltd Picture processor
JPH10283458A (en) * 1997-04-04 1998-10-23 Minolta Co Ltd Image processor
JPH11185034A (en) * 1997-12-24 1999-07-09 Casio Comput Co Ltd Image data correction device and recording medium recording image data correction processing program
JP3991196B2 (en) * 2001-12-18 2007-10-17 富士ゼロックス株式会社 Image processing system and image processing server
JP3978098B2 (en) * 2002-08-12 2007-09-19 株式会社日立製作所 Defect classification method and apparatus
JP2004302502A (en) * 2003-03-28 2004-10-28 Fuji Photo Film Co Ltd Image processing device
JP4980552B2 (en) * 2003-09-30 2012-07-18 コニカミノルタエムジー株式会社 Image processing method, image processing apparatus, and image processing program
JP4339730B2 (en) * 2004-03-26 2009-10-07 Fujifilm Corp. Image processing method and apparatus

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6834130B1 (en) * 1998-02-18 2004-12-21 Minolta Co., Ltd. Image retrieval system for retrieving a plurality of images which are recorded in a recording medium, and a method thereof
CN1496109A (en) * 2002-09-20 2004-05-12 Seiko Epson Corp. Output object image data selecting device and method
CN1737852A (en) * 2004-08-10 2006-02-22 Ricoh Co., Ltd. Image processing device, image processing method, image processing program and recording medium
CN1744647A (en) * 2004-09-01 2006-03-08 Ricoh Co., Ltd. Apparatus, method, system, and computer program for managing image processing

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106462397A (en) * 2014-06-11 2017-02-22 富士通株式会社 Program generation device, program generation method and program
CN106462397B (en) * 2014-06-11 2019-10-29 Fujitsu Ltd. Program generation device, program generation method and program

Also Published As

Publication number Publication date
CN101344964A (en) 2009-01-14
JP2009037592A (en) 2009-02-19
JP5022979B2 (en) 2012-09-12

Similar Documents

Publication Publication Date Title
Huang et al. A new image thresholding method based on Gaussian mixture model
Tao et al. Image segmentation by three-level thresholding based on maximum fuzzy entropy and genetic algorithm
US20050248808A1 (en) Printing control interface system and method with handwriting discrimination capability
CN111950643B (en) Image classification model training method, image classification method and corresponding device
US8254669B2 (en) Data processing apparatus, computer program product, and data processing method for predicting an optimum function based on a case database and image feature values calculated by a feature-value calculating unit
CN101344964B (en) Apparatus and method for processing image
JP2008537198A (en) Intelligent import of information from a foreign application user interface using artificial intelligence
CN104115161A (en) Method and system for comparing images
CN111797886A (en) Generating OCR training data for neural networks by parsing PDL files
CN112669298A (en) Foundation cloud image cloud detection method based on model self-training
Liu et al. A novel fuzzy classification entropy approach to image thresholding
CN115017418A (en) Remote sensing image recommendation system and method based on reinforcement learning
CN109544561A (en) Cell mask method, system and device
CN108268641A (en) Invoice information recognition methods and invoice information identification device, equipment and storage medium
JP4077919B2 (en) Image processing method and apparatus and storage medium therefor
CN111583274A (en) Image segmentation method and device, computer-readable storage medium and electronic equipment
CN116310530A (en) Federal unsupervised image classification model training method, classification method and equipment based on semantic clustering
Merk et al. Estimation of the spatial weighting matrix for regular lattice data—An adaptive lasso approach with cross‐sectional resampling
CN113792659B (en) Document identification method and device and electronic equipment
US20210383572A1 (en) Search system
CN111476226B (en) Text positioning method and device and model training method
CN101777126A (en) Clustering method for multidimensional characteristic vectors
CN116958113A (en) Product detection method, device, equipment and storage medium
CN114881892B (en) Remote sensing image characteristic discretization method and device based on II-type fuzzy rough model
CN113486736B (en) Black box anti-attack method based on active subspace and low-rank evolution strategy

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20111116

Termination date: 20190710
