AU2007237343A1 - Motion Quality Analysis - Google Patents
- Publication number
- AU2007237343A1
- Authority
- AU
- Australia
- Prior art keywords
- image
- determining
- image forming
- forming device
- motion quality
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
- G06T7/0006—Industrial image inspection using a design-rule based approach
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03G—ELECTROGRAPHY; ELECTROPHOTOGRAPHY; MAGNETOGRAPHY
- G03G15/00—Apparatus for electrographic processes using a charge pattern
- G03G15/50—Machine control of apparatus for electrographic processes using a charge pattern, e.g. regulating differents parts of the machine, multimode copiers, microprocessor control
- G03G15/5062—Machine control of apparatus for electrographic processes using a charge pattern, e.g. regulating differents parts of the machine, multimode copiers, microprocessor control by measuring the characteristics of an image on the copy material
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03G—ELECTROGRAPHY; ELECTROPHOTOGRAPHY; MAGNETOGRAPHY
- G03G15/00—Apparatus for electrographic processes using a charge pattern
- G03G15/70—Detecting malfunctions relating to paper handling, e.g. jams
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00002—Diagnosis, testing or measuring; Detecting, analysing or monitoring not otherwise provided for
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00002—Diagnosis, testing or measuring; Detecting, analysing or monitoring not otherwise provided for
- H04N1/00007—Diagnosis, testing or measuring; Detecting, analysing or monitoring not otherwise provided for relating to particular apparatus or devices
- H04N1/00015—Reproducing apparatus
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00002—Diagnosis, testing or measuring; Detecting, analysing or monitoring not otherwise provided for
- H04N1/00026—Methods therefor
- H04N1/00031—Testing, i.e. determining the result of a trial
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00002—Diagnosis, testing or measuring; Detecting, analysing or monitoring not otherwise provided for
- H04N1/00026—Methods therefor
- H04N1/00045—Methods therefor using a reference pattern designed for the purpose, e.g. a test chart
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00002—Diagnosis, testing or measuring; Detecting, analysing or monitoring not otherwise provided for
- H04N1/00026—Methods therefor
- H04N1/00063—Methods therefor using at least a part of the apparatus itself, e.g. self-testing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00002—Diagnosis, testing or measuring; Detecting, analysing or monitoring not otherwise provided for
- H04N1/00071—Diagnosis, testing or measuring; Detecting, analysing or monitoring not otherwise provided for characterised by the action taken
- H04N1/00082—Adjusting or controlling
- H04N1/00084—Recovery or repair, e.g. self-repair
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30144—Printing quality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/0077—Types of the still picture apparatus
- H04N2201/0082—Image hardcopy reproducer
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- General Health & Medical Sciences (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Quality & Reliability (AREA)
- Theoretical Computer Science (AREA)
- Microelectronics & Electronic Packaging (AREA)
- Facsimiles In General (AREA)
Description
S&F Ref: 835150

AUSTRALIA
PATENTS ACT 1990
COMPLETE SPECIFICATION FOR A STANDARD PATENT

Name and Address of Applicant: Canon Kabushiki Kaisha, of 30-2, Shimomaruko 3-chome, Ohta-ku, Tokyo, 146, Japan
Actual Inventor(s): Matthew Christian Duggan, Peter Allein Fletcher, Kieran Gerard Larkin
Address for Service: Spruson & Ferguson, St Martins Tower, Level 35, 31 Market Street, Sydney NSW 2000 (CCN 3710000177)
Invention Title: Electrophotographic image quality

The following statement is a full description of this invention, including the best method of performing it known to me/us:

MOTION QUALITY ANALYSIS

FIELD OF THE INVENTION

The present invention relates to a method for determining the state of an image forming device and, specifically, to an image processing method for determining the wear state of an image forming device.

BACKGROUND

Image forming devices such as electrophotographic printers typically contain many moving parts that will experience wear over time. This wear can cause a change in properties, which may result in reduced image quality and artefacts in the output of the device.

Some of the artefacts that can occur when parts become worn are due to "motion quality" problems. The motion quality of a part is a measure of how close the motion of a given part is to an ideal motion. For example, most gears in the paper feed mechanism of an electrophotographic device should ideally turn at a constant velocity to feed paper through the machine in a smooth manner. In practice, however, worn parts, power fluctuations, friction, and vibrations in a device will cause the gears to move at a fluctuating velocity. This sort of motion quality problem will generally become more pronounced as a part wears, and can be used as an indicator that maintenance is required.

Methods for measuring the motion quality of parts of a machine have traditionally involved adding hardware to measure motion quality in some way. Some well-known ways of measuring the motion of a part are to use a rotary encoder on the part, or to measure the voltage of an electric motor attached directly to the part. However, the addition of hardware increases the cost of creating devices, so it is desirable to find methods for measuring motion quality without the need for additional hardware.

SUMMARY

Disclosed herein is a method for determining a motion quality parameter of an image forming device. The method uses one or more parts of a digital capture of a test image produced by the image forming device to determine displacements from ideal locations associated with the test image. The method then uses the displacements to determine the motion quality parameter of the image forming device.

According to a first aspect of the present disclosure, there is provided a computer implemented method of determining a motion quality parameter of an image forming device, comprising the steps of: forming an image on a substrate using the image forming device; capturing a digital image of the image; determining at least one displacement of a part of the image from a respective expected location; determining frequency components of the determined displacements; and using the frequency components to determine a motion quality parameter of the image forming device which produced the image.
According to a second aspect of the present disclosure, there is provided a method for determining a motion quality parameter of an image forming device, comprising the steps of: (a) capturing a digital image of an image formed by the image forming device; (b) determining a displacement for each of at least one part of the digital image from a respective ideal location; (c) determining frequency components of the determined displacements; and (d) determining a motion quality parameter associated with the image forming device, based on the frequency components.

BRIEF DESCRIPTION OF THE DRAWINGS

One or more embodiments of the present invention will now be described with reference to the drawings, in which:

Fig. 1 is a schematic block diagram of a general purpose computer upon which the arrangements described can be practised;
Fig. 2 shows the overall process for measuring motion quality and detecting worn parts;
Fig. 3 shows an example test chart which could be used for the process of Fig. 2;
Fig. 4 shows the details of step 220 of Fig. 2;
Fig. 5 shows the details of step 230 of Fig. 2;
Fig. 6 shows the details of step 540 of Fig. 5; and
Fig. 7 is an example table of values that could be used in step 240 of Fig. 2.

DETAILED DESCRIPTION

Where reference is made in any one or more of the accompanying drawings to steps and/or features which have the same reference numerals, those steps and/or features have, for the purposes of this description, the same function(s) or operation(s), unless the contrary intention appears.

An embodiment of the present disclosure seeks to remove the need for special hardware to measure motion quality by directly measuring the motion quality from the output images produced by the device.

The steps of a method 200 of measuring motion quality are shown in Fig. 2. The method 200 starts at step 210 by capturing (obtaining) a digital image of an output image of the device. The output image may be formed by said image forming device on a substrate, such as paper. In one embodiment, the output image is of a predefined test chart having known properties. The known properties can include, for example, the ideal locations of one or more parts of the test chart. This digital image may be obtained "on-line" as the page is fed through the device, or "off-line" by taking a digital image of a page after it has been finished. The digital image may be obtained using a digital scanner or camera. For this method, a greyscale (single colour) image is sufficient for determining motion quality; however, colour data may be used to determine the motion quality of different process colours from the image forming device simultaneously. For the purpose of this description, it is assumed that the image has been obtained in greyscale with a resolution of 600 dots per inch (dpi).

Once the digital image has been obtained, control moves to step 220, where the displacements of one or more different parts of the image are determined. The displacements are a measure of how far different parts of the image are from their respective expected (ideal) locations. These displacements may be measured in a horizontal direction, a vertical direction, or both. There are several known ways to measure such displacements. Those techniques which measure displacements using known features of the ideal image are also known as "image registration" techniques. For the present disclosure, a displacement measurement with high accuracy and robustness is desirable.
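By way of illustration only, the following NumPy sketch shows one generic registration-style displacement measurement: the offset of a small known patch (for example, a fiducial mark) from its expected location is found by normalised cross-correlation. This is not the phase-demodulation method described next; the function name, patch handling and search-window size are assumptions made for the example.

```python
import numpy as np

def patch_displacement(captured, reference_patch, expected_xy, search=20):
    """Estimate the (dx, dy) offset of a known patch from its expected location.

    captured        : 2-D greyscale capture of the printed page (float values)
    reference_patch : 2-D ideal patch, e.g. a fiducial mark from the test chart
    expected_xy     : (x, y) of the patch's top-left corner in the ideal image
    search          : half-width of the search window in pixels (assumed value)
    """
    ph, pw = reference_patch.shape
    x0, y0 = expected_xy
    ref = reference_patch - reference_patch.mean()
    best_score, best_dxy = -np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            if y0 + dy < 0 or x0 + dx < 0:
                continue
            win = captured[y0 + dy:y0 + dy + ph, x0 + dx:x0 + dx + pw]
            if win.shape != ref.shape:
                continue
            w = win - win.mean()
            denom = np.sqrt((w * w).sum() * (ref * ref).sum())
            if denom == 0:
                continue
            score = (w * ref).sum() / denom   # normalised cross-correlation
            if score > best_score:
                best_score, best_dxy = score, (dx, dy)
    return best_dxy                           # displacement in whole pixels
```

A correlation search of this kind only resolves whole-pixel offsets unless it is refined by interpolation, which is one reason a sub-pixel technique such as the phase demodulation described below is attractive here.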
One method for measuring the fine displacements of an image in one direction will now be described. The method works on the principle of "phase demodulation". An example chart 310 to which this method can be applied is shown in Fig. 3. The chart 310 has bands 320 of sinusoidally varying intensity. The detail of one of these bands is shown in the magnified section 330. The magnified band 340 in the magnified section 330 has a toner level which varies smoothly with the y (vertical) direction according to the function:

I[y] = A (1 + sin(2πy / k))

where A is an intensity amplitude, k is the period of the sinusoid, and y is the vertical location. For a toner coverage function I which can vary between 0.0 (no toner) and 1.0 (full toner), A is chosen to be 0.5. If y is measured in pixels (px), the units of k are px. A large k will make a strip which appears to have lines far apart, whereas a small k will give lines which appear to be very close together. At a y resolution of 600 dpi, a good choice for k is either 8 or 16 px. In practice, many image forming devices cannot exactly control their toner coverage to any level between 0.0 and 1.0; however, an output which approximates this function is sufficient.

A method 400 for detecting the fine displacements of one of the strips in the digital image of the chart 310 (the image obtained in step 210, above) will now be described with reference to Fig. 4. The first step 410 is to determine the location of one or more strips in the image. Step 420 extracts the regions of the digital image containing the sinusoidal strips. The location of these regions in the digital image can be determined by assuming certain locations in the captured image (i.e., assuming that the page is sufficiently registered to the image capture coordinates), by finding additional fiducial marks (such as marks 350 shown in Fig. 3) at known locations on the page, or by directly finding the strips themselves by searching for the lines of dark toner.

Once the strip region has been determined and extracted, a vector representing the average darkness values across the strip region (i.e., across the horizontal direction in Fig. 3) is determined in step 430. The result of this process is a vector of average darkness values, g[y]. The next step 440 is to determine the quadrature components of the darkness values according to the equations:

q1[y] = sin(2πy / k) g[y]
q2[y] = cos(2πy / k) g[y]

The final step 450 of the method 400 is to determine the fine displacements in the y direction, d[y]. The displacements are determined using the equation:

d[y] = −(k / 2π) arctan( (Σ_{i=0}^{W−1} q1[y + i − W/2]) / (Σ_{i=0}^{W−1} q2[y + i − W/2]) )

where W is the "window" size. A larger window size has the effect of reducing noise in the final result, but may lose some high frequency information. In practice, a good window size is a small integer multiple of the k value chosen above, such as 5k. The displacements measured by this method will be in pixels, but can be converted to millimetres or inches using the known resolution of 600 dpi. The method 400 of Fig. 4 terminates after step 450.

Returning to Fig. 2, in the next step 225 the displacements at every point (or pixel) in the image are determined. Some displacement measurement and registration techniques, such as the method 400 described above, do not provide displacement information at every point in the image; rather, such techniques provide information only for those points where there are strips to measure.
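Steps 430 to 450 can be sketched in NumPy as below. This is a hedged illustration only: the use of a boxcar window, atan2 with phase unwrapping instead of a plain arctangent of a ratio, and the removal of the mean displacement are implementation assumptions, and the sign and offset conventions may differ from those intended above.

```python
import numpy as np

def strip_displacements(strip, k=16, W=None):
    """Fine y-displacements of one sinusoidal strip (method 400, steps 430-450).

    strip : 2-D array of darkness values for the extracted strip region,
            rows indexed by y, columns running across the strip.
    k     : period of the printed sinusoid in pixels (8 or 16 px at 600 dpi).
    W     : window size; a small multiple of k such as 5*k (assumed default).
    Returns the displacement d[y] of each row, in pixels.
    """
    if W is None:
        W = 5 * k
    y = np.arange(strip.shape[0])
    g = strip.mean(axis=1)                    # step 430: average darkness g[y]
    q1 = np.sin(2 * np.pi * y / k) * g        # step 440: quadrature components
    q2 = np.cos(2 * np.pi * y / k) * g
    box = np.ones(W) / W                      # windowed sums act as a low-pass filter
    s1 = np.convolve(q1, box, mode='same')
    s2 = np.convolve(q2, box, mode='same')
    phase = np.unwrap(np.arctan2(s2, s1))     # step 450: recover the sinusoid's phase
    d = (k / (2 * np.pi)) * phase             # convert phase to a shift in pixels
    return d - d.mean()                       # keep only the variation about the mean
```

At 600 dpi the returned pixel displacements can be converted to millimetres by multiplying by 25.4 / 600.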
When using such a technique, displacements for every point in the image may be generated using an interpolation scheme, many of which are known in the art. For step 225, a smooth interpolation scheme such as a spline interpolation is appropriate. The result of interpolating the known points to every point in the image is two matrices of displacements, D_x[i, j] and D_y[i, j], representing the horizontal and vertical displacement at each point respectively. For simplicity, these values in one embodiment are measured in millimetres, but other scales such as inches are equally possible. As stated above, some methods only measure displacements in one direction. If only one of the displacement matrices has been measured, it may be assumed for the rest of this description that the unmeasured matrix is not processed in the following steps.

Once the displacements of the digital image have been obtained, they are processed to find (detect) primary motion components in step 230. The details of step 230 are shown in the method 500 of Fig. 5. At step 510, the displacement values measured in step 220 are averaged over the perpendicular direction; that is, the x-direction displacements are averaged over y, and the y-direction (vertical) displacements are averaged over x. This averaging process produces two vectors from the two matrices:

d_x[i] = (1 / Y) Σ_{j=0}^{Y−1} D_x[i, j]

and

d_y[j] = (1 / X) Σ_{i=0}^{X−1} D_y[i, j]

where X and Y are the width and height respectively of the matrices. In step 520, the vectors of average displacements from step 510 (the positional errors) are converted to motion error vectors by taking the discrete derivative of each:

d'_x[i] = d_x[i] − d_x[i − 1] and d'_y[j] = d_y[j] − d_y[j − 1]

The next step 530 is to determine the frequency components of the displacements by utilising the motion error vectors. This can be implemented by performing a discrete Fourier transform on the motion error vectors, producing complex vectors representing the frequency components of the motion error vectors:

Φ_x[f] = F(d'_x) and Φ_y[f] = F(d'_y)

where F denotes the discrete Fourier transform. These motion frequency components are functions of pixel frequency f, but a given pixel frequency can be converted to a spatial frequency using the resolution (or dots per millimetre) of the original captured digital image.

In the next step 540, the peaks in the motion frequency components Φ_x[f] and Φ_y[f] are detected. For each vector, P peaks are detected and placed in lists of peaks, M_x and M_y respectively. In one embodiment, P is chosen as ten (10). The peaks are detected in each vector according to the procedure 600 shown in Fig. 6.

The procedure 600 for detecting peaks will now be described with reference to Fig. 6. In step 610, the complex values in the vector are sorted into a list, arranged such that the frequencies with the highest magnitude are at the top of the list. At step 620, the frequency with the highest magnitude is removed from the list and placed in the corresponding list of peaks, M_x or M_y, along with its magnitude. At step 630, the length of the list of peaks is compared to the desired number, P. If there are already P peaks in the list of peaks (Yes), the method concludes. Otherwise, if there are fewer than P peaks in the list of peaks (No), the method 600 moves to step 640, where any values less than E vector elements away from the detected peak (in the un-sorted vector) are removed (excluded) from the sorted list. In one arrangement, the value of E is chosen as twenty (20). The method 600 then returns to step 610.
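A compact reading of steps 510 to 540 and of the peak-finding procedure 600 is sketched below. The treatment of the DC term, the use of a real FFT, and the return of (pixel frequency, magnitude) pairs are assumptions made for the example rather than details fixed by the description.

```python
import numpy as np

def motion_peaks(D, perpendicular_axis, P=10, E=20):
    """Detect P dominant motion-error frequencies from a displacement matrix.

    D                  : 2-D displacement matrix (e.g. D_y[i, j]) in millimetres.
    perpendicular_axis : axis to average over in step 510 (0 or 1).
    P                  : number of peaks to keep (ten in the described embodiment).
    E                  : exclusion radius, in DFT bins, around each peak (step 640).
    Returns a list of (pixel_frequency, magnitude) pairs, strongest first.
    """
    d = D.mean(axis=perpendicular_axis)       # step 510: average positional error
    d_prime = np.diff(d)                      # step 520: discrete derivative
    spectrum = np.fft.rfft(d_prime)           # step 530: frequency components
    mag = np.abs(spectrum)
    mag[0] = 0.0                              # ignore the DC term (assumption)
    freqs = np.fft.rfftfreq(len(d_prime))     # pixel frequencies, cycles per pixel
    peaks = []
    excluded = np.zeros(len(mag), dtype=bool)
    while len(peaks) < P and not excluded.all():
        masked = np.where(excluded, -np.inf, mag)
        f = int(np.argmax(masked))            # steps 610-620: strongest remaining bin
        peaks.append((freqs[f], mag[f]))
        lo, hi = max(0, f - E), min(len(mag), f + E + 1)
        excluded[lo:hi] = True                # step 640: exclude bins near the peak
    return peaks
```

A returned pixel frequency f corresponds to a spatial period of 1/f pixels, which at 600 dpi is 25.4 / (600 f) millimetres.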
One possible improvement to the peak detection method 600 is to use a "frequency estimation" or "spectral estimation" method, such as those known in the art. A typical frequency estimation method works firstly by padding the input displacements d' with zero values before performing the Fourier transform in step 530. This padding has the effect of interpolating the Fourier transform. Then, instead of removing a single location for the peak in step 630, a sinc function is detected and subtracted from the values around the peak location. This has the effect of removing the spread-spectrum sidebands around the peak which distort other peaks. After the first frequency has been detected and removed, other peaks can be detected with greater accuracy, and also removed. For greater accuracy still, the process can be repeated by iteration to reduce the interaction between all peaks. Once the method 600 is complete, step 540 is complete, which also concludes the method 500.

Once P peaks have been detected in each vector, the period T of each peak is calculated as:

T = 1 / f

The method then uses the frequency components to determine a motion quality parameter of the image forming device that produced the original image.

Returning again to Fig. 2, at step 240 a check is made for worn parts using the detected motion components and a table of known parts. In one embodiment, the periods of the peaks in each direction are compared to a list of known "wearing" parts in a device. Such a list could include, for example, entries of part names, the periods of their expected motion errors, and threshold values for when the part may need replacing. An example list with such values is shown as 700 in Fig. 7. A first part name, Drum Drive Gear, has a motion period of 2.5 mm and an associated threshold of 0.05. A second part name, Developer Drive Gear, has a motion period of 5.0 mm and an associated threshold of 0.10. A third part name, Paper Feed Roller, has a motion period of 10 mm and an associated threshold of 0.50. One such list would be required for each direction, as the motion error of a given worn part will occur in either the x or y direction. If any of the peaks in the list M_x or M_y match a part in the corresponding list of parts which show motion errors in that direction, and its magnitude exceeds the threshold at which action should be taken, appropriate action may be taken, such as notifying the user of the device that a part may need replacing, notifying the service centre that a part may need replacing, or recording in a service log that a part may need replacing. The part names in the table 700 of Fig. 7 could be used to produce such a log. Such a log could be inspected by a technician at the next service call to determine what parts may need checking. The method 200 of Fig. 2 then terminates.
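The comparison of step 240 against a wear table such as that of Fig. 7 might look like the sketch below. The table entries are the example values given above; the matching tolerance, the resolution conversion, and the composition with motion_peaks() from the earlier sketch are assumptions for illustration, not requirements of the method.

```python
# Example wear table for one direction, after table 700 of Fig. 7:
# (part name, expected motion-error period in mm, magnitude threshold)
WEAR_TABLE = [
    ("Drum Drive Gear",      2.5,  0.05),
    ("Developer Drive Gear", 5.0,  0.10),
    ("Paper Feed Roller",    10.0, 0.50),
]

def check_worn_parts(peaks, dpi=600, rel_tol=0.10, table=WEAR_TABLE):
    """Step 240: compare detected motion peaks against known wearing parts.

    peaks   : list of (pixel_frequency, magnitude) pairs, e.g. from motion_peaks().
    dpi     : resolution of the captured image, used to convert periods to mm.
    rel_tol : relative tolerance when matching periods (an assumed value).
    Returns service-log style messages for parts that may need replacing.
    """
    log = []
    mm_per_px = 25.4 / dpi
    for pixel_freq, magnitude in peaks:
        if pixel_freq == 0:
            continue
        period_mm = (1.0 / pixel_freq) * mm_per_px        # T = 1/f, converted to mm
        for name, part_period, threshold in table:
            matches = abs(period_mm - part_period) <= rel_tol * part_period
            if matches and magnitude > threshold:
                log.append(f"{name}: peak period {period_mm:.2f} mm, magnitude "
                           f"{magnitude:.3f} exceeds threshold {threshold}")
    return log
```

A log produced in this way could then be surfaced to the user, sent to a service centre, or recorded for inspection at the next service call, as described above.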
In another embodiment, a method for determining a motion quality parameter of an image forming device is disclosed. The method includes the steps of: capturing a digital image of an image formed by the image forming device; determining a displacement for each of at least one part of the digital image from a respective ideal location; determining frequency components of the determined displacements; and determining a motion quality parameter associated with the image forming device, based on the frequency components.

In one embodiment, the image formed by the image forming device is of a predefined test chart, wherein the respective ideal locations correspond to known properties of the test chart. In another embodiment, the digital image is obtained by one of a digital scanner and a digital camera. In a further implementation, the digital image is produced on-line by the image forming device.

The method of determining a motion quality parameter of an image forming device may be implemented using a computer system 100, such as that shown in Fig. 1, wherein the processes of Figs 2 to 7 may be implemented as software, such as one or more application programs executable within the computer system 100. In particular, the steps of the method of determining a motion quality parameter of an image forming device are effected by instructions in the software that are carried out within the computer system 100. The instructions may be formed as one or more code modules, each for performing one or more particular tasks. The software may also be divided into two separate parts, in which a first part and the corresponding code modules perform the motion quality parameter determination methods, and a second part and the corresponding code modules manage a user interface between the first part and the user. The software may be stored in a computer readable medium, including the storage devices described below, for example. The software is loaded into the computer system 100 from the computer readable medium, and then executed by the computer system 100. A computer readable medium having such software or computer program recorded on it is a computer program product. The use of the computer program product in the computer system 100 preferably effects an advantageous apparatus for determining a motion quality parameter of an image forming device.

As seen in Fig. 1, the computer system 100 is formed by a computer module 101, input devices such as a keyboard 102 and a mouse pointer device 103, and output devices including a printer 115, a display device 114 and loudspeakers 117. An external Modulator-Demodulator (Modem) transceiver device 116 may be used by the computer module 101 for communicating to and from a communications network 120 via a connection 121. The network 120 may be a wide-area network (WAN), such as the Internet or a private WAN. Where the connection 121 is a telephone line, the modem 116 may be a traditional "dial-up" modem. Alternatively, where the connection 121 is a high capacity (e.g. cable) connection, the modem 116 may be a broadband modem. A wireless modem may also be used for wireless connection to the network 120. In one implementation, the computer system 100 also includes a scanner (not shown) for capturing a digital image of an image produced by an image forming device. In another implementation, the computer system 100 includes a digital camera (not shown) for capturing a digital image of an image produced by an image forming device.

The computer module 101 typically includes at least one processor unit 105, and a memory unit 106, for example formed from semiconductor random access memory (RAM) and read only memory (ROM).
The module 101 also includes a number of input/output (I/O) interfaces, including an audio-video interface 107 that couples to the video display 114 and loudspeakers 117, an I/O interface 113 for the keyboard 102 and mouse 103 and optionally a joystick (not illustrated), and an interface 108 for the external modem 116 and printer 115. In some implementations, the modem 116 may be incorporated within the computer module 101, for example within the interface 108. The computer module 101 also has a local network interface 111 which, via a connection 123, permits coupling of the computer system 100 to a local computer network 122, known as a Local Area Network (LAN). As also illustrated, the local network 122 may also couple to the wide network 120 via a connection 124, which would typically include a so-called "firewall" device or similar functionality. The interface 111 may be formed by an Ethernet™ circuit card, a wireless Bluetooth™ or an IEEE 802.11 wireless arrangement.

The interfaces 108 and 113 may afford both serial and parallel connectivity, the former typically being implemented according to the Universal Serial Bus (USB) standards and having corresponding USB connectors (not illustrated). Storage devices 109 are provided and typically include a hard disk drive (HDD) 110. Other devices such as a floppy disk drive and a magnetic tape drive (not illustrated) may also be used. An optical disk drive 112 is typically provided to act as a non-volatile source of data. Portable memory devices, such as optical disks (e.g. CD-ROM, DVD), USB-RAM, and floppy disks, for example, may then be used as appropriate sources of data to the system 100. In one implementation, a test chart is stored in one or more of the storage devices 109.

The components 105 to 113 of the computer module 101 typically communicate via an interconnected bus 104 and in a manner which results in a conventional mode of operation of the computer system 100 known to those in the relevant art. Examples of computers on which the described arrangements can be practised include IBM-PCs and compatibles, Sun Sparcstations, Apple Mac™ or alike computer systems evolved therefrom.

Typically, the application programs discussed above are resident on the hard disk drive 110 and read and controlled in execution by the processor 105. Intermediate storage of such programs and any data fetched from the networks 120 and 122 may be accomplished using the semiconductor memory 106, possibly in concert with the hard disk drive 110. In some instances, the application programs may be supplied to the user encoded on one or more CD-ROMs and read via the corresponding drive 112, or alternatively may be read by the user from the networks 120 or 122. Still further, the software can also be loaded into the computer system 100 from other computer readable media. Computer readable media refers to any storage medium that participates in providing instructions and/or data to the computer system 100 for execution and/or processing. Examples of such media include floppy disks, magnetic tape, CD-ROM, a hard disk drive, a ROM or integrated circuit, a magneto-optical disk, or a computer readable card such as a PCMCIA card and the like, whether or not such devices are internal or external of the computer module 101.
Examples of computer readable transmission media that may also participate in the provision of instructions and/or data include radio or infra-red transmission channels, as well as a network connection to another computer or networked device, and the Internet or Intranets, including e-mail transmissions and information recorded on Websites and the like.

The second part of the application programs and the corresponding code modules mentioned above may be executed to implement one or more graphical user interfaces (GUIs) to be rendered or otherwise represented upon the display 114. Through manipulation of the keyboard 102 and the mouse 103, a user of the computer system 100 and the application may manipulate the interface to provide controlling commands and/or input to the applications associated with the GUI(s).

The method of determining a motion quality parameter of an image forming device may alternatively be implemented in dedicated hardware, such as one or more integrated circuits performing the functions or sub-functions of capturing a digital image of an image formed by an image forming device on a substrate, determining displacements of said digital image from an expected location, determining frequency components of said displacements, and determining a motion quality parameter of a device by utilising said frequency components. Such dedicated hardware may include graphic processors, digital signal processors, or one or more microprocessors and associated memories.

INDUSTRIAL APPLICABILITY

It is apparent from the above that the arrangements described are applicable to the computer, data processing, image processing, and printing industries.

The foregoing describes only some embodiments of the present invention, and modifications and/or changes can be made thereto without departing from the scope and spirit of the invention, the embodiments being illustrative and not restrictive.

In the context of this specification, the word "comprising" means "including principally but not necessarily solely" or "having" or "including", and not "consisting only of". Variations of the word "comprising", such as "comprise" and "comprises", have correspondingly varied meanings.
Claims (11)
1. A computer implemented method of determining a motion quality parameter of an image forming device, comprising the steps of:
(a) forming an image on a substrate using said image forming device;
(b) capturing a digital image of said image;
(c) determining at least one displacement of a part of said image from a respective expected location;
(d) determining frequency components of said determined displacements; and
(e) using said frequency components to determine a motion quality parameter of said image forming device which produced said image.
2. The method of claim 1, further comprising the step of:
(f) differentiating said determined displacements before determining said frequency components.
3. The method of claim 1, further comprising the step of:
(g) using said motion quality parameter to determine a wear state of said image forming device.
4. The method of claim 1, wherein said motion quality parameter determination step comprises a peak-finding technique.
5. The method of claim 1, wherein said displacement determining step comprises a demodulation method.
6. A method for determining a motion quality parameter of an image forming device, comprising the steps of:
(a) capturing a digital image of an image formed by said image forming device;
(b) determining a displacement for each of at least one part of said digital image from a respective ideal location;
(c) determining frequency components of said determined displacements; and
(d) determining a motion quality parameter associated with said image forming device, based on said frequency components.
7. The method according to claim 6, wherein said image formed by said image forming device is of a predefined test chart, wherein said respective ideal locations correspond to known properties of said test chart.
8. The method according to claim 6, wherein said digital image is obtained by one of a digital scanner and a digital camera.
9. The method according to claim 6, wherein said digital image is produced on-line by said image forming device.
10. A computer implemented method of determining a motion quality parameter of an image forming device, said method being substantially as described herein with reference to the accompanying drawings.
11. A method of determining a motion quality parameter of an image forming device, said method being substantially as described herein with reference to the accompanying drawings.

DATED this Third Day of December, 2007
Canon Kabushiki Kaisha
Patent Attorneys for the Applicant
SPRUSON & FERGUSON
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2007237343A AU2007237343A1 (en) | 2007-12-04 | 2007-12-04 | Motion Quality Analysis |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2007237343A AU2007237343A1 (en) | 2007-12-04 | 2007-12-04 | Motion Quality Analysis |
Publications (1)
Publication Number | Publication Date |
---|---|
AU2007237343A1 true AU2007237343A1 (en) | 2009-06-18 |
Family
ID=40863154
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
AU2007237343A Abandoned AU2007237343A1 (en) | 2007-12-04 | 2007-12-04 | Motion Quality Analysis |
Country Status (1)
Country | Link |
---|---|
AU (1) | AU2007237343A1 (en) |
-
2007
- 2007-12-04 AU AU2007237343A patent/AU2007237343A1/en not_active Abandoned
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109509168A (en) * | 2018-08-30 | 2019-03-22 | 易诚博睿(南京)科技有限公司 | A kind of details automatic analysis method for picture quality objective evaluating dead leaf figure |
CN109509168B (en) * | 2018-08-30 | 2019-06-25 | 易诚博睿(南京)科技有限公司 | A kind of details automatic analysis method for picture quality objective evaluating dead leaf figure |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Chen et al. | Video camera–based vibration measurement for civil infrastructure applications | |
US7515305B2 (en) | Systems and methods for measuring uniformity in images | |
JP5193113B2 (en) | MTF measuring apparatus and MTF measuring program | |
JP2019087792A (en) | Image inspection device, image inspection system, and image inspection method | |
JP6487867B2 (en) | Printing result inspection apparatus, method and program | |
Dobson et al. | Fast, large-scale, particle image velocimetry-based estimations of river surface velocity | |
CN103067735B (en) | Full field sharpness test | |
US6584233B1 (en) | Method for determining the components of image noise patterns of an imaging device and use of this method in an imaging device | |
JPWO2009044452A1 (en) | Radiation image processing apparatus and radiation image processing program | |
CN103245287B (en) | Image evaluation device, image evaluation method and non-transitory storage medium | |
US8913852B2 (en) | Band-based patch selection with a dynamic grid | |
US20100302606A1 (en) | Phase estimation distortion analysis | |
Niskanen et al. | Video stabilization performance assessment | |
US20100149247A1 (en) | Nozzle functionality detection of inkjet printers | |
CN108596862A (en) | Processing method for excluding infrared thermal imagery panorama sketch interference source | |
US8599435B2 (en) | Photoreceptor motion quality estimation using multiple sampling intervals | |
AU2007237343A1 (en) | Motion Quality Analysis | |
US8721025B2 (en) | Method of measuring printer spatial characteristics | |
JP5642605B2 (en) | Inspection apparatus, program, and image alignment method | |
US20140320565A1 (en) | Velocity Estimation Methods, and Imaging Devices and Printing Devices using the Methods | |
CN104954624A (en) | Correct control device, image reading apparatus, and correction control method | |
US8467592B2 (en) | Substrate media distortion analysis | |
JP2013223048A (en) | Image processing apparatus and method thereof | |
Fang et al. | Study on the registration testing of color digital printing machine | |
JP2015172506A (en) | Image inspection device, image inspection system, and image inspection method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| DA3 | Amendments made section 104 | Free format text: THE NATURE OF THE AMENDMENT IS: AMEND THE INVENTION TITLE TO READ MOTION QUALITY ANALYSIS |
| MK1 | Application lapsed section 142(2)(a) - no request for examination in relevant period | |