CN1606359A - Signal processing device and signal processing method, program, and recording medium - Google Patents

Signal processing device and signal processing method, program, and recording medium Download PDF

Info

Publication number
CN1606359A
CN1606359A, CNA2004100959448A, CN200410095944A
Authority
CN
China
Prior art keywords
signal
picture signal
output
unit
picture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CNA2004100959448A
Other languages
Chinese (zh)
Other versions
CN1606359B
Inventor
近藤哲二郎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2003283271A (JP4281453B2)
Priority claimed from JP2003283272A (JP4300925B2)
Priority claimed from JP2003283273A (JP4305743B2)
Priority claimed from JP2003283274A (JP4305744B2)
Application filed by Sony Corp
Publication of CN1606359A
Application granted
Publication of CN1606359B
Expired - Fee Related
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/50 Constructional details
    • H04N 23/54 Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/48 Increasing resolution by shifting the sensor relative to the scene
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/10 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N 23/13 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths with multiple sensors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/66 Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N 23/661 Transmitting camera control signals through networks, e.g. control via the Internet
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80 Camera processing pipelines; Components thereof
    • H04N 23/84 Camera processing pipelines; Components thereof for processing colour signals

Abstract

A signal processing unit subjects first image signals, obtained as the output of a three-sensor-system sensor unit which uses CMOS sensors or the like, to signal processing, thereby obtaining high-image-quality second image signals. The three sensors are positioned at placement positions suitable for the signal processing at the signal processing unit, the suitable placement positions having been obtained by learning performed beforehand. In one arrangement, the signal processing unit evaluates the second image signals and controls the placement positions of the three sensors according to the evaluation results. In another arrangement, the first image signals are evaluated in a predetermined region, and the capabilities of the sensors at the predetermined region are changed according to the evaluation results. In another arrangement, the sensor unit is controlled according to the level distribution of the first image signals. The present invention can be applied to still or video digital cameras.

Description

Signal processing device and signal processing method, program, and recording medium
Technical field
The present invention relates to a signal processing device, a signal processing method, a program, and a recording medium, and more particularly relates to an image pickup device, a signal processing device and signal processing method, and a program and recording medium therefor, whereby an image signal suited to signal processing is obtained and that image signal is subjected to the signal processing, thereby producing a high-quality image signal.
Background art
One example of an image pickup device is a digital camera (still or video), which has a sensor unit (imaging device) such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) imager (also called a "CMOS sensor") that receives object light (light from an object) and outputs an image signal corresponding to the amount of light received. That is, the sensor unit is a sensor that senses the light from the object, performs photoelectric conversion on it, and outputs an image signal corresponding to the object light in the form of an electrical signal.
There are various ways of using such a sensor, for example the single-sensor system and the three-sensor system. In the single-sensor system, an optical color filter arranged in a pattern known as a Bayer array passes one of R (red), G (green), or B (blue) light, which is projected onto the corresponding sensing pixel. Each pixel of the sensor therefore receives only one of R, G, or B light, and the sensor outputs an image signal in which each pixel has only one of the R, G, or B signal components. With the single-sensor system, each pixel of the image obtained from the sensor thus has only an R, G, or B component, so the signal components that each pixel lacks are interpolated afterwards. For example, for a pixel of interest that contains only an R signal (component), the G signal and B signal are predicted from a nearby pixel having only a G signal and a nearby pixel having only a B signal (see, for example, Japanese Unexamined Patent Application Publication No. 2002-135797).
There is also signal processing called demosaicing (interpolation), in which the pixels obtained from a single CCD, each having only one of the R, G, or B signals, are converted into pixels having all of the R, G, and B signals (see, for example, International Publication No. WO 00/49812).
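As a rough illustration of this kind of interpolation, the following is a minimal sketch of neighbor-averaging demosaicing over an RGGB Bayer mosaic. The tap shape (a 3x3 window) and the simple averaging rule are assumptions made here for illustration; they are not the interpolation methods of the publications cited above.

```python
# Minimal sketch (assumption): fill each missing color component with the mean
# of the same-color samples in its 3x3 neighborhood of an RGGB Bayer mosaic.
import numpy as np

def demosaic_rggb(mosaic: np.ndarray) -> np.ndarray:
    """mosaic: HxW single-component Bayer samples (RGGB). Returns HxWx3 RGB."""
    h, w = mosaic.shape
    rgb = np.zeros((h, w, 3))
    mask = np.zeros((h, w, 3), dtype=bool)
    mask[0::2, 0::2, 0] = True          # R sites
    mask[0::2, 1::2, 1] = True          # G sites on R rows
    mask[1::2, 0::2, 1] = True          # G sites on B rows
    mask[1::2, 1::2, 2] = True          # B sites
    for c in range(3):
        rgb[..., c][mask[..., c]] = mosaic[mask[..., c]]
    padded = np.pad(rgb, ((1, 1), (1, 1), (0, 0)))
    pmask = np.pad(mask, ((1, 1), (1, 1), (0, 0)))
    for y in range(h):
        for x in range(w):
            for c in range(3):
                if not mask[y, x, c]:
                    win = padded[y:y + 3, x:x + 3, c]
                    wmask = pmask[y:y + 3, x:x + 3, c]
                    rgb[y, x, c] = win[wmask].mean() if wmask.any() else 0.0
    return rgb
```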
In the three-sensor system, on the other hand, the sensor unit has three sensors, one each for R, G, and B: the R sensor receives R light, the G sensor receives G light, and the B sensor receives B light, so that an image signal in which every pixel has all three components, i.e., the R, G, and B signals, is output.
In an image pickup device of the three-sensor system, a given ray of light of interest is split by a prism into R, G, and B light, the R light being received by the R sensor, the G light by the G sensor, and the B light by the B sensor. The R, G, and B sensors are therefore placed at optically equivalent (identical) positions, so that the R, G, and B light of the ray of interest is received at the same position on each of the R, G, and B sensors. Note that it has also been proposed, in such an image pickup device, to place the G sensor offset by 1/2 pixel with respect to the R and B sensors, so as to obtain an image signal with improved resolution in the vertical direction (see, for example, Japanese Unexamined Patent Application Publication No. 08-256345).
Furthermore, in conventional devices, the output of an imaging device such as a charge coupled device (hereinafter abbreviated to "CCD") or a complementary metal oxide semiconductor (CMOS) imager (hereinafter abbreviated to "CMOS sensor" or "CMOS") is amplified so as to obtain an image signal of an appropriate level.
The output of the sensor unit is amplified by an amplifier such as an AGC (Automatic Gain Controller). However, when the contrast of the object is very large, a suitable image signal level may be difficult to obtain with a single amplifier. Japanese Unexamined Patent Application Publication No. 06-086155 therefore discloses a method in which the output of the sensor unit is amplified by two amplifiers having different gains, so that a wide dynamic range can be amplified easily.
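The sketch below illustrates one plausible way two amplified copies of the sensor output could be combined to cover a wide dynamic range; the selection rule shown is an assumption for illustration and is not taken from the cited publication.

```python
# Minimal sketch (assumption): combine a high-gain and a low-gain copy of the
# sensor output, preferring the high-gain channel where it is not saturated.
import numpy as np

def combine_dual_gain(raw: np.ndarray, low_gain: float = 1.0,
                      high_gain: float = 8.0, full_scale: float = 255.0) -> np.ndarray:
    high = np.clip(raw * high_gain, 0, full_scale)
    low = np.clip(raw * low_gain, 0, full_scale)
    saturated = high >= full_scale
    # Where the high-gain channel clips, fall back to the low-gain channel
    # rescaled to the high-gain scale; the result spans a wider range.
    return np.where(saturated, low * (high_gain / low_gain), high)
```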
Now, signal processing is usually performed in the stage following the sensor unit in order to improve the image quality of the image signal that the sensor unit outputs. That is, the CCD or CMOS sensor of the sensor unit (imaging device) receives incident light (the object light within the range of a pixel, during a predetermined exposure time) and outputs an image signal corresponding to the amount of light received. The sensor can therefore be regarded as sampling, in time series and spatially, the light arriving continuously within the range corresponding to each pixel during the predetermined exposure time, and outputting the result of that sampling as the image signal (pixel values).
The image signal output by the sensor unit is thus the result of temporally and spatially sampling continuous light, which means that part of the information contained in the original light is lost. Compared with the original light, the image signal output by the sensor unit is therefore degraded in image quality (that is, in amount of information). This is why signal processing is performed in the stage following the sensor unit: to improve the quality of an image signal whose quality has been degraded relative to the original light.
In the usual configuration, however, the sensor unit is manufactured without any consideration of the signal processing performed in the subsequent stage, and it likewise outputs the image signal without any consideration of that signal processing. The sensor unit therefore has characteristics that are unrelated to the signal processing performed in the subsequent stage. Because the sensor unit not only has characteristics unrelated to that signal processing but also outputs the image signal in a uniform manner, the degree to which the image quality can be improved by the signal processing performed after the sensor unit is limited.
Conversely, if the sensor unit could output an image signal suited to the signal processing performed in the subsequent stage, that signal processing could yield an image signal of higher quality.
Summary of the invention
The present invention has been made in view of the above, and it is accordingly an object of the present invention to obtain an image signal suited to signal processing and to perform that signal processing on the image signal, whereby a high-quality image signal can be obtained.
According to a first aspect of the present invention, a signal processing device comprises: sensor means for sensing information and outputting a signal corresponding to the information; and signal processing means for subjecting the signal output by the sensor means to signal processing; wherein the sensor means is configured with characteristics corresponding to the signal processing.
A signal processing method corresponding to the first aspect of the present invention comprises: an acquisition step of sensing information and acquiring the signal, corresponding to the information, output by sensor means; and a signal processing step of subjecting the signal output by the sensor means to signal processing; wherein the sensor means is configured with characteristics corresponding to the signal processing.
A computer-readable program corresponding to the first aspect of the present invention comprises: acquisition step code for sensing information and acquiring the signal, corresponding to the information, output by sensor means; and signal processing step code for subjecting the signal output by the sensor means to signal processing; wherein the sensor means is configured with characteristics corresponding to the signal processing.
A recording medium corresponding to the first aspect of the present invention stores a computer-readable program, wherein the program comprises: acquisition step code for sensing information and acquiring the signal, corresponding to the information, output by sensor means; and signal processing step code for subjecting the signal output by the sensor means to signal processing; wherein the sensor means is configured with characteristics corresponding to the signal processing.
With the signal processing device, signal processing method, program, and corresponding recording medium according to the first aspect of the present invention, signal processing is performed on the signal output by sensor means that senses information and outputs a signal corresponding to the information. Here, the sensor means is configured with characteristics corresponding to the signal processing.
According to a second aspect of the present invention, a signal processing device comprises: sensor means having at least a first sensor that senses light and outputs a first component of an image signal corresponding to the light, and a second sensor that outputs a second component of the image signal; and signal processing means for subjecting a first digital image signal obtained from the output of the sensor means to signal processing and outputting a second digital image signal; wherein the first and second sensors are set in a placement state corresponding to the signal processing, obtained by learning performed beforehand.
A signal processing method corresponding to the second aspect of the present invention comprises: an acquisition step of acquiring the image signal output by sensor means having at least a first sensor that senses light and outputs a first component of an image signal corresponding to the light and a second sensor that outputs a second component of the image signal; and a signal processing step of subjecting a first digital image signal obtained from the output of the sensor means to signal processing and outputting a second digital image signal; wherein the first and second sensors are set in a placement state corresponding to the signal processing, obtained by learning performed beforehand.
A computer-readable program corresponding to the second aspect of the present invention comprises: acquisition step code for acquiring the image signal output by sensor means having at least a first sensor that senses light and outputs a first component of an image signal corresponding to the light and a second sensor that outputs a second component of the image signal; and signal processing step code for subjecting a first digital image signal obtained from the output of the sensor means to signal processing and outputting a second digital image signal; wherein the first and second sensors are set in a placement state corresponding to the signal processing, obtained by learning performed beforehand.
A recording medium corresponding to the second aspect of the present invention stores a computer-readable program, wherein the program comprises: acquisition step code for acquiring the image signal output by sensor means having at least a first sensor that senses light and outputs a first component of an image signal corresponding to the light and a second sensor that outputs a second component of the image signal; and signal processing step code for subjecting a first digital image signal obtained from the output of the sensor means to signal processing and outputting a second digital image signal; wherein the first and second sensors are set in a placement state corresponding to the signal processing, obtained by learning performed beforehand.
With the signal processing device, signal processing method, program, and corresponding recording medium according to the second aspect of the present invention, the signal processing is performed on a first digital image signal obtained from the output of sensor means having at least a first sensor that senses light and outputs a first component of an image signal corresponding to the light and a second sensor that outputs a second component of the image signal, whereby a second digital image signal is output. Here, the first and second sensors are placed, by learning performed beforehand, in a placement state corresponding to the signal processing.
According to a third aspect of the present invention, a signal processing device comprises: signal processing means for subjecting the signal output by sensor means, which senses information and outputs a signal corresponding to the information, to signal processing; control means for controlling a characteristic of the sensor means; evaluation means for evaluating the result of performing the signal processing on the output of the sensor means whose characteristic is controlled by the control means; and determining means for determining, in accordance with the evaluation result of the evaluation means, the characteristic of the sensor means that corresponds to the signal processing, and outputting information on that characteristic.
A signal processing method corresponding to the third aspect of the present invention comprises: a signal processing step of subjecting the signal output by sensor means, which senses information and outputs a signal corresponding to the information, to signal processing; a control step of controlling a characteristic of the sensor means; an evaluation step of evaluating the result of performing the signal processing on the output of the sensor means whose characteristic is controlled in the control step; and a determining step of determining, in accordance with the evaluation result of the evaluation step, the characteristic of the sensor means that corresponds to the signal processing, and outputting information on that characteristic.
A computer-readable program corresponding to the third aspect of the present invention comprises: signal processing step code for subjecting the signal output by sensor means, which senses information and outputs a signal corresponding to the information, to signal processing; control step code for controlling a characteristic of the sensor means; evaluation step code for evaluating the result of performing the signal processing on the output of the sensor means whose characteristic is controlled in the control step; and determining step code for determining, in accordance with the evaluation result of the evaluation step, the characteristic of the sensor means that corresponds to the signal processing, and outputting information on that characteristic.
A recording medium corresponding to the third aspect of the present invention stores a computer-readable program, wherein the program comprises: signal processing step code for subjecting the signal output by sensor means, which senses information and outputs a signal corresponding to the information, to signal processing; control step code for controlling a characteristic of the sensor means; evaluation step code for evaluating the result of performing the signal processing on the output of the sensor means whose characteristic is controlled in the control step; and determining step code for determining, in accordance with the evaluation result of the evaluation step, the characteristic of the sensor means that corresponds to the signal processing, and outputting information on that characteristic.
With the signal processing device, signal processing method, program, and corresponding recording medium according to the third aspect of the present invention, signal processing is performed on the signal output by sensor means that senses information and outputs a signal corresponding to the information; meanwhile, a characteristic of the sensor means is controlled, and the result of the signal processing performed on the output of the sensor means having the controlled characteristic is evaluated. In accordance with the evaluation result, the characteristic of the sensor means corresponding to the signal processing is determined, and information on that characteristic is output.
According to a fourth aspect of the present invention, a signal processing device comprises: signal processing means for subjecting a first digital image signal obtained from the output of sensor means to signal processing, the sensor means having at least a first sensor that senses light and outputs a first component of an image signal corresponding to the light and a second sensor that outputs a second component of the image signal; control means for controlling the placement state of the first and second sensors; evaluation means for evaluating a second digital image signal obtained by performing the signal processing on the output of the sensor means in which the placement state of the first and second sensors is controlled by the control means; and determining means for determining, in accordance with the evaluation result of the evaluation means, the placement state of the first and second sensors that corresponds to the signal processing, and outputting information on that placement state.
A signal processing method corresponding to the fourth aspect of the present invention comprises: a signal processing step of subjecting a first digital image signal obtained from the output of sensor means to signal processing, the sensor means having at least a first sensor that senses light and outputs a first component of an image signal corresponding to the light and a second sensor that outputs a second component of the image signal; a control step of controlling the placement state of the first and second sensors; an evaluation step of evaluating a second digital image signal obtained by performing the signal processing on the output of the sensor means in which the placement state of the first and second sensors is controlled in the control step; and a determining step of determining, in accordance with the evaluation result of the evaluation step, the placement state of the first and second sensors that corresponds to the signal processing, and outputting information on that placement state.
A computer-readable program corresponding to the fourth aspect of the present invention comprises: signal processing step code for subjecting a first digital image signal obtained from the output of sensor means to signal processing, the sensor means having at least a first sensor that senses light and outputs a first component of an image signal corresponding to the light and a second sensor that outputs a second component of the image signal; control step code for controlling the placement state of the first and second sensors; evaluation step code for evaluating a second digital image signal obtained by performing the signal processing on the output of the sensor means in which the placement state of the first and second sensors is controlled in the control step; and determining step code for determining, in accordance with the evaluation result of the evaluation step, the placement state of the first and second sensors that corresponds to the signal processing, and outputting information on that placement state.
A recording medium corresponding to the fourth aspect of the present invention stores a computer-readable program, wherein the program comprises: signal processing step code for subjecting a first digital image signal obtained from the output of sensor means to signal processing, the sensor means having at least a first sensor that senses light and outputs a first component of an image signal corresponding to the light and a second sensor that outputs a second component of the image signal; control step code for controlling the placement state of the first and second sensors; evaluation step code for evaluating a second digital image signal obtained by performing the signal processing on the output of the sensor means in which the placement state of the first and second sensors is controlled in the control step; and determining step code for determining, in accordance with the evaluation result of the evaluation step, the placement state of the first and second sensors that corresponds to the signal processing, and outputting information on that placement state.
With the signal processing device, signal processing method, program, and corresponding recording medium according to the fourth aspect of the present invention, the signal processing is performed on a first digital image signal obtained from the output of sensor means having at least a first sensor that senses light and outputs a first component of an image signal corresponding to the light and a second sensor that outputs a second component of the image signal, whereby a second digital image signal is output; meanwhile, the placement state of the first and second sensors is controlled, and the second digital image signal obtained by performing the signal processing on the output of the sensor means having the controlled placement state is evaluated. In accordance with the evaluation result, the placement state of the first and second sensors that corresponds to the signal processing is determined, and information on that placement state is output.
According to a fifth aspect of the present invention, a signal processing device comprises: image conversion means for subjecting a first digital image signal obtained from the output of imaging means to image conversion processing, the imaging means having at least a first sensor that obtains a first component of an image signal and a second sensor that obtains a second component of the image signal, and outputting a second digital image signal; evaluation means for evaluating the second digital image signal; and control means for controlling the placement state of at least one of the first and second sensors in accordance with the evaluation by the evaluation means.
A signal processing method corresponding to the fifth aspect of the present invention comprises: an image conversion step of subjecting a first digital image signal obtained from the output of imaging means to image conversion processing, the imaging means having at least a first sensor that obtains a first component of an image signal and a second sensor that obtains a second component of the image signal, and outputting a second digital image signal; an evaluation step of evaluating the second digital image signal; and a control step of controlling the placement state of at least one of the first and second sensors in accordance with the evaluation in the evaluation step.
A computer-readable program corresponding to the fifth aspect of the present invention comprises: image conversion step code for subjecting a first digital image signal obtained from the output of imaging means to image conversion processing, the imaging means having at least a first sensor that obtains a first component of an image signal and a second sensor that obtains a second component of the image signal, and outputting a second digital image signal; evaluation step code for evaluating the second digital image signal; and control step code for controlling the placement state of at least one of the first and second sensors in accordance with the evaluation in the evaluation step.
A recording medium corresponding to the fifth aspect of the present invention stores a computer-readable program, wherein the program comprises: image conversion step code for subjecting a first digital image signal obtained from the output of imaging means to image conversion processing, the imaging means having at least a first sensor that obtains a first component of an image signal and a second sensor that obtains a second component of the image signal, and outputting a second digital image signal; evaluation step code for evaluating the second digital image signal; and control step code for controlling the placement state of at least one of the first and second sensors in accordance with the evaluation in the evaluation step.
With the signal processing device, signal processing method, program, and corresponding recording medium according to the fifth aspect of the present invention, the image conversion processing is performed on a first digital image signal obtained from the output of imaging means having at least a first sensor that obtains a first component of an image signal and a second sensor that obtains a second component of the image signal, and a second digital image signal is output. The second digital image signal is then evaluated, and the placement state of at least one of the first and second sensors is controlled in accordance with the evaluation.
According to a sixth aspect of the present invention, a signal processing device comprises: parameter acquisition means for acquiring a predetermined parameter; control means for controlling, in accordance with the predetermined parameter, the placement state of at least one of a first sensor that obtains a first component of an image signal and a second sensor that obtains a second component of the image signal, the first and second sensors belonging to imaging means; and image conversion means for subjecting a first digital image signal obtained from the output of the imaging means to image conversion processing corresponding to the predetermined parameter, and outputting a second digital image signal.
A signal processing method corresponding to the sixth aspect of the present invention comprises: an acquisition step of acquiring a predetermined parameter; a control step of controlling, in accordance with the predetermined parameter, the placement state of at least one of the first sensor, which obtains a first component of an image signal, and the second sensor, which obtains a second component of the image signal, of imaging means; and an image conversion step of subjecting a first digital image signal obtained from the output of the imaging means to image conversion processing corresponding to the predetermined parameter, and outputting a second digital image signal.
A computer-readable program corresponding to the sixth aspect of the present invention comprises: acquisition step code for acquiring a predetermined parameter; control step code for controlling, in accordance with the predetermined parameter, the placement state of at least one of the first sensor, which obtains a first component of an image signal, and the second sensor, which obtains a second component of the image signal, of imaging means; and image conversion step code for subjecting a first digital image signal obtained from the output of the imaging means to image conversion processing corresponding to the predetermined parameter, and outputting a second digital image signal.
A recording medium corresponding to the sixth aspect of the present invention stores a computer-readable program, wherein the program comprises: acquisition step code for acquiring a predetermined parameter; control step code for controlling, in accordance with the predetermined parameter, the placement state of at least one of the first sensor, which obtains a first component of an image signal, and the second sensor, which obtains a second component of the image signal, of imaging means; and image conversion step code for subjecting a first digital image signal obtained from the output of the imaging means to image conversion processing corresponding to the predetermined parameter, and outputting a second digital image signal.
With the signal processing device, signal processing method, program, and corresponding recording medium according to the sixth aspect of the present invention, the placement state of at least one of the first sensor, which obtains a first component of an image signal, and the second sensor, which obtains a second component of the image signal, of the imaging means is controlled in accordance with a predetermined parameter, and image conversion processing corresponding to the predetermined parameter is performed on a first digital image signal obtained from the output of the imaging means, whereby a second digital image signal is output.
According to a seventh aspect of the present invention, a signal processing device comprises: acquisition means for acquiring a predetermined parameter; image conversion means for subjecting a first digital image signal, obtained from the output of imaging means having at least a first sensor that obtains a first component of an image signal and a second sensor that obtains a second component of the image signal, to image conversion processing and outputting a second digital image signal; control means for controlling the placement state of at least one of the first and second sensors; evaluation means for evaluating the second digital image signal; and storage means for storing, in association with each other, the predetermined parameter and the placement state of the first or second sensor, in correspondence with the evaluation by the evaluation means.
A signal processing method corresponding to the seventh aspect of the present invention comprises: an acquisition step of acquiring a predetermined parameter; an image conversion step of subjecting a first digital image signal, obtained from the output of imaging means having at least a first sensor that obtains a first component of an image signal and a second sensor that obtains a second component of the image signal, to image conversion processing and outputting a second digital image signal; a control step of controlling the placement state of at least one of the first and second sensors; an evaluation step of evaluating the second digital image signal; and a storage step of storing, in association with each other, the predetermined parameter and the placement state of the first or second sensor, in correspondence with the evaluation in the evaluation step.
A computer-readable program corresponding to the seventh aspect of the present invention comprises: acquisition step code for acquiring a predetermined parameter; image conversion step code for subjecting a first digital image signal, obtained from the output of imaging means having at least a first sensor that obtains a first component of an image signal and a second sensor that obtains a second component of the image signal, to image conversion processing and outputting a second digital image signal; control step code for controlling the placement state of at least one of the first and second sensors; evaluation step code for evaluating the second digital image signal; and storage step code for storing, in association with each other, the predetermined parameter and the placement state of the first or second sensor, in correspondence with the evaluation in the evaluation step.
A recording medium corresponding to the seventh aspect of the present invention stores a computer-readable program, wherein the program comprises: acquisition step code for acquiring a predetermined parameter; image conversion step code for subjecting a first digital image signal, obtained from the output of imaging means having at least a first sensor that obtains a first component of an image signal and a second sensor that obtains a second component of the image signal, to image conversion processing and outputting a second digital image signal; control step code for controlling the placement state of at least one of the first and second sensors; evaluation step code for evaluating the second digital image signal; and storage step code for storing, in association with each other, the predetermined parameter and the placement state of the first or second sensor, in correspondence with the evaluation in the evaluation step.
With the signal processing device, signal processing method, program, and corresponding recording medium according to the seventh aspect of the present invention, the image conversion processing is performed on a first digital image signal obtained from the output of imaging means having at least a first sensor that obtains a first component of an image signal and a second sensor that obtains a second component of the image signal, and a second digital image signal is output. Furthermore, the placement state of at least one of the first and second sensors is controlled, the second digital image signal is evaluated, and the predetermined parameter and the placement state of the first or second sensor are stored in association with each other, in correspondence with the evaluation.
According to an eighth aspect of the present invention, a signal processing device comprises: image conversion means for subjecting a first digital image signal, obtained from the output of sensor means having a plurality of photoelectric conversion elements, to image conversion processing and outputting a second digital image signal; and evaluation means for evaluating the first digital image signal of a predetermined region; wherein the characteristics (capability) of the portion of the sensor means corresponding to the first digital image signal of the predetermined region are changed in accordance with the evaluation performed by the evaluation means.
A signal processing method corresponding to the eighth aspect of the present invention comprises: an image conversion step of subjecting a first digital image signal, obtained from the output of sensor means having a plurality of photoelectric conversion elements, to image conversion processing and outputting a second digital image signal; and an evaluation step of evaluating the first digital image signal of a predetermined region; wherein the characteristics of the portion of the sensor means corresponding to the first digital image signal of the predetermined region are changed in accordance with the evaluation performed in the evaluation step.
A computer-readable program corresponding to the eighth aspect of the present invention comprises: image conversion step code for subjecting a first digital image signal, obtained from the output of sensor means having a plurality of photoelectric conversion elements, to image conversion processing and outputting a second digital image signal; and evaluation step code for evaluating the first digital image signal of a predetermined region; wherein the characteristics of the portion of the sensor means corresponding to the first digital image signal of the predetermined region are changed in accordance with the evaluation performed in the evaluation step.
A recording medium corresponding to the eighth aspect of the present invention stores a computer-readable program, wherein the program comprises: image conversion step code for subjecting a first digital image signal, obtained from the output of sensor means having a plurality of photoelectric conversion elements, to image conversion processing and outputting a second digital image signal; and evaluation step code for evaluating the first digital image signal of a predetermined region; wherein the characteristics of the portion of the sensor means corresponding to the first digital image signal of the predetermined region are changed in accordance with the evaluation performed in the evaluation step.
With the signal processing device, signal processing method, program, and corresponding recording medium according to the eighth aspect of the present invention, the image conversion processing is performed on a first digital image signal obtained from the output of the sensor means, and a second digital image signal is output. Meanwhile, the first digital image signal of a predetermined region is evaluated, and the characteristics of the portion of the sensor means corresponding to the first digital image signal of the predetermined region are changed in accordance with the evaluation of the first digital image signal of the predetermined region.
According to a ninth aspect of the present invention, a signal processing device that performs signal processing for converting a first image signal into a second image signal comprises: class classification means for performing class classification that assigns the second image signal to one of a plurality of classes in accordance with the level distribution of the first image signal output by imaging means that converts object light from an object into an image signal; control means for controlling the imaging means in accordance with the level distribution of the first image signal; tap coefficient output means for outputting, for each class, a tap coefficient obtained by learning; and computing means for performing computation using the first image signal output by the imaging means under control of the control means and the tap coefficient of the class obtained by the class classification means, so as to obtain the second image signal.
A signal processing method corresponding to the ninth aspect of the present invention, for performing signal processing that converts a first image signal into a second image signal, comprises: a class classification step of performing class classification that assigns the second image signal to one of a plurality of classes in accordance with the level distribution of the first image signal output by imaging means that converts object light from an object into an image signal; a control step of controlling the imaging means in accordance with the level distribution of the first image signal; a tap coefficient output step of outputting, for each class, a tap coefficient obtained by learning; and a calculation step of performing computation using the first image signal output by the imaging means controlled in the control step and the tap coefficient of the class obtained in the class classification step, so as to obtain the second image signal.
A computer-readable program corresponding to the ninth aspect of the present invention, for performing signal processing that converts a first image signal into a second image signal, comprises: class classification step code for performing class classification that assigns the second image signal to one of a plurality of classes in accordance with the level distribution of the first image signal output by imaging means that converts object light from an object into an image signal; control step code for controlling the imaging means in accordance with the level distribution of the first image signal; tap coefficient output step code for outputting, for each class, a tap coefficient obtained by learning; and calculation step code for performing computation using the first image signal output by the imaging means controlled in the control step and the tap coefficient of the class obtained in the class classification step, so as to obtain the second image signal.
A recording medium corresponding to the ninth aspect of the present invention, for the signal processing that converts a first image signal into a second image signal, stores a computer-readable program, wherein the program comprises: class classification step code for performing class classification that assigns the second image signal to one of a plurality of classes in accordance with the level distribution of the first image signal output by imaging means that converts object light from an object into an image signal; control step code for controlling the imaging means in accordance with the level distribution of the first image signal; tap coefficient output step code for outputting, for each class, a tap coefficient obtained by learning; and calculation step code for performing computation using the first image signal output by the imaging means controlled in the control step and the tap coefficient of the class obtained in the class classification step, so as to obtain the second image signal.
With the signal processing device, signal processing method, program, and corresponding recording medium according to the ninth aspect of the present invention, class classification assigning the second image signal to one of a plurality of classes is performed in accordance with the level distribution of the first image signal output by the imaging means, which converts the object light from an object into an image signal, and the imaging means is controlled in accordance with the level distribution of the first image signal. Furthermore, the tap coefficient obtained by learning is output for each class, and computation is performed using the first image signal output by the imaging means under that control and the tap coefficient of the class obtained by the class classification, whereby the second image signal is obtained.
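As a rough illustration of this kind of class-classification adaptive processing, the sketch below classifies a pixel of interest by 1-bit ADRC over its class taps and then predicts the output pixel as a linear combination of its prediction taps with coefficients learned per class, i.e. y = sum_n w_n * x_n. The tap shapes, the ADRC coding, and the coefficient table are assumptions chosen for illustration, not the specific configuration fixed by this patent.

```python
# Minimal sketch (assumptions: 1-bit ADRC class code over a 3x3 neighborhood,
# the same 3x3 pixels used as prediction taps, and a per-class coefficient
# table obtained by learning beforehand).
import numpy as np

def adrc_class(taps: np.ndarray) -> int:
    """1-bit ADRC: threshold each class tap at the midpoint of its dynamic range."""
    lo, hi = taps.min(), taps.max()
    bits = (taps >= (lo + hi) / 2.0).astype(int) if hi > lo else np.zeros_like(taps, int)
    return int("".join(map(str, bits.ravel())), 2)

def convert_pixel(first: np.ndarray, y: int, x: int,
                  coeff_table: dict[int, np.ndarray]) -> float:
    """Predict one pixel of the second image signal from the first image signal."""
    taps = first[y - 1:y + 2, x - 1:x + 2]         # class taps == prediction taps here
    cls = adrc_class(taps)
    w = coeff_table[cls]                           # tap coefficients for this class
    return float(np.dot(w, taps.ravel()))          # linear prediction sum_n w_n * x_n
```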
According to a tenth aspect of the present invention, a signal processing device that performs signal processing for converting a first image signal into a second image signal comprises: class classification means for performing class classification that assigns the second image signal to one of a plurality of classes in accordance with the level distribution of the first image signal output by imaging means that converts object light from an object into an image signal; activity detection means for detecting the activity of the first image signal; control means for controlling the imaging means in accordance with the activity of the first image signal; tap coefficient output means for outputting, for each class, a tap coefficient obtained by learning; and computing means for performing computation using the first image signal output by the imaging means under control of the control means and the tap coefficient of the class obtained by the class classification means, so as to obtain the second image signal.
A signal processing method corresponding to the tenth aspect of the present invention, for performing signal processing that converts a first image signal into a second image signal, comprises: a class classification step of performing class classification that assigns the second image signal to one of a plurality of classes in accordance with the level distribution of the first image signal output by imaging means that converts object light from an object into an image signal; an activity detection step of detecting the activity of the first image signal; a control step of controlling the imaging means in accordance with the activity of the first image signal; a tap coefficient output step of outputting, for each class, a tap coefficient obtained by learning; and a calculation step of performing computation using the first image signal output by the imaging means controlled in the control step and the tap coefficient of the class obtained in the class classification step, so as to obtain the second image signal.
A computer-readable program corresponding to the tenth aspect of the present invention, for performing signal processing that converts a first image signal into a second image signal, comprises: class classification step code for performing class classification that assigns the second image signal to one of a plurality of classes in accordance with the level distribution of the first image signal output by imaging means that converts object light from an object into an image signal; activity detection step code for detecting the activity of the first image signal; control step code for controlling the imaging means in accordance with the activity of the first image signal; tap coefficient output step code for outputting, for each class, a tap coefficient obtained by learning; and calculation step code for performing computation using the first image signal output by the imaging means controlled in the control step and the tap coefficient of the class obtained in the class classification step, so as to obtain the second image signal.
A recording medium corresponding to the tenth aspect of the present invention, for the signal processing that converts a first image signal into a second image signal, stores a computer-readable program, wherein the program comprises: class classification step code for performing class classification that assigns the second image signal to one of a plurality of classes in accordance with the level distribution of the first image signal output by imaging means that converts object light from an object into an image signal; activity detection step code for detecting the activity of the first image signal; control step code for controlling the imaging means in accordance with the activity of the first image signal; tap coefficient output step code for outputting, for each class, a tap coefficient obtained by learning; and calculation step code for performing computation using the first image signal output by the imaging means controlled in the control step and the tap coefficient of the class obtained in the class classification step, so as to obtain the second image signal.
With the signal processing device, signal processing method, program, and corresponding recording medium according to the tenth aspect of the present invention, class classification assigning the second image signal to one of a plurality of classes is performed in accordance with the level distribution of the first image signal output by the imaging means, which converts the object light from an object into an image signal. Furthermore, the activity of the first image signal is detected, and the imaging means is controlled in accordance with that activity. A tap coefficient obtained by learning is output for each class, and computation is performed using the first image signal output by the imaging means under that control and the tap coefficient of the class obtained by the class classification, whereby the second image signal is obtained.
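Activity here is any measure of how strongly the first image signal fluctuates locally; the short sketch below assumes two common choices (the dynamic range of the taps and the sum of absolute differences between adjacent taps), neither of which is prescribed by this text.

```python
# Minimal sketch (assumption): two simple activity measures over the taps of
# the pixel of interest; the particular measure is not fixed here.
import numpy as np

def activity(taps: np.ndarray) -> dict[str, float]:
    flat = taps.ravel().astype(float)
    return {
        "dynamic_range": float(flat.max() - flat.min()),
        "sum_abs_diff": float(np.abs(np.diff(flat)).sum()),
    }
```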
According to an eleventh aspect of the present invention, a signal processing device that performs signal processing for converting a first image signal into a second image signal comprises: class classification means for performing class classification that assigns the second image signal to one of a plurality of classes in accordance with the level distribution of the first image signal produced by imaging means that converts object light from an object into an image signal; parameter output means for outputting a parameter representing the resolution of the second image signal; control means for controlling the imaging means in accordance with the parameter; tap coefficient generating means for generating, for each class, a tap coefficient from coefficient seed data obtained by learning and from the parameter; and computing means for performing computation using the first image signal output by the imaging means under control of the control means and the tap coefficient of the class obtained by the class classification means, so as to obtain the second image signal.
A signal processing method corresponding to the eleventh aspect of the present invention, for performing signal processing that converts a first image signal into a second image signal, comprises: a class classification step of performing class classification that assigns the second image signal to one of a plurality of classes in accordance with the level distribution of the first image signal output by imaging means that converts object light from an object into an image signal; a parameter output step of outputting a parameter representing the resolution of the second image signal; a control step of controlling the imaging means in accordance with the parameter; a tap coefficient generating step of generating, for each class, a tap coefficient from coefficient seed data obtained by learning and from the parameter; and a calculation step of performing computation using the first image signal output by the imaging means controlled in the control step and the tap coefficient of the class obtained in the class classification step, so as to obtain the second image signal.
A computer-readable program corresponding to the eleventh aspect of the present invention, for performing signal processing that converts a first image signal into a second image signal, comprises: class classification step code for performing class classification that assigns the second image signal to one of a plurality of classes in accordance with the level distribution of the first image signal produced by imaging means that converts object light from an object into an image signal; parameter output step code for outputting a parameter representing the resolution of the second image signal; control step code for controlling the imaging means in accordance with the parameter; tap coefficient generating step code for generating, for each class, a tap coefficient from coefficient seed data obtained by learning and from the parameter; and calculation step code for performing computation using the first image signal output by the imaging means controlled in the control step and the tap coefficient of the class obtained in the class classification step, so as to obtain the second image signal.
A recording medium corresponding to the eleventh aspect of the present invention, for the signal processing that converts a first image signal into a second image signal, stores a computer-readable program, wherein the program comprises: class classification step code for performing class classification that assigns the second image signal to one of a plurality of classes in accordance with the level distribution of the first image signal produced by imaging means that converts object light from an object into an image signal; parameter output step code for outputting a parameter representing the resolution of the second image signal; control step code for controlling the imaging means in accordance with the parameter; tap coefficient generating step code for generating, for each class, a tap coefficient from coefficient seed data obtained by learning and from the parameter; and calculation step code for performing computation using the first image signal output by the imaging means controlled in the control step and the tap coefficient of the class obtained in the class classification step, so as to obtain the second image signal.
With the signal processing device, signal processing method, program, and corresponding recording medium according to the eleventh aspect of the present invention, class classification assigning the second image signal to one of a plurality of classes is performed in accordance with the level distribution of the first image signal output by the imaging means, which converts the object light from an object into an image signal. In addition, a parameter representing the resolution of the second image signal is output, and the imaging means is controlled in accordance with that parameter. Furthermore, a tap coefficient is generated for each class from coefficient seed data obtained by learning and from the parameter, and computation is performed using the first image signal output by the imaging means under that control and the tap coefficient of the class obtained by the class classification, whereby the second image signal is obtained.
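One common way to generate a tap coefficient from coefficient seed data and a parameter is to treat each coefficient as a polynomial in the parameter, w_n = sum_m beta_{m,n} z^m, where z is the resolution parameter and beta is the learned seed data. The sketch below uses this expansion as an assumption for illustration; it is not presented as the formula fixed by this patent.

```python
# Minimal sketch (assumption): each tap coefficient w_n is expanded as a
# polynomial in the parameter z, with the learned coefficient seed data beta
# supplying the polynomial coefficients: w_n = sum_m beta[m, n] * z**m.
import numpy as np

def generate_tap_coefficients(seed: np.ndarray, z: float) -> np.ndarray:
    """seed: (M, N) coefficient seed data for one class (M polynomial terms,
    N taps). Returns the N tap coefficients for the parameter value z."""
    powers = z ** np.arange(seed.shape[0])          # [1, z, z^2, ..., z^(M-1)]
    return powers @ seed                            # w_n = sum_m beta[m, n] * z^m
```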
Brief description of drawings
Fig. 1 shows the block diagram of the profile instance of a specific embodiment using image pick up equipment of the present invention;
Fig. 2 shows the block diagram of the profile instance of signal processing unit 4 and output unit 5;
Fig. 3 shows the block diagram of the profile instance of sensor unit 1;
Fig. 4 shows the block diagram of the profile instance of signal processing unit 11;
Fig. 5 is a flow chart of describing the operation of image pick up equipment;
Fig. 6 shows the block diagram of the profile instance of image conversion unit 31;
Fig. 7 shows the block diagram of the profile instance of factor output unit 124;
Fig. 8 shows the block diagram of the profile instance of the facility for study of learning the tap factor;
Fig. 9 is a flow chart of describing the learning process of study tap factor;
Figure 10 is a flow chart of describing the processing of image conversion unit 31;
Figure 11 shows the block diagram as the profile instance of first specific embodiment of the image pick up equipment of facility for study on the study placement location;
Figure 12 shows the block diagram of the profile instance of sensor unit 231;
Figures 13A to 13D are block diagrams describing control of the placement locations of R photoreceptor unit 256R, G photoreceptor unit 256G and B photoreceptor unit 256B;
Figure 14 shows the block diagram of the profile instance of assessment unit 235;
Figure 15 shows the block diagram of the profile instance of position determination unit 236;
Figure 16 describes the flow chart of image pick up equipment as the learning process of facility for study;
Figure 17 shows the block diagram of the profile instance of using computer of the present invention;
Figure 18 shows the block diagram of the profile instance of second specific embodiment of using image pick up equipment of the present invention;
Figure 19 shows the block diagram of the profile instance of signal processing unit 404 and output unit 405;
Figure 20 shows the block diagram of the profile instance of sensor unit 401;
Figures 21A to 21D are block diagrams describing control of the placement locations of R photoreceptor unit 423R, G photoreceptor unit 423G and B photoreceptor unit 423B;
Figure 22 is the block diagram of the profile instance of explanation signal processing unit 411;
Figure 23 is the block diagram of first profile instance of explanation assessment unit 433;
Figure 24 is the block diagram of the relation between explanation side-play amount and the correlation;
Figure 25 is a flow chart of describing the operation of image pick up equipment;
Figure 26 is a flow chart of describing the evaluation process that assessment unit 433 carried out;
Figure 27 is the block diagram that the profile instance of the 3rd specific embodiment of using image pick up equipment of the present invention is described;
Figure 28 is the block diagram of the profile instance of explanation signal processing unit 411;
Figure 29 is the block diagram of the profile instance of explanation image conversion unit 431;
Figure 30 is the block diagram of another profile instance of explanation factor output unit 124;
Figure 31 describes the tap factor to upgrade the flow chart of handling;
Figure 32 is the block diagram that first profile instance of the facility for study of learning the factor seed data is described;
Figure 33 is a block diagram of describing the learning method of study factor seed data;
Figure 34 is a flow chart of describing the learning process of study factor seed data;
Figure 35 is a block diagram of describing the learning method of study factor seed data;
Figure 36 is the block diagram that second profile instance of the facility for study of learning the factor seed data is described;
Figure 37 is the block diagram of the profile instance of explanation signal processing unit 411;
Figure 38 is the block diagram of the profile instance of explanation control unit 211;
Figure 39 is a flow chart of describing the operation of image pick up equipment;
Figure 40 is the block diagram of profile instance of the facility for study of explanation learning parameter table;
Figure 41 is the block diagram of the profile instance of explanation position determination unit 535;
Figure 42 is a flow chart of describing the learning process of a kind of parameter list of study;
Figure 43 describes the flow chart that position determination unit 535 is carried out evaluation process;
Figure 44 is the block diagram that the profile instance of the 4th specific embodiment of using image pick up equipment of the present invention is described;
Figures 45A and 45B are block diagrams describing how the characteristics of sensor unit 601 change according to the control signal output by signal processing unit 604;
Figure 46 is the block diagram of the profile instance of sensor unit 601;
Figure 47 is the block diagram of the profile instance of explanation signal processing unit 604;
Figure 48 is the block diagram of first profile instance of explanation rank assessment unit 623;
Figure 49 is a flow chart of describing the operation of image pick up equipment;
Figure 50 is a flow chart of describing the processing of rank assessment unit 623;
Figure 51 is a flow chart of describing the operation of image pick up equipment;
Figure 52 is the block diagram of second profile instance of explanation rank assessment unit 623;
Figure 53 is the block diagram of the 3rd profile instance of explanation rank assessment unit 623;
Figure 54 is the block diagram of key diagram as the zone of conversion process target;
Figure 55 is the block diagram that the profile instance of the 5th specific embodiment of using sensing system of the present invention is described;
Figure 56 is the block diagram of first profile instance of explanation DRC circuit 802;
Figure 57 A and 57B are respectively the block diagrams of explanation classification tap and prediction tapped;
Figures 58A to 58C are block diagrams that illustrate the profile instance of a sensing system using the present invention;
Figure 59 A and 59B are the block diagrams of the position of description control condenser 852;
Figure 60 A and 60B are the block diagrams of interpretive classification code;
Figure 61 is a flow chart of describing the processing of DRC circuit 802;
Figure 62 is that description control information produces the flow chart of handling;
Figures 63A to 63C are block diagrams describing control of the position of condenser 852;
Figure 64 is that description control information produces the flow chart of handling;
Figure 65 is the block diagram of second profile instance of explanation DRC circuit 802;
Figure 66 A and 66B are the block diagrams of the dynamic range of interpretive classification tap;
Figure 67 is a flow chart of describing the processing of DRC circuit 802;
Figure 68 is that description control information produces the flow chart of handling;
Figure 69 is that description control information produces the flow chart of handling; And
Figure 70 is the block diagram of the 3rd profile instance of explanation DRC circuit 802.
Embodiment
Next, specific embodiments of the invention will be described.
First specific embodiment
Fig. 1 shows the profile instance of first specific embodiment of using image pick up equipment of the present invention.Please note that this image pick up equipment shown in Figure 1 can be for example digital still camera or video camera.Suppose that this image pick up equipment is a kind of digital camera here.
Sensor unit 1 comprises a plurality of photoelectric conversion elements corresponding to pixels, senses object light emitted from an object, and outputs a picture signal corresponding to that object light. That is to say, sensor unit 1 receives object light and provides signal adjustment unit 2 with a picture signal in the form of an electrical signal corresponding to the amount of light received.
Signal adjustment unit 2 performs correlated double sampling (CDS) to remove so-called reset noise included in the picture signal output by sensor unit 1, and provides the resulting picture signal to A/D converting unit 3. A/D converting unit 3 performs analog-to-digital conversion on the picture signal provided by signal adjustment unit 2, i.e., samples and quantizes the picture signal, and provides the resulting digital picture signal to signal processing unit 4.
Signal processing unit 4 takes the digital picture signal provided by A/D converting unit 3 (hereinafter simply referred to as the "picture signal") as a first picture signal, subjects the first picture signal to predetermined image conversion processing, and outputs the resulting digital picture signal to output unit 5 as a second picture signal.
Output unit 5 receives the second picture signal output by signal processing unit 4 and outputs it. That is to say, output unit 5 outputs the second picture signal from signal processing unit 4 via an unshown output terminal, or displays it on an unshown monitor. In addition, output unit 5 stores the second picture signal in an unshown recording medium such as an optical disc, magnetic disk, magneto-optical disk, magnetic tape, or semiconductor memory, or transmits the picture signal over a wired or wireless transmission medium such as a telephone line, the Internet, or a local area network (LAN).
With the image pickup device configured as described above, object light is received at sensor unit 1, and a picture signal in the form of an electrical signal corresponding to the received amount of light is provided to signal processing unit 4 via signal adjustment unit 2 and A/D converting unit 3. Signal processing unit 4 subjects this picture signal, serving as the first picture signal, to signal processing such as image conversion processing that improves picture quality by, for example, raising resolution, and provides the resulting second picture signal with improved picture quality to output unit 5. Output unit 5 outputs the second picture signal provided by signal processing unit 4.
At this time, sensor unit 1 is set to an attribute corresponding to the signal processing performed at signal processing unit 4; that is, its attribute is set so that sensor unit 1 outputs a picture signal suitable for the signal processing performed by signal processing unit 4.
More particularly, sensor unit 1 is a three-sensor (so-called three-chip) sensor device comprising, for example, three sensors for obtaining the R, G and B components of the picture signal (R photoreceptor unit 23R, G photoreceptor unit 23G and B photoreceptor unit 23B, described later). Accordingly, sensor unit 1 outputs, for each pixel, a picture signal having three components: an R signal, a G signal and a B signal. Further, information on the attribute of sensor unit 1 that makes the picture signal output from sensor unit 1 suitable for the signal processing performed by signal processing unit 4 is obtained in advance by learning, described later, and sensor unit 1 is set to that attribute. Specifically, the placement state of one or more of the three sensors of sensor unit 1 is set to a state in which sensor unit 1 outputs a picture signal suitable for the signal processing performed by signal processing unit 4, i.e., a placement state corresponding to that signal processing. The placement state of a sensor here includes the placement position of the sensor and the spatial angle (rotational state) of the sensor. Note, however, that for convenience of description, the present specific embodiment uses the placement position of the sensors of sensor unit 1 as the attribute of sensor unit 1. Of course, this does not mean that the spatial angle of the sensors cannot equally be used as the attribute of sensor unit 1.
As described above, sensor unit 1 is set to an attribute corresponding to the signal processing performed by signal processing unit 4, whereby sensor unit 1 outputs a picture signal suitable for that signal processing. Consequently, subjecting the picture signal to the signal processing at signal processing unit 4 makes it possible to obtain a high-quality image signal.
Fig. 2 is the block diagram of the profile instance of explanation signal processing unit 4 shown in Figure 1 and output unit 5;
Signal processing unit 4 comprises three signal processing units, 11R, 11G and 11B.Signal processing unit 11R receives first picture signal with R, G and B signal that A/D converting unit 3 provides, and makes this first picture signal stand signal processing, and the R signal (component) that obtains second picture signal by this outputs to output unit 5 with it.Signal processing unit 11G receives first picture signal with R, G and B signal that A/D converting unit 3 provides, and makes first picture signal stand signal processing, and the G signal (component) that obtains second picture signal by this outputs to output unit 5 with it.Signal processing unit 11B receives first picture signal with R, G and B signal that A/D converting unit 3 provides, and makes first picture signal stand signal processing, obtains the B signal (component) of second picture signal by this, then it is outputed to output unit 5.
Output unit 5 comprises output units 12R, 12G and 12B. Output units 12R, 12G and 12B receive and output the R signal, G signal and B signal, respectively, of the second picture signal output by signal processing units 11R, 11G and 11B. Note that hereinafter, where appropriate, signal processing units 11R, 11G and 11B will be referred to collectively or individually simply as "signal processing unit 11".
Next, Fig. 3 is a block diagram of the profile instance of sensor unit 1 shown in Figs. 1 and 2. Object light enters lens 21, and lens 21 condenses the object light, via prism 22, onto each of R photoreceptor unit 23R, G photoreceptor unit 23G and B photoreceptor unit 23B. That is to say, the object light entering lens 21 is emitted to prism 22. Prism 22 separates the object light from lens 21 into R, G and B light, and emits the R, G and B light in the directions of the positions where R photoreceptor unit 23R, G photoreceptor unit 23G and B photoreceptor unit 23B are respectively located.
R photoreceptor unit 23R, G photoreceptor unit 23G and B photoreceptor unit 23B are configured as photoelectric conversion devices such as photodiodes, which receive the R, G and B light from prism 22, produce an R signal, G signal and B signal in the form of electrical signals corresponding to the amounts of light received, and output these signals to signal adjustment unit 2.
An example of a device that can be used for R photoreceptor unit 23R, G photoreceptor unit 23G and B photoreceptor unit 23B is a CCD (charge coupled device). Note, however, that they are by no means limited to CCDs; CMOS sensors may be used instead, or HARP (High-gain Avalanche Rushing amorphous Photoconductor) image pickup tubes, which utilize the electron avalanche phenomenon occurring in an a-Se (amorphous selenium) photoconductive target.
With sensor unit 1 configured as described above, the placement positions of R photoreceptor unit 23R, G photoreceptor unit 23G and B photoreceptor unit 23B are set to positions from which RGB picture signals suitable for the signal processing performed by signal processing units 11R, 11G and 11B of signal processing unit 4 are output. That is to say, R photoreceptor unit 23R, G photoreceptor unit 23G and B photoreceptor unit 23B are placed at positions corresponding to the signal processing of signal processing unit 4. These placement positions corresponding to the signal processing of signal processing unit 4 are obtained in advance, for example, by performing learning as described later.
Here, for convenience of description, it is assumed that the placement position of the entire R photoreceptor unit 23R is set to a position obtained by learning, and likewise that the placement positions of the entire G photoreceptor unit 23G and the entire B photoreceptor unit 23B are set to positions obtained by learning. It should be noted, however, that the placement position may instead be obtained by learning for each individual pixel: using MEMS (micro electro mechanical systems) technology, R photoreceptor unit 23R can be realized as a photoreceptor unit in which the placement position of each pixel can essentially be changed (moved), with the placement position of each pixel obtained in advance by learning so as to correspond to the signal processing performed at signal processing unit 4. The same holds for G photoreceptor unit 23G and B photoreceptor unit 23B.
Next, Fig. 4 shows the profile instance of signal processing unit 11 shown in Fig. 2. The picture signal output by sensor unit 1 and passed through signal adjustment unit 2 and A/D converting unit 3 is provided, as the first picture signal, to signal processing units 11R, 11G and 11B.
Signal processing unit 11R comprises image conversion unit 31R.First picture signal that offers signal processing unit 11R is provided for image conversion unit 31R.Image conversion unit 31R makes first picture signal stand image transitions and handles, for example improving picture quality, and provide the R data image signal that improves picture quality as having of signal processing results R signal to output unit 5 as second picture signal by improving resolution.
Signal processing unit 11G comprises image conversion unit 31G.First picture signal that offers signal processing unit 11G is provided for image conversion unit 31G.Image conversion unit 31G makes first picture signal stand image transitions and handles, for example improving picture quality, and provide the G data image signal that improves picture quality as having of signal processing results G signal to output unit 5 as second picture signal by improving resolution.
Signal processing unit 11B comprises image conversion unit 31B and image storage unit 32B.First picture signal that offers signal processing unit 11B is provided for image conversion unit 31B.Image conversion unit 31B makes first picture signal stand image transitions and handles, for example improving picture quality, and provide the B data image signal that improves picture quality as having of signal processing results B signal to output unit 5 as second picture signal by improving resolution.
Note that image conversion units 31R, 31G and 31B have the same configuration, and hereinafter will be referred to collectively or individually as "image conversion unit 31" as appropriate.
Next, the operation of the image pickup device shown in Figs. 1 and 2 will be described with reference to the flowchart in Fig. 5.
First, in step S1, signal processing unit 4 obtains from sensor unit 1 the first picture signal to be subjected to signal processing. That is to say, in step S1, sensor unit 1 receives object light and performs photoelectric conversion, thereby obtaining a picture signal in the form of an electrical signal (i.e., imaging the object), and provides the picture signal to signal adjustment unit 2. Signal adjustment unit 2 subjects the picture signal provided by sensor unit 1 to CDS processing and then provides it to A/D converting unit 3. A/D converting unit 3 performs A/D conversion on the picture signal provided by signal adjustment unit 2 and provides it to signal processing unit 4 as the first picture signal, whereby signal processing unit 4 obtains the first picture signal. The flow then proceeds from step S1 to step S2.
In step S2, at signal processing unit 4, image conversion unit 31 (Fig. 4) of signal processing unit 11 subjects the first picture signal provided by A/D converting unit 3 to image conversion processing as signal processing, thereby producing a second picture signal with picture quality improved over the first picture signal, and the flow then proceeds to step S3.
In step S3, signal processing unit 11 outputs the second picture signal obtained in step S2 to output unit 5, thereby finishing the processing for one frame (or field). The image pickup device repeats the processing according to the flowchart in Fig. 5 until, for example, the user issues a command to stop image pickup.
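As an illustration only, the per-frame flow of steps S1 to S3 can be sketched as the following loop; the class and function names are hypothetical stand-ins for the blocks of Figs. 1 to 4 and are not defined by this specification.

```python
# Minimal sketch of the per-frame flow of steps S1-S3 (Fig. 5).
# All names (capture_object_light, cds, convert, emit, ...) are assumed
# placeholders for the blocks in Figs. 1-4, not part of the patent.

def process_one_frame(sensor, signal_adjuster, ad_converter, signal_processor, output):
    analog = sensor.capture_object_light()                 # S1: photoelectric conversion
    adjusted = signal_adjuster.cds(analog)                 # S1: correlated double sampling
    first_image = ad_converter.convert(adjusted)           # S1: first picture signal
    second_image = signal_processor.convert(first_image)   # S2: image conversion processing
    output.emit(second_image)                              # S3: output one frame (or field)

def run(camera_blocks, stop_requested):
    # Repeat until the user requests that image pickup stop.
    while not stop_requested():
        process_one_frame(*camera_blocks)
```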
As described above, the placement positions of R photoreceptor unit 23R, G photoreceptor unit 23G and B photoreceptor unit 23B of sensor unit 1 (Fig. 3) are set to positions from which RGB picture signals suitable for the signal processing performed by signal processing unit 4 (signal processing units 11R, 11G and 11B) are output. That is to say, R photoreceptor unit 23R, G photoreceptor unit 23G and B photoreceptor unit 23B are placed at positions corresponding to the signal processing of signal processing unit 4. Accordingly, a picture signal suitable for the signal processing at signal processing unit 4 is output from sensor unit 1, so a high-quality image can be obtained by subjecting that picture signal to the signal processing.
Next, Fig. 6 is a block diagram of the profile instance of image conversion unit 31 shown in Fig. 4. Image conversion unit 31 subjects the first picture signal supplied to it to image conversion processing, and outputs the second picture signal obtained by that image conversion processing.
Here, if, for example, the first picture signal is a low-resolution picture signal and the second picture signal is a high-resolution picture signal, the image conversion processing can be said to be resolution enhancement processing. If, for example, the first picture signal is a low-S/N (signal-to-noise ratio) picture signal and the second picture signal is a high-S/N picture signal, the image conversion processing can be said to be noise removal processing. Further, if, for example, the first picture signal is a picture signal of a predetermined size and the second picture signal is a picture signal larger or smaller than the first picture signal, the image conversion processing can be said to be image scaling (enlargement or reduction) processing.
In image conversion unit 31, the first picture signal to be subjected to image conversion processing is provided to prediction tap extraction unit 121 and feature extraction unit 122.
Prediction tap extraction unit 121 sequentially takes the pixels constituting the second picture signal as the pixel of interest, and extracts several pixels (their pixel values) constituting the first picture signal as a prediction tap. Specifically, prediction tap extraction unit 121 extracts from the first picture signal, as the prediction tap, a plurality of pixels that are spatially or temporally close to the pixel of the first picture signal corresponding to the pixel of interest (for example, the pixel of the first picture signal closest in space and time to the pixel of interest). Prediction tap extraction unit 121 then provides the prediction tap for the pixel of interest to computing unit 125.
Feature extraction unit 122 uses the first picture signal to extract a feature of the pixel of interest, and provides this feature to class classification unit 123. An example of a feature of the pixel of interest that can be used is the level distribution of the pixel values of a plurality of pixels spatially or temporally close to the pixel of the first picture signal corresponding to the pixel of interest.
Class classification unit 123 performs class classification that assigns the pixel of interest to one of a plurality of classes according to the feature of the pixel of interest from feature extraction unit 122, and provides factor output unit 124 with a class code corresponding to the resulting class. That is to say, when the feature of the pixel of interest is represented by a scalar value, class classification unit 123 outputs, as the class code, the scalar value itself or a value obtained by quantizing the scalar value. When the feature of the pixel of interest is represented by a vector value made up of a plurality of components, class classification unit 123 outputs, as the class code, a value obtained by quantizing the vector value or a value obtained by ADRC (adaptive dynamic range coding).
In K-bit ADRC, for example, the maximum value MAX and minimum value MIN of the components of the vector value representing the feature of the pixel of interest are detected, DR = MAX - MIN is used as the local dynamic range of the set, and the components of the feature of the pixel of interest are re-quantized to K bits based on this dynamic range DR. That is to say, the minimum value MIN is subtracted from each component of the feature of the pixel of interest, and the difference is divided (quantized) by DR/2^K. A bit string in which the resulting K-bit components of the feature of the pixel of interest are arranged in a predetermined order is then output as the ADRC code. Accordingly, when the vector value representing the feature of the pixel of interest is subjected to 1-bit ADRC processing, each component of the feature of the pixel of interest is divided by the average of the maximum value MAX and minimum value MIN (with the fractional part dropped), whereby each component becomes one bit (i.e., is binarized). The bit string in which these 1-bit components are arranged in a predetermined order is output as the ADRC code. Class classification unit 123 outputs, as the class code, the ADRC code obtained, for example, by subjecting the feature of the pixel of interest to such ADRC processing.
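As a minimal sketch, K-bit ADRC class-code generation over a feature vector might look as follows; the function name and the use of NumPy are assumptions for illustration, not part of this specification.

```python
import numpy as np

def adrc_class_code(feature, k_bits=1):
    """Sketch of K-bit ADRC class-code generation (assumed helper).
    `feature` is the vector of components, e.g. pixel values around
    the pixel of interest."""
    feature = np.asarray(feature, dtype=np.float64)
    mn, mx = feature.min(), feature.max()
    dr = max(mx - mn, 1.0)                        # local dynamic range DR = MAX - MIN
    # Re-quantize each component to K bits based on DR.
    levels = np.floor((feature - mn) / (dr / (1 << k_bits)))
    levels = np.clip(levels, 0, (1 << k_bits) - 1).astype(np.int64)
    # Arrange the K-bit components in a fixed order into one class code.
    code = 0
    for q in levels:
        code = (code << k_bits) | int(q)
    return code

# Example: a 4-pixel feature quantized with 1-bit ADRC.
print(adrc_class_code([12, 200, 190, 15]))        # 0b0110 -> 6
```

With k_bits=1 this reduces to comparing each component against the midpoint (MAX+MIN)/2, as described above for 1-bit ADRC.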
Factor output unit 124 stores tap factors for each class and, from the stored tap factors, outputs to computing unit 125 the tap factor of the class indicated by the class code provided by class classification unit 123. Note that a tap factor here corresponds to a factor that is multiplied by input data at a so-called tap of a digital filter.
Computing unit 125 obtains the prediction tap output by prediction tap extraction unit 121 and the tap factor output by factor output unit 124, and performs predetermined prediction calculation using the prediction tap and the tap factor to obtain a predicted value of the true value of the pixel of interest. Computing unit 125 thereby obtains and outputs the pixel value of the pixel of interest (more precisely, a predicted value thereof), i.e., the pixel value of a pixel constituting the second picture signal.
Next, Fig. 7 is a block diagram of the profile instance of factor output unit 124 shown in Fig. 6. In Fig. 7, factor output unit 124 comprises factor memory 181. Factor memory 181 stores the tap factor of each class obtained in advance by learning, described later. When a class code is provided from class classification unit 123, factor memory 181 reads out the tap factor of that class code and provides it to computing unit 125.
Next, the prediction calculation performed at computing unit 125 shown in Fig. 6, and the learning of the tap factors stored in factor memory 181 shown in Fig. 7 and used in that prediction calculation, will be described.
Suppose that a high image quality picture signal serves as the second picture signal and that the first picture signal is a low image quality picture signal obtained by filtering the high image quality picture signal with an LPF (low pass filter) so as to reduce its resolution. Consider the case where a prediction tap is extracted from the low image quality picture signal, and the pixel value of a high image quality pixel is obtained from the prediction tap and tap factors by predetermined prediction calculation.
If, for example, linear prediction calculation is used as the predetermined prediction calculation, the pixel value y of a high image quality pixel is obtained by the following linear expression.
$$y = \sum_{n=1}^{N} w_n x_n \qquad (1)$$
where $x_n$ represents the pixel value of the n-th pixel of the low image quality picture signal constituting the prediction tap for the high image quality pixel y (such a pixel is hereinafter called a "low image quality pixel" where appropriate), and $w_n$ represents the n-th tap factor, which is multiplied by the pixel value of the n-th low image quality pixel. Note that in expression (1) the prediction tap comprises N low image quality pixels $x_1, x_2, \ldots, x_N$.
The pixel value y of the high image quality pixel may also be obtained by a second-order or higher-order expression rather than the linear expression of expression (1).
With $y_k$ denoting the true value of the pixel value of the high image quality pixel of the k-th sample, and $y_k'$ denoting the predicted value of that true value $y_k$ obtained from expression (1), the prediction error $e_k$ is given by the following expression.
$$e_k = y_k - y_k' \qquad (2)$$
Here, the predicted value $y_k'$ in expression (2) is obtained according to expression (1); substituting expression (1) for $y_k'$ in expression (2) therefore yields the following expression.
$$e_k = y_k - \left( \sum_{n=1}^{N} w_n x_{n,k} \right) \qquad (3)$$
Note that in expression formula (3) x N, kExpression constitutes n low image quality pixel of the prediction tapped of relevant k high image quality pixel of sampling.
A tap factor $w_n$ that makes the prediction error $e_k$ in expression (3) (or expression (2)) zero is optimal for predicting the high image quality pixel, but it is generally very difficult to obtain such a tap factor for all high image quality pixels. Accordingly, if the least squares method, for example, is adopted as the standard for the optimal tap factor $w_n$, the optimal tap factor $w_n$ can be obtained by minimizing the sum of squared errors E expressed by the following expression.
$$E = \sum_{k=1}^{K} e_k^2 \qquad (4)$$
where K represents the number of samples (the number of learning samples) of sets of a high image quality pixel $y_k$ and the low image quality pixels $x_{1,k}, x_{2,k}, \ldots, x_{N,k}$ constituting the prediction tap for that high image quality pixel $y_k$.
The minimum value of the sum of squared errors E in expression (4) is given by the $w_n$ for which the partial derivative of E with respect to the tap factor $w_n$ is zero, as shown in expression (5).

$$\frac{\partial E}{\partial w_n} = e_1\frac{\partial e_1}{\partial w_n} + e_2\frac{\partial e_2}{\partial w_n} + \cdots + e_K\frac{\partial e_K}{\partial w_n} = 0 \qquad (n = 1, 2, \ldots, N) \qquad (5)$$
Partially differentiating expression (3) above with respect to $w_n$ gives the following expression.

$$\frac{\partial e_k}{\partial w_1} = -x_{1,k}, \quad \frac{\partial e_k}{\partial w_2} = -x_{2,k}, \quad \ldots, \quad \frac{\partial e_k}{\partial w_N} = -x_{N,k} \qquad (k = 1, 2, \ldots, K) \qquad (6)$$
From expressions (5) and (6), the following expression is obtained.

$$\sum_{k=1}^{K} e_k x_{1,k} = 0, \quad \sum_{k=1}^{K} e_k x_{2,k} = 0, \quad \ldots, \quad \sum_{k=1}^{K} e_k x_{N,k} = 0 \qquad (7)$$
Substituting expression (3) for $e_k$ in expression (7) allows expression (7) to be expressed as the normal equation of expression (8).
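The body of expression (8) is not reproduced in this text; based on the left-side matrix components $\sum_k x_{n,k} x_{n',k}$ and right-side vector components $\sum_k x_{n,k} y_k$ described below, it presumably takes the standard least-squares form:

$$
\begin{pmatrix}
\sum_k x_{1,k} x_{1,k} & \sum_k x_{1,k} x_{2,k} & \cdots & \sum_k x_{1,k} x_{N,k} \\
\sum_k x_{2,k} x_{1,k} & \sum_k x_{2,k} x_{2,k} & \cdots & \sum_k x_{2,k} x_{N,k} \\
\vdots & \vdots & \ddots & \vdots \\
\sum_k x_{N,k} x_{1,k} & \sum_k x_{N,k} x_{2,k} & \cdots & \sum_k x_{N,k} x_{N,k}
\end{pmatrix}
\begin{pmatrix} w_1 \\ w_2 \\ \vdots \\ w_N \end{pmatrix}
=
\begin{pmatrix}
\sum_k x_{1,k} y_k \\ \sum_k x_{2,k} y_k \\ \vdots \\ \sum_k x_{N,k} y_k
\end{pmatrix}
\qquad (8)
$$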
The normal equation of expression (8) can be solved for the tap factors $w_n$ using, for example, a sweep-out method (Gauss-Jordan elimination). Solving the normal equation of expression (8) for each class makes it possible to obtain, for each class, the optimal tap factor $w_n$ (here, the tap factor that minimizes the sum of squared errors E).
Next, Fig. 8 illustrates the profile instance of a facility for study that performs learning to obtain the tap factor $w_n$ of each class by solving the normal equation of expression (8) for each class.
A study picture signal used for learning the tap factors $w_n$ is input to the facility for study. An example of the study picture signal is a high-resolution, high image quality picture signal. The study picture signal is provided to tutor data generating unit 131 and student data generation unit 133 of the facility for study.
Tutor data generating unit 131 generates tutor data from the study picture signal supplied to it and provides the tutor data to tutor data storage unit 132. That is to say, here, tutor data generating unit 131 provides the high image quality picture signal serving as the study picture signal, without change, to tutor data storage unit 132 as tutor data. Tutor data storage unit 132 stores the high image quality picture signal provided by tutor data generating unit 131 as tutor data.
Student data generation unit 133 generates student data from the study picture signal and provides the student data to student data storage unit 134. That is to say, student data generation unit 133 filters the high image quality picture signal serving as the study picture signal so as to reduce its resolution, thereby generating a low image quality picture signal, and provides this to student data storage unit 134 as student data. Student data storage unit 134 stores the student data from student data generation unit 133.
Prediction tap extraction unit 135 sequentially takes, as the tutor pixel of interest, the pixels constituting the high image quality picture signal stored as tutor data in tutor data storage unit 132, extracts predetermined pixels from the low image quality pixels constituting the low image quality picture signal stored as student data in student data storage unit 134, thereby configuring a prediction tap having the same tap structure as the prediction tap configured by prediction tap extraction unit 121 shown in Fig. 6, and provides the prediction tap to adder unit 138.
For the tutor pixel of interest, feature extraction unit 136, in the same manner as feature extraction unit 122 shown in Fig. 6, extracts the feature of the tutor pixel of interest using the low image quality pixels constituting the low image quality picture signal stored as student data in student data storage unit 134, and provides the feature to class classification unit 137.
Based on the feature of the tutor pixel of interest output by feature extraction unit 136, class classification unit 137 performs the same class classification as class classification unit 123 shown in Fig. 6, and outputs the class code corresponding to the resulting class to adder unit 138.
The class code for the tutor pixel of interest output by class classification unit 137 is provided to adder unit 138. Adder unit 138 reads the tutor pixel of interest from tutor data storage unit 132 and, for each class code provided by class classification unit 137, performs addition on the tutor pixel of interest and the student data constituting the prediction tap, provided by prediction tap extraction unit 135, for that tutor pixel of interest.
That is to say, adder unit 138 is provided with the tutor data $y_k$ stored in tutor data storage unit 132, the prediction tap $x_{n,k}$ output by prediction tap extraction unit 135, and the class code output by class classification unit 137.
For each class corresponding to a class code provided by class classification unit 137, adder unit 138 performs, using the prediction tap (student data) $x_{n,k}$, calculations equivalent to the multiplications of one piece of student data by another ($x_{n,k} x_{n',k}$) and the summation ($\Sigma$) in the matrix on the left side of expression (8).
Likewise, for each class corresponding to a class code provided by class classification unit 137, adder unit 138 performs, using the prediction tap (student data) $x_{n,k}$ and the tutor data $y_k$, calculations equivalent to the multiplications of the student data by the tutor data ($x_{n,k} y_k$) and the summation ($\Sigma$) in the vector on the right side of expression (8).
That is to say, adder unit 138 stores in its internal memory (not shown) the component ($\Sigma x_{n,k} x_{n',k}$) of the left-side matrix and the component ($\Sigma x_{n,k} y_k$) of the right-side vector of expression (8) obtained for the tutor data previously taken as the tutor pixel of interest, and adds to that matrix component or vector component the corresponding products $x_{n,k} x_{n',k}$ or $x_{n,k} y_k$ calculated, from the tutor data $y_k$ and student data $x_{n,k}$, for the tutor data newly taken as the tutor pixel of interest (i.e., it performs the additions represented by the summations in expression (8)).
Adder unit 138 performs this addition with all the tutor data stored in tutor data storage unit 132 taken in turn as the tutor pixel of interest, so as to construct, for each class, the normal equation given by expression (8), and then outputs the normal equations to tap factor computing unit 139. Tap factor computing unit 139 solves the normal equation provided by adder unit 138 for each class, thereby obtaining and outputting the optimal tap factor $w_n$ for each class.
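As an illustrative sketch only (this specification prescribes no implementation), the per-class accumulation of the normal equations and their solution might look like the following; NumPy and the data-structure assumptions are not part of the patent.

```python
import numpy as np

def learn_tap_factors(samples, num_classes, tap_len):
    """Sketch of the learning of Figs. 8-9 under assumed data structures:
    `samples` yields (class_code, prediction_tap, tutor_value) triples,
    where prediction_tap is a length-`tap_len` vector of student pixels."""
    # Per-class left-side matrix (sum of x x^T) and right-side vector (sum of x y).
    A = np.zeros((num_classes, tap_len, tap_len))
    b = np.zeros((num_classes, tap_len))
    counts = np.zeros(num_classes, dtype=np.int64)

    for class_code, tap, y in samples:                    # the addition of expression (8)
        x = np.asarray(tap, dtype=np.float64)
        A[class_code] += np.outer(x, x)
        b[class_code] += x * y
        counts[class_code] += 1

    taps = np.zeros((num_classes, tap_len))
    default = np.zeros(tap_len)                           # assumed default tap factor
    for c in range(num_classes):
        if counts[c] >= tap_len:                          # enough samples to solve (8)
            taps[c] = np.linalg.lstsq(A[c], b[c], rcond=None)[0]
        else:
            taps[c] = default                             # too few normal equations
    return taps
```

The per-class fallback mirrors the note below that a default tap factor may be output for classes with too few normal equations.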
Factor memory 181 shown in Fig. 7 stores, for each class, the tap factor $w_n$ obtained by the facility for study shown in Fig. 8.
Note that with the above arrangement, the study picture signal is used without change as tutor data corresponding to the second picture signal, the low image quality picture signal obtained by reducing the resolution of the study picture signal is used as student data corresponding to the first picture signal, and learning of the tap factors is performed accordingly. The tap factors obtained therefore realize image conversion processing that converts the first picture signal into a second picture signal with improved resolution, i.e., resolution enhancement processing.
Depending on how the picture signal serving as student data corresponding to the first picture signal and the picture signal serving as tutor data corresponding to the second picture signal are selected, tap factors for various types of image conversion processing can be obtained.
That is to say, for example, with an arrangement in which a high image quality picture signal is used as tutor data and a picture signal obtained by adding noise to that high image quality picture signal is used as student data, performing the learning processing yields tap factors for image conversion processing that converts the first picture signal into a second picture signal from which the noise contained therein has been removed (or reduced), i.e., noise removal processing.
Likewise, for example, with an arrangement in which a given picture signal is used as tutor data and a picture signal produced by reducing the number of pixels of the tutor-data picture signal is used as student data, or in which a given picture signal is used as student data and a picture signal produced by reducing the number of pixels of the student-data picture signal at a predetermined reduction ratio is used as tutor data, performing the learning processing yields tap factors for image conversion processing that converts the first picture signal into an enlarged or reduced second picture signal, i.e., scaling processing. In addition, preparing appropriate picture signals as tutor data and student data makes it possible to obtain tap factors for various other image conversion processing, such as conversion of the number of pixels or conversion of the aspect ratio.
Next, the processing performed by the facility for study shown in Fig. 8, i.e., the learning processing, will be described with reference to the flowchart in Fig. 9.
First, in step S51, tutor data generating unit 131 and student data generation unit 133 generate tutor data and student data, respectively, from the study picture signal and output them. That is to say, tutor data generating unit 131 outputs the study picture signal without change as tutor data. Student data generation unit 133 filters the study picture signal with an LPF having a predetermined cutoff frequency, thereby generating and outputting, for each frame of the tutor data (the study picture signal), the corresponding student data.
The tutor data output by tutor data generating unit 131 is provided to and stored in tutor data storage unit 132, and the student data output by student data generation unit 133 is provided to and stored in student data storage unit 134.
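A minimal sketch of the kind of student-data generation described for step S51 follows; a simple box filter is an assumed stand-in for the unspecified LPF, and none of the names come from this specification.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def make_student_frame(tutor_frame, kernel=3):
    """Sketch of step S51: the tutor (study) frame is used unchanged,
    and a resolution-reduced copy is produced as the student frame by
    low-pass filtering. The uniform box filter is an assumed stand-in
    for the LPF with a predetermined cutoff frequency."""
    tutor = np.asarray(tutor_frame, dtype=np.float64)
    student = uniform_filter(tutor, size=kernel, mode="nearest")
    return tutor, student
```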
Subsequently, the flow proceeds to step S52, in which prediction tap extraction unit 135 takes, from the tutor data stored in tutor data storage unit 132, a tutor pixel that has not yet been taken as the tutor pixel of interest. Also in step S52, prediction tap extraction unit 135 configures the prediction tap for the tutor pixel of interest from the student data stored in student data storage unit 134 and provides it to adder unit 138, and the flow then proceeds to step S53.
In step S53, feature extraction unit 136 extracts the feature of the tutor pixel of interest using the student data stored in student data storage unit 134 and provides it to class classification unit 137, and the flow then proceeds to step S54.
In step S54, class classification unit 137 performs class classification of the tutor pixel of interest based on the feature of the tutor pixel of interest from feature extraction unit 136, outputs the class code corresponding to the resulting class to adder unit 138, and the flow then proceeds to step S55.
In step S55, adder unit 138 reads the tutor pixel of interest from tutor data storage unit 132 and, for the class code provided by class classification unit 137, performs the addition of expression (8) on the tutor pixel of interest and the prediction tap, configured for that tutor pixel of interest, provided by prediction tap extraction unit 135; the flow then proceeds to step S56.
In step S56, it is determined whether tutor data that has not yet been taken as the tutor pixel of interest remains stored in tutor data storage unit 132. If it is determined in step S56 that such tutor data remains, the flow returns to step S52 with that tutor data, and the same processing is repeated. If, on the other hand, it is determined in step S56 that no tutor data remains that has not been taken as the tutor pixel of interest, adder unit 138 provides tap factor computing unit 139 with the left-side matrix and right-side vector of expression (8) obtained so far for each class, and the flow proceeds to step S57.
In step S57, tap factor computing unit 139 solves, for each class provided by adder unit 138, the normal equation of that class made up of the left-side matrix and right-side vector of expression (8), thereby obtaining and outputting the tap factor $w_n$ for each class, and the processing ends.
Note that there may be classes for which the number of normal equations required to obtain the tap factor cannot be obtained, for example, because the number of study picture signals is insufficient; for such classes, tap factor computing unit 139 may be arranged to output, for example, a default tap factor.
Factor memory 181 shown in Fig. 7 stores, for each class, the tap factors obtained as described above. Note, however, that factor memory 181 of image conversion unit 31R shown in Fig. 4 stores tap factors obtained by performing learning using only the R signal of the picture signal as tutor data and the R, G and B signals of the picture signal as student data. Likewise, factor memory 181 of image conversion unit 31G shown in Fig. 4 stores tap factors obtained by performing learning using only the G signal of the picture signal as tutor data and the R, G and B signals of the picture signal as student data, and factor memory 181 of image conversion unit 31B shown in Fig. 4 stores tap factors obtained by performing learning using only the B signal of the picture signal as tutor data and the R, G and B signals of the picture signal as student data.
Next, the image conversion processing performed by image conversion unit 31 shown in Fig. 6 will be described with reference to the flowchart in Fig. 10. Note that the image conversion processing described with reference to Fig. 10 is the processing performed in step S2 of Fig. 5.
In step S61, prediction tap extraction unit 121 takes, as the pixel of interest, a pixel constituting the second picture signal that has not yet been taken as the pixel of interest, and extracts, as the prediction tap, several pixels constituting the first picture signal (in fact, their pixel values) used to predict the pixel value of the pixel of interest of the second picture signal, and the flow then proceeds to step S62.
In step S62, feature extraction unit 122 extracts the feature of the pixel of interest using the first picture signal and provides the feature to class classification unit 123, and the flow then proceeds to step S63. In step S63, class classification unit 123 performs class classification processing that assigns the pixel of interest to one of a plurality of classes based on the feature of the pixel of interest provided by feature extraction unit 122, provides the class code corresponding to the resulting class to factor output unit 124, and the flow then proceeds to step S64. In step S64, factor output unit 124 reads out the tap factor of the class indicated by the class code provided by class classification unit 123 and outputs it to computing unit 125, and the flow then proceeds to step S65.
In step S65, computing unit 125 performs the calculation of expression (1) using the prediction tap provided by prediction tap extraction unit 121 and the tap factor of the class of the pixel of interest output by factor output unit 124, thereby obtaining the pixel value of the pixel of interest.
Image conversion unit 31 performs the processing of steps S61 to S65 with every pixel of one screen (i.e., one frame or one field) of the second picture signal taken as the pixel of interest, and the flow then returns.
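Purely as an illustrative sketch of the per-pixel flow of steps S61 to S65 (not an implementation defined by this specification), and assuming the 1-bit ADRC helper sketched earlier plus hypothetical tap-extraction helpers:

```python
import numpy as np

def convert_image(first_image, tap_factors, extract_prediction_tap,
                  extract_feature, out_height, out_width):
    """Sketch of Fig. 10 / steps S61-S65. `tap_factors[class_code]` is the
    learned tap-factor vector for that class; `extract_prediction_tap` and
    `extract_feature` are assumed helpers that gather first-picture-signal
    pixels around the position corresponding to the pixel of interest."""
    second_image = np.zeros((out_height, out_width))
    for y in range(out_height):                              # S61: pixel of interest
        for x in range(out_width):
            tap = extract_prediction_tap(first_image, y, x)  # S61: prediction tap
            feature = extract_feature(first_image, y, x)     # S62: feature
            class_code = adrc_class_code(feature)            # S63: class classification
            w = tap_factors[class_code]                       # S64: read tap factor
            second_image[y, x] = float(np.dot(w, tap))        # S65: expression (1)
    return second_image
```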
Next, Fig. 11 illustrates a profile instance of the layout of an image pickup device serving as a facility for study that learns attribute information of sensor unit 1, i.e., information on the placement state of R photoreceptor unit 23R, G photoreceptor unit 23G and B photoreceptor unit 23B corresponding to the signal processing performed by signal processing unit 4, such that sensor unit 1 outputs a picture signal suitable for the signal processing performed by signal processing unit 4 shown in Fig. 1. Sensor unit 231 has a plurality of photoelectric conversion devices corresponding to pixels, detects object light projected onto it, and outputs a picture signal corresponding to that object light. That is to say, sensor unit 231 receives object light, obtains a picture signal in the form of an electrical signal corresponding to the amount of light received, and provides it to signal adjustment units 232 and 238.
Note that, as described below, sensor unit 231 is arranged so as to obtain both a picture signal equivalent to that obtained by sensor unit 1 shown in Fig. 1 (hereinafter also referred to as the "normal image signal" where appropriate) and a high-quality image signal (hereinafter also referred to as the "evaluation image signal" where appropriate) used by assessment unit 235, described later, which is comparable to the second picture signal output by signal processing unit 4 shown in Fig. 1. Sensor unit 231 provides the normal image signal to signal adjustment unit 232 and the evaluation image signal to signal adjustment unit 238.
Meanwhile, a control signal from controller 240 is provided to sensor unit 231. The attribute of sensor unit 231 changes according to the control signal provided by controller 240, and sensor unit 231 obtains the normal image signal corresponding to the object light with the changed attribute.
Like signal adjustment unit 2 in Fig. 1, signal adjustment unit 232 subjects the normal image signal output by sensor unit 231 to CDS processing and provides the resulting picture signal to A/D converting unit 233.
Like A/D converting unit 3 in Fig. 1, A/D converting unit 233 subjects the normal image signal provided by signal adjustment unit 232 to A/D conversion, i.e., samples and quantizes the normal image signal, and provides the resulting digital picture signal to signal processing unit 234 as the first picture signal.
Signal processing unit 234 is configured in the same manner as signal processing unit 4 shown in Fig. 2, and subjects the first picture signal from A/D converting unit 233 to the image conversion processing, i.e., the signal processing, described with reference to Figs. 6 to 10, thereby obtaining the second picture signal (R, G and B signals) and providing it to assessment unit 235.
Assessment unit 235 is provided with the second picture signal from signal processing unit 234, the control signal that controller 240 provides for controlling the attribute of sensor unit 231, and the evaluation image signal from A/D converting unit 239. Assessment unit 235 evaluates the second picture signal provided by signal processing unit 234 using the evaluation image signal provided by A/D converting unit 239, and associates the resulting assessment with the attribute of sensor unit 231 represented by the control signal, i.e., with attribute information representing the attribute sensor unit 231 had when the first picture signal was obtained that was subjected to the signal processing at signal processing unit 234 to produce the assessed second picture signal. Assessment unit 235 then provides the set of the assessment and the attribute information to position determination unit 236.
Based on the sets of assessments and attribute information provided by assessment unit 235, position determination unit 236 determines the attribute of sensor unit 231 at which a first picture signal suitable for the signal processing of signal processing unit 234 is obtained, i.e., the attribute of sensor unit 231 corresponding to the signal processing at signal processing unit 234 (and hence the attribute of sensor unit 1 corresponding to the signal processing of signal processing unit 4 shown in Fig. 1), and provides attribute information identifying that attribute to position storage unit 237. Position storage unit 237 stores the attribute information provided by position determination unit 236.
Like signal adjustment unit 232, signal adjustment unit 238 subjects the evaluation image signal output by sensor unit 231 to CDS processing and provides the resulting evaluation image signal to A/D converting unit 239. Note, however, that whereas signal adjustment unit 232 processes the normal image signal, signal adjustment unit 238 processes the evaluation image signal, whose picture quality is higher than that of the normal image signal; signal adjustment unit 238 therefore has a higher capability than signal adjustment unit 232 so as to preserve the picture quality of the evaluation image signal.
Like A/D converting unit 233, A/D converting unit 239 subjects the evaluation image signal provided by signal adjustment unit 238 to A/D conversion, i.e., samples and quantizes the evaluation image signal, and provides the resulting digital evaluation image signal to assessment unit 235. Note, however, that whereas A/D converting unit 233 processes the normal image signal, A/D converting unit 239 processes the evaluation image signal, whose picture quality is higher than that of the normal image signal; A/D converting unit 239 therefore has a higher capability than A/D converting unit 233 (for example, a greater number of quantization bits or a higher sampling frequency) so as to preserve the picture quality of the evaluation image signal.
Controller 240 provides sensor unit 231 and assessment unit 235 with a control signal for controlling the attribute of sensor unit 231.
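A rough sketch of the placement-learning loop of Fig. 11 under assumed interfaces follows: for each candidate placement the sensor attribute is controlled, a normal image is captured and converted, the result is compared against the evaluation image, and the best-scoring placement is kept. All helper names, and the use of mean squared error as the assessment, are assumptions for illustration only.

```python
import numpy as np

def learn_best_placement(candidate_placements, capture_normal, capture_evaluation,
                         signal_process):
    """Sketch of the Fig. 11 learning device. `capture_normal(placement)` and
    `capture_evaluation()` stand in for the two outputs of sensor unit 231,
    and `signal_process` for signal processing unit 234; none are defined
    by this specification."""
    best_placement, best_score = None, np.inf
    reference = np.asarray(capture_evaluation(), dtype=np.float64)   # evaluation image signal
    for placement in candidate_placements:           # controller 240 sets the attribute
        first_image = capture_normal(placement)      # normal image signal (first picture signal)
        second_image = signal_process(first_image)   # second picture signal to be assessed
        score = np.mean((np.asarray(second_image, dtype=np.float64) - reference) ** 2)
        if score < best_score:                       # role of position determination unit 236
            best_placement, best_score = placement, score
    return best_placement                            # stored by position storage unit 237
```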
Next, Fig. 12 illustrates the profile instance of sensor unit 231 shown in Fig. 11. Object light is projected into lens 251; lens 251 condenses the object light, via half mirror 252 and prism 253, onto evaluation R photoreceptor unit 255R, evaluation G photoreceptor unit 255G and evaluation B photoreceptor unit 255B, and at the same time, via half mirror 252 and prism 254, onto R photoreceptor unit 256R, G photoreceptor unit 256G and B photoreceptor unit 256B.
That is to say, the object light projected into lens 251 is emitted to half mirror 252. Half mirror 252 reflects part of the object light from lens 251 toward prism 253 and transmits the remainder to prism 254.
Prism 253 separates the object light from half mirror 252 into R, G and B light, and emits the R, G and B light in the directions of the positions where evaluation R photoreceptor unit 255R, evaluation G photoreceptor unit 255G and evaluation B photoreceptor unit 255B are respectively located. Evaluation R photoreceptor unit 255R, evaluation G photoreceptor unit 255G and evaluation B photoreceptor unit 255B are provided with photoelectric conversion elements such as photodiodes, receive the R, G and B light from prism 253, and obtain an R signal, G signal and B signal in the form of electrical signals corresponding to the amounts of light received. The picture signal made up of the R signal, G signal and B signal is then output to signal adjustment unit 238 (Fig. 11) as the evaluation image signal. Examples of evaluation R photoreceptor unit 255R, evaluation G photoreceptor unit 255G and evaluation B photoreceptor unit 255B include CCDs, CMOS sensors, HARPs and the like.
Prism 254 separates the object light from lens 251, transmitted through half mirror 252, into R, G and B light, and emits the R, G and B light in the directions of the positions where R photoreceptor unit 256R, G photoreceptor unit 256G and B photoreceptor unit 256B are respectively located. R photoreceptor unit 256R, G photoreceptor unit 256G and B photoreceptor unit 256B are provided with photoelectric conversion elements such as photodiodes, receive the R, G and B light from prism 254, and obtain an R signal, G signal and B signal in the form of electrical signals corresponding to the amounts of light received. The picture signal made up of the R signal, G signal and B signal is then output to signal adjustment unit 232 (Fig. 11) as the normal image signal. Examples of R photoreceptor unit 256R, G photoreceptor unit 256G and B photoreceptor unit 256B include CCDs, CMOS sensors, HARPs and the like. However, R photoreceptor unit 256R, G photoreceptor unit 256G and B photoreceptor unit 256B preferably have the same characteristics as R photoreceptor unit 23R, G photoreceptor unit 23G and B photoreceptor unit 23B shown in Fig. 3.
R control unit 257R, G control unit 257G and B control unit 257B control movement of the placement positions of R photoreceptor unit 256R, G photoreceptor unit 256G and B photoreceptor unit 256B, respectively, according to the control signal provided by controller 240 (Fig. 11), thereby changing the attribute of sensor unit 231.
Now, for convenience of description, we suppose that at this R control unit 257R controls the placement location of whole R photoreceptor unit 256R, and G control unit 257G controls the placement location of whole G photoreceptor unit 256G, and B control unit 257B controls the placement location of whole B photoreceptor unit 256B.Yet, it should be noted that and for example use the MEMS technology, allow the placement location of the pixel of R photoreceptor unit 256R can be changed (moving) basically, can control the placement location of the single pixel of R photoreceptor unit 256R by this independently.This is for G photoreceptor unit 256G and G control unit 257G, and B photoreceptor unit 256B and B control unit 257B set up equally.
Equally, assessment R photoreceptor unit 255R, assessment G photoreceptor unit 255G and assessment B photoreceptor unit 255B have than R photoreceptor unit 256R, the G photoreceptor unit 256G and the high characteristic of B photoreceptor unit 256B that obtain normal image signal, owing to will obtain high-quality evaluate image signal by this.That is to say, assessment R photoreceptor unit 255R, assessment G photoreceptor unit 255G and assessment B photoreceptor unit 255B for example have than R photoreceptor unit 256R, G photoreceptor unit 256G and wide dynamic range, more quantity or the pixel of B photoreceptor unit 256B, or the like.
Next, the control performed on each of the R photoreceptor unit 256R, the G photoreceptor unit 256G, and the B photoreceptor unit 256B by the R control unit 257R, the G control unit 257G, and the B control unit 257B shown in Fig. 12 will be described with reference to Figs. 13A to 13D.
The R photoreceptor unit 256R shown in Fig. 13A, the G photoreceptor unit 256G shown in Fig. 13B, and the B photoreceptor unit 256B shown in Fig. 13C each have pixels (corresponding to photodiodes or the like) of a finite area, and each outputs an image signal corresponding to the amount of light received at each pixel. Note that in Figs. 13A to 13D each pixel is drawn as a square whose sides have a finite length.
Here, the position of each pixel of the R photoreceptor unit 256R, the G photoreceptor unit 256G, and the B photoreceptor unit 256B is represented by the center of gravity of the square forming the pixel, and the pixels of the R photoreceptor unit 256R, the G photoreceptor unit 256G, and the B photoreceptor unit 256B are represented by dots, circles, and crosses (X), respectively. From the viewpoint of manufacturing an image pickup apparatus such as a video camera or a still camera, the positions of the corresponding pixels of the R photoreceptor unit 256R, the G photoreceptor unit 256G, and the B photoreceptor unit 256B are normally made to match optically. That is, the R photoreceptor unit 256R, the G photoreceptor unit 256G, and the B photoreceptor unit 256B are all disposed at optically equivalent positions so that the R, G, and B components of the light are received by corresponding pixels.
The R control unit 257R, the G control unit 257G, and the B control unit 257B move the placement positions of the R photoreceptor unit 256R, the G photoreceptor unit 256G, and the B photoreceptor unit 256B, respectively, in accordance with the control signal supplied from the controller 240 (Fig. 11); these placement positions constitute the attribute of the sensor unit 231. That is, the placement positions of the R photoreceptor unit 256R, the G photoreceptor unit 256G, and the B photoreceptor unit 256B are not fixed but movable, so the corresponding pixels of the R photoreceptor unit 256R, the G photoreceptor unit 256G, and the B photoreceptor unit 256B in the sensor unit 231 need not be at optically identical positions.
As shown in Fig. 13D, taking the pixel positions of the R photoreceptor unit 256R (the dots in Figs. 13A and 13D) as a reference, the offsets of the pixel positions of the G photoreceptor unit 256G (the circles in Figs. 13B and 13D) in the horizontal and vertical directions are denoted by Ph_G and Pv_G, and the offsets of the pixel positions of the B photoreceptor unit 256B (the crosses in Figs. 13C and 13D) in the horizontal and vertical directions are denoted by Ph_B and Pv_B.
The R control unit 257R, the G control unit 257G, and the B control unit 257B move the placement positions of the R photoreceptor unit 256R, the G photoreceptor unit 256G, and the B photoreceptor unit 256B in accordance with the control signal supplied from the controller 240 so as to realize the offsets Ph_G, Pv_G, Ph_B, and Pv_B.
In this case, an arrangement may be made in which, for example, the position of the R photoreceptor unit 256R is fixed and only the G photoreceptor unit 256G and the B photoreceptor unit 256B are moved. Alternatively, one photoreceptor unit other than the R photoreceptor unit 256R may be fixed while the remaining two are moved, or all of the R photoreceptor unit 256R, the G photoreceptor unit 256G, and the B photoreceptor unit 256B may be movable.
Also, in the sensor unit 231 shown in Fig. 12, the placement positions of the R photoreceptor unit 256R, the G photoreceptor unit 256G, and the B photoreceptor unit 256B can be moved, whereas the pixel positions of the evaluation R photoreceptor unit 255R, the evaluation G photoreceptor unit 255G, and the evaluation B photoreceptor unit 255B optically coincide. That is, the evaluation R photoreceptor unit 255R, the evaluation G photoreceptor unit 255G, and the evaluation B photoreceptor unit 255B are all disposed at optically equivalent positions so that the R, G, and B components of the light are received by corresponding pixels.
Fig. 14 illustrates a configuration example of the evaluation unit 235 shown in Fig. 11. The evaluation unit 235 includes an image storage unit 261, a correlation calculating unit 262, and an evaluation value storage unit 263. The image storage unit 261 stores the evaluation image signal that the sensor unit 231 obtains from the object light and supplies via the signal adjusting unit 238 and the A/D converting unit 239.
The correlation calculating unit 262 evaluates, using the evaluation image signal stored in the image storage unit 261, the second image signal that the signal processing unit 234 obtains from the normal image signal corresponding to that evaluation image signal. Specifically, the correlation calculating unit 262 obtains the correlation between the second image signal supplied from the signal processing unit 234 and the evaluation image signal stored in the image storage unit 261, and supplies that correlation value to the evaluation value storage unit 263 as an evaluation value of the second image signal, i.e., as the evaluation result of the signal processing performed by the signal processing unit 234.
Here, an example of the correlation between a frame (or field) of the second image signal and a frame (or field) of the evaluation image signal is the reciprocal of the sum of the absolute differences between some or all of the pixels located at the same positions in the second image signal and the evaluation image signal.
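A minimal sketch of this correlation measure follows, assuming both frames are given as two-dimensional arrays of pixel values of the same size; the small eps guard against division by zero is an illustrative assumption, not part of the patent text:

    import numpy as np

    def frame_correlation(second_frame, evaluation_frame, eps=1e-12):
        # Sum of absolute pixel differences taken at the same positions, then its reciprocal.
        sad = np.abs(second_frame.astype(np.float64) - evaluation_frame.astype(np.float64)).sum()
        return 1.0 / (sad + eps)  # larger value means the two frames agree more closely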
In addition to the evaluation values of the second image signals from the correlation calculating unit 262, the evaluation value storage unit 263 is supplied with the control signal output by the controller 240. The control signal output by the controller 240 represents the attribute of the sensor unit 231 in effect when the first image signal used to obtain the second image signal being evaluated by the correlation calculating unit 262 was acquired, i.e., the placement positions set by the R control unit 257R, the G control unit 257G, and the B control unit 257B shown in Fig. 12. The evaluation value storage unit 263 stores the evaluation value of the second image signal from the correlation calculating unit 262 in association with that placement position. When, for each of a plurality of predetermined placement positions of the R photoreceptor unit 256R, the G photoreceptor unit 256G, and the B photoreceptor unit 256B (hereinafter referred to simply as "set positions"), the evaluation values of the second image signals of a plurality of frames have been stored, the evaluation value storage unit 263 supplies these evaluation values to the position determining unit 236 (Fig. 11).
Note that the control signal output by the controller 240 shown in Fig. 11 for controlling the placement positions of the R photoreceptor unit 256R, the G photoreceptor unit 256G, and the B photoreceptor unit 256B of the sensor unit 231 (Fig. 12) indicates the offsets Ph_G and Pv_G of the pixel positions of the G photoreceptor unit 256G in the horizontal and vertical directions and the offsets Ph_B and Pv_B of the pixel positions of the B photoreceptor unit 256B in the horizontal and vertical directions (these offsets Ph_G, Pv_G, Ph_B, and Pv_B are also referred to collectively as an "offset P").
Next, Fig. 15 illustrates a configuration example of the position determining unit 236 shown in Fig. 11.
The evaluation value integrating unit 271 is supplied, from the evaluation value storage unit 263 of the evaluation unit 235, with the evaluation values of the second image signals of the plurality of frames obtained for each of the plurality of set positions. The evaluation value integrating unit 271 integrates, for each of the plurality of set positions, the evaluation values of the second image signals obtained for that set position, and supplies the evaluation value obtained by the integration (hereinafter also referred to as the "integrated evaluation value" where appropriate) to the optimal position determining unit 272.
On the basis of the integrated evaluation values of the plurality of set positions supplied from the evaluation value integrating unit 271, the optimal position determining unit 272 determines the set position suited to the signal processing performed by the signal processing unit 234, i.e., determines the placement positions of the R photoreceptor unit 256R, the G photoreceptor unit 256G, and the B photoreceptor unit 256B shown in Fig. 12, and supplies that placement position to the position storage unit 237 (Fig. 11) as the optimal position for the signal processing performed by the signal processing unit 234 (and by the signal processing unit 4).
Next, the learning processing for determining the optimal position, performed by the apparatus shown in Fig. 11, will be described with reference to the flowchart shown in Fig. 16.
First, in step S201, the controller 240 selects, from among the plurality of set positions, one set position of interest as the placement position of the R photoreceptor unit 256R, the G photoreceptor unit 256G, and the B photoreceptor unit 256B of the sensor unit 231 shown in Fig. 12, and supplies a control signal representing the set position of interest to the evaluation value storage unit 263 of the evaluation unit 235 shown in Fig. 14. Also in step S201, the controller 240 supplies the control signal representing the set position of interest to the R control unit 257R, the G control unit 257G, and the B control unit 257B of the sensor unit 231, which move the placement positions of the R photoreceptor unit 256R, the G photoreceptor unit 256G, and the B photoreceptor unit 256B to the set position of interest; the flow then proceeds to step S202.
In step S202, the signal processing unit 234 acquires the image signal output by the sensor unit 231. That is, in step S202 the sensor unit 231 receives the object light and performs photoelectric conversion, thereby obtaining image signals as electrical signals (i.e., imaging the object), and supplies the image signals to the signal adjusting units 232 and 238. The signal adjusting unit 232 subjects the image signal supplied from the sensor unit 231 to CDS processing and supplies the result to the A/D converting unit 233. The A/D converting unit 233 A/D converts the image signal supplied from the signal adjusting unit 232 and supplies it to the signal processing unit 234 as the first image signal. Meanwhile, the signal adjusting unit 238 subjects the image signal supplied from the sensor unit 231 to CDS processing and supplies the result to the A/D converting unit 239. The A/D converting unit 239 A/D converts the image signal supplied from the signal adjusting unit 238 and supplies it to the evaluation unit 235 as the evaluation image signal.
That is, in the sensor unit 231, the R photoreceptor unit 256R, the G photoreceptor unit 256G, and the B photoreceptor unit 256B placed at the set position of interest obtain the normal image signal corresponding to the incident object light. This normal image signal is supplied to the signal processing unit 234 via the signal adjusting unit 232 and the A/D converting unit 233.
Further, in the sensor unit 231, the evaluation R photoreceptor unit 255R, the evaluation G photoreceptor unit 255G, and the evaluation B photoreceptor unit 255B obtain the evaluation image signal corresponding to the same incident object light. This evaluation image signal is supplied to the evaluation unit 235 via the signal adjusting unit 238 and the A/D converting unit 239, and is stored in the image storage unit 261 of the evaluation unit 235 shown in Fig. 14.
The flow then proceeds from step S202 to step S203, where the signal processing unit 234 subjects the first image signal supplied via the A/D converting unit 233 to the same image conversion processing as the signal processing performed by the signal processing unit 4, thereby obtains a second image signal whose image quality is improved over that of the first image signal, supplies the second image signal to the evaluation unit 235, and the flow proceeds to step S204.
In step S204, the evaluation unit 235 performs evaluation processing for evaluating the second image signal supplied from the signal processing unit 234, and the flow then proceeds to step S205. That is, in the evaluation unit 235 shown in Fig. 14, the correlation calculating unit 262 reads, from the evaluation image signals stored in the image storage unit 261, the evaluation image signal of interest, i.e., the evaluation image signal obtained from the same object light as the normal image signal used to obtain the second image signal supplied from the signal processing unit 234. The correlation calculating unit 262 then obtains the correlation between the second image signal supplied from the signal processing unit 234 and the evaluation image signal of interest, and supplies that correlation to the evaluation value storage unit 263 as the evaluation value of the second image signal.
The evaluation value storage unit 263 associates the evaluation value of the second image signal just supplied from the correlation calculating unit 262 with the set position of interest indicated by the control signal supplied from the controller 240 in the preceding step S201, and stores the evaluation value in association with the set position of interest.
In step S205, the controller 240 determines whether evaluation values for the set position of interest have been obtained for a predetermined number of frames. If it is determined in step S205 that evaluation values for the set position of interest have not yet been obtained for the predetermined number of frames, the flow returns to step S202, where the sensor unit 231 again receives the incident object light and performs photoelectric conversion to obtain image signals as electrical signals, and the same processing is repeated.
If it is determined in step S205 that evaluation values for the set position of interest have been obtained for each of the predetermined number of frames, the flow proceeds to step S206, where the controller 240 determines whether all of the plurality of set positions have been taken as the set position of interest.
If it is determined in step S206 that not all of the plurality of set positions have yet been taken as the set position of interest, the flow returns to step S201, where the controller 240 newly selects, as the set position of interest, one of the set positions that has not yet been taken as the set position of interest, and the same processing is repeated.
If it is determined in step S206 that all of the plurality of set positions have been taken as the set position of interest, i.e., if an evaluation value has been obtained for each of the predetermined number of frames for each of the plurality of set positions and stored in the evaluation value storage unit 263 of the evaluation unit 235 shown in Fig. 14, the evaluation value storage unit 263 supplies the evaluation values obtained for each of the predetermined number of frames for each of the plurality of set positions to the position determining unit 236, and the flow proceeds to step S207.
In step S207, the evaluation value integrating unit 271 of the position determining unit 236 shown in Fig. 15 integrates, for each of the plurality of set positions, the evaluation values obtained for each of the predetermined number of frames for that set position, thereby obtains an integrated evaluation value for each of the plurality of set positions, and supplies the integrated evaluation values to the optimal position determining unit 272. That is, for a given set position of interest, the evaluation value integrating unit 271 obtains, for example, the average, the maximum, or the minimum of the evaluation values obtained for the predetermined number of frames as the integrated evaluation value of that set position, and supplies it to the optimal position determining unit 272.
The flow proceeds from step S207 to step S208, where the optimal position determining unit 272 determines, on the basis of the integrated evaluation values of the plurality of set positions supplied from the evaluation value integrating unit 271, the set position suited to the signal processing performed by the signal processing unit 234, i.e., the placement positions of the R photoreceptor unit 256R, the G photoreceptor unit 256G, and the B photoreceptor unit 256B in the sensor unit 231 shown in Fig. 12.
That is, in step S208 the optimal position determining unit 272 obtains, for example, the maximum of the integrated evaluation values of the plurality of set positions supplied from the evaluation value integrating unit 271, and determines the set position corresponding to that maximum integrated evaluation value as the set position suited to the signal processing performed by the signal processing unit 234, i.e., as the optimal placement position of the R photoreceptor unit 23R, the G photoreceptor unit 23G, and the B photoreceptor unit 23B shown in Fig. 3 for the signal processing performed by the signal processing unit 4.
Further, in step S208, the optimal position determining unit 272 stores information representing the optimal position (corresponding to the attribute information described above) in the position storage unit 237, and the processing ends.
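The learning processing of Fig. 16 amounts to a nested loop over set positions and frames. A minimal sketch is given below; the helper functions capture_frames() and image_conversion(), the representation of each set position as a hashable offset tuple, and the use of the average as the integrated evaluation value are illustrative assumptions standing in for the sensor unit 231, the signal processing unit 234, and the integration choices named above:

    from statistics import mean

    def learn_optimal_position(set_positions, num_frames, capture_frames,
                               image_conversion, frame_correlation):
        integrated = {}
        for position in set_positions:                 # steps S201 and S206
            values = []
            for _ in range(num_frames):                # steps S202 to S205
                first, evaluation = capture_frames(position)  # normal and evaluation image signals
                second = image_conversion(first)       # step S203
                values.append(frame_correlation(second, evaluation))  # step S204
            integrated[position] = mean(values)        # step S207 (average taken as one example)
        return max(integrated, key=integrated.get)     # step S208: largest integrated evaluation value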
In the sensor unit 1, the R photoreceptor unit 23R, the G photoreceptor unit 23G, and the B photoreceptor unit 23B (Fig. 3) are placed at the optimal position represented by the information stored in the position storage unit 237 by the above learning processing. Accordingly, the sensor unit 1 can obtain an image signal suited to the signal processing, and subjecting the image signal obtained from the sensor unit 1 to the signal processing in the signal processing unit 4 makes it possible to obtain an image signal of high image quality.
In the above example, the second image signal is evaluated by obtaining the correlation between the second image signal and the evaluation image signal; note, however, that the evaluation may instead be performed on the basis of, for example, the S/N of the second image signal. Furthermore, the evaluation of the second image signal may be input from outside. That is, an arrangement may be made in which the second image signal is displayed and a user viewing the displayed image inputs, for example, an evaluation of the second image signal.
Next, the above-described series of processing performed by the signal processing units 4 and 234, the evaluation unit 235, the position determining unit 236, the controller 240, and so forth can be carried out by dedicated hardware or by software. When the series of processing is carried out by software, a program constituting the software is installed in a microcomputer, a general-purpose computer, or the like.
Fig. 17 illustrates a configuration example of a computer in which the program for executing the above-described series of processing is installed.
The program can be stored in advance on a hard disk 305 or in a ROM 303 serving as recording media built into the computer. Alternatively, the program can be temporarily or permanently stored on a removable recording medium 311 such as a flexible disk, a CD-ROM (Compact Disc Read-Only Memory), an MO (Magneto-Optical) disc, a DVD (Digital Versatile Disc), a magnetic disk, or a semiconductor memory. Such a removable recording medium 311 can be provided as so-called packaged software.
In addition to installing the program in the computer from such a removable recording medium 311, the program can be transferred to the computer wirelessly from a download site via a satellite for digital satellite broadcasting, or transferred to the computer by wire via a network such as a LAN (Local Area Network) or the Internet; the computer receives the program thus transferred with a communication unit 308 and installs it on the built-in hard disk 305.
The computer incorporates a CPU (Central Processing Unit) 302. An input/output interface 310 is connected to the CPU 302 via a bus 301, and when a command is input by a user operating an input unit 307 composed of a keyboard, a mouse, a microphone, and the like, via the input/output interface 310, the CPU 302 executes a program stored in the ROM (Read-Only Memory) 303 in accordance with the command. Alternatively, the CPU 302 loads into a RAM (Random Access Memory) 304 and executes a program stored on the hard disk 305, a program transferred from a satellite or a network, received by the communication unit 308, and installed on the hard disk 305, or a program read from the removable recording medium 311 mounted in a drive 309 and installed on the hard disk 305. The CPU 302 thus performs the processing according to the above-described flowcharts or the processing performed by the configurations of the above-described block diagrams. Then, as necessary, the CPU 302 outputs the processing result from an output unit 306 composed of an LCD (Liquid Crystal Display), a speaker, and the like via the input/output interface 310, transmits it from the communication unit 308, or records it on the hard disk 305, for example.
Note that, in the present specification, the processing steps describing the program for causing the computer to perform the various kinds of processing need not necessarily be processed in time series in the order described in the flowcharts, and may include processing executed in parallel or individually (for example, parallel processing or object-oriented processing). Also, the program may be processed by a single computer or by a plurality of computers in a distributed manner. Further, the program may be transferred to and executed by a remote computer.
Note that the signal processing units 4 and 234 may perform, in addition to the above-described image conversion processing, other processing for obtaining the second image signal, such as subjecting the first image signal to digital clamp processing, white balance adjustment processing, gamma correction processing, linear interpolation processing, and the like.
Also, although the present embodiments have been described with the sensor units 1 and 231 configured as so-called three-sensor devices, the sensor units 1 and 231 may instead use a single-sensor, two-sensor, or four-or-more-sensor system.
Further, the present embodiments have been described with respect to the sensor units 1 and 231, which sense light and output image signals corresponding to that light; however, an arrangement may be made in which the sensor units 1 and 231 are microphones that sense sound and output audio signals corresponding to that sound, or sensors that sense other kinds of information, such as temperature or acceleration, and output signals corresponding to that information. Note, however, that the signal processing performed at the stages following the sensor units 1 and 231 differs according to the kind of information sensed.
Also, besides the placement states of the R photoreceptor unit 23R, the G photoreceptor unit 23G, and the B photoreceptor unit 23B, examples of the attributes of the sensor unit 1 (and likewise of the sensor unit 231) include the placement positions of on-chip lenses for condensing light onto the pixels, the amplification factor of the voltage (current) corresponding to the charge stored in each pixel, and the like.
Second Embodiment
Next, a second embodiment of the present invention will be described.
Fig. 18 illustrates a configuration example of a second embodiment of an image pickup apparatus to which the present invention is applied. The image pickup apparatus shown in Fig. 18 can be, for example, a digital still camera or a digital video camera.
Like the sensor unit 1 shown in Fig. 1, the sensor unit 401 includes a plurality of photoelectric conversion elements corresponding to pixels; it senses the object light incident thereon and outputs, to the signal adjusting unit 402, an image signal in the form of an electrical signal corresponding to the amount of received light. However, unlike the sensor unit 1 shown in Fig. 1, the sensor unit 401 changes its state in accordance with a control signal supplied from the signal processing unit 404.
The signal adjusting unit 402 performs CDS processing to remove reset noise contained in the image signal output by the sensor unit 401, and supplies the resulting image signal to the A/D converting unit 403. The A/D converting unit 403 A/D converts the image signal supplied from the signal adjusting unit 402, i.e., samples and quantizes the image signal, and supplies the resulting digital image signal to the signal processing unit 404.
The signal processing unit 404 takes the digital image signal supplied from the A/D converting unit 403 (hereinafter simply referred to as the "image signal") as a first image signal, subjects the first image signal to predetermined image conversion processing, and outputs the resulting digital image signal to the output unit 405 as a second image signal. The signal processing unit 404 also evaluates the second image signal obtained as a result of the processing and supplies a control signal corresponding to the evaluation to the sensor unit 401, thereby controlling the state of the sensor unit 401.
The output unit 405 receives the second image signal output by the signal processing unit 404 and outputs it. That is, the output unit 405 outputs the second image signal from the signal processing unit 404 from an output terminal (not shown) or displays it on a monitor (not shown). The output unit 405 may also record the second image signal on a recording medium (not shown) such as an optical disc, a magnetic disk, a magneto-optical disc, a magnetic tape, or a semiconductor memory, or transmit it via a wired or wireless transmission medium such as a telephone line, the Internet, or a LAN.
In the image pickup apparatus configured as described above, the sensor unit 401 receives the object light, and an image signal in the form of an electrical signal corresponding to the amount of received light is supplied to the signal processing unit 404 via the signal adjusting unit 402 and the A/D converting unit 403. The signal processing unit 404 subjects the image signal supplied from the sensor unit 401 via the signal adjusting unit 402 and the A/D converting unit 403, as the first image signal, to signal processing such as image conversion processing that improves the image quality by, for example, improving the resolution, and outputs the resulting second image signal of improved image quality to the output unit 405. The output unit 405 outputs the second image signal supplied from the signal processing unit 404.
The signal processing unit 404 also evaluates the second image signal obtained by subjecting the first image signal from the sensor unit 401 to the image conversion processing. Furthermore, the signal processing unit 404 supplies a control signal corresponding to the evaluation to the sensor unit 401, thereby controlling the state of the sensor unit 401.
The sensor unit 401 changes its state in accordance with the control signal supplied from the signal processing unit 404, and outputs the image signal obtained in the changed state.
The sensor unit 401 is, for example, a three-sensor imaging device that includes three sensors for obtaining the R, G, and B components of the image signal (an R photoreceptor unit 423R, a G photoreceptor unit 423G, and a B photoreceptor unit 423B, described later). Accordingly, the sensor unit 401 outputs, for each pixel, an image signal having the three components of the R signal, the G signal, and the B signal. The sensor unit 401 changes the placement state of one or more of the three sensors in accordance with the control signal supplied from the signal processing unit 404; the sensor placement state of the sensor unit 401 is thus controlled by the control signal supplied from the signal processing unit 404. Here, the sensor placement state includes the placement positions of the sensors as well as their orientation (rotational state). Note, however, that for convenience the present embodiment will be described in terms of controlling the placement positions of the sensors of the sensor unit 401 with the control signal from the signal processing unit 404; the orientation of the sensors can be controlled in the same manner.
Fig. 19 illustrates a configuration example of the signal processing unit 404 and the output unit 405 shown in Fig. 18. The signal processing unit 404 includes three signal processing units 411R, 411G, and 411B. The signal processing unit 411R receives the first image signal, composed of R, G, and B signals, supplied from the A/D converting unit 403, subjects the first image signal to image conversion processing, thereby obtains the R signal (component) of the second image signal, and outputs the R signal to the output unit 405. The signal processing unit 411G receives the first image signal, composed of R, G, and B signals, supplied from the A/D converting unit 403, subjects the first image signal to image conversion processing, thereby obtains the G signal (component) of the second image signal, and outputs the G signal to the output unit 405. The signal processing unit 411G also evaluates the G signal of the second image signal and controls the placement state of each sensor of the sensor unit 401 in accordance with the evaluation. The signal processing unit 411B receives the first image signal, composed of R, G, and B signals, supplied from the A/D converting unit 403, subjects the first image signal to image conversion processing, thereby obtains the B signal (component) of the second image signal, and outputs the B signal to the output unit 405.
Note that, here, the signal processing unit 411G is arranged to evaluate the G signal of the second image signal and to control the placement state of the sensors of the sensor unit 401 in accordance with that evaluation. However, the sensor unit 401 may instead be controlled by evaluating either the R signal or the B signal of the second image signal rather than the G signal, or by evaluating two or more of the R, G, and B signals of the second image signal.
The output unit 405 includes output units 412R, 412G, and 412B. The output units 412R, 412G, and 412B receive and output the R signal, the G signal, and the B signal of the second image signal output by the signal processing units 411R, 411G, and 411B, respectively. Note that, hereinafter, the signal processing units 411R, 411G, and 411B will be referred to collectively or individually as the "signal processing unit 411" where appropriate.
Next, Fig. 20 illustrates a configuration example of the sensor unit 401 shown in Figs. 18 and 19. Object light enters the lens 421, and the lens 421 focuses the object light onto each of the R photoreceptor unit 423R, the G photoreceptor unit 423G, and the B photoreceptor unit 423B via the prism 422. That is, the object light entering the lens 421 is emitted to the prism 422. The prism 422 separates the object light from the lens 421 into R, G, and B light, and emits the R, G, and B light in the directions of the positions where the R photoreceptor unit 423R, the G photoreceptor unit 423G, and the B photoreceptor unit 423B are respectively disposed.
The R photoreceptor unit 423R, the G photoreceptor unit 423G, and the B photoreceptor unit 423B are configured as photoelectric conversion elements, such as photodiodes, which receive the R, G, and B light from the prism 422, thereby generate an R signal, a G signal, and a B signal in the form of electrical signals corresponding to the amounts of received light, and output these signals to the signal adjusting unit 402.
One example of a device that can be used for the R photoreceptor unit 423R, the G photoreceptor unit 423G, and the B photoreceptor unit 423B is a CCD (Charge Coupled Device). However, the R photoreceptor unit 423R, the G photoreceptor unit 423G, and the B photoreceptor unit 423B are not limited to CCDs and may instead be CMOS sensors, HARP devices, or the like.
The R control unit 424R, the G control unit 424G, and the B control unit 424B each control movement of the placement positions of the R photoreceptor unit 423R, the G photoreceptor unit 423G, and the B photoreceptor unit 423B in accordance with the control signal supplied from the signal processing unit 411G.
Here, for convenience of description, it is assumed that the placement position of the entire R photoreceptor unit 423R is set by the R control unit 424R, and likewise that the placement positions of the entire G photoreceptor unit 423G and the entire B photoreceptor unit 423B are set by the G control unit 424G and the B control unit 424B, respectively. Note, however, that by using MEMS technology, the R photoreceptor unit 423R can be configured so that the placement position of each of its pixels can be changed (moved) substantially independently, in which case the R control unit 424R can control the placement position of each individual pixel of the R photoreceptor unit 423R. The same applies to the G photoreceptor unit 423G and the G control unit 424G, and to the B photoreceptor unit 423B and the B control unit 424B.
Next, the control performed on each of the R photoreceptor unit 423R, the G photoreceptor unit 423G, and the B photoreceptor unit 423B by the R control unit 424R, the G control unit 424G, and the B control unit 424B shown in Fig. 20 will be described with reference to Figs. 21A to 21D.
The R photoreceptor unit 423R shown in Fig. 21A, the G photoreceptor unit 423G shown in Fig. 21B, and the B photoreceptor unit 423B shown in Fig. 21C each have pixels (corresponding to photodiodes or the like) of a finite area, and each outputs an image signal (pixel values) corresponding to the amount of light received at each pixel. Note that in Figs. 21A to 21D each pixel is drawn as a square whose sides have a finite length.
Here, the position of each pixel of the R photoreceptor unit 423R, the G photoreceptor unit 423G, and the B photoreceptor unit 423B is represented by the center of gravity of the square forming the pixel, and the pixels of the R photoreceptor unit 423R, the G photoreceptor unit 423G, and the B photoreceptor unit 423B are represented by dots, circles, and crosses (X), respectively. In manufacturing an image pickup apparatus such as a video camera or a still camera, the positions of the corresponding pixels of the R photoreceptor unit 423R, the G photoreceptor unit 423G, and the B photoreceptor unit 423B are normally made to match optically. That is, the R photoreceptor unit 423R, the G photoreceptor unit 423G, and the B photoreceptor unit 423B are all disposed at optically equivalent positions so that the R, G, and B components of the light are received by corresponding pixels.
The R control unit 424R, the G control unit 424G, and the B control unit 424B move the placement positions of the R photoreceptor unit 423R, the G photoreceptor unit 423G, and the B photoreceptor unit 423B, respectively, in accordance with the control signal supplied from the signal processing unit 411G (Fig. 19). That is, the placement positions of the R photoreceptor unit 423R, the G photoreceptor unit 423G, and the B photoreceptor unit 423B are not fixed but movable, so the corresponding pixels of the R photoreceptor unit 423R, the G photoreceptor unit 423G, and the B photoreceptor unit 423B in the sensor unit 401 need not be at optically identical positions.
As shown in Fig. 21D, taking the pixel positions of the R photoreceptor unit 423R (the dots in Figs. 21A and 21D) as a reference, the offsets of the pixel positions of the G photoreceptor unit 423G (the circles in Figs. 21B and 21D) in the horizontal and vertical directions are denoted by Ph_G and Pv_G, and the offsets of the pixel positions of the B photoreceptor unit 423B (the crosses in Figs. 21C and 21D) in the horizontal and vertical directions are denoted by Ph_B and Pv_B.
The R control unit 424R, the G control unit 424G, and the B control unit 424B move the placement positions of the R photoreceptor unit 423R, the G photoreceptor unit 423G, and the B photoreceptor unit 423B in accordance with the control signal supplied from the signal processing unit 411G so as to realize the offsets Ph_G, Pv_G, Ph_B, and Pv_B.
In this case, an arrangement may be made in which, for example, the position of the R photoreceptor unit 423R is fixed and only the G photoreceptor unit 423G and the B photoreceptor unit 423B are moved. Alternatively, one photoreceptor unit other than the R photoreceptor unit 423R may be fixed while the remaining two are moved, or all of the R photoreceptor unit 423R, the G photoreceptor unit 423G, and the B photoreceptor unit 423B may be movable.
Next, Fig. 22 illustrates a configuration example of the signal processing unit 411 shown in Fig. 19. The signal processing units 411R, 411G, and 411B are each supplied, as the first image signal, with the image signal output from the sensor unit 401 via the signal adjusting unit 402 and the A/D converting unit 403.
The signal processing unit 411R includes an image conversion unit 431R and an image storage unit 432R. The first image signal supplied to the signal processing unit 411R is supplied to the image conversion unit 431R. The image conversion unit 431R subjects the first image signal to image conversion processing that improves the image quality by, for example, improving the resolution, and supplies, to the image storage unit 432R, the R signal of the resulting digital image signal of improved image quality as the second image signal.
The image storage unit 432R temporarily stores the second image signal supplied from the image conversion unit 431R. Further, the image storage unit 432R reads a second image signal from among the stored second image signals in accordance with selection information, supplied from the evaluation unit 433 of the signal processing unit 411G, for selecting an image, and supplies the read second image signal to the output unit 405.
The signal processing unit 411G includes an image conversion unit 431G, an image storage unit 432G, and an evaluation unit 433. The first image signal supplied to the signal processing unit 411G is supplied to the image conversion unit 431G. The image conversion unit 431G subjects the first image signal to image conversion processing that improves the image quality by, for example, improving the resolution, and supplies, to the image storage unit 432G and the evaluation unit 433, the G signal of the resulting digital image signal of improved image quality as the second image signal.
The image storage unit 432G temporarily stores the second image signal supplied from the image conversion unit 431G. Further, the image storage unit 432G reads a second image signal from among the stored second image signals in accordance with the selection information, supplied from the evaluation unit 433 of the signal processing unit 411G, for selecting an image, and supplies the read second image signal to the output unit 405.
The evaluation unit 433 evaluates the G signal of the second image signal supplied from the image conversion unit 431G, and supplies a control signal corresponding to the evaluation to the sensor unit 401, thereby controlling the placement positions of the R photoreceptor unit 423R, the G photoreceptor unit 423G, and the B photoreceptor unit 423B of the sensor unit 401 (Fig. 20). Furthermore, the evaluation unit 433 supplies selection information corresponding to the evaluation of the second image signal to the image storage unit 432G, and also supplies the same selection information to the image storage unit 432R of the signal processing unit 411R and the image storage unit 432B of the signal processing unit 411B.
The signal processing unit 411B includes an image conversion unit 431B and an image storage unit 432B. The first image signal supplied to the signal processing unit 411B is supplied to the image conversion unit 431B. The image conversion unit 431B subjects the first image signal to image conversion processing that improves the image quality by, for example, improving the resolution, and supplies, to the image storage unit 432B, the B signal of the resulting digital image signal of improved image quality as the second image signal.
The image storage unit 432B temporarily stores the second image signal supplied from the image conversion unit 431B. Further, the image storage unit 432B reads a second image signal from among the stored second image signals in accordance with the selection information, supplied from the evaluation unit 433 of the signal processing unit 411G, for selecting an image, and supplies the read second image signal to the output unit 405.
Note that the image conversion units 431R, 431G, and 431B have the same configuration and will therefore be referred to collectively or individually as the "image conversion unit 431" where appropriate. Likewise, the image storage units 432R, 432G, and 432B have the same configuration and will be referred to collectively or individually as the "image storage unit 432" where appropriate.
Next, Fig. 23 illustrates a configuration example of the evaluation unit 433 shown in Fig. 22. The evaluation unit 433 includes a storage unit 441, a correlation calculating unit 442, a determination evaluation unit 443, and a control signal output unit 444, and evaluates the image quality of the G signal of the second image signal supplied from the image conversion unit 431G.
More specifically, the storage unit 441 temporarily stores the second image signal supplied from the image conversion unit 431G. The correlation calculating unit 442 calculates the correlation between the second image signal previously supplied from the image conversion unit 431G and the second image signal currently supplied from the image conversion unit 431G, and supplies the correlation value obtained as the calculation result to the determination evaluation unit 443.
The determination evaluation unit 443 evaluates the second image signal output by the image conversion unit 431G on the basis of the correlation value supplied from the correlation calculating unit 442, and obtains an evaluation result to the effect that the image quality of the second image signal is high or that it is low. The determination evaluation unit 443 supplies the evaluation result of the second image signal to the control signal output unit 444, and outputs the selection information to the image storage units 432R, 432G, and 432B shown in Fig. 22 in accordance with the evaluation result.
The control signal output unit 444 supplies, to the R control unit 424R, the G control unit 424G, and the B control unit 424B of the sensor unit 401, a control signal for controlling the placement positions of the R photoreceptor unit 423R, the G photoreceptor unit 423G, and the B photoreceptor unit 423B of the sensor unit 401 (Fig. 20), the control signal corresponding to the evaluation result of the second image signal from the determination evaluation unit 443. The placement positions of the R photoreceptor unit 423R, the G photoreceptor unit 423G, and the B photoreceptor unit 423B are thereby controlled.
In the evaluation unit 433 configured as described above, the storage unit 441 continuously stores the second image signals supplied from the image conversion unit 431G. When the correlation calculating unit 442 receives a new second image signal supplied from the image conversion unit 431G, it calculates the correlation between that second image signal and the second image signal previously supplied from the image conversion unit 431G and stored in the storage unit 441.
Here, an example of the correlation between two frames (or fields) of the second image signal is the reciprocal of the sum of the absolute differences between some or all of the pixels located at the same positions in the two image signals.
The correlation calculating unit 442 supplies the obtained correlation value to the determination evaluation unit 443. The determination evaluation unit 443 evaluates the second image signal output by the image conversion unit 431G on the basis of the correlation value supplied from the correlation calculating unit 442, and obtains an evaluation result to the effect that the image quality of the second image signal is high or that it is low. If the determination evaluation unit 443 obtains an evaluation result to the effect that the image quality of the second image signal is low, it supplies the evaluation result to the control signal output unit 444.
Upon receiving an evaluation result to the effect that the image quality of the second image signal is low, the control signal output unit 444 supplies to the sensor unit 401 a control signal, corresponding to that evaluation result, for controlling the placement positions of the R photoreceptor unit 423R, the G photoreceptor unit 423G, and the B photoreceptor unit 423B (Fig. 20), i.e., a control signal that changes the offsets Ph_G and Pv_G of the pixel positions of the G photoreceptor unit 423G and the offsets Ph_B and Pv_B of the pixel positions of the B photoreceptor unit 423B taken with reference to the pixel positions of the R photoreceptor unit 423R (see Figs. 21A to 21D). Now, let the four-dimensional vector whose components are the current offsets Ph_G, Pv_G, Ph_B, and Pv_B be denoted by the vector P = (Ph_G, Pv_G, Ph_B, Pv_B), and let a minute four-dimensional vector be denoted by delta p. When a certain frame is imaged, the control signal output unit 444 newly sets a vector P + delta p that has not yet been set, and outputs a control signal for controlling the R photoreceptor unit 423R, the G photoreceptor unit 423G, and the B photoreceptor unit 423B so that they are displaced to positions matching the component values of the vector P + delta p. The components of the minute vector delta p may be, for example, random numbers.
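A minimal sketch of this offset update follows, assuming the offsets are held as a four-component tuple; the perturbation magnitude step, the rounding, and the set of already-tried vectors are illustrative assumptions rather than details taken from the patent text:

    import random

    def propose_offsets(p, already_tried, step=0.1):
        # p is the current vector P = (Ph_G, Pv_G, Ph_B, Pv_B); delta_p has small random components.
        while True:
            delta_p = [random.uniform(-step, step) for _ in range(4)]
            candidate = tuple(round(a + b, 3) for a, b in zip(p, delta_p))
            if candidate not in already_tried:       # a vector P + delta_p that has not yet been set
                already_tried.add(candidate)
                return candidate                     # passed to the control signal output unit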
In this case, the sensor unit 401 moves the placement positions of the R photoreceptor unit 423R, the G photoreceptor unit 423G, and/or the B photoreceptor unit 423B (Fig. 20) in accordance with the control signal supplied from the control signal output unit 444. The R photoreceptor unit 423R, the G photoreceptor unit 423G, and the B photoreceptor unit 423B of the sensor unit 401 then receive the object light at the positions to which they have been moved, and output image signals corresponding to the amounts of received light. The image signal output by the sensor unit 401 is supplied, via the signal adjusting unit 402 and the A/D converting unit 403, to the signal processing unit 411 as a new first image signal. The image conversion unit 431 of the signal processing unit 411 shown in Fig. 22 subjects the new first image signal to image conversion processing, and supplies the new second image signal obtained as a result of the image conversion processing to the image storage unit 432, where it is stored. In addition, the image conversion unit 431G supplies the new second image signal to the evaluation unit 433.
The correlation calculating unit 442 of the evaluation unit 433 receives the new second image signal from the image conversion unit 431G, calculates the correlation between this second image signal and the second image signal previously supplied from the image conversion unit 431G and stored in the storage unit 441, and supplies the correlation value to the determination evaluation unit 443.
By repeating the above processing, the determination evaluation unit 443 obtains correlation values for the second image signals obtained from first image signals imaged with various offsets Ph_G, Pv_G, Ph_B, and Pv_B.
Fig. 24 illustrates the relation between the values of the offsets Ph_G, Pv_G, Ph_B, and Pv_B and the correlation of the second image signal obtained from the first image signal imaged with those offsets. Each correlation value represents the correlation between the second image signal obtained with certain offsets Ph_G, Pv_G, Ph_B, and Pv_B and the second image signal obtained with offsets Ph_G', Pv_G', Ph_B', and Pv_B' shifted from them by a minute amount corresponding to the minute vector delta p described above.
Accordingly, a low correlation for certain offsets Ph_G, Pv_G, Ph_B, and Pv_B indicates that the second image signal obtained with those offsets is an image of low image quality with blurred, unsharp edges. Conversely, a high correlation for certain offsets Ph_G, Pv_G, Ph_B, and Pv_B indicates that the second image signal obtained with those offsets is an image of high image quality with sharp edges.
Therefore, if the correlation value supplied from the correlation calculating unit 442 is low, the determination evaluation unit 443 shown in Fig. 23 evaluates the image quality of the second image signal as low; conversely, if the correlation value is high, for example if a maximum (peak) such as that shown in Fig. 24 is obtained, it evaluates the image quality of the second image signal as high. If the evaluation result is that the image quality of the second image signal is high, the determination evaluation unit 443 outputs, to the image storage unit 432 (Fig. 22), selection information indicating that one of the two second image signals used to calculate the correlation value yielding that evaluation result is to be selected.
In the image storage units 432R, 432G, and 432B, the second image signal specified by the selection information, i.e., the second image signal of high image quality for which the evaluation result was obtained, is read from the stored second image signals and supplied to the output unit 405.
Next, the operation of the image pickup apparatus shown in Figs. 18 and 19 will be described with reference to the flowchart of Fig. 25.
In the image pickup apparatus, first, in step S101, the sensor unit 401 receives the object light and performs photoelectric conversion, thereby obtaining an image signal as an electrical signal (i.e., imaging the object), and supplies the image signal to the signal adjusting unit 402. The signal adjusting unit 402 subjects the image signal supplied from the sensor unit 401 to CDS processing and supplies the result to the A/D converting unit 403. The A/D converting unit 403 A/D converts the image signal supplied from the signal adjusting unit 402 and supplies it to the signal processing unit 404 as the first image signal. The signal processing unit 404 thus acquires the first image signal, and the flow proceeds from step S101 to step S102.
In step S102, in the signal processing unit 404, the image conversion unit 431 (Fig. 22) of the signal processing unit 411 subjects the first image signal from the A/D converting unit 403 to the image conversion processing as the signal processing, thereby produces a second image signal whose image quality is improved over that of the first image signal, and supplies it to the image storage unit 432, where it is stored. Also in step S102, the image conversion unit 431G supplies the second image signal obtained as a result of the image conversion processing to the evaluation unit 433, and the flow proceeds to step S103.
In step S103, the evaluation unit 433 performs evaluation processing for evaluating the second image signal supplied from the image conversion unit 431G, and the flow then proceeds to step S104. In step S104, the evaluation unit 433 determines whether the evaluation result obtained is that a second image signal of high image quality has been obtained.
If it is determined in step S104 that the evaluation result obtained is that the second image signal is of low image quality, the flow proceeds to step S105, where the evaluation unit 433 supplies a control signal specifying the offsets Ph_G, Pv_G, Ph_B, and Pv_B to the sensor unit 401, thereby moving the placement positions of the R photoreceptor unit 423R, the G photoreceptor unit 423G, and/or the B photoreceptor unit 423B of the sensor unit 401 (Fig. 20), and the flow returns to step S101.
In step S101, an image signal is obtained with the R photoreceptor unit 423R, the G photoreceptor unit 423G, and the B photoreceptor unit 423B of the sensor unit 401 at the placement positions to which they were moved in the preceding step S105, and the same processing is repeated.
If it is determined in step S104 that the evaluation result obtained is that a second image signal of high image quality has been obtained, the evaluation unit 433 supplies to the image storage unit 432 selection information indicating that the second image signal yielding that evaluation result is to be selected, and the flow proceeds to step S106.
In step S106, the image storage units 432R, 432G, and 432B select and read, from the second image signals stored in step S102 and still held, the second image signal of high image quality in accordance with the selection information supplied from the evaluation unit 433, and output it to the output unit 405, whereby the processing for one frame (or field) of image data ends. The image pickup apparatus repeats the processing according to the flowchart of Fig. 25 until, for example, the user issues an instruction to stop image pickup.
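The per-frame feedback loop of Fig. 25 can be sketched as follows. The helpers capture_first_image(), image_conversion(), evaluate(), and move_sensors() are hypothetical stand-ins for the sensor unit 401, the image conversion unit 431, the evaluation unit 433, and the control units 424R, 424G, and 424B, and the assumption that evaluate() returns a quality flag together with the index of the second image signal to select is an illustrative reading of the selection information:

    def process_one_frame(capture_first_image, image_conversion, evaluate, move_sensors):
        stored_second_images = []
        while True:
            first = capture_first_image()                 # step S101
            second = image_conversion(first)              # step S102
            stored_second_images.append(second)
            is_high_quality, selected = evaluate(second)  # steps S103 and S104
            if is_high_quality:
                return stored_second_images[selected]     # step S106: output the selected second image signal
            move_sensors()                                # step S105: set new offsets Ph_G, Pv_G, Ph_B, Pv_B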
Next, the evaluation processing performed in step S103 of Fig. 25 by the evaluation unit 433 shown in Fig. 23 will be described with reference to the flowchart of Fig. 26.
In the evaluation processing, first, in step S311, the storage unit 441 stores the second image signal supplied from the image conversion unit 431G in the preceding step S102 (Fig. 25), and the correlation calculating unit 442 receives that second image signal. Also in step S311, the correlation calculating unit 442 calculates the correlation between the second image signal supplied from the image conversion unit 431G and the second image signal stored in the storage unit 441 in the previous iteration of step S311, supplies the correlation value to the determination evaluation unit 443, and the flow proceeds to step S312.
In step S312, the determination evaluation unit 443 temporarily stores the correlation value supplied from the correlation calculating unit 442 in association with the offsets Ph_G, Pv_G, Ph_B, and Pv_B in effect when one of the two second image signals used to obtain that correlation value was obtained, and the flow proceeds to step S313. The determination evaluation unit 443 acquires, from the control signal output unit 444, the offsets Ph_G, Pv_G, Ph_B, and Pv_B in effect when that second image signal was obtained.
In step S313, in the correlation of step S312 storage and the relation between the side-play amount, determine that assessment unit 443 determines whether to obtain the maximum of correlation for up to now.If in step S313, determine also not obtain the maximum of correlation, flow process proceeds to step S314 so, determine that it is that second picture signal is the assessment of the picture signal of low image quality that assessment unit 443 is made effect, flow process is returned the step S104 among Figure 25 then.
In this case, in the step S104 of Figure 25, determine that assessment unit 443 determines that also not obtaining effect is that picture quality is high assessment result, and therefore with assessment result, promptly effect is that second picture signal is that the assessment result of low image quality offers control signal output unit 444.In step S105, control signal output unit 444 receptions are that second picture signal is the assessment result of low image quality, and provide with this assessment result is corresponding to sensor unit 401 and to be used to specify new side-play amount Ph G, Pv G, Ph BAnd Pv BControl signal.
Return step S26, if in step S313, determine to have obtained the maximum of correlation, flow process proceeds to step S315 so, determines that it is that second picture signal is the assessment of high image quality that assessment unit 443 is made effect, and flow process is returned the step S104 among Figure 25 then.
In this case, in step S104 of Fig. 25, it is determined that an evaluation result indicating that a second image signal of high image quality has been obtained, so the evaluation unit 433 supplies to the image storage unit 432 selection information selecting the second image signal that yielded that evaluation result, and flow then proceeds to step S106.
In step S106, the image storage units 432R, 432G, and 432B select and read out, in accordance with the selection information supplied from the evaluation unit 433, the second image signal of high image quality from among the second image signals stored so far in step S102.
As described above, the second image signal is evaluated, and the offsets Ph_G, Pv_G, Ph_B, and Pv_B, and hence the placement positions of the R photoreceptor unit 423R, G photoreceptor unit 423G, and B photoreceptor unit 423B of the sensor unit 401 (Fig. 20), are controlled according to the evaluation result. Consequently, the sensor unit 401, with its R photoreceptor unit 423R, G photoreceptor unit 423G, and B photoreceptor unit 423B placed at the positions corresponding to the offsets Ph_G, Pv_G, Ph_B, and Pv_B, outputs an image signal suited to the image conversion processing carried out in the image conversion unit 431, so that a second image signal of high image quality can be obtained at the image conversion unit 431.
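The per-frame control loop of steps S101 through S106 and the correlation-based evaluation of Fig. 26 can be summarized in the following sketch. It is only an illustration of the control flow described above: the helper names capture, convert, and offset_candidates, and the use of a normalized correlation coefficient as the "correlation", are assumptions, since the text does not fix these details.

```python
import numpy as np

def correlation(a, b):
    """Correlation between two second image signals; a normalized correlation
    coefficient is assumed here, since the text only calls it a 'correlation'."""
    a = a.astype(np.float64).ravel() - a.mean()
    b = b.astype(np.float64).ravel() - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def search_offsets(capture, convert, offset_candidates):
    """Hypothetical realization of steps S101-S106: move the photoreceptor units
    through candidate offsets, convert each captured first image signal, and stop
    once the correlation between successive second image signals passes a maximum."""
    history = []                                   # (offset, second image signal, correlation)
    previous = None
    for offset in offset_candidates:               # S105: specify Ph_G, Pv_G, Ph_B, Pv_B
        first = capture(offset)                    # S101: image signal at this placement
        second = convert(first)                    # S102: image conversion processing
        corr = correlation(second, previous) if previous is not None else float("-inf")
        history.append((offset, second, corr))
        if len(history) >= 3 and history[-2][2] >= max(history[-3][2], corr):
            return history[-2]                     # S313/S315: maximum detected -> high quality
        previous = second
    return max(history, key=lambda h: h[2])        # fall back to the highest correlation seen
```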
While in the above description an evaluation to the effect that the second image signal is of high image quality is made when the correlation obtained in step S313 is a maximum, an alternative arrangement may be used in which an evaluation to the effect that the second image signal is of high image quality is made when the maximum of the correlation obtained in step S313 exceeds a predetermined threshold.
Also, in the above case the second image signal is evaluated in terms of correlation, but an alternative arrangement may be used in which the second image signal is evaluated in terms of the S/N of the second image signal obtained for each value of the offsets Ph_G, Pv_G, Ph_B, and Pv_B. Further, the evaluation of the second image signal may be input from the outside; that is, an arrangement may be implemented in which, for example, the second image signal is displayed, and a user observing the displayed image inputs an evaluation of the second image signal.
In addition, an arrangement may be made in which several offset values are prepared in advance, correlations are obtained for all of the several offset values, and, in step S106 of Fig. 25, the one of the two image signals that yields the highest correlation is output.
Likewise, an arrangement may be made in which the loop of steps S101 through S105 in Fig. 25 is executed as many times as possible within the period of one frame (or field), and, in step S106 of Fig. 25, the one of the two image signals from which the highest of the correlations obtained during that loop processing was obtained is output.
The configuration of the image conversion unit 431 shown in Fig. 22 is the same as that of the image conversion unit 31 shown in Fig. 4, so description thereof is omitted here (see Figs. 6 through 10 and the description thereof).
Third Embodiment
Next, Fig. 27 illustrates a configuration example of a third embodiment of an image pickup apparatus to which the present invention is applied. Note that portions corresponding to those in Fig. 18 or 19 are denoted with the same reference numerals, and description thereof is omitted as appropriate. Briefly, the image pickup apparatus according to the third embodiment shown in Fig. 27 is basically the same as the image pickup apparatus according to the second embodiment shown in Figs. 18 and 19, except that an operating unit 185 is additionally provided.
The operating unit 185 is made up of, for example, buttons operated by the user, and outputs a parameter corresponding to that operation to the signal processing unit 404. The signal processing unit 404 in Fig. 27 is configured of signal processing units 411R, 411G, and 411B, as shown in Fig. 19.
Fig. 28 illustrates a configuration example of the signal processing units 411R, 411G, and 411B making up the signal processing unit 404 shown in Fig. 27. Note that portions corresponding to those in Fig. 22 are denoted with the same reference numerals, and description thereof is omitted as appropriate. The signal processing units 411R, 411G, and 411B are each configured in the same way as in Fig. 22, the difference being that the parameter output from the operating unit 185 is supplied to the image conversion unit 431 (made up of 431R, 431G, and 431B), and the image conversion unit 431 carries out image conversion processing corresponding to that parameter.
Fig. 29 illustrates a configuration example of the image conversion unit 431 shown in Fig. 28. Note that portions corresponding to those in Fig. 6 are denoted with the same reference numerals, and description thereof is omitted as appropriate. As described earlier, the image conversion unit 431 shown in Fig. 22 has the same configuration as the image conversion unit 31 shown in Fig. 4, and the image conversion unit 31 shown in Fig. 4 was described with reference to Figs. 6 through 10. In Fig. 29, however, the parameter output from the operating unit 185 is supplied to the coefficient output unit 124.
Fig. 30 illustrates another configuration example of the coefficient output unit 124 shown in Fig. 29. Here too, portions corresponding to those shown in Fig. 7 are denoted with the same reference numerals.
While the coefficient output unit 124 in the arrangement shown in Fig. 7 stores tap coefficients for each class obtained by learning in advance, in the arrangement in Fig. 30 the coefficient output unit 124 generates, for each class, tap coefficients capable of producing an image of the desired quality, from coefficient seed data serving as a seed and from a predetermined parameter.
The coefficient memory 181 stores, for each class, the tap coefficients supplied from the coefficient generation unit 182. Upon being supplied with a class code from the class classification unit 123, the coefficient memory 181 reads out, from among the tap coefficients stored for each class, the tap coefficient of the class represented by that class code, and outputs it to the computing unit 125.
The coefficient generation unit 182 generates tap coefficients for each class based on the coefficient seed data stored in the coefficient seed memory 183 and the parameter stored in the parameter memory 184, and supplies the tap coefficients to the coefficient memory 181, where they are stored by overwriting.
The coefficient seed memory 183 stores, for each class, coefficient seed data obtained by learning described later. The coefficient seed data serves as a seed for generating the tap coefficients.
When the user operates the operating unit 185, the parameter memory 184 stores, by overwriting, the parameter output from the operating unit 185 in accordance with that operation.
With the coefficient output unit 124 shown in Fig. 30, the tap coefficients for each class stored (set) in the coefficient memory 181, i.e., the tap coefficients used by the computing unit 125, are updated in accordance with the user's operation of the operating unit 185.
Now, the processing for updating the tap coefficients for each class carried out at the coefficient output unit 124 shown in Fig. 30, i.e., the tap coefficient updating processing, will be described with reference to the flowchart of Fig. 31.
First, in step S171, the parameter memory 184 determines whether a parameter has been supplied from the operating unit 185. If it is determined in step S171 that a parameter has been supplied, flow proceeds to step S172, where the parameter memory 184 stores the supplied parameter by overwriting, and flow then proceeds to step S173.
On the other hand, if it is determined in step S171 that no parameter has been supplied from the operating unit 185, step S172 is skipped and flow proceeds to step S173.
Accordingly, when the operating unit 185 is operated by the user and a parameter corresponding to that operation is supplied from the operating unit 185, the parameter memory 184 updates its stored contents with the supplied parameter.
In step S173, the coefficient generation unit 182 reads out the coefficient seed data for each class from the coefficient seed memory 183 and reads out the parameter from the parameter memory 184, thereby obtaining the coefficient seed data and the parameter, and obtains the tap coefficients for each class from the coefficient seed data and the parameter. Flow then proceeds to step S174, where the coefficient generation unit 182 supplies the tap coefficients for each class to the coefficient memory 181, where they are stored by overwriting. Flow returns from step S174 to step S171, and thereafter the same processing is repeated.
Accordingly, at the image conversion unit 431 in Fig. 29, the image conversion processing that converts the first image signal into the second image signal according to the aforementioned expression (1) is carried out using the tap coefficients updated with the parameter, i.e., image conversion processing corresponding to the parameter.
Note that in Fig. 31, an arrangement may be made in which the processing in steps S173 and S174 is executed only if a new parameter has been overwritten in the parameter memory 184, and is skipped otherwise.
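As a concrete illustration of expressions (9) and (10) introduced below and of the updating processing of Fig. 31, the following sketch generates tap coefficients from coefficient seed data and a parameter z, regenerating them only when the parameter changes. The array layout and the class and method names are assumptions made for the example.

```python
import numpy as np

def generate_tap_coefficients(seed_data, z):
    """Expression (9): w_n = sum_{m=1}^{M} beta_{m,n} * z^(m-1), for every class at once.
    seed_data has the assumed shape (num_classes, N, M)."""
    num_classes, n_taps, m_terms = seed_data.shape
    t = float(z) ** np.arange(m_terms)            # t_m = z^(m-1), expression (10)
    return seed_data @ t                          # shape (num_classes, N)

class CoefficientOutputUnit:
    """Rough model of the coefficient output unit 124 of Fig. 30."""
    def __init__(self, seed_data, initial_z=0.0):
        self.seed_data = seed_data                                           # coefficient seed memory 183
        self.z = initial_z                                                   # parameter memory 184
        self.coefficients = generate_tap_coefficients(seed_data, initial_z)  # coefficient memory 181

    def update_parameter(self, z):
        if z != self.z:                           # steps S173/S174 are skipped if z is unchanged
            self.z = z
            self.coefficients = generate_tap_coefficients(self.seed_data, z)

    def coefficients_for(self, class_code):
        return self.coefficients[class_code]      # read-out supplied to computing unit 125
```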
Next, a description will be made regarding the learning of the coefficient seed data which is stored in the coefficient seed memory 183 and from which the tap coefficients are generated at the coefficient generation unit 182.
Suppose that we have a high image quality image signal as the second image signal, and a low image quality image signal, obtained by filtering the high image quality image signal with an LPF (low pass filter) to reduce its resolution, as the first image signal. Let us now consider an example in which prediction taps are extracted from the low image quality image signal, and the pixel values of high image quality pixels are obtained from the prediction taps and tap coefficients by a predetermined prediction calculation, for example the linear first-order prediction calculation of expression (1).
Here, the pixel value y of a high image quality pixel may also be obtained by a second-order or higher expression, rather than the linear first-order expression of expression (1).
On the other hand, at the coefficient generation unit 182, the tap coefficients w_n are generated from the coefficient seed data stored in the coefficient seed memory 183 and the parameter stored in the parameter memory 184; let us say that this generation of the tap coefficients w_n at the coefficient generation unit 182 is carried out using the coefficient seed data and the parameter according to the following expression.
$$ w_n = \sum_{m=1}^{M} \beta_{m,n} z^{m-1} \qquad (9) $$
where β_{m,n} denotes the m-th coefficient seed data used for obtaining the n-th tap coefficient w_n, and z denotes the parameter. Note that in expression (9), the tap coefficient w_n is obtained using M items of coefficient seed data β_{1,n}, β_{2,n}, and so on through β_{M,n}.
It should be understood that the expression for obtaining the tap coefficient w_n from the coefficient seed data β_{m,n} and the parameter z is not restricted to expression (9).
Now, the value z^{m-1}, which is determined by the parameter z in expression (9), is defined by the following expression, introducing a new variable t_m.
$$ t_m = z^{m-1} \quad (m = 1, 2, \dots, M) \qquad (10) $$
Substituting expression (10) into expression (9) yields the following expression.
$$ w_n = \sum_{m=1}^{M} \beta_{m,n} t_m \qquad (11) $$
According to expression (11), the tap coefficient w_n is obtained by a linear first-order expression of the coefficient seed data β_{m,n} and the variable t_m.
Now, let the true value of the pixel value of the k-th sample high image quality pixel be y_k, and let the prediction value of that true value y_k be y_k'; the prediction error e_k is then expressed by the following expression.
$$ e_k = y_k - y_k' \qquad (12) $$
Now, the prediction value y_k' in expression (12) is obtained according to expression (1); substituting y_k' in expression (12) according to expression (1) yields the following expression.
$$ e_k = y_k - \left( \sum_{n=1}^{N} w_n x_{n,k} \right) \qquad (13) $$
where x_{n,k} denotes the n-th low image quality pixel making up the prediction tap for the k-th sample high image quality pixel.
Substituting expression (11) for w_n in expression (13) yields the following expression.
$$ e_k = y_k - \left( \sum_{n=1}^{N} \left( \sum_{m=1}^{M} \beta_{m,n} t_m \right) x_{n,k} \right) \qquad (14) $$
Now, coefficient seed data β_{m,n} for which the prediction error e_k in expression (14) is zero is optimal for predicting the high image quality pixel, but obtaining such coefficient seed data β_{m,n} for all high image quality pixels is generally difficult.
Accordingly, taking the least-squares method, for example, as the standard representing that the coefficient seed data β_{m,n} is optimal, the optimal coefficient seed data β_{m,n} can be obtained by minimizing the sum of squared errors E expressed by the following expression.
$$ E = \sum_{k=1}^{K} e_k^2 \qquad (15) $$
where K denotes the number of samples (the number of learning samples) of sets of a high image quality pixel y_k and the low image quality pixels x_{1,k}, x_{2,k}, and so on through x_{N,k} making up the prediction tap for that high image quality pixel y_k.
The minimum value (minimum) of the sum of squared errors E in expression (15) is given by β_{m,n} for which the partial differentiation of E by the coefficient seed data β_{m,n} is zero, as shown in expression (16).
$$ \frac{\partial E}{\partial \beta_{m,n}} = \sum_{k=1}^{K} 2 \cdot \frac{\partial e_k}{\partial \beta_{m,n}} \cdot e_k = 0 \qquad (16) $$
Substituting expression (14) into expression (16) yields the following expression.
$$ \sum_{k=1}^{K} t_m x_{n,k} e_k = \sum_{k=1}^{K} t_m x_{n,k} \left( y_k - \left( \sum_{n=1}^{N} \left( \sum_{m=1}^{M} \beta_{m,n} t_m \right) x_{n,k} \right) \right) = 0 \qquad (17) $$
Now, let us define X_{i,p,j,q} and Y_{i,p} as in expressions (18) and (19).
$$ X_{i,p,j,q} = \sum_{k=1}^{K} x_{i,k} t_p x_{j,k} t_q \quad (i = 1, 2, \dots, N;\; j = 1, 2, \dots, N;\; p = 1, 2, \dots, M;\; q = 1, 2, \dots, M) \qquad (18) $$
$$ Y_{i,p} = \sum_{k=1}^{K} x_{i,k} t_p y_k \qquad (19) $$
In this case, expression (17) can be expressed in the form of the normal equation shown in expression (20), using X_{i,p,j,q} and Y_{i,p}.
The normal equation of expression (20) can be solved for the coefficient seed data β_{m,n} by using, for example, a sweeping-out method (Gauss-Jordan elimination).
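A minimal sketch of how the normal equation of expression (20) could be assembled and solved for a single class is given below; it assumes the learning samples belonging to that class have already been collected, uses numpy's linear solver in place of Gauss-Jordan elimination, and all argument names and array layouts are illustrative rather than taken from the patent.

```python
import numpy as np

def learn_seed_data_direct(taps, teachers, zs, M):
    """Direct seed-data learning per expressions (18)-(20) for one class.
    taps[k]    : prediction tap x_{1,k}..x_{N,k} (1-D array of length N)
    teachers[k]: teacher pixel y_k
    zs[k]      : parameter z used to generate the student data of sample k
    M          : number of seed terms per tap coefficient."""
    N = taps.shape[1]
    A = np.zeros((N * M, N * M))           # components X_{i,p,j,q} of expression (18)
    b = np.zeros(N * M)                    # components Y_{i,p} of expression (19)
    for x, y, z in zip(taps, teachers, zs):
        t = float(z) ** np.arange(M)       # t_p = z^(p-1), expression (10)
        u = np.outer(x, t).ravel()         # u[(i-1)*M + (p-1)] = x_{i,k} * t_p
        A += np.outer(u, u)                # summation over k in expression (18)
        b += u * y                         # summation over k in expression (19)
    beta = np.linalg.solve(A, b)           # expression (20); Gauss-Jordan elimination also works
    return beta.reshape(N, M)              # beta[n, m] corresponds to beta_{m,n}
```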
The coefficient seed memory 183 (Fig. 30) stores the coefficient seed data β_{m,n} obtained by learning that solves expression (20), using a large number of high image quality pixels y_1, y_2, and so on through y_K as teacher data serving as the teacher of the learning, and the low image quality pixels x_{1,k}, x_{2,k}, and so on through x_{N,k} making up the prediction tap for each high image quality pixel y_k as student data serving as the student of the learning. The coefficient generation unit 182 generates the tap coefficient w_n from the coefficient seed data β_{m,n} and the parameter z stored in the parameter memory 184 according to expression (9). The computing unit 125 then calculates expression (1) using that tap coefficient w_n and the low image quality pixels (pixels of the first image signal) x_n making up the prediction tap for the pixel of interest, thereby obtaining a prediction value close to the true value of the pixel of interest, which is a high image quality pixel.
Next, Fig. 32 illustrates a configuration example of a learning apparatus which carries out learning for obtaining the coefficient seed data β_{m,n} by setting up and solving the normal equation of expression (20). Note that portions corresponding to those in Fig. 8 are denoted with the same reference numerals, and description thereof is omitted as appropriate.
A learning image signal used for learning the coefficient seed data β_{m,n} is input to the learning apparatus. A high image quality image signal, for example, can be used as the learning image signal.
In the learning apparatus, the learning image signal is supplied to the teacher data generation unit 131 and the student data generation unit 133. The teacher data generation unit 131 generates teacher data from the learning image signal supplied to it, and supplies the teacher data to the teacher data storage unit 132. That is to say, here, the teacher data generation unit 131 supplies the high image quality image signal serving as the learning image signal to the teacher data storage unit 132 without change, as teacher data. The teacher data storage unit 132 stores the high image quality image signal supplied from the teacher data generation unit 131 as teacher data.
The student data generation unit 133 generates student data from the learning image signal, and supplies the student data to the student data storage unit 134. That is to say, the student data generation unit 133 filters the high image quality image signal serving as the learning image signal so as to reduce its resolution, thereby generating a low image quality image signal, and supplies this low image quality image signal to the student data storage unit 134 as student data.
At this time, in addition to the learning image signal, several values within the range that can be assumed by the parameter z supplied to the parameter memory 184 in Fig. 30 are supplied to the student data generation unit 133 from the parameter generation unit 191. That is to say, if we assume that the parameter z can assume real values in the range of 0 through Z, then, for example, z = 0, 1, 2, and so on through Z is supplied from the parameter generation unit 191 to the student data generation unit 133.
The student data generation unit 133 filters the high image quality image signal serving as the learning image signal using an LPF with a cutoff frequency corresponding to the parameter z supplied to it, thereby generating low image quality image signals as student data.
Accordingly, in this case, Z + 1 types of low image quality image signals with different resolutions are generated as student data at the student data generation unit 133 from the high image quality image signal serving as the learning image signal, as shown in Fig. 33. Note that here, for example, the higher the value of the parameter z, the higher the cutoff frequency of the LPF used to filter the high image quality image signal and generate the low image quality image signal serving as student data; accordingly, the greater the value of the parameter z, the higher the resolution of the corresponding low image quality image signal.
Also, in this example, for the sake of simplicity of description, the student data generation unit 133 generates low image quality image signals wherein the resolution in both the horizontal direction and the vertical direction of the high image quality image signal has been reduced by an amount corresponding to the parameter z.
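A sketch of the student data generation just described might look as follows; a Gaussian filter stands in for the LPF, and the mapping from the parameter z to the filter strength is an assumption chosen so that a larger z gives a higher-resolution student image, as stated above.

```python
import numpy as np
from scipy import ndimage

def generate_student_data(teacher, z_values):
    """One low image quality signal per parameter value z, obtained by low-pass
    filtering the teacher (high image quality) signal. The Gaussian filter and the
    z-to-sigma mapping are stand-ins for the LPF with a z-dependent cutoff."""
    students = {}
    z_max = max(z_values)
    for z in z_values:
        sigma = (z_max - z) + 0.5     # assumed: larger z -> milder filtering -> higher resolution
        students[z] = ndimage.gaussian_filter(teacher.astype(np.float64), sigma=sigma)
    return students

# Usage sketch: students = generate_student_data(teacher_image, range(0, Z + 1))
```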
Returning to Fig. 32, the student data storage unit 134 stores the student data supplied from the student data generation unit 133.
The prediction tap extraction unit 135 sequentially takes the pixels making up the high image quality image signal stored as teacher data in the teacher data storage unit 132 as the teacher pixel of interest, extracts, with regard to the teacher pixel of interest, predetermined ones of the low image quality pixels making up the low image quality image signal stored as student data in the student data storage unit 134, thereby configuring a prediction tap with the same configuration as the prediction tap configured by the prediction tap extraction unit 121 shown in Fig. 29, and supplies the prediction tap to the adding unit 192.
With regard to the teacher pixel of interest, the feature extraction unit 136 extracts features of the teacher pixel of interest using the low image quality pixels stored as student data in the student data storage unit 134, in the same way as the feature extraction unit 122 shown in Fig. 29, and supplies these features to the class classification unit 137.
Note that the parameter z generated by the parameter generation unit 191 is supplied to the prediction tap extraction unit 135 and the feature extraction unit 136, and the prediction tap extraction unit 135 and the feature extraction unit 136 configure the prediction tap and extract the features of the teacher pixel of interest, respectively, using the student data generated corresponding to the parameter z supplied from the parameter generation unit 191 (here, the low image quality image signal serving as student data generated using the LPF with the cutoff frequency corresponding to the parameter z).
Based on the features of the teacher pixel of interest output from the feature extraction unit 136, the class classification unit 137 carries out the same class classification as the class classification unit 123 shown in Fig. 29, and outputs the class code corresponding to the resulting class to the adding unit 192.
The adding unit 192 reads out the teacher pixel of interest from the teacher data storage unit 132, and performs, for each class supplied from the class classification unit 137, addition regarding the teacher pixel of interest, the student data making up the prediction tap configured with regard to the teacher pixel of interest supplied from the prediction tap extraction unit 135, and the parameter z used at the time of generating that student data.
That is to say, the teacher data y_k stored in the teacher data storage unit 132, the prediction taps x_{i,k} (x_{j,k}) output from the prediction tap extraction unit 135, the class code output from the class classification unit 137, and the parameter z output from the parameter generation unit 191 and used at the time of generating the student data making up the prediction tap, are supplied to the adding unit 192.
Then, for each class corresponding to the class code supplied from the class classification unit 137, the adding unit 192 performs, using the prediction taps (student data) x_{i,k} (x_{j,k}) and the parameter z, calculations equivalent to the multiplication (x_{i,k} t_p x_{j,k} t_q) of the student data and the parameter z and the summation (Σ), for obtaining the component X_{i,p,j,q} defined in expression (18) in the matrix on the left side of expression (20). Note that t_p in expression (18) is calculated from the parameter z according to expression (10); the same holds for t_q in expression (18).
Likewise, for each class corresponding to the class code supplied from the class classification unit 137, the adding unit 192 performs, using the prediction tap (student data) x_{i,k}, the teacher data y_k, and the parameter z, calculations equivalent to the multiplication (x_{i,k} t_p y_k) of the student data x_{i,k}, the teacher data y_k, and the parameter z, and the summation (Σ), for obtaining the component Y_{i,p} defined in expression (19) in the vector on the right side of expression (20). Note that t_p in expression (19) is calculated from the parameter z according to expression (10).
That is to say, the adding unit 192 stores in its built-in memory (not shown) the component X_{i,p,j,q} of the matrix on the left side and the component Y_{i,p} of the vector on the right side of expression (20) obtained for the teacher data previously taken as the teacher pixel of interest, and adds to that matrix component X_{i,p,j,q} or vector component Y_{i,p} the corresponding component x_{i,k} t_p x_{j,k} t_q or x_{i,k} t_p y_k calculated for the teacher data newly taken as the teacher pixel of interest, using the teacher data y_k, the student data x_{i,k} (x_{j,k}), and the parameter z (i.e., performs the addition represented by the summation of the component X_{i,p,j,q} in expression (18) or the component Y_{i,p} in expression (19)).
The adding unit 192 performs this addition for all values 0, 1, and so on through Z of the parameter z, taking all teacher data stored in the teacher data storage unit 132 as the teacher pixel of interest, thereby setting up the normal equation given by expression (20) for each class, and then supplies the normal equations to the coefficient seed calculation unit 193. The coefficient seed calculation unit 193 solves the normal equation of each class supplied from the adding unit 192, thereby obtaining and outputting the coefficient seed data β_{m,n} for each class.
The parameter generation unit 191 generates several values z = 0, 1, 2, and so on through Z, within the range that the parameter z supplied to the parameter memory 184 in Fig. 30 can assume, as described above, and supplies these values to the student data generation unit 133. The parameter generation unit 191 also supplies the generated parameter z to the prediction tap extraction unit 135, the feature extraction unit 136, and the adding unit 192.
Next, the processing (learning processing) carried out by the learning apparatus shown in Fig. 32 will be described with reference to the flowchart of Fig. 34.
First, in step S181, the teacher data generation unit 131 and the student data generation unit 133 generate and output teacher data and student data, respectively, from the learning image signal. That is to say, the teacher data generation unit 131 outputs the learning image signal without change, as teacher data. Also, the parameter z of Z + 1 values generated by the parameter generation unit 191 is supplied to the student data generation unit 133, and the student data generation unit 133 filters the learning image signal with LPFs having cutoff frequencies corresponding to the Z + 1 values (0, 1, and so on through Z) of the parameter z generated by the parameter generation unit 191, thereby generating and outputting Z + 1 frames of student data with regard to the teacher data (learning image signal) of each frame.
The teacher data output from the teacher data generation unit 131 is supplied to and stored in the teacher data storage unit 132, and the student data output from the student data generation unit 133 is supplied to and stored in the student data storage unit 134.
Subsequently, flow proceeds to step S182, where the parameter generation unit 191 sets the parameter z to an initial value, zero for example, supplies this parameter z to the prediction tap extraction unit 135, the feature extraction unit 136, and the adding unit 192, and flow then proceeds to step S183. In step S183, the prediction tap extraction unit 135 takes, from the teacher data stored in the teacher data storage unit 132, teacher data not yet taken as the teacher pixel of interest, as the teacher pixel of interest. Further, in step S183, with regard to the teacher pixel of interest, the prediction tap extraction unit 135 configures a prediction tap from the student data corresponding to the parameter z output from the parameter generation unit 191 stored in the student data storage unit 134 (the student data generated by filtering, with the LPF having the cutoff frequency corresponding to the parameter z, the learning image signal corresponding to the teacher data of the teacher pixel of interest), supplies it to the adding unit 192, and flow then proceeds to step S184.
In step S184, the feature extraction unit 136 extracts the features of the teacher pixel of interest using the student data corresponding to the parameter z output from the parameter generation unit 191 stored in the student data storage unit 134, supplies these to the class classification unit 137, and flow then proceeds to step S185.
In step S185, the class classification unit 137 carries out class classification of the teacher pixel of interest based on the features of the teacher pixel of interest supplied from the feature extraction unit 136, outputs the class code corresponding to the resulting class to the adding unit 192, and flow then proceeds to step S186.
In step S186, the adding unit 192 reads out the teacher pixel of interest from the teacher data storage unit 132, and calculates the component x_{i,k} t_p x_{j,k} t_q of the matrix on the left side and the component x_{i,k} t_p y_k of the vector on the right side of expression (20), using the teacher pixel of interest, the prediction tap supplied from the prediction tap extraction unit 135, and the parameter z output from the parameter generation unit 191. Further, the adding unit 192 adds the matrix component x_{i,k} t_p x_{j,k} t_q and vector component x_{i,k} t_p y_k obtained from the pixel of interest, the prediction tap, and the parameter z, to the matrix components and vector components already obtained that correspond to the class code from the class classification unit 137, and flow then proceeds to step S187.
In step S187, the parameter generation unit 191 determines whether the parameter z it is outputting is equal to the maximum value Z that it can assume. If it is determined in step S187 that the parameter z is not equal to (i.e., is less than) the maximum value Z, flow proceeds to step S188, where the parameter generation unit 191 adds 1 to the parameter z and outputs the new parameter z to the prediction tap extraction unit 135, the feature extraction unit 136, and the adding unit 192. Flow then returns to step S183, and the same processing is subsequently repeated.
If it is determined in step S187 that the parameter z is equal to the maximum value Z, flow proceeds to step S189, where the prediction tap extraction unit 135 determines whether teacher data not yet taken as the teacher pixel of interest is still stored in the teacher data storage unit 132. If it is determined that teacher data not yet taken as the teacher pixel of interest is still stored in the teacher data storage unit 132, the prediction tap extraction unit 135 takes that teacher data as the new teacher pixel of interest, flow returns to step S182, and the same processing is repeated.
On the other hand, if it is determined in step S189 that no teacher data not yet taken as the teacher pixel of interest remains in the teacher data storage unit 132, the adding unit 192 supplies the matrix on the left side and the vector on the right side of expression (20) obtained so far for each class to the coefficient seed calculation unit 193, and flow then proceeds to step S190.
In step S190, the coefficient seed calculation unit 193 solves, for each class supplied from the adding unit 192, the normal equation made up of the matrix on the left side and the vector on the right side of expression (20) for that class, thereby obtaining and outputting the coefficient seed data β_{m,n} for each class, and the processing ends.
Note that there may be cases in which, due to an insufficient number of learning image signals or the like, the number of normal equations required for obtaining the coefficient seed data cannot be obtained for some class; for such a class, the coefficient seed calculation unit 193 may be arranged to output default coefficient seed data, for example.
Now, with the learning apparatus shown in Fig. 32, learning of the coefficient seed data β_{m,n} has been described as being carried out by taking the high image quality image signal serving as the learning image signal as teacher data, taking low image quality image signals obtained by reducing the resolution of the high image quality image signal in accordance with the parameter z as student data, and directly obtaining the coefficient seed data β_{m,n} that minimizes the sum of the squares of the errors of the prediction values y of the teacher data predicted by the linear first-order expression of expression (1) from the tap coefficients w_n, represented by the coefficient seed data β_{m,n} and the variable t_m corresponding to the parameter z as in expression (11), and the student data x_n, as shown in Fig. 33; however, learning of the coefficient seed data β_{m,n} is not restricted to this, and may alternatively be carried out as shown in Fig. 35, for example.
That is to say, with the arrangement shown in Fig. 35, in the same way as with the case in Fig. 33, the high image quality image signal serving as the learning image signal is taken as teacher data, and low image quality image signals obtained by reducing the horizontal and vertical resolution of the high image quality image signal using LPFs with cutoff frequencies corresponding to the parameter z are taken as student data. First, for each value of the parameter z (here, z = 0, 1, and so on through Z), tap coefficients w_n are obtained that minimize the sum of the squares of the errors of the prediction values y of the teacher data predicted from the tap coefficients w_n and the student data x_n by the linear first-order expression of expression (1). Further, with the arrangement in Fig. 35, the obtained tap coefficients w_n are taken as teacher data and the parameter z is taken as student data, and learning is carried out to obtain coefficient seed data β_{m,n} that minimizes the sum of the squares of the errors of the prediction values of the tap coefficients w_n serving as teacher data, predicted from the coefficient seed data β_{m,n} and the variable t_m corresponding to the parameter z using expression (11).
Here, tap coefficients w_n providing the minimum value (minimum) of the sum of squared errors E of the prediction values y of the teacher data predicted by the linear first-order prediction expression of expression (1) can be obtained for each value of the parameter z (z = 0, 1, and so on through Z) for each class by setting up and solving the normal equation of expression (8), in the same way as with the learning apparatus shown in Fig. 8.
Now, the tap coefficient is obtained from the coefficient seed data β_{m,n} and the variable t_m corresponding to the parameter z, as represented by expression (11). If we denote the tap coefficient obtained by expression (11) as w_n', then coefficient seed data β_{m,n} for which the error e_n between the optimal tap coefficient w_n and the tap coefficient w_n' obtained by expression (11), shown in the following expression (21), is zero would be optimal coefficient seed data for predicting the optimal tap coefficient w_n; however, obtaining such coefficient seed data β_{m,n} for all tap coefficients w_n is generally difficult.
$$ e_n = w_n - w_n' \qquad (21) $$
Expression (21) can be rewritten as the following expression using expression (11).
$$ e_n = w_n - \left( \sum_{m=1}^{M} \beta_{m,n} t_m \right) \qquad (22) $$
Here too, taking the least-squares method as the standard representing that the coefficient seed data β_{m,n} is optimal, the optimal coefficient seed data β_{m,n} can be obtained by minimizing the sum of squared errors E in the following expression.
$$ E = \sum_{n=1}^{N} e_n^2 \qquad (23) $$
The minimum value (minimum) of the sum of squared errors E in expression (23) is given by β_{m,n} for which the partial differentiation of E by the coefficient seed data β_{m,n} is zero, as shown in expression (24).
$$ \frac{\partial E}{\partial \beta_{m,n}} = \sum 2 \cdot \frac{\partial e_n}{\partial \beta_{m,n}} \cdot e_n = 0 \qquad (24) $$
Substituting expression (22) into expression (24) yields the following expression.
$$ \sum_{z=0}^{Z} t_m \left( w_n - \left( \sum_{m=1}^{M} \beta_{m,n} t_m \right) \right) = 0 \qquad (25) $$
Now, let us define X_{i,j} and Y_i as in expressions (26) and (27).
$$ X_{i,j} = \sum_{z=0}^{Z} t_i t_j \quad (i = 1, 2, \dots, M;\; j = 1, 2, \dots, M) \qquad (26) $$
$$ Y_i = \sum_{z=0}^{Z} t_i w_n \qquad (27) $$
In this case, expression (25) can be expressed by the normal equation shown in expression (28), using X_{i,j} and Y_i.
$$ \begin{pmatrix} X_{1,1} & X_{1,2} & \cdots & X_{1,M} \\ X_{2,1} & X_{2,2} & \cdots & X_{2,M} \\ \vdots & \vdots & \ddots & \vdots \\ X_{M,1} & X_{M,2} & \cdots & X_{M,M} \end{pmatrix} \begin{pmatrix} \beta_{1,n} \\ \beta_{2,n} \\ \vdots \\ \beta_{M,n} \end{pmatrix} = \begin{pmatrix} Y_1 \\ Y_2 \\ \vdots \\ Y_M \end{pmatrix} \qquad (28) $$
The normal equation of expression (28) can likewise be solved for the coefficient seed data β_{m,n} by using, for example, a sweeping-out method (Gauss-Jordan elimination).
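This second learning method reduces, for each tap coefficient w_n, to a least-squares fit of the polynomial of expression (11) over the parameter values z. The following sketch performs that fit for one class, given the optimum tap coefficients already obtained for each value of z; the argument layout is an assumption made for the example.

```python
import numpy as np

def fit_seed_from_tap_coefficients(w_by_z, M):
    """Given optimum tap coefficients per parameter value, w_by_z = {z: array of length N},
    fit the seed data beta_{m,n} of expression (11) by least squares, i.e. solve the
    normal equation (28) built from the components of expressions (26) and (27)."""
    zs = np.array(sorted(w_by_z))
    W = np.stack([w_by_z[z] for z in zs])        # teacher data w_n, shape (len(zs), N)
    T = zs[:, None] ** np.arange(M)[None, :]     # t_m = z^(m-1), shape (len(zs), M)
    A = T.T @ T                                  # X_{i,j} of expression (26)
    B = T.T @ W                                  # Y_i of expression (27), one column per tap n
    beta = np.linalg.solve(A, B)                 # expression (28)
    return beta.T                                # beta[n, m] corresponds to beta_{m,n}
```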
Next, Fig. 36 illustrates a configuration example of a learning apparatus which carries out learning for obtaining the coefficient seed data β_{m,n} by setting up and solving the normal equation of expression (28). Note that portions corresponding to those in Fig. 8 or 32 are denoted with the same reference numerals, and description thereof is omitted as appropriate.
The class code of the teacher pixel of interest output from the class classification unit 137 and the parameter z output from the parameter generation unit 191 are supplied to the adding unit 138. The adding unit 138 reads out the teacher pixel of interest from the teacher data storage unit 132, and performs, for each class code supplied from the class classification unit 137 and for each value of the parameter z output from the parameter generation unit 191, addition regarding the teacher pixel of interest and the student data making up the prediction tap configured with regard to the teacher pixel of interest supplied from the prediction tap extraction unit 135.
That is to say, the teacher data y_k stored in the teacher data storage unit 132, the prediction taps x_{n,k} output from the prediction tap extraction unit 135, the class code output from the class classification unit 137, and the parameter z output from the parameter generation unit 191 and used at the time of generating the student data making up the prediction tap, are supplied to the adding unit 138.
Then, for each class corresponding to the class code supplied from the class classification unit 137, and for each value of the parameter z output from the parameter generation unit 191, the adding unit 138 performs, using the prediction taps (student data) x_{n,k}, calculations equivalent to the multiplication (x_{n,k} x_{n',k}) of items of student data with one another and the summation (Σ) in the matrix on the left side of expression (8).
Likewise, for each class corresponding to the class code supplied from the class classification unit 137, and for each value of the parameter z output from the parameter generation unit 191, the adding unit 138 performs, using the prediction taps (student data) x_{n,k} and the teacher data y_k, calculations equivalent to the multiplication (x_{n,k} y_k) of the student data x_{n,k} and the teacher data y_k and the summation (Σ) in the vector on the right side of expression (8).
That is to say, the adding unit 138 stores in its built-in memory (not shown) the component (Σ x_{n,k} x_{n',k}) of the matrix on the left side and the component (Σ x_{n,k} y_k) of the vector on the right side of expression (8) obtained for the teacher data previously taken as the teacher pixel of interest, and adds to that matrix component or vector component the corresponding components calculated, using the teacher data and student data, for the teacher data newly taken as the teacher pixel of interest (i.e., performs the addition represented by the summation in expression (8)). The adding unit 138 performs this addition taking all teacher data stored in the teacher data storage unit 132 as the teacher pixel of interest, thereby setting up the normal equation given by expression (8) for each value of the parameter z for each class, and then supplies the normal equations to the tap coefficient calculation unit 139. The tap coefficient calculation unit 139 solves the normal equations supplied from the adding unit 138 for each value of the parameter z for each class, thereby obtaining the optimal tap coefficients w_n for each value of the parameter z for each class, which it outputs and supplies to the adding unit 201.
The adding unit 201 performs, for each class, addition regarding the parameter z (or more precisely, the corresponding variable t_m) and the optimal tap coefficients w_n. That is to say, the adding unit 201 performs, using the variables t_i (t_j) obtained from the parameter z by expression (10), calculations equivalent to the multiplication (t_i t_j) of the variables t_i and t_j corresponding to the parameter z with one another and the summation (Σ), for obtaining the component X_{i,j} defined in expression (26) in the matrix on the left side of expression (28).
It should be understood that the component X_{i,j} is determined by the parameter z alone and is unrelated to the class, so the calculation of the component X_{i,j} does not actually need to be carried out for each class; carrying it out once is sufficient.
Further, the adding unit 201 performs, using the variables t_i obtained from the parameter z by expression (10) and the optimal tap coefficients w_n, calculations equivalent to the multiplication (t_i w_n) of the variable t_i corresponding to the parameter z and the optimal tap coefficient w_n and the summation (Σ), for obtaining the component Y_i defined in expression (27) in the vector on the right side of expression (28).
The adding unit 201 obtains the component X_{i,j} represented by expression (26) and the component Y_i represented by expression (27) for each class, thereby setting up the normal equation of expression (28) for each class, and supplies the normal equations to the coefficient seed calculation unit 202. The coefficient seed calculation unit 202 solves the normal equation of expression (28) supplied from the adding unit 201 for each class, thereby obtaining and outputting the coefficient seed data β_{m,n} for each class.
The coefficient seed memory 183 in Fig. 30 may be arranged to store the coefficient seed data β_{m,n} obtained in this way for each class.
Now, with the coefficient output unit 124 shown in Fig. 30, an arrangement may be made in which, for example, the coefficient seed memory 183 is not provided, the optimal tap coefficients w_n for each value of the parameter z output from the tap coefficient calculation unit 139 shown in Fig. 36 are stored in memory, and the optimal tap coefficients selected according to the parameter z stored in the parameter memory 184 are set in the coefficient memory 181. In this case, however, a memory with a capacity proportional to the number of values the parameter z can assume is required. Conversely, with the arrangement in which the coefficient seed memory 183 is provided to store the coefficient seed data, the storage capacity of the coefficient seed memory 183 does not depend on the number of values the parameter z can assume, so a memory of small capacity can be used as the coefficient seed memory 183. Moreover, when the coefficient seed data β_{m,n} is stored, the tap coefficients w_n are generated from the value of the parameter z by expression (9), so tap coefficients w_n that can be said to be continuous with respect to the value of the parameter z are obtained. Accordingly, the image quality of the high image quality image signal serving as the second image signal output from the computing unit 125 can be adjusted continuously, in arbitrarily fine steps.
Note that with the arrangement described above, the learning image signal is used without change as teacher data corresponding to the second image signal, and the low image quality image signal obtained by reducing the resolution of the learning image signal is used as student data corresponding to the first image signal, and the learning of the coefficient seed data is carried out accordingly; coefficient seed data can therefore be obtained for carrying out image conversion processing serving as resolution improvement processing, which converts the first image signal into a second image signal with improved resolution.
In this case, at the image conversion unit 431, the horizontal resolution and the vertical resolution of the image signal can be improved in accordance with the parameter z. Accordingly, in this case, it can be said that the parameter z is a parameter corresponding to resolution.
Here, depending on how the image signal serving as student data corresponding to the first image signal and the image signal serving as teacher data corresponding to the second image signal are selected, coefficient seed data for carrying out various types of image conversion processing can be obtained.
That is to say, for example, with an arrangement in which a high image quality image signal is used as teacher data, and an image signal obtained by superimposing noise corresponding to the parameter z on that high image quality image signal serving as teacher data is used as student data, learning processing can be carried out to obtain coefficient seed data for carrying out image conversion processing serving as noise removal processing, which converts the first image signal into a second image signal from which the noise contained therein has been removed (or reduced).
Also, for example, with an arrangement in which a certain image signal is used as teacher data and an image signal obtained by thinning out the number of pixels of the image signal serving as teacher data is used as student data, or in which a certain image signal is used as student data and an image signal obtained by thinning out the number of pixels of the image signal serving as student data at a predetermined thinning ratio is used as teacher data, learning processing can be carried out to obtain coefficient seed data for carrying out image conversion processing serving as scaling processing, which converts the first image signal into a second image signal that has been enlarged or reduced.
When coefficient seed data for noise removal processing or coefficient seed data for scaling processing is stored in the coefficient seed memory 183, noise removal or scaling (enlargement or reduction) corresponding to the parameter z can be carried out at the image conversion unit 431.
In the case described above, the tap coefficient w_n is defined by β_{1,n} z^0 + β_{2,n} z^1 + ... + β_{M,n} z^{M-1} as represented in expression (9), and tap coefficients w_n for improving both the horizontal and vertical resolution in accordance with the parameter z are obtained using expression (9); however, an arrangement may be made in which tap coefficients w_n are obtained that improve the horizontal resolution and the vertical resolution independently, in accordance with separate parameters z_x and z_y.
That is to say, the tap coefficient w_n is defined by, for example, a third-order expression
$$ \beta_{1,n} z_x^0 z_y^0 + \beta_{2,n} z_x^1 z_y^0 + \beta_{3,n} z_x^2 z_y^0 + \beta_{4,n} z_x^3 z_y^0 + \beta_{5,n} z_x^0 z_y^1 + \beta_{6,n} z_x^0 z_y^2 + \beta_{7,n} z_x^0 z_y^3 + \beta_{8,n} z_x^1 z_y^1 + \beta_{9,n} z_x^2 z_y^1 + \beta_{10,n} z_x^1 z_y^2 $$
instead of by expression (9), and the variable t_m defined in expression (10) is defined by, for example,
$$ t_1 = z_x^0 z_y^0,\; t_2 = z_x^1 z_y^0,\; t_3 = z_x^2 z_y^0,\; t_4 = z_x^3 z_y^0,\; t_5 = z_x^0 z_y^1,\; t_6 = z_x^0 z_y^2,\; t_7 = z_x^0 z_y^3,\; t_8 = z_x^1 z_y^1,\; t_9 = z_x^2 z_y^1,\; t_{10} = z_x^1 z_y^2 $$
instead of by expression (10). In this case as well, the tap coefficient w_n can ultimately be expressed by expression (11); accordingly, by carrying out learning at the learning apparatus (Figs. 32 and 36) using, as student data, image signals wherein the horizontal resolution and vertical resolution of the teacher data have been reduced in accordance with the parameters z_x and z_y, respectively, coefficient seed data β_{m,n} can be obtained, and thereby tap coefficients w_n can be obtained that improve the horizontal resolution and the vertical resolution independently, in accordance with the separate parameters z_x and z_y.
As a further example, in addition to the parameters z_x and z_y corresponding to horizontal resolution and vertical resolution, a parameter z_t corresponding to temporal resolution may be introduced, whereby tap coefficients w_n can be obtained that improve horizontal resolution, vertical resolution, and temporal resolution independently, in accordance with the separate parameters z_x, z_y, and z_t.
Likewise, for scaling processing, in the same way as for resolution improvement processing, tap coefficients w_n can be obtained for scaling at an enlargement ratio (or reduction ratio) corresponding to the parameter z in both the horizontal and vertical directions, or tap coefficients w_n can be obtained for scaling at enlargement ratios (or reduction ratios) corresponding to the separate parameters z_x and z_y, independently in the horizontal and vertical directions.
Moreover, with the learning apparatus (Figs. 32 and 36), learning may be carried out using, as student data, image signals wherein the horizontal and vertical resolution of the teacher data has been reduced in accordance with the parameter z_x and noise has been added to the teacher data in accordance with the parameter z_y, to obtain coefficient seed data β_{m,n}; thereby, tap coefficients w_n can be obtained that improve the horizontal and vertical resolution in accordance with the parameter z_x and also carry out noise removal in accordance with the parameter z_y.
The coefficient seed memory 183 (Fig. 30) of the image conversion unit 431R shown in Fig. 28 stores coefficient seed data obtained by learning using only the R signal of an image signal as teacher data and the R, G, and B signals of an image signal as student data. Similarly, the coefficient seed memory 183 (Fig. 30) of the image conversion unit 431G shown in Fig. 28 stores coefficient seed data obtained by learning using only the G signal of an image signal as teacher data and the R, G, and B signals of an image signal as student data. In the same way, the coefficient seed memory 183 (Fig. 30) of the image conversion unit 431B shown in Fig. 28 stores coefficient seed data obtained by learning using only the B signal of an image signal as teacher data and the R, G, and B signals of an image signal as student data.
Next, Fig. 37 illustrates another configuration example of the signal processing units 411R, 411G, and 411B making up the signal processing unit 404 shown in Fig. 27. Note that portions corresponding to those in Fig. 28 are denoted with the same reference numerals, and description thereof is omitted as appropriate. That is to say, the signal processing units 411R, 411G, and 411B shown in Fig. 37 are arranged in the same way as in Fig. 28, except that the image storage units 432R, 432G, and 432B are not provided, and a control unit 211 is provided in place of the evaluation unit 433.
In Fig. 37, the parameter output from the operating unit 185, rather than the second image signal output from the image conversion unit 431G, is supplied to the control unit 211. The control unit 211 obtains the parameter output from the operating unit 185, and controls the placement positions of the R photoreceptor unit 423R, G photoreceptor unit 423G, and B photoreceptor unit 423B of the sensor unit 401 (Fig. 20) accordingly.
Fig. 38 illustrates a configuration example of the control unit 211 shown in Fig. 37. The control signal output unit 221 obtains the parameter supplied from the operating unit 185, and identifies, from the parameter table stored in the parameter table storage unit 222, the offsets Ph_G, Pv_G, Ph_B, and Pv_B associated with the parameter obtained from the operating unit 185. Further, the control signal output unit 221 supplies to the sensor unit 401, in the same way as the control signal output unit 444 described above, a control signal specifying the offsets Ph_G, Pv_G, Ph_B, and Pv_B identified using the parameter table, thereby controlling the placement positions of the R photoreceptor unit 423R, G photoreceptor unit 423G, and B photoreceptor unit 423B of the sensor unit 401 (Fig. 20).
The parameter table storage unit 222 stores a parameter table in which each parameter that can be input by operating the operating unit 185 is associated with the offsets Ph_G, Pv_G, Ph_B, and Pv_B representing the placement positions of the R photoreceptor unit 423R, G photoreceptor unit 423G, and B photoreceptor unit 423B of the sensor unit 401 at which an image signal suited to the image conversion processing corresponding to that parameter is obtained. The parameter table is obtained in advance by parameter table learning, described later.
Accordingly, at the control signal output unit 221, a control signal specifying the offsets Ph_G, Pv_G, Ph_B, and Pv_B associated with the parameter obtained from the operating unit 185 is supplied to the sensor unit 401 to control the placement positions of the R photoreceptor unit 423R, G photoreceptor unit 423G, and B photoreceptor unit 423B of the sensor unit 401 (Fig. 20), so the sensor unit 401 outputs an image signal suited to the image conversion processing corresponding to the parameter obtained from the operating unit 185. Subjecting such an image signal to the image conversion processing corresponding to the parameter obtained from the operating unit 185 yields an image signal of even higher quality.
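Conceptually, the control unit 211 performs a table lookup from the operated parameter to the offsets; the following sketch illustrates this under assumed interfaces (the table contents and the set_offsets method are illustrative, not taken from the patent).

```python
class ControlUnit211:
    """Sketch of control unit 211 (Fig. 38): the parameter table maps each parameter
    value that the operating unit can output to the offsets Ph_G, Pv_G, Ph_B, Pv_B
    found suitable by parameter table learning."""
    def __init__(self, parameter_table):
        self.parameter_table = parameter_table       # parameter table storage unit 222

    def on_parameter(self, z, sensor):
        offsets = self.parameter_table[z]            # control signal output unit 221: look up offsets
        sensor.set_offsets(*offsets)                 # control signal specifying Ph_G, Pv_G, Ph_B, Pv_B

# Illustrative table: {parameter z: (Ph_G, Pv_G, Ph_B, Pv_B)}
example_table = {0: (0.0, 0.0, 0.0, 0.0), 1: (0.25, 0.25, 0.5, 0.5)}
```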
Next, the operation of the image pickup apparatus shown in Fig. 27 will be described with reference to the flowchart in Fig. 39, taking as an example the case in which the signal processing unit 404 shown in Fig. 27 is configured with the signal processing units 411 shown in Fig. 37.
First, in step S191, the control signal output unit 221 of the control unit 211 (Fig. 38) obtains the parameter output from the operating unit 185, and flow proceeds to step S192. In step S192, the control unit 211 identifies, from the parameter table stored in the parameter table storage unit 222, the offsets Ph_G, Pv_G, Ph_B, and Pv_B associated with the parameter obtained from the operating unit 185, supplies to the sensor unit 401 a control signal specifying those offsets Ph_G, Pv_G, Ph_B, and Pv_B, and flow proceeds to step S193. The placement positions of the R photoreceptor unit 423R, G photoreceptor unit 423G, and B photoreceptor unit 423B of the sensor unit 401 are thereby controlled.
In step S193, the sensor unit 401 receives the light of the subject and carries out photoelectric conversion, thereby obtaining an image signal in the form of an electrical signal (i.e., imaging the subject), and supplies the image signal to the signal adjustment unit 402. The signal adjustment unit 402 subjects the image signal supplied from the sensor unit 401 to CDS processing and supplies the result to the A/D conversion unit 403. The A/D conversion unit 403 subjects the image signal supplied from the signal adjustment unit 402 to A/D conversion and supplies the result to the signal processing unit 411 as a first image signal, and flow proceeds from step S193 to step S194.
That is to say, in this case, the placement positions of the R photoreceptor unit 423R, G photoreceptor unit 423G, and B photoreceptor unit 423B of the sensor unit 401 (Fig. 20) are the positions corresponding to the offsets Ph_G, Pv_G, Ph_B, and Pv_B associated with the parameter output from the operating unit 185. Accordingly, in step S193, an image signal suited to the image conversion processing corresponding to the parameter output from the operating unit 185 is output from the sensor unit 401, and this image signal is supplied to the signal processing unit 411 as the first image signal.
In step S194, the image conversion unit 431 (Fig. 29) of the signal processing unit 411 (Fig. 37) subjects the first image signal supplied from the A/D conversion unit 403 to image conversion processing serving as signal processing corresponding to the parameter output from the operating unit 185, thereby producing a second image signal of improved image quality relative to the first image signal, and flow then proceeds to step S195.
Now, as described above, the first image signal supplied to the image conversion unit 431 is an image signal suited to the image conversion processing corresponding to the parameter output from the operating unit 185; accordingly, in step S194, subjecting the first image signal to the image conversion processing corresponding to the parameter obtained from the operating unit 185 yields an image signal of even higher image quality.
In step S195, the image conversion unit 431 outputs the second image signal obtained by the image conversion processing to the output unit 405, whereby the processing of one frame (or field) ends. The image pickup apparatus repeats the processing according to the flowchart of Fig. 39 until, for example, the user issues an instruction to stop image pickup.
Next, Figure 40 illustrates a configuration example of a learning apparatus for learning the parameter table stored in the parameter table storage unit 222 shown in Figure 38.
The sensor unit 531, signal adjustment unit 532, and A/D converting unit 533 are configured in the same way as the sensor unit 401, signal adjustment unit 402, and A/D converting unit 403 shown in Figure 27. Note, however, that in the image pickup apparatus shown in Figure 27 the placement positions of the R photoreceptor unit 423R, G photoreceptor unit 423G, and B photoreceptor unit 423B of the sensor unit 401 (Figure 20) are controlled by the signal processing unit 404 (more precisely, by the control signal output by the control unit 211 of the signal processing unit 411 constituting the signal processing unit 404), whereas in the learning apparatus shown in Figure 40 the placement positions of the photoreceptor units of the sensor unit 531 corresponding to the R photoreceptor unit 423R, G photoreceptor unit 423G, and B photoreceptor unit 423B (hereafter referred to as the "placement positions in the sensor unit 531" as appropriate) are controlled by a control signal output by the controller 537.
The image conversion unit 534 is configured in the same way as the image conversion unit 431 (431G) shown in Figure 29. However, whereas the image conversion unit 431 shown in Figure 29 performs image conversion processing corresponding to the parameter output by the operating unit 185, the image conversion unit 534 shown in Figure 40 performs, on the first image signal output by the A/D converting unit 533, image conversion processing corresponding to the parameter output by the controller 537.
The position determination unit 535 obtains the parameter output by the controller 537 and the offsets Ph_G, Pv_G, Ph_B, and Pv_B specified by the control signal (hereafter abbreviated to "offset P" as appropriate). In addition, the position determination unit 535 obtains from the image conversion unit 534 a second image signal, namely a first image signal imaged with the sensor unit 531 in the placement state corresponding to the control signal output by the controller 537 (that is, the state represented by the offset P specified by the control signal) and then subjected to the image conversion processing corresponding to the parameter output by the controller 537 (hereafter also called "the second image signal corresponding to the control signal and the parameter" as appropriate). The position determination unit 535 then evaluates the second image signal supplied from the image conversion unit 534, associates the parameter output by the controller 537 with the offset P indicated by the control signal in accordance with the evaluation result, and supplies these to the position storage unit 536.
The position storage unit 536 stores the parameter and the offset P supplied from the position determination unit 535 as a parameter/offset pair. By storing such a pair for each of the multiple values that the parameter z output by the controller 537 can assume, the position storage unit 536 holds a parameter table, which is a list of parameter/offset pairs.
The controller 537 generates several values that the parameter z can assume, for example z = 0, 1, 2, and so on, in the same way as the parameter generating unit 191 in Figure 32. In addition, for each generated parameter value, the controller 537 generates several values that the offset P can assume (P_1, P_2, and so on up to P_N, where N is a value greater than or equal to 2). The controller 537 then sequentially takes each generated parameter value as the parameter value of interest, and supplies the parameter value of interest, together with each of the multiple offset values P generated for it, to the position determination unit 535. Furthermore, the controller 537 supplies a control signal specifying the offset P to the position determination unit 535 and the sensor unit 531.
Figure 41 illustrates a configuration example of the position determination unit 535 shown in Figure 40. The position determination unit 535 comprises a storage unit 541, a correlation calculation unit 542, and a determination evaluation unit 543. The storage unit 541, correlation calculation unit 542, and determination evaluation unit 543 are each configured in the same way as the storage unit 441, correlation calculation unit 442, and determination evaluation unit 443 shown in Figure 23.
Note, however, that the determination evaluation unit 543 is also supplied with the parameter (the parameter value of interest) and the offset output by the controller 537 (Figure 40). Like the determination evaluation unit 443 shown in Figure 23, the determination evaluation unit 543 evaluates the second image signal output by the image conversion unit 534 (Figure 40) on the basis of the correlation supplied from the correlation calculation unit 542, and outputs an evaluation result indicating whether the image quality of the second image signal is high or low. In addition, in accordance with the evaluation result, the determination evaluation unit 543 associates the parameter and the offset supplied from the controller 537, and supplies the associated parameter/offset pair to the position storage unit 536 (Figure 40).
Next, the parameter table learning processing performed by the learning apparatus shown in Figure 40 will be described with reference to the flowchart in Figure 42.
First, in step S320, the controller 537 selects one value from the range that the parameter z can assume as the parameter value z of interest, and supplies this value to the image conversion unit 534 and to the determination evaluation unit 543 of the position determination unit 535 (Figure 41). Also in step S320, the image conversion unit 534 and the determination evaluation unit 543 obtain the parameter value z of interest from the controller 537, and the flow proceeds to step S321.
In step S321, the sensor unit 531 receives the object light and performs photoelectric conversion, thereby obtaining an image signal in the form of an electrical signal (that is, imaging the object), and supplies this image signal to the signal adjustment unit 532. The signal adjustment unit 532 subjects the image signal supplied from the sensor unit 531 to CDS processing and supplies the result to the A/D converting unit 533. The A/D converting unit 533 subjects the image signal supplied from the signal adjustment unit 532 to A/D conversion, then supplies it to the image conversion unit 534 as the first image signal, and the flow proceeds from step S321 to step S322.
In step S322, the image conversion unit 534 subjects the first image signal supplied from the A/D converting unit 533 to the image conversion processing corresponding to the parameter value z of interest obtained from the controller 537, thereby producing a second image signal with image quality improved over the first image signal, and the flow then proceeds to step S323.
In step S323, the position determination unit 535 evaluates the second image signal supplied from the image conversion unit 534, and the flow proceeds to step S324. The evaluation processing performed in step S323 will be described in detail later with reference to Figure 43.
In step S324, the determination evaluation unit 543 of the position determination unit 535 (Figure 41) determines whether an evaluation result indicating that the image quality of the second image signal is high has been obtained as the evaluation result of the second image signal in step S323.
If it is determined in step S324 that an evaluation result indicating that the image quality of the second image signal is high has not yet been obtained, the flow proceeds to step S325, where the controller 537 supplies a control signal specifying the offset P to the sensor unit 531, whereby the placement position of the sensor unit 531 is changed (moved). Note that the controller 537 sets the offset P specified by the control signal in the same way as the control signal output unit 444 shown in Figure 23, for instance. Also in step S325, the controller 537 supplies the offset P given to the sensor unit 531 to the determination evaluation unit 543 of the position determination unit 535 as well, and the flow returns to step S321.
In step S321, an image signal is obtained with the sensor unit 531 whose placement position was changed in the preceding step S325, and steps S321 through S325 are subsequently repeated.
Through the loop of steps S321 through S325, the image conversion unit 534 subjects the first image signals obtained at each of multiple placement positions of the sensor unit 531 to the image conversion processing corresponding to the parameter value z of interest, thereby producing second image signals as the results of the image conversion corresponding to the parameter value z of interest. In addition, each second image signal obtained for one of the multiple offsets associated with the parameter value z of interest is evaluated in step S323. Note that when step S321 is executed for the first time after step S320, the sensor unit 531 obtains the image signal at its default placement position.
Subsequently, if it is determined in step S324 that an evaluation result indicating that the image quality of the second image signal is high has been obtained, the flow proceeds to step S326. In step S326, the determination evaluation unit 543 of the position determination unit 535 associates the parameter value z of interest with the offset P supplied from the controller 537 at the time the evaluation result was obtained, that is, the offset P indicating the placement state of the sensor unit 531 at the time the first image signal corresponding to the second image signal evaluated as being of high image quality was imaged, and supplies the associated parameter value z of interest and offset P to the position storage unit 536, where they are stored. The position storage unit 536 thus stores the offset P with which a first image signal suitable for the image conversion processing corresponding to the parameter value z of interest is obtained (hereafter referred to as the "optimum offset").
The flow then proceeds from step S326 to step S327, where the controller 537 determines whether the optimum offset P has been obtained with every value in the range that the parameter z can assume having been taken as the parameter value z of interest. If it is determined in step S327 that the optimum offset P has not yet been obtained for all values in the range that the parameter z can assume, the flow returns to step S320, where the controller 537 newly selects, as the new parameter value of interest, a value in that range that has not yet been taken as the parameter value of interest, and the same processing is repeated.
If it is determined in step S327 that the optimum offset P has been obtained with all values in the range that the parameter z can assume having been considered, that is, if a parameter table pairing each of the multiple values in that range with its optimum offset P has been stored in the position storage unit 536, the flow ends.
As described above, the image conversion processing corresponding to the parameter z is performed on each first image signal obtained with the sensor unit 531 at the position corresponding to each of the multiple offsets P; for each of the multiple values of the parameter z, the second image signals obtained by the image conversion processing are evaluated, and the optimum offset P, namely the offset at which a second image signal of high image quality is obtained, is determined. Accordingly, a parameter table can be obtained that associates the parameter z with the optimum offset P at which a first image signal suitable for the image conversion processing corresponding to the parameter z is obtained. As shown in Figures 37 through 39, the first image signal is imaged by the sensor unit 401 at the placement position based on the offset that the parameter table associates with the parameter z output by the operating unit 185; an image signal suitable for the image conversion processing corresponding to the parameter z can thereby be obtained, and accordingly a second image signal of higher image quality can be obtained.
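As an informal illustration of the learning procedure of Figure 42, the Python sketch below searches, for each parameter value z, over candidate offsets until the evaluation reports high image quality. The stopping condition is simplified relative to the flowchart, and all names (capture_first_image, evaluate_second_image, and so on) are hypothetical stand-ins for the units 531 through 537.

```python
def learn_parameter_table(parameter_values, candidate_offsets,
                          capture_first_image, image_convert, evaluate_second_image):
    """Return a parameter table mapping each parameter value z to its optimum offset P."""
    parameter_table = {}
    for z in parameter_values:                         # step S320: parameter value of interest
        for offset in candidate_offsets:               # steps S321-S325: try placement positions
            first_image = capture_first_image(offset)
            second_image = image_convert(first_image, z)    # step S322
            if evaluate_second_image(second_image):         # steps S323/S324
                parameter_table[z] = offset                 # step S326: optimum offset
                break
    return parameter_table                             # step S327: all values of z considered
```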
Note that the learning processing in Figure 42 can obtain the parameter table only for some of the values in the range that the parameter z can assume, so the parameter table stored in the parameter table storage unit 222 shown in Figure 38 may not hold the very value of the parameter output by the operating unit 185. In that case, the control signal output unit 221 obtains the offset corresponding to the parameter output by the operating unit 185 by linear interpolation or a similar method applied to the parameters and offsets stored in the parameter table in the parameter table storage unit 222.
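A minimal sketch of such an interpolation, assuming the parameter table is a mapping from a few learned parameter values to offset tuples and that linear interpolation between the two nearest learned values is acceptable:

```python
def interpolate_offset(parameter_table, z):
    """Linearly interpolate the offset for a parameter value z not stored in the table."""
    keys = sorted(parameter_table)
    if z <= keys[0]:
        return parameter_table[keys[0]]
    if z >= keys[-1]:
        return parameter_table[keys[-1]]
    # Find the neighbouring learned parameter values that bracket z.
    for z0, z1 in zip(keys, keys[1:]):
        if z0 <= z <= z1:
            t = (z - z0) / (z1 - z0)
            p0, p1 = parameter_table[z0], parameter_table[z1]
            # Interpolate each offset component (Ph_G, Pv_G, Ph_B, Pv_B) independently.
            return tuple((1 - t) * a + t * b for a, b in zip(p0, p1))
```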
Next, the evaluation processing performed in step S323 of Figure 42 by the position determination unit 535 shown in Figure 41 will be described with reference to the flowchart in Figure 43.
In this evaluation processing, first, in step S330, the storage unit 541 stores the second image signal supplied from the image conversion unit 534 in the immediately preceding step S322 (Figure 42), and the correlation calculation unit 542 receives this second image signal. Also in step S330, the correlation calculation unit 542 calculates the correlation between the second image signal supplied from the image conversion unit 534 and the second image signal stored in the storage unit 541 in the previous iteration of step S330, and the flow proceeds to step S331.
In step S331, the determination evaluation unit 543 temporarily stores the correlation supplied from the correlation calculation unit 542 in association with the offset P corresponding to one of the two second image signals used to obtain the correlation, and the flow proceeds to step S332. Here, the determination evaluation unit 543 obtains from the controller 537 shown in Figure 40 the offset P corresponding to one of the two second image signals used to obtain the correlation supplied from the correlation calculation unit 542.
In step S332, the determination evaluation unit 543 determines, from the relation between the correlations and the offsets stored so far in step S331, whether a maximum of the correlation has been obtained. If it is determined in step S332 that a maximum of the correlation has not been obtained, the flow proceeds to step S333, where the determination evaluation unit 543 makes an evaluation to the effect that the second image signal is of low image quality, and the flow returns to step S324 of Figure 42.
In this case, in step S324 of Figure 42, it is determined that an evaluation result indicating high image quality has not been obtained, and the flow proceeds to step S325. In step S325, the controller 537 supplies the sensor unit 531 with a control signal specifying a new offset in accordance with the evaluation result, and supplies that offset P to the determination evaluation unit 543.
Returning to step S332 of Figure 43, if it is determined in step S332 that a maximum of the correlation has been obtained, the flow proceeds to step S334, where the determination evaluation unit 543 makes an evaluation to the effect that the second image signal is of high image quality, and the flow returns to step S324 of Figure 42.
In this case, in step S324 of Figure 42, it is determined that a second image signal whose evaluation result indicates high image quality has been obtained, and the flow proceeds to step S326. In step S326, the determination evaluation unit 543 associates the parameter value z of interest with the offset P supplied from the controller 537 at the time the evaluation result was obtained, that is, the offset P (the optimum offset) representing the placement state of the sensor unit 531 at the time the first image signal corresponding to the second image signal evaluated as being of high image quality was obtained, and the associated parameter value z of interest and optimum offset are supplied to and stored in the position storage unit 536.
In the case described above, an arrangement was described in which an evaluation that the second image signal is of high image quality is made if a maximum of the correlation is obtained in step S332; however, another arrangement may be made in which that evaluation is made only if a maximum of the correlation equal to or greater than a predetermined threshold is obtained in step S332.
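As a rough Python sketch of this correlation-based evaluation (Figure 43), assuming the correlation between two second image signals is computed as a normalized correlation of their pixel values and that a maximum is detected when the most recent correlation drops below the previous one; the class and helper names are hypothetical and the maximum-detection rule is a simplified interpretation.

```python
import numpy as np

def correlation(a, b):
    """Normalized correlation between two image signals given as arrays."""
    a = a.astype(float).ravel() - a.mean()
    b = b.astype(float).ravel() - b.mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

class DeterminationEvaluator:
    """Keeps the correlation/offset history and detects a maximum (steps S331/S332)."""
    def __init__(self, threshold=None):
        self.history = []            # list of (offset, correlation) pairs
        self.threshold = threshold   # optional minimum correlation for a valid maximum

    def evaluate(self, offset, corr):
        self.history.append((offset, corr))
        if len(self.history) < 3:
            return None                           # not enough points to detect a maximum
        prev_offset, prev = self.history[-2]
        if self.history[-3][1] < prev > corr:     # previous correlation is a local maximum
            if self.threshold is None or prev >= self.threshold:
                return prev_offset                # optimum offset: high image quality
        return None                               # keep changing the placement position
```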
Also, in the case described above, the evaluation of the second image signal is made on the basis of the correlation; however, another arrangement may be made in which the evaluation is based on, for example, the S/N of the second image signal obtained for each value of the offsets Ph_G, Pv_G, Ph_B, and Pv_B. Furthermore, the evaluation of the second image signal may be input from outside. In other words, for example, an arrangement may be made in which the second image signal is displayed and the evaluation of the second image signal is given by a user observing the displayed image.
The series of processing described above performed by the signal processing unit 404, the image conversion unit 534, the position determination unit 535, the controller 537, and so on can be carried out by dedicated hardware or by software. When the series of processing is carried out by software, a program constituting the software is installed in a microcomputer, a general-purpose computer, or the like, as described above with reference to Figure 17.
Note also that, in addition to the image conversion processing described above, the image conversion units 431 and 534 may perform other processing for obtaining the second image signal, such as subjecting the first image signal to digital clamping, white balance adjustment, gamma correction, linear interpolation, and so on.
Also, although the present embodiment has been described with the sensor units 401 and 531 as so-called three-sensor devices, a one-sensor, two-sensor, or four-or-more-sensor arrangement may also be applied to the sensor units 401 and 531.
In addition, although the G signal of the second image signal is used to evaluate the second image signal in the arrangement described above, the R signal or the B signal, or two or more of the R, G, and B signals, may also be used to evaluate the second image signal.
The 4th specific embodiment
Next, a fourth specific embodiment of the present invention will be described with reference to Figure 44. Figure 44 illustrates a configuration example of a fourth specific embodiment of an image pickup apparatus to which the present invention is applied. The image pickup apparatus shown in Figure 44 may be, for example, a digital still camera or a digital video camera, like the image pickup apparatus shown in Figure 18.
The sensor unit 601 includes multiple photoelectric conversion elements corresponding to pixels, receives object light projected onto it through an optical system (not shown), and supplies the signal adjustment unit 602 with an image signal in the form of an electrical signal corresponding to the object light. In addition, the sensor unit 601 changes its characteristics in accordance with a control signal supplied from the signal processing unit 604.
The signal adjustment unit 602 performs CDS processing to eliminate the reset noise contained in the image signal output by the sensor unit 601, like the signal adjustment unit 402 in Figure 18, and supplies the resulting image signal to the A/D converting unit 603. The A/D converting unit 603 performs A/D conversion on the image signal supplied from the signal adjustment unit 602, like the A/D converting unit 403 in Figure 18, that is, it samples and quantizes the image signal, and supplies the resulting digital image signal to the signal processing unit 604.
The signal processing unit 604 takes the digital image signal supplied from the A/D converting unit 603 (hereafter simply called the "image signal") as the first image signal, subjects the first image signal to predetermined image conversion processing, and outputs the resulting digital image signal to the output unit 605 as the second image signal. The signal processing unit 604 also evaluates the first image signal in a predetermined region of one screen (frame or field) and supplies the sensor unit 601 with a control signal corresponding to the evaluation.
The output unit 605 receives the second image signal output by the signal processing unit 604, like the output unit 405 in Figure 18, and outputs it. That is, the output unit 605 outputs the second image signal from the signal processing unit 604 from an external terminal (not shown) or displays it on a monitor (not shown). The output unit 605 may also store the second image signal in a recording medium (not shown), such as an optical disk, magnetic disk, magneto-optical disk, magnetic tape, or semiconductor memory, or transmit it over a wired or wireless transmission medium such as a telephone line, the Internet, or a local area network.
In the image pickup apparatus configured as described above, object light is received at the sensor unit 601, and an image signal in the form of an electrical signal corresponding to the amount of received light is supplied to the signal processing unit 604 through the signal adjustment unit 602 and the A/D converting unit 603. The signal processing unit 604 takes the image signal supplied from the sensor unit 601 through the signal adjustment unit 602 and the A/D converting unit 603 as the first image signal, subjects it to signal processing such as image conversion processing that improves image quality by, for example, improving resolution, and outputs the resulting second image signal of improved image quality to the output unit 605. The output unit 605 outputs the second image signal supplied from the signal processing unit 604.
The signal processing unit 604 also evaluates the first image signal from the sensor unit 601 in a predetermined region of one screen. That is, the signal processing unit 604 evaluates each screen of the first image signal from the sensor unit 601. In accordance with this evaluation, the signal processing unit 604 supplies a control signal to the sensor unit 601.
In accordance with the evaluation of the first image signal in the predetermined region, the sensor unit 601 changes the characteristics of each pixel of one screen (the entire photosensitive surface). The sensor unit 601 then outputs an image signal obtained from the pixels whose characteristics have been changed.
Next, the change of characteristics of the sensor unit 601 in accordance with the control signal output by the signal processing unit 604 will be described. As mentioned above, the image signal output by the sensor unit 601 is quantized in the A/D converting unit 603. Accordingly, if the predetermined region in the screen is flat, so that the variation in the signal level of the image signal in that region is small relative to the quantization step used in the quantization performed by the A/D converting unit 603, as shown in Figure 45A, the image signal in the predetermined region is quantized to the same value at the A/D converting unit 603, and the small variation is filtered out. Performing image conversion processing in the signal processing unit 604 on a digital image signal quantized to a single value cannot produce a high-resolution image.
Accordingly, the signal processing unit 604 supplies a control signal for changing the characteristics of the sensor unit 601, thereby changing the characteristics of the sensor unit 601 so that an image signal suitable for the image conversion processing is output, that is, an image signal from which the image conversion processing can produce a high-resolution image.
That is, the signal processing unit 604 evaluates the image signal of the predetermined region output by the sensor unit 601, and if the variation in the signal level of the image signal is judged to be as small as shown in Figure 45A, the characteristics of the sensor unit 601 are changed so that the variation in the signal level of the image signal output by the sensor unit 601 becomes large, as shown in Figure 45B. In this case, an image signal whose signal level varies appreciably is input to the signal processing unit 604, and performing image conversion processing on this image signal yields a high-resolution image.
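As a small numerical illustration of why amplification before quantization matters here (a sketch only; the quantization step and gain are arbitrarily chosen, not values from the embodiment):

```python
import numpy as np

def quantize(signal, step):
    """Uniform quantization, as performed by the A/D converting unit."""
    return np.round(signal / step) * step

flat_region = np.array([10.02, 10.05, 10.03, 10.07])  # small variations in a flat region
step = 0.5

print(quantize(flat_region, step))        # -> [10. 10. 10. 10.]  the variation is lost
print(quantize(flat_region * 8.0, step))  # amplified first: the variation survives quantization
```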
Figure 46 illustrates a configuration example of the sensor unit 601 whose characteristics can be changed. The sensor unit 601 has a large number of pixels arranged in the horizontal and vertical directions, forming a photosensitive surface. Each pixel is composed of, for example, a photoreceptor unit 611 and a control unit 612, as shown in Figure 46.
The photoreceptor unit 611 is configured as a photoelectric conversion device such as a photodiode, and outputs to the control unit 612 an electrical signal corresponding to the amount of light received. The control unit 612 is composed of a transistor or the like, amplifies the electrical signal from the photoreceptor unit 611 by a predetermined amplification factor, and outputs the result to the signal adjustment unit 602. A control signal is also supplied from the signal processing unit 604 to the control unit 612, and the control unit 612 controls, in accordance with the control signal, the amplification factor by which the electrical signal from the photoreceptor unit 611 is amplified.
The control unit 612 changes the amplification factor, which is its characteristic, in accordance with the control signal from the signal processing unit 604, so as to output an image signal whose signal level has been changed to be suitable for the image conversion processing performed in the signal processing unit 604.
Now, the sensor unit 601 composed of pixels having the photoreceptor unit 611 and the control unit 612 can be constructed by applying MEMS technology to a CMOS sensor, for instance. It should be understood, however, that the sensor unit 601 is by no means limited to a CMOS sensor; a CCD may be used instead, or a HARP image pickup tube, which utilizes the electron avalanche phenomenon in an a-Se photoconductive target. The sensor unit 601 may also be composed of a device having an amplifying unit that amplifies the image signal for all pixels or for each group of one or more pixels, with the amplification factor of the amplifying unit changed in accordance with the control signal.
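The following is a minimal Python model of such a pixel, a sketch only, illustrating how a per-pixel gain set by a control signal scales the photoelectrically converted output before it reaches the signal adjustment unit; the class and method names are hypothetical.

```python
class Pixel:
    """One pixel of the sensor unit: a photoreceptor plus a controllable amplifier."""
    def __init__(self, gain=1.0):
        self.gain = gain                 # amplification factor of the control unit

    def set_gain(self, gain):
        self.gain = gain                 # applied via the control signal

    def read(self, incident_light):
        electrical = incident_light      # photoelectric conversion (idealized)
        return self.gain * electrical    # amplified output to the signal adjustment unit

p = Pixel()
p.set_gain(4.0)        # the signal processing unit raised the gain for a dark, flat region
print(p.read(0.05))    # -> 0.2
```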
Next, Figure 47 illustrates a configuration example of the signal processing unit 604 shown in Figure 44. In Figure 47, the signal processing unit 604 comprises an image conversion unit 621, an image correction unit 622, and a level evaluation unit 623.
The image signal output by the sensor unit 601 is supplied to the signal processing unit 604 through the signal adjustment unit 602 and the A/D converting unit 603 as the first image signal. This first image signal is supplied to the image conversion unit 621 and the level evaluation unit 623.
The image conversion unit 621 subjects the first image signal to image conversion processing that improves image quality, for example by improving resolution, and supplies the resulting digital image signal of improved quality to the image correction unit 622 as the second image signal.
The image correction unit 622 is supplied with the second image signal from the image conversion unit 621, and with amplification factor information and area information from the level evaluation unit 623. Based on the amplification factor information and area information supplied by the level evaluation unit 623, the image correction unit 622 corrects the second image signal from the image conversion unit 621 and supplies the corrected second image signal to the output unit 605.
The level evaluation unit 623 evaluates the first image signal for each predetermined region forming part of one screen. In addition, the level evaluation unit 623 determines the amplification factor with which the control unit 612 shown in Figure 46 is to perform amplification, and supplies the amplification factor information indicating that amplification factor, together with area information indicating the region for which the evaluation was performed, to the image correction unit 622. The level evaluation unit 623 also supplies the amplification factor information as a control signal to the control units 612 of the pixels making up the predetermined region among the pixels constituting the sensor unit 601.
That is, the level evaluation unit 623 evaluates, for each region in the screen, whether the first image signal is suitable for the image conversion processing of the image conversion unit 621. Specifically, the level evaluation unit 623 identifies the signal level (luminance or color) of the first image signal in each predetermined region and evaluates whether the variation of that signal level is large or small. If the evaluation is that the variation of the signal level is too small, the level evaluation unit 623 sets a larger amplification factor for the pixels making up the region; likewise, if the evaluation is that the variation of the signal level is too large, the level evaluation unit 623 sets a smaller amplification factor for the pixels making up the region, and supplies the amplification factor information representing the amplification factor to the control units 612 (Figure 46) of the pixels of each region as a control signal.
In the control unit 612 shown in Figure 46, the output signal of the photoreceptor unit 611 is amplified with the amplification factor given by the control signal from the level evaluation unit 623; accordingly, an image signal whose signal level variation is suited to the image conversion processing performed in the image conversion unit 621 is output from the sensor unit 601.
Meanwhile, at the level evaluation unit 623, the amplification factor information supplied as the control signal to the control units 612 of the sensor unit 601 is associated with area information indicating the region composed of the pixels that perform amplification with the amplification factor represented by that information, and is supplied to the image correction unit 622. Based on the amplification factor information and area information supplied by the level evaluation unit 623, the image correction unit 622 corrects the second image signal obtained by subjecting the first image signal to the image conversion processing in the image conversion unit 621.
That is, the value of the first image signal in the region indicated by the area information differs from the value of the signal output by the photoreceptor units 611 of the sensor unit 601 by a gain corresponding to the amplification factor represented by the amplification factor information associated with that area information. Accordingly, the image correction unit 622 corrects the second image signal in the region indicated by the area information so as to produce a second image signal with the same gain as would have been obtained by performing the image conversion processing on the signal output by the photoreceptor units 611 of the sensor unit 601. Specifically, the image correction unit 622 corrects the second image signal in the region indicated by the area information by attenuating it by a gain proportional to the amplification factor indicated by the amplification factor information associated with that area information.
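A minimal sketch of this correction, assuming the second image signal is held as an array, the region is given as a slice, and the correction is simply division by the per-region amplification factor (an assumption consistent with the attenuation described above):

```python
import numpy as np

def correct_second_image(second_image, region_slices, amplification_factor):
    """Undo the per-region gain applied at the sensor (image correction unit 622)."""
    corrected = second_image.astype(float).copy()
    corrected[region_slices] /= amplification_factor   # attenuate in proportion to the gain
    return corrected

# Example: a 4x4 second image whose upper-left 2x2 region was captured with gain 8.
img = np.full((4, 4), 80.0)
print(correct_second_image(img, (slice(0, 2), slice(0, 2)), 8.0))
```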
Note that the predetermined region serving as the evaluation increment in the level evaluation unit 623 may be one screen (the region of one frame or field), or a region composed of a single pixel or multiple pixels.
Now, if the predetermined region serving as the evaluation increment at the level evaluation unit 623 is one pixel or several pixels rather than a whole screen, the image conversion unit 621 performs image conversion processing on a first image signal that has a different amplification factor for each predetermined region. When the image conversion unit 621 performs various calculations on a first image signal with different amplification factors, the differences in amplification factor must be taken into account in those calculations. Here, to simplify the description, let us assume that the predetermined region serving as the evaluation increment at the level evaluation unit 623 is a whole screen.
Next, Figure 48 illustrates a first configuration example of the level evaluation unit 623 shown in Figure 47. In Figure 48, the level evaluation unit 623 comprises an evaluation pixel extraction unit 631, an occupancy calculation unit 632, and an amplification factor determining unit 633.
The first image signal supplied from the sensor unit 601 to the signal processing unit 604 via the signal adjustment unit 602 and the A/D converting unit 603 is supplied to the evaluation pixel extraction unit 631. The evaluation pixel extraction unit 631 extracts, as evaluation pixels, the pixels used to evaluate one screen of the first image signal, and supplies them to the occupancy calculation unit 632. The occupancy calculation unit 632 calculates the occupancy of the evaluation pixels in the screen serving as the evaluation increment, and supplies it to the amplification factor determining unit 633.
The amplification factor determining unit 633 evaluates the first image signal of the screen serving as the evaluation increment based on the occupancy supplied by the occupancy calculation unit 632, and determines an amplification factor corresponding to the evaluation as the amplification factor of the control units 612 of the pixels of the sensor unit 601 corresponding to that one-screen image signal. In addition, the amplification factor determining unit 633 supplies the amplification factor information indicating this amplification factor to the sensor unit 601 (its control units 612) as the control signal. The amplification factor determining unit 633 also associates the amplification factor information with area information representing the one screen, that is, the region composed of the pixels that perform amplification with the amplification factor indicated by the amplification factor information, and supplies them to the image correction unit 622 (Figure 47).
Next, the operation of the image pickup apparatus shown in Figure 44 will be described with reference to the flowchart in Figure 49.
In this image pickup apparatus, first, in step S401, the photoreceptor units 611 of the sensor unit 601 receive the object light and perform photoelectric conversion, thereby obtaining an image signal in the form of an electrical signal (imaging the object); the control units 612 amplify this image signal with the predetermined amplification factor and supply the amplified signal to the signal adjustment unit 602, and the flow proceeds to step S402. Note that when an image is acquired for the first time after the power of the image pickup apparatus is turned on, the amplification factor of the control units 612 of the sensor unit 601 is a default value.
In step S402, the signal adjustment unit 602 subjects the one-screen image signal supplied from the sensor unit 601 to signal adjustment such as CDS processing, then supplies it to the A/D converting unit 603, and the flow proceeds to step S403. In step S403, the A/D converting unit 603 subjects the one-screen image signal from the signal adjustment unit 602 to A/D conversion, then supplies it to the signal processing unit 604 as the first image signal, and the flow proceeds to step S404.
In step S404, the level evaluation unit 623 of the signal processing unit 604 (Figure 47) evaluates the one-screen first image signal supplied by the A/D converting unit 603 and determines the amplification factor for the control units 612 shown in Figure 46. Also in step S404, the level evaluation unit 623 supplies the amplification factor information indicating that amplification factor, together with the area information indicating the one screen for which the evaluation was performed, to the image correction unit 622, supplies the amplification factor information as the control signal to the control units 612 of the pixels making up one screen of the sensor unit 601, and the flow proceeds to step S405. The detailed processing performed in step S404 will be described later.
In step S405, the control units 612 of the pixels in the sensor unit 601 control the amplification factor so that the output of the photoreceptor units 611 is amplified in accordance with the control signal supplied by the level evaluation unit 623 in the preceding step S404, and the flow proceeds to step S406.
In step S406, the image conversion unit 621 of the signal processing unit 604 (Figure 47) performs image conversion processing on the first image signal supplied by the A/D converting unit 603, supplies the resulting second image signal with image quality improved over the first image signal to the image correction unit 622, and the flow proceeds to step S407.
In step S407, the image correction unit 622 corrects the second image signal supplied by the image conversion unit 621 based on the amplification factor information and area information supplied by the level evaluation unit 623 in the preceding step S404, supplies the corrected second image signal to the output unit 605, and the flow proceeds to step S408.
In step S408, the output unit 605 outputs the second image signal supplied by the image correction unit 622 of the signal processing unit 604, thereby completing the processing for one screen of the image. That is, the image pickup apparatus in Figure 44 repeats the one-screen processing according to the flowchart in Figure 49 until, for example, the user issues a command to stop image pickup.
Accordingly, in the next step S401, the image signal output by the photoreceptor units 611 of the sensor unit 601 is amplified with the amplification factor controlled in step S404 of the preceding round of processing, and consequently a first image signal suitable for the image conversion processing is supplied to the image conversion unit 621.
Next, the evaluation processing of step S404 in Figure 49 will be described with reference to the flowchart in Figure 50. First, in step S421, the evaluation pixel extraction unit 631 of the level evaluation unit 623 (Figure 48) extracts, from the pixels making up the one screen serving as the evaluation increment, the pixels used to evaluate the first image signal of that screen as evaluation pixels. Specifically, the evaluation pixel extraction unit 631 extracts, from the pixels making up the screen, pixels whose first image signal level is higher than a first level and pixels whose level is lower than a second level as the evaluation pixels.
Here, a value equal to, or slightly smaller than, the maximum value that the first image signal can assume is used as the first level. Likewise, a value equal to, or slightly larger than, the minimum value that the first image signal can assume is used as the second level. In the following description, pixels whose first image signal level is higher than the first level will be called "high-level pixels" as appropriate, and pixels whose first image signal level is lower than the second level will be called "low-level pixels" as appropriate.
In step S421, the evaluation pixel extraction unit 631 extracts the high-level pixels and low-level pixels from the pixels making up the one screen serving as the evaluation increment as the evaluation pixels, supplies these evaluation pixels to the occupancy calculation unit 632, and the flow proceeds to step S422.
In step S422, the occupancy calculation unit 632 calculates the proportions of the screen serving as the evaluation increment occupied by the high-level pixels and by the low-level pixels supplied by the evaluation pixel extraction unit 631 in step S421, as the high-level occupancy and the low-level occupancy, supplies them to the amplification factor determining unit 633, and the flow proceeds to step S423.
In step S423, based on the high-level occupancy and the low-level occupancy supplied by the occupancy calculation unit 632, the amplification factor determining unit 633 evaluates the one-screen first image signal serving as the evaluation increment, and in accordance with this evaluation determines the amplification factor to be used by the control units 612 of the pixels corresponding to the one-screen image signal.
That is, if the high-level occupancy is sufficiently greater than the low-level occupancy, meaning that a large number of high-level pixels exist in the screen, the amplification factor determining unit 633 evaluates the one-screen first image signal as being unsuitable for the image conversion processing performed in the image conversion unit 621, and determines a value lower than the current value as the amplification factor of the control units 612. Likewise, if the low-level occupancy is sufficiently greater than the high-level occupancy, meaning that a large number of low-level pixels exist in the screen, the amplification factor determining unit 633 evaluates the one-screen first image signal as being unsuitable for the image conversion processing performed in the image conversion unit 621, and determines a value higher than the current value as the amplification factor of the control units 612. Otherwise, the amplification factor determining unit 633 evaluates the one-screen first image signal as being suitable for the image conversion processing of the image conversion unit 621, and determines the previous value as the amplification factor of the control units 612.
Also in step S423, the amplification factor determining unit 633 supplies the amplification factor information indicating the determined amplification factor to the sensor unit 601 (its control units 612) as the control signal, associates the amplification factor information with the area information indicating the one screen, that is, the region composed of the pixels that perform amplification with the amplification factor indicated by the amplification factor information, supplies the associated amplification factor information and area information to the image correction unit 622 (Figure 47), and the flow returns.
In this case, an image signal at a level suitable for the image conversion processing at the image conversion unit 621 is output from the sensor unit 601, and consequently a second image signal of higher image quality can be obtained from the image conversion processing. It should be noted that an arrangement may also be made in which a lower amplification factor is determined only for the high-level pixels and a higher amplification factor is determined only for the low-level pixels.
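A rough Python sketch of this occupancy-based decision (Figure 50); the level thresholds, margin, and gain step are arbitrary assumptions for illustration only.

```python
import numpy as np

def decide_amplification(first_image, current_gain,
                         high_level=240, low_level=15, margin=0.2, step=0.8):
    """Return a new amplification factor for the whole screen (steps S421-S423)."""
    pixels = first_image.ravel()
    high_occupancy = np.mean(pixels > high_level)   # proportion of near-maximum pixels
    low_occupancy = np.mean(pixels < low_level)     # proportion of near-minimum pixels

    if high_occupancy > low_occupancy + margin:     # too many high-level pixels
        return current_gain * step                  # lower the amplification factor
    if low_occupancy > high_occupancy + margin:     # too many low-level pixels
        return current_gain / step                  # raise the amplification factor
    return current_gain                             # the signal is suitable as-is
```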
In the arrangement shown in Figure 49, in step S401 the sensor unit 601 amplifies the image signal output by the photoreceptor units 611 with the amplification factor controlled in the preceding step S404; thus the image signal of the current frame or field is amplified with the amplification factor determined from the evaluation of the image of the preceding frame or field. Alternatively, an arrangement may be made in which the sensor unit 601 amplifies the image signal of the current frame or field with the amplification factor determined from the evaluation of the image of that same frame or field.
The operation of the image pickup apparatus shown in Figure 44 will now be described for the case in which the sensor unit 601 amplifies the image signal of the current frame or field with the amplification factor determined from the evaluation of the image of that frame or field. In this case, steps S431 through S435 of Figure 51 perform the same processing as steps S401 through S405 of Figure 49, and the flow proceeds to step S436.
In step S436, as in step S431 (which corresponds to step S401 of Figure 49), the photoreceptor units 611 of the sensor unit 601 receive the object light and perform photoelectric conversion, thereby obtaining an image signal in the form of an electrical signal. In addition, in step S436, the control units 612 amplify the image signal obtained by the photoreceptor units 611 with the amplification factor controlled in the preceding step S435, supply the amplified image signal to the signal processing unit 604 via the signal adjustment unit 602 and the A/D converting unit 603, and the flow proceeds to step S437.
The flow then proceeds through steps S437 to S439 in order, in which the same processing as steps S406 to S408 of Figure 49 is performed, thereby completing the processing of the image of one frame or field. The image pickup apparatus shown in Figure 44 repeats the processing according to the flowchart in Figure 51 until, for example, the user issues a command to stop image pickup.
In the processing of the flowchart in Figure 51, the sensor unit 601 performs image pickup twice within the cycle of one frame or field, in steps S431 and S436. The image signal picked up first is evaluated in step S434, and the amplification factor for the image signal to be picked up second is determined in step S435 based on the evaluation result. Accordingly, at the sensor unit 601, the image signal obtained by the second imaging of the current frame or field is amplified with the amplification factor determined by evaluating the image signal obtained by the first imaging.
Note that, according to the processing of the flowchart in Figure 49, the sensor unit 601 only needs to perform imaging once for a single frame (or field), whereas according to the processing of the flowchart in Figure 51 the sensor unit 601 needs to perform imaging at least twice for a single frame (or field).
Next, Figure 52 illustrates another configuration example of the level evaluation unit 623 in Figure 47. In Figure 52, the level evaluation unit 623 comprises an activity calculation unit 641 and an amplification factor determining unit 642.
The first image signal supplied from the sensor unit 601 to the signal processing unit 604 through the signal adjustment unit 602 and the A/D converting unit 603 is supplied to the activity calculation unit 641. The activity calculation unit 641 calculates the activity of the first image signal of the one screen serving as the evaluation increment, and supplies the calculated activity to the amplification factor determining unit 642.
Examples of the activity of one screen of the first image signal that can be adopted include the dynamic range of the first image signal in one screen (that is, the difference between its maximum and minimum values), the sum of the absolute values of the differences between adjacent pixels of the first image signal, the dispersion of the first image signal in one screen, and so on.
The amplification factor determining unit 642 evaluates the first image signal of the one screen serving as the evaluation increment, and determines the amplification factor corresponding to its evaluation as the amplification factor of the control units 612 of the pixels of the sensor unit 601 corresponding to that one-screen image signal.
That is, if, for example, the activity obtained from the activity calculation unit 641 is very large, the one-screen first image signal is evaluated as being unsuitable for the image conversion processing performed in the image conversion unit 621, and the amplification factor determining unit 642 determines an amplification factor lower than the current one as the amplification factor of the control units 612. Likewise, if, for example, the activity obtained from the activity calculation unit 641 is very small, the one-screen first image signal is evaluated as being unsuitable for the image conversion processing performed in the image conversion unit 621, and the amplification factor determining unit 642 determines an amplification factor higher than the current one as the amplification factor of the control units 612. Furthermore, if the activity obtained from the activity calculation unit 641 is neither too large nor too small, the one-screen first image signal is evaluated as being suitable for the image conversion processing in the image conversion unit 621, and the amplification factor determining unit 642 determines the current amplification factor as the amplification factor of the control units 612.
The amplification factor determining unit 642 supplies the amplification factor information indicating the determined amplification factor to the sensor unit 601 (its control units 612) as the control signal, likewise associates the amplification factor information with the area information indicating the one screen, that is, the region composed of the pixels of the sensor unit 601 that perform amplification with the amplification factor indicated by the amplification factor information, and supplies the associated amplification factor information and area information to the image correction unit 622 (Figure 47).
In this case as well, an image signal at a level suitable for the image conversion processing in the image conversion unit 621 is output from the sensor unit 601, and consequently a second image signal of higher image quality can be obtained from the image conversion processing.
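A brief sketch of one possible activity measure and the corresponding decision (Figure 52), using the sum of absolute differences between horizontally adjacent pixels as the activity; the bounds and gain step are illustrative assumptions.

```python
import numpy as np

def activity(first_image):
    """Sum of absolute differences between horizontally adjacent pixels in one screen."""
    return float(np.abs(np.diff(first_image.astype(float), axis=1)).sum())

def decide_amplification_by_activity(first_image, current_gain,
                                     low_bound=1e3, high_bound=1e5, step=0.8):
    a = activity(first_image)
    if a > high_bound:             # activity too large: lower the amplification factor
        return current_gain * step
    if a < low_bound:              # activity too small: raise the amplification factor
        return current_gain / step
    return current_gain            # suitable for the image conversion processing
```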
Next, Figure 53 illustrates a third configuration example of the level evaluation unit 623 described with reference to Figure 47. In Figure 53, the level evaluation unit 623 comprises a comparison unit 651 and an amplification factor determining unit 652.
The first image signal supplied from the sensor unit 601 to the signal processing unit 604 through the signal adjustment unit 602 and the A/D converting unit 603 is supplied to the comparison unit 651. The comparison unit 651 compares the first image signal of the one screen serving as the evaluation increment with predetermined thresholds and supplies the comparison result to the amplification factor determining unit 652. The thresholds compared with the first image signal are, for example, a first threshold, which is a value that is small with respect to the image conversion processing of the image conversion unit 621, and a second threshold, which is a value that is large with respect to that image conversion processing.
Examples of the first image signal value compared with the first and second thresholds in the comparison unit 651 include an arbitrary sample of the first image signal of the one screen serving as the evaluation increment, the first image signal value taken by the largest number of pixels in the screen, the mean value of the first image signal over the screen, and so on.
Based on the result of the comparison with the thresholds supplied by the comparison unit 651, the amplification factor determining unit 652 evaluates the one-screen first image signal serving as the evaluation increment, and determines an amplification factor corresponding to the evaluation as the amplification factor of the control units 612 of the pixels corresponding to the one-screen image signal.
That is, if the comparison result of the comparison unit 651 indicates that the first image signal is equal to or lower than the first threshold, the amplification factor determining unit 652 evaluates the one-screen first image signal as being unsuitable for the image conversion processing of the image conversion unit 621, and based on that evaluation determines a value higher than the current value as the amplification factor of the control units 612. Likewise, if the comparison result indicates that the first image signal is equal to or higher than the second threshold, the amplification factor determining unit 652 evaluates the one-screen first image signal as being unsuitable for the image conversion processing of the image conversion unit 621, and based on that evaluation determines a value lower than the current value as the amplification factor of the control units 612. Furthermore, if the comparison result indicates that the first image signal lies between the first threshold and the second threshold, the amplification factor determining unit 652 evaluates the one-screen first image signal as being suitable for the image conversion processing of the image conversion unit 621, and based on that evaluation determines the current value as the amplification factor of the control units 612.
The amplification factor determining unit 652 then supplies the amplification factor information indicating the determined amplification factor to the sensor unit 601 (its control units 612) as the control signal, likewise associates the amplification factor information with the area information indicating the one screen, that is, the region composed of the pixels of the sensor unit 601 that perform amplification with the amplification factor indicated by the amplification factor information, and supplies the associated amplification factor information and area information to the image correction unit 622 (Figure 47).
In this case as well, an image signal at a level suitable for the image conversion processing of the image conversion unit 621 is output from the sensor unit 601, and consequently a second image signal of higher image quality can be obtained from the image conversion processing.
Note that an arrangement may also be made in which the comparison between the first image signal and the thresholds is performed in single-pixel increments, and the amplification factor of each pixel is determined according to the comparison result; a sketch of the screen-level decision follows.
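This is a compact sketch of the threshold comparison (Figure 53), using the mean value over the screen as the representative first-image-signal value; the thresholds and gain step are illustrative assumptions, and the same logic could be applied per pixel as noted above.

```python
import numpy as np

def decide_amplification_by_threshold(first_image, current_gain,
                                      first_threshold=30, second_threshold=220, step=0.8):
    """Compare a representative value of the first image signal with two thresholds."""
    representative = float(np.mean(first_image))   # or any sample / most frequent value
    if representative <= first_threshold:          # too dark: raise the amplification factor
        return current_gain / step
    if representative >= second_threshold:         # too bright: lower the amplification factor
        return current_gain * step
    return current_gain                            # within range: keep the current value
```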
The configuration of the image conversion unit 621 in Figure 47 is the same as that of the image conversion unit 21 in Figure 6, and accordingly its description is omitted (see Figures 6 through 10 and the corresponding description). Likewise, an arrangement may be made in which the image pickup apparatus is provided with an operating unit 185, as in the arrangement shown in Figure 27, so that the image conversion unit 621 in Figure 47 is configured in the same way as the image conversion unit 431 in Figure 29 (see Figures 29 through 36 and the corresponding description). If such an arrangement is adopted, the signal processing unit 604 may correct, in accordance with the parameter, the amplification factor indicated by the amplification factor information supplied to the sensor unit 601 as the control signal. That is, the amplification factor may be corrected so that the higher the resolution corresponding to the parameter, the larger the value of the amplification factor.
Also, the foregoing description has been given with reference to an example in which, in the image conversion processing, the whole of one frame or field of the first image signal is converted into the second image signal; however, an arrangement may also be made in which, for example, only a partial region of one frame or field of the first image signal is converted into the second image signal, as shown in Figure 54.
The series of processing of the signal processing unit 604 described above can be implemented by dedicated hardware or by software. If the series of processing is implemented by software, a program constituting the software is installed in a microcomputer, a general-purpose computer, or the like, as described above with reference to Figure 17.
The 5th specific embodiment
A fifth specific embodiment of the present invention will now be described. Figure 55 illustrates an embodiment of a sensing system to which the present invention is applied. It should be noted that the term "system" here refers to a logical configuration of multiple devices, regardless of whether the component devices are housed in a single enclosure.
The sensing system is configured with a CMOS imager 801 and a DRC (Digital Reality Creation) circuit 802; it senses light from an object (object light) and outputs an image signal of high image quality corresponding to the object. That is, the CMOS imager 801 receives the object light and supplies the DRC circuit 802 with an image signal in the form of an electrical signal corresponding to the amount of received light.
The DRC circuit 802 performs signal processing on the image signal supplied by the CMOS imager 801, and obtains and outputs an image signal of higher image quality (hereafter called the "high-image-quality image signal" as appropriate). In addition, the DRC circuit 802 controls the CMOS imager 801 based on the image signal supplied by the CMOS imager 801. The CMOS imager 801 is thereby controlled so as to output an image signal suitable for the signal processing performed by the DRC circuit 802.
Accordingly, in the sensing system of Figure 55, the CMOS imager 801 outputs an image signal suitable for the signal processing of the DRC circuit 802, and the DRC circuit 802 can therefore obtain a high-image-quality image signal by performing signal processing on that image signal.
Figure 56 illustrates a first configuration example of the DRC circuit 802 shown in Figure 55. In Figure 56, the DRC circuit 802 comprises a DRC unit 811 for performing signal processing on the image signal output from the CMOS imager 801, and a control unit 812 for controlling the CMOS imager 801 according to the image signal supplied from the CMOS imager 801.
The DRC unit 811 performs various types of signal processing, one example of which is image conversion processing for converting an image signal from a first image signal into a second image signal. The image conversion processing can be realized with the same configuration as the image conversion processing described above; however, an example including control of the CMOS imager 801 will be described here, it being noted that portions thereof are the same as in the foregoing description.
Now, for example, if the first image signal is taken to be a low-resolution image signal and the second image signal a high-resolution image signal, the image conversion processing can be said to be resolution enhancement processing. Also, for example, if the first image signal is taken to be a low-S/N (signal-to-noise) image signal and the second image signal a high-S/N image signal, the image conversion processing can be said to be noise removal processing. Further, for example, if the first image signal is taken to be an image signal of a predetermined size and the second image signal an image signal larger or smaller than the first image signal, the image conversion processing can be said to be image scaling (enlarging or reducing) processing. Accordingly, various types of processing can be realized by the image conversion processing, depending on how the first and second image signals are defined.
The DRC unit 811 takes the image signal output from the CMOS imager 801 as the first image signal, and converts this first image signal into a high-quality image signal serving as the second image signal.
Now, in the DRC unit 811, the image signal supplied from the CMOS imager 801 is supplied, as the first image signal, to a prediction tap extracting unit 821 and a class tap extracting unit 822. The prediction tap extracting unit 821 sequentially takes the pixels making up the second image signal as the pixel of interest, and further extracts, as prediction taps, several pixels making up the first image signal (or rather, their pixel values) to be used for predicting the pixel value of the pixel of interest.
More specifically, the prediction tap extracting unit 821 extracts from the first image signal, as prediction taps, multiple pixels at positions spatially or temporally near the pixel of the first image signal corresponding to the pixel of interest (for example, the pixel of the first image signal spatially and temporally closest to the pixel of interest).
The class tap extracting unit 822 extracts, as class taps, several pixels making up the first image signal, to be used for performing class classification, i.e., for classifying the pixel of interest into one of multiple classes.
Now, the prediction taps and the class taps may have the same tap structure or may have different tap structures. Note also that the class tap extracting unit 822 corresponds to the feature extracting unit 122 in Figure 6.
The prediction taps obtained by the prediction tap extracting unit 821 are supplied to a prediction computing unit 825, and the class taps obtained by the class tap extracting unit 822 are supplied to a class code generating unit 823.
Based on the level distribution of the pixels making up the class taps from the class tap extracting unit 822, the class code generating unit 823 classifies the pixel of interest into one of multiple classes, generates a class code corresponding to the resulting class, and supplies this class code to a coefficient generating unit 824. Note that the class code generating unit 823 corresponds to the class classification unit 123 in Figure 6.
As described above, class classification can be performed by employing ADRC or the like, for example. With the method using ADRC, the pixel values of the pixels making up the class taps are subjected to ADRC processing, and the class of the pixel of interest is determined according to the resulting ADRC code.
In K-bit ADRC, for example, the maximum value MAX and the minimum value MIN of the components making up the class taps (i.e., the pixel values of the class taps) are detected, DR = MAX - MIN is taken as the local dynamic range of the set, and the components making up the class taps are re-quantized into K bits based on this dynamic range DR. That is to say, the minimum value MIN is subtracted from each component, and the subtracted value is divided (quantized) by DR/2^K. A bit string in which the K-bit components thus obtained are arranged in a predetermined order is then output as the ADRC code. Accordingly, in the event that the class taps are subjected to 1-bit ADRC processing, each component is divided by the average of the maximum value MAX and the minimum value MIN (with fractions dropped), whereby each component is converted into one bit (i.e., binarized). A bit string in which these 1-bit components are arranged in a predetermined order is output as the ADRC code. The class code generating unit 823 outputs, as the class code, the ADRC code obtained by subjecting the class taps to such ADRC processing, for example.
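As an illustration of the 1-bit ADRC processing described above, the following is a minimal Python sketch that binarizes a class tap against the average of its maximum and minimum values and concatenates the bits into a class code; the function name and sample values are illustrative, not part of the embodiment.

```python
def adrc_1bit_class_code(class_tap):
    """1-bit ADRC: binarize each class-tap pixel against the average of the
    tap's maximum and minimum values, then concatenate the bits in order."""
    mx, mn = max(class_tap), min(class_tap)
    threshold = (mx + mn) / 2.0          # average of MAX and MIN
    bits = ['1' if p >= threshold else '0' for p in class_tap]
    return ''.join(bits)

# A nine-pixel class tap from a nearly flat region yields a code with few
# adjacent-bit inversions:
print(adrc_1bit_class_code([100, 101, 99, 100, 100, 100, 101, 100, 120]))  # '000000001'
```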
Now, the level distribution pattern of the pixels making up the class taps could be output by the class code generating unit 823 as the class code without change. In this case, however, if the class taps are made up of the pixel values of N pixels and K bits are assigned to the pixel value of each pixel, the number of possible class codes output by the class code generating unit 823 would be (2^N)^K, an enormous number exponentially proportional to the number of bits K of the pixel values.
Accordingly, at the class code generating unit 823, class classification is preferably performed after compressing the amount of information of the class taps by the above-described ADRC processing, vector quantization, or the like.
Now, class taps are obtained by the class tap extracting unit 822 from the image signal output from the CMOS imager 801, and a class code is obtained therefrom by the class code generating unit 823. Accordingly, the class tap extracting unit 822 and the class code generating unit 823 can be said to make up a class classification unit for performing class classification.
The coefficient generating unit 824 stores tap coefficients for each class obtained by learning, and from the stored tap coefficients, supplies (outputs) to the prediction computing unit 825 the tap coefficient stored at the address corresponding to the class code supplied from the class code generating unit 823, i.e., the tap coefficient of the class represented by that class code. Here, a tap coefficient is equivalent to a coefficient to be multiplied by input data at a so-called tap of a digital filter.
The prediction computing unit 825 acquires the prediction taps output by the prediction tap extracting unit 821 and the tap coefficient output by the coefficient generating unit 824, and using the prediction taps and the tap coefficient, performs a predetermined prediction computation for obtaining a predicted value of the true value of the pixel of interest. The prediction computing unit 825 thus obtains and outputs the predicted value of the pixel value of the pixel of interest, i.e., the pixel value of a pixel making up the second image signal.
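Assuming that expression (1) referred to in the steps below is the linear first-order prediction customary in this type of processing, i.e., a weighted sum of the prediction-tap pixel values and the tap coefficients of the class, the prediction computation can be sketched as follows; the names and coefficient values are illustrative only.

```python
def predict_pixel(prediction_tap, tap_coefficients):
    """Predicted value of the pixel of interest: the sum of w_n * x_n over the tap."""
    assert len(prediction_tap) == len(tap_coefficients)
    return sum(w * x for w, x in zip(tap_coefficients, prediction_tap))

# The coefficient generating unit amounts to a per-class lookup of learned coefficients.
coefficients_by_class = {'000000001': [1.0 / 13] * 13}   # illustrative values only
diamond_tap = [100, 101, 99, 100, 102, 100, 101, 100, 98, 100, 101, 99, 100]  # 13 pixels
print(predict_pixel(diamond_tap, coefficients_by_class['000000001']))
```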
The control unit 812 controls the CMOS imager 801 according to the level distribution of the image signal output from the CMOS imager 801. More specifically, the class code of the class taps extracted from the image signal output from the CMOS imager 801 is supplied from the class code generating unit 823 to the control unit 812. At the control unit 812, a DL (delay line) 826 temporarily stores the class code supplied from the class code generating unit 823, and supplies the stored class code to a movement amount control unit 827. The movement amount control unit 827 controls the CMOS imager 801 according to the class code supplied from the DL 826.
Now, as described above, the class code generating unit 823 generates the class code by subjecting the class taps to ADRC processing. This class code is a string of re-quantized values obtained by re-quantizing the pixel values of the multiple pixels making up the class taps extracted from the image signal output from the CMOS imager 801, and accordingly can be said to represent the level distribution of the multiple pixels making up the class taps, i.e., of the image signal output from the CMOS imager 801. Accordingly, the movement amount control unit 827, which controls the CMOS imager 801 according to the class code, can be said to control the CMOS imager 801 according to the level distribution of the image signal output from the CMOS imager 801.
Next, Figures 57A and 57B illustrate examples of the tap structures of the prediction taps and the class taps, respectively. Figure 57A illustrates an example of a tap structure of class taps. The example shown in Figure 57A is a class tap configured of nine pixels. That is to say, with the example in Figure 57A, a cross-shaped class tap is configured of the pixel of the image signal output from the CMOS imager 801 corresponding to the pixel of interest, and two adjacent pixels each extending above, below, to the left, and to the right thereof.
Figure 57B illustrates an example of a tap structure of prediction taps. The example shown in Figure 57B is a prediction tap configured of 13 pixels. That is to say, with the example shown in Figure 57B, a diamond-shaped prediction tap is configured of the pixel of the image signal output from the CMOS imager 801 corresponding to the pixel of interest, two pixels each extending above, below, to the left, and to the right thereof, and one pixel in each of the four diagonal directions.
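The two tap structures can be represented as sets of (horizontal, vertical) offsets from the pixel corresponding to the pixel of interest. The following sketch assumes the geometry as read from the description of Figures 57A and 57B; the offsets, names, and border handling are illustrative.

```python
# Cross-shaped class tap (9 pixels): the centre pixel plus two pixels each
# above, below, to the left, and to the right.
CLASS_TAP_OFFSETS = ([(0, 0)]
                     + [(0, dy) for dy in (-2, -1, 1, 2)]
                     + [(dx, 0) for dx in (-2, -1, 1, 2)])

# Diamond-shaped prediction tap (13 pixels): the cross above plus one pixel
# in each of the four diagonal directions.
PREDICTION_TAP_OFFSETS = CLASS_TAP_OFFSETS + [(-1, -1), (1, -1), (-1, 1), (1, 1)]

def extract_tap(image, x, y, offsets):
    """Collect pixel values at the given offsets around (x, y), clamping at the border."""
    h, w = len(image), len(image[0])
    return [image[min(max(y + dy, 0), h - 1)][min(max(x + dx, 0), w - 1)]
            for dx, dy in offsets]
```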
The prediction computation of the prediction computing unit 825 shown in Figure 56 is the same as the processing of the computing unit 125 shown in Figure 6, and the learning of the tap coefficients used in the prediction computation can be performed in the same way as the learning of the tap coefficients stored in the coefficient memory 181 in Figure 7 (see Figure 8 through Figure 10), so a detailed description will be omitted here.
The configuration of the corresponding learning device is the same as that of the learning device shown in Figure 8; note, however, that the feature extracting unit 136 configures class taps with the same tap structure as the class tap extracting unit 822 and supplies these to the class classification unit 137. The class classification unit 137 then generates the same class codes as the class code generating unit 823.
Next, Figures 58A through 58C illustrate a configuration example of the sensor system shown in Figure 55. Figure 58A is a plan view of the sensor system in Figure 55.
The sensor system is configured on a single chip by semiconductor processing. In Figure 58A, the CMOS imager 801 is formed in the upper right portion of the chip, and the DRC circuit 802 and other circuits are formed in the other portions.
As shown in Figure 58B, the CMOS imager 801 has a large number of so-called cells arranged in a grid layout, each cell being equivalent to a pixel. Each pixel of the CMOS imager 801 has a photodiode 851, a condenser lens 852, and a MEMS unit 853, as shown in Figure 58C.
The photodiode 851 receives incident light, and generates and outputs an electric signal corresponding to the amount of light received. The electric signal output by the photodiode 851 serves as the pixel value of a single pixel.
The condenser lens 852 is a so-called on-chip lens, and is disposed at a position facing the photosensitive face of the photodiode 851. The condenser lens 852 condenses a light beam and emits the condensed light beam onto the photodiode 851. Condensing the light beam at the condenser lens 852 and casting it onto the photodiode 851 improves the usage efficiency of light at the photodiode 851.
The MEMS unit 853 is a movable portion configured using MEMS technology, and holds the condenser lens 852. Driving the MEMS unit 853 moves the position of the condenser lens 852 with regard to the photosensitive face of the photodiode 851.
Note that, although each pixel of the CMOS imager 801 also has circuits such as an amplifier and so forth, these are omitted from the drawings.
The movement amount control unit 827 of the control unit 812 shown in Figure 56 controls the position of the condenser lens 852 by driving the MEMS unit 853. The position control of the condenser lens 852 by the movement amount control unit 827 will now be described with reference to Figures 59A through 60B.
Now, as shown in Figures 59A and 59B, the position of the condenser lens 852 can be moved by driving the MEMS unit 853 to one of two positions, namely a position near the photodiode 851 and a position away from the photodiode 851. As shown in Figure 59A, if the position of the condenser lens 852 is away from the photodiode 851, subject light of a narrow range is cast from the condenser lens 852 onto the photodiode 851. Also, as shown in Figure 59B, if the position of the condenser lens 852 is near the photodiode 851, subject light of a wide range is cast from the condenser lens 852 onto the photodiode 851. Note that the relation between the position of the condenser lens 852 and the range of the subject light cast from the condenser lens 852 onto the photodiode 851 may be the opposite of that described above.
The subject light emitted from the condenser lens 852 is received by the photodiode 851, and an electric signal approximately proportional to the integrated value of the amount of light received is output as the pixel value. Thus, at the photodiode 851, the amount of light received is integrated to yield the pixel value, so small changes in the subject light are lost (i.e., quantized) in the pixel value. This loss of small changes in the subject light in the pixel value will be referred to as the "integration effect" as appropriate. The wider the range of subject light received by the photodiode 851, the greater the integration effect, meaning that smaller changes in the subject light are lost, yielding an image of lower resolution with a certain amount of motion blurring.
In the event that the image corresponding to the subject light is a flat image with very little change in level (a flat portion), noise becomes noticeable. Accordingly, in this case, receiving the subject light over a greater area at the photodiode 851 reduces, owing to the integration effect, the noise contained in the image made up of the pixel values output from the photodiode 851.
Also, in the event that the image corresponding to the subject light is an image with great change in level (a non-flat portion), receiving the subject light over a greater area at the photodiode 851 smooths out the greater level changes, owing to the integration effect, and lowers the resolution. Accordingly, in this case, receiving the subject light over a smaller area at the photodiode 851 allows the greater level changes of the subject light to be reflected relatively faithfully in the image made up of the pixel values output from the photodiode 851.
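The trade-off described above can be illustrated numerically: integrating the incident light over a wider range suppresses noise in a flat portion but smears a sharp level change. The following is a small sketch under that assumption, in which the averaging window stands in for the light-reception range; all values are illustrative.

```python
import random

def sense(signal, width):
    """Pixel values produced by integrating (averaging) the incident light over
    `width` neighbouring samples, a stand-in for the light-reception range."""
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - width // 2), min(len(signal), i + width // 2 + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

random.seed(0)
flat = [100 + random.gauss(0, 5) for _ in range(32)]   # flat portion with noise
edge = [100] * 16 + [200] * 16                          # sharp level change

# Wide reception range: the noise spread in the flat portion shrinks...
print(max(flat) - min(flat), max(sense(flat, 5)) - min(sense(flat, 5)))
# ...but the level change is smeared over several pixels instead of one step.
print(sense(edge, 5)[14:18])
```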
On the other hand, in the signal processing (image conversion processing) performed by the DRC unit 811, prediction taps are configured of the pixel values output from the photodiodes 851, and the pixel of interest (or rather, its pixel value) is predicted by computation using those prediction taps. Accordingly, if the pixels making up (extracted as) the prediction taps contain noise from a flat image, the prediction accuracy of the pixel of interest deteriorates. Likewise, if the pixels making up the prediction taps originally had great changes in level but are taken from an image in which those level changes have been reduced, the prediction accuracy of the pixel of interest also deteriorates.
That is to say, for a smooth (flat) image, in order to perform appropriate signal processing such that the pixel of interest, i.e., the high-quality image signal obtained from the DRC unit 811, can be predicted with high accuracy, the prediction taps need to be configured of pixels containing little noise. Likewise, for an image with great changes in level, the prediction taps need to be configured of pixels in which the level changes are reflected faithfully.
As described above, having the photodiode 851 receive subject light over a wider range reduces the noise contained in the image made up of the pixel values output from the photodiode 851. Likewise, having the photodiode 851 receive subject light over a narrower range allows the greater level changes in the subject light to be reflected faithfully in the image made up of the pixel values output from the photodiode 851.
Accordingly, by having the photodiode 851 receive subject light over a relatively wide range for a smooth image, and over a narrower range for an image in which the level of the subject light changes greatly, the DRC unit 811 can predict the pixel of interest with high accuracy, that is, appropriate signal processing can be performed, and an image signal of higher image quality can be obtained.
Accordingly, the movement amount control unit 827 controls the position of the condenser lens 852 according to the class code supplied from the class code generating unit 823 via the DL 826, as follows.
Figures 60A and 60B illustrate examples of the class code output from the class code generating unit 823. Figure 60A illustrates a class code obtained by subjecting the cross-shaped class tap shown in Figure 57A to 1-bit ADRC processing. In both Figures 60A and 60B, the bits corresponding to the pixel values P1 through P9 of the nine pixels making up the class tap in Figure 57A are arranged in a single row in order from P1 to P9.
In 1-bit ADRC processing, the pixel values of the pixels making up the class tap are re-quantized using the average of the maximum value MAX (the greatest pixel value) and the minimum value MIN (the smallest pixel value) of the pixels making up the class tap. That is to say, pixel values smaller than the average of the maximum value MAX and the minimum value MIN become 0, and pixel values equal to or greater than this average become 1.
Accordingly, for a class tap extracted from a flat portion of an image, the change in the pixel values of the pixels P1 through P9 making up the class tap is small, so there is almost no bit inversion between adjacent bits in the class code, as with "000000001" in Figure 60A, for example.
On the other hand, for a class tap extracted from a portion of an image with great change, the change in the pixel values of the pixels P1 through P9 making up the class tap is great, so a class code with a great number of bit inversions between adjacent bits is obtained, such as "101101010" in Figure 60B, for example.
Accordingly, from the class code, the image can be recognized as a smooth image if the number of bit inversions between adjacent bits is small, and as an image with great changes in level if the number of bit inversions between adjacent bits is great.
Accordingly, if the number of bit inversions between adjacent bits in the class code is small, the image of pixel values obtained from the photodiode 851 is smooth, so the movement amount control unit 827 controls the position of the condenser lens 852 to the position near the photodiode 851 shown in Figure 59B, so that the photodiode 851 receives subject light over a wide range. Likewise, if the number of bit inversions between adjacent bits in the class code is great, the image of pixel values to be obtained from the photodiode 851 has great changes in level, so the movement amount control unit 827 controls the position of the condenser lens 852 to the position away from the photodiode 851 shown in Figure 59A, so that the photodiode 851 receives subject light over a narrow range.
Next, the signal processing of the DRC circuit 802 shown in Figure 56 will be described with reference to Figure 61. Note that here, the DRC unit 811 takes a certain pixel in the (N+1)'th frame (or field) as the pixel of interest, and predicts that pixel of interest.
In this case, with regard to the image signal of the N'th frame output from the CMOS imager 801, in step S511 the class tap extracting unit 822 extracts, as the class tap of the pixel of interest, the pixels in a cross shape (Figure 57A) centered on the pixel at the position closest to the position of the pixel of interest in the N'th frame output from the CMOS imager 801, supplies the extracted pixels to the class code generating unit 823, and the flow proceeds to step S512. In other words, the class tap of the pixel of interest of the image signal of the (N+1)'th frame is extracted here from the image signal of the N'th frame, which is the immediately preceding frame.
In step S512, the class code generating unit 823 obtains the class code of the pixel of interest by subjecting the class tap supplied from the class tap extracting unit 822 to 1-bit ADRC processing, supplies the class code to the coefficient generating unit 824 and also to the movement amount control unit 827 via the DL 826, and the flow proceeds to step S513.
In step S513, the movement amount control unit 827 generates control information for controlling the position of the condenser lens 852 in accordance with the class code supplied via the DL 826, and the flow proceeds to step S514. In step S514, following the control information generated in the immediately preceding step S513, the movement amount control unit 827 controls the MEMS units 853 of the pixels making up the prediction tap of the pixel of interest, thereby moving the condenser lenses 852 of those pixels to the position near the photodiode 851 or to the position away from the photodiode 851.
Next, at the imaging timing of the (N+1)'th frame, when the image signal of the (N+1)'th frame is imaged at and output from the CMOS imager 801, the flow proceeds from step S514 to step S515, and the prediction tap extracting unit 821 extracts, as the prediction tap of the pixel of interest, the pixels in a diamond shape (Figure 57B) centered on the pixel at the position closest to the position of the pixel of interest in the (N+1)'th frame output from the CMOS imager 801, supplies the extracted pixels to the prediction computing unit 825, and the flow proceeds to step S516.
In other words, in step S514, the MEMS units 853 of the pixels serving as the prediction tap of the pixel of interest are controlled, whereby the positions of the condenser lenses 852 of those pixels are also controlled. Accordingly, in step S515, the prediction tap of the pixel of interest is made up of pixel values output from photodiodes 851 of pixels whose condenser lens 852 positions have been controlled accordingly.
In step S516, the coefficient generating unit 824 outputs the tap coefficient of the class indicated by the class code of the pixel of interest supplied from the class code generating unit 823. That is to say, the coefficient generating unit 824 stores, for each class, the tap coefficients obtained beforehand by the learning at the learning device described above, reads out from these the tap coefficient of the class indicated by the class code of the pixel of interest, and outputs that tap coefficient to the prediction computing unit 825.
The flow then proceeds from step S516 to step S517, where the prediction computing unit 825 performs the computation of expression (1) above using the prediction tap supplied from the prediction tap extracting unit 821 and the tap coefficient supplied from the coefficient generating unit 824, thereby obtaining the pixel value of the pixel of interest, and the processing ends.
The above-described processing is performed sequentially with each pixel in the (N+1)'th frame taken as the pixel of interest, and likewise for the (N+2)'th frame, and so on.
Note that the foregoing arrangement involves extracting the class tap of the pixel of interest from the N'th frame, which is the frame immediately preceding the (N+1)'th frame, but an arrangement may be made wherein the class tap of the pixel of interest is extracted from the image signal of the (N+1)'th frame, which is the frame of the pixel of interest itself.
Also, within the (N+1)'th frame there can be cases wherein, when a certain pixel #A is the pixel of interest and when a pixel #B near that pixel is the pixel of interest, the condenser lens 852 of the same pixel of the CMOS imager 801 would be controlled to different positions. This can be dealt with by imaging the image signal of the (N+1)'th frame at the CMOS imager 801 in time sequence, with the condenser lens 852 of that pixel at the position for the case where pixel #A is the pixel of interest and at the position for the case where pixel #B is the pixel of interest.
In addition, the arrangement described here involves controlling, by controlling the MEMS units 853, the positions of the condenser lenses 852 of the pixels serving as the prediction tap of the pixel of interest; however, an arrangement may also be made wherein, for example, only the position of the condenser lens 852 of the pixel closest to the pixel of interest is controlled, or the positions of the condenser lenses 852 of all pixels within a certain range of the pixel of interest are controlled, and so forth.
Next, the processing in step S513 in Figure 61 (the control information generating processing for generating control information corresponding to the class code) will be described in detail with reference to the flowchart in Figure 62.
First, in step S521, the movement amount control unit 827 counts the number of bit inversions between adjacent bits in the class code of the pixel of interest, and the flow proceeds to step S522.
If the class code is, for example, "000000001" as shown in Figure 60A, there is only one inversion, from 0 to 1 between the 8th and 9th bits, so the counted number of bit inversions is 1. Likewise, if the class code is, for example, "101101010" as shown in Figure 60B, there is an inversion from 1 to 0 between the 1st and 2nd bits, from 0 to 1 between the 2nd and 3rd bits, from 1 to 0 between the 4th and 5th bits, from 0 to 1 between the 5th and 6th bits, from 1 to 0 between the 6th and 7th bits, from 0 to 1 between the 7th and 8th bits, and from 1 to 0 between the 8th and 9th bits, so the counted number of bit inversions is 7.
In step S522, the movement amount control unit 827 determines whether or not the number of bit inversions in the class code of the pixel of interest is greater than a predetermined threshold value. If the class code is 9 bits as in the above case (i.e., if the number of pixels making up the class tap is nine), the predetermined threshold value employed may be 3, for example.
If the number of bit inversions in the class code of the pixel of interest is determined to be greater than the predetermined threshold value, that is, if the class code is "101101010" as shown in Figure 60B so that the number of bit inversions is 7, which is greater than the predetermined threshold value 3, the flow proceeds to step S523, where the movement amount control unit 827 interprets this result to mean that the level change near the position of the pixel of interest in the image imaged by the CMOS imager 801 is great, and accordingly generates control information for controlling the position of the condenser lens 852 to the position away from the photodiode 851, i.e., the position at which subject light of a narrow range is cast onto the photodiode 851, and the flow returns.
If the number of bit inversions in the class code of the pixel of interest is determined to be not greater than the predetermined threshold value, that is, if the class code is "000000001" as shown in Figure 60A so that the number of bit inversions is 1, which is not greater than the predetermined threshold value 3, the flow proceeds to step S524, where the movement amount control unit 827 interprets this result to mean that the level near the position of the pixel of interest in the image imaged by the CMOS imager 801 is smooth, and accordingly generates control information for controlling the position of the condenser lens 852 to the position near the photodiode 851, i.e., the position at which subject light of a wide range is cast onto the photodiode 851, and the flow returns.
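A minimal sketch of the control information generation of Figure 62 as just described, counting adjacent-bit inversions in the class code and choosing between the two lens positions; the threshold of 3 is taken from the description above, while the function names are illustrative.

```python
def count_bit_inversions(class_code):
    """Number of adjacent-bit inversions in a class code given as a bit string."""
    return sum(a != b for a, b in zip(class_code, class_code[1:]))

def control_info_two_positions(class_code, threshold=3):
    """'far' (narrow light range) for a region with great level change,
    'near' (wide light range) for a smooth region."""
    return 'far' if count_bit_inversions(class_code) > threshold else 'near'

print(count_bit_inversions('000000001'), control_info_two_positions('000000001'))  # 1 near
print(count_bit_inversions('101101010'), control_info_two_positions('101101010'))  # 7 far
```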
Note that an arrangement has been described here wherein the class code obtained from the class tap is used as the information indicating the level distribution near the position of the pixel of interest in the image imaged by the CMOS imager 801; however, arrangements other than the one employing the class tap may also be made, such as an arrangement using the prediction tap, or any plurality of pixels near the pixel of interest, or the like, as the information indicating the level distribution near the position of the pixel of interest.
Also, in the case described above, the control of the condenser lens 852 performed by controlling the MEMS unit 853 is limited to one of two positions, namely a position near the photodiode 851 and a position away from it; however, the position of the condenser lens 852 may be controlled to three or more positions.
Figures 63A through 63C illustrate an arrangement wherein the position of the condenser lens 852 is controlled to one of three positions: a position at a certain distance from the photodiode 851 taken as the standard position, a position near the photodiode 851, and a position away from the photodiode 851.
It should be understood that, with this arrangement as well, the farther the position of the condenser lens 852 is from the photodiode 851, the narrower the range of the subject light cast from the condenser lens 852 onto the photodiode 851, and the nearer the position of the condenser lens 852 is to the photodiode 851, the wider the range of the subject light cast from the condenser lens 852 onto the photodiode 851.
When controlling the position of the condenser lens 852 to one of the three positions shown in Figures 63A through 63C, i.e., the standard position, the position near the photodiode 851, and the position away from the photodiode 851, the movement amount control unit 827 likewise controls the position of the condenser lens 852 in accordance with the class code.
In other words, if the number of bit inversions in the class code is small, and accordingly the region around the position of the pixel of interest in the image signal output from the CMOS imager 801 is a flat portion, the movement amount control unit 827 controls the position of the condenser lens 852 to the position near the photodiode 851 as shown in Figure 63C, so that subject light of a wide range is cast onto the photodiode 851. Likewise, if the number of bit inversions in the class code is great, and the level of the region around the position of the pixel of interest in the image signal output from the CMOS imager 801 changes greatly, the movement amount control unit 827 controls the position of the condenser lens 852 to the position away from the photodiode 851 as shown in Figure 63A, so that subject light of a narrow range is cast onto the photodiode 851. Further, if the number of bit inversions in the class code is neither great nor small, and the level of the region around the position of the pixel of interest in the image signal output from the CMOS imager 801 changes neither greatly nor slightly, i.e., is intermediate, the movement amount control unit 827 controls the position of the condenser lens 852 to the standard position as shown in Figure 63B, so that subject light of an intermediate range, neither wide nor narrow, is cast onto the photodiode 851.
Next, the control information generating processing in step S513 in Figure 61 in the case of controlling the position of the condenser lens 852 to one of the three positions shown in Figures 63A through 63C, i.e., the standard position, the position near the photodiode 851, and the position away from the photodiode 851, will be described in detail with reference to the flowchart in Figure 64.
First, in step S531, the movement amount control unit 827 counts the number of bit inversions between adjacent bits in the class code of the pixel of interest, and the flow proceeds to step S532. Let us say here that the number of bits making up the class code is 9, the same as with the example in Figure 62. In this case, the minimum number of bit inversions in the class code is 0 and the maximum is 8.
In step S532, the movement amount control unit 827 determines whether or not the number of bit inversions in the class code of the pixel of interest is relatively small, e.g., between 0 and 2.
If, in step S532, the number of bit inversions in the class code of the pixel of interest is determined to be between 0 and 2, the flow proceeds to step S533, where the movement amount control unit 827 judges that the amount of level change around the position of the pixel of interest in the image imaged by the CMOS imager 801 is small, and accordingly generates control information for controlling the position of the condenser lens 852 to the position near the photodiode 851, i.e., the position at which subject light of a wide range is cast onto the photodiode 851, and the flow returns.
If, in step S532, the number of bit inversions in the class code of the pixel of interest is determined not to be between 0 and 2, the flow proceeds to step S534, where the movement amount control unit 827 determines whether or not the number of bit inversions in the class code of the pixel of interest is great, e.g., between 6 and 8.
If, in step S534, the number of bit inversions in the class code of the pixel of interest is determined to be between 6 and 8, the flow proceeds to step S535, where the movement amount control unit 827 judges that the amount of level change around the position of the pixel of interest in the image imaged by the CMOS imager 801 is great, and accordingly generates control information for controlling the position of the condenser lens 852 to the position away from the photodiode 851, i.e., the position at which subject light of a narrow range is cast onto the photodiode 851, and the flow returns.
If, in step S534, the number of bit inversions in the class code of the pixel of interest is determined not to be between 6 and 8, the flow proceeds to step S536, where the movement amount control unit 827 determines whether or not the number of bit inversions in the class code of the pixel of interest is neither great nor small, e.g., between 3 and 5.
If, in step S536, the number of bit inversions in the class code of the pixel of interest is determined to be between 3 and 5, the flow proceeds to step S537, where the movement amount control unit 827 judges that the amount of level change around the position of the pixel of interest in the image imaged by the CMOS imager 801 is neither great nor small, and accordingly generates control information for controlling the position of the condenser lens 852 to the standard position, i.e., the intermediate position with regard to the photodiode 851, so that subject light of an intermediate range is cast onto the photodiode 851, and the flow returns.
Also, if, in step S536, the number of bit inversions in the class code of the pixel of interest is determined not to be between 3 and 5, the movement amount control unit 827 treats this case as an error, does not generate control information, and returns. In this case, the position of the condenser lens 852 remains at its previous position, for example.
If the class code is 9 bits and the position of the condenser lens 852 is controlled according to the number of bit inversions in the class code, the condenser lens 852 can be controlled to up to nine positions, corresponding to bit inversion counts of 0 through 8.
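The three-position variant of Figure 64 maps the inversion count onto three lens positions. The following sketch uses the ranges given above (0 to 2, 3 to 5, and 6 to 8 for a 9-bit class code); the names and position labels are illustrative.

```python
def control_info_three_positions(class_code):
    """Map the adjacent-bit inversion count of a 9-bit class code onto one of
    three condenser-lens positions, following the ranges described above."""
    inversions = sum(a != b for a, b in zip(class_code, class_code[1:]))
    if inversions <= 2:
        return 'near'       # smooth region: wide light range
    if inversions <= 5:
        return 'standard'   # intermediate level change: intermediate light range
    return 'far'            # great level change: narrow light range

# Finer control, up to nine positions for a 9-bit code, would map each possible
# inversion count 0 through 8 to its own lens position in the same way.
```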
Next, Figure 65 illustrates a second configuration example of the DRC circuit 802 shown in Figure 55. Components in Figure 65 that are the same as those in Figure 56 are denoted with the same reference numerals, and description thereof will be omitted as appropriate. In other words, the DRC circuit shown in Figure 65 is configured basically the same as the DRC circuit in Figure 56, except that a control unit 862 is provided instead of the control unit 812. The control unit 862 comprises an activity detecting unit 876 and a movement amount control unit 877.
The class tap of the pixel of interest output by the class tap extracting unit 822 is supplied to the activity detecting unit 876. From the class tap of the pixel of interest supplied from the class tap extracting unit 822, the activity detecting unit 876 detects the activity near the position of the pixel of interest in the image signal output from the CMOS imager 801, and supplies the activity to the movement amount control unit 877. Examples of the activity here include the dynamic range of the pixels making up the class tap of the pixel of interest (the difference between the maximum and minimum values of the pixels making up the class tap), the sum of the absolute values of the differences between adjacent pixels making up the class tap of the pixel of interest, the sum of the absolute values of the differences between each pixel making up the class tap of the pixel of interest and their average value, and so forth. With this arrangement, the dynamic range of the pixels making up the class tap of the pixel of interest, for example, will be used as the activity.
According to the activity supplied from the activity detecting unit 876, the movement amount control unit 877 controls the CMOS imager 801 in the same manner as the movement amount control unit 827 in Figure 56. In other words, according to the activity supplied from the activity detecting unit 876, the movement amount control unit 877 controls the positions of the condenser lenses 852 of the pixels in the prediction tap of the pixel of interest.
Figures 66A and 66B illustrate examples of the class tap output from the class tap extracting unit 822, with the pixel values P1 through P9 of the nine pixels making up the class tap in Figure 57A arranged in a single row in the order P1 through P9.
For a class tap extracted from a flat portion of an image, the change in the pixel values of the pixels P1 through P9 making up the class tap is small, so the dynamic range DR is a relatively small value, as shown in Figure 66A, for example.
On the other hand, for a class tap extracted from a portion of an image with great change, the change in the pixel values P1 through P9 making up the class tap is great, so the dynamic range DR is a relatively great value, as shown in Figure 66B, for example.
Accordingly, if the dynamic range of the class tap is small, a smooth image can be recognized, and if the dynamic range of the class tap is great, an image with great changes in level can be recognized.
Thus, if the dynamic range of the class tap is small, and accordingly the image of pixel values obtained from the photodiode 851 is smooth, the movement amount control unit 877 controls the position of the condenser lens 852 to the position near the photodiode 851 as shown in Figure 59B, so that subject light of a wide range is cast onto the photodiode 851. Likewise, if the dynamic range of the class tap is great, and accordingly the image of pixel values obtained from the photodiode 851 has great changes in level, the movement amount control unit 877 controls the position of the condenser lens 852 to the position away from the photodiode 851 as shown in Figure 59A, so that subject light of a narrow range is cast onto the photodiode 851.
Next, the signal processing of the DRC circuit 802 shown in Figure 65 will be described with reference to the flowchart in Figure 67. Here, as with the case in Figure 61, the DRC unit 811 takes a certain pixel in the (N+1)'th frame (or field) as the pixel of interest, and predicts that pixel of interest.
In this case, with regard to the image signal of the N'th frame output from the CMOS imager 801, in step S541 the class tap extracting unit 822 extracts, as the class tap of the pixel of interest, the pixels in a cross shape (Figure 57A) centered on the pixel at the position closest to the position of the pixel of interest in the N'th frame of the image signal output from the CMOS imager 801, supplies the class tap to the class code generating unit 823 and the activity detecting unit 876, and the flow proceeds to step S542. Here as well, in other words, the class tap of the pixel of interest of the (N+1)'th frame is extracted from the image signal of the N'th frame, the immediately preceding frame, as with the case in Figure 61.
In step S542, the class code generating unit 823 obtains the class code of the pixel of interest by subjecting the class tap supplied from the class tap extracting unit 822 to 1-bit ADRC processing, supplies the class code to the coefficient generating unit 824, and the flow proceeds to step S543.
In step S543, the activity detecting unit 876 detects the dynamic range, as the activity, from the class tap of the pixel of interest supplied from the class tap extracting unit 822, supplies the dynamic range to the movement amount control unit 877, and the flow proceeds to step S544.
In step S544, the movement amount control unit 877 generates (determines) control information for controlling the position of the condenser lens 852 according to the dynamic range of the class tap supplied from the activity detecting unit 876, and the flow proceeds to step S545. In step S545, the movement amount control unit 877 controls the MEMS units 853 of the pixels making up the prediction tap of the pixel of interest according to the control information generated in the immediately preceding step S544, thereby moving the positions of the condenser lenses 852 of those pixels to the predetermined positions.
Next, at the imaging timing of the (N+1)'th frame, when the image signal of the (N+1)'th frame is imaged at and output from the CMOS imager 801, the flow proceeds from step S545 to step S546, and the prediction tap extracting unit 821 extracts, as the prediction tap of the pixel of interest, the pixels in a diamond shape (Figure 57B) centered on the pixel at the position closest to the position of the pixel of interest in the (N+1)'th frame of the image signal output from the CMOS imager 801, supplies the prediction tap to the prediction computing unit 825, and the flow proceeds to step S547.
That is to say, in step S545, the MEMS units 853 of the pixels serving as the prediction tap of the pixel of interest are controlled, whereby the positions of the condenser lenses 852 of those pixels are also controlled. Accordingly, in step S546, the prediction tap of the pixel of interest is made up of pixel values output from photodiodes 851 of pixels whose condenser lens 852 positions have been controlled accordingly.
In step S547, the coefficient generating unit 824 supplies to the prediction computing unit 825 the tap coefficient of the class indicated by the class code of the pixel of interest supplied from the class code generating unit 823, and the flow proceeds to step S548, where the prediction computing unit 825 performs the computation of expression (1) above using the prediction tap supplied from the prediction tap extracting unit 821 and the tap coefficient supplied from the coefficient generating unit 824, thereby obtaining the pixel value of the pixel of interest, and the flow ends.
The above-described processing is performed sequentially with each pixel in the (N+1)'th frame taken as the pixel of interest, and likewise for the (N+2)'th frame, and so on.
Note that the foregoing arrangement involves extracting the class tap of the pixel of interest from the image signal of the N'th frame, which is the frame immediately preceding the (N+1)'th frame, but an arrangement may be made wherein the class tap of the pixel of interest is extracted from the image signal of the (N+1)'th frame, which is the frame of the pixel of interest itself.
Also, within the (N+1)'th frame there can be cases wherein, when a certain pixel #A is the pixel of interest and when a pixel #B near that pixel is the pixel of interest, the condenser lens 852 of the same pixel of the CMOS imager 801 would be controlled to different positions. This can be dealt with by imaging the image signal of the (N+1)'th frame at the CMOS imager 801 in time sequence, with the condenser lens 852 of that pixel at the position for the case where pixel #A is the pixel of interest and at the position for the case where pixel #B is the pixel of interest. Alternatively, priority may be given to the position of the condenser lens 852 corresponding to whichever pixel is taken as the pixel of interest earlier or later.
In addition, the arrangement described here involves controlling, by controlling the MEMS units 853, the positions of the condenser lenses 852 of the pixels serving as the prediction tap of the pixel of interest; however, an arrangement may also be made wherein, for example, only the position of the condenser lens 852 of the pixel closest to the pixel of interest is controlled, or the positions of the condenser lenses 852 of all pixels within a certain range of the pixel of interest are controlled, and so forth.
Next, the processing in step S544 of Figure 67 (the control information generating processing for generating control information corresponding to the activity of the class tap) will be described in detail, first for the case of controlling the position of the condenser lens 852 to one of the two positions shown in Figures 59A and 59B, i.e., near or away from the photodiode 851, with reference to the flowchart in Figure 68.
First, in step S551, the movement amount control unit 877 normalizes the dynamic range of the class tap of the pixel of interest by the maximum dynamic range, and the flow proceeds to step S552. That is to say, the movement amount control unit 877 divides the dynamic range of the class tap by the maximum dynamic range, which is the difference between the maximum and minimum values which the image signal output from the CMOS imager 801 can assume, thereby normalizing the dynamic range of the class tap. Hereafter, the normalized dynamic range of the class tap will be referred to as the "normalized dynamic range" as appropriate. Note that normalizing the dynamic range of the class tap is not necessarily indispensable.
In step S552, the movement amount control unit 877 determines whether or not the normalized dynamic range is greater than a predetermined threshold value. The threshold value employed may be, for example, 0.5 or the like.
If, in step S552, the normalized dynamic range is determined to be greater than the predetermined threshold value, the flow proceeds to step S553, where the movement amount control unit 877 judges that the level change near the position of the pixel of interest in the image imaged by the CMOS imager 801 is great, and accordingly generates control information for controlling the position of the condenser lens 852 to the position away from the photodiode 851, i.e., the position at which subject light of a narrow range is cast onto the photodiode 851, and the flow returns.
If, in step S552, the normalized dynamic range is determined to be not greater than the predetermined threshold value, the flow proceeds to step S554, where the movement amount control unit 877 judges that the level near the position of the pixel of interest in the image imaged by the CMOS imager 801 is smooth, and accordingly generates control information for controlling the position of the condenser lens 852 to the position near the photodiode 851, i.e., the position at which subject light of a wide range is cast onto the photodiode 851, and the flow returns.
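The dynamic-range-based control of Figure 68 can be sketched along the same lines. The threshold of 0.5 is taken from the description above; the 8-bit maximum dynamic range of 255, the names, and the sample values are assumptions made for illustration.

```python
def normalized_dynamic_range(class_tap, max_dynamic_range=255):
    """Dynamic range of the class tap divided by the maximum dynamic range the
    image signal can assume (255 assumed here for an 8-bit signal)."""
    return (max(class_tap) - min(class_tap)) / max_dynamic_range

def control_info_from_activity(class_tap, threshold=0.5):
    """'far' (narrow light range) when the activity is high, 'near' otherwise."""
    return 'far' if normalized_dynamic_range(class_tap) > threshold else 'near'

print(control_info_from_activity([100, 101, 99, 100, 100, 100, 101, 100, 102]))  # near
print(control_info_from_activity([30, 220, 40, 210, 35, 215, 45, 205, 50]))      # far
```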
Note that an arrangement has been described here wherein the normalized dynamic range of the class tap is used as the information indicating the activity near the position of the pixel of interest in the image imaged by the CMOS imager 801; however, arrangements other than the one employing the class tap may also be made, such as an arrangement employing the difference between the maximum and minimum values of the prediction tap, or of any plurality of pixels near the pixel of interest, or the like, as the information indicating the activity near the position of the pixel of interest.
Also, in the case described above, the control of the condenser lens 852 performed by controlling the MEMS unit 853 is carried out to one of two positions, namely the position near the photodiode 851 and the position away from the photodiode 851; however, the condenser lens 852 may be controlled to three or more positions.
Next, the control information generating processing in step S544 of Figure 67 in the case of controlling the position of the condenser lens 852 to one of the three positions shown in Figures 63A through 63C, i.e., the standard position, the position near the photodiode 851, and the position away from the photodiode 851, will be described in detail with reference to the flowchart in Figure 69.
First, in step S561, the movement amount control unit 877 normalizes the dynamic range of the class tap of the pixel of interest, thereby obtaining, as the normalized dynamic range, the ratio of the dynamic range of the class tap of the pixel of interest to the maximum dynamic range, and the flow proceeds to step S562.
In step S562, the movement amount control unit 877 determines whether or not the normalized dynamic range is relatively small, e.g., a value smaller than 0.3.
If, in step S562, the normalized dynamic range is determined to be smaller than 0.3, the flow proceeds to step S563, where the movement amount control unit 877 judges that the amount of level change near the position of the pixel of interest in the image imaged by the CMOS imager 801 is small, and accordingly generates control information for controlling the condenser lens 852 to the position near the photodiode 851, i.e., so that subject light of a wide range is cast onto the photodiode 851, and the flow returns.
If, in step S562, the normalized dynamic range is determined to be not smaller than 0.3, the flow proceeds to step S564, where the movement amount control unit 877 determines whether or not the normalized dynamic range is neither great nor small, e.g., a value equal to or greater than 0.3 but smaller than 0.6. If, in step S564, the normalized dynamic range is determined to be a value equal to or greater than 0.3 but smaller than 0.6, the flow proceeds to step S565, where the movement amount control unit 877 judges that the amount of level change near the position of the pixel of interest in the image imaged by the CMOS imager 801 is neither great nor small, and accordingly generates control information for controlling the condenser lens 852 to the standard position, i.e., so that subject light of an intermediate range is cast onto the photodiode 851, and the flow returns.
If, in step S564, the normalized dynamic range is determined not to be a value equal to or greater than 0.3 but smaller than 0.6, the flow proceeds to step S566, where the movement amount control unit 877 determines whether or not the normalized dynamic range is relatively great, e.g., a value greater than 0.6.
If, in step S566, the normalized dynamic range is determined to be a value greater than 0.6, the flow proceeds to step S567, where the movement amount control unit 877 judges that the amount of level change near the position of the pixel of interest in the image imaged by the CMOS imager 801 is great, and accordingly generates control information for controlling the condenser lens 852 to the position away from the photodiode 851, i.e., so that subject light of a narrow range is cast onto the photodiode 851, and the flow returns.
Also, if, in step S566, the normalized dynamic range is determined to be a value not greater than 0.6, the movement amount control unit 877 treats this case as an error, does not generate control information, and returns. In this case, the position of the condenser lens 852 remains at its previous position, for example.
Foregoing layout comprises the factor generation unit 824 shown in Figure 56 and 65, its stores the tap factor of each class that obtains by prior study, but for factor generation unit 824, its for each class from like as the tap factor that generates the image that can produce desired quality the tap seed data of seed and the predefined parameter.
By factor seed data and predetermined parameters, the configuration that generates the configuration of factor generation unit of tap factor and factor generation unit 124 shown in Figure 30 for each class is the same, so detailed description will be omitted (seeing Figure 30 to 36 and its description).
Equally, the same with the configuration of the facility for study shown in the corresponding configuration of facility for study and Figure 20, but identical classification tap is disposed in the tap of feature extraction unit 136 configurations and unit 822 configurations of classification tap extraction, and these are offered class taxon 137.Class taxon 137 generates and the identical Sort Code of Sort Code generation unit 823 configurations.
In the case where tap factors are generated for each class in the factor output unit 124 according to a parameter corresponding to, for example, resolution, as shown in Fig. 30, the CMOS imager 801 may be controlled according to that parameter rather than according to the class code or the activity of the class tap.
Fig. 70 shows an example of another configuration of the DRC circuit 802, which controls the CMOS imager 801 according to a parameter. Note that components corresponding to those in Fig. 56 or Fig. 65 are denoted by the same reference numerals, and their description is omitted where appropriate. Briefly, the DRC circuit 802 shown in Fig. 70 is basically the same as that shown in Fig. 56, except that an amount-of-movement control unit 917 is provided in place of the DL 826 and the amount-of-movement control unit 827 of the control unit 812.
In Fig. 70, the user operates the operating unit 985, and the operating unit 985 outputs a parameter z corresponding to that operation to the factor generation unit 824 and the amount-of-movement control unit 917. The factor generation unit 824 is configured in the manner of Fig. 30; it generates tap factors for each class based on the parameter z supplied from the operating unit 985, and outputs to the prediction calculation unit 825 the tap factors of the class indicated by the class code supplied from the class code generation unit 823.
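One common way to realize a factor generation unit of this kind, and the assumption behind the sketch below, is to store the factor seed data as polynomial coefficients and evaluate each tap factor as a polynomial in the parameter z; the array shapes and the function name are illustrative only and are not taken from the embodiment.

    import numpy as np

    def generate_tap_factors(seed_data, z):
        """seed_data: array of shape (num_classes, num_taps, num_terms) holding
        factor seed data obtained by learning beforehand.
        Returns tap factors of shape (num_classes, num_taps), each evaluated as
            w[c, n] = sum_k seed_data[c, n, k] * z**k
        """
        powers = z ** np.arange(seed_data.shape[-1])  # [1, z, z**2, ...]
        return seed_data @ powers  # contract over the polynomial terms

Under this assumption, for a given pixel of interest the prediction calculation unit 825 would take the row of this array selected by the class code and apply it to the prediction tap.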
The amount-of-movement control unit 917 controls the CMOS imager 801 in accordance with the parameter z supplied from the operating unit 985. That is, when the parameter z is large, the factor generation unit 824 generates tap factors that improve resolution greatly, and when the parameter z is small, it generates tap factors that improve resolution only slightly. When tap factors that improve resolution greatly are used in the DRC unit 811, the pixels making up the prediction tap should also be of high resolution in order to suit the signal processing performed by the DRC unit 811. Likewise, when tap factors that improve resolution only slightly are used in the DRC unit 811, the pixels making up the prediction tap should be of correspondingly lower resolution to suit the signal processing performed by the DRC unit 811.
Accordingly, when the parameter z is large and tap factors that improve resolution greatly are generated, the amount-of-movement control unit 917 controls the condenser 852 to a position away from the photodiode 851, so that subject light from a narrow range is incident on the photodiode 851. Conversely, when the parameter z is small and tap factors that improve resolution only slightly are generated, the amount-of-movement control unit 917 controls the condenser 852 to a position close to the photodiode 851, so that subject light from a wide range is incident on the photodiode 851.
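Taken together, the parameter z thus selects both the tap factors and the condenser position. The sketch below assumes, purely for illustration, a z normalized to [0, 1] and a linear mapping onto the displacement of the condenser away from the photodiode; the displacement range and the helper name are invented and not part of the embodiment.

    def condenser_shift_from_parameter(z, min_shift=0.0, max_shift=1.0):
        """Illustrative mapping of the resolution parameter z (assumed in [0, 1]) to a
        normalized condenser displacement away from the photodiode.

        Large z -> larger displacement -> narrow range of subject light,
                   matching tap factors that improve resolution greatly.
        Small z -> smaller displacement -> wide range of subject light,
                   matching tap factors that improve resolution only slightly.
        """
        z = min(max(z, 0.0), 1.0)  # clamp to the assumed range
        return min_shift + z * (max_shift - min_shift)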
The DRC circuit 802 may be realized by dedicated hardware, or by a computer, for example a microcomputer including a CPU (central processing unit) or a DSP (digital signal processor), semiconductor memory, and so forth, executing a program that performs the processing described above.
The program may be installed in the computer beforehand, or may be stored on a removable recording medium such as a flexible disk, a CD-ROM (compact disc read-only memory), an MO (magneto-optical) disc, a DVD (digital versatile disc), a magnetic disk, or semiconductor memory, and provided as packaged software.
Besides being installed in the microcomputer from such a removable recording medium, the program may be transferred to the computer wirelessly from a download site via a satellite for digital satellite broadcasting, or by wire via a network such as a LAN (local area network) or the Internet, and then downloaded and installed.
In this specification, the processing steps describing the program that causes the computer to perform the various processing need not be executed in time sequence in the order described in the flowcharts; they may be executed in parallel or individually (for example, by parallel processing or object-oriented processing). Moreover, the program may be processed by a single computer or by multiple computers in a distributed manner.
As described above, the CMOS imager 801 is controlled so as to output an image signal suited to the signal processing performed by the DRC unit 811, so that an image signal with improved image quality can be obtained through the signal processing of the DRC unit 811.
Note that although the present embodiment employs a CMOS imager (CMOS sensor) to capture images, other imaging devices, such as CCDs, may be used instead.
Also, although the CMOS imager 801 and the DRC circuit 802 are configured on a single chip in the present embodiment, the CMOS imager 801 may be configured on a chip separate from that of the DRC circuit 802.
Furthermore, in the present embodiment, although the range of subject light incident on the photodiode 851 is controlled by controlling the position of the condenser 852, the method of controlling the range of subject light incident on the photodiode 851 is not limited to this. For example, an arrangement may be made wherein a diaphragm employing MEMS technology is provided for each pixel of the CMOS imager 801, and the range of subject light incident on the photodiode 851 is adjusted by adjusting the diaphragm. Moreover, besides controlling the range of subject light incident on the photodiode 851, the amount of time over which the photodiode 851 receives light (the exposure time) and the like may also be controlled.

Claims (96)

1. A signal processing apparatus comprising:
sensor means for sensing information and outputting a signal corresponding to the information; and
signal processing means for subjecting the signal output by the sensor means to signal processing;
wherein said sensor means is configured to have properties corresponding to the signal processing.
2. The signal processing apparatus according to claim 1, wherein the sensor means is configured, by learning performed beforehand, to have properties corresponding to the signal processing.
3. A signal processing method comprising:
an acquisition step of acquiring a signal output by sensor means which senses information and outputs a signal corresponding to the information; and
a signal processing step of subjecting the signal output by the sensor means to signal processing;
wherein the sensor means is configured to have properties corresponding to the signal processing.
4. A computer-readable program comprising:
code for an acquisition step of acquiring a signal output by sensor means which senses information and outputs a signal corresponding to the information; and
code for a signal processing step of subjecting the signal output by the sensor means to signal processing;
wherein the sensor means is configured to have properties corresponding to the signal processing.
5. A recording medium storing a computer-readable program, said program comprising:
code for an acquisition step of acquiring a signal output by sensor means which senses information and outputs a signal corresponding to the information; and
code for a signal processing step of subjecting the signal output by the sensor means to signal processing;
wherein the sensor means is configured to have properties corresponding to the signal processing.
6. A signal processing apparatus comprising:
sensor means having at least
a first sensor for sensing light and outputting a first component of an image signal corresponding to said light, and
a second sensor for outputting a second component of said image signal; and
signal processing means for subjecting a first digital image signal obtained from the output of said sensor means to signal processing and outputting a second digital image signal;
wherein said first and second sensors are set, by learning performed beforehand, to a placement state corresponding to the signal processing.
7. The signal processing apparatus according to claim 6, said signal processing means comprising:
feature extraction means for extracting a feature of a pixel of interest using said first digital image signal;
class classification means for classifying the pixel of interest according to the feature of the pixel of interest;
factor output means for outputting a factor corresponding to the class of said pixel of interest; and
calculation means for obtaining the second digital image signal of the pixel of interest by performing calculation using the factor corresponding to the class of said pixel of interest and said first digital image signal.
8. The signal processing apparatus according to claim 7, said factor output means comprising storage means for storing said factor for each of a plurality of classes.
9. The signal processing apparatus according to claim 6, wherein the first and second sensors are complementary metal oxide semiconductor (CMOS) sensors or charge coupled devices.
10. A signal processing method comprising:
an acquisition step of acquiring an image signal output by sensor means, the sensor means having at least
a first sensor for sensing light and outputting a first component of an image signal corresponding to said light, and
a second sensor for outputting a second component of said image signal; and
a signal processing step of subjecting a first digital image signal obtained from the output of said sensor means to signal processing and outputting a second digital image signal;
wherein said first and second sensors are set, by learning performed beforehand, to a placement state corresponding to said signal processing.
11. A computer-readable program comprising:
code for an acquisition step of acquiring the image signal output by sensor means, the sensor means having at least
a first sensor for sensing light and outputting a first component of an image signal corresponding to said light, and
a second sensor for outputting a second component of said image signal; and
code for a signal processing step of subjecting a first digital image signal obtained from the output of said sensor means to signal processing and outputting a second digital image signal;
wherein said first and second sensors are set, by learning performed beforehand, to a placement state corresponding to the signal processing.
12. A recording medium storing a computer-readable program, the program comprising:
code for an acquisition step of acquiring the image signal output by sensor means, the sensor means having at least
a first sensor for sensing light and outputting a first component of an image signal corresponding to said light, and
a second sensor for outputting a second component of said image signal; and
code for a signal processing step of subjecting a first digital image signal obtained from the output of said sensor means to signal processing and outputting a second digital image signal;
wherein the first and second sensors are set, by learning performed beforehand, to a placement state corresponding to said signal processing.
13. A signal processing apparatus comprising:
signal processing means for subjecting a signal output by sensor means to signal processing, the sensor means sensing information and outputting a signal corresponding to the information;
control means for controlling a property of said sensor means;
evaluation means for evaluating the result of said signal processing performed on the output of said sensor means whose property is controlled by said control means; and
determining means for determining, according to the evaluation result of the evaluation means, the property of said sensor means corresponding to said signal processing, and outputting information of the property.
14. The signal processing apparatus according to claim 13, further comprising said sensor means.
15. A signal processing method comprising:
a signal processing step of subjecting a signal output by sensor means to signal processing, the sensor means sensing information and outputting a signal corresponding to the information;
a control step of controlling a property of said sensor means;
an evaluation step of evaluating the result of said signal processing performed on the output of said sensor means whose property is controlled in said control step; and
a determining step of determining, according to the evaluation result of said evaluation step, the property of the sensor means corresponding to said signal processing, and outputting information of the property.
16. A computer-readable program comprising:
code for a signal processing step of subjecting a signal output by sensor means to signal processing, the sensor means sensing information and outputting a signal corresponding to the information;
code for a control step of controlling a property of said sensor means;
code for an evaluation step of evaluating the result of said signal processing performed on the output of said sensor means whose property is controlled in said control step; and
code for a determining step of determining, according to the evaluation result of said evaluation step, the property of the sensor means corresponding to said signal processing, and outputting information of the property.
17. A recording medium storing a computer-readable program, said program comprising:
code for a signal processing step of subjecting a signal output by sensor means to signal processing, the sensor means sensing information and outputting a signal corresponding to said information;
code for a control step of controlling a property of said sensor means;
code for an evaluation step of evaluating the result of said signal processing performed on the output of said sensor means whose property is controlled in said control step; and
code for a determining step of determining, according to the evaluation result of said evaluation step, the property of the sensor means corresponding to said signal processing, and outputting information of the property.
18. A signal processing apparatus comprising:
signal processing means for subjecting a first digital image signal obtained from the output of sensor means to signal processing and outputting a second digital image signal, the sensor means having at least
a first sensor for sensing light and outputting a first component of an image signal corresponding to the light, and
a second sensor for outputting a second component of said image signal;
control means for controlling the placement state of the first and second sensors;
evaluation means for evaluating the second digital image signal obtained by performing the signal processing on the output of said sensor means, with the placement state of the first and second sensors controlled by the control means; and
determining means for determining, according to the evaluation result of said evaluation means, the placement state of the first and second sensors corresponding to the signal processing, and outputting information of that placement state.
19. The signal processing apparatus according to claim 18, wherein, for each placement state of the first and second sensors, the evaluation means obtains the correlation between the second digital image signal and an evaluation image signal as the evaluation value of the second digital image signal.
20. The signal processing apparatus according to claim 18, said signal processing means comprising:
feature extraction means for extracting a feature of a pixel of interest using said first digital image signal;
class classification means for classifying the pixel of interest according to the feature of the pixel of interest;
factor output means for outputting a factor corresponding to the class of said pixel of interest; and
calculation means for obtaining the second digital image signal of the pixel of interest by performing calculation using the factor corresponding to the class of the pixel of interest and the first digital image signal.
21. The signal processing apparatus according to claim 20, said factor output means comprising storage means for storing said factor for each of a plurality of classes.
22. The signal processing apparatus according to claim 18, wherein the first and second sensors are complementary metal oxide semiconductor (CMOS) sensors or charge coupled devices.
23. The signal processing apparatus according to claim 18, further comprising said sensor means.
24. A signal processing method comprising:
a signal processing step of subjecting a first digital image signal obtained from the output of sensor means to signal processing and outputting a second digital image signal, the sensor means having at least
a first sensor for sensing light and outputting a first component of an image signal corresponding to the light, and
a second sensor for outputting a second component of the image signal;
a control step of controlling the placement state of said first and second sensors;
an evaluation step of evaluating the second digital image signal obtained by performing the signal processing on the output of the sensor means, with the placement state of the first and second sensors controlled in the control step; and
a determining step of determining, according to the evaluation result of the evaluation step, the placement state of the first and second sensors corresponding to said signal processing, and outputting information of that placement state.
25. A computer-readable program comprising:
code for a signal processing step of subjecting a first digital image signal obtained from the output of sensor means to signal processing and outputting a second digital image signal, the sensor means having at least
a first sensor for sensing light and outputting a first component of an image signal corresponding to the light, and
a second sensor for outputting a second component of the image signal;
code for a control step of controlling the placement state of said first and second sensors;
code for an evaluation step of evaluating the second digital image signal obtained by performing the signal processing on the output of the sensor means, with the placement state of the first and second sensors controlled in the control step; and
code for a determining step of determining, according to the evaluation result of said evaluation step, the placement state of the first and second sensors corresponding to the signal processing, and outputting information of that placement state.
26. A recording medium storing a computer-readable program, the program comprising:
code for a signal processing step of subjecting a first digital image signal obtained from the output of sensor means to signal processing and outputting a second digital image signal, the sensor means having at least
a first sensor for sensing light and outputting a first component of an image signal corresponding to the light, and
a second sensor for outputting a second component of the image signal;
code for a control step of controlling the placement state of said first and second sensors;
code for an evaluation step of evaluating the second digital image signal obtained by performing the signal processing on the output of the sensor means, with the placement state of the first and second sensors controlled in the control step; and
code for a determining step of determining, according to the evaluation result of the evaluation step, the placement state of the first and second sensors corresponding to the signal processing, and outputting information of that placement state.
27. A signal processing apparatus comprising:
image conversion means for subjecting a first digital image signal obtained from the output of imaging means to image conversion processing and outputting a second digital image signal, the imaging means having at least
a first sensor for obtaining a first component of an image signal, and
a second sensor for obtaining a second component of the image signal;
evaluation means for evaluating said second digital image signal; and
control means for controlling the placement state of at least one of said first and second sensors according to the evaluation by the evaluation means.
28. The signal processing apparatus according to claim 27, said evaluation means further comprising correlation calculation means for obtaining the correlation between the second digital image signal obtained with the first or second sensor in a first placement state and the second digital image signal obtained with the first or second sensor in a second placement state, wherein the image quality of the second digital image signal is evaluated according to said correlation.
29. The signal processing apparatus according to claim 27, said image conversion means comprising:
feature extraction means for extracting a feature of a pixel of interest using said first digital image signal;
class classification means for classifying the pixel of interest according to the feature of the pixel of interest;
factor output means for outputting a factor corresponding to the class of the pixel of interest; and
calculation means for obtaining the second digital image signal of the pixel of interest by performing calculation using the factor corresponding to the class of the pixel of interest and the first digital image signal.
30. The signal processing apparatus according to claim 29, the factor output means comprising storage means for storing the factor for each of a plurality of classes.
31. The signal processing apparatus according to claim 29, the factor output means further comprising:
storage means for storing seed data serving as the seed of said factor; and
generating means for generating the factor for each of a plurality of classes from a predetermined parameter and said seed data.
32. The signal processing apparatus according to claim 27, wherein said first and second sensors are complementary metal oxide semiconductor (CMOS) sensors or charge coupled devices.
33. The signal processing apparatus according to claim 27, further comprising the imaging means.
34. A signal processing method comprising:
an image conversion step of subjecting a first digital image signal obtained from the output of imaging means to image conversion processing and outputting a second digital image signal, the imaging means having at least
a first sensor for obtaining a first component of an image signal, and
a second sensor for obtaining a second component of the image signal;
an evaluation step of evaluating said second digital image signal; and
a control step of controlling the placement state of at least one of said first and second sensors according to the evaluation in said evaluation step.
35. A computer-readable program comprising:
code for an image conversion step of subjecting a first digital image signal obtained from the output of imaging means to image conversion processing and outputting a second digital image signal, the imaging means having at least
a first sensor for obtaining a first component of an image signal, and
a second sensor for obtaining a second component of the image signal;
code for an evaluation step of evaluating said second digital image signal; and
code for a control step of controlling the placement state of at least one of said first and second sensors according to the evaluation in the evaluation step.
36. A recording medium storing a computer-readable program, the program comprising:
code for an image conversion step of subjecting a first digital image signal obtained from the output of imaging means to image conversion processing and outputting a second digital image signal, the imaging means having at least
a first sensor for obtaining a first component of an image signal, and
a second sensor for obtaining a second component of the image signal;
code for an evaluation step of evaluating the second digital image signal; and
code for a control step of controlling the placement state of at least one of said first and second sensors according to the evaluation in the evaluation step.
37. A signal processing apparatus comprising:
parameter acquisition means for acquiring a predetermined parameter;
control means for controlling, according to said predetermined parameter, the placement state of at least one of a first sensor and a second sensor of imaging means, the imaging means having at least
a first sensor for obtaining a first component of an image signal, and
a second sensor for obtaining a second component of the image signal; and
image conversion means for subjecting a first digital image signal obtained from the output of said imaging means to image conversion processing corresponding to the predetermined parameter, and outputting a second digital image signal.
38. The signal processing apparatus according to claim 37, further comprising storage means for storing a parameter table in which the predetermined parameter is associated with the placement state of the first or second sensor, wherein said control means controls said first or second sensor to the placement state associated with the predetermined parameter in the parameter table.
39. The signal processing apparatus according to claim 37, the image conversion means comprising:
feature extraction means for extracting a feature of a pixel of interest using the first digital image signal;
class classification means for classifying the pixel of interest according to the feature of the pixel of interest;
storage means for storing seed data serving as the seed of a factor;
generating means for generating the factor for each of a plurality of classes from the predetermined parameter and said seed data; and
calculation means for obtaining the second digital image signal of the pixel of interest by performing calculation using the factor corresponding to the class of the pixel of interest, among the factors of the classes generated by said generating means, and the first digital image signal.
40. The signal processing apparatus according to claim 37, wherein the first and second sensors are complementary metal oxide semiconductor (CMOS) sensors or charge coupled devices.
41. The signal processing apparatus according to claim 37, further comprising the imaging means.
42. A signal processing method comprising:
an acquisition step of acquiring a predetermined parameter;
a control step of controlling, according to the predetermined parameter, the placement state of at least one of a first sensor and a second sensor of imaging means, the imaging means having at least
a first sensor for obtaining a first component of an image signal, and
a second sensor for obtaining a second component of the image signal; and
an image conversion step of subjecting a first digital image signal obtained from the output of the imaging means to image conversion processing corresponding to the predetermined parameter, and outputting a second digital image signal.
43. A computer-readable program comprising:
code for an acquisition step of acquiring a predetermined parameter;
code for a control step of controlling, according to said predetermined parameter, the placement state of at least one of a first sensor and a second sensor of imaging means, the imaging means having at least
a first sensor for obtaining a first component of an image signal, and
a second sensor for obtaining a second component of the image signal; and
code for an image conversion step of subjecting a first digital image signal obtained from the output of said imaging means to image conversion processing corresponding to the predetermined parameter, and outputting a second digital image signal.
44. A recording medium storing a computer-readable program, the program comprising:
code for an acquisition step of acquiring a predetermined parameter;
code for a control step of controlling, according to the predetermined parameter, the placement state of at least one of a first sensor and a second sensor of imaging means, the imaging means having at least
a first sensor for obtaining a first component of an image signal, and
a second sensor for obtaining a second component of the image signal; and
code for an image conversion step of subjecting a first digital image signal obtained from the output of the imaging means to image conversion processing corresponding to the predetermined parameter, and outputting a second digital image signal.
45. A signal processing apparatus comprising:
acquisition means for acquiring a predetermined parameter;
image conversion means for subjecting a first digital image signal obtained from the output of imaging means to image conversion processing and outputting a second digital image signal, the imaging means having at least
a first sensor for obtaining a first component of an image signal, and
a second sensor for obtaining a second component of the image signal;
control means for controlling the placement state of at least one of the first and second sensors;
evaluation means for evaluating the second digital image signal; and
storage means for storing the predetermined parameter and the placement state of the first or second sensor in association with each other, in correspondence with the evaluation by the evaluation means.
46. The signal processing apparatus according to claim 45, said evaluation means further comprising
correlation calculation means for obtaining the correlation between the second digital image signal obtained with said first or second sensor in a first placement state and the second digital image signal obtained with said first or second sensor in a second placement state, wherein the image quality of the second digital image signal is evaluated according to said correlation.
47. The signal processing apparatus according to claim 45, said image conversion means comprising:
feature extraction means for extracting a feature of a pixel of interest using said first digital image signal;
class classification means for classifying the pixel of interest according to the feature of the pixel of interest;
storage means for storing seed data serving as the seed of a factor;
generating means for generating the factor for each of a plurality of classes from the predetermined parameter and the seed data; and
calculation means for obtaining the second digital image signal of the pixel of interest by performing calculation using the factor corresponding to the class of the pixel of interest, among the factors of the classes generated by said generating means, and the first digital image signal.
48. The signal processing apparatus according to claim 45, wherein the first and second sensors are complementary metal oxide semiconductor (CMOS) sensors or charge coupled devices.
49. The signal processing apparatus according to claim 45, further comprising the imaging means.
50. A signal processing method comprising:
an acquisition step of acquiring a predetermined parameter;
an image conversion step of subjecting a first digital image signal obtained from the output of imaging means to image conversion processing and outputting a second digital image signal, the imaging means having at least
a first sensor for obtaining a first component of an image signal, and
a second sensor for obtaining a second component of the image signal;
a control step of controlling the placement state of at least one of the first and second sensors;
an evaluation step of evaluating the second digital image signal; and
a storing step of storing the predetermined parameter and the placement state of the first or second sensor in association with each other, in correspondence with the evaluation in said evaluation step.
51. A computer-readable program comprising:
code for an acquisition step of acquiring a predetermined parameter;
code for an image conversion step of subjecting a first digital image signal obtained from the output of imaging means to image conversion processing and outputting a second digital image signal, the imaging means having at least
a first sensor for obtaining a first component of an image signal, and
a second sensor for obtaining a second component of the image signal;
code for a control step of controlling the placement state of at least one of the first and second sensors;
code for an evaluation step of evaluating the second digital image signal; and
code for a storing step of storing the predetermined parameter and the placement state of the first or second sensor in association with each other, in correspondence with the evaluation in the evaluation step.
52. A storage medium storing a computer-readable program, said program comprising:
code for an acquisition step of acquiring a predetermined parameter;
code for an image conversion step of subjecting a first digital image signal obtained from the output of imaging means to image conversion processing and outputting a second digital image signal, the imaging means having at least
a first sensor for obtaining a first component of an image signal, and
a second sensor for obtaining a second component of the image signal;
code for a control step of controlling the placement state of at least one of the first and second sensors;
code for an evaluation step of evaluating the second digital image signal; and
code for a storing step of storing the predetermined parameter and the placement state of the first or second sensor in association with each other, in correspondence with the evaluation in the evaluation step.
53. A signal processing apparatus comprising:
image conversion means for subjecting a first digital image signal obtained from the output of sensor means having a plurality of photoelectric conversion devices to image conversion processing and outputting a second digital image signal; and
evaluation means for evaluating the first digital image signal in a predetermined region;
wherein a characteristic of the part of said sensor means corresponding to the first digital image signal in the predetermined region is changed in correspondence with the evaluation by the evaluation means.
54. The signal processing apparatus according to claim 53, further comprising the sensor means.
55. The signal processing apparatus according to claim 54, said sensor means further comprising an amplification control unit for amplifying the output signal of each of said plurality of photoelectric conversion devices while controlling the amplification factor thereof, wherein said amplification control unit controls the amplification factor to a value corresponding to the evaluation by the evaluation means.
56. The signal processing apparatus according to claim 54, further comprising correction means for correcting the second digital image signal according to the characteristic of said sensor means.
57. The signal processing apparatus according to claim 53, wherein said predetermined region is a region of one or more pixels.
58. The signal processing apparatus according to claim 55, said evaluation means further comprising:
extraction means for extracting evaluation pixels, which are pixels to be used for the evaluation of said first digital image signal;
calculation means for calculating an occupancy, the occupancy being the percentage of the predetermined region occupied by said evaluation pixels; and
amplification factor determining means for evaluating the first digital image signal in said predetermined region based on said occupancy and determining said amplification factor according to the evaluation.
59. The signal processing apparatus according to claim 55, said evaluation means further comprising:
calculation means for calculating the activity of the first digital image signal in said predetermined region; and
amplification factor determining means for evaluating the first digital image signal in said predetermined region based on the activity and determining the amplification factor according to the evaluation.
60. The signal processing apparatus according to claim 55, the evaluation means further comprising:
comparison means for comparing the first digital image signal in the predetermined region with a predetermined threshold; and
amplification factor determining means for evaluating the first digital image signal in the predetermined region based on the comparison result of the comparison means, and determining said amplification factor according to the evaluation.
61. The signal processing apparatus according to claim 60, wherein the predetermined threshold is a value that is excessively large or excessively small for the image conversion processing.
62. The signal processing apparatus according to claim 54, wherein said sensor means is a device having an amplifying unit for amplifying the signal of each pixel.
63. The signal processing apparatus according to claim 54, wherein the sensor means is a complementary metal oxide semiconductor (CMOS) sensor or a charge coupled device.
64. The signal processing apparatus according to claim 53, said image conversion means further comprising:
feature extraction means for extracting a feature of a pixel of interest using said first digital image signal;
class classification means for classifying the pixel of interest according to the feature of the pixel of interest;
factor output means for outputting a factor corresponding to the class of the pixel of interest; and
calculation means for obtaining the second digital image signal of the pixel of interest by performing calculation using the factor corresponding to the class of the pixel of interest and the first digital image signal.
65. The signal processing apparatus according to claim 64, the factor output means further comprising storage means for storing the factor for each of a plurality of classes.
66. The signal processing apparatus according to claim 64, said factor output means further comprising:
storage means for storing seed data serving as the seed of the factor; and
generating means for generating the factor for each of a plurality of classes from a predetermined parameter and the seed data.
67. A signal processing method comprising:
an image conversion step of subjecting a first digital image signal obtained from the output of sensor means having a plurality of photoelectric conversion devices to image conversion processing and outputting a second digital image signal; and
an evaluation step of evaluating the first digital image signal in a predetermined region;
wherein a characteristic of the part of said sensor means corresponding to the first digital image signal in the predetermined region is changed in correspondence with the evaluation in the evaluation step.
68. A computer-readable program comprising:
code for an image conversion step of subjecting a first digital image signal obtained from the output of sensor means having a plurality of photoelectric conversion devices to image conversion processing and outputting a second digital image signal; and
code for an evaluation step of evaluating the first digital image signal in a predetermined region;
wherein a characteristic of the part of said sensor means corresponding to the first digital image signal in the predetermined region is changed in correspondence with the evaluation in the evaluation step.
69. A storage medium storing a computer-readable program, said program comprising:
code for an image conversion step of subjecting a first digital image signal obtained from the output of sensor means having a plurality of photoelectric conversion devices to image conversion processing and outputting a second digital image signal; and
code for an evaluation step of evaluating the first digital image signal in a predetermined region;
wherein a characteristic of the part of said sensor means corresponding to the first digital image signal in the predetermined region is changed in correspondence with the evaluation in the evaluation step.
70. A signal processing apparatus which performs signal processing for converting a first image signal into a second image signal, the apparatus comprising:
class classification means for classifying the second image signal into one of a plurality of classes according to the level distribution of the first image signal output by imaging means, the imaging means converting subject light, which is light from a subject, into an image signal;
control means for controlling the imaging means according to the level distribution of said first image signal;
tap factor output means for outputting a tap factor for each class obtained by learning; and
calculation means for obtaining the second image signal by performing calculation using the first image signal output by the imaging means controlled by said control means and the tap factor of the class obtained by the class classification means.
71. The signal processing apparatus according to claim 70, wherein the imaging means further comprises a condenser for condensing the subject light onto each pixel, and said control means controls the position of said condenser.
72. The signal processing apparatus according to claim 70, wherein the imaging means is a complementary metal oxide semiconductor sensor.
73. The signal processing apparatus according to claim 72, formed integrally with the complementary metal oxide semiconductor sensor.
74. The signal processing apparatus according to claim 70, wherein said imaging means is a charge coupled device.
75. The signal processing apparatus according to claim 70, said class classification means comprising:
class tap extraction means for extracting, from the first image signal, pixels to be used for class classification as a class tap; and
class code generating means for obtaining a 1-bit level distribution of the pixels of said class tap according to the dynamic range of the pixels of the class tap, and generating a class code representing the class corresponding to that level distribution.
76. The signal processing apparatus according to claim 75, wherein said control means controls said imaging means in correspondence with the number of bit inversions in the 1-bit level distribution.
77. The signal processing apparatus according to claim 75, wherein said control means controls said imaging means in correspondence with the dynamic range of the pixels of the class tap.
78. The signal processing apparatus according to claim 75, wherein the calculation means performs the calculation using the first image signal output by said imaging means controlled by said control means and the tap factor of the class obtained from the level distribution of the first image signal one or more frames before that first image signal.
79. A signal processing apparatus which performs signal processing for converting a first image signal into a second image signal, the apparatus comprising:
class classification means for classifying the second image signal into one of a plurality of classes according to the level distribution of the first image signal output by imaging means, the imaging means converting subject light, which is light from a subject, into an image signal;
activity detection means for detecting the activity of the first image signal;
control means for controlling said imaging means according to the activity of said first image signal;
tap factor output means for outputting a tap factor for each class obtained by learning; and
calculation means for obtaining the second image signal by performing calculation using the first image signal output by the imaging means controlled by said control means and the tap factor of the class obtained by the class classification means.
80. The signal processing apparatus according to claim 79, wherein the imaging means further comprises a condenser for condensing the subject light onto each pixel, and said control means controls the position of said condenser.
81. The signal processing apparatus according to claim 79, wherein the imaging means is a complementary metal oxide semiconductor sensor.
82. The signal processing apparatus according to claim 79, wherein the imaging means is a charge coupled device.
83. The signal processing apparatus according to claim 79, wherein the activity detection means obtains, as the activity, the dynamic range of a predetermined plurality of pixels of the first image signal.
84. A signal processing apparatus which performs signal processing for converting a first image signal into a second image signal, the apparatus comprising:
class classification means for classifying the second image signal into one of a plurality of classes according to the level distribution of the first image signal output by imaging means, the imaging means converting subject light, which is light from a subject, into an image signal;
parameter output means for outputting a parameter representing the resolution of said second image signal;
control means for controlling said imaging means according to said parameter;
tap factor generating means for generating a tap factor for each class from factor seed data obtained by learning and said parameter; and
calculation means for obtaining said second image signal by performing calculation using said first image signal output by the imaging means controlled by the control means and the tap factor of the class obtained by said class classification means.
85. The signal processing apparatus according to claim 84, wherein said imaging means further comprises a condenser for condensing the subject light onto each pixel, and the control means controls the position of said condenser.
86. The signal processing apparatus according to claim 84, formed integrally with a complementary metal oxide semiconductor sensor.
87. The signal processing apparatus according to claim 84, wherein the imaging means is a charge coupled device.
88. a signal processing method, it carries out the signal processing that first picture signal is converted to second picture signal, and this method comprises:
The class classification step is used for distributing according to the rank from first picture signal of imaging device output second picture signal is classified among to a plurality of classes one, so that conduct is converted to picture signal from the object light of the light of object;
Controlled step is used for rank according to first picture signal and distributes and control described imaging device;
Tap factor output step is used for each the class output tap factor to obtaining by study; And
Calculation procedure is used for obtaining second picture signal by the calculating of carrying out the tap factor that uses first picture signal and described class, the imaging device output that this first picture signal is controlled by controlled step, and such tap factor is obtained by classification step.
89. a computer-readable signal handler, it carries out the signal processing that first picture signal is converted to second picture signal, and this program comprises:
The code that is used for the class classification step distributes according to the rank from first picture signal of imaging device output, second picture signal is classified among in a plurality of classes one so that handle is converted to picture signal as the object light from the light of object;
The code that is used for controlled step is used for the described imaging device of rank distribution control according to first picture signal;
The code that is used for tap factor output step is used for each the class output tap factor to obtaining by study; And
The code that is used for calculation procedure, be used for obtaining second picture signal by the calculating of carrying out the tap factor that uses first picture signal and described class, the imaging device output that this first picture signal is controlled by controlled step, the tap factor of described class is obtained by classification step.
90. a storage medium of storing computer-readable program, this program are used to carry out the signal processing that first picture signal is converted to second picture signal, program comprises:
The code that is used for the class classification step is used for distributing according to the rank from first picture signal of imaging device output second picture signal is classified among to a plurality of classes one, so that conduct is converted to picture signal from the object light of the light of object;
The code that is used for controlled step is used for the described imaging device of rank distribution control according to first picture signal;
The code that is used for tap factor output step is used for each the class output tap factor to obtaining by study; And
The code that is used for calculation procedure, be used for obtaining second picture signal by the calculating of carrying out the tap factor that uses first picture signal and described class, the imaging device output that this first picture signal is controlled by controlled step, the tap factor of described class is obtained by classification step.
91. a signal processing method, it carries out the signal processing that first picture signal is converted to second picture signal, and described method comprises:
The class classification step is used for distributing according to the rank from first picture signal of imaging device output second picture signal is classified among to a plurality of classes one, so that conduct is converted to picture signal from the object light of the light of object;
The activity detection steps is surveyed the activity of first picture signal;
Controlled step is used for controlling described imaging device according to the activity of first picture signal;
Tap factor output step is used for each the class output tap factor to obtaining by study; And
Calculation procedure is used for obtaining second picture signal by the calculating of carrying out the tap factor that uses first picture signal and described class, the imaging device output that this first picture signal is controlled by controlled step, and the tap factor of described class is obtained by classification step.
92. a computer-readable program, it is used to carry out the signal processing that first picture signal is converted to second picture signal, and this program comprises:
The code that is used for the class classification step is used for distributing according to the rank from first picture signal of imaging device output second picture signal is classified among to a plurality of classes one, so that conduct is converted to picture signal from the object light of the light of object;
The code that is used for the activity detection steps is used to survey the activity of first picture signal;
The code that is used for controlled step is used for the described imaging device of rank distribution control according to described first picture signal;
The code that is used for tap factor output step is used for obtain to such an extent that each class is exported tap factor by study; And
The calculation procedure code, be used for obtaining second picture signal by carrying out the calculating of using first picture signal and described class to get the tap factor, the described imaging device output that this first picture signal is controlled by controlled step, the tap factor of described class is obtained by classification step.
93. storing computer-readable program and getting storage medium for one kind, this program is used to carry out the signal processing that first picture signal is converted to second picture signal, described program comprises:
The code that is used for the class classification step is used for distributing according to the rank from first picture signal of imaging device output second picture signal is classified among to a plurality of classes one, so that conduct is converted to picture signal from the object light of the light of object;
The code that is used for the activity detection steps is used to survey the activity of first picture signal;
The code that is used for controlled step is used for the described imaging device of rank distribution control according to first picture signal;
The code that is used for tap factor output step is used for each the class output tap factor to obtaining by study; And
The code that is used for calculation procedure, be used for obtaining second picture signal by the calculating of carrying out the tap factor that uses first picture signal and described class, the described imaging device output that this first picture signal is controlled by controlled step, the tap factor of described class is obtained by classification step.
94. a signal processing method, it is used to carry out the signal processing that first picture signal is converted to second picture signal, and this method comprises:
The class classification step is used for distributing according to the rank from first picture signal of imaging device output second picture signal is classified among to a plurality of classes one, so that conduct is converted to picture signal from the object light of the light of object;
Parameter output step is used to export the parameter of representing described second image signal resolution;
Controlled step is used for controlling described imaging device according to described parameter;
The tap factor generates step, is used for obtaining to such an extent that factor seed data and described parameter generate tap factor for each described class from study; And
Calculation procedure is used for obtaining second picture signal by the calculating of carrying out the tap factor that uses first picture signal and described class, the described imaging device output that this first picture signal is controlled by control device, and the tap factor of described class is obtained by sorter.
95. a computer-readable program, it is used to carry out the signal processing that first picture signal is converted to second picture signal, and described program comprises:
The code that is used for the class classification step is used for according among second picture signal being classified to a plurality of classes one from the rank substep of first picture signal of imaging device output, so that conduct is converted to picture signal from the object light of the light of object;
The code that is used for parameter output step is used to export the parameter of representing second image signal resolution;
The code that is used for controlled step is used for controlling described imaging device according to described parameter;
Be used for the code that the tap factor generates step, be used for generating the tap factor for each described class from factor seed data and described parameter that study obtains; And
The code that is used for calculation procedure, be used for obtaining second picture signal by the calculating of carrying out the tap factor that uses first picture signal and described class, the described imaging device output that described first picture signal is controlled by control device, the tap factor of described class is obtained by sorter.
96. A storage medium storing a computer-readable program for performing signal processing that converts a first image signal into a second image signal, the program comprising:
a code for a class classification step of classifying the second image signal into one of a plurality of classes according to the level distribution of the first image signal output from an imaging device that converts object light, which is light from an object, into an image signal;
a code for a parameter output step of outputting a parameter representing the resolution of the second image signal;
a code for a control step of controlling the imaging device according to the parameter;
a code for a tap coefficient generation step of generating tap coefficients for each of the classes from coefficient seed data obtained by learning and from the parameter; and
a code for a calculation step of obtaining the second image signal by performing a calculation using the first image signal output by the imaging device controlled in the control step and the tap coefficients of the class obtained in the class classification step.
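Claims 94 to 96 add a resolution parameter: rather than storing final tap coefficients, coefficient seed data are learned, and the tap coefficients for each class are generated from the seed data and the parameter before the prediction calculation. The sketch below assumes the common polynomial form, in which each coefficient is a polynomial in the parameter; the claim text only states that the coefficients are generated from the seed data and the parameter, so the exact generating function, names, and sizes here are assumptions.

```python
import numpy as np

def generate_tap_coeffs(seed, z):
    # Tap coefficient generation step (assumed polynomial form):
    #   w[n] = sum_m seed[n, m] * z**m
    # seed: (n_taps, n_terms) coefficient seed data for one class,
    # z: parameter representing the resolution of the second image signal.
    n_terms = seed.shape[1]
    return seed @ (z ** np.arange(n_terms))

def convert_pixel(pred_tap, seed_data, class_code, z):
    # Calculation step: inner product of the prediction tap with the tap
    # coefficients generated for the pixel's class at parameter z.
    w = generate_tap_coeffs(seed_data[class_code], z)
    return float(np.dot(w, pred_tap))

# Example with illustrative sizes: 512 classes, 9 taps, 3 polynomial terms.
rng = np.random.default_rng(1)
seed_data = rng.normal(size=(512, 9, 3))          # stand-in for learned seed data
pred_tap = rng.integers(0, 256, size=9).astype(float)
pixel = convert_pixel(pred_tap, seed_data, class_code=37, z=1.5)
```

Storing seed data instead of one coefficient set per parameter value lets a single learned table cover a continuous range of resolution settings, which is the point of the parameter output and control steps in these claims.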
CN2004100959448A 2003-07-31 2004-07-30 Signal processing device and signal processing method Expired - Fee Related CN1606359B (en)

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
JP2003283271A JP4281453B2 (en) 2003-07-31 2003-07-31 Signal processing apparatus and signal processing method
JP283271/2003 2003-07-31
JP2003283272A JP4300925B2 (en) 2003-07-31 2003-07-31 Imaging apparatus, signal processing apparatus, signal processing method, program, and recording medium
JP283272/2003 2003-07-31
JP2003283273A JP4305743B2 (en) 2003-07-31 2003-07-31 Signal processing apparatus, signal processing method, program, and recording medium
JP283273/2003 2003-07-31
JP2003283274A JP4305744B2 (en) 2003-07-31 2003-07-31 Signal processing apparatus, signal processing method, program, and recording medium
JP283274/2003 2003-07-31

Related Child Applications (3)

Application Number Title Priority Date Filing Date
CNB2007101010053A Division CN100525389C (en) 2003-07-31 2004-07-30 Signal processing apparatus and signal processing method
CNB2007101010068A Division CN100525390C (en) 2003-07-31 2004-07-30 Signal processing apparatus and signal processing method
CNB2007101010049A Division CN100527787C (en) 2003-07-31 2004-07-30 Signal processing apparatus and signal processing method

Publications (2)

Publication Number Publication Date
CN1606359A true CN1606359A (en) 2005-04-13
CN1606359B CN1606359B (en) 2010-06-16

Family

ID=34229368

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2004100959448A Expired - Fee Related CN1606359B (en) 2003-07-31 2004-07-30 Signal processing device and signal processing method

Country Status (3)

Country Link
US (1) US7595819B2 (en)
KR (1) KR101085410B1 (en)
CN (1) CN1606359B (en)

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4144377B2 (en) * 2003-02-28 2008-09-03 ソニー株式会社 Image processing apparatus and method, recording medium, and program
US20060262210A1 (en) * 2005-05-19 2006-11-23 Micron Technology, Inc. Method and apparatus for column-wise suppression of noise in an imager
JP4351658B2 (en) * 2005-07-21 2009-10-28 マイクロン テクノロジー, インク. Memory capacity reduction method, memory capacity reduction noise reduction circuit, and memory capacity reduction device
JP4662880B2 (en) * 2006-04-03 2011-03-30 三星電子株式会社 Imaging apparatus and imaging method
US7876957B2 (en) * 2007-05-31 2011-01-25 Aptina Imaging Corporation Methods and apparatuses that reduce noise in image signals
JP5045295B2 (en) * 2007-07-30 2012-10-10 ソニー株式会社 Signal processing apparatus and method, and program
JP4989378B2 (en) * 2007-09-03 2012-08-01 キヤノン株式会社 Image processing method and recording apparatus
DE102007044471A1 (en) * 2007-09-18 2009-04-02 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Method for sectionally determining a parameter-dependent correction value approximation course and sensor arrangement
JP4835949B2 (en) * 2007-12-21 2011-12-14 ソニー株式会社 Image processing apparatus and method, learning apparatus and method, program, and recording medium
US8204328B2 (en) * 2008-04-08 2012-06-19 The United States Of America, As Represented By The Secretary Of The Navy Automated underwater image restoration via denoised deconvolution
US20120239349A1 (en) * 2011-02-28 2012-09-20 Vladimir Trakhtman Method to detect signals
JP2013009293A (en) * 2011-05-20 2013-01-10 Sony Corp Image processing apparatus, image processing method, program, recording medium, and learning apparatus
US8948338B2 (en) * 2011-11-03 2015-02-03 Medtronic Navigation, Inc. Dynamically scanned X-ray detector panel
CN103383851B * 2012-05-04 2016-02-24 赛恩倍吉科技顾问(深圳)有限公司 CD drive control circuit
JP2014126903A (en) 2012-12-25 2014-07-07 Toshiba Corp Image processing apparatus, image processing method, and program
WO2014134172A1 (en) * 2013-02-26 2014-09-04 Cornell University Event correlation using data having different time resolutions
KR20140111758A (en) 2013-03-12 2014-09-22 삼성전자주식회사 Image processing device and computing system having the same
US20150193699A1 (en) * 2014-01-08 2015-07-09 Civitas Learning, Inc. Data-adaptive insight and action platform for higher education
WO2016002068A1 (en) * 2014-07-04 2016-01-07 三菱電機株式会社 Image expansion device, image expansion method, surveillance camera, program, and recording medium

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH03106186A (en) 1989-09-20 1991-05-02 Olympus Optical Co Ltd Solid-state image pickup device
JPH06339082A (en) 1993-05-28 1994-12-06 Canon Inc Photoelectric conversion device
JP2931520B2 (en) * 1993-08-31 1999-08-09 三洋電機株式会社 Color separation circuit for single-chip color video camera
US5552825A (en) * 1994-11-08 1996-09-03 Texas Instruments Incorporated Color resolution enhancement by using color camera and methods
JPH08331465A (en) 1995-05-30 1996-12-13 Canon Inc Double plate type image pickup device
JP3859089B2 (en) 1995-05-31 2006-12-20 ソニー株式会社 Signal conversion apparatus and signal conversion method
AU718453B2 (en) * 1996-07-17 2000-04-13 Sony Corporation Image coding and decoding using mapping coefficients corresponding to class information of pixel blocks
CN1075649C (en) * 1996-08-22 2001-11-28 明碁电脑股份有限公司 Method and system for adjusting position of charge coupled device and lens in scanning assembly
JP3796844B2 (en) 1996-10-04 2006-07-12 ソニー株式会社 Image processing apparatus, image processing method, parameter generation apparatus, and parameter generation method
KR100499434B1 (en) * 1997-05-06 2005-07-07 소니 가부시끼 가이샤 Image converter and image conversion method
JPH11220753A (en) 1998-02-03 1999-08-10 Matsushita Electric Ind Co Ltd Solid-state image-pickup element aligning device
JP2000138944A (en) 1998-10-29 2000-05-16 Nikon Corp Image pickup device, electronic camera and registration correction method
US7573508B1 (en) * 1999-02-19 2009-08-11 Sony Corporation Image signal processing apparatus and method for performing an adaptation process on an image signal
JP3365333B2 (en) * 1999-03-03 2003-01-08 日本電気株式会社 Resolution converter
US7339617B1 (en) * 1999-03-12 2008-03-04 Sony Corporation Image providing device and its providing method, image processing device and processing method, and storage medium
JP4844780B2 (en) 2000-04-13 2011-12-28 ソニー株式会社 Imaging control apparatus, imaging control method, program, and program recording medium
JP2002182095A (en) 2000-12-19 2002-06-26 Fuji Photo Film Co Ltd Focal position adjusting device, exposure head and image recorder
US20030052989A1 (en) 2001-07-18 2003-03-20 Bean Heather Noel Non-polarizing shutter/CCD module
JP2003075252A (en) 2001-09-03 2003-03-12 Fuji Electric Co Ltd Photosensor corresponding to high dynamic range
US6639201B2 (en) 2001-11-07 2003-10-28 Applied Materials, Inc. Spot grid array imaging system
JP4284908B2 (en) 2001-12-25 2009-06-24 ソニー株式会社 MOS type solid-state imaging device and manufacturing method thereof

Also Published As

Publication number Publication date
KR101085410B1 (en) 2011-11-21
CN1606359B (en) 2010-06-16
KR20050014750A (en) 2005-02-07
US20050052541A1 (en) 2005-03-10
US7595819B2 (en) 2009-09-29

Similar Documents

Publication Publication Date Title
CN1606359A (en) Signal processing device and signal processing method, program, and recording medium
CN1213592C (en) Adaptive two-valued image processing method and equipment
CN1249987C (en) Radiation image processing equipment and method, image processing system, storing medium and program
CN1167265C (en) Image communication system and method thereof
CN1846447A (en) Image processing method, image processing apparatus, and computer program
CN1305006C (en) Method and system for providing formatted information to image processing apparatus
CN100345158C (en) Method and system for producing formatted data related to geometric distortions
CN1134975C (en) Image processing device
CN1754384A (en) Image processing device and method, learning device and method, recording medium, and program
CN1260958C (en) Data processing appts. data processing method, and data processing system
CN1701614A (en) Image processing method and device, and program
CN1774031A (en) Image processing apparatus and image processing method as well as computer program
CN1645918A (en) Method of controlling semiconductor device, signal processing method, semiconductor device, and electronic apparatus
CN1610412A (en) Image processing apparatus and image processing method and program
CN1961338A (en) Image processing apparatus and method, and recording medium and program
CN1867940A (en) Imaging apparatus and image processing method therefor
CN1488225A (en) Image processing method, image processing program, and image processor
CN1770831A (en) Data processing method, data processing apparatus, semiconductor device for detecting physical quantity distribution, and electronic equipment
CN1816825A (en) Signal processing device, signal processing method, program, and recording medium
CN1910907A (en) Control method, control device and control program for photographic device
CN101076126A (en) Imaging apparatus and method, and imaging device
CN101038625A (en) Image processing apparatus and method
CN1926881A (en) Motion vector detecting apparatus, motion vector detection method and computer program
CN1812479A (en) Image processing apparatus and method, image output apparatus, camera apparatus, program and medium
CN101047822A (en) Thumbnail generating apparatus and image shooting apparatus

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
Granted publication date: 20100616
Termination date: 20150730
EXPY Termination of patent right or utility model