US9288370B2 - Imaging apparatus and audio processing apparatus - Google Patents
- Publication number: US9288370B2
- Application number: US13/296,916
- Authority: United States (US)
- Prior art keywords: period, noise, audio signal, drive, unit
- Legal status: Expired - Fee Related (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- H04N5/225—
- H—ELECTRICITY
  - H04—ELECTRIC COMMUNICATION TECHNIQUE
    - H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
      - H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- G—PHYSICS
  - G11—INFORMATION STORAGE
    - G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
      - G11B20/00—Signal processing not specific to the method of recording or reproducing; Circuits therefor
        - G11B20/24—Signal processing not specific to the method of recording or reproducing; Circuits therefor for reducing noise
Definitions
- the present invention relates to an imaging apparatus and an audio processing apparatus.
- there is a digital camera on the market which includes a function for capturing moving images and recording audio signals, in addition to capturing still images.
- when the capturing state of the digital camera changes, a drive unit of the digital camera operates to drive a focus lens, or to drive a diaphragm mechanism in response to a change in brightness.
- the operation of the drive unit thus generates noise in the audio signal being recorded.
- Japanese Patent Application Laid-Open No. 2006-279185 discusses an imaging apparatus that performs a spectral subtraction method which is one method of cancelling noise.
- the imaging apparatus performs noise cancellation on the audio signal input through a microphone in synchronization with driving of a zoom lens drive motor.
- Japanese Patent Application Laid-Open No. 2006-287387 discusses a technique for executing noise cancellation as described below.
- a reference microphone is placed near a drive motor that generates the drive noise.
- noise cancellation is performed. Since the noise is detected based on the signal input to the reference microphone, a time lag between timing at which a drive signal is transmitted to the drive motor, and timing at which the drive noise is generated by actually driving the drive motor, is reduced.
- Japanese Patent Application Laid-Open No. 2001-344000 discusses a technique for accurately detecting sudden noise that is generated in a communication apparatus such as a cellular phone which encodes and transmits the audio signal. More specifically, the audio signal acquired in performing communication is divided into frames of a predetermined time length, and the signal for each frame is then transformed to frequency domains. The change in the signal level is then monitored for each frequency domain, so that the sudden noise is detected.
- the present invention is directed to an imaging apparatus and a drive noise cancellation apparatus that solve the above-described problems and appropriately cancel the noise.
- the section in which the drive noise is generated can be determined with a small calculation load, and the drive noise can thus be effectively cancelled.
- an imaging apparatus includes an imaging unit configured to convert an optical image of an object to an image signal, an optical unit configured to impart an optical image of an object to the imaging unit, a drive unit configured to drive the optical unit, a control unit configured to output a drive signal and control the drive unit, an audio acquisition unit configured to acquire audio signals, a determination unit configured to analyze an audio signal acquired by the audio acquisition unit during a predetermined period from when the drive signal has been output, and determine a noise reduction period based on a specific frequency component included in the audio signal of the predetermined period, a noise reduction unit configured to reduce from an audio signal acquired by the audio acquisition unit, noise during a period determined by the determination unit, and a recording unit configured to record on a recording medium, an audio signal from which noise has been reduced by the noise reduction unit.
- FIG. 1 is a center cross-sectional view illustrating an exemplary embodiment according to the present invention.
- FIG. 2 is a block diagram illustrating a schematic configuration of the exemplary embodiment illustrated in FIG. 1 .
- FIG. 3 is a flowchart illustrating a noise cancellation process according to a first exemplary embodiment.
- FIGS. 4A and 4B illustrate examples of waveforms and specific frequency components of the audio signal in which the drive noise is generated.
- FIGS. 5A, 5B, 5C, 5D, and 5E are timing charts illustrating noise cancellation processing sections.
- FIGS. 6A, 6B, and 6C illustrate a process for predicting the audio signal in the noise cancellation processing section.
- FIG. 7 illustrates a table of analysis section length, characteristic frequency, and sound pressure threshold value for each optical element.
- FIGS. 8A and 8B illustrate examples of the drive noise for each drive unit.
- FIG. 9 illustrates a system configuration according to a second exemplary embodiment.
- FIG. 10 is a block diagram illustrating a schematic configuration of the system illustrated in FIG. 9 .
- FIG. 11 is a flowchart illustrating the noise cancellation process performed by an external processing apparatus.
- FIG. 12 illustrates a configuration of a system in which the audio signal is transferred by detaching a memory.
- FIG. 1 is a center cross-sectional view illustrating a digital single lens reflex camera which is an exemplary embodiment of an imaging apparatus according to the present invention.
- a digital single lens reflex camera 100 includes a camera body 101 and an image-taking lens 102 .
- the image-taking lens 102 includes, inside a lens barrel 103, an imaging optical system 104 having an optical axis 105.
- the imaging optical system 104 includes a focus lens group, a camera-shake correction lens unit, a diaphragm mechanism, and an optical system drive unit 106 that drives the above-described components. Further, the imaging optical system 104 includes a lens control unit 107 which controls the optical system drive unit 106 .
- the imaging optical system 104 is electrically connected to the camera body 101 at a lens mount contact 108 .
- An object optical image entering from the front of the image-taking lens 102 passes along the optical axis 105 and enters the camera body 101.
- a main mirror 110 formed of a half mirror reflects a portion of incident light, and the reflected light is formed as an image on a focusing screen 117 .
- the user or a photographer can visually observe, through an eye-piece lens 112 via a pentagonal prism 111, the optical image formed on the focusing screen 117.
- Such a configuration thus forms an optical view finder.
- An automatic exposure (AE) sensor 116 detects the brightness of the optical image formed on the focusing screen 117 . Further, the object optical image that has been transmitted through the main mirror 110 is reflected by a sub-mirror 113 , and enters an auto-focus (AF) sensor 114 . An output of the AF sensor 114 is used in performing focus detection of the object image.
- the AE sensor 116 detects an amount of exposure of the entire focusing screen 117 , or a portion or a plurality of portions of the focusing screen 117 .
- an instruction is issued to start image capturing.
- the main mirror 110 and the sub-mirror 113 then retract from an imaging light path, so that the object optical image enters an image sensor 118 .
- the detection results of the AF sensor 114 and the AE sensor 116 , and the output from the image sensor 118 are supplied to a camera control unit 119 .
- the camera control unit 119 thus controls the entire camera 100 according to the supplied signals.
- When capturing the moving image, a microphone 115, i.e., an audio input unit, captures external sounds, converts the captured sounds to the audio signal, and provides the audio signal to the camera control unit 119.
- the audio signal is recorded in synchronization with an image signal output from the image sensor 118 .
- FIG. 2 is a block diagram illustrating a schematic configuration of the digital single lens reflex camera 100 .
- the digital single lens reflex camera 100 includes an imaging system, an image processing system, an audio processing system, a recording/playback system, and a control system.
- the imaging system includes the imaging optical system 104 and the image sensor 118 .
- the image processing system includes an analog/digital (A/D) conversion unit 131 and an image processing unit 132 .
- the audio processing system includes the microphone 115 and an audio signal processing circuit 137 .
- the recording/playback system includes a recording unit 133 and a memory 134 .
- the control system includes the optical system drive unit 106 , the lens control unit 107 , the camera control unit 119 , the AF sensor 114 , the AE sensor 116 , and an operation switch detection unit 135 .
- the optical system drive unit 106 includes a focus lens drive unit 106 a , a camera shake correction drive unit 106 b , and a diaphragm drive unit 106 c.
- the imaging system is an optical processing system which uses the imaging optical system 104 to form an image of the light coming from the object on an imaging plane of the image sensor 118 .
- a portion of the light flux is guided to the AF sensor 114, i.e., the focus detection unit, via the sub-mirror 113 disposed behind the main mirror 110.
- the control system appropriately adjusts the imaging optical system 104 , so that an appropriate amount of light from the object is received by the image sensor 118 , and the object image is formed in the vicinity of the image sensor 118 as will be described below.
- the A/D conversion unit 131 digitizes the image signal output from the image sensor 118 and inputs the digitized image signal to the image processing unit 132 .
- the image processing unit 132 then processes the image data received from the A/D conversion unit 131 .
- the image processing unit 132 includes a white balance circuit, a gamma correction circuit, and an interpolation calculation circuit which performs interpolation calculation for increasing resolution.
- the audio signal processing unit 137 in the audio processing system performs an appropriate process on the signal output from the microphone 115 , and generates the audio signal to be recorded.
- the recording unit to be described below records the generated audio signal associated with the image data.
- the recording unit 133 outputs the image data to the memory 134 , and generates and stores the image data of the image to be output to an image display device 136 . Further, the recording unit 133 uses a predetermined method and compresses the image data, moving image data, and audio data. The recording unit 133 records the compressed data in the recording medium.
- the camera control unit 119 generates and outputs timing signals in the image capturing operation.
- the AF sensor 114, i.e., the focus detection unit, detects a focus state of the object.
- the AE sensor 116, i.e., an exposure detection unit, detects the brightness of the object.
- the lens control unit 107 adjusts focusing, zooming, and the diaphragm of the imaging optical system 104 , according to the control signal output from the camera control unit 119 .
- the control system controls each of the imaging system, the image processing system, and the recording/playback system, according to an operation from the outside.
- the operation switch detection unit 135 detects that the user has pressed a shutter release button (not illustrated).
- the camera control unit 119 then controls driving of the image sensor 118 , the operation of the image processing unit 132 , and the compression process to be performed by the recording unit 133 , according to the detection result. Further, the camera control unit 119 controls information display performed by the optical finder and a liquid crystal monitor configuring the image display device 136 .
- the camera control unit 119 determines an appropriate focus position and diaphragm position according to the detection results of the AF sensor 114 and the AE sensor 116 .
- the camera control unit 119 then supplies to the lens control unit 107 the control signal indicating control for driving the focus lens and the diaphragm to such positions.
- the lens control unit 107 causes the focus lens driving unit 106 a and the diaphragm drive unit 106 c to drive the focus lens and the diaphragm respectively according to the control signal from the camera control unit 119 .
- the lens control unit 107 is connected to a camera shake detection sensor (not illustrated).
- the lens control unit 107 controls the camera shake drive unit 106 b according to the detection result of the camera shake detection sensor and thus reduces the camera shake.
- the camera control unit 119 uses the drive amount of the focus lens drive unit 106 a and the continuous image signals output from the image sensor 118 to adjust the focus state of the imaging optical system.
- the camera control unit 119 employs a focus detection method referred to as a hill-climbing method. Further, the camera control unit 119 uses the image signal output from the image sensor 118 to calculate the brightness of the object and thus adjusts the diaphragm.
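The hill-climbing focus detection named above is a standard contrast-detection search. The following is a generic illustration of that method, not the patent's implementation; `contrast_at` is a hypothetical callback standing in for the contrast metric computed from the image sensor output:

```python
def hill_climb_focus(contrast_at, positions):
    """Step the focus lens through candidate positions, evaluating image
    contrast at each, and stop once contrast starts to fall (the peak of
    the 'hill' has been passed). Returns the best position found."""
    best_pos, best_val = positions[0], contrast_at(positions[0])
    for pos in positions[1:]:
        val = contrast_at(pos)
        if val < best_val:
            break  # contrast is decreasing: the peak lies behind us
        best_pos, best_val = pos, val
    return best_pos
```

In practice the search is refined with smaller steps around the detected peak; this sketch shows only the single coarse pass.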
- FIG. 3 is a flowchart illustrating the noise cancellation process according to the present exemplary embodiment.
- FIGS. 4A and 4B illustrate examples of the waveforms of the audio signal and the change in the sound pressure level of extracted frequencies.
- FIG. 4A illustrates the example of the waveform of the audio signal acquired by the microphone 115 . Referring to FIG. 4A , time is indicated on a horizontal axis, and a voltage level of the audio signal output from the microphone 115 is indicated on a vertical axis.
- FIG. 4B illustrates a change in the sound pressure of 10 kHz and 2 kHz frequency components of the audio signal illustrated in FIG. 4A . Referring to FIG. 4B , time is indicated on the horizontal axis, and a sound pressure level is indicated on the vertical axis.
- FIGS. 5A through 5E illustrate timing charts for the diaphragm drive unit 106 c to drive the diaphragm in the image-taking lens 102 .
- FIG. 5A illustrates the timing chart for issuing a diaphragm drive command from the camera body 101 to the image-taking lens 102 .
- FIG. 5B illustrates the timing chart for applying a drive voltage by which the diaphragm drive unit 106 c actually drives the diaphragm.
- FIG. 5C illustrates the timing chart of a period during which the diaphragm drive unit 106 c generates the drive noise.
- FIG. 5D illustrates the timing chart of a period in which the audio signal is buffered for a predetermined length of time, and frequency analysis is performed to determine a noise cancellation processing section.
- FIG. 5E illustrates the timing chart of the noise cancellation processing section calculated from the result of frequency analysis.
- When the user presses a moving image capturing switch (not illustrated) of the digital single lens reflex camera 100, the digital single lens reflex camera 100 starts capturing the moving image. The digital single lens reflex camera 100 also starts the sound recording operation at the same time.
- In step S1001, the audio signal acquired by and output from the microphone 115 is stored in the memory 134 via the audio signal processing circuit 137 in synchronization with the video signal.
- In step S1002, the camera control unit 119 determines whether there is a command that instructs driving of the optical system drive unit 106.
- the command may take the form of the user performing diaphragm adjustment, or driving of the focus lens for focusing. If the drive command is not detected (NO in step S1002), the process proceeds to step S1009.
- In step S1009, the camera control unit 119 determines whether the moving image capturing switch is off. If the moving image capturing switch is on (NO in step S1009), the process returns to step S1001, and the camera control unit 119 continues to record the audio signal. On the other hand, if the moving image capturing switch is off (YES in step S1009), the camera control unit 119 ends the image capturing operation including the sound recording.
- In step S1003, the camera control unit 119 buffers the audio signal of a predetermined period starting from issuing of the drive command, as illustrated in FIG. 5D.
- diaphragm drive noise is mixed into the portion of the buffered audio signal captured during the diaphragm drive period.
- In step S1004, the camera control unit 119 divides into frames the audio signal buffered for a predetermined time starting from issuing of the lens drive command. The camera control unit 119 then consecutively performs Fourier transformation on each frame and transforms each frame to the frequency domain.
- In step S1005, the camera control unit 119 extracts, from the audio signal transformed into the frequency domain, the change in the sound pressure of a characteristic frequency of the diaphragm drive noise.
- in the example illustrated in FIG. 4B, the sound pressure change in a 10 kHz component 401 indicates the drive noise component, and the sound pressure change in a 2 kHz component 402 indicates the component of the audio signal acquired from the object.
- FIG. 4B also illustrates the sound pressure change of the drive noise component for the periods other than the buffering period.
- the sound pressure of the 2 kHz component 402 greatly changes with respect to time, so that it is difficult to determine the diaphragm drive time from the change.
- the sound pressure does not change greatly for the 10 kHz component 401 during the period in which there is only the sound of the object, and greatly changes during the diaphragm drive period.
- here, the diaphragm drive period is described as a representative example of the lens drive period.
- further, the 10 kHz component is described above as the characteristic frequency component.
- however, the characteristic frequency component may be another frequency, as long as it lies in a frequency band that is strongly present in the drive noise and scarcely present in the audio signal acquired from the object.
- the lens control unit 107 stores, as the data table illustrated in FIG. 7, the characteristic frequency and a determination threshold value for each image-taking lens 102 and each driven element in the image-taking lens 102, and transfers the data to the camera control unit 119 as necessary.
- In step S1006, the camera control unit 119 calculates the section in which the sound pressure of the characteristic frequency extracted in step S1005 exceeds a predetermined threshold value.
- more specifically, the camera control unit 119 calculates the section in which the sound pressure of the 10 kHz component 401 exceeds a threshold value 403 illustrated in FIG. 4B, and determines a noise cancellation processing section 404 in which noise cancellation is to be performed.
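Steps S1004 through S1006 can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation; the frame length, hop size, and dB threshold are assumptions standing in for the per-lens values held in the FIG. 7 data table:

```python
import numpy as np

def find_noise_section(audio, fs, char_freq, threshold_db, frame_len=1024, hop=512):
    """Frame the buffered audio, Fourier-transform each frame, track the level
    of the characteristic frequency (e.g. 10 kHz for the diaphragm drive
    noise), and return the sample range whose level exceeds the threshold."""
    window = np.hanning(frame_len)
    bin_idx = int(round(char_freq * frame_len / fs))  # FFT bin nearest the characteristic frequency
    hot = []
    for start in range(0, len(audio) - frame_len + 1, hop):
        spec = np.fft.rfft(audio[start:start + frame_len] * window)
        level_db = 20 * np.log10(np.abs(spec[bin_idx]) + 1e-12)
        if level_db > threshold_db:
            hot.append(start)
    if not hot:
        return None  # no drive noise detected in the buffer
    return hot[0], hot[-1] + frame_len  # the noise cancellation processing section
```

The returned range corresponds to the noise cancellation processing section 404; extra frames on either side can be absorbed by the reverberation margin described later.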
- FIGS. 8A and 8B illustrate 5 kHz, 10 kHz, and 15 kHz components of two types of lens drive noises generated when the focus lens is driven using different actuators.
- the sound pressure level is indicated on the vertical axis, and the time is indicated on the horizontal axis.
- FIG. 8A illustrates the change in the sound pressure for each frequency of the drive noise generated when the lens is driven using a direct current (DC) motor as the actuator.
- FIG. 8B illustrates the change in sound pressure for each frequency of the drive noise generated when the lens is driven using an ultrasonic motor as the actuator.
- when the DC motor is used, the 5 kHz, 10 kHz, and 15 kHz components are all included in the drive noise to a similar degree.
- when the ultrasonic motor is used, the 10 kHz band is strongly included, whereas the 5 kHz and 15 kHz components are included less than the 10 kHz component in the drive noise.
- the sound pressure level of the drive noise generated when using the ultrasonic motor is lower than the sound pressure level of the drive noise generated when using the DC motor, in each frequency domain.
- it is thus desirable to set the threshold value in a frequency band in which the sound of the object is not included, to clearly separate the object sound from the drive noise.
- since the characteristic frequency and the sound pressure level differ according to the type of lens and the drive operation, it is desirable to set the threshold value for each lens type and drive operation.
- the threshold values 403 are thus stored in the lens control unit 107 as values that differ for each lens type and drive operation as illustrated in FIG. 7, and are transferred to the camera control unit 119 as necessary.
- In step S1007, the camera control unit 119 performs noise cancellation with respect to the noise cancellation processing section 404 calculated in step S1006.
- the noise cancellation method will be described in detail below.
- In step S1008, the camera control unit 119 controls the recording unit 133 to record the audio signal that has been subjected to noise cancellation in step S1007, in synchronization with the captured moving image.
- the camera control unit 119 directly records the buffered audio signals other than those in the noise cancellation processing section, without performing noise cancellation processing thereon.
- In step S1009, the camera control unit 119 determines whether the moving image capturing switch has been turned off. If the moving image capturing switch has been turned off (YES in step S1009), the camera control unit 119 ends recording the sound. If the moving image capturing switch has not been turned off (NO in step S1009), the camera control unit 119 continues recording the audio signal.
- the camera control unit 119 issues the diaphragm drive command at time T1.
- the audio signal of a predetermined time between T1 and T5 illustrated in FIG. 5E is then buffered.
- the buffering period differs depending on the type of the image-taking lens 102 and the element being driven, so extra time is added before and after the drive time in the buffering period.
- the analysis section length which indicates the length of the buffering period is stored in the data table illustrated in FIG. 7 .
- the data table is stored in the memory within the image-taking lens 102, and when the image-taking lens 102 is connected to the camera body 101, the data table is transferred to the camera control unit 119 or the memory 134 and stored. Further, when the image-taking lens 102 is attached to the camera body 101, the optical system drive unit 106 may drive each lens element, measure the drive time, and determine the buffering period from the measured time. Furthermore, the data table to be applied may be determined by storing, in the memory 134 in the camera body 101, a data table for each type of image-taking lens 102, and identifying the type of the attached image-taking lens.
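The FIG. 7 data table holds, per lens and per driven element, the analysis section length, characteristic frequency, and sound pressure threshold. One way it might be laid out is sketched below; the lens names and every numeric value are illustrative assumptions, not figures from the patent:

```python
# Hypothetical contents of the FIG. 7 table, keyed by (lens type, driven element).
DRIVE_NOISE_TABLE = {
    ("LENS_A", "diaphragm"): {"analysis_ms": 500, "char_freq_hz": 10_000, "threshold_db": 20.0},
    ("LENS_A", "focus"):     {"analysis_ms": 800, "char_freq_hz": 5_000,  "threshold_db": 15.0},
    ("LENS_B", "diaphragm"): {"analysis_ms": 400, "char_freq_hz": 12_000, "threshold_db": 18.0},
}

def lookup_drive_params(lens_id, element):
    """Return the analysis parameters transferred from the lens to the camera body."""
    try:
        return DRIVE_NOISE_TABLE[(lens_id, element)]
    except KeyError:
        raise KeyError(f"no drive-noise entry for lens {lens_id!r}, element {element!r}")
```

Keying on the lens type mirrors the alternative described above, where the camera body stores one table per type of image-taking lens and selects it after identifying the attached lens.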
- the lens control unit 107 causes the diaphragm drive unit 106 c to drive the diaphragm at time T2, according to the diaphragm drive command received from the camera control unit 119.
- the diaphragm drive unit 106 c thus drives the diaphragm at time T2, so that the drive noise is generated.
- the drive voltage of the diaphragm drive unit 106 c is then lowered, and the diaphragm drive operation ends.
- however, there remains a reverberant sound of the drive noise generated by driving the diaphragm, which continues to time T4, as illustrated in FIG. 5C.
- buffering and frequency analysis are thus performed for the predetermined time length between T1 and T5 in which the reverberant sound period is included, as illustrated in FIG. 5D.
- as a result, the correct drive noise generation period T2 to T4, including the time lag between issuing of the diaphragm drive command and the actual start of driving the diaphragm and the time in which the reverberant sound remains, can be determined.
- a highly accurate noise cancellation can thus be performed.
- the noise cancellation process employs a prediction process which uses the audio signals previous and subsequent to the drive noise generation period, to predict the audio signal during the drive noise generation period.
- FIGS. 6A, 6B, and 6C illustrate the audio signal waveforms in each processing procedure. Referring to FIGS. 6A-6C, the time is indicated on the horizontal axis, and the signal level is indicated on the vertical axis.
- FIG. 6A illustrates the audio signal from the object into which the drive noise is mixed.
- FIG. 6B illustrates the audio signal in the middle of performing the prediction process in the noise cancellation process.
- FIG. 6C illustrates the audio signal acquired after application of the prediction process.
- the audio signal in the noise cancellation processing section, i.e., the audio signal into which the drive noise is mixed, is discarded in the prediction process.
- a learning operation and a prediction operation are then performed, and the audio signal in the noise cancellation processing section is interpolated using the signal acquired by the prediction operation.
- audio prediction consists of derivation of a linear prediction coefficient (i.e., the learning operation) and signal prediction using the linear prediction coefficient (i.e., the prediction operation), as described below.
- equation (2) is thereby acquired.
- in equation (2), if Δt is sufficiently small, the current value is expressed by a linear sum of the neighboring p values. Further, if the approximation of x_t obtained using the above-described prediction operation is sufficiently good, x_{t+1} can also be obtained as a linear sum of the neighboring p values.
- since Δt can be set sufficiently small, values can be sequentially predicted, and the signal can be acquired.
- the linear prediction coefficient α_i which minimizes the prediction error ε_t is thus to be acquired.
- the operation for acquiring the α_i which minimize ε_t will be referred to as the learning operation.
- the linear prediction coefficient α_i can be acquired by minimizing Σ ε_t² over the learning section in which the learning operation is performed.
- equation (3) is acquired.
- Equation (5) indicates that α_i can be determined by solving p sets of linear simultaneous equations.
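The numbered equations are not reproduced in this text. Assuming the standard linear-prediction formulation that the surrounding description follows (x_t approximated by a linear sum of the previous p values, with error ε_t), they can be sketched as:

```latex
x_t \approx \sum_{i=1}^{p} \alpha_i\, x_{t-i}, \qquad
\varepsilon_t = x_t - \sum_{i=1}^{p} \alpha_i\, x_{t-i} \tag{2}

\frac{\partial}{\partial \alpha_j} \sum_t \varepsilon_t^2
  = -2 \sum_t \varepsilon_t\, x_{t-j} = 0 \tag{3}

\sum_{i=1}^{p} \alpha_i \sum_t x_{t-i}\, x_{t-j}
  = \sum_t x_t\, x_{t-j}, \qquad j = 1, \dots, p \tag{5}
```

Under this reading, equation (5) is the set of p normal equations whose solution yields the coefficients α_i; the exact numbering and sign convention of the original equations are assumptions.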
- an approximation of x_{t+1} can be similarly acquired from the neighboring p−1 sample values and the signal acquired by performing prediction.
- the signal in the prediction section can thus be generated by sequentially repeating the above-described process.
- the operation for acquiring the approximation in the prediction section from the obtained α_i will be referred to as the prediction operation.
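A minimal sketch of the learning and prediction operations, assuming the standard least-squares formulation (x_t modeled as a linear sum of the previous p values); the function names and the tiny regularization term are assumptions:

```python
import numpy as np

def lpc_learn(x, p):
    """Learning operation: find coefficients a[1..p] minimizing the squared
    error of x[t] ≈ sum_i a[i] * x[t-i], by solving the p linear
    simultaneous (normal) equations."""
    n = len(x)
    # Gram matrix of the lagged signal segments and the right-hand side vector.
    R = np.array([[np.dot(x[p - i:n - i], x[p - j:n - j]) for j in range(1, p + 1)]
                  for i in range(1, p + 1)])
    r = np.array([np.dot(x[p:], x[p - i:n - i]) for i in range(1, p + 1)])
    # Tiny regularization keeps the solve stable for near-singular sections.
    return np.linalg.solve(R + 1e-9 * np.eye(p), r)

def lpc_predict(history, coeffs, n_out):
    """Prediction operation: extend `history` by n_out samples, each one a
    linear sum of the previous p values (forward prediction)."""
    p = len(coeffs)
    buf = list(history[-p:])
    out = []
    for _ in range(n_out):
        nxt = sum(coeffs[i] * buf[-1 - i] for i in range(p))
        out.append(nxt)
        buf.append(nxt)
    return np.array(out)
```

Backward prediction is the same pair of operations applied to the time-reversed signal of learning section 2.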
- the learning operation and the prediction operation will be described below using the examples of the waveforms illustrated in FIGS. 6A-6C .
- the signal previous and subsequent to the prediction section is used in performing the learning operation.
- Such a process exploits a characteristic of audio signals, i.e., that repeatability is comparatively high within an extremely short time range.
- a section previous in terms of time to the section in which the drive noise is existent is set as a learning section 1 .
- a section subsequent in terms of time to the section in which the drive noise is existent is set as a learning section 2 .
- the calculations are separately performed with respect to the signal in the learning section 1 and signal in the learning section 2 .
- Generating the signal in the prediction section after performing the learning operation in the learning section 1 will be referred to as prediction from the fore, i.e., forward prediction.
- generating the signal in the prediction section after performing the learning operation in the learning section 2 will be referred to as prediction from the back, i.e., backward prediction.
- the forward prediction and the backward prediction are weighted in calculating the signal in the prediction section. More specifically, the nearer to the learning section 1, the more heavily the value acquired by forward prediction is weighted; the nearer to the learning section 2, the more heavily the value acquired by backward prediction is weighted.
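The weighting of the two predictions can be sketched as a crossfade. The linear weight shape is an assumption; the text only states that each prediction is weighted more heavily near its own learning section:

```python
import numpy as np

def blend_predictions(forward, backward):
    """Combine forward and backward predictions over the prediction section.
    The forward-prediction weight is largest next to learning section 1 and
    falls to zero next to learning section 2, and vice versa."""
    n = len(forward)
    w = np.linspace(1.0, 0.0, n)  # forward-prediction weight across the section
    return w * forward + (1.0 - w) * backward
```

This keeps the interpolated signal continuous with the real audio at both boundaries of the noise cancellation processing section.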
- the frequency analysis is performed for a predetermined period based on the drive signal, and the noise cancellation processing section is determined.
- the drive noise generation section can be accurately detected, so that noise cancellation performance can be improved.
- further, since the drive noise generation section can be accurately determined, the accuracy of the noise cancellation process that uses the prediction process can be improved.
- the drive noise cancellation process is performed with respect to the diaphragm drive noise.
- the present invention can also be applied to the cancellation of drive noise generated due to driving other optical elements.
- the present invention can be applied to cases where the drive noise or generation timing of the noise can be detected by the operation button or the camera control unit.
- the present invention can be applied to driving of the focus lens, the camera-shake correction lens, and a lock mechanism of the camera-shake correction lens (not illustrated).
- the present invention can be applied to driving of image sensor shift-type camera-shake correction, driving of an electronic zoom lens, and wobbling driving of the image sensor.
- the present invention can be applied to the pressing of an operating button and the pop-up driving of a flash, both of which generate operation noise.
- the noise cancellation method described above employs the prediction process, which reproduces the audio signal in the drive noise generation section by predicting it from the audio signal generated before and after that section.
- other methods may be used, such as a mute method in which the signal in the drive noise generation section is set to 0, i.e., muted.
- the spectral subtraction method, which transforms the signal to the frequency domain and subtracts the characteristic frequencies, may also be used.
- in the mute method, the audio signal in the noise generation section is simply set to 0, so that the calculation load is extremely small. Further, a short silent period due to muting sounds less unnatural to the listener, so that it is effective to correctly acquire the noise generation section as described in the present invention.
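A minimal sketch of the mute method (the function name and array-based interface are assumptions for illustration):

```python
import numpy as np

def mute_section(audio, start, end):
    """Mute method: simply set the audio samples in the drive noise
    generation section [start, end) to 0."""
    out = audio.copy()   # leave the original recording untouched
    out[start:end] = 0.0
    return out
```

Because the whole method is a single assignment, its calculation load is negligible; the perceived quality then depends entirely on how tightly [start, end) brackets the actual noise, which is why accurate detection of the noise generation section matters.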
- when the spectral subtraction method is employed, frequency domain information acquired by recording only the drive noise and transforming it to the frequency domain is stored in advance.
- the noise is cancelled by subtracting only the frequency domain information of the drive noise from the frequency domain information of the object audio signal into which the drive noise is mixed.
- regular noise such as a humming noise can be easily cancelled.
- if the noise generation section is not accurately determined, however, the position at which the frequency domain information of the drive noise is subtracted may be shifted in time with respect to the section into which the drive noise is actually mixed.
- the noise cancellation performance is then lowered. It is therefore effective to correctly acquire the noise generation section as described above according to the present invention, even when the spectral subtraction method is used.
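The spectral subtraction idea can be sketched as follows. This is a simplified illustration, not the patent's implementation: it assumes non-overlapping rectangular frames (real systems typically use overlapping windows), and the function names are invented for the example.

```python
import numpy as np

def noise_magnitude(noise, frame=256):
    """Average magnitude spectrum of a recording containing only the
    drive noise, stored in advance as the noise profile."""
    frames = [np.abs(np.fft.rfft(noise[s:s + frame]))
              for s in range(0, len(noise) - frame + 1, frame)]
    return np.mean(frames, axis=0)

def spectral_subtract(noisy, noise_profile, frame=256):
    """Frame-wise spectral subtraction: subtract the pre-recorded
    drive-noise magnitude spectrum and keep the noisy signal's phase."""
    out = np.zeros_like(noisy)
    for s in range(0, len(noisy) - frame + 1, frame):
        spec = np.fft.rfft(noisy[s:s + frame])
        mag = np.maximum(np.abs(spec) - noise_profile, 0.0)  # floor at 0
        out[s:s + frame] = np.fft.irfft(mag * np.exp(1j * np.angle(spec)), frame)
    return out
```

A steady, regular noise such as a hum has a stable magnitude spectrum, which is why this method handles it easily; if the subtraction is applied to frames shifted away from the actual noisy section, it removes energy from clean audio instead, which is the timing problem noted above.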
- the drive noise is cancelled when the sound is captured.
- the drive noise can be cancelled after capturing the sound.
- the signal indicating the generation timing of the drive noise is recorded along with the audio signal into which the drive noise is mixed. Both signals are then transferred to the drive noise cancellation apparatus, and the drive noise is cancelled.
- the data indicating the characteristic of the drive noise may also be recorded at the same time and be transferred to the drive noise cancellation apparatus.
- Such data indicating the characteristic of the drive noise includes a type of the drive noise, a threshold value for determining the drive noise period, and the period in which the drive noise continues.
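The data items listed above could be bundled into a record like the following. All field names are illustrative assumptions, not from the patent:

```python
from dataclasses import dataclass

@dataclass
class DriveNoiseEvent:
    """One drive-noise occurrence, recorded alongside the audio signal
    and transferred to the drive noise cancellation apparatus.
    Field names here are hypothetical."""
    sample_index: int       # generation timing of the drive noise
    noise_type: str         # type of drive noise, e.g. "diaphragm"
    threshold: float        # threshold for determining the drive noise period
    duration_samples: int   # period in which the drive noise continues
```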
- FIG. 9 is a schematic diagram illustrating a system including the digital single-lens reflex camera and an external processing apparatus, i.e., a drive noise cancellation apparatus.
- a digital single-lens reflex camera 100 a is connected to an external processing apparatus 170 via a communication cable 151 .
- FIG. 10 is a block diagram illustrating the system configuration illustrated in FIG. 9 .
- the camera 100 a includes a communication connector 141 for connecting to an external device.
- the communication connector 141 electrically connects to a communication connector 174 in the external processing apparatus 170 via the communication cable 151 .
- elements having functions similar to those in the first exemplary embodiment are assigned the same reference numerals.
- the external processing apparatus 170 includes a control unit 171 , an audio signal processing unit 172 , a memory 173 , an operation input unit 175 , an audio reproduction device 176 , and an image display device 177 .
- the control unit 171 controls each unit to perform operations including noise cancellation according to the operation on the operation input unit 175 by an operator.
- the results of performing control, including the operation status of noise cancellation, are output to the audio reproduction device 176 and the image display device 177.
- the control unit 171 receives, from the camera 100 a via the communication connector 174, recorded moving image data including the audio signal in which the drive noise has not been cancelled, together with the signal indicating the drive noise generation timing.
- the audio signal processing unit 172 then performs the noise cancellation process similar to the first exemplary embodiment on the audio signal which includes the drive noise received from the camera 100 a , and records the processed signal in the memory 173 .
- FIG. 11 is a flowchart illustrating the drive noise reduction process performed by the external processing apparatus 170.
- the flowchart illustrated in FIG. 11 starts when the operator instructs the control unit 171 via the operation input unit 175 to start the noise cancellation process.
- in step S2001, the external processing apparatus 170 reads, via the communication cable 151, the audio signal into which the drive noise is mixed and the moving image data including the drive timing signal recorded in the memory 134 in the camera body 101a.
- in step S2002, the control unit 171 determines whether the drive timing signal synchronized with the read audio signal is detected. If the control unit 171 does not detect the drive timing signal (NO in step S2002), the process proceeds to step S2010. If the drive timing signal is detected (YES in step S2002), the process proceeds to step S2003.
- in step S2010, the control unit 171 records the audio signal without modification.
- in step S2009, the control unit 171 determines whether the audio signal to be processed has ended. If the audio signal has not ended (NO in step S2009), the process returns to step S2001, and the control unit 171 continues to read the audio signal from the camera 100a. If the audio signal has ended (YES in step S2009), the control unit 171 ends the drive noise cancellation process.
- in step S2003, the control unit 171 buffers the audio signal of a predetermined length of time from the point of receiving the drive timing signal.
- the processes performed in steps S2004 to S2007 are similar to those performed in steps S1004 to S1007 illustrated in FIG. 3, except that the audio signal processing unit 172 performs the noise reduction process. A detailed description thereof is thus omitted.
- in step S2008, the audio signal processing unit 172 records in the memory 173 the audio signal on which the noise cancellation process has been performed.
- in step S2009, the control unit 171 determines whether the audio signal to be processed has ended. If the audio signal has not ended (NO in step S2009), the process returns to step S2001, and the control unit 171 continues to read the audio signal from the camera 100a. If the audio signal has ended (YES in step S2009), the control unit 171 ends the drive noise cancellation process.
- the audio signal on which the noise cancellation process has been performed is recorded in the memory 173 in synchronization with the image data included in the moving image data received from the camera 100 a .
- the audio signal on which the noise cancellation process has been performed may be re-written in the memory 134 in the camera 100 a and be overwritten on the audio signal in the memory 134 which includes the drive noise.
- a memory card reader 152 may be used to transfer the necessary data to the external processing apparatus as illustrated in FIG. 12 .
- the present invention can be accomplished by supplying an apparatus with a storage medium in which a software program code which implements the functions of the above exemplary embodiments is stored.
- in this case, a computer (or a central processing unit (CPU), micro-processor unit (MPU), or the like) of the apparatus reads out and executes the program code stored in the storage medium.
- the program code itself read from the storage medium implements the functions of the above exemplary embodiments.
- the program code itself and the storage medium in which the program code is stored constitute the present invention.
- a flexible disk, a hard disk, an optical disk, a magneto-optical disk, a compact disc read-only memory (CD-ROM), a compact disc recordable (CD-R), a magnetic tape, a nonvolatile memory card, and a ROM can be used as the storage medium for supplying the program code.
- the above case includes a case where a basic system or an operating system (OS) or the like which operates on the computer performs a part or all of processing based on instructions of the above program code and where the functions of the above exemplary embodiments are implemented by the processing.
- the above case also includes a case where the program code read out from the storage medium is written to a memory provided on an expansion board inserted into a computer or to an expansion unit connected to the computer, so that the functions of the above exemplary embodiments are implemented.
- a CPU or the like provided in the expansion board or the expansion unit performs a part or all of actual processing.
- aspects of the present invention can also be realized by a computer of a system or apparatus (or devices such as a CPU, MPU, etc.) that reads out and executes a program recorded on a memory device to perform the functions of the above-described embodiments, and by a method, the steps of which are performed by a computer of a system or apparatus by, for example, reading out and executing a program recorded on a memory device to perform the functions of the above-described embodiments.
- the program is provided to the computer for example via a network or from a recording medium of various types serving as the memory device (e.g., a computer-readable medium).
- the system or apparatus, and the recording medium where the program is stored are included as being within the scope of the present invention.
Description
x_t + α_1 x_{t−1} + … + α_p x_{t−p} = ε_t   (1)
In equation (1), ε_t is an uncorrelated random variable with mean 0 and variance σ².
In equation (3), α_0 = 1. Equation (4) is then used to simplify equation (3).
The linear prediction coefficients α_i that minimize equation (3) can be determined by setting the partial derivative of equation (3) with respect to α_j (j = 1, 2, …, p) to 0. As a result, equation (5) is obtained.
Equation (5) indicates that α_i can be determined by solving p sets of linear simultaneous equations. c_ij in equation (5) can be computed using x_{t−i} (i = 1, 2, …, p).
If the approximation is sufficiently accurate, the right-hand side of equation (6) can be used as the prediction signal in place of x_t.
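The normal equations of equation (5) and the prediction of equation (6) can be realized numerically as follows. This is a sketch assuming the elided equations (2) to (4) follow the standard least-squares derivation; function names are invented, and the sign convention follows equation (1), so the predicted sample is the negated weighted sum of the past samples.

```python
import numpy as np

def solve_lpc(x, p):
    """Solve equation (5): p simultaneous linear equations in the
    linear prediction coefficients alpha_i, with c_ij computed as
    lagged inner products of the learning-section signal x."""
    N = len(x)
    # c[i][j] = sum over t of x[t-i] * x[t-j], for i, j = 0..p
    c = np.array([[np.dot(x[p - i:N - i], x[p - j:N - j])
                   for j in range(p + 1)] for i in range(p + 1)])
    return np.linalg.solve(c[1:, 1:], -c[1:, 0])

def predict_next(x, alpha):
    """Equation (6): predicted sample = -(alpha_1 x_{t-1} + ... +
    alpha_p x_{t-p}), per the sign convention of equation (1)."""
    p = len(alpha)
    return -np.dot(alpha, x[-1:-p - 1:-1])
```

Applying `predict_next` repeatedly, feeding each prediction back in as the newest sample, generates the signal in the prediction section from a learning section.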
Claims (17)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-264119 | 2010-11-26 | ||
JP2010264119A JP5656586B2 (en) | 2010-11-26 | 2010-11-26 | Imaging apparatus, control method therefor, and audio processing apparatus and method |
Publications (2)
Publication Number | Publication Date |
---|---|
US20120133784A1 US20120133784A1 (en) | 2012-05-31 |
US9288370B2 true US9288370B2 (en) | 2016-03-15 |
Family
ID=46126374
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/296,916 Expired - Fee Related US9288370B2 (en) | 2010-11-26 | 2011-11-15 | Imaging apparatus and audio processing apparatus |
Country Status (4)
Country | Link |
---|---|
US (1) | US9288370B2 (en) |
JP (1) | JP5656586B2 (en) |
KR (1) | KR101457392B1 (en) |
CN (1) | CN102572263B (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012203040A (en) * | 2011-03-23 | 2012-10-22 | Canon Inc | Sound signal processing apparatus and its control method |
JP6039205B2 (en) * | 2012-03-21 | 2016-12-07 | キヤノン株式会社 | Imaging device |
JP2014085609A (en) | 2012-10-26 | 2014-05-12 | Sony Corp | Signal processor, signal processing method, and program |
JP6144945B2 (en) * | 2013-03-29 | 2017-06-07 | キヤノン株式会社 | Signal processing apparatus and method |
US11817114B2 (en) * | 2019-12-09 | 2023-11-14 | Dolby Laboratories Licensing Corporation | Content and environmentally aware environmental noise compensation |
Citations (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001344000A (en) | 2000-05-31 | 2001-12-14 | Toshiba Corp | Noise canceler, communication equipment provided with it, and storage medium with noise cancellation processing program stored |
JP2003233947A (en) | 2002-02-08 | 2003-08-22 | Fuji Photo Film Co Ltd | Sound recording device |
JP2004023502A (en) | 2002-06-18 | 2004-01-22 | Fuji Photo Film Co Ltd | Digital camera |
US6784933B1 (en) * | 1999-09-10 | 2004-08-31 | Kabushiki Kaisha Toshiba | Solid-state imaging device and method for controlling same |
US20040172240A1 (en) * | 2001-04-13 | 2004-09-02 | Crockett Brett G. | Comparing audio using characterizations based on auditory events |
US20060132624A1 (en) * | 2004-12-21 | 2006-06-22 | Casio Computer Co., Ltd. | Electronic camera with noise reduction unit |
JP2006270591A (en) | 2005-03-24 | 2006-10-05 | Nikon Corp | Electronic camera, data reproducing device and program |
JP2006279185A (en) | 2005-03-28 | 2006-10-12 | Casio Comput Co Ltd | Imaging apparatus, and sound recording method and program |
JP2006287387A (en) | 2005-03-31 | 2006-10-19 | Casio Comput Co Ltd | Imaging apparatus, sound recording method, and program |
KR20070004806A (en) | 2004-03-30 | 2007-01-09 | 산요덴키가부시키가이샤 | Noise removing circuit |
JP2008053802A (en) | 2006-08-22 | 2008-03-06 | Sony Corp | Recorder, noise removing method, and noise removing device |
US20080151062A1 (en) * | 2006-12-20 | 2008-06-26 | Yoichiro Okumura | Digital camera |
US20080219470A1 (en) * | 2007-03-08 | 2008-09-11 | Sony Corporation | Signal processing apparatus, signal processing method, and program recording medium |
US20090147624A1 (en) * | 2007-12-06 | 2009-06-11 | Sanyo Electric Co., Ltd. | Sound Collection Environment Deciding Device, Sound Processing Device, Electronic Appliance, Sound Collection Environment Deciding Method and Sound Processing Method |
JP2010118975A (en) | 2008-11-14 | 2010-05-27 | Nec Corp | Sound collector and method for suppressing noise |
US20100260354A1 (en) * | 2009-04-13 | 2010-10-14 | Sony Coporation | Noise reducing apparatus and noise reducing method |
US20100270954A1 (en) * | 2009-04-24 | 2010-10-28 | Hisao Ito | Driving apparatus, optical apparatus, and driving signal control circuit |
US20110022403A1 (en) * | 2009-07-27 | 2011-01-27 | Canon Kabushiki Kaisha | Sound recording apparatus and method |
US20110052139A1 (en) * | 2009-08-28 | 2011-03-03 | Sanyo Electric Co., Ltd. | Imaging Device And Playback Device |
US20110069946A1 (en) * | 2009-09-17 | 2011-03-24 | Panasonic Corporation | Focus adjusting apparatus and imaging apparatus |
US20110096206A1 (en) * | 2009-06-26 | 2011-04-28 | Nikon Corporation | Image Pickup Apparatus |
US8379110B2 (en) * | 2009-09-16 | 2013-02-19 | Canon Kabushiki Kaisha | Image sensing apparatus and system |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005244613A (en) * | 2004-02-26 | 2005-09-08 | Toshiba Corp | Digital still camera |
- 2010
  - 2010-11-26 JP JP2010264119A patent/JP5656586B2/en not_active Expired - Fee Related
- 2011
  - 2011-11-15 US US13/296,916 patent/US9288370B2/en not_active Expired - Fee Related
  - 2011-11-22 KR KR1020110121966A patent/KR101457392B1/en active IP Right Grant
  - 2011-11-28 CN CN201110396302.1A patent/CN102572263B/en not_active Expired - Fee Related
Patent Citations (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6784933B1 (en) * | 1999-09-10 | 2004-08-31 | Kabushiki Kaisha Toshiba | Solid-state imaging device and method for controlling same |
JP2001344000A (en) | 2000-05-31 | 2001-12-14 | Toshiba Corp | Noise canceler, communication equipment provided with it, and storage medium with noise cancellation processing program stored |
US20040172240A1 (en) * | 2001-04-13 | 2004-09-02 | Crockett Brett G. | Comparing audio using characterizations based on auditory events |
JP2003233947A (en) | 2002-02-08 | 2003-08-22 | Fuji Photo Film Co Ltd | Sound recording device |
JP2004023502A (en) | 2002-06-18 | 2004-01-22 | Fuji Photo Film Co Ltd | Digital camera |
US20080279393A1 (en) | 2004-03-30 | 2008-11-13 | Sanyo Electric Co., Ltd. | Noise Eliminating Circuit |
KR20070004806A (en) | 2004-03-30 | 2007-01-09 | 산요덴키가부시키가이샤 | Noise removing circuit |
US20060132624A1 (en) * | 2004-12-21 | 2006-06-22 | Casio Computer Co., Ltd. | Electronic camera with noise reduction unit |
JP2006270591A (en) | 2005-03-24 | 2006-10-05 | Nikon Corp | Electronic camera, data reproducing device and program |
JP2006279185A (en) | 2005-03-28 | 2006-10-12 | Casio Comput Co Ltd | Imaging apparatus, and sound recording method and program |
JP2006287387A (en) | 2005-03-31 | 2006-10-19 | Casio Comput Co Ltd | Imaging apparatus, sound recording method, and program |
JP2008053802A (en) | 2006-08-22 | 2008-03-06 | Sony Corp | Recorder, noise removing method, and noise removing device |
US20080151062A1 (en) * | 2006-12-20 | 2008-06-26 | Yoichiro Okumura | Digital camera |
US20080219470A1 (en) * | 2007-03-08 | 2008-09-11 | Sony Corporation | Signal processing apparatus, signal processing method, and program recording medium |
US20090147624A1 (en) * | 2007-12-06 | 2009-06-11 | Sanyo Electric Co., Ltd. | Sound Collection Environment Deciding Device, Sound Processing Device, Electronic Appliance, Sound Collection Environment Deciding Method and Sound Processing Method |
JP2010118975A (en) | 2008-11-14 | 2010-05-27 | Nec Corp | Sound collector and method for suppressing noise |
US20100260354A1 (en) * | 2009-04-13 | 2010-10-14 | Sony Coporation | Noise reducing apparatus and noise reducing method |
US20100270954A1 (en) * | 2009-04-24 | 2010-10-28 | Hisao Ito | Driving apparatus, optical apparatus, and driving signal control circuit |
US20110096206A1 (en) * | 2009-06-26 | 2011-04-28 | Nikon Corporation | Image Pickup Apparatus |
US20110022403A1 (en) * | 2009-07-27 | 2011-01-27 | Canon Kabushiki Kaisha | Sound recording apparatus and method |
US20110052139A1 (en) * | 2009-08-28 | 2011-03-03 | Sanyo Electric Co., Ltd. | Imaging Device And Playback Device |
US8379110B2 (en) * | 2009-09-16 | 2013-02-19 | Canon Kabushiki Kaisha | Image sensing apparatus and system |
US20110069946A1 (en) * | 2009-09-17 | 2011-03-24 | Panasonic Corporation | Focus adjusting apparatus and imaging apparatus |
Also Published As
Publication number | Publication date |
---|---|
KR101457392B1 (en) | 2014-11-03 |
US20120133784A1 (en) | 2012-05-31 |
CN102572263B (en) | 2015-05-27 |
JP2012114842A (en) | 2012-06-14 |
JP5656586B2 (en) | 2015-01-21 |
KR20120057526A (en) | 2012-06-05 |
CN102572263A (en) | 2012-07-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8379110B2 (en) | Image sensing apparatus and system | |
US8626500B2 (en) | Apparatus and method for noise reduction and sound recording | |
US9288370B2 (en) | Imaging apparatus and audio processing apparatus | |
US8698911B2 (en) | Sound recording device, imaging device, photographing device, optical device, and program | |
JP5279629B2 (en) | Imaging device | |
US9282229B2 (en) | Audio processing apparatus, audio processing method and imaging apparatus | |
JP4952769B2 (en) | Imaging device | |
US9232146B2 (en) | Imaging device with processing to change sound data | |
JP5361398B2 (en) | Imaging device | |
US9734840B2 (en) | Signal processing device, imaging apparatus, and signal-processing program | |
KR101399986B1 (en) | Audi0 signal processing apparatus | |
US8855482B2 (en) | Imaging apparatus and sound processing apparatus | |
US9294835B2 (en) | Image capturing apparatus, signal processing apparatus and method | |
JP5158054B2 (en) | Recording device, imaging device, and program | |
JP6061476B2 (en) | Audio processing device | |
JP2011205527A (en) | Imaging apparatus, method and program | |
JP5736839B2 (en) | Signal processing apparatus, imaging apparatus, and program | |
JP2012165219A (en) | Imaging apparatus | |
JP2011205526A (en) | Imaging apparatus, method, and program | |
JP2013178456A (en) | Signal processor, camera and signal processing program | |
JP3977406B2 (en) | Autofocus device, autofocus method, imaging device, program, storage medium | |
JP2013161041A (en) | Signal processor, camera and signal processing program | |
JP2016018082A (en) | Voice processing device and method, as well as imaging device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | AS | Assignment | Owner name: CANON KABUSHIKI KAISHA, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KAJIMURA, FUMIHIRO;REEL/FRAME:027712/0421. Effective date: 20111108 |
 | ZAAA | Notice of allowance and fees due | Free format text: ORIGINAL CODE: NOA |
 | ZAAB | Notice of allowance mailed | Free format text: ORIGINAL CODE: MN/=. |
 | STCF | Information on status: patent grant | Free format text: PATENTED CASE |
 | MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 4 |
 | FEPP | Fee payment procedure | Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
 | LAPS | Lapse for failure to pay maintenance fees | Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
 | STCH | Information on status: patent discontinuation | Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
 | FP | Lapsed due to failure to pay maintenance fee | Effective date: 20240315 |