CN115118892A - Image acquisition method and device and electronic equipment - Google Patents


Info

Publication number: CN115118892A
Authority: CN (China)
Prior art keywords: image sensor, image, multispectral, pixels, pixel
Prior art date
Legal status: Pending (an assumption, not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Application number: CN202210733571.0A
Other languages: Chinese (zh)
Inventors: 裴珺; 吴旭邦
Current assignee: Vivo Mobile Communication Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Original assignee: Vivo Mobile Communication Co Ltd
Priority date: (an assumption, not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202210733571.0A
Publication of CN115118892A
Legal status: Pending

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00: Details of television systems
    • H04N 5/14: Picture signal circuitry for video frequency region
    • H04N 5/144: Movement detection

Abstract

The application provides an image acquisition method, an image acquisition device, and an electronic device, belonging to the technical field of photography. The method comprises the following steps: while controlling the image sensor to expose, acquiring motion information of a photographed subject through multispectral pixels in the image sensor and a spectrum emitter; determining compensation information for the image sensor based on the motion information; controlling the image sensor to move based on the compensation information, so that the image sensor remains stationary relative to the subject during exposure; and outputting a first original image after the exposure is complete.

Description

Image acquisition method and device and electronic equipment
Technical Field
Embodiments of the invention relate to the technical field of photography, and in particular to an image acquisition method, an image acquisition device, and an electronic device.
Background
When a user shoots through the lens of an electronic device, the device exposes line by line (a rolling shutter) in order to capture a sharp picture. When the subject is moving, however, line-by-line exposure can cause the image region containing the subject to be tilted or distorted, so the result appears tilted, irregularly skewed, or partially exposed (the so-called jelly effect).
To mitigate the jelly effect, the sharpness of a moving subject is conventionally improved by increasing the line-by-line exposure speed of the electronic device or by replacing line-by-line exposure with global exposure.
However, these approaches can cause pixel failure and increase the cost of using the electronic device.
Disclosure of Invention
Embodiments of the invention provide an image acquisition method, an image acquisition device, and an electronic device capable of obtaining a normal, sharp image even when a moving subject is photographed.
In a first aspect, an embodiment of the present invention provides an image acquisition method applied to an image acquisition device, where the device includes an image sensor and a spectrum emitter, multispectral pixels being disposed on the image sensor, and the method includes:
while controlling the image sensor to expose, acquiring motion information of a photographed subject through the multispectral pixels in the image sensor and the spectrum emitter;
determining compensation information for the image sensor based on the motion information;
controlling the image sensor to move based on the compensation information, so that the image sensor remains stationary relative to the subject during exposure; and
outputting a first original image after the exposure is complete.
In a second aspect, an embodiment of the present invention provides an image acquisition device, where the device includes an image sensor and a spectrum emitter, multispectral pixels being disposed on the image sensor, and the device further includes:
a motion information acquisition module, configured to acquire motion information of a photographed subject through the multispectral pixels in the image sensor and the spectrum emitter while the image sensor is controlled to expose;
a compensation information acquisition module, configured to determine compensation information for the image sensor based on the motion information;
an image sensor moving module, configured to control the image sensor to move based on the compensation information, so that the image sensor remains stationary relative to the subject during exposure; and
a first original image output module, configured to output a first original image after the exposure is complete.
In a third aspect, an embodiment of the present invention further provides an electronic device, including a processor and a memory, where the memory stores a program or instructions executable on the processor, and the program or instructions, when executed by the processor, implement the steps of the image acquisition method according to the first aspect.
In a fourth aspect, an embodiment of the present invention further provides a readable storage medium, on which a program or instructions are stored, which, when executed by a processor, implement the steps of the image acquisition method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In a sixth aspect, embodiments of the present application provide a computer program product, which is stored in a storage medium and executed by at least one processor to implement the method according to the first aspect.
In the embodiment of the application, motion information of the photographed subject is acquired through the multispectral pixels in the image sensor and the spectrum emitter; compensation information for the image sensor is determined based on the motion information; while the image sensor is controlled to expose, the image sensor is controlled to move based on the compensation information, so that the image sensor remains stationary relative to the subject during exposure; and a first original image is output after the exposure is complete. During exposure, the subject's motion information is obtained in real time and the camera module is adjusted in real time, so that the module completes a displacement-compensating movement as each pixel row is exposed and the image sensor stays stationary relative to the subject. A sharp image of a moving subject can therefore be captured without increasing the exposure speed, which reduces the probability of pixel failure and lowers the cost of using the electronic device.
The foregoing is only an overview of the technical solutions of the present invention. Embodiments are described below so that the technical means of the invention, together with the above and other objects, features, and advantages, can be understood more clearly.
Drawings
FIG. 1 is a schematic view of an image under the jelly effect provided by an embodiment of the present invention;
FIG. 2 is a flowchart illustrating steps of an image acquisition method according to an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of an optical anti-shake lens module according to an embodiment of the invention;
FIG. 4 is a schematic structural diagram of a micro-pan-tilt anti-shake lens module according to an embodiment of the present invention;
FIG. 5 is a flowchart illustrating steps of another image capture method according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a spectral curve provided by an embodiment of the present invention;
FIG. 7 is a circuit diagram of a color pixel imaging process provided by an embodiment of the present invention;
FIG. 8 is a multi-spectral pixel distribution map provided by an embodiment of the present invention;
FIG. 9 is a graph of a multispectral pixel distribution according to another embodiment of the present invention;
FIG. 10 is a block diagram of an image acquisition apparatus according to an embodiment of the present invention;
FIG. 11 is a schematic diagram of an electronic device provided by an embodiment of the present invention;
FIG. 12 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present invention.
Description of reference numerals:
11-an image sensor; 12-a lens; 13-a drive frame; 21-a lens module; 22-a magnetomotive frame; 23-image sensor.
Detailed Description
Exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the invention are shown in the drawings, it should be understood that the invention can be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
The following explains some concepts and/or terms involved in the schemes provided in the embodiments of the present application.
The image sensor is the core of a camera and its most critical technology. Sensors fall into two types: the widely used CCD (Charge-Coupled Device), and the CMOS device (Complementary Metal-Oxide-Semiconductor). Whereas a conventional camera records information on film, the "film" of a digital camera is its photosensitive imaging element, which is not replaceable and is integral to the camera.
Currently, CMOS devices are mainly used. These mainly use semiconductors made of two elements, silicon and germanium, so that N-type (negatively charged) and P-type (positively charged) semiconductors coexist on the device; the currents generated by the two complementary effects can be recorded and interpreted as an image by a processing chip.
The CMOS camera module, the type mainly used in current mobile phones, consists of a lens, a voice-coil motor, an infrared filter, an image sensor, a digital signal processor, and a flexible board. Its workflow is as follows: the voice-coil motor drives the lens to the position of accurate focus; external light passes through the lens, is filtered by the infrared filter, and strikes the photodiodes of the image sensor; the photodiodes convert the sensed optical signals into electrical signals, which are formed into a digital signal matrix (i.e., an image) by the amplifier circuit and the analog-to-digital conversion circuit; after digital signal processing, the image is compressed and stored.
The jelly effect refers to jelly-like deformation and color change encountered in production and daily life. Its formation is determined by the camera's own characteristics: most cameras using a CMOS sensor employ a rolling shutter, implemented by exposing the image sensor line by line. When exposure begins, the sensor scans and exposes row after row until every pixel has been exposed. If all of this completes in a very short time, shooting is generally unaffected. But if the subject moves at high speed or vibrates rapidly relative to the camera, the progressive-scan speed is insufficient, and the result appears tilted, wobbling, or partially exposed. This phenomenon occurring in rolling-shutter mode is defined as the jelly effect. Referring to FIG. 1, a schematic view of an image under the jelly effect: when the first row of pixels is scanned, the image for that row is captured normally; by the time the second row is scanned, the subject has moved to the left, so the image obtained from the second row is shifted. After the final row-by-row scan, the captured image of the subject is therefore tilted; this shooting condition is called the jelly effect.
Fig. 2 is a flowchart of the steps of an image acquisition method provided in an embodiment of the present invention. The method is applied to an image acquisition device that includes an image sensor and a spectrum emitter, multispectral pixels being disposed on the image sensor, and may include:
Step 101: while controlling the image sensor to expose, acquiring motion information of a photographed subject through the multispectral pixels in the image sensor and the spectrum emitter.
In the embodiment of the invention, the image sensor forms an image by line-by-line exposure: the photosensitive units are exposed row by row, starting from the first row, then the second, the third, and so on.
A multispectral pixel is a pixel that receives only light within a target spectral range (i.e., light of a particular spectrum) and filters out light outside that range. Multispectral pixels can be used to obtain the subject's reflectivity for spectra of different wavebands, and the subject's motion information can also be obtained from the time difference of the received spectral signals.
Further, the spectrum emitter is an electronic component capable of emitting spectra in different wavebands. The emitted spectrum is reflected by the moving target, and the image sensor receives the reflection. The target's moving distance can be calculated from the emitted spectrum: when the reflected spectrum reaches the image sensor, the propagation time is obtained from the time at which the image sensor captures the reflection and the time at which the emitter emitted it; since the speed of light is known, the moving distance and moving direction of the target can be derived from the time difference between emission and reception.
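A minimal sketch of the time-of-flight distance calculation described above (function and variable names are illustrative, not from the patent):

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_round_trip(t_emit_s, t_capture_s):
    # The emitted spectrum travels to the subject and back, so the
    # one-way distance is half of c times the round-trip time.
    return SPEED_OF_LIGHT * (t_capture_s - t_emit_s) / 2.0
```

A 20 ns round trip, for example, corresponds to roughly 3 m between the sensor and the subject.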
Through the cooperation of the spectrum emitter and the multispectral pixels, the subject's motion information can be acquired in real time, so that the position of the image sensor can be adjusted in real time according to that information.
Step 102, determining compensation information of the image sensor based on the motion information.
In the embodiment of the present invention, after the motion information is obtained in step 101, the moving distance and moving direction for the image sensor (its compensation information) can be derived from the moving distance and direction contained in the motion information. The image sensor then moves in real time according to the compensation information, tracking the subject's real-time motion. For example, if the subject moves 3 mm away from the image sensor, the compensation information is a movement of 3 mm in the same direction as the moving object, so that after the movement the camera module and the target remain relatively stationary.
Computing the image sensor's compensation information can further take the pixel-row exposure time into account. If each pixel row is exposed for 3 ms, the target's motion speed can be obtained from the measured moving distance over that period; for example, at a computed speed of 1 mm/ms, the target moves 3 mm during one row's exposure. The displacement produced during the exposure period and the displacement produced while the exposure data is read out can then be summed, and the result used as the compensation information, making the displacement adjustment of the image sensor more accurate.
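The row-level compensation just described (motion speed times exposure time, plus the displacement accrued during readout) can be sketched as follows; the function name and units are illustrative assumptions, not from the patent:

```python
def row_compensation_mm(speed_mm_per_ms, exposure_ms, readout_ms):
    # Displacement of the subject while one pixel row is exposed,
    # plus the displacement accrued while its data is read out.
    return speed_mm_per_ms * (exposure_ms + readout_ms)
```

With the text's example values (1 mm/ms over a 3 ms row exposure and no readout delay), this yields the 3 mm compensation distance.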
Further, the camera module containing the image sensor can be applied to an optical anti-shake lens module or a micro-pan-tilt anti-shake lens module; see FIG. 3 and FIG. 4, where FIG. 3 is a schematic structural diagram of the optical anti-shake lens module and FIG. 4 is a schematic structural diagram of the micro-pan-tilt anti-shake lens module. As shown in FIG. 3, a gyroscope is disposed in the optical anti-shake lens module and monitors tiny movements of the electronic device. If the gyroscope detects that the device has moved, the movement signal is passed to the image signal processing component, which computes the lens displacement needed to compensate. Finally, the lens 12 is driven by the electromagnetic-force drive frame 13 to move, relative to the image sensor 11, in the direction opposite to the device's displacement, compensating the displacement produced by shake, stabilizing the optical path, and effectively overcoming the image blur caused by vibration of the mobile phone.
The micro-pan-tilt module consists of a limiting mechanism, a double-ball suspension, a lens, a voice-coil motor, a double-S-shaped FPC (Flexible Printed Circuit) flat cable, a magnetomotive frame, a module carrier frame, and a protective cover. The CMOS sensor and the lens group are packaged in the double-ball suspension and mounted on the magnetomotive frame; two pairs of balls cooperate with a cross support so that the micro pan-tilt can perform flexible two-axis rotation about the X axis and the Y axis, achieving three-dimensional anti-shake. Referring to FIG. 4, when the electronic device is displaced, the camera module 21 in the magnetomotive frame 22 rotates to compensate the shake displacement; the image sensor 23 and the camera module 21 move together within the magnetomotive frame 22 to keep the optical path stable. The micro-pan-tilt structure moves the whole camera module, whereas the optical anti-shake structure moves only the lens and therefore loses part of the picture. The micro-pan-tilt scheme thus images better: its anti-shake range can reach 3.2 times that of the optical anti-shake structure, and its anti-shake stability is stronger.
Therefore, if the electronic device itself undergoes shake displacement, that displacement can be fused into the computation of the image sensor's compensation information. For example, if shake of the device requires the lens or the whole lens module to move 1 mm toward the subject, and the moving subject's motion information requires a further 3 mm in the same direction, the two displacement adjustments are superposed and the lens or module moves 4 mm toward the subject. By jointly considering the displacement of the moving target and the displacement of the electronic device, the required movement of the lens or the whole lens module can be judged more accurately, yielding a higher-quality image. The embodiment of the invention does not limit which lens module structure is used.
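The superposition of the shake compensation and the subject-motion compensation can be sketched as a simple vector sum (an illustrative assumption of how the two adjustments combine; names are hypothetical):

```python
def fused_compensation(shake_mm, subject_mm):
    # Each argument is a (dx, dy) displacement in millimetres;
    # the total compensation is their superposition.
    return (shake_mm[0] + subject_mm[0], shake_mm[1] + subject_mm[1])
```

With the text's example (1 mm for shake plus 3 mm for subject motion along one axis), the fused movement is 4 mm.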
And 103, controlling the image sensor to move based on the compensation information so that the image sensor is static relative to the shooting object in the exposure process.
In the embodiment of the invention, after the first pixel row has been exposed and the image sensor has moved to the corresponding position, the exposure of the second pixel row begins. Because the image sensor, after moving, is again stationary relative to the subject, the subject's position relative to the sensor is unchanged when the second row is exposed, so the image data obtained from the second row's exposure has no tilt problem. After the second exposure completes, the motion information of the moving target continues to be acquired, and the image sensor is moved before the next pixel row starts to expose, so that the sensor again stays stationary relative to the subject while that row forms its image. Each pixel row is exposed in turn in this way, with the camera module always relatively stationary with respect to the moving target, and after the exposure is complete a sharp image of the subject is finally obtained.
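The row-by-row loop just described might be sketched as follows, with the measurement, sensor movement, and exposure supplied as callbacks; this is a control-flow sketch with illustrative names, not the patent's implementation:

```python
def rolling_exposure(num_rows, measure_subject_shift, move_sensor, expose_row):
    # Before each row is exposed, the sensor is moved by the subject's
    # measured shift so that the two stay relatively stationary.
    image_rows = []
    for row in range(num_rows):
        shift = measure_subject_shift(row)  # from multispectral pixels + emitter
        move_sensor(shift)                  # displacement-compensating movement
        image_rows.append(expose_row(row))  # row is sharp: no relative motion
    return image_rows
```

A caller would plug in the real measurement and actuator routines; the sketch only fixes the ordering (measure, move, expose) for every row.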
Step 104: outputting a first original image after the exposure is complete.
In the embodiment of the invention, with the image sensor adjusted in real time, once all pixel rows have been exposed, a first original image in which the photographed subject appears sharp can be output.
In summary, the embodiment of the present application provides an image acquisition method: while the image sensor is controlled to expose, motion information of the photographed subject is obtained through the multispectral pixels in the image sensor and the spectrum emitter; compensation information for the image sensor is determined based on the motion information; the image sensor is controlled to move based on the compensation information, so that it remains stationary relative to the subject during exposure; and a first original image is output after the exposure is complete. Because the subject's motion is measured in real time and the camera module is moved in real time, the module completes a displacement-compensating movement as each pixel row is exposed, keeping the image sensor stationary relative to the subject. A sharp image of a moving subject is therefore captured without increasing the exposure speed, which reduces the probability of pixel failure and lowers the cost of using the electronic device.
Fig. 5 is a flowchart illustrating steps of another image capturing method according to an embodiment of the present invention, as shown in fig. 5, the method may include:
step 201, when the image sensor is controlled to be exposed, acquiring motion information of a shooting object through multispectral pixels in the image sensor and the spectral transmitter. The motion information includes a motion direction and a motion speed.
This step may specifically refer to step 101, which is not described herein again.
Step 202, determining the moving distance and the moving direction of the image sensor according to the moving direction, the moving speed and the reading time.
In the embodiment of the invention, the reading time is the time taken to read out a pixel row after its exposure completes; by measuring the subject's motion within this readout window, the image sensor can be adjusted accordingly. Let the propagation speed of the spectrum be c (the speed of light). If the time difference between the first moment (the start of the reading time), when the spectrum emitter emits, and the second moment, when the image sensor captures the reflection, is t1, then the distance between the subject and the camera module is d1 = c·t1/2, i.e., the current distance between the subject and the image sensor follows from c and the time difference t1. The emitter emits again at a third moment (the end of the reading time), the reflection is captured at a fourth moment, and the same calculation yields a distance d2. Comparing d1 with d2 gives the subject's direction of motion, and dividing the change in distance by the interval between the two emissions (the reading time) gives the subject's motion speed. From this motion information, the moving distance and moving direction of the image sensor can be obtained. For example, if the subject moves 3 mm away from the image sensor within the reading time, the compensation information of the image sensor is: moving direction the same as the subject, moving distance 3 mm.
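The comparison of d1 and d2 described above can be sketched as follows (units and names are illustrative assumptions):

```python
def subject_motion(d1_mm, d2_mm, reading_time_ms):
    # d1: distance at the start of the readout window; d2: at its end.
    delta = d2_mm - d1_mm
    if delta > 0:
        direction = "away"
    elif delta < 0:
        direction = "toward"
    else:
        direction = "stationary"
    return direction, abs(delta) / reading_time_ms  # speed in mm/ms
```

For a subject that recedes 3 mm over a 3 ms readout window, this returns a speed of 1 mm/ms moving away, matching the text's example.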
The compensation information of the image sensor can thus be obtained from the subject's motion information, so that the sensor is moved according to the compensation information and the jelly effect is mitigated.
Step 203: controlling the image sensor to move based on the compensation information, so that the image sensor remains stationary relative to the subject during exposure.
This step may specifically refer to step 103, which is not described herein again.
Step 204: outputting a first original image after the exposure is complete.
This step may specifically refer to step 104, which is not described herein again.
Optionally, the method may further include:
step 205, responding to a shooting operation, acquiring a second original image acquired by the image sensor;
in the embodiment of the invention, when the image is shot, a background image of the environment where the shot object is located can be obtained, because the image sensor moves correspondingly along with the movement of the shot object when the shot object is shot, at the moment, a clear image of the shot object can be shot, but because the image sensor moves, the shot background of the shot object is actually deformed or unclear, and therefore, a second original image with a clear background can be collected to ensure that the collected clear image of the background where the shot object is located.
Step 206, performing image fusion on the first original image and the second original image to obtain a target image; and the definition of the background image in the second original image meets a preset condition.
In the embodiment of the invention, the subject's motion information is obtained through the multispectral pixels and the spectrum emitter, and the image sensor is moved accordingly to capture the first original image, in which the subject is sharp; a second original image with a sharp background is captured while the image sensor is not moving. The image region corresponding to the subject in the first original image and the background region of the second original image can then be fused, yielding a sharp image that includes both the subject and its background, resolving the image-quality problem caused by the jelly effect; the final image can be further stored or previewed. As for the fusion method, the subject's image may be obtained by extracting its contour line, the tilted subject in the second original image removed in the same manner, and the two resulting images fused into the final image; alternatively, features may be extracted from the images and then fused. The embodiment of the present invention does not limit the fusion method.
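One minimal way to sketch the fusion step, assuming a subject mask has already been segmented (the patent leaves the segmentation method open), is a per-pixel composite; all names here are hypothetical:

```python
def fuse_images(subject_img, background_img, subject_mask):
    # Per-pixel composite: take the subject's pixels from the
    # motion-compensated frame and everything else from the static frame.
    # Images are 2-D lists of pixel values; the mask is True where the
    # subject was segmented.
    return [
        [s if m else b for s, b, m in zip(s_row, b_row, m_row)]
        for s_row, b_row, m_row in zip(subject_img, background_img, subject_mask)
    ]
```

In practice the contour-extraction or feature-based fusion mentioned in the text would produce the mask; this sketch only shows the final compositing.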
Optionally, the image sensor comprises at least one multispectral pixel set, each multispectral pixel set comprising at least two adjacently arranged multispectral pixels; the spectral bands corresponding to each multispectral pixel in the multispectral pixel group are different.
In an embodiment of the present invention, the multispectral pixels are arranged in the image sensor in multispectral pixel groups, and the spectral waveband corresponding to each multispectral pixel is different. For example, with two multispectral pixels, one may be set to pass the spectrum in the 10-500 nm waveband and the other the 501-800 nm waveband, so that spectral signals of different wavelengths are received by different multispectral pixels; a group of multispectral pixels then receives the spectral signals reflected by the subject and obtains the corresponding reflectivities separately.
Further, the number of multispectral pixels in a group may be 4, 6, 9, and so on. When a group contains 4 multispectral pixels, the waveband each pixel passes can be narrowed, for example to 10-200 nm, 200-400 nm, 400-600 nm, and 600-800 nm. As the number of multispectral pixels increases, the waveband each one receives shrinks, so the group's division of the received spectral signal becomes more precise, and the spectral signal of the subject's best reflectivity is determined more accurately. For other pixel counts the division method is the same, partitioning according to the actual number; the embodiment of the application is not limited here.
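An even partition of a total spectral range across the pixels of one group, in the spirit of the 4-pixel example above, can be sketched as follows (illustrative only; the patent's example bands are not exactly uniform):

```python
def partition_bands(lo_nm, hi_nm, n_pixels):
    # Split the total spectral range evenly across the pixels of one
    # multispectral pixel group; more pixels means narrower bands.
    step = (hi_nm - lo_nm) / n_pixels
    return [(lo_nm + i * step, lo_nm + (i + 1) * step) for i in range(n_pixels)]
```

Doubling the pixel count halves each band's width, which is the precision gain the text describes.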
Optionally, the range of spectral bands emitted by the spectral emitter includes a spectral band corresponding to each of the multispectral pixels in the multispectral pixel group, and before step 201, the method may further include:
Step 207: controlling the spectral emitter to emit emission spectral signals of different wave bands.
In the embodiment of the invention, the spectral emitter can emit spectral signals of different wave bands. By controlling the emitter to emit signals of different bands, the reflectivity of the photographic subject to each band can be measured, and the band of highest reflectivity can be found.
Step 208: acquiring, through the multispectral pixels, the reflectivity of the photographic subject to the emission spectral signals of different wave bands.
In the embodiment of the invention, the multispectral pixels detect the spectrum reflected by the photographic subject. By measuring the reflected signal, the subject's reflectivity for spectra of different wave bands is obtained, and finally the target band at which the moving subject's reflectivity is highest is determined.
The multispectral pixels can generate a spectral reflectivity curve for the photographic subject, which represents the subject's reflectivity for spectral signals of different wave bands. Referring to fig. 6, which plots the subject's reflectivity against wavelength (abscissa: wavelength; ordinate: reflectivity), the reflectivity values for the different bands can be read off, and the band with the highest reflectivity can be determined by comparing them.
Further, the multispectral pixels can obtain the subject's reflectivity curve by detecting the subject's reflection of the spectrum emitted by the spectral emitter; similarly, the curve can also be obtained by detecting the reflectivity of the different bands of natural light reflected by the subject. Specifically, when natural light is weak, the spectral emitter can emit signals of different bands so that the reflectivity curve is still obtained. With the curve, the band the subject reflects best can be identified accurately, which avoids the situation where a subject made of a special material such as glass has low reflectivity and its reflected spectrum is hard to capture, and thus ensures the accuracy of distance calculation.
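Steps 207-208 amount to a sweep: drive the emitter band by band and record the ratio of reflected to emitted signal value for each band. A hedged sketch (the Emitter/Pixels interfaces and all numeric values are stand-ins invented for illustration; the patent specifies no API):

```python
class Emitter:
    def emit(self, band):
        return 100.0  # constant drive level (stub)

class Pixels:
    # stubbed reflected signal values per band (hypothetical data)
    data = {(400, 600): 80.0, (600, 800): 95.0}
    def read(self, band):
        return self.data[band]

def measure_reflectance_curve(emitter, pixels, bands):
    """Sweep the emitter over the bands and record the subject's
    reflectivity as reflected / emitted signal value (steps 207-208)."""
    return {band: pixels.read(band) / emitter.emit(band) for band in bands}

curve = measure_reflectance_curve(Emitter(), Pixels(), [(400, 600), (600, 800)])
print(curve)  # → {(400, 600): 0.8, (600, 800): 0.95}
```

The resulting dictionary is a discrete version of the reflectivity curve of fig. 6.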
In this embodiment of the present invention, step 208 may specifically include:
Substep 2081: obtaining the reflection signal value received by each multispectral pixel in the multispectral pixel group.
In the embodiment of the invention, the spectral signals received by the multispectral pixels are converted into electrical signals for processing. Through analog-to-digital conversion and similar processing, signal values representing the intensity of the spectral signals are obtained, and the intensities received by different multispectral pixels can be compared via these values.
Substep 2082, determining the reflectivity of the photographic subject to different waveband spectral signals based on the emission spectral signals and the reflection signal values.
In the embodiment of the invention, the emission spectral signal emitted by the spectral emitter has a corresponding signal value representing its intensity. The subject's reflectivity for each band can be determined from the reflected signal value and the emitted signal value, and the band with the largest reflectivity can be taken as the target band.
Step 209: determining a target wave band based on the reflectivity, wherein the reflectivity corresponding to the target wave band is maximum.
In the embodiment of the present invention, referring to fig. 6, the reflectivity is highest for the band around 600 nm. The band at which the subject's reflectivity is highest can then be taken as the target band for the spectral emitter; for example, the target band may be the spectral band around 600 nm, which improves the probability of capturing the reflected spectrum.
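Step 209 then reduces to picking the band with the largest measured reflectivity, as in this sketch (the band boundaries and reflectivity values are hypothetical, loosely echoing the fig. 6 peak near 600 nm):

```python
def select_target_band(curve):
    """Return the band whose measured reflectivity is highest (step 209)."""
    return max(curve, key=curve.get)

curve = {(400, 500): 0.42, (500, 600): 0.88, (600, 700): 0.63}
print(select_target_band(curve))  # → (500, 600)
```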
Optionally, the photosensitive layer of the image sensor further comprises: a color pixel; and the ratio of multispectral pixels to color pixels is greater than or equal to 3% and less than or equal to 50%.
In the embodiment of the invention, the image sensor may include conventional color pixels and multispectral pixels: the color pixels generate the image of the photographed subject, while the multispectral pixels acquire motion information of a moving subject in real time. The multispectral pixel groups may be distributed uniformly among the color pixels, so that motion trajectories can be tracked in real time across the whole image sensor area.
Referring to fig. 7, fig. 7 shows the imaging circuit of a color pixel. Specifically, the color pixel may use a PPD (Pinned Photodiode) structure. The PPD pixel includes a photosensitive region, i.e., the photodiode PD1, and 4 transistors: a reset transistor RST, a transfer gate TX1, a row-select transistor, and a source-follower amplifier SF; this is also referred to as a 4T pixel structure. It operates as follows:
1. Exposure: RST and TX1 are turned on simultaneously to empty PD1, and exposure starts after both are turned off. Electron-hole pairs generated by incident light are separated by the electric field of PD1: electrons move to the n region and holes to the p region.
2. Reset, at the end of exposure, RST is activated, resetting the readout region to a high level.
3. Read out the reset level: after reset, the reset level is read out; it contains the offset and low-frequency noise of the MOS transistors and the capacitance noise introduced by the reset. The read-out value is stored on a first capacitor.
4. Charge transfer: TX1 is activated and the charge is completely transferred from the photosensitive region to the FD1 region for readout.
5. Read out the signal level: the voltage on FD1 is read out to a second capacitor. This signal includes the photoelectrically generated signal plus the offset noise, low-frequency noise and reset-induced capacitance noise.
6. Signal output: the difference between the signals stored on the two capacitors is amplified in the analog domain and then converted to a digital output; because both samples carry the same noise terms, the subtraction cancels them.
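The two-capacitor readout in steps 3-6 is a correlated double sampling scheme: both stored samples contain the same reset-induced and readout-path noise, so their difference retains only the photo-generated signal. A numeric sketch (the noise model is simplified to additive constants for illustration):

```python
def cds_readout(photo_signal, reset_noise, readout_noise):
    """Correlated double sampling: the reset level and the signal level
    share the same reset-induced and readout-path noise, so subtracting
    the two stored samples cancels it (steps 3-6 above)."""
    reset_level = reset_noise + readout_noise                   # first capacitor
    signal_level = photo_signal + reset_noise + readout_noise   # second capacitor
    return signal_level - reset_level                           # only the photo signal survives

print(cds_readout(photo_signal=1.25, reset_noise=0.5, readout_noise=0.25))  # → 1.25
```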
It should be noted that the ratio of multispectral pixels to color pixels should stay within a preset range. If the proportion of multispectral pixels is too small, or if no multispectral pixel lies in the sensor region corresponding to the photographed subject, the subject's motion information in that region cannot be captured. If the proportion is too large, or even exceeds the number of color pixels, too few color pixels remain for image capture: too little light is received and the generated image is not sharp enough. Therefore, the ratio of multispectral pixels to color pixels may be set to be greater than or equal to 3% and less than or equal to 50%, which ensures that rich color information is collected while motion information can still be captured, finally yielding a clear photographed image. The exact number of multispectral pixels is not limited herein.
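The 3%-50% constraint can be checked directly; a trivial sketch (the function name and the sample counts are assumptions):

```python
def multispectral_ratio_ok(n_multi, n_color):
    """Check that the multispectral-to-color pixel ratio stays within the
    3%-50% window the text recommends."""
    ratio = n_multi / n_color
    return 0.03 <= ratio <= 0.50

print(multispectral_ratio_ok(4, 60))   # → True  (about 6.7%)
print(multispectral_ratio_ok(1, 100))  # → False (1%, too few to track motion)
```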
Optionally, the photosensitive layer comprises a plurality of rectangular multispectral pixel arrays, each composed of a plurality of the multispectral pixels; when the color pixels in the photosensitive layer are distributed in a quad Bayer array, the rectangular multispectral pixel arrays are located at diagonal positions of a rectangular local area of the photosensitive layer.
In the embodiment of the invention, referring to fig. 8, fig. 8 shows a distribution of multispectral pixels in an image sensor whose color pixels are in a quad Bayer arrangement. R, G, B in fig. 8 are conventional color pixels and P represents a multispectral pixel. The multispectral pixels may be distributed as two-by-two arrays in groups of 4; within a rectangular local area of the image sensor they may occupy two opposite corners of the area, and other local areas of the sensor may repeat the same distribution, ensuring that the multispectral pixels can accurately acquire the subject's motion information in real time. The embodiment of the invention does not limit the specific number and distribution rule of the multispectral pixels.
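One possible reading of the fig. 8 layout can be sketched as a small tile: a quad Bayer pattern with 2x2 multispectral 'P' blocks at two opposite corners of an 8x8 local area (the tile size and the exact corner placement are illustrative assumptions, not the production pattern):

```python
def quad_bayer_tile_with_p(size=8):
    """Build a size x size quad Bayer tile (each color fills a 2x2 block)
    and overwrite 2x2 multispectral 'P' blocks at two opposite corners."""
    block_colors = [['R', 'G'], ['G', 'B']]  # quad Bayer: 2x2 blocks of one color
    tile = [[block_colors[(r // 2) % 2][(c // 2) % 2] for c in range(size)]
            for r in range(size)]
    for r0, c0 in [(0, 0), (size - 2, size - 2)]:  # two opposite corners
        for dr in range(2):
            for dc in range(2):
                tile[r0 + dr][c0 + dc] = 'P'
    return tile

for row in quad_bayer_tile_with_p():
    print(' '.join(row))
```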
Optionally, the photosensitive layer comprises: a plurality of rectangular multispectral pixel arrays, said rectangular multispectral pixel arrays comprised of a plurality of said multispectral pixels; when the color pixels in the photosensitive layer are distributed in a Bayer array, the rectangular multispectral pixel array is positioned at four positions of one rectangular local area in the photosensitive layer, wherein the four positions are mutually symmetrical.
In the embodiment of the invention, referring to fig. 9, fig. 9 shows a distribution of multispectral pixels in an image sensor whose color pixels are in a Bayer arrangement. R, G, B in fig. 9 are conventional color pixels and P represents a multispectral pixel. The multispectral pixels may be distributed as two-by-two arrays in groups of 4; within a rectangular local area of the image sensor they may occupy four mutually symmetrical positions of the area, and other local areas may repeat the same distribution. The embodiment of the invention does not limit the number and distribution rule of the multispectral pixels.
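Similarly, a hedged sketch of the fig. 9 idea: a standard Bayer tile with 2x2 'P' groups at four mutually symmetric positions (the corners of an 8x8 local area here; the size and placement are assumptions for illustration):

```python
def bayer_tile_with_p_groups(size=8):
    """Standard Bayer (RGGB) tile with 2x2 multispectral 'P' groups at
    four mutually symmetric positions (the corners of the local area)."""
    bayer = [['R', 'G'], ['G', 'B']]  # Bayer: color alternates every pixel
    tile = [[bayer[r % 2][c % 2] for c in range(size)] for r in range(size)]
    hi = size - 2
    for r0, c0 in [(0, 0), (0, hi), (hi, 0), (hi, hi)]:
        for dr in range(2):
            for dc in range(2):
                tile[r0 + dr][c0 + dc] = 'P'
    return tile

for row in bayer_tile_with_p_groups():
    print(' '.join(row))
```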
In summary, an embodiment of the present invention provides an image acquisition method: when the image sensor is controlled to expose, motion information of the photographed subject is acquired through the multispectral pixels in the image sensor and the spectral emitter; compensation information for the image sensor is determined from the motion information; the image sensor is moved according to the compensation information so that it remains stationary relative to the subject during exposure; and after exposure, a first original image is output. Because the subject's motion information is obtained in real time during exposure and the camera module's movement is adjusted in real time, the module completes its displacement-compensating movement while each pixel row is exposed, keeping the image sensor stationary relative to the subject. A moving subject can therefore be captured sharply without raising the exposure speed, which reduces the probability of pixel failure and, since no faster exposure mode is required, lowers the cost of the electronic equipment.
For the image acquisition method provided by the embodiment of the application, the execution subject may be an image acquisition device. The embodiment of the present application describes the image acquisition device by taking, as an example, the device executing the image acquisition method.
Fig. 10 is a block diagram of an image acquisition device applied to an electronic device, where the photosensitive layer of the electronic device's image sensor includes multispectral pixels used to obtain motion information of a moving subject. As shown in fig. 10, the device 30 includes:
a motion information acquiring module 301, configured to acquire motion information of a photographic object through multispectral pixels in the image sensor and the spectral transmitter;
a compensation information obtaining module 302, configured to determine compensation information of the image sensor based on the motion information;
an image sensor moving module 303, configured to, when controlling the image sensor to perform exposure, control the image sensor to move based on the compensation information, so that the image sensor is stationary relative to the photographic subject during the exposure;
the first original image output module 304 is configured to output a first original image after the exposure is completed.
Optionally, the motion information includes a motion direction and a motion speed, and the compensation information obtaining module 302 includes:
the compensation information calculation submodule is used for determining the moving distance and the moving direction of the image sensor according to the moving direction, the moving speed and the reading time;
optionally, the device 30 comprises:
and the second original image output module is used for responding to shooting operation and acquiring a second original image acquired by the image sensor.
The fusion module is used for carrying out image fusion on the first original image and the second original image to obtain a target image; and the definition of the background image in the second original image meets a preset condition.
Optionally, the image sensor in the device 30 comprises at least one multispectral pixel group, each multispectral pixel group comprising at least two adjacently arranged multispectral pixels; the spectral band corresponding to each multispectral pixel in the group is different.
Optionally, the range of spectral bands emitted by the spectral emitter includes a spectral band corresponding to each of the multispectral pixels in the set of multispectral pixels, and the apparatus 30 further comprises:
the spectrum signal emission module is used for controlling the spectrum emitter to emit emission spectrum signals of different wave bands;
the reflectivity acquisition module is used for acquiring the reflectivity of the shooting object to the emission spectrum signals of different wave bands through the multispectral pixels;
a target band determination module, configured to determine a target band based on the reflectivity, wherein the reflectivity corresponding to the target band is maximum.
Optionally, the reflectivity obtaining module includes:
a reflection signal value acquisition submodule for acquiring a reflection signal value received by each of the multispectral pixels in the multispectral pixel group;
and the reflectivity determining submodule is used for determining the reflectivity of the shooting object to the spectrum signals of different wave bands based on the emission spectrum signals and the reflection signal values.
Optionally, in the device 30, the image sensor further comprises color pixels, and the ratio of the multispectral pixels to the color pixels is greater than or equal to 3% and less than or equal to 50%.
In summary, an embodiment of the present invention provides an image acquisition device that, when the image sensor is controlled to expose, acquires motion information of the photographed subject through the multispectral pixels in the image sensor and the spectral emitter; determines compensation information for the image sensor from the motion information; moves the image sensor according to the compensation information so that it remains stationary relative to the subject during exposure; and outputs a first original image after exposure. Because the subject's motion information is obtained in real time during exposure and the camera module's movement is adjusted in real time, the module completes its displacement-compensating movement while each pixel row is exposed, keeping the image sensor stationary relative to the subject. A moving subject can therefore be captured sharply without raising the exposure speed, which reduces the probability of pixel failure and, since no faster exposure mode is required, lowers the cost of the electronic equipment.
The image capturing device in the embodiment of the present application may be an electronic device, or may be a component in an electronic device, such as an integrated circuit or a chip. The electronic device may be a terminal, or may be a device other than a terminal. The electronic Device may be, for example, a Mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic Device, a Mobile Internet Device (MID), an Augmented Reality (AR)/Virtual Reality (VR) Device, a robot, a wearable Device, an ultra-Mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and may also be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine, a self-service machine, and the like, and the embodiments of the present application are not particularly limited.
The image acquisition device in the embodiment of the present application may be a device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system; embodiments of the present application are not specifically limited.
The image acquisition device provided by the embodiment of the application can realize each process realized by the method embodiments of fig. 1 to 9, achieve the same technical effect, and is not repeated here for avoiding repetition.
Optionally, as shown in fig. 11, an electronic device 400 is further provided in an embodiment of the present application, including a processor 401 and a memory 402, where the memory 402 stores a program or instructions executable on the processor 401. When the program or instructions are executed by the processor 401, the steps of the image acquisition method embodiment are implemented and the same technical effects are achieved; details are not repeated here to avoid repetition.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 12 is a schematic hardware structure diagram of an electronic device implementing an embodiment of the present application.
The electronic device 400 includes, but is not limited to: a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009, and a processor 1010.
Those skilled in the art will appreciate that the electronic device 400 may further include a power source (e.g., a battery) supplying the various components; the power source may be logically connected to the processor 1010 through a power management system so that charging, discharging, and power-consumption management are handled by that system. The electronic device structure shown in fig. 12 does not limit the electronic device, which may include more or fewer components than shown, combine some components, or arrange components differently; details are omitted here.
The processor 1010 is used for acquiring motion information of a shooting object through multispectral pixels in the image sensor and the spectral transmitter;
determining compensation information for the image sensor based on the motion information;
when the image sensor is controlled to be exposed, the image sensor is controlled to move based on the compensation information, so that the image sensor is static relative to the shooting object in the exposure process;
and after the exposure is finished, outputting a first original image.
Optionally, the motion information includes a motion direction and a motion speed, and the processor 1010 is further configured to determine the moving distance and moving direction of the image sensor according to the motion direction, the motion speed, and the readout time.
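The compensation computed here follows the relation stated in claim 2: moving distance = motion speed * readout time, along the subject's motion direction. A sketch (decomposing the move into x/y components is an illustrative assumption; units are arbitrary):

```python
import math

def compensation(motion_dir_deg, speed, readout_time):
    """Determine the sensor's compensating move: distance = speed * readout
    time, directed along the subject's motion direction."""
    distance = speed * readout_time
    theta = math.radians(motion_dir_deg)
    return distance, (distance * math.cos(theta), distance * math.sin(theta))

dist, (dx, dy) = compensation(motion_dir_deg=0.0, speed=0.5, readout_time=0.02)
print(dist)  # → 0.01
```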
Optionally, the processor 1010 is configured to, in response to a shooting operation, acquire a second original image acquired by the image sensor; carrying out image fusion on the first original image and the second original image to obtain a target image; and the definition of the background image in the second original image meets a preset condition.
Optionally, the image sensor comprises at least one multispectral pixel set, each multispectral pixel set comprising at least two adjacently arranged multispectral pixels; the spectral wave bands corresponding to each multispectral pixel in the multispectral pixel group are different.
Optionally, the range of spectral bands emitted by the spectral emitter includes a spectral band corresponding to each of the multispectral pixels in the multispectral pixel group, and the processor 1010 is further configured to control the spectral emitter to emit emission spectral signals of different bands; acquiring the reflectivity of the shooting object to the emission spectrum signals of different wave bands through the multispectral pixels; determining a target band based on the reflectivity; and the reflectivity corresponding to the target wave band is maximum.
Optionally, the processor 1010 is further configured to obtain a reflection signal value received by each of the multispectral pixels in the multispectral pixel group; and determining the reflectivity of the shooting object to different wave band spectrum signals based on the emission spectrum signals and the reflection signal values.
Optionally, the image sensor further comprises color pixels, and the ratio of the multispectral pixels to the color pixels is greater than or equal to 3% and less than or equal to 50%.
By executing the above method, the processor 1010 obtains the subject's motion information in real time during exposure and adjusts the camera module's movement in real time, so that the module completes its displacement-compensating movement while each pixel row is exposed and the image sensor remains stationary relative to the photographed subject.
It should be understood that in the embodiment of the present application, the input Unit 1004 may include a Graphics Processing Unit (GPU) 10041 and a microphone 10042, and the Graphics Processing Unit 10041 processes image data of still pictures or videos obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 1006 may include a display panel 10061, and the display panel 10061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 1007 includes at least one of a touch panel 10071 and other input devices 10072. The touch panel 10071 is also referred to as a touch screen. The touch panel 10071 may include two parts, a touch detection device and a touch controller. Other input devices 10072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
The memory 1009 may be used to store software programs as well as various data. The memory 1009 may mainly include a first storage area for programs or instructions and a second storage area for data, where the first storage area may store an operating system and the application programs or instructions required for at least one function (such as a sound playing function, an image playing function, and the like). Further, the memory 1009 may include volatile memory or non-volatile memory, or both. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be a Random Access Memory (RAM), a Static RAM (SRAM), a Dynamic RAM (DRAM), a Synchronous DRAM (SDRAM), a Double Data Rate SDRAM (DDR SDRAM), an Enhanced Synchronous SDRAM (ESDRAM), a Synchronous Link DRAM (SLDRAM), or a Direct Rambus RAM (DRRAM). The memory 1009 in the embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
Processor 1010 may include one or more processing units; optionally, the processor 1010 integrates an application processor, which primarily handles operations involving the operating system, user interface, and applications, and a modem processor, which primarily handles wireless communication signals, such as a baseband processor. It will be appreciated that the modem processor described above may not be integrated into processor 1010.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the embodiment of the image acquisition method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a computer read only memory ROM, a random access memory RAM, a magnetic or optical disk, and the like.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the embodiment of the image acquisition method, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
Embodiments of the present application provide a computer program product, where the program product is stored in a storage medium, and the program product is executed by at least one processor to implement the processes in the above-mentioned embodiments of the image acquisition method, and can achieve the same technical effects, and in order to avoid repetition, details are not repeated here.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatuses in the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed; the functions may be performed substantially simultaneously or in reverse order, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the present embodiments are not limited to those precise embodiments, which are intended to be illustrative rather than restrictive, and that various changes and modifications may be effected therein by one skilled in the art without departing from the scope of the appended claims.

Claims (10)

1. An image acquisition method is applied to an image acquisition device, the image acquisition device comprises an image sensor and a spectrum emitter, multispectral pixels are arranged on the image sensor, and the method comprises the following steps:
when the image sensor is controlled to be exposed, acquiring motion information of a shooting object through multispectral pixels in the image sensor and the spectral transmitter;
determining compensation information for the image sensor based on the motion information;
controlling the image sensor to move based on the compensation information, so that the image sensor is static relative to the shooting object in the exposure process;
and after the exposure is finished, outputting a first original image.
2. The image capturing method of claim 1, wherein the motion information includes a motion direction and a motion speed, and wherein determining compensation information for the image sensor based on the motion information includes:
determining the moving distance and moving direction of the image sensor according to the motion direction, the motion speed and the readout time.
3. The image acquisition method according to claim 1, characterized in that the method further comprises:
responding to shooting operation, and acquiring a second original image acquired by the image sensor;
carrying out image fusion on the first original image and the second original image to obtain a target image;
and the definition of the background image in the second original image meets a preset condition.
4. The image acquisition method according to claim 1, wherein said image sensor comprises at least one multispectral pixel group, each multispectral pixel group comprising at least two adjacently disposed multispectral pixels; the spectral band corresponding to each multispectral pixel in the multispectral pixel group is different.
5. The image acquisition method according to claim 4, wherein the spectral range emitted by the spectral emitter covers the spectral band corresponding to each multispectral pixel in the multispectral pixel group, and before acquiring the motion information of the photographed object through the multispectral pixels in the image sensor and the spectral emitter, the method further comprises:
controlling the spectral emitter to emit spectral signals in different bands;
acquiring, through the multispectral pixels, the reflectivity of the photographed object for the emitted spectral signals in the different bands; and
determining a target band based on the reflectivities,
wherein the reflectivity corresponding to the target band is the highest.
6. The image acquisition method according to claim 5, wherein acquiring, through the multispectral pixels, the reflectivity of the photographed object for the spectral signals in the different bands comprises:
acquiring a reflected signal value received by each multispectral pixel in the multispectral pixel group; and
determining the reflectivity of the photographed object for the spectral signals in the different bands based on the emitted spectral signals and the reflected signal values.
7. The image acquisition method according to claim 1, wherein the image sensor further comprises color pixels, and a ratio of the multispectral pixels to the color pixels is greater than or equal to 3% and less than or equal to 50%.
8. An image acquisition device, comprising an image sensor and a spectral emitter, wherein multispectral pixels are disposed on the image sensor, and the device comprises:
a motion information acquisition module, configured to acquire motion information of a photographed object through the multispectral pixels in the image sensor and the spectral emitter while the image sensor is controlled to perform exposure;
a compensation information acquisition module, configured to determine compensation information for the image sensor based on the motion information;
an image sensor movement module, configured to control the image sensor to move based on the compensation information, so that the image sensor remains stationary relative to the photographed object during the exposure; and
a first original image output module, configured to output a first original image after the exposure is completed.
9. The image acquisition device according to claim 8, wherein the motion information comprises a motion direction and a motion speed, and the compensation information acquisition module comprises:
a compensation information calculation submodule, configured to determine a movement distance and a movement direction of the image sensor according to the motion direction, the motion speed, and a readout time.
10. An electronic device, comprising a processor and a memory, wherein the memory stores a program or instructions executable on the processor, and the program or instructions, when executed by the processor, implement the steps of the image acquisition method according to any one of claims 1 to 7.
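Stripped of claim language, the claims recite two simple computations: claim 2 derives the sensor's compensating movement from the subject's motion direction, motion speed, and the readout time, and claims 5-6 select the target band as the one with the highest reflectivity (reflected signal value divided by emitted signal value). A minimal sketch of both follows; the function names, pixel-based units, and degree-based direction encoding are illustrative assumptions and are not disclosed in the patent.

```python
import math

def sensor_compensation(speed_px_per_s: float, direction_deg: float,
                        readout_s: float) -> tuple[float, float]:
    """Claim-2 sketch: the compensating shift is the subject's image-plane
    displacement over the readout window, applied along its motion direction.
    Units (pixels, degrees) are assumptions for illustration."""
    distance = speed_px_per_s * readout_s  # total image-plane travel
    dx = distance * math.cos(math.radians(direction_deg))
    dy = distance * math.sin(math.radians(direction_deg))
    return dx, dy

def pick_target_band(emitted: dict[str, float],
                     reflected: dict[str, float]) -> str:
    """Claims 5-6 sketch: per-band reflectivity = reflected signal value /
    emitted signal value; the target band maximizes reflectivity."""
    reflectivity = {band: reflected[band] / emitted[band] for band in emitted}
    return max(reflectivity, key=reflectivity.get)
```

For example, a subject moving at 100 px/s along the sensor's x axis with a 20 ms readout calls for a 2-pixel compensating shift in x, and a band reflecting 80% of the emitted signal is preferred over one reflecting 60%.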
CN202210733571.0A 2022-06-24 2022-06-24 Image acquisition method and device and electronic equipment Pending CN115118892A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210733571.0A CN115118892A (en) 2022-06-24 2022-06-24 Image acquisition method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210733571.0A CN115118892A (en) 2022-06-24 2022-06-24 Image acquisition method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN115118892A true CN115118892A (en) 2022-09-27

Family

ID=83329532

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210733571.0A Pending CN115118892A (en) 2022-06-24 2022-06-24 Image acquisition method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN115118892A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106161942A (en) * 2016-07-29 2016-11-23 广东欧珀移动通信有限公司 The method and apparatus of shooting moving object and mobile terminal
CN107613219A (en) * 2017-09-21 2018-01-19 维沃移动通信有限公司 A kind of image pickup method, mobile terminal and storage medium
CN110097509A (en) * 2019-03-26 2019-08-06 杭州电子科技大学 A kind of restored method of local motion blur image
CN112040126A (en) * 2020-08-31 2020-12-04 维沃移动通信有限公司 Shooting method, shooting device, electronic equipment and readable storage medium
CN112312017A (en) * 2020-10-29 2021-02-02 维沃移动通信有限公司 Shooting control method and electronic equipment
CN112672054A (en) * 2020-12-25 2021-04-16 维沃移动通信有限公司 Focusing method and device and electronic equipment
CN113959346A (en) * 2021-10-18 2022-01-21 苏州多感科技有限公司 Displacement detection module and mobile device



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination