CN114430464A - Image processing method and device - Google Patents
- Publication number
- CN114430464A (application number CN202011171450.9A)
- Authority
- CN
- China
- Prior art keywords
- light
- photosensitive
- row
- photosensitive units
- state
- Prior art date
- Legal status: Pending
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
- H04N23/75—Circuitry for compensating brightness variation in the scene by influencing optical camera components
Abstract
An image processing method and device are provided, wherein the method comprises the following steps: the image processing device switches the shutter to a light-transmitting state before a first moment, gates the first to Nth rows of photosensitive units of the image sensor simultaneously at the first moment so that these rows all start exposure at the same time, and then switches the shutter to a light-blocking state after the first moment and before the preset exposure duration of the first row of photosensitive units has elapsed. In this way, every row of photosensitive units receives light, and thus actually performs the photosensitive operation, only during a common photosensitive period, and receives no light outside that period. This helps to eliminate the jelly effect that occurs when a rolling shutter shoots a moving object, and at the same time prolongs the duration of the common photosensitive period of all rows of photosensitive units as much as possible, which helps the image to be fully exposed.
Description
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method and apparatus.
Background
Electronic devices with a shooting function, implemented by an image processing apparatus (e.g., a camera module) inside the electronic device, are now used more and more widely. The exposure time of the image processing apparatus is an important factor affecting shooting quality: when the exposure time is too long, the photograph may be over-exposed and its colors too bright; when the exposure time is too short, the photograph may be under-exposed and its colors too dark. Users want high-definition, wide-angle photographs with vivid color reproduction and a good visual experience, which places higher demands on the exposure time of the image processing apparatus.
A shutter is the component of an image processing apparatus that controls the exposure time. Two kinds of shutters are currently used. One is the global shutter (GS), which exposes all pixels of the whole frame simultaneously so that the whole frame is imaged at the same time. The other is the rolling shutter (RS), which exposes the rows of pixels one after another until all pixels have been exposed. Compared with the GS, the RS has lower requirements on timing granularity and a relatively lower cost. Moreover, signal interference exists between the photosensitive elements corresponding to different pixels during exposure: when the GS exposes all pixels simultaneously, the photosensitive elements of all pixels interfere with one another at the same time, so the noise is severe, whereas the RS exposes row by row, so the interference between the photosensitive elements of different rows is staggered in time and the noise is relatively low. For these reasons, the RS is currently the preferred shutter for image processing apparatuses in most scenarios.
However, existing rolling shutters still have some drawbacks that are difficult to overcome, such as the jelly effect. Most current schemes for mitigating the jelly effect of the RS are unsatisfactory. For example, some schemes suppress the jelly effect by sacrificing the photosensitive duration of each row of pixels, but this obviously shortens the actual exposure duration of each row, and the resulting under-exposure easily causes color distortion in the image, which degrades the user's visual experience. How to extend the actual exposure duration of the rolling shutter while solving its jelly effect has therefore become a problem.
Disclosure of Invention
In view of this, the present application provides an image processing method and apparatus, so as to solve the jelly effect of the RS and simultaneously extend the actual exposure duration of the rolling shutter, so as to improve the visual experience of the user.
In a first aspect, the present application provides an image processing method applicable to an image processing apparatus. The image processing apparatus includes a shutter and an image sensor, the image sensor includes a first to an Nth row of photosensitive units, and the preset exposure durations of the first to Nth rows of photosensitive units increase in sequence. The method comprises the following steps: the image processing apparatus switches the shutter to a light-transmitting state before a first moment, so that light is transmitted to the image sensor through the shutter; it then gates the first to Nth rows of photosensitive units simultaneously at the first moment, so that the first to Nth rows of photosensitive units all start the exposure operation at the first moment; and it further switches the shutter to a light-shielding state after the first moment and before the preset exposure duration of the first row of photosensitive units has elapsed, so as to close the transmission path of light to the image sensor. N is a positive integer.
In the above design, by controlling the light shielding and light transmission of the shutter, each row of photosensitive units receives light, and thus actually performs the photosensitive operation, only during a common photosensitive period, and receives no light during the periods in which the rows are not all sensing together. Even when a moving object is shot, the images obtained by the rows of photosensitive units therefore show essentially no tilt, which helps to solve the jelly effect that occurs when a rolling shutter shoots a moving object. Furthermore, because the shutter is switched to the light-transmitting state before any row of photosensitive units starts exposure, and is switched to the light-shielding state only shortly before the preset exposure duration of the first row of photosensitive units elapses, the duration of the common photosensitive period of all rows is close to the preset exposure duration of the first row of photosensitive units; the common photosensitive period is thus prolonged as much as possible, the image can be fully exposed, and the quality of the captured image is improved. In addition, since the shutter is switched to the light-transmitting state before shooting is performed, its state does not need to be switched again before each row of photosensitive units is switched to the working state; the shutter only needs to be switched to the light-shielding state before the Nth row of photosensitive units is switched to the dormant state.
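Purely as an illustration, the sketch below outlines this control sequence in Python. The helper names (set_shutter, gate_all_rows) and the timing values are assumptions made for the sketch, not elements of the application.

```python
import time

def set_shutter(state: str) -> None:
    # Hypothetical hardware hook: "transmit" opens the light path, "block" closes it.
    print(f"shutter -> {state}")

def gate_all_rows() -> None:
    # Hypothetical hardware hook: gate rows 1..N simultaneously (e.g., global reset mode),
    # so that every row starts its exposure at the same first moment.
    print("rows 1..N gated simultaneously")

def capture(row1_exposure_s: float, shutter_block_response_s: float) -> None:
    """Sketch of the sequence above, with assumed timing values in seconds."""
    # 1. Before the first moment: switch the shutter to the light-transmitting state.
    set_shutter("transmit")

    # 2. At the first moment: gate all rows so they start exposure together.
    first_moment = time.monotonic()
    gate_all_rows()

    # 3. Before the first row's preset exposure duration elapses, switch the shutter to
    #    the light-shielding state, leaving margin for the shutter's switching response.
    deadline = first_moment + row1_exposure_s - shutter_block_response_s
    time.sleep(max(0.0, deadline - time.monotonic()))
    set_shutter("block")

capture(row1_exposure_s=0.040, shutter_block_response_s=0.002)
```

The only switch issued while the rows are exposing is the single switch to the light-shielding state, which is what keeps the common photosensitive period close to the first row's full preset exposure duration.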
In an alternative design, the image sensor is in a global reset mode. In the design, the image sensor in the global reset mode can start the photosensitive operation of each row of photosensitive units at the same time, and the image sensor is set in the global reset mode, so that the photosensitive units in each row can be gated at the same time.
In an alternative design, the image processing device may set the light transmission state as a default state of the shutter. For example, the image processing apparatus may switch the shutter to the transmissive state after detecting that the image processing apparatus is turned on, or the image processing apparatus may switch the shutter to the transmissive state after the preset exposure time of the nth row of photosensitive cells captured last time is over. In this design, since the shutter transmits light in the default state, the image processing apparatus can directly gate the N lines of photosensitive cells when it is necessary to perform photographing, without performing an operation of switching to the light transmitting state any more, thereby contributing to an increase in response speed of photographing.
In an alternative design, when the preset exposure duration of the first row of photosensitive units is a first duration and the duration required for the shutter to switch to the light-shielding state is a second duration, the image processing apparatus may send a light-shielding state switching instruction to the shutter after a third duration has elapsed since the first moment, so as to drive the shutter to switch to the light-shielding state. The third duration is not greater than the difference between the first duration and the second duration. This design takes into account the response duration the shutter needs to switch from the light-transmitting state to the light-shielding state; in this way the shutter can be switched to the light-shielding state in time, before the first row of photosensitive units finishes its exposure, so that all rows of photosensitive units finish sensing light at the same time.
In an optional design, when the preset exposure duration of the first row of photosensitive units is a first duration and the difference between the preset exposure duration of the Nth row of photosensitive units and that of the first row is a fourth duration, the image processing apparatus may, after switching the shutter to the light-shielding state, further send a light-transmitting state switching instruction to the shutter once a fifth duration has elapsed since the first moment, so as to drive the shutter to switch to the light-transmitting state. The fifth duration is greater than the sum of the first duration and the fourth duration. This design uses the differences between the preset exposure durations of the rows of photosensitive units to switch the shutter back to the light-transmitting state only after every row has finished its exposure; in this way the photosensitive imaging of the rows in the current shot is not disturbed, and the shutter is still switched back to its default light-transmitting state in time.
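As a purely numerical sanity check of the two constraints above (the third duration not exceeding the first minus the second, and the fifth duration exceeding the first plus the fourth), the figures below are assumptions borrowed from the examples later in the description, not claim limitations:

```python
# Assumed example values, in milliseconds.
N = 1000                      # number of rows of photosensitive units
first = 40.0                  # first duration: preset exposure duration of row 1
second = 2.0                  # second duration: time the shutter needs to reach the light-shielding state
fourth = 0.02 * (N - 1)       # fourth duration: preset-exposure difference between row N and row 1

third = first - second        # latest moment (after T0) to send the light-shielding instruction
fifth = first + fourth + 1.0  # any value strictly greater than (first + fourth) is acceptable

assert third <= first - second
assert fifth > first + fourth
print(f"send light-shielding instruction no later than T0 + {third} ms")
print(f"send light-transmitting instruction at T0 + {fifth} ms")
```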
In a second aspect, the present application provides an image processing apparatus comprising: a switching unit for switching the shutter to a light transmitting state before a first time; the light transmitting state is used for transmitting light to the image sensor through the light chopper; the gating unit is used for gating the photosensitive units in the first row to the photosensitive units in the Nth row at the same time at the first moment so as to enable the photosensitive units in the first row to the photosensitive units in the Nth row to start exposure operation at the same time at the first moment; a switching unit, further configured to: switching the light chopper to a light shielding state before the preset exposure duration of the first row of photosensitive units after the first moment; the shading state is used for the shutter to close the transmission path of light to the image sensor. And N is a positive integer, and the preset exposure time lengths from the first line of photosensitive units to the Nth line of photosensitive units can be sequentially increased.
In an alternative design, the image sensor is in a global reset mode.
In an alternative design, the switching unit may switch the shutter to the light-transmitting state after detecting that the image processing apparatus is powered on. Alternatively, the switching unit may switch the shutter to the light-transmitting state after the preset exposure time of the nth line of photosensitive cells in the last shooting is finished.
In an optional design, when the preset exposure time of the first row of photosensitive units is a first time and the time required for the shutter to switch to the light-shielding state is a second time, the switching unit may send a light-shielding state switching instruction to the shutter after a third time elapses after the first time. The shading state switching instruction is used for driving the light chopper to switch to the shading state. The third duration is not greater than the difference in duration between the first duration and the second duration.
In an optional design, when the preset exposure time of the first row of photosensitive units is a first time length, and a time length difference between the preset exposure time of the nth row of photosensitive units and the preset exposure time of the first row of photosensitive units is a fourth time length, the switching unit may send a transmission state switching instruction to the shutter after switching the shutter to the light shielding state and after a fifth time length elapses after the first time. The transmission state switching instruction is used for driving the light chopper to switch to a transmission state. The fifth duration is greater than the sum of the durations of the first and fourth durations.
In a third aspect, the present application provides an image processing apparatus comprising: a controller for transmitting a first control instruction to the shutter when the photographing operation does not need to be performed, and transmitting a second control instruction to the image sensor when the photographing operation needs to be performed; the light chopper is used for switching to a light-transmitting state according to a first control instruction so as to enable light to be transmitted to the image sensor through the light chopper; the image sensor is used for gating the first row of photosensitive units to the Nth row of photosensitive units simultaneously according to a second control instruction so as to enable the first row of photosensitive units to the Nth row of photosensitive units to start exposure operation simultaneously; the controller is also used for sending a third control instruction to the light chopper before the preset exposure duration of the first row of photosensitive units passes after the first row of photosensitive units starts exposure; and the light chopper is also used for switching to a light-shielding state according to a third control instruction so as to close a transmission passage of the light to the image sensor. And N is a positive integer, and the preset exposure time lengths from the first line of photosensitive units to the Nth line of photosensitive units can be sequentially increased.
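A minimal sketch of this controller, shutter and image sensor interaction is given below; the class and method names are invented for illustration and only mirror the three control instructions described above:

```python
class Shutter:
    def apply(self, instruction: str) -> None:
        # First instruction -> light-transmitting state; third instruction -> light-shielding state.
        print(f"shutter executes: {instruction}")

class ImageSensor:
    def apply(self, instruction: str) -> None:
        # Second instruction -> gate rows 1..N simultaneously so they start exposure together.
        print(f"sensor executes: {instruction}")

class Controller:
    def __init__(self, shutter: Shutter, sensor: ImageSensor) -> None:
        self.shutter, self.sensor = shutter, sensor

    def idle(self) -> None:
        # No shooting operation is needed: keep the shutter transmitting light.
        self.shutter.apply("first control instruction: switch to light-transmitting state")

    def shoot(self) -> None:
        self.sensor.apply("second control instruction: gate rows 1..N simultaneously")
        # ...before the first row's preset exposure duration elapses:
        self.shutter.apply("third control instruction: switch to light-shielding state")

controller = Controller(Shutter(), ImageSensor())
controller.idle()    # no shooting operation requested yet
controller.shoot()   # shooting operation requested
```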
In an alternative design, the controller may send a first control instruction to the shutter after detecting that the image processing apparatus is turned on; alternatively, the controller may send the first control instruction to the shutter after the end of the preset exposure time of the nth line of photosensitive cells in the last shooting. In this way, the controller can switch the shutter to the light-transmitting state when it is not necessary to perform a photographing operation.
In an optional design, if the preset exposure time of the first row of photosensitive units is a first time and the time required for the shutter to switch to the light shielding state is a second time, the controller may send a third control instruction to the shutter when a third time elapses after the exposure of the first row of photosensitive units is started. Wherein the third duration may be no greater than a difference in duration between the first duration and the second duration.
In an optional design, if the preset exposure time of the first row of photosensitive units is a first time length, and the time length difference between the preset exposure time of the nth row of photosensitive units and the preset exposure time of the first row of photosensitive units is a fourth time length, the controller may send a first control instruction to the shutter after switching the shutter to the light-shielding state and after the exposure of the first row of photosensitive units is started and after the fifth time length elapses. Wherein the fifth duration may be greater than a sum of the durations of the first and fourth durations.
In an optional design, an optical unit may be further included in the image processing apparatus, the optical unit being located between the shutter and the image sensor. In this case, the optical unit may image light on the image sensor while the shutter is in the light transmitting state.
In an alternative design, the image sensor may receive the light signal transmitted to the image sensor through the shutter and convert the light signal into a corresponding image signal in a period after the exposure of the first row of photosensitive units is started and before a preset exposure time period of the first row of photosensitive units elapses.
In an alternative design, the image processing apparatus may further include an image signal processor and a display, an input of the image signal processor may be connected to an output of the image sensor, an output of the image signal processor may be connected to an input of the controller, and an output of the controller may be connected to the display. In this case, the image signal processor may receive the image signal sent by the image sensor, process the image signal to obtain a processed image signal, and send the processed image signal to the controller, and the controller may further control the display to display the image according to the processed image signal.
In an alternative design, the shutter may be a liquid crystal panel, a mechanical iris, or a resonant Fabry–Pérot (FP) cavity, and may of course be any other device capable of performing the shutter function.
In a fourth aspect, the present application provides an electronic device comprising a processor coupled with a memory, the processor being capable of executing a computer program stored in the memory to cause the electronic device to perform the image processing method according to any of the first aspect.
In a fifth aspect, the present application provides a computer-readable storage medium storing a computer program which, when executed, implements the image processing method according to any one of the first aspects.
For the beneficial effects of the second aspect to the fifth aspect, please refer to the technical effects that can be achieved by the corresponding design in the first aspect, and the detailed description is omitted here.
Drawings
Fig. 1 schematically shows a structure of an electronic device;
FIG. 2 is a diagram illustrating an exposure timing chart of photosensitive cells of each row corresponding to a rolling shutter;
fig. 3A and 3B exemplarily show a jelly effect diagram generated when a moving object is photographed using a rolling shutter;
FIG. 4 is a diagram schematically illustrating an exposure timing chart for controlling each line of photosensitive cells by the shutter;
fig. 5 is a schematic structural diagram schematically illustrating an image processing apparatus provided in an embodiment of the present application;
fig. 6 is a graph schematically illustrating a spectral transmittance of an optical filter provided in an embodiment of the present application;
fig. 7 is a schematic diagram illustrating an execution flow of an image processing method provided in an embodiment of the present application;
FIG. 8 is a timing chart illustrating exposure of photosensitive cells in each row controlled by a shutter according to an embodiment of the present application;
fig. 9 is a schematic diagram illustrating an execution flow of an image processing method provided in the second embodiment of the present application;
fig. 10 is a schematic structural view schematically illustrating a liquid crystal light shielding panel provided in an embodiment of the present application;
FIG. 11 is a timing diagram illustrating exemplary timing control of each row of photo-sensing units provided by an embodiment of the present application;
fig. 12 is a schematic diagram illustrating an execution flow of an image processing method provided in the third embodiment of the present application;
fig. 13 is a schematic structural diagram schematically illustrating a mechanical aperture provided in an embodiment of the present application;
fig. 14 schematically illustrates a structure of an image processing apparatus according to an embodiment of the present application.
Detailed Description
The image processing method disclosed in the present application can be applied to an image processing apparatus having an image capturing function. The image processing apparatus in the embodiments of the present application may be an apparatus having only an image capturing function, such as a monitoring camera, a vehicle event recorder, or a video camera. Alternatively, the image processing apparatus may be an apparatus that has an image capturing function as well as other functions, for example an electronic device. In some embodiments of the present application, the image processing apparatus may be an electronic device or an independent unit; when it is an independent unit, the unit may be embedded in an electronic device, and the image processing method disclosed in the embodiments of the present application may be executed when the electronic device is in a shooting state, so as to extend the photosensitive duration of each row of pixels when shooting with a rolling shutter and thus improve the quality of the captured image. In other embodiments of the present application, the image processing apparatus may also be a unit packaged inside the electronic device, for example a front camera or a rear camera of the electronic device, used to implement the above-mentioned image processing function of the electronic device. The electronic device may be a portable electronic device that includes a display and has functions such as a personal digital assistant and/or a music player, for example a mobile phone, a tablet computer, a wearable device with a wireless communication function (such as a smart watch), or a vehicle-mounted device. Exemplary embodiments of the portable electronic device include, but are not limited to, portable electronic devices running various operating systems. The portable electronic device may also be a device such as a laptop computer with a touch-sensitive surface (e.g., a touch panel). It should also be understood that in some other embodiments of the present application, the electronic device may be a desktop computer having a touch-sensitive surface (e.g., a touch panel).
Fig. 1 schematically shows a structure of an electronic device. It should be understood that the illustrated electronic device 100 is merely an example, and that the electronic device 100 may have more or fewer components than shown in the figures, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
As shown in fig. 1, the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like. The various components of the electronic device 100 are described in detail below with reference to fig. 1.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory, so that repeated accesses can be avoided, the waiting time of the processor 110 can be reduced, and the processing efficiency can be improved.
In some embodiments, processor 110 may include one or more interfaces. For example, the interface may include an integrated circuit (I2C) interface, an inter-integrated circuit (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface. The MIPI interface may be used to connect the processor 110 with the peripheral devices such as the display screen 194 and the camera 193. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like. The ISP is used for processing data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193. The camera 193 is used to capture still images or video. An object generates an optical image through a lens and projects the optical image to a photosensitive element (i.e., an image sensor). The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB or YUV format. In some embodiments, the electronic device 100 may include 1 or more cameras 193, for example, may include both front and rear cameras.
Although not shown in fig. 1, the electronic device 100 may further include a bluetooth device, a positioning device, a flash, a micro-projection device, or a Near Field Communication (NFC) device, which is not described herein.
Some terms related to the schemes in the embodiments of the present application are exemplarily described below:
(1) Exposure.
The exposure in the embodiment of the present application refers to the whole process of irradiating light to the image sensor through the lens and generating a photoelectric reaction on the image sensor. The image sensor comprises a plurality of photosensitive units, and each photosensitive unit in the plurality of photosensitive units can receive light irradiated on the photosensitive unit and convert the light into an electric signal in an operating state. That is, the conditions under which each photosensitive unit can perform exposure include: the photosensitive unit is in a working state, and light irradiates the photosensitive unit. As long as one of the two conditions is not satisfied (for example, the photosensitive unit is in a sleep state although there is light irradiated onto the photosensitive unit, or no light is irradiated onto the photosensitive unit although the photosensitive unit is in an operating state, or the photosensitive unit is in a sleep state and no light is irradiated onto the photosensitive unit), the photosensitive unit cannot perform exposure.
(2) Preset exposure duration and actual exposure duration.
The preset exposure duration and the actual exposure duration in the embodiments of the present application are both defined for a photosensitive unit. The actual exposure duration is the time interval during which the photosensitive unit actually performs exposure. The preset exposure duration is the time interval from the moment the photosensitive unit is switched to the working state to the moment it is switched to the dormant state; this interval can be set in the image sensor in software by a person skilled in the art. When the image sensor receives an exposure instruction, it first switches the photosensitive unit to the working state and then switches it to the dormant state after the preset exposure duration. During this interval the photosensitive unit may receive light (and thus actually perform exposure) or may receive no light (in which case no exposure is actually performed).
(3) A rolling shutter.
The rolling shutter in the embodiment of the present application refers to a shutter that performs imaging in a line-by-line exposure manner. In the rolling shutter, each photosensitive unit in the image sensor may be divided into N (N is a positive integer) rows of photosensitive units, each row of photosensitive units in the N rows of photosensitive units may correspond to a row of area on the actually imaged picture, and each row of photosensitive units is configured to receive light of a corresponding row of area in the actual scene and perform exposure imaging on the row of area on the actually imaged picture (i.e., generate image signals corresponding to the row of area). In this case, the line-by-line exposure in the rolling shutter means that N lines of photosensitive units are sequentially gated from top to bottom, that is, the N lines of photosensitive units are sequentially switched to a working state to control the N lines of photosensitive units to sequentially perform exposure, and the preset exposure durations of the N lines of photosensitive units may be the same or different, or may be partially the same or different, and are not particularly limited.
Taking the case where all N rows of photosensitive units have the same preset exposure duration as an example, fig. 2 shows an exposure timing chart of the N rows of photosensitive units corresponding to a rolling shutter. In this example, the N rows are named the first row of photosensitive units, the second row of photosensitive units, ..., and the Nth row of photosensitive units, and the preset exposure duration of each row is 40 ms. Referring to fig. 2: if the image processing apparatus receives a shooting instruction at a first moment T0, it first switches the first row of photosensitive units from the dormant state to the working state at time T0. With a difference of 0.02 ms between the moments at which any two adjacent rows start exposure, the image processing apparatus switches the second row of photosensitive units from the dormant state to the working state at time T0+0.02 ms, switches the third row at time T0+0.04 ms, and so on, until the Nth row of photosensitive units is switched from the dormant state to the working state at time T0+0.02(N-1) ms. If a row of photosensitive units receives light while in the working state, that row can actually perform the photosensitive operation. Because the first row of photosensitive units is switched to the working state first, it is the first to reach its preset exposure duration of 40 ms, so the image processing apparatus switches the first row from the working state to the dormant state at time T0+40 ms; the first row therefore receives light during [T0, T0+40 ms] to expose and generate the image signals corresponding to the first row area. The image processing apparatus then switches the second row from the working state to the dormant state at time T0+40.02 ms, so that the second row receives light during [T0+0.02 ms, T0+40.02 ms] and generates the image signals corresponding to the second row area, and switches the third row from the working state to the dormant state at time T0+40.04 ms, so that the third row receives light during [T0+0.04 ms, T0+40.04 ms] and generates the image signals corresponding to the third row area. This continues until the image processing apparatus switches the Nth row from the working state to the dormant state at time T0+40+0.02(N-1) ms, so that the Nth row receives light during [T0+0.02(N-1) ms, T0+40+0.02(N-1) ms] and generates the image signals corresponding to the Nth row area.
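The per-row working-state windows of the fig. 2 example can be reproduced with a short calculation; the 40 ms exposure and 0.02 ms row offset are the example values quoted above, and the function name is illustrative only:

```python
def rolling_shutter_windows(n_rows: int, t0: float = 0.0,
                            exposure_ms: float = 40.0, row_offset_ms: float = 0.02):
    """Working-state window [start, end] in ms for each row under progressive gating."""
    return [(t0 + k * row_offset_ms, t0 + exposure_ms + k * row_offset_ms)
            for k in range(n_rows)]

# Rows 1-3 of the fig. 2 example:
for row, (start, end) in enumerate(rolling_shutter_windows(3), start=1):
    print(f"row {row}: [{start:.2f} ms, {end:.2f} ms]")
# row 1: [0.00 ms, 40.00 ms]; row 2: [0.02 ms, 40.02 ms]; row 3: [0.04 ms, 40.04 ms]
```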
It should be understood that the state switching moments mentioned above can be implemented with timers. For example, a dedicated timer may be set for each moment at which a state switch needs to be executed; each such timer then has one timing value, and when the current time reaches the timing value of a timer, the state switching operation corresponding to that timer is executed. Alternatively, one timer may be set for the two state switching moments of each row of photosensitive units; each such timer then has two timing values: when the current time reaches the first timing value, the row of photosensitive units corresponding to the timer is switched to the working state, and when the current time reaches the second timing value, that row is switched to the dormant state. Alternatively, a single comprehensive timer may be set for all state switching moments; this timer contains 2N timing values, and when the current time reaches the Kth timing value (K is a positive integer not greater than N), the Kth row of photosensitive units is switched to the working state, and when the current time reaches the (N+K)th timing value, the Kth row of photosensitive units is switched to the dormant state.
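A sketch of the "comprehensive timer" variant with 2N timing values is shown below; the structure and names are assumptions, and the timing values are again taken from the fig. 2 example:

```python
def build_comprehensive_timer(n_rows: int, t0: float, exposure_ms: float, row_offset_ms: float):
    """Return 2N timing values: value K (1..N) switches row K to the working state,
    value N+K switches row K back to the dormant state (fig. 2 example values assumed)."""
    on_times = [t0 + k * row_offset_ms for k in range(n_rows)]
    off_times = [t0 + exposure_ms + k * row_offset_ms for k in range(n_rows)]
    return on_times + off_times

n = 3
timer = build_comprehensive_timer(n_rows=n, t0=0.0, exposure_ms=40.0, row_offset_ms=0.02)
for index, t in enumerate(timer, start=1):
    row = index if index <= n else index - n
    state = "working" if index <= n else "dormant"
    print(f"timing value {index}: at t = {t:.2f} ms switch row {row} to the {state} state")
```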
It should be understood that, in the above description, a row of photosensitive units may include a plurality of photosensitive units, and switching a row of photosensitive units to an operating state means switching all the photosensitive units included in the row of photosensitive units to an operating state at the same time, so that all the photosensitive units included in the row of photosensitive units are simultaneously exposed. For example, an image sensor with a resolution of 1440 × 1000 may generally include 1000 rows of photosensitive cells, and each row of photosensitive cells may be formed by 1440 photosensitive cells.
(4) The jelly effect.
The jelly effect in the embodiments of the present application occurs in scenes in which a moving object is shot using a rolling shutter. Movement here means relative movement: for example, when a moving image processing apparatus shoots a static object, or a static image processing apparatus shoots a moving object, the shot object moves relative to the image processing apparatus. As can be seen from the exposure time relationship of the N rows of photosensitive units illustrated in fig. 2, when light falls on the N rows of photosensitive units continuously, the N rows actually perform photosensitive imaging during different periods. If the shot object moves relative to the image processing apparatus, the position of the scene captured by each row during its own photosensitive period also changes, so the N regions corresponding to the N rows are tilted relative to one another when they are combined into a complete image. For ease of understanding, the shooting process in fig. 3A and 3B is taken as an example to describe how the jelly effect arises. As shown in fig. 3A, when a user sitting in a car moving to the right uses the image processing apparatus to shoot a building, the building moves to the left relative to the image processing apparatus. In this case, of any two adjacent rows of photosensitive units, the later row has a later photosensitive period than the earlier row, so the reference position of the picture captured by the later row lies further to the right than that of the earlier row (i.e., the captured building area lies further to the left). When all N rows have been exposed and the pictures obtained by the N rows are combined, the whole image therefore tilts to the left from top to bottom, as shown in fig. 3B.
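A toy calculation illustrates why the combined image tilts: each later row images the scene slightly later, so a horizontally moving edge appears shifted a little more in every row. All numbers below are assumptions chosen only to make the drift visible:

```python
# Assumed values: the building edge starts at column 100 and the scene drifts left by
# 1 pixel per millisecond; adjacent rows sense 0.02 ms apart (fig. 2 example).
n_rows, start_col, speed_px_per_ms, row_offset_ms = 1000, 100.0, 1.0, 0.02

edge_col_per_row = [start_col - r * speed_px_per_ms * row_offset_ms for r in range(n_rows)]
print(edge_col_per_row[0], edge_col_per_row[-1])   # 100.0 vs 80.02: a ~20-pixel lean over the frame
```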
According to the above, the applicant thought that if N rows of photosensitive units are exposed in the same period, the image reference of the N rows of photosensitive units is the same, and thus the whole image obtained by combining the image signals obtained by exposing the N rows of photosensitive units is not inclined, so that the jelly effect existing when a rolling shutter is used to shoot a moving object can be solved. However, in the progressive scanning mode, the N rows of photosensitive units have the same preset exposure duration, and the N rows of photosensitive units are respectively switched to the working state at different times and switched to the sleep state at different times, so that the respective photosensitive periods of the N rows of photosensitive units are different, which results in that the N rows of photosensitive units respectively perform exposure operations in different photosensitive periods. If the N rows of photosensitive units are required to be exposed in the same period, a light chopper is further arranged in the image processing device, and the N rows of photosensitive units receive light in the same photosensitive period by controlling the light transmitting state and the light shielding state of the light chopper so as to really execute the exposure operation.
Fig. 4 is a timing diagram illustrating an exposure process in which a shutter controls the N rows of photosensitive units. As shown in fig. 4, from the moment the Nth row of photosensitive units is switched to the working state (i.e., time T0+0.02(N-1) ms) until the moment the first row of photosensitive units is switched to the dormant state (i.e., time T0+40 ms), all N rows of photosensitive units are in the working state, so the period [T0+0.02(N-1) ms, T0+40 ms] can serve as the photosensitive period of the N rows of photosensitive units. In this case, to ensure that the photosensitive period falls exactly within this common working period of all rows, the image processing apparatus may send a light-transmitting state switching instruction to the shutter at time T0+0.02(N-1) ms or later, so as to switch the shutter from the light-shielding state to the light-transmitting state, and send a light-shielding state switching instruction to the shutter before time T0+40 ms, so as to switch the shutter from the light-transmitting state back to the light-shielding state. The N rows of photosensitive units then receive the light transmitted through the shutter, and are thus exposed, only during the same photosensitive period, and receive no light (and are therefore not exposed) during any other period. However, this approach has some problems. Because the light-transmitting state switching instruction is sent to the shutter only after the Nth row of photosensitive units has been switched to the working state, and the light-shielding state switching instruction is sent before the first row of photosensitive units is switched to the dormant state, the actual exposure period of the N rows of photosensitive units is only the interval (T0+0.02(N-1) ms, T0+40 ms) illustrated in fig. 4, and no exposure is performed during the other periods (e.g., unexposed period 1 and unexposed period 2). On the one hand, the common photosensitive period of the N rows of photosensitive units in the progressive scanning mode is short, so their actual exposure duration is short (the larger N is, the shorter the actual exposure duration becomes); directly taking this period as the common photosensitive period may make the captured image dark due to insufficient exposure, which affects the visual experience of the user.
On the other hand, a present-stage shutter (e.g., a liquid crystal shutter panel) has a power-on response time on the order of milliseconds (e.g., 2 ms) and a power-off response time on the order of microseconds (e.g., 100 μs). The duration of the common light-sensing period depends on both the response time for switching the shutter to the light-transmitting state and the response time for switching it to the light-shielding state, and the two switches are controlled by powering on and powering off respectively (e.g., power-on switches to the light-transmitting state and power-off switches to the light-shielding state, or power-on switches to the light-shielding state and power-off switches to the light-transmitting state). Because both switching operations fall within the common light-sensing period, the actual exposure time of each row of photosensitive units is shortened even further. Indeed, when power-on switches to the light-transmitting state and power-off switches to the light-shielding state, the moment at which the shutter must switch back to the light-shielding state may arrive before the shutter has successfully switched to the light-transmitting state, so that the N rows of photosensitive units receive no light at all and no image can be formed. Therefore, with this approach, increasing the actual exposure time of the N rows of photosensitive units requires designing a shutter with a faster response speed, which inevitably increases the difficulty of the shutter's process design, raises the cost of the image processing apparatus, and is unfavorable for popularization.
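The shortening described in the two points above can be quantified with the example figures already used; all values below are assumptions, and the comparison is only meant to show the order of magnitude involved:

```python
# Assumed example values (ms), borrowed from the fig. 2 / fig. 4 discussion.
N, exposure_ms, row_offset_ms = 1000, 40.0, 0.02
shutter_response_ms = 2.0   # assumed power-on response of a liquid crystal shutter panel

# Fig. 4 scheme: the common window runs from when row N starts to when row 1 stops.
common_fig4 = exposure_ms - row_offset_ms * (N - 1)

# Scheme of this application: all rows are gated at the same first moment, so the common
# window is close to row 1's full preset exposure, reduced only by the shutter response.
common_proposed = exposure_ms - shutter_response_ms

print(f"fig. 4 common window:   {common_fig4:.2f} ms")      # 20.02 ms
print(f"proposed common window: {common_proposed:.2f} ms")  # 38.00 ms
```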
In view of this, the present application provides an image processing method, which is suitable for the above-mentioned scene using a rolling shutter to shoot a moving object, and the method not only can solve the problem of jelly existing in the scene, but also can increase the actual exposure duration of N rows of photosensitive units as much as possible without modifying the structure of the shutter, so as to fully expose and obtain an image with better quality, thereby improving the visual experience of the user.
The present application will be described in further detail below with reference to the accompanying drawings. It is to be noted that "at least one" in the description of the present application means one or more, where a plurality means two or more. In view of this, the "plurality" may also be understood as "at least two" in the embodiments of the present invention. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the character "/" generally indicates that the preceding and following related objects are in an "or" relationship, unless otherwise specified. In addition, it is to be understood that the terms first, second, etc. in the description of the present application are used for distinguishing between the descriptions and not necessarily for describing a sequential or chronological order. For example, "first timer", "second timer", and "third timer" are merely exemplary to indicate different timers, and do not mean a difference in the importance or priority of the three timers.
Fig. 5 is a schematic structural diagram of an image processing apparatus provided in an embodiment of the present application. As shown in fig. 5, the image processing apparatus may include a shutter, an optical unit, an image sensor, an image signal processor (ISP), a controller and a display screen, arranged in sequence. The controller may further have a first control terminal (a1) and a second control terminal (a2); the controller is connected to the shutter through control terminal a1 and to the image sensor through control terminal a2. The image sensor may be a CCD or a CMOS phototransistor, and a plurality of photosensitive units are arranged in the image sensor. Each photosensitive unit can sense a specified measured quantity and convert it into a usable signal according to a certain rule, so each photosensitive unit may consist of a sensitive element and a conversion element. When the image sensor is a CMOS phototransistor, these photosensitive units may be diodes. It should be understood that fig. 5 is only one example of an image processing apparatus, which may have more or fewer components than illustrated in fig. 5. For example, in another example, the image processing apparatus may include only the shutter, the image sensor, and the controller; in this case, the image processing apparatus may directly use the image signals exposed by the image sensor for imaging, without further processing of the image signals. In yet another example, the shutter may be disposed between the optical unit and the image sensor, but this approach requires space between the optical unit and the image sensor to accommodate the shutter, so the process requirements on the image processing apparatus are high, and the shutter cannot have a large size, otherwise the image processing apparatus as a whole becomes large and inconvenient to carry. Based on this, in a preferred design, the shutter may be disposed outside the optical unit in the manner illustrated in fig. 5, that is, with the optical unit located between the shutter and the image sensor, which not only simplifies the process of the image processing apparatus but also does not require limiting the specification of the shutter, thereby helping to reduce the cost of the image processing apparatus.
Further, the image processing apparatus may further include a bus system (not illustrated in fig. 5), wherein the respective components in the image processing apparatus may be connected by the bus system.
It should be understood that the controller may be a chip. For example, the controller may be a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), a system on chip (SoC), a Central Processing Unit (CPU), a Network Processor (NP), a digital signal processing circuit (DSP), a Microcontroller (MCU), a Programmable Logic Device (PLD), or other integrated chips.
In implementation, the steps of the image processing method in the present application may be implemented by integrated logic circuits of hardware in the controller or instructions in the form of software. The steps of the method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware processor, or may be implemented by a combination of hardware and software modules in a controller. The software module may be located in ram, flash memory, rom, prom, or eprom, registers, etc. storage media as is well known in the art. The storage medium is located in a memory, and the controller reads information in the memory and completes the steps of the image processing method in the application in combination with hardware of the controller.
It should be noted that the controller in the embodiment of the present application may be an integrated circuit chip having signal processing capability. In implementation, the steps of the image processing method in the present application may be implemented by integrated logic circuits of hardware in the controller or by instructions in the form of software. The controller described above may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components, and may implement or perform the various methods, steps, and logical blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM, EPROM, or registers. The storage medium is located in a memory, and the processor reads the information in the memory and completes the steps of the method in combination with its hardware.
It will be appreciated that the memory in the embodiments of the present application can be volatile memory or non-volatile memory, or can include both. The non-volatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), which is used as an external cache. By way of example but not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM). It should be noted that the memory of the systems and methods described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
In the embodiment of the present application, when the shutter is in the light-shielding state, it keeps the ambient light outside the optical unit; in this case, each photosensitive unit in the image sensor cannot receive the ambient light and therefore cannot be exposed to form an image. When the shutter is in the light-transmitting state, ambient light can enter the optical unit through the shutter. The optical unit includes a lens (formed by a series of optical lens elements) and an optical filter: the lens focuses the ambient light entering the optical unit onto the optical filter, and the optical filter filters out the light in its cut-off spectral range (such as ultraviolet light and infrared light) from the received ambient light and then transmits the filtered ambient light to the image sensor. When the filtered ambient light reaches the image sensor, the image sensor can switch each row of photosensitive units to the working state according to a preset exposure duration set by software. After each row of photosensitive units finishes its exposure, the image sensor sends the image signal obtained by exposure to the ISP, the ISP processes the received image signal to obtain a processed image signal and sends it to the controller, and the controller sends the processed image signal to the display for display. The processing performed by the ISP may include, but is not limited to, demosaicing, auto-exposure, auto white balancing, and the like. For the process by which the image sensor controls each row of photosensitive units, reference is made to the first embodiment, which is not repeated here.
Fig. 6 illustrates a spectral transmittance curve of an optical filter provided by an embodiment of the present application. In this example, the optical filter may specifically be an infrared cut filter, formed by coating a plurality of thin films of different materials onto the surface of a glass substrate using a vacuum coating technique, and it can filter out the infrared light and ultraviolet light in the ambient light. As can be seen from fig. 6, the filter has good transmittance (for example, 80% or more) for visible light with wavelengths in the range of 400 nm to 630 nm, and poor transmittance (for example, close to 0%) for infrared light with wavelengths above 700 nm and ultraviolet light with wavelengths below 400 nm. In this case, after the optical filter receives the ambient light focused by the lens, it filters out as much as possible of the infrared light above 700 nm and the ultraviolet light below 400 nm, and transmits mainly the visible light in the 400 nm to 630 nm range to the image sensor. In other words, the filter passes the visible region (400 nm to 630 nm) while cutting off the near-infrared region (above 700 nm) and the near-ultraviolet region (below 400 nm), so that the ambient light transmitted to the image sensor contains as little as possible of the infrared light that would interfere with the imaging quality of the image sensor, and the image subsequently formed by the image sensor better matches human visual perception.
In the embodiment of the present application, the lens in the optical unit may be a zoom lens or a fixed-focus lens, which is not specifically limited. When the lens in the optical unit is a zoom lens, a person skilled in the art can, during the development stage of the image processing apparatus, select the zoom range of the lens, the aperture range of the lens and the resolution of the image sensor according to the scenarios in which the apparatus will be applied after leaving the factory, so that a subsequent user can select the required focal length and aperture for the intended shooting scene and shoot images or video at that resolution. For example, in the field of surveillance cameras for security purposes, the zoom range of the zoom lens may be set to 10 mm to 30 mm, the aperture may be set to a fixed value of F/1.7 (or several selectable values may be provided, from which the user selects one for the current shooting task), and the resolution of the image sensor may be set to 1920 × 1000.
In an optional implementation, as shown in fig. 5, the image processing apparatus may further include a light supplement unit, which may be implemented by a plurality of light emitting diodes (LEDs) in cooperation with lenses of different divergence angles. In this case, the controller may further have a third control terminal (a3) through which it is connected to the light supplement unit. When the controller determines that the light in the current shooting environment is insufficient (for example, when shooting a night scene), the controller can send a light supplement instruction to the light supplement unit through terminal a3, so that the light supplement unit emits light according to the instruction to supplement the ambient light. The light supplement unit may offer several light supplement wavebands, such as a white light supplement lamp, a 750 nm supplement lamp, an 850 nm supplement lamp or a 940 nm supplement lamp. In the field of surveillance cameras for security purposes, the light supplement unit can emit light containing several of these wavebands. Assuming the optical filter is the one illustrated in fig. 6, the controller may, under dim daytime light, turn on the light supplement lamp and synchronously set the optical filter to the filtering mode, so that the infrared light added to the ambient light by the light supplement unit is filtered out by the optical filter, while the supplemented visible light reaches the image sensor through the optical filter to increase the brightness during exposure. Under darker conditions at night, the controller can likewise turn on the light supplement lamp and set the optical filter to the full-pass mode, so that both the visible light and the infrared light supplemented into the ambient light by the light supplement unit pass through the optical filter and reach the image sensor. Whether by day or at night, as long as the light is weak, the image processing apparatus can image with stronger light under the light supplement operation of the light supplement unit, thereby improving the brightness of the captured image. In other schemes, the light supplement unit may emit only visible light rather than infrared or ultraviolet light, and the optical filter may remain in the filtering mode at all times.
Next, a specific implementation procedure of the image processing method in the embodiment of the present application is described based on the image processing apparatus illustrated in fig. 5.
Example one
Fig. 7 schematically illustrates an execution flow of an image processing method provided in an embodiment of the present application, where the method is applied to a controller in an image processing apparatus, such as the controller illustrated in fig. 5. As shown in fig. 7, the method includes:
in step 701, the controller switches the shutter to the transmissive state before the first time.
In the above step 701, the first time may refer to a time at which the image processing apparatus needs to perform shooting. For example, when the image processing apparatus continuously captures multiple frames of images periodically to obtain a video, the first time may refer to a moment between the completion of capturing the previous frame and the start of capturing the next frame. In this case, the controller may switch the shutter to the light-transmitting state after the previous frame has been captured, and then start capturing the next frame.
In an alternative embodiment, the controller may switch the shutter to the light-transmitting state when the image processing apparatus does not need to perform a shooting operation. The cases in which the image processing apparatus does not need to perform a shooting operation may specifically include the following:
after the image processing apparatus is powered on and before shooting is performed for the first time; and
after the image processing apparatus completes exposure of the previous frame and before exposure of the next frame starts.
In this embodiment, in a scenario of capturing still images, the phrase "after the exposure of the previous frame is completed and before the exposure of the next frame starts" may refer to: after the previous image has been captured and before capturing of the next image starts. In a scenario of recording video, a video is in fact formed by combining images captured frame after frame (for example, if 1 s of video is recorded at a frame rate of 25 frames/second, 25 frames are shot within that 1 s, the shooting duration of each frame is 40 ms, and the preset exposure durations of the N rows of photosensitive units in each frame are all less than 40 ms). In that scenario, "after the exposure of the previous frame is completed and before the exposure of the next frame starts" may refer to: after the N-th row of photosensitive units of the earlier of two consecutively shot frames has been switched to the sleep state and before the rows of photosensitive units of the later frame are switched to the working state. In this case, by default the image processing apparatus does not need to perform a shooting operation in the time interval between any two adjacent frames during video recording.
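To make the frame-timing arithmetic above concrete, the following minimal sketch (in Python, with hypothetical values and function names that are not part of this application) computes the duration available per frame for a given frame rate and checks that a preset exposure duration fits within it.

    def per_frame_duration_ms(frame_rate_fps: float) -> float:
        """Duration available for shooting one frame, in milliseconds."""
        return 1000.0 / frame_rate_fps

    # Figures from the text: recording at 25 frames/second leaves 40 ms per frame,
    # so the preset exposure duration of every row must stay below 40 ms.
    frame_ms = per_frame_duration_ms(25)        # 40.0 ms
    preset_exposure_first_row_ms = 39.0         # hypothetical value for illustration
    assert preset_exposure_first_row_ms < frame_ms, "exposure must finish within one frame"
    print(f"{frame_ms:.1f} ms per frame, first-row exposure {preset_exposure_first_row_ms} ms")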
The above embodiment in effect sets the light-transmitting state as the default state of the shutter. In this case, the controller may directly send a light-transmitting state switching instruction (which may also be referred to as a reset instruction, under which the shutter is switched to its default light-transmitting state), or it may first detect the current state of the shutter and send the light-transmitting state switching instruction only when the current state is determined to be the light-shielding state. Of course, detecting the current state requires the controller to perform additional steps and is not as direct as simply sending the light-transmitting state switching instruction. Further, each time a shooting operation needs to be performed, the controller may control the image sensor and the shutter to complete it (refer to steps 702 to 704, described later). During the shooting of each frame, the image sensor needs to be exposed to form an image, so the controller sends exposure instructions to the N rows of photosensitive units according to their preset exposure durations; it also keeps the shutter in the light-transmitting state for part of the shooting period and in the light-shielding state for the remainder (that is, after the preset exposure duration of the first row of photosensitive units ends). When the controller determines that the exposure of the frame is finished (that is, after the preset exposure duration of the N-th row of photosensitive units ends), it sends a light-transmitting state switching instruction to the shutter to switch the shutter from the light-shielding state back to the light-transmitting state, thereby restoring the default state of the shutter.
It should be understood that, in the embodiment of the present application, when the image processing apparatus does not need to perform a shooting operation, each photosensitive unit in the image sensor is in the sleep state by default (that is, a voltage is applied to it but it does not operate), and a photosensitive unit leaves the sleep state only when it receives an exposure instruction sent by the controller. In this case, when the image processing apparatus does not need to perform a shooting operation, even though the shutter is in the light-transmitting state by default and light falls on every photosensitive unit, the photosensitive units do not perform exposure because they have not received an exposure instruction.
In step 702, the controller simultaneously gates each row of light-sensing units in the image sensor at a first time to enable each row of light-sensing units to simultaneously start exposure.
In an alternative embodiment, if the controller switches the shutter to the light-transmitting state when no shooting operation is required, then when a shooting operation is required the controller can gate all rows of photosensitive units in the image sensor at the same time, for example by switching all rows of photosensitive units to the working state simultaneously. Because the shutter is in the light-transmitting state by default, each row of photosensitive units can simultaneously collect the light transmitted to it through the shutter and convert the light signal into an image signal. In this way, the shutter is switched to the light-transmitting state before shooting is performed, so that each row of photosensitive units can begin exposure immediately after being gated; this avoids the situation in which the rows of photosensitive units have already been gated but cannot be exposed because the shutter has not yet been switched to the light-transmitting state.
In the embodiment of the present application, how the controller determines whether a shooting operation needs to be performed differs from scene to scene. For example, in the field of surveillance cameras for security, once the image processing apparatus is installed in the area to be monitored and powered on, it needs to remain in a video-recording state at all times; in such a scenario, as soon as the controller detects that the image processing apparatus is powered on, it can determine that a shooting operation needs to be performed (refer to the second embodiment). For another example, in the terminal field, if the camera application on the terminal device detects that the user has triggered the shooting button in the camera application, the camera application may send a notification message to the controller informing it that a shooting operation currently needs to be performed (refer to the third embodiment).
It should be noted that, if the shutter is controlled to be in the light-transmitting state when shooting is not required, the shutter has already been switched to the light-transmitting state by the time the image processing apparatus receives a shooting instruction, so the image processing apparatus can directly control the image sensor to perform the photosensitive operation, which helps improve the response speed of shooting. However, this is only an alternative embodiment. In other optional embodiments, the controller may instead switch the shutter to the light-transmitting state after receiving the shooting instruction and then gate each row of photosensitive units. Although this lengthens the response time from receiving the shooting instruction to obtaining the captured image, it can still resolve the jelly effect that arises when a rolling shutter shoots a moving object, while prolonging the exposure duration of the image processing apparatus as much as possible and improving the quality of the captured image.
In step 702, gating all rows of photosensitive units simultaneously may be achieved by setting the operating mode of the image sensor. For example, in one case, an image sensor fixed in global reset mode is provided in the image processing apparatus (an image sensor in global reset mode can start light sensing in all rows of photosensitive units simultaneously), so that every shot taken by the image processing apparatus exposes all rows simultaneously. In another case, an image sensor capable of switching modes may be provided in the image processing apparatus: when all rows of photosensitive units need to be exposed simultaneously, the image sensor is switched to global reset mode, and when the rows need to be exposed line by line, the image sensor is switched to progressive (line-by-line) exposure mode. The details of the global reset mode and the progressive exposure mode are described in the third embodiment and are not repeated here.
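As a rough illustration of the mode-switching idea above, the sketch below wraps a hypothetical image sensor with two selectable exposure modes; the class, method names and register write are assumptions for illustration only and do not correspond to any particular sensor.

    from enum import Enum

    class ExposureMode(Enum):
        GLOBAL_RESET = "global_reset"   # all rows start exposure at the same time
        PROGRESSIVE = "progressive"     # rows start exposure one after another

    class HypotheticalSensor:
        """Toy wrapper around a mode register of an imagined sensor."""
        def __init__(self) -> None:
            self.mode = ExposureMode.PROGRESSIVE

        def set_mode(self, mode: ExposureMode) -> None:
            # A real driver would write a mode register over I2C or SPI here.
            self.mode = mode

    sensor = HypotheticalSensor()
    sensor.set_mode(ExposureMode.GLOBAL_RESET)   # gate all rows simultaneously
    print(sensor.mode)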
In step 703, the controller switches the shutter to the light-shielding state before the exposure of the first row of photosensitive cells in the image sensor is completed.
In the embodiment of the present application, the image processing apparatus shoots with a rolling shutter, and a rolling shutter has to expose in a line-by-line manner: the rolling shutter has only one memory, and at any given moment that memory can store only the image signal obtained by exposing a single row of photosensitive units. Therefore, although all rows of photosensitive units can start exposure at the same time, each row must finish exposure at a different time so that the image signal obtained from each row can be stored in the memory in turn. In this case, if the image processing apparatus finishes exposure line by line from top to bottom while all rows start exposure simultaneously, the preset exposure durations of the rows increase row by row, and the first row of photosensitive units has the shortest preset exposure duration of all rows; the period from the moment all rows start exposure to the moment the first row finishes exposure is therefore the common photosensitive period of all rows of photosensitive units.
Based on this, in an alternative embodiment, the controller may start a timer after gating the rows of photosensitive units. If the state-switching response duration of the shutter is not considered, the controller switches the shutter to the light-shielding state by the time the timer reaches the preset exposure duration of the first row of photosensitive units. Alternatively, if the state-switching response duration of the shutter is considered, the controller switches the shutter to the light-shielding state by the time the timer reaches the difference between the preset exposure duration of the first row of photosensitive units and the state-switching response duration of the shutter. With this embodiment, every row of photosensitive units receives light for exposure imaging only during the common photosensitive period and receives no light during the differing remainder of its photosensitive period, so the image signals generated by the exposure of the rows exhibit essentially no skew.
A specific example is given below. Fig. 8 illustrates an exposure timing chart for controlling N rows of photosensitive units through the shutter according to an embodiment of the present application. In this example, assume the rows are the first through N-th rows of photosensitive units, the preset exposure duration of the first row is 40 ms, and the preset exposure durations of any two adjacent rows differ by 0.02 ms. If the controller receives the shooting instruction at time T0, it switches every row of photosensitive units, from the first row to the N-th row, to the working state at time T0 and starts a timer, so that every row can start exposure at T0 by receiving the light transmitted through the shutter. Without considering the state-switching response duration of the shutter, when the timer reaches the preset photosensitive duration of the first row (i.e., 40 ms), the controller sends a light-shielding state switching instruction to the shutter to switch it to the light-shielding state. Since the response duration is not considered, from time (T0 + 40 ms) onward the light is blocked by the shutter: although the second through N-th rows of photosensitive units are still in the working state, they can no longer receive light and exposure does not continue, so the actual exposure period of every row of photosensitive units is [T0, T0 + 40 ms]. If the state-switching response duration of the shutter (assumed to be a liquid crystal shading panel) is considered, then in a design where the shutter switches to the light-shielding state when powered on, the response duration for switching to the light-shielding state is 2 ms, and the controller sends the light-shielding state switching instruction when the timer reaches the difference (i.e., 38 ms) between the 40 ms preset photosensitive duration of the first row and the 2 ms response duration. In a design where the shutter switches to the light-shielding state when powered off, the response duration is 100 µs, and the controller sends the light-shielding state switching instruction when the timer reaches the difference (i.e., 39.9 ms) between the 40 ms preset photosensitive duration of the first row and the 100 µs response duration. In either case, although the controller has already sent the light-shielding state switching instruction, the shutter does not actually switch to the light-shielding state until 2 ms or 100 µs after the instruction is sent, so every row of photosensitive units can continue receiving light and exposing during that interval, and the actual exposure period of every row is again [T0, T0 + 40 ms). Based on the above, the shutter is preferably configured to switch to the light-shielding state when powered off, so that the faster response of that switching direction prolongs the real exposure duration of each photosensitive unit as much as possible.
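The timing arithmetic in this example can be written out as a short Python sketch; the 40 ms preset exposure and the 2 ms / 100 µs response durations come from the example above, while the helper function itself is only an illustrative assumption.

    def shading_instruction_time_ms(first_row_exposure_ms: float,
                                    shielding_response_ms: float) -> float:
        """Time after gating the rows at which the controller should send the
        light-shielding instruction, so that the shutter actually blocks light
        when the first row's preset exposure duration ends."""
        return first_row_exposure_ms - shielding_response_ms

    # Design where powering ON shields the panel: ~2 ms response.
    print(shading_instruction_time_ms(40.0, 2.0))   # 38.0 ms after gating
    # Design where powering OFF shields the panel: ~0.1 ms (100 us) response.
    print(shading_instruction_time_ms(40.0, 0.1))   # 39.9 ms after gating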
In the above example, "when the timer counts the preset photosensitive time corresponding to the first row of photosensitive cells, the controller sends the light-shielding state switching instruction to the shutter", or "when the timer counts the time difference between the preset photosensitive time corresponding to the first row of photosensitive cells and the state response time of the shutter, the controller sends the light-shielding state switching instruction to the shutter", which is actually used to ensure that each row of photosensitive cells has the longest actual exposure time. The longest actual exposure period is approximately 40ms, which is 0.02(N-1) longer than the actual exposure period of 40-0.02(N-1) in the scheme illustrated in FIG. 4. Based on the resolution of the image sensor commonly used in the present stage, the value of N is mostly set to 1000 or more, in this case, the actual exposure time of the solution in the above example is at least 20.02ms (i.e. 0.02 × 1000-1) longer than that of the solution illustrated in fig. 4, so the solution in the above example can not only solve the jelly effect existing when the rolling shutter shoots the moving object, but also fully ensure the time for exposing the image of each row of photosensitive units, which is helpful to improve the quality of the image exposed by each row of photosensitive units.
It should be understood that the scheme in the above example maximizes the actual exposure duration of each row of photosensitive units. In other examples, however, the controller may send the light-shielding state switching instruction to the shutter before the timer reaches the preset photosensitive duration of the first row of photosensitive units, or before the timer reaches the difference between that preset photosensitive duration and the state response duration of the shutter. Although the actual exposure duration of each row is then not maximized, as long as the timing is controlled reasonably, the image can still be fully exposed while the jelly effect that occurs when a rolling shutter shoots a moving object is resolved, thereby balancing image quality and the user's visual experience.
In an alternative embodiment, the difference between the preset exposure durations of two adjacent rows of photosensitive units may be a fixed value set before the image processing apparatus leaves the factory and not adjustable afterwards, whereas the preset exposure duration of the first row of photosensitive units, although it also has a preset value, may be adjusted during use. For example, when the ambient light is strong, each row of photosensitive units may need only a short time to be fully exposed, so to avoid an over-bright image the controller can set the preset exposure duration of the first row to a smaller value; because the preset exposure durations of adjacent rows differ by a fixed amount, the preset exposure durations of the second through N-th rows are then adjusted correspondingly and automatically, and the N rows perform the actual exposure within a shorter common photosensitive period, producing an image of appropriate brightness. Conversely, when the ambient light is weak, each row may need a longer time to be fully exposed, so to avoid an over-dark image the controller can set the preset exposure duration of the first row to a larger value; the preset exposure durations of the second through N-th rows again follow automatically, and the N rows perform the actual exposure within a longer common photosensitive period, producing an image of appropriate brightness. In another alternative embodiment, if adjusting the preset exposure duration of the first row is not considered, the controller may instead send the light-shielding state switching instruction to the shutter earlier when the ambient light is strong, so that switching the shutter to the light-shielding state in advance gives the rows a shorter common photosensitive period, and send it later when the ambient light is weak, so that switching the shutter to the light-shielding state later gives the rows a longer common photosensitive period. In this embodiment, the moment at which the controller sends the light-shielding state switching instruction lies within a reasonable time range: the lower limit of that range may be the median of the actual exposure duration, or slightly larger, and the upper limit may be the actual exposure duration itself. When exposure is performed in the rolling-shutter manner illustrated in fig. 8, this reasonable time range may, for example, be 20 ms to 40 ms.
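The row-exposure adjustment described above can be sketched as follows; the function and all numeric values are illustrative assumptions, with only the fixed 0.02 ms per-row step taken from the examples in this application.

    def row_exposures_ms(first_row_ms: float, n_rows: int,
                         row_step_ms: float = 0.02) -> list[float]:
        """Derive every row's preset exposure duration from the (adjustable) value
        of the first row and the fixed, non-adjustable per-row step."""
        return [first_row_ms + i * row_step_ms for i in range(n_rows)]

    bright = row_exposures_ms(first_row_ms=10.0, n_rows=1000)   # strong ambient light
    dark = row_exposures_ms(first_row_ms=40.0, n_rows=1000)     # weak ambient light
    print(f"{bright[0]:.2f} {bright[-1]:.2f}")   # 10.00 29.98 (ms, rows 1 and 1000)
    print(f"{dark[0]:.2f} {dark[-1]:.2f}")       # 40.00 59.98 (ms, rows 1 and 1000)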
It should be understood that the scheme illustrated in fig. 8 only describes using a timer to time when the controller sends the light-shielding state switching instruction; it does not describe using a timer to time when each row of photosensitive units is switched to the sleep state. Assuming that a first timer times the switching of each row of photosensitive units to the sleep state and a second timer times the sending of the light-shielding state switching instruction, the controller may start both timers at the same time after receiving the shooting instruction. The timing condition of the second timer is met first, at which point the controller sends the light-shielding state switching instruction to the shutter. The first timing condition of the first timer is met next, at which point the controller sends an instruction to the first row of photosensitive units to switch it to the sleep state; then the second timing condition of the first timer is met and the controller instructs the second row of photosensitive units to switch to the sleep state; ...; and finally the N-th timing condition of the first timer is met and the controller instructs the N-th row of photosensitive units to switch to the sleep state.
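A minimal sketch of this two-timer arrangement as a single event schedule is given below; the function and its parameters are hypothetical, and the printed times simply reproduce the ordering described above.

    def build_schedule(first_row_ms: float, n_rows: int,
                       row_step_ms: float, shielding_response_ms: float):
        """Return (time_ms, action) events measured from the moment all rows are
        gated: one light-shielding instruction (second timer) followed by one
        sleep instruction per row (first timer)."""
        events = [(first_row_ms - shielding_response_ms,
                   "send light-shielding state switching instruction")]
        for row in range(1, n_rows + 1):
            events.append((first_row_ms + (row - 1) * row_step_ms,
                           f"switch row {row} to the sleep state"))
        return sorted(events)

    for t, action in build_schedule(40.0, 3, 0.02, 0.1):
        print(f"{t:6.2f} ms  {action}")
    # 39.90 ms  send light-shielding state switching instruction
    # 40.00 ms  switch row 1 to the sleep state
    # 40.02 ms  switch row 2 to the sleep state
    # 40.04 ms  switch row 3 to the sleep state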
In an alternative embodiment, if the preset exposure durations of the first through N-th rows of photosensitive units increase in sequence, then once the N-th row has been switched to the sleep state, all of the first through N-th rows have in fact been switched to the sleep state, the shooting operation has ended, and the condition that the image processing apparatus does not need to perform a shooting operation is currently satisfied. In this case, the controller may also send a light-transmitting state switching instruction to the shutter to switch it to the light-transmitting state. At this point, since all N rows of photosensitive units have been switched to the sleep state, they will not be exposed even though light can reach them through the shutter, so the current shot is not affected. In this embodiment, the time at which the controller sends the light-transmitting state switching instruction may likewise be determined by a timer: a third timer may be provided in the controller whose duration is not less than the preset exposure duration of the N-th row of photosensitive units; the controller starts the third timer after receiving the shooting instruction and sends the light-transmitting state switching instruction to the shutter when the third timer expires. Considering that the interval between two adjacent shots may be small, the duration of the third timer should not be set too large; for example, it may be set equal to or slightly greater than the preset exposure duration of the N-th row of photosensitive units, so that the shutter is switched to the light-transmitting state promptly after the current shot ends, improving the response speed of the next shot.
It should be noted that the above descriptions of "timing the time when each line of photosensitive cells is switched to the sleep state by using the first timer", "timing the time when the controller sends the light-shielding state switching instruction by using the second timer", and "timing the time when the controller sends the light-transmitting state switching instruction by using the third timer" are only an optional implementation manner. In other alternative embodiments, the same timer may also be used to simultaneously count "the time for each row of photosensitive cells to switch to the sleep state", "the time for the controller to send the light-shielding state switching instruction", and "the time for the controller to send the light-transmitting state switching instruction", and the timing process and the control manner of the timer refer to the above contents, and are not described herein.
In step 704, the controller controls the display to display a corresponding image according to the image signal obtained by exposing each row of photosensitive units in the image sensor.
For example, referring to fig. 4, the image sensor may first send an image signal obtained by exposure to the image signal processor, the image signal processor processes the image signal to obtain a processed image signal, and then sends the processed image signal to the controller, and then the controller sends the processed image signal to the display for display.
In the first embodiment, by controlling the light shielding and light transmission of the shutter, every row of photosensitive units is exposed only during the common photosensitive period and not during the differing remainder of its photosensitive period, so that even when a moving object is shot, the images obtained by exposing the rows of photosensitive units exhibit essentially no skew, which helps solve the jelly effect that occurs when a rolling shutter shoots a moving object. Furthermore, in the first embodiment the shutter is switched to the light-transmitting state before shooting is performed, all rows of photosensitive units are gated simultaneously during shooting, and the shutter is switched to the light-shielding state before the exposure of the row with the smallest preset exposure duration is completed, so that the length of the common photosensitive period of the rows approaches that smallest preset exposure duration; the common photosensitive period is thus prolonged as much as possible, the image can be fully exposed, and the quality of the captured image is improved. In addition, because in this embodiment the shutter is already in the light-transmitting state when shooting begins, its state does not need to be switched again before the rows of photosensitive units are switched to the working state; the shutter only needs to be switched to the light-shielding state before the N-th row of photosensitive units is switched to the sleep state. The common exposure duration of the rows therefore depends only on the response duration for the shutter to switch to the light-shielding state, and no longer on the response duration for switching to the light-transmitting state, whereas in the image processing method illustrated in fig. 4 the common exposure duration of the rows depends on both response durations. This places lower process requirements on the shutter and therefore also helps reduce the cost of the image processing apparatus.
The image processing method in the present application is further described below from the second embodiment and the third embodiment, respectively, based on the first embodiment.
Example two: monitoring field
As cities become more densely populated, urban traffic grows increasingly congested, especially during the morning and evening rush hours, and this has become a bottleneck in the development of urban traffic. To help municipal traffic departments and traffic police keep public traffic flowing, intelligent traffic systems are gradually being applied in the traffic field as one of the effective means of addressing this bottleneck. An intelligent traffic system deploys monitoring equipment at certain important traffic locations and uses it to automatically identify traffic violations, to assist big-data systems in counting traffic flow, and so on, thereby improving working efficiency. These important traffic locations may include, for example, expressway violation-detection points, electronic toll collection (ETC) gates, intersection monitoring points, road traffic monitoring points, and electronic police posts. At present, global shutter cameras are installed directly at these locations as monitoring equipment, but global shutter cameras are expensive and have high read noise, and may therefore be unsuitable for widespread use. If a rolling shutter camera is instead used as the monitoring equipment at these locations and combined with the image processing method provided in the embodiments of the present application, the high cost and high noise of the global shutter camera can be avoided while still obtaining images that meet the quality requirements when shooting fast-moving objects.
Fig. 9 is a schematic diagram illustrating an execution flow of an image processing method provided in embodiment two of the present application, where the method is applied to a controller in an image processing apparatus, such as the controller illustrated in fig. 5. In this method, it is assumed that the shutter is a liquid crystal shutter panel. As shown in fig. 9, the method includes:
In step 901, the controller determines whether the shooting condition for the current frame of the monitoring image is satisfied; when the shooting condition is satisfied, step 902 is executed; when the shooting condition is not satisfied, step 901 continues to be executed.
In the above step 901, assuming the frame rate is set to 10 frames/second, the image processing apparatus captures one frame of the monitoring image of the monitored area every 100 milliseconds. In this case, satisfying the shooting condition for the current frame may specifically mean that 100 milliseconds have elapsed since shooting of the previous frame started. If 100 milliseconds have not yet elapsed, the image processing apparatus may still be performing the exposure operation for the previous frame, or the previous frame may have been fully exposed while the shutter is still being switched to the light-transmitting state, or the previous frame may have been fully exposed, the shutter already switched to the light-transmitting state, and the apparatus simply waiting.
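The condition check in step 901 can be sketched as a small polling helper; the function name and the use of a wall-clock timestamp are assumptions for illustration, with only the 10 frames/second figure taken from the example above.

    FRAME_INTERVAL_S = 1.0 / 10     # 10 frames/second -> one frame every 100 ms

    def shooting_condition_met(last_frame_start_s: float, now_s: float) -> bool:
        """Step 901 as described above: the current frame may be shot once 100 ms
        have elapsed since the previous frame started."""
        return (now_s - last_frame_start_s) >= FRAME_INTERVAL_S

    print(shooting_condition_met(0.000, 0.060))   # False: keep waiting in step 901
    print(shooting_condition_met(0.000, 0.100))   # True: proceed to step 902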
In step 902, the controller determines whether the voltage applied to the liquid crystal shading panel is the first voltage, if not, step 903 is executed, and if so, step 904 is executed.
In the above step 902, when the first voltage is applied to the liquid crystal shading panel, the panel may be in the light-transmitting state; when the second voltage is applied, the panel may be in the light-shielding state. The first voltage and the second voltage may be controlled by the controller through a square-wave signal. For example, in one design, when the controller outputs a high level (power supplied) the liquid crystal shading panel is powered on and is in the light-transmitting state, and when the controller outputs a low level (no power supplied) the panel is powered off and is in the light-shielding state. In another design, when the controller outputs a high level the panel is powered on and is in the light-shielding state, and when the controller outputs a low level the panel is powered off and is in the light-transmitting state.
Illustratively, since the power-on response time of the liquid crystal shading panel is about 2 ms and its power-off response time is about 100 µs, and since the common photosensitive period of the rows of photosensitive units in the embodiments of the present application depends only on the response duration for switching the panel to the light-shielding state, the panel can be configured to switch to the light-shielding state when powered off and to the light-transmitting state when powered on; that is, the first voltage corresponds to a high level and the second voltage corresponds to a low level. In this way, the controller sends the light-shielding switching instruction to the panel about 100 µs before the end of the preset exposure duration of the first row of photosensitive units, and the switching of the panel to the light-shielding state takes up only a very small slice of the common photosensitive period of the rows, so that the common photosensitive period is maximized.
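A minimal sketch of this preferred level-to-state mapping is given below; the enum, the GPIO-style helper and its names are assumptions for illustration, with the ~2 ms and ~100 µs response figures taken from the paragraph above.

    from enum import Enum

    class PanelState(Enum):
        TRANSMITTING = "light-transmitting"
        SHIELDING = "light-shielding"

    # Preferred mapping: high level (first voltage) powers the panel on -> transmit;
    # low level (second voltage) powers it off -> shield, because the power-off
    # response (~0.1 ms) is much faster than the power-on response (~2 ms).
    RESPONSE_MS = {PanelState.TRANSMITTING: 2.0, PanelState.SHIELDING: 0.1}

    def set_panel_level(high: bool) -> PanelState:
        """Hypothetical GPIO write deciding the panel state from the output level."""
        return PanelState.TRANSMITTING if high else PanelState.SHIELDING

    state = set_panel_level(False)                 # drive the low level
    print(state.value, RESPONSE_MS[state], "ms")   # light-shielding 0.1 ms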
For convenience of understanding, the first voltage is referred to as power-up, and the second voltage is referred to as power-down.
Fig. 10 schematically illustrates a structure of a liquid crystal shading panel according to an embodiment of the present disclosure. As shown in fig. 10, the liquid crystal shading panel may include a first polarizing layer, a first substrate layer, a first conductive layer, a liquid crystal layer, a second conductive layer, a second substrate layer, and a second polarizing layer: the bottom surface of the first polarizing layer contacts the top surface of the first substrate layer, the bottom surface of the first substrate layer contacts the first surface of the first conductive layer, the second surface of the first conductive layer contacts the top surface of the liquid crystal layer, the bottom surface of the liquid crystal layer contacts the first surface of the second conductive layer, the second surface of the second conductive layer contacts the top surface of the second substrate layer, and the bottom surface of the second substrate layer contacts the top surface of the second polarizing layer. The first conductive layer is connected to the first end of the electrode, and the second conductive layer is connected to the second end of the electrode. The liquid crystal layer is made of a liquid crystal material, which lies between the solid and liquid states and therefore has both the optical characteristics of a solid crystal and the flow characteristics of a liquid. A liquid crystal material is an ordered fluid: an intermediate state, formed when certain solid substances melt or are dissolved by a solvent, that retains some properties of both crystal and liquid. Although the rigidity of the solid is lost and the material flows easily, the anisotropic ordered arrangement of molecules characteristic of a partially crystalline substance is maintained.
As shown in fig. 10, the light of all vibration directions that strikes the liquid crystal shading panel first reaches the first polarizing layer, and the first polarizing layer allows only light of a certain vibration direction (assumed to be 0°) to pass through the first substrate layer to the liquid crystal layer. The liquid crystal layer has different deflection properties when powered on and powered off. When powered on, it deflects light by 90°, turning the 0°-direction light that reaches it into 90°-direction light, which then passes through the second substrate layer to the second polarizing layer. When powered off, the liquid crystal layer does not deflect light, so the 0°-direction light that reaches it passes undeflected through the second substrate layer to the second polarizing layer. The second polarizing layer may be arranged to allow 0° polarization or to allow 90° polarization. When the second polarizing layer allows 0° polarization: if the light leaving the liquid crystal layer vibrates in the 0° direction, it continues through the second polarizing layer to the image sensor, and the liquid crystal shading panel is in the light-transmitting state; if the light leaving the liquid crystal layer vibrates in the 90° direction, it cannot pass through the second polarizing layer, and the panel is in the light-shielding state. When the second polarizing layer allows 90° polarization: if the light leaving the liquid crystal layer vibrates in the 0° direction, it cannot pass through the second polarizing layer, and the panel is in the light-shielding state; if the light leaving the liquid crystal layer vibrates in the 90° direction, it passes through the second polarizing layer to the image sensor, and the panel is in the light-transmitting state.
According to the above, the setting of the second polarizing layer actually determines whether the liquid crystal shading panel switches to the light-shielding state when powered on or when powered off: when the second polarizing layer is set to allow 0° polarization, the panel switches to the light-shielding state when powered on and to the light-transmitting state when powered off, and when it is set to allow 90° polarization, the panel switches to the light-transmitting state when powered on and to the light-shielding state when powered off. If a longer common photosensitive period is to be maintained, it is the switching to the light-shielding state that must be controlled, and the second polarizing layer is therefore preferably set to allow 0° polarization.
In step 903, the controller controls the liquid crystal shutter panel to apply a first voltage to switch the liquid crystal shutter panel to a transmissive state.
It should be noted that, steps 902 and 903 are optional steps, which can ensure that the ambient light is transmitted to each row of photosensitive cells through the light shield before each row of photosensitive cells is gated, so that each row of photosensitive cells can immediately perform exposure after being gated, which helps to make the actual exposure time of each row of photosensitive cells meet the requirement.
In other optional embodiments, if the controller automatically switches the shutter to the light-transmitting state after each frame is shot, and the frame rate is set such that the interval between the end of the preset exposure duration of the N-th row of photosensitive units of the earlier of two adjacent frames of the monitoring image and the start of exposure of the rows of photosensitive units of the later frame is not less than the duration the liquid crystal shading panel needs to switch from the light-shielding state to the light-transmitting state (about 2 ms if the panel switches to the light-transmitting state by being powered on), then the controller need not perform the determination of step 902. If that interval is shorter than the duration the panel needs to switch from the light-shielding state to the light-transmitting state, the controller can change certain shooting parameters, for example reducing the preset exposure duration of the first row of photosensitive units in each frame of the monitoring image, so that the preset exposure duration of the N-th row is reduced correspondingly and the panel can begin switching to the light-transmitting state earlier. Alternatively, the controller may leave the shooting parameters unchanged and, after shooting of the next frame of the monitoring image has started, first switch the panel to the light-transmitting state through steps 902 and 903 above and only then let each row of photosensitive units actually perform the photosensitive operation. In this last approach, however, even if the rows of photosensitive units of the next frame have already been switched to the working state, they will not be exposed until the shutter has switched to the light-transmitting state, so their actual exposure duration is shorter than in the former two approaches.
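The interval check described above can be expressed as a small sketch; the helper and the sample numbers other than the ~2 ms power-on response are assumptions for illustration.

    def interval_is_sufficient(frame_interval_ms: float,
                               nth_row_exposure_ms: float,
                               to_transmit_response_ms: float = 2.0) -> bool:
        """True if the idle time between the end of the N-th row's preset exposure
        and the start of the next frame is long enough for the panel to switch
        back to the light-transmitting state, so step 902 can be skipped."""
        idle_ms = frame_interval_ms - nth_row_exposure_ms
        return idle_ms >= to_transmit_response_ms

    # 10 frames/second -> 100 ms per frame; N-th row preset exposure 59.98 ms.
    print(interval_is_sufficient(100.0, 59.98))   # True: no parameter change needed
    print(interval_is_sufficient(100.0, 99.0))    # False: shorten the exposure or
                                                  # switch the panel in step 903 first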
In step 904, the controller simultaneously gates the rows of light sensing units in the image sensor so that the rows of light sensing units simultaneously initiate exposure.
In step 905, the controller controls the liquid crystal light shielding panel to apply a second voltage to switch the liquid crystal light shielding panel to a light shielding state before the exposure of the first line of photosensitive cells in the image sensor is completed.
Illustratively, the controller may further control the liquid crystal shutter panel to apply the first voltage after the last line of photosensitive cells in the image sensor is exposed, so that the liquid crystal shutter panel is switched to a light-transmitting state for facilitating the next shooting.
As a specific example, fig. 11 schematically illustrates a timing control diagram of the rows of photosensitive units provided in an embodiment of the present application. In this example, assume there are 1000 rows of photosensitive units (rows 1 to 1000), the preset exposure duration of row 1 is 40 ms, and the preset exposure durations of any two adjacent rows differ by 0.02 ms; row 1000 is then turned off 19.98 ms after row 1 is turned off. In this case, without considering the state-switching time of the liquid crystal shading panel, the controller may power the panel down once no more than 40 ms (for example, the 39 ms illustrated in fig. 11) have elapsed after gating the rows of photosensitive units, and may power the panel up again once no less than 19.98 ms (for example, the 23 ms illustrated in fig. 11) have elapsed after row 1 is turned off. Taking the state-switching time of the panel into account, the controller may power the panel down once no more than 39.9 ms (assuming a power-off response duration of 100 µs) have elapsed after gating the rows, and may power the panel up again once no less than 19.98 ms have elapsed after row 1 is turned off.
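The arithmetic behind these figures can be reproduced with a short sketch; the variable names are illustrative, and the values are the ones used in the example above.

    N_ROWS = 1000
    FIRST_ROW_EXPOSURE_MS = 40.0
    ROW_STEP_MS = 0.02
    POWER_OFF_RESPONSE_MS = 0.1   # ~100 us to switch the panel to the light-shielding state

    # Row 1000 is turned off 0.02 * 999 = 19.98 ms after row 1.
    last_row_offset_ms = ROW_STEP_MS * (N_ROWS - 1)

    # Ignoring the response time, power the panel down no later than 40 ms after
    # gating the rows; accounting for it, no later than 39.9 ms (fig. 11 uses 39 ms).
    power_down_deadline_ms = FIRST_ROW_EXPOSURE_MS - POWER_OFF_RESPONSE_MS

    # Power the panel back up only after all rows are asleep, i.e. at least
    # 19.98 ms after row 1 is turned off (fig. 11 uses 23 ms).
    power_up_earliest_ms = last_row_offset_ms

    print(f"{last_row_offset_ms:.2f} {power_down_deadline_ms:.2f} {power_up_earliest_ms:.2f}")
    # 19.98 39.90 19.98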
In step 906, the controller has the image sensor send the image signals obtained by exposure to the image signal processor. For example, once the preset exposure duration of the N-th row of photosensitive units has elapsed after the rows were gated, the controller determines that the current exposure is finished and therefore has the image sensor send the image signals obtained by exposing each row of photosensitive units to the image signal processor.
In step 907, the controller determines whether the monitored image includes the target object, if yes, step 908 is executed, and if not, step 901 is executed.
In step 908, the controller selects an optimal monitoring image (e.g., a proper angle or a clearest target object, etc.) from the plurality of monitoring images including the target object, and displays the optimal monitoring image on the display.
In the embodiment of the present application, when an object moves relative to the image processing apparatus at a low speed, the skew in the image obtained by the image processing apparatus is not severe and can even be ignored. The target object in steps 907 and 908 above may therefore specifically refer to a fast-moving object. The image processing method in the present application can be used to capture undistorted images in scenes where objects move rapidly, and such scenes may include, for example, one or more of the following:
In one possible monitoring scene, an image processing apparatus is installed at a violation-detection site and shoots a no-parking area at a set frame rate to obtain multiple frames of the monitoring image (a vehicle pulling into the no-parking area is itself a scene in which an object moves). After each frame of the monitoring image is captured, the controller may call an artificial intelligence (AI) algorithm to identify whether the frame includes a vehicle, such as a heavy truck, a motor vehicle, a bus, or a car. When a vehicle is present, it is determined that the vehicle is parking in violation of the rules. In this case, the controller can analyze the license plate number of the vehicle from the captured monitoring image to record the illegal-parking behavior, and can also issue an illegal-parking warning to the vehicle on a display screen arranged around the no-parking area, so that the owner can be notified to move the vehicle before it is actually parked in the no-parking area, thereby maintaining public traffic order.
In another possible monitoring scene, an image processing device is arranged on a main urban trunk road, and the image processing device photographs the road section whose traffic needs to be counted at a set frame rate to obtain multiple frames of monitoring images (a vehicle driving on the trunk road is itself a moving object). After each frame of monitoring image is obtained, the controller can call an AI algorithm to identify whether the frame includes a vehicle. When a vehicle is present, it is counted as one object of the traffic statistics. In this case, the controller may wait until the vehicle has moved out of the road section, obtain all the monitoring images of the vehicle, and then find among them the monitoring image with the most suitable shooting angle, so as to analyze information such as the type, license plate number, and owner age of the vehicle, thereby facilitating the subsequent traffic flow statistics operation.
In one possible monitoring scenario, an image processing device is disposed on an expressway, and the image processing device photographs each lane at a set frame rate to obtain multiple frames of monitoring images. For the consecutive multi-frame images obtained, the controller can call an AI algorithm to identify the position change of the same vehicle across frames, and determine the driving speed of the vehicle from that position change and the shooting times of the consecutive frames. When the driving speed of the vehicle is determined to exceed the maximum speed limit of the lane in which the vehicle is located, the vehicle is determined to be speeding. In this case, the controller can analyze the license plate number of the vehicle from the captured monitoring images to record the speeding violation, and can also issue a speeding warning to the vehicle on a display screen arranged around the lane, reminding the owner to slow down as soon as possible and reducing the probability of traffic accidents.
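The speed estimate described above can be sketched as follows; the pixel-to-metre calibration, the function name, and the sample values are illustrative assumptions, not part of this application.

```python
# Hypothetical sketch: given the same vehicle's position in two consecutive
# frames and the frame timestamps, derive an approximate driving speed.

def estimate_speed_kmh(pos_a_px, pos_b_px, t_a_s, t_b_s, metres_per_pixel):
    """Approximate speed from pixel displacement between two frames."""
    dx = pos_b_px[0] - pos_a_px[0]
    dy = pos_b_px[1] - pos_a_px[1]
    distance_m = (dx ** 2 + dy ** 2) ** 0.5 * metres_per_pixel
    dt_s = t_b_s - t_a_s
    return distance_m / dt_s * 3.6          # m/s -> km/h

# e.g. 120 px displacement at 0.05 m/px over 0.2 s -> 108 km/h
speed = estimate_speed_kmh((100, 400), (100, 280), 0.0, 0.2, 0.05)
```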
In another possible monitoring scene, an image processing device is arranged at an intersection, and the image processing device photographs the motor-vehicle lanes and the traffic lights at a set frame rate to obtain multiple frames of monitoring images. For the consecutive multi-frame images obtained, the controller calls an AI algorithm to detect whether any motor vehicle is still moving rapidly while the light is red; if so, it is determined that the motor vehicle has run the red light. In this case, the controller can analyze the license plate number of the motor vehicle from the captured monitoring images to record the red-light violation, and can also issue a red-light warning to the vehicle owner on a display screen arranged around the traffic light. Furthermore, in order to ensure the safety of pedestrians, the controller may notify the main control unit of the urban intelligent traffic system of the current traffic condition, so that the main control unit can decide, according to the current traffic condition of each intersection, a solution suitable for relieving it, so as to avoid traffic accidents as much as possible.
It should be understood that the image processing method in the present application performs well when shooting a fast-moving object, but this does not mean it performs poorly when shooting a static object; thus "defining the target object as a fast-moving object" is only an alternative embodiment. In other alternative embodiments, the target object may also be a static object, and no limitation is required in this respect.
In an alternative embodiment, if the image processing apparatus has two operation modes, namely a progressive exposure mode and a global reset mode, the image processing apparatus may automatically switch to a suitable operation mode according to the scene before performing the subsequent shooting operation. For example, when a static object is photographed, the image processing apparatus can automatically switch to the progressive exposure mode; in this mode the shutter does not block light at any point, and the image processing apparatus photographs the static object directly in the manner illustrated in fig. 4 to obtain a high-quality image. When a fast-moving object is photographed, the image processing apparatus can automatically switch to the global reset mode, in which the shielding and transmission of the light chopper are controlled in the manner illustrated in fig. 7, so that the exposure time of the image is extended as far as possible while the jelly effect is eliminated, and an image of the best possible quality is captured.
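A minimal mode-selection sketch follows; the function name and the motion flag are hypothetical and only illustrate the decision described above.

```python
# Illustrative sketch, assuming a hypothetical controller interface, of the
# automatic choice between the two operation modes described above.

def choose_mode(scene_has_fast_motion: bool) -> str:
    # Fast-moving subject: gate all rows at once and let the shutter end the
    # exposure, avoiding rolling-shutter (jelly-effect) distortion.
    if scene_has_fast_motion:
        return "global_reset"
    # Static subject: ordinary row-by-row exposure, shutter kept transparent.
    return "progressive_exposure"
```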
The scheme in the second embodiment gates all rows of photosensitive units simultaneously and, in combination with switching the liquid crystal shading panel to the light-transmitting state before shooting, can not only solve the image-deformation problem produced when a rolling shutter photographs a moving object, but also extend the actual exposure time of the image and thereby improve image quality. Furthermore, in the second embodiment the liquid crystal shading panel can be configured to switch to the light-shielding state when powered off and to the light-transmitting state when powered on, so that the common light-sensing period of the rows of photosensitive units can be lengthened as far as possible even with the power-on and power-off response times of existing liquid crystal shading panels, and the requirements on those response times are not as high as in the approach of fig. 4. In addition, even if the liquid crystal shading panel in the second embodiment is configured to switch to the light-shielding state when powered on and to the light-transmitting state when powered off, the common light-sensing duration of the rows of photosensitive units depends only on the power-on response time of the panel; the panel is then only required to have a fast power-on response time rather than a fast power-off response time, which lowers the response-sensitivity requirement on the state switching of the liquid crystal shading panel in the image processing device. The second embodiment can therefore directly use, as the light chopper, a liquid crystal shading panel whose power-off response time is on the order of microseconds (for example, about 100us) and whose power-on response time is on the order of milliseconds or even longer, without requiring an especially refined panel; this also helps reduce the production and processing difficulty of the liquid crystal shading panel and the cost of the image processing device.
Example three: terminal field
At present, users have ever more shooting requirements for terminal devices such as single-lens reflex cameras, mobile phones, and tablet computers, and want better shooting quality at lower cost. Taking a single-lens reflex camera as an example, most traditional single-lens reflex cameras shoot with a rolling shutter, but a rolling shutter readily produces image deformation when photographing a moving object, so the user's shooting experience is not very good. In this situation, if the rolling shutter is controlled to shoot images in combination with the image processing method provided by the embodiments of the present application, images meeting the user's quality requirements can be captured even when the single-lens reflex camera photographs a fast-moving object, without increasing the cost of the camera, thereby improving the user's overall experience.
Fig. 12 is a schematic diagram illustrating an execution flow of an image processing method provided in the third embodiment of the present application, where the method is applied to a controller in an image processing apparatus, such as the controller illustrated in fig. 5. In this method, the shutter is assumed to be a mechanical aperture. As shown in fig. 12, the method includes:
In step 1201, the controller receives a shooting instruction triggered by a user.
In step 1201, when the image processing apparatus has a touch display screen, the image processing apparatus may also allow the user to trigger the shooting instruction by means of touch operations. For example, the user may first open the camera application on the touch display screen, and after entering the shooting interface of the camera application, select configuration items such as the shooting mode, the target focal length, and the target aperture in that interface, and finally tap the shooting button. Correspondingly, the application processor running the camera application detects the user's confirmation operation, generates a corresponding shooting instruction, and sends the shooting instruction to the controller.
In step 1202, the controller determines whether the image processing apparatus is in the progressive exposure mode or the global reset mode:
if the image processing apparatus is in the progressive exposure mode, step 1203 is executed;
if the image processing apparatus is in the global reset mode, step 1204 is executed.
In an alternative embodiment, the image processing apparatus may also allow the user to set its shooting mode. For example, a default shooting mode may be set in the image processing apparatus; after the user enters the shooting-mode configuration interface through a settings key, the image processing apparatus may display all selectable shooting modes so that the user may modify the default shooting mode. Alternatively, the image processing apparatus may actively display the shooting-mode configuration interface when it detects that it has entered the shooting state, and the user selects the shooting mode for the current shot on that interface.
In step 1203, the controller gates the photosensitive units row by row in sequence, and turns off each row of photosensitive units in sequence once that row's preset exposure duration has elapsed.
In an alternative embodiment, the controller may also control the blades of the mechanical diaphragm to stay open, so that the mechanical diaphragm remains in the light-transmitting state whenever the image processing apparatus does not need to perform a shooting operation. Moreover, the controller does not need to perform any state-switching operation on the blades of the mechanical diaphragm during the whole line-by-line exposure process. By contrast, the global reset mode also requires switching the state of the mechanical diaphragm, so it places higher requirements on the exposure end time than the progressive exposure mode, its control process is more precise, and its power consumption is larger. Therefore, when a user needs to photograph a static object or a slowly moving object, the image is essentially not deformed even if the rows of photosensitive units are exposed line by line, since the photographed object barely moves. In this case, the user can simply select the progressive exposure mode as the shooting mode for this shot to save the power consumption of the image processing apparatus.
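The staggered row timing of the progressive exposure mode can be sketched as follows; gate_row and turn_off_row are hypothetical hardware hooks and the thread-based timing is only an illustration under those assumptions, not the actual controller implementation.

```python
# Illustrative sketch of step 1203: rows are gated one after the other and
# each row is turned off once its own preset exposure time has elapsed, so
# no shutter switching is needed in this mode.
import threading
import time

def progressive_exposure(n_rows, row_delay_s, exposure_s, gate_row, turn_off_row):
    def expose(row):
        gate_row(row)
        time.sleep(exposure_s)       # this row's preset exposure duration
        turn_off_row(row)
    for row in range(1, n_rows + 1):
        threading.Thread(target=expose, args=(row,)).start()
        time.sleep(row_delay_s)      # stagger the start of successive rows
```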
In step 1204, the controller determines whether the blades of the mechanical diaphragm are open, if not, step 1205 is executed, and if so, step 1206 is executed.
In an alternative embodiment, the mechanical diaphragm may be arranged in the optical unit, specifically on the outside of the lens. Fig. 13 is a schematic structural diagram of a mechanical diaphragm provided in an embodiment of the present application. As shown in fig. 13, the mechanical diaphragm may include a plurality of blades; when the blades are closed, they combine into a disc, thereby blocking the ambient light that would otherwise reach the optical unit from outside the lens. When at least one of the blades is open, ambient light reaching the optical unit can pass through the area where the open blade is located and reach the lens. The controller may also control the amount of ambient light transmitted through the mechanical diaphragm by controlling the number of blades that open and their degree of opening. It should be understood that the blade shape shown in fig. 13 is only an exemplary illustration, and the present application does not limit the specific shape of the blades; for example, a blade may be a polygon such as a rectangle, a square, a trapezoid, or a parallelogram, or a non-polygon such as a semicircle or an ellipse, as long as all blades can completely block the ambient light when closed.
In step 1204, to determine whether the blades of the mechanical diaphragm are open, the controller may specifically determine whether at least one of the blades is open: when one or more blades are open, the blades of the mechanical diaphragm are determined to be open; when all blades are closed, they are determined not to be open.
In step 1205, the controller controls the blades of the mechanical diaphragm to open so that the mechanical diaphragm is switched to a transparent state.
It should be noted that step 1204 and step 1205 are optional steps; they enable each row of photosensitive units to begin exposure immediately after being gated, and help to extend the actual exposure time of each row of photosensitive units.
In step 1206, the controller simultaneously gates the photosensitive cells in each row of the image sensor, so that the photosensitive cells in each row simultaneously start exposure.
In an alternative embodiment, the mechanical diaphragm may default to a fully open state in which all the blades are fully open, i.e., 100% of the ambient light enters the mechanical diaphragm and reaches the lens. After receiving the shooting instruction and before simultaneously gating the rows of photosensitive units, the controller may also detect the current ambient light intensity (for example, by calling an ambient light sensor preset in the image processing apparatus). When the ambient light intensity is greater than a first threshold (for example, in a daytime or midday shooting scene), the ambient light is too bright, and shooting directly with it may cause color distortion due to overexposure. Therefore, in this case, the controller may synchronously reduce the opening angles of all the blades in the mechanical diaphragm before simultaneously gating the rows of photosensitive units. Because the opening angles of all blades are reduced synchronously, the brightness uniformity of the captured picture is improved, the amount of ambient light entering the mechanical diaphragm is reduced, and overexposure is avoided. It should be understood that how much the opening angles are reduced can be determined by the ambient light intensity: when the ambient light is stronger, the opening angles of all blades can be reduced synchronously by a larger amount, making the openings smaller; when the ambient light is weaker, they can be reduced by a smaller amount, leaving the openings larger. Correspondingly, when the ambient light intensity is less than a second threshold (for example, in a night shooting scene), the ambient light is too dim, and shooting directly with it may leave the image dark due to underexposure. In this case, since all blades are already fully open at the maximum angle, the brightness cannot be raised through the mechanical diaphragm, so the controller may perform a light supplement operation through the light supplement unit illustrated in fig. 5. In this embodiment, the first threshold is greater than the second threshold, and their values may be set empirically by a person skilled in the art; this application does not specifically limit them.
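The threshold logic above can be summarized in a short sketch; the thresholds, the scaling rule, and the controller hooks (set_blade_angle, enable_fill_light) are assumptions made for illustration only.

```python
# Hedged sketch of the ambient-light handling described above.

def prepare_exposure(ambient_lux, first_threshold, second_threshold,
                     set_blade_angle, enable_fill_light, max_angle=90.0):
    if ambient_lux > first_threshold:
        # Too bright: shrink all blade openings in step to avoid overexposure;
        # the brighter the scene, the smaller the opening angle.
        scale = first_threshold / ambient_lux
        set_blade_angle(max_angle * scale)
    elif ambient_lux < second_threshold:
        # Too dark and blades already fully open: supplement light instead.
        set_blade_angle(max_angle)
        enable_fill_light()
    else:
        set_blade_angle(max_angle)   # default fully open state
```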
In step 1207, the controller controls the blades of the mechanical diaphragm to close before the exposure of the first line of photosensitive cells in the image sensor is completed, so that the mechanical diaphragm is switched to the light-shielding state.
For example, after the exposure of the last row of photosensitive units in the image sensor has finished, the controller may further control the blades of the mechanical diaphragm to open, so that the mechanical diaphragm is switched back to the light-transmitting state, thereby facilitating the exposure of the next frame image.
In step 1208, the controller controls the display to display a corresponding image according to the image signal obtained by exposing each line of photosensitive cells in the image sensor.
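The overall global-reset flow of steps 1204 to 1208 can be sketched as below; the aperture, sensor, isp and display objects and their methods are hypothetical hooks, and the timing margins are assumptions chosen only to respect the ordering stated above.

```python
# Minimal sketch of steps 1204-1208 with hypothetical controller hooks.
# row_exposure_s[0] is row 1's preset exposure, row_exposure_s[-1] row N's.
import time

def global_reset_capture(aperture, sensor, isp, display, row_exposure_s):
    if not aperture.blades_open():                 # steps 1204-1205
        aperture.open_blades()                     # diaphragm transmits light
    t0 = time.monotonic()
    sensor.gate_all_rows()                         # step 1206: all rows start exposing
    time.sleep(0.95 * row_exposure_s[0])           # stay inside row 1's exposure window
    aperture.close_blades()                        # step 1207: shield before row 1 ends
    # Reopen only after row N's preset exposure has also elapsed.
    time.sleep(max(0.0, row_exposure_s[-1] - (time.monotonic() - t0)) + 0.001)
    aperture.open_blades()                         # ready for the next frame
    display.show(isp.process(sensor.read_out()))   # step 1208: display the image
```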
The scheme in the third embodiment gates all rows of photosensitive units simultaneously and, in combination with switching the mechanical diaphragm to the light-transmitting state before shooting, can not only solve the image-deformation problem produced when a rolling shutter photographs a moving object, but also extend the actual exposure time of the image and thereby improve image quality. Furthermore, the mechanical diaphragm is controlled purely mechanically; compared with the power-on/power-off control of a liquid crystal shading panel, it has a simpler structure, lower processing difficulty, and more controllable cost. In addition, by using the mechanical diaphragm as the shutter and arranging it in the optical unit, no extra space needs to be reserved for the shutter, and this single component can simultaneously provide both aperture adjustment and light-shielding adjustment, which helps improve component utilization.
The second embodiment and the third embodiment are merely examples in which a liquid crystal shading panel and a mechanical diaphragm, respectively, are used as the shutter, to describe how the image processing method applies in different scenes. It should be understood that the shutter in the embodiments of the present application may also be another type of component, such as a Fabry-Perot (FP) resonant cavity. Moreover, the image processing apparatus and the image processing method in the embodiments of the present application may be applied not only to the monitoring field and the terminal field illustrated above, but also to more fields in the future, for example smart-home fields such as robots, washing machines, liquid crystal televisions, and microwave ovens, without specific limitation.
It is to be understood that the various embodiments herein may also be combined with each other to arrive at new embodiments.
It should be noted that the names of the above information are merely examples; as communication technology evolves, the name of any of the above information may change, but however the name changes, as long as its meaning is the same as that of the corresponding information in this application, it falls within the scope of protection of this application.
The above scheme provided by the present application has mainly been introduced from the perspective of interaction between network elements. It is to be understood that each network element described above, in order to implement the above functions, includes a corresponding hardware structure and/or software module for performing each function. Those skilled in the art will readily appreciate that the exemplary units and algorithm steps described in connection with the embodiments disclosed herein can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and the design constraints of the technical solution. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementation should not be considered beyond the scope of the present application.
Based on the foregoing embodiments and the same concept, fig. 14 is a schematic diagram of an image processing apparatus provided in an embodiment of the present application, and as shown in fig. 14, the image processing apparatus 1401 may be any of the image processing apparatuses illustrated above, or may be a chip or a circuit, such as a chip or a circuit that can be disposed in an image processing apparatus.
The image processing apparatus may correspond to the image processing apparatus in the above methods, and may implement the steps performed by the image processing apparatus in any one or more of the corresponding methods shown in figs. 1 to 13 above. The image processing apparatus may include a switching unit 1402 and a gating unit 1403.
When the image processing apparatus 1401 is the above-mentioned image processing apparatus, the switching unit 1402 may switch the shutter to the light-transmitting state before a first time, so that light is transmitted to the image sensor through the shutter; the gating unit 1403 may simultaneously gate the first row of photosensitive units through the Nth row of photosensitive units at the first time, so that the first row through the Nth row of photosensitive units simultaneously start the exposure operation at the first time; and the switching unit 1402 may switch the shutter to the light-shielding state before the preset exposure duration of the first row of photosensitive units has elapsed after the first time, so that the shutter closes the transmission path of the light to the image sensor. N is a positive integer, and the preset exposure durations of the first row of photosensitive units through the Nth row of photosensitive units increase in sequence.
In an alternative embodiment, the image sensor may be in a global reset mode.
In an optional implementation, the switching unit 1402 is specifically configured to: switch the light chopper to the light-transmitting state after detecting that the image processing device is started; or switch the light chopper to the light-transmitting state after the preset exposure duration of the Nth row of photosensitive units of the previous shot has ended.
In an optional implementation, the switching unit 1402 is specifically configured to: when the preset exposure duration of the first row of photosensitive units is a first duration and the time required for the light chopper to switch to the light-shielding state is a second duration, send a light-shielding-state switching instruction to the light chopper after a third duration has elapsed after the first time, so as to drive the light chopper to switch to the light-shielding state. The third duration is not greater than the difference between the first duration and the second duration.
In an alternative embodiment, after the switching unit 1402 switches the shutter to the light-shielding state, the switching unit 1402 is further configured to: when the preset exposure duration of the first row of photosensitive units is a first duration and the difference between the preset exposure duration of the Nth row of photosensitive units and that of the first row is a fourth duration, send a light-transmitting-state switching instruction to the light chopper after a fifth duration has elapsed after the first time, so as to drive the light chopper to switch to the light-transmitting state. The fifth duration is greater than the sum of the first duration and the fourth duration.
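The two timing constraints above (the shading command no later than the first duration minus the second duration after the first time, and the transmitting command no earlier than the first duration plus the fourth duration after it) can be checked with a short sketch; the function name, the margin, and the example values are illustrative assumptions.

```python
# Sketch of the timing constraints stated above (names are illustrative).

def switching_times(t1_ms, t2_ms, t4_ms, margin_ms=0.5):
    third = t1_ms - t2_ms - margin_ms        # when to send the shading command
    fifth = t1_ms + t4_ms + margin_ms        # when to send the transmitting command
    assert third <= t1_ms - t2_ms and fifth > t1_ms + t4_ms
    return third, fifth

# e.g. t1 = 40 ms, shutter needs t2 = 0.1 ms to shade, rows span t4 = 19.98 ms
third, fifth = switching_times(40.0, 0.1, 19.98)   # -> (39.4, 60.48)
```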
For the concepts, explanations, details and other steps related to the technical solutions provided in the embodiments of the present application related to the image processing apparatus 1401, reference is made to the descriptions of the foregoing methods or other embodiments, and no further description is given here.
It should be understood that the above division of the units of the image processing apparatus 1401 is merely a division of logical functions; in actual implementation, the units may be wholly or partially integrated into one physical entity or may be physically separate.
According to the method provided by the embodiment of the present application, the present application further provides an electronic device, which may be a specific hardware entity, or may also be a chip or a circuit. The electronic device may include a processor and a memory, which may be connected by a bus system. In the electronic device, a memory may be used to store instructions, and a processor may be used to execute the instructions stored in the memory to implement a method corresponding to any one or more of the methods shown in fig. 1-13 above.
According to the method provided by the embodiment of the present application, the present application further provides a computer program product, which includes: computer program code which, when run on a computer, causes the computer to perform the method of any one of the embodiments shown in figures 1 to 13.
According to the method provided by the embodiment of the present application, the present application further provides a computer-readable storage medium storing program code, which when run on a computer, causes the computer to execute the method of any one of the embodiments shown in fig. 1 to 13.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the present application are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable device. The computer instructions may be stored on a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, from one website, computer, server, or data center to another website, computer, server, or data center via wire (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wireless (e.g., infrared, radio, microwave, etc.) means. The computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a Digital Video Disk (DVD)), or a semiconductor medium (e.g., a Solid State Drive (SSD)), among others.
It is to be understood that the expressions "first", "second", "third", etc. in the above embodiments are only used to distinguish different units and are not intended to limit number or order. "Connected" is to be understood as an electrical connection or electrical coupling, and is not limited to a direct connection by a wire; "connected" may also be an indirect connection by other means.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.
Claims (12)
1. An image processing method is characterized in that the method is applied to an image processing device, the image processing device comprises a light chopper and an image sensor, the image sensor comprises a first row of photosensitive units to an Nth row of photosensitive units, and the preset exposure time lengths of the first row of photosensitive units to the Nth row of photosensitive units are sequentially increased; wherein N is a positive integer; the method comprises the following steps:
switching the shutter to a light transmitting state before a first time; the light transmitting state is used for transmitting light rays to the image sensor through the light chopper;
gating the first row of photosensitive units to the Nth row of photosensitive units at the first moment simultaneously, so that the first row of photosensitive units to the Nth row of photosensitive units start exposure operation at the first moment simultaneously;
switching the light chopper to a light shielding state before the preset exposure duration of the first row of photosensitive units passes after the first moment; the shading state is used for the light chopper to close a transmission passage of the light to the image sensor.
2. The method of claim 1, wherein the image sensor is in a global reset mode.
3. The method of claim 1 or 2, wherein said switching the shutter to a light transmissive state prior to the first time comprises:
after the image processing device is detected to be started, the light chopper is switched to the light transmission state; or,
and after the preset exposure time of the Nth line of photosensitive units shot at the last time is finished, switching the light chopper to the light transmission state.
4. The method according to any one of claims 1 to 3, wherein when the preset exposure time period of the first row of photosensitive units is a first time period and the time period required for the shutter to switch to the light-shielding state is a second time period, then:
the switching the shutter to a light-shielding state before the preset exposure duration of the first row of photosensitive cells passes after the first time includes:
transmitting a light shielding state switching instruction to the light chopper after a third time length passes after the first time, wherein the light shielding state switching instruction is used for driving the light chopper to be switched to the light shielding state; the third duration is not greater than a difference in duration between the first duration and the second duration.
5. The method according to any one of claims 1 to 4, wherein when the preset exposure time period of the first row of photosensitive units is a first time period and the difference between the preset exposure time period of the Nth row of photosensitive units and the preset exposure time period of the first row of photosensitive units is a fourth time period:
after the switching the shutter to the light shielding state, the method further comprises:
sending a light transmission state switching instruction to the light chopper after a fifth time length passes after the first time, wherein the light transmission state switching instruction is used for driving the light chopper to be switched to the light transmission state; the fifth duration is greater than a sum of durations of the first duration and the fourth duration.
6. An image processing apparatus characterized by comprising:
a switching unit for switching the shutter to a light transmitting state before a first time; the light transmitting state is used for transmitting light rays to the image sensor through the light chopper;
the gating unit is used for gating the photosensitive units in the first row to the photosensitive units in the Nth row at the first moment so as to enable the photosensitive units in the first row to the photosensitive units in the Nth row to start exposure operation at the first moment; the preset exposure time from the first row of photosensitive units to the Nth row of photosensitive units is sequentially increased;
the switching unit is further configured to: switching the light chopper to a light shielding state before the preset exposure duration of the first row of photosensitive units passes after the first moment; the shading state is used for the light chopper to close a transmission passage of the light to the image sensor.
7. The apparatus of claim 6, wherein the image sensor is in a global reset mode.
8. The apparatus according to claim 6 or 7, wherein the switching unit is specifically configured to:
after the image processing device is detected to be started, the light chopper is switched to the light transmission state; or,
and after the preset exposure time of the Nth line of photosensitive units shot at the last time is finished, switching the light chopper to the light transmission state.
9. The apparatus according to any one of claims 6 to 8, wherein when the preset exposure time period of the first row of photosensitive units is a first time period and the time period required for the shutter to switch to the light-shielding state is a second time period:
the switching unit is specifically configured to:
transmitting a light shielding state switching instruction to the light chopper after a third time length passes after the first time, wherein the light shielding state switching instruction is used for driving the light chopper to be switched to the light shielding state; the third duration is not greater than a difference in duration between the first duration and the second duration.
10. The apparatus according to any one of claims 6 to 9, wherein when the preset exposure time period of the first row of photosensitive units is a first time period and the difference between the preset exposure time period of the nth row of photosensitive units and the preset exposure time period of the first row of photosensitive units is a fourth time period:
after the switching unit switches the shutter to the light shielding state, the switching unit is further configured to:
sending a light transmission state switching instruction to the light chopper after a fifth time length passes after the first time, wherein the light transmission state switching instruction is used for driving the light chopper to be switched to the light transmission state; the fifth duration is greater than a sum of durations of the first duration and the fourth duration.
11. An electronic device comprising a processor coupled with a memory, the processor to execute a computer program stored in the memory to cause the electronic device to perform the method of any of claims 1-5.
12. A computer-readable storage medium, characterized in that it stores a computer program which, when executed, implements the method of any one of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011171450.9A CN114430464A (en) | 2020-10-28 | 2020-10-28 | Image processing method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011171450.9A CN114430464A (en) | 2020-10-28 | 2020-10-28 | Image processing method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114430464A true CN114430464A (en) | 2022-05-03 |
Family
ID=81309271
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011171450.9A Pending CN114430464A (en) | 2020-10-28 | 2020-10-28 | Image processing method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114430464A (en) |
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101467440A (en) * | 2006-06-15 | 2009-06-24 | 日本电气株式会社 | Image processing circuit, mobile terminal and sensor control method |
JP2013165432A (en) * | 2012-02-13 | 2013-08-22 | Nikon Corp | Shutter apparatus, nd filter device and imaging apparatus |
CN102724406A (en) * | 2012-06-26 | 2012-10-10 | 北京中电兴发科技有限公司 | Method for realizing global shutter function of complementary metal-oxide semiconductor (CMOS) snap-shot video camera |
CN104796635A (en) * | 2015-04-20 | 2015-07-22 | 中国航天科技集团公司第九研究院第七七一研究所 | Global reset release control method used for oversized-area-array CMOS (complementary metal-oxide-semiconductor transistor) image sensor |
CN105915812A (en) * | 2016-04-28 | 2016-08-31 | 努比亚技术有限公司 | Mobile terminal and exposing method and apparatus for the same |
CN111586264A (en) * | 2019-02-15 | 2020-08-25 | 杭州海康威视数字技术股份有限公司 | Image acquisition device and image acquisition method |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118214952A (en) * | 2024-05-20 | 2024-06-18 | 浙江大华技术股份有限公司 | Image acquisition method, image acquisition device, electronic device, and computer-readable storage medium |
CN118214952B (en) * | 2024-05-20 | 2024-08-02 | 浙江大华技术股份有限公司 | Image acquisition method, image acquisition device, electronic device, and computer-readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20220503 |