WO2020159160A1 - Optical device including liquid lens and having chromatic aberration correction function, and image processing method - Google Patents


Info

Publication number
WO2020159160A1
Authority
WO
WIPO (PCT)
Prior art keywords
channel
information
field
image
correction
Application number
PCT/KR2020/001163
Other languages
French (fr)
Korean (ko)
Inventor
정재욱
Original Assignee
엘지이노텍(주) (LG Innotek Co., Ltd.)
Application filed by 엘지이노텍(주) (LG Innotek Co., Ltd.)
Publication of WO2020159160A1 publication Critical patent/WO2020159160A1/en

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/95: Computational photography systems, e.g. light-field imaging systems
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 3/00: Simple or compound lenses
    • G02B 3/12: Fluid-filled or evacuated lenses
    • G02B 3/14: Fluid-filled or evacuated lenses of variable focal length
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/0075: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00, with means for altering, e.g. increasing, the depth of field or depth of focus
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 7/00: Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B 7/02: Mountings, adjusting means, or light-tight connections, for optical elements for lenses
    • G02B 7/04: Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00: Image enhancement or restoration

Definitions

  • An embodiment relates to an optical device that includes a liquid lens and has a chromatic aberration correction function, and to an image processing method.
  • A camera lens including a solid lens is designed such that chromatic aberration is suppressed as much as possible, so that a separate chromatic aberration correction is not required in the image processing step after capturing an image.
  • Even when chromatic aberration remains, it can be corrected easily, because the user can directly remove the chromatic aberration through post-correction (post-processing), the amount of chromatic aberration is very small, and the shape in which the chromatic aberration is generated is symmetrical up/down/left/right.
  • In normal shooting, that is, in a normal state in which the auto-focusing (AF) function is performed, chromatic aberration in a camera lens including a liquid lens exhibits the same aspect as in a camera lens including a solid lens.
  • However, when the OIS function is performed, chromatic aberration is generated to a large degree, and the amount of chromatic aberration varies depending on the OIS driving frequency; it increases approximately 5 to 8 times compared to when the OIS function is not performed.
  • Here, the OIS driving frequency means the frequency of the hand shake of a user who uses the camera lens.
  • In a camera lens including a liquid lens, since the OIS function is performed by tilting the interface between the two liquids, the shape of the chromatic aberration is also asymmetrical up/down/left/right. Due to this, there is a problem in that the image quality of the captured image is significantly deteriorated.
  • An embodiment provides an optical device that includes a liquid lens, has a chromatic aberration correction function, and is capable of providing improved image quality, and an image processing method therefor.
  • An optical device includes: a lens unit including a liquid lens; an image sensor that receives information of light passing through the lens unit through an R channel receiving a red wavelength range, a G channel receiving a green wavelength range, and a B channel receiving a blue wavelength range; and an image processing unit that shifts information of light received through at least one correction channel among the R channel, the G channel, and the B channel by a preset movement amount corresponding to driving information of the liquid lens and to field information of the correction channel on the image sensor.
  • the correction channel may be a B channel and an R channel
  • the G channel may be a reference channel
  • the image processing unit may include: a look-up table that maps and stores a correction angle of the liquid lens for each of a plurality of compensation angles, and that maps and stores, for each of the correction angles, a first movement amount for each field on the image sensor of the B channel and a second movement amount for each field on the image sensor of the R channel; a control unit that outputs, from the look-up table, the first movement amount for each field of the B channel corresponding to the compensation angle corresponding to the driving information of the liquid lens, and the second movement amount for each field of the R channel corresponding to the compensation angle; and an image shifting unit that shifts information of the light received on the B channel, for each field, by the first movement amount for each field output from the look-up table, and shifts information of the light received on the R channel, for each field, by the second movement amount for each field output from the look-up table.
  • the image processing unit may include: a correction angle determining unit that obtains a preset correction angle of the liquid lens corresponding to a compensation angle corresponding to the driving information of the liquid lens; a movement amount determining unit that determines, corresponding to the correction angle, a first movement amount for each field on the image sensor of the B channel and a second movement amount for each field on the image sensor of the R channel; and an image shifting unit that shifts the information of the light received through the B channel by the first movement amount for each field, and shifts the information of the light received through the R channel by the second movement amount for each field.
  • the optical device may further include a synthesis unit that generates a corrected final image by synthesizing information of light received on the G channel, shifted information of light received on the B channel, and shifted information of light received on the R channel.
  • the optical device may further include a hand shake detection unit that converts user hand shake information corresponding to the driving information of the liquid lens into a compensation angle.
  • the hand shake detection unit may include a gyro sensor that senses the degree of the user's hand shake and converts the hand shake information about the sensed shake, which corresponds to the driving information of the liquid lens, into a compensation angle.
  • An image processing method according to an embodiment is for an optical device including: a lens unit including a liquid lens; an image sensor that receives information of light passing through the lens unit; and an image processing unit for correcting the information of the light. The method may include: converting driving information of the liquid lens into a compensation angle; obtaining a preset correction angle of the liquid lens corresponding to the compensation angle; determining a preset movement amount corresponding to the correction angle and to field information on the image sensor of at least one correction channel among an R channel receiving a red wavelength range, a G channel receiving a green wavelength range, and a B channel receiving a blue wavelength range; and shifting information of light received through the correction channel by the determined movement amount for each field on the image sensor.
  • the correction channel may be an R channel and a B channel
  • the G channel may be a reference channel
  • the determining of the movement amount may include: determining a first movement amount for each field on the image sensor of the B channel corresponding to the correction angle; and determining a second movement amount for each field on the image sensor of the R channel corresponding to the correction angle.
  • the shifting step may include shifting information of light received through the B channel by the first movement amount for each field on the image sensor; And shifting the information of the light received through the R channel by the second movement amount for each field on the image sensor.
  • the image processing method may further include generating a corrected final image by synthesizing information of light received on the G channel, shifted information of light received on the R channel, and shifted information of light received on the B channel.
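The correction flow claimed above can be sketched end to end. The following Python sketch is illustrative only: every function and parameter name is hypothetical, and each stage is passed in as a callable so the flow mirrors the claimed steps (driving information → compensation angle → correction angle → per-field movement amounts → per-field shift → synthesis with the reference G channel).

```python
def correct_chromatic_aberration(r, g, b, driving_info,
                                 to_compensation_angle,
                                 to_correction_angle,
                                 movement_amounts,
                                 shift):
    """Sketch of the claimed correction flow (hypothetical names).

    r, g, b: per-channel light information from the image sensor.
    The G channel is the reference channel; B and R are correction channels.
    Returns (g, shifted b, shifted r) for a downstream synthesis unit.
    """
    comp = to_compensation_angle(driving_info)   # e.g. from a gyro sensor
    corr = to_correction_angle(comp)             # preset mapping
    b_moves, r_moves = movement_amounts(corr)    # per-field movement amounts
    b_shifted = shift(b, b_moves)                # shift B for each field
    r_shifted = shift(r, r_moves)                # shift R for each field
    return g, b_shifted, r_shifted
```

Passing identity callables shows the data flow without performing any actual correction; a real implementation would plug in the look-up-table and image-shifting stages described later.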
  • the optical device including a liquid lens and having a chromatic aberration correction function according to an embodiment, and the image processing method, can provide improved image quality by correcting chromatic aberration, that is, by shifting information of light received through a correction channel by a preset movement amount for each field according to the degree of the user's hand shake.
  • FIG. 1 is a block diagram of an optical device according to an embodiment.
  • FIG. 2 is a cross-sectional view of the liquid lens unit including the liquid lens illustrated in FIG. 1.
  • FIG. 3 shows an implementation example according to an embodiment of the optical device shown in FIG. 1.
  • FIG. 4 shows a planar shape according to the embodiment of the image sensor shown in FIG. 1.
  • FIG. 7 is a block diagram according to an embodiment of the image processing unit illustrated in FIG. 1.
  • FIG. 9 is a block diagram according to another embodiment of the image processing unit shown in FIG. 1.
  • FIG. 10 is a flowchart illustrating an image processing method according to an embodiment.
  • FIG. 11 is a view for explaining chromatic aberration correction according to an embodiment.
  • FIG. 12 is a diagram for describing an image in which chromatic aberration is corrected by an optical device and an image processing method according to an embodiment.
  • the terms used in the embodiments of the present invention are for describing the embodiments and are not intended to limit the present invention.
  • the singular form may also include the plural form unless specifically stated otherwise, and when described as "at least one (or more) of A, B, and C", it may include one or more of all possible combinations of A, B, and C.
  • terms such as first, second, A, B, (a), and (b) may be used. These terms are only for distinguishing one component from another component, and do not limit the nature, sequence, or order of the components.
  • when a component is described as being 'connected', 'coupled', or 'linked' to another component, this includes not only the case where the component is directly connected, coupled, or linked to the other component, but also the case where it is 'connected', 'coupled', or 'linked' via yet another component disposed between the two.
  • when a component is described as being formed or disposed "above (on)" or "below (under)" another component, this includes not only the case where the two components are in direct contact with each other, but also the case where one or more other components are formed or disposed between the two components.
  • when expressed as "above (on)" or "below (under)", this may include not only the upward direction but also the downward direction with respect to one component.
  • the variable lens may be a variable focus lens. Also, the variable lens may be a lens whose focus is adjusted.
  • the variable lens may be at least one of a liquid lens, a polymer lens, a liquid crystal lens, a VCM type, and an SMA type.
  • the liquid lens may include a liquid lens including one liquid and a liquid lens including two liquids.
  • the liquid lens including one liquid may change the focus by adjusting a membrane disposed at a position corresponding to the liquid; for example, the focus may be changed by pressing the membrane with the electromagnetic force between a magnet and a coil.
  • the liquid lens including two liquids may include a conductive liquid and a non-conductive liquid, and may control the interface formed between the conductive liquid and the non-conductive liquid by using a voltage applied to the liquid lens.
  • the polymer lens can change the focus of the polymer material through a driving unit such as a piezo.
  • the liquid crystal lens can change the focus by controlling the liquid crystal by electromagnetic force.
  • the VCM type can change the focus by adjusting the solid lens or the lens assembly including the solid lens through an electromagnetic force between the magnet and the coil.
  • the SMA type may use a shape memory alloy to control a solid lens or a lens assembly including the solid lens to change focus.
  • the optical device having the chromatic aberration correction function according to the embodiment will be described as including a liquid lens as a variable lens, and the image processing method according to the embodiment will be described as being performed at an optical device including a liquid lens as a variable lens.
  • the embodiment is not limited thereto. That is, the following description of the optical device can also be applied to an optical device including a variable lens other than a liquid lens, and the following description of the image processing method can also be applied to an image processing method performed by an optical device including a variable lens other than a liquid lens.
  • FIG. 1 is a block diagram of an optical device 100 according to an embodiment, which may include a lens unit (or lens assembly) 110, an image sensor 120, an image processing unit 130, and an image synthesis unit 140.
  • the lens unit 110 may include a liquid lens 110A.
  • the embodiment is not limited to the liquid lens 110A having a specific structure.
  • FIG. 2 is a cross-sectional view of a liquid lens unit including the liquid lens 110A shown in FIG. 1.
  • the liquid lens unit illustrated in FIG. 2 may include a liquid lens 110A and first and second connecting substrates 116 and 118.
  • the liquid lens 110A includes a plurality of different types of liquids LQ1, LQ2, first to third plates P1, P2, P3, first and second electrodes E1, E2, and an insulating layer 119. It may include.
  • a plurality of liquids may be accommodated in a cavity CA: a first liquid LQ1 having conductivity, and a second liquid (or insulating liquid) LQ2 that is non-conductive and may be embodied as a non-conductive material such as oil.
  • the first liquid LQ1 and the second liquid LQ2 do not mix with each other, and an interface BO may be formed at a contact portion between the first and second liquids LQ1 and LQ2.
  • the first liquid LQ1 may be disposed on the second liquid LQ2, or the second liquid LQ2 may be disposed on the first liquid LQ1.
  • the inner surface of the first plate P1 may form a side wall i of the cavity CA.
  • the first plate P1 may include upper and lower openings having a predetermined inclined surface. That is, the cavity CA may be defined as an area surrounded by an inclined surface of the first plate P1, a first opening contacting the second plate P2, and a second opening contacting the third plate P3.
  • the diameter of the wider opening among the first and second openings may vary depending on the field of view (FOV) required by the liquid lens 110A or the role of the liquid lens 110A.
  • the interface BO formed by the two liquids may move along the inclined surface of the cavity CA by the driving voltage.
  • the first liquid LQ1 and the second liquid LQ2 may be filled, received, or disposed in the cavity CA of the first plate P1. Further, the cavity CA is a site through which light passes. Accordingly, the first plate P1 may be made of a transparent material, or may include impurities so that light does not easily pass through the first plate P1 itself.
  • First and second electrodes E1 and E2 may be disposed on one surface and the other surface of the first plate P1, respectively.
  • the plurality of first electrodes E1 are spaced apart from the second electrode E2 and may be disposed on the upper surface, side surfaces, and lower surfaces of the first plate P1.
  • the second electrode E2 is disposed in at least a portion of the lower surface of the first plate P1 and may directly contact the first liquid LQ1.
  • the plurality of first electrodes E1 may correspond to individual electrodes that can be electrically separated from each other, and the plurality of second electrodes E2 may correspond to common electrodes that are not electrically separated from each other. A portion of the second electrode E2 disposed on the other surface of the first plate P1 may be exposed to the first liquid LQ1 having conductivity.
  • the second plate P2 may be disposed on the upper surface of the first electrode E1 and the cavity CA.
  • the third plate P3 may be disposed under the second electrode E2 and under the cavity CA.
  • the second plate P2 and the third plate P3 may be disposed to face each other with the first plate P1 therebetween. Also, at least one of the second plate P2 or the third plate P3 may be omitted.
  • Each of the second and third plates P2 and P3 is a region through which light passes, and may be made of a translucent material.
  • each of the second and third plates P2 and P3 may be made of glass, and may be formed of the same material for convenience of processing.
  • the second plate P2 may have a configuration that allows the incident light to proceed into the cavity CA of the first plate P1.
  • the third plate P3 may have a configuration that allows light passing through the cavity CA of the first plate P1 to be emitted.
  • the third plate P3 may directly contact the first liquid LQ1.
  • the insulating layer 119 may be disposed while covering a part of the lower surface of the second plate P2 in the upper region of the cavity CA. That is, the insulating layer 119 may be disposed between the second liquid LQ2 and the second plate P2. In addition, the insulating layer 119 may be disposed while covering at least a portion of the first electrode E1 forming the sidewall of the cavity CA. In addition, the insulating layer 119 may be disposed on the lower surface of the first plate P1, covering a portion of the second electrode E2 and the first plate P1 and the first electrode E1. Due to this, the contact between the first electrode E1 and the first liquid LQ1 and the contact between the first electrode E1 and the second liquid LQ2 may be blocked by the insulating layer 119.
  • the insulating layer 119 may cover one electrode (e.g., the first electrode E1) of the first and second electrodes E1 and E2, and may expose a portion of the other electrode (e.g., the second electrode E2) so that electrical energy is applied to the first liquid LQ1 having conductivity.
  • the first connecting substrate 116 and the second connecting substrate 118 serve to supply voltage to the liquid lens 110A.
  • the first electrode E1 may be electrically connected to the first connection substrate 116
  • the second electrode E2 may be electrically connected to the second connection substrate 118.
  • the optical device including the liquid lens 110A may perform an auto-focusing (AF) function or an optical image stabilization (OIS) function.
  • FIG. 3 shows an implementation example according to an embodiment of the optical device 100 shown in FIG. 1.
  • the lens unit 110 of the optical device 100 illustrated in FIG. 3 may include a first lens unit 150, a liquid lens 110A, and a second lens unit 160.
  • the optical device 100 may further include a filter 170 and an image sensor 120.
  • the imaging surface 120S illustrated in FIG. 3 may correspond to the imaging surface of the image sensor 120 illustrated in FIG. 1.
  • each of the first lens unit 150 and the second lens unit 160 is a solid lens and may be implemented as glass or plastic, but the embodiment is not limited to a specific material of each of the first lens unit 150 and the second lens unit 160. Also, at least one of the first lens unit 150 or the second lens unit 160 may be omitted.
  • although the liquid lens 110A is illustrated between the first lens unit 150 and the second lens unit 160, unlike this, the liquid lens 110A may be disposed on the left side of the first lens unit 150 or on the right side of the second lens unit 160.
  • the first lens unit 150 may include two lenses 152 and 154, but may also include fewer or more than two lenses.
  • the second lens unit 160 may include three lenses 162, 164, and 166, but may also include fewer or more lenses than three.
  • the filter 170 may filter light corresponding to a specific wavelength range for light passing through the first lens unit 150, the liquid lens 110A, and the second lens unit 160.
  • the filter 170 may be an infrared (IR) blocking filter or an ultraviolet (UV) blocking filter, but embodiments are not limited thereto.
  • the filter 170 may be disposed between the lens unit 110 and the imaging surface 120S of the image sensor 120. In some cases, the filter 170 may be omitted.
  • the image sensor 120 converts light passing through the lens unit 110 into an image, and outputs the converted image to the image processing unit 130. That is, the image sensor 120 may receive information of the light passing through the lens unit 110 through an R channel receiving a red wavelength range, a G channel receiving a green wavelength range, and a B channel receiving a blue wavelength range.
  • the image processing unit 130 serves to correct chromatic aberration of the image output from the image sensor 120.
  • chromatic aberration may mean the degree to which information (or an image) of light received through the correction channel is shifted compared to information (or an image) of light received through the reference channel. The meaning of chromatic aberration will be described later in detail.
  • The 'correction channel' means a channel to be corrected among the R, G, and B channels,
  • and the 'reference channel' means a channel that serves as a reference for correcting the correction channel among the R, G, and B channels. That is, the degree to which the image received through the correction channel is shifted relative to the image received through the reference channel corresponds to the chromatic aberration.
  • Hereinafter, the correction channels are described as the B channel and the R channel, and the G channel as the reference channel, but embodiments are not limited thereto.
  • the image processing unit 130 can correct chromatic aberration by shifting the information of the light received through at least one correction channel among the R channel, the G channel, and the B channel by a preset movement amount corresponding to the driving information of the liquid lens 110A and to the field information of the correction channel on the image sensor 120.
  • the driving information of the liquid lens 110A may be information about the degree of hand movement of the user (hereinafter, referred to as “hand shake information”).
  • the image processing unit 130 shifts the information (for example, an image) of the light received through the correction channel by the amount of movement for each field.
  • FIG. 4 shows a planar shape according to the embodiment 120A of the image sensor 120 shown in FIG. 1, the horizontal axis represents the position in the x-axis direction, and the vertical axis represents the y-axis position perpendicular to the x-axis.
  • the aforementioned “field” indicates a position relative to the optical axis on the imaging surface 120S of the image sensor 120. That is, the field indicates a relative position of the image sensor 120 in the radial direction with respect to the center of the image sensor 120.
  • a field at the center of the image sensor 120 is 0.0f, and when the entire area of the image sensor 120 is considered radially from the center, the field at the 50% point is ±0.5f and the field at the edge of the image sensor 120 is ±1.0f.
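As a minimal illustration of this field convention, a pixel position can be normalized to an (unsigned) field value. The helper below is hypothetical and assumes the half-diagonal of the sensor (center to corner) maps to 1.0f; the patent itself only defines the field as the normalized radial position.

```python
import math

def pixel_to_field(x, y, width, height):
    """Normalize a pixel position to a field value in [0.0, 1.0].

    Hypothetical helper: the field is the radial distance of (x, y) from
    the image-sensor center, normalized so that the center is 0.0f and
    the sensor corner (half-diagonal) is 1.0f.
    """
    cx, cy = (width - 1) / 2.0, (height - 1) / 2.0
    r = math.hypot(x - cx, y - cy)       # radial distance from center
    r_max = math.hypot(cx, cy)           # largest center-to-corner distance
    return r / r_max
```

For a 101x101 sensor, the center pixel (50, 50) maps to field 0.0 and a corner pixel such as (0, 0) maps to field 1.0.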
  • FIG. 5 shows the relationship between the amount of chromatic aberration (hereinafter referred to as the 'aberration amount') and the field of the image sensor 120 when the interface BO of the liquid lens 110A is not tilted, and FIG. 6 shows the relationship between the aberration amount and the field of the image sensor 120 when the interface BO of the liquid lens 110A is tilted.
  • the horizontal axis represents the aberration amount
  • the vertical axis represents the physical position of the image sensor 120. That is, a value obtained by normalizing the physical position of the vertical axis corresponds to the field.
  • 'BR' represents the amount of chromatic aberration between the B channel and the R channel
  • 'BG' represents the amount of chromatic aberration between the B channel and the G channel.
  • the sign of the field is (+) on the upper side and the sign of the field is (-) on the lower side with respect to the axis representing the aberration amount.
  • In FIG. 5, the sign (i.e., direction) of the field is meaningless because the camera lens is rotationally symmetric about the optical axis, so the sign of the field, i.e., the direction of the field, is not taken into account.
  • In FIG. 6, however, the sign (i.e., direction) of the field must be considered because the camera lens is not rotationally symmetric about the optical axis.
  • the amount of movement for each field required to correct chromatic aberration in the image processing unit 130 may be determined in advance for each driving information (for example, hand-shake information) of the liquid lens 110A. This will be described as follows with reference to FIGS. 7 to 9.
  • FIG. 7 is a block diagram according to an embodiment 130A of the image processing unit 130 shown in FIG. 1.
  • the image processing unit 130A may include a control unit 134, a look-up-table (LUT) 136 and an image shifting unit 138.
  • the LUT 136 may map and store in advance the correction angle of the liquid lens 110A for each of a plurality of compensation angles, and, for each of the correction angles, the first movement amount for each field on the image sensor 120 of the B channel and the second movement amount for each field on the image sensor 120 of the R channel.
  • the correction angle stored in the LUT 136 may be replaced with a driving voltage that is applied to the liquid lens 110A and controls the interface of the liquid lens 110A.
  • the LUT 136 shown in FIG. 8 maps and stores the compensation angle and the correction angle, and maps and stores the correction angle and the first and second movement amounts for each field.
  • the unit of each of the first and second movement amounts illustrated in FIG. 8 is the number of pixels. In the case of FIG. 8, only fields having a positive (+) sign are stored in the LUT 136, but fields having a negative (-) sign may also be stored in the LUT 136.
  • the control unit 134 outputs, from the LUT 136, the first movement amount for each field of the B channel corresponding to the compensation angle corresponding to the hand-shake information, and the second movement amount for each field of the R channel corresponding to the compensation angle. That is, using the compensation angle corresponding to the hand-shake information as the address of the LUT 136, the control unit 134 may output the first and second movement amounts for each field of the B and R channels stored at the corresponding address of the LUT 136 to the image shifting unit 138.
  • the optical device may further include a camera shake detection unit 50.
  • the camera shake detection unit 50 may convert camera shake information corresponding to driving information of the liquid lens 110A into a compensation angle, and output the converted compensation angle to the controller 134.
  • the camera shake detection unit 50 may include a gyro sensor 52.
  • the gyro sensor 52 senses the degree of the user's hand shake, converts the hand-shake information about the sensed shake into a compensation angle, and outputs the converted result to the controller 134. However, the embodiment is not limited to a specific implementation example of the camera shake detection unit 50.
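One common way such a conversion can work (an assumption here, not something the patent specifies) is to integrate the gyro sensor's angular-velocity samples over time to obtain the shake angle that is then used as the compensation angle:

```python
def angular_rate_to_compensation_angle(rates_dps, dt):
    """Integrate gyro angular-velocity samples (degrees/second), taken
    every `dt` seconds, into a shake angle in degrees.

    Hypothetical sketch: real OIS pipelines also filter, de-bias, and
    clamp the result, all of which is omitted here.
    """
    angle = 0.0
    for rate in rates_dps:
        angle += rate * dt  # simple rectangular integration
    return angle

# Two samples of 1.5 deg/s over 0.1 s each integrate to 0.3 degrees,
# matching the example compensation angle used in the description.
```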
  • For example, when the compensation angle is "0", there are no first and second movement amounts for each field.
  • For example, when the compensation angle is '0.3 degrees', the correction angle is 0.42, the first movement amount for each of the fields 0, 0.2, 0.4, 0.6, 0.8, and 1 is -0.1, 0.5, 0.4, -0.2, -0.4, and 0.7, respectively, and the second movement amount for each of the fields 0, 0.2, 0.4, 0.6, 0.8, and 1 is 0, 0.4, 0.6, 0.6, 0.8, and 1.9, respectively.
  • the spacing between fields is 0.2, but this is only an example, and may be larger or smaller than 0.2. As the distance between fields is smaller, the amount of movement is further subdivided, so that chromatic aberration can be accurately corrected.
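The LUT contents just described can be sketched as a small table keyed by the compensation angle; the structure and names below are hypothetical, and the sample entry reuses the example values from the description (compensation angle 0.3 degrees). Linear interpolation between stored field samples illustrates how movement amounts at intermediate fields could be obtained, and why a smaller field spacing refines the correction.

```python
# Hypothetical sketch of the LUT 136. The compensation angle serves as the
# address; each entry stores the correction angle and the per-field first
# (B-channel) and second (R-channel) movement amounts, in pixels.
FIELDS = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]

LUT = {
    # compensation angle (deg): (correction angle, B shifts, R shifts)
    0.0: (0.0, [0.0] * 6, [0.0] * 6),
    # Example entry taken from the description (compensation angle 0.3 deg).
    0.3: (0.42, [-0.1, 0.5, 0.4, -0.2, -0.4, 0.7],
                [0.0, 0.4, 0.6, 0.6, 0.8, 1.9]),
}

def lookup_movement(compensation_angle, field):
    """Return (first, second) movement amounts for a given field value,
    linearly interpolating between the stored field samples."""
    _, b_shifts, r_shifts = LUT[compensation_angle]
    for i in range(len(FIELDS) - 1):
        lo, hi = FIELDS[i], FIELDS[i + 1]
        if lo <= field <= hi:
            t = (field - lo) / (hi - lo)
            return (b_shifts[i] + t * (b_shifts[i + 1] - b_shifts[i]),
                    r_shifts[i] + t * (r_shifts[i + 1] - r_shifts[i]))
    raise ValueError("field must lie in [0.0, 1.0]")
```

Stored field samples are returned exactly; a field between two samples yields an interpolated pair of movement amounts.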
  • the image shifting unit 138 shifts the information (for example, an image) of light received in the B channel for each field by the first movement amount for each field output from the LUT 136, and images the shifted result through the output terminal OUT2 Output to the synthesis unit 140.
  • the image shifting unit 138 shifts the information (for example, an image) of light received on the R channel by the second movement amount per field output from the LUT 136 for each field, and outputs the shifted result to the output terminal OUT2. Output to the image synthesis unit 140 through.
  • the image shifting unit 138 may receive the information of the light received on the B channel from the image sensor 120 through the input terminal IN1, and the information of the light received on the R channel from the image sensor 120 through the input terminal IN2. Alternatively, the image shifting unit 138 may receive, through the input terminal IN1 or IN2, light information including both the information of the light received on the B channel and the information of the light received on the R channel, and separate the information of the light received on the B channel and the information of the light received on the R channel from the received light information.
  • the image shifting unit 138 shifts the information of the light received through the B channel by -0.1, 0.5, 0.4, -0.2, -0.4, and 0.7 pixels in fields 0, 0.2, 0.4, 0.6, 0.8, and 1, respectively, and shifts the information of the light received through the R channel by 0, 0.4, 0.6, 0.6, 0.8, and 1.9 pixels in fields 0, 0.2, 0.4, 0.6, 0.8, and 1, respectively.
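The per-field shifting can be sketched as follows. The movement amounts are the example values quoted above; the radial-displacement model, the nearest-field selection, and all helper names are assumptions for illustration, not the patent's actual implementation:

```python
import math

FIELDS = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
B_SHIFT = [-0.1, 0.5, 0.4, -0.2, -0.4, 0.7]  # pixels, example values from the text

def nearest_shift(field):
    """Pick the movement amount of the closest sampled field."""
    i = min(range(len(FIELDS)), key=lambda k: abs(FIELDS[k] - field))
    return B_SHIFT[i]

def shift_pixel(x, y, cx, cy, max_r):
    """Displace one B-channel pixel radially by its per-field movement amount."""
    r = math.hypot(x - cx, y - cy)
    if r == 0:
        return x, y                    # the center pixel has no radial direction
    field = r / max_r                  # 0 at the optical center, 1 at the corner
    s = nearest_shift(field)
    ux, uy = (x - cx) / r, (y - cy) / r   # unit radial direction
    return x + s * ux, y + s * uy

# A corner pixel (field 1) moves outward by 0.7 px along the diagonal.
print(shift_pixel(100, 100, 50, 50, math.hypot(50, 50)))
```

A real implementation would interpolate between field samples and resample the channel image, but the bookkeeping per pixel is as above.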
  • FIG. 9 is a block diagram of another embodiment 130B of the image processing unit 130 shown in FIG. 1.
  • the image processing unit 130B may include a correction angle determination unit 133, a movement amount determination unit 135, and an image shifting unit 138.
  • in FIG. 9, the same reference numerals are used for the same components as in FIG. 7, and overlapping descriptions are omitted.
  • the correction angle determination unit 133 obtains a preset correction angle of the liquid lens 110A corresponding to the compensation angle output from the image stabilization unit 50 and outputs the correction angle to the movement amount determination unit 135.
  • the correction angle determination unit 133 may be implemented in the form of a look-up table 133A using a compensation angle as an address and a correction angle as data stored at each address.
  • the correction angle determining unit 133 may map a plurality of correction angles for each compensation angle in advance and store them in the form of a lookup table 133A.
  • the correction angle determination unit 133 may output the correction angle mapped to the compensation angle output from the image stabilization unit 50 to the movement amount determination unit 135.
  • the correction angle determined by the correction angle determination unit 133 may be output to the liquid lens driving unit (not shown) through the output terminal OUT3.
  • the liquid lens driving unit generates a driving voltage for driving the liquid lens 110A in response to the correction angle output from the correction angle determination unit 133, so that the inclination angle θ of the interface BO of the liquid lens 110A can be adjusted.
  • the movement amount determination unit 135 may determine, corresponding to the correction angle output from the correction angle determination unit 133, the preset first movement amount for each field on the image sensor 120 for the B channel and the preset second movement amount for each field on the image sensor 120 for the R channel, and may output the determined results to the image shifting unit 138.
  • the movement amount determination unit 135 may be implemented in the form of a look-up table 135A that uses correction angles as addresses and the first and second movement amounts for each field as the data stored at each address. To this end, the movement amount determination unit 135 may map a plurality of correction angles to first and second movement amounts for each field in advance and store them in the form of the look-up table 135A. In this case, the movement amount determination unit 135 may output the first and second movement amounts for each field mapped to the correction angle output from the correction angle determination unit 133 to the image shifting unit 138.
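The two look-up stages described here (compensation angle to correction angle in LUT 133A, then correction angle to per-field movement amounts in LUT 135A) can be emulated with plain dictionaries. The table contents below are hypothetical except for the example values quoted earlier; real tables would be calibrated for the particular lens design:

```python
# Hypothetical LUT contents. Keys: angles in degrees.
CORRECTION_ANGLE = {0.0: 0.0, 0.3: 0.42}   # emulates LUT 133A (compensation -> correction)

MOVEMENTS = {   # emulates LUT 135A: correction angle -> (first, second) per-field amounts
    0.0:  ([0.0] * 6, [0.0] * 6),
    0.42: ([-0.1, 0.5, 0.4, -0.2, -0.4, 0.7], [0.0, 0.4, 0.6, 0.6, 0.8, 1.9]),
}

def lookup_movements(compensation_angle):
    """Emulate units 133 and 135: two chained table look-ups."""
    correction = CORRECTION_ANGLE[compensation_angle]   # LUT 133A
    return MOVEMENTS[correction]                        # LUT 135A

first, second = lookup_movements(0.3)
print(first[5], second[5])   # movement amounts at field 1 -> 0.7 1.9
```

Splitting the mapping into two tables, as in the embodiment of FIG. 9, lets the correction angle also be reused to drive the liquid lens itself (output terminal OUT3).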
  • the image synthesis unit 140 synthesizes the information of the light received on the G channel, the information of the light received on the B channel and shifted, and the information of the light received on the R channel and shifted, and can output the synthesized result through the output terminal OUT1 as a final image with corrected chromatic aberration.
  • the image synthesis unit 140 may extract the information of the light received through the G channel from the information of the light output from the image sensor 120, receive the information of the light received and shifted on the B channel and the information of the light received and shifted on the R channel from the image processing unit 130, and synthesize the extracted G-channel information with the shifted B-channel and R-channel information.
  • the final image output through the output terminal OUT1 may be displayed on a display (not shown), but the embodiment is not limited thereto.
  • the above-described image processing unit 130 and image synthesis unit 140 may be implemented as separate components, or a single control unit may serve as both the image processing unit 130 and the image synthesis unit 140.
  • FIG. 10 is a flowchart illustrating an image processing method 200 according to an embodiment.
  • the image processing method 200 illustrated in FIG. 10 may be performed by the optical device 100 illustrated in FIG. 1, or by an optical device having a configuration different from that illustrated in FIG. 1.
  • hereinafter, for convenience of description, it is assumed that the image processing method 200 illustrated in FIG. 10 is performed by the optical device 100 illustrated in FIG. 1.
  • first, information on the degree of the user's hand shake is converted into a compensation angle (step 210).
  • Step 210 may be performed by the image stabilization unit 50 shown in FIGS. 7 and 9.
  • after step 210, a preset correction angle of the liquid lens 110A corresponding to the compensation angle is obtained (step 220).
  • the step 220 may be performed by the control unit 134 and the LUT 136 illustrated in FIG. 7, or may be performed by the correction angle determination unit 133 illustrated in FIG. 9.
  • after step 220, a preset movement amount corresponding to the correction angle and to field information on the image sensor 120 of at least one correction channel among the R, G, and B channels is determined (steps 230 and 240). Steps 230 and 240 may be performed by the control unit 134 and the LUT 136 illustrated in FIG. 7, or by the movement amount determination unit 135 illustrated in FIG. 9.
  • for example, the correction channels may be the R and B channels, and the G channel may be the reference channel.
  • for example, the preset first movement amount for each field on the image sensor 120 for the B channel corresponding to the correction angle may be determined (step 230).
  • in addition, the preset second movement amount for each field on the image sensor 120 for the R channel corresponding to the correction angle may be determined (step 240).
  • in FIG. 10, step 230 is illustrated as being performed before step 240, but the embodiment is not limited thereto. That is, step 230 may be performed after step 240, or steps 230 and 240 may be performed simultaneously.
  • Steps 250 and 260 may be performed by the image shifting unit 138 illustrated in FIGS. 7 and 9.
  • when the correction channels are the R and B channels and the reference channel is the G channel, the image information (e.g., an image) of the light received through the B channel may be shifted by the first movement amount for each field on the image sensor 120 (step 250).
  • in addition, the information (e.g., an image) of the light received through the R channel may be shifted by the second movement amount for each field on the image sensor 120 (step 260).
  • in FIG. 10, step 250 is illustrated as being performed before step 260, but the embodiment is not limited thereto. That is, step 250 may be performed after step 260, or steps 250 and 260 may be performed at the same time. Alternatively, step 250 may be performed immediately after step 230, and step 260 immediately after step 240.
  • steps 220 to 260 illustrated in FIG. 10 may be performed by the image processing unit 130.
  • after step 260, the information of the light received through the G channel (e.g., an image), the information of the light received through the R channel and shifted (e.g., an image), and the information of the light received through the B channel and shifted (e.g., an image) may be synthesized, and the synthesized result may be generated as a final image in which chromatic aberration is corrected (step 270).
  • Step 270 may be performed by the image synthesis unit 140 illustrated in FIG. 1.
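Steps 210 to 270 above can be strung together in a compact sketch. All table contents and helper names are hypothetical stand-ins for the units of FIGS. 7 and 9, and each channel "image" is reduced to the radial position of one feature at each sampled field:

```python
def convert_to_compensation_angle(hand_shake):          # step 210 (gyro output -> angle)
    return round(hand_shake, 1)

def correction_angle(comp):                             # step 220 (LUT look-up)
    return {0.0: 0.0, 0.3: 0.42}[comp]

def first_movements(corr):                              # step 230 (B channel, px per field)
    return {0.0: [0.0] * 6, 0.42: [-0.1, 0.5, 0.4, -0.2, -0.4, 0.7]}[corr]

def second_movements(corr):                             # step 240 (R channel, px per field)
    return {0.0: [0.0] * 6, 0.42: [0.0, 0.4, 0.6, 0.6, 0.8, 1.9]}[corr]

def shift_per_field(positions, movements):              # steps 250 and 260
    # positions: radial position (px) of one feature at each sampled field
    return [p + m for p, m in zip(positions, movements)]

def synthesize(r_pos, g_pos, b_pos):                    # step 270
    # residual misalignment of the shifted channels against the G reference
    return max(max(abs(r - g), abs(b - g)) for r, g, b in zip(r_pos, g_pos, b_pos))

def process_frame(hand_shake, r_pos, g_pos, b_pos):
    comp = convert_to_compensation_angle(hand_shake)    # step 210
    corr = correction_angle(comp)                       # step 220
    b_shifted = shift_per_field(b_pos, first_movements(corr))    # steps 230, 250
    r_shifted = shift_per_field(r_pos, second_movements(corr))   # steps 240, 260
    return synthesize(r_shifted, g_pos, b_shifted)      # step 270

# A frame whose R/B channels are displaced by exactly the tabulated amounts
# is realigned to the G reference (residual misalignment ~ 0).
G = [0.0, 20.0, 40.0, 60.0, 80.0, 100.0]
R = [g - m for g, m in zip(G, second_movements(0.42))]
B = [g - m for g, m in zip(G, first_movements(0.42))]
print(process_frame(0.3, R, G, B))
```

The point of the sketch is only the data flow: hand-shake information drives two table look-ups, the correction channels are shifted field by field, and the G channel passes through unshifted as the reference.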
  • hereinafter, the information of the light received through the G channel is referred to as a 'G-channel image', the information of the light received through the R channel as an 'R-channel image', and the information of the light received through the B channel as a 'B-channel image'.
  • FIG. 11 is a view for explaining chromatic aberration correction according to an embodiment.
  • an image in which chromatic aberration is suppressed can be obtained by enlarging or reducing each of the R-channel, G-channel, and B-channel images of an image having chromatic aberration by the amount of chromatic aberration of the lens.
  • the image IM1 output from the image sensor 120 has chromatic aberration.
  • the chromatic aberration may mean a difference in size of an image for each wavelength.
  • based on the G-channel image (GIM), suppose the first size difference, obtained by subtracting the size of the G-channel image from the size of the R-channel image (RIM), is 'α', and the second size difference, obtained by subtracting the size of the G-channel image from the size of the B-channel image (BIM), is 'β'.
  • when the result of subtracting the first size difference (α) from the size of the R-channel image (RIM), the result of subtracting the second size difference (β) from the size of the B-channel image (BIM), and the G-channel image (GIM) are synthesized, an image IM2 with corrected (or suppressed) chromatic aberration may be obtained.
  • here, among the image-size differences of the R channel, the G channel, and the B channel, the largest value corresponds to the chromatic aberration.
  • a chromatic aberration of 10 ⁇ m means that the largest size difference among image size differences of the R channel, the G channel, and the B channel is 10 ⁇ m.
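The size bookkeeping of FIG. 11 is simple arithmetic and can be checked directly. The image sizes below are hypothetical values chosen only to make the example come out to the 10 μm-style figure quoted above (here in pixels):

```python
# Hypothetical image heights (in pixels) for one aberrated frame.
r_size, g_size, b_size = 1004.0, 1000.0, 994.0

alpha = r_size - g_size    # first size difference (R - G)
beta = b_size - g_size     # second size difference (B - G)

# Subtracting alpha from R and beta from B aligns both channels with the
# G reference, suppressing the lateral chromatic aberration.
r_corrected = r_size - alpha
b_corrected = b_size - beta
print(r_corrected, b_corrected)          # -> 1000.0 1000.0

# The chromatic aberration is the largest pairwise size difference.
chromatic_aberration = max(abs(alpha), abs(beta), abs(r_size - b_size))
print(chromatic_aberration)              # -> 10.0
```

Note that α and β can have opposite signs (R magnified, B shrunk, or vice versa), which is why the correction must be applied per channel rather than as a single global rescale.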
  • in the image generated from the optical signal passing through the lens unit 110 while the interface BO of the liquid lens 110A is tilted, chromatic aberration appears such that the R-channel, G-channel, and B-channel images not only differ in size but also appear shifted relative to one another.
  • for example, when the G channel is the reference channel among the R, G, and B channels, the R-channel image (RIM) and the B-channel image are shifted relative to the G-channel image (GIM), and chromatic aberration appears. For this reason, the optical device 100 and the image processing method 200 according to the embodiment may correct chromatic aberration by shifting the image of each correction channel, for example the B channel and the R channel, according to the image stabilization information.
  • FIG. 12 is a diagram for describing an image in which chromatic aberration is corrected by the optical device 100 and the image processing method 200 according to the embodiment.
  • in FIG. 12, 0.9f to -0.9f denote fields, and 0 degrees and 0.6 degrees denote compensation angles.
  • the compensation angle of '0 degrees' means a steady state without shaking; as illustrated in FIG. 2, the interface BO is not tilted.
  • the compensation angle of '0.6 degrees' may mean that the shaking is at its maximum; as illustrated in FIG. 2, the inclination angle θ of the interface BO is 0.6 degrees.
  • referring to FIG. 12, it can be seen that an image with corrected chromatic aberration is obtained by shifting the R-channel image and the B-channel image by the preset movement amount for each field according to the compensation angle obtained from the image stabilization information, and then synthesizing them.
  • the optical device may include a device capable of processing or analyzing an optical signal.
  • the optical device may include a camera/video device, a telescope device, a microscope device, an interferometer device, a photometer device, a polarimeter device, a spectrometer device, a reflectometer device, an autocollimator device, and a lens meter device.
  • the optical device may be implemented as a portable device such as a smart phone, a notebook computer, and a tablet computer.
  • such an optical device may include a display unit (not shown) for outputting an image, a battery (not shown) for supplying power to each unit, and a main body housing in which each unit and the battery are mounted.
  • the optical device may further include a communication module capable of communicating with other devices and a memory unit capable of storing data.
  • the communication module and the memory unit may also be mounted in the body housing.
  • according to another embodiment, a recording medium records a program for executing the image processing method performed in an optical device including a lens unit including a liquid lens, an image sensor that receives information of light passing through the lens unit, and an image processing unit that corrects the information of the light. The program may implement a function of converting information on the degree of the user's hand shake into a compensation angle, a function of obtaining a preset correction angle of the liquid lens corresponding to the compensation angle, and a function of determining a preset movement amount corresponding to the correction angle for each field of at least one correction channel among the R channel receiving the red wavelength range, the G channel receiving the green wavelength range, and the B channel receiving the blue wavelength range.
  • the movement-amount determination function implemented by the program recorded on the computer-readable recording medium may include a function of determining the preset first movement amount for each field on the image sensor for the B channel corresponding to the correction angle, and a function of determining the preset second movement amount for each field on the image sensor for the R channel corresponding to the correction angle.
  • the shifting function implemented by the program recorded on the computer-readable recording medium may include a function of shifting the information of the light received on the B channel by the first movement amount for each field on the image sensor, and a function of shifting the information of the light received on the R channel by the second movement amount for each field on the image sensor.
  • the computer-readable recording medium may further record a program implementing a function of synthesizing the information of the light received on the G channel, the information of the light received on the R channel and shifted, and the information of the light received on the B channel and shifted to generate a corrected final image.
  • the computer-readable recording medium includes any kind of storage device that stores data that can be read by a computer system.
  • Examples of computer-readable recording media include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices.
  • the computer-readable recording medium can also be distributed over network-coupled computer systems so that the computer-readable code is stored and executed in a distributed fashion.
  • functional programs, codes, and code segments for implementing the image processing method can be easily inferred by programmers in the technical field to which the present invention pertains.
  • the optical device including a liquid lens and having a chromatic aberration correction function and the image processing method according to the embodiments can be used in camera/video devices, telescope devices, microscope devices, interferometer devices, photometer devices, polarimeter devices, spectrometer devices, reflectometer devices, autocollimator devices, lens meter devices, smartphones, notebook computers, tablet computers, and the like.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computing Systems (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

An optical device of an embodiment comprises: a lens unit including a liquid lens; an image sensor that receives information of light passing through the lens unit via an R channel that receives a red wavelength range, a G channel that receives a green wavelength range, and a B channel that receives a blue wavelength range; and an image processing unit that shifts information of light, received through at least one correction channel from among the R channel, the G channel, and the B channel, by a preset movement amount corresponding to driving information of the liquid lens and field information on the image sensor of the correction channel.

Description

Optical device including a liquid lens and having a chromatic aberration correction function, and image processing method
An embodiment relates to an optical device including a liquid lens and having a chromatic aberration correction function, and to an image processing method.
In a conventional optical device, a camera lens composed of solid lenses is designed so that chromatic aberration is suppressed as much as possible, so no separate chromatic aberration correction is required in the image processing step after an image is captured. Alternatively, if necessary, the user can remove chromatic aberration directly through post-correction; because the amount of chromatic aberration is very small and the aberration appears symmetrically up/down/left/right, it can be corrected easily.
However, in the case of a camera lens including a liquid lens, chromatic aberration shows the same behavior as a solid-lens camera lens during ordinary shooting, that is, in the steady state in which an auto-focusing (AF) function or the like is performed. On the other hand, when a camera lens including a liquid lens performs a hand-shake correction or optical image stabilizer (OIS) function, large chromatic aberration occurs; although the amount depends on the OIS driving frequency, it increases roughly five to eight times compared to when the OIS function is not performed. Here, the OIS driving frequency means the frequency of the hand shake of the user using the camera lens.
In addition, unlike the conventional approach that performs the OIS function with a voice coil motor (VCM) translating in the x-axis and y-axis directions on the x-y plane perpendicular to the optical axis, a camera lens including a liquid lens uses tilting of the interface between two liquids, so the chromatic aberration also takes an asymmetric up/down/left/right form. As a result, there is a problem that the quality of the captured image is significantly degraded.
An embodiment provides an optical device including a liquid lens and having a chromatic aberration correction function, and an image processing method, which can correct chromatic aberration to provide improved image quality.
An optical device according to an embodiment may include: a lens unit including a liquid lens; an image sensor that receives information of light passing through the lens unit through an R channel receiving a red wavelength range, a G channel receiving a green wavelength range, and a B channel receiving a blue wavelength range; and an image processing unit that shifts information of light received through at least one correction channel among the R, G, and B channels by a preset movement amount corresponding to driving information of the liquid lens and field information on the image sensor of the correction channel.
For example, among the R, G, and B channels, the correction channels may be the B channel and the R channel, and the G channel may be a reference channel.
For example, the image processing unit may include: a look-up table that maps and stores a correction angle of the liquid lens for each of a plurality of compensation angles, and maps and stores, for each of the plurality of correction angles, a first movement amount for each field on the image sensor for the B channel and a second movement amount for each field on the image sensor for the R channel; a control unit that causes the look-up table to output the first movement amount for each field of the B channel and the second movement amount for each field of the R channel corresponding to the compensation angle corresponding to the driving information of the liquid lens; and an image shifting unit that shifts the information of the light received on the B channel, field by field, by the first movement amount output from the look-up table, and shifts the information of the light received on the R channel, field by field, by the second movement amount output from the look-up table.
For example, the image processing unit may include: a correction angle determination unit that obtains a preset correction angle of the liquid lens corresponding to a compensation angle corresponding to the driving information of the liquid lens; a movement amount determination unit that determines the preset first movement amount for each field on the image sensor for the B channel and the preset second movement amount for each field on the image sensor for the R channel, corresponding to the correction angle; and an image shifting unit that shifts the information of the light received on the B channel by the first movement amount for each field and shifts the information of the light received on the R channel by the second movement amount for each field.
For example, the optical device may further include an image synthesis unit that generates a corrected final image by synthesizing the information of the light received on the G channel, the information of the light received on the B channel and shifted, and the information of the light received on the R channel and shifted.
For example, the optical device may further include a hand-shake detection unit that converts the user's hand-shake information, corresponding to the driving information of the liquid lens, into a compensation angle.
For example, the hand-shake detection unit may include a gyro sensor that senses the degree of the user's hand shake and converts the driving information of the liquid lens for the sensed degree of hand shake into a compensation angle.
According to another embodiment, an image processing method performed in an optical device including a lens unit including a liquid lens, an image sensor that receives information of light passing through the lens unit, and an image processing unit that corrects the information of the light, may include: converting driving information of the liquid lens into a compensation angle; obtaining a preset correction angle of the liquid lens corresponding to the compensation angle; determining a preset movement amount corresponding to the correction angle and to field information on the image sensor of at least one correction channel among an R channel receiving a red wavelength range, a G channel receiving a green wavelength range, and a B channel receiving a blue wavelength range; and shifting the information of the light received through the correction channel by the determined movement amount for each field on the image sensor.
For example, among the R, G, and B channels, the correction channels may be the R channel and the B channel, and the G channel may be a reference channel.
For example, determining the movement amount may include: determining the preset first movement amount for each field on the image sensor for the B channel corresponding to the correction angle; and determining the preset second movement amount for each field on the image sensor for the R channel corresponding to the correction angle.
For example, the shifting may include: shifting the information of the light received on the B channel by the first movement amount for each field on the image sensor; and shifting the information of the light received on the R channel by the second movement amount for each field on the image sensor.
For example, the image processing method may further include generating a corrected final image by synthesizing the information of the light received on the G channel, the information of the light received on the R channel and shifted, and the information of the light received on the B channel and shifted.
The optical device including a liquid lens and having a chromatic aberration correction function and the image processing method according to the embodiments shift the information of the light received through the correction channels by a preset movement amount for each field according to the degree of the user's hand shake, thereby correcting chromatic aberration and providing improved image quality.
In addition, the effects obtainable in the present embodiments are not limited to the effects mentioned above, and other effects not mentioned will be clearly understood by those of ordinary skill in the art from the description below.
FIG. 1 is a block diagram of an optical device according to an embodiment.
FIG. 2 is a cross-sectional view of the liquid lens unit including the liquid lens illustrated in FIG. 1.
FIG. 3 shows an implementation example according to an embodiment of the optical device shown in FIG. 1.
FIG. 4 shows a planar shape according to an embodiment of the image sensor shown in FIG. 1.
FIG. 5 shows the relationship between the amount of chromatic aberration and the field of the image sensor when the interface of the liquid lens is not tilted.
FIG. 6 shows the relationship between the amount of aberration and the field of the image sensor when the interface of the liquid lens is tilted.
FIG. 7 is a block diagram of an embodiment of the image processing unit shown in FIG. 1.
FIG. 8 shows an example of data stored in the LUT.
FIG. 9 is a block diagram of another embodiment of the image processing unit shown in FIG. 1.
FIG. 10 is a flowchart illustrating an image processing method according to an embodiment.
FIG. 11 is a diagram for explaining chromatic aberration correction according to an embodiment.
FIG. 12 is a diagram for describing an image in which chromatic aberration is corrected by the optical device and the image processing method according to an embodiment.
Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings.
However, the technical spirit of the present invention is not limited to some of the described embodiments and may be implemented in various different forms; within the scope of the technical spirit of the present invention, one or more of the components of the embodiments may be selectively combined or substituted.
In addition, terms used in the embodiments of the present invention (including technical and scientific terms), unless explicitly defined and described otherwise, may be interpreted with meanings generally understood by those of ordinary skill in the art to which the present invention pertains, and commonly used terms, such as terms defined in dictionaries, may be interpreted in consideration of their contextual meaning in the related art.
In addition, the terms used in the embodiments of the present invention are for describing the embodiments and are not intended to limit the present invention. In this specification, the singular form may also include the plural form unless otherwise specified in the phrase, and a description of "at least one (or one or more) of A, B, and C" may include one or more of all combinations of A, B, and C.
In addition, in describing the components of the embodiments of the present invention, terms such as first, second, A, B, (a), and (b) may be used. These terms are only for distinguishing a component from other components, and the nature, sequence, or order of the component is not limited by the term.
And, when a component is described as being 'connected', 'coupled', or 'joined' to another component, this includes not only the case where the component is directly connected, coupled, or joined to the other component, but also the case where it is 'connected', 'coupled', or 'joined' through yet another component between the component and the other component.
In addition, when a component is described as being formed or disposed 'on (above) or under (below)' another component, this includes not only the case where the two components are in direct contact with each other, but also the case where one or more other components are formed or disposed between the two components. In addition, the expression 'on (above) or under (below)' may include the meaning of the downward direction as well as the upward direction with respect to one component.
The variable lens may be a variable-focus lens, that is, a lens whose focus can be adjusted. The variable lens may be at least one of a liquid lens, a polymer lens, a liquid-crystal lens, a VCM (voice coil motor) type, and an SMA (shape memory alloy) type. Liquid lenses include liquid lenses containing one liquid and liquid lenses containing two liquids. A liquid lens containing one liquid may change its focus by deforming a membrane disposed at a position corresponding to the liquid, for example by pressing the membrane through the electromagnetic force between a magnet and a coil. A liquid lens containing two liquids, a conductive liquid and a non-conductive liquid, may adjust the interface formed between the conductive and non-conductive liquids using a voltage applied to the liquid lens. A polymer lens may change focus by driving a polymer material with an actuator such as a piezo element. A liquid-crystal lens may change focus by controlling liquid crystals with an electromagnetic force. The VCM type may change focus by moving a solid lens, or a lens assembly including a solid lens, through the electromagnetic force between a magnet and a coil. The SMA type may change focus by controlling a solid lens, or a lens assembly including a solid lens, using a shape memory alloy.
Hereinafter, the optical device having a chromatic aberration correction function according to the embodiments is described as including a liquid lens as the variable lens, and the image processing method according to the embodiments is described as being performed in an optical device including a liquid lens as the variable lens; however, the embodiments are not limited thereto. That is, the following description of the optical device is equally applicable to an optical device including a variable lens other than a liquid lens, and the following description of the image processing method is equally applicable to an image processing method performed in an optical device including a variable lens other than a liquid lens.
Hereinafter, an optical device including a liquid lens and having a chromatic aberration correction function according to an embodiment is described with reference to the accompanying drawings.
FIG. 1 is a block diagram of an optical device 100 according to an embodiment, which may include a lens unit (or lens assembly) 110, an image sensor 120, an image processing unit 130 and an image synthesis unit 140.
The lens unit 110 may include a liquid lens 110A.
A typical liquid lens 110A is described with reference to the accompanying FIG. 2, but the embodiments are not limited to a liquid lens 110A having a specific structure.
FIG. 2 is a cross-sectional view of a liquid lens unit including the liquid lens 110A shown in FIG. 1.
The liquid lens unit illustrated in FIG. 2 may include the liquid lens 110A and first and second connecting substrates 116 and 118.
The liquid lens 110A may include a plurality of liquids LQ1 and LQ2 of different types, first to third plates P1, P2 and P3, first and second electrodes E1 and E2, and an insulating layer 119.
The liquids LQ1 and LQ2 are accommodated in a cavity CA and include a first liquid LQ1 having conductivity and a second liquid (or insulating liquid) LQ2 that is non-conductive and may be implemented with a non-conductive material such as oil. The first liquid LQ1 and the second liquid LQ2 do not mix with each other, and an interface BO may be formed where the first and second liquids LQ1 and LQ2 meet. The first liquid LQ1 may be disposed on the second liquid LQ2, or the second liquid LQ2 may be disposed on the first liquid LQ1.
The inner surface of the first plate P1 may form the sidewall i of the cavity CA. The first plate P1 may include upper and lower openings having a predetermined inclined surface. That is, the cavity CA may be defined as the region surrounded by the inclined surface of the first plate P1, a first opening in contact with the second plate P2, and a second opening in contact with the third plate P3.
The diameter of the wider of the first and second openings may vary depending on the field of view (FOV) required of the liquid lens 110A or the role the liquid lens 110A is to perform. The interface BO formed by the two liquids may move along the inclined surface of the cavity CA in response to a driving voltage.
The first liquid LQ1 and the second liquid LQ2 may be filled into, accommodated in, or disposed in the cavity CA of the first plate P1. The cavity CA is the region through which light passes. Accordingly, the first plate P1 may be made of a transparent material, or may contain impurities so that light does not readily pass through it.
The first and second electrodes E1 and E2 may be disposed on one surface and the other surface of the first plate P1, respectively. The plurality of first electrodes E1 are spaced apart from the second electrode E2 and may be disposed on the upper surface, the side surface and the lower surface of the first plate P1. The second electrode E2 may be disposed on at least a portion of the lower surface of the first plate P1 and may be in direct contact with the first liquid LQ1.
As described above, the plurality of first electrodes E1 may correspond to individual electrodes that can be electrically separated from one another, and the plurality of second electrodes E2 may correspond to a common electrode that may not be electrically separated. A portion of the second electrode E2 disposed on the other surface of the first plate P1 may be exposed to the conductive first liquid LQ1.
In addition, the second plate P2 may be disposed on the upper surface of the first electrode E1 and over the cavity CA. The third plate P3 may be disposed under the lower surface of the second electrode E2 and under the cavity CA. The second plate P2 and the third plate P3 may be disposed to face each other with the first plate P1 interposed between them. At least one of the second plate P2 and the third plate P3 may be omitted.
Each of the second and third plates P2 and P3 is a region through which light passes and may be made of a light-transmitting material. For example, each of the second and third plates P2 and P3 may be made of glass and, for convenience of processing, may be formed of the same material.
The second plate P2 may be configured to allow incident light to travel into the cavity CA of the first plate P1. The third plate P3 may be configured to allow the light that has passed through the cavity CA of the first plate P1 to exit. The third plate P3 may be in direct contact with the first liquid LQ1.
The insulating layer 119 may be disposed so as to cover part of the lower surface of the second plate P2 in the upper region of the cavity CA. That is, the insulating layer 119 may be disposed between the second liquid LQ2 and the second plate P2. The insulating layer 119 may also be disposed so as to cover at least part of the first electrode E1 forming the sidewall of the cavity CA. In addition, on the lower surface of the first plate P1, the insulating layer 119 may be disposed so as to cover part of the second electrode E2 as well as the first plate P1 and the first electrode E1. As a result, contact between the first electrode E1 and the first liquid LQ1 and contact between the first electrode E1 and the second liquid LQ2 may be blocked by the insulating layer 119.
The insulating layer 119 may cover one of the first and second electrodes E1 and E2 (for example, the first electrode E1) and expose a portion of the other electrode (for example, the second electrode E2) so that electrical energy is applied to the conductive first liquid LQ1.
The first connecting substrate 116 and the second connecting substrate 118 serve to supply voltage to the liquid lens 110A. To this end, the first electrode E1 may be electrically connected to the first connecting substrate 116, and the second electrode E2 may be electrically connected to the second connecting substrate 118.
When a driving voltage is applied to the first and second electrodes E1 and E2 through the first connecting substrate 116 and the second connecting substrate 118, the interface BO between the first liquid LQ1 and the second liquid LQ2 deforms, so that at least one of the shape, such as the curvature, and the focal length of the liquid lens 110A may be changed (or adjusted). For example, the focal length of the liquid lens 110A may be adjusted as at least one of the curvature and the inclination of the interface BO formed in the liquid lens 110A changes in response to the driving voltage. By controlling the deformation, radius of curvature, tilting angle and so on of the interface BO, an optical device including the liquid lens 110A can perform an auto-focusing (AF) function or a hand-shake compensation or optical image stabilization (OIS) function.
FIG. 3 shows an implementation example of the optical device 100 shown in FIG. 1 according to an embodiment.
The lens unit 110 of the optical device 100 illustrated in FIG. 3 may include a first lens unit 150, the liquid lens 110A and a second lens unit 160. The optical device 100 may further include a filter 170 and the image sensor 120.
Since the liquid lens 110A and the image sensor 120 shown in FIG. 3 correspond to the liquid lens 110A and the image sensor 120 shown in FIG. 1, respectively, the same reference numerals are used and redundant description is omitted. Accordingly, the imaging surface 120S illustrated in FIG. 3 may correspond to the imaging surface of the image sensor 120 illustrated in FIG. 1.
Light incident on the first lens unit 150 from the outside may pass through the liquid lens 110A and enter the second lens unit 160. Unlike the liquid lens 110A, each of the first lens unit 150 and the second lens unit 160 is a solid lens and may be implemented in glass or plastic, but the embodiments are not limited to any specific material of the first lens unit 150 and the second lens unit 160. At least one of the first lens unit 150 and the second lens unit 160 may be omitted. In addition, although FIG. 3 illustrates the liquid lens 110A disposed between the first lens unit 150 and the second lens unit 160, the liquid lens 110A may instead be disposed to the left of the first lens unit 150 or to the right of the second lens unit 160.
For example, as illustrated in FIG. 3, the first lens unit 150 may include two lenses 152 and 154, but may include fewer or more than two lenses. Likewise, as illustrated in FIG. 3, the second lens unit 160 may include three lenses 162, 164 and 166, but may include fewer or more than three lenses.
The filter 170 may filter light in a specific wavelength range out of the light that has passed through the first lens unit 150, the liquid lens 110A and the second lens unit 160. The filter 170 may be an infrared (IR) cut filter or an ultraviolet (UV) cut filter, but the embodiments are not limited thereto. The filter 170 may be disposed between the lens unit 110 and the imaging surface 120S of the image sensor 120. In some cases, the filter 170 may be omitted.
Referring again to FIG. 1, the image sensor 120 converts the light that has passed through the lens unit 110 into an image and outputs the converted image to the image processing unit 130. That is, the image sensor 120 may receive information on the light that has passed through the lens unit 110 through an R channel receiving the red wavelength band, a G channel receiving the green wavelength band, and a B channel receiving the blue wavelength band.
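The three channels can be pictured as three co-registered planes of the same scene. A minimal sketch under the assumption that a demosaiced H x W x 3 array (nested lists of [R, G, B] triples) is available; the helper name `split_channels` is hypothetical, and a real sensor would deliver a Bayer mosaic that must be demosaiced first:

```python
def split_channels(rgb):
    # rgb is a nested list of [R, G, B] pixel triples; return the three
    # channel planes as separate nested lists, i.e. the per-channel
    # light information referred to above.
    r = [[px[0] for px in row] for row in rgb]
    g = [[px[1] for px in row] for row in rgb]
    b = [[px[2] for px in row] for row in rgb]
    return r, g, b

# Example: a 2 x 2 image whose red plane is uniformly 10.
img = [[[10, 0, 0], [10, 0, 0]],
       [[10, 0, 0], [10, 0, 0]]]
r, g, b = split_channels(img)
```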
The image processing unit 130 serves to correct the chromatic aberration of the image output from the image sensor 120. Here, "chromatic aberration" may mean the degree to which the information (or image) of the light received through a correction channel is shifted relative to the information (or image) of the light received through a reference channel. The meaning of chromatic aberration is described in detail below.
The "correction channel" means the channel to be corrected among the R, G and B channels, and the "reference channel" means the channel that serves as the reference for correcting the correction channel among the R, G and B channels. That is, the degree to which the image received through the correction channel is shifted relative to the image received through the reference channel corresponds to the chromatic aberration. In the following description, the correction channels are the B channel and the R channel and the reference channel is the G channel, but the embodiments are not limited thereto.
The image processing unit 130 may correct the chromatic aberration by shifting the information of the light received through at least one correction channel among the R, G and B channels by a preset move amount corresponding to the driving information of the liquid lens 110A and the field information of the correction channel on the image sensor 120. Here, the driving information of the liquid lens 110A may be information on the degree of the user's hand shake (hereinafter referred to as "hand-shake information"). In this case, the image processing unit 130 shifts the information (for example, an image) of the light received through the correction channel by a move amount determined for each field.
FIG. 4 shows the planar shape of an embodiment 120A of the image sensor 120 shown in FIG. 1; the horizontal axis represents the position in the x-axis direction and the vertical axis represents the position in the y-axis direction perpendicular to the x-axis.
The aforementioned "field" indicates a position on the imaging surface 120S of the image sensor 120 relative to the optical axis. That is, the field indicates the relative position on the image sensor 120 in the radial direction from the center of the image sensor 120. Referring to FIG. 4, when the z-axis perpendicular to both the x-axis and the y-axis is the optical axis, the field at the center of the image sensor 120 is 0.0f, the field at points located radially at 50% of the full extent of the image sensor 120 from the center is ±0.5f, and the field at the edge of the image sensor 120 is ±1.0f.
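The normalized field coordinate of a pixel can be sketched as its radial distance from the optical center divided by the half-diagonal of the sensor. This is a minimal sketch; the function name and the choice of the half-diagonal as the 1.0f radius are assumptions for illustration, not taken from the text:

```python
import math

def field_position(x, y, width, height):
    # Normalized field of pixel (x, y): 0.0 at the optical center,
    # 1.0 at the sensor corner (half-diagonal used as the full field).
    cx, cy = (width - 1) / 2.0, (height - 1) / 2.0
    return math.hypot(x - cx, y - cy) / math.hypot(cx, cy)

# On a 101 x 101 sensor: center pixel -> 0.0f, corner -> 1.0f,
# a point halfway out along the diagonal -> 0.5f.
```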
FIG. 5 shows the relationship between the amount of chromatic aberration (hereinafter referred to as the "aberration amount") and the field of the image sensor 120 when the interface BO of the liquid lens 110A is not tilted, and FIG. 6 shows the relationship between the aberration amount and the field of the image sensor 120 when the interface BO of the liquid lens 110A is tilted. In each graph, the horizontal axis represents the aberration amount and the vertical axis represents the physical position on the image sensor 120; the value obtained by normalizing this physical position corresponds to the field. In each figure, "BR" denotes the amount of chromatic aberration between the B channel and the R channel, and "BG" denotes the amount of chromatic aberration between the B channel and the G channel.
In each of FIGS. 5 and 6, the sign of the field is positive (+) above the axis representing the aberration amount and negative (-) below it.
In the general case, that is, when the interface BO is not tilted, the camera lens is rotationally symmetric about the optical axis, so the sign (i.e., direction) of the field is meaningless. For example, referring to FIG. 5, the aberration amounts of the fields with positive (+) sign and the fields with negative (-) sign are symmetric with respect to the optical axis (i.e., the center of the image sensor 120), so the sign, that is, the direction, of the field is not taken into account.
However, when the interface BO of the liquid lens 110A shown in FIG. 2 is tilted, the camera lens is no longer rotationally symmetric about the optical axis, so the sign (i.e., direction) of the field must be considered. For example, referring to FIG. 6, the aberration amounts of the fields with positive (+) sign and the fields with negative (-) sign are asymmetric with respect to the optical axis (i.e., the center of the image sensor 120), so the sign, that is, the direction, of the field must be taken into account.
Meanwhile, the per-field move amounts needed by the image processing unit 130 to correct the chromatic aberration may be determined in advance for each piece of driving information (for example, hand-shake information) of the liquid lens 110A. This is described below with reference to FIGS. 7 to 9.
FIG. 7 is a block diagram of an embodiment 130A of the image processing unit 130 shown in FIG. 1.
The image processing unit 130A may include a control unit 134, a look-up table (LUT) 136 and an image shifting unit 138.
The LUT 136 may map a correction angle of the liquid lens 110A to each of a plurality of compensation angles and store these mappings in advance, and may map, to each of the plurality of correction angles, a first move amount for each field of the B channel on the image sensor 120 and a second move amount for each field of the R channel on the image sensor 120 and store these mappings in advance.
The correction angle stored in the LUT 136 may be replaced by the driving voltage that is applied to the liquid lens 110A to adjust the interface of the liquid lens 110A.
FIG. 8 shows an example of the data stored in the LUT 136.
For example, the LUT 136 shown in FIG. 8 stores compensation angles mapped to correction angles, and correction angles mapped to the first and second move amounts for each field. The unit of each of the first and second move amounts shown in FIG. 8 is a number of pixels. Although FIG. 8 shows only fields with positive (+) sign stored in the LUT 136, fields with negative (-) sign may also be stored in the LUT 136.
Referring again to FIG. 7, the control unit 134 may cause the look-up table 136 to output the first move amount for each field of the B channel corresponding to the compensation angle that corresponds to the hand-shake information, and the second move amount for each field of the R channel corresponding to that compensation angle. That is, using the compensation angle corresponding to the hand-shake information as the address of the LUT 136, the control unit 134 may cause the first and second move amounts for each field of the B and R channels stored at the corresponding address of the LUT 136 to be output to the image shifting unit 138.
The optical device according to the embodiment may further include a hand-shake detection unit 50. The hand-shake detection unit 50 may convert the hand-shake information corresponding to the driving information of the liquid lens 110A into a compensation angle and output the converted compensation angle to the control unit 134. To this end, the hand-shake detection unit 50 may include a gyro sensor 52. The gyro sensor 52 may sense the degree of the user's hand shake, convert the hand-shake information on the sensed degree of hand shake into a compensation angle, and output the converted result to the control unit 134; however, the embodiments are not limited to any specific implementation of the hand-shake detection unit 50.
For example, referring to FIG. 8, when the compensation angle is 0, there are no first and second move amounts for any field. When the compensation angle is 0.3 degrees, however, the correction angle is 0.42, the first move amounts at fields 0, 0.2, 0.4, 0.6, 0.8 and 1 are -0.1, 0.5, 0.4, -0.2, -0.4 and 0.7, respectively, and the second move amounts at fields 0, 0.2, 0.4, 0.6, 0.8 and 1 are 0, 0.4, 0.6, 0.6, 0.8 and 1.9, respectively.
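These example rows can be sketched as a small dictionary addressed by the compensation angle, in the manner the control unit is described as using the LUT. The data-structure layout is a hypothetical illustration; only the numeric values come from the FIG. 8 example:

```python
# Compensation angle (degrees) -> (correction angle,
# per-field first move amounts for B, per-field second move amounts
# for R), in pixels, at fields 0, 0.2, 0.4, 0.6, 0.8 and 1.
LUT = {
    0.0: (0.0, [0.0] * 6, [0.0] * 6),
    0.3: (0.42,
          [-0.1, 0.5, 0.4, -0.2, -0.4, 0.7],   # B channel (first)
          [0.0, 0.4, 0.6, 0.6, 0.8, 1.9]),     # R channel (second)
}

def lookup(compensation_angle):
    # The compensation angle itself serves as the table address.
    return LUT[compensation_angle]

correction_angle, b_shifts, r_shifts = lookup(0.3)
```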
In FIG. 8, the spacing between fields is 0.2, but this is merely an example; the spacing may be larger or smaller than 0.2. The smaller the spacing between fields, the more finely the move amounts are subdivided, so the chromatic aberration can be corrected more precisely.
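One way a coarse table can still yield a smooth per-pixel correction is to interpolate the tabulated move amounts between field samples. This is a sketch under that assumption; the text itself only states that finer field spacing improves precision, so linear interpolation is an illustrative choice, not the described method:

```python
def shift_at_field(field, fields, shifts):
    # Linearly interpolate the tabulated per-field move amounts.
    # `fields` must be ascending; `field` is clamped to its range.
    f = min(max(abs(field), fields[0]), fields[-1])
    for (f0, s0), (f1, s1) in zip(zip(fields, shifts),
                                  zip(fields[1:], shifts[1:])):
        if f0 <= f <= f1:
            t = 0.0 if f1 == f0 else (f - f0) / (f1 - f0)
            return s0 + t * (s1 - s0)
    return shifts[-1]

# R-channel move amounts from the 0.3-degree example above.
FIELDS = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
R_SHIFTS = [0.0, 0.4, 0.6, 0.6, 0.8, 1.9]
```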
The image shifting unit 138 shifts the information (for example, an image) of the light received through the B channel, field by field, by the first move amount for each field output from the LUT 136, and outputs the shifted result to the image synthesis unit 140 through the output terminal OUT2. The image shifting unit 138 also shifts the information (for example, an image) of the light received through the R channel, field by field, by the second move amount for each field output from the LUT 136, and outputs the shifted result to the image synthesis unit 140 through the output terminal OUT2. To this end, the image shifting unit 138 may receive the information of the light received through the B channel from the image sensor 120 through the input terminal IN1, and the information of the light received through the R channel from the image sensor 120 through the input terminal IN2. Alternatively, the image shifting unit 138 may receive, through the input terminal IN1 or IN2, light information containing both the information of the light received through the B channel and the information of the light received through the R channel from the image sensor 120, and may separate the B-channel light information and the R-channel light information from the received light information.
예를 들어, 이미지 쉬프팅부(138)는 전술한 바와 같이 보상 각도가 0.3도일 경우, B채널로 수신된 광의 정보를 필드 0, 0.2, 0.4, 0.6, 0.8 및 1에서 각각 -0.1, 0.5, 0.4, -0.2, -0.4 및 0.7 픽셀 수만큼 쉬프트시키고, R채널로 수신된 광의 정보를 필드 0, 0.2, 0.4, 0.6, 0.8 및 1에서 각각 0, 0.4, 0.6, 0.6, 0.8 및 1.9 픽셀 수만큼 쉬프트시킨다.For example, when the compensation angle is 0.3 degrees as described above, the image shifting unit 138 displays information of the light received through the B channel in fields 0, 0.2, 0.4, 0.6, 0.8, and 1, respectively, -0.1, 0.5, and 0.4. , -0.2, -0.4 and 0.7 pixels, and the information of the light received through the R channel is 0, 0.4, 0.6, 0.6, 0.8 and 1.9 pixels in fields 0, 0.2, 0.4, 0.6, 0.8 and 1, respectively. Shift it.
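As a rough sketch, the per-field shift amounts from the example above can be tabulated and interpolated for intermediate field positions. The `shift_for_field` helper and the use of linear interpolation between field samples are assumptions for illustration, not part of the disclosure:

```python
import numpy as np

# Per-field shift amounts in pixels for a compensation angle of 0.3 degrees,
# taken from the example above (fields 0, 0.2, 0.4, 0.6, 0.8, 1).
FIELDS = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
B_SHIFTS = [-0.1, 0.5, 0.4, -0.2, -0.4, 0.7]  # first movement amounts (B channel)
R_SHIFTS = [0.0, 0.4, 0.6, 0.6, 0.8, 1.9]     # second movement amounts (R channel)

def shift_for_field(field, fields=FIELDS, shifts=B_SHIFTS):
    """Linearly interpolate a shift amount for an arbitrary field position."""
    return float(np.interp(field, fields, shifts))
```

A finer field grid (spacing smaller than 0.2) would simply add more rows to the table, which is why a smaller field spacing yields more precise correction.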
FIG. 9 is a block diagram of another embodiment 130B of the image processing unit 130 shown in FIG. 1.
The image processing unit 130B may include a correction angle determination unit 133, a movement amount determination unit 135, and an image shifting unit 138. In FIG. 9, the same reference numerals are used for the same components as in FIG. 7, and duplicate descriptions are omitted.
The correction angle determination unit 133 obtains the preset correction angle of the liquid lens 110A corresponding to the compensation angle output from the shake detection unit 50, and outputs the correction angle to the movement amount determination unit 135. To this end, referring to FIG. 8, the correction angle determination unit 133 may be implemented in the form of a lookup table 133A that uses the compensation angle as an address and the correction angle as the data stored at each address. For this purpose, the correction angle determination unit 133 may map correction angles to a plurality of compensation angles in advance and store them in the form of the lookup table 133A. In this case, the correction angle determination unit 133 may output the correction angle mapped to the compensation angle output from the shake detection unit 50 to the movement amount determination unit 135.
In addition, the correction angle determined by the correction angle determination unit 133 may be output to a liquid lens driving unit (not shown) through the output terminal OUT3. The liquid lens driving unit may generate a driving voltage for driving the liquid lens 110A in response to the correction angle output from the correction angle determination unit 133, thereby adjusting the tilted angle (θ) of the interface (BO) of the liquid lens 110A.
The movement amount determination unit 135 may determine the preset per-field first movement amount of the B channel on the image sensor 120 and the preset per-field second movement amount of the R channel on the image sensor 120, corresponding to the correction angle output from the correction angle determination unit 133, and output the determined result to the image shifting unit 138.
To this end, referring to FIG. 8, the movement amount determination unit 135 may be implemented in the form of a lookup table 135A that uses the correction angle as an address and the per-field first and second movement amounts as the data stored at each address. For this purpose, the movement amount determination unit 135 may map the per-field first and second movement amounts to a plurality of correction angles in advance and store them in the form of the lookup table 135A. In this case, the movement amount determination unit 135 may output the per-field first and second movement amounts mapped to the correction angle output from the correction angle determination unit 133 to the image shifting unit 138.
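The two-stage lookup (compensation angle to correction angle, then correction angle to per-field movement amounts) can be sketched with plain dictionaries standing in for the lookup tables 133A and 135A. All angle values and movement amounts below are illustrative placeholders, not values from the disclosure:

```python
# LUT 133A: compensation angle -> preset correction angle of the liquid lens.
COMPENSATION_TO_CORRECTION = {0.0: 0.0, 0.3: 0.28, 0.6: 0.55}

# LUT 135A: correction angle -> (per-field first movements for the B channel,
#                                per-field second movements for the R channel).
CORRECTION_TO_MOVEMENTS = {
    0.0:  ([0.0] * 6, [0.0] * 6),
    0.28: ([-0.1, 0.5, 0.4, -0.2, -0.4, 0.7], [0.0, 0.4, 0.6, 0.6, 0.8, 1.9]),
}

def lookup_movements(compensation_angle):
    """Chain the two lookups: compensation angle -> (B shifts, R shifts)."""
    correction_angle = COMPENSATION_TO_CORRECTION[compensation_angle]
    return CORRECTION_TO_MOVEMENTS[correction_angle]
```

In the single-LUT embodiment of FIG. 7, the two tables would be fused so that the compensation angle indexes the movement amounts directly.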
Referring again to FIG. 1, the image synthesis unit 140 may synthesize the information of the light received through the G channel, the shifted information of the light received through the B channel, and the shifted information of the light received through the R channel, and output the synthesized result through the output terminal OUT1 as a final image in which chromatic aberration is corrected. To this end, the image synthesis unit 140 may extract the information of the light received through the G channel from the light information output by the image sensor 120, receive the shifted B-channel light information and the shifted R-channel light information from the image processing unit 130, and synthesize the extracted G-channel light information with the shifted B-channel and R-channel light information.
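In a minimal form, the synthesis step amounts to recombining the reference G channel with the two shifted channels into one RGB frame. This sketch assumes each channel arrives as a 2-D array of equal size; the stacking order and helper name are illustrative assumptions:

```python
import numpy as np

def synthesize(g_image, b_shifted, r_shifted):
    """Stack the reference G channel with the shifted B and R channels into
    a single H x W x 3 RGB frame (the final, aberration-corrected image)."""
    return np.stack([r_shifted, g_image, b_shifted], axis=-1)
```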
The final image output through the output terminal OUT1 may be displayed on a display (not shown), but the embodiment is not limited thereto.
The image processing unit 130 and the image synthesis unit 140 described above may be implemented as separate components, or a single control unit may perform the roles of both the image processing unit 130 and the image synthesis unit 140.
Hereinafter, an image processing method for correcting chromatic aberration according to an embodiment will be described with reference to the accompanying drawings.
FIG. 10 is a flowchart for explaining an image processing method 200 according to an embodiment.
The image processing method 200 shown in FIG. 10 may be performed by the optical device 100 shown in FIG. 1, or by an optical device having a configuration different from that shown in FIG. 1. Conversely, the optical device 100 shown in FIG. 1 may perform the image processing method 200 shown in FIG. 10, or an image processing method different from that shown in FIG. 10.
Hereinafter, for ease of understanding, the image processing method 200 shown in FIG. 10 is described as being performed by the optical device 100 shown in FIG. 1.
First, shake information, which is information on the degree of the user's hand shake, is converted into a compensation angle (step 210). Step 210 may be performed by the shake detection unit 50 shown in FIGS. 7 and 9.
After step 210, the preset correction angle of the liquid lens 110A corresponding to the compensation angle is obtained (step 220). Step 220 may be performed by the control unit 134 and the LUT 136 shown in FIG. 7, or by the correction angle determination unit 133 shown in FIG. 9.
After step 220, the preset per-field movement amount corresponding to the correction angle and to the field information, on the image sensor 120, of at least one correction channel among the R channel, the G channel, and the B channel is determined (steps 230 and 240). Steps 230 and 240 may be performed by the control unit 134 and the LUT 136 shown in FIG. 7, or by the movement amount determination unit 135 shown in FIG. 9.
As described above, among the R, G, and B channels, the correction channels may be the R channel and the B channel, and the G channel may be the reference channel. In this case, after step 220, the preset per-field first movement amount of the B channel on the image sensor 120 corresponding to the correction angle may be determined (step 230). After step 230, the preset per-field second movement amount of the R channel on the image sensor 120 corresponding to the correction angle may be determined (step 240). Although FIG. 10 shows step 240 being performed after step 230, the embodiment is not limited thereto. That is, step 230 may be performed after step 240, or steps 230 and 240 may be performed simultaneously.
After step 240, the information of the light received through the correction channel is shifted field by field on the image sensor by the movement amount (steps 250 and 260). Steps 250 and 260 may be performed by the image shifting unit 138 shown in FIGS. 7 and 9.
If, among the R, G, and B channels, the correction channels are the R channel and the B channel and the reference channel is the G channel, then after step 240, the information (for example, an image) of the light received through the B channel may be shifted field by field on the image sensor 120 by the first movement amount (step 250).
After step 250, the information (for example, an image) of the light received through the R channel may be shifted field by field on the image sensor 120 by the second movement amount (step 260). Although FIG. 10 shows step 260 being performed after step 250, the embodiment is not limited thereto. That is, step 250 may be performed after step 260, or steps 250 and 260 may be performed simultaneously. Alternatively, step 250 may be performed after step 230, and step 260 may be performed after step 240.
As described above, steps 220 to 260 shown in FIG. 10 may be performed by the image processing unit 130.
After step 260, the information (for example, an image) of the light received through the G channel, the shifted information (for example, an image) of the light received through the R channel, and the shifted information (for example, an image) of the light received through the B channel may be synthesized, and the synthesized result may be generated as a final image in which chromatic aberration is corrected (step 270). Step 270 may be performed by the image synthesis unit 140 shown in FIG. 1.
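The overall flow of steps 210 to 270 can be summarized in one sketch. The callables here are placeholders for the units described above (shake detection, the lookup tables, the image shifting unit, and the image synthesis unit); this is a minimal illustration, not a definitive implementation:

```python
def process_image(shake_info, b_image, r_image, g_image, to_compensation,
                  to_correction, to_movements, shift_fields, synthesize):
    """Sketch of steps 210-270 with each stage injected as a callable."""
    compensation = to_compensation(shake_info)        # step 210
    correction = to_correction(compensation)          # step 220
    first, second = to_movements(correction)          # steps 230-240
    b_shifted = shift_fields(b_image, first)          # step 250
    r_shifted = shift_fields(r_image, second)         # step 260
    return synthesize(g_image, b_shifted, r_shifted)  # step 270
```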
Hereinafter, the optical device and the image processing method according to the embodiment are further described as follows. For convenience of description, the information of the light received through the G channel is referred to as the 'G-channel image', the information of the light received through the R channel as the 'R-channel image', and the information of the light received through the B channel as the 'B-channel image'.
FIG. 11 is a diagram for explaining chromatic aberration correction according to an embodiment.
An image in which chromatic aberration is suppressed can be obtained by increasing or decreasing the size of the image by the lens's chromatic aberration amount for each of the R, G, and B channels of an image having chromatic aberration, and then synthesizing the results. For ease of understanding, referring to FIG. 11, the image IM1 output from the image sensor 120 has chromatic aberration. Here, chromatic aberration may mean a difference in image size per wavelength. Taking the G-channel image (GIM) as the reference, let the first size difference, obtained by subtracting the size of the G-channel image from the size of the R-channel image, be 'α', and let the second size difference, obtained by subtracting the size of the G-channel image from the size of the B-channel image, be 'β'. When the result (RIM) of subtracting the first size difference (α) from the size of the R-channel image, the result (BIM) of subtracting the second size difference (β) from the size of the B-channel image, and the G-channel image (GIM) are synthesized, an image IM2 in which chromatic aberration is corrected (or suppressed) can be obtained.
The largest of the differences among the size of the R-channel image with a wavelength band of 656.2 nm, the size of the B-channel image with a wavelength band of 436.1 nm, and the size of the G-channel image with a wavelength band of 587.6 nm corresponds to the chromatic aberration. For example, a chromatic aberration of 10 μm means that the largest size difference among the R-, G-, and B-channel images is 10 μm. In this way, in the embodiment, chromatic aberration can be corrected (or suppressed) by increasing or decreasing the sizes of the R-channel and B-channel images so that they become equal to the size of the G-channel image, which serves as the reference.
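The size-based correction described above amounts to resampling the R- and B-channel images onto the reference (G-channel) grid. This is a minimal sketch assuming nearest-neighbor resampling is acceptable; an actual device may use a different resampling scheme:

```python
import numpy as np

def rescale_to_reference(channel, ref_height, ref_width):
    """Nearest-neighbor resample of a channel image to the reference (G-channel)
    size, compensating a size difference caused by lateral chromatic aberration."""
    h, w = channel.shape
    rows = (np.arange(ref_height) * h // ref_height).astype(int)
    cols = (np.arange(ref_width) * w // ref_width).astype(int)
    return channel[np.ix_(rows, cols)]
```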
However, the chromatic aberration of an image generated from an optical signal that has passed through the lens unit 110 while the interface (BO) of the liquid lens 110A is tilted does not appear as a size difference among the R-, G-, and B-channel images; instead, it appears as a shift of the R-, G-, and B-channel images. For example, referring to FIG. 4, when the G channel is the reference channel among the R, G, and B channels, chromatic aberration appears as the R-channel image (R1) being shifted relative to the G-channel image (GIM). For this reason, referring to FIG. 6, as the chromatic aberration amount of fields with a positive (+) sign increases, the chromatic aberration amount of fields with a negative (-) sign decreases. Therefore, the optical device 100 and the image processing method 200 according to the above-described embodiment can correct chromatic aberration by shifting the images of the correction channels, for example, the B channel and the R channel, according to the shake information.
FIG. 12 is a diagram for explaining an image whose chromatic aberration has been corrected by the optical device 100 and the image processing method 200 according to the embodiment. Here, 0.9f to -0.9f denote fields, and 0 degrees and 0.6 degrees denote compensation angles. A compensation angle of '0 degrees' is the normal state with no hand shake, in which the interface (BO) is not tilted, as shown in FIG. 2. A compensation angle of '0.6 degrees' is the state in which hand shake is at its maximum, and may mean that the tilted angle (θ) of the interface (BO) is 0.6 degrees, as shown in FIG. 2.
Referring to FIG. 12, it can be seen that, as in the embodiment, an image with corrected chromatic aberration can be obtained by shifting the R-channel image and the B-channel image field by field by the predetermined movement amounts according to the compensation angle obtained from the shake information, and then synthesizing them.
Meanwhile, the optical device according to the above-described embodiment may include a device capable of processing or analyzing an optical signal. Examples of the optical device include a camera/video device, a telescope device, a microscope device, an interferometer device, a photometer device, a polarimeter device, a spectrometer device, a reflectometer device, an autocollimator device, and a lens meter device.
In addition, the optical device may be implemented as a portable device such as a smartphone, a notebook computer, or a tablet computer. Such an optical device may include a display unit (not shown) for outputting an image, a battery (not shown) for supplying power to each unit, and a main body housing in which each unit and the battery are mounted. The optical device may further include a communication module capable of communicating with other devices and a memory unit capable of storing data. The communication module and the memory unit may also be mounted in the main body housing.
Meanwhile, a recording medium on which a program is recorded for executing an image processing method performed in an optical device that includes a lens unit including a liquid lens, an image sensor that receives information of light passing through the lens unit, and an image processing unit that corrects the light information, records a program implementing: a function of converting information on the degree of the user's hand shake into a compensation angle; a function of obtaining a predetermined correction angle of the liquid lens corresponding to the compensation angle; a function of determining, corresponding to the correction angle, a preset movement amount corresponding to the field information, on the image sensor, of at least one correction channel among the R channel receiving the red wavelength band, the G channel receiving the green wavelength band, and the B channel receiving the blue wavelength band; and a function of shifting the information of the light received through the correction channel, field by field on the image sensor, by the determined movement amount. The recording medium can be read by a computer.
In addition, the function of determining the movement amount, implemented by the program recorded on the computer-readable recording medium, may include a function of determining the preset per-field first movement amount of the B channel on the image sensor corresponding to the correction angle, and a function of determining the preset per-field second movement amount of the R channel on the image sensor corresponding to the correction angle.
In addition, the shifting function implemented by the program recorded on the computer-readable recording medium may include a function of shifting the information of the light received through the B channel, field by field on the image sensor, by the first movement amount, and a function of shifting the information of the light received through the R channel, field by field on the image sensor, by the second movement amount.
In addition, the computer-readable recording medium may record a program that further implements a function of synthesizing the information of the light received through the G channel, the shifted information of the light received through the R channel, and the shifted information of the light received through the B channel to generate a corrected final image.
The computer-readable recording medium includes any kind of storage device in which data readable by a computer system is stored. Examples of computer-readable recording media include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices. The computer-readable recording medium can also be distributed over network-coupled computer systems so that computer-readable code is stored and executed in a distributed fashion. Functional programs, code, and code segments for implementing the image processing method can be easily inferred by programmers in the technical field to which the present invention pertains.
Although the embodiments have been mainly described above, they are merely examples and do not limit the present invention; those of ordinary skill in the art to which the present invention pertains will appreciate that various modifications and applications not illustrated above are possible without departing from the essential characteristics of the embodiments. For example, each component specifically shown in the embodiments may be modified in implementation. Differences related to such modifications and applications should be construed as being included in the scope of the present invention defined in the appended claims.
Modes for carrying out the invention have been fully described in the above 'Best Mode for Carrying Out the Invention'.
The optical device including a liquid lens and having a chromatic aberration correction function and the image processing method according to the embodiments can be used in camera/video devices, telescope devices, microscope devices, interferometer devices, photometer devices, polarimeter devices, spectrometer devices, reflectometer devices, autocollimator devices, lens meter devices, smartphones, notebook computers, tablet computers, and the like.

Claims (10)

  1. An optical device comprising: a lens unit including a liquid lens;
    an image sensor configured to receive information of light passing through the lens unit through an R channel receiving a red wavelength band, a G channel receiving a green wavelength band, and a B channel receiving a blue wavelength band; and
    an image processing unit configured to shift information of light received through at least one correction channel among the R channel, the G channel, and the B channel by a preset movement amount corresponding to driving information of the liquid lens and to field information of the correction channel on the image sensor.
  2. The optical device of claim 1, wherein, among the R channel, the G channel, and the B channel, the correction channels are the B channel and the R channel, and the G channel is a reference channel.
  3. The optical device of claim 2, wherein the image processing unit comprises:
    a lookup table that stores a correction angle of the liquid lens mapped to each of a plurality of compensation angles, and stores a per-field first movement amount of the B channel on the image sensor and a per-field second movement amount of the R channel on the image sensor mapped to each of a plurality of the correction angles;
    a control unit that causes the lookup table to output the per-field first movement amount of the B channel corresponding to a compensation angle corresponding to the driving information of the liquid lens, and the per-field second movement amount of the R channel corresponding to the compensation angle; and
    an image shifting unit that shifts the information of the light received through the B channel field by field by the per-field first movement amount output from the lookup table, and shifts the information of the light received through the R channel field by field by the per-field second movement amount output from the lookup table.
  4. The optical device of claim 2, wherein the image processing unit comprises:
    a correction angle determination unit that obtains a preset correction angle of the liquid lens corresponding to a compensation angle corresponding to the driving information of the liquid lens;
    a movement amount determination unit that determines a preset per-field first movement amount of the B channel on the image sensor and a preset per-field second movement amount of the R channel on the image sensor corresponding to the correction angle; and
    an image shifting unit that shifts the information of the light received through the B channel field by field by the first movement amount, and shifts the information of the light received through the R channel field by field by the second movement amount.
  5. The optical device of claim 3 or claim 4, further comprising:
    an image synthesis unit that synthesizes the information of the light received through the G channel, the shifted information of the light received through the B channel, and the shifted information of the light received through the R channel to generate a corrected final image.
  6. The optical device of claim 3 or claim 4, further comprising a shake detection unit that converts the user's hand-shake information, corresponding to the driving information of the liquid lens, into a compensation angle.
  7. An image processing method performed in an optical device comprising a lens unit including a liquid lens, an image sensor that receives information of light passing through the lens unit, and an image processing unit that corrects the information of the light, the method comprising:
    converting driving information of the liquid lens into a compensation angle;
    obtaining a preset correction angle of the liquid lens corresponding to the compensation angle;
    determining, corresponding to the correction angle, a preset movement amount corresponding to field information, on the image sensor, of at least one correction channel among an R channel receiving a red wavelength band, a G channel receiving a green wavelength band, and a B channel receiving a blue wavelength band; and
    shifting the information of the light received through the correction channel by the determined movement amount for each field on the image sensor.
  8. 제7 항에 있어서, 상기 R 채널, 상기 G 채널 및 상기 B 채널 중에서 상기 보정 채널은 R채널과 B 채널이고, 상기 G채널은 기준 채널이고,8. The method of claim 7, wherein the correction channel among the R channel, the G channel and the B channel is an R channel and a B channel, and the G channel is a reference channel,
    상기 이동량을 결정하는 단계는Determining the amount of movement is
    상기 보정 각도에 대응하며 기 설정된 상기 B채널의 상기 이미지 센서상 필드별 제1 이동량을 결정하는 단계; 및Determining a first movement amount for each field on the image sensor of the preset B channel corresponding to the correction angle; And
    상기 보정 각도에 대응하며 기 설정된 상기 R채널의 상기 이미지 센서 상 필드별 제2 이동량을 결정하는 단계를 포함하는 영상 처리 방법.And determining a second movement amount for each field on the image sensor of the R channel that corresponds to the correction angle.
  9. 제8 항에 있어서, 상기 쉬프팅 단계는The method of claim 8, wherein the shifting step
    상기 B채널로 수신된 광의 정보를 상기 이미지 센서상 필드별로 상기 제1 이동량만큼 쉬프팅하는 단계; 및Shifting the information of the light received through the B channel by the first movement amount for each field on the image sensor; And
    상기 R채널로 수신된 광의 정보를 상기 이미지 센서상 필드별로 상기 제2 이동량만큼 쉬프팅하는 단계를 포함하는 영상 처리 방법.And shifting the information of the light received through the R channel by the second movement amount for each field on the image sensor.
  10. 제7 항에 있어서, 상기 R 채널, 상기 G 채널 및 상기 B 채널 중에서 상기 보정 채널은 R채널과 B 채널이고, 상기 G채널은 기준 채널이고,8. The method of claim 7, wherein the correction channel among the R channel, the G channel and the B channel is an R channel and a B channel, and the G channel is a reference channel,
    상기 G채널로 수신된 광의 정보, 상기 R채널로 수신되어 쉬프팅된 광의 정보 및 상기 B채널로 수신되어 쉬프팅된 광의 정보를 합성하여 보정된 최종 이미지를 생성하는 단계를 더 포함하는 영상 처리 방법. And generating a corrected final image by synthesizing the information of the light received on the G channel, the information of the light received on the R channel and shifted, and the information of the light received on the B channel and shifted.
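The pipeline of claims 7 through 10 (driving information → compensation angle → preset correction angle → per-field shift of the B and R correction channels → recombination with the G reference channel) can be sketched as follows. This is a minimal illustration, not the application's implementation: the shift-table values, the radial field partitioning, the identity mapping from driving information to compensation angle, and the use of horizontal integer-pixel shifts are all assumptions made for the example.

```python
import numpy as np

# Hypothetical preset table ("기 설정된" values in the claims): for each
# correction angle, the per-field shift amounts (pixels) for the B and R
# correction channels. Field 0 is the sensor center; values are illustrative.
SHIFT_TABLE = {
    0.0: ([0, 0, 0], [0, 0, 0]),      # correction angle -> (B shifts, R shifts)
    0.5: ([0, 1, 2], [0, -1, -2]),
    1.0: ([1, 2, 3], [-1, -2, -3]),
}

def compensation_angle(driving_info_deg):
    # Claim 7, step 1: convert liquid-lens driving information into a
    # compensation angle. Identity mapping used here as a placeholder.
    return driving_info_deg

def correction_angle(comp_angle):
    # Claim 7, step 2: obtain the preset correction angle corresponding to
    # the compensation angle (nearest table entry in this sketch).
    return min(SHIFT_TABLE, key=lambda a: abs(a - comp_angle))

def field_index(y, x, h, w, n_fields):
    # Assign each pixel to a field (radial zone) on the image sensor.
    cy, cx = (h - 1) / 2, (w - 1) / 2
    r = np.hypot(y - cy, x - cx)
    r_max = np.hypot(cy, cx)
    return min(int(n_fields * r / (r_max + 1e-9)), n_fields - 1)

def shift_channel(plane, shifts):
    # Claim 7, step 4: shift the channel's light information per field by
    # the preset amount (horizontal wrap-around shift as a simple stand-in
    # for proper resampling).
    h, w = plane.shape
    out = np.empty_like(plane)
    for y in range(h):
        for x in range(w):
            f = field_index(y, x, h, w, len(shifts))
            out[y, x] = plane[y, (x - shifts[f]) % w]
    return out

def correct_chromatic_aberration(rgb, driving_info_deg):
    # Claims 8-10: G is the reference channel; R and B are the correction
    # channels, shifted per field and then recombined into the final image.
    angle = correction_angle(compensation_angle(driving_info_deg))
    b_shifts, r_shifts = SHIFT_TABLE[angle]
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return np.stack(
        [shift_channel(r, r_shifts), g, shift_channel(b, b_shifts)], axis=-1
    )
```

Because chromatic aberration grows with field height, the per-field lookup lets the outer zones of the sensor receive larger shifts than the center while the G channel stays fixed as the registration reference.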
PCT/KR2020/001163 2019-01-30 2020-01-23 Optical device including liquid lens and having chromatic aberration correction function, and image processing method WO2020159160A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020190012084A KR20200094509A (en) 2019-01-30 2019-01-30 Optical device including liquid lens and having function for compensating chromatic aberration, and image processing method
KR10-2019-0012084 2019-01-30

Publications (1)

Publication Number Publication Date
WO2020159160A1 true WO2020159160A1 (en) 2020-08-06

Family

ID=71841536

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2020/001163 WO2020159160A1 (en) 2019-01-30 2020-01-23 Optical device including liquid lens and having chromatic aberration correction function, and image processing method

Country Status (2)

Country Link
KR (1) KR20200094509A (en)
WO (1) WO2020159160A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100819301B1 (en) * 2006-12-20 2008-04-03 삼성전자주식회사 Method and apparatus for optical image stabilizer on mobile camera module
JP2009219123A (en) * 2008-03-12 2009-09-24 Thomson Licensing Correcting method of color aberration
KR101121014B1 (en) * 2010-12-22 2012-03-16 중앙대학교 산학협력단 Apparatus and method for color enhancement of low exposure image captured by multiple color-filter aperture
KR101125765B1 (en) * 2011-01-24 2012-03-27 중앙대학교 산학협력단 Apparatus and method for registration between color channels based on depth information of image taken by multiple color filter aperture camera
KR20180102418A (en) * 2017-03-07 2018-09-17 엘지이노텍 주식회사 Camera module including liquid lens and optical apparatus


Also Published As

Publication number Publication date
KR20200094509A (en) 2020-08-07

Similar Documents

Publication Publication Date Title
WO2011078614A2 (en) Camera module
WO2014003493A1 (en) Camera module
WO2014014222A1 (en) Camera module
WO2018135851A1 (en) Liquid lens, and camera module and optical device comprising same
WO2014003492A1 (en) Camera module
WO2019164335A1 (en) Lens module and camera module comprising same
WO2018128518A1 (en) Camera module and optical instrument including same
WO2020209492A1 (en) Folded camera and electronic device including the same
WO2020122594A1 (en) Lens assmebly and camera module including same
WO2018182349A1 (en) Liquid lens, and camera module and optical device comprising same
WO2019088353A1 (en) Camera module including liquid lens and optical instrument
WO2018194195A1 (en) Mobile terminal
WO2019225984A1 (en) Liquid lens, and camera module and optical instrument comprising same
WO2021040341A1 (en) Sensor driving device and camera module
WO2018182348A1 (en) Camera module including liquid lens
WO2020189992A1 (en) Camera module
WO2018147670A1 (en) Camera module including liquid lens, and optical device
WO2018151527A1 (en) Liquid lens and camera module including same
WO2018139894A1 (en) Liquid lens module and camera module comprising same
WO2018164524A1 (en) Liquid lens, and camera module and optical instrument including same
WO2020159160A1 (en) Optical device including liquid lens and having chromatic aberration correction function, and image processing method
WO2019235880A1 (en) Optical device
WO2021006592A1 (en) Liquid lens control device
WO2019212280A1 (en) Dual camera module comprising liquid lenses
WO2020242135A1 (en) Liquid lens and lens assembly comprising liquid lens

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20748793

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20748793

Country of ref document: EP

Kind code of ref document: A1