KR20170061536A - Camera module and mobile terminal comprising the camera - Google Patents
Camera module and mobile terminal comprising the camera
- Publication number: KR20170061536A
- Application number: KR1020150166718A
- Authority
- KR
- South Korea
- Prior art keywords
- images
- image
- mobile terminal
- unit
- coordinates
- Prior art date
Classifications
- H04N5/2257
- H04N13/02
- H04N5/2253
- H04N5/2256
Abstract
A camera module according to the present invention includes: an illumination unit configured to illuminate light onto a specific plane; first and second cameras arranged to be spaced apart from each other and configured to photograph first and second images respectively including predetermined patterns corresponding to the light illuminated onto the specific plane; and a controller configured to correct at least one of the first and second images using first and second coordinates of the predetermined patterns included in the first and second images, and to generate a three-dimensional image using the at least one corrected first and second images.
Description
The present invention relates to a camera module for acquiring a three-dimensional image and a mobile terminal including the camera module.
As interest in stereoscopic image services has been increasing recently, devices for providing stereoscopic images are continuously being developed. The stereoscopic method, the time-of-flight (TOF) method, and the structured light method are examples of methods of implementing a stereoscopic image.
The basic principle of the stereoscopic method is that two images having binocular disparity are input to the left eye and the right eye of a human, and the human brain combines the images input through the left and right eyes to perceive a stereoscopic image. Here, the two images are the left-view image and the right-view image, respectively.
A recent 3D camera module is configured to photograph the left-eye image and the right-eye image simultaneously in one device. For example, a stereo system using two identical cameras is often used. In such a stereo system, the two cameras are arranged at a predetermined interval, and the left and right images are acquired using two completely independent cameras (two lenses, two sensors, two ISPs).
However, in such a stereoscopic 3D camera module, an assembly error between the two cameras deteriorates the quality of the 3D image, which demands a high-precision assembly process and reduces yield.
Conventionally, stereo correction is performed using images of a chessboard to reduce the assembly error between the two cameras. However, stereo correction using a chessboard requires a physical tool, namely the chessboard itself, and has the disadvantage that a plurality of images must be captured.
In addition, even if the two cameras of a conventional 3D camera module are initially well aligned, image quality deteriorates due to a tolerance between the two cameras caused by external impacts generated during use.
SUMMARY OF THE INVENTION It is an object of the present invention to provide a camera module capable of performing stereo correction without using a physical tool such as a chessboard, and a mobile terminal having the camera module.
It is another object of the present invention to provide a camera module, and a mobile terminal having the same, in which image distortion due to a tolerance between two cameras generated during use can be easily corrected.
A camera module according to the present invention includes: an illumination unit configured to illuminate light onto a specific plane; first and second cameras arranged to be spaced apart from each other and configured to photograph first and second images respectively including predetermined patterns corresponding to the light illuminated onto the specific plane; and a controller configured to correct at least one of the first and second images using first and second coordinates of the predetermined patterns included in the first and second images, and to generate a three-dimensional image using the at least one corrected first and second images.
In one embodiment of the present invention, the controller recognizes, as the specific plane, a space corresponding to a portion of the first and second images that satisfies a predetermined condition among the spaces corresponding to the first and second images, and controls the illumination unit so that light is illuminated onto the specific plane.
In one embodiment, the controller may generate a lookup table for performing a correction on a portion of the first and second images, and may apply the lookup table to at least one of the first and second images.
In one embodiment, the controller may generate an error difference value by comparing a current difference value of the first and second coordinates with a reference difference value, and may generate the lookup table based on the generated error difference value.
In one embodiment of the present invention, the controller stores the generated lookup table in a memory and, when third and fourth images different from the first and second images are photographed through the first and second cameras, corrects at least one of the third and fourth images by applying the lookup table to it, and generates a three-dimensional image using the at least one corrected third and fourth images.
In an embodiment, the lookup table stored in the memory may be updated every predetermined period.
In one embodiment, the predetermined pattern is a plurality of patterns, and the control unit may calculate the first and second coordinates of each pattern by matching each pattern in the first and second images.
In an exemplary embodiment, the control unit may sequentially match the patterns from the central area to the outer area of the first or second image.
In one embodiment, the predetermined pattern comprises a set of a plurality of pixels, each pixel having a first or a second pixel value at random, and each pattern having a different code.
A mobile terminal according to the present invention includes: an illumination unit configured to illuminate light onto a specific plane; first and second cameras arranged to be spaced apart from each other and configured to photograph first and second images respectively including predetermined patterns corresponding to the light illuminated onto the specific plane; and a controller configured to correct at least one of the first and second images using first and second coordinates of the predetermined patterns included in the first and second images, and to generate a three-dimensional image using the at least one corrected first and second images.
A method of controlling a camera module according to the present invention includes the steps of: illuminating light onto a specific plane through an illumination unit; receiving first and second images through first and second cameras, the first and second images including a predetermined pattern corresponding to the light illuminated onto the specific plane; correcting at least one of the first and second images using first and second coordinates of the predetermined pattern included in the first and second images; and generating a three-dimensional image using the at least one corrected first and second images.
The camera module and the mobile terminal according to an embodiment of the present invention can easily correct image distortion caused by a tolerance generated when assembling the camera module, or generated during its use, by using the first and second coordinates of the predetermined pattern included in the first and second images. Accordingly, a 3D stereoscopic image with improved quality can be generated and provided.
In the camera module and the mobile terminal according to an exemplary embodiment of the present invention, even when the user does not separately input a command related to light illumination or image correction, if a portion recognized as a plane exists in a received image, a lookup table for that portion may be generated and the correction may be performed using the generated lookup table.
Thereby, image distortion that is generated or deepened while the user uses the camera module or the mobile terminal according to the present invention can be easily corrected.
FIG. 1A is a block diagram for explaining a mobile terminal according to the present invention.
FIGS. 1B and 1C are conceptual diagrams illustrating an example of a mobile terminal according to the present invention, viewed from different directions.
FIG. 2 is a conceptual diagram of a mobile terminal having a camera module related to the present invention.
FIG. 3 is a conceptual diagram for explaining an illumination unit related to the present invention.
FIG. 4 is a flowchart representatively showing a control method of the present invention.
FIGS. 5A through 5C are conceptual diagrams for explaining the control method shown in FIG. 4.
FIG. 6A is a flowchart for explaining a control method in which light is illuminated onto a specific plane related to the present invention.
FIGS. 6B to 6D are conceptual diagrams for explaining the control method of FIG. 6A.
FIG. 7A is a flowchart illustrating a control method for matching patterns in the first and second images related to the present invention.
FIG. 7B is a conceptual diagram for explaining the control method of FIG. 7A.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like or similar elements are denoted by the same reference numerals and redundant description thereof is omitted. The suffixes "module" and "unit" for the components used in the following description are given or used interchangeably in consideration of ease of drafting the specification, and do not by themselves have distinct meanings or roles. In describing the embodiments disclosed herein, a detailed description of related known art is omitted when it is determined that it may obscure the gist of the embodiments disclosed herein. The accompanying drawings are intended only to facilitate understanding of the embodiments disclosed herein; the technical idea disclosed herein is not limited by the accompanying drawings and should be understood to include all modifications, equivalents, and alternatives falling within the spirit and scope of the present invention.
Terms including ordinals, such as first, second, etc., may be used to describe various elements, but the elements are not limited to these terms. The terms are used only for the purpose of distinguishing one component from another.
It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements.
The singular expressions include plural expressions unless the context clearly dictates otherwise.
In the present application, the terms "comprises", "having", and the like are intended to specify the presence of stated features, numbers, steps, operations, elements, components, or combinations thereof, and do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.
The mobile terminal described in this specification may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, and a wearable device such as a smartwatch, smart glasses, or a head mounted display (HMD).
However, it will be readily apparent to those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as a digital TV, a desktop computer, and digital signage, except for cases applicable only to mobile terminals.
FIG. 1A is a block diagram for explaining a mobile terminal according to the present invention, and FIGS. 1B and 1C are conceptual diagrams showing an example of the mobile terminal according to the present invention, viewed from different directions.
At least some of the components may operate in cooperation with one another to implement an operation, control, or control method of a mobile terminal according to various embodiments described below. In addition, the operation, control, or control method of the mobile terminal may be implemented on the mobile terminal by driving at least one application program stored in the memory 170.
Hereinafter, the various components of the mobile terminal 100 will be described in more detail with reference to FIG. 1A.
The mobile communication module 112 transmits and receives wireless signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network established according to technology standards or communication methods for mobile communication (for example, GSM (Global System for Mobile communication), CDMA (Code Division Multi Access), CDMA2000, EV-DO (Enhanced Voice-Data Optimized or Enhanced Voice-Data Only), WCDMA (Wideband CDMA), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), LTE-A (Long Term Evolution-Advanced), and the like).
The wireless signal may include various types of data according to transmission and reception of a voice call signal, a video call signal, or a text/multimedia message.
The wireless Internet module 113 is a module for wireless Internet access, and may be built in or externally attached to the mobile terminal 100.
Wireless Internet technologies include, for example, WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), and LTE-A (Long Term Evolution-Advanced); the wireless Internet module 113 transmits and receives data according to at least one wireless Internet technology, including Internet technologies not listed above.
In that wireless Internet access by WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, or LTE-A is performed through a mobile communication network, the wireless Internet module 113 performing such wireless Internet access through the mobile communication network may be understood as a kind of the mobile communication module 112.
The short-range communication module 114 is for short-range communication between the mobile terminal 100 and another mobile terminal 100, or between the mobile terminal 100 and a network in which another mobile terminal 100 (or an external server) is located.
Here, the other mobile terminal 100 may be a wearable device (for example, a smartwatch, smart glasses, or an HMD) capable of exchanging data with (or interworking with) the mobile terminal 100 according to the present invention.
The position information module 115 is a module for obtaining the position (or current position) of the mobile terminal, and representative examples thereof are a Global Positioning System (GPS) module and a Wireless Fidelity (Wi-Fi) module. For example, when the mobile terminal utilizes the GPS module, it can acquire the position of the mobile terminal by using a signal transmitted from a GPS satellite. As another example, when the mobile terminal utilizes the Wi-Fi module, it can acquire the position of the mobile terminal based on information of a wireless access point (AP) that transmits or receives a wireless signal with the Wi-Fi module. Optionally, the position information module 115 may substitutionally or additionally perform a function of another module of the wireless communication unit 110 to obtain data on the position of the mobile terminal.
On the other hand, for convenience of explanation, the act of positioning an object in proximity to the touch screen without contact, so that the object is recognized as being located on the touch screen, is referred to as a "proximity touch", and the act of actually bringing an object into contact with the touch screen is referred to as a "contact touch". The position at which an object is proximity-touched on the touch screen means the position at which the object vertically corresponds to the touch screen when the object is proximity-touched. The proximity sensor can detect a proximity touch and proximity touch patterns (for example, proximity touch distance, direction, speed, time, position, and movement state).
The touch sensor senses a touch (or touch input) applied to the touch screen (or the display unit 151) by using at least one of various touch methods such as a resistive film type, a capacitive type, an infrared type, an ultrasonic type, and a magnetic field type.
For example, the touch sensor may be configured to convert a change in pressure applied to a specific portion of the touch screen, or a change in capacitance generated at a specific portion, into an electrical input signal. The touch sensor may be configured to detect the position and area at which a touch object touching the touch screen is touched on the touch sensor, the pressure at the time of touch, the capacitance at the time of touch, and the like. Here, the touch object may be a finger, a touch pen, a stylus pen, a pointer, or the like, as an object applying a touch to the touch sensor.
Thus, when there is a touch input to the touch sensor, the corresponding signal(s) is sent to a touch controller. The touch controller processes the signal(s) and then transmits the corresponding data to the controller 180. In this way, the controller 180 can know which area of the display unit 151 has been touched.
On the other hand, the touch sensor and the proximity sensor discussed above can be used independently or in combination to sense various types of touches, such as a short (or tap) touch, a long touch, a multi touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, and a hovering touch.
The ultrasonic sensor can recognize position information of a sensing object by using ultrasonic waves.
In the stereoscopic display unit, a three-dimensional display scheme such as a stereoscopic scheme (glasses scheme), an autostereoscopic scheme (glasses-free scheme), or a projection scheme (holographic scheme) may be applied.
The identification module is a chip that stores various information for authenticating the use authority of the mobile terminal 100, and may include a user identity module (UIM), a subscriber identity module (SIM), and a universal subscriber identity module (USIM).
In the following, various embodiments may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.
Referring to FIGS. 1B and 1C, the disclosed mobile terminal 100 has a bar-shaped terminal body.
Here, the terminal body can be understood as a concept referring to the mobile terminal 100 as at least one assembly.
However, these configurations are not limited to this arrangement. These configurations may be excluded or replaced as needed, or disposed on different surfaces.
The touch sensor may be a film having a touch pattern and disposed between the window 151a and a display on the rear surface of the window 151a, or may be a metal wire directly patterned on the rear surface of the window 151a.
The contents input by the first and second manipulation units may be set in various ways.
On the other hand, a rear input unit (not shown) may be provided on the rear surface of the terminal body as another example of the user input unit 123.
The rear input unit may be disposed to overlap the display unit 151 of the front surface in the thickness direction of the terminal body.
When a rear input unit is provided on the rear surface of the terminal body, a new type of user interface using the rear input unit can be realized.
The terminal body may be provided with at least one antenna for wireless communication. The antenna may be embedded in the terminal body or formed in the case. For example, an antenna constituting a part of the broadcast receiving module 111 (see FIG. 1A) may be configured to be withdrawable from the terminal body. Alternatively, the antenna may be formed in a film type and attached to the inner surface of the rear cover of the terminal.
The terminal body is provided with a power supply unit 190 (see FIG. 1A) for supplying power to the mobile terminal 100.
The image received through the camera may be referred to as a preview image. Specifically, the preview image refers to an image received through the camera in real time. The preview image changes as the mobile terminal equipped with the camera is moved by an external force.
The depth information may also be referred to as a depth value. The depth information may mean the distance (or distance value) between the subject corresponding to a pixel included in the image and the mobile terminal (more precisely, the camera).
For example, when the distance between the object corresponding to a specific pixel of the image and the mobile terminal is n, the depth information of the specific pixel may be a specific value corresponding to n. The specific value corresponding to n may be n or a value converted by a predetermined algorithm.
In addition, when the coordinates of the image are set as the x-axis and the y-axis perpendicular to the x-axis, the depth information may be a value corresponding to the z-axis perpendicular to the x-axis and the y-axis, respectively. The absolute value of the depth information may increase as the distance between the subject and the mobile terminal increases.
Such depth information can be utilized in various fields. For example, the depth information may be used to photograph or create a stereoscopic image, to generate 3D printing data used in a 3D printer, to generate movement information of an object (subject) around the mobile terminal, or to detect the presence of a foreign substance.
The mobile terminal according to the present invention can extract depth information of an image received through the camera in various ways. For example, the controller 180 (see FIG. 1A) may use a stereoscopic method of extracting depth information using at least two cameras, a structured light method of extracting depth information using light emitting elements arranged to form a predetermined pattern, a time-of-flight (ToF) method of extracting depth information based on the time taken for light emitted from a light emitting element to be reflected and return, or a combination of these methods.
Hereinafter, extraction of depth information using a stereoscopic method will be described.
The stereoscopic method is a method of implementing an image by imitating the principle by which a human perceives a stereoscopic effect visually.
More specifically, in the stereoscopic method, two different cameras are arranged apart by a baseline, like the left eye and the right eye of a person, so that the two cameras receive two images having binocular disparity. Hereinafter, the two different images are referred to as first and second images.
That is, images of the same subject are formed at different positions in the first and second images, and the disparity of the subject varies depending on the distance between the camera and the subject.
In other words, as the distance of the subject from the camera increases, the disparity on the subject decreases, and as the distance of the subject from the camera decreases, the disparity on the subject increases.
That is, in the stereoscopic method, the depth information on the object can be extracted using the disparity detected on the object.
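For a rectified stereo pair, this inverse relationship between distance and disparity is commonly written as Z = f * B / d, where f is the focal length in pixels, B is the baseline, and d is the disparity. The following is a minimal sketch of this relation; the numeric values are hypothetical and not taken from this document.

```python
def depth_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Distance Z to the subject: Z = f * B / d (rectified pinhole stereo)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A nearer subject yields a larger disparity, hence a smaller depth:
print(depth_from_disparity(40.0, 800.0, 0.02))  # 0.4 m
print(depth_from_disparity(10.0, 800.0, 0.02))  # 1.6 m
```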
Although the stereoscopic method has been described above in units of a subject image, in practice the disparity is detected in units of pixels, the depth information of each pixel is extracted, and a three-dimensional shape of the subject can be realized using the extracted depth information.
In order to realize a three-dimensional image on a subject of better quality, it is necessary to accurately extract the depth information on the subject. In order to accurately extract the depth information on the subject, it is necessary to accurately detect the disparity on the subject from the first and second images. In order to accurately detect the disparity on the subject, it is necessary to grasp the image distortion due to the hardware tolerance generated when assembling or using the two cameras, and to correct the distorted image.
In the camera module or the mobile terminal according to the present invention, an illumination unit capable of illuminating light corresponding to a preset pattern is used to grasp the image distortion due to the tolerance generated when assembling or using two cameras.
Hereinafter, the overall structure of the camera module will be examined, and then the illumination unit will be discussed in detail.
FIG. 2 is a conceptual diagram of a mobile terminal having a camera module 200 related to the present invention.
Referring to FIG. 2, the camera module 200 may be provided in the mobile terminal 100.
Here, the photographing directions of the first and second cameras 221 and 222 may be substantially the same.
The optical system means a system of optical components in which reflectors, lenses, and the like are appropriately arranged to form an image of an object using reflection or refraction of light.
In addition, the image sensors form images from the light incident through the incident surfaces of the first and second cameras 221 and 222.
As described above, the light emitted from the illumination unit 210 may be photographed through the first and second cameras 221 and 222.
At least one of the first and second cameras 221 and 222 (more precisely, the image sensors provided in the cameras) may be configured to sense not only light in the visible region but also light in the region emitted by the illumination unit 210.
FIG. 3 is a conceptual diagram for explaining the illumination unit 210 related to the present invention.
Referring to FIG. 3, the light emitted from the illumination unit 210 may be illuminated to the outside in the form of a plurality of spot lights (SP).
At this time, the externally illuminated light may be photographed through at least one of the first and second cameras 221 and 222.
The image photographed by the camera may include a pattern corresponding to the plurality of spot lights SP. That is, the image received through the camera may include a predetermined pattern. A detailed description related to this will be described in detail later with reference to FIG. 5B.
The predetermined pattern may be determined (set) by the user, or may be predetermined when the mobile terminal product is manufactured. Also, the predetermined pattern may be changed at a user's request or under the control of the controller 180.
Meanwhile, in the present invention, the illumination unit 210 may illuminate light onto a specific plane.
More specifically, the plane may be a surface standing on the ground substantially perpendicular to the traveling direction of the light (for example, a wall surface).
As described above, an image of the light illuminated onto the specific plane may be photographed through at least one of the first and second cameras 221 and 222.
Hereinafter, a method of correcting an image according to an exemplary embodiment of the present invention will be described in detail with reference to the accompanying drawings.
FIG. 4 is a flow chart representing a control method of the present invention, and FIGS. 5A to 5C are conceptual diagrams for explaining the control method shown in FIG.
Referring to FIG. 4, in the present invention, light may be illuminated on a specific plane through the illumination unit 210 (S410).
In addition, the first and second images including a predetermined pattern corresponding to the light illuminated onto the specific plane may be received through the first and second cameras 221 and 222 (S420).
Here, after step S410, step S420 may be performed, or the two steps may be performed simultaneously. Alternatively, after step S420 is performed first, step S410 may be performed.
For example, when a portion recognized as a plane exists in the image received through at least one of the first and second cameras, light may first be illuminated onto that portion, after which the first and second images may be photographed.
Thereafter, at least one of the first and second images may be corrected using the first and second coordinates of the predetermined pattern included in the first and second images (S430).
More specifically, in step S430, the controller 180 may calculate the first and second coordinates of the predetermined pattern included in the first and second images, and generate a lookup table for image correction based on the calculated coordinates.
That is, the controller 180 may correct at least one of the first and second images by applying the generated lookup table to it.
Hereinafter, a detailed procedure for generating the lookup table will be described with reference to FIG. 5A.
Referring to FIG. 5A, the first and second images may be photographed through the first and second cameras 221 and 222.
Each of the first and second images may include the predetermined pattern corresponding to the light illuminated onto the specific plane.
At this time, the coordinates of the same pattern may be different in the first and second images.
In other words, if the coordinates of the pattern in the first image are referred to as first coordinates and the coordinates of the pattern in the second image are referred to as second coordinates, the first and second coordinates may be different from each other.
Referring to FIGS. 5A and 5B, the first coordinate 511 may be (x1, y1) and the second coordinate 521 may be (x2, y2).
The reason that the first and second coordinates differ is, first, that the first and second cameras are arranged spaced apart by a predetermined distance, and second, that an assembly tolerance exists between the two cameras.
In other words, the current difference value of the first and second coordinates reflects both the difference caused by the camera arrangement and the difference caused by the tolerance.
Referring to FIGS. 5A and 5B, the current difference value of the first and second coordinates may be (x1-x2, y1-y2).
In the present invention, an error difference value is generated from the current difference value of the first and second coordinates based on the reference difference value of the first and second coordinates.
Here, the reference difference value is the difference value between the first and second coordinates that necessarily arises from disposing the first and second cameras 221 and 222 apart from each other by a predetermined distance.
That is, the reference difference value may be the difference value between the first and second coordinates in an ideal situation in which no assembly tolerance of the first and second cameras 221 and 222 exists.
Referring to FIGS. 5A and 5B, the reference difference value of the first and second coordinates may be (-b, 0).
More specifically, it may be the value obtained by subtracting the coordinate (x1+b, y1) of the ideal pattern of the second image from the first coordinate (x1, y1). Here, the coordinate (x1+b, y1) of the ideal pattern means the coordinate value of the pattern in a situation where no assembly tolerance of the first and second cameras exists.
As described above, the controller 180 may generate the error difference value by comparing the current difference value of the first and second coordinates with the reference difference value.
For example, the error difference value may be (x1-x2+b, y1-y2), obtained by subtracting the reference difference value (-b, 0) from the current difference value (x1-x2, y1-y2).
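As a concrete illustration, this computation can be sketched as follows; the coordinate and baseline values are hypothetical and chosen only to make the arithmetic visible.

```python
import numpy as np

# Observed pattern coordinates in the two images (hypothetical values).
first_coord  = np.array([410.0, 302.0])   # (x1, y1)
second_coord = np.array([448.0, 299.0])   # (x2, y2)
baseline_px  = 40.0                       # b: ideal shift caused by the camera spacing

current_diff   = first_coord - second_coord      # (x1 - x2, y1 - y2) = (-38, 3)
reference_diff = np.array([-baseline_px, 0.0])   # (-b, 0): tolerance-free case
error_diff     = current_diff - reference_diff   # (x1 - x2 + b, y1 - y2)
print(error_diff)                                # -> [2. 3.]
```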
The controller 180 may then generate the lookup table based on the generated error difference value.
Here, the lookup table may be data having the same size (for example, resolution) as the first and second images.
That is, the lookup table can also be implemented as a two-dimensional image. More specifically, if the first and second images are two-dimensional images of 1024 * 768 size, the look-up table may be a two-dimensional image of 1024 * 768 size.
That is, the lookup table may include a plurality of pixels. Furthermore, the pixel value of the look-up table may be a value of either the x coordinate or the y coordinate of the error difference value.
More specifically, the lookup table may include first and second lookup tables 531 and 532.
Here, at least some of the pixels of the first lookup table 531 have the x coordinate of the error difference value as their pixel value, and at least some of the pixels of the second lookup table 532 have the y coordinate of the error difference value as their pixel value.
Referring to FIG. 5 (c), in the first lookup table 531, the pixels at the positions corresponding to the coordinates of the pattern of one of the first and second images (the second image in the drawing) may have the pixel value 'x1-x2+b', which is the x coordinate of the error difference value.
Referring to FIG. 5 (d), in the second lookup table 532, the pixels at the positions corresponding to the coordinates of the pattern of one of the first and second images may have the pixel value 'y1-y2', which is the y coordinate of the error difference value.
In the drawings, the pixels having the error difference value as their pixel value in the first and second lookup tables 531 and 532 are the pixels at the positions corresponding to the second coordinates, but the present invention is not limited thereto.
That is, in the first and second lookup tables 531 and 532, the pixels having the error difference value as the pixel value may be pixels corresponding to the first coordinate.
In the first and second lookup tables 531 and 532, pixels at positions other than positions corresponding to the pattern may have a pixel value of 'null' or '0'.
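A minimal sketch of how such a pair of lookup tables could be held in memory is given below, assuming (as in the 1024*768 example above) image-sized two-dimensional arrays; the observation positions and error values are hypothetical.

```python
import numpy as np

H, W = 768, 1024  # same size as the first and second images (example resolution)

# First lookup table: x components of the error difference values.
# Second lookup table: y components. Unset positions hold 0 ("null").
lut_x = np.zeros((H, W), dtype=np.float32)
lut_y = np.zeros((H, W), dtype=np.float32)

# Hypothetical observations: ((row, col) of the pattern, its error difference value).
observations = [((299, 448), (2.0, 3.0)),
                ((510, 130), (-1.5, 0.5))]

for (row, col), (ex, ey) in observations:
    lut_x[row, col] = ex   # x coordinate of the error difference value
    lut_y[row, col] = ey   # y coordinate of the error difference value
```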
Up to this point, generation of the lookup table has been described with reference to a single pattern.
That is, when a plurality of patterns are included in the first and second images, the controller 180 may calculate an error difference value for each pattern and generate the lookup table using all of the calculated error difference values.
For example, referring to FIG. 5B, when first and second patterns are included in the first and second images, the lookup table 533 may have, at the positions corresponding to each pattern, pixel values corresponding to the error difference value of that pattern.
That is, the pixels P1 at the positions corresponding to the first pattern in the lookup table 533 may have the pixel value A, and the pixels P2 at the positions corresponding to the second pattern may have the pixel value B.
On the other hand, the controller 180 may fill in the pixel values at positions that do not correspond to any pattern by interpolating the pixel values of surrounding pixels.
Accordingly, most of the pixels of the lookup table 533 may have pixel values similar to surrounding pixels.
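One plausible way to obtain such smoothly varying values is to interpolate the sparse error samples over the full image grid; the sketch below uses SciPy's griddata with hypothetical sample positions and values. The patent does not prescribe a particular interpolation scheme.

```python
import numpy as np
from scipy.interpolate import griddata

H, W = 768, 1024
# Sparse samples: (row, col) positions of matched patterns and their x-error values.
points = np.array([(299, 448), (510, 130), (100, 900), (700, 512)], dtype=np.float32)
values = np.array([2.0, -1.5, 0.8, 1.1], dtype=np.float32)

rows, cols = np.mgrid[0:H, 0:W]
# Linear interpolation inside the convex hull of the samples,
# nearest-neighbour fill outside, so every pixel receives a value.
lut_x = griddata(points, values, (rows, cols), method="linear")
nearest = griddata(points, values, (rows, cols), method="nearest")
lut_x = np.where(np.isnan(lut_x), nearest, lut_x)
```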
The controller 180 may correct at least one of the first and second images by applying the generated lookup tables to it.
For example, referring again to FIG. 5A, the lookup tables may be applied to the second image 520.
For example, when the first lookup table 531 is applied to the second image, each pixel of the second image may be moved in the x-axis direction by the pixel value at the corresponding position of the first lookup table 531.
Likewise, when the second lookup table 532 is applied to the second image, each pixel of the second image may be moved in the y-axis direction by the pixel value at the corresponding position of the second lookup table 532.
When both the first and second lookup tables 531 and 532 are applied to the second image, the image distortion due to the assembly tolerance of the two cameras can be corrected.
That is, the lookup table is applied to one of the first and second images so that the first and second images can be corrected relative to each other.
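In OpenCV terms, this per-pixel shift can be expressed as a remap operation. The sketch below assumes the convention that each lookup-table entry stores the offset from which the corrected pixel should be sampled; the actual sign convention of the patent's tables is not specified.

```python
import cv2
import numpy as np

def apply_lookup_tables(image: np.ndarray, lut_x: np.ndarray, lut_y: np.ndarray) -> np.ndarray:
    """Shift each pixel of `image` by the per-pixel error stored in the lookup tables."""
    h, w = image.shape[:2]
    cols, rows = np.meshgrid(np.arange(w, dtype=np.float32),
                             np.arange(h, dtype=np.float32))
    # remap: output(y, x) = input(map_x(y, x), map_y(y, x))
    map_x = cols + lut_x.astype(np.float32)
    map_y = rows + lut_y.astype(np.float32)
    return cv2.remap(image, map_x, map_y, interpolation=cv2.INTER_LINEAR)

# corrected_second = apply_lookup_tables(second_image, lut_x, lut_y)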
Referring again to FIG. 4, after at least one of the first and second images is corrected, a step of generating a three-dimensional image using the at least one corrected first and second images may be performed (S440).
Referring to FIG. 5C, at least one of the first and second images may be corrected in this way, and a three-dimensional image may be generated using the at least one corrected first and second images.
In this case, the depth information of the subject can be extracted based on the disparity between the two corrected first and second images.
Further, the controller 180 may generate a three-dimensional stereoscopic image of the subject using the extracted depth information.
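As an illustration of this step, a corrected pair can be fed to an off-the-shelf block matcher; the sketch below uses OpenCV's StereoBM, which is one possible implementation and not the method claimed by the patent.

```python
import cv2
import numpy as np

def depth_map(left_gray: np.ndarray, right_gray: np.ndarray,
              focal_px: float, baseline_m: float) -> np.ndarray:
    """Per-pixel depth from a corrected stereo pair (8-bit grayscale images)."""
    stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = stereo.compute(left_gray, right_gray).astype(np.float32) / 16.0  # fixed point -> px
    disparity[disparity <= 0] = np.nan  # mask invalid matches
    return focal_px * baseline_m / disparity  # Z = f * B / d per pixel
```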
Up to now, it has been described in detail how at least one of the first and second images is corrected through the coordinate information of the pattern and how a three-dimensional image is generated using the corrected first and second images.
In the present invention, the controller 180 may also recognize a specific plane from the received images without a separate user command, and control the illumination unit 210 so that light is illuminated onto the recognized plane.
FIG. 6A is a flowchart for explaining a control method in which a specific plane is recognized and light is illuminated on the specific plane, and FIGS. 6B to 6D are conceptual diagrams for explaining the control method of FIG. 6A.
Referring to FIG. 6A, the first and second images may be received through the first and second cameras 221 and 222 (S610).
In operation S620, a space corresponding to a portion of the first and second images satisfying predetermined conditions may be recognized as a specific plane among the spaces corresponding to the first and second images.
Here, the predetermined condition may be a condition defined by a preset plane recognition algorithm.
The plane recognition algorithm may be an algorithm that extracts a portion corresponding to a plane in an image using contour information of objects in the image and characteristics of concave, convex, or discontinuous points of the objects.
More specifically, the plane recognition algorithm first extracts contour information of objects in an image and separates the objects based on the extracted contours.
Then, concave points, convex points, and discontinuous points are extracted within each object, and if the number of extracted concave, convex, and discontinuous points is equal to or less than a preset number, the object can be recognized as corresponding to a plane.
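A rough sketch of such a contour/concavity test, using OpenCV primitives, might look as follows; the Canny thresholds and the maximum-defect count are hypothetical tuning parameters.

```python
import cv2
import numpy as np

def find_plane_candidates(gray: np.ndarray, max_defects: int = 2) -> list:
    """Return contours whose shape is simple enough to be treated as a plane."""
    edges = cv2.Canny(gray, 50, 150)                     # outline information
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    planes = []
    for cnt in contours:
        if len(cnt) < 4:
            continue
        hull = cv2.convexHull(cnt, returnPoints=False)   # hull as point indices
        if hull is None or len(hull) < 3:
            continue
        defects = cv2.convexityDefects(cnt, hull)        # concave points of the object
        n_defects = 0 if defects is None else len(defects)
        if n_defects <= max_defects:                     # few concavities -> plane-like
            planes.append(cnt)
    return planes
```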
Alternatively, the planar recognition algorithm may be an algorithm that uses a frame by frame technique and an edge detection technique.
According to the plane recognition algorithm of this example, a moving image having a frame rate higher than a predetermined number of frames per second can be photographed through the first and second cameras.
At this time, the light illuminated by the illumination unit 210 may be turned on and off alternately while the moving image is photographed.
For example, when the frame rate of the moving image is 60 frames per second (FPS), about 30 frames photographed in a state where the power of the illumination unit is off and about 30 frames photographed in a state where the power is on can be obtained each second.
The controller 180 may then recognize the specific plane by applying edge detection to the frames obtained in this way.
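Assuming alternating illumination-on and illumination-off frames as described above, the projected pattern can be isolated by differencing the two sets of frames, for example:

```python
import numpy as np

def extract_pattern(frames_on: list, frames_off: list) -> np.ndarray:
    """Isolate the projected pattern by differencing illumination-on and -off frames."""
    mean_on  = np.mean(np.stack(frames_on).astype(np.float32), axis=0)
    mean_off = np.mean(np.stack(frames_off).astype(np.float32), axis=0)
    diff = mean_on - mean_off            # the ambient scene cancels out
    return np.clip(diff, 0, 255).astype(np.uint8)
```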
Referring to FIG. 6A again, after step S620, a step of illuminating light on the specific plane may be performed through the illumination unit 210 (S630).
At this time, when the specific plane and a space not recognized as the specific plane coexist in the space corresponding to the first and second images, the controller may control the illumination unit 210 so that light is illuminated only onto the specific plane.
Referring to FIGS. 6A and 6B, when at least one of the first and second images includes both a portion recognized as a plane and a portion not recognized as a plane, light may be illuminated only onto the space corresponding to the portion recognized as the plane.
For example, an image object corresponding to a subject such as a person or an object may exist in another portion of the image, and such a portion is not recognized as the plane.
Referring to (c) of FIG. 6B, the illumination unit 210 may accordingly illuminate light only onto the space corresponding to the portion recognized as the plane.
Referring back to FIG. 6A, in operation S640, a lookup table for the received first and second images may be generated.
The specific procedure for generating the lookup table has already been described above with reference to FIG. 4 and FIGS. 5A to 5C.
As described above, the lookup table may have the same size as the received image.
That is, the lookup table thus generated has valid pixel values only at the positions corresponding to the portion recognized as the plane.
In order to compensate for this, the controller 180 may generate a plurality of such partial lookup tables and merge them into a final lookup table.
Hereinafter, a control method of generating the final lookup table will be described in more detail with reference to FIGS. 6C and 6D.
Referring to FIG. 6C, a third image may be received through at least one of the first and second cameras, and a portion of the third image may be recognized as a plane.
Here, the third image may be an image received from at least one of the first and second cameras at a time different from the time when the first and second images are received.
Thereafter, as described above, another lookup table for the received third image may be generated.
Then, the controller 180 may merge the lookup tables generated in this way.
Referring to FIG. 6D, a final lookup table 650 may be generated by merging the generated lookup tables.
The final lookup table 650 generated in this way may have pixel values for image correction over the entire image. Accordingly, the controller 180 can correct the entire image by using the final lookup table 650.
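A minimal sketch of such a merge, assuming each partial lookup table marks uncovered positions with NaN, is shown below; the averaging policy is an assumption, since the patent does not specify how overlapping entries are combined.

```python
import numpy as np

def merge_lookup_tables(partial_luts: list) -> np.ndarray:
    """Merge partial lookup tables: average the valid (non-NaN) entries per pixel."""
    stack = np.stack(partial_luts).astype(np.float32)   # shape: (n, H, W)
    valid = ~np.isnan(stack)
    total = np.where(valid, stack, 0.0).sum(axis=0)
    count = valid.sum(axis=0)
    merged = np.full(stack.shape[1:], np.nan, dtype=np.float32)
    np.divide(total, count, out=merged, where=count > 0)  # positions never covered stay NaN
    return merged

# final_lut_x = merge_lookup_tables([lut_x_wall, lut_x_desk, lut_x_door])  # hypothetical names
```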
According to this method, even if the user does not separately execute a command related to light illumination or image correction, when a portion recognized as a plane exists in a received image, a lookup table for that portion is generated. Such partial lookup tables are accumulated, and a final lookup table can be created from them.
Further, the generated final lookup table is applied to at least one of the received first and second images, so that correction for at least one of the first and second images can be performed without a separate input of the user.
The final look-up table 650 may be stored in memory.
Further, each time first and second images for three-dimensional image generation are received through the first and second cameras 221 and 222, the final lookup table stored in the memory may be applied to at least one of them.
Further, the final lookup table 650 may be updated every predetermined period.
In other words, every predetermined period, a new final lookup table may be generated using first and second images received through the first and second cameras, and the stored final lookup table may be replaced with it.
That is, since the final lookup table is regenerated periodically, image distortion due to assembly tolerance can be effectively corrected even if a new tolerance of the camera module occurs after the first and second images are corrected.
Referring back to FIG. 6A, after step S640, at least one of the first and second images is corrected using the generated lookup table, and a three-dimensional image can be generated using the corrected first and second images (S650).
A specific method of generating a three-dimensional image using the first and second images in this step has been described above with reference to FIG. 4 and is not repeated here.
On the other hand, in the present invention, as described above, the predetermined pattern may be a plurality of patterns.
The controller 180 may calculate the first and second coordinates of each pattern by matching the respective patterns in the first and second images.
Hereinafter, with reference to the drawings, a method of matching each pattern in the first and second images will be described in detail.
FIG. 7A is a flowchart for explaining a control method for matching patterns in the first and second images related to the present invention, and FIG. 7B is a conceptual diagram for explaining the control method of FIG. 7A.
Referring to FIG. 7A, the first and second images including the predetermined patterns may be received through the first and second cameras 221 and 222.
Thereafter, a step of sequentially matching the patterns from the central area to the outer area of the first or second image may be performed.
More specifically, each pattern may be sequentially matched along a predetermined direction starting from the central area of the first or second image.
The predetermined direction may be a radial direction or a swirling direction. For example, referring to FIG. 7B, the predetermined direction may be a swirling direction.
Meanwhile, the patterns located in the central area of an image are generally less distorted than the patterns located in the outer area, and the sizes of the patterns may differ between the central area and the outer area.
Therefore, since the pattern matching proceeds from the central area toward the outer area of the first and second images, the pattern matching can be performed more accurately and quickly.
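A center-outward traversal of the pattern grid can be generated with a simple spiral iterator, for example:

```python
def spiral_order(h: int, w: int):
    """Yield (row, col) cells of an h x w grid in a spiral starting at the center."""
    r, c = h // 2, w // 2
    yield r, c
    directions = [(0, 1), (1, 0), (0, -1), (-1, 0)]  # right, down, left, up
    step, d = 1, 0
    while step < max(h, w) * 2:
        for _ in range(2):                 # two legs per step length
            dr, dc = directions[d % 4]
            for _ in range(step):
                r, c = r + dr, c + dc
                if 0 <= r < h and 0 <= c < w:
                    yield r, c
            d += 1
        step += 1

# Example: visit a 3x3 pattern grid from the center outward.
print(list(spiral_order(3, 3)))
```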
Specifically, the matching of the patterns may be performed by comparing the codes of the patterns.
Here, the code of the pattern may mean a combination of pixels forming the pattern.
More specifically, the respective patterns may have different codes to be distinguished from each other.
For example, referring to FIG. 7B, pattern P1 and pattern P2 may have different codes. That is, different patterns may have different codes through combinations of pixels included in the pattern.
The controller 180 may match the patterns by comparing the codes of the patterns included in the first and second images.
Here, the structure of the predetermined pattern P1 will be examined with reference to FIG. 7B.
The predetermined pattern P1 may be a set of a plurality of pixels. In the drawing, a pattern P1 consisting of 25 pixels is shown, but the number of pixels is not limited thereto.
Each of the pixels may have either a first or a second pixel value at random. For example, the first or second pixel value may be 0 or 1. In the drawing, a pixel having a pixel value of 0 is denoted by A, and a pixel having a pixel value of 1 is denoted by B.
More specifically, when comparing the pixel values of the pixels included in each pattern, the controller may determine that two patterns correspond to each other when the number of pixels having matching pixel values is equal to or greater than a preset number.
Referring to FIG. 7B, in order to match a pattern corresponding to one of the patterns P1 in the first image with a pattern in the second image, the code of the pattern P1 may be compared with the codes of the patterns included in the second image.
When one matching operation is completed, matching operations for the other patterns may be performed sequentially along the predetermined direction (for example, the swirling direction), as described above. Accordingly, the matching operation for all the patterns included in the first and second images can be performed efficiently.
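A minimal sketch of such a code comparison is given below; the 5x5 code size follows the 25-pixel pattern of FIG. 7B, while the agreement threshold of 22 pixels is a hypothetical choice for the "preset number".

```python
import numpy as np

def patterns_match(code_a: np.ndarray, code_b: np.ndarray, min_agree: int = 22) -> bool:
    """Two 5x5 binary pattern codes match when enough pixel values agree."""
    return int((code_a == code_b).sum()) >= min_agree

# Hypothetical 5x5 codes (0 = pixel value A, 1 = pixel value B in the figure).
rng = np.random.default_rng(0)
p1 = rng.integers(0, 2, size=(5, 5))
p2 = p1.copy()
p2[0, 0] ^= 1                      # one-pixel difference, e.g. sensor noise
print(patterns_match(p1, p2))      # True: 24 of 25 pixels agree
```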
When the matching operation for all the patterns has been performed, the first and second coordinates of each pattern can be calculated as described above. When the calculation of the first and second coordinates is completed, a lookup table for image correction may be formed based on the calculated first and second coordinates. Through the lookup table thus formed, at least one of the first and second images can be corrected, and a three-dimensional stereoscopic image can be generated based on the first and second images of which at least one has been corrected. The generated three-dimensional stereoscopic image may be output through the display unit of the mobile terminal.
The camera module and the mobile terminal according to an embodiment of the present invention can easily correct image distortion caused by a tolerance generated when assembling the camera module, or generated during its use, by using the first and second coordinates of the predetermined pattern included in the first and second images. Accordingly, a 3D stereoscopic image with improved quality can be generated and provided.
In the camera module and the mobile terminal according to an exemplary embodiment of the present invention, even when the user does not separately input a command related to light illumination or image correction, if a portion recognized as a plane exists in a received image, a lookup table for that portion may be generated and the correction may be performed using the generated lookup table.
Thereby, image distortion that is generated or deepened while the user uses the camera module or the mobile terminal according to the present invention can be easily corrected.
The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and it may also be implemented in the form of a carrier wave (for example, transmission over the Internet). Further, the computer may include the controller 180 of the terminal.
Claims (11)
A camera module comprising:
an illumination unit configured to illuminate light onto a specific plane;
first and second cameras arranged to be spaced apart from each other and configured to photograph first and second images respectively including predetermined patterns corresponding to the light illuminated onto the specific plane; and
a controller configured to correct at least one of the first and second images using first and second coordinates of the predetermined patterns included in the first and second images, and to generate a three-dimensional image using the at least one corrected first and second images.
Wherein the controller recognizes, as the specific plane, a space corresponding to a portion of the first and second images satisfying a predetermined condition among the spaces corresponding to the first and second images,
and controls the illumination unit so that light is illuminated onto the specific plane.
Wherein the controller generates a lookup table for performing a correction on a portion of the first and second images,
and applies the lookup table to at least one of the first and second images.
Wherein the controller generates an error difference value by comparing a current difference value of the first and second coordinates with a reference difference value,
and generates the lookup table based on the generated error difference value.
Wherein the controller stores the generated lookup table in a memory,
and, when third and fourth images different from the first and second images are photographed through the first and second cameras, corrects at least one of the third and fourth images by applying the lookup table to it, and generates a three-dimensional image using the at least one corrected third and fourth images.
Wherein the lookup table stored in the memory is updated every predetermined period.
Wherein the predetermined pattern is a plurality of patterns,
and the controller calculates the first and second coordinates of each pattern by matching the respective patterns in the first and second images.
Wherein the controller sequentially matches the patterns from the central area to the outer area of the first or second image.
The predetermined pattern is a set of a plurality of pixels,
Each pixel having a first or second pixel value at random,
Wherein each of the patterns has a different code.
A mobile terminal comprising:
an illumination unit configured to illuminate light onto a specific plane;
first and second cameras arranged to be spaced apart from each other and configured to photograph first and second images respectively including predetermined patterns corresponding to the light illuminated onto the specific plane; and
a controller configured to correct at least one of the first and second images using first and second coordinates of the predetermined patterns included in the first and second images, and to generate a three-dimensional image using the at least one corrected first and second images.
A method of controlling a camera module, the method comprising:
illuminating light onto a specific plane through an illumination unit;
receiving first and second images through first and second cameras, the first and second images including a predetermined pattern corresponding to the light illuminated onto the specific plane;
correcting at least one of the first and second images using first and second coordinates of the predetermined pattern included in the first and second images; and
generating a three-dimensional image using the at least one corrected first and second images.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150166718A KR20170061536A (en) | 2015-11-26 | 2015-11-26 | Camera module and mobile terminal comprising the camera |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20170061536A true KR20170061536A (en) | 2017-06-05 |
Family
ID=59223190
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150166718A KR20170061536A (en) | 2015-11-26 | 2015-11-26 | Camera module and mobile terminal comprising the camera |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20170061536A (en) |