WO2021221207A1 - Imaging lens, and camera module and vehicle comprising same


Info

Publication number: WO2021221207A1
Authority: WIPO (PCT)
Application number: PCT/KR2020/005712
Other languages: French (fr), Korean (ko)
Inventors: 이규승, 김진범, 여상옥
Original Assignee: 엘지전자 주식회사
Application filed by 엘지전자 주식회사
Priority to PCT/KR2020/005712
Publication of WO2021221207A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B13/00 Optical objectives specially designed for the purposes specified below
    • G02B13/18 Optical objectives specially designed for the purposes specified below with lenses having one or more non-spherical faces, e.g. for reducing geometrical aberration

Definitions

  • the present invention relates to an imaging lens, a camera module including the same, and a vehicle, and more particularly, to an imaging lens capable of improving peripheral object recognition performance, and to a camera module and a vehicle including the same.
  • Cameras can be classified into standard, telephoto, and wide-angle cameras according to the focal length of the lens.
  • a wide-angle camera has a wide angle of view, but image distortion generally increases in the periphery (the wide-angle region) of the image.
  • a vehicle camera is a camera that provides information necessary for driving by recognizing an object in the front or rear of the vehicle.
  • a wide-angle camera is generally used as a vehicle camera in order to capture more information. The image distortion problem therefore greatly deteriorates the performance of a vehicle camera; as a result, an object located in the periphery in front of the vehicle cannot be accurately recognized, which is a great threat to safe driving.
  • the vehicle front camera should be able to recognize an object 35 m away at a wide angle of view (about 60 degrees), and an object more than 85 m away toward the front (about 0 degrees).
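As a rough sanity check on requirements like these, a pinhole-model estimate shows how many sensor pixels such a target would span. The target width, focal length, and pixel pitch below are hypothetical illustration values, not figures from this document, and the on-axis pinhole relation ignores exactly the peripheral distortion this patent addresses:

```python
def pixels_on_target(obj_width_m, distance_m, efl_mm, pixel_pitch_um):
    """Pinhole estimate of how many horizontal pixels an object spans:
    image width on the sensor = EFL * (object width / distance)."""
    image_width_mm = efl_mm * obj_width_m / distance_m
    return image_width_mm * 1000.0 / pixel_pitch_um

# Hypothetical values: a 0.5 m wide target, 4.3 mm focal length,
# 3.0 um pixel pitch (none of these come from the patent).
near = pixels_on_target(0.5, 35.0, 4.3, 3.0)   # target at 35 m
far = pixels_on_target(0.5, 85.0, 4.3, 3.0)    # target at 85 m
```

The estimate makes the trade-off concrete: the same target yields far fewer pixels at 85 m than at 35 m, which is why the central part needs long-focal-length behavior while the periphery needs width.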
  • a camera used in a vehicle needs to be able to recognize surrounding objects even in a dark environment, so a bright lens with a small f-number is required.
  • a multi-camera system combines a camera for photographing the central portion with a camera for photographing the peripheral portion, but it has the problems that the volume and price of the camera increase and that the images captured by the plurality of cameras must be merged into one.
  • a peripheral-distortion-mitigation camera has the problem that the peripheral portion of the captured image becomes larger, so a larger image sensor is required, which in turn increases the volume and price of the camera.
  • an object of the present invention is to provide an imaging lens capable of increasing object recognition performance of a peripheral part while maintaining object recognition performance of a central part.
  • an object of the present invention is to provide an imaging lens capable of accurately recognizing an object even in a dark environment in order to solve the above problems.
  • an object of the present invention is to provide a vehicle that can promote the safety of occupants in various situations by mounting a camera module including the imaging lens.
  • to achieve the above objects, an imaging lens according to an embodiment of the present invention includes, in order from the object side to the image side, a first lens group having negative power and a target surface convex toward the object side, a second lens group having negative power, a third lens group having positive power, a fourth lens group having positive power, a fifth lens group having negative power, and a sixth lens group having positive power.
  • Each of the first to sixth lens groups may include at least one lens.
  • the first lens group may have a meniscus shape convex toward the object.
  • the second lens group may have a meniscus shape convex toward the image.
  • the third lens group or the fourth lens group may have both convex shapes.
  • At least one of the target surface and the imaging surface of the fifth lens group or the sixth lens group may have at least one inflection point.
  • At least one of the first lens group to the sixth lens group may include at least one aspherical lens.
  • in the imaging lens according to an embodiment of the present invention for achieving the above object, when the distance from the object-side surface of the first lens group to the image plane is TTL, the conditional expression 20 mm < TTL < 25 mm may be satisfied.
  • when the half angle of view of the imaging lens is ANG, the conditional expression ANG > 50° may be satisfied.
  • in the imaging lens according to an embodiment of the present invention for achieving the above object, when the diagonal length of the image plane is ImgH and the entrance pupil diameter of the imaging lens is EPD, the conditional expression 1.3 < ImgH/EPD < 2.0 may be satisfied.
  • in the imaging lens according to an embodiment of the present invention for achieving the above object, when the distance from the object-side surface of the first lens group to the image plane is TTL and the total focal length of the imaging lens is EFL, the conditional expression 4.8 < TTL/EFL < 5.6 may be satisfied.
  • in the imaging lens according to an embodiment of the present invention for achieving the above object, when the distance from the object-side surface of the first lens group to the image plane is TTL and the half angle of view of the imaging lens is ANG, the conditional expression 2.5 < ANG/TTL < 2.95 may be satisfied.
  • in the imaging lens according to an embodiment of the present invention for achieving the above object, when the diagonal length of the image plane is ImgH and the distance from the object-side surface of the first lens group to the image plane is TTL, the conditional expression 4.9 < TTL/ImgH < 5.7 may be satisfied.
  • in the imaging lens according to an embodiment of the present invention for achieving the above object, when the diagonal length of the image plane is ImgH, the total focal length of the imaging lens is EFL, and the half angle of view of the imaging lens is ANG, the conditional expression 0.4 < ImgH/(EFL * tan(ANG)) < 0.6 may be satisfied.
  • in the imaging lens according to an embodiment of the present invention for achieving the above object, when the total focal length of the imaging lens is EFL and the focal length of the first lens group is f1, the conditional expression -2.42 < f1/EFL < -2.14 may be satisfied.
  • in the imaging lens according to an embodiment of the present invention for achieving the above object, when the total focal length of the imaging lens is EFL and the focal length of the second lens group is f2, the conditional expression -5.93 < f2/EFL < -5.11 may be satisfied.
  • in the imaging lens according to an embodiment of the present invention for achieving the above object, when the total focal length of the imaging lens is EFL and the focal length of the sixth lens group is f6, the conditional expression 3.22 < f6/EFL < 3.64 may be satisfied.
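The conditional expressions above can be gathered into a single design check. The sketch below assumes a hypothetical set of design values (TTL, EFL, ImgH, EPD, ANG, f1, f2, f6 are placeholders chosen to fall inside the stated ranges, not the patent's actual prescription data):

```python
import math

# Hypothetical design values chosen only to illustrate the check;
# they are NOT the patent's prescription data.
design = {
    "TTL": 22.0,   # first-group object-side surface to image plane (mm)
    "EFL": 4.3,    # total focal length (mm)
    "ImgH": 4.2,   # diagonal length of the image plane (mm)
    "EPD": 2.6,    # entrance pupil diameter (mm)
    "ANG": 60.0,   # half angle of view (degrees)
    "f1": -9.8,    # focal length of the first lens group (mm)
    "f2": -23.5,   # focal length of the second lens group (mm)
    "f6": 14.8,    # focal length of the sixth lens group (mm)
}

def check(d):
    """Evaluate each conditional expression listed above for a design d."""
    t = math.tan(math.radians(d["ANG"]))
    return {
        "20mm < TTL < 25mm":             20.0 < d["TTL"] < 25.0,
        "ANG > 50deg":                   d["ANG"] > 50.0,
        "1.3 < ImgH/EPD < 2.0":          1.3 < d["ImgH"] / d["EPD"] < 2.0,
        "4.8 < TTL/EFL < 5.6":           4.8 < d["TTL"] / d["EFL"] < 5.6,
        "2.5 < ANG/TTL < 2.95":          2.5 < d["ANG"] / d["TTL"] < 2.95,
        "4.9 < TTL/ImgH < 5.7":          4.9 < d["TTL"] / d["ImgH"] < 5.7,
        "0.4 < ImgH/(EFL*tanANG) < 0.6": 0.4 < d["ImgH"] / (d["EFL"] * t) < 0.6,
        "-2.42 < f1/EFL < -2.14":        -2.42 < d["f1"] / d["EFL"] < -2.14,
        "-5.93 < f2/EFL < -5.11":        -5.93 < d["f2"] / d["EFL"] < -5.11,
        "3.22 < f6/EFL < 3.64":          3.22 < d["f6"] / d["EFL"] < 3.64,
    }
```

Note that the ranges are mutually consistent: with TTL near 22 mm, TTL/EFL near 5.1 implies EFL near 4.3 mm, TTL/ImgH near 5.2 implies ImgH near 4.2 mm, and ImgH/EPD near 1.6 implies EPD near 2.6 mm.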
  • the imaging lens according to an embodiment of the present invention, which includes six lens groups, has the effect of improving object recognition performance in the peripheral part while maintaining object recognition performance in the central part.
  • the imaging lens according to an embodiment of the present invention includes at least one aspherical lens and a plastic lens and is designed to have a low f-number, so that an object can be accurately recognized even in a dark environment.
  • the camera module according to an embodiment of the present invention applies an imaging lens including six miniaturized lens groups together with an image sensor of the same size as an existing sensor, and thus has the effect that it can be applied directly to an existing vehicle camera module.
  • a vehicle including the imaging lens according to an embodiment of the present invention has the effect of promoting occupant safety in various situations by accurately recognizing an object located in the peripheral portion.
  • FIG. 1 is a view showing a vehicle equipped with a camera module including an imaging lens according to an embodiment of the present invention, together with the photographing angle of view of the camera module.
  • FIG. 2 is a block diagram referenced for explaining a vehicle according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating an imaging lens according to an embodiment of the present invention.
  • FIG. 4 is a graph showing coma aberration measured for the imaging lens of FIG. 3.
  • FIG. 5 is a graph illustrating spherical aberration, astigmatism, and distortion of the imaging lens of FIG. 3.
  • FIG. 6 shows a result of comparing the degree of distortion in the periphery of an image photographed using the imaging lens of FIG. 3 with that of a conventional imaging lens.
  • FIG. 7 shows the result of comparing the object recognition distance according to the angle of view of the imaging lens of FIG. 3 with that of the conventional imaging lens.
  • the suffixes "module" and "part" for the components used in the following description are given or mixed in consideration only of the ease of writing the specification, and do not by themselves have meanings or roles distinct from each other. Accordingly, the terms "module" and "unit" may be used interchangeably.
  • FIG. 1 is a view showing a vehicle 10 equipped with a camera module 100 including an imaging lens 110 according to an embodiment of the present invention, together with the photographing angle of view of the camera module 100, and FIG. 2 is a block diagram referenced for describing the vehicle 10 according to an embodiment of the present invention.
  • the vehicle 10 may include wheels rotated by a power source and a steering input device 200 for controlling the traveling direction of the vehicle 10 .
  • the vehicle 10 may be an autonomous vehicle.
  • the vehicle 10 may be switched to an autonomous driving mode or a manual mode based on a user input received through a user interface device (not shown).
  • the vehicle 10 may be switched to an autonomous driving mode or a manual mode based on driving situation information.
  • the driving situation information may include at least one of object information outside the vehicle, navigation information, and vehicle state information.
  • the vehicle 10 may be switched from the manual mode to the autonomous driving mode or from the autonomous driving mode to the manual mode based on the driving situation information generated by the object detection apparatus 300 .
  • the vehicle 10 may receive a user input for driving through the driving manipulation device 50 . Based on a user input received through the driving manipulation device 50 , the vehicle 10 may be driven.
  • the vehicle 10 may include the object detecting apparatus 300 .
  • the object detecting apparatus 300 is an apparatus for detecting an object located outside the vehicle 10 .
  • the object detection apparatus 300 may generate object information based on the sensed data.
  • the object information may include information on the existence of an object, location information of the object, distance information between the vehicle 10 and the object, and relative speed information between the vehicle 10 and the object.
  • the object may be various objects related to the operation of the vehicle 10 .
  • the object may include a vehicle, other vehicle, pedestrian, two-wheeled vehicle, traffic signal, light, road, structure, speed bump, building, feature, animal, and the like.
  • the object may be classified into a moving object and a still object.
  • the moving object may be a concept including another moving vehicle and a moving pedestrian.
  • the stationary object may be a concept including a traffic signal, a road, a structure, a building, another stopped vehicle, and a stationary pedestrian.
  • the camera module 100 may include an imaging lens 110 including a plurality of lens groups.
  • the camera module 100 may be located at an appropriate place outside the vehicle in order to acquire an image outside the vehicle.
  • the camera module 100 may include an imaging lens 110 , a filter 120 , and an image sensor 130 .
  • the imaging lens 110 may be configured by arranging a plurality of lens groups in a line along an optical axis.
  • the imaging lens 110 refracts light incident from the subject to form an image on the image sensor 130 .
  • the configuration of the imaging lens 110 will be described in detail below with reference to FIGS. 3 to 6 .
  • the filter 120 may selectively transmit light passing through the imaging lens 110 according to a wavelength.
  • the filter 120 may include an infrared filter 121 (Infrared Ray Filter).
  • the filter 120 may further include a cover glass 122 and the like.
  • the infrared filter 121 and the cover glass 122 may be replaced with other filters or omitted as necessary, and may be designed so as not to affect the optical characteristics of the imaging lens 110 .
  • the image sensor 130 is disposed to be spaced apart from the imaging lens 110 , and performs a function of converting light input through the imaging lens 110 into an electrical signal.
  • a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) may be used as the image sensor 130 .
  • the camera module 100 may further include a lens barrel 140 and an actuator 150 .
  • the lens barrel 140 serves as a housing for protecting the imaging lens 110 , and may move in the optical axis direction according to the driving of the actuator 150 .
  • the actuator 150 performs an auto focus (AF) function by moving the lens barrel 140 and the bobbin (not shown) along the optical axis direction through electromagnetic force using a coil.
  • the actuator 150 may be configured as a voice coil motor (VCM) or the like.
  • the camera module 100 may acquire position information of an object, distance information from an object, or relative speed information with an object by using various image processing algorithms.
  • the camera module 100 may acquire distance information and relative speed information from an object based on a change in the size of the object over time from the acquired image.
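The size-change approach described above can be sketched with a pinhole model: the distance to an object of known real width W is d = EFL · W / w, where w is the object's width on the sensor, so tracking w across frames yields distance and relative speed. The focal length, object width, and image sizes below are illustrative assumptions, not values from this document:

```python
def distance_m(efl_mm, real_width_m, image_width_mm):
    """Pinhole similar triangles: d = EFL * W / w (EFL and w in mm, W in m)."""
    return efl_mm * real_width_m / image_width_mm

def relative_speed_mps(d_prev_m, d_curr_m, dt_s):
    """Negative result: the object is getting closer."""
    return (d_curr_m - d_prev_m) / dt_s

# Illustration: a 1.8 m wide car whose image grows from 0.10 mm to 0.11 mm
# over 1.0 s has moved from about 77.4 m to about 70.4 m away.
d0 = distance_m(4.3, 1.8, 0.10)
d1 = distance_m(4.3, 1.8, 0.11)
v = relative_speed_mps(d0, d1, 1.0)   # about -7 m/s (closing)
```

Any real implementation would need a known or estimated object width and a stable bounding-box measurement; the point here is only the geometry.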
  • the camera module 100 may be disposed adjacent to the front windshield in the interior of the vehicle to acquire an image of the front of the vehicle.
  • the camera module 100 may be disposed around a front bumper or a radiator grill.
  • the camera module 100 may be disposed adjacent to the rear glass in the interior of the vehicle to obtain an image of the rear of the vehicle.
  • the camera module 100 may be disposed around a rear bumper, a trunk, or a tailgate.
  • the camera module 100 may be disposed adjacent to at least one of the side windows in the interior of the vehicle in order to acquire an image of the side of the vehicle.
  • the camera module 100 may be disposed around a side mirror, a fender, or a door.
  • the camera module 100 may provide the acquired image to the processor 350 of the object detection apparatus 300 .
  • the object detection apparatus 300 may include a radar 310 , a lidar 320 , an ultrasonic sensor 330 , an infrared sensor 340 , and a processor 350 . Meanwhile, the object detection apparatus 300 may operate in association with the camera module 100 .
  • the object detecting apparatus 300 may further include other components in addition to the described components, or may not include some of the described components.
  • the radar 310 may include an electromagnetic wave transmitter and a receiver.
  • the radar 310 may be implemented in a pulse radar method or a continuous wave radar method in view of a radio wave emission principle.
  • the radar 310 may be implemented in a frequency modulated continuous wave (FMCW) method or a frequency shift keying (FSK) method according to the signal waveform among continuous wave radar methods.
  • the radar 310 may detect an object based on electromagnetic waves using a time of flight (TOF) method or a phase-shift method, and may detect the position of the detected object, the distance to the detected object, and the relative speed.
  • the radar 310 may be disposed at an appropriate location outside the vehicle to detect an object located in front, rear or side of the vehicle.
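The pulse (TOF) and FMCW principles mentioned above reduce to simple range formulas: R = c·Δt/2 for a round-trip time Δt, and R = c·f_b·T/(2B) for an FMCW sweep of bandwidth B over sweep time T producing beat frequency f_b. A minimal sketch with illustrative parameter values:

```python
C = 299_792_458.0  # speed of light (m/s)

def tof_range_m(round_trip_s):
    """Pulse/TOF ranging: the echo travels out and back, so R = c*t/2."""
    return C * round_trip_s / 2.0

def fmcw_range_m(beat_hz, sweep_bw_hz, sweep_time_s):
    """FMCW ranging: beat frequency f_b maps to R = c*f_b*T / (2*B)."""
    return C * beat_hz * sweep_time_s / (2.0 * sweep_bw_hz)

# Illustrative numbers: a 1 microsecond round trip is roughly 150 m;
# a 1 MHz beat with a 300 MHz sweep over 40 us is roughly 20 m.
r_tof = tof_range_m(1e-6)
r_fmcw = fmcw_range_m(1e6, 300e6, 40e-6)
```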
  • the lidar 320 may include a laser transmitter and a receiver.
  • the lidar 320 may be implemented in a time of flight (TOF) method or a phase-shift method.
  • the lidar 320 may be implemented as a driven or non-driven type.
  • when implemented as a driven type, the lidar 320 may be rotated by a motor and may detect an object around the vehicle 10 .
  • the lidar 320 may detect an object located within a predetermined range with respect to the vehicle 10 by light steering.
  • the vehicle 10 may include a plurality of non-driven lidars 320 .
  • the lidar 320 may detect an object based on laser light using a time of flight (TOF) method or a phase-shift method, and may detect the position of the detected object, the distance to the detected object, and the relative speed.
  • the lidar 320 may be disposed at an appropriate location outside the vehicle to detect an object located in front, rear or side of the vehicle.
  • the ultrasonic sensor 330 may include an ultrasonic transmitter and a receiver.
  • the ultrasound sensor 330 may detect an object based on ultrasound, and detect a position of the detected object, a distance from the detected object, and a relative speed.
  • the ultrasonic sensor 330 may be disposed at an appropriate location outside the vehicle to detect an object located at the front, rear, or side of the vehicle.
  • the infrared sensor 340 may include an infrared transmitter and a receiver.
  • the infrared sensor 340 may detect an object based on infrared light, and detect a position of the detected object, a distance from the detected object, and a relative speed.
  • the infrared sensor 340 may be disposed at an appropriate location outside the vehicle to detect an object located in the front, rear, or side of the vehicle.
  • the processor 350 may control the overall operation of each unit of the object detection apparatus 300 .
  • the processor 350 may detect or classify an object by comparing data sensed by the camera module 100 , the radar 310 , the lidar 320 , the ultrasonic sensor 330 , and the infrared sensor 340 with pre-stored data.
  • the processor 350 may detect and track the object based on the image acquired by the camera module 100 .
  • the processor 350 may perform operations such as calculating a distance to an object and calculating a relative speed with respect to an object through an image processing algorithm.
  • the processor 350 may acquire distance information and relative velocity information from the obtained image based on a change in the size of the object over time.
  • the processor 350 may acquire distance information and relative speed information of an object through a pinhole model, road surface profiling, or the like.
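The pinhole-model / road-surface idea above can be illustrated with the flat-road approximation: for a camera mounted at height h, a ray depressed by angle θ below the horizon meets the road at distance h / tan(θ), so the image row where an object touches the road encodes its distance. The mounting height and angle below are assumptions for illustration only:

```python
import math

def ground_distance_m(cam_height_m, depression_deg):
    """Flat-road pinhole estimate: the ray through the object's road-contact
    point, depressed by depression_deg below the horizon, meets the road at
    distance h / tan(theta)."""
    return cam_height_m / math.tan(math.radians(depression_deg))

# Hypothetical mount height of 1.3 m and a 2-degree depression angle:
d = ground_distance_m(1.3, 2.0)   # roughly 37 m
```

Road surface profiling refines this by replacing the flat-road assumption with an estimated road geometry.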
  • the processor 350 may detect and track the object based on at least one of a reflected electromagnetic wave, a reflected laser, a reflected ultrasonic wave, and a reflected infrared light transmitted to the object and reflected back.
  • the processor 350 may perform operations such as calculating a distance to an object and calculating a relative speed with respect to an object based on at least one of reflected electromagnetic waves, reflected lasers, reflected ultrasonic waves, and reflected infrared light.
  • the object detecting apparatus 300 may include a plurality of processors 350 or may not include the processors 350 .
  • each of the camera module 100 , the radar 310 , the lidar 320 , the ultrasonic sensor 330 , and the infrared sensor 340 may individually include a processor.
  • the object detection apparatus 300 may be operated under the control of the processor or the controller 900 of the apparatus in the vehicle 10 .
  • the vehicle 10 may further include, as needed, a user interface device that receives user input and provides information generated in the vehicle 10 to the user, a vehicle driving device that electrically controls the driving of various devices in the vehicle 10 , a communication device that communicates with an external device, a driving system that controls various operations of the vehicle 10 in the autonomous driving mode, a navigation system that provides navigation information, an interface unit for exchanging data with or supplying power to an external device connected to the vehicle 10 , a sensing unit for sensing the state of the vehicle 10 , a memory for storing various data related to the vehicle 10 , and a power supply for supplying the power required for the operation of each component of the vehicle 10 .
  • the camera module 100 installed in the vehicle 10 may include at least one imaging lens 110 .
  • the imaging lens 110 may have an angle of view of a certain size.
  • the camera module 100 may be disposed in the vehicle 10 such that the imaging lens 110 faces the front direction of the vehicle 10 .
  • the area located in the front direction of the camera module 100 is defined as the central area Ar1, and a partial area near the left or right end of the range of the angle of view that the camera module 100 can photograph is defined as the peripheral area Ar2.
  • An image of the central region Ar1 is included in the center of the image captured by the camera module 100 , and an image of the peripheral region Ar2 is included in the periphery of the image.
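Why the peripheral region Ar2 tends to carry less usable image information can be seen by comparing projection models: a rectilinear (distortion-free) mapping places a field angle θ at image height EFL·tan(θ), while a typical wide-angle mapping is closer to the equidistant EFL·θ, which compresses the periphery onto fewer pixels. A sketch (the focal length is an illustrative value):

```python
import math

def rectilinear_height_mm(efl_mm, theta_deg):
    """Distortion-free mapping: image height = EFL * tan(theta)."""
    return efl_mm * math.tan(math.radians(theta_deg))

def equidistant_height_mm(efl_mm, theta_deg):
    """Typical wide-angle (fisheye-like) mapping: image height = EFL * theta."""
    return efl_mm * math.radians(theta_deg)

# At a 60-degree field angle the equidistant height is only about 60% of
# the rectilinear one: the peripheral region lands on far fewer pixels.
ratio = equidistant_height_mm(4.3, 60.0) / rectilinear_height_mm(4.3, 60.0)
```

This ratio at the 60-degree half angle falls in the same 0.4-0.6 band as the ImgH/(EFL * tan(ANG)) condition stated earlier, which is suggestive of the projection regime the design targets, though the patent does not state this connection explicitly.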
  • FIG. 3 is a diagram illustrating an imaging lens 110 according to an embodiment of the present invention.
  • the spherical or aspherical shape of the lens in FIG. 3 is only presented as an example and is not limited thereto.
  • the term 'target surface' refers to the surface of a lens facing the object side with respect to the optical axis.
  • the term 'image-forming surface' refers to the surface of a lens facing the image side with respect to the optical axis.
  • positive power of a lens indicates a converging lens that converges parallel light.
  • negative power of a lens indicates a diverging lens that diverges parallel light.
  • the imaging lens 110 may include, in order from the object side to the image side, a first lens group 101 , a second lens group 102 , a third lens group 103 , a fourth lens group 104 , a fifth lens group 105 , and a sixth lens group 106 .
  • Each of the first lens group 101 to the sixth lens group 106 may include at least one lens.
  • a shutter may be included on the front surface of the first lens group 101 , and an iris (not shown) may be disposed between the first lens group 101 and the sixth lens group 106 .
  • the aperture may be a variable aperture.
  • the imaging lens 110 is included in the camera module 100 , and the filter 120 and the image sensor 130 may be arranged, in order, spaced apart from the sixth lens group 106 of the imaging lens 110 .
  • the filter 120 may include an infrared filter 121 . Meanwhile, the filter 120 may further include a cover glass 122 and the like.
  • the infrared filter 121 and the cover glass 122 may be a flat optical member, and the cover glass 122 may be a glass for protecting the imaging surface.
  • the image sensor 130 may detect light incident from the sixth lens group 106 .
  • the first lens group 101 may include at least one lens.
  • the first lens group 101 may have negative (-) power.
  • the first lens group 101 may be disposed closest to the object side, the target surface S11 may be convex, and the imaging surface S12 may be concave. That is, the first lens group 101 may have a meniscus shape convex toward the object.
  • the first lens group 101 may be formed to be relatively larger than the second lens group 102 to the sixth lens group 106 . Accordingly, all light incident through the target surface S11 of the first lens group 101 may be incident on the target surface S21 of the second lens group 102 , so that the imaging lens 110 can implement a wide angle of view.
  • the second lens group 102 may include at least one lens.
  • the second lens group 102 may have negative (-) power.
  • the second lens group 102 may be disposed to be spaced apart from the imaging surface S12 of the first lens group 101 , the target surface S21 may be concave, and the imaging surface S22 may be convex. That is, the second lens group 102 may have a meniscus shape convex toward the image side.
  • the third lens group 103 may include at least one lens.
  • the third lens group 103 may have positive (+) power.
  • the third lens group 103 may be disposed to be spaced apart from the imaging plane S22 of the second lens group 102 , and both the target plane S31 and the imaging plane S32 may be convex.
  • the fourth lens group 104 may include at least one lens.
  • the fourth lens group 104 may have positive (+) power.
  • the fourth lens group 104 may be disposed to be spaced apart from the imaging plane S32 of the third lens group 103 , and both the target plane S41 and the imaging plane S42 may be convex.
  • the fifth lens group 105 may include at least one lens.
  • the fifth lens group 105 may have negative (-) power.
  • the fifth lens group 105 may be disposed to be spaced apart from the imaging surface S42 of the fourth lens group 104 .
  • At least one inflection point may be formed on at least one of the target surface S51 and the imaging surface S52 of the fifth lens group 105 .
  • the target surface S51 may be concave in the paraxial region and convex toward the edge
  • the imaging surface S52 may be convex in the paraxial region and concave toward the edge.
  • the shape of the fifth lens group 105 is not limited thereto.
  • the sixth lens group 106 may include at least one lens.
  • the sixth lens group 106 may have positive (+) power.
  • the sixth lens group 106 may be disposed to be spaced apart from the imaging surface S52 of the fifth lens group 105 .
  • At least one inflection point may be formed on at least one of the target surface S61 and the imaging surface S62 of the sixth lens group 106 .
  • the target surface S61 may be convex in the paraxial region and concave toward the edge
  • the imaging surface S62 may be concave in the paraxial region and convex toward the edge.
  • the shape of the sixth lens group 106 is not limited thereto.
  • At least one of the first lens group 101 to the sixth lens group 106 may include an aspherical lens, and all of the lenses may have a rotationally symmetric shape with respect to the optical axis.
  • the lenses included in the first lens group 101 to the sixth lens group 106 may be made of a glass material or a plastic material.
  • when plastic lenses are used, the manufacturing cost can be greatly reduced.
  • the surfaces of the lenses included in the first lens group 101 to the sixth lens group 106 may be coated to prevent reflection or improve surface hardness.
  • the imaging lens 110 configured as described above can reduce distortion of the peripheral portion, and greatly increase the object recognition performance of the peripheral portion in the camera module 100 or the vehicle 10 including the imaging lens 110 .
  • Table 1 shows the radius of curvature, thickness, or distance of each lens group included in the imaging lens 110 according to an embodiment of the present invention.
  • the unit of the radius of curvature and the thickness or distance is millimeters.
  • on the optical axis, the distance (thickness) from the target surface S11 to the imaging surface S12 of the first lens group 101 is 0.9101 mm, the distance (thickness) from the target surface S21 to the imaging surface S22 of the second lens group 102 is 4.8463 mm, the distance (thickness) from the target surface S31 to the imaging surface S32 of the third lens group 103 is 1.9086 mm, the distance (thickness) from the target surface S41 to the imaging surface S42 of the fourth lens group 104 is 5.1864 mm, the distance (thickness) from the target surface S51 to the imaging surface S52 of the fifth lens group 105 may be 0.9221 mm, and the distance (thickness) from the target surface S61 to the imaging surface S62 of the sixth lens group 106 may be 1.4247 mm.
  • on the optical axis, the imaging surface S12 of the first lens group 101 is spaced 3.3112 mm from the target surface S21 of the second lens group 102 , the imaging surface S22 of the second lens group 102 is spaced 0.0899 mm from the target surface S31 of the third lens group 103 , the imaging surface S32 of the third lens group 103 is spaced 0.0855 mm from the target surface S41 of the fourth lens group 104 , the imaging surface S42 of the fourth lens group 104 is spaced 0.1591 mm from the target surface S51 of the fifth lens group 105 , the imaging surface S52 of the fifth lens group 105 is spaced 0.4962 mm from the target surface S61 of the sixth lens group 106 , and the imaging surface S62 of the sixth lens group 106 may be disposed 0.1861 mm from the upper surface S71 of the filter 121 .
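As a cross-check, summing the on-axis thicknesses and air gaps quoted above gives the length from the first surface S11 to the filter surface S71. The filter thickness and the remaining back-focal distance are not quoted here, so this is only a lower bound on TTL:

```python
# On-axis group thicknesses (S11-S12, S21-S22, ..., S61-S62), in mm:
thicknesses = [0.9101, 4.8463, 1.9086, 5.1864, 0.9221, 1.4247]
# On-axis air gaps (S12-S21, S22-S31, S32-S41, S42-S51, S52-S61, S62-S71), in mm:
air_gaps = [3.3112, 0.0899, 0.0855, 0.1591, 0.4962, 0.1861]

s11_to_s71 = sum(thicknesses) + sum(air_gaps)   # about 19.53 mm
# Adding the (unquoted) filter, cover glass, and back-focal distances would
# bring the total toward the 20 mm < TTL < 25 mm condition stated earlier.
```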
  • Table 2 shows the conic constant (k) and aspheric coefficients (A to H) of the lens surface of each lens group included in the imaging lens 110 according to an embodiment of the present invention.
  • the first lens group 101 and the fourth lens group 104 to the sixth lens group 106 include aspherical lenses.
  • the first lens group 101 to the sixth lens group 106 may include at least one aspherical lens, and are not limited to the examples shown in Table 2.
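The conic constant k and aspheric coefficients A to H of Table 2 parameterize the standard even-asphere sag equation z(r) = c·r² / (1 + √(1 − (1+k)·c²·r²)) + A·r⁴ + B·r⁶ + …, with c the surface curvature. A sketch of evaluating it (the curvature, conic constant, and coefficients below are placeholders, not Table 2's values):

```python
import math

def asphere_sag(r, c, k, coeffs):
    """Even-asphere sag:
    z(r) = c*r^2 / (1 + sqrt(1 - (1+k)*c^2*r^2)) + A*r^4 + B*r^6 + ...
    c: curvature (1 / radius of curvature), k: conic constant,
    coeffs: (A, B, C, ...) multiplying r^4, r^6, r^8, ..."""
    z = c * r**2 / (1.0 + math.sqrt(1.0 - (1.0 + k) * c**2 * r**2))
    for i, a in enumerate(coeffs):
        z += a * r**(4 + 2 * i)
    return z

# Placeholder surface: 10 mm radius (c = 0.1), k = -0.5, small A and B terms.
z = asphere_sag(1.0, 0.1, -0.5, (1e-4, -1e-6))
```

The inflection points mentioned for the fifth and sixth lens groups correspond to sign changes in the second derivative of such a sag profile.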
  • the imaging lens 110 may satisfy Conditional Expression 1: 1.55 < Fno < 1.7.
  • Fno is a constant (F-number) indicating the brightness of the imaging lens 110 .
  • as Fno increases, the imaging lens 110 becomes darker, and the amount of light received by the imaging lens 110 in the same environment decreases.
  • when Fno is greater than 1.7, the imaging lens 110 lacks the ability to recognize an object in a dark place, so the imaging lens 110 cannot realize the target object recognition performance. On the other hand, when Fno is less than 1.55, the size or the number of the lenses included in the imaging lens 110 increases, so the imaging lens 110 becomes bulkier and heavier.
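The f-number relates the total focal length to the entrance pupil diameter, Fno = EFL / EPD. A quick check with hypothetical values consistent with the ranges stated elsewhere in this document (both numbers are placeholders, not the patent's data):

```python
def f_number(efl_mm, epd_mm):
    """Fno = EFL / EPD: a smaller Fno means a brighter lens."""
    return efl_mm / epd_mm

# Placeholder values: a 4.3 mm focal length with a 2.6 mm entrance pupil
# gives an Fno of about 1.65, inside the 1.55-1.7 window discussed above.
fno = f_number(4.3, 2.6)
```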
  • the imaging lens 110 may satisfy Conditional Expression 2 below.
  • TTL is the distance (Total Track Length) from the object-side incident surface of the first lens group 101 to the image surface. That is, TTL represents the total length of the imaging lens 110 .
  • if the TTL is longer than 25 mm, the imaging lens 110 is too long to be easily mounted in the camera module 100 applied to the vehicle 10 , and if the TTL is shorter than 20 mm, the image quality of the imaging lens 110 may deteriorate.
  • The imaging lens 110 may satisfy Conditional Expression 3 below: ANG > 50°.
  • ANG is the half angle of view of the imaging lens, that is, half of the total angle of view of the imaging lens 110.
  • If ANG is 50 degrees or less, the area that the imaging lens 110 can capture is too small, and the imaging lens 110 cannot achieve the target object recognition performance.
  • The imaging lens 110 may satisfy Conditional Expression 4 below: 1.3 ≤ ImgH/EPD ≤ 2.0.
  • ImgH is the diagonal length (image height) of the image surface of the image sensor 130, and EPD is the entrance pupil diameter of the imaging lens 110.
  • If the ImgH/EPD value falls outside this range, the volume of the imaging lens 110 may increase, desired information may not be sufficiently obtained, or the image may darken.
  • The imaging lens 110 may satisfy Conditional Expression 5 below: 4.8 ≤ TTL/EFL ≤ 5.6.
  • TTL is the distance from the object-side incident surface of the first lens group 101 to the image surface, and EFL is the total focal length of the imaging lens 110.
  • When the TTL/EFL value is greater than 5.6, the amount of information in the center of the photographed image decreases; when the TTL/EFL value is less than 4.8, the amount of information in the periphery of the photographed image decreases, so the imaging lens 110 cannot achieve the target object recognition performance.
  • The imaging lens 110 may satisfy Conditional Expression 6 below: 2.5 ≤ ANG/TTL ≤ 2.95.
  • ANG is the half angle of view of the imaging lens, and TTL is the distance from the object-side incident surface of the first lens group 101 to the image surface.
  • When the ANG/TTL value is less than 2.5, the amount of information in the photographed image decreases and the size of the imaging lens 110 increases; when the ANG/TTL value is greater than 2.95, the amount of information in the photographed image also decreases.
  • The imaging lens 110 may satisfy Conditional Expression 7 below: 4.9 ≤ TTL/ImgH ≤ 5.7.
  • TTL is the distance from the object-side incident surface of the first lens group 101 to the image surface, and ImgH is the diagonal length of the image surface of the image sensor 130.
  • When the TTL/ImgH value is greater than 5.7, the volume of the imaging lens 110 increases, and when the TTL/ImgH value is less than 4.9, the image quality of the periphery of the photographed image deteriorates.
  • The imaging lens 110 may satisfy Conditional Expression 8 below: 0.4 ≤ ImgH/(EFL * tan(ANG)) ≤ 0.6.
  • ImgH is the diagonal length of the image surface of the image sensor 130, EFL is the total focal length of the imaging lens 110, and ANG is the half angle of view of the imaging lens.
  • If the ImgH/(EFL * tan(ANG)) value is less than 0.4, the distortion of the imaging lens 110 increases and the amount of information in the photographed image decreases. If the ImgH/(EFL * tan(ANG)) value is greater than 0.6, the object recognition performance in the center of the captured image decreases and the amount of information in the periphery decreases.
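For context on Conditional Expression 8: a distortion-free rectilinear lens forms an image height of EFL·tan(ANG) at field angle ANG, so the ratio ImgH/(EFL·tan(ANG)) measures how strongly the wide-angle design compresses the periphery onto the sensor. The numeric values below are assumed for illustration only.

```python
import math

def distortion_ratio(imgh_mm: float, efl_mm: float, ang_deg: float) -> float:
    """ImgH / (EFL * tan(ANG)): actual diagonal image height relative to
    the height a distortion-free rectilinear lens would produce."""
    return imgh_mm / (efl_mm * math.tan(math.radians(ang_deg)))

imgh = 4.2   # assumed diagonal image height in mm (hypothetical)
efl = 4.4    # assumed total focal length in mm (hypothetical)
ang = 60.0   # assumed half angle of view in degrees (hypothetical)

r = distortion_ratio(imgh, efl, ang)
print(f"ImgH/(EFL*tan(ANG)) = {r:.2f}")                   # ~0.55
print(f"satisfies 0.4 <= value <= 0.6: {0.4 <= r <= 0.6}")
```

A value well below 1 reflects the deliberate compression of the wide field onto a compact sensor; the claimed 0.4 to 0.6 window balances that compression against peripheral information loss.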
  • The imaging lens 110 may satisfy Conditional Expression 9 below: -2.42 ≤ f1/EFL ≤ -2.14.
  • f1 is the focal length of the first lens group 101, and EFL is the total focal length of the imaging lens 110.
  • If the f1/EFL value falls outside this range, the angle of view of the imaging lens 110 becomes narrow and the area that can be captured through the imaging lens 110 decreases; accordingly, the imaging lens 110 cannot achieve the target object recognition performance.
  • The imaging lens 110 may satisfy Conditional Expression 10 below: -5.93 ≤ f2/EFL ≤ -5.11.
  • f2 is the focal length of the second lens group 102, and EFL is the total focal length of the imaging lens 110.
  • When the f2/EFL value is less than -5.93 or greater than -5.11, the quality of the peripheral area of the captured image decreases.
  • The imaging lens 110 may satisfy Conditional Expression 11 below: 3.22 ≤ f6/EFL ≤ 3.64.
  • f6 is the focal length of the sixth lens group 106, and EFL is the total focal length of the imaging lens 110.
  • If the f6/EFL value falls outside this range, the imaging lens 110 cannot achieve the target object recognition performance.
  • As described above, the imaging lens 110 satisfies the conditional expressions. Accordingly, the imaging lens 110 has improved optical performance, can be applied to the vehicle 10 in a compact size, and can maintain high object recognition performance in a dark environment and in the peripheral area.
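The eleven conditional expressions above can be collected into a single sanity check. This is only a sketch: the design values in `design` are hypothetical numbers chosen to fall inside every stated range, not the values of the patent's embodiment.

```python
import math

# Hypothetical design parameters (mm and degrees), for illustration only.
design = {
    "Fno": 1.6, "TTL": 22.0, "ANG": 60.0, "ImgH": 4.2,
    "EPD": 2.8, "EFL": 4.4, "f1": -10.0, "f2": -24.0, "f6": 15.0,
}

def check(d: dict) -> dict:
    """Evaluate Conditional Expressions 1-11 against a parameter set."""
    t = d["EFL"] * math.tan(math.radians(d["ANG"]))
    return {
        "1: 1.55 <= Fno <= 1.7":            1.55 <= d["Fno"] <= 1.7,
        "2: 20 <= TTL <= 25 (mm)":          20.0 <= d["TTL"] <= 25.0,
        "3: ANG > 50 (deg)":                d["ANG"] > 50.0,
        "4: 1.3 <= ImgH/EPD <= 2.0":        1.3 <= d["ImgH"] / d["EPD"] <= 2.0,
        "5: 4.8 <= TTL/EFL <= 5.6":         4.8 <= d["TTL"] / d["EFL"] <= 5.6,
        "6: 2.5 <= ANG/TTL <= 2.95":        2.5 <= d["ANG"] / d["TTL"] <= 2.95,
        "7: 4.9 <= TTL/ImgH <= 5.7":        4.9 <= d["TTL"] / d["ImgH"] <= 5.7,
        "8: 0.4 <= ImgH/(EFL*tanANG) <= 0.6": 0.4 <= d["ImgH"] / t <= 0.6,
        "9: -2.42 <= f1/EFL <= -2.14":      -2.42 <= d["f1"] / d["EFL"] <= -2.14,
        "10: -5.93 <= f2/EFL <= -5.11":     -5.93 <= d["f2"] / d["EFL"] <= -5.11,
        "11: 3.22 <= f6/EFL <= 3.64":       3.22 <= d["f6"] / d["EFL"] <= 3.64,
    }

for name, ok in check(design).items():
    print(f"{name}: {'OK' if ok else 'FAIL'}")
```

Such a checker is a convenient gate during optimization: a candidate prescription is only kept if every expression reports OK.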
  • FIG. 4 is a graph measuring the coma aberration of the imaging lens 110 of FIG. 3; it shows the tangential and sagittal aberrations of each wavelength according to the field height of the imaging lens 110.
  • FIG. 5 is a graph illustrating the longitudinal spherical aberration, astigmatic field curves, and distortion of the imaging lens of FIG. 3.
  • In these graphs, the Y-axis represents the size of the image, and the X-axis represents the focal length (in mm) and the degree of distortion (in %). The closer each curve is to the Y-axis, the better the aberration correction of the imaging lens 110.
  • FIG. 6 shows a result of comparing the degree of distortion in the periphery of an image photographed using the imaging lens 110 of FIG. 3 with that of a conventional imaging lens.
  • FIG. 6(a) shows an image 601 taken with a conventional imaging lens, and FIG. 6(b) shows an image 602 taken with the imaging lens 110 according to an embodiment of the present invention.
  • The object OB1a located in the central part Ar1 (near 0° of view) of the image 601 taken with the conventional imaging lens is similar to the shape of the real object, but the object OB2a located in the peripheral part Ar2 (near 60° of view) is severely distorted compared to the shape and size of the actual object.
  • In contrast, both the object OB1b located in the central portion Ar1 and the object OB2b located in the peripheral portion Ar2 of the image 602 taken with the imaging lens 110 according to an embodiment of the present invention are similar to the shapes of the real objects.
  • That is, the horizontal length W2 of the image 602 photographed with the imaging lens 110 according to an embodiment of the present invention is the same as the horizontal length W1 of the image 601 photographed with the conventional imaging lens, yet an object located in the periphery Ar2 can be photographed more accurately, without distortion.
  • Accordingly, the imaging lens 110 can be directly applied to existing vehicle camera modules.
  • In addition, the object detection apparatus 300 of the vehicle 10 can accurately detect and track objects in the periphery of the image obtained from the camera module 100, thereby promoting the safety of the occupants in various situations.
  • FIG. 7 shows a result of comparing the object recognition distance according to the angle of view of the imaging lens 110 of FIG. 3 with that of a conventional imaging lens.
  • At the central portion (0°), the imaging lens 110 has an object recognition distance of about 85 m, which is 96% of the 89 m object recognition distance of a conventional imaging lens; its object recognition performance at the center is therefore almost the same as that of the conventional imaging lens.
  • At the periphery (60°), the imaging lens 110 has an object recognition distance of about 37 m, which is 148% of the 25 m object recognition distance of a conventional imaging lens; its object recognition performance in the peripheral area is therefore significantly improved compared to the conventional imaging lens.
  • That is, the imaging lens 110 improves the object recognition performance of the peripheral part by increasing the amount of information in the periphery while maintaining the object recognition performance of the central part.


Abstract

The present invention relates to an imaging lens, and to a camera module and a vehicle comprising the same. An imaging lens according to one embodiment of the present invention comprises, in order from the object side to the image side: a first lens group which has negative power and whose object-side surface is convex toward the object side; a second lens group having negative power; a third lens group having positive power; a fourth lens group having positive power; a fifth lens group having negative power; and a sixth lens group having positive power, wherein each of the first to sixth lens groups can include at least one lens. Therefore, distortion of the peripheral part can be mitigated and object recognition performance at the peripheral part can be improved.

Description

Imaging lens, and camera module and vehicle including the same
The present invention relates to an imaging lens, and to a camera module and a vehicle including the same, and more particularly, to an imaging lens capable of improving peripheral object recognition performance, and to a camera module and a vehicle including the same.
Cameras can be classified into standard, telephoto, and wide-angle cameras according to the focal length of the lens. Among them, a wide-angle camera has a wide angle of view, but image distortion generally becomes severe in the periphery (wide-angle part).
A vehicle camera is a camera that provides information necessary for driving by recognizing objects in front of or behind the vehicle. A wide-angle camera is generally used as a vehicle camera in order to capture more information. The image distortion problem therefore greatly degrades the performance of a vehicle camera: objects located in the periphery in front of the vehicle cannot be recognized accurately, which is a serious threat to safe driving.
A front vehicle camera should be able to recognize an object 35 m away at a wide angle (about 60 degrees) and an object 85 m or more away at the front (about 0 degrees). In addition, a camera used in a vehicle needs to recognize surrounding objects even in a dark environment, so a bright lens with a small f-number is required.
Conventionally, multi-camera systems using two or more cameras, or cameras that mitigate peripheral distortion so that the image occupies more pixels, have been widely used in vehicles.
A multi-camera system includes a camera for photographing the central portion and a camera for photographing the periphery; it therefore increases the volume and the price of the camera system and requires merging the images captured by the plurality of cameras into one.
Meanwhile, a peripheral-distortion-mitigating camera enlarges the photographed peripheral image and therefore requires a larger image sensor, which again increases the volume and the price of the camera.
In order to solve the above problems, an object of the present invention is to provide an imaging lens capable of increasing the object recognition performance of the peripheral part while maintaining the object recognition performance of the central part.
Another object of the present invention is to provide an imaging lens capable of accurately recognizing an object even in a dark environment.
Still another object of the present invention is to provide a vehicle that promotes the safety of its occupants in various situations by mounting a camera module including the imaging lens.
The problems of the present invention are not limited to the problems mentioned above, and other problems not mentioned will be clearly understood by those of ordinary skill in the art to which the present invention belongs from the following description.
In order to achieve the above objects, an imaging lens according to an embodiment of the present invention includes, in order from the object side to the image side: a first lens group which has negative power and whose object-side surface is convex toward the object side; a second lens group having negative power; a third lens group having positive power; a fourth lens group having positive power; a fifth lens group having negative power; and a sixth lens group having positive power, wherein each of the first to sixth lens groups may include at least one lens.
Meanwhile, in the imaging lens according to an embodiment of the present invention, the first lens group may have a meniscus shape convex toward the object side.
Meanwhile, in the imaging lens according to an embodiment of the present invention, the second lens group may have a meniscus shape convex toward the image side.
Meanwhile, in the imaging lens according to an embodiment of the present invention, the third lens group or the fourth lens group may have a biconvex shape.
Meanwhile, in the imaging lens according to an embodiment of the present invention, at least one of the object-side surface and the image-side surface of the fifth lens group or the sixth lens group may have at least one inflection point.
Meanwhile, in the imaging lens according to an embodiment of the present invention, at least one of the first lens group to the sixth lens group may include at least one aspherical lens.
Meanwhile, in the imaging lens according to an embodiment of the present invention, when Fno is a constant representing the brightness of the imaging lens, the conditional expression 1.55 ≤ Fno ≤ 1.7 may be satisfied.
Meanwhile, in the imaging lens according to an embodiment of the present invention, when the distance from the object-side surface of the first lens group to the image surface is TTL, the conditional expression 20 mm ≤ TTL ≤ 25 mm may be satisfied.
Meanwhile, in the imaging lens according to an embodiment of the present invention, when the half angle of view of the imaging lens is ANG, the conditional expression ANG > 50° may be satisfied.
Meanwhile, in the imaging lens according to an embodiment of the present invention, when the diagonal length of the image surface is ImgH and the entrance pupil diameter of the imaging lens is EPD, the conditional expression 1.3 ≤ ImgH/EPD ≤ 2.0 may be satisfied.
Meanwhile, in the imaging lens according to an embodiment of the present invention, when the distance from the object-side surface of the first lens group to the image surface is TTL and the total focal length of the imaging lens is EFL, the conditional expression 4.8 ≤ TTL/EFL ≤ 5.6 may be satisfied.
Meanwhile, in the imaging lens according to an embodiment of the present invention, when the distance from the object-side surface of the first lens group to the image surface is TTL and the half angle of view of the imaging lens is ANG, the conditional expression 2.5 ≤ ANG/TTL ≤ 2.95 may be satisfied.
Meanwhile, in the imaging lens according to an embodiment of the present invention, when the diagonal length of the image surface is ImgH and the distance from the object-side surface of the first lens group to the image surface is TTL, the conditional expression 4.9 ≤ TTL/ImgH ≤ 5.7 may be satisfied.
Meanwhile, in the imaging lens according to an embodiment of the present invention, when the diagonal length of the image surface is ImgH, the total focal length of the imaging lens is EFL, and the half angle of view of the imaging lens is ANG, the conditional expression 0.4 ≤ ImgH/(EFL * tan(ANG)) ≤ 0.6 may be satisfied.
Meanwhile, in the imaging lens according to an embodiment of the present invention, when the total focal length of the imaging lens is EFL and the focal length of the first lens group is f1, the conditional expression -2.42 ≤ f1/EFL ≤ -2.14 may be satisfied.
Meanwhile, in the imaging lens according to an embodiment of the present invention, when the total focal length of the imaging lens is EFL and the focal length of the second lens group is f2, the conditional expression -5.93 ≤ f2/EFL ≤ -5.11 may be satisfied.
Meanwhile, in the imaging lens according to an embodiment of the present invention, when the total focal length of the imaging lens is EFL and the focal length of the sixth lens group is f6, the conditional expression 3.22 ≤ f6/EFL ≤ 3.64 may be satisfied.
The details of other embodiments are included in the detailed description and drawings.
According to the present invention, the following effects are obtained.
The imaging lens according to an embodiment of the present invention includes six lens groups and can thereby increase the object recognition performance of the peripheral part while maintaining the object recognition performance of the central part.
Meanwhile, the imaging lens according to an embodiment of the present invention includes at least one aspherical lens, plastic lenses, and the like, and is designed to have a low f-number, so that objects can be accurately recognized even in a dark environment.
Meanwhile, the camera module according to an embodiment of the present invention applies an imaging lens including six miniaturized lens groups together with an image sensor of the same size as an existing sensor, so that it can be directly applied to existing vehicle camera modules.
Meanwhile, a vehicle including the imaging lens according to an embodiment of the present invention can accurately recognize objects located in the periphery and thereby promote the safety of its occupants in various situations.
The effects of the present invention are not limited to the effects mentioned above, and other effects not mentioned will be clearly understood by those of ordinary skill in the art from the description of the claims.
FIG. 1 is a view showing a vehicle equipped with a camera module including an imaging lens according to an embodiment of the present invention, and the photographing angle of view of the camera module.
FIG. 2 is a block diagram referenced in describing a vehicle according to an embodiment of the present invention.
FIG. 3 is a view illustrating an imaging lens according to an embodiment of the present invention.
FIG. 4 is a graph measuring the coma aberration of the imaging lens of FIG. 3.
FIG. 5 is a graph illustrating the spherical aberration, astigmatism, and distortion of the imaging lens of FIG. 3.
FIG. 6 shows a result of comparing the degree of distortion in the periphery of an image photographed using the imaging lens of FIG. 3 with that of a conventional imaging lens.
FIG. 7 shows a result of comparing the object recognition distance according to the angle of view of the imaging lens of FIG. 3 with that of a conventional imaging lens.
Hereinafter, the present invention will be described in more detail with reference to the drawings.
Regardless of the reference numerals, the same or similar components are given the same reference numbers, and redundant descriptions thereof are omitted. The suffixes "module" and "unit" for components used in the following description are given or used interchangeably only in consideration of ease of drafting the specification and do not in themselves have distinct meanings or roles. Accordingly, "module" and "unit" may be used interchangeably.
In describing the embodiments disclosed herein, detailed descriptions of related known technologies are omitted when it is determined that they may obscure the gist of the embodiments disclosed herein. The accompanying drawings are only intended to facilitate understanding of the embodiments disclosed herein; the technical spirit disclosed herein is not limited by the accompanying drawings and should be understood to include all changes, equivalents, and substitutes falling within the spirit and scope of the present invention.
Terms including ordinal numbers, such as first and second, may be used to describe various components, but the components are not limited by these terms. These terms are used only to distinguish one component from another.
When a component is referred to as being "connected" or "coupled" to another component, it may be directly connected or coupled to the other component, but other components may also exist in between. In contrast, when a component is referred to as being "directly connected" or "directly coupled" to another component, it should be understood that no other component exists in between.
A singular expression includes the plural expression unless the context clearly dictates otherwise.
In the present application, terms such as "comprise" or "have" are intended to specify that a feature, number, step, operation, component, part, or combination thereof described in the specification exists, and do not preclude in advance the existence or addition of one or more other features, numbers, steps, operations, components, parts, or combinations thereof.
FIG. 1 is a view showing a vehicle 10 equipped with a camera module 100 including an imaging lens 110 according to an embodiment of the present invention, and the photographing angle of view of the camera module 100; FIG. 2 is a block diagram referenced in describing the vehicle 10 according to an embodiment of the present invention.
Referring to FIGS. 1 and 2, the vehicle 10 may include wheels rotated by a power source and a steering input device 200 for controlling the traveling direction of the vehicle 10.
The vehicle 10 may be an autonomous vehicle. The vehicle 10 may be switched to an autonomous driving mode or a manual mode based on a user input received through a user interface device (not shown).
The vehicle 10 may be switched to the autonomous driving mode or the manual mode based on driving situation information. The driving situation information may include at least one of object information outside the vehicle, navigation information, and vehicle state information.
For example, the vehicle 10 may be switched from the manual mode to the autonomous driving mode, or from the autonomous driving mode to the manual mode, based on the driving situation information generated by the object detection apparatus 300.
When the vehicle 10 is driven in the manual mode, the vehicle 10 may receive a user input for driving through the driving manipulation device 50, and the vehicle 10 may be driven based on the user input received through the driving manipulation device 50.
Meanwhile, the vehicle 10 may include the object detection apparatus 300.
The object detection apparatus 300 is an apparatus for detecting an object located outside the vehicle 10. The object detection apparatus 300 may generate object information based on sensing data.
The object information may include information on the presence or absence of an object, location information of the object, distance information between the vehicle 10 and the object, and relative speed information between the vehicle 10 and the object.
The object may be any of various objects related to the operation of the vehicle 10. For example, the object may include a lane, another vehicle, a pedestrian, a two-wheeled vehicle, a traffic signal, light, a road, a structure, a speed bump, a building, a terrain feature, an animal, and the like.
Meanwhile, objects may be classified into moving objects and stationary objects. For example, a moving object may be a concept including another vehicle in motion and a moving pedestrian, and a stationary object may be a concept including a traffic signal, a road, a structure, a building, another stopped vehicle, and a stationary pedestrian.
The camera module 100 may include an imaging lens 110 including a plurality of lens groups. The camera module 100 may be located at an appropriate position on the exterior of the vehicle in order to acquire an image of the outside of the vehicle.
The camera module 100 may include an imaging lens 110, a filter 120, and an image sensor 130.
The imaging lens 110 may be configured by arranging a plurality of lens groups in a line along the optical axis. The imaging lens 110 refracts light incident from a subject so that an image is formed on the image sensor 130. The configuration of the imaging lens 110 will be described in detail below with reference to FIGS. 3 to 6.
The filter 120 may selectively transmit the light that has passed through the imaging lens 110 according to wavelength. For example, the filter 120 may include an infrared (IR) filter 121. Meanwhile, the filter 120 may further include a cover glass 122 and the like. The infrared filter 121 and the cover glass 122 may be replaced with other filters or omitted as necessary, and may be designed so as not to affect the optical characteristics of the imaging lens 110.
The image sensor 130 is disposed apart from the imaging lens 110, and converts the light input through the imaging lens 110 into an electrical signal. A charge-coupled device (CCD), a complementary metal-oxide-semiconductor (CMOS) sensor, or the like may be used as the image sensor 130.
Meanwhile, the camera module 100 may further include a lens barrel 140 and an actuator 150.
The lens barrel 140 serves as a housing that protects the imaging lens 110, and may move in the optical-axis direction as the actuator 150 is driven.
The actuator 150 performs an autofocus (AF) function by moving the lens barrel 140 and a bobbin (not shown) along the optical-axis direction using the electromagnetic force of a coil. The actuator 150 may be configured as a voice coil motor (VCM) or the like.
Meanwhile, the camera module 100 may acquire position information of an object, distance information to the object, or relative speed information with respect to the object by using various image processing algorithms.
For example, the camera module 100 may acquire the distance information and the relative speed information from the acquired images based on the change in the size of the object over time.
For example, the camera module 100 may be disposed close to the front windshield in the interior of the vehicle in order to acquire an image of the area ahead of the vehicle. Alternatively, the camera module 100 may be disposed around the front bumper or the radiator grille.
For example, the camera module 100 may be disposed close to the rear glass in the interior of the vehicle in order to acquire an image of the area behind the vehicle. Alternatively, the camera module 100 may be disposed around the rear bumper, the trunk, or the tailgate.
For example, the camera module 100 may be disposed close to at least one of the side windows in the interior of the vehicle in order to acquire an image of the area beside the vehicle. Alternatively, the camera module 100 may be disposed around a side mirror, a fender, or a door.
The camera module 100 may provide the acquired image to the processor 350 of the object detection apparatus 300.
The object detection apparatus 300 may include a radar 310, a lidar 320, an ultrasonic sensor 330, an infrared sensor 340, and a processor 350. Meanwhile, the object detection apparatus 300 may operate in association with the camera module 100.
Depending on the embodiment, the object detection apparatus 300 may further include components other than those described, or may omit some of the described components.
The radar 310 may include an electromagnetic wave transmitter and a receiver. In terms of the radio wave emission principle, the radar 310 may be implemented as a pulse radar or a continuous wave radar. Among continuous wave radar schemes, the radar 310 may be implemented as a frequency modulated continuous wave (FMCW) radar or a frequency shift keying (FSK) radar according to the signal waveform.
The radar 310 may detect an object via electromagnetic waves based on a time-of-flight (TOF) scheme or a phase-shift scheme, and may detect the position of the detected object, the distance to the detected object, and the relative speed.
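The TOF ranging mentioned above can be sketched with the standard round-trip relation (the patent names the scheme but gives no formulas; the relations below are the usual physics, not text from the patent):

```python
# TOF ranging sketch: distance = speed of light * round-trip time / 2.
# Two successive range measurements then give a relative speed estimate.
C = 299_792_458.0  # speed of light in m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Distance to a target from the measured round-trip time of a pulse."""
    return C * round_trip_s / 2.0

def relative_speed_mps(d1_m: float, d2_m: float, dt_s: float) -> float:
    """Relative speed from two range measurements taken dt_s apart.
    Negative means the object is approaching."""
    return (d2_m - d1_m) / dt_s

# A pulse returning after 1 microsecond corresponds to roughly 149.9 m.
d = tof_distance_m(1e-6)
```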
The radar 310 may be disposed at an appropriate position on the exterior of the vehicle in order to detect an object located in front of, behind, or beside the vehicle.
The lidar 320 may include a laser transmitter and a receiver. The lidar 320 may be implemented using a time-of-flight (TOF) scheme or a phase-shift scheme.
The lidar 320 may be implemented as a driven type or a non-driven type.
When implemented as a driven type, the lidar 320 is rotated by a motor and may detect objects around the vehicle 10.
When implemented as a non-driven type, the lidar 320 may detect objects located within a predetermined range with respect to the vehicle 10 by means of optical steering. The vehicle 10 may include a plurality of non-driven lidars 320.
The lidar 320 may detect an object via laser light based on a time-of-flight (TOF) scheme or a phase-shift scheme, and may detect the position of the detected object, the distance to the detected object, and the relative speed.
The lidar 320 may be disposed at an appropriate position on the exterior of the vehicle in order to detect an object located in front of, behind, or beside the vehicle.
The ultrasonic sensor 330 may include an ultrasonic transmitter and a receiver. The ultrasonic sensor 330 may detect an object based on ultrasonic waves, and may detect the position of the detected object, the distance to the detected object, and the relative speed.
The ultrasonic sensor 330 may be disposed at an appropriate position on the exterior of the vehicle in order to detect an object located in front of, behind, or beside the vehicle.
The infrared sensor 340 may include an infrared transmitter and a receiver. The infrared sensor 340 may detect an object based on infrared light, and may detect the position of the detected object, the distance to the detected object, and the relative speed.
The infrared sensor 340 may be disposed at an appropriate position on the exterior of the vehicle in order to detect an object located in front of, behind, or beside the vehicle.
The processor 350 may control the overall operation of each unit of the object detection apparatus 300.
The processor 350 may detect or classify an object by comparing data sensed by the camera module 100, the radar 310, the lidar 320, the ultrasonic sensor 330, and the infrared sensor 340 with pre-stored data.
The processor 350 may detect and track an object based on the images acquired by the camera module 100. Through image processing algorithms, the processor 350 may perform operations such as calculating the distance to the object and calculating the relative speed with respect to the object.
For example, the processor 350 may acquire the distance information and the relative speed information from the acquired images based on the change in the size of the object over time.
For example, the processor 350 may acquire the distance information and the relative speed information through a pinhole model, road surface profiling, or the like.
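A minimal sketch of the pinhole-model distance estimate named above: the patent gives no formula, so the standard relation distance = focal length × real height / image height is assumed, and the focal length and assumed pedestrian height below are illustrative values, not from the patent.

```python
# Pinhole-model distance estimate (assumed standard form, not the patent's
# own implementation). All numeric values here are illustrative.
def pinhole_distance_m(focal_len_px: float, real_height_m: float,
                       image_height_px: float) -> float:
    """Distance to an object of known real height from its height in pixels."""
    return focal_len_px * real_height_m / image_height_px

def relative_speed_mps(dist_prev_m: float, dist_now_m: float, dt_s: float) -> float:
    """Relative speed from the change in estimated distance between frames.
    Negative means the object is approaching."""
    return (dist_now_m - dist_prev_m) / dt_s

# A 1.5 m tall pedestrian imaged 100 px tall by a camera with a 1000 px
# focal length is estimated at 15 m; if the pedestrian grows to 150 px one
# second later, the new estimate is 10 m, i.e. closing at 5 m/s.
d1 = pinhole_distance_m(1000.0, 1.5, 100.0)
d2 = pinhole_distance_m(1000.0, 1.5, 150.0)
v = relative_speed_mps(d1, d2, 1.0)
```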
The processor 350 may detect and track an object based on at least one of a reflected electromagnetic wave, a reflected laser beam, a reflected ultrasonic wave, and reflected infrared light that is transmitted toward the object and reflected back. Based on at least one of these, the processor 350 may perform operations such as calculating the distance to the object and calculating the relative speed with respect to the object.
Depending on the embodiment, the object detection apparatus 300 may include a plurality of processors 350, or may not include the processor 350 at all. For example, each of the camera module 100, the radar 310, the lidar 320, the ultrasonic sensor 330, and the infrared sensor 340 may include its own processor.
When the object detection apparatus 300 does not include the processor 350, the object detection apparatus 300 may be operated under the control of a processor of another apparatus in the vehicle 10 or of the controller 900.
Meanwhile, although not shown in the drawings, the vehicle 10 according to an embodiment of the present invention may further include, as necessary: a user interface device that receives user input and provides the user with information generated in the vehicle 10; a vehicle drive device that electrically controls the driving of various devices in the vehicle 10; a communication device that communicates with external devices; an operation system that controls various operations of the vehicle 10 in an autonomous driving mode; a navigation system that provides navigation information; a sensing unit that senses the state of the vehicle; an interface unit that exchanges data with, or supplies power to, external equipment connected to the vehicle 10; a memory that stores various data related to the vehicle 10; and a power supply unit that supplies the power required for the operation of each component of the vehicle 10.
Referring to FIG. 1(b), the camera module 100 installed in the vehicle 10 according to an embodiment of the present invention may include at least one imaging lens 110. The imaging lens 110 may have an angle of view of a certain size.
The camera module 100 may be disposed in the vehicle 10 such that the imaging lens 110 faces the forward direction of the vehicle 10. The area located in the forward direction of the camera module 100 is defined as a central area Ar1, and partial areas near the left and right ends of the angle-of-view range that the camera module 100 can capture are defined as peripheral areas Ar2.
The center of an image captured by the camera module 100 contains the image of the central area Ar1, and the periphery of the image contains the image of the peripheral areas Ar2.
Meanwhile, the following description focuses on an embodiment in which the camera module 100 is disposed to face the forward direction of the vehicle 10.
FIG. 3 is a diagram illustrating an imaging lens 110 according to an embodiment of the present invention. The spherical or aspherical shapes of the lenses in FIG. 3 are presented only as an example, and the lenses are not limited to these shapes.
In the present invention, the term 'target surface' refers to the surface of a lens that faces the object side with respect to the optical axis, and the term 'imaging surface' refers to the surface of a lens that faces the image side with respect to the optical axis.
In addition, in the present invention, 'positive power' of a lens refers to a converging lens that converges parallel light, and 'negative power' of a lens refers to a diverging lens that diverges parallel light.
Referring to the drawing, the imaging lens 110 may include, in order from the object side to the image side, a first lens group 101, a second lens group 102, a third lens group 103, a fourth lens group 104, a fifth lens group 105, and a sixth lens group 106.
Each of the first to sixth lens groups 101 to 106 may include at least one lens.
A shutter may be provided in front of the first lens group 101, and an iris (not shown) may be disposed between any of the first to sixth lens groups 101 to 106. The iris may be a variable iris.
The imaging lens 110 is included in the camera module 100, and the filter 120 and the image sensor 130 may be arranged, in that order, on the image side of the sixth lens group 106 of the imaging lens 110, spaced apart therefrom.
The filter 120 may include an infrared filter 121. Meanwhile, the filter 120 may further include a cover glass 122 and the like. The infrared filter 121 and the cover glass 122 are flat-plate optical members, and the cover glass 122 may be a glass for protecting the imaging plane.
The image sensor 130 may detect the light incident from the sixth lens group 106.
The first lens group 101 may include at least one lens. The first lens group 101 may have negative (-) power.
The first lens group 101 is disposed closest to the object side; its target surface S11 may be convex and its imaging surface S12 may be concave. That is, the first lens group 101 may have a meniscus shape convex toward the object side.
The first lens group 101 may be formed relatively larger than the second to sixth lens groups 102 to 106. Accordingly, all of the light incident through the target surface S11 of the first lens group 101 can enter the target surface S21 of the second lens group 102, and the imaging lens 110 can realize a wide angle of view.
The second lens group 102 may include at least one lens. The second lens group 102 may have negative (-) power.
The second lens group 102 may be disposed apart from the imaging surface S12 of the first lens group 101; its target surface S21 may be concave and its imaging surface S22 may be convex. That is, the second lens group 102 may have a meniscus shape convex toward the image side.
The third lens group 103 may include at least one lens. The third lens group 103 may have positive (+) power.
The third lens group 103 may be disposed apart from the imaging surface S22 of the second lens group 102, and both its target surface S31 and its imaging surface S32 may be convex.
The fourth lens group 104 may include at least one lens. The fourth lens group 104 may have positive (+) power.
The fourth lens group 104 may be disposed apart from the imaging surface S32 of the third lens group 103, and both its target surface S41 and its imaging surface S42 may be convex.
The fifth lens group 105 may include at least one lens. The fifth lens group 105 may have negative (-) power.
The fifth lens group 105 may be disposed apart from the imaging surface S42 of the fourth lens group 104.
At least one inflection point may be formed on at least one of the target surface S51 and the imaging surface S52 of the fifth lens group 105. For example, the target surface S51 may be concave in the paraxial region and become convex toward the edge, and the imaging surface S52 may be convex in the paraxial region and become concave toward the edge. However, the shape of the fifth lens group 105 is not limited thereto.
The sixth lens group 106 may include at least one lens. The sixth lens group 106 may have positive (+) power.
The sixth lens group 106 may be disposed apart from the imaging surface S52 of the fifth lens group 105.
At least one inflection point may be formed on at least one of the target surface S61 and the imaging surface S62 of the sixth lens group 106. For example, the target surface S61 may be convex in the paraxial region and become concave toward the edge, and the imaging surface S62 may be concave in the paraxial region and become convex toward the edge. However, the shape of the sixth lens group 106 is not limited thereto.
Meanwhile, at least one of the first to sixth lens groups 101 to 106 may include an aspherical lens, and all of the lenses may have a rotationally symmetric shape about the optical axis.
In addition, the lenses included in the first to sixth lens groups 101 to 106 may be made of glass or plastic. When the lenses are made of plastic, the manufacturing cost can be greatly reduced.
Meanwhile, the surfaces of the lenses included in the first to sixth lens groups 101 to 106 may be coated to prevent reflection or to improve surface hardness.
The imaging lens 110 configured as described above can reduce distortion in the peripheral area, and can greatly improve the peripheral object recognition performance of the camera module 100 or the vehicle 10 including the imaging lens 110.
Table 1 shows the radius of curvature and the thickness or distance for each lens group included in the imaging lens 110 according to an embodiment of the present invention. The radii of curvature and the thicknesses or distances are given in millimeters.
Surface   Radius of curvature (R)   Thickness or distance (d)
S11       6.3787                    0.9010
S12       1.0038                    3.3112
S21       -6.6687                   4.8463
S22       -12.7126                  0.0899
S31       8.8979                    1.9086
S32       -16.7565                  0.0855
S41       7.1583                    5.1864
S42       -6.8404                   0.1591
S51       -7.2431                   0.9221
S52       -3.6526                   0.4962
S61       6.9944                    1.4247
S62       -4.6244                   0.1861
S71       Infinity                  0.7000
S72       Infinity                  0.7459
S81       Infinity                  0.5000
S82       Infinity                  2.0522
S91       Infinity                  -
Table 1 lists the curvatures (S11 to S82) of the target surfaces and imaging surfaces of the first to sixth lens groups 101 to 106, the filter 121, and the cover glass 122, as well as the curvature (S91) of the top surface of the image sensor 130.
In the table, a positive (+) curvature indicates a surface curved toward the object side, and a negative (-) curvature indicates a surface curved toward the image sensor 130. A curvature of infinity indicates a flat surface.
Referring to Table 1 together with FIG. 3, on the optical axis, the distance (thickness) from the target surface S11 to the imaging surface S12 of the first lens group 101 is 0.9010 mm; the distance from the target surface S21 to the imaging surface S22 of the second lens group 102 is 4.8463 mm; the distance from the target surface S31 to the imaging surface S32 of the third lens group 103 is 1.9086 mm; the distance from the target surface S41 to the imaging surface S42 of the fourth lens group 104 is 5.1864 mm; the distance from the target surface S51 to the imaging surface S52 of the fifth lens group 105 is 0.9221 mm; and the distance from the target surface S61 to the imaging surface S62 of the sixth lens group 106 may be 1.4247 mm.
Meanwhile, on the optical axis, the imaging surface S12 of the first lens group 101 is spaced 3.3112 mm from the target surface S21 of the second lens group 102; the imaging surface S22 of the second lens group 102 is spaced 0.0899 mm from the target surface S31 of the third lens group 103; the imaging surface S32 of the third lens group 103 is spaced 0.0855 mm from the target surface S41 of the fourth lens group 104; the imaging surface S42 of the fourth lens group 104 is spaced 0.1591 mm from the target surface S51 of the fifth lens group 105; the imaging surface S52 of the fifth lens group 105 is spaced 0.4962 mm from the target surface S61 of the sixth lens group 106; and the imaging surface S62 of the sixth lens group 106 may be spaced 0.1861 mm from the top surface S71 of the filter 121.
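Summing the "thickness or distance" column of Table 1 gives the total on-axis length from the first target surface S11 to the image sensor surface S91, under the assumption that every gap along the optical axis appears exactly once in that column:

```python
# The d-column of Table 1, in order from S11 to S82 (values in mm).
d_column_mm = [
    0.9010, 3.3112,   # first lens group, air gap to S21
    4.8463, 0.0899,   # second lens group, air gap to S31
    1.9086, 0.0855,   # third lens group, air gap to S41
    5.1864, 0.1591,   # fourth lens group, air gap to S51
    0.9221, 0.4962,   # fifth lens group, air gap to S61
    1.4247, 0.1861,   # sixth lens group, air gap to S71
    0.7000, 0.7459,   # filter 121, air gap to S81
    0.5000, 2.0522,   # cover glass 122, air gap to S91
]

ttl_mm = sum(d_column_mm)
```

The result is about 23.5152 mm, which lies inside the 20 mm to 25 mm range required by Conditional Expression 2 below.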
Table 2 shows the conic constant (k) and the aspheric coefficients (A to H) of the lens surfaces of the lens groups included in the imaging lens 110 according to an embodiment of the present invention.
Surface   S11           S12           S41           S42
k         0             -9.5101E-01   0             0
A         3.0727E-02    -2.9624E-01   -2.1312E-02   -1.2469E-02
B         -5.6615E-03   -1.0941E-02   -1.4257E-03   -3.5599E-03
C         1.3920E-04    3.5206E-06    3.8415E-06    2.1891E-04
D         -1.7341E-06   -6.3767E-06   -6.4873E-06   -4.8837E-06
E         0             0             0             0
F         0             0             0             0
G         0             0             0             0
H         0             0             0             0

Surface   S51           S52           S61           S62
k         0             0             0             0
A         -8.6257E-02   7.7845E-02    2.7118E-02    1.5320E-01
B         1.9402E-02    1.6114E-02    -1.1361E-02   -3.7489E-03
C         -2.1250E-03   -8.7124E-06   9.0737E-04    -2.7679E-04
D         1.3862E-04    -6.5660E-05   -6.3378E-05   6.8647E-05
E         -3.7078E-06   7.6306E-06    1.2373E-06    -6.1938E-06
F         -3.6075E-07   -3.9162E-07   2.1223E-07    2.0640E-07
G         4.6749E-08    1.6466E-08    -1.7539E-08   3.6679E-09
H         -1.5942E-09   -1.4276E-10   4.5309E-10    -3.0223E-10
Referring to Table 2, the first lens group 101 and the fourth to sixth lens groups 104 to 106 include aspherical lenses. However, each of the first to sixth lens groups 101 to 106 may include at least one aspherical lens, and the lens groups are not limited to the example given in Table 2.
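A hedged sketch of how the Table 2 values are typically applied: the patent does not print the sag equation, so the conventional even-asphere form is assumed here, with A to H taken as the 4th- to 18th-order coefficients.

```python
# Assumed even-asphere sag equation (not stated in the patent):
#   z(r) = r^2 / (R * (1 + sqrt(1 - (1+k) r^2 / R^2)))
#          + A r^4 + B r^6 + C r^8 + ... + H r^18
import math

def asphere_sag(r: float, R: float, k: float, coeffs: list) -> float:
    """Surface sag z at radial height r, for radius of curvature R, conic
    constant k, and even aspheric coefficients [A, B, C, D, E, F, G, H]."""
    z = r * r / (R * (1.0 + math.sqrt(1.0 - (1.0 + k) * r * r / (R * R))))
    for i, c in enumerate(coeffs):
        z += c * r ** (4 + 2 * i)  # A*r^4, B*r^6, ..., H*r^18
    return z

# Surface S12 of the first lens group, using the S12 column of Table 2 and
# the Table 1 radius of curvature R = 1.0038 mm, evaluated at r = 0.2 mm.
s12 = asphere_sag(0.2, 1.0038, -9.5101e-01,
                  [-2.9624e-01, -1.0941e-02, 3.5206e-06, -6.3767e-06,
                   0.0, 0.0, 0.0, 0.0])
```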
Meanwhile, the imaging lens 110 according to an embodiment of the present invention may satisfy Conditional Expression 1 below.
<Conditional Expression 1>
1.55 ≤ Fno ≤ 1.7
Here, Fno is the F-number, a constant indicating the brightness of the imaging lens 110. The larger Fno is, the darker the imaging lens 110 becomes, and the less light the imaging lens 110 receives under the same conditions.
If Fno is greater than 1.7, the imaging lens 110 lacks the ability to recognize objects in dark places, and thus cannot achieve the intended object recognition performance. On the other hand, if Fno is less than 1.55, the lenses included in the imaging lens 110 become larger, or their number increases, so the imaging lens 110 becomes bulkier and heavier.
Meanwhile, the imaging lens 110 according to an embodiment of the present invention may satisfy Conditional Expression 2 below.
<Conditional Expression 2>
20mm ≤ TTL ≤ 25mm
Here, TTL (Total Top Length or Total Track Length) is the distance from the object-side incident surface of the first lens group 101 to the image surface. That is, TTL represents the total length of the imaging lens 110.
If TTL exceeds 25 mm, the imaging lens 110 is too long to be easily mounted in a camera module 100 applied to a vehicle 10 or the like; if TTL is less than 20 mm, the image quality of the imaging lens 110 may deteriorate.
Meanwhile, the imaging lens 110 according to an embodiment of the present invention may satisfy Conditional Expression 3 below.
<Conditional Expression 3>
50° ≤ ANG
Here, ANG is the half angle of view of the imaging lens, i.e., 1/2 of the full angle of view of the imaging lens 110.
If ANG is 50 degrees or less, the area that the imaging lens 110 can capture is small, so the imaging lens 110 cannot achieve the intended object recognition performance.
Meanwhile, the imaging lens 110 according to an embodiment of the present invention may satisfy Conditional Expression 4 below.
<Conditional Expression 4>
1.3 ≤ ImgH/EPD ≤ 2.0
Here, ImgH is the diagonal length (image height) of the image surface of the image sensor 130, and EPD is the entrance pupil diameter of the imaging lens 110.
If ImgH/EPD is less than 1.3, the imaging lens 110 may become bulky or may fail to obtain sufficient information; if ImgH/EPD is greater than 2.0, the image of an object sensed by the image sensor 130 may become dark.
Meanwhile, the imaging lens 110 according to an embodiment of the present invention may satisfy Conditional Expression 5 below.
<Conditional Expression 5>
4.8 ≤ TTL/EFL ≤ 5.6
Here, TTL is the distance from the object-side incident surface of the first lens group 101 to the image surface, and EFL is the effective focal length of the entire imaging lens 110.
If TTL/EFL is greater than 5.6, the amount of information in the central portion of the captured image decreases; if TTL/EFL is less than 4.8, the amount of information in the peripheral portion decreases, and the imaging lens 110 cannot achieve its target object recognition performance.
Meanwhile, the imaging lens 110 according to an embodiment of the present invention may satisfy Conditional Expression 6 below.
<Conditional Expression 6>
2.5 ≤ ANG/TTL ≤ 2.95
Here, ANG is the half angle of view of the imaging lens, and TTL is the distance from the object-side incident surface of the first lens group 101 to the image surface.
If ANG/TTL is less than 2.5, the amount of information in the captured image decreases and the imaging lens 110 becomes larger; if ANG/TTL is greater than 2.95, the amount of information in the captured image likewise decreases.
Meanwhile, the imaging lens 110 according to an embodiment of the present invention may satisfy Conditional Expression 7 below.
<Conditional Expression 7>
4.9 ≤ TTL/ImgH ≤ 5.7
Here, TTL is the distance from the object-side incident surface of the first lens group 101 to the image surface, and ImgH is the diagonal length of the image surface of the image sensor 130.
If TTL/ImgH is greater than 5.7, the imaging lens 110 becomes bulky; if TTL/ImgH is less than 4.9, the image quality of the peripheral portion of the captured image deteriorates.
Meanwhile, the imaging lens 110 according to an embodiment of the present invention may satisfy Conditional Expression 8 below.
<Conditional Expression 8>
0.4 ≤ ImgH/(EFL * tan(ANG)) ≤ 0.6
Here, ImgH is the diagonal length of the image surface of the image sensor 130, EFL is the total focal length of the imaging lens 110, and ANG is the half angle of view of the imaging lens.
If ImgH/(EFL * tan(ANG)) is less than 0.4, the distortion of the imaging lens 110 becomes severe and the amount of information in the captured image decreases; if it is greater than 0.6, object recognition performance in the central portion of the captured image deteriorates and the amount of information in the peripheral portion decreases.
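Conditional Expression 8 combines a length ratio with the tangent of an angle, so unit handling is easy to get wrong: ANG is stated in degrees, while most math libraries expect radians. The following minimal sketch (not part of the patent disclosure itself) evaluates the expression with the design values of Table 3 of this embodiment (ImgH = 4.613, EFL = 4.464, ANG = 64.2°):

```python
import math

# Design values from Table 3 (assumed units: mm for lengths, degrees for ANG)
ImgH = 4.613  # diagonal length of the image surface
EFL = 4.464   # effective focal length of the imaging lens
ANG = 64.2    # half angle of view

# math.tan() expects radians, so convert the half angle of view first
ratio = ImgH / (EFL * math.tan(math.radians(ANG)))
print(f"ImgH/(EFL*tan(ANG)) = {ratio:.3f}")  # ~0.50, inside the 0.4-0.6 range

assert 0.4 <= ratio <= 0.6
```

The result of about 0.50 sits near the middle of the claimed 0.4 to 0.6 range, consistent with the statement that this embodiment satisfies the conditional expression.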
Meanwhile, the imaging lens 110 according to an embodiment of the present invention may satisfy Conditional Expression 9 below.
<Conditional Expression 9>
-2.42 ≤ f1/EFL ≤ -2.14
Here, f1 is the focal length of the first lens group 101, and EFL is the total focal length of the imaging lens 110.
If f1/EFL is less than -2.42 or greater than -2.14, the angle of view of the imaging lens 110 narrows and the area that can be captured through the imaging lens 110 becomes smaller. Accordingly, the imaging lens 110 cannot achieve its target object recognition performance.
Meanwhile, the imaging lens 110 according to an embodiment of the present invention may satisfy Conditional Expression 10 below.
<Conditional Expression 10>
-5.93 ≤ f2/EFL ≤ -5.11
Here, f2 is the focal length of the second lens group 102, and EFL is the total focal length of the imaging lens 110.
If f2/EFL is less than -5.93 or greater than -5.11, the image quality of the peripheral portion of the captured image decreases.
Meanwhile, the imaging lens 110 according to an embodiment of the present invention may satisfy Conditional Expression 11 below.
<Conditional Expression 11>
3.22 ≤ f6/EFL ≤ 3.64
Here, f6 is the focal length of the sixth lens group 106, and EFL is the total focal length of the imaging lens 110.
If f6/EFL is less than 3.22 or greater than 3.64, distortion in the central and peripheral portions of the captured image becomes severe, and the imaging lens 110 cannot achieve its target object recognition performance.
Meanwhile, referring to Table 3, it can be confirmed that the imaging lens 110 according to an embodiment of the present invention satisfies the conditional expressions described above. Accordingly, the imaging lens 110 offers improved optical performance, can be applied to the vehicle 10 in a compact size, and can achieve high object recognition performance in dark environments and in the image periphery.
<Table 3>
EFL: 4.464
Fno: 1.64
ANG: 64.2
EPD: 2.88
TTL: 23.5
ImgH: 4.613
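The Table 3 values above can be substituted into Conditional Expressions 1 through 8 programmatically as a cross-check (Expressions 9 to 11 also require f1, f2, and f6, which Table 3 does not list). A sketch, assuming lengths in mm and ANG in degrees:

```python
import math

# Design values from Table 3
EFL, Fno, ANG, EPD, TTL, ImgH = 4.464, 1.64, 64.2, 2.88, 23.5, 4.613

checks = {
    "CE1: 1.55 <= Fno <= 1.7": 1.55 <= Fno <= 1.7,
    "CE2: 20mm <= TTL <= 25mm": 20 <= TTL <= 25,
    "CE3: 50deg <= ANG": 50 <= ANG,
    "CE4: 1.3 <= ImgH/EPD <= 2.0": 1.3 <= ImgH / EPD <= 2.0,
    "CE5: 4.8 <= TTL/EFL <= 5.6": 4.8 <= TTL / EFL <= 5.6,
    "CE6: 2.5 <= ANG/TTL <= 2.95": 2.5 <= ANG / TTL <= 2.95,
    "CE7: 4.9 <= TTL/ImgH <= 5.7": 4.9 <= TTL / ImgH <= 5.7,
    "CE8: 0.4 <= ImgH/(EFL*tan(ANG)) <= 0.6":
        0.4 <= ImgH / (EFL * math.tan(math.radians(ANG))) <= 0.6,
}

for name, ok in checks.items():
    print(name, "->", "satisfied" if ok else "VIOLATED")
```

All eight expressions come out satisfied, matching the statement that the embodiment meets the conditional expressions described above.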
FIG. 4 is a graph measuring the coma aberration of the imaging lens 110 of FIG. 3.
Specifically, FIG. 4 plots the tangential and sagittal aberrations at each wavelength as a function of the field height on the image surface of the imaging lens 110.
The closer the curves lie to the X-axis on both the positive and negative sides, the better the coma aberration correction. In the measurement examples of FIG. 4, the values appear adjacent to the X-axis in almost all fields, demonstrating excellent coma aberration correction.
FIG. 5 is a graph illustrating the longitudinal spherical aberration, astigmatic field curves, and distortion of the imaging lens of FIG. 3.
In FIG. 5, the Y-axis represents the image size, and the X-axis represents the focal length (in mm) and the distortion (in %). The closer each curve lies to the Y-axis, the better the aberration correction of the imaging lens 110.
FIG. 6 shows the result of comparing the degree of distortion in the periphery of an image captured using the imaging lens 110 of FIG. 3 with that of a conventional imaging lens.
FIG. 6(a) shows an image 601 captured with a conventional imaging lens, and FIG. 6(b) shows an image 602 captured with the imaging lens 110 according to an embodiment of the present invention.
Referring to FIG. 6(a), the object OB1a located in the central portion of the image 601 (Ar1, near a 0-degree field angle) resembles the actual object, but the object OB2a located in the peripheral portion (Ar2, near a 60-degree field angle) is severely distorted in shape and size compared with the actual object.
Referring to FIG. 6(b), both the object OB1b located in the central portion Ar1 and the object OB2b located in the peripheral portion Ar2 of the image 602 resemble the actual objects.
In addition, the horizontal length W2 of the image 602 captured with the imaging lens 110 according to an embodiment of the present invention is the same as the horizontal length W1 of the image 601 captured with a conventional imaging lens, while objects located in the peripheral portion Ar2 are captured more accurately and without distortion.
Accordingly, since an image sensor of the same size as an existing sensor can be used, the imaging lens 110 can be applied directly to existing vehicle camera modules.
Furthermore, based on the images acquired by the camera module 100, the object detection apparatus 300 of the vehicle 10 according to an embodiment of the present invention can accurately detect and track objects in the image periphery as well, promoting occupant safety in a variety of situations.
FIG. 7 shows the result of comparing the object recognition distance of the imaging lens 110 of FIG. 3, as a function of field angle, with that of a conventional imaging lens.
Referring to the figure, the imaging lens 110 according to an embodiment of the present invention has an object recognition distance of about 85 m in the central portion (0°), which is 96% of the 89 m recognition distance of a conventional imaging lens; its object recognition performance is therefore nearly on par with the conventional imaging lens.
Meanwhile, in the peripheral portion (60°), the imaging lens 110 according to an embodiment of the present invention has an object recognition distance of about 37 m, or 128% of the 25 m recognition distance of a conventional imaging lens. Object recognition performance in the periphery is therefore significantly improved over the conventional imaging lens.
Accordingly, the imaging lens 110 according to an embodiment of the present invention increases the amount of information in the periphery while maintaining central object recognition performance, thereby also improving peripheral object recognition performance.
Although preferred embodiments of the present invention have been illustrated and described above, the present invention is not limited to the specific embodiments described, and various modifications may be made by those of ordinary skill in the art to which the invention pertains without departing from the gist of the invention as claimed in the claims; such modifications should not be understood separately from the technical spirit or scope of the present invention.

Claims (19)

  1. An imaging lens comprising, in order from the object side to the image side:
    a first lens group having negative power, an object-side surface of the first lens group being convex toward the object side;
    a second lens group having negative power;
    a third lens group having positive power;
    a fourth lens group having positive power;
    a fifth lens group having negative power; and
    a sixth lens group having positive power,
    wherein each of the first to sixth lens groups includes at least one lens.
  2. The imaging lens of claim 1,
    wherein the first lens group has a meniscus shape convex toward the object side.
  3. The imaging lens of claim 1,
    wherein the second lens group has a meniscus shape convex toward the image side.
  4. The imaging lens of claim 1,
    wherein the third lens group or the fourth lens group has both surfaces convex.
  5. The imaging lens of claim 1,
    wherein at least one of an object-side surface and an image-side surface of the fifth lens group or the sixth lens group has at least one inflection point.
  6. The imaging lens of claim 1,
    wherein at least one of the first to sixth lens groups includes at least one aspherical lens.
  7. The imaging lens of claim 1,
    wherein, when the constant representing the brightness of the imaging lens is Fno, the conditional expression
    1.55 ≤ Fno ≤ 1.7
    is satisfied.
  8. The imaging lens of claim 1,
    wherein, when the distance from the object-side surface of the first lens group to the image surface is TTL, the conditional expression
    20mm ≤ TTL ≤ 25mm
    is satisfied.
  9. The imaging lens of claim 1,
    wherein, when the half angle of view of the imaging lens is ANG, the conditional expression
    50° ≤ ANG
    is satisfied.
  10. The imaging lens of claim 1,
    wherein, when the diagonal length of the image surface is ImgH and the entrance pupil diameter of the imaging lens is EPD, the conditional expression
    1.3 ≤ ImgH/EPD ≤ 2.0
    is satisfied.
  11. The imaging lens of claim 1,
    wherein, when the distance from the object-side surface of the first lens group to the image surface is TTL and the total focal length of the imaging lens is EFL, the conditional expression
    4.8 ≤ TTL/EFL ≤ 5.6
    is satisfied.
  12. The imaging lens of claim 1,
    wherein, when the distance from the object-side surface of the first lens group to the image surface is TTL and the half angle of view of the imaging lens is ANG, the conditional expression
    2.5 ≤ ANG/TTL ≤ 2.95
    is satisfied.
  13. The imaging lens of claim 1,
    wherein, when the diagonal length of the image surface is ImgH and the distance from the object-side surface of the first lens group to the image surface is TTL, the conditional expression
    4.9 ≤ TTL/ImgH ≤ 5.7
    is satisfied.
  14. The imaging lens of claim 1,
    wherein, when the diagonal length of the image surface is ImgH, the total focal length of the imaging lens is EFL, and the half angle of view of the imaging lens is ANG, the conditional expression
    0.4 ≤ ImgH/(EFL * tan(ANG)) ≤ 0.6
    is satisfied.
  15. The imaging lens of claim 1,
    wherein, when the total focal length of the imaging lens is EFL and the focal length of the first lens group is f1, the conditional expression
    -2.42 ≤ f1/EFL ≤ -2.14
    is satisfied.
  16. The imaging lens of claim 1,
    wherein, when the total focal length of the imaging lens is EFL and the focal length of the second lens group is f2, the conditional expression
    -5.93 ≤ f2/EFL ≤ -5.11
    is satisfied.
  17. The imaging lens of claim 1,
    wherein, when the total focal length of the imaging lens is EFL and the focal length of the sixth lens group is f6, the conditional expression
    3.22 ≤ f6/EFL ≤ 3.64
    is satisfied.
  18. A camera module comprising:
    the imaging lens of any one of claims 1 to 17;
    a filter configured to selectively transmit light passing through the imaging lens according to wavelength; and
    an image sensor configured to receive the light transmitted through the filter.
  19. A vehicle comprising the camera module of claim 18.
PCT/KR2020/005712 2020-04-29 2020-04-29 Imaging lens, and camera module and vehicle comprising same WO2021221207A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/KR2020/005712 WO2021221207A1 (en) 2020-04-29 2020-04-29 Imaging lens, and camera module and vehicle comprising same

Publications (1)

Publication Number Publication Date
WO2021221207A1 true WO2021221207A1 (en) 2021-11-04

Family

ID=78374140

Country Status (1)

Country Link
WO (1) WO2021221207A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100930167B1 (en) * 2007-09-19 2009-12-07 삼성전기주식회사 Ultra wide angle optical system
JP2010243709A (en) * 2009-04-03 2010-10-28 Ricoh Co Ltd Wide angle lens and imaging device
JP2013073145A (en) * 2011-09-29 2013-04-22 Fujifilm Corp Image pickup lens and image pickup apparatus
JP2017102211A (en) * 2015-11-30 2017-06-08 コニカミノルタ株式会社 Imaging lens and imaging apparatus
JP2018136476A (en) * 2017-02-23 2018-08-30 富士フイルム株式会社 Imaging lens and imaging apparatus



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20933929

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20933929

Country of ref document: EP

Kind code of ref document: A1