WO2021221207A1 - Imaging lens, and camera module and vehicle comprising same


Info

Publication number
WO2021221207A1
Authority
WO
WIPO (PCT)
Prior art keywords
imaging lens
lens
lens group
imaging
efl
Application number
PCT/KR2020/005712
Other languages
English (en)
Korean (ko)
Inventor
이규승
김진범
여상옥
Original Assignee
엘지전자 주식회사 (LG Electronics Inc.)
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Priority to PCT/KR2020/005712
Publication of WO2021221207A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 13/00 Optical objectives specially designed for the purposes specified below
    • G02B 13/18 Optical objectives specially designed for the purposes specified below with lenses having one or more non-spherical faces, e.g. for reducing geometrical aberration
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof

Definitions

  • the present invention relates to an imaging lens, a camera module including the same, and a vehicle, and more particularly, to an imaging lens capable of improving peripheral object recognition performance, a camera module and a vehicle including the same.
  • Cameras can be classified into standard, telephoto, and wide-angle cameras according to the focal length of the lens.
  • the wide-angle camera has a wide angle of view, but in general, image distortion is increased in the periphery (wide-angle part).
  • a vehicle camera is a camera that provides information necessary for driving by recognizing an object in the front or rear of the vehicle.
  • A wide-angle camera is generally used as a vehicle camera in order to capture more information. Image distortion therefore greatly deteriorates the performance of a vehicle camera: objects located in the periphery ahead of the vehicle cannot be accurately recognized, which is a serious threat to safe driving.
  • The vehicle front camera should be able to recognize an object 35 m away at a wide angle (about 60 degrees), and should be able to recognize an object more than 85 m away toward the front (about 0 degrees).
  • a camera used in a vehicle needs to be able to recognize surrounding objects even in a dark environment, so a bright lens with a small f-number is required.
  • The multi-camera approach combines a camera for photographing the central portion with a camera for photographing the peripheral portion; it has the problems that the volume of the camera increases, the price increases, and the images photographed by the plurality of cameras must be merged into one.
  • The peripheral distortion mitigation camera has the problem that the size of the photographed peripheral image increases, so a larger image sensor is required, which in turn increases the volume and price of the camera.
  • an object of the present invention is to provide an imaging lens capable of increasing object recognition performance of a peripheral part while maintaining object recognition performance of a central part.
  • an object of the present invention is to provide an imaging lens capable of accurately recognizing an object even in a dark environment in order to solve the above problems.
  • an object of the present invention is to provide a vehicle that can promote the safety of occupants in various situations by mounting a camera module including the imaging lens.
  • An imaging lens according to an embodiment of the present invention includes, in order from the object side to the image side, a first lens group having negative power and a target surface convex toward the object side, and a second lens group having negative power.
  • Each of the first to sixth lens groups may include at least one lens.
  • the first lens group may have a meniscus shape convex toward the object.
  • the second lens group may have a meniscus shape convex toward the image.
  • The third lens group or the fourth lens group may have a biconvex shape, with both surfaces convex.
  • At least one of the target surface and the imaging surface of the fifth lens group or the sixth lens group may have at least one inflection point.
  • At least one of the first lens group to the sixth lens group may include at least one aspherical lens.
  • The imaging lens according to an embodiment of the present invention for achieving the above object may satisfy the conditional expression 20 mm ≤ TTL ≤ 25 mm, where TTL is the distance from the object-side surface of the first lens group to the image surface.
  • In addition, the conditional expression ANG > 50° may be satisfied, where ANG is the half angle of view of the imaging lens.
  • The imaging lens according to an embodiment of the present invention for achieving the above object may satisfy the conditional expression 1.3 ≤ ImgH/EPD ≤ 2.0, where ImgH is the diagonal length of the image plane and EPD is the entrance pupil diameter of the imaging lens.
  • The imaging lens according to an embodiment of the present invention for achieving the above object may satisfy the conditional expression 4.8 ≤ TTL/EFL ≤ 5.6, where TTL is the distance from the object-side surface of the first lens group to the image plane and EFL is the total focal length of the imaging lens.
  • The imaging lens according to an embodiment of the present invention for achieving the above object may satisfy the conditional expression 2.5 ≤ ANG/TTL ≤ 2.95, where TTL is the distance from the object-side surface of the first lens group to the image plane and ANG is the half angle of view of the imaging lens.
  • The imaging lens according to an embodiment of the present invention for achieving the above object may satisfy the conditional expression 4.9 ≤ TTL/ImgH ≤ 5.7, where ImgH is the diagonal length of the image plane and TTL is the distance from the object-side surface of the first lens group to the image plane.
  • The imaging lens according to an embodiment of the present invention for achieving the above object may satisfy the conditional expression 0.4 ≤ ImgH/(EFL * tan(ANG)) ≤ 0.6, where ImgH is the diagonal length of the image plane, EFL is the total focal length of the imaging lens, and ANG is the half angle of view of the imaging lens.
  • The imaging lens according to an embodiment of the present invention for achieving the above object may satisfy the conditional expression -2.42 ≤ f1/EFL ≤ -2.14, where EFL is the total focal length of the imaging lens and f1 is the focal length of the first lens group.
  • The imaging lens according to an embodiment of the present invention for achieving the above object may satisfy the conditional expression -5.93 ≤ f2/EFL ≤ -5.11, where EFL is the total focal length of the imaging lens and f2 is the focal length of the second lens group.
  • The imaging lens according to an embodiment of the present invention for achieving the above object may satisfy the conditional expression 3.22 ≤ f6/EFL ≤ 3.64, where EFL is the total focal length of the imaging lens and f6 is the focal length of the sixth lens group.
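The conditional expressions listed above can be evaluated together for a candidate design. The following is a minimal Python sketch; every design value here (Fno, TTL, EFL, ANG, ImgH, and the group focal lengths f1, f2, f6) is a hypothetical illustrative number, not a value taken from this application's tables:

```python
import math

# Hypothetical design values for illustration only (not from the patent's tables).
Fno = 1.6    # f-number
TTL = 22.0   # total track length, mm
EFL = 4.2    # total focal length, mm
ANG = 60.0   # half angle of view, degrees
ImgH = 4.3   # image-plane diagonal, mm
EPD = EFL / Fno                  # entrance pupil diameter, mm
f1, f2, f6 = -9.5, -23.0, 14.5   # focal lengths of lens groups 1, 2, 6, mm

checks = {
    "1.55 <= Fno <= 1.7":        1.55 <= Fno <= 1.7,
    "20 <= TTL <= 25":           20 <= TTL <= 25,
    "ANG > 50":                  ANG > 50,
    "1.3 <= ImgH/EPD <= 2.0":    1.3 <= ImgH / EPD <= 2.0,
    "4.8 <= TTL/EFL <= 5.6":     4.8 <= TTL / EFL <= 5.6,
    "2.5 <= ANG/TTL <= 2.95":    2.5 <= ANG / TTL <= 2.95,
    "4.9 <= TTL/ImgH <= 5.7":    4.9 <= TTL / ImgH <= 5.7,
    "0.4 <= ImgH/(EFL*tan ANG) <= 0.6":
        0.4 <= ImgH / (EFL * math.tan(math.radians(ANG))) <= 0.6,
    "-2.42 <= f1/EFL <= -2.14":  -2.42 <= f1 / EFL <= -2.14,
    "-5.93 <= f2/EFL <= -5.11":  -5.93 <= f2 / EFL <= -5.11,
    "3.22 <= f6/EFL <= 3.64":    3.22 <= f6 / EFL <= 3.64,
}
for name, ok in checks.items():
    print(f"{name}: {'pass' if ok else 'fail'}")
```

With these sample values, every expression evaluates inside its stated range.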
  • The imaging lens according to an embodiment of the present invention, which includes six lens groups, has the effect of improving object recognition performance in the peripheral part while maintaining object recognition performance in the central part.
  • the imaging lens according to an embodiment of the present invention includes at least one aspherical lens and a plastic lens and is designed to have a low f-number, so that an object can be accurately recognized even in a dark environment.
  • The camera module according to an embodiment of the present invention can directly replace an existing vehicle camera module, since it combines a miniaturized six-group imaging lens with an image sensor of the same size as the existing sensor.
  • The vehicle including the imaging lens according to an embodiment of the present invention can promote the safety of its occupants in various situations by accurately recognizing objects located in the peripheral portion.
  • FIG. 1 is a view showing a vehicle equipped with a camera module including an imaging lens according to an embodiment of the present invention, and the photographing angle of view of the camera module.
  • FIG. 2 is a block diagram referenced for explaining a vehicle according to an embodiment of the present invention.
  • FIG. 3 is a diagram illustrating an imaging lens according to an embodiment of the present invention.
  • FIG. 4 is a graph showing the coma aberration of the imaging lens of FIG. 3.
  • FIG. 5 is a graph illustrating the spherical aberration, astigmatism, and distortion of the imaging lens of FIG. 3.
  • FIG. 6 shows a result of comparing the degree of distortion in the periphery of an image photographed using the imaging lens of FIG. 3 with that of a conventional imaging lens.
  • FIG. 7 shows the result of comparing the object recognition distance according to the angle of view of the imaging lens of FIG. 3 with that of the conventional imaging lens.
  • The suffixes "module" and "part" for the components used in the following description are given or used interchangeably only in consideration of ease of writing the specification, and do not by themselves have distinct meanings or roles. Accordingly, the terms "module" and "unit" may be used interchangeably.
  • FIG. 1 is a view showing a vehicle 10 equipped with a camera module 100 including an imaging lens 110 according to an embodiment of the present invention, and the photographing angle of view of the camera module 100. FIG. 2 is a block diagram referenced to describe the vehicle 10 according to an embodiment of the present invention.
  • the vehicle 10 may include wheels rotated by a power source and a steering input device 200 for controlling the traveling direction of the vehicle 10 .
  • the vehicle 10 may be an autonomous vehicle.
  • the vehicle 10 may be switched to an autonomous driving mode or a manual mode based on a user input received through a user interface device (not shown).
  • the vehicle 10 may be switched to an autonomous driving mode or a manual mode based on driving situation information.
  • the driving situation information may include at least one of object information outside the vehicle, navigation information, and vehicle state information.
  • the vehicle 10 may be switched from the manual mode to the autonomous driving mode or from the autonomous driving mode to the manual mode based on the driving situation information generated by the object detection apparatus 300 .
  • the vehicle 10 may receive a user input for driving through the driving manipulation device 50 . Based on a user input received through the driving manipulation device 50 , the vehicle 10 may be driven.
  • the vehicle 10 may include the object detecting apparatus 300 .
  • the object detecting apparatus 300 is an apparatus for detecting an object located outside the vehicle 10 .
  • the object detection apparatus 300 may generate object information based on the sensed data.
  • the object information may include information on the existence of an object, location information of the object, distance information between the vehicle 10 and the object, and relative speed information between the vehicle 10 and the object.
  • the object may be various objects related to the operation of the vehicle 10 .
  • The object may include another vehicle, a pedestrian, a two-wheeled vehicle, a traffic signal, a light, a road, a structure, a speed bump, a building, a feature, an animal, and the like.
  • the object may be classified into a moving object and a still object.
  • the moving object may be a concept including another moving vehicle and a moving pedestrian.
  • the stationary object may be a concept including a traffic signal, a road, a structure, a building, another stopped vehicle, and a stationary pedestrian.
  • the camera module 100 may include an imaging lens 110 including a plurality of lens groups.
  • the camera module 100 may be located at an appropriate place outside the vehicle in order to acquire an image outside the vehicle.
  • the camera module 100 may include an imaging lens 110 , a filter 120 , and an image sensor 130 .
  • the imaging lens 110 may be configured by arranging a plurality of lens groups in a line along an optical axis.
  • the imaging lens 110 refracts light incident from the subject to form an image on the image sensor 130 .
  • the configuration of the imaging lens 110 will be described in detail below with reference to FIGS. 3 to 6 .
  • the filter 120 may selectively transmit light passing through the imaging lens 110 according to a wavelength.
  • the filter 120 may include an infrared filter 121 (Infrared Ray Filter).
  • the filter 120 may further include a cover glass 122 and the like.
  • The infrared filter 121 and the cover glass 122 may be replaced with other filters or omitted as necessary, and may be designed so as not to affect the optical characteristics of the imaging lens 110.
  • the image sensor 130 is disposed to be spaced apart from the imaging lens 110 , and performs a function of converting light input through the imaging lens 110 into an electrical signal.
  • a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) may be used as the image sensor 130 .
  • the camera module 100 may further include a lens barrel 140 and an actuator 150 .
  • the lens barrel 140 serves as a housing for protecting the imaging lens 110 , and may move in the optical axis direction according to the driving of the actuator 150 .
  • the actuator 150 performs an auto focus (AF) function by moving the lens barrel 140 and the bobbin (not shown) along the optical axis direction through electromagnetic force using a coil.
  • the actuator 150 may be configured as a voice coil motor (VCM) or the like.
  • the camera module 100 may acquire position information of an object, distance information from an object, or relative speed information with an object by using various image processing algorithms.
  • the camera module 100 may acquire distance information and relative speed information from an object based on a change in the size of the object over time from the acquired image.
  • the camera module 100 may be disposed adjacent to the front windshield in the interior of the vehicle to acquire an image of the front of the vehicle.
  • the camera module 100 may be disposed around a front bumper or a radiator grill.
  • the camera module 100 may be disposed adjacent to the rear glass in the interior of the vehicle to obtain an image of the rear of the vehicle.
  • the camera module 100 may be disposed around a rear bumper, a trunk, or a tailgate.
  • the camera module 100 may be disposed adjacent to at least one of the side windows in the interior of the vehicle in order to acquire an image of the side of the vehicle.
  • the camera module 100 may be disposed around a side mirror, a fender, or a door.
  • the camera module 100 may provide the acquired image to the processor 350 of the object detection apparatus 300 .
  • the object detection apparatus 300 may include a radar 310 , a lidar 320 , an ultrasonic sensor 330 , an infrared sensor 340 , and a processor 350 . Meanwhile, the object detection apparatus 300 may operate in association with the camera module 100 .
  • the object detecting apparatus 300 may further include other components in addition to the described components, or may not include some of the described components.
  • the radar 310 may include an electromagnetic wave transmitter and a receiver.
  • the radar 310 may be implemented in a pulse radar method or a continuous wave radar method in view of a radio wave emission principle.
  • The radar 310 may be implemented in a frequency modulated continuous wave (FMCW) method or a frequency shift keying (FSK) method according to the signal waveform, among the continuous wave radar methods.
  • The radar 310 may detect an object based on electromagnetic waves using a time-of-flight (TOF) method or a phase-shift method, and may detect the position of the detected object, the distance to the detected object, and the relative speed.
  • the radar 310 may be disposed at an appropriate location outside the vehicle to detect an object located in front, rear or side of the vehicle.
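The time-of-flight principle used by the radar above yields range as half the round-trip path of the wave. A minimal sketch (the echo time is an illustrative value, not from the source):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_s: float) -> float:
    """Range to a target from a time-of-flight echo: half the round-trip path."""
    return C * round_trip_s / 2.0

# An echo returning after about 0.5 microseconds corresponds to roughly 75 m.
print(round(tof_distance(0.5e-6), 1))
```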
  • the lidar 320 may include a laser transmitter and a receiver.
  • the lidar 320 may be implemented in a time of flight (TOF) method or a phase-shift method.
  • the lidar 320 may be implemented as a driven or non-driven type.
  • When implemented as a driven type, the lidar 320 is rotated by a motor and may detect objects around the vehicle 10.
  • When implemented as a non-driven type, the lidar 320 may detect an object located within a predetermined range with respect to the vehicle 10 by light steering.
  • Vehicle 10 may include a plurality of non-driven lidars 320 .
  • The lidar 320 may detect an object using laser light as a medium, based on a time-of-flight (TOF) method or a phase-shift method, and may detect the position of the detected object, the distance to the detected object, and the relative speed.
  • the lidar 320 may be disposed at an appropriate location outside the vehicle to detect an object located in front, rear or side of the vehicle.
  • the ultrasonic sensor 330 may include an ultrasonic transmitter and a receiver.
  • The ultrasonic sensor 330 may detect an object based on ultrasound, and detect the position of the detected object, the distance to the detected object, and the relative speed.
  • the ultrasonic sensor 330 may be disposed at an appropriate location outside the vehicle to detect an object located at the front, rear, or side of the vehicle.
  • the infrared sensor 340 may include an infrared transmitter and a receiver.
  • the infrared sensor 340 may detect an object based on infrared light, and detect a position of the detected object, a distance from the detected object, and a relative speed.
  • the infrared sensor 340 may be disposed at an appropriate location outside the vehicle to detect an object located in the front, rear, or side of the vehicle.
  • the processor 350 may control the overall operation of each unit of the object detection apparatus 300 .
  • The processor 350 may compare data sensed by the camera module 100, the radar 310, the lidar 320, the ultrasonic sensor 330, and the infrared sensor 340 with pre-stored data to detect or classify an object.
  • the processor 350 may detect and track the object based on the image acquired by the camera module 100 .
  • the processor 350 may perform operations such as calculating a distance to an object and calculating a relative speed with respect to an object through an image processing algorithm.
  • the processor 350 may acquire distance information and relative velocity information from the obtained image based on a change in the size of the object over time.
  • the processor 350 may acquire distance information and relative speed information from an object through a pin hole model, road surface profiling, or the like.
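The pinhole-model distance estimate mentioned above follows from similar triangles: an object of known real-world height H that forms an image of height h through a lens of focal length f lies at distance d = f * H / h. A minimal sketch (the focal length and heights below are hypothetical values for illustration):

```python
def pinhole_distance(efl_mm: float, object_height_m: float, image_height_mm: float) -> float:
    """Distance to an object of known real-world height under the pinhole model:
    d = f * H / h, with the focal length f and image height h in the same units."""
    return efl_mm * object_height_m / image_height_mm  # result in metres

# A 1.7 m tall pedestrian imaged 0.2 mm tall through a 4.2 mm lens is ~35.7 m away.
print(round(pinhole_distance(4.2, 1.7, 0.2), 1))
```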
  • the processor 350 may detect and track the object based on at least one of a reflected electromagnetic wave, a reflected laser, a reflected ultrasonic wave, and a reflected infrared light transmitted to the object and reflected back.
  • the processor 350 may perform operations such as calculating a distance to an object and calculating a relative speed with respect to an object based on at least one of reflected electromagnetic waves, reflected lasers, reflected ultrasonic waves, and reflected infrared light.
  • the object detecting apparatus 300 may include a plurality of processors 350 or may not include the processors 350 .
  • each of the camera module 100 , the radar 310 , the lidar 320 , the ultrasonic sensor 330 , and the infrared sensor 340 may individually include a processor.
  • the object detection apparatus 300 may be operated under the control of the processor or the controller 900 of the apparatus in the vehicle 10 .
  • The vehicle 10 may further include, as needed: a user interface device that receives user input and provides information generated in the vehicle 10 to the user; a vehicle driving device that electrically controls the driving of various devices in the vehicle 10; a communication device that communicates with external devices; a driving system that controls various operations of the vehicle 10 in the autonomous driving mode; a navigation system that provides navigation information; an interface unit for exchanging data with, or supplying power to, an external device connected to the vehicle 10; a sensing unit for sensing the state of the vehicle 10; a memory for storing various data related to the vehicle 10; and a power supply for supplying the power required for the operation of each component of the vehicle 10.
  • the camera module 100 installed in the vehicle 10 may include at least one imaging lens 110 .
  • the imaging lens 110 may have an angle of view of a certain size.
  • the camera module 100 may be disposed in the vehicle 10 such that the imaging lens 110 faces the front direction of the vehicle 10 .
  • The area located in the front direction of the camera module 100 is defined as the central area Ar1, and a partial area near the left or right end of the range of angles of view that the camera module 100 can photograph is defined as the peripheral area Ar2.
  • An image of the central region Ar1 is included in the center of the image captured by the camera module 100 , and an image of the peripheral region Ar2 is included in the periphery of the image.
  • FIG. 3 is a diagram illustrating an imaging lens 110 according to an embodiment of the present invention.
  • the spherical or aspherical shape of the lens in FIG. 3 is only presented as an example and is not limited thereto.
  • the term 'target surface' refers to the surface of the lens facing the object side with respect to the optical axis
  • the term 'image-forming surface' refers to the surface of the lens facing the image side with respect to the optical axis.
  • positive power of a lens indicates a converging lens that converges parallel light
  • negative power of a lens indicates a diverging lens that diverges parallel light
  • The imaging lens 110 may include, in order from the object side to the image side, a first lens group 101, a second lens group 102, a third lens group 103, a fourth lens group 104, a fifth lens group 105, and a sixth lens group 106.
  • Each of the first lens group 101 to the sixth lens group 106 may include at least one lens.
  • A shutter may be included on the front surface of the first lens group 101, and an iris (not shown) may be disposed between the first lens group 101 and the sixth lens group 106.
  • the aperture may be a variable aperture.
  • The imaging lens 110 is included in the camera module 100, and the filter 120 and the image sensor 130 may be arranged in order, spaced apart from the sixth lens group 106 of the imaging lens 110.
  • the filter 120 may include an infrared filter 121 . Meanwhile, the filter 120 may further include a cover glass 122 and the like.
  • the infrared filter 121 and the cover glass 122 may be a flat optical member, and the cover glass 122 may be a glass for protecting the imaging surface.
  • the image sensor 130 may detect light incident from the sixth lens group 106 .
  • the first lens group 101 may include at least one lens.
  • the first lens group 101 may have negative (-) power.
  • the first lens group 101 may be disposed closest to the object side, the target surface S11 may be convex, and the imaging surface S12 may be concave. That is, the first lens group 101 may have a meniscus shape convex toward the object.
  • The first lens group 101 may be formed to be relatively larger than the second lens group 102 to the sixth lens group 106. Accordingly, all light incident through the target surface S11 of the first lens group 101 may be incident on the target surface S21 of the second lens group 102, so that the imaging lens 110 can implement a wide angle of view.
  • the second lens group 102 may include at least one lens.
  • the second lens group 102 may have negative (-) power.
  • the second lens group 102 may be disposed to be spaced apart from the imaging surface S12 of the first lens group 101 , the target surface S21 may be concave, and the imaging surface S22 may be convex. That is, the second lens group 102 may have a meniscus shape convex toward the image side.
  • the third lens group 103 may include at least one lens.
  • the third lens group 103 may have positive (+) power.
  • the third lens group 103 may be disposed to be spaced apart from the imaging plane S22 of the second lens group 102 , and both the target plane S31 and the imaging plane S32 may be convex.
  • the fourth lens group 104 may include at least one lens.
  • the fourth lens group 104 may have positive (+) power.
  • the fourth lens group 104 may be disposed to be spaced apart from the imaging plane S32 of the third lens group 103 , and both the target plane S41 and the imaging plane S42 may be convex.
  • the fifth lens group 105 may include at least one lens.
  • the fifth lens group 105 may have negative (-) power.
  • the fifth lens group 105 may be disposed to be spaced apart from the imaging surface S42 of the fourth lens group 104 .
  • At least one inflection point may be formed on at least one of the target surface S51 and the imaging surface S52 of the fifth lens group 105 .
  • the target surface S51 may be concave in the paraxial region and convex toward the edge
  • the imaging surface S52 may be convex in the paraxial region and concave toward the edge.
  • the shape of the fifth lens group 105 is not limited thereto.
  • the sixth lens group 106 may include at least one lens.
  • the sixth lens group 106 may have positive (+) power.
  • the sixth lens group 106 may be disposed to be spaced apart from the imaging surface S52 of the fifth lens group 105 .
  • At least one inflection point may be formed on at least one of the target surface S61 and the imaging surface S62 of the sixth lens group 106 .
  • the target surface S61 may be convex in the paraxial region and concave toward the edge
  • the imaging surface S62 may be concave in the paraxial region and convex toward the edge.
  • the shape of the sixth lens group 106 is not limited thereto.
  • At least one of the first lens group 101 to the sixth lens group 106 may include an aspherical lens, and all of the lenses may have a rotationally symmetric shape with respect to the optical axis.
  • the lenses included in the first lens group 101 to the sixth lens group 106 may be made of a glass material or a plastic material.
  • In particular, when plastic lenses are used, the manufacturing cost can be greatly reduced.
  • the surfaces of the lenses included in the first lens group 101 to the sixth lens group 106 may be coated to prevent reflection or improve surface hardness.
  • the imaging lens 110 configured as described above can reduce distortion of the peripheral portion, and greatly increase the object recognition performance of the peripheral portion in the camera module 100 or the vehicle 10 including the imaging lens 110 .
  • Table 1 shows the radius of curvature, thickness, or distance of each lens group included in the imaging lens 110 according to an embodiment of the present invention.
  • the unit of the radius of curvature and the thickness or distance is millimeters.
  • The distance (thickness) on the optical axis from the target surface S11 to the imaging surface S12 of the first lens group 101 is 0.9101 mm,
  • the distance (thickness) from the target surface S21 to the imaging surface S22 of the second lens group 102 is 4.8463 mm,
  • the distance (thickness) from the target surface S31 to the imaging surface S32 of the third lens group 103 is 1.9086 mm,
  • the distance (thickness) from the target surface S41 to the imaging surface S42 of the fourth lens group 104 is 5.1864 mm,
  • the distance (thickness) from the target surface S51 to the imaging surface S52 of the fifth lens group 105 may be 0.9221 mm,
  • and the distance (thickness) from the target surface S61 to the imaging surface S62 of the sixth lens group 106 may be 1.4247 mm.
  • On the optical axis, the imaging surface S12 of the first lens group 101 is spaced 3.3112 mm from the target surface S21 of the second lens group 102; the imaging surface S22 of the second lens group 102 is spaced 0.0899 mm from the target surface S31 of the third lens group 103; the imaging surface S32 of the third lens group 103 is spaced 0.0855 mm from the target surface S41 of the fourth lens group 104; the imaging surface S42 of the fourth lens group 104 is spaced 0.1591 mm from the target surface S51 of the fifth lens group 105; the imaging surface S52 of the fifth lens group 105 is spaced 0.4962 mm from the target surface S61 of the sixth lens group 106; and the imaging surface S62 of the sixth lens group 106 may be disposed 0.1861 mm from the upper surface S71 of the filter 121.
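As a quick arithmetic check, summing the listed group thicknesses and on-axis air gaps gives the track length from the first target surface S11 up to the filter's front surface; the remaining filter thickness and back focal distance, which are not listed in this excerpt, would bring the total TTL into the 20 mm to 25 mm range stated earlier:

```python
# Center thicknesses of lens groups 1 to 6 (mm), as listed above.
thicknesses = [0.9101, 4.8463, 1.9086, 5.1864, 0.9221, 1.4247]
# On-axis air gaps: G1-G2, G2-G3, G3-G4, G4-G5, G5-G6, G6-filter (mm).
gaps = [3.3112, 0.0899, 0.0855, 0.1591, 0.4962, 0.1861]

partial_track = sum(thicknesses) + sum(gaps)
print(round(partial_track, 4))  # track length up to the filter's front surface, mm
```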
  • Table 2 shows the conic constant (k) and aspheric coefficients (A to H) of the lens surface of each lens group included in the imaging lens 110 according to an embodiment of the present invention.
  • the first lens group 101 and the fourth lens group 104 to the sixth lens group 106 include aspherical lenses.
  • the first lens group 101 to the sixth lens group 106 may include at least one aspherical lens, and are not limited to the examples shown in Table 2.
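A conic constant k and polynomial coefficients A to H, as in Table 2, conventionally describe an even-asphere surface: a conic base term plus even powers of the radial coordinate. A minimal sketch of the standard sag equation (the radius, conic constant, and coefficients here are illustrative, not values from Table 2):

```python
import math

def aspheric_sag(r: float, R: float, k: float, coeffs) -> float:
    """Even-asphere sag z(r): conic base plus polynomial terms A*r^4 + B*r^6 + ...
    R is the paraxial radius of curvature and k is the conic constant."""
    c = 1.0 / R  # paraxial curvature
    z = c * r**2 / (1.0 + math.sqrt(1.0 - (1.0 + k) * c**2 * r**2))
    for i, a in enumerate(coeffs):  # coeffs = (A, B, C, ...) for r^4, r^6, r^8, ...
        z += a * r ** (4 + 2 * i)
    return z

# With k = 0 and no polynomial terms the surface reduces to a sphere:
# the sag of a 10 mm radius sphere at r = 2 mm is 10 - sqrt(10^2 - 2^2) ≈ 0.2020 mm.
print(round(aspheric_sag(2.0, 10.0, 0.0, ()), 4))
```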
  • The imaging lens 110 may satisfy the following Conditional Expression 1: 1.55 ≤ Fno ≤ 1.7.
  • Fno is a constant (F-number) indicating the brightness of the imaging lens 110 .
  • As Fno increases, the imaging lens 110 becomes darker, and the amount of light it receives decreases in the same environment.
  • When Fno is greater than 1.7, the imaging lens 110 lacks the ability to recognize objects in dark places and cannot realize the target object recognition performance. Conversely, when Fno is less than 1.55, the size or number of the lenses included in the imaging lens 110 increases, so the imaging lens 110 becomes bulkier and heavier.
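The f-number is the ratio of focal length to entrance pupil diameter (Fno = EFL/EPD), which is why a brighter (smaller-Fno) lens needs a larger pupil. A minimal sketch with hypothetical values (a 4.2 mm EFL is an illustrative assumption, not a value from this application):

```python
def f_number(efl_mm: float, epd_mm: float) -> float:
    """F-number of a lens: focal length divided by entrance pupil diameter."""
    return efl_mm / epd_mm

# A hypothetical 4.2 mm EFL lens with a 2.625 mm entrance pupil has Fno = 1.6,
# which sits inside the 1.55-to-1.7 window described above.
print(round(f_number(4.2, 2.625), 2))
```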
  • The imaging lens 110 may satisfy the following Conditional Expression 2: 20 mm ≤ TTL ≤ 25 mm.
  • TTL is the distance (Total Track Length) from the object-side incident surface of the first lens group 101 to the image surface. That is, TTL represents the total length of the imaging lens 110.
  • When TTL is longer than 25 mm, the imaging lens 110 is too long to be easily mounted on the camera module 100 applied to the vehicle 10; when TTL is shorter than 20 mm, the image quality of the imaging lens 110 may deteriorate.
  • The imaging lens 110 may satisfy Conditional Expression 3 below.
  • ANG is the half angle of view of the imaging lens.
  • The half angle of view means 1/2 of the total angle of view of the imaging lens 110.
  • If ANG is 50 degrees or less, the area that the imaging lens 110 can capture is small, so the imaging lens 110 cannot realize the target object recognition performance.
  • The imaging lens 110 may satisfy Conditional Expression 4 below.
  • ImgH is the diagonal length (image height) of the image sensor 130, and EPD is the entrance pupil diameter of the imaging lens 110.
  • If the ImgH/EPD value falls outside the required range, the volume of the imaging lens 110 may increase, desired information may not be sufficiently obtained, or the image may darken.
  • The imaging lens 110 may satisfy Conditional Expression 5 below.
  • TTL is the distance from the object-side incident surface of the first lens group 101 to the image surface, and EFL is the total focal length of the imaging lens 110.
  • If the TTL/EFL value is greater than 5.6, the amount of information in the central portion of the photographed image is reduced; if the TTL/EFL value is less than 4.8, the amount of information in the peripheral portion is reduced, so the imaging lens 110 cannot realize the target object recognition performance.
  • The imaging lens 110 may satisfy Conditional Expression 6 below.
  • ANG is the half angle of view of the imaging lens, and TTL is the distance from the object-side incident surface of the first lens group 101 to the image surface.
  • If the ANG/TTL value is less than 2.5, the amount of information in the photographed image decreases and the size of the imaging lens 110 increases. If the ANG/TTL value is greater than 2.95, the amount of information in the photographed image also decreases.
  • The imaging lens 110 may satisfy Conditional Expression 7 below.
  • TTL is the distance from the object-side incident surface of the first lens group 101 to the image surface, and ImgH is the diagonal length of the image surface of the image sensor 130.
  • If the TTL/ImgH value is greater than 5.7, the volume of the imaging lens 110 increases; if the TTL/ImgH value is less than 5.7, the image quality of the periphery of the photographed image deteriorates.
  • The imaging lens 110 may satisfy Conditional Expression 8 below.
  • ImgH is the diagonal length of the image sensor 130, EFL is the total focal length of the imaging lens 110, and ANG is the half angle of view of the imaging lens.
  • If the ImgH/(EFL * tan(ANG)) value is less than 0.4, the distortion of the imaging lens 110 increases and the amount of information in the photographed image is reduced. If the ImgH/(EFL * tan(ANG)) value is greater than 0.6, the object recognition performance in the center of the captured image decreases, and the amount of information in the periphery decreases.
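A minimal sketch of Conditional Expression 8: a distortion-free (rectilinear) lens maps the half angle of view ANG to an image height of EFL * tan(ANG), so the ratio ImgH/(EFL * tan(ANG)) indicates how strongly the lens compresses the periphery relative to that ideal mapping. The ImgH, EFL, and ANG values below are hypothetical, not taken from the patent's tables:

```python
import math

# Sketch only: ratio of actual image height to the rectilinear
# (distortion-free) image height EFL * tan(ANG), with ANG in degrees.
def expression_8(img_h_mm: float, efl_mm: float, ang_deg: float) -> float:
    return img_h_mm / (efl_mm * math.tan(math.radians(ang_deg)))

# Hypothetical design values, for illustration only.
ratio = expression_8(img_h_mm=4.6, efl_mm=4.4, ang_deg=64.0)
print(0.4 <= ratio <= 0.6)
```

A ratio well below 1 is expected for a wide-angle lens; the 0.4 to 0.6 window balances peripheral compression against central magnification, as described above.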
  • The imaging lens 110 may satisfy Conditional Expression 9 below.
  • f1 is the focal length of the first lens group 101, and EFL is the total focal length of the imaging lens 110.
  • If the f1/EFL value falls outside the required range, the angle of view of the imaging lens 110 becomes narrow, and the size of the region that can be captured through the imaging lens 110 decreases. Accordingly, the imaging lens 110 cannot realize the target object recognition performance.
  • The imaging lens 110 may satisfy Conditional Expression 10 below.
  • f2 is the focal length of the second lens group 102, and EFL is the total focal length of the imaging lens 110.
  • If the f2/EFL value is less than -5.93 or greater than -5.11, the image quality of the peripheral area in the captured image is reduced.
  • The imaging lens 110 may satisfy Conditional Expression 11 below.
  • f6 is the focal length of the sixth lens group 106, and EFL is the total focal length of the imaging lens 110.
  • If the f6/EFL value falls outside the required range, the imaging lens 110 cannot realize the target object recognition performance.
  • As described above, the imaging lens 110 satisfies the conditional expressions. Accordingly, the imaging lens 110 has improved optical performance, can be applied to the vehicle 10 in a compact size, and can maintain high object recognition performance in a dark environment or in the peripheral area of an image.
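The bounded conditional expressions above can be collected into a single design check. The sketch below is illustrative only: it covers just the expressions whose upper and lower bounds are both stated in the text, and the sample design values are hypothetical placeholders rather than the embodiment's actual figures:

```python
import math

# Sketch only: verify a candidate design against the conditional
# expressions that have explicit numeric bounds in the description.
def check_design(fno: float, ttl: float, ang: float,
                 efl: float, img_h: float) -> dict:
    return {
        "expr1_fno":        1.55 <= fno <= 1.7,          # brightness
        "expr2_ttl":        20.0 <= ttl <= 25.0,         # total length, mm
        "expr3_ang":        ang > 50.0,                  # half angle of view, deg
        "expr5_ttl_efl":    4.8 <= ttl / efl <= 5.6,     # center vs. periphery info
        "expr6_ang_ttl":    2.5 <= ang / ttl <= 2.95,    # compactness vs. coverage
        "expr8_distortion": 0.4 <= img_h / (efl * math.tan(math.radians(ang))) <= 0.6,
    }

# Hypothetical design values, for illustration only.
result = check_design(fno=1.6, ttl=23.0, ang=64.0, efl=4.4, img_h=4.6)
print(all(result.values()))
```

Expressions 4, 7, and 9 to 11 are omitted because the extracted text states their bounds only partially; a full checker would add them once the missing limits are known.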
  • FIG. 4 is a graph measuring the coma aberration of the imaging lens 110 of FIG. 3; it shows the tangential and sagittal aberration at each wavelength as a function of the field height of the imaging lens 110.
  • FIG. 5 is a graph illustrating longitudinal spherical aberration, astigmatic field curves, and distortion of the imaging lens of FIG. 3 .
  • In these graphs, the Y-axis represents the image height, and the X-axis represents the focal length (in mm) and the degree of distortion (in %). The closer each curve is to the Y-axis, the better the aberration correction of the imaging lens 110.
  • FIG. 6 shows a result of comparing the degree of distortion of the periphery of an image photographed using the imaging lens 110 of FIG. 3 with that of a conventional imaging lens.
  • FIG. 6 (a) shows an image 601 taken with a conventional imaging lens, and FIG. 6 (b) shows an image 602 taken with the imaging lens 110 according to an embodiment of the present invention.
  • The object OB1a located in the central portion (Ar1, near a 0-degree angle of view) of the image 601 taken with the conventional imaging lens is similar in shape to the real object, but the object OB2a located in the peripheral portion (Ar2, near a 60-degree angle of view) is severely distorted compared to the shape and size of the real object.
  • In contrast, both the object OB1b located in the central portion Ar1 and the object OB2b located in the peripheral portion Ar2 of the image 602 taken with the imaging lens 110 according to an embodiment of the present invention are similar in shape to the real objects.
  • The horizontal length W2 of the image 602 photographed with the imaging lens 110 according to an embodiment of the present invention is the same as the horizontal length W1 of the image 601 photographed with the conventional imaging lens, yet an object located in the periphery Ar2 can be photographed more accurately, without distortion.
  • The imaging lens 110 can therefore be directly applied to existing vehicle camera modules.
  • The object detection apparatus 300 of the vehicle 10 can accurately detect and track an object in the periphery of the image obtained from the camera module 100, promoting the safety of occupants in various situations.
  • FIG. 7 shows a result of comparing the object recognition distance according to the angle of view of the imaging lens 110 of FIG. 3 with that of the conventional imaging lens.
  • The imaging lens 110 has an object recognition distance of about 85 m in the central portion (0°), which is 96% of the 89 m object recognition distance of a conventional imaging lens; its object recognition performance there is thus almost equal to that of the conventional imaging lens.
  • The imaging lens 110 has an object recognition distance of about 37 m in the periphery (60°), which is 128% of the 25 m object recognition distance of a conventional imaging lens; its object recognition performance in the peripheral area is thus significantly improved over the conventional imaging lens.
  • In other words, the imaging lens 110 improves the object recognition performance of the peripheral portion by increasing the amount of information there, while maintaining the object recognition performance of the central portion.


Abstract

The present invention relates to an imaging lens, and to a camera module and a vehicle comprising the same. An imaging lens according to an embodiment of the present invention comprises, in order from the object side: a first lens group having negative power, an object-side surface of which protrudes toward the object side; a second lens group having negative power; a third lens group having positive power; a fourth lens group having positive power; a fifth lens group having negative power; and a sixth lens group having positive power, wherein each of the first to sixth lens groups may comprise at least one lens. Accordingly, distortion of a peripheral portion can be mitigated and object recognition performance in the peripheral portion can be improved.
PCT/KR2020/005712 2020-04-29 2020-04-29 Lentille d'imagerie et module de caméra et véhicule la comprenant WO2021221207A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/KR2020/005712 WO2021221207A1 (fr) 2020-04-29 2020-04-29 Lentille d'imagerie et module de caméra et véhicule la comprenant


Publications (1)

Publication Number Publication Date
WO2021221207A1 true WO2021221207A1 (fr) 2021-11-04

Family

ID=78374140

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2020/005712 WO2021221207A1 (fr) 2020-04-29 2020-04-29 Lentille d'imagerie et module de caméra et véhicule la comprenant

Country Status (1)

Country Link
WO (1) WO2021221207A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100930167B1 (ko) * 2007-09-19 2009-12-07 삼성전기주식회사 초광각 광학계
JP2010243709A (ja) * 2009-04-03 2010-10-28 Ricoh Co Ltd 広角レンズおよび撮像装置
JP2013073145A (ja) * 2011-09-29 2013-04-22 Fujifilm Corp 撮像レンズおよび撮像装置
JP2017102211A (ja) * 2015-11-30 2017-06-08 コニカミノルタ株式会社 撮像レンズ及び撮像装置
JP2018136476A (ja) * 2017-02-23 2018-08-30 富士フイルム株式会社 撮像レンズおよび撮像装置



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20933929

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20933929

Country of ref document: EP

Kind code of ref document: A1