WO2023179518A1 - 一种红外成像模组和红外成像方法 - Google Patents

一种红外成像模组和红外成像方法 (Infrared imaging module and infrared imaging method)

Info

Publication number
WO2023179518A1
WO2023179518A1 (PCT/CN2023/082398)
Authority
WO
WIPO (PCT)
Prior art keywords
infrared imaging
infrared
imaging module
lens
window
Prior art date
Application number
PCT/CN2023/082398
Other languages
English (en)
French (fr)
Inventor
彭海军
潘超
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 filed Critical 华为技术有限公司
Publication of WO2023179518A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B13/00Optical objectives specially designed for the purposes specified below
    • G02B13/001Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras
    • G02B13/0015Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras characterised by the lens design
    • G02B13/002Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras characterised by the lens design having at least one aspherical surface
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B13/00Optical objectives specially designed for the purposes specified below
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B13/00Optical objectives specially designed for the purposes specified below
    • G02B13/001Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras
    • G02B13/008Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras designed for infrared light
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B13/00Optical objectives specially designed for the purposes specified below
    • G02B13/14Optical objectives specially designed for the purposes specified below for use with infrared or ultraviolet radiation
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B30/00Camera modules comprising integrated lens units and imaging units, specially adapted for being embedded in other devices, e.g. mobile phones or vehicles

Definitions

  • the present application relates to the field of optical communication technology, and more specifically, to an infrared imaging module and an infrared imaging method.
  • infrared thermal imaging uses the thermal radiation of natural objects to image, which allows the infrared night vision system to observe distances up to 10 times farther than ordinary car headlights. It can ensure that in harsh conditions such as heavy fog, heavy rain, sleet, dark night, glare, etc., the road conditions ahead can be observed very clearly, greatly reducing the incidence of traffic accidents and better ensuring driving safety.
  • infrared car night vision systems are expensive and are currently only used in a few high-end cars.
  • the infrared vehicle thermal imaging lens will have a huge market if the cost can be reduced.
  • This application provides an infrared imaging module and an infrared imaging method, which can reduce costs while ensuring the quality of infrared imaging.
  • an infrared imaging module includes a lens and an infrared detector, wherein: the concave surface of the lens faces the target object; the infrared detector includes a window piece and an imaging surface, the object-side surface of the window piece is a curved surface, and the window piece is located between the lens and the imaging surface.
  • the imaging surface is used to detect infrared images of target objects.
  • In the infrared imaging module disclosed in this application, setting the object-side surface of the window piece as a curved surface allows an imaging module with high image quality that meets the athermalization requirement to be realized with a one-piece lens and the window piece, thereby improving the imaging effect and reducing cost. Furthermore, the infrared imaging module disclosed in this application may also be equipped with multiple lenses to further improve the imaging effect.
  • the above-mentioned infrared imaging module satisfies the following expression:
  • f1 is the focal length of the infrared imaging module
  • n is the central wavelength refractive index of the lens
  • R is the convex curvature radius of the lens
  • F is the image-side numerical aperture of the infrared imaging module.
  • the focal lengths of the above-mentioned window and the infrared imaging module satisfy the following expression:
  • f2 is the focal length of the window.
  • At least one of the object-side surface and the image-side surface of the lens is a binary diffraction surface.
  • the imaging effect of the infrared imaging module can be improved.
  • the above lens has positive refractive power.
  • the object-side surface of the window is an aspheric surface or a diffractive surface.
  • When the object-side surface of the above-mentioned window piece is an aspheric surface, the imaging effect of the infrared imaging module can be improved; when the object-side surface of the above-mentioned window piece is a diffractive surface, the athermalization performance of the infrared imaging module can be improved, which helps to extend the application scenarios.
  • the image surface of the above-mentioned window piece is flat, and the above-mentioned window piece has positive refractive power.
  • the material of the lens includes at least one of chalcogenide glass, silicon, germanium and gallium arsenide.
  • the material of the window includes at least one of silicon, germanium, and gallium arsenide.
  • the field of view angle of the above-mentioned infrared imaging module is between 30 degrees and 50 degrees. In this way, it can be adapted to a variety of application scenarios while ensuring imaging effects.
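  • As a point of reference that is not stated in the application, for an ideal, distortion-free imager the full field of view is related to the focal length f1 of the module and the diagonal h of the imaging surface by the standard geometric relation below; it shows how a 30-to-50-degree field of view trades off against focal length and detector size.

```latex
\mathrm{FOV} = 2\arctan\!\left(\frac{h}{2 f_{1}}\right)
```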
  • an infrared imaging method is provided.
  • the method is suitable for an infrared imaging device.
  • the infrared imaging device includes a lens and an infrared detector.
  • the infrared detector includes a window and an imaging surface.
  • the method includes: obtaining a first image of the target object through the lens, where the concave surface of the lens faces the target object;
  • obtaining a second image of the target object through the window piece based on the first image of the target object, where the object-side surface of the window piece is a curved surface; and obtaining an infrared image of the target object through the imaging surface based on the second image of the target object.
  • With the infrared imaging method disclosed in this application, the infrared image of the target object can be detected by an imaging module that achieves high image quality and meets the athermalization requirement with a one-piece lens and a window piece whose object-side surface is curved, thereby improving the imaging effect and reducing cost.
  • an infrared detection device including a control circuit, a display, and an infrared imaging module as described in any one of the first aspects.
  • the control circuit is used to control the infrared image generated by the imaging module to be displayed on the display.
  • a fourth aspect provides a mobile device, including a processor and the infrared imaging module as described in any one of the first aspects, and the processor communicates with the infrared imaging module.
  • the mobile device may be a vehicle or other device that requires an infrared imaging module (for example, a transportation vehicle or movable object that requires spatial operation or movement).
  • the vehicle includes, but is not limited to, cars, bicycles, motorcycles, trains, subways, airplanes, ships, aircraft, robots, or other types of transportation vehicles or movable objects.
  • Figure 1 is a schematic structural diagram of an infrared vehicle night vision system.
  • Figure 2 is a specific application scenario of the infrared imaging module and infrared imaging method provided by the embodiment of the present application.
  • Figure 3 is another specific application scenario of the infrared imaging module and infrared imaging method provided by the embodiment of the present application.
  • Figure 4 is a schematic structural diagram of an infrared imaging module provided by an embodiment of the present application.
  • Figure 5 is a schematic diagram of the full-frequency curve of the infrared imaging module provided by the embodiment of the present application at different temperatures.
  • Figure 6 is a schematic diagram of field curvature distortion of the infrared imaging module provided by the embodiment of the present application at different temperatures.
  • Figure 7 is a schematic flowchart of an infrared imaging method provided by an embodiment of the present application.
  • infrared thermal imaging uses the thermal radiation of natural objects to image, which allows the infrared night vision system to observe distances up to 10 times farther than ordinary car headlights. It can ensure that in harsh conditions such as heavy fog, heavy rain, sleet, dark night, glare, etc., the road conditions ahead can be observed very clearly, greatly reducing the incidence of traffic accidents and better ensuring driving safety.
  • Figure 1 shows a schematic structural diagram of an infrared vehicle night vision system.
  • the system 100 includes a first lens 110 , a second lens 120 , a third lens 130 and a detector 140 .
  • the first lens 110, the second lens 120, and the third lens 130 are made of chalcogenide glass, and the window piece on the detector 140 is made of silicon.
  • In this way, through the combination of two materials (silicon and chalcogenide glass) and three lens elements, the optical design of a low-cost vehicle night-vision lens is achieved.
  • However, in this system three lenses are needed to achieve athermalization and correct the system aberrations, so the cost is still high.
  • In addition, the larger number of elements in a three-piece lens reduces the overall transmittance, which degrades the quality of infrared imaging.
  • this application proposes an infrared imaging module and an infrared imaging method in order to reduce costs while ensuring the quality of infrared imaging.
  • Figure 2 is a specific application scenario of the infrared imaging module and infrared imaging method provided by the embodiment of the present application.
  • The application scenario includes a vehicle 220.
  • the application scenario may also include a cloud server 210, and the vehicle 220 and the cloud server 210 may communicate through the network.
  • Computing platform 221 may include at least one processor 222 that may execute instructions 224 stored in a non-transitory computer-readable medium such as memory 223.
  • computing platform 221 may also be multiple computing devices that control individual components or subsystems of vehicle 220 in a distributed fashion.
  • Processor 222 may be any conventional processor, such as a central processing unit (CPU).
  • Alternatively, the processor 222 may also include a graphics processing unit (GPU), a field programmable gate array (FPGA), a system on chip (SOC), an application-specific integrated circuit (ASIC), or a combination thereof.
  • memory 223 may store data such as road maps, route information, vehicle location, direction, speed and other such vehicle data, as well as other information. This information may be used by vehicle 220 and computing platform 221 during operation of vehicle 220 in autonomous, semi-autonomous and/or manual modes.
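  • Purely as an illustrative sketch that is not part of the application, the vehicle data listed above could be grouped in a simple structure such as the following; all names here are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class VehicleState:
    """Hypothetical container for the vehicle data kept in memory 223."""
    position: Tuple[float, float]   # (latitude, longitude)
    heading_deg: float              # direction of travel, in degrees
    speed_mps: float                # speed, in metres per second
    route: List[Tuple[float, float]] = field(default_factory=list)  # planned waypoints
    road_map_id: str = ""           # reference to a stored road map
```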
  • the above-mentioned vehicle 220 may include one or more different types of transportation vehicles, or one or more different types of transportation vehicles or movable objects that operate or move on land (for example, highways, roads, railways), on water (for example, waterways, rivers, oceans), or in space.
  • vehicles may include cars, bicycles, motorcycles, trains, subways, airplanes, ships, aircraft, robots or other types of transportation vehicles or movable objects, etc., which are not limited in the embodiments of this application.
  • the application scenario shown in Figure 2 may also include a cloud server 210.
  • the cloud server 210 can perform perception fusion, computational reasoning, etc. based on the real-time infrared image uploaded by the vehicle 220 and other information in the cloud server (such as information about other vehicles, road condition information, etc.).
  • the cloud server 210 can also be implemented through a virtual machine.
  • Embodiments of the present application can also be applied to many fields in artificial intelligence, such as image recognition, image processing, high-precision maps, intelligent driving, intelligent transportation, autonomous driving and other fields.
  • Specifically, the embodiments can be applied to the branches of these artificial intelligence fields that require infrared imaging.
  • For example, in the field of intelligent driving, infrared imaging can further determine the road conditions faced by the vehicle, thereby assisting the driver in making correct driving operations based on the actual situation.
  • In the intelligent driving scenario, the input image can be processed by infrared imaging to obtain road condition information, which assists the driver in making correct decisions.
  • As shown in Figure 3, when the object image captured by the camera is input to the infrared imaging module, information about the current road conditions can be obtained, such as traffic light information, information about other vehicles, and pedestrian information.
  • This road condition information is then input to the driving decision module.
  • the driving decision module further determines what operation to perform. For example, when the current road condition information shows that there are pedestrians or other stationary objects ahead, the driving decision module will issue a stop instruction message.
  • the driving decision module can also perform automatic braking operations.
  • In other words, Figure 3 uses the infrared imaging stage to improve the safety of intelligent driving: it effectively prevents incorrect operations caused by the driver failing to observe the road, or being unable to perceive road conditions accurately under harsh conditions such as heavy fog, heavy rain, sleet, dark nights, and glare, thereby greatly reducing the incidence of traffic accidents and better ensuring driving safety.
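  • To make the decision flow above concrete, the following is a minimal, hypothetical sketch of a driving decision module consuming road-condition information produced by infrared imaging; the detection labels, distance thresholds, and function names are assumptions for illustration and are not part of the application.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    label: str         # e.g. "pedestrian", "stationary_object", "crosswalk"
    distance_m: float  # estimated distance ahead, in metres

def decide(road_conditions: List[Detection], braking_available: bool) -> str:
    """Return a driving instruction based on detections from the infrared image."""
    for det in road_conditions:
        # Pedestrian or other stationary object ahead: issue a stop instruction,
        # or brake automatically if the product supports it.
        if det.label in ("pedestrian", "stationary_object") and det.distance_m < 50.0:
            return "auto_brake" if braking_available else "stop"
        # Crosswalk ahead: issue a deceleration instruction.
        if det.label == "crosswalk" and det.distance_m < 80.0:
            return "decelerate"
    return "proceed"

# Example: a pedestrian detected 30 m ahead triggers automatic braking.
print(decide([Detection("pedestrian", 30.0)], braking_available=True))
```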
  • Figure 4 shows a schematic structural diagram of an infrared imaging module provided by an embodiment of the present application.
  • the infrared imaging module 400 includes a lens 410 and an infrared detector 420, wherein: the concave surface of the lens 410 faces the target object; the infrared detector 420 includes a window piece 421 and an imaging surface (image, IMG) 422.
  • The object-side surface of the window piece 421 is a curved surface, and the window piece 421 is located between the lens 410 and the imaging surface 422.
  • the imaging surface 422 is used to detect the infrared image of the target object.
  • the infrared imaging module 400 satisfies the following expression:
  • f1 is the focal length of the infrared imaging module 400
  • n is the central wavelength refractive index of the lens 410
  • R is the convex curvature radius of the lens 410
  • F is the image-side numerical aperture of the infrared imaging module 400.
  • f2 is the focal length of the window piece 421.
  • At least one of the object-side surface and the image-side surface of the lens 410 is a binary diffraction surface. In this way, the imaging effect of the infrared imaging module 400 can be improved.
  • lens 410 has positive refractive power.
  • the object-side surface of the window piece 421 is an aspheric surface (asphere, ASP) or a diffraction surface (binary).
  • When the object-side surface of the window piece 421 is an aspheric surface, the imaging effect of the infrared imaging module 400 can be improved; when the object-side surface of the window piece 421 is a diffractive surface, the athermalization performance of the infrared imaging module 400 can be improved, which helps to extend the application scenarios.
  • the image surface of the window piece 421 is a plane, and the window piece 421 has positive refractive power.
  • the material of the lens 410 includes at least one of chalcogenide glass, silicon, germanium, and gallium arsenide.
  • the material of the window 421 includes at least one of silicon, germanium, and gallium arsenide.
  • the materials of the lens 410 and the window 421 may also include other infrared materials, which are not limited in this application.
  • the field of view angle of the infrared imaging module 400 may be 30 degrees.
  • the object-side surface of the lens 410 may be an aspheric surface, and the image-side surface may be a binary diffraction surface; the object-side surface of the window 421 may be an aspheric surface, and the image-side surface may be a flat plate.
  • When the field of view angle of the infrared imaging module 400 is 30 degrees, the optical parameters of the infrared imaging module 400 are as shown in Table 1 below. Along the optical axis, from the object side to the image side, are the lens 410, the window piece 421, and the imaging surface 422 in order.
  • The surface number of the object-side surface of the lens 410 is 1 and that of its image-side surface is 2; the surface number of the object-side surface of the window piece 421 is 3 and that of its image-side surface is 4; the surface number of the object-side surface of the imaging surface 422 is 5 and that of its image-side surface is 6.
  • ST represents the stop
  • ASP represents the aspheric surface
  • Binary2 represents the binary diffractive surface
  • IMG represents the imaging surface (image).
  • K represents the conic coefficient
  • A represents the fourth-order aspherical coefficient
  • B represents the sixth-order aspheric coefficient, C represents the eighth-order aspheric coefficient, D represents the tenth-order aspheric coefficient, and E represents the twelfth-order aspheric coefficient.
  • A1 represents the second-order coefficient of the diffraction surface
  • A2 represents the fourth-order coefficient of the diffraction surface
  • A3 represents the sixth-order coefficient of the diffractive surface.
  • The blank cells in the tables indicate that the corresponding type or parameter is not limited; those skilled in the art can choose it according to the application scenario and device conditions.
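  • The coefficients above parameterize standard surface descriptions. The application does not reproduce the formulas, but the conic constant K and the even aspheric coefficients A through E are conventionally used in the even-asphere sag equation, and A1 through A3 in a rotationally symmetric (Binary 2 type) diffractive phase polynomial, as sketched below under that assumed convention, where r is the radial coordinate (normalized to the semi-diameter in some design tools), c = 1/R is the vertex curvature, and M is the diffraction order.

```latex
z(r) = \frac{c\,r^{2}}{1+\sqrt{1-(1+K)\,c^{2}r^{2}}}
       + A\,r^{4} + B\,r^{6} + C\,r^{8} + D\,r^{10} + E\,r^{12}

\phi(r) = M\left( A_{1}\,r^{2} + A_{2}\,r^{4} + A_{3}\,r^{6} \right)
```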
  • Table 1 Optical related parameters of the infrared imaging module 400 when the field of view is 30 degrees
  • the field of view angle of the infrared imaging module 400 may be 40 degrees.
  • the object-side surface of the lens 410 may be an aspheric surface, and the image-side surface may be a binary diffraction surface; the object-side surface of the window 421 may be an aspheric surface, and the image-side surface may be a flat plate.
  • the optical related parameters of the infrared imaging module 400 are as shown in Table 2 below. Please refer to the above for the meanings of each letter and blank space in the table.
  • the field of view angle of the infrared imaging module 400 may be 50 degrees.
  • the object-side surface of the lens 410 may be an aspheric surface, and the image-side surface may be a binary diffraction surface; the object-side surface of the window 421 may be an aspheric surface, and the image-side surface may be a flat plate.
  • the optical related parameters of the infrared imaging module 400 are as shown in Table 3 below. Please refer to the above for the meanings of each letter and blank space in the table.
  • Figure 5 shows a schematic diagram of the full-frequency curves of the infrared imaging module provided by the embodiment of the present application at different temperatures.
  • As shown in Figure 5, when the maximum field of view of the infrared imaging module 400 is 30 degrees and the maximum spatial frequency is 42 line pairs per millimeter (lp/mm), the infrared imaging module 400 has good imaging quality at low temperature (minus 45 degrees Celsius), normal temperature (20 degrees Celsius), and high temperature (80 degrees Celsius).
  • Figure 5 takes sagittal and meridional values of 0.00, 4.42, 9.83, and 15.00 micrometers (μm) as examples. The abscissa is the spatial frequency and the ordinate is the optical transfer function; the resulting curve is called the full-frequency curve. The closer a full-frequency curve lies to the curve corresponding to the sagittal or meridional diffraction limit, the better the imaging quality.
  • As can be seen from Figure 5, for the infrared imaging module provided by this application, at low temperature (minus 45 degrees Celsius), normal temperature (20 degrees Celsius), and high temperature (80 degrees Celsius), the optical transfer function curves of both a point at the center of the field of view (for example, where the sagittal and meridional values are 0.00 micrometers) and a point near the edge of the field of view (for example, where the sagittal and meridional values are 15.00 micrometers) differ only slightly from the theoretical values (the optical transfer function curves corresponding to the sagittal or meridional diffraction limit), and the captured images have good imaging quality.
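  • For context (this is not stated in the application), the sagittal and meridional diffraction-limit reference curves in plots such as Figure 5 normally follow the incoherent diffraction-limited MTF of a circular aperture, whose cutoff frequency is set by the wavelength λ and the working F-number N:

```latex
\mathrm{MTF}(\nu) = \frac{2}{\pi}\left[\arccos\!\left(\frac{\nu}{\nu_c}\right)
  - \frac{\nu}{\nu_c}\sqrt{1-\left(\frac{\nu}{\nu_c}\right)^{2}}\right],
\qquad \nu \le \nu_c = \frac{1}{\lambda N}
```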
  • Figure 6 is a schematic diagram of the distortion of the infrared imaging module provided by the embodiment of the present application at different temperatures. As shown in Figure 6, when the maximum field of view of the infrared imaging module 400 is 30 degrees, at low temperature (minus 45 degrees Celsius), normal temperature (20 degrees Celsius), and high temperature (80 degrees Celsius) the distortion values of the infrared imaging module 400 allow its aberration indicators to meet the imaging requirements, and the athermalization requirement from minus 45 degrees Celsius to 80 degrees Celsius can be satisfied. Figure 6 takes wavelengths of 8.00, 10.00, and 12.00 micrometers (μm) as examples; the ordinate is the field of view angle and the abscissa is the distortion percentage. The smaller the distortion percentage, the smaller the deformation of the captured image and the higher the imaging quality.
  • As can be seen from Figure 6, at low temperature (minus 45 degrees Celsius), normal temperature (20 degrees Celsius), and high temperature (80 degrees Celsius), for field angles between 0 and 15 degrees and wavelengths of 8.00, 10.00, and 12.00 micrometers, the image distortion percentage is small and the captured images have good imaging quality.
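  • As a reference definition not spelled out in the application, the distortion percentage plotted against field angle in Figure 6 is conventionally the relative deviation of the real image height from the ideal (paraxial) image height at that field:

```latex
\text{Distortion}(\%) = \frac{y_{\text{real}} - y_{\text{ideal}}}{y_{\text{ideal}}} \times 100\%
```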
  • FIG. 5 and FIG. 6 illustrate the beneficial effects of the infrared imaging module 400 provided by the present application when the field of view angle of the infrared imaging module 400 is 30 degrees.
  • When the field of view angle of the infrared imaging module 400 is another possible angle (for example, 40 degrees, 50 degrees, or other possible angles), the same beneficial effects are obtained. For the sake of brevity, details are not described again.
  • FIG. 7 shows a schematic flowchart of the infrared imaging method provided by the embodiment of the present application. This method is suitable for the infrared imaging device disclosed in the above embodiments.
  • S710: Obtain a first image of the target object through the lens. The concave surface of the lens faces the target object, and at least one of the object-side surface and the image-side surface of the lens is a binary diffractive surface.
  • the lens has positive refractive power.
  • the material of the lens includes at least one of chalcogenide glass, silicon, germanium and gallium arsenide.
  • the lens may be a one-piece lens, which can reduce costs.
  • the lens can be a stack of multiple lenses, which can improve the imaging effect.
  • S720: Obtain a second image of the target object through the window piece based on the first image of the target object.
  • S730: Obtain an infrared image of the target object through the imaging surface based on the second image of the target object.
  • the window piece and the imaging surface can form an infrared detector.
  • the object-side surface of the window piece is a curved surface.
  • the window piece is located between the lens and the imaging surface.
  • the window piece is used to obtain the second image of the target object based on the first image of the target object.
  • the imaging surface is used to obtain the infrared image of the target object based on the second image of the target object.
  • the object-side surface of the window is an aspherical surface or a diffraction surface
  • the image-side surface of the window is a plane
  • the window has positive refractive power.
  • the material of the window includes at least one of silicon, germanium and gallium arsenide.
  • the lens and the infrared detector may form an infrared imaging module.
  • the infrared imaging module satisfies the following expression:
  • f1 is the focal length of the infrared imaging module
  • n is the central wavelength refractive index of the lens
  • R is the convex curvature radius of the lens
  • F is the image-side numerical aperture of the infrared imaging module.
  • f2 is the focal length of the window.
  • the field of view angle of the infrared imaging module is not limited.
  • it can be between 30 degrees and 50 degrees or other possible angles.
  • With the infrared imaging method disclosed in this application, the infrared image of the target object can be detected by an imaging module that achieves high image quality and meets the athermalization requirement with a one-piece lens and a window piece whose object-side surface is set as a curved surface, thereby improving the imaging effect and reducing cost.
  • An embodiment of the present application also provides an infrared detection device, including a control circuit, a display, and the above-mentioned imaging module 400.
  • the control circuit is used to control the infrared image generated by the imaging module 400 to be displayed on the display.
  • An embodiment of the present application also provides a vehicle, including a processor and the above-mentioned imaging module 400.
  • the processor and the imaging module 400 can communicate.
  • An embodiment of the present application also provides a device, including a processor and an interface.
  • the processor may be used to execute the method in the above method embodiment.
  • the above processing device may be a chip.
  • For example, the processing device may be a field programmable gate array (FPGA), an application-specific integrated circuit (ASIC), a system on chip (SoC), a central processing unit (CPU), a network processor (NP), a digital signal processor (DSP), a microcontroller unit (MCU), a programmable logic device (PLD), or another integrated chip.
  • During implementation, each step of the above method can be completed by an integrated logic circuit of hardware in the processor or by instructions in the form of software.
  • the steps of the methods disclosed in conjunction with the embodiments of the present application can be directly implemented by a hardware processor for execution, or can be executed by a combination of hardware and software modules in the processor.
  • the software module can be located in random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, registers and other mature storage media in this field.
  • the storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the above method in combination with its hardware. To avoid repetition, it will not be described in detail here.
  • the processor in the embodiment of the present application may be an integrated circuit chip with signal processing capabilities.
  • each step of the above method embodiment can be completed through an integrated logic circuit of hardware in the processor or instructions in the form of software.
  • the above-mentioned processor may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic devices, discrete gate or transistor logic devices, or discrete hardware components.
  • a general-purpose processor may be a microprocessor or the processor may be any conventional processor, etc.
  • the steps of the method disclosed in conjunction with the embodiments of the present application can be directly implemented by a hardware decoding processor, or executed by a combination of hardware and software modules in the decoding processor.
  • the software module can be located in random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, registers and other mature storage media in this field.
  • the storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the above method in combination with its hardware.
  • the memory in the embodiment of the present application may be a volatile memory or a non-volatile memory, or may include both volatile and non-volatile memories.
  • the non-volatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or a flash memory.
  • Volatile memory may be random access memory (RAM), which is used as an external cache.
  • By way of example rather than limitation, many forms of RAM are available, such as static random access memory (SRAM), dynamic random access memory (DRAM), synchronous dynamic random access memory (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced synchronous dynamic random access memory (ESDRAM), synchlink dynamic random access memory (SLDRAM), and direct rambus RAM (DR RAM).
  • the present application also provides a computer program product.
  • the computer program product includes computer program code.
  • When the computer program code is run on a computer, the computer is caused to execute the method of the embodiment shown in Figure 7.
  • the present application also provides a computer-readable medium.
  • the computer-readable medium stores program code.
  • When the program code is run on a computer, the computer is caused to execute the method of the embodiment shown in Figure 7.
  • the computer program product includes one or more computer instructions.
  • the computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable device.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless (such as infrared, radio, or microwave) means.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device, such as a server or a data center, that integrates one or more available media.
  • The available media may be magnetic media (for example, floppy disks, hard disks, magnetic tapes), optical media (for example, high-density digital video discs (DVD)), or semiconductor media (for example, solid-state disks (SSD)).
  • a component may be, but is not limited to, a process, a processor, an object, an executable file, a thread of execution, a program and/or a computer running on a processor.
  • applications running on the computing device and the computing device may be components.
  • One or more components can reside in a process and/or thread of execution and a component can be localized on one computer and/or distributed between two or more computers. Additionally, these components can execute from various computer-readable media having various data structures stored thereon.
  • A component may communicate through local and/or remote processes, for example on the basis of a signal having one or more data packets (such as data from two components interacting with another component in a local system, in a distributed system, and/or across a network such as the Internet that interacts with other systems by means of signals).
  • the disclosed systems, devices and methods can be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of the circuit is only a logical function division; in actual implementation, there may be other division methods.
  • For example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented.
  • the coupling or direct coupling or communication connection between each other shown or discussed may be through some interfaces, and the indirect coupling or communication connection of the devices or units may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated.
  • The components shown as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • each functional unit in each embodiment of the present application can be integrated into one processing unit, each unit can exist physically alone, or two or more units can be integrated into one unit.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Health & Medical Sciences (AREA)
  • Toxicology (AREA)
  • Lenses (AREA)

Abstract

一种红外成像模组(400),该红外成像模组(400)可以应用于红外探测装置或者设备上,其中,设备包括但不限于一种或多种不同类型的交通工具,例如汽车,自行车,摩托车,火车,地铁,飞机,船,飞行器,机器人或其它类型的运输工具或可移动物体等。该红外成像模组(400)包括透镜(410)和红外探测器(420),其中:透镜(410)的凹面朝向目标物体;红外探测器(420)包括窗口片(421)和成像面(422),窗口片(421)的物方表面为曲面,窗口片(421)位于透镜(410)与成像面(422)之间,成像面(422)用于探测目标物体的红外图像。本申请所揭示的红外成像模组(400),通过将窗口片(421)的物方表面设置为曲面,可以通过一片式透镜(410)和窗口片(421)即可实现高像质无热化需求的成像模组,提升成像效果,降低成本。

Description

一种红外成像模组和红外成像方法
本申请要求于2022年3月23日提交中国国家知识产权局、申请号为202210294687.9、申请名称为“一种红外成像模组和红外成像方法”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及光通信技术领域,并且更具体地,涉及一种红外成像模组和红外成像方法。
背景技术
随着红外非制冷探测器技术的成熟,非制冷热像仪在各领域得到广泛的应用,其中一个非常重要的应用领域便是红外车载夜视系统。由于红外成像原理与可见光成像原理不同,红外热成像是利用自然界物体自身的热辐射来成像,这使得红外夜视系统能够观察到比普通汽车前大灯远高达10倍以上的距离。其可以保证在大雾、暴雨、雨雪天、漆黑夜晚,炫光等恶劣的条件下,可以非常清楚地观察前方路面情况,大大降低了交通事故的发生率,更好的保证行车安全。
然而,红外车载夜视系统价格昂贵,目前只在少数高档汽车上使用。红外车载热成像镜头作为红外车载夜视系统的重要部件,倘若能实现低成本化,市场将非常可观。
因此,亟需一种红外成像模组和红外成像方法,能够降低成本,同时保证红外成像的质量。
发明内容
本申请提供一种红外成像模组和红外成像方法,能够降低成本,同时保证红外成像的质量。
第一方面,提供了一种红外成像模组。该红外成像模组包括透镜和红外探测器,其中:透镜的凹面朝向目标物体;红外探测器包括窗口片和成像面,窗口片的物方表面为曲面,窗口片位于透镜与成像面之间,成像面用于探测目标物体的红外图像。
本申请所揭示的红外成像模组,通过将窗口片的物方表面设置为曲面,可以通过一片式透镜和窗口片实现高像质无热化需求的成像模组,提升成像效果,降低成本。进一步地,本申请所揭示的红外成像模组还可以设置多个透镜,以提升成像效果。
结合第一方面,在第一方面的某些实现方式中,上述红外成像模组满足如下表达式:
其中,f1为红外成像模组的焦距,n为透镜的中心波长折射率,R为透镜的凸面曲率半径,F为红外成像模组的像方数值孔径。
结合第一方面,在第一方面的某些实现方式中,上述窗口片与红外成像模组的焦距满足如下表达式:
其中,f2为所述窗口片的焦距。
结合第一方面,在第一方面的某些实现方式中,上述透镜的物方表面和像方表面中的至少一个为二元衍射面。这样做,可以提升该红外成像模组的成像效果。可选的,上述透镜具有正屈光力。
结合第一方面,在第一方面的某些实现方式中,上述窗口片的物方表面为非球面或衍射面。其中,当上述窗口片的物方表面为非球面时可以提升该红外成像模组的成像效果;当上述窗口片的物方表面为衍射面时可以提升该红外成像模组的无热化性能,有助于扩展应用场景。可选的,上述窗口片的像方表面为平面,上述窗口片具有正屈光力。
结合第一方面,在第一方面的某些实现方式中,上述透镜的材料包括硫系玻璃、硅、锗和砷化镓中的至少一种。
结合第一方面,在第一方面的某些实现方式中,上述窗口片的材料包括硅、锗和砷化镓中的至少一种。
结合第一方面,在第一方面的某些实现方式中,上述红外成像模组的视场角在30度至50度之间。这样做,可以在保证成像效果的前提下适应多种应用场景。
第二方面,提供了一种红外成像方法。该方法适用于红外成像装置,红外成像装置包括透镜和红外探测器,红外探测器包括窗口片和成像面,该方法包括:通过透镜得到目标物体的第一图像,透镜的凹面朝向目标物体;通过窗口片根据目标物体的第一图像得到目标物体的第二图像,窗口片的物方表面为曲面;通过成像面根据目标物体的第二图像得到目标物体的红外图像。
本申请所揭示的红外成像方法,可以通过一片式透镜和物方表面设置为曲面窗口片实现高像质无热化需求的成像模组探测目标物体的红外图像,提升成像效果,降低成本。
第三方面,提供了一种红外探测装置,包括控制电路、显示器以及如第一方面中任一项所述的红外成像模组,控制电路用于控制成像模组生成的红外图像在显示器上显示。
第四方面,提供了一种移动设备,包括处理器以及如第一方面中任一项所述的红外成像模组,处理器与红外成像模组进行通信。其中,该移动设备可以是车辆或者其他需要红外成像模组的装置(例如,需要空间上操作或移动的运输工具或者可移动物体),车辆包括但不限于汽车,自行车,摩托车,火车,地铁,飞机,船,飞行器,机器人或其它类型的运输工具或可移动物体等。
附图说明
图1是一种红外车载夜视系统的结构示意图。
图2是本申请实施例提供的红外成像模组和红外成像方法的一种具体应用场景。
图3是本申请实施例提供的红外成像模组和红外成像方法的另一种具体应用场景。
图4是本申请实施例提供的红外成像模组的结构示意图。
图5是本申请实施例提供的红外成像模组在不同温度下的全频曲线示意图。
图6是本申请实施例提供的红外成像模组在不同温度下的场曲畸变示意图。
图7是本申请实施例提供的红外成像方法的流程示意图。
具体实施方式
下面将结合附图,对本申请中的技术方案进行描述。
随着红外非制冷探测器技术的成熟,非制冷热像仪在各领域得到广泛的应用,其中一个 非常重要的应用领域便是红外车载夜视系统。由于红外成像原理与可见光成像原理不同,红外热成像是利用自然界物体自身的热辐射来成像,这使得红外夜视系统能够观察到比普通汽车前大灯远高达10倍以上的距离。其可以保证在大雾、暴雨、雨雪天、漆黑夜晚,炫光等恶劣的条件下,可以非常清楚地观察前方路面情况,大大降低了交通事故的发生率,更好的保证行车安全。
图1示出了一种红外车载夜视系统的结构示意图。如图1所示,系统100包括第一透镜110、第二透镜120、第三透镜130以及探测器140。其中,第一透镜110、第二透镜120、第三透镜130为硫系玻璃,探测器140上的窗口片为硅玻璃。这样,通过硅和硫系玻璃两种玻璃、三片镜片的组合,实现了低成本车载夜视镜头的光学设计。但是在该系统中,使用3片透镜达到无热化和矫正系统像差的需求,成本仍然较高。此外,3片式镜头镜片数量较多导致整体穿透率降低,影响红外成像质量。
基于上述原因,本申请提出了一种红外成像模组和红外成像方法,以期望能够降低成本,同时保证红外成像的质量。
图2是本申请实施例提供的红外成像模组和红外成像方法的一例具体应用场景。在该应用场景中,包括车辆220。在某些场景下,该应用场景还可以包括云端服务器210,车辆220和云端服务器210可以通过网络进行通信。
车辆220的部分或所有功能受计算平台221控制。计算平台221可包括至少一个处理器222,处理器222可以执行存储在例如存储器223这样的非暂态计算机可读介质中的指令224。
在一些实施例中,计算平台221还可以是采用分布式方式控制车辆220的个体组件或子系统的多个计算设备。处理器222可以是任何常规的处理器,诸如中央处理单元(central processing unit,CPU)。替选地,处理器222还可以包括诸如图像处理器(graphic process unit,GPU),现场可编程门阵列(field programmable gate array,FPGA)、片上系统(system on chip,SOC)、专用集成芯片(application specific integrated circuit,ASIC)或它们的组合。
除了指令224以外,存储器223还可存储数据,例如道路地图、路线信息,车辆的位置、方向、速度以及其它这样的车辆数据,以及其他信息。这种信息可在车辆220在自主、半自主和/或手动模式中操作期间被车辆220和计算平台221使用。
应理解,图2中车辆的结构不应理解为对本申请实施例的限制。
可选的,上述车辆220可以包括一种或多种不同类型的交通工具,也可以包括一种或多种不同类型的在陆地(例如,公路,道路,铁路等),水面(例如:水路,江河,海洋等)或者空间上操作或移动的运输工具或者可移动物体。例如,车辆可以包括汽车,自行车,摩托车,火车,地铁,飞机,船,飞行器,机器人或其它类型的运输工具或可移动物体等,本申请实施例对此不作限定。
另外,如图2所示的应用场景中还可以包括云端服务器210。例如,云端服务器210可以根据车辆220上传的实时红外图像和云端服务器中其他信息(例如其他车辆的信息、路况信息等),进行感知融合、计算推理等。
一个实施例中,该云端服务器210还可以通过虚拟机来实现。
本申请实施例还可以应用在人工智能中的很多领域,例如,图像识别、图像处理、高精地图、智能驾驶、智能交通、自动驾驶等领域。具体而言,应用在这些人工智能领域中的需要红外成像的分支部分。例如,在高精地图领域,经过红外成像可以获得更丰富的信息,从而能够提供准确率更高的地图信息。又例如,在智能驾驶领域,经过红外成像可以进一步确定车辆所面对的路况,从而能够根据实际情况辅助驾驶者做出正确的驾驶操作。
下面对智能驾驶的应用场景进行简单的介绍。
在智能驾驶场景中,可以将输入的图像经红外成像后获得路况信息,从而辅助驾驶者做出正确决策。如图3所示,当将摄像机采集到的物像输入到红外成像模组时,可以获得当前路况的信息,例如交通灯信息、其他车辆信息以及行人信息等,这些路况信息输入到驾驶决策模块。对于这些路况信息,驾驶决策模块进一步判定做出怎样的操作,例如,当前路况信息中显示前方有行人或其他静止的物体时,驾驶决策模块就发出停止的指示信息,进一步地,在某些产品中,驾驶决策模块还可以采取自动制动的操作。又例如,当前路况信息中发现前方存在人行横道,驾驶决策模块就发出减速的指示信息。也就是说,图3是利用红外成像环节来提高了智能驾驶的安全性,有效防止因为驾驶者疏于观察或大雾、暴雨、雨雪天、漆黑夜晚,炫光等恶劣的条件下无法准确得知路况信息从而做出错误操作,大大降低了交通事故的发生率,更好的保证行车安全。
图4示出了本申请实施例提供的红外成像模组的结构示意图。如图4所示,红外成像模组400包括透镜410和红外探测器420,其中:透镜410的凹面朝向目标物体;红外探测器420包括窗口片421和成像面(image,IMG)422,窗口片421的物方表面为曲面,窗口片421位于透镜410与成像面422之间,成像面422用于探测目标物体的红外图像。
在本申请实施例中,红外成像模组400满足如下表达式:
其中,f1为红外成像模组400的焦距,n为透镜410的中心波长折射率,R为透镜410的凸面曲率半径,F为红外成像模组400的像方数值孔径。窗口片421与红外成像模组400的焦距满足如下表达式:
其中,f2为窗口片421的焦距。
作为一种可能的实现方式,透镜410的物方表面和像方表面中的至少一个为二元衍射面。这样做,可以提升该红外成像模组400的成像效果。可选的,透镜410具有正屈光力。
作为一种可能的实现方式,窗口片421的物方表面为非球面(asphere,ASP)或衍射面(binary)。其中,当窗口片421的物方表面为非球面时可以提升该红外成像模组400的成像效果;当窗口片421的物方表面为衍射面时可以提升红外成像模组400的无热化性能,有助于扩展应用场景。可选的,窗口片421的像方表面为平面,窗口片421具有正屈光力。
在本申请实施例中,透镜410的材料包括硫系玻璃、硅、锗和砷化镓中的至少一种。窗口片421的材料包括硅、锗和砷化镓中的至少一种。可选的,透镜410以及窗口片421的材料还可以包括其他红外材料,本申请对其不做限定。
在本申请实施例中,红外成像模组400的视场角可以为30度。此时,透镜410的物方表面可以为非球面,像方表面可以为二元衍射面;窗口片421的物方表面可以为非球面,像方表面可以为平板。红外成像模组400的视场角可以为30度时,红外成像模组400的光学相关参数如下表1所示。其中,沿光轴从物侧至像侧依次为透镜410、窗口片421和成像面422。透镜410的物方表面的面序号为1,像方表面的面序号为2;窗口片421的物方表面的面序号为3,像方表面的面序号为4;成像面422的物方表面的面序号为5,像方表面的面序号为6。“ST”表示光阑(stop),“ASP”表示非球面(asphere),“Binary2”表示二元衍射面,“IMG”表示成像面(image)。“K”表示圆锥系数,“A”表示四阶非球面系数,“B”表示六阶非球面系 数,“C”表示八阶非球面系数,“D”表示十阶非球面系数,“E”表示十二阶非球面系数。“A1”表示衍射面二阶系数,“A2”表示衍射面四阶系数,“A3”表示衍射面六阶系数。表格中的空白部分表示对该部分的类型或参数不做限定,本领域技术人员可根据不同的应用场景以及设备条件进行选择。
表1 红外成像模组400的视场角为30度时的光学相关参数
在本申请实施例中,红外成像模组400的视场角可以为40度。此时,透镜410的物方表面可以为非球面,像方表面可以为二元衍射面;窗口片421的物方表面可以为非球面,像方表面可以为平板。红外成像模组400的视场角可以为40度时,红外成像模组400的光学相关参数如下表2所示。其中,表格中各个字母以及空白等所代表的含义请参照上文。
表2 红外成像模组400的视场角为40度时的光学相关参数

在本申请实施例中,红外成像模组400的视场角可以为50度。此时,透镜410的物方表面可以为非球面,像方表面可以为二元衍射面;窗口片421的物方表面可以为非球面,像方表面可以为平板。红外成像模组400的视场角可以为50度时,红外成像模组400的光学相关参数如下表3所示。其中,表格中各个字母以及空白等所代表的含义请参照上文。
表3 红外成像模组400的视场角为50度时的光学相关参数
图5示出了本申请实施例提供的红外成像模组在不同温度下的全频曲线示意图。如图5所示,当红外成像模组400的最大视场角为30度,最大空间频率为42线对/毫米(lp/mm)时,当温度在低温(零下45摄氏度)、常温(20摄氏度)和高温(80摄氏度)时,该红外成像模组400均具有良好的成像质量。图5中以弧矢和子午分别为0.00、4.42、9.83、15.00微米(μm)为例进行示例性说明,其中,横坐标为空间频率,纵坐标为光学传递函数,形成的曲线称为全频曲线。图中所形成的全频曲线越靠近弧矢或子午衍射极限所对应的全频曲线代表成像质量越好。由图5可以看出,本申请提供的红外成像模组,当温度在低温(零下45摄氏度)、常温(20摄氏度)和高温(80摄氏度)时,无论是视场中心位置的点(例如波长弧矢和子午为0.00微米时)还是靠近视场边缘位置的点(例如波长弧矢和子午为15.00微米)其对应的光学传 递函数曲线均与理论值(弧矢或子午衍射极限对应的光学传递函数曲线)相差较小,拍摄图像均具有良好的成像质量。
图6是本申请实施例提供的红外成像模组在不同温度下的畸变示意图。如图6所示,当红外成像模组400的最大视场角为30度,温度在低温(零下45摄氏度)、常温(20摄氏度)和高温(80摄氏度)时,红外成像模组400的畸变大小值使得该红外成像模组的像差指标可以满足成像需求,并且可以满足在零下45摄氏度至80摄氏度无热化的需求。其中,图6中以波长分别为8.00、10.00、12.00微米(μm)为例进行说明,纵坐标为视场角度,横坐标百分比,畸变百分比越小代表拍摄图像的形变小,成像质量高。由图6可以看出,当温度在低温(零下45摄氏度)、常温(20摄氏度)和高温(80摄氏度)时,视场角度在0到15度之间,波长分别为8.00、10.00、12.00微米时,图像畸变百分比均较小,拍摄图像均具有良好的成像质量。
应理解,上述图5和图6以红外成像模组400的视场角为30度时,对本申请提供的红外成像模组400的有益效果进行了图示性说明。当红外成像模组400的视场角为其它可能的角度(例如,40度、50度或其他可能的角度等)时,均具有上述有益效果,为了简洁,不再赘述。
图7示出了本申请实施例提供的红外成像方法的流程示意图,该方法适用于上述实施例公开的红外成像装置。
S710,通过透镜得到目标物体的第一图像。
其中,透镜的凹面朝向目标物体,且透镜的物方表面和像方表面中的至少一个为二元衍射面。可选的,透镜具有正屈光力。透镜的材料包括硫系玻璃、硅、锗和砷化镓中的至少一种。
本申请实施例对透镜的数量不做限定,例如,透镜可以是一片式透镜,这样做可以降低成本。又例如,透镜可以是多片叠加的透镜,这样做可以提升成像效果。
S720,通过窗口片根据目标物体的第一图像得到目标物体的第二图像。
S730,通过成像面根据目标物体的第二图像得到目标物体的红外图像。
其中,窗口片和成像面可以组成红外探测器,窗口片的物方表面为曲面,窗口片位于透镜与成像面之间,窗口片用于根据目标物体的第一图像得到目标物体的第二图像,成像面用于根据目标物体的第二图像得到目标物体的红外图像。
可选的,窗口片的物方表面为非球面或衍射面,窗口片的像方表面为平面,窗口片具有正屈光力。窗口片的材料包括硅、锗和砷化镓中的至少一种。
在本申请实施例中,透镜和红外探测器可以组成红外成像模组。该红外成像模组满足如下表达式:
其中,f1为红外成像模组的焦距,n为透镜的中心波长折射率,R为透镜的凸面曲率半径,F为红外成像模组的像方数值孔径。窗口片与红外成像模组的焦距满足如下表达式:
其中,f2为所述窗口片的焦距。
在本申请实施例中,对红外成像模组的视场角不做限定,例如,其可以在30度至50度之间或者其他可能的角度。
本申请所揭示的红外成像方法,可以通过一片式透镜和物方表面设置为曲面窗口片实现 高像质无热化需求的成像模组探测目标物体的红外图像,提升成像效果,降低成本。
本申请实施例还提供了一种红外探测装置,包括控制电路、显示器以及上述成像模组400,控制电路用于控制成像模组400生成的红外图像在显示器上显示。
本申请实施例还提供了一种车辆,包括处理器以及上述成像模组400,处理器与成像模组400能够进行通信。
本申请实施例还提供了一种装置,包括处理器和接口。所述处理器可用于执行上述方法实施例中的方法。
应理解,上述处理装置可以是一个芯片。例如,该处理装置可以是现场可编程门阵列(field programmable gate array,FPGA),可以是专用集成芯片(application specific integrated circuit,ASIC),还可以是系统芯片(system on chip,SoC),还可以是中央处理器(central processor unit,CPU),还可以是网络处理器(network processor,NP),还可以是数字信号处理电路(digital signal processor,DSP),还可以是微控制器(micro controller unit,MCU),还可以是可编程控制器(programmable logic device,PLD)或其他集成芯片。
在实现过程中,上述方法的各步骤可以通过处理器中的硬件的集成逻辑电路或者软件形式的指令完成。结合本申请实施例所公开的方法的步骤可以直接体现为硬件处理器执行完成,或者用处理器中的硬件及软件模块组合执行完成。软件模块可以位于随机存储器,闪存、只读存储器,可编程只读存储器或者电可擦写可编程存储器、寄存器等本领域成熟的存储介质中。该存储介质位于存储器,处理器读取存储器中的信息,结合其硬件完成上述方法的步骤。为避免重复,这里不再详细描述。
应注意,本申请实施例中的处理器可以是一种集成电路芯片,具有信号的处理能力。在实现过程中,上述方法实施例的各步骤可以通过处理器中的硬件的集成逻辑电路或者软件形式的指令完成。上述的处理器可以是通用处理器、数字信号处理器(DSP)、专用集成电路(ASIC)、现场可编程门阵列(FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件。可以实现或者执行本申请实施例中的公开的各方法、步骤及逻辑框图。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。结合本申请实施例所公开的方法的步骤可以直接体现为硬件译码处理器执行完成,或者用译码处理器中的硬件及软件模块组合执行完成。软件模块可以位于随机存储器,闪存、只读存储器,可编程只读存储器或者电可擦写可编程存储器、寄存器等本领域成熟的存储介质中。该存储介质位于存储器,处理器读取存储器中的信息,结合其硬件完成上述方法的步骤。
可以理解,本申请实施例中的存储器可以是易失性存储器或非易失性存储器,或可包括易失性和非易失性存储器两者。其中,非易失性存储器可以是只读存储器(read-only memory,ROM)、可编程只读存储器(programmable ROM,PROM)、可擦除可编程只读存储器(erasable PROM,EPROM)、电可擦除可编程只读存储器(electrically EPROM,EEPROM)或闪存。易失性存储器可以是随机存取存储器(random access memory,RAM),其用作外部高速缓存。通过示例性但不是限制性说明,许多形式的RAM可用,例如静态随机存取存储器(static RAM,SRAM)、动态随机存取存储器(dynamic RAM,DRAM)、同步动态随机存取存储器(synchronous DRAM,SDRAM)、双倍数据速率同步动态随机存取存储器(double data rate SDRAM,DDR SDRAM)、增强型同步动态随机存取存储器(enhanced SDRAM,ESDRAM)、同步连接动态随机存取存储器(synchlink DRAM,SLDRAM)和直接内存总线随机存取存储器(direct rambus RAM,DR RAM)。应注意,本文描述的系统和方法的存储器旨在包括但不限于这些和任意其它适合类型的存储器。
根据本申请实施例提供的方法,本申请还提供一种计算机程序产品,该计算机程序产品包括:计算机程序代码,当该计算机程序代码在计算机上运行时,使得该计算机执行图7所示实施例的方法。
根据本申请实施例提供的方法,本申请还提供一种计算机可读介质,该计算机可读介质存储有程序代码,当该程序代码在计算机上运行时,使得该计算机执行图7所示实施例的方法。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。所述计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行所述计算机指令时,全部或部分地产生按照本申请实施例所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,所述计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线(digital subscriber line,DSL))或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。所述计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质(例如,软盘、硬盘、磁带)、光介质(例如,高密度数字视频光盘(digital video disc,DVD))、或者半导体介质(例如,固态硬盘(solid state disc,SSD))等。
在本说明书中使用的术语“部件”、“模块”、“系统”等用于表示计算机相关的实体、硬件、固件、硬件和软件的组合、软件、或执行中的软件。例如,部件可以是但不限于,在处理器上运行的进程、处理器、对象、可执行文件、执行线程、程序和/或计算机。通过图示,在计算设备上运行的应用和计算设备都可以是部件。一个或多个部件可驻留在进程和/或执行线程中,部件可位于一个计算机上和/或分布在两个或更多个计算机之间。此外,这些部件可从在上面存储有各种数据结构的各种计算机可读介质执行。部件可例如根据具有一个或多个数据分组(例如来自与本地系统、分布式系统和/或网络间的另一部件交互的二个部件的数据,例如通过信号与其它系统交互的互联网)的信号通过本地和/或远程进程来通信。
本领域普通技术人员可以意识到,结合本文中所公开的实施例描述的各种说明性逻辑块(illustrative logical block)和步骤(step),能够以电子硬件、或者计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,上述描述的系统、装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请所提供的几个实施例中,应该理解到,所揭露的系统、装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述电路的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元 上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (13)

  1. 一种红外成像模组,其特征在于,包括透镜和红外探测器,其中:
    所述透镜的凹面朝向目标物体;
    所述红外探测器包括窗口片和成像面,所述窗口片的物方表面为曲面,所述窗口片位于所述透镜与所述成像面之间,所述成像面用于探测所述目标物体的红外图像。
  2. 根据权利要求1所述的红外成像模组,其特征在于,所述红外成像模组满足如下表达式:
    其中,f1为所述红外成像模组的焦距,n为所述透镜的中心波长折射率,R为所述透镜的凸面曲率半径,F为所述红外成像模组的像方数值孔径。
  3. 根据权利要求2所述的红外成像模组,其特征在于,所述窗口片与所述红外成像模组的焦距满足如下表达式:
    其中,f2为所述窗口片的焦距。
  4. 根据权利要求1至3中任一项所述的红外成像模组,其特征在于,所述透镜的物方表面和像方表面中的至少一个为二元衍射面。
  5. 根据权利要求1至4中任一项所述的红外成像模组,其特征在于,所述透镜具有正屈光力。
  6. 根据权利要求1至5中任一项所述的红外成像模组,其特征在于,所述窗口片的物方表面为非球面或衍射面。
  7. 根据权利要求1至6中任一项所述的红外成像模组,其特征在于,所述窗口片的像方表面为平面,所述窗口片具有正屈光力。
  8. 根据权利要求1至7中任一项所述的红外成像模组,其特征在于,所述透镜的材料包括硫系玻璃、硅、锗和砷化镓中的至少一种。
  9. 根据权利要求1至8中任一项所述的红外成像模组,其特征在于,所述窗口片的材料包括硅、锗和砷化镓中的至少一种。
  10. 根据权利要求1至9中任一项所述的红外成像模组,其特征在于,所述红外成像模组的视场角在30度至50度之间。
  11. 一种红外成像方法,其特征在于,所述方法适用于红外成像装置,所述红外成像装置包括透镜和红外探测器,所述红外探测器包括窗口片和成像面,所述方法包括:
    通过所述透镜得到目标物体的第一图像,所述透镜的凹面朝向所述目标物体;
    通过所述窗口片根据所述目标物体的第一图像得到所述目标物体的第二图像,所述窗口片的物方表面为曲面;
    通过所述成像面根据所述目标物体的第二图像得到所述目标物体的红外图像。
  12. 一种红外探测装置,其特征在于,包括控制电路、显示器以及如权利要求1至10中任一项所述的红外成像模组,所述控制电路用于控制所述红外成像模组生成的红外图像在所述显示器上显示。
  13. 一种移动设备,其特征在于,包括处理器以及如权利要求1至10中任一项所述的红外成像模组,所述处理器与所述红外成像模组通信。
PCT/CN2023/082398 2022-03-23 2023-03-20 一种红外成像模组和红外成像方法 WO2023179518A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210294687.9A CN116841004A (zh) 2022-03-23 2022-03-23 一种红外成像模组和红外成像方法
CN202210294687.9 2022-03-23

Publications (1)

Publication Number Publication Date
WO2023179518A1 true WO2023179518A1 (zh) 2023-09-28

Family

ID=88099944

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/082398 WO2023179518A1 (zh) 2022-03-23 2023-03-20 一种红外成像模组和红外成像方法

Country Status (2)

Country Link
CN (1) CN116841004A (zh)
WO (1) WO2023179518A1 (zh)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007171170A (ja) * 2005-11-25 2007-07-05 Matsushita Electric Works Ltd 熱型赤外線検出装置の製造方法
US20100165134A1 (en) * 2006-04-17 2010-07-01 Dowski Jr Edward R Arrayed Imaging Systems And Associated Methods
US20120013706A1 (en) * 2008-10-07 2012-01-19 Entre National de la Recherche Scientifique-CNRS Infrared wide field imaging system integrated in a vacuum housing
JP2012198191A (ja) * 2011-03-07 2012-10-18 Ricoh Co Ltd 遠赤外線検出装置
CN110488394A (zh) * 2019-08-26 2019-11-22 华中科技大学 一种长波红外复合光学系统
WO2021075807A1 (ko) * 2019-10-16 2021-04-22 이준섭 원적외선 열화상 센서 어셈블리용 윈도우 및 이를 포함하는 원적외선 열화상 센서 어셈블리

Also Published As

Publication number Publication date
CN116841004A (zh) 2023-10-03

Similar Documents

Publication Publication Date Title
US10317771B2 (en) Driver assistance apparatus and vehicle
US10768505B2 (en) Driver assistance apparatus and vehicle
JP7016966B2 (ja) 車載カメラ用レンズ装置
US9889859B2 (en) Dynamic sensor range in advanced driver assistance systems
JP7140135B2 (ja) 可変焦点距離レンズ系および撮像装置
CN111413777A (zh) 行动载具辅助系统
WO2024109364A1 (zh) 光学系统、摄像头和车辆
US11427174B2 (en) Movable carrier auxiliary system and brake controlling method thereof
JP6653456B1 (ja) 撮像装置
WO2023179518A1 (zh) 一种红外成像模组和红外成像方法
US20230065993A1 (en) Image signal processing pipelines for high dynamic range sensors
TWM585938U (zh) 行動載具輔助系統
JP2020150427A (ja) 撮像装置、撮像光学系及び移動体
JP7140133B2 (ja) 撮像レンズおよび撮像装置
US20200358933A1 (en) Imaging device and electronic apparatus
CN114829988B (zh) 透镜系统、用于控制透镜系统的方法和计算机程序产品
TWM579603U (zh) 行動載具輔助系統
JP7140136B2 (ja) 可変焦点距離レンズ系および撮像装置
US20240010208A1 (en) Map-assisted target detection for sensor calibration
JP7244129B1 (ja) ナイトビジョンカメラ
US20240010233A1 (en) Camera calibration for underexposed cameras using traffic signal targets
US20240092315A1 (en) Wiper blade lifting mechanism
Aasai et al. Object Detection in Curved Mirror with Multi-Cameras from Single Viewpoint Video
CN117452598A (zh) 一种低成本双目镜头用光学系统
CN116974042A (zh) 一种超广角高清车载车内人员监控用光学系统

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23773766

Country of ref document: EP

Kind code of ref document: A1