CN210324308U - Vehicle identification system - Google Patents

Vehicle identification system

Info

Publication number
CN210324308U
CN210324308U CN201920735841.5U
Authority
CN
China
Prior art keywords
vehicle
processor
lens
image
fingerprint
Prior art date
Legal status
Active
Application number
CN201920735841.5U
Other languages
Chinese (zh)
Inventor
张永明
赖建勋
陈舒涵
Current Assignee
Ability Opto Electronics Technology Co Ltd
Original Assignee
Ability Opto Electronics Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Ability Opto Electronics Technology Co Ltd filed Critical Ability Opto Electronics Technology Co Ltd
Application granted granted Critical
Publication of CN210324308U publication Critical patent/CN210324308U/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Lock And Its Accessories (AREA)

Abstract

The utility model relates to a vehicle identification system that uses fingerprint sensors disposed on the outside and the inside of a vehicle to capture fingerprint images. A processor compares these images against reference fingerprint images stored in a database to confirm whether the person attempting to enter the vehicle is the driver, so that no conventional key is needed to open the doors or start the vehicle.

Description

Vehicle identification system
Technical Field
The utility model discloses a vehicle identification system that confirms the driver through the dual judgement of a processor and fingerprint sensors disposed inside and outside the vehicle.
Background
Vehicles have become indispensable in modern life, yet most still require a metal key or a chip key to open the doors and start the engine.
Moreover, lock-picking techniques have advanced along with technology: a determined thief can copy a metal key or a chip key, or pick the lock directly, so the risk of vehicle theft remains.
In view of these shortcomings, the inventors devised the vehicle identification system of the present invention to improve on the known technology and enhance its industrial applicability.
Summary of the Utility Model
In view of the above-mentioned known problems, an object of the present invention is to provide an identification system for a vehicle, which solves the problems encountered in the prior art.
Based on the above object, the present invention provides a vehicle identification system, suitable for a vehicle, comprising at least one first fingerprint sensor, at least one second fingerprint sensor, a database, and a processor. The first fingerprint sensor is disposed outside the vehicle to generate a first fingerprint image. The second fingerprint sensor is disposed inside the vehicle to generate a second fingerprint image. The database is disposed inside the vehicle and stores a primary reference fingerprint image. The processor is disposed inside the vehicle and electrically connected to the database, the first fingerprint sensor, and the second fingerprint sensor to receive the first fingerprint image, the second fingerprint image, and the primary reference fingerprint image. When the processor judges that the first fingerprint image matches the primary reference fingerprint image, the processor obtains pass authentication and allows the doors of the vehicle to be opened; when the processor judges that the second fingerprint image matches the primary reference fingerprint image, the processor obtains use authentication and allows the engine of the vehicle to be started. This dual judgement confirms whether the person attempting to enter the vehicle is the driver, without the need for a conventional key to open the doors or start the vehicle.
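The dual-judgement flow just described can be sketched in a few lines. This is an illustrative sketch only — the matcher is a placeholder equality check, and all names are hypothetical, not from the patent:

```python
def authenticate(first_image, second_image, primary_reference):
    """Return (pass_granted, use_granted) for the two sensor readings.

    pass_granted: exterior sensor match -> doors may open.
    use_granted:  interior sensor match -> engine may start.
    """
    pass_granted = first_image == primary_reference   # placeholder matcher
    use_granted = second_image == primary_reference
    return pass_granted, use_granted

# Exterior print matches but interior print does not: the door opens
# while the engine stays off, i.e. pass authentication without use authentication.
print(authenticate("ridge-A", "ridge-B", "ridge-A"))  # → (True, False)
```

The two checks are written independently, mirroring the two "when the processor judges…" clauses above; a real implementation would replace the equality test with a fingerprint-matching routine.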
Preferably, when the processor determines that the first fingerprint image does not match the main reference fingerprint image, the processor does not obtain the pass authentication and locks the doors of the vehicle.
Preferably, when the processor determines that the first fingerprint image matches the primary reference fingerprint image but the second fingerprint image does not, the processor obtains the pass authentication but not the use authentication, and the second fingerprint sensor senses the fingerprint again.
Preferably, the processor obtains the user information based on the first fingerprint image or the second fingerprint image.
Preferably, the database further comprises a plurality of secondary reference fingerprint images for providing to the processor.
Preferably, when the processor judges that the first fingerprint image matches one of the plurality of secondary reference fingerprint images, the processor obtains the pass authentication and allows the door of the vehicle to be opened; when the processor determines that the second fingerprint image matches one of the plurality of secondary reference fingerprint images, the processor obtains the use authentication and allows the engine of the vehicle to be started.
Preferably, when the processor determines that the first fingerprint image does not match any of the plurality of secondary reference fingerprint images, the processor does not obtain the pass authentication and locks the doors of the vehicle.
Preferably, when the processor determines that the first fingerprint image matches one of the plurality of secondary reference fingerprint images but the second fingerprint image does not match any of them, the processor obtains the pass authentication without obtaining the use authentication, and the second fingerprint sensor senses again or the processor searches the database again for another of the plurality of secondary reference fingerprint images.
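The secondary-reference clauses above extend the same check to a set of authorised users: matching any one image in the set grants the corresponding authentication. A minimal sketch, with hypothetical names and a placeholder equality matcher:

```python
def matches_any(image, references):
    """True if the image conforms to at least one reference in the set."""
    return any(image == ref for ref in references)

# Fingerprints of other users the driver has authorised to use the vehicle.
secondary_refs = ["user-1", "user-2", "user-3"]

print(matches_any("user-2", secondary_refs))    # → True  (authentication granted)
print(matches_any("stranger", secondary_refs))  # → False (doors stay locked)
```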
Preferably, the vehicle identification system of the present invention further comprises a positioning element, and when the processor obtains the pass authentication and the use authentication, the positioning element positions and establishes a map for the vehicle.
Preferably, the vehicle identification system of the present invention further includes a wireless transceiver, an external electronic device and a peripheral component, the wireless transceiver is disposed inside the vehicle and wirelessly connected to the external electronic device and the processor, the peripheral component is disposed on the vehicle, when the processor obtains the pass authentication and the use authentication, the external electronic device transmits a control signal to the processor through the wireless transceiver, and the processor operates the peripheral component according to the control signal; alternatively, when the processor obtains the pass authentication and the use authentication, the processor enables the peripheral device to operate.
Preferably, the peripheral element comprises a driver seat, a passenger side door, a passenger side window, a rear view mirror, an air conditioner, an instrument panel, a drive recorder, a lighting device, a multimedia player, an airbag, or a vehicle transmission.
Preferably, the first fingerprint sensor and the second fingerprint sensor are optical fingerprint sensors and have lenses, and the lenses at least comprise three lenses with refractive power.
Preferably, the lens further satisfies the following conditions:
1.0 ≦ f/HEP ≦ 10.0;
0 deg ≦ HAF ≦ 150 deg;
0 mm ≦ PhiD ≦ 18 mm;
PhiA/PhiD ≦ 0.99; and
0.9 ≦ 2(ARE/HEP) ≦ 2.0
wherein f is the focal length of the lens; HEP is the entrance pupil diameter of the lens; HAF is half of the maximum view angle of the lens; PhiD is the maximum value of the minimum side length of the plane at the outer periphery of the lens base, perpendicular to the optical axis of the lens; PhiA is the maximum effective diameter of the lens surface closest to the imaging surface; and ARE is the contour curve length of any lens surface, measured along the surface contour from its intersection with the optical axis to the point on the surface at a vertical height of 1/2 the entrance pupil diameter from the optical axis.
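The five bounds above can be checked mechanically for a candidate lens prescription. The following sketch assumes consistent units (mm for lengths, degrees for angles); the sample values are invented for illustration and are not from the patent:

```python
def lens_conditions_ok(f, hep, haf_deg, phi_d, phi_a, are):
    """Check the five preferred lens conditions listed above."""
    return (1.0 <= f / hep <= 10.0            # 1.0 ≦ f/HEP ≦ 10.0
            and 0.0 <= haf_deg <= 150.0       # 0 deg ≦ HAF ≦ 150 deg
            and 0.0 <= phi_d <= 18.0          # 0 mm ≦ PhiD ≦ 18 mm
            and phi_a / phi_d <= 0.99         # PhiA/PhiD ≦ 0.99
            and 0.9 <= 2 * are / hep <= 2.0)  # 0.9 ≦ 2(ARE/HEP) ≦ 2.0

# An invented wide-angle prescription satisfying all five bounds:
# f/HEP = 2.0, HAF = 60 deg, PhiD = 10 mm, PhiA/PhiD = 0.8, 2(ARE/HEP) ≈ 1.07.
print(lens_conditions_ok(f=2.8, hep=1.4, haf_deg=60.0,
                         phi_d=10.0, phi_a=8.0, are=0.75))  # → True
```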
Preferably, the first fingerprint sensor and the second fingerprint sensor are capacitive or resistive fingerprint sensors.
Based on the above object, the present invention also provides a vehicle identification system, suitable for a vehicle, comprising at least one fingerprint sensor, a database, and a processor. The fingerprint sensor is disposed on the vehicle to generate a fingerprint image. The database is disposed inside the vehicle and stores at least one frame of a first reference image and at least one frame of a second reference image. The processor is disposed inside the vehicle and electrically connected to the database and the fingerprint sensor. When the processor judges that the fingerprint image matches the first reference image, the processor obtains a first usage right and enters an owner mode; when the processor judges that the fingerprint image matches the second reference image, the processor obtains a second usage right and enters a visitor mode. The judgement of the processor distinguishes whether the person entering the vehicle is the driver or a visitor, which increases driving safety.
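The single-sensor variant above reduces to comparing one fingerprint image against two stored references and picking an operating mode. A hedged sketch with invented names and a placeholder equality matcher:

```python
def select_mode(image, first_reference, second_reference):
    """Pick an operating mode from one fingerprint image."""
    if image == first_reference:
        return "owner"    # first usage right: high- or low-speed driving
    if image == second_reference:
        return "visitor"  # second usage right: low-speed driving only
    return "denied"

print(select_mode("alice", "alice", "bob"))  # → owner
print(select_mode("bob", "alice", "bob"))    # → visitor
print(select_mode("eve", "alice", "bob"))    # → denied
```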
Preferably, the fingerprint sensor is divided into a first fingerprint sensor disposed outside the vehicle to generate a first fingerprint image and a second fingerprint sensor disposed inside the vehicle to generate a second fingerprint image.
Preferably, when the processor determines that the first fingerprint image matches the first reference image or the second reference image, the processor obtains the pass authentication to open the door of the vehicle.
Preferably, when the processor obtains the pass authentication and determines that the second fingerprint image matches the first reference image, the processor obtains the first usage right to enter the owner mode, and the processor enables the vehicle to run at a high speed or a low speed.
Preferably, the vehicle identification system of the present invention further includes a positioning element, and when the processor obtains the pass authentication and the first right of use, the positioning element positions and establishes a map for the vehicle.
Preferably, the vehicle identification system of the present invention further includes a wireless transceiver, an external electronic device and a peripheral component, the wireless transceiver is disposed inside the vehicle and wirelessly connected to the external electronic device and the processor, the peripheral component is disposed on the vehicle, when the processor obtains the pass authentication and the first usage right, the external electronic device transmits a control signal to the processor through the wireless transceiver, and the processor operates the peripheral component according to the control signal; alternatively, when the processor obtains the pass authentication and the first right of use, the processor causes the peripheral device to operate.
Preferably, the peripheral element comprises a driver seat, a passenger side door, a passenger side window, a rear view mirror, an air conditioner, an instrument panel, a drive recorder, a lighting device, a multimedia player, an airbag, or a vehicle transmission.
Preferably, when the processor obtains the pass authentication and determines that the second fingerprint image matches the second reference image, the processor obtains the second usage right to enter the visitor mode, and the processor limits the vehicle to low-speed driving.
Preferably, the vehicle identification system of the present invention further includes a positioning element; when the processor obtains the pass authentication and the second usage right, the positioning element performs positioning and establishes a map for the vehicle, and the vehicle can only travel to a plurality of restricted places in the map.
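The visitor-mode restriction just described — the map limiting where the vehicle may travel — can be sketched as a simple allow-list check. The place names and allow-list below are invented for illustration:

```python
ALLOWED_PLACES = {"school", "home", "grocery"}  # restricted places on the map

def destination_permitted(mode, destination):
    """Owners may travel anywhere; visitors only to the restricted places."""
    return mode == "owner" or destination in ALLOWED_PLACES

print(destination_permitted("visitor", "school"))   # → True
print(destination_permitted("visitor", "airport"))  # → False
print(destination_permitted("owner", "airport"))    # → True
```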
Preferably, the first fingerprint sensor and the second fingerprint sensor are optical fingerprint sensors and have lenses, and the lenses at least comprise three lenses with refractive power.
Preferably, the lens further satisfies the following conditions:
1.0 ≦ f/HEP ≦ 10.0;
0 deg ≦ HAF ≦ 150 deg;
0 mm ≦ PhiD ≦ 18 mm;
PhiA/PhiD ≦ 0.99; and
0.9 ≦ 2(ARE/HEP) ≦ 2.0
wherein f is the focal length of the lens; HEP is the entrance pupil diameter of the lens; HAF is half of the maximum view angle of the lens; PhiD is the maximum value of the minimum side length of the plane at the outer periphery of the lens base, perpendicular to the optical axis of the lens; PhiA is the maximum effective diameter of the lens surface closest to the imaging surface; and ARE is the contour curve length of any lens surface, measured along the surface contour from its intersection with the optical axis to the point on the surface at a vertical height of 1/2 the entrance pupil diameter from the optical axis.
Preferably, the first fingerprint sensor and the second fingerprint sensor are capacitive or resistive fingerprint sensors.
In view of the above, the present invention provides a vehicle identification system, which is suitable for a vehicle and includes a sound sensor, a database and a processor. The sound sensor is arranged outside or inside the vehicle to obtain sound information. The database is arranged in the vehicle and stores the main reference sound information. The processor is arranged in the vehicle and electrically connected with the database and the sound sensor, and when the processor judges that the sound information is consistent with the main reference sound information, the processor obtains the use authentication to open the door of the vehicle and start the engine of the vehicle; when the processor judges that the sound information does not match the main reference sound information, the processor locks the door of the vehicle.
Preferably, the processor obtains the user information based on the sound information.
Preferably, the database further comprises a plurality of secondary reference sound information for providing to the processor.
Preferably, when the processor determines that the sound information matches one of the plurality of secondary reference sound information, the processor obtains the use authentication and allows the door of the vehicle to be opened and the engine of the vehicle to be started; when the processor determines that the sound information does not match any of the plurality of secondary reference sound information, the processor locks the door of the vehicle.
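The sound-based variant grants both the door and the engine in a single step: one match against the primary or any secondary reference sound information suffices. A minimal sketch with placeholder matching and invented names:

```python
def voice_authenticate(sound, primary_ref, secondary_refs):
    """Single-step use authentication from captured sound information."""
    if sound == primary_ref or sound in secondary_refs:
        return "use-authenticated"  # open the door and start the engine
    return "locked"                 # doors stay locked

print(voice_authenticate("owner-voice", "owner-voice", {"kid-voice"}))
# → use-authenticated
print(voice_authenticate("thief-voice", "owner-voice", {"kid-voice"}))
# → locked
```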
Preferably, the vehicle identification system of the present invention further comprises a positioning element, and when the processor obtains the usage authentication, the positioning element performs positioning and map building for the vehicle.
Preferably, the vehicle identification system of the present invention further includes a wireless transceiver, an external electronic device, and a peripheral component; the wireless transceiver is disposed inside the vehicle and wirelessly connected to the external electronic device and the processor, and the peripheral component is disposed on the vehicle. When the processor obtains the use authentication, the external electronic device transmits a control signal to the processor through the wireless transceiver, and the processor operates the peripheral component according to the control signal; alternatively, when the processor obtains the use authentication, the processor enables the peripheral component to operate.
Preferably, the peripheral element comprises a driver seat, a passenger side door, a passenger side window, a rear view mirror, an air conditioner, an instrument panel, a drive recorder, a lighting device, a multimedia player, an airbag, or a vehicle transmission.
The advantage of the foregoing embodiments is that the vehicle identification system of the present invention uses fingerprint sensors to obtain fingerprint images and, with the judgement of the processor, identifies whether the person attempting to enter the vehicle is the driver, without needing a conventional key to open the doors or start the vehicle.
The foregoing embodiments likewise show that the vehicle identification system of the present invention can use a sound sensor to obtain sound information and, with the judgement of the processor, identify whether the person entering the vehicle is the driver, again without a conventional key.
Drawings
Fig. 1 is a block diagram of a first embodiment of the vehicle identification system of the present invention;
fig. 2 and 3 are configuration diagrams of a first fingerprint sensor of the vehicle identification system of the present invention;
fig. 4 is a configuration diagram of a second fingerprint sensor of the vehicle identification system of the present invention;
fig. 5 is a block diagram of a second embodiment of the vehicle identification system of the present invention;
fig. 6 is a configuration diagram of a first optical embodiment of the lens of the identification system for the vehicle according to the present invention;
fig. 7 is a graph showing spherical aberration, astigmatism and optical distortion of the first optical embodiment of the present invention from left to right in sequence;
fig. 8 is a configuration diagram of a second optical embodiment of the vehicular identification system lens of the present invention;
fig. 9 is a graph showing spherical aberration, astigmatism and optical distortion of the second optical embodiment of the present invention from left to right in sequence;
fig. 10 is a configuration diagram of a third optical embodiment of the identification system lens for a vehicle according to the present invention;
fig. 11 is a graph showing spherical aberration, astigmatism and optical distortion of the third optical embodiment of the present invention from left to right in sequence;
fig. 12 is a configuration diagram of a fourth optical embodiment of the vehicular identification system lens of the present invention;
fig. 13 is a graph showing spherical aberration, astigmatism and optical distortion of a fourth optical embodiment of the present invention from left to right in sequence;
fig. 14 is a configuration diagram of a fifth optical embodiment of the identification system lens for a vehicle according to the present invention;
fig. 15 is a graph showing spherical aberration, astigmatism and optical distortion of a fifth optical embodiment of the present invention from left to right in sequence;
fig. 16 is a configuration diagram of a sixth optical embodiment of the identification system lens for a vehicle according to the present invention;
fig. 17 is a graph sequentially showing spherical aberration, astigmatism and optical distortion of a sixth optical embodiment of the present invention from left to right;
fig. 18 is a block diagram of a third embodiment of the vehicle identification system of the present invention.
[Notation]
1: vehicle identification system
10: processor
20: database
30: positioning element
40: peripheral component
50: human-machine interface
60: aperture
70: infrared filter
111, 121, 131, 141, 151, 161, 171: object side
112, 122, 132, 142, 152, 162, 172: image side
C: vehicle
F1: first fingerprint sensor
F2: second fingerprint sensor
H: handle
I1: first fingerprint image
I2: second fingerprint image
O: owner mode
PI: primary reference fingerprint image
PSF: primary reference sound information
RT: wireless transceiver
S: sound sensor
SF: sound information
SI: secondary reference fingerprint image
SSF: secondary reference sound information
UI: user information
V: visitor mode
Detailed Description
The advantages, features, and technical solutions of the present invention will be better understood by reference to the exemplary embodiments and the accompanying drawings. The present invention may, however, be embodied in different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will fully convey the scope of the invention to those skilled in the art. The present invention is defined only by the appended claims.
It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, components, regions, layers and/or sections, these elements, components, regions, layers and/or sections should not be limited by these terms. These terms are only used to distinguish one element, component, region, layer or section from another element, component, region, layer or section. Thus, the "first element," "first part," "first region," "first layer" and/or "first portion" discussed below may be referred to as "second element," "second part," "second region," "second layer" and/or "second portion" without departing from the spirit and teachings of the present invention.
Furthermore, the terms "comprises" and/or "comprising" refer to the presence of stated features, regions, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, regions, integers, steps, operations, elements, components, and/or groups thereof.
Unless defined otherwise, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and the present invention and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
Please refer to fig. 1 to 4, which are a block diagram of a first embodiment of the vehicle identification system, configuration diagrams of a first fingerprint sensor of the vehicle identification system, and a configuration diagram of a second fingerprint sensor of the vehicle identification system, respectively. As shown in fig. 1 to 4, the vehicle identification system of the present invention is suitable for a vehicle C and includes at least one first fingerprint sensor F1, at least one second fingerprint sensor F2, a database 20, and a processor 10. Each first fingerprint sensor F1 is disposed outside the vehicle C to generate a first fingerprint image I1. Each second fingerprint sensor F2 is disposed inside the vehicle C to generate a second fingerprint image I2. The database 20 is disposed inside the vehicle C and stores the primary reference fingerprint image PI. The processor 10 is disposed inside the vehicle C and electrically connected to the database 20, each first fingerprint sensor F1, and each second fingerprint sensor F2 to receive the first fingerprint image I1, the second fingerprint image I2, and the primary reference fingerprint image PI. When the processor 10 determines that the first fingerprint image I1 matches the primary reference fingerprint image PI, the processor 10 obtains pass authentication and opens the door of the vehicle C; when the processor 10 determines that the second fingerprint image I2 matches the primary reference fingerprint image PI, the processor 10 obtains use authentication and starts the engine of the vehicle C. This dual determination by the processor 10 confirms whether the person attempting to enter the vehicle C is the driver, so that no conventional key is required to open the door or start the vehicle.
The outside and the inside of the vehicle C are distinguished with reference to the door; that is, the side of the door close to the outside environment is the outside of the vehicle C, and the side of the door close to the seat is the inside of the vehicle C. This is for illustrative purposes only and does not limit the scope of the present invention.
In one embodiment, the first fingerprint sensor F1 may be disposed on a vehicle door as shown in fig. 2; in another embodiment, the first fingerprint sensor F1 may be disposed on the handle H of the vehicle door as shown in fig. 3. The first fingerprint sensor F1 may be placed in any other suitable location, as long as it is outside the vehicle C; the invention is not limited to the illustrated positions. In addition, the plurality of second fingerprint sensors F2 may be disposed near the steering wheel, the control lever, and the multimedia player, respectively, as shown in fig. 4, but other positions are also possible, and the invention is not limited to those illustrated.
In one embodiment, each first fingerprint sensor F1 and each second fingerprint sensor F2 is an optical fingerprint sensor and has a lens; in another embodiment, each first fingerprint sensor and each second fingerprint sensor is a capacitive or resistive fingerprint sensor. Of course, optical and capacitive fingerprint sensors can be mixed among the first fingerprint sensors F1 and the second fingerprint sensors F2 according to actual needs; the invention is not limited in this respect.
In some embodiments, the processor 10 may determine whether the first fingerprint image I1 matches the primary reference fingerprint image PI based on the ridge endpoints of the fingerprint in the reference image; in other embodiments, based on the bifurcation points of the fingerprint; in still other embodiments, based on the overall fingerprint pattern. Of course, other fingerprint features in the reference fingerprint image PI can also serve as the basis for the determination; the invention is not limited to the features listed.
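The feature-based comparisons mentioned above (ridge endpoints, bifurcation points, overall pattern) are commonly reduced to minutiae matching. The sketch below is illustrative only — it is not the patent's algorithm — and represents each print as a set of (type, x, y) feature tuples compared against an invented overlap threshold:

```python
def conforms(features_a, features_b, threshold=0.8):
    """Jaccard-style overlap of two minutiae sets against a threshold."""
    if not features_a or not features_b:
        return False
    overlap = len(features_a & features_b) / len(features_a | features_b)
    return overlap >= threshold

reference = {("end", 10, 42), ("branch", 55, 17),
             ("end", 80, 80), ("branch", 30, 60)}
probe = {("end", 10, 42), ("branch", 55, 17),
         ("end", 80, 80), ("branch", 30, 60)}
print(conforms(probe, reference))  # → True (identical minutiae sets)
```

A production matcher would also tolerate rotation, translation, and small positional error rather than requiring exact tuple equality.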
The operation of the processor 10 with each first fingerprint sensor F1 and each second fingerprint sensor F2 is as follows: (1) when the processor 10 determines that the first fingerprint image I1 does not match the primary reference fingerprint image PI, the processor 10 does not obtain pass authentication and locks the door of the vehicle C, and the person attempting to enter the vehicle C is rejected. (2) When the processor 10 determines that the first fingerprint image I1 matches the primary reference fingerprint image PI but the second fingerprint image I2 does not, the processor 10 obtains pass authentication without use authentication; the second fingerprint sensor F2 may simply have failed to sense the person who has entered the vehicle C, so that person places a finger on the second fingerprint sensor F2 again and it senses once more. (3) When the processor 10 determines that both the first fingerprint image I1 and the second fingerprint image I2 match the primary reference fingerprint image PI, the processor 10 obtains pass authentication and use authentication, i.e., confirms that the person entering the vehicle C is the driver.
Furthermore, the vehicle identification system of the present invention further comprises a positioning element 30, a peripheral element 40, a human-machine interface 50 with a wireless transceiver RT, and an external electronic device. The positioning element 30, the peripheral element 40, and the human-machine interface 50 with the wireless transceiver RT are disposed inside the vehicle C and electrically connected to the processor 10, and the external electronic device is wirelessly connected to the wireless transceiver RT. The wireless connection may use the Internet, Wi-Fi, WiMAX (Worldwide Interoperability for Microwave Access), ZigBee, Bluetooth, NB-IoT (Narrowband IoT), or LoRa (Long Range), among others; the invention is not limited to those listed.
Furthermore, when the processor 10 determines that both the first fingerprint image I1 and the second fingerprint image I2 match the primary reference fingerprint image PI, the processor 10 obtains pass authentication and use authentication and confirms that the person entering the vehicle C is the driver. After obtaining both authentications, the processor 10 sends a start signal to operate the positioning element 30. The positioning element 30 performs positioning for the vehicle C and establishes a map according to the surrounding environment, and can also report the weather and temperature at the vehicle's location, so that the driver knows the local environment and traffic. The driver controls the operation of the peripheral element 40 through the human-machine interface 50; alternatively, the driver can send a control signal from an external electronic device to the wireless transceiver RT, which relays it to the processor 10, and the processor 10 operates the peripheral element 40 according to the control signal. Meanwhile, the processor 10 records the settings of the peripheral element 40 and stores them as user information UI in the database 20; when the driver is sensed again at the first fingerprint sensor F1 and the second fingerprint sensor F2, the processor 10 retrieves the user information UI from the database 20 according to the first fingerprint image I1 and the second fingerprint image I2 and displays it on the human-machine interface 50.
The user information UI can include the driver's height, age, and frequently visited locations. The external electronic device can be a mobile phone or a tablet, or any other electronic device capable of connecting wirelessly to the wireless transceiver RT; the invention is not limited to those listed. The peripheral elements 40 include a driver seat, a passenger side door, a passenger side window, a rear view mirror, an air conditioner, an instrument panel, a driving recorder, a lighting device, a multimedia player, an airbag, or a vehicle transmission; anything controllable by the processor 10 can serve as a peripheral element 40, and the invention is not limited to those listed.
In addition, the database 20 further stores a plurality of secondary reference fingerprint images SI for the processor 10. The secondary reference fingerprint images SI are the fingerprint images of other users whom the driver has authorized to use the vehicle C. The operation of the processor 10, the secondary reference fingerprint images SI, each first fingerprint sensor F1, and each second fingerprint sensor F2 is described in detail as follows: (1) When the processor 10 determines that the first fingerprint image I1 matches none of the secondary reference fingerprint images SI, the processor 10 does not obtain the pass authentication and keeps the doors of the vehicle C locked, because the coming person who wants to enter the vehicle C is not a person authorized by the driver to use the vehicle C. (2) When the processor 10 determines that the first fingerprint image I1 matches one of the secondary reference fingerprint images SI but the second fingerprint image I2 matches none of them, the processor 10 obtains the pass authentication without obtaining the use authentication. In this case, a second fingerprint sensor F2 may have failed to properly sense the coming person who has entered the vehicle C, so that person places a finger on the second fingerprint sensor F2 again; alternatively, the processor 10 may not have found a matching secondary reference fingerprint image SI in time, so the processor 10 searches the database 20 again among the plurality of secondary reference fingerprint images SI.
(3) When the processor 10 determines that the first fingerprint image I1 and the second fingerprint image I2 both match one of the plurality of secondary reference fingerprint images SI, the processor 10 obtains the pass authentication and the use authentication, confirms that the coming person who wants to enter the vehicle C is a person authorized by the driver to use the vehicle C, and then opens the doors of the vehicle C and starts the engine of the vehicle C.
Similarly, when the processor 10 determines that the first fingerprint image I1 and the second fingerprint image I2 both match one of the plurality of secondary reference fingerprint images SI, the processor 10 obtains the pass authentication and the use authentication and confirms that the coming person who wants to enter the vehicle C is a user authorized by the driver to use the vehicle C. After obtaining the pass authentication and the use authentication, the processor 10 sends out the start signal to operate the positioning element 30; the positioning element 30 performs positioning for the vehicle C and establishes a map according to the surrounding environment of the vehicle C, and can also obtain the weather and temperature at the location of the vehicle C, so that the user can know the surrounding environment and traffic. The user controls the operation of the peripheral elements 40 through the human-machine interface 50; alternatively, the user can send a control signal from an external electronic device to the wireless transceiver RT, which transmits the control signal to the processor 10, and the processor 10 controls the operation of the peripheral elements 40 according to the control signal. Meanwhile, the processor 10 records the settings of the peripheral elements 40 and stores them as the user information UI in the database 20; when the user is sensed again at the first fingerprint sensor F1 and the second fingerprint sensor F2, the processor 10 can retrieve the user information UI from the database 20 according to the first fingerprint image and the second fingerprint image and display it on the human-machine interface 50.
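The two-stage flow described in the preceding paragraphs can be sketched in code for illustration. The function and the equality matcher below are hypothetical names invented for this sketch, not part of the patent's disclosure; a real system would use an actual fingerprint-matching routine in place of `match`.

```python
# Illustrative sketch of the two-stage authentication flow; authenticate()
# and match() are hypothetical names, not part of the claimed system.

def authenticate(first_image, second_image, primary_ref, secondary_refs, match):
    """Return (pass_auth, use_auth): the door-unlock and engine-start decisions."""
    refs = [primary_ref] + list(secondary_refs)
    # Stage 1: the exterior sensor's image I1 grants the pass authentication.
    pass_auth = any(match(first_image, ref) for ref in refs)
    if not pass_auth:
        return False, False          # doors of the vehicle stay locked
    # Stage 2: the interior sensor's image I2 grants the use authentication.
    use_auth = any(match(second_image, ref) for ref in refs)
    return pass_auth, use_auth

# Toy matcher: exact equality of image identifiers, for demonstration only.
match_eq = lambda a, b: a == b
print(authenticate("driver", "driver", "driver", ["guest1"], match_eq))    # (True, True)
print(authenticate("guest1", "stranger", "driver", ["guest1"], match_eq))  # (True, False)
```

The second call illustrates case (2) above: the exterior image matches an authorized user, so the doors open, but the interior image matches nothing, so the use authentication is withheld.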
Please refer to fig. 5, which is a block diagram of a second embodiment of the vehicle identification system of the present invention. As shown in fig. 5, the second embodiment differs from the first embodiment in that the processor 10 has an owner mode O and a visitor mode V, and the database 20 stores at least one first reference image R1 and at least one second reference image R2. The configuration of the remaining elements is the same as in the first embodiment and is not repeated here.
The operation of the processor 10, the first reference image R1, the second reference image R2, each first fingerprint sensor F1, and each second fingerprint sensor F2 is described in detail as follows: (1) When the processor 10 determines that the first fingerprint image I1 matches neither the first reference image R1 nor the second reference image R2, the processor 10 does not obtain the pass authentication and locks the doors of the vehicle C, and the person who wants to enter the vehicle C is kept outside; when the processor 10 determines that the first fingerprint image I1 matches the first reference image R1 or the second reference image R2, the processor 10 obtains the pass authentication and opens the doors of the vehicle C. (2) When the processor 10 obtains the pass authentication but determines that the second fingerprint image I2 matches neither the first reference image R1 nor the second reference image R2, the processor 10 obtains the pass authentication without obtaining the first usage right or the second usage right; a second fingerprint sensor F2 may have failed to properly sense the coming person who has entered the vehicle C, so that person places a finger on the second fingerprint sensor F2 again. (3) When the processor 10 obtains the pass authentication and determines that the second fingerprint image I2 matches the first reference image R1, the processor 10 obtains the first usage right and enters the owner mode O, and the processor 10 allows the vehicle C to be driven at a high speed or a low speed. (4) When the processor 10 obtains the pass authentication and determines that the second fingerprint image I2 matches the second reference image R2, the processor 10 obtains the second usage right and enters the visitor mode V, and the processor 10 allows the vehicle C to be driven only at a low speed.
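The mode-selection cases above can be summarized in a short sketch. All names below (the function, the reference-image lists, and the equality matcher) are invented for illustration and are not part of the patent's disclosure.

```python
# Hypothetical sketch of the owner/visitor mode selection of the second
# embodiment; the R1/R2 lists and the equality matcher are illustrative.

OWNER, VISITOR, RETRY = "owner", "visitor", "retry"

def select_mode(second_image, r1_refs, r2_refs, match):
    """Pick the usage mode for a pass-authenticated occupant from the
    second (in-cabin) fingerprint image."""
    if any(match(second_image, r) for r in r1_refs):
        return OWNER    # first usage right: high- or low-speed driving
    if any(match(second_image, r) for r in r2_refs):
        return VISITOR  # second usage right: low-speed driving only
    return RETRY        # ask the occupant to place a finger on F2 again

match_eq = lambda a, b: a == b
print(select_mode("owner_fp", ["owner_fp"], ["kid_fp"], match_eq))  # owner
print(select_mode("kid_fp", ["owner_fp"], ["kid_fp"], match_eq))    # visitor
```

Checking R1 before R2 mirrors the order of cases (3) and (4): the broader first usage right takes precedence when an image happens to appear in both reference sets.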
It should be noted that the user with the first usage right is the driver or a user authorized by the driver, and can perform various operations on the vehicle through the human-machine interface 50, such as controlling the position of a seat in the vehicle, controlling the temperature and wind direction of the air conditioning equipment, retrieving footage recorded by the driving recorder, playing a preset radio station, and automatically adjusting the rearview mirror. A user with the second usage right, such as a younger user (for example, a college student), needs to manually start the vehicle C and operate the peripheral elements 40. Of course, additional usage rights can be added according to the usage situation, without limiting the scope of the present invention.
When the processor 10 determines that the first fingerprint image I1 and the second fingerprint image I2 both match the first reference image R1, the processor 10 obtains the pass authentication and the first usage right, enters the owner mode O, and confirms that the coming person who wants to enter the vehicle C is the driver or a user authorized by the driver. The processor 10 allows the vehicle C to be driven at a high speed or a low speed; the positioning element 30 performs positioning for the vehicle C and establishes a map according to the surrounding environment of the vehicle C, and can also obtain the weather and temperature at the location of the vehicle C, so that the driver can know the surrounding environment and traffic. The driver controls the operation of the peripheral elements 40 through the human-machine interface 50; alternatively, the driver can send a control signal from an external electronic device to the wireless transceiver RT, which transmits the control signal to the processor 10, and the processor 10 controls the operation of the peripheral elements 40 according to the control signal. Meanwhile, the processor 10 records the settings of the peripheral elements 40 and stores them in the database 20; when the driver is sensed again at the first fingerprint sensor F1 and the second fingerprint sensor F2, the processor 10 can retrieve the stored settings of the peripheral elements 40 from the database 20 according to the first fingerprint image and the second fingerprint image, and the processor 10 displays the user information UI on the human-machine interface 50.
In addition, when the processor 10 obtains the pass authentication and the first usage right and enters the owner mode O, it is confirmed that the coming person who has entered the vehicle C is the driver or a user authorized by the driver. In one embodiment, the human-machine interface 50 matches the seat position or the backrest tilt angle to the height and driving habits of the driver or the authorized user, or the driver or the authorized user can adjust the passenger side window and lock/unlock the passenger side door through the human-machine interface 50. In one embodiment, the human-machine interface 50 may automatically tune the radio to the frequency that the driver or the authorized user prefers to listen to, or the human-machine interface 50 may cause the wireless transceiver RT to automatically connect to the mobile phone of the driver or the authorized user.
In one embodiment, the human-machine interface 50 may adjust the GPS of the positioning element 30, loading the locations that the driver or the authorized user frequently visits or the destination coordinates set in advance by the user. In one embodiment, the human-machine interface 50 can adjust the rearview mirror to suit the best viewing angle or driving habits of the driver or the authorized user, or the human-machine interface 50 can adjust the display mode or displayed information of the instrument panel, adjust the recording time of the driving recorder, or turn the driving recorder on/off. In one embodiment, the human-machine interface 50 may be configured to activate the airbag, or the human-machine interface 50 may adjust the transmission to meet the preferences of the driver or the authorized user for different driving wheel traction and vehicle speeds under different driving conditions, such as starting, accelerating, cruising, and overcoming various road obstacles.
In one embodiment, the human-machine interface 50 can adjust the air conditioning equipment so that the temperature and humidity of the environment in the vehicle C meet the preferences of the driver or the authorized user. In one embodiment, the human-machine interface 50 can adjust the lighting device so that the brightness and color temperature of the environment in the vehicle C conform to the preferences of the driver or the authorized user, or the human-machine interface 50 can adjust the multimedia player to load a playlist or playback mode in the multimedia player that conforms to those preferences.
The operations of the peripheral elements 40 of the vehicle C and the starting of the vehicle C described in the preceding paragraphs are merely examples; the operation of other elements of the vehicle C may also be adjusted according to the needs of the driver or the authorized user, without limiting the scope of the present invention.
When the processor 10 determines that the first fingerprint image I1 and the second fingerprint image I2 both match the second reference image R2, the processor 10 obtains the pass authentication and the second usage right, enters the visitor mode V, and confirms that the coming person who wants to enter the vehicle C is a younger user. The processor 10 allows the vehicle C to be driven only at a low speed; the positioning element 30 performs positioning for the vehicle C and establishes a map according to the surrounding environment of the vehicle C, and can also obtain the weather and temperature at the location of the vehicle C. Because the processor 10 has obtained only the second usage right, the younger user can drive the vehicle C only to a number of restricted locations on the map, and the peripheral elements 40 must be turned on manually.
It should be noted that additional usage rights can be added according to the usage situation, and correspondingly, the reference images in the database can also be increased, without limiting the scope of the present invention.
In some embodiments, the lens includes three lens elements with refractive power, namely a first lens element, a second lens element, and a third lens element arranged in order from an object side to an image side, and the lens satisfies the following condition: 0.1 ≦ InTL/HOS ≦ 0.95; wherein HOS is the distance on the optical axis from the object-side surface of the first lens element to the imaging surface, and InTL is the distance on the optical axis from the object-side surface of the first lens element to the image-side surface of the third lens element.
In some embodiments, the lens includes four lens elements with refractive power, namely a first lens element, a second lens element, a third lens element, and a fourth lens element arranged in order from an object side to an image side, and the lens satisfies the following condition: 0.1 ≦ InTL/HOS ≦ 0.95; wherein HOS is the distance on the optical axis from the object-side surface of the first lens element to the imaging surface, and InTL is the distance on the optical axis from the object-side surface of the first lens element to the image-side surface of the fourth lens element.
In some embodiments, the lens includes five lens elements with refractive power, namely a first lens element, a second lens element, a third lens element, a fourth lens element, and a fifth lens element arranged in order from an object side to an image side, and the lens satisfies the following condition: 0.1 ≦ InTL/HOS ≦ 0.95; wherein HOS is the distance on the optical axis from the object-side surface of the first lens element to the imaging surface, and InTL is the distance on the optical axis from the object-side surface of the first lens element to the image-side surface of the fifth lens element.
In some embodiments, the lens includes six lens elements with refractive power, namely a first lens element, a second lens element, a third lens element, a fourth lens element, a fifth lens element, and a sixth lens element arranged in order from an object side to an image side, and the lens satisfies the following condition: 0.1 ≦ InTL/HOS ≦ 0.95; wherein HOS is the distance on the optical axis from the object-side surface of the first lens element to the imaging surface, and InTL is the distance on the optical axis from the object-side surface of the first lens element to the image-side surface of the sixth lens element.
In addition to the structural embodiments described above, the following description is directed to optical embodiments to which the lens is applicable. The vehicle identification system of the present invention can be designed with three operating wavelengths, namely 486.1 nm, 587.5 nm, and 656.2 nm, with 587.5 nm serving as the primary reference wavelength from which the principal technical features are derived. The vehicle identification system can also be designed with five operating wavelengths, namely 470 nm, 510 nm, 555 nm, 610 nm, and 650 nm, with 555 nm serving as the primary reference wavelength from which the principal technical features are derived.
The ratio of the focal length f of the lens to the focal length fp of each lens element with positive refractive power is PPR, and the ratio of the focal length f of the lens to the focal length fn of each lens element with negative refractive power is NPR. The sum of the PPRs of all lens elements with positive refractive power is ΣPPR, and the sum of the NPRs of all lens elements with negative refractive power is ΣNPR. Controlling the total refractive power and the total length of the lens is aided when the following condition is satisfied: 0.5 ≦ ΣPPR/|ΣNPR| ≦ 15; preferably, the following condition may be satisfied: 1 ≦ ΣPPR/|ΣNPR| ≦ 3.0.
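As a worked illustration of the ΣPPR/|ΣNPR| control ratio, the sketch below computes it for a set of invented per-element focal lengths; these are sample values, not data from any embodiment of this patent.

```python
# Worked example of the refractive-power ratio sum(PPR)/|sum(NPR)|; the
# system and per-element focal lengths below are invented sample values.

def ppr_npr_ratio(f, element_focal_lengths):
    """f: focal length of the whole lens; element focal lengths are positive
    for positive refractive power, negative for negative refractive power."""
    sum_ppr = sum(f / fp for fp in element_focal_lengths if fp > 0)
    sum_npr = sum(f / fn for fn in element_focal_lengths if fn < 0)
    return sum_ppr / abs(sum_npr)

ratio = ppr_npr_ratio(4.0, [5.0, -8.0, 6.0, -12.0])
print(0.5 <= ratio <= 15)   # broad design window stated above -> True
print(1.0 <= ratio <= 3.0)  # preferred window -> True
```

With these sample values the ratio is about 1.76, which falls inside both the broad and the preferred design windows.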
In addition, half of the diagonal length of the effective sensing area of the optical image sensor S (i.e. the imaging height, or maximum image height, of the lens) is HOI, and the distance on the optical axis from the object-side surface of the first lens element to the imaging surface is HOS, which satisfy the following conditions: HOS/HOI ≦ 50 and 0.5 ≦ HOS/f ≦ 150. Preferably, the following conditions may be satisfied: 1 ≦ HOS/HOI ≦ 40 and 1 ≦ HOS/f ≦ 140. Thereby, the vehicle identification system can remain miniaturized and be carried on a light and portable electronic product.
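These miniaturization conditions can be checked numerically; the sample dimensions below (in mm) are invented for illustration only and do not come from any embodiment.

```python
# Numeric check of the broad miniaturization conditions; the sample HOS,
# HOI, and f values (in mm) are invented for illustration.

def miniaturized(hos, hoi, f):
    """HOS/HOI <= 50 and 0.5 <= HOS/f <= 150, per the broad conditions above."""
    return hos / hoi <= 50 and 0.5 <= hos / f <= 150

print(miniaturized(6.0, 3.0, 4.0))    # ratios 2.0 and 1.5 -> True
print(miniaturized(500.0, 3.0, 4.0))  # HOS/HOI is about 167, exceeds 50 -> False
```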
In addition, in an embodiment, the lens of the present invention can be provided with at least one aperture as required to reduce stray light, which helps improve image quality.
Further, in the lens of the present invention, the aperture may be configured as a front aperture or a middle aperture: a front aperture means the aperture is disposed between the object to be photographed and the first lens element, while a middle aperture means the aperture is disposed between the first lens element and the imaging surface. A front aperture creates a longer distance between the exit pupil of the lens and the imaging surface, so that more optical elements can be accommodated and the efficiency of the optical image sensor S in receiving images can be increased; a middle aperture helps enlarge the field of view of the system, giving the lens the advantage of a wide-angle lens. The distance from the aperture to the imaging surface is InS, which satisfies the following condition: 0.1 ≦ InS/HOS ≦ 1.1. Thereby, the miniaturization of the lens and the wide-angle characteristic can be maintained simultaneously.
In the lens of the present invention, taking a lens containing six lens elements with refractive power as an example, the distance from the object-side surface of the first lens element to the image-side surface of the sixth lens element is InTL, and the sum of the thicknesses on the optical axis of all lens elements with refractive power is ΣTP, which satisfy the following condition: 0.1 ≦ ΣTP/InTL ≦ 0.9. Thereby, the contrast of system imaging and the yield of lens manufacturing can both be taken into account, and a proper back focal length is provided to accommodate other elements.
The radius of curvature of the object-side surface of the first lens element is R1, and the radius of curvature of its image-side surface is R2, which satisfy the following condition: 0.001 ≦ R1/R2 ≦ 25. Thereby, the first lens element has a proper positive refractive power strength, preventing spherical aberration from increasing too rapidly. Preferably, the following condition may be satisfied: 0.01 ≦ R1/R2 ≦ 12.
The radius of curvature of the object-side surface of the sixth lens element is R11, and the radius of curvature of its image-side surface is R12, which satisfy the following condition: -7 < (R11-R12)/(R11+R12) < 50. Thereby, astigmatism generated by the lens is favorably corrected.
The distance on the optical axis between the first lens element and the second lens element is IN12, which satisfies the following condition: IN12/f ≦ 60. Thereby, the chromatic aberration of the lens elements is improved, enhancing the performance of the lens.
The distance on the optical axis between the fifth lens element and the sixth lens element is IN56, which satisfies the following condition: IN56/f ≦ 3.0. Thereby, the chromatic aberration of the lens elements is improved, enhancing the performance of the lens.
The thicknesses on the optical axis of the first lens element and the second lens element are TP1 and TP2, respectively, which satisfy the following condition: 0.1 ≦ (TP1+IN12)/TP2 ≦ 10. This helps control the sensitivity of lens manufacturing and improves the performance of the lens.
The thicknesses on the optical axis of the fifth lens element and the sixth lens element are TP5 and TP6, respectively, and the distance between the two on the optical axis is IN56, which satisfy the following condition: 0.1 ≦ (TP6+IN56)/TP5 ≦ 15. This helps control the sensitivity of lens manufacturing and reduces the overall system height.
The thicknesses on the optical axis of the third lens element, the fourth lens element, and the fifth lens element are TP3, TP4, and TP5, respectively; the distance on the optical axis between the third lens element and the fourth lens element is IN34, the distance on the optical axis between the fourth lens element and the fifth lens element is IN45, and the distance from the object-side surface of the first lens element to the image-side surface of the sixth lens element is InTL, which satisfy the following condition: 0.1 ≦ TP4/(IN34+TP4+IN45) < 1. This helps correct, layer by layer, the aberration generated as incident light travels through the system, and reduces the total height of the system.
In the lens of the present invention, the vertical distance from the critical point C61 of the object-side surface of the sixth lens element to the optical axis is HVT61, the vertical distance from the critical point C62 of the image-side surface of the sixth lens element to the optical axis is HVT62, the horizontal displacement distance on the optical axis from the intersection point of the object-side surface of the sixth lens element on the optical axis to the critical point C61 is SGC61, and the horizontal displacement distance on the optical axis from the intersection point of the image-side surface of the sixth lens element on the optical axis to the critical point C62 is SGC62, which may satisfy the following conditions: 0 mm ≦ HVT61 ≦ 3 mm; 0 mm < HVT62 ≦ 6 mm; 0 ≦ HVT61/HVT62; 0 mm ≦ SGC61 ≦ 0.5 mm; 0 mm < |SGC62| ≦ 2 mm; and 0 < |SGC62|/(|SGC62| + TP6) ≦ 0.9. Thereby, the aberration of the off-axis field of view can be effectively corrected.
The lens of the present invention satisfies the following condition: 0.2 ≦ HVT62/HOI ≦ 0.9. Preferably, the following condition may be satisfied: 0.3 ≦ HVT62/HOI ≦ 0.8. This facilitates aberration correction in the peripheral field of view of the lens.
The lens of the present invention satisfies the following condition: 0 ≦ HVT62/HOS ≦ 0.5. Preferably, the following condition may be satisfied: 0.2 ≦ HVT62/HOS ≦ 0.45. This facilitates aberration correction in the peripheral field of view of the lens.
In the lens of the present invention, the horizontal displacement distance parallel to the optical axis between the intersection point of the object-side surface of the sixth lens element on the optical axis and the inflection point of that surface nearest the optical axis is denoted by SGI611, and the horizontal displacement distance parallel to the optical axis between the intersection point of the image-side surface of the sixth lens element on the optical axis and the inflection point of that surface nearest the optical axis is denoted by SGI621, which satisfy the following conditions: 0 < SGI611/(SGI611 + TP6) ≦ 0.9; 0 < SGI621/(SGI621 + TP6) ≦ 0.9. Preferably, the following conditions may be satisfied: 0.1 ≦ SGI611/(SGI611 + TP6) ≦ 0.6; 0.1 ≦ SGI621/(SGI621 + TP6) ≦ 0.6.
The horizontal displacement distance parallel to the optical axis between the intersection point of the object-side surface of the sixth lens element on the optical axis and the second inflection point of that surface nearest the optical axis is denoted by SGI612, and the horizontal displacement distance parallel to the optical axis between the intersection point of the image-side surface of the sixth lens element on the optical axis and the second inflection point of that surface nearest the optical axis is denoted by SGI622, which satisfy the following conditions: 0 < SGI612/(SGI612 + TP6) ≦ 0.9; 0 < SGI622/(SGI622 + TP6) ≦ 0.9. Preferably, the following conditions may be satisfied: 0.1 ≦ SGI612/(SGI612 + TP6) ≦ 0.6; 0.1 ≦ SGI622/(SGI622 + TP6) ≦ 0.6.
The vertical distance between the inflection point nearest the optical axis on the object-side surface of the sixth lens element and the optical axis is denoted by HIF611, and the vertical distance between the inflection point nearest the optical axis on the image-side surface of the sixth lens element and the optical axis is denoted by HIF621, which satisfy the following conditions: 0.001 mm ≦ HIF611 ≦ 5 mm; 0.001 mm ≦ HIF621 ≦ 5 mm. Preferably, the following conditions may be satisfied: 0.1 mm ≦ HIF611 ≦ 3.5 mm; 1.5 mm ≦ HIF621 ≦ 3.5 mm.
The vertical distance between the second inflection point from the optical axis on the object-side surface of the sixth lens element and the optical axis is HIF612, and the vertical distance between the second inflection point from the optical axis on the image-side surface of the sixth lens element and the optical axis is HIF622, which satisfy the following conditions: 0.001 mm ≦ HIF612 ≦ 5 mm; 0.001 mm ≦ HIF622 ≦ 5 mm. Preferably, the following conditions may be satisfied: 0.1 mm ≦ HIF622 ≦ 3.5 mm; 0.1 mm ≦ HIF612 ≦ 3.5 mm.
The vertical distance between the third inflection point from the optical axis on the object-side surface of the sixth lens element and the optical axis is denoted by HIF613, and the vertical distance between the third inflection point from the optical axis on the image-side surface of the sixth lens element and the optical axis is denoted by HIF623, which satisfy the following conditions: 0.001 mm ≦ HIF613 ≦ 5 mm; 0.001 mm ≦ HIF623 ≦ 5 mm. Preferably, the following conditions may be satisfied: 0.1 mm ≦ HIF623 ≦ 3.5 mm; 0.1 mm ≦ HIF613 ≦ 3.5 mm.
The vertical distance between the fourth inflection point from the optical axis on the object-side surface of the sixth lens element and the optical axis is HIF614, and the vertical distance between the fourth inflection point from the optical axis on the image-side surface of the sixth lens element and the optical axis is HIF624, which satisfy the following conditions: 0.001 mm ≦ HIF614 ≦ 5 mm; 0.001 mm ≦ HIF624 ≦ 5 mm. Preferably, the following conditions may be satisfied: 0.1 mm ≦ HIF624 ≦ 3.5 mm; 0.1 mm ≦ HIF614 ≦ 3.5 mm.
In the lens of the present invention, (TH1+TH2)/HOI satisfies the following condition: 0 < (TH1+TH2)/HOI ≦ 0.95; preferably, the following condition may be satisfied: 0 < (TH1+TH2)/HOI ≦ 0.5. (TH1+TH2)/HOS satisfies the following condition: 0 < (TH1+TH2)/HOS ≦ 0.95; preferably, the following condition may be satisfied: 0 < (TH1+TH2)/HOS ≦ 0.5. The ratio 2×(TH1+TH2)/PhiA satisfies the following condition: 0 < 2×(TH1+TH2)/PhiA ≦ 0.95; preferably, the following condition may be satisfied: 0 < 2×(TH1+TH2)/PhiA ≦ 0.5.
In an embodiment of the lens of the present invention, lens elements with high dispersion coefficients and lens elements with low dispersion coefficients can be arranged alternately, which helps correct the chromatic aberration of the lens.
The equation for the above aspheric surfaces is:
z = ch^2/[1 + [1 - (k+1)c^2h^2]^0.5] + A4h^4 + A6h^6 + A8h^8 + A10h^10 + A12h^12 + A14h^14 + A16h^16 + A18h^18 + A20h^20 + …
where z is the position value, referenced to the surface vertex, at height h along the optical axis direction; k is the conic constant; c is the reciprocal of the radius of curvature; and A4, A6, A8, A10, A12, A14, A16, A18, and A20 are high-order aspheric coefficients.
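For illustration, the aspheric sag equation can be evaluated numerically as follows. The curvature, conic constant, and A4 coefficient below are invented sample values, not data from any embodiment of this patent.

```python
# Numerical evaluation of the aspheric surface equation above; the curvature,
# conic constant, and A4 value are invented samples, not embodiment data.

def aspheric_sag(h, c, k, coeffs):
    """z(h) = c*h^2 / (1 + sqrt(1 - (k+1)*c^2*h^2)) + sum of A_n * h^n."""
    z = c * h**2 / (1 + (1 - (k + 1) * c**2 * h**2) ** 0.5)
    for power, a_n in coeffs.items():  # coeffs maps even power n -> A_n
        z += a_n * h**power
    return z

# Sample: radius of curvature 10 mm (c = 0.1), conic k = -1, small A4 term.
print(round(aspheric_sag(1.0, 0.1, -1.0, {4: 1e-4}), 6))  # 0.0501
```

With k = -1 the square-root term reduces to 1, so the conic part is simply c·h²/2 = 0.05 at h = 1, and the A4 term adds 0.0001.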
In the lens provided by the present invention, the material of the lens elements can be plastic or glass. When a lens element is made of plastic, production cost and weight can be effectively reduced; when a lens element is made of glass, the thermal effect can be controlled and the design space for the refractive power configuration of the lens is increased. In addition, the object-side and image-side surfaces of the first through sixth lens elements can be aspheric, which provides more control variables; besides reducing aberration, the number of lens elements can be reduced compared with the use of traditional glass lens elements, thereby effectively reducing the total height of the lens.
Furthermore, in the lens provided by the present invention, if a lens surface is described as convex, it means in principle that the lens surface is convex at the paraxial region; if a lens surface is described as concave, it means in principle that the lens surface is concave at the paraxial region.
Furthermore, in the lens of the present invention, at least one of the first lens element, the second lens element, the third lens element, the fourth lens element, the fifth lens element, and the sixth lens element can, as required, serve as a filter element that blocks light with wavelengths shorter than 500 nm. This filtering function can be achieved by coating a film on at least one surface of that lens element, or by making the lens element itself of a material capable of filtering out short wavelengths.
Further, the imaging surface of the first fingerprint sensor F1 or the second fingerprint sensor F2 of the present invention may be a plane or a curved surface as required. A curved imaging surface (e.g., a spherical surface with a radius of curvature) helps reduce the incident angle required to focus light on the imaging surface, which, besides helping to achieve a short total track length (TTL) for the miniature lens, is beneficial for increasing the relative illumination.
First optical embodiment
As shown in fig. 6, the lens includes six lens elements with refractive power, comprising, in order from an object side to an image side, a first lens element 11, a second lens element 12, a third lens element 13, a fourth lens element 14, a fifth lens element 15 and a sixth lens element 16.
Referring to fig. 6 and 7, fig. 6 is a configuration diagram of a lens of a vehicle identification system according to the first optical embodiment of the present invention, and fig. 7 shows, from left to right, the spherical aberration, astigmatism and optical distortion curves of the first optical embodiment. In fig. 6, the first lens element 11, the stop 60, the second lens element 12, the third lens element 13, the fourth lens element 14, the fifth lens element 15, the sixth lens element 16, the infrared filter 70, the imaging surface and the first fingerprint sensor F1 are disposed in order from the object side to the image side, and the first fingerprint sensor F1 may be replaced by the second fingerprint sensor F2.
The first lens element 11 with negative refractive power has a concave object-side surface 111 and a concave image-side surface 112, both aspheric, and the object-side surface 111 has two inflection points. The profile curve length of the maximum effective radius of the object-side surface 111 of the first lens element 11 is denoted as ARS11, and the profile curve length of the maximum effective radius of the image-side surface 112 of the first lens element 11 is denoted as ARS12. The profile curve length at 1/2 of the entrance pupil diameter (HEP) of the object-side surface 111 of the first lens element 11 is denoted as ARE11, and the profile curve length at 1/2 of the entrance pupil diameter (HEP) of the image-side surface 112 of the first lens element 11 is denoted as ARE12. The thickness of the first lens element 11 on the optical axis is TP1.
A horizontal displacement distance parallel to the optical axis between an intersection point of the object-side surface 111 of the first lens element 11 on the optical axis and an inflection point of the object-side surface 111 of the first lens element 11 closest to the optical axis is represented by SGI111, and a horizontal displacement distance parallel to the optical axis between an intersection point of the image-side surface 112 of the first lens element 11 on the optical axis and an inflection point of the image-side surface 112 of the first lens element 11 closest to the optical axis is represented by SGI121, which satisfies the following conditions: SGI111 ═ 0.0031 mm; | SGI111 |/(| SGI111 | + TP1) | -0.0016.
A horizontal displacement distance parallel to the optical axis between an intersection point of the object-side surface 111 of the first lens element 11 on the optical axis and a second inflection point close to the optical axis of the object-side surface 112 of the first lens element 11 is denoted by SGI112, and a horizontal displacement distance parallel to the optical axis between an intersection point of the image-side surface 112 of the first lens element 11 on the optical axis and a second inflection point close to the optical axis of the image-side surface 112 of the first lens element 11 is denoted by SGI122, which satisfies the following conditions: SGI 112-1.3178 mm; | SGI112 |/(| SGI112 | + TP1) | -0.4052.
The vertical distance between the inflection point of the object-side surface 111 of the first lens element 11 closest to the optical axis and the optical axis is denoted by HIF111, and the vertical distance between the inflection point of the image-side surface 112 of the first lens element 11 closest to the optical axis and the optical axis is denoted by HIF121, which satisfies the following conditions: HIF 111-0.5557 mm; HIF111/HOI is 0.1111.
A vertical distance between an inflection point on the second optical axis approaching on the object-side surface 111 of the first lens element 11 and the optical axis is denoted by HIF112, and a vertical distance between an inflection point on the second optical axis approaching on the image-side surface 112 of the first lens element 11 and the optical axis is denoted by HIF122, which satisfies the following conditions: HIF 112-5.3732 mm; HIF112/HOI 1.0746.
The second lens element 12 with positive refractive power has a convex object-side surface 121 and a convex image-side surface 122, both of which are aspheric, and the object-side surface 121 has one inflection point. The profile curve length of the maximum effective radius of the object-side surface 121 of the second lens 12 is denoted as ARS21, and that of the image-side surface 122 is denoted as ARS22. The profile curve length of 1/2 the entrance pupil diameter (HEP) on the object-side surface 121 of the second lens 12 is denoted as ARE21, and that on the image-side surface 122 is denoted as ARE22. The thickness of the second lens element 12 on the optical axis is TP2.
The horizontal displacement distance, parallel to the optical axis, between the intersection point of the object-side surface 121 of the second lens element 12 with the optical axis and the inflection point of the object-side surface 121 closest to the optical axis is denoted by SGI211, and the corresponding distance for the image-side surface 122 is denoted by SGI221. They satisfy the following conditions: SGI211 = 0.1069 mm; |SGI211|/(|SGI211| + TP2) = 0.0412; SGI221 = 0 mm; |SGI221|/(|SGI221| + TP2) = 0.
The vertical distance between the inflection point of the object-side surface 121 of the second lens element 12 closest to the optical axis and the optical axis is denoted by HIF211, and the corresponding distance for the image-side surface 122 is denoted by HIF221. They satisfy the following conditions: HIF211 = 1.1264 mm; HIF211/HOI = 0.2253; HIF221 = 0 mm; HIF221/HOI = 0.
The third lens element 13 with negative refractive power has a concave object-side surface 131 and a convex image-side surface 132, both of which are aspheric, and each of the two surfaces has one inflection point. The profile curve length of the maximum effective radius of the object-side surface 131 of the third lens 13 is denoted as ARS31, and that of the image-side surface 132 is denoted as ARS32. The profile curve length of 1/2 the entrance pupil diameter (HEP) on the object-side surface 131 of the third lens 13 is denoted as ARE31, and that on the image-side surface 132 is denoted as ARE32. The thickness of the third lens 13 on the optical axis is TP3.
The horizontal displacement distance, parallel to the optical axis, between the intersection point of the object-side surface 131 of the third lens element 13 with the optical axis and the inflection point of the object-side surface 131 closest to the optical axis is denoted by SGI311, and the corresponding distance for the image-side surface 132 is denoted by SGI321. They satisfy the following conditions: SGI311 = 0.3041 mm; |SGI311|/(|SGI311| + TP3) = 0.4445; SGI321 = 0.1172 mm; |SGI321|/(|SGI321| + TP3) = 0.2357.
The vertical distance between the inflection point of the object-side surface 131 of the third lens element 13 closest to the optical axis and the optical axis is denoted by HIF311, and the corresponding distance for the image-side surface 132 is denoted by HIF321. They satisfy the following conditions: HIF311 = 1.5907 mm; HIF311/HOI = 0.3181; HIF321 = 1.3380 mm; HIF321/HOI = 0.2676.
The fourth lens element 14 with positive refractive power has a convex object-side surface 141 and a concave image-side surface 142, both of which are aspheric; the object-side surface 141 has two inflection points and the image-side surface 142 has one inflection point. The profile curve length of the maximum effective radius of the object-side surface 141 of the fourth lens 14 is denoted as ARS41, and that of the image-side surface 142 is denoted as ARS42. The profile curve length of 1/2 the entrance pupil diameter (HEP) on the object-side surface 141 of the fourth lens 14 is denoted as ARE41, and that on the image-side surface 142 is denoted as ARE42. The thickness of the fourth lens element 14 on the optical axis is TP4.
The horizontal displacement distance, parallel to the optical axis, between the intersection point of the object-side surface 141 of the fourth lens element 14 with the optical axis and the inflection point of the object-side surface 141 closest to the optical axis is denoted by SGI411, and the corresponding distance for the image-side surface 142 is denoted by SGI421. They satisfy the following conditions: SGI411 = 0.0070 mm; |SGI411|/(|SGI411| + TP4) = 0.0056; SGI421 = 0.0006 mm; |SGI421|/(|SGI421| + TP4) = 0.0005.
The horizontal displacement distance, parallel to the optical axis, between the intersection point of the object-side surface 141 of the fourth lens element 14 with the optical axis and the second inflection point of the object-side surface 141 closest to the optical axis is denoted by SGI412, and the corresponding distance for the image-side surface 142 is denoted by SGI422. They satisfy the following conditions: SGI412 = -0.2078 mm; |SGI412|/(|SGI412| + TP4) = 0.1439.
The vertical distance between the inflection point of the object-side surface 141 of the fourth lens element 14 closest to the optical axis and the optical axis is denoted by HIF411, and the corresponding distance for the image-side surface 142 is denoted by HIF421. They satisfy the following conditions: HIF411 = 0.4706 mm; HIF411/HOI = 0.0941; HIF421 = 0.1721 mm; HIF421/HOI = 0.0344.
The vertical distance between the second inflection point of the object-side surface 141 of the fourth lens element 14 closest to the optical axis and the optical axis is denoted by HIF412, and the corresponding distance for the image-side surface 142 is denoted by HIF422. They satisfy the following conditions: HIF412 = 2.0421 mm; HIF412/HOI = 0.4084.
The fifth lens element 15 with positive refractive power has a convex object-side surface 151 and a convex image-side surface 152, both of which are aspheric; the object-side surface 151 has two inflection points and the image-side surface 152 has one inflection point. The profile curve length of the maximum effective radius of the object-side surface 151 of the fifth lens 15 is denoted as ARS51, and that of the image-side surface 152 is denoted as ARS52. The profile curve length of 1/2 the entrance pupil diameter (HEP) on the object-side surface 151 of the fifth lens 15 is denoted as ARE51, and that on the image-side surface 152 is denoted as ARE52. The thickness of the fifth lens element 15 on the optical axis is TP5.
The horizontal displacement distance, parallel to the optical axis, between the intersection point of the object-side surface 151 of the fifth lens element 15 with the optical axis and the inflection point of the object-side surface 151 closest to the optical axis is denoted by SGI511, and the corresponding distance for the image-side surface 152 is denoted by SGI521. They satisfy the following conditions: SGI511 = 0.00364 mm; |SGI511|/(|SGI511| + TP5) = 0.00338; SGI521 = 0.63365 mm; |SGI521|/(|SGI521| + TP5) = 0.37154.
The horizontal displacement distance, parallel to the optical axis, between the intersection point of the object-side surface 151 of the fifth lens element 15 with the optical axis and the second inflection point of the object-side surface 151 closest to the optical axis is denoted by SGI512, and the corresponding distance for the image-side surface 152 is denoted by SGI522. They satisfy the following conditions: SGI512 = 0.32032 mm; |SGI512|/(|SGI512| + TP5) = 0.23009.
The horizontal displacement distance, parallel to the optical axis, between the intersection point of the object-side surface 151 of the fifth lens element 15 with the optical axis and the third inflection point of the object-side surface 151 closest to the optical axis is denoted by SGI513, and the corresponding distance for the image-side surface 152 is denoted by SGI523. They satisfy the following conditions: SGI513 = 0 mm; |SGI513|/(|SGI513| + TP5) = 0; SGI523 = 0 mm; |SGI523|/(|SGI523| + TP5) = 0.
The horizontal displacement distance, parallel to the optical axis, between the intersection point of the object-side surface 151 of the fifth lens element 15 with the optical axis and the fourth inflection point of the object-side surface 151 closest to the optical axis is denoted by SGI514, and the corresponding distance for the image-side surface 152 is denoted by SGI524. They satisfy the following conditions: SGI514 = 0 mm; |SGI514|/(|SGI514| + TP5) = 0; SGI524 = 0 mm; |SGI524|/(|SGI524| + TP5) = 0.
The vertical distance between the inflection point of the object-side surface 151 of the fifth lens element 15 closest to the optical axis and the optical axis is denoted by HIF511, and the corresponding distance for the image-side surface 152 is denoted by HIF521. They satisfy the following conditions: HIF511 = 0.28212 mm; HIF511/HOI = 0.05642; HIF521 = 2.13850 mm; HIF521/HOI = 0.42770.
The vertical distance between the second inflection point of the object-side surface 151 of the fifth lens element 15 closest to the optical axis and the optical axis is denoted by HIF512, and the corresponding distance for the image-side surface 152 is denoted by HIF522. They satisfy the following conditions: HIF512 = 2.51384 mm; HIF512/HOI = 0.50277.
The vertical distance between the third inflection point of the object-side surface 151 of the fifth lens element 15 closest to the optical axis and the optical axis is denoted by HIF513, and the corresponding distance for the image-side surface 152 is denoted by HIF523. They satisfy the following conditions: HIF513 = 0 mm; HIF513/HOI = 0; HIF523 = 0 mm; HIF523/HOI = 0.
The vertical distance between the fourth inflection point of the object-side surface 151 of the fifth lens element 15 closest to the optical axis and the optical axis is denoted by HIF514, and the corresponding distance for the image-side surface 152 is denoted by HIF524. They satisfy the following conditions: HIF514 = 0 mm; HIF514/HOI = 0; HIF524 = 0 mm; HIF524/HOI = 0.
The sixth lens element 16 with negative refractive power has a concave object-side surface 161 and a concave image-side surface 162; the object-side surface 161 has two inflection points and the image-side surface 162 has one inflection point. Therefore, the angle of incidence of each field of view on the sixth lens element 16 can be effectively adjusted to improve aberration. The profile curve length of the maximum effective radius of the object-side surface 161 of the sixth lens 16 is denoted as ARS61, and that of the image-side surface 162 is denoted as ARS62. The profile curve length of 1/2 the entrance pupil diameter (HEP) on the object-side surface 161 of the sixth lens 16 is denoted as ARE61, and that on the image-side surface 162 is denoted as ARE62. The thickness of the sixth lens element 16 on the optical axis is TP6.
The horizontal displacement distance, parallel to the optical axis, between the intersection point of the object-side surface 161 of the sixth lens element 16 with the optical axis and the inflection point of the object-side surface 161 closest to the optical axis is denoted by SGI611, and the corresponding distance for the image-side surface 162 is denoted by SGI621. They satisfy the following conditions: SGI611 = 0.38558 mm; |SGI611|/(|SGI611| + TP6) = 0.27212; SGI621 = 0.12386 mm; |SGI621|/(|SGI621| + TP6) = 0.10722.
The horizontal displacement distance, parallel to the optical axis, between the intersection point of the object-side surface 161 of the sixth lens element 16 with the optical axis and the second inflection point of the object-side surface 161 closest to the optical axis is denoted by SGI612, and the corresponding distance for the image-side surface 162 is denoted by SGI622. They satisfy the following conditions: SGI612 = -0.47400 mm; |SGI612|/(|SGI612| + TP6) = 0.31488; SGI622 = 0 mm; |SGI622|/(|SGI622| + TP6) = 0.
The vertical distance between the inflection point of the object-side surface 161 of the sixth lens element 16 closest to the optical axis and the optical axis is denoted by HIF611, and the corresponding distance for the image-side surface 162 is denoted by HIF621. They satisfy the following conditions: HIF611 = 2.24283 mm; HIF611/HOI = 0.44857; HIF621 = 1.07376 mm; HIF621/HOI = 0.21475.
The vertical distance between the second inflection point of the object-side surface 161 of the sixth lens element 16 closest to the optical axis and the optical axis is denoted by HIF612, and the corresponding distance for the image-side surface 162 is denoted by HIF622. They satisfy the following conditions: HIF612 = 2.48895 mm; HIF612/HOI = 0.49779.
The vertical distance between the third inflection point of the object-side surface 161 of the sixth lens element 16 closest to the optical axis and the optical axis is denoted by HIF613, and the corresponding distance for the image-side surface 162 is denoted by HIF623. They satisfy the following conditions: HIF613 = 0 mm; HIF613/HOI = 0; HIF623 = 0 mm; HIF623/HOI = 0.
The vertical distance between the fourth inflection point of the object-side surface 161 of the sixth lens element 16 closest to the optical axis and the optical axis is denoted by HIF614, and the corresponding distance for the image-side surface 162 is denoted by HIF624. They satisfy the following conditions: HIF614 = 0 mm; HIF614/HOI = 0; HIF624 = 0 mm; HIF624/HOI = 0.
The infrared filter 70 is made of glass, is disposed between the sixth lens element 16 and the first fingerprint sensor F1, and does not affect the focal length of the lens.
In this embodiment, the focal length of the lens is f, the entrance pupil diameter of the lens is HEP, and half of the maximum field of view of the lens is HAF. Their values are as follows: f = 4.075 mm; f/HEP = 1.4; HAF = 50.001 degrees; and tan(HAF) = 1.1918.
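As a quick arithmetic sketch (not part of the original specification), the entrance pupil diameter implied by the stated f and f/HEP, and the listed tan(HAF), can be checked as follows:

```python
import math

# Values quoted from this embodiment.
f = 4.075          # focal length of the lens, mm
f_over_hep = 1.4   # f-number-like ratio f/HEP
haf_deg = 50.001   # half of the maximum field of view, degrees

hep = f / f_over_hep                          # entrance pupil diameter, mm
tan_haf = math.tan(math.radians(haf_deg))     # should match the listed 1.1918

print(round(hep, 3))      # ≈ 2.911 mm
print(round(tan_haf, 4))  # ≈ 1.1918
```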
In the lens of the present embodiment, the focal length of the first lens 11 is f1 and the focal length of the sixth lens 16 is f6, which satisfy the following conditions: f1 = -7.828 mm; |f/f1| = 0.52060; f6 = -4.886 mm; and |f1| > |f6|.
In the lens of the present embodiment, the focal lengths of the second lens 12 to the fifth lens 15 are f2, f3, f4 and f5, respectively, which satisfy the following conditions: |f2| + |f3| + |f4| + |f5| = 95.50815 mm; |f1| + |f6| = 12.71352 mm; and |f2| + |f3| + |f4| + |f5| > |f1| + |f6|.
In the lens of this embodiment, the sum of the PPR of all lenses with positive refractive power is ΣPPR = f/f2 + f/f4 + f/f5 = 1.63290, the sum of the NPR of all lenses with negative refractive power is ΣNPR = |f/f1| + |f/f3| + |f/f6| = 1.51305, and ΣPPR/|ΣNPR| = 1.07921. The following conditions are also satisfied: |f/f2| = 0.69101; |f/f3| = 0.15834; |f/f4| = 0.06883; |f/f5| = 0.87305; |f/f6| = 0.83412.
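The PPR/NPR sums above follow directly from the listed |f/fi| ratios; a minimal cross-check (illustrative only, using the values quoted in this embodiment):

```python
# Ratios |f/fi| quoted in this embodiment.
ppr = {"f/f2": 0.69101, "f/f4": 0.06883, "f/f5": 0.87305}  # positive-power elements
npr = {"f/f1": 0.52060, "f/f3": 0.15834, "f/f6": 0.83412}  # negative-power elements (magnitudes)

sum_ppr = sum(ppr.values())
sum_npr = sum(npr.values())

print(round(sum_ppr, 4))            # ≈ 1.6329
print(round(sum_npr, 4))            # ≈ 1.5131
print(round(sum_ppr / sum_npr, 4))  # ≈ 1.0792
```

The tiny differences in the last digit versus the document's 1.63290/1.51305 come from rounding of the individual ratios.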
In the lens of this embodiment, the distance between the object-side surface 111 of the first lens element 11 and the image-side surface 162 of the sixth lens element 16 is InTL, the distance between the object-side surface 111 of the first lens element 11 and the image plane is HOS, the distance between the stop 60 and the image plane is InS, half of the diagonal length of the effective sensing area of the first fingerprint sensor F1 is HOI, and the distance between the image-side surface 162 of the sixth lens element 16 and the image plane is BFL. They satisfy the following conditions: InTL + BFL = HOS; HOS = 19.54120 mm; HOI = 5.0 mm; HOS/HOI = 3.90824; HOS/f = 4.7952; InS = 11.685 mm; and InS/HOS = 0.59794.
In the lens of this embodiment, the sum of the thicknesses of all lenses with refractive power on the optical axis is ΣTP, which satisfies the following conditions: ΣTP = 8.13899 mm; ΣTP/InTL = 0.52477; and InTL/HOS = 0.917102. Therefore, both the contrast of system imaging and the yield of lens manufacturing can be maintained, and an appropriate back focal length is provided to accommodate other elements.
In the lens of the present embodiment, the radius of curvature of the object-side surface 111 of the first lens element 11 is R1 and the radius of curvature of the image-side surface 112 of the first lens element 11 is R2, which satisfy the following condition: |R1/R2| = 8.99987. Thus the first lens element 11 has an appropriate strength of refractive power, preventing spherical aberration from increasing too rapidly.
In the lens of the present embodiment, the radius of curvature of the object-side surface 161 of the sixth lens element 16 is R11 and the radius of curvature of the image-side surface 162 of the sixth lens element 16 is R12, which satisfy the following condition: (R11 - R12)/(R11 + R12) = 1.27780. This is favorable for correcting astigmatism generated by the lens.
In the lens of this embodiment, the sum of the focal lengths of all lenses with positive refractive power is ΣPP, which satisfies the following conditions: ΣPP = f2 + f4 + f5 = 69.770 mm; and f5/(f2 + f4 + f5) = 0.067. Therefore, the positive refractive power of a single lens element is appropriately distributed among the other positive lens elements, suppressing significant aberration as incident light travels through the system.
In the lens of the present embodiment, the sum of the focal lengths of all lenses with negative refractive power is ΣNP, which satisfies the following conditions: ΣNP = f1 + f3 + f6 = -38.451 mm; and f6/(f1 + f3 + f6) = 0.127. Therefore, the negative refractive power of the sixth lens element 16 is appropriately distributed among the other negative lens elements, suppressing significant aberration as incident light travels through the system.
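The individual focal lengths are not tabulated in this text, but they can be backed out of f and the |f/fi| ratios listed earlier (signs assigned per each element's stated refractive power); a sketch verifying the ΣPP and ΣNP conditions:

```python
f = 4.075  # focal length of the lens, mm

# Reconstructed element focal lengths (mm); signs follow the stated powers.
f1 = -f / 0.52060   # first lens, negative power
f2 =  f / 0.69101   # second lens, positive power
f3 = -f / 0.15834   # third lens, negative power
f4 =  f / 0.06883   # fourth lens, positive power
f5 =  f / 0.87305   # fifth lens, positive power
f6 = -f / 0.83412   # sixth lens, negative power

sum_pp = f2 + f4 + f5   # sum of positive-power focal lengths
sum_np = f1 + f3 + f6   # sum of negative-power focal lengths

print(round(sum_pp, 2))       # ≈ 69.77 mm
print(round(sum_np, 2))       # ≈ -38.45 mm
print(round(f5 / sum_pp, 3))  # ≈ 0.067
print(round(f6 / sum_np, 3))  # ≈ 0.127
```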
In the lens of the present embodiment, the distance between the first lens element 11 and the second lens element 12 on the optical axis is IN12, which satisfies the following conditions: IN12 = 6.418 mm; IN12/f = 1.57491. Therefore, the chromatic aberration of the lenses is improved, enhancing performance.
In the lens of the present embodiment, the distance between the fifth lens element 15 and the sixth lens element 16 on the optical axis is IN56, which satisfies the following conditions: IN56 = 0.025 mm; IN56/f = 0.00613. Therefore, the chromatic aberration of the lenses is improved, enhancing performance.
In the lens of the present embodiment, the thicknesses of the first lens element 11 and the second lens element 12 on the optical axis are TP1 and TP2, respectively, which satisfy the following conditions: TP1 = 1.934 mm; TP2 = 2.486 mm; and (TP1 + IN12)/TP2 = 3.36005. This helps control the manufacturing sensitivity of the lens and improve its performance.
In the lens of the present embodiment, the thicknesses of the fifth lens element 15 and the sixth lens element 16 on the optical axis are TP5 and TP6, respectively, and the distance between the two elements on the optical axis is IN56. They satisfy the following conditions: TP5 = 1.072 mm; TP6 = 1.031 mm; and (TP6 + IN56)/TP5 = 0.98555. This helps control the sensitivity of lens manufacture and reduce the overall system height.
In the lens of the present embodiment, the distance between the third lens element 13 and the fourth lens element 14 on the optical axis is IN34, and the distance between the fourth lens element 14 and the fifth lens element 15 on the optical axis is IN45. They satisfy the following conditions: IN34 = 0.401 mm; IN45 = 0.025 mm; and TP4/(IN34 + TP4 + IN45) = 0.74376. This helps correct, layer by layer, the slight aberrations arising as incident light travels through the system, and reduces the total height of the system.
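A short arithmetic check of the thickness and air-gap ratios above (illustrative, using the quoted values; small last-digit differences are rounding of the inputs):

```python
f = 4.075  # mm
# On-axis thicknesses and air gaps quoted in this embodiment (mm).
TP1, TP2, TP5, TP6 = 1.934, 2.486, 1.072, 1.031
IN12, IN56 = 6.418, 0.025

print(round(IN12 / f, 4))            # ≈ 1.575   (listed as 1.57491)
print(round(IN56 / f, 4))            # ≈ 0.0061  (listed as 0.00613)
print(round((TP1 + IN12) / TP2, 3))  # ≈ 3.36    (listed as 3.36005)
print(round((TP6 + IN56) / TP5, 3))  # ≈ 0.985   (listed as 0.98555)
```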
In the lens of this embodiment, the horizontal displacement distance, parallel to the optical axis, between the intersection point of the object-side surface 151 of the fifth lens element 15 with the optical axis and the maximum effective radius position of the object-side surface 151 is InRS51; the corresponding distance for the image-side surface 152 is InRS52; and the thickness of the fifth lens element 15 on the optical axis is TP5. They satisfy the following conditions: InRS51 = -0.34789 mm; InRS52 = -0.88185 mm; |InRS51|/TP5 = 0.32458; and |InRS52|/TP5 = 0.82276. This facilitates the manufacture and molding of the lens elements and effectively maintains the miniaturization of the lens.
In the lens of the present embodiment, the vertical distance between the critical point of the object-side surface 151 of the fifth lens element 15 and the optical axis is HVT51, and the vertical distance between the critical point of the image-side surface 152 of the fifth lens element 15 and the optical axis is HVT52. They satisfy the following conditions: HVT51 = 0.515349 mm; HVT52 = 0 mm.
In the lens of the present embodiment, the horizontal displacement distance, parallel to the optical axis, between the intersection point of the object-side surface 161 of the sixth lens element 16 with the optical axis and the maximum effective radius position of the object-side surface 161 is InRS61; the corresponding distance for the image-side surface 162 is InRS62; and the thickness of the sixth lens element 16 on the optical axis is TP6. They satisfy the following conditions: InRS61 = -0.58390 mm; InRS62 = 0.41976 mm; |InRS61|/TP6 = 0.56616; and |InRS62|/TP6 = 0.40700. This facilitates the manufacture and molding of the lens elements and effectively maintains the miniaturization of the lens.
In the lens of the present embodiment, the vertical distance between the critical point of the object-side surface 161 of the sixth lens element 16 and the optical axis is HVT61, and the vertical distance between the critical point of the image-side surface 162 of the sixth lens element 16 and the optical axis is HVT62. They satisfy the following conditions: HVT61 = 0 mm; HVT62 = 0 mm.
In the lens of the present embodiment, the following condition is satisfied: HVT51/HOI = 0.1031. This is favorable for aberration correction in the peripheral field of view of the lens.
In the lens of the present embodiment, the following condition is satisfied: HVT51/HOS = 0.02634. This is favorable for aberration correction in the peripheral field of view of the lens.
In the lens of the present embodiment, the Abbe number of the second lens element 12 is NA2, the Abbe number of the third lens element 13 is NA3, and the Abbe number of the sixth lens element 16 is NA6, which satisfy the following condition: NA6/NA2 ≤ 1. This is favorable for correcting the chromatic aberration of the lens.
In the lens of this embodiment, the TV distortion of the lens at the time of imaging is TDT and the optical distortion at the time of imaging is ODT, which satisfy the following conditions: TDT = 2.124%; and ODT = 5.076%.
In the lens of this embodiment, the following values hold: LS = 12 mm; PhiA = 2 × EHD62 = 6.726 mm (EHD62: the maximum effective radius of the image-side surface 162 of the sixth lens 16); PhiC = PhiA + 2 × TH2 = 7.026 mm; PhiD = PhiA + 2 × (TH1 + TH2) = 7.426 mm; TH1 = 0.2 mm; TH2 = 0.15 mm; PhiA/PhiD = 0.9057; TH1 + TH2 = 0.35 mm; (TH1 + TH2)/HOI = 0.035; (TH1 + TH2)/HOS = 0.0179; 2 × (TH1 + TH2)/PhiA = 0.1041; (TH1 + TH2)/LS = 0.0292.
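The aperture-related quantities chain together arithmetically; a sketch checking them, assuming PhiD = PhiA + 2 × (TH1 + TH2) as the garbled original expression suggests:

```python
# Values quoted in this embodiment (mm).
EHD62 = 3.363        # maximum effective radius of image-side surface 162 (PhiA / 2)
TH1, TH2 = 0.2, 0.15
LS = 12.0

PhiA = 2 * EHD62                   # 6.726 mm
PhiC = PhiA + 2 * TH2              # 7.026 mm
PhiD = PhiA + 2 * (TH1 + TH2)      # 7.426 mm (assumed reconstruction)

print(round(PhiA / PhiD, 4))             # ≈ 0.9057
print(round(2 * (TH1 + TH2) / PhiA, 4))  # ≈ 0.1041
print(round((TH1 + TH2) / LS, 4))        # ≈ 0.0292
```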
The following Table I and Table II should be referred to together.
[Table I: detailed optical data of the first optical embodiment; presented as images in the original publication.]
TABLE II: Aspheric coefficients of the first optical embodiment
[Table II data presented as images in the original publication.]
According to the first and second tables, the following values related to the length of the profile curve can be obtained:
[Profile curve length values presented as images in the original publication.]
Table I lists the detailed structural data of the first optical embodiment, in which the units of the radius of curvature, thickness, distance and focal length are mm, and Surfaces 0-16 sequentially represent the surfaces from the object side to the image side. Table II lists the aspheric data of the first optical embodiment, in which k is the conic coefficient in the aspheric surface equation and A1-A20 are the aspheric coefficients of orders 1-20 of each surface. The tables for the subsequent optical embodiments correspond to their respective schematic diagrams and aberration graphs, and the definitions of the data therein are the same as those of Table I and Table II of the first optical embodiment, so they are not repeated here. The definitions of the mechanical element parameters of the subsequent optical embodiments are likewise the same as those of the first optical embodiment.
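For reference, the conic coefficient k and the polynomial coefficients A1-A20 mentioned above conventionally enter the standard aspheric sag equation (quoted here in its industry-standard form, not from the original document), where z is the surface sag at radial height h and c = 1/R is the vertex curvature:

```latex
z(h) = \frac{c\,h^{2}}{1 + \sqrt{1 - (1+k)\,c^{2}h^{2}}} + \sum_{i=1}^{20} A_{i}\,h^{i}
```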
Second optical embodiment
As shown in fig. 8, the lens includes seven lens elements with refractive power and includes, in order from an object side to an image side, a first lens element 11, a second lens element 12, a third lens element 13, a fourth lens element 14, a fifth lens element 15, a sixth lens element 16 and a seventh lens element 17.
Referring to fig. 8 and 9, fig. 8 is a configuration diagram of a second optical embodiment of a lens of a vehicle identification system according to the present invention, and fig. 9 shows, in order from left to right, graphs of spherical aberration, astigmatism and optical distortion of the second optical embodiment of the present invention. In fig. 8, the lens includes, in order from an object side to an image side, a first lens element 11, a second lens element 12, a third lens element 13, an aperture stop 60, a fourth lens element 14, a fifth lens element 15, a sixth lens element 16, a seventh lens element 17, an infrared filter 70, an image plane, and a first fingerprint sensor F1; the first fingerprint sensor F1 may be replaced by a second fingerprint sensor F2.
The first lens element 11 with negative refractive power has a convex object-side surface 111 and a concave image-side surface 112.
The second lens element 12 with negative refractive power has a concave object-side surface 121 and a convex image-side surface 122.
The third lens element 13 with positive refractive power has a convex object-side surface 131 and a convex image-side surface 132.
The fourth lens element 14 with positive refractive power has a convex object-side surface 141 and a convex image-side surface 142.
The fifth lens element 15 with positive refractive power has a convex object-side surface 151 and a convex image-side surface 152.
The sixth lens element 16 with negative refractive power has a concave object-side surface 161 and a concave image-side surface 162. Therefore, the angle of incidence of each field of view on the sixth lens element 16 can be effectively adjusted to improve aberration.
The seventh lens element 17 with negative refractive power has a convex object-side surface 171 and a convex image-side surface 172. Thereby, the back focal length is advantageously shortened to maintain miniaturization. In addition, the angle of incidence of the light rays in the off-axis field can be effectively suppressed, and the aberration of the off-axis field can be further corrected.
The infrared filter 70 is made of glass, is disposed between the seventh lens element 17 and the first fingerprint sensor F1, and does not affect the focal length of the lens.
Please refer to Table III and Table IV below.
[Table III is reproduced only as an image in the original publication.]
TABLE IV aspheric coefficients of the second optical example
[Table IV is reproduced only as an image in the original publication.]
In the second optical embodiment, the aspheric surface curve equation takes the same form as in the first optical embodiment. The definitions of the following parameters are also the same as in the first optical embodiment and are not repeated here.
According to Table III and Table IV, the following conditional values can be obtained:
[The values are reproduced only as an image in the original publication.]
According to Table III and Table IV, the following values related to the profile curve length can be obtained:
[The values are reproduced only as an image in the original publication.]
According to Table III and Table IV, the following conditional values can be obtained:
[The values are reproduced only as an image in the original publication.]
third optical embodiment
As shown in fig. 10, the lens includes six lens elements with refractive power and includes, in order from an object side to an image side, a first lens element 11, a second lens element 12, a third lens element 13, a fourth lens element 14, a fifth lens element 15 and a sixth lens element 16.
Referring to fig. 10 and 11, fig. 10 is a configuration diagram of a lens of an identification system for a vehicle according to a third optical embodiment of the present invention, and fig. 11 is a graph illustrating spherical aberration, astigmatism and optical distortion of the third optical embodiment of the present invention from left to right in sequence. In fig. 10, the first lens element 11, the second lens element 12, the third lens element 13, the stop 60, the fourth lens element 14, the fifth lens element 15, the sixth lens element 16, the ir-filter 70, the image plane and the first fingerprint sensor F1 are disposed in order from the object side to the image side, and the first fingerprint sensor F1 may be replaced by the second fingerprint sensor F2.
The first lens element 11 with negative refractive power has a convex object-side surface 111 and a concave image-side surface 112.
The second lens element 12 with negative refractive power has a concave object-side surface 121 and a convex image-side surface 122.
The third lens element 13 with positive refractive power has a convex object-side surface 131 and a convex image-side surface 132, and is aspheric, and the image-side surface 132 has an inflection point.
The fourth lens element 14 with negative refractive power has a concave object-side surface 141 and a concave image-side surface 142, and is aspheric, and the image-side surface 142 has an inflection point.
The fifth lens element 15 with positive refractive power has a convex object-side surface 151 and a convex image-side surface 152.
The sixth lens element 16 with negative refractive power has a convex object-side surface 161 and a concave image-side surface 162, and both the object-side surface 161 and the image-side surface 162 have inflection points. Thereby, the back focal length is advantageously shortened to maintain miniaturization. In addition, the angle of incidence of the light rays in the off-axis field can be effectively suppressed, and the aberration of the off-axis field can be further corrected.
The infrared filter 70 is made of glass, is disposed between the sixth lens element 16 and the first fingerprint sensor F1, and does not affect the focal length of the lens.
Please refer to Table V and Table VI below.
[Table V is reproduced only as an image in the original publication.]
TABLE VI aspheric coefficients of the third optical example
[Table VI is reproduced only as an image in the original publication.]
In the third optical embodiment, the aspheric surface curve equation takes the same form as in the first optical embodiment. The definitions of the following parameters are also the same as in the first optical embodiment and are not repeated here.
According to Table V and Table VI, the following conditional values can be obtained:
[The values are reproduced only as an image in the original publication.]
According to Table V and Table VI, the following values related to the profile curve length can be obtained:
[The values are reproduced only as an image in the original publication.]
According to Table V and Table VI, the following conditional values can be obtained:
[The values are reproduced only as an image in the original publication.]
fourth optical embodiment
As shown in fig. 12, the lens includes five lens elements with refractive power; a first lens element 11, a second lens element 12, a third lens element 13, a fourth lens element 14 and a fifth lens element 15 are arranged in order from an object side to an image side.
Referring to fig. 12 and 13, fig. 12 is a configuration diagram of a fourth optical embodiment of a lens of a vehicle identification system according to the present invention, and fig. 13 shows, in order from left to right, graphs of spherical aberration, astigmatism and optical distortion of the fourth optical embodiment of the present invention. In fig. 12, the first lens element 11, the second lens element 12, the third lens element 13, the stop 60, the fourth lens element 14, the fifth lens element 15, the infrared filter 70, the image plane and the first fingerprint sensor F1 are disposed in order from the object side to the image side, and the first fingerprint sensor F1 may be replaced by the second fingerprint sensor F2.
The first lens element 11 with negative refractive power has a convex object-side surface 111 and a concave image-side surface 112.
The second lens element 12 with negative refractive power has a concave object-side surface 121 and a concave image-side surface 122, and is aspheric, and the object-side surface 121 has an inflection point.
The third lens element 13 with positive refractive power has a convex object-side surface 131 and a convex image-side surface 132, and is aspheric, and the object-side surface 131 has an inflection point.
The fourth lens element 14 with positive refractive power has a convex object-side surface 141 and a convex image-side surface 142, and is aspheric, and the object-side surface 141 has an inflection point.
The fifth lens element 15 with negative refractive power has a concave object-side surface 151 and a concave image-side surface 152, which are both aspheric, and the object-side surface 151 has two inflection points. Thereby, the back focal length is advantageously shortened to maintain miniaturization.
The infrared filter 70 is made of glass, is disposed between the fifth lens element 15 and the first fingerprint sensor F1, and does not affect the focal length of the lens.
Please refer to Table VII and Table VIII below.
[Table VII is reproduced only as an image in the original publication.]
TABLE VIII aspheric coefficients of the fourth optical example
[Table VIII is reproduced only as an image in the original publication.]
In the fourth optical embodiment, the aspheric surface curve equation takes the same form as in the first optical embodiment. The definitions of the following parameters are also the same as in the first optical embodiment and are not repeated here.
According to Table VII and Table VIII, the following conditional values can be obtained:
[The values are reproduced only as an image in the original publication.]
According to Table VII and Table VIII, the following values related to the profile curve length can be obtained:
[The values are reproduced only as an image in the original publication.]
According to Table VII and Table VIII, the following conditional values can be obtained:
[The values are reproduced only as an image in the original publication.]
fifth optical embodiment
As shown in fig. 14, the lens includes four lens elements with refractive power and includes, in order from an object side to an image side, a first lens element 11, a second lens element 12, a third lens element 13 and a fourth lens element 14.
Referring to fig. 14 and 15, fig. 14 is a configuration diagram of a fifth optical embodiment of a lens of a vehicle identification system according to the present invention, and fig. 15 shows, in order from left to right, graphs of spherical aberration, astigmatism and optical distortion of the fifth optical embodiment of the present invention. In fig. 14, the lens includes, in order from an object side to an image side, an aperture stop 60, a first lens element 11, a second lens element 12, a third lens element 13, a fourth lens element 14, an infrared filter 70, an image plane and a first fingerprint sensor F1; the first fingerprint sensor F1 may be replaced by a second fingerprint sensor F2.
The first lens element 11 with positive refractive power has a convex object-side surface 111 and a convex image-side surface 112, and is aspheric, and the object-side surface 111 has an inflection point.
The second lens element 12 with negative refractive power has a convex object-side surface 121 and a concave image-side surface 122, and is aspheric, wherein the object-side surface 121 has two inflection points and the image-side surface 122 has one inflection point.
The third lens element 13 with positive refractive power has a concave object-side surface 131 and a convex image-side surface 132, and is aspheric, wherein the object-side surface 131 has three inflection points and the image-side surface 132 has one inflection point.
The fourth lens element 14 with negative refractive power has a concave object-side surface 141 and a concave image-side surface 142, and is aspheric, wherein the object-side surface 141 has two inflection points and the image-side surface 142 has one inflection point.
The infrared filter 70 is made of glass, is disposed between the fourth lens element 14 and the first fingerprint sensor F1, and does not affect the focal length of the lens.
Please refer to Table IX and Table X below.
[Table IX is reproduced only as an image in the original publication.]
TABLE X aspheric coefficients of the fifth optical example
[Table X is reproduced only as an image in the original publication.]
In the fifth optical embodiment, the aspheric surface curve equation takes the same form as in the first optical embodiment. The definitions of the following parameters are also the same as in the first optical embodiment and are not repeated here.
According to Table IX and Table X, the following conditional values can be obtained:
[The values are reproduced only as an image in the original publication.]
According to Table IX and Table X, the following conditional values can be obtained:
[The values are reproduced only as an image in the original publication.]
According to Table IX and Table X, the following values related to the profile curve length can be obtained:
[The values are reproduced only as an image in the original publication.]
sixth optical embodiment
Referring to fig. 16 and 17, fig. 16 is a configuration diagram of a sixth optical embodiment of a lens of an automotive identification system according to the present invention, and fig. 17 is a graph sequentially showing spherical aberration, astigmatism and optical distortion of the sixth optical embodiment of the present invention from left to right. In fig. 16, the first lens element 11, the stop 60, the second lens element 12, the third lens element 13, the ir-filter 70, the image plane and the first fingerprint sensor F1 are disposed in order from the object side to the image side, and the first fingerprint sensor F1 may be replaced by the second fingerprint sensor F2.
The first lens element 11 with positive refractive power has a convex object-side surface 111 and a concave image-side surface 112.
The second lens element 12 with negative refractive power has a concave object-side surface 121 and a convex image-side surface 122, and is aspheric, and the image-side surface 122 has an inflection point.
The third lens element 13 with positive refractive power has a convex object-side surface 131 and a concave image-side surface 132, and is aspheric, wherein the object-side surface 131 has two inflection points and the image-side surface 132 has one inflection point.
The infrared filter 70 is made of glass, is disposed between the third lens 13 and the first fingerprint sensor F1, and does not affect the focal length of the lens.
Please refer to Table XI and Table XII below.
[Table XI is reproduced only as an image in the original publication.]
TABLE XII aspheric coefficients of the sixth optical example
[Table XII is reproduced only as an image in the original publication.]
In the sixth optical embodiment, the aspheric surface curve equation takes the same form as in the first optical embodiment. The definitions of the following parameters are also the same as in the first optical embodiment and are not repeated here.
According to Table XI and Table XII, the following conditional values can be obtained:
[The values are reproduced only as an image in the original publication.]
According to Table XI and Table XII, the following conditional values can be obtained:
[The values are reproduced only as an image in the original publication.]
According to Table XI and Table XII, the following values related to the profile curve length can be obtained:
[The values are reproduced only as an image in the original publication.]
Please refer to fig. 18, which is a block diagram illustrating a third embodiment of the vehicle identification system of the present invention. As shown in fig. 18, the third embodiment differs from the first embodiment in the sound sensor S and in that the database stores the main reference sound information PSF and the secondary reference sound information SSF; the configuration of the other components is the same as in the first embodiment and is not repeated here.
The sound sensor S is disposed outside or inside the vehicle C to obtain the sound information SF. The database 20 is disposed inside the vehicle C and stores the main reference sound information PSF. The processor 10 is disposed inside the vehicle C and electrically connected to the database 20 and the sound sensor S. When the processor 10 determines that the sound information SF matches the main reference sound information PSF, the processor 10 obtains the usage authentication, opens the door of the vehicle C, and starts the engine of the vehicle C; when the processor 10 determines that the sound information SF does not match the main reference sound information PSF, it locks the door of the vehicle C.
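A minimal sketch of this gating logic, assuming a placeholder similarity function and a 0.9 match threshold (neither the comparison method nor any threshold is specified in the text):

```python
# Hedged sketch: compare captured sound information SF with the stored main
# reference PSF and gate the door and engine accordingly. match_score and
# the 0.9 threshold are illustrative assumptions, not from the document.
def match_score(sf, psf):
    # Placeholder similarity: fraction of matching samples.
    return sum(1 for a, b in zip(sf, psf) if a == b) / max(len(psf), 1)

def authenticate_by_sound(sf, psf, threshold=0.9):
    if match_score(sf, psf) >= threshold:
        # Usage authentication obtained: open the door, start the engine.
        return {"door": "open", "engine": "started"}
    # No match: keep the door locked.
    return {"door": "locked", "engine": "off"}

print(authenticate_by_sound([1, 2, 3, 4], [1, 2, 3, 4]))
```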
In one embodiment, the processor 10 can determine whether the sound information SF matches the main reference sound information PSF based on the waveform of the main reference sound information PSF. In another embodiment, the processor 10 can determine whether the sound information SF matches the main reference sound information PSF based on both the waveform and the frequency of the main reference sound information PSF. Of course, the determination may also be based on other features of the sound information, and the present invention is not limited in this respect.
When the processor 10 determines that the sound information SF matches the main reference sound information PSF, the processor 10 obtains the usage authentication and confirms that the person who wants to enter the vehicle C is the driver. The positioning element 30 then positions the vehicle C and builds a map of its surroundings; the positioning element 30 can also report the weather and temperature at the location of the vehicle C, so that the driver knows the surrounding environment and traffic. The driver controls the operation of the peripheral element 40 through the human-machine interface 50; alternatively, the driver can send a control signal from an external electronic device to the wireless transceiver RT, which forwards it to the processor 10, and the processor 10 controls the operation of the peripheral element 40 according to the control signal. Meanwhile, the processor 10 records the settings of the peripheral element 40 and stores them as the user information UI in the database 20. When the driver transmits the sound information SF to the sound sensor S again, the processor 10 can retrieve the user information UI from the database 20 according to the sound information SF and display it on the human-machine interface 50.
In addition, the database 20 further stores a plurality of pieces of secondary reference sound information SSF provided to the processor 10. The secondary reference sound information SSF is the sound information of other users whom the driver has authorized to use the vehicle C. The operation of the processor 10 with the secondary reference sound information SSF and the sound sensor S is as follows: (1) when the processor 10 determines that the sound information SF matches one of the pieces of secondary reference sound information SSF, the person who wants to enter the vehicle C is a user authorized by the driver, so the processor 10 obtains the authorization authentication, opens the door of the vehicle C, and starts the engine of the vehicle C; (2) when the processor 10 determines that the sound information SF does not match any of the secondary reference sound information SSF, the person is not authorized by the driver, so the processor 10 does not obtain the authorization authentication and locks the door of the vehicle C.
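The main-reference check and the two cases (1) and (2) above combine into a single decision, sketched below. The `matches` callback is an assumed stand-in for the waveform/frequency comparison, which the text does not specify in executable form.

```python
# Hedged sketch of the decision order: driver (PSF) first, then any
# authorized user (SSF list), otherwise lock the doors.
def identify_speaker(sf, psf, ssf_list, matches):
    if matches(sf, psf):
        return "usage_authentication"           # the driver
    if any(matches(sf, ssf) for ssf in ssf_list):
        return "authorization_authentication"   # a user authorized by the driver
    return "doors_locked"                       # no authentication obtained

eq = lambda a, b: a == b  # trivial comparison for demonstration
print(identify_speaker("driver-voice", "driver-voice", ["guest-voice"], eq))
```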
Similarly, when the processor 10 determines that the sound information SF matches one of the pieces of secondary reference sound information SSF, the processor 10 obtains the authorization authentication and confirms that the person who wants to enter the vehicle C is a user authorized by the driver to use the vehicle C. The positioning element 30 then positions the vehicle C and builds a map of its surroundings; the positioning element 30 can also report the weather and temperature at the location of the vehicle C, so that the user knows the surrounding environment and traffic. The user controls the operation of the peripheral element 40 through the human-machine interface 50; alternatively, the user can send a control signal from an external electronic device to the wireless transceiver RT, which forwards it to the processor 10, and the processor 10 controls the operation of the peripheral element 40 according to the control signal. Meanwhile, the processor 10 records the settings of the peripheral element 40 and stores them as the user information UI in the database 20. When the user transmits the sound information SF to the sound sensor S again, the processor 10 can retrieve the user information UI from the database 20 according to the sound information SF and display it on the human-machine interface 50.
In summary, the vehicle identification system 1 of the present invention uses the first fingerprint sensor F1 and the second fingerprint sensor F2, disposed outside and inside the vehicle C, to obtain the first fingerprint image I1 and the second fingerprint image I2, and, with the data stored in the database 20 and the judgment of the processor, confirms whether the person who wants to enter the vehicle is the driver, so that no traditional key is needed to open the door of the vehicle and start the vehicle. In short, the vehicle identification system of the present invention has the above advantages and increases driving security.
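The two-stage flow summarized above can be sketched as follows: the exterior image I1 gates the door (pass authentication) and the interior image I2 gates the engine (use authentication). The `matches` comparison is an assumed placeholder for the processor's fingerprint matching, which the text does not specify in executable form.

```python
# Hedged sketch of the two-stage fingerprint flow.
def vehicle_access(i1, i2, reference_images, matches):
    state = {"door": "locked", "engine": "off"}
    if any(matches(i1, ref) for ref in reference_images):
        state["door"] = "open"              # pass authentication (exterior sensor)
        if any(matches(i2, ref) for ref in reference_images):
            state["engine"] = "started"     # use authentication (interior sensor)
    return state

eq = lambda a, b: a == b  # trivial comparison for demonstration
print(vehicle_access("driver", "driver", ["driver"], eq))
```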
The foregoing is by way of example only, and not limiting. It is intended that the appended claims cover any and all equivalent modifications or variations of this invention which do not depart from the spirit and scope of this invention.

Claims (33)

1. An identification system for a vehicle, adapted for use in a vehicle, comprising:
the first fingerprint sensor is arranged outside the vehicle to generate a first fingerprint image;
at least one second fingerprint sensor arranged in the vehicle to generate a second fingerprint image;
a database disposed inside the vehicle and storing at least one main reference fingerprint image; and
the processor is arranged in the vehicle and is electrically connected with the database, the first fingerprint sensor and the second fingerprint sensor so as to receive the first fingerprint image, the second fingerprint image and the main reference fingerprint image, and when the processor judges that the first fingerprint image is consistent with the main reference fingerprint image, the processor obtains pass authentication and opens the vehicle door of the vehicle; when the processor determines that the second fingerprint image matches the primary reference fingerprint image, the processor obtains a usage authentication and causes an engine of the vehicle to start.
2. The vehicle identification system of claim 1, wherein the pass authentication is not obtained by the processor when the processor determines that the first fingerprint image does not match the primary reference fingerprint image.
3. The vehicle identification system of claim 1, wherein when the processor determines that the first fingerprint image matches the primary reference fingerprint image but the second fingerprint image does not match the primary reference fingerprint image, the processor obtains the pass authentication without obtaining the use authentication, and the second fingerprint sensor re-senses.
4. The vehicle identification system of claim 1, wherein the processor obtains user information based on the first fingerprint image or the second fingerprint image.
5. The vehicle identification system of claim 1, wherein the database further comprises a plurality of secondary reference fingerprint images for providing to the processor.
6. The vehicle identification system of claim 5, wherein when the processor determines that the first fingerprint image matches one of the plurality of secondary reference fingerprint images, the processor obtains the pass authentication and causes a door of the vehicle to open; when the processor determines that the second fingerprint image matches one of the plurality of secondary reference fingerprint images, the processor obtains authorization to use authentication and causes an engine of the vehicle to start.
7. The vehicle identification system of claim 6, wherein when the processor determines that none of the first fingerprint images matches the plurality of secondary reference fingerprint images, the processor does not obtain the pass authentication and causes the doors of the vehicle to lock.
8. The vehicle identification system of claim 6, wherein when the processor determines that the first fingerprint image matches one of the secondary reference fingerprint images but none of the second fingerprint images matches the secondary reference fingerprint images, the processor obtains the pass authentication without obtaining the authorization to use authentication, the second fingerprint sensor re-senses or the processor re-looks for another frame of the secondary reference fingerprint images in the database.
9. The vehicular identification system according to claim 1, further comprising a positioning element for positioning and mapping said vehicle when said processor obtains said pass authentication and said use authentication.
10. The vehicle identification system of claim 1, further comprising a wireless transceiver, an external electronic device and a peripheral device, wherein the wireless transceiver is disposed inside the vehicle and is wirelessly connected to the external electronic device and the processor, the peripheral device is disposed on the vehicle, when the processor obtains the pass authentication and the use authentication, the external electronic device transmits a control signal to the processor through the wireless transceiver, and the processor operates the peripheral device according to the control signal; or, when the processor obtains the pass authentication and the use authentication, the processor makes the peripheral element operate.
11. The vehicle identification system of claim 10, wherein said peripheral component comprises a driver seat, a passenger side door, a passenger side window, a rear view mirror, an air conditioner, an instrument panel, a tachograph, a light, a multimedia player, an airbag, or a vehicle transmission.
12. The vehicle identification system of claim 1, wherein the first fingerprint sensor and the second fingerprint sensor are optical fingerprint sensors and have lenses, and the lenses comprise at least three lenses with refractive power.
13. The vehicular identification system according to claim 12, wherein the lens further satisfies the following condition:
1.0≦f/HEP≦10.0;
0deg≦HAF≦150deg;
0mm≦PhiD≦18mm;
PhiA/PhiD≦0.99; and
0.9≦2(ARE/HEP)≦2.0
wherein f is the focal length of the lens; HEP is the entrance pupil diameter of the lens; HAF is half of the maximum view angle of the lens; PhiD is the maximum value of the minimum side length of the plane, perpendicular to the optical axis, at the outer periphery of the base of the lens; PhiA is the maximum effective diameter of the lens surface closest to an imaging surface; ARE is the length of the contour curve of any lens surface in the lens, measured along the contour of that surface from its intersection with the optical axis as a starting point to the position on the surface at a vertical height of 1/2 the entrance pupil diameter from the optical axis as an end point.
14. The vehicle identification system of claim 1, wherein said first fingerprint sensor and said second fingerprint sensor are capacitive or resistive fingerprint sensors.
15. An identification system for a vehicle, adapted for use in a vehicle, comprising:
at least one fingerprint sensor disposed on the vehicle to generate a fingerprint image;
the database is arranged in the vehicle and stores at least one first reference image and at least one second reference image;
the processor is arranged in the vehicle and electrically connected with the database and the fingerprint sensor, and when the processor judges that the fingerprint image is consistent with the first reference image, the processor acquires a first use authority and enters an owner mode; when the processor judges that the fingerprint image is consistent with the second reference image, the processor obtains a second use permission to enter a visitor mode.
16. The vehicle identification system according to claim 15, wherein the fingerprint sensor comprises a first fingerprint sensor disposed outside the vehicle to generate a first fingerprint image and a second fingerprint sensor disposed inside the vehicle to generate a second fingerprint image.
17. The vehicle identification system of claim 16, wherein when the processor determines that the first fingerprint image matches the first reference image or the second reference image, the processor obtains a pass authentication to open a door of the vehicle.
18. The vehicle identification system according to claim 17, wherein when the processor obtains the pass authentication and determines that the second fingerprint image matches the first reference image, the processor obtains the first usage right and enters the owner mode, and the processor causes the vehicle to travel at a high speed or a low speed.
19. The vehicular identification system according to claim 18, further comprising a location component for locating and mapping said vehicle when said processor obtains said pass authentication and said first right of use.
20. The vehicle identification system of claim 18, further comprising a wireless transceiver, an external electronic device and a peripheral device, wherein the wireless transceiver is disposed inside the vehicle and is wirelessly connected to the external electronic device and the processor, the peripheral device is disposed on the vehicle, when the processor obtains the pass authentication and the first right of use, the external electronic device transmits a control signal to the processor through the wireless transceiver, and the processor operates the peripheral device according to the control signal; or, when the processor obtains the pass authentication and the first usage right, the processor makes the peripheral element operate.
21. The vehicle identification system of claim 20, wherein said peripheral component comprises a driver seat, a passenger side door, a passenger side window, a rear view mirror, an air conditioner, an instrument panel, a tachograph, a light, a multimedia player, an airbag, or a vehicle transmission.
22. The vehicle identification system of claim 17, wherein when the processor obtains the pass authentication and determines that the second fingerprint image matches the second reference image, the processor obtains a second usage right and enters a visitor mode, and the processor permits the vehicle to travel only at low speed.
23. The vehicle identification system according to claim 22, further comprising a positioning element for positioning the vehicle and establishing a map when the processor obtains the pass authentication and the second usage right, wherein the vehicle can only travel to a plurality of restricted locations on the map.
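The two-stage decision flow of claims 16 through 23 can be sketched as follows. This is an illustrative sketch only: the function names, the returned dictionary layout, and the byte-equality stand-in for fingerprint matching are assumptions, not the patent's actual implementation.

```python
# Hypothetical sketch of the two-stage authentication flow (claims 16-23).
OWNER, VISITOR, DENIED = "owner", "visitor", "denied"

def matches(image: bytes, reference: bytes) -> bool:
    # Placeholder for the processor's fingerprint comparison against the
    # reference images stored in the database.
    return image == reference

def authenticate(first_img, second_img, first_ref, second_ref):
    # Stage 1: the exterior (first) sensor grants the "pass authentication"
    # that opens the vehicle door (claim 17).
    if not (matches(first_img, first_ref) or matches(first_img, second_ref)):
        return {"door_open": False, "mode": DENIED, "speeds": []}
    # Stage 2: the interior (second) sensor selects the usage right:
    # owner mode permits high or low speed (claim 18); visitor mode
    # permits low speed only (claim 22).
    if matches(second_img, first_ref):
        return {"door_open": True, "mode": OWNER, "speeds": ["high", "low"]}
    if matches(second_img, second_ref):
        return {"door_open": True, "mode": VISITOR, "speeds": ["low"]}
    return {"door_open": True, "mode": DENIED, "speeds": []}
```

Under this reading, the door opens on either reference match, while the driving mode depends solely on which reference the interior fingerprint matches.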
24. The vehicle identification system of claim 16, wherein the first fingerprint sensor and the second fingerprint sensor are optical fingerprint sensors each having a lens, and each lens comprises at least three lens elements with refractive power.
25. The vehicle identification system according to claim 24, wherein the lens further satisfies the following conditions:
1.0≦f/HEP≦10.0;
0deg≦HAF≦150deg;
0mm≦PhiD≦18mm;
PhiA/PhiD≦0.99; and
0.9≦2(ARE/HEP)≦2.0;
wherein f is the focal length of the lens; HEP is the entrance pupil diameter of the lens; HAF is half of the maximum field of view of the lens; PhiD is the maximum value of the minimum side length of the outer periphery of the lens base on a plane perpendicular to the optical axis of the lens; PhiA is the maximum effective diameter of the surface of the lens element closest to an imaging plane; and ARE is the length of a contour curve measured along the profile of any lens surface of any lens element in the lens, starting from the intersection point of that lens surface with the optical axis and ending at a position at a vertical height of 1/2 the entrance pupil diameter from the optical axis.
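The five conditions of claim 25 can be checked numerically as below. All numeric example values are hypothetical and are not taken from the patent's embodiments; the function name is likewise illustrative.

```python
# Illustrative check of the optical conditions recited in claim 25.
def lens_meets_claim25(f, hep, haf_deg, phi_d, phi_a, are):
    """Return True when all five claimed conditions hold (lengths in mm)."""
    return (
        1.0 <= f / hep <= 10.0             # f/HEP: working f-number range
        and 0.0 <= haf_deg <= 150.0        # HAF: half of maximum field of view
        and 0.0 <= phi_d <= 18.0           # PhiD: lens-base outer dimension
        and phi_a / phi_d <= 0.99          # PhiA/PhiD: clear-aperture ratio
        and 0.9 <= 2 * (are / hep) <= 2.0  # ARE relative to half the entrance pupil
    )

# Made-up example: f = 4.0 mm, HEP = 2.0 mm, HAF = 60 deg,
# PhiD = 10 mm, PhiA = 8 mm, ARE = 1.05 mm.
print(lens_meets_claim25(4.0, 2.0, 60.0, 10.0, 8.0, 1.05))  # prints True
```

Note that the ARE condition compares a surface contour length to half the entrance pupil diameter, so for a nearly flat surface 2(ARE/HEP) approaches 1.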
26. The vehicle identification system of claim 16, wherein said first fingerprint sensor and said second fingerprint sensor are capacitive or resistive fingerprint sensors.
27. A vehicle identification system, adapted for use in a vehicle, comprising:
a sound sensor disposed outside or inside the vehicle to acquire sound information;
a database disposed inside the vehicle and storing main reference sound information; and
a processor disposed inside the vehicle and electrically connected to the database and the sound sensor, wherein when the processor determines that the sound information matches the main reference sound information, the processor obtains a usage authentication to open a door of the vehicle and start an engine of the vehicle; and when the processor determines that the sound information does not match the main reference sound information, the processor keeps the door locked.
28. The vehicle identification system of claim 27, wherein the processor obtains user information based on the sound information.
29. The vehicle identification system of claim 27, wherein the database further stores a plurality of secondary reference sound information provided to the processor.
30. The vehicle identification system according to claim 29, wherein when the processor determines that the sound information matches one of the plurality of secondary reference sound information, the processor obtains the usage authentication, opens a door of the vehicle, and starts an engine of the vehicle; and when the processor determines that the sound information does not match any of the plurality of secondary reference sound information, the processor keeps the door locked.
31. The vehicle identification system according to claim 27, further comprising a positioning element for positioning the vehicle and establishing a map when the processor obtains the usage authentication.
32. The vehicle identification system of claim 27, further comprising a wireless transceiver, an external electronic device and a peripheral element, wherein the wireless transceiver is disposed inside the vehicle and is wirelessly connected to the external electronic device and the processor, and the peripheral element is disposed on the vehicle; when the processor obtains the usage authentication, the external electronic device transmits a control signal to the processor through the wireless transceiver, and the processor operates the peripheral element according to the control signal; or, when the processor obtains the usage authentication, the processor causes the peripheral element to operate.
33. The vehicle identification system of claim 32, wherein the peripheral element comprises a driver seat, a passenger side door, a passenger side window, a rear view mirror, an air conditioner, an instrument panel, a driving recorder, a light, a multimedia player, an airbag, or a vehicle transmission.
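The sound-based flow of claims 27 through 30 can be sketched as follows. The equality comparison is an illustrative stand-in for whatever sound-matching the processor performs; the function name and return shape are assumptions, not the patent's implementation.

```python
# Hypothetical sketch of the sound-based authentication flow (claims 27-30).
def sound_authenticate(sound, main_ref, secondary_refs=()):
    # A match against the main reference (claim 27) or against any
    # secondary reference (claim 30) yields the usage authentication:
    # the door opens and the engine starts.
    if sound == main_ref or sound in secondary_refs:
        return {"door_open": True, "engine_on": True}
    # Otherwise the processor keeps the door locked.
    return {"door_open": False, "engine_on": False}
```

In this reading the main and secondary references differ only in whose voice they enroll; both grant the same usage authentication.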
CN201920735841.5U 2019-03-14 2019-05-16 Vehicle identification system Active CN210324308U (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TW108203036 2019-03-14
TW108203036U TWM582005U (en) 2019-03-14 2019-03-14 Vehicle identification system

Publications (1)

Publication Number Publication Date
CN210324308U true CN210324308U (en) 2020-04-14

Family

ID=68317581


Also Published As

Publication number Publication date
TWM582005U (en) 2019-08-11


Legal Events

Date Code Title Description
GR01 Patent grant