CN117395483A - Assembling method and equipment for camera module
- Publication number: CN117395483A (application CN202311673449.XA)
- Authority: CN (China)
- Prior art keywords: carrier, optical lens, distance, optical axis, angle
- Legal status: Granted
Classifications
- H—ELECTRICITY; H04—ELECTRIC COMMUNICATION TECHNIQUE; H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION; H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details; H04N23/54—Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
- H04N23/50—Constructional details; H04N23/55—Optical parts specially adapted for electronic image sensors; Mounting thereof
- H04N23/57—Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
Abstract
The application provides an assembly method and assembly equipment for a camera module, relates to the field of electronic device technology, and is intended to solve the technical problem that the moving direction of the carrier does not coincide with the optical axis of the optical lens while the carrier moves. The camera module comprises an autofocus drive assembly, a base, a carrier, and an optical lens, where the autofocus drive assembly is connected between the base and the carrier, and the carrier is provided with a mounting port for mounting the optical lens. The assembly method comprises: acquiring a displacement offset angle of the carrier, the displacement offset angle being the included angle between the central axis of the mounting port and the moving direction of the carrier when driven by the autofocus drive assembly; acquiring an optical axis tilt angle of the optical lens, the optical axis tilt angle being the included angle between the central axis of the optical lens and the optical axis of the optical lens; and mounting the optical lens at the mounting port according to the displacement offset angle and the optical axis tilt angle, so that the central axis of the mounting port coincides with the optical axis of the optical lens.
Description
Technical Field
The embodiments of the application relate to the field of electronic device technology, and in particular to an assembly method and assembly equipment for a camera module.
Background
With the development of electronic technology, terminal devices such as mobile phones and tablet computers have become everyday items in users' lives, studies, and work. Terminal devices with camera modules are increasingly popular: the camera module allows the terminal device to take photos and record video in addition to its original functions, which greatly enriches and expands the service functions of mobile terminal devices and adds enjoyment to people's lives.
A camera module generally includes an autofocus drive assembly, a carrier, and an optical lens, where the autofocus drive assembly can drive the carrier to move and the optical lens is mounted on the carrier. By driving the carrier to move, the autofocus drive assembly adjusts the focus of the camera module.
However, when the autofocus drive assembly drives the carrier to move, the moving direction of the carrier may not coincide with the optical axis of the optical lens, so the carrier becomes decentered during movement, and the decenter is more severe the closer the lens is to the near-focus end of the stroke. As a result, the resolving power of the camera module image is greatly weakened, the captured picture is blurred, and the user experience suffers.
Disclosure of Invention
Embodiments of the application provide an assembly method and assembly equipment for a camera module, which are used to solve the technical problem that the moving direction of the carrier does not coincide with the optical axis of the optical lens while the carrier moves.
In order to achieve the above purpose, the embodiments of the present application adopt the following technical solutions:
In a first aspect, a method for assembling a camera module is provided. The camera module includes an autofocus drive assembly, a base, a carrier, and an optical lens, where the autofocus drive assembly is connected between the base and the carrier, and the carrier is provided with a mounting port for mounting the optical lens. The assembly method includes: acquiring a displacement offset angle of the carrier, the displacement offset angle being the included angle between the central axis of the mounting port and the moving direction of the carrier when driven by the autofocus drive assembly; acquiring an optical axis tilt angle of the optical lens, the optical axis tilt angle being the included angle between the central axis of the optical lens and the optical axis of the optical lens; and mounting the optical lens at the mounting port according to the displacement offset angle and the optical axis tilt angle, so that the central axis of the mounting port coincides with the optical axis of the optical lens.
Embodiments of the application detect the optical axis tilt angle of the optical lens and the displacement offset angle of the carrier (that is, the displacement offset angle of the autofocus drive assembly), and when the optical lens is assembled to the carrier, the two are assembled according to these angles so that the central axis of the mounting port coincides with the optical axis of the optical lens. In this way, the central axis of the mounting port of the carrier and the optical axis of the optical lens remain coincident as the carrier moves with the autofocus drive assembly, so that the image resolving power of the camera module is high over the entire voice coil motor stroke, the sharpness of captured pictures is improved, and the user experience is improved.
In some possible implementations of the first aspect, obtaining the displacement offset angle of the carrier includes: controlling the autofocus drive assembly to drive the carrier to move a preset distance relative to the base; obtaining a moving distance of the carrier on a first reference plane, where the first reference plane is perpendicular to the central axis of the mounting port; and obtaining the displacement offset angle of the carrier from the preset distance and the moving distance, where θ = arcsin(H/L), θ is the displacement offset angle of the carrier, H is the moving distance, and L is the preset distance. In this way, the displacement offset angle of the carrier is calculated from a trigonometric relation.
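As an illustration of this calculation only, the following minimal Python sketch computes the angle; the function name and the use of arcsin (valid when L is measured along the carrier's actual moving direction) are assumptions for illustration, not taken from the patent.

```python
import math

def displacement_offset_angle_deg(moving_distance_h: float, preset_distance_l: float) -> float:
    """Displacement offset angle of the carrier, in degrees.

    moving_distance_h: lateral movement H of the carrier measured on the first
        reference plane (perpendicular to the central axis of the mounting port).
    preset_distance_l: preset distance L travelled by the carrier.
    """
    if not 0.0 <= moving_distance_h <= preset_distance_l:
        raise ValueError("H must satisfy 0 <= H <= L")
    return math.degrees(math.asin(moving_distance_h / preset_distance_l))

# Example: a 0.02 mm lateral shift over a 3 mm preset travel
print(displacement_offset_angle_deg(0.02, 3.0))  # ~0.38 degrees
```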
In some possible implementations of the first aspect, obtaining the moving distance of the carrier on the first reference plane includes: acquiring an initial projection position of the carrier on the first reference plane when the carrier is at the initial position; acquiring a target projection position of the carrier on the first reference plane after the carrier has moved the preset distance; and obtaining the moving distance of the carrier on the first reference plane from the initial projection position and the target projection position. That is, the moving distance of the carrier on the first reference plane is obtained from the projections of the carrier onto the first reference plane before and after the movement.
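A minimal sketch of this step, assuming the two projection positions are available as (x, y) coordinates in the first reference plane; the coordinate representation and the function name are illustrative assumptions, not specified in the patent.

```python
import math

def movement_distance_on_reference_plane(initial_xy, target_xy):
    """Moving distance H of the carrier on the first reference plane, taken as
    the Euclidean distance between the initial and target projection positions."""
    dx = target_xy[0] - initial_xy[0]
    dy = target_xy[1] - initial_xy[1]
    return math.hypot(dx, dy)

# Example with projection positions measured in millimetres
h = movement_distance_on_reference_plane((1.000, 2.000), (1.012, 2.016))
print(h)  # 0.02 mm
```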
In some possible implementations of the first aspect, controlling the autofocus drive assembly to drive the carrier to move the preset distance relative to the base includes: acquiring the initial position and the current position of the carrier; determining the actual moving distance of the carrier from the current position and the initial position; judging, from the actual moving distance and the preset distance, whether the actual moving distance is equal to the preset distance; and if so, controlling the autofocus drive assembly to stop driving the carrier.
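A hedged sketch of this control loop; `read_position`, `drive_step`, and `stop_drive` are hypothetical callables standing in for the distance measuring device and the autofocus drive assembly, not an API described in the patent.

```python
def move_carrier_preset_distance(read_position, drive_step, stop_drive,
                                 preset_distance: float, tolerance: float = 1e-3) -> float:
    """Drive the carrier until its actual travel from the initial position
    equals the preset distance (within a tolerance), then stop the drive."""
    initial = read_position()
    while True:
        actual = abs(read_position() - initial)
        if abs(actual - preset_distance) <= tolerance:
            stop_drive()
            return actual
        # keep driving toward the target travel
        drive_step(+1 if actual < preset_distance else -1)

# Toy simulation standing in for the voice coil motor and the measuring device
position = [0.0]
travelled = move_carrier_preset_distance(
    read_position=lambda: position[0],
    drive_step=lambda d: position.__setitem__(0, position[0] + d * 0.0005),
    stop_drive=lambda: None,
    preset_distance=3.0)
print(travelled)  # ~3.0 (within the tolerance)
```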
In some possible implementations of the first aspect, the preset distance is the distance from the initial position of the carrier to the maximum-travel position to which the carrier moves away from the base. This makes controlling the carrier to move the preset distance simpler, and because the moving distance of the carrier on the first reference plane is larger when the autofocus drive assembly is at the maximum-travel position, the moving distance is easier to measure and the error of the calculated displacement offset angle of the carrier is reduced.
In some possible implementations of the first aspect, obtaining the optical axis tilt angle of the optical lens includes: obtaining, by performing a defocus (through-focus) test on the optical lens, a curve relating the moving distance of the autofocus drive assembly to the modulation transfer function (MTF) value; and determining the optical axis tilt angle of the optical lens from the curve.
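The patent does not spell out how the tilt angle is read off the curve. One common through-focus approach, shown below purely as an assumed illustration, compares the stroke positions at which the MTF peaks for two field points on opposite sides of the image; all names and numbers are hypothetical.

```python
import numpy as np

def optical_axis_tilt_deg(stroke_mm, mtf_a, mtf_b, field_separation_mm):
    """Estimate a tilt angle from two through-focus MTF curves.

    stroke_mm:            stroke positions of the autofocus drive during the defocus sweep
    mtf_a, mtf_b:         MTF values at two field points on opposite sides of the image
    field_separation_mm:  lateral distance between those two field points
    """
    stroke = np.asarray(stroke_mm)
    peak_shift = stroke[np.argmax(mtf_a)] - stroke[np.argmax(mtf_b)]
    return float(np.degrees(np.arctan2(peak_shift, field_separation_mm)))

# Toy curves: the two MTF peaks are 0.01 mm apart, field points 4 mm apart
stroke = np.linspace(-0.05, 0.05, 101)
mtf_a = np.exp(-((stroke - 0.005) / 0.02) ** 2)
mtf_b = np.exp(-((stroke + 0.005) / 0.02) ** 2)
print(optical_axis_tilt_deg(stroke, mtf_a, mtf_b, 4.0))  # ~0.14 degrees
```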
In some possible implementations of the first aspect, before the optical lens is mounted at the mounting port according to the displacement offset angle and the optical axis tilt angle, the assembly method further includes: applying connecting glue between the optical lens and the mounting port; and after the optical lens is mounted at the mounting port according to the displacement offset angle and the optical axis tilt angle, the assembly method further includes: curing the connecting glue so that the optical lens is fixed to the mounting port.
In this way, the photoinitiator in the connecting glue generates active free radicals after absorbing high-intensity ultraviolet light, initiating polymerization, crosslinking, and grafting reactions, so that the connecting glue changes from a liquid to a solid within a few seconds, which improves the production efficiency of the camera module.
In a second aspect, assembly equipment for a camera module is provided, configured to perform the assembly method of the camera module of the first aspect. The assembly equipment includes a first detection device, a second detection device, and an assembling device. The first detection device is used to detect the displacement offset angle of the carrier, where the displacement offset angle is the included angle between the central axis of the mounting port and the moving direction of the carrier when driven by the autofocus drive assembly; the second detection device is used to detect the optical axis tilt angle of the optical lens, where the optical axis tilt angle is the included angle between the central axis of the optical lens and the optical axis of the optical lens; and the assembling device is used to mount the optical lens at the mounting port according to the displacement offset angle and the optical axis tilt angle, so that the central axis of the mounting port coincides with the optical axis of the optical lens.
In some possible implementations of the second aspect, the first detection device includes: a first control unit, used to control the autofocus drive assembly to drive the carrier to move the preset distance relative to the base; a first detection unit, used to acquire the moving distance of the carrier on the first reference plane, where the first reference plane is perpendicular to the central axis of the mounting port; and a calculation unit, used to obtain the displacement offset angle of the carrier from the preset distance and the moving distance.
In some possible implementations of the second aspect, the first control unit includes: a first image sensor, used to acquire the initial position and the current position of the carrier; a first distance measuring device, used to determine the actual moving distance of the carrier from the current position and the initial position; and a first controller, used to judge, from the actual moving distance and the preset distance, whether the actual moving distance of the carrier is equal to the preset distance, and if so, to control the autofocus drive assembly to stop driving the carrier.
In some possible implementations of the second aspect, the first detection unit includes: a second image sensor, used to acquire the initial projection position of the carrier on the first reference plane when the carrier is at the initial position and the target projection position of the carrier on the first reference plane after the carrier has moved the preset distance; and a second distance measuring device, used to obtain the moving distance of the carrier on the first reference plane from the initial projection position and the target projection position.
In some possible implementations of the second aspect, the second detecting device is further configured to obtain a graph of a relationship between the movement distance of the autofocus drive assembly and the MTF value by performing a defocus test process on the optical lens, and confirm the tilt angle of the optical axis of the optical lens according to the graph.
In some possible implementations of the second aspect, the assembly equipment further includes a dispensing device, used to apply connecting glue between the optical lens and the mounting port.
It may be appreciated that, for the beneficial effects of the assembly equipment of the camera module provided in the second aspect and of any possible design thereof, reference may be made to the beneficial effects of the assembly method of the first aspect and of any possible design thereof, which are not repeated here.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of an auto-focusing AF mobile phone according to an embodiment of the present application;
fig. 3 (a) is a schematic diagram of an image collected by a camera module of an AF mobile phone under the condition that a photographed object is not focused;
fig. 3 (b) is a schematic diagram of an image collected after an AF mobile phone camera module provided in an embodiment of the present application performs auto-focusing on a photographed object;
Fig. 4 is a schematic diagram of a positional relationship among a voice coil motor, an optical lens, a carrier and an image sensor according to an embodiment of the present application;
fig. 5 (a) is a schematic diagram of a moving direction of a mover portion of a voice coil motor according to an embodiment of the present application;
fig. 5 (b) is a schematic view of an optical axis of an optical lens according to an embodiment of the present application;
fig. 6 (a) is a schematic structural diagram of a voice coil motor and an optical lens after assembly according to an embodiment of the present application;
fig. 6 (b) is a schematic structural diagram of a voice coil motor, an optical lens and an imaging surface S in an axial direction according to an embodiment of the present application;
fig. 6 (c) is a schematic structural diagram of an optical lens near a near-focus segment according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of an assembling apparatus according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a first image sensor and a mounting port according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of a second detection device according to an embodiment of the present application;
fig. 10 is a schematic diagram of a pattern drawn on a graphics card according to an embodiment of the present application;
- FIG. 11 is a first flowchart of a method for assembling a camera module according to an embodiment of the present application;
- FIG. 12 is a second flowchart of a method for assembling a camera module according to an embodiment of the present application;
- fig. 13 is a schematic view of the carrier at its initial position and after it has moved the preset distance, according to an embodiment of the present application;
- FIG. 14 is a third flowchart of a method for assembling a camera module according to an embodiment of the present application;
- FIG. 15 is a fourth flowchart of a method for assembling a camera module according to an embodiment of the present application;
- FIG. 16 is a schematic view of an initial projection position and a target projection position of the carrier on the first reference plane according to an embodiment of the present application;
- FIG. 17 is a fifth flowchart of a method for assembling a camera module according to an embodiment of the present application;
fig. 18 is a graph of correspondence between MTF values and strokes provided in an embodiment of the present application;
fig. 19 is a sixth flowchart of a method for assembling a camera module according to an embodiment of the present application.
Detailed Description
In the description of the embodiments of the application, the terminology used in the embodiments below is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification and the appended claims, the singular forms "a," "an," and "the" are intended to include expressions such as "one or more," unless the context clearly indicates otherwise. It should also be understood that in the embodiments below, "at least one" and "one or more" mean one, two, or more than two. The term "and/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may represent: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise. The term "coupled" includes both direct and indirect connections, unless stated otherwise. The terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated.
In the embodiments of the present application, words such as "exemplary" or "such as" are used to mean serving as examples, illustrations, or descriptions. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
To help those skilled in the art understand the technical solutions of the embodiments of the application, the technical terms involved in the embodiments are explained below.
Autofocus (AF): when a camera photographs an object, light is reflected from the object's surface. After the reflected light enters the camera lens, it is received by the image sensor located behind the lens; the processor then calculates the object distance from this information, and the camera automatically moves the lens according to the object distance to complete focusing and capture a sharp image of the object.
With the rapid development of photographing technology, captured images are widely used in various applications. The sharpness of an image affects not only the user's experience but also, to a large extent, how well those applications work.
For example, with the rapid development of computer vision and mobile phone photographing technology, augmented reality (AR) applications are increasingly used. AR technology fuses virtual information with the real world. It makes wide use of technical means such as multimedia, three-dimensional modeling, real-time tracking and registration, intelligent interaction, and sensing, and applies virtual information generated by an electronic device, such as text, images, three-dimensional models, music, and video, to the real world after simulation, so that the virtual information and the real world complement each other and the real world is "augmented." In AR applications, spatial localization technology plays a critical role. It is implemented by combining a cloud-side (cloud server) visual positioning service (VPS) with a device-side (electronic device, such as a mobile phone) simultaneous localization and mapping (SLAM) technique. Taking a mobile phone as the device-side electronic device as an example, an AR application on the phone collects an image of the current position through the camera and uploads it to the cloud server; the cloud server performs position calculation using VPS and returns the result to the phone, which fuses it with SLAM for real-time tracking and positioning. If the image acquired by the phone camera is blurred, the accuracy of the position solution is greatly affected, and the solution may even fail.
The embodiments of the application provide an assembly method for a camera module that allows the moving direction of the carrier to coincide with the optical axis of the optical lens while the carrier moves, avoiding decenter of the carrier during movement, thereby improving the resolving power of the camera module image and, in turn, the sharpness of the images the camera module captures.
The assembly method of the camera module provided by the embodiment of the application can be applied to the assembly of electronic equipment with a camera. The electronic device may include a mobile phone, a tablet computer, a notebook computer, a personal computer (personal computer, PC), an ultra-mobile personal computer (ultra-mobile personal computer, UMPC), a handheld computer, a netbook, an intelligent home device (e.g., an intelligent television, a smart screen, a large screen, a smart speaker, a smart air conditioner, etc.), a personal digital assistant (personal digital assistant, PDA), a wearable device (e.g., a smart watch, a smart bracelet, etc.), a vehicle-mounted device, an augmented reality device, a virtual reality device, etc., which the embodiments of the present application do not limit in any way. The electronic device can run an operating system and install application programs. Alternatively, the operating system that the electronic device runs may be an android (android) system, a Windows system, an iOS system, or the like.
For example, please refer to fig. 1, which illustrates a schematic structure of an electronic device 100. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera module 193, a display 194, and the like.
It should be understood that the illustrated structure of the embodiment of the present invention does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and a command center of the electronic device 100, among others. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use those instructions or data again, it can call them directly from the memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and thus improves the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device. And can also be used for connecting with a headset, and playing audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices, etc.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera module 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier and low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication (near field communication, NFC), infrared (IR), etc., as applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The sensor module 180 includes an inertial measurement unit (inertial measurement unit, IMU) module, or the like. The IMU module may include gyroscopes, accelerometers, and the like. Gyroscopes and accelerometers may be used to gather motion information of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 about three axes may be determined by a gyroscope. Accelerometers may be used to capture the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity may be detected when the electronic device 100 is stationary.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The electronic device 100 may receive key inputs, generating key signal inputs related to user settings and function controls of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also correspond to different vibration feedback effects by touching different areas of the display screen 194. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light, may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc.
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera module 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera module 193. For example, during photographing, the shutter opens, light is transmitted through the lens to the photosensitive element of the camera, the optical signal is converted into an electrical signal, and the photosensitive element transmits the electrical signal to the ISP for processing, where it is converted into an image visible to the naked eye. The ISP can also optimize the noise, brightness, and skin tone of the image, as well as parameters such as the exposure and color temperature of the shooting scene. In some embodiments, the ISP may be disposed in the camera module 193.
With the continuous development and wide application of the electronic device 100, the auto-focusing function is increasingly applied to electronic devices such as smart phones and tablet computers. The electronic equipment with the automatic focusing function can automatically focus the shot object during shooting, so that clear imaging of the shot object is realized.
Taking an autofocus mobile phone as an example, fig. 2 is a schematic diagram of an autofocus (AF) mobile phone. As shown in fig. 2, the AF phone may include one or more camera modules 193 with autofocus capability, which may be disposed on the back or the front of the phone. When the AF phone photographs an object, the camera module 193 can automatically complete focusing on the object to obtain a sharp image of it. Fig. 3 (a) shows an image collected by the camera module of the AF phone when the photographed object is not in focus: the collected image is blurred and of low sharpness. Fig. 3 (b) shows an image collected after the camera module of the AF phone has automatically focused on the object: the collected image is sharper than before focusing and reflects the actual appearance of the object. Therefore, the AF phone can acquire a high-definition image of the photographed object through the autofocus capability of the camera module, and because the autofocus process is performed automatically by the phone without manual operation by the user, the user's shooting experience is improved.
The camera module 193 is generally composed of an autofocus drive assembly, an image sensor 1932, a housing 1933, a carrier 1934, a drive control board, and an optical lens 1935. In the following description, a voice coil motor 1931 is taken as an example of the autofocus drive assembly. The voice coil motor 1931 includes a mover portion and a stator portion; the mover portion is connected to the carrier 1934, and the carrier 1934 is provided with a mounting port for mounting the optical lens 1935. The mover portion can move relative to the stator portion in the optical axis direction. The image sensor 1932 is fixed on the side of the voice coil motor 1931 away from the optical lens 1935. The optical lens 1935 moves with the mover portion so that the imaging focus of the lens on the photographed object falls on the image sensor 1932. In this way, the electronic device 100 can achieve sharp imaging of the photographed object.
Generally, the mover portion moves relative to the stator portion as follows: the stator portion is a magnet fixed to the housing 1933, which is located on the side of the voice coil motor 1931 away from the carrier 1934, and the mover portion includes a coil. The drive control board starts or stops the voice coil motor 1931 by controlling whether the coil is energized. When the coil is energized, the magnetic field of the magnet exerts a Lorentz force on the moving charges in the coil, and this Lorentz force is the thrust with which the mover portion pushes the optical lens.
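As a rough, standard-physics illustration only (this relation is not stated in the patent): for a coil with n turns of effective length L_c in a magnetic field of flux density B carrying current I, the thrust on the mover is approximately

F = n · B · I · L_c,

so the drive control board can regulate the thrust on the optical lens, and hence its displacement, by regulating the coil current I.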
Fig. 4 is a schematic diagram illustrating a positional relationship among a voice coil motor 1931, an optical lens 1935, a carrier 1934 and an image sensor 1932, and referring to fig. 4, the voice coil motor 1931 and the optical lens 1935 may be disposed on a lighting side of the image sensor 1932. The carrier 1934 may have a mounting opening, in which the optical lens 1935 is located, and a light emitting side of the optical lens 1935 corresponds to a light collecting side of the image sensor 1932. The optical lens 1935 can collect light of a photographed object, the voice coil motor 1931 can drive the optical lens 1935 to perform auto-focusing, and then the light collected by the optical lens 1935 is projected onto the image sensor 1932, and the image sensor 1932 can convert the collected optical signal into an electrical signal, so as to complete conversion of the photoelectric signal.
As shown in fig. 5 (a) and 5 (b), when the voice coil motor 1931 and the optical lens 1935 are manufactured, owing to assembly and material tolerances, the moving direction M of the mover portion of the voice coil motor 1931 may not coincide with the central axis direction of the mounting port 1934A, and the optical axis direction N of the optical lens 1935 may not coincide with the central axis of the optical lens 1935.
Fig. 6 (a) is a schematic structural diagram of the assembled voice coil motor 1931 and optical lens 1935, fig. 6 (b) is a schematic structural diagram of the voice coil motor 1931, the optical lens 1935, and the imaging surface S in the axial direction, and fig. 6 (c) is a schematic structural diagram of the optical lens near the near-focus segment. Referring to fig. 6 (a), 6 (b), and 6 (c), when the carrier 1934 moves with the mover portion, a decenter problem occurs, and the closer the lens is to the near-focus segment, the more severe the decenter becomes, so that the resolving power of the image of the camera module 193 is greatly reduced, the captured picture is blurred, and the user experience is affected.
Based on this, the embodiments of the application provide an assembly method and assembly equipment for a camera module, to solve the problem that the carrier becomes decentered while moving with the mover portion.
In a first aspect, an embodiment of the application provides assembly equipment for a camera module, which includes a first detection device, a second detection device, and an assembling device.
The first detection device is used to detect the displacement offset angle of the carrier, where the displacement offset angle is the included angle between the central axis of the mounting port and the moving direction of the carrier when driven by the autofocus drive assembly. The second detection device is used to detect the optical axis tilt angle of the optical lens, where the optical axis tilt angle is the included angle between the central axis of the optical lens and the optical axis of the optical lens. The assembling device is used to mount the optical lens at the mounting port according to the displacement offset angle and the optical axis tilt angle, so that the central axis of the mounting port coincides with the optical axis of the optical lens.
Embodiments of the application detect the optical axis tilt angle of the optical lens and the displacement offset angle of the carrier (that is, the displacement offset angle of the mover portion). When the optical lens is assembled to the carrier, the two are assembled according to these angles so that the central axis of the mounting port coincides with the optical axis of the optical lens. In this way, as the carrier moves with the mover portion, the central axis of the mounting port of the carrier always remains coincident with the optical axis of the optical lens, which ensures that the image resolving power of the camera module is high over the entire voice coil motor stroke, improves the sharpness of captured photos, and thus improves the user experience.
Referring to fig. 7, in one possible structural design, the assembly equipment 200 further includes a rotation shaft 210, which may be arranged perpendicular to the horizontal plane, and three stations may be arranged in sequence around the circumference of rotation of the rotation shaft 210: a first station, a second station, and a third station. The first station may be a voice coil motor loading station at which a first detection device 220 may be arranged; after the first detection device 220 has detected the displacement offset angle of the carrier, the rotation shaft 210 drives the voice coil motor whose carrier displacement offset angle has been detected to the second station. The second station is an optical lens loading station at which a second detection device 230 may be arranged; after the second detection device 230 has detected the optical axis tilt angle of the optical lens, the rotation shaft 210 drives the voice coil motor and the optical lens, whose displacement offset angle and optical axis tilt angle have been detected, together to the third station. The third station may be an unloading station at which an assembling device 240 may be arranged; after the assembling device 240 has mounted and fixed the optical lens to the mounting port of the carrier, the connected and fixed optical lens and voice coil motor can be unloaded.
Therefore, 3 stations (namely the first station, the second station and the third station) are connected in series through the rotary shaft 210, so that the assembly line production of the camera module is facilitated, and the production efficiency of the camera module is improved.
In some embodiments, the first detection device 220 may include a first control unit, a first detection unit, and a calculation unit.
In one possible implementation, a control unit is a device that can generate an operation control signal according to an instruction operation code and a timing signal and instruct the autofocus drive assembly to execute a control instruction. By way of example, the control unit may be a central processing unit (CPU), a general-purpose processor, a network processor (NP), a digital signal processor (DSP), a programmable logic device (PLD), a microprocessor, a microcontroller, or any combination thereof. The control unit may also be another device with a processing function, such as a circuit, a device, or a software module.
In another possible implementation, the control unit may also be a micro control unit (MCU). An MCU, also called a single-chip microcomputer, appropriately reduces the frequency and specification of a central processing unit (CPU) and integrates peripherals such as memory, timers/counters, USB, A/D conversion, UART, PLC, and DMA interfaces, and even an LCD drive circuit, on a single chip, forming a chip-level computer that can perform different combinations of control for different applications.
The first control unit is used for controlling the automatic focusing driving assembly (for example, a voice coil motor) to drive the carrier to move a preset distance relative to the base; the first detection unit is used for acquiring the moving distance of the carrier on a first reference plane, wherein the first reference plane is perpendicular to the central axis of the mounting port; the calculating unit is used for obtaining the displacement offset angle of the carrier according to the preset distance and the moving distance.
It will be appreciated that the displacement offset angle of the carrier may be calculated from a trigonometric relation. By way of example only, and not by way of limitation, θ = arcsin(H/L), where θ is the displacement offset angle of the carrier, H is the moving distance, and L is the preset distance.
According to the embodiment of the application, the preset distance the carrier moves relative to the base and the moving distance of the carrier on the first reference plane are obtained, and the displacement offset angle of the carrier (namely the displacement offset angle of the mover portion) is calculated according to a trigonometric function. This provides data support for the assembly of the optical lens and the carrier, ensures the high resolving power of the camera module, and improves the user experience.
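By way of example only and not by way of limitation, the calculation performed by the calculating unit may be sketched as the following snippet; the function name and numeric values are illustrative, and it is assumed (consistently with the trigonometric relation above) that the preset distance L is the carrier travel along its actual movement direction, so that sin θ = H/L:

```python
import math

def displacement_offset_angle(preset_distance_mm: float, in_plane_distance_mm: float) -> float:
    """Return the carrier displacement offset angle in degrees.

    Assumes L (preset_distance_mm) is the carrier travel along its actual
    movement direction and H (in_plane_distance_mm) is the component of that
    travel measured in the first reference plane, which is perpendicular to
    the central axis of the mounting port, so that sin(theta) = H / L.
    """
    if not 0 <= in_plane_distance_mm <= preset_distance_mm:
        raise ValueError("the in-plane distance must lie between 0 and the preset distance")
    return math.degrees(math.asin(in_plane_distance_mm / preset_distance_mm))

# Example: the carrier is commanded to travel 3 mm and drifts 0.05 mm in the reference plane
print(displacement_offset_angle(3.0, 0.05))  # approximately 0.955 degrees
```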
In one possible implementation, the first control unit may include: a first image sensor, a first distance measuring device, and a first controller. The first distance measuring device may be a distance measuring device such as a laser interferometer or a laser head, which is not limited in this application.
The first image sensor is used for acquiring an initial position and a current position of the carrier; the first distance measuring device is used for determining the actual moving distance of the carrier according to the current position and the initial position of the carrier. The initial position may be the position of the carrier at the minimum stroke of the voice coil motor.
Referring to fig. 8, the exit surface of the first image sensor 221 may be disposed parallel to the central axis O of the mounting hole. In this way, the first image sensor 221 can accurately detect the initial position and the current position of the carrier, thereby improving the accuracy of the detection result.
In addition, the first controller is electrically connected with the voice coil motor and the first distance measuring device, and is used for judging whether the actual moving distance of the carrier is equal to the preset distance of the carrier according to the actual moving distance and the preset moving distance, if so, the first controller controls the automatic focusing driving assembly to stop driving the carrier to move; if not, the first controller continues to control the automatic focusing driving assembly to drive the carrier to move until the actual moving distance is equal to the preset distance.
For example, if the preset distance is 3mm and the actual moving distance detected by the first distance measuring device is 2.8mm, the first controller may increase the current input to the powered voice coil so that the actual moving distance increases. If the actual moving distance detected by the first distance measuring device is 3.1mm, the first controller may decrease the current input to the powered voice coil such that the actual moving distance is decreased.
In this way, when the actual moving distance differs from the preset distance, the first controller adjusts the current supplied to the energized voice coil so that the actual moving distance gradually approaches the preset distance until the two are equal.
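As an illustrative sketch only, the adjustment logic of the first controller might look like the following; the callable interfaces, the current step size and the tolerance are assumptions introduced for the example and are not specified in this application:

```python
def drive_to_preset_distance(measure_distance_mm, set_coil_current_ma,
                             preset_mm, start_current_ma=20.0,
                             step_ma=1.0, tol_mm=0.01, max_iters=200):
    """Iteratively adjust the voice-coil current until the measured carrier
    travel matches the preset distance within a tolerance.

    measure_distance_mm: callable returning the travel reported by the first
        distance-measuring device (hypothetical interface).
    set_coil_current_ma: callable that applies a drive current to the coil.
    """
    current = start_current_ma
    for _ in range(max_iters):
        set_coil_current_ma(current)
        actual = measure_distance_mm()
        if abs(actual - preset_mm) <= tol_mm:
            return current                    # preset distance reached; stop adjusting
        current += step_ma if actual < preset_mm else -step_ma
    raise RuntimeError("carrier did not settle at the preset distance")


# Toy usage: assume travel is proportional to current (0.1 mm per mA)
state = {"current": 0.0}
settled_current = drive_to_preset_distance(
    measure_distance_mm=lambda: 0.1 * state["current"],
    set_coil_current_ma=lambda i: state.update(current=i),
    preset_mm=3.0,
)
print(settled_current)  # 30.0 mA in this toy model
```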
In another possible implementation manner, the first control unit may include: and the second controller is electrically connected with the voice coil motor.
The voice coil motor may be a closed-loop motor and may further include: a Hall element, which is a magnetic sensor based on the Hall effect. The Hall element may be used to measure the Gauss value of the magnetic field in the voice coil motor, so as to further determine the displacement distance of the mover.
It can be understood that in the control of the closed-loop voice coil motor, a preset displacement amount can be directly input to the closed-loop voice coil motor, and the voice coil motor continuously adjusts the position of the mover portion according to the feedback of the hall element, so that the displacement amount of the mover portion is equal to the input displacement amount.
Therefore, the second controller of the embodiment of the present application may be configured to send the preset distance to the voice coil motor. After receiving the preset distance, the voice coil motor moves, and the Hall element continuously feeds back the actual moving distance of the mover portion to the voice coil motor, which adjusts the position of the mover portion according to this feedback until the actual moving distance of the mover portion is the same as the preset distance.
It will be appreciated that an open-loop motor does not stop immediately when it reaches the preset distance; due to factors such as inertia it oscillates around that position, so a settling time is required before it becomes stable. The closed-loop motor provided in the embodiment of the application has feedback control from the Hall element, so the time needed to move the preset distance is short, and the closed-loop motor adjusts the position of the mover portion according to the movement position fed back by the Hall element. This improves the moving precision and adjustment speed of the voice coil motor, and further improves the accuracy of detecting the carrier displacement offset angle.
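The closed-loop behaviour described above can be illustrated with the following toy model; the proportional gain, the ideal Hall-element mapping and the class interface are assumptions made only for this sketch:

```python
class ClosedLoopVoiceCoilMotor:
    """Toy model of a closed-loop voice coil motor: the mover position is
    corrected each control cycle from Hall-element feedback (illustrative
    sketch; the gain and the Hall model are assumptions, not from this patent)."""

    def __init__(self, gain: float = 0.5):
        self.position_mm = 0.0   # mover position along the drive axis
        self.gain = gain

    def read_hall_position_mm(self) -> float:
        # A real Hall element reports flux density that maps to displacement;
        # here the mapping is assumed ideal for simplicity.
        return self.position_mm

    def move_to(self, preset_mm: float, tol_mm: float = 0.001, max_cycles: int = 100) -> float:
        for _ in range(max_cycles):
            error = preset_mm - self.read_hall_position_mm()
            if abs(error) <= tol_mm:
                break                                  # settled at the preset distance
            self.position_mm += self.gain * error      # proportional correction
        return self.position_mm


motor = ClosedLoopVoiceCoilMotor()
print(round(motor.move_to(3.0), 2))  # 3.0 mm, reached within a dozen control cycles
```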
In some embodiments, the first detection unit may include: a second image sensor and a second distance measuring device. Similarly, the second distance measuring device may be a distance measuring device such as a laser interferometer or a laser head, which is not limited in this application.
With continued reference to fig. 8, the exit surface of the second image sensor 222 may be disposed perpendicular to the central axis O of the mounting hole, and the second image sensor 222 is configured to obtain an initial projection position of the carrier on the first reference plane when the carrier is at the initial position, and obtain a target projection position of the carrier on the first reference plane after the carrier is moved by a preset distance. The second distance measuring device is used for obtaining the moving distance of the carrier on the first reference plane according to the initial projection position and the target projection position.
Referring to fig. 9, in some embodiments, the second detecting device 230 may include: a graphic card 231, a third image sensor 232, and a pickup device 233. The pickup device 233 may be disposed between the graphic card 231 and the third image sensor 232, and the pickup device 233 is used for fixing an optical lens 1935.
For example, the pickup device 233 may be a vacuum suction head capable of sucking a side of the optical lens 1935 to fix the optical lens 1935 between the graphic card 231 and the third image sensor 232, thereby performing an out-of-focus test process on the optical lens 1935.
In the embodiment of the present application, the distance between the third image sensor 232 and the pickup device 233 is adjusted so as to perform the defocus test processing on the optical lens 1935 fixed on the pickup device 233, a graph of the relationship between the moving distance of the auto-focus driving assembly and the MTF value is obtained, and the optical axis tilt angle of the optical lens 1935 is confirmed according to the relationship graph.
In one possible structural design, the pickup device 233 is slidably disposed between the graphic card 231 and the third image sensor 232, and the sliding path of the pickup device 233 is parallel to the vertical line G of the graphic card 231. In this way, the distance between the third image sensor 232 and the pickup device 233 can be adjusted by moving the pickup device 233.
In yet another possible structural design, the third image sensor 232 is slidably disposed on the side of the pickup device 233 away from the graphic card 231, and the sliding path of the third image sensor 232 is parallel to the vertical line G of the graphic card 231. In this way, the distance between the third image sensor 232 and the pickup device 233 can be adjusted by moving the third image sensor 232.
Since the MTF value is typically calculated using radial and tangential target stripes, which are perpendicular to each other, the graphic card may, for example, be a card having a lattice-like stripe pattern with alternating black and white, as shown in fig. 10.
In some embodiments, the assembling apparatus may further include: a dispensing device for dispensing glue between the optical lens and the mounting port. In this way, in the embodiment of the application the optical lens and the mounting port are fixed by dispensing, which prevents the optical lens from falling off from the mounting port.
In some embodiments, the connection glue dispensed by the dispensing device includes a photoinitiator, and the assembling apparatus may further include: an ultraviolet (UV) curing machine for curing the connection glue so as to connect the optical lens with the mounting port.
In this way, the photoinitiator in the connecting glue can generate active free radicals after absorbing high-intensity ultraviolet light, so that polymerization, crosslinking and grafting reactions are initiated, the connecting glue is converted from a liquid state to a solid state within a few seconds, and the production efficiency of the camera module is improved.
In a second aspect, an embodiment of the present application provides a method for assembling a camera module, as shown in fig. 11, the method for assembling a camera module may include:
S101, the first detection device acquires the displacement offset angle of the carrier.
The carrier is provided with a mounting port, and the displacement offset angle is an included angle between the central axis of the carrier mounting port and the moving direction of the carrier driven by the automatic focusing driving assembly.
In some embodiments, the first detection device may determine the displacement offset angle of the carrier by acquiring a preset distance that the carrier moves relative to the base, and a moving distance of the carrier on the first reference plane. Wherein the first reference plane is perpendicular to the central axis of the mounting opening.
Specifically, referring to fig. 12, the step S101 may include the following steps S1011-S1013.
S1011, the first control unit controls the automatic focusing driving assembly to drive the carrier to move a preset distance relative to the base.
In one possible implementation, the preset distance may be a fixed value. For example, referring to fig. 13, the carrier 1934b is at the initial position of the carrier, and the carrier 1934c is at the position after the carrier has moved by the preset distance a. The preset distance a may be 1mm, 2mm, 2.5mm, or the like, which is not limited in this application.
In one possible design, referring to fig. 14, the first control unit may control the automatic focusing driving assembly to drive the carrier to move the preset distance relative to the base as in the following steps S1011A-S1011D2.
S1011A, the first image sensor acquires the initial position and the current position of the carrier.
It can be understood that when the camera module performs automatic focusing, the energized coil and the magnet interact electromagnetically to generate a magnetic field, producing a force that pushes the carrier and thereby drives the optical lens to move. The Hall element can sense the displacement of the mover portion of the voice coil motor and output an analog signal corresponding to the magnetic field strength; after receiving the analog signal, the driving chip processes it and calculates the holding current of the energized coil, which is used to adjust or determine the position of the optical lens.
Therefore, before the driving chip controls the automatic focusing driving assembly to drive the carrier to move relative to the seat body, the first image sensor can acquire the initial position of the carrier. After the driving chip controls the automatic focusing driving assembly to drive the carrier to move a certain stroke, the first image sensor can acquire the current position of the carrier.
Alternatively, the side of the carrier facing the first image sensor may be provided with a first mark point, and the first image sensor marks the current position of the carrier by determining the position of the first mark point. In this way, the position of the carrier is determined more clearly and accurately through the first mark point.
S1011B, the first distance measuring device determines the actual moving distance of the carrier according to the current position and the initial position of the carrier.
The first distance measuring device can be a distance measuring device such as a laser interferometer or a laser head.
In addition, the first image sensor can capture images of the carrier at the initial position and at the current position respectively. The first distance measuring device measures the pixel distance between the initial position and the current position of the carrier in these images, and can then determine the actual moving distance of the carrier according to the image scale (the physical distance represented by one pixel) and the pixel distance.
In one possible design, the first image sensor may record the position of the first mark point when the carrier is in the initial position (hereinafter referred to as the first mark position), and the position of the first mark point when the carrier is in the current position (hereinafter referred to as the second mark position). The first distance measuring device determines the actual moving distance of the carrier by detecting the distance between the first mark position and the second mark position.
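By way of illustration, the conversion from the pixel distance between the two mark positions to a physical distance may be sketched as follows; the coordinate values and the millimetre-per-pixel scale are hypothetical:

```python
import math

def actual_moving_distance_mm(first_mark_px: tuple[float, float],
                              second_mark_px: tuple[float, float],
                              mm_per_pixel: float) -> float:
    """Convert the pixel distance between the two recorded mark positions
    into a physical distance using the image scale."""
    dx = second_mark_px[0] - first_mark_px[0]
    dy = second_mark_px[1] - first_mark_px[1]
    return math.hypot(dx, dy) * mm_per_pixel

# Example: the mark point moved 600 pixels at a scale of 5 um per pixel
print(actual_moving_distance_mm((100.0, 200.0), (100.0, 800.0), 0.005))  # 3.0 mm
```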
S1011C, the first controller judges whether the actual moving distance of the carrier is equal to the preset distance of the carrier according to the actual moving distance and the preset distance.
S1011D1, if yes, the first controller controls the automatic focusing driving component to stop driving the carrier to move.
S1011D2, if not, the first controller continues to control the automatic focusing driving component to drive the carrier to move.
It is understood that the first controller continues to control the autofocus drive assembly to drive the carrier to move until the actual movement distance is equal to the predetermined distance.
In one possible design, if the actual displacement distance of the carrier is smaller than the preset distance, the current supplied to the energized coil can be increased so that the actual displacement distance increases and gradually approaches the preset distance. If the actual displacement distance of the carrier is greater than the preset distance, the current supplied by the driving circuit board to the energized coil can be decreased so that the actual displacement distance decreases and gradually approaches the preset distance.
It should be noted that, when the preset distance is a fixed value, the process of controlling the carrier to move to the preset distance is relatively complex: the carrier has to be moved step by step so that it gradually approaches the preset distance. To facilitate detection of the displacement offset angle of the carrier, in some embodiments the preset distance may be the distance from the maximum travel position of the carrier, in the direction away from the base, to the initial position of the carrier.
In this way, the process of controlling the carrier to move the preset distance is simpler, and when the automatic focusing driving assembly moves to the maximum travel position, the moving distance of the carrier on the first reference plane is larger, which makes the moving distance easier to measure and reduces the error of the calculated displacement offset angle of the carrier.
In another possible design, the voice coil motor may be a closed-loop motor and may further include: a Hall element. The first control unit may include: a second controller electrically connected with the voice coil motor. The second controller sends first information, including the preset distance, to the voice coil motor. The voice coil motor moves after receiving the first information; during the movement, the Hall element continuously feeds back the actual moving distance of the mover portion to the voice coil motor, and when the actual moving distance fed back by the Hall element equals the preset distance, the voice coil motor is controlled to power off.
Since an open-loop motor does not stop immediately when it reaches the preset distance and, due to factors such as inertia, oscillates around that position, a settling time is required before it becomes stable. The closed-loop motor provided in the embodiment of the application has feedback control from the Hall element, so the time needed to move the preset distance is short, and the closed-loop motor adjusts the position of the mover portion according to the movement position fed back by the Hall element. This improves the moving precision and adjustment speed of the voice coil motor, and further improves the accuracy of detecting the carrier displacement offset angle.
S1012, a first detection unit acquires the moving distance of the carrier on a first reference plane; wherein the first reference plane is perpendicular to the central axis of the mounting opening.
Referring to FIG. 15, in one possible design, the step S1012 may include at least the following steps S1012A-S1012C.
S1012A, the second image sensor acquires an initial projection position of the carrier on the first reference plane when the carrier is at the initial position.
The second image sensor may capture a first image when the carrier is in the initial position. The position of the carrier in the first image may be taken as the initial projection position of the carrier on the first reference plane.
Alternatively, the side of the carrier facing the second image sensor may be provided with second marker points, the position of which in the first image may be used as the initial projection position of the carrier on the first reference plane. Thus, the position of the carrier is determined by the second mark point more accurately and clearly.
S1012B, the second image sensor acquires the target projection position of the carrier on the first reference plane after moving a preset distance.
In addition, the second image sensor can shoot a second image after the carrier moves a preset distance, and the position of the carrier in the second image can be used as the target projection position of the carrier on the first reference plane.
If a second mark point is arranged on the side, facing the second image sensor, of the carrier, the position of the second mark point in the second image can be used as the target projection position of the carrier on the first reference plane.
And S1012C, the second distance measuring device obtains the moving distance of the carrier on the first reference plane according to the initial projection position and the target projection position.
The second distance measuring device measures the pixel distance between the initial projection position and the target projection position of the carrier. Then, the second distance measuring device can determine the moving distance of the carrier on the first reference plane according to the image scale of the second image sensor and the pixel distance.
For example, referring to fig. 16, 201 is an initial projection position of the carrier on the first reference plane, and 202 is a target projection position of the carrier on the first reference plane.
In one possible implementation, the movement distance of the carrier in the first reference plane is the length of the straight-line edge P between the initial projection position 201 and the target projection position 202.
In another possible implementation, the moving distance of the carrier in the first reference plane may include: a movement distance Q of the carrier on the abscissa of the first reference plane and a movement distance R of the carrier on the ordinate of the first reference plane, wherein the ordinate or the abscissa may be perpendicular to the exit surface of the first image sensor.
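As an illustrative sketch of the two representations described above (the straight-line distance P and the component distances Q and R), with hypothetical pixel coordinates and image scale:

```python
import math

def in_plane_movement(initial_proj_px, target_proj_px, mm_per_pixel):
    """Return (P, Q, R): the straight-line distance P between the initial and
    target projection positions on the first reference plane, and its
    components Q (abscissa) and R (ordinate), all in millimetres."""
    q = (target_proj_px[0] - initial_proj_px[0]) * mm_per_pixel
    r = (target_proj_px[1] - initial_proj_px[1]) * mm_per_pixel
    return math.hypot(q, r), q, r

p, q, r = in_plane_movement((512.0, 512.0), (520.0, 518.0), 0.005)
print(round(p, 4), round(q, 4), round(r, 4))  # 0.05 0.04 0.03 (mm)
```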
S1013, the calculating unit calculates and obtains the displacement offset angle of the carrier according to the preset distance and the moving distance.
Wherein, according to the trigonometric function, θ = arcsin(H/L), where θ is the displacement offset angle of the carrier, H is the moving distance, and L is the preset distance.
In one possible implementation, the displacement offset angle of the carrier may include: a first offset angle and a second offset angle. The first offset angle may be the offset angle of the displacement of the carrier on the first reference plane; as shown in fig. 16, for example, the first offset angle may be the angle between the abscissa axis of the first reference plane and the edge P. The second offset angle may be the angle between the central axis of the mounting port and the moving direction of the carrier.
In another possible implementation, the displacement offset angle of the carrier may include: an abscissa offset angle and an ordinate offset angle. The abscissa offset angle is the offset angle of the carrier along the abscissa; for example, the abscissa offset angle may be the angle between the central axis of the mounting port and the line from the initial position of the carrier to the first position, and the ordinate offset angle may be the angle between the central axis of the mounting port and the line from the initial position of the carrier to the second position. The first position is the position reached by moving a length Q from the coordinate origin along the abscissa axis of the first reference plane, and the second position is the position reached by moving a length R from the coordinate origin along the ordinate axis of the first reference plane, where the coordinate origin is the projection point of the carrier on the first reference plane when the carrier is at the initial position.
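A minimal sketch of how the component offset angles could be computed from Q, R and the preset distance L is given below; the sine-based relation for each component, and the returned first offset angle (the angle between the abscissa axis and the edge P), follow the same assumption as the earlier trigonometric relation and are illustrative only:

```python
import math

def offset_angles_deg(preset_mm: float, q_mm: float, r_mm: float):
    """Return illustrative offset angles in degrees, assuming
    sin(angle) = in-plane component / preset travel distance."""
    return {
        "abscissa_offset_angle": math.degrees(math.asin(q_mm / preset_mm)),
        "ordinate_offset_angle": math.degrees(math.asin(r_mm / preset_mm)),
        "total_offset_angle": math.degrees(math.asin(math.hypot(q_mm, r_mm) / preset_mm)),
        # First offset angle: angle between the abscissa axis and the edge P
        "first_offset_angle": math.degrees(math.atan2(r_mm, q_mm)),
    }

# abscissa ~0.764 deg, ordinate ~0.573 deg, total ~0.955 deg, first ~36.87 deg
print(offset_angles_deg(3.0, 0.04, 0.03))
```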
S102, the second detection device acquires the optical axis inclination angle of the optical lens.
The optical axis inclination angle is an included angle between the central axis of the optical lens and the optical axis of the optical lens, and the optical axis refers to the central axis of the light passing through the optical lens.
In addition, referring to fig. 17, the acquisition of the optical axis tilt angle of the optical lens may include the following steps S1021-S1022.
S1021, performing defocus test processing on the optical lens by the second detection device, and acquiring a relation graph of the moving distance of the automatic focusing driving assembly and the MTF value.
It is understood that MTF = contrast of the output image / contrast of the input image. The MTF value lies between 0 and 1, and the larger it is, the better the imaging quality of the camera module. The MTF may also be referred to as a spatial contrast transfer function (spatial contrast transfer function) or a spatial frequency contrast sensitivity function (spatial frequency contrast sensitivity function). The MTF value reflects the ability of the optical system to transfer the modulation of sinusoidal patterns at various spatial frequencies.
Wherein, the defocus test processing refers to: acquiring the MTF values of the image of the graphic card at four positions (upper left, upper right, lower left and lower right) for different distances (strokes) between the third image sensor and the optical lens, and plotting the correspondence between the MTF value and the stroke at each of the four positions as four relationship graphs. For example, the graphs may be as shown in fig. 18.
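As a sketch only, the through-focus (defocus) sweep could be expressed as follows; the Michelson-contrast-based MTF estimate and the synthetic data are illustrative and are not taken from this application:

```python
def mtf_from_contrast(i_max_out, i_min_out, i_max_in, i_min_in):
    """MTF = contrast of the output image / contrast of the input chart,
    using the Michelson contrast (Imax - Imin) / (Imax + Imin)."""
    contrast_out = (i_max_out - i_min_out) / (i_max_out + i_min_out)
    contrast_in = (i_max_in - i_min_in) / (i_max_in + i_min_in)
    return contrast_out / contrast_in

def best_focus_stroke(strokes_mm, mtf_values):
    """Return the stroke (mm) at which the measured MTF peaks for one of the
    four field positions (upper left / upper right / lower left / lower right)."""
    return max(zip(strokes_mm, mtf_values), key=lambda pair: pair[1])[0]

# Synthetic through-focus data for one field position of the chart image
strokes = [0.00, 0.05, 0.10, 0.15, 0.20]
mtf_upper_left = [mtf_from_contrast(m, 255 - m, 255, 0) for m in (170, 195, 215, 200, 175)]
print(best_focus_stroke(strokes, mtf_upper_left))  # 0.1 (mm)
```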
S1022, the second detection device confirms the optical axis inclination angle of the optical lens according to the relation graph.
It can be understood that the relationship graphs are marked with the coordinates at which the MTF value is maximum at the four positions (upper left, upper right, lower left and lower right) of the module test image.
Based on the characteristic that, within the same stroke range, the voice coil motor travels the same distance when the optical lens focuses on objects at the same distance, the optical axis tilt angle of the optical lens can be calculated from the coordinates at which the MTF value is maximum at the four positions.
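The following sketch illustrates one way such a tilt could be estimated from the best-focus strokes; the geometric assumption here, namely that the tilt about one axis equals the arctangent of the stroke difference between two opposite field positions divided by their lateral separation on the image sensor, is an illustrative simplification rather than the method prescribed by this application:

```python
import math

def optical_axis_tilt_deg(stroke_left_mm: float, stroke_right_mm: float,
                          field_separation_mm: float) -> float:
    """Estimate the tilt angle about one axis from the best-focus strokes of
    two opposite field positions separated by field_separation_mm on the sensor."""
    return math.degrees(math.atan((stroke_right_mm - stroke_left_mm) / field_separation_mm))

# Example: upper-left focuses at 0.10 mm, upper-right at 0.13 mm, corners 6 mm apart
print(round(optical_axis_tilt_deg(0.10, 0.13, 6.0), 3))  # about 0.286 degrees
```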
S103, the assembling device mounts the optical lens at the mounting port according to the displacement offset angle and the optical axis tilt angle so that the central axis of the mounting port coincides with the optical axis of the optical lens.
In some embodiments, the assembling device may include: one or more mechanical arms that can adjust the assembly direction and the mounting angle between the optical lens and the mounting port. The assembling device can adjust the assembly direction of the optical lens relative to the mounting port according to the displacement offset angle and the optical axis tilt angle so that the central axis of the mounting port coincides with the optical axis of the optical lens.
The embodiment of the application detects the optical axis tilt angle of the optical lens and the displacement offset angle of the carrier (i.e., the displacement offset angle of the mover portion). When the optical lens is assembled with the carrier, it is assembled according to the optical axis tilt angle of the optical lens and the displacement offset angle of the carrier so that the central axis of the mounting port coincides with the optical axis of the optical lens. Therefore, the central axis of the carrier mounting port and the optical axis of the optical lens remain coincident as the carrier moves with the mover portion, which ensures stable resolving power of the camera module over any voice coil motor stroke, improves the sharpness of captured photos, and improves the user experience.
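Purely as a hedged illustration, the correction applied by the assembling device might combine the two measured angles per axis as below; the sign convention and the simple subtraction rule are assumptions made for the sketch and are not prescribed by this application:

```python
def mounting_correction_deg(carrier_offset_x, carrier_offset_y,
                            lens_tilt_x, lens_tilt_y):
    """Per-axis tilt (degrees) to apply to the optical lens before curing,
    under the assumed rule that the correction is the difference between the
    carrier displacement offset angle and the lens optical-axis tilt angle."""
    return (carrier_offset_x - lens_tilt_x, carrier_offset_y - lens_tilt_y)

# Hypothetical values: carrier drifts 0.8 deg / 0.1 deg, lens tilts 0.3 deg / -0.2 deg
print(mounting_correction_deg(0.8, 0.1, 0.3, -0.2))  # approximately (0.5, 0.3)
```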
Referring to fig. 19, in some embodiments, before the step S103, the method of assembling the camera module may further include:
S103A, dispensing and fixing the optical lens and the mounting opening by the dispensing device.
The dispensing device can dispense glue on the circumferential side wall surface of the mounting opening, so that the optical lens is convenient to be connected with the mounting opening.
With continued reference to fig. 19, in some embodiments, after the step S103, the method of assembling the camera module may further include:
S104, a UV curing machine cures the connection glue so as to connect the optical lens with the mounting port.
In this way, the photoinitiator in the connecting glue can generate active free radicals after absorbing high-intensity ultraviolet light, so that polymerization, crosslinking and grafting reactions are initiated, the connecting glue is converted from a liquid state to a solid state within a few seconds, and the production efficiency of the camera module is improved.
It will be apparent to those skilled in the art from this description that, for convenience and brevity of description, only the above-described division of the functional modules is illustrated, and in practical application, the above-described functional allocation may be performed by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a readable storage medium. Based on such understanding, the technical solution of the embodiments of the present application may be essentially or a part contributing to the prior art or all or part of the technical solution may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a device (may be a single-chip microcomputer, a chip or the like) or a processor (processor) to perform all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read Only Memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely a specific embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered in the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (13)
1. An assembling method of a camera module, the camera module comprising: an automatic focusing driving assembly, a seat body, a carrier and an optical lens, wherein the automatic focusing driving assembly is connected between the seat body and the carrier, and a mounting port is formed in the carrier and used for mounting the optical lens, characterized in that the assembling method comprises the following steps:
acquiring a displacement offset angle of the carrier, wherein the displacement offset angle is an included angle between the central axis of the mounting port and the moving direction of the carrier driven by the automatic focusing driving assembly;
acquiring an optical axis inclination angle of the optical lens, wherein the optical axis inclination angle is an included angle between a central axis of the optical lens and an optical axis of the optical lens;
and mounting the optical lens at the mounting port according to the displacement offset angle and the optical axis inclination angle so that the central axis of the mounting port coincides with the optical axis of the optical lens.
2. The method of assembling of claim 1, wherein said obtaining a displacement offset angle of said carrier comprises:
controlling the automatic focusing driving assembly to drive the carrier to move a preset distance relative to the seat body;
Obtaining a moving distance of the carrier on a first reference plane, wherein the first reference plane is perpendicular to a central axis of the mounting port;
obtaining a displacement offset angle of the carrier according to the preset distance and the moving distance, wherein θ = arcsin(H/L), θ is the displacement offset angle of the carrier, H is the moving distance, and L is the preset distance.
3. The method of assembling of claim 2, wherein said obtaining a distance of movement of the carrier in the first reference plane comprises:
acquiring an initial projection position of the carrier on the first reference plane when the carrier is at the initial position;
acquiring a target projection position of the carrier on the first reference plane after moving a preset distance;
and obtaining the moving distance of the carrier on the first reference plane according to the initial projection position and the target projection position.
4. A method of assembling as claimed in claim 2 or 3, wherein said controlling the automatic focusing driving assembly to drive the carrier to move the preset distance relative to the seat body comprises:
acquiring an initial position and a current position of the carrier;
determining an actual moving distance of the carrier according to the initial position and the current position of the carrier;
Judging whether the actual moving distance is equal to the preset distance according to the actual moving distance of the carrier and the preset distance;
if yes, the automatic focusing driving assembly is controlled to stop driving the carrier to move.
5. A method of assembling as claimed in claim 2 or 3, wherein the preset distance is the distance from the maximum travel position of the carrier in a direction away from the seat body to the initial position of the carrier.
6. A method of assembling as claimed in any one of claims 1 to 3, wherein said obtaining an optical axis tilt angle of the optical lens comprises:
acquiring a relation graph of the moving distance of the automatic focusing driving assembly and the MTF value of the modulation transfer function by performing defocusing test processing on the optical lens;
and confirming the optical axis inclination angle of the optical lens according to the relation graph.
7. A method of assembling according to any one of claims 1 to 3, wherein before said mounting of said optical lens at said mounting port according to said displacement offset angle and said optical axis tilt angle, said assembling method further comprises:
dispensing and fixing the optical lens and the mounting port;
After the optical lens is mounted at the mounting port according to the displacement offset angle and the optical axis tilt angle, the assembling method further includes:
and curing the connecting glue to connect the optical lens with the mounting port.
8. An assembling apparatus of a camera module for performing the assembling method of the camera module according to any one of claims 1 to 7, comprising:
the first detection device is used for detecting a displacement offset angle of the carrier, wherein the displacement offset angle is an included angle between the central axis of the mounting port and the moving direction in which the carrier is driven to move by the automatic focusing driving assembly;
the second detection device is used for detecting the optical axis inclination angle of the optical lens, and the optical axis inclination angle is an included angle between the central axis of the optical lens and the optical axis of the optical lens;
and the assembling device is used for installing the optical lens on the mounting opening according to the displacement offset angle and the optical axis inclination angle so that the central axis of the mounting opening coincides with the optical axis of the optical lens.
9. The assembly apparatus of claim 8, wherein the first detection device comprises:
The first control unit is used for controlling the automatic focusing driving assembly to drive the carrier to move a preset distance relative to the base;
the first detection unit is used for acquiring the moving distance of the carrier on a first reference plane, and the first reference plane is perpendicular to the central axis of the mounting port;
and the calculating unit is used for obtaining the displacement offset angle of the carrier according to the preset distance and the moving distance.
10. The assembly device of claim 9, wherein the first control unit comprises:
the first image sensor is used for acquiring the initial position and the current position of the carrier;
the first distance measuring device is used for determining the actual moving distance of the carrier according to the current position and the initial position of the carrier;
the first controller is used for judging whether the actual moving distance of the carrier is equal to the preset distance of the carrier according to the actual moving distance and the preset distance; if yes, the first controller controls the automatic focusing driving assembly to stop driving the carrier to move.
11. Assembly device according to claim 9 or 10, characterized in that the first detection unit comprises:
A second image sensor for acquiring an initial projection position of the carrier on the first reference plane when the carrier is at the initial position, and acquiring a target projection position of the carrier on the first reference plane after the carrier is moved by a preset distance;
and the second distance measuring device is used for obtaining the moving distance of the carrier on the first reference plane according to the initial projection position and the target projection position.
12. The apparatus according to any one of claims 8 to 10, wherein the second detecting means is further configured to acquire a graph of a relation between a moving distance of the autofocus drive assembly and an MTF value by performing defocus test processing on the optical lens, and confirm an optical axis tilt angle of the optical lens based on the graph.
13. The assembly device according to any one of claims 8-10, further comprising:
and the dispensing device is also used for dispensing and fixing the optical lens and the mounting opening.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311673449.XA CN117395483B (en) | 2023-12-07 | 2023-12-07 | Assembling method and equipment for camera module |
Publications (2)
Publication Number | Publication Date |
---|---|
CN117395483A true CN117395483A (en) | 2024-01-12 |
CN117395483B CN117395483B (en) | 2024-05-14 |
Family
ID=89465106
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311673449.XA Active CN117395483B (en) | 2023-12-07 | 2023-12-07 | Assembling method and equipment for camera module |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117395483B (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111619829A (en) * | 2020-05-11 | 2020-09-04 | 北京控制工程研究所 | Multistage cooperative control method based on active pointing hyperstatic platform |
CN112543321A (en) * | 2019-09-23 | 2021-03-23 | 宁波舜宇光电信息有限公司 | Position compensation detection and correction method, camera module and manufacturing method thereof |
JPWO2021171412A1 (en) * | 2020-02-26 | 2021-09-02 | ||
CN114384659A (en) * | 2020-10-16 | 2022-04-22 | 宁波舜宇光电信息有限公司 | Lens assembly, assembling method thereof and camera module comprising lens assembly |
CN115716190A (en) * | 2021-08-27 | 2023-02-28 | 宁波舜宇光电信息有限公司 | Assembling method and equipment of multi-group lens group and assembling method of camera module |
Also Published As
Publication number | Publication date |
---|---|
CN117395483B (en) | 2024-05-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9927223B2 (en) | Distance image acquisition apparatus and distance image acquisition method | |
EP3322174A2 (en) | Electronic device shooting image | |
CN106550181B (en) | Camera module and terminal equipment | |
CN111901524B (en) | Focusing method and device and electronic equipment | |
CN208028980U (en) | A kind of camera module and electronic equipment | |
CN112840634B (en) | Electronic device and method for obtaining image | |
CN111123617B (en) | Camera module including aperture | |
KR20140140855A (en) | Method and Apparatus for controlling Auto Focus of an photographing device | |
CN107529018B (en) | Flash lamp, electronic device with flash lamp and flash lamp control method | |
CN102207674A (en) | Panorama image shooting apparatus and method | |
JP2009267792A (en) | Imaging apparatus | |
EP3352453B1 (en) | Photographing method for intelligent flight device and intelligent flight device | |
EP3718296B1 (en) | Electronic device and method for controlling autofocus of camera | |
CN113647094A (en) | Electronic device, method, and computer-readable medium for providing out-of-focus imaging effects in video | |
KR102668233B1 (en) | Electronic device for obtaining images by controlling frame rate for external object moving through point ofinterest and operating method thereof | |
CN104571135A (en) | Cloud deck tracking photography system and cloud deck tracking photography method | |
US20160323499A1 (en) | Method and apparatus for forming images and electronic equipment | |
CN104580904A (en) | Method and device for determining rotating camera position | |
CN117395483B (en) | Assembling method and equipment for camera module | |
US11598929B2 (en) | Terminal device, lens adjustment method and computer-readable storage medium | |
CN110602381B (en) | Depth of field detection method and device, storage medium and terminal | |
CN106483740B (en) | Projector focusing method and device | |
CN113709353B (en) | Image acquisition method and device | |
CN114731362A (en) | Electronic device including camera and method thereof | |
CN116719202B (en) | Target tracking electronic equipment, terminal equipment and target tracking system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||