CN111510631A - Rapid automatic focusing method and focusing device - Google Patents

Rapid automatic focusing method and focusing device

Info

Publication number
CN111510631A
Authority
CN
China
Prior art keywords
aperture
image
focusing
module
value
Prior art date
Legal status
Pending
Application number
CN202010345982.3A
Other languages
Chinese (zh)
Inventor
范国海
胡文锐
薛晓利
吕太洪
严文娟
Current Assignee
Chengdu National Railways Electrical Equipment Co ltd
Original Assignee
Chengdu National Railways Electrical Equipment Co ltd
Priority date
Filing date
Publication date
Application filed by Chengdu National Railways Electrical Equipment Co ltd
Priority to CN202010345982.3A
Publication of CN111510631A
Legal status: Pending

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42Determining position
    • G01S19/45Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • G06T1/20Processor architectures; Processor configuration, e.g. pipelining
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/73Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30168Image quality inspection

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automatic Focus Adjustment (AREA)

Abstract

The invention discloses a rapid automatic focusing device, which comprises an aperture stepping motor, a focal length stepping motor, a motor driving module, a graphic processing unit, a positioning unit, a communication device, an image recognition unit, a power supply module, a display module and a data processing unit. The aperture stepping motor and the focal length stepping motor are respectively connected with the motor driving module; the motor driving module, the graphic processing unit, the positioning unit, the communication device, the image recognition unit and the power supply module are respectively connected with the data processing unit. With the method and the device, the aperture value and the focal length value can be set dynamically in various scenes, ensuring a good imaging effect.

Description

Rapid automatic focusing method and focusing device
Technical Field
The invention relates to the field of portable railway detection equipment, in particular to a rapid automatic focusing method and a focusing device.
Background
With the large-scale speed increases of railways and the development of high-speed railways and passenger dedicated lines in China, the contact network occupies a very important position in electrified railways. Improving the operational reliability of the contact network has become an important link in safe railway transportation.
The contact network safety inspection device is used for detecting the state of the contact network in a specified section. It adopts a portable image acquisition device that is temporarily installed at the driver's station in the cab of an electric locomotive or motor train unit in operation and acquires images of the contact network and its surroundings. The technical state of the contact suspension components, the external environment around the contact network line, the presence of intrusions into the clearance gauge, obstacles hindering the safe operation of the power supply, and the like are then analyzed online or offline.
During operation of existing contact network safety inspection devices, the illumination intensity differs greatly across complex environments (cloudy days, rainy days, sunny days, normal light and backlight) and complex scenes (inside tunnels, outside tunnels, bridges and the like). The contact network inspection devices currently used by railway departments adjust the aperture and focal length manually. On a fast-moving vehicle, this makes it difficult to adjust the parameters into place within a short time. Alternatively, the aperture value and focal length are fixed before departure, ignoring the various scenes and weather conditions. As a result, the quality of the acquired images cannot be guaranteed: images become overexposed and unclear, the difficulty of later defect identification increases, and major defects or faults may even be missed.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a rapid automatic focusing device, which comprises an aperture stepping motor, a focal length stepping motor, a motor driving module, a graphic processing unit, a positioning unit, a communication device, an image recognition unit, a power supply module, a display module and a data processing unit; the aperture stepping motor and the focal length stepping motor are respectively connected with the motor driving module; the motor driving module, the graphic processing unit, the positioning unit, the communication device, the image recognition unit and the power supply module are respectively connected with the data processing unit.
Preferably, the motor driving module comprises a motor driver and a U-shaped optical coupler; the motor driver and the U-shaped optical coupler are respectively connected with the data processing unit, and the U-shaped optical coupler is used for monitoring the respective movement speed and movement direction of the aperture stepping motor and the focal length stepping motor.
Preferably, the positioning unit comprises a GPS module and an A-GPS module; the A-GPS module and the GPS module are respectively connected with the data processing unit.
Preferably, the graphic processing unit includes an analysis processing module, and the analysis processing module is configured to calculate the gray value of the image and the edge sharpness of the image, and to determine whether the image is clear.
A fast automatic focusing method comprises the following steps:
Step one: acquiring target image data with a camera, judging by means of the analysis processing module and a graphic algorithm whether the input target image is in clear focus, and, if not, calculating the edge sharpness of the image;
Step two: with the relative position of the camera and the target unchanged, acquiring n target images by focusing according to the edge sharpness, and calculating the relative blur rate of each target image; the focal length of the target image with the minimum relative blur rate is the focal length between the camera and the target, thereby completing focusing;
Step three: calculating the gray value of the input target image by means of the analysis processing module and a graphic algorithm, judging whether the target image is overexposed or underexposed, and, if so, adjusting the aperture so that the gray value of the target image falls within the set reference value, thereby completing the aperture adjustment.
Further, the focusing comprises the following process: with the relative position of the camera and the target unchanged, the focusing step length is Δa and the focal length range is [a, b]; the target is imaged once for each focusing step, for a total of n images, each corresponding to a focal length f_k; the relative blur rates R_k (k = 1…n) of the n images are calculated with an image algorithm, the minimum blur rate R_min and its corresponding image are found, and the focal length f corresponding to that image is the focal length at which the camera images the target.
Further, the aperture adjustment comprises the following process: first, the angle and the aperture size are calibrated: the scale length from the smallest aperture to the largest aperture is measured, and the circumference of the lens is measured; from 360° / aperture angle = lens circumference / aperture scale length, the angle corresponding to the maximum aperture is obtained, and the lengths corresponding to the other aperture scale values are measured in the same way to obtain the angles corresponding to the different aperture values; finally, the correspondence between aperture values and angle values is obtained, and the required aperture can be set by adjusting the angle.
Further, the reference value setting comprises the following process: the brightness average value avg of the target image is obtained; the distance o(a) from avg to a is obtained, where o(a) is the brightness weight; the gray mean square value S of the whole image is obtained; the gray intervals in the image are counted and a weight L is calculated; the comprehensive weight o(a) + S + L is the reference value of the image gray level.
The invention has the following beneficial effects: by adding two stepping motors at the lens end and combining them with an image algorithm, the invention realizes dynamic setting of the aperture value and focal length value in various scenes, ensuring a good imaging effect.
The positioning module of the invention uses a GPS + AGPS module and combines a basic line database with an image recognition algorithm to achieve accurate positioning, ensuring that the aperture value and focal length can be adjusted under various conditions.
Drawings
FIG. 1 is a schematic diagram of a fast autofocus apparatus;
FIG. 2 is a schematic diagram of a method of fast auto-focusing;
FIG. 3 is a schematic diagram of automatic focal length focusing.
Detailed Description
The technical solutions of the present invention are further described in detail below with reference to the accompanying drawings, but the scope of the present invention is not limited to the following.
As shown in FIG. 1, the fast automatic focusing apparatus provided by the invention comprises an aperture stepping motor, a focal length stepping motor, a motor driving module, a graphic processing unit, a positioning unit, a communication device, an image recognition unit, a power supply module, a display module and a data processing unit; the aperture stepping motor and the focal length stepping motor are respectively connected with the motor driving module; the motor driving module, the graphic processing unit, the positioning unit, the communication device, the image recognition unit and the power supply module are respectively connected with the data processing unit.
The motor driving module comprises a motor driver and a U-shaped optical coupler; the motor driver and the U-shaped optical coupler are respectively connected with the data processing unit, and the U-shaped optical coupler is used for monitoring the respective movement speed and movement direction of the aperture stepping motor and the focal length stepping motor.
The positioning unit comprises a GPS module and an A-GPS module; the A-GPS module and the GPS module are respectively connected with the data processing unit.
The graphic processing unit comprises an analysis processing module, which is used to calculate the gray value of the image and the edge sharpness of the image and to judge whether the image is clear.
A fast automatic focusing method comprises the following steps:
Step one: acquiring target image data with a camera, judging by means of the analysis processing module and a graphic algorithm whether the input target image is in clear focus, and, if not, calculating the edge sharpness of the image;
Step two: with the relative position of the camera and the target unchanged, acquiring n target images by focusing according to the edge sharpness, and calculating the relative blur rate of each target image; the focal length of the target image with the minimum relative blur rate is the focal length between the camera and the target, thereby completing focusing;
Step three: calculating the gray value of the input target image by means of the analysis processing module and a graphic algorithm, judging whether the target image is overexposed or underexposed, and, if so, adjusting the aperture so that the gray value of the target image falls within the set reference value, thereby completing the aperture adjustment.
The focusing comprises the following process: with the relative position of the camera and the target unchanged, the focusing step length is Δa and the focal length range is [a, b]; the target is imaged once for each focusing step, for a total of n images, each corresponding to a focal length f_k; the relative blur rates R_k (k = 1…n) of the n images are calculated with an image algorithm, the minimum blur rate R_min and its corresponding image are found, and the focal length f corresponding to that image is the focal length at which the camera images the target.
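A minimal sketch (not the patent's own code) of the focal length sweep described above: the focal length is stepped through [a, b] with step Δa, the target is imaged once per step, and the focal length whose image has the minimum relative blur rate is kept. set_focus(), grab_frame() and blur_rate() are assumed helper callables for the camera and the blur metric.

```python
def autofocus_sweep(a, b, delta_a, set_focus, grab_frame, blur_rate):
    best_f, best_r = a, float("inf")
    f = a
    while f <= b:
        set_focus(f)               # drive the focal length stepping motor to f
        frame = grab_frame()       # image the target once at this focal length
        r = blur_rate(frame)       # relative blur rate R_k of this image
        if r < best_r:             # track the minimum blur rate R_min
            best_f, best_r = f, r
        f += delta_a
    set_focus(best_f)              # the focal length with minimum blur completes focusing
    return best_f
```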
The aperture adjustment comprises the following process: first, the angle and the aperture size are calibrated: the scale length from the smallest aperture to the largest aperture is measured, and the circumference of the lens is measured; from 360° / aperture angle = lens circumference / aperture scale length, the angle corresponding to the maximum aperture is obtained, and the lengths corresponding to the other aperture scale values are measured in the same way to obtain the angles corresponding to the different aperture values; finally, the correspondence between aperture values and angle values is obtained, and the required aperture can be set by adjusting the angle.
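A hedged sketch of the calibration relation above: 360 / angle = lens circumference / aperture scale length, i.e. angle = 360 × scale length / circumference. The circumference and scale lengths below are placeholder measurements, not values from the patent.

```python
def scale_length_to_angle(scale_length_mm, lens_circumference_mm):
    """Lens-ring rotation angle (degrees) for a measured aperture scale length."""
    return 360.0 * scale_length_mm / lens_circumference_mm

lens_circumference_mm = 180.0                      # assumed measured lens circumference
measured_lengths_mm = {16: 0.0, 11: 4.5, 8: 9.0,   # assumed lengths at each aperture scale value
                       5.6: 13.5, 4: 18.0, 2.8: 22.5}
aperture_to_angle = {f_number: scale_length_to_angle(length, lens_circumference_mm)
                     for f_number, length in measured_lengths_mm.items()}
```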
The reference value setting comprises the following process: the brightness average value avg of the target image is obtained; the distance o(a) from avg to a is obtained, where o(a) is the brightness weight; the gray mean square value S of the whole image is obtained; the gray intervals in the image are counted and a weight L is calculated; the comprehensive weight o(a) + S + L is the reference value of the image gray level.
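A heavily hedged sketch of the composite gray reference o(a) + S + L. The text does not define 'a' or the interval weight L precisely, so here 'a' is taken as a target brightness level and L as a histogram-spread weight; both interpretations are assumptions made only for illustration.

```python
import numpy as np

def gray_reference(gray_image, a=128.0):
    gray = gray_image.astype(np.float64)
    avg = gray.mean()                               # brightness average value avg
    o_a = abs(avg - a)                              # distance from avg to a (brightness weight)
    s = float(np.mean((gray - avg) ** 2))           # gray mean square value S
    hist, _ = np.histogram(gray, bins=16, range=(0, 256))
    l_weight = np.count_nonzero(hist) / hist.size   # assumed gray-interval weight L
    return o_a + s + l_weight                       # comprehensive weight o(a) + S + L
```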
The invention provides a portable vehicle-mounted device that automatically and dynamically adjusts the lens aperture and focal length according to different scenes and weather conditions. The whole device consists of a shooting device, a precise lens focal length and aperture control device, and a fixing device.
The main function of the lens end is to achieve accurate rotational adjustment and positioning of the aperture and object distance according to the given parameter information. To let the camera adjust automatically under different illumination conditions and obtain the clearest picture, two stepping motors are added at the lens end to adjust the aperture and the object distance respectively. The output of each stepping motor is an optocoupler count value, whereas the user works only with the aperture value and focal length value of the lens, so the correspondence between the stepping motor's optocoupler count value and the aperture value and focal length value must be found.
After the target object is determined, images taken with different aperture values and focal length values have different imaging effects. The GPU identifies the sharpness of the image edges to determine the image quality, and the optocoupler count values corresponding to the aperture value and focal length value that give the best image quality are then set automatically.
The GPS in the host is mainly used to connect with satellites and acquire the longitude and latitude of the current equipment. Combined with AGPS information and recognition of tunnel features in the image, entry into and exit from a tunnel can be detected, and parameters such as the aperture value and exposure time can be quickly reset to ensure good imaging quality.
The GPS + AGPS positioning module acquires positioning information in changing scenes, judges the picture quality in combination with an image algorithm, and resets the lens aperture, exposure time and other parameters to ensure sharp imaging.
Accurate positioning of the lens aperture and focus is realized by a stepping motor working with a reduction gearbox. After being connected to a stepping motor driving board, the two stepping motors are installed in a customized housing, and the housing is mounted on the lens part of the camera so that the stepping motor gears mesh with the camera lens. When a motor is driven, it accelerates from the starting speed to the specified speed according to the parameters in a serial port command. Meanwhile, each motor uses two U-shaped optocouplers to monitor its movement speed and direction; the phase difference between the two optocouplers is 90 degrees. The specific working principle is shown in FIG. 3.
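A minimal sketch of how two U-shaped optocoupler channels with a 90-degree phase difference can be decoded into a signed step count (direction) for one motor; the sampling interface is an assumption, not part of the patent hardware.

```python
class QuadratureDecoder:
    # (previous A, previous B, current A, current B) -> step of +1, -1 or 0
    _TABLE = {(0, 0, 1, 0): +1, (1, 0, 1, 1): +1, (1, 1, 0, 1): +1, (0, 1, 0, 0): +1,
              (0, 0, 0, 1): -1, (0, 1, 1, 1): -1, (1, 1, 1, 0): -1, (1, 0, 0, 0): -1}

    def __init__(self):
        self.prev_a, self.prev_b = 0, 0
        self.count = 0                      # signed optocoupler count value

    def update(self, a, b):
        """Feed one sample of channels A and B; returns -1, 0 or +1."""
        step = self._TABLE.get((self.prev_a, self.prev_b, a, b), 0)
        self.count += step                  # the sign of the steps gives the direction
        self.prev_a, self.prev_b = a, b
        return step
```

The movement speed can then be estimated from the change in the count per unit time.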
The automatic focusing function of the system consists of two functional modules: an analysis processing module and a control driving module.
The analysis processing module judges, through a pure image algorithm, whether the input image is in clear focus; if not, the module also calculates the degree of defocus of the image.
the relative position of the camera and the target is unchanged, the focusing step length is delta _ a, the focal distance range is [ a, b ], the target is imaged once every time the target is focused once, the total imaging is set to be n times, each imaging corresponds to one focal distance fn, the relative blurring rate Rk (k =1.. n) of n images is calculated by using an image algorithm, the minimum blurring rate Rmin and the corresponding image with the minimum blurring rate are calculated, and finally the focal distance f corresponding to the image is obtained, namely the focal distance which is the maximum imaging of the camera.
Algorithm idea of the blur rate R_k (k = 1…n):
An appropriate ROI is taken at the focusing target, a Laplace calculation is performed on the ROI image, threshold filtering is applied, and the number n of edge points is counted. The number m of pixels in the ROI is calculated; then 1 - n/m represents the blur rate R_k.
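A sketch of the blur rate idea described above (ROI + Laplace + threshold + edge-point ratio); the threshold value is an assumption chosen only for illustration.

```python
import cv2
import numpy as np

def blur_rate(gray_roi, edge_threshold=30.0):
    """Return 1 - n/m for a grayscale ROI: n = Laplacian edge points, m = ROI pixels."""
    lap = cv2.Laplacian(gray_roi, cv2.CV_64F)   # Laplace calculation on the ROI image
    edges = np.abs(lap) > edge_threshold        # threshold filtering
    n = int(np.count_nonzero(edges))            # number of edge points n
    m = gray_roi.size                           # number of ROI pixels m
    return 1.0 - n / m
```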
The analysis processing module also calculates the gray value of the input image with a pure image algorithm and judges whether the image is overexposed, underexposed, and so on. This information is fed back to the single-chip microcomputer control module, and the control driving module resets the aperture value accordingly, so that the target image finally reaches a clear imaging effect.
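A hypothetical sketch of the exposure feedback: the mean gray value is compared against a reference band and a one-step aperture correction is suggested. The tolerance value is an illustrative assumption.

```python
import numpy as np

def exposure_correction(gray_image, reference, tolerance=15.0):
    """+1 = open aperture (underexposed), -1 = close aperture (overexposed), 0 = acceptable."""
    mean_gray = float(np.mean(gray_image))
    if mean_gray < reference - tolerance:
        return +1        # underexposed: open the aperture one step
    if mean_gray > reference + tolerance:
        return -1        # overexposed: close the aperture one step
    return 0             # gray value falls within the set reference value
```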
An accurately focused image has higher definition, distinct layers and high contrast. The main idea of the system is to consider the neighborhood contrast of the image, i.e. the gradient difference of the gray levels between adjacent pixels; given the characteristics of the railway industry, a variance method is adopted. A sharply focused image has larger gray-level differences between its data than a blurred one, i.e. a larger variance, so image sharpness is measured by the variance of the image gray data: the larger the variance, the better the sharpness.
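A minimal sketch of the variance-based sharpness measure: the larger the gray-level variance, the sharper the image is taken to be.

```python
import numpy as np

def variance_sharpness(gray_image):
    """Variance of the image gray data; larger values indicate better focus."""
    gray = gray_image.astype(np.float64)
    return float(np.mean((gray - gray.mean()) ** 2))
```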
The stepping motor automatically adjusts the size of the lens diaphragm, and the diaphragm is calibrated for angle and size in advance.
The calibration idea is as follows:
the aperture is manually adjusted to the minimum value and calibrated, and then the aperture is adjusted to the maximum value, and the angle is measured. One angle per scale was tested. And finally finding out the corresponding relation between the aperture value and the angle value. And archiving the corresponding relation of all the numerical values.
In practical application, a reference table of the correspondence between aperture size and angle is provided. During use, the customer sets the aperture size according to the weather conditions simply by clicking the adjustment angle on the program interface; the angle information is transmitted to the lens single-chip microcomputer program, converted into an optocoupler step length, and the aperture is automatically adjusted to the specified position to acquire an image.
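A hedged sketch of converting a requested ring angle into an optocoupler step length for the aperture stepping motor; steps_per_revolution is an assumed calibration constant of the motor and gearbox, and the motor_driver call is hypothetical, neither given in the patent.

```python
def angle_to_steps(angle_deg, steps_per_revolution=4096):
    """Optocoupler step count corresponding to a lens-ring rotation angle."""
    return round(angle_deg / 360.0 * steps_per_revolution)

# Usage (hypothetical): look up the calibrated angle for the requested aperture
# value, convert it to a step length, and send it to the motor driver.
# steps = angle_to_steps(aperture_to_angle[8])
# motor_driver.move_aperture(steps)
```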
The acquired images are transmitted to a computing module for image brightness identification, and a recommendation is finally given to the user, based on the brightness, on whether later-stage images can be acquired. If not, the previous steps are repeated and the aperture is set again until its size is suitable.
To obtain a good image effect inside a tunnel, the aperture size needs to be reset quickly when entering the tunnel. However, GPS positioning is poor inside the tunnel environment. In order to accurately identify which tunnel the equipment is entering or leaving, an AGPS (Assisted Global Positioning System) auxiliary positioning function is added, and accurate positioning is achieved in combination with basic line data.
AGPS is an assisted global positioning system. It combines GSM/GPRS with traditional satellite positioning and uses base stations to transmit auxiliary satellite information, reducing the time the GPS chip needs to acquire satellite signals; the base stations can also compensate for blocked indoor signals and reduce the GPS chip's dependence on the satellites. It can therefore provide wider-coverage and faster location service in the poor signal environment encountered when entering a tunnel, with an error generally within 10 meters. The system adopts a hardware module with the AGPS function and greatly improves signal reception by means of an external 4G antenna. When the device is in a tunnel, it can be preliminarily positioned and the tunnel features can be identified from the images; the aperture value is then automatically set and adjusted to the optimal state in combination with an image recognition algorithm.
The foregoing is illustrative of the preferred embodiments of this invention, and it is to be understood that the invention is not limited to the precise form disclosed herein; various other combinations, modifications, and environments may be resorted to that fall within the scope of the concept as disclosed herein, either as described above or as apparent to those skilled in the relevant art. Modifications and variations may be effected by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (8)

1. A rapid automatic focusing device is characterized by comprising an aperture stepping motor, a focal length stepping motor, a motor driving module, a graphic processing unit, a positioning unit, a communication device, an image recognition unit, a power supply module, a display module and a data processing unit; the aperture stepping motor and the focal length stepping motor are respectively connected with the motor driving module; the motor driving module, the graphic processing unit, the positioning unit, the communication device, the image recognition unit and the power supply module are respectively connected with the data processing unit.
2. The fast automatic focusing device of claim 1, wherein the motor driving module comprises a motor driver and a U-shaped optical coupler, the motor driver and the U-shaped optical coupler are respectively connected with the data processing unit, and the U-shaped optical coupler is used for monitoring the respective movement speed and movement direction of the aperture stepping motor and the focus stepping motor.
3. The fast autofocus apparatus of claim 1, wherein the positioning unit comprises a GPS module, an a-GPS module; the A-GPS module and the GPS module are respectively connected with the data processing unit.
4. The apparatus of claim 1, wherein the graphic processing unit comprises an analysis processing module, and the analysis processing module is configured to calculate a gray level of the image, an edge sharpness of the image, and determine whether the image is sharp.
5. A fast automatic focusing method is characterized by comprising the following steps:
Step one: acquiring target image data with a camera, calculating the edge sharpness of the image data by a gradient method or a variance method, judging whether the input target image is in clear focus, and repeating the acquisition and calculation if no clear image is found;
Step two: with the relative position of the camera and the target unchanged, acquiring n target images by focusing according to the edge sharpness, and calculating the relative blur rate of each target image; the focal length of the target image with the minimum relative blur rate is the focal length between the camera and the target, and focusing is completed according to this focal length;
step three: and calculating the gray value of the input target image through the analysis processing module, judging whether the target image is overexposed or underexposed, and if the target image is overexposed or underexposed, adjusting the aperture to enable the gray value of the target image to fall into a set reference value to finish aperture adjustment.
6. The fast auto-focusing method according to claim 5, wherein the focusing comprises the following process: with the relative position of the camera and the target unchanged, the focusing step length is Δa and the focal length range is [a, b], where a is the initial value of the focusing range and b is the final value of the focusing range; the target is imaged once for each focusing step, for a total of n images, each corresponding to a focal length f_k; the relative blur rates R_k (k = 1…n) of the n images are calculated, the minimum blur rate R_min and its corresponding image are obtained, and the focal length f corresponding to that image is the focal length at which the camera images the target.
7. The fast auto-focusing method according to claim 5, wherein the aperture adjustment comprises the following steps: first, the angle and the aperture size are calibrated: the scale length from the smallest aperture to the largest aperture is obtained, and the circumference of the lens is obtained; from 360° / aperture angle = lens circumference / aperture scale length, the angle corresponding to the maximum aperture is obtained, and the lengths corresponding to the other aperture scale values are obtained in the same way, giving the angles corresponding to the different aperture values; finally, the correspondence between aperture values and angle values is obtained, and the required aperture can be set by adjusting the angle.
8. The fast automatic focusing method according to claim 5, wherein the reference value setting comprises the following steps: calculating the brightness average value avg of the target image; calculating the distance o(a) from avg to a, where o(a) is the brightness weight; calculating the gray mean square value S of the whole image; counting the gray intervals in the image and calculating a weight L; the comprehensive weight o(a) + S + L is the reference value of the image gray level.
CN202010345982.3A 2020-04-27 2020-04-27 Rapid automatic focusing method and focusing device Pending CN111510631A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010345982.3A CN111510631A (en) 2020-04-27 2020-04-27 Rapid automatic focusing method and focusing device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010345982.3A CN111510631A (en) 2020-04-27 2020-04-27 Rapid automatic focusing method and focusing device

Publications (1)

Publication Number Publication Date
CN111510631A true CN111510631A (en) 2020-08-07

Family

ID=71873317

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010345982.3A Pending CN111510631A (en) 2020-04-27 2020-04-27 Rapid automatic focusing method and focusing device

Country Status (1)

Country Link
CN (1) CN111510631A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112461853A (en) * 2020-10-30 2021-03-09 珠海市奥德维科技有限公司 Automatic focusing method and system
CN112730449A (en) * 2020-12-16 2021-04-30 上海辛玮智能科技有限公司 Optical method for detecting three-dimensional surrounding view of automatic focusing liquid crystal module
CN113810616A (en) * 2021-09-27 2021-12-17 季华实验室 Aperture focal length adjusting method, system, electronic device and storage medium
CN114666487A (en) * 2020-12-22 2022-06-24 余姚舜宇智能光学技术有限公司 Focusing positioning method and system and electronic equipment

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101232583A (en) * 2008-01-31 2008-07-30 北京航空航天大学 Method for self-adaptive regulating camera aperture
CN101266389A (en) * 2008-04-15 2008-09-17 太原理工天成电子信息技术有限公司 Automatic focusing control device
CN102025910A (en) * 2009-09-22 2011-04-20 华晶科技股份有限公司 Continuous focusing method for digital camera
US20110243539A1 (en) * 2010-03-31 2011-10-06 Canon Kabushiki Kaisha Autofocusing zoom lens
CN102981347A (en) * 2012-12-25 2013-03-20 中国科学院长春光学精密机械与物理研究所 Automatic SUSAN focusing method for video monitoring system
CN103079032A (en) * 2012-11-16 2013-05-01 上海培清科技有限公司 High-definition capillary electrophoresis gel imaging system
CN105450921A (en) * 2014-08-22 2016-03-30 宏碁股份有限公司 Image-acquiring device and automatic focusing compensation method thereof
CN108712616A (en) * 2018-08-14 2018-10-26 浙江大丰实业股份有限公司 Sidelight cage sidelight testing agency

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101232583A (en) * 2008-01-31 2008-07-30 北京航空航天大学 Method for self-adaptive regulating camera aperture
CN101266389A (en) * 2008-04-15 2008-09-17 太原理工天成电子信息技术有限公司 Automatic focusing control device
CN102025910A (en) * 2009-09-22 2011-04-20 华晶科技股份有限公司 Continuous focusing method for digital camera
US20110243539A1 (en) * 2010-03-31 2011-10-06 Canon Kabushiki Kaisha Autofocusing zoom lens
CN103079032A (en) * 2012-11-16 2013-05-01 上海培清科技有限公司 High-definition capillary electrophoresis gel imaging system
CN102981347A (en) * 2012-12-25 2013-03-20 中国科学院长春光学精密机械与物理研究所 Automatic SUSAN focusing method for video monitoring system
CN105450921A (en) * 2014-08-22 2016-03-30 宏碁股份有限公司 Image-acquiring device and automatic focusing compensation method thereof
CN108712616A (en) * 2018-08-14 2018-10-26 浙江大丰实业股份有限公司 Sidelight cage sidelight testing agency

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112461853A (en) * 2020-10-30 2021-03-09 珠海市奥德维科技有限公司 Automatic focusing method and system
CN112461853B (en) * 2020-10-30 2021-07-27 珠海市奥德维科技有限公司 Automatic focusing method and system
CN112730449A (en) * 2020-12-16 2021-04-30 上海辛玮智能科技有限公司 Optical method for detecting three-dimensional surrounding view of automatic focusing liquid crystal module
CN112730449B (en) * 2020-12-16 2023-07-14 上海辛玮智能科技有限公司 Microscopic three-dimensional detection optical method for auto-focusing liquid crystal module
CN114666487A (en) * 2020-12-22 2022-06-24 余姚舜宇智能光学技术有限公司 Focusing positioning method and system and electronic equipment
CN114666487B (en) * 2020-12-22 2024-04-09 余姚舜宇智能光学技术有限公司 Focusing and positioning method and system and electronic equipment
CN113810616A (en) * 2021-09-27 2021-12-17 季华实验室 Aperture focal length adjusting method, system, electronic device and storage medium

Similar Documents

Publication Publication Date Title
CN111510631A (en) Rapid automatic focusing method and focusing device
CN107449589B (en) HUD imaging position detection device and HUD imaging position detection method
CN111508260A (en) Vehicle parking space detection method, device and system
CN112866673B (en) Image acquisition equipment, camera control method and device and train image acquisition system
CN109714535A (en) A kind of auto-focusing machine vision metrology device and method based on color difference
CN105953741B (en) System and method for measuring local geometric deformation of steel structure
CN111830470B (en) Combined calibration method and device, target object detection method, system and device
CN114594533A (en) Video rainfall monitoring method and device based on self-adaptive Gaussian mixture algorithm
CN111147764A (en) Leaky coaxial cable image acquisition method and system based on real-time image recognition
CN115604550A (en) High-speed accurate fixed-distance photographing device and method
US7339147B2 (en) System and method for automatic focusing of images
CN108305290B (en) Accurate spectrum measuring method for moving target
CN113743286A (en) Target monitoring system and method for multi-source signal fusion
CN106959378B (en) Single width motion blur image speed calculation method
CN105225254B (en) A kind of exposure method and system of automatic tracing localized target
CN111273134A (en) Ultraviolet and short wave infrared band fused corona monitoring device
CN115372992A (en) Panoramic single-photon laser radar system and target imaging method
KR102210571B1 (en) Bridge and tunnel safety diagnosis remote monitoring alarm method using GPS coordinates and mobile communication system
CN113029089A (en) Video image inner target distance estimation method based on electric power field auxiliary information
CN112097693A (en) Straightness measuring system and method based on unmanned aerial vehicle
CN209803848U (en) Integrated road tunnel variable-focus visual detection system
CN116337014B (en) Processing method of unmanned aerial vehicle aerial photogrammetry data
CN104864850B (en) Fast imaging type tunnel sky ratio measurement method
EP2423899A1 (en) Method for identifying reflecting objects subjected to variable lighting conditions and system for performing said method
CN117037007B (en) Aerial photographing type road illumination uniformity checking method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200807