CN116828280A - Image acquisition method and related equipment - Google Patents

Image acquisition method and related equipment

Info

Publication number
CN116828280A
CN116828280A CN202310796158.3A CN202310796158A
Authority
CN
China
Prior art keywords
light source
target
camera
distance
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310796158.3A
Other languages
Chinese (zh)
Inventor
徐景硕
于娜
李铭
张博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Huaray Technology Co Ltd
Original Assignee
Zhejiang Huaray Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Huaray Technology Co Ltd filed Critical Zhejiang Huaray Technology Co Ltd
Priority to CN202310796158.3A priority Critical patent/CN116828280A/en
Publication of CN116828280A publication Critical patent/CN116828280A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/671Focus control based on electronic image sensor signals in combination with active ranging signals, e.g. using light or sound signals emitted toward objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95Computational photography systems, e.g. light-field imaging systems
    • H04N23/958Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging
    • H04N23/959Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging by adjusting depth of field during image capture, e.g. maximising or setting range based on scene characteristics

Abstract

The application provides an image acquisition method and related equipment, which adjust the depth of field of a camera by adjusting the wavelength of the illuminating light source, so that clear images of objects can be acquired with a fast response and without any mechanism that moves the camera or changes the shape of the camera lens. An image acquisition method comprises: receiving a target distance between an object in a target area and a camera; selecting a target light source from a plurality of light sources according to the target distance; and controlling the target light source to be in an on state.

Description

Image acquisition method and related equipment
Technical Field
The application relates to the technical field of imaging, and discloses an image acquisition method and related equipment.
Background
In an image acquisition scene, the depth of field of a lens is generally determined by intrinsic factors such as working distance, aperture, focal length, and the sharpness criterion. When these factors do not change, the depth of field of the lens is fixed, which limits the scenes the lens can serve, reduces application flexibility, and makes it difficult to adapt the lens to the manufacturing field.
Disclosure of Invention
The application provides an image acquisition system, which adjusts the depth of field of a camera by adjusting the wavelength of the illuminating light source, so that clear images of objects can be acquired with a fast response and without any mechanism that moves the camera or changes the shape of the camera lens.
In a first aspect, an embodiment of the present application provides an image capturing method, which may be applied to an image capturing system. The image acquisition method may include:
receiving a target distance between an object in a target area and a camera;
selecting a target light source from a plurality of light sources according to the target distance;
and controlling the target light source to be in an on state.
In a possible implementation manner, the selecting a target light source from a plurality of light sources according to the target distance includes:
and selecting the light source corresponding to the target distance as the target light source according to the corresponding relation between the preset distance and the light source.
In a possible implementation manner, the selecting a target light source from a plurality of light sources according to the target distance includes:
and selecting a light source corresponding to the distance set to which the target distance belongs as the target light source according to the corresponding relation between the preset distance set and the light source.
In a possible embodiment, the method further comprises:
and sending a first indication signal to a camera, selecting one other light source after a preset time period, controlling the other light source to provide light with a corresponding wavelength for the target area, and sending a second indication signal to the camera, wherein the other light source is a light source except the target light source in the plurality of light sources.
In a second aspect, the embodiment of the application further provides an image acquisition system. The system may include:
a camera arranged at a designated position for collecting an image of an object in a target area;
a plurality of light sources corresponding to a plurality of wavelengths, each light source for providing light of its corresponding wavelength;
a light source control means for selecting one target light source from the plurality of light sources according to a target distance between the object and the camera; and controlling the target light source to be in an on state.
In a possible embodiment, the light source control device comprises a controller, and the controller is specifically configured to:
and selecting the light source corresponding to the target distance as the target light source according to the corresponding relation between the preset distance and the light source.
In a possible embodiment, the light source control device comprises a controller, and the controller is specifically configured to:
and selecting a light source corresponding to the distance set to which the target distance belongs as the target light source according to the corresponding relation between the preset distance set and the light source.
In a possible embodiment, the light source control device comprises a distance measuring element;
the ranging element is for measuring a distance between an object within the target area and the camera.
In a possible embodiment, the distance measuring element comprises at least one of the following sensors:
infrared distance measuring sensor, ultrasonic distance measuring sensor, laser distance measuring sensor.
In a possible embodiment, the light source control device includes a plurality of pressure sensors; the target area comprises a plurality of subareas, and all subareas are not overlapped;
the plurality of subareas are in one-to-one correspondence with the plurality of pressure sensors;
the pressure sensor is used for: measuring a pressure signal generated by the object on a subarea corresponding to the pressure sensor, and sending the measured pressure signal to the controller when the measured pressure signal is greater than a preset signal threshold;
the controller is further configured to: and determining the position of the subarea corresponding to the first pressure sensor as the position of the object based on the received pressure signal provided by the first pressure sensor.
In a possible embodiment, the camera comprises a lens and an image sensor;
the image sensor is used for collecting optical signals at the imaging surface of the lens and converting the optical signals into electric signals.
In one possible embodiment, the image acquisition system includes a single shot mode;
after the light source control device controls the target light source to be in an on state, the light source control device is configured to send a first indication signal to the camera;
the camera is configured to acquire an image of an object within the target area after receiving the first indication signal.
In one possible embodiment, the image acquisition system includes a burst mode;
the light source control device is configured to send a first indication signal to the camera after controlling the target light source to provide light with a corresponding wavelength to the target area, and after a preset time period, select one other light source and control the other light source to provide light with a corresponding wavelength to the target area, and send a second indication signal to the camera, wherein the other light source is a light source except the target light source in the plurality of light sources;
the camera is configured to acquire a first image of an object within the target area after receiving the first indication signal; and acquiring a second image of the object in the target area after receiving the second indication signal.
In a possible embodiment, the distance between the object and the camera is the distance between the object and the camera in the direction of the optical axis of the camera.
In a possible embodiment, the plurality of light sources includes at least two of:
a red light source, a green light source, and a blue light source.
In a possible embodiment, the wavelengths of the plurality of light sources are preconfigured. The focal lengths of the lens corresponding to the respective light sources are also preconfigured. In general, the longer the wavelength provided by a light source, the longer the focal length corresponding to that light source.
The depth of field and the focal length which can be supported by the image acquisition system conform to the following relation, wherein DOF represents the depth of field, f represents the focal length, F represents the lens relative aperture, d represents the focusing distance, and P represents the allowable diffuse spot size.
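A standard thin-lens depth-of-field approximation consistent with these symbol definitions is sketched below; it is assumed here for reference, and the exact expression used in the original filing may differ.

```latex
% Standard total depth-of-field approximation (assumed form):
%   DOF = depth of field, f = focal length, F = lens relative aperture (f-number),
%   d = focusing distance, P = allowable diffuse spot (circle of confusion) size.
\[
  \mathrm{DOF} \;=\; \frac{2\,F\,P\,d^{2}\,f^{2}}{f^{4} - F^{2}P^{2}d^{2}}
  \;\approx\; \frac{2\,F\,P\,d^{2}}{f^{2}} \qquad (f^{2} \gg F P d)
\]
```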
In some examples, the plurality of light sources may include a first light source and a second light source. The first light source may provide light of a first wavelength λ1, and the second light source may provide light of a second wavelength λ2. The focal length corresponding to the first wavelength λ1 is a first focal length f1, and the focal length corresponding to the second wavelength λ2 is a second focal length f2. When the first light source is in an on state and the second light source is in an off state, the first light source irradiates the target area with light of the first wavelength λ1; based on the relation between depth of field and focal length, the depth of field of the image acquisition system is recorded as DOF1. When the first light source is in an off state and the second light source is in an on state, the second light source irradiates the target area with light of the second wavelength λ2; based on the relation between depth of field and focal length, the depth of field of the image acquisition system is recorded as DOF2. In this example, the maximum depth of field that the image acquisition system may provide is DOF1 + DOF2.
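As a purely numerical illustration of the DOF1 + DOF2 statement, the short Python sketch below evaluates the relation assumed above for two hypothetical wavelength-dependent focal lengths and focusing distances; none of the values are taken from the patent.

```python
# Illustrative sketch only: the formula is the standard thin-lens DoF
# approximation assumed above, and every numeric value below is hypothetical,
# not taken from the patent.

def depth_of_field(f_mm: float, f_number: float, d_mm: float, p_mm: float) -> float:
    """DOF = 2*F*P*d^2*f^2 / (f^4 - F^2*P^2*d^2), all lengths in millimetres."""
    numerator = 2.0 * f_number * p_mm * d_mm ** 2 * f_mm ** 2
    denominator = f_mm ** 4 - (f_number * p_mm * d_mm) ** 2
    return numerator / denominator

F_NUMBER = 4.0   # lens relative aperture (hypothetical)
P_SPOT = 0.02    # allowable diffuse spot size in mm (hypothetical)

# Hypothetical focal lengths and focusing distances under the two wavelengths.
f1, d1 = 25.00, 300.0    # illumination at wavelength λ1
f2, d2 = 25.05, 320.0    # illumination at wavelength λ2

dof1 = depth_of_field(f1, F_NUMBER, d1, P_SPOT)
dof2 = depth_of_field(f2, F_NUMBER, d2, P_SPOT)
print(f"DOF1 ≈ {dof1:.1f} mm, DOF2 ≈ {dof2:.1f} mm, "
      f"combined upper bound ≈ {dof1 + dof2:.1f} mm")
```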
The embodiment of the application has the following beneficial effects:
the application provides an image acquisition method and related equipment, which can adjust the depth of field of a camera by adjusting the wavelength of a light source irradiating a target area, can acquire clear images of objects, has higher response speed, does not need a mechanism to move the camera or change the shape of a camera lens, and ensures that the image acquisition system provided by the application has stronger anti-vibration capability and supports more use times.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the application. The objectives and other advantages of the application will be realized and attained by the structure particularly pointed out in the written description and claims thereof as well as the appended drawings.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly described below, it will be apparent that the drawings in the following description are only some embodiments of the present application, and that other drawings can be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic structural diagram of an image acquisition system according to an embodiment of the present application;
fig. 2 is a schematic view of depth of field of an image acquisition system according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a target area according to an embodiment of the present application;
fig. 4a is a schematic structural diagram of an image acquisition system according to an embodiment of the present application;
fig. 4b is a schematic structural diagram of an image capturing system according to an embodiment of the present application;
fig. 4c is a schematic structural diagram of an image capturing system according to an embodiment of the present application;
fig. 4d is a schematic structural diagram of an image capturing system according to an embodiment of the present application;
fig. 5 is a schematic diagram of an image acquisition method according to an embodiment of the present application;
fig. 6 is a schematic diagram of an image acquisition method according to an embodiment of the present application;
fig. 7 is a schematic diagram of an image acquisition method according to an embodiment of the present application;
fig. 8 is a schematic diagram of an image acquisition method according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the technical solutions of the present application, but not all embodiments. All other embodiments, based on the embodiments described in the present document, which can be obtained by a person skilled in the art without any creative effort, are within the scope of protection of the technical solutions of the present application.
The object-space depth over which a clear image is formed along the optical axis of the lens is generally called the depth of field of the lens, and the corresponding range in image space is called the depth of focus. When the depth of field of the lens cannot meet the requirement of a scene, clear imaging at different object distances can be achieved by adjusting the actual focusing distance of the lens in a time-shared manner (time-shared focusing).
In the related art, one time-shared focusing approach adjusts the overall position of the lens: a voice coil motor, stepping motor, or similar mechanism drives the lens back and forth along the optical axis, and the amount of movement makes the new object plane conjugate with the original focal plane. Another approach adjusts the position of the image sensor: a stepping motor or similar mechanism drives the image sensor back and forth along the optical axis, and the amount of movement equals the change of image distance caused by the change of object distance. A further approach adjusts the focal length of the lens: a stepping motor controls the positions of certain lens groups inside the lens, changing the actual focal length so that object planes at different distances are conjugate with the same focal plane. In all of these approaches a stepping motor or voice coil motor drives the lens or lens groups, the response time is long (typically 0.1-1 s), and in addition the movement precision is low and the reliability is poor.
Yet another time-shared focusing approach adjusts the optical power of a lens element to change the focal length of the lens, for example by using, inside the lens or as an external attachment, an element with variable optical power. The element material can be a liquid or a film and exhibits the inverse piezoelectric effect, i.e. the shape of the material is affected by the magnitude of an externally applied voltage. The change in curvature changes the optical power of the element, so the focal length of the lens module can be adjusted by adjusting the applied voltage. In this kind of time-shared focusing, the shape change of the element still takes time, in particular the intermediate oscillation before it settles to a substantially stable state; the response time is generally still more than 25 ms, which makes it difficult to meet high-speed zooming requirements below 1 ms in some scenes.
In view of this, the embodiment of the application provides an image acquisition method and related equipment, which can change the depth of field with a shorter response time.
Fig. 1 illustrates an image acquisition system according to an exemplary embodiment. The image acquisition system may comprise a camera, a plurality of light sources and a light source control device. The light source control device can execute the image acquisition method provided by the embodiment of the application. The implementation process of the image acquisition method is described below in connection with an image acquisition system.
The image acquisition system is used for acquiring an image of an object in a target area. The camera is positioned at a designated location and can acquire images of objects within the target area. The position of the camera in the image acquisition system provided by the embodiment of the application is fixed and does not need to be moved. In the embodiment of the application, the object can be a person, a commodity, or a person or article bearing an identification. The article may be a product or semi-finished product in the field of industrial manufacturing. The commodity may be an inanimate or animate object. The identification may be a graphic code, a bar code, a combination of digits, and so on. Optionally, the image acquisition system may be deployed in an industrial manufacturing setting, such as a production line, where it may be used, for example, to identify the thickness of a product or semi-finished product. The embodiment of the present application is not particularly limited in this respect.
The image acquisition system comprises a plurality of light sources corresponding to a plurality of wavelengths; one light source corresponds to one wavelength, and each light source may provide light of its own wavelength to an object in the target area. The light source control device can control the working states of the plurality of light sources, where the working states include an on state and an off state. A light source may be a light source comprising light-emitting diodes (Light Emitting Diode, LED), or it may be a laser diode (Laser Diode, LD).
The light source control device can control any one of the light sources to be turned on or off. According to the embodiment of the application, the light source control device can select one light source from a plurality of light sources according to the distance between the object in the target area and the camera, and control the selected light source to be in an on state and other light sources to be in an off state.
The specific number of light sources is not particularly limited in the embodiment of the present application and can be configured according to the actual application scenario. In the present application, the wavelength of a light source may refer to the center wavelength of the spectrum of that light source. The plurality of wavelengths corresponding to the plurality of light sources may include, but are not limited to, 470 nm, 487 nm, 523 nm, 625 nm, 630 nm, etc.
The depth of field and focal length that the image acquisition system provided in the embodiments of the present application can support may conform to the relation given above, wherein DOF represents the depth of field, f represents the focal length, F represents the lens relative aperture, d represents the focusing distance, and P represents the allowable diffuse spot size.
In some examples, the plurality of light sources may be a first light source and a second light source. The first light source corresponds to a wavelength λ1 and can provide light of wavelength λ1; the second light source corresponds to a wavelength λ2 and can provide light of wavelength λ2. Referring to fig. 2, the light source control device controls the first light source to be turned on and the second light source to be turned off. The lens in the camera is then conjugate with the light-receiving plane of the object plane P1 at a first distance D1, so that the image-space depth corresponding to the D1 space near the object plane P1 falls within the depth of focus at the photosensitive surface of the image sensor in the camera and a clear image can be acquired. The optical power of the camera is 1/f1 (the first focal length is f1), and the depth of field of the camera is the first distance D1.

The light source control device controls the second light source to be turned on and the first light source to be turned off. The lens in the camera is then conjugate with the light-receiving plane of the object plane P2 at a second distance D2, so that the image-space depth corresponding to the D2 space near the object plane P2 falls within the depth of focus at the photosensitive surface of the image sensor and a clear image can be acquired. The optical power of the camera is 1/f2 (the second focal length is f2), and the depth of field of the camera is now the second distance D2, wherein d2 = d1 + d1. It can be seen that, in the image acquisition system, the focal length of the camera can be adjusted through the wavelength of the light illuminating the target area. The relation between the first focal length and the second focal length is 1/d1 − 1/d2 = 1/f1 − 1/f2.
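The stated relation between d1, d2, f1 and f2 follows from the Gaussian thin-lens equation if the image distance v between the lens and the sensor is assumed to stay fixed while the wavelength-dependent focal length changes; a sketch of that step under the thin-lens assumption is given below.

```latex
% Thin-lens imaging equation with a fixed image distance v (assumption):
%   1/f_i = 1/d_i + 1/v   for illumination wavelength i = 1, 2.
% Subtracting the two instances eliminates v and yields the relation in the text:
\[
  \frac{1}{f_1} - \frac{1}{f_2}
  = \left(\frac{1}{d_1} + \frac{1}{v}\right) - \left(\frac{1}{d_2} + \frac{1}{v}\right)
  = \frac{1}{d_1} - \frac{1}{d_2}.
\]
```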
It should be noted that the D1 space and the D2 space are named only for ease of distinction. The D1 space shown in fig. 2 is independent of the D2 space. In a practical application scene, to avoid a sharpness blind zone, the D1 space and the D2 space may overlap or partially overlap.
By the description in the above example, it is clear that by adjusting the wavelength of the light source illuminating the target area, the depth of field of the camera can be increased, and a clear image of the object in the increased depth of field range can be acquired.
The image sensor may include, but is not limited to, a complementary metal oxide semiconductor (CMOS) sensor or a charge-coupled device (CCD). The image sensor samples the imaging surface of the camera and converts the optical signal into an electrical signal. Image sensors are generally classified into two types: CMOS image sensors and CCD image sensors. In some examples, in the image capturing system provided by an embodiment of the present application, the camera includes a lens and an image sensor. In other examples, the camera may include a lens while the image capturing system further includes an image sensor. In still other examples, the image acquisition system may be used with an external image sensor. The present application is not particularly limited in this respect.
Based on the image capturing system provided in any one of the above embodiments, the light source control device may include a controller. The controller may select one light source, denoted as the target light source, based on the distance between the object within the target area and the camera (denoted as the target distance for ease of differentiation). The controller controls the target light source to be turned on so that it provides light of the corresponding wavelength to the target area. In the image acquisition system provided by the embodiment of the application, the controller may determine the target light source in either of the following ways.
In one possible design, in the image acquisition system provided by the embodiment of the present application, the controller may store a preset correspondence between the distance and the light source. In the corresponding relation between the distance and the light source, one distance corresponds to one light source only. Multiple distances may correspond to the same light source. The controller may determine, as the target light source, a light source corresponding to the target distance according to the correspondence between the distance and the light source. The light source control device may control the target light source to be in an on state, and the target light source may irradiate light of a wavelength corresponding to the target light source to the target area. In this case, the other light sources than the target light source among the plurality of light sources are in the off state.
In another possible design, in the image capturing system provided by the embodiment of the present application, the controller may store a preset correspondence between the distance set and the light source. In the corresponding relation between the distance sets and the light sources, one distance set uniquely corresponds to one light source. Multiple distance sets may correspond to the same light source. The distances in any two distance sets are different. The controller may determine a set of distances to which the target distance belongs. And selecting a light source corresponding to the distance set of the target distance as a target light source according to the corresponding relation between the distance set and the light source. The light source control device may control the target light source to be in an on state, and the target light source may irradiate light of a wavelength corresponding to the target light source to the target area. In this case, the other light sources than the target light source among the plurality of light sources are in the off state.
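A minimal sketch of the two selection strategies just described is given below; all mapping values (wavelength identifiers and distance thresholds) are hypothetical placeholders, since the patent does not specify concrete numbers.

```python
# Minimal sketch of target-light-source selection; all mapping values are
# hypothetical placeholders, not values specified by the patent.

# Design 1: preset correspondence between individual distances and light sources.
DISTANCE_TO_SOURCE = {
    300.0: "source_lambda1",   # target distance in mm -> light source id (hypothetical)
    320.0: "source_lambda2",
}

def select_by_distance(target_distance_mm: float) -> str:
    """One distance corresponds to exactly one light source; several distances
    may map to the same source."""
    return DISTANCE_TO_SOURCE[target_distance_mm]

# Design 2: preset correspondence between distance sets (here: ranges) and sources.
DISTANCE_SETS = [
    ((290.0, 310.0), "source_lambda1"),   # (min mm, max mm) -> light source id
    ((310.0, 330.0), "source_lambda2"),
]

def select_by_distance_set(target_distance_mm: float) -> str:
    """Pick the light source whose distance set contains the target distance."""
    for (low, high), source in DISTANCE_SETS:
        if low <= target_distance_mm < high:
            return source
    raise ValueError("target distance outside all preset distance sets")

# Example: an object measured at 315 mm falls into the second distance set.
print(select_by_distance_set(315.0))   # -> "source_lambda2"
```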
In some examples, referring to fig. 3, the correspondence between distance sets and light sources may include a first distance set and a second distance set. The target region may include a first sub-region and a second sub-region. The distances between the camera and each position point in the first sub-region all belong to the first distance set, and the distances between the camera and each position point in the second sub-region all belong to the second distance set. In a practical application scene, if the distance set to which the target distance belongs is the first distance set, this indicates that the object is located in the first sub-region of the target area; similarly, if the distance set to which the target distance belongs is the second distance set, the object is located in the second sub-region. This example only illustrates the relationship between a distance set and a sub-region of the target area, and does not limit the number of distance sets in the correspondence between distance sets and light sources.
In a possible implementation, based on the image acquisition system provided in any one of the above embodiments, as shown in fig. 4a, the light source control device may include a ranging element. The ranging element may be used to measure the distance between the object in the target area and the camera.
In some examples, the distance measuring element may be disposed within a preset range around the camera, so that the distance measured by the distance measuring element between the object in the target area and the distance measuring element may be approximately regarded as the distance between the object and the camera.
In other examples, the distance measuring element may be arranged at a preset position whose distance from the camera is denoted as distance Da. The distance between the object in the target area and the distance measuring element, as measured by the distance measuring element, is denoted as distance Db. The light source control means may determine the distance Dc from the camera to the object based on the distance Da and the distance Db.
Optionally, Dc = Da + Db. Alternatively, the light source control device may determine the distance from the camera to the object based on a first angle, a second angle, the distance Da, and the distance Db. The straight line through the object and the camera is taken as a reference line. The straight line through the object and the ranging element is taken as a first line, and the first angle is the minimum positive angle formed where the first line intersects the reference line. The straight line through the ranging element and the camera is taken as a second line, and the second angle is the minimum positive angle formed where the second line intersects the reference line.
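One geometric reading of this description, which treats the object, the camera and the ranging element as the vertices of a triangle with the named angles at the object and at the camera, allows Dc to be recovered with the law of sines. The construction below is an assumption made for illustration only; the patent's figures, which are not reproduced here, may define the geometry differently.

```python
# Sketch of one possible geometry (assumption): object O, camera C and ranging
# element R form a triangle. The first angle alpha sits at O between lines O-R
# and O-C; the second angle beta sits at C between lines C-R and C-O.
import math

def camera_to_object_distance(alpha_rad: float, beta_rad: float,
                              da_mm: float, db_mm: float) -> float:
    """Law of sines: Da / sin(alpha) = Db / sin(beta) = Dc / sin(alpha + beta)."""
    dc = db_mm * math.sin(alpha_rad + beta_rad) / math.sin(beta_rad)
    # Da is redundant in this construction; use it as a consistency check.
    expected_da = db_mm * math.sin(alpha_rad) / math.sin(beta_rad)
    if abs(expected_da - da_mm) > 1e-3 * da_mm:
        raise ValueError("angles and distances are not geometrically consistent")
    return dc

def camera_to_object_distance_collinear(da_mm: float, db_mm: float) -> float:
    """Degenerate (collinear) case mentioned in the text: Dc = Da + Db."""
    return da_mm + db_mm
```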
The ranging elements may include, but are not limited to, one or more of the following sensors: infrared distance measuring sensor, ultrasonic distance measuring sensor, laser distance measuring sensor. In some examples, the ranging element may include an infrared ranging sensor. The distance measuring element may measure the distance between the object and the camera within the target area based on infrared distance measuring technology. In still other examples, the ranging element may include an ultrasonic ranging sensor. The distance measuring element may measure the distance between the object and the camera within the target area based on ultrasonic distance measuring techniques. In still other examples, the ranging element may include a laser ranging sensor. The distance measuring element may measure the distance between the object and the camera within the target area based on a laser distance measuring technique.
In another possible implementation, based on the image acquisition system provided in any one of the above embodiments and referring to fig. 4b, the light source control device may include pressure sensors. The target region may include a plurality of sub-regions, and the sub-regions do not overlap with each other. The pressure sensors are in one-to-one correspondence with the sub-regions: one pressure sensor corresponds to one sub-region. When the object is located in or placed on a sub-region, it exerts pressure on that sub-region, and the pressure sensor corresponding to the sub-region can measure the resulting pressure; the measured physical quantity is recorded as a pressure signal. Each pressure sensor may be configured to send its measured pressure signal to the controller when that signal is greater than a preset signal threshold.
In some examples, when a first pressure sensor among the plurality of pressure sensors sends a pressure signal to the controller, this reflects the presence of an object in the sub-region corresponding to the first pressure sensor. Based on the received pressure signal provided by the first pressure sensor, the controller in the light source control device may determine the position of the sub-region corresponding to the first pressure sensor as the position of the object, for example by taking the center of that sub-region as the position of the object.
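A compact sketch of this sub-region logic is shown below; the layout of sub-region centres, the camera position and the threshold value are hypothetical and serve only to illustrate how a triggered sensor can be turned into a target distance.

```python
# Illustrative sketch: pressure sensors are keyed by sub-region, each sub-region
# has a known centre, and the centre of the triggered sub-region is taken as the
# object position. All coordinates and the threshold are hypothetical.
import math

SIGNAL_THRESHOLD = 0.5             # preset signal threshold (hypothetical units)
SUBREGION_CENTRES = {              # sensor id -> (x, y, z) centre in mm
    "sensor_1": (0.0, 0.0, 300.0),
    "sensor_2": (0.0, 50.0, 320.0),
}
CAMERA_POSITION = (0.0, 0.0, 0.0)  # hypothetical camera location in mm

def object_position(pressure_readings: dict) -> tuple:
    """Return the centre of the first sub-region whose pressure exceeds the threshold."""
    for sensor_id, signal in pressure_readings.items():
        if signal > SIGNAL_THRESHOLD:
            return SUBREGION_CENTRES[sensor_id]
    return None

def target_distance(position: tuple) -> float:
    """Euclidean distance from the object position to the camera."""
    return math.dist(position, CAMERA_POSITION)

pos = object_position({"sensor_1": 0.1, "sensor_2": 0.9})
print(pos, target_distance(pos))   # sub-region 2 is pressed -> its centre is used
```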
As can be seen from the above description, the distance measuring element may provide the target distance to the controller for the controller to select the light source. The process of selecting the light source by the controller may be referred to in the previous embodiments, and will not be described herein.
Based on the image acquisition system provided in any one of the embodiments, the image acquisition system may adjust the depth of field by adjusting the wavelength of the light source. The image acquisition system may support a variety of capture modes, including, but not limited to, a single-shot mode and a burst (repeated capture) mode.
In one possible design, in the single shot mode of the image capturing system, after the light source control device controls the target light source to be in the on state, the light source control device may send a first indication signal to the camera, where the first indication signal is used to instruct the camera to capture an image. The camera may acquire an image of the object within the target area after receiving the first indication signal. The acquired image of the object is clearer and can be used for detection, identification and other purposes.
In another possible design, the image acquisition system is in a burst mode. After the light source control device controls the target light source to be in an on state, the light source control device can send a first indication signal to the camera, the first indication signal being used for instructing the camera to collect an image. After a preset time period, the light source control device selects one other light source (a light source other than the target light source among the plurality of light sources), controls that light source to be in an on state, controls the target light source to be in an off state, and sends a second indication signal to the camera, the second indication signal being used for instructing the camera to collect an image.
The camera is configured to acquire a first image of the object within the target area after receiving the first indication signal, and to acquire a second image of the object within the target area after receiving the second indication signal. In the burst mode, two images of the object are therefore obtained: a first image and a second image. Both images of the object can be used for detection, identification, and other purposes.
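The two capture modes can be summarised in a short control-flow sketch. The names turn_on, turn_off and send_indication below are hypothetical stand-ins for whatever hardware interface the light source control device and camera actually expose.

```python
# Sketch of the single-shot and burst sequences; turn_on/turn_off/send_indication
# are hypothetical stand-ins for the real light-source and camera interfaces.
import time

def single_shot(target_source, all_sources, camera):
    for src in all_sources:
        src.turn_off()
    target_source.turn_on()
    camera.send_indication("first")      # camera acquires one image

def burst(target_source, all_sources, camera, preset_period_s: float):
    for src in all_sources:
        src.turn_off()
    target_source.turn_on()
    camera.send_indication("first")      # first image under the target wavelength

    time.sleep(preset_period_s)          # wait the preset time period
    other = next(s for s in all_sources if s is not target_source)
    target_source.turn_off()
    other.turn_on()
    camera.send_indication("second")     # second image under the other wavelength
```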
In one possible embodiment, the controller in the image acquisition system may include a programmable logic controller (Programmable Logic Controller, PLC) and a light source controller. A PLC generally includes a microprocessor module and an input/output (I/O) module, and may optionally include further modules such as a programming device and a power supply. The light source controller may be used to provide voltage and current drive for each of the plurality of light sources.
In one possible design, referring to FIG. 4c, the controller may include a PLC 50 and a light source controller 30. The PLC 50 may include a first port 511, a second port 512, and a third port 521. The first port 511 may be used to output a high level signal or a low level signal. The second port 512 may be used to output a high level signal or a low level signal. The third port 521 may be coupled to the ranging element 60 for receiving signals provided by the ranging element 60.
The light source controller 30 may provide voltage and current drive for each of the plurality of light sources. The light source controller 30 may include a plurality of light source driving interfaces. The plurality of light sources comprises, for example, a first light source 41 and a second light source 42. The plurality of light source driving interfaces of the light source controller 30 may include a first light source driving interface 311 and a second light source driving interface 312. The first light source driving interface 311 may be connected to the second light source 42, and the second light source driving interface 312 may be connected to the first light source 41. The light source controller 30 may include a first external trigger input port 321 and a second external trigger input port 322. The first external trigger input port 321 is connected to the first port 511 of the PLC. The second external trigger input port 322 is connected to the second port 512 of the PLC.
In some examples, the first port 511 of the PLC 50 outputs a high level signal to the first external trigger input port 321 of the light source controller, and the second port 512 of the PLC 50 outputs a low level signal to the second external trigger input port 322 of the light source controller, which may enable the PLC 50 to control the light source controller 30 to light up the first light source 41 and to turn off the second light source 42. For example, the second light source driving interface 312 of the light source controller 30 supplies the driving voltage or driving current to the first light source 41, and the first light source driving interface 311 does not supply the driving voltage or driving current to the second light source 42.

In other examples, the first port 511 of the PLC 50 outputs a low level signal to the first external trigger input port 321 of the light source controller, and the second port 512 of the PLC 50 outputs a high level signal to the second external trigger input port 322 of the light source controller, which may enable the PLC 50 to control the light source controller 30 to light up the second light source 42 and to turn off the first light source 41. For example, the second light source driving interface 312 of the light source controller 30 does not supply the driving voltage or driving current to the first light source 41, and the first light source driving interface 311 supplies the driving voltage or driving current to the second light source 42.
Optionally, at least one of the light source controller 30 and the PLC 50 may also include an RS232 communication port, which may be used to connect to the camera and provide an indication signal to it. For example, referring again to fig. 4c, the light source controller 30 may further include an RS232 communication port 33. The RS232 communication port 33 may be connected to the camera, and the light source controller 30 may send an indication signal to the camera via the RS232 communication port 33 to instruct the camera to take a picture (i.e., to acquire an image).
In a scenario where the image acquisition system is in single-shot mode, the PLC 50 may determine the distance between the object and the camera from the data provided by the ranging element 60. Alternatively, the PLC 50 may determine, based on the data provided by the distance measuring element 60, whether the actual position of the object plane (the plane in which the object is located) lies in the aforementioned D1 space or the aforementioned D2 space. The light source corresponding to the D1 space is the first light source 41, and the light source corresponding to the D2 space is the second light source 42. If the actual position of the object plane is in the D1 space, the PLC 50 can determine the light source corresponding to the distance between the object and the camera to be the first light source 41. In this case, the PLC 50 may control the first light source 41 to be turned on and the second light source 42 to be turned off. The image acquired by the camera can then be used for image detection, identification and other purposes.

If the actual position of the object plane is in the aforementioned D2 space, the PLC 50 can determine the light source corresponding to the distance between the object and the camera to be the second light source 42. In this case, the PLC 50 may control the first light source 41 to be turned off and the second light source 42 to be turned on. The image acquired by the camera can then be used for image detection, identification and other purposes.

In a scenario in which the image acquisition system is in burst mode, in some examples, the PLC 50 may first control the first light source 41 to light up and the second light source 42 to go off, and instruct the camera to perform image acquisition. The PLC 50 can then control the first light source 41 to go off and the second light source 42 to light up, and again instruct the camera to perform image acquisition. The images acquired by the camera over these multiple acquisitions are used for image detection, identification and other purposes. In other examples, the PLC 50 may first control the first light source 41 to go off and the second light source 42 to light up, and instruct the camera to perform image acquisition; the PLC 50 can then control the first light source 41 to light up and the second light source 42 to go off, and again instruct the camera to perform image acquisition. The images acquired by the camera over these multiple acquisitions are likewise used for image detection, identification and other purposes.
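The port logic of this Fig. 4c design can be sketched as a few helper routines. write_port, read_ranging_mm and send_indication are hypothetical wrappers around the PLC's digital outputs, the ranging input and the RS232 indication path, and the 310 mm boundary between the D1 and D2 spaces is an illustrative value, not one given by the patent.

```python
# Sketch of the Fig. 4c control logic. write_port(), read_ranging_mm() and
# send_indication() are hypothetical wrappers around the PLC's I/O; the 310 mm
# D1/D2 boundary is illustrative only.
HIGH, LOW = 1, 0

def light_first_source(write_port):
    write_port(511, HIGH)   # first port 511 -> first external trigger input 321
    write_port(512, LOW)    # second port 512 -> second external trigger input 322

def light_second_source(write_port):
    write_port(511, LOW)
    write_port(512, HIGH)

def single_shot_cycle(write_port, read_ranging_mm, send_indication):
    distance = read_ranging_mm()         # data from ranging element 60 via port 521
    if distance < 310.0:                 # object plane falls in the D1 space
        light_first_source(write_port)
    else:                                # object plane falls in the D2 space
        light_second_source(write_port)
    send_indication()                    # e.g. over RS232, triggering image capture
```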
In one possible implementation, the image acquisition system may support only the burst mode. The light source control means in the image acquisition system may comprise a controller. Referring to fig. 4d, the controller may be a light source controller 30. The light source controller 30 may provide voltage and current drive for each of the plurality of light sources. The light source controller 30 may include a plurality of light source driving interfaces. For example, the plurality of light sources includes a first light source 41 and a second light source 42. The plurality of light source driving interfaces of the light source controller 30 may include a first light source driving interface 311 and a second light source driving interface 312. The first light source driving interface 311 may be connected to the second light source 42, and the second light source driving interface 312 may be connected to the first light source 41.
The light source controller 30 may include a first external trigger input port 321 and a second external trigger input port 322. The first external trigger input port 321 may be connected to the first control port 201 of the camera through the first signal connection line 51. The second external trigger input port 322 may be connected to the second control port 202 of the camera through a second signal connection line 52. The camera may control the light source controller 30 to switch the light source through the first control port 201 and the second control port 202.
In some examples, the camera may perform configuration 1, enabling a high level signal to be output to the first external trigger input port 321 through the first control port 201, and a low level signal to be output to the second external trigger input port 322 through the second control port 202. The light source controller 30 may be caused to control the first light source 41 to be turned on and the second light source 42 to be turned off. The camera may acquire an image at the wavelength of the first light source 41.
The camera may perform configuration 2, enabling the output of a low level signal to the first external trigger input port 321 through the first control port 201, and the output of a high level signal to the second external trigger input port 322 through the second control port 202. The light source controller 30 may be caused to control the first light source 41 to be turned off and the second light source 42 to be turned on. The camera may capture images at the wavelength of the second light source 42.
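From the camera side, the two configurations of this Fig. 4d design amount to toggling the two control ports. The names set_control_port and capture are hypothetical camera-SDK stand-ins used only to illustrate the sequence.

```python
# Sketch of the Fig. 4d burst sequence driven by the camera itself.
# set_control_port() and capture() are hypothetical camera-SDK stand-ins.
HIGH, LOW = 1, 0

def configuration_1(set_control_port):
    set_control_port(201, HIGH)   # first control port 201 -> trigger input 321
    set_control_port(202, LOW)    # second control port 202 -> trigger input 322
    # light source controller 30 turns the first light source 41 on, 42 off

def configuration_2(set_control_port):
    set_control_port(201, LOW)
    set_control_port(202, HIGH)
    # light source controller 30 turns the first light source 41 off, 42 on

def camera_driven_burst(set_control_port, capture):
    configuration_1(set_control_port)
    image_1 = capture()           # image under the wavelength of light source 41
    configuration_2(set_control_port)
    image_2 = capture()           # image under the wavelength of light source 42
    return image_1, image_2
```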
Based on the image capturing system provided in any one of the foregoing embodiments, and based on the same concept, an embodiment of the present application further provides an image capturing method, which may be executed by the image capturing system. Referring to fig. 5, the image capturing method may include the following steps:
in step 501, the light source control device selects a target light source from a plurality of light sources according to a target distance between an object in a target area and a camera.
In step 502, the light source control device controls the target light source to be in an on state.
In step 503, the light source control device sends an acquisition indication signal to the camera.
The acquisition indication signal is used for indicating the camera to acquire an image.
In step 504, the camera acquires an image of the object in the target area after receiving the acquisition indication signal.
Based on the image capturing system provided in any one of the foregoing embodiments, and based on the same concept, an embodiment of the present application further provides an image capturing method, which may be executed by the light source control device. Referring to fig. 6, the image capturing method may include the following steps:
in step 601, a target distance between an object within a target area and a camera is determined.
Step 602, selecting a target light source from a plurality of light sources according to a target distance between an object in a target area and a camera.
And step 603, controlling the target light source to be in an on state.
In one possible implementation manner, the light source control device may select, as the target light source, a light source corresponding to the target distance according to a correspondence between a preset distance and the light source.
In one possible implementation manner, the light source control device may select, as the target light source, a light source corresponding to a distance set to which the target distance belongs according to a correspondence between a preset distance set and the light source.
In a possible implementation manner, in step 601, the light source control device may determine the target distance between the object and the camera in the target area by any one of the following manners: infrared ranging method, ultrasonic ranging method, laser ranging method.
In one possible embodiment, the light source control device includes a plurality of pressure sensors; the target area comprises a plurality of subareas, and all subareas are not overlapped; the plurality of subareas are in one-to-one correspondence with the plurality of pressure sensors; the pressure sensor is used for: measuring a pressure signal generated by the object on a subarea corresponding to the pressure sensor, and sending the measured pressure signal to the controller when the measured pressure signal is greater than a preset signal threshold; the controller is further configured to: and determining the position of the sub-region corresponding to the first pressure sensor as the position of the object based on the received pressure signal provided by the first pressure sensor, so as to determine the target distance between the object in the target region and the camera in step 601.
In some application scenarios, the light source control device may be configured in the single-shot mode. As shown in fig. 7, the image acquisition method provided by the embodiment of the present application may include the following steps:
step 701, determining a target distance between an object within a target area and a camera.
Step 702, selecting a target light source from a plurality of light sources according to a target distance between an object in a target area and a camera.
And step 703, controlling the target light source to be in an on state.
Step 704, a first indication signal is sent to the camera to cause the camera to capture an image of the object within the target area.
In some application scenarios, the light source control device may be configured in the burst mode. As shown in fig. 8, the image acquisition method provided by the embodiment of the application may include the following steps:
in step 801, a target distance between an object within a target area and a camera is determined.
Step 802, selecting a target light source from a plurality of light sources according to a target distance between an object in a target area and a camera.
Step 803, controlling the target light source to be in an on state.
Step 804, a first indication signal is sent to the camera to cause the camera to capture an image of the object within the target area.
Step 805, selecting one other light source from a plurality of light sources other than the target light source.
Step 806, controlling the other light sources to be in an on state.
Step 807 sends a second indication signal to the camera to cause the camera to capture an image of the object within the target area.
The controller to which embodiments of the present application relate may be a central processing unit (Central Processing Unit, CPU), a general purpose processor, a graphics processing unit (Graphics Processing Unit, GPU), a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application-specific Integrated Circuit, ASIC), a field programmable gate array (Field Programmable Gate Array, FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof.
In another aspect, the present application provides a computer readable storage medium storing computer instructions that, when executed on a controller (or processor), cause the controller (or processor) to perform any one of the image acquisition methods provided by the embodiments of the present application.
In another aspect, an embodiment of the present application provides a computer program product, including a computer program, where the computer program when executed by a controller (or a processor) implements any one of the image acquisition methods provided in the embodiment of the present application.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present application without departing from the spirit or scope of the application. Thus, it is intended that the present application also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (10)

1. An image acquisition method, comprising:
receiving a target distance between an object in a target area and a camera;
selecting a target light source from a plurality of light sources according to the target distance;
and controlling the target light source to be in an on state.
2. The method of claim 1, wherein selecting a target light source from a plurality of light sources comprises:
and selecting the light source corresponding to the target distance as the target light source according to the corresponding relation between the preset distance and the light source.
3. The method of claim 1, wherein selecting a target light source from a plurality of light sources comprises:
and selecting a light source corresponding to the distance set to which the target distance belongs as the target light source according to the corresponding relation between the preset distance set and the light source.
4. A method according to any one of claims 1-3, wherein the method further comprises:
and sending a first indication signal to the camera, selecting one other light source after a preset time period, controlling the other light source to provide light with a corresponding wavelength for the target area, and sending a second indication signal to the camera, wherein the other light source is a light source except the target light source in the plurality of light sources.
5. An image acquisition system, comprising:
a camera arranged at a designated position for collecting an image of an object in a target area;
a plurality of light sources, each of the plurality of light sources corresponding to a plurality of wavelengths, each light source for providing light of a corresponding wavelength;
a light source control means for selecting one target light source from the plurality of light sources according to a target distance between the object and the camera; and controlling the target light source to be in an on state.
6. The system of claim 5, wherein the light source control device comprises a controller, the controller being specifically configured to:
and selecting the light source corresponding to the target distance as the target light source according to the corresponding relation between the preset distance and the light source.
7. The system of claim 5, wherein the light source control device comprises a controller, the controller being specifically configured to:
and selecting a light source corresponding to the distance set to which the target distance belongs as the target light source according to the corresponding relation between the preset distance set and the light source.
8. The system of any of claims 5-7, wherein the image acquisition system comprises a single shot mode;
after the light source control device controls the target light source to be in an on state, the light source control device is configured to send a first indication signal to the camera;
the camera is configured to acquire an image of an object within the target area after receiving the first indication signal.
9. The system of any of claims 5-7, wherein the image acquisition system comprises a repeat mode;
after the light source control device controls the target light source to provide light with a corresponding wavelength to the target area, the light source control device is configured to: sending a first indication signal to the camera, selecting one other light source after a preset time period, controlling the other light source to provide light with a corresponding wavelength for the target area, and sending a second indication signal to the camera, wherein the other light source is a light source except the target light source in the plurality of light sources;
the camera is configured to acquire a first image of an object within the target area after receiving the first indication signal; and acquiring a second image of the object in the target area after receiving the second indication signal.
10. A computer readable storage medium storing computer instructions which, when run on a processor, cause the processor to perform the method of any one of claims 1-4.
CN202310796158.3A 2023-06-30 2023-06-30 Image acquisition method and related equipment Pending CN116828280A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310796158.3A CN116828280A (en) 2023-06-30 2023-06-30 Image acquisition method and related equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310796158.3A CN116828280A (en) 2023-06-30 2023-06-30 Image acquisition method and related equipment

Publications (1)

Publication Number Publication Date
CN116828280A true CN116828280A (en) 2023-09-29

Family

ID=88121738

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310796158.3A Pending CN116828280A (en) 2023-06-30 2023-06-30 Image acquisition method and related equipment

Country Status (1)

Country Link
CN (1) CN116828280A (en)

Similar Documents

Publication Publication Date Title
US9961253B2 (en) Autofocus system for a high speed periodically modulated variable focal length lens
US9830694B2 (en) Multi-level image focus using a tunable lens in a machine vision inspection system
US9930243B2 (en) Variable focal length imaging system
CN101770573B (en) Automatic focusing iris image imaging device for iris recognition and control method thereof
CN101472071B (en) Image pickup apparatus and focus detection apparatus
JPWO2017213052A1 (en) Ranging system and ranging method
US10095003B2 (en) Autofocus apparatus, autofocus method, and program
JP2011530722A (en) Liquid lens with temperature compensated focusing time
JP2011530722A5 (en)
CN110726383B (en) High-precision integrated three-dimensional measurement system based on MEMS
US7528354B2 (en) Method and device for position sensing of an optical component in an imaging system
US11102459B2 (en) 3D machine-vision system
WO2005124418A1 (en) Focusing information acquisition device, and focusing information acquisition method
US20180275390A1 (en) Modulation monitoring system for use with an imaging system that includes a high speed periodically modulated variable focal length lens
JP2019191180A (en) Variable focal length lens system using quasi-sinusoidal periodic intensity modulated light
JPH01259313A (en) Range finder for camera
JP2016105068A (en) Distance measurement device and distance measurement method
US11249225B2 (en) Tunable acoustic gradient lens system utilizing amplitude adjustments for acquiring images focused at different z-heights
JP2001036799A (en) Method and device for adjusting position of optical lens for fixed focus type image pickup device and computer readable recording medium storage program concerned with the method
CN116828280A (en) Image acquisition method and related equipment
JP7379039B2 (en) Lens device, imaging device, lens device control method, and program
JP2021067704A (en) Imaging device
KR101333161B1 (en) Apparatus of processing image based on confocal and method thereof
JP2021124429A (en) Scanning measurement method and scanning measurement device
CN112558380A (en) Image detection device

Legal Events

Date Code Title Description
PB01 Publication