CN104680113B - Image capture device - Google Patents


Info

Publication number
CN104680113B
CN104680113B (application CN201410830194.8A)
Authority
CN
China
Prior art keywords
light source, array, sensor, light, axis
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410830194.8A
Other languages
Chinese (zh)
Other versions
CN104680113A (en)
Inventor
Federico Canini
Guido Maurizio Oliva
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Datalogic Scanning Group SRL
Original Assignee
Datalogic Scanning Group SRL
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Datalogic Scanning Group SRL filed Critical Datalogic Scanning Group SRL
Priority to CN201410830194.8A
Priority claimed from CN201080066573.4A (published as CN102870121B)
Publication of CN104680113A
Application granted
Publication of CN104680113B

Landscapes

  • Facsimile Scanning Arrangements (AREA)
  • Image Input (AREA)

Abstract

The present invention relates to an image capture device. Specifically, an image capture device of the imager type is described, comprising: an image forming apparatus, including a sensor which defines an optical receiving axis, at least one reading distance and, on a substrate, at least one region framed by the sensor at the at least one reading distance; and an illumination device, comprising an array of adjacent light sources, which defines an illumination optical axis; characterized in that: the light sources are individually drivable, each light source being adapted to illuminate an area of a size much smaller than the size of the region framed by the sensor; the illumination axis does not coincide with the receiving axis; and the image capture device comprises a driver of the light sources, adapted to drive the light sources so as to switch off, at least at the at least one reading distance, the light sources illuminating outside the boundary of the region framed by the sensor on the substrate.

Description

Image capture device
Statement on divisional application
This application is a divisional application of the Chinese invention patent application No. 201080066573.4, filed on March 11, 2010 and entitled "Image capture device".
Technical field
The present invention relates to an image capture device, and more particularly to such a device for an optical information reading system, or reader, of the "imager" type.
Background technology
Optical information readers of the imager type are known. Such readers comprise an image capture device for capturing, or acquiring, an image of the optical information present on any kind of substrate, including the display of any electric or electronic device on which optical information is shown.
In the present description and in the appended claims, the expression "optical information" is used in its broadest sense, comprising both one-dimensional, stacked and two-dimensional optical codes, in which information is encoded in the shapes, sizes, colors and/or mutual positions of elements of at least two different colors, as well as alphanumeric characters, signatures, logos, stamps, trademarks, labels, handwritten text and images in general, and combinations thereof, in particular present on pre-printed forms, and images containing features suitable for identifying and/or selecting an object based on its shape and/or volume.
In the present description and in the appended claims, the term "light" is used in its broadest sense, indicating electromagnetic radiation of a wavelength, or range of wavelengths, not only within the visible spectrum, but also in the ultraviolet and infrared spectra. Terms such as "color", "optical", "image" and "view" are similarly used in their broadest sense. In particular, coded information may be marked on a substrate with inks that are invisible, but sensitive to ultraviolet or infrared rays.
Optical information readers of the imager type typically include, besides the image capture device, one or more other devices having different functions, or communicating therewith.
Such other devices include: devices for processing the captured image, capable of extracting the information content from such an image or from a region thereof; storage devices; devices or interfaces for transferring the captured image and/or the extracted information content outside of the reader; devices or interfaces for entering setting data for the reader from an external source; devices for displaying to the user alphanumeric and/or graphic information concerning, for example, the operating status of the reader, the content of the information read, etc.; devices for the manual input of control signals and data; and devices for supplying power, or for obtaining power from an external supply.
Moreover, devices that may be included in, or associated with, an optical information reader of the imager type include: an aiming device, which assists the operator in positioning the reader with respect to the optical information by displaying on the substrate a visual indication of the region framed by the image capture device, for example its center and/or its edges and/or at least part of its corners; an auxiliary device (rangefinder) for correctly focusing the image capture device, which displays on the substrate a luminous figure whose shape, size and/or position changes between the focused condition and out-of-focus conditions, and which may indicate the direction in which the image capture device and the substrate should be manually moved relative to each other in order to attain the focused condition; a result indication device which, through changes of shape, size, color and/or position of a luminous figure displayed on the substrate, indicates the positive or negative result of an attempt to capture an image and/or to decode the information in the image, and possibly the reason for a negative result; a sensor for detecting the presence of a substrate, and/or for measuring or estimating the reading distance, i.e. the distance between a reference — the reader, or more specifically the image capture device — and the substrate. The aiming and focus indication functions can also be accomplished together, by projecting a suitable luminous figure, for example a pair of inclined bars or a pair of crosses which intersect or coincide, centered on the region framed by the image capture device, only at the focusing distance.
The measurement or estimation of distance is typically used by the reader in order to activate the decoding algorithm only when the optical information is located between a minimum and a maximum working distance, and/or to automatically control a zoom control device and/or a device for changing the focusing distance of the image capture device (autofocus). Moreover, the measurement or estimation of distance may be used in cases where digital restoration of the image is needed, since the degradation function, or PSF (Point Spread Function), of the optics of the image forming apparatus depends on the reading distance. The measurement or estimation of distance is also desirable for calculating the volume of objects.
Devices for aiming and/or for indicating the focusing condition are described, for example, in US 5,949,057, US 6,811,085, US 7,392,951 B2, US 5,331,176, US 5,378,883 and EP 1 466 292 B1.
Result indication devices are described, for example, in the aforementioned US 5,331,176 and in EP 1 128 315 A1.
It is emphasized that each of the aiming, focus indication, result indication, presence detection and reading distance measurement or estimation functions can be implemented in different ways, well known per se, which do not exploit the projection of light onto the substrate. Merely by way of example: for aiming and/or focusing, a display showing what is framed by the sensor; for result indication, a sound indication, or a visual indication not projected onto the substrate but directed towards the operator; for presence detection, distance measurement or estimation and/or focus assessment, electro-optical systems, radar or ultrasonic devices, etc.
The image capture device of the imager type comprises an image forming apparatus or section, which includes a sensor in the form of a regular arrangement, or array, of photosensitive elements — linear or, preferably, of the matrix type — capable of generating an electric signal from an optical signal, and typically also receiver optics capable of forming, on the sensor, an image of the substrate, or of a region thereof, containing the optical information.
The image capture device is characterized by an optical receiving axis, defined by the centers of the elements of the receiver optics or, in the case of a single lens, by the centers of curvature of its optical surfaces, which defines the main operative direction of the image capture device. The image capture device is further characterized by a workspace region, generally shaped as a truncated pyramid or cone extending in front of the sensor. The workspace region — in other words, the region of space which is correctly framed by the sensor and whose image is sufficiently in focus on the sensor — is typically characterized by a field of view and a depth of field, the field of view expressing the angular width of the working region about the receiving axis, and the depth of field expressing the size of the working region along the direction of the receiving axis. The depth of field thus expresses the range between the minimum and maximum useful distances, along the receiving axis, between the reader and the substrate region framed by the sensor. The field of view can also be expressed in terms of a "vertical" and a "horizontal" field of view, in other words in terms of two angular apertures in two mutually orthogonal planes through the receiving axis, 90° apart so as to take the form factor of the sensor into account, or even, in the case of receiver optics without any symmetry, in terms of four angular apertures in half-planes.
The workspace region — and therefore the field of view and the depth of field — can be fixed, or can be dynamically changed in size and/or proportions through well-known zoom and/or autofocus systems, for example electromechanical, piezoelectric or electro-optical actuators for moving one or more lenses or apertures, mirrors or other components of the receiver optics, for moving the sensor, and/or for changing the curvature of one or more lenses of the receiver optics, such as liquid or deformable lenses.
EP 1 764 835 A1 describes an optical sensor wherein each photosensitive element, or group of photosensitive elements, has a lens or other optical element associated with it, such as an aperture, a prismatic surface, a light guide or a GRIN lens. This document does not deal with the illumination of the region framed by the sensor.
Although image capture devices operating with ambient light alone are known, the image capture device of the imager type usually also comprises an illumination device or section, adapted to project one or more light beams, possibly with variable intensity and/or spectral content, onto the substrate carrying the optical information. The light beam, or the ensemble of light beams, emitted by the illumination device defines an illumination optical axis, which is the mean direction of such a single or composite beam, namely the axis of symmetry of the beam in at least one plane and, in the case of a two-dimensional array, usually in two perpendicular planes.
For the image capture device to operate correctly, the illumination device must be capable of illuminating the entire workspace region of the image forming apparatus.
An image capture device similar to that of Fig. 4 of the above-mentioned US 5,378,883 is illustrated in Fig. 1: the illumination device 90 is not coaxial with the image forming apparatus 91, but is arranged at a side of the image forming apparatus 91 and configured so that the illumination axis 92 of the illumination beam 93 converges towards the receiving axis 94; such an image capture device suffers from an intrinsic parallax error and, in the two-dimensional case, from an intrinsic perspective distortion error. This error causes the intersection between the substrate S and the illumination beam 93 and the intersection between the substrate S and the workspace region 95 of the image forming apparatus 91 to be concentric, at best, only within a very small range of reading distances (the distance of the substrate S partly shown in Fig. 1). Consequently, in order for the illumination device 90 to illuminate the entire workspace region 95 of the image forming apparatus 91, at most reading distances the illumination is over-redundant (cf. the distances of the substrates S1 and S2 partly shown in Fig. 1): in other words, the illumination extends outside the region framed by the sensor on the substrate, with a consequent waste of energy.
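As a rough quantitative illustration of this drawback, the following sketch (illustrative numbers and a simple pinhole/thin-beam model, not taken from the patent) compares the half-width of the framed region with the lateral offset of a side-mounted illumination beam aimed to cross the receiving axis at a single design distance:

```python
import math

def framed_half_width(d, half_fov_deg):
    """Half-width of the region framed by the sensor at distance d (pinhole model)."""
    return d * math.tan(math.radians(half_fov_deg))

def beam_center_offset(d, baseline, d0):
    """Lateral offset between the beam center and the receiving axis at distance d,
    for an illuminator at lateral offset 'baseline' aimed to cross the axis at d0."""
    return baseline * (1.0 - d / d0)

# Illustrative numbers: 20 mm baseline, beams converging at 200 mm, 20 deg half-FOV.
for d in (100.0, 200.0, 400.0):
    w = framed_half_width(d, 20.0)
    off = beam_center_offset(d, 20.0, 200.0)
    print(f"d={d:.0f} mm  framed half-width={w:.1f} mm  beam offset={off:+.1f} mm")
```

The beam is centered on the framed region only at the 200 mm design distance; at any other distance a fixed beam must be made wider than the framed region in order to keep covering it, which is exactly the energy waste discussed above.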
In some prior art image capture devices, the parallax error is addressed by making the illumination device and the image forming apparatus coaxial.
US 5,319,182 describes an image capture device not of the imager type, but of the scanning type, wherein the illumination device is substantially coaxial with the sensor, because they consist of a matrix of the photosensitive elements of the sensor interleaved with emitters which can be activated under program control. Such a device is potentially compact and flexible, but it suffers from evident problems of optical isolation between the emitters and the photosensitive elements: even when isolation devices are provided between emitters and photosensitive elements as proposed in that document, for example opaque partition walls, the intensity of the light emitted by the emitters and reflected onto the photosensitive elements even to a minimal extent by any surface — for example by the rear surface of the projection optics, even with an anti-reflection treatment — is still far greater than the intensity of the light received from the substrate carrying the optical information. Moreover, arranging photosensitive and light-emitting elements on a single substrate leads to a compromise in terms of efficiency, because the material characteristics required to obtain efficient photosensitive elements are opposed to those required for efficient light-emitting elements.
In US 5,430,286, the alignment between the light emitted by the illumination device and the image forming apparatus is obtained through a beam splitter. This entails a large space occupied within the reader and a very low efficiency, due to a 50% power loss both along the illumination path and along the reception path.
A similar system, also suffering from the problem of occupied space, is described in the aforementioned US 5,331,176, which uses a semi-transparent mirror in place of the beam splitter. This document also teaches adjusting the size of part of the illumination beam, but through mechanical moving devices, which add to the occupied space and the power consumption of the reader. Moreover, such a solution cannot avoid the drawback of wasting illumination energy, because part of the illumination beam is merely blocked.
US 2007/0158427 A1, which represents the closest prior art, describes in its Fig. 5B an illumination system comprising a pair of illumination arrays, each arranged on an opposite side of the sensor and associated with a larger working distance, and a pair of illumination arrays, each also arranged on an opposite side of the sensor and associated with a smaller working distance. Because the part of the beam emitted by the pair of arrays associated with the larger working distance is directed and sized so as to uniformly illuminate the entire region framed by the sensor at least at the maximum distance, at this distance and at shorter reading distances the illumination by this pair of arrays is over-redundant: in other words, it extends outside the region framed by the sensor. The same drawback occurs for the pair of arrays associated with the smaller working distance. The device of this document is therefore very inefficient, and in particular hardly suitable for battery-powered portable readers, for which energy saving is an important requirement. The document also teaches switching on only one array of each pair in order to avoid reflections from the substrate — thus falling back into the case of a system suffering from parallax and perspective distortion errors — or switching on both arrays of a pair when the reading distance is unknown. The document also describes: a further pair of illuminators, each arranged on one of the other two sides of the sensor, to illuminate a thin line for reading one-dimensional codes; and four illuminators arranged at the corners of the sensor, for aiming at the region of interest.
The content of the invention
The technical problem at the basis of the present invention is to provide an efficient image capture device, and more specifically such a device for an optical information reader of the imager type, in particular a device which is free from parallax error, which does not provide over-redundant illumination extending outside the region framed by the sensor, and which avoids any possibility of optical interference between the light sources and the photosensitive elements.
In a first aspect thereof, the present invention relates to an image capture device of the imager type, comprising:
- an image forming apparatus, comprising a sensor, the sensor comprising a one-dimensional or two-dimensional array of photosensitive elements and defining an optical receiving axis, at least one reading distance and, on a substrate, at least one region framed by the sensor at the at least one reading distance;
- an illumination device, comprising an array of adjacent light sources, defining an illumination optical axis,
characterized in that:
- the light sources are individually drivable, each light source being adapted to illuminate an area of a size much smaller than the size of the region framed by the sensor,
- the illumination axis does not coincide with the receiving axis,
- the image capture device comprises a driver of the light sources, adapted to drive the light sources so as to switch off, at least at the at least one reading distance, the light sources illuminating outside the boundary of the region framed by the sensor on the substrate.
In the present description and in the appended claims, the expression "optical receiving axis" means the direction defined by the centers of the elements of the receiver optics or, in the case of a single lens, by the centers of curvature of its optical surfaces.
In the present description and in the appended claims, the expression "illumination optical axis" means the mean direction of the maximum illumination beam, i.e. the beam emitted by the illumination device when all the light sources of the array are switched on, apart from the possibly different angular divergences of the light sources at the opposite ends of the array.
It should be noted that, in the present description and in the appended claims, the term "axis" is used for the sake of simplicity, but in practice, in both cases, it is a half-axis.
In the present description and in the appended claims, the term "adjacent" means that between the light sources there are no components having a function different from the emission function and/or from functions ancillary thereto, such as the addressing, driving, heat dissipation or optical isolation of the light sources; this term should therefore not be construed in a limiting sense as indicating that the light sources are in mutual contact.
In the present description and in the appended claims, the term "boundary" of the region framed by the sensor on the substrate means a band having a thickness at most equal to that of the region illuminated by a single light source of the array. In other words, the term takes into account the fact that the light sources are in any case finite in number, and that the region illuminated by each light source has a finite size, which defines the resolution limit of the illumination system with respect to the geometric boundary of the region framed by the sensor.
Each individually drivable light source preferably comprises one single emitting element, but it may comprise more than one emitting element.
Preferably, the at least one reading distance comprises a plurality of reading distances within the depth of field, in other words comprised between a minimum reading distance and a maximum reading distance, inclusive.
The reading distances at which the driver is adapted to drive the light sources so as to switch off at least those illuminating outside the boundary of the region framed by the sensor on the substrate can be mutually discrete, or can vary continuously within the depth of field.
Typically, in order to increase the depth of field and/or to delimit the direction and/or shape in space of the region framed by the sensor, the image forming apparatus further comprises at least one receiver optics with fixed or variable focal length. Such receiver optics can in particular comprise a single lens or group of optical components shared by the photosensitive elements of the sensor, and/or an array of lenses, prismatic surfaces and/or apertures, each associated with a subgroup of photosensitive elements or with a single element, as described in the aforementioned EP 1 764 835 A1.
Typically, the image forming apparatus comprises a zoom and/or autofocus system, in which case the region framed by the sensor changes within the depth of field in a manner not directly proportional to the reading distance.
The receiving axis can coincide with the normal to the surface of the sensor, or can be inclined at an angle with respect to such normal.
Preferably, the array of light sources is associated with at least one projection lens, in order to increase the depth of focus on the image side and/or to incline the illumination axis with respect to the normal to the array of light sources. More specifically, each light source can be provided with its own projection lens, and/or at least one single projection lens shared by the light sources of the array can be provided.
Each projection lens can be replaced by, or associated with, other optical elements, such as apertures, prismatic surfaces, light guides and/or GRIN lenses, similarly to what is described in the aforementioned EP 1 764 835 A1.
The illumination axis can coincide with the normal to the plane of the array, or can be inclined at an angle with respect to such normal.
In some embodiments, the illumination axis is parallel to, and offset from, the receiving axis.
In other embodiments, the illumination axis is inclined with respect to the receiving axis and not coplanar with it. In case the two axes are inclined, they can be incident, typically crossing in front of the sensor, or they can be skew.
In some embodiments, the array and the sensor are coplanar, so that they can advantageously be made on the same support of the same integrated circuit board, or formed on the same integrated circuit substrate.
In other embodiments, the array and the sensor are arranged in mutually inclined planes, so that the angle of inclination between the illumination axis and the receiving axis is advantageously determined, or contributed to, thereby.
Preferably, the light sources of the array, if all switched on, are adapted to globally illuminate an area larger than the maximum region framed by the sensor within the depth of field.
More specifically, the number of light sources is selected so that, when a single light source is switched on/off, the area globally illuminated by the illumination device on the substrate undergoes a sufficiently small percentage change.
Preferably, the percentage change is less than or equal to 15%, more preferably less than or equal to 10%, still more preferably less than or equal to 5%.
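The percentage-change constraint can be read as a lower bound on the number of simultaneously lit sources: if the lit sources contribute roughly equal, non-overlapping patches, toggling one of them changes the globally illuminated area by about 1/n. A minimal sketch of this back-of-the-envelope bound (my own arithmetic, not a formula from the patent):

```python
import math

def min_sources_on(max_percent_change):
    """Smallest number of simultaneously lit sources such that switching a single
    source on/off changes the globally illuminated area by at most the given
    percentage, assuming equal, non-overlapping patches per source."""
    return math.ceil(100.0 / max_percent_change)

for p in (15, 10, 5):
    print(f"<= {p}% change needs >= {min_sources_on(p)} lit sources")
```

Under this reading, even the smallest preferred arrays (32 sources in the one-dimensional case, 32 × 32 in the two-dimensional case) satisfy the strictest 5% figure with ample margin.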
Preferably, the driver is adapted not to switch on all the light sources of the array at any reading distance.
More preferably, the driver is adapted to switch off, at each reading distance, at least one light source located at an edge of the array. In other words, the driver is adapted not to switch on, at any reading distance, both of the light sources arranged at the opposite ends of the array.
Preferably, the driver is adapted to switch off, at a reading distance, all the light sources illuminating outside the boundary of the region framed by the sensor, and, in an operating mode, to switch on all the light sources illuminating within the boundary of the region framed by the sensor.
Preferably, the driver is adapted to switch on, in an operating mode, only the light sources illuminating at least one region of interest within the region framed by the sensor.
The driver may be responsive to a device for measuring, or to a device for estimating, the reading distance.
The device for measuring the reading distance can be a device distinct from, and communicating with, the reader, such as a photocell system, a device based on phase measurement or on the measurement of the time of flight of a laser or LED beam, in visible or IR (infrared) light, a device based on radar or on ultrasound, etc.
Preferably, however, the driver is adapted to switch on, in an operating mode, light sources of the array selected so as to project a luminous figure for evaluating the reading distance. The reading distance is measured or estimated based on the shape and/or position, on the sensor, of the image formed by the light emitted by at least some of the light sources of the array.
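The underlying principle is plain triangulation: because the array is offset from the receiving axis, the image position of a spot projected by a known source shifts with the substrate distance. A sketch under a pinhole model with parallel illumination and receiving axes (the symbols u, b, f and alpha are assumptions of this sketch, not the patent's notation):

```python
import math

def reading_distance(u, b, f, alpha=0.0):
    """Estimate the reading distance z from the image coordinate u (same length
    units as the focal length f) of a spot projected by a source at lateral
    offset b emitting at angle alpha to the receiving axis, using the pinhole
    relation u = f * (b / z + tan(alpha))."""
    return f * b / (u - f * math.tan(alpha))

# Synthetic check: substrate at z = 0.2 m, b = 20 mm, f = 8 mm, on-axis source.
z_true, b, f = 0.2, 0.02, 0.008
u = f * (b / z_true)              # spot position predicted by the same model
print(reading_distance(u, b, f))  # recovers ~0.2 m
```

In a real device the same relation would be inverted per-source, so that the shape of the whole projected figure (several spots) gives both distance and a consistency check.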
The driver can be adapted to switch on, in an operating mode, light sources of the array selected so as to globally illuminate a figure for aiming at the region framed by the sensor and/or at least one region of interest thereof.
The driver can be adapted to switch on, in an operating mode, light sources of the array selected so as to globally illuminate a figure for indicating the result of an attempt to capture an image of the region framed by the sensor.
The light sources of the array are preferably individually drivable also in terms of emission intensity.
Preferably, the array of light sources is adapted to emit light of more than one wavelength. In particular, the array may comprise a first subset of light sources adapted to emit at a first wavelength and at least a second subset of light sources adapted to emit at a second wavelength different from the first wavelength. Alternatively, each light source can be adapted to selectively emit light of different wavelengths.
With such provisions, the color of the illumination can, for example, be adjusted according to the colors of the optical code and of its background. Moreover, different indications of the result of a capture or reading attempt can easily be provided, for example by projecting a green luminous figure for a positive result and a red luminous figure for a negative result. Moreover, when several regions of interest are aimed at, also for ease of selection by the user, the luminous figures can be diversified.
The array of light sources can be one-dimensional or two-dimensional.
The array of light sources can be flat or curved. By arranging the light sources on a curved surface, the length of the optical path between each light source and the substrate can be made identical, or substantially identical, thus compensating for the different attenuations that the light emitted by the light sources would undergo in the case of a flat array, and therefore obtaining a uniform illumination at the reading distance. The curved arrangement can also be used to determine, or to contribute to determining, the divergence of the principal beams of the different light sources.
Preferably, the number of light sources of the array is greater than or equal to 32 in the one-dimensional case, or greater than or equal to 32 × 32 in the two-dimensional case.
More preferably, the number of light sources of a two-dimensional array is selected from the group comprising 32 × 32, 64 × 64, 44 × 32 and 86 × 64, and, in the one-dimensional case, from the group comprising 32 and 64.
In an embodiment, the driver is adapted to switch off, at a reading distance, at least all the light sources illuminating outside the boundary of a first half of the region framed by the sensor, and the image capture device further comprises a second array of individually drivable adjacent light sources, the second array defining a second illumination axis which does not coincide with the receiving axis; the driver of the light sources is adapted to drive the light sources of the second array so as to switch off at least the light sources illuminating outside the boundary of a second half of the region framed by the sensor, complementary to the first half.
In an embodiment, the image capture device further comprises a second array of individually drivable adjacent light sources, the second array defining a second illumination axis which does not coincide with the receiving axis; the driver of the light sources is adapted to drive the light sources of the second array so as to switch off at least the light sources illuminating outside the boundary of the region framed by the sensor.
In an embodiment, the driver is adapted to establish in real time, at least as a function of the reading distance, which individual light sources of the array are to be switched on or off.
In an embodiment, the real-time establishment is carried out by an analytical method, in other words through analytical formulas depending only on known (design) geometric parameters of the reader, and in particular of its image forming apparatus, of its illumination device and/or of their arrangement in space, including the arrangement in space of their components or subcomponents.
Preferably, the analytic method comprises the following steps:
- calculating the coordinates, in a first reference system associated with the receiving device, of given points of the region framed by the sensor on the substrate;
- transforming those coordinates into a second reference system associated with the illumination device; and
- calculating, in the second reference system, the light sources of the array that illuminate the corresponding given points.
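The three steps above can be sketched in code. The rigid-transform parameters `rot` and `trans`, the per-source tangent `pitch` and the pinhole-style projection through the illumination vertex are all illustrative assumptions, not the patent's formulas (1) to (31):

```python
def sources_for_framed_region(points_receiver, rot, trans, pitch, n_cols, n_rows):
    """Analytic sketch of the three steps above.

    `rot` (3x3 nested lists) and `trans` (length-3) express an assumed
    rigid transform between the receiver and illuminator reference
    systems; `pitch` is the assumed tangent subtended by one source as
    seen from the illumination vertex.
    """
    lit = set()
    for p in points_receiver:
        # Steps 1-2: coordinates of the point, moved into the
        # illuminator reference system.
        q = [sum(rot[i][j] * p[j] for j in range(3)) + trans[i]
             for i in range(3)]
        if q[2] <= 0:
            continue  # point behind the illuminator: no source reaches it
        # Step 3: project through the illumination vertex and quantize to
        # the source grid (array assumed centered on the illumination axis).
        col = round(q[0] / q[2] / pitch) + n_cols // 2
        row = round(q[1] / q[2] / pitch) + n_rows // 2
        if 0 <= col < n_cols and 0 <= row < n_rows:
            lit.add((row, col))
    return lit
```

Applying this to the corner points of the framed region yields the perimeter sources; points outside the array's reach are simply skipped.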
Preferably, one or more of the formulas (1) to (31) set out below are applied in the above steps.
In an embodiment, the real-time determination is performed, at least in part, by an empirical or adaptive method, which comprises recursively driving on a subset of the light sources, evaluating the position and/or extent of the area illuminated on the substrate with respect to the region framed by the sensor, and adapting the subset of light sources on the basis of this evaluation.
The initial subset of light sources can be predetermined, analytically or by the empirical or adaptive method itself; the empirical or adaptive method can thus be used, for example, to correct the inaccuracies of the analytically determined source array in each mass-produced image capture device.
In an embodiment, the recursive adaptation of the switched-on subset of light sources is performed along a plurality of radially spaced directions.
In an embodiment, the switched-on subset of light sources is determined by interpolating the positions of the end light sources to be switched on along said plurality of directions.
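The recursive adaptation along one radial direction can be sketched as a binary search. The callback `is_inside` is a hypothetical stand-in for one feedback cycle of the empirical method (switch a source on, capture a frame, evaluate the illuminated spot against the framed region); monotonicity along the direction is an assumption:

```python
def adapt_edge_source(is_inside, lo, hi):
    """Empirical/adaptive sketch for one radial direction of the array.

    `is_inside(i)` reports whether the spot of source i falls inside the
    region framed by the sensor (an assumed measurement, not a real API).
    Source `lo` is assumed inside, and the inside/outside property is
    assumed monotone along the direction, so a binary search finds the
    outermost source to keep switched on.
    """
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if is_inside(mid):
            lo = mid       # spot still inside: the edge lies farther out
        else:
            hi = mid - 1   # spot outside: pull the edge back
    return lo
```

Running this along several radially spaced directions and interpolating between the resulting end sources reproduces the interpolation variant described above.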
In an alternative embodiment, the driver is adapted to determine which light sources are to be individually switched on or off as a function of the reading distance by reading them from a lookup table.
The driver can be adapted to build the lookup table once and for all (una tantum), in particular with an analytic or empirical/adaptive method analogous to the real-time determination.
Alternatively, the driver can be adapted to receive the lookup table as an input, the lookup table being built once and for all by a separate processor, with an analytic or empirical/adaptive method analogous to the real-time determination.
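A minimal sketch of the lookup-table variant, under the assumption that the table maps calibrated reading distances to precomputed on-sets and that, at run time, the driver picks the nearest calibrated entry:

```python
def build_lut(distances, sources_at):
    """Lookup-table sketch, built once and for all (una tantum).

    `sources_at(d)` is whatever analytic or empirical/adaptive routine
    yields the set of sources to switch on at calibrated distance d;
    it is an assumed callable, not an API from the patent.
    """
    return {d: frozenset(sources_at(d)) for d in distances}


def sources_for_distance(lut, d):
    """At run time the driver picks the entry whose calibrated distance
    is nearest to the measured reading distance d."""
    nearest = min(lut, key=lambda cal: abs(cal - d))
    return lut[nearest]
```

Whether the table is built by the driver itself or received as an input from a separate processor, the run-time step stays the same cheap dictionary lookup.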
If the determination of the light sources to be individually switched on or off as a function of the reading distance takes place once and for all in a separate processing unit, it is preferably implemented by a computer program in which one or more quantities of the image capture device are parameterized. In this way, advantageously, the same computer program can be used, for example, for a whole series of reader models.
Such a computer program represents a further aspect of the present invention.
The light sources of the array are preferably of the solid-state type, or organic light sources; more preferably, said light sources are selected from the group comprising LEDs, OLEDs, micro LEDs and micro lasers.
In another aspect, the present invention relates to an imager-type optical information reader comprising an image capture device as described above.
In another aspect, the present invention relates to a computer-readable storage device comprising the aforementioned program.
In another aspect, the present invention relates to an optical reader comprising an array of individually drivable adjacent light sources, and a driver of the light sources of the array adapted to drive them in an illumination mode, in an aiming mode and in a reading-result indication mode.
Preferably, the driver is also adapted to drive the light sources in an optical distance measurement or estimation mode.
Brief description of the drawings
Further features and advantages of the present invention will be better highlighted by the following description of some embodiments thereof, made with reference to the accompanying drawings, in which:
Fig. 1, already described in detail, illustrates a prior-art image capture device in which the illumination device is not coaxial with the image forming device,
Fig. 2 schematically illustrates an imager-type optical information reader according to the present invention,
Fig. 3 schematically illustrates an image capture device according to the present invention,
Fig. 4 illustrates, at an enlarged scale, part of an array of micro LEDs with a pre-collimation lens on each light source,
Fig. 5 illustrates the illumination by an illumination device, not coaxial with the image forming device, having a flat array of light sources,
Fig. 6 illustrates the illumination by an illumination device, not coaxial with the image forming device, having a curved array of light sources,
Figs. 7 to 9 are block diagrams illustrating some embodiments of the driving of the light sources of the illumination device,
Figs. 10 to 17 are representations of the geometry of the image capture device or of parts thereof,
Fig. 18 is a block diagram illustrating another embodiment of the driving of the light sources of the illumination device,
Fig. 19 is a graphical representation of the embodiment of driving the light sources of the illumination device of Fig. 18,
Fig. 20 is a graphical representation of an embodiment of driving the light sources of the illumination device,
Figs. 21a, 21b and 21c illustrate an overall block diagram detailing the embodiment of driving the light sources of the illumination device of Fig. 20,
Figs. 22 to 27 are schematic representations of various embodiments of the image capture device,
Fig. 28 is a representation of the geometry of an embodiment of the illumination device of the image capture device,
Fig. 29 illustrates the light sources of an embodiment of the image capture device that are to be switched on at different working distances in order to illuminate the whole region framed by the sensor,
Figs. 30 to 37 schematically illustrate further functions of the illumination device of the image capture device,
Figs. 38 and 39 schematically illustrate further embodiments of the image capture device.
Detailed description of embodiments
Fig. 2 is a schematic block diagram of a reading system or reader 1 of optical information of the imager type according to the present invention.
The reader 1 comprises an image capture device 2, capable of capturing or acquiring an image of optical information C, illustrated in Fig. 2 by a two-dimensional optical code present on a substrate S.
The image capture device 2, better described hereinafter, comprises an image forming device or assembly 3, which includes a sensor 4 in the form of an array of photosensitive elements, linear or, preferably and as shown, of the matrix type, capable of generating electric signals from an optical signal, in other words from the light R emitted by the substrate S, modulated by the graphic elements present thereon, in particular by the code or other optical information C.
Even if not strictly necessary, the image forming device 3 generally also comprises receiver optics 5, capable of forming on the sensor 4 a sufficiently focused image of the substrate S containing the optical information C, or of a region thereof.
The image capture device 2 also comprises an illumination device or assembly 6, adapted to project an illumination beam T towards the substrate S.
The reader 1 further comprises a processing and/or control device 7, capable of extracting the information content from the image captured by the image capture device 2 or by a part thereof, for example of decoding the two-dimensional code C, and of controlling the other components of the reader 1.
The processing and/or control device 7 is substantially known and comprises: hardware and/or software means for processing the signal emitted by the sensor 4, for example filters, amplifiers, samplers and/or binarizers; modules for reconstructing and/or decoding optical codes, including tables for looking up the possible codes and tables for looking up any plain-text information associated with the possible codes; optical character recognition modules, and so on.
The captured images and/or the results of their processing, the programming code of the reader 1, the values of its process parameters and the lookup tables are typically stored in digital form in at least one temporary and/or mass storage device 8, possibly removable from the reader 1. The storage device 8 also acts as a working memory for the execution of the software algorithms.
The reader 1 can also comprise a communication device or interface 9, for communicating the captured images and/or the extracted information content to the outside of the reader 1 and/or for entering configuration data for the reader 1 from an external source.
The reader 1 further comprises at least one output device 10, for displaying to the user, for example, alphanumeric and/or graphic information relating to the operating state of the reader 1, the information content read, and so on, and/or for displaying the image currently framed by the sensor 4. The output device 10 can alternatively or additionally comprise a printer, a speech synthesizer or another device for outputting the aforementioned information.
The reader 1 further comprises at least one device 11 for the manual input of control signals and/or data, for example for configuring the reader, such as a keyboard or a plurality of buttons or a joystick, arrow keys, a mouse, a touchpad, a touch screen, a voice control device, and so on.
The reader 1 further comprises at least one power supply device 12, to supply the various components with suitable voltage and current levels, using a battery or drawing power from the mains or from an external device through a signal cable.
The reader 1 further comprises a driver 13 of the illumination device 6, better described hereinafter.
As better described hereinafter, the driver 13 and the illumination device 6 preferably implement, besides the function of illuminating the substrate S or one or more regions of interest (ROI) thereof for the capture of an image by the image forming device 3, also the illumination functions of one or more of the following devices: an aiming device, an output indication device, a device for detecting the presence of the substrate S and/or for optically measuring or estimating the reading distance and/or the focusing condition of the image capture device 2 (viewfinder).
The processing and/or control device 7 can be implemented by one or more processors, in particular one or more microprocessors or microcontrollers, and/or by circuits with discrete or integrated components.
Similarly, the driver 13 can be implemented by one or more circuits with discrete or integrated components and/or by one or more processors, in particular one or more microprocessors or microcontrollers.
Moreover, although the processing and/or control device 7 and the driver 13 are shown in Fig. 2 as separate devices, they can share one or more of such circuits and processors, and/or share one or more devices implementing the storage device 8.
More generally, it is understood that Fig. 2 illustrates the various blocks from a functional point of view. From a physical point of view, the various components of the above-described reader 1 can be made as discrete items, provided that they communicate with one another as schematically illustrated in Fig. 2, for control, data and/or power supply signals. The connections can be made via cable and/or wirelessly.
Thus, the above-described reader 1 can be made as a single item, in which the various components are housed in a housing, not shown, having a shape and size suitable, for example, for use in a fixed or portable station; the housing comprises at least one transparent region acting as a passage for the emitted light T and for the received light R. The housing and/or one or more internal supports are also configured to support the components of the image capture device 2 and of the illumination device 6 in a predetermined mutual relationship.
Conversely, the output device 10 and/or the manual input device 11 and/or the processing and/or control device 7 can be implemented, at least in part, by a computer.
Moreover, the illumination device 6 and the image forming device 3 can be formed in separate housings, each having its own transparent region, and be constrained in space in a predetermined mutual relationship during the installation step of the reader or reading system 1.
Fig. 3 illustrates, in more detail but still schematically, an image capture device 2 according to an embodiment of the invention.
The sensor 4 of the image forming device 3 of the image capture device 2 comprises an array of photosensitive elements 14, each of which provides an electric signal whose intensity is a function of the light impinging on it. By way of example, Fig. 3 shows a square two-dimensional sensor 4, but the sensor can also be rectangular, circular or elliptical. The sensor 4 can be made, for example, with C-MOS or CCD technology. Alternatively, the sensor 4 can be driven so as to extract the signal generated by a subset of its photosensitive elements 14 and, as a limit case, each individual photosensitive element 14 can be individually addressable.
The receiver optics 5 of the image forming device 3 of the image capture device 2 is designed to form on the sensor 4 the image of the substrate S containing the optical information C, or of a region thereof. The receiver optics 5 can comprise one or more lenses, one or more apertures, and refractive, reflective or diffractive optical elements, and can be deforming so as to change the effective aspect ratio of the sensor 4. By way of example, in Fig. 3 the receiver optics 5 is shown as an inverting lens arranged in a plane parallel to the sensor 4 and coaxial with it.
The image forming device 3 defines a working space region 15 extending in front of the sensor 4. The working space region 15 is the region of space in which optical information C is correctly framed by the sensor 4 and its image is sufficiently focused on the sensor 4. Within this working space region 15, the best-focus plane can be fixed, or can be changed by an autofocus system. In the representative case of a square sensor 4, a particular case of a rectangular sensor, the working space region 15 is a pyramid or a pyramid frustum; in the case of a circular or elliptical sensor 4, the working space region 15 is a cone or a cone frustum; in the case of a one-dimensional sensor 4, the base of the pyramid becomes very thin, and the working region 15 can be considered substantially flat.
The image forming device 3 also defines the optical axis of the receiver optics 5, in short a reception axis Z. The reception axis Z is defined by the centers of the elements of the receiver optics 5 or, in the case of a single lens, by the centers of curvature of its optical surfaces. As will become clearer hereinafter, the reception axis Z need not be perpendicular to the sensor 4, nor pass through the center of the sensor 4.
In particular, in the case where the receiver optics 5 comprises deflecting elements, the reception axis Z can be non-rectilinear inside the image forming device 3, but it can in any case be modeled, within the meaning of the present invention, by a straight reception axis Z.
A vertex O of the working space region 15, in short a reception vertex O, is arranged along the reception axis Z. The vertex O of the working space region 15 is the vertex of the pyramid or cone; in the case of inverting receiver optics 5 it falls in the optical center of the optics 5, while in the case of non-inverting receiver optics 5 it generally falls behind the sensor 4.
The image forming device 3 also defines the angular widths of the working region 15 about the reception axis Z, usually expressed in terms of four angles β₁, β₂, β₃, β₄ having their origin in the reception vertex O, each lying in one of four mutually orthogonal half-planes whose common edge coincides with the reception axis Z. With reference to the two main directions of the sensor 4, namely the row and column directions of its photosensitive elements 14, one can speak of a "horizontal" field of view expressed by the angles β₁, β₃ and of a "vertical" field of view expressed by the angles β₂, β₄. In the particular case of a sensor 4 coaxial with, and centered on, the receiver optics 5, the working space region 15 is symmetric and, in absolute value, β₁ = β₃ and β₂ = β₄. In the case of a one-dimensional sensor, the "vertical" field of view is much smaller than the "horizontal" one and is generally negligible.
The image forming device 3 also defines a depth of field DOF, which expresses the extent of the working space region 15 along the reception axis Z.
In Fig. 3, a substrate at a generic reading distance D is indicated with S, and the region correspondingly framed by the sensor with 16; as particular cases, a substrate at the minimum reading distance D₁ is indicated with S₁ and the framed region with 16₁, while a substrate at the maximum reading distance D₂ is indicated with S₂ and the framed region with 16₂. The depth of field is therefore given by DOF = D₂ − D₁.
It should be emphasized that the reading distances D, D₁, D₂ are measured along the reception axis Z from the reception vertex O, even though the reception axis Z need not be perpendicular to the sensor 4, nor to the region 16 framed by the sensor 4 on the substrate.
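Under the pyramid model above, the extent of the framed region at a given reading distance follows directly from the four field-of-view angles; a sketch for the simple case of a substrate perpendicular to the reception axis (the tangent model and argument names are assumptions, not formulas taken from the patent):

```python
import math

def framed_region_size(D, beta1, beta2, beta3, beta4):
    """Geometry sketch: extent of the region framed by the sensor on a
    substrate at reading distance D, measured along the reception axis Z
    from the reception vertex O.  Assumes a substrate perpendicular to Z
    and the four field-of-view angles given in radians.
    """
    width = D * (math.tan(beta1) + math.tan(beta3))   # "horizontal" field
    height = D * (math.tan(beta2) + math.tan(beta4))  # "vertical" field
    return width, height
```

For a sensor centered on the receiver optics (β₁ = β₃, β₂ = β₄) each extent reduces to 2·D·tan β; the model applies between D₁ and D₂, whose difference gives the depth of field DOF.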
The working space region 15 can be fixed, or can be dynamically changed in size and/or proportions by zoom and/or autofocus systems of a well-known type, such as electromechanical, piezoelectric or electro-optical actuators for moving one or more lenses or apertures, mirrors or other components of the receiver optics 5, and/or for changing the curvature of one or more lenses, such as liquid or deformable lenses, and/or for moving the sensor 4. In a preferred embodiment, the receiver optics 5 comprises an Arctic 416 SL-C1 liquid lens manufactured by Varioptic SA of France.
In other words, although in Fig. 3 it is assumed for the sake of simplicity that the field-of-view angles β₁, β₂, β₃, β₄ are the same at the different reading distances D, they can be changed along the reception axis Z by the zoom system, so that the working region 15 is no longer a static pyramid or cone frustum, but has variable size and/or proportions. The description of the invention remains fully valid in any case.
The illumination device 6 of the image capture device 2 of the imager-type optical information reader 1 comprises an array 17 of adjacent light sources 18. In Fig. 3, for the sake of clarity, only some of the light sources 18 are illustrated.
The light sources 18 of the array 17 can be individually driven by the driver 13 so as to be switched on and off and, preferably, can also be individually driven in terms of emission intensity and/or of emitted wavelength or wavelength range. This defines what is referred to herein as a "pixelated source", which can also be called a PPEA (programmable photonic emitter array).
The light sources 18 of the array 17 preferably each comprise a single illumination element, the illumination elements being identical to one another in shape and size. However, the light sources 18 of the array 17 can also comprise illumination elements of different shapes and/or sizes. Moreover, each light source 18 of the array 17 can comprise a plurality of illumination elements grouped in clusters of equal or different shape and/or size. In other words, the driving of the pixelated source can take place at the level of clusters of illumination elements or pixels, provided that the number of individually drivable clusters, in other words of light sources 18 according to the invention, remains sufficiently large to implement the functionalities of the illumination device 6 described below.
The illumination device 6 optionally comprises illumination optics.
The illumination optics can comprise one or more lenses, possibly deformable, and possibly apertures and refractive, reflective or diffractive optical elements, common to the light sources 18 of the whole array 17. Such common illumination optics is shown in Fig. 3, by way of example, as an inverting imaging optics 19a coaxial with the array 17.
As better illustrated by way of example in Figs. 14 to 16 below, the illumination optics can also, alternatively or additionally, comprise a plurality of lenses 19b, each associated with one light source 18 of the array 17. Such lenses 19b, of a size comparable to that of the light sources 18 or of their illumination elements, have the function of determining, and in particular of reducing, the effective emission angle of the individual light sources 18, and can also have the function of determining the orientation of the illumination beam emitted by each individual light source 18.
Each lens 19b can be replaced by, or combined with, other optical elements such as apertures, prismatic surfaces, light guides or GRIN lenses, preferably in order to select the direction of the beam emitted by each individual light source, for example as described in the aforementioned EP 1 764 835 A1.
As shown by way of example in Fig. 16 below, the plurality of lenses 19b can also be used in combination with a common non-inverting imaging optics 19c, or with a common inverting imaging optics 19a.
The light sources 18 of the array 17 are preferably made in the form of an integrated circuit on a common substrate. Preferably, the light sources 18 are also driven through an address bus with a row index and a column index.
Preferably, the fill factor, namely the ratio between the total area occupied by the active surfaces of the light sources 18 (or by the lenses 19b) and the total area of the substrate of the integrated circuit on which the sources (lenses) are arranged, is high, preferably greater than 90%.
In one embodiment, the light sources 18 of the array 17 are micro LEDs. Micro LEDs are miniature emitters, made for example with gallium nitride (GaN) technology, with an emission surface whose larger linear dimension is equal to about 20 microns, and generally down to as little as 4 microns; with this technology, an array 17 containing thousands or tens of thousands of light sources 18 (for example an array of 512 × 512 illumination elements within a size of a few mm) can be made in very small dimensions and with very low cost and power consumption. Such devices can also emit at different wavelengths.
In one embodiment, the light sources 18 of the array 17 are OLEDs (Organic Light Emitting Diodes). An OLED is an electro-optical device obtained by arranging a series of thin organic films between two conductors. When a current is applied, a light flux is emitted. This process is known as electroluminescent phosphorescence. Even with multilayer systems, an array 17 of OLEDs 18 is very thin, typically less than 500 nanometers (0.5 thousandths of a millimeter), and down to 100 nm. OLEDs consume very little energy and require very low voltages (2 to 10 volts). OLEDs can emit at different wavelengths in the visible spectrum. OLEDs can also be arranged in very dense arrays, with densities generally up to 740 illumination elements per inch (pixels/inch), each of 15 square microns ("OLED/CMOS combo opens a new world of microdisplay", Laser Focus World, December 2001, Vol. 37, No. 12, Pennwell Publications, available at: "http://www.optoiq.com/index/photonics-technologies-applications/lfw-display/lfw-article-display/130152/articles/laser-focus-world/volume-37/issue-12/features/microdisplays/oled-cmos-combo-opens-a-new-world-of-microdisplay.html"; "Organically grown: Luminescent organic crystals and polymers promise to revolutionize flat-panel displays with possibilities for low-cost manufacture and portability", Laser Focus World, August 2001, Vol. 37, No. 8, Pennwell Publications, available at: "http://www.optoiq.com/index/photonics-technologies-applications/lfw-display/lfw-article-display/113647/articles/laser-focus-world/volume-37/issue-8/features/back-to-basics/organically-grown.html"). OLEDs have a very wide emission angle, generally up to 160°. An array 17 of OLEDs can be provided on a flexible substrate and thus take a curved configuration. An array 17 of OLEDs can also be formed so that the emitting elements have different shapes and/or sizes.
In one embodiment, the light sources 18 of the array 17 are LEDs (Light Emitting Diodes). LEDs are photoelectronic emitters with linear dimensions from about 50 microns up to 350 microns or more; these devices achieve high efficiency, but at the cost of a large chip size and of the need for heat dissipation elements between them, so that the resulting array 17 is rather bulky and the emitters have large inactive areas between them, i.e. the fill factor is low. Alternatively, LED emitters can be formed on a substrate, for example a C-MOS substrate, as described in the aforementioned document US 5,319,182, but with lower efficiency. Moreover, the driving of LEDs 18 tends to require a contact at their center, which generates a shadow at the center of the respectively illuminated regions. Even if there are ways of avoiding this drawback, such as the contact geometries proposed in the aforementioned US 6,811,085, such systems are rather expensive and consume a rather large amount of energy; in addition, they often require a rather large heat dissipation area near each light source 18, which reduces the fill factor, as stated above.
In one embodiment, the light sources 18 of the array 17 are lasers combined with micro-mirrors manufactured with MEMS (Micro Electro-Mechanical Systems) technology, each micro-mirror being movable into an orientation in which it does not let the light through (in other words, within the meaning of the present invention, the laser is switched off) and into at least one orientation in which it lets the light through (in other words, within the meaning of the present invention, the laser is switched on). Such devices are known in the field as "micro projectors". A laser can be provided in combination with each micro-mirror, or a single laser can be common to several micro-mirrors. However, the presence of movable parts entails a certain amount of consumption and wear.
Other technologies can be used to make the array 17 of light sources 18.
As an example of an array 17 of light sources 18, Fig. 4 illustrates, at a greatly enlarged scale, part of an array 17 of micro LEDs with a pre-collimation lens 19b on each light source 18.
The illumination device 6 is constructed so that each light source 18 of the array 17 emits an elementary illumination beam having its own mean propagation direction in the space in front of the illumination device 6. As will be better explained hereinafter, the illumination device 6 is also constructed so that the areas illuminated on the substrate S by adjacent light sources 18 of the array 17 are mutually adjacent and possibly slightly overlapping, so as to form an overall illumination beam, indicated below with T, whose shape and size depend on how many, and which, light sources 18 are currently switched on by the driver 13. The number of light sources 18 of the array 17 is selected so that the region of the substrate S illuminated by the illumination device 6 as a whole undergoes a sufficiently small percentage change when an individual light source 18 is switched on/off. Preferably, this percentage change is less than or equal to 15%, more preferably less than or equal to 10%, still more preferably less than or equal to 5%.
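The percentage-change criterion implies a lower bound on the number of sources; a sizing sketch under one assumed reading of the criterion, namely that adjacent sources tile the illuminated region uniformly along an axis, so that toggling one edge source changes the region by roughly 1/N:

```python
import math

def min_sources_per_axis(max_percent_change):
    """Sizing sketch: if N adjacent sources tile the illuminated region
    along one axis, switching one edge source on/off changes the region
    by about 1/N, so the stated percentage limit bounds N from below.
    The uniform-tiling model is an assumption, not the patent's rule."""
    return math.ceil(100.0 / max_percent_change)
```

Under this model the limits of 15%, 10% and 5% require at least 7, 10 and 20 sources per axis, consistent with the preferred arrays of 32 or more sources per dimension.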
Fig. 3 illustrates the illumination beam T₀ that would be emitted by the illumination device 6 if all the light sources 18 of the array 17 were switched on, apart from the indeterminacy in the direction of the light sources at the opposite ends of the array 17.
The illumination device 6 defines an illumination optical axis A, which is the mean direction of this maximum illumination beam T₀, acting as the axis of symmetry of the illumination beam T₀ in at least one plane and, in the illustrated case of a two-dimensional array 17, typically in two perpendicular planes.
In the case of common illumination optics 19a, 19c with the array 17 centered on the optical axis of this common illumination optics 19a, 19c, the illumination axis A is defined by the centers of the elements of the common illumination optics 19a, 19c or, in the case of a common single lens 19a, 19c, by the centers of curvature of its optical surfaces. In particular, in the case where the illumination optics 19a, 19b, 19c comprises deflecting elements, the illumination axis A can be non-rectilinear inside the illumination device 6, but it can still be modeled, within the meaning of the present invention, by a straight illumination axis A.
In the representative case of a square or, more generally, rectangular two-dimensional array 17, the maximum illumination beam T₀ is pyramid- or pyramid-frustum-shaped; in the case of a circular or elliptical array 17, the illumination beam T₀ is cone- or cone-frustum-shaped; in the case of a one-dimensional array 17, the base of the pyramid becomes very thin, its thickness being equal to the size of the area illuminated by a single light source 18, and the maximum illumination beam T₀ can be considered substantially flat.
The illumination device 6 also defines an illumination vertex A₀, which is the vertex of the aforesaid pyramid or cone: in the case of common inverting illumination optics 19a, the illumination vertex A₀ coincides with its optical center, while in the case of non-inverting illumination optics 19b, 19c, the vertex A₀ generally falls behind the array 17.
It should be emphasized that, as will become apparent hereinafter, depending on the orientation and positioning of the common illumination optics 19a, 19c with respect to the array 17, and/or on the geometry of the individual lenses 19b associated with the light sources 18, the illumination axis A need not be perpendicular to the array 17, nor pass through the center of the array 17.
According to the invention, the illumination axis A does not coincide with the reception axis Z; in other words, the illumination device 6 and the image forming device 3 are not coaxial. Typically, the reception vertex O and the illumination vertex A₀ do not coincide, and the illumination axis A and the reception axis Z are mutually inclined. However, provided that the reception vertex O and the illumination vertex A₀ do not coincide, the illumination axis A can be parallel to the reception axis Z; and provided that the illumination axis A and the reception axis Z are mutually inclined, the reception vertex O and the illumination vertex A₀ can in principle coincide.
According to the present invention, the driver 13 of the light source 18 of array 17 is suitable for driving light source 18 in a manner of described below, To close at general reading distance D the light source for the outside boundaries for illuminating the region 16 taken on substrate S by 4 frame of sensor 18.Therefore, in figure 3, drawing reference numeral 20 indicates in array 17 and illuminates what is taken by 4 frame of sensor at reading distance D The light source 18 on the border in region 16.At distance D, driver 13 is responsible for opening the light source 18 in circumference 20 (containing circumference 20), and And close the light source in the outside of circumference 20.In the case of it is desirable that only illuminating the part in the region 16 taken by 4 frame of sensor, such as Hereinafter preferably describe, driver 13 will be responsible for opening the only one subset of the light source 18 in circumference 20.
It should be emphasized that, here and in the remainder of the description and claims, "switching off" and "switching on" and their derivative forms do not necessarily indicate a change of state: if a light source 18 is already in the desired state, the driver 13 maintains that state.
By "border" of the region 16 framed by the sensor should be understood a layer straddling the geometric perimeter of the region, the thickness of which is determined by the area illuminated by a single light source 18 of the array 17, and which is therefore fairly small with respect to the whole region 16 framed by the sensor 4.
As particular cases, in Fig. 3, reference numeral 20_1 indicates the light sources 18 of the array 17 that illuminate, at the minimum reading distance D_1, the border of the region 16_1 framed by the sensor 4; at this distance D_1 the driver 13 takes care of switching on at most all the light sources within the perimeter 20_1 (perimeter 20_1 included) and of switching off the light sources outside the perimeter 20_1. Reference numeral 20_2 indicates the light sources 18 of the array 17 that illuminate, at the maximum reading distance D_2, the border of the region 16_2 framed by the sensor 4; at this distance D_2 the driver 13 takes care of switching on at most all the light sources within the perimeter 20_2 (perimeter 20_2 included) and of switching off the light sources outside the perimeter 20_2.
As can be seen from Fig. 3, as the reading distance D varies between D_1 and D_2, the change, within the array 17, of the peripherally switched-on light sources 20_1, 20 and 20_2 makes it possible to correct the parallax error and the perspective distortion error inherent in the non-coaxial arrangement of the lighting device 6 relative to the image forming device 3 (in the particular case shown here, the axis A is also inclined relative to the axis Z, so that the regions 20, 20_1, 20_2 have a trapezoidal shape). Reference numerals 21, 21_1, 21_2 illustrate the perimeters of the regions that would be illuminated, at distances D, D_1, D_2 respectively, if all the light sources 18 of the array 17 were switched on; in other words, the intersections of the substrate S, S_1, S_2 at the different distances D, D_1, D_2 with the maximum illumination beam T_0. It should be noted how each of such maximum illumination regions 21, 21_1, 21_2 extends far beyond the region 16, 16_1, 16_2 framed by the sensor 4 at the corresponding distance; this entails a corresponding waste of energy and, where the light emitted by the light sources 18 is in the visible spectrum, the drawback of giving the user a misleading visual indication of the region 16, 16_1, 16_2 framed by the sensor 4.
Although not generally apparent from Fig. 3, where common illumination optics 19a, 19c are present, the optical path between each single switched-on light source 18 of the array 17 and the region 16, 16_1, 16_2 framed by the sensor 4 on the substrate S, S_1, S_2 is not constant. As a consequence, the illumination non-uniformity and/or loss of focus schematically shown in Fig. 5 occur.
Such illumination non-uniformity and/or loss of focus could be corrected through a suitable design of the illumination optics 19a, 19b, 19c, but this may prove particularly burdensome.
Alternatively or additionally, the driver 13 can drive the light sources 18 so that they emit with different intensities, in particular with intensity increasing from right to left in Fig. 3.
It should be emphasized that, by modulating the intensity of the single light sources 18, possible non-uniformities in the intensity of the light sources 18 themselves can also be corrected, thus increasing the insensitivity of the lighting device 6 to manufacturing tolerances. In other words, it is not necessary to have a uniform emitter array 17.
Still alternatively or additionally, the array 17 of light sources 18 may be arranged on a curved surface corresponding to the best-focus curve (caustic curve) of the common illumination optics 19a, 19b, 19c, a surface which, in the case of a one-dimensional array, generally degenerates into a curve, so that the outermost light sources 18 of the array 17 are placed at the correct distance for the common illumination optics 19a, 19b, 19c to project a focused image onto the substrate S. An embodiment with a curved array 17 is schematically illustrated in Fig. 6 and is especially feasible in the case of OLEDs. In an embodiment, the light sources 18 of the array 17 may be arranged on a curved surface with concavity opposite to that of Fig. 6. In that case, the illumination beams of the single light sources 18 diverge and the illumination optics may be dispensed with.
Different methods exist according to which the driver 13 selects the light sources 18 of the array 17 to be switched on, and optionally their intensity and/or emission wavelength(s), as a function of the reading distance D within the depth of field DOF of the image forming device 3, so as to illuminate only and completely the region 16 framed by the sensor 4 on the substrate S. In the following, for brevity, reference will be made only to the determination of the light sources 18 to be switched on, it being understood that the corresponding intensities and/or emission wavelength(s) can be determined at the same time.
First of all, said determination can be carried out in real time or once and for all.
In the case of real-time determination, the driver 13 itself should comprise hardware and/or software modules implementing the determination algorithm. As illustrated in Fig. 7, in step 100 the current working distance D within the depth of field DOF (D_1 ≤ D ≤ D_2) is set or detected. In step 101, the subset 18a of the light sources 18 that must be switched on to illuminate only the whole region 16 framed by the sensor 4 is determined, specifically according to one of the methods described below. In step 102, at most all the light sources of the subset 18a are switched on.
In the case of once-and-for-all determination, a look-up table is built, to which the driver 13 then refers during the normal operation of the image capture device 2 of the reader 1. The driver 13 may again comprise said hardware and/or software modules, or the method may be executed by an external processor, with only the look-up table loaded into the memory 8 of the reader 1 associated with the driver 13. The once-and-for-all determination is preferably carried out for each reading distance D within the depth of field DOF, in other words varying D between D_1 and D_2 continuously or with a suitable sampling rate, and the once-and-for-all determination therefore provides for a cycle of operations. The sampling of the working distance D need not be uniform; in particular, the sampled working distances D may be closer to one another near the minimum working distance D_1, and less close near the maximum working distance D_2, where the configuration of switched-on light sources 18 changes more slowly. With reference to Fig. 8, in step 103 the working distance D is initially set to the minimum working distance D_1 or, respectively, to the maximum working distance D_2. Step 101 is then executed, determining the subset 18a of the light sources 18 that must be switched on to illuminate only the whole region 16 framed by the sensor 4. Then, in step 104, a record is stored in the look-up table, the record comprising the selected working distance D (at the first execution of the cycle, D_1 or D_2 respectively) and the subset 18a determined in step 101. In step 105, the working distance D is then increased or decreased, respectively, by an infinitesimal amount or by an amount based on the preselected sampling. In step 106, it is then checked whether the cycle has been executed over the whole depth of field DOF, in other words whether the working distance D has correspondingly exceeded the maximum working distance D_2 or fallen below the minimum working distance D_1. In the negative, steps 101, 104, 105 and 106 are repeated, so that a new record is inserted into the look-up table. When the cycle has been executed over the whole depth of field DOF, in other words when the check of step 106 is affirmative, the driver 13 can enter the normal use mode. In this mode, in step 100 the current working distance D within the depth of field DOF (D_1 ≤ D ≤ D_2) is set or detected. In step 107, the subset 18a of the light sources 18 that must be switched on to illuminate, at the current working distance D, only the whole region 16 framed by the sensor 4 is read from the look-up table. In step 102, at most the light sources of the subset 18a are switched on.
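The cycle of Fig. 8 and the normal-use lookup can be sketched as follows (the uniform sampling step, the toy subset rule and all names are our own assumptions; step 101 is abstracted as a callable):

```python
def build_lut(d_min, d_max, subset, step):
    """Once-and-for-all determination (Fig. 8, steps 103-106): sweep the
    working distance D over the depth of field and record, for each sampled
    D, the subset 18a of light sources to switch on. subset(D) stands in
    for step 101 (any of the determination methods described in the text);
    the sweep shown runs from d_min upwards, and the sampling step could be
    made non-uniform as noted above."""
    lut = []
    d = d_min                       # step 103: start from D_1
    while d <= d_max:               # step 106: whole depth of field covered?
        lut.append((d, subset(d)))  # steps 101 and 104: determine and record
        d += step                   # step 105: next sampled distance
    return lut

def lookup(lut, d):
    """Normal use (steps 100, 107): return the recorded subset for the
    sampled distance nearest to the current working distance D."""
    return min(lut, key=lambda rec: abs(rec[0] - d))[1]

# Toy subset rule, purely illustrative: more sources as distance grows.
lut = build_lut(100.0, 200.0, lambda d: set(range(int(d // 50))), 25.0)
```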
In the various embodiments of the image capture device 2, the step 101 of determining which light sources 18 must be switched on, at a given working distance D within the depth of field DOF, to illuminate only the whole region 16 framed by the sensor 4 can be carried out according to different methods, both in real time (step 101 of Fig. 7) and once and for all (step 101 of Fig. 8).
A first method is of the analytical type. Once the geometrical and optical configuration of the image capture device 2 has been established, it is in fact possible to calculate, for each reading distance D, which light sources 18 of the array 17 illuminate the elementary region framed by each photosensitive element 14 of the sensor 4. It should be noted that, in practice, the elementary region framed by each photosensitive element 14 of the sensor 4 can be illuminated by at most four light sources 18 arranged adjacent to one another in a square within the array 17.
A preferred embodiment of the analytical method is schematically illustrated in Fig. 9 and then described in greater detail with reference to Figs. 10 to 17.
With reference to Fig. 9, in step 108 the coordinates of some specific points are calculated in a first reference system associated with the image forming device 3, based on the configuration of the image forming device 3 and specifically having the reception vertex O as origin; these points allow the border of the region 16 framed by the sensor 4 on the substrate S to be identified. Such specific points are preferably points whose image is formed on the photosensitive elements 14 defining the perimeter of the sensor 4 and/or whose image is formed on the central photosensitive element 14 of the sensor 4. In particular, in the case of a rectangular or square sensor 4, the reference system is preferably a Cartesian coordinate system and the specific points preferably correspond to those seen at least by the photosensitive elements 14 at two opposite vertices of the sensor 4; in the case of a circular or elliptical sensor 4, the reference system is preferably a cylindrical coordinate system and the specific points preferably correspond to the central photosensitive element 14 and to a peripheral photosensitive element 14, or to two or four outermost peripheral photosensitive elements 14 along the symmetry axes of the sensor 4. There exists, in fact, an analytical relation expressing the coordinates of all the points corresponding to the perimeter of the region 16 framed by the sensor 4 as a function of such specific points.
In step 109, the coordinates of the specific points are transformed into a second reference system, associated with the lighting device 6 and specifically having the illumination vertex A_0 as origin. In particular, in the case of a rectangular or square array 17, the second reference system is preferably a Cartesian coordinate system; in the case of a circular or elliptical array 17, the second reference system is preferably a cylindrical coordinate system. In some cases, in passing from one reference system to the other, it may be appropriate to change, increase or decrease the specific points used in the analytical relation expressing the coordinates of all the points corresponding to the perimeter of the region 16 framed by the sensor 4 on the substrate S, and/or in the analytical relation expressing the coordinates of all the points corresponding to the perimeter of the region illuminated on the substrate by the array 17: for example, if the region 16 framed by the sensor 4 on the substrate S is rectangular and is seen by the lighting device 6 as a trapezoidal region, one may operate on the four vertices, or one may operate, for example, on two opposite vertices or on the centre and one vertex in the first reference system, and obtain the four vertices of the trapezium in the second reference system through the analytical relation of the rectangle.
In step 110, in the second reference system and based on the configuration of the lighting device 6, the light sources 18 of the array 17 illuminating the corresponding specific points are calculated.
The coordinate transformation between the two reference systems performed in step 109 is well known per se. Merely by way of example, with reference to Fig. 10, in the case where the first reference system is a Cartesian coordinate system X, Y, Z with origin at the reception vertex O and the second reference system is a Cartesian coordinate system U, V, W with origin at the illumination vertex A_0, the coordinate transformation is in general a roto-translation (rotation plus translation), which in particular cases may reduce to a rotation or a translation. Denoting by x_0, y_0, z_0 the coordinates of the illumination vertex A_0 of the second reference system in the first reference system, and by cos α_1 ... cos α_9 the direction cosines of the axes U, V, W of the second reference system relative to the axes X, Y, Z of the first reference system (for simplicity of representation, the angles α_1 ... α_9 are indicated in Fig. 10 relative to a reference system U', V', W' translated into O), the transformation is expressed by the following set of relations:
u = (x - x_0)*cos α_1 + (y - y_0)*cos α_2 + (z - z_0)*cos α_3 (1)

v = (x - x_0)*cos α_4 + (y - y_0)*cos α_5 + (z - z_0)*cos α_6 (2)

w = (x - x_0)*cos α_7 + (y - y_0)*cos α_8 + (z - z_0)*cos α_9 (3)
The illumination vertex A_0 is shown positioned in the first quadrant (x_0, y_0, z_0 positive), but the illumination vertex A_0 may be positioned in any quadrant. The illumination vertex A_0 may also lie along one of the axes and/or at the reception vertex O. Moreover, in the case where one or more axes of the two reference systems are parallel and/or coincident, or perpendicular, one or more of the direction cosines cos α_1 ... cos α_9 may be zero or equal to one.
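Merely as an executable restatement of relations (1), (2), (3) (function and variable names are ours), the roto-translation can be written as:

```python
import math

def to_illumination_frame(p, origin, cosines):
    """Roto-translation (1), (2), (3): map a point from the reception frame
    X, Y, Z to the illumination frame U, V, W. origin is (x_0, y_0, z_0),
    the illumination vertex A_0 expressed in X, Y, Z; cosines holds the nine
    direction cosines cos(alpha_1) .. cos(alpha_9)."""
    x0, y0, z0 = origin
    dx, dy, dz = p[0] - x0, p[1] - y0, p[2] - z0
    u = dx * cosines[0] + dy * cosines[1] + dz * cosines[2]  # relation (1)
    v = dx * cosines[3] + dy * cosines[4] + dz * cosines[5]  # relation (2)
    w = dx * cosines[6] + dy * cosines[7] + dz * cosines[8]  # relation (3)
    return (u, v, w)

# Pure translation: parallel axes, the direction cosines reduce to identity.
identity = (1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0)
p_uvw = to_illumination_frame((10.0, 5.0, 120.0), (2.0, 3.0, 0.0), identity)
# Pure rotation about the X axis by an angle a: U unchanged, V and W mixed.
a = math.radians(30.0)
rot_x = (1.0, 0.0, 0.0,
         0.0, math.cos(a), math.sin(a),
         0.0, -math.sin(a), math.cos(a))
```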
With the aid of Figs. 11 and 12, the relation associating the points of the region 16 framed by the sensor 4 on the substrate S with the photosensitive elements 14 will now be explained; this relation is used in step 108 of the method of Fig. 9 in the case where the sensor 4 is two-dimensional, rectangular or, as a special case thereof, square, and the inverted optics, illustrated as a paraxial lens 5, have a principal plane parallel to the plane of the sensor 4, which in this particular case is the plane perpendicular to the reception axis Z passing through the optical centre of the receiver optics 5. It should be noted that, to maintain generality, the reception axis Z does not pass through the centre of the sensor 4, but through a generic point O_s thereof.
In the case of this embodiment of the image forming device 3, the first reference system is preferably chosen as a Cartesian coordinate system X, Y, Z with origin at the reception vertex O, the Z axis chosen to coincide with the reception axis Z but oriented in the direction opposite to the path of the received light R, and the X, Y axes oriented parallel to the principal directions of the sensor 4, i.e. to the column direction and row direction of the photosensitive elements 14.
At a generic working distance D, i.e. in the plane of equation

z = D (4)

the working space region 15 (indicated with chain-dotted and dotted lines) defines the region 16 framed by the sensor 4 on the substrate S (not shown for brevity).
The angles β_1, β_2, β_3, β_4 of the field of view subtended on the side of the substrate S are associated with the angles β'_1, β'_2, β'_3, β'_4 between the reception axis Z and the edges of the sensor 4, on the side of the sensor 4 in the opposite quadrant, by the relation:

β'_k = AMAG_S*β_k (5)

where AMAG_S is the angular magnification of the receiver optics 5, usually AMAG_S ≤ 1.
Although, as stated above, the field of view β_1, β_2, β_3, β_4 is shown as constant along the reception axis Z, in general this is not required, for example in the case of a zoom of the field of view as a function of the working distance, i.e. of the current z coordinate, and/or of an autofocus system. In such cases, in formula (5) above and in some of the formulas described below, the value of the field of view at the working distance under consideration shall be used.
If s is the distance between the sensor 4 and the principal plane of the receiver optics 5, the reception axis Z meets the sensor 4 at the point O_s of coordinates (0, 0, s). With reference to Fig. 12, if the point O_s falls at the centre of a photosensitive element 14 of the sensor 4, and if I and J are the row pitch and column pitch of the sensor 4, i.e. the distances between the centres of two adjacent photosensitive elements 14 along the row direction and the column direction respectively, then each photosensitive element 14 is identified by its centre, which in the reference system X, Y, Z has the coordinates expressed by the following relation:

F(i*I, j*J, s) (6)

where i and j are respectively the row index and column index of the sensor 4; these indices can take positive and negative integer values and take the value zero at the photosensitive element 14 centred on O_s.
If the point O_s does not fall at the centre of a photosensitive element 14, but at distances I_1, J_1 from the centre, then the coordinates of the centre of each photosensitive element will be expressed by (i*I + I_1, j*J + J_1, s). If the photosensitive elements 14 of the sensor 4 are not equal to one another, their coordinates in the reference system X, Y, Z can still be calculated. It should be noted that, in the case of square or circular photosensitive elements uniformly distributed over the sensor 4, the column and row centre-to-centre pitches I, J of the sensor 4 are equal to each other.
If the point O_s falls at the centre of the sensor 4, then the reception axis Z is a symmetry axis of the sensor 4 and the working space region 15 has two planes of symmetry, so that β_1 = β_3 and β_2 = β_4. In this case, the row and column indices have limit values equal in absolute value.
It is readily recognized that the centre P of the region framed at distance D by the generic photosensitive element 14 identified by the indices i, j has the coordinates expressed by the following relations:

x = D*tan[(1/AMAG_S)*arctan(i*I/s)] (7)

y = D*tan[(1/AMAG_S)*arctan(j*J/s)] (8)

z = D (9)

In the case of unit angular magnification AMAG_S = 1, relations (7), (8) reduce to the simple ratios:

x = i*I*D/s (10)

y = j*J*D/s (11)
In the case of the embodiment illustrated in Fig. 11, in step 108 of the method of Fig. 9, relations (7), (8), (9), or (10), (11), (9), are applied to the four points P_1, P_2, P_3, P_4 defining the vertices of the region 16 framed by the sensor 4 on the substrate S, or only to the opposite vertices P_1 and P_3, or P_2 and P_4.
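A sketch of this mapping for the unit-magnification case AMAG_S = 1, in which the relation between indices and coordinates reduces to the simple ratios mentioned in the text (the numeric pitch and distance values are illustrative assumptions of ours):

```python
def framed_point(i, j, D, s, I, J):
    """Centre P of the region framed at distance D by the photosensitive
    element of indices i, j, per the simple ratios valid for AMAG_S = 1
    (relations (10), (11)) together with z = D (relation (9)). s is
    negative, since the sensor lies behind the reception vertex O, so the
    inversion of the image falls out of the sign."""
    return (i * I * D / s, j * J * D / s, D)

# Illustrative values: pitches I = J = 0.5, sensor plane at s = -10,
# working distance D = 500 (units arbitrary).
s, I, J, D = -10.0, 0.5, 0.5, 500.0
# Vertices P_1 .. P_4 of the framed region 16 for a sensor whose corner
# elements carry indices (+/-4, +/-3).
corners = [framed_point(i, j, D, s, I, J)
           for i, j in [(-4, -3), (-4, 3), (4, 3), (4, -3)]]
```

Note how the opposite signs of the indices and of the resulting coordinates reflect the image inversion of the inverted optics.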
Although not used in the method of Fig. 9, it is worth setting out the inverses of relations (7), (8), with the working distance D replaced by the generic coordinate z:

i = (s/I)*tan[AMAG_S*arctan(x/z)]

j = (s/J)*tan[AMAG_S*arctan(y/z)]

These inverses allow the photosensitive element 14 receiving the image of an arbitrary point P of the working space region 15 to be identified through its indices. Of course, since the indices i, j are integers, the relations are approximated to the nearest integer. Where the fields of view of the single photosensitive elements 14 are slightly overlapping, in the overlap zone the two integers obtained by rounding down and rounding up identify the pair of photosensitive elements 14 receiving the image of the point under consideration.
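The inverse mapping, again under the assumption AMAG_S = 1, with the rounding to the nearest integer indices mentioned in the text (names and numeric values are ours):

```python
def element_indices(x, y, z, s, I, J):
    """Inverse of the simple ratios (valid for AMAG_S = 1): row and column
    indices of the photosensitive element receiving the image of the point
    P(x, y, z), rounded to the nearest integers since i, j are integer
    indices."""
    return round(x * s / (I * z)), round(y * s / (J * z))

# Same illustrative geometry as above: s = -10, pitches I = J = 0.5.
s, I, J = -10.0, 0.5, 0.5
indices = element_indices(100.0, 75.0, 500.0, s, I, J)
```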
It is readily recognized that what has been discussed with reference to Figs. 11 and 12 holds, mutatis mutandis, for an embodiment of the lighting device 6 having a rectangular, or as a special case square, two-dimensional array 17 and common inverted illumination optics 19a with a principal plane parallel to the plane of the array 17, which in this particular case is the plane perpendicular to the optical axis of the illumination optics 19a passing through its own optical centre. The relevant reference numerals are indicated in brackets in Figs. 11 and 12. The point G indicates the position of the virtual light source illuminating the point P; at least one light source 18 of the array 17 corresponds to the point G, and at most four light sources 18 adjacent to one another in a square correspond to the point G.
In the case of this embodiment of the lighting device 6, the second reference system is advantageously chosen as a Cartesian coordinate system U, V, W with origin at the illumination vertex A_0, the W axis coinciding with the optical axis of the common inverted illumination optics 19a, and the U, V axes oriented parallel to the principal directions of the array 17, i.e. to the row direction and column direction of the light sources 18. It should be noted that only in the particular case where the illumination axis A passes through the centre of the array 17 does the W axis coincide with the illumination axis A.
Once the coordinates u, v, w in the coordinate system U, V, W of the generic point P, or specifically of the points P_1, P_2, P_3, P_4, or P_1, P_3, or P_2, P_4, have been obtained in step 109 of the method of Fig. 9 through relations (1), (2), (3), the following relations are then applied to such coordinates in step 110 of the method of Fig. 9:

m = (t/M)*tan[AMAG_a*arctan(v/w)] (14)

n = (t/N)*tan[AMAG_a*arctan(u/w)] (15)

Relations (14), (15), corresponding to relations (12), (13), allow the row index and column index m, n of the light source 18 of the array 17 illuminating the point P to be calculated, where the index pair 0, 0 relates to the light source 18 placed along the W axis (the point A_2).
In relations (14), (15), M and N are the row pitch and column pitch of the light sources 18, and AMAG_a is the angular magnification, if any, of the common inverted illumination optics 19a, for which the following relation (16) holds:

γ'_k = AMAG_a*γ_k (16)
t is the distance between the plane of the array 17 and the illumination vertex A_0, measured therefore along the W axis; the general and particular cases discussed above with reference to the image forming device 3 apply.
In the embodiment of the lighting device 6 illustrated in Fig. 11, there may additionally be lenses 19b combined with the single light sources 18 of the array 17, in order to change the angular emission width and/or the emission direction of the light sources 18. In this respect, reference should be made to the subsequent description of Figs. 14 to 16.
Relations (14) and (15) express the correlation, used in step 110 of the method of Fig. 9, between an arbitrary point P of the working space region 15 and the row and column indices of the light source 18 of the array 17 illuminating the point P; this correlation also holds for an embodiment of the lighting device 6 with common non-inverted illumination optics 19c, again having a principal plane parallel to the plane of the array 17, which in this particular case is the plane perpendicular to the optical axis of the illumination optics 19c passing through its optical centre, as shown in Fig. 13.
It should be emphasized that relations (1) to (16) are analytical relations depending only on known (design) geometrical parameters of the reader 1, and in particular depending only on known (design) geometrical parameters of the image forming device 3 and of the lighting device 6 of the reader 1 and/or on their spatial arrangement, including known (design) geometrical parameters of the spatial arrangement of their components or sub-components.
Relations (14) and (15) also hold in the case where the non-inverted illumination optics comprise the aforementioned plurality of lenses 19b combined with the single light sources 18 of the array 17, possibly combined with a common non-inverted lens 19c, as illustrated in Figs. 14 to 16.
For brevity, the lighting device 6 with such lenses 19b is illustrated in Fig. 14 in the plane containing one of the principal directions of the array 17 and the illumination axis A, it being deemed that, in the light of the above teaching, this figure adequately describes the more general three-dimensional case.
Each single lens 19b manipulates the light emitted by the light source 18 on which it is placed, so as to form a beam centred about its own illumination axis A_m and contained within an angle ω_m determined by the size of the lens 19b; the illumination axis A_m is determined by the line joining the centre of the light source 18 with the centre of the lens 19b combined with it. By suitably positioning the centre of the lens 19b relative to the light source 18, it can therefore be ensured that the single illumination axes A_m are inclined at desired angles relative to the plane of the array 17.
In the embodiment of Fig. 14, the illumination axes A_m diverge from one another so as to define an illumination vertex A_0, which in the case shown falls behind the array 17, and are evenly angularly spaced by an angle μ. In this case, the illumination axis A of the lighting device 6 is the bisector of the angle defined by the illumination axes of the first and last light sources, and is perpendicular to the plane of the array 17. However, by suitably orienting the lenses 19b relative to the light sources 18, the elementary beams could cross in front of the array 17. In the case shown, the emission angle ω_m is equal to the angle μ, so that adjacent light sources 18 illuminate adjacent contacting zones 22_m; the emission angle ω_m could, however, be slightly greater than the angle μ, so that the illuminated zones 22_m slightly overlap.
The embodiment of Fig. 15 differs from that of Fig. 14 in that the illumination axis A is not perpendicular to the plane of the array 17, but is inclined at an angle η_0 relative to the normal to the plane of the array 17.
The advantage of these embodiments of the lighting device 6 is that they have a very small thickness in the direction perpendicular to the plane of the array 17.
In both cases, to reduce the size of the emitted beams, the lenses are placed so as to have an angular magnification of less than 1 and, precisely, such that in front of each light source 18 the angular magnification is AMAG_m = ω_m/ω, each light source 18 having its own emission angle ω.
The embodiment of the lighting device 6 of Fig. 16 differs from that of Fig. 14 in that further common non-inverted optics 19c with angular magnification AMAG_m < 1 are arranged downstream of the array 17 and of the single lenses 19b, further reducing the emission angle of the single light sources 18 to the value ω' = AMAG_m*ω_m. The thickness of the lighting device 6 in the direction perpendicular to the plane of the array 17 increases, but the collimation of the illumination beam T is greater. In the case of light sources 18 with a particularly small own emission angle ω, the angular magnification may be AMAG_m > 1. Similar common non-inverted optics 19c may be provided for the case of the embodiment of Fig. 15.
In the embodiments of Figs. 14 to 16, the illumination axes A_m, the emission angles ω_m and the angular magnifications AMAG_m of the single light sources 18 may differ from one another; likewise, the illumination axes A_m need not be evenly spaced, even though this entails a corresponding complexity in the method (step 101) of determining the light sources 18 of the array 17 to be switched on. For a given lighting device 6, however, the angles formed by the single illumination axes of the light sources 18 with the illumination axis A can in any case be calculated or measured. It is therefore always possible to determine the function associating a point P on the substrate S with the light source(s) 18 of the array 17 illuminating the point P, however complex that function may be.
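For the evenly spaced configuration of Fig. 14, the association between a point P and the light source illuminating it can be sketched as follows (the clamping to the physical array and all names are our own assumptions; for unevenly spaced axes the same search would run over a table of calculated or measured angles):

```python
import math

def source_for_point(u, w, mu, m_max):
    """Index of the light source whose individual illumination axis A_m is
    nearest to the direction of the point P(u, w), for the evenly spaced
    axes of Fig. 14: axes radiate from the vertex A_0 with angular spacing
    mu, index 0 on the W axis, indices running from -m_max to m_max. With
    emission angle omega_m equal to mu, the beam of the selected source
    contains P."""
    theta = math.atan2(u, w)            # direction of P as seen from A_0
    m = round(theta / mu)               # nearest individual axis A_m
    return max(-m_max, min(m_max, m))   # clamp to the physical array

mu = math.radians(5.0)
m = source_for_point(0.26, 1.0, mu, 10)  # direction of about 14.6 degrees
```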
In Figs. 14 to 16, the zones 22_m illuminated by the light sources 18 are shown, for illustrative purposes only, on a plane parallel to the plane of the array; this plane need not be the plane at a given reading distance D, nor the focusing plane of the image forming device 3, nor a beam-waist plane.
Fig. 11, Figs. 13 to 16 and Fig. 17 described below may also be deemed to represent as many embodiments of the lighting device 6 in which the array 17 is a curved array (Fig. 6).
It is readily recognized that what has been discussed with reference to Figs. 13 to 16 holds, mutatis mutandis, for corresponding embodiments of the image forming device 3. For brevity, the corresponding reference numerals are not indicated in brackets in Figs. 13 to 16.
With the aid of Fig. 17, the relation associating the points of the region 16 framed by the sensor 4 on the substrate S with the photosensitive elements 14 of the sensor 4 will now be explained; this relation is used in step 108 of the method of Fig. 9 in the case of a two-dimensional sensor 4, rectangular or, as a special case, square, and of inverted optics, illustrated as a single paraxial lens 5, having a principal plane that is not parallel to the plane of the sensor 4, the principal plane being in that particular case the plane perpendicular to the reception axis Z passing through the optical centre of the receiver optics 5.
In the case of this embodiment of the image forming device 3, the first reference system is advantageously chosen as a Cartesian reference system X, Y, Z with origin at the reception vertex O, its Z axis chosen to coincide with the reception axis Z but oriented in the direction opposite to the path of the received light R (Fig. 2), and its Y axis oriented parallel to the row direction of the sensor 4, along which the photosensitive elements 14 are indicated by the index i. The column direction of the sensor 4, along which the photosensitive elements 14 are indicated by the index j, forms an angle δ with the X axis. The general case in which the principal plane of the receiver optics 5 is also inclined relative to the row direction of the sensor 4 is not dealt with, for simplicity. Moreover, in Fig. 17 the reception axis Z is indicated, for simplicity, as passing through the centre of the sensor 4, and in particular through the centre of a photosensitive element 14 of the sensor 4; in general, however, this is not strictly required, and the considerations discussed with reference to Fig. 12 apply.
The plane 30 on which the sensor 4 lies (schematically indicated by three chain-dotted lines) meets the plane X, Y along the straight line 31 defined by the following set of equations:

x = -s/tan δ (17)

y arbitrary (18)

z = 0 (19)

where the negative sign in relation (17) takes account of the fact that the distance s between the reception vertex O and the intersection point O_s of the reception axis Z with the sensor 4 is negative in the reference system X, Y, Z.
Within the depth of field DOF of the image forming device 3, the generic working distance D, measured along the reception axis Z and delimited by the point Q of the reception axis Z having coordinates

Q(0, 0, D) (20)

defines the plane 32 passing through the straight line 31 and through the point Q (also schematically indicated by three chain-dotted lines), which can therefore be expressed by the relation:

x*D + z*(-s/tan δ) - [(-s/tan δ)*D] = 0 (21)
With the convention already used with reference to Fig. 12, each photosensitive element 14 of the sensor 4 is identified by its centre, the coordinates of each photosensitive element 14 in the reference system X, Y, Z being expressed by the following relation (22):

F(j*J*cos δ, i*I, s + j*J*sin δ) (22)
For the sake of simplicity, the angular magnification AMAG of the unit of receiver optics 5 is assumed in writings.Here it is assumed that In the case of, the center in the region taken by general 14 frame of light-sensitive element identified with index i, j, which is located at, passes through light-sensitive element On 14 itself and the straight lines (the straight line FOP of Figure 17) by receiving vertex O, and therefore it can be represented by following parameter equation group:
x = j*J*cos δ * f (23)
y = i*I*f (24)
z = (s + j*J*sin δ) * f (25)
with f taking arbitrary values.
Within the region 16 framed by the sensor 4 in the plane 32, the coordinates of the center P of the region framed by each photosensitive element 14 are therefore expressed by formulas (23), (24), (25) with the value of the parameter f obtained by combining formulas (23), (24), (25) with relation (21), i.e.:

f = [(-s/tan δ)*D] / [D*j*J*cos δ + (-s/tan δ)*(s + j*J*sin δ)] (26)
In the case of the embodiment illustrated in Fig. 17, in step 108 of the method of Fig. 9, relations (23), (24), (25) with the value of f given by relation (26) will be applied to define the four points P1, P2, P3, P4 at the vertices of the region 16 framed by the sensor 4 on the substrate S, or even only the opposite vertices P1 and P3, or P2 and P4.
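As a minimal illustration of step 108, the following Python sketch evaluates relations (23)-(25) for the four extreme index pairs of the sensor, with f obtained by imposing the plane relation (21) as in relation (26). Unit angular magnification is assumed, as in the text; the function name and the numeric conventions (radians, millimetres) are illustrative, not from the patent.

```python
import math

def framed_region_vertices(s, delta, D, I, J, i_max, j_max):
    """Corners P1..P4 of the region 16 framed by the sensor 4 on the plane 32
    at working distance D.  s: (negative) distance of the sensor from the
    receiving vertex O; delta: tilt of the sensor column direction relative
    to the X axis; I, J: row/column pitches; i_max, j_max: extreme absolute
    values of the element indices."""
    c = -s / math.tan(delta)                    # recurring term -s/tan(delta)
    corners = []
    for i, j in ((-i_max, -j_max), (-i_max, j_max), (i_max, j_max), (i_max, -j_max)):
        # f such that the point of relations (23)-(25) lies on plane 32 (21)
        f = (c * D) / (D * j * J * math.cos(delta) + c * (s + j * J * math.sin(delta)))
        x = j * J * math.cos(delta) * f         # relation (23)
        y = i * I * f                           # relation (24)
        z = (s + j * J * math.sin(delta)) * f   # relation (25)
        corners.append((x, y, z))
    return corners
```

By construction every returned corner satisfies relation (21), which is a convenient self-check of the algebra.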
Fig. 17 and the above description also apply, mutatis mutandis, to the corresponding embodiments of the lighting device 6, in other words a lighting device 6 with a rectangular — or, in the special case, square — two-dimensional array 17 and tilted optics, that is, illumination optics whose principal plane is not parallel to the plane of the array 17; in this context the principal plane is the plane perpendicular to the axis of the illumination optics 19a passing through the optical center of the illumination optics 19a itself. The corresponding reference numerals for this case are shown in brackets in Fig. 17.
Once the coordinates u, v, w of the generic point P — or better of the points P1, P2, P3, P4, or P1, P3, or P2, P4 — in the coordinate system U, V, W have been obtained in step 109 of Fig. 9 through relations (1), (2), (3), the following relations will therefore be applied to such coordinates in step 110 of the method of Fig. 9:
n = u/(N*cos ε * f) (27)
m = v/(M*f) (28)
These relations are the inversion of relations (23) and (24), written with the quantities of the lighting device, where the value of the parameter f also satisfies

f = w/(t + n*N*sin ε) (29)

and at the same time

w = (t + n*N*sin ε)*f (30)

where relations (29) and (30) correspond to relations (26) and (25), respectively.
By combining relations (30) and (27), one obtains:

n*N = u*t/(w*cos ε - u*sin ε) (31)

By substituting (31) into (29), f(u, w) is obtained, and finally, by substituting the value of f(u, w) into (28), m(u, v, w) is obtained; this is omitted for brevity.
It should be emphasized that relations (17) to (31) are analytic relations that depend only on known (design) geometric parameters of the reader 1, and specifically on the known (design) geometric parameters of the image forming device 3 of the reader 1 and of its lighting device 6, and/or on the spatial arrangement of the image forming device 3 and of the lighting device 6, including the spatial arrangement of their components or sub-assemblies.
In the various configurations of the image forming device 3 and of the lighting device 6, these relations therefore allow the row index m and column index n to be calculated for the light source 18 of the array 17 that illuminates a given point — typically an arbitrary point P in the workspace region 15 — where the indices 0, 0 relate to the light source 18 placed along the axis W (point A2).
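A sketch of this inversion in Python, derived only from relations (27) and (30) under the unit-magnification assumption (the function name is illustrative; the elimination of n*N between the two relations gives f directly):

```python
import math

def source_indices(u, v, w, t, eps, M, N):
    """Given the coordinates (u, v, w) of a point P in the illumination
    frame U, V, W, return the (fractional) row index m and column index n
    of the light source 18 of the array 17 that illuminates it.
    t: (negative) distance of the array 17 from the illumination vertex A0;
    eps: tilt angle; M, N: row/column pitches of the array 17."""
    # eliminate n*N between u = n*N*cos(eps)*f and w = (t + n*N*sin(eps))*f
    f = (w - u * math.tan(eps)) / t
    n = u / (N * math.cos(eps) * f)   # relation (27)
    m = v / (M * f)                   # relation (28)
    return m, n
```

Rounding m and n to the nearest integers then selects the actual source of the array 17; the driver can clamp them to the index limits of the array.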
As described above with reference to Fig. 9, it should be understood that, depending on the type of mapping pursued, it may be necessary or advantageous to change, add or remove specific points in either reference system. Relations (1) to (3) and their inversions can therefore be applied not only to specific points, but likewise to the representation of straight lines or curves.
The above formulas therefore allow the following to be determined analytically, in step 101 of the real-time method of Fig. 7 or in step 101 of the one-off method of Fig. 8: at the given working distance D, the subset 18a of the light sources 18 of the array 17 to be switched on in order to illuminate the whole region 16 framed by the sensor 4 on the substrate S.
The above formulas can be simplified according to the specific configuration of the image capture device 2. Moreover, different reference systems may be equally or more advantageously used, with correspondingly different formulas.
In order to calculate the intensity of each light source 18 of the array 17 to be switched on, determined in step 101 of the method of Fig. 7 or Fig. 8, it is useful — when the intensity is variable — to calculate the distance d of each point P of the region 16 framed by the sensor 4 on the substrate S from the light source 18 that illuminates the point P (the distance d is not indicated in the drawings for brevity). In the cases of Figs. 10 to 17, this distance can easily be expressed in the reference system U, V, W by the following relation:

d = sqrt[(u - n*N*cos ε)^2 + (v - m*M)^2 + (w - t - n*N*sin ε)^2]

where the quantities take the appropriate signs.
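As an illustration, taking the coordinates of the source of indices m, n in the frame U, V, W as (n*N*cos ε, m*M, t + n*N*sin ε) — by analogy with relation (22) — the distance d can be sketched as follows (names are illustrative):

```python
import math

def source_to_point_distance(u, v, w, m, n, t, eps, M, N):
    """Euclidean distance d in the frame U, V, W between the point P
    (u, v, w) on the substrate S and the light source 18 of row index m
    and column index n of the array 17, used to set the source intensity."""
    su = n * N * math.cos(eps)        # source coordinate along U
    sv = m * M                        # source coordinate along V
    sw = t + n * N * math.sin(eps)    # source coordinate along W
    return math.sqrt((u - su) ** 2 + (v - sv) ** 2 + (w - sw) ** 2)
```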
For the purposes of the initial design of the array 17 of light sources 18, it is worth calculating the minimum solid angle within which the array 17 must be able to emit in order to illuminate the whole workspace region 15, that is, the whole field of view of the image forming device 3 over its whole depth of field DOF. In other words, the solid angle subtended by the maximum illumination beam T0 must be at least equal to this minimum solid angle.
This can easily be achieved by applying the concepts and formulas described above to suitable specific points and by assessing which of the values obtained for the indices m and n are the most positive and the most negative; such specific points are, for example, the vertices of the working region 16 framed by the sensor 4 at the minimum reading distance D = D1 and at the maximum reading distance D = D2. One or more of the quantities describing the configuration and geometry of the lighting device 6 and its position relative to the image forming device 3 are advantageously kept in parametric form in this assessment. Such quantities include the coordinates x0, y0, z0 of the illumination vertex A0 in the reference system X, Y, Z associated with the image forming device 3, the direction cosines cos α1 to cos α9, the distance of the array 17 from the illumination vertex A0, the angular magnification AMAGa of the illumination optics, the tilt angle ε in the case of the embodiment of Fig. 17, and generally also the row and column pitches M, N, the limit values of the row index m and column index n of the array 17 — in other words the number of light sources 18 of the array 17 — and the position of the point A2 in the array 17. The admissible values of such quantities may be subject to design constraints, such as the maximum size of the image capture device 2 and the availability of an array 17 with suitable characteristics. Usually, however, the size and position of the array 17 can always be determined so that all the light sources 18 of the array 17 are exploited — in other words, are all switched on — at at least one reading distance D. Similar considerations hold when designing the whole image capture device 2, when one or more of the quantities describing the configuration and geometry of the image forming device 3 are also kept in parametric form, such as: the distance s of the sensor 4 from the receiving vertex O, the angular magnification AMAGs of the receiver optics 5, the tilt angle δ in the case of the embodiment of Fig. 17, and generally also the column and row pitches I and J, the limit values of the row index i and column index j of the sensor 4 — in other words the number of photosensitive elements 14 of the sensor 4 — and the position of the point Os in the sensor 4.
It should be understood, however, that for a given image capture device 2 the values of the quantities listed above are known constants.
Depending on the situation discussed previously, simplified formulas can be derived for the case of a one-dimensional sensor 4 and/or array 17, and/or generally more complex formulas can be derived for the case of a curved array 17 (Fig. 6).
It must also be clear that, assuming the optical information C occupies a region of space generally located in the workspace region 15 and lies at sufficiently similar local distances for the focus on the sensor 4 to be adequately uniform, the substrate S can in practice have any orientation relative to the receiving axis Z; in this case, even if the calculation is based on a single working distance D, the lighting conditions obtained in practice through the lighting device 6 are still suitable. A more precise determination of the light sources 18 to be switched on, taking this inclination into account, can also be performed analytically, although the formulas to be applied are more complex.
The second method is an empirical, or adaptive, method, a typical embodiment of which is illustrated in Fig. 18. This second method implements step 101 — in real time (Fig. 7) or one-off (Fig. 8) — in various embodiments of the image capture device 2; step 101 determines, at the given current working distance D within the depth of field DOF, the subset 18a of light sources 18 that must be switched on to illuminate only the whole region 16 framed by the sensor 4.
This embodiment is suitable for the case in which the plane of the sensor 4 and the plane of the array 17 are parallel and both are rectangular. A more general method is described further below.
In step 120, the driver initially switches on all the light sources 18 of the array 17. In this case, the whole region 16 framed by the sensor 4 is certainly illuminated, and this is checked in step 121. A negative outcome means a design and/or assembly error of the image capture device 2 — in other words, the condition that the maximum illumination beam T0 be greater than or equal to the required minimum solid angle is not met, and/or the position of the illumination vertex A0 and/or the inclination of the illumination axis A relative to the receiving axis Z is incorrect, and/or the array 17 has failed — and the method therefore terminates. Steps 120, 121 can, however, be omitted.
If the outcome of step 121 is positive, a flag is set to TRUE in step 122, an edge of the array 17 is preselected, and the following operating loop is started. In step 123, the driver switches off the p light sources 18 starting from the preselected edge of the array 17. In step 124, it is checked whether the whole region 16 framed by the sensor 4 is still illuminated. If not, the flag is set to FALSE in step 125, and the number p is reduced by a number a in step 126; execution then returns to step 123 and to the subsequent checking step 124. If the outcome of step 124 is positive, it is checked in step 127 whether the flag is still TRUE. If so, the number p is increased by b in step 128, and execution returns to step 123. If the outcome of step 127 is negative — i.e., when the flag was set to FALSE in step 125 — the numbers a, b are reduced and the flag is set back to TRUE in step 129. In step 130 it is checked whether the numbers a, b have reached zero. If not, execution returns to step 128. If so, the current value of p indicates the number of light sources 18 that should be switched off starting from the preselected edge of the array 17, and a provisional form of the subset 18a of the light sources 18 to be lit is accordingly set in step 131. It is then checked in step 132 whether all the edges of the array 17 have been examined; if not, execution returns to step 122, where a different edge of the array 17 is selected. When all the edges of the array 17 have been examined and the outcome of step 132 is positive, the final subset 18a of light sources 18 to be lit is set in step 133.
With reference to the foregoing analytic description, and also to Fig. 19: selecting as the edge of the array the light sources 18 with the most negative row index m, i.e. index -mmin, the value p = p1 at the positive outcome of step 130 indicates that the row index of the first light source 18 to be switched on is m1 = -mmin + p1; selecting as the edge the light sources 18 with the most positive row index m, i.e. index mmax, the value p = p2 at the positive outcome of step 130 indicates that the row index of the last light source 18 to be switched on is m2 = mmax - p2; selecting as the edge the light sources 18 with the most negative column index, i.e. index -nmin, the value p = p3 at the positive outcome of step 130 indicates that the column index of the first light source 18 to be switched on is n3 = -nmin + p3; selecting as the edge the light sources 18 with the most positive column index, i.e. index nmax, the value p = p4 at the positive outcome of step 130 indicates that the column index of the last light source 18 to be switched on is n4 = nmax - p4. The light sources within the rectangle whose corners have indices (m1, n3), (m2, n3), (m2, n4), (m1, n4) will therefore be switched on.
The loop of steps 123 to 130 described above can be executed simultaneously for the two edges of a one-dimensional array 17, or, in the case of a two-dimensional array 17, simultaneously for a pair of opposite edges (i.e., determining at the same time the row subset and the corresponding column subset) or simultaneously for a pair of adjacent edges (i.e., determining at the same time the row and column indices of the first light source to be switched on starting from a vertex of the array); of course, in this case the variables p, a, b and the flag are duplicated accordingly. Moreover, in certain configurations of the image capture device 2, it may suffice to repeat the loop of steps 123 to 130 on only two or three edges of the array, for example when the region 16 framed by the sensor 4 on the substrate S is rectangular and centered both as observed by the sensor 4 and as observed by the lighting device 6.
It should be understood that the use of the numbers a and b allows the total number of loop iterations to be reduced, by performing a binary search for the first source of the subset 18a of light sources 18 to be lit, starting from the preselected edge of the array 17. In other words, as long as the region 16 framed by the sensor 4 on the substrate S remains entirely illuminated and the flag stays TRUE, many (b) light sources 18 are switched off at each pass through step 128. When too many light sources 18 have been switched off and the flag has become FALSE, fewer light sources are switched off each time, by switching (a) light sources back on, until the last light source that can be switched off starting from the edge is found. In particular, in steps 126, 129 the reduction and/or increase of a and/or b can be performed by successive halving and/or doubling (dichotomy), so as to achieve fast convergence of the algorithm. The use of the numbers a and b is, however, optional: a single source can be switched on or off each time.
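The dichotomic loop of steps 123 to 130 can be sketched as follows in Python. This is an illustrative reconstruction, not the patent's implementation: `fully_illuminated(p)` stands for the action of steps 123-124 (switch off p sources from the preselected edge, grab a frame, check that the region 16 is still fully lit) and is assumed monotone, and a single step size is used for both a and b.

```python
def sources_to_switch_off(fully_illuminated, p_max):
    """Largest number p of light sources 18 that can be switched off from
    one edge of the array 17 while the whole region 16 framed by the
    sensor 4 stays illuminated (steps 122-131 of Fig. 18)."""
    p, flag = 0, True                 # step 122: flag set to TRUE
    a = b = max(p_max // 2, 1)        # step sizes, halved to converge
    while True:
        if fully_illuminated(p):      # step 124: region 16 still fully lit?
            if not flag:              # step 127 negative: we overshot before
                a, b = a // 2, b // 2 # step 129: reduce both step sizes
                flag = True
                if a == 0 and b == 0:
                    return p          # step 130: converged on exact p
            if p == p_max:
                return p_max          # every source off is still fine
            p = min(p + b, p_max)     # step 128: try closing b more sources
        else:
            flag = False              # step 125
            p -= a                    # step 126: reopen a sources
```

Because a and b are halved only at a value of p known to keep the region lit, the loop converges to the exact boundary in O(log p_max) probes instead of p_max single-source steps.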
Those skilled in the art will understand how to modify the block diagram of Fig. 18 to start from a configuration in which all the light sources 18 are off and one or more light sources are switched on each time, or from a configuration in which the light sources 18 of the central region of the array 17 are switched on first.
Moreover, it will be understood that the initial number p of light sources 18 to be switched off can be selected as a function of the determination performed last. Indeed, when the working distance D increases (or correspondingly decreases), the number of light sources 18 to be switched off starting from one edge of the array 17 increases, and the number of light sources 18 to be switched on starting from the opposite edge of the array 17 decreases (compare Fig. 3). Therefore, instead of always starting again from the whole array 17 lit, the determination can start from the subset 18a of the closest working distance D.
In the more general case in which the area to be lit on the array 17 is a generic quadrilateral rather than a rectangle or square — which generally happens when the planes of the sensor 4 and of the array 17 are not parallel, and in particular in the case of Fig. 17 — a different embodiment of the empirical/adaptive method is more advantageous for performing step 101, in real time (Fig. 7) or one-off (Fig. 8); step 101 determines, at the given current working distance D within the depth of field DOF, the subset 18a of the light sources 18 that must be switched on to illuminate only the region 16 framed by the sensor 4.
This embodiment is based on the following steps, described with reference also to Fig. 20:
a) successively switching on a series of rows or columns of the array 17, until the region 16 framed by the sensor 4 is at least partially illuminated, in particular until the sensor 4 detects the image of a line that is, in general, inclined and off-center;
b) identifying and switching on the light source 18 of the array 17 referred to below as the "start light source", represented in Fig. 20 by the point G0, which illuminates a point P0 of the illuminated line, P0 in turn illuminating a point F0 (or photosensitive element 14) of the sensor 4; the start light source is preferably selected as the light source that illuminates the midpoint of the part of the illuminated line seen by the sensor 4; the selection can be made, for example, by rapidly switching on all the light sources 18 of the examined row or column one after the other;
c) selecting a direction of orientation of the array 17 with origin at the start light source G0, and identifying along this direction the light source 18 (represented by the point G1 in Fig. 20) that illuminates a point P1 whose image is formed on one of the photosensitive elements 14 at an edge of the sensor 4, that element being represented by the point F1 in Fig. 20;
d) storing the identified light source 18 and the corresponding edge of the sensor 4;
e) repeating steps c) and d), each time selecting a direction of orientation angularly spaced from the one previously used, until 360° have been covered, thus identifying the points G2, G3, G4, ... corresponding to the photosensitive elements F2, F3, F4, ...; it should be noted that each identified edge of the sensor 4 can be the same edge as, or an edge adjacent to, the one previously found; the angular spacing between the directions is suitably selected so that there are at least eight iterations of steps c) and d), preferably at least twelve, so that at least two light sources 18 are identified for each edge of the sensor 4;
f) for each group of light sources that illuminate points whose images are formed on photosensitive elements 14 of the same edge of the sensor 4 — for example the light sources G2, G3 of Fig. 20 — identifying the straight line on the array 17 joining the light sources of the group; and
g) joining these straight lines to form the perimeter of the polygon (quadrilateral) bounding the light sources 18 to be switched on.
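Steps f) and g) reduce to plain 2D geometry once the boundary sources have been grouped by sensor edge. The following Python sketch is illustrative only (the function names and the simplified covariance-based line fit, exact for collinear points, are assumptions, not from the patent):

```python
def quadrilateral_from_edge_sources(edge_groups):
    """Vertices of the polygon bounding the subset 18a on the array 17.
    edge_groups: four lists of (row, col) index pairs of identified light
    sources 18, ordered so that consecutive groups belong to adjacent
    edges of the sensor 4."""
    def fit_line(pts):
        # line a*x + b*y = c through pts; direction taken from the
        # covariance of the point set (exact when the points are collinear)
        n = len(pts)
        mx = sum(p[0] for p in pts) / n
        my = sum(p[1] for p in pts) / n
        sxx = sum((p[0] - mx) ** 2 for p in pts)
        sxy = sum((p[0] - mx) * (p[1] - my) for p in pts)
        syy = sum((p[1] - my) ** 2 for p in pts)
        dx, dy = (sxx, sxy) if sxx >= syy else (sxy, syy)
        a, b = -dy, dx                # normal to the direction (dx, dy)
        return a, b, a * mx + b * my
    lines = [fit_line(g) for g in edge_groups]
    verts = []
    for k in range(4):                # intersect consecutive lines: step g)
        a1, b1, c1 = lines[k]
        a2, b2, c2 = lines[(k + 1) % 4]
        det = a1 * b2 - a2 * b1
        verts.append(((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det))
    return verts
```

With at least two identified sources per edge, as required by step e), each of the four lines is determined and the four intersections give the corners of the area of the array 17 to be lit.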
For circular/elliptical sensors the method is the same, except that distinguishing the different edges of the sensor 4 is obviously meaningless; in order to find the boundary of the light sources 18 to be switched on, a non-linear interpolation between the positions of the identified light sources must be used, such interpolation being well known to those skilled in the art.
A possible implementation of such an embodiment is shown in Fig. 21, split over several sheets.
The aforesaid step a) is implemented with a first operating loop. In a first step 150, a counter QUAD is initialized, for example to 0. This counter identifies the region of the array 17 in which the search for the subset 18a of light sources 18 to be switched on is performed. In a preferred implementation, the value QUAD = 0 identifies the whole array 17, and the values QUAD = 1 to 4 indicate the four quadrants of the array 17. Other subdivisions of the array 17 can be used. In a subsequent step 151, the central column of the area identified by the current value of the counter QUAD is switched on, so that when QUAD = 0 all the light sources 18 of the central column of the array 17 are switched on. In step 152, it is checked whether the region 16 framed by the sensor 4 on the substrate S is at least partially illuminated, i.e. whether at least part of the illuminated line is "seen" by the sensor 4. If not, step 153 is reached, where the currently lit column of light sources 18 is switched off and the central row of the area identified by the current value of the counter QUAD is switched on, so that when QUAD = 0 all the light sources 18 of the central row of the array 17 are switched on. In a subsequent step 154, it is checked again whether the region 16 framed by the sensor 4 on the substrate S is at least partially illuminated, i.e. whether at least part of the illuminated line is "seen" by the sensor 4. If not, the counter QUAD is increased by 1 in step 155, and it is checked in step 156 whether all the areas into which the array 17 has been ideally divided — specifically all four quadrants (QUAD > QUADmax, in particular QUAD > 4) — have been exhausted. If so, the method terminates, because there is a design error of the reader 1 or a failure thereof. If the quadrants have not yet all been examined (QUAD ≤ QUADmax), execution returns to step 151, thus switching on the central column of the quadrant under consideration (and, in step 153, the central row of that quadrant). If step 152 or step 154 gives a positive outcome, this means that the region 16 framed by the sensor 4 on the substrate S is at least partially illuminated by the row or column currently lit on the array 17. It should be noted that, if the reader 1 is properly designed, at any reading distance D within the depth of field DOF the size of the subset 18a of light sources 18 to be switched on is not negligible relative to the overall size of the array 17, and the iteration with QUAD = 0 is usually sufficient.
In step 158, the aforesaid step b) is implemented; in other words, the single light source 18 of the array 17 belonging to the currently lit row or column and chosen so as to illuminate a point likewise "seen" by the sensor 4 is identified and switched on. Preferably, the start light source is selected as the light source that illuminates the midpoint of the part of the illuminated line on the substrate S seen by the sensor 4. Step 158 can for example comprise: identifying the photosensitive element 14 of the sensor 4 lying midway between those currently illuminated, and then rapidly switching on all the light sources 18 of the examined row or column one after the other, each time evaluating the output of this photosensitive element 14.
After step 158, the aforesaid step c) is implemented, and an operating loop implementing steps d), e) is executed. In step 159, four device variables for this loop are initialized: DIR = 1, SENSOR_EDGE = FALSE, and two positive integer values H, L, whose meaning will be illustrated below. The first variable indicates the direction of orientation of the array 17 along which the light source 18 illuminating the following point is searched for: the point whose image is formed on one of the photosensitive elements 14 at an edge of the sensor 4. For example, the variable DIR can change from 1 — corresponding to the direction (column or row, respectively) along which step 152 or step 154 succeeded — up to a maximum value MAX_DIR. Each direction is rotated on the array 17 by a constant or non-constant angle relative to the previous one, preferably by 45° to obtain eight directions of orientation, or by 30° to obtain twelve directions of orientation. The second variable SENSOR_EDGE is a flag indicating whether the light source searched for along the direction DIR (the light source illuminating a point whose image is formed on one of the photosensitive elements 14 at an edge of the sensor 4) has been found.
At this point, in step 160, H light sources of the array 17 along the direction of orientation DIR are switched on. Step 161 follows, in which it is checked whether at least one of the photosensitive elements 14 at one of the edges of the sensor 4 is illuminated. If not, SENSOR_EDGE = TRUE is checked for in step 162; if this check is also negative, as on the first execution of step 162, execution returns to step 160, thus "extending" by H light sources the lit line of sources along the direction DIR.
When in step 161 at least one of the photosensitive elements 14 at one of the edges of the sensor 4 is found to be illuminated, a positive outcome is output and step 165 is executed, where the flag SENSOR_EDGE is changed to TRUE; in the subsequent step 166 the values H and L are reduced; and in the subsequent step 167 it is checked whether H = 0 and L = 0.
If not, i.e. if the numbers L, H are still positive, step 168 follows, where the number of light sources 18 lit along the direction DIR is reduced by L; in other words, L light sources 18 are switched off starting from the far end of the lit line of sources along the direction of orientation DIR. Execution then returns to step 161, thus re-evaluating whether a photosensitive element 14 at an edge of the sensor 4 is still illuminated. If so, steps 165 to 168 are repeated, each time switching off a progressively smaller number L of light sources, i.e. shortening (but by less each time) the lit line along the direction DIR.
When checking step 161 has a negative outcome after previously having had a positive one, the check of step 162 is positive, because SENSOR_EDGE is TRUE; therefore, step 163 is executed, where the values of the variables H and L are reduced, followed by step 164, where the flag SENSOR_EDGE is changed back to FALSE; execution then returns to step 160. Under these conditions, a photosensitive element 14 at the edge of the sensor 4 was illuminated at some point but is no longer illuminated, so the lit line of sources is again "extended" along the direction DIR, so as to return to illuminating towards the edge of the sensor 4, but it is extended by a smaller amount.
The above steps are repeated until the outcome of step 167 — checking that the values H, L are both zero — is positive, which indicates that the light source 18 illuminating the point whose image is formed on one of the photosensitive elements 14 at the edge of the sensor 4 has been identified. The value indicating this light source 18 — usually a pair of row and column indices of the light source 18 — is stored in step 169 together with the corresponding edge of the sensor 4, thus implementing the aforesaid step d).
After step 169, step 170 is executed, which checks whether the last search direction has been reached, i.e. whether DIR > MAX_DIR; if not, the flag SENSOR_EDGE is set back to TRUE and the counter DIR is increased by 1 in step 171; then, in step 172, all the currently lit light sources 18 (the row or column of sources of the previous value of DIR) other than the start light source are switched off, and execution returns to step 160, repeating the whole loop that searches for the light source 18 illuminating a point whose image is formed on one of the photosensitive elements 14 of the same or an adjacent edge of the sensor 4, and that stores this light source 18 together with the edge of the sensor 4.
If the outcome of step 170 is positive, the repetition of step e) identified above ends. Execution then proceeds to implement steps f) and g) through steps 173 and 174, respectively: in step 173, the straight lines joining, on the array 17, the light sources 18 that illuminate points imaged onto photosensitive elements 14 of the same edge of the sensor 4 are found by interpolation; and in step 174 these straight lines are joined, thereby defining the vertices of the perimeter of the light sources 18a to be switched on on the array 17.
The use of the parameters L, H is not strictly required, but it allows the search for the light sources 18 that illuminate the points imaged onto the photosensitive elements 14 at the edges of the sensor 4 to be accelerated. Preferably, the parameters L, H are initially set to a power of 2 and are halved each time. Alternatively, the parameters L, H can each time be reduced by a constant amount, in particular by 1.
Alternatively, the light sources 18 can be switched on one at a time along each direction DIR, until the light source 18 illuminating the point imaged onto a photosensitive element 14 at an edge of the sensor 4 is identified directly.
The evaluation performed in steps 121, 124, 152, 154 and 161 — whether, and in what way, the region 16 framed by the sensor 4 is illuminated — can be performed by automatically analyzing the image output by the image forming device 3 of the image capture device 2.
The automatic evaluation can be accelerated if, instead of being based on the analysis of the whole output image, it is based on the analysis of a part of the image: in particular, in the case of a one-dimensional sensor 4 the analysis can be based on the points at the edges of the image, and in the case of a two-dimensional sensor 4 it can be based on the rows and columns forming the perimeter of the image, or on the central columns and/or rows only. Partial analysis of this type exploits the capability, known as ROI or Multi-ROI in well-known image sensors, of defining one or more regions of interest (ROI), which makes the output from the sensor 4 far faster than reading out the entire frame. Alternatively or in addition, the image can be evaluated at a lower resolution, for example by analyzing only alternate photosensitive elements 14 of the whole sensor 4 or of one or more regions of interest.
The evaluation performed in steps 121, 124, 152, 154 and 161 — whether, and in what way, the region 16 framed by the sensor 4 is illuminated — can also be performed visually by the operator, in which case the image acquired by the sensor 4 must be displayed on the output device 10. In this case, the user provides suitable feedback to the reading system 1 through the manual input device 11 for control signals and/or data, similar to the use of the flag in the method of Fig. 18. Alternatively, there may be two or more controls allowing the user to respectively increase or decrease the number of light sources switched on (or off) starting from each edge of the array 17, thus performing functions similar to those of blocks 126, 128, 129, 163, 166.
It must be noted that, in the case of real-time determination with the adaptive method and automatic image evaluation, a further factor comes into play, namely the inclination of the substrate S relative to the receiving axis Z. When the substrate S is not perpendicular to the receiving axis Z, the distances of the various points of the region framed by the sensor 4 lie in a range about the average working distance D, and in this case the adaptive method will result in the switching on of a subset 18a of the light sources of the array 17 different from the subset obtained with the substrate S perpendicular to the receiving axis Z; however, if the image capture device 2 is properly designed, there is no condition in which all the light sources 18 of an array 17 sized for an emission beam T0 equal to the required minimum emission angle are switched on at the same time.
A reduced form of the adaptive method can also be used to refine the selection of the subset 18a of light sources 18 determined with the analytic method (for example, the method described above), for instance to correct for inaccuracies in the array 17 of light sources of each mass-produced image capture device 2. In this case, steps 123 to 131, or 160 to 169, are performed only in the neighborhood of the subset 18a calculated with the analytic method; in other words, starting from numbers p, H, L indicating the boundary (indices m, n) of this subset 18a.
Figures 22 to 27 schematically illustrate some particularly advantageous embodiments of the device 2 for capturing images. For simplicity of presentation, all the embodiments are described assuming a plane containing the illumination axis A and the receiving axis Z, the direction or main direction of the sensor 4, and the direction or main direction of the array 17; in the light of the foregoing teaching, this is considered to describe adequately the more general cases, including the case of curved arrays (Fig. 6).
According to the embodiment of Figure 22, the image forming device 3 is an image forming device according to one of the embodiments of Figures 11 and 13, with the receiver optics 5 coaxial with the sensor 4, or according to one of the embodiments of Figures 14 or 16. The lighting device 6 is a lighting device according to one of the embodiments of Figures 11 and 13, with the illumination optics 19a, 19c coaxial with the array 17, or according to one of the embodiments of Figures 14 or 16. The receiving axis Z is therefore perpendicular to the sensor 4, and the illumination axis A is therefore perpendicular to the array 17. The illumination axis A is parallel to, but not coincident with, the receiving axis Z. The array 17 and the sensor 4 can therefore be arranged coplanar and, advantageously, on the same support, on the same integrated circuit board, or even made on the same integrated circuit. It should be noted that in this case the lighting device 6 must be designed to have a total solid emission angle, corresponding to the maximum illumination beam T0, greater than the required minimum solid emission angle, and some light sources 18 of the array 17 are therefore always off. To mitigate this drawback, the array 17 can also be arranged parallel to, but not coplanar with, the sensor 4. The advantage of this embodiment is its simplicity of design and assembly.
The lighting devices 6 of the following embodiments of Figures 23 to 27, on the other hand, can be designed with a solid emission angle equal to the required minimum solid emission angle, so that no light source 18 of the array 17 is always off and the array 17 is fully exploited.
According to the embodiment of Figure 23, the image forming device 3 is an image forming device according to one of the embodiments of Figures 11 and 13, with the receiver optics 5 coaxial with the sensor 4, or according to one of the embodiments of Figures 14 or 16. The lighting device 6 is a lighting device according to one of the embodiments of Figures 11 and 13, with the illumination optics 19a, 19c not coaxial with the array 17, or according to one of the embodiments of Figures 15 or 17. The receiving axis Z is therefore perpendicular to the sensor 4, while the illumination axis A is inclined with respect to the plane of the array 17 by an angle indicated here with θ0. The illumination axis A is inclined with respect to the receiving axis Z by an equal angle θ = θ0. The array 17 can therefore be arranged in a plane parallel to the sensor 4, and in particular coplanar with it, with the advantages discussed above with reference to Figure 22. It should be noted that, if the construction of Figure 17 is used for the lighting device 6, it is not particularly advantageous to tilt the illumination plane too much.
According to the embodiment of Figure 24, the image forming device 3 is an image forming device according to one of the embodiments of Figures 11 and 13, with the receiver optics 5 coaxial with the sensor 4, or according to one of the embodiments of Figures 14 or 16. The lighting device 6 is a lighting device according to one of the embodiments of Figures 11 and 13, with the illumination optics 19a, 19c coaxial with the array 17, or according to one of the embodiments of Figures 14 or 16. The receiving axis Z is therefore perpendicular to the sensor 4, and the illumination axis A is therefore perpendicular to the array 17. The sensor 4 and the array 17 are arranged on planes forming an angle θ1 between them, so that the illumination axis A is inclined with respect to the receiving axis Z by the same angle θ = θ1.
According to the embodiment of Figure 25, the image forming device 3 is an image forming device according to one of the embodiments of Figures 11 and 13, with the receiver optics 5 coaxial with the sensor 4, or according to one of the embodiments of Figures 14 or 16. The lighting device 6 is a lighting device according to one of the embodiments of Figures 11 and 13, with the illumination optics 19a, 19c not coaxial with the array 17, or according to one of the embodiments of Figures 15 or 17. The receiving axis Z is therefore perpendicular to the sensor 4, while the illumination axis A is inclined with respect to the plane of the array 17 by an angle indicated here with θ0. The sensor 4 and the array 17 are arranged on planes forming an angle θ1 between them, so that the illumination axis A is inclined with respect to the receiving axis Z by an angle θ = θ1 + θ0. This embodiment allows the absolute values of the angles θ1 and θ0 to be kept small, and therefore the size of the image capture device 2 to be kept small, while still obtaining a large angle θ and a greater design flexibility thanks to the two parameters on which it is possible to act. This embodiment is particularly advantageous when the depth of field DOF is concentrated in the region close to the reader 1.
According to the embodiment of Figure 26, the image forming device 3 is the image forming device according to the embodiment of Figure 17. The lighting device 6 is a lighting device according to one of the embodiments of Figures 11 and 13, with the illumination optics 19a, 19c coaxial with the array 17, or according to one of the embodiments of Figures 14 or 16. The illumination axis A is therefore perpendicular to the array 17, while the receiving axis Z is inclined with respect to the plane of the sensor 4 by an angle indicated here with θ2, so that the illumination axis A is inclined with respect to the receiving axis Z by an equal angle θ = θ2. The array 17 can therefore be arranged in a plane parallel to the sensor 4, and in particular coplanar with it, with the advantages discussed above with reference to Figure 22.
According to the embodiment of Figure 27, the image forming device 3 is the image forming device according to the embodiment of Figure 17. The lighting device 6 is a lighting device according to one of the embodiments of Figures 11 and 13, with the illumination optics 19a, 19c not coaxial with the array 17, or according to one of the embodiments of Figures 15 or 17. The illumination axis A is therefore inclined with respect to the plane of the array 17 by an angle indicated here with θ0, and the receiving axis Z is inclined with respect to the plane of the sensor 4 by an angle indicated here with θ2. The array 17 can therefore be arranged in a plane parallel to the sensor 4, and in particular coplanar with it, with the advantages discussed above with reference to Figure 22, and the illumination axis A is inclined with respect to the receiving axis Z by an angle θ = θ0 + θ2. This embodiment also allows the absolute values of the angles θ0 and θ2 to be kept small, and therefore the size of the image capture device 2 to be kept small, while still obtaining a wide angle θ and a greater design flexibility thanks to the two parameters on which it is possible to act.
The common non-inverting illumination optics 19a of the embodiments of Figures 13 and 16 can also be arranged with its axis inclined with respect to the array 17, similarly to what has been described with reference to Figure 17. Such a lighting device 6 can advantageously be used in the embodiments of the image capture device of Figures 23, 25 and 27.
Furthermore, the measures of Figures 11 and 17 can be combined, namely by arranging the inverting illumination optics 19a both inclined and offset with respect to the array 17, so as to obtain a large value of the inclination angle θ0 between the illumination axis A and the normal to the plane of the array 17 with a smaller increase in the overall size of the image capture device 2; this is particularly advantageous when the depth of field DOF is concentrated in the region close to the reader.
With reference to Figure 28, which relates to the lighting device 6 of the embodiment of Figure 11, when the array 17 is placed in the object plane of the inverting projection lens 19a, a suitable choice of the f-number of the projection lens 19a allows the illumination beam emitted by the array 17 of light sources 18 to be kept in focus over a suitable range of distances in front of the lighting device 6. Such a distance range should at least correspond to the depth of field DOF of the sensor 4 of the image capture device 2.
Such a distance range, known in the literature as image-side depth of focus (W.J. Smith, "Modern Optical Engineering", 3rd ed., McGraw-Hill, 2000, chap. 6.8), differs according to whether it is measured from the focusing distance D' of the projection lens 19a away from the projection lens 19a (δ') or towards the projection lens 19a (δ''). However, for large values of the distance D' it can be assumed that δ' = δ'', so that this difference is negligible and the distance range is substantially equal to δ' = D'²·κ/Wa = D'·K'/Wa, where K' is the design maximum magnification, expressed in mm, of the image 22m of each light source 18 on the substrate S, κ is the same quantity expressed as angular blur (K' = D'·tan κ), and Wa is the aperture of the projection lens 19a. For small angles, as in the case investigated here, κ ≈ ωm or κ ≈ ω'm, where the angles ωm, ω'm are indicated in Figures 14 to 16.
The higher the working f-number (D'/Wa) and the focusing distance (D') of the projection lens 19a, the larger the image-side depths of focus δ'', δ'. For example, assuming that an illumination beam T with an emission angle of ±25° is to be focused at a distance D' = 350 mm from the projection lens 19a, and considering an angular blur κ equal to about 2.5% of the size of the illuminated image, i.e. κ = 1.25°, a working f-number of 35 is sufficient to obtain an image-side depth of focus δ' ≈ δ'' = 267 mm; in other words, with DOF = 2δ', the image projected by the array 17 onto the substrate S remains in focus over the whole depth of field DOF of the image forming device 3.
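The numeric example above can be checked directly; the sketch below (Python) simply evaluates the small-angle formula δ' = D'²·tan(κ)/Wa with the values from the text:

```python
import math

def image_side_depth_of_focus(D_prime_mm, kappa_deg, Wa_mm):
    """Small-angle, large-distance approximation: delta' ~ D'^2 * tan(kappa) / Wa."""
    return D_prime_mm ** 2 * math.tan(math.radians(kappa_deg)) / Wa_mm

# Worked example from the text: D' = 350 mm, kappa = 1.25 deg,
# working f-number 35, hence aperture Wa = 350 / 35 = 10 mm.
delta = image_side_depth_of_focus(350.0, 1.25, 350.0 / 35.0)
print(round(delta))  # 267 (mm), so 2 * delta' ~ 534 mm stays in focus
```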
By selecting the aperture Wa of the projection lens 19a between 5 mm and 20 mm, preferably between 6 mm and 12 mm, and the focusing distance D' of the lighting device 6 between 100 mm and 350 mm, it is possible to obtain image-side depths of focus δ' with values typical of the applications, in other words values typical of the depth of field DOF of the image capture device 2, of the minimum reading distance D1 and of the maximum reading distance D2.
In conclusion, with a suitably selected projection lens 19a, the image projected by the lighting device 6 as the projection of the array 17 of light sources 18 has sharp edges at every working distance D.
Similar considerations apply to the lighting devices of the embodiments of Figures 13 to 17.
In the embodiments of the lighting device 6 of Figures 11, 13, 16 and 17, the illumination optics 19a or 19c preferably comprise a collimation lens having, for example, a constant angular magnification characteristic AMAGa, preferably with an angular magnification ratio of 0.7. The illumination optics 19a, 19c preferably have a fixed focal length.
As a specific example, let us consider the construction of Figure 24 with a lighting device of the construction of Figure 11, having a one-dimensional array 17 lying in the plane Z, Y (the plane of Figure 24), with a number mtot = 52 of light sources 18 arranged along the axis Y, spaced apart by M = 100 μm, for a total length of 52 × 100 μm = 5.2 mm. The inclination angle θ1 between the illumination axis A and the receiving axis Z is assumed to be θ1 = 14°, so that cos α1 = 1.000, cos α2 = 0.000, cos α3 = 0.000, cos α4 = 0.000, cos α5 = 0.970, cos α6 = −0.242, cos α7 = 0.000, cos α8 = 0.242, cos α9 = 0.970. The illumination vertex A0 is at a distance from the receiving vertex O, along the axis Y, in the range 0 to 20 mm, for example y0 = −10 mm, and is moved 10 mm along the axis Z, so that its coordinates are A0(0, −10, 10). It is also assumed that the image forming device 3, with a one-dimensional sensor 4 extending along the axis X and centred in the origin O, has a constant field of view, symmetrical about the axis Z, delimited between β1 = +20° and β3 = −20° (in general, the field of view β1, β3 is between 10° and 30°). It is further assumed that the depth of field DOF of the sensor 4 is comprised between a minimum working distance D1 = 30 mm and a maximum working distance D2 = 500 mm, so that the depth of field is DOF = 470 mm. It is then assumed that the illumination optics 19a have a constant angular magnification characteristic with magnification ratio AMAGa = 0.7, and that the array 17 is arranged at a distance t = −6 mm from the illumination optics 19a. Applying formulas (1) to (15), at each distance D one obtains the minimum index m1 and the maximum index m2 of the end light sources 18 of the array 17 to be switched on to cover exactly the line framed by the sensor 4 at that distance. The variation of these indices for working distances D sampled with a step of 30 mm (20 mm for the last step) is shown below in Table 1.
Table 1

  D (mm)    m1    m2
      30   -25    12
      60   -14    21
      90   -10    23
     120    -9    24
     150    -8    24
     180    -7    25
     210    -7    25
     240    -7    25
     270    -6    25
     300    -6    25
     330    -6    26
     360    -6    26
     390    -6    26
     420    -6    26
     450    -6    26
     480    -5    26
     500    -5    26
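When the subsets are computed once and for all (Fig. 8), the driver 13 essentially needs only a lookup: given a measured reading distance D, return the stored end indices. A minimal sketch over Table 1 (Python; the nearest-tabulated-distance policy is an illustrative assumption, not stated in the text):

```python
# Precomputed calibration, straight from Table 1: distance (mm) -> (m1, m2).
TABLE_1 = {
    30: (-25, 12),  60: (-14, 21),  90: (-10, 23), 120: (-9, 24),
    150: (-8, 24), 180: (-7, 25), 210: (-7, 25), 240: (-7, 25),
    270: (-6, 25), 300: (-6, 25), 330: (-6, 26), 360: (-6, 26),
    390: (-6, 26), 420: (-6, 26), 450: (-6, 26), 480: (-5, 26),
    500: (-5, 26),
}

def subset_for_distance(D):
    """End indices of the subset 18a at the nearest tabulated distance."""
    nearest = min(TABLE_1, key=lambda d: abs(d - D))
    return TABLE_1[nearest]

m1, m2 = subset_for_distance(100)   # nearest tabulated distance is 90 mm
print(m1, m2)                       # -10 23
```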
Figure 29 provides a visual representation of the subsets 18a of light sources 18 to be switched on in the array 17, shown as continuous bands extending from the minimum index to the maximum index.
From a qualitative point of view, it is immediately apparent from an examination of Table 1 and Figure 29 that, for equal fields of view β1, β3 or aiming angles, at working distances D close to the maximum distance D2 there is no appreciably large change in the position within the array 17 of the first and last light sources 18a to be switched on, in other words the end indices m1 and m2 change slowly, while at and near working distances D close to the minimum distance D1 the position within the array 17 of the first and last light sources 18a to be switched on undergoes larger changes, in other words the indices m1 and m2 change more quickly.
It should also be noted that at no working distance D are all the light sources 18 of the array 17 switched on; rather, at each working distance D a certain number of light sources 18 starting from at least one edge of the array 17 are off. Moreover, the first and last light sources 18 are switched on at the minimum working distance D1 and at the maximum working distance D2, respectively. As stated above, this optimal condition, in which the solid emission angle of the lighting device 6 (subtended by the maximum illumination beam T0) is equal to the required minimum solid emission angle, can be obtained with any of the embodiments of Figures 23 to 27. The construction of Figure 22 could however be used; simply, in that case some light sources 18 of the array 17 would be always off at all working distances D.
By consulting a table similar to Table 1 and/or a representation similar to that of Figure 29, generally extended to the case of a two-dimensional array 17, and/or by the methods described above, the indices mi, ni of the end light sources 18 to be switched on in the array 17 are therefore obtained, thus performing step 101, which determines, at each working distance D within the depth of field DOF of the sensor 4, the subset 18a of light sources 18 of the array 17 to be switched on to illuminate the whole region 16 framed on the substrate S by the sensor 4.
If the determination of the subset 18a of light sources 18 of the array 17 to be switched on to illuminate the whole region 16 framed on the substrate S by the sensor 4 is carried out with the analytic method, whether in real time (Fig. 7) or once and for all (Fig. 8), the driver 13 must know the reading distance D.
Such information can be provided by a suitable measuring device of the reading distance, which can be part of the reading system 1 shown in Figure 2 and communicate through the communication interface 9. Such a measuring device of the reading distance D can be made in different well-known ways, for example by a system based on a photocell, by a phase-measurement device, by a time-of-flight measurement device using a laser or LED beam, visible or IR (infrared) in nature, by a radar or ultrasound device, and so on.
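For the time-of-flight option, the distance follows from half the round-trip delay of the light pulse; a minimal sketch (Python, numbers illustrative only):

```python
# Time-of-flight sketch: a pulse travelling to the substrate and back
# covers 2 * D, so D = c * t / 2 with t the measured round-trip delay.
C_MM_PER_NS = 299.792458  # speed of light, mm per nanosecond

def tof_distance_mm(round_trip_ns):
    return C_MM_PER_NS * round_trip_ns / 2.0

print(round(tof_distance_mm(2.0), 1))  # a 2 ns round trip -> 299.9 mm
```

The sub-nanosecond timing this requires at short reading distances is why the parallax-based estimate described next, which needs no extra hardware, is attractive.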
However, the intrinsic flexibility of the array 17 of individually drivable light sources 18 of the lighting device 6 offers the possibility of illuminating on the substrate S luminous figures of variable shape and/or size, and/or of illuminating luminous figures whose position within the region 16 framed by the sensor 4 is a function of the reading distance D, possibly also with variable wavelength(s); this makes it possible to measure or estimate the reading distance D, to detect the presence of the substrate S, and to measure or estimate the focus condition of the image forming device 3.
By capturing with the image forming device 3 an image of the substrate S (partially) illuminated by the luminous figure, it is therefore possible to estimate, or even accurately measure, the distance at which the substrate S is located, namely the reading distance D. Alternatively, the estimate or measurement of the reading distance D is performed by the user and suitably supplied to the driver 13 through the manual control and/or data input device 11.
For example, with reference to Figures 30 and 31, where the image capture device 2 is for example an image capture device according to the embodiment of Figure 23, the driver 13 can drive the array 17 so as to switch on a subset 18b of light sources 18 emitting within a certain predetermined emission angle φ, so as to deliberately illuminate only part of the region 16 framed by the sensor 4. When the reading distance D changes, because of the parallax error between the lighting device 6 and the image forming device 3 (which in this case is not corrected, but deliberately exploited for this purpose), the size and position of the boundary of the luminous figure 23 illuminated on the substrate S change within the image captured by the sensor 4. In the case shown, the projected luminous figure 23 is a rectangle which, as the reading distance D increases, starts from one edge of the region 16 framed by the sensor 4 and widens more and more towards the opposite edge of the region 16. Similarly, a moving and widening band can be projected or, in the case of a two-dimensional array 17, a cross can be projected. If the substrate S is absent or outside the depth of field DOF, the luminous figure 23 does not fall, or falls only partially, within the region 16 framed by the sensor 4, or is too blurred, so that the function of detecting the presence of the substrate S is also obtained.
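How parallax encodes distance can be sketched numerically. In the toy model below (Python; the baseline b, emission angle phi and all numbers are illustrative assumptions, not values from the text), a beam emitted at a fixed angle from a vertex offset from the receiving axis lands at an apparent angle, seen from the receiving vertex, that drifts toward phi as the reading distance grows:

```python
import math

def apparent_angle_deg(D, b, phi_deg):
    """Angle, seen from the receiving vertex on axis Z, of the spot that a
    beam emitted at phi from a vertex offset by baseline b hits at distance D."""
    y = b + D * math.tan(math.radians(phi_deg))  # lateral spot position
    return math.degrees(math.atan2(y, D))

near = apparent_angle_deg(50.0, 10.0, 5.0)    # close substrate
far = apparent_angle_deg(500.0, 10.0, 5.0)    # distant substrate
# The spot sweeps across the framed region as D changes, which is what
# lets the captured image be used to estimate D.
```

Inverting this relation (solving for D from the spot position in the image) is what the driver, or a suitable algorithm in the reader, would do in practice.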
In an alternative embodiment, the driver 13 can drive the array 17 so as to switch on a configuration 18b of light sources 18 projecting, for example, a pair of inclined bars which, by changing position on the substrate S as a function of the working distance D, form a luminous figure continuously changing between a V, an X, an inverted V and two separate bars inclined oppositely to the initial ones, as described for example in the aforementioned EP 1466292 B1; the X shape can advantageously be associated with the optimum focusing distance of the image forming device 3. In another embodiment, the driver 13 can drive the array 17 so as to switch on a configuration 18b of light sources 18 projecting, for example, a pair of crosses which, by changing position on the substrate S as a function of the reading distance D, form a luminous figure continuously changing between two distinct crosses inclined differently from each other; the single cross obtained at the working distance D at which the two crosses overlap can advantageously be associated with the optimum focusing distance of the image forming device 3, as described for example in the aforementioned US 5,949,057. The estimate or measurement of the working distance D can also exploit the fact that the luminous figure 23 projected onto the substrate S progressively loses definition, in other words blurs, when moving away from the optimum focusing distance of the image forming device 3, as described above with reference to Figure 28. These and other similar embodiments therefore also allow the function of estimating or assessing the focus condition and/or, in the case of illumination in the visible range, of providing the user with visual information on the focus condition, implemented by means of the image capture device 2. In the case of the projection of two oblique bars, for example, the luminous figure also advantageously indicates to the user the direction in which the image capture device 2 and the substrate S are to be moved relative to each other in order to achieve the focus condition.
Alternatively or in addition, after having automatically measured or estimated the working distance D, or on the basis of information received from a device external to the image capture device 2, the driver 13 can drive the array 17 so as to switch on a configuration 18c of light sources 18 projecting onto the substrate S, at the distance D, a luminous figure 24 in the visible range that can be intuitively understood, for example the words "TOO FAR" or "TOO CLOSE"; the projection may exploit the blurring of the figure 24 (which a suitable matching of the focal length of the array 17 with the focal length of the receiving device 3 would otherwise overcome) to further convey the intended meaning, and in the in-focus condition the word can be "OK".
It should be noted that, while the above-described configuration 18b is switched on before the driver 13 has determined the subset 18a of light sources 18 illuminating the region 16 framed on the substrate S by the sensor 4, the configuration 18c just described can be switched on after the driver 13 has determined the subset 18a of light sources 18 illuminating the region 16 framed on the substrate S by the sensor 4, and can therefore advantageously be centred with respect to this subset 18a, as shown in Figure 34.
The intrinsic flexibility of the array 17 of individually drivable light sources 18 of the lighting device 6 also offers the possibility of implementing a result indicator. In this operating mode, the driver 13 drives the array 17 so as to switch on a configuration of light sources 18 illuminating on the substrate S a luminous figure indicating the positive or negative outcome of the attempt to capture an image and/or to decode the optical information C, and possibly the cause of a negative outcome, for example "OK", shown in Figure 34 through the configuration 18c of light sources 18, or, in a similar way, "NO". As an alternative, or in addition to such changes in the shape of the luminous figure indicating the result, changes in its size, colour and/or position can be used: for example, any green luminous figure would indicate a positive result, while a red luminous figure would indicate a negative one. Also in this case the configuration 18c is preferably centred with respect to the subset 18a.
The intrinsic flexibility of the array 17 of individually drivable light sources 18 of the lighting device 6 also offers the possibility of implementing an aiming device.
Thus, for example, in order to provide an aiming figure for the whole region 16 framed by the sensor 4, which assists the operator in positioning the reader with respect to the optical information C by showing on the substrate S a visual indication of the region 16 framed by the sensor 4, once the subset 18a of light sources 18 to be switched on to illuminate the whole region 16 framed by the sensor 4 has been determined, the driver 13 can drive the array 17 so as to switch on one or a certain number of light sources 18d at or near the edges of this subset 18a, so as to illuminate one or more portions of the boundary of the region 16 framed by the sensor 4, for example the corners in the case of a two-dimensional sensor 4, as schematically shown by the luminous aiming figure 26 in Figure 35. Alternatively or in addition, the driver 13 will be responsible for switching on one or a certain number of light sources 18 illuminating the middle of the four sides of the rectangle or quadrilateral delimited by the subset 18a, and/or a certain number of light sources arranged as a cross at the centre of the rectangle or quadrilateral delimited by the subset 18a.
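In the two-dimensional case, both aiming patterns can be read directly off the index bounds of the subset 18a; a trivial sketch (Python, with hypothetical index bounds):

```python
def corner_sources(m1, m2, n1, n2):
    """Four corner sources 18d of the 2-D subset 18a spanning rows m1..m2
    and columns n1..n2, lit as an aiming pattern (cf. Figure 35)."""
    return [(m1, n1), (m1, n2), (m2, n1), (m2, n2)]

def side_midpoints(m1, m2, n1, n2):
    """Mid-side sources, the alternative aiming pattern mentioned above."""
    mc, nc = (m1 + m2) // 2, (n1 + n2) // 2
    return [(m1, nc), (m2, nc), (mc, n1), (mc, n2)]

print(corner_sources(-6, 26, -4, 20))  # [(-6, -4), (-6, 20), (26, -4), (26, 20)]
```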
There are also various applications in which the image capture device 2 can advantageously capture only one or more regions of interest ROI within the region 16 framed by the sensor 4. The provision of the plurality of individually drivable light sources 18 of the array 17 makes it easy to obtain the illumination and/or aiming of such regions of interest ROI. In this case, the driver 13 drives the array 17 of the lighting device 6 so as to switch on only one or more configurations 18e (not shown) of the light sources 18 of the subset 18a determined with one of the methods described above, each configuration 18e being sized and positioned with respect to the subset 18a in the same way in which the associated region of interest ROI is sized and positioned with respect to the whole region 16 framed on the substrate S by the sensor 4.
A first application consists in configuring a reader 1 having a two-dimensional sensor 4 as a linear reader. To increase the frame rate and the reading reactivity, the number of working rows of the sensor 4 can be reduced to just a few rows (down to a single row), and this is known per se; in this case, the region of interest ROI is a thin rectangular region, ideally reduced to a line, arranged at the centre of the vertical field of view β2, β4 or of the horizontal field of view β1, β3. The configuration 18e of light sources 18 switched on by the driver 13 therefore comprises the central light source or sources of the subset 18a in one direction, and all the light sources of the subset 18a in the perpendicular direction, so as to project a thin luminous band onto the substrate S at the region of interest ROI.
A second application is the processing of images of documents having a standardized format or form. By way of example, Figure 36 illustrates a document or form 200 including different types of information to be processed, in particular:
- a region 201 containing coded information in OCR (Optical Character Recognition) format, in other words written characters recognizable by suitable software;
- one or more regions 202 containing optical codes, linear and/or two-dimensional;
- a region 203 containing other coded information in graphic form, such as handwritten text, a signature, a trademark or logo, a stamp or an image.
A suitable processing of a first image captured by the sensor 4, or of part of it, possibly carried out at low resolution, allows the positions of such regions 201 to 203 to be located, in a substantially well-known manner, within the region 16 framed by the sensor 4, which for simplicity of representation is assumed to coincide with the whole document 200. Should the region 16 framed by the sensor 4 extend beyond the whole document 200, this can be taken into account as a further region of interest.
Once one or more of such regions 201 to 203 have been located, the driver 13 can drive the array 17, in a manner similar to that described with reference to Figure 35, so as to switch on only the light sources 18 illuminating at least part, the centre and/or the boundary of the located region(s) 200 to 203, so as to serve as an aid for aiming and/or for the interactive selection of the located region to be actually processed, as shown by the aiming luminous figures 26, 261, 262, 263 in Figure 36.
The interactive selection by the user can be carried out, for example, by displaying the different regions 201 to 203, and possibly the whole region 16 framed by the sensor 4, together with associated distinct numerals, the numerals being projected by the lighting device 6 itself at or near the aiming luminous figures 26, 261, 262, 263 in the located regions 200 to 203; or, where the light sources 18 are adapted to emit, individually or as a whole, at least two different wavelengths in the visible range, by displaying the aiming luminous figures 26, 261, 262, 263 in different colours. Each numeral or colour can for example have an associated distinct button of the manual input device 11 of the reader 1, or there can be one or two buttons for cycling the selection through the different regions 200 to 203, the selection becoming final after a certain time, by pressing a further button, or in another suitable way. The region 200 to 203 currently selected can be highlighted, for example, by a stronger or intermittent illumination or the like, or a single region 201 to 203 at a time can be illuminated each time the selection button is pressed.
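The single-button cycling just described can be sketched as a small state machine (Python; the region names are illustrative, and the real driver would additionally highlight the current region by stronger or intermittent illumination):

```python
class RegionSelector:
    """Cycle through the candidate regions on each button press."""
    def __init__(self, regions):
        self.regions = regions
        self.index = 0          # start on the first candidate

    def press(self):
        """Advance to the next region and return it (wraps around)."""
        self.index = (self.index + 1) % len(self.regions)
        return self.regions[self.index]

sel = RegionSelector(["region 16", "area 201", "area 202", "area 203"])
print(sel.press())  # area 201
print(sel.press())  # area 202
```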
For the same purpose of interactive selection, or in a subsequent step once one or more located regions 200 to 203 have been selected by the user, the driver 13 can drive the array 17 so as to switch on only configurations 18e comprising the light sources 18 illuminating the located regions 200 to 203, so as to provide optimized illumination for the purpose of capturing the image of each region, as shown for example by the luminous regions 272, 273 in Figure 37.
In the case of a reader 1 used with standard documents or forms, the sizes and positions of the aiming figures 26, 26i and/or of the partial illumination figures 27, 27i within the region 16 framed by the sensor 4, corresponding to the whole form 200, can be preset in the reader 1 in a configuration step, as an alternative to locating the regions in real time.
As a further application, in the case of an unattended reader 1, for example reading optical codes C carried by objects moving relative to the reader 1, such as optical codes C on a conveyor belt, the driver 13 can drive the array 17 so as to switch on only a configuration 18f comprising the light sources 18 illuminating the region where the optical code C is located.
Still for the purpose of aiming at and/or selecting a region of interest on the substrate S and/or the whole framed region 16, as an alternative to illuminating the centre and/or the boundary of the region, at least part of the region can be illuminated completely (Figure 37).
In the further embodiment illustrated in Figure 38, two arrays 17, 17a of individually drivable light sources 18 can be arranged on opposite sides of the sensor 4 of the image forming device 3. The two arrays 17, 17a can be driven by the driver 13 so that each illuminates at most a respective half 28, 28a of the region 16 framed by the sensor 4. In this case, a large region 16 framed by the sensor 4 can be illuminated more easily.
Alternatively, two arrays 17,17a can by driver 13 by relative to receive axis Z it is symmetrical in a manner of drive, The radiosity in the whole region 16 taken by 4 frame of sensor or in one or more concern area to be doubled, This is by the way that the transmitting superposition of two arrays 17,17a are realized, as shown in Figure 39.By 4 frame of sensor on substrate S The illuminating and also automatically obtain evenly in the region 16 taken because array 17 because illuminating lower light further from substrate S Source 18 corresponds to the light source 18 because illuminating higher closer to substrate S of array 17a, and vice versa.
In Figures 38 and 39, non-inverting illumination optics, for example comprising single lenses 19b, are assumed at the two arrays 17, 17a, and non-inverting receiving optics are assumed at the sensor 4; it will be understood, however, that all the constructions described above may be used.
Similarly, in a further embodiment (not shown), four arrays of individually operable light sources 18 can be provided, arranged at the four sides of the rectangular, or in particular square, sensor 4 of the image forming device 3.
In the various auxiliary functions described above, the illumination device 6 can be made to work at low "resolution", in other words with the respective subsets 18a, 18b, 18c, 18d, 18e, 18f, by opening only alternate light sources 18, or by opening and closing alternate groups of one or more light sources 18, so as to consume less energy. Alternatively or additionally, the image forming device 3 can be run at low resolution by analysing only some of the photosensitive elements 14, for example only alternate photosensitive elements 14, or groups of photosensitive elements 14, of the whole sensor 4 or of one or more regions of interest; in other words, the reader 1 can implement a suitable algorithm to evaluate at least one first sample image at low resolution.
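Both low-resolution modes amount to stride-based subsampling. A minimal sketch follows; the function names and the stride of 2 ("alternate" elements) are assumptions made for illustration.

```python
# Sketch of the two low-"resolution" modes: driving only alternate
# light sources 18, and reading only alternate photosensitive
# elements 14. Both reduce to plain stride-based subsampling.

def alternate_sources(rows, cols, step=2):
    """Indices of the light sources left on when only every
    `step`-th source is driven, reducing energy consumption."""
    return [(r, c) for r in range(0, rows, step) for c in range(0, cols, step)]

def low_res_sample(image, step=2):
    """First sample image evaluated at low resolution, keeping only
    every `step`-th photosensitive element in each direction."""
    return [row[::step] for row in image[::step]]
```

For a stride of 2, a quarter of the sources are driven and a quarter of the photosensitive elements are analysed, at the cost of resolution in the sample image.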
Those skilled in the art will readily understand how to apply the concepts and methods described above, in particular the correspondence, expressed by the above equations, between an arbitrary point P in the work area 15 of the sensor 4 and the light source(s) 18 of each array 17, 17a to be opened in order to illuminate point P, so as to define precisely the criteria followed by the driver 13 in selecting which light sources 18 of the arrays 17, 17a to open, and optionally with which intensity and/or emission wavelength(s), in order to implement the various embodiments of illuminating the region 16 framed by the sensor 4 and/or the various other functions, for example those described above with reference to Figures 30 to 37.
Those skilled in the art will also understand that, in the various embodiments, the number of light sources 18 of the arrays 17, 17a and/or their intensity can be selected according to different factors, including: the depth of field DOF of the image forming device 3, the size and resolution of the sensor 4, cost, and the computing capacity of the driver 13 or of the processor that builds the look-up table.
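One way to keep the driver's computing load low, as the mention of a look-up table suggests, is to precompute the point-to-source correspondence once and reduce run-time selection to a table access. The sketch below stands in for the equations of the description with a deliberately simple assumed model in which each source lights one equal strip of the work area; the real correspondence would come from those equations.

```python
# Hedged sketch of the look-up-table approach: precompute, for every
# point P of work area 15, which light source 18 must be opened to
# illuminate it, so the driver 13 only performs a table access at
# run time. `strip_model` is an assumed stand-in for the equations.

def build_lut(n_points, n_sources, illuminated_points):
    """lut[p] -> index of the source to open to illuminate point p."""
    lut = [None] * n_points
    for s in range(n_sources):
        for p in illuminated_points(s, n_points, n_sources):
            lut[p] = s
    return lut

def strip_model(s, n_points, n_sources):
    # assumption: source s illuminates one equal strip of the work area
    per = n_points // n_sources
    return range(s * per, (s + 1) * per)
```

The table costs memory proportional to the number of work-area points, which is why the computing capacity of the driver 13 or of the processor building the table is listed above as a selection factor.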
It has been found that, in the two-dimensional case, a suitable number of light sources 18 is at least 32 × 32, preferably 64 × 64 or more, or, in the case of a sensor 4 with form factor 4:3, 44 × 32, preferably 86 × 64 or more. Similarly, in the one-dimensional case, a suitable number of individually addressable light sources 18 is 32, or 64 or more.
The image capture device 2 described above, and in particular its illumination device 6, therefore has clear advantages.
A first advantage is that, although the field of view of the image forming device 3 and the illuminated field of the illumination device 6 are not coaxial, any parallax error and perspective distortion error between them is avoided. This allows energy to be saved, because the illuminated region need not extend beyond the region 16 framed by the sensor 4 in order to allow for parallax error.
Moreover, the intrinsic flexibility of the array 17 of individually driven light sources 18 offers the possibility of easily changing the region illuminated on the substrate S, without any moving parts (apart from the embodiments with micromirrors): as described above, by simply opening only those light sources 18, in other words the subset 18a, needed to illuminate the region 16 framed by the sensor 4 on the substrate S, or by opening only a part of the light sources 18 of this subset 18a for the different purposes described above. In other words, the array 17 of individually operable light sources 18 allows the illumination device 6 of the invention to perform one or more further, different functions that, according to the prior art, are usually performed by separate devices, thus reducing the cost and bulk of the reader 1.
Even in the case where the illumination axis A and the receiving axis Z coincide, merging one or more of the aforesaid auxiliary functions into a single image capture device 2 is also novel, and in itself represents an inventive aspect.
There is also a variant in which the array 17 of light sources 18 is formed on a support having a central hole, allowing the array 17 to be arranged concentrically with the image forming device 3. This solution, which falls outside the scope of claim 1, has the advantage of implementing an arrangement symmetrical with respect to the optical receiving axis Z, at the cost of manufacturing a perforated support, which is not standard and complicates the design of the driver 13.
Similarly to the use of zoom and/or autofocus systems, the maximum illumination beam T0 of the illumination device 6 can also be made to change dynamically in size and/or proportions by well-known zoom and/or autofocus systems, for example electromechanical, piezoelectric or electro-optical actuators, for instance actuators using liquid lenses or deformable lenses, for moving one or more lenses of the illumination optics, and/or for changing the curvature of one or more lenses of the illumination optics, and/or for moving the array 17.
A further solution falling outside the scope of claim 1 comprises an illumination device formed by several relatively small OLED parts, in particular by irregularly shaped parts that can be placed together so that a certain number of them, for example three, form an overlapping many-sided figure. Depending on the reading distance, the irregular part forming the figure affected by parallax error with respect to the image forming device 3 is switched on. There may also be one or more rectangles and/or angular parts arranged to form one or more concentric rectangles around the irregular parts, the sequence of rectangles and/or angular parts being illuminated to provide an aiming figure.
The illumination optics 19a, 19b, 19c may be absent in the case in which the light sources 18 are sufficiently collimated and emit according to suitable directions, for example in the case of the array 17 arranged along a curved surface (Fig. 6).
In the case of an array 17 arranged along a curved surface (Fig. 6), all references to the plane of the array 17 apply to the plane tangent to the relevant portion of the array 17.

Claims (12)

1. An optical information reader (1) of the imager type, comprising:
- an image forming device (3), the image forming device (3) comprising a sensor (4), the sensor (4) comprising a one-dimensional or two-dimensional array of photosensitive elements (14) and defining an optical receiving axis (Z), at least one reading distance (D, D1, D2) and, at the at least one reading distance (D, D1, D2), a region (16, 161, 162) framed by the sensor (4) on a substrate (S, S1, S2),
- an illumination device (6), the illumination device (6) comprising an array (17) of adjacent light sources (18), the illumination device (6) defining an illumination optical axis (A),
- wherein each light source (18) is adapted to illuminate an area (22m) of a size smaller than the size of the region (16, 161, 162) framed by the sensor (4), in that: each light source (18) is sufficiently collimated and emits according to a suitable direction; or each light source (18) is provided with an optical element, the optical element being selected from the group formed by a projecting lens (19b) of its own, an aperture, a prism surface, a light guide, a GRIN lens and combinations thereof,
- the light sources (18) of the array (17) are micro-emitters, the micro-emitters being made with gallium nitride (GaN) technology,
- and the illumination axis (A) does not coincide with the receiving axis (Z).
2. The optical information reader (1) of the imager type according to claim 1, wherein the elementary region framed by each photosensitive element (14) of the sensor (4) can be illuminated at most by four light sources (18) arranged adjacent to one another within the array (17) so as to form a square; or, when the array (17) of adjacent light sources (18) is a one-dimensional array, the elementary region framed by each photosensitive element (14) of the sensor (4) can be illuminated at most by two adjacent light sources (18).
3. The optical information reader (1) of the imager type according to claim 1, wherein each light source (18) comprises a single illumination element.
4. The optical information reader (1) of the imager type according to any one of claims 1-3, wherein the array (17) of light sources is a one-dimensional array.
5. The optical information reader (1) of the imager type according to any one of claims 1-3, wherein the percentage change undergone by the total area illuminated by the illumination device (6) on the substrate (S) upon the opening/closing of a single light source (18) is less than or equal to 15%.
6. The optical information reader (1) of the imager type according to claim 5, wherein the percentage change is less than or equal to 10%.
7. The optical information reader (1) of the imager type according to claim 5, wherein the percentage change is less than or equal to 5%.
8. The optical information reader (1) of the imager type according to any one of claims 1-3, wherein the image capture device (2) further comprises a second array (17a) of adjacent light sources (18), the second array (17a) defining a second illumination axis, the second illumination axis not coinciding with the receiving axis (Z).
9. The optical information reader (1) of the imager type according to any one of claims 1-3, wherein the array (17) of light sources (18) is associated with at least one projecting lens (19a, 19b, 19c).
10. The optical information reader (1) of the imager type according to claim 9, wherein at least one projecting lens (19a, 19c) shared by the light sources (18) of the array (17) is provided.
11. The optical information reader (1) of the imager type according to claim 9, wherein the projecting lens (19a, 19b, 19c) is associated with a further optical element.
12. The optical information reader (1) of the imager type according to any one of claims 1-3, wherein the light sources (18) can be driven individually.
CN201410830194.8A 2010-03-11 2010-03-11 Image capture device Active CN104680113B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410830194.8A CN104680113B (en) 2010-03-11 2010-03-11 Image capture device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201080066573.4A CN102870121B (en) 2010-03-11 2010-03-11 Image capturing device
CN201410830194.8A CN104680113B (en) 2010-03-11 2010-03-11 Image capture device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201080066573.4A Division CN102870121B (en) 2010-03-11 2010-03-11 Image capturing device

Publications (2)

Publication Number Publication Date
CN104680113A CN104680113A (en) 2015-06-03
CN104680113B true CN104680113B (en) 2018-05-15

Family

ID=53315136

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410830194.8A Active CN104680113B (en) 2010-03-11 2010-03-11 Image capture device

Country Status (1)

Country Link
CN (1) CN104680113B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9747484B1 (en) * 2016-02-25 2017-08-29 Symbol Technologies, Llc Module or arrangement for, and method of, reading a target by image capture with an imaging reader having offset imaging and aiming systems
CN106980316B (en) * 2017-02-08 2019-10-29 南昌大学 A kind of ultrasonic array obstacle avoidance system applied to mobile platform
CN108594451B (en) * 2018-03-12 2020-01-24 Oppo广东移动通信有限公司 Control method, control device, depth camera and electronic device
WO2019174436A1 (en) * 2018-03-12 2019-09-19 Oppo广东移动通信有限公司 Control method, control device, depth camera and electronic device
CN110222712B (en) * 2019-04-30 2021-06-22 杰创智能科技股份有限公司 Multi-special-item target detection algorithm based on deep learning

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5168167A (en) * 1991-01-31 1992-12-01 International Business Machines Corporation Optical scanner having controllable light sources
CN101485235A (en) * 2006-06-30 2009-07-15 皇家飞利浦电子股份有限公司 Device and method for controlling a lighting system by proximity sensing of a spotlight control device and spotlight control device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6628445B2 (en) * 2000-03-17 2003-09-30 Accu-Sort Systems, Inc. Coplanar camera scanning system
US7846753B2 (en) * 2007-08-10 2010-12-07 Hong Kong Applied Science And Technology Research Institute Vertical light emitting diode and method of making a vertical light emitting diode


Also Published As

Publication number Publication date
CN104680113A (en) 2015-06-03

Similar Documents

Publication Publication Date Title
CN102870121B (en) Image capturing device
US9501678B2 (en) Indicia reading terminal including optical filter
CN102375968B (en) Image engine with integrated circuit structure for indicia reading terminal
US6003773A (en) Tablet style indicia reader with system for handling multiple indicia
CN100371941C (en) Imagingbar code reader with movable light beam simulation
CN202870848U (en) Graphic symbol identification system based on digital imaging
ES2820451T3 (en) Imaging barcode reader with light-emitting diode to generate a field of view
CN202995752U (en) Digital-imaging code symbol reading system
CN1308884C (en) Image capture system and method using common image array
US8087587B2 (en) Dual laser aiming patterns for an imaging-based bar code reader
CN104680113B (en) Image capture device
CN109074474A (en) Electronic equipment and correlation technique including the processing circuit for sensing the image from subarray spaced apart
CN101999128A (en) Systems and methods for forming a composite image of multiple portions of an object from multiple perspectives
CN108351955A (en) Compact image-forming module with rangefinder
US20080290171A1 (en) Illumination apparatus for an imaging-based bar code reader
FR3055447A1 (en) MORE VISIBLE VIEWING BRAND FOR OPTICAL READER
US7264168B2 (en) Asymmetrical scanner
CN106056026B (en) A kind of multi-direction bar code scanner of more single photosensitive receiving tubes of LASER Discharge Tube matching
ES2463390T3 (en) Reverse vending machine and representation method using images for it
US7571855B2 (en) Display with symbology
JP2011009802A (en) Optical wireless communication apparatus, optical wireless communication method and program

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant