CN107301359B - Imaging barcode reader with color separated sight and illuminator - Google Patents


Info

Publication number
CN107301359B
Authority
CN
China
Prior art keywords
frequency
light
patent application
scanner
sight
Prior art date
Legal status
Active
Application number
CN201610233259.XA
Other languages
Chinese (zh)
Other versions
CN107301359A (en)
Inventor
冯琛
任杰
Current Assignee
Hand Held Products Inc
Original Assignee
Hand Held Products Inc
Priority date
Filing date
Publication date
Application filed by Hand Held Products Inc filed Critical Hand Held Products Inc
Priority to CN202310091284.9A (CN115983301A)
Priority to CN201610233259.XA (CN107301359B)
Priority to US15/470,971 (US10055625B2)
Priority to EP21204528.0A (EP4006769A1)
Priority to EP17163708.5A (EP3232367B1)
Publication of CN107301359A
Application granted
Publication of CN107301359B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 7/00 Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K 7/10 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K 7/10544 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum
    • G06K 7/10821 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum further details of bar or optical code scanning devices
    • G06K 7/10881 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum further details of bar or optical code scanning devices constructional details of hand-held scanners

Abstract

An imaging barcode reader having a color separating sight and an illuminator. Scanners for machine-readable symbols, such as bar codes and two-dimensional matrix symbols, employ at least two different light frequencies (colors). The first frequency supports accurate aiming of the scanner at the symbol. The second frequency supports illumination of the machine-readable symbol so that the optical imaging element of the scanner can read the reflected illumination light at the second frequency. The use of two different light frequencies enables aiming and scanning to occur simultaneously without the aiming process interfering with the scanning process. This also enables the aiming frequency to be used for additional purposes, such as providing signaling to the user of the scanner. In an embodiment, two different light sources are used in the scanner to provide the different light frequencies. In an embodiment, different color filters are employed to separate and distinguish the light frequencies. In an embodiment, signal processing may be employed to digitally distinguish a plurality of separate frequencies in the light reflected from the symbol.

Description

Imaging barcode reader with color separated sight and illuminator
Technical Field
The present invention relates to methods and apparatus for decoding machine-readable symbols, and more particularly to methods and apparatus for aiming a symbol reader and illuminating a machine-readable symbol.
Background
Machine-readable symbols (MRS) provide a way to encode information in a compact printed form (or pop-up form) that can be scanned and then interpreted by an optically-based symbol detector. Such machine-readable symbols are often attached to (or imprinted on) product packaging, food products, general consumer items, machine parts, equipment, and other manufactured items for machine-based identification and tracking purposes.
One exemplary type of machine-readable symbol is a bar code, which employs a series of bars and white spaces oriented vertically along a single row. The set of bars and spaces corresponds to a codeword. The codeword is associated with an alpha-numeric symbol, one or more numeric digits, or other symbol function.
To facilitate encoding of larger amounts of information into a single machine-readable symbol, two-dimensional (2D) barcodes have been devised. These are also commonly referred to as stacked, matrix and/or area bar codes. The 2D matrix symbols employ an arrangement of cells (also referred to as elements or modules) in the shape of regular polygons, typically squares. The particular arrangement of cells in a 2D matrix symbol represents a data characteristic and/or a symbol function.
In this document, the terms "barcode" and "symbol" are used interchangeably, both generally referring to machine-readable symbols, whether linear or two-dimensional.
Symbol readers (or bar code readers), also known as scanners, are employed to read matrix symbols using a variety of optical scanning electronics and methods. In order to properly scan a symbol, the symbol must be within the field of view of the reader. Some readers are handheld and can be aimed at symbols; other readers are fixed in position and the symbol (and the object to which it is attached) must be placed within the field of view of the reader.
Either way, the symbol scanner may project a "sight pattern" or "sight beam" (a pattern of light that may indicate the center of the scanner's field of view); the aimer pattern may also include a corner pattern to indicate the edges of the field of view. Proper alignment or overlap of the projected aimer pattern with the target symbol indicates that the scanner is properly aimed for scanning.
Once the scanner and symbol are properly aligned such that the symbol is in the field of view of the scanner, the scanner performs image capture via the imaging element. When the imaging element captures an image of a symbol, it is often necessary to turn off the sight beam, because the sight pattern is visible to the imager and becomes noise superimposed on the symbol. Even when a particular color (red, amber, green) is chosen, existing aimers have high intensity at the wavelengths to which the image sensor is sensitive. Hence, the aimer illumination on the symbol is easily captured, which may disturb the symbol interpretation. In other words, the imager cannot reliably capture an image of the symbol while the sight is on. This is true for most commonly used image sensors with electronic rolling shutters. Thus, the time spent with the sight "on" (illuminating the symbol) directly reduces the response speed of the imaging barcode reader.
Alternatively, for a global shutter image sensor, reduced response speed may be less of an issue, because the sight may be turned on during the shutter-closed portion of the overall image capture cycle. But for global shutter image sensors, a sight with a short turn-on period also becomes less visible because of the limited output power of the sight light source. On-and-off sights also introduce a flashing pattern, which causes eye strain for the user.
One way to address this problem is to use a continuously-on sight that contributes less to the total image illumination; for example, the sight pattern may be thin or dotted. However, for 2D symbols with high density or poor print quality, or for 2D codes with lower redundancy rates, this tradeoff will introduce poor decoding rates.
Accordingly, there is a need for a system and method for both sight illumination and symbol capture illumination that avoids time sharing between the sight process and symbol capture, yet achieves a high level of accuracy performance for symbol decoding.
Disclosure of Invention
Thus, in one aspect, the present invention uses different colors of light (i.e., different frequencies or different frequency bands) to separate the sight pattern from the image-capture illumination bandwidth. This enables the symbol scanner to capture an image of the symbol even while the sight light is always on. In one embodiment, color separation may be achieved by adding a color blocking filter that prevents the aimer pattern from reaching the image sensor. The image sensor becomes effectively color-blind to the sight color. Images captured using such a device are free of the sight pattern.
Alternatively, color separation can also be achieved via software image processing with commonly used color image sensors, without the need for color blocking filters. However, software color image filtering can result in some degradation of image quality; decoding performance may be reduced; longer decoding times may be introduced; or may necessitate increased processor cost.
Some of the advantages of the color-separating sight may include:
the aimer pattern may be full frame, indicating the entire frame of the field of view (FOV), the center marker, and the near-center best-decoding region;
the sight pattern may also include an indicator, such as decode status (e.g., ready to trigger, busy decoding, decode successful, or decode failed);
the sight pattern may also indicate a decoding condition, such as too far or too close for decoding.
Drawings
FIG. 1 is a perspective view of an exemplary handheld symbol reader for acquiring data from machine-readable symbols.
FIG. 2 is an internal block diagram of an exemplary symbol reader for acquiring data from machine-readable symbols.
FIG. 3 is an internal block diagram of an exemplary symbol reader for acquiring data from machine-readable symbols.
FIG. 4 is an exploded view of the internal structural components of an exemplary symbol reader for acquiring data from machine-readable symbols.
FIG. 5 illustrates spectral characteristics of several exemplary light sources that may be employed in connection with an exemplary symbol reader for acquiring data from machine-readable symbols.
FIG. 6 illustrates spectral characteristics of several exemplary optical filters that may be employed in connection with an exemplary symbol reader for acquiring data from machine-readable symbols.
FIG. 7 is a flow chart of an exemplary method performed by an exemplary symbol reader for aiming the symbol reader and reading symbols via the symbol reader using at least two different frequency bands of light.
Detailed Description
In the following description, certain specific details are set forth in order to provide a thorough understanding of various embodiments. However, it will be understood by those skilled in the art that the present invention may be practiced without these details. In other instances, well-known structures associated with imagers, scanners, and/or other devices operable to read machine-readable symbols have not been described or illustrated in detail to avoid unnecessarily obscuring descriptions of the embodiments.
Throughout the following specification and claims, the word "comprise", and variations thereof such as "comprises" and "comprising", are to be interpreted in an open-ended sense, that is "including but not limited to", unless the context requires otherwise.
Reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
The headings provided herein are for convenience only and do not interpret the scope or meaning of the claimed invention.
Color, frequency, and frequency band.
Throughout the following discussion, it will be understood that references to "color," "color of light," or "frequency of light" (whether a general reference to "color" or a reference to a particular color, such as blue, red, yellow, etc.) may refer to more than a single frequency. Rather, such a reference to a first color or a first frequency may refer to a frequency band narrow enough to be distinguished (e.g., via a filter) from a second color at a second frequency band. So, for example, "red" may refer to a frequency range (approximately 405 THz to 480 THz) associated with the color red as perceived by humans, and it may be distinguished from other colors, such as blue (approximately 610 to 665 THz) or yellow (approximately 510 to 540 THz), among others.
Color may refer to the entire frequency range conventionally attributed to a particular color, or to one or more subsets of that range, with the understanding that the particular frequency range actually selected will be suitable for a given application (such as imaging a 2D symbol). In different embodiments, different frequency ranges within the common range (e.g., different frequency bands within the "green spectrum") may be employed. In practical applications, the projected frequency of light or the received frequency of light may extend slightly into the frequency band of an adjacent color; or the projected light may have multiple frequency bands but a specified frequency or band of frequencies that so dominates the intensity that the projected light is effectively light of the dominant frequency band.
It is also understood that in practical applications, a given frequency band may overlap two or even three conventionally named color domains (e.g., a given frequency band may include adjacent portions of the green and yellow frequency bands).
The term "broadband" as used herein refers to light emission across substantially all of a plurality of frequency bands of conventional colors (e.g., from red to green, or yellow to blue), with no frequency band substantially predominant in intensity. "Broadband" may also refer to emission in multiple frequency bands that are not adjacent but that typically span multiple different conventionally named colors. "Broadband" may also refer to light emission that is white, that is, across substantially all visible colors (red to blue). In general, the terms "frequency band" or "color" may be understood as referring to a significantly narrower range of frequencies, distinguishable from broadband emission.
In many cases in this document, no specific color is specified; rather, reference is made to a first frequency band or a second frequency band. It will be appreciated that in particular embodiments, each frequency band may be assigned a particular separate color suitable for the application at hand.
It will also be appreciated that although reference is made herein primarily to "color," "frequency of light," or "frequency bands," such description could readily be made in terms of the wavelength of light. "Color" or "frequency" is chosen for convenience only. Generally, in this document, "color," "colors," and "frequency" or "frequencies" are used interchangeably.
Listed here by way of example only (and without limitation) are some of the frequency ranges that may be conventionally assigned to the various visible colors:
Red: approximately 405 to 480 THz or 740 to 625 nm;
Orange: approximately 480 to 510 THz or 625 to 590 nm;
Amber: assigned differently at the boundary of orange and yellow, but typically centered at approximately 504 THz or 595 nm;
Yellow: approximately 510 to 540 THz or 590 to 560 nm;
Green: approximately 540 to 580 THz or 560 to 520 nm;
Cyan: approximately 580 to 610 THz or 520 to 495 nm;
Blue: approximately 610 to 665 THz or 495 to 450 nm;
Violet: approximately 665 to 790 THz or 450 to 380 nm.
Those skilled in the art will recognize that the frequency or wavelength of the particular division between common colors is somewhat arbitrary and may vary somewhat in descriptions from different sources.
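For reference, the frequency and wavelength figures listed above are related by f = c / λ. The short sketch below is offered only as an illustrative aid and is not part of the disclosed apparatus; the function names are chosen here purely for illustration.

# Illustrative only: convert between vacuum wavelength (nm) and optical frequency (THz).
C_M_PER_S = 299_792_458.0  # speed of light in vacuum, m/s

def nm_to_thz(wavelength_nm: float) -> float:
    # f = c / lambda, returned in THz for a wavelength given in nm.
    return C_M_PER_S / (wavelength_nm * 1e-9) / 1e12

def thz_to_nm(frequency_thz: float) -> float:
    # lambda = c / f, returned in nm for a frequency given in THz.
    return C_M_PER_S / (frequency_thz * 1e12) / 1e9

# Example: 595 nm (amber) corresponds to approximately 504 THz, as listed above.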
A symbol reader.
The present systems and methods include devices designed to read machine-readable symbols.
In an exemplary embodiment, such a device may be a handheld scanner. FIG. 1 is a perspective view of an exemplary handheld symbol reader 100 for acquiring data from a machine-readable symbol 102.
The machine-readable symbol 102 is affixed to a package 104 or the like such that a user points the hand-held symbol reader 100 at the machine-readable symbol 102, typically using an aiming beam 116 to guide aiming.
Symbol reader 100 may be a line scanner operable to emit a narrow beam 118 of electromagnetic energy and scan the narrow beam 118 of electromagnetic energy across a field of view 106 on a two-dimensional (2D) machine-readable symbol 102. In other embodiments, aperture devices, mirrors, lenses, etc., are adapted to scan across the symbol line to receive returned electromagnetic energy from a relatively small portion (e.g., cell) of the machine-readable symbol, which is detected by an optical detector system.
In still other embodiments, the 2D array symbol reader 100 obtains a captured image of the machine-readable symbol 102 (and an appropriate area of blank space around the machine-readable symbol). For the present system and method, capturing an image of the symbol may be the preferred method of operation of the symbol reader 100. Suitable image processing hardware 235 and software (see fig. 2 below) running on the processors 242, 244 are used to deconstruct the captured image to determine the data bits represented by the cells.
The machine-readable symbol reader 100 is illustrated as having a housing 108, a display 110, a keypad 112, and an actuator device 114. The actuator device 114 may be a trigger, button, or other suitable actuator operable by a user to initiate the symbol reading process.
The machine-readable symbol 102 shown in the figures is intended to be generic and thus illustrates various types and formats of machine-readable symbols. For example, some machine-readable symbols may include a single row of code words (e.g., a barcode). Other types of machine-readable symbols (e.g., matrices or area codes) may be configured in other shapes, such as circular, hexagonal, rectangular, square, and so forth. Many various types and forms of machine-readable symbols are intended to be included within the scope of the present systems and methods.
Internal block diagram of symbol reader.
FIG. 2 illustrates an internal block diagram of an exemplary symbol reader 100, which includes elements that may be present in a scanner to support the present systems and methods.
In one embodiment of the present system and method, the symbol reader 100 may be an optical reader. The optical reader 100 may include an illumination assembly 220 for illuminating a target object T having a 1D or 2D bar code symbol 102 attached or imprinted thereon. The optical reader 100 may also include an imaging component 230 for receiving an image of the subject T and generating an electrical output signal indicative of the data optically encoded therein.
Illumination assembly 220 may include, for example, one or more illumination source assemblies 222, such as one or more LEDs. The illumination assembly 220 may also include one or more associated illumination optics and/or aiming optics assemblies 224 for directing the illumination light 118 from the light source(s) 222 in the direction of the target object T. The optical assembly 224 may include a mirror, turning mirror, lens, or other light focusing or directing element (not shown). Two optical assemblies (224.1, 224.2) are illustrated in fig. 2, but a common or integrated optical assembly 224 may be used to focus both the aimer light 116 and the illumination light 118 in an application.
In an embodiment, a first separate light source (laser or LED) 222.1 is used to generate the sight light 116, and a second separate illumination light source (typically an LED, but possibly a laser) 222.2 is used to generate the reading light 118 to support the imaging assembly 230 in reading the symbol 102. In an alternative embodiment, the sight LED/laser 222.1 and the illumination LED 222.2 may be combined into a single illumination element.
In one embodiment, separate aiming optics 224.1 and illumination optics 224.2 are provided. In an alternative embodiment, a single optical element or group of optical elements may provide focusing for both aiming and illumination.
The imaging assembly 230 receives reflected light 260 reflected from the symbol 102. The reflected light 260 may include spectral components of the aimer light 116, the illumination light 118, or both. In one embodiment, the imaging assembly 230 may include an image sensor 232 (such as a 2D CCD or CMOS solid state image sensor) along with imaging optics 234 for receiving and focusing an image of the target object T onto the image sensor 232. The field of view of the imaging assembly 230 will depend on the application. In general, the field of view should be large enough that the imaging assembly can capture a bitmap representation of the scene within the image data read area at near read range.
In one embodiment of the present system and method, the exemplary symbol reader 100 of FIG. 2 also includes a programmable controller 240, which may include an integrated circuit microprocessor 242 and an Application Specific Integrated Circuit (ASIC) 244. Both processor 242 and ASIC 244 are programmable control devices capable of receiving, outputting, and processing data in accordance with programs stored in either or both of a read/write Random Access Memory (RAM) 245 and an Erasable Read Only Memory (EROM) 246. Both the processor 242 and the ASIC 244 are also connected to a general purpose bus 248, through which program data and working data (including address data) can be received and transmitted in either direction to or from any other circuit also connected to the bus. However, processor 242 and ASIC 244 may differ from each other in how they are manufactured and how they are used.
In one embodiment, processor 242 may be a general-purpose, off-the-shelf VLSI integrated circuit microprocessor that has overall control of the circuitry of FIG. 2, but which devotes most of its time to decoding image data stored in RAM 245 in accordance with program data stored in EROM 246. ASIC 244, on the other hand, may be a dedicated VLSI integrated circuit (such as a programmable logic or gate array) that is programmed to devote its time to functions other than decoding image data and, thus, relieve processor 242 of the burden of performing these functions.
In an alternative embodiment, the special purpose ASIC 244 may be eliminated entirely if the general purpose processor 242 is fast enough and powerful enough to perform all of the functions contemplated by the present systems and methods. Thus, it will be understood that neither the number of processors used, nor the division of work between them, has any fundamental significance for the purposes of the present systems and methods.
In one embodiment, the exemplary symbol reader 100 includes a signal processor 235 and an analog-to-digital (A/D) chip 236. Together, these chips take raw data from the image sensor 232 and convert the data into a digital format, which in one exemplary embodiment may be a digital format that communicates a particular color or color bandwidth for further processing by the programmable controller 240.
In one embodiment, the system and method of the present invention employs algorithms stored in EROM 246 that enable programmable controller 240 to analyze image data from signal processor 235 and A/D 236. In one embodiment and as described further below, this image analysis may include analyzing color or frequency information (levels of different frequency bands) in the image data. In one embodiment, and based in part on the color analysis, programmable controller 240 may then implement improved systems and methods to distinguish a first color provided by sight 222.1 (or sight filter 320.1, see fig. 3) from a second color provided by illuminator 222.2 (or illuminator filter 320.2, see fig. 3).
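By way of illustration only, the following sketch suggests one way the color analysis described above might be expressed in software, treating the "levels of different frequency bands" as mean intensities of color channels in a digitized color image. The use of NumPy arrays and the particular channel indices are assumptions made for this sketch and are not part of the disclosure.

import numpy as np

def band_levels(image: np.ndarray, aimer_channel: int = 2, illum_channel: int = 0):
    # Hypothetical example: estimate how strongly the assumed aimer band (e.g., blue,
    # channel 2) and the assumed illumination band (e.g., red, channel 0) contribute
    # to a captured H x W x 3 color image.
    aimer_level = float(image[..., aimer_channel].mean())
    illum_level = float(image[..., illum_channel].mean())
    return aimer_level, illum_level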
The exemplary symbol reader 100 may also include input/output (I/O) circuitry 237, for example, to support the use of the keyboard 112 and the trigger 114. The symbol reader 100 may also include output/display circuitry 238 that supports the display 110.
An exemplary symbol reader having color discrimination between the sight and the illuminator, and a color-selective image sensor.
In a conventional symbol reader 100, the light sources 222 (e.g., sight LED 222.1 and illumination LED 222.2, whether in fact two physical elements or only one) may employ: (i) a common color or frequency of light, or (ii) two similar colors or frequencies whose spectral bands are not separated (e.g., partially overlapping bands). This may lead to interference between projecting the aimer light 116 for aiming the reader 100 and projecting the illumination light 118 for the purpose of optically capturing the symbol 102. Prior art symbol readers have addressed these interference problems in the manner described above in this document, resulting in various performance compromises.
The present system and method introduces the use of at least two different colors or, equivalently, at least two different frequency bands, where at least one such frequency band is designated for the aimer light 116 and another such frequency band is designated for the illumination light 118. It will be appreciated that at least two of the different frequency bands employed for targeting and image acquisition, respectively, will be substantially non-overlapping. The preferred color frequency may vary in different embodiments.
A first exemplary embodiment of the present system and method may employ a monochrome image sensor 232, a blue aimer LED 222.1, and a so-called "white" illumination LED 222.2 (which projects primarily in the blue and yellow portions of the spectrum). This combination may effectively distinguish between the imaging light 118 and the aimer light 116 because the monochrome image sensor 232 may have a relatively weak response to blue frequencies. Thus, the yellow frequencies in the imaging light 118 will dominate in affecting the imaging sensor 232, and thus in imaging the symbol 102; while blue light (whether from the aimer LED 222.1 or the illumination LED 222.2) will introduce relatively little interference in the imaging process.
A second exemplary embodiment of the present system and method may employ a monochrome image sensor 232, a blue aimer LED 222.1, and a red or amber (approximately 504 THz or 595 nm) illumination LED 222.2. This combination may avoid the need for a color filter 330 in the imaging assembly 230 (see fig. 3 below).
A third exemplary embodiment of the present system and method may employ a color image sensor 232, an amber or cyan aimer LED 222.1, and an illumination LED 222.2 (or broadband illumination LED 222.2) having significant spectral intensity in a color other than amber or cyan. This combination may effectively distinguish between the imaging light 118 and the aimer light 116 because the color image sensor 232 may have a relatively weak response to the frequencies associated with amber or cyan.
A fourth exemplary embodiment of the present system and method employs software color filtering. Such an embodiment may use a first specified color for the sight LED 222.1; at least one second specified color (in a different frequency band than the first) for the illumination LED 222.2; and software image/color processing that removes the first specified color (the aimer color) from the signal generated by the imaging assembly 230.
This fourth exemplary embodiment may require the following stages: (i) converting the analog image signals from the imaging assembly 230 to digital form; (ii) performing Fourier analysis or the like on the digitized signal to identify specific frequency elements; and (iii) removing the frequency elements associated with the first specified color.
Possible costs of the fourth exemplary embodiment (the software color-filtering method) may include: (i) some degradation in image quality, which may reduce decoding performance; (ii) increased decoding time due to the color image filtering; and (iii) a possible need for a more expensive hardware processor 240. Some possible advantages of the software filtering approach are discussed further below in this document.
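A minimal sketch of one possible software color-filtering approach is given below. It assumes the output of a color image sensor is available as an H x W x 3 array (e.g., RGB), treats the first specified (aimer) color as a single channel (blue, by assumption), and builds a grayscale image for decoding from the remaining channels only. This is an illustration under those assumptions, not the specific processing claimed here.

import numpy as np

def remove_aimer_color(rgb_image: np.ndarray, aimer_channel: int = 2) -> np.ndarray:
    # Suppress the assumed aimer-color channel (blue = index 2, an assumption for this
    # sketch) and average the remaining channels into a grayscale image for decoding.
    keep = [c for c in range(rgb_image.shape[-1]) if c != aimer_channel]
    return rgb_image[..., keep].astype(np.float32).mean(axis=-1)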
The above embodiments are merely exemplary. Other combinations of one or more first colors projected by LED sight 222.1, one or more second spectrally different colors projected by illumination LED 222.2, and other possible spectral responses by image sensor 232 are contemplated within the scope and spirit of the present systems and methods, as recited in the appended claims.
In various embodiments of the present systems and methods, the sight spectrum and the illumination spectrum have at least one common color or one common spectral band, along with one or more colors that differ between the two. In an alternative embodiment, the sight color(s) and the illumination color(s) are substantially distinct from each other; that is to say, they have no substantial spectral overlap.
An exemplary symbol reader having one or more color filters.
FIG. 3 illustrates an internal block diagram of the exemplary symbol reader 100, which is similar to the exemplary symbol reader 100 of FIG. 2 above, but with additional elements. For many of the elements that are common between the two exemplary symbol readers (e.g., hardware processors 242/244, memory 245/246, bus 248, etc.), the description is not repeated below and may be assumed to be substantially the same as that provided above in connection with FIG. 2.
In the exemplary symbol reader 100 of FIG. 3, the filters 330, 320.1, and 320.2 are elements of the imaging assembly 230 or the illumination assembly 220. Filters 320/330 may be made of a variety of materials known in the art (e.g., glass, resin plastic, polyester, and polycarbonate), and may, for example, be dyed to pass certain colors or bandwidths of light while blocking other colors or bandwidths. Such filters may also be referred to as "photographic filters." The filters may be dyed or coated to achieve filtering, and multiple frequencies may be filtered by a single filtering element (e.g., using coatings of multiple different filtering materials).
In the figure, filters 320/330 are shown as elements separate and apart from the optics 224/234, and also from the image sensor 232 and LED/laser 222. In one embodiment, some or all of the filters 320/330 may be structurally separate from these other components 224/234/222/232.
However, those skilled in the art will recognize that this distinction is drawn for purposes of illustration and not limitation. In an alternative embodiment, for example, the imaging color filter 330 may be integrated with the imaging optics 234 or the image sensor 232. For example, the imaging optics 234 may include one or more lenses that pass only the specified frequency band, or may include one or more mirrors that reflect only the specified frequency band, such that the imaging optics 234 and the imaging color filter 330 are structurally one element.
In an alternative embodiment, for example, the aimer color filter 320.1 associated with the aiming light 116 may be integrated with the aiming optics 224.1 or the LED aimer 222.1. For example, aiming optics 224.1 may include one or more lenses that pass only the specified frequency band, or may include one or more mirrors that reflect only the specified frequency band, such that aiming optics 224.1 and aiming filter 320.1 are structurally one element. In an alternative embodiment, the LED sight 222.1 may have an integral color filter element or may be configured to generate only a single frequency band, which may limit the color emitted by the LED sight 222.1 to a specified frequency band.
Substantially similar considerations apply to the illumination filter 320.2 associated with the illumination optics 224.2 and the illumination LED 222.2.
Those skilled in the art will further appreciate that the positional or spatial order of the elements shown is for illustration only and that in various embodiments the positional or spatial order of the elements shown may be changed while substantially achieving the same filtering, optical, illumination, and image capture effects. For example, in the embodiment illustrated in fig. 3, imaging color filter 330 is illustrated as receiving light reflected from symbol 102, which is then partially transmitted (due to filtering) to imaging optics 234. In alternative embodiments with alternative structural ordering, light reflected by symbol 102 may first be received and focused by imaging optics 234 and then filtered by imaging color filter 330.
Similar considerations apply to aiming light and illumination light. For example, in the embodiment illustrated in fig. 3, light from illumination LED 222.2 is first filtered to a specified bandwidth by illumination filter 320.2, and then the filtered light is focused by illumination optics 224.2 onto symbol 102. In an alternative embodiment with an alternative structural ordering of components, light from illumination LED 222.2 is first focused by illumination optics 224.2 and then filtered by illumination color filter 320.2.
The color filters 320.1, 320.2 and 330 are generally used to implement the illumination and image capture processes discussed above, with at least a first frequency band being used for purposes of aiming the scanner 100 and at least one distinct second frequency band being used to illuminate the symbol 102 for image capture.
As will be appreciated by those skilled in the art, all, some, or none of the illustrated filters may be necessary to implement the present systems and methods.
Nomenclature: filters and cut-off filters.
By convention, filters are often named for the color they allow to pass, and may also be referred to as "band pass filters." For example, a blue filter may allow blue light to pass while substantially blocking all other colors. Viewed from its surface, such a filter therefore generally looks "bluish." Similarly, a red filter may allow the wavelengths associated with red to pass, such that the filter appears "reddish." A "cut-off filter" or "blocking filter" is a filter that blocks the indicated frequency or color. For example, a "blue-IR cut filter" or "blue blocking filter" blocks the frequencies associated with blue (and with infrared light) while substantially allowing other frequencies to pass.
In general, in this document, reference to a blue filter refers to a blue band-pass filter (which passes blue and thus "colors" white light blue). Reference to an amber filter refers to an amber band-pass filter, which colors white light amber. Where a cut-off filter is specifically intended, this will generally be stated as such in the text; however, in some cases, depending on the context and use of the element, the use of a cut-off filter may be apparent even when not so stated.
In one exemplary embodiment of the present system and method, the symbol reader 100 may employ a monochrome or color image sensor 232; a blue laser sight 222.1 or blue LED sight 222.1, or alternatively a blue sight bandpass filter 320.1; and a blue-infrared (IR) cut filter for both the illumination filter 320.2 and the imaging filter 330 (thereby removing the blue color from the illumination light 118). This combination excludes the sight band (blue) from the imaging process.
In an alternative exemplary embodiment employing three filters, both the laser/LED sight 222.1 and the illumination LED 222.2 may be broadband light emitters. In such embodiments, the aiming filter 320.1 may be used to filter the light from the LED aimer 222.1 to pass a first frequency band while blocking a second frequency band (different from the first); and the illumination filter 320.2 may be used to filter the light from the illumination LED 222.2 to pass the second frequency band.
The imaging color filter 330 associated with the imaging sensor 232 may also be configured to filter the received light. The imaging filter 330 may filter out (remove) the first frequency band passed by the aimer filter 320.1 (thereby filtering out aimer light), while allowing passage of the second frequency band passed by the illumination filter 320.2 (thereby allowing the image sensor 232 to receive the reflected illumination light).
In an alternative embodiment employing two filters, the laser/LED aimer 222.1 may emit broadband light but have the aimer filter 320.1 configured to pass only the first frequency of light (e.g., blue), while the illumination LED 222.2 may emit broadband light and have no associated filter (thus no illumination filter 320.2 is employed). The imaging color filter 330 associated with the imaging sensor 232 may also be configured to filter the received light. The imaging filter 330 may filter out (remove) the first frequency band passed by the aimer filter 320.1 (thereby filtering out aimer light); while allowing the passage of the remaining broadband frequencies (thereby allowing the image sensor 232 to receive all reflected illumination light except that reflected in the band of the LED sight 222.1).
In an alternative embodiment employing two filters, the laser/LED sight 222.1 and illumination LED 222.2 may have respective color filters 320.1 and 320.2, each configured to allow passage of a different frequency band (e.g., blue or red for the sight, but yellow for the illumination light). The image sensor 232 may be a charge-coupled device that is sensitive only or primarily to yellow light, and thus primarily to the illumination light, without the need for the imaging filter 330.
In alternative embodiments, more than three filters may also be employed, for example for signaling purposes. For example, the illumination LED 222.2 may have an associated illumination filter 320.2 configured to pass only a first frequency band (e.g., red). The image sensor 232 may have an imaging filter 330 configured to pass only the same first frequency band (in this case, red). The laser or LED aimer 222.1 may be a broadband emitter, but with multiple aimer color filters 320.1; different colors such as blue, green, orange, and purple may then be used to signal different states of the scanner 100. In an alternative embodiment, the scanner 100 may employ multiple aimer LEDs 222.1 (blue LEDs, green LEDs, orange LEDs, etc.) emitting different colors of light, instead of or in addition to the aimer filters 320.1.
The above combination of filters is merely exemplary. Those skilled in the art will appreciate that other combinations of filters may also be employed within the scope and spirit of the present system and method as set forth in the appended claims.
Exemplary structural arrangements.
Fig. 4 illustrates an exploded structural/component view of an exemplary symbol reader 100 in accordance with the present systems and methods. The associated component parts are arranged as they may be in one exemplary symbol reader. Elements that are part of the imaging assembly 230 and elements that are part of the lighting assembly 220 are labeled. Those skilled in the art will appreciate that an operating reader 100 will include other components not shown in the figure, such as processors (242, 244), memories (245, 246), I/O control chip 237, other electronic components (235, 236), power supply elements, circuit boards, and the like.
Symbol reader 100 includes an image sensor 232, an imaging color filter 330, and an imaging lens support 234.1 and an imaging lens 234.2 (both of which are elements of imaging optics 234).
Symbol reader 100 also includes illuminator LED 222.2, illuminator color filter 320.2, and illuminator optics/lens 224.2.
Symbol reader 100 also includes a sight LED 222.1, as well as a sight aperture 224.1 and a sight lens 224.2 (both of which are elements of the aiming optics 224.1).
It will be noted that in the embodiment shown, the aimer elements do not include a distinct, identifiable aimer filter 320.1. In one embodiment, and as discussed above, the symbol reader 100 may not require the sight filter 320.1. In one embodiment, the aimer LED 222.1 may be configured to emit light only in a limited frequency band or specific color range, thereby eliminating the need for the aimer color filter 320.1. In an alternative embodiment, color filtering may be performed by the sight lens 224.1. In an alternative embodiment, a distinct aimer color filter 320.1 (not shown in fig. 4) may be included at any of several points in the optical path of the aiming elements. In an alternative embodiment, the symbol reader 100 may have a plurality of sight color filters 320.1, which may be used for purposes such as signaling the user via different sight colors.
In one embodiment, the aimer aperture 224.1 may be configured to be adjustable to display various graphical images for user signaling and instruction purposes. In an alternative embodiment, the symbol reader 100 may employ a plurality of sight LEDs 222.1, possibly of different colors, to enable or facilitate signaling via the sight light.
The symbol reader 100 of fig. 4 is exemplary only. As noted above, not all elements may be included in all embodiments. For example, the use of different color filters and/or the presence of different color filters 320.1/320.2/330 may vary in different embodiments, and not all filters are necessarily present in all embodiments.
The symbol reader 100 also includes a base 408 that holds the other components in place and provides an exterior surface by which the reader 100 may be gripped and held. The base 408 includes a forward opening or receptacle configured to provide a path for light to exit and enter the reader 100, and to hold and mount the forward optical components. The illuminator receptacle 402 is configured to mount the illuminator lens 224.2; the imager receptacle 404 is configured to mount the imaging lens 234.2 (and possibly the imaging lens holder 234.1); and the sight receptacle 406 is configured to mount the sight lens 224.1. The base 408 may also provide internal mounting receptacles for the other electronic components listed above but not shown.
Illumination and filter selection.
Illumination: FIG. 5 provides spectral plots 502/504/506/508/510 for a variety of scanner light sources 222 and image sensors 232 that may be employed in connection with the present systems and methods, and in particular with an exemplary symbol reader 100 employing a blue sight source 222.1 and a white or red/amber illuminator/reading light source 222.2.
In one embodiment, such an exemplary scanner may employ, for the aimer LED 222.1, a blue/violet LED (502) emitting at a wavelength from 456 to 458 nm; or, in an alternative embodiment, a blue/violet laser diode emitting at wavelengths from about 435 nm to 500 nm, with a peak center wavelength of about 465 nm (504). In alternative embodiments, such wavelengths may also be generated in whole or in part by a combination of the aimer light source 222.1 and an appropriate aimer color filter 320.1.
In one embodiment, such an exemplary scanner may employ, for the illumination/reading LED 222.2, a broad-spectrum white LED (506) with two or more wavelength peaks; or, in an alternative embodiment, a red/amber LED (508) having a peak wavelength of approximately 625 nm. In an alternative embodiment, such wavelengths may also be generated in whole or in part by a combination of the illuminator light source 222.2 and an appropriate illuminator color filter 320.2.
In one embodiment, such an exemplary scanner may employ a charge coupled device or similar sensing element with a substantially broadband, frequency insensitive (i.e., monochromatic) spectral response for the image sensor 232 (510).
Filters: FIG. 6 provides spectral plots 602/604/606/608 for a variety of optical filters 320/330 that may be employed in conjunction with the exemplary symbol reader 100 in accordance with the present systems and methods.
In one embodiment, the exemplary scanner 100 may employ an amber bandpass filter (604) for the image sensor color filter 330, which will block the aimer illumination 116 from the blue aimer 222.1 while still passing the read light 118 from the white illuminator 222.2.
In an alternative embodiment, the exemplary scanner 100 may employ a red band pass filter (606) for the image sensor color filter 330 that will block the aimer illumination 116 from the green aimer 222.1 while passing the read light 118 from the red illuminator 222.2.
In an alternative embodiment, the exemplary scanner 100 may employ a yellow band pass filter (602) for the image sensor filter 330 that will block the aimer illumination 116 from the blue aimer 222.1 and also block the blue portion of the read light 118 from the white LED illuminator 222.2; but will still pass other frequencies of reading light from the white LED illuminator 222.2.
In an alternative embodiment, the exemplary scanner 100 may employ a customized band pass filter (608) for the image sensor color filter 330 that will block the aimer illumination 116 from reaching the image sensor 232, but will still pass other frequencies of read light from the white LED illuminator 222.2 through to the image sensor 232.
Optical band and filter design optimization.
In applying the present system and method, those skilled in the art will recognize that it is desirable to optimize the selection of the bands of light 116, 118 (and the accompanying selection of the filters 320) with a view to optimizing the performance of the scanner 100. Various considerations and factors may come into play in that optimization, including, for example and without limitation:
(1) Human eye response and perception: the color of the aimer light 116 and the associated filter 320.1 (if a filter is used) may be selected to provide the best contrast for the aimer, helping the user easily aim the scanner 100 with minimal eye strain.
(2) Signal-to-noise ratio optimization: both the aimer light 116 and the illumination light 118 may be selected with a view to achieving the best signal-to-noise ratio for the image sensor 232. In practice, this may require selecting the aimer light 116 to be spectrally as far as possible from the frequencies to which the image sensor 232 is most responsive.
(3) Ambient light contribution: in real-world applications, ambient light will contribute to the image signal. Ambient light typically includes natural sunlight, artificial indoor lighting, and combinations thereof. The optical band selection 116, 118 and filter 320 designs can be optimized based on statistics of these contributions to achieve optimal sight visibility and image sensitivity.
(4) Application of software-based color filtering: in real-world use of the scanner 100, ambient lighting conditions may change. Thus, dynamic discrimination of the bands of light used for aiming and detection may yield optimal scanner performance in varying environments. The software color-filtering method has been discussed above. Software filtering may have the advantage of providing real-time optimization of the image sensor 232 detection criteria, achieving optimal separation between the aimer light 116 and the desired image illumination across various application environments. With scene analysis and spectral analysis, the unwanted aimer pattern may be isolated from the image.
An exemplary method.
FIG. 7 presents a flowchart of an exemplary method for reading a machine-readable symbol 102 by the exemplary symbol reader 100. The method may be divided into alternative paths depending on the particular hardware and/or software configuration of the exemplary scanner 100.
The method begins with either or both of steps 705 and 710, and steps 705 and 710 may be performed in the order shown in one embodiment; in an alternative embodiment, steps 705 and 710 may be performed in reverse order (i.e., step 710 precedes step 705); and in an alternative embodiment, steps 705 and 710 are performed simultaneously.
In step 705, the reader 100 emits aiming light 116 from the sight 222.1 and possibly the associated color filter 320.1, the aiming light 116 including at least a first frequency band of light but excluding (or substantially minimizing) at least one second frequency band of light different from the first frequency band.
In step 710, the reader 100 emits read light 118 from the illuminator 222.2 and possibly the associated color filter 320.2, the read light 118 comprising at least a second frequency band different from the first frequency band from the sight 222.1.
In step 715, the symbol reader receives light reflected back from the machine-readable symbol.
In a first embodiment, the symbol reader has an imaging sensor 232 configured to respond to light at the second frequency rather than light at the first frequency. Continuing from step 715, in step 720.1 the image sensor 232, so configured, generates an electrical signal representative of light at the second frequency and suitable for symbol data determination. In step 735, symbol reader 100 determines symbol data from the electrical signal.
In an alternate second embodiment, the symbol reader has an imaging sensor 232 that is responsive to both light of the first frequency and light of the second frequency. Continuing from step 715, in step 720.2 the reflected received light is passed through a filter configured to pass light of the second frequency and block light of the first frequency. In step 725.2, the filtered light is received by the image sensor 232, which then generates an electrical signal representative of light at the second frequency and suitable for symbol data determination. In step 735, the symbol reader 100 determines symbol data from the electrical signal.
In an alternate third embodiment, the symbol reader has an imaging sensor 232 that is responsive to both light at the first frequency and light at the second frequency. Continuing from step 715, in step 720.3, the image sensor generates electrical signals in response to both the first and second frequency bands, thereby generating electrical signals reflecting both frequency bands. In step 725.3, the symbol reader 100 digitally filters the electrical signal to select signal elements indicative of the second frequency band while removing elements indicative of the first frequency band. In step 735, symbol reader 100 determines symbol data from the electrical signal.
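The flow of FIG. 7 can also be summarized in code form. The sketch below is illustrative only: the reader methods named here (emit_aimer, emit_illumination, capture_image, decode_symbol) are placeholders for the hardware and decoding operations described above, not an actual API, and the digital-filtering branch models removal of the first (aimer) band as suppression of one color channel, as in the earlier sketch.

import numpy as np

def read_symbol(reader, hardware_filtered: bool, aimer_channel: int = 2):
    # Illustrative control flow for FIG. 7; placeholder method names, not a real API.
    reader.emit_aimer(band="first")          # step 705: aiming light in the first band
    reader.emit_illumination(band="second")  # step 710: reading light in the second band
    image = reader.capture_image()           # step 715: receive light reflected from the symbol

    if not hardware_filtered:
        # Third embodiment (steps 720.3 and 725.3): the sensor responds to both bands,
        # so the first (aimer) band is removed digitally, modeled here as one channel.
        keep = [c for c in range(image.shape[-1]) if c != aimer_channel]
        image = image[..., keep].astype(np.float32).mean(axis=-1)
    # First and second embodiments (steps 720.1, 720.2, and 725.2): the first band never
    # reaches the sensor (insensitive sensor or optical cut filter), so no digital step.

    return reader.decode_symbol(image)       # step 735: determine the symbol data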
Summary.
The exemplary symbol reader 100 illuminates machine-readable symbols with at least two separate colors or frequency bands. The first frequency band is used to illuminate the symbol to assist in aiming the reader. The second frequency band is used to provide the illumination for actually reading the symbol. By employing two separate frequency bands, it is possible to aim the symbol reader 100 and read the machine-readable symbol 102 simultaneously, without degrading signal integrity and without reducing the reliability of the symbol reading process.
To supplement the present disclosure, the present application incorporates by reference in its entirety the following commonly assigned patents, patent application publications, and patent applications:
U.S. Pat. No. 6,832,725;
U.S. Pat. No.7,128,266;
U.S. Pat. No.7,159,783;
U.S. Pat. No.7,413,127;
U.S. Pat. No.7,726,575;
U.S. Pat. No. 8,294,969;
U.S. Pat. No. 8,317,105;
U.S. Pat. No. 8,322,622;
U.S. Pat. No. 8,366,005;
U.S. Pat. No. 8,371,507;
U.S. Pat. No. 8,376,233;
U.S. Pat. No. 8,381,979;
U.S. Pat. No. 8,390,909;
U.S. Pat. No. 8,408,464;
U.S. Pat. No. 8,408,468;
U.S. Pat. No. 8,408,469;
U.S. Pat. No. 8,424,768;
U.S. Pat. No. 8,448,863;
U.S. Pat. No. 8,457,013;
U.S. Pat. No. 8,459,557;
U.S. Pat. No. 8,469,272;
U.S. Pat. No. 8,474,712;
U.S. Pat. No. 8,479,992;
U.S. Pat. No. 8,490,877;
U.S. Pat. No. 8,517,271;
U.S. Pat. No. 8,523,076;
U.S. Pat. No. 8,528,818;
U.S. Pat. No. 8,544,737;
U.S. Pat. No. 8,548,242;
U.S. Pat. No. 8,548,420;
U.S. Pat. No. 8,550,335;
U.S. Pat. No. 8,550,354;
U.S. Pat. No. 8,550,357;
U.S. Pat. No. 8,556,174;
U.S. Pat. No. 8,556,176;
U.S. Pat. No. 8,556,177;
U.S. Pat. No. 8,559,767;
U.S. Pat. No. 8,599,957;
U.S. Pat. No. 8,561,895;
U.S. Pat. No. 8,561,903;
U.S. Pat. No. 8,561,905;
U.S. Pat. No. 8,565,107;
U.S. Pat. No. 8,571,307;
U.S. Pat. No. 8,579,200;
U.S. Pat. No. 8,583,924;
U.S. Pat. No. 8,584,945;
U.S. Pat. No. 8,587,595;
U.S. Pat. No. 8,587,697;
U.S. Pat. No. 8,588,869;
U.S. Pat. No. 8,590,789;
U.S. Pat. No. 8,596,539;
U.S. Pat. No. 8,596,542;
U.S. Pat. No. 8,596,543;
U.S. Pat. No. 8,599,271;
U.S. Pat. No. 8,599,957;
U.S. Pat. No. 8,600,158;
U.S. Pat. No. 8,600,167;
U.S. Pat. No. 8,602,309;
U.S. Pat. No. 8,608,053;
U.S. Pat. No. 8,608,071;
U.S. Pat. No. 8,611,309;
U.S. Pat. No. 8,615,487;
U.S. Pat. No. 8,616,454;
U.S. Pat. No. 8,621,123;
U.S. Pat. No. 8,622,303;
U.S. Pat. No. 8,628,013;
U.S. Pat. No. 8,628,015;
U.S. Pat. No. 8,628,016;
U.S. Pat. No. 8,629,926;
U.S. Pat. No. 8,630,491;
U.S. Pat. No. 8,635,309;
U.S. Pat. No. 8,636,200;
U.S. Pat. No. 8,636,212;
U.S. Pat. No. 8,636,215;
U.S. Pat. No. 8,636,224;
U.S. Pat. No. 8,638,806;
U.S. Pat. No. 8,640,958;
U.S. Pat. No. 8,640,960;
U.S. Pat. No. 8,643,717;
U.S. Pat. No. 8,646,692;
U.S. Pat. No. 8,646,694;
U.S. Pat. No. 8,657,200;
U.S. Pat. No. 8,659,397;
U.S. Pat. No. 8,668,149;
U.S. Pat. No. 8,678,285;
U.S. Pat. No. 8,678,286;
U.S. Pat. No. 8,682,077;
U.S. Pat. No. 8,687,282;
U.S. Pat. No. 8,692,927;
U.S. Pat. No. 8,695,880;
U.S. Pat. No. 8,698,949;
U.S. Pat. No. 8,717,494;
U.S. Pat. No. 8,717,494;
U.S. Pat. No. 8,720,783;
U.S. Pat. No. 8,723,804;
U.S. Pat. No. 8,723,904;
U.S. Pat. No. 8,727,223;
U.S. Pat. No. D702,237;
U.S. Pat. No. 8,740,082;
U.S. Pat. No. 8,740,085;
U.S. Pat. No. 8,746,563;
U.S. Pat. No. 8,750,445;
U.S. Pat. No. 8,752,766;
U.S. Pat. No. 8,756,059;
U.S. Pat. No. 8,757,495;
U.S. Pat. No. 8,760,563;
U.S. Pat. No. 8,763,909;
U.S. Pat. No. 8,777,108;
U.S. Pat. No. 8,777,109;
U.S. Pat. No. 8,779,898;
U.S. Pat. No. 8,781,520;
U.S. Pat. No. 8,783,573;
U.S. Pat. No. 8,789,757;
U.S. Pat. No. 8,789,758;
U.S. Pat. No. 8,789,759;
U.S. Pat. No. 8,794,520;
U.S. Pat. No. 8,794,522;
U.S. Pat. No. 8,794,525;
U.S. Pat. No. 8,794,526;
U.S. Pat. No. 8,798,367;
U.S. Pat. No. 8,807,431;
U.S. Pat. No. 8,807,432;
U.S. Pat. No. 8,820,630;
U.S. Pat. No. 8,822,848;
U.S. Pat. No. 8,824,692;
U.S. Pat. No. 8,824,696;
U.S. Pat. No. 8,842,849;
U.S. Pat. No. 8,844,822;
U.S. Pat. No. 8,844,823;
U.S. Pat. No. 8,849,019;
U.S. Pat. No. 8,851,383;
U.S. Pat. No. 8,854,633;
U.S. Pat. No. 8,866,963;
U.S. Pat. No. 8,868,421;
U.S. Pat. No. 8,868,519;
U.S. Pat. No. 8,868,802;
U.S. Pat. No. 8,868,803;
U.S. Pat. No. 8,870,074;
U.S. Pat. No. 8,879,639;
U.S. Pat. No. 8,880,426;
U.S. Pat. No. 8,881,983;
U.S. Pat. No. 8,881,987;
U.S. Pat. No. 8,903,172;
U.S. Pat. No. 8,908,995;
U.S. Pat. No. 8,910,870;
U.S. Pat. No. 8,910,875;
U.S. Pat. No. 8,914,290;
U.S. Pat. No. 8,914,788;
U.S. Pat. No. 8,915,439;
U.S. Pat. No. 8,915,444;
U.S. Pat. No. 8,916,789;
U.S. Pat. No. 8,918,250;
U.S. Pat. No. 8,918,564;
U.S. Pat. No. 8,925,818;
U.S. Pat. No. 8,939,374;
U.S. Pat. No. 8,942,480;
U.S. Pat. No. 8,944,313;
U.S. Pat. No. 8,944,327;
U.S. Pat. No. 8,944,332;
U.S. Pat. No. 8,950,678;
U.S. Pat. No. 8,967,468;
U.S. Pat. No. 8,971,346;
U.S. Pat. No. 8,976,030;
U.S. Pat. No. 8,976,368;
U.S. Pat. No. 8,978,981;
U.S. Pat. No. 8,978,983;
U.S. Pat. No. 8,978,984;
U.S. Pat. No. 8,985,456;
U.S. Pat. No. 8,985,457;
U.S. Pat. No. 8,985,459;
U.S. Pat. No. 8,985,461;
U.S. Pat. No. 8,988,578;
U.S. Pat. No. 8,988,590;
U.S. Pat. No. 8,991,704;
U.S. Pat. No. 8,996,194;
U.S. Pat. No. 8,996,384;
U.S. Pat. No. 9,002,641;
U.S. Pat. No. 9,007,368;
U.S. Pat. No. 9,010,641;
U.S. Pat. No. 9,015,513;
U.S. Pat. No. 9,016,576;
U.S. Pat. No. 9,022,288;
U.S. Pat. No. 9,030,964;
U.S. Pat. No. 9,033,240;
U.S. Pat. No. 9,033,242;
U.S. Pat. No. 9,036,054;
U.S. Pat. No. 9,037,344;
U.S. Pat. No. 9,038,911;
U.S. Pat. No. 9,038,915;
U.S. Pat. No. 9,047,098;
U.S. Pat. No. 9,047,359;
U.S. Pat. No. 9,047,420;
U.S. Pat. No. 9,047,525;
U.S. Pat. No. 9,047,531;
U.S. Pat. No. 9,053,055;
U.S. Pat. No. 9,053,378;
U.S. Pat. No. 9,053,380;
U.S. Pat. No. 9,058,526;
U.S. Pat. No. 9,064,165;
U.S. Pat. No. 9,064,167;
U.S. Pat. No. 9,064,168;
U.S. Pat. No. 9,064,254;
U.S. Pat. No. 9,066,032;
U.S. Pat. No. 9,070,032;
U.S. design patent No. D716,285;
U.S. design patent No. D723,560;
U.S. design patent No. D730,357;
U.S. design patent No. D730,901;
U.S. design patent No. D730,902;
U.S. design patent No. D733,112;
U.S. design patent No. D734,339;
international publication No. 2013/163789;
international publication No. 2013/173985;
international publication No. 2014/019130;
international publication No. 2014/110495;
U.S. patent application publication No. 2008/0185432;
U.S. patent application publication No. 2009/0134221;
U.S. patent application publication No. 2010/0177080;
U.S. patent application publication No. 2010/0177076;
U.S. patent application publication No. 2010/0177707;
U.S. patent application publication No. 2010/0177749;
U.S. patent application publication No. 2010/0265880;
U.S. patent application publication No. 2011/0202554;
U.S. patent application publication No. 2012/0111946;
U.S. patent application publication No. 2012/0168511;
U.S. patent application publication No. 2012/0168512;
U.S. patent application publication No. 2012/0193423;
U.S. patent application publication No. 2012/0203647;
U.S. patent application publication No. 2012/0223141;
U.S. patent application publication No. 2012/0228382;
U.S. patent application publication No. 2012/0248188;
U.S. patent application publication No. 2013/0043312;
U.S. patent application publication No. 2013/0082104;
U.S. patent application publication No. 2013/0175341;
U.S. patent application publication No. 2013/0175343;
U.S. patent application publication No. 2013/0257744;
U.S. patent application publication No. 2013/0257759;
U.S. patent application publication No. 2013/0270346;
U.S. patent application publication No. 2013/0287258;
U.S. patent application publication No. 2013/0292475;
U.S. patent application publication No. 2013/0292477;
U.S. patent application publication No. 2013/0293539;
U.S. patent application publication No. 2013/0293540;
U.S. patent application publication No. 2013/0306728;
U.S. patent application publication No. 2013/0306731;
U.S. patent application publication No. 2013/0307964;
U.S. patent application publication No. 2013/0308625;
U.S. patent application publication No. 2013/0313324;
U.S. patent application publication No. 2013/0313325;
U.S. patent application publication No. 2013/0342717;
U.S. patent application publication No. 2014/0001267;
U.S. patent application publication No. 2014/0008439;
U.S. patent application publication No. 2014/0025584;
U.S. patent application publication No. 2014/0034734;
U.S. patent application publication No. 2014/0036848;
U.S. patent application publication No. 2014/0039693;
U.S. patent application publication No. 2014/0042814;
U.S. patent application publication No. 2014/0049120;
U.S. patent application publication No. 2014/0049635;
U.S. patent application publication No. 2014/0061306;
U.S. patent application publication No. 2014/0063289;
U.S. patent application publication No. 2014/0066136;
U.S. patent application publication No. 2014/0067692;
U.S. patent application publication No. 2014/0070005;
U.S. patent application publication No. 2014/0071840;
U.S. patent application publication No. 2014/0074746;
U.S. patent application publication No. 2014/0076974;
U.S. patent application publication No. 2014/0078341;
U.S. patent application publication No. 2014/0078345;
U.S. patent application publication No. 2014/0097249;
U.S. patent application publication No. 2014/0098792;
U.S. patent application publication No. 2014/0100813;
U.S. patent application publication No. 2014/0103115;
U.S. patent application publication No. 2014/0104413;
U.S. patent application publication No. 2014/0104414;
U.S. patent application publication No. 2014/0104416;
U.S. patent application publication No. 2014/0104451;
U.S. patent application publication No. 2014/0106594;
U.S. patent application publication No. 2014/0106725;
U.S. patent application publication No. 2014/0108010;
U.S. patent application publication No. 2014/0108402;
U.S. patent application publication No. 2014/0110485;
U.S. patent application publication No. 2014/0114530;
U.S. patent application publication No. 2014/0124577;
U.S. patent application publication No. 2014/0124579;
U.S. patent application publication No. 2014/0125842;
U.S. patent application publication No. 2014/0125853;
U.S. patent application publication No. 2014/0125999;
U.S. patent application publication No. 2014/0129378;
U.S. patent application publication No. 2014/0131438;
U.S. patent application publication No. 2014/0131441;
U.S. patent application publication No. 2014/0131443;
U.S. patent application publication No. 2014/0131444;
U.S. patent application publication No. 2014/0131445;
U.S. patent application publication No. 2014/0131448;
U.S. patent application publication No. 2014/0133379;
U.S. patent application publication No. 2014/0136208;
U.S. patent application publication No. 2014/0140585;
U.S. patent application publication No. 2014/0151453;
U.S. patent application publication No. 2014/0152882;
U.S. patent application publication No. 2014/0158770;
U.S. patent application publication No. 2014/0159869;
U.S. patent application publication No. 2014/0166755;
U.S. patent application publication No. 2014/0166759;
U.S. patent application publication No. 2014/0168787;
U.S. patent application publication No. 2014/0175165;
U.S. patent application publication No. 2014/0175172;
U.S. patent application publication No. 2014/0191644;
U.S. patent application publication No. 2014/0191913;
U.S. patent application publication No. 2014/0197238;
U.S. patent application publication No. 2014/0197239;
U.S. patent application publication No. 2014/0197304;
U.S. patent application publication No. 2014/0214631;
U.S. patent application publication No. 2014/0217166;
U.S. patent application publication No. 2014/0217180;
U.S. patent application publication No. 2014/0231500;
U.S. patent application publication No. 2014/0232930;
U.S. patent application publication No. 2014/0247315;
U.S. patent application publication No. 2014/0263493;
U.S. patent application publication No. 2014/0263645;
U.S. patent application publication No. 2014/0267609;
U.S. patent application publication No. 2014/0270196;
U.S. patent application publication No. 2014/0270229;
U.S. patent application publication No. 2014/0278387;
U.S. patent application publication No. 2014/0278391;
U.S. patent application publication No. 2014/0282210;
U.S. patent application publication No. 2014/0284384;
U.S. patent application publication No. 2014/0288933;
U.S. patent application publication No. 2014/0297058;
U.S. patent application publication No. 2014/0299665;
U.S. patent application publication No. 2014/0312121;
U.S. patent application publication No. 2014/0319220;
U.S. patent application publication No. 2014/0319221;
U.S. patent application publication No. 2014/0326787;
U.S. patent application publication No. 2014/0332590;
U.S. patent application publication No. 2014/0344943;
U.S. patent application publication No. 2014/0346233;
U.S. patent application publication No. 2014/0351317;
U.S. patent application publication No. 2014/0353373;
U.S. patent application publication No. 2014/0361073;
U.S. patent application publication No. 2014/0361082;
U.S. patent application publication No. 2014/0362184;
U.S. patent application publication No. 2014/0363015;
U.S. patent application publication No. 2014/0369511;
U.S. patent application publication No. 2014/0374483;
U.S. patent application publication No. 2014/0374485;
U.S. patent application publication No. 2015/0001301;
U.S. patent application publication No. 2015/0001304;
U.S. patent application publication No. 2015/0003673;
U.S. patent application publication No. 2015/0009338;
U.S. patent application publication No. 2015/0009610;
U.S. patent application publication No. 2015/0014416;
U.S. patent application publication No. 2015/0021397;
U.S. patent application publication No. 2015/0028102;
U.S. patent application publication No. 2015/0028103;
U.S. patent application publication No. 2015/0028104;
U.S. patent application publication No. 2015/0029002;
U.S. patent application publication No. 2015/0032709;
U.S. patent application publication No. 2015/0039309;
U.S. patent application publication No. 2015/0039878;
U.S. patent application publication No. 2015/0040378;
U.S. patent application publication No. 2015/0048168;
U.S. patent application publication No. 2015/0049347;
U.S. patent application publication No. 2015/0051992;
U.S. patent application publication No. 2015/0053766;
U.S. patent application publication No. 2015/0053768;
U.S. patent application publication No. 2015/0053769;
U.S. patent application publication No. 2015/0060544;
U.S. patent application publication No. 2015/0062366;
U.S. patent application publication No. 2015/0063215;
U.S. patent application publication No. 2015/0063676;
U.S. patent application publication No. 2015/0069130;
U.S. patent application publication No. 2015/0071819;
U.S. patent application publication No. 2015/0083800;
U.S. patent application publication No. 2015/0086114;
U.S. patent application publication No. 2015/0088522;
U.S. patent application publication No. 2015/0096872;
U.S. patent application publication No. 2015/0099557;
U.S. patent application publication No. 2015/0100196;
U.S. patent application publication No. 2015/0102109;
U.S. patent application publication No. 2015/0115035;
U.S. patent application publication No. 2015/0127791;
U.S. patent application publication No. 2015/0128116;
U.S. patent application publication No. 2015/0129659;
U.S. patent application publication No. 2015/0133047;
U.S. patent application publication No. 2015/0134470;
U.S. patent application publication No. 2015/0136851;
U.S. patent application publication No. 2015/0136854;
U.S. patent application publication No. 2015/0142492;
U.S. patent application publication No. 2015/0144692;
U.S. patent application publication No. 2015/0144698;
U.S. patent application publication No. 2015/0144701;
U.S. patent application publication No. 2015/0149946;
U.S. patent application publication No. 2015/0161429;
U.S. patent application publication No. 2015/0169925;
U.S. patent application publication No. 2015/0169929;
U.S. patent application publication No. 2015/0178523;
U.S. patent application publication No. 2015/0178534;
U.S. patent application publication No. 2015/0178535;
U.S. patent application publication No. 2015/0178536;
U.S. patent application publication No. 2015/0178537;
U.S. patent application publication No. 2015/0181093;
U.S. patent application publication No. 2015/0181109;
(Feng et al) U.S. patent application Ser. No.13/367,978 entitled a Laser Scanning Module Employing an Elastomeric U-Hinge Based Laser Scanning Assembly, filed February 7, 2012;
(Fitch et al) U.S. patent application Ser. No.29/458,405 entitled an Electronic Device, filed June 19, 2013;
(London et al) U.S. patent application Ser. No.29/459,620 entitled an Electronic Device Enclosure, filed July 2, 2013;
(Oberpiller et al) U.S. patent application Ser. No.29/468,118 entitled an Electronic Device Case, filed September 26, 2013;
(Colavito et al) U.S. patent application Ser. No.14/150,393 entitled Indicia-reader Having Unitary Construction Scanner, filed January 8, 2014;
(Feng et al) U.S. patent application Ser. No.14/200,405 entitled Indicia Reader for Size-Limited Applications, filed March 7, 2014;
(Van Horn et al) U.S. patent application Ser. No.14/231,898 entitled Hand-Mounted Indicia-Reading Device with Finger Motion Triggering, filed April 1, 2014;
(Oberpiller et al) U.S. patent application Ser. No.29/486,759 entitled an Imaging Terminal, filed April 2, 2014;
(Showering) U.S. patent application Ser. No.14/257,364 entitled Docking System and Method Using Near Field Communication, filed April 21, 2014;
(Ackley et al) U.S. patent application Ser. No.14/264,173 entitled Autofocus Lens System for Indicia Readers, filed April 29, 2014;
(Jovanovski et al) U.S. patent application Ser. No.14/277,337 entitled MULTIPURPOSE OPTICAL READER, filed May 14, 2014;
(Liu et al) U.S. patent application Ser. No.14/283,282 entitled TERMINAL HAVING ILLUMINATION AND FOCUS CONTROL, filed May 21, 2014;
(Hejl) U.S. patent application Ser. No.14/327,827 entitled A MOBILE-PHONE ADAPTER FOR ELECTRONIC TRANSACTIONS filed 7/10/2014;
(Hejl) U.S. patent application Ser. No.14/334,934, entitled a SYSTEM AND METHOD FOR INDICIA VERIFICATION, filed on 18.7.2014;
(Xian et al) U.S. patent application No.14/339,708 entitled LASER SCANNING CODE SYMBOL READING SYSTEM filed 24/7/2014;
(Rueblinger et al) U.S. patent application Ser. No.14/340,627 entitled an AXIALLY REINFORCED FLEXIBLE SCAN ELEMENT filed 25/7/2014;
(Good et al) U.S. patent application Ser. No.14/446,391 entitled MULTIFUNCTION POINT OF SALE APPARATUS WITH OPTICAL SIGNATURE CAPTURE filed 30/7/2014;
(Todeschini) U.S. patent application No.14/452,697 entitled INTERACTIVE INDICIA READER filed 6.8.2014;
(Li et al) U.S. patent application Ser. No.14/453,019 entitled DIMENSIONING SYSTEM WITH GUIDED ALIGNMENT filed on 6.8.2014;
(Todeschini et al) U.S. patent application Ser. No.14/462,801 entitled MOBILE COMPUTING DEVICE WITH DATA COGNITION SOFTWARE filed on 19/8/2014;
(McCloskey et al) U.S. patent application Ser. No.14/483,056 entitled VARIABLE DEPTH OF FIELD BARCODE SCANNER filed 9/10/2014;
(Singel et al) U.S. patent application Ser. No.14/513,808 entitled IDENTIFYING INVENTORY ITEMS IN A STORAGE FACILITY filed 10/14 2014;
(Laffargue et al) U.S. patent application Ser. No.14/519,195 entitled HANDHELD DIMENSIONING SYSTEM WITH FEEDBACK, filed on 21/10/2014;
(Thuries et al) U.S. patent application Ser. No.14/519,179, entitled DIMENSIONING SYSTEM WITH MULTIPATH INTERFERENCE MITIGATION, filed on 21/10/2014;
(Ackley et al) U.S. patent application Ser. No.14/519,211 entitled SYSTEM AND METHOD FOR DIMENSIONING filed on 21/10/2014;
(Laffargue et al) U.S. patent application Ser. No.14/519,233 entitled HANDHELD DIMENSIONER WITH DATA-QUALITY INDICATION, filed on 21/10/2014;
(Ackley et al) U.S. patent application Ser. No.14/519,249 entitled HANDHELD DIMENSIONING SYSTEM WITH MEASUREMENT-CONFORMANCE FEEDBACK filed on 21/10/2014;
(Braho et al) U.S. patent application Ser. No.14/527,191, entitled METHOD AND SYSTEM FOR RECOGNIZING SPEECH USING WILDCARDS IN AN EXPECTED RESPONSE, filed 10/29/2014;
(Schoon et al) U.S. patent application Ser. No.14/529,563 entitled ADAPTABLE INTERFACE FOR A MOBILE COMPUTING DEVICE filed on 31/10/2014;
(Todeschini et al) U.S. patent application Ser. No.14/529,857 entitled BARCODE READER WITH SECURITY FEATURES filed on 31/10/2014;
(Bian et al) U.S. patent APPLICATION Ser. No.14/398,542 entitled PORTABLE ELECTRONIC DEVICES HAVING A SEPARATE LOCATION TRIGGER UNIT FOR USE IN CONTROLLING AN APPLICATION UNIT filed on 3.11.2014;
(Miller et al) U.S. patent application Ser. No.14/531,154 entitled DIRECTING AN INSPECTOR THROUGH AN INSPECTION, filed on 11/3/2014;
(Todeschini) U.S. patent application Ser. No.14/533,319, entitled BARCODE SCANNING SYSTEM USING WEARABLE DEVICE WITH EMBEDDED CAMERA, filed on 5.11.2014;
(Braho et al) U.S. patent application Ser. No.14/535,764, entitled CONCATENATED EXPECTED RESPONSES FOR SPEECH RECOGNITION, filed 11/7/2014;
(Todeschini) U.S. patent application Ser. No.14/568,305 entitled AUTO-CONTRAST VIEWFINDER FOR AN INDICIA READER filed 12.12.2014;
(Goldsmith) U.S. patent application Ser. No.14/573,022 entitled DYNAMIC DIAGNOSTIC INDICATOR GENERATION, filed on 17.12.2014;
(Ackley et al) U.S. patent application Ser. No.14/578,627 entitled SAFETY SYSTEM AND METHOD filed 12/22 2014;
(Bowles) U.S. patent application No.14/580,262 entitled MEDIA GATE FOR THERMAL TRANSFER PRINTERS, filed on 23.12.2014;
(Payne) U.S. patent application Ser. No.14/590,024 entitled SHELVING AND PACKAGE LOCATING SYSTEMS FOR DELIVERY VEHICLES filed on 6.1.2015;
(Ackley) U.S. patent application Ser. No.14/596,757, entitled SYSTEM AND METHOD FOR DETECTING BARCODE PRINTING ERRORS, filed on 14.1.2015;
(Chen et al) U.S. patent application Ser. No.14/416,147 entitled OPTICAL READING APPARATUS HAVING VARIABLE SETTINGS, filed 21/1/2015;
(Oberpiller et al) U.S. patent application Ser. No.14/614,706, entitled DEVICE FOR SUPPORTING AN ELECTRONIC TOOL ON A USER'S HAND, filed on 5.2.2015;
(Morton et al) U.S. patent application Ser. No.14/614,796, entitled CARGO APPORTIONMENT TECHNIQUES, filed on 5.2.2015;
(Bidwell et al) U.S. patent application Ser. No.29/516,892 entitled TABLET COMPUTER filed on 6.2.2015;
(Pecorari) U.S. patent application No.14/619,093 entitled METHODS FOR TRAINING A SPEECH RECOGNITION SYSTEM filed on 11.2.2015;
(Todeschini) U.S. patent application Ser. No.14/628,708 entitled DEVICE, SYSTEM, AND METHOD FOR DETERMINING THE STATUS OF CHECKOUT LANES filed on 23.2.2015;
(Gomez et al) U.S. patent application Ser. No.14/630,841, entitled TERMINAL INCLUDING IMAGING ASSEMBLY, filed on 25.2.2015;
(Sevier) U.S. patent application Ser. No.14/635,346 entitled SYSTEM AND METHOD FOR RELIABLE STORE-AND-FORWARD DATA HANDLING BY ENCODED INFORMATION READING TERMINALS filed on 2.3.2015;
(Zhou et al) U.S. patent application Ser. No.29/519,017 entitled SCANNER filed 3/2/2015;
(Zhu et al) U.S. patent application Ser. No.14/405,278 entitled DESIGN PATTERN FOR SECURE STORE filed on 9.3.2015;
(Kearney et al) U.S. patent application Ser. No.14/660,970, entitled DECODABLE INDICIA READING TERMINAL WITH COMBINED ILLUMINATION, filed 3/18/2015;
(Soule et al) U.S. patent application Ser. No.14/661,013, entitled REPROGRAMMING SYSTEM AND METHOD FOR DEVICES INCLUDING PROGRAMMING SYMBOL, filed 3/18/2015;
(Van Horn et al) U.S. patent application Ser. No.14/662,922 entitled MULTIFUNCTION POINT OF SALE SYSTEM, filed 3/19/2015;
(Davis et al) U.S. patent application Ser. No.14/663,638, entitled VEHICLE MOUNT COMPUTER WITH CONFIGURABLE IGNITION SWITCH BEHAVIOR, filed 3/20/2015;
(Todeschini) U.S. patent application Ser. No.14/664,063 entitled METHOD AND APPARATUS FOR SCANNING A BARCODE WITH A SMART DEVICE WHILE CONTINUOUSLY RUNNING AND DISPLAYING AN APPLICATION ON THE SMART DEVICE DISPLAY filed 3/20/2015;
(Funyak et al) U.S. patent application Ser. No.14/669,280, entitled TRANSFORMING COMPONENTS OF A WEB PAGE TO VOICE PROMPTS, filed on 26.3.2015;
(Bidwell) U.S. patent application Ser. No.14/674,329 filed 3/31/2015, entitled AIMER FOR BARCODE SCANNING;
(Huck) U.S. patent application Ser. No.14/676,109, entitled INDICIA READER, filed on 1/4/2015;
(Yeakley et al) U.S. patent application Ser. No.14/676,327, entitled DEVICE MANAGEMENT PROXY FOR SECURE DEVICES, filed on 1/4/2015;
(Showering) U.S. patent application Ser. No.14/676,898, entitled NAVIGATION SYSTEM CONFIGURED TO INTEGRATE MOTION SENSING DEVICE INPUTS, filed on 2.4.2015;
(Laffargue et al) U.S. patent application Ser. No.14/679,275 entitled DIMENSIONING SYSTEM CALIBRATION SYSTEMS AND METHODS filed on 6.4.2015;
(Bidwell et al) U.S. patent application Ser. No.29/523,098, entitled HANDLE FOR A TABLET COMPUTER, filed on 7.4.2015;
(Murawski et al) U.S. patent application Ser. No.14/682,615, entitled SYSTEM AND METHOD FOR Power MANAGEMENT OF Mobile DEVICES, filed on 9.4.2015;
(Qu et al) U.S. patent application Ser. No.14/686,822, entitled MULTIPLE PLATFORM SUPPORT SYSTEM AND METHOD, filed on 15/4/2015;
(Kohtz et al) U.S. patent application Ser. No.14/687,289, entitled SYSTEM FOR COMMUNICATION VIA A PERIPHERAL HUB, filed on 15.4.2015;
(Zhou et al) U.S. patent application Ser. No.29/524,186 entitled SCANNER, filed on 17.4.2015;
(Sewell et al) U.S. patent application Ser. No.14/695,364, entitled MEDICATION MANAGEMENT SYSTEM, filed 24/4/2015;
(Kubler et al) U.S. patent application Ser. No.14/695,923 entitled SECURE UNATTENDED NETWORK AUTHENTICATION filed 24/4/2015;
(Schulte et al) U.S. patent application Ser. No.29/525,068 entitled TABLET COMPUTER WITH REMOVABLE SCANNING DEVICE filed on 27/4/2015;
(Nahill et al) U.S. patent application Ser. No.14/699,436 filed on 29/4/2015, entitled SYMBOL READING SYSTEM HAVING PREDICTIVE DIAGNOSTICS;
(Todeschini et al) U.S. patent application Ser. No.14/702,110, entitled SYSTEM AND METHOD FOR REGULATING BARCODE DATA INJECTION INTO A RUNNING APPLICATION ON A SMART DEVICE, filed 5/1/2015;
(Young et al) U.S. patent application Ser. No.14/702,979, entitled TRACKING BATTERY CONDITIONS, filed on 4.5.2015;
U.S. patent application Ser. No.14/704,050, entitled INTERMEDIATE LINEAR POSITIONING, filed 5/2015 (Charpentier et al);
(Fitch et al) U.S. patent application Ser. No.14/705,012, entitled HANDS-FREE HUMAN MACHINE INTERFACE RESPONSIVE TO A DRIVER OF A VEHICLE, filed 5/6/2015;
(Hussey et al) U.S. patent application Ser. No.14/705,407, entitled METHOD AND SYSTEM TO PROTECT SOFTWARE-BASED NETWORK-CONNECTED DEVICES FROM ADVANCED PERSISTENT THREAT, filed 5/6/2015;
(Chamberlin) U.S. patent application Ser. No.14/707,037, entitled SYSTEM AND METHOD FOR DISPLAY OF INFORMATION USING A VEHICLE-MOUNT COMPUTER, filed on 8.5.2015;
(Pape) U.S. patent application Ser. No.14/707,123 entitled APPLICATION INDEPENDENT DEX/UCS INTERFACE filed on 8/5/2015;
(Smith et al) U.S. patent application Ser. No.14/707,492, entitled METHOD AND APPARATUS FOR READING OPTICAL INDICIA USING A PLURALITY OF DATA SOURCES, filed on 8.5.2015;
(Smith) U.S. patent application Ser. No.14/710,666 entitled PRE-PAID USAGE SYSTEM FOR ENCODED INFORMATION READING TERMINALS filed on 13.5.2015;
(Fitch et al) U.S. patent application Ser. No.29/526,918, entitled CHARGING BASE, filed on 14/5/2015;
(Venkatesha et al) U.S. patent application Ser. No.14/715,672 entitled AUGMENTED REALITY ENABLED HAZARD DISPLAY, filed 5/19/2015;
(Ackley) U.S. patent application Ser. No.14/715,916 entitled EVALUATING IMAGE VALUES filed 5/19/2015;
(Showering et al) U.S. patent application Ser. No.14/722,608 entitled INTERACTIVE USER INTERFACE FOR CAPTURING A DOCUMENT IN AN IMAGE SIGNAL filed on 27/5/2015;
(Oberpiller et al) U.S. patent application Ser. No.29/528,165 entitled IN-COUNTER BARCODE SCANNER, filed 5/27/2015;
(Wang et al) U.S. patent application Ser. No.14/724,134 entitled ELECTRONIC DEVICE WITH WIRELESS PATH SELECTION CAPABILITY filed on 28.5.2015;
(Barten) U.S. patent application Ser. No.14/724,849, entitled METHOD OF PROGRAMMING THE DEFAULT CABLE INTERFACE SOFTWARE IN AN INDICIA READING DEVICE, filed on 29.5.2015;
(Barber et al) U.S. patent application Ser. No.14/724,908 entitled IMAGING APPARATUS HAVING IMAGING ASSEMBLY, filed 5/29/2015;
(Caballero et al) U.S. patent application Ser. No.14/725,352 entitled APPARATUS AND METHOD FOR MONITORING ONE OR MORE PORTABLE DATA TERMINALS;
(Fitch et al) U.S. patent application Ser. No.29/528,590 entitled ELECTRONIC DEVICE filed 5/29/2015;
(Fitch et al) U.S. patent application Ser. No.29/528,890 entitled MOBILE COMPUTER HOUSING, filed on 2.6.2015;
(Caballero) U.S. patent application Ser. No.14/728,397 entitled DEVICE MANAGEMENT USING VIRTUAL INTERFACES, filed on 2.6.2015;
(Powilleit) U.S. patent application Ser. No.14/732,870, entitled DATA COLLECTION MODULE AND SYSTEM, filed on 8.6.2015;
(Zhou et al) U.S. patent application Ser. No.29/529,441 entitled INDICIA READING DEVICE, filed on 8.6.2015;
(Todeschini) U.S. patent application Ser. No.14/735,717 entitled INDICIA-READING SYSTEMS HAVING AN INTERFACE WITH A USER'S NERVOUS SYSTEM filed on 10.6.2015;
(Amundsen et al) U.S. patent application Ser. No.14/738,038, entitled METHOD OF AND SYSTEM FOR DETECTING OBJECT WEIGHING INTERFERENCES, filed on 12.6.2015;
(Bandringa) U.S. patent application No.14/740,320 entitled TACTILE SWITCH FOR a MOBILE ELECTRONIC DEVICE, filed on day 16, 6/2015;
(Ackley et al) U.S. patent application Ser. No.14/740,373 filed on 16.6.2015, entitled CALIBRATING A VOLUME DIMENSIONER;
(Xian et al) U.S. patent application No.14/742,818, entitled INDICIA READING SYSTEM EMPLOYING DIGITAL GAIN CONTROL, filed on 18.6.2015;
(Wang et al) U.S. patent application Ser. No.14/743,257, entitled WIRELESS MESH POINT PORTABLE DATA TERMINAL, filed on 18.6.2015;
(Vargo et al) U.S. patent application Ser. No.29/530,600 entitled CYCLONE, filed on 18.6.2015;
(Wang) U.S. patent application Ser. No.14/744,633 entitled IMAGE APPARATUS COMPRISING IMAGE SENSOR ARRAY HAVING SHARED GLOBAL SHUTTER CIRCUITRY filed on 19.6.2015;
(Todeschini et al) U.S. patent application Ser. No.14/744,836, entitled CLOUD-BASED SYSTEM FOR READING OF DECODABLE INDICIA, filed on 19/6/2015;
(Todeschini et al) U.S. patent application Ser. No.14/745,006 entitled SELECTIVE OUTPUT OF DECODED MESSAGE DATA filed on 19/6/2015;
(Thuries et al) U.S. patent application Ser. No.14/747,197, entitled OPTICAL PATTERN PROJECTOR, filed on 23.6.2015;
(Jovanovski et al) U.S. patent application Ser. No.14/747,490, entitled DUAL-PROJECTOR THREE-DIMENSIONAL SCANNER, filed on 23.6.2015; and
(Xie et al) U.S. patent application Ser. No.14/748,446 entitled CORDLESS INDICIA READER WITH A MULTIFUNCTION COIL FOR WIRELESS CHARGING AND EAS DEACTIVATION, filed 24.6.2015.
In the description and/or drawings, there have been disclosed typical embodiments of the invention. The present invention is not limited to such exemplary embodiments. Use of the term "and/or" includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and are thus not necessarily drawn to scale. Unless otherwise indicated, specific terms have been used in a generic and descriptive sense only and not for purposes of limitation.
The foregoing detailed description has set forth various embodiments of the devices and/or processes via the use of block diagrams, flowcharts, schematics, exemplary data structures, and examples. To the extent that such block diagrams, flowcharts, schematics, exemplary data structures, and examples contain one or more functions and/or operations, it will be understood by those skilled in the art that each function and/or operation within such block diagrams, flowcharts, schematics, exemplary data structures, or examples can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof.
In one embodiment, the present subject matter may be implemented via an Application Specific Integrated Circuit (ASIC). However, those skilled in the art will recognize that the embodiments disclosed herein can be equivalently implemented in standard integrated circuits, in whole or in part, as one or more computer programs running on one or more computers (e.g., as one or more programs running on one or more computer systems), as one or more programs running on one or more controllers (e.g., microcontrollers), as one or more programs running on one or more processors (e.g., microprocessors), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and/or firmware would be well within the skill of one of ordinary skill in the art in light of this disclosure.
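Purely as an illustrative, non-limiting sketch (and not a description of any particular claimed implementation), signal processing of the kind contemplated above, in which the aiming frequency is digitally separated from the reading frequency in the light reflected from the symbol, might be expressed in software roughly as follows. The channel assignments (a red reading frequency and a green aiming frequency), the leakage factor, and the function name are assumptions introduced only for this example.

# Illustrative sketch only; assumes a color imaging sensor whose red channel sees the
# illuminator's reading frequency and whose green channel sees the aimer's frequency.
import numpy as np

def extract_reading_image(frame_rgb, read_channel=0, aim_channel=1, leakage=0.1):
    """Return a grayscale image in which reflected aiming light is suppressed.

    frame_rgb    -- H x W x 3 array captured while the sight and illuminator are both on
    read_channel -- color channel carrying the reading frequency (assumed red = 0)
    aim_channel  -- color channel carrying the aiming frequency (assumed green = 1)
    leakage      -- assumed fraction of aiming light leaking into the read channel
    """
    frame = frame_rgb.astype(np.float32)
    reading = frame[..., read_channel]
    aiming = frame[..., aim_channel]
    # Subtract the estimated leakage of the aiming pattern so the projected aimer
    # does not appear in the image handed to the symbol decoder.
    cleaned = np.clip(reading - leakage * aiming, 0.0, 255.0)
    return cleaned.astype(np.uint8)

The same separation could, of course, be performed in hardware or firmware rather than in software, consistent with the preceding paragraph.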
Moreover, those skilled in the art will recognize that the control mechanisms taught herein are capable of being distributed as a program product in a variety of tangible forms, and that the illustrative embodiments apply equally regardless of the particular type of tangible instruction bearing media used to actually carry out the distribution. Examples of tangible instruction-bearing media include, but are not limited to, the following: recordable type media such as floppy disks, hard disk drives, CDROMs, digital magnetic tape, flash memory disks, and computer memory.
The various embodiments described above can be combined to provide further embodiments. These and other changes can be made to the systems and methods in light of the above detailed description. In general, in the following claims, the terms used should not be construed to limit the invention to the specific embodiments disclosed in the specification and the claims, but should be construed to include all machine-readable symbol scanning and processing systems and methods that operate in accordance with the claims. Accordingly, the invention is not limited by the disclosure, but instead its scope is to be determined entirely by the following claims.

Claims (20)

1. A scanner, comprising:
a sight configured to provide aiming light comprising at least one aiming frequency to support aiming of the scanner at a machine-readable symbol (MRS);
an illuminator configured to provide a read light including at least one read frequency that illuminates the MRS to support optical reading of the MRS; and
an imaging sensor having a filter configured to block the at least one aiming frequency and transmit the at least one read frequency, the imaging sensor configured to:
receive light reflected by the MRS, and
convert the reflected light into an electrical signal suitable for signal processing by the scanner;
wherein the illuminator comprises an illuminator filter configured to block the at least one aiming frequency.
2. The scanner of claim 1, wherein:
the sight is configured to provide aiming light comprising a first frequency;
the illuminator is configured to provide reading light at a second frequency, the second frequency being different from the first frequency; and
the imaging sensor is configured to respond to the second frequency and not to respond to the first frequency.
3. The scanner of claim 1, wherein:
the sight is configured to provide aiming light comprising a first frequency;
the illuminator is configured to provide reading light at a second frequency, the second frequency being different from the first frequency;
the optical filter is configured to block a first frequency and transmit a second frequency; and
the optical filter is configured and arranged to filter along an optical path that conveys light reflected from the MRS to the imaging sensor.
4. The scanner of claim 1, wherein:
the sight is configured to provide aiming light comprising a first frequency;
the illuminator is configured to provide reading light at a second frequency, the second frequency being different from the first frequency;
the optical filter is configured to block a first frequency and transmit a second frequency; and
the optical filter is interposed between the imaging sensor and the MRS.
5. The scanner of claim 1, wherein:
the sight is configured to provide aiming light comprising a first frequency;
the illuminator is configured to provide reading light comprising a plurality of frequencies including at least a second frequency different from the first frequency; and
the imaging sensor is configured to be responsive to at least a second frequency and not to be responsive to a first frequency.
6. The scanner of claim 1, wherein:
the sight is configured to provide aiming light comprising a first frequency;
the illuminator is configured to provide reading light comprising a plurality of frequencies including at least a second frequency different from the first frequency;
the optical filter is configured to block a first frequency and transmit a second frequency; and
the optical filter is configured and arranged to filter along an optical path that conveys light reflected from the MRS to the imaging sensor.
7. The scanner of claim 1, wherein:
the sight is configured to emit a first plurality of frequencies;
the at least one read frequency is a frequency at which the sight is not emitting; and
the imaging sensor is configured to be responsive to at least one reading frequency.
8. The scanner of claim 1, wherein:
the sight is configured to emit a first plurality of frequencies;
the at least one read frequency is a frequency at which the sight is not emitting;
the optical filter is configured to transmit light of the at least one read frequency and to block light of the first plurality of frequencies emitted by the sight.
9. The scanner of claim 7, wherein the sight comprises:
a sight light source emitting a first plurality of frequencies and emitting at least one reading frequency; and
a sight filter configured to block at least one read frequency.
10. The scanner of claim 1, wherein the sight is configured to generate a visual indicator of at least one of:
data of a machine-readable symbol;
instructions for operating an electronic scanner;
an electronic scanner function;
an electronic scanner state; and
electronic scanner activity.
11. The scanner of claim 10, wherein the sight is configured to generate a visual indicator via at least one of:
a sight light source that emits a plurality of colors;
a plurality of sight light sources, each light source of the plurality emitting a different color;
a sight filter configured to select a color from among a plurality of colors emitted by the sight; and
an image projector element.
12. The scanner of claim 1, wherein:
the sight and the illuminator are configured to be simultaneously operable; and
the imaging sensor is configured to acquire the MRS while the aiming light is emitted.
13. The scanner of claim 1, wherein the sight is configured to emit a full-frame light pattern covering a full field of view of the MRS.
14. The scanner of claim 1, wherein the imaging sensor comprises:
a photosensor that converts the received reflected light into an electrical signal; and
a signal processing module configured to remove from the electrical signal a signal element representing aiming light reflected from the MRS.
15. A scanner, comprising:
a sight providing aiming light to support aiming of the scanner at a machine-readable symbol (MRS);
an illuminator that provides a read light that illuminates the MRS to support optical reading of the MRS; and
an imaging sensor comprising a light-to-electrical conversion element and a signal filter module, and configured such that upon receiving light reflected from the MRS:
the light-to-electrical conversion element converts the received light reflected from the MRS into a first electrical signal; and
the signal filter module performs a signal filtering operation to extract, from the first electrical signal, a second electrical signal representative of received read light, the second electrical signal being suitable for determination of the data content of the MRS by the scanner.
16. The scanner of claim 15, wherein the imaging sensor is configured to distinguish first light frequencies that are present in the reading light but not present in the aiming light.
17. The scanner of claim 16, comprising an optical bandpass filter configured to pass the first light frequencies and block second light frequencies.
18. The scanner of claim 15, wherein:
the imaging sensor is configured such that upon receiving light reflected from the MRS, the imaging sensor distinguishes, in the received light, the aiming light reflected from the MRS from the reading light reflected from the MRS.
19. The scanner of claim 18, wherein the signal filter module is configured to dynamically select an optimal frequency from among a plurality of frequencies of the received read light to convert to an electrical signal suitable for determining a data content of an MRS.
20. The scanner of claim 19, wherein the selection of the optimal frequency is determined based on at least one of:
an ambient spectrum impinging on the MRS; and
optimizing an optical signal-to-noise ratio with respect to the imaging sensor.
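By way of illustration only, and forming no part of the claims above, the dynamic selection of an optimal frequency recited in claims 19 and 20 might be sketched as choosing, frame by frame, the color channel of the received reading light with the best estimated optical signal-to-noise ratio. The contrast-based SNR proxy and the function name below are assumptions introduced for this example, not elements of the claimed scanner.

# Illustrative sketch only; the per-channel "SNR" here is a crude hypothetical proxy:
# symbol modulation depth divided by a pixel-to-pixel noise estimate.
import numpy as np

def select_optimal_channel(frame_rgb):
    """Pick the color channel of a captured frame with the highest estimated SNR."""
    best_channel, best_snr = 0, float("-inf")
    for channel in range(frame_rgb.shape[-1]):
        plane = frame_rgb[..., channel].astype(np.float32)
        signal = plane.std()                                   # modulation depth of the symbol
        noise = np.abs(np.diff(plane, axis=1)).mean() + 1e-6   # high-frequency noise proxy
        snr = signal / noise
        if snr > best_snr:
            best_channel, best_snr = channel, snr
    return best_channel

In practice the selection could also weigh the ambient spectrum impinging on the MRS, as claim 20 recites.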
CN201610233259.XA 2016-04-15 2016-04-15 Imaging barcode reader with color separated sight and illuminator Active CN107301359B (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
CN202310091284.9A CN115983301A (en) 2016-04-15 2016-04-15 Imaging barcode reader with color separated sight and illuminator
CN201610233259.XA CN107301359B (en) 2016-04-15 2016-04-15 Imaging barcode reader with color separated sight and illuminator
US15/470,971 US10055625B2 (en) 2016-04-15 2017-03-28 Imaging barcode reader with color-separated aimer and illuminator
EP21204528.0A EP4006769A1 (en) 2016-04-15 2017-03-29 Imaging barcode reader with color-separated aimer and illuminator
EP17163708.5A EP3232367B1 (en) 2016-04-15 2017-03-29 Imaging barcode reader with color separated aimer and illuminator

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610233259.XA CN107301359B (en) 2016-04-15 2016-04-15 Imaging barcode reader with color separated sight and illuminator

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202310091284.9A Division CN115983301A (en) 2016-04-15 2016-04-15 Imaging barcode reader with color separated sight and illuminator

Publications (2)

Publication Number Publication Date
CN107301359A CN107301359A (en) 2017-10-27
CN107301359B true CN107301359B (en) 2023-02-21

Family

ID=60137231

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202310091284.9A Pending CN115983301A (en) 2016-04-15 2016-04-15 Imaging barcode reader with color separated sight and illuminator
CN201610233259.XA Active CN107301359B (en) 2016-04-15 2016-04-15 Imaging barcode reader with color separated sight and illuminator

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202310091284.9A Pending CN115983301A (en) 2016-04-15 2016-04-15 Imaging barcode reader with color separated sight and illuminator

Country Status (1)

Country Link
CN (2) CN115983301A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1458618A (en) * 2002-03-07 2003-11-26 Canadian Bank Note Company, Limited Photoelectric file reader for reading UV/IR visual mark
US6655597B1 (en) * 2000-06-27 2003-12-02 Symbol Technologies, Inc. Portable instrument for electro-optically reading indicia and for projecting a bit-mapped color image
CN1788270A (en) * 2003-02-13 2006-06-14 Symbol Technologies, Inc. Optical code reader with autofocus and interface unit
JP2007110179A (en) * 2005-10-11 2007-04-26 Noritsu Koki Co Ltd Image reading apparatus and image reading method
CN1955830A (en) * 2005-10-27 2007-05-02 Hewlett-Packard Development Company, L.P. Imaging systems, imaging device analysis systems, imaging device analysis methods, and light beam emission methods
CN102203800A (en) * 2010-01-21 2011-09-28 Metrologic Instruments, Inc. Indicia reading terminal including optical filter
CN202235372U (en) * 2011-10-21 2012-05-30 Liu Qiang Hepatocellular carcinoma detection and diagnosis system
CN104126187A (en) * 2012-02-01 2014-10-29 Opto Electronics Co., Ltd. System and method for noise reduction in a bar code signal

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030034394A1 (en) * 1999-10-04 2003-02-20 Hand Held Products, Inc. Optical reader comprising finely adjustable lens assembly
US9672398B2 (en) * 2013-08-26 2017-06-06 Intermec Ip Corporation Aiming imagers

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6655597B1 (en) * 2000-06-27 2003-12-02 Symbol Technologies, Inc. Portable instrument for electro-optically reading indicia and for projecting a bit-mapped color image
CN1458618A (en) * 2002-03-07 2003-11-26 Canadian Bank Note Company, Limited Photoelectric file reader for reading UV/IR visual mark
CN1788270A (en) * 2003-02-13 2006-06-14 Symbol Technologies, Inc. Optical code reader with autofocus and interface unit
JP2007110179A (en) * 2005-10-11 2007-04-26 Noritsu Koki Co Ltd Image reading apparatus and image reading method
CN1955830A (en) * 2005-10-27 2007-05-02 Hewlett-Packard Development Company, L.P. Imaging systems, imaging device analysis systems, imaging device analysis methods, and light beam emission methods
CN102203800A (en) * 2010-01-21 2011-09-28 Metrologic Instruments, Inc. Indicia reading terminal including optical filter
CN202235372U (en) * 2011-10-21 2012-05-30 Liu Qiang Hepatocellular carcinoma detection and diagnosis system
CN104126187A (en) * 2012-02-01 2014-10-29 Opto Electronics Co., Ltd. System and method for noise reduction in a bar code signal

Also Published As

Publication number Publication date
CN107301359A (en) 2017-10-27
CN115983301A (en) 2023-04-18

Similar Documents

Publication Publication Date Title
US10896304B2 (en) Indicia reader having a filtered multifunction image sensor
US10753802B2 (en) System and method of determining if a surface is printed or a device screen
US10055625B2 (en) Imaging barcode reader with color-separated aimer and illuminator
US9773142B2 (en) System and method for selectively reading code symbols
US10798316B2 (en) Multi-spectral imaging using longitudinal chromatic aberrations
US10055627B2 (en) Mobile imaging barcode scanner
CN205959208U (en) Scanner
US9940497B2 (en) Minimizing laser persistence on two-dimensional image sensors
US10360424B2 (en) Illuminator for DPM scanner
CN107301359B (en) Imaging barcode reader with color separated sight and illuminator
EP4006769A1 (en) Imaging barcode reader with color-separated aimer and illuminator
CN109424871B (en) Illuminator for bar code scanner
US11120238B2 (en) Decoding color barcodes
WO2019014862A1 (en) Coaxial aimer for imaging scanner
US10387699B2 (en) Waking system in barcode scanner

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant