US9038903B2 - Method and apparatus for controlling illumination - Google Patents
- Publication number
- US9038903B2 (application US13/690,229; US201213690229A)
- Authority
- US
- United States
- Prior art keywords
- frame data
- illumination
- short frame
- imaging sensors
- target object
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K7/00—Methods or arrangements for sensing record carriers, e.g. for reading patterns
- G06K7/10—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
- G06K7/10544—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum
- G06K7/10821—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum further details of bar or optical code scanning devices
- G06K7/1096—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum further details of bar or optical code scanning devices the scanner having more than one scanning window, e.g. two substantially orthogonally placed scanning windows for integration into a check-out counter of a super-market
Definitions
- the present invention relates to imaging-based barcode readers having two windows.
- a barcode is a coded pattern of graphical indicia comprised of a series of bars and spaces of varying widths, the bars and spaces having differing light-reflecting characteristics. The pattern of the bars and spaces encodes information. Barcodes may be one-dimensional (e.g., a UPC barcode) or two-dimensional (e.g., a DataMatrix barcode).
- Systems that read, that is, image and decode barcodes employing imaging camera systems are typically referred to as imaging-based barcode readers or barcode scanners.
- Imaging-based barcode readers may be portable or stationary.
- a portable barcode reader is one that is adapted to be held in a user's hand and moved with respect to target indicia, such as a target barcode, to be read, that is, imaged and decoded.
- Stationary barcode readers are mounted in a fixed position, for example, relative to a point-of-sales counter.
- Target objects (e.g., a product package that includes a target barcode) are presented to the reader so that the target barcode can be imaged and decoded.
- the barcode reader typically provides an audible and/or visual signal to indicate the target barcode has been successfully imaged and decoded.
- barcodes are presented, as opposed to being swiped. This typically happens when a swiped barcode failed to scan, so the operator tries a second time to scan it. Alternately, presentation is done by inexperienced users, such as when the reader is installed in a self-checkout installation.
- a typical example where a stationary imaging-based barcode reader would be utilized includes a point of sale counter/cash register where customers pay for their purchases.
- the reader is typically enclosed in a housing that is installed in the counter and normally includes a vertically oriented transparent window and/or a horizontally oriented transparent window, either of which may be used for reading the target barcode affixed to the target object, i.e., the product or product packaging for the product having the target barcode imprinted or affixed to it.
- the sales person (or customer, in the case of self-service checkout) sequentially presents each target object's barcode either to the vertically oriented window or the horizontally oriented window, whichever is more convenient given the specific size and shape of the target object and the position of the barcode on the target object.
- FIG. 1 depicts a workstation in accordance with some embodiments.
- FIG. 2 is a schematic of a bi-optical workstation that includes a plurality of imaging sensors in accordance with some embodiments.
- FIGS. 3A-3F are schematics of a bi-optical workstation that has six imaging sensors in accordance with some embodiments;
- FIG. 4A shows the group of other optical components associated with the imaging sensor in FIG. 3A .
- FIG. 4B is a flowchart of a method of operating an imaging scanner in accordance with some embodiments.
- FIG. 1 depicts a workstation 10 in accordance with some embodiments.
- the workstation 10 is stationary and includes a housing 20 .
- the housing 20 has a generally horizontal window 25 H and a generally vertical window 25 V.
- the housing 20 can be integrated into the sales counter of a point-of-transaction system.
- the point-of-transaction system can also include a cash register, a touch screen visual display, a printer for generating sales receipts, or another type of user interface.
- the workstation 10 can be used by retailers to process transactions involving the purchase of products bearing an identifying target, such as UPC symbols.
- either a sales person or a customer will present a product or target object 40 selected for purchase to the housing 20 . More particularly, a target barcode 30 imprinted or affixed to the target object will be presented in a region near the windows 25 H and 25 V for reading, that is, imaging and decoding of the coded indicia of the target barcode. Upon a successful reading of the target barcode, a visual and/or audible signal will be generated by the workstation 10 to indicate to the user that the target barcode 30 has been successfully imaged and decoded.
- a plurality of imaging sensors 50 are mounted at the workstation 10 , for capturing light passing through either or both windows from a target which can be a one- or two-dimensional symbol, such as a two-dimensional symbol on a driver's license, or any document, as described below.
- Each imaging sensor 50 is a solid-state area array, preferably a CCD or CMOS array.
- the imaging sensors 50 and their associated illuminators 52 are operatively connected to a programmed microprocessor or controller 54 operative for controlling the operation of these and other components.
- the microprocessor is the same as the one used for decoding the return light scattered from the target and for processing the captured target images.
- the controller 54 sends successive command signals to the illuminators 52 to pulse the LEDs for a short time period of 300 microseconds or less, and successively energizes the imaging sensors 50 to collect light from a target only during said time period, also known as the exposure time period.
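As an illustration of that pulse/exposure synchronization, here is a minimal sketch in Python; the `Illuminator` and `ImagingSensor` classes and their methods are hypothetical stand-ins for the workstation's hardware interfaces, not the patent's implementation, and only the 300-microsecond figure comes from the description above.

```python
# Minimal sketch (assumed interfaces): pulse the illuminator only for the
# sensor's exposure window, so light is collected only while the LEDs are on.

EXPOSURE_US = 300  # exposure/pulse duration from the description (300 us or less)

class Illuminator:
    def pulse(self, duration_us: int) -> None:
        print(f"LED pulse for {duration_us} us")

class ImagingSensor:
    def expose(self, duration_us: int) -> bytes:
        print(f"sensor exposed for {duration_us} us")
        return b"<frame>"

def capture_one_frame(sensor: ImagingSensor, illuminator: Illuminator) -> bytes:
    # The controller commands the illuminator and the sensor for the same window.
    illuminator.pulse(EXPOSURE_US)
    return sensor.expose(EXPOSURE_US)

if __name__ == "__main__":
    capture_one_frame(ImagingSensor(), Illuminator())
```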
- FIG. 2 is only a schematic representation of an all imaging sensor-based workstation as embodied in a bi-optical workstation with two windows.
- the workstation can have other kinds of housings with different shapes.
- the workstation can have one window, two windows, or more than two windows.
- the workstation can include between one and six imaging sensors.
- the bi-optical workstation can also include more than six imaging sensors.
- FIGS. 3A-3F are schematics of a bi-optical workstation that has six imaging sensors in accordance with some embodiments.
- the bi-optical workstation includes six imaging sensors C 1 , C 2 , C 3 , C 4 , C 5 , and C 6 , commonly mounted on a printed circuit board 22 .
- the printed circuit board 22 lies in a generally horizontal plane generally parallel to, and below, the generally horizontal window 25 H.
- the imaging sensor C 1 faces generally vertically upward toward an inclined folding mirror M 1 - a directly overhead at the left side of the horizontal window 25 H.
- the folding mirror M 1 - a faces another inclined narrow folding mirror M 1 - b located at the right side of the horizontal window 25 H.
- the folding mirror M 1 - b faces still another inclined wide folding mirror M 1 - c adjacent the mirror M 1 - a .
- the folding mirror M 1 - c faces out through the generally horizontal window 25 H toward the right side of the workstation.
- FIG. 3A shows that the imaging sensor C 1 is also associated with a group of other optical components 80 .
- FIG. 4A shows the group of other optical components 80 in detail.
- the imaging sensor C 1 includes a sensor array 81 and an imaging lens 82 .
- two light emitting diodes 85 a and 85 b spaced apart, are installed closely adjacent to the sensor array 81 .
- light emitted from the light emitting diode 85 a (or 85 b ), after bouncing off the folding mirrors M 1 - a , M 1 - b , and M 1 - c sequentially, exits the housing 20 as the first illumination pattern centered by the light ray 110 .
- the folding mirrors M 1 - a , M 1 - b , and M 1 - c also constitute part of an optical system for defining a predetermined field of view for the imaging sensor C 1 .
- the predetermined field of view for the imaging sensor C 1 generally is centered by the light ray 110 .
- the predetermined field of view for the imaging sensor C 1 is preferably within the first illumination pattern.
- FIG. 3B depicts the optical path for the imaging sensor C 2 .
- the imaging sensor C 2 and its associated optics in FIG. 3B are mirror symmetrical to the imaging sensor C 1 and its associated optics in FIG. 3A .
- the imaging sensor C 2 faces generally vertically upward toward an inclined folding mirror M 2 - a directly overhead at the right side of the horizontal window 25 H.
- the folding mirror M 2 - a faces another inclined narrow folding mirror M 2 - b located at the left side of the horizontal window 25 H.
- the folding mirror M 2 - b faces still another inclined wide folding mirror M 2 - c adjacent the mirror M 2 - a .
- the folding mirror M 2 - c faces out through the generally horizontal window 25 H toward the left side of the workstation.
- FIG. 3C depicts the optical path for the imaging sensor C 3 .
- the imaging sensor C 3 faces generally vertically upward toward an inclined folding mirror M 3 - a directly overhead at the left side of the vertical window 25 V.
- the folding mirror M 3 - a faces another inclined narrow folding mirror M 3 - b located at the right side of the vertical window 25 V.
- the folding mirror M 3 - b faces still another inclined wide folding mirror M 3 - c adjacent the mirror M 3 - a .
- the folding mirror M 3 - c faces out through the generally vertical window 25 V toward the right side of the workstation.
- FIG. 3D depicts the optical path for the imaging sensor C 4 .
- the imaging sensor C 4 and its associated optics in FIG. 3D are mirror symmetrical to the imaging sensor C 3 and its associated optics in FIG. 3C .
- the imaging sensor C 4 faces generally vertically upward toward an inclined folding mirror M 4 - a directly overhead at the right side of the vertical window 25 V.
- the folding mirror M 4 - a faces another inclined narrow folding mirror M 4 - b located at the left side of the vertical window 25 V.
- the folding mirror M 4 - b faces still another inclined wide folding mirror M 4 - c adjacent the mirror M 4 - a .
- the folding mirror M 4 - c faces out through the generally vertical window 25 V toward the left side of the workstation.
- FIG. 3E depicts the optical path for the imaging sensor C 5 .
- the imaging sensor C 5 and its associated optics are located generally near a center area between the imaging sensors C 1 and C 2 .
- the imaging sensor C 5 faces generally vertically upward toward an inclined folding mirror M 5 - a that is located directly overhead of the imaging sensor C 5 and generally near a center area at one end of the window 25 H.
- the folding mirror M 5 - a faces another inclined folding mirror M 5 - b located at the opposite end of the window 25 H.
- the folding mirror M 5 - b faces out through the window 25 H in an upward direction.
- FIG. 3F depicts the optical path for the imaging sensor C 6 .
- the imaging sensor C 6 and its associated optics are located generally near a center area between the imaging sensors C 3 and C 4 .
- the imaging sensor C 6 faces generally vertically upward toward an inclined folding mirror M 6 - a that is located directly overhead of the imaging sensor C 6 and generally near a center area at an upper end of the window 25 V.
- the folding mirror M 6 - a faces out through the window 25 V in a downward direction toward the countertop of the workstation.
- a bi-optic scanner uses a fixed, very short exposure time in order to capture sharp images of barcodes moving at high speed.
- the illumination system of an imaging bi-optic scanner has to deliver enough light power, and illuminate a large scanning volume of the scanner, to allow capturing bright images when the exposure time is very short. At the same time, the light level cannot be too high, so that images of barcodes that are close to the scanning windows are not saturated.
- some current methods of operating the scanner include alternating between two levels of illumination: (1) the low level allows capturing images that are not saturated when the barcode is close, yet yields dark images when the barcode is far, and (2) the high level allows capturing good images when the barcode is far, yet produces saturated images when the barcode is close.
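A minimal sketch of this alternating two-level scheme, for contrast with the short-frame method described next; the level values and the `capture_frame` helper are hypothetical placeholders.

```python
from itertools import cycle

LOW_LEVEL, HIGH_LEVEL = 30, 100  # hypothetical illumination levels (percent)

def capture_frame(illumination_level: int) -> str:
    # Placeholder for the real exposure/readout path.
    return f"frame captured at {illumination_level}% illumination"

def alternating_capture(num_frames: int) -> list[str]:
    # Alternate every frame between the two fixed levels, hoping that one of
    # them yields an image that is neither too dark nor saturated.
    levels = cycle((LOW_LEVEL, HIGH_LEVEL))
    return [capture_frame(next(levels)) for _ in range(num_frames)]

if __name__ == "__main__":
    for frame in alternating_capture(4):
        print(frame)
```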
- FIG. 4B is a flowchart of a method 200 for operating an imaging scanner in accordance with some embodiments.
- an object detector is used to detect whether a target object is presented to the imaging scanner. If the target object is detected, the imaging scanner will continue to capture images of the target object.
- light returned from the target object is captured with the imaging sensor to generate a short frame data while the illumination source is activated to provide an illumination light toward the target object with a first illumination level; subsequently, at block 250 , the short frame data is transmitted from the imaging sensor to the controller.
- a second illumination level is determined based upon the short frame data.
- light returned from the target object is captured with the imaging sensor to generate a regular frame data while the illumination source is activated to provide an illumination light toward the target object with the second illumination level, and the regular frame data is transmitted from the imaging sensor to the controller.
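Taken together, those blocks amount to a capture-analyze-capture loop. The sketch below is one hedged reading of it: the frame sizes, illumination values, saturation threshold, and the choice to take the short frame at the higher level are illustrative assumptions, not details taken from the claims.

```python
import numpy as np

LOW_LEVEL, HIGH_LEVEL = 30, 100   # hypothetical predetermined illumination levels
SATURATION_FRACTION = 0.02        # hypothetical limit on saturated pixels

def capture(illumination_level: int, short: bool) -> np.ndarray:
    # Stand-in for "activate illumination, expose the sensor, read out a frame".
    height, width = (48, 64) if short else (768, 1024)
    return np.random.default_rng().integers(0, 256, size=(height, width), dtype=np.uint8)

def choose_second_level(short_frame: np.ndarray) -> int:
    # If the short frame already saturates, fall back to the lower level.
    saturated = np.count_nonzero(short_frame >= 255) / short_frame.size
    return LOW_LEVEL if saturated > SATURATION_FRACTION else HIGH_LEVEL

def method_200_once() -> np.ndarray:
    short_frame = capture(HIGH_LEVEL, short=True)    # short frame at the first level
    second_level = choose_second_level(short_frame)  # controller analyzes short frame
    return capture(second_level, short=False)        # regular frame at the second level
```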
- the size of the regular frame data is at least 128 times larger than the size of the short frame data.
- the short frame data includes a histogram of an image captured by the imaging sensor.
- the short frame data includes sub-sampled imaging data obtained from the image captured by the imaging sensor while the illumination light is set to the first illumination level.
- the size of the regular frame data can be at least 50 times larger than the size of the short frame data.
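One hedged sketch of how such a compact short frame could be formed: subsample the sensor image and/or reduce it to a 256-bin histogram, so the payload sent to the controller is roughly two orders of magnitude smaller than the regular frame. The subsampling step and array shapes below are illustrative assumptions.

```python
import numpy as np

def make_short_frame(full_image: np.ndarray, step: int = 16) -> dict:
    """Reduce a full-resolution frame to a small short-frame payload."""
    subsampled = full_image[::step, ::step]                  # e.g. 1024x768 -> 64x48
    histogram = np.bincount(full_image.ravel(), minlength=256)
    return {"subsampled": subsampled, "histogram": histogram}

if __name__ == "__main__":
    full = np.random.default_rng().integers(0, 256, size=(768, 1024), dtype=np.uint8)
    short = make_short_frame(full)
    # The subsampled image alone is 256x smaller than the full frame here.
    print(full.size, short["subsampled"].size, int(short["histogram"].sum()))
```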
- the short frame data is processed to select the second illumination level from two predetermined illumination levels, and the second illumination level is selected to be lower than the first illumination level when the percentage of saturated pixels in the short frame data is larger than a predetermined threshold level.
- the first illumination can be selected to be equal to the larger one of the two predetermined illumination levels. This allows the highest illumination level to be selected for capturing the regular frame data while avoiding image distortion due to image saturation.
- the criterion in this case is that the number of saturated pixels in the short frame and/or its histogram is below the threshold; otherwise, the lower of the two levels is used for capturing the regular frame data.
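Read that way, the selection rule is simple: keep the higher level for the regular frame only while the saturated-pixel count in the short frame's histogram stays under the threshold. A minimal sketch, with a hypothetical threshold:

```python
def select_level_starting_high(histogram, low_level: int, high_level: int,
                               max_saturated_fraction: float = 0.02) -> int:
    # histogram: 256-bin tally from a short frame captured at high_level;
    # the top bin counts saturated pixels.
    total = sum(histogram)
    saturated_fraction = (histogram[255] / total) if total else 0.0
    return high_level if saturated_fraction <= max_saturated_fraction else low_level
```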
- the short frame data is processed to select the second illumination level from two predetermined illumination levels, and the second illumination level is selected to be higher than the first illumination level unless the percentage of saturated pixels in an extrapolation of the short frame data is larger than a predetermined threshold level. For example, when the first illumination is selected to be equal to the smaller one of the two predetermined illumination levels, digital/analog gain and image extrapolation can be used to select the second illumination level for capturing the regular frame data. An advantage is that perceived illumination and power consumption are only slightly affected.
- the short frame data is acquired using the first illumination level, which is the lower of the two illumination levels.
- the second illumination level is selected to be the higher of the two illumination levels when the percentage of saturated pixels in an extrapolation of the short frame data is smaller than a predetermined threshold. For example, when the first illumination is selected to be equal to the smaller one of the two predetermined illumination levels, digital/analog gain and image extrapolation can be used to select the second illumination level for capturing the regular frame data.
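A hedged sketch of this lower-level-first variant: extrapolate the short frame to the higher level (here by simple linear scaling standing in for digital/analog gain and image extrapolation) and use the higher level only if the extrapolated image would stay below the saturation threshold. The scaling model and threshold are assumptions, not the patent's formula.

```python
import numpy as np

def select_level_starting_low(short_frame: np.ndarray, low_level: int, high_level: int,
                              max_saturated_fraction: float = 0.02) -> int:
    # Predict the brighter image by scaling the low-level short frame linearly.
    predicted = short_frame.astype(np.float32) * (high_level / low_level)
    predicted_saturated = np.count_nonzero(predicted >= 255) / predicted.size
    # Use the higher level only if the extrapolation does not saturate.
    return high_level if predicted_saturated <= max_saturated_fraction else low_level
```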
- the sub-sampled image is analyzed to detect specific patterns in the sub-sampled image and to exclude the image areas where the patterns occur from histogram evaluation. For example, a reflection from the long edge can be detected, and the area of the image where this reflection occurs can be masked out from histogram calculations.
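A sketch of that masking step under the same assumptions: flag rows of the sub-sampled image that are almost entirely near-white (e.g., a specular reflection along a window edge) and leave them out of the histogram used for level selection. The row-based pattern test is only one hypothetical way to detect such a reflection.

```python
import numpy as np

def histogram_excluding_reflections(subsampled: np.ndarray,
                                    bright_value: int = 250,
                                    row_fraction: float = 0.8) -> np.ndarray:
    # Treat rows that are mostly near-white as a reflection and mask them out.
    bright = subsampled >= bright_value
    reflection_rows = bright.mean(axis=1) > row_fraction
    mask = np.ones(subsampled.shape, dtype=bool)
    mask[reflection_rows, :] = False
    return np.bincount(subsampled[mask].ravel(), minlength=256)
```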
- the short frame data is processed to select the second illumination level from N predetermined illumination levels.
- the integer N is less than four.
- the method 200 can include transmitting from the imaging sensor to the controller in sequence a first short frame data, a first regular frame data, a second short frame data, and a second regular frame data.
- the first short frame data can be used for determining the illumination level for capturing the first regular frame data
- the second short frame data can be used for determining the illumination level for capturing the second regular frame data.
- the method 200 can include transmitting from the imaging sensor to the controller in sequence a first short frame data, a first regular frame data, a second short frame data, a second regular frame data, a third short frame data, and a third regular frame data.
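The resulting stream from the sensor is an interleaved sequence (short, regular, short, regular, ...) in which each short frame determines the illumination level for the regular frame that immediately follows it. A minimal, dependency-injected sketch of that sequencing, with the capture and selection helpers assumed rather than taken from the patent:

```python
from typing import Any, Callable, List

def run_interleaved(num_pairs: int,
                    capture_short: Callable[[], Any],
                    choose_level: Callable[[Any], int],
                    capture_regular: Callable[[int], Any]) -> List[Any]:
    regular_frames = []
    for _ in range(num_pairs):
        short_frame = capture_short()                   # n-th short frame
        level = choose_level(short_frame)               # level derived from it
        regular_frames.append(capture_regular(level))   # n-th regular frame
    return regular_frames
```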
- an element preceded by "comprises . . . a", "has . . . a", "includes . . . a", or "contains . . . a" does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises, has, includes, or contains the element.
- the terms “a” and “an” are defined as one or more unless explicitly stated otherwise herein.
- the terms “substantially”, “essentially”, “approximately”, “about” or any other version thereof, are defined as being close to as understood by one of ordinary skill in the art, and in one non-limiting embodiment the term is defined to be within 10%, in another embodiment within 5%, in another embodiment within 1% and in another embodiment within 0.5%.
- the term “coupled” as used herein is defined as connected, although not necessarily directly and not necessarily mechanically.
- a device or structure that is “configured” in a certain way is configured in at least that way, but may also be configured in ways that are not listed.
- some embodiments may be comprised of one or more generic or specialized processors (or "processing devices") such as microprocessors, digital signal processors, customized processors and field programmable gate arrays (FPGAs), and unique stored program instructions (including both software and firmware) that control the one or more processors to implement, in conjunction with certain non-processor circuits, some, most, or all of the functions of the method and/or apparatus described herein.
- an embodiment can be implemented as a computer-readable storage medium having computer readable code stored thereon for programming a computer (e.g., comprising a processor) to perform a method as described and claimed herein.
- Examples of such computer-readable storage mediums include, but are not limited to, a hard disk, a CD-ROM, an optical storage device, a magnetic storage device, a ROM (Read Only Memory), a PROM (Programmable Read Only Memory), an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read Only Memory) and a Flash memory.
Landscapes
- Physics & Mathematics (AREA)
- Electromagnetism (AREA)
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Toxicology (AREA)
- Artificial Intelligence (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Input (AREA)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/690,229 US9038903B2 (en) | 2012-11-30 | 2012-11-30 | Method and apparatus for controlling illumination |
Publications (2)
Publication Number | Publication Date |
---|---|
US20140151452A1 US20140151452A1 (en) | 2014-06-05 |
US9038903B2 true US9038903B2 (en) | 2015-05-26 |
Family
ID=50824483
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/690,229 Active US9038903B2 (en) | 2012-11-30 | 2012-11-30 | Method and apparatus for controlling illumination |
Country Status (1)
Country | Link |
---|---|
US (1) | US9038903B2 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10331926B1 (en) * | 2017-12-18 | 2019-06-25 | Symbol Technologies, Llc | Bi-optic barcode reader |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4564287A (en) | 1981-06-11 | 1986-01-14 | Canon Kabushiki Kaisha | Image formation apparatus including means for detecting and controlling image formation condition |
US4970556A (en) | 1987-06-30 | 1990-11-13 | Kabushiki Kaisha Toshiba | Auto-illuminating controller and image forming apparatus using the same |
US5061960A (en) | 1987-07-02 | 1991-10-29 | Minolta Camera Kabushiki Kaisha | Preexposure light control system for a photocopier |
US5107300A (en) | 1983-05-06 | 1992-04-21 | Canon Kabushiki Kaisha | Image forming apparatus including means for controlling the amount of light exposure |
US5608547A (en) | 1993-04-22 | 1997-03-04 | Minolta Camera Kabushiki Kaisha | Image forming apparatus having illumination direction altered for every plurality of readout operations with respect to one original |
US20100140356A1 (en) * | 2003-05-12 | 2010-06-10 | Hand Held Products, Inc. | Apparatus comprising image sensor |
US8196839B2 (en) | 2005-06-03 | 2012-06-12 | Hand Held Products, Inc. | Optical reader having reduced specular reflection read failures |
US8561907B2 (en) * | 2010-11-05 | 2013-10-22 | Sick Ag | Flicker-free illumination apparatus |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9857575B2 (en) | 2012-02-06 | 2018-01-02 | Cognex Corporation | System and method for expansion of field of view in a vision system |
US9892298B2 (en) | 2012-02-06 | 2018-02-13 | Cognex Corporation | System and method for expansion of field of view in a vision system |
US10445544B2 (en) | 2012-02-06 | 2019-10-15 | Cognex Corporation | System and method for expansion of field of view in a vision system |
US11966810B2 (en) | 2012-02-06 | 2024-04-23 | Cognex Corporation | System and method for expansion of field of view in a vision system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8622305B2 (en) | Efficient multi-image bar code reader | |
EP2414989B1 (en) | Exposure control for multi-imaging scanner | |
US7757955B2 (en) | Bar code reader having multiple cameras | |
US11783681B2 (en) | Bioptical barcode reader | |
US20110073652A1 (en) | Method and apparatus for intelligently controlling illumination patterns projected from barcode readers | |
US9058531B2 (en) | Imaging scanner having near field communication device | |
US8740086B2 (en) | Apparatus for and method of reading indicia by using non-readily visible illumination light | |
WO2010005787A1 (en) | Multi-imaging scanner for reading multiple images | |
US9298957B2 (en) | Detecting window deterioration on barcode scanning workstation | |
US8960551B2 (en) | Method of decoding barcode with imaging scanner having multiple object sensors | |
US8740084B2 (en) | Method and apparatus for projecting illumination patterns from barcode readers | |
US9038903B2 (en) | Method and apparatus for controlling illumination | |
US9152834B2 (en) | Image capture based on scanning resolution setting compared to determined scanning resolution relative to target distance in barcode reading | |
US9245425B2 (en) | Produce lift apparatus | |
US9489554B2 (en) | Arrangement for and method of assessing efficiency of transactions involving products associated with electro-optically readable targets | |
US20130141584A1 (en) | Apparatus for and method of triggering electro-optical reading only when a target to be read is in a selected zone in a point-of-transaction workstation | |
US9483669B2 (en) | Barcode imaging workstation having sequentially activated object sensors | |
US20140144987A1 (en) | Bi-optical barcode scanning workstation with stitched sapphire windows | |
US9639720B2 (en) | System and method of automatically avoiding signal interference between product proximity subsystems that emit signals through mutually facing presentation windows of different workstations | |
US9495564B2 (en) | Arrangement for and method of assessing a cause of poor electro-optical reading performance by displaying an image of a symbol that was poorly read | |
US20240256805A1 (en) | Method to limit decode volume | |
US8511559B2 (en) | Apparatus for and method of reading targets by image captured by processing captured target images in a batch or free-running mode of operation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SYMBOL TECHNOLOGIES, INC., NEW YORK Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MADEJ, DARIUSZ J.;GOREN, DAVID P.;JOSEPH, EUGENE B.;REEL/FRAME:029381/0881 Effective date: 20121130 |
|
AS | Assignment |
Owner name: MORGAN STANLEY SENIOR FUNDING, INC. AS THE COLLATERAL AGENT, MARYLAND Free format text: SECURITY AGREEMENT;ASSIGNORS:ZIH CORP.;LASER BAND, LLC;ZEBRA ENTERPRISE SOLUTIONS CORP.;AND OTHERS;REEL/FRAME:034114/0270 Effective date: 20141027 Owner name: MORGAN STANLEY SENIOR FUNDING, INC. AS THE COLLATE Free format text: SECURITY AGREEMENT;ASSIGNORS:ZIH CORP.;LASER BAND, LLC;ZEBRA ENTERPRISE SOLUTIONS CORP.;AND OTHERS;REEL/FRAME:034114/0270 Effective date: 20141027 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: SYMBOL TECHNOLOGIES, LLC, NEW YORK Free format text: CHANGE OF NAME;ASSIGNOR:SYMBOL TECHNOLOGIES, INC.;REEL/FRAME:036083/0640 Effective date: 20150410 |
|
AS | Assignment |
Owner name: SYMBOL TECHNOLOGIES, INC., NEW YORK Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:MORGAN STANLEY SENIOR FUNDING, INC.;REEL/FRAME:036371/0738 Effective date: 20150721 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |