US20130146667A1 - Apparatus for and method of distinguishing among successive products entering a point-of-transaction workstation by detecting their exit therefrom - Google Patents
- Publication number
- US20130146667A1 (Application US13/315,626)
- Authority
- US
- United States
- Prior art keywords
- successive
- window
- product
- zone
- light
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K7/00—Methods or arrangements for sensing record carriers, e.g. for reading patterns
- G06K7/10—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
- G06K7/10544—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum
- G06K7/10792—Special measures in relation to the object to be scanned
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K7/00—Methods or arrangements for sensing record carriers, e.g. for reading patterns
- G06K7/10—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
- G06K7/10544—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum
- G06K7/10554—Moving beam scanning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06K—GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
- G06K7/00—Methods or arrangements for sensing record carriers, e.g. for reading patterns
- G06K7/10—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
- G06K7/10544—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum
- G06K7/10821—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum further details of bar or optical code scanning devices
- G06K7/1096—Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation by scanning of the records by radiation in the optical part of the electromagnetic spectrum further details of bar or optical code scanning devices the scanner having more than one scanning window, e.g. two substantially orthogonally placed scanning windows for integration into a check-out counter of a super-market
Definitions
- The present disclosure relates generally to an apparatus for, and a method of, distinguishing among successive products entering a point-of-transaction workstation and bearing, or associated with, targets to be electro-optically read, by detecting the exit of each product from the workstation and, more particularly, to preventing accidental double-reading of the same target and to accelerating intentional reading of successive identical targets.
- Point-of-transaction workstations employing laser-based readers and/or imager-based readers have been used in many venues, such as supermarkets, department stores and other kinds of retail settings, as well as libraries and parcel deliveries and other kinds of public settings, as well as factories, warehouses and other kinds of industrial settings, for many years.
- Such workstations were often configured either as stand-mounted scanners each resting on a countertop and having a presentation window; or as vertical slot scanners each resting on, or built into, the countertop and having a generally vertically arranged, upright presentation window; or as flat-bed or horizontal slot scanners each resting on, or built into, the countertop and having a generally horizontally arranged presentation window; or as bi-optical, dual window scanners each resting on, or built into, the countertop and having both a generally horizontal presentation window supported by a generally horizontal platform and a generally vertically arranged, upright presentation window supported by a generally upright tower.
- Such workstations were often operated to electro-optically read a plurality of symbol targets, such as one-dimensional symbols, particularly Universal Product Code (UPC) bar code symbols, truncated symbols, stacked symbols, and two-dimensional symbols, as well as non-symbol targets, such as driver's licenses, receipts, signatures, etc., the targets being associated with, or borne by, objects or products to be processed by, e.g., purchased at, the workstations.
- A user, such as an operator or a customer, slid or swiped a product associated with, or bearing, the target in a moving direction across and past a respective presentation window in a swipe mode, or momentarily presented, and steadily momentarily held, the target associated with, or borne by, the product to an approximate central region of the respective presentation window in a presentation mode.
- The products could be moved relative to the respective window in various directions, for example, from right-to-left, or left-to-right, and/or in-and-out, or out-and-in, and/or high-to-low, or low-to-high, or any combination of such directions, or could be positioned either in contact with, or held at a working distance away from, either window during such movement or presentation.
- The choice depended on the type of the workstation, or on the user's preference, or on the layout of the venue, or on the type of the product and target.
- Return light returning from the target in the laser-based reader and/or in the imager-based reader was detected to generate an electrical signal indicative of the target.
- The electrical signal was then processed and, when the target was a symbol, decoded and read, thereby identifying the product.
- Each imager included a one- or two-dimensional, solid-state, charge coupled device (CCD) array, or a complementary metal oxide semiconductor (CMOS) array, of image sensors (also known as pixels), and typically had an associated illuminator or illumination system to illuminate the target with illumination light over an illumination field.
- CMOS complementary metal oxide semiconductor
- Each imager also had an imaging lens assembly for capturing return illumination light reflected and/or scattered from the target, and for projecting the captured return light onto the sensor array.
- Each imager preferably operated at a frame rate of multiple frames per second, e.g., sixty frames per second.
- Each field of view, or each subfield, was preferably individually illuminated, and overlapped, by a respective illumination field and extended through the windows over regions of the product.
- Each imager included either a global or a rolling shutter to help prevent image blur, especially when the targets passed through the scan volume at high speed, e.g., on the order of 100 inches per second.
- The illumination light was not emitted at all times, but was emitted in response to detection of return infrared (IR) light by an IR-based proximity system that included an IR emitter operative for emitting IR light into an IR emission field, and an IR sensor for sensing the return IR light within an IR detection field.
- A product entering the IR emission field reflected and/or scattered at least a portion of the emitted IR light incident on the product to the IR sensor. Detection of this return IR light by the IR sensor determined that the product had indeed entered the workstation, thereby triggering the illumination system and the reading of the target.
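The wake-up behavior described above can be sketched in a few lines. This is a hypothetical illustration only; the function names and the threshold value are assumptions, not values taken from the patent.

```python
# Hypothetical sketch of the IR-based wake-up logic described above:
# the illumination system stays off until the IR sensor reports enough
# return IR light to indicate that a product has entered the emission
# field. The threshold value is an assumed, normalized figure.

IR_TRIGGER_THRESHOLD = 0.25  # assumed normalized sensor reading


def product_detected(ir_sensor_reading: float) -> bool:
    """True when the return IR light indicates a product in the emission field."""
    return ir_sensor_reading >= IR_TRIGGER_THRESHOLD


def illumination_command(ir_sensor_reading: float) -> str:
    """Energize the illuminators only while a product is detected."""
    return "energize" if product_detected(ir_sensor_reading) else "off"
```

With only ambient IR reaching the sensor, the reading stays low and the illuminators remain off; a product reflecting the emitted IR raises the reading and triggers reading of the target.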
- FIG. 1 is a perspective view of a dual window, bi-optical, point-of-transaction workstation for imaging and reading successive targets on successive products passing through the workstation;
- FIG. 2 is a part-sectional view of the workstation of FIG. 1 diagrammatically depicting certain component systems inside the workstation;
- FIG. 3 is a part-sectional, overhead view of the workstation of FIG. 1 depicting a proximity system in accordance with this invention for detecting entry of the products into, and exit of the products from, the workstation;
- FIG. 4 is a graph of an analog signal waveform generated by the proximity system of FIG. 3 when a product is passed through the workstation in a swipe mode;
- FIG. 5 is a graph of an analog signal waveform generated by the proximity system of FIG. 3 when a product is passed through the workstation in a presentation mode.
- An apparatus or workstation, in accordance with one feature of this invention, is operative for processing successive products associated with successive targets to be electro-optically read.
- The apparatus includes a housing, a window supported by the housing, and a reading system supported by the housing and operative for reading the successive targets with return light from the targets passing through the window.
- The apparatus also includes a proximity system supported by the housing and operative for detecting entry of each successive product associated with a respective successive target into a zone outside the window, and for detecting exit of each successive product associated with the respective successive target from the zone.
- A controller is operatively connected to the reading system and the proximity system. The controller processes the return light from the respective successive target in response to detection of the entry of each successive product into the zone, and distinguishes among the successive products in response to detection of the exit of each successive product from the zone.
- The reading system includes a solid-state imager having an array of image sensors looking at a field of view that extends through the window to each target to be imaged, and an energizable illumination system for illuminating the field of view with illumination light over an illumination field.
- The controller is operative for energizing the illumination system in response to the detection of the entry of each successive product into the zone, and for processing return illumination light returned from the respective successive target and captured in the field of view by the imager.
- Laser-based reading systems could also be employed.
- When configured as a vertical slot scanner, for example, the housing supports the window as a single upright window in an upright plane. When configured as a flat-bed or horizontal scanner, for example, the housing supports the window as a single generally horizontal window in a generally horizontal plane.
- When preferably configured as a bi-optical scanner, the housing has an upright tower that supports the window in an upright plane, and also has a generally horizontal platform that supports an additional window in a generally horizontal plane that intersects the upright plane.
- The proximity system is light-based and includes an infrared (IR) emitter for emitting IR light into an IR emission field, and an IR sensor for sensing return IR light within an IR detection field that, preferably, but not necessarily, intersects the IR emission field in the zone.
- The IR sensor is operative for detecting the entry of each successive product into the zone, and for detecting the exit of each successive product from the zone by sensing a change in magnitude over time of the return IR light returned by each successive product.
- The controller determines whether each change in magnitude over time of the return IR light is indicative of a swipe mode, as described above, in which a product is swiped across the window, or a presentation mode, as also described above, in which a product is presented and momentarily held at the window.
- An analog output signal of the IR sensor has a slope that is indicative of this change in magnitude over time of the return IR light.
- The controller determines whether each change in magnitude over time, i.e., the slope, of the return IR light of the product exiting the zone exceeds a predetermined threshold or slope for each mode.
- Reference numeral 10 in FIG. 1 generally identifies a dual window, bi-optical, point-of-transaction workstation typically used by retailers to process transactions involving the purchase of products 22 bearing, or associated with, identifying targets 24 or indicia, such as the UPC symbol described above.
- The workstation 10 includes a housing 20 having a generally horizontal, preferably rectangular, window 12 located in a generally horizontal plane and supported by a horizontal housing portion or platform 14, which may be of different sizes, and a vertical or generally vertical (referred to as “vertical” or “upright” hereinafter) window 16 that is located in a generally upright plane that intersects the generally horizontal plane and that is supported by a raised housing portion or tower 18.
- The upright plane may lie in a vertical plane, or be slightly rearwardly or forwardly inclined relative to the vertical plane.
- The upright, preferably rectangular, window 16 is preferably recessed within its housing portion 18 to resist scratching.
- The products are passed by a user 26 (see FIGS. 3-4), i.e., an operator or a customer, through a scan volume, which occupies the space at and above the horizontal window 12, and also occupies the space at and in front of the upright window 16.
- The generally horizontal window 12 measures about four inches in width by about six inches in length.
- The generally upright window 16 measures about six inches in width by about eight inches in length.
- The platform 14 may be long, e.g., on the order of twelve inches as measured in a direction away from and perpendicular to the upright window 16, or short, e.g., on the order of eight inches as measured in a direction away from and perpendicular to the upright window 16.
- The workstation 10 includes one or more cameras or solid-state imagers 30 (two shown schematically in FIG. 2), each having a sensor array, preferably a one- or two-dimensional, charge coupled device (CCD) array, or a complementary metal oxide semiconductor (CMOS) array, of image sensors (also known as pixels), preferably of megapixel size, e.g., 1280 pixels wide × 960 pixels high, with an imaging field of view diagrammatically shown by arrows and looking out through the windows 12, 16.
- The imaging field of each imager 30 measures about 15 degrees by 30 degrees.
- Each imager 30 includes, or is associated with, an illuminator or illumination system 32, mounted at the workstation, for illuminating the target 24 with illumination light over an illumination field that overlaps the respective imaging field.
- Each illuminator 32 preferably includes one or more light sources, e.g., surface-mounted, light emitting diodes (LEDs), located at each imager 30 to uniformly illuminate the target 24 .
- Each imager 30 includes an imaging lens system for capturing return illumination light reflected and/or scattered from the target 24 , and for projecting the captured return light onto the respective sensor array.
- Each imager 30 preferably has a shutter, typically a global shutter, that exposes each imager for an exposure time, preferably pre-set for the maximum anticipated exposure time needed to capture the target 24 at the maximum working distance away from each window.
- The maximum exposure time can be set to a value between 400 and 750 microseconds.
- Each imager 30 preferably operates at a frame rate of sixty frames per second, each frame lasting about 16.67 milliseconds.
- The shutter ensures that the captured images will not be disturbed by motion of the target 24 relative to the window(s) 12, 16 during the exposure time.
- A rolling or a mechanical shutter could also be employed.
- The target 24 and the product 22 can be presented or swiped at speeds up to around 100 inches per second across any part of either window.
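The timing figures quoted above can be sanity-checked with a little arithmetic. The sketch below uses only the numbers stated in this description (60 frames per second, a 400-750 microsecond exposure range, and a 100 inch-per-second swipe speed); the variable names are illustrative.

```python
# Worked check of the exposure and frame-rate figures quoted above:
# a global shutter freezes the target as long as the smear accumulated
# during one exposure stays small relative to the target's features.

FRAME_RATE_HZ = 60.0
MAX_EXPOSURE_S = 750e-6           # upper end of the 400-750 microsecond range
MAX_SWIPE_SPEED_IN_PER_S = 100.0  # fastest swipe quoted in the description

frame_period_ms = 1000.0 / FRAME_RATE_HZ             # each frame ~16.67 ms
blur_in = MAX_SWIPE_SPEED_IN_PER_S * MAX_EXPOSURE_S  # smear during one exposure

# At 60 fps each frame lasts about 16.67 milliseconds, matching the text:
assert abs(frame_period_ms - 16.67) < 0.01
# Even at the fastest swipe, the target moves only 0.075 inch while the
# shutter is open, which is why the captured image is not badly blurred:
assert abs(blur_in - 0.075) < 1e-9
```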
- The user 26 processes a product 22 bearing a target 24 thereon past the windows 12, 16 by swiping the product, e.g., in the direction of arrow X in FIG. 1, across a respective window in the abovementioned swipe mode, or by presenting and momentarily holding the product 22 at the respective window in the abovementioned presentation mode.
- The target 24 may be located on any of the top, bottom, right, left, front, and rear sides of the product 22, and at least one, if not more, of the imagers 30 will capture the illumination light reflected, scattered, or otherwise returning from the target 24 through one or both windows.
- The imagers 30 are preferably looking through the windows at around 45° so that they can each see a side of the product 22 that is generally perpendicular to, as well as generally parallel to, a respective window.
- FIG. 2 also schematically depicts that the imagers 30 and their associated illuminators 32 are operatively connected to a programmed microprocessor or controller 44 operative for controlling the operation of these and other components.
- The controller 44 is the same as the one used for processing the captured target images, and for decoding the return light scattered from the target when the target is a symbol.
- The controller 44 sends successive command signals to the illuminators 32 in response to detection of a product 22 in a predetermined zone in the workstation, as described in detail below, to pulse the LEDs for a short time period of 100 microseconds or less, and successively energizes the imagers 30 to collect light from a target 24 only during said time period, also known as the exposure time period.
- By acquiring a target image during this brief time period, the image of the target is not excessively blurred even in the presence of relative motion between the imagers and the target.
- Although the workstation 10 is illustrated in FIG. 2 as having two imagers 30, one for each window 12, 16, other configurations are within the scope of this invention.
- Multiple imagers can be provided for each window, and an optical system comprised of multiple folding mirrors can be configured to establish multiple intersecting imaging fields of view looking out through each window.
- The optical system may include optical splitters each operative for splitting the imaging field of view of at least one of the imagers into a plurality of imaging subfields of view, each additional imaging subfield serving to replace an additional imager. These imaging subfields also intersect and look out through each window.
- Each imaging field or subfield is illuminated in response to detection of the product 22 in the predetermined zone in the workstation by a proximity system.
- The controller 44 energizes the illumination system 32 in response to detection of the product 22 in the predetermined zone, and processes return illumination light returned from the target 24 and captured in the imaging field or subfield by the imager 30.
- The proximity system is preferably light-based and includes an infrared (IR) emitter 50, preferably comprised of one or more light emitting diodes (LEDs), for emitting IR light into an IR emission field bounded by side boundary edges 50A, 50B, and an IR sensor 54 for sensing return IR light within an IR detection field bounded by side boundary edges 54A, 54B.
- The emitted IR light has its maximum intensity along an IR emission axis centrally located within the IR emission field.
- The IR sensor has its maximum sensitivity to return IR light along an IR detection axis centrally located within the IR detection field.
- The IR axes are preferably, but not necessarily, inclined and cross over and intersect one another directly in front of the upright window 16.
- The IR detection field intersects the IR emission field in a common area of intersection (shown by a quadrilateral area highlighted by hatched lines in FIG. 3 and having corners A, B, C, D) to define the aforementioned predetermined zone directly in front of the upright window 16.
- The predetermined zone is also directly above, and is generally coextensive in area with, the generally horizontal window 12.
- The intersecting IR emission and detection fields above the horizontal window 12 and/or the platform 14 within the workstation 10 reduce false triggering of the proximity system, not only by the user 26 outside the workstation 10, but also by items or parts of the user 26 inside the workstation 10 but not in the predetermined zone A, B, C, D, e.g., directly overlying the generally horizontal window 12. As shown in FIG. 3, no part of the user is in the predetermined zone.
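The predetermined zone is modeled in FIG. 3 as the convex quadrilateral with corners A, B, C, D where the two fields overlap. One hypothetical way to decide whether a detected reflection falls inside that zone is a standard convex-polygon containment test; the coordinates below are invented for illustration and merely approximate a zone coextensive with a four-by-six-inch horizontal window.

```python
# Hypothetical geometric sketch: a point lies inside a convex quadrilateral
# when it is on the same side of every edge, which a cross-product sign
# test captures. All coordinates here are illustrative assumptions.

def inside_convex_quad(point, quad):
    """True if point lies inside (or on the boundary of) a convex quad
    whose corners are given in counter-clockwise order."""
    px, py = point
    sign = 0
    for i in range(4):
        x1, y1 = quad[i]
        x2, y2 = quad[(i + 1) % 4]
        # Cross product of edge vector and corner-to-point vector:
        cross = (x2 - x1) * (py - y1) - (y2 - y1) * (px - x1)
        if cross != 0:
            if sign == 0:
                sign = 1 if cross > 0 else -1
            elif (cross > 0) != (sign > 0):
                return False  # point is on the wrong side of this edge
    return True


# Assumed zone roughly coextensive with a 4 x 6 inch horizontal window:
ZONE = [(0.0, 0.0), (6.0, 0.0), (6.0, 4.0), (0.0, 4.0)]

assert inside_convex_quad((3.0, 2.0), ZONE)      # product over the window
assert not inside_convex_quad((8.0, 2.0), ZONE)  # user's hand outside zone
```

Restricting triggering to points inside A, B, C, D is what lets the workstation ignore reflections from the user or other items outside the zone.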
- The IR sensor 54 is operative not only for detecting the entry of each successive product 22 into the zone, but also for detecting the exit of each successive product 22 from the zone. By detecting each such exit, successive products 22 are distinguished, and the above-described accidental double-reading of the same target 24 is reliably prevented. Also, the above-described intentional reading of successive identical targets 24 is accelerated without the prior art delay of any fixed timeout period, thereby improving the overall throughput of the point-of-transaction workstation 10.
- The IR sensor 54 senses a change in magnitude over time, i.e., the slope, of the return IR light returned by each successive product 22. This is best shown in FIG. 4, where the magnitude of the output analog signal produced by the IR sensor 54 as a function of time is depicted as waveform S for the swipe mode, and in FIG. 5, where the magnitude of the output analog signal produced by the IR sensor 54 as a function of time is depicted as waveform P for the presentation mode. For reference purposes, in each of FIGS. 4 and 5:
- waveform E depicts a constant, low magnitude, analog return signal output by the IR sensor 54 when no product 22 is present in the zone
- waveform F depicts a constant, high magnitude, output analog return signal output by the IR sensor 54 when the product 22 is constantly present in the zone.
- The leading portion of the waveform S between times t1 and t2, with a positive slope, indicates the entry of the product 22 into the zone.
- The middle portion of the waveform S between times t2 and t3, with a substantially zero slope, indicates the presence of the product 22 in the zone.
- The trailing portion of the waveform S between times t3 and t4, with a negative slope, indicates the exit of the product 22 from the zone.
- The controller 44 is operative for measuring the trailing portion of the waveform S to determine the exit time and the exit speed of the product. For example, if the trailing portion of the waveform S lasts less than one second, or preferably, less than 200 milliseconds, then this relatively quick exit time indicates that the product is being swiped through the workstation.
- The leading portion of the waveform P between times t5 and t6, with a positive slope, indicates the entry of the product 22 into the zone.
- The middle portion of the waveform P between times t6 and t7, with a substantially zero slope, indicates the presence of the product 22 in the zone.
- The trailing portion of the waveform P between times t7 and t8, with a negative slope, indicates the exit of the product 22 from the zone.
- The controller 44 is operative for measuring the trailing portion of the waveform P to determine the exit time and the exit speed of the product. For example, if the trailing portion of the waveform P lasts more than 500 milliseconds, or preferably, more than one second, then this relatively slower exit time indicates that the product is being presented at the workstation.
- The controller not only measures the exit times to indicate whether the product is being swiped or presented, but also measures the extent of the slope of the trailing portions of the waveforms P, S. To reliably distinguish between successive products, the controller determines whether each change in magnitude over time of the return IR light of the product exiting the zone exceeds a predetermined threshold or reference slope for each mode. Thus, the measured slope must exceed a predetermined reference value before the controller confirms that a product has indeed exited the workstation. Slope measurement performed by the controller is well known to the skilled artisan.
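The exit-detection logic described above can be sketched in code. This is a minimal, hypothetical illustration: the sample rate, the slope reference, and all names are assumptions; only the 200 millisecond and one second exit-time figures come from the description.

```python
# Hypothetical sketch of the controller's trailing-edge analysis: the
# falling portion of the IR return waveform (t3-t4 in FIG. 4, t7-t8 in
# FIG. 5) is timed and its slope compared against a reference before an
# exit is confirmed. Sample rate and slope threshold are assumed values.

SAMPLE_PERIOD_S = 0.01    # assumed 100 Hz sampling of the IR sensor output
SWIPE_MAX_EXIT_S = 0.2    # trailing edge under 200 ms indicates a swipe
PRESENT_MIN_EXIT_S = 1.0  # trailing edge over one second indicates presentation
MIN_EXIT_SLOPE = 0.1      # assumed reference slope (signal units per second)


def analyze_trailing_edge(samples):
    """Classify the exit from the falling portion of the sensor waveform.

    `samples` holds successive analog magnitudes covering only the
    trailing (falling) portion of the waveform. Returns (mode, exited),
    where mode is 'swipe', 'presentation', or 'indeterminate', and
    exited reports whether the slope reference confirming a true exit
    was exceeded.
    """
    duration = (len(samples) - 1) * SAMPLE_PERIOD_S
    slope = (samples[0] - samples[-1]) / duration  # magnitude of fall rate

    exited = slope > MIN_EXIT_SLOPE
    if duration < SWIPE_MAX_EXIT_S:
        return "swipe", exited
    if duration > PRESENT_MIN_EXIT_S:
        return "presentation", exited
    return "indeterminate", exited


# A sharp 100 ms drop from full to no return signal reads as a swipe exit:
falling = [1.0 - 0.1 * i for i in range(11)]
assert analyze_trailing_edge(falling) == ("swipe", True)
```

A slow decay lasting more than a second would instead return the presentation classification, and a decay too shallow to exceed the reference slope would not be confirmed as an exit at all.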
- The workstation need not be the illustrated bi-optical workstation having dual presentation windows as described above, but could also be configured either as a stand-mounted scanner having a single presentation window, or as a vertical slot scanner having a single upright presentation window, or as a flat-bed or horizontal slot scanner having a single, generally horizontally arranged presentation window.
- The target need not be the illustrated one-dimensional symbol, but could also be a truncated symbol, a stacked symbol, or a two-dimensional symbol, as well as a non-symbol target, such as a driver's license, a receipt, a signature, etc.
- The workstation need not have the illustrated two imagers, but could have more or fewer than two imagers, and one or all of the imagers could be replaced by laser-based readers.
- The proximity system need not be light-based. Also, the IR emission axis and the IR detection axis need not cross over as illustrated, but could also be positioned in mutual parallelism and extend perpendicular to the upright window.
- A method of processing successive products associated with successive targets to be electro-optically read is performed by supporting a window on a housing, reading the successive targets with return light passing through the window, detecting entry of each successive product associated with a respective successive target into a zone outside the window, detecting exit of each successive product associated with the respective successive target from the zone, processing the return light from the respective successive target in response to detection of the entry of each successive product into the zone, and distinguishing among the successive products in response to detection of the exit of each successive product from the zone.
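The method steps recited above can be sketched as a small state machine. This is a hypothetical illustration only; the class, state, and event names are assumed. The key point it shows is that each product is counted as distinct when its exit from the zone is detected, which both blocks double-reads of one product and lets an identical target on the next product be read immediately, with no fixed timeout in between.

```python
# Hypothetical state-machine sketch of the claimed method: entry into the
# zone starts processing of return light; exit from the zone is the event
# that distinguishes one successive product from the next.

from enum import Enum


class Zone(Enum):
    EMPTY = 0     # no product in the zone outside the window
    OCCUPIED = 1  # a product has entered; its target may be read


class Workstation:
    def __init__(self):
        self.state = Zone.EMPTY
        self.products_distinguished = 0
        self.reading_enabled = False

    def on_entry(self):
        """Entry detected: start processing return light from the target."""
        self.state = Zone.OCCUPIED
        self.reading_enabled = True

    def on_exit(self):
        """Exit detected: this event separates one product from the next."""
        if self.state is Zone.OCCUPIED:
            self.products_distinguished += 1
        self.state = Zone.EMPTY
        self.reading_enabled = False


# Two successive products bearing identical targets are distinguished by
# their two detected exits, with no fixed timeout between reads:
ws = Workstation()
for _ in range(2):
    ws.on_entry()
    ws.on_exit()
assert ws.products_distinguished == 2
```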
- An element preceded by “comprises . . . a,” “has . . . a,” “includes . . . a,” or “contains . . . a” does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises, has, includes, or contains the element.
- The terms “a” and “an” are defined as one or more unless explicitly stated otherwise herein.
- The terms “substantially,” “essentially,” “approximately,” “about,” or any other version thereof, are defined as being close to as understood by one of ordinary skill in the art, and in one non-limiting embodiment the term is defined to be within 10%, in another embodiment within 5%, in another embodiment within 1%, and in another embodiment within 0.5%.
- The term “coupled” as used herein is defined as connected, although not necessarily directly and not necessarily mechanically.
- A device or structure that is “configured” in a certain way is configured in at least that way, but may also be configured in ways that are not listed.
- Some embodiments may be comprised of one or more generic or specialized processors or “processing devices” such as microprocessors, digital signal processors, customized processors, and field programmable gate arrays (FPGAs), and unique stored program instructions (including both software and firmware) that control the one or more processors to implement, in conjunction with certain non-processor circuits, some, most, or all of the functions of the method and/or apparatus described herein.
- An embodiment can be implemented as a computer-readable storage medium having computer readable code stored thereon for programming a computer (e.g., comprising a processor) to perform a method as described and claimed herein.
- Examples of such computer-readable storage mediums include, but are not limited to, a hard disk, a CD-ROM, an optical storage device, a magnetic storage device, a ROM (Read Only Memory), a PROM (Programmable Read Only Memory), an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read Only Memory) and a Flash memory.
Description
- Early imager-based, bi-optical workstations required about ten to twelve, or at least six, solid-state imagers having multiple, intersecting fields of view extending through the windows in order to provide a full coverage scan volume in front of the windows to enable reliable reading of the target that could be positioned anywhere on all six sides of a three-dimensional product. To bring the cost of the imager-based workstation down to an acceptable level, it was known to reduce the need for the aforementioned six to twelve imagers down to two imagers, or even one imager, by splitting the field of view of at least one of the imagers into a plurality of subfields of view, each additional subfield serving to replace an additional imager. These subfields also intersected each other in order to again provide a full coverage scan volume that extended above the horizontal window and in front of the upright window as close as possible to a countertop, and sufficiently high above the countertop, and as wide as possible across the width of the countertop. The scan volume projected into space away from the windows and grew in size rapidly in order to cover targets on products that were positioned not only on the windows, but also at working distances therefrom.
- Each imager included a one- or two-dimensional, solid-state, charge coupled device (CCD) array, or a complementary metal oxide semiconductor (CMOS) array, of image sensors (also known as pixels), and typically had an associated illuminator or illumination system to illuminate the target with illumination light over an illumination field. Each imager also had an imaging lens assembly for capturing return illumination light reflected and/or scattered from the target, and for projecting the captured return light onto the sensor array. Each imager preferably operated at a frame rate of multiple frames per second, e.g., sixty frames per second. Each field of view, or each subfield, was preferably individually illuminated, and overlapped, by a respective illumination field and extended through the windows over regions of the product. Each imager included either a global or a rolling shutter to help prevent image blur, especially when the targets passed through the scan volume at high speed, e.g., on the order of 100 inches per second.
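As a worked example of why the short, gated exposure matters, the distance a target moves while the shutter is open follows directly from the figures quoted above; the function name below is illustrative and not part of any disclosure:

```python
def motion_during_exposure(speed_in_per_s=100.0, exposure_us=750.0):
    """Distance, in inches, a target moves while the shutter is open.

    With the figures quoted above (targets moving on the order of
    100 inches per second, exposures of a few hundred microseconds),
    the smear stays well under a tenth of an inch, which is why gated
    exposures limit image blur.
    """
    return speed_in_per_s * (exposure_us / 1_000_000.0)
```

At 100 inches per second, a 750-microsecond exposure lets the target move only about 0.075 inch, whereas a full frame at sixty frames per second lasts roughly 16.67 milliseconds, during which the target would move well over an inch.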
- Preferably, to reduce power consumption, to prolong operational lifetime, and to reduce bright light annoyance to operators and customers, the illumination light was not emitted at all times, but was emitted in response to detection of return infrared (IR) light by an IR-based proximity system that included an IR emitter operative for emitting IR light into an IR emission field, and an IR sensor for sensing the return IR light within an IR detection field. A product entering the IR emission field reflected and/or scattered at least a portion of the emitted IR light incident on the product to the IR sensor. Detection of this return IR light by the IR sensor determined that the product had indeed entered the workstation, thereby triggering the illumination system and the reading of the target.
- Although generally satisfactory for their intended purpose, one issue with such known presentation-type workstations involved accidental double-reading of a target, i.e., where a single target was read and reported to a host more than once. If a product with a target was kept in the workstation even after the target was initially read, or if the product was removed too slowly from the workstation, or if the product was moved to different locations in the workstation, a single target might be read and reported several times and, thus, the number of product transactions which were intended to be reported would not correspond to the number of product transactions which were actually reported.
- Accidental double-reading was not an issue for handheld scanners that employed a trigger to distinguish among targets. However, the above-described known presentation-type workstations, which were free-running, i.e., triggerless, typically relied on a predetermined timeout period, such as one or two seconds in duration, to prevent accidental double-reads of the same target from being reported to the host. Thus, if a second read, that was identical to a first read, occurred before the timeout period elapsed, then the second read was discarded or not reported, because it was assumed that an accidental double-read had occurred. However, if the second read occurred after the timeout period elapsed, then the second read was reported, because it was assumed that the second read was intentional.
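The prior-art timeout behavior described above can be sketched roughly as follows; the class and method names are illustrative, and the one-second timeout is merely one of the durations mentioned in the text:

```python
import time

class TimeoutDeduplicator:
    """Prior-art sketch: a repeat decode of the same target is
    reported to the host only after a fixed timeout has elapsed."""

    def __init__(self, timeout_s=1.0, clock=time.monotonic):
        self.timeout_s = timeout_s
        self.clock = clock              # injectable for testing
        self.last_target = None
        self.last_time = float("-inf")

    def report(self, target):
        """Return True if this decode should be reported to the host."""
        now = self.clock()
        if target == self.last_target and now - self.last_time < self.timeout_s:
            self.last_time = now        # still dwelling: assumed accidental
            return False
        self.last_target, self.last_time = target, now
        return True
```

Note how an intentional second scan of an identical product must wait out the full timeout before it is reported, which is precisely the throughput penalty this disclosure seeks to eliminate.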
- Nevertheless, there were circumstances where it was desired to intentionally read successive identical targets. For example, sometimes multiple identical products from the same supplier, e.g., multiple cans of the same soda brand, were successively swiped or presented at the workstation. Although the known workstations were effective in preventing accidental double-reads by relying on the above-described timeout period, this reliance slowed down any such intentional reading of successive identical targets, because the user had to wait for the timeout period to elapse before the next product could be swiped or presented. As a consequence, the overall throughput of the workstation was decreased, and the performance of the workstation was sometimes regarded as sluggish and, in some cases, was regarded as a defect.
- Accordingly, it would be desirable to reliably prevent accidental double-reading of the same target, and to accelerate the intentional reading of successive identical targets without experiencing and being delayed by any fixed timeout period, and to increase the overall throughput of the point-of-transaction workstation.
- The accompanying figures, where like reference numerals refer to identical or functionally similar elements throughout the separate views, together with the detailed description below, are incorporated in and form part of the specification, and serve to further illustrate embodiments of concepts that include the claimed invention, and explain various principles and advantages of those embodiments.
- FIG. 1 is a perspective view of a dual window, bi-optical, point-of-transaction workstation for imaging and reading successive targets on successive products passing through the workstation;
- FIG. 2 is a part-sectional view of the workstation of FIG. 1 diagrammatically depicting certain component systems inside the workstation;
- FIG. 3 is a part-sectional, overhead view of the workstation of FIG. 1 depicting a proximity system in accordance with this invention for detecting entry of the products into, and exit of the products from, the workstation;
- FIG. 4 is a graph of an analog signal waveform generated by the proximity system of FIG. 3 when a product is passed through the workstation in a swipe mode; and
- FIG. 5 is a graph of an analog signal waveform generated by the proximity system of FIG. 3 when a product is passed through the workstation in a presentation mode.
- Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of embodiments of the present invention.
- The apparatus and method components have been represented where appropriate by conventional symbols in the drawings, showing only those specific details that are pertinent to understanding the embodiments of the present invention so as not to obscure the disclosure with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
- An apparatus or workstation, in accordance with one feature of this invention, is operative for processing successive products associated with successive targets to be electro-optically read. The apparatus includes a housing, a window supported by the housing, and a reading system supported by the housing and operative for reading the successive targets with return light from the targets passing through the window. The apparatus also includes a proximity system supported by the housing and operative for detecting entry of each successive product associated with a respective successive target into a zone outside the window, and for detecting exit of each successive product associated with the respective successive target from the zone. A controller is operatively connected to the reading system and the proximity system. The controller processes the return light from the respective successive target in response to detection of the entry of each successive product into the zone, and distinguishes among the successive products in response to detection of the exit of each successive product from the zone.
- Different reading systems are contemplated. Preferably, the reading system includes a solid-state imager having an array of image sensors looking at a field of view that extends through the window to each target to be imaged, and an energizable illumination system for illuminating the field of view with illumination light over an illumination field. The controller is operative for energizing the illumination system in response to the detection of the entry of each successive product into the zone, and for processing return illumination light returned from the respective successive target and captured in the field of view by the imager. Laser-based reading systems could also be employed.
- Different housing configurations are contemplated. When configured as a vertical slot scanner, for example, the housing supports the window as a single upright window in an upright plane. When configured as a flat-bed or horizontal scanner, for example, the housing supports the window as a single generally horizontal window in a generally horizontal plane. When preferably configured as a bi-optical scanner, the housing has an upright tower that supports the window in an upright plane, and also has a generally horizontal platform that supports an additional window in a generally horizontal plane that intersects the upright plane.
- Different proximity systems are contemplated. Preferably, the proximity system is light-based and includes an infrared (IR) emitter for emitting IR light into an IR emission field, and an IR sensor for sensing return IR light within an IR detection field that, preferably, but not necessarily, intersects the IR emission field in the zone. The IR sensor is operative for detecting the entry of each successive product into the zone, and for detecting the exit of each successive product from the zone by sensing a change in magnitude over time of the return IR light returned by each successive product. The controller determines whether each change in magnitude over time of the return IR light is indicative of a swipe mode, as described above, in which a product is swiped across the window, or a presentation mode, as also described above, in which a product is presented and momentarily held at the window. An analog output signal of the IR sensor has a slope that is indicative of this change in magnitude over time of the return IR light. The controller determines whether each change in magnitude over time, i.e., the slope, of the return IR light of the product exiting the zone exceeds a predetermined threshold or slope for each mode.
- By distinguishing among the successive products, accidental double-reading of the same target is reliably prevented, and the intentional reading of successive identical targets without experiencing and being delayed by any fixed timeout period is accelerated. The overall throughput of the point-of-transaction workstation is thus improved.
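A minimal sketch of the controller behavior just summarized, assuming the proximity system delivers discrete entry and exit events; all identifiers are illustrative, not the patent's implementation:

```python
class ExitTriggeredController:
    """Each entry/exit pair brackets one product, so a target is
    reported at most once per product, and an identical target on the
    next product is reported immediately, with no fixed timeout."""

    def __init__(self):
        self.in_zone = False
        self.seen = set()      # targets already reported for this product
        self.host_log = []     # decodes forwarded to the host

    def on_entry(self):        # product entered the zone: new session
        self.in_zone = True
        self.seen.clear()

    def on_exit(self):         # product exited the zone: session closed
        self.in_zone = False

    def on_decode(self, target):
        # Report each target at most once per product session.
        if self.in_zone and target not in self.seen:
            self.seen.add(target)
            self.host_log.append(target)
```

Two identical cans swiped back-to-back thus produce two reports, while repeated decodes of a single dwelling product produce only one.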
- Turning now to the drawings,
reference numeral 10 in FIG. 1 generally identifies a dual window, bi-optical, point-of-transaction workstation typically used by retailers to process transactions involving the purchase of products 22 bearing, or associated with, identifying targets 24 or indicia, such as the UPC symbol described above. The workstation 10 includes a housing 20 having a generally horizontal, preferably rectangular, window 12 located in a generally horizontal plane and supported by a horizontal housing portion or platform 14 of different sizes, and a vertical or generally vertical (referred to as "vertical" or "upright" hereinafter) window 16 that is located in a generally upright plane that intersects the generally horizontal plane and that is supported by a raised housing portion or tower 18. The upright plane may lie in a vertical plane, or be slightly rearwardly or forwardly inclined relative to the vertical plane. The upright, preferably rectangular, window 16 is preferably recessed within its housing portion 18 to resist scratching. The products are passed by a user 26 (see FIGS. 3-4), i.e., an operator or a customer, through a scan volume, which occupies the space at and above the horizontal window 12, and also occupies the space at and in front of the upright window 16. - By way of numerical example, the generally
horizontal window 12 measures about four inches in width by about six inches in length, while the generally upright window 16 measures about six inches in width by about eight inches in length. The platform 14 may be long, e.g., on the order of twelve inches as measured in a direction away from and perpendicular to the upright window 16, or short, e.g., on the order of eight inches as measured in a direction away from and perpendicular to the upright window 16. - The
workstation 10 includes one or more cameras or solid-state imagers 30 (two shown schematically in FIG. 2), each having a sensor array, preferably a one- or two-dimensional, charge coupled device (CCD) array, or a complementary metal oxide semiconductor (CMOS) array, of image sensors (also known as pixels), preferably of megapixel size, e.g., 1280 pixels wide×960 pixels high, with an imaging field of view diagrammatically shown by arrows and looking out through the windows 12, 16. The imaging field of view of each imager 30 measures about 15 degrees by 30 degrees. - Each
imager 30 includes, or is associated with, an illuminator or illumination system 32, mounted at the workstation, for illuminating the target 24 with illumination light over an illumination field that overlaps the respective imaging field. Each illuminator 32 preferably includes one or more light sources, e.g., surface-mounted, light emitting diodes (LEDs), located at each imager 30 to uniformly illuminate the target 24. Each imager 30 includes an imaging lens system for capturing return illumination light reflected and/or scattered from the target 24, and for projecting the captured return light onto the respective sensor array. - Each
imager 30 preferably has a shutter, typically a global shutter, that exposes each imager for an exposure time, preferably pre-set for the maximum anticipated exposure time needed to capture the target 24 at the maximum working distance away from each window. By way of example, the maximum exposure time can be set to a value between 400-750 microseconds. Each imager 30 preferably operates at a frame rate of sixty frames per second, each frame lasting about 16.67 milliseconds. The shutter ensures that the captured images will not be disturbed by motion of the target 24 relative to the window(s) 12, 16 during the exposure time. A rolling or a mechanical shutter could also be employed. The target 24 and the product 22 can be presented or swiped at speeds up to around 100 inches per second across any part of either window. - In use, the
user 26, such as an operator working at a supermarket checkout counter 28 (see FIGS. 3-4), processes a product 22 bearing a target 24 thereon, past the windows 12, 16, either by sliding or swiping the product, as shown in FIG. 1, across a respective window in the abovementioned swipe mode, or by presenting and momentarily holding the product 22 at the respective window in the abovementioned presentation mode. The target 24 may be located on any of the top, bottom, right, left, front and rear sides of the product 22, and at least one, if not more, of the imagers 30 will capture the illumination light reflected, scattered, or otherwise returning from the target 24 through one or both windows. The imagers 30 are preferably looking through the windows at around 45° so that they can each see a side of the product 22 that is generally perpendicular to, as well as generally parallel to, a respective window. -
FIG. 2 also schematically depicts that the imagers 30 and their associated illuminators 32 are operatively connected to a programmed microprocessor or controller 44 operative for controlling the operation of these and other components. Preferably, the controller 44 is the same as the one used for processing the captured target images, and for decoding the return light scattered from the target when the target is a symbol. - In operation, the
controller 44 sends successive command signals to the illuminators 32 in response to detection of a product 22 in a predetermined zone in the workstation, as described in detail below, to pulse the LEDs for a short time period of 100 microseconds or less, and successively energizes the imagers 30 to collect light from a target 24 only during said time period, also known as the exposure time period. By acquiring a target image during this brief time period, the image of the target is not excessively blurred even in the presence of relative motion between the imagers and the target. - Although the
workstation 10 is illustrated in FIG. 2 as having two imagers 30, one for each window 12, 16, more or fewer imagers could be employed, and the field of view of at least one of the imagers could be split into a plurality of subfields of view, as described above. - Each imaging field or subfield is illuminated in response to detection of the
product 22 in the predetermined zone in the workstation by a proximity system. The controller 44 energizes the illumination system 32 in response to detection of the product 22 in the predetermined zone, and processes return illumination light returned from the target 24 and captured in the imaging field or subfield by the imager 30. - A preferred embodiment of the proximity system is shown in
FIG. 3, wherein the imagers 30 and the illuminators 32 have been removed for clarity. The proximity system is preferably light-based and includes an infrared (IR) emitter 50, preferably comprised of one or more light emitting diodes (LEDs), for emitting IR light into an IR emission field bounded by side boundary edges 50A, 50B, and an IR sensor 54 for sensing return IR light within an IR detection field bounded by side boundary edges 54A, 54B. The emitted IR light has its maximum intensity along an IR emission axis centrally located within the IR emission field. The return IR light has its maximum sensitivity along an IR detection axis centrally located within the IR detection field. The IR axes are preferably, but not necessarily, inclined and cross over and intersect one another directly in front of the upright window 16. The IR detection field intersects the IR emission field in a common area of intersection (shown by a quadrilateral area highlighted by hatched lines in FIG. 3 and having corners A, B, C, D) to define the aforementioned predetermined zone directly in front of the upright window 16. In the illustrated bi-optical configuration, the predetermined zone is also directly above, and is generally coextensive in area with, the generally horizontal window 12. The intersecting IR emission and detection fields above the horizontal window 12 and/or the platform 14 within the workstation 10 reduce false triggering of the proximity system, not only by the user 26 outside the workstation 10, but also by items or parts of the user 26 inside the workstation 10, but not in the predetermined zone A, B, C, D, e.g., directly overlying the generally horizontal window 12. As shown in FIG. 3, no part of the user is in the predetermined zone. - The
IR sensor 54 is operative not only for detecting the entry of each successive product 22 into the zone, but also for detecting the exit of each successive product 22 from the zone. By detecting each such exit, successive products 22 are distinguished, and the above-described accidental double-reading of the same target 24 is reliably prevented. Also, the above-described intentional reading of successive identical targets 24 is accelerated without the prior art delay of any fixed timeout period, thereby improving the overall throughput of the point-of-transaction workstation 10. - In the preferred embodiment, the
IR sensor 54 senses a change in magnitude over time, i.e., the slope, of the return IR light returned by each successive product 22. This is best shown in FIG. 4, where the magnitude of the output analog signal produced by the IR sensor 54 as a function of time is depicted as waveform S for the swipe mode, and in FIG. 5, where the magnitude of the output analog signal produced by the IR sensor 54 as a function of time is depicted as waveform P for the presentation mode. For reference purposes, in each of FIGS. 4 and 5, waveform E depicts a constant, low magnitude, analog return signal output by the IR sensor 54 when no product 22 is present in the zone, and waveform F depicts a constant, high magnitude, analog return signal output by the IR sensor 54 when the product 22 is constantly present in the zone. - In
FIG. 4, the leading portion of the waveform S between times t1 and t2 with a positive slope indicates the entry of the product 22 into the zone, the middle portion of the waveform S between times t2 and t3 with a substantially zero slope indicates the presence of the product 22 in the zone, and the trailing portion of the waveform S between times t3 and t4 with a negative slope indicates the exit of the product 22 from the zone. The controller 44 is operative for measuring the trailing portion of the waveform S to determine the exit time and the exit speed of the product. For example, if the trailing portion of the waveform S lasts less than one second, or, preferably, less than 200 milliseconds, then this relatively quick exit time indicates that the product is being swiped through the workstation. - Analogously, in
FIG. 5, the leading portion of the waveform P between times t5 and t6 with a positive slope indicates the entry of the product 22 into the zone, the middle portion of the waveform P between times t6 and t7 with a substantially zero slope indicates the presence of the product 22 in the zone, and the trailing portion of the waveform P between times t7 and t8 with a negative slope indicates the exit of the product 22 from the zone. The controller 44 is operative for measuring the trailing portion of the waveform P to determine the exit time and the exit speed of the product. For example, if the trailing portion of the waveform P lasts more than 500 milliseconds, or, preferably, more than one second, then this relatively slower exit time indicates that the product is being presented at the workstation. - The controller not only measures the exit times to indicate whether the product is being swiped or presented, but also measures the extent of the slope of the trailing portions of the waveforms P, S. To reliably distinguish between successive products, the controller determines whether each change in magnitude over time of the return IR light of the product exiting the zone exceeds a predetermined threshold or reference slope for each mode. Thus, the measured slope must exceed a predetermined reference value before the controller confirms that a product has indeed exited the workstation. Slope measurement performed by the controller is well known to the skilled artisan.
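One way the trailing-edge test described above might be implemented on a digitized copy of the sensor output; the thresholds echo the figures in the text, while the function name and sampling interface are assumptions rather than the patent's implementation:

```python
def classify_exit(trailing_samples, dt_s, min_fall_rate, swipe_max_s=0.2):
    """Classify the trailing (falling) portion of the IR return signal.

    trailing_samples: signal magnitudes sampled uniformly from the
                      moment the fall begins (t3/t7) to the moment the
                      zone reads empty (t4/t8).
    dt_s:             sampling interval in seconds.
    min_fall_rate:    reference slope, in signal units per second, that
                      the fall must exceed before an exit is confirmed.
    Returns "swipe", "presentation", or None when no exit is confirmed.
    """
    duration_s = (len(trailing_samples) - 1) * dt_s
    # Average falling slope of the trailing edge (positive when falling).
    fall_rate = (trailing_samples[0] - trailing_samples[-1]) / duration_s
    if fall_rate < min_fall_rate:
        return None                     # slope too shallow: no exit yet
    return "swipe" if duration_s < swipe_max_s else "presentation"
```

A trailing edge lasting 100 milliseconds with a steep fall classifies as a swipe; the same fall stretched over a second classifies as a presentation; a shallow drift below the reference slope confirms no exit at all.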
- It will be understood that each of the elements described above, or two or more together, also may find a useful application in other types of constructions differing from the types described above. For example, the workstation need not be the illustrated bi-optical workstation having dual presentation windows as described above, but could also be configured either as a stand-mounted scanner having a single presentation window, or as a vertical slot scanner having a single upright presentation window, or as a flat-bed or horizontal slot scanner having a single, generally horizontally arranged presentation window. The target need not be the illustrated one-dimensional symbol, but could also be a truncated symbol, a stacked symbol, or a two-dimensional symbol, as well as a non-symbol target, such as a driver's license, a receipt, a signature, etc. The workstation need not have the illustrated two imagers, but could have more or fewer than two imagers, and one or all of the imagers could be replaced by laser-based readers. The proximity system need not be light-based. Also, the IR emission axis and the IR detection axis need not cross over as illustrated, but could also be positioned in mutual parallelism and extend perpendicularly of the upright window.
- In accordance with another feature of this invention, a method of processing successive products associated with successive targets to be electro-optically read, is performed by supporting a window on a housing, reading the successive targets with return light passing through the window, detecting entry of each successive product associated with a respective successive target into a zone outside the window, detecting exit of each successive product associated with the respective successive target from the zone, processing the return light from the respective successive target in response to detection of the entry of each successive product into the zone, and distinguishing among the successive products in response to detection of the exit of each successive product from the zone.
- In the foregoing specification, specific embodiments have been described. However, one of ordinary skill in the art appreciates that various modifications and changes can be made without departing from the scope of the invention as set forth in the claims below. Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of the present teachings.
- The benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as critical, required, or essential features or elements of any or all the claims. The invention is defined solely by the appended claims, including any amendments made during the pendency of this application and all equivalents of those claims as issued.
- Moreover, in this document, relational terms such as first and second, top and bottom, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms "comprises," "comprising," "has," "having," "includes," "including," "contains," "containing," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises, has, includes, contains a list of elements does not include only those elements, but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by "comprises . . . a," "has . . . a," "includes . . . a," or "contains . . . a," does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises, has, includes, or contains the element. The terms "a" and "an" are defined as one or more unless explicitly stated otherwise herein. The terms "substantially," "essentially," "approximately," "about," or any other version thereof, are defined as being close to as understood by one of ordinary skill in the art, and in one non-limiting embodiment the term is defined to be within 10%, in another embodiment within 5%, in another embodiment within 1%, and in another embodiment within 0.5%. The term "coupled" as used herein is defined as connected, although not necessarily directly and not necessarily mechanically. A device or structure that is "configured" in a certain way is configured in at least that way, but may also be configured in ways that are not listed.
- It will be appreciated that some embodiments may be comprised of one or more generic or specialized processors (or “processing devices”) such as microprocessors, digital signal processors, customized processors, and field programmable gate arrays (FPGAs), and unique stored program instructions (including both software and firmware) that control the one or more processors to implement, in conjunction with certain non-processor circuits, some, most, or all of the functions of the method and/or apparatus described herein. Alternatively, some or all functions could be implemented by a state machine that has no stored program instructions, or in one or more application specific integrated circuits (ASICs), in which each function or some combinations of certain of the functions are implemented as custom logic. Of course, a combination of the two approaches could be used.
- Moreover, an embodiment can be implemented as a computer-readable storage medium having computer readable code stored thereon for programming a computer (e.g., comprising a processor) to perform a method as described and claimed herein. Examples of such computer-readable storage mediums include, but are not limited to, a hard disk, a CD-ROM, an optical storage device, a magnetic storage device, a ROM (Read Only Memory), a PROM (Programmable Read Only Memory), an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read Only Memory) and a Flash memory. Further, it is expected that one of ordinary skill, notwithstanding possibly significant effort and many design choices motivated by, for example, available time, current technology, and economic considerations, when guided by the concepts and principles disclosed herein, will be readily capable of generating such software instructions and programs and ICs with minimal experimentation.
- The Abstract of the Disclosure is provided to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in various embodiments for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus, the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separately claimed subject matter.
Claims (19)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/315,626 (granted as US8453933B1) | 2011-12-09 | 2011-12-09 | Apparatus for and method of distinguishing among successive products entering a point-of-transaction workstation by detecting their exit therefrom |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/315,626 (granted as US8453933B1) | 2011-12-09 | 2011-12-09 | Apparatus for and method of distinguishing among successive products entering a point-of-transaction workstation by detecting their exit therefrom |
Publications (2)
Publication Number | Publication Date |
---|---|
US8453933B1 US8453933B1 (en) | 2013-06-04 |
US20130146667A1 true US20130146667A1 (en) | 2013-06-13 |
Family
ID=48484191
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/315,626 (Active; granted as US8453933B1) | Apparatus for and method of distinguishing among successive products entering a point-of-transaction workstation by detecting their exit therefrom | 2011-12-09 | 2011-12-09 |
Country Status (1)
Country | Link |
---|---|
US (1) | US8453933B1 (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
USD708183S1 (en) * | 2012-06-08 | 2014-07-01 | Datalogic ADC, Inc. | Data reader for checkout station |
USD723560S1 (en) * | 2013-07-03 | 2015-03-03 | Hand Held Products, Inc. | Scanner |
US9946907B2 (en) | 2014-05-20 | 2018-04-17 | Symbol Technologies, LLC | Compact imaging module and imaging reader for, and method of, detecting objects associated with targets to be read by image capture |
US10970506B2 (en) * | 2018-12-21 | 2021-04-06 | Datalogic Usa, Inc. | Bioptic data reader with wide-angle field-of-view |
US11966809B2 (en) * | 2022-05-31 | 2024-04-23 | Zebra Technologies Corporation | Synchronizing rolling shutter and global shutter sensors |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3072465B2 (en) * | 1995-10-25 | 2000-07-31 | Fujitsu Limited (富士通株式会社) | Barcode reader |
US7389923B2 (en) * | 2005-12-29 | 2008-06-24 | Ncr Corporation | Methods and apparatus for tracking the direction of a moving item by a bar code scanner |
US8033472B2 (en) | 2007-06-28 | 2011-10-11 | Symbol Technologies, Inc. | Electro-optical imaging reader having plural solid-state imagers with shutters to prevent concurrent exposure |
- 2011-12-09: US application US13/315,626 filed; granted as US8453933B1 (status: Active)
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
USD730901S1 (en) * | 2014-06-24 | 2015-06-02 | Hand Held Products, Inc. | In-counter barcode scanner |
USD757009S1 (en) | 2014-06-24 | 2016-05-24 | Hand Held Products, Inc. | In-counter barcode scanner |
Also Published As
Publication number | Publication date |
---|---|
US8453933B1 (en) | 2013-06-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7762464B2 (en) | | Control of specular reflection in imaging reader |
US11783681B2 (en) | | Bioptical barcode reader |
US8453933B1 (en) | | Apparatus for and method of distinguishing among successive products entering a point-of-transaction workstation by detecting their exit therefrom |
US9524406B2 (en) | | Apparatus for and method of minimizing specular reflections in imaging field of view of workstation that reads targets by image capture |
CA2882883C (en) | | Checkout system for and method of preventing a customer-operated accessory reader facing a bagging area from imaging targets on products passed through a clerk-operated workstation to the bagging area |
US8389945B1 (en) | | Object detecting system in imaging-based barcode readers |
AU2015298237B2 (en) | | Detecting window deterioration on barcode scanning workstation |
US8960551B2 (en) | | Method of decoding barcode with imaging scanner having multiple object sensors |
AU2017366450B2 (en) | | System and workstation for, and method of, deterring theft of a product associated with a target to be electro-optically read |
US20140306009A1 (en) | | Arrangement for and method of cleaning a platter of a product checkout workstation |
US9740902B2 (en) | | Apparatus for and method of triggering electro-optical reading only when a target to be read is in a selected zone in a point-of-transaction workstation |
US9038903B2 (en) | | Method and apparatus for controlling illumination |
US20140224554A1 (en) | | Produce lift apparatus |
US9483669B2 (en) | | Barcode imaging workstation having sequentially activated object sensors |
US9639720B2 (en) | | System and method of automatically avoiding signal interference between product proximity subsystems that emit signals through mutually facing presentation windows of different workstations |
US8511559B2 (en) | | Apparatus for and method of reading targets by image captured by processing captured target images in a batch or free-running mode of operation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| | AS | Assignment | Owner name: SYMBOL TECHNOLOGIES, INC., NEW YORK. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PANG, ROBERT;REEL/FRAME:027355/0792. Effective date: 20111209 |
| | STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| | AS | Assignment | Owner name: MORGAN STANLEY SENIOR FUNDING, INC. AS THE COLLATERAL AGENT, MARYLAND. Free format text: SECURITY AGREEMENT;ASSIGNORS:ZIH CORP.;LASER BAND, LLC;ZEBRA ENTERPRISE SOLUTIONS CORP.;AND OTHERS;REEL/FRAME:034114/0270. Effective date: 20141027 |
| | AS | Assignment | Owner name: SYMBOL TECHNOLOGIES, LLC, NEW YORK. Free format text: CHANGE OF NAME;ASSIGNOR:SYMBOL TECHNOLOGIES, INC.;REEL/FRAME:036083/0640. Effective date: 20150410 |
| | AS | Assignment | Owner name: SYMBOL TECHNOLOGIES, INC., NEW YORK. Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:MORGAN STANLEY SENIOR FUNDING, INC.;REEL/FRAME:036371/0738. Effective date: 20150721 |
| | FPAY | Fee payment | Year of fee payment: 4 |
| | MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 8 |