US10410030B2 - System and method for recognizing deformed linear barcodes from a stream of varied focus video frames - Google Patents

Info

Publication number
US10410030B2
US10410030B2 (application US14/833,184)
Authority
US
United States
Prior art keywords
barcode
processor
implemented
module
difference
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US14/833,184
Other versions
US20150363628A1 (en)
Inventor
Jeffrey Roger Powers
Vikas Muppiddi Reddy
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
eBay Inc
Original Assignee
eBay Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by eBay Inc filed Critical eBay Inc
Priority to US14/833,184 priority Critical patent/US10410030B2/en
Publication of US20150363628A1 publication Critical patent/US20150363628A1/en
Assigned to EBAY, INC. reassignment EBAY, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: POWERS, JEFFREY ROGER, REDDY, VIKAS MUPPIDDI
Priority to US16/538,304 priority patent/US11055505B2/en
Application granted granted Critical
Publication of US10410030B2 publication Critical patent/US10410030B2/en
Active legal-status Critical Current
Adjusted expiration legal-status Critical

Classifications

    • G06K7/1473: Methods for optical code recognition including quality enhancement steps with error correction
    • G06K5/00: Methods or arrangements for verifying the correctness of markings on a record carrier; column detection devices
    • G06K7/10821: Further details of bar or optical code scanning devices using radiation in the optical part of the electromagnetic spectrum
    • G06K7/14: Sensing record carriers using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1413: Methods for optical code recognition specifically adapted to 1D bar codes
    • G06K7/1443: Methods for optical code recognition including locating the code in an image
    • G06K7/1447: Methods for optical code recognition including extracting optical codes from an image or text carrying the code
    • G06K7/1478: Methods for optical code recognition including quality enhancement steps adapting the threshold for pixels in a CMOS or CCD pixel sensor for black-and-white recognition

Definitions

  • This application relates generally to the field of barcode processing, and more specifically, to a system and method for estimating and classifying barcodes using heuristic and statistical measures.
  • Barcodes are widely used to identify and track goods and documents, among other things.
  • A commonly used barcode is the linear barcode, a machine-readable representation of data encoded in the widths and spacing of parallel lines.
  • Different linear barcode formats have emerged over time, with Universal Product Code (UPC) and European Article Number (EAN) being two commonly used barcode formats.
  • A commonly used UPC code is the UPC-A barcode.
  • A UPC-A barcode is characterized by twelve decimal digits, preceded by a start delimiter and followed by an end delimiter. In the middle of the twelve-digit barcode, between the sixth and seventh digits, is a middle delimiter.
  • The start, middle, and end delimiters function to separate the twelve digits into two groups of six digits.
  • The start and end delimiters are characterized by a “101” bit pattern, which may be visualized as two vertical black guard bars with a white space between the bars.
  • The middle delimiter is characterized by a “01010” bit pattern, which may be visualized as a white space, a black vertical guard bar, a white space, a black vertical guard bar, and a white space. Between the start and middle delimiters are six “left” digits, and between the middle and end delimiters are six “right” digits. Each digit is represented by a seven-bit code, with a binary ‘1’ value represented by a vertical black bar and a binary ‘0’ value represented by a vertical white space. The seven-bit code for each digit is represented visually as two bars and two spaces, with each of the bars and spaces having varying width depending on the digit. To distinguish between “left” digits and “right” digits, a “left” digit seven-bit code is the inverse of a “right” digit seven-bit code. The following table illustrates the seven-bit code values for each barcode digit.

    Digit | Left code | Right code
      0   |  0001101  |  1110010
      1   |  0011001  |  1100110
      2   |  0010011  |  1101100
      3   |  0111101  |  1000010
      4   |  0100011  |  1011100
      5   |  0110001  |  1001110
      6   |  0101111  |  1010000
      7   |  0111011  |  1000100
      8   |  0110111  |  1001000
      9   |  0001011  |  1110100
  • The first, or leftmost, digit is a prefix digit, while the last, or rightmost, digit is an error-correcting check digit.
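The encoding rules above can be sketched in Python. The seven-bit table holds the standard UPC-A “left” (odd-parity) codes, with “right” codes formed as their bitwise complements; the function names are illustrative, not taken from the patent.

```python
# Standard UPC-A "left" seven-bit codes; "right" codes are their complements.
L_CODES = {
    "0": "0001101", "1": "0011001", "2": "0010011", "3": "0111101",
    "4": "0100011", "5": "0110001", "6": "0101111", "7": "0111011",
    "8": "0110111", "9": "0001011",
}

def complement(bits: str) -> str:
    """Bitwise complement, turning a "left" code into its "right" form."""
    return "".join("1" if b == "0" else "0" for b in bits)

def encode_upc_a(digits: str) -> str:
    """Encode twelve UPC-A digits into the full 95-bit pattern:
    start delimiter, six left digits, middle delimiter, six right digits,
    end delimiter."""
    assert len(digits) == 12 and digits.isdigit()
    left = "".join(L_CODES[d] for d in digits[:6])
    right = "".join(complement(L_CODES[d]) for d in digits[6:])
    return "101" + left + "01010" + right + "101"

def upc_a_check_digit(first_eleven: str) -> str:
    """Compute the error-correcting check digit: odd positions weighted 3,
    even positions weighted 1, rounded up to the next multiple of ten."""
    odd = sum(int(d) for d in first_eleven[0::2])   # positions 1, 3, ..., 11
    even = sum(int(d) for d in first_eleven[1::2])  # positions 2, 4, ..., 10
    return str((10 - (3 * odd + even) % 10) % 10)

print(upc_a_check_digit("03600029145"))   # -> 2
print(len(encode_upc_a("036000291452")))  # -> 95
```

The 95-bit total follows directly from the structure above: two 3-bit guards, a 5-bit middle delimiter, and twelve 7-bit digit codes.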
  • A commonly used EAN barcode is the EAN-13 barcode.
  • The EAN-13 barcode is a superset of the UPC-A barcode.
  • The EAN-13 barcode uses thirteen digits broken up into four components: a prefix, which may be two or three digits long; a company number, which may be four to six digits long; an item reference number, which may be two to six digits long; and a single checksum digit.
  • EAN-13 barcodes differ from UPC-A barcodes in that the data digits are split into three groups: a first digit, a first group of six digits, and a second group of six digits.
  • The first group of six digits is encoded according to one of two encoding schemes, one with even parity and one with odd parity, while the second group of six digits is encoded as bitwise complements of the digits of the first group under the odd-parity encoding scheme.
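The EAN-13 checksum mentioned above is a standard calculation, sketched here for reference; this is the general EAN-13 rule (weights alternate 1 and 3 from the leftmost digit), not code from the patent.

```python
def ean13_check_digit(first_twelve: str) -> str:
    """Compute the EAN-13 check digit: weight the first twelve digits
    1, 3, 1, 3, ... left to right, then take the amount needed to round
    the weighted sum up to the next multiple of ten."""
    assert len(first_twelve) == 12 and first_twelve.isdigit()
    total = sum(int(d) * (1 if i % 2 == 0 else 3)
                for i, d in enumerate(first_twelve))
    return str((10 - total % 10) % 10)

print(ean13_check_digit("400638133393"))  # -> 1
```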
  • Barcodes are commonly read using fixed or mounted barcode scanners, such as those found as part of a point-of-sale system, or using commercial laser-based, handheld barcode readers, which are often attached to a point-of-sale system.
  • With the proliferation of handheld and mobile devices, there is a growing interest in leveraging the ability of these devices to read barcodes.
  • FIG. 1 is a network diagram depicting a network system, according to one embodiment, having a client-server architecture configured for exchanging data over a network.
  • FIG. 2 is a block diagram illustrating an example embodiment of various client modules that may be used to execute the processes described herein.
  • FIG. 3 is a diagram of an example embodiment of a client barcode reading application.
  • FIG. 4 is a diagram of an example embodiment of a client barcode reading application attempting to read a deformed linear barcode.
  • FIG. 5 is an image of an example embodiment of a deformed linear barcode.
  • FIG. 6 is a diagram of an example embodiment of a client barcode reading application attempting to read a deformed linear barcode.
  • FIG. 7 is a diagram of an example embodiment of a deformed linear barcode.
  • FIG. 8 is a flow chart of an example method for recognizing a barcode.
  • FIG. 9 is a flow chart of an example method for recognizing a barcode.
  • FIG. 10 shows a diagrammatic representation of a machine in the example form of a computer system within which a set of instructions may be executed to cause the machine to perform any one or more of the methodologies discussed herein.
  • A system and method to recognize a deformed linear barcode from varied-focus video frames are disclosed.
  • A processor-implemented camera module may receive a stream of video frames. The stream of video frames may include a barcode contained therein.
  • A processor-implemented barcode blur estimate module may estimate an amount of defocus blur in a video frame of the stream of video frames.
  • The processor-implemented barcode blur estimate module may estimate an identity of the barcode using a backward extraction technique.
  • A processor-implemented barcode localization module may identify a region of the video frame containing the barcode.
  • A processor-implemented geometric modeler module may generate a geometric model of the barcode that includes an identified barcode deformity.
  • A processor-implemented barcode decoder module may decode the barcode using the estimated amount of defocus blur, the estimated barcode identity, and the geometric model of the barcode.
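Taken together, these modules form a per-frame pipeline. The sketch below is a hypothetical rendering of that flow, with the module implementations passed in as plain functions; none of the names or the threshold value come from the patent.

```python
def process_frame_stream(frames, estimate_blur, localize, model, decode,
                         blur_threshold=2.0):
    """Run the module pipeline over a stream of video frames until one
    frame yields a successful decode."""
    for frame in frames:
        blur = estimate_blur(frame)        # barcode blur estimate module
        if blur > blur_threshold:
            continue                       # frame too blurry; try the next one
        region = localize(frame)           # barcode localization module
        if region is None:
            continue
        geo = model(frame, region)         # geometric modeler module
        digits = decode(frame, region, geo, blur)  # barcode decoder module
        if digits is not None:
            return digits
    return None

# Trivial stand-in modules: the first frame is rejected as too blurry,
# the second decodes successfully.
result = process_frame_stream(
    ["blurry", "sharp"],
    estimate_blur=lambda f: 5.0 if f == "blurry" else 1.0,
    localize=lambda f: (0, 0, 10, 10),
    model=lambda f, r: {},
    decode=lambda f, r, g, b: "036000291452",
)
print(result)  # -> 036000291452
```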
  • FIG. 1 is a network diagram depicting a network system 100 , according to one embodiment, having a client-server architecture configured for exchanging data over a network.
  • The network system 100 may include a commerce or publication/publisher system 102 where clients may communicate and exchange data within the network system 100.
  • The data may pertain to various functions (e.g., product lookups, product or price comparisons, and online item purchases) associated with the network system 100 and its users.
  • Although a client-server architecture is depicted as an example, other embodiments may include other network architectures, such as a peer-to-peer or distributed network environment.
  • A data exchange platform, in an example form of a network-based publisher 102, may provide server-side functionality, via a network 104 (e.g., the Internet), to one or more clients.
  • The one or more clients may include users that utilize the network system 100, and more specifically the network-based publisher 102, to exchange data over the network 104.
  • These transactions may include transmitting, receiving (communicating), and processing data to, from, and regarding content and users of the network system 100.
  • The data may include, but is not limited to, content and user data such as barcode-related data, product profiles, product reviews, product comparisons, price comparisons, product recommendations and identifiers, product and service listings associated with buyers and sellers, auction bids, and transaction data, among other things.
  • The data exchanges within the network system 100 may be dependent upon user-selected functions available through one or more client or user interfaces (UIs).
  • The UIs may be associated with a client machine, such as a client machine 106 using a web client 110.
  • The web client 110 may be in communication with the network-based publisher 102 via a web server 120.
  • The UIs may also be associated with a client machine 108 using a programmatic client 112, such as a client application.
  • The client machines 106, 108 may be associated with a buyer, a seller, a payment service provider, or a shipping service provider, each in communication with the network-based publisher 102 and, optionally, each other.
  • The buyers and sellers may be any one of individuals, merchants, or service providers, among other things.
  • The role of a user of client machines 106, 108 is immaterial to the discussion herein, and the foregoing examples are merely examples of the types of users who may operate client machines 106, 108.
  • Client machines 106, 108 may use the web client 110 or programmatic client 112 to read a barcode.
  • Client machines 106, 108 may be handheld or mobile devices.
  • Client machines 106, 108 may have camera functionality, implemented in example embodiments as a built-in camera or external camera. In an example embodiment, the built-in camera or external camera may have a fixed focus lens.
  • Client machines 106, 108 may capture and decode a barcode using web client 110 or programmatic client 112 (e.g., a client app).
  • Client machines 106, 108 may transmit decoded barcode information to the network-based publisher 102 to retrieve additional information concerning the decoded barcode.
  • An application program interface (API) server 118 and a web server 120 are coupled to, and provide programmatic and web interfaces respectively to, one or more application servers 122.
  • The application servers 122 host one or more publication or commerce application(s) 124.
  • The application servers 122 are, in turn, shown to be coupled to one or more database server(s) 126 that facilitate access to one or more database(s) 128.
  • The web server 120 and the API server 118 communicate and receive data, such as in the form of decoded barcode data, pertaining to products, listings, and transactions, among other things, via various user input tools.
  • The web server 120 may send and receive data to and from a barcode reading webpage on a browser application (e.g., web client 110) operating on a client machine (e.g., client machine 106).
  • The API server 118 may send and receive data to and from a barcode reading app (e.g., programmatic client 112) running on another client machine (e.g., client machine 108).
  • The commerce or publication application(s) 124 may provide a number of commerce and publisher functions and services (e.g., listing, product lookup, price comparison, payment, etc.) to users who access the network-based publisher 102.
  • The commerce and publication application(s) 124 may provide a number of services and functions to users for listing goods for sale, facilitating transactions, and reviewing or comparing products and prices of products.
  • Data pertaining to the services and functions provided by the commerce and publication application(s) 124 may be retrieved from database(s) 128 via database server 126.
  • FIG. 2 is a block diagram illustrating an example embodiment of various client modules that may be used to execute the processes described herein.
  • FIG. 2 will be described with reference to client machine 108 of FIG. 1 , although one of ordinary skill in the art will recognize that any client device may be used to implement the client modules discussed herein.
  • Client machine 108 may include various modules that perform various functions.
  • the modules may be implemented as software, hardware, firmware, or any combination thereof. Multiple functions may be performed in one or more modules as desired, and the embodiments described are merely examples.
  • the software may be executed, at least in part, on one or more of at least one computer processor, digital signal processor, Application Specific Integrated Circuit (ASIC), microprocessor, or other type of processor operating on a system, such as a personal computer (PC), set top box, personal digital assistant (PDA), smart phone, server, router, or other device capable of processing data, including network interconnection devices.
  • Client machine 108 may execute a barcode reading application 202 .
  • Barcode reading application 202 may include or call at least a camera module 204 , a barcode blur estimation module 206 , a barcode localization module 208 , a barcode geometric modeler module 210 , a barcode decoder module 212 , and a communication module 214 , each of which may reside within client machine 108 .
  • Camera module 204 may be an internal or external camera for use with the client machine 108 .
  • Many mobile devices, such as the Apple iPhone®, the Research In Motion BlackBerry®, and devices running the Google Android® operating system, include camera devices enabling users to capture images or video.
  • In an example embodiment, camera module 204 includes a fixed focus lens.
  • Camera module 204 may include a camera flash device, such as a light-emitting diode (LED) flash.
  • Camera module 204 may include software capable of implementing digital zoom for capturing images or video.
  • Camera module 204 may capture video frames in standard or high definition. The video frames may be provided as a stream of video frames to the barcode reading application 202 or any components thereof.
  • Barcode blur estimation module 206 determines when a client barcode reading application should attempt to read and decode a barcode.
  • Barcode reading application 202 may interface with camera module 204 to receive video frames from camera module 204 .
  • Client machine 108 may activate camera module 204, and video frames captured by camera module 204 may be provided to barcode blur estimation module 206.
  • Barcode reading application 202 may decode a barcode directly from one or more video frames captured or received by the camera module 204 . In other words, in an example embodiment, barcode reading application 202 does not need to capture an image or take a picture of a barcode in order to read and process the barcode.
  • In an example embodiment, barcode blur estimation module 206 evaluates a barcode contained within the stream of video frames to determine whether the reading environment is appropriate.
  • Barcode blur estimation module 206 may estimate the defocus blur and motion blur contained in the video frames. Defocus blur may arise from the use of a fixed focus camera to capture video frames containing a barcode that is close to the camera. If the amount of defocus blur is greater than a predetermined acceptable threshold amount of defocus blur, barcode reading application 202 will not attempt to read the barcode from the video frames. Once the amount of defocus blur is determined to be less than the predetermined threshold, barcode reading application 202 will attempt to decode the barcode. Thus, unlike other barcode reading applications, barcode reading application 202 will not expend unnecessary resources attempting to decode a barcode contained in a video frame suffering from defocus blurriness.
  • The barcode blur estimation module 206 estimates the defocus blur by computing a difference in pixel intensity across pixels in the video frames that correspond to the barcode.
  • The pixels may be selected from a horizontal cross-section of the barcode. The differences across these pixels are then summed.
  • In a blurred frame, the pixels may be smeared together, such that the summed differences between pixels are less than those of a perfectly sharp barcode image.
  • The difference sum may be used in an equation that predicts the radius of the defocus blur.
  • The equation used may be derived from testing performed over a large set of barcodes. The equation may produce an initial estimate of the defocus blur present in the video frames.
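The summed-difference measure can be sketched as follows. The linear mapping and its coefficients are placeholders for illustration only, since the patent's calibration equation is derived empirically from a large set of barcodes.

```python
def estimate_blur_radius(scanline, a=-0.05, b=8.0):
    """Estimate defocus blur from a horizontal cross-section of the barcode.

    Summing the absolute intensity differences between neighbouring pixels
    gives a sharpness measure: blur smears bars and spaces together, so the
    sum shrinks relative to a perfectly sharp barcode image. A calibration
    equation then maps the sum onto a blur radius; the linear form and the
    coefficients a, b here are made-up placeholders.
    """
    diff_sum = sum(abs(scanline[i + 1] - scanline[i])
                   for i in range(len(scanline) - 1))
    return max(0.0, a * diff_sum + b)

sharp = [0, 255, 0, 255, 0, 255, 0, 255]          # crisp bar/space transitions
blurry = [96, 128, 112, 128, 112, 128, 112, 128]  # smeared transitions
print(estimate_blur_radius(sharp) < estimate_blur_radius(blurry))  # -> True
```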
  • Known barcode decoding methods attempt to perform forward image analysis to decode the barcode, where a signal thought to contain a barcode is analyzed to identify the exact barcode displayed in the signal. This analysis attempts to identify peaks and valleys in the signal corresponding to pixel intensity, with a peak representing a black bar and a valley representing a white space. If the analysis is unable to determine the exact barcode contained in the signal, sharpening and other refinement of the signal are performed to obtain more distinct peaks and valleys. In other words, a forward image analysis starts with an unknown barcode and attempts to identify the exact barcode shown.
  • The forward image analysis approach to decoding a barcode can be inefficient and ineffective, as a failed attempt to decode the barcode will cause the forward image analysis approach to restart the analysis anew.
  • As a result, many attempts at decoding the barcode may occur, and even then the exact barcode may not be determined, due to, among other things, the presence of blur.
  • In contrast, the barcode blur estimation module 206 may use a backward extraction process to determine the identity of a barcode contained within one or more video frames. Using the determined guess or assumption of the barcode, the barcode blur estimation module 206 can narrow the possible identities of the barcode to a manageable number (e.g., 10 to 30 possible valid patterns) from the overall universe of possible barcodes. The barcode blur estimation module 206 may compare the assumed barcode to each of the possible valid barcode patterns to identify a match. Thus, the barcode blur estimation module 206 may formulate an assumption of the barcode and apply its assumption or expectation to narrow the possible universe of potential applicable barcodes. The calculation of the estimated defocus blur and the use of the backward extraction process may inform the barcode reading application 202 that a video frame of a sequence of video frames is in a condition suitable for reading.
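A minimal sketch of the backward extraction idea: rather than sharpening the observed signal, blur each candidate pattern by the estimated amount and keep the closest match. The box blur and squared-error distance here are illustrative stand-ins, not the patent's actual model.

```python
def box_blur(signal, radius):
    """Crude box blur standing in for defocus on a candidate bit pattern."""
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - radius), min(len(signal), i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def best_candidate(observed, candidates, blur_radius):
    """Backward extraction sketch: blur each candidate pattern by the
    estimated amount and keep the one closest to the observed signal,
    instead of trying to sharpen the observed signal itself."""
    def distance(cand):
        blurred = box_blur([int(b) for b in cand], blur_radius)
        return sum((o - p) ** 2 for o, p in zip(observed, blurred))
    return min(candidates, key=distance)

# The narrowed set of valid patterns (e.g., 10 to 30 candidates) would come
# from the module's assumption about the barcode; two are shown here.
observed = box_blur([0, 0, 0, 1, 1, 0, 1], 1)  # a blurred "0001101"
print(best_candidate(observed, ["0001101", "0111011"], 1))  # -> 0001101
```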
  • Barcode localization module 208 operates in conjunction with barcode blur estimation module 206 to focus a search for a barcode within an area of a video frame. By searching for and processing a portion of a video frame containing a barcode, barcode reading application 202 does not expend unnecessary resources or time in processing an entire video frame. Barcode localization module 208 may use the known format of certain barcodes to aid in determining an area of the video frame in which to focus.
  • Although barcode localization module 208 may be forced to operate in blurry conditions, it operates with the knowledge that barcode elements have a vertical correlation to each other, and that the vertical correlation between barcode elements (e.g., black vertical bars and white spaces) often survives or remains distinguishable despite the blurriness.
  • Barcode localization module 208 may sample horizontal scanlines at various points throughout the video frame.
  • For example, barcode localization module 208 may sample a horizontal scanline at one point in the video frame and compare the scanline to a different horizontal scanline taken at a nearby point (e.g., three pixels lower than the original scanline). If the two scanlines match and indicate the presence of a vertical correlation of barcode elements, barcode localization module 208 may assume that it has discovered the location of a barcode within the video frame.
  • Barcode localization module 208 may then use the known format of linear barcodes to further identify the region of the video frame where it believes the barcode resides.
  • For example, a UPC or EAN barcode may include start and end guard bars that delineate the beginning and end of the barcode.
  • The start and end guard bars are characterized as having the same pattern of bars and spaces, irrespective of the encoded content within the barcode.
  • The barcode localization module 208 may use the general location identified by the comparison of horizontal scanlines to search for known barcode identifying features, such as the start and end guard bars. Once identified, the locations of the start and end guard bars enable the barcode localization module 208 to mark the left and right bounds of the barcode.
  • Similarly, the barcode localization module 208 may mark the upper and lower bounds of the barcode. Barcode localization module 208 may continually refine the bounds of the barcode until it determines that the barcode has been identified, at which point a boundary box or boundary identifiers surrounding the barcode are locked in place.
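The scanline comparison can be sketched as follows, assuming a frame represented as rows of grayscale intensities; the offset and tolerance values are illustrative.

```python
def scanlines_correlate(frame, row, offset=3, tolerance=8):
    """Compare a horizontal scanline with one a few rows lower. If the pixel
    intensities match closely, the region shows the vertical correlation
    that barcode bars and spaces exhibit even under blur."""
    a, b = frame[row], frame[row + offset]
    return all(abs(x - y) <= tolerance for x, y in zip(a, b))

# A barcode region repeats the same bar/space pattern on every row:
barcode_rows = [[0, 0, 255, 255, 0, 255]] * 8
# Non-barcode texture varies from row to row:
noise_rows = [[10, 200, 30, 90, 150, 60], [200, 10, 90, 30, 60, 150]] * 4
print(scanlines_correlate(barcode_rows, 0))  # -> True
print(scanlines_correlate(noise_rows, 0))    # -> False
```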
  • Barcode geometric modeler module 210 may attempt to compensate for geometric deformities in the barcode contained within the one or more video frames by developing a geometric model for the barcode. Despite being localized, the barcode contained within a video frame may not be able to be decoded, or may not be able to be decoded efficiently. In example embodiments, the barcode may be curved, tilted, warped, or rotated, among other things. Each of these deformities may affect and hamper decoding of the barcode. For example, if a barcode is curved, the vertical black bars near the left and right ends of the barcode may appear to be skinnier than in reality due to the curvature of the barcode.
  • Barcode geometric modeler module 210 may plot points throughout the barcode to develop a curve model for the barcode that accounts for the barcode being curved.
  • In some cases, the vertical bars of the barcode may not be precisely vertical.
  • To detect such skew, two horizontal cross-sections of the barcode may be obtained.
  • A vertical bar that is shifted between the two horizontal cross-sections may indicate that the barcode is skewed.
  • The barcode geometric modeler module 210 may calculate an angle formed by the skewed vertical bar and a vertical line corresponding to a non-skewed vertical bar.
  • The geometric model of the barcode developed by the barcode geometric modeler module 210 may account for barcode skew, curve, tilt, warping, and lighting conditions, among other things.
  • The barcode geometric modeler module 210 may compare the geometric model to the barcode contained in the video frame to determine the accuracy of the model. Barcode geometric modeler module 210 may refine the geometric model over successive video frames or decoding iterations.
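The skew-angle calculation from two cross-sections can be sketched as follows; the bar positions would come from the localization step, and the function name is illustrative.

```python
import math

def estimate_skew_angle(bar_x_top, bar_x_bottom, row_gap):
    """Estimate barcode skew from the horizontal shift of the same vertical
    bar between two horizontal cross-sections taken `row_gap` rows apart.
    Returns the angle (in degrees) between the skewed bar and a true
    vertical line."""
    shift = bar_x_bottom - bar_x_top
    return math.degrees(math.atan2(shift, row_gap))

print(estimate_skew_angle(100, 100, 20))  # -> 0.0 (no skew)
print(estimate_skew_angle(100, 120, 20))  # shifted 20 px over 20 rows: ~45 degrees
```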
  • Barcode decoder module 212 may account for the deformities in the barcode when attempting to decode the barcode by relying on the known features of the barcode format to aid in decoding the barcode. For example, every UPC barcode will have a middle delimiter to separate a first group of six digits from a second group of six digits. As previously discussed, the middle delimiter has a constant value regardless of the content of the UPC barcode. The barcode decoder module 212 may begin the decoding of the barcode by identifying the middle delimiter. Once identified, the barcode decoder module 212 may decode the barcode elements to the left of the middle delimiter and the barcode elements to the right of the middle delimiter independently. The decoding of barcode digits may involve estimating the digits of the barcode based on the bars and spaces of the barcode.
  • While barcode decoder module 212 may first identify a middle delimiter of a UPC code when decoding a barcode, other types or formats of barcodes may be decoded differently. For example, other types of barcodes may not use a middle delimiter to separate a first group of barcode digits from a second group of barcode digits. For these and other types of barcodes, rather than concentrate on identifying a middle delimiter, barcode decoder module 212 may attempt to identify and decode an area of the barcode having a concentration of information or data.
  • Barcode reading application 202 may call or execute each of the barcode blur estimation module 206 , barcode localization module 208 , barcode geometric modeler module 210 , and barcode decoder module 212 for each attempted decoding.
  • the foregoing modules 206 , 208 , 210 , 212 may be executed simultaneously for each attempted decoding.
  • barcode reading application 202 may refine its estimate of the digits encoded in the barcode. The estimated digits may be compared to the check sum digit of the barcode to determine the accuracy of the estimate.
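The check-sum comparison mentioned above can be sketched concretely. UPC-A uses the standard modulo-10 scheme (odd positions weighted by 3); the functions below are an illustrative verifier under that assumption, not the patent's implementation.

```python
def upc_a_check_digit(digits):
    """Compute the UPC-A check digit from the first eleven digits.

    Odd positions (1st, 3rd, ...) are weighted by 3, even positions
    by 1; the check digit brings the total to a multiple of ten.
    """
    odd_sum = sum(digits[0::2])   # 1st, 3rd, ... (0-indexed evens)
    even_sum = sum(digits[1::2])  # 2nd, 4th, ...
    return (10 - (3 * odd_sum + even_sum) % 10) % 10

def estimate_is_consistent(estimated_digits):
    """True if the twelfth (check) digit matches the first eleven."""
    return upc_a_check_digit(estimated_digits[:11]) == estimated_digits[11]
```

For example, `upc_a_check_digit([0, 3, 6, 0, 0, 0, 2, 9, 1, 4, 5])` yields 2, so a digit estimate ending in 2 would pass the check, and an estimate ending in any other digit would trigger another refinement pass.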
  • Communication module 214 may transmit decoded barcode data to an application server via a network and receive data from the commerce or publication system related to the barcode data.
  • barcode data may be transmitted to the publication system 102 to receive product information related to the barcode.
  • decoded barcode data may be submitted to retrieve price or product comparison data concerning the product identified by the barcode and similar products.
  • barcode decoding itself is performed by the client machine 108 without any network communications in the interest of preserving efficiency and speed.
  • FIG. 3 is a diagram of an example embodiment of a client barcode reading application.
  • a client barcode reading application 202 executing on a client machine 108 is shown.
  • Barcode reading application 202 may receive a series of video frames from a camera of the client machine 108.
  • Barcode reading application 202 may display the video frames within a user interface that prompts a user to center a barcode 304 within a region of the user interface delineated by boundary markers 302 , such as brackets or arrows.
  • barcode reading application 202 may execute modules (e.g., modules 206 , 208 , 210 , 212 of FIG. 2 ) to determine whether the received video frame is in condition for a barcode decoding attempt.
  • FIG. 4 is a diagram of an example embodiment of a client barcode reading application attempting to read a deformed linear barcode.
  • client barcode reading application 202 executing on client machine 108 is shown attempting to read a barcode 404 within boundaries 402 .
  • Barcode 404 is shown as suffering from defocus blurriness.
  • Barcode 404 may be blurry due to excessive motion or a lack of, or incorrect focus in, the video frame.
  • Barcode blur estimation module 206 of barcode reading application 202 may estimate a defocus blur for the displayed barcode by isolating a horizontal cross-section of the blurry barcode 404 and determining a difference across the pixels in the horizontal cross-section. The differences among pixels may be summed.
  • the barcode reading application 202 may not attempt to decode the barcode. Instead, the barcode reading application 202 may wait for another video frame in which the estimated defocus blur is lower than the predetermined acceptable blur threshold.
  • FIG. 5 is an image of an example embodiment of a deformed linear barcode.
  • barcode reading application 202 may be confronted with a blurry barcode 404 .
  • barcode blur estimation module 206 may estimate a defocus blur for the barcode 404 by isolating a horizontal cross-section 502 of the blurry barcode 404 .
  • the defocus blur for the barcode 404 may be estimated by obtaining the differences among pixels within the cross-section 502 , and specifically, by examining the color degradation or blending of colors from pixel to pixel.
  • the color degradation may result in the muting of the color intensity for vertical black bars and an increase in the color intensity for white spaces.
  • the differences among pixels may be summed and applied to an equation used to predict a radius of defocus blur.
  • the initial estimate of defocus blur may be used in an iterative algorithm that attempts to decode the barcode despite various barcode deformities.
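The cross-section measurement can be sketched as follows. The patent does not give the blur-radius equation, so this sketch stops at the aggregate pixel difference, which by itself already separates sharp frames (large bar/space intensity jumps) from defocused ones (muted jumps); the threshold value is a hypothetical tuning constant.

```python
def cross_section_difference(pixels):
    """Sum of absolute intensity differences between neighboring
    pixels in one horizontal cross-section of the barcode.

    Defocus blur blends bar and space colors, muting the jumps
    between dark bars and light spaces and lowering this sum;
    a sharp frame produces a large sum.
    """
    return sum(abs(a - b) for a, b in zip(pixels, pixels[1:]))

SHARPNESS_THRESHOLD = 400  # hypothetical tuning value

def frame_worth_decoding(pixels):
    # Higher aggregate difference => sharper frame => attempt decoding.
    return cross_section_difference(pixels) >= SHARPNESS_THRESHOLD
```

A fully black/white cross-section such as `[0, 255, 0, 255, 0]` scores 1020 and passes the gate, while a blurred run of mid-grays scores far below the threshold and causes the application to wait for the next frame.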
  • FIG. 6 is a diagram of an example embodiment of a client barcode reading application attempting to read a deformed linear barcode.
  • client barcode reading application 202 executing on client machine 108 is shown attempting to decode a curved barcode 602 .
  • Barcode reading application 202 may prompt a user directing a camera of client machine 108 to center a barcode 602 within boundaries 402 .
  • barcode 602 is convexly curved.
  • Barcode 602 may reflect a barcode that is used to label a cylindrical product, such as an aluminum soda can, a jar, or a bottle.
  • Barcode blur estimation module 206 may calculate a defocus blur using any of the methods described herein. Assuming for the sake of the discussion that the defocus blur is acceptable (i.e., below the predetermined acceptable blur threshold), barcode localization module 208 may localize the area of the video frame to attempt a barcode decoding. Localization is aided by the presence of boundaries 402 that barcode reading application 202 displays and uses to prompt a user to center a barcode within the display screen of the client machine 108 . Barcode geometric modeler module 210 may attempt to compensate for the curvature of the displayed barcode 602 by developing a curve model for the barcode 602 .
  • the curve model may plot a series of points through the barcode and attempt to construct a curve or mathematical function to compensate for the curvature of certain barcode elements. For example, the curve model may attempt to identify non-curved data points corresponding to the curved data points using interpolation or smoothing.
  • Barcode decoder module 212 may attempt to decode the curved barcode 602 in the region identified by barcode localization module 208 using the geometric model developed by barcode geometric modeler module 210 . Each successive attempt at decoding the barcode also may update the various computed factors, such as the defocus blur, the geometric model, and the localized region.
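One possible curve model, shown only as an illustrative assumption (the patent describes fitting a curve or mathematical function but names no specific one), treats the label as wrapped around a cylinder of estimated radius and maps each observed horizontal position back to arc length along the label:

```python
import math

def unwarp_x(x_observed, radius):
    """Map an observed horizontal offset from the barcode centre back
    to arc length along a cylindrical label of the given radius (both
    in pixels), assuming a head-on view of the cylinder.

    Near the left and right edges of a convex label, equal arc lengths
    project onto fewer pixels, which is why the end bars look skinnier;
    this inverse mapping stretches those regions back out.
    """
    clamped = max(-1.0, min(1.0, x_observed / radius))
    return radius * math.asin(clamped)
```

Offsets near the centre are almost unchanged, while offsets near the visible edge stretch out, compensating for the apparent narrowing of the end bars before bar widths are measured.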
  • FIG. 7 is a diagram of an example embodiment of a deformed linear barcode.
  • a representation of a barcode 702 suffering from skewing or tilting is shown. Tilting of a barcode may present difficulties to the barcode reading application 202 when decoding because the elements of the barcode may not be vertical.
  • barcode geometric modeler module 210 may take two horizontal cross-sections 704 , 706 of the tilted barcode 702 and compare the barcode elements contained within the two horizontal cross-sections. For simplicity of discussion, reference is made in FIG. 7 to only one barcode element of the barcode representation 702 .
  • Data points or segments corresponding to the barcode element at each respective point in the horizontal cross-section may illustrate the degree of tilting present in the barcode. If data points are taken, a line 708 may be drawn between the two points. An angle 712 may be calculated between the line 708 connecting the two data points of the barcode element taken from the two horizontal cross-sections and a vertical line 710 . The angle 712 may instruct the barcode geometric modeler module 210 , or any other module of the barcode reading application 202 , of the amount of shifting necessary to straighten the barcode 702 .
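The two-cross-section construction can be sketched numerically. Given the horizontal position of the same bar edge in an upper and a lower cross-section, the angle against vertical follows from the arctangent; the coordinate names here are illustrative.

```python
import math

def tilt_angle_degrees(x_upper, y_upper, x_lower, y_lower):
    """Angle between the line joining one bar edge in two horizontal
    cross-sections (line 708) and a true vertical line (line 710).

    A result of 0 means the bar is already vertical; the sign gives
    the direction in which rows must shift to straighten the barcode.
    """
    return math.degrees(math.atan2(x_lower - x_upper, y_lower - y_upper))
```

Each row of the localized region can then be shifted horizontally in proportion to its distance from a reference cross-section to straighten the bars before decoding.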
  • FIG. 8 is a flow chart of an example method 800 for recognizing a barcode.
  • a barcode reading application 202 may receive a stream of video frames from a camera.
  • the camera may be a component of a client device, such as a mobile device, or may be an external camera connected to the client device.
  • barcode reading application 202 may decode a barcode contained within one or more of the stream of video frames.
  • decoded barcode data may be communicated to a network publisher 102 to retrieve data related to the product or good identified by the barcode.
  • the retrieved data may be a description of the product, a price of the product, a comparison of the product to related products, or a comparison of the price of the product among a variety of sellers (e.g., individuals, retailers, wholesalers), among other things.
  • the retrieved data may be communicated back to the client device via network 104 and displayed on client machines 106 , 108 .
  • FIG. 9 is a flow chart of an example method 900 for recognizing a barcode.
  • a barcode reading application 202 may receive a stream of video frames from a camera.
  • the camera may be a component of client machines 106 , 108 or may be integrated with the client device via a cable or other communicative attachment.
  • a barcode blur estimation module 206 may estimate a defocus blur of a video frame of the stream of video frames to determine an amount of blur present in the video frame due to a fixed focus camera being too close to a barcode.
  • Barcode reading application 202 may choose to not decode a video frame if too much motion or defocus blur is present in the video frame, as the odds of success in decoding a barcode contained within a blur-filled video frame are low.
  • the defocus blur may be calculated by examining differences among pixels in a cross-section of a displayed barcode. For example, differences among pixels may include a color degradation or bleeding among neighboring pixels in a cross-section of a displayed barcode.
  • defocus blur may be estimated by examining pixels within the cross-section to determine differences in pixel intensity. The differences may be aggregated and applied to an equation used to predict a radius of defocus blur for the barcode. If the defocus blur is below a predetermined threshold of acceptable blur, the barcode reading application 202 may attempt to decode the barcode. Otherwise, the barcode reading application 202 may wait for a different video frame to attempt to decode the barcode.
  • the barcode may be localized within the video frame. Localization of the barcode within the video frame may help to conserve resources in that the entire video frame would not need to be processed. Localization of the barcode may entail discovering the bounds of the barcode within the video frame. In certain example embodiments, the bounds of the barcode may be determined by searching for certain barcode landmarks that signify the beginning and end of a barcode. For example, UPC and EAN barcodes are characterized by start and end guard bars that are located at the leftmost and rightmost points of the barcode. These guard bars are encoded with a static value such that a start guard bar in every UPC code has the same encoded value and visual representation.
  • the barcode may be localized to a particular region of the video frame.
  • the upper and lower bounds of the barcode may be discovered by examining cross-sections of the barcode to identify a point where the pixel intensity of the cross-section changes from one that indicates a barcode (e.g., alternating strong and weak pixel intensities reflecting the visual elements of a barcode) to one that lacks such alternating pixel intensities.
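A minimal localization sketch, assuming the cross-section has already been binarized (1 for dark bar pixels, 0 for light space pixels): scan the run-length encoding of the scanline for the bar-space-bar start guard, tolerating some width variation from blur or skew.

```python
def run_lengths(bits):
    """Collapse a binarized scanline into [value, length] runs."""
    runs = []
    for b in bits:
        if runs and runs[-1][0] == b:
            runs[-1][1] += 1
        else:
            runs.append([b, 1])
    return runs

def find_start_guard(bits):
    """Return the pixel index of a bar-space-bar triple of roughly
    equal widths (the UPC/EAN '101' start guard), or None."""
    runs = run_lengths(bits)
    pos = 0
    for i in range(len(runs) - 2):
        a, b, c = runs[i], runs[i + 1], runs[i + 2]
        if a[0] == 1 and b[0] == 0 and c[0] == 1:
            widths = (a[1], b[1], c[1])
            if max(widths) <= 2 * min(widths):  # tolerate blur/skew
                return pos
        pos += a[1]
    return None
```

The same scan run right-to-left finds the end guard, and the pair of indices bounds the barcode horizontally within the frame.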
  • the barcode contained within the video frames may be geometrically modeled to account for various linear barcode deformities. For example, if the barcode is tilted, an angle of tilt may be calculated and the barcode may be shifted using the angle of tilt. If the barcode is curved, a curve model may be developed by fitting data points of the curved barcode into a curve. The lighting by which the barcode is illuminated may be compensated for as well in the geometric model. If the barcode is rotated, a degree of rotation may be calculated and included in the geometric model.
  • the barcode may be decoded. Decoding of the barcode factors in the blur present in the video frame and the geometric model developed for the barcode. Additionally, decoding of the barcode is attempted within the localized region of the video frame as opposed to processing an entire video frame. During decoding of the barcode, the digits of the barcode are estimated and may be compared to a check sum digit in the barcode for accuracy.
  • Decoding of the barcode may involve estimating a predetermined number of barcode digits as being likely accurate. The number of barcode digits needed to be estimated as likely accurate may depend upon an adjustable threshold that quantifies the number of likely accurate digits necessary for a likely correct final result. If the decoding is not complete, the example method returns to operation 904 for the next video frame in the stream of video frames. It is contemplated that operations 904 , 906 , 908 , and 910 are performed iteratively for each video frame in the stream of video frames. It is further contemplated that operations 904 , 906 , 908 , and 910 may be performed simultaneously. If the barcode decoding is complete, the example method finishes at operation 914 .
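The iteration over operations 904-910 can be sketched as a loop; all four callables are hypothetical stand-ins for the blur-estimation, localization, modeling, and decoding modules described above.

```python
def decode_from_stream(frames, sharp_enough, localize, build_model,
                       decode_digits, digits_needed=12):
    """Refine a digit estimate over successive frames until enough
    digits are confidently decoded, mirroring operations 904-910."""
    estimate = {}
    for frame in frames:
        if not sharp_enough(frame):      # 904: blur gate rejects frame
            continue
        region = localize(frame)         # 906: bound the barcode
        model = build_model(region)      # 908: geometric model
        estimate.update(decode_digits(region, model))  # 910
        if len(estimate) >= digits_needed:
            return estimate              # decoding complete (914)
    return None                          # stream ended without success
```

Because the estimate dictionary persists across frames, a digit decoded confidently in one frame survives even if a later frame is blurrier, which is the essence of the iterative refinement described above.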
  • FIG. 10 shows a diagrammatic representation of a machine in the example form of a computer system 1000 within which a set of instructions may be executed to cause the machine to perform any one or more of the methodologies discussed herein.
  • the machine operates as a standalone device or may be connected (e.g., networked) to other machines.
  • the machine may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
  • the machine may be a PC, a tablet PC, a set-top box (STB), a PDA, a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
  • the example computer system 1000 includes a processor 1002 (e.g., a central processing unit (CPU), a graphics processing unit (GPU) or both), a main memory 1004 and a static memory 1006 , which communicate with each other via a bus 1008 .
  • the computer system 1000 may further include a video display unit 1010 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)).
  • the computer system 1000 also includes an alphanumeric input device 1012 (e.g., a keyboard), a cursor control device 1014 (e.g., a mouse), a disk drive unit 1016 , a signal generation device 1018 (e.g., a speaker) and a network interface device 1020 .
  • the disk drive unit 1016 includes a machine-readable medium 1022 on which is stored one or more sets of instructions and data structures (e.g., software 1024 ) embodying or utilized by any one or more of the methodologies or functions described herein.
  • the software 1024 may also reside, completely or at least partially, within the main memory 1004 and/or within the processor 1002 during execution thereof by the computer system 1000 , the main memory 1004 and the processor 1002 also constituting machine-readable media.
  • the software 1024 may further be transmitted or received over a network 1026 via the network interface device 1020 utilizing any one of a number of well-known transfer protocols (e.g., HTTP).
  • While the machine-readable medium 1022 is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions.
  • the term “machine-readable medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies disclosed herein, or that is capable of storing, encoding or carrying data structures utilized by or associated with such a set of instructions.
  • the term “machine-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.
  • a component or a module is a non-transitory and tangible unit capable of performing certain operations and may be configured or arranged in a certain manner.
  • In example embodiments, one or more computer systems (e.g., a standalone, client, or server computer system) or one or more components of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a component or a module that operates to perform certain operations described herein.
  • a component or a module may be implemented mechanically or electronically.
  • a component or a module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor) to perform certain operations.
  • a component or a module also may comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a component or a module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
  • the term “component” or “module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired) or temporarily configured (e.g., programmed) to operate in a certain manner and/or to perform certain operations described herein.
  • In embodiments in which components or modules are temporarily configured (e.g., programmed), each of the components or modules need not be configured or instantiated at any one instance in time. For example, where the components or modules comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different components or modules at different times.
  • Software may accordingly configure a processor, for example, to constitute a particular component or module at one instance of time and to constitute a different component or module at a different instance of time.
  • Components or modules can provide information to, and receive information from, other components or modules. Accordingly, the described components or modules may be regarded as being communicatively coupled. Where multiple of such components or modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) that connect the components or modules. In embodiments in which multiple components or modules are configured or instantiated at different times, communications between such components or modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple components or modules have access. For example, one component or module may perform an operation, and store the output of that operation in a memory device to which it is communicatively coupled. A further component or module may then, at a later time, access the memory device to retrieve and process the stored output. Components or module may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).
  • inventive subject matter may be referred to herein, individually and/or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept if more than one is in fact disclosed.

Abstract

In a system and method of recognizing a barcode from a stream of video frames, a processor-implemented camera module receives a stream of video frames, with at least one video frame including a barcode. A processor-implemented barcode blur estimate module estimates an amount of defocus blur in a video frame. The processor-implemented barcode blur estimate module further estimates an identity of the barcode. A processor-implemented barcode localization module identifies a region of the video frame containing the barcode. A processor-implemented barcode geometric modeler module generates a geometric model of the barcode that includes an identified barcode deformity. A processor-implemented barcode decoder module decodes the barcode from the video frame using the estimated amount of defocus blur, the estimated identity of the barcode, and the geometric model of the barcode.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
This application is a continuation of U.S. patent application Ser. No. 14/488,191, filed on Sep. 16, 2014, which is a continuation of U.S. patent application Ser. No. 12/885,155, filed on Sep. 17, 2010, which claims the benefit of U.S. Provisional Application Serial No. 61/245,635, filed on Sep. 24, 2009. This application is related to U.S. application Ser. No. 12/885,221, filed on Sep. 17, 2010 and to U.S. application Ser. No. 14/488,179, filed Sep. 16, 2014. Each of these applications is incorporated herein by reference in its entirety.
TECHNICAL FIELD
This application relates generally to the field of barcode processing, and more specifically, to a system and method for estimating and classifying barcodes using heuristic and statistical measures.
BACKGROUND
Barcodes are widely used to identify and track goods and documents, among other things. A commonly used barcode is a linear barcode, which is a machine-readable representation of data that represents data in the widths and spacing of parallel lines. Different linear barcode formats have emerged over time, with Universal Product Code (UPC) and European Article Number (EAN) being two commonly used barcode formats.
A commonly used UPC code is a UPC-A barcode. A UPC-A barcode is characterized by twelve decimal digits, preceded by a start delimiter and followed by an end delimiter. In the middle of the twelve digit barcode, between the sixth and seventh digits, is a middle delimiter. The start, middle, and end delimiters function to separate the twelve digits into two groups of six digits. The start and end delimiters are characterized by a “101” bit pattern, which may be visualized as two vertical black guard bars with a white space between the bars. The middle delimiter is characterized by a “01010” bit pattern, which may be visualized as a white space, a black vertical guard bar, a white space, a black vertical guard bar, and a white space. Between the start and middle delimiters are six “left” digits, and between the middle and end delimiters are six “right” digits. Each digit is represented by a seven-bit code, with a binary ‘1’ value represented by a vertical black bar and a binary ‘0’ value represented by a vertical white space. The seven-bit code for each digit is represented visually as two bars and two spaces, with each of the bars and spaces having varying width depending on the digit. To distinguish between “left” digits and “right” digits, a “left” digit seven-bit code is the inverse of a “right” digit seven-bit code. The following table illustrates the seven-bit code values for each barcode digit.
Digit Left Pattern Right Pattern
0 0001101 1110010
1 0011001 1100110
2 0010011 1101100
3 0111101 1000010
4 0100011 1011100
5 0110001 1001110
6 0101111 1010000
7 0111011 1000100
8 0110111 1001000
9 0001011 1110100
Among the twelve digits of the barcode, the first, or leftmost, digit is a prefix digit, while the last, or rightmost, digit is an error-correcting check digit.
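The left/right relationship in the table above can be checked mechanically: each right pattern is the bitwise inverse of its left pattern, which is one way a decoder distinguishes the two halves of the symbol. A small illustrative lookup:

```python
# Left-digit patterns from the table above.
LEFT_PATTERNS = {
    "0001101": 0, "0011001": 1, "0010011": 2, "0111101": 3,
    "0100011": 4, "0110001": 5, "0101111": 6, "0111011": 7,
    "0110111": 8, "0001011": 9,
}

def invert(pattern):
    """Bitwise inverse: a left pattern becomes its right pattern."""
    return "".join("1" if b == "0" else "0" for b in pattern)

# Right-pattern lookup derived entirely from the left table.
RIGHT_PATTERNS = {invert(p): d for p, d in LEFT_PATTERNS.items()}
```

For instance, `invert("0001101")` gives `"1110010"`, matching the right pattern for digit 0 in the table above.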
A commonly used EAN barcode is an EAN-13 barcode. The EAN-13 barcode is a superset of a UPC-A barcode. The EAN-13 barcode uses thirteen digits broken up into four components: a prefix, which may be two or three digits long; a company number, which may be four to six digits long; an item reference number, which may be two to six digits long; and a single checksum digit. EAN-13 barcodes differ from UPC-A barcodes in that the data digits are split into three groups: a first digit, a first group of six digits, and a second group of six digits. The first group of six digits is encoded according to one of two encoding schemes, one of which has even parity and one of which has odd parity, while the second group of six digits is encoded as bitwise complements to the digits of the first group having the odd parity encoding scheme.
Barcodes are commonly read using fixed or mounted barcode scanners, such as those found as part of a point-of-sale system, or using commercial laser-based, handheld barcode readers, which are often attached to a point-of-sale system. However, with the proliferation of handheld and mobile devices, there is a growing interest in leveraging the ability of these devices to read barcodes.
BRIEF DESCRIPTION OF THE DRAWINGS
The drawings illustrate generally, by way of example, but not by way of limitation, various embodiments discussed in the present document.
FIG. 1 is a network diagram depicting a network system, according to one embodiment, having a client-server architecture configured for exchanging data over a network.
FIG. 2 is a block diagram illustrating an example embodiment of various client modules that may be used to execute the processes described herein.
FIG. 3 is a diagram of an example embodiment of a client barcode reading application.
FIG. 4 is a diagram of an example embodiment of a client barcode reading application attempting to read a deformed linear barcode.
FIG. 5 is an image of an example embodiment of a deformed linear barcode.
FIG. 6 is a diagram of an example embodiment of a client barcode reading application attempting to read a deformed linear barcode.
FIG. 7 is a diagram of an example embodiment of a deformed linear barcode.
FIG. 8 is a flow chart of an example method for recognizing a barcode.
FIG. 9 is a flow chart of an example method for recognizing a barcode.
FIG. 10 shows a diagrammatic representation of a machine in the example form of a computer system within which a set of instructions may be executed to cause the machine to perform any one or more of the methodologies discussed herein.
DETAILED DESCRIPTION
Although embodiments of the disclosure have been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the disclosure. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.
In various embodiments, a system and method to recognize a deformed linear barcode from varied-focus video frames are disclosed. A processor-implemented camera module may receive a stream of video frames. The stream of video frames may include a barcode contained therein. A processor-implemented barcode blur estimate module may estimate an amount of defocus blur in a video frame of the stream of video frames. The processor-implemented barcode blur estimate module may estimate an identity of the barcode using a backward extraction technique. A processor-implemented barcode localization module may identify a region of the video frame containing the barcode. A processor-implemented geometric modeler module may generate a geometric model of the barcode that includes an identified barcode deformity. A processor-implemented barcode decoder module may decode the barcode using the estimated amount of defocus blur, the estimated barcode identity, and the geometric model of the barcode.
FIG. 1 is a network diagram depicting a network system 100, according to one embodiment, having a client-server architecture configured for exchanging data over a network. For example, the network system 100 may be a commerce or publication system (e.g., network-based publisher 102) where clients may communicate and exchange data within the network system 100. The data may pertain to various functions (e.g., product lookups, product or price comparisons, and online item purchases) associated with the network system 100 and its users. Although illustrated herein as a client-server architecture as an example, other embodiments may include other network architectures, such as a peer-to-peer or distributed network environment.
A data exchange platform, in an example form of a network-based publisher 102, may provide server-side functionality, via a network 104 (e.g., the Internet) to one or more clients. The one or more clients may include users that utilize the network system 100 and more specifically, the network-based publisher 102, to exchange data over the network 104. These transactions may include transmitting, receiving (communicating) and processing data to, from, and regarding content and users of the network system 100. The data may include, but are not limited to, content and user data such as barcode-related data, product profiles, product reviews, product comparisons, price comparisons, product recommendations and identifiers, product and service listings associated with buyers and sellers, auction bids, and transaction data, among other things.
In various embodiments, the data exchanges within the network system 100 may be dependent upon user-selected functions available through one or more client or user interfaces (UIs). The UIs may be associated with a client machine, such as a client machine 106 using a web client 110. The web client 110 may be in communication with the network-based publisher 102 via a web server 120. The UIs may also be associated with a client machine 108 using a programmatic client 112, such as a client application. It can be appreciated that in various example embodiments, the client machines 106, 108 may be associated with a buyer, a seller, a payment service provider, or a shipping service provider, each in communication with the network-based publisher 102 and, optionally, each other. The buyers and sellers may be any one of individuals, merchants, or service providers, among other things. The role of a user of client machines 106, 108 is immaterial to the discussion herein, and the foregoing are merely examples of the types of users who may operate client machines 106, 108.
Client machines 106, 108 executing web client 110 or programmatic client 112 may use the web client 110 or programmatic client 112 to read a barcode. In an example embodiment, client machines 106, 108 may be handheld or mobile devices. Client machines 106, 108 may have camera functionality, implemented in example embodiments as a built-in camera or external camera. In an example embodiment, the built-in camera or external camera may have a fixed focus lens. In an example embodiment, client machines 106, 108 may capture and decode a barcode using web client 110 or programmatic client 112 (e.g., client app). Client machines 106, 108 may transmit decoded barcode information to the network-based publisher 102 to retrieve additional information concerning the decoded barcode.
Referring to the network-based publisher 102, an application program interface (API) server 118 and a web server 120 are coupled to, and provide programmatic and web interfaces respectively to, one or more application servers 122. The application servers 122 host one or more publication or commerce application(s) 124. The application servers 122 are, in turn, shown to be coupled to one or more database server(s) 126 that facilitate access to one or more database(s) 128.
In one embodiment, the web server 120 and the API server 118 communicate and receive data, such as in the form of decoded barcode data, pertaining to products, listings, and transactions, among other things, via various user input tools. For example, the web server 120 may send and receive data to and from a barcode reading webpage on a browser application (e.g., web client 110) operating on a client machine (e.g., client machine 106). The API server 118 may send and receive data to and from a barcode reading app (e.g., programmatic client 112) running on another client machine (e.g., client machine 108).
The commerce or publication application(s) 124 may provide a number of commerce and publisher functions and services (e.g., listing, product lookup, price comparison, payment, etc.) to users who access the network-based publisher 102. For example, the commerce and publication application(s) 124 may provide a number of services and functions to users for listing goods for sale, facilitating transactions, and reviewing or comparing products and prices of products. Data pertaining to the services and functions provided by the commerce and publication application(s) 124 may be retrieved from database(s) 128 via database server 126.
FIG. 2 is a block diagram illustrating an example embodiment of various client modules that may be used to execute the processes described herein. For example purposes only, FIG. 2 will be described with reference to client machine 108 of FIG. 1, although one of ordinary skill in the art will recognize that any client device may be used to implement the client modules discussed herein. Client machine 108 may include various modules that perform various functions. The modules may be implemented as software, hardware, firmware, or any combination thereof. Multiple functions may be performed in one or more modules as desired, and the embodiments described are merely examples. The software may be executed, at least in part, on one or more of at least one computer processor, digital signal processor, Application Specific Integrated Circuit (ASIC), microprocessor, or other type of processor operating on a system, such as a personal computer (PC), set top box, personal digital assistant (PDA), smart phone, server, router, or other device capable of processing data, including network interconnection devices.
Client machine 108 may execute a barcode reading application 202. Barcode reading application 202 may include or call at least a camera module 204, a barcode blur estimation module 206, a barcode localization module 208, a barcode geometric modeler module 210, a barcode decoder module 212, and a communication module 214, each of which may reside within client machine 108.
Camera module 204 may be an internal or external camera for use with the client machine 108. For example, many mobile devices, such as the Apple iPhone®, Research in Motion Blackberry®, and devices running the Google Android® operating system, include camera devices enabling users to capture images or video. In an example embodiment, camera module 204 includes a fixed focus lens. In further example embodiments, camera module 204 may include a camera flash device, such as a light-emitting diode (LED) flash. Camera module 204 may include software capable of implementing digital zoom for capturing images or video. In an example embodiment, camera module 204 may capture video frames in standard or high definition. The video frames may be provided as a stream of video frames to the barcode reading application 202 or any components thereof.
Barcode blur estimation module 206 determines when a client barcode reading application should attempt to read and decode a barcode. Barcode reading application 202 may interface with camera module 204 to receive video frames from camera module 204. In an example embodiment, upon execution of barcode reading application 202, client machine 108 may activate camera module 204, and video frames captured by camera module 204 may be provided to barcode blur estimation module 206. Barcode reading application 202 may decode a barcode directly from one or more video frames captured or received by the camera module 204. In other words, in an example embodiment, barcode reading application 202 does not need to capture an image or take a picture of a barcode in order to read and process the barcode.
In order to maximize the success of reading a barcode from a series of video frames, barcode blur estimation module 206 evaluates a barcode contained within the stream of video frames to determine an appropriate reading environment. Barcode blur estimation module 206 may estimate the defocus blur and motion blur contained in the video frames. Defocus blur may arise from the use of a fixed focus camera to capture video frames containing a barcode that is close to the camera. If the amount of defocus blur is greater than a predetermined acceptable threshold amount of defocus blur, barcode reading application 202 will not attempt to read the barcode from the video frames. Once the amount of defocus blur is determined to be less than the predetermined threshold, barcode reading application 202 will attempt to decode the barcode. Thus, unlike other barcode reading applications, barcode reading application 202 will not expend unnecessary resources attempting to decode a barcode contained in a video frame suffering from defocus blurriness.
In an example embodiment, after a barcode is located in the stream of video frames and a 1D signal is obtained, the barcode blur estimation module 206 estimates the defocus blur by computing a difference in pixel intensity across pixels in the video frames that correspond to the barcode. In an example embodiment, the pixels may be selected from a horizontal cross-section of the barcode. These differences across pixels are summed. In an example embodiment in which defocus blur is present in a video frame, the pixels may be smeared together such that the sum of the differences between pixels may be less than that of a perfectly sharp barcode image. The difference sum may be used in an equation that predicts the radius of the defocus blur. The equation used may be derived from testing performed over a large set of barcodes. The equation may produce an initial estimate of the defocus blur present in the video frames.
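The difference-sum computation described above can be sketched as follows. The embodiment's empirically derived equation is not disclosed, so the linear mapping and its coefficients `a` and `b` below are hypothetical placeholders; this is a minimal illustration, not the patented implementation.

```python
def estimate_defocus_blur(scanline, a=-0.01, b=12.0):
    """Estimate a defocus blur radius from one horizontal cross-section.

    `scanline` is a sequence of grayscale pixel intensities (0-255).
    A sharp barcode produces a large sum of neighboring-pixel
    differences; blur smears pixels together and lowers that sum.
    """
    # Sum of absolute intensity differences between neighboring pixels.
    diff_sum = sum(abs(scanline[i + 1] - scanline[i])
                   for i in range(len(scanline) - 1))
    # Hypothetical linear mapping from the difference sum to a blur
    # radius; the coefficients stand in for the empirically derived
    # equation referenced in the text.
    return max(0.0, a * diff_sum + b)

sharp = [0, 255, 0, 255, 0, 255, 0, 255]          # crisp bar/space edges
blurry = [96, 128, 112, 136, 120, 140, 128, 144]  # smeared intensities
assert estimate_defocus_blur(sharp) < estimate_defocus_blur(blurry)
```

A larger estimated radius would then be compared against the predetermined acceptable blur threshold before any decoding attempt.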
Known barcode decoding methods attempt to perform forward image analysis to decode the barcode, where a signal thought to contain a barcode is analyzed to identify the exact barcode displayed in the signal. This analysis attempts to identify peaks and valleys in the signal corresponding to pixel intensity, with a peak representing a black bar and a valley representing a white space. If the analysis is unable to determine the exact barcode contained in the signal, sharpening and other refinement of the signal are performed to obtain more distinct peaks and valleys. In other words, a forward image analysis starts with an unknown barcode and attempts to identify the exact barcode shown. The forward image analysis approach to decoding a barcode can be inefficient and ineffectual as a failed attempt to decode the barcode will cause the forward image analysis approach to restart the analysis anew. Thus, depending on the quality of the barcode image, many attempts at decoding the barcode may occur, and even then, the exact barcode may not be determined, due to, among other things, the presence of blur.
The barcode blur estimation module 206 may use a backward extraction process to determine the identity of a barcode contained within one or more video frames. Using the determined guess or assumption of the barcode, the barcode blur estimation module 206 can narrow the number of possibilities of the identity of the barcode to a manageable number (e.g., 10 to 30 possible valid patterns) from the overall universe of possible barcodes. The barcode blur estimation module 206 may compare the assumed barcode to each of the possible valid barcode patterns to identify a match. Thus, the barcode blur estimation module 206 may formulate an assumption of the barcode and apply its assumption or expectation to narrow the possible universe of potential applicable barcodes. The calculation of the estimated defocus blur and the use of the backward extraction process may inform the barcode reading application 202 that a video frame of a sequence of video frames is in a condition suitable for reading.
Barcode localization module 208 operates in conjunction with barcode blur estimation module 206 to focus a search for a barcode within an area of a video frame. By searching for and processing a portion of a video frame containing a barcode, barcode reading application 202 does not expend unnecessary resources or time in processing an entire video frame. Barcode localization module 208 may use the known format of certain barcodes to aid in determining an area of the video frame in which to focus. As barcode localization module 208 may be forced to operate in blurry conditions, barcode localization module 208 also operates with the knowledge that despite the presence of blurriness in a video frame, barcode elements have a vertical correlation to each other, and often the vertical correlation between barcode elements (e.g., black vertical bars and white spaces) survives or is distinguishable among the blurriness.
Operating under these assumptions, barcode localization module 208 may sample horizontal scanlines at various points throughout the video frame. In an example embodiment, barcode localization module 208 may sample a horizontal scanline at one point in the video frame and compare the scanline to a different horizontal scanline taken at a nearby point (e.g., three pixels lower than the original scanline). If the two scanlines match and indicate the presence of a vertical correlation of barcode elements, barcode localization module 208 may assume that it has discovered the location of a barcode within the video frame.
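The scanline comparison can be sketched as below. The embodiment does not specify a matching metric, so the mean-absolute-difference test and the `tolerance` parameter are illustrative assumptions.

```python
def scanlines_match(line_a, line_b, tolerance=8.0):
    """Compare two nearby horizontal scanlines of grayscale intensities.

    A small mean absolute difference suggests the vertical correlation
    characteristic of a linear barcode's bars and spaces. The metric
    and `tolerance` are assumptions, not specified by the embodiment.
    """
    if len(line_a) != len(line_b) or not line_a:
        return False
    mean_diff = sum(abs(a - b) for a, b in zip(line_a, line_b)) / len(line_a)
    return mean_diff <= tolerance

barcode_row = [0, 0, 255, 255, 0, 255, 0, 0, 255]
nearby_row = [0, 4, 250, 255, 2, 252, 0, 3, 255]   # e.g., three pixels lower
text_row = [255, 200, 90, 30, 120, 240, 60, 10, 180]
assert scanlines_match(barcode_row, nearby_row)      # vertically correlated
assert not scanlines_match(barcode_row, text_row)    # no correlation
```

A match at two vertically offset positions would prompt the localization module to treat that region as a candidate barcode location.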
Barcode localization module 208 may then use the known format of linear barcodes to further identify the region of the video frame where it believes the barcode resides. For example, as discussed herein, a UPC or EAN barcode may include start and end guard bars that delineate the beginning and end of a barcode. The start and end guard bars are characterized as having the same pattern of bars and spaces (and data) irrespective of the encoded content within the barcode. The barcode localization module 208 may use the general location identified by the comparison of horizontal scanlines to search for known barcode identifying features, such as the start and end guard bars. Once identified, the location of the start and end guard bars of the barcode enable the barcode localization module 208 to mark the left and right bounds of the barcode. Using the horizontal scanline samples, the barcode localization module 208 may mark the upper and lower bounds of the barcode. Barcode localization module 208 may continually refine the bounds of the barcode until it determines that the barcode has been identified, at which point a boundary box or boundary identifiers surrounding the barcode are locked in place.
Barcode geometric modeler module 210 may attempt to compensate for geometric deformities in the barcode contained within the one or more video frames by developing a geometric model for the barcode. Despite being localized, the barcode contained within a video frame may not be able to be decoded, or may not be able to be decoded efficiently. In example embodiments, the barcode may be curved, tilted, warped, or rotated, among other things. Each of these deformities may affect and hamper decoding of the barcode. For example, if a barcode is curved, the vertical black bars near the left and right ends of the barcode may appear to be skinnier than in reality due to the curvature of the barcode. As the width of a vertical bar of a barcode is used to determine the value of an encoded digit, a curved barcode can cause the barcode to be decoded incorrectly. Barcode geometric modeler module 210 may plot points throughout the barcode to develop a curve model for the barcode that accounts for the barcode being curved.
In the event the barcode is skewed or angled, the vertical bars of the barcode may not be precisely vertical. To compensate for a skewed barcode, two horizontal cross-sections of the barcode may be obtained. A vertical bar in the two horizontal cross-sections that is shifted may indicate that the barcode is skewed. To compensate for a skewed barcode, the barcode geometric modeler module 210 may calculate an angle formed by the skewed vertical bar and a vertical line corresponding to a non-skewed vertical bar.
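The angle calculation for a skewed bar follows directly from the horizontal shift of the same bar edge between the two cross-sections. A minimal sketch, assuming the edge positions have already been located:

```python
import math

def skew_angle_degrees(x_top, x_bottom, row_gap):
    """Angle between a bar edge observed in two horizontal
    cross-sections and a true vertical line.

    `x_top` and `x_bottom` are the horizontal positions of the same
    bar edge in cross-sections `row_gap` pixel rows apart.
    """
    return math.degrees(math.atan2(x_bottom - x_top, row_gap))

# An edge shifted 10 px over a 10-px vertical gap is tilted 45 degrees;
# no shift means no skew.
assert abs(skew_angle_degrees(100, 110, 10) - 45.0) < 1e-9
assert skew_angle_degrees(100, 100, 10) == 0.0
```

The resulting angle corresponds to the correction the geometric model would apply to treat the bars as vertical.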
The geometric model of the barcode developed by the barcode geometric modeler module 210 may account for barcode skew, curve, tilt, warping, and lighting conditions, among other things. The barcode geometric modeler module 210 may compare the geometric model to the barcode contained in the video frame to determine the accuracy of the model. Barcode geometric modeler module 210 may refine the geometric model over successive video frames or decoding iterations.
Barcode decoder module 212 may account for the deformities in the barcode when attempting to decode the barcode by relying on the known features of the barcode format to aid in decoding the barcode. For example, every UPC barcode will have a middle delimiter to separate a first group of six digits from a second group of six digits. As previously discussed, the middle delimiter has a constant value regardless of the content of the UPC barcode. The barcode decoder module 212 may begin the decoding of the barcode by identifying the middle delimiter. Once identified, the barcode decoder module 212 may decode the barcode elements to the left of the middle delimiter and the barcode elements to the right of the middle delimiter independently. The decoding of barcode digits may involve estimating the digits of the barcode based on the bars and spaces of the barcode.
It is to be appreciated that while barcode decoder module 212 may first identify a middle delimiter of a UPC code when decoding a barcode, other types or formats of barcodes may be decoded differently. For example, other types of barcodes may not use a middle delimiter to separate a first group of barcode digits from a second group of barcode digits. For these and other types of barcodes, rather than concentrate on identifying a middle delimiter, barcode decoder module 212 may attempt to identify and decode an area of the barcode having a concentration of information or data.
Barcode reading application 202 may call or execute each of the barcode blur estimation module 206, barcode localization module 208, barcode geometric modeler module 210, and barcode decoder module 212 for each attempted decoding. In an example embodiment, the foregoing modules 206, 208, 210, 212 may be executed simultaneously for each attempted decoding. With each iteration, barcode reading application 202 may refine its estimate of the digits encoded in the barcode. The estimated digits may be compared to the check sum digit of the barcode to determine the accuracy of the estimate.
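The comparison against the check sum digit can be made concrete with the standard UPC-A check digit calculation, in which the eleven data digits determine the twelfth digit:

```python
def upc_a_check_digit(data_digits):
    """Compute the UPC-A check digit from the first eleven digits.

    Digits in odd positions (1st, 3rd, ...) are weighted by 3; the
    check digit brings the weighted sum to a multiple of 10.
    """
    odd = sum(data_digits[0::2])   # positions 1, 3, 5, ...
    even = sum(data_digits[1::2])  # positions 2, 4, 6, ...
    return (10 - (3 * odd + even) % 10) % 10

# The well-known UPC-A code 036000291452 carries check digit 2.
assert upc_a_check_digit([0, 3, 6, 0, 0, 0, 2, 9, 1, 4, 5]) == 2
```

If the estimated digits fail this check, the application would refine its estimate on a subsequent iteration rather than report a result.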
Communication module 214 may transmit decoded barcode data to an application server via a network and receive data from the commerce or publication system related to the barcode data. For example, barcode data may be transmitted to the publication system 102 to receive product information related to the barcode. In another example embodiment, decoded barcode data may be submitted to retrieve price or product comparison data concerning the product identified by the barcode and similar products. In an example embodiment, barcode decoding itself is performed by the client machine 108 without any network communications in the interest of preserving efficiency and speed.
FIG. 3 is a diagram of an example embodiment of a client barcode reading application. Referring to FIG. 3, a client barcode reading application 202 executing on a client machine 108 is shown. Barcode reading application 202 may receive a series of video frames from a camera of the client machine 108. Barcode reading application 202 may display the video frames within a user interface that prompts a user to center a barcode 304 within a region of the user interface delineated by boundary markers 302, such as brackets or arrows. As the barcode 304 is centered within the boundary markers 302, barcode reading application 202 may execute modules (e.g., modules 206, 208, 210, 212 of FIG. 2) to determine whether the received video frame is in condition for a barcode decoding attempt.
FIG. 4 is a diagram of an example embodiment of a client barcode reading application attempting to read a deformed linear barcode. Referring to FIG. 4, client barcode reading application 202 executing on client machine 108 is shown attempting to read a barcode 404 within boundaries 402. Barcode 404 is shown as suffering from defocus blurriness. Barcode 404 may be blurry due to excessive motion, or due to a lack of or incorrect focus in the video frame. Barcode blur estimation module 206 of barcode reading application 202 may estimate a defocus blur for the displayed barcode by isolating a horizontal cross-section of the blurry barcode 404 and determining a difference across the pixels in the horizontal cross-section. The differences among pixels may be summed. The summed differences among pixels are then applied to an equation to predict the radius of the defocus blur. If the defocus blur exceeds a predetermined acceptable blur threshold, the barcode reading application 202 may not attempt to decode the barcode. Instead, the barcode reading application 202 may wait for another video frame in which the estimated defocus blur is lower than the predetermined acceptable blur threshold.
FIG. 5 is an image of an example embodiment of a deformed linear barcode. As discussed herein with reference to FIG. 4, barcode reading application 202 may be confronted with a blurry barcode 404. In determining whether to attempt to decode blurry barcode 404, barcode blur estimation module 206 may estimate a defocus blur for the barcode 404 by isolating a horizontal cross-section 502 of the blurry barcode 404. The defocus blur for the barcode 404 may be estimated by obtaining the differences among pixels within the cross-section 502, and specifically, by examining the color degradation or blending of colors from pixel to pixel. The color degradation may result in the muting of the color intensity for vertical black bars and an increase in the color intensity for white spaces. The differences among pixels may be summed and applied to an equation used to predict a radius of defocus blur. The initial estimate of defocus blur may be used in an iterative algorithm that attempts to decode the barcode despite various barcode deformities.
FIG. 6 is a diagram of an example embodiment of a client barcode reading application attempting to read a deformed linear barcode. Referring to FIG. 6, client barcode reading application 202 executing on client machine 108 is shown attempting to decode a curved barcode 602. Barcode reading application 202 may prompt a user directing a camera of client machine 108 to center a barcode 602 within boundaries 402. In the example embodiment of FIG. 6, barcode 602 is convexly curved. Barcode 602 may reflect a barcode that is used to label a cylindrical product, such as an aluminum soda can, a jar, or a bottle.
Barcode blur estimation module 206 may calculate a defocus blur using any of the methods described herein. Assuming for the sake of the discussion that the defocus blur is acceptable (i.e., below the predetermined acceptable blur threshold), barcode localization module 208 may localize the area of the video frame to attempt a barcode decoding. Localization is aided by the presence of boundaries 402 that barcode reading application 202 displays and uses to prompt a user to center a barcode within the display screen of the client machine 108. Barcode geometric modeler module 210 may attempt to compensate for the curvature of the displayed barcode 602 by developing a curve model for the barcode 602. The curve model may plot a series of points through the barcode and attempt to construct a curve or mathematical function to compensate for the curvature of certain barcode elements. For example, the curve model may attempt to identify non-curved data points corresponding to the curved data points using interpolation or smoothing. Barcode decoder module 212 may attempt to decode the curved barcode 602 in the region identified by barcode localization module 208 using the geometric model developed by barcode geometric modeler module 210. Each successive attempt at decoding the barcode also may update the various computed factors, such as the defocus blur, the geometric model, and the localized region.
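One simple stand-in for the curve model described above is an exact quadratic through three sampled points along the barcode. The embodiment does not specify the model class, so the quadratic and the sample points below are illustrative assumptions.

```python
def quadratic_through(p0, p1, p2):
    """Return the quadratic (Lagrange form) through three sampled
    points, as a stand-in for the curve model that compensates for a
    convexly curved barcode. Each point is an (x, y) pair, e.g. a
    horizontal position and a measured vertical displacement.
    """
    (x0, y0), (x1, y1), (x2, y2) = p0, p1, p2
    def curve(x):
        l0 = (x - x1) * (x - x2) / ((x0 - x1) * (x0 - x2))
        l1 = (x - x0) * (x - x2) / ((x1 - x0) * (x1 - x2))
        l2 = (x - x0) * (x - x1) / ((x2 - x0) * (x2 - x1))
        return y0 * l0 + y1 * l1 + y2 * l2
    return curve

# Hypothetical samples from a convex barcode: displacement is largest
# at the left and right edges and zero at the center.
curve = quadratic_through((0, 4.0), (50, 0.0), (100, 4.0))
assert abs(curve(25) - 1.0) < 1e-9   # interpolated displacement
assert abs(curve(50)) < 1e-9
```

The model could then be evaluated at each bar position to undo the apparent narrowing of bars near the barcode's edges.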
FIG. 7 is a diagram of an example embodiment of a deformed linear barcode. Referring to FIG. 7, a representation of a barcode 702 suffering from skewing or tilting is shown. Tilting of a barcode may present difficulties to the barcode reading application 202 when decoding because the elements of the barcode may not be vertical. To compensate for the tilting, barcode geometric modeler module 210 may take two horizontal cross-sections 704, 706 of the tilted barcode 702 and compare the barcode elements contained within the two horizontal cross-sections. For simplicity of discussion, reference is made in FIG. 7 to only one barcode element of the barcode representation 702. Data points or segments corresponding to the barcode element at each respective point in the horizontal cross-section may illustrate the degree of tilting present in the barcode. If data points are taken, a line 708 may be drawn between the two points. An angle 712 may be calculated between the line 708 connecting the two data points of the barcode element taken from the two horizontal cross-sections and a vertical line 710. The angle 712 may inform the barcode geometric modeler module 210, or any other module of the barcode reading application 202, of the amount of shifting necessary to straighten the barcode 702.
FIG. 8 is a flow chart of an example method 800 for recognizing a barcode. At operation 802, a barcode reading application 202 may receive a stream of video frames from a camera. The camera may be a component of a client device, such as a mobile device, or may be an external camera connected to the client device. At operation 804, barcode reading application 202 may decode a barcode contained within one or more of the stream of video frames. At operation 806, decoded barcode data may be communicated to a network publisher 102 to retrieve data related to the product or good identified by the barcode. The retrieved data may be a description of the product, a price of the product, a comparison of the product to related products, or a comparison of the price of the product among a variety of sellers (e.g., individuals, retailers, wholesalers), among other things. The retrieved data may be communicated back to the client device via network 104 and displayed on client machines 106, 108.
FIG. 9 is a flow chart of an example method 900 for recognizing a barcode. At operation 902, a barcode reading application 202 may receive a stream of video frames from a camera. The camera may be a component of client machines 106, 108 or may be connected to the client device via a cable or other communicative attachment.
At operation 904, a barcode blur estimation module 206 may estimate a defocus blur of a video frame of the stream of video frames to determine an amount of blur present in the video frame due to a fixed focus camera being too close to a barcode. Barcode reading application 202 may choose not to decode a video frame if too much motion or defocus blur is present in the video frame, as the odds of success in decoding a barcode contained within a blur-filled video frame are low. The defocus blur may be calculated by examining differences among pixels in a cross-section of a displayed barcode. For example, differences among pixels may include a color degradation or bleeding among neighboring pixels in a cross-section of a displayed barcode. One effect of defocus blur is that black pixels may bleed some of their intensity into neighboring white pixels and in turn lose some of their own intensity. Defocus blur may be estimated by examining pixels within the cross-section to determine differences in pixel intensity. The differences may be aggregated and applied to an equation used to predict a radius of defocus blur for the barcode. If the defocus blur is below a predetermined threshold of acceptable blur, the barcode reading application 202 may attempt to decode the barcode. Otherwise, the barcode reading application 202 may wait for a different video frame to attempt to decode the barcode.
At operation 906, the barcode may be localized within the video frame. Localization of the barcode within the video frame may help to conserve resources in that the entire video frame would not need to be processed. Localization of the barcode may entail discovering the bounds of the barcode within the video frame. In certain example embodiments, the bounds of the barcode may be determined by searching for certain barcode landmarks that signify the beginning and end of a barcode. For example, UPC and EAN barcodes are characterized by start and end guard bars that are located at the leftmost and rightmost points of the barcode. These guard bars are encoded with a static value such that a start guard bar in every UPC code has the same encoded value and visual representation. By discovering the start and end guard bars of the barcode, the barcode may be localized to a particular region of the video frame. The upper and lower bounds of the barcode may be discovered by examining cross-sections of the barcode to identify a point where the pixel intensity of the cross-section changes from one that indicates a barcode (e.g., alternating strong and weak pixel intensities reflecting the visual elements of a barcode) to one that lacks such alternating pixel intensities.
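The guard bar landmark search can be sketched over a run-length-encoded scanline. The run-length representation and the width `tolerance` are illustrative assumptions; the embodiment specifies only that the static guard patterns mark the left and right bounds.

```python
def find_guard_bars(runs, tolerance=0.35):
    """Locate a UPC/EAN guard pattern (bar-space-bar of roughly equal
    module width) in a run-length-encoded scanline.

    `runs` is a list of (is_bar, width) pairs, where `is_bar` is True
    for a dark run. Returns the index of the guard's first run, or -1.
    """
    for i in range(len(runs) - 2):
        (b0, w0), (b1, w1), (b2, w2) = runs[i:i + 3]
        if not (b0 and not b1 and b2):
            continue  # guard must be bar, space, bar
        avg = (w0 + w1 + w2) / 3.0
        # Accept only runs of approximately equal width.
        if all(abs(w - avg) <= tolerance * avg for w in (w0, w1, w2)):
            return i
    return -1

# Quiet zone, then a 1-1-1 guard, then wider data runs (hypothetical).
runs = [(False, 20), (True, 3), (False, 3), (True, 3), (False, 6), (True, 9)]
assert find_guard_bars(runs) == 1
```

Running the search from both ends of the scanline would yield the left and right bounds of the barcode region.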
At operation 908, the barcode contained within the video frames may be geometrically modeled to account for various linear barcode deformities. For example, if the barcode is tilted, an angle of tilt may be calculated and the barcode may be shifted using the angle of tilt. If the barcode is curved, a curve model may be developed by fitting data points of the curved barcode into a curve. The lighting by which the barcode is illuminated may be compensated for as well in the geometric model. If the barcode is rotated, a degree of rotation may be calculated and included in the geometric model.
At operation 910, the barcode may be decoded. Decoding of the barcode factors in the blur present in the video frame and the geometric model developed for the barcode. Additionally, decoding of the barcode is attempted within the localized region of the video frame as opposed to processing an entire video frame. During decoding of the barcode, the digits of the barcode are estimated and may be compared to a check sum digit in the barcode for accuracy.
At operation 912, it is determined whether the barcode has been decoded. Decoding of the barcode may involve estimating a predetermined number of barcode digits as being likely accurate. The number of barcode digits needed to be estimated as likely accurate may depend upon an adjustable threshold that quantifies the number of likely accurate digits necessary for a likely correct final result. If the decoding is not complete, the example method returns to operation 904 for the next video frame in the stream of video frames. It is contemplated that operations 904, 906, 908, and 910 are performed iteratively for each video frame in the stream of video frames. It is further contemplated that operations 904, 906, 908, and 910 may be performed simultaneously. If the barcode decoding is complete, the example method finishes at operation 914.
FIG. 10 shows a diagrammatic representation of a machine in the example form of a computer system 1000 within which a set of instructions may be executed causing the machine to perform any one or more of the methodologies discussed herein. In alternative embodiments, the machine operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine may be a PC, a tablet PC, a set-top box (STB), a PDA, a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
The example computer system 1000 includes a processor 1002 (e.g., a central processing unit (CPU), a graphics processing unit (GPU) or both), a main memory 1004 and a static memory 1006, which communicate with each other via a bus 1008. The computer system 1000 may further include a video display unit 1010 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)). The computer system 1000 also includes an alphanumeric input device 1012 (e.g., a keyboard), a cursor control device 1014 (e.g., a mouse), a disk drive unit 1016, a signal generation device 1018 (e.g., a speaker) and a network interface device 1020.
The disk drive unit 1016 includes a machine-readable medium 1022 on which is stored one or more sets of instructions and data structures (e.g., software 1024) embodying or utilized by any one or more of the methodologies or functions described herein. The software 1024 may also reside, completely or at least partially, within the main memory 1004 and/or within the processor 1002 during execution thereof by the computer system 1000, the main memory 1004 and the processor 1002 also constituting machine-readable media.
The software 1024 may further be transmitted or received over a network 1026 via the network interface device 1020 utilizing any one of a number of well-known transfer protocols (e.g., HTTP).
While the machine-readable medium 1022 is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term “machine-readable medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies disclosed herein, or that is capable of storing, encoding or carrying data structures utilized by or associated with such a set of instructions. The term “machine-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.
Certain embodiments are described herein as including logic or a number of components, modules, or mechanisms. A component or a module is a non-transitory and tangible unit capable of performing certain operations and may be configured or arranged in a certain manner. In example embodiments, one or more computer systems (e.g., a standalone, client or server computer system) or one or more components of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a component that operates to perform certain operations as described herein.
In various embodiments, a component or a module may be implemented mechanically or electronically. For example, a component or a module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor) to perform certain operations. A component or a module also may comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a component or a module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
Accordingly, the term “component” or “module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired) or temporarily configured (e.g., programmed) to operate in a certain manner and/or to perform certain operations described herein. Considering embodiments in which components or modules are temporarily configured (e.g., programmed), each of the components or modules need not be configured or instantiated at any one instance in time. For example, where the components or modules comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different components or modules at different times. Software may accordingly configure a processor, for example, to constitute a particular component or module at one instance of time and to constitute a different component or module at a different instance of time.
Components or modules can provide information to, and receive information from, other components or modules. Accordingly, the described components or modules may be regarded as being communicatively coupled. Where multiple of such components or modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) that connect the components or modules. In embodiments in which multiple components or modules are configured or instantiated at different times, communications between such components or modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple components or modules have access. For example, one component or module may perform an operation, and store the output of that operation in a memory device to which it is communicatively coupled. A further component or module may then, at a later time, access the memory device to retrieve and process the stored output. Components or modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).
Although an embodiment has been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the disclosure. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense. The accompanying drawings that form a part hereof, show by way of illustration, and not of limitation, specific embodiments in which the subject matter may be practiced. The embodiments illustrated are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed herein. Other embodiments may be utilized and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. This Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.
Such embodiments of the inventive subject matter may be referred to herein, individually and/or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept if more than one is in fact disclosed. Thus, although specific embodiments have been illustrated and described herein, it should be appreciated that any arrangement calculated to achieve the same purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the above description.
The preceding technical disclosure is intended to be illustrative, and not restrictive. For example, the above-described embodiments (or one or more aspects thereof) may be used in combination with each other. Other embodiments will be apparent to those of skill in the art upon reviewing the above description. The scope of the claims should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. In the appended claims, the terms “including” and “in which” are used as the plain-English equivalents of the respective terms “comprising” and “wherein.” Also, in the following claims, the terms “including” and “comprising” are open-ended, that is, a system, device, article, or process that includes elements in addition to those listed after such a term in a claim is still deemed to fall within the scope of that claim. Moreover, in the following claims, the terms “first,” “second,” and “third,” etc. are used merely as labels, and are not intended to impose numerical requirements on their objects.
In this document, the terms “a” or “an” are used, as is common in patent documents, to include one or more than one. In this document, the term “or” is used to refer to a nonexclusive or, such that “A or B” includes “A but not B,” “B but not A,” and “A and B,” unless otherwise indicated. Furthermore, all publications, patents, and patent documents referred to in this document are incorporated by reference herein in their entirety, as though individually incorporated by reference. In the event of inconsistent usages between this document and those documents so incorporated by reference, the usage in the incorporated reference(s) should be considered supplementary to that of this document; for irreconcilable inconsistencies, the usage in this document controls.
The Abstract of the Disclosure is provided to comply with 37 C.F.R. §1.72(b), requiring an abstract that will allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate embodiment.

Claims (19)

What is claimed is:
1. A system comprising:
an evaluation module, implemented by at least one processor, configured to evaluate a threshold condition for reading a barcode that is included in a video frame, the threshold condition including a comparison between an amount of defocus blur for the barcode and a predetermined threshold value for defocus blur;
an estimation module, implemented by at least one processor, configured to estimate the amount of defocus blur for the barcode based on an aggregated difference in pixel intensity among a set of pixels within a cross-section of the barcode; and
an identification module, implemented by at least one processor, configured to estimate an identity of the barcode in response to a successful evaluation of the threshold condition.
2. The system of claim 1, wherein the amount of defocus blur corresponds to a radius of defocus blur that is based on a difference in pixel intensity in a cross-section of the barcode.
3. The system of claim 1, wherein the processor-implemented modules further comprise:
a difference module, implemented by at least one processor, configured to calculate a difference in pixel intensity among the set of pixels within the cross-section of the barcode; and
an aggregation module, implemented by at least one processor, configured to generate the aggregated difference by aggregating the difference in pixel intensity among the set of pixels.
4. The system of claim 1, wherein the processor-implemented modules further comprise:
a decoder module, implemented by at least one processor, configured to use the estimated barcode identity to decode the barcode.
5. The system of claim 1, wherein the processor-implemented modules further comprise:
a geometric module, implemented by at least one processor, configured to generate a geometric model of the barcode, the geometric model of the barcode including an identified barcode deformity; and
a decoder module, implemented by at least one processor, configured to use the estimated barcode identity and the identified barcode deformity to decode the barcode.
6. The system of claim 1, wherein the processor-implemented modules further comprise:
a camera module, implemented by at least one processor, configured to provide a stream of video frames, the video frame that includes the barcode being selected from the stream of video frames.
7. The system of claim 1, wherein the processor-implemented modules further comprise:
an extraction module, implemented by at least one processor, configured to extract the cross-section of the barcode from the video frame.
8. A computer-implemented method comprising:
evaluating, with at least one computer, a threshold condition for reading a barcode that is included in a video frame, the threshold condition including a comparison between an amount of defocus blur for the barcode and a predetermined threshold value for defocus blur;
estimating the amount of defocus blur for the barcode based on an aggregated difference in pixel intensity among a set of pixels within a cross-section of the barcode; and
estimating an identity of the barcode in response to a successful evaluation of the threshold condition.
9. The method of claim 8, wherein the amount of defocus blur corresponds to a radius of defocus blur that is based on a difference in pixel intensity in a cross-section of the barcode.
10. The method of claim 8, further comprising:
calculating a difference in pixel intensity among the set of pixels within the cross-section of the barcode; and
aggregating the difference in pixel intensity among the set of pixels to generate the aggregated difference.
11. The method of claim 8, further comprising:
using the estimated barcode identity to decode the barcode.
12. The method of claim 8, further comprising:
generating a geometric model of the barcode, the geometric model of the barcode including an identified barcode deformity; and
using the estimated barcode identity and the identified barcode deformity to decode the barcode.
13. The method of claim 8, further comprising:
providing a stream of video frames, the video frame that includes the barcode being selected from the stream of video frames.
14. The method of claim 8, further comprising:
extracting the cross-section of the barcode from the video frame.
15. A non-transitory machine-readable storage medium that stores instructions that, when executed by a machine, cause the machine to perform operations comprising:
evaluating a threshold condition for reading a barcode that is included in a video frame, the threshold condition including a comparison between an amount of defocus blur for the barcode and a predetermined threshold value for defocus blur;
estimating the amount of defocus blur for the barcode based on an aggregated difference in pixel intensity among a set of pixels within a cross-section of the barcode; and
estimating an identity of the barcode in response to a successful evaluation of the threshold condition.
16. The storage medium of claim 15, wherein the amount of defocus blur corresponds to a radius of defocus blur that is based on a difference in pixel intensity in a cross-section of the barcode.
17. The storage medium of claim 15, wherein the operations further comprise:
calculating a difference in pixel intensity among the set of pixels within the cross-section of the barcode; and
aggregating the difference in pixel intensity among the set of pixels to generate the aggregated difference.
18. The storage medium of claim 15, wherein the operations further comprise:
generating a geometric model of the barcode, the geometric model of the barcode including an identified barcode deformity; and
using the estimated barcode identity and the identified barcode deformity to decode the barcode.
19. The storage medium of claim 15, wherein the operations further comprise:
extracting the cross-section of the barcode from the video frame.
US14/833,184 2009-09-24 2015-08-24 System and method for recognizing deformed linear barcodes from a stream of varied focus video frames Active 2031-08-28 US10410030B2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US14/833,184 US10410030B2 (en) 2009-09-24 2015-08-24 System and method for recognizing deformed linear barcodes from a stream of varied focus video frames
US16/538,304 US11055505B2 (en) 2009-09-24 2019-08-12 System and method for recognizing deformed linear barcodes from a stream of varied focus video frames

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US24563509P 2009-09-24 2009-09-24
US12/885,155 US8851378B2 (en) 2009-09-24 2010-09-17 System and method for recognizing deformed linear barcodes from a stream of varied-focus video frames
US12/885,221 US8851382B2 (en) 2009-09-24 2010-09-17 System and method for estimation and classification of barcodes using heuristic and statistical measures
US14/488,179 US9275264B2 (en) 2009-09-24 2014-09-16 System and method for estimation and classification of barcodes using heuristic and statistical measures
US14/488,191 US9117131B2 (en) 2009-09-24 2014-09-16 System and method for recognizing deformed linear barcodes from a stream of varied focus video frames
US14/833,184 US10410030B2 (en) 2009-09-24 2015-08-24 System and method for recognizing deformed linear barcodes from a stream of varied focus video frames

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/488,191 Continuation US9117131B2 (en) 2009-09-24 2014-09-16 System and method for recognizing deformed linear barcodes from a stream of varied focus video frames

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/538,304 Continuation US11055505B2 (en) 2009-09-24 2019-08-12 System and method for recognizing deformed linear barcodes from a stream of varied focus video frames

Publications (2)

Publication Number Publication Date
US20150363628A1 (en) 2015-12-17
US10410030B2 (en) 2019-09-10

Family

ID=43755770

Family Applications (6)

Application Number Title Priority Date Filing Date
US12/885,155 Expired - Fee Related US8851378B2 (en) 2009-09-24 2010-09-17 System and method for recognizing deformed linear barcodes from a stream of varied-focus video frames
US12/885,221 Expired - Fee Related US8851382B2 (en) 2009-09-24 2010-09-17 System and method for estimation and classification of barcodes using heuristic and statistical measures
US14/488,179 Expired - Fee Related US9275264B2 (en) 2009-09-24 2014-09-16 System and method for estimation and classification of barcodes using heuristic and statistical measures
US14/488,191 Expired - Fee Related US9117131B2 (en) 2009-09-24 2014-09-16 System and method for recognizing deformed linear barcodes from a stream of varied focus video frames
US14/833,184 Active 2031-08-28 US10410030B2 (en) 2009-09-24 2015-08-24 System and method for recognizing deformed linear barcodes from a stream of varied focus video frames
US16/538,304 Active US11055505B2 (en) 2009-09-24 2019-08-12 System and method for recognizing deformed linear barcodes from a stream of varied focus video frames

Family Applications Before (4)

Application Number Title Priority Date Filing Date
US12/885,155 Expired - Fee Related US8851378B2 (en) 2009-09-24 2010-09-17 System and method for recognizing deformed linear barcodes from a stream of varied-focus video frames
US12/885,221 Expired - Fee Related US8851382B2 (en) 2009-09-24 2010-09-17 System and method for estimation and classification of barcodes using heuristic and statistical measures
US14/488,179 Expired - Fee Related US9275264B2 (en) 2009-09-24 2014-09-16 System and method for estimation and classification of barcodes using heuristic and statistical measures
US14/488,191 Expired - Fee Related US9117131B2 (en) 2009-09-24 2014-09-16 System and method for recognizing deformed linear barcodes from a stream of varied focus video frames

Family Applications After (1)

Application Number Title Priority Date Filing Date
US16/538,304 Active US11055505B2 (en) 2009-09-24 2019-08-12 System and method for recognizing deformed linear barcodes from a stream of varied focus video frames

Country Status (1)

Country Link
US (6) US8851378B2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200104559A1 (en) * 2009-09-24 2020-04-02 Ebay Inc. System and method for recognizing deformed linear barcodes from a stream of varied focus video frames

Families Citing this family (40)

Publication number Priority date Publication date Assignee Title
US9098764B2 (en) * 2009-07-20 2015-08-04 The Regents Of The University Of California Image-based barcode reader
US8121618B2 (en) 2009-10-28 2012-02-21 Digimarc Corporation Intuitive computing methods and systems
US9280768B2 (en) * 2010-03-17 2016-03-08 Verifone, Inc. Payment systems and methodologies
US8640952B2 (en) * 2011-01-13 2014-02-04 Samsung Electronics Co., Ltd. Mobile code decoding fault recovery via history data analysis
US8879639B2 (en) * 2011-01-31 2014-11-04 Hand Held Products, Inc. Adaptive video capture decode system
US9053478B2 (en) 2011-05-03 2015-06-09 Verifone, Inc. Mobile commerce system
US8532632B2 (en) * 2011-05-16 2013-09-10 Wesley Boudville Cellphone changing an electronic display that contains a barcode
US20120296817A1 (en) * 2011-05-20 2012-11-22 Powell Ken R Systems and methods for promoting products and services
US20130097034A1 (en) * 2011-10-12 2013-04-18 First Data Corporation Systems and Methods for Facilitating Point of Sale Transactions
US8867857B2 (en) 2011-12-28 2014-10-21 Samsung Electronics Co., Ltd. Method for restoration of blurred barcode images
US9424480B2 (en) * 2012-04-20 2016-08-23 Datalogic ADC, Inc. Object identification using optical code reading and object recognition
CN102929595A (en) * 2012-09-20 2013-02-13 腾讯科技(深圳)有限公司 Method and device for realizing action command
JP2015002513A (en) * 2013-06-18 2015-01-05 ソニー株式会社 Content supply device, content supply method, program, terminal device, and content supply system
US9471824B2 (en) 2013-07-12 2016-10-18 Qualcomm Incorporated Embedded barcodes for displaying context relevant information
US9495586B1 (en) 2013-09-18 2016-11-15 IDChecker, Inc. Identity verification using biometric data
US8995774B1 (en) 2013-09-19 2015-03-31 IDChecker, Inc. Automated document recognition, identification, and data extraction
CN104751092B (en) * 2013-12-26 2017-04-12 腾讯科技(深圳)有限公司 Method and device for processing graphic code
US9311517B2 (en) 2014-03-07 2016-04-12 Lockheed Martin Corporation Methods and systems for reducing the likelihood of false positive decodes
US9665754B2 (en) * 2014-05-28 2017-05-30 IDChecker, Inc. Identification verification using a device with embedded radio-frequency identification functionality
US11461567B2 (en) 2014-05-28 2022-10-04 Mitek Systems, Inc. Systems and methods of identification verification using hybrid near-field communication and optical authentication
US11640582B2 (en) 2014-05-28 2023-05-02 Mitek Systems, Inc. Alignment of antennas on near field communication devices for communication
US9361503B2 (en) 2014-10-30 2016-06-07 Datalogic IP Tech Srl Systems, methods and articles for reading highly blurred machine-readable symbols
US11351420B2 (en) * 2015-02-23 2022-06-07 Smartweights, Inc. Method and system for virtual fitness training and tracking devices
US10198648B1 (en) * 2015-04-10 2019-02-05 Digimarc Corporation Decoding 1D-barcodes in digital capture systems
US10503946B2 (en) * 2015-10-28 2019-12-10 Hewlett-Packard Development Company, L.P. Processing machine-readable link
US10331928B2 (en) 2015-11-06 2019-06-25 International Business Machines Corporation Low-computation barcode detector for egocentric product recognition
JP6645143B2 (en) * 2015-11-30 2020-02-12 ブラザー工業株式会社 Image analysis device
WO2018136027A1 (en) * 2017-01-17 2018-07-26 Hewlett-Packard Development Company, L.P. Omnidirectional barcode
EP3428834B1 (en) * 2017-07-12 2019-06-12 Sick AG Optoelectronic code reader and method for reading optical codes
CN107908998B (en) * 2017-11-28 2020-11-03 百富计算机技术(深圳)有限公司 Two-dimensional code decoding method and device, terminal equipment and computer readable storage medium
US10944610B2 (en) * 2017-12-22 2021-03-09 Massachusetts Institute Of Technology Decoding signals by guessing noise
US10608673B2 (en) 2017-12-22 2020-03-31 Massachusetts Institute Of Technology Decoding signals by guessing noise
CN108462494A (en) * 2018-03-08 2018-08-28 山东求同网信息科技有限公司 A kind of preparation method and workout system for the general COM code of enterprise
CN110457025B (en) * 2018-05-07 2022-04-12 腾讯科技(深圳)有限公司 Bar code display method and device, storage medium and electronic device
US10237583B1 (en) * 2018-08-17 2019-03-19 Begasp Llc. Execution of cases based on barcodes in video feeds
JP7202520B2 (en) * 2018-10-15 2023-01-12 京セラドキュメントソリューションズ株式会社 Image processing device
GB201918218D0 (en) 2019-12-11 2020-01-22 Maynooth Univ A method of decoding a codeword
US11431368B2 (en) 2020-03-16 2022-08-30 Massachusetts Institute Of Technology Noise recycling
US11870459B2 (en) 2020-06-08 2024-01-09 Massachusetts Institute Of Technology Universal guessing random additive noise decoding (GRAND) decoder
CN113596154B (en) * 2021-07-29 2023-12-05 深圳市玄羽科技有限公司 Intelligent Internet of things management and control platform based on big data and management and control method

Citations (15)

Publication number Priority date Publication date Assignee Title
US5262626A (en) 1989-12-06 1993-11-16 Symbol Technologies, Inc. Decoding bar codes from multiple scans using element replacement
US5278398A (en) 1990-01-05 1994-01-11 Symbol Technologies, Inc. Decoding bar code symbols by determining the best alignment of partial scans
US20050006479A1 (en) * 1998-06-12 2005-01-13 Symbol Technologies, Inc. Digitizing bar code symbol data
US20070211148A1 (en) 2000-08-28 2007-09-13 Yossi Lev System and method for providing added utility to a video camera
US20090001173A1 (en) 2007-06-28 2009-01-01 Sevier Mitchel P Bar code reading terminal with video capturing mode
US20090001170A1 (en) 2007-06-29 2009-01-01 Symbol Technologies, Inc. Imaging-Based Bar Code Reader with Image Stabilization
US20090108071A1 (en) 2007-10-31 2009-04-30 Symbol Technologies, Inc. Automatic Region of Interest Focusing for an Imaging-Based Bar Code Reader
US20090277962A1 (en) 2008-05-09 2009-11-12 Homeywell International Inc. Acquisition system for obtaining sharp barcode images despite motion
US20100189367A1 (en) 2009-01-27 2010-07-29 Apple Inc. Blurring based content recognizer
US20110068173A1 (en) 2009-09-24 2011-03-24 Ebay Inc. System and method for recognizing deformed linear barcodes from a stream of varied-focus video frames
US20120018518A1 (en) 2009-03-30 2012-01-26 Stroem Jacob Barcode processing
US20120104100A1 (en) 2009-07-20 2012-05-03 The Regents Of The University Of California, Santa Cruz Image-based barcode reader
US8260074B2 (en) * 2009-06-08 2012-09-04 National Chung Cheng University Apparatus and method for measuring depth and method for computing image defocus and blur status
US8284295B2 (en) * 2007-06-12 2012-10-09 Nikon Corporation Digital camera
US20120331140A1 (en) 2005-08-26 2012-12-27 Hand Held Products Inc Data collection device having dynamic access to multiple wireless networks

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US7442059B2 (en) * 2005-11-11 2008-10-28 Elma Eletronic Ag Apparatus and method for the insertion and withdrawal of plug-in modules

Patent Citations (22)

Publication number Priority date Publication date Assignee Title
US5262626A (en) 1989-12-06 1993-11-16 Symbol Technologies, Inc. Decoding bar codes from multiple scans using element replacement
US5278398A (en) 1990-01-05 1994-01-11 Symbol Technologies, Inc. Decoding bar code symbols by determining the best alignment of partial scans
US20050006479A1 (en) * 1998-06-12 2005-01-13 Symbol Technologies, Inc. Digitizing bar code symbol data
US20070211148A1 (en) 2000-08-28 2007-09-13 Yossi Lev System and method for providing added utility to a video camera
US20120331140A1 (en) 2005-08-26 2012-12-27 Hand Held Products Inc Data collection device having dynamic access to multiple wireless networks
US8284295B2 (en) * 2007-06-12 2012-10-09 Nikon Corporation Digital camera
US20090001173A1 (en) 2007-06-28 2009-01-01 Sevier Mitchel P Bar code reading terminal with video capturing mode
US20090001170A1 (en) 2007-06-29 2009-01-01 Symbol Technologies, Inc. Imaging-Based Bar Code Reader with Image Stabilization
US20090108071A1 (en) 2007-10-31 2009-04-30 Symbol Technologies, Inc. Automatic Region of Interest Focusing for an Imaging-Based Bar Code Reader
US20090277962A1 (en) 2008-05-09 2009-11-12 Homeywell International Inc. Acquisition system for obtaining sharp barcode images despite motion
US20100187311A1 (en) 2009-01-27 2010-07-29 Van Der Merwe Rudolph Blurring based content recognizer
US20100189367A1 (en) 2009-01-27 2010-07-29 Apple Inc. Blurring based content recognizer
US20120018518A1 (en) 2009-03-30 2012-01-26 Stroem Jacob Barcode processing
US8260074B2 (en) * 2009-06-08 2012-09-04 National Chung Cheng University Apparatus and method for measuring depth and method for computing image defocus and blur status
US20120104100A1 (en) 2009-07-20 2012-05-03 The Regents Of The University Of California, Santa Cruz Image-based barcode reader
US20110068175A1 (en) 2009-09-24 2011-03-24 Ebay Inc. System and method for estimation and classification of barcodes using heuristic and statistical measures
US20110068173A1 (en) 2009-09-24 2011-03-24 Ebay Inc. System and method for recognizing deformed linear barcodes from a stream of varied-focus video frames
US8851378B2 (en) 2009-09-24 2014-10-07 Ebay Inc. System and method for recognizing deformed linear barcodes from a stream of varied-focus video frames
US8851382B2 (en) 2009-09-24 2014-10-07 Ebay Inc. System and method for estimation and classification of barcodes using heuristic and statistical measures
US20150001295A1 (en) 2009-09-24 2015-01-01 Ebay Inc. System and method for estimation and classification of barcodes using heuristic and statistical measures
US20150001296A1 (en) 2009-09-24 2015-01-01 Ebay Inc. System and method for recognizing deformed linear barcodes from a stream of varied-focus video frames
US9117131B2 (en) 2009-09-24 2015-08-25 Ebay, Inc. System and method for recognizing deformed linear barcodes from a stream of varied focus video frames

Non-Patent Citations (33)

Title
"U.S. Appl. No. 12/885,155, Non Final Office Action dated Jan. 23, 2013", 8 pgs.
"U.S. Appl. No. 12/885,155, Non Final Office Action dated Sep. 17, 2013", 9 pgs.
"U.S. Appl. No. 12/885,155, Notice of Allowance dated Jun. 6, 2014", 7 pgs.
"U.S. Appl. No. 12/885,155, Response filed Apr. 23, 2013 to Non Final Office Action dated Jan. 23, 2013", 13 pgs.
"U.S. Appl. No. 12/885,155, Response filed Dec. 17, 2013 to Non Final Office Action dated Sep. 17, 2013", 14 pgs.
"U.S. Appl. No. 12/885,221, Response filed Jul. 1, 2013 to Non Final Office Action dated Apr. 1, 2013", 10 pgs.
"U.S. Appl. No. 12/885,221, Non Final Office Action dated Apr. 1, 2013", 7 pgs.
"U.S. Appl. No. 12/885,221, Notice of Allowance dated Jun. 9, 2014", 8 pgs.
"U.S. Appl. No. 12/885,221, Notice of Allowance dated Oct. 7, 2013", 11 pgs.
"U.S. Appl. No. 14/488,179, Final Office Action dated Apr. 10, 2015", 8 pgs.
"U.S. Appl. No. 14/488,179, Non Final Office Action dated Nov. 20, 2014", 9 pgs.
"U.S. Appl. No. 14/488,179, Non Final Office Action dated Oct. 7, 2014", 9 pgs.
"U.S. Appl. No. 14/488,179, Notice of Allowance dated Oct. 26, 2015", 10 pgs.
"U.S. Appl. No. 14/488,179, Preliminary Amendment filed Oct. 1, 2014", 7 pgs.
"U.S. Appl. No. 14/488,179, PTO Response to Rule 312 Communication dated Jan. 5, 2016", 2 pgs.
"U.S. Appl. No. 14/488,179, Response filed Feb. 20, 2015 to Non Final Office Action dated Nov. 20, 2014", 10 pgs.
"U.S. Appl. No. 14/488,179, Response filed Jun. 10, 2015 to Final Office Action dated Apr. 10, 2015", 8 pgs.
"U.S. Appl. No. 14/488,191, Non Final Office Action dated Nov. 24, 2014", 8 pgs.
"U.S. Appl. No. 14/488,191, Non Final Office Action dated Oct. 7, 2014", 9 pgs.
"U.S. Appl. No. 14/488,191, Notice of Allowance dated Apr. 15, 2015", 9 pgs.
"U.S. Appl. No. 14/488,191, Preliminary Amendment filed Oct. 1, 2014", 6 pgs.
"U.S. Appl. No. 14/488,191, Response filed Feb. 24, 2015 to Non Final Office Action dated Nov. 24, 2014", 9 pgs.
Adelmann, Robert, et al., "Toolkit for Bar Code Recognition and Resolving on Camera Phones—Jump Starting the Internet of Things", [Online]. Retrieved from the Internet: <URL: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.69.5519>, (2006), 8 pgs.
Chai, Douglas, et al., "Locating and Decoding EAN-13 Barcodes from Images Captured by Digital Cameras", IEEE, ICICS, (2005), 5 pgs.
Chien, Sky Chew Chee, "Mobile Phone Bar Code Reader", Thesis submitted to the School of Information Technology and Electrical Engineering at the University of Queensland, (Oct. 24, 2007), 69 pgs.
Gallo, Orazio, et al., "Reading Challenging Barcodes with Cameras", (Dec. 7, 2009), 6 pages.
Liyanage, J P, "Efficient Decoding of Blurred, Pitched, and Scratched Barcode Images", Second International Conference on Industrial and Information Systems, [Online]. Retrieved from the Internet: <URL: http://www.cs.ucf.edu/˜janaka/projects/barcode/barcode_paper.pdf>, (Aug. 2007), 6 pgs.
Ohbuchi, Eisaku, et al., "Barcode Readers using the Camera Device in Mobile Phones", IEEE, International Conference on Cyberworlds, (2004), 6 pgs.
Rohs, Michael, et al., "Using Camera-Equipped Mobile Phones for Interacting with Real-World Objects", Advances in Pervasive Computing, [Online]. Retrieved from the Internet: <URL: http://citeseer.ist.psu.edu/viewdoc/summary?doi=10.1.1.2.7195>, (2004), 7 pgs.
Tan, Keng T., "JPEG color barcode images analysis: A camera phone capture channel model with auto-focus", International Journal of Signal Processing, Image Processing and Pattern Recognition, vol. 2, No. 4, (Dec. 2009), 10 pgs.
Terebes, Romulus, et al., "Camera Phone Based Barcode Decoding System", ACTA Technica Napocensis, vol. 49, No. 3, (2008), pp. 57-62.
Wachenfeld, Steffen, et al., "Robust Recognition of 1-D Barcodes Using Camera Phones", IEEE, (2008), 4 pgs.

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200104559A1 (en) * 2009-09-24 2020-04-02 Ebay Inc. System and method for recognizing deformed linear barcodes from a stream of varied focus video frames
US11055505B2 (en) * 2009-09-24 2021-07-06 Ebay Inc. System and method for recognizing deformed linear barcodes from a stream of varied focus video frames

Also Published As

Publication number Publication date
US20150001295A1 (en) 2015-01-01
US20150363628A1 (en) 2015-12-17
US20200104559A1 (en) 2020-04-02
US8851382B2 (en) 2014-10-07
US9117131B2 (en) 2015-08-25
US8851378B2 (en) 2014-10-07
US20110068175A1 (en) 2011-03-24
US20150001296A1 (en) 2015-01-01
US20110068173A1 (en) 2011-03-24
US11055505B2 (en) 2021-07-06
US9275264B2 (en) 2016-03-01

Similar Documents

Publication Publication Date Title
US11055505B2 (en) System and method for recognizing deformed linear barcodes from a stream of varied focus video frames
Gallo et al. Reading 1D barcodes with mobile phones using deformable templates
US10891474B1 (en) Optical receipt processing
US7886978B2 (en) Techniques for decoding images of barcodes
Zamberletti et al. Robust angle invariant 1d barcode detection
US20050082370A1 (en) System and method for decoding barcodes using digital imaging techniques
US10740820B1 (en) Systems and methods for price searching via a mobile device reading display screen graphics
US8413903B1 (en) Decoding barcodes
US7593873B1 (en) Systems and methods for price searching and customer self-checkout using a mobile device
WO2012075608A1 (en) Indicia encoding system with integrated purchase and payment information
US9027833B2 (en) Commodity information display apparatus and method for displaying commodity information in different forms
WO2003001435A1 (en) Image based object identification
US20160314560A1 (en) Image displaying method, apparatus, and device, and computer program product
CN103336938A (en) Recognition method based one-dimensional bar code image
CN113627411A (en) Super-resolution-based commodity identification and price matching method and system
CN103034830B (en) Bar code decoding method and device
US20130256398A1 (en) Method and system to selectively process a code
CN111797642B (en) Bar code identification method and terminal
Adelmann Mobile phone based interaction with everyday products-on the go
US8061602B1 (en) Systems and methods for price searching on a mobile device
US9195874B2 (en) Apparatus and method for recognizing barcode
US20240112361A1 (en) Product volumetric assessment using bi-optic scanner
Patil et al. A Survey on PiCode: Picture-Embedding 2D Barcode
CN112288438A (en) Convenient payment method and system for electronic commerce
CN111598975A (en) Electronic tag image rendering method and related device

Legal Events

Date Code Title Description
AS Assignment

Owner name: EBAY, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:POWERS, JEFFREY ROGER;REDDY, VIKAS MUPPIDDI;REEL/FRAME:037353/0120

Effective date: 20100917

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE

CC Certificate of correction
MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4