US20190259048A1 - Systems and methods for determining measurement data of an item - Google Patents

Systems and methods for determining measurement data of an item

Info

Publication number
US20190259048A1
US20190259048A1
Authority
US
United States
Prior art keywords
item
volume
average
determining
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/394,018
Inventor
Christopher Joseph Hendrick
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Walmart Apollo LLC
Original Assignee
Walmart Apollo LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Walmart Apollo LLC filed Critical Walmart Apollo LLC
Priority to US16/394,018
Publication of US20190259048A1
Assigned to WAL-MART STORES, INC. reassignment WAL-MART STORES, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HENDRICK, CHRISTOPHER JOSEPH
Assigned to WALMART APOLLO, LLC reassignment WALMART APOLLO, LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: WAL-MART STORES, INC.
Abandoned legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00 - Commerce
    • G06Q 30/02 - Marketing; Price estimation or determination; Fundraising
    • G06Q 30/0207 - Discounts or incentives, e.g. coupons or rebates
    • A - HUMAN NECESSITIES
    • A47 - FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47F - SPECIAL FURNITURE, FITTINGS, OR ACCESSORIES FOR SHOPS, STOREHOUSES, BARS, RESTAURANTS OR THE LIKE; PAYING COUNTERS
    • A47F 9/00 - Shop, bar, bank or like counters
    • A47F 9/02 - Paying counters
    • A47F 9/04 - Check-out counters, e.g. for self-service stores
    • A47F 9/046 - Arrangement of recording means in or on check-out counters
    • A47F 9/047 - Arrangement of recording means in or on check-out counters for recording self-service articles without cashier or assistant
    • A47F 9/048 - Arrangement of recording means in or on check-out counters for recording self-service articles without cashier or assistant automatically
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 20/00 - Payment architectures, schemes or protocols
    • G06Q 20/08 - Payment architectures
    • G06Q 20/20 - Point-of-sale [POS] network systems
    • G06Q 20/208 - Input by product or record sensing, e.g. weighing or scanner processing
    • G - PHYSICS
    • G07 - CHECKING-DEVICES
    • G07G - REGISTERING THE RECEIPT OF CASH, VALUABLES, OR TOKENS
    • G07G 1/00 - Cash registers
    • G - PHYSICS
    • G07 - CHECKING-DEVICES
    • G07G - REGISTERING THE RECEIPT OF CASH, VALUABLES, OR TOKENS
    • G07G 1/00 - Cash registers
    • G07G 1/0036 - Checkout procedures
    • G07G 1/0045 - Checkout procedures with a code reader for reading of an identifying code of the article to be registered, e.g. barcode reader or radio-frequency identity [RFID] reader
    • G07G 1/0054 - Checkout procedures with a code reader for reading of an identifying code of the article to be registered, e.g. barcode reader or radio-frequency identity [RFID] reader with control of supplementary check-parameters, e.g. weight or number of articles
    • G07G 1/0063 - Checkout procedures with a code reader for reading of an identifying code of the article to be registered, e.g. barcode reader or radio-frequency identity [RFID] reader with control of supplementary check-parameters, e.g. weight or number of articles with means for detecting the geometric dimensions of the article of which the code is read, such as its size or height, for the verification of the registration
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01G - WEIGHING
    • G01G 17/00 - Apparatus for or methods of weighing material of special form or property

Definitions

  • the present concepts relate generally to the measurement of items for sale, and more specifically, to a system and method for determining a volume and weight of a three-dimensional item from an image of the item.
  • Mobile electronic devices such as smartphones are increasingly being used in the store shopping experience. For example, modern smartphones can store shopping lists, provide electronic coupons that can be scanned at the phone display, and display advertisements when the shopper is near a store item.
  • a method for determining a weight of an item for purchase comprises determining from a digital image of an item generated by a single camera a volume of the item, the single camera separated from the item at a distance in a horizontal and vertical direction established by a stationary rigid fixture; identifying the item; accessing a database to determine an average weight per volume measurement unit of the identified item; and calculating a weight of the item from the volume and the average weight per volume measurement unit.
  • the item is a store produce item having an irregular shape.
  • determining the volume of the item comprises: determining an average area of the item; determining an average depth of the item; and determining an average volume by multiplying the average area by the average depth.
  • the volume is approximated within two standard deviations using three different sub-calculations.
  • determining the average area includes employing at least two mirrors to calculate multiple areas of the item.
  • the multiple areas of the item include: a first area of a front view of the image; a second area of a right reflective view of the image; and a third area of a left reflective view of the image.
  • the average depth is determined from an average of pixel widths of the front view, right reflective view, and left reflective view.
  • the height is determined by multiplying an apparent height and a factor of a distance of the item from the camera and a pitch of the camera to the item.
  • determining a height of the item includes determining a resolution of the camera, and determining a correction ratio of pixel to millimeters.
  • determining the volume further comprises calculating a distance from the camera to the item by locking a focal length of the camera to a particular distance, providing a constant distance that serves as the basis for all other calculations.
  • the item is identified systematically, optically, or manually.
  • the method further comprises positioning the item on an apparatus having a base and a fixture; positioning the camera on the fixture a predetermined distance from the item; and generating the digital image of the item.
  • a system for determining a weight of an item for purchase comprises a single camera that generates a digital image of the item; at least two mirrors for determining an average volume of the item; an optical recognition device that identifies the item; a memory that stores an average weight per volume measurement unit of the identified item; and a special-purpose processor that calculates an estimated weight of the item from the approximate volume and the average weight per volume measurement unit.
  • the at least two mirrors are arranged to provide a front view, a right view, and a left view of the image.
  • the system further comprises a volume calculator that determines an area of the item from each of the front view, the right view, and the left view of the image, and determines an average of the three areas.
  • the volume calculator determines an average depth of the image for determining the volume.
  • the system further comprises an optical recognition device that identifies the produce item, and wherein the weight of the produce item is determined from a result of the optical recognition device and the volume.
  • a method for determining a volume and weight of a produce item comprises extrapolating a length, width, and depth of a produce item from a two-dimensional image of the produce item; determining a volume from a combination of the length, width, and depth of the produce item; and determining a weight of the item from the volume.
  • FIG. 1 is an illustrative side view of a system for measuring an item, in accordance with some embodiments.
  • FIG. 2 is a block diagram of elements of the system of FIG. 1 , in accordance with some embodiments.
  • FIG. 3 is a flow diagram of a method for determining measurement data of an item, in accordance with some embodiments.
  • FIGS. 4A-4C are illustrative views of the system of FIG. 1 where an image of an item is captured at different vantage points for determining multiple areas of the item, in accordance with some embodiments.
  • FIGS. 5A-5C are front, left, and right views of the item shown in FIGS. 4A-4C , in accordance with some embodiments.
  • Embodiments of the present concepts relate to the use of an electronic device such as a smartphone for determining a volume, weight, or other measurement of an item, for example, a produce item such as a bundle of bananas, or other store items that require such measurement data in order to determine a value of the item.
  • the user's electronic device can be used to determine a volume and weight of a three-dimensional store item from an image of the item. More specifically, systems and methods are provided that determine a volume by extrapolating the length, width, and depth of irregularly shaped items from a two-dimensional image. Conventional approaches require a scale or other physical device for determining the weight of such items. Conventional scanning applications on smartphones or the like are not currently practical with certain store items, such as produce items, which must first be weighed to determine a sale price of the item.
  • a typical application may include a store customer downloading an application onto a smartphone or other mobile electronic device having a camera, placing a store item of interest on an apparatus having a base and a fixture, placing the smartphone on the fixture a predetermined distance from the item, taking a picture of the item from the camera on the smartphone, and receiving a display of an appropriate weight and cost of the item.
  • FIG. 1 is an illustrative side view of a system 100 for measuring an item, in accordance with some embodiments.
  • the system 100 includes a mobile electronic device 102 , a mirror set 104 , and a processor 106 , which permit a user to take a digital image, i.e., picture, of an item 12 or object, and calculate from the digital image the weight of the item 12 .
  • Although a mobile electronic device is referred to herein, other electronic devices may equally apply that are not necessarily mobile, but may be stationary.
  • Although mirrors are referred to herein, alternative devices may equally apply, such as video screens or other elements that can capture or provide different views of items of interest.
  • the mobile electronic device 102 includes a camera 202 or related sensor for detecting the item 12 and encoding digital images of the item 12 .
  • the camera 202 can be a digital camera, camera phone, tablet, electronic notebook, laptop computer, or other related device having a camera or related charge coupled device (CCD) or complementary metal oxide semiconductor (CMOS) device for collecting emitted light related to the item 12 .
  • the system 100 can include a fixture 108 that attaches to the camera 202 , or as shown in FIG. 1 , to the mobile electronic device 102 housing the camera 202 .
  • the fixture 108 can extend from a base 110 , or from a stationary object such as a tabletop or floor on which the item of interest 12 , for example, a bunch of bananas, is positioned and one or more digital images taken by the camera 202 .
  • the mirror set 104 can be positioned on the base 110 or other stationary object at a predetermined distance from the fixture 108 and camera 202 , which permits the item of interest 12 , for example, shown in FIG. 4 as a produce item, to be positioned in the field of view of the camera 202 so that a single image can be taken of the item 12 for determining its volume.
  • the item 12 of interest may be positioned between the reflective surface of the mirror set 104 and the camera 202 .
  • a single image is taken by the camera 202 that may capture multiple angles by way of the mirrors 104 .
  • the sides and back of the item 12 may be visible in a single image taken by a camera facing a front of the item 12 .
  • FIGS. 4A-4C and 5A-5C show the three sides, illustrating that three different views may be provided, e.g., front, left, and right views respectively, which may be used for performing volume calculations according to some embodiments.
  • the processor 106 determines an estimated weight of a produce item based on its volume, which in turn is determined by estimating surface areas of the produce item 12 from images taken by the mobile electronic device 102 .
  • the processor 106 may execute an application stored in memory (not shown) that determines the resolution capability of the camera 202 , for example, by communicating with the firmware of the device it is loaded on.
  • the processor 106 can determine, from the given pixel density and the known distance from the item 12 , the size each pixel represents, for example, in millimeters.
  • the processor 106 may then measure the number of pixels in each view of the item (e.g., front, right, and left views shown in FIGS. 4A-4C and 5A-5C ) and calculate the area in pixels for that view, e.g., areas a 1 , a 2 , and a 3 shown in FIGS. 4A-4C, 5A-5C respectively.
  • the processor 106 may then multiply the number of pixels by the calculated amount of pixels per millimeter to determine the number of square millimeters the pixels represent.
  • the foregoing steps may be performed for each view, for example, shown in FIGS. 4A-4C ; the areas are then summed together and divided by the number of areas, e.g., (a1+a2+a3)/3, to determine the average area for the item 12 .
  • the processor 106 may then count the number of pixels along the average horizontal axis of the three sides and use that number as the depth of the item and then multiply the depth by the average area to determine the volume, for example, in cubic millimeters.
  • the processor 106 may then access a database of known weights per cubic millimeter for a set of items, select the item 12 from the stored item data, and multiply known weight value by the calculated volume to determine an estimated weight of the item 12 shown in the image.
  • the processor 106 may multiply the weight of the item 12 by the retail price per pound of the item 12 to determine the total cost of the item 12 .
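  • The sequence above (pixels per millimeter, per-view areas, average area, average depth, volume, and weight) can be sketched as follows. The function name, its parameters, and the per-pixel scale factors are illustrative assumptions rather than part of the disclosure:

```python
# Illustrative sketch of the measurement pipeline described above.
# All names and per-pixel scale factors are demonstration assumptions.

def estimate_weight_lb(view_pixel_areas, avg_depth_px,
                       sq_mm_per_pixel, mm_per_pixel, lb_per_mm3):
    """Estimate item weight from pixel measurements of three views."""
    # Convert each view's pixel area to square millimeters, then average.
    areas_mm2 = [a * sq_mm_per_pixel for a in view_pixel_areas]
    avg_area_mm2 = sum(areas_mm2) / len(areas_mm2)

    # Convert the average depth from pixels to millimeters.
    depth_mm = avg_depth_px * mm_per_pixel

    # Volume in cubic millimeters, then weight from the stored
    # average weight per cubic millimeter for the identified item.
    volume_mm3 = avg_area_mm2 * depth_mm
    return volume_mm3 * lb_per_mm3

# Values from the banana example in the text: areas a1-a3 in pixels,
# an average depth of (1600 + 1500 + 1800)/3 pixels, 0.25 sq. mm and
# 0.25 mm per pixel, and 0.00000324 lb per cubic millimeter.
weight = estimate_weight_lb([15_000, 14_500, 18_000], 4_900 / 3,
                            0.25, 0.25, 0.00000324)
print(round(weight, 2))  # approximately 5.24 lb
```

The total cost would then follow by multiplying the returned weight by the retail price per pound, as the text describes.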
  • FIG. 2 is a block diagram of elements of the system of FIG. 1 , in accordance with some embodiments.
  • the mobile electronic device 102 can include optical recognition technology 204 and a memory 206 . Some or all of these elements may be at the mobile electronic device 102 . In other embodiments, some of these elements may be located externally from the mobile electronic device 102 and communicate with each other and/or other elements of the system via a network or other communication path, for example, wired or wireless signals.
  • the optical recognition technology 204 is constructed to instantly recognize objects such as the item 12 .
  • the mobile electronic device 102 can include a processor (not shown) and memory 206 . The memory 206 can store one or more applications 208 executable by the device processor.
  • One application 208 may include recognition software that works with the camera 202 and/or optical recognition technology 204 or related device for identifying the item 12 .
  • a standalone optical recognition device may be provided, which is physically separate from the mobile electronic device 102 .
  • Data related to the identified item 12 may be compared to contents of a database 220 to determine the identity of the item 12 , and other information such as the item's weight per cubic millimeter.
  • An accuracy value can be generated, for example, by a computer processor, in response to using optical recognition to identify the item 12 .
  • the accuracy value may be used for any number of purposes, for example, compared to a threshold value to establish a degree of confidence that the item 12 is correctly identified.
  • a product code such as a PLU number may be provided to identify the item when the accuracy value is greater than a predetermined threshold.
  • the identification of the item 12 could be obtained through optical recognition, a manually input number such as a PLU, a scanned barcode, QR code, RFID, or any unique identifier for the item 12 .
  • the item 12 can be identified systematically, optically, or manually.
  • the processor 106 can include a volume calculator 212 , a weight calculator 214 , and a database interface 216 . Some or all of these elements may be physically collocated at the processor 106 , for example, stored at a common memory and executed by a hardware processor. In other embodiments, at least some of these elements may be located externally from the processor 106 , for example, located at the mobile device 102 or standalone computer, and communicate with each other and/or other elements of the system via a network or other communication path, for example, wired or wireless signals.
  • the volume calculator 212 calculates an approximate volume of the item 12 from a depth, width, and height of the item 12 .
  • the volume is calculated based on pixel density instead of length, for example, millimeters or inches.
  • the volume can be approximated within two standard deviations.
  • the volume is determined by calculating the average width or depth of the item 12 being photographed, and multiplying it by the average area of the item 12 .
  • the areas of multiple sides of the bananas may be determined using the mirror set 104 to compensate for differences at each vantage point.
  • the weight calculator 214 determines an approximate weight of the item 12 from the calculated volume and average weight per volume, for example, weight per cubic millimeter determined from results retrieved from the database 220 , using the results generated by the optical recognition device 204 .
  • the database interface 216 establishes a communication between the database 220 and the weight calculator 214 .
  • FIG. 3 is a flow diagram of a method for determining measurement data of an item, in accordance with some embodiments.
  • the camera 202 generates a digitized image of the item 12 , which can be stored at a memory on the mobile electronic device 102 or at a remote location.
  • the three views are essential for measuring irregular shapes; determining an average area compensates for the extremes that may be encountered on objects having both convex and concave features.
  • a volume of the item 12 is determined from the digital image generated at block 302 , for example, by multiplying the average area by the depth of the item. Due to differing capabilities of camera resolution, the volume can be determined based on pixel density or related measurement, for example, by the processor 106 described in FIG. 1 . In doing so, the distance between the camera 202 and the item 12 may be calculated. This can be accomplished by locking the camera focal length to a specific distance, namely, a distance to the item 12 . For example, the focal length can be locked to 1 meter, whereby the camera 202 can only have items in the field of view that are 1 meter away in focus.
  • the camera 202 can capture images of the item 12 at a predetermined distance.
  • By setting the focal length, the application will always have that distance as a known constant, so it can calculate all the other measurements as a function of its distance from the item 12 .
  • the fixture 108 , separated from the mirror set 104 by a predetermined distance, also allows the distance between the camera 202 at the fixture 108 and the item 12 , positioned between the fixture 108 and the mirror set 104 , to be determined, so that the camera is sufficiently focused.
  • the fixture 108 and camera mount are set at specific distances, so regardless of the device being used or its camera resolution, the application will always know exactly how far away the measured item 12 is.
  • the height of the item 12 may be determined.
  • the volume calculator 212 or a phone application 208 can determine the resolution or related feature of the camera 202 when taking a picture of the item 12 . This information may be determined from BIOS information stored at the camera 202 . Once the volume calculator 212 receives data regarding camera resolution and/or the related camera-related information, the volume calculator 212 can determine a correction ratio of pixels to square millimeters for determining an area of each view.
  • the fixture 108 can be used to hold the camera 202 at a correct angle and distance from the item, allowing constants, for example, the distance between the camera and the item being photographed, as well as the angle at which the picture is captured (to ensure an acceptable view of both mirror reflections is provided), to be factored into the calculation of the item volume.
  • a height may be determined by multiplying an apparent height and a factor of a distance of the item from the camera and a pitch of the camera to the item.
  • the distance between the camera 202 on the fixture 108 and the item 12 can be determined to be 1 m.
  • the camera 202 may be determined to be a 10-megapixel camera whose field of view at 1 m is 2.5 m high × 1 m wide.
  • each pixel can be calculated to represent 0.25 square millimeters (sq. mm), for example, using the well-known calculation of Eq. (1): 2.5 million sq. mm/10 million pixels.
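  • As a check of the figures above: a 2.5 m × 1 m field of view spans 2.5 million square millimeters, which divided across 10 million pixels gives 0.25 sq. mm per pixel:

```python
# Hypothetical check of the pixel-size calculation: a 2.5 m x 1 m
# field of view at the locked 1 m focal distance, 10-megapixel camera.
fov_height_mm = 2_500   # 2.5 m
fov_width_mm = 1_000    # 1 m
pixels = 10_000_000     # 10 megapixels

sq_mm_per_pixel = (fov_height_mm * fov_width_mm) / pixels
print(sq_mm_per_pixel)  # 0.25
```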
  • h is the apparent height
  • d is the distance of the object
  • a is the actual size of the object.
  • the apparent height may be multiplied by the distance the object is from the camera 202 .
  • Apparent height is the observable relative height of an item at various distances; the apparent height appears smaller the further away the item is. The true height, on the other hand, is the actual physical height of an item and is not relative to how far away it is.
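  • The relation suggested above, recovering actual size from apparent height and distance, can be sketched as a simple linear model; the form a = h × d is an assumption, since the text does not state the formula explicitly:

```python
# Sketch of the apparent-height relation described above: the actual
# size (a) is taken as the apparent height (h) scaled by the distance
# (d) from the camera. The linear form is an assumption.
def actual_size(apparent_height, distance):
    return apparent_height * distance

print(actual_size(3, 100))  # 300
```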
  • the area of at least three sides or other surfaces are determined to compensate for differences of each vantage point.
  • the two mirrors 104 A, 104 B of the mirror set 104 are arranged to provide additional views of the item 12 to the camera 202 , and allows the volume calculator 212 to calculate the visible area of the other views provided by reflected images of the other views.
  • the angle of the captured image allows for the camera 202 to see the whole front of the item 12 , and both reflections in both mirrors 104 .
  • the vantage points may include (1) a front facing image to determine an area of the item 12 from this vantage point, (2) a right facing reflection taken by a first mirror 104 A to determine an area from this vantage point, and (3) a left facing reflection taken by a second mirror 104 B to determine an area from this vantage point.
  • the mirror set 104 A, B (generally, 104 ) is employed on the base 110 , from a center of the fixture 108 behind the item 12 being measured.
  • the system then parses out the front view, right facing reflection, and left facing reflection to determine the area for each side. More specifically, the system may parse the x, y, and z axis from the single image of the item 12 .
  • the system excludes the reflected images by the mirrors 104 as part of the area (a 1 ) calculation.
  • the image of the item reflected by the right mirror 104 A is considered for calculating area (a 2 ).
  • the image of the item reflected by the left mirror 104 B is considered for calculating area (a 3 ).
  • the volume calculator 212 calculates an area of each of the three views, i.e., the front view facing the camera 202 and the two views provided by the mirrors 104 , and calculates an average of the three views to determine the volume of the item 12 .
  • the system determines the width of each view of the item 12 and determines an average width to be used as the depth. The average area is multiplied by the average depth to calculate the volume.
  • the volume calculator 212 calculates the area of each of the multiple views by pixel density, then averages the three calculated areas together to determine an overall average area of the item which takes into account the irregular shape of the items.
  • a front view of the bundle of bananas shown in FIGS. 4A and 5A comprises an area (a 1 ) having 15,000 pixels.
  • a second view shown in FIGS. 4B and 5B comprises an area (a 2 ) having 14,500 pixels.
  • a third view shown in FIGS. 4C and 5C comprises an area (a 3 ) having 18,000 pixels.
  • the average area of the pixels of the three viewed areas is 15,833 pixels, each pixel having a size of 0.25 sq. mm., as established above, or 3958 sq. mm.
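  • The average-area arithmetic above can be reproduced directly from the stated pixel counts:

```python
# Worked check of the average-area example (values from the text).
areas_px = [15_000, 14_500, 18_000]        # a1, a2, a3 in pixels
avg_px = sum(areas_px) / len(areas_px)     # about 15,833 pixels
avg_area_mm2 = avg_px * 0.25               # 0.25 sq. mm per pixel
print(round(avg_px), round(avg_area_mm2))  # 15833 3958
```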
  • the volume calculator 212 also calculates a depth of the item 12 . Since the item 12 has varying depths at different heights, an average depth is calculated with its corresponding average width based on the average width of each side. The depth is determined by perspective views, for example, shown in FIGS. 4A-4C, 5A-5C . For example, a front view of an image of an item has a width and height. If the item is rotated 90 degrees, then the width shown from the front view is the depth of the rotated item. The processor 106 calculates the average width (w) from the three perspective views (front, left, right), thereby determining the effective depth of the item 12 to use in the volume calculation.
  • w average width
  • the volume calculator 212 determines the weighted average width on each of the three image views, for example, referring again to FIGS. 5A-C , by measuring the number of pixels it takes to span the width (w) at one or more sections, or slices, of the image views, then calculating an average width of each section (p), or slice.
  • FIGS. 5A-5C illustrate the measurement of width, and the manner in which this is achieved is by performing a calculation that averages the pixel length at each slice of width of the item image.
  • because the measurement is a calculation of the average width, the unit of measurement may be in pixels (instead of mm or inches).
  • a section may be, on average, 1600 pixels wide, or 400 mm (1600 pixels ⁇ 0.25 mm/pixel).
  • a section may be, on average, 1500 pixels wide, or 375 mm (1500 pixels ⁇ 0.25 mm/pixel).
  • a section may be, on average, 1800 pixels wide, or 450 mm (1800 pixels ⁇ 0.25 mm/pixel).
  • the volume of the image of the item 12 is calculated by multiplying the average area by the average depth. In the example of FIGS. 4A-4C , the average depth is (400+375+450)/3, or approximately 408.33 mm, so the volume is approximately 3,958 sq. mm × 408.33 mm, or roughly 1.62 million cubic millimeters.
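  • Completing the example numerically, with the average area taken from the earlier area calculation and the section widths as stated above:

```python
# Average the three section widths to get the average depth, then
# multiply by the average area to get the volume (values from the text).
widths_mm = [400, 375, 450]                     # front, left, right
avg_depth_mm = sum(widths_mm) / len(widths_mm)  # about 408.33 mm
avg_area_mm2 = 3_958.33                         # from the area example
volume_mm3 = avg_area_mm2 * avg_depth_mm
print(round(volume_mm3))  # 1616318 cubic millimeters
```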
  • the item 12 is identified, for example, by the optical recognition device 204 .
  • stored at the database 220 may include a directory of items and a predetermined average weight per cubic millimeter for each item. This information is stored in the database 220 , and at block 308 a database record can be accessed for the known weight per cubic millimeter of bananas. In this example, the average cubic millimeter weight of a banana is 0.00000324 pounds per cubic millimeter.
  • the weight of the bananas can be determined.
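  • Finishing the example, the weight follows from the volume computed from the example figures above and the stored per-unit weight:

```python
# Weight estimate: calculated volume times the database's average
# weight per cubic millimeter for bananas (per-unit value from text).
volume_mm3 = 1_616_318       # computed from the example figures above
lb_per_mm3 = 0.00000324      # stored average weight for bananas
weight_lb = volume_mm3 * lb_per_mm3
print(round(weight_lb, 2))  # about 5.24 lb
```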
  • FIG. 5 is intended to show the measurement of width, which is achieved by averaging the pixel length at each slice of the width of the item.
  • suppose, for example, a camera produces an image with 1080 pixels along the horizontal axis and 720 pixels along the vertical axis.
  • the application would examine each of the 720 rows (slices) to determine whether the item is present in that row; if it is, the application would calculate how many of the 1080 horizontal pixels (the width) are occupied by the item. It would then average all of the widths together to arrive at the average width.
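  • The row-by-row procedure above can be sketched as follows; the binary mask representing the item (1 = item pixel) is an assumption standing in for a real segmented image:

```python
# Sketch of the row-slice width averaging described above. Each row of
# the mask is one slice; rows where the item appears contribute their
# pixel width, and those widths are averaged.
def average_width_px(mask):
    widths = []
    for row in mask:
        w = sum(row)           # item pixels occupied in this slice
        if w > 0:              # skip rows where the item is absent
            widths.append(w)
    return sum(widths) / len(widths) if widths else 0.0

# Toy 4x6 mask: the item spans widths 3, 4, and 5 across three rows.
mask = [
    [0, 0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0, 0],
    [1, 1, 1, 1, 0, 0],
    [1, 1, 1, 1, 1, 0],
]
print(average_width_px(mask))  # 4.0
```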
  • aspects may be embodied as a device, system, method, or computer program product. Accordingly, aspects may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, aspects may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
  • Computer program code for carrying out operations for the concepts may be written in any combination of one or more programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • the computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, cloud-based infrastructure architecture, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.

Abstract

Provided are a system and method for determining a weight of an item for purchase, comprising determining from a digital image of an item generated by a single camera, a volume of the item, the single camera separated from the item at a distance in a horizontal and vertical direction established by a stationary rigid fixture; identifying the item; accessing a database to determine an average weight per volume measurement unit of the identified item; and calculating a weight of the item from the volume and the average weight per volume measurement unit.

Description

    RELATED APPLICATIONS
  • This application is a divisional application of U.S. patent application Ser. No. 15/204,206 filed on Jul. 7, 2016 entitled “Methods for Determining Measurement Data of an Item,” which claims priority to U.S. Provisional Patent Application Ser. No. 62/197,657, filed on Jul. 28, 2015 entitled “Systems and Methods for Determining Measurement Data of an Item,” the entirety of each of which is incorporated by reference herein.
  • FIELD
  • The present concepts relate generally to the measurement of items for sale, and more specifically, to a system and method for determining a volume and weight of a three-dimensional item from an image of the item.
  • BACKGROUND
  • Mobile electronic devices such as smartphones are increasingly being used in the store shopping experience. For example, modern smartphones can store shopping lists, provide electronic coupons that can be scanned at the phone display, and display advertisements when the shopper is near a store item.
  • BRIEF SUMMARY
  • In one aspect, a method for determining a weight of an item for purchase, comprises determining from a digital image of an item generated by a single camera a volume of the item, the single camera separated from the item at a distance in a horizontal and vertical direction established by a stationary rigid fixture; identifying the item; accessing a database to determine an average weight per volume measurement unit of the identified item; and calculating a weight of the item from the volume and the average weight per volume measurement unit.
  • In some embodiments, the item is a store produce item having an irregular shape.
  • In some embodiments, determining the volume of the item comprises: determining an average area of the item; determining an average depth of the item; and determining an average volume by multiplying the average area by the average depth.
  • In some embodiments, the volume is approximated within two standard deviations using three different sub-calculations.
  • In some embodiments, determining the average area includes employing at least two mirrors to calculate multiple areas of the item.
  • In some embodiments, the multiple areas of the item include: a first area of a front view of the image; a second area of a right reflective view of the image; and a third area of a left reflective view of the image.
  • In some embodiments, the average depth is determined from an average of pixel widths of the front view, right reflective view, and left reflective view.
  • In some embodiments, the height is determined by multiplying an apparent height by a factor of a distance of the item from the camera and a pitch of the camera to the item.
  • In some embodiments, determining a height of the item includes determining a resolution of the camera, and determining a correction ratio of pixel to millimeters.
  • In some embodiments, determining the volume further comprises calculating a distance from the camera to the item by locking a focal length of the camera to a particular distance to provide a constant distance that serves as the basis for all other calculations.
  • In some embodiments, the item is identified systematically, optically, or manually.
  • In some embodiments, the method further comprises positioning the item on an apparatus having a base and a fixture; positioning the camera on the fixture a predetermined distance from the item; and generating the digital image of the item.
  • In another aspect, a system for determining a weight of an item for purchase, comprises a single camera that generates a digital image of the item; at least two mirrors for determining an average volume of the item; an optical recognition device that identifies the item; a memory that stores an average weight per volume measurement unit of the identified item; and a special-purpose processor that calculates an estimated weight of the item from the approximate volume and the average weight per volume measurement unit.
  • In some embodiments, the at least two mirrors are arranged to provide a front view, a right view, and a left view of the image.
  • In some embodiments, the system further comprises a volume calculator that determines an area of the item from each of the front view, the right view, and the left view of the image, and determines an average of the three areas.
  • In some embodiments, the volume calculator determines an average depth of the image for determining the volume.
  • In some embodiments, the system further comprises an optical recognition device that identifies the produce item, and wherein the weight of the produce item is determined from a result of the optical recognition device and the volume.
  • In another aspect, a method for determining a volume and weight of a produce item, comprises extrapolating a length, width, and depth of a produce item from a two-dimensional image of the produce item; determining a volume from a combination of the length, width, and depth of the produce item; and determining a weight of the item from the volume.
  • BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
  • The above and further advantages of this invention may be better understood by referring to the following description in conjunction with the accompanying drawings, in which like numerals indicate like structural elements and features in various figures. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the invention.
  • FIG. 1 is an illustrative side view of a system for measuring an item, in accordance with some embodiments.
  • FIG. 2 is a block diagram of elements of the system of FIG. 1, in accordance with some embodiments.
  • FIG. 3 is a flow diagram of a method for determining measurement data of an item, in accordance with some embodiments.
  • FIGS. 4A-4C are illustrative views of the system of FIG. 1 where an image of an item is captured at different vantage points for determining multiple areas of the item, in accordance with some embodiments.
  • FIGS. 5A-5C are front, left, and right views of the item shown in FIGS. 4A-4C, in accordance with some embodiments.
  • DETAILED DESCRIPTION
  • In the following description, specific details are set forth although it should be appreciated by one of ordinary skill in the art that the systems and methods can be practiced without at least some of the details. In some instances, known features or processes are not described in detail so as to not obscure the present invention.
  • Embodiments of the present concepts relate to the use of an electronic device such as a smartphone for determining a volume, weight, or other measurement of an item, for example, a produce item such as a bundle of bananas, or other store items that require such measurement data in order to determine a value of the item. The user's electronic device can be used to determine a volume and weight of a three-dimensional store item from an image of the item. More specifically, systems and methods are provided that determine a volume by extrapolating the length, width, and depth of irregularly shaped items from a two-dimensional image. Conventional approaches require a scale or other physical device for determining the weight of such items. Conventional scanning applications on smartphones or the like are not currently practical with certain store items such as produce items, which must first be weighed to determine a sale price of the item.
  • A typical application may include a store customer downloading an application onto a smartphone or other mobile electronic device having a camera, placing a store item of interest on an apparatus having a base and a fixture, placing the smartphone on the fixture a predetermined distance from the item, taking a picture of the item from the camera on the smartphone, and receiving a display of an appropriate weight and cost of the item.
  • FIG. 1 is an illustrative side view of a system 100 for measuring an item, in accordance with some embodiments.
  • The system 100 includes a mobile electronic device 102, a mirror set 104, and a processor 106, which permit a user to take a digital image, i.e., picture, of an item 12 or object, and calculate from the digital image the weight of the item 12. Although a mobile electronic device is referred to herein, other electronic devices may equally apply that are not necessarily mobile, but may be stationary. Also, although mirrors are referred to herein, alternative devices may equally apply such as video screens or other elements that can capture or provide different views of items of interest.
  • As shown in FIG. 2, the mobile electronic device 102 includes a camera 202 or related sensor for detecting the item 12 and encoding digital images of the item 12. The camera 202 can be a digital camera, camera phone, tablet, electronic notebook, laptop computer, or other related device having a camera or related charge-coupled device (CCD) or complementary metal oxide semiconductor (CMOS) device for collecting emitted light related to the item 12.
  • The system 100 can include a fixture 108 that attaches to the camera 202, or as shown in FIG. 1, to the mobile electronic device 102 housing the camera 202. The fixture 108 can extend from a base 110, or from a stationary object such as a tabletop or floor, on which the item of interest 12, for example, a bunch of bananas, is positioned while one or more digital images are taken by the camera 202. The mirror set 104 can be positioned on the base 110 or other stationary object at a predetermined distance from the fixture 108 and camera 202, which permits the item of interest 12, for example, shown in FIGS. 4A-4C as a produce item, to be positioned in the field of view of the camera 202 so that a single image can be taken of the item 12 for determining its volume. The item 12 of interest may be positioned between the reflective surfaces of the mirror set 104 and the camera 202. A single image taken by the camera 202 may capture multiple angles by way of the mirrors 104. For example, by arranging the mirror set 104 about a periphery of the item 12, the sides and back of the item 12 may be visible in a single image taken by a camera facing a front of the item 12. As shown in FIGS. 4A-4C and 5A-5C, three different views may be provided, e.g., front, left, and right views respectively, which may be used for performing volume calculations according to some embodiments.
  • The processor 106 determines an estimated weight of a produce item based on its volume, which in turn is determined by estimating surface areas of the produce item 12 from images taken by the mobile electronic device 102.
  • To achieve this, the processor 106 may execute an application stored in memory (not shown) that determines the resolution capability of the camera 202, for example, by communicating with the firmware of the device on which it is loaded. From the given pixel density and the known distance from the item 12, the processor 106 can determine the size, for example, how many millimeters, each pixel represents. The processor 106 may then measure the number of pixels in each view of the item (e.g., the front, right, and left views shown in FIGS. 4A-4C and 5A-5C) and calculate the area in pixels for that view, e.g., areas a1, a2, and a3 shown in FIGS. 4A-4C and 5A-5C, respectively. The processor 106 may then multiply the number of pixels by the calculated amount of square millimeters per pixel to determine the number of square millimeters the pixels represent. The foregoing steps may be performed for each view, for example, shown in FIGS. 4A-4C; the areas may then be summed and divided by the number of areas, e.g., (a1+a2+a3)/3, to determine the average area for the item 12.
  • The processor 106 may then count the number of pixels along the average horizontal axis of the three sides, use that number as the depth of the item, and multiply the depth by the average area to determine the volume, for example, in cubic millimeters. The processor 106 may then access a database of known weights per cubic millimeter for a set of items, select the item 12 from the stored item data, and multiply the known weight value by the calculated volume to determine an estimated weight of the item 12 shown in the image. In some embodiments, the processor 106 may multiply the weight of the item 12 by the retail price per pound of the item 12 to determine the total cost of the item 12.
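The measurement pipeline described above can be sketched in a few lines of Python. This is a minimal illustration, not the patented implementation: the two conversion constants mirror the worked example later in the description (0.25 sq. mm per pixel for areas, 0.25 mm per pixel for widths), whereas a real application would derive them from the camera resolution and the fixed camera-to-item distance.

```python
# Sketch of the area -> depth -> volume -> weight pipeline (hypothetical
# constants taken from the worked example in the description).

MM2_PER_PIXEL = 0.25   # area represented by one pixel, in sq. mm
MM_PER_PIXEL = 0.25    # linear width represented by one pixel, in mm

def average_area_mm2(view_pixel_areas):
    """Average the pixel areas of the front, left, and right views, in sq. mm."""
    return sum(view_pixel_areas) / len(view_pixel_areas) * MM2_PER_PIXEL

def average_depth_mm(view_pixel_widths):
    """Average the pixel widths of the three views; used as the item depth."""
    return sum(view_pixel_widths) / len(view_pixel_widths) * MM_PER_PIXEL

def estimated_weight_lb(view_pixel_areas, view_pixel_widths, lb_per_mm3):
    """Volume = average area * average depth; weight = volume * stored density."""
    volume_mm3 = average_area_mm2(view_pixel_areas) * average_depth_mm(view_pixel_widths)
    return volume_mm3 * lb_per_mm3

# Pixel counts and density follow the banana example later in the text.
weight = estimated_weight_lb([15000, 14500, 18000], [1600, 1500, 1800],
                             lb_per_mm3=0.00000324)
```

With these inputs the estimate lands near the 5.236-pound figure of the worked example; small differences arise only from where intermediate values are rounded.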
  • FIG. 2 is a block diagram of elements of the system of FIG. 1, in accordance with some embodiments.
  • In addition to a camera 202, the mobile electronic device 102 can include optical recognition technology 204 and a memory 206. Some or all of these elements may be at the mobile electronic device 102. In other embodiments, some of these elements may be located externally from the mobile electronic device 102 and communicate with each other and/or other elements of the system via a network or other communication path, for example, wired or wireless signals. The optical recognition technology 204 is constructed to instantly recognize objects such as the item 12. The mobile electronic device 102 can include a processor (not shown) and memory 206. The memory 206 can store one or more applications 208 executable by the device processor. One application 208 may include recognition software that works with the camera 202 and/or optical recognition technology 204 or related device for identifying the item 12. Alternatively, a standalone optical recognition device may be provided, which is physically separate from the mobile electronic device 102.
  • Data related to the identified item 12 may be compared to contents of a database 220 to determine the identity of the item 12, and other information such as the item's weight per cubic millimeter. An accuracy value can be generated, for example, by a computer processor, in response to using optical recognition to identify the item 12. The accuracy value may be used for any number of purposes, for example, compared to a threshold value to establish a degree of confidence that the item 12 is correctly identified. A product code such as a PLU number may be provided to identify the item when the accuracy value is greater than a predetermined threshold. The identification of the item 12 could be obtained through optical recognition, a manually input number such as a PLU, a scanned barcode, QR code, RFID, or any unique identifier for the item 12. Alternatively, or in addition, the item 12 can be identified systematically, optically, or manually.
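The threshold-gated identification step can be sketched as follows. The recognizer interface, the 0.90 threshold, and the fallback to a manually entered PLU are assumptions introduced here for illustration; the description above only requires that an accuracy value be compared to a predetermined threshold.

```python
# Hypothetical sketch: accept an optical-recognition result only when its
# accuracy value clears a threshold; otherwise fall back to manual entry.

ACCURACY_THRESHOLD = 0.90   # assumed confidence cutoff, not from the source

def identify_item(recognized_plu, accuracy, manual_plu=None):
    """Return a PLU code for the item, or None if no identification holds."""
    if accuracy >= ACCURACY_THRESHOLD:
        return recognized_plu      # optical recognition is trusted
    return manual_plu              # low confidence: use manually input PLU, if any

# Bananas (PLU 4011) recognized with high confidence are accepted directly.
plu = identify_item("4011", accuracy=0.97)
```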
  • The processor 106 can include a volume calculator 212, a weight calculator 214, and a database interface 216. Some or all of these elements may be physically collocated at the processor 106, for example, stored at a common memory and executed by a hardware processor. In other embodiments, at least some of these elements may be located externally from the processor 106, for example, located at the mobile device 102 or standalone computer, and communicate with each other and/or other elements of the system via a network or other communication path, for example, wired or wireless signals.
  • The volume calculator 212 calculates an approximate volume of the item 12 from a depth, width, and height of the item 12. The volume is calculated based on pixel density rather than physical length units, for example, millimeters or inches. The volume can be approximated within two standard deviations. The volume is determined by calculating the average width or depth of the item 12 being photographed, and multiplying it by the average area of the item 12. To account for irregular shapes, such as the bananas shown in FIG. 1, the areas of multiple sides of the bananas may be determined using the mirror set 104 to compensate for differences at each vantage point.
  • The weight calculator 214 determines an approximate weight of the item 12 from the calculated volume and average weight per volume, for example, weight per cubic millimeter determined from results retrieved from the database 220, using the results generated by the optical recognition device 204. The database interface 216 establishes a communication between the database 220 and the weight calculator 214.
  • FIG. 3 is a flow diagram of a method for determining measurement data of an item, in accordance with some embodiments.
  • At block 302, the camera 202 generates a digitized image of the item 12, which can be stored at a memory on the mobile electronic device 102 or at a remote location. The three views are essential for measuring irregular shapes; determining an average area compensates for the extremes that may be encountered on objects having both convex and concave features.
  • At block 304, a volume of the item 12 is determined from the digital image generated at block 302, for example, by multiplying the average area by the depth of the item. Because camera resolution varies between devices, the volume can be determined based on pixel density or a related measurement, for example, by the processor 106 described in FIG. 1. In doing so, the distance between the camera 202 and the item 12 may be established. This can be accomplished by locking the camera focal length to a specific distance, namely, the distance to the item 12. For example, the focal length can be locked to 1 meter, whereby only items in the field of view that are 1 meter away are in focus. By establishing, or locking, the focal length of the camera 202 to a specific distance, the camera 202 can capture images of the item 12 at a predetermined distance. With the focal length set, the application always has that distance as a known constant and can calculate all other measurements as a function of the distance from the item 12. The fixture 108, separated from the mirror set 104 by a predetermined distance, likewise establishes the distance between the camera 202 at the fixture 108 and the item 12 positioned between the fixture 108 and the mirror set 104, so that the camera is sufficiently focused. Because the fixture 108 and camera mount are set at specific distances, the application knows exactly how far away the measured item 12 is, regardless of the device or camera resolution being used.
  • Also, in determining volume, the height of the item 12 may be determined. The volume calculator 212 or a phone application 208 can determine the resolution or a related feature of the camera 202 when taking a picture of the item 12. This information may be determined from BIOS information stored at the camera 202. Once the volume calculator 212 receives the camera resolution and/or related camera information, the volume calculator 212 can determine a correction ratio of pixels to square millimeters for determining an area of each view. To achieve this, the fixture 108 holds the camera 202 at a correct angle and distance from the item, allowing constants, for example, the distance between the camera and the item being photographed as well as the angle at which the picture is captured (to ensure an acceptable view of both mirror reflections), to be factored into the calculation of the item volume. A height may be determined by multiplying an apparent height by a factor of the distance of the item from the camera and the pitch of the camera to the item.
  • For example, the distance between the camera 202 on the fixture 108 and the item 12 can be determined to be 1 m. Also, the camera 202 may be determined to be a 10-megapixel camera whose field of view at 1 m is 2.5 m high×1 m wide. Here, each pixel can be calculated to represent 0.25 square millimeters (sq. mm.) (2.5 million sq. mm/10 million pixels). The apparent height of an object follows Eq. (1):
    • h=a/d  Eq. (1)
  • where h is the apparent height, d is the distance of the object, and a is the actual size of the object. To determine a “true” height of an object in the distance, the apparent height may be multiplied by the distance of the object from the camera 202. Apparent height is the observable relative height of an item at various distances; it appears smaller the further away the item is. The true height, on the other hand, is the actual physical height of an item and is not relative to how far away it is.
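The per-pixel scale calculation and the apparent-height relation of Eq. (1) can be illustrated with a short sketch, using the example figures above (a 2.5 m×1 m field of view at the locked 1 m distance, 10-megapixel camera); the function names are illustrative only.

```python
# Per-pixel scale: field-of-view area divided by pixel count, as in the
# example above (2500 mm x 1000 mm field of view, 10 million pixels).
def mm2_per_pixel(fov_height_mm, fov_width_mm, pixel_count):
    return (fov_height_mm * fov_width_mm) / pixel_count

# Eq. (1): h = a / d. Rearranged, the actual size a is the apparent
# height h multiplied by the distance d of the object from the camera.
def true_height(apparent_height, distance):
    return apparent_height * distance

scale = mm2_per_pixel(2500, 1000, 10_000_000)   # 0.25 sq. mm per pixel
```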
  • To account for odd or irregular shapes of produce items or other items of interest of which an image is taken by the camera 202, the areas of at least three sides or other surfaces are determined to compensate for differences at each vantage point. The two mirrors 104A, 104B of the mirror set 104 are arranged to provide additional views of the item 12 to the camera 202, allowing the volume calculator 212 to calculate the visible area of each of the other views from their reflected images. The angle of the captured image allows the camera 202 to see the whole front of the item 12 as well as both reflections in the mirrors 104.
  • For example, as shown in FIGS. 4A-4C, the vantage points may include (1) a front-facing image used to determine an area of the item 12 from this vantage point, (2) a right-facing reflection taken by a first mirror 104A used to determine an area from this vantage point, and (3) a left-facing reflection taken by a second mirror 104B used to determine an area from this vantage point. The mirror set 104A, 104B (generally, 104) is positioned on the base 110, centered relative to the fixture 108, behind the item 12 being measured. The system then parses out the front view, right-facing reflection, and left-facing reflection to determine the area for each side. More specifically, the system may parse the x, y, and z axes from the single image of the item 12. In FIG. 4A, the system excludes the images reflected by the mirrors 104 from the area (a1) calculation. In FIG. 4B, the image of the item reflected by the right mirror 104A is considered for calculating area (a2). In FIG. 4C, the image of the item reflected by the left mirror 104B is considered for calculating area (a3). Thus, to provide a more accurate projection of the actual volume of the item, the volume calculator 212 calculates an area of each of the three views, i.e., the front view facing the camera 202 and the two views provided by the mirrors 104, and averages the three areas to determine the volume of the item 12. Also, the system determines the width of each view of the item 12 and determines an average width to be used as the depth. The average area is multiplied by the average depth to calculate the volume.
  • In sum, the volume calculator 212 calculates the area of each of the multiple views by pixel density, then averages the three calculated areas together to determine an overall average area of the item which takes into account the irregular shape of the items.
  • For example, a front view of the bundle of bananas shown in FIGS. 4A and 5A comprises an area (a1) having 15,000 pixels. A second view shown in FIGS. 4B and 5B comprises an area (a2) having 14,500 pixels. A third view shown in FIGS. 4C and 5C comprises an area (a3) having 18,000 pixels. The average of the three viewed areas is 15,833 pixels; with each pixel having a size of 0.25 sq. mm., as established above, this corresponds to 3,958 sq. mm.
  • The volume calculator 212 also calculates a depth of the item 12. Since the item 12 has varying depths at different heights, an average depth is calculated based on the average width of each side. The depth is determined from the perspective views, for example, shown in FIGS. 4A-4C and 5A-5C. For example, a front view of an image of an item has a width and a height. If the item is rotated 90 degrees, the width shown from the front view becomes the depth of the rotated item. The processor 106 calculates the average width (w) from the three perspective views (front, left, right), thereby determining the effective depth of the item 12 to use in the volume calculation.
  • The volume calculator 212 determines the weighted average width of each of the three image views, for example, referring again to FIGS. 5A-5C, by measuring the number of pixels spanning the width (w) at one or more sections, or slices, of each image view, then calculating an average width (p) of the slices. FIGS. 5A-5C illustrate this measurement: the pixel length at each slice of the item image is averaged. The average width may initially be expressed in units of pixels rather than millimeters or inches. After the average width in pixels is generated, the slices from all the sides may be averaged together, and a calculation may be performed to convert the measurement from pixels to millimeters or a fraction thereof. In some embodiments, as shown in FIGS. 5A-5C, (w) and (p) are synonymous.
  • Referring again to FIGS. 4A and 5A, a section may be, on average, 1600 pixels wide, or 400 mm (1600 pixels×0.25 mm/pixel). In FIGS. 4B and 5B, a section may be, on average, 1500 pixels wide, or 375 mm (1500 pixels×0.25 mm/pixel). In FIGS. 4C and 5C, a section may be, on average, 1800 pixels wide, or 450 mm (1800 pixels×0.25 mm/pixel). The volume calculator 212 may calculate an average of the three average pixel widths (w) to determine the average depth of the item. In this example: ((400 mm+375 mm+450 mm)/3)=408.33 mm.
  • The volume of the image of the item 12 is calculated by multiplying the average area by the average depth. In the example of FIGS. 4A-4C, the volume is:
    • Area=3,958 sq. mm.
    • Depth=408.33 mm.
    • Volume=3,958 sq. mm.*408.33 mm.=1,616,170.15 cubic millimeters
  • At block 306, the item 12 is identified, for example, by the optical recognition device 204.
  • At block 308, the database 220, which stores a directory of items and a predetermined average weight per cubic millimeter for each item, is accessed. A database record can be retrieved for the known weight per cubic millimeter of bananas. In this example, the average cubic millimeter weight of a banana is 0.00000324 pounds per cubic millimeter.
  • At block 310, the weight of the bananas can be determined. The weight of the bananas shown in FIGS. 1 and 4A-4C is determined by multiplying the average weight per cubic millimeter (0.00000324 pounds) by the calculated volume (1,616,170.15 cubic millimeters), yielding approximately 5.236 pounds.
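The arithmetic of this worked example can be checked in a few lines, with all values copied directly from the example above:

```python
# Check of the worked example above: volume x stored density = weight.
volume_mm3 = 3958 * 408.33            # average area (sq. mm) x average depth (mm)
lb_per_mm3 = 0.00000324               # stored average weight of bananas per cubic mm
weight_lb = volume_mm3 * lb_per_mm3   # approximately 5.236 pounds
```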
  • FIGS. 5A-5C are intended to show the measurement of width, which is achieved by averaging the pixel length at each slice of the width of the item.
  • For example, if a camera image is 1080 pixels wide along the horizontal axis and 720 pixels tall along the vertical axis, the application would examine each of the 720 rows (slices) to see whether the item is present in that row. If the item is present in a row, the application calculates how many of the 1080 horizontal pixels (the width) are taken up by the item. All of the row widths are then averaged together to produce the average width.
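The row-slice width measurement described above can be sketched as follows. The binary mask representation (1 = item pixel, 0 = background) is an assumption introduced for illustration; the description only specifies counting the item pixels in each row where the item is present and averaging the results.

```python
# Sketch of the row-slice width measurement: for each image row where the
# item is present, count the pixels it occupies, then average those widths.

def average_width_pixels(mask):
    """Average item width, in pixels, over all rows containing the item."""
    widths = [sum(row) for row in mask if any(row)]  # skip rows with no item
    if not widths:
        return 0.0
    return sum(widths) / len(widths)

# Tiny hypothetical mask: 1 marks an item pixel, 0 marks background.
mask = [
    [0, 1, 1, 1, 0],   # item spans 3 pixels in this row
    [0, 1, 1, 0, 0],   # item spans 2 pixels in this row
    [0, 0, 0, 0, 0],   # item absent; row is skipped
]
```

The resulting pixel width would then be converted to millimeters using the per-pixel scale established earlier.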
  • As will be appreciated by one skilled in the art, concepts may be embodied as a device, system, method, or computer program product. Accordingly, aspects may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, aspects may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
  • Computer program code for carrying out operations for the concepts may be written in any combination of one or more programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • Concepts are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, cloud-based infrastructure architecture, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
  • While concepts have been shown and described with reference to specific preferred embodiments, it should be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope as defined by the following claims.

Claims (6)

What is claimed is:
1. A system for determining a weight of an item for purchase, comprising:
a single camera that generates a digital image of the item;
at least two mirrors for determining an average volume of the item;
an optical recognition device that identifies the item;
a memory that stores an average weight per volume measurement unit of the identified item; and
a special-purpose processor that calculates an estimated weight of the item from the average volume and the average weight per volume measurement unit.
2. The system of claim 1, wherein the at least two mirrors are arranged to provide a front view, a right view, and a left view of the image.
3. The system of claim 1, further comprising a volume calculator that determines an area of the item from each of the front view, the right view, and the left view of the image, and determines an average of the three areas.
4. The system of claim 3, wherein the volume calculator determines an average depth of the image for determining the volume.
5. The system of claim 3, wherein the optical recognition device identifies the item, and wherein the weight of the item is determined from a result of the optical recognition device and the volume.
6. A method for determining a volume and weight of a produce item, comprising:
extrapolating a length, width, and depth of a produce item from a two-dimensional image of the produce item;
determining a volume from a combination of the length, width, and depth of the produce item; and
determining a weight of the item from the volume.
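The estimation recited in claims 1, 3, 4, and 6 can be sketched in a few lines: average the item's silhouette area across the front, left, and right views (one camera plus two mirrors), multiply by an average depth to approximate volume, then scale by a stored average weight per volume unit for the identified item. The sketch below is illustrative only, not the patented implementation; all function names, variable names, and numeric values are hypothetical.

```python
# Hypothetical sketch of the claimed volume-and-weight estimation.
# Units are assumed: areas in cm^2, depth in cm, density in g/cm^3.

def average_volume(front_area, left_area, right_area, avg_depth):
    """Average the three view areas, then multiply by an average depth
    to approximate the item's volume (claims 3-4)."""
    avg_area = (front_area + left_area + right_area) / 3.0
    return avg_area * avg_depth  # cm^3

def estimated_weight(volume, grams_per_cm3):
    """Scale the approximate volume by the stored average weight per
    volume unit for the identified item (claim 1)."""
    return volume * grams_per_cm3

# Example: a roughly apple-sized item.
vol = average_volume(60.0, 55.0, 65.0, 7.0)  # 60 cm^2 average area, 7 cm depth
print(vol)                          # 420.0 cm^3
print(estimated_weight(vol, 0.8))   # 336.0 g at an assumed 0.8 g/cm^3
```

In practice the per-item density (here 0.8 g/cm^3) would come from the memory of average weight-per-volume values keyed by the optical recognition result, as claim 1 describes.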
US16/394,018 2015-07-28 2019-04-25 Systems and methods for determining measurement data of an item Abandoned US20190259048A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/394,018 US20190259048A1 (en) 2015-07-28 2019-04-25 Systems and methods for determining measurement data of an item

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201562197657P 2015-07-28 2015-07-28
US15/204,206 US10318976B2 (en) 2015-07-28 2016-07-07 Methods for determining measurement data of an item
US16/394,018 US20190259048A1 (en) 2015-07-28 2019-04-25 Systems and methods for determining measurement data of an item

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US15/204,206 Division US10318976B2 (en) 2015-07-28 2016-07-07 Methods for determining measurement data of an item

Publications (1)

Publication Number Publication Date
US20190259048A1 true US20190259048A1 (en) 2019-08-22

Family

ID=57882486

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/204,206 Active 2037-01-22 US10318976B2 (en) 2015-07-28 2016-07-07 Methods for determining measurement data of an item
US16/394,018 Abandoned US20190259048A1 (en) 2015-07-28 2019-04-25 Systems and methods for determining measurement data of an item

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US15/204,206 Active 2037-01-22 US10318976B2 (en) 2015-07-28 2016-07-07 Methods for determining measurement data of an item

Country Status (1)

Country Link
US (2) US10318976B2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11906290B2 (en) 2016-03-04 2024-02-20 May Patents Ltd. Method and apparatus for cooperative usage of multiple distance meters

Families Citing this family (17)

Publication number Priority date Publication date Assignee Title
US9667955B2 (en) * 2012-10-01 2017-05-30 Bodybarista Aps Method of calibrating a camera
EP3454698B1 (en) 2016-05-09 2024-04-17 Grabango Co. System and method for computer vision driven applications within an environment
US10615994B2 (en) 2016-07-09 2020-04-07 Grabango Co. Visually automated interface integration
CA3052292A1 (en) 2017-02-10 2018-08-16 Grabango Co. A dynamic customer checkout experience within an automated shopping environment
EP3610469A4 (en) 2017-05-10 2021-04-28 Grabango Co. Series-configured camera array for efficient deployment
US10740742B2 (en) 2017-06-21 2020-08-11 Grabango Co. Linked observed human activity on video to a user account
US10963704B2 (en) 2017-10-16 2021-03-30 Grabango Co. Multiple-factor verification for vision-based systems
US11481805B2 (en) 2018-01-03 2022-10-25 Grabango Co. Marketing and couponing in a retail environment using computer vision
CN108376448A (en) * 2018-02-24 2018-08-07 广州逗号智能零售有限公司 Commodity method of calibration and device
CN108537995A (en) * 2018-03-28 2018-09-14 青岛中科英泰商用系统股份有限公司 Self-help settlement method based on image recognition
WO2020092450A1 (en) 2018-10-29 2020-05-07 Grabango Co. Commerce automation for a fueling station
US11507933B2 (en) 2019-03-01 2022-11-22 Grabango Co. Cashier interface for linking customers to virtual data
CN110889473A (en) * 2019-11-01 2020-03-17 宁波纳智微光电科技有限公司 Intelligent measurement and transportation system and measurement and planning method thereof
CN110838142B (en) * 2019-11-05 2023-09-08 沈阳民航东北凯亚有限公司 Luggage size recognition method and device based on depth image
CN111325741B (en) * 2020-03-02 2024-02-02 上海媒智科技有限公司 Item quantity estimation method, system and equipment based on depth image information processing
CN113640177B (en) * 2021-06-29 2024-06-14 阿里巴巴创新公司 Cargo density measuring method and system and electronic equipment
US20240112361A1 (en) * 2022-09-30 2024-04-04 Zebra Technologies Corporation Product volumetric assessment using bi-optic scanner

Citations (2)

Publication number Priority date Publication date Assignee Title
US5184733A (en) * 1991-02-19 1993-02-09 Marel H.F. Apparatus and method for determining the volume, form and weight of objects
US20040114153A1 (en) * 2001-02-01 2004-06-17 Kristinn Andersen Laser mirror vision

Family Cites Families (19)

Publication number Priority date Publication date Assignee Title
US6508403B2 (en) 2001-05-04 2003-01-21 Institut National D'optique Portable apparatus for 3-dimensional scanning
US20030115160A1 (en) 2001-12-19 2003-06-19 First Data Corporation Weight measuring systems and methods for weighing items
US20050097064A1 (en) * 2003-11-04 2005-05-05 Werden Todd C. Method and apparatus to determine product weight and calculate price using a camera
EP1721682A4 (en) 2004-02-05 2008-06-18 Trinity Ind Corp Method for calculating nonadhering paint and method for calculating weight of solvent
US7708691B2 (en) 2005-03-03 2010-05-04 Sonowise, Inc. Apparatus and method for real time 3D body object scanning without touching or applying pressure to the body object
US8189875B2 (en) * 2008-03-31 2012-05-29 Tjs Dmcc Systems and methods for gemstone identification and analysis
MX2011009043A (en) * 2009-02-27 2011-12-16 Body Surface Translations Inc Estimating physical parameters using three dimensional representations.
KR20120049217A (en) * 2009-06-11 2012-05-16 에프피에스 푸드 프로세싱 시스템즈 비.브이. Method for determining weights of eggs, and apparatus
US8276490B2 (en) 2010-01-29 2012-10-02 Marchant Schmidt, Inc. Exact weight cutting system for food products
US8787621B2 (en) * 2012-06-04 2014-07-22 Clicrweight, LLC Methods and systems for determining and displaying animal metrics
US20140104413A1 (en) * 2012-10-16 2014-04-17 Hand Held Products, Inc. Integrated dimensioning and weighing system
JP2015099116A (en) * 2013-11-20 2015-05-28 セイコーエプソン株式会社 Component analyzer
US20160110791A1 (en) * 2014-10-15 2016-04-21 Toshiba Global Commerce Solutions Holdings Corporation Method, computer program product, and system for providing a sensor-based environment
CN107873101B (en) * 2014-11-21 2022-01-11 克里斯多夫·M·马蒂 Imaging system for object recognition and evaluation
US9406058B1 (en) * 2015-03-09 2016-08-02 Blackton, LLC. Self-service kiosk for assaying precious metals in the form of a jewelry and payment in exchange for the same
AU2016327051B2 (en) * 2015-09-21 2020-01-30 Plf Agritech Pty Ltd Image analysis for making animal measurements including 3-D image analysis
US10026187B2 (en) * 2016-01-12 2018-07-17 Hand Held Products, Inc. Using image data to calculate an object's weight
US11099057B2 (en) * 2016-09-01 2021-08-24 Imam Abdulrahman Bin Faisal University Grossing workstation with electronic scale
DE102016120406A1 (en) * 2016-10-26 2018-04-26 Deutsche Post Ag A method of determining a charge for sending a shipment


Also Published As

Publication number Publication date
US20170030766A1 (en) 2017-02-02
US10318976B2 (en) 2019-06-11

Similar Documents

Publication Publication Date Title
US10318976B2 (en) Methods for determining measurement data of an item
US10825198B2 (en) 3 dimensional coordinates calculating apparatus, 3 dimensional coordinates calculating method, 3 dimensional distance measuring apparatus and 3 dimensional distance measuring method using images
JP7043085B2 (en) Devices and methods for acquiring distance information from a viewpoint
US8630455B2 (en) Method and system for audience digital monitoring
US9390314B2 (en) Methods and apparatus for determining dimensions of an item using 3-dimensional triangulation
US8488872B2 (en) Stereo image processing apparatus, stereo image processing method and program
US8315443B2 (en) Viewpoint detector based on skin color area and face area
US9569686B2 (en) Mobile device field of view region determination
CN107578047B (en) Eccentricity detection method for power cable
CN110619807B (en) Method and device for generating global thermodynamic diagram
JPWO2011125937A1 (en) Calibration data selection device, selection method, selection program, and three-dimensional position measurement device
CN110728649A (en) Method and apparatus for generating location information
Štolc et al. Depth and all-in-focus imaging by a multi-line-scan light-field camera
US11600019B2 (en) Image-based inventory estimation
CN111429194B (en) User track determination system, method, device and server
CN112504473B (en) Fire detection method, device, equipment and computer readable storage medium
US20130331145A1 (en) Measuring system for mobile three dimensional imaging system
US20130336539A1 (en) Position estimation device, position estimation method, and computer program product
Langmann Wide area 2D/3D imaging: development, analysis and applications
JP2013207745A (en) Image pickup device, image processing method, and program
JP2019022147A (en) Light source direction estimation device
CN109916341B (en) Method and system for measuring horizontal inclination angle of product surface
US9392254B2 (en) Game sizing camera
US9546860B2 (en) Method and system for contactless dimensional measurement of articles
JP2002286420A (en) Terminal equipment having dimension measurement function

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

AS Assignment

Owner name: WAL-MART STORES, INC., ARKANSAS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HENDRICK, CHRSITOPHER JOSEPH;REEL/FRAME:052731/0390

Effective date: 20150728

Owner name: WALMART APOLLO, LLC, ARKANSAS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:WAL-MART STORES, INC.;REEL/FRAME:052741/0433

Effective date: 20180226

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION