GB2531928A - Image-stitching for dimensioning

Publication number: GB2531928A (other versions: GB201517843D0)
Application number: GB201517843A
Legal status: Pending (application; status is an assumption, not a legal conclusion)
Inventors: Brian L. Jovanovski; Jingquan Li
Assignee: Hand Held Products Inc

Classifications

    • G06T7/521 — Depth or shape recovery from laser ranging or from the projection of structured light
    • G01B11/25 — Measuring contours or curvatures by projecting a pattern, e.g., one or more lines or moiré fringes, on the object
    • G06T7/33 — Determination of transform parameters for the alignment of images (image registration) using feature-based methods
    • G06T7/60 — Analysis of geometric attributes
    • G06T2200/32 — Image mosaicing
    • G06T2207/10028 — Range image; depth image; 3D point clouds
    • G06T2207/20221 — Image fusion; image merging

Abstract

A method of determining the physical dimensions of an object 6, the method comprising: capturing, using a dimensioning system 10, a range image; moving 12, 13 either the dimensioning system or the object; and capturing further range images covering different portions of the object. The range images are then combined (e.g., using image-stitching) to form a composite range-image, which can be used to measure the object's size. The dimensioning system may project a light pattern onto the object, capture an image of the reflected pattern, and observe changes in the imaged pattern to obtain the depth image. The dimensioning system or the object may be moved automatically or messages may be provided to guide a user to move them.

Description

IMAGE-STITCHING FOR DIMENSIONING

CROSS-REFERENCE TO RELATED APPLICATION

[0001] This application claims the benefit of U.S. Patent Application Ser. No. 62/062,175 for System and Methods for Dimensioning, (filed October 10, 2014), which is hereby incorporated by reference in its entirety.

FIELD OF THE INVENTION

[0002] The present invention relates to systems for determining an object's physical dimensions (i.e., dimensioning systems) and, more specifically, to a dimensioning system that uses image-stitching to acquire the data necessary for dimensioning.

BACKGROUND

[0003] Determining an item's dimensions is often necessary as part of a logistics process (e.g., shipping, storage, etc.). Physically measuring objects, however, is time consuming and may not result in accurate measurements. For example, in addition to human error, measurement errors may result when measuring irregularly shaped objects or when combining multiple objects into a single measurement. As a result, dimensioning systems have been developed to automate, or assist with, this measurement.

[0004] A dimensioning system typically senses an object's shape/size in three dimensions (3D) and then uses this 3D information to compute an estimate of the object's dimensions (e.g., volume, area, length, width, height, etc.). In addition, for irregular objects (or multiple objects), the dimensioning system may compute the dimensions of a minimum-volume bounding box (MVBB) that contains the object (or objects).

[0005] The dimensioning system may sense an object by projecting a light pattern (i.e., pattern) into a field-of-view. Objects within the field-of-view will distort the appearance of the light pattern. The dimensioning system can capture an image of the reflected light-pattern and analyze the pattern distortions in the captured image to compute the 3D data necessary for dimensioning.

[0006] Accurate dimensioning requires images with (i) high pattern visibility and (ii) high pattern density. In some cases, however, the pattern is hard to resolve. For example, the pattern may be obscured by the shape of the object or by the object's color (i.e., reflectivity). In other cases, the lighting in the environment may obscure the pattern in the captured images (e.g., underexposure or overexposure). In still other cases, the object may be larger than the dimensioning system's field-of-view. While moving the dimensioning system away from the object may help fit the object within the field-of-view, this comes at the expense of pattern density because the projected pattern spreads as the range between the object and the dimensioning system increases.
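
The trade-off noted above — pattern density falling as the dimensioning system backs away — follows from a fixed number of projected points spreading over an area that grows with the square of the range. A small illustrative calculation (function name and values are hypothetical, not from the patent):

```python
# Illustrative sketch (not from the patent): a pattern projector emits a fixed
# number of points, so the density of points on a surface falls off with the
# square of the range as the projected pattern spreads over a growing area.

def pattern_density(points_total, range_m, fov_area_at_1m=1.0):
    """Points per unit area at a given range, for a projector whose
    field-of-view covers fov_area_at_1m square metres at a range of 1 m."""
    area = fov_area_at_1m * range_m ** 2  # projected area grows as range**2
    return points_total / area

# Doubling the range to fit a large object quarters the pattern density.
assert pattern_density(1000, 2.0) == pattern_density(1000, 1.0) / 4
```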

[0007] In digital photography image-stitching is the process of combining images to produce a larger, high-resolution image. Image-stitching may be applied to dimensioning in order to increase the dimensioning system's field-of-view without sacrificing pattern density. In addition, image-stitching can help to resolve a pattern that was obscured in a single image. Therefore, a need exists for image-stitching images acquired by a dimensioning system in order to better measure objects.

SUMMARY

[0008] Accordingly, in one aspect, the present invention embraces a method for dimensioning an object. In the method, a dimensioning system is positioned so that at least a portion of an object is contained in the dimensioning system's field-of-view. The dimensioning system then captures a first range image of the field-of-view. After the first range image is captured, either the dimensioning system or the object is moved so that the dimensioning system's field-of-view contains a different portion of the object. Then, a second range image is captured. This process of moving the dimensioning system (or the object) and capturing a range image is repeated until a plurality of range images are captured. The plurality of range images are then combined to create a composite range-image. The dimensions of the object are then determined using the composite range-image.

[0009] In a possible embodiment of the method, capturing a range image includes (i) using a pattern projector to project a light pattern into the field-of-view, (ii) capturing an image of the reflected light-pattern using a range camera, and (iii) generating 3D data from the image of the reflected light-pattern.

[0010] In another possible embodiment of the method, capturing a range image includes (i) using a pattern projector to project a light pattern into the field-of-view, (ii) capturing an image of the reflected light-pattern using a range camera, and (iii) generating 3D data from the image of the reflected light-pattern so that the plurality of range images contain 3D data sufficient for dimensioning the object. For example, 3D data sufficient for dimensioning may imply that 3D data is collected from all surfaces of the object. Alternatively, 3D data sufficient for dimensioning may imply that the 3D data from a surface of the object has no gaps (i.e., no missing areas) in the reflected light-pattern.

[0011] In another exemplary embodiment of the method, the dimensioning system is handheld.

[0012] In another exemplary embodiment of the method, audio and/or visual messages are generated to guide the user in moving the dimensioning system or the object. For example, these audio and/or visual messages can include instructions for the user to (i) move the dimensioning system (or the object) in a particular direction, (ii) move the dimensioning system (or the object) at a particular speed, and/or (iii) cease moving the dimensioning system (or the object).

[0013] In another exemplary embodiment of the method, moving either the dimensioning system or the object includes an automatic movement of the dimensioning system (or the object).

[0014] In another exemplary embodiment of the method, combining the plurality of range images to create a composite range-image includes image-stitching the plurality of range images. In one possible embodiment, the image-stitching includes simultaneous localization and mapping (SLAM).

[0015] In another aspect, the present invention embraces a dimensioning system that includes (i) a pattern projector, (ii) a range camera, and (iii) a processor that is communicatively coupled to the pattern projector and the range camera. The pattern projector is configured to project a light pattern onto an object, while the range camera is configured to capture an image of the reflected light-pattern. The range camera uses the reflected light-pattern to generate 3D data and uses the 3D data to create a range image.

[0016] The dimensioning system's processor is configured by software to trigger the range camera to capture a plurality of range images and combine the plurality of captured range images to form a composite range-image. Then, using the composite range-image, the processor calculates the dimensions of the object.

[0017] In an exemplary embodiment of the dimensioning system, the plurality of range images are captured as the spatial relationship between the dimensioning system and the object is changed. For example, in one embodiment, the dimensioning system is handheld and a user can move the dimensioning system so that each range image in the plurality of range images includes 3D data from a portion of the object, and the composite range-image includes 3D data from the entire object. In some embodiments, the processor is further configured by software to gather tracking/mapping information as the spatial relationship between the range camera and the object is changed. The tracking/mapping information can be used, in some embodiments, to generate messages to help a user change the spatial relationship between the range camera and the object. These messages may be instructions to (i) move the dimensioning system or the object in a particular direction, (ii) move the dimensioning system or the object at a particular speed, and/or (iii) cease moving the dimensioning system or the object. After the plurality of range images are captured, the processor can be configured by software to create a composite range-image by image-stitching the range images using the tracking/mapping information. In a possible embodiment, the plurality of range images for image-stitching have partially overlapping fields of view.

[0018] The foregoing illustrative summary, as well as other exemplary objectives and/or advantages of the invention, and the manner in which the same are accomplished, are further explained within the following detailed description and its accompanying drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

[0019] Figure 1 schematically depicts a block diagram of a dimensioning system according to an embodiment of the present invention.

[0020] Figure 2 graphically depicts the principle of sensing three dimensions using a spatially offset pattern projector and range camera according to an embodiment of the present invention.

[0021] Figure 3 graphically depicts an implementation of a dimensioning system's pattern projector according to an embodiment of the present invention.

[0022] Figure 4 graphically depicts the movement of either the dimensioning system and/or the object according to an embodiment of the present invention.

[0023] Figure 5a graphically depicts a plurality of images, wherein each constituent image contains a portion of an object.

[0024] Figure 5b graphically depicts a composite image of the object formed by image-stitching the constituent images shown in Figure 5a.

[0025] Figure 6 graphically depicts a flow diagram illustrating a method for dimensioning an object according to an embodiment of the present invention.

DETAILED DESCRIPTION

[0026] The present invention embraces the use of image-stitching to create a composite range-image for dimensioning. Some advantages of using composite images for dimensioning are (i) better pattern coverage of an irregular object or group of objects, (ii) greater accuracy (i.e., higher pattern density), and (iii) immunity to lighting effects, such as shadows or bright reflections.

[0027] An exemplary dimensioning system is shown in Figure (Fig.) 1. The dimensioning system 10 includes a pattern projector 1 that is configured to project a light (e.g., infrared light) pattern into a field-of-view 2. The light pattern typically comprises points of light arranged in a pattern (i.e., point cloud). The points of light may be (i) sized identically or differently and (ii) may be arranged in some order or pseudo-randomly. The pattern projector may create the light pattern using a light source (e.g., laser, LED, etc.), a pattern creator (e.g., a mask, a diffractive optical element, etc.), and one or more lenses.

[0028] The dimensioning system 10 also includes a range camera 3 configured to capture an image of the projected light pattern that is reflected from the range camera's field-of-view 4. The field-of-view of the range camera 4 and the field-of-view of the pattern projector 2 should overlap but may not necessarily have identical shapes/sizes. The range camera 3 includes one or more lenses to form a real image of the field-of-view 4 onto an image sensor. Light filtering (e.g., an infrared filter) may also be used to help detect the reflected pattern by removing stray light and/or ambient light. An image sensor (e.g., CMOS sensor, CCD sensor, etc.) is used to create a digital image of the light pattern. The range camera may also include the necessary processing (e.g., DSP, FPGA, ASIC, etc.) to obtain 3D data from the light pattern image.

[0029] As shown in Fig. 2, the pattern projector 1 and the range camera 3 are spatially offset (e.g., stereoscopically arranged). The spatial offset 8 allows changes in the range 5 of an object 6 to be detected as an image offset 7 on the range camera's image sensor. The spatial offset 8 may be adjusted to change the image offset 7 and thereby the resolution at which range differences 5 may be detected. In this way, image offsets in the point-cloud pattern may be converted into 3D data for objects within the dimensioning system's field-of-view.
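
The offset-to-range relationship described above can be sketched with the standard triangulation formula; the focal length, baseline, and offset values below are illustrative only and are not taken from the patent:

```python
# Illustrative triangulation sketch: with the pattern projector and range
# camera separated by a baseline, the lateral image offset (disparity) of a
# projected point encodes its range. All numeric values are hypothetical.

def range_from_offset(focal_px, baseline_m, offset_px):
    """Estimate range (metres) from the image offset of a pattern point,
    using the standard stereo relation: range = focal * baseline / offset."""
    if offset_px <= 0:
        raise ValueError("image offset must be positive")
    return focal_px * baseline_m / offset_px

# A larger image offset corresponds to a nearer object.
near = range_from_offset(focal_px=600.0, baseline_m=0.075, offset_px=45.0)
far = range_from_offset(focal_px=600.0, baseline_m=0.075, offset_px=15.0)
assert near < far
```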

[0030] The 3D data includes range values for each point of light in the point-cloud image. Further, range values between the points of light in the point-cloud image may be interpolated to create what is known as a range image. A range image is a gray scale image in which each pixel value in the image corresponds to an estimated range between the dimensioning system and a point in the field-of-view. The range camera may output 3D data in the form of point-cloud images or range images.
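
The interpolation step described above can be illustrated with a minimal nearest-neighbour fill; a real system would use a smoother interpolation scheme, and all names here are hypothetical:

```python
# Minimal sketch of the interpolation step: fill a dense range image from
# sparse (x, y, range) point-cloud samples. Nearest-neighbour fill is used
# here purely for brevity; a real system would interpolate more smoothly.

def interpolate_range_image(points, width, height):
    """Return a height x width grid of range values, each pixel taking the
    range of the nearest sampled point."""
    image = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            px, py, r = min(points,
                            key=lambda p: (p[0] - x) ** 2 + (p[1] - y) ** 2)
            image[y][x] = r
    return image

# Two sampled points at opposite corners fill the whole 4x4 range image.
img = interpolate_range_image([(0, 0, 1.0), (3, 3, 2.0)], 4, 4)
assert img[0][0] == 1.0 and img[3][3] == 2.0
```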

[0031] A range image may be analyzed using software algorithms running on the dimensioning system's processor 9 to detect objects and determine their dimensions. In some cases these algorithms may include steps to create a minimum-volume bounding box (MVBB), which is a computer model of a box that surrounds an object (e.g., an irregularly shaped object) or a collection of objects (e.g., multiple boxes on a pallet). In this case, the dimensioning system may return the dimensions of the MVBB.
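
As a simplified illustration of the bounding-box computation, the sketch below derives box dimensions from 3D points using an axis-aligned bounding box; a true MVBB additionally searches over box orientations, which is omitted here:

```python
# Simplified illustration of the bounding-box step: an axis-aligned bounding
# box over the 3D data. A true minimum-volume bounding box also searches over
# box orientations, which is omitted here for brevity.

def bounding_box_dimensions(points_3d):
    """Return (length, width, height) of the axis-aligned box enclosing
    a collection of (x, y, z) points."""
    xs, ys, zs = zip(*points_3d)
    return (max(xs) - min(xs), max(ys) - min(ys), max(zs) - min(zs))

# A cluster of points spanning 2 x 3 x 4 units.
corners = [(0, 0, 0), (2, 0, 0), (0, 3, 0), (0, 0, 4), (2, 3, 4)]
assert bounding_box_dimensions(corners) == (2, 3, 4)
```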

[0032] Accurate dimensioning requires high-quality images of the reflected pattern (i.e., point-cloud images). A high-quality point-cloud image is one in which the points of light in the pattern are visible on a plurality of the object's surfaces. Low-quality point-cloud images may result from a variety of circumstances. For example, the imaged pattern may not be visible on one or more surfaces (e.g., surfaces that are blocked from the pattern projector) or may fall outside the field-of-view of the pattern projector and/or the range camera. In another example, the light pattern may be only partially visible on a surface and/or lack sufficient pattern density (i.e., the number of visible points of light on the surface). In yet another example, the lighting (e.g., glare, shadows) in the object's environment and/or the object's reflectivity (e.g., dark objects) may adversely affect the visibility of the light pattern.

[0033] Fig. 3 graphically depicts a dimensioning system 10 projecting a light pattern 11 onto an object 6. Here the object is larger than the pattern projector's field-of-view 2. As a result, portions of the object do not intersect with the projected light-pattern 11. Since dimensioning relies on sensing the image offset of the projected light-pattern, no 3D data can be created for the portions of the object that do not intersect with the projected light-pattern 11.

[0034] The present invention mitigates or solves these problems by capturing a plurality of point-cloud images (or range images) from different perspectives and then combining the plurality of point-cloud images (or range images) into a composite point-cloud image (or range image).

[0035] Fig. 4 illustrates how the movement of the dimensioning system 10 and/or the object 6 may help capture (i.e., sense, sample, etc.) 3D data. The movement allows for the capture of 3D data from more portions of the object than could be obtained with a single range image having a field-of-view 2 smaller than the object 6.

[0036] Range images may be captured during the movement and then combined to form a composite range-image. The composite range-image has 3D data from more points on the object. For example, all sides of an object may be sampled during the moving process to obtain 3D data from the entire object. Further, gaps in the pattern (i.e., missing areas in the pattern) may be filled in using this technique.

[0037] In one possible embodiment, the movement of the dimensioning system and/or the object is automatic and does not require user participation. In this embodiment, the dimensioning system may be coupled to movement devices (e.g., actuators, motors, etc.) that adjust the spatial relationship between the dimensioning system and the object. In one example, the object 6 may be placed in a measurement area and the dimensioning system 10 may be moved around the object 12 to collect range images from various perspectives, as shown in Fig. 4. In another example, a fixed dimensioning system may collect range images as an object 6 is rotated (e.g., on a motorized turntable) 13, as shown in Fig. 4. In these cases, position information may be obtained from the movement device and used to help combine the range images.

[0038] In another possible embodiment, the movement of the dimensioning system and/or the object is performed by a user. Here, messages (e.g., audio, visual, etc.) may be generated by the dimensioning system's processor and conveyed to a user interface (e.g., screen, indicator lights, speaker, etc.). The user may follow the instructions provided by the messages to move the dimensioning-system/object. The instructions may include messages to help a user know (i) how far to move the dimensioning-system/object, (ii) how fast to move the dimensioning-system/object, (iii) to move the dimensioning-system/object to a particular location, and (iv) how long to continue moving the dimensioning-system/object (e.g., when to stop moving). For example, the dimensioning system may be handheld and the user may move the dimensioning system to change perspective. In this case, the dimensioning system may be configured to gather tracking information (e.g., sense its position and orientation within the environment) to help combine the range images.

[0039] In general, the dimensioning system may be moved in a variety of ways as the range images are captured. In some cases, however, this movement may have certain requirements to facilitate combining. For example, movements may be limited to movements having a constant range between the dimensioning system and the object, as changes in range can affect the image size of the light-pattern/object. In another example, the movement may be limited to a certain path having a particular starting point and ending point. This path may be determined using an expected object size/shape.

[0040] The requirements for movement may be reduced through the use of simultaneous localization and mapping (SLAM). SLAM is a computer algorithm that uses images (e.g., range images) of an environment to update the position of the imager (e.g., dimensioning system). When moving a dimensioning-system, for example, SLAM algorithms may detect features (i.e., landmarks) in a captured range image and then compare these landmarks to landmarks found in previously captured range images in order to update the position of the dimensioning system. This position information may be used to help combine the range images.
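
A toy stand-in for the landmark-based position update described above: assuming pure translation and already-matched landmarks, the imager's displacement can be estimated by averaging landmark displacements. Full SLAM also estimates rotation, performs the matching, and maintains a map; everything here is illustrative:

```python
# Toy stand-in for the SLAM position update: given landmarks matched between
# two range images, estimate the imager's translation by averaging landmark
# displacements. Assumes pure translation and known landmark matches.

def estimate_translation(prev_landmarks, curr_landmarks):
    """Average 2D displacement between matched landmark lists."""
    n = len(prev_landmarks)
    dx = sum(c[0] - p[0] for p, c in zip(prev_landmarks, curr_landmarks)) / n
    dy = sum(c[1] - p[1] for p, c in zip(prev_landmarks, curr_landmarks)) / n
    return (dx, dy)

# Landmarks shifted by (5, -2) pixels between captures.
shift = estimate_translation([(0, 0), (10, 5)], [(5, -2), (15, 3)])
assert shift == (5.0, -2.0)
```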

[0041] Combining range images is typically achieved using image-stitching. Image-stitching refers to computer algorithms that transform, register, and blend a plurality of constituent images to form a single composite image. The image-stitching algorithms may first determine an appropriate mathematical model to relate the pixel coordinates of the constituent images to the pixel coordinates of a target composite-image surface (e.g., plane, cylinder, sphere, etc.). This involves transforming (e.g., warping) the images to the target composite-image surface. The transformed images may then be registered to one another (e.g., using feature detection and matching) and merged (e.g., blended) to remove edge effects.
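
A minimal sketch of the register-and-blend step for two overlapping scanlines of a range image, assuming the registration (the overlap length) is already known; real image-stitching estimates the alignment itself and blends in two dimensions:

```python
# Minimal sketch of the register-and-blend step for two overlapping scanlines
# of a range image, assuming the registration (overlap length) is already
# known. Real image-stitching estimates the alignment and blends in 2D.

def stitch_rows(left, right, overlap):
    """Merge two overlapping scanlines, averaging values in the overlap
    region to avoid a visible seam."""
    blended = [(a + b) / 2 for a, b in zip(left[-overlap:], right[:overlap])]
    return left[:-overlap] + blended + right[overlap:]

# Two 4-pixel strips sharing a 2-pixel overlap stitch into a 6-pixel strip.
row = stitch_rows([1.0, 1.0, 2.0, 2.0], [2.0, 2.0, 3.0, 3.0], overlap=2)
assert row == [1.0, 1.0, 2.0, 2.0, 3.0, 3.0]
```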

[0042] The process and results of image-stitching are illustrated in Fig. 5a and Fig. 5b. As shown in Fig. 5a, four constituent images 14a, 14b, 14c, 14d of an object 6 are captured. Each of the four images contains a different portion of the object 6. Fig. 5b illustrates the result of image-stitching the four constituent images. The composite image 15 contains the entire object 6.

[0043] While range images have pixels representing range instead of reflected light, they are like conventional digital images in most other regards. As such, the principles of image-stitching described thus far may be applied equally to range images (or point-cloud images).

[0044] Figure 6 graphically depicts a flow diagram illustrating a method for dimensioning an object using image-stitching. The method begins with positioning 20 a dimensioning system so that at least a portion of an object is contained within the dimensioning system's field-of-view and capturing 30 a range image. The dimensioning system and/or the object is then moved 60 so that another portion of the object is within the field-of-view, and another range image is captured 30. This process of moving and capturing is repeated until a plurality of range images is captured 40. The number of range images in the plurality may be a predetermined number or may be determined based on the motion of the dimensioning system/object. The plurality of range images are then combined 70 to form a composite range-image, and the composite range-image is used to dimension 90 the object.
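
The capture-move-repeat loop of the method can be sketched as a driver function; the callback names below are hypothetical and stand in for the hardware and algorithms described above:

```python
# Hypothetical driver for the capture-move-repeat loop of the method; the
# callback names stand in for the hardware and algorithms described above
# and are not taken from the patent.

def dimension_object(capture_range_image, move, num_views, stitch, dimension):
    """Capture num_views range images, moving between captures, then stitch
    them into a composite range-image and dimension the object from it."""
    range_images = []
    for _ in range(num_views):
        range_images.append(capture_range_image())
        move()  # reposition the dimensioning system or the object
    composite = stitch(range_images)
    return dimension(composite)
```

With stub callbacks the loop structure can be verified: each capture is followed by a move, and the composite is built from all captured views.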

[0045] In one exemplary embodiment, the dimensioning system may create messages 50 to guide the movement of the dimensioning system and/or the object as described previously.

[0046] In another exemplary embodiment, the dimensioning system may create or update the composite range-image in real time. In this case, the dimensioning system may be able to examine the latest composite range-image to determine if there is 3D data sufficient for dimensioning (i.e., if a sufficient number of range images have been acquired) 80. If not, the dimensioning system may create messages to help the user move and capture range images so as to gather the missing or incomplete 3D data.

[0047] To supplement the present disclosure, this application incorporates entirely by reference the following commonly assigned patents, patent application publications, and patent applications: U.S. Patent No. 6,832,725; U.S. Patent No. 7,128,266; U.S. Patent No. 7,159,783; U.S. Patent No. 7,413,127; U.S. Patent No. 7,726,575; U.S. Patent No. 8,294,969; U.S. Patent No. 8,317,105; U.S. Patent No. 8,322,622; U.S. Patent No. 8,366,005; U.S. Patent No. 8,371,507; U.S. Patent No. 8,376,233; U.S. Patent No. 8,381,979; U.S. Patent No. 8,390,909; U.S. Patent No. 8,408,464; U.S. Patent No. 8,408,468; U.S. Patent No. 8,408,469; U.S. Patent No. 8,424,768; U.S. Patent No. 8,448,863; U.S. Patent No. 8,457,013; U.S. Patent No. 8,459,557; U.S. Patent No. 8,469,272; U.S. Patent No. 8,474,712; U.S. Patent No. 8,479,992; U.S. Patent No. 8,490,877; U.S. Patent No. 8,517,271; U.S. Patent No. 8,523,076; U.S. Patent No. 8,528,818; U.S. Patent No. 8,544,737; U.S. Patent No. 8,548,242; U.S. Patent No. 8,548,420; U.S. Patent No. 8,550,335; U.S. Patent No. 8,550,354; U.S. Patent No. 8,550,357; U.S. Patent No. 8,556,174; U.S. Patent No. 8,556,176; U.S. Patent No. 8,556,177; U.S. Patent No. 8,559,767; U.S. Patent No. 8,599,957; U.S. Patent No. 8,561,895; U.S. Patent No. 8,561,903; U.S. Patent No. 8,561,905; U.S. Patent No. 8,565,107; U.S. Patent No. 8,571,307; U.S. Patent No. 8,579,200; U.S. Patent No. 8,583,924; U.S. Patent No. 8,584,945; U.S. Patent No. 8,587,595; U.S. Patent No. 8,587,697; U.S. Patent No. 8,588,869; U.S. Patent No. 8,590,789; U.S. Patent No. 8,596,539; U.S. Patent No. 8,596,542; U.S. Patent No. 8,596,543; U.S. Patent No. 8,599,271; U.S. Patent No. 8,599,957; U.S. Patent No. 8,600,158; U.S. Patent No. 8,600,167; U.S. Patent No. 8,602,309; U.S. Patent No. 8,608,053; U.S. Patent No. 8,608,071; U.S. Patent No. 8,611,309; U.S. Patent No. 8,615,487; U.S. Patent No. 8,616,454; U.S. Patent No. 8,621,123; U.S. Patent No. 8,622,303; U.S. Patent No. 8,628,013; U.S. Patent No. 
8,628,015; U.S. Patent No. 8,628,016; U.S. Patent No. 8,629,926; U.S. Patent No. 8,630,491; U.S. Patent No. 8,635,309; U.S. Patent No. 8,636,200; U.S. Patent No. 8,636,212; U.S. Patent No. 8,636,215; U.S. Patent No. 8,636,224; U.S. Patent No. 8,638,806; U.S. Patent No. 8,640,958; U.S. Patent No. 8,640,960; U.S. Patent No. 8,643,717; U.S. Patent No. 8,646,692; U.S. Patent No. 8,646,694; U.S. Patent No. 8,657,200; U.S. Patent No. 8,659,397; U.S. Patent No. 8,668,149; U.S. Patent No. 8,678,285; U.S. Patent No. 8,678,286; U.S. Patent No. 8,682,077; U.S. Patent No. 8,687,282; U.S. Patent No. 8,692,927; U.S. Patent No. 8,695,880; U.S. Patent No. 8,698,949; U.S. Patent No. 8,717,494; U.S. Patent No. 8,717,494; U.S. Patent No. 8,720,783; U.S. Patent No. 8,723,804; U.S. Patent No. 8,723,904; U.S. Patent No. 8,727,223; U.S. Patent No. D702,237; U.S. Patent No. 8,740,082; U.S. Patent No. 8,740,085; U.S. Patent No. 8,746,563; U.S. Patent No. 8,750,445; U.S. Patent No. 8,752,766; U.S. Patent No. 8,756,059; U.S. Patent No. 8,757,495; U.S. Patent No. 8,760,563; U.S. Patent No. 8,763,909; U.S. Patent No. 8,777,108; U.S. Patent No. 8,777,109; U.S. Patent No. 8,779,898; U.S. Patent No. 8,781,520; U.S. Patent No. 8,783,573; U.S. Patent No. 8,789,757; U.S. Patent No. 8,789,758; U.S. Patent No. 8,789,759; U.S. Patent No. 8,794,520; U.S. Patent No. 8,794,522; U.S. Patent No. 8,794,525; U.S. Patent No. 8,794,526; U.S. Patent No. 8,798,367; U.S. Patent No. 8,807,431; U.S. Patent No. 8,807,432; U.S. Patent No. 8,820,630; U.S. Patent No. 8,822,848; U.S. Patent No. 8,824,692; U.S. Patent No. 8,824,696; U.S. Patent No. 8,842,849; U.S. Patent No. 8,844,822; U.S. Patent No. 8,844,823; U.S. Patent No. 8,849,019; U.S. Patent No. 8,851,383; U.S. Patent No. 8,854,633; U.S. Patent No. 8,866,963; U.S. Patent No. 8,868,421; U.S. Patent No. 8,868,519; U.S. Patent No. 8,868,802; U.S. Patent No. 8,868,803; U.S. Patent No. 8,870,074; U.S. Patent No. 8,879,639; U.S. Patent No. 8,880,426; U.S. Patent No. 
8,881,983; U.S. Patent No. 8,881,987; U.S. Patent No. 8,903,172; U.S. Patent No. 8,908,995; U.S. Patent No. 8,910,870; U.S. Patent No. 8,910,875; U.S. Patent No. 8,914,290; U.S. Patent No. 8,914,788; U.S. Patent No. 8,915,439; U.S. Patent No. 8,915,444; U.S. Patent No. 8,916,789; U.S. Patent No. 8,918,250; U.S. Patent No. 8,918,564; U.S. Patent No. 8,925,818; U.S. Patent No. 8,939,374; U.S. Patent No. 8,942,480; U.S. Patent No. 8,944,313; U.S. Patent No. 8,944,327; U.S. Patent No. 8,944,332; U.S. Patent No. 8,950,678; U.S. Patent No. 8,967,468; U.S. Patent No. 8,971,346; U.S. Patent No. 8,976,030; U.S. Patent No. 8,976,368; U.S. Patent No. 8,978,981; U.S. Patent No. 8,978,983; U.S. Patent No. 8,978,984; U.S. Patent No. 8,985,456; U.S. Patent No. 8,985,457; U.S. Patent No. 8,985,459; U.S. Patent No. 8,985,461; U.S. Patent No. 8,988,578; U.S. Patent No. 8,988,590; U.S. Patent No. 8,991,704; U.S. Patent No. 8,996,194; U.S. Patent No. 8,996,384; U.S. Patent No. 9,002,641; U.S. Patent No. 9,007,368; U.S. Patent No. 9,010,641; U.S. Patent No. 9,015,513; U.S. Patent No. 9,016,576; U.S. Patent No. 9,022,288; U.S. Patent No. 9,030,964; U.S. Patent No. 9,033,240; U.S. Patent No. 9,033,242; U.S. Patent No. 9,036,054; U.S. Patent No. 9,037,344; U.S. Patent No. 9,038,911; U.S. Patent No. 9,038,915; U.S. Patent No. 9,047,098; U.S. Patent No. 9,047,359; U.S. Patent No. 9,047,420; U.S. Patent No. 9,047,525; U.S. Patent No. 9,047,531; U.S. Patent No. 9,053,055; U.S. Patent No. 9,053,378; U.S. Patent No. 9,053,380; U.S. Patent No. 9,058,526; U.S. Patent No. 9,064,165; U.S. Patent No. 9,064,167; U.S. Patent No. 9,064,168; U.S. Patent No. 9,064,254; U.S. Patent No. 9,066,032; U.S. Patent No. 9,070,032; U.S. Design Patent No. D716,285; U.S. Design Patent No. D723,560; U.S. Design Patent No. D730,357; U.S. Design Patent No. D730,901; U.S. Design Patent No. D730,902 U.S. Design Patent No. D733,112; U.S. Design Patent No. D734,339; International Publication No. 
2013/163789; International Publication No. 2013/173985; International Publication No. 2014/019130; International Publication No. 2014/110495; U.S. Patent Application Publication No. 2008/0185432; U.S. Patent Application Publication No. 2009/0134221; U.S. Patent Application Publication No. 2010/0177080; U.S. Patent Application Publication No. 2010/0177076; U.S. Patent Application Publication No. 2010/0177707; U.S. Patent Application Publication No. 2010/0177749; U.S. Patent Application Publication No. 2010/0265880; U.S. Patent Application Publication No. 2011/0202554; U.S. Patent Application Publication No. 2012/0111946; U.S. Patent Application Publication No. 2012/0168511; U.S. Patent Application Publication No. 2012/0168512; U.S. Patent Application Publication No. 2012/0193423; U.S. Patent Application Publication No. 2012/0203647; U.S. Patent Application Publication No. 2012/0223141; U.S. Patent Application Publication No. 2012/0228382; U.S. Patent Application Publication No. 2012/0248188; U.S. Patent Application Publication No. 2013/0043312; U.S. Patent Application Publication No. 2013/0082104; U.S. Patent Application Publication No. 2013/0175341; U.S. Patent Application Publication No. 2013/0175343; U.S. Patent Application Publication No. 2013/0257744; U.S. Patent Application Publication No. 2013/0257759; U.S. Patent Application Publication No. 2013/0270346; U.S. Patent Application Publication No. 2013/0287258; U.S. Patent Application Publication No. 2013/0292475; U.S. Patent Application Publication No. 2013/0292477; U.S. Patent Application Publication No. 2013/0293539; U.S. Patent Application Publication No. 2013/0293540; U.S. Patent Application Publication No. 2013/0306728; U.S. Patent Application Publication No. 2013/0306731; U.S. Patent Application Publication No. 2013/0307964; U.S. Patent Application Publication No. 2013/0308625; U.S. Patent Application Publication No. 2013/0313324; U.S. Patent Application Publication No. 2013/0313325; U.S. 
Patent Application Publication No. 2013/0342717; U.S. Patent Application Publication No. 2014/0001267; U.S. Patent Application Publication No. 2014/0008439; U.S. Patent Application Publication No. 2014/0025584; U.S. Patent Application Publication No. 2014/0034734; U.S. Patent Application Publication No. 2014/0036848; U.S. Patent Application Publication No. 2014/0039693; U.S. Patent Application Publication No. 2014/0042814; U.S. Patent Application Publication No. 2014/0049120; U.S. Patent Application Publication No. 2014/0049635; U.S. Patent Application Publication No. 2014/0061306; U.S. Patent Application Publication No. 2014/0063289; U.S. Patent Application Publication No. 2014/0066136; U.S. Patent Application Publication No. 2014/0067692; U.S. Patent Application Publication No. 2014/0070005; U.S. Patent Application Publication No. 2014/0071840; U.S. Patent Application Publication No. 2014/0074746; U.S. Patent Application Publication No. 2014/0076974; U.S. Patent Application Publication No. 2014/0078341; U.S. Patent Application Publication No. 2014/0078345; U.S. Patent Application Publication No. 2014/0097249; U.S. Patent Application Publication No. 2014/0098792; U.S. Patent Application Publication No. 2014/0100813; U.S. Patent Application Publication No. 2014/0103115; U.S. Patent Application Publication No. 2014/0104413; U.S. Patent Application Publication No. 2014/0104414; U.S. Patent Application Publication No. 2014/0104416; U.S. Patent Application Publication No. 2014/0104451; U.S. Patent Application Publication No. 2014/0106594; U.S. Patent Application Publication No. 2014/0106725; U.S. Patent Application Publication No. 2014/0108010; U.S. Patent Application Publication No. 2014/0108402; U.S. Patent Application Publication No. 2014/0110485; U.S. Patent Application Publication No. 2014/0114530; U.S. Patent Application Publication No. 2014/0124577; U.S. Patent Application Publication No. 2014/0124579; U.S. Patent Application Publication No. 2014/0125842; U.S. 
Patent Application Publication No. 2014/0125853; U.S. Patent Application Publication No. 2014/0125999; U.S. Patent Application Publication No. 2014/0129378; U.S. Patent Application Publication No. 2014/0131438; U.S. Patent Application Publication No. 2014/0131441; U.S. Patent Application Publication No. 2014/0131443; U.S. Patent Application Publication No. 2014/0131444; U.S. Patent Application Publication No. 2014/0131445; U.S. Patent Application Publication No. 2014/0131448; U.S. Patent Application Publication No. 2014/0133379; U.S. Patent Application Publication No. 2014/0136208; U.S. Patent Application Publication No. 2014/0140585; U.S. Patent Application Publication No. 2014/0151453; U.S. Patent Application Publication No. 2014/0152882; U.S. Patent Application Publication No. 2014/0158770; U.S. Patent Application Publication No. 2014/0159869; U.S. Patent Application Publication No. 2014/0166755; U.S. Patent Application Publication No. 2014/0166759; U.S. Patent Application Publication No. 2014/0168787; U.S. Patent Application Publication No. 2014/0175165; U.S. Patent Application Publication No. 2014/0175172; U.S. Patent Application Publication No. 2014/0191644; U.S. Patent Application Publication No. 2014/0191913; U.S. Patent Application Publication No. 2014/0197238; U.S. Patent Application Publication No. 2014/0197239; U.S. Patent Application Publication No. 2014/0197304; U.S. Patent Application Publication No. 2014/0214631; U.S. Patent Application Publication No. 2014/0217166; U.S. Patent Application Publication No. 2014/0217180; U.S. Patent Application Publication No. 2014/0231500; U.S. Patent Application Publication No. 2014/0232930; U.S. Patent Application Publication No. 2014/0247315; U.S. Patent Application Publication No. 2014/0263493; U.S. Patent Application Publication No. 2014/0263645; U.S. Patent Application Publication No. 2014/0267609; U.S. Patent Application Publication No. 2014/0270196; U.S. Patent Application Publication No. 2014/0270229; U.S. 
Patent Application Publication No. 2014/0278387; U.S. Patent Application Publication No. 2014/0278391; U.S. Patent Application Publication No. 2014/0282210; U.S. Patent Application Publication No. 2014/0284384; U.S. Patent Application Publication No. 2014/0288933; U.S. Patent Application Publication No. 2014/0297058; U.S. Patent Application Publication No. 2014/0299665; U.S. Patent Application Publication No. 2014/0312121; U.S. Patent Application Publication No. 2014/0319220; U.S. Patent Application Publication No. 2014/0319221; U.S. Patent Application Publication No. 2014/0326787; U.S. Patent Application Publication No. 2014/0332590; U.S. Patent Application Publication No. 2014/0344943; U.S. Patent Application Publication No. 2014/0346233; U.S. Patent Application Publication No. 2014/0351317; U.S. Patent Application Publication No. 2014/0353373; U.S. Patent Application Publication No. 2014/0361073; U.S. Patent Application Publication No. 2014/0361082; U.S. Patent Application Publication No. 2014/0362184; U.S. Patent Application Publication No. 2014/0363015; U.S. Patent Application Publication No. 2014/0369511; U.S. Patent Application Publication No. 2014/0374483; U.S. Patent Application Publication No. 2014/0374485; U.S. Patent Application Publication No. 2015/0001301; U.S. Patent Application Publication No. 2015/0001304; U.S. Patent Application Publication No. 2015/0003673; U.S. Patent Application Publication No. 2015/0009338; U.S. Patent Application Publication No. 2015/0009610; U.S. Patent Application Publication No. 2015/0014416; U.S. Patent Application Publication No. 2015/0021397; U.S. Patent Application Publication No. 2015/0028102; U.S. Patent Application Publication No. 2015/0028103; U.S. Patent Application Publication No. 2015/0028104; U.S. Patent Application Publication No. 2015/0029002; U.S. Patent Application Publication No. 2015/0032709; U.S. Patent Application Publication No. 2015/0039309; U.S. Patent Application Publication No. 2015/0039878; U.S. 
Patent Application Publication No. 2015/0040378; U.S. Patent Application Publication No. 2015/0048168; U.S. Patent Application Publication No. 2015/0049347; U.S. Patent Application Publication No. 2015/0051992; U.S. Patent Application Publication No. 2015/0053766; U.S. Patent Application Publication No. 2015/0053768; U.S. Patent Application Publication No. 2015/0053769; U.S. Patent Application Publication No. 2015/0060544; U.S. Patent Application Publication No. 2015/0062366; U.S. Patent Application Publication No. 2015/0063215; U.S. Patent Application Publication No. 2015/0063676; U.S. Patent Application Publication No. 2015/0069130; U.S. Patent Application Publication No. 2015/0071819; U.S. Patent Application Publication No. 2015/0083800; U.S. Patent Application Publication No. 2015/0086114; U.S. Patent Application Publication No. 2015/0088522; U.S. Patent Application Publication No. 2015/0096872; U.S. Patent Application Publication No. 2015/0099557; U.S. Patent Application Publication No. 2015/0100196; U.S. Patent Application Publication No. 2015/0102109; U.S. Patent Application Publication No. 2015/0115035; U.S. Patent Application Publication No. 2015/0127791; U.S. Patent Application Publication No. 2015/0128116; U.S. Patent Application Publication No. 2015/0129659; U.S. Patent Application Publication No. 2015/0133047; U.S. Patent Application Publication No. 2015/0134470; U.S. Patent Application Publication No. 2015/0136851; U.S. Patent Application Publication No. 2015/0136854; U.S. Patent Application Publication No. 2015/0142492; U.S. Patent Application Publication No. 2015/0144692; U.S. Patent Application Publication No. 2015/0144698; U.S. Patent Application Publication No. 2015/0144701; U.S. Patent Application Publication No. 2015/0149946; U.S. Patent Application Publication No. 2015/0161429; U.S. Patent Application Publication No. 2015/0169925; U.S. Patent Application Publication No. 2015/0169929; U.S. Patent Application Publication No. 2015/0178523; U.S. 
Patent Application Publication No. 2015/0178534; U.S. Patent Application Publication No. 2015/0178535; U.S. Patent Application Publication No. 2015/0178536; U.S. Patent Application Publication No. 2015/0178537; U.S. Patent Application Publication No. 2015/0181093; U.S. Patent Application Publication No. 2015/0181109; U.S. Patent Application No. 13/367,978 for a Laser Scanning Module Employing an Elastomeric U-Hinge Based Laser Scanning Assembly, filed February 7, 2012 (Feng et al.); U.S. Patent Application No. 29/458,405 for an Electronic Device, filed June 19, 2013 (Fitch et al.); U.S. Patent Application No. 29/459,620 for an Electronic Device Enclosure, filed July 2, 2013 (London et al.); U.S. Patent Application No. 29/468,118 for an Electronic Device Case, filed September 26, 2013 (Oberpriller et al.); U.S. Patent Application No. 14/150,393 for Indicia-reader Having Unitary Construction Scanner, filed January 8, 2014 (Colavito et al.); U.S. Patent Application No. 14/200,405 for Indicia Reader for Size-Limited Applications filed March 7, 2014 (Feng et al.); U.S. Patent Application No. 14/231,898 for Hand-Mounted Indicia-Reading Device with Finger Motion Triggering filed April 1, 2014 (Van Horn et al.); U.S. Patent Application No. 29/486,759 for an Imaging Terminal, filed April 2, 2014 (Oberpriller et al.); U.S. Patent Application No. 14/257,364 for Docking System and Method Using Near Field Communication filed April 21, 2014 (Showering); U.S. Patent Application No. 14/264,173 for Autofocus Lens System for Indicia Readers filed April 29, 2014 (Ackley et al.); U.S. Patent Application No. 14/277,337 for MULTIPURPOSE OPTICAL READER, filed May 14, 2014 (Jovanovski et al.); U.S. Patent Application No. 14/283,282 for TERMINAL HAVING ILLUMINATION AND FOCUS CONTROL filed May 21, 2014 (Liu et al.); U.S. Patent Application No. 14/327,827 for a MOBILE-PHONE ADAPTER FOR ELECTRONIC TRANSACTIONS, filed July 10, 2014 (Hejl); U.S. Patent Application No. 
14/334,934 for a SYSTEM AND METHOD FOR INDICIA VERIFICATION, filed July 18, 2014 (Hejl); U.S. Patent Application No. 14/339,708 for LASER SCANNING CODE SYMBOL READING SYSTEM, filed July 24, 2014 (Xian et al.); U.S. Patent Application No. 14/340,627 for an AXIALLY REINFORCED FLEXIBLE SCAN ELEMENT, filed July 25, 2014 (Rueblinger et al.); U.S. Patent Application No. 14/446,391 for MULTIFUNCTION POINT OF SALE APPARATUS WITH OPTICAL SIGNATURE CAPTURE filed July 30, 2014 (Good et al.); U.S. Patent Application No. 14/452,697 for INTERACTIVE INDICIA READER, filed August 6, 2014 (Todeschini); U.S. Patent Application No. 14/453,019 for DIMENSIONING SYSTEM WITH GUIDED ALIGNMENT, filed August 6, 2014 (Li et al.); U.S. Patent Application No. 14/462,801 for MOBILE COMPUTING DEVICE WITH DATA COGNITION SOFTWARE, filed on August 19, 2014 (Todeschini et al.); U.S. Patent Application No. 14/483,056 for VARIABLE DEPTH OF FIELD BARCODE SCANNER filed Sep. 10, 2014 (McCloskey et al.); U.S. Patent Application No. 14/513,808 for IDENTIFYING INVENTORY ITEMS IN A STORAGE FACILITY filed Oct. 14, 2014 (Singel et al.); U.S. Patent Application No. 14/519,195 for HANDHELD DIMENSIONING SYSTEM WITH FEEDBACK filed Oct. 21, 2014 (Laffargue et al.); U.S. Patent Application No. 14/519,179 for DIMENSIONING SYSTEM WITH MULTIPATH INTERFERENCE MITIGATION filed Oct. 21, 2014 (Thuries et al.); U.S. Patent Application No. 14/519,211 for SYSTEM AND METHOD FOR DIMENSIONING filed Oct. 21, 2014 (Ackley et al.); U.S. Patent Application No. 14/519,233 for HANDHELD DIMENSIONER WITH DATA-QUALITY INDICATION filed Oct. 21, 2014 (Laffargue et al.); U.S. Patent Application No. 14/519,249 for HANDHELD DIMENSIONING SYSTEM WITH MEASUREMENT-CONFORMANCE FEEDBACK filed Oct. 21, 2014 (Ackley et al.); U.S. Patent Application No. 14/527,191 for METHOD AND SYSTEM FOR RECOGNIZING SPEECH USING WILDCARDS IN AN EXPECTED RESPONSE filed Oct. 29, 2014 (Braho et al.); U.S. Patent Application No. 
14/529,563 for ADAPTABLE INTERFACE FOR A MOBILE COMPUTING DEVICE filed Oct. 31, 2014 (Schoon et al.); U.S. Patent Application No. 14/529,857 for BARCODE READER WITH SECURITY FEATURES filed October 31, 2014 (Todeschini et al.); U.S. Patent Application No. 14/398,542 for PORTABLE ELECTRONIC DEVICES HAVING A SEPARATE LOCATION TRIGGER UNIT FOR USE IN CONTROLLING AN APPLICATION UNIT filed November 3, 2014 (Bian et al.); U.S. Patent Application No. 14/531,154 for DIRECTING AN INSPECTOR THROUGH AN INSPECTION filed Nov. 3, 2014 (Miller et al.); U.S. Patent Application No. 14/533,319 for BARCODE SCANNING SYSTEM USING WEARABLE DEVICE WITH EMBEDDED CAMERA filed Nov. 5, 2014 (Todeschini); U.S. Patent Application No. 14/535,764 for CONCATENATED EXPECTED RESPONSES FOR SPEECH RECOGNITION filed Nov. 7, 2014 (Braho et al.); U.S. Patent Application No. 14/568,305 for AUTO-CONTRAST VIEWFINDER FOR AN INDICIA READER filed Dec. 12, 2014 (Todeschini); U.S. Patent Application No. 14/573,022 for DYNAMIC DIAGNOSTIC INDICATOR GENERATION filed Dec. 17, 2014 (Goldsmith); U.S. Patent Application No. 14/578,627 for SAFETY SYSTEM AND METHOD filed Dec. 22, 2014 (Ackley et al.); U.S. Patent Application No. 14/580,262 for MEDIA GATE FOR THERMAL TRANSFER PRINTERS filed Dec. 23, 2014 (Bowles); U.S. Patent Application No. 14/590,024 for SHELVING AND PACKAGE LOCATING SYSTEMS FOR DELIVERY VEHICLES filed January 6, 2015 (Payne); U.S. Patent Application No. 14/596,757 for SYSTEM AND METHOD FOR DETECTING BARCODE PRINTING ERRORS filed Jan. 14, 2015 (Ackley); U.S. Patent Application No. 14/416,147 for OPTICAL READING APPARATUS HAVING VARIABLE SETTINGS filed January 21, 2015 (Chen et al.); U.S. Patent Application No. 14/614,706 for DEVICE FOR SUPPORTING AN ELECTRONIC TOOL ON A USER'S HAND filed Feb. 5, 2015 (Oberpriller et al.); U.S. Patent Application No. 14/614,796 for CARGO APPORTIONMENT TECHNIQUES filed Feb. 5, 2015 (Morton et al.); U.S. Patent Application No. 29/516,892 for TABLET COMPUTER filed Feb. 
6, 2015 (Bidwell et al.); U.S. Patent Application No. 14/619,093 for METHODS FOR TRAINING A SPEECH RECOGNITION SYSTEM filed Feb. 11, 2015 (Pecorari); U.S. Patent Application No. 14/628,708 for DEVICE, SYSTEM, AND METHOD FOR DETERMINING THE STATUS OF CHECKOUT LANES filed Feb. 23, 2015 (Todeschini); U.S. Patent Application No. 14/630,841 for TERMINAL INCLUDING IMAGING ASSEMBLY filed Feb. 25, 2015 (Gomez et al.); U.S. Patent Application No. 14/635,346 for SYSTEM AND METHOD FOR RELIABLE STORE-AND-FORWARD DATA HANDLING BY ENCODED INFORMATION READING TERMINALS filed March 2, 2015 (Sevier); U.S. Patent Application No. 29/519,017 for SCANNER filed March 2, 2015 (Zhou et al.); U.S. Patent Application No. 14/405,278 for DESIGN PATTERN FOR SECURE STORE filed March 9, 2015 (Zhu et al.); U.S. Patent Application No. 14/660,970 for DECODABLE INDICIA READING TERMINAL WITH COMBINED ILLUMINATION filed March 18, 2015 (Kearney et al.); U.S. Patent Application No. 14/661,013 for REPROGRAMMING SYSTEM AND METHOD FOR DEVICES INCLUDING PROGRAMMING SYMBOL filed March 18, 2015 (Soule et al.); U.S. Patent Application No. 14/662,922 for MULTIFUNCTION POINT OF SALE SYSTEM filed March 19, 2015 (Van Horn et al.); U.S. Patent Application No. 14/663,638 for VEHICLE MOUNT COMPUTER WITH CONFIGURABLE IGNITION SWITCH BEHAVIOR filed March 20, 2015 (Davis et al.); U.S. Patent Application No. 14/664,063 for METHOD AND APPLICATION FOR SCANNING A BARCODE WITH A SMART DEVICE WHILE CONTINUOUSLY RUNNING AND DISPLAYING AN APPLICATION ON THE SMART DEVICE DISPLAY filed March 20, 2015 (Todeschini); U.S. Patent Application No. 14/669,280 for TRANSFORMING COMPONENTS OF A WEB PAGE TO VOICE PROMPTS filed March 26, 2015 (Funyak et al.); U.S. Patent Application No. 14/674,329 for AIMER FOR BARCODE SCANNING filed March 31, 2015 (Bidwell); U.S. Patent Application No. 14/676,109 for INDICIA READER filed April 1, 2015 (Huck); U.S. Patent Application No. 
14/676,327 for DEVICE MANAGEMENT PROXY FOR SECURE DEVICES filed April 1, 2015 (Yeakley et al.); U.S. Patent Application No. 14/676,898 for NAVIGATION SYSTEM CONFIGURED TO INTEGRATE MOTION SENSING DEVICE INPUTS filed April 2, 2015 (Showering); U.S. Patent Application No. 14/679,275 for DIMENSIONING SYSTEM CALIBRATION SYSTEMS AND METHODS filed April 6, 2015 (Laffargue et al.); U.S. Patent Application No. 29/523,098 for HANDLE FOR A TABLET COMPUTER filed April 7, 2015 (Bidwell et al.); U.S. Patent Application No. 14/682,615 for SYSTEM AND METHOD FOR POWER MANAGEMENT OF MOBILE DEVICES filed April 9, 2015 (Murawski et al.); U.S. Patent Application No. 14/686,822 for MULTIPLE PLATFORM SUPPORT SYSTEM AND METHOD filed April 15, 2015 (Qu et al.); U.S. Patent Application No. 14/687,289 for SYSTEM FOR COMMUNICATION VIA A PERIPHERAL HUB filed April 15, 2015 (Kohtz et al.); U.S. Patent Application No. 29/524,186 for SCANNER filed April 17, 2015 (Zhou et al.); U.S. Patent Application No. 14/695,364 for MEDICATION MANAGEMENT SYSTEM filed April 24, 2015 (Sewell et al.); U.S. Patent Application No. 14/695,923 for SECURE UNATTENDED NETWORK AUTHENTICATION filed April 24, 2015 (Kubler et al.); U.S. Patent Application No. 29/525,068 for TABLET COMPUTER WITH REMOVABLE SCANNING DEVICE filed April 27, 2015 (Schulte et al.); U.S. Patent Application No. 14/699,436 for SYMBOL READING SYSTEM HAVING PREDICTIVE DIAGNOSTICS filed April 29, 2015 (Nahill et al.); U.S. Patent Application No. 14/702,110 for SYSTEM AND METHOD FOR REGULATING BARCODE DATA INJECTION INTO A RUNNING APPLICATION ON A SMART DEVICE filed May 1, 2015 (Todeschini et al.); U.S. Patent Application No. 14/702,979 for TRACKING BATTERY CONDITIONS filed May 4, 2015 (Young et al.); U.S. Patent Application No. 14/704,050 for INTERMEDIATE LINEAR POSITIONING filed May 5, 2015 (Charpentier et al.); U.S. Patent Application No. 
14/705,012 for HANDS-FREE HUMAN MACHINE INTERFACE RESPONSIVE TO A DRIVER OF A VEHICLE filed May 6, 2015 (Fitch et al.); U.S. Patent Application No. 14/705,407 for METHOD AND SYSTEM TO PROTECT SOFTWARE-BASED NETWORK-CONNECTED DEVICES FROM ADVANCED PERSISTENT THREAT filed May 6, 2015 (Hussey et al.); U.S. Patent Application No. 14/707,037 for SYSTEM AND METHOD FOR DISPLAY OF INFORMATION USING A VEHICLE-MOUNT COMPUTER filed May 8, 2015 (Chamberlin); U.S. Patent Application No. 14/707,123 for APPLICATION INDEPENDENT DEX/UCS INTERFACE filed May 8, 2015 (Pape); U.S. Patent Application No. 14/707,492 for METHOD AND APPARATUS FOR READING OPTICAL INDICIA USING A PLURALITY OF DATA SOURCES filed May 8, 2015 (Smith et al.); U.S. Patent Application No. 14/710,666 for PRE-PAID USAGE SYSTEM FOR ENCODED INFORMATION READING TERMINALS filed May 13, 2015 (Smith); U.S. Patent Application No. 29/526,916 for CHARGING BASE filed May 14, 2015 (Fitch et al.); U.S. Patent Application No. 14/715,672 for AUGUMENTED REALITY ENABLED HAZARD DISPLAY filed May 19, 2015 (Venkatesha et al.); U.S. Patent Application No. 14/715,916 for EVALUATING IMAGE VALUES filed May 19, 2015 (Ackley); U.S. Patent Application No. 14/722,606 for INTERACTIVE USER INTERFACE FOR CAPTURING A DOCUMENT IN AN IMAGE SIGNAL filed May 27, 2015 (Showering et al.); U.S. Patent Application No. 29/528,165 for IN-COUNTER BARCODE SCANNER filed May 27, 2015 (Oberpriller et al.); U.S. Patent Application No. 14/724,134 for ELECTRONIC DEVICE WITH WIRELESS PATH SELECTION CAPABILITY filed May 28, 2015 (Wang et al.); U.S. Patent Application No. 14/724,849 for METHOD OF PROGRAMMING THE DEFAULT CABLE INTERFACE SOFTWARE IN AN INDICIA READING DEVICE filed May 29, 2015 (Barten); U.S. Patent Application No. 14/724,908 for IMAGING APPARATUS HAVING IMAGING ASSEMBLY filed May 29, 2015 (Barber et al.); U.S. Patent Application No. 14/725,352 for APPARATUS AND METHODS FOR MONITORING ONE OR MORE PORTABLE DATA TERMINALS (Caballero et al.); U.S. 
Patent Application No. 29/528,590 for ELECTRONIC DEVICE filed May 29, 2015 (Fitch et al.); U.S. Patent Application No. 29/528,890 for MOBILE COMPUTER HOUSING filed June 2, 2015 (Fitch et al.); U.S. Patent Application No. 14/728,397 for DEVICE MANAGEMENT USING VIRTUAL INTERFACES CROSS-REFERENCE TO RELATED APPLICATIONS filed June 2, 2015 (Caballero); U.S. Patent Application No. 14/732,870 for DATA COLLECTION MODULE AND SYSTEM filed June 8, 2015 (Powilleit); U.S. Patent Application No. 29/529,441 for INDICIA READING DEVICE filed June 8, 2015 (Zhou et al.); U.S. Patent Application No. 14/735,717 for INDICIA-READING SYSTEMS HAVING AN INTERFACE WITH A USER'S NERVOUS SYSTEM filed June 10, 2015 (Todeschini); U.S. Patent Application No. 14/738,038 for METHOD OF AND SYSTEM FOR DETECTING OBJECT WEIGHING INTERFERENCES filed June 12, 2015 (Amundsen et al.); U.S. Patent Application No. 14/740,320 for TACTILE SWITCH FOR A MOBILE ELECTRONIC DEVICE filed June 16, 2015 (Bandringa); U.S. Patent Application No. 14/740,373 for CALIBRATING A VOLUME DIMENSIONER filed June 16, 2015 (Ackley et al.); U.S. Patent Application No. 14/742,818 for INDICIA READING SYSTEM EMPLOYING DIGITAL GAIN CONTROL filed June 18, 2015 (Xian et al.); U.S. Patent Application No. 14/743,257 for WIRELESS MESH POINT PORTABLE DATA TERMINAL filed June 18, 2015 (Wang et al.); U.S. Patent Application No. 29/530,600 for CYCLONE filed June 18, 2015 (Vargo et al.); U.S. Patent Application No. 14/744,633 for IMAGING APPARATUS COMPRISING IMAGE SENSOR ARRAY HAVING SHARED GLOBAL SHUTTER CIRCUITRY filed June 19, 2015 (Wang); U.S. Patent Application No. 14/744,836 for CLOUD-BASED SYSTEM FOR READING OF DECODABLE INDICIA filed June 19, 2015 (Todeschini et al.); U.S. Patent Application No. 14/745,006 for SELECTIVE OUTPUT OF DECODED MESSAGE DATA filed June 19, 2015 (Todeschini et al.); U.S. Patent Application No. 14/747,197 for OPTICAL PATTERN PROJECTOR filed June 23, 2015 (Thuries et al.); U.S. Patent Application No. 
14/747,490 for DUAL-PROJECTOR THREE-DIMENSIONAL SCANNER filed June 23, 2015 (Jovanovski et al.); and U.S. Patent Application No. 14/748,446 for CORDLESS INDICIA READER WITH A MULTIFUNCTION COIL FOR WIRELESS CHARGING AND EAS DEACTIVATION, filed June 24, 2015 (Xie et al.).

[0048] In the specification and/or figures, typical embodiments of the invention have been disclosed. The present invention is not limited to such exemplary embodiments. The use of the term "and/or" includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.

Claims (22)

  1. A method for dimensioning an object, the method comprising: positioning a dimensioning system so that at least a portion of an object is contained in the dimensioning system's field-of-view; capturing, using the dimensioning system, a range image of the field-of-view; moving either the dimensioning system or the object so that the dimensioning system's field-of-view contains a different portion of the object; repeating the capturing and the moving until a plurality of range images have been captured; combining the plurality of range images to create a composite range-image; and dimensioning the object using the composite range-image.
  2. The method according to claim 1, wherein the capturing, using the dimensioning system, a range image of the field-of-view comprises: projecting, using a pattern projector, a light pattern into the field-of-view; capturing, using a range camera, an image of the field-of-view, the image comprising a reflected light-pattern; and generating 3D data from the image of the reflected light-pattern.
  3. The method according to claim 2, wherein the plurality of range images comprise 3D data sufficient for dimensioning the object.
  4. The method according to claim 3, wherein the 3D data sufficient for dimensioning comprises 3D data from all surfaces of the object.
  5. The method according to claim 3, wherein the 3D data sufficient for dimensioning comprises 3D data from a surface of the object without any gaps in the reflected light-pattern.
  6. The method according to claim 1, wherein the dimensioning system is handheld.
  7. The method according to claim 1, wherein the moving either the dimensioning system or the object comprises generating audio and/or visual messages to guide a user to perform the movement.
  8. The method according to claim 7, wherein the audio and/or visual messages comprise instructions for the user to (i) move the dimensioning system or the object in a particular direction, (ii) move the dimensioning system or the object at a particular speed, and/or (iii) cease moving the dimensioning system or the object.
  9. The method according to claim 1, wherein the moving of either the dimensioning system or the object comprises an automatic movement of the dimensioning system or the object.
  10. The method according to claim 1, wherein the combining the plurality of range images to create a composite range-image, comprises: image-stitching the plurality of range images.
  11. The method according to claim 10, wherein the image-stitching comprises simultaneous localization and mapping (SLAM).
  12. A dimensioning system, comprising: a pattern projector configured to project a light pattern onto an object; a range camera configured to (i) capture an image of a reflected light-pattern, (ii) generate 3D data from the reflected light-pattern, and (iii) create a range image using the 3D data; and a processor communicatively coupled to the pattern projector and the range camera, wherein the processor is configured by software to: (i) trigger the range camera to capture a plurality of range images, (ii) combine the plurality of range images into a composite range-image, and (iii) calculate the dimensions of the object using the composite range-image.
  13. The dimensioning system according to claim 12, wherein the plurality of range images are captured as the spatial relationship between the dimensioning system and the object is changed.
  14. The dimensioning system according to claim 13, wherein (i) each range image in the plurality of range images comprises 3D data from a portion of the object, and (ii) the composite range-image comprises 3D data from the entire object.
  15. The dimensioning system according to claim 14, wherein the processor is further configured by software to: gather tracking/mapping information as the spatial relationship between the range camera and the object is changed.
  16. The dimensioning system according to claim 15, wherein to combine the plurality of range images into a composite range-image, the processor is configured by software to: image-stitch the plurality of range images using the tracking/mapping information.
  17. The dimensioning system according to claim 16, wherein range images in the plurality of range images have partially overlapping fields of view.
  18. The dimensioning system according to claim 15, wherein the processor is further configured to: use the tracking/mapping information to generate messages to help a user change the spatial relationship between the range camera and the object.
  19. The dimensioning system according to claim 18, wherein the messages comprise instructions to (i) move the dimensioning system or the object in a particular direction, (ii) move the dimensioning system or the object at a particular speed, and/or (iii) cease moving the dimensioning system or the object.
  20. The dimensioning system according to claim 12, wherein the dimensioning system is handheld.
  21. A method substantially as hereinbefore described with reference to and/or as illustrated in any one or more of the Figures.
  22. A dimensioning system as hereinbefore described with reference to and/or as illustrated in any one or more of the Figures.
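As an informal illustration of the capture-move-combine loop of claim 1 (and not the patented implementation), the sketch below simulates range images as overlapping strips of a synthetic depth map captured at known offsets, stitches the strips into a composite range image, and dimensions a box sitting on a flat floor from that composite. The function names, the fixed-offset motion model, and the floor-threshold segmentation are all assumptions made for the example.

```python
import numpy as np

def composite_from_strips(strips, step):
    """Stitch range-image strips captured at known horizontal offsets
    into one composite range image (toy stand-in for image-stitching)."""
    h, w = strips[0].shape
    composite = np.full((h, step * (len(strips) - 1) + w), np.nan)
    for i, strip in enumerate(strips):
        x0 = i * step
        # np.fmin keeps the non-NaN value, and the nearer reading on overlap
        composite[:, x0:x0 + w] = np.fmin(composite[:, x0:x0 + w], strip)
    return composite

def dimension_object(composite, floor_range, px_per_unit):
    """Estimate length/width/height of a box from a top-down range image:
    object pixels are those nearer to the camera than the floor."""
    mask = composite < floor_range - 1e-6
    ys, xs = np.nonzero(mask)
    length = float((xs.max() - xs.min() + 1) / px_per_unit)
    width = float((ys.max() - ys.min() + 1) / px_per_unit)
    height = float(floor_range - composite[mask].min())
    return length, width, height

# Synthetic scene: floor at range 2.0, a box whose top is at range 1.5.
scene = np.full((40, 100), 2.0)
scene[10:30, 20:80] = 1.5
# Simulate partial fields of view: 40-pixel-wide captures every 20 pixels.
strips = [scene[:, x:x + 40] for x in range(0, 61, 20)]
comp = composite_from_strips(strips, step=20)
print(dimension_object(comp, floor_range=2.0, px_per_unit=10.0))  # -> (6.0, 2.0, 0.5)
```

No single strip sees the whole box; only the composite does, which is the point of the claimed combining step.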
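The generation of 3D data from a reflected light pattern recited in claim 2 is commonly done by triangulation in structured-light systems: a pattern element observed shifted by a disparity d between its projected and imaged positions lies at range Z = f·b/d for focal length f and projector-camera baseline b (rectified pinhole model). The specification does not spell this formula out, so the snippet below is a generic textbook relation, not the patent's stated method, and its names are illustrative.

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Triangulate range for a projector/camera pair with a known
    baseline: Z = f * b / d (rectified pinhole geometry)."""
    if disparity_px <= 0:
        raise ValueError("pattern element not matched")
    return focal_px * baseline_m / disparity_px

# A dot shifted 50 px, with a 500 px focal length and 10 cm baseline,
# triangulates to a range of 1.0 m.
print(depth_from_disparity(50, 500, 0.10))  # -> 1.0
```

Note the inverse relationship: halving the disparity doubles the estimated range, which is why range accuracy degrades with distance.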
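For the SLAM-based image-stitching of claim 11, a minimal stand-in for the localization step is to estimate the inter-frame camera motion by minimizing the overlap error between consecutive range images; a real SLAM pipeline would jointly refine full 6-DoF poses and a map rather than a 1-D pixel shift. Everything here is an illustrative sketch under that simplifying assumption.

```python
import numpy as np

def estimate_shift(prev_frame, curr_frame, max_shift=30):
    """Localization step of a stitching pipeline: find the horizontal
    pixel shift that best aligns two consecutive range images by
    minimizing mean squared error over their overlapping region."""
    best_shift, best_err = 0, float("inf")
    w = prev_frame.shape[1]
    for s in range(0, max_shift + 1):
        err = np.mean((prev_frame[:, s:] - curr_frame[:, : w - s]) ** 2)
        if err < best_err:
            best_shift, best_err = s, err
    return best_shift

# Synthetic check: the second frame views the scene 12 pixels to the right.
rng = np.random.default_rng(0)
scene = rng.random((30, 90))
prev, curr = scene[:, :60], scene[:, 12:72]
print(estimate_shift(prev, curr))  # -> 12
```

The recovered shifts are the tracking information claims 15-16 use to place each range image into the composite.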
GB201517843A 2014-10-10 2015-10-08 Image-stitching for dimensioning Pending GB201517843D0 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US201462062175 true 2014-10-10 2014-10-10
US14870488 US20160104274A1 (en) 2014-10-10 2015-09-30 Image-stitching for dimensioning

Publications (2)

Publication Number Publication Date
GB201517843D0 GB201517843D0 (en) 2015-11-25
GB2531928A true true GB2531928A (en) 2016-05-04

Family

ID=55130788

Family Applications (1)

Application Number Title Priority Date Filing Date
GB201517843A Pending GB201517843D0 (en) 2014-10-10 2015-10-08 Image-stitching for dimensioning

Country Status (1)

Country Link
GB (1) GB201517843D0 (en)

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9464885B2 (en) 2013-08-30 2016-10-11 Hand Held Products, Inc. System and method for package dimensioning
US9557166B2 (en) 2014-10-21 2017-01-31 Hand Held Products, Inc. Dimensioning system with multipath interference mitigation
US9752864B2 (en) 2014-10-21 2017-09-05 Hand Held Products, Inc. Handheld dimensioning system with feedback
US9762793B2 (en) 2014-10-21 2017-09-12 Hand Held Products, Inc. System and method for dimensioning
US9779546B2 (en) 2012-05-04 2017-10-03 Intermec Ip Corp. Volume dimensioning systems and methods
US9779276B2 (en) 2014-10-10 2017-10-03 Hand Held Products, Inc. Depth sensor based auto-focus system for an indicia scanner
US9786101B2 (en) 2015-05-19 2017-10-10 Hand Held Products, Inc. Evaluating image values
US9823059B2 (en) 2014-08-06 2017-11-21 Hand Held Products, Inc. Dimensioning system with guided alignment
US9835486B2 (en) 2015-07-07 2017-12-05 Hand Held Products, Inc. Mobile dimensioner apparatus for use in commerce
US9841311B2 (en) 2012-10-16 2017-12-12 Hand Held Products, Inc. Dimensioning system
US9857167B2 (en) 2015-06-23 2018-01-02 Hand Held Products, Inc. Dual-projector three-dimensional scanner
US9897434B2 (en) 2014-10-21 2018-02-20 Hand Held Products, Inc. Handheld dimensioning system with measurement-conformance feedback
US9939259B2 (en) 2012-10-04 2018-04-10 Hand Held Products, Inc. Measuring object dimensions using mobile computer
US9940721B2 (en) 2016-06-10 2018-04-10 Hand Held Products, Inc. Scene change detection in a dimensioner
US10007858B2 (en) 2012-05-15 2018-06-26 Honeywell International Inc. Terminals and methods for dimensioning objects
US10025314B2 (en) 2016-01-27 2018-07-17 Hand Held Products, Inc. Vehicle positioning and object avoidance
US10060729B2 (en) 2014-10-21 2018-08-28 Hand Held Products, Inc. Handheld dimensioner with data-quality indication
US10066982B2 (en) 2015-06-16 2018-09-04 Hand Held Products, Inc. Calibrating a volume dimensioner

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070299338A1 (en) * 2004-10-14 2007-12-27 Stevick Glen R Method and apparatus for dynamic space-time imaging system
US20080247635A1 (en) * 2006-03-20 2008-10-09 Siemens Power Generation, Inc. Method of Coalescing Information About Inspected Objects
WO2014149702A1 (en) * 2013-03-15 2014-09-25 Faro Technologies, Inc. Three-dimensional coordinate scanner and method of operation
WO2016020038A1 (en) * 2014-08-08 2016-02-11 Cargometer Gmbh Device and method for determining the volume of an object moved by an industrial truck

Also Published As

Publication number Publication date Type
GB201517843D0 (en) 2015-11-25 grant

Similar Documents

Publication Publication Date Title
US20100165152A1 (en) Processing Images Having Different Focus
Lindner et al. Lateral and depth calibration of PMD-distance sensors
US7557935B2 (en) Optical coordinate input device comprising few elements
US20030210407A1 (en) Image processing method, image processing system and image processing apparatus
US20150063676A1 (en) System and Method for Package Dimensioning
US7310431B2 (en) Optical methods for remotely measuring objects
US20160109220A1 (en) Handheld dimensioning system with feedback
US20120242795A1 (en) Digital 3d camera using periodic illumination
US20090096783A1 (en) Three-dimensional sensing using speckle patterns
US20130076865A1 (en) Position/orientation measurement apparatus, processing method therefor, and non-transitory computer-readable storage medium
US20140362184A1 (en) Method of Error Correction for 3D Imaging Device
US20160109224A1 (en) Dimensioning system with multipath interference mitigation
US20140078519A1 (en) Laser Scanner
US20080044079A1 (en) Object-based 3-dimensional stereo information generation apparatus and method, and interactive system using the same
US20150003673A1 (en) Dimensioning system
US20120281087A1 (en) Three-dimensional scanner for hand-held phones
US20140104416A1 (en) Dimensioning system
US20110187820A1 (en) Depth camera compatibility
EP2722656A1 (en) Integrated dimensioning and weighing system
US20160040982A1 (en) Dimensioning system with guided alignment
US20060153558A1 (en) Method and apparatus for capturing images using a color laser projection display
WO2007096893A2 (en) Range mapping using speckle decorrelation
Wasenmüller et al. Comparison of kinect v1 and v2 depth images in terms of accuracy and precision
EP3007096A1 (en) Depth sensor based auto-focus system for an indicia scanner
US20140125775A1 (en) Three-dimensional image sensors