US20230153986A1 - Grading cosmetic appearance of a test object - Google Patents
Grading cosmetic appearance of a test object
- Publication number
- US20230153986A1 (application US 17/525,643)
- Authority
- US
- United States
- Prior art keywords
- test object
- interest
- region
- processor
- images
- Legal status
- Pending
Classifications
- G06T7/001 — Image analysis: industrial image inspection using an image reference approach
- G06K7/1413 — Optical code recognition: 1D bar codes
- G06K7/1443 — Optical code recognition: locating of the code in an image
- G01B11/24 — Optical measurement of contours or curvatures
- G01N21/8851 — Investigating the presence of flaws or contamination: scan or image signal processing
- G01N2021/8854 — Grading and classifying of flaws
- G06T7/194 — Segmentation; edge detection involving foreground-background segmentation
- G06T2207/30108 — Industrial image inspection
- G06T2207/30164 — Workpiece; machine component
Definitions
- At least some embodiments disclosed herein relate generally to cosmetic evaluation of an object. More particularly, the embodiments relate to systems, devices, and methods for computer-aided cosmetic evaluation and categorization of an object such as, but not limited to, an electronic device or the like.
- Computing devices (e.g., mobile devices, such as cellular telephones, tablets, etc.) are often evaluated before being refurbished or resold.
- One aspect includes inspecting the visual characteristics of the computing device to grade its visual appearance. Some of these devices are then refurbished and can be resold to new users.
- a method includes receiving, by a processor, a plurality of images of a test object, the plurality of images including a plurality of surfaces of the test object. In some embodiments, the method includes receiving, by the processor, an image of a barcode on the test object. In some embodiments, the method includes selecting, by the processor, a region of interest in each of the plurality of images of the test object. In some embodiments, the region of interest includes the test object having a background removed. In some embodiments, for each of the selected regions of interest, the method includes comparing, by the processor, the region of interest with a corresponding profile image and identifying defects in the region of interest. In some embodiments, the corresponding profile image is determined from the image of the barcode on the test object. In some embodiments, the method includes grading, by the processor, a cosmetic appearance of each region of interest based on the identified defects. In some embodiments, the method includes storing the grades of the cosmetic appearance for each region of interest.
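The claimed flow can be sketched as a small pipeline. Every helper callable and the `profile_db` layout below are hypothetical stand-ins for illustration, not names from the specification:

```python
def grade_test_object(images, barcode_value, profile_db,
                      select_roi, find_defects, grade_defects):
    """Sketch of the claimed grading flow; helpers are hypothetical."""
    # Profile images are looked up using the decoded barcode on the test object.
    profiles = profile_db[barcode_value]
    grades = {}
    for view, image in images.items():
        roi = select_roi(image)                      # test object with background removed
        defects = find_defects(roi, profiles[view])  # compare ROI against its profile image
        grades[view] = grade_defects(defects)        # cosmetic grade for this region
    return grades  # stored, and optionally sent to a remote device
```

In use, the callables would wrap whatever segmentation, comparison, and grading routines a given embodiment employs.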
- the method includes sending the grades of the cosmetic appearance for each region of interest to a remote device.
- the plurality of images of the test object are received from a remote device.
- the remote device includes a camera configured to capture the plurality of images.
- the remote device is a cosmetic inspection device.
- the remote device is a mobile device.
- the barcode is a QR code.
- the method includes aligning the plurality of images with the corresponding profile images.
- the test object is a mobile device.
- the method includes determining a value of the mobile device based on the grades of the cosmetic appearance for each region of interest.
- the test object is a mobile device.
- the barcode is a QR code displayed on a display of the mobile device.
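Determining a value of the mobile device from the per-region grades could be as simple as applying a deduction schedule; the grade letters and percentages here are invented for illustration:

```python
GRADE_DEDUCTIONS = {"A": 0.00, "B": 0.10, "C": 0.35}  # hypothetical schedule

def device_value(base_value, region_grades):
    """Hypothetical valuation: deduct according to the worst-graded region."""
    worst = max(region_grades.values(), key=lambda g: GRADE_DEDUCTIONS[g])
    return round(base_value * (1 - GRADE_DEDUCTIONS[worst]), 2)
```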
- a system includes a server device including a processor and a memory.
- the processor of the server device is configured to receive a plurality of images of a test object.
- the plurality of images include a plurality of surfaces of the test object.
- the processor is configured to receive an image of a barcode on the test object.
- the processor is configured to select a region of interest in each of the plurality of images of the test object.
- the region of interest includes the test object having a background removed.
- the processor is configured to compare each region of interest with a corresponding profile image and identify defects in each region of interest.
- the corresponding profile image is determined from the image of the barcode on the test object.
- the processor is configured to grade a cosmetic appearance of each region of interest based on the identified defects. In some embodiments, the processor is configured to store the grades of the cosmetic appearance for each region of interest.
- the processor is configured to send the grades of the cosmetic appearance for each region of interest to a remote device over a network.
- the plurality of images of the test object are received from a remote device over a network.
- the remote device includes a camera configured to capture the plurality of images.
- the remote device is a cosmetic inspection device.
- the remote device is a mobile device.
- the barcode is a QR code.
- the processor is configured to align the plurality of images with the corresponding profile images.
- the test object is a mobile device.
- the processor is configured to determine a value of the mobile device based on the grades of the cosmetic appearance for each region of interest.
- the test object is a mobile device.
- the barcode is a QR code displayed on a display of the mobile device.
- a non-transitory computer-readable storage medium includes instructions that, when executed by a processor, cause the processor to perform a method.
- the method includes receiving, by the processor, a plurality of images of a test object, the plurality of images including a plurality of surfaces of the test object.
- the method includes receiving, by the processor, an image of a barcode on the test object.
- the method includes selecting, by the processor, a region of interest in each of the plurality of images of the test object.
- the region of interest includes the test object having a background removed.
- the method includes comparing, by the processor, each region of interest with a corresponding profile image and identifying defects in each region of interest.
- the corresponding profile image is determined from the image of the barcode on the test object.
- the method includes grading, by the processor, a cosmetic appearance of each region of interest based on the identified defects.
- the method includes storing the grades of the cosmetic appearance for each region of interest.
- the method includes sending the grades of the cosmetic appearance for each region of interest to a remote device over a network.
- FIG. 1 shows a cosmetic grading system, according to some embodiments.
- FIG. 2 shows a system for validation of installation of a component in an assembly, according to an embodiment.
- FIG. 3 shows a portion of the system for validation of installation of a component of an assembly of FIG. 2 , according to an embodiment.
- FIG. 4 shows a schematic architecture for the system of FIG. 2 , according to an embodiment.
- FIG. 5 shows a block diagram illustrating an internal architecture of an example of a mobile device, according to some embodiments.
- FIG. 6 shows a flowchart of a method, according to some embodiments.
- FIG. 7 shows a flowchart of a method, according to some embodiments.
- Various objects such as, for example, a shipping box or container can show damage such as scuffmarks, dents, rips, tears, or the like.
- Other examples include computer devices such as, but not limited to, smartphones, tablets, laptops, smartwatches, and the like, which can show damage such as cracks, scuff marks, or the like.
- the visible damage can be important in understanding whether the shipping box or container was damaged during shipment, or whether a computer device has lost some of its value.
- Objects such as computer devices also include numerous components that are assembled together.
- the assembly process can include fasteners (e.g., screws or the like) that keep the various components secured. It is important that these fasteners be installed correctly (e.g., all screws installed (e.g., no missing screws), proper screws installed, screws properly tightened, or the like) as part of the quality control process.
- the embodiments disclosed herein are directed to systems and methods for inspecting an appearance of an object (e.g., a computer device such as, but not limited to, a smartphone, a tablet, a laptop, a smartwatch, a cellphone, or the like).
- the inspection of the appearance and cosmetic grading of the object can be utilized during, for example, manufacturing of a device, in a retail setting in which computer devices are sold/purchased, or the like.
- An image of an object can be captured by each of a plurality of cameras (in a dedicated cosmetic grading device), or alternatively a plurality of images can be captured by a single camera.
- “Profile images,” i.e., images of a particular reference object, can be captured in a calibration process and used to train the cosmetic grading system.
- Each image of an object being validated (i.e., a test object) is compared against the corresponding profile image to determine a cosmetic score.
- FIG. 1 shows a cosmetic grading system 50 , according to some embodiments.
- the cosmetic grading system 50 can be used to, for example, provide a variety of different cosmetic grades for various test objects.
- the cosmetic grading system 50 can provide a cosmetic grade for different types of test objects captured by different types of devices.
- the cosmetic grading system 50 can provide a cosmetic grading service that can be accessed by a variety of different remote devices that are able to utilize a server of the cosmetic grading system 50 .
- the cosmetic grading system 50 generally includes a server device 52 in communication with a computer device 54 through a network 56 .
- the system 50 can also include a computer device 58 connected to the server device 52 through the network 56 .
- the server device 52 can include a cosmetic grading application 62 that is configured to compare received images of a test object with images of a corresponding profile test object.
- the cosmetic grading application 62 can be in communication with a database including profile images of various test objects.
- the profile images can be of different views of the test object.
- the profile images can be associated with a particular test object according to a machine readable code such as a barcode.
- the barcode can be a QR code.
- the cosmetic grading application 62 can then compare images received from the computer device 54 or 58 and, based on the comparison, grade the cosmetic appearance of the test object.
- the cosmetic grading application 62 can then store the result and can also output the result via the network 56 to the computer device 54 or the computer device 58.
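The barcode-keyed lookup of profile images might be sketched as below. The `make:model` payload format and the dictionary-backed database are assumptions for illustration:

```python
def profile_images_for(decoded_barcode, profile_db):
    """Resolve stored profile images from a decoded barcode/QR payload.
    The 'make:model' payload format is hypothetical."""
    make, model = decoded_barcode.split(":", 1)
    try:
        return profile_db[(make, model)]
    except KeyError:
        # No calibration was performed for this object type.
        raise LookupError(f"no calibration profile for {make} {model}")
```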
- the computer device 54 and the computer device 58 can be used to grade a cosmetic appearance of a test object without the computer device 54 or the computer device 58 being specifically configured with a cosmetic grading application.
- the network 56 may be referred to as the communications network 56 .
- Examples of the network 56 include, but are not limited to, a local area network (LAN), a wide area network (WAN), the Internet, or the like.
- the computer device 54 or computer device 58 can transmit data via the network 56 through a wireless connection using Wi-Fi, Bluetooth, or other similar wireless communication protocols.
- the computer device 54 or computer device 58 can transmit data via the network 56 through a cellular, 3G, 4G, 5G, or other wireless protocol.
- the computer device 54 can be a device specifically configured for capturing images of test objects.
- An example of the computer device 54 is a cosmetic inspection device such as the system 100 described in additional detail in accordance with FIGS. 2 - 4 below.
- the computer device 58 can include an application that permits a user to send images of a test object over the network 56 to the server device 52 for cosmetic grading.
- the computer device 58 includes a camera and a network input/output to accomplish the communication and image capturing.
- the computer device 58 includes a display for showing results of the cosmetic grading.
- the computer device 58 is a smartphone, a tablet, or the like.
- the computer device 58 can also be a laptop or a desktop computer having a camera attached thereto.
- FIG. 2 shows a system 100 for grading an appearance of a test object 102 , according to some embodiments.
- the system 100 can generally be used to, for example, capture images of the test object and communicate with a server device having a cosmetic grading application to assess a cosmetic appearance of the test object.
- the system 100 can be a kiosk implemented in a retail environment; the test object can be a shipping container or an electronic device (e.g., a smartphone, a smartwatch, a tablet, or the like), and the system 100 can determine whether the cosmetic appearance of the test object is damaged.
- the validation can be part of a quality control process during manufacturing.
- the test object 102 is a smartphone. It is to be appreciated that the smartphone is an example, and the test object 102 can vary beyond a smartphone. Examples of other test objects 102 include, but are not limited to, a tablet, a smartwatch, a mobile phone other than a smartphone, a personal digital assistant (PDA), a laptop computing device, or the like. Furthermore, the maker or manufacturer of the test object 102 is not limited. That is, the system 100 can be used to validate the installation correctness of components in test objects 102 from different manufacturers so long as a calibration procedure is performed to create a profile image for the corresponding test object 102 .
- the system 100 includes a display 104 for displaying results of the validation to the user.
- the display 104 can be a combined display and input (e.g., a touchscreen).
- the display 104 can be a display of a tablet or the like.
- a memory of the tablet can store one or more programs to be executed by a processing device of the tablet for validating the correctness of the installation of the component in the test object 102 .
- the display 104 is secured to housing 106 of the system 100 .
- the display 104 can be separate from the housing 106 (i.e., not secured to the housing 106 , but positioned near the system 100 and electronically connected to the system 100 ).
- a platform 108 is utilized to position the test object 102 within the system 100 for validation.
- the platform 108 enables each test object 102 placed into the system 100 for validation to be placed in substantially the same location. As a result, the effort required to determine whether the profile image and the test object 102 under test are in the same location relative to the cameras of the system 100 can be reduced.
- the platform 108 is shown and described in additional detail in accordance with FIG. 3 below.
- the system 100 can be portable.
- the illustrated embodiment shows system 100 with a handle 110 for carrying the system 100 .
- portability of the system 100 is optional, and accordingly, the handle 110 is optional.
- the system 100 may be sized differently based on the type of test object 102 to be validated.
- FIG. 3 shows the platform 108 of the system 100 of FIG. 2 for validation of installation of a component in a test object 102, according to an embodiment.
- the platform 108 includes a tiered surface having a first surface 112 and a second surface 116 .
- a step is thus formed between the first surface 112 and the second surface 116 .
- a plane of the first surface 112 and a plane of the second surface 116 are parallel.
- the second surface 116 is L-shaped when viewed from a top view.
- the second surface 116 is positioned a height H from the first surface 112 .
- the height H between the first surface 112 and the second surface 116 creates an abutment surface 118 .
- the height H is selected such that the abutment surface 118 serves as a stop for the test object 102 when placed within the system 100 .
- the abutment surface 118 is configured to provide a stop for the test object 102 on two sides of the test object 102 (i.e., a major dimension of the test object 102 and a minor dimension of the test object 102 ).
- the height H is selected to be smaller than a thickness T of the test object 102 being validated in the system 100 .
- the height H is selected to be smaller than the thickness T of the test object 102 to not hinder side views of the test object 102 .
- the height H is selected to be large enough that an operator inserting the test object 102 can abut the test object 102 with the abutment surface 118 . In this manner, the abutment surface 118 serves as a stop for the operator when inserting the test object 102 into the system 100 .
- the height H can be substantially the same as the thickness T of the test object 102 .
- the configuration of the platform 108 is helpful in establishing the location of the test object 102 .
- the system 100 can be calibrated to generate the profile images using a single assembly since the coordinate system is generally fixed.
- the platform 108 can, as a result, be used to account for minor variations in placement of the test object 102 by the operator, as the offset from the expected coordinate system can be determined based on the location of the test object 102 relative to a calibration test object 102.
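Compensating for minor placement variation amounts to measuring the offset of the test object from its calibrated position and translating coordinates by that offset. The corner-based formulation below is an illustrative assumption:

```python
def placement_offset(detected_corner, calibration_corner):
    """Pixel offset of the test object from its calibrated position."""
    return (detected_corner[0] - calibration_corner[0],
            detected_corner[1] - calibration_corner[1])

def to_profile_coords(point, offset):
    """Map a point in the captured image into the profile image's frame."""
    return (point[0] - offset[0], point[1] - offset[1])
```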
- FIG. 4 shows a schematic architecture for the system 100 of FIG. 2 , according to an embodiment.
- the system 100 generally includes a plurality of cameras 120, a motion sensor 122, a proximity sensor 124, a processing device 126, memory 128, a network input/output (I/O) 130, user I/O 132, storage 134, and an interconnect 136.
- the processing device 126 , memory 128 , network input/output (I/O) 130 , user I/O 132 , storage 134 , and interconnect 136 can be within the housing 106 in some embodiments.
- the processing device 126 , memory 128 , network input/output (I/O) 130 , user I/O 132 , storage 134 , and interconnect 136 can be external from the housing 106 .
- the plurality of cameras 120 are arranged in the system 100 to capture different views of the test object 102 .
- the cameras 120 are digital cameras.
- the system 100 includes three cameras 120 arranged to capture a top view, an up-front view, and an up-side view.
- the system 100 includes four cameras 120 arranged to capture a top view, an up-front view, a first up-side view, and a second (opposite) up-side view. It will be appreciated that a single camera 120 could be used, although accuracy may be improved when a plurality of cameras 120 are used as a component may appear to be correctly installed in a first view but be determined to be incorrectly installed in a second view.
- the motion sensor 122 can be, for example, a laser sensor that can be triggered when an object (i.e., test object 102 ) breaks the laser signal.
- the motion sensor 122 can be installed at the opening to the housing 106 . In some embodiments, the motion sensor 122 may not be included.
- the proximity sensor 124 can be a sensor to determine when an object is placed near it.
- the proximity sensor 124 can be placed in the platform 108 of the system 100 .
- the cameras 120 can capture images of the test object 102 on the platform 108 .
- the proximity sensor 124 can be included regardless of whether the motion sensor 122 is present. In some embodiments with both motion sensor 122 and proximity sensor 124 , the image capturing may be performed after the proximity sensor 124 detects the test object 102 .
- automatically causing the image capturing and subsequent validation to be performed using the proximity sensor 124 , or a combination of the proximity sensor 124 and the motion sensor 122 can increase a number of test objects 102 that can be validated in a set period. That is, reducing effort of a human operator, or even allowing for a robotic arm to load the test object 102 into the system 100 for validation, can reduce an amount of time and effort needed to review the quality of the manufacturing process.
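The sensor-gated capture described above might reduce to logic like the following; the precedence (proximity governs when present, motion sensor as fallback) is one reading of the embodiments, not the only one:

```python
def should_capture(motion_triggered, proximity_detected,
                   has_motion=True, has_proximity=True):
    """Hypothetical gating: when a proximity sensor is present, capture only
    once it detects the test object; otherwise fall back to the motion sensor."""
    if has_proximity:
        return proximity_detected
    return has_motion and motion_triggered
```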
- the processing device 126 can retrieve and execute programming instructions stored in the memory 128 , the storage 134 , or combinations thereof.
- the processing device 126 can also store and retrieve application data residing in the memory 128 .
- the interconnect 136 is used to transmit programming instructions and/or application data between the processing device 126 , the user I/O 132 , the memory 128 , the storage 134 , and the network I/O 130 .
- the interconnect 136 can, for example, be one or more busses or the like.
- the processing device 126 can be a single processing device, multiple processing devices, or a single processing device having multiple processing cores. In some embodiments, the processing device 126 can be a single-threaded processing device. In some embodiments, the processing device 126 can be a multi-threaded processing device.
- the memory 128 is generally included to be representative of a random-access memory such as, but not limited to, Static Random-Access Memory (SRAM), Dynamic Random-Access Memory (DRAM), or Flash.
- the memory 128 can be a volatile memory.
- the memory 128 can be a non-volatile memory.
- at least a portion of the memory 128 can be virtual memory.
- the storage 134 is generally included to be representative of a non-volatile memory such as, but not limited to, a hard disk drive, a solid-state device, removable memory cards, optical storage, flash memory devices, network attached storage (NAS), or connections to storage area network (SAN) devices, or other similar devices that may store non-volatile data.
- the storage 134 is a computer readable medium.
- the storage 134 can include storage that is external to the user device, such as in a cloud.
- FIG. 5 shows a block diagram illustrating an internal architecture of an example of a computer, according to some embodiments.
- the computer can be, for example, the server device 52 , the computer device 54 , or the computer device 58 , in accordance with some embodiments.
- a computer as referred to herein refers to any device with a processor capable of executing logic or coded instructions, and could be a server, personal computer, set top box, smart phone, pad computer or media device, to name a few such devices.
- internal architecture 150 includes one or more processing units (also referred to herein as CPUs) 162 , which interface with at least one computer bus 152 .
- Also interfacing with computer bus 152 are persistent storage medium/media 156 , network interface 164 , memory 154 , e.g., random access memory (RAM), run-time transient memory, read only memory (ROM), etc., media disk drive interface 158 as an interface for a drive that can read and/or write to media including removable media such as floppy, CD ROM, DVD, etc. media, display interface 160 as interface for a monitor or other display device, keyboard interface 166 as interface for a keyboard, pointing device interface 168 as an interface for a mouse or other pointing device, and miscellaneous other interfaces 170 , 172 not shown individually, such as parallel and serial port interfaces, a universal serial bus (USB) interface, and the like.
- Memory 154 interfaces with computer bus 152 so as to provide information stored in memory 154 to CPU 162 during execution of software programs such as an operating system, application programs, device drivers, and software modules that comprise program code, and/or computer executable process operations, incorporating functionality described herein, e.g., one or more of process flows described herein.
- CPU 162 first loads computer executable process operations from storage, e.g., memory 154 , storage medium/media 156 , removable media drive, and/or other storage device.
- CPU 162 can then execute the stored process operations in order to execute the loaded computer-executable process operations.
- Stored data e.g., data stored by a storage device, can be accessed by CPU 162 during the execution of computer-executable process operations.
- Persistent storage medium/media 156 is a computer readable storage medium(s) that can be used to store software and data, e.g., an operating system and one or more application programs. Persistent storage medium/media 156 can also be used to store device drivers, such as one or more of a digital camera driver, monitor driver, printer driver, scanner driver, or other device drivers, web pages, content files, playlists, and other files. Persistent storage medium/media 156 can further include program modules and data files used to implement one or more embodiments of the present disclosure.
- a module is a software, hardware, or firmware (or combinations thereof) system, process or functionality, or component thereof, that performs or facilitates the processes, features, and/or functions described herein (with or without human interaction or augmentation).
- a module can include sub-modules.
- Software components of a module may be stored on a computer readable medium. Modules may be integral to one or more servers, or be loaded and executed by one or more servers. One or more modules may be grouped into an engine or an application.
Description
- At least some embodiments disclosed herein relate generally to cosmetic evaluation of an object. More particularly, the embodiments relate to systems, devices, and methods for computer-aided cosmetic evaluation and categorization of an object such as, but not limited to, an electronic device or the like.
- Large volumes of computing devices (e.g., mobile devices, such as cellular telephones, tablets, etc.) are recycled and often refurbished. There are numerous aspects to the refurbishing process. One aspect includes inspecting the visual characteristics of the computing device to grade its visual appearance. Some of these devices are then refurbished and can be resold to new users.
- In some embodiments, a method includes receiving, by a processor, a plurality of images of a test object, the plurality of images including a plurality of surfaces of the test object. In some embodiments, the method includes receiving, by the processor, an image of a barcode on the test object. In some embodiments, the method includes selecting, by the processor, a region of interest in each of the plurality of images of the test object. In some embodiments, the region of interest includes the test object having a background removed. In some embodiments, for the plurality of regions of interest as selected, the method includes comparing, by the processor, each region of interest with a corresponding profile image and identifying defects in each region of interest. In some embodiments, the corresponding profile image is determined from the image of the barcode on the test object. In some embodiments, the method includes grading, by the processor, a cosmetic appearance of each region of interest based on the identified defects. In some embodiments, the method includes storing the grades of the cosmetic appearance for each region of interest.
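- The claimed steps can be illustrated with a short sketch. The helper names, the bounding-box background-removal heuristic, and the pixel-difference grading metric below are assumptions for illustration only; the disclosure does not fix any particular implementation:

```python
import numpy as np

def select_region_of_interest(image, background_value=0):
    # Crop to the bounding box of non-background pixels; a stand-in for
    # the background-removal step (a real system might use segmentation).
    mask = image != background_value
    rows = np.any(mask, axis=1)
    cols = np.any(mask, axis=0)
    r0, r1 = np.where(rows)[0][[0, -1]]
    c0, c1 = np.where(cols)[0][[0, -1]]
    return image[r0:r1 + 1, c0:c1 + 1]

def grade_cosmetic_appearance(test_images, profile_images):
    # Compare each region of interest against its profile image and grade
    # it by the fraction of matching pixels (an illustrative metric only).
    grades = {}
    for view, image in test_images.items():
        roi = select_region_of_interest(image)
        grades[view] = 1.0 - float(np.mean(roi != profile_images[view]))
    return grades
```

The resulting per-view grades could then be stored, or sent to a remote device, as the summary above describes.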
- In some embodiments, the method includes sending the grades of the cosmetic appearance for each region of interest to a remote device.
- In some embodiments, the plurality of images of the test object are received from a remote device. In some embodiments, the remote device includes a camera configured to capture the plurality of images.
- In some embodiments, the remote device is a cosmetic inspection device.
- In some embodiments, the remote device is a mobile device.
- In some embodiments, the barcode is a QR code.
- In some embodiments, the method includes aligning the plurality of images with the corresponding profile images.
- In some embodiments, the test object is a mobile device. In some embodiments, the method includes determining a value of the mobile device based on the grades of the cosmetic appearance for each region of interest.
- In some embodiments, the test object is a mobile device. In some embodiments, the barcode is a QR code displayed on a display of the mobile device.
- In some embodiments, a system includes a server device including a processor and a memory. In some embodiments, the processor of the server device is configured to receive a plurality of images of a test object. In some embodiments, the plurality of images include a plurality of surfaces of the test object. In some embodiments, the processor is configured to receive an image of a barcode on the test object. In some embodiments, the processor is configured to select a region of interest in each of the plurality of images of the test object. In some embodiments, the region of interest includes the test object having a background removed. In some embodiments, for the plurality of regions of interest as selected, the processor is configured to compare each region of interest with a corresponding profile image and identify defects in each region of interest. In some embodiments, the corresponding profile image is determined from the image of the barcode on the test object. In some embodiments, the processor is configured to grade a cosmetic appearance of each region of interest based on the identified defects. In some embodiments, the processor is configured to store the grades of the cosmetic appearance for each region of interest.
- In some embodiments, the processor is configured to send the grades of the cosmetic appearance for each region of interest to a remote device over a network.
- In some embodiments, the plurality of images of the test object are received from a remote device over a network. In some embodiments, the remote device includes a camera configured to capture the plurality of images.
- In some embodiments, the remote device is a cosmetic inspection device.
- In some embodiments, the remote device is a mobile device.
- In some embodiments, the barcode is a QR code.
- In some embodiments, the processor is configured to align the plurality of images with the corresponding profile images.
- In some embodiments, the test object is a mobile device. In some embodiments, the processor is configured to determine a value of the mobile device based on the grades of the cosmetic appearance for each region of interest.
- In some embodiments, the test object is a mobile device. In some embodiments, the barcode is a QR code displayed on a display of the mobile device.
- In some embodiments, a non-transitory computer-readable storage medium includes instructions that, when executed by a processor, cause the processor to perform a method. In some embodiments, the method includes receiving, by the processor, a plurality of images of a test object, the plurality of images including a plurality of surfaces of the test object. In some embodiments, the method includes receiving, by the processor, an image of a barcode on the test object. In some embodiments, the method includes selecting, by the processor, a region of interest in each of the plurality of images of the test object. In some embodiments, the region of interest includes the test object having a background removed. In some embodiments, for the plurality of regions of interest as selected, the method includes comparing, by the processor, each region of interest with a corresponding profile image and identifying defects in each region of interest. In some embodiments, the corresponding profile image is determined from the image of the barcode on the test object. In some embodiments, the method includes grading, by the processor, a cosmetic appearance of each region of interest based on the identified defects. In some embodiments, the method includes storing the grades of the cosmetic appearance for each region of interest.
- In some embodiments, the method includes sending the grades of the cosmetic appearance for each region of interest to a remote device over a network.
- References are made to the accompanying drawings that form a part of this disclosure and illustrate embodiments in which the systems and methods described in this Specification can be practiced.
- FIG. 1 shows a cosmetic grading system, according to some embodiments.
- FIG. 2 shows a system for validation of installation of a component in an assembly, according to an embodiment.
- FIG. 3 shows a portion of the system for validation of installation of a component of an assembly of FIG. 2, according to an embodiment.
- FIG. 4 shows a schematic architecture for the system of FIG. 2, according to an embodiment.
- FIG. 5 shows a block diagram illustrating an internal architecture of an example of a mobile device, according to some embodiments.
- FIG. 6 shows a flowchart of a method, according to some embodiments.
- FIG. 7 shows a flowchart of a method, according to some embodiments.
- Like reference numbers represent the same or similar parts throughout.
- Various objects such as, for example, a shipping box or container can show damage such as scuffmarks, dents, rips, tears, or the like. Computer devices such as, but not limited to, smartphones, tablets, laptops, and smartwatches can likewise show damage such as cracks, scuffmarks, or the like. The visible damage can be important in understanding whether the shipping box or container was damaged during shipment, or whether a computer device has lost some of its value. Objects such as computer devices also include numerous components that are assembled together. The assembly process can include fasteners (e.g., screws or the like) that keep the various components secured. It is important that these fasteners be installed correctly (e.g., no missing screws, the proper screws installed, the screws properly tightened, or the like) as part of the quality control process.
- The embodiments disclosed herein are directed to systems and methods for inspecting an appearance of an object (e.g., a computer device such as, but not limited to, a smartphone, a tablet, a laptop, a smartwatch, a cellphone, or the like). The inspection of the appearance and cosmetic grading of the object can be utilized during, for example, manufacturing of a device, in a retail setting in which computer devices are sold/purchased, or the like.
- An image of an object can be captured from each of a plurality of cameras (in a specific cosmetic grading device), or, alternatively, a plurality of images can be captured from a single camera. "Profile images" (i.e., images of a particular object) can be captured in a calibration process and used to train the cosmetic grading system. Each image of an object being validated (i.e., a test object) can be taken in the same coordinate system as the corresponding profile image or with a predetermined relationship to it. Each of the captured images is compared against the corresponding profile image to determine a cosmetic score.
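- As a sketch of how such a cosmetic score could be computed, the following compares a captured view with its profile image using zero-mean normalized correlation. The metric is an assumption for illustration; the disclosure does not mandate a particular comparison:

```python
import numpy as np

def matching_score(test_view, profile_view):
    # Zero-mean normalized correlation between a captured view and its
    # profile image; 1.0 indicates a pixel-perfect match (illustrative).
    t = test_view.astype(float) - test_view.mean()
    p = profile_view.astype(float) - profile_view.mean()
    denom = np.sqrt((t * t).sum() * (p * p).sum())
    if denom == 0.0:
        # Both images are flat; fall back to exact equality.
        return 1.0 if np.array_equal(test_view, profile_view) else 0.0
    return float((t * p).sum() / denom)
```

A score near 1.0 suggests the test object matches its expected appearance; lower scores suggest visible defects.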
- FIG. 1 shows a cosmetic grading system 50, according to some embodiments. The cosmetic grading system 50 can be used to, for example, provide a variety of different cosmetic grades for various test objects. The cosmetic grading system 50 can provide a cosmetic grade for different types of test objects captured by different types of devices. For example, in some embodiments, the cosmetic grading system 50 can provide a cosmetic grading service that can be accessed by a variety of different remote devices that are able to utilize a server of the cosmetic grading system 50.
- The cosmetic grading system 50 generally includes a server device 52 in communication with a computer device 54 through a network 56. The system 50 can also include a computer device 58 connected to the server device 52 through the network 56.
- The server device 52 can include a cosmetic grading application 62 that is configured to compare received images of a test object with images of a corresponding profile test object. The cosmetic grading application 62 can be in communication with a database including profile images of various test objects. The profile images can be of different views of the test object. The profile images can be associated with a particular test object according to a machine readable code such as a barcode. In embodiments, the barcode can be a QR code. As a result, when the cosmetic grading application 62 receives an image of a barcode, the barcode can be used to retrieve the appropriate profile images for the test object. The cosmetic grading application 62 can then compare images received from the computer device 54 or 58 and, based on the comparison, grade the cosmetic appearance of the test object. The cosmetic grading application 62 can then store the result and can also output the result via the network 56 to the computer device 54 or the computer device 58. In this manner, the computer device 54 and the computer device 58 can be used to grade a cosmetic appearance of a test object without the computer device 54 or the computer device 58 being specifically configured with a cosmetic grading application.
- The network 56 may be referred to as the communications network 56. Examples of the network 56 include, but are not limited to, a local area network (LAN), a wide area network (WAN), the Internet, or the like. The computer device 54 or computer device 58 can transmit data via the network 56 through a wireless connection using Wi-Fi, Bluetooth, or other similar wireless communication protocols. The computer device 54 or computer device 58 can also transmit data via the network 56 through a cellular, 3G, 4G, 5G, or other wireless protocol.
- The computer device 54 can be a device specifically configured for capturing images of test objects. An example of the computer device 54 is a cosmetic inspection device such as the system 100 described in additional detail in accordance with FIGS. 2-4 below.
- The computer device 58 can include an application that permits a user to send images of a test object over the network 56 to the server device 52 for cosmetic grading. The computer device 58 includes a camera and a network input/output to accomplish the communication and image capturing. The computer device 58 includes a display for showing results of the cosmetic grading. In some embodiments, the computer device 58 is a smartphone, a tablet, or the like. The computer device 58 can also be a laptop or a desktop computer having a camera attached thereto.
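- The server-side flow described above (retrieve profile images keyed by the decoded barcode, compare, and report a result) could be sketched as follows. The dictionary-backed lookup and the threshold values are illustrative assumptions, not part of the disclosure:

```python
def retrieve_profile_images(decoded_barcode, profile_database):
    # The decoded barcode (e.g., a QR code) keys into a datastore of
    # profile images; a plain dict stands in for the real database here.
    return profile_database[decoded_barcode]

def validation_output(score, pass_threshold=0.95, review_threshold=0.80):
    # Map a matching score onto pass / needs-review / fail outputs.
    # The threshold values are assumptions chosen for illustration.
    if score > pass_threshold:
        return "pass"
    if score > review_threshold:
        return "needs review"
    return "fail"
```

In this sketch, a score above the first threshold passes, a score between the two thresholds is flagged for review by an operator, and anything lower fails.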
- FIG. 2 shows a system 100 for grading an appearance of a test object 102, according to some embodiments. The system 100 can generally be used to, for example, capture images of the test object and communicate with a server device having a cosmetic grading application to assess a cosmetic appearance of the test object. For example, in some embodiments, the system 100 can be a kiosk implemented in a retail environment, and the test object can be a shipping container or an electronic device (e.g., a smartphone, a smartwatch, a tablet, or the like) evaluated to determine whether its cosmetic appearance is damaged. In some embodiments, the validation can be part of a quality control process during manufacturing.
- In the illustrated embodiment, the test object 102 is a smartphone. It is to be appreciated that the smartphone is an example, and the test object 102 can vary beyond a smartphone. Examples of other test objects 102 include, but are not limited to, a tablet, a smartwatch, a mobile phone other than a smartphone, a personal digital assistant (PDA), a laptop computing device, or the like. Furthermore, the maker or manufacturer of the test object 102 is not limited. That is, the system 100 can be used to validate the installation correctness of components in test objects 102 from different manufacturers so long as a calibration procedure is performed to create a profile image for the corresponding test object 102.
- The system 100 includes a display 104 for displaying results of the validation to the user. In some embodiments, the display 104 can be a combined display and input (e.g., a touchscreen). In some embodiments, the display 104 can be a display of a tablet or the like. In such an embodiment, a memory of the tablet can store one or more programs to be executed by a processing device of the tablet for validating the correctness of the installation of the component in the test object 102.
- In the illustrated embodiment, the display 104 is secured to the housing 106 of the system 100. In some embodiments, the display 104 can be separate from the housing 106 (i.e., not secured to the housing 106, but positioned near the system 100 and electronically connected to the system 100). However, it may be beneficial to secure the display 104 to the housing 106 to reduce a footprint of the system 100.
- A platform 108 is utilized to position the test object 102 within the system 100 for validation. The platform 108 enables each test object 102 placed into the system 100 for validation to be placed in substantially the same location. As a result, the effort in determining whether the profile image and the test object 102 under test are in the same location relative to the cameras of the system 100 can be reduced. The platform 108 is shown and described in additional detail in accordance with FIG. 3 below.
- In some embodiments, the system 100 can be portable. For example, the illustrated embodiment shows the system 100 with a handle 110 for carrying the system 100. It is to be appreciated that portability of the system 100 is optional, and accordingly, the handle 110 is optional. In some embodiments, the system 100 may be sized differently based on the type of test object 102 to be validated.
- FIG. 3 shows the platform 108 of the system 100 of FIG. 2 for validation of installation of a component in a test object 102, according to an embodiment.
- The platform 108 includes a tiered surface having a first surface 112 and a second surface 116. A step is thus formed between the first surface 112 and the second surface 116. A plane of the first surface 112 and a plane of the second surface 116 are parallel. In the illustrated embodiment, the second surface 116 is L-shaped when viewed from a top view.
- The second surface 116 is positioned a height H from the first surface 112. The height H between the first surface 112 and the second surface 116 creates an abutment surface 118.
- The height H is selected such that the abutment surface 118 serves as a stop for the test object 102 when placed within the system 100. The abutment surface 118 is configured to provide a stop on two sides of the test object 102 (i.e., along a major dimension and a minor dimension of the test object 102).
- The height H is selected to be smaller than a thickness T of the test object 102 being validated in the system 100 so as not to hinder side views of the test object 102. The height H is also selected to be large enough that an operator inserting the test object 102 can abut the test object 102 with the abutment surface 118. In this manner, the abutment surface 118 serves as a stop for the operator when inserting the test object 102 into the system 100. In some embodiments, the height H can be substantially the same as the thickness T of the test object 102.
- The configuration of the platform 108 is helpful in establishing the location of the test object 102. By including the platform 108, the system 100 can be calibrated to generate the profile images using a single assembly since the coordinate system is generally fixed. The platform 108 can, as a result, be used to account for minor variations in placement of the test object 102 by the operator, as the offset from the expected coordinate system can be determined based on the location of the test object 102 relative to a calibration test object 102.
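- One way such an offset from the expected coordinate system could be determined is circular cross-correlation of a captured view against the calibration image. This FFT-based sketch is an assumption for illustration, not the disclosed implementation:

```python
import numpy as np

def estimate_offset(test_view, calibration_view):
    # Estimate the (row, col) translation of the test object relative to
    # the calibration object via FFT-based circular cross-correlation.
    corr = np.fft.ifft2(
        np.fft.fft2(test_view) * np.conj(np.fft.fft2(calibration_view))
    ).real
    row, col = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap large positive shifts back to negative ones (the correlation
    # is circular, so a shift of -1 appears at index N - 1).
    n_rows, n_cols = corr.shape
    if row > n_rows // 2:
        row -= n_rows
    if col > n_cols // 2:
        col -= n_cols
    return int(row), int(col)
```

The estimated offset could then be applied to the captured views before they are compared against the profile images.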
- FIG. 4 shows a schematic architecture for the system 100 of FIG. 2, according to an embodiment.
- The system 100 generally includes a plurality of cameras 120, a motion sensor 122, a proximity sensor 124, a processing device 126, memory 128, a network input/output (I/O) 130, user I/O 132, storage 134, and an interconnect 136. The processing device 126, memory 128, network I/O 130, user I/O 132, storage 134, and interconnect 136 can be within the housing 106 in some embodiments. In some embodiments, the processing device 126, memory 128, network I/O 130, user I/O 132, storage 134, and interconnect 136 can be external to the housing 106.
- The plurality of cameras 120 are arranged in the system 100 to capture different views of the test object 102. In some embodiments, the cameras 120 are digital cameras. For example, in some embodiments the system 100 includes three cameras 120 arranged to capture a top view, an up-front view, and an up-side view. In some embodiments, the system 100 includes four cameras 120 arranged to capture a top view, an up-front view, a first up-side view, and a second (opposite) up-side view. It will be appreciated that a single camera 120 could be used, although accuracy may be improved when a plurality of cameras 120 are used, as a component may appear to be correctly installed in a first view but be determined to be incorrectly installed in a second view.
- The motion sensor 122 can be, for example, a laser sensor that is triggered when an object (i.e., the test object 102) breaks the laser signal. The motion sensor 122 can be installed at the opening of the housing 106. In some embodiments, the motion sensor 122 may not be included.
- The proximity sensor 124 can be a sensor that determines when an object is placed near it. The proximity sensor 124 can be placed in the platform 108 of the system 100. In some embodiments, when the motion sensor 122 is triggered and the proximity sensor 124 detects an object, the cameras 120 can capture images of the test object 102 on the platform 108. In some embodiments, the proximity sensor 124 can be included regardless of whether the motion sensor 122 is present. In some embodiments with both the motion sensor 122 and the proximity sensor 124, the image capturing may be performed after the proximity sensor 124 detects the test object 102.
- In some embodiments, automatically triggering the image capturing and subsequent validation using the proximity sensor 124, or a combination of the proximity sensor 124 and the motion sensor 122, can increase the number of test objects 102 that can be validated in a set period. That is, reducing the effort of a human operator, or even allowing a robotic arm to load the test object 102 into the system 100 for validation, can reduce the amount of time and effort needed to review the quality of the manufacturing process.
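- The sensor-gated capture described above can be sketched as follows; the `capture_views` name and the callable-per-camera signature are assumptions made for this sketch:

```python
def capture_views(cameras, proximity_detected, motion_triggered=True):
    # Capture one image per camera once the proximity sensor (and, when a
    # motion sensor is fitted, that sensor too) indicates the test object
    # is in place. `cameras` maps a view name to a zero-argument capture
    # callable standing in for real camera hardware.
    if not (proximity_detected and motion_triggered):
        return None  # no test object in place yet; keep waiting
    return {view: capture() for view, capture in cameras.items()}
```

When no motion sensor is present, `motion_triggered` can simply be left at its default so the proximity sensor alone gates the capture.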
- The processing device 126 can retrieve and execute programming instructions stored in the memory 128, the storage 134, or combinations thereof. The processing device 126 can also store and retrieve application data residing in the memory 128.
- The interconnect 136 is used to transmit programming instructions and/or application data between the processing device 126, the user I/O 132, the memory 128, the storage 134, and the network I/O 130. The interconnect 136 can, for example, be one or more busses or the like. The processing device 126 can be a single processing device, multiple processing devices, or a single processing device having multiple processing cores. In some embodiments, the processing device 126 can be a single-threaded processing device. In some embodiments, the processing device 126 can be a multi-threaded processing device.
- The memory 128 is generally included to be representative of a random-access memory such as, but not limited to, Static Random-Access Memory (SRAM), Dynamic Random-Access Memory (DRAM), or Flash. In some embodiments, the memory 128 can be a volatile memory. In some embodiments, the memory 128 can be a non-volatile memory. In some embodiments, at least a portion of the memory 128 can be virtual memory.
- The storage 134 is generally included to be representative of a non-volatile memory such as, but not limited to, a hard disk drive, a solid-state device, removable memory cards, optical storage, flash memory devices, network attached storage (NAS), connections to storage area network (SAN) devices, or other similar devices that may store non-volatile data. In some embodiments, the storage 134 is a computer readable medium. In some embodiments, the storage 134 can include storage that is external to the user device, such as in a cloud.
- FIG. 5 shows a block diagram illustrating an internal architecture of an example of a computer, according to some embodiments. In some embodiments, the computer can be, for example, the server device 52, the computer device 54, or the computer device 58.
- A computer as referred to herein refers to any device with a processor capable of executing logic or coded instructions, and could be a server, personal computer, set top box, smartphone, pad computer, or media device, to name a few such devices. As shown in the example of FIG. 5, the internal architecture 150 includes one or more processing units (also referred to herein as CPUs) 162, which interface with at least one computer bus 152. Also interfacing with the computer bus 152 are a persistent storage medium/media 156; a network interface 164; memory 154, e.g., random access memory (RAM), run-time transient memory, read only memory (ROM), etc.; a media disk drive interface 158 as an interface for a drive that can read and/or write to media, including removable media such as floppy disks, CD-ROMs, DVDs, etc.; a display interface 160 as an interface for a monitor or other display device; a keyboard interface 166 as an interface for a keyboard; a pointing device interface 168 as an interface for a mouse or other pointing device; and miscellaneous other interfaces 170, 172 not shown individually, such as parallel and serial port interfaces, a universal serial bus (USB) interface, and the like.
- Memory 154 interfaces with the computer bus 152 so as to provide information stored in the memory 154 to the CPU 162 during execution of software programs such as an operating system, application programs, device drivers, and software modules that comprise program code and/or computer-executable process operations, incorporating functionality described herein, e.g., one or more of the process flows described herein. The CPU 162 first loads computer-executable process operations from storage, e.g., the memory 154, the storage medium/media 156, a removable media drive, and/or another storage device. The CPU 162 can then execute the stored process operations in order to execute the loaded computer-executable process operations. Stored data, e.g., data stored by a storage device, can be accessed by the CPU 162 during the execution of computer-executable process operations.
- Persistent storage medium/media 156 is a computer readable storage medium(s) that can be used to store software and data, e.g., an operating system and one or more application programs. Persistent storage medium/media 156 can also be used to store device drivers, such as one or more of a digital camera driver, monitor driver, printer driver, scanner driver, or other device drivers, web pages, content files, playlists, and other files. Persistent storage medium/media 156 can further include program modules and data files used to implement one or more embodiments of the present disclosure.
- Examples of computer-readable storage media include, but are not limited to, any tangible medium capable of storing a computer program for use by a programmable processing device to perform functions described herein by operating on input data and generating an output. A computer program is a set of instructions that can be used, directly or indirectly, in a computer system to perform a certain function or determine a certain result. Such media include, but are not limited to, a floppy disk; a hard disk; a random access memory (RAM); a read-only memory (ROM); a semiconductor memory device such as, but not limited to, an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or Flash memory; a portable compact disk read-only memory (CD-ROM); an optical storage device; a magnetic storage device; other similar devices; or suitable combinations of the foregoing.
- In some embodiments, hardwired circuitry may be used in combination with software instructions. Thus, the description is not limited to any specific combination of hardware circuitry and software instructions, nor to any source for the instructions executed by the data processing system.
-
FIG. 6 shows a flowchart of a method 200, according to some embodiments. In some embodiments, the method 200 can be representative of a cosmetic grading service being accessed via a cosmetic grading device. In some embodiments, the cosmetic grading device can be implemented as a kiosk or the like in a setting such as a retail store. In some embodiments, the cosmetic grading device can be utilized in other environments, for example, a manufacturing environment in which the object to be tested is a shipping box to be shipped or a computer device to be refurbished. It is to be appreciated that these are examples, and the applications can vary beyond the above stated examples.
- At block 202, a test object is loaded into the system 100. This includes abutting the test object with the abutment surface 118 of the platform 108. In some embodiments, the test object can be loaded by a human operator. In some embodiments, a robotic or mechanical arm can be automated to place the test object onto the platform 108. In some embodiments, the test object can be a computer device such as, but not limited to, a smartphone, smartwatch, tablet, or the like. The placement of the test object can cause the motion sensor 122, the proximity sensor 124, or a combination thereof, to generate a signal indicative of the test object being in place.
- At block 204, in response to the signal generated by the motion sensor 122, the proximity sensor 124, or a combination thereof, the plurality of cameras 120 each capture an image. As discussed above, the cameras 120 are oriented such that the captured images are of different views of the test object. In some embodiments, the test object can have a barcode (e.g., a QR code or other machine readable code) on a surface of the test object that is captured by one of the cameras 120.
- At block 206, the captured images are transmitted to a server device to be compared against profile images that are retrieved using the captured barcode.
- At block 208, an output is generated by the server device that is indicative of the results of the validation (e.g., pass, fail, needs review). The output can be based on the range in which the matching score falls. That is, if the matching score is greater than a first value, the output can be that the test object passes; if it is between the first value and a lower second value, the test object may need checking (e.g., by an operator); and if it is between the second value and a third value that is lower than the second value, the test object may fail. A failure can mean, for example, that the test object is damaged such that its appearance does not match the expected appearance.
- To obtain the grade, the method includes selecting, by the processor, a region of interest in each of the plurality of images of the test object. In some embodiments, the region of interest includes the test object with the background removed. In some embodiments, for the plurality of regions of interest as selected, the method includes comparing, by the processor, each region of interest with a corresponding profile image and identifying defects in each region of interest. In some embodiments, the corresponding profile image is determined from the barcode imaged on the test object. In some embodiments, the method includes grading, by the processor, a cosmetic appearance of each region of interest based on the identified defects. In some embodiments, the method includes storing the grades of the cosmetic appearance for each region of interest.
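The banded scoring at block 208 can be sketched as a simple classifier. This is a minimal illustration, not the disclosed implementation; the threshold values and the handling of scores below the third value are hypothetical assumptions.

```python
def classify_output(matching_score: float,
                    first: float = 0.90,
                    second: float = 0.70,
                    third: float = 0.40) -> str:
    """Map a matching score to a validation output.

    Scores above `first` pass; scores between `second` and `first`
    are flagged for operator review; scores between `third` and
    `second` fail. All threshold values here are illustrative.
    """
    if matching_score > first:
        return "pass"
    if matching_score > second:
        return "needs review"
    if matching_score > third:
        return "fail"
    # The disclosure leaves scores at or below the third value
    # unspecified; this sketch treats them as a fail as well.
    return "fail"
```

For example, with the illustrative defaults, `classify_output(0.95)` yields `"pass"` while `classify_output(0.80)` yields `"needs review"`.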
- At block 210, the output is received by the system 100 and displayed on the display 104 of the system 100.
-
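The barcode-keyed retrieval at block 206 can be sketched as a lookup against a profile store that pairs each captured view with its profile image. The store contents, key format, and pairing helper below are hypothetical assumptions for illustration only.

```python
# Hypothetical profile store mapping decoded barcode values to the
# profile images registered for the corresponding device model.
PROFILE_STORE = {
    "SKU-12345": ["front.png", "back.png", "left.png", "right.png"],
}

def retrieve_profile_images(decoded_barcode: str) -> list[str]:
    """Look up the profile images registered for a barcode value.

    Raises KeyError when no profile exists, which a server could
    surface as a 'needs review' outcome rather than a grade.
    """
    return PROFILE_STORE[decoded_barcode]

def pair_for_comparison(captured: list[str],
                        decoded_barcode: str) -> list[tuple[str, str]]:
    """Pair each captured view with its corresponding profile image,
    assuming the views arrive in the same order as the profiles."""
    profiles = retrieve_profile_images(decoded_barcode)
    if len(captured) != len(profiles):
        raise ValueError("each captured view needs a matching profile image")
    return list(zip(captured, profiles))
```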
FIG. 7 shows a flowchart of a method 250, according to some embodiments. In some embodiments, the method 250 can be representative of a cosmetic grading service being accessed via a computer device such as a smartphone or the like. In some embodiments, the computer device can be utilized to review the aesthetics of test objects in any environment accessible by a user, so long as the server providing the cosmetic grading service has received profile images of the test object against which the images received from the computer device can be compared.
- At block 252, a plurality of images of a test object are captured. The user can orient the computer device to capture multiple views of the test object. In some embodiments, the test object can have a barcode (e.g., a QR code or other machine readable code) on a surface of the test object that is captured by the camera of the computer device.
- At block 254, the captured images are transmitted to a server device to be compared against profile images that are retrieved using the captured barcode.
- At block 256, an output is generated by the server device that is indicative of the results of the validation (e.g., pass, fail, needs review). The output can be based on the range in which the matching score falls. That is, if the matching score is greater than a first value, the output can be that the test object passes; if it is between the first value and a lower second value, the test object may need checking (e.g., by an operator); and if it is between the second value and a third value that is lower than the second value, the test object may fail. A failure can mean, for example, that the test object is damaged such that its appearance does not match the expected appearance.
- To obtain the grade, the method includes selecting, by the processor, a region of interest in each of the plurality of images of the test object. In some embodiments, the region of interest includes the test object with the background removed. In some embodiments, for the plurality of regions of interest as selected, the method includes comparing, by the processor, each region of interest with a corresponding profile image and identifying defects in each region of interest. In some embodiments, the corresponding profile image is determined from the barcode imaged on the test object. In some embodiments, the method includes grading, by the processor, a cosmetic appearance of each region of interest based on the identified defects. In some embodiments, the method includes storing the grades of the cosmetic appearance for each region of interest.
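The region-of-interest selection, defect identification, and grading steps described above can be sketched with grayscale images represented as NumPy arrays. The uniform background model, pixel-difference tolerance, and defect-fraction grading formula are assumptions for illustration, not the disclosed method.

```python
import numpy as np

def select_region_of_interest(image: np.ndarray,
                              background_value: int = 0) -> np.ndarray:
    """Crop the image to the bounding box of non-background pixels,
    i.e. the test object with the background removed. Assumes a
    uniform background value, which is an illustrative simplification."""
    mask = image != background_value
    rows = np.any(mask, axis=1)
    cols = np.any(mask, axis=0)
    r0, r1 = np.where(rows)[0][[0, -1]]
    c0, c1 = np.where(cols)[0][[0, -1]]
    return image[r0:r1 + 1, c0:c1 + 1]

def identify_defects(roi: np.ndarray,
                     profile: np.ndarray,
                     tolerance: int = 25) -> np.ndarray:
    """Flag pixels whose deviation from the corresponding profile
    image exceeds a tolerance; the mask marks candidate defects."""
    diff = np.abs(roi.astype(int) - profile.astype(int))
    return diff > tolerance

def grade_region(defect_mask: np.ndarray) -> float:
    """Grade a region of interest as the fraction of defect-free
    pixels (1.0 means no identified defects)."""
    return 1.0 - float(defect_mask.mean())
```

A region of interest identical to its profile image grades as 1.0; identified defects reduce the grade in proportion to their area.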
- At block 258, the output is received by the computer device and displayed on the display of the computer device.
- The terminology used herein is intended to describe embodiments and is not intended to be limiting. The terms “a,” “an,” and “the” include the plural forms as well, unless clearly indicated otherwise. The terms “comprises” and/or “comprising,” when used in this Specification, specify the presence of the stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, and/or components.
- It is to be understood that changes may be made in detail, especially in matters of the construction materials employed and the shape, size, and arrangement of parts without departing from the scope of the present disclosure. This Specification and the embodiments described are examples, with the true scope and spirit of the disclosure being indicated by the claims that follow.
Claims (20)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US17/525,643 US20230153986A1 (en) | 2021-11-12 | 2021-11-12 | Grading cosmetic appearance of a test object |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US17/525,643 US20230153986A1 (en) | 2021-11-12 | 2021-11-12 | Grading cosmetic appearance of a test object |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20230153986A1 true US20230153986A1 (en) | 2023-05-18 |
Family
ID=86323746
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/525,643 Pending US20230153986A1 (en) | 2021-11-12 | 2021-11-12 | Grading cosmetic appearance of a test object |
Country Status (1)
| Country | Link |
|---|---|
| US (1) | US20230153986A1 (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20230011330A1 (en) * | 2021-07-09 | 2023-01-12 | At&T Intellectual Property I, L.P. | Device condition determination |
Citations (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20020186877A1 (en) * | 1997-10-09 | 2002-12-12 | Vilella Joseph L. | Electronic assembly video inspection system |
| US20150022657A1 (en) * | 2013-07-19 | 2015-01-22 | Hon Hai Precision Industry Co., Ltd. | Electronic device and method for detecting surface flaw of object |
| US20170004617A1 (en) * | 2015-06-30 | 2017-01-05 | Hon Hai Precision Industry Co., Ltd | Electronic device and mehod for capturing multi-aspect images using the electronic device |
| US20190066199A1 (en) * | 2016-12-22 | 2019-02-28 | Capital One Services, Llc | Systems and methods for virtual fittings |
| US20190258225A1 (en) * | 2017-11-17 | 2019-08-22 | Kodak Alaris Inc. | Automated 360-degree dense point object inspection |
| US20200265487A1 (en) * | 2019-02-18 | 2020-08-20 | Ecoatm, Llc | Neural network based physical condition evaluation of electronic devices, and associated systems and methods |
| US20200357106A1 (en) * | 2019-05-09 | 2020-11-12 | Hon Hai Precision Industry Co., Ltd. | Method for detecting defects, electronic device, and computer readable medium |
| US20210116392A1 (en) * | 2019-10-22 | 2021-04-22 | Blancco Technology Group IP Oy | System and method for mobile device display and housing diagnostics |
| US11243513B2 (en) * | 2019-05-09 | 2022-02-08 | Micron Technology, Inc. | Controlling transport of physical objects based on scanning of encoded images |
| US20220051507A1 (en) * | 2020-08-17 | 2022-02-17 | Ecoatm, Llc | Kiosk for evaluating and purchasing used electronic devices |
| US20220092756A1 (en) * | 2020-09-21 | 2022-03-24 | International Business Machines Corporation | Feature detection based on neural networks |
| US20220114854A1 (en) * | 2018-12-19 | 2022-04-14 | Ecoatm, Llc | Systems and methods for vending and/or purchasing mobile phones and other electronic devices |
| US20220289403A1 (en) * | 2021-03-10 | 2022-09-15 | The Boeing Company | System and method for automated surface anomaly detection |
| US11526932B2 (en) * | 2008-10-02 | 2022-12-13 | Ecoatm, Llc | Kiosks for evaluating and purchasing used electronic devices and related technology |
- 2021-11-12: US application US17/525,643 filed (published as US20230153986A1); status: Pending
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11243513B2 (en) | Controlling transport of physical objects based on scanning of encoded images | |
| US11189019B2 (en) | Method for detecting defects, electronic device, and computer readable medium | |
| CN110621984B (en) | Method and system for improving quality inspection | |
| US12086978B2 (en) | Defect detection of a component in an assembly | |
| US11900666B2 (en) | Defect detection and image comparison of components in an assembly | |
| US12134483B2 (en) | System and method for automated surface anomaly detection | |
| US11727522B2 (en) | Method, system, and apparatus for damage assessment and classification | |
| US9536298B2 (en) | Electronic device and method for detecting surface flaw of object | |
| US20230153986A1 (en) | Grading cosmetic appearance of a test object | |
| US11113432B2 (en) | Encoding images on physical objects to trace specifications for a manufacturing process | |
| WO2020168842A1 (en) | Vehicle damage assessment method and device and electronic device | |
| TWI748184B (en) | Defect detecting method, electronic device, and computer readable storage medium | |
| US20230316496A1 (en) | Grading cosmetic appearance of a test object | |
| CN114971443B (en) | A processing method and imaging device for logistics objects | |
| US11347854B2 (en) | System validator | |
| US20160209989A1 (en) | Record and replay of operations on graphical objects | |
| KR102280668B1 (en) | Method and system for dimensional quality inspectation | |
| TWI592885B (en) | Test system | |
| CN118331840B (en) | Data and interface joint test method, device, equipment and storage medium | |
| TWI781623B (en) | Object measurement method and object measurement system | |
| CN113030692B (en) | Test system and test method | |
| CN119180787A (en) | Product defect detection method, electronic device and storage medium | |
| CN119883881A (en) | Multi-tenant-oriented UI test method and device, computer equipment and medium | |
| CN119180943A (en) | Electronic tag platform image recognition method, electronic tag platform image recognition device, computer device, readable storage medium, and program product | |
| CN119514510A (en) | Method, system and medium for processing claims |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: FUTURE DIAL, INC., CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LI, JISHENG;ZHANG, QING;SIGNING DATES FROM 20211103 TO 20211108;REEL/FRAME:058103/0034 |
| | STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | FINAL REJECTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | ADVISORY ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STCT | Information on status: administrative procedure adjustment | PROSECUTION SUSPENDED |
| | STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | ALLOWED -- NOTICE OF ALLOWANCE NOT YET MAILED |
| | STPP | Information on status: patent application and granting procedure in general | NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |