US20230153986A1 - Grading cosmetic appearance of a test object - Google Patents

Grading cosmetic appearance of a test object

Info

Publication number
US20230153986A1
Authority
US
United States
Prior art keywords
test object
interest
region
processor
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US17/525,643
Inventor
Jisheng Li
Qing Zhang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Future Dial Inc
Original Assignee
Future Dial Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Future Dial Inc filed Critical Future Dial Inc
Priority to US 17/525,643 (published as US20230153986A1)
Assigned to FUTURE DIAL, INC. reassignment FUTURE DIAL, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ZHANG, QING, LI, JISHENG
Publication of US20230153986A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/0002 - Inspection of images, e.g. flaw detection
    • G06T7/0004 - Industrial image inspection
    • G06T7/001 - Industrial image inspection using an image reference approach
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06K - GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00 - Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10 - Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14 - Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404 - Methods for optical code recognition
    • G06K7/1408 - Methods for optical code recognition, the method being specifically adapted for the type of code
    • G06K7/1413 - 1D bar codes
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 - Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 - Systems specially adapted for particular applications
    • G01N21/88 - Investigating the presence of flaws or contamination
    • G01N21/8851 - Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06K - GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00 - Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10 - Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/14 - Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K7/1404 - Methods for optical code recognition
    • G06K7/1439 - Methods for optical code recognition including a method step for retrieval of the optical code
    • G06K7/1443 - Methods for optical code recognition including a method step for retrieval of the optical code, locating of the code in an image
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/10 - Segmentation; Edge detection
    • G06T7/194 - Segmentation; Edge detection involving foreground-background segmentation
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 - Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 - Systems specially adapted for particular applications
    • G01N21/88 - Investigating the presence of flaws or contamination
    • G01N21/8851 - Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8854 - Grading and classifying of flaws
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30108 - Industrial image inspection
    • G06T2207/30164 - Workpiece; Machine component

Definitions

  • At least some embodiments disclosed herein relate generally to cosmetic evaluation of an object. More particularly, the embodiments relate to systems, devices, and methods for computer-aided cosmetic evaluation and categorization of an object such as, but not limited to, an electronic device or the like.
  • One aspect of evaluating computing devices (e.g., mobile devices such as cellular telephones, tablets, etc.) includes inspecting their visual characteristics to grade their visual appearance. Some of these devices are then refurbished and can be resold to new users.
  • a method includes receiving, by a processor, a plurality of images of a test object, the plurality of images including a plurality of surfaces of the test object. In some embodiments, the method includes receiving, by the processor, an image of a barcode on the test object. In some embodiments, the method includes selecting, by the processor, a region of interest in each of the plurality of images of the test object. In some embodiments, the region of interest includes the test object having a background removed. In some embodiments, for the plurality of regions of interest as selected, the method includes comparing, by the processor, each region of interest with a corresponding profile image and identifying defects in each region of interest. In some embodiments, the corresponding profile image is determined from the image of the barcode on the test object. In some embodiments, the method includes grading, by the processor, a cosmetic appearance of each region of interest based on the identified defects. In some embodiments, the method includes storing the grades of the cosmetic appearance for each region of interest.
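The claimed flow (receive images, select a region of interest by removing the background, compare against a barcode-selected profile image, identify defects, grade, store) can be sketched in miniature. Everything below is an illustrative assumption, not the patent's implementation: images are modeled as 2-D lists of grayscale values, and the profile store, tolerance, and grade thresholds are invented for the example.

```python
# Illustrative sketch of the claimed grading pipeline (assumed names/thresholds).
# A "profile image" is the defect-free reference looked up via the device barcode.

PROFILE_DB = {  # hypothetical mapping: decoded barcode -> per-view profile images
    "SKU-123": {"top": [[200, 200], [200, 200]]},
}

def select_roi(image, background=0):
    """Crop to the region of interest by dropping background-only rows/columns."""
    rows = [r for r in image if any(px != background for px in r)]
    if not rows:
        return []
    cols = [i for i in range(len(rows[0])) if any(r[i] != background for r in rows)]
    return [[r[i] for i in cols] for r in rows]

def identify_defects(roi, profile, tol=10):
    """Count pixels deviating from the profile by more than a tolerance."""
    return sum(
        1
        for r_row, p_row in zip(roi, profile)
        for r_px, p_px in zip(r_row, p_row)
        if abs(r_px - p_px) > tol
    )

def grade(defect_count):
    """Map a defect count to a letter grade (assumed thresholds)."""
    return "A" if defect_count == 0 else "B" if defect_count <= 2 else "C"

def grade_views(barcode, images):
    """Grade every view of the test object against its barcode-selected profiles."""
    profiles = PROFILE_DB[barcode]
    return {
        view: grade(identify_defects(select_roi(img), profiles[view]))
        for view, img in images.items()
    }
```

For example, a captured top view whose ROI contains one scuffed pixel relative to the stored profile would grade "B" under these assumed thresholds.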
  • the method includes sending the grades of the cosmetic appearance for each region of interest to a remote device.
  • the plurality of images of the test object are received from a remote device.
  • the remote device includes a camera configured to capture the plurality of images.
  • the remote device is a cosmetic inspection device.
  • the remote device is a mobile device.
  • the barcode is a QR code.
  • the method includes aligning the plurality of images with the corresponding profile images.
  • the test object is a mobile device.
  • the method includes determining a value of the mobile device based on the grades of the cosmetic appearance for each region of interest.
  • the test object is a mobile device.
  • the barcode is a QR code displayed on a display of the mobile device.
  • a system includes a server device including a processor and a memory.
  • the processor of the server device is configured to receive a plurality of images of a test object.
  • the plurality of images include a plurality of surfaces of the test object.
  • the processor is configured to receive an image of a barcode on the test object.
  • the processor is configured to select a region of interest in each of the plurality of images of the test object.
  • the region of interest includes the test object having a background removed.
  • the processor is configured to compare each region of interest with a corresponding profile image and identify defects in each region of interest.
  • the corresponding profile image is determined from the image of the barcode on the test object.
  • the processor is configured to grade a cosmetic appearance of each region of interest based on the identified defects. In some embodiments, the processor is configured to store the grades of the cosmetic appearance for each region of interest.
  • the processor is configured to send the grades of the cosmetic appearance for each region of interest to a remote device over a network.
  • the plurality of images of the test object are received from a remote device over a network.
  • the remote device includes a camera configured to capture the plurality of images.
  • the remote device is a cosmetic inspection device.
  • the remote device is a mobile device.
  • the barcode is a QR code.
  • the processor is configured to align the plurality of images with the corresponding profile images.
  • the test object is a mobile device.
  • the processor is configured to determine a value of the mobile device based on the grades of the cosmetic appearance for each region of interest.
  • the test object is a mobile device.
  • the barcode is a QR code displayed on a display of the mobile device.
  • a non-transitory computer-readable storage medium includes instructions that, when executed by a processor, cause the processor to perform a method.
  • the method includes receiving, by the processor, a plurality of images of a test object, the plurality of images including a plurality of surfaces of the test object.
  • the method includes receiving, by the processor, an image of a barcode on the test object.
  • the method includes selecting, by the processor, a region of interest in each of the plurality of images of the test object.
  • the region of interest includes the test object having a background removed.
  • the method includes comparing, by the processor, each region of interest with a corresponding profile image and identifying defects in each region of interest.
  • the corresponding profile image is determined from the image of the barcode on the test object.
  • the method includes grading, by the processor, a cosmetic appearance of each region of interest based on the identified defects.
  • the method includes storing the grades of the cosmetic appearance for each region of interest.
  • the method includes sending the grades of the cosmetic appearance for each region of interest to a remote device over a network.
  • FIG. 1 shows a cosmetic grading system, according to some embodiments.
  • FIG. 2 shows a system for validation of installation of a component in an assembly, according to an embodiment.
  • FIG. 3 shows a portion of the system for validation of installation of a component of an assembly of FIG. 2 , according to an embodiment.
  • FIG. 4 shows a schematic architecture for the system of FIG. 2 , according to an embodiment.
  • FIG. 5 shows a block diagram illustrating an internal architecture of an example of a mobile device, according to some embodiments.
  • FIG. 6 shows a flowchart of a method, according to some embodiments.
  • FIG. 7 shows a flowchart of a method, according to some embodiments.
  • Various objects such as, for example, a shipping box or container can show damage such as scuffmarks, dents, rips, tears, or the like.
  • Other examples include computer devices such as, but not limited to, smartphones, tablets, laptops, smartwatches, and the like, which can show damage such as cracks, scuffmarks, or the like.
  • the visible damage can be important in understanding whether the shipping box or container was damaged during shipment, or whether a computer device has lost some of its value.
  • Objects such as computer devices also include numerous components that are assembled together.
  • the assembly process can include fasteners (e.g., screws or the like) that keep the various components secured. It is important that these fasteners be installed correctly (e.g., all screws installed (e.g., no missing screws), proper screws installed, screws properly tightened, or the like) as part of the quality control process.
  • the embodiments disclosed herein are directed to systems and methods for inspecting an appearance of an object (e.g., a computer device such as, but not limited to, a smartphone, a tablet, a laptop, a smartwatch, a cellphone, or the like).
  • the inspection of the appearance and cosmetic grading of the object can be utilized during, for example, manufacturing of a device, in a retail setting in which computer devices are sold/purchased, or the like.
  • An image of an object can be captured from each of a plurality of cameras (in a specific cosmetic grading device), or a plurality of images can alternatively be captured from a single camera.
  • “Profile images” (i.e., images of a particular object) can be captured in a calibration process and used to train the cosmetic grading system.
  • Each image of an object being validated (i.e., a test object) is compared against the corresponding profile image to determine a cosmetic score.
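The claims also mention aligning the captured images with the corresponding profile images before comparison. A minimal brute-force translation search could look like the sketch below; the list-based image representation, the small offset bound, and the sum-of-absolute-differences error are assumptions for illustration.

```python
def shift(image, dy, dx, fill=0):
    """Translate a 2-D list image by (dy, dx), filling exposed pixels."""
    h, w = len(image), len(image[0])
    out = [[fill] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            sy, sx = y - dy, x - dx
            if 0 <= sy < h and 0 <= sx < w:
                out[y][x] = image[sy][sx]
    return out

def align(test, profile, max_offset=2):
    """Find the translation of `test` that best matches `profile` by
    exhaustive search over small offsets (sum of absolute differences)."""
    best, best_err = test, float("inf")
    for dy in range(-max_offset, max_offset + 1):
        for dx in range(-max_offset, max_offset + 1):
            cand = shift(test, dy, dx)
            err = sum(
                abs(c - p)
                for c_row, p_row in zip(cand, profile)
                for c, p in zip(c_row, p_row)
            )
            if err < best_err:
                best, best_err = cand, err
    return best, best_err
```

In practice a production system would likely use feature- or correlation-based registration; the exhaustive search above is only meant to make the alignment step concrete.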
  • FIG. 1 shows a cosmetic grading system 50 , according to some embodiments.
  • the cosmetic grading system 50 can be used to, for example, provide a variety of different cosmetic grades for various test objects.
  • the cosmetic grading system 50 can provide a cosmetic grade for different types of test objects captured by different types of devices.
  • the cosmetic grading system 50 can provide a cosmetic grading service that can be accessed by a variety of different remote devices that are able to utilize a server of the cosmetic grading system 50 .
  • the cosmetic grading system 50 generally includes a server device 52 in communication with a computer device 54 through a network 56 .
  • the system 50 can also include a computer device 58 connected to the server device 52 through the network 56 .
  • the server device 52 can include a cosmetic grading application 62 that is configured to compare received images of a test object with images of a corresponding profile test object.
  • the cosmetic grading application 62 can be in communication with a database including profile images of various test objects.
  • the profile images can be of different views of the test object.
  • the profile images can be associated with a particular test object according to a machine readable code such as a barcode.
  • the barcode can be a QR code.
  • the cosmetic grading application 62 can then compare images received from the computer device 54 or 58 and, based on the comparison, grade the cosmetic appearance of the test object.
  • the cosmetic grading application 62 can then store the result and can also output the result via the network 56 to the computer device 54 or the computer device 58 .
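As a purely illustrative sketch of this server-side step, the grading application might accept a JSON payload keyed by the decoded barcode, grade each view against the barcode-selected profiles, persist the grades, and return them over the network. The JSON field names, the profile store, and the status strings are assumptions, not the patent's protocol.

```python
import json

PROFILE_DB = {"QR-DEVICE-42": {"front": "profile-front"}}  # hypothetical profiles
RESULTS = {}  # stored grades, keyed by barcode

def handle_request(payload: str, grade_fn) -> str:
    """Assumed JSON contract: {"barcode": ..., "images": {view: image}}.
    Looks up the profile set by barcode, grades each view with `grade_fn`,
    stores the grades, and returns them to the caller."""
    req = json.loads(payload)
    profiles = PROFILE_DB.get(req["barcode"])
    if profiles is None:
        return json.dumps({"status": "unknown-device"})
    grades = {
        view: grade_fn(img, profiles[view])
        for view, img in req["images"].items()
    }
    RESULTS[req["barcode"]] = grades  # store the grades per region of interest
    return json.dumps({"status": "ok", "grades": grades})  # reply over the network
```

`grade_fn` stands in for whatever comparison-and-grading routine the application uses; keeping it injectable makes the transport sketch independent of the grading logic.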
  • the computer device 54 and the computer device 58 can be used to grade a cosmetic appearance of a test object without the computer device 54 or the computer device 58 being specifically configured with a cosmetic grading application.
  • the network 56 may be referred to as the communications network 56 .
  • Examples of the network 56 include, but are not limited to, a local area network (LAN), a wide area network (WAN), the Internet, or the like.
  • the computer device 54 or computer device 58 can transmit data via the network 56 through a wireless connection using Wi-Fi, Bluetooth, or other similar wireless communication protocols.
  • the computer device 54 or computer device 58 can transmit data via the network 56 through a cellular, 3G, 4G, 5G, or other wireless protocol.
  • the computer device 54 can be a device specifically configured for capturing images of test objects.
  • An example of the computer device 54 is a cosmetic inspection device such as the system 100 described in additional detail in accordance with FIGS. 2 - 4 below.
  • the computer device 58 can include an application that permits a user to send images of a test object over the network 56 to the server device 52 for cosmetic grading.
  • the computer device 58 includes a camera and a network input/output to accomplish the communication and image capturing.
  • the computer device 58 includes a display for showing results of the cosmetic grading.
  • the computer device 58 is a smartphone, a tablet, or the like.
  • the computer device 58 can also be a laptop or a desktop computer having a camera attached thereto.
  • FIG. 2 shows a system 100 for grading an appearance of a test object 102 , according to some embodiments.
  • the system 100 can generally be used to, for example, capture images of the test object and communicate with a server device having a cosmetic grading application to assess a cosmetic appearance of the test object.
  • the system 100 can be a kiosk implemented in a retail environment, where the test object can be a shipping container or an electronic device (e.g., a smartphone, a smartwatch, a tablet, or the like), and the system 100 can determine whether the cosmetic appearance of the test object is damaged.
  • the validation can be part of a quality control process during manufacturing.
  • the test object 102 is a smartphone. It is to be appreciated that the smartphone is an example, and the test object 102 can vary beyond a smartphone. Examples of other test objects 102 include, but are not limited to, a tablet, a smartwatch, a mobile phone other than a smartphone, a personal digital assistant (PDA), a laptop computing device, or the like. Furthermore, the maker or manufacturer of the test object 102 is not limited. That is, the system 100 can be used to validate the installation correctness of components in test objects 102 from different manufacturers so long as a calibration procedure is performed to create a profile image for the corresponding test object 102 .
  • the system 100 includes a display 104 for displaying results of the validation to the user.
  • the display 104 can be a combined display and input (e.g., a touchscreen).
  • the display 104 can be a display of a tablet or the like.
  • a memory of the tablet can store one or more programs to be executed by a processing device of the tablet for validating the correctness of the installation of the component in the test object 102 .
  • the display 104 is secured to housing 106 of the system 100 .
  • the display 104 can be separate from the housing 106 (i.e., not secured to the housing 106 , but positioned near the system 100 and electronically connected to the system 100 ).
  • a platform 108 is utilized to position the test object 102 within the system 100 for validation.
  • the platform 108 enables each test object 102 placed into the system 100 for validation to be placed in substantially the same location. As a result, the effort of determining whether the profile image and the test object 102 under test are in the same location relative to the cameras of the system 100 can be reduced.
  • the platform 108 is shown and described in additional detail in accordance with FIG. 3 below.
  • the system 100 can be portable.
  • the illustrated embodiment shows system 100 with a handle 110 for carrying the system 100 .
  • portability of the system 100 is optional, and accordingly, the handle 110 is optional.
  • the system 100 may be sized differently based on the type of test object 102 to be validated.
  • FIG. 3 shows the platform 108 of the system 100 of FIG. 2 for validation of installation of a component in a test object 102 , according to an embodiment.
  • the platform 108 includes a tiered surface having a first surface 112 and a second surface 116 .
  • a step is thus formed between the first surface 112 and the second surface 116 .
  • a plane of the first surface 112 and a plane of the second surface 116 are parallel.
  • the second surface 116 is L-shaped when viewed from a top view.
  • the second surface 116 is positioned a height H from the first surface 112 .
  • the height H between the first surface 112 and the second surface 116 creates an abutment surface 118 .
  • the height H is selected such that the abutment surface 118 serves as a stop for the test object 102 when placed within the system 100 .
  • the abutment surface 118 is configured to provide a stop for the test object 102 on two sides of the test object 102 (i.e., a major dimension of the test object 102 and a minor dimension of the test object 102 ).
  • the height H is selected to be smaller than a thickness T of the test object 102 being validated in the system 100 .
  • the height H is selected to be smaller than the thickness T of the test object 102 to not hinder side views of the test object 102 .
  • the height H is selected to be large enough that an operator inserting the test object 102 can abut the test object 102 with the abutment surface 118 . In this manner, the abutment surface 118 serves as a stop for the operator when inserting the test object 102 into the system 100 .
  • the height H can be substantially the same as the thickness T of the test object 102 .
  • the configuration of the platform 108 is helpful in establishing the location of the test object 102 .
  • the system 100 can be calibrated to generate the profile images using a single assembly since the coordinate system is generally fixed.
  • the platform 108 can, as a result, be used to account for minor variations in placement of the test object 102 by the operator, as the offset from the expected coordinate system can be determined based on the location of the test object 102 relative to a calibration test object 102 .
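This offset bookkeeping can be made concrete by comparing the bounding-box origin of the test object against that of the calibration object and translating coordinates by the difference. The list-based image representation and the function names below are assumptions for the sketch, not the patent's method.

```python
def bbox_origin(image, background=0):
    """Top-left corner (row, col) of the non-background region."""
    ys = [y for y, row in enumerate(image) if any(p != background for p in row)]
    xs = [x for x in range(len(image[0]))
          if any(row[x] != background for row in image)]
    return (min(ys), min(xs))

def placement_offset(test_img, calibration_img):
    """Offset of the test object from the calibration object's position."""
    ty, tx = bbox_origin(test_img)
    cy, cx = bbox_origin(calibration_img)
    return (ty - cy, tx - cx)

def to_calibrated(point, offset):
    """Map a point in the test image into the calibration coordinate system."""
    return (point[0] - offset[0], point[1] - offset[1])
```

With the abutment surface constraining placement, the offset stays small, so this simple translation correction is usually sufficient before comparison against the profile images.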
  • FIG. 4 shows a schematic architecture for the system 100 of FIG. 2 , according to an embodiment.
  • the system 100 generally includes a plurality of cameras 120 ; a motion sensor 122 ; a proximity sensor 124 ; a processing device 126 , memory 128 , a network input/output (I/O) 130 , user I/O 132 , storage 134 , and an interconnect 136 .
  • the processing device 126 , memory 128 , network input/output (I/O) 130 , user I/O 132 , storage 134 , and interconnect 136 can be within the housing 106 in some embodiments.
  • the processing device 126 , memory 128 , network input/output (I/O) 130 , user I/O 132 , storage 134 , and interconnect 136 can be external from the housing 106 .
  • the plurality of cameras 120 are arranged in the system 100 to capture different views of the test object 102 .
  • the cameras 120 are digital cameras.
  • the system 100 includes three cameras 120 arranged to capture a top view, an up-front view, and an up-side view.
  • the system 100 includes four cameras 120 arranged to capture a top view, an up-front view, a first up-side view, and a second (opposite) up-side view. It will be appreciated that a single camera 120 could be used, although accuracy may be improved when a plurality of cameras 120 are used as a component may appear to be correctly installed in a first view but be determined to be incorrectly installed in a second view.
  • the motion sensor 122 can be, for example, a laser sensor that can be triggered when an object (i.e., test object 102 ) breaks the laser signal.
  • the motion sensor 122 can be installed at the opening to the housing 106 . In some embodiments, the motion sensor 122 may not be included.
  • the proximity sensor 124 can be a sensor to determine when an object is placed near it.
  • the proximity sensor 124 can be placed in the platform 108 of the system 100 .
  • the cameras 120 can capture images of the test object 102 on the platform 108 .
  • the proximity sensor 124 can be included regardless of whether the motion sensor 122 is present. In some embodiments with both motion sensor 122 and proximity sensor 124 , the image capturing may be performed after the proximity sensor 124 detects the test object 102 .
  • automatically triggering the image capturing and subsequent validation using the proximity sensor 124 , or a combination of the proximity sensor 124 and the motion sensor 122 , can increase the number of test objects 102 that can be validated in a set period. That is, reducing the effort of a human operator, or even allowing a robotic arm to load the test object 102 into the system 100 for validation, can reduce the amount of time and effort needed to review the quality of the manufacturing process.
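The trigger behavior (the motion sensor arms the station, the proximity sensor starts the capture) can be sketched as a small state machine. The sensor callbacks, camera interface, and state names below are assumed for illustration.

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    ARMED = auto()      # motion sensor tripped: object entering the housing
    CAPTURING = auto()  # proximity sensor confirms object on the platform

class CaptureController:
    """Assumed trigger logic: motion arms the station, proximity fires the cameras."""
    def __init__(self, cameras):
        self.state = State.IDLE
        self.cameras = cameras
        self.captured = []

    def on_motion(self):
        if self.state is State.IDLE:
            self.state = State.ARMED

    def on_proximity(self):
        # The proximity sensor alone may trigger capture when no motion
        # sensor is installed, matching the optional motion sensor above.
        if self.state in (State.IDLE, State.ARMED):
            self.state = State.CAPTURING
            self.captured = [cam() for cam in self.cameras]  # one image per view
            self.state = State.IDLE
            return self.captured
        return None
```

A robotic loader would drive the same two callbacks, which is why automating placement requires no change to the capture logic in this sketch.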
  • the processing device 126 can retrieve and execute programming instructions stored in the memory 128 , the storage 134 , or combinations thereof.
  • the processing device 126 can also store and retrieve application data residing in the memory 128 .
  • the interconnect 136 is used to transmit programming instructions and/or application data between the processing device 126 , the user I/O 132 , the memory 128 , the storage 134 , and the network I/O 130 .
  • the interconnect 136 can, for example, be one or more busses or the like.
  • the processing device 126 can be a single processing device, multiple processing devices, or a single processing device having multiple processing cores. In some embodiments, the processing device 126 can be a single-threaded processing device. In some embodiments, the processing device 126 can be a multi-threaded processing device.
  • the memory 128 is generally included to be representative of a random-access memory such as, but not limited to, Static Random-Access Memory (SRAM), Dynamic Random-Access Memory (DRAM), or Flash.
  • the memory 128 can be a volatile memory.
  • the memory 128 can be a non-volatile memory.
  • at least a portion of the memory 128 can be virtual memory.
  • the storage 134 is generally included to be representative of a non-volatile memory such as, but not limited to, a hard disk drive, a solid-state device, removable memory cards, optical storage, flash memory devices, network attached storage (NAS), or connections to storage area network (SAN) devices, or other similar devices that may store non-volatile data.
  • the storage 134 is a computer readable medium.
  • the storage 134 can include storage that is external to the user device, such as in a cloud.
  • FIG. 5 shows a block diagram illustrating an internal architecture of an example of a computer, according to some embodiments.
  • the computer can be, for example, the server device 52 , the computer device 54 , or the computer device 58 , in accordance with some embodiments.
  • a computer as referred to herein refers to any device with a processor capable of executing logic or coded instructions, and could be a server, personal computer, set top box, smart phone, pad computer or media device, to name a few such devices.
  • internal architecture 150 includes one or more processing units (also referred to herein as CPUs) 162 , which interface with at least one computer bus 152 .
  • Also interfacing with computer bus 152 are persistent storage medium/media 156 , network interface 164 , memory 154 , e.g., random access memory (RAM), run-time transient memory, read only memory (ROM), etc., media disk drive interface 158 as an interface for a drive that can read and/or write to media including removable media such as floppy, CD ROM, DVD, etc. media, display interface 160 as interface for a monitor or other display device, keyboard interface 166 as interface for a keyboard, pointing device interface 168 as an interface for a mouse or other pointing device, and miscellaneous other interfaces 170 , 172 not shown individually, such as parallel and serial port interfaces, a universal serial bus (USB) interface, and the like.
  • Memory 154 interfaces with computer bus 152 so as to provide information stored in memory 154 to CPU 162 during execution of software programs such as an operating system, application programs, device drivers, and software modules that comprise program code, and/or computer executable process operations, incorporating functionality described herein, e.g., one or more of process flows described herein.
  • CPU 162 first loads computer executable process operations from storage, e.g., memory 154 , storage medium/media 156 , removable media drive, and/or other storage device.
  • CPU 162 can then execute the stored process operations in order to execute the loaded computer-executable process operations.
  • Stored data e.g., data stored by a storage device, can be accessed by CPU 162 during the execution of computer-executable process operations.
  • Persistent storage medium/media 156 is a computer readable storage medium(s) that can be used to store software and data, e.g., an operating system and one or more application programs. Persistent storage medium/media 156 can also be used to store device drivers, such as one or more of a digital camera driver, monitor driver, printer driver, scanner driver, or other device drivers, web pages, content files, playlists, and other files. Persistent storage medium/media 156 can further include program modules and data files used to implement one or more embodiments of the present disclosure.
  • a module is a software, hardware, or firmware (or combinations thereof) system, process or functionality, or component thereof, that performs or facilitates the processes, features, and/or functions described herein (with or without human interaction or augmentation).
  • a module can include sub-modules.
  • Software components of a module may be stored on a computer readable medium. Modules may be integral to one or more servers, or be loaded and executed by one or more servers. One or more modules may be grouped into an engine or an application.
  • Examples of computer-readable storage media include, but are not limited to, any tangible medium capable of storing a computer program for use by a programmable processing device to perform functions described herein by operating on input data and generating an output.
  • a computer program is a set of instructions that can be used, directly or indirectly, in a computer system to perform a certain function or determine a certain result.
  • Examples of computer-readable storage media include, but are not limited to, a floppy disk; a hard disk; a random access memory (RAM); a read-only memory (ROM); a semiconductor memory device such as, but not limited to, an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), Flash memory, or the like; a portable compact disk read-only memory (CD-ROM); an optical storage device; a magnetic storage device; other similar device; or suitable combinations of the foregoing.
  • hardwired circuitry may be used in combination with software instructions.
  • the description is not limited to any specific combination of hardware circuitry and software instructions, nor to any source for the instructions executed by the data processing system.
  • FIG. 6 shows a flowchart of a method 200 , according to some embodiments.
  • the method 200 can be representative of a cosmetic grading service being accessed via a cosmetic grading device.
  • the cosmetic grading device can be implemented as a kiosk or the like in a setting such as a retail store.
  • the cosmetic grading device can be utilized in other environments such as, for example, in a manufacturing environment in which the object to be tested is a shipping box to be shipped or a computer device to be refurbished. It is to be appreciated that these are examples, and the applications can vary beyond the above stated examples.
  • a test object is loaded into the system 100 . This includes abutting the test object with the abutment surface 118 of the platform 108 .
  • the test object can be loaded by a human operator.
  • a robotic or mechanical arm can be automated to place the test object onto the platform 108 .
  • the test object can be a computer device such as, but not limited to, a smartphone, smartwatch, tablet, or the like. The placement of the test object can cause the motion sensor 122 , the proximity sensor 124 , or a combination thereof, to generate a signal indicative of the test object being in place.
  • the plurality of cameras 120 each capture an image.
  • the cameras 120 are oriented such that the captured images are of different views of the test object.
  • the test object can have a barcode (e.g., a QR code or other machine readable code) on a surface of the test object that is captured by one of the cameras 120 .
  • the captured images are transmitted to a server device to be compared against profile images that are retrieved using the barcode as captured.
  • an output is generated by the server device that is indicative of the results of the validation (e.g., pass, fail, needs review).
  • the output can be based on the range into which the matching score falls. That is, if the matching score is greater than a first value, the output can indicate that the test object passes; if the score is between the first value and a lower second value, the test object may need checking (e.g., by an operator); and if the score is between the second value and a third value that is lower than the second value, the test object may fail.
  • a failure can mean, for example, that the test object is damaged such that the appearance does not match the expected appearance.
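The banded output described above can be sketched as a small scoring function. This is a minimal sketch: the numeric thresholds are hypothetical placeholders, since the disclosure only requires that the first value be greater than the second, which in turn is greater than the third.

```python
def validation_output(matching_score,
                      first_value=0.90,   # hypothetical "pass" threshold
                      second_value=0.70,  # hypothetical "needs review" threshold
                      third_value=0.30):  # hypothetical "fail" floor
    """Map a matching score onto pass / needs review / fail bands."""
    if matching_score > first_value:
        return "pass"
    if matching_score > second_value:
        return "needs review"  # e.g., flag for an operator to check
    if matching_score > third_value:
        return "fail"
    # The disclosure does not specify an output below the third value;
    # this sketch treats it as a failure as well.
    return "fail"
```

Under these placeholder thresholds, a score of 0.80 would fall in the review band.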
  • the method includes selecting, by the processor, a region of interest in each of the plurality of images of the test object.
  • the region of interest includes the test object having a background removed.
  • the method includes comparing, by the processor, each region of interest with a corresponding profile image and identifying defects in each region of interest.
  • the corresponding profile image is determined from the image of the barcode on the test object.
  • the method includes grading, by the processor, a cosmetic appearance of each region of interest based on the identified defects.
  • the method includes storing the grades of the cosmetic appearance for each region of interest.
  • the output is received by the system 100 and displayed on the display 104 of the system 100 .
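The steps above (select a region of interest by removing the background, compare each region with its profile image, identify defects, grade, and store) can be sketched as follows. The pixel-difference defect test, the grade letters, and all helper names are assumptions made for illustration, not the patent's actual comparison method.

```python
def select_region_of_interest(image, foreground_mask):
    """Background removal: keep only pixels the mask marks as the test object."""
    return [[pixel if keep else None for pixel, keep in zip(row, mask_row)]
            for row, mask_row in zip(image, foreground_mask)]

def identify_defects(region, profile, tolerance=10):
    """Flag a pixel as a defect when it differs from the profile image
    by more than `tolerance` (an assumed rule for this sketch)."""
    defects = []
    for y, (row, profile_row) in enumerate(zip(region, profile)):
        for x, (pixel, expected) in enumerate(zip(row, profile_row)):
            if pixel is not None and abs(pixel - expected) > tolerance:
                defects.append((x, y))
    return defects

def grade_region(defects, region_area):
    """Toy grading rule based on the fraction of defective pixels."""
    ratio = len(defects) / region_area
    if ratio < 0.01:
        return "A"
    return "B" if ratio < 0.10 else "C"

def grade_test_object(images, masks, profiles):
    """Grade each view of the test object and collect the grades for storage."""
    grades = {}
    for view, (image, mask, profile) in enumerate(zip(images, masks, profiles)):
        region = select_region_of_interest(image, mask)
        defects = identify_defects(region, profile)
        grades[view] = grade_region(defects, len(image) * len(image[0]))
    return grades
```

A view whose pixels all match its profile image grades "A"; a view with a large differing patch grades "C" under this toy rule.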
  • FIG. 7 shows a flowchart of a method 250 , according to some embodiments.
  • the method 250 can be representative of a cosmetic grading service being accessed via a computer device such as a smartphone or the like.
  • the computer device can be utilized to review aesthetics of test objects in any environment accessible by a user, so long as the server providing the cosmetic grading service has received some profile images of the test object against which the images received from the computer device can be compared.
  • a plurality of images of a test object are captured.
  • the user can orient the computer device to capture multiple views of the test object.
  • the test object can have a barcode (e.g., a QR code or other machine readable code) on a surface of the test object that is captured by the camera of the computer device.
  • the captured images are transmitted to a server device to be compared against profile images that are retrieved using the barcode as captured.
  • an output is generated by the server device that is indicative of the results of the validation (e.g., pass, fail, needs review).
  • the output can be based on the range into which the matching score falls. That is, if the matching score is greater than a first value, the output can indicate that the test object passes; if the score is between the first value and a lower second value, the test object may need checking (e.g., by an operator); and if the score is between the second value and a third value that is lower than the second value, the test object may fail.
  • a failure can mean, for example, that the test object is damaged such that the appearance does not match the expected appearance.
  • the method includes selecting, by the processor, a region of interest in each of the plurality of images of the test object.
  • the region of interest includes the test object having a background removed.
  • the method includes comparing, by the processor, each region of interest with a corresponding profile image and identifying defects in each region of interest.
  • the corresponding profile image is determined from the image of the barcode on the test object.
  • the method includes grading, by the processor, a cosmetic appearance of each region of interest based on the identified defects.
  • the method includes storing the grades of the cosmetic appearance for each region of interest.
  • the output is received by the computer device and displayed on the display of the computer device.


Abstract

A method includes receiving, by a processor, a plurality of images of a test object, the plurality of images including a plurality of surfaces of the test object. The processor receives an image of a barcode on the test object. The processor selects a region of interest in each of the plurality of images of the test object. The region of interest includes the test object having a background removed. For the plurality of regions of interest as selected, the processor compares each region of interest with a corresponding profile image and identifies defects in each region of interest. The corresponding profile image is determined from the image of the barcode on the test object. The method includes grading, by the processor, a cosmetic appearance of each region of interest based on the identified defects. The method includes storing the grades of the cosmetic appearance for each region of interest.

Description

    FIELD OF THE TECHNOLOGY
  • At least some embodiments disclosed herein relate generally to cosmetic evaluation of an object. More particularly, the embodiments relate to systems, devices, and methods for computer-aided cosmetic evaluation and categorization of an object such as, but not limited to, an electronic device or the like.
  • BACKGROUND
  • Large volumes of computing devices (e.g., mobile devices, such as cellular telephones, tablets, etc.) are recycled and often refurbished. There are numerous aspects to the refurbishing process. One aspect includes inspecting the visual characteristics of the computing device to grade its visual appearance. Some of these devices are then refurbished and can be resold to new users.
  • SUMMARY
  • In some embodiments, a method includes receiving, by a processor, a plurality of images of a test object, the plurality of images including a plurality of surfaces of the test object. In some embodiments, the method includes receiving, by the processor, an image of a barcode on the test object. In some embodiments, the method includes selecting, by the processor, a region of interest in each of the plurality of images of the test object. In some embodiments, the region of interest includes the test object having a background removed. In some embodiments, for the plurality of regions of interest as selected, the method includes comparing, by the processor, each region of interest with a corresponding profile image and identifying defects in each region of interest. In some embodiments, the corresponding profile image is determined from the image of the barcode on the test object. In some embodiments, the method includes grading, by the processor, a cosmetic appearance of each region of interest based on the identified defects. In some embodiments, the method includes storing the grades of the cosmetic appearance for each region of interest.
  • In some embodiments, the method includes sending the grades of the cosmetic appearance for each region of interest to a remote device.
  • In some embodiments, the plurality of images of the test object are received from a remote device. In some embodiments, the remote device includes a camera configured to capture the plurality of images.
  • In some embodiments, the remote device is a cosmetic inspection device.
  • In some embodiments, the remote device is a mobile device.
  • In some embodiments, the barcode is a QR code.
  • In some embodiments, the method includes aligning the plurality of images with the corresponding profile images.
  • In some embodiments, the test object is a mobile device. In some embodiments, the method includes determining a value of the mobile device based on the grades of the cosmetic appearance for each region of interest.
  • In some embodiments, the test object is a mobile device. In some embodiments, the barcode is a QR code displayed on a display of the mobile device.
  • In some embodiments, a system includes a server device including a processor and a memory. In some embodiments, the processor of the server device is configured to receive a plurality of images of a test object. In some embodiments, the plurality of images include a plurality of surfaces of the test object. In some embodiments, the processor is configured to receive an image of a barcode on the test object. In some embodiments, the processor is configured to select a region of interest in each of the plurality of images of the test object. In some embodiments, the region of interest includes the test object having a background removed. In some embodiments, for the plurality of regions of interest as selected, the processor is configured to compare each region of interest with a corresponding profile image and identify defects in each region of interest. In some embodiments, the corresponding profile image is determined from the image of the barcode on the test object. In some embodiments, the processor is configured to grade a cosmetic appearance of each region of interest based on the identified defects. In some embodiments, the processor is configured to store the grades of the cosmetic appearance for each region of interest.
  • In some embodiments, the processor is configured to send the grades of the cosmetic appearance for each region of interest to a remote device over a network.
  • In some embodiments, the plurality of images of the test object are received from a remote device over a network. In some embodiments, the remote device includes a camera configured to capture the plurality of images.
  • In some embodiments, the remote device is a cosmetic inspection device.
  • In some embodiments, the remote device is a mobile device.
  • In some embodiments, the barcode is a QR code.
  • In some embodiments, the processor is configured to align the plurality of images with the corresponding profile images.
  • In some embodiments, the test object is a mobile device. In some embodiments, the processor is configured to determine a value of the mobile device based on the grades of the cosmetic appearance for each region of interest.
  • In some embodiments, the test object is a mobile device. In some embodiments, the barcode is a QR code displayed on a display of the mobile device.
  • In some embodiments, a non-transitory computer-readable storage medium includes instructions that, when executed by a processor, cause the processor to perform a method. In some embodiments, the method includes receiving, by the processor, a plurality of images of a test object, the plurality of images including a plurality of surfaces of the test object. In some embodiments, the method includes receiving, by the processor, an image of a barcode on the test object. In some embodiments, the method includes selecting, by the processor, a region of interest in each of the plurality of images of the test object. In some embodiments, the region of interest includes the test object having a background removed. In some embodiments, for the plurality of regions of interest as selected, the method includes comparing, by the processor, each region of interest with a corresponding profile image and identifying defects in each region of interest. In some embodiments, the corresponding profile image is determined from the image of the barcode on the test object. In some embodiments, the method includes grading, by the processor, a cosmetic appearance of each region of interest based on the identified defects. In some embodiments, the method includes storing the grades of the cosmetic appearance for each region of interest.
  • In some embodiments, the method includes sending the grades of the cosmetic appearance for each region of interest to a remote device over a network.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • References are made to the accompanying drawings that form a part of this disclosure and illustrate embodiments in which the systems and methods described in this Specification can be practiced.
  • FIG. 1 shows a cosmetic grading system, according to some embodiments.
  • FIG. 2 shows a system for validation of installation of a component in an assembly, according to an embodiment.
  • FIG. 3 shows a portion of the system for validation of installation of a component of an assembly of FIG. 2 , according to an embodiment.
  • FIG. 4 shows a schematic architecture for the system of FIG. 2 , according to an embodiment.
  • FIG. 5 shows a block diagram illustrating an internal architecture of an example of a mobile device, according to some embodiments.
  • FIG. 6 shows a flowchart of a method, according to some embodiments.
  • FIG. 7 shows a flowchart of a method, according to some embodiments.
  • Like reference numbers represent the same or similar parts throughout.
  • DETAILED DESCRIPTION
  • Various objects such as, for example, a shipping box or container can show damage such as scuffmarks, dents, rips, tears, or the like. As other examples, computer devices such as, but not limited to, smartphones, tablets, laptops, smartwatches, and the like can show damage such as cracks, scuffmarks, or the like. The visible damage can be important in understanding whether the shipping box or container was damaged during shipment, or whether a computer device has lost some of its value. Objects such as computer devices also include numerous components that are assembled together. The assembly process can include fasteners (e.g., screws or the like) that keep the various components secured. It is important that these fasteners be installed correctly (e.g., all screws installed (i.e., no missing screws), the proper screws installed, the screws properly tightened, or the like) as part of the quality control process.
  • The embodiments disclosed herein are directed to systems and methods for inspecting an appearance of an object (e.g., a computer device such as, but not limited to, a smartphone, a tablet, a laptop, a smartwatch, a cellphone, or the like). The inspection of the appearance and cosmetic grading of the object can be utilized during, for example, manufacturing of a device, in a retail setting in which computer devices are sold/purchased, or the like.
  • An image of an object can be captured from each of a plurality of cameras (in a specific cosmetic grading device), or a plurality of images can alternatively be captured from a single camera. "Profile images" (i.e., images of a particular object captured in a calibration process) can be used to train the cosmetic grading system. Each image of an object being validated (i.e., a test object) can be taken in the same coordinate system as the corresponding profile image or with a predetermined relationship to it. Each of the captured images is compared against the corresponding profile image to determine a cosmetic score.
  • FIG. 1 shows a cosmetic grading system 50, according to some embodiments. The cosmetic grading system 50 can be used to, for example, provide a variety of different cosmetic grades for various test objects. The cosmetic grading system 50 can provide a cosmetic grade for different types of test objects captured by different types of devices. For example, in some embodiments, the cosmetic grading system 50 can provide a cosmetic grading service that can be accessed by a variety of different remote devices that are able to utilize a server of the cosmetic grading system 50.
  • The cosmetic grading system 50 generally includes a server device 52 in communication with a computer device 54 through a network 56. The system 50 can also include a computer device 58 connected to the server device 52 through the network 56.
  • The server device 52 can include a cosmetic grading application 62 that is configured to compare received images of a test object with images of a corresponding profile test object. The cosmetic grading application 62 can be in communication with a database including profile images of various test objects. The profile images can be of different views of the test object. The profile images can be associated with a particular test object according to a machine readable code such as a barcode. In embodiments, the barcode can be a QR code. As a result, when the cosmetic grading application 62 receives an image of a barcode, the barcode can be used to retrieve the appropriate profile images for the test object. The cosmetic grading application 62 can then compare images received from the computer device 54 or 58 and, based on the comparison, grade the cosmetic appearance of the test object. The cosmetic grading application 62 can then store the result and can also output the result via the network 56 to the computer device 54 or the computer device 58. In this manner, the computer device 54 and the computer device 58 can be used to grade a cosmetic appearance of a test object without the computer device 54 or the computer device 58 being specifically configured with a cosmetic grading application.
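A minimal sketch of this server-side flow, assuming an in-memory dictionary in place of the profile-image database, a naive pixel-match score in place of the application's actual comparison, and hypothetical names throughout (including the barcode payload used in the test):

```python
# In-memory stand-ins for the profile-image database and the grade store;
# a real deployment would use persistent storage behind the server device 52.
PROFILE_IMAGES = {}   # barcode payload -> {view name: profile image}
STORED_GRADES = {}    # barcode payload -> {view name: grade}

def matching_score(image, profile_image):
    """Naive similarity measure: fraction of pixels that match exactly."""
    pairs = [(a, b) for row_a, row_b in zip(image, profile_image)
             for a, b in zip(row_a, row_b)]
    return sum(a == b for a, b in pairs) / len(pairs)

def grade_request(barcode, images):
    """Retrieve profile images by barcode, grade each view, store and return."""
    profiles = PROFILE_IMAGES.get(barcode)
    if profiles is None:
        return {"error": "no profile images registered for this barcode"}
    grades = {view: ("pass" if matching_score(images[view], profile) > 0.9
                     else "needs review")
              for view, profile in profiles.items()}
    STORED_GRADES[barcode] = grades
    return grades
```

A request with an unregistered barcode returns an error rather than a grade, mirroring the application's dependence on previously captured profile images.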
  • The network 56 may be referred to as the communications network 56. Examples of the network 56 include, but are not limited to, a local area network (LAN), a wide area network (WAN), the Internet, or the like. The computer device 54 or the computer device 58 can transmit data via the network 56 through a wireless connection using Wi-Fi, Bluetooth, or other similar wireless communication protocols, or through a cellular protocol such as 3G, 4G, 5G, or the like.
  • The computer device 54 can be a device specifically configured for capturing images of test objects. An example of the computer device 54 is a cosmetic inspection device such as the system 100 described in additional detail in accordance with FIGS. 2-4 below.
  • The computer device 58 can include an application that permits a user to send images of a test object over the network 56 to the server device 52 for cosmetic grading. The computer device 58 includes a camera and a network input/output to accomplish the communication and image capturing. The computer device 58 includes a display for showing results of the cosmetic grading. In some embodiments, the computer device 58 is a smartphone, a tablet, or the like. The computer device 58 can also be a laptop or a desktop computer having a camera attached thereto.
  • FIG. 2 shows a system 100 for grading an appearance of a test object 102, according to some embodiments. The system 100 can generally be used to, for example, capture images of the test object and communicate with a server device having a cosmetic grading application to assess a cosmetic appearance of the test object. For example, in some embodiments, the system 100 can be a kiosk implemented in a retail environment, the test object can be a shipping container or an electronic device (e.g., a smartphone, a smartwatch, a tablet, or the like), and the system 100 can determine whether the cosmetic appearance of the test object is damaged. In some embodiments, the validation can be part of a quality control process during manufacturing.
  • In the illustrated embodiment, the test object 102 is a smartphone. It is to be appreciated that the smartphone is an example, and the test object 102 can vary beyond a smartphone. Examples of other test objects 102 include, but are not limited to, a tablet, a smartwatch, a mobile phone other than a smartphone, a personal digital assistant (PDA), a laptop computing device, or the like. Furthermore, the maker or manufacturer of the test object 102 is not limited. That is, the system 100 can be used to validate the installation correctness of components in test objects 102 from different manufacturers so long as a calibration procedure is performed to create a profile image for the corresponding test object 102.
  • The system 100 includes a display 104 for displaying results of the validation to the user. In some embodiments, the display 104 can be a combined display and input (e.g., a touchscreen). In some embodiments, the display 104 can be a display of a tablet or the like. In such an embodiment, a memory of the tablet can store one or more programs to be executed by a processing device of the tablet for validating the correctness of the installation of the component in the test object 102.
  • In the illustrated embodiment, the display 104 is secured to housing 106 of the system 100. In some embodiments, the display 104 can be separate from the housing 106 (i.e., not secured to the housing 106, but positioned near the system 100 and electronically connected to the system 100). However, it may be beneficial to secure the display 104 to the housing 106 to reduce a footprint of the system 100.
  • A platform 108 is utilized to position the test object 102 within the system 100 for validation. The platform 108 enables each test object 102 placed into the system 100 to be positioned in substantially the same location. As a result, the effort required to determine whether the profile image and the test object 102 under test are in the same location relative to the cameras of the system 100 can be reduced. The platform 108 is shown and described in additional detail in accordance with FIG. 3 below.
  • In some embodiments, the system 100 can be portable. For example, the illustrated embodiment shows system 100 with a handle 110 for carrying the system 100. It is to be appreciated that portability of the system 100 is optional, and accordingly, the handle 110 is optional. In some embodiments, the system 100 may be sized differently based on the type of test object 102 to be validated.
  • FIG. 3 shows the platform 108 of the system 100 of FIG. 2 for validation of installation of a component in a test object 102, according to an embodiment.
  • The platform 108 includes a tiered surface having a first surface 112 and a second surface 116. A step is thus formed between the first surface 112 and the second surface 116. A plane of the first surface 112 and a plane of the second surface 116 are parallel. In the illustrated embodiment, the second surface 116 is L-shaped when viewed from a top view.
  • The second surface 116 is positioned a height H from the first surface 112. The height H between the first surface 112 and the second surface 116 creates an abutment surface 118.
  • The height H is selected such that the abutment surface 118 serves as a stop for the test object 102 when placed within the system 100. The abutment surface 118 is configured to provide a stop for the test object 102 on two sides of the test object 102 (i.e., a major dimension of the test object 102 and a minor dimension of the test object 102).
  • The height H is selected to be smaller than a thickness T of the test object 102 being validated in the system 100. The height H is selected to be smaller than the thickness T of the test object 102 to not hinder side views of the test object 102. The height H is selected to be large enough that an operator inserting the test object 102 can abut the test object 102 with the abutment surface 118. In this manner, the abutment surface 118 serves as a stop for the operator when inserting the test object 102 into the system 100. In some embodiments, the height H can be substantially the same as the thickness T of the test object 102.
  • The configuration of the platform 108 is helpful in establishing the location of the test object 102. By including the platform 108, the system 100 can be calibrated to generate the profile images using a single assembly, since the coordinate system is generally fixed. The platform 108 can, as a result, be used to account for minor variations in placement of the test object 102 by the operator, as the offset from the expected coordinate system can be determined based on the location of the test object 102 relative to a calibration test object 102.
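The offset correction described above can be illustrated with a short sketch; locating the test object by a single reference corner in pixel coordinates is an assumption made here for illustration, not a detail from the disclosure.

```python
def placement_offset(calibration_corner, detected_corner):
    """Translation between the corner recorded during calibration and the
    corner detected for the current test object ((x, y) pixel coordinates)."""
    cal_x, cal_y = calibration_corner
    det_x, det_y = detected_corner
    return (det_x - cal_x, det_y - cal_y)

def shift_region(region_box, offset):
    """Shift an expected region of interest (x0, y0, x1, y1) by the offset,
    so the comparison is made at the test object's actual location."""
    x0, y0, x1, y1 = region_box
    dx, dy = offset
    return (x0 + dx, y0 + dy, x1 + dx, y1 + dy)
```

For example, a test object detected two pixels right and one pixel above the calibration position shifts every expected region by (2, -1) before comparison.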
  • FIG. 4 shows a schematic architecture for the system 100 of FIG. 2 , according to an embodiment.
  • The system 100 generally includes a plurality of cameras 120, a motion sensor 122, a proximity sensor 124, a processing device 126, memory 128, a network input/output (I/O) 130, user I/O 132, storage 134, and an interconnect 136. The processing device 126, memory 128, network input/output (I/O) 130, user I/O 132, storage 134, and interconnect 136 can be within the housing 106 in some embodiments. In some embodiments, the processing device 126, memory 128, network input/output (I/O) 130, user I/O 132, storage 134, and interconnect 136 can be external to the housing 106.
  • The plurality of cameras 120 are arranged in the system 100 to capture different views of the test object 102. In some embodiments, the cameras 120 are digital cameras. For example, in some embodiments the system 100 includes three cameras 120 arranged to capture a top view, an up-front view, and an up-side view. In some embodiments, the system 100 includes four cameras 120 arranged to capture a top view, an up-front view, a first up-side view, and a second (opposite) up-side view. It will be appreciated that a single camera 120 could be used, although accuracy may be improved when a plurality of cameras 120 are used as a component may appear to be correctly installed in a first view but be determined to be incorrectly installed in a second view.
  • The motion sensor 122 can be, for example, a laser sensor that can be triggered when an object (i.e., test object 102) breaks the laser signal. The motion sensor 122 can be installed at the opening to the housing 106. In some embodiments, the motion sensor 122 may not be included.
  • The proximity sensor 124 can be a sensor to determine when an object is placed near it. The proximity sensor 124 can be placed in the platform 108 of the system 100. In some embodiments, when the motion sensor 122 is triggered and the proximity sensor 124 detects an object, the cameras 120 can capture images of the test object 102 on the platform 108. In some embodiments, the proximity sensor 124 can be included regardless of whether the motion sensor 122 is present. In some embodiments with both motion sensor 122 and proximity sensor 124, the image capturing may be performed after the proximity sensor 124 detects the test object 102.
  • In some embodiments, automatically causing the image capturing and subsequent validation to be performed using the proximity sensor 124, or a combination of the proximity sensor 124 and the motion sensor 122, can increase a number of test objects 102 that can be validated in a set period. That is, reducing effort of a human operator, or even allowing for a robotic arm to load the test object 102 into the system 100 for validation, can reduce an amount of time and effort needed to review the quality of the manufacturing process.
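The sensor-gated capture described in the preceding paragraphs reduces to a small predicate; the function and argument names below are hypothetical:

```python
def should_capture(proximity_detects_object, motion_sensor_tripped,
                   motion_sensor_installed=True):
    """Decide whether the cameras 120 should capture images.

    Per the embodiments above: the proximity sensor 124 must detect the
    test object, and when a motion sensor 122 is installed it must also
    have been tripped (e.g., by the object entering the housing 106).
    """
    if not proximity_detects_object:
        return False
    if motion_sensor_installed and not motion_sensor_tripped:
        return False
    return True
```

This matches the embodiment in which capture follows proximity detection, while still supporting systems built without the optional motion sensor 122.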
  • The processing device 126 can retrieve and execute programming instructions stored in the memory 128, the storage 134, or combinations thereof. The processing device 126 can also store and retrieve application data residing in the memory 128.
  • The interconnect 136 is used to transmit programming instructions and/or application data between the processing device 126, the user I/O 132, the memory 128, the storage 134, and the network I/O 130. The interconnect 136 can, for example, be one or more busses or the like. The processing device 126 can be a single processing device, multiple processing devices, or a single processing device having multiple processing cores. In some embodiments, the processing device 126 can be a single-threaded processing device. In some embodiments, the processing device 126 can be a multi-threaded processing device.
  • The memory 128 is generally included to be representative of a random-access memory such as, but not limited to, Static Random-Access Memory (SRAM), Dynamic Random-Access Memory (DRAM), or Flash. In some embodiments, the memory 128 can be a volatile memory. In some embodiments, the memory 128 can be a non-volatile memory. In some embodiments, at least a portion of the memory 128 can be virtual memory.
  • The storage 134 is generally included to be representative of a non-volatile memory such as, but not limited to, a hard disk drive, a solid-state device, removable memory cards, optical storage, flash memory devices, network attached storage (NAS), or connections to storage area network (SAN) devices, or other similar devices that may store non-volatile data. In some embodiments, the storage 134 is a computer readable medium. In some embodiments, the storage 134 can include storage that is external to the user device, such as in a cloud.
  • FIG. 5 shows a block diagram illustrating an internal architecture of an example of a computer, according to some embodiments. The computer can be, for example, the server device 52, the computer device 54, or the computer device 58.
  • A computer as referred to herein is any device with a processor capable of executing logic or coded instructions; it could be a server, a personal computer, a set top box, a smart phone, a pad computer, or a media device, to name a few such devices. As shown in the example of FIG. 5 , internal architecture 150 includes one or more processing units (also referred to herein as CPUs) 162, which interface with at least one computer bus 152. Also interfacing with computer bus 152 are: persistent storage medium/media 156; network interface 164; memory 154, e.g., random access memory (RAM), run-time transient memory, read only memory (ROM), etc.; media disk drive interface 158, an interface for a drive that can read and/or write to media, including removable media such as floppy disks, CD-ROMs, and DVDs; display interface 160, an interface for a monitor or other display device; keyboard interface 166, an interface for a keyboard; pointing device interface 168, an interface for a mouse or other pointing device; and miscellaneous other interfaces 170 , 172 not shown individually, such as parallel and serial port interfaces, a universal serial bus (USB) interface, and the like.
  • Memory 154 interfaces with computer bus 152 so as to provide information stored in memory 154 to CPU 162 during execution of software programs such as an operating system, application programs, device drivers, and software modules that comprise program code, and/or computer executable process operations, incorporating functionality described herein, e.g., one or more of process flows described herein. CPU 162 first loads computer executable process operations from storage, e.g., memory 154, storage medium/media 156, removable media drive, and/or other storage device. CPU 162 can then execute the stored process operations in order to execute the loaded computer-executable process operations. Stored data, e.g., data stored by a storage device, can be accessed by CPU 162 during the execution of computer-executable process operations.
  • Persistent storage medium/media 156 is a computer readable storage medium(s) that can be used to store software and data, e.g., an operating system and one or more application programs. Persistent storage medium/media 156 can also be used to store device drivers, such as one or more of a digital camera driver, monitor driver, printer driver, scanner driver, or other device drivers, web pages, content files, playlists, and other files. Persistent storage medium/media 156 can further include program modules and data files used to implement one or more embodiments of the present disclosure.
  • For the purposes of this disclosure a module is a software, hardware, or firmware (or combinations thereof) system, process or functionality, or component thereof, that performs or facilitates the processes, features, and/or functions described herein (with or without human interaction or augmentation). A module can include sub-modules. Software components of a module may be stored on a computer readable medium. Modules may be integral to one or more servers, or be loaded and executed by one or more servers. One or more modules may be grouped into an engine or an application.
  • Examples of computer-readable storage media include, but are not limited to, any tangible medium capable of storing a computer program for use by a programmable processing device to perform functions described herein by operating on input data and generating an output. A computer program is a set of instructions that can be used, directly or indirectly, in a computer system to perform a certain function or determine a certain result. Examples of computer-readable storage media include, but are not limited to, a floppy disk; a hard disk; a random access memory (RAM); a read-only memory (ROM); a semiconductor memory device such as, but not limited to, an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), Flash memory, or the like; a portable compact disk read-only memory (CD-ROM); an optical storage device; a magnetic storage device; other similar device; or suitable combinations of the foregoing.
  • In some embodiments, hardwired circuitry may be used in combination with software instructions. Thus, the description is not limited to any specific combination of hardware circuitry and software instructions, nor to any source for the instructions executed by the data processing system.
  • FIG. 6 shows a flowchart of a method 200, according to some embodiments. In some embodiments, the method 200 can be representative of a cosmetic grading service being accessed via a cosmetic grading device. In some embodiments, the cosmetic grading device can be implemented as a kiosk or the like in a setting such as a retail store. In some embodiments, the cosmetic grading device can be utilized in other environments such as, for example, in a manufacturing environment in which the object to be tested is a shipping box to be shipped or a computer device to be refurbished. It is to be appreciated that these are examples, and the applications can vary beyond the above stated examples.
  • At block 202, a test object is loaded into the system 100. This includes abutting the test object against the abutment surface 118 of the platform 108. In some embodiments, the test object can be loaded by a human operator. In some embodiments, a robotic or mechanical arm can be automated to place the test object onto the platform 108. In some embodiments, the test object can be a computer device such as, but not limited to, a smartphone, smartwatch, tablet, or the like. The placement of the test object can cause the motion sensor 122, the proximity sensor 124, or a combination thereof, to generate a signal indicative of the test object being in place.
  • At block 204, in response to the signal generated by the motion sensor 122, the proximity sensor 124, or a combination thereof, the plurality of cameras 120 each capture an image. As discussed above, the cameras 120 are oriented such that the captured images are of different views of the test object. In some embodiments, the test object can have a barcode (e.g., a QR code or other machine readable code) on a surface of the test object that is captured by one of the cameras 120.
  • At block 206, the captured images are transmitted to a server device to be compared against profile images that are retrieved using the captured barcode.
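  • The lookup at blocks 204-206 can be sketched as follows. This is a minimal illustration assuming the barcode has already been decoded to a string; the `PROFILE_DB` mapping, the key format, and the function name are hypothetical placeholders for whatever profile-image store the server device maintains.

```python
# Hypothetical sketch: the decoded barcode string keys a lookup of profile
# images on the server. The in-memory "database" below is illustrative only;
# a real deployment would query persistent storage.

PROFILE_DB = {
    # decoded barcode -> profile image identifiers, one per expected view
    "SM-G991U:BLACK": ["front.png", "back.png", "left.png", "right.png"],
}

def retrieve_profile_images(decoded_barcode: str) -> list[str]:
    """Return the profile images registered for this device model/finish.

    Returns an empty list when the barcode is unknown, signalling that the
    test object cannot be validated against any stored profile.
    """
    return PROFILE_DB.get(decoded_barcode, [])
```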
  • At block 208, an output is generated by the server device indicating the results of the validation (e.g., pass, fail, needs review). The output can be based on the range into which the matching score falls: if the matching score is greater than a first value, the test object passes; if it is between the first value and a lower second value, the test object may need checking (e.g., by an operator); and if it is between the second value and a still lower third value, the test object may fail. A failure can mean, for example, that the test object is damaged such that its appearance does not match the expected appearance.
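  • The thresholding described above can be sketched as a simple range check. The numeric thresholds below are assumptions for illustration; the disclosure fixes only their ordering (first > second > third), not their magnitudes.

```python
def validation_output(score: float,
                      first: float = 0.9,
                      second: float = 0.6,
                      third: float = 0.0) -> str:
    """Map a matching score to the output described at block 208.

    The default threshold values are illustrative assumptions; only the
    ordering first > second > third comes from the description.
    """
    if score > first:
        return "pass"            # above the first value: appearance matches
    if score > second:
        return "needs review"    # between first and second: operator check
    if score > third:
        return "fail"            # between second and third: mismatch/damage
    return "fail"                # at or below the third value
```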
  • To obtain the grade, the method includes selecting, by the processor, a region of interest in each of the plurality of images of the test object. In some embodiments, the region of interest includes the test object having a background removed. In some embodiments, for the plurality of regions of interest as selected, the method includes comparing, by the processor, each region of interest with a corresponding profile image and identifying defects in each region of interest. In some embodiments, the corresponding profile image is determined from the image of the barcode on the test object. In some embodiments, the method includes grading, by the processor, a cosmetic appearance of each region of interest based on the identified defects. In some embodiments, the method includes storing the grades of the cosmetic appearance for each region of interest.
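  • One minimal sketch of this grading loop, assuming grayscale images represented as nested lists of pixel intensities, follows. The function names, the fixed crop, the pixel tolerance, and the letter-grade thresholds are all illustrative assumptions, not the disclosed implementation, which may instead use learned background segmentation and a different defect model.

```python
# Illustrative per-image grading: crop a region of interest (background
# removed), diff it against the profile image, count defect pixels, and map
# the defect ratio to a grade. All thresholds here are assumptions.

def select_roi(image, top, left, height, width):
    """Crop the test object out of the frame, discarding the background."""
    return [row[left:left + width] for row in image[top:top + height]]

def identify_defects(roi, profile, tol=10):
    """Count pixels that differ from the profile by more than `tol`."""
    return sum(
        1
        for r_row, p_row in zip(roi, profile)
        for r, p in zip(r_row, p_row)
        if abs(r - p) > tol
    )

def grade_roi(roi, profile, tol=10):
    """Grade one region of interest by its defect-pixel ratio."""
    total = len(roi) * len(roi[0])
    ratio = identify_defects(roi, profile, tol) / total
    if ratio < 0.01:
        return "A"   # essentially pristine
    if ratio < 0.05:
        return "B"   # minor cosmetic wear
    return "C"       # visible damage
```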
  • At block 210, the output is received by the system 100 and displayed on the display 104 of the system 100.
  • FIG. 7 shows a flowchart of a method 250, according to some embodiments. In some embodiments, the method 250 can be representative of a cosmetic grading service being accessed via a computer device such as a smartphone or the like. In some embodiments, the computer device can be utilized to review aesthetics of test objects in any environment accessible by a user, so long as the server providing the cosmetic grading service has received some profile images of the test object against which the images received from the computer device can be compared.
  • At block 252, a plurality of images of a test object are captured. The user can orient the computer device to capture multiple views of the test object. In some embodiments, the test object can have a barcode (e.g., a QR code or other machine readable code) on a surface of the test object that is captured by the camera of the computer device.
  • At block 254, the captured images are transmitted to a server device to be compared against profile images that are retrieved using the captured barcode.
  • At block 256, an output is generated by the server device indicating the results of the validation (e.g., pass, fail, needs review). The output can be based on the range into which the matching score falls: if the matching score is greater than a first value, the test object passes; if it is between the first value and a lower second value, the test object may need checking (e.g., by an operator); and if it is between the second value and a still lower third value, the test object may fail. A failure can mean, for example, that the test object is damaged such that its appearance does not match the expected appearance.
  • To obtain the grade, the method includes selecting, by the processor, a region of interest in each of the plurality of images of the test object. In some embodiments, the region of interest includes the test object having a background removed. In some embodiments, for the plurality of regions of interest as selected, the method includes comparing, by the processor, each region of interest with a corresponding profile image and identifying defects in each region of interest. In some embodiments, the corresponding profile image is determined from the image of the barcode on the test object. In some embodiments, the method includes grading, by the processor, a cosmetic appearance of each region of interest based on the identified defects. In some embodiments, the method includes storing the grades of the cosmetic appearance for each region of interest.
  • At block 258, the output is received by the computer device and displayed on the display of the computer device.
  • The terminology used herein is intended to describe embodiments and is not intended to be limiting. The terms “a,” “an,” and “the” include the plural forms as well, unless clearly indicated otherwise. The terms “comprises” and/or “comprising,” when used in this Specification, specify the presence of the stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, and/or components.
  • It is to be understood that changes may be made in detail, especially in matters of the construction materials employed and the shape, size, and arrangement of parts without departing from the scope of the present disclosure. This Specification and the embodiments described are examples, with the true scope and spirit of the disclosure being indicated by the claims that follow.

Claims (20)

What is claimed is:
1. A method, comprising:
receiving, by a processor, a plurality of images of a test object, the plurality of images including a plurality of surfaces of the test object;
receiving, by the processor, an image of a barcode on the test object;
selecting, by the processor, a region of interest in each of the plurality of images of the test object, the region of interest comprising the test object having a background removed;
for the plurality of regions of interest as selected, comparing, by the processor, each region of interest with a corresponding profile image and identifying defects in each region of interest,
wherein the corresponding profile image is determined from the image of the barcode on the test object;
grading, by the processor, a cosmetic appearance of each region of interest based on the identified defects; and
storing the grades of the cosmetic appearance for each region of interest.
2. The method of claim 1, comprising sending the grades of the cosmetic appearance for each region of interest to a remote device.
3. The method of claim 1, wherein the plurality of images of the test object are received from a remote device, wherein the remote device comprises a camera configured to capture the plurality of images.
4. The method of claim 3, wherein the remote device is a cosmetic inspection device.
5. The method of claim 3, wherein the remote device is a mobile device.
6. The method of claim 1, wherein the barcode is a QR code.
7. The method of claim 1, comprising aligning the plurality of images with the corresponding profile images.
8. The method of claim 1, wherein the test object is a mobile device; comprising determining a value of the mobile device based on the grades of the cosmetic appearance for each region of interest.
9. The method of claim 1, wherein the test object is a mobile device; wherein the barcode is a QR code displayed on a display of the mobile device.
10. A system, comprising:
a server device comprising a processor and a memory, wherein the processor of the server device is configured to:
receive a plurality of images of a test object, the plurality of images including a plurality of surfaces of the test object;
receive an image of a barcode on the test object;
select a region of interest in each of the plurality of images of the test object, the region of interest comprising the test object having a background removed; and
for the plurality of regions of interest as selected, the processor is configured to compare each region of interest with a corresponding profile image and identify defects in each region of interest,
wherein the corresponding profile image is determined from the image of the barcode on the test object;
grade, by the processor, a cosmetic appearance of each region of interest based on the identified defects; and
store the grades of the cosmetic appearance for each region of interest.
11. The system of claim 10, comprising sending the grades of the cosmetic appearance for each region of interest to a remote device over a network.
12. The system of claim 10, wherein the plurality of images of the test object are received from a remote device over a network, wherein the remote device comprises a camera configured to capture the plurality of images.
13. The system of claim 12, wherein the remote device is a cosmetic inspection device.
14. The system of claim 12, wherein the remote device is a mobile device.
15. The system of claim 10, wherein the barcode is a QR code.
16. The system of claim 10, wherein the processor is configured to align the plurality of images with the corresponding profile images.
17. The system of claim 10, wherein the test object is a mobile device; wherein the processor is configured to determine a value of the mobile device based on the grades of the cosmetic appearance for each region of interest.
18. The system of claim 10, wherein the test object is a mobile device; wherein the barcode is a QR code displayed on a display of the mobile device.
19. A non-transitory computer-readable storage medium comprising instructions that, when executed by a processor, cause the processor to perform a method, comprising:
receiving, by a processor, a plurality of images of a test object, the plurality of images including a plurality of surfaces of the test object;
receiving, by the processor, an image of a barcode on the test object;
selecting, by the processor, a region of interest in each of the plurality of images of the test object, the region of interest comprising the test object having a background removed;
for the plurality of regions of interest as selected, comparing, by the processor, each region of interest with a corresponding profile image and identifying defects in each region of interest,
wherein the corresponding profile image is determined from the image of the barcode on the test object;
grading, by the processor, a cosmetic appearance of each region of interest based on the identified defects; and
storing the grades of the cosmetic appearance for each region of interest.
20. The non-transitory computer readable storage medium of claim 19, comprising sending the grades of the cosmetic appearance for each region of interest to a remote device over a network.
US17/525,643 2021-11-12 2021-11-12 Grading cosmetic appearance of a test object Pending US20230153986A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/525,643 US20230153986A1 (en) 2021-11-12 2021-11-12 Grading cosmetic appearance of a test object


Publications (1)

Publication Number Publication Date
US20230153986A1 true US20230153986A1 (en) 2023-05-18

Family

ID=86323746

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/525,643 Pending US20230153986A1 (en) 2021-11-12 2021-11-12 Grading cosmetic appearance of a test object

Country Status (1)

Country Link
US (1) US20230153986A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230011330A1 (en) * 2021-07-09 2023-01-12 At&T Intellectual Property I, L.P. Device condition determination

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020186877A1 (en) * 1997-10-09 2002-12-12 Vilella Joseph L. Electronic assembly video inspection system
US20150022657A1 (en) * 2013-07-19 2015-01-22 Hon Hai Precision Industry Co., Ltd. Electronic device and method for detecting surface flaw of object
US20170004617A1 (en) * 2015-06-30 2017-01-05 Hon Hai Precision Industry Co., Ltd Electronic device and mehod for capturing multi-aspect images using the electronic device
US20190066199A1 (en) * 2016-12-22 2019-02-28 Capital One Services, Llc Systems and methods for virtual fittings
US20190258225A1 (en) * 2017-11-17 2019-08-22 Kodak Alaris Inc. Automated 360-degree dense point object inspection
US20200265487A1 (en) * 2019-02-18 2020-08-20 Ecoatm, Llc Neural network based physical condition evaluation of electronic devices, and associated systems and methods
US20200357106A1 (en) * 2019-05-09 2020-11-12 Hon Hai Precision Industry Co., Ltd. Method for detecting defects, electronic device, and computer readable medium
US20210116392A1 (en) * 2019-10-22 2021-04-22 Blancco Technology Group IP Oy System and method for mobile device display and housing diagnostics
US11243513B2 (en) * 2019-05-09 2022-02-08 Micron Technology, Inc. Controlling transport of physical objects based on scanning of encoded images
US20220051507A1 (en) * 2020-08-17 2022-02-17 Ecoatm, Llc Kiosk for evaluating and purchasing used electronic devices
US20220092756A1 (en) * 2020-09-21 2022-03-24 International Business Machines Corporation Feature detection based on neural networks
US20220114854A1 (en) * 2018-12-19 2022-04-14 Ecoatm, Llc Systems and methods for vending and/or purchasing mobile phones and other electronic devices
US20220289403A1 (en) * 2021-03-10 2022-09-15 The Boeing Company System and method for automated surface anomaly detection
US11526932B2 (en) * 2008-10-02 2022-12-13 Ecoatm, Llc Kiosks for evaluating and purchasing used electronic devices and related technology




Legal Events

AS (Assignment): Owner: FUTURE DIAL, INC., CALIFORNIA. ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: LI, JISHENG; ZHANG, QING; SIGNING DATES FROM 20211103 TO 20211108; REEL/FRAME: 058103/0034
STPP (patent application and granting procedure in general): DOCKETED NEW CASE - READY FOR EXAMINATION
STPP: NON FINAL ACTION MAILED
STPP: NON FINAL ACTION MAILED
STPP: FINAL REJECTION MAILED
STPP: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER
STPP: ADVISORY ACTION MAILED
STPP: DOCKETED NEW CASE - READY FOR EXAMINATION
STCT (administrative procedure adjustment): PROSECUTION SUSPENDED
STPP: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
STPP: ALLOWED -- NOTICE OF ALLOWANCE NOT YET MAILED
STPP: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS