US20210201469A1 - System and method for quantitative image quality assessment for photogrammetry - Google Patents
System and method for quantitative image quality assessment for photogrammetry
- Publication number
- US20210201469A1 (Application US17/203,929; US202117203929A)
- Authority
- US
- United States
- Prior art keywords
- image quality
- environment
- photogrammetry
- user
- texture features
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
  - G06—COMPUTING; CALCULATING OR COUNTING
    - G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
      - G06T7/00—Image analysis
        - G06T7/0002—Inspection of images, e.g. flaw detection
- G06K9/00671
- G06K9/036
- G—PHYSICS
  - G06—COMPUTING; CALCULATING OR COUNTING
    - G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
      - G06V10/00—Arrangements for image or video recognition or understanding
        - G06V10/40—Extraction of image or video features
          - G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
- G—PHYSICS
  - G06—COMPUTING; CALCULATING OR COUNTING
    - G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
      - G06V10/00—Arrangements for image or video recognition or understanding
        - G06V10/98—Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns
          - G06V10/993—Evaluation of the quality of the acquired pattern
- G—PHYSICS
  - G06—COMPUTING; CALCULATING OR COUNTING
    - G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
      - G06V20/00—Scenes; Scene-specific elements
        - G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
- G—PHYSICS
  - G06—COMPUTING; CALCULATING OR COUNTING
    - G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
      - G06T2207/00—Indexing scheme for image analysis or image enhancement
        - G06T2207/30—Subject of image; Context of image processing
          - G06T2207/30168—Image quality inspection
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Quality & Reliability (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Image Processing (AREA)
Abstract
Description
- This is a continuation-in-part patent application claiming priority to U.S. non-provisional patent application Ser. No. 16/560,823, filed on Sep. 4, 2019. This present patent application draws priority from the referenced patent application. The entire disclosure of the referenced patent application is considered part of the disclosure of the present application and is hereby incorporated by reference herein in its entirety.
- A portion of the disclosure of this patent document contains material that is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent files or records, but otherwise reserves all copyright rights whatsoever. The following notice applies to the disclosure provided herein and to the drawings that form a part of this document: Copyright 2018-2021 PhotoGAUGE, Inc., All Rights Reserved.
- This patent application relates to computer-implemented software systems, metrology systems, photogrammetry-based systems, and automatic visual measurement or inspection systems, according to example embodiments, and more specifically to a system and method for quantitative image quality assessment for photogrammetry.
- Photogrammetry refers to the science of making measurements from photographs. The input to photogrammetry is photographs or images, and the output is typically a map, a drawing, a measurement, or a three-dimensional (3D) model of some real-world object or scene. Photographs or images for photogrammetry can be obtained from a variety of sources, including using aircraft (aerial photogrammetry), cameras on tripods, or even cameras held by hand (terrestrial or close-range photogrammetry).
- Regardless of how the photographs or images are taken or obtained, photogrammetry involves computing the relative poses of the camera(s) that acquired the photographs and then computing the accurate 3D geometry of the real-world scene or object being photographed. In order to accurately compute the camera poses, one of the common approaches taken is to first identify unique surface features in each image and then match corresponding features across images. For robust matching of images, and therefore accurate identification of camera poses, one needs a large number (typically thousands or more) of strong surface features from each image. Surfaces with a high density of surface features lead to smooth and dense 3D reconstructions of the object and accurate measurements. In contrast, poor surface texture leads to rough surface modeling, missing regions, and low-quality measurements.
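- To make the role of feature density concrete, the following sketch counts cross-matched keypoints between two views of an object. It is purely illustrative and not part of the disclosure: it assumes OpenCV's ORB detector, hypothetical image paths, and an example threshold of 1,000 matches.

```python
# Illustrative sketch: estimate whether two views of an object share enough strong
# surface features for robust photogrammetric pose estimation. Assumes OpenCV is installed.
import cv2

def count_matched_features(path_a: str, path_b: str, min_matches: int = 1000):
    img_a = cv2.imread(path_a, cv2.IMREAD_GRAYSCALE)
    img_b = cv2.imread(path_b, cv2.IMREAD_GRAYSCALE)

    orb = cv2.ORB_create(nfeatures=5000)              # detect up to 5000 keypoints per image
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)

    if des_a is None or des_b is None:                # texture-less or shiny surface: nothing to match
        return 0, False

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_a, des_b)
    return len(matches), len(matches) >= min_matches

# Example usage: n, enough = count_matched_features("view_01.jpg", "view_02.jpg")
```

- A low match count from such a check is the same symptom described above: too few strong surface features for accurate camera pose recovery.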
- Surface features, in turn, strongly depend on the visual texture, such as the shape, tone, and color depth of the objects in the images. The richer the texture captured in the images, the more features that can be detected. Visual texture is affected by the true nature of the object surface, but also external conditions such as lighting, shadows, etc.
- Thus, the success of photogrammetry depends critically on the texture of the imaged objects in the photographs or images. However, existing photogrammetry tools do not offer a way to quantify the quality of the surface texture of an object before acquiring photographs of the object. Existing photogrammetry tools cannot determine if the images of an object do not possess adequate texture for successful photogrammetry. Poor surface texture of an object is often only discovered from the poor quality of the photogrammetry results rather than prior to image capture when suitable intervention could be attempted to improve the surface texture of the object for imaging (e.g., by changing lighting, spraying a light coat of powder or paint on the object, etc.).
- In various example embodiments described herein, a system and method for quantitative image quality assessment for photogrammetry are disclosed. In the various example embodiments described herein, an image quality assessment tool is provided to address the shortcomings of the conventional photogrammetry tools as described above. The image quality assessment tool of various example embodiments can be advantageously used wherever image characteristics, such as texture, brightness, and contrast, must be quantitatively assessed.
- An example embodiment as disclosed herein includes a system and process that can assess the spatial and frequency content of an image, compute one or more image quality metrics, and perform other mathematical analysis to assess whether the metrics meet the desired image quality requirements.
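- As one way to picture this kind of analysis, the sketch below scores an image tile-by-tile using a spatial measure (local contrast) and a frequency measure (share of high-frequency energy from a 2D FFT). It is a minimal illustration under assumed tile sizes and weights, not the specific metrics of this disclosure.

```python
# Minimal per-tile image quality scoring combining spatial and frequency content.
# The tile size, the 8x8 low-frequency window, and the 50/50 weighting are assumptions.
import numpy as np

def tile_quality_metrics(gray: np.ndarray, tile: int = 64) -> np.ndarray:
    """Return a (rows, cols) grid of texture-quality scores in [0, 1] for a grayscale image."""
    h, w = gray.shape
    rows, cols = h // tile, w // tile
    scores = np.zeros((rows, cols), dtype=np.float32)
    for r in range(rows):
        for c in range(cols):
            patch = gray[r * tile:(r + 1) * tile, c * tile:(c + 1) * tile].astype(np.float32)
            spatial = min(patch.std() / 64.0, 1.0)              # local contrast, normalized
            spectrum = np.abs(np.fft.fftshift(np.fft.fft2(patch)))
            cy, cx = tile // 2, tile // 2
            low = spectrum[cy - 4:cy + 4, cx - 4:cx + 4].sum()  # low-frequency (smooth) energy
            freq = 1.0 - low / (spectrum.sum() + 1e-6)          # share of high-frequency detail
            scores[r, c] = 0.5 * spatial + 0.5 * freq
    return scores
```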
- The output of an example embodiment can include, but is not limited to, a Pass/Fail determination, which can include a quantitative map of the image quality metrics overlaid on the image thereby showing areas of the image that have good, marginal, or poor surface texture. The output of an example embodiment can also include, but is not limited to, an identification of shadows and highlights, and recommendations to the user for improving texture in areas where the imaged texture is poor (e.g., “Please speckle this area lightly with white powder to improve texture”).
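- A correspondingly simple way to turn such scores into a Pass/Fail result, shadow/highlight flags, and a user recommendation is sketched below. The thresholds (a 0.4 quality score, 40/220 intensity levels, 90% coverage) are illustrative assumptions; the disclosure does not fix these values.

```python
# Sketch: derive Pass/Fail, shadow/highlight regions, and a texture recommendation
# from a grayscale image and the per-tile scores computed above. All thresholds are
# example values, not values taken from the patent.
import numpy as np

def assess_image(gray: np.ndarray, scores: np.ndarray, tile: int = 64, q_thresh: float = 0.4) -> dict:
    report = {"pass": bool((scores >= q_thresh).mean() >= 0.9),
              "poor": [], "shadow": [], "highlight": []}
    rows, cols = scores.shape
    for r in range(rows):
        for c in range(cols):
            patch = gray[r * tile:(r + 1) * tile, c * tile:(c + 1) * tile]
            if scores[r, c] < q_thresh:
                report["poor"].append((r, c))
            if patch.mean() < 40:                      # likely shadow
                report["shadow"].append((r, c))
            elif patch.mean() > 220:                   # likely highlight or glare
                report["highlight"].append((r, c))
    if report["poor"]:
        report["recommendation"] = (f"{len(report['poor'])} region(s) have weak texture. "
                                    "Please speckle these areas lightly (e.g., with powder or paint) "
                                    "or adjust lighting before capturing photogrammetry images.")
    else:
        report["recommendation"] = "Texture appears adequate for photogrammetry."
    return report
```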
- The image quality assessment tool of the various example embodiments can be used with images of any resolution obtained from any type of camera or imaging device that produces an RGB or grayscale image. The image quality assessment tool of the various example embodiments can be deployed on any computational device including a smartphone, a personal computer or other client device, and also on a cloud computing platform. The image quality assessment tool of the various example embodiments can be used on objects of any size, shape, or material. Details of the various example embodiments are provided below.
- The various embodiments are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings in which:
- FIG. 1a illustrates a sample object with a shiny surface for which the photogrammetry 3D reconstruction fails completely (i.e., no geometry output) because of the lack of surface texture features on the object;
- FIG. 1b illustrates the same sample object of FIG. 1a with its surface textured using black and white paint, providing a dense array of features;
- FIG. 1c illustrates a result of photogrammetry processing of the same sample object of FIG. 1b with the strong surface texture features leading to a smooth and dense 3D reconstruction or modeling of the object;
- FIG. 2a illustrates a sample object with a dull, but texture-less surface having a lack of surface texture features on the object;
- FIG. 2b illustrates the same sample object of FIG. 2a showing the insufficient density of surface texture features;
- FIG. 2c illustrates the poor 3D reconstruction of the same sample object of FIG. 2a showing holes and rough surfaces, because of the insufficient density of surface texture features on the object;
- FIG. 3a illustrates a result of an example embodiment processing a sample object with weak (shiny) surface texture features, the result including an image quality map indicating low image quality values corresponding to shiny portions of the imaged object;
- FIG. 3b illustrates a result of an example embodiment processing a sample object with satisfactory surface texture features, the result including an image quality map indicating high image quality values corresponding to portions of the imaged object with good surface texture features;
- FIG. 4a illustrates a sample object with a sufficiently textured surface and having a sufficient quantity and arrangement of surface texture features on the object;
- FIG. 4b illustrates a result of an example embodiment processing the sample object of FIG. 4a with satisfactory surface texture features, the result including an image quality threshold overlay indicating high image quality values corresponding to portions of the imaged object with good surface texture features, the image quality threshold overlay showing regions of the object image having an image quality exceeding a predetermined image quality threshold;
- FIG. 5 is a structure diagram that illustrates example embodiments of systems as described herein;
- FIG. 6 is a processing flow diagram that illustrates example embodiments of methods as described herein; and
- FIG. 7 shows a diagrammatic representation of a machine in the example form of a computer system within which a set of instructions when executed may cause the machine to perform any one or more of the methodologies discussed herein.
- In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the various embodiments. It will be evident, however, to one of ordinary skill in the art that the various embodiments may be practiced without these specific details.
- In various example embodiments described herein, a system and method for quantitative image quality assessment for photogrammetry are disclosed. In the various example embodiments described herein, an image quality assessment system can be implemented on a computing platform, such as the computing platform described below in connection with FIG. 7. Additionally, the image quality assessment system of an example embodiment can be implemented with an imaging system or imaging capability to capture quality assessment images of an object or an environment for which photogrammetry is to be performed. However, an imaging system or imaging capability is not a required part of the image quality assessment system as the image quality assessment system can use images of an object or environment that are captured independently or separately from the image quality assessment system.
- As described in more detail below, the image quality assessment system of an example embodiment can receive or capture quality assessment images of an object or an environment for which photogrammetry is to be performed. The image quality assessment system can process the quality assessment images to determine the sufficiency and coverage of surface texture features of objects or environments depicted in the quality assessment images. A lack of surface texture features and/or their insufficient coverage depicted in the quality assessment images is indicative of the likelihood of a poor result if images of the object or environment are used for photogrammetry. In contrast, a sufficiency and broad coverage of surface texture features of objects or environments depicted in the quality assessment images is indicative of the likelihood of a satisfactory result if images of the object or environment are used for photogrammetry. Upon completion of the processing of the quality assessment images by the image quality assessment system of an example embodiment, the image quality assessment system can provide at least three types of output or assessment results.
- Firstly, the image quality assessment system can generate an image quality map indicating image quality values corresponding to regions of the surface of the object or environment that have satisfactory or unsatisfactory texture features for photogrammetry. In this manner, a user can see a visual representation of the object or environment and the image quality values corresponding to particular portions of the object or environment that need improved texture features for satisfactory photogrammetry.
- Secondly, the image quality assessment system of an example embodiment can generate instructions or prompts for the user, which can direct the user to perform actions with respect to the object or environment that will effect the needed improvements of the texture features for satisfactory photogrammetry.
- Thirdly, the image quality assessment system can generate an image quality threshold overlay indicating regions of the surface of the object or environment that have satisfactory or unsatisfactory texture features for photogrammetry. The image quality threshold overlay is configured to be overlaid on the object image so the surface texture quality of particular regions of the object is clearly visible. In this manner, a user can see a visual representation of the object or environment and the particular portions of the object or environment that need improved texture features for satisfactory photogrammetry. The processing performed and the outputs produced by the image quality assessment system are described in more detail below. However, it is important to first describe the problems encountered when objects or environments have poor surface texture features and how this poor surface texture can affect photogrammetry and related 3D reconstruction or modeling.
- FIG. 1a illustrates a sample object with a shiny surface for which the photogrammetry 3D reconstruction fails completely (i.e., no geometry output) because of the lack of surface texture features on the object. Shiny surfaces on objects do not provide texture features that sufficiently define the surface of the object. As a result, photogrammetry processing will be unable to properly model the object.
- FIG. 1b illustrates the same sample object of FIG. 1a with its surface textured using black and white paint, providing a dense array of surface texture features. The dense array of surface texture features on the object sufficiently defines the surface of the object. As a result, photogrammetry processing can satisfactorily generate a related 3D reconstruction or modeling of the object.
- FIG. 1c illustrates a result of photogrammetry processing of the same sample object of FIG. 1b with the strong surface texture features leading to a smooth and dense 3D reconstruction or modeling of the object. Because the capture of images and photogrammetry processing of an object or environment can be expensive or time-consuming, it is beneficial to determine prior to image capture and photogrammetry processing if the object or environment being processed has an acceptable level and coverage of surface texture features for photogrammetry. The image quality assessment system of an example embodiment provides this determination.
- FIG. 2a illustrates a sample object with a dull, but texture-less surface having a lack of surface texture features on the object. As with the shiny object shown in FIG. 1a and described above, the dull and texture-less object shown in FIG. 2a does not provide surface texture features that sufficiently define the surface of the object. As a result, photogrammetry processing will be unable to properly model the object.
- FIG. 2b illustrates the same sample object of FIG. 2a showing the insufficient density of surface texture features, thereby rendering the object a poor candidate for photogrammetry processing.
- FIG. 2c illustrates the poor 3D reconstruction of the same sample object of FIGS. 2a and 2b showing holes and rough surfaces, because of the insufficient density of surface texture features on the object. The image quality assessment system of an example embodiment provides a solution to avoid these unsatisfactory photogrammetry processing results.
- FIG. 3a illustrates a result of the processing performed by the image quality assessment system of an example embodiment. In the example shown, the image quality assessment system has processed a sample object with weak (shiny) surface texture features. The image quality assessment system has produced a result including an image quality map 20 indicating low image quality values corresponding to shiny portions of the imaged object.
- FIG. 3b illustrates a result of the processing performed by the image quality assessment system of an example embodiment. In the example shown, the image quality assessment system has processed a sample object with satisfactory surface texture features. The image quality assessment system has produced a result including an image quality map 21 indicating high image quality values corresponding to portions of the imaged object with good surface texture features. The image quality map 21 produced by the image quality assessment system enables a user to see a visual representation of the object or environment and the particular portions of the object or environment that need improved texture features for satisfactory photogrammetry.
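- A quality map of the kind shown in FIGS. 3a and 3b can be rendered by colorizing per-region scores and blending them onto the photograph. The sketch below is one possible rendering, assuming OpenCV, a BGR input image, and the hypothetical score grid from the earlier sketches; the colormap and blend weight are arbitrary choices, not requirements of the disclosure.

```python
# Sketch: render per-tile quality scores as a heatmap overlaid on the photograph.
import cv2
import numpy as np

def render_quality_map(bgr: np.ndarray, scores: np.ndarray, alpha: float = 0.45) -> np.ndarray:
    h, w = bgr.shape[:2]
    up = cv2.resize(scores, (w, h), interpolation=cv2.INTER_LINEAR)    # tile grid -> image size
    heat = cv2.applyColorMap((np.clip(up, 0, 1) * 255).astype(np.uint8), cv2.COLORMAP_JET)
    return cv2.addWeighted(heat, alpha, bgr, 1.0 - alpha, 0)           # blend map onto the photo
```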
- FIG. 4a illustrates a sample object with a sufficiently textured surface and having a sufficient quantity and arrangement of surface texture features on the object.
- FIG. 4b illustrates a result of the processing performed by the image quality assessment system of an example embodiment, where processing is performed on the sample object of FIG. 4a with satisfactory surface texture features. In the example of FIG. 4b, the image quality assessment system has produced a result including an image quality threshold overlay 22 indicating high image quality values corresponding to portions of the imaged object with good surface texture features. The image quality threshold overlay 22 can show particular regions of the object having a surface texture quality exceeding a predetermined image quality threshold. The image quality threshold overlay 22 is also configured to be overlaid on the object image so the surface texture quality of particular regions of the object is clearly visible.
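- In the same spirit, an image quality threshold overlay like the one in FIG. 4b can be approximated by masking regions whose score exceeds a chosen threshold and tinting them on the photograph. The 0.6 threshold and green tint below are assumptions for illustration, not values from the disclosure.

```python
# Sketch: highlight regions whose texture-quality score exceeds a predetermined threshold.
import cv2
import numpy as np

def render_threshold_overlay(bgr: np.ndarray, scores: np.ndarray, threshold: float = 0.6) -> np.ndarray:
    h, w = bgr.shape[:2]
    up = cv2.resize(scores, (w, h), interpolation=cv2.INTER_NEAREST)
    mask = up >= threshold                                   # True where texture quality passes
    out = bgr.copy()
    out[mask] = (0.5 * out[mask] + 0.5 * np.array([0, 255, 0])).astype(np.uint8)  # green tint
    return out
```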
- FIG. 5 is a structure diagram that illustrates example embodiments of systems as described herein. The image quality assessment system 100 of an example embodiment can be configured as a software application executable by a data processor. The data processor can be in data communication with an image receiver configured to receive one or more quality assessment images. As shown in FIG. 5, the image quality assessment system 100 of an example embodiment can receive or capture quality assessment images of an object or an environment for which photogrammetry is to be performed. The image quality assessment system 100 can process the quality assessment images to determine the sufficiency and coverage of surface texture features of objects or environments depicted in the quality assessment images as described above. A lack of sufficiency and/or coverage of surface texture of objects or environments depicted in the quality assessment images (e.g., unsatisfactory quality) is indicative of the likelihood of a poor result if images of the object or environment are used for photogrammetry. In contrast, a sufficiency and broad coverage of surface texture features of objects or environments depicted in the quality assessment images (e.g., satisfactory quality) is indicative of the likelihood of a satisfactory result if images of the object or environment are used for photogrammetry. Upon completion of the processing of the quality assessment images by the image quality assessment system 100 of an example embodiment, the image quality assessment system 100 can provide at least three types of output or assessment results as shown in FIG. 5.
- Firstly, the image quality assessment system 100 can generate an image quality map indicating image quality values corresponding to regions of the surface of the object or environment that have satisfactory or unsatisfactory texture features for photogrammetry. In this manner, a user can see a visual representation of the object or environment and the image quality values corresponding to particular portions of the object or environment that need improved texture features for satisfactory photogrammetry.
- Secondly, the image quality assessment system 100 of an example embodiment can generate instructions or prompts for the user, which can direct the user to perform actions with respect to the object or environment that will effect the needed improvements of the texture features for satisfactory photogrammetry. For example, the image quality assessment system 100 can be configured to automatically send text messages, email messages, user interface messages, or the like to instruct the user to apply texture to particular portions of the object or environment, the particular portions having image quality values below a pre-determined threshold.
- Thirdly, the image quality assessment system 100 can generate an image quality threshold overlay, as described above, indicating regions of the surface of the object or environment that have satisfactory or unsatisfactory texture features for photogrammetry. The image quality threshold overlay is configured to be overlaid on the object image so the surface texture quality of particular regions of the object is clearly visible. In this manner, a user can see a visual representation of the object or environment and the particular portions of the object or environment that need improved texture features for satisfactory photogrammetry.
- Referring now to FIG. 6, a processing flow diagram illustrates an example embodiment of a method implemented by the example embodiments as described herein. The method 2000 of an example embodiment can be configured to: receive one or more quality assessment images via an image receiver (processing block 2010); process the quality assessment images to determine if a quality of the surface texture features of an object or environment depicted in the quality assessment images satisfies a pre-determined quality threshold, the pre-determined quality threshold corresponding to a likelihood of a satisfactory result if images of the object or environment are used for photogrammetry (processing block 2020); generate an image quality map indicating image quality values corresponding to regions of the surface of the object or environment that have satisfactory or unsatisfactory texture features for photogrammetry (processing block 2030); and generate instructions or prompts for a user, the instructions or prompts directing the user to perform actions with respect to the object or environment that will effect improvements of the texture features for satisfactory photogrammetry (processing block 2040).
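- Read as code, the four processing blocks of method 2000 can be chained roughly as in the sketch below. The helper functions are the hypothetical sketches introduced earlier in this description, not functions defined by the disclosure, and the block numbers are noted only for orientation.

```python
# Sketch: wiring blocks 2010-2040 together. Illustrative only; tile_quality_metrics,
# render_quality_map, and assess_image are the earlier hypothetical sketches.
import cv2

def method_2000(image_paths, q_thresh: float = 0.4):
    results = []
    for path in image_paths:                                    # block 2010: receive quality assessment images
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        bgr = cv2.imread(path, cv2.IMREAD_COLOR)
        scores = tile_quality_metrics(gray)                     # block 2020: assess surface texture quality
        quality_map = render_quality_map(bgr, scores)           # block 2030: image quality map
        report = assess_image(gray, scores, q_thresh=q_thresh)  # block 2040: instructions/prompts for the user
        results.append({"image": path, "quality_map": quality_map, "report": report})
    return results
```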
- FIG. 7 shows a diagrammatic representation of a machine in the example form of a mobile computing and/or communication system 700 within which a set of instructions when executed and/or processing logic when activated may cause the machine to perform any one or more of the methodologies described and/or claimed herein. In alternative embodiments, the machine operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine may be a personal computer (PC), a laptop computer, a tablet computing system, a Personal Digital Assistant (PDA), a cellular telephone, a smartphone, a web appliance, a set-top box (STB), a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) or activating processing logic that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term "machine" can also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions or processing logic to perform any one or more of the methodologies described and/or claimed herein.
- The example mobile computing and/or communication system 700 includes a data processor 702 (e.g., a System-on-a-Chip (SoC), general processing core, graphics core, and optionally other processing logic) and a memory 704, which can communicate with each other via a bus or other data transfer system 706. The mobile computing and/or communication system 700 may further include various input/output (I/O) devices and/or interfaces 710, such as a touchscreen display, an audio jack, and optionally a network interface 712. In an example embodiment, the network interface 712 can include one or more radio transceivers configured for compatibility with any one or more standard wireless and/or cellular protocols or access technologies (e.g., 2nd (2G), 2.5, 3rd (3G), 4th (4G) generation, and future generation radio access for cellular systems, Global System for Mobile communication (GSM), General Packet Radio Services (GPRS), Enhanced Data GSM Environment (EDGE), Wideband Code Division Multiple Access (WCDMA), LTE, CDMA2000, WLAN, Wireless Router (WR) mesh, and the like). Network interface 712 may also be configured for use with various other wired and/or wireless communication protocols, including TCP/IP, UDP, SIP, SMS, RTP, WAP, CDMA, TDMA, UMTS, UWB, WiFi, WiMax, Bluetooth™, IEEE 802.11x, and the like. In essence, network interface 712 may include or support virtually any wired and/or wireless communication mechanisms by which information may travel between the mobile computing and/or communication system 700 and another computing or communication system via network 714.
- The memory 704 can represent a machine-readable medium on which is stored one or more sets of instructions, software, firmware, or other processing logic (e.g., logic 708) embodying any one or more of the methodologies or functions described and/or claimed herein. The logic 708, or a portion thereof, may also reside, completely or at least partially, within the processor 702 during execution thereof by the mobile computing and/or communication system 700. As such, the memory 704 and the processor 702 may also constitute machine-readable media. The logic 708, or a portion thereof, may also be configured as processing logic or logic, at least a portion of which is partially implemented in hardware. The logic 708, or a portion thereof, may further be transmitted or received over a network 714 via the network interface 712. While the machine-readable medium of an example embodiment can be a single medium, the term "machine-readable medium" should be taken to include a single non-transitory medium or multiple non-transitory media (e.g., a centralized or distributed database, and/or associated caches and computing systems) that stores the one or more sets of instructions. The term "machine-readable medium" can also be taken to include any non-transitory medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the various embodiments, or that is capable of storing, encoding or carrying data structures utilized by or associated with such a set of instructions. The term "machine-readable medium" can accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.
- As described herein for various example embodiments, a system and method for quantitative image quality assessment for photogrammetry are disclosed. In various embodiments, a software application program is used to enable the capture and processing of images on a computing or communication system, including mobile devices. As described above, in a variety of contexts, the various example embodiments can be configured to automatically capture images of a part/object being inspected, all from the convenience of a portable electronic device, such as a smartphone. This collection of images can be processed and results can be distributed to a variety of network users. As such, the various embodiments as described herein are necessarily rooted in computer and network technology and serve to improve these technologies when applied in the manner as presently claimed. In particular, the various embodiments described herein improve the use of mobile device technology and data network technology in the context of automated object visual inspection via electronic means.
- The Abstract of the Disclosure is provided to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus, the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate embodiment.
Claims (18)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/203,929 US11810282B2 (en) | 2019-09-04 | 2021-03-17 | System and method for quantitative image quality assessment for photogrammetry |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/560,823 US11269115B2 (en) | 2019-09-04 | 2019-09-04 | Speckled calibration artifact to calibrate metrology equipment |
US17/203,929 US11810282B2 (en) | 2019-09-04 | 2021-03-17 | System and method for quantitative image quality assessment for photogrammetry |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/560,823 Continuation-In-Part US11269115B2 (en) | 2019-09-04 | 2019-09-04 | Speckled calibration artifact to calibrate metrology equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
US20210201469A1 true US20210201469A1 (en) | 2021-07-01 |
US11810282B2 US11810282B2 (en) | 2023-11-07 |
Family
ID=76546444
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/203,929 Active 2039-11-16 US11810282B2 (en) | 2019-09-04 | 2021-03-17 | System and method for quantitative image quality assessment for photogrammetry |
Country Status (1)
Country | Link |
---|---|
US (1) | US11810282B2 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI779808B (en) * | 2021-08-30 | 2022-10-01 | 宏碁股份有限公司 | Image processing method |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9609288B1 (en) * | 2015-12-31 | 2017-03-28 | Unmanned Innovation, Inc. | Unmanned aerial vehicle rooftop inspection system |
US20180336737A1 (en) * | 2017-05-17 | 2018-11-22 | Bespoke, Inc. d/b/a Topology Eyewear | Systems and methods for determining the scale of human anatomy from images |
US20190114847A1 (en) * | 2017-10-13 | 2019-04-18 | Deere & Company | Unmanned aerial vehicle (uav)-assisted worksite data acquisition |
US20200258028A1 (en) * | 2017-09-29 | 2020-08-13 | Intel Corporation | Methods and apparatus for facilitating task execution using a drone |
US20200279389A1 (en) * | 2017-11-17 | 2020-09-03 | C 3 Limited | Object measurement system |
US20220067229A1 (en) * | 2020-09-03 | 2022-03-03 | International Business Machines Corporation | Digital twin multi-dimensional model record using photogrammetry |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3110987A (en) | 1961-08-10 | 1963-11-19 | Harold E G Arneson | Sphere lapping apparatus |
US20020185053A1 (en) | 2001-05-24 | 2002-12-12 | Lu Fei | Method for calibrating nanotopographic measuring equipment |
US6850858B1 (en) | 2001-07-06 | 2005-02-01 | Dupont Photomasks, Inc. | Method and apparatus for calibrating a metrology tool |
US7361941B1 (en) | 2004-12-21 | 2008-04-22 | Kla-Tencor Technologies Corporation | Calibration standards and methods |
US7473502B1 (en) | 2007-08-03 | 2009-01-06 | International Business Machines Corporation | Imaging tool calibration artifact and method |
US7788818B1 (en) | 2007-10-02 | 2010-09-07 | Sandia Corporation | Mesoscale hybrid calibration artifact |
US7684038B1 (en) | 2008-04-04 | 2010-03-23 | Kla-Tencor Corporation | Overlay metrology target |
US8826719B2 (en) | 2010-12-16 | 2014-09-09 | Hexagon Metrology, Inc. | Machine calibration artifact |
WO2014100598A1 (en) | 2012-12-21 | 2014-06-26 | Hexagon Metrology, Inc. | Calibration artifact and method of calibrating a coordinate measuring machine |
EP3507570A1 (en) | 2016-09-01 | 2019-07-10 | Hexagon Metrology, Inc | Conformance test artifact for coordinate measuring machine |
- 2021-03-17: US application US17/203,929 filed (patent US11810282B2), status: Active
Also Published As
Publication number | Publication date |
---|---|
US11810282B2 (en) | 2023-11-07 |
Similar Documents
Publication | Title |
---|---|
US11663732B2 (en) | System and method for using images from a commodity camera for object scanning, reverse engineering, metrology, assembly, and analysis | |
US9818232B2 (en) | Color-based depth smoothing of scanned 3D model to enhance geometry in 3D printing | |
US9521391B2 (en) | Settings of a digital camera for depth map refinement | |
US10247541B2 (en) | System and method of estimating the three-dimensional size of an object for packaging or storing the object | |
US20170308736A1 (en) | Three dimensional object recognition | |
US20170103510A1 (en) | Three-dimensional object model tagging | |
JP2018124984A (en) | Method for 3d reconstruction of environment of mobile device, and corresponding computer program product and device | |
Mousavi et al. | The performance evaluation of multi-image 3D reconstruction software with different sensors | |
CN108121982B (en) | Method and device for acquiring facial single image | |
US20180300586A1 (en) | Image Quality Estimation Using a Reference Image Portion | |
WO2021056501A1 (en) | Feature point extraction method, movable platform and storage medium | |
CN115187715A (en) | Mapping method, device, equipment and storage medium | |
US20230146924A1 (en) | Neural network analysis of lfa test strips | |
US11810282B2 (en) | System and method for quantitative image quality assessment for photogrammetry | |
US10460503B2 (en) | Texturing of a three-dimensional (3D) model by UV map in-painting | |
CN112991429B (en) | Box volume measuring method, device, computer equipment and storage medium | |
CN115861156A (en) | Defect detection method, defect detection device, computer equipment and storage medium | |
CN115546379A (en) | Data processing method and device and computer equipment | |
Nurit et al. | HD-RTI: An adaptive multi-light imaging approach for the quality assessment of manufactured surfaces | |
Liu et al. | A new quality assessment and improvement system for print media | |
US20200388017A1 (en) | System, apparatus and method for facilitating inspection of a target object | |
US10529085B2 (en) | Hardware disparity evaluation for stereo matching | |
CN104156694A (en) | Method and device for identifying target object of image | |
CN105225219B (en) | Information processing method and electronic equipment | |
US20230222736A1 (en) | Methods and systems for interacting with 3d ar objects from a scene |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
 | FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
 | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
 | AS | Assignment | Owner name: PHOTOGAUGE, INC., CALIFORNIA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: SHARMA, SAMEER; XAVIER, ARUNNELSON; SUBRAMANIAN, SANKARA J.; REEL/FRAME: 058867/0760; Effective date: 20210308 |
 | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
 | STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
 | STPP | Information on status: patent application and granting procedure in general | Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED |
 | STPP | Information on status: patent application and granting procedure in general | Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
 | STCF | Information on status: patent grant | Free format text: PATENTED CASE |