WO2023121663A1 - Feature detections - Google Patents

Feature detections

Info

Publication number
WO2023121663A1
Authority
WO
WIPO (PCT)
Prior art keywords
neighborhood
feature
mesh
examples
vertex
Prior art date
Application number
PCT/US2021/064835
Other languages
French (fr)
Inventor
Yujian XU
Matthew Donald Gaubatz
Stephen Bernard Pollard
Jan Philip ALLEBACH
Robert Alan Ulichney
Original Assignee
Purdue Research Foundation
Hewlett-Packard Development Company, L.P.
Priority date
Filing date
Publication date
Application filed by Purdue Research Foundation, Hewlett-Packard Development Company, L.P. filed Critical Purdue Research Foundation
Priority to PCT/US2021/064835 priority Critical patent/WO2023121663A1/en
Publication of WO2023121663A1 publication Critical patent/WO2023121663A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/20Finite element generation, e.g. wire-frame surface description, tesselation

Definitions

  • in Equation (3) (a second-neighborhood analog of Equation (2)), CL is the second value (e.g., a second region center, a centroid of the second neighborhood, etc.), A(Fj) is an area of a face of the second set of faces, and C(Fj) is a centroid of a face of the second set of faces.
  • the apparatus may determine 106 a depth value of the vertex based on the first value and the second value.
  • a depth value is a value indicating a depth of a vertex (e.g., depth of a vertex in a local region).
  • the depth value may indicate a height (e.g., a degree of protrusion or depression) of a vertex relative to a surface of an object (e.g., a surface level).
  • a depth value (e.g., f(v)) may quantify a displacement of a vertex from the local surface.
  • the depth value may be a perpendicular distance from a vertex to a plane fitted to a neighborhood (e.g., local neighborhood).
  • determining 106 the depth value may be based on a normal of the first neighborhood and a difference between the first value and the second value.
  • the apparatus may subtract the second value from the first value and multiply the difference by a transpose of the normal.
  • the normal n may be utilized to determine (e.g., compute) a distance of the position of the second value CL (e.g., second centroid) with respect to the first value CH (e.g., first centroid) along a line in the normal direction intersecting with the first value CH (as the second value may not exactly line up with the first value in some cases, for instance).
  • the depth value may be determined 106 in accordance with Equation (4).
  • in Equation (4), f(v) is the depth value of the vertex v, nT is the transpose of the normal, CH is the first value, CL is the second value, and k(v) is a normalization factor.
  • vectors described herein (e.g., vertices, CH, CL) may be expressed as column vectors.
  • the depth value may be determined 106 in accordance with another expression
  • the depth value may be based on a normalization factor.
  • the apparatus may determine the depth value based on a normalization factor.
  • the depth value is normalized to account for non-uniform mesh structures that devote a larger number of vertices to regions with greater curvature.
  • for instance, without the normalization, a depth value at a non-feature, smoothly curved surface region may be higher than that at a feature position.
  • the normalization factor can be determined based on the inverse square root of the total sampled area.
  • the apparatus may determine the normalization factor in accordance with Equation (5).
  • the normalization factor may be simplified
  • the apparatus may determine a depth value for multiple vertices (e.g., each vertex in the 3D mesh).
  • the depth value determination (e.g., calculation) for multiple vertices may be parallelized; for instance, the apparatus may utilize parallel computing to accelerate the computation.
  • the apparatus may detect 108 whether a feature is present based on the depth value.
  • the apparatus may determine whether the vertex is a maximum (e.g., a maximum within a region, a local maximum, etc.) based on the depth value. For instance, the apparatus may compare the depth value of the vertex to other neighboring depth values. In some examples, if the vertex is a maximum (e.g., maximum within a region) and satisfies a depth value threshold, the apparatus may detect 108 that a feature is present. For instance, a maximum may represent a rapid curvature change locally.
  • the depth value may be the same for neighboring vertices, for which no maximum would be found, or the depth value may be similar for neighboring vertices and may not satisfy the depth value threshold.
  • depth values calculated in a smooth region may be relatively small after the k(v) normalization and may not satisfy the depth value threshold.
  • the depth value threshold may be set based on feature size. Examples of the depth value threshold may include 0.1, 0.25, 0.5, 1, 3, 10, 50, etc. (depending on measurement units and/or feature size, for instance). Similar approaches may be utilized to detect a minimum (e.g., depression and/or debossed feature).
  • features may be sized such that locating features of the specified size may indicate positions of valid features.
  • feature sizes may be avoided that are similar in size to other object geometries (e.g., characteristics of other non-feature parts of the 3D mesh).
  • the abstraction of selecting local extrema based on a feature parameter (e.g., size) and/or adjusting to suppress non-extrema may enhance feature detection performance.
  • some examples of the techniques described herein may perform a pseudo bandpass filter computation over the surface depth. If the 3D mesh does not have approximately equal spacing between vertices, the apparatus may control a span (e.g., physical span and/or range) of the neighborhoods (e.g., sets of vertices such as S(v, nH)) in some examples.
  • for instance, the apparatus may enforce a constraint on the vertices v' in S(v, nH) (e.g., a bound on the Euclidean distance from v) so that a neighborhood covers a controlled physical span.
  • the apparatus may pre-filter the vertices according to their depth values with a threshold T.
  • T may be determined by the size of the features (e.g., feature height).
  • the apparatus may determine candidate feature positions. For instance, the apparatus may determine whether the vertex is a candidate feature position based on the depth value.
  • a candidate feature position is a position that potentially corresponds to a feature.
  • multiple vertices may correspond to a feature.
  • the apparatus may find local extrema of f(v). For a vertex v, for instance, if f(v) is greater than calculated depths (e.g., depth values) of its neighbors (e.g., all the immediate neighbors of v), the apparatus may determine that v is a local maximum.
  • the apparatus may cluster the vertex with other candidate feature positions in response to determining that the vertex is a candidate feature position.
  • the apparatus may detect a feature position based on the cluster. For instance, the apparatus may perform post-processing on the candidate feature positions.
  • features within the same region may have similar depth values.
  • the apparatus may discard those candidate feature positions whose depth values are relatively small in comparison to depth values of other candidate feature positions in the vicinity.
  • the apparatus may determine other candidate feature position(s) v', where d(v, v') < n1, where n1 is a candidate feature position threshold (e.g., size of the vicinity). In some examples, if the depth value f(v) is relatively small in comparison to f(v') (e.g., the ratio of the depth values falls below a threshold), the apparatus may discard the candidate feature position v.
  • candidate feature positions that are relatively close to each other may represent the same feature. For instance, if two candidate feature positions v1 and v2 satisfy the cluster threshold (e.g., are within a set edge distance of each other), the apparatus may assign the candidate feature positions to a cluster. If no other candidate feature position is within the cluster threshold for v, then the apparatus may detect v as a feature position. For clustered candidate feature positions, the apparatus may calculate the average position of the clustered candidate feature positions as the detected feature position (a sketch of this post-processing follows this list).
  • Figure 2 is a block diagram illustrating examples of engines for feature detection.
  • Figure 2 illustrates a graph generation engine 240, a neighborhood determination engine 242, a depth value determination engine 244, and a feature determination engine 250.
  • the term “engine” refers to circuitry (e.g., analog or digital circuitry, a processor, such as an integrated circuit, or other circuitry, etc.) or a combination of instructions (e.g., programming such as machine- or processor-executable instructions, commands, or code such as a device driver, programming, object code, etc.) and circuitry.
  • Some examples of circuitry may include circuitry without instructions such as an application specific integrated circuit (ASIC), a Field Programmable Gate Array (FPGA), etc.
  • a combination of circuitry and instructions may include instructions hosted at circuitry (e.g., an instruction module that is stored at a processor-readable memory such as random-access memory (RAM), a hard-disk, or solid-state drive, resistive memory, or optical media such as a digital versatile disc (DVD), and/or executed or interpreted by a processor), or circuitry and instructions hosted at circuitry.
  • the graph generation engine 240 may obtain a 3D mesh 246.
  • the 3D mesh 246 may be captured by a depth sensor and/or received from another device.
  • the graph generation engine 240 may generate a graph based on a 3D mesh 246.
  • the graph generation engine 240 may generate nodes in the graph corresponding to the vertices of the 3D mesh 246 and may generate edges in the graph corresponding to edges of the faces of the 3D mesh 246.
  • the graph generation engine 240 may generate a graph based on a 3D mesh as described in relation to Figure 1 .
  • the graph may be provided to a neighborhood determination engine 242.
  • the neighborhood determination engine 242 may determine a first neighborhood and a second neighborhood based on the graph. In some examples, the neighborhood determination engine 242 may determine a first neighborhood and a second neighborhood based on vertices of the graph present within regions of volumes, where a volume or volumes may be indicated by the feature parameter 248. In some examples, the neighborhood determination engine 242 may determine a first neighborhood based on a first distance (e.g., first edge distance or other distance) from a node corresponding to a vertex. In some examples, the first distance may be based on a feature parameter 248.
  • the neighborhood determination engine 242 may determine a second neighborhood based on a second distance (e.g., second edge distance or other distance) from the node (e.g., the node corresponding to the vertex).
  • in some examples, the second distance (e.g., second edge distance) of the second neighborhood may be smaller than the first distance (e.g., first edge distance) of the first neighborhood.
  • the neighborhood determination engine 242 may determine a first neighborhood and a second neighborhood for multiple nodes of the graph (e.g., multiple vertices represented by the nodes).
  • the neighborhood determination engine 242 may determine a first neighborhood and second neighborhood for each (e.g., all) of the nodes of the graph or may determine a first neighborhood and a second neighborhood for a subset of the nodes of the graph.
  • the neighborhood determination engine 242 may determine the neighborhoods as described in relation to Figure 1. Some examples of the techniques described herein may involve breaking up a 3D mesh into local neighborhoods. Graph structures may provide the ability to disambiguate different faces of object components that are near each other. The neighborhoods (e.g., sets of vertices) may be provided to the depth value determination engine 244.
  • the depth value determination engine 244 may determine a depth value based on the first neighborhood and the second neighborhood. In some examples, the depth value determination engine 244 may determine a depth value as described in relation to Figure 1. For instance, the depth value determination engine 244 may determine a first value based on the first neighborhood and may determine a second value based on the second neighborhood. The depth value determination engine 244 may utilize the first value and the second value to determine the depth value. In some examples, the depth value determination engine 244 may determine a depth value for multiple vertices (e.g., for all vertices or a subset of vertices). The depth value(s) may be provided to the feature determination engine 250.
  • the feature determination engine 250 may detect whether a feature is present and/or may detect a feature position based on the depth value(s). In some examples, detecting whether a feature is present and/or detecting a feature position may be performed as described in relation to Figure 1. For instance, the feature determination engine 250 may determine candidate feature positions, may cluster candidate feature positions, and/or may average clustered candidate feature positions to detect feature presence and/or feature position. In some examples, the feature determination engine 250 may detect multiple features and/or feature positions.
  • the feature detection(s) and/or feature position(s) may be provided to an operation engine (not shown in Figure 2).
  • the operation engine may perform an operation or operations based on the feature detection(s) and/or feature position(s). For instance, the operation engine may register (e.g., align) a 3D mesh (e.g., 3D mesh 246) to a 3D object model (e.g., an original 3D mesh) based on the feature position(s). For instance, the operation engine may determine a correspondence between the detected feature position(s) and feature position(s) of the 3D object model. The registration may be utilized to compare the 3D mesh to the 3D object model and determine differences (e.g., manufacturing deformations) in the 3D mesh.
  • the operation engine may decode information encoded by the feature position(s). For instance, the operation engine may recognize a pattern or message encoded with the feature position(s). The decoded information may be provided to a display for presentation and/or may be sent to another device.
  • Figure 3 is a diagram illustrating an example of a portion of a 3D mesh 352.
  • the dots in Figure 3 illustrate examples of vertices of the 3D mesh 352.
  • Figure 3 also illustrates an example of a feature 354 (e.g., a bump in a surface represented by the 3D mesh 352).
  • a position of the feature 354 may be detected in accordance with some of the techniques described herein.
  • an apparatus may determine a neighborhood relative to a vertex 356.
  • vertices of the 3D mesh 352 may be represented as nodes, and a neighborhood may be determined based on an edge distance from the vertex 356.
  • a neighborhood of vertices of S(v, 7) is illustrated with the dark dots 358 of Figure 3.
  • the neighborhood may be utilized with a smaller neighborhood (e.g., S(v, 1)) to detect a position of the feature 354 as described in relation to Figure 1 .
  • FIG. 4 is a block diagram of an example of an apparatus 402 that may be utilized to detect a feature in accordance with some of the techniques described herein.
  • the apparatus 402 may be an electronic device, such as a personal computer, a server computer, a smartphone, a tablet computer, scanning device, etc.
  • the apparatus 402 may include and/or may be coupled to a processor 404 and/or a memory 406.
  • the apparatus 402 may include additional components (not shown) and/or some of the components described herein may be removed and/or modified without departing from the scope of this disclosure.
  • the processor 404 may be any of a central processing unit (CPU), a digital signal processor (DSP), a semiconductor-based microprocessor, graphics processing unit (GPU), field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), and/or other hardware device suitable for retrieval and execution of instructions stored in the memory 406.
  • the processor 404 may fetch, decode, and/or execute instructions stored in the memory 406.
  • the processor 404 may include an electronic circuit or circuits that include electronic components for performing a function or functions of the instructions.
  • the processor 404 may be implemented to perform one, some, or all of the aspects, operations, elements, etc., described in relation to one, some, or all of Figures 1-6.
  • the memory 406 is an electronic, magnetic, optical, and/or other physical storage device that contains or stores electronic data (e.g., information and/or instructions).
  • the memory 406 may be, for example, Random Access Memory (RAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), a storage device, an optical disc, and/or the like.
  • the memory 406 may be volatile and/or non-volatile memory, such as Dynamic Random Access Memory (DRAM), EEPROM, magnetoresistive random-access memory (MRAM), phase change RAM (PCRAM), memristor, flash memory, and/or the like.
  • the memory 406 may be a non-transitory tangible machine-readable storage medium (e.g., non- transitory tangible computer-readable medium), where the term “non-transitory” does not encompass transitory propagating signals.
  • the memory 406 may include multiple devices (e.g., a RAM card and a solid-state drive (SSD)).
  • the apparatus 402 may include a communication interface 424 through which the processor 404 may communicate with an external device or devices (e.g., networked device, server, smartphone, printer, etc.).
  • the apparatus 402 may be in communication with (e.g., coupled to, have a communication link with) a depth sensor.
  • the apparatus 402 may include an integrated depth sensor.
  • the communication interface 424 may include hardware and/or machine-readable instructions to enable the processor 404 to communicate with the external device or devices.
  • the communication interface 424 may enable a wired and/or wireless connection to the external device or devices.
  • the communication interface 424 may include a network interface card and/or may also include hardware and/or machine-readable instructions to enable the processor 404 to communicate with various input and/or output devices. Examples of output devices include a printer, a 3D printer, a display, etc. Examples of input devices include a keyboard, a mouse, a touch screen, etc., through which a user may input instructions and/or data into the apparatus 402.
  • the memory 406 of the apparatus 402 may store mesh data 408, feature data 410, graph generation instructions 412, neighborhood determination instructions 414, feature detection instructions 416, and/or operation instructions 418.
  • the mesh data 408 is data that indicates a 3D mesh (e.g., vertices, local edges, and/or faces of a 3D mesh).
  • the mesh data 408 may indicate or depict an object or objects with features.
  • the feature data 410 is data that indicates features of an object or objects.
  • the feature data 410 may indicate positions of detected features from the mesh data 408.
  • the graph generation instructions 412 are instructions to generate a graph based on a 3D mesh of an object, where the graph includes nodes corresponding to vertices of the 3D mesh and edges based on faces of the 3D mesh.
  • the processor 404 may execute the graph generation instructions 412 to generate a graph as described in relation to Figure 1 , Figure 2, and/or Figure 3.
  • the neighborhood determination instructions 414 are instructions to determine a first neighborhood based on a first distance (e.g., first Euclidean distance, first edge distance, etc.) from a node corresponding to a vertex, where the first distance is based on a feature parameter.
  • the processor 404 may execute the neighborhood determination instructions 414 to determine the first neighborhood as described in relation to Figure 1 , Figure 2 and/or Figure 3.
  • the neighborhood determination instructions 414 are instructions to determine a second neighborhood based on a second distance (e.g., second Euclidean distance, second edge distance, etc.) from the node.
  • the processor 404 may execute the neighborhood determination instructions 414 to determine the second neighborhood as described in relation to Figure 1 , Figure 2 and/or Figure 3.
  • the feature detection instructions 416 are instructions to determine a depth value of the vertex based on the first neighborhood and the second neighborhood.
  • the processor 404 may execute the feature detection instructions 416 to determine the depth value of the vertex as described in relation to Figure 1 , Figure 2 and/or Figure 3.
  • the processor 404 may execute the feature detection instructions 416 to determine a first region center based on the first neighborhood and to determine a second region center based on the second neighborhood, where the processor 404 is to determine the depth value based on the first region center and the second region center.
  • the processor 404 may determine the first region center (e.g., first value) and the second region center (e.g., second value) as described in relation to Figure 1 and/or Figure 2.
  • the feature detection instructions 416 are instructions to detect a feature position based on the depth value.
  • the processor 404 may execute the feature detection instructions 416 to determine the feature position as described in relation to Figure 1 , Figure 2 and/or Figure 3.
  • the operation instructions 418 are instructions to perform an operation or operations based on the feature position.
  • the processor 404 may execute the operation instructions 418 to perform an operation as described in relation to Figure 1 , Figure 2, and/or Figure 3.
  • the processor 404 may determine a manufacturing deformation of the object based on the feature position.
  • the processor 404 may utilize the feature position to register the 3D mesh with a 3D object model and determine geometric differences between the 3D mesh and the 3D object model to determine the manufacturing deformation.
  • the processor 404 may execute the operation instructions 418 to perform another operation or operations (e.g., feature decoding).
  • an element or elements of the apparatus 402 may be omitted or combined.
  • Figure 5 is a block diagram illustrating an example of a computer-readable medium 526 for feature detection.
  • the computer-readable medium 526 is a non-transitory, tangible computer-readable medium.
  • the computer-readable medium 526 may be, for example, RAM, EEPROM, a storage device, an optical disc, and the like.
  • the computer-readable medium 526 may be volatile and/or non-volatile memory, such as DRAM, EEPROM, MRAM, PCRAM, memristor, flash memory, and the like.
  • the memory 406 described in relation to Figure 4 may be an example of the computer-readable medium 526 described in relation to Figure 5.
  • the computer-readable medium 526 may include data (e.g., information and/or instructions).
  • the computer-readable medium 526 may include neighborhood determination instructions 528, depth value determination instructions 530, and/or filtering instructions 532.
  • the neighborhood determination instructions 528 may include instructions that, when executed, cause a processor of an electronic device to determine a first neighborhood of a 3D mesh, where the first neighborhood has a first size based on a feature parameter.
  • the neighborhood determination instructions 528 may also include instructions that, when executed, cause the processor of the electronic device to determine a second neighborhood of the 3D mesh, where the second neighborhood has a second size that is different from the first size.
  • the first neighborhood and the second neighborhood may be determined as described in relation to Figure 1, Figure 2, Figure 3, and/or Figure 4.
  • the depth value determination instructions 530 may include instructions that, when executed, cause the processor of the electronic device to determine a depth value of a vertex of the 3D mesh based on a first centroid of the first neighborhood and a second centroid of the second neighborhood.
  • the depth value may be determined as described in relation to Figure 1 , Figure 2, Figure 3, and/or Figure 4.
  • the filtering instructions 532 may include instructions that, when executed, cause the processor of the electronic device to determine that the vertex is a candidate feature position based on the depth value and neighboring depth values.
  • the candidate feature position may be determined as described in relation to Figure 1 , Figure 2, and/or Figure 4.
  • the processor may determine local extrema to determine candidate feature positions.
  • the filtering instructions 532 may include instructions that, when executed, cause the processor of the electronic device to determine a feature position based on the candidate feature position.
  • the feature position may be determined as described in relation to Figure 1 , Figure 2, and/or Figure 4.
  • the processor may determine isolated candidate feature positions and/or averaged clustered candidate feature positions to determine feature positions.
  • the feature position may correspond to a peak of a protruding feature of the 3D mesh.
  • Some examples of the techniques described herein may provide approaches to detect features for serialization and/or measurement of an object or objects. Some examples of the techniques described herein may be capable of detecting features in both acquired object scans (e.g., approximately uniform meshes) and 3D computer-aided design (CAD) files (e.g., non-uniform meshes). Some examples of the techniques described herein may provide flexibility to compute tunable differences in scale in the plane of a local (e.g., mean) surface. Some examples of the techniques described herein may enable detecting surface features from 3D scans without information regarding object orientation in the 3D space. Some examples of the techniques described herein may delineate between layers that are close together. Some examples of the techniques described herein may provide for feature detection in object alignment and/or deformation calculation operations.
  • FIG. 6 is a diagram illustrating an example of a 3D object model 660 and a scanned 3D mesh 662.
  • a 3D object may be manufactured based on the 3D object model 660 (e.g., original 3D mesh).
  • the 3D object model 660 may be 3D printed to produce a physical 3D object.
  • the physical 3D object may be scanned to produce the scanned 3D mesh 662.
  • the 3D object model 660 may include first features 664.
  • the scanned 3D mesh 662 may include corresponding second features 666.
  • positions of the first features 664 may be determined previously (e.g., at design time and/or when initially added to the 3D object model 660) and/or may be determined in accordance with some examples of the techniques described herein. For instance, feature detection (e.g., feature position detection) in accordance with some examples of the techniques described herein may be performed on the 3D object model 660 (e.g., original 3D mesh). In some cases, original design data (e.g., initial feature positions) may be unavailable and feature detection in accordance with some of the techniques described herein may be utilized to detect features and/or feature positions.
  • positions of the second features 666 may be determined in accordance with some examples of the techniques described herein.
  • the positions of the second features 666 may be utilized to register the scanned 3D mesh 662 with the 3D object model 660.
  • Asymmetrical features may help to provide an unambiguous registration. Registration may allow comparison between the 3D object model 660 and the scanned 3D mesh 662, which may indicate manufacturing deformations.
  • features may be positioned to encode information. The detected feature positions may be utilized to decode information (e.g., to identify an object, to serialize objects, etc.).
  • the term “and/or” may mean an item or items.
  • the phrase “A, B, and/or C” may mean any of: A (without B and C), B (without A and C), C (without A and B), A and B (without C), B and C (without A), A and C (without B), or all of A, B, and C.
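
The candidate post-processing referenced in the list above (suppressing weaker candidates in a vicinity, clustering nearby candidates, and averaging each cluster into a detected feature position) can be sketched roughly as follows. This is an illustrative sketch, not the patent's implementation; the helper edge_dist and the thresholds n1, cluster_n, and ratio are assumed names and parameters.

```python
# Sketch: post-process candidate feature positions into detected feature positions.
# Assumption: `edge_dist(u, v)` returns the graph edge distance d(u, v).
from typing import Callable

import numpy as np


def detect_feature_positions(candidates: list[int],
                             positions: np.ndarray,
                             depths: dict[int, float],
                             edge_dist: Callable[[int, int], int],
                             n1: int, cluster_n: int,
                             ratio: float = 0.5) -> list[np.ndarray]:
    # Discard candidates whose depth value is relatively small compared with
    # another candidate within edge distance n1.
    kept = [v for v in candidates
            if all(depths[v] >= ratio * depths[u]
                   for u in candidates if u != v and edge_dist(v, u) <= n1)]
    # Greedily group surviving candidates that fall within the cluster threshold.
    clusters: list[list[int]] = []
    for v in kept:
        for cluster in clusters:
            if any(edge_dist(v, u) <= cluster_n for u in cluster):
                cluster.append(v)
                break
        else:
            clusters.append([v])
    # The detected feature position is the average position of each cluster.
    return [positions[cluster].mean(axis=0) for cluster in clusters]
```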

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

Examples of methods are described herein. In some examples, a method includes determining a first value based on a first neighborhood of a vertex of a three-dimensional (3D) mesh. In some examples, the first neighborhood has a first size based on a feature parameter. In some examples, the method includes determining a second value based on a second neighborhood of the vertex of the 3D mesh. In some examples, the second neighborhood has a second size that is different from the first size. In some examples, the method includes determining a depth value of the vertex based on the first value and the second value. In some examples, the method includes detecting whether a feature is present based on the depth value.

Description

FEATURE DETECTIONS
BACKGROUND
[0001] Items may be labeled or marked. For example, items may be labeled or marked to convey information about the items. Some items may be labeled with text, characters, or symbols. For instance, labeling may be utilized to inform a person about an item, such as materials in clothing, washing directions for clothing, nutrition information for food products, prices of goods, warning labels for machinery, usage directions for pharmaceutical products, etc. In some cases, labeling may be utilized for tracking items (e.g., items for inventory tracking or purchase) or for automated procedures (e.g., sorting, shipping, etc.).
BRIEF DESCRIPTION OF THE DRAWINGS
[0002] Figure 1 is a flow diagram illustrating an example of a method for feature detection;
[0003] Figure 2 is a block diagram illustrating examples of engines for feature detection;
[0004] Figure 3 is a diagram illustrating an example of a portion of a 3D mesh;
[0005] Figure 4 is a block diagram of an example of an apparatus that may be utilized to detect a feature in accordance with some of the techniques described herein;
[0006] Figure 5 is a block diagram illustrating an example of a computer-readable medium for feature detection; and
[0007] Figure 6 is a diagram illustrating an example of a 3D object model and a scanned 3D mesh.
DETAILED DESCRIPTION
[0008] Some examples of the techniques described herein may provide feature detection for three-dimensional (3D) objects. A feature is a geometrical structure on the surface of an object. For instance, a feature may be a protrusion, bump, hill, extension, pit, depression, etc. In some examples, features may have a set parameter (e.g., size, shape, dimension(s), etc.). For instance, 3D objects may be manufactured with a feature or features. In some examples, the feature(s) may be utilized in determining object orientation, geometry measurement (e.g., manufacturing deformation), and/or information encoding, etc.
[0009] In some examples, a 3D object may be represented by a 3D mesh. A 3D mesh is data indicating the geometry of an object. For instance, a 3D mesh may represent the geometry of an object with vertices and/or polygonal faces (e.g., triangles). In some examples, a 3D mesh of an object may be generated by capturing a depth image of an object and/or scanning the object. For instance, a depth sensor (e.g., time-of-flight (ToF) camera, stereoscopic cameras, laser distance sensor, structured light scanner, etc.) may determine distances between the depth sensor and the object to produce a depth image. A depth image or images of an object may be captured, where the distances of the depth image(s) may indicate positions of an object surface in a 3D space. The positions on the object surface may be vertices of a 3D mesh. Faces of the 3D mesh (e.g., planes, triangles, etc.) may be generated between vertices to represent the object surface.
[0010] Some examples of the techniques described herein may be utilized to detect features of an object (e.g., added surface features) using a 3D mesh representing the object. The features may have a set parameter or parameters (e.g., size, scale, shape, dimension(s), etc.). Some examples of the techniques may compute per-scale differences of the surface geometry properties of a 3D mesh to detect a feature with a set parameter(s). Feature detection may be performed in object identification (e.g., object serialization with an identifier) and/or surface geometry measurement (e.g., object orientation detection, manufacturing deformation calculation, 3D mesh alignment, etc.).
[0011] Feature detection on 3D surfaces may be challenging. For instance, feature computation may be performed in the plane of the depth sensor, as opposed to the local plane of the object’s surface. For instance, the local plane of an object’s surface may not be parallel with the sensor plane, and/or may vary significantly over regions of the object. Some approaches may target scale-invariant features.
[0012] Some examples of the techniques described herein may provide an ability to locate features with deformations at a given scale (e.g., scale-specific artifacts). Some examples of the techniques described herein may provide tunable, 3D mesh-based scale-specific feature detection that conforms to the local geometry of an object to estimate displacements in local depth.
[0013] Some examples of the techniques described herein may include marking an object with robustly detectable surface features and/or providing a mechanism to perform feature detection. In some examples, a bandpass-filter operator over local surface depth is utilized to locate features with a set parameter using a 3D mesh representation for computation. Some examples may provide efficient feature detection. Some examples of the techniques may be performed in a manner capable of separating different surface regions that are in proximity with one another (e.g., the inside and outside of an object wall, or different fins on a support structure, etc.).
[0014] Some examples of the feature detection and/or locating techniques may be performed with object features with a set parameter (e.g., size, scale, dimension(s), etc.). For instance, set feature parameter(s) (e.g., size, geometry, direction with respect to the surrounding surface, etc.) may be utilized to manufacture (e.g., 3D print, mold, form, etc.) a 3D object with features. For instance, given a design for an object and feature parameter(s), a modified object may be created with surface features at a scale. Once the object is manufactured and scanned, feature detection may be performed to estimate the feature locations from the 3D mesh representing the object.
[0015] Throughout the drawings, similar reference numbers may designate similar or identical elements. When an element is referred to without a reference number, this may refer to the element generally, with and/or without limitation to any particular drawing or figure. In some examples, the drawings are not to scale and/or the size of some parts may be exaggerated to more clearly illustrate the example shown. Moreover, the drawings provide examples in accordance with the description. However, the description is not limited to the examples provided in the drawings.
[0016] Figure 1 is a flow diagram illustrating an example of a method 100 for feature detection. The method 100 and/or a method 100 element or elements may be performed by an apparatus (e.g., electronic device, computing device, scanner, server, etc.). For example, the method 100 may be performed by the apparatus 402 described in relation to Figure 4.
[0017] The apparatus may determine 102 a first value based on a first neighborhood of a vertex of a 3D mesh, where the first neighborhood has a first size based on a feature parameter. The first value is a quantity indicating a center (e.g., region center, centroid, volumetric center, etc.) of the first neighborhood. A neighborhood is a region of a 3D mesh. For instance, the neighborhood may be a portion (e.g., a set of vertices and/or faces) of the 3D mesh. A neighborhood may be determined relative to a vertex of the mesh. For instance, vertices that are within a distance from the vertex may be included in the neighborhood of the vertex. For instance, a size of a neighborhood may be indicated by a distance. In some examples, the distance and/or neighborhood size may be expressed as a radius (from the vertex, for instance), a diameter, a Euclidean distance, dimension(s), and/or edge distance.
[0018] An edge distance is a quantity of edges between nodes (of a graph, for instance) representing vertices of the 3D mesh. For example, the method 100 may include generating a graph of the 3D mesh, where the graph includes nodes generated based on vertices of the 3D mesh. In some examples, a 3D mesh M may include a set of vertices V and a set of faces F. The apparatus may generate a graph G, where each node represents a vertex of the mesh M and each face (e.g., triangular face) corresponds to three edges in G. In some examples, the edge distance between two nodes may be denoted by d(v1, v2), where v1 is a first vertex with a corresponding first node, v2 is a second vertex with a corresponding second node, and where the edge distance is the minimal quantity of edges between the nodes. In some examples, a neighborhood (e.g., the first neighborhood) around a vertex v may include all vertices v' where d(v, v') < n, where n is a positive integer. In some examples, vertices in a neighborhood may be denoted by a set S(v, n). An example of a neighborhood is given in relation to Figure 3.
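As an illustration of the graph G and the neighborhood S(v, n) described above, the following sketch (not taken from the patent; the array layout and names are assumed) builds a vertex adjacency structure from an (M, 3) array of triangular faces and collects S(v, n) with a breadth-first search over mesh edges.

```python
# Sketch: build the graph G from triangular faces and gather the set S(v, n).
# Assumed layout: `faces` is an (M, 3) integer array of vertex indices.
from collections import deque

import numpy as np


def build_adjacency(faces: np.ndarray, num_vertices: int) -> list[set[int]]:
    """Each triangular face (i, j, k) contributes three edges to the graph G."""
    adjacency = [set() for _ in range(num_vertices)]
    for i, j, k in faces.tolist():
        adjacency[i].update((j, k))
        adjacency[j].update((i, k))
        adjacency[k].update((i, j))
    return adjacency


def neighborhood(adjacency: list[set[int]], v: int, n: int) -> set[int]:
    """Return S(v, n): all vertices reachable from v within n edges."""
    visited = {v}
    frontier = deque([(v, 0)])
    while frontier:
        node, dist = frontier.popleft()
        if dist == n:
            continue
        for neighbor in adjacency[node]:
            if neighbor not in visited:
                visited.add(neighbor)
                frontier.append((neighbor, dist + 1))
    return visited


# Example: two triangles sharing an edge.
faces = np.array([[0, 1, 2], [1, 2, 3]])
adjacency = build_adjacency(faces, num_vertices=4)
print(neighborhood(adjacency, v=0, n=1))  # {0, 1, 2}
```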
[0019] In some examples, a size of a neighborhood may be based on a feature parameter. A feature parameter is a quantity that characterizes a feature. Examples of feature parameters may include feature width, feature length, radius from center, shape, bounding size, and/or dimension(s), etc. In some examples, a feature parameter may be set (e.g., established, utilized in object manufacturing to manufacture feature(s) on the surface of an object, etc.). In some examples, the apparatus may receive the feature parameter via an input device and/or may receive the feature parameter from another device (e.g., server, printer, networked device, etc.). For instance, a feature parameter may be set such that the size of the feature sought for detection is a given quantity. In some examples, the size of the neighborhood may be related to the feature parameter. For instance, the size of the neighborhood (e.g., neighborhood radius, Euclidean distance from a vertex, edge distance, etc.) may be larger than a size of a feature indicated by the feature parameter. For example, if a feature parameter indicates a radius of 2 millimeters (mm), the size of the neighborhood may be 3 mm in a radius around a vertex. In some examples, an edge distance of a neighborhood may be related to the feature size expressed by the feature parameter based on a depth sensing density and/or a capture distance (e.g., scanning distance, distance between a depth sensor and the object). For instance, a feature with a radius of 2 mm may relate to a quantity of edges (e.g., 4) based on depth sensor resolution and capture distance. In some examples, the feature parameter may indicate a quantity of edges. In some examples, a neighborhood size in edge distance (e.g., 7 edges) may be greater than a feature size (e.g., 4 edges).
[0020] In some examples, the neighborhood size may be selected to determine the position of the feature on the 3D mesh surface. In some examples, the neighborhood size may be less than a feature size, greater than a feature size, or equal to a feature size, where the depth value detected using the neighborhood size can be utilized to detect the presence of a feature. For instance, a larger neighborhood size may be utilized such that a plane fit to the corresponding neighborhood may indicate the approximate position of the base of the feature. In some examples, a neighborhood size (e.g., nH) may be selected to be larger than the feature size. For instance, the neighborhood size (e.g., distance, width, radius, edge distance, etc.) may be approximately 1.5 times the feature size. If the ratio is too large, the neighborhood may cover multiple features. If the ratio is too small, a plane fit may lead to an incorrect estimate of the base of the feature.
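One way to choose an edge-distance radius such as nH from a physical feature size is to use the mesh's mean edge length as a proxy for scan resolution and apply the roughly 1.5x ratio mentioned above. The helper below is a rough sketch under those assumptions (triangular faces, approximately uniform sampling); it is not a rule prescribed by the description.

```python
# Sketch: estimate an edge-distance neighborhood size from a physical feature
# radius, assuming the mean mesh edge length approximates the scan resolution.
import numpy as np


def edge_distance_for_feature(vertices: np.ndarray, faces: np.ndarray,
                              feature_radius: float, ratio: float = 1.5) -> int:
    """vertices: (V, 3) positions; faces: (M, 3) vertex indices; radius in mesh units."""
    edges = np.concatenate([faces[:, [0, 1]], faces[:, [1, 2]], faces[:, [2, 0]]])
    lengths = np.linalg.norm(vertices[edges[:, 0]] - vertices[edges[:, 1]], axis=1)
    return max(1, int(np.ceil(ratio * feature_radius / lengths.mean())))
```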
[0021] In some examples, the method 100 may include determining the first neighborhood based on the graph and a first distance (e.g., first Euclidean distance, first edge distance, etc.) from a node corresponding to a vertex. For instance, the apparatus may determine a set of vertices of the first neighborhood by determining nodes that are within an edge distance from the vertex (e.g., the current vertex). For a vertex v, for example, the apparatus may determine vertices S(v, nH), where nH is a first distance (e.g., first Euclidean distance, first edge distance, etc.).
[0022] In some examples, determining 102 the first value may include determining an average position of vertices of the first neighborhood. For instance, the apparatus may average the positions of the vertices of the first neighborhood to determine the average position. In some examples, the average position of the vertices may be denoted as mH and/or as a centroid of S(v, nH), e.g., $m_H = \frac{1}{|S(v, n_H)|}\sum_{v' \in S(v, n_H)} v'$.
[0023] In some examples, determining 102 the first value may include subtracting the average position from the first neighborhood to determine a matrix (e.g., covariance matrix). For instance, the apparatus may subtract the centroid from S(v, nH) to produce a modified set of vertices S'(v, nH). The apparatus may calculate a covariance matrix based on the modified set of vertices. In some examples, the covariance matrix may be a 3 x 3 matrix of the form provided in Equation (1).

$$\begin{bmatrix} \mathrm{cov}(x,x) & \mathrm{cov}(x,y) & \mathrm{cov}(x,z) \\ \mathrm{cov}(y,x) & \mathrm{cov}(y,y) & \mathrm{cov}(y,z) \\ \mathrm{cov}(z,x) & \mathrm{cov}(z,y) & \mathrm{cov}(z,z) \end{bmatrix} \quad (1)$$

In Equation (1), x, y, and z are coordinates corresponding to three dimensions. [0024] In some examples, determining 102 the first value may include determining a normal of the first neighborhood based on the matrix (e.g., covariance matrix). In some examples, the normal may be denoted by n. For instance, the apparatus may determine a normal of a fitted plane of S(v, nH) (e.g., a normal of a plane fitted to the vertices of S(v, nH)). In some examples, the normal may be determined as an eigenvector corresponding to the smallest eigenvalue of the matrix (e.g., covariance matrix). In some examples, the first value may be determined 102 based on the normal. For instance, the normal n may be a factor used to determine the first value.
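As one possible realization of the plane fit described above, the normal can be taken as the eigenvector of the neighborhood's covariance matrix paired with the smallest eigenvalue. The sketch below assumes NumPy and an (N, 3) array of vertex positions; the names are illustrative.

```python
import numpy as np

def neighborhood_normal(points):
    # points: (N, 3) array holding the positions of the vertices in S(v, nH).
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)             # average position mH
    centered = pts - centroid               # subtract the centroid from each vertex
    cov = centered.T @ centered             # 3 x 3 scatter/covariance matrix, as in Equation (1)
    eigenvalues, eigenvectors = np.linalg.eigh(cov)
    # eigh returns ascending eigenvalues for a symmetric matrix, so the first
    # eigenvector corresponds to the smallest eigenvalue: the fitted-plane normal.
    return eigenvectors[:, 0]
```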
[0025] In some examples, the method 100 may include determining a set of faces of the first neighborhood of the 3D mesh, where determining 102 the first value may be based on areas and/or centroids of the set of faces. For instance, the apparatus may determine a set of faces of the 3D mesh associated with the vertices S(v, nH - 1). The set of faces may be denoted BH. For example, BH = {Fj} for all faces Fj inside the region including vertices of S(v, nH) (e.g., associated with the vertices S(v, nH - 1)). For instance, BH may be a function of the vertices of the first neighborhood. In some examples, the apparatus may determine centroids of the faces. For instance, the centroid of a face may be denoted by C(Fj). In some examples, the apparatus may determine areas of the faces. For instance, the area of a face may be denoted by A(Fj).
[0026] In some examples, the apparatus may determine 102 the first value in accordance with Equation (2).

$$C_H = \frac{\sum_j A(F_j)\, C(F_j)}{\sum_j A(F_j)} \quad (2)$$

In Equation (2), CH is the first value (e.g., a first region center, a centroid of the first neighborhood, etc.). In some examples, the areas of the faces may be utilized to weight the corresponding centroids in the first value calculation.
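A direct way to evaluate Equation (2) is to accumulate the area-weighted face centroids over the faces of the neighborhood. The sketch below assumes triangular faces supplied as vertex-index triples; returning the total area as well is a convenience for the normalization factor discussed later, not something required by Equation (2).

```python
import numpy as np

def region_center(vertex_positions, faces):
    # vertex_positions: (V, 3) array; faces: iterable of (i, j, k) index triples
    # belonging to the neighborhood (e.g., BH or BL).
    vertex_positions = np.asarray(vertex_positions, dtype=float)
    weighted_sum = np.zeros(3)
    total_area = 0.0
    for i, j, k in faces:
        a, b, c = vertex_positions[i], vertex_positions[j], vertex_positions[k]
        area = 0.5 * np.linalg.norm(np.cross(b - a, c - a))  # triangle area A(Fj)
        centroid = (a + b + c) / 3.0                          # face centroid C(Fj)
        weighted_sum += area * centroid
        total_area += area
    return weighted_sum / total_area, total_area              # Equation (2)
```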
[0027] The apparatus may determine 104 a second value based on a second neighborhood of the vertex of the 3D mesh, where the second neighborhood has a second size that is different from the first size. The second value is a quantity indicating a center (e.g., region center, centroid, volumetric center, etc.) of the second neighborhood. In some examples, the second value may be determined similarly to the first value, with a different neighborhood size. In some examples, the size of the second neighborhood may be based on the feature parameter or may not be based on the feature parameter.
[0028] In some examples, the method 100 may include determining the second neighborhood based on the graph and a second edge distance from the node corresponding to the vertex, where the second edge distance is smaller than the first edge distance. For instance, the apparatus may determine a set of vertices of the second neighborhood by determining nodes that are within an edge distance from the vertex (e.g., the current vertex). For a vertex v, for example, the apparatus may determine vertices S(v, nL), where nL is a second edge distance. The vertices may be included in a smaller region S(v, nL), where nL < nH. For instance, nL = 1.
[0029] In some examples, the second value may be determined 104 using a similar operation or operations to determining 102 the first value. For instance, determining 104 the second value may include determining a second set of faces (e.g., BL = {Fj^L}) of the second neighborhood of the 3D mesh, determining second areas of the second set of faces, determining second centroids of the second set of faces, and/or determining the second value in accordance with Equation (3).

$$C_L = \frac{\sum_j A(F_j^L)\, C(F_j^L)}{\sum_j A(F_j^L)} \quad (3)$$

In Equation (3), CL is the second value (e.g., a second region center, a centroid of the second neighborhood, etc.), A(Fj^L) is an area of a face of the second set of faces, and C(Fj^L) is a centroid of a face of the second set of faces. In some examples, BL = {Fj^L} for all faces Fj^L inside the region including vertices of S(v, nL - 1). For instance, if nL = 1, then the edge distance is 0 and faces that include the vertex v may be included in the second set of faces BL. In some examples, a second normal associated with the second neighborhood may not be calculated.
[0030] The apparatus may determine 106 a depth value of the vertex based on the first value and the second value. A depth value is a value indicating a depth of a vertex (e.g., depth of a vertex in a local region). For instance, the depth value may indicate a height (e.g., a degree of protrusion or depression) of a vertex relative to a surface of an object (e.g., a surface level). In some examples, to find embossed and/or debossed features on a surface, a depth value (e.g., f(v)) may be assigned to a vertex or vertices. The depth value may quantify a displacement of a vertex from the local surface. For instance, the depth value may be a perpendicular distance from a vertex to a plane fitted to a neighborhood (e.g., local neighborhood). In some examples, determining 106 the depth value may be based on a normal of the first neighborhood and a difference between the first value and the second value. For instance, the apparatus may subtract the second value from the first value and multiply the difference by a transpose of the normal. For example, the normal n may be utilized to determine (e.g., compute) a distance of the position of the second value CL (e.g., second centroid) with respect to the first value CH (e.g., first centroid) along a line in the normal direction intersecting with the first value CH (as the second value may not exactly line up with the first value in some cases, for instance).
[0031] In some examples, the depth value may be determined 106 in accordance with Equation (4).

$$f(v) = k(v)\, n^{T} (C_H - C_L) \quad (4)$$

In Equation (4), f(v) is the depth value of the vertex v, n^T is the transpose of the normal, CH is the first value, CL is the second value, and k(v) is a normalization factor. In some examples, vectors described herein (e.g., vertices, CH, CL) may be expressed as column vectors or another vector representation. In some examples, the depth value may be determined 106 in accordance with another, equivalent expression.
[0032] In some examples, the depth value may be based on a normalization factor. For instance, the apparatus may determine the depth value based on a normalization factor. In some examples, the depth value is normalized to account for non-uniform mesh structures that devote a larger number of vertices to regions with greater curvature. In some approaches without the normalization, a depth value at a non-feature smoothly curved surface region may be higher than that at a feature position.
[0033] In some examples, the normalization factor can be determined based on the inverse square root of the total sampled area. For instance, the apparatus may determine the normalization factor in accordance with Equation (5).

$$k(v) = \frac{1}{\sqrt{\sum_j A(F_j)}} \quad (5)$$

In some examples, for a uniform mesh and/or in some approaches where the areas of the neighborhoods can be controlled, the normalization factor may be simplified (e.g., to a constant).
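Putting Equations (4) and (5) together, a per-vertex depth value can be computed from the two region centers, the fitted normal, and the sampled area. The following is a sketch under the assumptions of the earlier sketches; in particular, the sign of the result depends on the orientation of the normal, which the plane fit leaves unspecified.

```python
import numpy as np

def depth_value(c_high, c_low, normal, total_area):
    # c_high: first value CH (center of the larger neighborhood)
    # c_low:  second value CL (center of the smaller neighborhood)
    # normal: unit normal of the plane fitted to the larger neighborhood
    # total_area: summed face area of the larger neighborhood
    k = 1.0 / np.sqrt(total_area)                    # normalization factor, Equation (5)
    difference = np.asarray(c_high, dtype=float) - np.asarray(c_low, dtype=float)
    return k * float(np.dot(normal, difference))     # Equation (4)
```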
[0034] In some examples, the apparatus may determine a depth value for multiple vertices (e.g., each vertex in the 3D mesh). In some examples, the depth value determination (e.g., calculation) of the vertices may be accelerated by parallel computing.
[0035] The apparatus may detect 108 whether a feature is present based on the depth value. In some examples, the apparatus may determine whether the vertex is a maximum (e.g., a maximum within a region, a local maximum, etc.) based on the depth value. For instance, the apparatus may compare the depth value of the vertex to other neighboring depth values. In some examples, if the vertex is a maximum (e.g., maximum within a region) and satisfies a depth value threshold, the apparatus may detect 108 that a feature is present. For instance, a maximum may represent a rapid curvature change locally. In cases where a neighborhood is determined in a region without a feature (e.g., a smooth region), the depth value may be the same for neighboring vertices, for which no maximum would be found, or the depth value may be similar for neighboring vertices and may not satisfy the depth value threshold. For instance, depth values calculated in a smooth region may be relatively small after the k(v) normalization and may not satisfy the depth value threshold. In some examples, the depth value threshold may be set based on feature size. Examples of the depth value threshold may include 0.1 , 0.25, 0.5, 1 , 3, 10, 50, etc. (depending on measurement units and/or feature size, for instance). Similar approaches may be utilized to detect a minimum (e.g., depression and/or debossed feature). [0036] In some examples, features may be sized such that locating features of the specified size may indicate positions of valid features. In some examples, feature sizes may be avoided that are similar in size to other object geometries (e.g., characteristics of other non-feature parts of the 3D mesh). In some examples, the abstraction of selecting local extrema based on a feature parameter (e.g., size) and/or adjusting to suppress non-extrema may enhance feature detection performance.
[0037] In a 3D mesh with approximately equal spacing between the vertices, some examples of the techniques described herein may perform a pseudo bandpass filter computation over the surface depth. If the 3D mesh does not have approximately equal spacing between vertices, the apparatus may control a span (e.g., physical span and/or range) of the neighborhoods (e.g., sets of vertices such as S(v, nH)) in some examples. For example, the apparatus may enforce a constraint that u ∈ S(v, nH) implies ||u - v|| ≤ Bv, where Bv denotes a maximum distance between v and vertices in S(v, nH). In some examples, the apparatus may pre-filter the vertices according to their depth values with a threshold T (e.g., retaining vertices where |f(v)| ≥ T). The threshold may be determined by the size of the features (e.g., feature height).
[0038] In some examples, the apparatus may determine candidate feature positions. For instance, the apparatus may determine whether the vertex is a candidate feature position based on the depth value. A candidate feature position is a position that potentially corresponds to a feature. In some cases, multiple vertices may correspond to a feature. Among the remaining vertices after pre-filtering, for example, the apparatus may find local extrema of f(v). For a vertex v, for instance, if f(v) is greater than calculated depths (e.g., depth values) of its neighbors (e.g., all the immediate neighbors of v), the apparatus may determine that v is a local maximum. If f(v) is less than calculated depths (e.g., depth values) of its neighbors, the apparatus may determine that v is a local minimum. In some examples, the apparatus may determine the neighbors of v. For instance, if d(v, v') = 1, then the apparatus may determine that v' is a neighbor of v. The locations of the local extrema (e.g., maxima and/or minima) may be determined as the candidate feature positions.
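One straightforward way to realize this extremum test is to compare each pre-filtered vertex against its immediate neighbors. The sketch below assumes the depth values and the adjacency structure of the earlier sketches; the threshold argument corresponds to the pre-filtering threshold T.

```python
def candidate_feature_positions(depth, adjacency, threshold):
    # depth: sequence mapping vertex index -> depth value f(v)
    # adjacency: adjacency[v] gives the immediate neighbors of v (d(v, v') = 1)
    candidates = []
    for v, fv in enumerate(depth):
        if abs(fv) < threshold:
            continue                                  # pre-filter small depth values
        neighbor_depths = [depth[u] for u in adjacency[v]]
        if all(fv > d for d in neighbor_depths):
            candidates.append(v)                      # local maximum (e.g., embossed)
        elif all(fv < d for d in neighbor_depths):
            candidates.append(v)                      # local minimum (e.g., debossed)
    return candidates
```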
[0039] In some examples, the apparatus may cluster the vertex with other candidate feature positions in response to determining that the vertex is a candidate feature position. The apparatus may detect a feature position based on the cluster. For instance, the apparatus may perform post-processing on the candidate feature positions. In some examples, features within the same region may have similar depth values. The apparatus may discard those candidate feature positions whose depth values are relatively small in comparison to depth values of other candidate feature positions in the vicinity. In some examples, for each candidate feature position v, the apparatus may determine other candidate feature position(s) v', where d(v, v') < n1, where n1 is a candidate feature position threshold (e.g., size of the vicinity). In some examples, if the ratio of f(v) to the largest depth value among those other candidate feature positions is less than 0.5, the apparatus may discard the candidate feature position v. Candidate feature positions that are relatively close to each other (e.g., within a cluster threshold) may represent the same feature. For instance, if two candidate feature positions v1 and v2 satisfy the cluster threshold (e.g., d(v1, v2) ≤ 3), the apparatus may assign the candidate feature positions to a cluster. If no other candidate feature position is within the cluster threshold for v, then the apparatus may detect v as a feature position. For clustered candidate feature positions, the apparatus may calculate the average position of the clustered candidate feature positions as the detected feature position.
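As an illustration of the clustering step, candidates that lie within the cluster threshold (in edge distance) of an existing cluster member can be merged, and each cluster can be reduced to the average position of its members. The greedy grouping below is one possible realization, not the only one; the bounded breadth-first distance check mirrors the neighborhood traversal sketched earlier.

```python
from collections import deque
import numpy as np

def within_edge_distance(adjacency, start, goal, limit):
    # Returns True if the edge distance d(start, goal) is at most `limit`.
    if start == goal:
        return True
    distance = {start: 0}
    queue = deque([start])
    while queue:
        u = queue.popleft()
        if distance[u] == limit:
            continue
        for w in adjacency[u]:
            if w not in distance:
                if w == goal:
                    return True
                distance[w] = distance[u] + 1
                queue.append(w)
    return False

def detect_feature_positions(candidates, adjacency, vertex_positions, cluster_threshold=3):
    clusters = []
    for v in candidates:
        for cluster in clusters:
            if any(within_edge_distance(adjacency, v, u, cluster_threshold) for u in cluster):
                cluster.append(v)
                break
        else:
            clusters.append([v])  # v starts a new cluster (a lone candidate stays a feature on its own)
    # The detected feature position of each cluster is the average member position.
    return [np.mean([vertex_positions[u] for u in cluster], axis=0) for cluster in clusters]
```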
[0040] In some examples, an element or elements of the method 100 may be omitted or combined. In some examples, the method 100 may include one, some, or all of the operations, elements, etc., described in relation to any of Figures 1-6. [0041] Figure 2 is a block diagram illustrating examples of engines for feature detection. For example, Figure 2 illustrates a graph generation engine 240, a neighborhood determination engine 242, a depth value determination engine 244, and a feature determination engine 250. As used herein, the term “engine” refers to circuitry (e.g., analog or digital circuitry, a processor, such as an integrated circuit, or other circuitry, etc.) or a combination of instructions (e.g., programming such as machine- or processor-executable instructions, commands, or code such as a device driver, programming, object code, etc.) and circuitry. Some examples of circuitry may include circuitry without instructions such as an application specific integrated circuit (ASIC), a Field Programmable Gate Array (FPGA), etc. A combination of circuitry and instructions may include instructions hosted at circuitry (e.g., an instruction module that is stored at a processor-readable memory such as random-access memory (RAM), a hard-disk, or solid-state drive, resistive memory, or optical media such as a digital versatile disc (DVD), and/or executed or interpreted by a processor), or circuitry and instructions hosted at circuitry.
[0042] The graph generation engine 240 may obtain a 3D mesh 246. For instance, the 3D mesh 246 may be captured by a depth sensor and/or received from another device. The graph generation engine 240 may generate a graph based on a 3D mesh 246. For instance, the graph generation engine 240 may generate nodes in the graph corresponding to the vertices of the 3D mesh 246 and may generate edges in the graph corresponding to edges of the faces of the 3D mesh 246. In some examples, the graph generation engine 240 may generate a graph based on a 3D mesh as described in relation to Figure 1 . The graph may be provided to a neighborhood determination engine 242.
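A minimal sketch of this graph construction, assuming triangular faces supplied as vertex-index triples (the data layout is an assumption; actual mesh formats vary):

```python
def mesh_to_graph(num_vertices, faces):
    # One node per mesh vertex; an undirected edge for each edge of each
    # triangular face (i, j, k) of the 3D mesh.
    adjacency = [set() for _ in range(num_vertices)]
    for i, j, k in faces:
        adjacency[i].update((j, k))
        adjacency[j].update((i, k))
        adjacency[k].update((i, j))
    return adjacency
```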
[0043] The neighborhood determination engine 242 may determine a first neighborhood and a second neighborhood based on the graph. In some examples, the neighborhood determination engine 242 may determine a first neighborhood and a second neighborhood based on vertices of the graph present within regions of volumes, where a volume or volumes may be indicated by the feature parameter 248. In some examples, the neighborhood determination engine 242 may determine a first neighborhood based on a first distance (e.g., first edge distance or other distance) from a node corresponding to a vertex. In some examples, the first distance may be based on a feature parameter 248. The neighborhood determination engine 242 may determine a second neighborhood based on a second distance (e.g., second edge distance or other distance) from the node (e.g., the node corresponding to the vertex). The second neighborhood (e.g., second edge distance) may be smaller than the first neighborhood (e.g., first edge distance). In some examples, the neighborhood determination engine 242 may determine a first neighborhood and a second neighborhood for multiple nodes of the graph (e.g., multiple vertices represented by the nodes). In some examples, the neighborhood determination engine 242 may determine a first neighborhood and second neighborhood for each (e.g., all) of the nodes of the graph or may determine a first neighborhood and a second neighborhood for a subset of the nodes of the graph. In some examples, the neighborhood determination engine 242 may determine the neighborhoods as described in relation to Figure 1. Some examples of the techniques described herein may involve breaking up a 3D mesh into local neighborhoods. Graph structures may provide the ability to disambiguate different faces of object components that are near each other. The neighborhoods (e.g., sets of vertices) may be provided to the depth value determination engine 244.
[0044] The depth value determination engine 244 may determine a depth value based on the first neighborhood and the second neighborhood. In some examples, the depth value determination engine 244 may determine a depth value as described in relation to Figure 1. For instance, the depth value determination engine 244 may determine a first value based on the first neighborhood and may determine a second value based on the second neighborhood. The depth value determination engine 244 may utilize the first value and the second value to determine the depth value. In some examples, the depth value determination engine 244 may determine a depth value for multiple vertices (e.g., for all vertices or a subset of vertices). The depth value(s) may be provided to the feature determination engine 250. [0045] The feature determination engine 250 may detect whether a feature is present and/or may detect a feature position based on the depth value(s). In some examples, detecting whether a feature is present and/or detecting a feature position may be performed as described in relation to Figure 1. For instance, the feature determination engine 250 may determine candidate feature positions, may cluster candidate feature positions, and/or may average clustered candidate feature positions to detect feature presence and/or feature position. In some examples, the feature determination engine 250 may detect multiple features and/or feature positions.
[0046] In some examples, the feature detection(s) and/or feature position(s) may be provided to an operation engine (not shown in Figure 2). In some examples, the operation engine may perform an operation or operations based on the feature detection(s) and/or feature position(s). For instance, the operation engine may register (e.g., align) a 3D mesh (e.g., 3D mesh 246) to a 3D object model (e.g., an original 3D mesh) based on the feature position(s). For instance, the operation engine may determine a correspondence between the detected feature position(s) and feature position(s) of the 3D object model. The registration may be utilized to compare the 3D mesh to the 3D object model and determine differences (e.g., manufacturing deformations) in the 3D mesh.
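The registration itself is not prescribed above; one common choice, assuming the correspondence between detected and model feature positions is already known, is a least-squares rigid fit (the Kabsch algorithm). The sketch below is illustrative of that choice only.

```python
import numpy as np

def register_feature_positions(scanned, model):
    # scanned, model: (N, 3) arrays of corresponding feature positions (same order).
    # Returns a rotation R and translation t such that R @ scanned[i] + t ~ model[i].
    P = np.asarray(scanned, dtype=float)
    Q = np.asarray(model, dtype=float)
    p_mean, q_mean = P.mean(axis=0), Q.mean(axis=0)
    H = (P - p_mean).T @ (Q - q_mean)          # cross-covariance of the centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = q_mean - R @ p_mean
    return R, t
```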
[0047] In some examples, the operation engine may decode information encoded by the feature position(s). For instance, the operation engine may recognize a pattern or message encoded with the feature position(s). The decoded information may be provided to a display for presentation and/or may be sent to another device.
[0048] Figure 3 is a diagram illustrating an example of a portion of a 3D mesh 352. The dots in Figure 3 illustrate examples of vertices of the 3D mesh 352. Figure 3 also illustrates an example of a feature 354 (e.g., a bump in a surface represented by the 3D mesh 352). A position of the feature 354 may be detected in accordance with some of the techniques described herein. For instance, an apparatus may determine a neighborhood relative to a vertex 356. In some approaches, vertices of the 3D mesh 352 may be represented as nodes, and a neighborhood may be determined based on an edge distance from the vertex 356. For instance, a neighborhood of vertices of S(v, 7) is illustrated with the dark dots 358 of Figure 3. The neighborhood may be utilized with a smaller neighborhood (e.g., S(v, 1)) to detect a position of the feature 354 as described in relation to Figure 1 .
[0049] Figure 4 is a block diagram of an example of an apparatus 402 that may be utilized to detect a feature in accordance with some of the techniques described herein. The apparatus 402 may be an electronic device, such as a personal computer, a server computer, a smartphone, a tablet computer, scanning device, etc. The apparatus 402 may include and/or may be coupled to a processor 404 and/or a memory 406. The apparatus 402 may include additional components (not shown) and/or some of the components described herein may be removed and/or modified without departing from the scope of this disclosure.
[0050] The processor 404 may be any of a central processing unit (CPU), a digital signal processor (DSP), a semiconductor-based microprocessor, graphics processing unit (GPU), field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), and/or other hardware device suitable for retrieval and execution of instructions stored in the memory 406. The processor 404 may fetch, decode, and/or execute instructions stored in the memory 406. In some examples, the processor 404 may include an electronic circuit or circuits that include electronic components for performing a function or functions of the instructions. In some examples, the processor 404 may be implemented to perform one, some, or all of the aspects, operations, elements, etc., described in relation to one, some, or all of Figures 1-6.
[0051] The memory 406 is an electronic, magnetic, optical, and/or other physical storage device that contains or stores electronic data (e.g., information and/or instructions). The memory 406 may be, for example, Random Access Memory (RAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), a storage device, an optical disc, and/or the like. In some examples, the memory 406 may be volatile and/or non-volatile memory, such as Dynamic Random Access Memory (DRAM), EEPROM, magnetoresistive random-access memory (MRAM), phase change RAM (PCRAM), memristor, flash memory, and/or the like. In some implementations, the memory 406 may be a non-transitory tangible machine-readable storage medium (e.g., non-transitory tangible computer-readable medium), where the term “non-transitory” does not encompass transitory propagating signals. In some examples, the memory 406 may include multiple devices (e.g., a RAM card and a solid-state drive (SSD)).
[0052] In some examples, the apparatus 402 may include a communication interface 424 through which the processor 404 may communicate with an external device or devices (e.g., networked device, server, smartphone, printer, etc.). In some examples, the apparatus 402 may be in communication with (e.g., coupled to, have a communication link with) a depth sensor. In some examples, the apparatus 402 may include an integrated depth sensor.
[0053] The communication interface 424 may include hardware and/or machine-readable instructions to enable the processor 404 to communicate with the external device or devices. The communication interface 424 may enable a wired and/or wireless connection to the external device or devices. In some examples, the communication interface 424 may include a network interface card and/or may also include hardware and/or machine-readable instructions to enable the processor 404 to communicate with various input and/or output devices. Examples of output devices include a printer, a 3D printer, a display, etc. Examples of input devices include a keyboard, a mouse, a touch screen, etc., through which a user may input instructions and/or data into the apparatus 402.
[0054] In some examples, the memory 406 of the apparatus 402 may store mesh data 408, feature data 410, graph generation instructions 412, neighborhood determination instructions 414, feature detection instructions 416, and/or operation instructions 418. The mesh data 408 is data that indicates a 3D mesh (e.g., vertices, local edges, and/or faces of a 3D mesh). For example, the mesh data 408 may indicate or depict an object or objects with features. The feature data 410 is data that indicates features of an object or objects. For example, the feature data 410 may indicate positions of detected features from the mesh data 408.
[0055] The graph generation instructions 412 are instructions to generate a graph based on a 3D mesh of an object, where the graph includes nodes corresponding to vertices of the 3D mesh and edges based on faces of the 3D mesh. For instance, the processor 404 may execute the graph generation instructions 412 to generate a graph as described in relation to Figure 1 , Figure 2, and/or Figure 3.
[0056] The neighborhood determination instructions 414 are instructions to determine a first neighborhood based on a first distance (e.g., first Euclidean distance, first edge distance, etc.) from a node corresponding to a vertex, where the first distance is based on a feature parameter. For instance, the processor 404 may execute the neighborhood determination instructions 414 to determine the first neighborhood as described in relation to Figure 1 , Figure 2 and/or Figure 3.
[0057] The neighborhood determination instructions 414 are instructions to determine a second neighborhood based on a second distance (e.g., second Euclidean distance, second edge distance, etc.) from the node. For instance, the processor 404 may execute the neighborhood determination instructions 414 to determine the second neighborhood as described in relation to Figure 1 , Figure 2 and/or Figure 3.
[0058] In some examples, the feature detection instructions 416 are instructions to determine a depth value of the vertex based on the first neighborhood and the second neighborhood. For instance, the processor 404 may execute the feature detection instructions 416 to determine the depth value of the vertex as described in relation to Figure 1 , Figure 2 and/or Figure 3. In some examples, the processor 404 may execute the feature detection instructions 416 to determine a first region center based on the first neighborhood and to determine a second region center based on the second neighborhood, where the processor 404 is to determine the depth value based on the first region center and the second region center. For instance, the processor 404 may determine the first region center (e.g., first value) and the second region center (e.g., second value) as described in relation to Figure 1 and/or Figure 2.
[0059] In some examples, the feature detection instructions 416 are instructions to detect a feature position based on the depth value. For instance, the processor 404 may execute the feature detection instructions 416 to detect the feature position as described in relation to Figure 1, Figure 2, and/or Figure 3.
[0060] The operation instructions 418 are instructions to perform an operation or operations based on the feature position. For instance, the processor 404 may execute the operation instructions 418 to perform an operation as described in relation to Figure 1 , Figure 2, and/or Figure 3. For example, the processor 404 may determine a manufacturing deformation of the object based on the feature position. For instance, the processor 404 may utilize the feature position to register the 3D mesh with a 3D object model and determine geometric differences between the 3D mesh and the 3D object model to determine the manufacturing deformation. In some examples, the processor 404 may execute the operation instructions 418 to perform another operation or operations (e.g., feature decoding). In some examples, an element or elements of the apparatus 402 may be omitted or combined.
[0061] Figure 5 is a block diagram illustrating an example of a computer-readable medium 526 for feature detection. The computer-readable medium 526 is a non-transitory, tangible computer-readable medium. The computer-readable medium 526 may be, for example, RAM, EEPROM, a storage device, an optical disc, and the like. In some examples, the computer-readable medium 526 may be volatile and/or non-volatile memory, such as DRAM, EEPROM, MRAM, PCRAM, memristor, flash memory, and the like. In some implementations, the memory 406 described in relation to Figure 4 may be an example of the computer-readable medium 526 described in relation to Figure 5.
[0062] The computer-readable medium 526 may include data (e.g., information and/or instructions). For example, the computer-readable medium 526 may include neighborhood determination instructions 528, depth value determination instructions 530, and/or filtering instructions 532.
[0063] The neighborhood determination instructions 528 may include instructions that, when executed, cause a processor of an electronic device to determine a first neighborhood of a 3D mesh, where the first neighborhood has a first size based on a feature parameter. The neighborhood determination instructions 528 may also include instructions that, when executed, cause the processor of the electronic device to determine a second neighborhood of the 3D mesh, where the second neighborhood has a second size that is different from the first size. In some examples, the first neighborhood and the second neighborhood may be determined as described in relation to Figure 1, Figure 2, Figure 3, and/or Figure 4.
[0064] The depth value determination instructions 530 may include instructions that, when executed, cause the processor of the electronic device to determine a depth value of a vertex of the 3D mesh based on a first centroid of the first neighborhood and a second centroid of the second neighborhood. In some examples, the depth value may be determined as described in relation to Figure 1, Figure 2, Figure 3, and/or Figure 4.
[0065] The filtering instructions 532 may include instructions that, when executed, cause the processor of the electronic device to determine that the vertex is a candidate feature position based on the depth value and neighboring depth values. In some examples, the candidate feature position may be determined as described in relation to Figure 1, Figure 2, and/or Figure 4. For instance, the processor may determine local extrema to determine candidate feature positions.
[0066] In some examples, the filtering instructions 532 may include instructions that, when executed, cause the processor of the electronic device to determine a feature position based on the candidate feature position. In some examples, the feature position may be determined as described in relation to Figure 1, Figure 2, and/or Figure 4. For instance, the processor may determine isolated candidate feature positions and/or averaged clustered candidate feature positions to determine feature positions. In some examples, the feature position may correspond to a peak of a protruding feature of the 3D mesh.
[0067] Some examples of the techniques described herein may provide approaches to detect features for serialization and/or measurement of an object or objects. Some examples of the techniques described herein may be capable of detecting features in both acquired object scans (e.g., approximately uniform meshes) and/or 3D computer-aided design (CAD) files (e.g., non-uniform meshes). Some examples of the techniques described herein may provide flexibility to compute tuneable differences in scale in the plane of a local (e.g., mean) surface. Some examples of the techniques described herein may enable detecting surface features from 3D scans without information regarding object orientation in the 3D space. Some examples of the techniques described herein may delineate between layers that are close together. Some examples of the techniques described herein may provide for feature detection in object alignment and/or deformation calculation operations.
[0068] Figure 6 is a diagram illustrating an example of a 3D object model 660 and a scanned 3D mesh 662. In some examples, a 3D object may be manufactured based on the 3D object model 660 (e.g., original 3D mesh). For instance, the 3D object model 660 may be 3D printed to produce a physical 3D object. The physical 3D object may be scanned to produce the scanned 3D mesh 662. As illustrated in Figure 6, the 3D object model 660 may include first features 664. The scanned 3D mesh 662 may include corresponding second features 666. In some examples, positions of the first features 664 may be determined previously (e.g., at design time and/or when initially added to the 3D object model 660) and/or may be determined in accordance with some examples of the techniques described herein. For instance, feature detection (e.g., feature position detection) in accordance with some examples of the techniques described herein may be performed on the 3D object model 660 (e.g., original 3D mesh). In some cases, original design data (e.g., initial feature positions) may be unavailable and feature detection in accordance with some of the techniques described herein may be utilized to detect features and/or feature positions. [0069] Positions of the second features 666 may be determined in accordance with some examples of the techniques described herein. In some examples, the positions of the second features 666 may be utilized to register the scanned 3D mesh 662 with the 3D object model 660. Asymmetrical features may help to provide an unambiguous registration. Registration may allow comparison between the 3D object model 660 and the scanned 3D mesh 662, which may indicate manufacturing deformations. In some examples, features may be positioned to encode information. The detected feature positions may be utilized to decode information (e.g., to identify an object, to serialize objects, etc.).
[0070] As used herein, the term “and/or” may mean an item or items. For example, the phrase “A, B, and/or C” may mean any of: A (without B and C), B (without A and C), C (without A and B), A and B (without C), B and C (without A), A and C (without B), or all of A, B, and C.
[0071] While various examples are described herein, the disclosure is not limited to the examples. Variations of the examples described herein may be implemented within the scope of the disclosure. For example, aspects or elements of the examples described herein may be omitted or combined.

Claims

CLAIMS
1. A method, comprising:
    determining a first value based on a first neighborhood of a vertex of a three-dimensional (3D) mesh, wherein the first neighborhood has a first size based on a feature parameter;
    determining a second value based on a second neighborhood of the vertex of the 3D mesh, wherein the second neighborhood has a second size that is different from the first size;
    determining a depth value of the vertex based on the first value and the second value; and
    detecting whether a feature is present based on the depth value.
2. The method of claim 1 , further comprising generating a graph of the 3D mesh, wherein the graph comprises nodes generated based on vertices of the 3D mesh.
3. The method of claim 2, further comprising: determining the first neighborhood based on the graph and a first distance from a node corresponding to the vertex; and determining the second neighborhood based on the graph and a second distance from the node corresponding to the vertex, wherein the second distance is smaller than the first distance.
4. The method of claim 1 , wherein determining the first value comprises: determining an average position of vertices of the first neighborhood; subtracting the average position from the first neighborhood to determine a matrix; and determining a normal of the first neighborhood based on the matrix, wherein the first value is determined based on the normal.
5. The method of claim 1 , further comprising determining a set of faces of the first neighborhood of the 3D mesh, wherein determining the first value is based on areas and centroids of the set of faces.
6. The method of claim 1 , wherein determining the depth value is based on a normal of the first neighborhood and a difference between the first value and the second value.
7. The method of claim 6, wherein the depth value is based on a normalization factor.
8. The method of claim 1 , further comprising determining whether the vertex is a candidate feature position based on the depth value.
9. The method of claim 8, further comprising: clustering the vertex with other candidate feature positions in response to determining that the vertex is a candidate feature position; and detecting a feature position based on the cluster.
10. An apparatus, comprising:
    a memory; and
    a processor coupled to the memory, wherein the processor is to:
        generate a graph based on a 3D mesh of an object, wherein the graph comprises nodes corresponding to vertices of the 3D mesh and edges based on faces of the 3D mesh;
        determine a first neighborhood based on a first distance from a node corresponding to a vertex, wherein the first distance is based on a feature parameter;
        determine a second neighborhood based on a second distance from the node;
        determine a depth value of the vertex based on the first neighborhood and the second neighborhood; and
        detect a feature position based on the depth value.
11 . The apparatus of claim 10, wherein the processor is to: determine a first region center based on the first neighborhood; and determine a second region center based on the second neighborhood, wherein the processor is to determine the depth value based on the first region center and the second region center.
12. The apparatus of claim 10, wherein the processor is to determine a manufacturing deformation of the object based on the feature position.
13. A non-transitory tangible computer-readable medium comprising instructions when executed cause a processor of an electronic device to:
    determine a first neighborhood of a three-dimensional (3D) mesh, wherein the first neighborhood has a first size based on a feature parameter;
    determine a second neighborhood of the 3D mesh, wherein the second neighborhood has a second size that is different from the first size;
    determine a depth value of a vertex of the 3D mesh based on a first centroid of the first neighborhood and a second centroid of the second neighborhood; and
    determine that the vertex is a candidate feature position based on the depth value and neighboring depth values.
14. The non-transitory tangible computer-readable medium of claim 13, wherein the instructions when executed cause the processor to determine a feature position based on the candidate feature position.
15. The non-transitory tangible computer-readable medium of claim 14, wherein the feature position corresponds to a peak of a protruding feature of the 3D mesh.
PCT/US2021/064835 2021-12-22 2021-12-22 Feature detections WO2023121663A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2021/064835 WO2023121663A1 (en) 2021-12-22 2021-12-22 Feature detections


Publications (1)

Publication Number Publication Date
WO2023121663A1 true WO2023121663A1 (en) 2023-06-29

Family

ID=86903281

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2021/064835 WO2023121663A1 (en) 2021-12-22 2021-12-22 Feature detections

Country Status (1)

Country Link
WO (1) WO2023121663A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130212537A1 (en) * 2006-08-31 2013-08-15 Adobe Systems Incorporated Extracting Feature Information From Mesh
US20150262412A1 (en) * 2014-03-17 2015-09-17 Qualcomm Incorporated Augmented reality lighting with dynamic geometry
US20190318547A1 (en) * 2016-09-19 2019-10-17 Occipital, Inc. System and method for dense, large scale scene reconstruction



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21969214

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE