EP4609233A2 - Apparatus and method for mapping objects behind an opaque surface - Google Patents

Apparatus and method for mapping objects behind an opaque surface

Info

Publication number
EP4609233A2
EP4609233A2 (application EP23883618.3A)
Authority
EP
European Patent Office
Prior art keywords
opaque surface
behind
objects
sensor data
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP23883618.3A
Other languages
English (en)
French (fr)
Inventor
John Robert Stauss
Micaela G. KAPP
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zircon Corp
Original Assignee
Zircon Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US17/972,740 external-priority patent/US12174332B2/en
Application filed by Zircon Corp filed Critical Zircon Corp
Publication of EP4609233A2

Classifications

    • G06Q50/08 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors: Construction
    • G01V3/08 Electric or magnetic prospecting or detecting operating with magnetic or electric fields produced or modified by objects or geological structures or by detecting devices
    • G01V3/088 Electric or magnetic prospecting or detecting operating with electric fields
    • G01V3/15 Electric or magnetic prospecting or detecting specially adapted for use during transport, e.g. by a person, vehicle or boat

Definitions

  • the present invention relates to the field of mapping objects behind an opaque surface.
  • FIG. 1 illustrates a side view of a conventional scanner.
  • a scanner 102 may be used in a construction and home improvement environment 100.
  • scanner 102 may be configured to detect an object 101 behind an opaque surface 103.
  • object 101 may be a stud, an electrical wire, or a metal pipe.
  • the stud may be a wooden stud, vertical wooden element, bridging block, fire block or any other block, joist, rafter, header, post, column, let-in brace, or any similar wooden element used for the integrity, fabrication, or maintenance of a structural element.
  • opaque surface 103 may be, for example, a wall covered with drywall, particle board, or plywood; a floor with opaque material attached to structural members; a ceiling with an opaque surface attached to rafters; or any other opaque surface behind which objects are not visible through the surface.
  • scanner 102 may include a housing to enclose and protect various electronic components.
  • within the housing of the scanner 102, there may be a printed circuit board (PCB) 104, which can be configured to hold the various electronic components, such as one or more capacitive sensor(s) 108, one or more metal sensors 109, one or more current sensors (not shown), and a controller/processor and other integrated circuits (labelled as 106a and 106b).
  • the PCB 104 may be coupled to a battery 107, which provides power to the scanner 102.
  • the one or more capacitive sensor(s) 108, one or more metal sensors 109, and one or more current sensors are typically operated individually or separately. However, such conventional applications may be insufficient to address the complexity of differentiating one or more objects behind the opaque surface 103.
  • aspects of the present disclosure include an exemplary apparatus for mapping objects behind an opaque surface, comprising: a location tracker configured to generate location data, where the location data includes pairs of horizontal and vertical location data relative to a point of reference that is linked to the opaque surface; a sensor device configured to collect sensor data of the objects behind the opaque surface along a programmed scan path, where the sensor data corresponds to the location data, and the sensor device comprises one or more sensors and is held by the location tracker; a memory configured to store the sensor data and the location data; one or more processors configured to analyze the sensor data and the location data to identify information about the objects behind the opaque surface; and a user interface configured to communicate the information about the objects behind the opaque surface to a user.
  • aspects of the present invention include a method for mapping objects behind an opaque surface, comprising: generating location data by a location tracker, where the location data includes pairs of horizontal and vertical location data relative to a point of reference that is linked to the opaque surface; collecting sensor data corresponding to the location data in parallel, by a sensor device comprising one or more sensors and held by the location tracker, of the objects behind the opaque surface along a programmed scan path; storing, in a memory, the sensor data and the location data; analyzing, by one or more processors, the sensor data and the location data to identify information about the objects behind the opaque surface; and communicating, via a user interface, the information about the objects behind the opaque surface to a user.
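The apparatus and method above pair each sensor reading with (horizontal, vertical) location data relative to a point of reference on the opaque surface. A minimal sketch of such a scan log is shown below; the class and field names (`ScanSample`, `ScanLog`, `x_mm`, `capacitive`, `magnetic`) are hypothetical illustrations, not names from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class ScanSample:
    """One reading: a (horizontal, vertical) location paired with sensor values."""
    x_mm: float        # horizontal offset from the point of reference
    y_mm: float        # vertical offset from the point of reference
    capacitive: float  # magnitude reported by the capacitive sensor(s)
    magnetic: float    # magnitude reported by the metal sensor(s)

@dataclass
class ScanLog:
    """Memory holding the sensor data and the corresponding location data."""
    samples: list = field(default_factory=list)

    def record(self, x_mm, y_mm, capacitive, magnetic):
        self.samples.append(ScanSample(x_mm, y_mm, capacitive, magnetic))

log = ScanLog()
log.record(0.0, 0.0, 0.12, 0.01)
log.record(5.0, 0.0, 0.45, 0.02)
print(len(log.samples))  # -> 2
```

Storing location and sensor values together is what later lets the processors correlate signal magnitude with position along the scan path.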
  • FIG. 1 illustrates a side view of a conventional scanner.
  • FIG. 2A illustrates a top view of an exemplary embodiment for differentiating one or more objects detected behind an opaque surface according to aspects of the present invention.
  • FIG. 2B illustrates a front view of the exemplary embodiment of FIG. 2A for differentiating one or more objects detected behind an opaque surface according to aspects of the present invention.
  • FIG. 2C illustrates a first set of sensor data collected by the scanner of FIG. 2B according to aspects of the present invention.
  • FIG. 2D illustrates a second set of sensor data collected by the scanner of FIG. 2B according to aspects of the present invention.
  • FIG. 3A illustrates a front view of another exemplary embodiment for differentiating one or more objects detected behind an opaque surface according to aspects of the present invention.
  • FIG. 3B illustrates an exemplary embodiment of determining an estimated region of an object of FIG. 3A according to aspects of the present invention.
  • FIG. 3C illustrates another exemplary embodiment of determining an estimated region of another object of FIG. 3A according to aspects of the present invention.
  • FIG. 3D illustrates an exemplary embodiment of displaying the estimated regions of the different objects of FIG. 3A according to aspects of the present invention.
  • FIG. 4A illustrates a front view of yet another exemplary embodiment for differentiating one or more objects detected behind an opaque surface according to aspects of the present invention.
  • FIG. 4B illustrates an exemplary embodiment of determining an estimated region of an object of FIG. 4A according to aspects of the present invention.
  • FIG. 4C illustrates another exemplary embodiment of determining an estimated region of another object of FIG. 4A according to aspects of the present invention.
  • FIG. 4D illustrates an exemplary embodiment of displaying the estimated regions of the different objects of FIG. 4A according to aspects of the present invention.
  • FIG. 5A illustrates a top view of yet another exemplary embodiment for differentiating one or more objects detected behind an opaque surface according to aspects of the present invention.
  • FIG. 5C illustrates exemplary estimated regions of the different objects of FIG. 5B according to aspects of the present invention.
  • FIG. 5D illustrates an exemplary embodiment of displaying the estimated regions of the different objects of FIG. 5C according to aspects of the present invention.
  • FIG. 6A illustrates a top view of an exemplary embodiment for differentiating one or more objects detected behind an opaque surface using sensor data from different sensors according to aspects of the present invention.
  • FIG. 6B illustrates a front view of the exemplary embodiment of FIG. 6A for differentiating the detected object according to aspects of the present invention.
  • FIG. 6C illustrates an exemplary embodiment of determining a distance between the scanner and the object of FIG. 6B according to aspects of the present invention.
  • FIG. 7A illustrates a top view of an exemplary embodiment for detecting a metal object behind an opaque surface using sensor data from different sensors according to aspects of the present invention.
  • FIG. 7B illustrates a front view of the exemplary embodiment of FIG. 7A for detecting the metal object according to aspects of the present invention.
  • FIG. 7C illustrates an exemplary method of determining a distance between the scanner and the metal object of FIG. 7B according to aspects of the present invention.
  • FIG. 9A illustrates a method of differentiating one or more objects detected behind an opaque surface using sensor data from different sensors according to aspects of the present invention.
  • FIG. 9B illustrates a method of analyzing sensor data to identify estimated regions of the objects detected behind an opaque surface according to aspects of the present invention.
  • FIG. 9C illustrates a method of informing a user of the objects detected behind an opaque surface according to aspects of the present invention.
  • FIG. 10A illustrates a method of mapping objects behind an opaque surface according to aspects of the present disclosure.
  • FIG. 10B illustrates an exemplary apparatus for mapping objects behind an opaque surface according to aspects of the present disclosure.
  • FIG. 10C illustrates an exemplary location tracker and sensor device used for mapping objects behind an opaque surface according to aspects of the present disclosure.
  • FIG. 11A illustrates an exemplary implementation of analyzing sensor data and location data according to aspects of the present disclosure.
  • FIG. 11B illustrates exemplary applications of using image patterns of one or more objects behind an opaque surface according to aspects of the present disclosure.
  • FIG. 12A illustrates another exemplary implementation of analyzing sensor data and location data according to aspects of the present disclosure.
  • FIG. 12B illustrates exemplary applications of using relational information of one or more objects behind an opaque surface according to aspects of the present disclosure.
  • FIG. 12C illustrates another exemplary application of using relational information of one or more objects behind an opaque surface according to aspects of the present disclosure.
  • FIG. 13A illustrates yet another exemplary implementation of analyzing sensor data and location data according to aspects of the present disclosure.
  • FIG. 13B illustrates an exemplary application of using signal density of one or more objects behind an opaque surface according to aspects of the present disclosure.
  • FIG. 13C illustrates another exemplary application of using signal density of one or more objects behind an opaque surface according to aspects of the present disclosure.
  • FIG. 13D illustrates yet another exemplary application of using signal density of one or more objects behind an opaque surface according to aspects of the present disclosure.
  • FIG. 14 illustrates a method of communicating information about the objects behind an opaque surface according to aspects of the present disclosure.
  • FIG. 2A illustrates a top view of an exemplary embodiment for differentiating one or more objects detected behind an opaque surface according to aspects of the present invention.
  • the exemplary embodiment may include a scanner 202, an opaque surface 204, and one or more objects (labelled as 206 and 208) behind the opaque surface 204.
  • the scanner 202 may be configured to differentiate a variety of objects detected behind the opaque surface, including but not limited to, for example: 1) wood studs, wood joists, wood rafters; 2) metallic objects; 3) electrical wires; or 4) other objects.
  • object 206 may be a wood stud
  • object 208 may be a metal pipe.
  • FIG. 2B illustrates a front view of the exemplary embodiment of FIG. 2A for detecting different objects behind an opaque surface according to aspects of the present invention.
  • the opaque surface is not shown for simplicity.
  • the scan direction may be from right to left.
  • the scan direction may be adjusted based on the working environment, the preference of the user, and the specific application. In other words, the scan direction may be from left to right, right to left, up to down, down to up, or diagonally.
  • a user may perform multiple scans and/or from multiple directions to improve the accuracy of sensor data collected.
  • FIG. 2C illustrates a first set of sensor data collected by the scanner of FIG. 2B according to aspects of the present invention.
  • the sensor data may be collected by one or more capacitive sensors of the scanner 202; and one or more items may be included in a set.
  • the signal may represent a change of capacitance due to the change in the density of the objects behind the opaque surface, which may include an indication of the density of object 206 and object 208.
  • the vertical axis represents a magnitude of the signal observed by the capacitive sensors, and the horizontal axis represents a distance of the capacitive sensors from the objects being detected.
  • as the scanner 202 approaches the objects, the magnitude of the signal observed by the capacitive sensors increases, reaching a plateau when the scanner is approximately above the center of the objects. As the scanner 202 continues to move past the center of the objects, the magnitude of the signal observed by the capacitive sensors decreases.
  • a first reference signal strength (RS1) may be used to identify the boundaries of object 206.
  • the region between the two dashed lines 210a and 210b has a signal strength at or above RS1, and this region may be estimated to be where object 206 is located.
  • the region outside of the two dashed lines 210a and 210b has a signal strength below RS1, and this region may be estimated to be where object 206 is not found.
  • the first reference signal strength RS1 may be derived from empirical experimental data.
  • the first reference signal strength RS1 may be programmable, and may be revised via a software update even after the scanner has been sold, the delivery methods of which are well known to those skilled in the art.
  • the distance DMIN1 represents a minimum distance between the capacitive sensors of the scanner 202 and the approximate center of the objects. Note that although a right-to-left scan is described in this example, similar observations may be obtained by a scan from left to right. In some applications, multiple scans from different directions may be used to improve the accuracy of the estimated boundaries of object 206.
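The boundary estimation described above (the object is estimated to lie where the signal is at or above the reference signal strength) can be sketched as follows; the function name, the sweep values, and the threshold 0.5 are illustrative assumptions, not empirical values from the patent.

```python
def estimate_region(samples, threshold):
    """Return (start, end) positions of the span whose signal magnitude is
    at or above the reference signal strength, or None if it is never reached."""
    positions = [pos for pos, mag in samples if mag >= threshold]
    if not positions:
        return None
    return min(positions), max(positions)

# Synthetic capacitive sweep: magnitude rises, plateaus over the object, falls.
sweep = [(0, 0.1), (10, 0.3), (20, 0.8), (30, 0.9), (40, 0.8), (50, 0.3), (60, 0.1)]
RS1 = 0.5  # illustrative reference signal strength, not an empirical value
print(estimate_region(sweep, RS1))  # -> (20, 40)
```

Because the threshold is a programmable parameter, revising RS1 via a software update directly widens or narrows the estimated region without changing the collected data.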
  • FIG. 2D illustrates a second set of sensor data collected by the scanner of FIG. 2B according to aspects of the present invention.
  • the sensor data may be collected by one or more metal sensors of scanner 202; and one or more items may be included in a set.
  • the signal may represent a magnetic field detected behind the opaque surface, primarily affected by the existence of a metal object, such as object 208.
  • the vertical axis represents the magnitude of the signal observed by the metal sensors, and the horizontal axis represents the distance of the metal sensors from object 208.
  • as the scanner 202 approaches object 208, the magnitude of the signal observed by the metal sensors increases, reaching a plateau when the scanner is approximately above the center of object 208.
  • a second reference signal strength may be used to identify the boundaries of object 208.
  • the region between the two dashed lines 212a and 212b has a signal strength at or above RS2, and this region may be estimated to be where object 208 is located.
  • the region outside of the two dashed lines 212a and 212b has a signal strength below RS2, and this region may be estimated to be where object 208 is not found.
  • the second reference signal strength RS2 may be derived from empirical experimental data.
  • the second reference signal strength RS2 may be programmable, and may be revised via a software update even after the scanner 202 has been sold, the delivery methods of which are well known to those skilled in the art.
  • the distance DMIN2 represents a minimum distance between the metal sensors of scanner 202 and the approximate center of object 208. Note that although a right to left scan is described in this example, similar observations may be obtained by a scan from left to right. In some applications, multiple scans from different directions may be used to improve the accuracy of the estimated boundaries of object 208.
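Both sensor traces plateau when the scanner is approximately above the object's center, which is where the minimum distance (DMIN1, DMIN2) occurs. A hedged sketch of locating that center from a sweep is below; the function name, tolerance, and sample values are assumptions for illustration only.

```python
def estimate_center(samples, tolerance=0.99):
    """Approximate the object's center as the midpoint of the positions where
    the signal sits at its plateau (within a tolerance of the peak magnitude)."""
    peak = max(mag for _, mag in samples)
    plateau = [pos for pos, mag in samples if mag >= peak * tolerance]
    return sum(plateau) / len(plateau)

# Synthetic sweep with a flat-topped plateau between positions 20 and 30.
sweep = [(0, 0.1), (10, 0.4), (20, 0.9), (30, 0.9), (40, 0.4), (50, 0.1)]
print(estimate_center(sweep))  # -> 25.0
```

Averaging the plateau positions, rather than taking a single peak sample, makes the center estimate less sensitive to noise in any one reading.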
  • FIG. 3A illustrates a front view of another exemplary embodiment for detecting different objects behind an opaque surface according to aspects of the present invention.
  • the exemplary embodiment may include a scanner 302 and one or more objects (labelled as 304 and 306) behind an opaque surface.
  • Object 304 may be a wood stud
  • object 306 may be a metal pipe.
  • the scan direction may be from left to right.
  • the method described above in association with FIG. 2A to FIG. 2D may be employed to determine an estimated region for each object behind the opaque surface, which is not repeated here.
  • rectangle 314 represents an estimated region of object 304
  • circle 316 represents an estimated region of object 306.
  • FIG. 3B illustrates an exemplary method of determining an estimated region of an object of FIG. 3A according to aspects of the present invention. As shown in FIG. 3B, the method of determining the estimated region of object 304 is used as an example.
  • a first estimated region 314a can be determined by employing the first reference signal strength (RS1) as described in association with FIG. 2C. Since the first reference signal strength may be programmable, for a wood stud it can be programmed to provide the first estimated region 314a to be smaller than the actual object 304. By choosing the first estimated region 314a to be smaller than the actual object 304, this approach can provide the benefit of a higher level of confidence that a wood stud is hit when a user drills into the opaque surface.
  • a second estimated region 314b can be determined by inserting a safety margin.
  • This safety margin is represented by the area between the first estimated region 314a and the second estimated region 314b.
  • Various factors may be used to determine the safety margin, including but not limited to: 1) type of material of the opaque surface; 2) humidity of the environment; 3) temperature of the environment; or 4) other factors that may affect the accuracy of determining the estimated region of object 304.
  • the safety margin may add 2mm, 4mm, or other measurements on each side of the first estimated region to form the second estimated region based on the above factors and the design criteria for the scanner.
  • either the first estimated region 314a or the second estimated region 314b may be used to represent the estimated region of object 304.
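The two-step adjustment above (a first region biased smaller than the raw thresholded region, then a second region formed by adding a safety margin on each side) can be sketched in one dimension as follows. The helper names and the 2 mm / 4 mm values are illustrative, echoing the example measurements in the text rather than fixed design parameters.

```python
def shrink(start, end, mm):
    """Pull each boundary inward, e.g. to under-report a wood stud region."""
    return (start + mm, end - mm)

def grow(start, end, mm):
    """Push each boundary outward, e.g. to add a safety margin on each side."""
    return (start - mm, end + mm)

raw = (20.0, 58.0)            # region where the signal met RS1 (made-up numbers)
first = shrink(*raw, 2.0)     # first estimated region, smaller than the raw region
second = grow(*first, 4.0)    # second estimated region, with safety margin added
print(first, second)          # -> (22.0, 56.0) (18.0, 60.0)
```

The same helpers cover the metal-pipe case by growing instead of shrinking the first region, so that drilling outside the reported region is more likely to miss the pipe.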
  • FIG. 3C illustrates another exemplary method of determining an estimated region of another object of FIG. 3A according to aspects of the present invention.
  • the method of determining the estimated region of object 306 is used as an example.
  • a first estimated region 316a can be determined by employing the second reference signal strength (RS2) as described in association with FIG. 2D.
  • since the second reference signal strength may be programmable, for a metal pipe it can be programmed to provide the first estimated region 316a to be larger than the actual object 306, for example larger by 1 millimeter (mm), 3 mm, or other measurements on each side of the first estimated region based on design criteria for the scanner.
  • this approach can provide the benefit of having a higher level of confidence that a metal object is missed when the user drills into the opaque surface.
  • a second estimated region 316b can be determined by inserting a safety margin.
  • This safety margin is represented by the area between the first estimated region 316a and the second estimated region 316b.
  • Various factors may be used to determine the safety margin, including but not limited to: 1) type of material of the opaque surface; 2) humidity of the environment; 3) temperature of the environment; or 4) other factors that may affect the accuracy of determining the estimated region of object 306.
  • either the first estimated region 316a or the second estimated region 316b may be used to represent the estimated region of object 306.
  • FIG. 3D illustrates an exemplary implementation of displaying the estimated regions of the different objects of FIG. 3A according to aspects of the present invention.
  • a user interface can mean any form of communication to a user, including, but not limited to, visual (for example via a display or one or more light emitting diodes), audible (for example via a speaker) or sensory (for example via a vibration).
  • the information being communicated may be displayed, streamed, stored, mapped, or distributed across multiple devices.
  • Communication to the user can mean either the user or any other person or object which can receive communication.
  • the method determines regions where a single object is detected as well as regions where multiple objects are detected. In the example shown in FIG. 3D, metal pipe 326 may represent a region where multiple objects are detected (for example, a region that includes part of stud 324), and rectangle 324 (which includes part of metal pipe 326) may represent a region where one part has multiple objects (for example, part of metal pipe 326 and part of stud 324) and another part (excluding the overlap with metal pipe 326) has a single object.
  • the display may be configured to display the multiple objects detected behind the opaque surface for this region.
  • the display may be configured to display the single object detected behind the opaque surface.
  • the display may be configured to display nothing for the region of metal pipe 326.
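Splitting overlapping estimated regions into "single object" and "multiple objects" sub-regions, as described for the display above, can be sketched in one dimension like this; the function, labels, and coordinates are illustrative assumptions, not the patent's implementation.

```python
def classify_regions(region_a, region_b, label_a, label_b):
    """Split two 1-D estimated regions into sub-regions labelled with the
    object(s) detected there (single object vs. multiple objects)."""
    points = sorted({*region_a, *region_b})
    out = []
    for lo, hi in zip(points, points[1:]):
        mid = (lo + hi) / 2  # test each sub-interval by its midpoint
        labels = [lab for reg, lab in ((region_a, label_a), (region_b, label_b))
                  if reg[0] <= mid <= reg[1]]
        if labels:
            out.append(((lo, hi), labels))
    return out

# A stud region partially overlapped by a metal-pipe region (made-up coordinates):
print(classify_regions((10, 50), (40, 70), "stud", "pipe"))
# -> [((10, 40), ['stud']), ((40, 50), ['stud', 'pipe']), ((50, 70), ['pipe'])]
```

The display can then render each sub-region differently, e.g. showing both objects in the overlap and suppressing or highlighting the pipe-only region depending on the chosen mode.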
  • FIG. 4A illustrates a front view of yet another exemplary embodiment for differentiating one or more objects detected behind an opaque surface according to aspects of the present invention.
  • the exemplary embodiment may include a scanner 402, and one or more objects (labelled as 404 and 406) behind an opaque surface.
  • Object 404 may be a wood stud
  • object 406 may be an electrical wire.
  • the scan direction may be from left to right.
  • the method described above in association with FIG. 2A to FIG. 2D may be employed to determine an estimated region for each object behind the opaque surface, which is not repeated here.
  • rectangle 414 represents an estimated region of object 404
  • rectangle 416 represents an estimated region of object 406.
  • FIG. 4B illustrates an exemplary method of determining an estimated region of an object of FIG. 4A according to aspects of the present invention. As shown in FIG. 4B, the method of determining the estimated region of object 404 is used as an example.
  • a first estimated region 414a can be determined by employing the first reference signal strength (RS1) as described in association with FIG. 2C. Since the first reference signal strength may be programmable, for a wood stud, for example, it can be programmed to provide the first estimated region 414a to be smaller than the actual object 404, for example smaller by 2 mm, 4 mm, or other measurements on each side of the first estimated region based on design criteria for the scanner. By choosing the first estimated region 414a to be smaller than the actual object 404, this approach can provide the benefit of a higher level of confidence that a wood stud is hit when a user drills into the opaque surface.
  • a second estimated region 414b can be determined by inserting a safety margin.
  • This safety margin is represented by the area between the first estimated region 414a and the second estimated region 414b.
  • Various factors may be used to determine the safety margin, including but not limited to: 1) type of material of the opaque surface; 2) humidity of the environment; 3) temperature of the environment; or 4) other factors that may affect the accuracy of determining the estimated region of object 404.
  • either the first estimated region 414a or the second estimated region 414b may be used to represent the estimated region of object 404.
  • FIG. 4C illustrates another exemplary method of determining an estimated region of another object of FIG. 4A according to aspects of the present invention.
  • the method of determining the estimated region of object 406 is used as an example.
  • a first estimated region 416a can be determined by employing a third reference signal strength (RS3) similar to the description in association with FIG. 2D.
  • the third reference signal strength may be programmable.
  • it can be programmed to provide the first estimated region 416a to be larger than the actual object 406, for example larger by 3mm, 5 mm, or other measurements on each side of the first estimated region based on design criteria for the scanner.
  • this approach can provide the benefit of having a higher level of confidence that an electrical wire is missed when a user drills into the opaque surface.
  • a second estimated region 416b can be determined by inserting a safety margin.
  • This safety margin is represented by the area between the first estimated region 416a and the second estimated region 416b.
  • Various factors may be used to determine the safety margin, including but not limited to: 1) type of material of the opaque surface; 2) humidity of the environment; 3) temperature of the environment; or 4) other factors that may affect the accuracy of determining the estimated region of object 406.
  • the safety margin may add 1mm, 3mm, or other measurements on each side of the first estimated region to form the second estimated region based on the above factors and the design criteria for the scanner.
  • either the first estimated region 416a or the second estimated region 416b may be used to represent the estimated region of object 406.
  • FIG. 4D illustrates an exemplary implementation of displaying the estimated regions of the different objects of FIG. 4A according to aspects of the present invention.
  • the method determines regions where a single object is detected as well as regions where multiple objects are detected.
  • rectangle 426 may represent a region where multiple objects are detected
  • rectangle 424 (which includes part of rectangle 426) may represent a region where a part of it has multiple objects (for example the region that overlaps with rectangle 426) and another part of it (excluding the region that overlaps with rectangle 426) has a single object.
  • the display may be configured to display the multiple objects detected behind the opaque surface for this region.
  • the display may be configured to display the single object detected behind the opaque surface.
  • the display may be configured to display nothing for the region of the rectangle 426.
  • FIG. 5A illustrates a top view of yet another exemplary embodiment for differentiating one or more objects detected behind an opaque surface according to aspects of the present invention.
  • the exemplary embodiment may include a scanner 502, an opaque surface 504, and one or more objects (labelled as 506, 508, and 510) behind the opaque surface 504.
  • the scanner 502 may be configured to detect a variety of objects behind the opaque surface, including but not limited to: 1) wood studs; 2) metallic objects; 3) electrical wires; or 4) other objects.
  • object 506 may be a wood stud
  • object 508 may be a metal pipe
  • object 510 may be an electrical wire.
  • the scan direction may be from right to left.
  • the scan direction may be adjusted based on the working environment, the preference of the user, and the specific application. In other words, the scan direction may be from left to right, right to left, up to down, down to up, or diagonally.
  • a user may perform multiple scans and/or from multiple directions to improve the accuracy of sensor data collected.
  • FIG. 5C illustrates estimated regions of the different objects of FIG. 5B according to aspects of the present invention. Note that the method of determining an estimated region of an object is described above, for example in association with FIG. 3B and FIG. 3C, which is not repeated here. As shown in FIG. 5C, rectangle 516 represents an estimated region for stud 506, rectangle 518 represents an estimated region for metal pipe 508, and rectangle 520 represents an estimated region for electrical wire 510.
  • the display may be configured to display the estimated region for stud 506, represented by rectangle 526, and display the estimated region for metal pipe 508, represented by rectangle 528, and display the estimated region for electrical wire 510, represented by the rectangle 530.
  • the display may be configured to display the region under the rectangle 528 to include both metal pipe 508 and wood stud 506, and display the region under the rectangle 530 to include both electrical wire 510 and wood stud 506.
  • FIG. 6A illustrates a top view of an exemplary embodiment for differentiating one or more objects detected behind an opaque surface using sensor data from different sensors according to aspects of the present invention.
  • the exemplary embodiment may include a scanner 602, an opaque surface 604, and one or more objects (labelled as 606) behind the opaque surface 604.
  • object 606 may be, for example, a metal pipe.
  • FIG. 6B illustrates a front view of the exemplary embodiment of FIG. 6A for detecting the object according to aspects of the present invention.
  • the opaque surface is not shown for simplicity.
  • the scan direction may be from left to right.
  • the scan direction may be adjusted based on the working environment, the preference of the user, and the specific application. In other words, the scan direction may be from left to right, right to left, up to down, down to up, or diagonally.
  • a user may perform multiple scans and/or scan from multiple directions to improve the accuracy of the sensor data collected.
  • FIG. 6C illustrates an exemplary method of determining a distance between the scanner and the object of FIG. 6B according to aspects of the present invention.
  • the vertical axis represents a common reference point or a common reference line from which a distance between scanner 602 and metal pipe 606 is estimated.
  • the horizontal axis represents a distance from the common reference point or the common reference line.
  • Scanner 602 may be configured to collect sensor data as described above in association with FIG. 2C and FIG. 2D. For example, based on the sensor data collected by one or more capacitive sensors of scanner 602, a first distance D1, representing a distance between scanner 602 and metal pipe 606, may be estimated by the capacitive sensors.
  • a second distance D2 representing a distance between scanner 602 and metal pipe 606, may be estimated by the metal sensors.
  • the metal sensors may provide an estimated distance (e.g. D2) that is shorter than the actual distance between scanner 602 and metal pipe 606.
  • the capacitive sensors may provide an estimated distance (e.g. D1) that is closer to the actual distance between scanner 602 and the metal pipe 606.
  • scanner 602 may be configured to derive a distance D3 for metal pipe 606 from the common reference.
  • scanner 602 may obtain an improved estimation of the distance between scanner 602 and metal pipe 606 in this example.
  • both the sensor data collected by the capacitive sensors and the metal sensors may be collected in parallel in a one-pass scan, or multiple sets of sensor data may be collected by the capacitive sensors and the metal sensors in parallel with multiple passes, respectively.
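The combination of the two sensor estimates into the derived distance D3 can be sketched as a simple fusion step. This is a minimal illustration, not the method claimed in this disclosure: it assumes a weighted average that favors the capacitive estimate, which the text notes is often closer to the actual distance, and the weights are hypothetical tuning parameters.

```python
def fuse_distance_estimates(d_cap, d_metal, w_cap=0.7, w_metal=0.3):
    """Fuse the capacitive-sensor estimate (d_cap) and the metal-sensor
    estimate (d_metal) into a single derived distance (D3 above).
    Both weights are illustrative, not values from this disclosure."""
    total = w_cap + w_metal
    return (w_cap * d_cap + w_metal * d_metal) / total

# Capacitive sensors report 1.0 inch; metal sensors read short at 0.6 inch.
d3 = fuse_distance_estimates(1.0, 0.6)
```

Because the weighted result always lies between the two raw estimates, the derived distance cannot be worse than the more pessimistic of the two inputs under this scheme.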
  • FIG. 7A illustrates a top view of an exemplary embodiment for differentiating object(s), here a metal screw 706 and stud 708, detected behind an opaque surface using sensor data from different sensors according to aspects of the present invention.
  • the exemplary embodiment may include a scanner 702, an opaque surface 704, and one or more objects (labelled as 706 (metal screw) and 708 (stud)) behind opaque surface 704.
  • for example, object 706 may be a metal screw and object 708 may be a wood stud.
  • FIG. 7B illustrates a front view of the exemplary embodiment of FIG. 7A for detecting the metal object according to aspects of the present invention.
  • the scan direction may be from left to right.
  • the scan direction may be adjusted based on the working environment, the preference of the user, and the specific application. In other words, the scan direction may be from left to right, right to left, up to down, down to up, or diagonally.
  • a user may perform multiple scans and/or scan from multiple directions to improve the accuracy of the sensor data collected.
  • FIG. 7C illustrates an exemplary method of determining a distance between the scanner and the metal object of FIG. 7B (screw 706) according to aspects of the present invention.
  • the vertical axis represents a common reference point or a common reference line from which distances between scanner 702 and metal screw 706 and stud 708 are estimated.
  • the horizontal axis represents a distance from the common reference point or the common reference line.
  • Scanner 702 may be configured to collect sensor data as described above in association with FIG. 2C and FIG. 2D. For example, based on the sensor data collected by one or more capacitive sensors of scanner 702, a first distance D1, representing a distance between scanner 702 and metal screw 706 and stud 708, may be estimated by the capacitive sensors.
  • a second distance D2 representing a distance between scanner 702 and metal screw 706, may be estimated by the metal sensors.
  • the capacitive sensors and the metal sensors may provide different estimations with respect to the distance between scanner 702 and metal screw 706 based upon the relative size of the metal screw.
  • the metal sensors may provide an estimated distance (e.g. D2) that is different from the actual distance between scanner 702 and metal screw 706.
  • the capacitive sensors may provide an estimated distance (e.g. D1) that may be closer to the actual distance between scanner 702 and metal screw 706.
  • scanner 702 may be configured to derive a distance D3 for metal screw 706.
  • scanner 702 may be able to obtain an improved estimation of the distance between scanner 702 and metal screw 706 in this example.
  • both the sensor data collected by the capacitive sensors and the metal sensors may be collected in parallel in a one-pass scan, or multiple sets of sensor data may be collected by the capacitive sensors and the metal sensors in parallel with multiple passes, respectively.
  • FIG. 8 illustrates a block diagram of an exemplary embodiment of a system for differentiating one or more objects detected behind an opaque surface using sensor data from different sensors according to aspects of the present invention.
  • a controller 802 may be configured to process sensor data collected by sensors of the scanner, namely sensor data collected by capacitive sensors 804, metal sensor 806, and current sensor 808.
  • the controller is further configured to determine information about the detected objects behind the opaque surface based on the sensor data collected by capacitive sensors 804, metal sensor 806, and/or current sensor 808 in parallel.
  • the controller may include one or more processors.
  • a display 810 is configured to provide information about the detected objects to a user.
  • the functional blocks described in the system of FIG. 8 may be implemented in an integrated device such as scanner 202 of FIG. 2A.
  • the capacitive sensors 804, metal sensors 806, and current sensor 808 may reside in one device, while the controller 802 and the display 810 may reside in another device.
  • a scanner device may include the sensors, and the sensor data collected by the scanner device may be wirelessly communicated to a second device.
  • the second device for example a smartphone, a tablet, or a laptop, may include the controller 802 and the display 810.
  • the controller 802, the capacitive sensors 804, metal sensors 806, and current sensor 808, may reside in one device, while the display 810 may reside in another device.
  • a scanner device may include the controller 802 and the sensors, and the sensor data collected by the scanner device may be wirelessly communicated to a second device.
  • the second device for example a monitor, may be configured to receive and display the sensor data.
  • current sensors may be alternating current sensors.
  • current sensors may be able to detect the static magnetic field of or associated with direct current.
  • FIG. 9A illustrates a method of differentiating one or more objects detected behind an opaque surface using sensor data from different sensors according to aspects of the present invention.
  • the method collects, in parallel, sensor data of the one or more objects behind an opaque surface, by a plurality of sensors controlled by one or more processors.
  • the method analyzes, by the one or more processors, the sensor data to identify estimated regions of the one or more objects behind the opaque surface.
  • the method differentiates, by the one or more processors, the estimated regions of the one or more objects behind the opaque surface.
  • the method informs a user, by the one or more processors, of the one or more objects within the estimated regions behind the opaque surface.
  • the plurality of sensors may include at least a first set of sensors configured to detect a first type of material and a second set of sensors configured to detect a second type of material; and the estimated regions include a first estimated region of the first type of material and a second estimated region of the second type of material.
  • the first set of sensors may include one or more capacitive sensors and the first type of material includes wood studs; and the second set of sensors may include one or more metal sensors and the second type of material includes metal objects.
  • the plurality of sensors may further include a third set of sensors configured to detect a third type of material; where the third set of sensors includes one or more current sensors and the third type of material includes electrical wires.
  • a set of sensors may include one or more sensors in the set.
  • the method of collecting sensor data includes mapping the sensor data of the one or more objects behind the opaque surface with respect to a common reference point.
  • the method of differentiating the estimated regions of the one or more objects behind the opaque surface includes determining an overlap region between the first estimated region and the second estimated region.
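The overlap determination in the preceding bullet can be sketched as a one-dimensional interval intersection along the scan axis. This is an illustrative sketch only; the region endpoints and units below are hypothetical.

```python
def overlap_region(region_a, region_b):
    """Return the overlap of two estimated regions, each given as a
    (start, end) interval along the scan axis, or None if disjoint."""
    start = max(region_a[0], region_b[0])
    end = min(region_a[1], region_b[1])
    return (start, end) if start < end else None

# Hypothetical regions, in inches from a common reference:
stud_region = (10.0, 13.5)   # e.g. a wood stud
pipe_region = (12.0, 12.75)  # e.g. a metal pipe in front of the stud
shared = overlap_region(stud_region, pipe_region)
```

Two-dimensional rectangular regions would apply the same intersection independently per axis.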
  • FIG. 9B illustrates a method of analyzing sensor data to identify estimated regions of the objects detected behind an opaque surface according to aspects of the present invention.
  • the method analyzes the sensor data to identify a first measured region for a wood stud, and reduces the first measured region by a first programmable percentage to derive a first estimated region for the wood stud.
  • the method analyzes the sensor data to identify a second measured region for a metal object, and enlarges the second measured region by a second programmable percentage to derive a second estimated region for the metal object.
  • the methods performed in block 912 and block 914 may additionally or optionally include the methods performed in block 916 and/or block 918.
  • the method analyzes the sensor data to identify a third measured region for an electrical wire, and enlarges the third measured region by a third programmable percentage to derive a third estimated region for the electrical wire.
  • the method adds programmable safety margins to the corresponding estimated regions in accordance with variations of an operating environment, where the variations of the operating environment include variations in temperature, humidity, material of the opaque surface, or some combination thereof.
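The region-estimation steps above (reduce for wood studs, enlarge for metal objects and wires, then add safety margins) might look like the following sketch. The percentages and margin values are illustrative placeholders for the programmable values, not values from this disclosure.

```python
def estimated_region(measured, pct, enlarge, margin=0.0):
    """Derive an estimated region from a measured (start, end) region.

    pct:     programmable percentage, e.g. 0.10 for 10% (illustrative).
    enlarge: True to enlarge the region (metal objects, electrical
             wires); False to reduce it (wood studs).
    margin:  optional safety margin added to each side for operating-
             environment variations (temperature, humidity, material
             of the opaque surface)."""
    start, end = measured
    center = (start + end) / 2.0
    half = (end - start) / 2.0
    half *= (1 + pct) if enlarge else (1 - pct)
    half += margin
    return (center - half, center + half)

# A measured stud region reduced by 10%; a metal region enlarged by 20%
# with a 0.1 safety margin (all numbers hypothetical):
stud_est = estimated_region((10.0, 14.0), 0.10, enlarge=False)
metal_est = estimated_region((5.0, 6.0), 0.20, enlarge=True, margin=0.1)
```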
  • FIG. 9C illustrates a method of informing a user of the objects detected behind an opaque surface according to aspects of the present invention.
  • the method described in either block 922 or block 924 may be performed.
  • the method prevents display of information in the overlap region.
  • the method selectively displays the first type of material, the second type of material, or both types of material in the overlap region.
  • FIG. 10A illustrates a method of mapping objects behind an opaque surface according to aspects of the present disclosure.
  • the method generates location data by a location tracker, where the location data includes pairs of horizontal and vertical location data relative to a point of reference that is linked to the opaque surface.
  • the method collects sensor data corresponding to the location data in parallel, by a sensor device comprising one or more sensors and held by the location tracker, of the objects behind the opaque surface along a programmed scan path.
  • the method stores, in a memory, the sensor data and the location data.
  • the method analyzes, by one or more processors, the sensor data and the location data to identify information about the objects behind the opaque surface.
  • the method communicates, via a user interface, the information about the objects behind the opaque surface to a user.
  • mapping of a large area of an opaque surface can be beneficial in many construction or architectural scenarios.
  • the method can be employed for purposes including but not limited to: 1) finding a stud to hang heavy objects safely and securely; 2) identifying plumbing or electrical objects to avoid potential hazards when cutting or drilling into the opaque surface; 3) locating structures like plumbing and electrical lines for modification or upgrades; 4) performing forensic determination of compliance with building codes; and 5) performing remodeling or inspection of legacy infrastructure, where detailed drawings are either inaccurate or do not exist.
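The mapping steps above, in which location data and sensor data are collected in synchronization and each corresponding pair is stored for later analysis, can be sketched with a simple in-memory store. The `ScanSample` and `ScanMap` names and their structure are hypothetical illustrations, not elements of this disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class ScanSample:
    """One synchronized pair: a location relative to the reference
    point linked to the opaque surface, plus the sensor readings
    collected at that location."""
    x: float
    y: float
    readings: dict  # e.g. {"capacitive": ..., "metal": ..., "current": ...}

@dataclass
class ScanMap:
    """In-memory store for corresponding (location, sensor) pairs,
    a stand-in for the memory described above."""
    samples: list = field(default_factory=list)

    def record(self, x, y, readings):
        self.samples.append(ScanSample(x, y, readings))

    def readings_at(self, x, y, tol=0.01):
        """Recall stored samples near a location for later analysis."""
        return [s for s in self.samples
                if abs(s.x - x) <= tol and abs(s.y - y) <= tol]
```

Storing pairs rather than raw streams is what makes later recall possible: the data can be reassessed away from the job site against the same reference point.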
  • FIG. 10B illustrates an exemplary apparatus for mapping objects behind an opaque surface according to aspects of the present disclosure.
  • the apparatus includes location tracker 1012, sensor device 1014, memory 1016, one or more processors 1018, and user interface 1020.
  • the location data (from location tracker 1012) and sensor data (from sensor device 1014) are collected in synchronization.
  • Each corresponding pair of location data and sensor data are stored in the memory 1016.
  • the sensor device 1014 may include one or more capacitive sensors, one or more metal sensors, one or more current sensors, or other types of sensors.
  • the sensor data may include at least one of: sensor data collected by one or more capacitive sensors; sensor data collected by one or more metal sensors; sensor data collected by one or more current sensors, or a combination thereof.
  • the one or more processors 1018 can be configured to control the location tracker 1012, the sensor device 1014, the memory 1016, and the user interface 1020.
  • mapping can be configured to create a plot over a defined area that delineates the type and location of various hidden objects/structures behind an opaque surface.
  • the map can be recalled for future use, thus obviating the need to spend additional time scanning.
  • some benefits of the mapping method can include accurately determining size of objects; accurately determining shape of objects; and accurately determining location of objects behind the opaque surface.
  • Yet another benefit of the mapping method is that it can provide sensory data with a physical location and context. This allows a user to view subsurface structures in their entirety, which in turn improves the reliability of interpretation of sensor data within the structure, by giving the user the ability to view the entire scan area and locations of objects on a display.
  • FIG. 10C illustrates an exemplary location tracker and sensor device used for mapping objects behind an opaque surface according to aspects of the present disclosure.
  • the location tracker 1012 is configured to hold the sensor device 1014.
  • the location tracker 1012 is configured to scan an area of an opaque surface 1028 with respect to a reference point (not shown) one time, or to scan the area of the opaque surface 1028 a predetermined number of times.
  • the location tracker 1012 of FIG. 10B may include a first arm 1022a configured to control movements of the sensor device 1014 in a horizontal direction and a second arm 1024 configured to control movements of the sensor device 1014 in a vertical direction.
  • FIG. 10B may additionally or optionally include a third arm 1022b that is positioned in parallel with the first arm 1022a and is configured to control movements of the sensor device 1014 in a horizontal direction in conjunction with the first arm 1022a.
  • FIG. 10C shows the first arm 1022a and the third arm 1022b positioned horizontally and the second arm 1024 positioned vertically with respect to the opaque surface 1028.
  • the first arm and the third arm may be positioned vertically and the second arm may be positioned horizontally with respect to the opaque surface 1028.
  • the first arm and the second arm may be joined by a hinge and function like a robot arm; neither the first arm nor the second arm need to be positioned horizontally or vertically with respect to the opaque surface 1028.
  • the scan may be performed with repetitive or different scan patterns relative to the reference point to increase accuracy and to remove outliers.
  • this approach is advantageous over manual operation of a scan device, because a manually operated scan device cannot accurately track the location of the sensors relative to a reference point.
  • a manually operated scan device cannot accurately perform repetitive scans and accumulate results of different scan paths within the scan area, because it does not have a pre-programmed scan pattern relative to the reference point.
  • streams of sensor data can be combined with corresponding location data to create an image of the hidden objects/structures beneath a defined scan area of the opaque surface.
  • Location data and sensor data can be stored and used for subsequent retrieval and analysis, which in turn can reduce construction costs. For example, by storing scanned sensor data and location data with respect to a reference point, such data can be assessed away from the job site, and be recalled from a stored database at a later time. This can enable off-line forensic assessment of the underlying structures and minimize any need for repetitive scanning in the future. A user can go back to the job site at a later time, equipped with information of the underlying structures relative to a previously used reference point.
  • FIG. 11A illustrates an exemplary implementation of analyzing sensor data and location data according to aspects of the present disclosure.
  • the method analyzes one or more patterns detected using the sensor data and the location data; and determines the objects behind the opaque surface based on the one or more patterns detected.
  • the methods performed in block 1102 may optionally and/or additionally include the methods performed in block 1104.
  • the method identifies patterns of fasteners behind the opaque surface; and determines an intersection of two adjoining drywall sheets based on the patterns of fasteners behind the opaque surface.
  • location data and sensor data can be used to enable pattern recognition.
  • Patterns may include data from one or more sensors.
  • Virtual pins can be dropped by the user at any point during the scan. These virtual pins can be referenced and returned to in the future for further investigation.
  • dense areas in a vertical pattern can indicate a stud.
  • the method can be configured to allow a user to create waypoints, or drop virtual pins to identify regions of the scan of particular interest.
  • an area of high electromagnetic field may indicate the presence of an electrical fault behind the opaque surface.
  • a metal sensor can provide more information about a wall than just the location of metal objects; for example, the location of a stud can be derived from data collected by metal sensors by following the vertical pattern created by metal fasteners.
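The stud-from-fastener-pattern idea above can be sketched by grouping detected fastener locations into near-vertical columns. The grouping tolerance and the two-fastener threshold below are illustrative assumptions, not parameters from this disclosure.

```python
def infer_stud_positions(fasteners, tol=0.5):
    """Group detected fastener locations (x, y) into near-vertical
    columns; a column of two or more fasteners suggests a stud.
    Returns the mean x position of each inferred stud. tol is an
    illustrative horizontal grouping tolerance."""
    columns = []
    for x, y in sorted(fasteners):
        for col in columns:
            if abs(col[0][0] - x) <= tol:
                col.append((x, y))
                break
        else:
            columns.append([(x, y)])
    return [sum(p[0] for p in col) / len(col)
            for col in columns if len(col) >= 2]

# Fasteners detected in three rough vertical lines (positions in inches):
fasteners = [(0.1, 4), (0.0, 20), (16.1, 4), (15.9, 20), (32.0, 4), (32.1, 20)]
studs = infer_stud_positions(fasteners)
```

With the sample data above, three columns emerge at roughly 0, 16, and 32 inches, mirroring the three dotted boxes discussed for FIG. 11B.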
  • FIG. 11B illustrates exemplary applications of using image patterns of one or more objects behind an opaque surface according to aspects of the present disclosure.
  • the upper image shows objects behind an opaque surface with the opaque surface removed.
  • the lower image shows a scanned image with a user selected material, namely ferrous metals, being displayed.
  • patterns of the fasteners, such as screws and nails, can be seen in three relatively vertical lines, shown in dotted boxes 1110a, 1112a, and 1114a in the lower image. From these patterns, it can be derived that there are three studs behind the opaque surface, as shown in the corresponding dotted boxes 1110b, 1112b, and 1114b in the upper image.
  • FIG. 12A illustrates another exemplary implementation of analyzing sensor data and location data according to aspects of the present disclosure.
  • the method analyzes relational information among multiple objects behind an opaque surface; and identifies the objects based on the relational information among multiple objects behind an opaque surface.
  • the methods performed in block 1202 may optionally and/or additionally include the methods performed in block 1204.
  • the method performs forensic determination of whether a building code has been met using the relational information among multiple objects behind the opaque surface; or plans a future project using the relational information among multiple objects behind the opaque surface.
  • the methods performed in block 1204 may optionally and/or additionally include the methods performed in block 1206.
  • the method performs an analysis of nailing patterns of plywood sheets in a structural shear wall behind the opaque surface; and determines whether the building code has been met based on the analysis. For example, using the relational information of the distances among the studs derived from FIG. 11B, it can be determined whether the studs were placed according to the building code. In another example, using the relational information of nailing patterns of plywood sheets in a structural shear wall, it can be determined whether the building code has been met in the construction process. According to aspects of the present disclosure, location data and sensor data collected by the location tracker and sensor device can be displayed on the same topographic grid, thereby showing elements behind an opaque surface in relation to each other as well as their grid position in the space behind the opaque surface. This information can be used in forensic analysis of architectural conformity, code compliance, structural integrity, cost estimation prior to commencement of a construction project, and post construction assessment of sound engineering practice and construction quality.
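As one hedged illustration of a compliance check on derived relational information, the sketch below compares consecutive stud centerlines against a parameterized on-center spacing. Sixteen inches on center is a common framing requirement, but the governing value depends on the applicable building code, so it is treated here as a parameter rather than a fixed rule.

```python
def spacing_complies(stud_centers, required_oc=16.0, tol=0.5):
    """Check whether consecutive stud centerlines fall within a
    tolerance of the required on-center spacing. required_oc and tol
    are illustrative; consult the applicable building code."""
    centers = sorted(stud_centers)
    gaps = [b - a for a, b in zip(centers, centers[1:])]
    return all(abs(g - required_oc) <= tol for g in gaps)
```

A similar gap comparison could be applied to nailing patterns along a shear-wall panel edge.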
  • FIG. 12B illustrates exemplary applications of using relational information of one or more objects behind an opaque surface according to aspects of the present disclosure.
  • the upper image shows a side view of a stud and drywall sheets attached to the stud.
  • the lower image shows a scanned image of a front view of the same wall.
  • a tape joint is an intersection of two adjoining drywall sheets. Screws are placed along the edge of the drywall sheets to secure the material. Plaster is then placed into the crevasse and tape is placed over the plaster to create a smooth transition between the two drywall sheets.
  • relational information of objects, such as two fasteners in close proximity to each other as highlighted in dotted box 1210a
  • patterns of adjacent fasteners shown in dotted boxes 1212a, 1212b and 1212c, provide relational information of the objects behind the wall. From this relational information, it can be derived that there is an intersection of two adjoining drywall sheets behind the wall. The line of the intersection is in between each pair of adjacent fasteners, as approximately indicated at points 1216a, 1216b, and 1216c.
  • dotted box 1210b indicates the intersection of two adjoining drywall sheets 1214 and 1218 behind an opaque surface, with point 1216d approximately indicating the line of intersection.
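The tape-joint inference above, a joint line falling between each pair of closely spaced fasteners, can be sketched for a single horizontal row of detected fasteners. The pairing gap is an illustrative assumption, not a value from this disclosure.

```python
def tape_joint_lines(fasteners, pair_gap=1.5):
    """Infer drywall joint lines from one horizontal row of detected
    fasteners: two fasteners in close proximity (one in the edge of
    each adjoining sheet) suggest a joint midway between them.
    pair_gap is an illustrative maximum pair spacing."""
    xs = sorted(x for x, _ in fasteners)
    joints = []
    i = 0
    while i < len(xs) - 1:
        if xs[i + 1] - xs[i] <= pair_gap:
            joints.append((xs[i] + xs[i + 1]) / 2.0)
            i += 2  # consume both fasteners of the pair
        else:
            i += 1
    return joints

# One row: a lone stud fastener, a close pair at a sheet edge, another lone one.
row = [(0.0, 10), (15.5, 10), (16.5, 10), (32.0, 10)]
joints = tape_joint_lines(row)
```

Repeating this over several rows and connecting the midpoints would trace the joint line, as indicated at points 1216a, 1216b, and 1216c.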
  • FIG. 12C illustrates another exemplary application of using relational information of one or more objects behind an opaque surface according to aspects of the present disclosure.
  • the upper image shows a first scanned image with a user selected material, namely ferrous metals, being displayed.
  • the lower image shows a second scanned image with a user selected material, namely non-ferrous metals, being displayed.
  • ferrous metals include metals or alloys that contain iron. Examples of ferrous metal may include steel, carbon steel, and cast iron.
  • the upper image shows a steel strap and screws detected behind a wall. Non-ferrous metals do not contain iron. Examples of non-ferrous metals may include copper, aluminum, lead, zinc, silver, gold, nickel, titanium, and brass.
  • the lower image shows a copper pipe and screws detected behind the same wall.
  • the scanned images provide relational information of the objects behind the wall. From the relational information, it can be derived that there is a copper pipe 1220 behind the wall and the copper pipe is tied down by a steel strap 1222 in between two studs, which are indicated by the patterns of the fasteners as described above in association with FIG. 11B. This information can be used to verify whether the objects behind the opaque surface were constructed according to a building code.
  • FIG. 13A illustrates yet another exemplary implementation of analyzing sensor data and location data according to aspects of the present disclosure.
  • the method analyzes signal density of the objects using the sensor data and the location data; and informs the user of potential hazards in accordance with the signal density of the objects.
  • the methods performed in block 1302 may optionally and/or additionally include the methods performed in block 1304.
  • the method determines relative depth of the objects from the opaque surface using the signal density; and informs users to avoid drilling into an unintended object behind the opaque surface based on the relative depth of the objects from the opaque surface.
  • signal density can be used to determine the relative depth of objects behind an opaque surface, which in turn can be used to determine potential risks or hazards of interference with existing subsurface elements prior to performing a measured penetration of the opaque surface with a saw or a drill. Note that a larger amplitude of signal density can indicate an object is closer to the sensor; and a smaller amplitude of signal density can indicate an object is farther away from the sensor.
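The amplitude-to-depth relationship noted above can be sketched as a simple normalized inversion: the largest amplitude maps to the shallowest relative depth. This is an illustrative ranking only, not a calibrated depth model.

```python
def relative_depth(amplitudes):
    """Convert signal-density amplitudes into relative depths in
    [0, 1]: 0.0 for the largest amplitude (object closest to the
    sensor), 1.0 for the smallest (farthest away)."""
    lo, hi = min(amplitudes), max(amplitudes)
    if hi == lo:
        return [0.0] * len(amplitudes)
    return [(hi - a) / (hi - lo) for a in amplitudes]
```

Applied along a scan line over a pipe, the resulting depth profile would show where the pipe runs closest to the opaque surface, i.e. where drilling is riskiest.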
  • FIG. 13B illustrates an exemplary application of using signal density of one or more objects behind an opaque surface according to aspects of the present disclosure.
  • the upper image shows objects behind an opaque surface with the opaque surface removed.
  • the lower image shows a scanned image with a user selected material, namely non-ferrous metals, being displayed.
  • the object of interest is a copper pipe 1312 shown in the upper image, with its corresponding signal density 1314 shown in the lower scanned image.
  • areas with higher signal density (shown as darker areas) indicate the copper pipe 1312 may be closer to the opaque surface
  • areas with lower signal density (shown as lighter areas) indicate the copper pipe 1312 may be farther away from the opaque surface.
  • a user may choose an area that avoids the copper pipe to drill or cut into the wall, which results in reducing potential risks and hazards.
  • a similar technique may be employed with respect to other objects, such as electrical wires and plastic pipes.
  • the user may select a desired material to be displayed. With the guidance of a scanned image of signal density of the user selected material, potential risks and hazards may be reduced during remodeling or construction.
  • FIG. 13C illustrates another exemplary application of using signal density of one or more objects behind an opaque surface according to aspects of the present disclosure.
  • the upper image shows a two dimensional signal density of objects behind an opaque surface.
  • the lower image shows a three dimensional signal density of the objects behind the opaque surface.
  • the upper image shows two studs (1320a and 1322a) and a plastic pipe 1324a, and the corresponding three dimensional signal density profiles of the studs (1320b and 1322b) and the plastic pipe 1324b are shown in the lower image.
  • the signal density may be higher where there is a screw in the stud.
  • this is indicated by a darker area 1326a in the upper two dimensional image, and by a bump 1326b in the lower three dimensional image. Similar to the example of FIG. 13B, using the signal density information, a user may choose an area to drill or cut into the wall in order to reduce potential risks and hazards.
  • FIG. 13D illustrates yet another exemplary application of using signal density of one or more objects behind an opaque surface according to aspects of the present disclosure.
  • the upper image shows a two dimensional signal density collected by capacitive sensors about objects behind an opaque surface.
  • the lower image shows a three dimensional signal density collected by the capacitive sensors about the objects behind the opaque surface.
  • the upper image shows the two dimensional signal density profile of three studs (1330a, 1332a, and 1334a), and the corresponding three dimensional signal density profiles of the studs (1330b, 1332b, and 1334b) are shown in the lower image.
  • the signal density may be higher where a screw or other metal may be in a stud or on top of the stud.
  • a user may choose an area to drill or cut into the wall in order to reduce potential risks and hazards.
  • FIG. 14 illustrates a method of communicating information about the objects behind an opaque surface according to aspects of the present disclosure.
  • the method retrieves the information about the objects behind the opaque surface from the memory at a later time.
  • the methods performed in block 1402 may optionally and/or additionally include the methods performed in blocks 1404, 1406, 1408, and 1410.
  • the method displays the information about the objects behind the opaque surface as a heat map.
  • the method displays the information about the objects behind the opaque surface as a contour map.
  • the method displays one or more user selected types of material behind the opaque surface.
  • the method displays a combination of heat map, contour map, and/or one or more user selected types of material.
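As a minimal stand-in for the heat-map display option above, the sketch below renders a 2-D signal-density grid as coarse text shading; higher densities map to denser glyphs. The glyph ramp is an arbitrary illustrative choice, and a real display would render color images rather than text.

```python
def ascii_heat_map(grid, shades=" .:-=+*#"):
    """Render a 2-D signal-density grid as coarse text shading,
    mapping each value linearly onto the shade ramp."""
    lo = min(min(row) for row in grid)
    hi = max(max(row) for row in grid)
    span = (hi - lo) or 1.0  # avoid division by zero on flat grids
    lines = []
    for row in grid:
        idx = [int((v - lo) / span * (len(shades) - 1)) for v in row]
        lines.append("".join(shades[i] for i in idx))
    return "\n".join(lines)
```

A contour-map view could be built from the same grid by thresholding the normalized values into bands instead of mapping them to glyphs.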
  • information may be shared with contractors without need for them to visit a construction or project site; and incomplete or work-in-progress information may be saved when a project is delayed, etc. This enables a team to return to a previous work site and retrieve the relevant information to continue an unfinished project.
  • location data and sensor data may be presented in multiple ways to the user. For example, if a user is searching for a wood stud, then information about wood stud can be emphasized, while information about other types of objects or materials may be filtered out. Similarly, if a user is searching for metallic pipe, or live electrical wire, these targets can be emphasized, and other types of objects or materials may be filtered out. In some other applications, different layers of information may be extracted from the underlying data set to create layers of the images for display.
  • BIM (Building Information Modeling)
  • BIM is the holistic process of creating and managing information for a built asset. Based on an intelligent model and enabled by a cloud platform, BIM integrates structured, multidisciplinary data to produce a digital representation of an asset across its lifecycle, from planning and design to construction and operations. BIM allows design and construction teams to work more efficiently, but it also allows them to capture the data they have created during the process to benefit operations and maintenance activities for the life cycle of the project.
  • references to specific functional units are to be seen as references to suitable means for providing the described functionality rather than indicative of a strict logical or physical structure or organization.
  • the invention can be implemented in any suitable form, including hardware, software, firmware, or any combination of these.
  • the invention may optionally be implemented partly as computer software running on one or more data processors and/or digital signal processors, along with the hardware components described above.
  • the elements and components of an embodiment of the invention may be physically, functionally, and logically implemented in any suitable way. Indeed, the functionality may be implemented in a single unit, in a plurality of units, or as part of other functional units. As such, the invention may be implemented in a single unit or may be physically and functionally distributed between different units and processors/controllers.
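The emphasize-and-filter presentation described in the bullets above could be sketched as follows. This is a minimal illustration only: the `Reading` layout, the material labels, and the function names are assumptions made for the example, not part of the disclosed apparatus.

```python
# Sketch: emphasizing one target material in mapped sensor data and
# splitting the underlying data set into per-material display layers.
from dataclasses import dataclass


@dataclass(frozen=True)
class Reading:
    x_in: float      # position along the scanned surface, in inches (assumed unit)
    material: str    # classified material behind the opaque surface
    strength: float  # normalized sensor response, 0.0 to 1.0


def emphasize(readings, target):
    """Keep only readings of the target material; other types are filtered out."""
    return [r for r in readings if r.material == target]


def layers(readings):
    """Group readings by material to form separate image layers for display."""
    grouped = {}
    for r in readings:
        grouped.setdefault(r.material, []).append(r)
    return grouped


# Hypothetical scan of a wall section.
scan = [
    Reading(0.0, "wood stud", 0.9),
    Reading(1.5, "metallic pipe", 0.7),
    Reading(3.0, "wood stud", 0.8),
    Reading(4.5, "live wire", 0.6),
]

studs = emphasize(scan, "wood stud")
print([r.x_in for r in studs])   # positions of wood studs only: [0.0, 3.0]
print(sorted(layers(scan)))      # available display layers, alphabetically
```

A real device would render each layer graphically (for example as a heat map or contour map) rather than printing positions, but the grouping and filtering steps would be analogous.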

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Environmental & Geological Engineering (AREA)
  • Geology (AREA)
  • General Life Sciences & Earth Sciences (AREA)
  • Geophysics (AREA)
  • Electromagnetism (AREA)
  • Business, Economics & Management (AREA)
  • Economics (AREA)
  • Tourism & Hospitality (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Health & Medical Sciences (AREA)
  • General Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Geophysics And Detection Of Objects (AREA)
  • Processing Or Creating Images (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Image Generation (AREA)
EP23883618.3A 2022-10-25 2023-10-24 Vorrichtung und verfahren zur abbildung von objekten hinter einer opaken oberfläche Pending EP4609233A2 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US17/972,740 US12174332B2 (en) 2019-11-27 2022-10-25 Apparatus and method for mapping objects behind an opaque surface
PCT/US2023/077584 WO2024091903A2 (en) 2022-10-25 2023-10-24 Apparatus and method for mapping objects behind an opaque surface

Publications (1)

Publication Number Publication Date
EP4609233A2 true EP4609233A2 (de) 2025-09-03

Family

ID=90831873

Family Applications (1)

Application Number Title Priority Date Filing Date
EP23883618.3A Pending EP4609233A2 (de) 2022-10-25 2023-10-24 Vorrichtung und verfahren zur abbildung von objekten hinter einer opaken oberfläche

Country Status (3)

Country Link
EP (1) EP4609233A2 (de)
JP (1) JP2026501911A (de)
WO (1) WO2024091903A2 (de)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130321621A1 (en) * 2012-05-31 2013-12-05 Martin M. Menzel Method for Mapping Hidden Objects Using Sensor Data
US9639960B1 (en) * 2016-11-04 2017-05-02 Loveland Innovations, LLC Systems and methods for UAV property assessment, data capture and reporting

Also Published As

Publication number Publication date
WO2024091903A3 (en) 2024-07-04
WO2024091903A2 (en) 2024-05-02
JP2026501911A (ja) 2026-01-19

Similar Documents

Publication Publication Date Title
US12265193B2 (en) Mapping objects behind an opaque surface using signal density
US9879994B2 (en) Method of placing a total station in a building
US10088344B2 (en) Underlying wall structure finder and infrared camera
TW201818297A (zh) 作業辨識裝置以及作業辨識方法
US11220867B2 (en) Continuous live tracking system for placement of cutting elements
JP6719368B2 (ja) 3次元空間可視化装置、3次元空間可視化方法およびプログラム
JP7286881B2 (ja) 不透明面の背後で検出された物体を区別するためのスキャナ
JP2021032716A (ja) 測量データ処理装置、測量データ処理方法および測量データ処理用プログラム
JP2007212264A (ja) 三次元レーザスキャナのスキャニング方法
JP2020172784A (ja) 山岳トンネルコンクリート厚測定方法および測定装置
EP4609233A2 (de) Vorrichtung und verfahren zur abbildung von objekten hinter einer opaken oberfläche
Loporcaro et al. Evaluation of Microsoft HoloLens Augmented Reality Technology as a construction checking tool
EP4430490A1 (de) Verfahren und system zur erzeugung eines 3d-modells für digitalen zwilling aus einer punktwolke
US20250349031A1 (en) Computer system for determining the position of an anchor, and method
CN106852189B (zh) 底层墙体结构探测器和红外相机
JP2006031549A5 (de)
US20250180774A1 (en) Augmented-reality supported non-destructive testing
Rodriguez et al. Feasibility of location tracking of construction resources using UWB for better productivity and safety
JP2019138659A (ja) キャリブレーション装置、キャリブレーション方法、制御装置および制御方法
JP2025079001A (ja) 情報処理装置、情報処理方法、およびシステム
JP2022129638A (ja) 情報処理装置、情報処理方法、プログラムおよび情報処理システム
WO2024165895A1 (en) Method and system for determining a position of a plurality of lidar sensors for industrial risky zones
CN116583843A (zh) 用于进行点云场地调试的系统和方法
WO2020079487A1 (en) Systems and methods for customized presentation of digital information in a physical space
JP2013061892A (ja) 設計支援装置、設計支援方法および設計支援プログラム

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20250425

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC ME MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)