WO2024091223A1 - Scanning device with geometric patterns for camera calibration - Google Patents

Scanning device with geometric patterns for camera calibration

Info

Publication number
WO2024091223A1
WO2024091223A1 PCT/US2022/047660 US2022047660W WO2024091223A1 WO 2024091223 A1 WO2024091223 A1 WO 2024091223A1 US 2022047660 W US2022047660 W US 2022047660W WO 2024091223 A1 WO2024091223 A1 WO 2024091223A1
Authority
WO
WIPO (PCT)
Prior art keywords
scanning device
cameras
individual
foot
pressure
Prior art date
Application number
PCT/US2022/047660
Other languages
English (en)
Inventor
Laurence I. Schwartz
Kumar Rajan
Original Assignee
Aetrex, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Aetrex, Inc. filed Critical Aetrex, Inc.
Publication of WO2024091223A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • A - HUMAN NECESSITIES
    • A43 - FOOTWEAR
    • A43D - MACHINES, TOOLS, EQUIPMENT OR METHODS FOR MANUFACTURING OR REPAIRING FOOTWEAR
    • A43D 1/00 - Foot or last measuring devices; Measuring devices for shoe parts
    • A43D 1/02 - Foot-measuring devices
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 17/00 - Diagnosis, testing or measuring for television systems or their details
    • H04N 17/002 - Diagnosis, testing or measuring for television systems or their details for television cameras
    • H04N 5/00 - Details of television systems
    • H04N 5/30 - Transforming light or analogous information into electric information
    • H04N 5/33 - Transforming infrared radiation

Definitions

  • the present disclosure relates to the field of customized orthotic devices, and more particularly, to scanning devices used in connection with the production of customized orthotic devices or recommendations of orthotic devices.
  • Foot problems and the corresponding costs associated with foot care are significant in the United States and elsewhere. In cases where the foot problem is debilitating for particular activities, a number of hours of work time can be lost. Foot problems can arise from medical conditions, work conditions requiring standing or walking, athletic activities, and the like. Thus, foot problems can develop from medical conditions, work activity, or leisure activity.
  • Pedorthics is the field concerned with the design, manufacture, fit, and modification of footwear, foot orthotics, and foot devices as prescribed to help relieve painful or disabling conditions of the foot.
  • the goal of pedorthics is to provide protection and comfort to the consumer/patient, which has been achieved primarily by developing orthotic devices capable of reducing pressure at the greatest areas of impact.
  • additive manufacturing technologies have been used to produce custom orthotic devices or insoles in lieu of traditional subtractive manufacturing techniques and injection molding. Techniques, such as pressure sensing or imaging, have been used to compute three-dimensional (3D) models of the foot, which serve as the basis for generating customized orthotic devices with additive manufacturing or suggesting recommended pre-fabricated orthotic devices.
  • a scanning device comprises: a support base comprising a substantially flat upper surface; a calibration pattern disposed on the upper surface and undetectable or substantially undetectable by wavelengths in the visible spectrum; a plurality of cameras distributed around an outer perimeter of the support base and substantially oriented toward a center of the support base to capture the calibration pattern; and a processing device operatively coupled to each of the plurality of cameras.
  • the processing device is configured to activate and receive data generated by each of the plurality of cameras.
  • the calibration pattern comprises a plurality of circles.
  • the scanning device further comprises a pressure panel disposed on the upper surface of the support base.
  • the scanning device further comprises: a foil layer disposed on the pressure panel.
  • the calibration pattern is incorporated into the foil layer.
  • the plurality of cameras disposed around the outer perimeter of the support base are equidistant from the center of the pressure panel.
  • the processing device is further operably coupled to the pressure panel. In at least one embodiment, the processing device is further configured to activate and receive data generated by the pressure panel.
  • pressure panel comprises a plurality of pressure sensors arranged in a planar configuration.
  • each of the plurality of pressure sensors, when the pressure panel is activated, is configured to generate signals representative of underfoot pressure when an individual’s foot is in contact with the pressure panel, the signals collectively defining a two-dimensional pressure map of the individual’s foot.
  • the processing device is configured to generate a three-dimensional reconstruction of an individual’s foot based on data captured by the pressure panel and each of the plurality of cameras when the individual’s foot is in contact with the pressure panel.
  • the processing device is configured to transmit data generated by the pressure panel and each of the plurality of cameras to a processing server for generating a three-dimensional reconstruction of the individual’s foot and/or data descriptive of an orthotic device customized to the individual’s anatomy.
  • the outer perimeter of the support base is a circular perimeter.
  • a total number of the plurality of cameras is four.
  • the four cameras are unevenly distributed around the circular perimeter.
  • At least one of the plurality of cameras comprises a depth sensor configured to capture depth data during image capture by its corresponding camera.
  • a method comprises: capturing, by each of a plurality of cameras, one or more calibration images of a calibration pattern disposed above a support base of a scanning device; calibrating each of the plurality of cameras based on the one or more calibration images; and capturing images of an individual’s foot by the plurality of cameras arranged around the support base.
  • the calibration pattern is undetectable or substantially undetectable by wavelengths in the visible spectrum.
  • the method further comprises computing a three-dimensional reconstruction of the individual’s foot based at least partially on the images of the individual’s foot.
  • the one or more calibration images are infrared images.
  • the calibration pattern comprises a plurality of circles.
  • one or more of the plurality of circles differ in diameter.
  • the support base comprises a pressure panel, and wherein the calibration pattern is part of a foil layer disposed on the pressure panel.
  • the method further comprises capturing a two-dimensional pressure map of the individual’s foot while the individual is standing on the pressure panel.
  • the method further comprises: generating a three-dimensional reconstruction of the individual’s foot based on the images of the individual’s foot and the two-dimensional pressure map.
  • a system comprises the scanning device of any of the preceding embodiments configured to perform the method of any of the preceding embodiments.
  • a non-transitory computer-readable storage medium comprises instructions encoded thereon that, when executed by a processing device, cause the processing device to perform the method of any of the preceding embodiments.
  • FIG. 1 illustrates an exemplary system architecture in accordance with embodiments of the present disclosure.
  • FIG. 2A shows a perspective view of an exemplary scanning device in accordance with embodiments of the present disclosure.
  • FIG. 2B shows a side view of the exemplary scanning device in accordance with embodiments of the present disclosure.
  • FIG. 2C shows a top view of the exemplary scanning device in accordance with embodiments of the present disclosure.
  • FIG. 3 shows an image of the scanner captured in the visible spectrum (top) compared to an image of the scanner captured in the infrared spectrum (bottom), revealing the calibration pattern along the top of the pressure panel in accordance with embodiments of the present disclosure.
  • FIG. 4A illustrates pre-defined approximations for the geometric patterns to be identified within images captured by each camera in accordance with embodiments of the present disclosure.
  • FIG. 4B shows infrared images captured by each camera from their respective vantage points in accordance with embodiments of the present disclosure.
  • FIG. 4C illustrates ellipse fitting based on the calibration pattern in accordance with embodiments of the present disclosure.
  • FIG. 4D illustrates detection of the final set of ellipses across all cameras in accordance with embodiments of the present disclosure.
  • FIG. 5 is a flow diagram illustrating a method of scanning an individual’s foot or feet in accordance with embodiments of the present disclosure.
  • FIG. 6 is a block diagram illustrating an exemplary computer system for use in accordance with embodiments of the present disclosure.
  • a scanning device capable of capturing images of the user’s foot or feet from different angles with a plurality of cameras.
  • the scanning device includes, in addition to or in lieu of imaging functionality, the ability to capture two-dimensional pressure maps of an individual’s foot or feet using a pressure panel.
  • the scanning device may comprise a foil layer with a camera calibration pattern (or simply a “calibration pattern”) that is used by the plurality of cameras for calibration.
  • the foil layer may, for example, be disposed on the pressure panel.
  • the terms “foil” and “foil layer” may refer to a thin material formed from one or more layers, and may include one or more layers of a deposited film, a paint, an ink, or other material.
  • the foil layer may include an outermost protective layer to prevent damage to underlying layers.
  • the film, paint, ink, or other material may be disposed directly on a surface on which the user stands during operation of the scanning device with or without a protective layer.
  • a deposited film, paint, ink, or other material may form a pattern (e.g., on a surface of the scanning device, on the foil, etc.), such as a plurality of circles, at fixed locations to facilitate self-calibration by the cameras.
  • certain embodiments utilize an ellipse fitting algorithm and a camera calibration algorithm based on geometric patterns (e.g., a geometric pattern painted onto the foil or directly onto a surface of the scanning device).
  • the pattern is formed using materials that are detectable in the infrared spectrum (e.g., greater than about 700 nm) and undetectable or substantially undetectable in the visible spectrum (i.e., undetectable or substantially undetectable by the human eye).
  • “substantially undetectable” means that the pattern may be very faintly visible to the human eye or faintly detectable at wavelengths less than 700 nm in the visible spectrum.
  • Undetectable or substantially undetectable patterns include those produced using visibly opaque markings and coated with a material that is infrared-transparent or infrared-translucent, or patterns produced with markings formed from materials that are undetectable or substantially undetectable to visible light, with such materials being familiar to those of ordinary skill in the art.
  • the scanning device may further be capable of enabling dynamic gait analysis by capturing a series of pressure maps of underfoot pressure when the individual steps onto and/or off of the scanning device.
  • the scanning device or a separate device, performs a 3D reconstruction of the individual’s foot or feet based on the pressure map (representative of the bottom of the foot) and the images captured at various angles (representative of the top, front, side, and back views of the foot).
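  • Purely as a simplified sketch of the data-fusion idea (not the patented reconstruction method, which may rely on the geometric partial differential equation approach referenced below), the following assumes each camera provides a depth image with known intrinsics K and calibration-derived extrinsics (R, t), and that the pressure panel defines the z = 0 plane whose loaded cells approximate the sole of the foot; cell_m is a hypothetical pressure-cell pitch in meters.

        import numpy as np

        def depth_to_world(depth, K, R, t):
            """Back-project a depth image (in meters) into world coordinates.

            K is the 3x3 intrinsic matrix; R (3x3) and t (3,) map camera
            coordinates into the world frame.
            """
            h, w = depth.shape
            u, v = np.meshgrid(np.arange(w), np.arange(h))
            z = depth.ravel()
            valid = z > 0
            pix = np.stack([u.ravel(), v.ravel(), np.ones(h * w)])[:, valid]
            cam_pts = np.linalg.inv(K) @ pix * z[valid]      # 3xN camera-frame points
            return (R @ cam_pts + t.reshape(3, 1)).T         # Nx3 world-frame points

        def fuse_foot_points(depths, intrinsics, extrinsics, pressure_map, cell_m, load_thresh=0.0):
            """Combine per-camera point clouds with the underfoot footprint at z = 0."""
            clouds = [depth_to_world(d, K, R, t)
                      for d, K, (R, t) in zip(depths, intrinsics, extrinsics)]
            rows, cols = np.nonzero(pressure_map > load_thresh)    # loaded pressure cells
            sole = np.stack([cols * cell_m, rows * cell_m, np.zeros(len(rows))], axis=1)
            return np.vstack(clouds + [sole])                # raw point set for surface fitting
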
  • the cameras are evenly distributed around a perimeter of the scanning device. In at least one embodiment, the cameras are unevenly distributed. For example, in an embodiment where only four cameras are used, the cameras may be arranged to define the four corners of a rectangle while being oriented toward a center of the pressure panel (i.e., toward the individual’s foot or feet).
  • the scanning device may enable dynamic gait analysis by capturing pressure data for the user’s foot or feet, for example at 5-10 second intervals as the user steps into, across, and/or out of the scanning device.
  • the data may be processed to generate a video showing the evolution of underfoot pressure over time.
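  • As one possible way to realize such a video (the frame rate, colormap, and container format below are arbitrary illustrative choices, not taken from the disclosure), a sequence of two-dimensional pressure maps could be colorized and written out with OpenCV:

        import cv2
        import numpy as np

        def pressure_maps_to_video(frames, out_path="gait.mp4", fps=10, scale=8):
            """Render a sequence of 2D pressure maps (2D float arrays) as a color video."""
            peak = max(float(f.max()) for f in frames) or 1.0     # shared scale across frames
            h, w = frames[0].shape
            writer = cv2.VideoWriter(out_path, cv2.VideoWriter_fourcc(*"mp4v"),
                                     fps, (w * scale, h * scale))
            for f in frames:
                gray = np.uint8(255 * np.asarray(f, dtype=np.float32) / peak)
                color = cv2.applyColorMap(gray, cv2.COLORMAP_JET)      # pressure -> heat map
                color = cv2.resize(color, (w * scale, h * scale),
                                   interpolation=cv2.INTER_NEAREST)
                writer.write(color)
            writer.release()
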
  • Certain embodiments of the present disclosure are also directed to methods utilizing geometric partial differential equations to generate a 3D surface representative of a foot.
  • the method can advantageously compute the 3D model using depth images obtained from the cameras in conjunction with an underfoot pressure map so as to account for the underside of the foot which is not visible to the cameras.
  • Such methods are described in U.S. Patent No. 11,151,738, the disclosure of which is hereby incorporated by reference herein in its entirety.
  • FIG. 1 illustrates an exemplary system architecture 100, in accordance with embodiments of the present disclosure.
  • the system architecture 100 includes a scanning device 200, a data processing server 120, a client device 130, and a data store 140, with each device of the system architecture 100 being communicatively coupled via a network 105.
  • One or more of the devices of the system architecture 100 may be implemented using a generalized computer system 600, described with respect to FIG. 6.
  • the devices of the system architecture 100 are merely illustrative, and it is to be understood that other scanning devices, user devices, data processing servers, data stores, and networks may be present.
  • network 105 may include a public network (e.g., the Internet), a private network (e.g., a local area network (LAN) or wide area network (WAN)), a wired network (e.g., Ethernet network), a wireless network (e.g., an 802.11 network or a Wi-Fi network), a cellular network (e.g., a Long Term Evolution (LTE) network), routers, hubs, switches, server computers, and/or a combination thereof.
  • the network 105 may include one or more networks operating as stand-alone networks or in cooperation with each other.
  • the network 105 may utilize one or more protocols of one or more devices to which it is communicatively coupled.
  • the scanning device 200 includes a support base comprising substantially flat upper and lower surfaces, a pressure panel disposed on the upper surface of the support base, and a plurality of cameras distributed around an outer perimeter of the support base and substantially oriented toward a center of the pressure panel.
  • the scanning device 200 further comprises an on-board processing device operatively coupled to the pressure panel and each of the plurality of cameras. The processing device may be configured to activate and receive data generated by the pressure panel and each of the plurality of cameras.
  • An exemplary scanning device 200 is described in greater detail with respect to FIGS. 2A-2C.
  • the data processing server 120 may include one or more computing devices (such as a rackmount server, a router computer, a server computer, a personal computer, a mainframe computer, a laptop computer, a tablet computer, a desktop computer, etc.), data stores (e.g., hard disks, memories, databases), networks, software components, and/or hardware components from which digital contents may be retrieved.
  • the data processing server 120 may be a server utilized by the scanning device 200, for example, to process generated scan data of an individual’s anatomy.
  • additional data processing servers may be present.
  • the data processing server 120 utilizes a modeling component 122 to generate and reconstruct 3D model data from data received from the scanning device 200.
  • the client device 130 may include a computing device such as a personal computer (PC), laptop, mobile phone, smart phone, tablet computer, netbook computer, etc.
  • An individual user may be associated with (e.g., own and/or operate) the client device 130.
  • a “user” may be represented as a single individual.
  • other embodiments of the present disclosure encompass a “user” being an entity controlled by a set of users and/or an automated source.
  • a set of individual users federated as a community in a company or government organization may be considered a “user.”
  • the user is the individual who is the subject of scanning by the scanning device 200.
  • the user is an operator, technician, or physician who is conducting or assisting with the scan of another individual with the scanning device 200.
  • the client device 130 may utilize one or more local data stores, which may be internal or external devices, and may each include one or more of a short-term memory (e.g., random access memory), a cache, a drive (e.g., a hard drive), a flash drive, a database system, or another type of component or device capable of storing data.
  • the local data stores may also include multiple storage components (e.g., multiple drives or multiple databases) that may also span multiple computing devices (e.g., multiple server computers). In at least one embodiment, the local data stores may be used for data back-up or archival purposes.
  • the client device 130 may implement a user interface 132, which may allow the client device 130 to send/receive information to/from other client devices, the scanning device 200, the data processing server 120, and the data store 140.
  • the user interface 132 may be a graphical user interface (GUI).
  • the user interface 132 may be a web browser interface that can access, retrieve, present, and/or navigate content (e.g., web pages such as Hyper Text Markup Language (HTML) pages) provided by the data processing server 120.
  • the user interface 132 may be a standalone application (e.g., a mobile “app,” etc.), that enables a user to use the client device 130 to send/receive information to/from other client devices, the scanning device 200, the data processing server 120, and the data store 140.
  • the data store 140 may include one or more of a short-term memory (e.g., random access memory), a cache, a drive (e.g., a hard drive), a flash drive, a database system, or another type of component or device capable of storing data.
  • the data store 140 may also include multiple storage components (e.g., multiple drives or multiple databases) that may also span multiple computing devices (e.g., multiple server computers).
  • the data store 140 may be cloud-based.
  • One or more of the devices of system architecture 100 may utilize their own storage and/or the data store 140 to store public and private data, and the data store 140 may be configured to provide secure storage for private data.
  • Such private data may include, for example, data descriptive of individuals who have been scanned with the scanning device 200, including names, contact information, physiological data, and scan data.
  • the data store 140 may be used for data back-up or archival purposes.
  • While each of the scanning device 200, the data processing server 120, the client device 130, and the data store 140 is depicted in FIG. 1 as a single, disparate component, these components may be implemented together in a single device or networked in various combinations of multiple different devices that operate together. In at least one embodiment, some or all of the functionality of the data processing server 120 and/or the data store 140 may be performed by the scanning device 200, the client device 130, or other devices.
  • the client device 130 may be within close proximity of or integrated with the scanning device 200, for example, as part of a scanning kiosk. In such embodiments, the client device 130 may implement the functionality of the modeling component 122, or may utilize the data processing server 120 to implement some or all of the functionality of the modeling component 122.
  • FIGS. 2A-2C show various views of the exemplary scanning device 200 in accordance with embodiments of the present disclosure.
  • the scanning device 200 includes a support base 202, a pressure panel 204, and a plurality of cameras 206 distributed around the support base 202.
  • Each camera 206 may be configured for capturing high-definition images (e.g., individual images or a movie), and may, in at least one embodiment, comprise an infrared sensor for capturing depth data. In at least one embodiment, one or more of the cameras 206 may be a stereo depth camera.
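  • The disclosure does not identify a particular camera model; purely as an illustration of how depth and infrared frames might be captured from a commodity stereo depth camera, the sketch below uses the pyrealsense2 SDK, and any camera exposing depth and infrared streams could be substituted.

        import numpy as np
        import pyrealsense2 as rs

        # Illustrative capture of one depth frame and one infrared frame; the
        # choice of hardware/SDK is an assumption, not taken from the disclosure.
        pipe = rs.pipeline()
        cfg = rs.config()
        cfg.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
        cfg.enable_stream(rs.stream.infrared, 1, 640, 480, rs.format.y8, 30)
        pipe.start(cfg)
        try:
            frames = pipe.wait_for_frames()
            depth = np.asanyarray(frames.get_depth_frame().get_data())         # 16-bit depth image
            infrared = np.asanyarray(frames.get_infrared_frame(1).get_data())  # 8-bit IR image
        finally:
            pipe.stop()
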
  • the scanning device 200 may have one or more on-board processing devices that are operatively coupled to the cameras 206 and the pressure panel 204, and may transmit activation signals to the various components and control the timing at which signals are captured, collected, and transmitted to one or more external devices for processing (e.g., the data processing server 120, the client device 130, etc.).
  • one or more of the cameras 206 are housed within or mechanically coupled to respective support arms 208.
  • Each of the cameras 206 is mechanically coupled to or integrally formed with the support base via a respective support arm 208, which may be a substantially L-shaped, rigid member.
  • one or more of the support arms 208 are fixed in place, resulting in fixed, unmovable positions for the cameras 206. This may be beneficial in optimizing angles and distances at which images of the foot or feet are captured.
  • the positions of each camera 206 may be adjusted along the perimeter of the support base 202.
  • one or more of the support arms 208 may extend radially from the support base 202, and/or may be rotatable around a central axis of the support base 202 (e.g., slideably coupled to a track underneath the support base 202) and adjusted to a particular azimuthal angle.
  • one or more of the support arms 208 may be telescoping in order to adjust the vertical positions of their respective cameras 206 with respect to the support base 202.
  • the cameras 206 may be positioned to define a walking path 214 across the support base 202, as illustrated in FIG. 2C.
  • the left-most and right-most support arms 208 may be horizontally separated by a distance (e.g., 24-36 inches) to allow the individual to walk onto the support base 202 and pressure panel 204 either to enter the scanning device and prepare for a static scan, or to perform dynamic gait analysis.
  • the user may enter the scanning device from the bottom of FIG. 2C and rotate their feet/body by about 90 degrees.
  • the cameras 206 may be further separated to define an additional walking path (e.g., a walking path orthogonal to the walking path 214).
  • the cameras 206 may be configured to rotate around the outer perimeter of the support base 202 to perform image capture at different angles with respect to the user’s foot or feet.
  • the scanning device 200 may include a motorized coupling mechanism that allows the support arms 208 to travel along a stationary track, or each of the support arms 208 may be coupled to a motorized track.
  • the one or more cameras can be controlled such that images of the foot are captured at different angles as the cameras 206 traverse the track. In at least one embodiment, fewer than all of the cameras 206 shown are utilized, such as two or three cameras.
  • the pressure panel 204 includes a plurality of pressure cells arranged in a planar configuration (e.g., arranged in rows and columns or in another arrangement) adapted for generating polychromatic foot pressure readings.
  • the pressure panel may be an iStep® Pressure Plate (Aetrex Worldwide, Inc.) or a variation thereof, which uses over 3,700 pressure sensors that each span an area of 0.25 cm².
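  • Purely as an illustration, the two-dimensional pressure map produced by such a panel can be modeled as a grid of per-cell read-outs assembled into an array. In the sketch below, ROWS, COLS, and read_sensor() are hypothetical placeholders; the actual sensor count, units, and read-out interface are device-specific.

        import numpy as np

        ROWS, COLS = 64, 58          # hypothetical grid dimensions, not from the disclosure

        def read_sensor(row: int, col: int) -> float:
            """Placeholder for a single pressure-cell read-out (e.g., in kPa)."""
            return 0.0

        def capture_pressure_map() -> np.ndarray:
            """Assemble per-cell readings into a two-dimensional pressure map."""
            pressure_map = np.zeros((ROWS, COLS), dtype=np.float32)
            for r in range(ROWS):
                for c in range(COLS):
                    pressure_map[r, c] = read_sensor(r, c)
            return pressure_map
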
  • a method of generating a customized insole for footwear using information obtained from a pressure map of an individual’s feet is described in United States Patent No. 7,493,230, the disclosure of which is hereby incorporated by reference herein in its entirety.
  • the pressure panel 204 may be replaced with a substantially flat surface, which may be integrally formed with the support base.
  • a calibration pattern as discussed in greater detail below, may be disposed directly on this substantially flat surface in certain embodiments.
  • the support base 202 includes a power button/power indicator 210 for activating the scanning device 200.
  • the support base includes a panel 212, which may include a power input port and one or more ports for establishing a hard-wired connection with a client device (e.g., the client device 130) or a data processing server (e.g., the data processing server 120).
  • the scanning device 200 may be communicatively coupled to the client device or data processing server via a wireless connection.
  • an exemplary process for performing a scan with the scanning device 200 comprises first performing a static scan of the individual’s foot or feet.
  • the individual may be instructed (e.g., by a display screen operably coupled to the scanning device 200 or to an intermediate device, such as the client device 130 implementing the user interface 132) to step onto the pressure panel 204 with one foot or with both feet.
  • the individual is then instructed to place the other foot by itself onto the pressure panel 204 after completion of a scan of the first foot.
  • the static scan comprises measuring an underfoot pressure of the individual’s foot or feet by the pressure panel 204 and capturing images of and/or depth data for the individual’s foot or feet by the cameras 206.
  • the individual may be instructed to walk out of the scanning device 200 to perform a dynamic gait analysis by measuring a change in underfoot pressure over time during the individual’s movement.
  • the user may be instructed to walk into and out of the scanning device 200, walk into the scanning device 200 and remain still, or walk out of the scanning device 200 from a static position.
  • the dynamic gait analysis is performed prior to performing the static scan.
  • FIG. 3 shows an image of the scanner captured in the visible spectrum (top) compared to an image of the scanner captured in the infrared spectrum (bottom), revealing the calibration pattern along the top of the pressure panel.
  • the pattern of a foil layer is disposed onto an upper surface of the pressure panel using a paint/ink that is visible in the visible spectrum.
  • the paint may be overlaid with a further layer of paint/ink that is opaque in the visible spectrum, but is transparent in the infrared spectrum (e.g., opaque to the human eye, but invisible to the cameras during the calibration process).
  • the composition of the materials used in the foil layer may be any composition as appreciated by those of ordinary skill in the art.
  • a camera calibration algorithm is performed as follows. First, a pre-defined approximation for the geometric pattern is identified for each camera (as illustrated in FIG. 4A). Next, each of the cameras captures an infrared image of the foil (as shown in FIG. 4B). For each camera image, an algorithm is utilized to find all possible ellipse candidates using an adaptive thresholding and ellipse-fitting method, which may utilize the pre-defined patterns identified in the first step as starting points for the fitting (as illustrated in FIG. 4C). The detected ellipses from all the cameras are reconciled to detect the final set of ellipses across all the cameras (as shown in FIG. 4D). The extrinsic parameters for each camera are then derived from the geometry of the identified ellipses.
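  • The adaptive-thresholding, ellipse-fitting, and parameter-derivation steps above could be realized with standard computer-vision primitives. The sketch below is one plausible OpenCV-based implementation rather than the exact algorithm of the disclosure: it assumes 8-bit infrared images, that the detected ellipses have already been matched in order to the known 3D circle centers of the pattern, and that ellipse centers are an adequate approximation of the projected circle centers.

        import cv2
        import numpy as np

        def detect_ellipses(ir_image, min_contour_pts=20):
            """Find candidate ellipses in an 8-bit infrared calibration image."""
            binary = cv2.adaptiveThreshold(ir_image, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                           cv2.THRESH_BINARY_INV, 31, 5)
            contours, _ = cv2.findContours(binary, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)
            # cv2.fitEllipse needs >= 5 points; a higher floor filters out noise blobs.
            return [cv2.fitEllipse(c) for c in contours if len(c) >= min_contour_pts]

        def estimate_extrinsics(ellipses, pattern_points_3d, K, dist_coeffs):
            """Derive a camera's extrinsic parameters from matched ellipse centers."""
            centers_2d = np.array([e[0] for e in ellipses], dtype=np.float32)
            ok, rvec, tvec = cv2.solvePnP(np.asarray(pattern_points_3d, dtype=np.float32),
                                          centers_2d, K, dist_coeffs)
            return (rvec, tvec) if ok else (None, None)
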
  • Although FIGS. 4A-4D are illustrated for a scanning device with four cameras, these embodiments are applicable to scanning devices with an arbitrary number of cameras.
  • FIG. 5 is a flow diagram illustrating a method 500 of scanning an individual’s foot or feet in accordance with embodiments of the present disclosure.
  • the method 500 may be performed by processing logic that includes hardware (e.g., circuitry, dedicated logic, programmable logic, microcode, etc.), software (e.g., instructions run on a processing device to perform hardware simulation), or a combination thereof.
  • the method 500 is performed by a processing device of the data processing server 120 implementing the modeling component 122, which transmits signals to the scanning device 200 to manage data capture.
  • some or all of the functionality of the modeling component 122 is distributed between the scanning device 200, the data processing server 120, and/or the client device 130.
  • the method 500 begins at block 510, where the processing device causes each of a plurality of cameras (e.g., the cameras 206) to capture one or more calibration images of a calibration pattern disposed above a support base of a scanning device (e.g., the support base 202 of the scanning device 200).
  • the one or more calibration images are infrared images.
  • the pattern comprises a plurality of circles.
  • the plurality of circles differ in diameter.
  • the calibration pattern is part of a foil layer disposed on a pressure panel disposed on the support base.
  • each of the plurality of cameras is calibrated based on the one or more calibration images.
  • the processing device causes the plurality of cameras to capture images of the individual’s foot (or feet) from different viewpoints around the support base.
  • the cameras may be configured to capture images comprising depth data.
  • the processing device computes a three-dimensional reconstruction of the individual’s foot based at least partially on the captured images.
  • the processing device captures (e.g., directly by the scanning device 200 or by the scanning device 200 under the control of the data processing server 120) a two-dimensional pressure map of an individual’s foot (or feet) while the individual is standing on a pressure panel (e.g., the pressure panel 204).
  • the three-dimensional reconstruction of the individual’s foot is generated based further on the pressure map.
  • the processing device generates data descriptive of an orthotic device based on the three-dimensional reconstruction of the individual’s foot, for example, by generating a shape that matches an underfoot surface represented by the three-dimensional reconstruction.
  • the processing device generates a recommendation of a pre-made orthotic device based on various features represented by or derivable from the three- dimensional reconstruction (e.g., shoe size, arch height, heel width, or other features that would be appreciated by one of ordinary skill in the art).
  • the processing device transmits the data descriptive of the orthotic device to a manufacturing device to fabricate the orthotic device.
  • FIG. 6 illustrates a diagrammatic representation of a machine in the exemplary form of a computer system 600 within which a set of instructions (e.g., for causing the machine to perform any one or more of the methodologies discussed herein) may be executed.
  • the machine may be connected (e.g., networked) to other machines in a LAN, an intranet, an extranet, or the Internet.
  • the machine may operate in the capacity of a server or a client machine in client-server network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
  • the machine may be a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a server, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
  • Some or all of the components of the computer system 600 may be utilized by or illustrative of at least some of the devices of the system architecture 100, such as the scanning device 200, the data processing server 120, the client device 130, and the data store 140.
  • the exemplary computer system 600 includes a processing device (processor) 602, a main memory 604 (e.g., read-only memory (ROM), flash memory, dynamic random access memory (DRAM) such as synchronous DRAM (SDRAM) or Rambus DRAM (RDRAM), etc.), a static memory 606 (e.g., flash memory, static random access memory (SRAM), etc.), and a data storage device 620, which communicate with each other via a bus 610.
  • Processor 602 represents one or more general-purpose processing devices such as a microprocessor, central processing unit, or the like. More particularly, the processor 602 may be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word (VLIW) microprocessor, or a processor implementing other instruction sets or processors implementing a combination of instruction sets. The processor 602 may also be one or more special-purpose processing devices such as an ASIC, a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like. The processor 602 is configured to execute instructions 626 for performing the operations and steps discussed herein, such as operations associated with the modeling component 122.
  • the computer system 600 may further include a network interface device 608.
  • the computer system 600 also may include a video display unit 612 (e.g., a liquid crystal display (LCD), a cathode ray tube (CRT), or a touch screen), an alphanumeric input device 614 (e.g., a keyboard), a cursor control device 616 (e.g., a mouse), and/or a signal generation device 622 (e.g., a speaker).
  • Power device 618 may monitor a power level of a battery used to power the computer system 600 or one or more of its components.
  • the power device 618 may provide one or more interfaces to provide an indication of a power level, a time window remaining prior to shutdown of the computer system 600 or one or more of its components, a power consumption rate, an indicator of whether the computer system is utilizing an external power source or battery power, and other power-related information.
  • indications related to the power device 618 may be accessible remotely (e.g., accessible to a remote back-up management module via a network connection).
  • a battery utilized by the power device 618 may be an uninterruptable power supply (UPS) local to or remote from computer system 600. In such embodiments, the power device 618 may provide information about a power level of the UPS.
  • the data storage device 620 may include a computer-readable storage medium 624 on which is stored one or more sets of instructions 626 (e.g., software) embodying any one or more of the methodologies or functions described herein.
  • the instructions 626 may also reside, completely or at least partially, within the main memory 604 and/or within the processor 602 during execution thereof by the computer system 600, the main memory 604 and the processor 602 also constituting computer-readable storage media.
  • the instructions 626 may further be transmitted or received over a network 630 (e.g., the network 105) via the network interface device 608.
  • the instructions 626 include instructions for operating or processing data generated by the scanning device 200, as described throughout this disclosure. While the computer-readable storage medium 624 is shown in an exemplary embodiment to be a single medium, the terms “computer-readable storage medium” or “machine-readable storage medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions.
  • The terms “computer-readable storage medium” and “machine-readable storage medium” shall also be taken to include any transitory or non-transitory medium that is capable of storing, encoding, or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure.
  • The term “computer-readable storage medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.
  • the disclosure also relates to an apparatus, device, or system for performing the operations herein.
  • This apparatus, device, or system may be specially constructed for the required purposes, or it may include a general purpose computer selectively activated or reconfigured by a computer program stored in the computer.
  • a computer program may be stored in a computer- or machine-readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, compact disk read-only memories (CD-ROMs), and magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, or any type of media suitable for storing electronic instructions.
  • The words “example” or “exemplary” are used herein to mean serving as an example, instance, or illustration. Any aspect or design described herein as “example” or “exemplary” is not necessarily to be construed as preferred or advantageous over other aspects or designs. Rather, use of the words “example” or “exemplary” is intended to present concepts in a concrete fashion.
  • the term “or” is intended to mean an inclusive “or” rather than an exclusive “or.” That is, unless specified otherwise, or clear from context, “X includes A or B” is intended to mean any of the natural inclusive permutations.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

Embodiments of the present disclosure relate to a scanning device comprising a support base, a calibration pattern disposed on an upper surface of the support base, and a plurality of cameras distributed around an outer perimeter of the support base. In at least one embodiment, the calibration pattern is undetectable or substantially undetectable by the human eye.
PCT/US2022/047660 2022-10-25 2022-10-25 Scanning device with geometric patterns for camera calibration WO2024091223A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202263271446P 2022-10-25 2022-10-25
US63/271,446 2022-10-25

Publications (1)

Publication Number Publication Date
WO2024091223A1 (fr) 2024-05-02

Family

ID=90831558

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/047660 WO2024091223A1 (fr) 2022-10-25 Scanning device with geometric patterns for camera calibration

Country Status (1)

Country Link
WO (1) WO2024091223A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060034548A1 (en) * 2004-08-11 2006-02-16 Hamid Pishdadian Apparatus and method for scanning an object
US20120069193A1 (en) * 2010-09-16 2012-03-22 Honeywell International Inc. Thermal camera calibration
US20190236806A1 (en) * 2016-11-03 2019-08-01 Intel Corporation Real-time three-dimensional camera calibration
US20210279900A1 (en) * 2020-03-06 2021-09-09 Aetrex Worldwide, Inc. Scanning device with imaging and pressure-sensing functionality

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22963657

Country of ref document: EP

Kind code of ref document: A1