WO2016115536A2 - Determining three-dimensional information from projections or placement of two-dimensional patterns - Google Patents

Determining three-dimensional information from projections or placement of two-dimensional patterns Download PDF

Info

Publication number
WO2016115536A2
WO2016115536A2 (PCT/US2016/013723)
Authority
WO
WIPO (PCT)
Prior art keywords
target object
pattern
dimensional
initial
information regarding
Prior art date
Application number
PCT/US2016/013723
Other languages
French (fr)
Other versions
WO2016115536A3 (en)
Inventor
Robert Bismuth
William Orner
Amit Rohatgi
Original Assignee
Robert Bismuth
William Orner
Amit Rohatgi
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bismuth, William Orner, Amit Rohatgi filed Critical Robert Bismuth
Publication of WO2016115536A2 publication Critical patent/WO2016115536A2/en
Publication of WO2016115536A3 publication Critical patent/WO2016115536A3/en

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2513Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with several lines being projected in more than one direction, e.g. grids, patterns
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/16Measuring arrangements characterised by the use of optical techniques for measuring the deformation in a solid, e.g. optical strain gauge
    • G01B11/165Measuring arrangements characterised by the use of optical techniques for measuring the deformation in a solid, e.g. optical strain gauge by means of a grating deformed by the object

Definitions

  • the present application is generally related to determining three-dimensional information of an object from projections or placement of two-dimensional patterns onto the object.
  • the object can be a part of a person's body, which has a varied, irregular shape, and the measurements would be used to fit that part into a wearable piece.
  • the object can be the person's feet, and the wearable piece can be a pair of shoes.
  • the object can also be inanimate, large or heavy, and the measurements would be used when the object needs to be covered, transported, valuated, etc.
  • the object can be the trunk of a tree, and the measurements can be used to decide whether to cut the tree down and how to transport the resulting log.
  • the object can be a large gold statue, and the measurements can be used to determine how much it is worth.
  • the irregular shape presents a challenge in obtaining good measurements.
  • the shape of the human body can fluctuate over time, adding to the complexity of the challenge.
  • when an object is inanimate, large, or heavy, it can be difficult to take the measurements at all. It would be useful to be able to easily obtain good measurements.
  • FIG. 1 illustrates an example setup of a shape measurement system disclosed in the present application.
  • FIG. 2 illustrates example components of the computing portion of the shape measurement system, comprising one or more processors or memories.
  • FIG. 3 illustrates example graphical user interfaces (GUIs) shown on a display screen of the shape measurement system for directing a movement of the target object.
  • Fig. 4 illustrates an example process performed by the shape measurement system of projecting a two-dimensional pattern on a three-dimensional target object and analyzing the projected pattern in two dimensions to obtain three-dimensional information regarding the target object.
  • Fig. 5A illustrates another example setup of the shape measurement system.
  • Fig. 5B illustrates an example setup of the shape measurement system similar to the one shown in Fig. 5A.
  • Fig. 6 illustrates another example process performed by the shape measurement system of projecting a two-dimensional pattern on a three-dimensional target object and analyzing the projected pattern in two dimensions to obtain three-dimensional information regarding the target object.
  • Fig. 7 is a high-level block diagram showing an example architecture of a computer, which may represent any electronic device, any server, or any node within a cloud service as described herein.
  • This application discloses a shape measurement system that allows a user to easily take measurements of a target object.
  • the system projects or places two-dimensional initial patterns onto the target object and calculates three-dimensional information regarding the target object based on the two-dimensional projected patterns, which are transformed from the initial patterns due to the varying depths of the projection surface.
  • the system typically requires the user to merely gain access to a camera, which can be embedded into a mobile device, and optionally a light source, and the system quickly produces accurate and comprehensive measurements of the target object.
  • the shape measurement system has many potential applications.
  • the system can facilitate the fitting process and help reduce shipping efforts.
  • the system can assist in the making of prostheses, casts, braces, and biometric devices.
  • the system provides monitoring capabilities for weight loss and body sculpting efforts and can help with the manufacturing of fitness apparel and equipment, such as a helmet or shoes and boots.
  • the system can similarly facilitate the fitting of an animal into an outfit or a gear.
  • the system enables the tracking of body movement and positioning, such as for animation purposes.
  • the system makes it easy to replicate an existing object with three-dimensional printing.
  • the system similarly makes it easy to model new furniture for individual custom fit, model furniture after existing pieces and to replace parts or produce covers for existing pieces.
  • the system enables a structural analysis showing the relationships among different parts of the vehicle as well as a fitness analysis between the interior of the vehicle and a customer driving the vehicle.
  • Fig. 1 illustrates an example setup of the shape measurement system disclosed in the present application.
  • the system includes a light source 102 and a pattern molding 104 embedding a two-dimensional initial pattern.
  • the light source 102, the pattern molding 104, and a target object 106 regarding which three-dimensional information is to be obtained are positioned such that the two-dimensional pattern is projected onto a surface of the target object 106 resulting in a distorted pattern 108.
  • the system also includes a camera 110, which then captures the distorted pattern 108 in two dimensions.
  • the system includes a processor and memory 112, which manages the different components of the system and performs computations, including analyzing the distorted pattern 108 to determine how the surface moves up and down and deriving three-dimensional information regarding the target object.
  • the analysis can take into consideration the characteristics of the initial pattern, the relative placement of the light source 102, the pattern molding 104, the target object 106, and the camera 110, and other factors.
  • the system can include a display screen and an input device for communication with a user of the system.
  • the camera 110, the processor and memory 112, and the display screen are all combined in a single device, such as a cellular phone, a tablet, a desktop computer, a laptop computer, or a wearable device.
  • the processor and memory 112 are located separately from the rest of the system.
  • the processor and memory 112 communicate with the rest of the system, specifically the camera 110 or the display screen, across any type of network known to someone of ordinary skill in the art, such as a cellular network or the Internet.
  • the initial pattern can be projected onto multiple surfaces of the target object by rotating the target object or the components of the system around the target object.
  • multiple initial patterns can be used for the same surface or different surfaces of the target object.
  • the processor coupled with the memory can synthesize information obtained from multiple distorted patterns in deriving the three-dimensional information regarding the target object 106.
  • the shape measurement system can obtain three-dimensional information of a target object based on projections or placement of two-dimensional patterns onto the target object.
  • the system achieves this goal without making direct contact with the target object and without requiring substantial movement of the target object.
  • Fig. 2 illustrates example components of the computing portion of the shape measurement system, comprising one or more processors or memories.
  • the computing portion includes an original pattern management module 202, a pattern transformation module 204, and a distorted pattern processing module 206.
  • the original pattern management module 202 manages the creation and selection of two-dimensional initial patterns to be projected onto a target object.
  • An initial pattern can be isomorphic, repeating a configuration of lines or dots over the extent of the pattern.
  • Initial patterns may also be structured as an isomorphic repetition of one or more non-isomorphic patterns.
  • the extendible nature of an isomorphic pattern or an isomorphic repetition of one or more non-isomorphic patterns enables an assessment of the target object beyond a portion of the target object that is within the line of sight. For example, a repeating double-log pattern is an isomorphic repetition of a non-isomorphic pattern.
  • the original pattern management module 202 can also use an initial pattern that is partially or entirely non-isomorphic to uncover information regarding specific areas or aspects of the target object.
  • the use of a combined isomorphic pattern with embedded non-isomorphic patterns allows identification of missing portions, hidden portions, or holes. For example, bending a pattern with an isomorphic repetition of the non-isomorphic double-log pattern allows identification of the non-isomorphic pattern and lets the system detect missing elements of the overall pattern, while the isomorphic pattern allows registration of the overall pattern so that missing parts of the non-isomorphic pattern can be scaled and detected more easily.
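The patent describes this hole-detection idea without an implementation. A minimal sketch, assuming the isomorphic grid has already been registered so that the expected positions of the embedded non-isomorphic markers are known (the positions and tolerance below are illustrative, not from the patent), might look like:

```python
def find_missing_markers(expected, detected, tolerance=1.0):
    """Flag embedded non-isomorphic markers the camera did not capture.

    expected/detected are (x, y) marker positions on the registered
    isomorphic grid; a marker counts as missing (a hidden portion or a
    hole) if no detected marker lies within `tolerance` grid units of
    its expected location.
    """
    missing = []
    for ex, ey in expected:
        if not any((ex - dx) ** 2 + (ey - dy) ** 2 <= tolerance ** 2
                   for dx, dy in detected):
            missing.append((ex, ey))
    return missing

expected = [(0, 0), (4, 0), (0, 4), (4, 4)]
detected = [(0.1, 0.0), (3.9, 0.1), (0.0, 4.1)]   # the (4, 4) marker is occluded
print(find_missing_markers(expected, detected))
```

Here the registration step, which the patent attributes to the surrounding isomorphic pattern, is what makes the expected positions comparable to the detected ones.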
  • different initial patterns can be used for the same surface or different surfaces of the target object.
  • the original pattern management module 202 can also determine additional initial patterns based on the result of projecting or placing existing initial patterns on the target object. For example, when the result of projecting an isomorphic pattern to an entire surface of the target object shows that a particular area of the surface is full of hills and valleys, the original pattern management module 202 can select a non-isomorphic, finer pattern to better capture the characteristics of that particular area. The selection can be made based on user selections, attributes of the target object, etc.
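The selection of a finer follow-up pattern for high-relief areas could be sketched as follows; the variance threshold and pattern names are hypothetical tuning choices, since the patent does not specify a selection criterion:

```python
import statistics

def select_pattern(depth_samples, variance_threshold=0.5):
    """Pick a finer, non-isomorphic pattern for areas full of hills and
    valleys, and keep a coarse isomorphic grid for flat areas.

    depth_samples: estimated depths (e.g. in inches) sampled across a
    surface region from a first, coarse projection.
    """
    if statistics.pvariance(depth_samples) > variance_threshold:
        return "fine_non_isomorphic"   # varied relief: needs detail
    return "coarse_isomorphic"         # flat region: coarse grid suffices

print(select_pattern([0.1, 0.1, 0.2]))        # near-flat region
print(select_pattern([0.0, 2.3, 0.1, 1.8]))   # hills and valleys
```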
  • the original pattern management module 202 determines that each initial pattern is to be implemented by a molding based on a user selection, for example.
  • the molding can stand alone or be attached to the light source.
  • the size of the molding can depend on the locations where the molding may be placed relative to the light source and the target object, the desired coverage by the projected pattern of a surface of the target object, and other factors.
  • the molding can be made of any solid material that absorbs light without otherwise interfering with nearby lighting.
  • the pattern transformation module 204 manages the configuration of the shape measurement system for projecting a two-dimensional initial pattern onto the target object.
  • the pattern transformation module 204 selects a light source, which can be a collimated light source, a pointed light source, etc.
  • the pattern transformation module 204 also ensures that the light reaching the target object is visible. The visibility can be affected by the relative placement of the light source, the pattern molding, and the target object, the type and appearance of the surface to be hit by the light, and other factors. Based on these factors, the pattern transformation module 204 can determine the colors, intensities, and other attributes of the illuminated light accordingly.
  • the pattern transformation module 204 determines the distance between the light source and the pattern molding as well as the distance between the pattern molding and the target object.
  • the pattern transformation module 204 can arrange the light source, the pattern molding, and the target object to ensure that the projected pattern covers the entire surface of the target object.
  • the pattern transformation module 204 can similarly arrange the components of the system to ensure that the multiple projected patterns combine to cover the entire surface of the target object.
  • the pattern transformation module 204 selects a camera. It can be a standalone camera or one that is embedded in another computing device, such as a cellular phone, a tablet, a laptop computer, a desktop computer, or a wearable device. It then determines the placement of the camera relative to the light source and the target object generally to ensure that the camera clearly captures the projected pattern in its entirety. The distance to the target object in particular is useful to the computation of three-dimensional information performed by the distorted pattern processing module 206, as discussed below.
  • the pattern transformation module 204 provides movement instructions to the target object for obtaining projected patterns on different surfaces of the target object.
  • the pattern transformation module 204 coordinates the operation of the camera and the display screen to communicate with the target object.
  • the target object can be a human or an object operable by a human.
  • the pattern transformation module 204 determines an appropriate movement instruction and communicates the movement instruction to the target object using the display screen. It also ensures that the design and presentation of the movement instruction enables accurate observation by the human receiving the movement instruction even from some distance away.
  • Fig. 3 illustrates example graphical user interfaces (GUIs) shown on a display screen of the shape measurement system for directing the movement of the target object.
  • the list of GUIs from left to right corresponds to a series of instructions shown to a person possessing or operating the target object during the course of communication with the person.
  • each GUI includes a sufficiently simple and large geometric shape 304, which is not entirely associated with any known object, for representing the target object, and an easily identified direction indicator 306 that instructs a movement.
  • the geometric shape 304 is a circle.
  • the direction indicator 306 is a triangle where two of the vertices slide along the circle, the current position of these two vertices indicates a specific portion of the target object, and the movement of these two vertices indicates a specific movement of the specific portion.
  • the text states that the specific portion of the object should move in a certain fashion.
  • the pattern transformation module 204 can also coordinate the operation of the speaker to further facilitate the communication with the person by reading the text out loud, for example.
  • the distorted pattern processing module 206 examines a projected pattern to derive three-dimensional information regarding the target object.
  • the amount of local change in the projected pattern from the initial pattern is proportional to the depth information that constitutes the third dimension.
  • the distorted pattern processing module 206 measures the number of dots per inch (DPI) in the initial pattern as well as the DPI in the projected pattern as viewed by the camera, and determines any distortion in the lines within the pattern and the 2D shapes enclosed by these lines.
  • the distorted pattern processing module 206 can then use the difference of the two DPI values along with the distortion in lines and enclosed shapes to estimate the depth corresponding to the surface covered by the patterns.
  • the camera can capture approximately 300 DPI from the initial pattern
  • the camera can capture approximately 315 DPI from the projected pattern.
  • the difference of 15 DPI can then be converted to a depth of 2.3 inches.
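The patent provides the worked numbers but no conversion formula. A minimal sketch, assuming a linear scale factor derived from that single data point (a hypothetical simplification; a real system would calibrate the factor for its own geometry), might be:

```python
def estimate_depth(initial_dpi: float, projected_dpi: float,
                   inches_per_dpi: float = 2.3 / 15) -> float:
    """Estimate surface depth from the change in apparent dot density.

    The default scale factor comes from the patent's worked example,
    which maps a 15 DPI increase to 2.3 inches of depth; it is used
    here only to reproduce that example.
    """
    return (projected_dpi - initial_dpi) * inches_per_dpi

# Reproduces the example: 315 - 300 = 15 DPI corresponds to 2.3 inches.
print(round(estimate_depth(300, 315), 1))
```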
  • various types of known patterns may be used.
  • the difference in the pixel count per inch is a function of the distance of the surface of the object from the camera.
  • the pixel count is the prime measurement of the depth of the object's surface relative to the camera: as the surface varies up and down, it effectively moves closer to or farther from the camera and causes slight changes in the pixel count per inch of the object.
  • the distorted pattern processing module 206 determines a range of error tolerance, which can depend on the properties of the light source, of the camera, or of the target object. For example, when the target object is a human, one factor to consider is the breathing pattern, which affects the shape of the human body albeit to a slight degree.
  • the distorted pattern processing module 206 can instruct the human to hold their breath during the pattern projection or can tolerate an error in a range that corresponds to the moving up and down of the human chest.
  • the system can effectively measure the effect of motion on the object, for example the rate and/or depth of respiration.
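The patent does not say how respiration would be computed from the motion data. One minimal sketch, assuming chest-depth estimates are already available as a time series and using a simple rising-crossing count (both hypothetical choices; a real system would filter noise first), might be:

```python
def respiration_rate(depths, fps):
    """Estimate breaths per minute from a chest-depth time series by
    counting rising crossings of the mean depth.

    depths: per-frame depth estimates; fps: capture frame rate.
    Note this bare sketch can undercount cycles cut off at the
    boundaries of the trace.
    """
    mean = sum(depths) / len(depths)
    crossings = sum(1 for a, b in zip(depths, depths[1:])
                    if a < mean <= b)
    seconds = len(depths) / fps
    return crossings * 60.0 / seconds

# Synthetic trace: one full breath every 4 frames, captured at 4 fps.
trace = [0.0, 1.0, 0.0, -1.0] * 5
print(respiration_rate(trace, fps=4))
```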
  • the system can not only measure the depth of the snowfall and the shape of the snow drift on a roof but can also measure the rate of increase/decrease in the depth and shape of the overall snow covering as either more snow falls or snow melts.
  • an isomorphic pattern can be used to reveal information regarding an invisible portion of the target object; that information can be cross-referenced with information directly obtained from projecting an initial pattern on the portion when it becomes visible to reduce errors and increase confidence levels. This process can also help refine the calculation of the properties of an invisible portion using the extendible nature of an isomorphic pattern.
  • the system can detect the rate of curvature of an object and the resulting rate of change in the depth of the 3D object relative to the camera. Depending on the accuracy needed by the end application for the data being gathered, this information can be used to select areas of the object that require a finer-grained analysis, and the initial pattern can be modified or replaced by a pattern that has increased pattern density in the areas of the object that the system determined require more detailed analysis.
  • the distorted pattern processing module 206 displays the computed three-dimensional information regarding the target object to a user of the system, which includes coordinates of the measured surfaces of the target object.
  • the distorted pattern processing module 206 can further convert these coordinates into sizing information with respect to a wearable piece for the target object.
  • the sizing information can be related to a shirt that is specified in terms of certain portions of the person's upper body, such as the neck, the chest, the shoulders, or the arms, and available from an external source, such as but not limited to an external database, application, cloud service, web service or server.
  • the distorted pattern processing module 206 can identify the closest size of the shirt for the person and tailoring options based on the differences between the two pieces of information. The identified information can then also be displayed to the user.
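The size-matching step could be sketched as a nearest-neighbor search over a size chart; the chart values and dimension names below are illustrative, not from the patent, and a real system would pull them from the external source mentioned above:

```python
def closest_size(body, size_chart):
    """Return the chart size whose measurements are nearest the body's,
    plus per-dimension differences usable as tailoring options.

    body and each chart entry map a dimension name to inches.
    """
    def distance(entry):
        return sum((entry[k] - body[k]) ** 2 for k in body)
    best = min(size_chart, key=lambda s: distance(size_chart[s]))
    deltas = {k: size_chart[best][k] - body[k] for k in body}
    return best, deltas

chart = {"M": {"neck": 15.0, "chest": 39.0},
         "L": {"neck": 16.0, "chest": 42.0}}
size, tailoring = closest_size({"neck": 15.8, "chest": 41.0}, chart)
print(size, tailoring)   # "L" is nearest; deltas suggest taking in the chest
```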
  • Fig. 4 illustrates an example process performed by the shape measurement system of projecting a two-dimensional pattern on a three-dimensional target object and analyzing the projected pattern in two dimensions to obtain three-dimensional information regarding the target object.
  • the system selects a molding embodying a two-dimensional initial pattern.
  • the initial pattern can be largely isomorphic, an isomorphic repetition of one or more non-isomorphic patterns, or it can have a specific structure for a specific area or aspect of the target object.
  • One or more initial patterns can be selected for the same surface or different surfaces of the target object.
  • the system selects a light source.
  • the light source can provide pointed light, which is easy to find, or collimated light, which makes the projected pattern easier to characterize.
  • in step 406, the system determines the relative placement of the different components of the system and the target object such that the initial pattern is properly projected on the target object and captured for further analysis.
  • the system can provide instructions to be displayed so that the target object makes movements for better pattern projection and capturing.
  • in step 408, the system captures and analyzes one or more projected patterns to obtain three-dimensional information regarding the target object.
  • the system can infer information even for an invisible portion of the target object. It can also cross-reference data extracted from different projected patterns to increase the accuracy of the analysis.
  • in step 410, which is often an optional step, the system further converts the three-dimensional information into sizing information with respect to a wearable piece available in different sizes. This feature can be helpful when the target object is a portion of the human body, where the wearable piece can be clothes, glasses, hats, jewelry, belts, shoes, hair accessories, etc.
  • Fig. 5A illustrates another example setup of the shape measurement system.
  • the system includes an outfit 506, such as a shirt, with a two-dimensional initial pattern printed on it to be worn on a target object 504, such as a human body.
  • the system also includes a camera 502, a processor and memory, and a display screen, which can be combined in a single device, such as a cellular phone.
  • when the outfit is worn, the fabric stretches, and the initial pattern also stretches into a distorted pattern, which is in a way "projected" onto the target object and can be analyzed to derive three-dimensional information regarding the target object as described above.
  • the original pattern management module 202 determines that the initial pattern is instead to be printed on an outfit to be worn.
  • the original pattern management module 202 can choose from a variety of types of fabric which stretch to track the surface of the object in the outfit. It can also choose from a variety of shapes for the outfit depending on the shape of the target object and other factors. For example, the outfit can fit a specific portion of a human body or have one of the basic geometric shapes. For each of the shapes, one or more sizes can be available. Another choice to be made by the original pattern management module 202 is a printing technique that ensures that the pattern stretches evenly when the fabric stretches.
  • the original pattern management module 202 sets the colors and contrast of the initial pattern and the rest of the outfit such that the projected patterns can be accurately captured by a camera.
  • the pattern transformation module 204 selects a calibration chart and determines how it is to be implemented.
  • the calibration chart typically comprises a distinct, simple, and small pattern, that may or may not be replicated in several locations relative to the overall pattern, whose distortion is easy to detect and measure. The calibration chart can be used to ensure that the camera capture would be successful.
  • the calibration chart can be designed such that when it is visible to the human eyes, it would be visible to the camera lens.
  • the calibration chart can be printed on the outfit together with the initial pattern, or it can be implemented in a standalone device and placed nearby the target object at the time of image capture, so that an image of the calibration chart or a distorted version thereof can be captured together with the distorted pattern corresponding to the initial pattern.
  • the distorted pattern processing module 206 performs a calibration process utilizing a specific calibration chart. Depending on how the calibration chart is implemented, the distorted pattern processing module 206 determines the distance of the calibration chart to the camera, which is helpful for further determining three-dimensional information regarding the target object. According to aspects of the disclosure, knowing the original size and layout of the calibration chart, its distance from the camera can be estimated by analyzing the effective DPI of the pixels used to image the calibration chart coupled with the local distortion of the chart by the underlying object. Since the overall pattern measures the same distortion, the two taken together can be used to eliminate the distortion due to depth variation on the object, yielding the distance between the camera and the object from the pixel density.
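The core distance computation here amounts to comparing the chart's known physical size with its imaged size. A minimal sketch using the standard pinhole-camera relation (the patent speaks only of effective DPI; the focal length in pixels is an assumed calibration input not mentioned in the text) might be:

```python
def chart_distance(known_width_in: float, imaged_width_px: float,
                   focal_px: float) -> float:
    """Estimate the camera-to-chart distance via the pinhole model:

        distance = focal_length_px * real_width / imaged_width_px

    A smaller imaged width means the chart (and the surface it sits
    on) is farther from the camera, matching the DPI-based reasoning
    in the text above.
    """
    return focal_px * known_width_in / imaged_width_px

# A 2-inch-wide chart imaged 100 px wide by a camera whose focal
# length is 1000 px sits about 20 inches away.
print(chart_distance(2.0, 100, 1000))
```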
  • Fig. 5B illustrates an example setup of the shape measurement system that is similar to the one shown in Fig. 5A.
  • the target object 510 is not a human body but an animal body. This illustrates that there is not much limitation on what the target object can be. It can be inanimate, such as a rock, and it can also be simply a collection of surfaces, such as the snow on the mountain top.
  • the outfit includes not only an initial pattern 512 but also a calibration chart 514. Information obtained from the use of the calibration chart 514 can facilitate the analysis of the distorted pattern.
  • in step 602, the system selects a two-dimensional initial pattern, as discussed above, and a calibration chart.
  • the system could also select multiple patterns to be printed on different areas of the outfit.
  • the calibration chart is typically simple, large, and easy to detect and measure.
  • in step 604, the system selects a type of fabric of which an outfit is to be made and determines how to print the initial pattern and the calibration chart on the outfit.
  • the calibration chart can be implemented as a standalone device.
  • the fabric should stretch to track the object in the outfit, and the printing should ensure that the printed patterns stretch evenly as the fabric stretches.
  • in step 606, the system directs the placement of the outfit on the target object.
  • the target object is a person's upper body, and the outfit is a top.
  • the person is instructed to wear the top and eliminate creases, stains, or anything else that may obscure the printed patterns from the top.
  • the fabric stretches, and the initial pattern and calibration chart also stretch, resulting in distorted patterns.
  • the system captures the distorted patterns.
  • the target object can also be instructed to make movements to enable different surfaces of the target object to be visible, and the distorted patterns on all these surfaces can be captured.
  • in step 610, the system examines the distorted pattern corresponding to the calibration chart, obtains information useful for deriving three-dimensional information regarding the target object, including the distance to the camera, and uses the obtained information to derive three-dimensional information from the distorted pattern corresponding to the initial pattern.
  • Fig. 7 contains a high-level block diagram showing an example architecture of a computer 700, which may represent any electronic device, such as a mobile device or a server, including any node within a cloud service as described herein, and which may implement the operations described above.
  • the computer 700 includes one or more processors 710 and memory 720 coupled to an interconnect 730.
  • the interconnect 730 shown in Fig. 7 is an abstraction that represents any one or more separate physical buses, point to point connections, or both connected by appropriate bridges, adapters, or controllers.
  • the interconnect 730 may include, for example, a system bus, a Peripheral Component Interconnect (PCI) bus or PCI-Express bus, a HyperTransport or industry standard architecture (ISA) bus, a small computer system interface (SCSI) bus, a universal serial bus (USB), an IIC (I2C) bus, or an Institute of Electrical and Electronics Engineers (IEEE) standard 1394 bus, also called "Firewire".
  • the processor(s) 710 is/are the central processing unit (CPU) of the computer 700 and, thus, control the overall operation of the computer 700. In certain embodiments, the processor(s) 710 accomplish this by executing software or firmware stored in memory 720.
  • the processor(s) 710 may be, or may include, one or more programmable general-purpose or special-purpose microprocessors, digital signal processors (DSPs), programmable controllers, application specific integrated circuits (ASICs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), trusted platform modules (TPMs), or a combination of such or similar devices.
  • the memory 720 is or includes the main memory of the computer 700.
  • the memory 720 represents any form of random access memory (RAM), read-only memory (ROM), flash memory, or the like, or a combination of such devices.
  • the memory 720 may contain code 770 containing instructions according to the techniques disclosed herein.
  • the network adapter 740 provides the computer 700 with the ability to communicate with remote devices over a network and may be, for example, an Ethernet adapter.
  • the network adapter 740 may also provide the computer 700 with the ability to communicate with other computers.
  • the code 770 stored in memory 720 may be implemented as software and/or firmware to program the processor(s) 710 to carry out actions described above.
  • such software or firmware may be initially provided to the computer 700 by downloading it from a remote system through the computer 700 (e.g., via network adapter 740).
  • the techniques introduced herein can be implemented by, for example, programmable circuitry (e.g., one or more microprocessors) programmed with software and/or firmware, or entirely in special-purpose hardwired circuitry, or in a combination of such forms.
  • Software or firmware for use in implementing the techniques introduced here may be stored on a machine-readable storage medium and may be executed by one or more general-purpose or special-purpose programmable microprocessors.
  • a "machine-readable storage medium”, as the term is used herein, includes any mechanism that can store information in a form accessible by a machine (a machine may be, for example, a computer, network device, cellular phone, personal digital assistant (PDA), manufacturing tool, any device with one or more processors, etc.).
  • a machine-accessible storage medium includes recordable/non-recordable media (e.g., read-only memory (ROM); random access memory (RAM); magnetic disk storage media; optical storage media; flash memory devices; etc.), etc.
  • These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • the computer program instructions may also be loaded onto a computer, other programmable data processing apparatuses, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.

Abstract

Embodiments of a shape measurement system and related methods are disclosed. In some embodiments, the system projects a two-dimensional initial pattern, which can be implemented as a standalone molding, attached to a light source, or printed on an outfit, onto a surface of a three-dimensional object. The system analyzes the projected pattern captured in two dimensions, which is a distorted version of the initial pattern due to the varying depth of the surface, and derives three-dimensional information regarding the target object. The analysis can rely on a calibration using a calibration chart, the projective nature of the light source, the isomorphism or non-isomorphism of the initial pattern, and other factors.

Description

DETERMINING THREE-DIMENSIONAL INFORMATION FROM
PROJECTIONS OR PLACEMENT OF TWO-DIMENSIONAL
PATTERNS
RELATED APPLICATIONS
[0001] This application claims priority to and benefit from U.S. Provisional Patent Application No. 62/104,559, filed on January 16, 2015 and titled "DETERMINING THREE-DIMENSIONAL INFORMATION FROM PROJECTIONS OR PLACEMENT OF TWO-DIMENSIONAL PATTERNS," the entire contents of which are incorporated herein by reference and relied upon. In cases where material incorporated herein by reference conflicts with the present disclosure, the present disclosure controls.
TECHNICAL FIELD
[0002] The present application is generally related to determining three-dimensional information of an object from projections or placement of two-dimensional patterns onto the object.
BACKGROUND
[0003] Oftentimes, it is necessary to have the measurements of an object. The object can be a part of a person's body, which has a varied, irregular shape, and the measurements would be used to fit that part into a wearable piece. For example, the object can be the person's feet, and the wearable piece can be a pair of shoes. The object can also be inanimate, large, or heavy, and the measurements would be used when the object needs to be covered, transported, valuated, etc. As one example, the object can be the trunk of a tree, and the measurements can be used to decide whether to cut the tree down and how to transport the resulting log. As another example, the object can be a large gold statue, and the measurements can be used to determine how much it is worth.
[0004] For parts of the human body, for example, the irregular shape presents a challenge in obtaining good measurements. In addition, the shape of the human body can fluctuate over time, adding to the complexity of the challenge. On the other hand, when an object is inanimate, large, or heavy, it can be difficult to take measurements at all. It would be useful to be able to easily obtain good measurements.
BRIEF DESCRIPTION OF THE DRAWINGS
[0005] Various embodiments are disclosed in the following detailed description and accompanying drawings.
[0006] Fig. 1 illustrates an example setup of a shape measurement system disclosed in the present application.
[0007] Fig. 2 illustrates example components of the computing portion of the shape measurement system, comprising one or more processors or memories.
[0008] Fig. 3 illustrates example graphical user interfaces (GUIs) shown on a display screen of the shape measurement system for directing a movement of the target object.
[0009] Fig. 4 illustrates an example process performed by the shape measurement system of projecting a two-dimensional pattern on a three-dimensional target object and analyzing the projected pattern in two dimensions to obtain three-dimensional information regarding the target object.
[0010] Fig. 5A illustrates another example setup of the shape measurement system.
[0011] Fig. 5B illustrates an example setup of the shape measurement system similar to the one shown in Fig. 5A.
[0012] Fig. 6 illustrates another example process performed by the shape measurement system of projecting a two-dimensional pattern on a three-dimensional target object and analyzing the projected pattern in two dimensions to obtain three-dimensional information regarding the target object.
[0013] Fig. 7 is a high-level block diagram showing an example architecture of a computer, which may represent any electronic device, any server, or any node within a cloud service as described herein.
DETAILED DESCRIPTION
[0014] This application discloses a shape measurement system that allows a user to easily take measurements of a target object. The system projects or places two-dimensional initial patterns onto the target object and calculates three-dimensional information regarding the target object based on the two-dimensional projected patterns, which are transformed from the initial patterns due to the varying depths of the projection surface. The system typically requires the user to merely gain access to a camera, which can be embedded into a mobile device, and optionally a light source, and the system quickly produces accurate and comprehensive measurements of the target object.
[0015] The shape measurement system has many potential applications. In the fashion industry, the system can facilitate the fitting process and help reduce shipping efforts. In the medical devices industry, the system can assist in the making of prostheses, casts, braces, and biometric devices. In the exercise and fitness industry, the system provides monitoring capabilities for weight loss and body sculpting efforts and can help with the manufacturing of fitness apparel and equipment, such as helmets, shoes, and boots. In the animal care industry, the system can similarly facilitate the fitting of an animal into an outfit or a gear. In the motion capture industry, the system enables the tracking of body movement and positioning, such as for animation purposes. In the printing industry, the system makes it easy to replicate an existing object with three-dimensional printing. In the furniture industry, the system similarly makes it easy to model new furniture for an individual custom fit, model furniture after existing pieces, and replace parts or produce covers for existing pieces. In the car and airplane industry, the system enables a structural analysis showing the relationships among different parts of the vehicle as well as a fitness analysis between the interior of the vehicle and a customer driving the vehicle.
[0016] The following is a detailed description of exemplary embodiments to illustrate the principles of the invention. The embodiments are provided to illustrate aspects of the invention, but the invention is not limited to any embodiment. The scope of the invention encompasses numerous alternatives, modifications, and equivalents.
[0017] Numerous specific details are set forth in the following description in order to provide a thorough understanding of the invention. However, the invention may be practiced according to the claims without some or all of these specific details. For the purpose of clarity, technical material that is known in the technical fields related to the invention has not been described in detail so that the invention is not unnecessarily obscured.
[0018] Fig. 1 illustrates an example setup of the shape measurement system disclosed in the present application. In some embodiments, the system includes a light source 102 and a pattern molding 104 embedding a two-dimensional initial pattern. The light source 102, the pattern molding 104, and a target object 106, regarding which three-dimensional information is to be obtained, are positioned such that the two-dimensional pattern is projected onto a surface of the target object 106, resulting in a distorted pattern 108. The system also includes a camera 110, which then captures the distorted pattern 108 in two dimensions. Furthermore, the system includes a processor and memory 112, which manages the different components of the system and performs computations, including analyzing the distorted pattern 108 to determine how the surface moves up and down and to derive three-dimensional information regarding the target object. The analysis can take into consideration the characteristics of the initial pattern, the relative placement of the light source 102, the pattern molding 104, the target object 106, and the camera 110, and other factors. In addition, the system can include a display screen and an input device for communication with a user of the system.
[0019] In some embodiments, the camera 110, the processor and memory 112, and the display screen are all combined in a single device, such as a cellular phone, a tablet, a desktop computer, a laptop computer, or a wearable device. Alternatively, the processor and memory 112 are located separately from the rest of the system. In this case, the processor and memory 112 communicate with the rest of the system, specifically the camera 110 or the display screen, across any type of network known to someone of ordinary skill in the art, such as a cellular network or the Internet.
[0020] In some embodiments, the initial pattern can be projected onto multiple surfaces of the target object by rotating the target object or the components of the system around the target object. In addition, multiple initial patterns can be used for the same surface or different surfaces of the target object. In either case, the processor coupled with the memory can synthesize information obtained from multiple distorted patterns in deriving the three-dimensional information regarding the target object 106.
[0021] By virtue of such a setup, the shape measurement system can obtain three-dimensional information of a target object based on projections or placement of two-dimensional patterns onto the target object. The system achieves this goal without making direct contact with the target object and without requiring substantial movement of the target object. These features can be useful when the target object is difficult to move or measure. They also provide the convenience of obtaining shape measurements at a chosen location and without having to interact with other persons or objects.
[0022] Fig. 2 illustrates example components of the computing portion of the shape measurement system, comprising one or more processors or memories. In some embodiments, the computing portion includes an original pattern management module 202, a pattern transformation module 204, and a distorted pattern processing module 206.
[0023] In some embodiments, the original pattern management module 202 manages the creation and selection of two-dimensional initial patterns to be projected onto a target object. An initial pattern can be isomorphic, repeating a configuration of lines or dots over the extent of the pattern. Initial patterns may also be structured as an isomorphic repetition of one or more non-isomorphic patterns. The extendible nature of an isomorphic pattern, or of an isomorphic repetition of one or more non-isomorphic patterns, enables an assessment of the target object beyond the portion of the target object that is within the line of sight. More specifically, a repeating double log pattern is one example of an isomorphic repetition of a non-isomorphic pattern. The original pattern management module 202 can also use an initial pattern that is partially or entirely non-isomorphic to uncover information regarding specific areas or aspects of the target object. According to aspects of the disclosure, the use of a combined isomorphic pattern with embedded non-isomorphic patterns allows identification of missing portions, hidden portions, or holes. For example, bending a pattern that is an isomorphic repetition of the non-isomorphic double log pattern allows the system to identify the non-isomorphic pattern and detect missing elements of the overall pattern, while the isomorphic component allows registration of the overall pattern so that the missing parts of the non-isomorphic pattern can be more easily scaled and detected.
[0024] In some embodiments, different initial patterns can be used for the same surface or different surfaces of the target object. The original pattern management module 202 can also determine additional initial patterns based on the result of projecting or placing existing initial patterns on the target object. For example, when the result of projecting an isomorphic pattern to an entire surface of the target object shows that a particular area of the surface is full of hills and valleys, the original pattern management module 202 can select a non-isomorphic, finer pattern to better capture the characteristics of that particular area. The selection can be made based on user selections, attributes of the target object, etc.
[0025] In some embodiments, the original pattern management module 202 determines that each initial pattern is to be implemented by a molding based on a user selection, for example. The molding can stand alone or be attached to the light source. The size of the molding can depend on the locations where the molding may be placed relative to the light source and the target object, the desired coverage by the projected pattern of a surface of the target object, and other factors. The molding can be made of any solid material that absorbs light without otherwise interfering with nearby lighting.
[0026] In some embodiments, the pattern transformation module 204 manages the configuration of the shape measurement system for projecting a two-dimensional initial pattern onto the target object. The pattern transformation module 204 selects a light source, which can be a collimated light source, a pointed light source, etc. The pattern transformation module 204 also ensures that the light reaching the target object is visible. The visibility can be affected by the relative placement of the light source, the pattern molding, and the target object, the type and appearance of the surface to be hit by the light, and other factors. Based on these factors, the pattern transformation module 204 can determine the colors, intensities, and other attributes of the illuminated light accordingly.
[0027] In some embodiments, the pattern transformation module 204 determines the distance between the light source and the pattern molding as well as the distance between the pattern molding and the target object. When a single initial pattern is used for one surface of the target object, the pattern transformation module 204 can arrange the light source, the pattern molding, and the target object to ensure that the projected pattern covers the entire surface of the target object. When multiple initial patterns are used for one surface, the pattern transformation module 204 can similarly arrange the components of the system to ensure that the multiple projected patterns combine to cover the entire surface of the target object.
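The arrangement of distances described above can be illustrated with a similar-triangles sketch. The following is a hypothetical Python illustration (the function name and numbers are not from the disclosure) of how, for a point light source, the two distances determine the coverage of the projected pattern on the target surface:

```python
def projected_width(pattern_width_in: float,
                    source_to_pattern_ft: float,
                    source_to_surface_ft: float) -> float:
    """For a point light source, the projected pattern scales by the
    ratio of distances from the source (similar triangles)."""
    return pattern_width_in * source_to_surface_ft / source_to_pattern_ft

# A 2-inch-wide pattern molding placed 1 foot from a point source,
# projected onto a surface 5 feet from the source, covers 10 inches.
w = projected_width(2.0, 1.0, 5.0)
```

A sketch like this could be inverted to choose the molding placement needed for a desired coverage; a collimated source, by contrast, would project the pattern at roughly its original size regardless of distance.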
[0028] In some embodiments, the pattern transformation module 204 selects a camera. It can be a standalone camera or one that is embedded in another computing device, such as a cellular phone, a tablet, a laptop computer, a desktop computer, or a wearable device. The pattern transformation module 204 then determines the placement of the camera relative to the light source and the target object, generally to ensure that the camera clearly captures the projected pattern in its entirety. The distance to the target object in particular is useful to the computation of three-dimensional information performed by the distorted pattern processing module 206, as discussed below.
[0029] In some embodiments, the pattern transformation module 204 provides movement instructions to the target object for obtaining projected patterns on different surfaces of the target object. The pattern transformation module 204 coordinates the operation of the camera and the display screen to communicate with the target object. For example, the target object can be a human or an object operable by a human. Upon detecting the current appearance of the target object using the camera, the pattern transformation module 204 determines an appropriate movement instruction and communicates the movement instruction to the target object using the display screen. It also ensures that the design and presentation of the movement instruction enables accurate observation by the human receiving the movement instruction even from some distance away.
[0030] Fig. 3 illustrates example graphical user interfaces (GUIs) shown on a display screen of the shape measurement system for directing the movement of the target object. The list of GUIs from left to right corresponds to a series of instructions shown to a person possessing or operating the target object during the course of communication with the person. Each GUI contains a sufficiently simple and large geometric shape 304, which is not entirely associated with any known object, for representing the target object, and an easily identified direction indicator 306 that instructs a movement. There can also be some text 302 that is brief and in a large, clear-to-read font that corresponds to the graphics, namely the combination of the geometric shape 304 and/or the direction indicator 306. In one example, the geometric shape 304 is a circle. Also in the example, the direction indicator 306 is a triangle where two of the vertices slide along the circle, the current position of these two vertices indicates a specific portion of the target object, and the movement of these two vertices indicates a specific movement of the specific portion. The text states that the specific portion of the object should move in a certain fashion. When the shape measurement system incorporates a speaker, the pattern transformation module 204 can also coordinate the operation of the speaker to further facilitate the communication with the person by reading the text out loud, for example. Once an instruction in a GUI is displayed, a new posture or arrangement of the target object formed in response to the instruction is detected, and a new instruction is determined and displayed in another GUI until the appearance of the object is ready for the projection of an initial pattern and the capturing of the projected pattern.
[0031] In some embodiments, the distorted pattern processing module 206 examines a projected pattern to derive three-dimensional information regarding the target object. The amount of local change in the projected pattern from the initial pattern is proportional to the depth information that constitutes the third dimension. When the initial pattern is a grid, the distorted pattern processing module 206 measures the number of dots per inch (DPI) in the initial pattern as well as the DPI in the projected pattern as viewed by the camera, and determines any distortion in the lines within the pattern and in the two-dimensional shapes enclosed by these lines. The distorted pattern processing module 206 can then use the difference between the two DPI values, along with the distortion in lines and enclosed shapes, to estimate the depth corresponding to the surface covered by the patterns. For example, when a 12-megapixel (MP) camera is placed four feet away from the pattern molding, the camera can capture approximately 300 DPI from the initial pattern. When it is placed the same distance away from the target object, the camera can capture approximately 315 DPI from the projected pattern. The difference of 15 DPI can then be converted to a depth of 2.3 inches. According to aspects of the invention, various types of known patterns may be used. The difference in the pixel count per inch is a function of the distance of the surface of the object from the camera. The pixel count is the primary measurement of the depth of the object's surface relative to the camera: as the surface varies up and down, it effectively moves closer to or farther from the camera and makes slight changes in the pixel count per inch of the object. Since the pattern is known, that count of pixels per inch, coupled with sensing the line and enclosed-shape distortion in the object, gives the measurement of where the visible parts of the surface are located relative to the camera at any given point.
Repeating a non-isomorphic pattern within the framework of an isomorphic pattern allows the system to use the measurements surrounding an occluded part of an object to estimate the measurements of any such occluded part of the object within an acceptable error tolerance.
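The DPI-difference depth estimate described above can be sketched as follows. This is a minimal illustration assuming a locally linear, empirically calibrated relationship between the change in observed pixel density and surface depth; the scale factor is back-derived from the 15 DPI / 2.3 inch example in the text, and the function and variable names are illustrative assumptions, not part of the disclosure:

```python
def depth_from_dpi(reference_dpi: float, observed_dpi: float,
                   inches_per_dpi: float) -> float:
    """Estimate relative surface depth from the change in observed
    pixel density, assuming a linear, calibrated relationship."""
    return (observed_dpi - reference_dpi) * inches_per_dpi

# Hypothetical calibration taken from the worked example above:
# a 15 DPI increase corresponds to a depth of 2.3 inches.
SCALE = 2.3 / 15.0  # inches per DPI

depth = depth_from_dpi(300.0, 315.0, SCALE)  # ~2.3 inches
```

In practice the relationship would be established per setup via calibration, and the estimate would be combined with the line and enclosed-shape distortion measurements described above rather than used alone.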
[0032] In some embodiments, the distorted pattern processing module 206 determines a range of error tolerance, which can depend on the properties of the light source, of the camera, or of the target object. For example, when the target object is a human, one factor to consider is the breathing pattern, which affects the shape of the human body, albeit to a slight degree. The distorted pattern processing module 206 can instruct the human to hold their breath during the pattern projection or can tolerate an error in a range that corresponds to the moving up and down of the human chest. Conversely, for applications in which static measurement is not required, the system can effectively measure the effect of motion on the object, for example the rate and/or depth of respiration. Similarly, when the target object is the snow on a roof, the system can not only measure the depth of the snowfall and the shape of the snow drift but can also measure the rate of increase or decrease in the depth and shape of the overall snow covering as more snow falls or as snow melts.
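The respiration example above could be sketched as follows, assuming the system has already produced a time series of chest-depth estimates from successive pattern captures. The zero-crossing counting approach and all names here are illustrative assumptions, not the disclosed method:

```python
import math

def breaths_per_minute(depths, sample_rate_hz):
    """Estimate respiration rate from a time series of chest-depth
    measurements by counting upward crossings of the mean depth."""
    mean = sum(depths) / len(depths)
    crossings = sum(
        1 for a, b in zip(depths, depths[1:])
        if a < mean <= b
    )
    duration_min = len(depths) / sample_rate_hz / 60.0
    return crossings / duration_min

# Synthetic chest motion: 0.25 Hz breathing (15 breaths per minute),
# sampled at 10 Hz for one minute; phase offset avoids exact zeros.
samples = [math.sin(math.pi * (t + 0.5) / 20) for t in range(600)]
rate = breaths_per_minute(samples, 10)  # close to 15
```

A real implementation would filter noise in the depth estimates before counting; this sketch only shows that a depth signal over time suffices to recover a motion rate.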
[0033] Furthermore, as an isomorphic pattern can be used to reveal information regarding an invisible portion of the target object, that information can be cross-referenced with information directly obtained from projecting an initial pattern on the portion when it becomes visible to reduce errors and increase confidence levels. This process can also help refine the calculation of the properties of an invisible portion using the extendible nature of an isomorphic pattern.
[0034] According to aspects of the system, the system can detect the rate of curvature of an object and the resulting rate of change in the depth of the three-dimensional object relative to the camera. Depending on the accuracy needed by the end application for the data being gathered, this information can be used to select areas of the object that require a finer-grained analysis, and the initial pattern can be modified or replaced by a pattern that has increased pattern density in the areas of the object that the system determined require more detailed analysis.
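One way to sketch the selection of areas needing finer-grained analysis is to flag grid cells where depth changes rapidly between neighbors. This assumes the measurements have been reduced to a coarse grid of depth values; the threshold rule and names are illustrative assumptions, not the disclosed algorithm:

```python
def regions_needing_finer_pattern(depth_grid, threshold):
    """Flag grid cells where the depth difference between adjacent
    cells exceeds a threshold, indicating a rapidly curving surface
    that may warrant a denser projected pattern."""
    flagged = set()
    rows, cols = len(depth_grid), len(depth_grid[0])
    for r in range(rows):
        for c in range(cols):
            for dr, dc in ((0, 1), (1, 0)):  # right and down neighbors
                nr, nc = r + dr, c + dc
                if nr < rows and nc < cols:
                    if abs(depth_grid[r][c] - depth_grid[nr][nc]) > threshold:
                        flagged.add((r, c))
                        flagged.add((nr, nc))
    return flagged

grid = [
    [1.0, 1.0, 1.0],
    [1.0, 3.5, 1.0],   # a sharp "hill" in the center
    [1.0, 1.0, 1.0],
]
hot = regions_needing_finer_pattern(grid, 1.0)  # center and its neighbors
```

The flagged cells would then drive the choice of a non-isomorphic, finer pattern for those areas, as described for the original pattern management module 202.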
[0035] In some embodiments, the distorted pattern processing module 206 displays the computed three-dimensional information regarding the target object, including coordinates of the measured surfaces of the target object, to a user of the system. The distorted pattern processing module 206 can further convert these coordinates into sizing information with respect to a wearable piece for the target object. For example, when the target object is a person's upper body, the sizing information can be related to a shirt that is specified in terms of certain portions of the person's upper body, such as the neck, the chest, the shoulders, or the arms, and available from an external source, such as, but not limited to, an external database, application, cloud service, web service, or server. By comparing the three-dimensional information of the person's upper body with the sizing information related to the shirt, the distorted pattern processing module 206 can identify the closest size of the shirt for the person and tailoring options based on the differences between the two pieces of information. The identified information can then also be displayed to the user.
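The size-matching step described above can be sketched as a nearest-neighbor comparison against a size chart. The chart values, dimension names, and matching rule here are hypothetical, not taken from any actual sizing database:

```python
SIZE_CHART = {  # hypothetical shirt sizing data, in inches
    "S":  {"neck": 14.5, "chest": 36.0},
    "M":  {"neck": 15.5, "chest": 40.0},
    "L":  {"neck": 16.5, "chest": 44.0},
    "XL": {"neck": 17.5, "chest": 48.0},
}

def closest_size(measured, chart=SIZE_CHART):
    """Return the size whose dimensions are nearest the measured body
    dimensions (sum of absolute differences), plus the per-dimension
    differences as tailoring hints."""
    best = min(
        chart,
        key=lambda s: sum(abs(chart[s][d] - measured[d]) for d in measured),
    )
    deltas = {d: chart[best][d] - measured[d] for d in measured}
    return best, deltas

size, deltas = closest_size({"neck": 15.7, "chest": 41.0})  # "M"
```

The `deltas` returned by the sketch correspond to the tailoring options mentioned above: a negative chest delta, for instance, indicates the closest stock size is slightly tight in the chest.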
[0036] Fig. 4 illustrates an example process performed by the shape measurement system of projecting a two-dimensional pattern on a three-dimensional target object and analyzing the projected pattern in two dimensions to obtain three-dimensional information regarding the target object. In step 402, the system selects a molding embodying a two-dimensional initial pattern. The initial pattern can be largely isomorphic, an isomorphic repetition of one or more non-isomorphic patterns, or it can have a specific structure for a specific area or aspect of the target object. One or more initial patterns can be selected for the same surface or different surfaces of the target object. In step 404, the system selects a light source. The light source can provide pointed light, which is easy to find, or collimated light, which makes the projected pattern easier to characterize. In step 406, the system determines the relative placement of the different components of the system and the target object such that the initial pattern is properly projected on the target object and captured for further analysis. The system can provide instructions to be displayed so that the target object makes movements for better pattern projection and capturing. In step 408, the system captures and analyzes one or more projected patterns to obtain three-dimensional information regarding the target object. The system can infer information even for an invisible portion of the target object. It can also cross-reference data extracted from different projected patterns to increase the accuracy of the analysis. In step 410, which is often an optional step, the system further converts the three-dimensional information into sizing information with respect to a wearable piece available in different sizes. This feature can be helpful when the target object is a portion of the human body, where the wearable piece can be clothes, glasses, hats, jewelry, belts, shoes, hair accessories, etc.
[0037] Fig. 5A illustrates another example setup of the shape measurement system. In some embodiments, the system includes an outfit 506, such as a shirt, with a two-dimensional initial pattern printed on it to be worn on a target object 504, such as a human body. The system also includes a camera 502, a processor and memory, and a display screen, which can be combined in a single device, such as a cellular phone. As the outfit is worn, the fabric stretches, and the initial pattern also stretches into a distorted pattern, which is in a way "projected" onto the target object and can be analyzed to derive three-dimensional information regarding the target object as described above.
[0038] In some embodiments, the original pattern management module 202 determines that the initial pattern is instead to be printed on an outfit to be worn. The original pattern management module 202 can choose from a variety of types of fabric that stretch to track the surface of the object in the outfit. It can also choose from a variety of shapes for the outfit depending on the shape of the target object and other factors. For example, the outfit can fit a specific portion of a human body or have one of the basic geometric shapes. For each of the shapes, one or more sizes can be available. Another choice to be made by the original pattern management module 202 is a printing technique that ensures that the pattern stretches evenly when the fabric stretches. According to aspects of the disclosure, "printing" generally uses one of two processes: either the pattern is woven or knitted into the fabric as it is manufactured, or it is screen printed onto and into the fabric. As in the situation where the initial pattern is projected onto the target object by a light source, the original pattern management module 202 sets the colors and contrast of the initial pattern and the rest of the outfit such that the projected patterns can be accurately captured by a camera.
[0039] In some embodiments, the pattern transformation module 204 selects a calibration chart and determines how it is to be implemented. The calibration chart typically comprises a distinct, simple, and small pattern, which may or may not be replicated in several locations relative to the overall pattern, whose distortion is easy to detect and measure. The calibration chart can be used to ensure that the camera capture will be successful. For example, the calibration chart can be designed such that when it is visible to the human eye, it is also visible to the camera lens.
The calibration chart can be printed on the outfit together with the initial pattern, or it can be implemented in a standalone device and placed near the target object at the time of image capture, so that an image of the calibration chart or a distorted version thereof can be captured together with the distorted pattern corresponding to the initial pattern.
[0040] In some embodiments, the distorted pattern processing module 206 performs a calibration process utilizing a specific calibration chart. Depending on how the calibration chart is implemented, the distorted pattern processing module 206 determines the distance of the calibration chart to the camera, which is helpful for further determining three-dimensional information regarding the target object. According to aspects of the disclosure, knowing the original size and layout of the calibration chart, its distance from the camera can be estimated by analyzing the effective DPI of the pixels used to image the calibration chart, coupled with the local distortion of the chart by the underlying object. Since the overall pattern is also measuring the same distortion, the two taken together can be used to eliminate the distortion due to depth variation on the object, enabling the distance between the camera and the object to be derived from the pixel density.
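One standard way to estimate a known-size chart's distance from its imaged pixel extent is the pinhole-camera relation. This sketch uses hypothetical numbers, and a real implementation would combine it with the distortion analysis described above rather than rely on it alone:

```python
def distance_to_chart(chart_width_in: float,
                      chart_width_px: float,
                      focal_length_px: float) -> float:
    """Estimate camera-to-chart distance with a pinhole-camera model:
    distance = focal_length * real_width / imaged_width."""
    return focal_length_px * chart_width_in / chart_width_px

# A 4-inch-wide calibration chart imaged across 400 pixels by a camera
# with an (assumed) focal length of 3000 pixels sits 30 inches away.
d = distance_to_chart(4.0, 400.0, 3000.0)
```

The focal length in pixels would itself come from a one-time camera calibration; given it, the chart's effective pixel density directly yields the distance, which the text above describes as the output of the pixel-density analysis.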
[0041] Fig. 5B illustrates an example setup of the shape measurement system that is similar to the one shown in Fig. 5A. In this example, the target object 510 is not a human body but an animal body. This illustrates that there are few limitations on what the target object can be. It can be inanimate, such as a rock, and it can also be simply a collection of surfaces, such as the snow on a mountain top. Also in this example, the outfit includes not only an initial pattern 512 but also a calibration chart 514. Information obtained from the use of the calibration chart 514 can facilitate the analysis of the distorted pattern.

[0042] Fig. 6 illustrates another example process performed by the shape measurement system of projecting a two-dimensional pattern on a three-dimensional target object and analyzing the projected pattern in two dimensions to obtain three-dimensional information regarding the target object. In step 602, the system selects a two-dimensional initial pattern, as discussed above, and a calibration chart. The system can also select multiple patterns to be printed on different areas of the outfit. The calibration chart is typically simple, large, and easy to detect and measure. In step 604, the system selects a type of fabric of which an outfit is to be made and determines how to print the initial pattern and the calibration chart on the outfit. In other examples, the calibration chart can be implemented as a standalone device. The fabric should stretch to track the object in the outfit, and the printing technique should ensure that the printed patterns stretch evenly as the fabric stretches.
[0043] In step 606, the system directs the placement of the outfit on the target object. For example, when the target object is a person's upper body and the outfit is a top, the person is instructed to wear the top and to eliminate creases, stains, or anything else on the top that may obscure the printed patterns. As the outfit is worn, the fabric stretches, and the initial pattern and calibration chart also stretch, resulting in distorted patterns. In step 608, the system captures the distorted patterns. The target object can also be instructed to make movements to enable different surfaces of the target object to be visible, so that the distorted patterns on all these surfaces can be captured. In step 610, the system examines the distorted pattern corresponding to the calibration chart, obtains information useful for deriving three-dimensional information regarding the target object, including the distance to the camera, and uses the obtained information to derive three-dimensional information from the distorted pattern corresponding to the initial pattern.
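Step 610's derivation of three-dimensional information from the distorted pattern can be sketched under the simplifying assumption that local depth variation is proportional to the excess stretch of the grid. All names and the linear model are illustrative, not the disclosure's actual algorithm:

```python
def local_stretch(observed_spacings, reference_spacing):
    """Ratio of each observed grid spacing (measured in the captured
    image) to the undistorted reference spacing; values above 1.0
    indicate that the fabric stretched at that location."""
    return [s / reference_spacing for s in observed_spacings]

def relative_depth(stretch_ratios, scale=1.0):
    """Map stretch ratios to relative depth offsets, assuming (as a
    simplification) that depth variation is proportional to excess
    stretch; the scale factor would come from the calibration step."""
    return [scale * (r - 1.0) for r in stretch_ratios]

# Grid lines printed 10 units apart, observed at 10, 12, and 15 units.
ratios = local_stretch([10.0, 12.0, 15.0], 10.0)
depths = relative_depth(ratios, scale=5.0)
```

A full implementation would detect the grid intersections in the image, correct for perspective using the calibration chart distance, and integrate the local measurements into a surface model.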
[0044] Fig. 7 contains a high-level block diagram showing an example architecture of a computer 700, which may represent any electronic device, such as a mobile device or a server, including any node within a cloud service as described herein, and which may implement the operations described above. The computer 700 includes one or more processors 710 and memory 720 coupled to an interconnect 730. The interconnect 730 shown in Fig. 7 is an abstraction that represents any one or more separate physical buses, point-to-point connections, or both, connected by appropriate bridges, adapters, or controllers. The interconnect 730, therefore, may include, for example, a system bus, a Peripheral Component Interconnect (PCI) bus or PCI-Express bus, a HyperTransport or Industry Standard Architecture (ISA) bus, a Small Computer System Interface (SCSI) bus, a Universal Serial Bus (USB), an IIC (I2C) bus, or an Institute of Electrical and Electronics Engineers (IEEE) standard 1394 bus, also called "Firewire".
[0045] The processor(s) 710 is/are the central processing unit (CPU) of the computer 700 and thus control(s) the overall operation of the computer 700. In certain embodiments, the processor(s) 710 accomplish this by executing software or firmware stored in memory 720. The processor(s) 710 may be, or may include, one or more programmable general-purpose or special-purpose microprocessors, digital signal processors (DSPs), programmable controllers, application-specific integrated circuits (ASICs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), trusted platform modules (TPMs), or a combination of such or similar devices.
[0046] The memory 720 is or includes the main memory of the computer 700. The memory 720 represents any form of random access memory (RAM), read-only memory (ROM), flash memory, or the like, or a combination of such devices. In use, the memory 720 may contain code 770 containing instructions according to the techniques disclosed herein.
[0047] Also connected to the processor(s) 710 through the interconnect 730 are a network adapter 740 and a mass storage device 750. The network adapter 740 provides the computer 700 with the ability to communicate with remote devices over a network and may be, for example, an Ethernet adapter. The network adapter 740 may also provide the computer 700 with the ability to communicate with other computers.
[0048] The code 770 stored in memory 720 may be implemented as software and/or firmware to program the processor(s) 710 to carry out the actions described above. In certain embodiments, such software or firmware may be initially provided to the computer 700 by downloading it from a remote system (e.g., via the network adapter 740).

CONCLUSION
[0049] The techniques introduced herein can be implemented by, for example, programmable circuitry (e.g., one or more microprocessors) programmed with software and/or firmware, or entirely in special-purpose hardwired circuitry, or in a combination of such forms. Software or firmware for use in implementing the techniques introduced here may be stored on a machine-readable storage medium and may be executed by one or more general-purpose or special-purpose programmable microprocessors.
[0050] In addition to the above-mentioned examples, various other modifications and alterations may be made without departing from the invention. Accordingly, the above disclosure is not to be considered limiting, and the appended claims are to be interpreted as encompassing the true spirit and the entire scope of the invention.
[0051] The various embodiments are described above with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
[0052] A "machine-readable storage medium", as the term is used herein, includes any mechanism that can store information in a form accessible by a machine (a machine may be, for example, a computer, network device, cellular phone, personal digital assistant (PDA), manufacturing tool, or any device with one or more processors). For example, a machine-readable storage medium includes recordable/non-recordable media (e.g., read-only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, etc.).

[0053] These computer program instructions may also be stored in a computer-readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
[0054] The computer program instructions may also be loaded onto a computer, other programmable data processing apparatuses, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
[0055] The aforementioned flowchart and diagrams illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
[0056] Although various features of the invention may be described in the context of a single embodiment, the features may also be provided separately or in any suitable combination. Conversely, although the invention may be described herein in the context of separate embodiments for clarity, the invention may also be implemented in a single embodiment.

[0057] Reference in the specification to "some embodiments", "an embodiment", "one embodiment" or "other embodiments" means that a particular feature, structure, or characteristic described in connection with the embodiments is included in at least some embodiments, but not necessarily all embodiments, of the invention.
[0058] It is to be understood that the phraseology and terminology employed herein are not to be construed as limiting and are for descriptive purposes only.
[0059] It is to be understood that the details set forth herein do not constitute a limitation on the applications of the invention.
[0060] Furthermore, it is to be understood that the invention can be carried out or practiced in various ways and that the invention can be implemented in embodiments other than the ones outlined in the description above.
[0061] It is to be understood that the terms "including", "comprising", "consisting" and grammatical variants thereof do not preclude the addition of one or more components, features, steps, or integers or groups thereof and that the terms are to be construed as specifying components, features, steps or integers.

Claims

CLAIMS

I/We claim:
1. A method of sizing a three-dimensional target object using two-dimensional patterns, comprising:
creating an initial two-dimensional pattern;
projecting the initial two-dimensional pattern on a three-dimensional target object to obtain a transformed two-dimensional pattern; and
computing three-dimensional information regarding the target object based on the transformed pattern.
2. The method of claim 1,
wherein the initial pattern is implemented by a standalone molding,
the projecting comprising:
selecting a light source; and
positioning the light source, the molding, and the target object such that light illuminated from the light source shines through the molding before forming the transformed pattern on a surface of the target object.
3. The method of claim 1,
wherein the initial pattern is printed on an outfit made of a fabric that stretches and conforms to track a surface of an object in the outfit,
the projecting comprising:
placing the target object in the outfit so that as the fabric stretches and conforms, the initial pattern stretches into the transformed pattern; and
positioning the target object in the outfit in each of a plurality of configurations.
4. The method of claim 1, wherein the computing includes capturing a portion of the transformed pattern with a camera.
5. The method of claim 4, the computing further comprising:
computing a distance between the camera and the target object using a calibration pattern; and
deriving the three-dimensional information regarding the target object based on the distance.
6. The method of claim 1, wherein the initial pattern is isomorphic or is an isomorphic repetition of non-isomorphic patterns.
7. The method of claim 1, wherein the initial pattern includes a non-isomorphic portion that focuses on a specific area of the target object.
8. The method of claim 1, further comprising:
revising the initial pattern to an intermediary two-dimensional pattern based on the transformed pattern;
projecting the intermediary pattern on the target object to obtain a final two-dimensional pattern; and
converting the final pattern into three-dimensional information regarding the target object.
9. The method of claim 1, further comprising creating sizing information regarding the target object with respect to a wearable piece available in multiple sizes.
10. The method of claim 1, wherein the target object is inanimate.
11. A system for directing movements of a target object, comprising:
a processor;
a memory that stores a definition of a configuration;
a camera; and
a display screen, wherein
the camera captures an image of a target object,
the processor determines, from the captured image, whether the target object is in the defined configuration,
when the determining indicates that the target object is not in the defined configuration, the display screen shows a geometric shape with a directional indicator to guide a movement of the target object, and
when the determining indicates that the target object is in the defined configuration, the display screen shows the geometric shape alone.
12. The system of claim 11, wherein the geometric shape is a circle, and the directional indicator is a triangle where two of the vertices slide along the circle.
13. The system of claim 11, wherein the display screen also shows text of no more than a specific number of words to explain the shown graphics.
14. The system of claim 13, further comprising a speaker that reads the shown text aloud.
15. The system of claim 11, wherein the system is a cellular phone, a laptop computer, a desktop computer, a tablet, or a wearable device.
16. The system of claim 11,
wherein the captured image includes a pattern projected on the target object, and
wherein the processor analyzes the projected pattern and determines three-dimensional information regarding the target object.
17. The system of claim 16, wherein the display screen displays the three-dimensional information regarding the target object.
18. The system of claim 16, wherein the display screen displays sizing information regarding the target object with respect to a wearable piece available in multiple sizes.
PCT/US2016/013723 2015-01-16 2016-01-15 Determining three-dimensional information from projections or placement of two-dimensional patterns WO2016115536A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201562104559P 2015-01-16 2015-01-16
US62/104,559 2015-01-16

Publications (2)

Publication Number Publication Date
WO2016115536A2 true WO2016115536A2 (en) 2016-07-21
WO2016115536A3 WO2016115536A3 (en) 2018-07-26

Family

ID=56406570

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2016/013723 WO2016115536A2 (en) 2015-01-16 2016-01-15 Determining three-dimensional information from projections or placement of two-dimensional patterns

Country Status (1)

Country Link
WO (1) WO2016115536A2 (en)


Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2394341A1 (en) * 1999-12-13 2001-06-14 Princeton Video Image, Inc. 2-d/3-d recognition and tracking algorithm for soccer application
JP3859571B2 (en) * 2002-10-17 2006-12-20 ファナック株式会社 3D visual sensor
US8606019B2 (en) * 2007-04-23 2013-12-10 Nec Corporation Matching method for two-dimensional pattern, feature extracting method, apparatus used for the methods, and programs
US8235530B2 (en) * 2009-12-07 2012-08-07 C-Rad Positioning Ab Object positioning with visual feedback
US8836761B2 (en) * 2010-09-24 2014-09-16 Pixart Imaging Incorporated 3D information generator for use in interactive interface and method for 3D information generation

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110657754A (en) * 2019-09-24 2020-01-07 华侨大学 Quick measuring device of diamond wire saw deformation
WO2021099847A1 (en) * 2019-11-19 2021-05-27 Like A Glove Ltd. Photogrammetric measurement of body dimensions using patterned garments
DE102020124006B3 (en) 2020-09-15 2022-01-05 Laser Imaging Systems Gmbh EXPOSURE CONTROL IN PHOTOLITHOGRAPHIC DIRECT EXPOSURE METHODS FOR CREATING CIRCUIT BOARDS OR CIRCUITS
WO2022057981A1 (en) 2020-09-15 2022-03-24 Laser Imaging Systems Gmbh Exposure control in photolithographic direct exposure methods for manufacturing circuit boards or circuits

Also Published As

Publication number Publication date
WO2016115536A3 (en) 2018-07-26

Similar Documents

Publication Publication Date Title
US9791267B2 (en) Determining three-dimensional information from projections or placement of two-dimensional patterns
Bartol et al. A review of body measurement using 3D scanning
CN108475439B (en) Three-dimensional model generation system, three-dimensional model generation method, and recording medium
ES2693028T3 (en) System and method for deriving accurate body size measurements from a sequence of 2D images
KR101775327B1 (en) Method and program for providing virtual fitting service
RU2551731C1 (en) Method of virtual selection of clothes
US20160071322A1 (en) Image processing apparatus, image processing system and storage medium
US20220198780A1 (en) Information processing apparatus, information processing method, and program
CN106164978A (en) Parametrization deformable net is used to construct the method and system of personalized materialization
KR101499698B1 (en) Apparatus and Method for providing three dimensional model which puts on clothes based on depth information
CN105491307B (en) Depth sensing system
US10783376B2 (en) Information processing apparatus
WO2016115536A2 (en) Determining three-dimensional information from projections or placement of two-dimensional patterns
CN112652071A (en) Outline point marking method and device, electronic equipment and readable storage medium
Ashdown Full body 3-D scanners
US11443486B2 (en) Mobile 3D body scanning methods and apparatus
CN115862124A (en) Sight estimation method and device, readable storage medium and electronic equipment
CN115546436A (en) Three-dimensional (3D) image modeling system and method for determining a respective midsize of an individual
KR101499699B1 (en) Apparatus and Method for generating user's three dimensional body model based on depth information
KR102349481B1 (en) Method and apparatus for obtaining information about the shape of an object
KR102086227B1 (en) Apparatus for measuring body size
Hernandez et al. Underwater space suit performance assessments part 1: Motion capture system development and validation
Lafayette et al. Hybrid solution for motion capture with kinect v2 to different biotypes recognition
KR20230043341A (en) Method for providing user-customized clothing recommendation service
Yoshida et al. Shape completion and modeling of 3D foot shape while walking

Legal Events

Date Code Title Description
NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16738017

Country of ref document: EP

Kind code of ref document: A2