US20050267613A1 - Method to quantitatively analyze a model
- Publication number
- US20050267613A1
- Authority
- US
- United States
- Prior art keywords
- image
- post
- processor
- analysis software
- points
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/10—Complex mathematical operations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F30/00—Computer-aided design [CAD]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F30/00—Computer-aided design [CAD]
- G06F30/10—Geometric CAD
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F30/00—Computer-aided design [CAD]
- G06F30/20—Design optimisation, verification or simulation
- G06F30/23—Design optimisation, verification or simulation using finite element methods [FEM] or finite difference methods [FDM]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2113/00—Details relating to the application field
- G06F2113/12—Cloth
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2113/00—Details relating to the application field
- G06F2113/24—Sheet material
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/16—Cloth
Definitions
- the present invention relates generally to quantitative analysis of images from a post-processor, and more particularly to quantitative analysis of images from a post-processor using a calibration mechanism in the analysis software.
- Computer simulations of motion have long been used to model and predict the behavior of systems, particularly dynamic systems.
- Such systems utilize mathematical formulations to calculate structural volumes under various conditions based on fundamental physical properties.
- Various methods are known to convert a known physical object into a grid, or mesh, for performing finite element analysis, and various methods are known for calculating interfacial properties, such as stress and strain, at the intersection of two or more modeled physical objects.
- U.S. Pat. No. 6,810,300 issued to Woltman, et al. Oct. 26, 2004, discloses a method of designing a product for use on a body using a preferred product configuration.
- a method to quantitatively analyze results of a model comprises the steps of:
- a plurality of images may be generated in the post-processor.
- the model may be a product being worn on, in or adjacent to a body.
- the product may be an absorbent article.
- the absorbent article may be a sanitary napkin, pantiliner, incontinent pad, tampon, diaper, or breast pad.
- the calibration mechanism may be a calibration image, such as a box.
- the analysis software may be image analysis software.
- a method of analyzing physical test results in a virtual environment comprises the steps of:
- the step of converting said series of points into a format that can be read into a post-processor is carried out after the step of replicating at least one physical specimen in digital form to define a series of points.
- the physical specimen is a product capable of being worn on, in or adjacent to a body.
- the product may be an absorbent article, such as a sanitary napkin, pantiliner, incontinent pad, tampon, diaper, or breast pad.
- the step of aligning said series of points with at least a second series of points can be carried out after the step of replicating at least one physical specimen in digital form to define a series of points.
- a method for calculating a spatial relationship between at least two objects comprises the steps of:
- At least one of the objects can be a human body and at least one of said objects can be a product being worn on, in or adjacent to a body.
- the spatial relationship can be an area, a volume or a distance between the objects.
- FIG. 1 is a flow diagram showing a method of quantitatively analyzing results of a model
- FIG. 2 is a flow diagram showing a method of quantitatively analyzing results of a model
- FIG. 3 is a visualization of the simulation results of a product against a body
- FIG. 4 is an image of a slice
- FIG. 5 is an image of a calibration mechanism
- FIG. 6 is an image of a graphical result
- FIG. 7 is an image of a graphical result
- FIG. 8 is a flow diagram showing a method of analyzing physical test results in a virtual environment
- FIG. 9 is an image of the aligned surfaces
- FIG. 10 is an image of a slice
- FIG. 11 is an image of a graphical result
- FIG. 12 is a flow diagram showing a method for calculating a spatial relationship between two objects
- FIGS. 13-15 are images of a feminine protection pad with wings applied to a panty using a virtual hand
- FIGS. 16-18 are images of sections of the pad and undergarment.
- FIG. 19 is an image of graphical results.
- the method 10 includes generating at least one image of a model in a post-processor at step 24 .
- a calibration mechanism of known dimensions is generated in the post-processor at step 26 .
- the calibration mechanism is read into analysis software.
- the image is also read into the analysis software. The image from the post-processor is analyzed quantitatively with the analysis software using the calibration mechanism at step 32 .
- a plurality of images are generated in the post-processor.
- the model may be a product, such as an absorbent article, worn on, in or adjacent to a human body.
- the absorbent articles may be sanitary napkins, pantiliners, incontinent pads, tampons, diapers, and breast pads.
- the calibration mechanism may be a calibration image, such as a box.
- Other calibration images include but are not limited to bars, lines, parallel lines, rectangles, circles, triangles, unique shapes created for specialized applications and other conventional shapes.
- step 28 the calibration mechanism is preferably read into image analysis software.
- Suitable post-processing software includes but is not limited to ABAQUS CAE (ABAQUS Inc., Pawtucket, R.I.), LSPrePost (Livermore Software Technology Corporation, Livermore, Calif.), Hyperview (Altair Engineering, Troy, Mich.), Fieldview (Intelligent Light, Rutherford, N.J.), and EnSight (Computational Engineering International, Apex, N.C.).
- FIG. 2 depicts the method 40 for quantitatively analyzing the results of a model, and more particularly for quantitatively determining the fit of the product in relation to the human body.
- step 42 model/simulation results are generated.
- a process to arrive at the simulation results is described in U.S. Pat. No. 6,810,300 issued Oct. 26, 2004 to Woltman et al., or commonly-assigned co-pending application Ser. No. 60/550,479 filed Mar. 5, 2004 in the name of Anast et al.
- the model results are loaded/input/read into post-processing software, step 43 .
- suitable post-processing software includes but is not limited to ABAQUS CAE (ABAQUS Inc., Pawtucket, R.I.), LSPrePost (Livermore Software Technology Corporation, Livermore, Calif.), and Hyperview (Altair Engineering, Troy, Mich.).
- One known capability of post-processing software is the ability to use a repeated set of commands to drive a series of steps in the software, called scripting, with the repeated set of commands commonly called a script.
- an LSPrePost script can be used to visualize the simulation results of a product against a body at a series of different locations and angles in space, see FIG. 3 .
- Another known capability of post-processing software is the ability to create a series of images and save the images to a file format such as a JPEG file, step 44 .
- Another well known feature is the ability to generate a cross-sectional two-dimensional image, called a slice, through any combination of objects in the simulation at a variety of conditions that the model has calculated.
- FIG. 4 is an example of a slice.
- the concept of slicing can be thought of as cutting through the model along a determined path, and visually seeing the surface of all displayed objects along the path of this cut.
- These objects can include elements, nodes, surfaces, meshes, parts, and the like, or any sub-set of these objects.
- the states of these objects are determined earlier, when the model is generated. Also, a variety of aspect ratios and zoom levels are possible so as to concentrate the displayed image in the software within a specific region.
- a series of slice images are generated in a PNG file format from LSPrePost of the parts corresponding to the pad and body in a deformed state, at slice planes that correspond to key anatomical features on a woman, such as the prepuce and perineum.
- the calibration mechanism is created in LSPrePost, in which a cross-sectional two-dimensional image of a box of known dimensions is generated and saved to a file format, shown in FIG. 5 .
- the known box image dimensions allow the screen capture process to be calibrated in physical dimensions, millimeters in this example.
- the box provides a linkage between the post processing software and image analysis software, in that the image analysis software would otherwise have no knowledge of physical dimensions from the post-processing software.
- while the image analysis software on its own only understands the image as being, for example, 180 pixels by 180 pixels, it now understands the physical relationship in terms of a correlation of mm per pixel. Additionally, concerns such as imaging with different graphical resolutions, different resolution in the horizontal vs. the vertical, image orientation, and the like are resolved.
- Additional images in the post-processor can be generated at the same level of magnification (zoom) and so share the common calibration factors and aspect ratio. It is possible to repeat this process such that different angles or different zooms are considered.
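The box-based calibration described above can be sketched in code. The following is a minimal illustration (in Python rather than the MatLab used in the patent's examples); the 128 gray level and the 10 mm box size are assumptions chosen for the example:

```python
import numpy as np

def calibration_factors(cal_img, box_gray=128, box_mm=(10.0, 10.0)):
    """Derive mm-per-pixel factors from a calibration box of known size.

    cal_img : 2D array of grayscale values (0-255).
    box_gray: gray level used to draw the box (128 is an assumption).
    box_mm  : known physical (width, height) of the box in millimetres.
    """
    ys, xs = np.nonzero(cal_img == box_gray)
    width_px = xs.max() - xs.min() + 1   # X extent of the box in pixels
    height_px = ys.max() - ys.min() + 1  # Y extent of the box in pixels
    # Separate horizontal and vertical factors handle unequal resolutions.
    return box_mm[0] / width_px, box_mm[1] / height_px

# A synthetic 180x180 image with a 90x90-pixel box representing 10 mm x 10 mm:
img = np.zeros((180, 180), dtype=np.uint8)
img[45:135, 45:135] = 128
mm_x, mm_y = calibration_factors(img, box_mm=(10.0, 10.0))
```

Because the X and Y extents are measured separately, unequal horizontal versus vertical resolution is handled automatically, as the text notes.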
- calibration might alternatively include producing other reference objects, for example a set of parallel lines or a bar; using a fixed predefined pixel size; saving the image data directly in real-world coordinates; or passing the calibration parameters as an output text file.
- an inherent or artificial object within a given product performance model may also serve such a purpose. Examples include a simulation being run with a square or cube embedded in the simulation as an artificial object, or using a fixed reference internal to a body, such as femur width, as an inherent object.
- various predetermined colors are assigned to each element in the simulation, for example the body skin is black (zero gray scale), the pad edge is a shade of gray (128 gray scale), etc.
- the differentiation of materials based on the grayscale level provides a means to separate the components in the resulting 2D images during the image analysis steps described next.
- the output images are saved as a common graphics format file, in this example a PNG.
- Other forms of data output could be used, such as TIF or JPEG image format files, text file ASCII representation of the data, a binary format raw data file, or a 3D image format such as VRML or stereolithography.
- Once the series of image data files and the calibration mechanism are generated, they are imported/input/read into image analysis software, steps 48 and 50 .
- suitable image analysis software includes but is not limited to MatLab (The MathWorks, Natick, Mass.), Optimas (Media Cybernetics, Silver Spring, Md.) and ImagePro (Media Cybernetics, Silver Spring, Md.).
- Other data analysis software packages would also be suitable to import and process this type of data, provided the data files are compatible with the software, with either a built-in reader or a custom-programmed reader, typical of what can be done within MatLab for uncommon data files.
- MatLab contains a PNG reader and is used to import the LSPrePost images directly into MatLab's workspace. Once the images are read into the software they are analyzed, step 52 .
- a quantitative data report is provided from the analysis, step 54 . The data is interpreted and correlated with consumers, step 56 .
- a custom image processing script was written in MatLab to: 1) read all the images from LSPrePost, 2) calibrate the images, 3) measure the pad-body gap area, and 4) save the results to a graphical and a text file representation for review.
- FIGS. 6 and 7 illustrate examples of graphical result files from different locations. Each step is described below.
- the extent of the pad is identified for limiting the width of the analysis region. This can be accomplished in a number of ways. One procedure defines the minimum and maximum X coordinate of the particular gray scale associated with the pad, as described previously.
- a vertical line scanning algorithm is used to find the lower body surface and the upper product surface.
- Two additional sub-steps are used to fill in any gap areas missed scanning vertically.
- Sub-step 3-2) scans horizontally, finding pixels between the gap boundary and any pixels identified above.
- Sub-step 3-3) scans vertically again in a manner similar to 3-2) to more finely identify remaining gap pixels.
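The line-scan gap measurement above can be sketched as follows. This is a simplified Python illustration covering only the vertical scan: for each column spanned by the pad, the lowest body pixel and the uppermost product pixel are found, and background pixels between them are counted as gap. The body (0) and pad (128) gray levels follow the conventions stated earlier; the background level of 255 is an assumption for this sketch:

```python
import numpy as np

# Assumed gray levels: the text assigns body = 0 and pad = 128; a white
# background of 255 is an assumption made for this illustration.
BODY, PAD, BACKGROUND = 0, 128, 255

def gap_area_mm2(img, mm_x, mm_y):
    """Vertical line scan: count background pixels lying between the lower
    body surface and the upper product surface on each scan line, then
    convert the pixel count to mm^2 using the calibration factors."""
    pad_cols = np.nonzero((img == PAD).any(axis=0))[0]  # pad extent (width)
    gap_px = 0
    for x in range(pad_cols.min(), pad_cols.max() + 1):
        col = img[:, x]
        body_rows = np.nonzero(col == BODY)[0]
        pad_rows = np.nonzero(col == PAD)[0]
        if body_rows.size == 0 or pad_rows.size == 0:
            continue                   # nothing to measure on this line
        lower_body = body_rows.max()   # lowest body-surface pixel (row)
        upper_pad = pad_rows.min()     # uppermost product-surface pixel
        if upper_pad > lower_body:     # a gap exists on this scan line
            gap_px += np.count_nonzero(
                col[lower_body + 1:upper_pad] == BACKGROUND)
    return gap_px * mm_x * mm_y        # pixel count -> physical area
```

The horizontal and refining vertical sub-steps 3-2) and 3-3) would extend this with analogous scans over the remaining unclassified pixels.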
- Alternative methods, generally known as seed fill methods, could equally well be used. Products that have holes or slits in their upper surfaces cause some trouble in this analysis: the pixel selection can leak into the interior of the product (which is not desirable), and seed fill routines will then fail badly.
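As a sketch of the seed fill alternative, here is a generic 4-connected flood fill in Python (not the patent's specific routine); the fill value and gray levels are assumptions for illustration. It also makes the failure mode above concrete: if the boundary has a hole, the queue walks through it and the fill leaks.

```python
from collections import deque
import numpy as np

def seed_fill(img, seed, fill_value=200, background=255):
    """4-connected seed (flood) fill: starting from a seed pixel, mark
    every connected background pixel with fill_value. If the enclosing
    outline has holes or slits, the fill leaks through them, which is
    exactly the trouble the text describes."""
    out = img.copy()
    h, w = out.shape
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        if 0 <= y < h and 0 <= x < w and out[y, x] == background:
            out[y, x] = fill_value
            # enqueue the four edge-adjacent neighbours
            queue.extend([(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)])
    return out
```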
- a report text file is generated that records the product name, software version, computer platform, time and date of analysis, location of the source images, calibration factors, image names, gap areas in mm², and the selection steps used for each image.
- a copy is generated of each input PNG image with the addition of the gap area colored or shaded. This provides a visual record of the gap definition, as shown in FIGS. 6 and 7 .
- a method 70 for analyzing physical test results in a virtual environment is shown in FIG. 8 .
- a sanitary napkin or feminine pad is placed on a human female body and a first cast of the sanitary napkin on the human female is made, step 72 .
- the first cast is removed and a second cast of the human female body is made, step 74 .
- the second cast is preferably in close proximity to, and most preferably in direct contact with, the body.
- the casts form the physical specimens.
- the first and second casts are replicated in digital form to define a series of points, steps 76 and 78 .
- the series of points can be connected to form a series of lines or a surface.
- a virtual surface of a product or panty is created using a 3D digitizing arm (MicroScribe/Immersion Corporation, San Jose, Calif., USA).
- the digitizing arm is connected to a computer equipped with a software program that supports modeling via reverse engineering such as Rhinoceros (Robert McNeel & Associates, Seattle, Wash.).
- the digitizer uses XYZ coordinates from a stylus on the digitizer arm to create a 3D wire frame model of the surface. This is accomplished by moving the stylus across the surface and capturing points.
- points can be captured along an axis of the surface; the points being taken along a series of sequential lines that are spaced across the surface; the number of lines and points within each line is determined by the level of detail to be captured. These sequential lines can be lofted within the software to generate the 3D surface.
- the Rhinoceros software offers a variety of file formats in which to save this 3D surface. The file can be saved as a stereolithography file in an ASCII format.
- When two or more series of points are replicated in digital form, it is often desirable that they be aligned with respect to one another, step 80 .
- This alignment process can be done manually, based on the digital surface profile, or with additional physical data.
- reference markers are placed on the body and transferred to the cast during the casting process. Any number of reference markers may be used and the location of the markers may be positioned as desired.
- the reference markers are separately digitized as a series of marker points and saved as a text file.
- the reference markers associated with each series of digitized points can be used to align the two series of points with respect to each other using techniques such as least squared methods, residual minimization and the like.
- the two series of aligned points are saved together in a file in the aligned position.
- a mannequin could also be used to interact with the product, with the digital representation of the mannequin used versus the digitized product; the mannequin data may result from digitizing the mannequin as described for a real body above, or directly from the digital data from real humans used to manufacture the mannequin.
- Digitized models of the body and product can be obtained from digitizing the casts as above or could alternatively be captured directly from the body or stabilized product using any number of digitizing instruments, for example an Inspeck (Montreal, Canada) Capturer optical non-contact digitizer.
- One typical common output of these instruments is a stereolithography file as previously described.
- the body or product could also result, in part, from a surface used in a simulation as previously discussed, in mesh form or in another file format, or from any combination of real digitized surfaces and virtually created or analyzed surfaces.
- the series of points are read directly into a post-processor, step 82 .
- they can also be read directly into a post-processor.
- a stereolithography file of the aligned surfaces is read directly into LSPrePost, shown in FIG. 9 .
- a known capability of post-processing software is the ability to create a series of images and save the images to a file format such as a JPEG file, step 84 .
- Another well known feature is the ability to generate a cross-sectional two-dimensional image, called a slice, through any combination of objects.
- FIG. 10 is an example of a slice.
- a variety of aspect ratios and zoom levels are possible so as to concentrate the displayed image in the software within a specific region.
- a series of slice images are generated in a PNG file format from LSPrePost of the series of points corresponding to the pad and body, at slice planes that correspond to key anatomical features on a woman, such as the prepuce and perineum.
- a calibration mechanism is generated in step 86 .
- the calibration mechanism is created in LSPrePost, in which a cross-sectional two-dimensional image of a box of known dimensions is generated and saved to a file format, shown in FIG. 5 .
- various predetermined colors are assigned to each series of points, for example the body is black (zero gray scale), the pad is a shade of gray (128 gray scale), etc.
- the differentiation of materials based on the grayscale level provides a means to separate the components in the resulting 2D images during the image analysis steps described next.
- the output images are saved as a common graphics format file, in this example a PNG.
- the images are analyzed, step 92 . A quantitative data report is provided from the analysis, step 94 .
- the data is interpreted and correlated with consumers, step 96 .
- a custom image processing script was written in MatLab to: 1) read all the images from LSPrePost, 2) calibrate the images, 3) measure the pad-body gap area, and 4) save the results to a graphical and a text file representation for review.
- FIG. 11 illustrates an example of a graphical result file. Each step is described below.
- the calibration image ( FIG. 5 ) is read first, and the box is identified by locating all pixels with a grayscale of 128.
- the X and Y extents in pixels of the box are used to calculate calibration factors in mm to be applied to all subsequent images.
- the extent of the pad is identified for limiting the width of the analysis region. This can be accomplished in a number of ways. One procedure defines the minimum and maximum X coordinate of the particular gray scale associated with the pad, as described previously.
- a vertical line scanning algorithm is used to find the lower body surface and the upper product surface.
- Two additional sub-steps are used to fill in any gap areas missed scanning vertically.
- Sub-step 3-2) scans horizontally, finding pixels between the gap boundary and any pixels identified above.
- Sub-step 3-3) scans vertically again in a manner similar to 3-2) to more finely identify remaining gap pixels.
- Alternative methods, generally known as seed fill methods, could equally well be used. Products that have holes or slits in their upper surfaces cause some trouble in this analysis: the pixel selection can leak into the interior of the product (which is not desirable), and seed fill routines will then fail badly.
- a report text file is generated that records the product name, software version, computer platform, time and date of analysis, location of the source images, calibration factors, image names, gap areas in mm², and the selection steps used for each image.
- a copy is generated of each input PNG image with the addition of the gap area colored or shaded. This provides a visual record of the gap definition, as shown in FIG. 11 .
- step 122 model/simulation results are generated.
- a process to arrive at the simulation results is described in U.S. Pat. No. 6,810,300 issued Oct. 26, 2004 to Woltman et al., or commonly-assigned co-pending application Ser. No. 60/550,479 filed Mar. 5, 2004 in the name of Anast et al.
- results from a model of a feminine protection pad with wings applied to a panty using a virtual hand are generated, FIGS. 13-15 .
- the model results are loaded/input/read into post-processing software, step 123 .
- suitable post-processing software includes but is not limited to ABAQUS CAE (ABAQUS Inc., Pawtucket, R.I.), LSPrePost (Livermore Software Technology Corporation, Livermore, Calif.), and Hyperview (Altair Engineering, Troy, Mich.).
- One known capability of post-processing software is the ability to use a repeated set of commands to drive a series of steps in the software, called scripting, with the repeated set of commands commonly called a script.
- an LSPrePost script can be used to visualize the simulation results of a product against a panty at a series of different locations and angles in space, see FIGS. 13-15 .
- a variety of aspect ratios and zoom levels are possible so as to concentrate the displayed image in the software within a specific region. It is further possible to display only sections or components of a model, such as only the wings, only the undergarment, or only sections of a wing.
- Another known capability of post-processing software is the ability to create a series of images and save the images to a file format such as a JPEG file, step 124 .
- a series of images are generated in a PNG file format from LSPrePost of the parts corresponding to sections of the pad and undergarment in a deformed state, FIGS. 16-18 .
- a calibration mechanism is generated in step 126 .
- the calibration mechanism is created in LSPrePost, in which a cross-sectional two-dimensional image of a box of known dimensions is generated and saved to a file format, shown in FIG. 5 .
- various predetermined colors are assigned to each element in the simulation, for example the outline of the undergarment is black (zero gray scale), the pad is a shade of gray (128 gray scale), etc.
- the differentiation of materials based on the grayscale level provides a means to separate the components in the resulting 2D images during the image analysis steps described next.
- the output images are saved as a common graphics format file, in this example a PNG.
- the images are analyzed, step 132 . A quantitative data report is provided from the analysis, step 134 .
- the data is interpreted and correlated with consumers, step 136 .
- a custom image processing script was written in MatLab to 1) read all the images including the calibration mechanism from LSPrePost, 2) calibrate the images using the calibration mechanism, 3) composite the needed images as required by each measurement in a list of measurements, 4) calculate each measurement, and 5) save the results to a graphical and a text file representation for review.
- Each measurement is described in the steps below.
- a text file is created with a header describing the version of the analysis code used, the date and time of analysis, the computer platform used to run the analysis, and the location of the source images.
- Adhesive patch gap analysis: the image of FIG. 18 is used again.
- the machine direction (MD) of the product is defined as the longest axis of the pad product.
- the cross direction (CD) is orthogonal in the plane of the pad to the MD, refer to FIG. 15 .
- Each CD line of the adhesive patch image of FIG. 18 is scanned to identify the end point of the left patch and the beginning of the right patch. The distance between the points is the adhesive patch gap.
- This gap is plotted versus the distance along the MD, yielding a distinctly shaped plot for each product simulation. The maximum and minimum gaps in the profile along with the positions are found and reported in mm.
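The CD line scan for the adhesive patch gap can be sketched as follows, as a Python illustration; the 128 gray level for patch pixels and the convention that image rows run along the MD while columns run along the CD are assumptions made for the sketch:

```python
import numpy as np

# Assumed conventions: rows run along the machine direction (MD),
# columns along the cross direction (CD), and adhesive-patch pixels
# carry gray level 128.
PATCH = 128

def patch_gap_profile(img, mm_x):
    """Scan each CD line for the end of the left patch and the start of
    the right patch; the pixel distance between them, scaled to mm, is
    the adhesive patch gap. Returns the profile and its min/max."""
    gaps = []
    for row in img:
        cols = np.nonzero(row == PATCH)[0]
        if cols.size == 0:
            continue                      # no patch on this CD line
        breaks = np.nonzero(np.diff(cols) > 1)[0]
        if breaks.size == 0:
            continue                      # a single patch run, no gap
        left_end = cols[breaks[0]]        # last pixel of the left patch
        right_start = cols[breaks[0] + 1] # first pixel of the right patch
        gaps.append((right_start - left_end - 1) * mm_x)
    gaps = np.asarray(gaps)
    return gaps, gaps.min(), gaps.max()
```

Plotting `gaps` against the MD row positions gives the distinctly shaped gap profile described above.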
- Wing Area and Gap Analysis: a single image of the wings only is used, as shown in FIG. 17 .
- the area of each wing is determined as in 3) above.
- the two wing areas, average wing area and total wing area are reported in mm².
- Using a CD line scanning procedure as in 4) above, the wing gap profile is calculated and plotted vs. the MD distance. The minimum gap and the leading and trailing maximum gaps are reported in mm.
- Panty Elastic Wing Gap: a composite image of the panty elastic edge and wing is used here, see FIGS. 16 and 17 .
- a CD scanning procedure as in 4) above is used to identify the area of the wing located over the panty. The individual, average and total wing coverage areas are reported in mm².
- the wing area located between the start of the wing on a given line and the start of the panty elastic on the same line is identified as the panty elastic gap, and the two starting points are used to determine the gap distance.
- the gap distance is plotted vs. MD distance.
- a separate plot is generated for the gap profile, smoothed with a 7-point running average filter. The minimum gap, maximum gap, and their locations, as well as the average gap distance, are reported in mm.
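The 7-point running average filter mentioned above is a standard centered moving average; a minimal Python sketch, with edge padding as an assumed boundary treatment so the smoothed profile keeps the same length:

```python
import numpy as np

def running_average(profile, window=7):
    """Smooth a gap profile with a centered running-average filter, as in
    the 7-point filter described above. The ends are padded by repeating
    the edge values (an assumption) so the output matches the input length."""
    profile = np.asarray(profile, dtype=float)
    pad = window // 2
    padded = np.pad(profile, pad, mode="edge")  # repeat edge values
    kernel = np.ones(window) / window           # uniform averaging window
    return np.convolve(padded, kernel, mode="valid")
```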
- the procedure is repeated for the wing elastic gap on the right side of the panty.
- the above method is not limited to virtual model images; it could equally well be used with images generated from digitized models of real physical prototypes applied on real garments by real humans, additionally, but not necessarily, involving real bodies.
- Digitized models could include 3D geometry as described in method 70 or by taking digital pictures of the product in place.
- Calibration of digital pictures would entail adding an object of known length to the image field of view, such as a precision rule as routinely practiced in the art, and using manual or image analysis techniques to locate and calibrate the image based on the known calibration marks of the object. Equally, any object of known dimension could be used as a calibration source, including the products or garments themselves, provided known features are not distorted in the image.
- Identification of features, using image analysis techniques, within the image could be accomplished in a number of ways, the goal of which is to provide sufficient contrast to isolate features of interest. These include but are not limited to: 1) manually pre-indicating the features using a highlighting method such as colored ink markers or paint, 2) using colored raw materials and/or colored garments to provide sufficient contrast, 3) using fluorescent dyes or native material fluorescence (for example adhesives naturally fluoresce) and UV illumination, 4) edge extraction, edge and/or pattern correlation or similar image analysis techniques, or 5) any combination of the above techniques.
- This feature identification can then be coupled with the procedures outlined in the previous example above to provide quantified output, in a manner and for use as discussed in the example.
- the above described processes can be used to improve product performance of existing products, to design new products, evaluate new concepts, and optimize design. Furthermore, an initial design can be analyzed using one of the methods above, the analysis results can be used to iterate the design and subsequently repeated. This enables a rapid development cycle for product design.
Abstract
A method of quantitatively analyzing results of a model. At least one image of a model is generated in a post-processor. A calibration mechanism of known dimensions is generated in the post-processor. The calibration mechanism is read into analysis software. The image is read into the analysis software. The image from the post-processor is analyzed in the analysis software quantitatively using the calibration mechanism.
Description
- This application claims the benefit of U.S. Provisional Application No. 60/550,479, filed Mar. 5, 2004 and U.S. Provisional Application No. 60/550,490 filed Mar. 5, 2004.
- The present invention relates generally to quantitative analysis of images from a post-processor, and more particularly to quantitative analysis of images from a post-processor using a calibration mechanism in the analysis software.
- Computer simulations of motion, e.g., using FEA, have long been used to model and predict the behavior of systems, particularly dynamic systems. Such systems utilize mathematical formulations to calculate structural volumes under various conditions based on fundamental physical properties. Various methods are known to convert a known physical object into a grid, or mesh, for performing finite element analysis, and various methods are known for calculating interfacial properties, such as stress and strain, at the intersection of two or more modeled physical objects.
- Use of computer simulations such as computer aided modeling in the field of garment fit analysis is known. Typically, the modeling involves creating a three-dimensional (hereinafter “3D”) representation of the body, such as a woman, and a garment, such as a woman's dress, and virtually representing a state of the garment when the garment is actually put on the body. Such systems typically rely on geometry considerations, and do not take into account basic physical laws. One such system is shown in U.S. Pat. No. 6,310,627, issued to Sakaguchi on Oct. 30, 2001.
- Another field in which 3D modeling of a human body is utilized is the field of medical device development. In such modeling systems, geometry generators and mesh generators can be used to form a virtual geometric model of an anatomical feature and a geometric model of a candidate medical device. Virtual manipulation of the modeled features can be output to stress/strain analyzers for evaluation. Such a system and method are disclosed in WO 02/29758, published Apr. 11, 2002 in the names of Whirley, et al.
- Further, U.S. Pat. No. 6,810,300, issued to Woltman, et al. Oct. 26, 2004, discloses a method of designing a product for use on a body using a preferred product configuration.
- While methods of designing products using computer simulations of motion are well known, the analysis available in these methods is typically bound by the limited capabilities intrinsic to a post-processor. For example, post-processors are unable to automatically measure areas in slice planes between two arbitrary surfaces. Furthermore, measurements of distances are typically reliant on distances between nodes in a simulation. Existing image analysis software has been developed with a wide range of capabilities that post-processors lack. However, the ability to directly couple post-processing software with image analysis software was unknown. More specifically, there is a need to allow for the passing of critical information from one software package to another.
- A method to quantitatively analyze results of a model is disclosed. The method comprises the steps of:
-
- generating at least one image of said model in a post-processor;
- generating a calibration mechanism of known dimensions in said post-processor;
- reading said calibration mechanism into an analysis software;
- reading said image into said analysis software; and
- analyzing said image from said post-processor in said analysis software quantitatively using said calibration mechanism.
- A plurality of images may be generated in the post-processor. The model may be a product being worn on, in or adjacent to a body. The product may be an absorbent article. The absorbent article may be a sanitary napkin, pantiliner, incontinent pad, tampon, diaper, or breast pad. The calibration mechanism may be a calibration image, such as a box. The analysis software may be image analysis software.
- Additionally, a method of analyzing physical test results in a virtual environment is disclosed. The method comprises the steps of:
-
- replicating at least one physical specimen in digital form to define a series of points;
- reading said points into a post-processor;
- generating at least one image from said series of points in said post-processor;
- generating a calibration mechanism of known dimensions in said post-processor;
- reading said calibration mechanism into an analysis software;
- reading said image into said analysis software; and
- analyzing said image from said post-processor in said analysis software quantitatively using said calibration mechanism.
- The step of converting said series of points into a format that can be read into a post-processor is carried out after the step of replicating at least one physical specimen in digital form to define a series of points. The physical specimen is a product capable of being worn on, in or adjacent to a body. The product may be an absorbent article, such as a sanitary napkin, pantiliner, incontinent pad, tampon, diaper, or breast pad.
- The step of aligning said series of points with at least a second series of points can be carried out after the step of replicating at least one physical specimen in digital form to define a series of points.
- Furthermore, a method for calculating a spatial relationship between at least two objects is disclosed. The method comprises the steps of:
-
- providing a model;
- generating results by running said model;
- reading said model results into a post-processor;
- generating at least one image of said model in a post-processor;
- generating a calibration mechanism of known dimensions in said post-processor;
- reading said calibration mechanism into an image analysis software;
- reading said image into said image analysis software;
- analyzing said image from said post-processor in said analysis software quantitatively using said calibration mechanism; and
- calculating the spatial relationship between said at least two objects using the quantitative analysis of the step of analyzing said image from said post-processor in said analysis software quantitatively using said calibration mechanism.
- At least one of the objects can be a human body and at least one of said objects can be a product being worn on, in or adjacent to a body. The spatial relationship can be an area, a volume or a distance between the objects.
-
- FIG. 1 is a flow diagram showing a method of quantitatively analyzing results of a model;
- FIG. 2 is a flow diagram showing a method of quantitatively analyzing results of a model;
- FIG. 3 is a visualization of the simulation results of a product against a body;
- FIG. 4 is an image of a slice;
- FIG. 5 is an image of a calibration mechanism;
- FIG. 6 is an image of a graphical result;
- FIG. 7 is an image of a graphical result;
- FIG. 8 is a flow diagram showing a method of analyzing physical test results in a virtual environment;
- FIG. 9 is an image of the aligned surfaces;
- FIG. 10 is an image of a slice;
- FIG. 11 is an image of a graphical result;
- FIG. 12 is a flow diagram showing a method for calculating a spatial relationship between two objects;
- FIGS. 13-15 are images of a feminine protection pad with wings applied to a panty using a virtual hand;
- FIGS. 16-18 are images of sections of the pad and undergarment; and
- FIG. 19 is an image of graphical results.
- Referring now to the drawings and in particular to FIG. 1, a method, generally indicated by reference numeral 10, of quantitatively analyzing the results of a model is shown. When used herein, examples of specific equipment, software, products, and wearers are for illustrative purposes, and other types of these items may be used without departing from the scope of the present invention. In the method illustrated in FIG. 1, the method 10 includes generating at least one image of a model in a post-processor at step 24. A calibration mechanism of known dimensions is generated in the post-processor at step 26. At step 28 the calibration mechanism is read into analysis software. At step 30, the image is also read into the analysis software. The image from the post-processor is analyzed quantitatively with the analysis software using the calibration mechanism at step 32.
- In step 24 a plurality of images are generated in the post-processor. There is no limit on the number of images that can be generated in the post-processor. The model may be a product, such as an absorbent article, worn on, in or adjacent to a human body. The absorbent articles may be sanitary napkins, pantiliners, incontinent pads, tampons, diapers, and breast pads.
- In step 26 the calibration mechanism may be a calibration image, such as a box. Other calibration images include but are not limited to bars, lines, parallel lines, rectangles, circles, triangles, unique shapes created for specialized applications and other conventional shapes.
- In step 28 the calibration mechanism is preferably read into image analysis software.
- Examples of suitable post-processing software include but are not limited to ABAQUS CAE (ABAQUS Inc., Pawtucket, R.I.), LSPrePost (Livermore Software Technology Corporation, Livermore, Calif.), Hyperview (Altair Engineering, Troy, Mich.), Fieldview (Intelligent Light, Rutherford, N.J.), and EnSight (Computational Engineering International, Apex, N.C.).
- When developing products such as sanitary napkins it is desirable to understand the fit of the product as it relates to the closeness of the product to the human body. One approach is to measure the gap between the product and the body at select locations.
FIG. 2 depicts the method 40 for quantitatively analyzing the results of a model, and more particularly for quantitatively determining the fit of the product in relation to the human body. - In
step 42, model/simulation results are generated. A process to arrive at the simulation results is described in U.S. Pat. No. 6,810,300 issued Oct. 26, 2004 to Woltmann et al., or commonly-assigned co-pending application Ser. No. 60/550,479 filed Mar. 5, 2004 in the name of Anast et al. - The model results are loaded/inputed/read into post-processing software,
step 43. Examples of suitable post-processing software include but are not limited to ABAQUS CAE (ABAQUS Inc., Pawtucket, R.I.), LSPrePost (Livermore Software Technology Corporation, Livermore Calif.), and Hyperview (Altair Engineering, Troy, Mich.). - One known capability of post-processing software is the ability to use a repeated set of commands to drive a series of steps in the software, called scripting, with the repeated set of commands commonly called a script. In one such embodiment, an LSPrePost script can be used to visualize the simulation results of a product against a body at a series of different locations and angles in space, see
FIG. 3 . Another known capability of post-processing software is the ability to create a series of images and save the images to a file format such as a JPEG file,step 44. Another well known feature is the ability to generate a cross sectional 2 dimensional image, called a slice, through any combination of objects in the simulation at a variety of conditions that the model has calculated.FIG. 4 is an example of a slice. The concept of slicing can be thought of as cutting through the model along a determined path, and visually seeing the surface of all displayed objects along the path of this cut. These objects can include elements, nodes, surfaces, meshes, parts, and such. This can also include a sub-set of these objects. The states of these objects are previously determined when the model is being generated. Also, a variety of aspect ratios, and zoom levels are possible so as to concentrate the displayed image in the software within a specific region. In one embodiment, a series of slice images are generated in a PNG file format from LSPrePost of the parts corresponding to the pad and body in a deformed state at slice planes that would correspond to key anatomical features on a woman such as the prepuce and perineum for example. - While the generation of images from a simulation is well known, the ability to couple such images into image analysis software was unknown. More specifically there is the need to allow for the passing of critical information from one software to another. This is accomplished by the generation of a calibration mechanism,
step 46. In one such embodiment, the calibration mechanism is created in LSPrePost in which a cross sectional 2 dimensional image of a box of known dimensions is generated and saved to a file format, shown inFIG. 5 The known box image dimensions allow the screen capture process to be calibrated in physical dimensions, millimeters in this example. In other words, the box provides a linkage between the post processing software and image analysis software, in that the image analysis software would otherwise have no knowledge of physical dimensions from the post-processing software. In one embodiment, if the box image of the calibration box is known to represent a physical dimension of 25 mm by 25 mm, the image analysis software is capable of understanding the image as being 180 pixels by 180 pixels, the image analysis software now understands physical relationship in terms of a correlation of mm per pixel. Additionally, concerns such as imaging with different graphical resolutions, different resolution in horizontal vs. vertical, image orientation, and the such are resolved. - Additional images in the post-processer can be generated at the same level of magnification (zoom) and so share the common calibration factors and aspect ratio. It is possible to repeat this process such that different angles or different zooms are considered.
- Other forms of calibration might include producing other reference objects for example, a set of parallel lines or a bar, using a fixed predefined pixel size, saving the image data directly in real world coordinates, or passing the calibration parameters as an output text file. Alternatively, one can imagine using an inherent or artificial object within a given product performance model for such a purpose. Examples include a simulation being run with a square or cube embedded in the simulation as an artificial object, or using a fixed reference internal to a body such as femur width as an inherent object.
- In addition, various predetermined colors (digital grays scales in the example but any other colors could be used) are assigned to each element in the simulation, for example the body skin is black (zero gray scale), the pad edge is a shade of gray (128 gray scale), etc. The differentiation of materials based on the grayscale level provides a means to separate the components in the resulting 2D images during the image analysis steps described next. The output images are saved as a common graphics format file, in this example a PNG. Other forms of data output could be used, such as TIF or JPEG image format files, text file ASCII representation of the data, a binary format raw data file, or a 3D image format such as VRML or stereolithography.
- Once the series of image data files and calibration mechanism are generated, they are imported/inputed/read into image analysis software, steps 48 and 50. Examples of suitable image analysis software include but are not limited to MatLab (The Mathworks, Nattick Mass.), Optimas (Media Cybernetics, Silver Spring Md.) and ImagePro (Media Cybernetics, Silver Spring Md.). Other data analysis software packages would also be suitable to import and process this type of data, provided the data files are compatible with the software with either a built in reader or a custom programmed reader typical of what can be done within MatLab for uncommon data files. MatLab contains a PNG reader and is used to import the LSPrePost images directly into MatLabs working space. Once the images are read into the software they are analyzed,
step 52. A quantitative data report is provided from the analysis,step 54. The data is interpreted and correlated with consumers,step 56. - In one example, a custom image processing script was written in MatLab to: 1) read all the images from LSPrePost, 2) calibrate the images, 3) measure the pad-body gap area, and 4) save the results to a graphical and a text file representation for review.
FIGS. 6 and 7 illustrate examples of a graphical result files from different locations. Each step is described below. - 1) Calibration. The calibration image,
FIG. 5 , is read and the box identified by locating all pixels with grayscale of 128. The X and Y extents in pixels of the box are used to calculate calibration factors in mm to be applied to all subsequent images. - 2) For each calibrated image, the extent of the pad is identified for limiting the width of the analysis region. This can be accomplished in a number of ways. One procedure defines the minimum and maximum×coordinate of the particular gray scale associated with the pad, as described previously.
- 3) For each image, a vertical line scanning algorithm is used to find the lower body surface and the upper product surface. In sub-step, 3-1) image pixels in between these vertical points and within the defined analysis region, step 2), are accepted for inclusion in the gap area. Two addition sub-steps are used to fill in any gap areas missed scanning vertically. Sub-step 3-2) scans horizontally, finding pixels between the gap boundary and any pixels identified above. Sub-step 3-3) scans vertically again in a manner similar to 3-2) to more finely identify remaining gap pixels. Alternative methods, generally known as seed fill methods could equally well be used. Products that have holes or slits in the upper surfaces cause some trouble in this analysis, causing the pixels selection to leak into the interior of the product (not desirable) and thus seed fill routines will fail badly. To overcome this problem, one may limit the leakage by allowing the operator to remove sub-steps 3-2) and 3-3) from the analysis and note this correction in the results file. A more suitable approach would be to connect holes in the pad surface within the image using appropriate image processing techniques. The calibrated pixel areas for the above steps are summed to determine the gap area.
- 4) A report text file is generated that records the Product name, software version, computer platform, time and date of analysis, location of the source images, calibration factors, image names, gap areas in mm2, and the selection steps used for each image. In addition, a copy is generated of each input PNG image with an addition of the gap area being colored or shaded. This provides a visual record of the gap definition as shown in
FIGS. 6 and 7 . - A
method 70 for analyzing physical test results in a virtual environment is shown inFIG. 8 . - A sanitary napkin or feminine pad is placed on a human female body and a first cast of the sanitary napkin on the human female is made,
step 72. The first cast is removed and a second cast of the human female body is made,step 74. In a preferred execution the second cast is in close proximity and most preferably in direct contact with the body. The casts form the physical specimens. - The first and second casts are replicated in digital form to define a series of points, steps 76 and 78. The series of points can be connected to form a series of lines or a surface. In one example, a virtual surface of a product or panty is created using a 3D digitizing arm (MicroScribe/Immersion Corporation, San Jose, Calif., USA). The digitizing arm is connected to a computer equipped with a software program that supports modeling via reverse engineering such as Rhinoceros (Robert McNeel & Associates, Seattle, Wash.). A calibration process orients the digitizing arm in the real world with the coordinate system in the modeling software and is described with the equipment operating instructions.
- The digitizer uses XYZ coordination from a stylus on the digitizer arm to create a 3D wire frame model of the surface. This is accomplished by moving the stylus across the surface and capturing points. In one such embodiment, points can be captured along an axis of the surface; the points being taken along a series of sequential lines that are spaced across the surface; the number of lines and points within each line is determined by the level of detail to be captured. These sequential lines can be lofted within the software to generate the 3D surface. The Rhinoceros software offers a variety of file formats to save this 3D surface as. The file can be saved as a stereolithography file in an ASCII format.
- When two or more series of points are replicated in digital form it is often desirable that they be aligned with respect to one another,
step 80. This alignment process can be done manually, based on the digital surface profile, or with additional physical data. - In one embodiment reference markers are placed on the body and transferred to the cast during the casting process. Any number of reference markers may be used and the location of the markers may be positioned as desired. The reference markers are separately digitized as a series of marker points and saved as a text file. The reference markers associated with each series of digitized points can be used to align the two series of points with respect to each other using techniques such as least squared methods, residual minimization and the like. The two series of aligned points are saved together in a file in the aligned position.
- While the above process is performed for the surface of a product and for the surface of a body, it is understood that the two surfaces could correlate to any variety of surfaces including pad to pad, body to body, pad to undergarment, undergarment to body, and such. It is also understood that only at least two or more surfaces could be considered in such analysis. In addition a mannequin could be used to interact with the product, and the digital representation of the mannequin used versus the digitized product, the mannequin data resulting from digitizing the mannequin as described for a real body above, or directly from digital data from real humans used to manufacture the mannequin.
- Digitized models of the body and product can be obtained from digitizing the casts as above or could alternatively be captured directly from the body or stabilized product using any number of digitizing instruments, for example an Inspeck (Montreal, Canada) Capturer optical non-contact digitizer. One typical common output of these instruments is a stereolithography file as previously described. The body or products could also result in part from a surface used in a simulation as previously discussed, in mesh form or in another file format, or any combination of such between real digitized surface and virtually created or analyzed surface.
- The series of points are read directly into a post-processor,
step 82. When there are two or more aligned series of points, they can also be read directly into a post-processor. In some cases it is necessary to convert the series of points into a format that can be read into a post-processor. In one such case a stereolithography file of the aligned surfaces is read directly into LSPrePost, shown inFIG. 9 . - A known capability of post-processing software is the ability to create a series of images and save the images to a file format such as a JPEG file,
step 84. Another well known feature is the ability to generate a cross sectional 2 dimensional image, called a slice, through any combination of objects.FIG. 10 is an example of a slice. Also, a variety of aspect ratios, and zoom levels are possible so as to concentrate the displayed image in the software within a specific region. In one embodiment, a series of slice images are generated in a PNG file format from LSPrePost of the series of points corresponding to the pad and body at slice planes that would correspond to key anatomical features on a woman such as the prepuce and perineum for example. - A calibration mechanism is generated in
step 86. In one such embodiment, the calibration mechanism is created in LSPrePost in which a cross sectional 2 dimensional image of a box of known dimensions is generated and saved to a file format, shown inFIG. 5 - In addition, various predetermined colors (digital grays scales in the example but any other colors could be used) are assigned to each series of points, for example the body is black (zero gray scale), the pad is a shade of gray (128 gray scale), etc. The differentiation of materials based on the grayscale level provides a means to separate the components in the resulting 2D images during the image analysis steps described next. The output images are saved as a common graphics format file, in this example a PNG.
- Once the series of image data files and calibration mechanism are generated, they are imported/inputed/read into image analysis software, steps 88 and 90. Once the images are read into the software they are analyzed,
step 92. A quantitative data report is provided from the analysis,step 94. The data is interpreted and correlated with consumers,step 96. - In one example, a custom image processing script was written in MatLab to: 1) read all the images from LSPrePost, 2) calibrate the images, 3) measure the pad-body gap area, and 4) save the results to a graphical and a text file representation for review.
FIG. 11 illustrates an example of a graphical result file. Each step is described below. - 1. Calibration. The calibration image,
FIG. 5 , is read and the box identified by locating all pixels with grayscale of 128. The X and Y extents in pixels of the box are used to calculate calibration factors in mm to be applied to all subsequent images. - 2. For each calibrated image, e.g.,
FIG. 10 , the extent of the pad is identified for limiting the width of the analysis region. This can be accomplished in a number of ways. One procedure defines the minimum and maximum×coordinate of the particular gray scale associated with the pad, as described previously. - 3. For each image, a vertical line scanning algorithm is used to find the lower body surface and the upper product surface. In sub-step, 3-1) image pixels in between these vertical points and within the defined analysis region, step 2), are accepted for inclusion in the gap area. Two addition sub-steps are used to fill in any gap areas missed scanning vertically. Sub-step 3-2) scans horizontally, finding pixels between the gap boundary and any pixels identified above. Sub-step 3-3) scans vertically again in a manner similar to 3-2) to more finely identify remaining gap pixels. Alternative methods, generally known as seed fill methods could equally well be used. Products that have holes or slits in the upper surfaces cause some trouble in this analysis, causing the pixels selection to leak into the interior of the product (not desirable) and thus seed fill routines will fail badly. To overcome this problem, one may limit the leakage by allowing the operator to remove sub-steps 3-2) and 3-3) from the analysis and note this correction in the results file. A more suitable approach would be to connect holes in the pad surface within the image using appropriate image processing techniques. The calibrated pixel areas for the above steps are summed to determine the gap area.
- 4. A report text file is generated that records the Product name, software version, computer platform, time and date of analysis, location of the source images, calibration factors, image names, gap areas in mm2, and the selection steps used for each image. In addition, a copy is generated of each input PNG image with an addition of the gap area being colored or shaded. This provides a visual record of the gap definition as shown in
FIG. 11 . - In a separate embodiment, instead of considering the product fit against the body, one can measure a series of characteristics of a pad fit against an undergarment. This is of particular utility when considering the performance of sanitary napkins or feminine care products having wings. The
method 120 of calculating a spatial relationship between at least two objects is shown inFIG. 12 . - In
step 122, model/simulation results are generated. A process to arrive at the simulation results is described in U.S. Pat. No. 6,810,300 issued Oct. 26, 2004 to Woltmann et al., or commonly-assigned co-pending application Ser. No. 60/550,479 filed Mar. 5, 2004 in the name of Anast et al. - In one example, results from a model of a feminine protection pad with wings applied to a panty using a virtual hand are generated,
FIGS. 13-15 . - The model results are loaded/inputed/read into a post-processing software,
step 123. Examples of suitable post-processing software include but are not limited to ABAQUS CAE (ABAQUS Inc., Pawtucket, R.I.), LSPrePost (Livermore Software Technology Corporation, Livermore Calif.), and Hyperview (Altair Engineering, Troy, Mich.). - One known capability of post-processing software is the ability to use a repeated set of commands to drive a series of steps in the software, called scripting, with the repeated set of commands commonly called a script. In one such embodiment, an LSPrePost script can be used to visualize the simulation results of a product against a panty at a series of different locations and angles in space, see
FIGS. 13-15 . Also, a variety of aspect ratios, and zoom levels are possible so as to concentrate the displayed image in the software within a specific region. It is further possible to display only sections or components of a model, such as only wings, only undergarment, only sections of a wing. Another known capability of post-processing software is the ability to create a series of images and save the images to a file format such as a JPEG file,step 124. In one embodiment, a series of images are generated in a PNG file format from LSPrePost of the parts corresponding to sections of the pad and undergarment in a deformed state,FIGS. 16-18 . - A calibration mechanism is generated in
step 126. In one such embodiment, the calibration mechanism is created in LSPrePost in which a cross sectional 2 dimensional image of a box of known dimensions is generated and saved to a file format, shown inFIG. 5 - In addition, various predetermined colors (digital grays scales in the example but any other colors could be used) are assigned to each element in the simulation, for example the outline of the undergarment is black (zero gray scale), the pad is a shade of gray (128 gray scale), etc. The differentiation of materials based on the grayscale level provides a means to separate the components in the resulting 2D images during the image analysis steps described next. The output images are saved as a common graphics format file, in this example a PNG.
- Once the series of image data files and calibration mechanism are generated, they are imported/inputed/read into image analysis software, steps 128 and 130. Once the images are read into the software they are analyzed,
step 132. A quantitative data report is provided from the analysis,step 134. The data is interpreted and correlated with consumers,step 136. - In one such example, a custom image processing script was written in MatLab to 1) read all the images including the calibration mechanism from LSPrePost, 2) calibrate the images using the calibration mechanism, 3) composite the needed images as required by each measurement in a list of measurements, 4) calculate each measurement, and 5) save the results to a graphical and a text file representation for review. Each measurement is described in the steps below.
- 1) A text file is created with a header describing the version of the analysis code used, the date and time of analysis, the computer platform used to run the analysis, and the location of the source images.
- 2) Calibration. The calibration image,
FIG. 5 , is read and the box identified by locating all pixels with grayscale of 128. The X and Y extents in pixels of the box are used to calculate calibration factors in mm to be applied to all subsequent images. - 3) Adhesive patch Analysis. This requires just a single image of the adhesive patches, see
FIG. 17 . Only image pixels matching the grayscale assigned to patches are selected from within the image and set to a value of 1. All other pixels are assigned a value of zero. This forms a binary image. Pixels that are connected to their neighbors by at least one of the four sides or four edges of the pixel (8-connected) are grouped into blobs of connected pixels. The area of each blob and the total area is reported in mm2. - 4) Adhesive patch gap analysis. The image of
FIG. 18 is used again. The machine direction (MD) of the product is defined as the longest axis of the pad product. The cross direction (CD) is orthogonal in the plane of the pad to the MD, refer toFIG. 15 . Each CD line of the adhesive patch image ofFIG. 18 is scanned to identify the end point of the left patch and the beginning of the right patch. The distance between the points is the adhesive patch gap. This gap is plotted versus the distance along the MD, yielding a distinctly shaped plot for each product simulation. The maximum and minimum gaps in the profile along with the positions are found and reported in mm. - 5) Wing Area and Gap Analysis. A single image of the wings only is used as shown in
FIG. 17 . The area of each wing is determined as in 3) above. The two wing areas, average wing area and total wing area are reported in mm2. Using a CD line scanning procedure as in 4) above the wing gap profile is calculated and plotted vs. the MD distance. The minimum, and the leading and trailing maximum gaps are reported in mm. - 6) Panty Elastic Wing Gap. A composite image of the panty elastic edge and wing is used here, see
FIGS. 16 and 17. A CD scanning procedure as in 4) above is used to identify the area of the wing located over the panty. The individual, average, and total wing coverage areas are reported in mm2. Again using a CD scanning procedure, the wing area located between the start of the wing on a given line and the start of the panty elastic on the same line is identified as the panty elastic gap, and the two starting points are used to determine the gap distance. The gap distance is plotted vs. MD distance. A separate plot is generated for the gap profile smoothed with a 7-point running average filter. The minimum gap, maximum gap, and their locations, as well as the average gap distance, are reported in mm. The procedure is repeated for the wing elastic gap on the right side of the panty. - 7) Panty elastic length. Two images are needed: a) the panty elastic edge only, see
FIG. 16, and b) a composite image of the panty elastic image, FIG. 16, with the wings image, FIG. 17, overlaid (the wing image replaces the elastic where they overlap). Using a CD scanning procedure as in 4) above on image a), the first location where the wing overlaps the elastic (where elastic data is missing as a result of the wing overlay process), starting from the top of the image and working down the MD on the left side elastic, is noted. Similarly, the last location where the wing overlaps the elastic is noted. Working from these same two locations on the image of FIG. 16, the length of elastic between the two points is calculated and reported in mm. The procedure is repeated for the right-hand elastic. The average of the left and right elastic lengths is reported as well. The distance between the left and right elastic start points and the distance between the left and right end points are calculated and reported in mm. - 8) The adhesive patch gaps, wing gap, and panty elastic gaps are tabulated vs. MD position and appended to the text report.
- 9) The calibration factors (X and Y directions) for the conversion of pixels into distances in mm and areas in mm2 are appended to the text report for documentation, and the text file is closed.
- 10) All the graphical plots are programmatically placed in one figure, and the figure is written to storage as a JPEG file. An example results file is provided in
FIG. 19. - The above method is not limited to virtual model images; it could equally well be used with images generated from digitized models of real physical prototypes, applied on real garments by real humans, additionally, but not necessarily, involving real bodies. Digitized models could include 3D geometry as described in
method 70, or by taking digital pictures of the product in place. Calibration of digital pictures would entail adding an object of known length to the image field of view, such as a precision rule as routinely practiced in the art, and using manual or image analysis techniques to locate the known calibration marks of the object and calibrate the image from them. Equally, any object of known dimension could be used as a calibration source, including the products or garments themselves, provided the known features are not distorted in the image. Features of interest within the image could be identified, using image analysis techniques, in a number of ways, the goal of which is to provide sufficient contrast to isolate those features. These include, but are not limited to: 1) manually pre-indicating the features using a highlighting method such as colored ink markers or paint, 2) using colored raw materials and/or colored garments to provide sufficient contrast, 3) using fluorescent dyes or native material fluorescence (for example, adhesives naturally fluoresce) together with UV illumination, 4) edge extraction, edge and/or pattern correlation, or similar image analysis techniques, or 5) any combination of the above techniques. This feature identification can then be coupled with the procedures outlined in the previous example to provide quantified output, in a manner and for use as discussed in the example. - The above described processes can be performed over a range of products, bodies, garments, and usage conditions, quantifying the results for each case. In doing so, a population of statistical measurements is created which can be analyzed by known statistical techniques such as design of experiments, linear regression, tests of significant difference, and optimization, to name a few.
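As a concrete illustration of the measurements in steps 2) through 6) above, the core procedures (calibration from a box of known size, 8-connected blob areas, a per-CD-line gap profile, and the 7-point running average) can be sketched in a few short routines. This is a minimal sketch, not the patented implementation: it assumes images are 2-D grayscale NumPy arrays, and the function names, the grayscale value of 128 (taken from step 2), and the default box dimensions are illustrative assumptions.

```python
# Illustrative sketch of the image measurements described above; all names
# and defaults are hypothetical, not the actual analysis code.
import numpy as np
from scipy import ndimage

def calibration_factors(cal_img, box_gray=128, box_mm=(100.0, 100.0)):
    """mm-per-pixel factors (x, y) from a box of known size drawn at box_gray."""
    ys, xs = np.nonzero(cal_img == box_gray)
    extent_px = (xs.max() - xs.min() + 1, ys.max() - ys.min() + 1)
    return box_mm[0] / extent_px[0], box_mm[1] / extent_px[1]

def blob_areas_mm2(img, feature_gray, fx, fy):
    """Areas of 8-connected blobs whose pixels match feature_gray (step 3)."""
    binary = (img == feature_gray).astype(np.uint8)
    # A 3x3 structure groups pixels touching on sides or corners (8-connected).
    labels, n = ndimage.label(binary, structure=np.ones((3, 3)))
    areas_px = ndimage.sum(binary, labels, index=range(1, n + 1))
    return [a * fx * fy for a in areas_px]

def gap_profile_mm(img, feature_gray, fx):
    """Per-CD-line gap between left and right feature, in mm (step 4).
    Rows are CD lines; NaN where a row lacks two separate feature runs."""
    gaps = np.full(img.shape[0], np.nan)
    for r in range(img.shape[0]):
        cols = np.nonzero(img[r] == feature_gray)[0]
        if cols.size < 2:
            continue
        jumps = np.nonzero(np.diff(cols) > 1)[0]
        if jumps.size:  # end of the left run to start of the right run
            gaps[r] = (cols[jumps[0] + 1] - cols[jumps[0]]) * fx
    return gaps

def smooth7(profile):
    """7-point running average, as used for the elastic gap plot (step 6)."""
    kernel = np.ones(7) / 7.0
    valid = ~np.isnan(profile)
    out = np.full_like(profile, np.nan)
    out[valid] = np.convolve(profile[valid], kernel, mode="same")
    return out
```

In this outline, the minimum and maximum of the returned gap profile and their row indices (converted by the MD calibration factor) correspond to the values reported in mm in steps 4) through 6).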
- The above described processes can be used to improve the performance of existing products, to design new products, to evaluate new concepts, and to optimize designs. Furthermore, an initial design can be analyzed using one of the methods above, the analysis results can be used to iterate the design, and the analysis can then be repeated. This enables a rapid development cycle for product design.
- All documents cited in the Detailed Description of the Invention are, in relevant part, incorporated herein by reference; the citation of any document is not to be construed as an admission that it is prior art with respect to the present invention.
- While particular embodiments of the present invention have been illustrated and described, it would be obvious to those skilled in the art that various other changes and modifications can be made without departing from the spirit and scope of the invention. It is therefore intended to cover in the appended claims all such changes and modifications that are within the scope of this invention.
Claims (26)
1. A method of quantitatively analyzing results of a model comprising the steps of:
a) generating at least one image of said model in a post-processor;
b) generating a calibration mechanism of known dimensions in said post-processor;
c) reading said calibration mechanism into an analysis software;
d) reading said image into said analysis software; and
e) analyzing said image from said post-processor in said analysis software quantitatively using said calibration mechanism.
2. The method of claim 1 , wherein a plurality of images are generated in said post-processor.
3. The method of claim 1 , wherein said model is a product being worn on, in or adjacent to a body.
4. The method of claim 3 , wherein said product is an absorbent article.
5. The method of claim 4 , wherein said absorbent article is selected from the group consisting of sanitary napkins, pantiliners, incontinent pads, tampons, diapers, and breast pads.
6. The method of claim 1 , wherein said calibration mechanism is a calibration image.
7. The method of claim 6 , wherein said calibration image is a box.
8. The method of claim 1 , wherein said analysis software is image analysis software.
9. A method of analyzing physical test results in a virtual environment comprising the steps of:
a) replicating at least one physical specimen in digital form to define a series of points;
b) reading said points into a post-processor;
c) generating at least one image from said series of points in said post-processor;
d) generating a calibration mechanism of known dimensions in said post-processor;
e) reading said calibration mechanism into an analysis software;
f) reading said image into said analysis software; and
g) analyzing said image from said post-processor in said analysis software quantitatively using said calibration mechanism.
10. The method of claim 9 , wherein the step of converting said series of points into a format that can be read into a post-processor is carried out after step a).
11. The method of claim 9 , wherein the physical specimen is of a product capable of being worn on, in or adjacent to a body.
12. The method of claim 11 , wherein said product is an absorbent article.
13. The method of claim 12 , wherein said absorbent article is selected from the group consisting of sanitary napkins, pantiliners, incontinent pads, tampons, diapers, and breast pads.
14. A method of analyzing physical test results in a virtual environment comprising the steps of:
a) replicating at least one physical specimen in digital form to define a series of points;
b) aligning said series of points with at least a second series of points;
c) reading said aligned points into a post-processor;
d) generating at least one image from said aligned points in said post-processor;
e) generating a calibration mechanism of known dimensions in said post-processor;
f) reading said calibration mechanism into an analysis software;
g) reading said image into said analysis software; and
h) analyzing said image from said post-processor in said analysis software quantitatively using said calibration mechanism.
15. The method of claim 14 , wherein the step of converting said series of points into a format that can be read into a post-processor is carried out after step a).
16. The method of claim 14 , wherein at least one of said series of points represents a product being worn on a body.
17. The method of claim 16 , wherein said product is an absorbent article.
18. The method of claim 17 , wherein said absorbent article is selected from the group consisting of sanitary napkins, pantiliners, incontinent pads, tampons, diapers, and breast pads.
19. A method for calculating a spatial relationship between at least two objects, said method comprising the steps of:
a) providing a model,
b) generating model results by running said model,
c) reading said model results into a post-processor,
d) generating at least one image of said model in the post-processor,
e) generating a calibration mechanism of known dimensions in said post-processor,
f) reading said calibration mechanism into an image analysis software,
g) reading said image into said image analysis software,
h) analyzing said image from said post-processor in said analysis software quantitatively using said calibration mechanism, and
i) calculating the spatial relationship between said at least two objects using the quantitative analysis of step h).
20. The method of claim 19 , wherein at least one of said objects is a human body and at least one of said objects is a product being worn on, in or adjacent to a body.
21. The method of claim 19 , wherein said spatial relationship is an area in at least one of said images between at least two of said objects.
22. The method of claim 19 , wherein said spatial relationship is a volume between at least two of said objects.
23. The method of claim 19 , wherein said spatial relationship is a distance between at least two points on said objects.
24. The method of claim 23 , wherein at least two distances are calculated and capable of being plotted in a graph.
25. The method of claim 24 , wherein said graph is of a profile through time, space, distance, or location.
26. The method of claim 19 , wherein at least one of said objects is a sanitary napkin and at least one of said objects is a woman's undergarment.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/071,917 US20050267613A1 (en) | 2004-03-05 | 2005-03-04 | Method to quantitativley analyze a model |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US55047904P | 2004-03-05 | 2004-03-05 | |
US55049004P | 2004-03-05 | 2004-03-05 | |
US11/071,917 US20050267613A1 (en) | 2004-03-05 | 2005-03-04 | Method to quantitativley analyze a model |
Publications (1)
Publication Number | Publication Date |
---|---|
US20050267613A1 true US20050267613A1 (en) | 2005-12-01 |
Family
ID=34961330
Family Applications (5)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/071,919 Abandoned US20050267614A1 (en) | 2004-03-05 | 2005-03-04 | System and method of virtual modeling of thin materials |
US11/072,152 Active 2025-10-24 US7634394B2 (en) | 2004-03-05 | 2005-03-04 | Method of analysis of comfort for virtual prototyping system |
US11/071,920 Abandoned US20050264562A1 (en) | 2004-03-05 | 2005-03-04 | System and method of virtual representation of thin flexible materials |
US11/071,917 Abandoned US20050267613A1 (en) | 2004-03-05 | 2005-03-04 | Method to quantitativley analyze a model |
US11/072,047 Abandoned US20050267615A1 (en) | 2004-03-05 | 2005-03-04 | System and method of virtual representation of folds and pleats |
Family Applications Before (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/071,919 Abandoned US20050267614A1 (en) | 2004-03-05 | 2005-03-04 | System and method of virtual modeling of thin materials |
US11/072,152 Active 2025-10-24 US7634394B2 (en) | 2004-03-05 | 2005-03-04 | Method of analysis of comfort for virtual prototyping system |
US11/071,920 Abandoned US20050264562A1 (en) | 2004-03-05 | 2005-03-04 | System and method of virtual representation of thin flexible materials |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/072,047 Abandoned US20050267615A1 (en) | 2004-03-05 | 2005-03-04 | System and method of virtual representation of folds and pleats |
Country Status (3)
Country | Link |
---|---|
US (5) | US20050267614A1 (en) |
EP (5) | EP1721271A1 (en) |
WO (5) | WO2005088582A2 (en) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050264572A1 (en) * | 2004-03-05 | 2005-12-01 | Anast John M | Virtual prototyping system and method |
US20070027667A1 (en) * | 2005-07-14 | 2007-02-01 | Osborn Thomas W Iii | Computational model of the internal human pelvic environment |
US20080065041A1 (en) * | 2006-08-16 | 2008-03-13 | Mihai Alin Stan | Process for producing folded and compressed tampons |
US20080140368A1 (en) * | 2006-12-08 | 2008-06-12 | The Procter & Gamble Company | Method and system for predictive modeling of articles, such as tampons |
US20080140367A1 (en) * | 2006-12-08 | 2008-06-12 | The Procter & Gamble Company | Method and system to model materials for use in articles, such as tampons |
US20080183450A1 (en) * | 2007-01-30 | 2008-07-31 | Matthew Joseph Macura | Determining absorbent article effectiveness |
US20090048815A1 (en) * | 2007-08-17 | 2009-02-19 | The Procter & Gamble Company | Generalized constitutive modeling method and system |
US20090076783A1 (en) * | 2007-09-13 | 2009-03-19 | Tyco Healthcare Retail Services Ag | Digitally optimized fastener assembly and method of making the same |
US20110060555A1 (en) * | 2009-09-10 | 2011-03-10 | Arthur Joseph Koehler | Computer Based Models for Absorbent Articles |
US20110060570A1 (en) * | 2009-09-10 | 2011-03-10 | Chengming Wang | Computer Based Models for Absorbent Articles |
US9092585B2 (en) | 2013-01-22 | 2015-07-28 | The Procter & Gamble Company | Computer based models for absorbent articles |
Families Citing this family (49)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2006002061A2 (en) * | 2004-06-15 | 2006-01-05 | Sara Lee Corporation | Garment-model computer simulations |
WO2006002060A2 (en) * | 2004-06-15 | 2006-01-05 | Sara Lee Corporation | Systems and methods of generating integrated garment-model simulations |
DE602004026566D1 (en) | 2004-07-28 | 2010-05-27 | Procter & Gamble | Indirect pressure from AMG |
US7487116B2 (en) * | 2005-12-01 | 2009-02-03 | International Business Machines Corporation | Consumer representation rendering with selected merchandise |
US20160246905A1 (en) | 2006-02-14 | 2016-08-25 | Power Analytics Corporation | Method For Predicting Arc Flash Energy And PPE Category Within A Real-Time Monitoring System |
US9557723B2 (en) | 2006-07-19 | 2017-01-31 | Power Analytics Corporation | Real-time predictive systems for intelligent energy monitoring and management of electrical power networks |
US8170856B2 (en) * | 2006-04-12 | 2012-05-01 | Power Analytics Corporation | Systems and methods for real-time advanced visualization for predicting the health, reliability and performance of an electrical power system |
US20170046458A1 (en) | 2006-02-14 | 2017-02-16 | Power Analytics Corporation | Systems and methods for real-time dc microgrid power analytics for mission-critical power systems |
US9092593B2 (en) | 2007-09-25 | 2015-07-28 | Power Analytics Corporation | Systems and methods for intuitive modeling of complex networks in a digital environment |
US8317764B2 (en) * | 2006-07-14 | 2012-11-27 | The Procter And Gamble Company | Designing the shape of absorbent articles worn close to the body |
US8176933B2 (en) * | 2006-07-28 | 2012-05-15 | Hydril Usa Manufacturing Llc | Annular BOP packing unit |
US20080023917A1 (en) * | 2006-07-28 | 2008-01-31 | Hydril Company Lp | Seal for blowout preventer with selective debonding |
US8737704B2 (en) | 2006-08-08 | 2014-05-27 | The Procter And Gamble Company | Methods for analyzing absorbent articles |
US20080046189A1 (en) * | 2006-08-16 | 2008-02-21 | The Procter & Gamble Company | Method for measuring the partially saturated fluid transport properties of an absorbent |
US7684939B2 (en) | 2006-08-16 | 2010-03-23 | The Procter & Gamble Company | Method for designing an absorbent article |
US20110077927A1 (en) * | 2007-08-17 | 2011-03-31 | Hamm Richard W | Generalized Constitutive Modeling Method and System |
US8577650B2 (en) * | 2008-02-26 | 2013-11-05 | Kimberly-Clark Worldwide, Inc. | User interface for modeling thermal comfort |
WO2009116048A2 (en) * | 2008-03-20 | 2009-09-24 | Technion Research & Development Foundation Ltd. | A method for cosserat point element (cpe) modeling of nonlinear elastic materials |
US8144155B2 (en) * | 2008-08-11 | 2012-03-27 | Microsoft Corp. | Example-based motion detail enrichment in real-time |
US20110082597A1 (en) | 2009-10-01 | 2011-04-07 | Edsa Micro Corporation | Microgrid model based automated real time simulation for market based electric power system optimization |
US8428920B2 (en) * | 2010-02-12 | 2013-04-23 | Toyota Motor Engineering & Manufacturing North America, Inc. | Methods and systems for dynamic wrinkle prediction |
US20110208486A1 (en) * | 2010-02-19 | 2011-08-25 | Khalid Qureshi | Computer based modeling of fibrous materials |
WO2011103228A2 (en) * | 2010-02-19 | 2011-08-25 | The Procter & Gamble Company | Computer based modeling of processed fibrous materials |
US10628666B2 (en) | 2010-06-08 | 2020-04-21 | Styku, LLC | Cloud server body scan data system |
US11244223B2 (en) | 2010-06-08 | 2022-02-08 | Iva Sareen | Online garment design and collaboration system and method |
US10628729B2 (en) | 2010-06-08 | 2020-04-21 | Styku, LLC | System and method for body scanning and avatar creation |
US11640672B2 (en) | 2010-06-08 | 2023-05-02 | Styku Llc | Method and system for wireless ultra-low footprint body scanning |
WO2012036620A1 (en) * | 2010-09-14 | 2012-03-22 | Fotonic I Norden Ab | Apparatus and method for predicting redistribution of body volume |
GB201102794D0 (en) * | 2011-02-17 | 2011-03-30 | Metail Ltd | Online retail system |
US9244022B2 (en) | 2011-06-16 | 2016-01-26 | The Procter & Gamble Company | Mannequins for use in imaging and systems including the same |
US8780108B2 (en) * | 2011-11-02 | 2014-07-15 | X-Rite Switzerland GmbH | Apparatus, systems and methods for simulating a material |
US9013714B2 (en) * | 2011-12-06 | 2015-04-21 | The Procter & Gamble Company | Method of analyzing video or image data of an absorbent article |
GB2501473A (en) * | 2012-04-23 | 2013-10-30 | Clothes Network Ltd | Image based clothing search and virtual fitting |
US8712737B1 (en) * | 2012-07-20 | 2014-04-29 | Google Inc. | Use of physical deformation during scanning of an object to generate views of the object |
US20140149097A1 (en) * | 2012-11-29 | 2014-05-29 | The Procter & Gamble Company | Method to determine lotion effectiveness of a virtual absorbent article |
US9405868B2 (en) * | 2012-12-20 | 2016-08-02 | Livermore Software Technology Corp. | Systems and methods of numerically simulating structural behaviors of airbag made of coated fabric material |
US9361411B2 (en) | 2013-03-15 | 2016-06-07 | Honeywell International, Inc. | System and method for selecting a respirator |
US9635895B1 (en) | 2013-10-29 | 2017-05-02 | Vf Imagewear, Inc. | System and method for mapping wearer mobility for clothing design |
CN106157358A (en) * | 2015-03-26 | 2016-11-23 | 成都理想境界科技有限公司 | Object fusion method based on video image and terminal |
US11615462B2 (en) | 2016-02-16 | 2023-03-28 | Ohzone, Inc. | System for virtually sharing customized clothing |
US10373386B2 (en) | 2016-02-16 | 2019-08-06 | Ohzone, Inc. | System and method for virtually trying-on clothing |
US10127717B2 (en) | 2016-02-16 | 2018-11-13 | Ohzone, Inc. | System for 3D Clothing Model Creation |
CN106295081A (en) * | 2016-09-18 | 2017-01-04 | 张选琪 | Flexible manufacturing system Simulation on Decision system |
WO2018156087A1 (en) * | 2017-02-27 | 2018-08-30 | National University Of Singapore | Finite-element analysis augmented reality system and method |
US11948057B2 (en) * | 2017-06-22 | 2024-04-02 | Iva Sareen | Online garment design and collaboration system and method |
CN107704714B (en) * | 2017-11-06 | 2020-11-27 | 中车株洲电力机车有限公司 | Method and system for processing finite element simulation stress value and test stress value |
US11293124B2 (en) * | 2018-05-30 | 2022-04-05 | Nike, Inc. | Textile component production systems and methods |
EP3807905A4 (en) * | 2018-06-13 | 2022-04-27 | Vital Mechanics Research Inc. | Methods and systems for computer-based prediction of fit and function of garments on soft bodies |
CN112017276B (en) * | 2020-08-26 | 2024-01-09 | 北京百度网讯科技有限公司 | Three-dimensional model construction method and device and electronic equipment |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6434257B1 (en) * | 1995-02-17 | 2002-08-13 | International Business Machines Corporation | Size recognition system with method for determining price of a commodity |
US20020173761A1 (en) * | 1993-08-17 | 2002-11-21 | Roe Donald Carroll | Disposable absorbent article having capacity to store low-viscosity fecal material |
US20030028436A1 (en) * | 2001-06-27 | 2003-02-06 | Razumov Sergey N. | Method and system for selling clothes |
US20030232183A1 (en) * | 2000-09-13 | 2003-12-18 | The Procter & Gamble Company | Process for making a foam component |
US20040082930A1 (en) * | 1997-11-14 | 2004-04-29 | Tim Bast | Zoned disposable absorbent article for urine and low-viscosity fecal material |
US20040224132A1 (en) * | 1994-02-28 | 2004-11-11 | Roe Donald Carroll | Absorbent article with multiple zone structural elastic-like film web extensible waist feature |
US20040236552A1 (en) * | 2003-05-22 | 2004-11-25 | Kimberly-Clark Worldwide, Inc. | Method of evaluating products using a virtual environment |
US20040253440A1 (en) * | 2003-06-13 | 2004-12-16 | Kainth Arvinder Pal Singh | Fiber having controlled fiber-bed friction angles and/or cohesion values, and composites made from same |
US20050084532A1 (en) * | 2002-03-13 | 2005-04-21 | Howdle Steven M. | Polymer composition loaded with cells |
US20050148979A1 (en) * | 2003-12-30 | 2005-07-07 | Palma Joseph D. | Packaging component with sensory cue for opening |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5495568A (en) * | 1990-07-09 | 1996-02-27 | Beavin; William C. | Computerized clothing designer |
US5625577A (en) * | 1990-12-25 | 1997-04-29 | Shukyohojin, Kongo Zen Sohonzan Shorinji | Computer-implemented motion analysis method using dynamics |
US6310627B1 (en) * | 1998-01-20 | 2001-10-30 | Toyo Boseki Kabushiki Kaisha | Method and system for generating a stereoscopic image of a garment |
EP1489532A3 (en) * | 1998-09-07 | 2006-03-08 | Bridgestone Corporation | Method of estimating tire performance |
US6310619B1 (en) * | 1998-11-10 | 2001-10-30 | Robert W. Rice | Virtual reality, tissue-specific body model having user-variable tissue-specific attributes and a system and method for implementing the same |
WO2000059581A1 (en) | 1999-04-01 | 2000-10-12 | Dominic Choy | Simulated human interaction systems |
US6404426B1 (en) * | 1999-06-11 | 2002-06-11 | Zenimax Media, Inc. | Method and system for a computer-rendered three-dimensional mannequin |
WO2001001235A1 (en) | 1999-06-25 | 2001-01-04 | Tara Chand Singhal | System and method for simulating how an article of wear will appear and feel on an individual |
US7663648B1 (en) | 1999-11-12 | 2010-02-16 | My Virtual Model Inc. | System and method for displaying selected garments on a computer-simulated mannequin |
AUPQ600100A0 (en) | 2000-03-03 | 2000-03-23 | Macropace Products Pty. Ltd. | Animation technology |
US7149665B2 (en) * | 2000-04-03 | 2006-12-12 | Browzwear International Ltd | System and method for simulation of virtual wear articles on virtual models |
JP3892652B2 (en) * | 2000-09-06 | 2007-03-14 | 住友ゴム工業株式会社 | Creating a tire analysis model |
US7840393B1 (en) | 2000-10-04 | 2010-11-23 | Trivascular, Inc. | Virtual prototyping and testing for medical device development |
EP1221674A3 (en) | 2001-01-05 | 2003-09-24 | Interuniversitair Microelektronica Centrum Vzw | System and method to obtain surface structures of multidimensional objects, and to represent those surface structures for animation, transmission and display |
US6889113B2 (en) * | 2001-08-23 | 2005-05-03 | Fei Company | Graphical automated machine control and metrology |
US7099734B2 (en) * | 2003-05-22 | 2006-08-29 | Kimberly-Clark Worldwide, Inc. | Method of evaluating the performance of a product using a virtual environment |
EP1625520A1 (en) * | 2003-05-22 | 2006-02-15 | Kimberly-Clark Worldwide, Inc. | Method of evaluating the performance of a product using a virtual environment |
US6810300B1 (en) * | 2003-05-22 | 2004-10-26 | Kimberly-Clark Worldwide, Inc. | Method of designing a product worn on a body in a virtual environment |
US20040236455A1 (en) * | 2003-05-22 | 2004-11-25 | Kimberly-Clark Worldwide, Inc. | Method of designing a product in a virtual environment |
US20040236457A1 (en) * | 2003-05-22 | 2004-11-25 | Kimberly-Clark Worldwide, Inc. | Method of evaluating articles used on a body in a virtual environment |
-
2005
- 2005-03-04 US US11/071,919 patent/US20050267614A1/en not_active Abandoned
- 2005-03-04 US US11/072,152 patent/US7634394B2/en active Active
- 2005-03-04 US US11/071,920 patent/US20050264562A1/en not_active Abandoned
- 2005-03-04 US US11/071,917 patent/US20050267613A1/en not_active Abandoned
- 2005-03-04 US US11/072,047 patent/US20050267615A1/en not_active Abandoned
- 2005-03-07 WO PCT/US2005/007393 patent/WO2005088582A2/en not_active Application Discontinuation
- 2005-03-07 EP EP05724738A patent/EP1721271A1/en not_active Withdrawn
- 2005-03-07 WO PCT/US2005/007391 patent/WO2005088487A1/en not_active Application Discontinuation
- 2005-03-07 WO PCT/US2005/007390 patent/WO2005088486A1/en not_active Application Discontinuation
- 2005-03-07 EP EP05724850A patent/EP1721276A2/en not_active Withdrawn
- 2005-03-07 EP EP05724847A patent/EP1721273A1/en not_active Withdrawn
- 2005-03-07 WO PCT/US2005/007392 patent/WO2005088488A1/en not_active Application Discontinuation
- 2005-03-07 WO PCT/US2005/007254 patent/WO2005088484A1/en not_active Application Discontinuation
- 2005-03-07 EP EP05724849A patent/EP1721275A1/en not_active Withdrawn
- 2005-03-07 EP EP05724848A patent/EP1721274A1/en not_active Withdrawn
Patent Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020173761A1 (en) * | 1993-08-17 | 2002-11-21 | Roe Donald Carroll | Disposable absorbent article having capacity to store low-viscosity fecal material |
US20040106911A1 (en) * | 1993-08-17 | 2004-06-03 | The Procter & Gamble Company | Disposable absorbent article having capacity to store low-viscosity fecal material |
US20040224132A1 (en) * | 1994-02-28 | 2004-11-11 | Roe Donald Carroll | Absorbent article with multiple zone structural elastic-like film web extensible waist feature |
US6434257B1 (en) * | 1995-02-17 | 2002-08-13 | International Business Machines Corporation | Size recognition system with method for determining price of a commodity |
US20040082930A1 (en) * | 1997-11-14 | 2004-04-29 | Tim Bast | Zoned disposable absorbent article for urine and low-viscosity fecal material |
US20030232183A1 (en) * | 2000-09-13 | 2003-12-18 | The Procter & Gamble Company | Process for making a foam component |
US20030028436A1 (en) * | 2001-06-27 | 2003-02-06 | Razumov Sergey N. | Method and system for selling clothes |
US20050084532A1 (en) * | 2002-03-13 | 2005-04-21 | Howdle Steven M. | Polymer composition loaded with cells |
US20040236552A1 (en) * | 2003-05-22 | 2004-11-25 | Kimberly-Clark Worldwide, Inc. | Method of evaluating products using a virtual environment |
US20040253440A1 (en) * | 2003-06-13 | 2004-12-16 | Kainth Arvinder Pal Singh | Fiber having controlled fiber-bed friction angles and/or cohesion values, and composites made from same |
US20050148979A1 (en) * | 2003-12-30 | 2005-07-07 | Palma Joseph D. | Packaging component with sensory cue for opening |
Cited By (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7937253B2 (en) * | 2004-03-05 | 2011-05-03 | The Procter & Gamble Company | Virtual prototyping system and method |
US20050264572A1 (en) * | 2004-03-05 | 2005-12-01 | Anast John M | Virtual prototyping system and method |
US20070027667A1 (en) * | 2005-07-14 | 2007-02-01 | Osborn Thomas W Iii | Computational model of the internal human pelvic environment |
US8392159B2 (en) | 2005-07-14 | 2013-03-05 | The Procter And Gamble Company | Computational model of the internal human pelvic environment |
US20110172978A1 (en) * | 2005-07-14 | 2011-07-14 | Osborn Iii Thomas Ward | Computational Model of the Internal Human Pelvic Environment |
US7937249B2 (en) | 2005-07-14 | 2011-05-03 | The Procter & Gamble Company | Computational model of the internal human pelvic environment |
US20080065041A1 (en) * | 2006-08-16 | 2008-03-13 | Mihai Alin Stan | Process for producing folded and compressed tampons |
US7735203B2 (en) | 2006-08-16 | 2010-06-15 | The Procter & Gamble Company | Process for producing folded and compressed tampons |
US20080140368A1 (en) * | 2006-12-08 | 2008-06-12 | The Procter & Gamble Company | Method and system for predictive modeling of articles, such as tampons |
US20080140367A1 (en) * | 2006-12-08 | 2008-06-12 | The Procter & Gamble Company | Method and system to model materials for use in articles, such as tampons |
US7715938B2 (en) | 2006-12-08 | 2010-05-11 | The Procter & Gamble Company | Method and system for predictive modeling of articles, such as tampons |
US7844399B2 (en) | 2006-12-08 | 2010-11-30 | The Procter & Gamble Company | Method and system to model materials for use in articles, such as tampons |
US20080183450A1 (en) * | 2007-01-30 | 2008-07-31 | Matthew Joseph Macura | Determining absorbent article effectiveness |
US7979256B2 (en) | 2007-01-30 | 2011-07-12 | The Procter & Gamble Company | Determining absorbent article effectiveness |
US20090048815A1 (en) * | 2007-08-17 | 2009-02-19 | The Procter & Gamble Company | Generalized constitutive modeling method and system |
US20090076783A1 (en) * | 2007-09-13 | 2009-03-19 | Tyco Healthcare Retail Services Ag | Digitally optimized fastener assembly and method of making the same |
US20110060570A1 (en) * | 2009-09-10 | 2011-03-10 | Chengming Wang | Computer Based Models for Absorbent Articles |
US20110060555A1 (en) * | 2009-09-10 | 2011-03-10 | Arthur Joseph Koehler | Computer Based Models for Absorbent Articles |
US8386219B2 (en) | 2009-09-10 | 2013-02-26 | The Procter & Gamble Company | Computer based models for absorbent articles |
US8392161B2 (en) | 2009-09-10 | 2013-03-05 | The Procter & Gamble Company | Computer based models for absorbent articles |
US8468000B1 (en) | 2009-09-10 | 2013-06-18 | The Procter & Gamble Company | Computer based models for absorbent articles |
US9092585B2 (en) | 2013-01-22 | 2015-07-28 | The Procter & Gamble Company | Computer based models for absorbent articles |
Also Published As
Publication number | Publication date |
---|---|
EP1721276A2 (en) | 2006-11-15 |
EP1721271A1 (en) | 2006-11-15 |
WO2005088488A1 (en) | 2005-09-22 |
US20050264563A1 (en) | 2005-12-01 |
US20050267614A1 (en) | 2005-12-01 |
WO2005088582A3 (en) | 2005-11-03 |
US7634394B2 (en) | 2009-12-15 |
WO2005088487A1 (en) | 2005-09-22 |
WO2005088582A2 (en) | 2005-09-22 |
EP1721273A1 (en) | 2006-11-15 |
US20050267615A1 (en) | 2005-12-01 |
US20050264562A1 (en) | 2005-12-01 |
EP1721275A1 (en) | 2006-11-15 |
EP1721274A1 (en) | 2006-11-15 |
WO2005088484A1 (en) | 2005-09-22 |
WO2005088486A1 (en) | 2005-09-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20050267613A1 (en) | Method to quantitativley analyze a model | |
US10172675B2 (en) | Implant design analysis suite | |
US7937253B2 (en) | Virtual prototyping system and method | |
Friess | Scratching the surface? The use of surface scanning in physical and paleoanthropology | |
Bhandari et al. | 3D polycrystalline microstructure reconstruction from FIB generated serial sections for FE analysis | |
Kumar et al. | Reverse engineering in product manufacturing: an overview | |
Viceconti et al. | CT data sets surface extraction for biomechanical modeling of long bones | |
JP2011214983A (en) | Simulation device and computer program therefor | |
Leong et al. | A feature‐based anthropometry for garment industry | |
Špelic | The current status on 3D scanning and CAD/CAM applications in textile research | |
Loverdos et al. | Geometrical digital twins of masonry structures for documentation and structural assessment using machine learning | |
Bertsatos et al. | A novel method for analyzing long bone diaphyseal cross-sectional geometry. A GNU Octave CSG Toolkit | |
Brown | Finite element modeling in musculoskeletal biomechanics | |
Šagi et al. | Reverse engineering | |
Baldo et al. | Proposition and experimental evaluation of a point-based compensation approach to reduce systematic errors in CT measurements | |
JPH10301258A (en) | Photomask defect analyzing system and defect analyzing method as well as recording medium recorded with this defect analyzing program | |
Klaas et al. | Construction of models and meshes of heterogeneous material microstructures from image data | |
Ghahremani et al. | Automated 3D image-based section loss detection for finite element model updating | |
Lukacs et al. | Non-contact whole-part inspection | |
Pandilov et al. | Reverse engineering–An effective tool for design and development of mechanical parts | |
Paulano-Godino et al. | Issues on the simulation of geometric fractures of bone models | |
CN114970262A (en) | Virtual human body model, modeling method and electronic equipment | |
Lee et al. | A study on parametric shape modifications of 3D skeletal models | |
Kim et al. | A study on the 3D body scan data editing process and errors analysis for clothing design | |
Blair et al. | Photometric stereo data for the validation of a structural health monitoring test rig | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: PROCTER & GAMBLE COMPANY, THE, OHIO Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ANAST, JOHN MATTHEW;MACURA, MATTHEW JOSEPH;LAVASH, BRUCE WILLIAM;AND OTHERS;REEL/FRAME:016638/0955;SIGNING DATES FROM 20050708 TO 20050812 |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |