EP2715463A1 - System and method for manufacturing using a virtual frame of reference - Google Patents

System and method for manufacturing using a virtual frame of reference

Info

Publication number
EP2715463A1
Authority
EP
European Patent Office
Prior art keywords
component
manufacturing system
setpoint
controller
processors
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP11729496.7A
Other languages
German (de)
French (fr)
Inventor
Rajesh Kumar Singh
Jeremy Georgies BERTIN
Thomas Keith Olschner
Jon Richard ROSSITER
Steven Arthur MARSHALL
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Procter and Gamble Co
Original Assignee
Procter and Gamble Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Procter and Gamble Co filed Critical Procter and Gamble Co
Publication of EP2715463A1


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 19/00 Programme-control systems
    • G05B 19/02 Programme-control systems electric
    • G05B 19/418 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
    • G05B 19/41875 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM] characterised by quality surveillance of production
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F 13/00 Bandages or dressings; Absorbent pads
    • A61F 13/15 Absorbent pads, e.g. sanitary towels, swabs or tampons for external or internal application to the body; Supporting or fastening means therefor; Tampon applicators
    • A61F 13/15577 Apparatus or processes for manufacturing
    • A61F 13/15772 Control
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F 13/00 Bandages or dressings; Absorbent pads
    • A61F 13/15 Absorbent pads, e.g. sanitary towels, swabs or tampons for external or internal application to the body; Supporting or fastening means therefor; Tampon applicators
    • A61F 13/15577 Apparatus or processes for manufacturing
    • A61F 13/15772 Control
    • A61F 2013/15796 Control of the alignment or position of article or components
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 2219/00 Program-control systems
    • G05B 2219/30 Nc systems
    • G05B 2219/31 From computer integrated manufacturing till monitoring
    • G05B 2219/31078 Several machines and several buffers, storages, conveyors, robots
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 2219/00 Program-control systems
    • G05B 2219/30 Nc systems
    • G05B 2219/32 Operator till task planning
    • G05B 2219/32182 If state of tool, product deviates from standard, adjust system, feedback
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 2219/00 Program-control systems
    • G05B 2219/30 Nc systems
    • G05B 2219/37 Measurements
    • G05B 2219/37208 Vision, visual inspection of workpiece
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 2219/00 Program-control systems
    • G05B 2219/30 Nc systems
    • G05B 2219/40 Robotics, robotics mapping to robotics vision
    • G05B 2219/40554 Object recognition to track object on conveyor

Definitions

  • the present invention relates generally to a system and method for manufacturing, and more particularly to a system and method using a virtual frame of reference to electronically evaluate the position of components used in the manufacturing process.
  • the position of components used in the manufacturing process may affect the overall quality of the goods and the acceptance of the goods by consumers. Consumers often desire consistency in the configuration of purchased goods for both functional and aesthetic reasons. To ensure consistency throughout the manufacturing process, components must be positioned uniformly.
  • many disposable absorbent products such as diapers and feminine hygiene products include a core of absorbent material positioned between a top sheet and a bottom sheet. Variations in the placement of the core within the finished good can result in leakage and reduce the functionality of the product. Even if the placement of the core and other components do not affect the functionality of the product, consumers expect each product to maintain the same look and feel as one another. For example, a winged pantiliner having an off-center or skewed core may create confusion for a consumer as to the best way to place such a pantiliner in her undergarment.
  • a popular design on a diaper (e.g., a popular children's character, a team logo, or another familiar design) must be placed consistently, in order to ensure that the design is fully shown (e.g., that a headless character is not shown, a team logo is not missing the name of the team's city, and other inconsistencies are avoided).
  • a controller for a manufacturing system includes one or more processors and one or more memory devices communicatively coupled to the one or more processors.
  • the one or more memory devices store machine instructions that, when executed by the one or more processors, cause the one or more processors to receive an electronic image of a component used in the manufacturing system.
  • the instructions also cause the one or more processors to analyze the electronic image using a virtual frame of reference to determine a location value associated with the component, to compare the location value and a setpoint, and to generate a phasing command for a machine in the manufacturing system based on the comparison.
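The controller logic summarized above (receive an image, analyze it against a virtual frame of reference, compare the resulting location value to a setpoint, generate a phasing command) can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation; all names (`analyze_image`, `phasing_command`, the binary-image representation) are hypothetical.

```python
# Hedged sketch of the claimed controller pipeline. The image is a
# list of pixel rows in the machine direction; any nonzero pixel is
# treated as part of the component.

def analyze_image(image, reference_edge=0):
    """Return a location value: the pixel offset of the component's
    leading edge from the image's leading edge."""
    for row_index, row in enumerate(image):
        if any(row):
            return row_index - reference_edge
    return None  # no component found in the image

def phasing_command(location_value, setpoint):
    """Compare the location value and a setpoint; the signed result
    stands in for a phasing command (sign convention is assumed)."""
    return setpoint - location_value

image = [[0, 0, 0],
         [0, 0, 0],
         [0, 1, 1],   # component's leading edge at row 2
         [1, 1, 1]]

loc = analyze_image(image)               # 2
cmd = phasing_command(loc, setpoint=3)   # 1 (adjust by one unit)
```

A real controller would emit this command to an upstream or downstream transformation device rather than return a number.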
  • a computerized method for determining and controlling the positions of components in a manufacturing system includes capturing an electronic image of a component used in the manufacturing system and analyzing, by one or more processors, the electronic image using a virtual frame of reference to determine a location value associated with the component.
  • the method further includes comparing the location value and a setpoint and also includes generating a phasing command for a machine in the manufacturing system based on the comparison.
  • a computerized method for controlling a component transformation in a manufacturing system includes receiving, at one or more processors, electronic images of components used by the manufacturing system to produce a manufactured good and classifying the images into two or more subpopulations.
  • the method also includes analyzing the images, by the one or more processors, using a virtual frame of reference to determine location values associated with the components.
  • the method further includes using the location values to determine a mathematical characteristic of each of the subpopulations and using the mathematical characteristic to generate a phasing command for a machine in the manufacturing system.
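The subpopulation method above can be sketched as grouping per-component measurements into subpopulations and computing a mathematical characteristic for each group. A minimal sketch, assuming the classification key is the component's orientation and the characteristic is the mean; both choices, and all names, are illustrative assumptions.

```python
# Classify measurements into subpopulations by orientation and
# compute the mean CD position of each subpopulation.
from statistics import mean

measurements = [
    {"orientation": "A", "cd_position": 10.0},
    {"orientation": "B", "cd_position": 14.0},
    {"orientation": "A", "cd_position": 12.0},
    {"orientation": "B", "cd_position": 16.0},
]

def subpopulation_means(samples):
    groups = {}
    for s in samples:
        groups.setdefault(s["orientation"], []).append(s["cd_position"])
    return {k: mean(v) for k, v in groups.items()}

means = subpopulation_means(measurements)
# each subpopulation's mean could then seed a phasing command
```

Other characteristics (median, standard deviation) would slot in the same way.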
  • FIG. 1 is a schematic illustration of a vision system
  • FIG. 2 is an illustration of an electronic image
  • FIG. 3 is a schematic illustration of a controller
  • FIG. 4 is a schematic illustration of a manufacturing system
  • FIG. 5A is an illustration of a core sheet having uniformly-oriented features
  • FIG. 5B is an illustration of a core sheet having features with alternating orientations
  • FIG. 6A is an illustration of uniformly-oriented cores being turned to have a uniform orientation in the machine direction
  • FIG. 6B is an illustration of cores having alternating orientations being turned to have a uniform orientation in the machine direction
  • FIG. 6C is an illustration of cores having alternating orientations being turned to have alternating orientations in the machine direction
  • FIG. 7A is a side-view illustration of an individual core located between a top sheet and a back sheet.
  • FIG. 7B is a top-view illustration of an individual core located between a crimped top sheet and back sheet.
  • Disposable absorbent article refers to feminine hygiene products (e.g., pantiliners or pads), disposable diapers, pull-ons, training pants, and adult incontinence articles.
  • Machine direction refers to the direction of movement of a component along a manufacturing line.
  • Cross direction refers to the direction substantially perpendicular or perpendicular to the MD and across the component as it moves along a manufacturing line.
  • Component refers to any material, part, or combination of materials and/or parts used in the construction of a final good by a manufacturing system.
  • Phase refers to the positional relationship between two or more parts of a machine that performs repetitive motion.
  • phase may refer to the relative position of a roller that unwinds a roll of material used in the manufacturing process.
  • phase may also refer to the relative position of a punch that stamps apertures into a component used in the manufacturing process.
  • the terms "phasing," "phased," "phase," and the like refer to the act of changing the phase of a device from one phase to another.
  • the act of phasing a roller may refer to advancing or retarding the rotation of the roller about its primary axis.
  • Component transformation refers to any action performed by the manufacturing process on one or more components used to produce the final consumer goods.
  • a component transformation may be any change to the configuration of a component performed by the manufacturing process. Some component transformations may only change the spatial properties of a component, while others may change the physical properties of the component. For example, rotating, flipping, and reorienting components are component transformations. Increasing or retarding the speed of motion of a component within the manufacturing process may also be component transformations. Other examples of transformations include cutting components, joining components, separating components from one another, changing the shapes of components, perforating components, punching or cutting apertures into components, and changing the look of components (e.g., by applying graphics, paint, dyes, or the like).
  • Controller refers to any electronic device or system that provides control commands to another electronic and/or mechanical system.
  • a controller includes one or more processors (e.g., a microprocessor, central processing unit, application-specific integrated circuit, or the like).
  • a controller may also include one or more memory devices (e.g., a RAM, ROM, non-volatile memory, flash memory, non-transitory memory, hard drive, disk drive, or any other electronic device capable of storing machine instructions) that communicate locally or remotely with the one or more processors.
  • the one or more memory devices store machine instructions that, when executed by the one or more processors, cause the one or more processors to provide the control commands.
  • Non-limiting examples of controllers include personal computers, servers, programmable logic controllers (PLCs), tablet computers, handheld computing devices, mobile telephones, distributed computing systems, cameras, and electronic displays.
  • a vision system includes one or more cameras that capture images of components as they move through the manufacturing process. Any known type of electronic camera may be used.
  • a camera may be a charge-coupled device, a CMOS-pixel based device, or the like.
  • the image data captured by a camera is provided to one or more controllers for further analysis.
  • a camera may have one or more controllers integrated as part of the device (e.g., within the same housing as the camera) and/or transmit the image data to one or more controllers external to the camera.
  • the one or more controllers analyze the image data using a virtual frame of reference to determine if the manufacturing process needs to be adjusted. If an adjustment is needed, the one or more controllers generate control commands that change how one or more upstream and/or downstream component transformations are performed.
  • Vision system 100 includes camera 104.
  • camera 104 may be positioned in a fixed location and capture an electronic image of a component 106, as it passes vision system 100 in the machine direction along manufacturing line 108.
  • Camera 104 may be oriented in any number of positions in relation to the direction of motion of component 106.
  • camera 104 may be positioned above or below component 106, along the side of component 106, or somewhere therebetween.
  • Camera 104 may take continuous images (e.g., video) or still-frames that are captured periodically or in response to receiving a trigger from an upstream and/or downstream device.
  • Camera 104 provides a captured electronic image to controller 102, which analyzes the image using a virtual frame of reference to determine if an adjustment to the manufacturing process is needed.
  • the virtual frame of reference allows analysis of the spatial or spatio-temporal location of component 106 as it relates to the manufacturing process and/or other components within the manufacturing process.
  • Controller 102 may analyze the position, orientation, and/or timing of component 106 as it passes vision system 100 to determine if an upstream or downstream component transformation requires adjustment. For example, if an upstream component transformation reorients component 106, controller 102 may analyze the position of component 106 relative to a setpoint position and send a control signal to the upstream process to ensure that the position of future components approaches the setpoint position.
  • controller 102 may analyze the timing of component 106 (e.g., to determine if component 106 reaches vision system 100 earlier or later than expected) and advance or retard an upstream or downstream transformation device, in order to ensure that the component 106 reaches the vision system 100 at the expected time.
  • a virtual frame of reference allows the location of a component to be determined relative to one or more coordinates (e.g., points, lines, areas, or the like) in the image. If the one or more cameras of a vision system remain at fixed locations, each image may be viewed as a set of coordinates, allowing the location of the component within an image to be determined. For example, the number of pixels from an edge of the image to an edge of the component within the image can be used to determine the position of the component. Similarly, the timing of when images are captured allows for comparisons to be made between multiple components that pass the vision system. In some cases, images may be captured continuously, periodically, and/or in response to an upstream or downstream trigger.
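The pixel-counting approach described above (treating a fixed camera's image as a set of coordinates and locating the component by counting pixels from an image edge) might look like the following sketch. The binary-image representation and function name are assumptions for illustration.

```python
# Locate a component within a fixed camera's image by counting pixel
# rows from the top (leading) edge of the image.

def component_bounds(image):
    """Return (leading_edge, trailing_edge) row indices of the
    component, or None if no component appears in the image."""
    rows_with_component = [i for i, row in enumerate(image) if any(row)]
    if not rows_with_component:
        return None
    return rows_with_component[0], rows_with_component[-1]

image = [[0, 0],
         [1, 1],
         [1, 1],
         [0, 0]]
bounds = component_bounds(image)  # leading edge row 1, trailing row 2
```

Because the camera is fixed, these row indices are directly comparable across successive images, which is what makes the virtual frame of reference usable for control.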
  • When the camera remains in a fixed position, a virtual frame of reference may be used to analyze a component's location within an image and to adjust the manufacturing process. If the camera is not in a fixed position (e.g., moving), a virtual frame of reference may still be used, but must be adjusted to compensate for the movement of the camera. It is to be understood that while the descriptions herein primarily refer to camera positions that allow the edges of images to correspond to the MD and CD axes, this is merely illustrative and a vision system's camera may be oriented in any position to create a virtual frame of reference.
  • a vision system may make any number of determinations relating to the position and/or timing of a component passing the vision system. For example, a controller may determine the MD length, corner locations, skew, CD placement, and/or MD placement of a component. Additionally, a controller may determine the location of any distinguishing features on the component (e.g., an aperture, visual indicia, a physical characteristic, or the like). For example, a controller may determine the CD and/or MD placement of apertures on a component.
  • the controller utilizes the determined positions of a component relative to the virtual frame of reference to adjust the manufacturing process.
  • the controller may compare data relating to the actual position of the component to one or more setpoints, to determine if the manufacturing process needs to be adjusted.
  • the setpoints may be generated by the controller by analyzing one or more populations of components that pass the vision system. For example, an average position for a population of prior components may be used to generate a setpoint to analyze future components passing the vision system. In other cases, some or all of the setpoints may be preloaded into the controller.
  • Image 200 is captured as component 201 passes the camera of a vision system and analyzed to determine the location of component 201 relative to a virtual frame of reference.
  • this analysis may include determining the leading and trailing edges of component 201 in the machine direction (e.g., edges 216 and 218, respectively), locating one or more corners 214 of component 201, and/or determining the location of one or more edges of component 201 in the cross-direction (e.g., edges 220 and 222).
  • the controller may locate the corners using the intersection of the edges of component 201.
  • the controller may analyze the curvature of the leading and trailing edges of component 201 in the machine direction to approximate the location of the corners.
  • a controller may also determine the location of distinguishing features 210 (e.g., apertures, visual indicia, or the like).
  • the location of component 201 may be defined relative to any fixed point within image 200.
  • actual MD position 204 may be located relative to leading or trailing edges 224 of image 200 in the machine direction (e.g., by counting the number of pixels between an edge 224 and actual MD position 204, or the like).
  • the actual CD position 208 may be measured using the location of the corners 214 of component 201.
  • Another exemplary measurement includes the CD position of features 210, which may be determined relative to a virtual location in the cross-direction (e.g., centerline 212 of the image 200, or the like).
  • the MD position of features 210 may be determined relative to the leading edge 216 of component 201, or any other location in the machine direction.
  • MD-related measurements within the virtual frame of reference also rely on a temporal component, since MD positions are dependent on the timing of the manufacturing system.
  • image 200 must be captured at the proper moment, to ensure that positional measurements are meaningful.
  • the capturing of image 200 may be triggered periodically or triggered by the performance of a downstream transformation.
  • CD-related measurements within the virtual frame of reference may be made with regard to the position of the camera.
  • CD-related measurements may be made with regard to the fixed location of a mirror plate or other physical location associated with the position of the camera.
  • the MD length of component 201 ("Length_MD") may be determined using the difference between MD edges 216 and 218:
  • Length_MD = MD trailing edge - MD leading edge
  • the length of the component may be determined using the difference between any two points on the perimeter of the component.
  • the skew of component 201 may be determined using the locations of corners 214. For example, the skew may be determined by using the following equation:
  • Skew = (CD midpoint_lead - CD midpoint_trail) × Length_MD / MD distance
  • where CD midpoint_lead is the CD midpoint 228 of the corners 214 on the MD leading side 216 of component 201, CD midpoint_trail is the CD midpoint 230 of the corners 214 on the MD trailing side 218 of component 201, Length_MD is the length of the component in the MD direction (as determined above), and MD distance is the MD distance between midpoints 228, 230.
  • the CD placement of component 201 may be determined using the locations of its corners 214. For example, the CD placement (Placement_CD) may be determined using the following equation:
  • Placement_CD = (CD midpoint_lead + CD midpoint_trail) / 2
  • where CD midpoint_lead is the CD midpoint 228 of corners 214 on the MD leading side 216 and CD midpoint_trail is the CD midpoint 230 of the corners 214 on the MD trailing side 218 of component 201.
  • the MD placement of the component may be determined using the leading edge of the component in MD direction.
  • the MD placement of the component may be determined relative to an edge of the electronic image as follows:
  • Placement_MD = MD leading edge - Image_lead, where Image_lead is the location of edge 224 of image 200 in the leading machine direction and MD leading edge is the actual MD position 204.
  • the CD and/or MD placement of distinguishable features 210 of component 201 may also be determined.
  • the CD placement of features 210 (Feature Placement_CD) may be determined relative to a virtual location in the cross-direction, such as centerline 212: Feature Placement_CD = Feature CD position - CD centerline.
  • similarly, the MD placement of the features (Feature Placement_MD) may be determined relative to the leading edge 216 of component 201: Feature Placement_MD = Feature MD position - MD leading edge.
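The measurements described above can be computed from the four corner coordinates of a component within the image. The sketch below uses plausible forms of those measurements; the (md, cd) coordinate convention, the corner naming, and the exact skew formula are assumptions reconstructed from the surrounding definitions, not the patent's stated implementation.

```python
# Compute MD length, skew, CD placement, and MD placement from the
# component's four corners, each given as an (md, cd) pixel pair.

def measurements(lead_left, lead_right, trail_left, trail_right,
                 image_lead=0.0):
    # CD midpoints of the leading and trailing corner pairs
    cd_mid_lead = (lead_left[1] + lead_right[1]) / 2
    cd_mid_trail = (trail_left[1] + trail_right[1]) / 2
    # MD positions of those midpoints
    md_lead = (lead_left[0] + lead_right[0]) / 2
    md_trail = (trail_left[0] + trail_right[0]) / 2

    length_md = md_trail - md_lead       # MD length of the component
    md_distance = md_trail - md_lead     # MD distance between midpoints
    # skew: CD offset of the midpoints scaled over the MD length
    # (exact formula is an assumption)
    skew = (cd_mid_lead - cd_mid_trail) * length_md / md_distance
    placement_cd = (cd_mid_lead + cd_mid_trail) / 2
    placement_md = md_lead - image_lead
    return {"length_md": length_md, "skew": skew,
            "placement_cd": placement_cd, "placement_md": placement_md}

# A straight, centered component: corners at MD rows 2 and 10,
# CD columns 0 and 4 -> length 8, zero skew, CD placement 2
m = measurements((2, 0), (2, 4), (10, 0), (10, 4))
```

A skewed component (leading corners shifted in the CD) would yield a nonzero skew value with the sign indicating the direction of the shift.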
  • a controller analyzing electronic image 200 may also maintain one or more setpoint locations indicative of where component 201 should be positioned.
  • an MD setpoint 202 may indicate where actual MD position 204 should be at the time electronic image 200 is taken.
  • a CD setpoint 206 may indicate the ideal location of actual CD position 208.
  • the controller may use the setpoint locations to provide phasing commands and/or alerts to other upstream and/or downstream devices associated with the manufacturing process.
  • an alert may be provided by the controller signifying that maintenance may be needed or a phasing command may be sent to a component transformation device.
  • if actual MD position 204 is located on the leading or trailing side of MD setpoint 202 along the MD axis, this may be indicative of component 201 reaching MD setpoint 202 earlier or later than expected.
  • the controller may use this determination to advance or retard an upstream or downstream component transformation device, accordingly.
  • controller 102 may send a phasing control command to an upstream or downstream component transformation device to adjust how components are positioned in the cross-direction.
  • a controller analyzing an image may perform any number of functions associated with the analysis of the image.
  • a controller may generate an alert and/or stop an operation of the manufacturing process, if the difference between a measurement and a setpoint is above a threshold.
  • the controller may issue one or more phasing commands that adjust the phasing of one or more transformations in the manufacturing process (e.g., issue one or more phasing commands to a component transformation device).
  • Phasing commands may be either direct commands (e.g., if the controller provides direct control over the component transformation device) or indirect commands (e.g., indications of the determination provided to the controller that provides direct control over the component transformation device).
  • the controller may communicate with other computing devices, interface devices, and/or alarms, to receive and convey data about the image analysis.
  • Referring now to FIG. 3, a schematic illustration of controller 102 is shown.
  • Controller 102 includes a processor 302, which may be one or more processors communicatively coupled to a memory 304, interface 306, and interface 308.
  • Memory 304 may be any form of memory capable of storing machine-executable instructions that implement one or more of the functions disclosed herein, when executed by processor 302.
  • memory 304 may be a RAM, ROM, flash memory, hard drive, EEPROM, CD-ROM, DVD, other forms of non-transitory memory devices, or the like. In some cases, memory 304 may be any combination of different memory devices.
  • Controller 102 receives electronic images from one or more cameras 104 via connection 312 and interface 306. In some cases, controller 102 may also provide control over camera 104 via connection 312. For example, controller 102 may control when camera 104 captures an image using a timing value stored in parameters 330 and/or using a trigger received from another device (e.g., transformation devices 316, other computing devices 334, or the like). Controller 102 may also provide phasing commands to transformation devices 316 via interface 306 and connection 314. Transformation devices 316 may be any device that performs a component transformation in the manufacturing system. By way of non-limiting examples, transformation devices 316 may be punchers, cutters, crimpers, turners, embossers, winders, unwinders, lotion applicators, or the like.
  • Connections 312 and 314 may be any combination of hardwired or wireless connections.
  • connection 312 may be a hardwired connection to provide electronic images to controller 102
  • connection 314 may be a wireless connection to provide phasing commands to transformation devices 316.
  • connections 312 and 314 may be part of a shared connection that conveys information between controller 102, camera 104, and transformation devices 316.
  • connections 312 and 314 may include one or more intermediary circuits (e.g., routers, modems, controllers, signal processors, and the like) and provide indirect connections to controller 102.
  • Interface 306 is configured to receive and transmit data between controller 102, camera 104, and/or other transformation devices 316.
  • interface 306 may include one or more wireless receivers if any of connections 312 or 314 are wireless.
  • Interface 306 may also include one or more wired ports if any of connections 312 or 314 are wired connections.
  • Interface 308 may provide one or more wired or wireless connections between controller 102, other computing devices 334 (e.g., one or more controllers, computers, servers, portable devices, PLCs, or the like), interface devices 336 (e.g., one or more electronic displays, human-machine interfaces, speakers, or the like), and/or alarms 338 (e.g., one or more sirens, flashing lights, or the like) via connections 320, 322, and 323, respectively.
  • interface 308 may provide a wired connection between controller 102 and a display (e.g., an interface device 336) and a wireless connection between controller 102 and a remote server (e.g., other computing device 334) via the Internet.
  • Memory 304 is shown to include image analyzer 324, which is configured to receive and analyze the electronic images captured by camera 104.
  • Image analyzer 324 detects the space occupied by a component within the image.
  • image analyzer 324 may detect a leading or trailing edge of a component in the machine direction, one or more edges in the cross-direction, and one or more corners of the component. In this way, image analyzer 324 may locate the perimeter of a component within the electronic image, and/or locate distinguishable features of the components (e.g., apertures, designs, protrusions, or the like).
  • Image analyzer 324 may also determine the location of a component in an image relative to another location and perform measurements based on the location. For example, the location of a point may be analyzed relative to another point in the image (e.g., a coordinate, an edge of the image, a fixed line, or the like).
  • Non-limiting examples of measurements performed by image analyzer 324 include: MD length, skew, CD placement, MD placement, feature CD placement, and/or feature MD placement.
  • Image analyzer 324 may also maintain an average of the measurements (e.g., a moving average, a weighted average, or the like) for a set number of components that are analyzed. For example, an average measurement may be maintained for the previous MD placement values for the last fifty components.
  • Image analyzer 324 may also make one or more of the measurements based on a transformation yet to be performed on the component.
  • a downstream device in the manufacturing process (e.g., other computing device 334, component transformation device 316, or the like) may provide a trigger to controller 102, which causes camera 104 to capture an image in response.
  • controller 102 may analyze the image with regard to the future transformations (e.g., in relation to a virtual cut line, placement of a feature onto the component, or the like).
  • Memory 304 is also shown to include a setpoint generator 326, which uses measurements from image analyzer 324 to generate one or more setpoints.
  • the setpoints may be coordinates, lines, or areas in the reference system that correspond to an expected measurement value associated with a component.
  • setpoint generator 326 may determine a setpoint using an average of measurements from image analyzer 324 for a set number of components. For example, setpoint generator 326 may calculate an MD setpoint using the average (e.g., a moving average, weighted average, or the like) of the MD leading edge of the last fifty components that have passed camera 104.
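The moving-average setpoint described above (e.g., averaging the MD leading edge over the last fifty components) can be sketched with a fixed-length buffer. The window size follows the example in the text; the class and method names are hypothetical.

```python
# Sketch of a setpoint generator that keeps a moving average of the
# most recent measurements in a fixed-length buffer.
from collections import deque

class SetpointGenerator:
    def __init__(self, window=50):
        # deque with maxlen discards the oldest sample automatically
        self._history = deque(maxlen=window)

    def add_measurement(self, md_leading_edge):
        self._history.append(md_leading_edge)

    def setpoint(self):
        if not self._history:
            return None  # no components analyzed yet
        return sum(self._history) / len(self._history)

gen = SetpointGenerator(window=3)
for m in (10.0, 12.0, 14.0, 16.0):
    gen.add_measurement(m)
# only the last three samples (12, 14, 16) remain in the window
```

A weighted or exponential moving average, also mentioned in the text, would replace the plain mean here.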
  • Setpoint generator 326 may also utilize data from transformation devices 316 to determine a setpoint (e.g., a virtual cut line, where a feature is to be placed onto the component, or the like). In some cases, setpoint generator 326 may maintain a group of two or more setpoints for a measurement, if components passing camera 104 have different orientations. For example, if components periodically pass camera 104 in three different orientations, setpoint generator 326 may use the orientation of the latest component to select the appropriate setpoint.
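The moving-average setpoint logic described above can be sketched in Python. This is a minimal illustration, not the patent's implementation; the class name, method names, and orientation keys are hypothetical:

```python
from collections import deque

class SetpointGenerator:
    """Sketch of a moving-average setpoint generator. Maintains one
    measurement history per component orientation, as described for
    setpoint generator 326 (hypothetical API)."""

    def __init__(self, window=50):
        self.window = window
        # One bounded history of MD leading-edge measurements per orientation.
        self.history = {}

    def add_measurement(self, orientation, md_leading_edge):
        buf = self.history.setdefault(orientation, deque(maxlen=self.window))
        buf.append(md_leading_edge)

    def setpoint(self, orientation):
        # Moving average of the last `window` components in this orientation.
        buf = self.history.get(orientation)
        if not buf:
            return None
        return sum(buf) / len(buf)
```

Keeping a separate history per orientation mirrors the case where components pass camera 104 in several orientations and the setpoint is selected to match the latest component.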
  • Parameters 330 may include any number of user or system defined parameters that override or control the functions of controller 102.
  • parameters 330 may include parameters that specify how many components are analyzed by setpoint generator 326 before a setpoint is generated, a setpoint value that overrides a generated setpoint, when an alert is to be issued, or any other setting.
  • Parameters 330 may be preloaded into memory 304 and/or specified via other computing devices 334 or interface devices 336. For example, a user utilizing a touchscreen display (e.g., an interface device 336) may change a setting in parameters 330.
  • Memory 304 may include a setpoint analyzer 328 that compares one or more measurements from image analyzer 324 to one or more setpoints.
  • the one or more setpoints may be generated by setpoint generator 326 or preset in parameters 330.
  • Setpoint analyzer 328 determines the difference between the measurement and the setpoint, in order to determine if further action needs to be taken by controller 102.
  • Setpoint analyzer 328 may also compare the difference between the measurement and the setpoint to a threshold, in order to determine if further action is needed.
  • setpoint analyzer 328 may also utilize other variables (e.g., an offset, a multiplier, an average of measurements, or the like) as part of the determination.
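The comparison performed by the setpoint analyzer can be sketched as a simple function. The name, the return convention, and the use of an additive offset and multiplier are illustrative assumptions, not the patent's exact computation:

```python
def needs_action(measurement, setpoint, threshold, offset=0.0, multiplier=1.0):
    """Compare a measurement to a setpoint, optionally scaled and offset,
    and check the resulting difference against a threshold. Returns the
    signed difference and a flag indicating whether further action is
    needed (hypothetical sketch of setpoint analyzer 328)."""
    difference = (measurement - setpoint) * multiplier + offset
    return difference, abs(difference) > threshold
```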
  • Alerts 333 may include, in non-limiting examples, messages indicating that a component should be rejected based on length, skew, CD placement, MD placement, aperture CD placement, aperture MD placement, or any other measurement from the electronic image. For example, if the skew of the component is outside of an acceptable range, controller 102 may generate an alert to a machine operator via interface devices 336 and/or alarms 338 that maintenance may be needed. In another example, alerts 333 may be provided to other computing devices 334 to keep track of the number of rejected components.
  • if setpoint analyzer 328 determines that further action by controller 102 is needed, it may also provide an indication of this determination to phase command generator 332.
  • Phase command generator 332 generates phasing commands for transformation devices 316, which perform component transformations in the manufacturing system.
  • phase commands may cause the advancing or retarding of the processing of components by one or more component transformation devices 316.
  • a phase command may provide direct control over a component transformation device 316.
  • a phase command may be a command to other computing devices 334 (e.g., a controller) that causes the phasing of a component transformation device 316 (e.g., a crimper or turner).
  • a phase command may control when a crimper applies crimping to a component.
  • the vision systems described herein can be used to adjust any number of component transformations in manufacturing systems that produce any number of different types of goods. Such vision systems are able to automatically adjust a manufacturing system without user interaction and improve the overall quality of the finalized products. Any number of different types of manufacturing systems may be built by varying the number and type of transformation devices, and by utilizing one or more of the vision systems described herein to automate the system.
  • Manufacturing system 400 may be scaled to accommodate any number of manufacturing processes to manufacture any number of different types of goods. As shown, the subscripts "a," "b," "c," and "m" are intended to denote a numerical range of values from the number one to the variable "m." Where a plurality of similarly-labeled devices are shown (e.g., material delivery devices 414a, 414b, 414c, 414m), this is intended to be non-limiting and to convey that manufacturing system 400 may include any number of such devices (e.g., manufacturing system 400 may have one, two, three, etc., material delivery devices).
  • Manufacturing system 400 may include one or more material delivery devices (e.g., material delivery devices 414a, 414b, 414c, 414m) that provide component materials to manufacturing system 400 for further processing.
  • a material delivery device may provide a raw material, semi-finished component, or a finished component, depending on its configuration.
  • Non-limiting examples of material delivery devices include rollers, pumps, conveyor belts, and the like.
  • each component material is fashioned by manufacturing system 400 into individual components by any number of transformation devices.
  • a component material provided by material delivery device 414a may be transformed by a first transformation device 408a, an intermediary transformation device 406a, and/or a final transformation device 416a.
  • any number of component transformation devices may be used in manufacturing system 400 to process an individual component.
  • a first component may be processed by a single transformation device 408a (e.g., intermediary transformation device 406a and final transformation device 416a may be omitted), by two transformation devices 408a, 406a (e.g., final transformation device 416a may be omitted), by three transformation devices 408a, 406a, 416a, or by more than three transformation devices (e.g., additional transformation devices may perform transformations between the transformations performed by the first transformation device 408a and intermediary transformation device 406a and/or between intermediary transformation device 406a and final transformation device 416a).
  • manufacturing system 400 may also include any number of transformation devices that combine components (e.g., a first combining device 412a, a second combining device 412b, an m th combining device 412m, and the like). Manufacturing system 400 may also include any number of transformation devices that perform transformations on combined components (e.g., a first transformation device 410a, a second transformation device 410b, and the like). As can be appreciated, manufacturing system 400 may be scaled to accommodate any number of different combinations of components by adding or removing transformation devices, as needed.
  • manufacturing system 400 may also include any number of vision systems that monitor the processing of individual components (e.g., vision systems 402a, 402b, 402c, 402m) and/or the processing of combined components (e.g., vision systems 404a, 404b).
  • a transformation device may trigger a vision system to capture images of components as the components pass the vision system.
  • an intermediary transformation device 406a performing a component transformation may trigger vision system 402a to capture an image.
  • the one or more vision systems in manufacturing system 400 analyze images using virtual frames of reference to determine if other transformation devices in system 400 need to be adjusted (e.g., phased).
  • vision system 402a may determine that an upstream transformation device (e.g., first transformation device 408a, or the like) and/or downstream transformation device (e.g., combining device 410a, or the like) needs to be adjusted.
  • a first material delivery device 414a may provide the core material to manufacturing system 400.
  • the core material may comprise any number of different absorbent materials configured to absorb liquids.
  • a second material delivery device 414b may provide a back sheet and a third material delivery device 414c may provide a top sheet to manufacturing system 400.
  • a disposable absorbent article may be constructed by positioning an absorbent core material between an absorbent top sheet and a non-absorbent back sheet (e.g., by combining device 412b), crimping the sheets together (e.g., by transformation device 410b), and cutting the crimped sheets (e.g., by transformation device 412m).
  • Each individual material used to manufacture the disposable absorbent article may undergo one or more component transformations before being combined.
  • features, such as apertures, may be cut into the core sheet (e.g., by first transformation device 408a), individual cores may be cut from the core sheet (e.g., by intermediary transformation device 406a), and individual cores may be reoriented for further processing (e.g., by final transformation device 416a).
  • the top sheet, back sheet, and/or any other components used to manufacture the disposable absorbent article may undergo any number of transformations prior to being combined.
  • features 501 are placed onto core sheet 500 by one or more transformations and core sheet 500 is cut into individual cores 502 by further transformations.
  • features 501 may be apertures that are cut into core sheet 500 in order to increase absorbency.
  • features 501 may be designs applied to core sheet 500.
  • any number of orientations of individual cores may be produced from core sheet 500.
  • features 501 are oriented in a uniform direction and core sheet 500 may be cut to produce individual cores 502 having a uniform orientation.
  • features 501 are applied to core sheet 500 in alternating positions, thereby creating two different populations of individual cores (e.g., cores 504 and 506).
  • any number of different populations of individual cores may be produced using core sheet 500.
  • individual cores may be cut from a core sheet and the cores reoriented for further processing. For example, if two populations of cores are created by applying features in alternating directions, the resulting individual cores may later be reoriented into a single orientation in the machine direction. As can be appreciated, individual cores may be reoriented into any number of different directions, thereby creating any number of different populations of individual cores.
  • In FIGS. 6A-6C, illustrations of transformations of core sheet 500 are shown as non-limiting examples.
  • In FIG. 6A, features 501 are placed onto core sheet 500 in a uniform direction. Individual cores 502 are later cut from core sheet 500 and turned for further processing by the manufacturing system, such that individual cores 502 have a uniform orientation.
  • In FIG. 6B, features 501 are placed onto core sheet 500 in alternating directions, creating two populations of cores 504 and 506. Again, individual cores 504, 506 are cut from core sheet 500 and then turned to have the same orientation in the machine direction.
  • In FIG. 6C, features 501 are placed onto core sheet 500 in alternating directions, cut into individual cores 504, 506, and then reoriented such that individual cores 504 and 506 have alternating orientations after being turned.
  • any number of different transformations involving applying features to a core sheet, cutting the core sheet into individual cores, and reorienting the cores may be used.
  • a sheet of core material may be unwound by material delivery device 414a and features may be added to it by a first transformation device 408a.
  • An intermediary transformation device 406a cuts the core sheet into individual cores, and a final transformation device 416a then reorients the cut cores for further processing in system 400.
  • Vision system 402a may be triggered by the processing of intermediary transformation device 406a, thereby causing an image of the core sheet to be captured.
  • a controller of the vision system 402a analyzes the image to ensure that the locations of the component in the image and/or the features on the component are properly placed within the image.
  • a possible setpoint used by the controller as part of this determination may correspond to a virtual cut line (e.g., where the intermediary transformation device 406a is expected to cut the core sheet into an individual core).
  • the controller of vision system 402a may provide a phasing command (e.g., an adjustment command) to first transformation device 408a.
  • the controller may additionally provide an alert to a user or other computing device.
  • In FIG. 7A, a section view of the combined components is shown.
  • a core 502 may be combined with a top sheet 700 and back sheet 702, such that core 502 is positioned between them.
  • Top sheet 700 and core 502 are configured to absorb liquid
  • back sheet 702 acts to retain excess liquid that may pass through core 502.
  • In FIG. 7B, a top view of the combined components is shown.
  • the orientation of core 502 within top sheet 700 and back sheet 702 depends on how core 502 is oriented when a core-turning transformation is performed by a component transformation device (e.g., final transformation device 416a).
  • combining devices 412a and 412b may operate to position an individual core between a top sheet and a back sheet.
  • a transformation device 410b then crimps the top sheet and back sheet in order to secure the individual core between them.
  • Vision system 404a and/or 404b may observe the combined components and adjust one or more transformation devices in manufacturing system 400 accordingly.
  • the capturing of an electronic image by vision system 404a and/or 404b may be triggered by the performance of a transformation by a transformation device.
  • transformation device 410b may provide a trigger to a vision system 404b, indicative of when a top sheet and back sheet are crimped.
  • Vision system 404b may analyze the characteristics of any number of subpopulations of components.
  • a subpopulation may include every nth component that passes vision system 404b, where n is greater than one.
  • a subpopulation may include every other component, every third component, every fourth component, etc.
  • Vision system 404b may also analyze each subpopulation individually and/or compare different subpopulations, in order to determine if a phasing command is needed. For example, if a turner orients components into three different orientations, vision system 404b may analyze three different subpopulations of components using virtual frames of reference, where each subpopulation corresponds to a different orientation from the turner.
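The subpopulation classification described above can be sketched as follows. The function name is hypothetical; it assigns components to n subpopulations by arrival order, matching the case where a turner produces n repeating orientations:

```python
def classify_subpopulations(components, n):
    """Split a stream of components into n subpopulations, where each
    subpopulation contains every n-th component that passes the vision
    system (illustrative sketch, not the patent's implementation)."""
    subpops = [[] for _ in range(n)]
    for i, comp in enumerate(components):
        # Component i belongs to the subpopulation for orientation i mod n.
        subpops[i % n].append(comp)
    return subpops
```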
  • vision system 404b may also determine one or more mathematical characteristics of a subpopulation.
  • a mathematical characteristic of a subpopulation may be any characteristic that generalizes position values for components within the subpopulation.
  • a mathematical characteristic may be an average CD position, an average MD position, an aggregate CD position, or an aggregate MD position.
  • vision system 404b may determine a mathematical characteristic across multiple subpopulations, in order to determine if a phasing command is needed.
  • vision system 404b may compare the average MD position of the previous fifty components to a threshold, in order to determine if a phasing command is needed.
  • the turning component transformation may have an effect on both the CD position and the MD position of the components. For example, since the position of an individual core component crimped between sheet components affects the overall quality of the disposable absorbent article, vision system 404b may adjust the phasing of a turner that turns the individual core component (e.g., final transformation device 416a). In cases in which the turner creates two or more differently oriented sets of core components, the vision system 404b may maintain running averages of the CD placements for each of the two subpopulations. The vision system 404b may then analyze the difference between the average CD placements of the two subpopulations, to determine if phasing is necessary.
  • a phasing command may be sent to the turner.
  • the average CD placement of an individual subpopulation may be compared to a threshold value, to determine if a phasing command is needed. Analysis of CD placements may be repeated to verify that the phasing command caused the CD placements to move below the threshold. If the difference between the CD placement and the threshold increased, a phasing command may be provided by vision system 404b to the turner corresponding to the opposite direction. In addition, if the difference is greater than a second threshold (e.g., larger than the first threshold), vision system 404 may disable the issuance of phasing commands altogether and issue an alert to an operator and/or another computing device.
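The two-subpopulation CD comparison above can be sketched in Python. The function name, return codes, and the mapping of the difference's sign to a phasing direction are illustrative assumptions:

```python
def cd_phase_decision(cd_a, cd_b, threshold, disable_threshold):
    """Compare the average CD placements of two subpopulations of cores.
    Exceeding the first threshold yields a turner phasing command;
    exceeding a larger second threshold disables phasing and raises an
    alert (hypothetical sketch of the vision system 404b logic)."""
    avg_a = sum(cd_a) / len(cd_a)
    avg_b = sum(cd_b) / len(cd_b)
    diff = avg_a - avg_b
    if abs(diff) > disable_threshold:
        return "disable_and_alert"
    if abs(diff) > threshold:
        # The sign of the difference selects the phasing direction.
        return "advance" if diff > 0 else "retard"
    return "no_action"
```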
  • Vision system 404b may also maintain a running average of MD placement values for a set number of components (e.g., across multiple subpopulations).
  • the controller may generate a target setpoint for this average versus the assumed position of the leading edge of the MD cut to be performed by a transformation device 412m (e.g., a cutter that cuts the crimped sheets). If the average MD placement moves outside of the range defined by the setpoint plus or minus a predefined range, the controller provides a phasing command to a transformation device 410b and/or transformation device 412m (e.g., a crimper and/or cutter).
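The running-average MD check can be sketched as follows. The function name and the mapping of out-of-range averages to "advance" or "retard" are illustrative assumptions; the patent only specifies that the average is compared to the setpoint plus or minus a predefined range:

```python
from collections import deque

def md_phasing_command(md_values, setpoint, allowed_range, window=50):
    """Maintain a running average of MD placement values over a set
    number of components and compare it to a target setpoint. Returns a
    phasing command when the average leaves the band
    [setpoint - allowed_range, setpoint + allowed_range], else None."""
    recent = deque(md_values, maxlen=window)
    avg = sum(recent) / len(recent)
    if avg > setpoint + allowed_range:
        return "retard"    # average MD placement drifted too far one way
    if avg < setpoint - allowed_range:
        return "advance"   # drifted too far the other way
    return None
```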
  • the controller may send a phasing command to an upstream and/or downstream transformation device to advance or retard a transformation device, in order to correct the timing of the manufacturing system.
  • a virtual frame of reference may be based on a cutting operation that has not yet been performed in a manufacturing system for disposable absorbent articles or may be based on the future placement of a racing stripe in a manufacturing system for an automobile.
  • the vision system described herein can be adapted to provide phasing control over any type of machinery in a manufacturing process and is not intended to be limited to those machines used to manufacture disposable absorbent articles.

Abstract

System and method to use a virtual frame of reference to evaluate and control a manufacturing system. Electronic images from a vision system may be analyzed using the virtual frame of reference to control the phasing of devices in the manufacturing system and to generate alerts.

Description

SYSTEM AND METHOD FOR MANUFACTURING USING A VIRTUAL FRAME
OF REFERENCE
FIELD OF THE INVENTION
The present invention relates generally to a system and method for manufacturing, and more particularly to a system and method using a virtual frame of reference to electronically evaluate the position of components used in the manufacturing process.
BACKGROUND OF THE INVENTION
During the manufacturing of consumer goods, the position of components used in the manufacturing process may affect the overall quality of the goods and the acceptance of the goods by consumers. Consumers often desire consistency in the configuration of purchased goods for both functional and aesthetic reasons. To ensure consistency throughout the manufacturing process, components must be positioned uniformly.
By way of example, many disposable absorbent products such as diapers and feminine hygiene products include a core of absorbent material positioned between a top sheet and a bottom sheet. Variations in the placement of the core within the finished good can result in leakage and reduce the functionality of the product. Even if the placement of the core and other components do not affect the functionality of the product, consumers expect each product to maintain the same look and feel as one another. For example, a winged pantiliner having an off-center or skewed core may create confusion for a consumer as to the best way to place such a pantiliner in her undergarment. Or, for example, the placement of a popular design on a diaper (e.g., a popular children's character, a team logo, and other familiar designs) must be consistently placed, in order to ensure that the design is fully shown (e.g., that a headless character is not shown, a team logo is not missing the name of the team's city, and other inconsistencies).
Based on the foregoing, developing new vision systems that automate a manufacturing process to produce consumer goods can be challenging and difficult.
SUMMARY OF THE PRESENT INVENTION
A controller for a manufacturing system is shown and described herein. The controller includes one or more processors and one or more memory devices communicatively coupled to the one or more processors. The one or more memory devices store machine instructions that, when executed by the one or more processors, cause the one or more processors to receive an electronic image of a component used in the manufacturing system. The instructions also cause the one or more processors to analyze the electronic image using a virtual frame of reference to determine a location value associated with the component, to compare the location value and a setpoint, and to generate a phasing command for a machine in the manufacturing system based on the comparison.
A computerized method for determining and controlling the positions of components in a manufacturing system is shown and described herein. The method includes capturing an electronic image of a component used in the manufacturing system and analyzing, by one or more processors, the electronic image using a virtual frame of reference to determine a location value associated with the component. The method further includes comparing the location value and a setpoint and also includes generating a phasing command for a machine in the manufacturing system based on the comparison.
A computerized method for controlling a component transformation in a manufacturing system is shown and described herein. The method includes receiving, at one or more processors, electronic images of components used by the manufacturing system to produce a manufactured good and classifying the images into two or more subpopulations. The method also includes analyzing the images, by the one or more processors, using a virtual frame of reference to determine location values associated with the components. The method further includes using the location values to determine a mathematical characteristic of each of the subpopulations and using the mathematical characteristic to generate a phasing command for a machine in the manufacturing system.
BRIEF DESCRIPTION OF THE DRAWINGS
The following detailed description of the present disclosure can be best understood when read in conjunction with the following drawings, where like structure is indicated with like reference numerals and in which:
FIG. 1 is a schematic illustration of a vision system;
FIG. 2 is an illustration of an electronic image;
FIG. 3 is a schematic illustration of a controller;
FIG. 4 is a schematic illustration of a manufacturing system;
FIG. 5A is an illustration of a core sheet having uniformly-oriented features;
FIG. 5B is an illustration of a core sheet having features with alternating orientations;
FIG. 6A is an illustration of uniformly-oriented cores being turned to have a uniform orientation in the machine direction;
FIG. 6B is an illustration of cores having alternating orientations being turned to have a uniform orientation in the machine direction;
FIG. 6C is an illustration of cores having alternating orientations being turned to have alternating orientations in the machine direction;
FIG. 7A is a side-view illustration of an individual core located between a top sheet and a back sheet; and
FIG. 7B is a top-view illustration of an individual core located between a crimped top sheet and back sheet.
Individual aspects of the drawings will be more fully apparent and understood in view of the detailed description that follows.
DETAILED DESCRIPTION
Present techniques to help automate the manufacture of goods often require using additional sensors, providing timing marks on the product itself, and/or making manual adjustments to the manufacturing process. It has been discovered that utilizing a virtual frame of reference allows for a reduction in the number of devices used by the manufacturing process and improves the overall quality of the manufactured goods. In addition, utilizing virtual frames of reference helps to automate the manufacturing process, thereby reducing the possibility of human error in adjusting the process.
Definitions
As used herein, the following terms are defined as follows:
"Disposable absorbent article" refers to feminine hygiene products (e.g., pantiliners or pads), disposable diapers, pull-ons, training pants, and adult incontinence articles.
"Machine direction" (MD) refers to the direction of movement of a component along a manufacturing line.
"Cross direction" (CD) refers to the direction substantially perpendicular or perpendicular to the MD and across the component as it moves along a manufacturing line.
"Component" refers to any material, part, or combination of materials and/or parts used in the construction of a final good by a manufacturing system.
"Phase" refers to the positional relationship between two or more parts of a machine that performs repetitive motion. For example, phase may refer to the relative position of a roller that unwinds a roll of material used in the manufacturing process. In another example, phase may also refer to the relative position of a punch that stamps apertures into a component used in the manufacturing process. When utilized as verbs, the terms "phasing," "phased," "phase," and the like refer to the act of changing the phase of a device from one phase to another. For example, the act of phasing a roller may refer to advancing or retarding the rotation of the roller about its primary axis.
"Component transformation" refers to any action performed by the manufacturing process on one or more components used to produce the final consumer goods. In general, a component transformation may be any change to the configuration of a component performed by the manufacturing process. Some component transformations may only change the spatial properties of a component, while others may change the physical properties of the component. For example, rotating, flipping, and reorienting components are component transformations. Increasing or retarding the speed of motion of a component within the manufacturing process may also be component transformations. Other examples of transformations include cutting components, joining components, separating components from one another, changing the shapes of components, perforating components, punching or cutting apertures into components, and changing the look of components (e.g., by applying graphics, paint, dyes, or the like).
"Controller" refers to any electronic device or system that provides control commands to another electronic and/or mechanical system. A controller includes one or more processors (e.g., a microprocessor, central processing unit, application- specific integrated circuit, or the like). A controller may also include one or more memory devices (e.g., a RAM, ROM, non-volatile memory, flash memory, non-transitory memory, hard drive, disk drive, or any other electronic device capable of storing machine instructions) that communicate locally or remotely with the one or more processors. The one or more memory devices store machine instructions that, when executed by the one or more processors, cause the one or more processors to provide the control commands. Non-limiting examples of controllers include personal computers, servers, programmable logic controllers (PLCs), tablet computers, handheld computing devices, mobile telephones, distributed computing systems, cameras, and electronic displays.
Vision Systems Using Virtual Frames of Reference
In general, a vision system includes one or more cameras that capture images of components as they move through the manufacturing process. Any known type of electronic camera may be used. For example, a camera may be a charge-coupled device, a CMOS-pixel based device, or the like. The image data captured by a camera is provided to one or more controllers for further analysis. A camera may have one or more controllers integrated as part of the device (e.g., within the same housing as the camera) and/or transmit the image data to one or more controllers external to the camera. The one or more controllers analyze the image data using a virtual frame of reference to determine if the manufacturing process needs to be adjusted. If an adjustment is needed, the one or more controllers generate control commands that change how one or more upstream and/or downstream component transformations are performed.
Referring now to FIG. 1, an illustrative schematic of vision system 100 is shown. Vision system 100 includes camera 104. As shown, camera 104 may be positioned in a fixed location and capture an electronic image of a component 106, as it passes vision system 100 in the machine direction along manufacturing line 108. Camera 104 may be oriented in any number of positions in relation to the direction of motion of component 106. For example, camera 104 may be positioned above or below component 106, along the side of component 106, or somewhere therebetween. Camera 104 may take continuous images (e.g., video) or still-frames that are captured periodically or in response to receiving a trigger from an upstream and/or downstream device.
Camera 104 provides a captured electronic image to controller 102, which analyzes the image using a virtual frame of reference to determine if an adjustment to the manufacturing process is needed. The virtual frame of reference allows analysis of the spatial or spatio-temporal location of component 106 as it relates to the manufacturing process and/or other components within the manufacturing process. Controller 102 may analyze the position, orientation, and/or timing of component 106 as it passes vision system 100 to determine if an upstream or downstream component transformation requires adjustment. For example, if an upstream component transformation reorients component 106, controller 102 may analyze the position of component 106 relative to a setpoint position and send a control signal to the upstream process to ensure that the position of future components approaches the setpoint position. In another example, controller 102 may analyze the timing of component 106 (e.g., to determine if component 106 reaches vision system 100 earlier or later than expected) and advance or retard an upstream or downstream transformation device, in order to ensure that the component 106 reaches the vision system 100 at the expected time.
In general, a virtual frame of reference allows the location of a component to be determined relative to one or more coordinates (e.g., points, lines, areas, or the like) in the image. If the one or more cameras of a vision system remain at fixed locations, each image may be viewed as a set of coordinates, allowing the location of the component within an image to be determined. For example, the number of pixels from an edge of the image to an edge of the component within the image can be used to determine the position of the component. Similarly, the timing of when images are captured allows for comparisons to be made between multiple components that pass the vision system. In some cases, images may be captured continuously, periodically, and/or in response to an upstream or downstream trigger. When the camera remains in a fixed position, a virtual frame of reference may be used to analyze a component's location within an image and to adjust the manufacturing process. If the camera is not in a fixed position (e.g., moving), a virtual frame of reference may still be used, but must be adjusted to compensate for the movement of the camera. It is to be understood that while the descriptions herein primarily refer to camera positions that allow the edges of images to correspond to the MD and CD axes, this is merely illustrative and a vision system's camera may be oriented in any position to create a virtual frame of reference.
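As a minimal illustration of the pixel-based position determination described above, the following sketch converts a pixel offset measured within a fixed-camera image into a physical distance. The function name and the pixels-per-millimeter calibration scale are hypothetical examples, not values from this disclosure.

```python
# Illustrative sketch only: with a camera at a fixed location, a pixel count
# measured from a fixed coordinate in the image (e.g., an image edge) maps to
# a physical distance through a calibration scale. 10 px/mm is an assumption.

def pixel_to_position(pixel_offset, pixels_per_mm=10.0):
    """Convert a pixel offset within the image into millimeters."""
    return pixel_offset / pixels_per_mm

# Example: a component edge 240 pixels from the image edge lies 24 mm away.
md_offset_mm = pixel_to_position(240)
```

In practice the calibration scale would be established for the specific camera position and optics; a moving camera would additionally require the compensation noted above.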
Using the virtual frame of reference, a vision system may make any number of determinations relating to the position and/or timing of a component passing the vision system. For example, a controller may determine the MD length, corner locations, skew, CD placement, and/or MD placement of a component. Additionally, a controller may determine the location of any distinguishing features on the component (e.g., an aperture, visual indicia, a physical characteristic, or the like). For example, a controller may determine the CD and/or MD placement of apertures on a component.
The controller utilizes the determined positions of a component relative to the virtual frame of reference to adjust the manufacturing process. The controller may compare data relating to the actual position of the component to one or more setpoints, to determine if the manufacturing process needs to be adjusted. In some cases, the setpoints may be generated by the controller by analyzing one or more populations of components that pass the vision system. For example, an average position for a population of prior components may be used to generate a setpoint to analyze future components passing the vision system. In other cases, some or all of the setpoints may be preloaded into the controller.
Referring now to FIG. 2, an illustration of an electronic image 200 is shown. Image 200 is captured as component 201 passes the camera of a vision system and analyzed to determine the location of component 201 relative to a virtual frame of reference. For example, this analysis may include determining the leading and trailing edges of component 201 in the machine direction (e.g., edges 216 and 218, respectively), locating one or more corners 214 of component 201, and/or determining the location of one or more edges of component 201 in the cross-direction (e.g., edges 220 and 222). In cases where component 201 is of a substantially quadrilateral shape, the controller may locate the corners using the intersection of the edges of component 201. In cases where component 201 is not of a quadrilateral shape, the controller may analyze the curvature of the leading and trailing edges of component 201 in the machine direction to approximate the location of the corners. In addition to determining the spatial characteristics of component 201 within image 200, a controller may also determine the location of distinguishing features 210 (e.g., apertures, visual indicia, or the like).
The location of component 201 may be defined relative to any fixed point within image 200. For example, actual MD position 204 may be located relative to leading or trailing edges 224 of image 200 in the machine direction (e.g., by counting the number of pixels between an edge 224 and actual MD position 204, or the like). In another example, the actual CD position 208 may be measured using the location of the corners 214 of component 201. Another exemplary measurement includes the CD position of features 210, which may be determined relative to a virtual location in the cross-direction (e.g., centerline 212 of the image 200, or the like). Similarly, the MD position of features 210 may be determined relative to the leading edge 216 of component 201, or any other location in the machine direction.
In general, MD-related measurements within the virtual frame of reference also rely on a temporal component, since MD positions are dependent on the timing of the manufacturing system. In other words, image 200 must be captured at the proper moment, to ensure that positional measurements are meaningful. In some cases, the capturing of image 200 may be triggered periodically or triggered by the performance of a downstream transformation. In general CD-related measurements within the virtual frame of reference may be made with regard to the position of the camera. For example, CD-related measurements may be made with regard to the fixed location of a mirror plate or other physical location associated with the position of the camera.
Non-limiting examples of measurements that can be made by analyzing image 200 are as follows:

The MD length of component 201 ("Length_MD") may be determined using the difference between MD edges 216 and 218:

Length_MD = MD trailing edge − MD leading edge

In other examples, the length of the component may be determined using the difference between any two points on the perimeter of the component.

The skew of component 201 may be determined using the locations of corners 214. For example, the skew may be determined by using the following equation:

Skew = (CD midpoint_lead − CD midpoint_trail) × (2 × Length_MD / MD distance between midpoints)

where CD midpoint_lead is the CD midpoint 228 of the corners 214 on the MD leading side 216 of component 201, CD midpoint_trail is the CD midpoint 230 of the corners 214 on the MD trailing side 218 of component 201, Length_MD is the length of the component in the MD direction (as determined above), and MD distance between midpoints is the MD distance between midpoints 228, 230.
The CD placement of component 201 may be determined using the locations of its corners 214. For example, the CD placement (Placement_CD) may be determined using the following equation:

Placement_CD = avg(CD midpoints) = (CD midpoint_trail + CD midpoint_lead) / 2

where CD midpoint_lead is the CD midpoint 228 of the corners 214 on the MD leading side 216 of component 201 and CD midpoint_trail is the CD midpoint 230 of the corners 214 on the MD trailing side 218 of component 201.
The MD placement of the component may be determined using the leading edge of the component in the MD direction. For example, the MD placement of the component may be determined relative to an edge of the electronic image as follows:

Placement_MD = MD leading edge − Image_lead

where Image_lead is the location of edge 224 of image 200 in the leading machine direction and MD leading edge is the actual MD position 204.
The CD and/or MD placement of distinguishable features 210 of component 201 may also be determined. For example, the CD placement of features 210 (Feature Placement_CD) may be determined as follows:

Feature Placement_CD = CD edge of the 1st feature − CD edge of the component

Similarly, the MD placement of the features (Feature Placement_MD) may be determined as follows:

Feature Placement_MD = MD edge of the 1st feature − MD edge of the component

In this way, distinguishable features can also be located on component 201. In some cases, a controller analyzing electronic image 200 may also maintain one or more setpoint locations indicative of where component 201 should be positioned. For example, an MD setpoint 202 may indicate where actual MD position 204 should be at the time electronic image 200 is taken. Similarly, a CD setpoint 206 may indicate the ideal location of actual CD position 208. The controller may use the setpoint locations to provide phasing commands and/or alerts to other upstream and/or downstream devices associated with the manufacturing process. For example, if the skew of the component is beyond an acceptable range of threshold values, an alert may be provided by the controller signifying that maintenance may be needed, or a phasing command may be sent to a component transformation device. In another example, if actual MD position 204 is located on the leading or trailing side of MD setpoint 202 along the MD axis, this may be indicative of component 201 reaching MD setpoint 202 earlier or later than expected. The controller may use this determination to advance or retard an upstream or downstream component transformation device, accordingly. Similarly, if controller 102 determines that any variation exists between CD setpoint 206 and actual CD position 208, controller 102 may send a phasing control command to an upstream or downstream component transformation device to adjust how components are positioned in the cross-direction.
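The measurements above can be sketched in code. The following is an illustrative Python sketch, assuming each corner 214 is supplied as an (MD, CD) coordinate pair within the virtual frame of reference; the function name, argument names, and coordinate convention are assumptions for illustration, not part of this disclosure.

```python
# Hedged sketch of the Length_MD, Skew, Placement_CD, and Placement_MD
# measurements for a roughly quadrilateral component whose four corners are
# known as (md, cd) coordinates in the virtual frame of reference.

def component_measurements(lead_left, lead_right, trail_left, trail_right,
                           image_lead_md=0.0):
    md_leading = (lead_left[0] + lead_right[0]) / 2      # MD leading edge
    md_trailing = (trail_left[0] + trail_right[0]) / 2   # MD trailing edge
    length_md = md_trailing - md_leading                 # Length_MD

    cd_mid_lead = (lead_left[1] + lead_right[1]) / 2     # CD midpoint 228
    cd_mid_trail = (trail_left[1] + trail_right[1]) / 2  # CD midpoint 230
    md_dist_mids = md_trailing - md_leading              # MD distance between midpoints

    skew = (cd_mid_lead - cd_mid_trail) * (2 * length_md / md_dist_mids)
    placement_cd = (cd_mid_trail + cd_mid_lead) / 2      # avg(CD midpoints)
    placement_md = md_leading - image_lead_md            # relative to Image_lead

    return {"length_md": length_md, "skew": skew,
            "placement_cd": placement_cd, "placement_md": placement_md}
```

For an unskewed rectangular component with leading corners at MD position 10, trailing corners at MD position 30, and CD edges at 5 and 15, this sketch yields an MD length of 20, a skew of 0, and a CD placement of 10.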
A controller analyzing an image may perform any number of functions associated with the analysis of the image. In some cases, a controller may generate an alert and/or stop an operation of the manufacturing process, if the difference between a measurement and a setpoint is above a threshold. In other cases, the controller may issue one or more phasing commands that adjust the phasing of one or more transformations in the manufacturing process (e.g., issue one or more phasing commands to a component transformation device). Phasing commands may be either direct commands (e.g., if the controller provides direct control over the component transformation device) or indirect commands (e.g., indications of the determination provided to the controller that provides direct control over the component transformation device). In further cases, the controller may communicate with other computing devices, interface devices, and/or alarms, to receive and convey data about the image analysis.
Referring now to FIG. 3, a schematic illustration of controller 102 is shown.
Controller 102 includes a processor 302, which may be one or more processors communicatively coupled to a memory 304, interface 306, and interface 308. Memory 304 may be any form of memory capable of storing machine-executable instructions that implement one or more of the functions disclosed herein, when executed by processor 302. For example, memory 304 may be a RAM, ROM, flash memory, hard drive, EEPROM, CD-ROM, DVD, other forms of non-transitory memory devices, or the like. In some cases, memory 304 may be any combination of different memory devices.
Controller 102 receives electronic images from one or more cameras 104 via connection 312 and interface 306. In some cases, controller 102 may also provide control over camera 104 via connection 312. For example, controller 102 may control when camera 104 captures an image using a timing value stored in parameters 330 and/or using a trigger received from another device (e.g., transformation devices 316, other computing devices 334, or the like). Controller 102 may also provide phasing commands to transformation devices 316 via interface 306 and connection 314. Transformation devices 316 may be any device that performs a component transformation in the manufacturing system. By way of non-limiting examples, transformation devices 316 may be punchers, cutters, crimpers, turners, embossers, winders, unwinders, lotion applicators, or the like.
Connections 312 and 314 may be any combination of hardwired or wireless connections. For example, connection 312 may be a hardwired connection to provide electronic images to controller 102, while connection 314 may be a wireless connection to provide phasing commands to transformation devices 316. In some cases, connections 312 and 314 may be part of a shared connection that conveys information between controller 102, camera 104, and transformation devices 316. In yet other cases, connections 312 and 314 may include one or more intermediary circuits (e.g., routers, modems, controllers, signal processors, and the like) and provide indirect connections to controller 102.
Interface 306 is configured to receive and transmit data between controller 102, camera 104, and/or other transformation devices 316. For example, interface 306 may include one or more wireless receivers if any of connections 312 or 314 are wireless. Interface 306 may also include one or more wired ports if any of connections 312 or 314 are wired connections. Interface 308 may provide one or more wired or wireless connections between controller 102, other computing devices 334 (e.g., one or more controllers, computers, servers, portable devices, PLCs, or the like), interface devices 336 (e.g., one or more electronic displays, human-machine interfaces, speakers, or the like), and/or alarms 338 (e.g., one or more sirens, flashing lights, or the like) via connections 320, 322, and 323, respectively. For example, interface 308 may provide a wired connection between controller 102 and a display (e.g., an interface device 336) and a wireless connection between controller 102 and a remote server (e.g., other computing device 334) via the Internet.
Memory 304 is shown to include image analyzer 324, which is configured to receive and analyze the electronic images captured by camera 104. Image analyzer 324 detects the space occupied by a component within the image. In non-limiting examples, image analyzer 324 may detect a leading or trailing edge of a component in the machine direction, one or more edges in the cross-direction, and one or more corners of the component. In this way, image analyzer 324 may locate the perimeter of a component within the electronic image, and/or locate distinguishable features of the components (e.g., apertures, designs, protrusions, or the like).
Image analyzer 324 may also determine the location of a component in an image relative to another location and perform measurements based on the location. For example, the location of a point may be analyzed relative to another point in the image (e.g., a coordinate, an edge of the image, a fixed line, or the like). Non-limiting examples of measurements performed by image analyzer 324 include: MD length, skew, CD placement, MD placement, feature CD placement, and/or feature MD placement. Image analyzer 324 may also maintain an average of the measurements (e.g., a moving average, a weighted average, or the like) for a set number of components that are analyzed. For example, an average measurement may be maintained for the previous MD placement values for the last fifty components. In some cases, an average is only calculated using measurements for components when the manufacturing system is at full operational speed. Image analyzer 324 may also make one or more of the measurements based on a transformation yet to be performed on the component. For example, a downstream device in the manufacturing process (e.g., other computing device 334, component transformation device 316, or the like) may provide a trigger to controller 102, which causes camera 104 to capture an image in response. Such timing allows controller 102 to analyze the image with regard to the future transformations (e.g., in relation to a virtual cut line, placement of a feature onto the component, or the like).
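The edge detection performed by an image analyzer can be illustrated with a minimal sketch. The sketch below assumes a binarized image in which truthy pixels belong to the component, and assumes columns run along the MD axis and rows along the CD axis; these conventions, and the function name, are illustrative assumptions only.

```python
# Minimal sketch of locating a component's MD leading and trailing edges in
# a binarized image (1 = component pixel, 0 = background). The column/row
# axis convention is an assumption for illustration.

def md_edges(mask):
    """Return (leading_col, trailing_col): the first and last MD columns
    containing any component pixel, or None if no component is present."""
    cols_with_component = [c for c in range(len(mask[0]))
                           if any(row[c] for row in mask)]
    if not cols_with_component:
        return None
    return cols_with_component[0], cols_with_component[-1]

mask = [
    [0, 1, 1, 0],
    [0, 1, 1, 1],
]
# The component occupies columns 1 through 3, so the MD leading edge is at
# column 1 and the MD trailing edge is at column 3.
```

A production image analyzer would typically use an industrial vision library rather than scanning pixels directly, but the underlying idea of locating edges within the fixed image coordinates is the same.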
Memory 304 is also shown to include a setpoint generator 326, which uses measurements from image analyzer 324 to generate one or more setpoints. In general, the setpoints may be coordinates, lines, or areas in the reference system that correspond to an expected measurement value associated with a component. In some cases, setpoint generator 326 may determine a setpoint using an average of measurements from image analyzer 324 for a set number of components. For example, setpoint generator 326 may calculate an MD setpoint using the average (e.g., a moving average, weighted average, or the like) of the MD leading edge of the last fifty components that have passed camera 104. Setpoint generator 326 may also utilize data from transformation devices 316 to determine a setpoint (e.g., a virtual cut line, where a feature is to be placed onto the component, or the like). In some cases, setpoint generator 326 may maintain a group of two or more setpoints for a measurement, if components passing camera 104 have different orientations. For example, if components periodically pass camera 104 in three different orientations, setpoint generator 326 may use the orientation of the latest component to select the appropriate setpoint.
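The moving-average setpoint generation described above can be sketched as follows. The class name and the window size of fifty (mirroring the fifty-component example) are illustrative assumptions.

```python
from collections import deque

# Hedged sketch of a setpoint generator that maintains a moving average of
# the last N measurements (e.g., MD leading-edge positions) and uses that
# average as the setpoint for future components. N=50 is an example value.

class SetpointGenerator:
    def __init__(self, window=50):
        self.history = deque(maxlen=window)  # oldest values drop off automatically

    def record(self, measurement):
        """Record a measurement from the image analyzer."""
        self.history.append(measurement)

    def setpoint(self):
        """Moving average of recorded measurements; None until data exists."""
        if not self.history:
            return None
        return sum(self.history) / len(self.history)
```

A bounded `deque` keeps only the most recent window of measurements, so the setpoint tracks gradual drift in the process rather than its entire history.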
Parameters 330 may include any number of user or system defined parameters that override or control the functions of controller 102. For example, parameters 330 may include parameters that specify how many components are analyzed by setpoint generator 326 before a setpoint is generated, a setpoint value that overrides a generated setpoint, when an alert is to be issued, or any other setting. Parameters 330 may be preloaded into memory 304 and/or specified via other computing devices 334 or interface devices 336. For example, a user utilizing a touchscreen display (e.g., an interface device 336) may change a setting in parameters 330.
Memory 304 may include a setpoint analyzer 328 that compares one or more measurements from image analyzer 324 to one or more setpoints. The one or more setpoints may be generated by setpoint generator 326 or preset in parameters 330. Setpoint analyzer 328 determines the difference between the measurement and the setpoint, in order to determine if further action needs to be taken by controller 102. Setpoint analyzer 328 may also compare the difference between the measurement and the setpoint to a threshold, in order to determine if further action is needed. In some cases, setpoint analyzer 328 may also utilize other variables (e.g., an offset, a multiplier, an average of measurements, or the like) as part of the determination.
If setpoint analyzer 328 determines that further action by controller 102 is needed, it may provide an indication of this determination to alerts 333. Alerts 333 may include, in non-limiting examples, messages indicating that a component should be rejected based on length, skew, CD placement, MD placement, aperture CD placement, aperture MD placement, or any other measurement from the electronic image. For example, if the skew of the component is outside of an acceptable range, controller 102 may generate an alert to a machine operator via interface devices 336 and/or alarms 338 that maintenance may be needed. In another example, alerts 333 may be provided to other computing devices 334 to keep track of the number of rejected components.
If setpoint analyzer 328 determines that further action by controller 102 is needed, it may also provide an indication of this determination to phase command generator 332. Phase command generator 332 generates phasing commands for transformation devices 316, which perform component transformations in the manufacturing system. In general, phase commands may cause the advancing or retarding of the processing of components by one or more component transformation devices 316. In some cases, a phase command may provide direct control over a component transformation device 316. In other cases, a phase command may be a command to other computing devices 334 (e.g., a controller) that causes the phasing of a component transformation device 316 (e.g., a crimper or turner). For example, a phase command may control when a crimper applies crimping to a component.
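The combined behavior of the setpoint analyzer and phase command generator can be sketched as follows. The function name, command labels, and sign convention (a positive error meaning the component arrived later than expected) are illustrative assumptions, not part of this disclosure.

```python
# Hedged sketch: compare a measurement to a setpoint and, if the difference
# exceeds a threshold, decide whether to advance or retard a transformation
# device. The "advance"/"retard" labels are hypothetical placeholders.

def phase_command(measurement, setpoint, threshold):
    """Return None when within tolerance, otherwise a phasing command."""
    error = measurement - setpoint
    if abs(error) <= threshold:
        return None
    # Assumed sign convention: a positive error (component later than
    # expected) retards the device; a negative error advances it.
    return "retard" if error > 0 else "advance"
```

A real controller would map these decisions onto direct or indirect commands to a specific transformation device, as described above.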
As can be appreciated, the vision systems described herein can be used to adjust any number of component transformations in manufacturing systems that produce any number of different types of goods. Such vision systems are able to automatically adjust a manufacturing system without user interaction and improve the overall quality of the finalized products. Any number of different types of manufacturing systems may be built by varying the number and type of transformation devices, and by utilizing one or more of the vision systems described herein to automate the system.
Referring now to FIG. 4, a schematic illustration of a manufacturing system 400 is shown. Manufacturing system 400 may be scaled to accommodate any number of manufacturing processes to manufacture any number of different types of goods. As shown, the subscripts "a," "b," "c," and "m" are intended to denote a numerical range of values from the number one to the variable "m." Where a plurality of similarly-labelled devices are shown (e.g., material delivery devices 414a, 414b, 414c, 414m), this is intended to be non-limiting and to convey that manufacturing system 400 may include any number of such devices (e.g., manufacturing system 400 may have one, two, three, etc., material delivery devices).
Manufacturing system 400 may include one or more material delivery devices (e.g., material delivery devices 414a, 414b, 414c, 414m) that provide component materials to manufacturing system 400 for further processing. A material delivery device may provide a raw material, semi-finished component, or a finished component, depending on its configuration. Non-limiting examples of material delivery devices include rollers, pumps, conveyor belts, and the like.
As shown, each component material is fashioned by manufacturing system 400 into individual components by any number of transformation devices. For example, a component material provided by material delivery device 414a may be transformed by a first transformation device 408a, an intermediary transformation device 406a, and/or a final transformation device 416a. As can be appreciated, any number of component transformation devices may be used in manufacturing system 400 to process an individual component. For example, a first component may be processed by a single transformation device 408a (e.g., intermediary transformation device 406a and final transformation device 416a may be omitted), by two transformation devices 408a, 406a (e.g., final transformation device 416a may be omitted), by three transformation devices 408a, 406a, 416a, or by more than three transformation devices (e.g., additional transformation devices may perform transformations between the transformations performed by the first transformation device 408a and intermediary transformation device 406a and/or between intermediary transformation device 406a and final transformation device 416a).
In addition to performing component transformations on individual components, manufacturing system 400 may also include any number of transformation devices that combine components (e.g., a first combining device 412a, a second combining device 412b, an mth combining device 412m, and the like). Manufacturing system 400 may also include any number of transformation devices that perform transformations on combined components (e.g., a first transformation device 410a, a second transformation device 410b, and the like). As can be appreciated, manufacturing system 400 may be scaled to accommodate any number of different combinations of components by adding or removing transformation devices, as needed.
As shown, manufacturing system 400 may also include any number of vision systems that monitor the processing of individual components (e.g., vision system 402a, 402b, 402c, 402m) and/or the processing of combined components (e.g., vision systems 404a, 404b). In some cases, a transformation device may trigger a vision system to capture images of components as the components pass the vision system. For example, an intermediary transformation device 406a performing a component transformation may trigger vision system 402a to capture an image. The one or more vision systems in manufacturing system 400 analyze images using virtual frames of reference to determine if other transformation devices in system 400 need to be adjusted (e.g., phased). For example, vision system 402a may determine that an upstream transformation device (e.g., first transformation device 408a, or the like) and/or downstream transformation device (e.g., first combining device 412a, or the like) needs to be adjusted.
In a more detailed example of how manufacturing system 400 may be used to manufacture goods, reference will now be made to the manufacture of disposable absorbent articles. It is to be understood that this is intended to be a non-limiting example and that manufacturing system 400 may manufacture any number of different types of goods. A first material delivery device 414a may provide the core material to manufacturing system 400. The core material may comprise any number of different absorbent materials configured to absorb liquids. Similarly, a second material delivery device 414b may provide a back sheet and a third material delivery device 414c may provide a top sheet to manufacturing system 400. In general, a disposable absorbent article may be constructed by positioning an absorbent core material between an absorbent top sheet and a non-absorbent back sheet (e.g., by combining device 412b), crimping the sheets together (e.g., by transformation device 410b), and cutting the crimped sheets (e.g., by transformation device 412m).
Each individual material used to manufacture the disposable absorbent article may undergo one or more component transformations before being combined. For example, features, such as apertures, may be cut into the core sheet (e.g., by first transformation device 408a), individual cores may be cut from the core sheet (e.g., by intermediary transformation device 406a), and individual cores may be reoriented for further processing (e.g., by final transformation device 416a). Similarly, the top sheet, back sheet, and/or any other components used to manufacture the disposable absorbent article may undergo any number of transformations prior to being combined.
Referring now to FIGS. 5A and 5B, illustrations of a core sheet 500 are shown as non-limiting examples. As shown, features 501 are placed onto core sheet 500 by one or more transformations and core sheet 500 is cut into individual cores 502 by further transformations. For example, features 501 may be apertures that are cut into core sheet 500 in order to increase absorbency. In another example, features 501 may be designs applied to core sheet 500.
Any number of orientations of individual cores may be produced from core sheet 500. For example, as shown in FIG. 5A, features 501 are oriented in a uniform direction and core sheet 500 may be cut to produce individual cores 502 having a uniform orientation. However, as shown in FIG. 5B, features 501 are applied to core sheet 500 in alternating positions, thereby creating two different populations of individual cores (e.g., cores 504 and 506). In further examples (not shown), any number of different populations of individual cores may be produced using core sheet 500.
In other transformations, individual cores may be cut from a core sheet and the cores reoriented for further processing. For example, if two populations of cores are created by applying features in alternating directions, the resulting individual cores may later be reoriented into a single orientation in the machine direction. As can be appreciated, individual cores may be reoriented into any number of different directions, thereby creating any number of different populations of individual cores.
Referring now to FIGS. 6A-6C, illustrations of transformations of core sheet 500 are shown as non-limiting examples. In FIG. 6A, features 501 are placed onto core sheet 500 in a uniform direction. Individual cores 502 are later cut from core sheet 500 and turned for further processing by the manufacturing system, such that individual cores 502 have a uniform orientation. In FIG. 6B, features 501 are placed onto core sheet 500 in alternating directions, creating two populations of cores 504 and 506. Again, individual cores 504, 506 are cut from core sheet 500 and then turned to have the same orientation in the machine direction. In FIG. 6C, features 501 are placed onto core sheet 500 in alternating directions, cut into individual cores 504, 506, and then reoriented such that individual cores 504 and 506 have alternating orientations after being turned. As can be appreciated, any number of different transformations involving applying features to a core sheet, cutting the core sheet into individual cores, and reorienting the cores may be used.
Referring again to FIG. 4, one example of how system 400 may be used to manufacture a disposable absorbent article can be seen, based on the foregoing. A sheet of core material may be unwound by material delivery device 414a and features may be added to it by a first transformation device 408a. An intermediary transformation device 406a cuts the core sheet into individual cores, and a final transformation device 416a then reorients the cut cores for further processing in system 400. Vision system 402a may be triggered by the processing of intermediary transformation device 406a, thereby causing an image of the core sheet to be captured. A controller of the vision system 402a then analyzes the image to ensure that the locations of the component in the image and/or the features on the component are properly placed within the image. For example, a possible setpoint used by the controller as part of this determination may correspond to a virtual cut line (e.g., where the intermediary transformation device 406a is expected to cut the core sheet into an individual core). If the locations of the component and/or features are not located at a setpoint or within a predefined setpoint range (e.g., the difference between a measurement and a setpoint is above a threshold value), the controller of vision system 402a may provide a phasing command (e.g., an adjustment command) to first transformation device 408a. In some cases, the controller may additionally provide an alert to a user or other computing device.
Further component transformations used to manufacture a disposable absorbent article may include placing individual cores between a top sheet and a back sheet, crimping the top sheet and back sheet, and cutting the sheets. For example, as shown in FIG. 7A (a section view), a core 502 may be combined with a top sheet 700 and back sheet 702, such that core 502 is positioned between them. Top sheet 700 and core 502 are configured to absorb liquid, while back sheet 702 acts to retain excess liquid that may pass through core 502. In FIG. 7B, a top view of the combined components is shown. The orientation of core 502 within top sheet 700 and back sheet 702 depends on how core 502 is oriented by a core turning transformation performed by a component transformation device (e.g., final transformation device 416a).
Referring back to FIG. 4, combining devices 412a and 412b may operate to position an individual core between a top sheet and a back sheet. A transformation device 410b then crimps the top sheet and back sheet in order to secure the individual core between them. Vision system 404a and/or 404b may observe the combined components and adjust one or more transformation devices in manufacturing system 400 accordingly. In some cases, the capturing of an electronic image by vision system 404a and/or 404b may be triggered by the performance of a transformation by a transformation device. For example, transformation device 410b may provide a trigger to a vision system 404b, indicative of when a top sheet and back sheet are crimped.
Vision system 404b may analyze the characteristics of any number of subpopulations of components. In some cases, a subpopulation may include every nth component that passes vision system 404b, where n is greater than one. For example, a subpopulation may include every other component, every third component, every fourth component, etc. Vision system 404b may also analyze each subpopulation individually and/or compare different subpopulations, in order to determine if a phasing command is needed. For example, if a turner orients components into three different orientations, vision system 404b may analyze three different subpopulations of components using virtual frames of reference, where each subpopulation corresponds to a different orientation from the turner. In some cases, vision system 404b may also determine one or more mathematical characteristics of a subpopulation. A mathematical characteristic of a subpopulation may be any characteristic that generalizes position values for components within the subpopulation. In non-limiting examples, a mathematical characteristic may be an average CD position, an average MD position, an aggregate CD position, or an aggregate MD position. In other cases, vision system 404b may determine a mathematical characteristic across multiple subpopulations, in order to determine if a phasing command is needed. In a non-limiting example, vision system 404b may compare the average MD position of the previous fifty components to a threshold, in order to determine if a phasing command is needed.
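The subpopulation analysis described above can be sketched as follows, assuming measurement i belongs to subpopulation i mod n (i.e., every nth component forms one subpopulation). The function name and grouping rule are illustrative assumptions.

```python
# Hedged sketch: split a stream of component measurements (e.g., CD
# positions) into n interleaved subpopulations and compute a mathematical
# characteristic (here, an average) for each subpopulation.

def subpopulation_averages(cd_positions, n):
    """Group measurement i into subpopulation i % n; average each group."""
    groups = [[] for _ in range(n)]
    for i, cd in enumerate(cd_positions):
        groups[i % n].append(cd)
    return [sum(g) / len(g) if g else None for g in groups]

# Example: components turned into two alternating orientations produce two
# interleaved subpopulations with distinct average CD positions.
avgs = subpopulation_averages([1.0, 5.0, 1.2, 5.2], 2)
```

The same grouping could be keyed on the detected orientation of each component instead of its ordinal position; the interleaved indexing shown here is the simplest case.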
If components are turned in two or more different orientations (e.g., subpopulations), the turning component transformation may have an effect on both the CD position and the MD position of the components. For example, since the position of an individual core component crimped between sheet components affects the overall quality of the disposable absorbent article, vision system 404b may adjust the phasing of a turner that turns the individual core component (e.g., final transformation device 416a). In cases in which the turner creates two or more differently oriented sets of core components, the vision system 404b may maintain running averages of the CD placements for each subpopulation. The vision system 404b may then analyze the difference between the average CD placements of two subpopulations, to determine if phasing is necessary. If the difference exceeds a predefined threshold, a phasing command may be sent to the turner. In another case, the average CD placement of an individual subpopulation may be compared to a threshold value, to determine if a phasing command is needed. Analysis of CD placements may be repeated to verify that the phasing command brought the CD placements back below the threshold. If the difference between the CD placement and the threshold instead increased, a phasing command in the opposite direction may be provided by vision system 404b to the turner. In addition, if the difference is greater than a second threshold (e.g., larger than the first threshold), vision system 404b may disable the issuance of phasing commands altogether and issue an alert to an operator and/or another computing device.
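The two-threshold decision in the paragraph above can be sketched as a single comparison function: the difference between the two subpopulations' average CD placements is checked first against the larger disable/alert threshold and then against the phasing threshold, with the command direction chosen from which subpopulation drifted. The function name, action labels, and threshold values are assumptions for illustration, not from the patent.

```python
PHASE_THRESHOLD = 1.0    # assumed difference that triggers a phasing command
DISABLE_THRESHOLD = 3.0  # assumed larger difference that disables phasing and alerts

def cd_phasing_decision(avg_cd_a, avg_cd_b):
    """Decide the turner action from two subpopulation CD placement averages."""
    diff = abs(avg_cd_a - avg_cd_b)
    if diff > DISABLE_THRESHOLD:
        # too far out of range: stop issuing phasing commands and raise an alert
        return "disable_and_alert"
    if diff > PHASE_THRESHOLD:
        # direction of the command depends on which subpopulation drifted
        return "phase_forward" if avg_cd_a > avg_cd_b else "phase_reverse"
    return "no_action"
```

Checking the disable threshold first matters: a gross misalignment should suppress automatic correction and summon an operator rather than let the controller chase an error it cannot fix.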
Vision system 404b may also maintain a running average of MD placement values for a set number of components (e.g., across multiple subpopulations). The controller may generate a target setpoint for this average versus the assumed position of the leading edge of the MD cut to be performed by transformation device 412m (e.g., a cutter that cuts the crimped sheets). If the average MD placement moves outside of the range defined by the setpoint plus or minus a predefined range, the controller may provide a phasing command to transformation device 410b and/or transformation device 412m (e.g., a crimper and/or cutter). For example, if the average MD placement is outside of the acceptable range, this may indicate that components are arriving at one or more downstream transformation devices earlier or later than expected. In this case, the controller may send a phasing command to an upstream and/or downstream transformation device to advance or retard that device, in order to correct the timing of the manufacturing system.
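The setpoint check above amounts to a deadband comparison: no command while the average MD placement stays within setpoint ± range, and an advance or retard command once it drifts outside. A minimal sketch, with an assumed setpoint, range, and direction convention (which drift direction maps to "advance" versus "retard" would depend on the actual machine):

```python
SETPOINT = 10.0   # assumed target average MD placement vs. the leading edge of the cut
RANGE = 0.5       # assumed acceptable deviation either side of the setpoint

def md_phasing_command(avg_md):
    """Return a phasing command for the crimper/cutter, or None if in range."""
    if avg_md > SETPOINT + RANGE:
        # components arriving later than expected: retard the device's timing
        return "retard"
    if avg_md < SETPOINT - RANGE:
        # components arriving earlier than expected: advance the device's timing
        return "advance"
    return None  # within the deadband: no phasing command needed
```

The deadband (plus or minus the predefined range) prevents the controller from issuing a stream of tiny corrections in response to ordinary measurement noise.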
Although the systems and methods disclosed are primarily described in relation to a manufacturing system for disposable absorbent articles, it is to be understood that this is illustrative only and not intended to be limiting. It is contemplated that the teaching of the present disclosure may be applied to any type of manufacturing system, without deviating from the scope of the present disclosure. For example, a virtual frame of reference may be based on a cutting operation that has not yet been performed in a manufacturing system for disposable absorbent articles or may be based on the future placement of a racing stripe in a manufacturing system for an automobile. Similarly, the vision system described herein can be adapted to provide phasing control over any type of machinery in a manufacturing process and is not intended to be limited to those machines used to manufacture disposable absorbent articles.
Many modifications and variations are possible in light of the above description. The above-described descriptions of the various systems and methods may be used alone or in any combination thereof without departing from the scope of the invention. Although the description and figures may show a specific ordering of steps, it is to be understood that different orderings of the steps are also contemplated in the present disclosure. Likewise, one or more steps may be performed concurrently or partially concurrently. It is therefore intended to cover in the appended claims all such changes and modifications that are within the scope of this disclosure.

Claims

CLAIMS

What is claimed is:
1. A controller for a manufacturing system comprising:
one or more processors; and
one or more memory devices communicatively coupled to the one or more processors, wherein the one or more memory devices store machine instructions that, when executed by the one or more processors, cause the one or more processors to:
receive an electronic image of a component used in the manufacturing system;
analyze the electronic image using a virtual frame of reference to determine a location value associated with the component;
compare the location value and a setpoint; and
generate a phasing command for a machine in the manufacturing system based on the comparison.
2. The controller according to claim 1, wherein the setpoint is determined relative to the location of a transformation that has not yet been performed by the manufacturing system.
3. The controller according to claim 1 or claim 2, wherein the machine instructions further cause the one or more processors to generate an alert based on the comparison.
4. The controller according to any one of the preceding claims, wherein the instructions further cause the one or more processors to disable generation of phasing commands if the difference between the location value and the setpoint is greater than a threshold value.
5. The controller according to any one of the preceding claims, wherein the location value is an average of locations for a set of components.
6. The controller according to any one of the preceding claims, wherein the phasing command phases a crimper in the manufacturing system.
7. The controller according to any one of the preceding claims, wherein the phasing command phases a turner in the manufacturing system.
8. A computerized method for controlling a component transformation in a manufacturing system, comprising:
capturing an electronic image of a component used in the manufacturing system;
analyzing, by one or more processors, the electronic image using a virtual frame of reference to determine a location value associated with the component;
comparing the location value and a setpoint; and
generating a phasing command for a machine in the manufacturing system based on the comparison.
9. The method according to claim 8, wherein the electronic image is captured in response to a component transformation being performed by the manufacturing system.
10. The method according to claim 8 or claim 9, further comprising generating an alert based on the comparison.
11. The method according to any one of claims 8, 9, or 10, further comprising determining, by the one or more processors, if the component should be rejected based on the comparison, wherein the location value comprises at least one of a length of the component in the machine direction, a placement of the component in the machine direction, a placement of the component in the cross direction, a skew of the component, a cross-directional placement of features on the component, or a machine directional placement of features on the component.
12. The method according to any one of claims 8, 9, 10, or 11, further comprising disabling generation of phasing commands if the difference between the location value and the setpoint is greater than a threshold value.
13. The method according to any one of claims 8, 9, 10, 11, or 12, wherein the location value is an average of locations for a set of components.
14. The method according to any one of claims 8, 9, 10, 11, 12, or 13, further comprising:
maintaining a group of two or more setpoints, wherein each setpoint corresponds to a different orientation of a component, and
using the orientation of the component to select a setpoint from the group for the comparison.
15. The method according to any one of claims 8, 9, 10, 11, 12, 13, or 14, wherein the phasing command phases a turner in the manufacturing system.
EP11729496.7A 2011-05-25 2011-05-25 System and method for manufacturing using a virtual frame of reference Withdrawn EP2715463A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2011/037938 WO2012161709A1 (en) 2011-05-25 2011-05-25 System and method for manufacturing using a virtual frame of reference

Publications (1)

Publication Number Publication Date
EP2715463A1 (en)

Family

ID=44627905

Family Applications (1)

Application Number Title Priority Date Filing Date
EP11729496.7A Withdrawn EP2715463A1 (en) 2011-05-25 2011-05-25 System and method for manufacturing using a virtual frame of reference

Country Status (4)

Country Link
EP (1) EP2715463A1 (en)
CN (1) CN103858064B (en)
BR (1) BR112013030032A2 (en)
WO (1) WO2012161709A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
ITBO20130163A1 (en) * 2013-04-12 2014-10-13 Marchesini Group Spa METHOD AND SYSTEM TO SYNCHRONIZE A PROCESSING STATION WITH THE ADVANCEMENT OF AN ARTICLE
US20150173956A1 (en) 2013-12-20 2015-06-25 The Procter & Gamble Company Method for fabricating absorbent articles
US20150173964A1 (en) 2013-12-20 2015-06-25 The Procter & Gamble Company Method for fabricating absorbent articles
US20170296396A1 (en) 2016-04-14 2017-10-19 The Procter & Gamble Company Absorbent article manufacturing process incorporating in situ process sensors

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
US5359525A (en) * 1990-05-01 1994-10-25 Kimberly-Clark Corporation Apparatus and method for registration control of assembled components
JP3351763B2 (en) * 1999-07-19 2002-12-03 花王株式会社 Manufacturing method of absorbent article
GB2366797A (en) * 2000-09-13 2002-03-20 Procter & Gamble Process for making foam component by pressurising/depressurising
US6913664B2 (en) * 2001-05-23 2005-07-05 Zuiko Corporation Method and apparatus for producing disposable worn article
GB2457009A (en) * 2007-08-29 2009-08-05 Sca Hygiene Prod Ab Method for controlling a web
EP2090951A1 (en) * 2008-02-05 2009-08-19 Fameccanica.Data S.p.A. A method and a system for the automatic regulation of production lines and corresponding computer program product therefor

Non-Patent Citations (2)

Title
None *
See also references of WO2012161709A1 *

Also Published As

Publication number Publication date
CN103858064A (en) 2014-06-11
WO2012161709A1 (en) 2012-11-29
BR112013030032A2 (en) 2016-09-13
CN103858064B (en) 2017-07-28

Similar Documents

Publication Publication Date Title
CA2697786C (en) Apparatus and method for intermittent application of stretchable web to target web
EP2715463A1 (en) System and method for manufacturing using a virtual frame of reference
US8914140B2 (en) System and method for manufacturing using a virtual frame of reference
CN102295187B (en) Traction and paper feed system suitable for high-speed horizontal paper cutting
US9522478B2 (en) Rotary die cutter
JP4112709B2 (en) Buckle plate folding station and control method thereof
EP3020517A3 (en) Method for calibration of a conveyor tracking system, and guidance apparatus
IL257250A (en) Methods for creasing and cutting sheet materials
US9195861B2 (en) Methods and systems involving manufacturing sheet products by testing perforations
JP2017121665A5 (en)
EP3027546B1 (en) Diameter measurement of a roll of material in a winding system
CN102725216B (en) Knife folding device
EP2090951A1 (en) A method and a system for the automatic regulation of production lines and corresponding computer program product therefor
US11518135B2 (en) Apparatus and method for the production of a cushion product from a single- or multi-layer continuous paper strip
US9861534B2 (en) System and method for manufacturing using a virtual frame of reference
JP2010195584A5 (en)
JP7000363B2 (en) Robot control device and management system
JP2014227298A (en) Manufacturing system of optical film roll and manufacturing method of optical film roll
EP2990168B1 (en) Excess piece protrusion amount measurement method and excess piece protrusion amount measurement device
JP2017226526A (en) Slitter device, and width-directional position shift abnormality inspection device and method
US11446723B2 (en) Methods and systems for automatically adjusting operational parameters of one or more leveling machines
US9505251B2 (en) Recording apparatus and method for estimating cause of abnormality of recording apparatus
KR101496685B1 (en) Method and Apparatus for Correcting Dimension of Bendign Work
JP2020044587A (en) Manufacturing method of tire sound absorbing material and slit machining device
JP2014228845A (en) System for manufacturing optical film roll and method for manufacturing optical film roll

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20131106

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20170922

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20180203