WO2019151877A1 - Method and system for augmented reality assembly guidance


Info

Publication number
WO2019151877A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual
augmented reality
component
space
assembly
Application number
PCT/NO2019/050032
Other languages
English (en)
Inventor
Angel Israel LOSADA SALVADOR
Conny Gillström
Original Assignee
Kitron Asa
Application filed by Kitron Asa
Publication of WO2019151877A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/18Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B19/409Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by using manual data input [MDI] or by using control panel, e.g. controlling functions with the panel; characterised by control panel details or by setting parameters
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/32Operator till task planning
    • G05B2219/32014Augmented reality assists operator in maintenance, repair, programming, assembly, use of head mounted display with 2-D 3-D display and voice feedback, voice and gesture command

Definitions

  • the present invention relates to augmented reality systems for providing instructions to an operator performing a production task such as the assembly of parts.
  • the invention relates to systems and methods for creating augmented reality instructions that can be used in an augmented reality workstation to provide such instructions.
  • Augmented reality is a presentation of a physical environment augmented by additional information generated by a computer system.
  • the physical environment can be viewed directly or indirectly as video on a display, and this view is supplemented by additional information that is viewable and appears as if it is part of or present in the physical environment.
  • the physical environment can be viewed directly through goggles or a window or windshield, where virtual information is added in a heads up type display (HUD), or the physical environment is viewed on a video display with the virtual elements added.
  • the video display may be in the form of augmented reality (AR) goggles that are substantially virtual reality (VR) goggles with a forward looking camera attached, or it may be a separate display.
  • a number of other types of displays are known in the art.
  • the added information may be constructive or destructive. Constructive information is additional information that is added to the view of the physical environment, while destructive information is information that somehow masks or obscures information from the physical environment.
  • the visual information may be supplemented by audio, haptic, somatosensory and olfactory information or stimulus.
  • Augmented reality (AR) and virtual reality (VR) have been popularized for recreational purposes such as video games, provision of information about locations, buildings or sights to tourists, information about items in museums, virtual presentation of proposed architecture in a real environment, etc.
  • Development of AR solutions for more practical purposes has accelerated, and one area where AR is expected to find extended use in the years to come is the field of manufacture and maintenance.
  • AR assisted production and assembly will facilitate training of personnel and increase efficiency, quality and quality control.
  • the present invention provides systems and methods that address and alleviate the needs outlined above.
  • the invention provides an augmented reality workstation for creating augmented reality guidance for an assembly operation in a manufacturing environment.
  • the augmented reality workstation includes a camera configured to capture a work area of the workstation, a computer system configured to receive a video feed from the camera and to combine the video feed with virtual graphical elements to generate an augmented reality video signal, a display configured to receive the augmented reality video signal from the computer system and display an augmented reality view of the work area, and a user interface.
  • the computer system includes an augmented reality management module configured to detect a characteristic feature in the video feed, the characteristic feature being associated with a known position in the work area and a reference position in the virtual 3D space, receive 3D descriptions of assembly components, associate the 3D descriptions of assembly components with respective positions in the virtual 3D space, change the position in the virtual 3D space based on parameters received from the user interface, and generate the virtual graphical elements based on the 3D representation and position them in the video signal based on their respective positions in the virtual 3D space.
  • the computer system also includes an augmented reality instruction file creator configured to save the 3D descriptions of assembly components, positions in the virtual 3D space associated with the 3D descriptions at defined points of an assembly operation, and a position in the virtual 3D space associated with the characteristic feature.
  • the characteristic feature may be a 2D marker such as a QR code or some other matrix code, or it may be a contrasting color or shape that is part of the first component.
  • the user interface is configured to receive user input identifying a 3D description of a specific assembly component, user input specifying a change in the position in the virtual 3D space with which the 3D description of a specific assembly component is associated, and user input indicative of a completion of an assembly step.
  • the augmented reality workstation further comprises a 3D model generator configured to generate a negative representation of at least a part of a 3D description and output the negative representation in a format that is usable by a 3D printer.
  • the augmented reality workstation comprises an image recognition module configured to detect and identify the characteristic feature and to calculate at least one of a distance and a viewing angle from the camera to the characteristic feature based on at least one of the size and the proportions of the characteristic feature in the video feed.
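To make the distance and viewing-angle calculation concrete, the following sketch shows one possible (assumed) implementation using OpenCV's ArUco module (opencv-contrib-python, 4.7 or later) and a pinhole camera model. The marker side length and calibration values are placeholders, not values from the patent.

```python
import cv2
import numpy as np

MARKER_SIDE = 0.05  # assumed marker side length in metres
# Assumed calibration values; a real workstation would load these from a calibration file.
camera_matrix = np.array([[800.0, 0.0, 640.0],
                          [0.0, 800.0, 360.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

# 3D corners of the marker in its own coordinate frame (z = 0 plane).
half = MARKER_SIDE / 2.0
marker_corners_3d = np.array([[-half,  half, 0.0],
                              [ half,  half, 0.0],
                              [ half, -half, 0.0],
                              [-half, -half, 0.0]], dtype=np.float32)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

def marker_pose(frame):
    """Return (distance, viewing angle in degrees) to the first detected marker, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is None:
        return None
    image_points = corners[0].reshape(4, 2).astype(np.float32)
    ok, rvec, tvec = cv2.solvePnP(marker_corners_3d, image_points, camera_matrix, dist_coeffs)
    if not ok:
        return None
    distance = float(np.linalg.norm(tvec))
    rot, _ = cv2.Rodrigues(rvec)
    # Angle between the camera's optical axis and the marker's surface normal.
    normal_cam = rot @ np.array([0.0, 0.0, 1.0])
    angle = float(np.degrees(np.arccos(abs(normal_cam[2]))))
    return distance, angle
```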
  • the augmented reality management module is further configured to receive 2D descriptions of assembly components, associate the 2D descriptions with respective positions in the virtual 3D space and change the position in virtual 3D space based on parameters received from the user interface. This may allow an operator to provide pictures or capture pictures of a component with the camera when 3D descriptions are not available. In some embodiments the augmented reality management module is further configured to generate an estimated 3D description based on the 2D description.
  • In another aspect, the invention provides an augmented reality workstation for providing augmented reality guidance in a manufacturing environment.
  • the augmented reality workstation includes a camera configured to capture a work area of the workstation, a computer system configured to receive a video feed from the camera and to combine the video feed with virtual graphical elements to generate an augmented reality video signal, a display configured to receive the augmented reality video signal from the computer system and display an augmented reality view of the work area, and a user interface.
  • the computer system includes an augmented reality management module configured to detect a characteristic feature in the video feed, the characteristic feature being associated with a known position in the work area and a reference position in the virtual 3D space, receive 3D descriptions of assembly components, associate the 3D descriptions of assembly components with a position in the virtual 3D space, and change the position in the virtual 3D space based on parameters received from the user interface, and generate video output constituting a combination of the video feed and graphical representations of the 3D descriptions of assembly components as the augmented reality video signal.
  • the computer system also includes an augmented reality instruction file interpreter configured to extract information from an augmented reality instruction file and provide the extracted information to the augmented reality management module, the extracted information including 3D descriptions of assembly components, positions in the virtual 3D space associated with the 3D descriptions at defined points of an assembly operation, and a position in the virtual 3D space associated with the characteristic feature.
  • the augmented reality workstation may further comprise a user interface configured to receive user input indicating the completion of a current assembly step and the transition to a next assembly step.
  • the augmented reality workstation also includes a log file creator module configured to store at least one of: a time stamp associated with a beginning of an assembly step, a time stamp associated with a completion of an assembly step, a video segment of the work area during performance of an assembly step, a user identity of an operator logged into the system during performance of an assembly step, a product code associated with an assembly component added to an assembly during the performance of an assembly step, a serial number associated with the completed assembly, and a version number associated with an augmented reality instruction file.
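As an illustration of the kind of record such a log file creator might write, here is a minimal sketch. The field names and the JSON-lines format are assumptions made for the example, not the format disclosed in the patent.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class AssemblyStepLog:
    # Illustrative field names; the patent only lists the kinds of data that may be stored.
    step_index: int
    started_at: float
    completed_at: float
    operator_id: str
    product_code: str
    serial_number: str
    instruction_file_version: str
    video_segment_path: str = ""   # path to a stored clip of the work area, if recorded

def append_log_entry(log_path: str, entry: AssemblyStepLog) -> None:
    """Append one step record to a JSON-lines log file."""
    with open(log_path, "a", encoding="utf-8") as fh:
        fh.write(json.dumps(asdict(entry)) + "\n")

# Example usage
entry = AssemblyStepLog(step_index=3, started_at=time.time() - 42.0, completed_at=time.time(),
                        operator_id="op-103", product_code="PRD-0042",
                        serial_number="SN-000123", instruction_file_version="rev-7",
                        video_segment_path="clips/step3.mp4")
append_log_entry("assembly_log.jsonl", entry)
```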
  • In another aspect, the invention provides a method for creating augmented reality guidance for an assembly process in a manufacturing environment.
  • The method, which is performed by a computer system and connected devices, includes providing a video feed of a work area including a characteristic feature and a first component, maintaining a representation of a virtual 3D space in memory, detecting the characteristic feature in the video feed, associating the characteristic feature with a reference position in the virtual 3D space, loading a 3D description of the first component and associating it with a first position in the virtual 3D space, and creating a video signal combining the video feed with a graphical representation of the 3D description of the first component.
  • The system may receive input from a user interface representing an adjustment of the position of the 3D description of the first component to a second position in the virtual 3D space corresponding to the position of the first component in the video feed, and update the video signal based on the adjustment. This enables an operator to determine the appropriate position of the 3D description in virtual 3D space simply by viewing the augmented reality view on a display and making appropriate adjustments.
  • the method includes, for a next component, receiving input from the user interface representing an identification of the next component, loading a 3D description of the next component, associating the 3D description of the next component with a starting position in the virtual 3D space, receiving user input defining a final position in the virtual space, wherein the final position in the virtual 3D space is a position where the additional component is attached to the first component, receiving input from a user interface indicating the completion of an assembly step, saving a description of the assembly step including at least the identification of the next component, the 3D description of the next component, the starting position, and the final position.
  • The steps for a next component may be performed for additional next components until receiving input from a user interface indicating the completion of an assembly operation. At that time the method will save a description of the entire assembly operation including at least saved descriptions of assembly steps, the 3D description of a first component, the first position, and the reference position associated with the characteristic feature in the virtual 3D space. It should be noted that the steps for a next component, or the steps in general, do not have to be performed in the sequence described. For example, all components may be identified and their 3D descriptions loaded before they are handled in individual assembly steps.
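One way to picture the saved description of an assembly operation is as a nested structure of steps. The sketch below is purely illustrative; the field names, the JSON serialization and the file layout are assumptions rather than the actual instruction file format used by the invention.

```python
import json
from dataclasses import dataclass, asdict
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class AssemblyStep:
    component_id: str          # identification of the next component
    model_file: str            # reference to the 3D description (e.g. an exported mesh)
    start_position: Vec3       # starting position in the virtual 3D space
    final_position: Vec3       # position where the component is attached

@dataclass
class ARInstructionFile:
    reference_position: Vec3   # position associated with the characteristic feature (marker)
    first_component_id: str
    first_component_model: str
    first_position: Vec3
    steps: List[AssemblyStep]

    def save(self, path: str) -> None:
        with open(path, "w", encoding="utf-8") as fh:
            json.dump(asdict(self), fh, indent=2)

instructions = ARInstructionFile(
    reference_position=(0.0, 0.0, 0.0),
    first_component_id="MAIN-105",
    first_component_model="models/main_105.stl",
    first_position=(0.10, 0.20, 0.0),
    steps=[AssemblyStep("PART-205a", "models/part_205a.stl",
                        start_position=(0.40, 0.05, 0.0),
                        final_position=(0.12, 0.22, 0.03))],
)
instructions.save("instructions_rev1.json")
```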
  • the step of loading a 3D representation of a next component includes loading a 2D representation of the component, and converting the 2D representation to a 3D representation.
  • the conversion of a 2D representation to a 3D representation may include generating a flat 3D representation with a position and an orientation in virtual 3D space, or generating a 3D estimate based on the 2D representation.
  • the characteristic feature is one of a 2D marker and a contrasting color or shape that is part of the first component.
  • In another aspect, a method in a computer system provides augmented reality guidance for an assembly process in a manufacturing environment.
  • the method comprises providing a video feed of a work area including a characteristic feature and a first component, maintaining a representation of a virtual 3D space in memory, detecting the characteristic feature in the video feed, associating the characteristic feature with a reference position in the virtual 3D space, loading an augmented reality instruction file, extracting from the augmented reality instruction file a predefined reference position in the virtual 3D space and associating the characteristic feature with the reference position, extracting a 3D description of a first component and a position relative to the reference position from the augmented reality instruction file and associating the 3D description of the first component with a first position in the virtual 3D space based on the position relative to the characteristic feature, and creating a combined video signal constituting a combination of the video feed and the 3D description of the first component wherein the position of the 3D description of the first component in the combined video signal is determined from its position relative to the reference position in the virtual 3D space.
  • the method proceeds by, for a next component, extracting a 3D description of the next component from the augmented reality instruction file and positioning the 3D description in a starting position in the virtual 3D space, adding the 3D description of the next component to the combined video signal, extracting from the augmented reality instruction file a predefined final position relative to the reference position in the virtual 3D space, generating an animation illustrating the movement of the next component from the starting position to the final position in the combined video signal by gradually updating the position of the next component in the virtual 3D space, receiving input from a user interface (601) representing the completion of an assembly step, and saving information relating to the completion of the assembly step in a log file.
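The gradual position update that produces the animation can be as simple as linear interpolation between the starting and final positions. The following sketch assumes positions are 3-vectors; a real implementation might use a curved path or an easing function instead.

```python
import numpy as np

def animate_positions(start, final, n_frames=60):
    """Yield intermediate positions moving a virtual component from start to final.

    A straight-line path is assumed here purely for illustration.
    """
    start = np.asarray(start, dtype=float)
    final = np.asarray(final, dtype=float)
    for i in range(n_frames + 1):
        t = i / n_frames
        yield (1.0 - t) * start + t * final

# Example: positions fed frame by frame to whatever renders the AR overlay.
for pos in animate_positions((0.40, 0.05, 0.0), (0.12, 0.22, 0.03), n_frames=5):
    print(pos)
```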
  • The steps for a next component may be repeated until input indicating the completion of an assembly operation is received from a user interface, at which time information relating to the completion of the assembly operation may be saved in the log file. It should, however, be noted that the steps for a next component, or the method steps in general, do not have to be performed in the sequence described. For example, the step of extracting can be performed for all components before the individual assembly steps are performed.
  • Input from the user interface representing the completion of an assembly step may include the recognition of a 1D or a 2D code.
  • Saving information relating to the completion of the assembly step may include storing at least one of: a time stamp associated with a beginning of an assembly step, a time stamp associated with a completion of an assembly step, a video segment of the work area during performance of an assembly step, a user identity of an operator logged into the system during performance of an assembly step, a product code associated with an assembly component added to an assembly during the performance of an assembly step, a serial number associated with the completed assembly, and a version number associated with an augmented reality instruction file.
  • FIG. 1 is an illustration of an embodiment of an AR workstation
  • FIG. 2 is an example of how AR information can be presented on a display during an assembly operation
  • FIG. 3 is a flowchart illustrating how an assembly operation can be guided by an AR workstation in accordance with the invention
  • FIG. 4 is a flowchart illustrating how an AR instruction file can be created in accordance with the invention.
  • FIG. 5 is a block diagram illustrating the modules included in an embodiment of a workstation according to the invention.
  • FIG. 6A is a block diagram illustrating modules and information flow in a workstation operating in an AR creation mode in accordance with the invention.
  • FIG. 6B is a block diagram illustrating modules and information flow in a workstation operating in an AR workstation mode in accordance with the invention.
  • FIG. 1 is an illustration of an embodiment of a workstation 100 consistent with the principles of the invention.
  • the workstation 100 includes a table 101 with a work area 102 upon which an operator 103 can perform assembly tasks in accordance with instructions provided by the AR system.
  • the work area 102 includes a number of markers 104, the purpose of which will be described in further detail below.
  • In the center of the work area a main structural component 105 has been positioned.
  • the main structural component 105 can be thought of as the part to which all other components are attached. However, it will be realized that many assemblies do not include one part that can readily be identified as a main component, and in such cases the component positioned in the center of the work area at the beginning of an assembly operation is simply a first component 105.
  • the illustrated embodiment further includes a display 106 and a camera 107.
  • the display 106 and the camera 107 are connected to a computer 108, which is configured to receive video from the camera 107 and combine this video with augmented reality information that provides the operator 103 with the required assembly instructions.
  • the computer 108 may be integrated into the display 106, which may be a tablet computer.
  • the display 106 may have a touch screen.
  • Other forms of user input are possible instead of or in addition to the touch screen, such as, but not limited to, a mouse, a 3D mouse, a graphics tablet, a microphone combined with voice recognition, a trackball, and a keyboard.
  • the exemplary embodiments described herein will concentrate on use of a display 106 with a touch screen for user input, but all the embodiments described herein can be modified to include different types of user input devices.
  • the workstation 100 further includes a number of parts distributed in a number of boxes, trays or bins 109.
  • the parts bins or component bins 109 may be provided with markers similar to the markers 104 provided on the work area 102, as will be described further below.
  • Also included are a number of tools 110.
  • the computer 108 may identify different parts or components, their location and also the required tool or tools for a particular task by showing relevant information on the display 106.
  • Supplementary instructions may be provided in other ways, for example by audio.
  • a fixture or jig 111 may also be provided.
  • This jig 111 may be positioned and shaped such that it ensures the correct positioning of the main structural component relative to the markers 104.
  • a jig may, for example, be manufactured through 3D printing of a negative representation of the volume of the main structural component from a CAD representation of that component.
  • the invention is not limited to the specific configuration illustrated in FIG. 1. As such, it is consistent with the principles of the invention to arrange the various elements differently, to include additional workstation components such as shelves, drawers, work lights, additional displays, projectors, etc.
  • the workstation may also include additional cameras in order to provide different viewing angles, close up details, to capture gestures performed by the operator 103, etc.
  • In embodiments where the display 106 is in the form of AR goggles, the goggles may be provided with one or more cameras that capture the view of the workstation as seen by the operator 103.
  • the display 106 shows the view of the work area 102 captured by the camera 107. In the center of the work area 102 the main structural component 105 is shown positioned in the jig 111 and surrounded by the markers 104.
  • the display 106 also shows the component bins 109.
  • the component bins 109 are provided with their own markers 204.
  • Various components 205a-205e are provided in the various bins.
  • the display shows a number of additional graphical elements which are virtual in the sense that they are not captured by the camera 107. Instead, they are generated by the computer 108 and combined with the video provided by the camera 107 in order to provide the operator 103 with assembly instructions.
  • These elements may include traditional user input controls such as forward to next task 210, back to previous task 211, audio on/off/repeat 212, pause 213, and reload or repeat step 214. These are simply touch elements that can be invoked by the operator 103 by touching the screen of the display 106.
  • In other embodiments, the user control elements 210-214 could be heads-up elements that appear to float in the air somewhere in the operator's field of view and that can be invoked, for example, through hand gestures.
  • In a side bar 215 on the left of the display 106, the tools to be used as part of the current operation or task can be shown.
  • an image of pliers 216 is shown, and the operator should be able to identify this as the appropriate tool 216 to select from the available selection of tools 110 next to the work area 102 for the current assembly step.
  • The tools may be shown as a picture or an illustration, or even as an animated 3D representation, or they may simply be identified by name. A tool may also be presented together with a specific tool ID number or a setting to use with the tool, such as a torque value setting for a torque screwdriver.
  • Three elements in FIG. 2 are shown inside the video image as if they were present in or adjacent to the work area 102.
  • An arrow 217 points at the component bin 205a that holds the components of the type that is to be attached to the main structural component 105 as part of the current task.
  • a 3D graphical representation 218 of that part is shown superimposed on top of the component bin 205a.
  • a corresponding 3D representation 218 is shown inside the work area 102 next to the main structural component 105 and in a position and orientation that corresponds to how the actual component should be positioned and oriented when it is attached to the main structural component 105. This 3D representation may be animated.
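Showing a virtual element "inside" the video image amounts to projecting its 3D position, expressed relative to the marker reference frame, into pixel coordinates. The sketch below assumes OpenCV, a calibrated camera and a marker pose (rvec, tvec) such as the one computed in the earlier marker example; drawing a cross and a label stands in for rendering the full 3D model 218.

```python
import cv2
import numpy as np

def draw_virtual_anchor(frame, point_marker_frame, rvec, tvec, camera_matrix, dist_coeffs,
                        label="component 218"):
    """Project a 3D point given in the marker coordinate frame and mark it on the frame."""
    pts = np.asarray(point_marker_frame, dtype=np.float32).reshape(1, 3)
    rvec = np.asarray(rvec, dtype=np.float64)
    tvec = np.asarray(tvec, dtype=np.float64)
    img_pts, _ = cv2.projectPoints(pts, rvec, tvec, camera_matrix, dist_coeffs)
    x, y = map(int, img_pts.reshape(2))
    # Draw a simple marker and label where a full implementation would render the 3D model.
    cv2.drawMarker(frame, (x, y), (0, 255, 0),
                   markerType=cv2.MARKER_CROSS, markerSize=20, thickness=2)
    cv2.putText(frame, label, (x + 10, y - 10),
                cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
    return frame
```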
  • the AR workstation 100 may even store the actual video sequence of the assembly operation as captured by the camera 107. Such an event log may be entered in a database and used for traceability of the manufacturing process, quality control, analysis of the efficiency of the assembly operation and identification of bottlenecks and possible improvements.
  • Some embodiments of the invention may enable remote access from the workstation 100 to a remote location. This may, for example, provide the operator 103 with the ability to consult a technical specialist or a supervisor or to look up additional information in an online repository of relevant information. Some embodiments may also enable connections to be established from a remote location to the AR workstation 100, for example by a supervisor who wishes to check in on and evaluate the work performed by the operator 103.
  • Communication between the AR workstation 100 and the remote location may typically involve audio and video.
  • The operator at the AR workstation may in some embodiments have the possibility to highlight or draw or by other means show or indicate an area of interest for the remote access.
  • If the display 106 is a touch screen, the operator 103 may be able to draw directly on the screen using a finger or a stylus. Additional user interface controls may be provided for this purpose. These user interface control elements are not shown in the drawing, but may include elements for establishing a connection, elements for selecting drawing tools, an element for taking a screen shot, on-screen tools for editing an image, etc.
  • a remote operator may be able to perform the same operations from the remote location.
  • drawings made on the screen or through the use of input devices in the manner described above may be interpreted by the AR application on the computer 108 as 3D virtual objects and given a position and orientation in 3D space.
  • some embodiments of the invention may allow an operator 103 with the required access rights to draw new components and include them in the assembly instructions. This will be described in further detail below as part of the description of the AR creator mode.
  • An AR workstation with the capabilities described above will be suitable for training of new operators, particularly with the remote access that gives easy access to the advice of a mentor.
  • The workstation provides a generic interface that can be used for any product assembly, provided that the parts are not too big to fit in the workspace and be captured by the camera 107, not too small to be seen on the display 106 and worked on without special tools, and do not have other special requirements that cannot be met by the illustrated embodiment of an AR workstation, such as protection from heat, extreme cold, or radiation, or the use of specialized tools that require a specialized environment. Modifications may, however, be made to the workstation to expand the range of possible operations, or to prepare the workstation for more specialized work.
  • the AR workstation computer 108 uses markers 104 to determine positioning.
  • the virtual objects described in the instructions have their positions defined relative to these markers, and they will be shown on the display 106 in positions determined relative to the markers 104.
  • The main structural component 105 may be shown as a transparent representation on the display 106.
  • The transparent representation will enable the operator 103 to verify that the position of the component 105 in the jig 111 is consistent with the representation in virtual 3D space. If the video representation of the physical component and the virtual representation of the same are not perfectly aligned, the operator 103 should adjust the position of the physical main structural component 105 such that it overlaps with the transparent version on the display 106.
  • If a jig 111 is made as a negative shape as described above, it may be possible to ensure that the main structural component 105 can only be positioned in one way and that the jig 111 has a fixed position relative to the markers 104, for example by having the markers 104 actually attached to or printed on the jig 111. In this way the main structural component 105 will automatically be in the correct position with respect to the markers 104 when it is placed in the jig 111.
  • In principle, one marker is sufficient for calibration of the virtual 3D space relative to the physical space, but the provision of additional markers will make the system more robust, e.g. in situations where one or more markers are obscured from view by the operator 103 either by accident or by necessity.
  • the main structural component 105 may have a marker attached to it and this marker may be used as a reference. In this case, i.e. if all calibration of the virtual 3D space is done relative to the marker on the component 105, there will be no incorrect position for the main structural component, and all other virtual representations (the augmented part of the reality) will be positioned relative to the position of the video representation of the actual physical main structural component on the display 106. In this case there may be no need for a jig, at least not for purposes of correct positioning of the main structural component 105.
  • the computer 108 may be programmed to have image or pattern recognition capabilities. These capabilities may in some embodiments also be used to recognize markers placed on the bins 109.
  • When starting the AR workstation, the operator 103 will scan a barcode or 2D matrix code representing the article number of the product to be assembled, or otherwise provide this information to the computer 108.
  • the AR assembly instruction for that actual product will be loaded from a database of assembly instructions.
  • the database may be installed in the computer 108 or it may be accessible from a remote location. The latest revision of these instructions will be loaded, but previous revisions may be stored as well for reference, for example in order to compare with information in an event log.
  • the event log may include the revision number of the AR instructions that were used for that particular actual assembly.
  • the scanning of the bar code or 2D matrix code holding the article and/or serial number of the product may in some embodiments be performed using the camera 107 and image recognition software. In other embodiments, a separate bar code reader or RFID/NFC reader may be provided, or the user may be required to enter the serial number or select the product from a menu.
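For the specific case of reading a 2D matrix code with the camera 107, OpenCV's built-in QR-code detector is one possible (assumed) route; the camera index and the use of the decoded payload as an article number below are illustrative assumptions.

```python
import cv2

def read_product_code(frame):
    """Return the decoded QR-code payload from a video frame, or None if nothing was found."""
    detector = cv2.QRCodeDetector()
    data, points, _ = detector.detectAndDecode(frame)
    return data if data else None

# Example usage with a single frame grabbed from the workstation camera.
cap = cv2.VideoCapture(0)          # camera index is an assumption
ok, frame = cap.read()
cap.release()
if ok:
    code = read_product_code(frame)
    if code:
        print("Load AR instructions for article:", code)
```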
  • A log entry may include the article number, serial number, operator ID of the operator that is currently logged in, a time stamp, and the revision number of the AR workstation application.
  • the logged data may be transferred to a manufacturing execution system (MES) or some other database for traceability.
  • video of the actual assembly operation as captured by the camera 107 may be included in the log.
  • FIG. 3 is a flowchart illustrating the performance of an assembly operation using the workstation described above.
  • In a first step 301 the prepared AR instructions are loaded. How the AR instruction file is created will be described in further detail below.
  • Next, the main structural component 105 is positioned in the jig 111, or directly on the work area 102 in embodiments where no jig 111 is provided.
  • Additional features may assist in the correct placement of the main structural component 105, for example a transparent representation of that component on the display 106 with which the video image must be aligned.
  • the system is ready to start presentation of assembly steps and will do so upon receipt of user input representing an instruction to start in step 303. This will typically be provided by the operator 103 touching the correct user input control 210.
  • the system will initiate a next step 304 wherein virtual elements that are relevant to the current assembly step are positioned in their correct positions in the representation of 3D space in the memory of the computer 108.
  • The virtual 3D representations of the various components are extracted from the AR instructions, and they may all be displayed from the beginning of the assembly operation, e.g. in association with their respective component bins 109, or they may become visible only when they are needed in the appropriate process step.
  • the relationship between real space on the top of the workstation table 101 and the virtual 3D space described in the computer is determined by the position of the markers 104. All virtual elements are positioned in 3D space relative to them. How this is done will be described in further detail below.
  • When the current assembly step has been performed, the operator may in step 307 provide user input by again touching user input control element 210.
  • It may then be determined in step 308 whether this completes the assembly operation, or if additional steps remain. If there are remaining steps, the assembly step that has just been performed may be logged in step 309 and the process may return to step 304, where virtual elements related to the next step are positioned in virtual space relative to the positions of the markers 104, as described above. If, on the other hand, it is determined in step 308 that the final assembly step has been finished and the assembly has been completed, the process proceeds to step 310 where the last step is logged along with a log of the complete assembly operation.
  • the log may include such information as article or product number, serial number, the operator ID of the operator that is currently logged in to the workstation, one or more time stamps (e.g. for start, finish of each step, and finish of the entire assembly operation), and revision number of the assembly instructions that were used during the assembly.
  • additional information may also be logged, such as video captured by the camera 107, and a log of any interaction with instructors, mentors, or remote systems.
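The control flow of FIG. 3 can be summarised as a small loop over assembly steps. The sketch below is schematic only: the four callables stand in for the workstation's display, user-input and logging modules, and their names are not taken from the patent.

```python
import time

def run_assembly(instructions, show_step, wait_for_confirmation, log_step, log_operation):
    """Drive an assembly operation: present each step, wait for the operator, then log it."""
    for index, step in enumerate(instructions.steps):
        started = time.time()
        show_step(step)                          # step 304: position virtual elements for this step
        wait_for_confirmation()                  # step 307: operator touches control 210 when done
        log_step(index, started, time.time())    # step 309 (or part of 310 for the last step)
    log_operation(instructions)                  # step 310: log the complete assembly operation
```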
  • FIG. 4 is a flowchart illustrating a process of creating such instructions. In this situation the operator 103 is not a person performing an assembly operation, but a person creating the AR assembly instructions.
  • It is assumed that the markers 104 are already correctly positioned on the surface of the work area 102, that the correct component bins 109 are placed on the table 101, and that parts lists, bill of work, and data files with graphical representations of the various components 105, 205 are already present on or accessible from the computer 108.
  • the computer should also include a library of tools that can be listed as required in an assembly operation and represented in the side bar 215. Making sure these things are in order may be part of the preparations the operator 103 has to make prior to the actual design of the assembly instructions. These preparations may be seen as an integral part of the entire process, but in order to simplify the drawing and focus on the steps that directly contribute to the construction of the AR assembly instructions, they are not included in the drawing.
  • the process starts in step 401 when the AR workstation 100 is activated and starts to receive a video feed from the camera 107.
  • the received video images will include the work area 102 and the component bins 109, and therefore also the markers 104, 204.
  • the markers 104 on the work area 102 are interpreted by a 2D image recognition module in the computer 108 and their respective positions in the video image are used as reference points when the 3D representation of the work area in the computer is created.
  • the markers themselves have an orientation, so it is sufficient for the system to identify only one marker to determine position and orientation.
  • The 2D image recognition module may also be configured to determine the size of the markers 104 and any proportional distortion in the markers 104 resulting from the viewing angle, and thus determine the camera's height above the work area 102 and the camera's viewing angle towards the work area 102.
  • the markers 104 can thus be used to define the plane of the work area 102 surface in the virtual 3D space. Everything in the virtual 3D space is positioned relative to these markers and as a height above this surface.
  • the virtual 3D space may already exist as a representation of a space in the computer 108.
  • creation of the 3D representation of the work area is intended to refer to the specific association of a 3D space representation in a computer with a real physical space and the calibration of that representation through positioning of certain reference points in real space with coordinates in virtual space.
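Positioning everything "relative to the markers and as a height above the surface" amounts to composing a rigid transform: a point given in the work-area (marker) frame is mapped into the camera frame using the marker pose. A minimal numpy sketch, using the same rvec/tvec pose convention as the earlier marker example, follows.

```python
import cv2
import numpy as np

def marker_to_camera(point_work_area, rvec, tvec):
    """Map a point from the work-area (marker) frame into the camera frame.

    point_work_area: (x, y, z) with z being the height above the work surface.
    rvec, tvec: marker pose as returned by cv2.solvePnP for a work-area marker.
    """
    rot, _ = cv2.Rodrigues(np.asarray(rvec, dtype=float).reshape(3, 1))
    p = np.asarray(point_work_area, dtype=float).reshape(3)
    return rot @ p + np.asarray(tvec, dtype=float).reshape(3)

# A virtual component placed 10 cm to the right of the marker and 3 cm above the surface.
print(marker_to_camera((0.10, 0.0, 0.03), rvec=(0.1, 0.0, 0.0), tvec=(0.0, 0.0, 0.6)))
```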
  • the markers 204 are also interpreted by the 2D image recognition module.
  • the markers 204 are permanently assigned to specific components such that when the 2D recognition module has recognized a marker the information extracted from that marker can be used to perform a look up in a parts list and retrieve information about the component from that list.
  • In other embodiments, the markers 204 are only representative of the position of a bin, and the operator 103 will have to associate each marker with a specific part number.
  • In a next step 402 the operator 103 positions the main structural component 105 in the jig 111. If the jig 111 is not already a part of the work area 102, the operator 103 will first have to position the jig in the correct position relative to the markers 104.
  • This step means that the computer will start receiving video images with the main structural component 105 in the correct position.
  • In some embodiments, the computer is configured to detect the presence of the main structural component based on image recognition.
  • In some embodiments, the main structural component includes its own marker (not shown in the example in the drawings). This marker can be used to identify the component, and in some embodiments also to detect the component's position relative to the markers 104, and thereby also to position it as a representation in virtual 3D space.
  • a marker that is attached to the main structural component 105 may be a marker similar to the other markers 104, 204. However, it is consistent with the principles of the invention to use some other feature with sufficient contrast and shape to be utilized by the image recognition software to fully identify the 3D position and orientation of the product, in which case no additional markers explicitly provided for this purpose will be required.
  • In a next step 403 a 3D representation of the main structural component is loaded by the computer 108. This may be performed based on user input from the operator 103. In embodiments where the main structural component 105 is recognized by identification of a 2D marker attached to it or, for example, by shape recognition, the representation may be automatically loaded as soon as the component has been recognized.
  • In step 404 the operator 103 adjusts the position and orientation of the main structural component in virtual 3D space until it has been aligned with the video image of the physical main structural component 105.
  • In a next step 405 the operator loads a data file describing the next component 205 to be added to the assembly.
  • The file describing the next component 205 may be a 3D representation such as a STEP file as described above.
  • Other file formats may be acceptable, including 2D image file formats such as JPG, PNG, GIF, etc.
  • Some embodiments may also allow the operator 103 to capture an image of a specific part using the camera 107 and store that image as the data file describing the next component.
  • Some embodiments of the invention may include a 3D scanner that allows the operator 103 to scan a component in order to create the 3D representation.
  • Some embodiments may also be configured to create a 3D approximation based on several 2D images obtained from different angles.
  • the representation of the next component 205 may be associated with a position in the virtual 3D space, for example a default position from where it may be moved by the operator 103. This applies to both 2D and 3D representations. 2D representations will be shown as 2D images on the display 106 and it will not be possible to provide the same degree of realism in the presentation of how and where the component should be mounted or attached to the main structural component 105.
  • The process of loading this data file may include loading metadata describing the component, including a unique identification in the form of an article number or serial number. Based on this information the computer may be able to determine in step 406 whether that part type has already been associated with a component bin 209. If it has, the process may jump ahead to step 408, which is described below. Otherwise, the process will move to step 407, where the AR workstation will request and receive user input associating the component with an appropriate bin 209.
  • When the computer 108 receives the association with the appropriate bin 209, whether this is done by user input or as a result of metadata, the virtual representation of the component in virtual 3D space may in some embodiments be automatically updated to position the component in or above the appropriate bin 209.
  • In step 408 user input is received representing the final position and orientation of the component 205.
  • The system may, in some embodiments, generate an animation showing a path from the component's position in (or superimposed on) the appropriate bin 109 to the component's final position and orientation when it has been attached or mounted to the main structural component.
  • In a next step 409 the operator 103 can input additional metadata. This may be done by loading text files, loading audio files, adding text using a keyboard or some other text input means (e.g. using the touch screen of display 106 or a digital pen or stylus), adding graphical elements such as the arrow 217 shown in FIG. 2, or drawing circles or other symbols or figures using the touch screen of the display 106 or a graphics tablet connected to the computer 108. This step may also include adding additional sensory output information such as haptic information or audible information that is not vocal (e.g. the sound of a tool being used).
  • Since the computer 108 includes a complete 3D description of the work area 102, the jig 111, the main structural component 105 and the next component 205 to be added to the assembly, some embodiments of the invention may replace the entire video feed with a 3D representation, for example at the request of the operator 103. It may then be possible for the operator to view the assembly operation from different angles, at different zoom levels, and at different levels of detail. The operator 103 may add sequences of such representations as a part of the instructions, or request different views during assembly.
  • When all information has been added to the current step, the step is saved in step 410 and it is determined in step 411 whether this was the final step and the assembly operation has been completed. This is typically determined as a result of user input received from the operator 103. If the assembly is not completed, the process returns to step 405 where data describing the next component to be added is loaded using one of the alternatives described above, and the subsequent steps are repeated for that part. When it is finally determined in step 411 that the assembly operation has been completed, the process moves to step 412 where the instruction file is saved. This file can now be used on this or another AR workstation 100 to guide assembly operations.
  • the file may be one single file holding all relevant information in some convenient file structure. However, the file may also be a collection of several files that may reference each other or information stored externally from the file or files themselves.
  • An embodiment may include all or any combination of input means for component representations, including a 3D file, a 2D image file, a 2D image obtained from the camera 107, a 3D representation from a 3D scanner, and a 3D approximation generated from several 2D images. Any combination of these may be freely combined with one or more of the different user input means such as a touch screen, a keyboard, a mouse, a digital pen, a graphics board, and gesture input.
  • the extent to which an alternative requires additional hardware or software modules does not limit this possibility of combining various features in many ways, since none of the required hardware or software modules or modification to such modules would be mutually exclusive.
  • the AR representation will include a combination of information that is directly obtained from the camera 107 (video of the physical workspace) and information that is augmenting the information obtained from the camera 107.
  • An alternative to the overall flow described with reference to FIG. 4 is one where a complete assembly is imported in step 403. Instead of importing additional components in step 405, the various components may be distributed to the various bins 209 either by manual user input from the operator 103, for example by drag and drop operations, or they can be distributed automatically based on metadata associated with each component.
  • FIG. 5 gives an overview of modules that may be included in an AR workstation 100 according to the invention.
  • Two main modules are included in the form of an AR Creator module 501 and an AR Workstation module 502. These modules represent the main functionality associated with creation of AR instructions and control of the progress of the AR instructions during an assembly operation.
  • an AR workstation may include only the modules required to perform assembly operations, i.e. only the AR workstation module 502 and the modules that may be controlled by it, while other embodiments may also include the AR creator module 501 and the additional module or modules that are controlled by the AR creator module 501, but not by the AR workstation module 502.
  • the AR creator module 501 may be present but deactivated. Yet another possibility is that access to the AR creator module 501 depends on the access rights of the operator 103 that is currently logged in.
  • Some embodiments of the invention may not have the architecture shown in FIG. 5, and in particular, the AR creator module 501 and the AR workstation module 502 do not have to be two separate modules, but may instead be one module that constitutes the main body of software and hardware components configured to receive and handle user input and to administer and utilize the remaining modules in accordance with the tasks to be performed.
  • An embodiment may include additional modules, or some of the modules illustrated in FIG. 5 may not be present.
  • the following description will be based on an exemplary embodiment where the AR creator module 501 and the AR workstation modules 502 are separate modules.
  • the set of functions related to control of the augmented reality environment may be considered an augmented reality management module which may be implemented as one or more modules.
  • the term augmented reality management module is intended to cover any combination of one or more modules that implement specified functionality.
  • The 3D placement and visualization module 503 handles the positioning, orientation and movement of virtual 3D components and objects in the virtual 3D space. This module may receive user input, forwarded from the AR creator module 501 or the AR workstation module 502, representing movement of a particular virtual object and change the position and orientation of that object accordingly.
  • the user management module 504 controls login and determines the access rights of the user.
  • the display manager module 505 controls the display 106 and delivers the combined AR video stream to the display 106.
  • the camera manager module 506 controls the camera 107. In addition to receiving and forwarding video signals, this module may also control zoom level, panning or tilting if the camera has that capability, or select between several cameras if several cameras are connected.
  • the 3D file importer module 507 receives 3D files describing components.
  • This module may implement all file conversion capabilities of the workstation.
  • the 3D file importer should be able to import as many common CAD file formats as possible, extract the 3D descriptions from those files and create a 3D representation according to a format used internally by the system.
  • This module may then deliver the 3D representation to the 3D placement and visualization module 503.
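A hedged sketch of such an importer is shown below. It assumes the third-party trimesh package, which reads common mesh formats such as STL, OBJ and PLY (native CAD formats such as STEP would require an additional CAD kernel), and the "internal format" here is simply a trimesh mesh rather than whatever the workstation actually uses.

```python
import trimesh

def import_component(path):
    """Load a component mesh and normalise it into the representation used in these sketches.

    Units are assumed to be metres; the real internal format of the workstation
    is not specified in the patent.
    """
    mesh = trimesh.load(path, force="mesh")   # force='mesh' collapses multi-part scenes into one mesh
    if not isinstance(mesh, trimesh.Trimesh):
        raise ValueError(f"Unsupported or empty 3D file: {path}")
    return mesh

component = import_component("models/part_205a.stl")
print(component.bounds)   # axis-aligned bounding box, useful for choosing a default placement
```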
  • The 1D marker recognizer module 508 is a module that is capable of interpreting 1D codes, for example bar codes.
  • The 1D marker recognizer module may receive its input from an external bar code reader, or it may receive the video feed from the camera 107 and perform feature extraction processing on the video images in order to extract and interpret the barcode.
  • The 2D marker recognizer module 509 may be a module that is configured to detect, extract and interpret 2D matrix codes such as, for example, QR codes from the video stream received from the camera 107. This module can be used to recognize the markers 104, 204, and as such the module may also be capable of determining distance and orientation as described above.
  • the 3D base generator module 510 may be configured to create a negative shape from the 3D representation of the main structural component, from the completed assembly, or from an intermediate stage of the assembly process. This negative 3D representation can be used to create a file usable for 3D printing of a fixture or jig 111 as described above. As such, this module may only be part of the AR creator functionality.
  • a manufacturing execution system may be connected to the AR workstation module 502, but in most embodiments this system will not be part of the workstation as such, and it may be located remotely, for example on a server.
  • FIG. 6A illustrates the flow of information between the various modules during AR creation.
  • the various modules may be configured differently, for example with different distribution of functionality between modules, or with inclusion of additional modules or exclusion of some of the illustrated modules.
  • the illustration is therefore intended to be explanatory for the concepts of the invention, but not limiting on the scope of the invention.
  • FIG. 6A is conceptual in the sense that it illustrates the most important information flow between modules, but it does not include all possible communication between modules. As such, information may flow directly between modules that are not shown as being directly interconnected in the drawing and information that is not discussed or described herein may be communicated between the various modules.
  • functionality may be distributed differently between the modules, some modules may be combined into one, or some modules may be subdivided into a plurality of distinct modules. Functionality may also be distributed between several computer systems.
  • the respective modules may be combinations of hardware and software, or substantially only hardware or only software.
  • the modules or parts that correspond to modules or parts that have been described with reference to earlier drawings are given the same reference number.
  • the AR creator module 501 and AR workstation module 502 are not shown in this drawing since their functionality to a large extent is administrative or general. Thus, one of them will be involved in most of the information processing and information flow illustrated. Some of the modules that are introduced in FIG. 6 may in some embodiments be part of one or both of those modules. Some of them may also be part of the operating system of the computer system 108.
  • When an operator 103 first activates the AR workstation 100 he or she will have to use a user input device 601 to log in or register. The system will verify the user's identity with the user management module 504, which may be installed locally or which may be a remote authentication server. User management is well known in the art and will not be discussed further. Suffice it to say that the functionality available to the operator 103 may depend on user rights that are granted according to a user profile that is part of the user management module 504. For example, some users may only be authorized to use the AR workstation 100 to perform assembly operations under control of the AR workstation module 502, while other users may be authorized to operate the workstation with access to the AR creator module 501, i.e. to create assembly instructions. Embodiments that do not implement user management and access rights are, of course, within the scope of the invention.
  • the user input device 601 may be one or more of a keyboard, a mouse, a touch screen, a digital pen, a graphics board and gesture recognition. If the user input device 601 includes a touch screen it may be the screen of the display 106, and if it includes gesture recognition the gestures may be captured by the camera 107. Other user input devices known in the art may also be used.
  • the workstation may now receive user input representing instructions to load a particular 3D description of a main structural component 105, i.e. the first component of the assembly process.
  • Subsequent components may be components that will be attached to the main structural component, either directly or by being attached to a subsequent component that has already been attached to the main structural component 105.
  • the 3D description of the main structural component 105 may be loaded from local storage 603. If it is not available there, it may be accessed over a communication interface 604 from a remote database 610, or from some other device, for example a USB stick (not shown). In general files may be loaded and stored by the file
  • the AR workstation 100 is capable of importing a wide variety of 3D files, particularly files that are in one of the more common Computer-aided Design (CAD) file formats.
  • the AR workstation 100 may therefore include a 3D file importer that extracts the required 3D information and converts it to the format handled internally by the AR workstation 100.
  • the internal file format, or 3D graphics format may be any one of the many existing 3D formats that are known in 3D modelling (e.g. from CAD or 3D gaming), a modified version of such a 3D format, or a proprietary format developed particularly for the AR workstation 100.
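  • as an illustrative sketch only (the invention does not prescribe any particular library or internal format), such an importer could be built on the open-source trimesh library; the function name, unit handling and internal dictionary layout below are assumptions, and native CAD formats such as STEP may require additional converters.

```python
import trimesh

# Hypothetical importer: load a mesh file in a common interchange format
# (STL, OBJ, PLY, GLTF, ...) and convert it to a simple internal
# representation of vertices and triangular faces.
def import_3d_file(path, unit_scale=1.0):
    mesh = trimesh.load(path, force='mesh')
    mesh.apply_scale(unit_scale)             # normalize units if required
    return {
        "vertices": mesh.vertices.tolist(),  # N x 3 coordinates
        "faces": mesh.faces.tolist(),        # M x 3 vertex indices
    }

internal_model = import_3d_file("main_component.stl")
```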
  • the AR workstation 100 includes a 3D base model generator 510.
  • This module may be configured to generate a negative shape of the main structural component 105, or more specifically a negative shape of one side of the main structural component 105, extended by sides going away from the negative shape and ending with a plane surface.
  • This negative shape can be outputted as a file in a file format suitable for 3D printing, for example an STL file or a VRML file, or it may be sent directly to a 3D printer connected or otherwise in communication with the AR workstation 100.
  • the resulting component can serve as a cradle, fixture or jig 111 in which the main structural component can be positioned on the work area 102 during assembly.
  • the printing process may add markers 104 directly to the jig 111 or provide distinct areas where such markers can be attached.
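  • purely as a crude illustration of the negative-shape idea (and not the algorithm actually used by the 3D base generator 510), such a jig could be generated as a boolean difference between a block and the component mesh, here sketched with trimesh under the assumption that a boolean backend such as manifold3d or Blender is available; the wall thickness and file names are assumptions.

```python
import trimesh

# Hypothetical jig generator: subtract the component from a block so that
# the printed result can cradle the component on the work area 102.
def make_jig(component_path, wall=10.0, output_path="jig.stl"):
    part = trimesh.load(component_path, force='mesh')
    lo, hi = part.bounds                                  # bounding box corners
    extents = (hi - lo) + [2.0 * wall, 2.0 * wall, 0.0]   # add walls around the part
    block = trimesh.creation.box(extents=extents)
    block.apply_translation(part.bounding_box.centroid)   # centre block on the part
    jig = block.difference(part)                          # negative shape of the part
    jig.export(output_path)                               # STL file for 3D printing
    return output_path

make_jig("main_component.stl")
```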
  • the 3D description extracted from the imported 3D file may then be forwarded to the 3D placement and visualization module 503.
  • This module maintains a representation of a virtual 3D space and stores information associating each imported 3D representation with a position and an orientation in 3D space.
  • the 3D placement and visualization module 503 is also capable of receiving user input from the user input module 601 specifying an adjustment of the position in virtual 3D space associated with a virtual component or object. This can be used by an operator to move a virtual main structural component 105 to a first position in the virtual 3D space, namely the position corresponding to where the main structural component 105 appears in the video feed when it is placed on the work area 102 in the location it will occupy during assembly.
  • this position will be in the jig 111.
  • the operator 103 may also adjust the position of additional components in order to define a starting position, i.e. the position where the additional component should appear in the virtual 3D space and hence on the screen before it is attached to the main structural component 105, and in order to define the position in 3D space after the additional component has been attached.
  • This module may also handle insertion and adjustment of additional virtual objects such as arrows 217, metadata, and user control elements.
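  • as an illustration only, the bookkeeping performed by the 3D placement and visualization module 503 could resemble the following sketch; the class and field names are assumptions made for the example and are not part of the invention.

```python
from dataclasses import dataclass, field

@dataclass
class VirtualObject:
    object_id: str                              # e.g. a part number or "arrow_1"
    mesh: dict                                  # internal 3D description (vertices/faces)
    position: tuple = (0.0, 0.0, 0.0)           # position in virtual 3D space
    orientation: tuple = (0.0, 0.0, 0.0, 1.0)   # orientation as a quaternion

@dataclass
class VirtualScene:
    objects: dict = field(default_factory=dict)

    def add(self, obj: VirtualObject):
        self.objects[obj.object_id] = obj

    def move(self, object_id, new_position, new_orientation=None):
        # called when user input adjusts an object's placement
        obj = self.objects[object_id]
        obj.position = new_position
        if new_orientation is not None:
            obj.orientation = new_orientation
```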
  • the graphic representations of the virtual elements controlled by the 3D placement and visualization module 503 are forwarded to the display manager 505 where they are combined with the video feed from the camera 107.
  • the camera 107 captures video of the work area 102 and feeds that video to a camera manager 506.
  • the camera manager 506 receives the video feed and forwards it to the 1D/2D marker recognition module 508/509.
  • the camera manager 506 may also be configured to control the camera 107, for example by controlling zoom, pan and tilt, or by selecting between several cameras if the AR workstation 100 is provided with more than one camera 107.
  • the 1D/2D marker recognition module 508/509 is configured to interpret 1D or 2D codes in the video image, for example 1D bar codes and 2D matrix codes.
  • the 1D code recognition may be used to identify user identities from ID cards, product or serial numbers for assemblies, part numbers for individual components etc.
  • the 1D marker recognition functionality may be used during a login process, in order to initiate or conclude an assembly process or an AR creation process, or to identify parts or represent the transition from one step to the next in the assembly process.
  • One or more of these functions may also be provided by a separate bar code reader, or by user input of text or selection from a menu.
  • the bar codes may be replaced by QR codes or some other form of 2D code or even RFID chips, and the 1D code recognition module may be replaced by the 2D code recognition module or an RFID reader.
  • the 2D code recognition may be used to interpret the markers 104, 204 in the video and determine the position and orientation of the work area 102 as well as those of the component bins 109. This information may then be forwarded to the 3D placement and visualization module.
  • the 3D placement and visualization module keeps track of the position of the recognized markers in the video image and uses this to establish a correspondence between positions on the actual work area 102 and positions in virtual 3D space.
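  • purely as one possible illustration (the invention is not limited to this technique), the correspondence between positions on the work area 102 and positions in virtual 3D space could be established from the corner points of a detected marker with a perspective-n-point solve in OpenCV; the marker size, camera matrix and distortion coefficients below are assumptions.

```python
import cv2
import numpy as np

MARKER_SIZE = 40.0  # assumed marker edge length in millimetres

# Marker corner coordinates expressed in the work-area coordinate system.
OBJECT_POINTS = np.array([
    [0.0, 0.0, 0.0],
    [MARKER_SIZE, 0.0, 0.0],
    [MARKER_SIZE, MARKER_SIZE, 0.0],
    [0.0, MARKER_SIZE, 0.0],
], dtype=np.float32)

def work_area_pose(corners, camera_matrix, dist_coeffs):
    """corners: the 4 x 2 image points of a detected marker.
    Returns the rotation and translation that map work-area coordinates
    to camera coordinates, i.e. the anchor for the virtual 3D space."""
    image_points = np.asarray(corners, dtype=np.float32)
    ok, rvec, tvec = cv2.solvePnP(OBJECT_POINTS, image_points,
                                  camera_matrix, dist_coeffs)
    return (rvec, tvec) if ok else (None, None)
```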
  • the video feed is also forwarded to the display manager 505 where it is combined with the virtual graphical elements from the 3D placement and visualization module 503 to create the combined AR image which can then be displayed on the display 106.
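  • a crude sketch of such a combination is given below, reusing the pose and scene structures from the earlier sketches and simply projecting object vertices into the frame; an actual embodiment would render shaded meshes, arrows and user controls, so this is illustrative only.

```python
import cv2
import numpy as np

def render_overlay(frame, scene, rvec, tvec, camera_matrix, dist_coeffs):
    """Project the vertices of every virtual object into the video frame and
    draw them as green dots; a real embodiment would render shaded meshes,
    arrows 217 and user controls instead of points."""
    out = frame.copy()
    for obj in scene.objects.values():
        vertices = np.asarray(obj.mesh["vertices"], dtype=np.float32)
        vertices = vertices + np.asarray(obj.position, dtype=np.float32)
        points, _ = cv2.projectPoints(vertices, rvec, tvec,
                                      camera_matrix, dist_coeffs)
        for x, y in points.reshape(-1, 2):
            cv2.circle(out, (int(x), int(y)), 1, (0, 255, 0), -1)
    return out  # the combined AR image forwarded to the display
```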
  • the various positions for virtual components and other virtual objects that are positioned in virtual 3D space may also be forwarded from the 3D placement and visualization module 503 to an AR file creator 606 at well-defined points during the assembly process, for example at the beginning and/or the end of each assembly step.
  • This file creator tracks and registers all user input that is intended for inclusion in the final AR instruction file and stores associated information in one or more files.
  • the file creator may register metadata and other information, for example audio, that is not associated with a position in 3D space.
  • the file creator 606 may register 3D descriptions and positions of additional virtual objects such as user controls, arrows and other indicators, and representations of tools.
  • the file creator 606 does not have to register 3D descriptions of shapes and user controls that are described and already present in the AR workstation as a standard component. The same may be the case for positions that are determined by the system, for example for standard controls 210, 211, 212, 213, 214 and information that is positioned in the sidebar 215, not in the virtual 3D space.
  • the completed AR instruction file may be stored in local storage 603 or sent to be stored in a remote database 610.
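  • the invention does not define a particular AR instruction file format; purely for illustration, a minimal JSON-based file creator could look like the following sketch, in which all field names are assumptions.

```python
import json
import time

class ARFileCreator:
    """Collects the user-defined assembly steps and writes them to one file."""
    def __init__(self):
        self.steps = []

    def register_step(self, step_no, components, metadata=None):
        # components: list of dicts with id, start/end position and orientation
        self.steps.append({
            "step": step_no,
            "components": components,
            "metadata": metadata or {},
        })

    def save(self, path):
        with open(path, "w") as f:
            json.dump({"created": time.time(), "steps": self.steps}, f, indent=2)

creator = ARFileCreator()
creator.register_step(1, [{"id": "screw-M4",
                           "start": [300, 50, 0], "end": [120, 80, 15]}])
creator.save("assembly_instructions.json")
```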
  • the operator 103 may input a 2D representation in the form of an image. This may be from an image file that is already available, or the camera 107 may be used to capture an image of a particular component. Some embodiments may include certain capabilities for generating 3D estimates for example based on a plurality of images captured from different angles or based on additional user input. In other embodiments the 2D image may be used as it is.
  • the 2D representation may still be associated with a position and an orientation in 3D space, and it may be thought of as a completely flat 3D object. When this description and the appended claims refer to 3D objects this is intended to include 2D images represented with a position in 3D space.
  • the MES system 511 and the remote instructor 612 are illustrated in FIG. 6, but they are primarily of relevance during assembly operations and will be described further below.
  • FIG. 6B corresponds to FIG. 6A but illustrates the AR workstation operating in assembly mode, i.e. under control of the AR workstation module 502.
  • a user may log in and receive access rights according to what is defined in the user management module 504. The user will then proceed to load an AR instructions file which has been created as described above.
  • the appropriate file may be identified based on user input in the form of text or selection from a menu. However, some embodiments require reading of a 1D or 2D code identifying the assembly operation and selection of the corresponding AR instruction file.
  • the camera 107, camera manager 506 and 1D and/or 2D marker recognition module 508/509 may be used.
  • the identified file may then be loaded from local storage 603, or from a remote database 610, or in any other manner known in the art.
  • the 3D file importer 507 may extract the 3D descriptions of components and other virtual objects and forward them to the 3D placement and visualization module 503.
  • the various virtual objects, whether they are representations of assembly components or virtual objects representing metadata, are associated with respective initial positions.
  • the initial positions of the various virtual objects may be determined in several ways. If the initial position is defined in the AR instruction file, this position will take precedence. If that is not the case, a virtual component may be positioned for example according to one of the following principles: a default position for the type of component, or a position determined based on the position of a marker 204 in the video feed image. Other possibilities, for example in the sidebar 215, are available as a design choice.
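  • the precedence described above could, as a non-limiting sketch, be expressed as follows; the argument names and the fallback value are assumptions.

```python
def initial_position(object_id, instruction_file, type_defaults,
                     marker_positions, component_type=None):
    """Pick the initial position of a virtual object: a position defined in
    the AR instruction file takes precedence, otherwise a default for the
    component type, otherwise the position of a detected marker 204."""
    from_file = instruction_file.get("initial_positions", {})
    if object_id in from_file:
        return from_file[object_id]
    if component_type in type_defaults:
        return type_defaults[component_type]
    if object_id in marker_positions:
        return marker_positions[object_id]
    return (0.0, 0.0, 0.0)  # design-choice fallback, e.g. the sidebar area
```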
  • the 3D placement and visualization module 503 will forward the description, position and orientation of at least the main structural component 105 to the display manager 505 where it is combined with the video feed and forwarded to the display 106.
  • the system may now receive user input to proceed to a first assembly step.
  • a next component will become visible on the display, or, if it is already visible, it will be pointed out by a marker such as an arrow 217.
  • the display may show a virtual representation of the component 218 while an arrow 217 points out the component's physical position, for example in a parts bin 109.
  • the 3D placement and visualization module 503 may then change the position of the component 218 to a position where it is mounted in its appropriate position on the main structural component 105.
  • This change in position may be gradual such that the component 218 is shown as moving gradually from the place where it is stored to the place where it is to be mounted.
  • This animation may follow a path that is automatically generated from the initial position to the final position, or the path may have been explicitly defined by the operator during creation of the AR instruction file, for example by dragging the component on a touch screen display.
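  • as one possible illustration, such a gradual movement could be a simple linear interpolation between the stored initial and final positions; a path explicitly recorded by the operator could be interpolated in the same way. The coordinates below are placeholders.

```python
import numpy as np

def animation_path(start, end, steps=60):
    """Yield intermediate positions for moving a virtual component gradually
    from its storage position to its final mounting position."""
    start = np.asarray(start, dtype=float)
    end = np.asarray(end, dtype=float)
    for t in np.linspace(0.0, 1.0, steps):
        yield tuple((1.0 - t) * start + t * end)

# Usage sketch: update the component position once per rendered frame,
# e.g. scene.move("screw-M4", pos) in an actual render loop.
for pos in animation_path((300, 50, 0), (120, 80, 15)):
    pass
```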
  • the operator will have to identify a selected component by presenting a 1D or 2D product code associated with the component to the camera 107 for recognition by the 1D/2D marker recognition module 508/509. This may for example be done as a confirmation by the operator 103 that the step has been completed.
  • a log file creator 614 may receive this information from the 3D placement and visualization module along with time stamps for when the step started and was completed, and any other information, for example the video feed as captured during the operator's performance of the assembly step.
  • the log file may be stored in local storage 603 and/or forwarded to a remote MES system 511 or some other database 610.
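  • purely for illustration, such a log could be written as one JSON record per assembly step, as in the following sketch; the file name and field names are assumptions.

```python
import json
import datetime

def log_step(path, operator_id, serial_number, step_no, started, completed):
    """Append one record per completed assembly step to a local log file;
    the same record could be forwarded to a remote MES system or database."""
    record = {
        "operator": operator_id,
        "serial": serial_number,
        "step": step_no,
        "started": started.isoformat(),
        "completed": completed.isoformat(),
    }
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")

now = datetime.datetime.now()
log_step("assembly_log.jsonl", "op-103", "SN-0001", 1, now, now)
```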
  • the AR workstation 100 has a communication interface 604 which not only enables communication with the remote database or the MES system, but also enables communication with a person located remotely from the AR workstation 100, for example a remote instructor 612.
  • This remote communication may include the ability to transmit audio and to transmit user input representing lines being drawn on the display 106.
  • each assembly step has been described as including one component, sometimes referred to as a next component or an additional component. This should be interpreted as at least one next component.
  • for example, if one assembly step involves mounting 20 identical screws, the AR instruction file will not include 20 assembly steps with one step for each screw. Instead the step may include all 20 screws. Since all steps that include several next components by necessity include at least one next component, the description does not lose generality by reciting one next component when several next components are included.
  • the main structural component 105 exists in physical space, positioned on the work area 102. In addition it is captured by the camera 107 and is as such represented as a video image on the display 106. The same component is described as a 3D description associated with a position in virtual 3D space, and this virtual component may also be shown on the display 106.
  • the various representations of the same object have not been given different reference numbers. By way of example, this means that the phrase "the 3D description of the main structural component 105" refers to the virtual representation inside the computer and not to the physical component on the work area 102.
  • An augmented reality workstation for providing augmented reality guidance in a manufacturing environment, the augmented reality workstation comprising: a camera 107 configured to capture a work area 102 of the workstation 100; a computer system 108 configured to receive a video feed from the camera 107 and to combine the video feed with virtual graphical elements 210, 211, 212, 213, 214, 217, 218 to generate an augmented reality video signal; a display system 106 configured to receive the augmented reality video signal from the computer system 108 and display an augmented reality view of the work area 102; wherein the computer system 108 is further configured to: detect at least one characteristic feature 104 in the video feed received from the camera 107; maintain a representation of a virtual 3D space in memory; assign a position in the virtual 3D space for the at least one characteristic feature and positioning at least one of the virtual graphical elements 210, 211, 212, 213, 214, 217, 218 in the virtual 3D space relative

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Train Traffic Observation, Control, And Security (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention relates to an augmented reality workstation for creating and providing augmented reality guidance for an assembly operation in a manufacturing environment. The augmented reality workstation comprises a camera (107) configured to capture a view of a work area (102); a computer system (108) configured to receive a video feed from the camera (107) and to combine the video feed with virtual graphical elements to generate an augmented reality video signal; and a display (106) configured to receive said augmented reality video signal from said computer system (108) and display an augmented reality view of said work area (102). The computer system (108) comprises an augmented reality management module configured to detect a characteristic feature in the video feed and associate it with a known position in the work area and a reference position in the virtual 3D space. An operator (103) can load 3D descriptions of assembly components, associate them with respective positions in said virtual 3D space, change their positions in said virtual 3D space using a user interface, and create an augmented reality instruction file in which the 3D descriptions of the assembly components and the positions of the components in said virtual 3D space during defined steps of the assembly operation are stored.
PCT/NO2019/050032 2018-02-02 2019-02-04 Procédé et système de guidage d'assemblage à réalité augmentée WO2019151877A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
NO20180179A NO20180179A1 (en) 2018-02-02 2018-02-02 Method and system for augmented reality assembly guidance
NO20180179 2018-02-02

Publications (1)

Publication Number Publication Date
WO2019151877A1 true WO2019151877A1 (fr) 2019-08-08

Family

ID=65635777

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/NO2019/050032 WO2019151877A1 (fr) 2018-02-02 2019-02-04 Procédé et système de guidage d'assemblage à réalité augmentée

Country Status (2)

Country Link
NO (1) NO20180179A1 (fr)
WO (1) WO2019151877A1 (fr)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110716501A (zh) * 2019-11-18 2020-01-21 康美包(苏州)有限公司 一种数据传输方法、设备、装置及计算机可存储介质
CN111381679A (zh) * 2020-03-19 2020-07-07 三一筑工科技有限公司 一种基于ar的装配式建筑施工培训方法、装置及计算设备
CN113126575A (zh) * 2019-12-31 2021-07-16 捷普电子(无锡)有限公司 用于装配操作流程的引导方法及引导系统
CN113673894A (zh) * 2021-08-27 2021-11-19 东华大学 一种基于数字孪生的多人协作ar装配方法和系统
CN115009398A (zh) * 2022-07-08 2022-09-06 江西工业工程职业技术学院 一种汽车组装系统及其组装方法
US11762367B2 (en) 2021-05-07 2023-09-19 Rockwell Collins, Inc. Benchtop visual prototyping and assembly system

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11606364B2 (en) 2020-09-15 2023-03-14 Meta Platforms Technologies, Llc Artificial reality collaborative working environments
US11854230B2 (en) 2020-12-01 2023-12-26 Meta Platforms Technologies, Llc Physical keyboard tracking

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080024392A1 (en) * 2004-06-18 2008-01-31 Torbjorn Gustafsson Interactive Method of Presenting Information in an Image
EP2728548A2 (fr) * 2012-10-31 2014-05-07 The Boeing Company Cadre automatisé d'étalonnage de référence à réalité amplifiée

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2500416B8 (en) * 2012-03-21 2017-06-14 Sony Computer Entertainment Europe Ltd Apparatus and method of augmented reality interaction
KR102161510B1 (ko) * 2013-09-02 2020-10-05 엘지전자 주식회사 포터블 디바이스 및 그 제어 방법
AU2014202574A1 (en) * 2014-05-13 2015-12-03 Canon Kabushiki Kaisha Positioning of projected augmented reality content
TW201738847A (zh) * 2016-04-28 2017-11-01 國立交通大學 組裝指示系統及組裝指示方法
JP6701930B2 (ja) * 2016-04-28 2020-05-27 富士通株式会社 オーサリング装置、オーサリング方法およびオーサリングプログラム

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080024392A1 (en) * 2004-06-18 2008-01-31 Torbjorn Gustafsson Interactive Method of Presenting Information in an Image
EP2728548A2 (fr) * 2012-10-31 2014-05-07 The Boeing Company Cadre automatisé d'étalonnage de référence à réalité amplifiée

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
MIRKO FERRARI: "Huawei UPS2000 Augmented Reality installation instructions", 23 October 2017 (2017-10-23), XP055581228, Retrieved from the Internet <URL:http://arblog.inglobetechnologies.com/?p=2290> [retrieved on 20190415] *
STEVEN J HENDERSON ET AL: "Augmented reality in the psychomotor phase of a procedural task", MIXED AND AUGMENTED REALITY (ISMAR), 2011 10TH IEEE INTERNATIONAL SYMPOSIUM ON, IEEE, 26 October 2011 (2011-10-26), pages 191 - 200, XP032201451, ISBN: 978-1-4577-2183-0, DOI: 10.1109/ISMAR.2011.6092386 *
VIJAIMUKUND RAGHAVAN ET AL: "Interactive Evaluation of Assembly Sequences Using Augmented Reality", IEEE TRANSACTIONS ON ROBOTICS AND AUTOMATION, IEEE INC, NEW YORK, US, vol. 15, no. 3, 1 June 1999 (1999-06-01), XP011053407, ISSN: 1042-296X *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110716501A (zh) * 2019-11-18 2020-01-21 康美包(苏州)有限公司 一种数据传输方法、设备、装置及计算机可存储介质
CN113126575A (zh) * 2019-12-31 2021-07-16 捷普电子(无锡)有限公司 用于装配操作流程的引导方法及引导系统
CN113126575B (zh) * 2019-12-31 2022-07-26 捷普电子(无锡)有限公司 用于装配操作流程的引导方法及引导系统
CN111381679A (zh) * 2020-03-19 2020-07-07 三一筑工科技有限公司 一种基于ar的装配式建筑施工培训方法、装置及计算设备
US11762367B2 (en) 2021-05-07 2023-09-19 Rockwell Collins, Inc. Benchtop visual prototyping and assembly system
CN113673894A (zh) * 2021-08-27 2021-11-19 东华大学 一种基于数字孪生的多人协作ar装配方法和系统
CN113673894B (zh) * 2021-08-27 2024-02-02 东华大学 一种基于数字孪生的多人协作ar装配方法和系统
CN115009398A (zh) * 2022-07-08 2022-09-06 江西工业工程职业技术学院 一种汽车组装系统及其组装方法

Also Published As

Publication number Publication date
NO343601B1 (en) 2019-04-08
NO20180179A1 (en) 2019-04-08

Similar Documents

Publication Publication Date Title
WO2019151877A1 (fr) Procédé et système de guidage d'assemblage à réalité augmentée
US11398080B2 (en) Methods for augmented reality applications
Hořejší Augmented reality system for virtual training of parts assembly
DK2996015T3 (en) PROCEDURE TO USE IMPROVED REALITY AS HMI VIEW
CN104995663B (zh) 用于使用光学字符识别来提供增强现实的方法和装置
JP6144364B2 (ja) 作業支援用データ作成プログラム
EP2549428A2 (fr) Procédé et système pour générer des études comportementales chez un individu
KR20090056760A (ko) 증강현실 저작 방법 및 시스템과 그 프로그램을 기록한컴퓨터로 읽을 수 있는 기록 매체
EP1926051A2 (fr) Plate-forme média connectée à un réseau
Ferrise et al. Multimodal training and tele-assistance systems for the maintenance of industrial products: This paper presents a multimodal and remote training system for improvement of maintenance quality in the case study of washing machine
EP3244286B1 (fr) Installation d'un élément physique
JP2020098568A (ja) 情報管理装置、情報管理システム、情報管理方法および情報管理プログラム
US20200050857A1 (en) Methods and systems of providing augmented reality
CN108604256B (zh) 零件信息检索装置、零件信息检索方法以及程序
Pick et al. Design and evaluation of data annotation workflows for cave-like virtual environments
CN115482322A (zh) 生成合成训练数据集的计算机实现方法和系统
US11833761B1 (en) Optimizing interaction with of tangible tools with tangible objects via registration of virtual objects to tangible tools
JP6803794B2 (ja) 画像処理装置及び製造システム
JP7381556B2 (ja) メディアコンテンツ計画システム
Gimeno et al. An occlusion-aware AR authoring tool for assembly and repair tasks
Osorio-Gómez et al. An augmented reality tool to validate the assembly sequence of a discrete product
Wang et al. Integrated content authoring for augmented reality based product manual assembly process instruction
JP2007316832A (ja) パーツカタログ表示システムおよびその制御方法
Agrawal et al. HoloLabel: Augmented reality user-in-the-loop online annotation tool for as-is building information
JP2021015572A (ja) 情報管理システムおよび情報管理方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19708686

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19708686

Country of ref document: EP

Kind code of ref document: A1