WO2020049999A1 - Manufacturing assistance system, method, and program - Google Patents

Manufacturing assistance system, method, and program

Info

Publication number
WO2020049999A1
WO2020049999A1 · PCT/JP2019/032359 · JP2019032359W
Authority
WO
WIPO (PCT)
Prior art keywords
component
assembling
identification information
image
assembly
Prior art date
Application number
PCT/JP2019/032359
Other languages
French (fr)
Japanese (ja)
Inventor
航 粟津
Original Assignee
三菱自動車工業株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱自動車工業株式会社
Priority to JP2020541113A priority Critical patent/JP7374908B2/en
Publication of WO2020049999A1 publication Critical patent/WO2020049999A1/en
Priority to JP2023013855A priority patent/JP7468722B2/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B23MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23PMETAL-WORKING NOT OTHERWISE PROVIDED FOR; COMBINED OPERATIONS; UNIVERSAL MACHINE TOOLS
    • B23P21/00Machines for assembling a multiplicity of different parts to compose units, with or without preceding or subsequent working of such parts, e.g. with programme control
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/418Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/04Manufacturing
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30Computing systems specially adapted for manufacturing

Definitions

  • the present invention relates to a manufacturing support system, a method, and a program for assisting a worker in assembling parts.
  • the work content may be indicated using technical terms or codes (combinations of a plurality of symbols used for information transmission) that are difficult to decipher.
  • the layout of the components may look different depending on the posture and orientation of the assembly target as viewed from an operator. Therefore, even if the assembled state of the parts looks different from the acceptable image, it may actually be an appropriate state. Further, since a plurality of parts are checked at the same time, an improper assembly state may be overlooked by visual inspection.
  • an assembling error easily occurs in a work in which a plurality of parts are assembled, and it is difficult to improve work efficiency.
  • One of the objects of the present invention, which was created in view of the above-mentioned problems, is to provide a manufacturing support system, a method, and a program that can improve the work efficiency in assembling parts. The present invention is not limited to this purpose; achieving the operations and effects that are derived from each configuration shown in "Embodiments for Carrying Out the Invention" described later can also be positioned as another object of the present invention.
  • the disclosed manufacturing support system is a manufacturing support system for assisting a worker in assembling parts.
  • the present system includes smart glasses mounted on the head of the worker, which incorporate a photographing device for photographing the real scene in the field of view of the worker and a display device for superimposing and displaying a virtual image on the real scene.
  • a first database that defines a relationship between an assembling pattern of the plurality of components with respect to an assembling target and identification information corresponding thereto.
  • a control device is provided for displaying a virtual image of the component on the display device so as to match a coordinate system of a real scene in the field of view of the worker.
  • the control device includes an extraction unit, an acquisition unit, a recognition unit, and a drawing unit.
  • the extraction unit extracts the identification information included in a work instruction document in which the contents of the assembling work are described, from a photographed image of the photographing device.
  • the acquisition unit acquires an assembly pattern of the component corresponding to the identification information based on the first database.
  • the recognition unit recognizes the assembly target from a captured image of the imaging device.
  • the drawing unit draws a virtual image of the component in a completed position and orientation in which the component is assembled to the assembly target.
  • a second database is provided in which a relationship between a storage location of the component, the number (use number) of the component to be assembled to the assembly target, and the identification information is defined.
  • the acquisition unit acquires, based on the second database, information on the storage location of the component corresponding to the identification information and information on the number of the components.
  • the recognition unit recognizes the storage location from a captured image of the imaging device. Further, the drawing unit displays information on the number of the components near the storage location.
  • the recognition unit recognizes a marker installed in the storage location in the captured image.
  • the control device includes a determination unit and a highlighting unit.
  • the determination unit determines whether or not the assembly state of the component matches the assembly pattern based on the captured image of the imaging device and the identification information.
  • the emphasis display unit highlights a part in which the assembly state of the component does not match the assembly pattern so as to conform to a coordinate system of a real scene in the field of view of the worker.
  • the type, the number, and the layout of the parts to be assembled are different for each work instruction.
  • FIG. 3 is a block diagram illustrating a hardware configuration of the manufacturing support system.
  • FIG. 4 is a block diagram illustrating a software configuration of a manufacturing support program.
  • FIGS. 5(A) and 5(B) show configuration examples of the databases.
  • FIGS. 6(A)-(C) are diagrams illustrating display examples of a virtual image of a component.
  • FIGS. 7(A) and 7(B) are diagrams illustrating display examples of the number of components. FIG. 8 is a diagram illustrating an example of highlighting of an inspection result.
  • (A) and (B) are diagrams for explaining a modification of the manufacturing support system.
  • the manufacturing support system is a system using a wearable gadget that assists the worker 51 in assembling parts, and is applied to a manufacturing process of the automobile 50 as shown in FIG. 1.
  • a description will be given of support for the worker 51 to take out parts from the material storage shelf 53 and assemble them to the automobile 50.
  • Specific examples of parts to be assembled include electrical components such as relays, switches, and lamps, as well as glass, bumpers, power trains, seats, and tires.
  • Specific examples of assembly targets include a relay box, a switch box, an instrument panel, a door, and a vehicle body.
  • the content of the work support is the provision of visual information using the smart glass 10 attached to the head of the worker 51.
  • the smart glass 10 is fixed to the helmet 52 of the worker 51.
  • the smart glass 10 incorporates a display device 13 and a photographing device 16.
  • the photographing device 16 is a camera that photographs the actual scene in the field of view of the worker 51 and the work instruction sheet 54.
  • the display device 13 is a device that superimposes and displays a virtual image on a real scene in the field of view of the worker 51.
  • the display method may be retinal projection (projecting a virtual image onto the retina of the worker 51), a transmission type (projecting a virtual image onto the surface of a half mirror), or a non-transmissive type (displaying an image in which the real scene captured by the imaging device 16 and a virtual image are combined).
  • various virtual images are displayed so as to be compatible with the coordinate system of the real scene viewed from the viewpoint of the worker 51.
  • This virtual image is a perspective projection image (or a deformed version of the perspective projection image) that conforms to the coordinate system of the real scene, and is rendered, for example, as a simple polygon model based on three-dimensional CAD (Computer-Aided Design) data of the part.
  • the drawing object is a two-dimensional object.
  • the drawing objects are three-dimensional objects, and each drawing object is virtually arranged in the coordinate system of the real scene.
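As an illustration (not part of the patent disclosure), the drawing of a three-dimensional object so that it matches the coordinate system of the real scene can be sketched with a pinhole camera model: a point in the scene coordinate system is transformed into camera coordinates and then perspective-projected to display pixels. The rotation `R`, translation `t`, and intrinsics are assumed to come from the AR tracking step.

```python
# Minimal sketch of world-to-display mapping; names and values are illustrative.

def world_to_camera(p_world, R, t):
    """Rigid transform: p_cam = R @ p_world + t (R given as three row tuples)."""
    return tuple(
        sum(R[i][j] * p_world[j] for j in range(3)) + t[i] for i in range(3)
    )

def project(p_cam, focal_px, cx, cy):
    """Pinhole projection of a camera-space point (z forward) to pixel coordinates."""
    x, y, z = p_cam
    if z <= 0:
        raise ValueError("point is behind the camera")
    return (cx + focal_px * x / z, cy + focal_px * y / z)

# Identity pose: a point 2 m straight ahead lands on the image center.
IDENTITY = ((1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0))
p = world_to_camera((0.0, 0.0, 2.0), IDENTITY, (0.0, 0.0, 0.0))
u, v = project(p, 800.0, 640.0, 360.0)  # → (640.0, 360.0)
```

In a real implementation this mapping is handled by the AR drawing engine mentioned later in the text; the sketch only shows why a constant relationship between the smart glass and the worker's line of sight is enough to overlay virtual parts convincingly.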
  • the content of work support is controlled by an electronic control device (computer).
  • the “electronic control device” here may be built in the smart glass 10 or may be prepared separately from the smart glass 10.
  • the program may be installed in a server 30 or a workstation on the network 19, and the display content calculated there may be displayed on the smart glass 10.
  • the server 30 connected to the network 19 via the repeater 36 may be used.
  • a program that operates even when not connected to the network 19 may be installed in the smart glass 10.
  • FIG. 3 is a block diagram showing a hardware configuration of the manufacturing support system realized by cooperation between the smart glass 10 and the server 30.
  • the smart glass 10 includes a recording medium drive 11 for reading a program recorded on a recording medium 18 (removable medium), a storage device 12, a display device 13, an input device 14, a communication device 15, a photographing device 16 (camera), a microphone 17, and an electronic control unit 20.
  • the server 30 includes a recording medium drive 31, a storage device 32, a display device 33, an input device 34, a communication device 35, and an electronic control device 40.
  • the function of each element is almost the same as that of the element with the same name built in the smart glass 10, and therefore the description is omitted.
  • the recording medium 18 is, for example, a flash memory card, a semiconductor memory device conforming to the universal serial bus standard, an optical disk, or the like.
  • the recording medium drive 11 is a reading device that reads information (programs and data) recorded and stored on the recording medium 18.
  • the storage device 12 is a memory device for storing data and programs that are held for a long period of time, and includes, for example, a nonvolatile memory such as a flash memory or an EEPROM (Electrically Erasable Programmable Read-Only Memory), or a solid state drive.
  • the display device 13 is one of devices for providing visual information to the worker 51.
  • in the retinal projection type, the retina of the worker 51 is the image plane. In the transmission type and the non-transmissive type, a half mirror or a display [for example, a liquid crystal display (LCD) or an organic EL display (OELD)] arranged within the field of view of the worker 51 is the image plane.
  • the input device 14 is one of the devices for inputting input signals to the electronic control devices 20 and 40.
  • a device such as a wireless keyboard and a wireless mouse can be connected to the smart glass 10.
  • An input device 14 such as a push button or a physical key is provided on a temple (side arm) of the smart glass 10.
  • the microphone 17 is a device for inputting the voice of the worker 51 as an input signal.
  • the communication device 15 is a device that manages communication with the outside of the smart glass 10, and is, for example, a device that performs wireless communication with the server 30 via the network 19.
  • the repeater 36 shown in FIG. 2 can be considered to be included in the communication device 15.
  • when input/output devices conforming to a short-range wireless communication standard are connected to the smart glass 10, those input/output devices communicate via the communication device 15.
  • the photographing device 16 is a digital video camera having a built-in image sensor such as a CCD (Charge-Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor).
  • the photographing device 16 is attached to a rim (periphery of a lens) or a bridge (the connection portion between the left and right lenses) of the smart glass 10 and functions to photograph the field of view of the worker 51 wearing the smart glass 10.
  • Information on the image captured by the image capturing device 16 is transmitted to the electronic control device 20 and the server 30.
  • the “captured image” here includes not only a still image but also an image of one frame (one frame) included in a moving image.
  • the position and the photographing direction of the photographing device 16 are fixed to the smart glass 10, and the position of the display device 13 is also fixed to the smart glass 10. Therefore, as long as the relationship between the smart glass 10 and the line of sight of the worker 51 is kept constant, a virtual image (that is, a three-dimensional object that matches the coordinate system of the real scene photographed by the photographing device 16, and thus appears to exist in the same environment) can be drawn on the display device 13.
  • a known AR (Augmented Reality) development library can be used to calculate the position and orientation in three-dimensional space of a subject in an image captured by the imaging device 16.
  • a known three-dimensional renderer or an AR drawing engine can be used for drawing a three-dimensional object that matches a real scene.
  • the electronic control unit 20 includes a processor 21 (central processing unit), a memory 22 (main memory, main storage device), an auxiliary storage device 23, an interface device 24, and the like, which are communicably connected to each other via a bus 25.
  • the processor 21 is a central processing unit including a control unit (control circuit), an operation unit (operation circuit), a cache memory (register group), and the like.
  • the memory 22 is a storage device that stores programs and data during work, and includes, for example, a ROM (Read Only Memory) and a RAM (Random Access Memory).
  • the auxiliary storage device 23 is a memory device that stores data and firmware that are held for a longer time than the memory 22, and includes, for example, a nonvolatile memory such as a flash memory or an EEPROM.
  • the interface device 24 controls input / output (I / O) between the electronic control device 20 and the outside.
  • the electronic control device 20 is connected to the recording medium drive 11, the storage device 12, the display device 13, the input device 14, the communication device 15, and the like via the interface device 24.
  • the electronic control unit 40 of the server 30 also includes a processor 41 (central processing unit), a memory 42 (main memory, main storage device), an auxiliary storage device 43, an interface device 44, and the like, which are communicably connected to each other.
  • the function of each element is almost the same as that of the element having the same name built in the electronic control unit 20, and therefore the description is omitted. Further, there is no intention to limit the specific hardware configuration of the electronic control device to the above configuration, and various known configurations may be applied.
  • FIG. 4 is a block diagram for explaining the processing contents of the manufacturing support program 9 executed by the smart glass 10 and the server 30. These processing contents are recorded as application programs in the storage devices 12 and 32 of the smart glass 10 and the server 30, on the recording medium 18, and the like, and are loaded into memory and executed.
  • the manufacturing support program 9 includes an extraction unit 3, an acquisition unit 4, a recognition unit 5, a drawing unit 6, a determination unit 7, and a highlighting unit 8. Further, a first database 1 and a second database 2 are provided as databases to which these elements can be referred.
  • the first database 1 defines a relationship between an assembly pattern of a component with respect to an assembly target and identification information corresponding thereto.
  • the identification information is information for specifying the content of the work instructed to the worker 51, and is, for example, a work number, a work target vehicle number, or an instruction target vehicle number.
  • the work number is described in the work instruction 54 presented by the work manager. It is assumed that the work instruction 54 is presented by the work manager before the worker 51 performs the assembling work, or is attached to the automobile 50 to be assembled.
  • the contents of the first database 1 are illustrated in FIG. 5.
  • the first database 1 defines at least a relationship between a work number for identifying an assembling operation and information (an assembling position and an assembling direction) specifying the layout (assembling pattern) of the components to be assembled in that operation. If there are a plurality of types of parts to be assembled in the operation, an assembly pattern of each type is set for one work number. In the example shown in FIG. 5A, assembling patterns of three types of parts (a-01, a-02, a-03) are set for the work number "ABCDE01234".
  • if the work number differs, the types of parts and the assembling patterns will also differ.
  • the assembling patterns to be learned by the worker 51 range from tens to hundreds of patterns, and memorizing them is said to require months to years of experience.
  • all of these patterns are defined in the first database 1 and are linked to the work numbers. Thereby, the worker 51 is freed from having to memorize the assembly patterns.
  • the assembling position of the component can be specified by at least three parameters based on a coordinate system fixed to the assembly target (for example, a relay box).
  • the assembling direction of the component can likewise be specified by at least three parameters based on a coordinate system fixed to the assembly target. Therefore, the layout of one component can be uniquely specified using six or more parameters.
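By way of illustration (the parameterization below is an assumption, not prescribed by the patent), the six-parameter layout of one component could be modeled as a small record: three position coordinates plus three direction angles in the coordinate system fixed to the assembly target.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AssemblyPattern:
    """Layout of one component relative to the assembly target:
    three position parameters plus three direction parameters
    (Euler angles in degrees here; other parameterizations, such as
    quaternions, would work equally well)."""
    part_no: str
    x: float
    y: float
    z: float
    roll: float
    pitch: float
    yaw: float

# Hypothetical entry for one of the part numbers named in the text.
pattern = AssemblyPattern("a-01", 120.0, 35.0, 10.0, 0.0, 0.0, 90.0)
```

One such record per part, keyed by work number, is all the first database needs to drive the drawing unit.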
  • the second database 2 defines the relationship between the identification information (work number), the storage location and the number of used parts (the number of assembled parts).
  • the component storage location is, for example, information indicating which of the many component trays stored in the material storage shelf 53 shown in FIG. 1 contains the component. If the storage location of a component used in a certain operation is always fixed, the position (storage location) of the component can be specified by at least three parameters. On the other hand, when the storage location of the component may change depending on the work situation, it is preferable to add information specifying the storage location to each record of the second database 2.
  • the number of parts “a-01” used in the operation number “ABCDE01234” is 3, and the storage location is the location code “aA11bB22cC33dD44”.
  • a character string, a symbol, a bar code, a two-dimensional code, or the like representing the place code “aA11bB22cC33dD44” is posted at a place where the part is actually stored.
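As an illustration (not part of the patent disclosure), the two databases could be modeled as in-memory tables keyed by work number; only the work number "ABCDE01234", the part numbers a-01/a-02/a-03, the use count 3, and the place code "aA11bB22cC33dD44" come from the text, while all other field names and values are hypothetical.

```python
# Hypothetical in-memory sketch of the first and second databases.
FIRST_DB = {
    "ABCDE01234": {
        "a-01": {"position": (120.0, 35.0, 10.0), "direction": (0.0, 0.0, 90.0)},
        "a-02": {"position": (80.0, 35.0, 10.0), "direction": (0.0, 0.0, 0.0)},
        "a-03": {"position": (40.0, 35.0, 10.0), "direction": (0.0, 0.0, 0.0)},
    },
}

SECOND_DB = {
    "ABCDE01234": {
        "a-01": {"location_code": "aA11bB22cC33dD44", "use_count": 3},
        "a-02": {"location_code": "(illustrative)", "use_count": 1},
        "a-03": {"location_code": "(illustrative)", "use_count": 2},
    },
}

def acquire(work_no):
    """Acquisition unit: look up assembly patterns and stock information."""
    return FIRST_DB[work_no], SECOND_DB[work_no]
```

A single lookup by work number thus yields everything the drawing unit needs: where each part goes, where it is stored, and how many to take.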
  • the extracting unit 3 extracts identification information (work number) included in the work instruction sheet 54 in which the contents of the assembling work are described, from the image captured by the imaging device 16. It is assumed that the work instruction 54 is photographed before the worker 51 performs the work of assembling parts.
  • the extraction unit 3 of the present embodiment has a function of extracting the work number by reading a barcode on the work instruction sheet 54 shown in FIG. 1. The information of the work number extracted here is transmitted to the acquisition unit 4.
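The extraction step could be sketched as follows, assuming the barcode has already been decoded into text by an off-the-shelf decoder; the "five letters followed by five digits" shape matches the example work number "ABCDE01234" but is an assumption, since the patent does not prescribe a format.

```python
import re

# Assumed work-number shape, modeled on the example "ABCDE01234".
WORK_NO = re.compile(r"\b[A-Z]{5}\d{5}\b")

def extract_work_number(decoded_strings):
    """Return the first decoded token that looks like a work number, else None."""
    for s in decoded_strings:
        m = WORK_NO.search(s)
        if m:
            return m.group(0)
    return None
```

Feeding the decoder output of a captured frame through this filter gives the work number to hand to the acquisition unit.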
  • the acquisition unit 4 has two functions.
  • the first function is information acquisition based on the first database 1
  • the second function is information acquisition based on the second database 2.
  • the information obtained here is transmitted to the drawing unit 6.
  • the former function is to acquire, based on the first database 1, the component assembling pattern corresponding to the identification information (work number) extracted by the extraction unit 3. For example, when the extracted work number is "ABCDE01234", the assembling patterns (information on the assembling position and the assembling direction) of the three corresponding types of parts (a-01, a-02, a-03) are acquired from the first database 1 shown in FIG. 5(A).
  • the latter function is to acquire, based on the second database 2, information on the storage location and the number of used parts corresponding to the identification information (work number). For example, based on the second database 2 shown in FIG. 5 (B), information on the location code and the number of uses is obtained for each of the three types of components (a-01, a-02, a-03).
  • the recognition unit 5 recognizes a specific object or mark present in an image captured by the image capturing device 16.
  • the recognizing unit 5 has a function of recognizing at least the "part assembly target". For example, in the relay assembling operation (work number "ABCDE01234" in FIG. 5A), it is determined whether or not the relay box 55 to which the relay is to be installed is in the captured image. In this determination, the three-dimensional CAD data of the relay box 55 is referred to. As shown in FIG. 6A, when the relay box 55 is detected in the photographed image, the position and orientation of the relay box 55 relative to the photographing position are estimated based on the distribution of characteristic points (points and vertices included in the edges of the relay box 55) in the photographed image. Thereby, the appearance of the relay box 55 within the field of view of the worker 51 is specified.
  • the information on the position and orientation of the assembly target specified here is transmitted to the drawing unit 6 and is referred to when drawing a part.
  • the recognizing unit 5 also has a function of recognizing the "part storage location" from the captured image. For example, in the relay assembling work (work number "ABCDE01234" in FIG. 5B), a character string, a symbol, a two-dimensional code, or the like representing the location code "aA11bB22cC33dD44" is recognized. As shown in FIG. 7A, when a marker 58 representing the location code is detected in the captured image, the position and orientation of the marker 58 relative to the capturing position are estimated based on the position and shape of the marker 58. Information on the storage location of the part recognized here is also transmitted to the drawing unit 6.
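As a simplified illustration of estimating the marker position from its appearance (the patent leaves the method to known AR techniques), the distance to a marker of known physical size follows from the pinhole relation distance = focal_length × real_size / apparent_size; this ignores tilt, which a real AR library would handle by solving the full pose.

```python
# Illustrative sketch; the 80 mm marker size and 800 px focal length are assumptions.

def marker_distance_mm(real_size_mm, apparent_size_px, focal_px):
    """Estimate distance to a marker of known edge length from its pixel size."""
    if apparent_size_px <= 0:
        raise ValueError("marker not visible")
    return focal_px * real_size_mm / apparent_size_px

# An 80 mm marker that appears 40 px wide to an 800 px focal-length camera:
d = marker_distance_mm(80.0, 40.0, 800.0)  # → 1600.0 mm
```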
  • the drawing unit 6 controls the display content of the display device 13 when assembling components, and has three main functions.
  • the first function is to draw a virtual image of a component at the position and orientation at the time of completion when the component is assembled to the assembly target.
  • the “virtual image” indicates a state (completion drawing) when the assembling of the parts is completed, and serves as a “model” of the assembling work.
  • the virtual image of the component is drawn such that the coordinate system matches the assembly target recognized by the recognition unit 5.
  • the virtual image of the component with respect to the relay box 55 shown in FIG. 6A is drawn at a position and orientation corresponding to a state where the component is assembled to the relay box 55.
  • FIG. 6B shows only the virtual image of the part drawn here.
  • the second function of the drawing unit 6 is to draw information on the number (use number) of parts used in the assembling work near the relay box 55, and the third function is to draw the information on the number near the storage location.
  • when the surface on which each component is assembled to the relay box 55 is planar, it is preferable to arrange the virtual images of the components on that plane.
  • not only the number of parts but also the part number and the appearance of the parts may be displayed together, so that the storage state may be transmitted in a more easily understood manner.
  • the determination unit 7 automatically determines whether or not the content of the work is appropriate when the work of assembling the parts is completed.
  • this determination is made based on the captured image of the imaging device 16 and the identification information (work number). For example, each component in the captured image may be specified, and the position and orientation of each component with respect to the assembly target may be estimated and compared with the assembly pattern. Alternatively, an image of the completed state prepared in advance may be compared with the captured image.
  • the determining unit 7 of the present embodiment determines the assembly state of the parts by using the learned auto encoder. Specifically, the captured image is input to the auto encoder, the output image is obtained, and the difference between the input image and the output image is calculated.
  • as the learning data of the auto-encoder, thousands of images obtained by photographing, from various positions and angles, an assembly target in which the components are correctly assembled are used.
  • if the components are correctly assembled, the output image is almost the same as the input image. If there is an improperly assembled part, the image at that part is converted into an "image close to the correct assembled state" and output.
  • the determination unit 7 calculates the difference between the input image and the output image, and identifies locations where the difference is equal to or more than a predetermined threshold as "locations having a high degree of abnormality".
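The thresholding step alone can be sketched as follows (the learned auto-encoder itself is not reproduced here); frames are treated as 2-D grids of gray values, and any pixel whose reconstruction error meets the threshold is flagged.

```python
# Sketch of the difference-and-threshold step; image values are illustrative.

def anomalous_pixels(input_img, output_img, threshold):
    """Return (row, col) of pixels where |input - reconstruction| >= threshold."""
    return [
        (row, col)
        for row, (in_row, out_row) in enumerate(zip(input_img, output_img))
        for col, (a, b) in enumerate(zip(in_row, out_row))
        if abs(a - b) >= threshold
    ]

captured = [[10, 10, 10], [10, 200, 10]]       # frame with one odd pixel
reconstructed = [[10, 10, 10], [10, 40, 10]]   # auto-encoder pulls it toward normal
hot = anomalous_pixels(captured, reconstructed, threshold=50)  # → [(1, 1)]
```

The flagged coordinates are what the highlighting unit would paint red over the real scene.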
  • the highlighting unit 8 highlights the "locations having a high degree of abnormality" determined by the determination unit 7.
  • a part where the assembly state of the parts does not match the assembly pattern is highlighted so as to conform to the coordinate system of the actual scene in the field of view of the worker.
  • the highlighting unit 8 of the present embodiment has a function of displaying, on the display device 13, an image in which the "locations having a high degree of abnormality" are highlighted in red. As a result, as shown in FIG. 8, portions with an improper assembling state are emphasized, and the presence of the assembling error is conveyed to the worker 51 in an easily understandable manner.
  • FIG. 9 is a flowchart for explaining the flow of the component assembling process.
  • the extraction unit 3 extracts the work number described in the work instruction 54 (step A4).
  • the acquisition unit 4 refers to the first database 1 and acquires an assembly pattern of the part corresponding to the work number (step A5).
  • in step A8, it is determined whether or not the marker 58 representing the location code has been recognized in the captured image.
  • when the marker 58 is detected, information on the storage location and the number of used parts is displayed on the display device 13 of the smart glass 10, as shown in FIG. 7B.
  • the worker 51 can take out the parts from the material storage shelf 53 while referring to this, and prepare for the assembling work.
  • in step A10, it is determined whether or not the component assembly target has been recognized.
  • when the assembly target is recognized, a virtual image of the component is displayed on the display device 13 as shown in FIG. 6. The worker 51 can assemble the parts while referring to this.
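The flow of steps A4-A12 can be compressed into one illustrative function (an editorial sketch, not the patent's implementation): the recognizers and the database are injected as plain callables and dicts because the real versions are image-processing components.

```python
# Sketch of one pass of the support loop; all names are illustrative.

def support_step(frame, db, extract_work_no, marker_seen, target_seen):
    overlays = []
    work_no = extract_work_no(frame)          # step A4: read the work number
    if work_no is None:
        return overlays
    patterns, stock = db[work_no]             # step A5: database lookup
    if marker_seen(frame):                    # step A8: storage-location marker
        overlays.append(("counts", {p: s["use_count"] for p, s in stock.items()}))
    if target_seen(frame):                    # step A10: assembly target
        overlays.append(("virtual_parts", sorted(patterns)))
    return overlays

db = {"ABCDE01234": ({"a-01": {}, "a-02": {}}, {"a-01": {"use_count": 3}})}
out = support_step(
    "frame", db,
    lambda f: "ABCDE01234",  # stub extraction unit
    lambda f: True,          # stub marker recognition
    lambda f: True,          # stub target recognition
)
# out == [("counts", {"a-01": 3}), ("virtual_parts", ["a-01", "a-02"])]
```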
  • FIG. 10 is a flowchart for explaining the flow of the inspection process at the time of completing the assembly of the components.
  • the inspection process may be started by the worker 51 performing a predetermined input operation (for example, a predetermined voice instruction or a button operation) when the assembling work is completed.
  • the completion of the assembling work of the components may be automatically detected by image analysis, and the inspection process may be started.
  • in step B1, various information related to the input operation by the worker 51 is input.
  • when a predetermined inspection start condition is satisfied (for example, a predetermined voice instruction is given), the image captured by the image capturing device 16 is input to the auto encoder as the input image, and the output image is obtained.
  • in step B3, the difference between the input image and the output image is calculated, and portions where the difference is equal to or larger than the threshold are highlighted. The worker 51 can correct the improper assembling state while referring to this.
  • The identification information (work number) contained in the work instruction sheet 54 is extracted, and a virtual image of the part is drawn at the position and orientation corresponding to the identification information.
  • The relationship between the identification information (work number) and the component assembly pattern is defined in advance in the first database 1.
  • The relationship between the identification information (work number) and the storage location and number of parts used is defined in advance in the second database 2.
  • A motion sensor (acceleration sensor, direction sensor, gyro sensor, geomagnetic sensor, etc.) may be used.
  • This improves the superimposition accuracy and enhances the realism and sense of presence of the virtual image.
  • The minimum configuration of the manufacturing support system, method, and program in this case includes the first database 1, the extraction unit 3, the acquisition unit 4, the recognition unit 5, and the drawing unit 6.
  • The device in which these elements reside can be chosen arbitrarily, and the functions can be distributed across a plurality of devices. When functions are distributed, the devices need only be connected so that they can communicate with one another by wire or wirelessly.
  • For example, the smart glass 10 may be in charge of the elements related to drawing on the display device 13 (for example, the drawing unit 6 and the highlighting unit 8), while the other elements (for example, the extraction unit 3, the acquisition unit 4, the recognition unit 5, and the determination unit 7) are assigned to the assembling work computer 59.
  • The first database 1 and the second database 2 may be stored in the production command server 60.
  • The production command server 60 is a large computer that controls the operating state of the production line and the operation of the production robots, and is connected to the assembling work computer 59 by a wired cable.
  • The assembling work computer 59 is a computer placed near the work place where the assembling work is performed, and is wirelessly connected to the smart glass 10 via the repeater 36.
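The inspection flow sketched in the bullets above (captured image in, autoencoder reconstruction out, highlight where the difference meets the threshold) can be illustrated as follows. This is a minimal sketch only: the trained autoencoder is stubbed out as a plain function, images are represented as 2D lists of grayscale values, and the names `reconstruct` and `highlight_mask` are hypothetical, not taken from the publication.

```python
# Sketch of the inspection step: an autoencoder trained on images of
# correctly assembled units reconstructs its input; pixels whose
# reconstruction error is at or above a threshold are flagged for
# highlighting on the display.

def reconstruct(image):
    """Stand-in for a trained autoencoder's forward pass (hypothetical).

    A real autoencoder reproduces 'normal' (correctly assembled)
    appearance well and fails to reproduce anomalies; this stub simply
    returns what a correct assembly would look like.
    """
    normal_appearance = [
        [10, 10, 10],
        [10, 50, 10],
        [10, 10, 10],
    ]
    return normal_appearance

def highlight_mask(input_image, threshold):
    """Return a boolean mask marking pixels whose reconstruction
    error is >= threshold (candidate assembly defects)."""
    output_image = reconstruct(input_image)
    return [
        [abs(i - o) >= threshold for i, o in zip(row_in, row_out)]
        for row_in, row_out in zip(input_image, output_image)
    ]

# A captured image with one anomalous region (e.g. a wrongly fitted part):
captured = [
    [10, 10, 10],
    [10, 200, 10],   # center pixel deviates strongly from normal
    [10, 10, 10],
]
mask = highlight_mask(captured, threshold=100)
# Only the center pixel meets the threshold and would be highlighted.
```

In a deployed system the mask would be rendered by the highlighting unit 8 in the coordinate system of the actual scene, rather than printed.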

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Manufacturing & Machinery (AREA)
  • General Physics & Mathematics (AREA)
  • Economics (AREA)
  • Human Resources & Organizations (AREA)
  • Automation & Control Theory (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Mechanical Engineering (AREA)
  • General Health & Medical Sciences (AREA)
  • Quality & Reliability (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • General Factory Administration (AREA)

Abstract

In the present invention, a worker's head is fitted with smart glasses that incorporate a photographing device for capturing the actual scene in the worker's field of view and a display device for superimposing a virtual image on that scene. A first database (1) defines the relationship between assembly patterns of a plurality of components with respect to an object to be assembled and the identification information corresponding to each pattern. A control device causes the display device to display a virtual image of the components so as to match the coordinate system of the actual scene in the worker's field of view. The control device is provided with: an extraction unit (3) for extracting identification information contained in a work instruction sheet; an acquisition unit (4) for acquiring the assembly pattern of the components corresponding to the identification information; a recognition unit (5) for recognizing the object to be assembled from the image captured by the photographing device; and a drawing unit (6) for drawing a virtual image of the components at their completed positions and orientations.

Description

Manufacturing support system, method, and program
 The present invention relates to a manufacturing support system, method, and program for assisting a worker in assembling parts.
 Conventionally, at production sites in the manufacturing industry, systems are known that provide work support by having a worker wear a light-transmissive head-mounted display and superimposing information such as work contents and work procedure instructions on the actual scene. For example, in product inspection, a technique has been proposed that supports the inspection operation by displaying a passing image (an image of an inspection object judged acceptable in the past) near the actual inspection object (real image). It has also been proposed to improve workability in component assembly by enclosing a specific location in a circle for emphasis, or by displaying the routing path of wiring material. By providing such support, the efficiency of visual inspection and assembly work can be improved (see Patent Documents 1 and 2).
Patent Document 1: International Publication No. 2017/033561; Patent Document 2: JP 2018-014218 A
 At assembly work sites, work contents are sometimes indicated using technical terms or codes that are difficult to decipher (combinations of multiple symbols used to convey information), and for a novice it can be difficult even to grasp the work contents accurately. In work in which a plurality of components are assembled onto an assembly target, the layout of the components may look different depending on the posture and orientation of the assembly target as seen by the worker. Therefore, even if the assembled state of the parts looks different from the passing image, it may actually be appropriate. Further, since a plurality of parts must be checked at the same time, an improper assembly state may be overlooked in visual inspection. Thus, conventional work support systems have the problem that assembly errors occur easily in work where a plurality of parts are assembled, and work efficiency is difficult to improve.
 One object of the present invention, which was conceived in view of the above problems, is to provide a manufacturing support system, method, and program capable of improving work efficiency in component assembly. The invention is not limited to this object; achieving operations and effects that are derived from the configurations shown in the embodiments described later, and that cannot be obtained with conventional techniques, can also be positioned as another object of the present invention.
 (1) The disclosed manufacturing support system is a manufacturing support system for assisting a worker in assembling parts. The system includes smart glasses worn on the worker's head, incorporating a photographing device for capturing the actual scene in the worker's field of view and a display device for superimposing a virtual image on the actual scene. The system also includes a first database that defines the relationship between assembly patterns of a plurality of components with respect to an assembly target and the identification information corresponding thereto. The system further includes a control device that causes the display device to display a virtual image of the components so as to match the coordinate system of the actual scene in the worker's field of view.
 The control device includes an extraction unit, an acquisition unit, a recognition unit, and a drawing unit. The extraction unit extracts, from an image captured by the photographing device, the identification information contained in a work instruction sheet describing the contents of the assembly work. The acquisition unit acquires the assembly pattern of the components corresponding to the identification information based on the first database. The recognition unit recognizes the assembly target from the image captured by the photographing device. The drawing unit draws a virtual image of each component at the position and orientation it will have when assembly onto the assembly target is complete.
 (2) Preferably, the system includes a second database defining the relationship between the storage location of the components, the number of components to be assembled onto the assembly target (the number used), and the identification information. In this case, the acquisition unit acquires, based on the second database, information on the storage location and the number of the components corresponding to the identification information. The recognition unit recognizes the storage location from the image captured by the photographing device, and the drawing unit displays the information on the number of components near the storage location.
 (3) Preferably, the recognition unit recognizes a marker installed at the storage location in the captured image.
 (4) Preferably, the control device includes a determination unit and a highlighting unit. In this case, the determination unit determines, based on the image captured by the photographing device and the identification information, whether or not the assembly state of the components matches the assembly pattern. The highlighting unit highlights any part where the assembly state of the components does not match the assembly pattern, so as to conform to the coordinate system of the actual scene in the worker's field of view.
 By extracting the identification information contained in the work instruction sheet and drawing virtual images of the components at the corresponding positions and orientations, a model of the correct assembly can be presented even when the type, number, and layout of the components to be assembled differ from one work instruction sheet to another, thereby assisting the assembly work. In addition, the worker no longer needs to memorize the assembly state of different components for each work instruction sheet, so assembly errors can be reduced and productivity (work efficiency) improved.
 FIG. 1 is a diagram showing an application target of the manufacturing support system.
 FIG. 2 is a diagram for explaining the configuration of the manufacturing support system.
 FIG. 3 is a block diagram showing the hardware configuration of the manufacturing support system.
 FIG. 4 is a block diagram showing the software configuration of the manufacturing support program.
 FIGS. 5(A) and 5(B) are configuration examples of databases.
 FIGS. 6(A) to 6(C) are diagrams for explaining display examples of virtual images of components.
 FIGS. 7(A) and 7(B) are diagrams for explaining display examples of the number of components.
 FIG. 8 is a diagram for explaining a highlighted display example of inspection results.
 FIG. 9 is a flowchart for explaining the manufacturing support method in the assembly process.
 FIG. 10 is a flowchart for explaining the manufacturing support method in the inspection process.
 FIGS. 11(A) and 11(B) are diagrams for explaining modifications of the manufacturing support system.
 Hereinafter, a manufacturing support system, method, and program according to an embodiment will be described with reference to the drawings. The embodiment described below is merely an example, and there is no intention to exclude various modifications or applications of techniques not explicitly described in it. Each configuration of the present embodiment can be implemented with various modifications without departing from its spirit, and configurations can be selected or combined as needed.
 The manufacturing support system of the present embodiment is a system that uses a wearable gadget to assist a worker 51 in assembling parts, and is applied to the manufacturing process of an automobile 50 as shown in FIG. 1. Here, support for the worker 51 taking parts out of a material storage shelf 53 and assembling them onto the automobile 50 will be described. Specific examples of parts to be assembled include electrical components such as relays, switches, and lamps, as well as glass, bumpers, powertrains, seats, and tires. Specific examples of assembly targets (devices onto which parts are assembled) include a relay box, a switch box, an instrument panel, a door, and a vehicle body. Assembly targets and part types vary widely, and the manufacturing support system, method, and program of the present invention can be applied to the assembly of any parts.
 The work support consists of providing visual information using the smart glass 10 worn on the head of the worker 51. As shown in FIG. 2, the smart glass 10 is fixed to a helmet 52 of the worker 51. The smart glass 10 incorporates a display device 13 and a photographing device 16. The photographing device 16 is a camera that captures the actual scene in the field of view of the worker 51, the work instruction sheet 54, and the like. The display device 13 is a device that superimposes a virtual image on the actual scene in the worker's field of view. The display method may project the virtual image onto the retina of the worker 51 (retinal projection type), project it onto the surface of a half mirror (transmissive type), or display a composite of the actual scene captured by the photographing device 16 and the virtual image (non-transmissive type).
 In any of these methods, various virtual images are displayed so as to conform to the coordinate system of the actual scene seen from the viewpoint of the worker 51. A virtual image is a perspective projection image conforming to the coordinate system of the actual scene (or a deformed version of that projection), rendered, for example, as a simple polygon model based on three-dimensional CAD (Computer-Aided Design) data of the part. In existing display methods such as those of Patent Documents 1 and 2, the drawn objects are two-dimensional. In the present display method, by contrast, the drawn objects are three-dimensional, and each is virtually arranged within the coordinate system of the actual scene.
 The content of the work support is controlled by an arbitrary control device (electronic computer). The "electronic control device" here may be built into the smart glass 10 or prepared separately from it. For example, a program may be installed on a server 30 or workstation on a network 19, with the display content computed there shown on the smart glass 10. In this case, as shown in FIG. 2, a server 30 connected to the network 19 via a repeater 36 may be used. Alternatively, a program that operates even without a connection to the network 19 may be installed on the smart glass 10.
[1. Hardware configuration]
 FIG. 3 is a block diagram showing the hardware configuration of the manufacturing support system realized by cooperation between the smart glass 10 and the server 30. The smart glass 10 includes a recording medium drive 11 for reading a program recorded on a recording medium 18 (removable medium), a storage device 12, a display device 13, an input device 14, a communication device 15, a photographing device 16 (camera), a microphone 17, and an electronic control device 20. Similarly, the server 30 incorporates a recording medium drive 31, a storage device 32, a display device 33, an input device 34, a communication device 35, and an electronic control device 40. The function of each element is almost the same as that of the identically named element built into the smart glass 10, so description is omitted.
 The recording medium 18 is, for example, a flash memory card, a semiconductor memory device conforming to the Universal Serial Bus standard, or an optical disc. The recording medium drive 11 is a reading device that reads information (programs and data) recorded on the recording medium 18. The storage device 12 is a memory device storing data and programs held over the long term, and includes, for example, non-volatile memory such as flash memory, EEPROM (Electrically Erasable Programmable Read-Only Memory), and solid-state drives.
 The display device 13 is a device for providing visual information to the worker 51. In a retinal projection type device, the retina of the worker 51 serves as the image plane. In a transmissive or non-transmissive device, the image plane is the surface of a half mirror or display (for example, a liquid crystal display (LCD) or an organic electro-luminescence display (OELD)) placed within the field of view of the worker 51.
 The input device 14 is a pointing device for feeding input signals to the electronic control devices 20 and 40. As specific examples of the input device 14, devices such as a wireless keyboard and a wireless mouse can be connected to the smart glass 10. Input devices 14 such as push buttons and physical keys are also provided on the temples (the side arms) of the smart glass 10. The microphone 17 is a device for inputting the voice of the worker 51 as an input signal.
 The communication device 15 manages communication between the smart glass 10 and the outside, and is, for example, a device for performing wireless communication with the server 30 via the network 19. The repeater 36 shown in FIG. 2 can be regarded as part of the communication device 15. When input/output devices conforming to a short-range wireless communication standard are connected to the smart glass 10, they are connected via the communication device 15.
 The photographing device 16 is a digital video camera with a built-in image sensor such as a CCD (Charge-Coupled Device) or CMOS (Complementary Metal-Oxide Semiconductor) sensor. The photographing device 16 is attached to the rim (the periphery of a lens) or the bridge (the part connecting the left and right lenses) of the smart glass 10, and functions to capture the field of view of the worker 51 wearing the smart glass 10. Information on the images captured by the photographing device 16 is transmitted to the electronic control device 20 and the server 30. The term "captured image" here includes not only still images but also single frames contained in moving images.
 The position and shooting direction of the photographing device 16 are fixed with respect to the smart glass 10, and the position of the display device 13 is also fixed with respect to the smart glass 10. Therefore, as long as the relationship between the smart glass 10 and the line of sight of the worker 51 is kept constant, the display device 13 can draw virtual images that appear to exist in the same environment as the actual scene captured by the photographing device 16 (that is, three-dimensional objects whose coordinate system matches the actual scene). A known AR (Augmented Reality) development library can be used to calculate the position and posture in three-dimensional space of a subject in an image captured by the photographing device 16, and a known three-dimensional renderer or AR drawing engine can be used to draw three-dimensional objects that match the actual scene.
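The core geometric operation behind drawing a three-dimensional object so that it conforms to the coordinate system of the actual scene is a perspective projection from camera coordinates to image coordinates. The following is a minimal pinhole-camera sketch; the intrinsic parameters `fx`, `fy`, `cx`, `cy` are illustrative values, not taken from the embodiment, and a real system would delegate this to an AR library as noted above.

```python
# Minimal pinhole-camera projection: a 3D point in the camera
# coordinate system (x right, y down, z forward, in meters) is
# projected to pixel coordinates on the display.

def project_point(point, fx, fy, cx, cy):
    """Project (x, y, z) in camera coordinates to (u, v) pixels.

    fx, fy: focal lengths in pixels; cx, cy: principal point.
    """
    x, y, z = point
    if z <= 0:
        raise ValueError("point is behind the camera")
    u = fx * x / z + cx
    v = fy * y / z + cy
    return (u, v)

# A vertex of a part's polygon model, 2 m in front of the camera and
# 0.5 m to the right, lands to the right of the image center with
# these illustrative intrinsics.
u, v = project_point((0.5, 0.0, 2.0), fx=800, fy=800, cx=640, cy=360)
# → (840.0, 360.0)
```

An AR library applies the same projection to every vertex of the polygon model, after first transforming the model from the assembly target's coordinate system into camera coordinates using the recognized pose.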
 The electronic control device 20 incorporates a processor 21 (central processing unit), a memory 22 (main memory), an auxiliary storage device 23, an interface device 24, and the like, which are communicably connected to one another via a bus 25. The processor 21 is a central processing unit incorporating a control unit (control circuit), an arithmetic unit (arithmetic circuit), cache memory (register file), and the like. The memory 22 is a storage device in which programs and working data are stored, and includes, for example, ROM (Read-Only Memory) and RAM (Random Access Memory).
 The auxiliary storage device 23 is a memory device storing data and firmware held for longer than in the memory 22, and includes, for example, non-volatile memory such as flash memory and EEPROM. The interface device 24 manages input and output (I/O) between the electronic control device 20 and the outside. The electronic control device 20 is connected via the interface device 24 to the recording medium drive 11, the storage device 12, the display device 13, the input device 14, the communication device 15, and so on.
 The electronic control device 40 of the server 30 likewise incorporates a processor 41 (central processing unit), a memory 42 (main memory), an auxiliary storage device 43, an interface device 44, and the like, which are communicably connected to one another via a bus 45. The function of each element is almost the same as that of the identically named element built into the electronic control device 20, so description is omitted. There is no intention to limit the specific hardware configuration of the electronic control devices to the above; various known configurations may be applied.
[2. Software configuration]
 FIG. 4 is a block diagram for explaining the processing contents of the manufacturing support program 9 executed on the smart glass 10 and the server 30. These processing contents are recorded as an application program in the storage devices 12 and 32 of the smart glass 10 and the server 30, on the recording medium 18, and so on, and are loaded into memory and executed. Classifying the processing functionally, the manufacturing support program 9 includes an extraction unit 3, an acquisition unit 4, a recognition unit 5, a drawing unit 6, a determination unit 7, and a highlighting unit 8. A first database 1 and a second database 2 are provided as databases that these elements can reference.
 The first database 1 defines the relationship between assembly patterns of components with respect to an assembly target and the corresponding identification information. The identification information is information for specifying the work content instructed to the worker 51, and is called, for example, a work number, a work target vehicle number, or an instruction target vehicle number. In the present embodiment, as shown in FIG. 2, the work number is assumed to be written on a work instruction sheet 54 presented by a work manager. The work instruction sheet 54 is presented by the work manager before the worker 51 performs the assembly work, or is affixed to the automobile 50 to be assembled.
 The contents of the first database 1 are illustrated in FIG. 5(A). The first database 1 defines at least the relationship between a work number identifying an assembly operation and the information (assembly position, assembly orientation) specifying the layout (assembly pattern) of the components to be assembled in that operation. When multiple types of components are assembled in an operation, assembly patterns for multiple component types are set for a single work number. In the example shown in FIG. 5(A), assembly patterns for three types of components (a-01, a-02, a-03) are set for the work number "ABCDE01234".
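The structure of FIG. 5(A) can be sketched as a simple lookup table keyed by work number, where each entry lists, for each component, its assembly position and orientation (three parameters each, per the parameterization described below). The dictionary layout, field names, and numeric values here are illustrative only, not taken from the publication.

```python
# Illustrative first database: work number -> list of component
# placements. Position is (x, y, z) and orientation is
# (roll, pitch, yaw) in degrees, both expressed in the coordinate
# system fixed to the assembly target (e.g. a relay box).

FIRST_DATABASE = {
    "ABCDE01234": [
        {"part": "a-01", "position": (0.02, 0.00, 0.01), "orientation": (0.0, 0.0, 0.0)},
        {"part": "a-02", "position": (0.05, 0.00, 0.01), "orientation": (0.0, 0.0, 90.0)},
        {"part": "a-03", "position": (0.08, 0.01, 0.01), "orientation": (0.0, 0.0, 0.0)},
    ],
}

def get_assembly_pattern(work_number):
    """Acquisition-unit behavior: return the assembly pattern for a
    work number, or None if the number is unknown."""
    return FIRST_DATABASE.get(work_number)

pattern = get_assembly_pattern("ABCDE01234")
# Three component placements are returned for this work number; the
# drawing unit would render one virtual image per entry.
```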
 If the model, grade, or destination market of the automobile 50 to be worked on differs, the component types and assembly patterns also differ. In general, the number of assembly patterns a worker 51 must master in automobile assembly ranges from tens to hundreds, and memorizing them is said to require months to years of experience. In the present embodiment, by contrast, all of these patterns are defined in the first database 1 and linked to work numbers, freeing the worker 51 from having to memorize assembly patterns.
 The assembly position of a component can be specified by at least three parameters relative to a coordinate system fixed to the assembly target (for example, a relay box). Similarly, the assembly orientation of a component can be specified by at least three parameters relative to that coordinate system. Therefore, the layout of one component can be uniquely specified using six or more parameters.
The second database 2 defines the relationship between the identification information (work number) and the storage location and usage count (number of components to be assembled) of each component. The storage location is, for example, information indicating which of the many component trays stored in the material storage shelf 53 shown in FIG. 1 holds the component. If the storage location of a component used in a given operation is always fixed, the location can be specified by at least three parameters. If, on the other hand, the storage location may change depending on the work situation, it is preferable to add information specifying the storage location to each record of the second database 2.
In the example shown in FIG. 5(B), the usage count of component "a-01" used in work number "ABCDE01234" is 3, and its storage location is the location code "aA11bB22cC33dD44". A character string, symbol, barcode, two-dimensional code, or the like representing the location code "aA11bB22cC33dD44" is posted at the place where the component is actually stored. By assigning a distinct location code to each component, the storage location of every component can be uniquely identified.
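The second database can be sketched the same way, keyed by the (work number, part code) pair from FIG. 5(B); the field names are again hypothetical:

```python
# Minimal sketch of the second database: each (work number, part code)
# pair maps to a usage count and a location code, as in FIG. 5(B).
SECOND_DATABASE = {
    ("ABCDE01234", "a-01"): {"count": 3, "location": "aA11bB22cC33dD44"},
}

def storage_info(work_number, part_code):
    """Return the usage count and location code for one component."""
    return SECOND_DATABASE[(work_number, part_code)]
```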
The extraction unit 3 extracts the identification information (work number) contained in the work instruction sheet 54, which describes the contents of the assembling operation, from an image captured by the imaging device 16. The work instruction sheet 54 is assumed to be photographed before the worker 51 starts assembling the components. The extraction unit 3 of the present embodiment has the function of extracting the work number by reading the barcode on the work instruction sheet 54 shown in FIG. 2. The extracted work number is transmitted to the acquisition unit 4.
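With the actual barcode decoding delegated to a reader, the extraction step reduces to validating the decoded payloads. In this sketch the ten-character work-number format (five uppercase letters, five digits) is an assumption inferred from the example "ABCDE01234":

```python
import re

# Assumed format, inferred from the example "ABCDE01234":
# five uppercase letters followed by five digits.
WORK_NUMBER_PATTERN = re.compile(r"^[A-Z]{5}\d{5}$")

def extract_work_number(decoded_payloads):
    """Return the first decoded barcode payload that looks like a work
    number, or None if no work instruction sheet is in view."""
    for payload in decoded_payloads:
        if WORK_NUMBER_PATTERN.match(payload):
            return payload
    return None
```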
The acquisition unit 4 has two functions: information acquisition based on the first database 1 and information acquisition based on the second database 2. The acquired information is transmitted to the drawing unit 6.
The former function acquires, from the first database 1, the assembling pattern of the components corresponding to the identification information (work number) extracted by the extraction unit 3. For example, when the extracted work number is "ABCDE01234", the assembling pattern (assembling position and orientation) of each of the three corresponding component types (a-01, a-02, a-03) is acquired from the first database 1 shown in FIG. 5(A).
The latter function acquires, from the second database 2, the storage location and usage count of the components corresponding to the identification information (work number). For example, based on the second database 2 shown in FIG. 5(B), the location code and usage count are acquired for each of the three component types (a-01, a-02, a-03).
The recognition unit 5 recognizes specific objects and marks present in the image captured by the imaging device 16. It has at least the function of recognizing the assembly target of the components. For example, in a relay assembling operation (work number "ABCDE01234" in FIG. 5(A)), it determines whether the relay box 55 into which the relay is to be assembled appears in the captured image, referring to three-dimensional CAD data of the relay box 55. As shown in FIG. 6(A), when the relay box 55 is detected in the captured image, the position and attitude of the relay box 55 relative to the shooting position (the position of the imaging device 16) are estimated from the distribution of feature points in the image (points and vertices along the edges of the relay box 55). This specifies how the relay box 55 appears within the field of view of the worker 51. The position and attitude of the assembly target specified here are transmitted to the drawing unit 6 and referred to when drawing the components.
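As a toy illustration of the idea, the 2D case with two matched feature points is enough to recover an in-plane rotation and translation; a real implementation would estimate a full 3D pose from many feature points against the CAD model. Everything below is a simplified stand-in, not the patented method itself.

```python
import math

def estimate_pose_2d(ref_points, img_points):
    """Estimate (angle_deg, tx, ty) mapping two reference feature points
    of the assembly target onto their detected image positions.
    Toy 2D stand-in for the 3D pose estimation described above."""
    (rx0, ry0), (rx1, ry1) = ref_points
    (ix0, iy0), (ix1, iy1) = img_points
    # Rotation: difference between the directions of the two segments.
    ref_angle = math.atan2(ry1 - ry0, rx1 - rx0)
    img_angle = math.atan2(iy1 - iy0, ix1 - ix0)
    angle = img_angle - ref_angle
    # Translation: move the first reference point onto its detection.
    c, s = math.cos(angle), math.sin(angle)
    tx = ix0 - (c * rx0 - s * ry0)
    ty = iy0 - (s * rx0 + c * ry0)
    return math.degrees(angle), tx, ty
```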
The recognition unit 5 also has the function of recognizing component storage locations in the captured image. For example, in the relay assembling operation (work number "ABCDE01234" in FIG. 5(B)), the character string, symbol, or two-dimensional code representing the location code "aA11bB22cC33dD44" is recognized. As shown in FIG. 7(A), when a marker 58 representing a location code is detected in the captured image, the position and attitude of the marker 58 relative to the shooting position are estimated from the marker's position and shape. The storage-location information recognized here is also transmitted to the drawing unit 6.
The drawing unit 6 controls the content displayed on the display device 13 during component assembly and has three main functions. The first is to draw a virtual image of a component in the completed position and orientation it will occupy once assembled to the assembly target. The "virtual image" here shows the state at completion of assembly (a completed view) and serves as a model for the assembling operation. By working while referring to this model, even a worker 51 with little experience is less likely to make assembling errors, and work efficiency improves.
The virtual image of the component is drawn so that its coordinate system matches the assembly target recognized by the recognition unit 5. For example, the virtual image of a component for the relay box 55 shown in FIG. 6(A) is drawn in the position and attitude the component would have when assembled to the relay box 55. FIG. 6(B) shows only the virtual image of the component drawn here. By superimposing the virtual image on the relay box 55, the component can be made to appear as if it were present on the relay box 55, as shown in FIG. 6(C).
The second function of the drawing unit 6 is to draw information on the number of components used in the assembling operation (the usage count) near the relay box 55; the third is to draw that count near the storage location. When displaying the count near the relay box 55, it is preferable to draw it at a position that does not overlap the components assembled to the relay box 55, as shown in FIG. 6(B). Moreover, if the surface of the relay box 55 on which the components are mounted is planar, it is preferable to arrange the virtual images of the components on that plane. Similarly, when drawing the count near the storage location, it is preferable to draw it at a position that does not overlap the marker 58 or the component tray, as shown in FIG. 7(B). Displaying not only the count but also the part number, the component's appearance, and so on can convey the storage state even more clearly.
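The "draw where nothing is covered" preference can be sketched as a rectangle-intersection test over candidate label positions; the candidate offsets and rectangle convention are placeholders of my own.

```python
def overlaps(a, b):
    """Axis-aligned rectangles given as (x, y, width, height)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def place_count_label(label_size, obstacles, candidates):
    """Return the first candidate position where a label of the given
    size overlaps none of the obstacle rectangles (components, marker,
    component tray), or None if no candidate is clear."""
    w, h = label_size
    for (x, y) in candidates:
        if not any(overlaps((x, y, w, h), obs) for obs in obstacles):
            return (x, y)
    return None
```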
The determination unit 7 automatically determines, when the assembling of the components is completed, whether the work was performed correctly. Based on the captured image and the identification information (work number), it determines whether the assembled state of the components matches the assembling pattern defined in the first database 1. For example, each component in the captured image may be identified and its position and attitude relative to the assembly target estimated for comparison with the assembling pattern. Alternatively, the captured image may be compared with a previously prepared image of the completed state.
The determination unit 7 of the present embodiment uses a trained autoencoder to judge the assembled state of the components. Specifically, the captured image is fed to the autoencoder, its output image is obtained, and the difference between the input and output images is computed. As training data for the autoencoder, thousands of images of a correctly assembled target photographed from various positions and angles are used. When a captured image is fed to such an autoencoder, the output image is nearly identical to the input if the assembled state in the image is correct. If some part of the assembly is incorrect, however, the image of that part is transformed into something close to the correct assembled state before being output. An incorrectly assembled part can therefore be detected as a difference between the autoencoder's input and output images. The determination unit 7 thus computes the difference between the input and output images and identifies regions where the difference is equal to or greater than a predetermined threshold as regions of high anomaly.
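Independently of the autoencoder architecture itself, the final thresholding step reduces to a per-pixel comparison of the input image and the reconstruction. A minimal sketch over grayscale images represented as nested lists:

```python
def anomaly_regions(input_image, output_image, threshold):
    """Return (row, col) pixels where the absolute difference between
    the captured image and the autoencoder's reconstruction is at or
    above the threshold -- the "high anomaly" locations."""
    return [
        (r, c)
        for r, row in enumerate(input_image)
        for c, value in enumerate(row)
        if abs(value - output_image[r][c]) >= threshold
    ]
```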
The highlighting unit 8 highlights the high-anomaly regions determined by the determination unit 7. Regions where the assembled state of the components does not match the assembling pattern are highlighted so as to conform to the coordinate system of the real scene within the worker's field of view. The highlighting unit 8 of the present embodiment has the function of displaying, on the display device 13, an image in which the high-anomaly regions are highlighted in red. As shown in FIG. 8, incorrectly assembled regions are thereby emphasized, and the presence of an assembling error is conveyed to the worker 51 in an easily understood way.
[3. Flowchart]
FIG. 9 is a flowchart explaining the flow of the component assembling process. Flag A in this flow indicates whether the identification information (work number) has been read; in the initial state, before the identification information is read, A = 0. First, an image is captured by the imaging device 16 and its contents are recognized (step A1). The value of flag A is then checked (step A2), and when A = 0, it is determined whether the work instruction sheet 54 has been recognized in the captured image (step A3).
When the work instruction sheet 54 is recognized, the extraction unit 3 extracts the work number written on it (step A4). The acquisition unit 4 then refers to the first database 1 and acquires the assembling pattern of the components corresponding to the work number (step A5), and refers to the second database 2 to acquire the storage location and usage count of those components (step A6). Flag A is then set to A = 1 (step A7). In subsequent computation cycles, control proceeds from step A1 to step A8.
In step A8, it is determined whether a marker 58 representing a location code has been recognized in the captured image. When the marker 58 is detected, information on the storage location and usage count of the components is displayed on the display device 13 of the smart glasses 10, as shown in FIG. 7(B). Referring to this, the worker 51 can take the components out of the material storage shelf 53 and prepare for the assembling operation. In step A10, it is determined whether the assembly target of the components has been recognized. When the assembly target is recognized, virtual images of the components are displayed on the display device 13, as shown in FIG. 6(C), and the worker 51 can assemble the components while referring to them.
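The flag-A control flow of FIG. 9 can be sketched as a per-cycle update function. The recognition results and the pattern/storage strings below are hypothetical stand-ins for the actual recognition and acquisition units.

```python
def process_cycle(state, frame):
    """One computation cycle of the FIG. 9 flow. `state` carries flag A
    and the acquired data; `frame` is a dict of recognition results for
    the current captured image (a stand-in for the recognition unit)."""
    if state["A"] == 0:                      # steps A2-A3
        work_number = frame.get("work_number")
        if work_number is not None:          # steps A4-A7
            state["pattern"] = f"pattern for {work_number}"
            state["storage"] = f"storage info for {work_number}"
            state["A"] = 1
        return state
    if frame.get("marker_seen"):             # steps A8-A9
        state["display"] = state["storage"]
    if frame.get("target_seen"):             # step A10 onward
        state["display"] = state["pattern"]
    return state
```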
FIG. 10 is a flowchart explaining the flow of the inspection process performed when the assembling of the components is completed. The inspection process may be started by the worker 51 performing a predetermined input operation (for example, a voice command or a button press) when the assembling operation is finished. Alternatively, completion of the assembling operation may be detected automatically by image analysis and the inspection process started.
In the former case, various information related to the worker 51's input operation is entered in step B1. When a predetermined inspection start condition is satisfied (for example, a predetermined voice command is given) (step B2), the image captured by the imaging device 16 is fed to the autoencoder as the input image, and its output image is obtained (step B3). The difference between the input and output images is then computed, and regions where the difference is at or above the threshold are highlighted. Referring to this, the worker 51 can correct any incorrect assembled state.
[4. Operation and Effects]
(1) In the manufacturing assistance system, method, and program described above, the identification information (work number) contained in the work instruction sheet 54 is extracted, and virtual images of the components are drawn in the corresponding positions and orientations. The relationship between the identification information (work number) and the assembling patterns of the components is defined in advance in the first database 1. Consequently, even when the component types, usage counts, and layout differ from one work instruction sheet 54 to the next, a model of the component assembly can be presented in the correct position and orientation, supporting the assembling operation.
The worker 51 also no longer needs to memorize the different component assembling patterns for each work instruction sheet 54, which reduces assembling errors. Furthermore, since a model virtual image is displayed at all times during the assembling operation, there is no risk of the worker 51 learning an incorrect assembly, and the worker's proficiency can be raised quickly. Work efficiency (productivity) in component assembly is therefore improved.
(2) The relationship between the identification information (work number) and the storage locations and usage counts of the components is defined in advance in the second database 2. By using it to draw the usage count near the component storage location, component mix-ups can be prevented. For example, in an assembling operation where several components of similar appearance are present, picking errors (taking out the wrong component) can be reduced. The defect rate in the assembling operation can thus be lowered, further improving work efficiency.
(3) As shown in FIG. 7(A), specifying a component's storage location with a marker 58 makes it possible to respond flexibly to changes in component specifications or storage locations, further improving work efficiency. For example, when a component's storage location changes, the marker 58 can simply be moved together with the component. When the component types and storage locations do not change, assistance based on the appearance of the component or of the storage location may be provided instead of the marker 58.
(4) As shown in FIG. 8, highlighting a region with a high degree of assembly anomaly so that it conforms to the coordinate system of the real scene makes that region easy for the worker 51 to notice. Assembling errors can thus be corrected immediately, further improving work efficiency. In addition, since assembly defects caused by the worker 51's lapses of memory or misunderstandings are fed back immediately, the worker's proficiency can be raised quickly, enabling the training of personnel in a short period.
[5. Modifications]
The above embodiment is merely an example, and there is no intention to exclude various modifications or techniques not explicitly described herein. The configurations of the present embodiment can be modified in various ways without departing from their spirit, and can be selected or combined as necessary.
For example, a motion sensor (acceleration sensor, orientation sensor, gyro sensor, geomagnetic sensor, or the like) may be built into the smart glasses 10. By determining the attitudes of the display device 13 and the imaging device 16 precisely with a motion sensor, the real scene within the worker 51's field of view and the positions and attitudes of subjects can be estimated more accurately. This improves the accuracy with which virtual images are superimposed on the real scene and enhances their realism and sense of presence.
The minimum configuration of the manufacturing assistance system, method, and program of the present case comprises the first database 1, the extraction unit 3, the acquisition unit 4, the recognition unit 5, and the drawing unit 6. The device holding each of these elements can be chosen freely, and the functions can also be distributed across multiple devices. When the functions are distributed, it suffices for the devices to be connected so that they can communicate with each other by wire or wirelessly.
Regarding this distribution of functions, as shown in FIG. 11(A), the smart glasses 10 may be made responsible for the elements related to drawing on the display device 13 (for example, the drawing unit 6 and the highlighting unit 8), while the remaining elements (for example, the extraction unit 3, acquisition unit 4, recognition unit 5, and determination unit 7) are assigned to an assembly-work computer 59. The first database 1 and second database 2 may be stored on a production command server 60. The production command server 60 is a large computer that controls the operating state of the production line and the operation of production robots, and is connected to the assembly-work computer 59 by a wired cable. The assembly-work computer 59 is located near the workplace where the assembling operation is performed and is connected wirelessly to the smart glasses 10 via a repeater 36.
With such a configuration, virtual images matching the real scene can be displayed accurately on the display device 13 even when the processing power of the electronic control device 20 built into the smart glasses 10 is low. Placing the databases 1 and 2 on the production command server 60 also makes it easy to provide work assistance using databases already employed in the existing production system. If the smart glasses 10 can be expected to have sufficient processing power, a system that operates on the smart glasses 10 alone may be configured, as shown in FIG. 11(B). In that case, no communication with other computers is required, so work assistance is possible in an offline environment.
[6. Appendix]
The following appendices are disclosed with respect to the embodiment, including the modifications described above.
[Appendix 1]
A manufacturing assistance method for supporting the assembling of components by causing a display device to display a virtual image conforming to the coordinate system of a real scene, for a worker wearing on the head smart glasses incorporating an imaging device that photographs the real scene within the field of view and the display device, which superimposes virtual images on the real scene, the method comprising:
extracting, from an image captured by the imaging device, the identification information contained in a work instruction sheet describing the contents of the assembling operation;
acquiring, based on a first database defining the relationship between assembling patterns of a plurality of the components with respect to an assembly target and the corresponding identification information, the assembling pattern of the components corresponding to the identification information;
recognizing the assembly target in the image captured by the imaging device; and
drawing a virtual image of each component in the completed position and orientation in which the component is assembled to the assembly target.
[Appendix 2]
The manufacturing assistance method according to Appendix 1, further comprising:
acquiring, based on a second database defining the relationship between the storage location of the components, the number of the components to be assembled to the assembly target, and the identification information, information on the storage location and the number of the components corresponding to the identification information;
recognizing the storage location in the image captured by the imaging device; and
displaying the information on the number of the components near the storage location.
[Appendix 3]
The manufacturing assistance method according to Appendix 2, further comprising recognizing a marker installed at the storage location in the captured image.
[Appendix 4]
The manufacturing assistance method according to any one of Appendices 1 to 3, further comprising:
determining, based on the image captured by the imaging device and the identification information, whether the assembled state of the components matches the assembling pattern; and
highlighting a part where the assembled state of the components does not match the assembling pattern so as to conform to the coordinate system of the real scene within the worker's field of view.
[Appendix 5]
A manufacturing assistance program for supporting the assembling of components by causing a display device to display a virtual image conforming to the coordinate system of a real scene, for a worker wearing on the head smart glasses incorporating an imaging device that photographs the real scene within the field of view and the display device, which superimposes virtual images on the real scene, the program causing a computer to execute processing of:
extracting, from an image captured by the imaging device, the identification information contained in a work instruction sheet describing the contents of the assembling operation;
acquiring, based on a first database defining the relationship between assembling patterns of a plurality of the components with respect to an assembly target and the corresponding identification information, the assembling pattern of the components corresponding to the identification information;
recognizing the assembly target in the image captured by the imaging device; and
drawing a virtual image of each component in the completed position and orientation in which the component is assembled to the assembly target.
[Appendix 6]
The manufacturing assistance program according to Appendix 5, causing the computer to further execute processing of:
acquiring, based on a second database defining the relationship between the storage location of the components, the number of the components to be assembled to the assembly target, and the identification information, information on the storage location and the number of the components corresponding to the identification information;
recognizing the storage location in the image captured by the imaging device; and
displaying the information on the number of the components near the storage location.
[Appendix 7]
The manufacturing support program according to Appendix 6, the program further causing a computer to execute processing to recognize, in the captured image, a marker installed at the storage location.
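The marker recognition of Appendix 7 would in practice use a fiducial-marker library (ArUco tags are a common choice); the sketch below stubs out the detector and shows only the assumed mapping from marker IDs to the storage locations they label.

```python
# Sketch of marker-based storage-location recognition (Appendix 7).
# Detection is stubbed; only the marker-ID -> storage-location mapping
# is shown. The mapping and all names are illustrative assumptions.
MARKER_TO_STORAGE = {7: "shelf-A3", 12: "shelf-B1"}

def detect_marker_ids(captured_image):
    """Stub for a fiducial-marker detector; returns IDs found in frame."""
    return captured_image.get("marker_ids", [])

def recognize_storage_locations(captured_image):
    """Map detected markers to the storage locations they label,
    ignoring markers the system does not know about."""
    return {
        MARKER_TO_STORAGE[mid]
        for mid in detect_marker_ids(captured_image)
        if mid in MARKER_TO_STORAGE
    }

locations = recognize_storage_locations({"marker_ids": [7, 99]})
```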
[Appendix 8]
The manufacturing support program according to any one of Appendixes 5 to 7, the program further causing a computer to execute processing to:
determine, based on the image captured by the imaging device and the identification information, whether the assembly state of the part matches the assembly pattern; and
highlight any portion where the assembly state of the part does not match the assembly pattern, so as to conform to the coordinate system of the real scene in the worker's field of view.
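The match check in Appendix 8 amounts to comparing the observed placements against the expected assembly pattern and collecting the differences for highlighting. A minimal sketch, with the (part, position, angle) representation assumed for illustration:

```python
# Sketch of the check in Appendix 8: compare the assembly state observed
# in the captured image against the expected assembly pattern and report
# the mismatching portions as candidate regions to highlight.

def find_mismatches(expected_pattern, observed_state):
    """Return the expected placements that the observed state does not
    satisfy. Both arguments are lists of (part, position, angle)."""
    observed = set(observed_state)
    return [placement for placement in expected_pattern
            if placement not in observed]

expected = [("relay", (10, 20), 0), ("relay", (10, 40), 90)]
observed = [("relay", (10, 20), 0), ("relay", (10, 40), 0)]  # wrong angle
mismatches = find_mismatches(expected, observed)
```

Each returned placement would then be projected into the real-scene coordinate system and highlighted on the display device.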
DESCRIPTION OF REFERENCE SIGNS
1 First database
2 Second database
3 Extraction unit
4 Acquisition unit
5 Recognition unit
6 Drawing unit
7 Determination unit
8 Highlighting unit
9 Manufacturing support program
10 Smart glasses
13 Display device
16 Imaging device
30 Server
50 Automobile
51 Worker
52 Helmet
53 Material storage shelf
54 Work instruction sheet
55 Relay box (assembly target)
56 Relay (part)

Claims (6)

  1.  A manufacturing support system for supporting part assembly work by a worker, the system comprising:
     smart glasses worn on the head of the worker, incorporating an imaging device that photographs the real scene in the worker's field of view and a display device that superimposes virtual images on the real scene;
     a first database defining a relationship between a plurality of assembly patterns of a part with respect to an assembly target and identification information corresponding thereto; and
     a control device that causes the display device to display a virtual image of the part so as to conform to the coordinate system of the real scene in the worker's field of view,
     wherein the control device comprises:
     an extraction unit that extracts, from an image captured by the imaging device, the identification information contained in a work instruction sheet describing the content of the assembly work;
     an acquisition unit that acquires, based on the first database, the assembly pattern of the part corresponding to the identification information;
     a recognition unit that recognizes the assembly target in the captured image; and
     a drawing unit that draws a virtual image of the part at the completed position and orientation in which the part is assembled to the assembly target.
  2.  The manufacturing support system according to claim 1, further comprising a second database defining a relationship among the storage location of the part, the number of parts to be assembled to the assembly target, and the identification information,
     wherein the acquisition unit acquires, based on the second database, information on the storage location of the part and on the number of parts corresponding to the identification information,
     the recognition unit recognizes the storage location in the image captured by the imaging device, and
     the drawing unit displays the information on the number of parts near the storage location.
  3.  The manufacturing support system according to claim 2, wherein the recognition unit recognizes, in the captured image, a marker installed at the storage location.
  4.  The manufacturing support system according to any one of claims 1 to 3, wherein the control device further comprises:
     a determination unit that determines, based on the image captured by the imaging device and the identification information, whether the assembly state of the part matches the assembly pattern; and
     a highlighting unit that highlights any portion where the assembly state of the part does not match the assembly pattern, so as to conform to the coordinate system of the real scene in the worker's field of view.
  5.  A manufacturing support method for supporting part assembly work by causing the display device of smart glasses, worn on the head of a worker and incorporating an imaging device that photographs the real scene in the worker's field of view and a display device that superimposes virtual images on the real scene, to display virtual images conforming to the coordinate system of the real scene, the method comprising:
     extracting, from an image captured by the imaging device, identification information contained in a work instruction sheet describing the content of the assembly work;
     acquiring the assembly pattern of the part corresponding to the identification information, based on a first database defining a relationship between a plurality of assembly patterns of the part with respect to an assembly target and identification information corresponding thereto;
     recognizing the assembly target in the captured image; and
     drawing a virtual image of the part at the completed position and orientation in which the part is assembled to the assembly target.
  6.  A manufacturing support program for supporting part assembly work by causing the display device of smart glasses, worn on the head of a worker and incorporating an imaging device that photographs the real scene in the worker's field of view and a display device that superimposes virtual images on the real scene, to display virtual images conforming to the coordinate system of the real scene, the program causing a computer to execute processing to:
     extract, from an image captured by the imaging device, identification information contained in a work instruction sheet describing the content of the assembly work;
     acquire the assembly pattern of the part corresponding to the identification information, based on a first database defining a relationship between a plurality of assembly patterns of the part with respect to an assembly target and identification information corresponding thereto;
     recognize the assembly target in the captured image; and
     draw a virtual image of the part at the completed position and orientation in which the part is assembled to the assembly target.
PCT/JP2019/032359 2018-09-03 2019-08-20 Manufacturing assistance system, method, and program WO2020049999A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2020541113A JP7374908B2 (en) 2018-09-03 2019-08-20 Manufacturing support systems, methods, programs
JP2023013855A JP7468722B2 (en) 2018-09-03 2023-02-01 Manufacturing support system, method, and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-164219 2018-09-03
JP2018164219 2018-09-03

Publications (1)

Publication Number Publication Date
WO2020049999A1 true WO2020049999A1 (en) 2020-03-12

Family

ID=69722914

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/032359 WO2020049999A1 (en) 2018-09-03 2019-08-20 Manufacturing assistance system, method, and program

Country Status (2)

Country Link
JP (2) JP7374908B2 (en)
WO (1) WO2020049999A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114554163A (en) * 2022-04-25 2022-05-27 深圳酷源数联科技有限公司 Coal mine underground operation monitoring system
WO2024106197A1 (en) * 2022-11-15 2024-05-23 トヨタ紡織株式会社 Inspection apparatus

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012218037A (en) * 2011-04-11 2012-11-12 Murata Machinery Ltd Sheet metal process work support system
JP2014071757A (en) * 2012-09-28 2014-04-21 Brother Ind Ltd Work assistance system and program
JP2014215748A (en) * 2013-04-24 2014-11-17 川崎重工業株式会社 Component mounting work supporting system, and component mounting method
JP2017007866A (en) * 2016-09-13 2017-01-12 オークラ輸送機株式会社 Picking system

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114554163A (en) * 2022-04-25 2022-05-27 深圳酷源数联科技有限公司 Coal mine underground operation monitoring system
CN114554163B (en) * 2022-04-25 2022-08-19 深圳酷源数联科技有限公司 Coal mine underground operation monitoring system
WO2024106197A1 (en) * 2022-11-15 2024-05-23 トヨタ紡織株式会社 Inspection apparatus

Also Published As

Publication number Publication date
JP7468722B2 (en) 2024-04-16
JP2023065371A (en) 2023-05-12
JP7374908B2 (en) 2023-11-07
JPWO2020049999A1 (en) 2021-08-26

Similar Documents

Publication Publication Date Title
JP7488435B2 (en) AR-Corresponding Labeling Using Aligned CAD Models
CN106340217B (en) Manufacturing equipment intelligence system and its implementation based on augmented reality
JP7468722B2 (en) Manufacturing support system, method, and program
JP6491574B2 (en) AR information display device
US20180350056A1 (en) Augmented reality application for manufacturing
CN105666505B (en) Robot system having display for augmented reality
US9746913B2 (en) Secured mobile maintenance and operator system including wearable augmented reality interface, voice command interface, and visual recognition systems and related methods
EP3111297B1 (en) Tracking objects during processes
CN108228345A (en) The system and method assisted for interactive cognitive task
JP2006285788A (en) Mixed reality information generation device and method
JP7471428B2 (en) SYSTEM, METHOD AND COMPUTER READABLE MEDIUM FOR USING VIRTUAL/AUGHTENT FOR INTERACTING WITH COLLABORATORY ROBOTS IN A MANUFACTURING OR INDUSTRIAL ENVIRONMENT - Patent application
JP2021528777A (en) Automatic Dynamic Diagnostic Guide Using Augmented Reality
JP2008235504A (en) Assembly inspection device
JP6589604B2 (en) Teaching result display system
Stanimirovic et al. [Poster] A Mobile Augmented reality system to assist auto mechanics
JP6803794B2 (en) Image processing equipment and manufacturing system
WO2018223038A1 (en) Augmented reality application for manufacturing
JP2014165810A (en) Parameter acquisition device, parameter acquisition method and program
US20150286975A1 (en) Process support system and method
JP6481596B2 (en) Evaluation support device for vehicle head-up display
US20230377471A1 (en) Control system for an augmented reality device
WO2024142304A1 (en) Information processing device, terminal, and information processing method
WO2024142300A1 (en) Information processing device, terminal, and information processing method
US20230351682A1 (en) Systems and Methods for 3D Accident Reconstruction
US20220343038A1 (en) Vehicle simulator

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19856527

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020541113

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19856527

Country of ref document: EP

Kind code of ref document: A1