WO2019106987A1 - Work assisting system, work assisting method, and program - Google Patents

Work assisting system, work assisting method, and program Download PDF

Info

Publication number
WO2019106987A1
Authority
WO
WIPO (PCT)
Prior art keywords
work
sensor
detection area
area
support system
Prior art date
Application number
PCT/JP2018/038401
Other languages
French (fr)
Japanese (ja)
Inventor
Nobuyuki Tone
Original Assignee
Panasonic IP Management Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic IP Management Co., Ltd.
Priority to CN201880076962.1A (CN111417976A)
Priority to JP2019557058A (JPWO2019106987A1)
Publication of WO2019106987A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 - Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/04 - Manufacturing
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 - Computing systems specially adapted for manufacturing

Definitions

  • The present invention generally relates to a work support system, a work support method, and a program, and more particularly to a work support system, work support method, and program that generate a work process.
  • Conventionally, a work detection system applied to a work area has been proposed (Patent Document 1).
  • The work detection system described in Patent Document 1 includes a distance image sensor, a detection device, a notification device, and a setting device.
  • The distance image sensor outputs a distance image of the work area.
  • The detection device treats a work subject appearing in the distance image, or a workpiece handled by the work subject, as a detection target, and outputs a notification command when an abnormality in the work content is detected based on the distance image.
  • The notification device performs a notification operation when a notification command is input.
  • The setting device sets, within the distance image, a plurality of detection areas in which the detection device detects the presence or absence of a detection target, as well as their detection order.
  • The detection device includes a setting unit having the function of the setting device.
  • The setting unit sets the detection areas for detecting the presence or absence of a detection target in the distance image, and their detection order, based on, for example, content entered by an operator using an input device.
  • In the work detection system described in Patent Document 1, the person performing the setup must operate the input device and define, based on the work steps, each detection area in which the worker is detected and its detection order by hand, so generating a work process takes considerable effort.
  • An object of the present invention is to provide a work support system, a work support method and a program capable of easily generating a work process.
  • A work support system according to one aspect of the present invention includes a sensor and a generation unit.
  • The sensor detects an object present in an automatic detection area set in the work area.
  • The generation unit generates a work process including an order of a plurality of work events in the work area based on an output of the sensor.
  • A work support method according to one aspect of the present invention generates a work process including an order of a plurality of work events in the work area, based on an output of a sensor that detects an object present in an automatic detection area set in the work area.
  • A program according to one aspect of the present invention causes a computer system to execute processing for generating a work process including an order of a plurality of work events in the work area, based on an output of a sensor that detects an object present in an automatic detection area set in the work area.
  • FIG. 1 is a block diagram of a work support system according to an embodiment of the present invention.
  • FIG. 2 is a perspective view illustrating how a worker works based on the work process generated by the generation unit of the work support system.
  • FIG. 3 shows an example of a distance image used in the work support system.
  • FIG. 4 shows an example of a color image used in the work support system.
  • FIG. 5 is a conceptual diagram explaining a method of setting a detection area in the work support system.
  • the work support system 1 is a system for supporting a worker 100 (see FIG. 2) who performs work in the work area 11.
  • the work support system 1 includes a sensor 2 and a generation unit 3.
  • In the work support system 1, the worker 100 performs work in the work area 11 in a predetermined work order, and the sensor 2 detects objects (the hand of the worker 100, parts, and the like) present in the three-dimensional detection area 12 (see FIG. 2) set in the work area 11.
  • Here, detecting an object includes detecting the movement of the object, detecting the presence or absence of the object, and the like.
  • the generation unit 3 generates a work process including the order of a plurality of work events in the work area 11 based on the output of the sensor 2.
  • The work support system 1 is used, for example, to support the worker 100 who works in a cell production method.
  • In the cell production method, the worker 100 takes parts out of a plurality of (here, four) part boxes 13 on the work table 15 in a predetermined work order and assembles a product.
  • Hereinafter, for convenience of description, the four part boxes 13 may be referred to as a part box 131, a part box 132, a part box 133, and a part box 134 sequentially from the left side of the figure.
  • The four part boxes 131 to 134 contain mutually different parts.
  • As an example, a product is completed by assembling the four different parts.
  • the work support system 1 includes a sensor 2 and a generation unit 3 as shown in FIG.
  • the work support system 1 further includes a presentation unit 4, an editing unit 5, a user interface 6, a setting unit 7, and a guide information presentation unit 8.
  • the work support system 1 further includes a control unit 9 and a storage unit 10.
  • the control unit 9 controls the generation unit 3, the presentation unit 4, the editing unit 5, the user interface 6, the setting unit 7 and the guide information presentation unit 8.
  • The storage unit 10 stores information about the three-dimensional detection area 12 (see FIG. 2) set by the setting unit 7, the work process generated by the generation unit 3, the work guide information presented by the guide information presentation unit 8, and the like.
  • the work support system 1 has a first mode for generating a work process in the generation unit 3 and a second mode for using the work process generated in the generation unit 3.
  • In the first mode, the control unit 9 controls the generation unit 3 to generate a work process.
  • In the second mode, the control unit 9 controls the guide information presentation unit 8, and the guide information presentation unit 8 presents work guide information for the worker 100 in the work area 11 according to the work process (the work guide information is displayed on the upper surface 151 of the work table 15).
  • the sensor 2 detects an object present in the three-dimensional detection area 12 set in the work area 11.
  • the generation unit 3 generates a work process including the order of a plurality of work events in the work area 11 based on the output of the sensor 2.
  • the presentation unit 4 presents a work process.
  • the editing unit 5 edits the work process based on the input information from the user interface 6.
  • the setting unit 7 sets the automatic detection area 14 (see FIG. 4) in accordance with the input information from the user interface 6.
  • the automatic detection area 14 is set in the work area 11.
  • The automatic detection area 14 is an area in which a work event satisfying a predetermined detection condition is automatically detected from the moving image of the distance image or of the color image.
  • the setting unit 7 sets the three-dimensional detection area 12 based on the output of the sensor 2.
  • the guide information presentation unit 8 sequentially presents a plurality of work guide information in accordance with the work process.
  • the work area 11 includes an area (three-dimensional area) on the work table 15 on which the worker 100 works.
  • The work area 11 includes the upper surface 151 of the work table 15, the plurality of part boxes 13 on the work table 15, and the like.
  • the plurality of three-dimensional detection areas 12 correspond, for example, to the plurality of part boxes 13 one by one.
  • Hereinafter, for convenience of description, the four three-dimensional detection areas 12 may be referred to as a three-dimensional detection area 121, a three-dimensional detection area 122, a three-dimensional detection area 123, and a three-dimensional detection area 124 sequentially from the left side of FIG. 2.
  • the four three-dimensional detection areas 121, 122, 123 and 124 correspond to the four part boxes 131, 132, 133 and 134, respectively.
  • As viewed from the worker 100, the four three-dimensional detection areas 121, 122, 123, and 124 are areas set on the front side of the four part boxes 131, 132, 133, and 134, respectively.
  • The four three-dimensional detection areas 121, 122, 123, and 124 are set so as to overlap parts of the four part boxes 131, 132, 133, and 134, respectively, but the invention is not limited thereto; the detection areas need not overlap the part boxes 131, 132, 133, and 134.
  • the sensor 2 is disposed, for example, above the work area 11.
  • the sensor 2 is disposed above the work area 11 by being supported by a support 16 provided on the work table 15, for example.
  • the sensor 2 includes a distance image sensor 21 (see FIG. 1).
  • the distance image sensor 21 captures the work area 11 from above and outputs a distance image of the work area 11.
  • The distance image sensor 21 continuously generates distance images in which the pixel value of each pixel is the distance from the sensor 2 to the object present in the work area 11 at that pixel.
  • This enables the sensor 2 to detect the movement of an object, the presence or absence of an object, and the like.
  • the distance image sensor 21 includes an infrared light source and an infrared camera.
  • the distance image sensor 21 is, for example, a distance image sensor of a TOF (Time Of Flight) system.
  • In the distance image sensor 21, the distance from the distance image sensor 21 to an object present in the work area 11 is determined from the time of flight, that is, the time from when the infrared light source irradiates the work area 11 with infrared light until the infrared light reflected by the object enters (reaches) the infrared camera.
  • The height information of the object is calculated by subtracting the distance between the object and the distance image sensor 21 from the distance between the upper surface 151 of the work table 15 and the distance image sensor 21.
  • The distance image sensor 21 is not limited to a TOF distance image sensor and may be, for example, a stereo-camera distance image sensor.
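  • The height calculation described above is a simple per-pixel subtraction. A minimal sketch, assuming the distance image arrives as a NumPy array of per-pixel distances in millimeters and that the sensor-to-table distance has been calibrated (both names are illustrative, not from the patent):

```python
import numpy as np

def height_map(distance_image: np.ndarray, table_distance_mm: float) -> np.ndarray:
    """Per-pixel object height above the table from a TOF distance image.

    Each pixel of `distance_image` holds the measured distance (mm) from
    the sensor to the surface seen at that pixel; `table_distance_mm` is
    the calibrated distance from the sensor to the table top (151).
    Height = (sensor-to-table distance) - (sensor-to-object distance).
    """
    heights = table_distance_mm - distance_image
    # Clamp small negative values caused by sensor noise to zero.
    return np.clip(heights, 0.0, None)
```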
  • the sensor 2 includes an image sensor 22 (see FIG. 1).
  • the image sensor 22 is, for example, a CMOS (Complementary MOS) color image sensor.
  • The image sensor 22 captures the work area 11 from above and outputs a color image of the work area 11.
  • the image sensor 22 continuously generates a color image.
  • In the sensor 2, the distance images and the color images captured at substantially the same time by the distance image sensor 21 and the image sensor 22, respectively, correspond to each other one to one.
  • the image sensor 22 may be a sensor that outputs a monochrome image.
  • the generation unit 3, the presentation unit 4, the editing unit 5, the user interface 6, the setting unit 7, and the control unit 9 are configured by, for example, a computer system having a processor and a memory as main components.
  • In other words, the generation unit 3, the presentation unit 4, the editing unit 5, the setting unit 7, and the control unit 9 are realized by a computer system (for example, a laptop personal computer) having a processor and a memory; the computer system functions as the generation unit 3, the presentation unit 4, the editing unit 5, the user interface 6, the setting unit 7, the control unit 9, and the storage unit 10 when the processor executes programs stored in the memory.
  • The program may be pre-recorded in the memory of the computer system, may be provided through a telecommunication line, or may be provided recorded on a non-transitory recording medium readable by the computer system, such as a memory card, an optical disc, or a hard disk drive (magnetic disk).
  • The processor of the computer system is composed of one or more electronic circuits including a semiconductor integrated circuit (IC) or a large-scale integrated circuit (LSI).
  • the plurality of electronic circuits may be integrated into one chip or may be distributed to a plurality of chips.
  • the plurality of chips may be integrated into one device or may be distributed to a plurality of devices.
  • the user interface 6 is configured by the input device (keyboard, mouse, etc.) of the personal computer described above.
  • the presentation unit 4 is configured by the display of the above-described personal computer, and presents a setting screen for setting the automatic detection area 14, a work process generated by the generation unit 3, and the like.
  • the presentation content (display content) of the presentation unit 4 is controlled by the control unit 9.
  • The editing unit 5 will be described after the generation unit 3.
  • The setting unit 7 sets the three-dimensional detection area 12 in the work area 11 based on the output of the sensor 2. For example, when a preset automatic detection area 14 does not have specific height information, the setting unit 7 automatically sets the three-dimensional detection area 12 by adding the height information of an object, obtained when the object passes through the automatic detection area 14, to the information of the automatic detection area 14.
  • When the automatic detection area 14 already has specific height information, the setting unit 7 automatically updates the three-dimensional information of the three-dimensional detection area 12 based on the output of the sensor 2, so that the three-dimensional detection area 12 can be set to a more appropriate range.
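  • As a rough illustration of how a two-dimensional automatic detection area could be promoted to a three-dimensional detection area using observed heights, consider the following sketch; the `AutoDetectionArea` class and its field names are hypothetical stand-ins, not structures defined in the patent:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AutoDetectionArea:
    """A 2D region drawn on the reference image, optionally with a height range."""
    x0: int
    y0: int
    x1: int
    y1: int
    min_height_mm: Optional[float] = None  # None: no height information yet
    max_height_mm: Optional[float] = None

def fold_in_observation(area: AutoDetectionArea, observed_heights_mm) -> AutoDetectionArea:
    """Add the heights seen while an object crossed the 2D area, turning the
    area into (or refining) a three-dimensional detection area."""
    lo, hi = min(observed_heights_mm), max(observed_heights_mm)
    if area.min_height_mm is None:
        # First observation: adopt the observed height range as-is.
        area.min_height_mm, area.max_height_mm = lo, hi
    else:
        # Later observations: widen the range where needed.
        area.min_height_mm = min(area.min_height_mm, lo)
        area.max_height_mm = max(area.max_height_mm, hi)
    return area
```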
  • the generation unit 3 generates a work process including the order of a plurality of work events in the work area 11 based on the output of the sensor 2.
  • In the first mode, the generation unit 3 generates the work process based on the output of the sensor 2 obtained when a specific worker performs, on the work table 15, the series of work (a model series of work) that the work process is to describe.
  • The user who wants the work support system 1 to generate a work process records in advance, on a recording medium, the output of the sensor 2 (the moving images of the distance image and the color image) obtained while the specific worker performs the series of work on the work table 15, and has the work support system 1 generate the work process using the recorded moving images.
  • the specific worker may be the same person as the worker 100 who works in the second mode.
  • the user uses the user interface 6 to cause the presentation unit 4 of the computer system described above to present the moving image. Then, the user uses the user interface 6 to select one of the plurality of distance images constituting the moving image of the distance image as the reference distance image P0 (see FIG. 3). In other words, the reference distance image P0 is captured from the moving image of the distance image.
  • the reference distance image P0 is, for example, an image in which the hand of a specific worker is not shown.
  • In the reference distance image P0, the upper surface 151 of the work table 15 and the plurality of part boxes 131 to 134 appear.
  • Next, the user uses the user interface 6 to select, from among the plurality of color images, the one corresponding to the reference distance image P0 presented on the presentation unit 4, as the reference image P10 (see FIG. 4).
  • the automatic detection area 14 is set with respect to the reference image P10.
  • The user sets, using the user interface 6, a plurality of automatic detection areas 14 corresponding one-to-one to the plurality of part boxes 13.
  • the user can set the automatic detection area 14 by specifying the range on the reference image P10 with the mouse.
  • the four automatic detection areas 14 may be referred to as an automatic detection area 141, an automatic detection area 142, an automatic detection area 143, and an automatic detection area 144 sequentially from the left side of FIG.
  • the automatic detection area 14 may be a two-dimensional area or a three-dimensional area.
  • The automatic detection area 14 includes, as detection conditions for automatic detection, information such as height, area, or volume. Other detection conditions for automatic detection include, for example, color, a time range, a staying time, a movement direction, and simultaneous or random group settings across a plurality of detection areas.
  • The time range is a specified range of time, within the work time during which the work process is performed, that is used as a detection condition.
  • The work time is, for example, the time from the start of the work process to its end.
  • The time range is, for example, the range between a point 3 seconds after the start of the work process and a point 12 seconds after it.
  • The simultaneous group setting is a condition under which a work event in a first detection area and a work event in a second detection area are detected only when they are performed simultaneously.
  • The random group setting is a condition under which a work event in a first detection area and a work event in a second detection area are detected regardless of which is performed first. Note that the automatic detection area 14 may be set using a distance image instead of a color image.
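  • A sketch of how the simultaneous and random group conditions might be evaluated; the event representation and the tolerance window `window_s` are assumptions (the patent only says "performed simultaneously", so a practical check needs some tolerance):

```python
def group_condition_met(timestamps_s, mode, window_s=0.5):
    """Evaluate a two-area group detection condition.

    `timestamps_s` holds the detection times (seconds) of the work events
    in the first and second detection areas, e.g. {"area1": 12.3, "area2": 12.4}.
    "simultaneous": both events must fall within `window_s` of each other.
    "random": both events must simply have occurred, in either order.
    """
    if len(timestamps_s) < 2:
        return False  # one of the two events has not occurred yet
    t1, t2 = list(timestamps_s.values())[:2]
    if mode == "simultaneous":
        return abs(t1 - t2) <= window_s
    if mode == "random":
        return True
    raise ValueError(f"unknown mode: {mode!r}")
```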
  • Next, the user uses the user interface 6 to instruct playback of the moving image of the distance image, setting of the three-dimensional detection areas 12, and generation of frames for the work events performed in the automatic detection areas 14.
  • the setting unit 7 automatically sets the three-dimensional detection area 12 from the moving image of the distance image.
  • Here, a frame is a processing frame in which the performance of a specific work is recorded.
  • The three-dimensional detection area 12 is the area in real space that corresponds to the automatic detection area 14 in the distance image.
  • The setting unit 7 sets the three-dimensional detection area 12 from the area in the distance image corresponding to the automatic detection area 14 and from distance information in the height direction based on information on an object (the specific worker's hand) entering that area.
  • When the control unit 9 receives an instruction for flow generation (work process generation), it causes the generation unit 3 to generate a work process.
  • The generation unit 3 automatically generates a frame for each work event detected in the automatic detection areas 14 from the moving image of the distance image or of the color image.
  • the horizontal axis of FIG. 5 is a time axis of the recorded moving image.
  • the work process includes a plurality (four) of work events (steps).
  • The four work events are, for example, a work event of taking a part out of the part box 131 (hereinafter also referred to as work A), a work event of taking a part out of the part box 132 (hereinafter also referred to as work B), a work event of taking a part out of the part box 133 (hereinafter also referred to as work C), and a work event of taking a part out of the part box 134 (hereinafter also referred to as work D).
  • the generation unit 3 automatically generates frames of work events in the order in which the respective work events are executed.
  • The work process is a flow indicating the order in which the work events are performed, for example, the flow "take a part from the part box 131" → "take a part from the part box 132" → "take a part from the part box 134" → "take a part from the part box 133".
  • The work process may also be abbreviated, for example, as the flow "part box 131" → "part box 132" → "part box 134" → "part box 133".
  • the contents of each work event (“take parts from part box 131” and the like) may be set in advance in the generation unit 3 using the user interface 6.
  • In that case, the work support system 1 can automatically associate, in the generation unit 3, the contents of each work event with the frame of the corresponding work event.
  • the generation unit 3 automatically generates frames of work events corresponding to the automatic detection area 141, the automatic detection area 142, the automatic detection area 144, and the automatic detection area 143 in this order.
  • a frame corresponding to the automatic detection area 141 corresponds to the three-dimensional detection area 121.
  • the frame corresponding to the automatic detection area 142 corresponds to the three-dimensional detection area 122.
  • the frame corresponding to the automatic detection area 144 corresponds to the three-dimensional detection area 124.
  • the frame corresponding to the automatic detection area 143 corresponds to the three-dimensional detection area 123.
  • The generation unit 3 generates the work process, which includes the order of the plurality of work events in the work area 11, according to the order of the three-dimensional detection areas 12 in which the object to be detected (the hand of the worker 100) is detected, based on the output of the sensor 2. More specifically, the generation unit 3 generates a work process in which the order of the work events is determined according to the order in which the object to be detected is detected in the plurality of automatic detection areas 14 corresponding one-to-one to the plurality of three-dimensional detection areas 12.
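  • The ordering logic reduces to sorting the areas by the time each was first triggered. A minimal sketch under an assumed representation of the recorded detections (timestamp, area-id pairs; the identifiers are illustrative):

```python
def generate_work_process(detections):
    """Derive the ordered work process from recorded detections.

    `detections` is a list of (timestamp_s, area_id) pairs collected while
    the model worker performed the series of work, e.g.
    [(3.1, "area141"), (5.9, "area142"), (8.2, "area144"), (11.0, "area143")].
    Each area's first detection fixes its position in the flow.
    """
    first_seen = {}
    for t, area in sorted(detections):
        first_seen.setdefault(area, t)  # keep only the earliest detection per area
    return [area for area, _ in sorted(first_seen.items(), key=lambda kv: kv[1])]

# Example: the detections above yield ["area141", "area142", "area144", "area143"],
# i.e. the flow "part box 131" -> "part box 132" -> "part box 134" -> "part box 133".
```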
  • The setting unit 7 can detect an object based on difference values obtained by subtracting the pixel value of each pixel of a distance image from the pixel value of the corresponding pixel of the reference distance image P0 (see FIG. 3), and can determine the height of the object as one piece of information about it. The setting unit 7 determines, for example, that an object to be detected is present when the number of pixels whose difference value is equal to or greater than a threshold is equal to or greater than a predetermined number, and that no object to be detected is present when that number of pixels is less than the predetermined number. That is, the setting unit 7 has a function of detecting an object to be detected (the hand of the worker 100, a part, or the like) based on the distance image output from the distance image sensor 21.
  • the setting unit 7 sets the range of the three-dimensional detection area 12 in the vertical direction based on the information of the height of the object to be detected.
  • The setting unit 7 may also be configured to detect an object to be detected by performing processing such as pattern matching on a distance image, or on a difference image between the reference distance image P0 and a distance image other than the reference distance image P0.
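  • The threshold test on the difference image can be expressed compactly. A sketch assuming millimeter-valued distance images as NumPy arrays; the two threshold values are illustrative, not taken from the patent:

```python
import numpy as np

def object_present(reference_p0: np.ndarray, frame: np.ndarray,
                   diff_threshold_mm: float = 30.0,
                   min_pixels: int = 50) -> bool:
    """Difference-image object detection.

    Subtracting the current distance frame from the reference distance
    image P0 gives, per pixel, how much closer to the sensor the scene is
    now; pixels whose difference exceeds `diff_threshold_mm` belong to an
    intruding object (a hand, a part). The object counts as present when
    at least `min_pixels` such pixels exist.
    """
    diff = reference_p0.astype(np.int32) - frame.astype(np.int32)
    return int(np.count_nonzero(diff >= diff_threshold_mm)) >= min_pixels
```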
  • The user can use the user interface 6 to edit the three-dimensional detection area 12 on the three-dimensional detection area editing screen presented on the presentation unit 4. Editing the three-dimensional detection area 12 means, for example, manually correcting the height information of an automatically set three-dimensional detection area 12.
  • When the setting unit 7 does not automatically set the height information of the three-dimensional detection area 12, the user can set the height information manually using the user interface 6.
  • the user can instruct, for example, change, addition, or deletion of the three-dimensional detection area 12 using a mouse, a keyboard, or the like of the user interface 6.
  • the user can also change, add, or delete the range of the three-dimensional detection area 12.
  • the setting of the three-dimensional detection area 12 and the generation of the work event frame performed in the automatic detection area 14 may be performed simultaneously or separately.
  • the user of the work support system 1 can input the content of the work event into the frame automatically generated by the generation unit 3.
  • the contents of the work event are, for example, work contents such as “take parts from parts box”.
  • the user can add a work event to a work process consisting of a frame automatically generated by the generation unit 3.
  • the user can delete a frame from the work process consisting of the frames automatically generated by the generation unit 3.
  • the user can set the detection condition of the object in the three-dimensional detection area 12 in the frame of the corresponding work event.
  • the detection condition of the three-dimensional detection area 12 is, for example, a color, a time range, a staying time, an operation direction, a simultaneous or random group setting of a plurality of detection areas, or the like.
  • The time range is a specified range of time, within the work time during which the work process is performed, that is used as a detection condition.
  • The work time is, for example, the time from the start of the work process to its end.
  • The time range is, for example, the range between a point 3 seconds after the start of the work process and a point 12 seconds after it.
  • The simultaneous group setting is a condition under which a work event in a first detection area and a work event in a second detection area are detected only when they are performed simultaneously.
  • The random group setting is a condition under which a work event in a first detection area and a work event in a second detection area are detected regardless of which is performed first.
  • The staying time used as a detection condition of the three-dimensional detection area 12 may be the time during which the object was present in the automatic detection area 14 when the work process was generated.
  • the setting unit 7 may automatically set the staying time of the object in the automatic detection area 14 as the detection condition of the object in the three-dimensional detection area 12.
  • A clock unit that measures the staying time is implemented by a timer provided in the personal computer described above.
  • the setting unit 7 may calculate the stay time using the moving image frame rate.
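  • Computing the staying time from the frame rate is a single division; for example, an object seen in 45 consecutive frames of a 30 fps recording stayed in the area for about 1.5 seconds:

```python
def staying_time_s(frames_present: int, frame_rate_fps: float) -> float:
    """Staying time computed from the recording's frame rate: if the object
    was detected in `frames_present` consecutive frames captured at
    `frame_rate_fps`, it stayed roughly frames / rate seconds."""
    return frames_present / frame_rate_fps

# Example: 45 consecutive frames at 30 fps -> 1.5 s in the detection area.
```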
  • the user can edit the work process presented to the presentation unit 4 using, for example, sentences, drawings, images, moving pictures, symbols, and the like.
  • Frames corresponding to a plurality of work events may be arranged in parallel on the time axis.
  • For example, frames extracted from the automatic detection areas 141 and 142 corresponding to the part boxes 131 and 132 can be arranged in parallel on the time axis.
  • the guide information presentation unit 8 of the work support system 1 is disposed above the work area 11 so as not to overlap the sensor 2 in the vertical direction.
  • the guide information presentation unit 8 sequentially presents the work guide information in the work area 11 in accordance with the work process in the second mode described above.
  • the guide information presentation unit 8 is configured to project an image on a projection surface (display surface) including the upper surface 151 of the work table 15.
  • the guide information presentation unit 8 is, for example, a projector.
  • the guide information presentation unit 8 is arranged to project work guide information (video) toward the work area 11.
  • The work guide information is, for example, a colored figure 17 such as an arrow (see FIG. 2) presented, on the upper surface 151 of the work table 15, in a region corresponding to the three-dimensional detection area 12 of the one part box 13, among the plurality of part boxes 13, from which a part is to be taken. Therefore, in the work support system 1, the worker 100 can view the work guide information while remaining in the same posture as during work.
  • The work guide information is not limited to the figure 17 and may be a sentence (for example, "take a part") or both the figure 17 and a sentence.
  • the figure 17 is not limited to the arrow, and may be, for example, a rectangle.
  • the work guide information may be a figure other than an arrow or a rectangle, or may be a moving image, a photograph, or the like.
  • the operation of the guide information presentation unit 8 is controlled by the control unit 9.
  • the control unit 9 controls the timing of presentation and non-presentation of the work guide information by the guide information presentation unit 8.
  • The control unit 9 presents the plurality of pieces of work guide information one at a time, for example, in the order predetermined by the work process created in the first mode. Each piece of work guide information corresponds to one of the plurality of work events.
  • One piece of work guide information may be presented for a predetermined time and then hidden, and the interval before the next piece of work guide information is presented may be set as appropriate. Alternatively, the presentation of the work guide information may be switched after the completion of the worker 100's operation is detected using the three-dimensional detection area 12 and the detection conditions, as sketched below.
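  • One way the second-mode presentation loop could behave, as a sketch; `project`, `detect_done`, and the timeout value are hypothetical stand-ins for the guide information presentation unit 8, the sensor-side completion check, and the specified time mentioned later:

```python
import time

def run_guided_process(work_process, project, detect_done, timeout_s=30.0):
    """Present each piece of work guide information in process order and
    advance when the corresponding work event is detected, flagging an
    abnormality when the step is not performed within `timeout_s`.

    `project(step)` would drive the guide information presentation unit;
    `detect_done(step)` would poll the detection check for that step's
    three-dimensional detection area.
    """
    for step in work_process:
        project(step)                 # show the guide for this step
        start = time.monotonic()
        while not detect_done(step):  # wait for the work event
            if time.monotonic() - start > timeout_s:
                print(f"abnormality: step {step!r} not performed in time")
                break
            time.sleep(0.05)
```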
  • the sensor 2 is arranged so as not to interfere with the presentation (projection) of the work guide information to the work area 11 by the guide information presentation unit 8.
  • the guide information presentation unit 8 is disposed so that the projection light is not irradiated to the sensor 2.
  • the work support system 1 uses the output of the sensor 2 for work support in the second mode.
  • For example, the control unit 9 may determine, based on the output of the sensor 2, whether there is an abnormality in the work, and notify the worker 100 when there is an abnormality.
  • An abnormality in the work is, for example, a case where the time from the presentation of the work guide information until the work according to that work guide information is performed exceeds the specified time mentioned above.
  • Another example is a case where the worker 100 is instructed to take a part from the part box 131 at the left end in FIG. 2 but takes a part from the second part box 132 from the left.
  • The notification to the worker 100 may be performed by the control unit 9 controlling the guide information presentation unit 8, or by the control unit 9 controlling a sound generation device, a display device, or the like. Further, when the work support system 1 operates in the second mode, the control unit 9 may keep presenting the work guide information until it detects, based on the output of the sensor 2, that the work according to the work guide information has been performed. In addition, when operating in the second mode, the work support system 1 may, for example, operate in conjunction with a measuring device, or may present work guide information triggered by a signal from the measuring device.
  • The worker 100 can view the work guide information presented in the work area 11.
  • The worker 100 works according to the work guide information presented by the guide information presentation unit 8. This enables the worker 100 to adapt easily to the work process, makes it possible to reduce the difference in production volume among a plurality of workers 100, and allows the worker 100 to adapt easily to different work processes for different types of products. Further, by using the work support system 1, the work contents of the plurality of workers 100 can be standardized.
  • The work support method according to one aspect generates a work process including the order of a plurality of work events in the work area 11, based on the output of the sensor 2 that detects an object present in the automatic detection area 14 set in the work area 11.
  • The program according to one aspect causes the computer system to execute processing for generating a work process including the order of a plurality of work events in the work area 11, based on the output of the sensor 2 that detects an object present in the automatic detection area 14 set in the work area 11.
  • the detection target of the sensor 2 is not limited to the hand of the worker 100, and may be, for example, a part, a product, or the like.
  • The sensor 2 is not limited to a configuration including both the distance image sensor 21 and the image sensor 22, and may include only one of them. Further, the sensor 2 is not limited to a configuration including at least one of the distance image sensor 21 and the image sensor 22, and may be, for example, a plurality of picking sensors provided one for each of the plurality of three-dimensional detection areas 12.
  • At least two parts boxes 13 of the plurality of parts boxes 13 may be arranged to overlap in the vertical direction.
  • The guide information presentation unit 8 is not limited to a projector and may be, for example, a display embedded in a top plate including the upper surface 151 of the work table 15.
  • the work support system 1 may further include a presentation unit 4, an editing unit 5, a user interface 6, a setting unit 7, a guide information presentation unit 8, a control unit 9 and a storage unit 10.
  • the components other than the sensor 2 and the generation unit 3 are not essential components and can be omitted as appropriate.
  • the presentation unit 4, the editing unit 5, the user interface 6, the setting unit 7, the guide information presentation unit 8, the control unit 9, and the storage unit 10 may share and execute their respective functions.
  • the work support system 1 may transmit the work process generated by the generation unit 3 to a terminal such as a smartphone or may be recorded on a recording medium.
  • The work assumed in the work support system 1 is not limited to manual assembly on a manufacturing line.
  • The work may be, for example, sorting of good and defective products or parts on a production line, packing, cooking, cleaning, and the like.
  • At least a part of the functions of the generation unit 3 included in the work support system 1 may be realized by, for example, a server device in cloud computing.
  • a work support system (1) includes a sensor (2) and a generation unit (3).
  • the sensor (2) detects an object present in an automatic detection area (14) set in the work area (11).
  • the generation unit (3) generates a work process including an order of a plurality of work events in the work area (11) based on the output of the sensor (2).
  • The work support system (1) according to a second aspect further includes, in the first aspect, a presentation unit (4) that presents the work process.
  • With this aspect, since the work process can be presented, the user can confirm the work process.
  • In the work support system (1) according to a third aspect, the sensor (2) includes a distance image sensor (21) that continuously generates distance images whose pixel values are the distance values from the sensor (2) to an object present in the automatic detection area (14).
  • In the work support system (1) according to a fourth aspect, the sensor (2) includes an image sensor (22).
  • The work support system (1) according to a fifth aspect, in any one of the first to fourth aspects, further includes an editing unit (5) that edits the work process based on input information from the user interface (6).
  • the user can edit the work process using the user interface.
  • The work support system (1) according to a sixth aspect, in any one of the first to fifth aspects, further includes a setting unit (7) that sets the three-dimensional detection area (12) according to input information from the user interface (6).
  • The work support system (1) according to a seventh aspect has a first mode in which the generation unit (3) generates a work process and a second mode in which the work process generated by the generation unit (3) is used; in the second mode, the output of the sensor (2) is used for work support.
  • the same sensor (2) can be used in both the first mode and the second mode.
  • the work support system (1) according to the eighth aspect further includes a guide information presentation unit (8) for sequentially presenting a plurality of pieces of work guide information according to the work process in the second mode in the seventh aspect.
  • work support can be performed by sequentially presenting a plurality of pieces of work guide information according to the work process in the second mode.
  • A work support method according to a ninth aspect generates a work process including the order of a plurality of work events in the work area (11), based on the output of a sensor (2) that detects an object present in an automatic detection area (14) set in the work area (11).
  • a work process can be easily generated.
  • The program according to a tenth aspect causes a computer system to execute processing for generating a work process including the order of a plurality of work events in the work area (11), based on the output of a sensor (2) that detects an object present in an automatic detection area (14) set in the work area (11).
  • Reference Signs List: 1 work support system; 2 sensor; 3 generation unit; 4 presentation unit; 5 editing unit; 6 user interface; 7 setting unit; 8 guide information presentation unit; 9 control unit; 11 work area; 12 three-dimensional detection area; 14 automatic detection area; 100 worker.

Abstract

The present invention addresses the problem of providing a work assisting system, a work assisting method, and a program with which it is possible to easily generate a work process. A work assisting system (1) is provided with a sensor (2) and a generation unit (3). The sensor (2) detects an object present in an automatic detection area that is set in a work area. The generation unit (3) generates a work process including a sequence of a plurality of work events in the work area on the basis of the output from the sensor (2).

Description

Work support system, work support method, and program

The present invention generally relates to a work support system, a work support method, and a program, and more particularly to a work support system, work support method, and program that generate a work process.

BACKGROUND: Conventionally, a work detection system applied to a work area has been proposed (Patent Document 1).

The work detection system described in Patent Document 1 includes a distance image sensor, a detection device, a notification device, and a setting device. The distance image sensor outputs a distance image of the work area. The detection device treats a work subject appearing in the distance image, or a workpiece handled by the work subject, as a detection target, and outputs a notification command when an abnormality in the work content is detected based on the distance image. The notification device performs a notification operation when a notification command is input. The setting device sets, within the distance image, a plurality of detection areas in which the detection device detects the presence or absence of a detection target, as well as their detection order.

The detection device includes a setting unit having the function of the setting device. The setting unit sets the detection areas for detecting the presence or absence of a detection target in the distance image, and their detection order, based on, for example, content entered by an operator using an input device.

In the work detection system described in Patent Document 1, the person performing the setup must operate the input device and define, based on the work steps, each detection area in which the worker is detected and its detection order by hand, so generating a work process takes considerable effort.

JP 2013-25478 A

An object of the present invention is to provide a work support system, a work support method, and a program capable of easily generating a work process.

A work support system according to one aspect of the present invention includes a sensor and a generation unit. The sensor detects an object present in an automatic detection area set in the work area. The generation unit generates a work process including an order of a plurality of work events in the work area based on an output of the sensor.

A work support method according to one aspect of the present invention generates a work process including an order of a plurality of work events in the work area, based on an output of a sensor that detects an object present in an automatic detection area set in the work area.

A program according to one aspect of the present invention causes a computer system to execute processing for generating a work process including an order of a plurality of work events in the work area, based on an output of a sensor that detects an object present in an automatic detection area set in the work area.

FIG. 1 is a block diagram of a work support system according to an embodiment of the present invention. FIG. 2 is a perspective view illustrating how a worker works based on the work process generated by the generation unit of the work support system. FIG. 3 shows an example of a distance image used in the work support system. FIG. 4 shows an example of a color image used in the work support system. FIG. 5 is a conceptual diagram explaining a method of setting a detection area in the work support system.
(Embodiment)

(1) Overview

The work support system 1 according to the present embodiment is a system for supporting a worker 100 (see FIG. 2) who performs work in the work area 11.

The work support system 1 includes a sensor 2 and a generation unit 3. In the work support system 1, the worker 100 performs work in the work area 11 in a predetermined work order, and the sensor 2 detects objects (the hand of the worker 100, parts, and the like) present in the three-dimensional detection area 12 (see FIG. 2) set in the work area 11. Here, detecting an object includes detecting the movement of the object, detecting the presence or absence of the object, and the like. The generation unit 3 generates a work process including the order of a plurality of work events in the work area 11 based on the output of the sensor 2.
(2) Configuration

(2.1) Overall Configuration of the Work Support System

The work support system 1 according to the present embodiment is, for example, a system used to support the worker 100 who works in a cell production method, as shown in FIG. 2. In the cell production method, the worker 100 takes parts out of a plurality of (four) part boxes 13 on the work table 15 in a predetermined work order and assembles a product. Hereinafter, for convenience of description, the four part boxes 13 may be referred to as a part box 131, a part box 132, a part box 133, and a part box 134 sequentially from the left side of the figure. The four part boxes 131 to 134 contain mutually different parts. In the cell production method, as an example, a product is completed by assembling four different parts.
 本実施形態に係る作業支援システム1は、図1に示すように、センサ2と、生成部3と、を備える。また、作業支援システム1は、提示部4と、編集部5と、ユーザインタフェース6と、設定部7と、ガイド情報提示部8と、を更に備える。作業支援システム1は、制御部9と、記憶部10と、を更に備える。制御部9は、生成部3、提示部4、編集部5、ユーザインタフェース6、設定部7及びガイド情報提示部8を制御する。記憶部10は、設定部7にて設定された3次元検知領域12(図2参照)、生成部3にて生成された作業プロセス、ガイド情報提示部8で提示する作業ガイド情報等に関する情報を記憶する。 The work support system 1 according to the present embodiment includes a sensor 2 and a generation unit 3 as shown in FIG. The work support system 1 further includes a presentation unit 4, an editing unit 5, a user interface 6, a setting unit 7, and a guide information presentation unit 8. The work support system 1 further includes a control unit 9 and a storage unit 10. The control unit 9 controls the generation unit 3, the presentation unit 4, the editing unit 5, the user interface 6, the setting unit 7 and the guide information presentation unit 8. The storage unit 10 includes information about the three-dimensional detection area 12 (see FIG. 2) set by the setting unit 7, the work process generated by the generation unit 3, and work guide information presented by the guide information presentation unit 8. Remember.
 作業支援システム1は、生成部3において作業プロセスを生成する第1モードと、生成部3において生成された作業プロセスを使用する第2モードと、がある。ここにおいて、作業支援システム1では、第1モードのときは制御部9によって生成部3が制御されて作業プロセスが生成される。作業支援システム1では、第2モードのときは制御部9によってガイド情報提示部8が制御されて、ガイド情報提示部8が、作業プロセスに従って作業者100のための作業ガイド情報を作業領域11に提示する(作業ガイド情報を作業台15の上面151上に表示する)。 The work support system 1 has a first mode for generating a work process in the generation unit 3 and a second mode for using the work process generated in the generation unit 3. Here, in the work support system 1, in the first mode, the control unit 9 controls the generation unit 3 to generate a work process. In the work support system 1, in the second mode, the guide information presentation unit 8 is controlled by the control unit 9, and the guide information presentation unit 8 sets the work guide information for the worker 100 in the work area 11 according to the work process. Present (work guide information is displayed on the upper surface 151 of the work table 15).
 センサ2は、作業領域11内に設定された3次元検知領域12内に存在する物体を検知する。生成部3は、センサ2の出力に基づいて、作業領域11内の複数の作業イベントの順序を含む作業プロセスを生成する。提示部4は、作業プロセスを提示する。編集部5は、ユーザインタフェース6からの入力情報に基づいて作業プロセスを編集する。設定部7は、自動検知領域14(図4参照)をユーザインタフェース6からの入力情報に従って設定する。ここにおいて、自動検知領域14は、作業領域11内に設定される。自動検知領域14は、距離画像又はイメージ画像の動画から所定の検知条件を満たす作業イベントが自動検知される領域である。また、設定部7は、センサ2の出力に基づいて、3次元検知領域12を設定する。ガイド情報提示部8は、作業プロセスに従って複数の作業ガイド情報を順次提示する。 The sensor 2 detects an object present in the three-dimensional detection area 12 set in the work area 11. The generation unit 3 generates a work process including the order of a plurality of work events in the work area 11 based on the output of the sensor 2. The presentation unit 4 presents a work process. The editing unit 5 edits the work process based on the input information from the user interface 6. The setting unit 7 sets the automatic detection area 14 (see FIG. 4) in accordance with the input information from the user interface 6. Here, the automatic detection area 14 is set in the work area 11. The automatic detection area 14 is an area where a work event that satisfies a predetermined detection condition is automatically detected from a distance image or a moving image of the image image. Further, the setting unit 7 sets the three-dimensional detection area 12 based on the output of the sensor 2. The guide information presentation unit 8 sequentially presents a plurality of work guide information in accordance with the work process.
 作業領域11は、作業者100が作業を行う作業台15上の領域(3次元領域)を含む。作業領域11は、作業台15の上面151、作業台15上の複数のパーツボックス13等を含む。複数の3次元検知領域12は、例えば、複数のパーツボックス13に一対一に対応する。以下では、説明の便宜上、4つの3次元検知領域12を、図2の左側から順に、3次元検知領域121、3次元検知領域122、3次元検知領域123、3次元検知領域124と称することもある。4つの3次元検知領域121、122、123及び124は、4つのパーツボックス131、132、133及び134にそれぞれ対応している。作業者100から見て、4つの3次元検知領域121、122、123及び124は、4つのパーツボックス131、132、133及び134それぞれの手前側に設定された領域である。4つの3次元検知領域121、122、123及び124は、4つのパーツボックス131、132、133及び134それぞれの一部に重なって設定されているが、これに限らず、4つのパーツボックス131、132、133及び134それぞれに重なっていなくてもよい。 The work area 11 includes an area (three-dimensional area) on the work table 15 on which the worker 100 works. The work area 11 includes the upper surface 151 of the workbench 15 and a plurality of part boxes 13 and the like on the workbench 15. The plurality of three-dimensional detection areas 12 correspond, for example, to the plurality of part boxes 13 one by one. Hereinafter, for convenience of explanation, the four three-dimensional detection areas 12 may also be referred to as the three-dimensional detection area 121, the three-dimensional detection area 122, the three-dimensional detection area 123, and the three-dimensional detection area 124 sequentially from the left side of FIG. is there. The four three- dimensional detection areas 121, 122, 123 and 124 correspond to the four part boxes 131, 132, 133 and 134, respectively. As viewed from the worker 100, the four three- dimensional detection areas 121, 122, 123 and 124 are areas set on the front side of the four part boxes 131, 132, 133 and 134, respectively. The four three- dimensional detection areas 121, 122, 123 and 124 are set to overlap with parts of the four part boxes 131, 132, 133 and 134, respectively, but the invention is not limited thereto. It does not have to overlap 132, 133 and 134, respectively.
 センサ2は、例えば、作業領域11の上方に配置される。センサ2は、例えば、作業台15に設けられた支持部16に支持されることで、作業領域11の上方に配置される。センサ2は、距離画像センサ21(図1参照)を含んでいる。距離画像センサ21は、作業領域11を上方から撮影して作業領域11の距離画像を出力する。ここにおいて、距離画像センサ21は、センサ2から作業領域11内に存在する物体までの距離値を画素値とする距離画像(距離画像の各画素の画素値は、距離値である)を連続的に生成する。これにより、センサ2は、物体の動き、物体の有無等を検知することができる。距離画像センサ21は、赤外光源と、赤外線カメラと、を含んでいる。距離画像センサ21は、例えば、TOF(Time Of Flight)方式の距離画像センサである。距離画像センサ21では、距離画像センサ21から作業領域11内に存在する物体までの距離は、赤外光源から作業領域11に赤外光が照射されてから物体で反射された赤外光が赤外線カメラに入射する(到達する)までの時間(飛行時間)によって求める。物体の高さ情報は、作業台15の上面151と距離画像センサ21との距離から、物体と距離画像センサ21との距離を引くことで算出される。距離画像センサ21は、TOF方式の距離画像センサに限らず、例えば、ステレオカメラ方式の距離画像センサ等であってもよい。 The sensor 2 is disposed, for example, above the work area 11. The sensor 2 is disposed above the work area 11 by being supported by a support 16 provided on the work table 15, for example. The sensor 2 includes a distance image sensor 21 (see FIG. 1). The distance image sensor 21 captures the work area 11 from above and outputs a distance image of the work area 11. Here, the distance image sensor 21 continuously generates a distance image (a pixel value of each pixel of the distance image is a distance value) in which a distance value from the sensor 2 to an object present in the work area 11 is a pixel value. Generate to Thereby, the sensor 2 can detect the movement of the object, the presence or absence of the object, and the like. The distance image sensor 21 includes an infrared light source and an infrared camera. The distance image sensor 21 is, for example, a distance image sensor of a TOF (Time Of Flight) system. In the distance image sensor 21, the distance from the distance image sensor 21 to the object present in the work area 11 is determined by the infrared light source emitting infrared light to the work area 11 from the infrared light source and then the infrared light reflected by the object being infrared light It is determined by the time (time of flight) until the light enters (reaches) the camera. The height information of the object is calculated by subtracting the distance between the object and the distance image sensor 21 from the distance between the upper surface 151 of the work bench 15 and the distance image sensor 21. The distance image sensor 21 is not limited to the distance image sensor of the TOF method, but may be, for example, a distance image sensor of a stereo camera method.
 センサ2は、イメージセンサ22(図1参照)を含んでいる。イメージセンサ22は、例えば、CMOS(Complementary MOS)カラーイメージセンサである。イメージセンサ22は、作業領域11を上方から撮影して作業領域11のカラーのイメージ画像を出力する。ここにおいて、イメージセンサ22は、カラーのイメージ画像を連続的に生成する。センサ2では、略同時刻に距離画像センサ21とイメージセンサ22とでそれぞれ撮影された作業領域11の距離画像とカラーのイメージ画像とが一対一に対応している。なお、イメージセンサ22は、モノクロのイメージ画像を出力するセンサであってもよい。 The sensor 2 includes an image sensor 22 (see FIG. 1). The image sensor 22 is, for example, a CMOS (Complementary MOS) color image sensor. The image sensor 22 shoots the work area 11 from above and outputs a color image of the work area 11. Here, the image sensor 22 continuously generates a color image. In the sensor 2, the distance image of the work area 11 and the color image correspond one-to-one with the distance image sensor 21 and the image sensor 22 at substantially the same time. The image sensor 22 may be a sensor that outputs a monochrome image.
 生成部3、提示部4、編集部5、ユーザインタフェース6、設定部7及び制御部9は、例えば、プロセッサ及びメモリを主構成とするコンピュータシステムにて構成されている。言い換えれば、生成部3、提示部4、編集部5、設定部7及び制御部9は、プロセッサ及びメモリを有するコンピュータシステム(例えば、ノート型のパーソナルコンピュータ)にて実現されており、プロセッサがメモリに格納されているプログラムを実行することにより、コンピュータシステムが生成部3、提示部4、編集部5、ユーザインタフェース6、設定部7、制御部9及び記憶部10として機能する。プログラムは、コンピュータシステムのメモリに予め記録されていてもよいが、電気通信回線を通じて提供されてもよいし、コンピュータシステムで読み取り可能なメモリカード、光学ディスク、ハードディスクドライブ(磁気ディスク)等の非一時的記録媒体に記録されて提供されてもよい。コンピュータシステムのプロセッサは、半導体集積回路(IC)又は大規模集積回路(LSI)を含む1乃至複数の電子回路で構成される。複数の電子回路は、1つのチップに集約されていてもよいし、複数のチップに分散して設けられていてもよい。複数のチップは、1つの装置に集約されていてもよいし、複数の装置に分散して設けられていてもよい。 The generation unit 3, the presentation unit 4, the editing unit 5, the user interface 6, the setting unit 7, and the control unit 9 are configured by, for example, a computer system having a processor and a memory as main components. In other words, the generation unit 3, the presentation unit 4, the editing unit 5, the setting unit 7, and the control unit 9 are realized by a computer system (for example, a laptop personal computer) having a processor and a memory, and the processor is a memory The computer system functions as the generation unit 3, the presentation unit 4, the editing unit 5, the user interface 6, the setting unit 7, the control unit 9, and the storage unit 10 by executing the programs stored in. The program may be pre-recorded in the memory of the computer system, but may be provided through a telecommunication line, or may be a non-transitory memory card, optical disk, hard disk drive (magnetic disk), etc. readable by the computer system. May be provided by being recorded on a physical recording medium. A processor of a computer system is configured of one or more electronic circuits including a semiconductor integrated circuit (IC) or a large scale integrated circuit (LSI). The plurality of electronic circuits may be integrated into one chip or may be distributed to a plurality of chips. The plurality of chips may be integrated into one device or may be distributed to a plurality of devices.
 The user interface 6 is configured by the input devices (keyboard, mouse, etc.) of the personal computer described above. The presentation unit 4 is configured by the display of the personal computer and presents a setting screen for setting the automatic detection areas 14, the work process generated by the generation unit 3, and the like. The presentation content (display content) of the presentation unit 4 is controlled by the control unit 9. The editing unit 5 will be described after the generation unit 3.
 The setting unit 7 sets the three-dimensional detection areas 12 in the work area 11 based on the output of the sensor 2. For example, when a preset automatic detection area 14 does not have specific height information, the setting unit 7 automatically sets the three-dimensional detection area 12 by including, in the information of the automatic detection area 14, the height information of an object measured when the object passes through the automatic detection area 14. When the automatic detection area 14 does have specific height information, the setting unit 7 can set the three-dimensional detection area 12 to a more appropriate range by automatically updating the three-dimensional information of the three-dimensional detection area 12 based on the output of the sensor 2.
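 As a minimal sketch of how a two-dimensional automatic detection area might be extended with observed pass-through heights to form a three-dimensional detection area: the `DetectionArea` class, its field names, the update rule, and the 20 mm padding are illustrative assumptions, not details taken from the publication.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class DetectionArea:
    # 2-D rectangle of the automatic detection area, in image coordinates.
    x0: int
    y0: int
    x1: int
    y1: int
    min_height_mm: Optional[float] = None  # vertical extent, unset until observed
    max_height_mm: Optional[float] = None
    observed_mm: List[float] = field(default_factory=list)

    def update_from_pass(self, object_height_mm: float) -> None:
        """Fold one observed pass-through height into the area's 3-D extent."""
        self.observed_mm.append(object_height_mm)
        lo, hi = min(self.observed_mm), max(self.observed_mm)
        # Pad the observed band so normal variation in hand height still
        # falls inside the detection volume (the 20 mm pad is an assumption).
        self.min_height_mm = max(lo - 20.0, 0.0)
        self.max_height_mm = hi + 20.0
```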
 The generation unit 3 generates, based on the output of the sensor 2, a work process including the order of a plurality of work events in the work area 11. In the first mode, the generation unit 3 generates the work process based on the output of the sensor 2 obtained while a specific worker performs, on the work table 15, the series of work to be covered by the work process (an exemplary series of work). A user who wants the work support system 1 to generate a work process records the output of the sensor 2 (moving images of the distance images and of the color images) on a recording medium in advance while the specific worker performs the series of work on the work table 15, and then has the work support system 1 generate the work process from the moving images recorded on the recording medium. The specific worker may be the same person as the worker 100 who performs the work in the second mode.
 A flow for generating a work process in the work support system 1 will now be described.
 The user uses the user interface 6 to have the presentation unit 4 of the computer system described above present the moving image. The user then uses the user interface 6 to select, as a reference distance image P0 (see FIG. 3), one of the distance images constituting the distance-image moving image. In other words, the reference distance image P0 is captured from the distance-image moving image. The reference distance image P0 is, for example, an image in which the hand of the specific worker does not appear. The upper surface 151 of the work table 15 and the plurality of parts boxes 131 to 134 appear in the reference distance image P0.
 Next, the user uses the user interface 6 to have the presentation unit 4 present, as a reference image P10 (see FIG. 4), the one color image that corresponds one-to-one to the reference distance image P0 being presented, and sets the automatic detection areas 14 on the reference image P10. The user sets a plurality of automatic detection areas 14 corresponding one-to-one to the plurality of parts boxes 13 using the user interface 6. For example, the user can set an automatic detection area 14 by specifying a range on the reference image P10 with the mouse. The four automatic detection areas 14 may be referred to, in order from the left side of FIG. 4, as automatic detection areas 141, 142, 143, and 144. An automatic detection area 14 may be a two-dimensional area or a three-dimensional area. An automatic detection area 14 includes, as detection conditions for automatic detection, information such as height, area, or volume. Other detection conditions for automatic detection include, for example, color, time range, stay time, movement direction, and simultaneous or unordered group settings over a plurality of detection areas. Here, the time range is a specified range of time used as a detection condition within the work time during which the work process is executed. The work time is, for example, the time from the start to the end of the work process. The time range is, for example, the range between 3 seconds and 12 seconds after the start of the work process. In a simultaneous group setting over a plurality of detection areas, for example, when the plurality of detection areas include a first detection area and a second detection area, the condition is that detection occurs only when the work event of the first detection area and the work event of the second detection area are performed simultaneously. In an unordered group setting over a plurality of detection areas, for example, when the plurality of detection areas include a first detection area and a second detection area, the condition is that detection occurs regardless of which of the work event of the first detection area and the work event of the second detection area is performed first. Note that the automatic detection areas 14 may be set using distance images instead of color images.
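 These detection conditions can be pictured as a small configuration record. The sketch below is an illustrative assumption (the field names, the `GroupMode` enum, and the defaults are not from the publication); it encodes the examples just given, a 3 to 12 second time range plus a stay-time condition:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional, Tuple

class GroupMode(Enum):
    NONE = "none"
    SIMULTANEOUS = "simultaneous"  # grouped events must occur at the same time
    UNORDERED = "unordered"        # grouped events may occur in either order

@dataclass
class DetectionCondition:
    time_range_s: Optional[Tuple[float, float]] = None  # e.g. (3.0, 12.0) after start
    min_stay_time_s: Optional[float] = None             # object must dwell this long
    color: Optional[str] = None                         # expected part color
    direction: Optional[str] = None                     # movement direction, e.g. "in"
    group_id: Optional[int] = None
    group_mode: GroupMode = GroupMode.NONE

# Example: detect only between 3 s and 12 s into the work process,
# and only if the object stays at least 0.5 s.
cond = DetectionCondition(time_range_s=(3.0, 12.0), min_stay_time_s=0.5)
```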
 Next, the user uses the user interface 6 to instruct playback of the distance-image moving image, setting of the three-dimensional detection areas 12, and generation of frames for the work events executed in the automatic detection areas 14. The setting unit 7 then automatically sets the three-dimensional detection areas 12 from the distance-image moving image. A frame is a processing unit in which the content of a specific piece of work is recorded. A three-dimensional detection area 12 is the area in the distance image that corresponds to an automatic detection area 14 in real space. Here, the setting unit 7 sets a three-dimensional detection area 12 comprising the area in the distance image corresponding to the automatic detection area 14 and distance information in the height direction based on information about an object (the specific worker's hand) entering that area.
 When the control unit 9 receives an instruction for flow generation (work process generation), it causes the generation unit 3 to generate a work process.
 The generation unit 3 automatically generates frames for the work events detected in the automatic detection areas 14 from the distance-image or color-image moving image.
 The horizontal axis of FIG. 5 is the time axis of the recorded moving image. In the example shown in FIG. 5, the work process includes a plurality of (four) work events (steps). The four work events are, for example, a work event of taking a part out of the parts box 131 (hereinafter also referred to as work A), a work event of taking a part out of the parts box 132 (work B), a work event of taking a part out of the parts box 133 (work C), and a work event of taking a part out of the parts box 134 (work D). The generation unit 3 automatically generates frames for the work events in the order in which the work events are executed.
 In the example shown in FIG. 5, the work process is, for example, a flow indicating the order in which the work events are performed: "take a part from the parts box 131" → "take a part from the parts box 132" → "take a part from the parts box 134" → "take a part from the parts box 133". The work process may also be expressed, for example, as the flow "parts box 131" → "parts box 132" → "parts box 134" → "parts box 133". The content of each work event ("take a part from the parts box 131", etc.) may be set in the generation unit 3 in advance using the user interface 6. When the work event corresponding to an automatic detection area 14 has been set in the generation unit 3 in advance, the work support system 1 can automatically link the content of the work event to the frame of the corresponding work event in the generation unit 3.
 As shown in FIG. 5, the generation unit 3 automatically generates the frames of the corresponding work events in the order of automatic detection area 141, automatic detection area 142, automatic detection area 144, and automatic detection area 143. The frame corresponding to the automatic detection area 141 corresponds to the three-dimensional detection area 121; the frame corresponding to the automatic detection area 142 corresponds to the three-dimensional detection area 122; the frame corresponding to the automatic detection area 144 corresponds to the three-dimensional detection area 124; and the frame corresponding to the automatic detection area 143 corresponds to the three-dimensional detection area 123.
 In this way, the generation unit 3 generates a work process including the order of a plurality of work events in the work area 11, following the order in which the object to be detected (the hand of the worker 100) is detected in the three-dimensional detection areas 12 based on the output of the sensor 2. More specifically, the generation unit 3 generates a work process in which the order of the work events is determined by the order in which the object to be detected is detected in the plurality of automatic detection areas 14 corresponding one-to-one to the plurality of three-dimensional detection areas 12.
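 A minimal sketch of this ordering step, assuming a hypothetical per-frame detector `detect_areas` that reports which areas currently contain the detection target (the names and structure are illustrative, not from the publication): the function records the first detection time of each area and sorts by it.

```python
from typing import Callable, Dict, Iterable, List

def generate_work_process(
    frames: Iterable,                             # recorded sensor frames, in time order
    detect_areas: Callable[[object], List[int]],  # frame -> ids of areas that detect the hand
    fps: float,
) -> List[int]:
    """Return detection-area ids ordered by the time each first detected the target."""
    first_seen: Dict[int, float] = {}
    for i, frame in enumerate(frames):
        for area_id in detect_areas(frame):
            first_seen.setdefault(area_id, i / fps)  # keep only the first detection
    # Work-event order = order of first detection.
    return sorted(first_seen, key=first_seen.get)
```

Grouped (simultaneous or unordered) work events, as described above, would be merged into one step of the resulting flow rather than strictly ordered.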
 The setting unit 7 can detect an object based on the difference values obtained by subtracting the pixel values of a distance image from the pixel values of the corresponding pixels of the reference distance image P0 (see FIG. 3), and can determine the height of the object as one piece of information about the object. For example, the setting unit 7 determines that an object to be detected is present if the number of pixels whose difference value is equal to or greater than a threshold is equal to or greater than a predetermined number, and determines that no object to be detected is present if that number of pixels is less than the predetermined number. That is, the setting unit 7 has a function of detecting an object to be detected (the hand of the worker 100, a part, etc.) based on the distance images output from the distance image sensor 21. The setting unit 7 sets the range of a three-dimensional detection area 12 in the vertical direction based on the height information of the object to be detected. The setting unit 7 may also be configured to detect the object to be detected by applying processing such as pattern matching to a distance image, or to a difference image between the reference distance image P0 and another distance image.
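 The thresholded background subtraction described here could look like the following sketch; the 30 mm threshold and the minimum pixel count of 200 are illustrative values, not figures from the publication:

```python
import numpy as np

def detect_object(
    reference: np.ndarray,        # reference distance image P0 (no hand visible)
    current: np.ndarray,          # current distance image
    diff_threshold_mm: float = 30.0,
    min_pixels: int = 200,
) -> bool:
    """Detect an object as a sufficiently large region closer than the background.

    An object above the table shortens the measured distance, so
    reference - current is positive over the object.
    """
    diff = reference.astype(np.int32) - current.astype(np.int32)
    return int(np.count_nonzero(diff >= diff_threshold_mm)) >= min_pixels
```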
 Next, the user can use the user interface 6 to edit the three-dimensional detection areas 12 on the three-dimensional detection area editing screen presented by the presentation unit 4. Editing a three-dimensional detection area 12 means, for example, manually correcting the automatically set height information of the three-dimensional detection area 12. When the setting unit 7 does not automatically set the height information of a three-dimensional detection area 12, the user can set the height information manually using the user interface 6. The user can instruct changes, additions, and deletions of three-dimensional detection areas 12 using, for example, the mouse and keyboard of the user interface 6, and can also change, add, or delete the range of a three-dimensional detection area 12.
 Note that in the generation unit 3, the setting of the three-dimensional detection areas 12 and the generation of the frames for the work events executed in the automatic detection areas 14 may be performed simultaneously or separately.
 By using the user interface 6 and the editing unit 5, the user of the work support system 1 can enter the content of a work event into a frame automatically generated by the generation unit 3. The content of a work event is, for example, a description such as "take a part from the parts box". The user can also add a work event to the work process made up of the frames automatically generated by the generation unit 3, and can delete a frame from that work process. The user can also set, in the frame of the corresponding work event, the detection conditions for an object in a three-dimensional detection area 12. The detection conditions of a three-dimensional detection area 12 are, for example, color, time range, stay time, movement direction, and simultaneous or unordered group settings over a plurality of detection areas, as described above for the automatic detection areas 14: the time range is a specified range of time within the work time (for example, between 3 seconds and 12 seconds after the start of the work process); a simultaneous group setting detects only when the work events of the grouped detection areas are performed at the same time; and an unordered group setting detects regardless of which of the grouped work events is performed first.
 Note that the stay time as a detection condition of a three-dimensional detection area 12 may be the time during which an object was present in the automatic detection area 14 when the work process was generated. In that case, the setting unit 7 may automatically set the stay time of the object in the automatic detection area 14 as the detection condition for the object in the three-dimensional detection area 12. The timing unit that measures the stay time is configured by the timer provided in the personal computer described above. The setting unit 7 may also calculate the stay time using the frame rate of the moving image. The user can edit the work process presented by the presentation unit 4 using, for example, text, drawings, images, moving images, and symbols.
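 For example, computing the stay time from the frame rate reduces to dividing the number of detection frames by the frames per second; a brief sketch under that assumption (the function name is illustrative):

```python
def stay_time_s(detected_per_frame: list, fps: float) -> float:
    """Stay time = number of frames in which the object was detected / frame rate."""
    return sum(bool(d) for d in detected_per_frame) / fps

# Example: 45 consecutive detection frames at 30 fps give a 1.5 s stay time.
assert stay_time_s([True] * 45, 30.0) == 1.5
```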
 Note that frames corresponding to a plurality of work events may be arranged in parallel on the time axis. For example, when the work of taking a part from the parts box 131 and the work of taking a part from the parts box 132 are performed simultaneously or in arbitrary order, the frames extracted from the automatic detection areas 141 and 142 corresponding to the parts boxes 131 and 132 can be arranged in parallel on the time axis.
 The guide information presentation unit 8 of the work support system 1 is disposed above the work area 11 so as not to overlap the sensor 2 in the vertical direction. In the second mode described above, the guide information presentation unit 8 sequentially presents work guide information in the work area 11 in accordance with the work process. The guide information presentation unit 8 is configured to project an image onto a projection surface (display surface) including the upper surface 151 of the work table 15.
 The guide information presentation unit 8 is, for example, a projector and is arranged to project the work guide information (image) toward the work area 11. The work guide information is presented, for example, as a figure 17 such as a colored arrow (see FIG. 2) in the region of the upper surface 151 of the work table 15 corresponding to the three-dimensional detection area 12 of the one parts box 13, among the plurality of parts boxes 13, from which a part is to be taken. In the work support system 1, the worker 100 can therefore see the work guide information while remaining in the same posture as during work. The work guide information is not limited to the figure 17 and may be text (for example, "take a part"), or both the figure 17 and text. The figure 17 is not limited to an arrow and may be, for example, a rectangle; the work guide information may also be a figure other than an arrow or a rectangle, or a moving image, a photograph, or the like. The operation of the guide information presentation unit 8 is controlled by the control unit 9. In the work support system 1, the control unit 9 controls the timing of presentation and non-presentation of the work guide information by the guide information presentation unit 8. Here, the control unit 9 presents a plurality of pieces of work guide information one by one, for example, in the order determined in advance by the work process created in the first mode; each piece of work guide information corresponds to one work event. The interval between hiding one piece of work guide information, after presenting it for a predetermined time, and presenting the next piece may be set as appropriate. The presentation of the work guide information may also be switched after the completion of the worker 100's action is detected using the three-dimensional detection area 12 and the detection conditions. In the work support system 1, the sensor 2 is arranged so as not to obstruct the presentation (projection) of the work guide information onto the work area 11 by the guide information presentation unit 8, and the guide information presentation unit 8 is arranged so that its projection light does not strike the sensor 2.
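 A hedged sketch of such a presentation loop in the second mode follows; `show_guide`, `hide_guide`, and `area_triggered` are hypothetical stand-ins for the projector control and the sensor polling, which the publication does not specify at this level:

```python
import time
from typing import Callable, List

def run_guided_process(
    process: List[int],                     # detection-area ids in work-process order
    show_guide: Callable[[int], None],      # project the guide for one area (hypothetical)
    hide_guide: Callable[[int], None],      # stop projecting it (hypothetical)
    area_triggered: Callable[[int], bool],  # poll the sensor output for one area
    poll_interval_s: float = 0.05,
) -> None:
    """Present one guide per work event and advance when that event is detected."""
    for area_id in process:
        show_guide(area_id)
        while not area_triggered(area_id):  # wait for the worker's hand in the area
            time.sleep(poll_interval_s)
        hide_guide(area_id)
```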
 In the second mode, the work support system 1 uses the output of the sensor 2 for work support. While operating in the second mode, the control unit 9 may determine, based on the output of the sensor 2, whether there is an abnormality in the work and, if there is, notify the worker of it.
 Examples of work abnormalities include a case where work different from the work indicated by the presented work guide information is performed, and a case where the time from the presentation of the work guide information until the work according to that work guide information is performed exceeds the specified time mentioned above. An example of work different from that indicated by the work guide information is the worker 100 taking a part from the second parts box 132 from the left in FIG. 2 when instructed to take a part from the leftmost parts box 131 in FIG. 2.
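 Both abnormality checks (wrong detection area and exceeded time) can be pictured as a per-step test; the function name and the 10-second default are illustrative assumptions, not values from the publication:

```python
from typing import Optional

def check_step(
    expected_area: int,
    triggered_area: Optional[int],  # area that actually fired, or None so far
    elapsed_s: float,
    time_limit_s: float = 10.0,
) -> Optional[str]:
    """Return an abnormality message for one work step, or None if the step is normal."""
    if triggered_area is not None and triggered_area != expected_area:
        return f"wrong area: expected {expected_area}, got {triggered_area}"
    if triggered_area is None and elapsed_s > time_limit_s:
        return f"timeout: no work within {time_limit_s} s of the guide"
    return None
```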
 The worker 100 may be notified by the control unit 9 controlling the guide information presentation unit 8, or by the control unit 9 controlling a sound generation device, a display device, or the like. While operating in the second mode, the control unit 9 may also keep the work guide information presented until it detects, based on the output of the sensor 2, that the work according to the work guide information has been performed. The work support system 1 may also be operated in conjunction with a measuring instrument when operating in the second mode, for example, and may present work guide information triggered by a signal from the measuring instrument.
 In the work support system 1, the worker 100 can see the work guide information presented in the work area 11 when working in the work area 11. When the work support system 1 is operated in the second mode, the worker 100 has only to work according to the work guide information presented by the guide information presentation unit 8. This allows the worker 100 to adapt easily to the work process, makes it possible to reduce differences in production output between workers 100, and allows the worker 100 to adapt easily to work processes that differ from product to product in high-mix production. Using the work support system 1 also makes it possible to standardize the work content of a plurality of workers 100.
 The embodiment described above is merely one of various embodiments of the present invention. Various modifications can be made to the embodiment according to design and the like as long as the object of the present invention can be achieved. Functions equivalent to those of the work support system 1 may also be embodied as a work support method or a program (computer program).
 A work support method according to one aspect generates a work process including the order of a plurality of work events in the work area 11 based on the output of the sensor 2, which detects an object present in an automatic detection area 14 set in the work area 11.
 A program according to one aspect causes a computer system to execute a process of generating a work process including the order of a plurality of work events in the work area 11 based on the output of the sensor 2, which detects an object present in an automatic detection area 14 set in the work area 11.
 Modifications of the above embodiment are listed below.
 The object detected by the sensor 2 is not limited to the hand of the worker 100 and may be, for example, a part, a product, or the like.
 The sensor 2 is not limited to a configuration including both the distance image sensor 21 and the image sensor 22 and may include only one of them. Nor is the sensor 2 limited to a configuration including at least one of the distance image sensor 21 and the image sensor 22; it may be, for example, a plurality of picking sensors provided one for each of the plurality of three-dimensional detection areas 12.
 In the work area 11, at least two of the plurality of parts boxes 13 may be arranged so as to overlap in the vertical direction.
 The guide information presentation unit 8 is not limited to a projector and may be, for example, a display embedded in a top plate including the upper surface 151 of the work table 15.
 In addition to the sensor 2 and the generation unit 3, the work support system 1 may include the presentation unit 4, the editing unit 5, the user interface 6, the setting unit 7, the guide information presentation unit 8, the control unit 9, and the storage unit 10; however, the components other than the sensor 2 and the generation unit 3 are not essential and may be omitted as appropriate. The presentation unit 4, the editing unit 5, the user interface 6, the setting unit 7, the guide information presentation unit 8, the control unit 9, and the storage unit 10 may also share and execute one another's functions.
 The work support system 1 may also transmit the work process generated by the generation unit 3 to a terminal such as a smartphone, or record it on a recording medium.
 The work assumed by the work support system 1 is not limited to manual assembly on a manufacturing line. For example, the work may be sorting products or parts on a manufacturing line into good and defective items, packing, cooking, cleaning, and the like.
 At least some of the functions of the generation unit 3 of the work support system 1 may be realized by, for example, a server device in cloud computing.
 (Summary)
 The following aspects are disclosed from the embodiments and the like described above.
 The work support system (1) according to a first aspect includes a sensor (2) and a generation unit (3). The sensor (2) detects an object present in an automatic detection area (14) set in a work area (11). The generation unit (3) generates, based on the output of the sensor (2), a work process including the order of a plurality of work events in the work area (11).
 With the work support system (1) according to the first aspect, a work process can be generated easily.
 The work support system (1) according to a second aspect further includes, in the first aspect, a presentation unit (4) that presents the work process.
 With the work support system (1) according to the second aspect, the work process can be presented, so the user can check the work process.
 In the work support system (1) according to a third aspect, in the first or second aspect, the sensor (2) includes a distance image sensor (21) that continuously generates distance images whose pixel values are distance values from the sensor (2) to an object present in the automatic detection area (14).
 With the work support system (1) according to the third aspect, worker workability can be improved compared with using a picking sensor as the sensor (2).
 In the work support system (1) according to a fourth aspect, in any one of the first to third aspects, the sensor (2) includes an image sensor (22).
 With the work support system (1) according to the fourth aspect, the user can set the automatic detection areas (14) while viewing the images output from the image sensor (22).
 The work support system (1) according to a fifth aspect further includes, in any one of the first to fourth aspects, an editing unit (5) that edits the work process based on input information from a user interface (6).
 With the work support system (1) according to the fifth aspect, the user can edit the work process using the user interface.
 The work support system (1) according to a sixth aspect further includes, in any one of the first to fifth aspects, a setting unit (7) that sets the three-dimensional detection areas (12) according to input information from the user interface (6).
 With the work support system (1) according to the sixth aspect, the three-dimensional detection areas (12) can be customized.
 The work support system (1) according to a seventh aspect has, in any one of the first to sixth aspects, a first mode in which the generation unit (3) generates the work process and a second mode in which the work process generated by the generation unit (3) is used, and uses the output of the sensor (2) for work support in the second mode.
 With the work support system (1) according to the seventh aspect, the same sensor (2) can be used in both the first mode and the second mode.
 The work support system (1) according to an eighth aspect further includes, in the seventh aspect, a guide information presentation unit (8) that sequentially presents a plurality of pieces of work guide information according to the work process in the second mode.
 With the work support system (1) according to the eighth aspect, work support can be provided by sequentially presenting a plurality of pieces of work guide information according to the work process in the second mode.
 A work support method according to a ninth aspect generates a work process including the order of a plurality of work events in a work area (11) based on the output of a sensor (2) that detects an object present in an automatic detection area (14) set in the work area (11).
 With the work support method according to the ninth aspect, a work process can be generated easily.
 A program according to a tenth aspect causes a computer system to execute a process of generating a work process including the order of a plurality of work events in a work area (11) based on the output of a sensor (2) that detects an object present in an automatic detection area (14) set in the work area (11).
 With the program according to the tenth aspect, a work process can be generated easily.
 Reference Signs List
 1 work support system
 2 sensor
 3 generation unit
 4 presentation unit
 5 editing unit
 6 user interface
 7 setting unit
 8 guide information presentation unit
 9 control unit
 11 work area
 12 three-dimensional detection area
 14 automatic detection area
 100 worker

Claims (10)

  1.  A work support system comprising:
      a sensor that detects an object present in an automatic detection area set in a work area; and
      a generation unit that generates, based on an output of the sensor, a work process including an order of a plurality of work events in the work area.
  2.  The work support system according to claim 1, further comprising a presentation unit that presents the work process.
  3.  The work support system according to claim 1 or 2, wherein the sensor includes a distance image sensor that continuously generates distance images whose pixel values are distance values from the sensor to an object present in the automatic detection area.
  4.  The work support system according to any one of claims 1 to 3, wherein the sensor includes an image sensor.
  5.  The work support system according to any one of claims 1 to 4, further comprising an editing unit that edits the work process based on input information from a user interface.
  6.  The work support system according to any one of claims 1 to 5, further comprising a setting unit that sets a three-dimensional detection area according to input information from a user interface.
  7.  The work support system according to any one of claims 1 to 6, having a first mode in which the generation unit generates the work process and a second mode in which the work process generated by the generation unit is used, wherein an output of the sensor is used for work support in the second mode.
  8.  The work support system according to claim 7, further comprising a guide information presentation unit that sequentially presents a plurality of pieces of work guide information according to the work process in the second mode.
  9.  A work support method comprising generating, based on an output of a sensor that detects an object present in an automatic detection area set in a work area, a work process including an order of a plurality of work events in the work area.
  10.  A program for causing a computer system to execute a process of generating, based on an output of a sensor that detects an object present in an automatic detection area set in a work area, a work process including an order of a plurality of work events in the work area.
PCT/JP2018/038401 2017-11-29 2018-10-16 Work assisting system, work assisting method, and program WO2019106987A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201880076962.1A CN111417976A (en) 2017-11-29 2018-10-16 Work support system, work support method, and program
JP2019557058A JPWO2019106987A1 (en) 2017-11-29 2018-10-16 Work support system, work support method and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-229318 2017-11-29
JP2017229318 2017-11-29

Publications (1)

Publication Number Publication Date
WO2019106987A1 true WO2019106987A1 (en) 2019-06-06

Family

ID=66664887

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/038401 WO2019106987A1 (en) 2017-11-29 2018-10-16 Work assisting system, work assisting method, and program

Country Status (3)

Country Link
JP (1) JPWO2019106987A1 (en)
CN (1) CN111417976A (en)
WO (1) WO2019106987A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023067857A1 (en) * 2021-10-20 2023-04-27 株式会社アイオイ・システム Work support device and non-transitory storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009080759A (en) * 2007-09-27 2009-04-16 Park:Kk Cell production system process control method
JP2011134224A (en) * 2009-12-25 2011-07-07 Honda Motor Co Ltd Assembling work support system and program
JP2013025478A (en) * 2011-07-19 2013-02-04 Panasonic Corp Work detection system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9448407B2 (en) * 2012-12-13 2016-09-20 Seiko Epson Corporation Head-mounted display device, control method for head-mounted display device, and work supporting system
JP2014159988A (en) * 2013-02-19 2014-09-04 Yaskawa Electric Corp Object detector, robot system, and object detection method


Also Published As

Publication number Publication date
JPWO2019106987A1 (en) 2020-11-19
CN111417976A (en) 2020-07-14


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18884780

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019557058

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18884780

Country of ref document: EP

Kind code of ref document: A1